WorldWideScience

Sample records for clinical pre-test probability

  1. Combined use of clinical pre-test probability and D-dimer test in the diagnosis of preoperative deep venous thrombosis in colorectal cancer patients

    DEFF Research Database (Denmark)

    Stender, Mogens; Frøkjaer, Jens Brøndum; Hagedorn Nielsen, Tina Sandie; Larsen, Torben Bjerregaard; Lundbye-Christensen, Søren; Elbrønd, Henrik; Thorlacius-Ussing, Ole

    2008-01-01

    Clinical pre-test probability (PTP) can be safely used to rule out the tentative diagnosis of DVT in cancer patients. However, the accuracy in colorectal cancer patients is uncertain. This study assessed the diagnostic accuracy of a quantitative D-dimer assay in combination with the PTP score in ruling out preoperative DVT in colorectal cancer patients admitted for surgery. Preoperative D-dimer testing and compression ultrasonography for DVT were performed in 193 consecutive patients with newly diagnosed colorectal cancer. Diagnostic accuracy indices of the D-dimer test were assessed according to the PTP score. The negative predictive value, positive predictive value, sensitivity and specificity were 99% (95% confidence interval (CI), 95-100%), 17% (95% CI, 9-26%), 93% (95% CI, 68-100%) and 61% (95% CI, 53-68%), respectively. In conclusion, the combined use of pre-test probability and D-dimer test may be useful in...

  2. The accuracy of clinical and biochemical estimates in defining the pre-test probability of pulmonary embolism

    International Nuclear Information System (INIS)

    Full text: The PIOPED survey confirmed the significance of the high probability ventilation/perfusion scan (HP V/Q scan) in establishing the diagnosis of pulmonary embolism (PE). In an interesting sentence, however, the authors indicated that 'the clinicians' assessment of the likelihood of PE (prior probability)' can substantially increase the predictive value of the investigation. The criteria used for this assessment were not published, and this statement conflicts with the belief that the clinical diagnosis of pulmonary embolism is unreliable. A medical history was obtained from 668 patients undergoing V/Q lung scans for suspected PE, and certain clinical features linked to PE were, when present, documented. These included pleuritic chest pain, haemoptysis, dyspnoea, clinical evidence of DVT, recent surgery and a right ventricular strain pattern on ECG. D-Dimer levels and initial arterial oxygen saturation (PaO2) levels were also obtained. The prevalence of these clinical and biochemical criteria was then compared between HP (61) and normal (171) scans after exclusion of all equivocal or intermediate scan outcomes (436), where lung scintigraphy was unable to provide a definite diagnosis. D-Dimer and/or oxygen saturation levels were similarly compared in each group. A true positive result was scored for each clinical or biochemical criterion when linked with a high probability scan and, conversely, a false positive score when the scan outcome was normal. In this fashion, the positive predictive value (PPV) and, when appropriate, the negative predictive value (NPV) was obtained for each risk factor. In the context of PE, DVT and post-operative status proved the most reliable predictors of a high probability outcome. Where both features were present, the PPV rose to 0.57. A normal D-Dimer level was a better excluder of PE than a normal oxygen saturation level (NPV 0.78-v-0.44). Conversely, a raised D-Dimer, or reduced oxygen saturation, were both of little value in

  3. A methodological proposal to research patients’ demands and pre-test probabilities using paper forms in primary care settings

    Directory of Open Access Journals (Sweden)

    Gustavo Diniz Ferreira Gusso

    2013-04-01

    Objective: The purpose of this study is to present a methodology for assessing patients' demands and calculating pre-test probabilities using paper forms in Primary Care. Method: Most developing countries do not use Electronic Health Records (EHR) in primary care settings. This makes it difficult to access information regarding what occurs within the health center working process. Basically, there are two methodologies to assess patients' demands and the problems or diagnoses stated by doctors. The first is based on single attendance at each appointment, while the second is based on episodes of care; the latter deals with each problem in a longitudinal manner. The methodology developed in this article followed the approach of confronting the 'reason for the appointment' and 'the problem registered' by doctors. Paper forms were developed taking this concept as central. All appointments were classified by the International Classification of Primary Care (ICPC). Discussion: Even in paper form, confrontation between 'reason for the appointment' and 'problem registered' is useful for measuring the pre-test probabilities of each problem-based appointment. This approach can be easily reproduced in any health center and enables a better understanding of the population profile. The prevalence of many illnesses and diseases is not known in each local setting, and studies conducted in other settings, such as secondary and tertiary care, are not adequate for primary health care. Conclusion: This study offers adequate technology for primary health care workers that has the potential to transform each health center into a research-led practice, contributing directly to patient care.

  4. Accuracy of dual-source CT coronary angiography: first experience in a high pre-test probability population without heart rate control

    Energy Technology Data Exchange (ETDEWEB)

    Scheffel, Hans; Alkadhi, Hatem; Desbiolles, Lotus; Frauenfelder, Thomas; Schertler, Thomas; Husmann, Lars; Marincek, Borut; Leschka, Sebastian [University Hospital Zurich, Institute of Diagnostic Radiology, Zurich (Switzerland); Plass, Andre; Vachenauer, Robert; Grunenfelder, Juerg; Genoni, Michele [Clinic for Cardiovascular Surgery, Zurich (Switzerland); Gaemperli, Oliver; Schepis, Tiziano [University Hospital Zurich, Cardiovascular Center, Zurich (Switzerland); Kaufmann, Philipp A. [University Hospital Zurich, Cardiovascular Center, Zurich (Switzerland); University of Zurich, Center for Integrative Human Physiology, Zurich (Switzerland)

    2006-12-15

    The aim of this study was to assess the diagnostic accuracy of dual-source computed tomography (DSCT) for evaluation of coronary artery disease (CAD) in a population with extensive coronary calcifications without heart rate control. Thirty patients (24 male, 6 female, mean age 63.1±11.3 years) with a high pre-test probability of CAD underwent DSCT coronary angiography and invasive coronary angiography (ICA) within 14±9 days. No beta-blockers were administered prior to the scan. Two readers independently assessed image quality of all coronary segments with a diameter ≥1.5 mm using a four-point score (1: excellent to 4: not assessable) and qualitatively assessed significant stenoses as narrowing of the luminal diameter >50%. Causes of false-positive (FP) and false-negative (FN) ratings were assigned to calcifications or motion artifacts. ICA was considered the standard of reference. Mean body mass index was 28.3±3.9 kg/m² (range 22.4-36.3 kg/m²), mean heart rate during CT was 70.3±14.2 bpm (range 47-102 bpm), and mean Agatston score was 821±904 (range 0-3,110). Image quality was diagnostic (scores 1-3) in 98.6% (414/420) of segments (mean image quality score 1.68±0.75); six segments in three patients were considered not assessable (1.4%). DSCT correctly identified 54 of 56 significant coronary stenoses. Severe calcifications accounted for false ratings in nine segments (eight FP/one FN) and motion artifacts in two segments (one FP/one FN). Overall sensitivity, specificity, positive and negative predictive value for evaluating CAD were 96.4, 97.5, 85.7, and 99.4%, respectively. First experience indicates that DSCT coronary angiography provides high diagnostic accuracy for assessment of CAD in a high pre-test probability population with extensive coronary calcifications and without heart rate control.
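
As a worked illustration (ours, not part of the record), the accuracy figures quoted above can be reproduced from the underlying 2×2 segment counts: 54 true positives and 2 false negatives among the 56 stenoses, 9 false positives, and therefore 349 true negatives among the 414 assessable segments. The helper name is our own.

```python
# Diagnostic accuracy indices from a 2x2 table, using the segment
# counts implied by the DSCT abstract (TP=54, FN=2, FP=9, TN=349).
def accuracy_indices(tp, fn, fp, tn):
    """Return sensitivity, specificity, PPV and NPV as fractions."""
    sens = tp / (tp + fn)  # fraction of true stenoses detected
    spec = tn / (tn + fp)  # fraction of normal segments rated normal
    ppv = tp / (tp + fp)   # probability a positive rating is correct
    npv = tn / (tn + fn)   # probability a negative rating is correct
    return sens, spec, ppv, npv

sens, spec, ppv, npv = accuracy_indices(tp=54, fn=2, fp=9, tn=349)
print(f"sens={sens:.1%} spec={spec:.1%} PPV={ppv:.1%} NPV={npv:.1%}")
# matches the reported 96.4 / 97.5 / 85.7 / 99.4%
```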

  5. Assessing the clinical probability of pulmonary embolism

    Energy Technology Data Exchange (ETDEWEB)

    Miniati, M. [Consiglio Nazionale delle Ricerche, Institute of Clinical Physiology, Pisa (Italy); Pistolesi, M. [University of Florence, Dept. of Section of Nuclear Medicine Critical Care, Florence (Italy)

    2001-12-01

    Clinical assessment is a cornerstone of the recently validated diagnostic strategies for pulmonary embolism (PE). Although the diagnostic yield of individual symptoms, signs, and common laboratory tests is limited, the combination of these variables, either by empirical assessment or by a prediction rule, can be used to express a clinical probability of PE. The latter may serve as pretest probability to predict the probability of PE after further objective testing (posterior or post-test probability). Over the last few years, attempts have been made to develop structured prediction models for PE. In a Canadian multicenter prospective study, the clinical probability of PE was rated as low, intermediate, or high according to a model which included assessment of presenting symptoms and signs, risk factors, and presence or absence of an alternative diagnosis at least as likely as PE. Recently, a simple clinical score was developed to stratify outpatients with suspected PE into groups with low, intermediate, or high clinical probability. Logistic regression was used to predict parameters associated with PE. A score ≤ 4 identified patients with low probability, of whom 10% had PE. The prevalence of PE in patients with intermediate (score 5-8) and high probability (score ≥ 9) was 38 and 81%, respectively. As opposed to the Canadian model, this clinical score is standardized. The predictor variables identified in the model, however, were derived from a database of emergency ward patients. This model may, therefore, not be valid in assessing the clinical probability of PE in inpatients. In the PISA-PED study, a clinical diagnostic algorithm was developed which rests on the identification of three relevant clinical symptoms and on their association with electrocardiographic and/or radiographic abnormalities specific for PE. 
Among patients who, according to the model, had been rated as having a high clinical probability, the prevalence of proven PE was 97%, while it was
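
The score-based stratification described in this record can be sketched as a simple lookup; the cut-offs and group prevalences below are the ones quoted in the abstract, while the function and variable names are our own.

```python
# Stratify a clinical score into low/intermediate/high PE probability,
# using the cut-offs and group prevalences quoted in the abstract.
PE_PREVALENCE = {"low": 0.10, "intermediate": 0.38, "high": 0.81}

def clinical_probability(score: int) -> str:
    """Map a clinical score to its PE probability category."""
    if score <= 4:
        return "low"
    elif score <= 8:
        return "intermediate"
    return "high"

for score in (3, 6, 10):
    category = clinical_probability(score)
    print(f"score {score}: {category} (PE prevalence ~{PE_PREVALENCE[category]:.0%})")
```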

  6. Assessing the clinical probability of pulmonary embolism

    International Nuclear Information System (INIS)

    Clinical assessment is a cornerstone of the recently validated diagnostic strategies for pulmonary embolism (PE). Although the diagnostic yield of individual symptoms, signs, and common laboratory tests is limited, the combination of these variables, either by empirical assessment or by a prediction rule, can be used to express a clinical probability of PE. The latter may serve as pretest probability to predict the probability of PE after further objective testing (posterior or post-test probability). Over the last few years, attempts have been made to develop structured prediction models for PE. In a Canadian multicenter prospective study, the clinical probability of PE was rated as low, intermediate, or high according to a model which included assessment of presenting symptoms and signs, risk factors, and presence or absence of an alternative diagnosis at least as likely as PE. Recently, a simple clinical score was developed to stratify outpatients with suspected PE into groups with low, intermediate, or high clinical probability. Logistic regression was used to predict parameters associated with PE. A score ≤ 4 identified patients with low probability, of whom 10% had PE. The prevalence of PE in patients with intermediate (score 5-8) and high probability (score ≥ 9) was 38 and 81%, respectively. As opposed to the Canadian model, this clinical score is standardized. The predictor variables identified in the model, however, were derived from a database of emergency ward patients. This model may, therefore, not be valid in assessing the clinical probability of PE in inpatients. In the PISA-PED study, a clinical diagnostic algorithm was developed which rests on the identification of three relevant clinical symptoms and on their association with electrocardiographic and/or radiographic abnormalities specific for PE. Among patients who, according to the model, had been rated as having a high clinical probability, the prevalence of proven PE was 97%, while it was 3%.

  7. Correlation between the clinical pretest probability score and the lung ventilation and perfusion scan probability

    OpenAIRE

    Bhoobalan, Shanmugasundaram; Chakravartty, Riddhika; Dolbear, Gill; Al-Janabi, Mazin

    2013-01-01

    Purpose: The aim of the study was to determine the accuracy of the clinical pretest probability (PTP) score and its association with the lung ventilation and perfusion (VQ) scan. Materials and Methods: A retrospective analysis of 510 patients who underwent a lung VQ scan between 2008 and 2010 was performed. Out of the 510 studies, the numbers of normal, low, and high probability VQ scans were 155 (30%), 289 (57%), and 55 (11%), respectively. Results: A total of 103 patients underwent computed tomog...

  8. Effects of video-feedback on the communication, clinical competence and motivational interviewing skills of practice nurses: a pre-test posttest control group study

    NARCIS (Netherlands)

    Noordman, J.; Weijden, T.T. van der; Dulmen, S. van

    2014-01-01

    AIMS: To examine the effects of individual video-feedback on the generic communication skills, clinical competence (i.e. adherence to practice guidelines) and motivational interviewing skills of experienced practice nurses working in primary care. BACKGROUND: Continuing professional education may be

  9. Effects of video-feedback on the communication, clinical competence and motivational interviewing skills of practice nurses: a pre-test posttest control group study.

    NARCIS (Netherlands)

    Noordman, J.; Weijden, T. van der; Dulmen, S. van

    2014-01-01

    Aims: To examine the effects of individual video-feedback on the generic communication skills, clinical competence (i.e. adherence to practice guidelines) and motivational interviewing skills of experienced practice nurses working in primary care. Background: Continuing professional education may be

  10. Effects of video-feedback on the communication, clinical competence and motivational interviewing skills of practice nurses: a pre-test posttest control group study.

    OpenAIRE

    Noordman, J.; van der Weijden, T; Van Dulmen, S.

    2014-01-01

    Aims: To examine the effects of individual video-feedback on the generic communication skills, clinical competence (i.e. adherence to practice guidelines) and motivational interviewing skills of experienced practice nurses working in primary care. Background: Continuing professional education may be necessary to refresh and reflect on the communication and motivational interviewing skills of experienced primary care practice nurses. A video-feedback method was designed to improve these skills...

  11. 40 CFR 1065.520 - Pre-test verification procedures and pre-test data collection.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false Pre-test verification procedures and pre-test data collection. 1065.520 Section 1065.520 Protection of Environment ENVIRONMENTAL PROTECTION... Specified Duty Cycles § 1065.520 Pre-test verification procedures and pre-test data collection. (a) If...

  12. Assessment of clinical utility of 18F-FDG PET in patients with head and neck cancer: a probability analysis

    International Nuclear Information System (INIS)

    The purpose of this study was to calculate disease probabilities based on data of patients with head and neck cancer in the register of our institution and to perform a systematic review of the available data on the accuracy of PET in the primary assessment and follow-up of patients with head and neck cancer. The pre-test probability of head and neck cancer among patients in our institutional data registry was assessed. Then the published literature was selected and appraised according to a standard protocol of systematic reviews. Two reviewers independently selected and extracted data on study characteristics, quality and accuracy. Accuracy data were used to form 2 x 2 contingency tables and were pooled to produce summary receiver operating characteristic (ROC) curves and summary likelihood ratios for positive and negative testing. Finally post-test probabilities were calculated on the basis of the pre-test probabilities of this patient group. All patients had cytologically or histologically proven cancer. The prevalence of additional lymph node metastases on PET in staging examinations was 19.6% (11/56), and that of locoregional recurrence on restaging PET was 28.6% (12/42). In the primary assessment of patients, PET had positive and negative likelihood ratios of 3.9 (2.56-5.93) and 0.24 (0.14-0.41), respectively. Disease probabilities were therefore 49.4% for a positive test result and 5.7% for a negative test result. In the assessment of recurrence these values were 3.96 (2.8-5.6) and 0.16 (0.1-0.25), resulting in probabilities of 49.7% and 3.8%. PET evaluation for involvement of lymph nodes had positive and negative likelihood ratios of 17.26 (10.9-27.3) and 0.19 (0.13-0.27) for primary assessment and 11.0 (2.93-41.24) and 0.14 (0.01-1.88) for detection of recurrence. The probabilities were 81.2% and 4.5% for primary assessment and 73.3% and 3.4% for assessment of recurrence. It is concluded that in this clinical setting the main advantage of PET is the
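
The post-test probabilities quoted in this record follow from Bayes' theorem in odds form: convert the pre-test probability to odds, multiply by the likelihood ratio, and convert back. A minimal sketch (the function name is ours) reproduces the staging figures, using the 11/56 pre-test prevalence and the rounded likelihood ratios from the abstract:

```python
# Post-test probability from pre-test probability and a likelihood
# ratio (Bayes' theorem in odds form), as used in the abstract above.
def post_test_probability(pre_test: float, lr: float) -> float:
    pre_odds = pre_test / (1 - pre_test)  # probability -> odds
    post_odds = pre_odds * lr             # apply the likelihood ratio
    return post_odds / (1 + post_odds)    # odds -> probability

# Pre-test probability of additional nodal disease: 11/56 = 19.6%
pre = 11 / 56
print(f"positive PET: {post_test_probability(pre, 3.9):.1%}")   # ~49%
print(f"negative PET: {post_test_probability(pre, 0.24):.1%}")  # ~6%
```

The small discrepancy from the reported 49.4% and 5.7% comes from rounding the likelihood ratios to two figures.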

  13. Clinical features of probable severe acute respiratory syndrome in Beijing

    Institute of Scientific and Technical Information of China (English)

    Hai-Ying Lu; Xiao-Yuan Xu; Yu Lei; Yang-Feng Wu; Bo-Wen Chen; Feng Xiao; Gao-Qiang Xie; De-Min Han

    2005-01-01

    AIM: To summarize the clinical features of probable severe acute respiratory syndrome (SARS) in Beijing. METHODS: A retrospective review of 801 patients admitted to hospitals in Beijing between March and June 2003 with a diagnosis of probable SARS of moderate type. The clinical manifestations and laboratory and radiographic data obtained from the 801 cases were analyzed. RESULTS: One to three days after the onset of SARS, the major clinical symptoms were fever (in 88.14% of patients), fatigue, headache, myalgia, and arthralgia (25-36%). The counts of WBC (in 22.56% of patients), lymphocytes (70.25%) and CD3, CD4, CD8 positive T cells (70%) decreased. From days 4-7, the nonspecific symptoms weakened; however, the rates of lower respiratory tract symptoms, such as cough (24.18%), sputum production (14.26%), chest distress (21.04%) and shortness of breath (9.23%), increased, as did the abnormality rates on chest radiograph or CT. The low counts of WBC, lymphocytes and CD3, CD4, CD8 positive T cells reached their nadir. From days 8 to 16, the patients presented progressive cough (29.96%), sputum production (13.09%), chest distress (29.96%) and shortness of breath (35.34%). All patients had infiltrates on chest radiograph or CT, some even with multiple infiltrates. Two weeks later, patients' respiratory symptoms started to alleviate, the infiltrates on the lung began to resolve gradually, and the counts of WBC, lymphocytes and CD3, CD4, CD8 positive T cells were restored to normal. CONCLUSION: The data reported here provide evidence that the course of SARS can be divided into four stages, namely the initial stage, progressive stage, fastigium and convalescent stage.

  14. Bayesian probability of success for clinical trials using historical data.

    Science.gov (United States)

    Ibrahim, Joseph G; Chen, Ming-Hui; Lakshminarayanan, Mani; Liu, Guanghan F; Heyse, Joseph F

    2015-01-30

    Developing sophisticated statistical methods for go/no-go decisions is crucial for clinical trials, as planning phase III or phase IV trials is costly and time consuming. In this paper, we develop a novel Bayesian methodology for determining the probability of success of a treatment regimen on the basis of the current data of a given trial. We introduce a new criterion for calculating the probability of success that allows for inclusion of covariates as well as allowing for historical data based on the treatment regimen, and patient characteristics. A new class of prior distributions and covariate distributions is developed to achieve this goal. The methodology is quite general and can be used with univariate or multivariate continuous or discrete data, and it generalizes Chuang-Stein's work. This methodology will be invaluable for informing the scientist on the likelihood of success of the compound, while including the information of covariates for patient characteristics in the trial population for planning future pre-market or post-market trials. PMID:25339499
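
As a hedged sketch of the general idea (our own normal-approximation simplification, not the authors' full covariate-adjusted model): with a flat prior, the posterior for the treatment effect after the current trial is normal around the observed estimate, and the probability of success averages the future trial's chance of significance over that posterior. This has a closed form:

```python
# Simplified Bayesian probability of success (PoS): flat prior,
# normal likelihood. delta_hat and se_cur summarize the current trial;
# se_fut is the planned future trial's standard error. All names are
# our own illustration, not the paper's notation.
import math

def norm_cdf(x: float) -> float:
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def probability_of_success(delta_hat, se_cur, se_fut, z_crit=1.96):
    # Predictive sd of the future estimate combines posterior
    # uncertainty about the effect with the future sampling error.
    pred_sd = math.sqrt(se_cur**2 + se_fut**2)
    return norm_cdf((delta_hat - z_crit * se_fut) / pred_sd)

print(f"PoS = {probability_of_success(0.5, 0.2, 0.1):.2f}")
```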

  15. The probabilities of psychiatric hospitalization of mental health clinic patients

    OpenAIRE

    Leonardo Naves dos Reis; Julio Cesar Ribeiro Simplicio; Edilaine Cristina da Silva Gherardi-Donato; Ana Carolina Guidorizzi Zanetti

    2015-01-01

    The objective of this study is to evaluate predictive factors (diagnoses and sociodemographic characteristics) for psychiatric hospitalization among mental health outpatients. The study used secondary data extracted from patient charts, analyzed through logistic regression to obtain a prediction equation for the probability of psychiatric hospitalization. The diagnoses that showed statistical significance (p < 0.05) were bipolar affective disorder, schizophrenia, anxious ...

  16. The probabilities of psychiatric hospitalization of mental health clinic patients

    Directory of Open Access Journals (Sweden)

    Leonardo Naves dos Reis

    2015-01-01

    The objective of this study is to evaluate predictive factors (diagnoses and sociodemographic characteristics) for psychiatric hospitalization among mental health outpatients. The study used secondary data extracted from patient charts, analyzed through logistic regression to obtain a prediction equation for the probability of psychiatric hospitalization. The diagnoses that showed statistical significance (p < 0.05) were bipolar affective disorder, schizophrenia, anxious disorders and depression, and the first two showed a high-magnitude association with the need for hospitalization. Age was inversely proportional to the need for hospitalization. The results may stimulate specific preventive actions in psychiatry for younger patients with schizophrenia and bipolar affective disorder.
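
The prediction equation described above is logistic in form. The coefficients below are hypothetical placeholders (the fitted values are not given in the abstract); only the shape of the calculation, including a negative age coefficient for the reported inverse relation, is the point.

```python
# Logistic prediction of hospitalization probability. Coefficients are
# INVENTED for illustration; the abstract does not report the fitted
# values, only which predictors were significant.
import math

def hospitalization_probability(age, schizophrenia, bipolar):
    # logit = intercept + b_age*age + b_sz*sz + b_bp*bp
    logit = -1.0 - 0.03 * age + 1.5 * schizophrenia + 1.4 * bipolar
    return 1.0 / (1.0 + math.exp(-logit))

# Younger patients with schizophrenia get a higher predicted probability.
print(f"{hospitalization_probability(25, schizophrenia=1, bipolar=0):.2f}")
print(f"{hospitalization_probability(60, schizophrenia=1, bipolar=0):.2f}")
```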

  17. Assessment of clinical utility of 18F-FDG PET in patients with head and neck cancer: a probability analysis

    Energy Technology Data Exchange (ETDEWEB)

    Goerres, Gerhard W.; Mosna-Firlejczyk, Katarzyna; Schulthess, Gustav K. von [Division of Nuclear Medicine, University Hospital Zurich, Raemistrasse 100, 8091, Zurich (Switzerland); Steurer, Johann; Bachmann, Lucas M. [Horten Centre, University of Zurich, Zurich (Switzerland)

    2003-04-01

    The purpose of this study was to calculate disease probabilities based on data of patients with head and neck cancer in the register of our institution and to perform a systematic review of the available data on the accuracy of PET in the primary assessment and follow-up of patients with head and neck cancer. The pre-test probability of head and neck cancer among patients in our institutional data registry was assessed. Then the published literature was selected and appraised according to a standard protocol of systematic reviews. Two reviewers independently selected and extracted data on study characteristics, quality and accuracy. Accuracy data were used to form 2 x 2 contingency tables and were pooled to produce summary receiver operating characteristic (ROC) curves and summary likelihood ratios for positive and negative testing. Finally post-test probabilities were calculated on the basis of the pre-test probabilities of this patient group. All patients had cytologically or histologically proven cancer. The prevalence of additional lymph node metastases on PET in staging examinations was 19.6% (11/56), and that of locoregional recurrence on restaging PET was 28.6% (12/42). In the primary assessment of patients, PET had positive and negative likelihood ratios of 3.9 (2.56-5.93) and 0.24 (0.14-0.41), respectively. Disease probabilities were therefore 49.4% for a positive test result and 5.7% for a negative test result. In the assessment of recurrence these values were 3.96 (2.8-5.6) and 0.16 (0.1-0.25), resulting in probabilities of 49.7% and 3.8%. PET evaluation for involvement of lymph nodes had positive and negative likelihood ratios of 17.26 (10.9-27.3) and 0.19 (0.13-0.27) for primary assessment and 11.0 (2.93-41.24) and 0.14 (0.01-1.88) for detection of recurrence. The probabilities were 81.2% and 4.5% for primary assessment and 73.3% and 3.4% for assessment of recurrence. It is concluded that in this clinical setting the main advantage of PET is the

  18. Ventilation-perfusion scanning and pulmonary angiography: correlation in clinical high-probability pulmonary embolism

    International Nuclear Information System (INIS)

    During a 3-year period, 173 clinically selected patients underwent pulmonary angiography to confirm or exclude acute pulmonary embolism. All patients had undergone ventilation-perfusion (V/Q) scanning (167 patients) or perfusion scanning alone (six) before angiography. Angiography was done because the results of the V/Q scanning did not satisfy the clinician's needs for certainty. The results of the V/Q and angiographic studies were compared to determine the relative accuracy of V/Q scanning in this clinical setting. Pulmonary embolism was found in seven (15%) of 47 patients with low-probability scans, 11 (32%) of 34 patients with intermediate-probability scans, 22 (39%) of 57 patients with indeterminate scans, and 23 (66%) of 35 patients with high-probability scans. In this clinically selected population, low-probability scans were more accurate in excluding pulmonary embolism than were high-probability scans in establishing that diagnosis.

  19. Pre-testing advertisements for effectiveness of communication

    OpenAIRE

    O'Connor, Ciara

    1991-01-01

    The purpose of this study was to examine a technique of Pre-Testing Advertisements for Effectiveness, and to seek to provide a psychological basis for such Pre-Testing. The results of this study suggest a possible system for Pre-Testing the effectiveness of advertisements. The approach taken in this study was to test the effectiveness of communication of advertisements. Three advertisements were tested on a sample of a target audience, that sample consisting of 78 people taken from thr...

  20. Providing probability distributions for the causal pathogen of clinical mastitis using naive Bayesian networks

    NARCIS (Netherlands)

    Steeneveld, W.; Gaag, van der L.C.; Barkema, H.W.; Hogeveen, H.

    2009-01-01

    Clinical mastitis (CM) can be caused by a wide variety of pathogens and farmers must start treatment before the actual causal pathogen is known. By providing a probability distribution for the causal pathogen, naive Bayesian networks (NBN) can serve as a management tool for farmers to decide which t
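
A toy naive Bayesian network in the spirit of this record can be sketched in a few lines. The pathogen priors and per-sign conditional probabilities below are invented for illustration only, not taken from the paper.

```python
# Toy naive Bayes over causal mastitis pathogens. All numbers are
# HYPOTHETICAL: priors and P(sign | pathogen) are placeholders.
PRIOR = {"E. coli": 0.30, "S. aureus": 0.25, "Strep. spp.": 0.45}
P_SIGN = {  # P(sign present | pathogen)
    "severe": {"E. coli": 0.70, "S. aureus": 0.30, "Strep. spp.": 0.20},
    "watery_milk": {"E. coli": 0.60, "S. aureus": 0.20, "Strep. spp.": 0.25},
}

def pathogen_posterior(signs):
    """Posterior distribution over pathogens given observed signs."""
    post = dict(PRIOR)
    for sign in signs:
        for pathogen in post:
            post[pathogen] *= P_SIGN[sign][pathogen]
    total = sum(post.values())        # normalize to a distribution
    return {p: v / total for p, v in post.items()}

print(pathogen_posterior(["severe", "watery_milk"]))
```

Under these placeholder numbers, a severe case with watery milk shifts the distribution strongly toward E. coli, which is the kind of output a farmer could use to choose a treatment before culture results arrive.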

  1. Maximizing the probability of satisfying the clinical goals in radiation therapy treatment planning under setup uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Fredriksson, Albin, E-mail: albin.fredriksson@raysearchlabs.com; Hårdemark, Björn [RaySearch Laboratories, Sveavägen 44, Stockholm SE-111 34 (Sweden); Forsgren, Anders [Optimization and Systems Theory, Department of Mathematics, KTH Royal Institute of Technology, Stockholm SE-100 44 (Sweden)

    2015-07-15

    Purpose: This paper introduces a method that maximizes the probability of satisfying the clinical goals in intensity-modulated radiation therapy treatments subject to setup uncertainty. Methods: The authors perform robust optimization in which the clinical goals are constrained to be satisfied whenever the setup error falls within an uncertainty set. The shape of the uncertainty set is included as a variable in the optimization. The goal of the optimization is to modify the shape of the uncertainty set in order to maximize the probability that the setup error will fall within the modified set. Because the constraints enforce the clinical goals to be satisfied under all setup errors within the uncertainty set, this is equivalent to maximizing the probability of satisfying the clinical goals. This type of robust optimization is studied with respect to photon and proton therapy applied to a prostate case and compared to robust optimization using an a priori defined uncertainty set. Results: Slight reductions of the uncertainty sets resulted in plans that satisfied a larger number of clinical goals than optimization with respect to a priori defined uncertainty sets, both within the reduced uncertainty sets and within the a priori, nonreduced, uncertainty sets. For the prostate case, the plans taking reduced uncertainty sets into account satisfied 1.4 (photons) and 1.5 (protons) times as many clinical goals over the scenarios as the method taking a priori uncertainty sets into account. Conclusions: Reducing the uncertainty sets enabled the optimization to find better solutions with respect to the errors within the reduced as well as the nonreduced uncertainty sets and thereby achieve higher probability of satisfying the clinical goals. This shows that asking for a little less in the optimization sometimes leads to better overall plan quality.
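
The trade-off described above can be illustrated with a deliberately simplified model (ours, not the authors'): for an isotropic Gaussian setup error in 3-D, the probability that the error falls inside a spherical uncertainty set of radius r has a closed form (the Maxwell distribution CDF), so shrinking the set directly lowers the coverage probability.

```python
# Coverage probability of a spherical uncertainty set under an
# isotropic 3-D Gaussian setup error: P(||e|| <= r) for
# e ~ N(0, sigma^2 I), i.e. the Maxwell distribution CDF.
import math

def coverage_probability(r_mm: float, sigma_mm: float) -> float:
    x = r_mm / sigma_mm
    return (math.erf(x / math.sqrt(2))
            - math.sqrt(2 / math.pi) * x * math.exp(-x * x / 2))

# Larger uncertainty sets cover more of the setup-error distribution.
for r in (3.0, 4.0, 5.0):
    print(f"r = {r} mm -> coverage {coverage_probability(r, sigma_mm=2.0):.1%}")
```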

  2. Maximizing the probability of satisfying the clinical goals in radiation therapy treatment planning under setup uncertainty

    International Nuclear Information System (INIS)

    Purpose: This paper introduces a method that maximizes the probability of satisfying the clinical goals in intensity-modulated radiation therapy treatments subject to setup uncertainty. Methods: The authors perform robust optimization in which the clinical goals are constrained to be satisfied whenever the setup error falls within an uncertainty set. The shape of the uncertainty set is included as a variable in the optimization. The goal of the optimization is to modify the shape of the uncertainty set in order to maximize the probability that the setup error will fall within the modified set. Because the constraints enforce the clinical goals to be satisfied under all setup errors within the uncertainty set, this is equivalent to maximizing the probability of satisfying the clinical goals. This type of robust optimization is studied with respect to photon and proton therapy applied to a prostate case and compared to robust optimization using an a priori defined uncertainty set. Results: Slight reductions of the uncertainty sets resulted in plans that satisfied a larger number of clinical goals than optimization with respect to a priori defined uncertainty sets, both within the reduced uncertainty sets and within the a priori, nonreduced, uncertainty sets. For the prostate case, the plans taking reduced uncertainty sets into account satisfied 1.4 (photons) and 1.5 (protons) times as many clinical goals over the scenarios as the method taking a priori uncertainty sets into account. Conclusions: Reducing the uncertainty sets enabled the optimization to find better solutions with respect to the errors within the reduced as well as the nonreduced uncertainty sets and thereby achieve higher probability of satisfying the clinical goals. This shows that asking for a little less in the optimization sometimes leads to better overall plan quality

  3. Clinical and radiological parameters of patients with lung thromboembolism, diagnosed by high probability ventilation / perfusion scintigraphies

    International Nuclear Information System (INIS)

    Background: pulmonary embolism (PE) remains an elusive diagnosis and still causes too many unexpected deaths, so noninvasive investigations are performed when PE is suspected. Objective: to determine the clinical and chest x-ray findings in patients with PE diagnosed by a high-probability ventilation/perfusion (V/Q) lung scan. Materials and methods: inpatient medical records of 91 patients with clinically suspected PE and high- or low-probability V/Q lung scans were analyzed (PIOPED criteria). Results: four clinical findings showed a statistical correlation: hemoptysis (p = 0.02, odds ratio = 8.925), tachycardia (p = 0.02, odds ratio = 3.5), chest pain (p = 0.01, odds ratio = 1.87), and recent surgery (p = 0.02, odds ratio = 2.762). Chest x-rays were normal in 70.7% of cases (p < 0.001). Conclusion: the clinical and x-ray findings in patients with PE diagnosed by a high-probability V/Q lung scan were hemoptysis, tachycardia, chest pain, recent surgery, and a normal chest x-ray. This is important because it would help to select the patients in whom the V/Q lung scan will have the maximal performance (author)
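Odds ratios of this kind come from a 2x2 table of clinical finding versus scan outcome. A minimal sketch with hypothetical counts (not the study's data):

```python
def odds_ratio(a, b, c, d):
    """Odds ratio for a 2x2 table:
                       finding present   finding absent
    high-prob scan            a                b
    other scan                c                d
    """
    return (a * d) / (b * c)

# Hypothetical counts for illustration only:
print(odds_ratio(8, 10, 4, 44))  # -> 8.8
```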

  4. Low-probability ventilation-perfusion scintigrams: clinical outcomes in 99 patients

    International Nuclear Information System (INIS)

    To evaluate the reliability of low-probability ventilation-perfusion (V-P) scintigrams in excluding pulmonary embolism (PE), the authors reviewed the clinical records of 99 consecutive patients whose V-P studies had been interpreted as indicative of a low probability of PE. None of the 99 patients were referred for pulmonary angiography. Seven of the hospitalized patients died during the index admission and seven additional hospitalized patients died 1-5 months after discharge from the hospital. None were thought clinically to have died as a result of PE, and autopsy disclosed no PE in two. Follow-up information was obtained for 69 surviving patients not treated with anticoagulants. None of these patients were thought clinically to have had PE during follow-up of at least 2 weeks' duration (greater than 2 months in 93% and greater than 6 months in 75%). The results suggest that major short-term morbidity or death attributable to PE is quite infrequent in patients with low-probability V-P scintigrams

  5. Effects of pathogen-specific clinical mastitis on probability of conception in Holstein dairy cows.

    Science.gov (United States)

    Hertl, J A; Schukken, Y H; Welcome, F L; Tauer, L W; Gröhn, Y T

    2014-11-01

    The objective of this study was to estimate the effects of pathogen-specific clinical mastitis (CM), occurring in different weekly intervals before or after artificial insemination (AI), on the probability of conception in Holstein cows. Clinical mastitis occurring in weekly intervals from 6 wk before until 6 wk after AI was modeled. The first 4 AI in a cow's lactation were included. The following categories of pathogens were studied: Streptococcus spp. (comprising Streptococcus dysgalactiae, Streptococcus uberis, and other Streptococcus spp.); Staphylococcus aureus; coagulase-negative staphylococci (CNS); Escherichia coli; Klebsiella spp.; cases with CM signs but no bacterial growth (above the level that can be detected from our microbiological procedures) observed in the culture sample and cases with contamination (≥ 3 pathogens in the sample); and other pathogens [including Citrobacter, yeasts, Trueperella pyogenes, gram-negative bacilli (i.e., gram-negative organisms other than E. coli, Klebsiella spp., Enterobacter, and Citrobacter), Corynebacterium bovis, Corynebacterium spp., Pasteurella, Enterococcus, Pseudomonas, Mycoplasma, Prototheca, and others]. Other factors included in the model were parity (1, 2, 3, 4 and higher), season of AI (winter, spring, summer, autumn), day in lactation of first AI, farm, and other non-CM diseases (retained placenta, metritis, ketosis, displaced abomasum). Data from 90,271 AI in 39,361 lactations in 20,328 cows collected from 2003/2004 to 2011 from 5 New York State dairy farms were analyzed in a generalized linear mixed model with a Poisson distribution. The largest reductions in probability of conception were associated with CM occurring in the week before AI or in the 2 wk following AI. Escherichia coli and Klebsiella spp. had the greatest adverse effects on probability of conception. The probability of conception for a cow with any combination of characteristics may be calculated based on the parameter estimates. These
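Under the log-link Poisson model described, a pathogen-specific clinical-mastitis coefficient translates multiplicatively into the probability of conception. A minimal sketch with entirely hypothetical coefficients (illustration only, not the study's estimates):

```python
import math

# Hypothetical coefficients on the log scale (illustration only):
BASE = math.log(0.30)  # baseline probability of conception per AI
CM_EFFECT = {"none": 0.0, "E. coli": -0.55, "Klebsiella": -0.60, "CNS": -0.15}

def p_conception(pathogen="none"):
    # Log-link model: log p = beta0 + beta_CM  =>  p = exp(beta0 + beta_CM)
    return math.exp(BASE + CM_EFFECT[pathogen])

print(round(p_conception("none"), 3))      # -> 0.3
print(round(p_conception("E. coli"), 3))
```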

  6. CT abnormality in multiple sclerosis analysis based on 28 probable cases and correlation with clinical manifestations

    International Nuclear Information System (INIS)

    In order to investigate the occurrence and nature of CT abnormality and its correlation with clinical manifestations in multiple sclerosis, 34 CT records obtained from 28 consecutive patients with probable multiple sclerosis were reviewed. Forty-six percent of all cases showed abnormal CT. Dilatation of cortical sulci was found in 39%; dilatation of the lateral ventricle in 36%; dilatation of prepontine or cerebello-pontine cistern and the fourth ventricle, suggesting brainstem atrophy, in 18%; dilatation of cerebellar sulci, superior cerebellar cistern and cisterna magna, suggesting cerebellar atrophy, in 11%. Low density area was found in the cerebral hemisphere in 11% of cases. Contrast enhancement, performed on 25 CT records, did not show any change. There was no correlation between CT abnormality and duration of the illness. Although abnormal CT tended to occur more frequently during exacerbations and chronic stable state than during remissions, the difference was not statistically significant. CT abnormalities suggesting brainstem atrophy, cerebellar atrophy or plaques were found exclusively during exacerbations and chronic stable state. The occurrence of CT abnormalities was not significantly different among various clinical forms which were classified based on clinically estimated sites of lesion, except that abnormal CT tended to occur less frequently in cases classified as the optic-spinal form. It is noteworthy that cerebral cortical atrophy and/or dilatation of the lateral ventricle were found in 31% of cases who did not show any clinical sign of cerebral involvement. There was a statistically significant correlation between CT abnormalities and levels of clinical disability. Eighty percent of the bedridden or severely disabled patients showed abnormal CT, in contrast with only 29% of those with moderate, slight or no disability. (author)

  7. Topological characteristics of brainstem lesions in clinically definite and clinically probable cases of multiple sclerosis: An MRI-study

    International Nuclear Information System (INIS)

    Disseminated lesions in the white matter of the cerebral hemispheres and confluent lesions at the borders of the lateral ventricles as seen on MRI are both considered acceptable paraclinical evidence for the diagnosis of multiple sclerosis. Similar changes are, however, also found in vascular diseases of the brain. We therefore aimed at identifying those additional traits in the infratentorial region, which in our experience are not frequently found in cerebrovascular pathology. We evaluated MR brain scans of 68 patients and found pontine lesions in 71% of cases with a clinically definite diagnosis (17 out of 24) and in 33% of cases with a probable diagnosis (14 out of 43). Lesions in the medulla oblongata were present in 50% and 16%, respectively, and in the midbrain in 25% and 7%, respectively. With rare exceptions all brainstem lesions were contiguous with the cisternal or ventricular cerebrospinal fluid spaces. In keeping with post-mortem reports the morphological spectrum ranged from large confluent patches to solitary, well delineated paramedian lesions or discrete linings of the cerebrospinal fluid border zones and were most clearly depicted from horizontal and sagittal T2 weighted SE-sequences. If there is a predilection for the outer or inner surfaces of the brainstem, such lesions can be considered an additional typical feature of multiple sclerosis and can be more reliably weighted as paraclinical evidence for a definite diagnosis. (orig.)

  8. Pre- and Post-Operative Nomograms to Predict Recurrence-Free Probability in Korean Men with Clinically Localized Prostate Cancer

    OpenAIRE

    Minyong Kang; Chang Wook Jeong; Woo Suk Choi; Yong Hyun Park; Sung Yong Cho; Sangchul Lee; Seung Bae Lee; Ja Hyeon Ku; Sung Kyu Hong; Seok-Soo Byun; Hyeon Jeong; Cheol Kwak; Hyeon Hoe Kim; Eunsik Lee; Sang Eun Lee

    2014-01-01

    OBJECTIVES: Although the incidence of prostate cancer (PCa) is rapidly increasing in Korea, there are few suitable prediction models for disease recurrence after radical prostatectomy (RP). We established pre- and post-operative nomograms estimating biochemical recurrence (BCR)-free probability after RP in Korean men with clinically localized PCa. PATIENTS AND METHODS: Our sampling frame included 3,034 consecutive men with clinically localized PCa who underwent RP at our tertiary centers from...

  9. Clinical radiobiology of glioblastoma multiforme. Estimation of tumor control probability from various radiotherapy fractionation schemes

    Energy Technology Data Exchange (ETDEWEB)

    Pedicini, Piernicola [I.R.C.C.S.-Regional-Cancer-Hospital-C.R.O.B, Unit of Nuclear Medicine, Department of Radiation and Metabolic Therapies, Rionero-in-Vulture (Italy); Department of Radiation and Metabolic Therapies, I.R.C.C.S.-Regional-Cancer-Hospital-C.R.O.B, Unit of Radiotherapy, Rionero-in-Vulture (Italy); Fiorentino, Alba [Sacro Cuore - Don Calabria Hospital, Radiation Oncology Department, Negrar, Verona (Italy); Simeon, Vittorio [I.R.C.C.S.-Regional-Cancer-Hospital-C.R.O.B, Laboratory of Preclinical and Translational Research, Rionero-in-Vulture (Italy); Tini, Paolo; Pirtoli, Luigi [University of Siena and Tuscany Tumor Institute, Unit of Radiation Oncology, Department of Medicine Surgery and Neurological Sciences, Siena (Italy); Chiumento, Costanza [Department of Radiation and Metabolic Therapies, I.R.C.C.S.-Regional-Cancer-Hospital-C.R.O.B, Unit of Radiotherapy, Rionero-in-Vulture (Italy); Salvatore, Marco [I.R.C.C.S. SDN Foundation, Unit of Nuclear Medicine, Napoli (Italy); Storto, Giovanni [I.R.C.C.S.-Regional-Cancer-Hospital-C.R.O.B, Unit of Nuclear Medicine, Department of Radiation and Metabolic Therapies, Rionero-in-Vulture (Italy)

    2014-10-15

    The aim of this study was to estimate a set of radiobiological parameters from the available clinical data on glioblastoma (GB). A number of clinical trial outcomes from patients affected by GB and treated with surgery and adjuvant radiochemotherapy were analyzed to estimate a set of radiobiological parameters for a tumor control probability (TCP) model. The analytical/graphical method employed to fit the clinical data allowed us to estimate the intrinsic tumor radiosensitivity (α), repair capability (β), and repopulation doubling time (T_d) in a first phase, and subsequently the number of clonogens (N) and the kick-off time for accelerated proliferation (T_k). The results were used to formulate a hypothesis for a schedule expected to significantly improve local control. The 95% confidence intervals (CI_95%) of all parameters are also discussed. The pooled analysis employed to estimate the parameters summarizes the data of 559 patients, while the studies selected to verify the results summarize data of 104 patients. The best estimates and the CI_95% are α = 0.12 Gy^-1 (0.10-0.14), β = 0.015 Gy^-2 (0.013-0.020), α/β = 8 Gy (5.0-10.8), T_d = 15.4 days (13.2-19.5), N = 1 × 10^4 (1.2 × 10^3 - 1 × 10^5), and T_k = 37 days (29-46). The dose required to offset the repopulation occurring after 1 day (D_prolif), starting after T_k, was estimated as 0.30 Gy/day (0.22-0.39). The analysis confirms a high value for the α/β ratio. Moreover, a high intrinsic radiosensitivity together with a long kick-off time for accelerated repopulation and moderate repopulation kinetics were found. The results indicate a substantial independence of treatment effectiveness from the duration of the overall treatment, and an improvement in effectiveness by increasing the total dose without increasing the dose per fraction. (orig.)
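The reported best estimates can be plugged into a standard Poissonian TCP model with linear-quadratic cell kill and exponential repopulation after the kick-off time. The sketch below uses that common model form, which may differ in detail from the authors' exact formulation:

```python
import math

# Best-estimate parameters reported in the abstract:
ALPHA = 0.12        # Gy^-1, intrinsic radiosensitivity
BETA = 0.015        # Gy^-2, repair capability
T_D = 15.4          # days, repopulation doubling time
N_CLONOGENS = 1e4   # number of clonogens
T_K = 37.0          # days, kick-off time for accelerated proliferation

def tcp(dose_per_fx, n_fx, overall_time_days):
    """Poissonian TCP: TCP = exp(-N * SF), with LQ log cell kill plus
    exponential repopulation switched on after T_k (standard form; the
    paper's exact formulation may differ)."""
    log_sf = (-n_fx * (ALPHA * dose_per_fx + BETA * dose_per_fx ** 2)
              + math.log(2) * max(0.0, overall_time_days - T_K) / T_D)
    return math.exp(-N_CLONOGENS * math.exp(log_sf))

# Conventional 60 Gy in 30 x 2 Gy fractions over ~6 weeks:
print(round(tcp(2.0, 30, 42.0), 2))
```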

  10. Planning and pre-testing: the key to effective AIDS education materials.

    Science.gov (United States)

    Ostfield, M L; Romocki, L S

    1991-06-01

    The steps in designing and producing effective AIDS prevention educational materials are outlined, using as an example a brochure originated in St. Lucia for clients at STD clinics. The brochure was intended to be read by clients as they waited for their consultation; thus it was targeted to a specific audience delimited by age, sex, language, educational level, religion, and associated medical or behavioral characteristics. When researching the audience, it is necessary to learn the medium they best respond to, what they already know, what their present behavior is, how they talk about AIDS, what terms they use, how they perceive the benefits of AIDS prevention behavior, and what sources of information they trust. The minimum number of key messages should be selected. Next, the most appropriate channel of communication is identified. Mass media are not always best for a target audience; "little media" such as flyers and give-aways may be better. The draft is then pre-tested by focus groups and interviews, querying about the text separately, then images, color, format, and style. Listen to the way the respondents talk about the draft. Modify the draft and pre-test again. Fine-tune implications of the message for realism in emotional responses, respect, self-esteem, admiration, and trust. To achieve wide distribution it is a good idea to involve community leaders in production of the materials, so they will be more likely to take part in the distribution process. PMID:12316892

  11. A Clinical model to identify patients with high-risk coronary artery disease

    NARCIS (Netherlands)

    Y. Yang (Yelin); L. Chen (Li); Y. Yam (Yeung); S. Achenbach (Stephan); M. Al-Mallah (Mouaz); D.S. Berman (Daniel); M.J. Budoff (Matthew); F. Cademartiri (Filippo); T.Q. Callister (Tracy); H.-J. Chang (Hyuk-Jae); V.Y. Cheng (Victor); K. Chinnaiyan (Kavitha); R.C. Cury (Ricardo); A. Delago (Augustin); A. Dunning (Allison); G.M. Feuchtner (Gudrun); M. Hadamitzky (Martin); J. Hausleiter (Jörg); R.P. Karlsberg (Ronald); P.A. Kaufmann (Philipp); Y.-J. Kim (Yong-Jin); J. Leipsic (Jonathon); T.M. LaBounty (Troy); F.Y. Lin (Fay); E. Maffei (Erica); G.L. Raff (Gilbert); L.J. Shaw (Leslee); T.C. Villines (Todd); J.K. Min (James K.); B.J.W. Chow (Benjamin)

    2015-01-01

    Objectives This study sought to develop a clinical model that identifies patients with and without high-risk coronary artery disease (CAD). Background Although current clinical models help to estimate a patient's pre-test probability of obstructive CAD, they do not accurately identify th
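Pre-test probability estimates such as these are combined with test results via Bayes' theorem on the odds scale. A minimal sketch (the numbers are hypothetical, not from this study):

```python
def post_test_probability(pretest_p, likelihood_ratio):
    """Bayes on the odds scale: post-test odds = LR * pre-test odds."""
    pre_odds = pretest_p / (1.0 - pretest_p)
    post_odds = likelihood_ratio * pre_odds
    return post_odds / (1.0 + post_odds)

# e.g. a 20% pre-test probability and a positive test with LR+ = 8:
print(round(post_test_probability(0.20, 8.0), 3))  # -> 0.667
```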

  12. Clinical characteristics of two probable cases of Angelman syndrome in the Hospital Nacional de Ninos

    International Nuclear Information System (INIS)

    Angelman Syndrome is a severe neurological disorder. No other case has been reported in our country until now. Two children with the clinical suspicion of Angelman Syndrome are reported. They were treated at the Departamento de Neurologia del Hospital Nacional de Ninos, and the information was taken from their medical records. The two patients present the four cardinal clinical features, including severe developmental delay, profound speech impairment, ataxia, and a happy, sociable disposition. In addition, the patients displayed other characteristics: seizures associated with typical spike-and-slow-wave activity on EEG and a love for water. The clinical diagnosis is difficult because other disorders can mimic the features of Angelman Syndrome. Nonetheless, at an early age, the behavioral phenotype of happy disposition and hyperexcitability is the most important manifestation and appears to be decisive in the differential diagnosis of patients with psychomotor and language delay. (author)

  13. Oculodentodigital dysplasia: study of ophthalmological and clinical manifestations in three boys with probably autosomal recessive inheritance.

    Science.gov (United States)

    Frasson, Maria; Calixto, Nassim; Cronemberger, Sebastião; de Aguiar, Regina Amélia Lopes Pessoa; Leão, Letícia Lima; de Aguiar, Marcos José Burle

    2004-09-01

    Oculodentodigital dysplasia (ODDD) is a rare inherited disorder affecting the development of the face, eyes, teeth, and limbs. The majority of cases of ODDD are inherited as an autosomal dominant condition. There are few reports of probable autosomal recessive transmission. Affected patients exhibit a distinctive physiognomy with a narrow nose, hypoplastic alae nasi, and anteverted nostrils, bilateral microphthalmos, and microcornea. Sometimes iris anomalies and secondary glaucoma are present. There are malformations of the distal extremities such as syndactyly. In addition, there are defects in the dental enamel with hypoplasia and yellow discoloration of the teeth. Less common features include hypotrichosis, intracranial calcifications, and conductive deafness secondary to recurrent otitis media. We describe three brothers with ODDD. Their parents are first cousins and present no features of ODDD. These data are in favor of autosomal recessive inheritance and suggest genetic heterogeneity for this entity. PMID:15512999

  14. Clinical and Electrophysiological Studies of a Family with Probable X-linked Dominant Charcot-Marie-Tooth Neuropathy and Ptosis.

    Directory of Open Access Journals (Sweden)

    Tony Wu

    2004-07-01

    Full Text Available Background: X-linked dominant Charcot-Marie-Tooth neuropathy (CMTX) is a hereditary motor and sensory neuropathy linked to a variety of mutations in the connexin32 (Cx32) gene. The clinical and genetic features of CMTX have not previously been reported in Taiwanese patients. Methods: Clinical evaluations and electrophysiological studies were carried out on 25 family members of a Taiwanese family group. Molecular genetic analysis of the Cx32 gene was performed. A sural nerve biopsy was obtained from 1 patient. Results: Nine patients had clinical features of X-linked dominant inheritance and a moderate Charcot-Marie-Tooth (CMT) neuropathy phenotype. Molecular genetic analysis showed no mutation of the Cx32 coding region, but revealed a G-to-A transition at position -215 of the nerve-specific promoter P2 of the Cx32 gene. Ptosis is 1 clinical manifestation of neuropathy in this probable CMTX family. Familial hyperthyroidism is an additional independent feature of the family. Electrophysiological and histological studies showed features of axonal neuropathy. Multimodality evoked potential studies revealed normal central motor and sensory conduction velocities. Conclusions: The presence of ptosis in this family illustrates the existence of clinical heterogeneity among related family members with CMTX, similar to that in CMT of autosomal inheritance. Electrophysiological and histological findings revealed normal central conduction and axonal neuropathy.

  15. Pre- and post-operative nomograms to predict recurrence-free probability in korean men with clinically localized prostate cancer.

    Directory of Open Access Journals (Sweden)

    Minyong Kang

    Full Text Available OBJECTIVES: Although the incidence of prostate cancer (PCa) is rapidly increasing in Korea, there are few suitable prediction models for disease recurrence after radical prostatectomy (RP). We established pre- and post-operative nomograms estimating biochemical recurrence (BCR)-free probability after RP in Korean men with clinically localized PCa. PATIENTS AND METHODS: Our sampling frame included 3,034 consecutive men with clinically localized PCa who underwent RP at our tertiary centers from June 2004 through July 2011. After exclusion of inappropriate data, we evaluated 2,867 patients for the development of nomograms. The Cox proportional hazards regression model was used to develop pre- and post-operative nomograms that predict BCR-free probability. Finally, we resampled from our study cohort 200 times to determine the accuracy of our nomograms on internal validation, designated with the concordance index (c-index) and further represented by calibration plots. RESULTS: Over a median of 47 months of follow-up, the estimated BCR-free rate was 87.8% (1 year), 83.8% (2 years), and 72.5% (5 years). In the pre-operative model, prostate-specific antigen (PSA), the proportion of positive biopsy cores, clinical T3a, and biopsy Gleason score (GS) were independent predictive factors for BCR, while all relevant predictive factors (PSA, extra-prostatic extension, seminal vesicle invasion, lymph node metastasis, surgical margin, and pathologic GS) were associated with BCR in the post-operative model. The c-index representing predictive accuracy was 0.792 (pre-operative) and 0.821 (post-operative), showing good fit in the calibration plots. CONCLUSIONS: In summary, we developed pre- and post-operative nomograms predicting BCR-free probability after RP in a large Korean cohort with clinically localized PCa. These nomograms will be provided as the mobile application-based SNUH Prostate Cancer Calculator. Our nomograms can determine patients at high risk of disease recurrence
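A Cox-model nomogram of this kind evaluates S(t|x) = S0(t)^exp(lp), where lp is the linear predictor built from the covariates. A minimal sketch with entirely hypothetical baseline survival and coefficients (not the published nomogram; the model is centered so that lp = 0 reproduces the baseline):

```python
import math

# Hypothetical values for illustration only (not the published nomogram):
S0_5YR = 0.80  # baseline 5-year BCR-free probability (at lp = 0)
COEF = {"log_psa": 0.50, "pos_core_frac": 1.20, "ct3a": 0.60, "gleason_ge8": 0.90}

def bcr_free_5yr(psa, pos_core_frac, ct3a, gleason_ge8):
    # Cox proportional hazards: S(t|x) = S0(t) ** exp(linear predictor)
    lp = (COEF["log_psa"] * math.log(psa)
          + COEF["pos_core_frac"] * pos_core_frac
          + COEF["ct3a"] * ct3a
          + COEF["gleason_ge8"] * gleason_ge8)
    return S0_5YR ** math.exp(lp)

print(round(bcr_free_5yr(psa=6.0, pos_core_frac=0.25, ct3a=0, gleason_ge8=0), 3))
```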

  16. The Benefits of Including Clinical Factors in Rectal Normal Tissue Complication Probability Modeling After Radiotherapy for Prostate Cancer

    International Nuclear Information System (INIS)

    Purpose: To study the impact of clinical predisposing factors on rectal normal tissue complication probability modeling using the updated results of the Dutch prostate dose-escalation trial. Methods and Materials: Toxicity data of 512 patients (conformally treated to 68 Gy [n = 284] and 78 Gy [n = 228]) with complete follow-up at 3 years after radiotherapy were studied. Scored end points were rectal bleeding, high stool frequency, and fecal incontinence. Two traditional dose-based models (Lyman-Kutcher-Burman [LKB] and Relative Seriality [RS]) and a logistic model were fitted using a maximum likelihood approach. Furthermore, these model fits were improved by including the most significant clinical factors. The area under the receiver operating characteristic curve (AUC) was used to compare the discriminating ability of all fits. Results: Including clinical factors significantly increased the predictive power of the models for all end points. In the optimal LKB, RS, and logistic models for rectal bleeding and fecal incontinence, the first significant (p = 0.011–0.013) clinical factor was “previous abdominal surgery.” As second significant (p = 0.012–0.016) factor, “cardiac history” was included in all three rectal bleeding fits, whereas including “diabetes” was significant (p = 0.039–0.048) in fecal incontinence modeling, but only in the LKB and logistic models. High stool frequency fits only benefited significantly (p = 0.003–0.006) from the inclusion of the baseline toxicity score. The rectal bleeding fits had the highest AUC (0.77) for all models, compared with 0.63 for high stool frequency and 0.68 for fecal incontinence. LKB and logistic model fits resulted in similar values for the volume parameter. The steepness parameter was somewhat higher in the logistic model, also resulting in a slightly lower D50. Anal wall DVHs were used for fecal incontinence, whereas anorectal wall dose best described the other two end points. Conclusions
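The LKB model referenced above maps a generalized equivalent uniform dose (gEUD) through a probit function with parameters TD50 (dose for 50% complication probability), m (steepness), and the volume parameter n. A sketch with hypothetical parameter values (not the fitted values from this trial):

```python
import math

def geud(doses, volumes, n):
    """Generalized EUD of a (relative-volume) DVH; a = 1/n."""
    a = 1.0 / n
    return sum(v * d ** a for d, v in zip(doses, volumes)) ** (1.0 / a)

def lkb_ntcp(eud, td50, m):
    """LKB NTCP: standard normal CDF of t = (EUD - TD50) / (m * TD50)."""
    t = (eud - td50) / (m * td50)
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

# Hypothetical rectal parameters for illustration:
print(round(lkb_ntcp(eud=65.0, td50=80.0, m=0.15), 3))  # -> 0.106
```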

  17. Validation of a clinical-radiographic score to assess the probability of pulmonary tuberculosis in suspect patients with negative sputum smears

    OpenAIRE

    Soto, A.; Solari, L.; Díaz, J.; Mantilla, A.; Matthys, F.; Van der Stuyft, P.

    2011-01-01

    Background: Clinical suspects of pulmonary tuberculosis in whom the sputum smears are negative for acid-fast bacilli represent a diagnostic challenge in resource-constrained settings. Our objective was to validate an existing clinical-radiographic score that assesses the probability of smear-negative pulmonary tuberculosis (SNPT) in high-incidence settings in Peru. Methodology/Principal Findings: In two referral hospitals in Lima, we included patients with clinical suspicion of pulmonary ...

  18. Use of CT Angiography in a Country with Low Pulmonary Embolism Prevalence: Correlation with Clinical Pretest Probability and D-dimer Values

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Gee Won; Jeong, Yeon Joo; Kim, Chang Won [Dept. of Radiology, Pusan National University Hospital, Pusan National University School of Medicine and Medical Research Institute, Pusan (Korea, Republic of); Chun, Sung Won; Kim, Yeong Dae [Dept. of Cardiovascular and Thoracic Surgery, Pusan National University Hospital, Pusan National University School of Medicine and Medical Research Institute, Pusan (Korea, Republic of); Kim, Kun Il [Dept. of Radiology, Pusan National University Yangsan Hospital, Pusan National University School of Medicine and Medical Research Institute, Yangsan (Korea, Republic of); Song, Jong Woon [Dept. of Radiology, Haeundae Paik Hospital, Inje University School of Medicine, Pusan (Korea, Republic of)

    2011-05-15

    To assess the use of CT angiography (CTA) in the diagnostic evaluation of pulmonary thromboembolism (PE) in a country with low PE prevalence and correlate the diagnostic performance of CTA with the clinical pretest probability and D-dimer values. The institutional review board approved this retrospective study. The observers reviewed all 660 CTAs and calculated the PE clot burden scores. The pretest probability of PE according to the Wells criteria and D-dimer values were calculated (clinical data were available for 371 of the 660 patients). We correlated the PE positivity rates of CTA and a PE clot burden score with the D-dimer values and pretest probability using Pearson's correlation coefficient. Of the 371 patients whose clinical data were available, 122 (32.8%) had PEs. None of the patients with both a normal D-dimer value and a low clinical probability had a PE. PE positivity rates of CTA were correlated with clinical pretest probability (r = 0.164, p = 0.002) and D-dimer values (r = 0.361, p < 0.001). PE clot burden scores were correlated with D-dimer values (r = 0.296, p < 0.001). Although PE positivity rates of CTA in a country with low prevalence were higher than those in a country with a higher prevalence, approximately 30% of the yield still represents an overuse of CTA. CTA should be performed after the pretest probability has been assigned and if the result of a D-dimer assay is abnormal.
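The Wells pretest probability used here is a weighted clinical checklist. The sketch below uses the standard item weights and the original three-level thresholds; treat it as illustrative, since local protocols and cut-offs vary:

```python
# Standard Wells criteria weights for pulmonary embolism:
WELLS = {
    "signs_of_dvt": 3.0,
    "pe_most_likely_dx": 3.0,
    "heart_rate_gt_100": 1.5,
    "immobilization_or_recent_surgery": 1.5,
    "previous_dvt_or_pe": 1.5,
    "hemoptysis": 1.0,
    "malignancy": 1.0,
}

def wells_pretest(findings):
    """Sum the weights of the present findings and map the total to the
    three-level pretest probability category (<2 low, 2-6 moderate, >6 high)."""
    score = sum(WELLS[f] for f in findings)
    if score < 2.0:
        category = "low"
    elif score <= 6.0:
        category = "moderate"
    else:
        category = "high"
    return score, category

print(wells_pretest({"heart_rate_gt_100", "immobilization_or_recent_surgery"}))
```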

  19. Lexicographic probability, conditional probability, and nonstandard probability

    OpenAIRE

    Halpern, Joseph Y.

    2003-01-01

    The relationship between Popper spaces (conditional probability spaces that satisfy some regularity conditions), lexicographic probability systems (LPS's), and nonstandard probability spaces (NPS's) is considered. If countable additivity is assumed, Popper spaces and a subclass of LPS's are equivalent; without the assumption of countable additivity, the equivalence no longer holds. If the state space is finite, LPS's are equivalent to NPS's. However, if the state space is infinite, NPS's are ...

  20. 40 CFR 86.1334-84 - Pre-test engine and dynamometer preparation.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 19 2010-07-01 2010-07-01 false Pre-test engine and dynamometer preparation. 86.1334-84 Section 86.1334-84 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... from the secondary dilution tunnel . Particulate sample filters need not be stabilized or weighed,...

  1. Free Fall Misconceptions: Results of a Graph Based Pre-Test of Sophomore Civil Engineering Students

    Science.gov (United States)

    Montecinos, Alicia M.

    2014-01-01

    A partially unusual behaviour was found among 14 sophomore civil engineering students who took a pre-test for a free-fall laboratory session, in the context of a general mechanics course. An analysis of the consistency between the students' mathematical and physical models was made. In all cases, the students presented evidence favoring a correct free…

  2. Pre-test analyses for the NESC1 spinning cylinder experiment

    International Nuclear Information System (INIS)

    The spinning cylinder experiment organised by the Network for the Evaluation of Steel Components (NESC) is designed to investigate the cleavage initiation behaviour of both surface-breaking and subclad defects in simulated end-of-life RPV material exposed to a pressurised thermal shock transient. Pre-test structural integrity assessments are performed by the NESC Structural Analysis Task Group (TG3). The results of these structural integrity assessments are used to determine the design of the experiment and especially the sizes of the introduced defects. In this report the results of the pre-test analyses performed by the group Applied Mechanics at ECN - Nuclear Energy are described. Elastic as well as elasto-plastic structural analyses are performed for a surface-breaking and a subclad defect in a forged cylinder with a 4 mm cladding. The semi-elliptical defects have a depth of 40 mm and an aspect ratio of 1:3. (orig.)

  3. Screening for pulmonary embolism with a D-dimer assay: do we still need to assess clinical probability as well?

    OpenAIRE

    Hammond, Christopher J; Hassan, Tajek B.

    2005-01-01

    Clinical risk stratification and D-dimer assay can be of use in excluding pulmonary embolism in patients presenting to emergency departments but many D-dimer assays exist and their accuracy varies. We used clinical risk stratification combined with a quantitative latex-agglutination D-dimer assay to screen patients before arranging further imaging if required. Retrospective analysis of a sequential series of 376 patients revealed that no patient with a D-dimer of

  4. Pre-test calculations of SPES experiment - a loss of main feedwater transient

    International Nuclear Information System (INIS)

    Results of a pre-test calculation of the international standard experiment ISP-22 SPES are shown in this paper. The SPES facility represents a model of a three-loop PWR power plant, which was used to perform an experimental loss-of-main-feedwater transient with delayed emergency feedwater. The calculation was performed with the RELAP5/MOD2/36.1 computer code, which we had converted to VAX computers. (author)

  5. Prospective validation of a risk calculator which calculates the probability of a positive prostate biopsy in a contemporary clinical cohort

    NARCIS (Netherlands)

    van Vugt, Heidi A.; Kranse, Ries; Steyerberg, Ewout W.; van der Poel, Henk G.; Busstra, Martijn; Kil, Paul; Oomens, Eric H.; de Jong, Igle J.; Bangma, Chris H.; Roobol, Monique J.

    2012-01-01

    Background: Prediction models need validation to assess their value outside the development setting. Objective: To assess the external validity of the European Randomised study of Screening for Prostate Cancer (ERSPC) Risk Calculator (RC) in a contemporary clinical cohort. Methods: The RC calculates

  6. Quantum probability

    CERN Document Server

    Gudder, Stanley P

    2014-01-01

    Quantum probability is a subtle blend of quantum mechanics and classical probability theory. Its important ideas can be traced to the pioneering work of Richard Feynman in his path integral formalism. Only recently have the concept and ideas of quantum probability been presented in a rigorous axiomatic framework, and this book provides a coherent and comprehensive exposition of this approach. It gives a unified treatment of operational statistics, generalized measure theory and the path integral formalism that can only be found in scattered research articles. The first two chapters survey the ne

  7. Ruin probabilities

    DEFF Research Database (Denmark)

    Asmussen, Søren; Albrecher, Hansjörg

    The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle and the connection to other applied probability areas, like queueing theory. In this substantially updated and extended second version, new topics include stochastic control, fluctuation theory for Lévy processes, Gerber–Shiu functions and dependence.

  8. Probability-1

    CERN Document Server

    Shiryaev, Albert N

    2016-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.

  9. Confirmed Datura poisoning in a horse most probably due to D. ferox in contaminated tef hay : clinical communication

    Directory of Open Access Journals (Sweden)

    R. Gerber

    2006-06-01

    Two out of a group of 23 mares exposed to tef hay contaminated with Datura ferox (and possibly D. stramonium) developed colic. The 1st animal was unresponsive to conservative treatment, underwent surgery for severe intestinal atony and had to be euthanased. The 2nd was less seriously affected, responded well to analgesics and made an uneventful recovery. This horse exhibited marked mydriasis on the first 2 days of being poisoned and showed protracted, milder mydriasis for a further 7 days. Scopolamine was chemically confirmed in urine from this horse for 3 days following the colic attack, while atropine could just be detected for 2 days. Scopolamine was also the main tropane alkaloid found in the contaminating plant material, confirming that this had most probably been a case of D. ferox poisoning. Although Datura intoxication of horses from contaminated hay was suspected previously, this is the 1st case where the intoxication could be confirmed by urine analysis for tropane alkaloids. Extraction and detection methods for atropine and scopolamine in urine are described, employing enzymatic hydrolysis followed by liquid-liquid extraction and liquid chromatography tandem mass spectrometry (LC/MS/MS).

  10. Transition probabilities of HER2-positive and HER2-negative breast cancer patients treated with Trastuzumab obtained from a clinical cancer registry dataset.

    Science.gov (United States)

    Pobiruchin, Monika; Bochum, Sylvia; Martens, Uwe M; Kieser, Meinhard; Schramm, Wendelin

    2016-06-01

    Records of female breast cancer patients were selected from a clinical cancer registry and separated into three cohorts according to HER2-status (human epidermal growth factor receptor 2) and treatment with or without Trastuzumab (a humanized monoclonal antibody). Propensity score matching was used to balance the cohorts. Afterwards, documented information about disease events (recurrence of cancer, metastases, remission of local/regional recurrences, remission of metastases and death) found in the dataset was leveraged to calculate the annual transition probabilities for every cohort. PMID:27054173
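As an aside, the counting step behind such annual transition probabilities can be sketched in a few lines. The state names, cohort size and event counts below are invented for illustration and are not taken from the registry study above.

```python
# Hypothetical sketch: estimating annual transition probabilities from
# counts of disease events in a registry cohort. All data are invented.

from collections import Counter

def annual_transition_probabilities(transitions, at_risk):
    """transitions: list of (from_state, to_state) events observed in one year.
    at_risk: dict mapping state -> number of patients in that state at the
    start of the year. Returns {(from_state, to_state): probability}."""
    counts = Counter(transitions)
    return {pair: n / at_risk[pair[0]] for pair, n in counts.items()}

# Invented example: 200 disease-free patients at the start of the year,
# 12 recurrences and 3 deaths documented during that year.
events = ([("disease-free", "recurrence")] * 12
          + [("disease-free", "death")] * 3)
probs = annual_transition_probabilities(events, {"disease-free": 200})
print(probs)  # transition probabilities keyed by (from_state, to_state)
```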

  11. Pre-test analytical support for experiments quench-10, -11 and -12

    International Nuclear Information System (INIS)

    Pre-test analyses using MELCOR 1.8.5, SCDAP/RELAP5 and SCDAPSIM have been performed in collaboration between PSI and FZK to support the FZK QUENCH programme of electrically heated bundle tests on reflood of a degraded core. The experiments include QUENCH-10 and -11, recently carried out in the EU 5th Framework LACOMERA programme, with analytical support in the 6th Framework SARNET network of excellence, and QUENCH-12, to be performed in 2006 in support of the project ISTC1648-2. Each test involves novel features that pose challenges for the planning analyses used to determine the test protocol, and that require code and input changes to accommodate the test conditions. Special versions of the SCDAP codes were developed to simulate the effect of air on Zircaloy oxidation in the PWR air-ingress test QUENCH-10, following pre-oxidation in steam. The analyses highlighted potential difficulties during the air oxidation and reflood phases that were avoided by changes in the test protocol. A more gradual thermal excursion could be achieved, facilitating control of the test and interpretation of data, and minimising the risk of a major excursion during quench. QUENCH-11 involved the steady boildown of an initially water-filled PWR bundle. Additional heating and water supplies were needed to give the desired conditions, and these needed to be tightly specified. Data from pre-tests with lower maximum temperatures were used to benchmark the models for defining the main test. QUENCH-12 examines the effect of the WWER bundle configuration and cladding on heat-up, oxidation, and quench response. The bundle is significantly modified, with changes to cladding material (Zr/1%Nb instead of Zry-4), electrical heating, and geometry, hence to radiative heat transfer, hydraulics and oxidation characteristics. Oxidation correlations for Zr/1%Nb in steam were introduced into special versions of SCDAP. Pre-test calculations suggest that the modified kinetics have only a minor effect on the thermal response, but

  12. Pre-test analysis for the KNGR DVI performance test facility using FLUENT

    International Nuclear Information System (INIS)

    Pre-test analysis using the FLUENT code has been performed for the KNGR (Korean Next Generation Reactor) DVI (Direct Vessel Injection) performance test facility, which is a full-height, 1/24.3 volume-scaled separate effect test facility. An ideal gas discharge condition is considered to simulate the steam discharge condition. The scale effects on the flow pattern, pressure distribution, and similarity for the scaled model are numerically tested. From the various results for the scale effects, it was found that hydraulic similarity holds for the scaled model.

  13. A clinical case of a patient with probable cerebral autosomal dominant arteriopathy with subcortical infarcts and leukoencephalopathy (CADASIL) from Chuvashia

    Directory of Open Access Journals (Sweden)

    Tatiana Vladimirovna Mokina

    2015-10-01

    Cerebral autosomal dominant arteriopathy with subcortical infarcts and leukoencephalopathy (CADASIL) is a hereditary small-vessel disease presenting with recurrent lacunar infarcts and leading to gradually progressive subcortical, pseudobulbar, and cerebellar syndromes and dementia. Neuroimaging reveals multiple lacunar infarcts in the basal ganglia, thalamus, pons Varolii, and cerebral hemispheric white matter, as well as cerebral atrophy. A specific feature of the disease is white matter lesions adjacent to the temporal horns of the lateral ventricles and to the external capsules. The paper describes a patient with CADASIL. The disease runs a progressive course and includes the following neurological disorders: cognitive, pyramidal, extrapyramidal, and axial. This clinical case was differentiated from multiple sclerosis, taking neuroimaging findings into account. CADASIL is a rare, potentially menacing neurological condition that is observed in young patients and requires detailed examination using current diagnostic techniques.

  14. Brief Cognitive Screening Battery (BCSB) is a very useful tool for diagnosis of probable mild Alzheimer's disease in a geriatric clinic.

    Science.gov (United States)

    Fichman-Charchat, Helenice; Miranda, Cristina Vieira; Fernandes, Conceição Santos; Mograbi, Daniel; Oliveira, Rosinda Martins; Novaes, Regina; Aguiar, Daniele

    2016-02-01

    The diagnosis of early signs of Alzheimer's disease (AD) is a major challenge in a heterogeneous population. Objective: To investigate the use of the Brief Cognitive Screening Battery (BCSB) for the diagnosis of mild AD in a geriatric outpatient unit of a public hospital in the city of Rio de Janeiro. Method: The BCSB was administered to 51 elderly adults with a clinical diagnosis of probable AD and 123 older adults without dementia (non-AD). Results: AD patients performed worse than the non-AD group on all BCSB tests, except Clock Drawing (p = 0.10). ROC curves and logistic regression analysis indicated that delayed recall in the figure memory test was the best predictor, screening mild AD patients with sensitivity and specificity above 80%. Conclusion: The BCSB was accurate in identifying people with AD in a geriatric outpatient clinic at a public hospital, including elderly people with chronic diseases, physical frailty and cognitive impairment. PMID:26690839

  15. Pre-test analysis results of a PWR steel lined pre-stressed concrete containment model

    International Nuclear Information System (INIS)

    Pre-stressed concrete nuclear containment serves as the ultimate barrier against the release of radioactivity to the environment, and this barrier must be checked for its ultimate load carrying capacity. BARC participated in a Round Robin analysis activity, co-sponsored by Sandia National Laboratories, USA, and the Nuclear Power Engineering Corporation, Japan, for the pre-test prediction of a 1:4 scale Pre-stressed Concrete Containment Vessel. The in-house finite element code ULCA was used to make the test predictions of displacements and strains at the standard output locations. The present report focuses on the important landmarks of the pre-test results, in sequential terms of first crack appearance, loss of pre-stress, first through-thickness crack, rebar and liner yielding, and finally liner tearing at the ultimate load. Global and local failure modes of the containment have been obtained from the analysis. Finally, the sensitivity of the numerical results to different types of liners and different constitutive models, in terms of bond strength between concrete and steel and tension-stiffening parameters, is examined. The report highlights the important features which could be observed during the test, and guidelines are given for improving the prediction in the post-test computation after the test data are available. (author)

  16. A well test analysis method accounting for pre-test operations

    International Nuclear Information System (INIS)

    We propose to use regular monitoring data from a production or injection well for estimating the formation hydraulic properties in the vicinity of the wellbore without interrupting operations. In our approach, we select a portion of the pumping data over a certain time interval and then derive our conclusions from analysis of these data. A distinctive feature of the proposed approach, distinguishing it from conventional methods, is the introduction of an additional parameter, an effective pre-test pumping rate. The additional parameter is derived from a rigorous asymptotic analysis of the flow model. Thus, we account for the non-uniform pressure distribution at the beginning of the testing time interval caused by pre-test operations at the well. Using synthetic and field examples, we demonstrate that deviation of the matching curve from the data, usually attributed to skin and wellbore storage effects, can also be interpreted through this new parameter. Moreover, with our method the data curve is matched equally well and the results of the analysis remain stable when the analyzed data interval is perturbed, whereas traditional methods are sensitive to the choice of the data interval. A special efficient minimization procedure has been developed for searching for the best-fitting parameters. We enhanced the analysis with a procedure for estimating the ambient reservoir pressure and a dimensionless wellbore radius. The methods reported here have been implemented in the code ODA (Operations Data Analysis). A beta version of the code is available for free testing and evaluation to interested parties.
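The idea of adding one extra parameter to absorb pre-test operations and fitting it alongside the usual model parameters can be illustrated with a toy example. The model below (a logarithmic drawdown curve with a time shift standing in for the pre-test history) and the data are invented for illustration; they are not the authors' actual formulation.

```python
# Illustrative only: fit drawdown data with an extra "pre-test" parameter t0,
# using a grid search over t0 and a closed-form least-squares line fit for
# the remaining parameters. Model and data are invented:
#   s(t) = m * ln(t + t0) + c

import math

def fit_with_pretest_shift(times, drawdowns, t0_candidates):
    """Grid-search t0; for each candidate, fit m and c by ordinary least
    squares on x = ln(t + t0). Returns (sse, t0, m, c) for the best fit."""
    best = None
    for t0 in t0_candidates:
        xs = [math.log(t + t0) for t in times]
        n = len(xs)
        mx = sum(xs) / n
        my = sum(drawdowns) / n
        sxx = sum((x - mx) ** 2 for x in xs)
        sxy = sum((x - mx) * (y - my) for x, y in zip(xs, drawdowns))
        m = sxy / sxx
        c = my - m * mx
        sse = sum((m * x + c - y) ** 2 for x, y in zip(xs, drawdowns))
        if best is None or sse < best[0]:
            best = (sse, t0, m, c)
    return best

# Synthetic data generated with t0 = 5.0, m = 2.0, c = 1.0
times = [1.0, 2.0, 5.0, 10.0, 20.0, 50.0]
data = [2.0 * math.log(t + 5.0) + 1.0 for t in times]
sse, t0, m, c = fit_with_pretest_shift(times, data, [0.0, 2.5, 5.0, 7.5])
print(t0, round(m, 3), round(c, 3))  # 5.0 2.0 1.0
```

The point of the toy: a misfit that might otherwise be blamed on skin or wellbore storage can instead be absorbed by the extra parameter, exactly as the abstract argues.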

  17. Pre-test CFD Calculations for a Bypass Flow Standard Problem

    Energy Technology Data Exchange (ETDEWEB)

    Rich Johnson

    2011-11-01

    The bypass flow in a prismatic high temperature gas-cooled reactor (HTGR) is the flow that occurs between adjacent graphite blocks. Gaps exist between blocks due to variations in their manufacture and installation and because of the expansion and shrinkage of the blocks from heating and irradiation. Although the temperature of the fuel compacts and graphite is sensitive to the presence of bypass flow, there is great uncertainty in the magnitude and effects of the bypass flow. The Next Generation Nuclear Plant (NGNP) program at the Idaho National Laboratory has undertaken to produce experimental data on isothermal bypass flow between three adjacent graphite blocks. These data are intended to provide validation for computational fluid dynamics (CFD) analyses of the bypass flow. Such validation data sets are called Standard Problems in the nuclear safety analysis field. Details of the experimental apparatus as well as several pre-test calculations of the bypass flow are provided. Pre-test calculations are useful for examining the nature of the flow and for revealing any problems associated with the flow and its measurement. The apparatus is designed to provide three different gap widths in the vertical direction (the direction of the normal coolant flow) and two gap widths in the horizontal direction. It is expected that the vertical bypass flow will range from laminar to transitional to turbulent flow for the different gap widths that will be available.

  18. Estimation of the mediastinal involvement probability in non-small cell lung cancer: a statistical definition of the clinical target volume for 3-dimensional conformal radiotherapy?

    International Nuclear Information System (INIS)

    Purpose. - Conformal irradiation of non-small cell lung carcinoma (NSCLC) is largely based on a precise definition of the nodal clinical target volume (CTVn). Reducing the number of nodal stations to be irradiated would make tumor dose escalation more achievable. The aim of this work was to design a mathematical tool, based on documented data, that would predict the risk of metastatic involvement for each nodal station. Methods and material. - From the large surgical series published in the literature we identified the main pre-treatment parameters that modify the risk of nodal invasion. The probability of involvement for the 17 nodal stations described by the American Thoracic Society (ATS) was computed from all these publications and then weighted according to the French epidemiological data. Starting from the primary location of the tumour as the main characteristic, we built a probabilistic tree for each nodal station representing the risk distribution as a function of each tumour feature. From the statistical point of view, we used the inversion of probability trees method described by Weinstein and Feinberg. Results. - Taking into account all the different parameters of the pre-treatment staging relative to each level of the ATS map yields up to 20,000 different combinations. The first parameters chosen in the tree were, depending on the tumour location, the histological classification, the metastatic stage, the nodal stage weighted as a function of the sensitivity and specificity of the diagnostic examination used (PET scan, CAT scan), and the tumoral stage. Software is proposed to compute a predicted probability of involvement of each nodal station for any given clinical presentation. Conclusion. - To better define the CTVn in NSCLC 3DRT, we propose software that evaluates the mediastinal nodal involvement risk from easily accessible individual pre-treatment parameters. (authors)
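Weighting a nodal station's involvement probability by the sensitivity and specificity of the staging examination amounts to a Bayesian update, which can be sketched as follows. The prior and the test characteristics below are hypothetical round numbers, not values from the study.

```python
# Hedged sketch of the kind of weighting described above: updating a
# nodal-involvement probability with an imperfect staging test (e.g. PET)
# via Bayes' rule. Sensitivity/specificity values are hypothetical.

def update_involvement_probability(prior, test_positive, sensitivity, specificity):
    """Posterior probability of nodal involvement given a test result."""
    if test_positive:
        num = sensitivity * prior
        den = num + (1.0 - specificity) * (1.0 - prior)
    else:
        num = (1.0 - sensitivity) * prior
        den = num + specificity * (1.0 - prior)
    return num / den

# Hypothetical station with a 30% prior involvement risk, examined with a
# test of sensitivity 0.85 and specificity 0.90:
print(update_involvement_probability(0.30, True, 0.85, 0.90))   # ~0.785
print(update_involvement_probability(0.30, False, 0.85, 0.90))  # ~0.067
```

A full probabilistic tree would chain such updates over the pre-treatment parameters (histology, T stage, M stage, ...), one branch per combination.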

  19. Developing and pre-testing a decision board to facilitate informed choice about delivery approach in uncomplicated pregnancy

    Directory of Open Access Journals (Sweden)

    Wood Stephen

    2009-10-01

    Background: The rate of caesarean sections is increasing worldwide, yet medical literature informing women with uncomplicated pregnancies about the relative risks and benefits of elective caesarean section (CS) compared with vaginal delivery (VD) remains scarce. A decision board may address this gap, providing systematic evidence-based information so that patients can more fully understand their treatment options. The objective of our study was to design and pre-test a decision board to guide clinical discussions and enhance informed decision-making related to delivery approach (CS or VD) in uncomplicated pregnancy. Methods: Development of the decision board involved two preliminary studies to determine women's preferred mode of risk presentation, and a systematic literature review for the most comprehensive presentation available at the time of the medical risks of VD and CS. Forty women were recruited to pre-test the tool. Eligible subjects were of childbearing age (18-40 years) but were not pregnant, in order to avoid raising the expectation among pregnant women that CS was a universally available birth option. Women selected their preferred delivery approach and completed the Decisional Conflict Scale to measure decisional uncertainty before and after reviewing the decision board. They also answered open-ended questions reflecting what they had learned, whether or not the information had helped them to choose between birth methods, and additional information that should be included. Descriptive statistics were used to analyse sample characteristics and women's choice of delivery approach pre/post decision board. Change in decisional conflict was measured using Wilcoxon's signed rank test for each of the three subscales. Results: The majority of women reported that they had learned something new (n = 37, 92%) and that the tool had helped them make a hypothetical choice between delivery approaches (n = 34, 85%).
Women wanted more information about neonatal risks and

  20. TOPFLOW-PTS experiments. pre-test calculations with NEPTUNECFD code

    International Nuclear Information System (INIS)

    A hypothetical Small Break Loss Of Coolant Accident (SB-LOCA) is identified as one of the most severe transients leading to a potentially severe Pressurized Thermal Shock (PTS) on the Reactor Pressure Vessel (RPV). Depending on the operating conditions, this may result in two-phase flow configurations in the cold legs, and to reliably assess the RPV wall integrity, advanced two-phase flow simulations are required. The related needs in development and/or validation of these advanced models are important, and the on-going TOPFLOW-PTS experimental programme was designed to provide a well documented database to meet these needs. This paper focuses on pre-test NEPTUNECFD simulations of the TOPFLOW-PTS experiments; these simulations were performed to (i) help define the test matrix and test procedure, and (ii) check the presence of the different key physical phenomena at the mock-up scale. (author)

  1. NESC-1 spinning cylinder experiment. Pre-test fracture analysis evaluation

    International Nuclear Information System (INIS)

    A pre-test structural analysis evaluation has been conducted by Electricite de France (EDF), including several three-dimensional elastic and elastic-plastic computations. Two cylinder geometries have been studied. Higher values of the stress intensity factor are obtained in both geometries in the elastic-plastic computations, due to the yielding of the cladding during the thermal transient. Comparisons between the stress intensity factors and the expected base metal toughness show that cleavage initiation should occur preferentially in the base metal near the interface with the cladding. The comparison between the two geometries also shows that the thicker vessel with a deeper semi-elliptical sub-clad flaw (70 mm deep) is more favourable to cleavage initiation near the base metal-cladding interface. (K.A.)

  2. Pre-test prediction for LOBI test A1-04 (PREX test)

    International Nuclear Information System (INIS)

    This report contains the pre-test prediction for the first LOBI test A1-04, which has been chosen as the Pre-Prediction Exercise (LOBI-PREX). Test A1-04 will be a simulation of a nearly 200% double-ended offset shear break in the cold leg of the primary system of a four-loop PWR. The prediction was performed with the RELAP4/Mod 6 computer code. The report gives the test specification (initial and boundary conditions), a brief description of the RELAP4 model used for the LOBI test facility and a short analysis of the predicted system behaviour. A complete RELAP4 input listing is given in Appendix A.

  3. FUMEX cases 1, 2, and 3 calculated pre-test and post-test results

    International Nuclear Information System (INIS)

    Two versions (modified pre-test and modified post-test) of the PIN-micro code were used to analyse the fuel rod behaviour in three FUMEX experiments. Experience with applying the PIN-micro code, with its simple structure and old conception of steady-state operation, shows significant difficulties in treating complex processes like those in the FUMEX experiments. These difficulties were partially overcome through different model modifications and corrections based on special engineering estimations, and the results obtained as a whole do not seem unreasonable. The calculations were performed by a group from two Bulgarian institutions in collaboration with specialists from the Kurchatov Research Center. 1 tab., 14 figs., 8 refs

  4. BNL NONLINEAR PRE TEST SEISMIC ANALYSIS FOR THE NUPEC ULTIMATE STRENGTH PIPING TEST PROGRAM

    International Nuclear Information System (INIS)

    The Nuclear Power Engineering Corporation (NUPEC) of Japan has been conducting a multi-year research program to investigate the behavior of nuclear power plant piping systems under large seismic loads. The objectives of the program are: to develop a better understanding of the elasto-plastic response and ultimate strength of nuclear piping; to ascertain the seismic safety margin of current piping design codes; and to assess new piping code allowable stress rules. Under this program, NUPEC has performed a large-scale seismic proving test of a representative nuclear power plant piping system. In support of the proving test, a series of materials tests, static and dynamic piping component tests, and seismic tests of simplified piping systems have also been performed. As part of collaborative efforts between the United States and Japan on seismic issues, the US Nuclear Regulatory Commission (USNRC) and its contractor, the Brookhaven National Laboratory (BNL), are participating in this research program by performing pre-test and post-test analyses, and by evaluating the significance of the program results with regard to safety margins. This paper describes BNL's pre-test analysis to predict the elasto-plastic response for one of NUPEC's simplified piping system seismic tests. The capability to simulate the anticipated ratcheting response of the system was of particular interest. Analyses were performed using classical bilinear and multilinear kinematic hardening models as well as a nonlinear kinematic hardening model. Comparisons of analysis results for each plasticity model against test results for a static cycling elbow component test and for a simplified piping system seismic test are presented in the paper

  5. Implementation of a web based universal exchange and inference language for medicine: Sparse data, probabilities and inference in data mining of clinical data repositories.

    Science.gov (United States)

    Robson, Barry; Boray, Srinidhi

    2015-11-01

    We extend Q-UEL, our universal exchange language for interoperability and inference in healthcare and biomedicine, to the more traditional fields of public health surveys. These are the type associated with screening, epidemiological and cross-sectional studies, and cohort studies in some cases similar to clinical trials. A challenge is that there is some degree of split between frequentist notions of probability as (a) classical measures based only on the idea of counting and proportion and on classical biostatistics as used in the above conservative disciplines, and (b) more subjectivist notions of uncertainty, belief, reliability, or confidence often used in automated inference and decision support systems. Samples in this kind of public health survey are typically small compared with our earlier "Big Data" mining efforts. An issue addressed here is how much impact sparse data should have on decisions. We describe a new Q-UEL compatible toolkit including a data analytics application DiracMiner that also delivers more standard biostatistical results, DiracBuilder that uses its output to build Hyperbolic Dirac Nets (HDN) for decision support, and HDNcoherer that ensures that probabilities are mutually consistent. Use is exemplified by participation in a real-world health-screening project, and also by deployment in an industrial platform called the BioIngine, a cognitive computing platform for health management. PMID:26386548

  6. Implementation of Math Pre-testing and Tutorials for Improving Student Success in Algebra-based Introductory Physics Course

    Science.gov (United States)

    Stokes, Donna

    2012-10-01

    The student success rate in the algebra-based Introductory General Physics I course at the University of Houston (UH) and across the United States is low in comparison to success rates in other service courses. In order to improve student success rates, we have implemented, in addition to interactive teaching techniques, pre-testing as an early intervention process to identify and remediate at-risk students. The pre-testing includes a math and problem-solving skills diagnostic exam and pre-tests administered prior to all regular exams. Students identified as at risk based on their scores on these pre-tests are given incentives to utilize a tutoring intervention consisting of on-line math tutoring to address math deficiencies and tutoring by graduate Physics Teaching Assistants to address student understanding of the physics concepts. Results from 503 students enrolled in three sections of the course showed that 78% of the students identified as at risk by the diagnostic exam who completed the math tutorial successfully completed the course, as compared to 45% of at-risk students who did not complete the math tutorial. Results of the pre-testing before each regular exam showed that all students identified as at risk based on pre-test scores had positive gains ranging from 9-32% on the three regular exams. However, the large standard deviations of these gains indicate that they are not statistically significant; therefore, pre-testing before exams will no longer be offered in the course. Utilization of the math tutorials as remediation will continue to be offered in all sections of the algebra-based course at UH, with the goal of significantly improving the overall success rates for the introductory physics courses.

  7. Pre-test calculations for FAL-19 and FAL-20 using the INSPECT code

    International Nuclear Information System (INIS)

    Pre-test calculations have been carried out for tests FAL-19 and FAL-20. These experiments will be conducted in early 1993 as part of the Falcon test matrix, and have the objective of studying iodine chemistry within the containment under conditions simulating aspects of a severe accident within a light water reactor. In order to make these predictions it was assumed that 10% of the iodine inventory entered the containment as I2, and that in FAL-19 (high humidity in the containment) reaction of I2 with steel was irreversible and in FAL-20 (low humidity) it was reversible. In FAL-20, I2 was predicted to transfer from the steel to paint and sump. Results predict rapid uptake by walls and very little long term volatility apart from a low level of CH3I. A final report of this work will be issued in December 1992 that also takes account of the role of non-aqueous aerosols on the iodine behaviour. (author)

  8. Pre-test prediction report LOBI-MOD2 Test BT-12 large steam line break

    International Nuclear Information System (INIS)

    The RETRAN-02 code has been selected by the CEGB for independent assessment of the thermal-hydraulic component of the intact circuit fault safety case for Sizewell B. An important source of validation data for RETRAN is the European Community sponsored LOBI-MOD2 Integral Test Facility. One component of the agreed LOBI test matrix is the large (100%) steam line break test BT-12, for which the UK has been designated as partner country. This report details the pre-test predictions undertaken in support of Test BT-12 using the RETRAN-02/Mod 3 code. Three separate analyses are presented. In addition to the best estimate prediction, two scoping predictions are presented which respectively minimise and maximise the primary cooldown. The best estimate calculation was undertaken using dynamic slip with multi-node steam generator representations. The maximum cooldown was obtained using a single bubble rise volume broken loop steam generator model to minimise the liquid carryover to the break. The minimum cooldown used full noding for the broken loop steam generator but without slip (i.e. equal phase velocities) to maximise the carryover. A number of modelling difficulties had to be overcome, including steady state initialisation at the zero feed and steam flow hot standby condition. (author)

  9. Strong association between serological status and probability of progression to clinical visceral leishmaniasis in prospective cohort studies in India and Nepal.

    Directory of Open Access Journals (Sweden)

    Epco Hasker

    INTRODUCTION: Asymptomatic persons infected with the parasites causing visceral leishmaniasis (VL) usually outnumber clinically apparent cases by a ratio of 4-10 to 1. We assessed the risk of progression from infection to disease as a function of DAT and rK39 serological titers. METHODS: We used available data on four cohorts from villages in India and Nepal that are highly endemic for Leishmania donovani. In each cohort two serosurveys had been conducted. Based on the results of the initial surveys, subjects were classified as seronegative, moderately seropositive or strongly seropositive using both DAT and rK39. Based on the combination of first and second survey results we identified seroconvertors for both markers. Seroconvertors were subdivided into high- and low-titer convertors. Subjects were followed up for at least one year following the second survey. Incident VL cases were recorded and verified. RESULTS: We assessed a total of 32,529 enrolled subjects, for a total follow-up time of 72,169 person-years. Altogether 235 incident VL cases were documented. The probability of progression to disease was strongly associated with initial serostatus and with seroconversion; this was particularly the case for those with high titers, most prominently among seroconvertors. For high-titer DAT convertors the hazard ratio reached as high as 97.4 compared to non-convertors. The strength of the associations varied between cohorts and between markers, but similar trends were observed across the four cohorts and the two markers. DISCUSSION: There is a strongly increased risk of progressing to disease among DAT and/or rK39 seropositives with high titers. The options for prophylactic treatment for this group merit further investigation, as it could be of clinical benefit if it prevents progression to disease.
Prophylactic treatment might also have a public health benefit if it can be corroborated that these asymptomatically infected individuals are infectious
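The cohort totals quoted above (235 incident VL cases over 72,169 person-years) imply a crude incidence rate that is easy to verify. A minimal sketch; the helper `incidence_rate` is illustrative, not part of the study:

```python
# Crude incidence rate from the cohort totals reported in the abstract:
# 235 incident VL cases over 72,169 person-years of follow-up.

def incidence_rate(cases, person_years, per=1000):
    """Incident cases per `per` person-years of follow-up."""
    return cases / person_years * per

print(f"{incidence_rate(235, 72169):.2f} cases per 1000 person-years")
# → 3.26 cases per 1000 person-years
```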

  10. A Content Analysis of Multinationals' Web Communication Strategies: Cross-Cultural Research Framework and Pre-Testing.

    Science.gov (United States)

    Okazaki, Shintaro; Alonso Rivas, Javier

    2002-01-01

    Discussion of research methodology for evaluating the degree of standardization in multinational corporations' online communication strategies across differing cultures focuses on a research framework for cross-cultural comparison of corporate Web pages, applying traditional advertising content study techniques. Describes pre-tests that examined…

  11. Pre-test CFD simulations on TOPFLOW-PTS experiments with ANSYS CFX 12.0

    International Nuclear Information System (INIS)

    Some scenarios for Small Break Loss Of Coolant Accidents (SB-LOCA) lead to an Emergency Core Cooling (ECC) water injection into the cold leg of a Pressurized Water Reactor (PWR). The cold water mixes there with the hot coolant present in the primary circuit. The mixture flows to the downcomer where further mixing of the fluids takes place. Single-phase as well as two-phase PTS (Pressurized Thermal Shock) situations have to be considered. Pressurized Thermal Shock implies the occurrence of thermal loads on the Reactor Pressure Vessel wall. In order to predict thermal gradients in the structural components of the Reactor Pressure Vessel (RPV) wall, knowledge of the transient temperature distribution in the downcomer is needed. The prediction of the temperature distribution requires reliable Computational Fluid Dynamics simulations. In case of two-phase PTS situations the water level in the RPV has dropped down to the height of the cold leg nozzle or below, leading to a partially filled or totally uncovered cold leg. In the frame of the EU project NURISP (Nuclear Reactor Integrated Simulation Project) attempts are made to improve the CFD modelling for two-phase PTS situations. This paper presents pre-test simulations on TOPFLOW-PTS experiments. The experiments will be carried out on the TOPFLOW-PTS test facility of the Forschungszentrum Dresden-Rossendorf. For the numerical investigations in the frame of NURISP two reference cases were defined: one for steady air-water and one for steady steam-water flow. The simulations were performed using the CFD code ANSYS CFX 12.0. Best practice guidelines were considered as far as possible. A homogeneous model was applied for the momentum equations. Turbulence was modelled with the homogeneous Shear Stress Transport turbulence model. In all simulations the cold leg was 50% full of water. In the case of the air-water simulation the operating conditions for both fluids were 40-50 °C and 22.5 bar for temperature and pressure, respectively.

  12. Comparison of patient comprehension of rapid HIV pre-test fundamentals by information delivery format in an emergency department setting

    Directory of Open Access Journals (Sweden)

    Clark Melissa A

    2007-09-01

    Full Text Available Abstract Background Two trials were conducted to compare emergency department patient comprehension of rapid HIV pre-test information using different methods to deliver this information. Methods Patients were enrolled for these two trials at a US emergency department between February 2005 and January 2006. In Trial One, patients were randomized to a no pre-test information or an in-person discussion arm. In Trial Two, a separate group of patients were randomized to an in-person discussion arm or a Tablet PC-based video arm. The video, "Do you know about rapid HIV testing?", and the in-person discussion contained identical Centers for Disease Control and Prevention-suggested pre-test information components as well as information on rapid HIV testing with OraQuick®. Participants were compared by information arm on their comprehension of the pre-test information by their score on a 26-item questionnaire using the Wilcoxon rank-sum test. Results In Trial One, 38 patients completed the no-information arm and 31 completed the in-person discussion arm. Of these 69 patients, 63.8% had twelve years or fewer of formal education and 66.7% had previously been tested for HIV. The mean score on the questionnaire for the in-person discussion arm was higher than for the no information arm (18.7 vs. 13.3, p ≤ 0.0001). In Trial Two, 59 patients completed the in-person discussion and 55 completed the video arms. Of these 114 patients, 50.9% had twelve years or fewer of formal education and 68.4% had previously been tested for HIV. The mean score on the questionnaire for the video arm was similar to the in-person discussion arm (20.0 vs. 19.2; p ≤ 0.33). Conclusion The video "Do you know about rapid HIV testing?" appears to be an acceptable substitute for an in-person pre-test discussion on rapid HIV testing with OraQuick®. In terms of adequately informing ED patients about rapid HIV testing, either form of pre-test information is preferable to providing no pre-test information.
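The trials compared questionnaire scores with the Wilcoxon rank-sum test. As a sketch of the statistic itself, here is a pure-Python rank sum with midranks for ties; the scores are invented toy data, and a real analysis would use `scipy.stats.mannwhitneyu` or `scipy.stats.ranksums`:

```python
# Rank-sum statistic (the basis of the Wilcoxon rank-sum test),
# computed by hand on invented questionnaire scores.

def rank_sum(x, y):
    """Rank sum of sample x within the pooled sample, using midranks
    for tied values."""
    pooled = sorted(x + y)
    ranks = {}
    i = 0
    while i < len(pooled):
        j = i
        while j < len(pooled) and pooled[j] == pooled[i]:
            j += 1
        # 1-based positions i+1 .. j share the midrank
        ranks[pooled[i]] = (i + 1 + j) / 2
        i = j
    return sum(ranks[v] for v in x)

discussion = [20, 19, 18, 21, 17]   # invented scores, discussion arm
no_info = [13, 12, 15, 14, 11]      # invented scores, no-information arm
print(rank_sum(discussion, no_info))  # → 40.0 (the maximum possible here)
```

Every discussion-arm score outranks every no-information score in this toy example, so the rank sum hits its maximum of 6+7+8+9+10 = 40.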

  13. NESC-1 spinning cylinder experiment: Pre-test fracture analysis evaluation

    International Nuclear Information System (INIS)

    The NESC project (Network for the Evaluation of Steel Components) has been started in Europe with fundings from European Community (EC) and Health and Safety Executive (HSE, UK). This project contains several aspects of the structural integrity assessment procedure and more specifically nondestructive examination, fracture mechanics and materials characterization. A first test is being planned at the AEA Technology Laboratories (Risley, UK) on the Spinning Cylinder test facility. The experiment will be conducted on a large scale cladded cylinder containing surface and subclad cracks exposed to a pressurized thermal shock transient (PTS). The main purpose of the test is to obtain the cleavage initiation in base metal. Within the framework of this project, a pre-test structural analysis evaluation has been conducted by Electricite de France (EDF) including several three dimensional elastic and elastic-plastic computations. Two cylinder geometries have been studied, the first one with a 40 mm deep semi-elliptical subclad flaw, the second with a 70 mm deep semi-elliptical subclad flaw in a thicker vessel. Higher values of the stress intensity factor are obtained in both geometries in the elastic-plastic computations due to the yielding of the cladding during the thermal transient. The comparisons between the stress intensity factors and the expected base metal toughness show that cleavage initiation must occur preferably in base metal near the interface with cladding. The comparison between both geometries show also that the thicker vessel with a deeper semi-elliptical subclad flaw (70 mm deep) is more favorable to cleavage initiation near the base metal-cladding interface

  14. The Quality of Working Life Questionnaire for Cancer Survivors (QWLQ-CS): a Pre-test Study

    OpenAIRE

    de Jong, Merel; Tamminga, Sietske J; de Boer, Angela G E M; Frings-Dresen, Monique H.W.

    2016-01-01

    Background Returning to and continuing work is important to many cancer survivors, but also represents a challenge. We know little about subjective work outcomes and how cancer survivors perceive being returned to work. Therefore, we developed the Quality of Working Life Questionnaire for Cancer Survivors (QWLQ-CS). Our aim was to pre-test the items of the initial QWLQ-CS on acceptability and comprehensiveness. In addition, item retention was performed by pre-assessing the relevance scores an...

  15. Comparison of different coupling CFD–STH approaches for pre-test analysis of a TALL-3D experiment

    Energy Technology Data Exchange (ETDEWEB)

    Papukchiev, Angel, E-mail: angel.papukchiev@grs.de [Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS) mbH, Garching n. Munich (Germany); Jeltsov, Marti; Kööp, Kaspar; Kudinov, Pavel [KTH Royal Institute of Technology, Stockholm (Sweden); Lerchl, Georg [Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS) mbH, Garching n. Munich (Germany)

    2015-08-15

    Highlights: • Thermal-hydraulic system codes and CFD tools are coupled. • Pre-test calculations for the TALL-3D facility are performed. • Complex flow and heat transfer phenomena are modeled. • Comparative analyses have been performed. - Abstract: The system thermal-hydraulic (STH) code ATHLET was coupled with the commercial 3D computational fluid dynamics (CFD) software package ANSYS CFX to improve ATHLET simulation capabilities for flows with pronounced 3D phenomena such as flow mixing and thermal stratification. Within the FP7 European project THINS (Thermal Hydraulics of Innovative Nuclear Systems), validation activities for coupled thermal-hydraulic codes are being carried out. The TALL-3D experimental facility, operated by KTH Royal Institute of Technology in Stockholm, is designed for thermal-hydraulic experiments with lead-bismuth eutectic (LBE) coolant at natural and forced circulation conditions. GRS carried out pre-test simulations with ATHLET–ANSYS CFX for the TALL-3D experiment T01, while KTH scientists performed these analyses with the coupled code RELAP5/STAR-CCM+. In experiment T01 the main circulation pump is stopped, which leads to an interesting thermal-hydraulic transient with local 3D phenomena. In this paper, the TALL-3D behavior during T01 is analyzed and the results of the coupled pre-test calculations performed by GRS (ATHLET–ANSYS CFX) and KTH (RELAP5/STAR-CCM+) are directly compared.

  16. Interpretations of Negative Probabilities

    OpenAIRE

    Burgin, Mark

    2010-01-01

    In this paper, we give a frequency interpretation of negative probability, as well as of extended probability, demonstrating that to a great extent, these new types of probabilities, behave as conventional probabilities. Extended probability comprises both conventional probability and negative probability. The frequency interpretation of negative probabilities gives supportive evidence to the axiomatic system built in (Burgin, 2009; arXiv:0912.4767) for extended probability as it is demonstra...

  17. Probability Aggregates in Probability Answer Set Programming

    OpenAIRE

    Saad, Emad

    2013-01-01

    Probability answer set programming is a declarative programming that has been shown effective for representing and reasoning about a variety of probability reasoning tasks. However, the lack of probability aggregates, e.g. {\\em expected values}, in the language of disjunctive hybrid probability logic programs (DHPP) disallows the natural and concise representation of many interesting problems. In this paper, we extend DHPP to allow arbitrary probability aggregates. We introduce two types of p...

  18. On Probability Leakage

    OpenAIRE

    Briggs, William M

    2012-01-01

    The probability leakage of model M with respect to evidence E is defined. Probability leakage is a kind of model error. It occurs when M implies that events $y$, which are impossible given E, have positive probability. Leakage does not imply model falsification. Models with probability leakage cannot be calibrated empirically. Regression models, which are ubiquitous in statistical practice, often evince probability leakage.
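The definition above is straightforward to operationalize for a discrete model: leakage is the total mass the model assigns to outcomes the evidence rules out. A hypothetical sketch (the function `probability_leakage`, the model, and the evidence set are all invented for illustration):

```python
# Probability leakage of a discrete model M with respect to evidence E:
# the mass M places on outcomes that E makes impossible.

def probability_leakage(model_probs, possible):
    """Total model probability on outcomes impossible given the evidence.

    model_probs: dict mapping outcome -> P(outcome | M)
    possible:    set of outcomes consistent with the evidence E
    """
    return sum(p for outcome, p in model_probs.items()
               if outcome not in possible)

# M spreads mass over counts 0..4, but E says only 0..3 can occur.
model = {0: 0.4, 1: 0.3, 2: 0.15, 3: 0.1, 4: 0.05}
print(probability_leakage(model, possible={0, 1, 2, 3}))  # 0.05
```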

  19. Probability 1/e

    Science.gov (United States)

    Koo, Reginald; Jones, Martin L.

    2011-01-01

    Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.
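A standard instance of the 1/e phenomenon mentioned above is the derangement problem: the probability that a uniform random permutation of n items has no fixed point tends to 1/e. A small sketch using the usual recurrence (illustrative; not drawn from the article itself):

```python
# Probability that a random permutation of n items is a derangement,
# via the recurrence D(k) = (k - 1) * (D(k - 1) + D(k - 2)).
import math

def derangement_probability(n):
    """P(no fixed point) for a uniform random permutation of n items."""
    d = [1, 0]  # D(0) = 1, D(1) = 0
    for k in range(2, n + 1):
        d.append((k - 1) * (d[k - 1] + d[k - 2]))
    return d[n] / math.factorial(n)

print(derangement_probability(10))  # ≈ 0.3678792
print(1 / math.e)                   # ≈ 0.3678794
```

Already at n = 10 the two values agree to six decimal places.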

  20. Probability an introduction

    CERN Document Server

    Goldberg, Samuel

    2013-01-01

    Excellent basic text covers set theory, probability theory for finite sample spaces, binomial theorem, probability distributions, means, standard deviations, probability function of binomial distribution, more. Includes 360 problems with answers for half.

  1. Demographic, clinical and treatment related predictors for event-free probability following low-dose radiotherapy for painful heel spurs - a retrospective multicenter study of 502 patients

    Energy Technology Data Exchange (ETDEWEB)

    Muecke, Ralph [Dept. of Radiotherapy, St. Josefs-Hospital. Wiesbaden (Germany); Micke, Oliver [Dept. of Radiotherapy, Muenster Univ. Hospital (Germany); Reichl, Berthold [Dept. of Radiotherapy, Weiden Hospital (DE)] (and others)

    2007-03-15

    A total of 502 patients treated between 1990 and 2002 with low-dose radiotherapy (RT) for painful heel spurs were analysed for prognostic factors for long-term treatment success. The median follow-up was 26 months, ranging from 1 to 103 months. Events were defined as (1) slightly improved or unchanged pain after therapy, or (2) recurrent pain sensations during the follow-up period. Overall 8-year event-free probability was 60.9%. Event-free probabilities of patients with one/two series (414/88) were 69.7%/32.2% (p <0.001); >58/≤58 years (236/266), 81.3%/47.9% (p =0.001); high voltage/orthovoltage (341/161), 67.9%/60.6% (p =0.019); pain anamnesis ≤6 months/>6 months (308/194), 76.3%/43.9% (p =0.001); single dose 0.5/1.0 Gy (100/401), 86.2%/55.1% (p =0.009); without/with prior treatment (121/381), 83.1%/54.9% (p =0.023); men/women (165/337), 61.2%/61.5% (p =0.059). The multivariate Cox regression analysis with inclusion of the number of treatment series, age, photon energy, pain history, single-dose and prior treatments revealed patients with only one treatment series (p <0.001), an age >58 years (p =0.011) and therapy with high voltage photons (p =0.050) to be significant prognostic factors for pain relief. Overall, low-dose RT is a very effective treatment in painful heel spurs.
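The "event-free probability" figures above are survival-analysis estimates. As a sketch of the underlying idea, here is a minimal Kaplan-Meier product-limit estimator on invented follow-up data (not the study's dataset; a real analysis would use a survival package such as lifelines):

```python
# Minimal Kaplan-Meier product-limit estimate of event-free probability
# at the end of follow-up, on invented (time, event) pairs.

def kaplan_meier(times, events):
    """Event-free probability after the last observed time.

    times:  follow-up time per patient (e.g. months)
    events: 1 if the event (pain persistence/recurrence) occurred,
            0 if the patient was censored
    """
    s = 1.0
    at_risk = len(times)
    for t, e in sorted(zip(times, events)):
        if e:
            s *= (at_risk - 1) / at_risk
        at_risk -= 1
    return s

# 6 hypothetical patients: events at months 5 and 12, the rest censored.
print(kaplan_meier([5, 8, 12, 20, 26, 40], [1, 0, 1, 0, 0, 0]))  # ≈ 0.625
```

Each event multiplies the running survival probability by the fraction of at-risk patients who remain event-free; censored patients only shrink the risk set.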

  2. Demographic, clinical and treatment related predictors for event-free probability following low-dose radiotherapy for painful heel spurs - a retrospective multicenter study of 502 patients

    International Nuclear Information System (INIS)

    A total of 502 patients treated between 1990 and 2002 with low-dose radiotherapy (RT) for painful heel spurs were analysed for prognostic factors for long-term treatment success. The median follow-up was 26 months, ranging from 1 to 103 months. Events were defined as (1) slightly improved or unchanged pain after therapy, or (2) recurrent pain sensations during the follow-up period. Overall 8-year event-free probability was 60.9%. Event-free probabilities of patients with one/two series (414/88) were 69.7%/32.2% (p <0.001); >58/≤58 years (236/266), 81.3%/47.9% (p =0.001); high voltage/orthovoltage (341/161), 67.9%/60.6% (p =0.019); pain anamnesis ≤6 months/>6 months (308/194), 76.3%/43.9% (p =0.001); single dose 0.5/1.0 Gy (100/401), 86.2%/55.1% (p =0.009); without/with prior treatment (121/381), 83.1%/54.9% (p =0.023); men/women (165/337), 61.2%/61.5% (p =0.059). The multivariate Cox regression analysis with inclusion of the number of treatment series, age, photon energy, pain history, single-dose and prior treatments revealed patients with only one treatment series (p <0.001), an age >58 years (p =0.011) and therapy with high voltage photons (p =0.050) to be significant prognostic factors for pain relief. Overall, low-dose RT is a very effective treatment in painful heel spurs

  3. Asbestos and Probable Microscopic Polyangiitis

    OpenAIRE

    George S Rashed Philteos; Kelly Coverett; Rajni Chibbar; Ward, Heather A; Cockcroft, Donald W

    2004-01-01

    Several inorganic dust lung diseases (pneumoconioses) are associated with autoimmune diseases. Although autoimmune serological abnormalities are common in asbestosis, clinical autoimmune/collagen vascular diseases are not commonly reported. A case of pulmonary asbestosis complicated by perinuclear-antineutrophil cytoplasmic antibody (myeloperoxidase) positive probable microscopic polyangiitis (glomerulonephritis, pericarditis, alveolitis, multineuritis multiplex) is described and the possible...

  4. Non-Archimedean Probability

    OpenAIRE

    Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia

    2011-01-01

    We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and zero- and unit-probability events pose no particular epistemological problems. We use a non-Archimedean field as the range of the probability function. As a result, the property of countable additivity in Kolmogorov's axiomatization of probability is replaced by a different type of infinite additivity.

  5. Probability and paternity testing.

    OpenAIRE

    Elston, R C

    1986-01-01

    A probability can be viewed as an estimate of a variable that is sometimes 1 and sometimes 0. To have validity, the probability must equal the expected value of that variable. To have utility, the average squared deviation of the probability from the value of that variable should be small. It is shown that probabilities of paternity calculated by the use of Bayes' theorem under appropriate assumptions are valid, but they can vary in utility. In particular, a recently proposed probability of p...
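The paternity probabilities discussed above follow from Bayes' theorem, most compactly in odds form: posterior odds equal the paternity index (a likelihood ratio) times the prior odds. A sketch with invented numbers; `paternity_probability` is an illustrative helper, not from the paper:

```python
# Bayes' theorem in odds form for paternity testing:
# posterior odds = paternity index (likelihood ratio) x prior odds.

def paternity_probability(paternity_index, prior=0.5):
    """Posterior probability of paternity from a likelihood ratio
    (paternity index) and a prior probability of paternity."""
    prior_odds = prior / (1 - prior)
    posterior_odds = paternity_index * prior_odds
    return posterior_odds / (1 + posterior_odds)

# A paternity index of 99 with the conventional 0.5 prior gives 0.99.
print(paternity_probability(99))             # → 0.99
print(paternity_probability(99, prior=0.1))  # a weaker prior lowers it
```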

  6. Logical Probability Preferences

    OpenAIRE

    Saad, Emad

    2013-01-01

    We present a unified logical framework for representing and reasoning about both probability quantitative and qualitative preferences in probability answer set programming, called probability answer set optimization programs. The proposed framework is vital to allow defining probability quantitative preferences over the possible outcomes of qualitative preferences. We show the application of probability answer set optimization programs to a variant of the well-known nurse restoring problem, c...

  7. Pre-test habituation improves the reliability of a handheld test of mechanical nociceptive threshold in dairy cows

    DEFF Research Database (Denmark)

    Raundal, P. M.; Andersen, P. H.; Toft, Nils;

    2015-01-01

    Mechanical nociceptive threshold (MNT) testing has been used to investigate aspects of painful states in bovine claws. We investigated a handheld tool, where the applied stimulation force was monitored continuously relative to a pre-encoded target force. The effect on MNT of two pre-testing habituation procedures was investigated in two different experiments comprising a total of 88 sound Holstein dairy cows kept either inside or outside their home environment. MNT testing was performed using five consecutive mechanical nociceptive stimulations per cow per test at a fixed pre-encoded target rate of 2.1 N/s. The habituation procedure performed in dairy cows kept in their home environment led to a lowered intra-individual coefficient of variation of MNT (P < 0.001), increased MNT (P < 0.001) and a decreased discrepancy between applied and target force during stimulations (P < 0.001). Pre…
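The reliability outcome above is the intra-individual coefficient of variation of the repeated stimulations. A minimal computation on made-up threshold readings (in newtons), assuming the usual definition CV = sample standard deviation / mean:

```python
# Intra-individual coefficient of variation of repeated MNT readings.
# The before/after values are invented, not the study's measurements.
import statistics

def coefficient_of_variation(values):
    """Sample standard deviation as a fraction of the mean."""
    return statistics.stdev(values) / statistics.mean(values)

before = [12.0, 18.0, 9.0, 15.0, 21.0]   # five stimulations, no habituation
after = [14.0, 15.5, 14.5, 16.0, 15.0]   # five stimulations, habituated
print(coefficient_of_variation(before) > coefficient_of_variation(after))
# → True: habituation tightens the repeated measurements
```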

  8. Mineralogic and petrologic investigation of pre-test core samples from the spent fuel test-climax

    Energy Technology Data Exchange (ETDEWEB)

    Ryerson, F.J.; Qualheim, B.J.

    1983-12-01

    Pre-test samples obtained from just inside the perimeter of the canister emplacement holes of the Spent Fuel Test-Climax have been characterized by petrographic and microanalytical techniques. The primary quartz monzonite has undergone various degrees of hydrothermal alteration as a result of natural processes. Alteration is most apparent on primary plagioclase and biotite. The most common secondary phases on plagioclase are muscovite and calcite, while the most common secondary phases on biotite are epidote and chlorite. The major alteration zones encountered are localized along filled fractures, i.e. veins. The thickness and mineralogy of the alteration zones can be correlated with the vein mineralogy, becoming wider and more complex mineralogically when the veins contain calcite. 7 references, 10 figures, 4 tables.

  9. Mineralogic and petrologic investigation of pre-test core samples from the spent fuel test-climax

    International Nuclear Information System (INIS)

    Pre-test samples obtained from just inside the perimeter of the canister emplacement holes of the Spent Fuel Test-Climax have been characterized by petrographic and microanalytical techniques. The primary quartz monzonite has undergone various degrees of hydrothermal alteration as a result of natural processes. Alteration is most apparent on primary plagioclase and biotite. The most common secondary phases on plagioclase are muscovite and calcite, while the most common secondary phases on biotite are epidote and chlorite. The major alteration zones encountered are localized along filled fractures, i.e. veins. The thickness and mineralogy of the alteration zones can be correlated with the vein mineralogy, becoming wider and more complex mineralogically when the veins contain calcite. 7 references, 10 figures, 4 tables

  10. Pre-Test Analysis Predictions for the Shell Buckling Knockdown Factor Checkout Tests - TA01 and TA02

    Science.gov (United States)

    Thornburgh, Robert P.; Hilburger, Mark W.

    2011-01-01

    This report summarizes the pre-test analysis predictions for the SBKF-P2-CYL-TA01 and SBKF-P2-CYL-TA02 shell buckling tests conducted at the Marshall Space Flight Center (MSFC) in support of the Shell Buckling Knockdown Factor (SBKF) Project, NASA Engineering and Safety Center (NESC) Assessment. The test article (TA) is an 8-foot-diameter aluminum-lithium (Al-Li) orthogrid cylindrical shell with similar design features as that of the proposed Ares-I and Ares-V barrel structures. In support of the testing effort, detailed structural analyses were conducted and the results were used to monitor the behavior of the TA during the testing. A summary of predicted results for each of the five load sequences is presented herein.

  11. Agreeing Probability Measures for Comparative Probability Structures

    OpenAIRE

    Wakker, Peter

    1981-01-01

    It is proved that fine and tight comparative probability structures (where the set of events is assumed to be an algebra, not necessarily a $\\sigma$-algebra) have agreeing probability measures. Although this was often claimed in the literature, all proofs the author encountered are not valid for the general case, but only for $\\sigma$-algebras. Here the proof of Niiniluoto (1972) is supplemented. Furthermore an example is presented that reveals many misunderstandings in the literature. At the...

  12. Non-Archimedean Probability

    CERN Document Server

    Benci, Vieri; Wenmackers, Sylvia

    2011-01-01

    We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and zero- and unit-probability events pose no particular epistemological problems. We use a non-Archimedean field as the range of the probability function. As a result, the property of countable additivity in Kolmogorov's axiomatization of probability is replaced by a different type of infinite additivity.

  13. Elements of probability theory

    CERN Document Server

    Rumshiskii, L Z

    1965-01-01

    Elements of Probability Theory presents the methods of the theory of probability. This book is divided into seven chapters that discuss the general rule for the multiplication of probabilities, the fundamental properties of the subject matter, and the classical definition of probability. The introductory chapters deal with the functions of random variables; continuous random variables; numerical characteristics of probability distributions; center of the probability distribution of a random variable; definition of the law of large numbers; stability of the sample mean and the method of moments

  14. Evaluating probability forecasts

    CERN Document Server

    Lai, Tze Leung; Shen, David Bo; 10.1214/11-AOS902

    2012-01-01

    Probability forecasts of events are routinely used in climate predictions, in forecasting default probabilities on bank loans or in estimating the probability of a patient's positive response to treatment. Scoring rules have long been used to assess the efficacy of the forecast probabilities after observing the occurrence, or nonoccurrence, of the predicted events. We develop herein a statistical theory for scoring rules and propose an alternative approach to the evaluation of probability forecasts. This approach uses loss functions relating the predicted to the actual probabilities of the events and applies martingale theory to exploit the temporal structure between the forecast and the subsequent occurrence or nonoccurrence of the event.
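A concrete scoring rule of the kind discussed above is the Brier score: the mean squared error of forecast probabilities against the observed 0/1 outcomes. A sketch with invented forecasts and outcomes:

```python
# Brier score: mean squared difference between forecast probabilities
# and observed binary outcomes. Lower is better.

def brier_score(forecasts, outcomes):
    """Mean of (forecast - outcome)^2 over paired forecasts/outcomes."""
    pairs = list(zip(forecasts, outcomes))
    return sum((f - o) ** 2 for f, o in pairs) / len(pairs)

sharp = [0.9, 0.8, 0.1, 0.95]   # confident, well-calibrated forecasts
vague = [0.5, 0.5, 0.5, 0.5]    # uninformative forecasts
observed = [1, 1, 0, 1]
print(brier_score(sharp, observed))  # ≈ 0.0156
print(brier_score(vague, observed))  # 0.25
```

The sharper forecaster is rewarded with a far lower score, which is exactly what a proper scoring rule is meant to do.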

  15. Genetic, clinical and environmental factors with a probable role for development of early radiation adverse events in cervical and endometrial cancer patients after postoperative pelvic irradiation

    International Nuclear Information System (INIS)

    The aim of this study was to investigate the factors influencing the development of early radiation adverse events in cervical and endometrial cancer patients after postoperative radiotherapy. The study included 109 patients with cervical and endometrial carcinoma after radical surgery. External beam pelvic irradiation was performed on a Co-60 machine with the 'Box' technique to a total dose of 50 Gy in 2 Gy daily fractions. Early radiation adverse events were assessed according to the Common Terminology Criteria for Adverse Events, v.3.0, applied for the first time in Bulgaria via a purpose-designed questionnaire. Information on smoking habits, previous abdominal surgery, sensitivity to sunlight, family history and concomitant diseases was collected. DNA isolated from venous blood was used for genotype analysis with polymerase chain reaction-restriction fragment length polymorphism (PCR-RFLP). A standard statistical package and logistic regression analysis were applied for the statistical evaluation. Only 2% of the patients did not develop any early radiation adverse events; the majority of patients suffered grade 1 and 2 adverse events. No grade 4 or 5 events were recorded. Smoking increased the grade of gastrointestinal events and the summarized clinical radiosensitivity. Sensitivity to sunlight was associated with moderate and severe skin reactions. Genetic factors influenced the severity of genitourinary (XRCC1 194 (C>T), XRCC1 280 (G>A)) and skin adverse events (XRCC1 194 (C>T), XRCC1 280 (G>A)), as well as the summarized clinical radiosensitivity (XRCC1 194 (C>T)). The risk factors for development of early radiation adverse events found in the present study are smoking, sensitivity to sunlight, and the SNPs XRCC1 194 (C>T) and XRCC1 280 (G>A).

  16. Estimating extreme flood probabilities

    International Nuclear Information System (INIS)

    Estimates of the exceedance probabilities of extreme floods are needed for the assessment of flood hazard at Department of Energy facilities. A new approach using a joint probability distribution of extreme rainfalls and antecedent soil moisture conditions, along with a rainfall runoff model, provides estimates of probabilities for floods approaching the probable maximum flood. This approach is illustrated for a 570 km2 catchment in Wisconsin and a 260 km2 catchment in Tennessee
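The joint-distribution approach described above can be sketched with the law of total probability: condition the flood exceedance probability on antecedent soil-moisture states and mix over their probabilities. All numbers below are hypothetical, chosen only to show the mechanics:

```python
# Law of total probability applied to flood exceedance:
# P(flood) = sum over soil states of P(state) * P(flood | state).
# All probabilities here are invented for illustration.

def flood_exceedance(p_state, p_flood_given_state):
    """Unconditional exceedance probability from a discrete mixture of
    antecedent soil-moisture states."""
    return sum(p_state[s] * p_flood_given_state[s] for s in p_state)

p_soil = {"dry": 0.6, "average": 0.3, "saturated": 0.1}
p_flood = {"dry": 1e-5, "average": 1e-4, "saturated": 2e-3}
print(flood_exceedance(p_soil, p_flood))  # ≈ 2.36e-04
```

The rare saturated state dominates the total even with only 10% probability, which is why antecedent moisture matters for extreme-flood estimates.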

  17. Introduction to probability

    CERN Document Server

    Roussas, George G

    2006-01-01

    Roussas's Introduction to Probability features exceptionally clear explanations of the mathematics of probability theory and explores its diverse applications through numerous interesting and motivational examples. It provides a thorough introduction to the subject for professionals and advanced students taking their first course in probability. The content is based on the introductory chapters of Roussas's book, An Intoduction to Probability and Statistical Inference, with additional chapters and revisions. Written by a well-respected author known for great exposition an

  18. Dependent Probability Spaces

    Science.gov (United States)

    Edwards, William F.; Shiflett, Ray C.; Shultz, Harris

    2008-01-01

    The mathematical model used to describe independence between two events in probability has a non-intuitive consequence called dependent spaces. The paper begins with a very brief history of the development of probability, then defines dependent spaces, and reviews what is known about finite spaces with uniform probability. The study of finite…
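The independence definition the paper starts from, P(A ∩ B) = P(A)P(B), can be checked directly on a finite space with uniform probability. A sketch using exact rational arithmetic; the die events are invented examples:

```python
# Checking independence of two events under the uniform probability
# on a finite sample space, with exact fractions.
from fractions import Fraction

def is_independent(space, a, b):
    """True iff P(a & b) == P(a) * P(b) under uniform probability."""
    n = len(space)
    def p(event):
        return Fraction(len(event & space), n)
    return p(a & b) == p(a) * p(b)

die = frozenset(range(1, 7))   # one roll of a fair die
even = frozenset({2, 4, 6})
low = frozenset({1, 2})        # P(even & low) = 1/6 = 1/2 * 1/3
four_six = frozenset({4, 6})   # P(even & four_six) = 1/3 != 1/2 * 1/3
print(is_independent(die, even, low))       # True
print(is_independent(die, even, four_six))  # False
```

Exact `Fraction` arithmetic avoids the float round-off that would make an equality test unreliable.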

  19. Interpretations of probability

    CERN Document Server

    Khrennikov, Andrei

    2009-01-01

    This is the first fundamental book devoted to non-Kolmogorov probability models. It provides a mathematical theory of negative probabilities, with numerous applications to quantum physics, information theory, complexity, biology and psychology. The book also presents an interesting model of cognitive information reality with flows of information probabilities, describing the process of thinking, social, and psychological phenomena.

  20. Philosophy and probability

    CERN Document Server

    Childers, Timothy

    2013-01-01

    Probability is increasingly important for our understanding of the world. What is probability? How do we model it, and how do we use it? Timothy Childers presents a lively introduction to the foundations of probability and to philosophical issues it raises. He keeps technicalities to a minimum, and assumes no prior knowledge of the subject. He explains the main interpretations of probability-frequentist, propensity, classical, Bayesian, and objective Bayesian-and uses stimulatingexamples to bring the subject to life. All students of philosophy will benefit from an understanding of probability,

  1. Introduction to probability models

    CERN Document Server

    Ross, Sheldon M

    2006-01-01

    Introduction to Probability Models, Tenth Edition, provides an introduction to elementary probability theory and stochastic processes. There are two approaches to the study of probability theory. One is heuristic and nonrigorous, and attempts to develop in students an intuitive feel for the subject that enables him or her to think probabilistically. The other approach attempts a rigorous development of probability by using the tools of measure theory. The first approach is employed in this text. The book begins by introducing basic concepts of probability theory, such as the random v

  2. Pre-test analysis for the KNGR LBLOCA DVI performance test using a best estimate code MARS

    International Nuclear Information System (INIS)

    Pre-test analysis using a MARS code has been performed for the KNGR (Korean Next Generation Reactor) DVI (Direct Vessel Injection) performance test facility which is a full height and 1/24.3 volume scaled separate effects test facility focusing on the identification of multi-dimensional thermal-hydraulic phenomena in the downcomer during the reflood conditions of a large break LOCA. From the steady state analyses for various test cases at late reflood condition, the degree of major thermal-hydraulic phenomena such as ECC bypass, ECC penetration, steam condensation, and water level sweep-out are quantified. The MARS code analysis results showed that: (a) multi-dimensional flow and temperature behaviors occurred in the downcomer region as expected, (b) the proximity of ECC injection to the break caused more ECC bypass and less steam condensation efficiency, (c) increasing steam flow rate resulted in more ECC bypass and less steam condensation, (d) and the high velocity of steam flow swept-out the water in the downcomer just below the cold leg nozzle. These results are comparable with those observed in the previous tests such as UPTF and CCTF. (author)

  3. RELAP4 pre-test predictions for the LOFT transient (blowdown) DNB tests in the Columbia University test loop

    International Nuclear Information System (INIS)

    Rod bundles simulating the LOFT Core-1, both with and without rod external thermocouple simulators, will be tested to determine the effect of rod external thermocouples on time-to-DNB under blowdown conditions similar to those in LOFT. Pre-test predictions have been made using the RELAP4 computer code. The purposes of this analysis were (1) to predict blowdown orifice sizes which result in the closest simulation of coolant pressure, quality, and flow rate between the test section and the LOFT core for a LOFT 200 percent simulated cold leg break at a peak linear heat generation rate of 19 kw/ft, (2) to determine ranges for instrumentation, and (3) to estimate the time-to-DNB in the rod bundles. An exact simulation of the LOFT blowdown conditions, however, cannot be obtained in the test section because the rod bundles have a uniform axial power profile and the Columbia test loop is not scaled to the LOFT configuration

  4. Pre-test of the KYLIN-II thermal-hydraulics mixed circulation LBE loop using RELAP5

    International Nuclear Information System (INIS)

    To investigate the behavior of lead-bismuth eutectic (LBE) as a coolant in the China LEAd-based Research Reactor, the Institute of Nuclear Energy Safety Technology (INEST), Chinese Academy of Sciences, has built a multi-functional LBE experiment facility, KYLIN-II. The mixed circulation loop, which is one of the KYLIN-II thermal-hydraulics loops, can drive the flowing LBE in different ways, such as by pump, gas lift, and temperature difference (natural circulation). In this contribution, preliminary numerical simulations in support of the operation and experiments of the KYLIN-II thermal-hydraulics mixed circulation LBE loop have been carried out and the obtained results have been studied. RELAP5 Mod4.0 with an LBE model has been utilized. Pre-test analysis showed that the required LBE circulation capability can be reached under the several driving modes. The maximum velocity in the fuel pin bundles can exceed 0.15 m/s for natural circulation, 0.5 m/s for gas-enhanced circulation, and 2 m/s for pump-driven circulation. (author)

  5. Asbestos and Probable Microscopic Polyangiitis

    Directory of Open Access Journals (Sweden)

    George S Rashed Philteos

    2004-01-01

    Full Text Available Several inorganic dust lung diseases (pneumoconioses) are associated with autoimmune diseases. Although autoimmune serological abnormalities are common in asbestosis, clinical autoimmune/collagen vascular diseases are not commonly reported. A case of pulmonary asbestosis complicated by perinuclear-antineutrophil cytoplasmic antibody (myeloperoxidase) positive probable microscopic polyangiitis (glomerulonephritis, pericarditis, alveolitis, mononeuritis multiplex) is described, and the possible immunological mechanisms whereby asbestos fibres might be relevant in the induction of antineutrophil cytoplasmic antibodies are reviewed in the present report.

  6. Choice Probability Generating Functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel L; Bierlaire, Michel

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice...... probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications....
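    As an illustration of the generating-function idea (our own sketch, not taken from the paper), the log-sum-exp function acts as a CPGF for the multinomial logit model: its gradient reproduces the logit choice probabilities. A minimal check in Python, with the utility vector `v` chosen arbitrarily:

    ```python
    import math

    def logsumexp(v):
        """CPGF for multinomial logit: G(v) = log(sum_j exp(v_j))."""
        m = max(v)
        return m + math.log(sum(math.exp(x - m) for x in v))

    def logit_probs(v):
        """Logit choice probabilities P_i = exp(v_i) / sum_j exp(v_j)."""
        denom = sum(math.exp(x) for x in v)
        return [math.exp(x) / denom for x in v]

    # The gradient of the CPGF recovers the choice probabilities;
    # verify numerically with central finite differences.
    v = [1.0, 0.0, -0.5]   # arbitrary utilities (illustrative)
    h = 1e-6
    grad = []
    for i in range(len(v)):
        vp, vm = list(v), list(v)
        vp[i] += h
        vm[i] -= h
        grad.append((logsumexp(vp) - logsumexp(vm)) / (2 * h))
    ```

    The numerical gradient agrees with the closed-form logit probabilities, which is exactly the CPGF property the abstract describes for the ARUM family.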

  7. Handbook of probability

    CERN Document Server

    Florescu, Ionut

    2013-01-01

    THE COMPLETE COLLECTION NECESSARY FOR A CONCRETE UNDERSTANDING OF PROBABILITY Written in a clear, accessible, and comprehensive manner, the Handbook of Probability presents the fundamentals of probability with an emphasis on the balance of theory, application, and methodology. Utilizing basic examples throughout, the handbook expertly transitions between concepts and practice to allow readers an inclusive introduction to the field of probability. The book provides a useful format with self-contained chapters, allowing the reader easy and quick reference. Each chapter includes an introduction

  8. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2013-01-01

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice...... probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended to...

  9. Real analysis and probability

    CERN Document Server

    Ash, Robert B; Lukacs, E

    1972-01-01

    Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory. Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var

  10. Qubit persistence probability

    International Nuclear Information System (INIS)

    In this work, I formulate the persistence probability for a qubit device as the probability of measuring its computational degrees of freedom in the unperturbed state without the decoherence arising from environmental interactions. A decoherence time can be obtained from the persistence probability. Drawing on recent work of Garg, and also Palma, Suominen, and Ekert, I apply the persistence probability formalism to a generic single-qubit device coupled to a thermal environment, and also apply it to a trapped-ion quantum register coupled to the ion vibrational modes. (author)

  11. Pre-test metyrapone impairs memory recall in fear conditioning tasks: lack of interaction with β-adrenergic activity

    Directory of Open Access Journals (Sweden)

    Mariella B.L. Careaga

    2015-03-01

    Full Text Available Cognitive processes, such as learning and memory, are essential for our adaptation to environmental changes and consequently for survival. Numerous studies indicate that hormones secreted during stressful situations, such as glucocorticoids (GCs), adrenaline and noradrenaline, regulate memory functions, modulating aversive memory consolidation and retrieval in an interactive and complementary way. Thus, the facilitatory effects of GCs on memory consolidation, as well as their suppressive effects on retrieval, are substantially explained by this interaction. On the other hand, low levels of GCs are also associated with negative effects on memory consolidation and retrieval, and the mechanisms involved are not well understood. The present study sought to investigate the consequences of blocking the rise of GCs on fear memory retrieval in multiple tests, assessing the participation of β-adrenergic signaling in this effect. Metyrapone (a GC synthesis inhibitor), administered 90 min before the first test of contextual or auditory fear conditioning, negatively affected the animals’ performances, but this effect did not persist on a subsequent test, when the conditioned response was again expressed. This result suggests that the treatment impaired fear memory retrieval during the first evaluation. Administration immediately after the first test did not affect the animals’ performances in contextual fear conditioning, suggesting that the drug did not interfere with processes triggered by memory reactivation. Moreover, the effects of metyrapone were independent of β-adrenergic signaling, since concurrent administration with propranolol, a β-adrenergic antagonist, did not modify the effects induced by metyrapone alone. These results demonstrate that pre-test metyrapone administration impaired fear memory retrieval and that this action was independent of β-adrenergic signaling.

  12. Pre-Test Assessment of the Use Envelope of the Normal Force of a Wind Tunnel Strain-Gage Balance

    Science.gov (United States)

    Ulbrich, N.

    2016-01-01

    The relationship between the aerodynamic lift force generated by a wind tunnel model, the model weight, and the measured normal force of a strain-gage balance is investigated to better understand the expected use envelope of the normal force during a wind tunnel test. First, the fundamental relationship between normal force, model weight, lift curve slope, model reference area, dynamic pressure, and angle of attack is derived. Then, based on this fundamental relationship, the use envelope of a balance is examined for four typical wind tunnel test cases. The first case looks at the use envelope of the normal force during the test of a light wind tunnel model at high subsonic Mach numbers. The second case examines the use envelope of the normal force during the test of a heavy wind tunnel model in an atmospheric low-speed facility. The third case reviews the use envelope of the normal force during the test of a floor-mounted semi-span model. The fourth case discusses the normal force characteristics during the test of a rotated full-span model. The wind tunnel model's lift-to-weight ratio is introduced as a new parameter that may be used for a quick pre-test assessment of the use envelope of the normal force of a balance. The parameter is derived as a function of the lift coefficient, the dimensionless dynamic pressure, and the dimensionless model weight. Lower and upper bounds of the use envelope of a balance are defined using the model's lift-to-weight ratio. Finally, data from a pressurized wind tunnel is used to illustrate both application and interpretation of the model's lift-to-weight ratio.
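    The lift-to-weight ratio introduced above can be sketched as a simple calculation. The function name and the numbers below are illustrative assumptions, not values from the paper; the only relation used is the standard definition of lift, L = CL · q · S:

    ```python
    def lift_to_weight(c_lift, q, s_ref, weight):
        """Lift-to-weight ratio L/W, with lift L = CL * q * S.

        c_lift : lift coefficient (dimensionless)
        q      : dynamic pressure [Pa]
        s_ref  : model reference area [m^2]
        weight : model weight [N]
        """
        return c_lift * q * s_ref / weight

    # Example: a light model at moderate dynamic pressure,
    # where lift exceeds the model weight.
    ratio = lift_to_weight(c_lift=0.5, q=2000.0, s_ref=1.0, weight=500.0)
    ```

    A ratio well above one corresponds to the light-model cases described in the abstract, where the normal force is dominated by aerodynamic lift rather than model weight.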

  13. Introduction to probability

    CERN Document Server

    Freund, John E

    1993-01-01

    Thorough, lucid coverage of permutations and factorials, probabilities and odds, frequency interpretation, mathematical expectation, decision making, postulates of probability, rule of elimination, binomial distribution, geometric distribution, standard deviation, law of large numbers, and much more. Exercises with some solutions. Summary. Bibliography. Includes 42 black-and-white illustrations. 1973 edition.
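    One of the topics listed, the law of large numbers, is easy to demonstrate empirically. A small simulation (our own illustration, not from the book) shows the sample relative frequency approaching the true probability:

    ```python
    import random

    random.seed(0)   # fixed seed for a reproducible illustration
    p = 0.3          # true success probability of each Bernoulli trial
    n = 100_000      # number of independent trials
    successes = sum(1 for _ in range(n) if random.random() < p)
    freq = successes / n   # relative frequency; close to p for large n
    ```

    With 100,000 trials the observed frequency lands within about a percentage point of p, the behaviour the law of large numbers guarantees in the limit.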

  14. On Quantum Conditional Probability

    Directory of Open Access Journals (Sweden)

    Isabel Guerra Bobo

    2013-02-01

    Full Text Available We argue that quantum theory does not allow for a generalization of the notion of classical conditional probability by showing that the probability defined by the Lüders rule, standardly interpreted in the literature as the quantum-mechanical conditionalization rule, cannot be interpreted as such.
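    For reference, the Lüders rule discussed above assigns Pr(Q|P) = Tr(Q PρP) / Tr(Pρ) for projectors P, Q and density matrix ρ. A minimal qubit-sized sketch (our own illustration, using plain 2x2 matrices):

    ```python
    def matmul(a, b):
        """Product of two 2x2 matrices given as nested lists."""
        return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
                for i in range(2)]

    def trace(a):
        return a[0][0] + a[1][1]

    rho = [[0.5, 0.5], [0.5, 0.5]]   # density matrix of |+> = (|0>+|1>)/sqrt(2)
    P   = [[1.0, 0.0], [0.0, 0.0]]   # projector onto |0>
    Q   = [[0.0, 0.0], [0.0, 1.0]]   # projector onto |1>

    # Lüders conditionalization: rho -> P rho P / Tr(P rho)
    PrhoP = matmul(matmul(P, rho), P)
    pr_q_given_p = trace(matmul(Q, PrhoP)) / trace(matmul(P, rho))
    # Conditioning on "measured |0>" leaves zero probability for |1>.
    ```

    The computation makes the rule concrete; whether this quantity deserves to be read as a conditional probability is precisely what the paper disputes.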

  15. Topics in probability

    CERN Document Server

    Prabhu, Narahari

    2011-01-01

    Recent research in probability has been concerned with applications such as data mining and finance models. Some aspects of the foundations of probability theory have receded into the background. Yet, these aspects are very important and have to be brought back into prominence.

  16. Probability, Nondeterminism and Concurrency

    DEFF Research Database (Denmark)

    Varacca, Daniele

    Nondeterminism is modelled in domain theory by the notion of a powerdomain, while probability is modelled by that of the probabilistic powerdomain. Some problems arise when we want to combine them in order to model computation in which both nondeterminism and probability are present. In particular...

  17. Economy, probability and risk

    Directory of Open Access Journals (Sweden)

    Elena Druica

    2007-05-01

    Full Text Available The science of probability has earned a special place because it tried, through its concepts, to build a bridge between theory and experiment. As a formal notion which by definition does not invite polemic, probability nevertheless meets a series of difficulties of interpretation whenever it must be applied to particular situations. Usually, the economic literature brings into discussion two interpretations of the concept of probability: the objective interpretation, often found under the name of the frequency or statistical interpretation, and the subjective or personal interpretation. Surprisingly, a third approach is excluded: the logical interpretation. The purpose of the present paper is to study some aspects of the subjective and logical interpretations of probability, as well as their implications for economics.

  18. Learning Probability in the Arts Stream Classes: Do Colour Balls with STAD Cooperative Learning Help in Improving Students’ Performance?

    OpenAIRE

    Siew Nyet Moi; Abdullah Sopiah; Kueh Ngie King

    2013-01-01

    Aims: 1. To investigate the effects of the concrete learning aids (Colour Balls) with Student Teams-Achievement Division (STAD) cooperative learning (CBCL) method on Form Four Arts Stream students’ performance in probability; 2. To find out students’ perceptions of the use of the CBCL method in learning probability. Study Design: Quasi-experimental pre-test post-test control group design. Two treatment groups were employed in this design; they were CBCL (experimental gr...

  19. The concept of probability

    International Nuclear Information System (INIS)

    The concept of probability is now, and always has been, central to the debate on the interpretation of quantum mechanics. Furthermore, probability permeates all of science, as well as our everyday life. The papers included in this volume, written by leading proponents of the ideas expressed, embrace a broad spectrum of thought and results: mathematical, physical, epistemological, and experimental, both specific and general. The contributions are arranged in parts under the following headings: Following Schroedinger's thoughts; Probability and quantum mechanics; Aspects of the arguments on nonlocality; Bell's theorem and EPR correlations; Real or Gedanken experiments and their interpretation; Questions about irreversibility and stochasticity; and Epistemology, interpretation and culture. (author). refs.; figs.; tabs

  20. Probability and Measure

    CERN Document Server

    Billingsley, Patrick

    2012-01-01

    Praise for the Third Edition "It is, as far as I'm concerned, among the best books in math ever written....if you are a mathematician and want to have the top reference in probability, this is it." (Amazon.com, January 2006) A complete and comprehensive classic in probability and measure theory Probability and Measure, Anniversary Edition by Patrick Billingsley celebrates the achievements and advancements that have made this book a classic in its field for the past 35 years. Now re-issued in a new style and format, but with the reliable content that the third edition was revered for, this

  1. Probability and Bayesian statistics

    CERN Document Server

    1987-01-01

    This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics" which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985, the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are published especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability across psychological aspects of formulating subjective probability statements, abstract measure-theoretical considerations, contributions to theoretical statistics an...

  2. Probabilities in physics

    CERN Document Server

    Hartmann, Stephan

    2011-01-01

    Many results of modern physics--those of quantum mechanics, for instance--come in a probabilistic guise. But what do probabilistic statements in physics mean? Are probabilities matters of objective fact and part of the furniture of the world, as objectivists think? Or do they only express ignorance or belief, as Bayesians suggest? And how are probabilistic hypotheses justified and supported by empirical evidence? Finally, what does the probabilistic nature of physics imply for our understanding of the world? This volume is the first to provide a philosophical appraisal of probabilities in all of physics. Its main aim is to make sense of probabilistic statements as they occur in the various physical theories and models and to provide a plausible epistemology and metaphysics of probabilities. The essays collected here consider statistical physics, probabilistic modelling, and quantum mechanics, and critically assess the merits and disadvantages of objectivist and subjectivist views of probabilities in these fields...

  3. Stochastic Programming with Probability

    CERN Document Server

    Andrieu, Laetitia; Vázquez-Abad, Felisa

    2007-01-01

    In this work we study optimization problems subject to a failure constraint. This constraint is expressed in terms of a condition that causes failure, representing a physical or technical breakdown. We formulate the problem in terms of a probability constraint, where the level of "confidence" is a modelling parameter and has the interpretation that the probability of failure should not exceed that level. Application of the stochastic Arrow-Hurwicz algorithm poses two difficulties: one is structural and arises from the lack of convexity of the probability constraint, and the other is the estimation of the gradient of the probability constraint. We develop two gradient estimators with decreasing bias via a convolution method and a finite difference technique, respectively, and we provide a full analysis of convergence of the algorithms. Convergence results are used to tune the parameters of the numerical algorithms in order to achieve best convergence rates, and numerical results are included via an example of ...

  4. Probability an introduction

    CERN Document Server

    Grimmett, Geoffrey

    2014-01-01

    Probability is an area of mathematics of tremendous contemporary importance across all aspects of human endeavour. This book is a compact account of the basic features of probability and random processes at the level of first and second year mathematics undergraduates and Masters' students in cognate fields. It is suitable for a first course in probability, plus a follow-up course in random processes including Markov chains. A special feature is the authors' attention to rigorous mathematics: not everything is rigorous, but the need for rigour is explained at difficult junctures. The text is enriched by simple exercises, together with problems (with very brief hints) many of which are taken from final examinations at Cambridge and Oxford. The first eight chapters form a course in basic probability, being an account of events, random variables, and distributions - discrete and continuous random variables are treated separately - together with simple versions of the law of large numbers and the central limit theorem...

  5. Probability in physics

    CERN Document Server

    Hemmo, Meir

    2012-01-01

    What is the role and meaning of probability in physical theory, in particular in two of the most successful theories of our age, quantum physics and statistical mechanics? Laws once conceived as universal and deterministic, such as Newton's laws of motion, or the second law of thermodynamics, are replaced in these theories by inherently probabilistic laws. This collection of essays by some of the world's foremost experts presents an in-depth analysis of the meaning of probability in contemporary physics. Among the questions addressed are: How are probabilities defined? Are they objective or subjective? What is their explanatory value? What are the differences between quantum and classical probabilities? The result is an informative and thought-provoking book for the scientifically inquisitive.

  6. Estimating Subjective Probabilities

    DEFF Research Database (Denmark)

    Andersen, Steffen; Fountain, John; Harrison, Glenn W.;

    2014-01-01

    either construct elicitation mechanisms that control for risk aversion, or construct elicitation mechanisms which undertake 'calibrating adjustments' to elicited reports. We illustrate how the joint estimation of risk attitudes and subjective probabilities can provide the calibration adjustments that...

  7. Estimating Subjective Probabilities

    DEFF Research Database (Denmark)

    Andersen, Steffen; Fountain, John; Harrison, Glenn W.;

    either construct elicitation mechanisms that control for risk aversion, or construct elicitation mechanisms which undertake “calibrating adjustments” to elicited reports. We illustrate how the joint estimation of risk attitudes and subjective probabilities can provide the calibration adjustments that...

  8. Probability and Statistical Inference

    OpenAIRE

    Prosper, Harrison B.

    2006-01-01

    These lectures introduce key concepts in probability and statistical inference at a level suitable for graduate students in particle physics. Our goal is to paint as vivid a picture as possible of the concepts covered.

  9. Probability with Roulette

    Science.gov (United States)

    Marshall, Jennings B.

    2007-01-01

    This article describes how roulette can be used to teach basic concepts of probability. Various bets are used to illustrate the computation of expected value. A betting system shows variations in patterns that often appear in random events.
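    The expected-value computation the article describes can be sketched as follows. The numbers are for a straight-up bet on an American double-zero wheel (38 pockets, 35:1 payout) and are standard, but the function itself is our illustration:

    ```python
    def expected_value(p_win, payout, stake=1.0):
        """Expected profit per bet: win pays `payout`, a loss forfeits `stake`."""
        return p_win * payout - (1 - p_win) * stake

    # Straight-up bet: 1 winning pocket out of 38, paying 35 to 1.
    ev = expected_value(p_win=1/38, payout=35.0)
    # ev = (35 - 37) / 38, about -0.0526: a house edge of roughly 5.26%.
    ```

    The same function applied to other bets (red/black, dozens, splits) shows the identical house edge, which is why no betting system changes the long-run expectation.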

  10. Monte Carlo transition probabilities

    OpenAIRE

    Lucy, L. B.

    2001-01-01

    Transition probabilities governing the interaction of energy packets and matter are derived that allow Monte Carlo NLTE transfer codes to be constructed without simplifying the treatment of line formation. These probabilities are such that the Monte Carlo calculation asymptotically recovers the local emissivity of a gas in statistical equilibrium. Numerical experiments with one-point statistical equilibrium problems for Fe II and Hydrogen confirm this asymptotic behaviour. In addition, the re...

  11. Probability in quantum mechanics

    Directory of Open Access Journals (Sweden)

    J. G. Gilson

    1982-01-01

    Full Text Available By using a fluid theory which is an alternative to quantum theory but from which the latter can be deduced exactly, the long-standing problem of how quantum mechanics is related to stochastic processes is studied. It can be seen how the Schrödinger probability density has a relationship to time spent on small sections of an orbit, just as the probability density has in some classical contexts.

  12. Bayesian default probability models

    OpenAIRE

    Andrlíková, Petra

    2014-01-01

    This paper proposes a methodology for default probability estimation for low default portfolios, where the statistical inference may become troublesome. The author suggests using logistic regression models with the Bayesian estimation of parameters. The piecewise logistic regression model and Box-Cox transformation of credit risk score is used to derive the estimates of probability of default, which extends the work by Neagu et al. (2009). The paper shows that the Bayesian models are more accurate...

  13. Evaluation of a reproductive health awareness program for adolescence in urban Tanzania-A quasi-experimental pre-test post-test research

    Directory of Open Access Journals (Sweden)

    Iida Mariko

    2011-06-01

    Full Text Available Abstract Background Sub-Saharan Africa is among the regions where 10% of girls become mothers by the age of 16 years old. The United Republic of Tanzania, located in Sub-Saharan Africa, is one country where teenage pregnancy is a problem facing adolescent girls. Adolescent pregnancy has been identified as one of the reasons for girls dropping out of school. This study's purpose was to evaluate a reproductive health awareness program for the improvement of reproductive health for adolescents in urban Tanzania. Methods A quasi-experimental pre-test and post-test research design was conducted to evaluate adolescents' knowledge, attitude, and behavior about reproductive health before and after the program. Data were collected from students aged 11 to 16 at Ilala Municipal, Dar es Salaam, Tanzania. An anonymous 23-item questionnaire provided the data. The program was conducted using a picture drama, reproductive health materials and group discussion. Results In total, 313 questionnaires were distributed and 305 (97.4%) were useable for the final analysis. The mean age was 12.5 years for girls and 13.2 years for boys. A large minority of both girls (26.8%) and boys (41.4%) had experienced sex, and among the girls who had experienced sex, 51.2% reported that it was by force. The girls' mean score in the knowledge pre-test was 5.9, and 6.8 in the post-test, a significant increase (t = 7.9, p = 0.000). The mean behavior pre-test score was 25.8 and the post-test score was 26.6, also a significant increase (t = 3.0, p = 0.003). The boys' mean score in the knowledge pre-test was 6.4 and 7.0 in the post-test, a significant increase (t = 4.5, p = 0.000). The mean behavior pre-test score was 25.6 and 26.4 in the post-test, a significant increase (t = 2.4, p = 0.019). However, the pre-test and post-test attitude scores showed no statistically significant difference for either girls or boys. Conclusions Teenagers have sexual experiences including
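    The pre-test/post-test comparisons reported above rest on paired t statistics. A stdlib-only sketch of that computation, with invented scores for illustration (not the study's data):

    ```python
    import math
    import statistics

    def paired_t(pre, post):
        """Paired t statistic: mean of the differences over its standard error."""
        diffs = [b - a for a, b in zip(pre, post)]
        n = len(diffs)
        return statistics.mean(diffs) / (statistics.stdev(diffs) / math.sqrt(n))

    # Hypothetical knowledge scores for four students, before and after:
    pre  = [5.0, 6.0, 7.0, 8.0]
    post = [6.0, 8.0, 7.0, 10.0]
    t = paired_t(pre, post)   # positive t indicates an increase after the program
    ```

    Each (pre, post) pair belongs to the same student, which is what makes the paired (rather than independent-samples) t test the appropriate comparison here.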

  14. Experimental Probability in Elementary School

    Science.gov (United States)

    Andrew, Lane

    2009-01-01

    Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.

  15. The pleasures of probability

    CERN Document Server

    Isaac, Richard

    1995-01-01

    The ideas of probability are all around us. Lotteries, casino gambling, the almost non-stop polling which seems to mold public policy more and more--these are a few of the areas where principles of probability impinge in a direct way on the lives and fortunes of the general public. At a more removed level there is modern science which uses probability and its offshoots like statistics and the theory of random processes to build mathematical descriptions of the real world. In fact, twentieth-century physics, in embracing quantum mechanics, has a world view that is at its core probabilistic in nature, contrary to the deterministic one of classical physics. In addition to all this muscular evidence of the importance of probability ideas it should also be said that probability can be lots of fun. It is a subject where you can start thinking about amusing, interesting, and often difficult problems with very little mathematical background. In this book, I wanted to introduce a reader with at least a fairl...

  16. Improving Ranking Using Quantum Probability

    OpenAIRE

    Melucci, Massimo

    2011-01-01

    The paper shows that ranking information units by quantum probability differs from ranking them by classical probability, provided the same data are used for parameter estimation. As probability of detection (also known as recall or power) and probability of false alarm (also known as fallout or size) measure the quality of ranking, we point out and show that ranking by quantum probability yields higher probability of detection than ranking by classical probability provided a given probability of ...

  17. Collision Probability Analysis

    DEFF Research Database (Denmark)

    Hansen, Peter Friis; Pedersen, Preben Terndrup

    1998-01-01

    It is the purpose of this report to apply a rational model for prediction of ship-ship collision probabilities as a function of the ship and the crew characteristics and the navigational environment for MS Dextra sailing on a route between Cadiz and the Canary Islands. The most important ship and crew...... characteristics are: ship speed, ship manoeuvrability, the layout of the navigational bridge, the radar system, the number and the training of navigators, the presence of a look-out, etc. The main parameters affecting the navigational environment are ship traffic density, probability distributions of wind speeds...... probability, i.e. a study of the navigator's role in resolving critical situations, a causation factor is derived as a second step. The report documents the first step in a probabilistic collision damage analysis. Future work will include calculation of energy released for crushing of structures giving a...

  18. Probability of causation

    International Nuclear Information System (INIS)

    New Zealand population and cancer statistics have been used to derive the probability that an existing cancer in an individual was the result of a known exposure to radiation. Hypothetical case histories illustrate how sex, race, age at exposure, age at presentation with disease, and the type of cancer affect this probability. The method can be used now to identify claims in which a link between exposure and disease is very strong or very weak, and the types of cancer and population sub-groups for which radiation is most likely to be the causative agent. Advantages and difficulties in using a probability of causation approach in legal or compensation hearings are outlined. The approach is feasible for any carcinogen for which reasonable risk estimates can be made
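    The probability-of-causation approach described above is commonly formalized as PC = ERR / (1 + ERR), where ERR is the excess relative risk attributable to the exposure. A sketch of that standard formula (our choice of formulation; the report's exact method may differ):

    ```python
    def probability_of_causation(excess_relative_risk):
        """PC = ERR / (1 + ERR): the share of total risk attributable to exposure.

        excess_relative_risk: additional risk from the exposure, expressed as a
        multiple of the baseline risk (ERR = 1 means the exposure doubled it).
        """
        err = excess_relative_risk
        return err / (1.0 + err)

    # If radiation exposure doubled the baseline cancer risk (ERR = 1),
    # the probability of causation is 50%.
    pc = probability_of_causation(1.0)
    ```

    The formula makes the legal difficulty visible: PC exceeds 50% only when the estimated radiation risk exceeds the baseline risk, which is exactly the strong-link versus weak-link distinction the abstract draws.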

  19. Minimum Probability Flow Learning

    CERN Document Server

    Sohl-Dickstein, Jascha; DeWeese, Michael R

    2009-01-01

    Learning in probabilistic models is often severely hampered by the general intractability of the normalization factor and its derivatives. Here we propose a new learning technique that obviates the need to compute an intractable normalization factor or sample from the equilibrium distribution of the model. This is achieved by establishing dynamics that would transform the observed data distribution into the model distribution, and then setting as the objective the minimization of the initial flow of probability away from the data distribution. Score matching, minimum velocity learning, and certain forms of contrastive divergence are shown to be special cases of this learning technique. We demonstrate the application of minimum probability flow learning to parameter estimation in Ising models, deep belief networks, multivariate Gaussian distributions and a continuous model with a highly general energy function defined as a power series. In the Ising model case, minimum probability flow learning outperforms current...

  20. Introduction to imprecise probabilities

    CERN Document Server

    Augustin, Thomas; de Cooman, Gert; Troffaes, Matthias C M

    2014-01-01

    In recent years, the theory has become widely accepted and has been further developed, but a detailed introduction is needed in order to make the material available and accessible to a wide audience. This will be the first book providing such an introduction, covering core theory and recent developments which can be applied to many application areas. All authors of individual chapters are leading researchers on the specific topics, assuring high quality and up-to-date contents. An Introduction to Imprecise Probabilities provides a comprehensive introduction to imprecise probabilities, includin

  1. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2010-01-01

    This paper establishes that every random utility discrete choice model (RUM) has a representation that can be characterized by a choice-probability generating function (CPGF) with specific properties, and that every function with these specific properties is consistent with a RUM. The choice...... probabilities from the RUM are obtained from the gradient of the CPGF. Mixtures of RUM are characterized by logarithmic mixtures of their associated CPGF. The paper relates CPGF to multivariate extreme value distributions, and reviews and extends methods for constructing generating functions for applications...
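In the multinomial logit special case, the CPGF construction is easy to check numerically: the generating function is a log-sum-exp of the utilities, and its gradient recovers the softmax choice probabilities. A sketch under the logit assumption, not the paper's general RUM case:

```python
import numpy as np

# Logit CPGF: G(u) = log(sum_j exp(u_j)). Its gradient with respect to the
# utility vector u equals the vector of choice probabilities (the softmax).

def cpgf(u):
    return np.log(np.sum(np.exp(u)))

def choice_probabilities(u):
    e = np.exp(u - np.max(u))        # numerically stable softmax
    return e / e.sum()

u = np.array([1.0, 0.0, -1.0])
p = choice_probabilities(u)
print(np.allclose(p.sum(), 1.0))  # True

# Numerical check that the gradient of G recovers p (central differences):
eps = 1e-6
grad = np.array([
    (cpgf(u + eps * np.eye(3)[i]) - cpgf(u - eps * np.eye(3)[i])) / (2 * eps)
    for i in range(3)
])
print(np.allclose(grad, p, atol=1e-6))  # True
```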

  2. Negative Probabilities and Contextuality

    CERN Document Server

    de Barros, J Acacio; Oas, Gary

    2015-01-01

    There has been a growing interest, both in physics and psychology, in understanding contextuality in experimentally observed quantities. Different approaches have been proposed to deal with contextual systems, and a promising one is contextuality-by-default, put forth by Dzhafarov and Kujala. The goal of this paper is to present a tutorial on a different approach: negative probabilities. We do so by presenting the overall theory of negative probabilities in a way that is consistent with contextuality-by-default and by examining with this theory some simple examples where contextuality appears, both in physics and psychology.

  3. Classic Problems of Probability

    CERN Document Server

    Gorroochurn, Prakash

    2012-01-01

    "A great book, one that I will certainly add to my personal library."—Paul J. Nahin, Professor Emeritus of Electrical Engineering, University of New Hampshire Classic Problems of Probability presents a lively account of the most intriguing aspects of statistics. The book features a large collection of more than thirty classic probability problems which have been carefully selected for their interesting history, the way they have shaped the field, and their counterintuitive nature. From Cardano's 1564 Games of Chance to Jacob Bernoulli's 1713 Golden Theorem to Parrondo's 1996 Perplexin

  4. Probably Almost Bayes Decisions

    DEFF Research Database (Denmark)

    Anoulova, S.; Fischer, Paul; Poelt, S.; Simon, H.- U.

    1996-01-01

    In this paper, we investigate the problem of classifying objects which are given by feature vectors with Boolean entries. Our aim is to "(efficiently) learn probably almost optimal classifications" from examples. A classical approach in pattern recognition uses empirical estimations of the Bayesian...

  5. Epistemology and Probability

    CERN Document Server

    Plotnitsky, Arkady

    2010-01-01

    Offers an exploration of the relationships between epistemology and probability in the work of Niels Bohr, Werner Heisenberg, and Erwin Schrodinger; in quantum mechanics; and in modern physics. This book considers the implications of these relationships and of quantum theory for our understanding of the nature of thinking and knowledge in general

  6. Logic, Truth and Probability

    OpenAIRE

    Quznetsov, Gunn

    1998-01-01

Propositional logic is generalized to the field of real numbers. The logical analog of the Bernoulli independent-trials scheme is constructed. A variant of nonstandard analysis is adapted for the definition of the logical function, which has all the properties of the classical probability function. The logical analog of the Law of Large Numbers is deduced from the properties of this function.

  7. Logic and probability

    OpenAIRE

    Quznetsov, G. A.

    2003-01-01

Propositional logic is generalized to the field of real numbers. The logical analog of the Bernoulli independent-trials scheme is constructed. A variant of nonstandard analysis is adapted for the definition of the logical function, which has all the properties of the classical probability function. The logical analog of the Law of Large Numbers is deduced from the properties of this function.

  8. Transition probabilities for atoms

    International Nuclear Information System (INIS)

    Current status of advanced theoretical methods for transition probabilities for atoms and ions is discussed. An experiment on the f values of the resonance transitions of the Kr and Xe isoelectronic sequences is suggested as a test for the theoretical methods

  9. Counterexamples in probability

    CERN Document Server

    Stoyanov, Jordan M

    2013-01-01

    While most mathematical examples illustrate the truth of a statement, counterexamples demonstrate a statement's falsity. Enjoyable topics of study, counterexamples are valuable tools for teaching and learning. The definitive book on the subject in regards to probability, this third edition features the author's revisions and corrections plus a substantial new appendix.

  10. Negative probability in the framework of combined probability

    OpenAIRE

    Burgin, Mark

    2013-01-01

    Negative probability has found diverse applications in theoretical physics. Thus, construction of sound and rigorous mathematical foundations for negative probability is important for physics. There are different axiomatizations of conventional probability. So, it is natural that negative probability also has different axiomatic frameworks. In the previous publications (Burgin, 2009; 2010), negative probability was mathematically formalized and rigorously interpreted in the context of extende...

  11. Waste Package Misload Probability

    Energy Technology Data Exchange (ETDEWEB)

    J.K. Knudsen

    2001-11-20

The objective of this calculation is to determine the probability of occurrence of fuel assembly (FA) misloads (i.e., an FA placed in the wrong location) and of FA damage during FA movements. The scope of this calculation is provided by the information obtained from the Framatome ANP 2001a report. The first step is to categorize the fuel-handling events that have occurred at nuclear power plants, with categories based on whether FAs were damaged or misloaded. The next step is to determine the total number of FAs involved in the events. Using this information, probabilities of occurrence are calculated for FA misload and FA damage events. This calculation is an expansion of preliminary work performed by Framatome ANP 2001a.
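The frequency estimate described (event counts divided by the total number of FA movements) can be sketched as follows; the counts are hypothetical, not taken from the Framatome ANP report:

```python
# Frequency-based estimate of fuel assembly (FA) misload and damage
# probabilities: categorize handling events, then divide each category's
# event count by the total number of FA movements observed.
# The counts below are hypothetical illustration values.

def event_probability(n_events, n_fa_movements):
    if n_fa_movements <= 0:
        raise ValueError("need a positive number of FA movements")
    return n_events / n_fa_movements

misload_p = event_probability(12, 400_000)   # hypothetical misload events
damage_p = event_probability(40, 400_000)    # hypothetical damage events
print(f"{misload_p:.1e} {damage_p:.1e}")  # 3.0e-05 1.0e-04
```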

  12. Contributions to quantum probability

    International Nuclear Information System (INIS)

Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that, in general, none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels "possible to occur" or "impossible to occur" to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic approaches. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a finite set can occur as the outcome

  13. Contributions to quantum probability

    Energy Technology Data Exchange (ETDEWEB)

    Fritz, Tobias

    2010-06-25

Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that, in general, none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels "possible to occur" or "impossible to occur" to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic approaches. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a

  14. Measurement uncertainty and probability

    CERN Document Server

    Willink, Robin

    2013-01-01

    A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.

  15. Waste Package Misload Probability

    International Nuclear Information System (INIS)

The objective of this calculation is to determine the probability of occurrence of fuel assembly (FA) misloads (i.e., an FA placed in the wrong location) and of FA damage during FA movements. The scope of this calculation is provided by the information obtained from the Framatome ANP 2001a report. The first step is to categorize the fuel-handling events that have occurred at nuclear power plants, with categories based on whether FAs were damaged or misloaded. The next step is to determine the total number of FAs involved in the events. Using this information, probabilities of occurrence are calculated for FA misload and FA damage events. This calculation is an expansion of preliminary work performed by Framatome ANP 2001a

  16. Paradoxes in probability theory

    CERN Document Server

    Eckhardt, William

    2013-01-01

Paradoxes provide a vehicle for exposing misinterpretations and misapplications of accepted principles. This book discusses seven paradoxes surrounding probability theory. Some remain the focus of controversy; others have allegedly been solved, but the accepted solutions are demonstrably incorrect. Each paradox is shown to rest on one or more fallacies. Instead of the esoteric, idiosyncratic, and untested methods that have been brought to bear on these problems, the book invokes uncontroversial probability principles, acceptable to both frequentists and subjectivists. The philosophical disputation inspired by these paradoxes is shown to be misguided and unnecessary; for instance, startling claims concerning human destiny and the nature of reality are directly related to fallacious reasoning in a betting paradox, and a problem analyzed in philosophy journals is resolved by means of a computer program.

  17. Feasibility of streamlining an interactive Bayesian-based diagnostic support tool designed for clinical practice

    Science.gov (United States)

    Chen, Po-Hao; Botzolakis, Emmanuel; Mohan, Suyash; Bryan, R. N.; Cook, Tessa

    2016-03-01

In radiology, diagnostic errors occur either through failure of detection or through incorrect interpretation. Errors are estimated to occur in 30-35% of all exams and contribute to 40-54% of medical malpractice litigations. In this work, we focus on reducing incorrect interpretation of known imaging features. Existing literature categorizes the cognitive biases that can lead a radiologist to an incorrect diagnosis despite correct recognition of the abnormal imaging features: anchoring bias, framing effect, availability bias, and premature closure. Computational methods make a unique contribution, as they do not exhibit the same cognitive biases as a human. Bayesian networks formalize the diagnostic process. They modify pre-test diagnostic probabilities using clinical and imaging features, arriving at a post-test probability for each possible diagnosis. To translate Bayesian networks to clinical practice, we implemented an entirely web-based open-source software tool. In this tool, the radiologist first selects a network of choice (e.g. basal ganglia). Then, large, clearly labeled buttons displaying salient imaging features are displayed on the screen, serving both as a checklist and as the input mechanism. As the radiologist inputs the value of an extracted imaging feature, the conditional probabilities of each possible diagnosis are updated. The software presents its level of diagnostic discrimination using a Pareto distribution chart, updated with each additional imaging feature. Active collaboration with the clinical radiologist is a feasible approach to software design and leads to design decisions closely coupling the complex mathematics of conditional probability in Bayesian networks with practice.
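The Bayesian-update step such a tool performs can be sketched in a few lines: multiply the pre-test (prior) probability of each diagnosis by the likelihood of the observed imaging feature, then renormalize. The diagnoses, priors, and likelihoods below are hypothetical illustration values, not the tool's actual networks:

```python
# One Bayesian update: prior diagnostic probabilities are reweighted by the
# likelihood of an observed imaging feature and renormalized to sum to 1.
# All numbers are hypothetical.

def update(priors, likelihoods):
    """priors: {dx: P(dx)}; likelihoods: {dx: P(feature | dx)}."""
    post = {dx: priors[dx] * likelihoods[dx] for dx in priors}
    z = sum(post.values())
    return {dx: p / z for dx, p in post.items()}

priors = {"toxoplasmosis": 0.5, "lymphoma": 0.5}              # pre-test
feature_likelihood = {"toxoplasmosis": 0.2, "lymphoma": 0.8}  # hypothetical P(feature | dx)
post = update(priors, feature_likelihood)
print(round(post["lymphoma"], 2))  # 0.8
```

Repeating the call with each newly entered feature reproduces the tool's running post-test probabilities.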

  18. Objectifying Subjective Probabilities

    Czech Academy of Sciences Publication Activity Database

    Childers, Timothy

Dordrecht: Springer, 2012 - (Weber, M.; Dieks, D.; Gonzalez, W.; Hartman, S.; Stadler, F.; Stöltzner, M.), s. 19-28. (The Philosophy of Science in a European Perspective. 3). ISBN 978-94-007-3029-8. [Pluralism in the Foundations of Statistics. Canterbury (GB), 09.09.2010-10.09.2010] R&D Projects: GA ČR(CZ) GAP401/10/1504 Institutional support: RVO:67985955 Keywords: probabilities * direct inference Subject RIV: AA - Philosophy; Religion

  19. Probability via expectation

    CERN Document Server

    Whittle, Peter

    1992-01-01

This book is a complete revision of the earlier work Probability which appeared in 1970. While revised so radically and incorporating so much new material as to amount to a new text, it preserves both the aim and the approach of the original. That aim was stated as the provision of a 'first text in probability, demanding a reasonable but not extensive knowledge of mathematics, and taking the reader to what one might describe as a good intermediate level'. In doing so it attempted to break away from stereotyped applications, and consider applications of a more novel and significant character. The particular novelty of the approach was that expectation was taken as the prime concept, and the concept of expectation axiomatized rather than that of a probability measure. In the preface to the original text of 1970 (reproduced below, together with that to the Russian edition of 1982) I listed what I saw as the advantages of the approach in as unlaboured a fashion as I could. I also took the view that the text...

  20. Probability mapping of contaminants

    International Nuclear Information System (INIS)

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. The probability mapping approach illustrated in this paper appears to offer site operators a reasonable, quantitative methodology for many environmental remediation decisions and allows evaluation of the risk associated with those decisions. For example, output from this approach can be used in quantitative, cost-based decision models for evaluating possible site characterization and/or remediation plans, resulting in selection of the risk-adjusted, least-cost alternative. The methodology is completely general, and the techniques are applicable to a wide variety of environmental restoration projects. The probability-mapping approach is illustrated by application to a contaminated site at the former DOE Feed Materials Production Center near Fernald, Ohio. Soil geochemical data, collected as part of the Uranium-in-Soils Integrated Demonstration Project, have been used to construct a number of geostatistical simulations of potential contamination for parcels approximately the size of a selective remediation unit (the 3-m width of a bulldozer blade). Each such simulation accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination (potential clean-up or personnel-hazard thresholds)
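The post-processing step described above (turning many equally likely simulations into an exceedance-probability map) can be sketched with synthetic data; the lognormal field and the threshold are assumptions for illustration, not the Fernald data:

```python
import numpy as np

# Probability mapping from geostatistical simulations: for each parcel, the
# mapped value is the fraction of equally likely simulations in which the
# contaminant concentration exceeds a clean-up threshold.
# The simulated values here are synthetic illustration data.

rng = np.random.default_rng(0)
n_sims, n_parcels = 200, 10
sims = rng.lognormal(mean=3.0, sigma=1.0, size=(n_sims, n_parcels))

threshold = 35.0                                  # hypothetical clean-up level
prob_exceed = (sims > threshold).mean(axis=0)     # one probability per parcel
print(prob_exceed.shape)  # (10,)
```

Parcels with high `prob_exceed` are candidates for selective remediation; the same array feeds directly into cost-based decision models.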

  1. Probability for physicists

    CERN Document Server

    Sirca, Simon

    2016-01-01

    This book is designed as a practical and intuitive introduction to probability, statistics and random quantities for physicists. The book aims at getting to the main points by a clear, hands-on exposition supported by well-illustrated and worked-out examples. A strong focus on applications in physics and other natural sciences is maintained throughout. In addition to basic concepts of random variables, distributions, expected values and statistics, the book discusses the notions of entropy, Markov processes, and fundamentals of random number generation and Monte-Carlo methods.

  2. Measurement Uncertainty and Probability

    Science.gov (United States)

    Willink, Robin

    2013-02-01

    Part I. Principles: 1. Introduction; 2. Foundational ideas in measurement; 3. Components of error or uncertainty; 4. Foundational ideas in probability and statistics; 5. The randomization of systematic errors; 6. Beyond the standard confidence interval; Part II. Evaluation of Uncertainty: 7. Final preparation; 8. Evaluation using the linear approximation; 9. Evaluation without the linear approximations; 10. Uncertainty information fit for purpose; Part III. Related Topics: 11. Measurement of vectors and functions; 12. Why take part in a measurement comparison?; 13. Other philosophies; 14. An assessment of objective Bayesian methods; 15. A guide to the expression of uncertainty in measurement; 16. Measurement near a limit - an insoluble problem?; References; Index.

  3. Integration, measure and probability

    CERN Document Server

    Pitt, H R

    2012-01-01

    This text provides undergraduate mathematics students with an introduction to the modern theory of probability as well as the roots of the theory's mathematical ideas and techniques. Centered around the concept of measure and integration, the treatment is applicable to other branches of analysis and explores more specialized topics, including convergence theorems and random sequences and functions.The initial part is devoted to an exploration of measure and integration from first principles, including sets and set functions, general theory, and integrals of functions of real variables. These t

  4. Pre-test calculation of reflooding experiments with wider lattice in APWR-geometry (FLORESTAN 2) using the advanced computer code FLUT-FDWR

    International Nuclear Information System (INIS)

    After the reflooding tests in an extremely tight bundle (p/d=1.06, FLORESTAN 1) have been completed, new experiments for a wider lattice (p/d=1.242, FLORESTAN 2), which is employed in the recent APWR design of KfK, are planned at KfK to obtain the benchmark data for validation and improvement of calculation methods. This report presents the results of pre-test calculations for the FLORESTAN 2 experiment using FLUT-FDWR, a modified version of the GRS computer code FLUT for analysis of the most important behaviour during the reflooding phase after a LOCA in the APWR design. (orig.)

5. Establishment of a Clinical Prediction Model to Estimate the Probability of Malignancy in Patients with Solitary Pulmonary Nodules

    Institute of Scientific and Technical Information of China (English)

    张晓辉; 陈成; 曾辉; 宁卫卫; 张楠; 黄建安

    2016-01-01

Objective: To screen the clinical risk factors for lung cancer in patients with solitary pulmonary nodules (SPN) and to build a clinical prediction model estimating the probability of malignancy. Methods: A retrospective analysis was performed on the clinical data and chest imaging characteristics of 270 patients with SPN. Results: Among the 270 patients, 110 (40.7%) had lung cancer and 160 (59.3%) had benign lesions. On analysis of imaging characteristics, lobulation, spiculated sign, pleural indentation sign, contrast enhancement and air bronchogram sign were associated with lung cancer (P<0.05). Nodules with a clear boundary, calcification or homogeneous density were associated with benign lesions (P<0.05). Single-factor analysis showed that age, smoking history, malignant imaging characteristics and diameter significantly affected the judgment of whether an SPN was benign or malignant (P<0.05). Multivariate analysis revealed that age, malignant imaging characteristics and diameter were independent risk factors for lung cancer in patients with SPN (P<0.01). The clinical prediction model to estimate the probability of malignancy is P = e^X/(1 + e^X), where X = -5.882 + 0.050 × age + 1.672 × imaging characteristic + 0.123 × maximum diameter, and e is the base of the natural logarithm. At a cut-off value of 0.46, the sensitivity was 82%, specificity 85%, positive predictive value 80% and negative predictive value 87%. The area under the ROC curve for the model was 0.901. Conclusion: Age, malignant imaging characteristics and diameter are independent risk factors for lung cancer in patients with SPN. The prediction model is sufficiently accurate to estimate the probability of malignancy in patients with SPN.
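The abstract's logistic model can be evaluated directly; the coefficients and cut-off come from the abstract, while the example patient values are hypothetical:

```python
import math

# Logistic prediction model from the abstract:
#   X = -5.882 + 0.050*age + 1.672*imaging + 0.123*diameter
#   P = e^X / (1 + e^X), with a malignancy cut-off of 0.46.
# The example patient (age 65, malignant imaging features, 20 mm nodule)
# is hypothetical.

def malignancy_probability(age, malignant_imaging, diameter_mm):
    x = -5.882 + 0.050 * age + 1.672 * malignant_imaging + 0.123 * diameter_mm
    return math.exp(x) / (1 + math.exp(x))

p = malignancy_probability(age=65, malignant_imaging=1, diameter_mm=20)
print(p > 0.46)  # True: classified as probably malignant at the 0.46 cut-off
```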

  6. Probable maximum flood control

    International Nuclear Information System (INIS)

This study proposes preliminary design concepts to protect the waste-handling facilities and all shaft and ramp entries to the underground from the probable maximum flood (PMF) in the current design configuration for the proposed Nevada Nuclear Waste Storage Investigation (NNWSI) repository. Flood protection provisions were furnished by the United States Bureau of Reclamation (USBR) or developed from USBR data. Proposed flood protection provisions include site grading, drainage channels, and diversion dikes. Figures are provided to show these proposed flood protection provisions at each area investigated. These areas are the central surface facilities (including the waste-handling building and waste treatment building), tuff ramp portal, waste ramp portal, men-and-materials shaft, emplacement exhaust shaft, and exploratory shafts facility

  7. Emptiness Formation Probability

    Science.gov (United States)

    Crawford, Nicholas; Ng, Stephen; Starr, Shannon

    2016-08-01

We present rigorous upper and lower bounds on the emptiness formation probability for the ground state of a spin-1/2 Heisenberg XXZ quantum spin system. For a d-dimensional system we find a rate of decay of the order exp(-cL^{d+1}), where L is the sidelength of the box in which we ask for the emptiness formation event to occur. In the d=1 case this confirms previous predictions made in the integrable systems community, though our bounds do not achieve the precision predicted by Bethe ansatz calculations. On the other hand, our bounds in the case d ≥ 2 are new. The main tools we use are reflection positivity and a rigorous path integral expansion, which is a variation on those previously introduced by Toth, Aizenman-Nachtergaele and Ueltschi.

  8. Accidents, probabilities and consequences

    International Nuclear Information System (INIS)

Following a brief discussion of the safety of wind-driven and solar power plants, some aspects of the safety of fast breeder and thermonuclear power plants are presented. It is pointed out that no safety evaluation of breeders comparable to the Rasmussen investigation has been carried out, and that discussion of the safety aspects of thermonuclear power has only just begun. Finally, as an illustration of the varying interpretations of risk and safety analyses, four examples are given of predicted probabilities and consequences in Copenhagen of the maximum credible accident at the Barsebaeck plant under the most unfavourable meteorological conditions. These are made by the Environment Commission, Risoe Research Establishment, REO (a pro-nuclear group) and OOA (an anti-nuclear group), and vary by a factor of over 1000. (JIW)

  9. Measure, integral and probability

    CERN Document Server

    Capiński, Marek

    2004-01-01

    Measure, Integral and Probability is a gentle introduction that makes measure and integration theory accessible to the average third-year undergraduate student. The ideas are developed at an easy pace in a form that is suitable for self-study, with an emphasis on clear explanations and concrete examples rather than abstract theory. For this second edition, the text has been thoroughly revised and expanded. New features include: · a substantial new chapter, featuring a constructive proof of the Radon-Nikodym theorem, an analysis of the structure of Lebesgue-Stieltjes measures, the Hahn-Jordan decomposition, and a brief introduction to martingales · key aspects of financial modelling, including the Black-Scholes formula, discussed briefly from a measure-theoretical perspective to help the reader understand the underlying mathematical framework. In addition, further exercises and examples are provided to encourage the reader to become directly involved with the material.

10. Pre-test of the Chemical Composition of Coix Leaves

    Institute of Scientific and Technical Information of China (English)

    谭冰; 黄锁义; 严焕宁; 史柳芝; 吕龙祥

    2014-01-01

A pre-test study of the chemical composition of Coix leaves from Guangxi was performed. Using chemical-reaction identification methods, the water extract, ethanol extract and petroleum ether extract of Guangxi Coix leaves were screened for chemical constituents. The pre-test suggests that Guangxi Coix leaves may contain flavonoids, phenolics, coumarins, volatile oil, phytosterols, carbohydrates, glycosides, tannins, organic acids, alkaloids and other chemical constituents. This test provides an experimental basis for further study of the biologically active constituents of the plant.

  11. Savage s Concept of Probability

    Institute of Scientific and Technical Information of China (English)

    熊卫

    2003-01-01

Starting with personal preference, Savage [3] constructs a foundational theory of probability, proceeding from qualitative probability to quantitative probability and then to utility. There are profound logical connections between the three steps in Savage's theory; that is, the quantitative concepts properly represent the qualitative concepts. Moreover, Savage's definition of subjective probability is in accordance with probability theory, and the theory gives us a rational decision model only if we assume that the weak ...

  12. RANDOM VARIABLE WITH FUZZY PROBABILITY

    Institute of Scientific and Technical Information of China (English)

    吕恩琳; 钟佑明

    2003-01-01

The mathematical description of the second kind of fuzzy random variable, namely the random variable with crisp events and fuzzy probability, was studied. Based on interval probability and using the fuzzy resolution theorem, a feasibility condition for a probability fuzzy number set was given; going a step further, the definition and characteristics of the random variable with fuzzy probability (RVFP), together with its fuzzy distribution function and fuzzy probability distribution sequence, were put forward. The fuzzy probability resolution theorem, with the closing operation of fuzzy probability, was given and proved. The definition and characteristics of the mathematical expectation and variance of the RVFP were also studied. The entire mathematical description of the RVFP has the closing operation for fuzzy probability; as a result, the foundation is laid for perfecting fuzzy probability operation methods.

  13. The Logic of Parametric Probability

    CERN Document Server

    Norman, Joseph W

    2012-01-01

    The computational method of parametric probability analysis is introduced. It is demonstrated how to embed logical formulas from the propositional calculus into parametric probability networks, thereby enabling sound reasoning about the probabilities of logical propositions. An alternative direct probability encoding scheme is presented, which allows statements of implication and quantification to be modeled directly as constraints on conditional probabilities. Several example problems are solved, from Johnson-Laird's aces to Smullyan's zombies. Many apparently challenging problems in logic turn out to be simple problems in algebra and computer science; often just systems of polynomial equations or linear optimization problems. This work extends the mathematical logic and parametric probability methods invented by George Boole.
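The flavor of embedding propositional logic in a probability model can be sketched by enumerating truth assignments with parametric atom probabilities; the independence assumption and the particular formula are illustrative choices, not the paper's method:

```python
from fractions import Fraction
from itertools import product

# Probability of the propositional formula (A -> B) as a polynomial in the
# atom probabilities pA, pB, computed by enumerating truth assignments.
# Independence of A and B is assumed for illustration.

def prob_implies(pA, pB):
    total = 0
    for a, b in product([0, 1], repeat=2):
        weight = (pA if a else 1 - pA) * (pB if b else 1 - pB)
        if (not a) or b:          # truth table of A -> B
            total += weight
    return total

# P(A -> B) = 1 - pA*(1 - pB); exact rational arithmetic via Fraction.
p = prob_implies(Fraction(1, 2), Fraction(1, 3))
print(p)  # 2/3
```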

  14. Set-up of a pre-test mock-up experiment in preparation for the HCPB Breeder Unit mock-up experimental campaign

    International Nuclear Information System (INIS)

Highlights: ► As preparation for the HCPB-TBM Breeder Unit out-of-pile testing campaign, a pre-test experiment (PREMUX) has been prepared and described. ► A new heater system based on a wire heater matrix has been developed for imitating the neutronic volumetric heating, and it is compared with the conventional plate heaters. ► The test section is described, and preliminary thermal results with the available models are presented and are to be benchmarked with PREMUX. ► The PREMUX integration in the air cooling loop L-STAR/LL at the Karlsruhe Institute of Technology is shown and future steps are discussed. -- Abstract: The complexity of the experimental set-up for testing a full-scale Breeder Unit (BU) mock-up for the European Helium Cooled Pebble Bed Test Blanket Module (HCPB-TBM) has motivated the construction of a pre-test mock-up experiment (PREMUX) consisting of a slice of the BU in the Li4SiO4 region. This pre-test aims at verifying the feasibility of the methods to be used for the subsequent testing of the full-scale BU mock-up. Key parameters needed for the modeling of the breeder material are also to be determined by the Hot Wire Method (HWM). The modeling tools for the thermo-mechanics of the pebble beds and for the mock-up structure are to be calibrated and validated as well. This paper presents the setting-up of PREMUX in the L-STAR/LL facility at the Karlsruhe Institute of Technology. A key requirement of the experiments is to mimic the neutronic volumetric heating. A new heater concept is discussed and compared to several conventional heater configurations with respect to the estimated temperature distribution in the pebble beds. The design and integration of the thermocouple system in the heater matrix and pebble beds is also described, as well as other key aspects of the mock-up (dimensions, layout, cooling system, purge gas line, boundary conditions and integration in the test facility). The adequacy of these methods for the full-scale BU mock-up is

  15. Physics with exotic probability theory

    OpenAIRE

    Youssef, Saul

    2001-01-01

    Probability theory can be modified in essentially one way while maintaining consistency with the basic Bayesian framework. This modification results in copies of standard probability theory for real, complex or quaternion probabilities. These copies, in turn, allow one to derive quantum theory while restoring standard probability theory in the classical limit. The argument leading to these three copies constrain physical theories in the same sense that Cox's original arguments constrain alter...

  16. Quantum Foundations : Is Probability Ontological ?

    OpenAIRE

    Rosinger, Elemer E

    2007-01-01

    It is argued that the Copenhagen Interpretation of Quantum Mechanics, founded ontologically on the concept of probability, may be questionable in view of the fact that within Probability Theory itself the ontological status of the concept of probability has always been, and is still under discussion.

  17. Probability workshop to be better in probability topic

    Science.gov (United States)

    Asmat, Aszila; Ujang, Suriyati; Wahid, Sharifah Norhuda Syed

    2015-02-01

The purpose of the present study was to examine whether statistics anxiety and attitudes towards the probability topic among students at the higher education level have an effect on their performance. 62 fourth-semester science students were given statistics anxiety questionnaires about their perception of the probability topic. Results indicated that students' performance in the probability topic is not related to anxiety level; that is, a higher level of statistics anxiety does not cause a lower score in probability topic performance. The study also revealed that students who were motivated by the probability workshop improved their performance in the probability topic compared with before the workshop. In addition, there was a significant difference in performance between genders, with better achievement among female students than among male students. Thus, more initiatives in learning programs with different teaching approaches are needed to provide useful information for improving student learning outcomes in higher learning institutions.

  18. Propensity, Probability, and Quantum Theory

    Science.gov (United States)

    Ballentine, Leslie E.

    2016-08-01

    Quantum mechanics and probability theory share one peculiarity. Both have well established mathematical formalisms, yet both are subject to controversy about the meaning and interpretation of their basic concepts. Since probability plays a fundamental role in QM, the conceptual problems of one theory can affect the other. We first classify the interpretations of probability into three major classes: (a) inferential probability, (b) ensemble probability, and (c) propensity. Class (a) is the basis of inductive logic; (b) deals with the frequencies of events in repeatable experiments; (c) describes a form of causality that is weaker than determinism. An important, but neglected, paper by P. Humphreys demonstrated that propensity must differ mathematically, as well as conceptually, from probability, but he did not develop a theory of propensity. Such a theory is developed in this paper. Propensity theory shares many, but not all, of the axioms of probability theory. As a consequence, propensity supports the Law of Large Numbers from probability theory, but does not support Bayes theorem. Although there are particular problems within QM to which any of the classes of probability may be applied, it is argued that the intrinsic quantum probabilities (calculated from a state vector or density matrix) are most naturally interpreted as quantum propensities. This does not alter the familiar statistical interpretation of QM. But the interpretation of quantum states as representing knowledge is untenable. Examples show that a density matrix fails to represent knowledge.

  19. Pretest probability assessment derived from attribute matching

    OpenAIRE

    Hollander Judd E; Diercks Deborah B; Pollack Charles V; Johnson Charles L; Kline Jeffrey A; Newgard Craig D; Garvey J Lee

    2005-01-01

Abstract Background Pretest probability (PTP) assessment plays a central role in diagnosis. This report describes a novel attribute-matching method to generate a PTP for acute coronary syndrome (ACS). We compare the new method with a validated logistic regression equation (LRE). Methods Eight clinical variables (attributes) were chosen by classification and regression tree analysis of a prospectively collected reference database of 14,796 emergency department (ED) patients evaluated for possib...

  20. Applied probability and stochastic processes

    CERN Document Server

    Sumita, Ushio

    1999-01-01

    Applied Probability and Stochastic Processes is an edited work written in honor of Julien Keilson. This volume has attracted a host of scholars in applied probability, who have made major contributions to the field, and have written survey and state-of-the-art papers on a variety of applied probability topics, including, but not limited to: perturbation method, time reversible Markov chains, Poisson processes, Brownian techniques, Bayesian probability, optimal quality control, Markov decision processes, random matrices, queueing theory and a variety of applications of stochastic processes. The book has a mixture of theoretical, algorithmic, and application chapters providing examples of the cutting-edge work that Professor Keilson has done or influenced over the course of his highly-productive and energetic career in applied probability and stochastic processes. The book will be of interest to academic researchers, students, and industrial practitioners who seek to use the mathematics of applied probability i...

  1. Probabilities of multiple quantum teleportation

    OpenAIRE

    Woesler, Richard

    2002-01-01

Using quantum teleportation a quantum state can be teleported with a certain probability. Here the probabilities for multiple teleportation are derived, i.e., for the case that a teleported quantum state is teleported again or even more than two times, for the two-dimensional case, e.g., for the two orthogonal directions of the polarization of photons. It is shown that the probability for an exact teleportation, except for an irrelevant phase factor, is 25%, i.e., surprisingly, this resul...

  2. Probability and statistics: selected problems

    OpenAIRE

    Machado, J.A. Tenreiro; Pinto, Carla M. A.

    2014-01-01

Probability and Statistics—Selected Problems is a unique book for senior undergraduate and graduate students to quickly review basic material in Probability and Statistics. Descriptive statistics are presented first, and probability is reviewed second. Discrete and continuous distributions are presented. Sampling and estimation with hypothesis testing are presented in the last two chapters. The solutions to the proposed exercises are listed for readers' reference.

  3. Free Probability on a Direct Product of Noncommutative Probability Spaces

    OpenAIRE

    Cho, Ilwoo

    2005-01-01

In this paper, we observe the amalgamated free probability of a direct product of noncommutative probability spaces. We define the amalgamated R-transforms, amalgamated moment series, and the amalgamated boxed convolution. These allow us to carry out the amalgamated R-transform calculus, as in the scalar-valued case.

  4. Expected utility with lower probabilities

    DEFF Research Database (Denmark)

    Hendon, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte;

    1994-01-01

    An uncertain and not just risky situation may be modeled using so-called belief functions assigning lower probabilities to subsets of outcomes. In this article we extend the von Neumann-Morgenstern expected utility theory from probability measures to belief functions. We use this theory to...

  5. Probability theory and its models

    OpenAIRE

    Humphreys, Paul

    2008-01-01

    This paper argues for the status of formal probability theory as a mathematical, rather than a scientific, theory. David Freedman and Philip Stark's concept of model based probabilities is examined and is used as a bridge between the formal theory and applications.

  6. Decision analysis with approximate probabilities

    Science.gov (United States)

    Whalen, Thomas

    1992-01-01

This paper concerns decisions under uncertainty in which the probabilities of the states of nature are only approximately known. Decision problems involving three states of nature are studied, because some key issues do not arise in two-state problems, while probability spaces with more than three states of nature are essentially impossible to graph. The primary focus is on two levels of probabilistic information. In one level, the three probabilities are separately rounded to the nearest tenth. This can lead to sets of rounded probabilities which add up to 0.9, 1.0, or 1.1. In the other level, probabilities are rounded to the nearest tenth in such a way that the rounded probabilities are forced to sum to 1.0. For comparison, six additional levels of probabilistic information, previously analyzed, were also included in the present analysis. A simulation experiment compared four criteria for decisionmaking using linearly constrained probabilities (Maximin, Midpoint, Standard Laplace, and Extended Laplace) under the eight different levels of information about probability. The Extended Laplace criterion, which uses a second order maximum entropy principle, performed best overall.
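The two levels of rounded probabilistic information described above can be sketched in code. This is a hypothetical implementation: the paper does not specify its normalization procedure, so a largest-remainder rule on a 0.1 grid is used here to force the rounded values to sum to 1.0.

```python
def round_tenths(probs):
    """Round each probability to the nearest tenth independently.

    The results may sum to 0.9, 1.0, or 1.1, as noted in the abstract.
    """
    return [round(p, 1) for p in probs]


def round_tenths_normalized(probs):
    """Round to tenths while forcing the result to sum to 1.0.

    Uses the largest-remainder method on a 0.1 grid (one standard choice;
    an assumption, not necessarily the paper's exact rule). Assumes the
    input probabilities sum to 1.
    """
    scaled = [10 * p for p in probs]
    floors = [int(s) for s in scaled]          # truncate to whole tenths
    shortfall = 10 - sum(floors)               # tenths still to distribute
    by_remainder = sorted(range(len(probs)),
                          key=lambda i: scaled[i] - floors[i],
                          reverse=True)
    for i in by_remainder[:shortfall]:         # give them to the largest remainders
        floors[i] += 1
    return [f / 10 for f in floors]


rounded = round_tenths([0.14, 0.33, 0.53])             # sums to 0.9
forced = round_tenths_normalized([0.14, 0.33, 0.53])   # sums to 1.0
```

The example input shows the effect: independent rounding loses a tenth of probability mass, while the normalized scheme redistributes it.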

  7. A graduate course in probability

    CERN Document Server

    Tucker, Howard G

    2014-01-01

Suitable for a graduate course in analytic probability, this text requires only a limited background in real analysis. Topics include probability spaces and distributions, stochastic independence, basic limiting operations, strong limit theorems for independent random variables, the central limit theorem, conditional expectation and martingale theory, and an introduction to stochastic processes.

  8. Subjective probability models for lifetimes

    CERN Document Server

    Spizzichino, Fabio

    2001-01-01

    Bayesian methods in reliability cannot be fully utilized and understood without full comprehension of the essential differences that exist between frequentist probability and subjective probability. Switching from the frequentist to the subjective approach requires that some fundamental concepts be rethought and suitably redefined. Subjective Probability Models for Lifetimes details those differences and clarifies aspects of subjective probability that have a direct influence on modeling and drawing inference from failure and survival data. In particular, within a framework of Bayesian theory, the author considers the effects of different levels of information in the analysis of the phenomena of positive and negative aging.The author coherently reviews and compares the various definitions and results concerning stochastic ordering, statistical dependence, reliability, and decision theory. He offers a detailed but accessible mathematical treatment of different aspects of probability distributions for exchangea...

  9. Invariant probabilities of transition functions

    CERN Document Server

    Zaharopol, Radu

    2014-01-01

    The structure of the set of all the invariant probabilities and the structure of various types of individual invariant probabilities of a transition function are two topics of significant interest in the theory of transition functions, and are studied in this book. The results obtained are useful in ergodic theory and the theory of dynamical systems, which, in turn, can be applied in various other areas (like number theory). They are illustrated using transition functions defined by flows, semiflows, and one-parameter convolution semigroups of probability measures. In this book, all results on transition probabilities that have been published by the author between 2004 and 2008 are extended to transition functions. The proofs of the results obtained are new. For transition functions that satisfy very general conditions the book describes an ergodic decomposition that provides relevant information on the structure of the corresponding set of invariant probabilities. Ergodic decomposition means a splitting of t...

  10. Survival probability and ruin probability of a risk model

    Institute of Scientific and Technical Information of China (English)

    LUO Jian-hua

    2008-01-01

In this paper, a new risk model is studied in which the rate of premium income is regarded as a random variable, the arrival of insurance policies is a Poisson process, and the process of claim occurrence is a p-thinning process. Integral representations of the survival probability are obtained. The explicit formula for the survival probability on the infinite interval is obtained in the special case of an exponential distribution. The Lundberg inequality and the general formula for the ruin probability are obtained by means of techniques from martingale theory.
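The Lundberg inequality mentioned above, psi(u) <= exp(-R*u), can be illustrated numerically. The sketch below uses the simpler classical Cramér-Lundberg model (fixed premium rate, exponential claims) rather than this paper's more general model with random premium income and p-thinned claims; the function name and parameter values are illustrative assumptions.

```python
import math
import random


def ruin_probability_mc(u, lam, mu, theta, n_paths=4000, max_claims=400, seed=1):
    """Monte Carlo estimate of the infinite-horizon ruin probability
    in the classical Cramer-Lundberg model: initial surplus u, Poisson
    claim rate lam, exponential claims with mean mu, safety loading theta.
    Paths are truncated after max_claims claims (by then the surplus has
    drifted far above the ruin barrier, so the truncation bias is tiny).
    """
    rng = random.Random(seed)
    c = (1 + theta) * lam * mu              # premium rate with safety loading
    ruined = 0
    for _ in range(n_paths):
        surplus = u
        for _ in range(max_claims):
            dt = rng.expovariate(lam)       # time until the next claim
            surplus += c * dt - rng.expovariate(1 / mu)  # premiums in, claim out
            if surplus < 0:                 # ruin can only occur at claim instants
                ruined += 1
                break
    return ruined / n_paths


u, lam, mu, theta = 5.0, 1.0, 1.0, 0.2
R = theta / ((1 + theta) * mu)              # adjustment (Lundberg) coefficient
exact = math.exp(-R * u) / (1 + theta)      # closed form for exponential claims
estimate = ruin_probability_mc(u, lam, mu, theta)
```

With these parameters the closed form gives psi(5) ≈ 0.362, the Monte Carlo estimate agrees to sampling error, and both sit below the Lundberg bound exp(-R*u) ≈ 0.435.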

  11. Probability

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

People much given to gambling usually manage to work out rough-and-ready ways of measuring the likelihood of certain situations so as to know which way to bet their money, and how much. If they did not do this, they would quickly lose all their money to those who did.

  12. Diagnostic accuracy of MRI in adults with suspect brachial plexus lesions: A multicentre retrospective study with surgical findings and clinical follow-up as reference standard

    Energy Technology Data Exchange (ETDEWEB)

    Tagliafico, Alberto, E-mail: alberto.tagliafico@unige.it [Institute of Anatomy, Department of Experimental Medicine, University of Genoa, Largo Rosanna Benzi 8, 16132 Genoa (Italy); Succio, Giulia; Serafini, Giovanni [Department of Radiology, Santa Corona Hospital, Pietra Ligure, Italy via XXV Aprile, 38- Pietra Ligure, 17027 Savona (Italy); Martinoli, Carlo [Radiology Department, DISC, Università di Genova, Largo Rosanna Benzi 8, 16138 Genova (Italy)

    2012-10-15

Objective: To evaluate brachial plexus MRI accuracy with surgical findings and clinical follow-up as reference standard in a large multicentre study. Materials and methods: The research was approved by the Institutional Review Boards, and all patients provided their written informed consent. A multicentre retrospective trial that included three centres was performed between March 2006 and April 2011. A total of 157 patients (men/women: 81/76; age range, 18–84 years) were evaluated: surgical findings and clinical follow-up of at least 12 months were used as the reference standard. MR imaging was performed with different equipment at 1.5 T and 3.0 T. The patient group was divided in five subgroups: mass lesion, traumatic injury, entrapment syndromes, post-treatment evaluation, and other. Sensitivity, specificity with 95% confidence intervals (CIs), positive predictive value (PPV), pre-test-probability (the prevalence), negative predictive value (NPV), pre- and post-test odds (OR), likelihood ratio for positive results (LH+), likelihood ratio for negative results (LH−), accuracy and post-test probability (post-P) were reported on a per-patient basis. Results: The overall sensitivity and specificity with 95% CIs were: 0.810/0.914; (0.697–0.904). Overall PPV, pre-test probability, NPV, LH+, LH−, and accuracy: 0.823, 0.331, 0.905, 9.432, 0.210, 0.878. Conclusions: The overall diagnostic accuracy of brachial plexus MRI calculated on a per-patient basis is relatively high. The specificity of brachial plexus MRI in patients suspected of having a space-occupying mass is very high. The sensitivity is also high, but there are false-positive interpretations as well.
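The indices quoted here are linked by Bayes' rule in odds form: post-test odds = pre-test odds × likelihood ratio. The short check below recomputes LH+, the post-test probability of disease after a positive scan (the PPV), and the NPV from the reported sensitivity (0.810), specificity (0.914) and pre-test probability (0.331); small discrepancies against the published values (e.g. LH+ 9.432) come from rounding in the reported inputs.

```python
sens, spec, pretest = 0.810, 0.914, 0.331   # values quoted in the record

lr_pos = sens / (1 - spec)                  # likelihood ratio, positive result (~9.42)
lr_neg = (1 - sens) / spec                  # likelihood ratio, negative result (~0.21)

pre_odds = pretest / (1 - pretest)          # convert probability to odds

# Post-test probability after a positive result equals the PPV (~0.823)
post_p_pos = (pre_odds * lr_pos) / (1 + pre_odds * lr_pos)

# Post-test probability of disease after a negative result; its complement
# is the NPV (~0.905)
post_p_neg = (pre_odds * lr_neg) / (1 + pre_odds * lr_neg)
npv = 1 - post_p_neg
```

This is why the record reports both pre-test probability and likelihood ratios: together they determine every post-test quantity in the table.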

  13. Diagnostic accuracy of MRI in adults with suspect brachial plexus lesions: A multicentre retrospective study with surgical findings and clinical follow-up as reference standard

    International Nuclear Information System (INIS)

Objective: To evaluate brachial plexus MRI accuracy with surgical findings and clinical follow-up as reference standard in a large multicentre study. Materials and methods: The research was approved by the Institutional Review Boards, and all patients provided their written informed consent. A multicentre retrospective trial that included three centres was performed between March 2006 and April 2011. A total of 157 patients (men/women: 81/76; age range, 18–84 years) were evaluated: surgical findings and clinical follow-up of at least 12 months were used as the reference standard. MR imaging was performed with different equipment at 1.5 T and 3.0 T. The patient group was divided in five subgroups: mass lesion, traumatic injury, entrapment syndromes, post-treatment evaluation, and other. Sensitivity, specificity with 95% confidence intervals (CIs), positive predictive value (PPV), pre-test-probability (the prevalence), negative predictive value (NPV), pre- and post-test odds (OR), likelihood ratio for positive results (LH+), likelihood ratio for negative results (LH−), accuracy and post-test probability (post-P) were reported on a per-patient basis. Results: The overall sensitivity and specificity with 95% CIs were: 0.810/0.914; (0.697–0.904). Overall PPV, pre-test probability, NPV, LH+, LH−, and accuracy: 0.823, 0.331, 0.905, 9.432, 0.210, 0.878. Conclusions: The overall diagnostic accuracy of brachial plexus MRI calculated on a per-patient basis is relatively high. The specificity of brachial plexus MRI in patients suspected of having a space-occupying mass is very high. The sensitivity is also high, but there are false-positive interpretations as well

  14. Probable Inference and Quantum Mechanics

    International Nuclear Information System (INIS)

    In its current very successful interpretation the quantum theory is fundamentally statistical in nature. Although commonly viewed as a probability amplitude whose (complex) square is a probability, the wavefunction or state vector continues to defy consensus as to its exact meaning, primarily because it is not a physical observable. Rather than approach this problem directly, it is suggested that it is first necessary to clarify the precise role of probability theory in quantum mechanics, either as applied to, or as an intrinsic part of the quantum theory. When all is said and done the unsurprising conclusion is that quantum mechanics does not constitute a logic and probability unto itself, but adheres to the long-established rules of classical probability theory while providing a means within itself for calculating the relevant probabilities. In addition, the wavefunction is seen to be a description of the quantum state assigned by an observer based on definite information, such that the same state must be assigned by any other observer based on the same information, in much the same way that probabilities are assigned.

  15. Probability with applications and R

    CERN Document Server

    Dobrow, Robert P

    2013-01-01

    An introduction to probability at the undergraduate level Chance and randomness are encountered on a daily basis. Authored by a highly qualified professor in the field, Probability: With Applications and R delves into the theories and applications essential to obtaining a thorough understanding of probability. With real-life examples and thoughtful exercises from fields as diverse as biology, computer science, cryptology, ecology, public health, and sports, the book is accessible for a variety of readers. The book's emphasis on simulation through the use of the popular R software language c

  16. A philosophical essay on probabilities

    CERN Document Server

    Laplace, Marquis de

    1996-01-01

A classic of science, this famous essay by "the Newton of France" introduces lay readers to the concepts and uses of probability theory. It is of especial interest today as an application of mathematical techniques to problems in social and biological sciences. Generally recognized as the founder of the modern phase of probability theory, Laplace here applies the principles and general results of his theory "to the most important questions of life, which are, in effect, for the most part, problems in probability." Thus, without the use of higher mathematics, he demonstrates the application

  17. Transition probabilities of Br II

    Science.gov (United States)

    Bengtson, R. D.; Miller, M. H.

    1976-01-01

    Absolute transition probabilities of the three most prominent visible Br II lines are measured in emission. Results compare well with Coulomb approximations and with line strengths extrapolated from trends in homologous atoms.

  18. Induction, of and by Probability

    OpenAIRE

    Rendell, Larry

    2013-01-01

This paper examines some methods and ideas underlying the author's successful probabilistic learning systems (PLS), which have proven uniquely effective and efficient in generalization learning or induction. While the emerging principles are generally applicable, this paper illustrates them in heuristic search, which demands noise management and incremental learning. In our approach, both task performance and learning are guided by probability. Probabilities are incrementally normalized and re...

  19. Trajectory probability hypothesis density filter

    OpenAIRE

    García-Fernández, Ángel F.; Svensson, Lennart

    2016-01-01

This paper presents the probability hypothesis density (PHD) filter for sets of trajectories. The resulting filter, referred to as the trajectory PHD (TPHD) filter, is capable of estimating trajectories in a principled way without requiring the evaluation of all measurement-to-target association hypotheses. As the PHD filter, the TPHD filter is based on recursively obtaining the best Poisson approximation to the multitrajectory filtering density in the sense of minimising the K...

  20. Hf Transition Probabilities and Abundances

    OpenAIRE

    Lawler, J. E.; Hartog, E.A. den; Labby, Z. E.; Sneden, C.; Cowan, J. J.; Ivans, I. I.

    2006-01-01

    Radiative lifetimes from laser-induced fluorescence measurements, accurate to about +/- 5 percent, are reported for 41 odd-parity levels of Hf II. The lifetimes are combined with branching fractions measured using Fourier transform spectrometry to determine transition probabilities for 150 lines of Hf II. Approximately half of these new transition probabilities overlap with recent independent measurements using a similar approach. The two sets of measurements are found to be in good agreement...

  1. Gd Transition Probabilities and Abundances

    OpenAIRE

    Hartog, E.A. den; Lawler, J. E.; Sneden, C.; Cowan, J. J.

    2006-01-01

    Radiative lifetimes, accurate to +/- 5%, have been measured for 49 even-parity and 14 odd-parity levels of Gd II using laser-induced fluorescence. The lifetimes are combined with branching fractions measured using Fourier transform spectrometry to determine transition probabilities for 611 lines of Gd II. This work is the largest-scale laboratory study to date of Gd II transition probabilities and the first using a high performance Fourier transform spectrometer. This improved data set has be...

  2. Sm Transition Probabilities and Abundances

    OpenAIRE

    Lawler, J. E.; Hartog, E.A. den; Sneden, C.; Cowan, J. J.

    2005-01-01

    Radiative lifetimes, accurate to +/- 5%, have been measured for 212 odd-parity levels of Sm II using laser-induced fluorescence. The lifetimes are combined with branching fractions measured using Fourier-transform spectrometry to determine transition probabilities for more than 900 lines of Sm II. This work is the largest-scale laboratory study to date of Sm II transition probabilities using modern methods. This improved data set has been used to determine a new solar photospheric Sm abundanc...

  3. Gaussian Probabilities and Expectation Propagation

    OpenAIRE

    Cunningham, John P.; Hennig, Philipp; Lacoste-Julien, Simon

    2011-01-01

    While Gaussian probability densities are omnipresent in applied mathematics, Gaussian cumulative probabilities are hard to calculate in any but the univariate case. We study the utility of Expectation Propagation (EP) as an approximate integration method for this problem. For rectangular integration regions, the approximation is highly accurate. We also extend the derivations to the more general case of polyhedral integration regions. However, we find that in this polyhedral case, EP's answer...
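As context for the problem Expectation Propagation addresses: univariate Gaussian rectangle probabilities have a closed form via the error function, while multivariate ones generally do not. The baseline sketch below (plain Monte Carlo, not EP itself) estimates a bivariate rectangle probability and checks the independent case against the closed form; all names and parameter values are illustrative.

```python
import math
import random


def norm_cdf(x):
    """Standard normal CDF via the error function (the easy univariate case)."""
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))


def mc_rect_prob(rho, a, b, n=200_000, seed=0):
    """P(a[0] < X < b[0], a[1] < Y < b[1]) for a standard bivariate normal
    with correlation rho, estimated by plain Monte Carlo."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        z1, z2 = rng.gauss(0, 1), rng.gauss(0, 1)
        x = z1
        y = rho * z1 + math.sqrt(1 - rho**2) * z2   # impose the correlation
        if a[0] < x < b[0] and a[1] < y < b[1]:
            hits += 1
    return hits / n


# Univariate closed form: P(-1 < X < 1) ~ 0.6827
p1 = norm_cdf(1) - norm_cdf(-1)

# With rho = 0 the bivariate rectangle probability factorizes into p1 * p1
p2 = mc_rect_prob(0.0, (-1, -1), (1, 1))
```

Monte Carlo converges slowly (error ~ n^(-1/2)), which is precisely why deterministic approximations such as EP are attractive for these integrals.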

  4. Default probabilities and default correlations

    OpenAIRE

    Erlenmaier, Ulrich; Gersbach, Hans

    2001-01-01

Starting from the Merton framework for firm defaults, we provide the analytics and robustness of the relationship between default probabilities and default correlations. We show that loans with higher default probabilities will not only have higher variances but also higher correlations with other loans. As a consequence, portfolio standard deviation can increase substantially when loan default probabilities rise. This result has two important implications. First, relative prices of loans with different default probabili...

  5. Compliance with endogenous audit probabilities

    OpenAIRE

    Konrad, Kai A.; Lohse, Tim; Qari, Salmai

    2015-01-01

    This paper studies the effect of endogenous audit probabilities on reporting behavior in a face-to-face compliance situation such as at customs. In an experimental setting in which underreporting has a higher expected payoff than truthful reporting we find an increase in compliance of about 80% if subjects have reason to believe that their behavior towards an officer influences their endogenous audit probability. Higher compliance is driven by considerations about how own appearance and perfo...

  6. Novel Bounds on Marginal Probabilities

    OpenAIRE

    Mooij, Joris M.; Kappen, Hilbert J

    2008-01-01

    We derive two related novel bounds on single-variable marginal probability distributions in factor graphs with discrete variables. The first method propagates bounds over a subtree of the factor graph rooted in the variable, and the second method propagates bounds over the self-avoiding walk tree starting at the variable. By construction, both methods not only bound the exact marginal probability distribution of a variable, but also its approximate Belief Propagation marginal (``belief''). Th...

  7. Field-Based Video Pre-Test Counseling, Oral Testing, and Telephonic Post-Test Counseling: Implementation of an HIV Field Testing Package among High-Risk Indian Men

    Science.gov (United States)

    Snyder, Hannah; Yeldandi, Vijay V.; Kumar, G. Prem; Liao, Chuanhong; Lakshmi, Vemu; Gandham, Sabitha R.; Muppudi, Uma; Oruganti, Ganesh; Schneider, John A.

    2012-01-01

    In India, men who have sex with men (MSM) and truck drivers are high-risk groups that often do not access HIV testing due to stigma and high mobility. This study evaluated a field testing package (FTP) that identified HIV positive participants through video pre-test counseling, OraQuick oral fluid HIV testing, and telephonic post-test counseling…

  8. Probably not future prediction using probability and statistical inference

    CERN Document Server

    Dworsky, Lawrence N

    2008-01-01

    An engaging, entertaining, and informative introduction to probability and prediction in our everyday lives Although Probably Not deals with probability and statistics, it is not heavily mathematical and is not filled with complex derivations, proofs, and theoretical problem sets. This book unveils the world of statistics through questions such as what is known based upon the information at hand and what can be expected to happen. While learning essential concepts including "the confidence factor" and "random walks," readers will be entertained and intrigued as they move from chapter to chapter. Moreover, the author provides a foundation of basic principles to guide decision making in almost all facets of life including playing games, developing winning business strategies, and managing personal finances. Much of the book is organized around easy-to-follow examples that address common, everyday issues such as: How travel time is affected by congestion, driving speed, and traffic lights Why different gambling ...

  9. Joint probability distributions for projection probabilities of random orthonormal states

    International Nuclear Information System (INIS)

    The quantum chaos conjecture applied to a finite dimensional quantum system implies that such a system has eigenstates that show similar statistical properties as the column vectors of random orthogonal or unitary matrices. Here, we consider the different probabilities for obtaining a specific outcome in a projective measurement, provided the system is in one of its eigenstates. We then give analytic expressions for the joint probability density for these probabilities, with respect to the ensemble of random matrices. In the case of the unitary group, our results can be applied, also, to the phenomenon of universal conductance fluctuations, where the same mathematical quantities describe partial conductances in a two-terminal mesoscopic scattering problem with a finite number of modes in each terminal. (paper)

  10. Joint probability distributions for projection probabilities of random orthonormal states

    Science.gov (United States)

    Alonso, L.; Gorin, T.

    2016-04-01

    The quantum chaos conjecture applied to a finite dimensional quantum system implies that such a system has eigenstates that show similar statistical properties as the column vectors of random orthogonal or unitary matrices. Here, we consider the different probabilities for obtaining a specific outcome in a projective measurement, provided the system is in one of its eigenstates. We then give analytic expressions for the joint probability density for these probabilities, with respect to the ensemble of random matrices. In the case of the unitary group, our results can be applied, also, to the phenomenon of universal conductance fluctuations, where the same mathematical quantities describe partial conductances in a two-terminal mesoscopic scattering problem with a finite number of modes in each terminal.
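A minimal numerical sketch of the quantity studied above, for the real (orthogonal) case: a column of a Haar-random orthogonal matrix is distributed like a normalized vector of i.i.d. standard Gaussians, and its squared components are the projection probabilities, which sum to one by construction. The function name is illustrative.

```python
import math
import random


def random_projection_probs(n, seed=0):
    """Sample the projection probabilities |<i|psi>|^2 of a random
    orthogonal-invariant unit vector in dimension n, by normalizing a
    vector of i.i.d. standard Gaussians."""
    rng = random.Random(seed)
    v = [rng.gauss(0, 1) for _ in range(n)]
    norm = math.sqrt(sum(x * x for x in v))
    return [(x / norm) ** 2 for x in v]


probs = random_projection_probs(8)
# Each entry lies in [0, 1], the entries sum to 1 up to roundoff, and over
# the ensemble each has mean 1/n.
```

The joint density of these probabilities over the ensemble is exactly the object the paper derives analytic expressions for.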

  11. Probability on real Lie algebras

    CERN Document Server

    Franz, Uwe

    2016-01-01

    This monograph is a progressive introduction to non-commutativity in probability theory, summarizing and synthesizing recent results about classical and quantum stochastic processes on Lie algebras. In the early chapters, focus is placed on concrete examples of the links between algebraic relations and the moments of probability distributions. The subsequent chapters are more advanced and deal with Wigner densities for non-commutative couples of random variables, non-commutative stochastic processes with independent increments (quantum Lévy processes), and the quantum Malliavin calculus. This book will appeal to advanced undergraduate and graduate students interested in the relations between algebra, probability, and quantum theory. It also addresses a more advanced audience by covering other topics related to non-commutativity in stochastic calculus, Lévy processes, and the Malliavin calculus.

  12. Approximation methods in probability theory

    CERN Document Server

    Čekanavičius, Vydas

    2016-01-01

    This book presents a wide range of well-known and less common methods used for estimating the accuracy of probabilistic approximations, including the Esseen type inversion formulas, the Stein method as well as the methods of convolutions and triangle function. Emphasising the correct usage of the methods presented, each step required for the proofs is examined in detail. As a result, this textbook provides valuable tools for proving approximation theorems. While Approximation Methods in Probability Theory will appeal to everyone interested in limit theorems of probability theory, the book is particularly aimed at graduate students who have completed a standard intermediate course in probability theory. Furthermore, experienced researchers wanting to enlarge their toolkit will also find this book useful.

  13. VIBRATION ISOLATION SYSTEM PROBABILITY ANALYSIS

    Directory of Open Access Journals (Sweden)

    Smirnov Vladimir Alexandrovich

    2012-10-01

    Full Text Available The article deals with the probability analysis of a vibration isolation system for high-precision equipment, which is extremely sensitive to low-frequency oscillations even of submicron amplitude. External sources of low-frequency vibration include the natural city background and internal low-frequency sources inside buildings (pedestrian activity, HVAC). Assuming a Gaussian distribution, the author estimates the probability that the relative displacement of the isolated mass remains below the vibration criterion. The problem is solved in the three-dimensional space spanned by the system parameters, including damping and natural frequency. From this probability distribution, the chance that a vibration isolation system exceeds the vibration criterion is evaluated. Optimal system parameters, damping and natural frequency, are derived so that the probability of exceeding the vibration criteria VC-E and VC-D is less than 0.04.
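    A minimal sketch of the Gaussian exceedance calculation described above (illustrative only; the criterion and RMS values are assumptions, not the paper's data):

```python
import math

def exceedance_probability(criterion, sigma_rms):
    """P(|x| > criterion) for a zero-mean Gaussian displacement with RMS
    value sigma_rms: equals 1 - erf(z / sqrt(2)) with z = criterion / sigma_rms."""
    z = criterion / sigma_rms
    return 1.0 - math.erf(z / math.sqrt(2.0))
```

A criterion of roughly 2.05 times the RMS displacement gives an exceedance probability near the 0.04 target quoted in the abstract.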

  14. Born Rule and Noncontextual Probability

    CERN Document Server

    Logiurato, Fabrizio

    2012-01-01

    The probabilistic rule that links the formalism of Quantum Mechanics (QM) to the real world was stated by Born in 1926. Since then, there have been many attempts to derive the Born postulate as a theorem, Gleason's being the most prominent. The Gleason derivation, however, is generally considered rather intricate, and its physical meaning, in particular its relation to the noncontextuality of probability (NP), is not entirely evident. More recently, we are witnessing a revival of interest in possible demonstrations of the Born rule, like Zurek's and Deutsch's, based on decoherence and on the theory of decisions, respectively. Despite an ongoing debate about the presence of hidden assumptions and circular reasoning, these have the merit of prompting more physically oriented approaches to the problem. Here we suggest a new proof of the Born rule based on the noncontextuality of probability. Within the theorem we also demonstrate the continuity of probability with respect to the amplitudes, which has been sug...

  15. Probability, Statistics, and Stochastic Processes

    CERN Document Server

    Olofsson, Peter

    2011-01-01

    A mathematical and intuitive approach to probability, statistics, and stochastic processes This textbook provides a unique, balanced approach to probability, statistics, and stochastic processes. Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area. This text combines a rigorous, calculus-based development of theory with a more intuitive approach that appeals to readers' sense of reason and logic, an approach developed through the author's many years of classroom experience. The text begins with three chapters that d

  16. Probability and Statistics: 5 Questions

    DEFF Research Database (Denmark)

    Probability and Statistics: 5 Questions is a collection of short interviews based on 5 questions presented to some of the most influential and prominent scholars in probability and statistics. We hear their views on the fields, aims, scopes, the future direction of research and how their work fit...... in these respects. Interviews with Nick Bingham, Luc Bovens, Terrence L. Fine, Haim Gaifman, Donald Gillies, James Hawthorne, Carl Hoefer, James M. Joyce, Joseph B. Kadane Isaac Levi, D.H. Mellor, Patrick Suppes, Jan von Plato, Carl Wagner, Sandy Zabell...

  17. Knowledge typology for imprecise probabilities.

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, G. D. (Gregory D.); Zucker, L. J. (Lauren J.)

    2002-01-01

    When characterizing the reliability of a complex system there are often gaps in the data available for specific subsystems or other factors influencing total system reliability. At Los Alamos National Laboratory we employ ethnographic methods to elicit expert knowledge when traditional data is scarce. Typically, we elicit expert knowledge in probabilistic terms. This paper will explore how we might approach elicitation if methods other than probability (i.e., Dempster-Shafer, or fuzzy sets) prove more useful for quantifying certain types of expert knowledge. Specifically, we will consider if experts have different types of knowledge that may be better characterized in ways other than standard probability theory.

  18. Probability, statistics, and queueing theory

    CERN Document Server

    Allen, Arnold O

    1990-01-01

    This is a textbook on applied probability and statistics with computer science applications for students at the upper undergraduate level. It may also be used as a self study book for the practicing computer science professional. The successful first edition of this book proved extremely useful to students who need to use probability, statistics and queueing theory to solve problems in other fields, such as engineering, physics, operations research, and management science. The book has also been successfully used for courses in queueing theory for operations research students. This second edit

  19. Fusion Probability in Dinuclear System

    CERN Document Server

    Hong, Juhee

    2015-01-01

    Fusion can be described by the time evolution of a dinuclear system with two degrees of freedom, the relative motion and the transfer of nucleons. In the presence of the coupling between the two collective modes, we solve the Fokker-Planck equation in a locally harmonic approximation. The potential of a dinuclear system has a quasifission barrier and an inner fusion barrier, and the escape rates can be calculated with Kramers' model. To estimate the fusion probability, we calculate the quasifission rate and the fusion rate. We investigate the coupling effects on the fusion probability and the cross section of evaporation residue.

  20. Interference of probabilities in dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Zak, Michail, E-mail: michail.zak@gmail.com [Jet Propulsion Laboratory California Institute of Technology, Pasadena, CA 91109 (United States)

    2014-08-15

    A new class of dynamical systems with a preset type of interference of probabilities is introduced. It is obtained from the extension of the Madelung equation by replacing the quantum potential with a specially selected feedback from the Liouville equation. It has been proved that these systems are different from both Newtonian and quantum systems, but they can be useful for modeling spontaneous collective novelty phenomena when emerging outputs are qualitatively different from the weighted sum of individual inputs. Formation of language and fast decision-making process as potential applications of the probability interference is discussed.

  1. Pre-Test CFD for the Design and Execution of the Enhanced Injection and Mixing Project at NASA Langley Research Center

    Science.gov (United States)

    Drozda, Tomasz G.; Axdahl, Erik L.; Cabell, Karen F.

    2014-01-01

    With the increasing cost of physics experiments and the simultaneous increase in the availability and maturity of computational tools, it is not surprising that computational fluid dynamics (CFD) is playing an increasingly important role, not only in post-test investigations, but also in the early stages of experimental planning. This paper describes a CFD-based effort executed in close collaboration between computational fluid dynamicists and experimentalists to develop a virtual experiment during the early planning stages of the Enhanced Injection and Mixing project at NASA Langley Research Center. This project aims to investigate supersonic combustion ramjet (scramjet) fuel injection and mixing physics, improve the understanding of underlying physical processes, and develop enhancement strategies and functional relationships relevant to flight Mach numbers greater than 8. The purpose of the virtual experiment was to provide flow field data to aid in the design of the experimental apparatus and the in-stream rake probes, to verify the nonintrusive measurements based on NO-PLIF, and to perform pre-test analysis of quantities obtainable from the experiment and CFD. The approach also allowed the joint team to develop common data processing and analysis tools, and to test research ideas. The virtual experiment consisted of a series of Reynolds-averaged simulations (RAS). These simulations included the facility nozzle, the experimental apparatus with a baseline strut injector, and the test cabin. Pure helium and helium-air mixtures were used to determine the efficacy of different inert gases to model hydrogen injection. The results of the simulations were analyzed by computing mixing efficiency, total pressure recovery, and stream thrust potential. As the experimental effort progresses, the simulation results will be compared with the experimental data to calibrate the modeling constants present in the CFD and validate simulation fidelity. CFD will also be used to

  2. Pollock on probability in epistemology

    OpenAIRE

    Fitelson, Branden

    2010-01-01

    In Thinking and Acting John Pollock offers some criticisms of Bayesian epistemology, and he defends an alternative understanding of the role of probability in epistemology. Here, I defend the Bayesian against some of Pollock's criticisms, and I discuss a potential problem for Pollock's alternative account.

  3. ESTIMATION OF AGE TRANSITION PROBABILITIES.

    Science.gov (United States)

    ZINTER, JUDITH R.

    THIS NOTE DESCRIBES THE PROCEDURES USED IN DETERMINING DYNAMOD II AGE TRANSITION MATRICES. A SEPARATE MATRIX FOR EACH SEX-RACE GROUP IS DEVELOPED. THESE MATRICES WILL BE USED AS AN AID IN ESTIMATING THE TRANSITION PROBABILITIES IN THE LARGER DYNAMOD II MATRIX RELATING AGE TO OCCUPATIONAL CATEGORIES. THREE STEPS WERE USED IN THE PROCEDURE--(1)…

  4. Transition probability and preferential gauge

    OpenAIRE

    Chen, C.Y.

    1999-01-01

    This paper is concerned with whether or not the preferential gauge can ensure the uniqueness and correctness of results obtained from the standard time-dependent perturbation theory, in which the transition probability is formulated in terms of matrix elements of Hamiltonian.

  5. Quantum correlations; quantum probability approach

    OpenAIRE

    Majewski, W A

    2014-01-01

    This survey gives a comprehensive account of quantum correlations understood as a phenomenon stemming from the rules of quantization. Centered on quantum probability it describes the physical concepts related to correlations (both classical and quantum), mathematical structures, and their consequences. These include the canonical form of classical correlation functionals, general definitions of separable (entangled) states, definition and analysis of quantumness of correlations, description o...

  6. Diverse Consequences of Algorithmic Probability

    OpenAIRE

    Özkural, Eray

    2011-01-01

    We reminisce and discuss applications of algorithmic probability to a wide range of problems in artificial intelligence, philosophy and technological society. We propose that Solomonoff has effectively axiomatized the field of artificial intelligence, therefore establishing it as a rigorous scientific discipline. We also relate to our own work in incremental machine learning and philosophy of complexity.

  7. Exact Probability Distribution versus Entropy

    Directory of Open Access Journals (Sweden)

    Kerstin Andersson

    2014-10-01

    Full Text Available The problem addressed concerns the determination of the average number of successive attempts at guessing a word of a certain length consisting of letters with given probabilities of occurrence. Both first- and second-order approximations to a natural language are considered. The guessing strategy used is guessing words in decreasing order of probability. When word and alphabet sizes are large, approximations are necessary in order to estimate the number of guesses. Several kinds of approximations are discussed, demonstrating moderate requirements regarding both memory and central processing unit (CPU) time. When considering realistic sizes of alphabets and words (100), the number of guesses can be estimated within minutes with reasonable accuracy (a few percent) and may therefore constitute an alternative to, e.g., various entropy expressions. For many probability distributions, the density of the logarithm of probability products is close to a normal distribution. For those cases, it is possible to derive an analytical expression for the average number of guesses. The proportion of guesses needed on average compared to the total number decreases almost exponentially with the word length. The leading term in an asymptotic expansion can be used to estimate the number of guesses for large word lengths. Comparisons with analytical lower bounds and entropy expressions are also provided.
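    The strategy described, guessing words in decreasing order of probability, can be sketched exactly for a toy independent-letter model (the alphabet and letter probabilities below are illustrative assumptions, not the paper's corpus):

```python
import itertools
import math

def expected_guesses(letter_probs, word_len):
    """Expected number of guesses when all words of length word_len are
    guessed in decreasing order of probability, letters being independent
    with the given probabilities (a first-order language model)."""
    word_probs = sorted(
        (math.prod(combo) for combo in
         itertools.product(letter_probs, repeat=word_len)),
        reverse=True)
    return sum(rank * p for rank, p in enumerate(word_probs, start=1))
```

For uniform letter probabilities this reduces to (N + 1)/2 guesses on average over the N possible words; skewed distributions need fewer, which is the effect the abstract quantifies.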

  8. Stretching Probability Explorations with Geoboards

    Science.gov (United States)

    Wheeler, Ann; Champion, Joe

    2016-01-01

    Students are faced with many transitions in their middle school mathematics classes. To build knowledge, skills, and confidence in the key areas of algebra and geometry, students often need to practice using numbers and polygons in a variety of contexts. Teachers also want students to explore ideas from probability and statistics. Teachers know…

  9. GPS: Geometry, Probability, and Statistics

    Science.gov (United States)

    Field, Mike

    2012-01-01

    It might be said that for most occupations there is now less of a need for mathematics than there was say fifty years ago. But, the author argues, geometry, probability, and statistics constitute essential knowledge for everyone. Maybe not the geometry of Euclid, but certainly geometrical ways of thinking that might enable us to describe the world…

  10. Fuzzy Markov chains: uncertain probabilities

    OpenAIRE

    James J. Buckley; Eslami, Esfandiar

    2002-01-01

    We consider finite Markov chains where there are uncertainties in some of the transition probabilities. These uncertainties are modeled by fuzzy numbers. Using a restricted fuzzy matrix multiplication, we investigate the properties of regular and absorbing fuzzy Markov chains and show that the basic properties of these classical Markov chains generalize to fuzzy Markov chains.
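    The paper's restricted fuzzy matrix multiplication is not reproduced here; as a hedged illustration of the kind of operation involved, the standard max-min composition of fuzzy relation matrices looks like this:

```python
def max_min_product(A, B):
    """Max-min composition of two fuzzy relation matrices with entries
    in [0, 1]: (A o B)[i][j] = max over k of min(A[i][k], B[k][j])."""
    n, m, p = len(A), len(B), len(B[0])
    return [[max(min(A[i][k], B[k][j]) for k in range(m))
             for j in range(p)] for i in range(n)]
```

Iterating such a composition plays the role that powers of the transition matrix play for classical Markov chains.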

  11. Probability representations of fuzzy systems

    Institute of Scientific and Technical Information of China (English)

    LI Hongxing

    2006-01-01

    In this paper, the probability significance of fuzzy systems is revealed. It is pointed out that COG method, a defuzzification technique used commonly in fuzzy systems, is reasonable and is the optimal method in the sense of mean square. Based on different fuzzy implication operators, several typical probability distributions such as Zadeh distribution, Mamdani distribution, Lukasiewicz distribution, etc. are given. Those distributions act as "inner kernels" of fuzzy systems. Furthermore, by some properties of probability distributions of fuzzy systems, it is also demonstrated that CRI method, proposed by Zadeh, for constructing fuzzy systems is basically reasonable and effective. Besides, the special action of uniform probability distributions in fuzzy systems is characterized. Finally, the relationship between CRI method and triple I method is discussed. In the sense of construction of fuzzy systems, when restricting three fuzzy implication operators in triple I method to the same operator, CRI method and triple I method may be related in the following three basic ways: 1) Two methods are equivalent; 2) the latter is a degeneration of the former; 3) the latter is trivial whereas the former is not. When three fuzzy implication operators in triple I method are not restricted to the same operator, CRI method is a special case of triple I method; that is, triple I method is a more comprehensive algorithm. Since triple I method has a good logical foundation and comprises an idea of optimization of reasoning, triple I method will possess a beautiful vista of application.

  12. A Novel Approach to Probability

    CERN Document Server

    Kafri, Oded

    2016-01-01

    When P indistinguishable balls are randomly distributed among L distinguishable boxes, and considering the dense system in which P is much greater than L, our natural intuition tells us that the box with the average number of balls has the highest probability and that none of the boxes are empty; in reality, however, the probability of the empty box is always the highest. This stands in contradistinction to the sparse system, in which the number of balls is smaller than the number of boxes (i.e. energy distribution in gas) and the average value has the highest probability. Here we show that when we postulate the requirement that all possible configurations of balls in the boxes have equal probabilities, a realistic "long tail" distribution is obtained. This formalism, when applied to sparse systems, converges to distributions in which the average is preferred. We calculate some of the distributions resulting from this postulate and obtain most of the known distributions in nature, namely, Zipf law, Benford law, part...
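    The central claim can be checked directly by stars-and-bars counting (a sketch; the values P = 20, L = 4 are arbitrary choices): when every configuration is equally probable, the number of configurations in which a given box holds exactly n balls is C(P - n + L - 2, L - 2), which is maximal at n = 0.

```python
from math import comb

def occupancy_counts(P, L):
    """For P indistinguishable balls in L distinguishable boxes with all
    configurations equally probable, count the configurations in which one
    fixed box holds exactly n balls: stars and bars over the other L - 1
    boxes gives C(P - n + L - 2, L - 2)."""
    return {n: comb(P - n + L - 2, L - 2) for n in range(P + 1)}
```

The count decreases monotonically in n, so the empty box is the most probable occupancy, exactly as the abstract states.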

  13. Probability as a Physical Motive

    Directory of Open Access Journals (Sweden)

    Peter Martin

    2007-04-01

    Full Text Available Recent theoretical progress in nonequilibrium thermodynamics, linking the physical principle of Maximum Entropy Production (“MEP”) to the information-theoretical “MaxEnt” principle of scientific inference, together with conjectures from theoretical physics that there may be no fundamental causal laws but only probabilities for physical processes, and from evolutionary theory that biological systems expand “the adjacent possible” as rapidly as possible, all lend credence to the proposition that probability should be recognized as a fundamental physical motive. It is further proposed that spatial order and temporal order are two aspects of the same thing, and that this is the essence of the second law of thermodynamics.

  14. Probability densities in strong turbulence

    Science.gov (United States)

    Yakhot, Victor

    2006-03-01

    In this work, using Mellin’s transform combined with the Gaussian large-scale boundary condition, we calculate the probability densities (PDFs) of velocity increments P(δu,r), of velocity derivatives P(u,r), and the PDF of the fluctuating dissipation scales Q(η,Re), where Re is the large-scale Reynolds number. The resulting expressions strongly deviate from the log-normal PDF often quoted in the literature. It is shown that the probability density of the small-scale velocity fluctuations includes information about the large (integral) scale dynamics, which is responsible for the deviation of P(δu,r) from Gaussian form. An expression for the function D(h) of the multifractal theory, free from the spurious logarithms recently discussed in [U. Frisch, M. Martins Afonso, A. Mazzino, V. Yakhot, J. Fluid Mech. 542 (2005) 97], is also obtained.

  15. Probability, Information and Statistical Physics

    Science.gov (United States)

    Kuzemsky, A. L.

    2016-03-01

    In this short survey review we discuss foundational issues of the probabilistic approach to information theory and statistical mechanics from a unified standpoint. Emphasis is on the inter-relations between theories. The aim is tutorial, i.e. to give a basic introduction to the analysis and applications of probabilistic concepts in the description of various aspects of complexity and stochasticity. We consider probability as a foundational concept in statistical mechanics and review selected advances in the theoretical understanding of the interrelation of probability, information and statistical description with regard to basic notions of the statistical mechanics of complex systems. It also includes a synthesis of past and present research and a survey of methodology. The purpose of this terse overview is to discuss and partially describe those probabilistic methods and approaches that are used in statistical mechanics, with the aim of making these ideas easier to understand and to apply.

  16. Sm Transition Probabilities and Abundances

    CERN Document Server

    Lawler, J E; Sneden, C; Cowan, J J

    2005-01-01

    Radiative lifetimes, accurate to +/- 5%, have been measured for 212 odd-parity levels of Sm II using laser-induced fluorescence. The lifetimes are combined with branching fractions measured using Fourier-transform spectrometry to determine transition probabilities for more than 900 lines of Sm II. This work is the largest-scale laboratory study to date of Sm II transition probabilities using modern methods. This improved data set has been used to determine a new solar photospheric Sm abundance, log epsilon = 1.00 +/- 0.03, from 26 lines. The spectra of three very metal-poor, neutron-capture-rich stars also have been analyzed, employing between 55 and 72 Sm II lines per star. The abundance ratios of Sm relative to other rare earth elements in these stars are in agreement, and are consistent with ratios expected from rapid neutron-capture nucleosynthesis (the r-process).

  17. Probability biases as Bayesian inference

    Directory of Open Access Journals (Sweden)

    Andre; C. R. Martins

    2006-11-01

    Full Text Available In this article, I will show how several observed biases in human probabilistic reasoning can be partially explained as good heuristics for making inferences in an environment where probabilities have uncertainties associated with them. Previous results show that the weight functions and the observed violations of coalescing and stochastic dominance can be understood from a Bayesian point of view. We will review those results and see that Bayesian methods should also be used as part of the explanation behind other known biases. That means that, although the observed errors are still errors, they can be understood as adaptations to the solution of real-life problems. Heuristics that allow fast evaluations and mimic a Bayesian inference would be an evolutionary advantage, since they would give us an efficient way of making decisions. In that sense, it should be no surprise that humans reason with probability as has been observed.

  18. The probability of extraterrestrial life

    International Nuclear Information System (INIS)

    Since the beginning of time, human beings have needed to live in the company of other humans, developing what we now know as human societies. Following this idea, there has been speculation, especially in the present century, about the possibility that human society has the company of other thinking creatures living on planets somewhere in our galaxy. In this talk we will use only reliable data from scientific observers in order to establish a probability. We will explain the analysis of the physico-chemical principles which allowed the evolution of organic molecules on our planet and establish these as the forerunners of life on our planet. On the other hand, the physical processes governing stars, their characteristics and their effects on planets will also be explained, as well as the amount of energy that a planet receives, its mass, atmosphere and kind of orbit. Finally, considering all this information, a probability of life from outer space will be given. (Author)

  19. Classical Probability and Quantum Outcomes

    Directory of Open Access Journals (Sweden)

    James D. Malley

    2014-05-01

    Full Text Available There is a contact problem between classical probability and quantum outcomes. Thus, a standard result from classical probability on the existence of joint distributions ultimately implies that all quantum observables must commute. An essential task here is a closer identification of this conflict based on deriving commutativity from the weakest possible assumptions, and showing that stronger assumptions in some of the existing no-go proofs are unnecessary. An example of an unnecessary assumption in such proofs is an entangled system involving nonlocal observables. Another example involves the Kochen-Specker hidden variable model, features of which are also not needed to derive commutativity. A diagram is provided by which user-selected projectors can be easily assembled into many new, graphical no-go proofs.

  20. Large deviations and idempotent probability

    CERN Document Server

    Puhalskii, Anatolii

    2001-01-01

    In the view of many probabilists, author Anatolii Puhalskii's research results stand among the most significant achievements in the modern theory of large deviations. In fact, his work marked a turning point in the depth of our understanding of the connections between the large deviation principle (LDP) and well-known methods for establishing weak convergence results. Large Deviations and Idempotent Probability expounds upon the recent methodology of building large deviation theory along the lines of weak convergence theory. The author develops an idempotent (or maxitive) probability theory, introduces idempotent analogues of martingales (maxingales), Wiener and Poisson processes, and Ito differential equations, and studies their properties. The large deviation principle for stochastic processes is formulated as a certain type of convergence of stochastic processes to idempotent processes. The author calls this large deviation convergence. The approach to establishing large deviation convergence uses novel com...

  1. Relative transition probabilities of cobalt

    Science.gov (United States)

    Roig, R. A.; Miller, M. H.

    1974-01-01

    Results of determinations of neutral-cobalt transition probabilities measured relative to Co I 4150.43 A and Co II 4145.15 A, using a gas-driven shock tube as the spectroscopic light source. Results are presented for 139 Co I lines in the range from 3940 to 6640 A and 11 Co II lines in the range from 3840 to 4730 A, which are estimated to have reliabilities ranging from 8 to 50%.

  2. Probability for primordial black holes

    Science.gov (United States)

    Bousso, R.; Hawking, S. W.

    1995-11-01

    We consider two quantum cosmological models with a massive scalar field: an ordinary Friedmann universe and a universe containing primordial black holes. For both models we discuss the complex solutions to the Euclidean Einstein equations. Using the probability measure obtained from the Hartle-Hawking no-boundary proposal we find that the only unsuppressed black holes start at the Planck size but can grow with the horizon scale during the roll down of the scalar field to the minimum.

  3. Tight Bernoulli tail probability bounds

    OpenAIRE

    Dzindzalieta, Dainius

    2014-01-01

    The purpose of the dissertation is to prove universal tight bounds for deviation-from-the-mean probability inequalities for functions of random variables. Universality means that the bounds are uniform with respect to some class of distributions, the number of variables, and other parameters. The bounds are called tight if we can construct a sequence of random variables such that the upper bounds are achieved. Such inequalities are useful, for example, in insurance mathematics, for constructing...

  4. Probability distributions of landslide volumes

    OpenAIRE

    M. T. Brunetti; Guzzetti, F.; M. Rossi

    2009-01-01

    We examine 19 datasets with measurements of landslide volume, VL, for sub-aerial, submarine, and extraterrestrial mass movements. Individual datasets include from 17 to 1019 landslides of different types, including rock fall, rock slide, rock avalanche, soil slide, slide, and debris flow, with individual landslide volumes ranging over 10^-4 m^3 ≤ VL ≤ 10^13 m^3. We determine the probability density of landslide volumes, p(VL), using kernel density estimation. Each landslide...

  5. Field-based video pre-test counseling, oral testing and telephonic post-test counseling: Implementation of an HIV field testing package among high-risk Indian men

    OpenAIRE

    Snyder, Hannah; Yeldandi, Vijay V.; Kumar, G. Prem; Liao, Chuanhong; Lakshmi, Vemu; Gandham, Sabitha R.; Muppudi, Uma; Oruganti, Ganesh; Schneider, John A

    2012-01-01

    In India, men who have sex with men (MSM) and truck drivers are high-risk groups that often do not access HIV testing due to stigma and high mobility. This study evaluated a field testing package (FTP) that identified HIV positive participants through video pre-test counseling, OraQuick oral fluid HIV testing, and telephonic post-test counseling and then connected them to government facilities. 598 MSM and truck drivers participated in the FTP and completed surveys covering sociodemographics,...

  6. Generating target probability sequences and events

    OpenAIRE

    Ella, Vaignana Spoorthy

    2013-01-01

    Cryptography and simulation of systems require that events of pre-defined probability be generated. This paper presents methods to generate target probability events based on the oblivious transfer protocol and target probabilistic sequences using probability distribution functions.
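    The paper's oblivious-transfer-based construction is not reproduced here; as a hedged illustration of the underlying task, an event with any target probability p can be generated from fair coin flips by comparing random bits against the binary expansion of p:

```python
import random

def target_event(p, randbit):
    """Return 1 with probability p and 0 otherwise, consuming fair random
    bits: conceptually compares a uniform number U (drawn bit by bit) with
    p, deciding at the first bit position where they differ (U < p wins)."""
    while True:
        p *= 2.0                         # shift out the next bit of p
        p_bit = 1 if p >= 1.0 else 0
        if p_bit:
            p -= 1.0
        u_bit = randbit()                # next bit of the uniform draw U
        if u_bit != p_bit:
            return 1 if u_bit < p_bit else 0
```

On average only about two fair bits are consumed per generated event, independent of p.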

  7. Clinical utility of acoustic radiation force impulse imaging for identification of malignant liver lesions: a meta-analysis

    International Nuclear Information System (INIS)

    To assess the performance of acoustic radiation force impulse (ARFI) imaging for the identification of malignant liver lesions using meta-analysis. PubMed, the Cochrane Library, the ISI Web of Knowledge and the China National Knowledge Infrastructure were searched. Studies published in English or Chinese evaluating the accuracy of ARFI imaging for the identification of malignant liver lesions were collected. A hierarchical summary receiver operating characteristic (HSROC) curve was used to examine the accuracy of ARFI imaging. The clinical utility of ARFI imaging for the identification of malignant liver lesions was evaluated by Fagan plot analysis. A total of eight studies comprising 590 liver lesions were analysed. The summary sensitivity and specificity for the identification of malignant liver lesions were 0.86 (95 % confidence interval (CI) 0.74-0.93) and 0.89 (95 % CI 0.81-0.94), respectively. The area under the HSROC curve was 0.94 (95 % CI 0.91-0.96). For an ARFI measurement above the cut-off value for malignancy (a "positive" result), the corresponding post-test probability of malignancy (assuming a pre-test probability of 50 %) was 89 %; for a "negative" result, the post-test probability was 13 %. ARFI imaging has a high accuracy in the classification of liver lesions. (orig.)
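    The post-test probabilities quoted above follow from Bayes' theorem in odds form, which is what a Fagan nomogram computes graphically; a minimal sketch using the summary sensitivity and specificity from the abstract:

```python
def post_test_probability(pre_test, sensitivity, specificity, positive):
    """Convert a pre-test probability into a post-test probability via the
    likelihood ratio of a positive (LR+) or negative (LR-) test result."""
    pre_odds = pre_test / (1.0 - pre_test)
    if positive:
        lr = sensitivity / (1.0 - specificity)    # LR+
    else:
        lr = (1.0 - sensitivity) / specificity    # LR-
    post_odds = pre_odds * lr
    return post_odds / (1.0 + post_odds)
```

With sensitivity 0.86, specificity 0.89 and a pre-test probability of 50 %, this gives about 89 % after a positive result and about 14 % after a negative one, close to the 89 % and 13 % reported from the Fagan plot analysis.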

  8. Probability, Statistics, and Stochastic Processes

    CERN Document Server

    Olofsson, Peter

    2012-01-01

    This book provides a unique and balanced approach to probability, statistics, and stochastic processes.   Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area.  The Second Edition features new coverage of analysis of variance (ANOVA), consistency and efficiency of estimators, asymptotic theory for maximum likelihood estimators, empirical distribution function and the Kolmogorov-Smirnov test, general linear models, multiple comparisons, Markov chain Monte Carlo (MCMC), Brownian motion, martingales, and

  9. Probability of Detection Demonstration Transferability

    Science.gov (United States)

    Parker, Bradford H.

    2008-01-01

    The ongoing Mars Science Laboratory (MSL) Propellant Tank Penetrant Nondestructive Evaluation (NDE) Probability of Detection (POD) Assessment (NESC activity) has surfaced several issues associated with liquid penetrant POD demonstration testing. This presentation lists factors that may influence the transferability of POD demonstration tests. Initial testing will address the liquid penetrant inspection technique. Some of the factors to be considered in this task are crack aspect ratio, the extent of the crack opening, the material and the distance between the inspection surface and the inspector's eye.

  10. Estimating Probabilities in Recommendation Systems

    CERN Document Server

    Sun, Mingxuan; Kidwell, Paul

    2010-01-01

    Recommendation systems are emerging as an important business application with significant economic impact. Currently popular systems include Amazon's book recommendations, Netflix's movie recommendations, and Pandora's music recommendations. In this paper we address the problem of estimating probabilities associated with recommendation system data using non-parametric kernel smoothing. In our estimation we interpret missing items as randomly censored observations and obtain efficient computation schemes using combinatorial properties of generating functions. We demonstrate our approach with several case studies involving real world movie recommendation data. The results are comparable with state-of-the-art techniques while also providing probabilistic preference estimates outside the scope of traditional recommender systems.

  11. Joint Probability Models of Radiology Images and Clinical Annotations

    Science.gov (United States)

    Arnold, Corey Wells

    2009-01-01

    Radiology data, in the form of images and reports, is growing at a high rate due to the introduction of new imaging modalities, new uses of existing modalities, and the growing importance of objective image information in the diagnosis and treatment of patients. This increase has resulted in an enormous set of image data that is richly annotated…

  12. Garlic: A Concise Drug Review with Probable Clinical Uses

    OpenAIRE

    Vineet Singla; Jai Deep Bajaj; Radhika Bhaskar; Bimlesh Kumar

    2012-01-01

    Garlic and its preparations have been widely recognized as an agent for prevention and treatment of cardiovascular diseases and other metabolic disorders, atherosclerosis, hyperlipidemia, thrombosis, hypertension and hypoglycemia. This review discusses the possible mechanism of therapeutic actions of garlic, different extraction procedures along with determination of its constituents, its stability and dissolution method of garlic tablet.

  13. Lectures on probability and statistics

    International Nuclear Information System (INIS)

    These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. We begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. We finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another
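The a priori dice calculation described in the abstract can be illustrated by brute-force enumeration of the equally likely outcomes (a hypothetical example, not taken from the lectures):

```python
from itertools import product
from fractions import Fraction

def probability_of(event, n_dice=2, sides=6):
    """P(event) for n fair dice, counting favorable among equally likely outcomes."""
    outcomes = list(product(range(1, sides + 1), repeat=n_dice))
    favorable = sum(1 for roll in outcomes if event(roll))
    return Fraction(favorable, len(outcomes))

# Probability that two fair dice sum to 7: 6 favorable outcomes out of 36
print(probability_of(lambda roll: sum(roll) == 7))  # → 1/6
```

This is the "deductive" direction the notes begin with; the inverse, statistical problem starts from observed rolls and asks which die probabilities best explain them.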

  14. Hf Transition Probabilities and Abundances

    CERN Document Server

    Lawler, J E; Labby, Z E; Sneden, C; Cowan, J J; Ivans, I I

    2006-01-01

    Radiative lifetimes from laser-induced fluorescence measurements, accurate to about +/- 5 percent, are reported for 41 odd-parity levels of Hf II. The lifetimes are combined with branching fractions measured using Fourier transform spectrometry to determine transition probabilities for 150 lines of Hf II. Approximately half of these new transition probabilities overlap with recent independent measurements using a similar approach. The two sets of measurements are found to be in good agreement for measurements in common. Our new laboratory data are applied to refine the hafnium photospheric solar abundance and to determine hafnium abundances in 10 metal-poor giant stars with enhanced r-process abundances. For the Sun we derive log epsilon (Hf) = 0.88 +/- 0.08 from four lines; the uncertainty is dominated by the weakness of the lines and their blending by other spectral features. Within the uncertainties of our analysis, the r-process-rich stars possess constant Hf/La and Hf/Eu abundance ratios, log epsilon (Hf...

  15. Gd Transition Probabilities and Abundances

    CERN Document Server

    Den Hartog, E A; Sneden, C; Cowan, J J

    2006-01-01

    Radiative lifetimes, accurate to +/- 5%, have been measured for 49 even-parity and 14 odd-parity levels of Gd II using laser-induced fluorescence. The lifetimes are combined with branching fractions measured using Fourier transform spectrometry to determine transition probabilities for 611 lines of Gd II. This work is the largest-scale laboratory study to date of Gd II transition probabilities and the first using a high performance Fourier transform spectrometer. This improved data set has been used to determine a new solar photospheric Gd abundance, log epsilon = 1.11 +/- 0.03. Revised Gd abundances have also been derived for the r-process-rich metal-poor giant stars CS 22892-052, BD+17 3248, and HD 115444. The resulting Gd/Eu abundance ratios are in very good agreement with the solar-system r-process ratio. We have employed the increasingly accurate stellar abundance determinations, resulting in large part from the more precise laboratory atomic data, to predict directly the Solar System r-process elemental...

  16. Lectures on probability and statistics

    Energy Technology Data Exchange (ETDEWEB)

    Yost, G.P.

    1984-09-01

    These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. We begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. We finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another.

  17. Lévy laws in free probability

    OpenAIRE

    Barndorff-Nielsen, Ole E.; Thorbjørnsen, Steen

    2002-01-01

    This article and its sequel outline recent developments in the theory of infinite divisibility and Lévy processes in free probability, a subject area belonging to noncommutative (or quantum) probability. The present paper discusses the classes of infinitely divisible probability measures in classical and free probability, respectively, via a study of the Bercovici–Pata bijection between these classes.

  18. The Inductive Applications of Probability Calculus

    Directory of Open Access Journals (Sweden)

    Corrado Gini

    2015-06-01

    The author goes back to the founders of probability calculus to investigate their original interpretation of the probability measure in applications of probability theory to real problems. The author highlights some misunderstandings related to the inversion of deductions drawn from probability distributions when investigating the causes of events.

  19. Delayed neutron emission probability measurements

    International Nuclear Information System (INIS)

    Some neutrons are emitted from fission fragments several seconds to several minutes after fission occurs. These delayed neutrons play a key role in the operation and safety of nuclear reactors [1]. However, the probabilities of emitting such neutrons (Pn) are not well known. A summary of different databases and compilations of Pn values is presented to show these discrepancies and uncertainties. Experiments are carried out at the Lohengrin mass spectrometer (at the Institut Laue-Langevin in Grenoble) and at the ISOLDE facility (CERN) in order to measure some Pn values. Two different techniques are used: gamma-ray detection or neutron detection. These two techniques and some preliminary results are presented. (authors)

  20. Associativity and normative credal probability.

    Science.gov (United States)

    Snow, P

    2002-01-01

    Cox's Theorem is a widely cited motivation for probabilistic models of uncertain belief. The theorem relates the associativity of the logical connectives to that of the arithmetic operations of probability. Recent questions about the correctness of Cox's Theorem have been resolved, but there are new questions about one functional equation used by Cox in 1946. This equation is missing from his later work. Advances in knowledge since 1946 and changes in Cox's research interests explain the equation's disappearance. Other associativity-based motivations avoid functional equations altogether, and so may be more transparently applied to finite domains and discrete beliefs. A discrete counterpart of Cox's Theorem can be assembled from results that have been in the literature since 1959. PMID:18238098

  1. Probability theory a comprehensive course

    CERN Document Server

    Klenke, Achim

    2014-01-01

    This second edition of the popular textbook contains a comprehensive course in modern probability theory. Overall, probabilistic concepts play an increasingly important role in mathematics, physics, biology, financial engineering and computer science. They help us in understanding magnetism, amorphous media, genetic diversity and the perils of random developments at financial markets, and they guide us in constructing more efficient algorithms.   To address these concepts, the title covers a wide variety of topics, many of which are not usually found in introductory textbooks, such as:   • limit theorems for sums of random variables • martingales • percolation • Markov chains and electrical networks • construction of stochastic processes • Poisson point process and infinite divisibility • large deviation principles and statistical physics • Brownian motion • stochastic integral and stochastic differential equations. The theory is developed rigorously and in a self-contained way, with the c...

  2. Transition Probabilities in 189Os

    International Nuclear Information System (INIS)

    The level structure of 189Os has been studied from the decay of 189Ir (13.3 days) produced in proton spallation at CERN and mass separated in the ISOLDE on-line facility. The gamma-ray spectrum has been recorded both with a high resolution Si(Li) detector and Ge(Li) detectors. Three previously unreported transitions were observed, defining a new level at 348.5 keV. Special attention was given to the low energy level band structure. Several multipolarity mixing ratios were deduced from measured L-subshell ratios which, together with measured level half-lives, gave absolute transition probabilities. The low level decay properties are discussed in terms of the Nilsson model with the inclusion of Coriolis coupling

  3. Transition probabilities for argon I

    International Nuclear Information System (INIS)

    Transition probabilities for ArI lines have been calculated on the basis of the (j,k)-coupling scheme for more than 16000 spectral lines belonging to the transition arrays 4s-np (n=4 to n=9), 5s-np (n=5 to n=9), 6s-np (n=6 to n=9), 7s-np (n=8 to n=9), 4p-ns (n=5 to n=10), 5p-ns (n=6 to n=9), 6p-ns (n=7 to n=8), 4p-nd (n=3 to n=9), 5p-nd (n=4 to n=9), 3d-np (n=5 to n=9), 4d-np (n=6 to n=9), 5d-np (n=7 to n=9), 3d-nf (n=4 to n=9), 4d-nf (n=4 to n=9), 5d-nf (n=5 to n=9), 4f-nd (n=5 to n=9), 5f-nd (n=6 to n=9), 4f-ng (n=5 to n=9), 5f-ng (n=6 to n=9). Insofar as values by other authors exist, comparison is made with these values. It turns out that the results obtained in (j,k)-coupling are close to those obtained in intermediate coupling except for intercombination lines. For high principal and/or orbital quantum numbers the transition probabilities for a multiplet approach those of the corresponding transitions in atomic hydrogen. The calculated values are applied to construct a simplified argon-atom model, which reflects the real transition properties and which allows simplified but realistic non-equilibrium calculations for argon plasmas which deviate from local thermodynamic equilibrium (LTE)

  4. Probability theory and mathematical statistics for engineers

    CERN Document Server

    Pugachev, V S

    2014-01-01

    Probability Theory and Mathematical Statistics for Engineers focuses on the concepts of probability theory and mathematical statistics for finite-dimensional random variables.The publication first underscores the probabilities of events, random variables, and numerical characteristics of random variables. Discussions focus on canonical expansions of random vectors, second-order moments of random vectors, generalization of the density concept, entropy of a distribution, direct evaluation of probabilities, and conditional probabilities. The text then examines projections of random vector

  5. Cosmological dynamics in tomographic probability representation

    OpenAIRE

    Man'ko, V. I.; G. Marmo(Università di Napoli and INFN, Napoli, Italy); Stornaiolo, C.

    2004-01-01

    The probability representation for quantum states of the universe in which the states are described by a fair probability distribution instead of wave function (or density matrix) is developed to consider cosmological dynamics. The evolution of the universe state is described by standard positive transition probability (tomographic transition probability) instead of the complex transition probability amplitude (Feynman path integral) of the standard approach. The latter one is expressed in te...

  6. Introduction to probability theory with contemporary applications

    CERN Document Server

    Helms, Lester L

    2010-01-01

    This introduction to probability theory transforms a highly abstract subject into a series of coherent concepts. Its extensive discussions and clear examples, written in plain language, expose students to the rules and methods of probability. Suitable for an introductory probability course, this volume requires abstract and conceptual thinking skills and a background in calculus.Topics include classical probability, set theory, axioms, probability functions, random and independent random variables, expected values, and covariance and correlations. Additional subjects include stochastic process

  7. Pre-Test and Post-Test Applications to Shape the Education of Phlebotomists in A Quality Management Program: An Experience in A Training Hospital

    Directory of Open Access Journals (Sweden)

    Aykal Güzin

    2016-09-01

    Background: After the introduction of modern laboratory instruments and information systems, the preanalytical phase is the new field of battle. Errors in the preanalytical phase account for approximately half of total errors in the clinical laboratory. The objective of this study was to share the experience of an education program that was believed to be successful in decreasing the number of rejected samples received from the Emergency Department (ED).

  8. Fusion probability in heavy nuclei

    Science.gov (United States)

    Banerjee, Tathagata; Nath, S.; Pal, Santanu

    2015-03-01

    Background: Fusion between two massive nuclei is a very complex process and is characterized by three stages: (a) capture inside the potential barrier, (b) formation of an equilibrated compound nucleus (CN), and (c) statistical decay of the CN leading to a cold evaporation residue (ER) or fission. The second stage is the least understood of the three and is the most crucial in predicting the yield of superheavy elements (SHE) formed in complete fusion reactions. Purpose: A systematic study of the average fusion probability, ⟨P_CN⟩, is undertaken to obtain a better understanding of its dependence on various reaction parameters. The study may also help to clearly demarcate the onset of non-CN fission (NCNF), which causes the fusion probability, P_CN, to deviate from unity. Method: ER excitation functions for 52 reactions leading to CN in the mass region 170-220, which are available in the literature, have been compared with statistical model (SM) calculations. Capture cross sections have been obtained from a coupled-channels code. In the SM, shell corrections in both the level density and the fission barrier have been included. ⟨P_CN⟩ for these reactions has been extracted by comparing experimental and theoretical ER excitation functions in the energy range ~5%-35% above the potential barrier, where known effects of nuclear structure are insignificant. Results: ⟨P_CN⟩ has been shown to vary with entrance channel mass asymmetry, η (or charge product, ZpZt), as well as with fissility of the CN, χCN. No parameter has been found to be adequate as a single scaling variable to determine ⟨P_CN⟩. Approximate boundaries have been obtained from where ⟨P_CN⟩ starts deviating from unity. Conclusions: This study quite clearly reveals the limits of applicability of the SM in interpreting experimental observables from fusion reactions involving two massive nuclei. Deviation of ⟨P_CN⟩ from unity marks the beginning of the domain of dynamical models of fusion. Availability of precise ER cross sections over a wider energy range for

  9. Failure-probability driven dose painting

    Science.gov (United States)

    Vogelius, Ivan R.; Håkansson, Katrin; Due, Anne K.; Aznar, Marianne C.; Berthelsen, Anne K.; Kristensen, Claus A.; Rasmussen, Jacob; Specht, Lena; Bentzen, Søren M.

    2013-01-01

    Purpose: To demonstrate a data-driven dose-painting strategy based on the spatial distribution of recurrences in previously treated patients. The result is a quantitative way to define a dose prescription function, optimizing the predicted local control at constant treatment intensity. A dose planning study using the optimized dose prescription in 20 patients is performed. Methods: Patients treated at our center have five tumor subvolumes from the center of the tumor (PET positive volume) and out delineated. The spatial distribution of 48 failures in patients with complete clinical response after (chemo)radiation is used to derive a model for tumor control probability (TCP). The total TCP is fixed to the clinically observed 70% actuarial TCP at five years. Additionally, the authors match the distribution of failures between the five subvolumes to the observed distribution. The steepness of the dose–response is extracted from the literature and the authors assume 30% and 20% risk of subclinical involvement in the elective volumes. The result is a five-compartment dose response model matching the observed distribution of failures. The model is used to optimize the distribution of dose in individual patients, while keeping the treatment intensity constant and the maximum prescribed dose below 85 Gy. Results: The vast majority of failures occur centrally despite the small volumes of the central regions. Thus, optimizing the dose prescription yields higher doses to the central target volumes and lower doses to the elective volumes. The dose planning study shows that the modified prescription is clinically feasible. The optimized TCP is 89% (range: 82%–91%) as compared to the observed TCP of 70%. Conclusions: The observed distribution of locoregional failures was used to derive an objective, data-driven dose prescription function. The optimized dose is predicted to result in a substantial increase in local control without increasing the predicted risk of toxicity

  10. Failure-probability driven dose painting

    Energy Technology Data Exchange (ETDEWEB)

    Vogelius, Ivan R.; Håkansson, Katrin; Due, Anne K.; Aznar, Marianne C.; Kristensen, Claus A.; Rasmussen, Jacob; Specht, Lena [Department of Radiation Oncology, Rigshospitalet, University of Copenhagen, Copenhagen 2100 (Denmark); Berthelsen, Anne K. [Department of Radiation Oncology, Rigshospitalet, University of Copenhagen, Copenhagen 2100, Denmark and Department of Clinical Physiology, Nuclear Medicine and PET, Rigshospitalet, University of Copenhagen, Copenhagen 2100 (Denmark); Bentzen, Søren M. [Department of Radiation Oncology, Rigshospitalet, University of Copenhagen, Copenhagen 2100, Denmark and Departments of Human Oncology and Medical Physics, University of Wisconsin, Madison, Wisconsin 53792 (United States)

    2013-08-15

    Purpose: To demonstrate a data-driven dose-painting strategy based on the spatial distribution of recurrences in previously treated patients. The result is a quantitative way to define a dose prescription function, optimizing the predicted local control at constant treatment intensity. A dose planning study using the optimized dose prescription in 20 patients is performed. Methods: Patients treated at our center have five tumor subvolumes from the center of the tumor (PET positive volume) and out delineated. The spatial distribution of 48 failures in patients with complete clinical response after (chemo)radiation is used to derive a model for tumor control probability (TCP). The total TCP is fixed to the clinically observed 70% actuarial TCP at five years. Additionally, the authors match the distribution of failures between the five subvolumes to the observed distribution. The steepness of the dose–response is extracted from the literature and the authors assume 30% and 20% risk of subclinical involvement in the elective volumes. The result is a five-compartment dose response model matching the observed distribution of failures. The model is used to optimize the distribution of dose in individual patients, while keeping the treatment intensity constant and the maximum prescribed dose below 85 Gy. Results: The vast majority of failures occur centrally despite the small volumes of the central regions. Thus, optimizing the dose prescription yields higher doses to the central target volumes and lower doses to the elective volumes. The dose planning study shows that the modified prescription is clinically feasible. The optimized TCP is 89% (range: 82%–91%) as compared to the observed TCP of 70%. Conclusions: The observed distribution of locoregional failures was used to derive an objective, data-driven dose prescription function. The optimized dose is predicted to result in a substantial increase in local control without increasing the predicted risk of toxicity.

  11. Avoiding Negative Probabilities in Quantum Mechanics

    CERN Document Server

    Nyambuya, Golden Gadzirayi

    2013-01-01

    As currently understood since its discovery, the bare Klein-Gordon theory consists of negative quantum probabilities which are considered to be physically meaningless if not outright obsolete. Despite this annoying setback, these negative probabilities are what led the great Paul Dirac in 1928 to the esoteric discovery of the Dirac Equation. The Dirac Equation led to one of the greatest advances in our understanding of the physical world. In this reading, we ask the seemingly senseless question, "Do negative probabilities exist in quantum mechanics?" In an effort to answer this question, we arrive at the conclusion that depending on the choice one makes of the quantum probability current, one will obtain negative probabilities. We thus propose a new quantum probability current of the Klein-Gordon theory. This quantum probability current leads directly to positive definite quantum probabilities. Because these negative probabilities are in the bare Klein-Gordon theory, intrinsically a result of negative energie...

  12. Direct probability mapping of contaminants

    International Nuclear Information System (INIS)

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. Geostatistical simulation provides powerful tools for investigating contaminant levels, and in particular, for identifying and using the spatial interrelationships among a set of isolated sample values. This additional information can be used to assess the likelihood of encountering contamination at unsampled locations and to evaluate the risk associated with decisions to remediate or not to remediate specific regions within a site. Past operation of the DOE Feed Materials Production Center has contaminated a site near Fernald, Ohio, with natural uranium. Soil geochemical data have been collected as part of the Uranium-in-Soils Integrated Demonstration Project. These data have been used to construct a number of stochastic images of potential contamination for parcels approximately the size of a selective remediation unit. Each such image accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely, statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination. Evaluation of the geostatistical simulations can yield maps representing the expected magnitude of the contamination for various regions and other information that may be important in determining a suitable remediation process or in sizing equipment to accomplish the restoration

  13. The Black Hole Formation Probability

    CERN Document Server

    Clausen, Drew; Ott, Christian D

    2014-01-01

    A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. Using the observed BH mass distribution from Galactic X-ray binaries, we derive the probability that a star will make a BH as a function of its ZAMS mass, $P_{\rm BH}(M_{\rm ZAMS})$. We explore possible biases in the observed BH mass distribution and find that this sample is best suited for studying BH formation in stars with ZAMS masses in the range $12-...

  14. Exact Bures Probabilities of Separability

    CERN Document Server

    Slater, P B

    1999-01-01

    We reexamine the question of what constitutes the conditional Bures or "quantum Jeffreys" prior for a certain four-dimensional convex subset (P) of the eight-dimensional convex set (Q) of 3 x 3 density matrices (rho_{Q}). We find that two competing procedures yield related but not identical priors - the prior previously reported (J. Phys. A 29, L271 [1996]) being normalizable over P, the new prior here, not. Both methods rely upon the same formula of Dittmann for the Bures metric tensor g, but differ in the parameterized form of rho employed. In the earlier approach, the input is a member of P, that is rho_{P}, while here it is rho_{Q}, and only after this computation is the conditioning on P performed. Then, we investigate several one-dimensional subsets of the fifteen-dimensional set of 4 x 4 density matrices, to which we apply, in particular, the first methodology. Doing so, we determine exactly the conditional Bures probabilities of separability into product states of 2 x 2 density matrices. We find that ...

  15. Trajectory versus probability density entropy

    Science.gov (United States)

    Bologna, Mauro; Grigolini, Paolo; Karagiorgis, Markos; Rosa, Angelo

    2001-07-01

    We show that the widely accepted conviction that a connection can be established between the probability density entropy and the Kolmogorov-Sinai (KS) entropy is questionable. We adopt the definition of density entropy as a functional of a distribution density whose time evolution is determined by a transport equation, conceived as the only prescription to use for the calculation. Although the transport equation is built up for the purpose of affording a picture equivalent to that stemming from trajectory dynamics, no direct use of trajectory time evolution is allowed, once the transport equation is defined. With this definition in mind we prove that the detection of a time regime of increase of the density entropy with a rate identical to the KS entropy is possible only in a limited number of cases. The proposals made by some authors to establish a connection between the two entropies in general, violate our definition of density entropy and imply the concept of trajectory, which is foreign to that of density entropy.

  16. Homonymous Hemianopsia Associated with Probable Alzheimer's Disease.

    Science.gov (United States)

    Ishiwata, Akiko; Kimura, Kazumi

    2016-01-01

    Posterior cortical atrophy (PCA) is a rare neurodegenerative disorder marked by cerebral atrophy in the parietal, occipital, or occipitotemporal cortices and characterized by visuospatial and visuoperceptual impairments. Most cases are pathologically compatible with Alzheimer's disease (AD). We describe a case of PCA in which a combination of imaging methods, in conjunction with symptoms and neurological and neuropsychological examinations, led to its being diagnosed and to AD being identified as its probable cause. Treatment with donepezil for 6 months mildly improved alexia symptoms, but other symptoms remained unchanged. A 59-year-old Japanese woman with progressive alexia, visual deficit, and mild memory loss was referred to our neurologic clinic for the evaluation of right homonymous hemianopsia. Our neurological examination showed alexia, constructional apraxia, mild disorientation, short-term memory loss, and right homonymous hemianopsia. These findings resulted in a score of 23 (of 30) points on the Mini-Mental State Examination. Occipital atrophy was identified, with magnetic resonance imaging (MRI) showing left-side dominance. The MRI data were quantified with voxel-based morphometry, and PCA was diagnosed on the basis of these findings. Single photon emission computed tomography with (123)I-N-isopropyl-p-iodoamphetamine showed hypoperfusion in the occipital lobes, corresponding to the voxel-based morphometry findings. Additionally, the finding of hypoperfusion in the posterior associate cortex, posterior cingulate gyrus, and precuneus was consistent with AD. Therefore, the PCA was considered to be a result of AD. We considered Lewy body dementia as a differential diagnosis because of the presence of hypoperfusion in the occipital lobes. However, the patient did not meet the criteria for Lewy body dementia during the course of the disease. We therefore consider including PCA in the differential diagnoses to be important for patients with visual deficit, cognitive

  17. THE BLACK HOLE FORMATION PROBABILITY

    International Nuclear Information System (INIS)

    A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P_BH(M_ZAMS). Although we find that it is difficult to derive a unique P_BH(M_ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P_BH(M_ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P_BH(M_ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment

  18. THE BLACK HOLE FORMATION PROBABILITY

    Energy Technology Data Exchange (ETDEWEB)

    Clausen, Drew; Piro, Anthony L.; Ott, Christian D., E-mail: dclausen@tapir.caltech.edu [TAPIR, Walter Burke Institute for Theoretical Physics, California Institute of Technology, Mailcode 350-17, Pasadena, CA 91125 (United States)

    2015-02-01

    A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P {sub BH}(M {sub ZAMS}). Although we find that it is difficult to derive a unique P {sub BH}(M {sub ZAMS}) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P {sub BH}(M {sub ZAMS}) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P {sub BH}(M {sub ZAMS}) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment.
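The mapping from ZAMS mass to an NS-versus-BH outcome described above can be caricatured as a smooth probability curve. The sketch below uses a hypothetical sigmoid parametrization with made-up parameters (`m_half`, `width`), purely to illustrate the kind of P_BH(M_ZAMS) function that could feed a population synthesis code; it is not the parametrization derived in the paper.

```python
import math

def p_bh(m_zams, m_half=21.0, width=3.0):
    """Hypothetical sigmoid black hole formation probability P_BH(M_ZAMS).
    m_half and width are illustrative placeholders, not fitted values."""
    return 1.0 / (1.0 + math.exp(-(m_zams - m_half) / width))

# Probability of BH formation rises smoothly with ZAMS mass (in Msun):
for m in (10, 20, 30, 40):
    print(f"M_ZAMS = {m} Msun -> P_BH = {p_bh(m):.2f}")
```

A population synthesis study would then draw the NS/BH outcome for each star as a Bernoulli trial with this probability, rather than applying a sharp mass cut.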

  19. Conditional Probability Modulates Visual Search Efficiency

    Directory of Open Access Journals (Sweden)

    Bryan Cort

    2013-10-01

    Full Text Available We investigated the effects of probability on visual search. Previous work has shown that people can utilize spatial and sequential probability information to improve target detection. We hypothesized that performance improvements from probability information would extend to the efficiency of visual search. Our task was a simple visual search in which the target was always present among a field of distractors, and could take one of two colors. The absolute probability of the target being either color was 0.5; however, the conditional probability – the likelihood of a particular color given a particular combination of two cues – varied from 0.1 to 0.9. We found that participants searched more efficiently for high conditional probability targets and less efficiently for low conditional probability targets, but only when they were explicitly informed of the probability relationship between cues and target color.
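The distinction drawn above between the absolute probability (0.5) and the conditional probability given a cue combination can be made concrete with counts. The toy trial log below is invented; `conditional_probability` simply estimates P(color | cue) from observed frequencies.

```python
# Hypothetical trial log: (cue_combination, target_color). Overall,
# "red" and "green" each occur on half of the trials (P = 0.5), but
# the color is highly predictable given the cue combination.
trials = ([("AB", "red")] * 9 + [("AB", "green")] * 1 +
          [("CD", "red")] * 1 + [("CD", "green")] * 9)

def conditional_probability(trials, cue, color):
    """Estimate P(color | cue) from observed trial frequencies."""
    with_cue = [t for t in trials if t[0] == cue]
    return sum(1 for _, c in with_cue if c == color) / len(with_cue)

print(conditional_probability(trials, "AB", "red"))  # 0.9
print(conditional_probability(trials, "CD", "red"))  # 0.1
```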

  20. Limit Theorems in Free Probability Theory I

    OpenAIRE

    Chistyakov, G. P.; Götze, F.

    2006-01-01

    Based on a new analytical approach to the definition of additive free convolution on probability measures on the real line, we prove free analogs of limit theorems for sums of non-identically distributed random variables in classical Probability Theory.

  1. The Probability Distribution for a Biased Spinner

    Science.gov (United States)

    Foster, Colin

    2012-01-01

    This article advocates biased spinners as an engaging context for statistics students. Calculating the probability of a biased spinner landing on a particular side makes valuable connections between probability and other areas of mathematics. (Contains 2 figures and 1 table.)

  2. A Case of Probable Ibuprofen-Induced Acute Pancreatitis

    Directory of Open Access Journals (Sweden)

    Paul Magill

    2006-05-01

    Full Text Available Context: The incidence of drug-induced pancreatitis is rare. There have been no prior definite cases reported of ibuprofen-induced pancreatitis. Case report: We present a case of a young man with acute pancreatitis probably secondary to an ibuprofen overdose. Immediately preceding the onset of the attack he took a 51 mg/kg dose of ibuprofen. He had other causes of acute pancreatitis excluded by clinical history, serum toxicology and abdominal imaging. Discussion: In the absence of re-challenge we believe it is probable that ibuprofen has a causative link with acute pancreatitis.

  3. Estimating Small Probabilities for Langevin Dynamics

    OpenAIRE

    Aristoff, David

    2012-01-01

    The problem of estimating small transition probabilities for overdamped Langevin dynamics is considered. A simplification of Girsanov's formula is obtained in which the relationship between the infinitesimal generator of the underlying diffusion and the change of probability measure corresponding to a change in the potential energy is made explicit. From this formula an asymptotic expression for transition probability densities is derived. Separately the problem of estimating the probability ...

  4. Probability distributions of landslide volumes

    Directory of Open Access Journals (Sweden)

    M. T. Brunetti

    2009-03-01

    Full Text Available We examine 19 datasets with measurements of landslide volume, VL, for sub-aerial, submarine, and extraterrestrial mass movements. Individual datasets include from 17 to 1019 landslides of different types, including rock fall, rock slide, rock avalanche, soil slide, slide, and debris flow, with individual landslide volumes ranging over 10^-4 m^3 ≤ VL ≤ 10^13 m^3. We determine the probability density of landslide volumes, p(VL), using kernel density estimation. Each landslide dataset exhibits heavy-tailed (self-similar) behaviour in its frequency-size distribution, p(VL) as a function of VL, for failures exceeding different threshold volumes, VL*, for each dataset. These non-cumulative heavy-tailed distributions for each dataset are negative power-laws, with exponents 1.0≤β≤1.9, averaging β≈1.3. The scaling behaviour of VL for the ensemble of the 19 datasets extends over 17 orders of magnitude, and is independent of lithological characteristics, morphological settings, triggering mechanisms, length of period and extent of the area covered by the datasets, presence or lack of water in the failed materials, and magnitude of gravitational fields. We argue that the statistics of landslide volume are conditioned primarily on the geometrical properties of the slope or rock mass where failures occur. Differences in the values of the scaling exponents reflect the primary landslide types, with rock falls exhibiting a smaller scaling exponent (1.1≤β≤1.4) than slides and soil slides (1.5≤β≤1.9). We argue that the difference is a consequence of the disparity in the mechanics of rock falls and slides.
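A negative power-law density of the kind described, p(V) ∝ V^(-β) above a threshold volume, can be simulated and its exponent recovered with the standard maximum-likelihood (Hill) estimator, β̂ = 1 + n / Σ ln(V_i / V_min). The sketch below uses synthetic volumes, not the landslide datasets.

```python
import math
import random

def sample_power_law(beta, v_min, n, rng):
    """Inverse-transform sampling from p(V) ∝ V^(-beta), V >= v_min, beta > 1."""
    return [v_min * (1.0 - rng.random()) ** (-1.0 / (beta - 1.0))
            for _ in range(n)]

def mle_exponent(volumes, v_min):
    """Maximum-likelihood (Hill) estimate of the power-law exponent."""
    tail = [v for v in volumes if v >= v_min]
    return 1.0 + len(tail) / sum(math.log(v / v_min) for v in tail)

rng = random.Random(42)
vols = sample_power_law(beta=1.3, v_min=1e-4, n=50_000, rng=rng)
print(f"estimated exponent: {mle_exponent(vols, 1e-4):.2f}")  # near 1.3
```

With 50,000 samples the estimate concentrates tightly around the true exponent; with the 17 to 1019 landslides per dataset quoted above, the scatter in β̂ is correspondingly larger.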

  5. On the measurement probability of quantum phases

    OpenAIRE

    Schürmann, Thomas

    2006-01-01

    We consider the probability by which quantum phase measurements of a given precision can be done successfully. The least upper bound of this probability is derived and the associated optimal state vectors are determined. The probability bound represents a unique and continuous transition between macroscopic and microscopic measurement precisions.

  6. Uniqueness in ergodic decomposition of invariant probabilities

    OpenAIRE

    Zimmermann, Dieter

    1992-01-01

    We show that for any set of transition probabilities on a common measurable space and any invariant probability, there is at most one representing measure on the set of extremal, invariant probabilities with the $\\sigma$-algebra generated by the evaluations. The proof uses nonstandard analysis.

  7. Equivalence of two orthogonalities between probability measures

    OpenAIRE

    Takatsu, Asuka

    2011-01-01

    Given any two probability measures on a Euclidean space with mean 0 and finite variance, we demonstrate that the two probability measures are orthogonal in the sense of Wasserstein geometry if and only if the two spaces spanned by the supports of each probability measure are orthogonal.

  8. Inferring Beliefs as Subjectively Imprecise Probabilities

    DEFF Research Database (Denmark)

    Andersen, Steffen; Fountain, John; Harrison, Glenn W.; Hole, Arna Risa; Rutström, E. Elisabeth

    2012-01-01

    We propose a method for estimating subjective beliefs, viewed as a subjective probability distribution. The key insight is to characterize beliefs as a parameter to be estimated from observed choices in a well-defined experimental task and to estimate that parameter as a random coefficient. The...... probabilities are indeed best characterized as probability distributions with non-zero variance....

  9. Using Playing Cards to Differentiate Probability Interpretations

    Science.gov (United States)

    López Puga, Jorge

    2014-01-01

    The aprioristic (classical, naïve and symmetric) and frequentist interpretations of probability are commonly known. Bayesian or subjective interpretation of probability is receiving increasing attention. This paper describes an activity to help students differentiate between the three types of probability interpretations.

  10. Pre-Service Teachers' Conceptions of Probability

    Science.gov (United States)

    Odafe, Victor U.

    2011-01-01

    Probability knowledge and skills are needed in science and in making daily decisions that are sometimes made under uncertain conditions. Hence, there is the need to ensure that the pre-service teachers of our children are well prepared to teach probability. Pre-service teachers' conceptions of probability are identified, and ways of helping them…

  11. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2014-01-01

    The long-awaited revision of Fundamentals of Applied Probability and Random Processes expands on the central components that made the first edition a classic. The title is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability t

  12. An Objective Theory of Probability (Routledge Revivals)

    CERN Document Server

    Gillies, Donald

    2012-01-01

    This reissue of D. A. Gillies highly influential work, first published in 1973, is a philosophical theory of probability which seeks to develop von Mises' views on the subject. In agreement with von Mises, the author regards probability theory as a mathematical science like mechanics or electrodynamics, and probability as an objective, measurable concept like force, mass or charge. On the other hand, Dr Gillies rejects von Mises' definition of probability in terms of limiting frequency and claims that probability should be taken as a primitive or undefined term in accordance with modern axioma

  13. Combination of clinical and v/q scan assessment for the diagnosis of pulmonary embolism: a 2-year outcome prospective study

    Energy Technology Data Exchange (ETDEWEB)

    Barghouth, G.; Boubaker, A.; Delaloye, A.B. [Univ. Hospital, Lausanne (Switzerland). Dept. of Nuclear Medicine; Yersin, B. [Dept. of Internal Medicine, Univ. Hospital, Lausanne (Switzerland); Doenz, F.; Schnyder, P. [Centre Hospitalier Universitaire Vaudois, Lausanne (Switzerland). Dept. of Radiology

    2000-09-01

    With the aim of evaluating the efficiency of our diagnostic approach in patients with suspected acute pulmonary embolism (PE), we prospectively studied 143 patients investigated by means of a ventilation/perfusion (V/Q) lung scan. A pre-test clinical probability of PE (P{sub clin}) was assigned to all patients by the clinicians and scans were interpreted blinded to clinical assessment. A 2-year follow-up of our patients was systematically performed and possible in 134 cases. The distribution of clinical probabilities was high P{sub clin} in 22.5%, intermediate P{sub clin} in 24% and low P{sub clin} in 53.5%, whereas the distribution of scan categories was high P{sub scan} in 14%, intermediate P{sub scan} in 18%, low P{sub scan} in 57% and normal P{sub scan} in 11%. The final prevalence of PE was 24.5%. High P{sub scan} and normal P{sub scan} were always conclusive (19 and 15 cases respectively). A low P{sub scan} associated with a low P{sub clin} could exclude PE in 43/45 cases (96%). None of the patients in whom the diagnosis of PE was discarded had a major event related to PE during the 2-year follow-up. Overall, the combined assessment of clinical and scintigraphic probabilities allowed confirmation or exclusion of PE in 80% of subjects (107/134) and proved to be a valuable tool for selecting patients who needed pulmonary angiography, which was required in 20% of our patients (27/134). (orig.)
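Combining a pre-test clinical probability with a scan result is, at bottom, a Bayes update on the odds scale. The sketch below is generic: the 24.5% prevalence comes from the abstract, but the likelihood ratios are illustrative placeholders, not values measured in this study.

```python
def posttest_probability(pretest_p, likelihood_ratio):
    """Combine a pre-test probability with a test likelihood ratio via
    Bayes' theorem on the odds scale (odds = p / (1 - p))."""
    pre_odds = pretest_p / (1.0 - pretest_p)
    post_odds = pre_odds * likelihood_ratio
    return post_odds / (1.0 + post_odds)

# Hypothetical likelihood ratios: a high-probability scan strongly
# raises, and a normal scan strongly lowers, the probability of PE.
print(posttest_probability(0.245, 18.0))   # high P_scan, starting from 24.5%
print(posttest_probability(0.245, 0.05))   # normal scan, starting from 24.5%
```

The same arithmetic explains why a low-probability scan alone is inconclusive: only when the pre-test probability is already low does the post-test probability fall low enough to discard the diagnosis.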

  14. Pretest probability assessment derived from attribute matching

    Directory of Open Access Journals (Sweden)

    Hollander Judd E

    2005-08-01

    Full Text Available Abstract Background Pretest probability (PTP) assessment plays a central role in diagnosis. This report compares a novel attribute-matching method to generate a PTP for acute coronary syndrome (ACS). We compare the new method with a validated logistic regression equation (LRE). Methods Eight clinical variables (attributes) were chosen by classification and regression tree analysis of a prospectively collected reference database of 14,796 emergency department (ED) patients evaluated for possible ACS. For attribute matching, a computer program identifies patients within the database who have the exact profile defined by clinician input of the eight attributes. The novel method was compared with the LRE for ability to produce PTP estimation. Results In the validation set, attribute matching produced 267 unique PTP estimates [median PTP value 6%, 1st–3rd quartile 1–10%] compared with the LRE, which produced 96 unique PTP estimates [median 24%, 1st–3rd quartile 10–30%]. The areas under the receiver operating characteristic curves were 0.74 (95% CI 0.65 to 0.82) for the attribute matching curve and 0.68 (95% CI 0.62 to 0.77) for the LRE. The attribute matching system categorized 1,670 (24%, 95% CI = 23–25%) patients as having a PTP Conclusion Attribute matching estimated a very low PTP for ACS in a significantly larger proportion of ED patients compared with a validated LRE.
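The attribute-matching idea, look up every reference patient with the identical eight-attribute profile and report their outcome rate, can be sketched in a few lines. The record fields and profile values below are invented stand-ins for the actual registry schema.

```python
def attribute_match_ptp(database, profile):
    """Pretest probability by attribute matching: find all reference
    patients whose eight attributes exactly match the query profile
    and return the fraction who had the outcome (ACS).
    Field names and attribute encodings are illustrative only."""
    matches = [rec for rec in database if rec["attributes"] == profile]
    if not matches:
        return None  # no identical profile in the reference database
    return sum(rec["acs"] for rec in matches) / len(matches)

# Tiny invented reference database; real systems use ~15,000 records.
db = [
    {"attributes": ("m", "55-64", "typical", 1, 0, 1, 0, 0), "acs": True},
    {"attributes": ("m", "55-64", "typical", 1, 0, 1, 0, 0), "acs": False},
    {"attributes": ("f", "35-44", "atypical", 0, 0, 0, 0, 0), "acs": False},
]
print(attribute_match_ptp(db, ("m", "55-64", "typical", 1, 0, 1, 0, 0)))  # 0.5
```

Unlike a fitted regression equation, the estimate is a raw empirical frequency, which is why the method can emit many distinct PTP values, one per observed profile.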

  15. TPmsm: Estimation of the Transition Probabilities in 3-State Models

    Directory of Open Access Journals (Sweden)

    Artur Araújo

    2014-12-01

    Full Text Available One major goal in clinical applications of multi-state models is the estimation of transition probabilities. The usual nonparametric estimator of the transition matrix for non-homogeneous Markov processes is the Aalen-Johansen estimator (Aalen and Johansen 1978). However, two problems may arise from using this estimator: first, its standard error may be large in heavily censored scenarios; second, the estimator may be inconsistent if the process is non-Markovian. The development of the R package TPmsm has been motivated by several recent contributions that account for these estimation problems. Estimation and statistical inference for transition probabilities can be performed using TPmsm. The TPmsm package provides seven different approaches to three-state illness-death modeling. In two of these approaches the transition probabilities are estimated conditionally on current or past covariate measures. Two real data examples are included for illustration of software usage.
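For fully observed, uncensored paths, the transition probabilities that TPmsm and the Aalen-Johansen estimator target reduce to simple conditional frequencies. A minimal sketch over a hypothetical three-state illness-death dataset (the general estimators exist precisely because real data are censored and time-inhomogeneous):

```python
# Hypothetical fully observed paths through a three-state
# illness-death model: 0 = healthy, 1 = ill, 2 = dead.
paths = [
    [0, 0, 1, 2],
    [0, 1, 1, 2],
    [0, 0, 0, 0],
    [0, 1, 2, 2],
    [0, 0, 1, 1],
]

def transition_probability(paths, s, t, from_state, to_state):
    """Empirical P(X_t = to_state | X_s = from_state) from uncensored
    paths; the Aalen-Johansen estimator generalizes this conditional
    frequency to censored, non-homogeneous Markov data."""
    at_risk = [p for p in paths if p[s] == from_state]
    return sum(1 for p in at_risk if p[t] == to_state) / len(at_risk)

print(transition_probability(paths, 0, 3, 0, 2))  # P(dead at t=3 | healthy at t=0)
```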

  16. Probability and Quantum Paradigms: the Interplay

    International Nuclear Information System (INIS)

    Since the introduction of Born's interpretation of quantum wave functions as yielding the probability density of presence, Quantum Theory and Probability have lived in a troubled symbiosis. Problems arise with this interpretation because quantum probabilities exhibit features alien to usual probabilities, namely non-Boolean structure and non-positive-definite phase space probability densities. This has inspired research into both elaborate formulations of Probability Theory and alternate interpretations for wave functions. Herein the latter tactic is taken and a variant interpretation of wave functions based on photo-detection physics is proposed, and some empirical consequences are considered. Although incomplete in a few details, this variant is appealing in its reliance on well tested concepts and technology.

  17. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2005-01-01

    This book is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability to real-world problems, and introduce the basics of statistics. The book's clear writing style and homework problems make it ideal for the classroom or for self-study. Features: a good and solid introduction to probability theory and stochastic processes; logically organized, clearly presented writing; a comprehensive choice of topics within the area of probability; and ample homework problems organized into chapter sections.

  18. Probability and Quantum Paradigms: the Interplay

    Science.gov (United States)

    Kracklauer, A. F.

    2007-12-01

    Since the introduction of Born's interpretation of quantum wave functions as yielding the probability density of presence, Quantum Theory and Probability have lived in a troubled symbiosis. Problems arise with this interpretation because quantum probabilities exhibit features alien to usual probabilities, namely non-Boolean structure and non-positive-definite phase space probability densities. This has inspired research into both elaborate formulations of Probability Theory and alternate interpretations for wave functions. Herein the latter tactic is taken and a variant interpretation of wave functions based on photo-detection physics is proposed, and some empirical consequences are considered. Although incomplete in a few details, this variant is appealing in its reliance on well tested concepts and technology.

  19. Probable Effects Of Exposure To Electromagnetic Waves Emitted From Video Display Terminals On Ocular Functions

    International Nuclear Information System (INIS)

    There is a growing body of evidence that computer use can adversely affect visual health. Considering the rising number of computer users in Egypt, computer-related visual symptoms might take an epidemic form. In view of that, this study was undertaken to find out the magnitude of the visual problems in computer operators and their relationship with various personal and workplace factors. Aim: To evaluate the probable effects of exposure to electromagnetic waves radiated from visual display terminals on some visual functions. Subjects and Methods: One hundred fifty computer operators working in different institutes were randomly selected. They were asked to fill in a pre-tested questionnaire (written in Arabic) after their verbal consent was obtained. The selected exposed subjects underwent the following clinical assessment: 1- Visual acuity measurement. 2- Refraction (using an autorefractometer). 3- Assessment of ocular dryness using the following diagnostic tests: Schirmer test, fluorescein staining, Rose Bengal staining, tear break-up time (TBUT) and the LIPCOF test (lid-parallel conjunctival folds). A control group included one hundred fifty participants working in a field that does not necessitate exposure to video display terminals. Inclusion criteria were as follows: a minimum of three symptoms of computer vision syndrome (CVS), a minimum of one year of exposure to VDTs, and a minimum of 6 h/day on 5 working days/week. Exclusion criteria included candidates with ocular pathology such as glaucoma, optic atrophy, diabetic retinopathy or papilledema. The following complaints were studied: 1- Tired eyes. 2- Burning eyes with excessive tear production. 3- Dry, sore eyes. 4- Blurred near vision (letters on the screen run together). 5- Asthenopia. 6- Neck, shoulder and back aches, overall bodily fatigue or tiredness.
An interventional protective measure was administered to the selected subjects from the exposed group; it included the following (1

  20. Introduction: Research and Developments in Probability Education

    OpenAIRE

    Manfred Borovcnik; Ramesh Kapadia

    2009-01-01

    In the topic study group on probability at ICME 11 a variety of ideas on probability education were presented. Some of the papers have been developed further by the driving ideas of interactivity and use of the potential of electronic publishing. As often happens, the medium of research influences the results and thus – not surprisingly – the research changed its character during this process. This paper provides a summary of the main threads of research in probability education across the wor...

  1. Probabilities and signalling in quantum field theory

    OpenAIRE

    Dickinson, Robert; Forshaw, Jeff; Millington, Peter

    2016-01-01

    We present an approach to computing probabilities in quantum field theory for a wide class of source-detector models. The approach works directly with probabilities and not with squared matrix elements, and the resulting probabilities can be written in terms of expectation values of nested commutators and anti-commutators. We present results that help in the evaluation of these, including an expression for the vacuum expectation values of general nestings of commutators and anti-commutators i...

  2. Local Percolation Probabilities for a Natural Sandstone

    OpenAIRE

    Hilfer, R.; Rag, T.; Virgi, B.

    1996-01-01

    Local percolation probabilities are used to characterize the connectivity in porous and heterogeneous media. Together with local porosity distributions they allow one to predict transport properties \cite{hil91d}. While local porosity distributions are readily obtained, measurements of the local percolation probabilities are more difficult and have not been attempted previously. First measurements of three dimensional local porosity distributions and percolation probabilities from a pore space re...

  3. Are All Probabilities Fundamentally Quantum Mechanical?

    OpenAIRE

    Pradhan, Rajat Kumar

    2011-01-01

    The subjective and the objective aspects of probabilities are incorporated in a simple duality axiom inspired by observer participation in quantum theory. Transcending the classical notion of probabilities, it is proposed and demonstrated that all probabilities may be fundamentally quantum mechanical in the sense that they may all be derived from the corresponding amplitudes. The classical coin-toss and the quantum double slit interference experiments are discussed as illustrative prototype e...

  4. Pre-Aggregation with Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    2006-01-01

    Motivated by the increasing need to analyze complex, uncertain multidimensional data this paper proposes probabilistic OLAP queries that are computed using probability distributions rather than atomic values. The paper describes how to create probability distributions from base data, and how the...... motivated with a real-world case study, based on our collaboration with a leading Danish vendor of location-based services. This paper is the first to consider the approximate processing of probabilistic OLAP queries over probability distributions....

  5. Lagrangian Probability Distributions of Turbulent Flows

    OpenAIRE

    Friedrich, R.

    2002-01-01

    We outline a statistical theory of turbulence based on the Lagrangian formulation of fluid motion. We derive a hierarchy of evolution equations for Lagrangian N-point probability distributions as well as a functional equation for a suitably defined probability functional which is the analog of Hopf's functional equation. Furthermore, we adress the derivation of a generalized Fokker-Plank equation for the joint velocity - position probability density of N fluid particles.

  6. Quantum Statistical Mechanics. III. Equilibrium Probability

    OpenAIRE

    Attard, Phil

    2014-01-01

    Given are a first principles derivation and formulation of the probabilistic concepts that underly equilibrium quantum statistical mechanics. The transition to non-equilibrium probability is traversed briefly.

  7. Some New Results on Transition Probability

    Institute of Scientific and Technical Information of China (English)

    Yu Quan XIE

    2008-01-01

    In this paper, we study the basic properties of stationary transition probability of Markov processes on a general measurable space (E, ε), such as the continuity, maximum probability, zero point, positive probability set standardization, and obtain a series of important results such as Continuity Theorem, Representation Theorem, Levy Theorem and so on. These results are very useful for us to study stationary tri-point transition probability on a general measurable space (E, ε). Our main tools such as Egoroff's Theorem, Vitali-Hahn-Saks's Theorem and the theory of atomic set and well-posedness of measure are also very interesting and fashionable.

  8. Time and probability in quantum cosmology

    Energy Technology Data Exchange (ETDEWEB)

    Greensite, J. (San Francisco State Univ., CA (USA). Dept. of Physics and Astronomy)

    1990-10-01

    A time function, an exactly conserved probability measure, and a time-evolution equation (related to the Wheeler-DeWitt equation) are proposed for quantum cosmology. The time-integral of the probability measure is the measure proposed by Hawking and Page. The evolution equation reduces to the Schroedinger equation, and the probability measure to the Born measure, in the WKB approximation. The existence of this 'Schroedinger limit', which involves a cancellation of time-dependencies in the probability density between the WKB prefactor and integration measure, is a consequence of Laplacian factor ordering in the Wheeler-DeWitt equation. (orig.).

  9. Real analysis and probability solutions to problems

    CERN Document Server

    Ash, Robert P

    1972-01-01

    Real Analysis and Probability: Solutions to Problems presents solutions to problems in real analysis and probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability; the interplay between measure theory and topology; conditional probability and expectation; the central limit theorem; and strong laws of large numbers in terms of martingale theory.Comprised of eight chapters, this volume begins with problems and solutions for the theory of measure and integration, followed by various applications of the basic integration theory.

  10. Bayesian logistic betting strategy against probability forecasting

    CERN Document Server

    Kumon, Masayuki; Takemura, Akimichi; Takeuchi, Kei

    2012-01-01

    We propose a betting strategy based on Bayesian logistic regression modeling for the probability forecasting game in the framework of game-theoretic probability by Shafer and Vovk (2001). We prove some results concerning the strong law of large numbers in the probability forecasting game with side information based on our strategy. We also apply our strategy for assessing the quality of probability forecasting by the Japan Meteorological Agency. We find that our strategy beats the agency by exploiting its tendency of avoiding clear-cut forecasts.

  11. Entailment in Probability of Thresholded Generalizations

    OpenAIRE

    Bamber, Donald

    2013-01-01

    A nonmonotonic logic of thresholded generalizations is presented. Given propositions A and B from a language L and a positive integer k, the thresholded generalization A=>B{k} means that the conditional probability P(B|A) falls short of one by no more than c*d^k. A two-level probability structure is defined. At the lower level, a model is defined to be a probability function on L. At the upper level, there is a probability distribution over models. A definition is given of what it means for a...

  12. Advantages of the probability amplitude over the probability density in quantum mechanics

    OpenAIRE

    Kurihara, Yoshimasa; Quach, Nhi My Uyen

    2013-01-01

    We discuss reasons why a probability amplitude, which becomes a probability density after squaring, is considered as one of the most basic ingredients of quantum mechanics. First, the Heisenberg/Schrodinger equation, an equation of motion in quantum mechanics, describes a time evolution of the probability amplitude rather than of a probability density. There may be reasons why dynamics of a physical system are described by amplitude. In order to investigate one role of the probability amplitu...

  13. Incorporating medical interventions into carrier probability estimation for genetic counseling

    Directory of Open Access Journals (Sweden)

    Katki Hormuzd A

    2007-03-01

    Full Text Available Abstract Background Mendelian models for predicting who may carry an inherited deleterious mutation of known disease genes based on family history are used in a variety of clinical and research activities. People presenting for genetic counseling are increasingly reporting risk-reducing medical interventions in their family histories because, recently, a slew of prophylactic interventions have become available for certain diseases. For example, oophorectomy reduces risk of breast and ovarian cancers, and is now increasingly being offered to women with family histories of breast and ovarian cancer. Mendelian models should account for medical interventions because interventions modify mutation penetrances and thus affect the carrier probability estimate. Methods We extend Mendelian models to account for medical interventions by accounting for post-intervention disease history through an extra factor that can be estimated from published studies of the effects of interventions. We apply our methods to incorporate oophorectomy into the BRCAPRO model, which predicts a woman's risk of carrying mutations in BRCA1 and BRCA2 based on her family history of breast and ovarian cancer. This new BRCAPRO is available for clinical use. Results We show that accounting for interventions undergone by family members can seriously affect the mutation carrier probability estimate, especially if the family member has lived many years post-intervention. We show that interventions have more impact on the carrier probability as the benefits of intervention differ more between carriers and non-carriers. Conclusion These findings imply that carrier probability estimates that do not account for medical interventions may be seriously misleading and could affect a clinician's recommendation about offering genetic testing. 
The BayesMendel software, which allows one to implement any Mendelian carrier probability model, has been extended to allow medical interventions, so future
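The core of the extension can be illustrated with a one-relative toy model: a risk-reducing intervention rescales penetrance, which changes how much an unaffected (or affected) history moves the carrier probability. This is a deliberately simplified single-person Bayes update, not the full BRCAPRO pedigree likelihood, and every number below is hypothetical.

```python
def carrier_probability(prior, penetrance_carrier, penetrance_noncarrier,
                        affected, intervention_rr=1.0):
    """Posterior probability of carrying a mutation given one person's
    disease status. An intervention is modeled crudely as a multiplier
    (intervention_rr) applied to both penetrances; toy model only."""
    pc = penetrance_carrier * intervention_rr
    pn = penetrance_noncarrier * intervention_rr
    like_carrier = pc if affected else 1.0 - pc
    like_noncarrier = pn if affected else 1.0 - pn
    num = like_carrier * prior
    return num / (num + like_noncarrier * (1.0 - prior))

# Hypothetical numbers: being unaffected after an intervention that
# halves risk is weaker evidence against carrying the mutation, so the
# carrier probability stays higher than without the intervention.
print(carrier_probability(0.1, 0.6, 0.1, affected=False))
print(carrier_probability(0.1, 0.6, 0.1, affected=False, intervention_rr=0.5))
```

This mirrors the paper's point: ignoring the intervention (first call) understates the carrier probability relative to accounting for it (second call), and the gap grows with years lived post-intervention and with the difference in benefit between carriers and non-carriers.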

  14. Simulations of Probabilities for Quantum Computing

    Science.gov (United States)

    Zak, M.

    1996-01-01

    It has been demonstrated that classical probabilities, and in particular a probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities and their link to quantum computations are discussed.

  15. Probability: A Matter of Life and Death

    Science.gov (United States)

    Hassani, Mehdi; Kippen, Rebecca; Mills, Terence

    2016-01-01

    Life tables are mathematical tables that document probabilities of dying and life expectancies at different ages in a society. Thus, the life table contains some essential features of the health of a population. Probability is often regarded as a difficult branch of mathematics. Life tables provide an interesting approach to introducing concepts…
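
    The life-table computation the record describes can be shown concretely. The age-specific death probabilities q_x below are invented for illustration; the recurrence and the curtate life-expectancy formula are the standard ones.

    ```python
    # Minimal life-table sketch: from hypothetical age-specific death
    # probabilities q_x, build the survivorship column l_x and the curtate
    # life expectancy at birth e_0.
    qx = [0.01, 0.002, 0.003, 0.01, 0.03, 0.08, 0.2, 0.4, 0.7, 1.0]  # made-up

    lx = [100000.0]                      # radix: cohort alive at age 0
    for q in qx:
        lx.append(lx[-1] * (1.0 - q))    # survivors to the next age

    # Curtate expectation of life at birth: sum of l_x for x >= 1, over l_0
    e0 = sum(lx[1:]) / lx[0]
    print(round(e0, 2))
    ```

    With q_x = 1.0 at the last age the cohort is extinguished, so the sum is finite by construction.
    
    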

  16. Probability of Failure in Random Vibration

    DEFF Research Database (Denmark)

    Nielsen, Søren R.K.; Sørensen, John Dalsgaard

    1988-01-01

    Close approximations to the first-passage probability of failure in random vibration can be obtained by integral equation methods. A simple relation exists between the first-passage probability density function and the distribution function for the time interval spent below a barrier before out-c...... well as for bimodal processes with two dominating frequencies in the structural response....

  17. Probability of Grounding and Collision Events

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup

    1996-01-01

    To quantify the risks involved in ship traffic, rational criteria for collision and grounding accidents are developed. This implies that probabilities as well as inherent consequences can be analysed and assessed. The present paper outlines a method for evaluation of the probability of ship...

  18. Probability of Grounding and Collision Events

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup

    1996-01-01

    To quantify the risks involved in ship traffic, rational criteria for collision and grounding accidents have to be developed. This implies that probabilities as well as inherent consequences have to be analyzed and assessed. The present notes outline a method for evaluation of the probability of...

  19. Average Transmission Probability of a Random Stack

    Science.gov (United States)

    Lu, Yin; Miniatura, Christian; Englert, Berthold-Georg

    2010-01-01

    The transmission through a stack of identical slabs that are separated by gaps with random widths is usually treated by calculating the average of the logarithm of the transmission probability. We show how to calculate the average of the transmission probability itself with the aid of a recurrence relation and derive analytical upper and lower…

  20. Probability Issues in without Replacement Sampling

    Science.gov (United States)

    Joarder, A. H.; Al-Sabah, W. S.

    2007-01-01

    Sampling without replacement is an important aspect in teaching conditional probabilities in elementary statistics courses. Different methods proposed in different texts for calculating probabilities of events in this context are reviewed and their relative merits and limitations in applications are pinpointed. An alternative representation of…
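
    Two of the representations the record compares can be shown side by side: chaining conditional probabilities versus hypergeometric counting. The urn composition below is an arbitrary example.

    ```python
    # P(both draws are red) when drawing 2 without replacement from an urn with
    # 5 red and 3 blue balls, computed two ways.
    from math import comb

    red, blue = 5, 3
    n = red + blue

    # Method 1: chained conditionals, P(R1) * P(R2 | R1)
    p_chain = (red / n) * ((red - 1) / (n - 1))

    # Method 2: hypergeometric counting, C(5,2) / C(8,2)
    p_count = comb(red, 2) / comb(n, 2)

    print(p_chain, p_count)   # the two representations agree: 5/14
    ```

    Both routes give 5/14; which one is easier to teach is exactly the kind of pedagogical question the record reviews.
    
    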

  1. Correlation as Probability of Common Descent.

    Science.gov (United States)

    Falk, Ruma; Well, Arnold D.

    1996-01-01

    One interpretation of the Pearson product-moment correlation ("r"), correlation as the probability of originating from common descent, important to the genetic measurement of inbreeding, is examined. The conditions under which "r" can be interpreted as the probability of "identity by descent" are specified, and the possibility of generalizing this…

  2. 47 CFR 1.1623 - Probability calculation.

    Science.gov (United States)

    2010-10-01

    ... 47 Telecommunication 1 2010-10-01 2010-10-01 false Probability calculation. 1.1623 Section 1.1623 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Random Selection Procedures for Mass Media Services General Procedures § 1.1623 Probability calculation. (a) All calculations shall...

  3. Stimulus Probability Effects in Absolute Identification

    Science.gov (United States)

    Kent, Christopher; Lamberts, Koen

    2016-01-01

    This study investigated the effect of stimulus presentation probability on accuracy and response times in an absolute identification task. Three schedules of presentation were used to investigate the interaction between presentation probability and stimulus position within the set. Data from individual participants indicated strong effects of…

  4. Recent Developments in Applied Probability and Statistics

    CERN Document Server

    Devroye, Luc; Kohler, Michael; Korn, Ralf

    2010-01-01

    This book presents surveys on recent developments in applied probability and statistics. The contributions include topics such as nonparametric regression and density estimation, option pricing, probabilistic methods for multivariate interpolation, robust graphical modelling and stochastic differential equations. Due to its broad coverage of different topics the book offers an excellent overview of recent developments in applied probability and statistics.

  5. Probability elements of the mathematical theory

    CERN Document Server

    Heathcote, C R

    2000-01-01

    Designed for students studying mathematical statistics and probability after completing a course in calculus and real variables, this text deals with basic notions of probability spaces, random variables, distribution functions and generating functions, as well as joint distributions and the convergence properties of sequences of random variables. Includes worked examples and over 250 exercises with solutions.

  6. An introduction to probability and stochastic processes

    CERN Document Server

    Melsa, James L

    2013-01-01

    Geared toward college seniors and first-year graduate students, this text is designed for a one-semester course in probability and stochastic processes. Topics covered in detail include probability theory, random variables and their functions, stochastic processes, linear system response to stochastic processes, Gaussian and Markov processes, and stochastic differential equations. 1973 edition.

  7. Multinomial mixture model with heterogeneous classification probabilities

    Science.gov (United States)

    Holland, M.D.; Gray, B.R.

    2011-01-01

    Royle and Link (Ecology 86(9):2505-2512, 2005) proposed an analytical method that allowed estimation of multinomial distribution parameters and classification probabilities from categorical data measured with error. While useful, we demonstrate algebraically and by simulations that this method yields biased multinomial parameter estimates when the probabilities of correct category classifications vary among sampling units. We address this shortcoming by treating these probabilities as logit-normal random variables within a Bayesian framework. We use Markov chain Monte Carlo to compute Bayes estimates from a simulated sample from the posterior distribution. Based on simulations, this elaborated Royle-Link model yields nearly unbiased estimates of multinomial and correct classification probability estimates when classification probabilities are allowed to vary according to the normal distribution on the logit scale or according to the Beta distribution. The method is illustrated using categorical submersed aquatic vegetation data. © 2010 Springer Science+Business Media, LLC.
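
    The basic distortion that motivates such models is easy to simulate. This sketch (with invented proportions and a deliberately simple uniform-error confusion structure, not the Royle-Link model) shows how classification error pulls naive multinomial estimates toward uniformity.

    ```python
    # Naive category proportions under classification error: units are drawn
    # from true_p, then misclassified uniformly with probability 1 - correct.
    import random

    random.seed(1)
    true_p = [0.7, 0.2, 0.1]      # true category probabilities (illustrative)
    correct = 0.8                 # chance a unit is classified correctly
    K = len(true_p)

    counts = [0] * K
    N = 200000
    for _ in range(N):
        k = random.choices(range(K), weights=true_p)[0]
        if random.random() >= correct:                          # misclassified:
            k = random.choice([j for j in range(K) if j != k])  # uniform error
        counts[k] += 1

    naive = [c / N for c in counts]
    # Expected observed proportions are 0.8*p_k + 0.2*(1 - p_k)/2, i.e.
    # roughly [0.59, 0.24, 0.17] -- pulled toward uniform from [0.7, 0.2, 0.1].
    print(naive)
    ```

    Correcting this bias requires modelling the classification probabilities, which is what the Bayesian elaboration in the record does when those probabilities also vary between sampling units.
    
    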

  8. Laboratory-Tutorial activities for teaching probability

    CERN Document Server

    Wittmann, M C; Morgan, J T; Feeley, Roger E.; Morgan, Jeffrey T.; Wittmann, Michael C.

    2006-01-01

    We report on the development of students' ideas of probability and probability density in a University of Maine laboratory-based general education physics course called Intuitive Quantum Physics. Students in the course are generally math phobic with unfavorable expectations about the nature of physics and their ability to do it. We describe a set of activities used to teach concepts of probability and probability density. Rudimentary knowledge of mechanics is needed for one activity, but otherwise the material requires no additional preparation. Extensions of the activities include relating probability density to potential energy graphs for certain "touchstone" examples. Students have difficulties learning the target concepts, such as comparing the ratio of time in a region to total time in all regions. Instead, they often focus on edge effects, pattern match to previously studied situations, reason about necessary but incomplete macroscopic elements of the system, use the gambler's fallacy, and use expectati...

  9. The enigma of probability and physics

    International Nuclear Information System (INIS)

    This volume contains a coherent exposition of the elements of two unique sciences: probabilistics (science of probability) and probabilistic physics (application of probabilistics to physics). Proceeding from a key methodological principle, it starts with the disclosure of the true content of probability and the interrelation between probability theory and experimental statistics. This makes it possible to introduce a proper order in all the sciences dealing with probability and, by conceiving the real content of statistical mechanics and quantum mechanics in particular, to construct both as two interconnected domains of probabilistic physics. Consistent theories of kinetics of physical transformations, decay processes, and intramolecular rearrangements are also outlined. The interrelation between the electromagnetic field, photons, and the theoretically discovered subatomic particle 'emon' is considered. Numerous internal imperfections of conventional probability theory, statistical physics, and quantum physics are exposed and removed - quantum physics no longer needs special interpretation. EPR, Bohm, and Bell paradoxes are easily resolved, among others. (Auth.)

  10. Quantum probability assignment limited by relativistic causality.

    Science.gov (United States)

    Han, Yeong Deok; Choi, Taeseung

    2016-01-01

    Quantum theory has nonlocal correlations, which bothered Einstein, but which were found to satisfy relativistic causality. Correlation for a shared quantum state manifests itself, in the standard quantum framework, through joint probability distributions that can be obtained by applying state reduction and the probability assignment rule known as the Born rule. Quantum correlations, which show nonlocality when the shared state is entangled, can be changed if we apply a different probability assignment rule. As a result, the amount of nonlocality in the quantum correlation will change. The issue is whether a change in the rule of quantum probability assignment breaks relativistic causality. We have shown that the Born rule on quantum measurement is derived by requiring the relativistic causality condition. This shows how relativistic causality limits the upper bound of quantum nonlocality through quantum probability assignment. PMID:26971717

  11. An Orientation Program for Clinical Adjunct Faculty.

    Science.gov (United States)

    Rice, Gwendolyn

    2016-01-01

    Having highly competent clinical faculty in an institution of higher learning is a prerequisite for graduating safe nurses in the future. The purpose of this project was to increase each clinical nurse's knowledge and skills for the new role of clinical adjunct nursing faculty. Successful implementation of this program will help promote consistency in effective job performance of clinical adjunct faculty and facilitate achievement of the projected goals and outcomes. This orientation program was presented in a one-day face-to-face session with twelve (12) adjunct faculty members, some tenured and others on the tenure track. These faculty members were hired by City Colleges of Chicago (CCC) School of Nursing Program at the Malcolm X College. Presentations were given by attendees with a lesson plan. Pre-test, post-test and evaluation forms were presented and it was agreed that an orientation program should be developed and presented to all newly hired clinical adjunct nursing faculty at CCC. PMID:26930766

  12. Angular anisotropy representation by probability tables

    International Nuclear Information System (INIS)

    In this paper, we improve point-wise or group-wise angular anisotropy representation by using probability tables. The starting point of this study was to give more flexibility (sensitivity analysis) and more accuracy (ray effect) to group-wise anisotropy representation by Dirac functions, independently introduced at CEA (Mao, 1998) and at IRSN (Le Cocq, 1998) ten years ago. Basing ourselves on our experience of cross-section description, acquired in CALENDF (Sublet et al., 2006), we introduce two kinds of moment based probability tables, Dirac (DPT) and Step-wise (SPT) Probability Tables where the angular probability distribution is respectively represented by Dirac functions or by a step-wise function. First, we show how we can improve equi-probable cosine representation of point-wise anisotropy by using step-wise probability tables. Then we show, by Monte Carlo techniques, how we can obtain a more accurate description of group-wise anisotropy than the one usually given by a finite expansion on a Legendre polynomial basis (that can induce negative values) and finally, we describe it by Dirac probability tables. This study is carried out in the framework of GALILEE project R and D activities (Coste-Delclaux, 2008). (authors)

  13. Uncertainty about probability: a decision analysis perspective

    Energy Technology Data Exchange (ETDEWEB)

    Howard, R.A.

    1988-03-01

    The issue of how to think about uncertainty about probability is framed and analyzed from the viewpoint of a decision analyst. The failure of nuclear power plants is used as an example. The key idea is to think of probability as describing a state of information on an uncertain event, and to pose the issue of uncertainty in this quantity as uncertainty about a number that would be definitive: it has the property that you would assign it as the probability if you knew it. Logical consistency requires that the probability to assign to a single occurrence in the absence of further information be the mean of the distribution of this definitive number, not the median as is sometimes suggested. Any decision that must be made without the benefit of further information must also be made using the mean of the definitive number's distribution. With this formulation, they find further that the probability of r occurrences in n exchangeable trials will depend on the first n moments of the definitive number's distribution. In making decisions, the expected value of clairvoyance on the occurrence of the event must be at least as great as that on the definitive number. If one of the events in question occurs, then the increase in probability of another such event is readily computed. This means, in terms of coin tossing, that unless one is absolutely sure of the fairness of a coin, seeing a head must increase the probability of heads, in distinction to usual thought. A numerical example for nuclear power shows that the failure of one plant of a group with a low probability of failure can significantly increase the probability that must be assigned to failure of a second plant in the group.
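
    The coin-tossing claim can be made concrete. As an illustrative assumption (not the paper's own example), let the "definitive number" — the coin's bias — follow a Beta distribution; the probability to assign to one toss is then the distribution's mean, and conditioning on a head strictly raises the probability of the next head.

    ```python
    # The "definitive number" view of an uncertain coin: bias ~ Beta(a, b).
    # The probability assigned to a single toss is the MEAN of that distribution.
    a, b = 2.0, 2.0                  # near-fair but uncertain coin (assumption)
    p_head = a / (a + b)             # mean of the definitive number: 0.5

    # Observing one head updates the bias distribution to Beta(a + 1, b),
    # so the probability assigned to another head strictly increases:
    p_head_after = (a + 1) / (a + 1 + b)
    print(p_head, p_head_after)      # 0.5 0.6
    ```

    Only if one were absolutely sure of fairness (a, b → ∞) would the update vanish, which is exactly the paper's point.
    
    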

  14. Uncertainty about probability: a decision analysis perspective

    International Nuclear Information System (INIS)

    The issue of how to think about uncertainty about probability is framed and analyzed from the viewpoint of a decision analyst. The failure of nuclear power plants is used as an example. The key idea is to think of probability as describing a state of information on an uncertain event, and to pose the issue of uncertainty in this quantity as uncertainty about a number that would be definitive: it has the property that you would assign it as the probability if you knew it. Logical consistency requires that the probability to assign to a single occurrence in the absence of further information be the mean of the distribution of this definitive number, not the median as is sometimes suggested. Any decision that must be made without the benefit of further information must also be made using the mean of the definitive number's distribution. With this formulation, they find further that the probability of r occurrences in n exchangeable trials will depend on the first n moments of the definitive number's distribution. In making decisions, the expected value of clairvoyance on the occurrence of the event must be at least as great as that on the definitive number. If one of the events in question occurs, then the increase in probability of another such event is readily computed. This means, in terms of coin tossing, that unless one is absolutely sure of the fairness of a coin, seeing a head must increase the probability of heads, in distinction to usual thought. A numerical example for nuclear power shows that the failure of one plant of a group with a low probability of failure can significantly increase the probability that must be assigned to failure of a second plant in the group

  15. Probability an introduction with statistical applications

    CERN Document Server

    Kinney, John J

    2014-01-01

    Praise for the First Edition: "This is a well-written and impressively presented introduction to probability and statistics. The text throughout is highly readable, and the author makes liberal use of graphs and diagrams to clarify the theory." - The Statistician. Thoroughly updated, Probability: An Introduction with Statistical Applications, Second Edition features a comprehensive exploration of statistical data analysis as an application of probability. The new edition provides an introduction to statistics with accessible coverage of reliability, acceptance sampling, confidence intervals, h

  16. Radiative lifetimes and atomic transition probabilities

    International Nuclear Information System (INIS)

    Radiative lifetimes and atomic transition probabilities have been measured for over 35 neutral and singly ionized species in the Wisconsin Atomic Transition Probabilities (WATP) Program since it began in 1980. Radiative lifetimes are measured using time-resolved laser-induced fluorescence of a slow atomic/ionic beam. These lifetimes are combined with branching fractions to yield absolute atomic transition probabilities for neutral and singly ionized species. The branching fractions are determined from emission spectra recorded using the 1.0 m Fourier-transform spectrometer at the National Solar Observatory. The current focus of the WATP Program is on the rare-earth elements, in particular Tm, Dy, and Ho

  17. Eliciting Subjective Probabilities with Binary Lotteries

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd

    2014-01-01

    We evaluate a binary lottery procedure for inducing risk neutral behavior in a subjective belief elicitation task. Prior research has shown this procedure to robustly induce risk neutrality when subjects are given a single risk task defined over objective probabilities. Drawing a sample from the...... same subject population, we find evidence that the binary lottery procedure also induces linear utility in a subjective probability elicitation task using the Quadratic Scoring Rule. We also show that the binary lottery procedure can induce direct revelation of subjective probabilities in subjects with...

  18. Handbook of probability theory and applications

    CERN Document Server

    Rudas, Tamas

    2008-01-01

    "This is a valuable reference guide for readers interested in gaining a basic understanding of probability theory or its applications in problem solving in the other disciplines." - CHOICE. Providing cutting-edge perspectives and real-world insights into the greater utility of probability and its applications, the Handbook of Probability offers an equal balance of theory and direct applications in a non-technical, yet comprehensive, format. Editor Tamás Rudas and the internationally-known contributors present the material in a manner so that researchers of vari

  19. Probability Distributions for a Surjective Unimodal Map

    Institute of Scientific and Technical Information of China (English)

    Hongyan SUN; Long WANG

    1996-01-01

    In this paper we show that the probability distributions for a surjective unimodal map can be classified into three types, δ function, asymmetric and symmetric type, by identifying the binary structures of its initial values. The Borel's normal number theorem is equivalent or prior to the Frobenius-Perron operator in analyzing the probability distributions for this kind of maps, and in particular we can constitute a multifractal probability distribution from the surjective tent map by selecting a non-Borel's normal number as the initial value.

  20. Are All Probabilities Fundamentally Quantum Mechanical?

    CERN Document Server

    Pradhan, Rajat Kumar

    2011-01-01

    The subjective and the objective aspects of probabilities are incorporated in a simple duality axiom inspired by observer participation in quantum theory. Transcending the classical notion of probabilities, it is proposed and demonstrated that all probabilities may be fundamentally quantum mechanical in the sense that they may all be derived from the corresponding amplitudes. The classical coin-toss and the quantum double-slit interference experiments are discussed as illustrative prototype examples. The absence of multi-order quantum interference effects in multiple-slit experiments and the experimental tests of complementarity in Wheeler's delayed-choice type experiments are explained by invoking the involvement of the observer.

  1. Miscorrection probability beyond the minimum distance

    OpenAIRE

    Cassuto, Yuval; Bruck, Jehoshua

    2004-01-01

    The miscorrection probability of a list decoder is the probability that the decoder will have at least one non-causal codeword in its decoding sphere. Evaluating this probability is important when using a list decoder as a conventional decoder, since in that case we require the list to contain at most one codeword for most of the errors. A lower bound on the miscorrection is the main result. The key ingredient in the proof is a new combinatorial upper bound on the list-size for a general q-ary...

  2. Size constrained unequal probability sampling with a non-integer sum of inclusion probabilities

    OpenAIRE

    Grafström, Anton; Qualité, Lionel; Tillé, Yves; Matei, Alina

    2012-01-01

    More than 50 methods have been developed to draw unequal probability samples with fixed sample size. All these methods require the sum of the inclusion probabilities to be an integer number. There are cases, however, where the sum of desired inclusion probabilities is not an integer. Then, classical algorithms for drawing samples cannot be directly applied. We present two methods to overcome the problem of sample selection with unequal inclusion probabilities when their sum is not an integer ...

  3. Choosing information variables for transition probabilities in a time-varying transition probability Markov switching model

    OpenAIRE

    Andrew J. Filardo

    1998-01-01

    This paper discusses a practical estimation issue for time-varying transition probability (TVTP) Markov switching models. Time-varying transition probabilities allow researchers to capture important economic behavior that may be missed using constant (or fixed) transition probabilities. Despite its use, Hamilton’s (1989) filtering method for estimating fixed transition probability Markov switching models may not apply to TVTP models. This paper provides a set of sufficient conditions to justi...

  4. Certainties and probabilities of the IPCC

    International Nuclear Information System (INIS)

    Based on an analysis of information about the climate evolution, simulations of a global warming and the snow coverage monitoring of Meteo-France, the IPCC presented its certainties and probabilities concerning the greenhouse effect. (A.L.B.)

  5. Asymmetry of the work probability distribution

    OpenAIRE

    Saha, Arnab; Bhattacharjee, J. K.

    2006-01-01

    We show, both analytically and numerically, that for a nonlinear system making a transition from one equilibrium state to another under the action of an external time dependent force, the work probability distribution is in general asymmetric.

  6. Transition Probability and the ESR Experiment

    Science.gov (United States)

    McBrierty, Vincent J.

    1974-01-01

    Discusses the use of a modified electron spin resonance apparatus to demonstrate some features of the expression for the transition probability per second between two energy levels. Applications to the third year laboratory program are suggested. (CC)

  7. Transition Probability Estimates for Reversible Markov Chains

    OpenAIRE

    Telcs, Andras

    2000-01-01

    This paper provides transition probability estimates of transient reversible Markov chains. The key condition of the result is the spatial symmetry and polynomial decay of the Green's function of the chain.

  8. Transition probabilities in superfluid He4

    International Nuclear Information System (INIS)

    The transition probabilities between various states of superfluid helium-4 are found by using the approximation method of Bogolyubov and making use of his canonical transformations for different states of transitions. (author)

  9. Modelling the probability of building fires

    Directory of Open Access Journals (Sweden)

    Vojtěch Barták

    2014-12-01

    Full Text Available Systematic spatial risk analysis plays a crucial role in preventing emergencies. In the Czech Republic, risk mapping is currently based on the risk accumulation principle, area vulnerability, and preparedness levels of Integrated Rescue System components. Expert estimates are used to determine risk levels for individual hazard types, while statistical modelling based on data from actual incidents and their possible causes is not used. Our model study, conducted in cooperation with the Fire Rescue Service of the Czech Republic within the Liberec and Hradec Králové regions, presents an analytical procedure leading to the creation of building fire probability maps based on recent incidents in the studied areas and on building parameters. In order to estimate the probability of building fires, a prediction model based on logistic regression was used. Probability of fire calculated by means of model parameters and attributes of specific buildings can subsequently be visualized in probability maps.
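
    The final step of such a pipeline is simple: a fitted logistic regression maps building attributes to a fire probability, which can then be drawn on a map. The coefficients and attributes below are invented for illustration; they are not the study's estimates.

    ```python
    # Sketch of a logistic-regression fire-probability model. The linear
    # predictor uses hypothetical coefficients and covariates (floor area,
    # building age, alarm present), not the fitted values from the study.
    import math

    def fire_probability(floor_area_m2, age_years, has_alarm):
        z = -6.0 + 0.002 * floor_area_m2 + 0.03 * age_years - 0.8 * has_alarm
        return 1.0 / (1.0 + math.exp(-z))   # logistic link -> probability in (0,1)

    # Per-building probabilities like these are what get visualized as a map.
    small_new_alarmed = fire_probability(300, 10, 1)
    large_old_unalarmed = fire_probability(2000, 80, 0)
    print(round(small_new_alarmed, 4), round(large_old_unalarmed, 4))
    ```

    The logistic link guarantees outputs in (0, 1), which is what makes the per-building scores directly interpretable as probabilities on the resulting map.
    
    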

  10. Probability and statistics with integrated software routines

    CERN Document Server

    Deep, Ronald

    2005-01-01

    Probability & Statistics with Integrated Software Routines is a calculus-based treatment of probability concurrent with and integrated with statistics through interactive, tailored software applications designed to enhance the phenomena of probability and statistics. The software programs make the book unique. The book comes with a CD containing the interactive software leading to the Statistical Genie. The student can issue commands repeatedly while making parameter changes to observe the effects. Computer programming is an excellent skill for problem solvers, involving design, prototyping, data gathering, testing, redesign, validating, etc., all wrapped up in the scientific method. See also: CD to accompany Probability and Stats with Integrated Software Routines (0123694698) * Incorporates more than 1,000 engaging problems with answers * Includes more than 300 solved examples * Uses varied problem solving methods

  11. Determinantal Probability: Basic Properties and Conjectures

    OpenAIRE

    Lyons, Russell

    2014-01-01

    We describe the fundamental constructions and properties of determinantal probability measures and point processes, giving streamlined proofs. We illustrate these with some important examples. We pose several general questions and conjectures.

  12. Encounter Probability of Individual Wave Height

    DEFF Research Database (Denmark)

    Liu, Z.; Burcharth, H. F.

    1998-01-01

    wave height corresponding to a certain exceedence probability within a structure lifetime (encounter probability), based on the statistical analysis of long-term extreme significant wave height. Then the design individual wave height is calculated as the expected maximum individual wave height...... associated with the design significant wave height, with the assumption that the individual wave heights follow the Rayleigh distribution. However, the exceedence probability of such a design individual wave height within the structure lifetime is unknown. The paper presents a method for the determination of...... the design individual wave height corresponding to an exceedence probability within the structure lifetime, given the long-term extreme significant wave height. The method can also be applied for estimation of the number of relatively large waves for fatigue analysis of constructions....
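
    The Rayleigh assumption mentioned in the abstract makes the encounter probability computable in closed form: P(H > h | Hs) = exp(-2 (h/Hs)^2), so the chance of exceeding a design height at least once among N waves follows directly. The significant wave height and wave count below are illustrative.

    ```python
    # Encounter probability of a design individual wave height under the
    # Rayleigh model for individual heights given the significant height Hs.
    import math

    def exceedance(h, hs):
        """P(individual wave height > h), Rayleigh model."""
        return math.exp(-2.0 * (h / hs) ** 2)

    def encounter_probability(h, hs, n_waves):
        """P(h exceeded at least once among n_waves independent waves)."""
        return 1.0 - (1.0 - exceedance(h, hs)) ** n_waves

    hs = 8.0                                       # illustrative Hs (m)
    n = 1000                                       # illustrative wave count
    hmax = hs * math.sqrt(math.log(n) / 2.0)       # expected max of ~n waves
    print(round(hmax, 2), round(encounter_probability(hmax, hs, n), 3))
    ```

    Note the abstract's point falls out of the arithmetic: the expected maximum of n waves has exceedance probability 1/n per wave, so its encounter probability over the whole record is about 1 - (1 - 1/n)^n ≈ 1 - 1/e ≈ 0.63, far from a small design exceedance level.
    
    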

  13. Representing Uncertainty by Probability and Possibility

    DEFF Research Database (Denmark)

    Uncertain parameters in modeling are usually represented by probability distributions reflecting either the objective uncertainty of the parameters or the subjective belief held by the model builder. This approach is particularly suited for representing the statistical nature or variance of...

  14. Weak Convergence of Probability Measures Revisited

    OpenAIRE

    Salinetti, G.; Wets, R. J.-B.

    1987-01-01

    The hypo-convergence of upper semicontinuous functions provides a natural framework for the study of the convergence of probability measures. This approach also yields some further characterizations of weak convergence and tightness.

  15. Stimulus probability effects in absolute identification.

    Science.gov (United States)

    Kent, Christopher; Lamberts, Koen

    2016-05-01

    This study investigated the effect of stimulus presentation probability on accuracy and response times in an absolute identification task. Three schedules of presentation were used to investigate the interaction between presentation probability and stimulus position within the set. Data from individual participants indicated strong effects of presentation probability on both proportion correct and response times. The effects were moderated by the ubiquitous stimulus position effect. The accuracy and response time data were predicted by an exemplar-based model of perceptual cognition (Kent & Lamberts, 2005). The bow in discriminability was also attenuated when presentation probability for middle items was relatively high, an effect that will constrain future model development. The study provides evidence for item-specific learning in absolute identification. Implications for other theories of absolute identification are discussed. (PsycINFO Database Record) PMID:26478959

  16. Probability of spent fuel transportation accidents

    Energy Technology Data Exchange (ETDEWEB)

    McClure, J. D.

    1981-07-01

    The transported volume of spent fuel, incident/accident experience and accident environment probabilities were reviewed in order to provide an estimate of spent fuel accident probabilities. In particular, the accident review assessed the accident experience for large casks of the type that could transport spent (irradiated) nuclear fuel. This review determined that since 1971, the beginning of official US Department of Transportation record keeping for accidents/incidents, there has been one spent fuel transportation accident. This information, coupled with estimated annual shipping volumes for spent fuel, indicated an estimated annual probability of a spent fuel transport accident of 5 × 10^-7 spent fuel accidents per mile. This is consistent with ordinary truck accident rates. A comparison of accident environments and regulatory test environments suggests that the probability of truck accidents exceeding the regulatory test for impact is approximately 10^-9 per mile.
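
    The per-mile rate converts to an annual figure by simple arithmetic: rate times annual shipment mileage gives an expected accident count, and a Poisson assumption (ours, not the report's) turns that into a probability of at least one accident. The mileage figure below is invented for illustration.

    ```python
    # Back-of-envelope reconstruction of the accident-rate arithmetic.
    import math

    rate_per_mile = 5e-7      # estimated spent-fuel accident rate (per mile)
    annual_miles = 1e5        # hypothetical total annual shipment miles

    expected_accidents_per_year = rate_per_mile * annual_miles
    # Probability of at least one accident in a year, under an assumed
    # Poisson model for accident occurrences:
    p_at_least_one = 1.0 - math.exp(-expected_accidents_per_year)
    print(expected_accidents_per_year, round(p_at_least_one, 4))
    ```

    For small expected counts the Poisson correction barely matters (1 - e^-x ≈ x), which is why such reports often quote the expected count and the probability interchangeably.
    
    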

  17. Measuring the impact of information literacy e-learning and in-class courses via pre-tests and post-test at the Faculty of Medicine, Masaryk University

    Directory of Open Access Journals (Sweden)

    Jiří Kratochvíl

    2014-10-01

    Full Text Available Introduction: This paper aims to evaluate the results of the assessment and comparison of the impact of information literacy in e-learning and in-class courses at the Faculty of Medicine, Masaryk University, Czech Republic. The objective herein is to show that e-learning can be as effective a method of teaching IL activities as in-class lessons. Methods: In the autumn of 2012 and the spring of 2013, a total of 159 medical students enrolled in the e-learning course and completed the required pre-tests and post-tests comprising 30 multiple-choice questions on information literacy topics; another 92 PhD students from in-class courses took the 22-question test. The pre-test and post-test scores along with the number of students who correctly answered the questions were counted and the overall percentage was calculated. The final outcome was the extent of knowledge increase and the number of students with correct answers, expressed in percentage. Results: On average, an increase in knowledge was recorded for 95.5% of the medical students and 92.5% of the PhD students; the remaining 4.5% of medical students and 7.5% of PhD students recorded lower scores in the post-test. As for the number of correct answers, the average results for the 22 questions shared between the study groups were as follows: 15 questions were answered correctly more often by medical students, 6 were answered correctly more often by PhD students, and only 1 question was correctly answered in the same average percentage by both groups. Discussion: The results point to the need for several key revisions. These include adding to both curricula an exercise on searching online for an article (Web of Science or Scopus) without full-text availability via a link service, while instructions on manually creating bibliographic references shall be added to the PhD course. Additional search examples shall be added to the study materials and video records of in...
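
    The pre-/post-test comparison described above reduces to counting, per student, whether the post-test score exceeds the pre-test score and expressing that count as a percentage. A minimal sketch with made-up scores (not the study's data):

```python
# Hedged sketch of the pre-/post-test comparison; the scores below are
# invented illustrative data, not the Masaryk University results.
pre  = [18, 22, 15, 25, 20]          # pre-test scores per student
post = [26, 27, 14, 29, 25]          # post-test scores for the same students

improved = sum(1 for a, b in zip(pre, post) if b > a)
pct_improved = 100 * improved / len(pre)
print(f"{pct_improved:.1f}% of students improved")  # 80.0% for this toy data
```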

  18. Subjective Probability and the Theory of Games

    OpenAIRE

    Kadane, Joseph B.; Patrick D. Larkey

    1982-01-01

    This paper explores some of the consequences of adopting a modern subjective view of probability for game theory. The consequences are substantial. The subjective view of probability clarifies the important distinction between normative and positive theorizing about behavior in games, a distinction that is often lost in the search for "solution concepts" which largely characterizes game theory since the work of von Neumann and Morgenstern. Many of the distinctions that appear important in con...

  19. A case concerning the improved transition probability

    OpenAIRE

    Tang, Jian; Wang, An Min

    2006-01-01

    As is well known, existing perturbation theory can be applied to calculations of energy, state and transition probability in many quantum systems. However, in our view there are different paths and methods by which its calculational precision and efficiency can be improved. According to an improved scheme of perturbation theory proposed by [An Min Wang, quant-ph/0611217], we reconsider the transition probability and perturbed energy for a Hydrogen atom in a constant magnetic field. We find the results obt...

  20. Atomic transition probabilities of neutral samarium

    International Nuclear Information System (INIS)

    Absolute atomic transition probabilities from a combination of new emission branching fraction measurements using Fourier transform spectrometer data with radiative lifetimes from recent laser induced fluorescence measurements are reported for 299 lines of the first spectrum of samarium (Sm i). Improved values for the upper and lower energy levels of these lines are also reported. Comparisons to published transition probabilities from earlier experiments show satisfactory and good agreement with two of the four published data sets. (paper)

  1. Validation of fluorescence transition probability calculations

    OpenAIRE

    M. G. Pia (INFN, Sezione di Genova); P. Saracco (INFN, Sezione di Genova); Manju Sudhakar (INFN, Sezione di Genova)

    2015-01-01

    A systematic and quantitative validation of the K and L shell X-ray transition probability calculations according to different theoretical methods has been performed against experimental data. This study is relevant to the optimization of data libraries used by software systems, namely Monte Carlo codes, dealing with X-ray fluorescence. The results support the adoption of transition probabilities calculated according to the Hartree-Fock approach, which manifest better agreement with experimen...

  2. Generalized couplings and convergence of transition probabilities

    OpenAIRE

    Kulik, Alexei; Scheutzow, Michael

    2015-01-01

    We provide sufficient conditions for the uniqueness of an invariant measure of a Markov process as well as for the weak convergence of transition probabilities to the invariant measure. Our conditions are formulated in terms of generalized couplings. We apply our results to several SPDEs for which unique ergodicity has been proven in a recent paper by Glatt-Holtz, Mattingly, and Richards and show that under essentially the same assumptions the weak convergence of transition probabilities actu...

  3. Semiclassical transition probabilities for interacting oscillators

    OpenAIRE

    Khlebnikov, S. Yu.

    1994-01-01

    Semiclassical transition probabilities characterize transfer of energy between "hard" and "soft" modes in various physical systems. We establish the boundary problem for singular euclidean solutions used to calculate such probabilities. Solutions are found numerically for a system of two interacting quartic oscillators. In the double-well case, we find numerical evidence that certain regular Minkowskian trajectories have approximate stopping points or, equivalently, are approximately pe...

  4. Country Default Probabilities: Assessing and Backtesting

    OpenAIRE

    Vogl, Konstantin; Maltritz, Dominik; Huschens, Stefan; Karmann, Alexander

    2006-01-01

    We address the problem how to estimate default probabilities for sovereign countries based on market data of traded debt. A structural Merton-type model is applied to a sample of emerging market and transition countries. In this context, only few and heterogeneous default probabilities are derived, which is problematic for backtesting. To deal with this problem, we construct likelihood ratio test statistics and quick backtesting procedures.

  5. Transition probability studies in 175Au

    OpenAIRE

    Grahn, Tuomas; Watkins, H.; Joss, David; Page, Robert; Carroll, R. J.; Dewald, A.; Greenlees, Paul; Hackstein, M.; Herzberg, Rolf-Dietmar; Jakobsson, Ulrika; Jones, Peter; Julin, Rauno; Juutinen, Sakari; Ketelhut, Steffen; Kröll, Th

    2013-01-01

    Transition probabilities have been measured between the low-lying yrast states in 175Au by employing the recoil distance Doppler-shift method combined with the selective recoil-decay tagging technique. Reduced transition probabilities and magnitudes of transition quadrupole moments have been extracted from measured lifetimes, allowing dramatic changes in nuclear structure within a low excitation-energy range to be probed. The transition quadrupole moment data are discussed in terms...

  6. Ruin Probability in Linear Time Series Model

    Institute of Scientific and Technical Information of China (English)

    ZHANG Lihong

    2005-01-01

    This paper analyzes a continuous time risk model in which the claim process is modeled by a linear time series model. The time is discretized stochastically using the times when claims occur; Doob's stopping time theorem and martingale inequalities are then used to obtain expressions for the ruin probability, as well as both exponential and non-exponential upper bounds for the ruin probability over an infinite time horizon. Numerical results are included to illustrate the accuracy of the non-exponential bound.
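
    The classical exponential upper bound of the kind this abstract refers to is the Lundberg inequality, psi(u) <= exp(-R*u). A minimal sketch, assuming a compound-Poisson model with exponential claims (for which the adjustment coefficient R has a closed form), not the paper's linear time series model:

```python
import math

# Hedged sketch of a Lundberg-type exponential upper bound on the ruin
# probability, psi(u) <= exp(-R * u). The compound-Poisson setup with
# exponential claims is an illustrative assumption, not the paper's model.
lam, mu, c = 1.0, 1.0, 1.5      # claim intensity, mean claim size, premium rate
R = 1.0 / mu - lam / c          # adjustment coefficient for exponential claims

def ruin_bound(u):
    """Exponential upper bound on the ruin probability at initial capital u."""
    return math.exp(-R * u)

print(ruin_bound(10.0))         # bound decreases with initial capital
```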

  7. Probability and statistics for computer science

    CERN Document Server

    Johnson, James L

    2011-01-01

    Comprehensive and thorough development of both probability and statistics for serious computer scientists; goal-oriented: "to present the mathematical analysis underlying probability results". Special emphases on simulation and discrete decision theory. Mathematically rich but self-contained text, at a gentle pace. Review of calculus and linear algebra in an appendix. Mathematical interludes (in each chapter) which examine mathematical techniques in the context of probabilistic or statistical importance. Numerous section exercises, summaries, historical notes, and Further Readings for reinforcement.

  8. Non-Gaussian Photon Probability Distribution

    International Nuclear Information System (INIS)

    This paper investigates the axiom that the photon's probability distribution is a Gaussian distribution. The Airy disc empirical evidence shows that the best fit, if not exact, distribution is a modified Gamma mΓ distribution (whose parameters are α = r, β = r/√(u)) in the plane orthogonal to the motion of the photon. This modified Gamma distribution is then used to reconstruct the probability distributions along the hypotenuse from the pinhole, arc from the pinhole, and a line parallel to photon motion. This reconstruction shows that the photon's probability distribution is not a Gaussian function. However, under certain conditions, the distribution can appear to be Normal, thereby accounting for the success of quantum mechanics. This modified Gamma distribution changes with the shape of objects around it and thus explains how the observer alters the observation. This property therefore places additional constraints on quantum entanglement experiments. This paper shows that photon interaction is a multi-phenomena effect consisting of the probability to interact Pi, the probabilistic function, and the ability to interact Ai, the electromagnetic function. Splitting the probability function Pi from the electromagnetic function Ai enables the investigation of photon behavior from a purely probabilistic Pi perspective. The Probabilistic Interaction Hypothesis is proposed as a consistent method for handling the two different phenomena, the probability function Pi and the ability to interact Ai, thus redefining radiation shielding, stealth or cloaking, and invisibility as different effects of a single phenomenon Pi of the photon probability distribution. Sub-wavelength photon behavior is successfully modeled as a multi-phenomena behavior. The Probabilistic Interaction Hypothesis provides a good fit to Otoshi's (1972) microwave shielding, Schurig et al. (2006) microwave cloaking, and Oulton et al. (2008) sub-wavelength confinement; thereby providing a strong case that

  9. Avoiding Negative Probabilities in Quantum Mechanics

    OpenAIRE

    2013-01-01

    As currently understood since its discovery, the bare Klein-Gordon theory consists of negative quantum probabilities which are considered to be physically meaningless if not outright obsolete. Despite this annoying setback, these negative probabilities are what led the great Paul Dirac in 1928 to the esoteric discovery of the Dirac Equation. The Dirac Equation led to one of the greatest advances in our understanding of the physical world. In this reading, we ask the seemingly senseless questi...

  10. Breakdown Point Theory for Implied Probability Bootstrap

    OpenAIRE

    Lorenzo Camponovo; Taisuke Otsu

    2011-01-01

    This paper studies robustness of bootstrap inference methods under moment conditions. In particular, we compare the uniform weight and implied probability bootstraps by analyzing behaviors of the bootstrap quantiles when outliers take arbitrarily large values, and derive the breakdown points for those bootstrap quantiles. The breakdown point properties characterize the situation where the implied probability bootstrap is more robust than the uniform weight bootstrap against outliers. Simulati...

  11. The Pauli equation for probability distributions

    International Nuclear Information System (INIS)

    The tomographic-probability distribution for a measurable coordinate and spin projection is introduced to describe quantum states as an alternative to the density matrix. An analogue of the Pauli equation for the spin-1/2 particle is obtained for such a probability distribution instead of the usual equation for the wavefunction. Examples of the tomographic description of Landau levels and coherent states of a charged particle moving in a constant magnetic field are presented. (author)

  12. The Pauli equation for probability distributions

    Energy Technology Data Exchange (ETDEWEB)

    Mancini, S. [INFM, Dipartimento di Fisica, Universita di Milano, Milan (Italy). E-mail: Stefano.Mancini@mi.infn.it; Man' ko, O.V. [P.N. Lebedev Physical Institute, Moscow (Russian Federation). E-mail: Olga.Manko@sci.lebedev.ru; Man' ko, V.I. [INFM, Dipartimento di Matematica e Fisica, Universita di Camerino, Camerino (Italy). E-mail: Vladimir.Manko@sci.lebedev.ru; Tombesi, P. [INFM, Dipartimento di Matematica e Fisica, Universita di Camerino, Camerino (Italy). E-mail: Paolo.Tombesi@campus.unicam.it

    2001-04-27

    The tomographic-probability distribution for a measurable coordinate and spin projection is introduced to describe quantum states as an alternative to the density matrix. An analogue of the Pauli equation for the spin-1/2 particle is obtained for such a probability distribution instead of the usual equation for the wavefunction. Examples of the tomographic description of Landau levels and coherent states of a charged particle moving in a constant magnetic field are presented. (author)

  13. Characteristic Functions over C*-Probability Spaces

    Institute of Scientific and Technical Information of China (English)

    王勤; 李绍宽

    2003-01-01

    Various properties of the characteristic functions of random variables in a non-commutative C*-probability space are studied in this paper. It turns out that the distributions of random variables are uniquely determined by their characteristic functions. By using the properties of characteristic functions, a central limit theorem for a sequence of independent identically distributed random variables in a C*-probability space is established as well.

  14. Failure probability of ceramic coil springs

    OpenAIRE

    Nohut, Serkan; Schneider, Gerold A.

    2009-01-01

    Ceramic springs are commercially available and a detailed reliability analysis of these components would be useful for their introduction in new applications. In this paper an analytical and a numerical analyses of the failure probability for coil springs under compression is presented. Based on analytically derived relationships and numerically calculated results, fitting functions for volume and surface flaws will be introduced which provide the prediction of the failure probability of cera...

  15. On the Robustness of Most Probable Explanations

    OpenAIRE

    Chan, Hei; Darwiche, Adnan

    2012-01-01

    In Bayesian networks, a Most Probable Explanation (MPE) is a complete variable instantiation with a highest probability given the current evidence. In this paper, we discuss the problem of finding robustness conditions of the MPE under single parameter changes. Specifically, we ask the question: How much change in a single network parameter can we afford to apply while keeping the MPE unchanged? We will describe a procedure, which is the first of its kind, that computes this answer for each p...

  16. The cumulative reaction probability as eigenvalue problem

    Science.gov (United States)

    Manthe, Uwe; Miller, William H.

    1993-09-01

    It is shown that the cumulative reaction probability for a chemical reaction can be expressed (absolutely rigorously) as N(E) = Σ_k p_k(E), where {p_k} are the eigenvalues of a certain Hermitian matrix (or operator). The eigenvalues {p_k} all lie between 0 and 1 and thus have the interpretation as probabilities, eigenreaction probabilities which may be thought of as the rigorous generalization of the transmission coefficients for the various states of the activated complex in transition state theory. The eigenreaction probabilities {p_k} can be determined by diagonalizing a matrix that is directly available from the Hamiltonian matrix itself. It is also shown how a very efficient iterative method can be used to determine the eigenreaction probabilities for problems that are too large for a direct diagonalization to be possible. The number of iterations required is much smaller than that of previous methods, approximately the number of eigenreaction probabilities that are significantly different from zero. All of these new ideas are illustrated by application to three model problems: transmission through a one-dimensional (Eckart potential) barrier, the collinear H+H2→H2+H reaction, and the three-dimensional version of this reaction for total angular momentum J=0.
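
    The relation N(E) = Σ_k p_k(E), with each eigenvalue p_k in [0, 1], can be illustrated numerically. The matrix below is a toy Hermitian example, not a physical reaction operator:

```python
import numpy as np

# Hedged sketch: N(E) = sum_k p_k(E), where the p_k are eigenvalues of a
# Hermitian "reaction probability" operator and each lies in [0, 1].
# The 2x2 matrix here is an invented toy example.
A = np.array([[0.6, 0.2],
              [0.2, 0.3]])               # real symmetric, hence Hermitian

p = np.linalg.eigvalsh(A)                # eigenvalues p_k, ascending order
assert np.all(p >= 0) and np.all(p <= 1) # interpretable as probabilities

N = p.sum()                              # cumulative reaction probability
print(N)                                 # equals the trace of A
```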

  17. The cumulative reaction probability as eigenvalue problem

    International Nuclear Information System (INIS)

    It is shown that the cumulative reaction probability for a chemical reaction can be expressed (absolutely rigorously) as N(E) = Σ_k p_k(E), where {p_k} are the eigenvalues of a certain Hermitian matrix (or operator). The eigenvalues {p_k} all lie between 0 and 1 and thus have the interpretation as probabilities, eigenreaction probabilities which may be thought of as the rigorous generalization of the transmission coefficients for the various states of the activated complex in transition state theory. The eigenreaction probabilities {p_k} can be determined by diagonalizing a matrix that is directly available from the Hamiltonian matrix itself. It is also shown how a very efficient iterative method can be used to determine the eigenreaction probabilities for problems that are too large for a direct diagonalization to be possible. The number of iterations required is much smaller than that of previous methods, approximately the number of eigenreaction probabilities that are significantly different from zero. All of these new ideas are illustrated by application to three model problems: transmission through a one-dimensional (Eckart potential) barrier, the collinear H+H2→H2+H reaction, and the three-dimensional version of this reaction for total angular momentum J=0

  18. Presmoothing the transition probabilities in the illness-death model

    OpenAIRE

    Amorim, Ana Paula de; De Uña-Álvarez, Jacobo; Meira-Machado, Luís

    2011-01-01

    Abstract One major goal in clinical applications of multi-state models is the estimation of transition probabilities. In a recent paper, Meira-Machado, de Uña-Álvarez and Cadarso-Suárez (2006) introduce a substitute for the Aalen-Johansen estimator in the case of a non-Markov illness-death model. The idea behind their estimator is to weight the data by the Kaplan-Meier weights pertaining to the distribution of the total survival time of the process. In this paper we propose a modi...

  19. Probability to retrieve testicular spermatozoa in azoospermic patients

    Institute of Scientific and Technical Information of China (English)

    H.-J.Glander; L.-C.Horn; W.Dorschner; U.Paasch; J.Kratzsch

    2000-01-01

    Aim: The probability of retrieving spermatozoa from testicular tissue for intracytoplasmic sperm injection into oocytes is of interest when counselling infertility patients. We investigated the relation of sperm retrieval to clinical data and histological pattern in testicular biopsies from azoospermic patients. Methods: In 264 testicular biopsies from 142 azoospermic patients, the testicular tissue was shredded to separate the spermatozoa, and histological semi-thin sections were evaluated using the Johnsen score. Results: The retrieval of spermatozoa correlated significantly ( P 18 U/L, testicular volume < 5 mL, mean Johnsen score < 5, and maximum Johnsen score < 7.

  20. Computing Earthquake Probabilities on Global Scales

    Science.gov (United States)

    Holliday, James R.; Graves, William R.; Rundle, John B.; Turcotte, Donald L.

    2016-03-01

    Large events in systems such as earthquakes, typhoons, market crashes, electricity grid blackouts, floods, droughts, wars and conflicts, and landslides can be unexpected and devastating. Events in many of these systems display frequency-size statistics that are power laws. Previously, we presented a new method for calculating probabilities for large events in systems such as these. This method counts the number of small events since the last large event and then converts this count into a probability by using a Weibull probability law. We applied this method to the calculation of large earthquake probabilities in California-Nevada, USA. In that study, we considered a fixed geographic region and assumed that all earthquakes within that region, large magnitudes as well as small, were perfectly correlated. In the present article, we extend this model to systems in which the events have a finite correlation length. We modify our previous results by employing the correlation function for near mean field systems having long-range interactions, an example of which is earthquakes and elastic interactions. We then construct an application of the method and show examples of computed earthquake probabilities.
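
    The count-to-probability conversion described above can be sketched with a Weibull cumulative distribution in the small-event count. The functional form and the parameters below are assumptions for illustration; the calibrated values are not given in the abstract:

```python
import math

# Hedged sketch: converting the number of small events since the last large
# event into a large-event probability via a Weibull law. The shape (beta)
# and scale (tau) parameters are invented illustrative values.
def large_event_probability(n_small, tau=100.0, beta=1.5):
    """Weibull CDF evaluated at the small-event count n_small."""
    return 1.0 - math.exp(-((n_small / tau) ** beta))

p = large_event_probability(50)
print(p)   # grows monotonically toward 1 as small events accumulate
```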

  1. Laboratory-tutorial activities for teaching probability

    Directory of Open Access Journals (Sweden)

    Roger E. Feeley

    2006-08-01

    Full Text Available We report on the development of students’ ideas of probability and probability density in a University of Maine laboratory-based general education physics course called Intuitive Quantum Physics. Students in the course are generally math phobic with unfavorable expectations about the nature of physics and their ability to do it. We describe a set of activities used to teach concepts of probability and probability density. Rudimentary knowledge of mechanics is needed for one activity, but otherwise the material requires no additional preparation. Extensions of the activities include relating probability density to potential energy graphs for certain “touchstone” examples. Students have difficulties learning the target concepts, such as comparing the ratio of time in a region to total time in all regions. Instead, they often focus on edge effects, pattern match to previously studied situations, reason about necessary but incomplete macroscopic elements of the system, use the gambler’s fallacy, and use expectations about ensemble results rather than expectation values to predict future events. We map the development of their thinking to provide examples of problems rather than evidence of a curriculum’s success.

  2. Approximation of Failure Probability Using Conditional Sampling

    Science.gov (United States)

    Giesy. Daniel P.; Crespo, Luis G.; Kenney, Sean P.

    2008-01-01

    In analyzing systems which depend on uncertain parameters, one technique is to partition the uncertain parameter domain into a failure set and its complement, and judge the quality of the system by estimating the probability of failure. If this is done by a sampling technique such as Monte Carlo and the probability of failure is small, accurate approximation can require so many sample points that the computational expense is prohibitive. Previous work of the authors has shown how to bound the failure event by sets of such simple geometry that their probabilities can be calculated analytically. In this paper, it is shown how to make use of these failure bounding sets and conditional sampling within them to substantially reduce the computational burden of approximating failure probability. It is also shown how the use of these sampling techniques improves the confidence intervals for the failure probability estimate for a given number of sample points and how they reduce the number of sample point analyses needed to achieve a given level of confidence.
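
    The baseline estimator that the paper improves upon — plain Monte Carlo estimation of a small failure probability — can be sketched as follows. The failure set (a standard normal parameter exceeding a threshold) is an invented toy example; the paper's bounding-set conditional sampling is not reproduced here:

```python
import random

# Hedged sketch of plain Monte Carlo failure-probability estimation.
# The uncertain parameter and failure set below are illustrative choices.
def mc_failure_probability(n_samples, threshold=2.5, seed=0):
    """Fraction of sampled parameter values falling in the failure set."""
    rng = random.Random(seed)
    failures = sum(1 for _ in range(n_samples)
                   if rng.gauss(0.0, 1.0) > threshold)
    return failures / n_samples

estimate = mc_failure_probability(100_000)
print(estimate)   # near the true tail probability of a standard normal
```

    The small estimate illustrates the paper's point: when the failure probability is small, very many samples are needed for a tight estimate, which motivates conditional sampling within failure-bounding sets.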

  3. On the Possibility of Assigning Probabilities to Singular Cases, or: Probability Is Subjective Too!

    Directory of Open Access Journals (Sweden)

    Mark R. Crovelli

    2009-06-01

    Full Text Available Both Ludwig von Mises and Richard von Mises claimed that numerical probability could not be legitimately applied to singular cases. This paper challenges this aspect of the von Mises brothers’ theory of probability. It is argued that their denial that numerical probability could be applied to singular cases was based solely upon Richard von Mises’ exceptionally restrictive definition of probability. This paper challenges Richard von Mises’ definition of probability by arguing that the definition of probability necessarily depends upon whether the world is governed by time-invariant causal laws. It is argued that if the world is governed by time-invariant causal laws, a subjective definition of probability must be adopted. It is further argued that the nature of human action and the relative frequency method for calculating numerical probabilities both presuppose that the world is indeed governed by time-invariant causal laws. It is finally argued that the subjective definition of probability undercuts the von Mises claim that numerical probability cannot legitimately be applied to singular, non-replicable cases.

  4. Economic choices reveal probability distortion in macaque monkeys

    OpenAIRE

    Stauffer, William R.; Lak, Armin; Bossaerts, Peter; Schultz, Wolfram

    2015-01-01

    Economic choices are largely determined by two principal elements, reward value (utility) and probability. Although nonlinear utility functions have been acknowledged for centuries, nonlinear probability weighting (probability distortion) was only recently recognized as a ubiquitous aspect of real-world choice behavior. Even when outcome probabilities are known and acknowledged, human decision makers often overweight low probability outcomes and underweight high probability outcomes. Whereas ...

  5. Consistent probabilities in loop quantum cosmology

    CERN Document Server

    Craig, David A

    2013-01-01

    A fundamental issue for any quantum cosmological theory is to specify how probabilities can be assigned to various quantum events or sequences of events such as the occurrence of singularities or bounces. In previous work, we have demonstrated how this issue can be successfully addressed within the consistent histories approach to quantum theory for Wheeler-DeWitt-quantized cosmological models. In this work, we generalize that analysis to the exactly solvable loop quantization of a spatially flat, homogeneous and isotropic cosmology sourced with a massless, minimally coupled scalar field known as sLQC. We provide an explicit, rigorous and complete decoherent histories formulation for this model and compute the probabilities for the occurrence of a quantum bounce vs. a singularity. Using the scalar field as an emergent internal time, we show for generic states that the probability for a singularity to occur in this model is zero, and that of a bounce is unity, complementing earlier studies of the expectation v...

  6. Python for probability, statistics, and machine learning

    CERN Document Server

    Unpingco, José

    2016-01-01

    This book covers the key ideas that link probability, statistics, and machine learning illustrated using Python modules in these areas. The entire text, including all the figures and numerical results, is reproducible using the Python codes and their associated Jupyter/IPython notebooks, which are provided as supplementary downloads. The author develops key intuitions in machine learning by working meaningful examples using multiple analytical methods and Python codes, thereby connecting theoretical concepts to concrete implementations. Modern Python modules like Pandas, Sympy, and Scikit-learn are applied to simulate and visualize important machine learning concepts like the bias/variance trade-off, cross-validation, and regularization. Many abstract mathematical ideas, such as convergence in probability theory, are developed and illustrated with numerical examples. This book is suitable for anyone with an undergraduate-level exposure to probability, statistics, or machine learning and with rudimentary knowl...

  7. Introduction to probability with statistical applications

    CERN Document Server

    Schay, Géza

    2016-01-01

    Now in its second edition, this textbook serves as an introduction to probability and statistics for non-mathematics majors who do not need the exhaustive detail and mathematical depth provided in more comprehensive treatments of the subject. The presentation covers the mathematical laws of random phenomena, including discrete and continuous random variables, expectation and variance, and common probability distributions such as the binomial, Poisson, and normal distributions. More classical examples such as Montmort's problem, the ballot problem, and Bertrand’s paradox are now included, along with applications such as the Maxwell-Boltzmann and Bose-Einstein distributions in physics. Key features in new edition: * 35 new exercises * Expanded section on the algebra of sets * Expanded chapters on probabilities to include more classical examples * New section on regression * Online instructors' manual containing solutions to all exercises

  8. Correlations and Non-Linear Probability Models

    DEFF Research Database (Denmark)

    Breen, Richard; Holm, Anders; Karlson, Kristian Bernt

    2014-01-01

    Although the parameters of logit and probit and other non-linear probability models are often explained and interpreted in relation to the regression coefficients of an underlying linear latent variable model, we argue that they may also be usefully interpreted in terms of the correlations between the dependent variable of the latent variable model and its predictor variables. We show how this correlation can be derived from the parameters of non-linear probability models, develop tests for the statistical significance of the derived correlation, and illustrate its usefulness in two applications. Under certain circumstances, which we explain, the derived correlation provides a way of overcoming the problems inherent in cross-sample comparisons of the parameters of non-linear probability models.
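
    One way to express the idea in this abstract: for a probit model with latent equation y* = b·x + e and e ~ N(0, 1), the latent correlation implied by the coefficient is corr(y*, x) = b·sd(x) / sqrt(b²·var(x) + 1). A minimal sketch with invented values; this is an illustration of the derivation, not the authors' code:

```python
import math

# Hedged sketch: latent correlation implied by a probit coefficient,
#   corr(y*, x) = b * sd(x) / sqrt(b^2 * var(x) + 1),
# where the latent error variance is fixed at 1 by the probit scale.
# The coefficient and predictor variance below are illustrative values.
b, var_x = 0.8, 2.0

corr = b * math.sqrt(var_x) / math.sqrt(b * b * var_x + 1.0)
print(corr)   # bounded in (-1, 1) by construction
```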

  9. Probability analysis of nuclear power plant hazards

    International Nuclear Information System (INIS)

    The probability analysis of risk, used for quantifying the risk of complex technological systems, especially nuclear power plants, is described. Risk is defined as the product of the probability of the occurrence of a dangerous event and the significance of its consequences. The process of the analysis may be divided into the stage of power plant analysis up to the point of release of harmful material into the environment (reliability analysis) and the stage of the analysis of the consequences of this release and the assessment of the risk. The sequence of operations is characterized for the individual stages. The tasks are listed which Czechoslovakia faces in the development of the probability analysis of risk, and the composition of the work team for coping with the task is recommended. (J.C.)

  10. Ignition probabilities for Compact Ignition Tokamak designs

    International Nuclear Information System (INIS)

    A global power balance code employing Monte Carlo techniques has been developed to study the "probability of ignition" and has been applied to several different configurations of the Compact Ignition Tokamak (CIT). Probability distributions for the critical physics parameters in the code were estimated using existing experimental data. This included a statistical evaluation of the uncertainty in extrapolating the energy confinement time. A substantial probability of ignition is predicted for CIT if peaked density profiles can be achieved or if one of the two higher plasma current configurations is employed. In other cases, values of the energy multiplication factor Q of order 10 are generally obtained. The Ignitor-U and ARIES designs are also examined briefly. Comparisons of our empirically based confinement assumptions with two theory-based transport models yield conflicting results. 41 refs., 11 figs

  11. Probabilities and Signalling in Quantum Field Theory

    CERN Document Server

    Dickinson, Robert; Millington, Peter

    2016-01-01

    We present an approach to computing probabilities in quantum field theory for a wide class of source-detector models. The approach works directly with probabilities and not with squared matrix elements, and the resulting probabilities can be written in terms of expectation values of nested commutators and anti-commutators. We present results that help in the evaluation of these, including an expression for the vacuum expectation values of general nestings of commutators and anti-commutators in scalar field theory. This approach allows one to see clearly how faster-than-light signalling is prevented, because it leads to a diagrammatic expansion in which the retarded propagator plays a prominent role. We illustrate the formalism using the simple case of the much-studied Fermi two-atom problem.

  12. Uncertainty the soul of modeling, probability & statistics

    CERN Document Server

    Briggs, William

    2016-01-01

    This book presents a philosophical approach to probability and probabilistic thinking, considering the underpinnings of probabilistic reasoning and modeling, which effectively underlie everything in data science. The ultimate goal is to call into question many standard tenets and lay the philosophical and probabilistic groundwork and infrastructure for statistical modeling. It is the first book devoted to the philosophy of data aimed at working scientists and calls for a new consideration in the practice of probability and statistics to eliminate what has been referred to as the "Cult of Statistical Significance". The book explains the philosophy of these ideas and not the mathematics, though there are a handful of mathematical examples. The topics are logically laid out, starting with basic philosophy as related to probability, statistics, and science, and stepping through the key probabilistic ideas and concepts, and ending with statistical models. Its jargon-free approach asserts that standard methods, suc...

  13. Sampling Quantum Nonlocal Correlations with High Probability

    Science.gov (United States)

    González-Guillén, C. E.; Jiménez, C. H.; Palazuelos, C.; Villanueva, I.

    2016-05-01

    It is well known that quantum correlations for bipartite dichotomic measurements are those of the form γ = (⟨u_i, v_j⟩), where the vectors u_i and v_j are in the unit ball of a real Hilbert space. In this work we study the probability of the nonlocal nature of these correlations as a function of α = m/n, where the vectors are sampled according to the Haar measure on the unit sphere of R^m. In particular, we prove the existence of an α_0 > 0 such that if α ≤ α_0, γ is nonlocal with probability tending to 1 as n → ∞, while for α > 2, γ is local with probability tending to 1 as n → ∞.
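
    The sampling step described above can be sketched directly: normalizing a standard Gaussian vector yields a Haar-uniform point on the unit sphere of R^m, and the correlation matrix has entries γ_ij = ⟨u_i, v_j⟩. Deciding nonlocality would require optimizing Bell functionals and is omitted; the dimensions used below are illustrative:

```python
import math
import random

def random_unit_vector(m, rng=random):
    """Haar-uniform point on the unit sphere of R^m:
    normalize a standard Gaussian sample."""
    g = [rng.gauss(0.0, 1.0) for _ in range(m)]
    norm = math.sqrt(sum(x * x for x in g))
    return [x / norm for x in g]

def sample_correlation_matrix(n, m, seed=0):
    """Sample gamma_ij = <u_i, v_j> with u_i, v_j uniform on S^{m-1}."""
    rng = random.Random(seed)
    u = [random_unit_vector(m, rng) for _ in range(n)]
    v = [random_unit_vector(m, rng) for _ in range(n)]
    return [[sum(a * b for a, b in zip(ui, vj)) for vj in v] for ui in u]

gamma = sample_correlation_matrix(n=4, m=8)
```

By the Cauchy-Schwarz inequality every entry lies in [-1, 1], as required for a valid correlation matrix.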

  14. EARLY HISTORY OF GEOMETRIC PROBABILITY AND STEREOLOGY

    Directory of Open Access Journals (Sweden)

    Magdalena Hykšová

    2012-03-01

    The paper provides an account of the history of geometric probability and stereology from the time of Newton to the early 20th century. It traces two parallel lines of development: on the one hand, the theory of geometric probability was formed with little attention paid to applications other than spatial chance games. On the other hand, practical rules for estimating area or volume fractions and other characteristics, easily deducible from geometric probability theory, were proposed without knowledge of this branch. Special attention is paid to the paper of J.-É. Barbier published in 1860, which contained the fundamental stereological formulas but remained almost unnoticed by both mathematicians and practitioners.

  15. Uncertainty relation and probability. Numerical illustration

    International Nuclear Information System (INIS)

    The uncertainty relation and the probability interpretation of quantum mechanics are intrinsically connected, as is evidenced by the evaluation of standard deviations. It is thus natural to ask if one can associate a very small uncertainty product of suitably sampled events with a very small probability. We have shown elsewhere that some examples of the evasion of the uncertainty relation noted in the past are in fact understood in this way. We here numerically illustrate that a very small uncertainty product is realized if one performs a suitable sampling of measured data that occur with a very small probability. We introduce a notion of cyclic measurements. It is also shown that our analysis is consistent with the Landau-Pollak-type uncertainty relation. It is suggested that the present analysis may help reconcile the contradicting views about the 'standard quantum limit' in the detection of gravitational waves. (author)

  16. High probability of disease in angina pectoris patients

    DEFF Research Database (Denmark)

    Høilund-Carlsen, Poul F.; Johansen, Allan; Vach, Werner;

    2007-01-01

    Cardiovascular Society grade 2 pain or higher (n=176) or high (higher than 85%) estimated pretest likelihood of disease (n=142). RESULTS: In the three groups, 34% to 39% of male patients and 65% to 69% of female patients had normal MPS, while 37% to 38% and 60% to 71%, respectively, had insignificant findings on CA. Of the patients who also had an abnormal at-rest ECG, 14% to 21% of men and 42% to 57% of women had normal MPS. Sex-related differences were statistically significant. CONCLUSIONS: Clinical prediction appears to be unreliable. Addition of at-rest ECG data results in some improvement, particularly in male patients, but it makes the high probability groups so small that the addition appears to be of limited clinical relevance.

  17. A structural model of intuitive probability

    CERN Document Server

    Dessalles, Jean-Louis

    2011-01-01

    Though the ability of human beings to deal with probabilities has been put into question, the assessment of rarity is a crucial competence underlying much of human decision-making and is pervasive in spontaneous narrative behaviour. This paper proposes a new model of rarity and randomness assessment, designed to be cognitively plausible. Intuitive randomness is defined as a function of structural complexity. It is thus possible to assign probability to events without being obliged to consider the set of alternatives. The model is tested on Lottery sequences and compared with subjects' preferences.

  18. Quantum measurements and Kolmogorovian probability theory

    CERN Document Server

    Slavnov, D A

    2003-01-01

    We establish connections between the requirement of measurability of a probability space and the principle of complementarity in quantum mechanics. It is shown that measurability of a probability space implies that the results of a quantum measurement depend not only on the properties of the quantum object under consideration, but also on the classical characteristics of the measuring device used. We show that if one takes the requirement of measurability into account in the quantum case, the Bell inequality does not follow from the hypothesis of the existence of an objective reality.

  19. Electric quadrupole transition probabilities for atomic lithium

    International Nuclear Information System (INIS)

    Electric quadrupole transition probabilities for atomic lithium have been calculated using the weakest bound electron potential model theory (WBEPMT). We have employed numerical non-relativistic Hartree–Fock wavefunctions for expectation values of radii and the necessary energy values have been taken from the compilation at NIST. The results obtained with the present method agree very well with the Coulomb approximation results given by Caves (1975). Moreover, electric quadrupole transition probability values not existing in the literature for some highly excited levels have been obtained using the WBEPMT

  20. Poisson spaces with a transition probability

    OpenAIRE

    Landsman, N. P.

    1997-01-01

    The common structure of the space of pure states $P$ of a classical or a quantum mechanical system is that of a Poisson space with a transition probability. This is a topological space equipped with a Poisson structure, as well as with a function $p: P \times P \to [0,1]$ with certain properties. The Poisson structure is connected with the transition probabilities through unitarity (in a specific formulation intrinsic to the given context). In classical mechanics, where $p(\rho,\sigma)=\delta_{\rho...

  1. Transition probability studies in 175Au

    International Nuclear Information System (INIS)

    Transition probabilities have been measured between the low-lying yrast states in 175Au by employing the recoil distance Doppler-shift method combined with the selective recoil-decay tagging technique. Reduced transition probabilities and magnitudes of transition quadrupole moments have been extracted from measured lifetimes, allowing dramatic changes in nuclear structure within a low excitation-energy range to be probed. The transition quadrupole moment data are discussed in terms of available systematics as a function of atomic number and aligned angular momentum.

  2. Lady luck the theory of probability

    CERN Document Server

    Weaver, Warren

    1982-01-01

    "Should I take my umbrella?" "Should I buy insurance?" "Which horse should I bet on?" Every day, in business, in love affairs, in forecasting the weather or the stock market, questions arise which cannot be answered by a simple "yes" or "no." Many of these questions involve probability. Probabilistic thinking is as crucially important in ordinary affairs as it is in the most abstruse realms of science. This book is the best nontechnical introduction to probability ever written. Its author, the late Dr. Warren Weaver, was a professor of mathematics, active in the Rockefeller and Sloa

  3. Probability, statistics, and decision for civil engineers

    CERN Document Server

    Benjamin, Jack R

    2014-01-01

    Designed as a primary text for civil engineering courses, as a supplementary text for courses in other areas, or for self-study by practicing engineers, this text covers the development of decision theory and the applications of probability within the field. Extensive use of examples and illustrations helps readers develop an in-depth appreciation for the theory's applications, which include strength of materials, soil mechanics, construction planning, and water-resource design. A focus on fundamentals includes such subjects as Bayesian statistical decision theory, subjective probability, and

  4. Fifty challenging problems in probability with solutions

    CERN Document Server

    Mosteller, Frederick

    1987-01-01

    Can you solve the problem of "The Unfair Subway"? Marvin gets off work at random times between 3 and 5 p.m. His mother lives uptown, his girlfriend downtown. He takes the first subway that comes in either direction and eats dinner with the one he is delivered to. His mother complains that he never comes to see her, but he says she has a 50-50 chance. He has had dinner with her twice in the last 20 working days. Explain. Marvin's adventures in probability are one of the fifty intriguing puzzles that illustrate both elementary and advanced aspects of probability, each problem designed to chall

  5. Probability densities and Lévy densities

    DEFF Research Database (Denmark)

    Barndorff-Nielsen, Ole Eiler

    For positive Lévy processes (i.e. subordinators) formulae are derived that express the probability density or the distribution function in terms of power series in time t. The applicability of the results to finance and to turbulence is briefly indicated.

  6. Probability groups as orbits of groups

    International Nuclear Information System (INIS)

    The set of double cosets of a group with respect to a subgroup and the set of orbits of a group with respect to a group of automorphisms have structures which can be studied as multigroups, hypergroups or Pasch geometries. When the subgroup or the group of automorphisms are finite, the multivalued products can be provided with some weightages forming so-called Probability Groups. It is shown in this paper that some abstract probability groups can be realized as orbit spaces of groups. (author)

  7. Quantum probability and quantum decision making

    CERN Document Server

    Yukalov, V I

    2016-01-01

    A rigorous general definition of quantum probability is given, which is valid for elementary events and for composite events, for operationally testable measurements as well as for inconclusive measurements, and also for non-commuting observables in addition to commutative observables. Our proposed definition of quantum probability makes it possible to describe quantum measurements and quantum decision making on the same common mathematical footing. Conditions are formulated for the case when quantum decision theory reduces to its classical counterpart and for the situation where the use of quantum decision theory is necessary.

  8. Harmonic analysis and the theory of probability

    CERN Document Server

    Bochner, Salomon

    2005-01-01

    Nineteenth-century studies of harmonic analysis were closely linked with the work of Joseph Fourier on the theory of heat and with that of P. S. Laplace on probability. During the 1920s, the Fourier transform developed into one of the most effective tools of modern probabilistic research; conversely, the demands of the probability theory stimulated further research into harmonic analysis.Mathematician Salomon Bochner wrote a pair of landmark books on the subject in the 1930s and 40s. In this volume, originally published in 1955, he adopts a more probabilistic view and emphasizes stochastic pro

  9. Duelling idiots and other probability puzzlers

    CERN Document Server

    Nahin, Paul J

    2002-01-01

    What are your chances of dying on your next flight, being called for jury duty, or winning the lottery? We all encounter probability problems in our everyday lives. In this collection of twenty-one puzzles, Paul Nahin challenges us to think creatively about the laws of probability as they apply in playful, sometimes deceptive, ways to a fascinating array of speculative situations. Games of Russian roulette, problems involving the accumulation of insects on flypaper, and strategies for determining the odds of the underdog winning the World Series all reveal intriguing dimensions to the worki

  10. What is probability? The importance of probability when dealing with technical risks

    International Nuclear Information System (INIS)

    The book handles the following themes: - different aspects in connection with the probability concept including the mathematical fundamentals, - the importance of the probability concepts for the estimation of the effects of various activities, - the link between risk and time and the utilisation of concepts for describing this link, - the application of the probability concept in various engineering fields, - complementary attempts for the probabilistic safety analysis of systems. figs., tabs., refs

  11. Direct Updating of an RNA Base-Pairing Probability Matrix with Marginal Probability Constraints

    OpenAIRE

    Hamada, Michiaki

    2012-01-01

    A base-pairing probability matrix (BPPM) stores the probabilities for every possible base pair in an RNA sequence and has been used in many algorithms in RNA informatics (e.g., RNA secondary structure prediction and motif search). In this study, we propose a novel algorithm to perform iterative updates of a given BPPM, satisfying marginal probability constraints that are (approximately) given by recently developed biochemical experiments, such as SHAPE, PAR, and FragSeq. The method is easily ...

  12. Reduced reward-related probability learning in schizophrenia patients

    Directory of Open Access Journals (Sweden)

    Yılmaz A

    2012-01-01

    Alpaslan Yilmaz (1,2), Fatma Simsek (2), Ali Saffet Gonul (2,3). (1) Department of Sport and Health, Physical Education and Sports College, Erciyes University, Kayseri, Turkey; (2) Department of Psychiatry, SoCAT Lab, Ege University School of Medicine, Bornova, Izmir, Turkey; (3) Department of Psychiatry and Behavioral Sciences, Mercer University School of Medicine, Macon, GA, USA. Abstract: Although it is known that individuals with schizophrenia demonstrate marked impairment in reinforcement learning, the details of this impairment are not known. The aim of this study was to test the hypothesis that reward-related probability learning is altered in schizophrenia patients. Twenty-five clinically stable schizophrenia patients and 25 age- and gender-matched controls participated in the study. A simple gambling paradigm was used in which five different cues were associated with different reward probabilities (50%, 67%, and 100%). Participants were asked to make their best guess about the reward probability of each cue. Compared with controls, patients had significant impairment in learning contingencies on the basis of reward-related feedback. The correlation analyses revealed that the impairment of patients partially correlated with the severity of negative symptoms as measured on the Positive and Negative Syndrome Scale, but was not related to antipsychotic dose. In conclusion, the present study showed that schizophrenia patients had impaired reward-based learning and that this was independent of their medication status. Keywords: reinforcement learning, reward, punishment, motivation
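
    The paradigm above pairs cues with fixed reward probabilities (50%, 67%, 100%). The abstract does not give a learning model; a common textbook choice for trial-by-trial probability learning is a delta-rule (Rescorla-Wagner style) update, sketched here under that assumption:

```python
import random

def delta_rule_learning(true_p, trials=2000, lr=0.1, seed=1):
    """Learn a cue's reward probability from binary feedback via the
    delta rule: estimate += lr * (outcome - estimate)."""
    rng = random.Random(seed)
    estimate = 0.5  # uninformative starting guess
    for _ in range(trials):
        outcome = 1.0 if rng.random() < true_p else 0.0
        estimate += lr * (outcome - estimate)
    return estimate

# Cue-reward contingencies from the paradigm described above.
estimates = {p: delta_rule_learning(p) for p in (0.50, 0.67, 1.00)}
```

With a constant learning rate the estimate never fully converges for stochastic cues; it fluctuates around the true probability, which is one reason intermediate contingencies (50%, 67%) are harder to learn than the deterministic 100% cue.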

  13. Quantum probabilities as Dempster-Shafer probabilities in the lattice of subspaces

    International Nuclear Information System (INIS)

    The orthocomplemented modular lattice of subspaces L[H(d)], of a quantum system with d-dimensional Hilbert space H(d), is considered. A generalized additivity relation which holds for Kolmogorov probabilities is violated by quantum probabilities in the full lattice L[H(d)] (it is only valid within the Boolean subalgebras of L[H(d)]). This suggests the use of more general (than Kolmogorov) probability theories, and here the Dempster-Shafer probability theory is adopted. An operator D(H1,H2), which quantifies deviations from Kolmogorov probability theory is introduced, and it is shown to be intimately related to the commutator of the projectors P(H1),P(H2), to the subspaces H1, H2. As an application, it is shown that the proof of the inequalities of Clauser, Horne, Shimony, and Holt for a system of two spin 1/2 particles is valid for Kolmogorov probabilities, but it is not valid for Dempster-Shafer probabilities. The violation of these inequalities in experiments supports the interpretation of quantum probabilities as Dempster-Shafer probabilities

  14. Quantum probabilities as Dempster-Shafer probabilities in the lattice of subspaces

    Energy Technology Data Exchange (ETDEWEB)

    Vourdas, A. [Department of Computing, University of Bradford, Bradford BD7 1DP (United Kingdom)

    2014-08-15

    The orthocomplemented modular lattice of subspaces L[H(d)], of a quantum system with d-dimensional Hilbert space H(d), is considered. A generalized additivity relation which holds for Kolmogorov probabilities is violated by quantum probabilities in the full lattice L[H(d)] (it is only valid within the Boolean subalgebras of L[H(d)]). This suggests the use of more general (than Kolmogorov) probability theories, and here the Dempster-Shafer probability theory is adopted. An operator D(H_1,H_2), which quantifies deviations from Kolmogorov probability theory is introduced, and it is shown to be intimately related to the commutator of the projectors P(H_1),P(H_2), to the subspaces H_1, H_2. As an application, it is shown that the proof of the inequalities of Clauser, Horne, Shimony, and Holt for a system of two spin 1/2 particles is valid for Kolmogorov probabilities, but it is not valid for Dempster-Shafer probabilities. The violation of these inequalities in experiments supports the interpretation of quantum probabilities as Dempster-Shafer probabilities.

  15. Quantum probabilities as Dempster-Shafer probabilities in the lattice of subspaces

    Science.gov (United States)

    Vourdas, A.

    2014-08-01

    The orthocomplemented modular lattice of subspaces L[H(d)], of a quantum system with d-dimensional Hilbert space H(d), is considered. A generalized additivity relation which holds for Kolmogorov probabilities is violated by quantum probabilities in the full lattice L[H(d)] (it is only valid within the Boolean subalgebras of L[H(d)]). This suggests the use of more general (than Kolmogorov) probability theories, and here the Dempster-Shafer probability theory is adopted. An operator D(H_1, H_2), which quantifies deviations from Kolmogorov probability theory is introduced, and it is shown to be intimately related to the commutator of the projectors P(H_1), P(H_2), to the subspaces H1, H2. As an application, it is shown that the proof of the inequalities of Clauser, Horne, Shimony, and Holt for a system of two spin 1/2 particles is valid for Kolmogorov probabilities, but it is not valid for Dempster-Shafer probabilities. The violation of these inequalities in experiments supports the interpretation of quantum probabilities as Dempster-Shafer probabilities.

  16. Quantum probabilities as Dempster-Shafer probabilities in the lattice of subspaces

    OpenAIRE

    Vourdas, A.

    2014-01-01

    The orthocomplemented modular lattice of subspaces L[H(d)], of a quantum system with d- dimensional Hilbert space H(d), is considered. A generalized additivity relation which holds for Kolmogorov probabilities, is violated by quantum probabilities in the full lattice L[H(d)] (it is only valid within the Boolean subalgebras of L[H(d)]). This suggests the use of more general (than Kolmogorov) probability theories, and here the Dempster-Shafer probability theory is adopted. An operator D(H1,H2),...

  17. Probable maximum floods: Making a collective judgment

    International Nuclear Information System (INIS)

    A critical review is presented of current procedures for estimation of the probable maximum flood (PMF). The historical development of the concept and the flaws in current PMF methodology are discussed. The probable maximum flood concept has been criticized by eminent hydrologists on the basis that it violates scientific principles, and has been questioned from a philosophical viewpoint, particularly with regard to the implications of a no-risk criterion. The PMF is not a probable maximum flood, and is less by an arbitrary amount. A more appropriate term would be 'conceivable catastrophic flood'. The methodology for estimating probable maximum precipitation is reasonably well defined and has to a certain extent been verified. The methodology for estimating PMF is not well defined and has not been verified. The use of the PMF concept primarily reflects a need for engineering expediency and does not meet the standards for scientific truth. As the PMF is an arbitrary concept, collective judgment is an important component of making PMF estimates. The Canadian Dam Safety Association should play a leading role in developing guidelines and standards. 18 refs

  18. Adiabatic transition probability for a tangential crossing

    OpenAIRE

    Watanabe, Takuya

    2006-01-01

    We consider a time-dependent Schrödinger equation whose Hamiltonian is a $2\\times 2$ real symmetric matrix. We study, using an exact WKB method, the adiabatic limit of the transition probability in the case where several complex eigenvalue crossing points accumulate to one real point.

  19. Markov Chains with Stochastically Stationary Transition Probabilities

    OpenAIRE

    Orey, Steven

    1991-01-01

    Markov chains on a countable state space are studied under the assumption that the transition probabilities $(P_n(x,y))$ constitute a stationary stochastic process. An introductory section exposing some basic results of Nawrotzki and Cogburn is followed by four sections of new results.

  20. Dynamic Estimation of Credit Rating Transition Probabilities

    OpenAIRE

    Berd, Arthur M.

    2009-01-01

    We present a continuous-time maximum likelihood estimation methodology for credit rating transition probabilities, taking into account the presence of censored data. We perform rolling estimates of the transition matrices with exponential time weighting with varying horizons and discuss the underlying dynamics of transition generator matrices in the long-term and short-term estimation horizons.
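
    The core of the estimation can be sketched in its simplest, unweighted form: the maximum likelihood estimate of an off-diagonal generator rate is the number of observed i-to-j transitions divided by the total time spent in state i. The exponential time weighting and censoring treatment described in the record are omitted, and the rating labels and trajectories below are illustrative:

```python
from collections import defaultdict

def estimate_generator(paths, states):
    """Unweighted MLE of continuous-time generator rates:
    lambda_ij = N_ij / T_i, where N_ij counts observed i->j transitions
    and T_i is the total holding time in state i.
    `paths` is a list of trajectories [(state, holding_time), ...]."""
    counts = defaultdict(float)
    holding = defaultdict(float)
    for path in paths:
        for k, (state, dt) in enumerate(path):
            holding[state] += dt
            if k + 1 < len(path):
                counts[(state, path[k + 1][0])] += 1.0
    gen = {i: {j: 0.0 for j in states} for i in states}
    for i in states:
        for j in states:
            if i != j and holding[i] > 0:
                gen[i][j] = counts[(i, j)] / holding[i]
        # Diagonal makes each row sum to zero, as a generator requires.
        gen[i][i] = -sum(gen[i][j] for j in states if j != i)
    return gen

# Two short illustrative rating histories (state, years held).
paths = [[("AA", 2.0), ("A", 1.0), ("AA", 3.0)],
         [("A", 4.0), ("BBB", 2.0)]]
G = estimate_generator(paths, ["AA", "A", "BBB"])
```

The transition matrix over a horizon t is then the matrix exponential exp(t G), which is where the "underlying dynamics of transition generator matrices" in the record come in.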

  1. Conditional probability on MV-algebras

    Czech Academy of Sciences Publication Activity Database

    Kroupa, Tomáš

    2005-01-01

    Roč. 149, č. 2 (2005), s. 369-381. ISSN 0165-0114 R&D Projects: GA AV ČR IAA2075302 Institutional research plan: CEZ:AV0Z10750506 Keywords : conditional probability * tribe * MV-algebra Subject RIV: BA - General Mathematics Impact factor: 1.039, year: 2005

  2. Learning a Probability Distribution Efficiently and Reliably

    Science.gov (United States)

    Laird, Philip; Gamble, Evan

    1988-01-01

    A new algorithm, called the CDF-Inversion Algorithm, is described. Using it, one can efficiently learn a probability distribution over a finite set to a specified accuracy and confidence. The algorithm can be extended to learn joint distributions over a vector space. Some implementation results are described.
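
    The paper's CDF-Inversion Algorithm is not reproduced in the record; the sketch below shows the two generic ingredients it builds on, estimating a finite distribution from samples and drawing from it by inverting the empirical CDF:

```python
import bisect
import random
from collections import Counter

def learn_distribution(samples):
    """Empirical probability estimate over a finite set of outcomes."""
    counts = Counter(samples)
    n = len(samples)
    return {x: c / n for x, c in sorted(counts.items())}

def make_sampler(dist, seed=0):
    """Return a closure that draws from `dist` by inverting its CDF:
    a uniform variate is located among the cumulative probabilities."""
    values = sorted(dist)
    cdf, acc = [], 0.0
    for v in values:
        acc += dist[v]
        cdf.append(acc)
    rng = random.Random(seed)
    # min() guards against float round-off in the last CDF entry.
    return lambda: values[min(bisect.bisect_left(cdf, rng.random()),
                              len(values) - 1)]

dist = learn_distribution([1, 1, 2, 3, 3, 3, 3, 4])
draw = make_sampler(dist)
```

The accuracy/confidence guarantees the abstract mentions come from how many samples are used to build `dist`, not from the inversion step itself.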

  3. Partially Specified Probabilities: Decisions and Games

    OpenAIRE

    Ehud Lehrer

    2012-01-01

    The paper develops a theory of decision making based on partially specified probabilities. It takes an axiomatic approach using Anscombe and Aumann's (1963) setting, and is based on the concave integral for capacities. This theory is then expanded to interactive models in order to extend Nash equilibrium by introducing the concept of partially specified equilibrium. (JEL C70, D81, D83)

  4. The Britannica Guide to Statistics and Probability

    CERN Document Server

    2011-01-01

    By observing patterns and repeated behaviors, mathematicians have devised calculations to significantly reduce human potential for error. This volume introduces the historical and mathematical basis of statistics and probability as well as their application to everyday situations. Readers will also meet the prominent thinkers who advanced the field and established a numerical basis for prediction

  5. Probability & Perception: The Representativeness Heuristic in Action

    Science.gov (United States)

    Lu, Yun; Vasko, Francis J.; Drummond, Trevor J.; Vasko, Lisa E.

    2014-01-01

    If the prospective students of probability lack a background in mathematical proofs, hands-on classroom activities may work well to help them to learn to analyze problems correctly. For example, students may physically roll a die twice to count and compare the frequency of the sequences. Tools such as graphing calculators or Microsoft Excel®…
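
    The die-rolling activity described above is easy to simulate: every ordered pair of faces has probability 1/36, even though a double "feels" less random than a mixed pair, which is exactly the representativeness heuristic at work. A minimal sketch:

```python
import random
from collections import Counter

def roll_pairs(n, seed=0):
    """Simulate n independent double rolls of a fair die."""
    rng = random.Random(seed)
    return [(rng.randint(1, 6), rng.randint(1, 6)) for _ in range(n)]

n = 360_000
freq = Counter(roll_pairs(n))

# Both specific sequences have probability 1/36, despite (6, 6)
# looking "less random" than a mixed pair such as (2, 5).
p_doubles = freq[(6, 6)] / n
p_mixed = freq[(2, 5)] / n
```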

  6. Proposal for Modified Damage Probability Distribution Functions

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup; Hansen, Peter Friis

    1996-01-01

    Immediately following the Estonia disaster, the Nordic countries established a project entitled "Safety of Passenger/RoRo Vessels". As part of this project, the present proposal for modified damage stability probability distribution functions has been developed and submitted to "Sub-committee on

  7. Probability Theories and the Justification of Theism

    OpenAIRE

    Portugal, Agnaldo Cuoco

    2003-01-01

    In the present paper I intend to analyse, criticise and suggest an alternative to Richard Swinburne's use of Bayes's theorem to justify the belief that there is a God. Swinburne's contribution here lies in the scope of his project and the interpretation he adopts for Bayes's formula, a very important theorem of the probability calculus.
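
    Bayes's theorem, the formula at the center of Swinburne's argument, updates the probability of a hypothesis H given evidence E as P(H|E) = P(E|H)P(H) / [P(E|H)P(H) + P(E|not-H)P(not-H)]. A minimal numeric sketch with illustrative numbers (not Swinburne's):

```python
def bayes_posterior(prior, likelihood_h, likelihood_not_h):
    """P(H|E) via Bayes's theorem for a binary hypothesis H."""
    evidence = likelihood_h * prior + likelihood_not_h * (1.0 - prior)
    return likelihood_h * prior / evidence

# Illustrative numbers only: an even prior, with the evidence judged
# four times as likely under H as under its negation.
posterior = bayes_posterior(prior=0.5, likelihood_h=0.8, likelihood_not_h=0.2)
```

With an even prior, the posterior is driven entirely by the likelihood ratio, which is why so much of the debate around such arguments concerns how those likelihoods are assigned.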

  8. Probability & Statistics: Modular Learning Exercises. Teacher Edition

    Science.gov (United States)

    Actuarial Foundation, 2012

    2012-01-01

    The purpose of these modules is to provide an introduction to the world of probability and statistics to accelerated mathematics students at the high school level. The modules also introduce students to real world math concepts and problems that property and casualty actuaries come across in their work. They are designed to be used by teachers and…

  9. Probability & Statistics: Modular Learning Exercises. Student Edition

    Science.gov (United States)

    Actuarial Foundation, 2012

    2012-01-01

    The purpose of these modules is to provide an introduction to the world of probability and statistics to accelerated mathematics students at the high school level. The materials are centered on the fictional town of Happy Shores, a coastal community which is at risk for hurricanes. Actuaries at an insurance company figure out the risks and…

  10. Laplace's 1774 Memoir on Inverse Probability

    OpenAIRE

    Stigler, Stephen M.

    1986-01-01

    Laplace's first major article on mathematical statistics was published in 1774. It is arguably the most influential article in this field to appear before 1800, being the first widely read presentation of inverse probability and its application to both binomial and location parameter estimation. After a brief introduction, an English translation of this epochal memoir is given.

  11. A priori probabilities of separable quantum states

    International Nuclear Information System (INIS)

    Zyczkowski, Horodecki, Sanpera and Lewenstein (ZHSL) recently proposed a 'natural measure' on the N-dimensional quantum systems, but expressed surprise when it led them to conclude that for N=2x2, disentangled (separable) systems are more probable (0.632±0.002) in nature than entangled ones. We contend, however, that ZHSL's (rejected) intuition has, in fact, a sound theoretical basis, and that the a priori probability of disentangled 2x2 systems should more properly be viewed as (considerably) less than 0.5. We arrive at this conclusion in two quite distinct ways, the first based on classical and the second, quantum considerations. Both approaches, however, replace (in whole or part) the ZHSL (product) measure by ones based on the volume elements of monotone metrics, which in the classical case amounts to adopting the Jeffreys' prior of Bayesian theory. Only the quantum-theoretic analysis - which yields the smallest probabilities of disentanglement - uses the minimum number of parameters possible, that is N^2-1, as opposed to N^2+N-1 (although this 'over-parametrization', as recently indicated by Byrd, should be avoidable). However, despite substantial computation, we are not able to obtain precise estimates of these probabilities and the need for additional (possibly supercomputer) analyses is indicated - particularly so for higher-dimensional quantum systems (such as the 2x3 ones, which we also study here). (author)

  12. Rethinking the learning of belief network probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Musick, R.

    1996-03-01

    Belief networks are a powerful tool for knowledge discovery that provide concise, understandable probabilistic models of data. There are methods grounded in probability theory to incrementally update the relationships described by the belief network when new information is seen, to perform complex inferences over any set of variables in the data, to incorporate domain expertise and prior knowledge into the model, and to automatically learn the model from data. This paper concentrates on part of the belief network induction problem, that of learning the quantitative structure (the conditional probabilities), given the qualitative structure. In particular, the current practice of rote learning the probabilities in belief networks can be significantly improved upon. We advance the idea of applying any learning algorithm to the task of conditional probability learning in belief networks, discuss potential benefits, and show results of applying neural networks and other algorithms to a medium sized car insurance belief network. The results demonstrate from 10 to 100% improvements in model error rates over the current approaches.
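
    The "rote learning" baseline the paper improves on amounts to count-based estimation of each conditional probability table. A minimal sketch with add-alpha (Laplace) smoothing, on hypothetical weather data; the paper's neural-network alternative is not reproduced here:

```python
from collections import defaultdict

def learn_cpt(records, child, parents, child_values, alpha=1.0):
    """Estimate P(child | parents) from data records (dicts) by counting,
    with add-alpha (Laplace) smoothing to avoid zero probabilities."""
    joint = defaultdict(float)
    marginal = defaultdict(float)
    for r in records:
        ctx = tuple(r[p] for p in parents)
        joint[(ctx, r[child])] += 1.0
        marginal[ctx] += 1.0

    def p(child_value, **parent_values):
        ctx = tuple(parent_values[name] for name in parents)
        num = joint[(ctx, child_value)] + alpha
        den = marginal[ctx] + alpha * len(child_values)
        return num / den

    return p

# Hypothetical data: does it rain (parent), is the grass wet (child)?
data = [{"rain": 1, "wet": 1}, {"rain": 1, "wet": 1},
        {"rain": 1, "wet": 0}, {"rain": 0, "wet": 0}]
p_wet = learn_cpt(data, child="wet", parents=["rain"], child_values=[0, 1])
```

With smoothing, parent configurations never seen in the data still receive a proper (uniform) distribution, one of the weaknesses of pure rote counting that the paper's learning-based approach targets.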

  13. The Pauli Equation for Probability Distributions

    OpenAIRE

    Mancini, S.; Man'ko, O. V.; Man'ko, V. I.; Tombesi, P.

    2000-01-01

    The "marginal" distributions for measurable coordinate and spin projection are introduced. Then, the analog of the Pauli equation for a spin-1/2 particle is obtained for such probability distributions instead of the usual wave functions. That allows a classical-like approach to quantum mechanics. Some illuminating examples are presented.

  14. The Pauli Equation for Probability Distributions

    CERN Document Server

    Mancini, S; Man'ko, V I; Tombesi, P

    2001-01-01

    The "marginal" distributions for measurable coordinate and spin projection are introduced. Then, the analog of the Pauli equation for a spin-1/2 particle is obtained for such probability distributions instead of the usual wave functions. That allows a classical-like approach to quantum mechanics. Some illuminating examples are presented.

  15. Five-Parameter Bivariate Probability Distribution

    Science.gov (United States)

    Tubbs, J.; Brewer, D.; Smith, O. W.

    1986-01-01

    NASA technical memorandum presents four papers about five-parameter bivariate gamma class of probability distributions. With some overlap of subject matter, papers address different aspects of theories of these distributions and use in forming statistical models of such phenomena as wind gusts. Provides acceptable results for defining constraints in problems designing aircraft and spacecraft to withstand large wind-gust loads.

  16. Pade approximant calculations for neutron escape probability

    International Nuclear Information System (INIS)

    The neutron escape probability from a non-multiplying slab containing internal source is defined in terms of a functional relation for the scattering function for the diffuse reflection problem. The Pade approximant technique is used to get numerical results which compare with exact results. (author)

  17. Stable symmetric probability laws in quantum mechanics

    International Nuclear Information System (INIS)

    The aim of this note is to find all possible symmetric limit probability distribution operators for arbitrarily normed sequences of sums of independent pairs of canonical observables. The main, rather unexpected, result is that the ground states are the only such limit operators. (author)

  18. Asymptotic probability density functions in turbulence

    OpenAIRE

    Minotti, F. O.; Speranza, E.

    2007-01-01

    A formalism is presented to obtain closed evolution equations for asymptotic probability distribution functions of turbulence magnitudes. The formalism is derived for a generic evolution equation, so that the final result can be easily applied to rather general problems. Although the approximation involved cannot be ascertained a priori, we show that application of the formalism to well known problems gives the correct results.

  19. On the probability of being synchronizable

    OpenAIRE

    Berlinkov, Mikhail V.

    2013-01-01

    We prove that a random automaton with $n$ states and any fixed non-singleton alphabet is synchronizing with high probability. Moreover, we also prove that the convergence rate is exactly $1-\Theta(\frac{1}{n})$ as conjectured by Cameron \cite{CamConj} for the most interesting binary alphabet case.

  20. Probability in Action: The Red Traffic Light

    Science.gov (United States)

    Shanks, John A.

    2007-01-01

    Emphasis on problem solving in mathematics has gained considerable attention in recent years. While statistics teaching has always been problem driven, the same cannot be said for the teaching of probability where discrete examples involving coins and playing cards are often the norm. This article describes an application of simple probability…

  1. A real formula for transition probabilities

    Directory of Open Access Journals (Sweden)

    Alessandra Luati

    2007-10-01

    Transition probabilities between states in two-dimensional quantum systems are derived as functions of unit vectors in R³ instead of state vectors in C². This can be done once states and von Neumann measurements acting on C² are represented by means of vectors on the unit sphere of R³.

  2. Structure Functions Are Not Parton Probabilities

    International Nuclear Information System (INIS)

    The common view that structure functions measured in deep inelastic lepton scattering are determined by the probability distribution of quarks and gluons in the target is not correct. We show that the leading-twist cross section is affected by the rescattering of the struck quark in the target. This is consistent with the Glauber-Gribov interpretation of shadowing as a rescattering effect

  3. Spatial Probability Cuing and Right Hemisphere Damage

    Science.gov (United States)

    Shaqiri, Albulena; Anderson, Britt

    2012-01-01

    In this experiment we studied statistical learning, inter-trial priming, and visual attention. We assessed healthy controls and right brain damaged (RBD) patients with and without neglect, on a simple visual discrimination task designed to measure priming effects and probability learning. All participants showed a preserved priming effect for item…

  4. Applied probability models with optimization applications

    CERN Document Server

    Ross, Sheldon M

    1992-01-01

    Concise advanced-level introduction to stochastic processes that frequently arise in applied probability. Largely self-contained text covers Poisson process, renewal theory, Markov chains, inventory theory, Brownian motion and continuous time optimization models, much more. Problems and references at chapter ends. ""Excellent introduction."" - Journal of the American Statistical Association. Bibliography. 1970 edition.

  5. The Probability of Blindness in Patients Treated for Glaucoma

    Directory of Open Access Journals (Sweden)

    Li-Chun Chang

    2005-07-01

    Background: To investigate the risk factors and probability of blindness in patients treated for glaucoma. Methods: The study design was a retrospective, hospital-based, clinical chart review study. Medical records were reviewed from patients seen between January 2003 and December 2003 at the Kaohsiung Chang Gung Memorial Hospital eye clinic, who had been diagnosed with glaucoma in 1986 or later and who had been treated for at least 2 years for glaucoma. Results: A total of 186 charts were reviewed, which included 66 patients who were blind in at least one eye from glaucoma on presentation. A total of 172 patients and 290 eyes were followed up for a mean duration of 10.6 ± 4.67 years. Twenty-seven patients and 31 eyes developed blindness from glaucoma during follow-up. The Kaplan-Meier survival estimate at 16 years was 28.6% for glaucoma-related blindness in at least one eye. A worse visual field on presentation, older age, and poor compliance during therapy were significantly associated with the development of blindness. Glaucoma type, a gender difference, systemic disease, greater intraocular pressure fluctuation in the last year of therapy and blindness in one eye on presentation did not show a significant relationship with the rate of development of blindness. Conclusion: Blindness from treated glaucoma is considerable. Our results gave a 28.6% probability of blindness at 16 years in at least one eye. An older age, poor compliance and a worse visual field on presentation were significant risk factors.
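The 28.6% figure quoted above is a Kaplan-Meier survival estimate. A minimal sketch of that estimator on hypothetical right-censored follow-up data (the data below are invented for illustration, not the study's records):

```python
def kaplan_meier(data):
    """Kaplan-Meier estimate of the survival function.
    data: (time, event) pairs; event=1 if the endpoint (here, blindness)
    occurred at that time, 0 if the eye was censored then."""
    event_times = sorted({t for t, e in data if e == 1})
    s = 1.0
    curve = []
    for t in event_times:
        at_risk = sum(1 for ti, _ in data if ti >= t)
        events = sum(1 for ti, e in data if ti == t and e == 1)
        s *= 1.0 - events / at_risk          # multiply survival fractions
        curve.append((t, s))
    return curve

# Hypothetical follow-up data in years: (time, event flag).
data = [(2, 1), (4, 0), (5, 1), (7, 0), (10, 1), (16, 0)]
curve = kaplan_meier(data)
print(curve)
```

The product-limit construction is what lets censored eyes (patients lost to follow-up while still sighted) contribute to the risk set without being counted as failures.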

  6. Structure-factor probabilities for related structures

    International Nuclear Information System (INIS)

    Probability relationships between structure factors from related structures have allowed previously only for either differences in atomic scattering factors (isomorphous replacement case) or differences in atomic positions (coordinate error case). In the coordinate error case, only errors drawn from a single probability distribution have been considered, in spite of the fact that errors vary widely through models of macromolecular structures. It is shown that the probability relationships can be extended to cover more general cases. Either the atomic parameters or the reciprocal-space vectors may be chosen as the random variables to derive probability relationships. However, the relationships turn out to be very similar for either choice. The most intuitive is the expected electron-density formalism, which arises from considering the atomic parameters as random variables. In this case, the centroid of the structure-factor distribution is the Fourier transform of the expected electron-density function, which is obtained by smearing each atom over its possible positions. The centroid estimate has a phase different from, and more accurate than, that obtained from the unweighted atoms. The assumption that there is a sufficient number of independent errors allows the application of the central limit theorem. This gives a one- (centric case) or two-dimensional (non-centric) Gaussian distribution about the centroid estimate. The general probability expression reduces to those derived previously when the appropriate simplifying assumptions are made. The revised theory has implications for calculating more accurate phases and maps, optimizing molecular replacement models, refining structures, estimating coordinate errors and interpreting refined B factors. (orig.)

  7. Bounding probabilistic safety assessment probabilities by reality

    International Nuclear Information System (INIS)

    The investigation of failure in systems where failure is a rare event makes continual comparison between the developed probabilities and empirical evidence difficult. The comparison of the predictions of rare event risk assessments with historical reality is essential to prevent probabilistic safety assessment (PSA) predictions from drifting into fantasy. One approach to performing such comparisons is to search out and assign probabilities to natural events which, while extremely rare, have a basis in the history of natural phenomena or human activities. For example the Segovian aqueduct and some of the Roman fortresses in Spain have existed for several millennia and in many cases show no physical signs of earthquake damage. This evidence could be used to bound the probability of earthquakes above a certain magnitude to less than 10⁻³ per year. On the other hand, there is evidence that some repetitive actions can be performed with extremely low historical probabilities when operators are properly trained and motivated, and sufficient warning indicators are provided. The point is not that low probability estimates are impossible, but that continual reassessment of the analysis assumptions, and a bounding of the analysis predictions by historical reality, are required. This paper reviews the probabilistic predictions of PSA in this light, attempts to develop, in a general way, the limits which can be historically established and the consequent bounds that these limits place upon the predictions, and illustrates the methodology used in computing such limits. Further, the paper discusses the use of empirical evidence and the requirement for disciplined systematic approaches within the bounds of reality and the associated impact on PSA probabilistic estimates

  8. The albedo effect on neutron transmission probability.

    Science.gov (United States)

    Khanouchi, A; Sabir, A; Boulkheir, M; Ichaoui, R; Ghassoun, J; Jehouani, A

    1997-01-01

    The aim of this study is to evaluate the albedo effect on the neutron transmission probability through slab shields. For this reason we have considered an infinite homogeneous slab having a fixed thickness equal to 20 lambda (lambda is the mean free path of the neutron in the slab). This slab is characterized by the factor Ps (scattering probability) and contains a vacuum channel which is formed by two horizontal parts and an inclined one (David, M. C. (1962) Ducts and Voids in Shields. In Reactor Handbook, Vol. III, Part B, p. 166). The thickness of the vacuum channel is taken equal to 2 lambda. An infinite plane source of neutrons is placed on the first face of the slab (left face) and detectors, having windows equal to 2 lambda, are placed on the second face of the slab (right face). Neutron histories are sampled by the Monte Carlo method (Booth, T. E. and Hendricks, J. S. (1994) Nuclear Technology 5) using exponential biasing in order to increase the Monte Carlo calculation efficiency (Levitt, L. B. (1968) Nuclear Science and Engineering 31, 500-504; Jehouani, A., Ghassoun, J. and Abouker, A. (1994) In Proceedings of the 6th International Symposium on Radiation Physics, Rabat, Morocco) and we have applied the statistical weight method which supposes that the neutron is born at the source with a unit statistical weight and after each collision this weight is corrected. For different values of the scattering probability and for different slopes of the inclined part of the channel we have calculated the neutron transmission probability for different positions of the detectors versus the albedo at the vacuum channel-medium interface. Some analytical representations are also presented for these transmission probabilities. PMID:9463883
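Setting aside the exponential biasing and channel geometry, the core Monte Carlo transmission estimate can be sketched for a purely absorbing 1-D slab, where the analytic answer e^(-T) provides a check. This is a deliberately simplified toy (no scattering, no vacuum channel), not the paper's calculation:

```python
import math
import random

def transmission_probability(thickness_mfp, n_histories, rng):
    """Analog Monte Carlo estimate of transmission through a purely
    absorbing slab of given thickness (in mean free paths): a neutron
    is transmitted iff its sampled free path exceeds the thickness."""
    transmitted = 0
    for _ in range(n_histories):
        path = -math.log(rng.random())  # Exp(1)-distributed free path
        if path > thickness_mfp:
            transmitted += 1
    return transmitted / n_histories

rng = random.Random(1)
est = transmission_probability(2.0, 100_000, rng)
print(est, math.exp(-2.0))  # estimate vs. analytic exp(-T)
```

Variance-reduction techniques such as the exponential biasing and statistical weights mentioned in the abstract exist precisely because the analog estimate above becomes hopelessly inefficient for thick shields (20 mean free paths), where almost no sampled history is transmitted.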

  9. Using High-Probability Foods to Increase the Acceptance of Low-Probability Foods

    Science.gov (United States)

    Meier, Aimee E.; Fryling, Mitch J.; Wallace, Michele D.

    2012-01-01

    Studies have evaluated a range of interventions to treat food selectivity in children with autism and related developmental disabilities. The high-probability instructional sequence is one intervention with variable results in this area. We evaluated the effectiveness of a high-probability sequence using 3 presentations of a preferred food on…

  10. VOLCANIC RISK ASSESSMENT - PROBABILITY AND CONSEQUENCES

    International Nuclear Information System (INIS)

    Risk is the product of the probability and consequences of an event. Both of these must be based upon sound science that integrates field data, experiments, and modeling, but must also be useful to decision makers who likely do not understand all aspects of the underlying science. We review a decision framework used in many fields such as performance assessment for hazardous and/or radioactive waste disposal sites that can serve to guide the volcanological community towards integrated risk assessment. In this framework the underlying scientific understanding of processes that affect probability and consequences drive the decision-level results, but in turn these results can drive focused research in areas that cause the greatest level of uncertainty at the decision level. We review two examples of the determination of volcanic event probability: (1) probability of a new volcano forming at the proposed Yucca Mountain radioactive waste repository, and (2) probability that a subsurface repository in Japan would be affected by the nearby formation of a new stratovolcano. We also provide examples of work on consequences of explosive eruptions, within the framework mentioned above. These include field-based studies aimed at providing data for 'closure' of wall rock erosion terms in a conduit flow model, predictions of dynamic pressure and other variables related to damage by pyroclastic flow into underground structures, and vulnerability criteria for structures subjected to conditions of explosive eruption. Process models (e.g., multiphase flow) are important for testing the validity or relative importance of possible scenarios in a volcanic risk assessment. We show how time-dependent multiphase modeling of explosive 'eruption' of basaltic magma into an open tunnel (drift) at the Yucca Mountain repository provides insight into proposed scenarios that include the development of secondary pathways to the Earth's surface. Addressing volcanic risk within a decision

  11. Nuclear data uncertainties: I, Basic concepts of probability

    Energy Technology Data Exchange (ETDEWEB)

    Smith, D.L.

    1988-12-01

    Some basic concepts of probability theory are presented from a nuclear-data perspective, in order to provide a foundation for thorough understanding of the role of uncertainties in nuclear data research. Topics included in this report are: events, event spaces, calculus of events, randomness, random variables, random-variable distributions, intuitive and axiomatic probability, calculus of probability, conditional probability and independence, probability distributions, binomial and multinomial probability, Poisson and interval probability, normal probability, the relationships existing between these probability laws, and Bayes' theorem. This treatment emphasizes the practical application of basic mathematical concepts to nuclear data research, and it includes numerous simple examples. 34 refs.
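Of the topics the report lists, Bayes' theorem is easily made concrete with exact arithmetic. A minimal worked example with invented detector numbers (purely illustrative, not taken from the report):

```python
from fractions import Fraction

# Hypothetical screening problem: prior P(signal), detector
# sensitivity P(alarm|signal), and false-alarm rate P(alarm|noise).
p_signal = Fraction(1, 100)
p_alarm_given_signal = Fraction(95, 100)
p_alarm_given_noise = Fraction(5, 100)

# Law of total probability for the alarm, then Bayes' theorem.
p_alarm = (p_alarm_given_signal * p_signal
           + p_alarm_given_noise * (1 - p_signal))
p_signal_given_alarm = p_alarm_given_signal * p_signal / p_alarm

print(p_signal_given_alarm)  # -> 19/118, roughly 0.16
```

Despite the 95% sensitivity, the posterior probability of a true signal given an alarm is only about 16%, because the prior is small; this base-rate effect is exactly the kind of pitfall a grounding in conditional probability guards against.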

  12. Nuclear data uncertainties: I, Basic concepts of probability

    International Nuclear Information System (INIS)

    Some basic concepts of probability theory are presented from a nuclear-data perspective, in order to provide a foundation for thorough understanding of the role of uncertainties in nuclear data research. Topics included in this report are: events, event spaces, calculus of events, randomness, random variables, random-variable distributions, intuitive and axiomatic probability, calculus of probability, conditional probability and independence, probability distributions, binomial and multinomial probability, Poisson and interval probability, normal probability, the relationships existing between these probability laws, and Bayes' theorem. This treatment emphasizes the practical application of basic mathematical concepts to nuclear data research, and it includes numerous simple examples. 34 refs

  13. Generating pseudo-random discrete probability distributions

    Energy Technology Data Exchange (ETDEWEB)

    Maziero, Jonas, E-mail: jonasmaziero@gmail.com [Universidade Federal de Santa Maria (UFSM), RS (Brazil). Departamento de Fisica

    2015-08-15

    The generation of pseudo-random discrete probability distributions is of paramount importance for a wide range of stochastic simulations spanning from Monte Carlo methods to the random sampling of quantum states for investigations in quantum information science. In spite of its significance, a thorough exposition of such a procedure is lacking in the literature. In this article, we present relevant details concerning the numerical implementation and applicability of what we call the iid, normalization, and trigonometric methods for generating an unbiased probability vector p = (p_1, ..., p_d). An immediate application of these results regarding the generation of pseudo-random pure quantum states is also described. (author)
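One way to see what "unbiased" means here: naively normalizing iid uniforms does not sample uniformly from the probability simplex, whereas normalizing iid exponential variates does (the standard Dirichlet(1, ..., 1) construction). This sketch illustrates that construction; it is not necessarily identical to any of the paper's three named methods:

```python
import math
import random

def random_prob_vector(d, rng=random):
    """Sample p = (p_1, ..., p_d) uniformly from the probability
    simplex by normalizing iid Exp(1) variates (Dirichlet(1,...,1))."""
    x = [-math.log(rng.random()) for _ in range(d)]
    total = sum(x)
    return [xi / total for xi in x]

random.seed(42)
p = random_prob_vector(5)
print(p, sum(p))
```

Normalizing uniforms instead of exponentials concentrates mass away from the corners of the simplex, which is precisely the kind of bias a careful generator must avoid.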

  14. Investigation of probable decays in rhenium isotopes

    International Nuclear Information System (INIS)

    Making use of effective liquid drop model (ELDM), the feasibility of proton and alpha decays and various cluster decays is analysed theoretically. For different neutron-rich and neutron-deficient isotopes of Rhenium in the mass range 150 < A < 200, the half-lives of proton and alpha decays and probable cluster decays are calculated considering the barrier potential as the effective liquid drop one which is the sum of Coulomb, surface and centrifugal potentials. The calculated half-lives for proton decay from various Rhenium isotopes are then compared with the universal decay law (UDL) model to assess the efficiency of the present formalism. Geiger-Nuttal plots of the probable decays are analysed and their respective slopes and intercepts are evaluated

  15. Radiationless transition probabilities in muonic 209Bi

    International Nuclear Information System (INIS)

    The probability for non-radiative (n.r.) excitations in muonic 209Bi was determined from a (μ-, γγ)-measurement by comparing the intensities of muonic X-ray transitions in single and coincidence spectra. The values of Pn.r.(3p→1s)=(17.9±2.0)% and Pn.r.(3d→1s)=(3.0±2.2)% were measured for the first time. The strength of the n.r. decay of the 2p-level was found to be (4.2±2.2)%. The n.r. transition probabilities of two subcomplexes of the (2p→1s)-transition leading to different mean excitation energies are (3.2±1.8)% and (5.0±2.0)%, respectively. (orig.)

  16. Measurement of the resonance escape probability

    International Nuclear Information System (INIS)

    The average cadmium ratio in natural uranium rods has been measured, using equal diameter natural uranium disks. These values correlated with independent measurements of the lattice buckling, enabled us to calculate values of the resonance escape probability for the G1 reactor with one or the other of two definitions. Measurements were performed on 26 mm and 32 mm rods, giving the following values for the resonance escape probability p: 0.8976 ± 0.005 and 0.912 ± 0.006 (d. 26 mm), 0.8627 ± 0.009 and 0.884 ± 0.01 (d. 32 mm). The influence of either definition on the lattice parameters is discussed, leading to values of the effective integral. Similar experiments have been performed with thorium rods. (author)

  17. Pre-aggregation for Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    Motivated by the increasing need to analyze complex uncertain multidimensional data (e.g., in order to optimize and personalize location-based services), this paper proposes novel types of probabilistic OLAP queries that operate on aggregate values that are probability distributions, along with methods for the computation of such aggregate values. The paper also reports on experiments with the methods. The work is motivated with a real-world case study, based on our collaboration with a leading Danish vendor of location-based services. No previous work considers the combination of the aspects of uncertain multidimensional data analysis that is considered in this paper (i.e., approximate processing of probabilistic OLAP queries over probability distributions).

  18. Approaches to Evaluating Probability of Collision Uncertainty

    Science.gov (United States)

    Hejduk, Matthew D.; Johnson, Lauren C.

    2016-01-01

    While the two-dimensional probability of collision (Pc) calculation has served as the main input to conjunction analysis risk assessment for over a decade, it has done this mostly as a point estimate, with relatively little effort made to produce confidence intervals on the Pc value based on the uncertainties in the inputs. The present effort seeks to carry these uncertainties through the calculation in order to generate a probability density of Pc results rather than a single average value. Methods for assessing uncertainty in the primary and secondary objects' physical sizes and state estimate covariances, as well as a resampling approach to reveal the natural variability in the calculation, are presented; and an initial proposal for operationally useful display and interpretation of these data for a particular conjunction is given.
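The resampling idea can be sketched generically: draw the uncertain inputs from their assumed distributions, recompute Pc for each draw, and report the spread rather than a single number. The code below uses a toy 1-D stand-in for the real 2-D Pc integral, and every number in it is hypothetical:

```python
import math
import random

def pc_1d(miss, sigma, radius):
    """Toy 1-D stand-in for the collision probability: probability
    that a Gaussian miss distance falls within the combined radius."""
    z1 = (radius - miss) / (sigma * math.sqrt(2))
    z2 = (-radius - miss) / (sigma * math.sqrt(2))
    return 0.5 * (math.erf(z1) - math.erf(z2))

random.seed(0)
# Resample the inputs themselves from assumed uncertainty models.
samples = [pc_1d(miss=random.gauss(200.0, 50.0),
                 sigma=random.gauss(100.0, 10.0),
                 radius=20.0)
           for _ in range(1000)]
samples.sort()
print(samples[500], samples[25], samples[975])  # median and ~95% band
```

The point of the exercise is visible in the output: the spread between the 2.5th and 97.5th percentiles of Pc can span an order of magnitude or more, which a single point estimate hides.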

  19. Chemisorptive electron emission versus sticking probability

    Science.gov (United States)

    Böttcher, Artur; Niehus, Horst

    2001-07-01

    The chemisorption of N2O on thin Cs films has been studied by monitoring the time evolution of the sticking probability as well as the kinetics of the low-energy electron emission. By combining the data sets, two time domains become distinguishable: the initial chemisorption stage is characterized by a high sticking probability (0.1 [...] exoemission and the chemisorption excludes the model of surface harpooning as the elementary process responsible for the electron emission in the late chemisorption stage. A long-term emission decay has also been observed after turning off the flux of chemisorbing molecules. A model is proposed that attributes both the late chemisorptive and the nonchemisorptive electron emission to the relaxation of a narrow state originated from an oxygen vacancy in the Cs oxide layer terminating the surface. The presence of such a state has been confirmed by metastable de-excitation spectroscopy [MDS, He*(2¹S)].

  20. Need for probabilities in cancer litigation

    International Nuclear Information System (INIS)

    The third article in a series on radiation and the courts considers the new concept of probability of causation (PC), which the author concludes is the best of imperfect approaches. The problem arises from the medical inability to state that the particular cancer was the result of a specific exposure to radiation. Epidemiological and statistical evidence as the basis for probable cause has precedents in other situations where absolute certainty is unattainable. Alternatives to PC include threshold dose levels, referrals to judges with demonstrated scientific understanding, and improvements in the legal skills used in trying cases. Although PC yields more consistent results, the approach is best because it encompasses the advantages of the other approaches. 20 references

  1. Probability of Boundary Conditions in Quantum Cosmology

    CERN Document Server

    Suenobu, Hiroshi

    2016-01-01

    One of the main interests in quantum cosmology is to determine which type of boundary condition for the wave function of the universe can predict the observational data of our universe. For this purpose, we solve the Wheeler-DeWitt equation numerically and evaluate probabilities for an observable representing the evolution of the classical universe, especially the number of e-foldings of the inflation. To express boundary conditions of the wave function, we use exact solutions of the Wheeler-DeWitt equation with a constant scalar field potential. These exact solutions include wave functions with well-known boundary condition proposals, the no-boundary proposal and the tunneling proposal. We specify them by introducing two real parameters which discriminate boundary conditions and estimate the values of these parameters resulting in observationally preferable predictions. We obtain the probability for these parameters under the requirement of sufficient e-foldings of the inflation.

  2. Collision probabilities in spatially stochastic media II

    International Nuclear Information System (INIS)

    An improved model for calculating collision probabilities in spatially stochastic media is described based upon a method developed by Cassell and Williams [Cassell, J.S., Williams, M.M.R., in press. An approximate method for solving radiation and neutron transport problems in spatially stochastic media. Annals of Nuclear Energy] and is applicable to three-dimensional problems. We shall show how to evaluate the collision probability in an arbitrarily shaped non-re-entrant lump, consisting of a random dispersal of two phases, for any form of autocorrelation function. Specific examples, with numerical values, are given for a sphere and a slab. In the case of the slab we allow the material to have different stochastic properties in the x, y and z directions

  3. Estimation of transition probabilities of credit ratings

    Science.gov (United States)

    Peng, Gan Chew; Hin, Pooi Ah

    2015-12-01

    The present research is based on the quarterly credit ratings of ten companies over 15 years taken from the database of the Taiwan Economic Journal. The components of the vector m_i = (m_i1, m_i2, ..., m_i10) may first be used to denote the credit ratings of the ten companies in the i-th quarter. The vector m_i+1 in the next quarter is modelled to be dependent on the vector m_i via a conditional distribution which is derived from a 20-dimensional power-normal mixture distribution. The transition probability P_kl(i, j) for getting m_i+1,j = l given that m_i,j = k is then computed from the conditional distribution. It is found that the variation of the transition probability P_kl(i, j) as i varies is able to give an indication of the possible transition of the credit rating of the j-th company in the near future.
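The abstract's model is a power-normal mixture, but the underlying object, a transition probability P_kl, is most simply estimated by counting observed rating transitions. A minimal frequency-count sketch on an invented rating sequence (an illustration of the estimand, not the paper's mixture method):

```python
from collections import Counter

def estimate_transitions(sequence, states):
    """Estimate P(k -> l) from one observed sequence of ratings
    by relative frequency of consecutive pairs."""
    pairs = Counter(zip(sequence, sequence[1:]))
    totals = Counter(sequence[:-1])
    return {(k, l): pairs[(k, l)] / totals[k]
            for k in states for l in states if totals[k]}

# Hypothetical quarterly ratings for one company.
ratings = ["A", "A", "B", "A", "A", "A", "B", "B", "A"]
P = estimate_transitions(ratings, ["A", "B"])
print(P[("A", "A")], P[("A", "B")])
```

A model-based approach such as the paper's can let P_kl vary with the quarter i, whereas the raw counting above pools all quarters into one stationary matrix.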

  4. Registration probability of alphas in cellulose nitrate

    International Nuclear Information System (INIS)

    Registration 'thresholds' of alpha particles in cellulose nitrate plastic present a statistical behaviour. The effect depends on etching conditions. It is particularly large in strong etching conditions, in which registration is transposed to high energies: up to 7.7 MeV for the conditions and energies studied. 'Registration probability' expresses more adequately the effect of registration constraints. The study of registration probability indicates that the 'target theory' can describe the effect. The parameters of target theory, m (number of targets) and D0 (the equivalent of biological dose D37), were found to be: m = 5 and D0 = 3 × 10⁷ erg cm⁻³. Dose distribution around the trajectory of alphas of various energies is estimated. It is also deduced that track development takes place when the required dose for registration is deposited at a distance r ≥ 20 Å from the particle trajectory. (author)

  5. A Tutorial Introduction to the Logic of Parametric Probability

    OpenAIRE

    Norman, Joseph W.

    2012-01-01

    The computational method of parametric probability analysis is introduced. It is demonstrated how to embed logical formulas from the propositional calculus into parametric probability networks, thereby enabling sound reasoning about the probabilities of logical propositions. An alternative direct probability encoding scheme is presented, which allows statements of implication and quantification to be modeled directly as constraints on conditional probabilities. Several example problems are so...

  6. An introduction to measure-theoretic probability

    CERN Document Server

    Roussas, George G

    2004-01-01

    This book provides, in a concise yet detailed way, the bulk of the probabilistic tools that a student working toward an advanced degree in statistics, probability and other related areas should be equipped with. The approach is classical, avoiding the use of mathematical tools not necessary for carrying out the discussions. All proofs are presented in full detail. Excellent exposition marked by a clear, coherent and logical development of the subject; easy-to-understand, detailed discussion of material; complete proofs.

  7. Probable Unusual Transmission of Zika Virus

    Centers for Disease Control (CDC) Podcasts

    2011-05-23

    This podcast discusses a study about the probable unusual transmission of Zika Virus Infection from a scientist to his wife, published in the May 2011 issue of Emerging Infectious Diseases. Dr. Brian Foy, Associate Professor at Colorado State University, shares details of this event.  Created: 5/23/2011 by National Center for Emerging Zoonotic and Infectious Diseases (NCEZID).   Date Released: 5/25/2011.

  8. Marrakesh International Conference on Probability and Statistics

    CERN Document Server

    Ouassou, Idir; Rachdi, Mustapha

    2015-01-01

    This volume, which highlights recent advances in statistical methodology and applications, is divided into two main parts. The first part presents theoretical results on estimation techniques in functional statistics, while the second examines three key areas of application: estimation problems in queuing theory, an application in signal processing, and the copula approach to epidemiologic modelling. The book’s peer-reviewed contributions are based on papers originally presented at the Marrakesh International Conference on Probability and Statistics held in December 2013.

  9. Picturing mobility: Transition probability color plots

    OpenAIRE

    Philippe Kerm

    2011-01-01

    This talk presents a simple graphical device for visualization of patterns of income mobility. The device uses color palettes to picture information contained in transition matrices created from a fine partition of the marginal distributions. The talk explains how these graphs can be constructed using the user-written package spmap from Maurizio Pisati, briefly presents the wrapper command transcolorplot (for transition probability color plots) and demonstrates how such graphs are effective f...

  10. Transition Probability (Fidelity) and Its Relatives

    OpenAIRE

    Uhlmann, Armin

    2011-01-01

    Transition Probability (fidelity) for pairs of density operators can be defined as "functor" in the hierarchy of "all" quantum systems and also within any quantum system. The introduction of "amplitudes" for density operators allows for a more intuitive treatment of these quantities, also pointing to a natural parallel transport. The latter is governed by a remarkable gauge theory with strong relations to the Riemann-Bures metric.

  11. Continuum ionization transition probabilities of atomic oxygen

    Science.gov (United States)

    Samson, J. A. R.; Petrosky, V. E.

    1974-01-01

    The technique of photoelectron spectroscopy was employed in the investigation. Atomic oxygen was produced in a microwave discharge operating at a power of 40 W and at a pressure of approximately 20 mtorr. The photoelectron spectrum of the oxygen with and without the discharge is shown. The atomic states can be clearly seen. In connection with the measurement of the probability for transitions into the various ionic states, the analyzer collection efficiency was determined as a function of electron energy.

  12. Transition choice probabilities and welfare in ARUM's

    OpenAIRE

    de Palma, André; Kilani, Karim

    2009-01-01

    We study the descriptive and the normative consequences of price and/or other attribute changes in additive random utility models. We first derive expressions for the transition choice probabilities associated with these changes. A closed-form formula is obtained for the logit. We then use these expressions to compute the cumulative distribution functions of the compensating variation conditional on ex-ante and/or ex-post choices. The unconditional distribution is also provided. The conditiona...

  13. Calculating nuclear accident probabilities from empirical frequencies

    OpenAIRE

    Ha-Duong, Minh; Journé, V.

    2014-01-01

    Since there is no authoritative, comprehensive and public historical record of nuclear power plant accidents, we reconstructed a nuclear accident data set from peer-reviewed and other literature. We found that, in a sample of five random years, the worldwide historical frequency of a nuclear major accident, defined as an INES level 7 event, is 14 %. The probability of at least one nuclear accident rated at level ≥4 on the INES scale is 67 %. These numbers are subject...
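
    The "at least one accident" figure quoted above rests on the complement rule: the chance of at least one event over a horizon, given a per-period frequency. The sketch below is generic, not the paper's actual calculation; the per-year probability is purely illustrative, and independence across years is an assumption.

```python
# Probability of at least one event in n periods, assuming a constant
# per-period event probability p and independence between periods.
def prob_at_least_one(p: float, n: int) -> float:
    return 1.0 - (1.0 - p) ** n

# illustrative: a 1%-per-year event observed over 30 years
print(round(prob_at_least_one(0.01, 30), 4))  # prints 0.2603
```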

  14. Unseated septifoil non-detection probability

    International Nuclear Information System (INIS)

    The frequency that the Savannah River K-Reactor would proceed beyond hydraulic startup with a septifoil not properly seated is estimated in this report. It summarizes previous work on this subject, incorporates concerns about the utility of individual septifoil pressure measurements, and discusses two proposed techniques that could lower the non-detection probability to the point that this issue could be beyond Design Basis consideration

  15. The probability for primordial black holes

    CERN Document Server

    Bousso, R

    1995-01-01

    We consider two quantum cosmological models with a massive scalar field: an ordinary Friedmann universe and a universe containing primordial black holes. For both models we discuss the complex solutions to the Euclidean Einstein equations. Using the probability measure obtained from the Hartle-Hawking no-boundary proposal, we find that the only unsuppressed black holes start at the Planck size but can grow with the horizon scale during the roll down of the scalar field to the minimum.

  16. PSA, subjective probability and decision making

    International Nuclear Information System (INIS)

    PSA is the natural way to making decisions in face of uncertainty relative to potentially dangerous plants; subjective probability, subjective utility and Bayes statistics are the ideal tools for carrying out a PSA. This paper reports that in order to support this statement the various stages of the PSA procedure are examined in detail and step by step the superiority of Bayes techniques with respect to sampling theory machinery is proven

  17. Investigation of Flood Inundation Probability in Taiwan

    Science.gov (United States)

    Wang, Chia-Ho; Lai, Yen-Wei; Chang, Tsang-Jung

    2010-05-01

    Taiwan lies in the path of typhoons from the northeast Pacific Ocean and is situated in a tropical-subtropical transition zone. As a result, rainfall is abundant all year round, especially in summer and autumn. For flood inundation analysis in Taiwan, there exist many uncertainties in hydrological, hydraulic and land-surface topography characteristics, which can change flood inundation behaviour. According to the 7th work item of article 22 in the Disaster Prevention and Protection Act in Taiwan, to keep flood disasters from worsening, investigation and analysis of disaster potentials, hazard degree and situation simulation must be carried out with scientific approaches. However, flood potential analysis uses a deterministic approach to define flood inundation without considering data uncertainties. This research incorporates the concept of data uncertainty into flood inundation maps to show the flood probability in each grid cell, which can serve as a basis for emergency evacuation when typhoons arrive and extremely torrential rain begins. The research selects the Hebauyu watershed of Chiayi County as the demonstration area. Owing to uncertainties in the data used, a sensitivity analysis is first conducted using Latin Hypercube sampling (LHS). The LHS data sets are then input into an integrated numerical model, developed herein to assess flood inundation hazards in coastal lowlands, based on the extension of a 1-D river routing model and a 2-D inundation routing model. Finally, the probability of flood inundation is calculated and flood inundation probability maps are obtained. These probability maps can replace the old flood potential maps as a reference for building new hydraulic infrastructure in the future.
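
    The Latin Hypercube sampling step described above can be sketched in a few lines. This is a generic uniform-margin LHS, not the study's actual implementation; the sample count and number of dimensions are placeholders.

```python
import numpy as np

def latin_hypercube(n_samples: int, n_dims: int, seed=None) -> np.ndarray:
    """Uniform Latin Hypercube sample on [0, 1)^n_dims: each of the
    n_samples equal-probability strata is hit exactly once per dimension."""
    rng = np.random.default_rng(seed)
    # one point per stratum [i/n, (i+1)/n), for every dimension
    u = (rng.random((n_samples, n_dims)) + np.arange(n_samples)[:, None]) / n_samples
    # shuffle each column independently so the dimensions are decoupled
    for j in range(n_dims):
        rng.shuffle(u[:, j])
    return u

# e.g. 100 parameter sets for 3 uncertain model inputs
samples = latin_hypercube(100, 3, seed=42)
```

    Each column of `samples` contains exactly one point in each of the 100 equal-probability strata, which is what distinguishes LHS from plain random sampling.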

  18. SureTrak Probability of Impact Display

    Science.gov (United States)

    Elliott, John

    2012-01-01

    The SureTrak Probability of Impact Display software was developed for use during rocket launch operations. The software displays probability of impact information for each ship near the hazardous area during the time immediately preceding the launch of an unguided vehicle. Wallops range safety officers need to be sure that the risk to humans is below a certain threshold during each use of the Wallops Flight Facility Launch Range. Under the variable conditions that can exist at launch time, the decision to launch must be made in a timely manner to ensure a successful mission while not exceeding those risk criteria. Range safety officers need a tool that can give them the needed probability of impact information quickly, and in a format that is clearly understandable. This application is meant to fill that need. The software is a reuse of part of software developed for an earlier project: Ship Surveillance Software System (S4). The S4 project was written in C++ using Microsoft Visual Studio 6. The data structures and dialog templates from it were copied into a new application that calls the implementation of the algorithms from S4 and displays the results as needed. In the S4 software, the list of ships in the area was received from one local radar interface and from operators who entered the ship information manually. The SureTrak Probability of Impact Display application receives ship data from two local radars as well as the SureTrak system, eliminating the need for manual data entry.

  19. Probable Psittacosis Outbreak Linked to Wild Birds

    OpenAIRE

    Telfer, Barbara L.; Moberley, Sarah A.; Hort, Krishna P.; Branley, James M.; Dominic E. Dwyer; Muscatello, David J; Correll, Patricia K.; England, John; McAnulty, Jeremy M.

    2005-01-01

    In autumn 2002, an outbreak of probable psittacosis occurred among residents of the Blue Mountains district, Australia. We conducted a case-control study to determine independent risk factors for psittacosis by comparing exposures between hospitalized patients and other residents selected randomly from the telephone directory. Of the 59 case-patients with laboratory results supportive of psittacosis, 48 participated in a case-control study with 310 controls. Independent risk factors were resi...

  20. Probability and Statistics in Aerospace Engineering

    Science.gov (United States)

    Rheinfurth, M. H.; Howell, L. W.

    1998-01-01

    This monograph was prepared to give the practicing engineer a clear understanding of probability and statistics with special consideration to problems frequently encountered in aerospace engineering. It is conceived to be both a desktop reference and a refresher for aerospace engineers in government and industry. It could also be used as a supplement to standard texts for in-house training courses on the subject.

  1. Probabilities as Similarity-Weighted Frequencies

    OpenAIRE

    Antoine Billot; Itzhak Gilboa; Dov Samet; David Schmeidler

    2004-01-01

    A decision maker is asked to express her beliefs by assigning probabilities to certain possible states. We focus on the relationship between her database and her beliefs. We show that, if beliefs given a union of two databases are a convex combination of beliefs given each of the databases, the belief formation process follows a simple formula: beliefs are a similarity-weighted average of the beliefs induced by each past case.
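
    The similarity-weighted average described above can be written down directly: the probability assigned to a state is the similarity-weighted share of past cases in which that state occurred. The case database and similarity kernel below are toy choices for illustration, not from the paper.

```python
# Similarity-weighted frequency: P(state) = (sum of similarities of past
# cases with that outcome) / (total similarity of all past cases).
def similarity_weighted_prob(cases, query, state, similarity):
    weights = [similarity(query, x) for x, _ in cases]
    hit = sum(w for w, (_, s) in zip(weights, cases) if s == state)
    return hit / sum(weights)

# toy database of (observed feature, outcome) pairs
cases = [(1.0, "rain"), (2.0, "dry"), (1.1, "rain")]
sim = lambda q, x: 1.0 / (1.0 + abs(q - x))   # toy similarity kernel
p_rain = similarity_weighted_prob(cases, 1.05, "rain", sim)  # ~0.79
```

    Because the query 1.05 is much closer to the two "rain" cases than to the "dry" case, the belief in "rain" is well above the raw frequency of 2/3.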

  2. Confidence measures from local posterior probability estimates

    OpenAIRE

    Williams, Gethin; Renals, Steve

    1999-01-01

    In this paper we introduce a set of related confidence measures for large vocabulary continuous speech recognition (LVCSR) based on local phone posterior probability estimates output by an acceptor HMM acoustic model. In addition to their computational efficiency, these confidence measures are attractive as they may be applied at the state-, phone-, word- or utterance-levels, potentially enabling discrimination between different causes of low confidence recognizer output, such as unclear acou...

  3. The Probability Model of Expectation Disconfirmation Process

    OpenAIRE

    Hui-Hsin HUANG

    2015-01-01

    This paper proposes a probability model to explore the dynamic process of customer satisfaction. Based on expectation disconfirmation theory, satisfaction is constructed from the customer's expectation before the buying behavior and the perceived performance after purchase. An experiment is designed to measure expectation disconfirmation effects, and we also use the collected data to estimate overall satisfaction and calibrate the model. The results show good fitness between the model...

  4. Computational methods for probability of instability calculations

    Science.gov (United States)

    Wu, Y.-T.; Burnside, O. H.

    1990-01-01

    This paper summarizes the development of the methods and a computer program to compute the probability of instability of a dynamic system that can be represented by a system of second-order ordinary linear differential equations. Two instability criteria, based on the roots of the characteristic equation or on Routh-Hurwitz test functions, are investigated. Computational methods based on system reliability analysis methods and importance sampling concepts are proposed to perform efficient probabilistic analysis. Numerical examples are provided to demonstrate the methods.
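
    As a hedged illustration of the instability criterion (a characteristic-equation root with positive real part), here is a plain Monte Carlo estimate for a second-order system with one uncertain coefficient. The paper's actual methods are reliability analysis and importance sampling; the distribution and parameter values below are invented for the sketch.

```python
import numpy as np

# m*x'' + c*x' + k*x = 0  ->  characteristic equation m*s^2 + c*s + k = 0.
# The system is unstable when some root has positive real part
# (equivalently, by Routh-Hurwitz for this case, when c < 0 given m, k > 0).
rng = np.random.default_rng(0)
m, k = 1.0, 4.0
c_samples = rng.normal(loc=0.5, scale=0.5, size=20_000)  # uncertain damping (illustrative)

unstable = sum(1 for c in c_samples if np.roots([m, c, k]).real.max() > 0)
p_unstable = unstable / len(c_samples)   # should be near Phi(-1) ~ 0.159
```

    Importance sampling would concentrate the draws near the c < 0 failure region instead of sampling the whole distribution, which is why it is far more efficient when the instability probability is small.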

  5. Free Energy Changes, Fluctuations, and Path Probabilities

    OpenAIRE

    Hoover, William G.; Hoover, Carol G.

    2011-01-01

    We illustrate some of the static and dynamic relations discovered by Cohen, Crooks, Evans, Jarzynski, Kirkwood, Morriss, Searles, and Zwanzig. These relations link nonequilibrium processes to equilibrium isothermal free energy changes and to dynamical path probabilities. We include ideas suggested by Dellago, Geissler, Oberhofer, and Schoell-Paschinger. Our treatment is intended to be pedagogical, for use in an updated version of our book: Time Reversibility, Computer Simulation, and Chaos. C...

  6. Coherence and Consistency of Investors' Probability Judgments

    OpenAIRE

    David V. Budescu; Ning Du

    2007-01-01

    This study investigates the quality of direct probability judgments and quantile estimates with a focus on calibration and consistency. The two response modes use different measures of miscalibration, so it is difficult to directly compare their relative (in)accuracy. We employed a more refined within-subject design in which decision makers (DMs) used both response modes to make judgments about a random sample of stocks accompanied by identical information to facilitate comparison between the...

  7. Quantile Probability and Statistical Data Modeling

    OpenAIRE

    Parzen, Emanuel

    2004-01-01

    Quantile and conditional quantile statistical thinking, as I have innovated it in my research since 1976, is outlined in this comprehensive survey and introductory course in quantile data analysis. We propose that a unification of the theory and practice of statistical methods of data modeling may be possible by a quantile perspective. Our broad range of topics of univariate and bivariate probability and statistics are best summarized by the key words. Two fascinating practical examples are g...

  8. Ragnar Frisch and the Probability Approach

    OpenAIRE

    BJERKHOLT, Olav; DUPONT, Ariane

    2011-01-01

    The title hints at the attention given to the lack of probability considerations in the econometric work of the recognized pioneer of econometrics, Ragnar Frisch. Clues to a better understanding of his position may be found in his comprehensive archive and correspondence. This essay gives a brief overview of Frisch's scientific archive and exhibits from his search for econometric methods. It also sets out a selection of letters exchanged between Frisch and other leading members of the Econome...

  9. A probability loophole in the CHSH

    CERN Document Server

    Geurdes, J F

    2014-01-01

    In the present paper a robustness stress-test of the CHSH experiments for Einstein locality and causality is designed and employed. Random A and B from dice and coins, but based on a local model, run "parallel" to a real experiment. We found a local causal model with a nonzero probability to violate the CHSH inequality for some relevant quartets $\mathcal{Q}$ of settings in the series of trials.

  10. A probability loophole in the CHSH

    Directory of Open Access Journals (Sweden)

    Han Geurdes

    2014-01-01

    In the present paper a robustness stress-test of the CHSH experiments for Einstein locality and causality is designed and employed. Random A and B from dice and coins, but based on a local model, run "parallel" to a real experiment. We found a local causal model with a nonzero probability to violate the CHSH inequality for some relevant quartets Q of settings in the series of trials.

  11. Probability estimation and compression involving large alphabets

    OpenAIRE

    Santhanam, Narayana

    2006-01-01

    Many results in statistics and information theory are asymptotic in nature, with the implicit assumption that we operate in a regime where the data size is much larger than the alphabet size. In this dissertation, we will be concerned with large alphabets, namely alphabets for which the above assumption does not hold. We consider universal compression, i.e., compression when data statistics are unknown, and probability estimation involving data drawn from large alphabets. Both these problems ...

  12. Origins of geometric probability and stereology

    Czech Academy of Sciences Publication Activity Database

    Saxl, Ivan; Hykšová, M.

    Bologna: Esculapio, 2009 - (Capasso, V.; Aletti, G.; Micheletti, A.), s. 173-178 ISBN 978-88-7488-310-3. [10th European Congress of Stereology and Image Analysis. Milano (IT), 22.06.2009-26.06.2009] R&D Projects: GA AV ČR(CZ) IAA100110502 Grant ostatní: GA AV ČR(CZ) IAA801240901 Institutional research plan: CEZ:AV0Z10190503 Keywords: geometric probability * stereology * counting measure * probe Subject RIV: BA - General Mathematics

  13. Augmenting Transition Probabilities for Neutral Atomic Nitrogen

    Science.gov (United States)

    Terrazas-Salines, Imelda; Park, Chul; Strawa, Anthony W.; Hartman, G. Joseph (Technical Monitor)

    1996-01-01

    The transition probability values for a number of neutral atomic nitrogen (NI) lines in the visible wavelength range are determined in order to augment those given in the National Bureau of Standards Tables. These values are determined from experimentation as well as by using the published results of other investigators. The experimental determination of the lines in the 410 to 430 nm range was made from the observation of the emission from the arc column of an arc-heated wind tunnel. The transition probability values of these NI lines are determined to an accuracy of +/- 30% by comparison of their measured intensities with those of the atomic oxygen (OI) multiplet at around 615 nm. The temperature of the emitting medium is determined both using a multiple-layer model, based on a theoretical model of the flow in the arc column, and an empirical single-layer model. The results show that the two models lead to the same values of transition probabilities for the NI lines.

  14. Probable maximum flood estimates in Canada

    International Nuclear Information System (INIS)

    The derivation of the probable maximum flood (PMF) for high hazard dams is one of the most important components of a dam safety program. Ontario Hydro defines the probable maximum flood as a hypothetical flood for a selected location on a given stream whose magnitude is such that there is virtually no chance of its being exceeded. Different design assumptions are used in the derivation of the PMF by various agencies to reflect historical hydrometeorological events and possible future events, for the geographic area under consideration and for the time period of interest. Details are presented of the design assumptions relating to probable maximum precipitation (PMP) determination for British Columbia Hydro, Alberta Environment and Hydro-Quebec, which are used by these agencies in PMF studies. The computer model used by many of the Canadian agencies in the derivation of the PMF is the Streamflow Synthesis and Reservoir Regulation (SSARR) model developed by the U.S. Army Corps of Engineers. The PMP is the most important design input used in the derivation of the PMF, and under- and over-estimates of this parameter will significantly affect the PMF. Suggestions to aid in minimizing over- and under-estimation of the PMF are presented. 18 refs., 2 figs., 5 tabs

  15. Atomic transition probabilities in refractory metals

    International Nuclear Information System (INIS)

    Accurate transition probabilities for a large number of spectral lines in the first and second spectra of 3d, 4d and 5d metals are being measured. Radiative lifetimes of hundreds of levels in TaI, WI, MoI, NbI, HfI, ReI, RhI, RuI, NbII, CoII++, and other atoms and ions are measured using time-resolved laser-induced fluorescence on an atom or ion beam. The atom or ion beam is produced by a versatile hollow cathode discharge source. Branching ratios of levels in WI, NbI, HfI, and ReI are measured from calibrated spectra recorded on the Kitt Peak one-meter Fourier Transform Spectrometer. The transition probability measurements are used in solar and stellar elemental abundance determination. Some of the elements mentioned above are commonly used as electrodes in discharge devices. Accurate transition probabilities are also useful in studying concentrations and the effects of sputtered electrode material on laboratory discharges

  16. Comparing coefficients of nested nonlinear probability models

    DEFF Research Database (Denmark)

    Kohler, Ulrich; Karlson, Kristian Bernt; Holm, Anders

    2011-01-01

    In a series of recent articles, Karlson, Holm and Breen have developed a method for comparing the estimated coefficients of two nested nonlinear probability models. This article describes this method and the user-written program khb that implements it. The KHB method is a general decomposition method that is unaffected by the rescaling or attenuation bias that arises in cross-model comparisons in nonlinear models. It recovers the degree to which a control variable, Z, mediates or explains the relationship between X and a latent outcome variable, Y*, underlying the nonlinear probability model. It also decomposes effects of both discrete and continuous variables, applies to average partial effects, and provides analytically derived statistical tests. The method can be extended to other models in the GLM family.

  17. Quantum mechanics as a theory of probability

    CERN Document Server

    Pitowsky, I

    2005-01-01

    We develop and defend the thesis that the Hilbert space formalism of quantum mechanics is a new theory of probability. The theory, like its classical counterpart, consists of an algebra of events, and the probability measures defined on it. The construction proceeds in the following steps: (a) Axioms for the algebra of events are introduced following Birkhoff and von Neumann. All axioms, except the one that expresses the uncertainty principle, are shared with the classical event space. The only models for the set of axioms are lattices of subspaces of inner product spaces over a field K. (b) Another axiom due to Soler forces K to be the field of real, or complex numbers, or the quaternions. We suggest a probabilistic reading of Soler's axiom. (c) Gleason's theorem fully characterizes the probability measures on the algebra of events, so that Born's rule is derived. (d) Gleason's theorem is equivalent to the existence of a certain finite set of rays, with a particular orthogonality graph (Wondergraph). Consequ...

  18. Detection of cervical lymph node metastasis in head and neck cancer patients with clinically N0 neck—a meta-analysis comparing different imaging modalities

    International Nuclear Information System (INIS)

    How to properly manage clinically negative neck of head and neck cancer patients is a controversial topic. Research is now directed toward finding a method sensitive enough to bring the risk of occult metastases below 20%. The aim of this review was to compare the diagnostic accuracy of different imaging modalities, including CT, MRI, PET and US, in clinically N0 head and neck cancer patients. For this systematic review and meta-analysis, PubMed and the Cochrane Database were searched for relevant original articles published up to May 2011. Inclusion criteria were as follows: articles were reported in English; CT, MRI, PET or US were performed to identify cervical metastases in clinically N0 head and neck squamous cell carcinoma; and data were sufficient for the calculation of true-positive or false-negative values. A bivariate random effect model was used to obtain pooled sensitivity and specificity. The positive and negative test probability of neck metastasis was generated based on Bayesian theory and collected data for different pre-test possibilities. Of the 168 identified relevant articles, 7 studies fulfilled all inclusion criteria for CT, 6 studies for MRI, 11 studies for PET and 8 studies for US. There was no difference in sensitivity and specificity among these imaging modalities, except CT was superior to US in specificity. The pooled estimates for sensitivity were 52% (95% confidence interval [CI], 39% ~ 65%), 65% (34 ~ 87%) 66% (47 ~ 80%), and 66% (45 ~ 77%), on a per-neck basis for CT, MRI, PET and US, respectively. The pooled estimates for specificity were 93% (87% ~ 97%), 81% (64 ~ 91%), 87% (77 ~ 93%), and 78% (71 ~ 83%) for CT, MRI, PET and US, respectively. With pre-examination nodal metastasis probabilities set at 10%, 20% and 30%, the post-exam probabilities of positive nodal metastasis rates were 47%, 66% and 77% for CT; 27%, 46% and 59% for MRI; 36%, 56% and 69% for PET; and 25%, 42% and 56% for US, respectively. Negative nodal metastasis
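
    The Bayesian post-test calculation used above follows directly from sensitivity, specificity and pre-test probability. The sketch below plugs in the pooled CT estimates (sensitivity 52%, specificity 93%); its outputs differ from the paper's reported values by a point or two because the pooled inputs are rounded.

```python
# Post-test probability of metastasis after a positive test, by Bayes' rule.
def post_test_positive(pretest: float, sens: float, spec: float) -> float:
    tp = sens * pretest              # P(test positive and disease present)
    fp = (1 - spec) * (1 - pretest)  # P(test positive and disease absent)
    return tp / (tp + fp)

for p in (0.10, 0.20, 0.30):
    print(f"pre-test {p:.0%} -> post-test {post_test_positive(p, 0.52, 0.93):.0%}")
```

    With these inputs the loop prints post-test probabilities of 45%, 65% and 76%, close to the paper's reported 47%, 66% and 77% for CT.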

  19. Calculating the Probability of Returning a Loan with Binary Probability Models

    Directory of Open Access Journals (Sweden)

    Julian Vasilev

    2014-12-01

    The purpose of this article is to give a new approach to calculating the probability of returning a loan. Many factors affect the value of this probability, and in this article some influencing factors are identified using statistical and econometric models. The main approach applies probit and logit models in loan management institutions, giving a new aspect to credit risk analysis. Calculating the probability of returning a loan is a difficult task. We assume that data fields concerning the contract (month of signing, year of signing, given sum) and data fields concerning the borrower (month of birth, year of birth (age), gender, region where he/she lives) may be independent variables in a binary logistic model with the dependent variable "the probability of returning a loan". It is shown that the month of signing, the year of signing, and the gender and age of the loan owner do not affect the probability of returning a loan, while the probability does depend on the sum of the contract, the remoteness of the loan owner and the month of birth. The probability of returning a loan increases with the given sum, decreases with the proximity of the customer, increases for people born at the beginning of the year and decreases for people born at the end of the year.
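
    A binary logit of the kind the article applies scores a loan as follows. The coefficients, intercept and feature names here are invented for illustration (signed to match the reported directions of effect), not the article's estimates.

```python
import math

# Logit model: P(return loan) = 1 / (1 + exp(-(b0 + sum_i b_i * x_i))).
def logit_prob(features: dict, coefs: dict, intercept: float) -> float:
    z = intercept + sum(coefs[name] * value for name, value in features.items())
    return 1.0 / (1.0 + math.exp(-z))

# hypothetical fitted coefficients: probability rises with the loan sum and
# the borrower's remoteness, and falls for later birth months
coefs = {"loan_sum_thousands": 0.02, "distance_km": 0.01, "birth_month": -0.05}
p = logit_prob({"loan_sum_thousands": 10, "distance_km": 50, "birth_month": 3},
               coefs, intercept=-1.0)   # ~0.39
```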

  20. School and conference on probability theory

    International Nuclear Information System (INIS)

    This volume includes expanded lecture notes from the School and Conference in Probability Theory held at ICTP in May, 2001. Probability theory is a very large area, too large for a single school and conference. The organizers, G. Lawler, C. Newman, and S. Varadhan chose to focus on a number of active research areas that have their roots in statistical physics. The pervasive theme in these lectures is trying to find the large time or large space behaviour of models defined on discrete lattices. Usually the definition of the model is relatively simple: either assigning a particular weight to each possible configuration (equilibrium statistical mechanics) or specifying the rules under which the system evolves (nonequilibrium statistical mechanics). Interacting particle systems is the area of probability that studies the evolution of particles (either finite or infinite in number) under random motions. The evolution of particles depends on the positions of the other particles; often one assumes that it depends only on the particles that are close to the particular particle. Thomas Liggett's lectures give an introduction to this very large area. Claudio Landim's lectures follow up by discussing hydrodynamic limits of particle systems. The goal of this area is to describe the long time, large system size dynamics in terms of partial differential equations. The area of random media is concerned with the properties of materials or environments that are not homogeneous. Percolation theory studies one of the simplest stated models for impurities - taking a lattice and removing some of the vertices or bonds. Luiz Renato G. Fontes and Vladas Sidoravicius give a detailed introduction to this area. Random walk in random environment combines two sources of randomness - a particle performing stochastic motion in which the transition probabilities depend on position and have been chosen from some probability distribution. Alain-Sol Sznitman gives a survey of recent developments in this