Stender, Mogens; Frøkjaer, Jens Brøndum; Hagedorn Nielsen, Tina Sandie; Larsen, Torben Bjerregaard; Lundbye-Christensen, Søren; Elbrønd, Henrik; Thorlacius-Ussing, Ole
2008-01-01
clinical pre-test probability (PTP) can be safely used to rule out the tentative diagnosis of DVT in cancer patients. However, the accuracy in colorectal cancer patients is uncertain. This study assessed the diagnostic accuracy of a quantitative D-dimer assay in combination with the PTP score in ruling out...... preoperative DVT in colorectal cancer patients admitted for surgery. Preoperative D-dimer test and compression ultrasonography for DVT were performed in 193 consecutive patients with newly diagnosed colorectal cancer. Diagnostic accuracy indices of the D-dimer test were assessed according to the PTP score. The...... negative predictive value, positive predictive value, sensitivity and specificity were 99% (95% confidence interval (CI), 95-100%), 17% (95% CI, 9-26), 93% (95% CI, 68-100%) and 61% (95% CI, 53-68%), respectively. In conclusion, the combined use of pre-test probability and D-dimer test may be useful in...
Full text: The PIOPED survey confirmed the significance of the high probability ventilation/perfusion scan (HP V/Q scan) in establishing the diagnosis of pulmonary embolism (PE). In an interesting sentence, however, the authors indicated that 'the clinicians' assessment of the likelihood of PE (prior probability)' can substantially increase the predictive value of the investigation. The criteria used for this assessment were not published, and this statement conflicts with the belief that the clinical diagnosis of pulmonary embolism is unreliable. A medical history was obtained from 668 patients undergoing V/Q lung scans for suspected PE, and certain clinical features linked to PE were, when present, documented. These included pleuritic chest pain, haemoptysis, dyspnoea, clinical evidence of DVT, recent surgery and right ventricular strain pattern an ECG. D-Dimer levels and initial arterial oxygen saturation (PaO2) levels were also obtained. The prevalence of these clinical and biochemical criteria was then compared between HP (61) and normal (171) scans after exclusion of all equivocal or intermediate scan outcomes (436), (where lung scintigraphy was unable to provide a definite diagnosis). D-Dimer and/or oxygen saturation levels, were similarly compared in each group. A true positive result was scored for each clinical or biochemical criterion when linked with a high probability scan and, conversely, a false positive score when the scan outcome was normal. In this fashion, the positive predictive value (PPV) and, when appropriate, the negative predictive value (NPV) was obtained for each risk factor. In the context of PE, DVT and post-operative status prove the more reliable predictors of a high probability outcome. Where both features were present, the PPV rose to 0.57. A normal D-Dimer level was a better excluder of PE than a normal oxygen saturation level (NPV 0.78-v-0.44). Conversely, a raised D-Dimer, or reduced oxygen saturation, were both a little value in
Gustavo Diniz Ferreira Gusso
2013-04-01
Full Text Available Objective: The purpose of this study is to present a methodology for assessing patients’ demands and calculating pre-test probabilities using paper forms in Primary Care. Method: Most developing countries do not use Electronic Health Records (EHR in primary care settings. This makes it difficult to access information regarding what occurs within the health center working process. Basically, there are two methodologies to assess patients’ demands and problems or diagnosis stated by doctors. The first is based on single attendance at each appointment, while the second is based on episodes of care; the latter deals with each problem in a longitudinal manner. The methodology developed in this article followed the approach of confronting the ‘reason for the appointment’ and ‘the problem registered’ by doctors. Paper forms were developed taking this concept as central. All appointments were classified by the International Classification of Primary Care (ICPC. Discussion: Even in paper form, confrontation between ‘reason for the appointment’ and ‘problem registered’ is useful for measuring the pre-test probabilities of each problem-based appointment. This approach can be easily reproduced in any health center and enables a better understanding of population profile. Prevalence of many illnesses and diseases are not known in each reality, and studies conducted in other settings, such as secondary and tertiary care, are not adequate for primary health care. Conclusion: This study offers adequate technology for primary health care workers that have potential to transform each health center into a research-led practice, contributing directly to patient care.
Scheffel, Hans; Alkadhi, Hatem; Desbiolles, Lotus; Frauenfelder, Thomas; Schertler, Thomas; Husmann, Lars; Marincek, Borut; Leschka, Sebastian [University Hospital Zurich, Institute of Diagnostic Radiology, Zurich (Switzerland); Plass, Andre; Vachenauer, Robert; Grunenfelder, Juerg; Genoni, Michele [Clinic for Cardiovascular Surgery, Zurich (Switzerland); Gaemperli, Oliver; Schepis, Tiziano [University Hospital Zurich, Cardiovascular Center, Zurich (Switzerland); Kaufmann, Philipp A. [University Hospital Zurich, Cardiovascular Center, Zurich (Switzerland); University of Zurich, Center for Integrative Human Physiology, Zurich (Switzerland)
2006-12-15
The aim of this study was to assess the diagnostic accuracy of dual-source computed tomography (DSCT) for evaluation of coronary artery disease (CAD) in a population with extensive coronary calcifications without heart rate control. Thirty patients (24 male, 6 female, mean age 63.1{+-}11.3 years) with a high pre-test probability of CAD underwent DSCT coronary angiography and invasive coronary angiography (ICA) within 14{+-}9 days. No beta-blockers were administered prior to the scan. Two readers independently assessed image quality of all coronary segments with a diameter {>=}1.5 mm using a four-point score (1: excellent to 4: not assessable) and qualitatively assessed significant stenoses as narrowing of the luminal diameter >50%. Causes of false-positive (FP) and false-negative (FN) ratings were assigned to calcifications or motion artifacts. ICA was considered the standard of reference. Mean body mass index was 28.3{+-}3.9 kg/m{sup 2} (range 22.4-36.3 kg/m{sup 2}), mean heart rate during CT was 70.3{+-}14.2 bpm (range 47-102 bpm), and mean Agatston score was 821{+-}904 (range 0-3,110). Image quality was diagnostic (scores 1-3) in 98.6% (414/420) of segments (mean image quality score 1.68{+-}0.75); six segments in three patients were considered not assessable (1.4%). DSCT correctly identified 54 of 56 significant coronary stenoses. Severe calcifications accounted for false ratings in nine segments (eight FP/one FN) and motion artifacts in two segments (one FP/one FN). Overall sensitivity, specificity, positive and negative predictive value for evaluating CAD were 96.4, 97.5, 85.7, and 99.4%, respectively. First experience indicates that DSCT coronary angiography provides high diagnostic accuracy for assessment of CAD in a high pre-test probability population with extensive coronary calcifications and without heart rate control. (orig.)
Assessing the clinical probability of pulmonary embolism
Miniati, M. [Consiglio Nazionale delle Ricerche, Institute of Clinical Physiology, Pisa (Italy); Pistolesi, M. [University of Florence, Dept. of Section of Nuclear Medicine Critical Care, Florence (Italy)
2001-12-01
Clinical assessment is a cornerstone of the recently validated diagnostic strategies for pulmonary embolism (PE). Although the diagnostic yield of individual symptoms, signs, and common laboratory tests is limited, the combination of these variables, either by empirical assessment or by a prediction rule, can be used to express a clinical probability of PE. The latter may serve as pretest probability to predict the probability of PE after further objective testing (posterior or post-test probability). Over the last few years, attempts have been made to develop structured prediction models for PE. In a Canadian multicenter prospective study, the clinical probability of PE was rated as low, intermediate, or high according to a model which included assessment of presenting symptoms and signs, risk factors, and presence or absence of an alternative diagnosis at least as likely as PE. Recently, a simple clinical score was developed to stratify outpatients with suspected PE into groups with low, intermediate, or high clinical probability. Logistic regression was used to predict parameters associated with PE. A score {<=} 4 identified patients with low probability of whom 10% had PE. The prevalence of PE in patients with intermediate (score 5-8) and high probability (score {>=} 9) was 38 and 81%, respectively. As opposed to the Canadian model, this clinical score is standardized. The predictor variables identified in the model, however, were derived from a database of emergency ward patients. This model may, therefore, not be valid in assessing the clinical probability of PE in inpatients. In the PISA-PED study, a clinical diagnostic algorithm was developed which rests on the identification of three relevant clinical symptoms and on their association with electrocardiographic and/or radiographic abnormalities specific for PE. Among patients who, according to the model, had been rated as having a high clinical probability, the prevalence of proven PE was 97%, while it was
Assessing the clinical probability of pulmonary embolism
Clinical assessment is a cornerstone of the recently validated diagnostic strategies for pulmonary embolism (PE). Although the diagnostic yield of individual symptoms, signs, and common laboratory tests is limited, the combination of these variables, either by empirical assessment or by a prediction rule, can be used to express a clinical probability of PE. The latter may serve as pretest probability to predict the probability of PE after further objective testing (posterior or post-test probability). Over the last few years, attempts have been made to develop structured prediction models for PE. In a Canadian multicenter prospective study, the clinical probability of PE was rated as low, intermediate, or high according to a model which included assessment of presenting symptoms and signs, risk factors, and presence or absence of an alternative diagnosis at least as likely as PE. Recently, a simple clinical score was developed to stratify outpatients with suspected PE into groups with low, intermediate, or high clinical probability. Logistic regression was used to predict parameters associated with PE. A score ≤ 4 identified patients with low probability of whom 10% had PE. The prevalence of PE in patients with intermediate (score 5-8) and high probability (score ≥ 9) was 38 and 81%, respectively. As opposed to the Canadian model, this clinical score is standardized. The predictor variables identified in the model, however, were derived from a database of emergency ward patients. This model may, therefore, not be valid in assessing the clinical probability of PE in inpatients. In the PISA-PED study, a clinical diagnostic algorithm was developed which rests on the identification of three relevant clinical symptoms and on their association with electrocardiographic and/or radiographic abnormalities specific for PE. Among patients who, according to the model, had been rated as having a high clinical probability, the prevalence of proven PE was 97%, while it was 3
Bhoobalan, Shanmugasundaram; Chakravartty, Riddhika; Dolbear, Gill; Al-Janabi, Mazin
2013-01-01
Purpose: Aim of the study was to determine the accuracy of the clinical pretest probability (PTP) score and its association with lung ventilation and perfusion (VQ) scan. Materials and Methods: A retrospective analysis of 510 patients who had a lung VQ scan between 2008 and 2010 were included in the study. Out of 510 studies, the number of normal, low, and high probability VQ scans were 155 (30%), 289 (57%), and 55 (11%), respectively. Results: A total of 103 patients underwent computed tomog...
Noordman, J.; Weijden, T.T. van der; Dulmen, S. van
2014-01-01
AIMS: To examine the effects of individual video-feedback on the generic communication skills, clinical competence (i.e. adherence to practice guidelines) and motivational interviewing skills of experienced practice nurses working in primary care. BACKGROUND: Continuing professional education may be
Noordman, J.; Weijden, T. van der; Dulmen, S. van
2014-01-01
Aims: To examine the effects of individual video-feedback on the generic communication skills, clinical competence (i.e. adherence to practice guidelines) and motivational interviewing skills of experienced practice nurses working in primary care. Background: Continuing professional education may be
Noordman, J.; van der Weijden, T; Van Dulmen, S.
2014-01-01
Aims: To examine the effects of individual video-feedback on the generic communication skills, clinical competence (i.e. adherence to practice guidelines) and motivational interviewing skills of experienced practice nurses working in primary care. Background: Continuing professional education may be necessary to refresh and reflect on the communication and motivational interviewing skills of experienced primary care practice nurses. A video-feedback method was designed to improve these skills...
40 CFR 1065.520 - Pre-test verification procedures and pre-test data collection.
2010-07-01
... 40 Protection of Environment 32 2010-07-01 2010-07-01 false Pre-test verification procedures and pre-test data collection. 1065.520 Section 1065.520 Protection of Environment ENVIRONMENTAL PROTECTION... Specified Duty Cycles § 1065.520 Pre-test verification procedures and pre-test data collection. (a) If...
The purpose of this study was to calculate disease probabilities based on data of patients with head and neck cancer in the register of our institution and to perform a systematic review of the available data on the accuracy of PET in the primary assessment and follow-up of patients with head and neck cancer. The pre-test probability of head and neck cancer among patients in our institutional data registry was assessed. Then the published literature was selected and appraised according to a standard protocol of systematic reviews. Two reviewers independently selected and extracted data on study characteristics, quality and accuracy. Accuracy data were used to form 2 x 2 contingency tables and were pooled to produce summary receiver operating characteristic (ROC) curves and summary likelihood ratios for positive and negative testing. Finally post-test probabilities were calculated on the basis of the pre-test probabilities of this patient group. All patients had cytologically or histologically proven cancer. The prevalence of additional lymph node metastases on PET in staging examinations was 19.6% (11/56), and that of locoregional recurrence on restaging PET was 28.6% (12/42). In the primary assessment of patients, PET had positive and negative likelihood ratios of 3.9 (2.56-5.93) and 0.24 (0.14-0.41), respectively. Disease probabilities were therefore 49.4% for a positive test result and 5.7% for a negative test result. In the assessment of recurrence these values were 3.96 (2.8-5.6) and 0.16 (0.1-0.25), resulting in probabilities of 49.7% and 3.8%. PET evaluation for involvement of lymph nodes had positive and negative likelihood ratios of 17.26 (10.9-27.3) and 0.19 (0.13-0.27) for primary assessment and 11.0 (2.93-41.24) and 0.14 (0.01-1.88) for detection of recurrence. The probabilities were 81.2% and 4.5% for primary assessment and 73.3% and 3.4% for assessment of recurrence. It is concluded that in this clinical setting the main advantage of PET is the
Clinical features of probable severe acute respiratory syndrome in Beijing
Hai-Ying Lu; Xiao-Yuan Xu; Yu Lei; Yang-Feng Wu; Bo-Wen Chen; Feng Xiao; Gao-Qiang Xie; De-Min Han
2005-01-01
AIM: To summarize clinical features of probable severe acute respiratory syndrome (SARS) in Beijing.METHODS: Retrospective cases involving 801 patients admitted to hospitals in Beijing between March and June 2003, with a diagnosis of probable SARS, moderate type.The series of clinical manifestation, laboratory and radiograph data obtained from 801 cases were analyzed. RESULTS: One to three days after the onset of SARS, the major clinical symptoms were fever (in 88.14% of patients), fatigue, headache, myalgia, arthralgia (25-36%), etc. The counts of WBC (in 22.56% of patients) lymphocyte (70.25%)and CD3, CD4, CD8 positive T cells (70%) decreased. From 4-7 d, the unspecific symptoms became weak; however, the rates of low respiratory tract symptoms, such as cough (24.18%), sputum production (14.26%), chest distress (21.04%) and shortness of breath (9.23%) increased, so did the abnormal rates on chest radiograph or CT. The low counts of WBC, lymphocyte and CD3, CD4, CD8 positiveT cells touched bottom. From 8 to 16 d, the patients presented progressive cough (29.96%), sputum production (13.09%), chest distress (29.96%) and shortness of breath (35.34%). All patients had infiltrates on chest radiograph or CT, some even with multi-infiltrates. Two weeks later, patients' respiratory symptoms started to alleviate, the infiltrates on the lung began to absorb gradually, the counts of WBC, lymphocyte and CD3, CD4, CD8 positive T cells were restored to normality.CONCLUSION: The data reported here provide evidence that the course of SARS could be divided into four stages, namely the initial stage, progressive stage, fastigium and convalescent stage.
Bayesian probability of success for clinical trials using historical data.
Ibrahim, Joseph G; Chen, Ming-Hui; Lakshminarayanan, Mani; Liu, Guanghan F; Heyse, Joseph F
2015-01-30
Developing sophisticated statistical methods for go/no-go decisions is crucial for clinical trials, as planning phase III or phase IV trials is costly and time consuming. In this paper, we develop a novel Bayesian methodology for determining the probability of success of a treatment regimen on the basis of the current data of a given trial. We introduce a new criterion for calculating the probability of success that allows for inclusion of covariates as well as allowing for historical data based on the treatment regimen, and patient characteristics. A new class of prior distributions and covariate distributions is developed to achieve this goal. The methodology is quite general and can be used with univariate or multivariate continuous or discrete data, and it generalizes Chuang-Stein's work. This methodology will be invaluable for informing the scientist on the likelihood of success of the compound, while including the information of covariates for patient characteristics in the trial population for planning future pre-market or post-market trials. PMID:25339499
The probabilities of psyquiatric hospitalization of mental health clinic patients
Leonardo Naves dos Reis; Julio Cesar Ribeiro Simplicio; Edilaine Cristina da Silva Gherardi-Donato; Ana Carolina Guidorizzi Zanetti
2015-01-01
The objective of this study is to evaluate the factors of prediction (diagnostic and socio- demographic characteristics) regarding psychiatric outpatient mental health among users. The study was conducted from secondary data, extracted from the charts and analyzed through logistic regression, to obtain the prediction equation of probability of psychiatric hospitalization. The diagnoses that showed statistical significance (p < 0.05) were bipolar affective disorder, schizophrenia, anxious ...
The probabilities of psyquiatric hospitalization of mental health clinic patients
Leonardo Naves dos Reis
2015-01-01
Full Text Available The objective of this study is to evaluate the factors of prediction (diagnostic and socio- demographic characteristics regarding psychiatric outpatient mental health among users. The study was conducted from secondary data, extracted from the charts and analyzed through logistic regression, to obtain the prediction equation of probability of psychiatric hospitalization. The diagnoses that showed statistical significance (p < 0.05 were bipolar affective disorder, schizophrenia, anxious disorders and depression, and the first two showed a high magnitude association with the need of hospitalization. The age was inversely proportional to the need of hospitalization. The results found may stimulate specific actions and the psychiatric prevention of younger patients with schizophrenia and bipolar affective disorder.
Goerres, Gerhard W.; Mosna-Firlejczyk, Katarzyna; Schulthess, Gustav K. von [Division of Nuclear Medicine, University Hospital Zurich, Raemistrasse 100, 8091, Zurich (Switzerland); Steurer, Johann; Bachmann, Lucas M. [Horten Centre, University of Zurich, Zurich (Switzerland)
2003-04-01
The purpose of this study was to calculate disease probabilities based on data of patients with head and neck cancer in the register of our institution and to perform a systematic review of the available data on the accuracy of PET in the primary assessment and follow-up of patients with head and neck cancer. The pre-test probability of head and neck cancer among patients in our institutional data registry was assessed. Then the published literature was selected and appraised according to a standard protocol of systematic reviews. Two reviewers independently selected and extracted data on study characteristics, quality and accuracy. Accuracy data were used to form 2 x 2 contingency tables and were pooled to produce summary receiver operating characteristic (ROC) curves and summary likelihood ratios for positive and negative testing. Finally post-test probabilities were calculated on the basis of the pre-test probabilities of this patient group. All patients had cytologically or histologically proven cancer. The prevalence of additional lymph node metastases on PET in staging examinations was 19.6% (11/56), and that of locoregional recurrence on restaging PET was 28.6% (12/42). In the primary assessment of patients, PET had positive and negative likelihood ratios of 3.9 (2.56-5.93) and 0.24 (0.14-0.41), respectively. Disease probabilities were therefore 49.4% for a positive test result and 5.7% for a negative test result. In the assessment of recurrence these values were 3.96 (2.8-5.6) and 0.16 (0.1-0.25), resulting in probabilities of 49.7% and 3.8%. PET evaluation for involvement of lymph nodes had positive and negative likelihood ratios of 17.26 (10.9-27.3) and 0.19 (0.13-0.27) for primary assessment and 11.0 (2.93-41.24) and 0.14 (0.01-1.88) for detection of recurrence. The probabilities were 81.2% and 4.5% for primary assessment and 73.3% and 3.4% for assessment of recurrence. It is concluded that in this clinical setting the main advantage of PET is the
During a 3-year period, 173 clinically selected patients underwent pulmonary angiography to confirm or exclude acute pulmonary embolism. All patients had undergone ventilation-perfusion (V/Q) scanning (167 patients) or perfusion scanning alone (six) before angiography. Angiography was done because the results of the V/Q scanning did not satisfy the clinician's needs for certainty. The results of the V/Q and studies were compared to determine the relative accuracy of V/Q scanning in this clinical setting. Pulmonary embolism was found in seven (15%) of 47 patients with low-probability scans, 11 (32%) of 34 patients with intermediate-probability scans, 22 (39%) of 57 patients with indeterminate scans, and 23 (66%) of 35 patients with high-probability scans. In this clinically selected population, low-probability scans were more accurate in excluding pulmonary embolism than were high-probability scans in establishing that diagnosis
Pre-testing advertisements for effectiveness of communication
O'Connor, Ciara
1991-01-01
The purpose of this study was to examine a technique of Pre- Testing Advertisements for Effectiveness, and to seek to provide a Psychological basis for such Pre-Testing. The results of this study suggest a possible system for Pre-Testing the effectiveness of advertisements. The approach taken in this study was to test the effectiveness of communication of advertisements. Three advertisements were tested on a sample of a target audience, that Bample consisting of 78 people taken from thr...
Steeneveld, W.; Gaag, van der L.C.; Barkema, H.W.; Hogeveen, H.
2009-01-01
Clinical mastitis (CM) can be caused by a wide variety of pathogens and farmers must start treatment before the actual causal pathogen is known. By providing a probability distribution for the causal pathogen, naive Bayesian networks (NBN) can serve as a management tool for farmers to decide which t
Fredriksson, Albin, E-mail: albin.fredriksson@raysearchlabs.com; Hårdemark, Björn [RaySearch Laboratories, Sveavägen 44, Stockholm SE-111 34 (Sweden); Forsgren, Anders [Optimization and Systems Theory, Department of Mathematics, KTH Royal Institute of Technology, Stockholm SE-100 44 (Sweden)
2015-07-15
Purpose: This paper introduces a method that maximizes the probability of satisfying the clinical goals in intensity-modulated radiation therapy treatments subject to setup uncertainty. Methods: The authors perform robust optimization in which the clinical goals are constrained to be satisfied whenever the setup error falls within an uncertainty set. The shape of the uncertainty set is included as a variable in the optimization. The goal of the optimization is to modify the shape of the uncertainty set in order to maximize the probability that the setup error will fall within the modified set. Because the constraints enforce the clinical goals to be satisfied under all setup errors within the uncertainty set, this is equivalent to maximizing the probability of satisfying the clinical goals. This type of robust optimization is studied with respect to photon and proton therapy applied to a prostate case and compared to robust optimization using an a priori defined uncertainty set. Results: Slight reductions of the uncertainty sets resulted in plans that satisfied a larger number of clinical goals than optimization with respect to a priori defined uncertainty sets, both within the reduced uncertainty sets and within the a priori, nonreduced, uncertainty sets. For the prostate case, the plans taking reduced uncertainty sets into account satisfied 1.4 (photons) and 1.5 (protons) times as many clinical goals over the scenarios as the method taking a priori uncertainty sets into account. Conclusions: Reducing the uncertainty sets enabled the optimization to find better solutions with respect to the errors within the reduced as well as the nonreduced uncertainty sets and thereby achieve higher probability of satisfying the clinical goals. This shows that asking for a little less in the optimization sometimes leads to better overall plan quality.
Purpose: This paper introduces a method that maximizes the probability of satisfying the clinical goals in intensity-modulated radiation therapy treatments subject to setup uncertainty. Methods: The authors perform robust optimization in which the clinical goals are constrained to be satisfied whenever the setup error falls within an uncertainty set. The shape of the uncertainty set is included as a variable in the optimization. The goal of the optimization is to modify the shape of the uncertainty set in order to maximize the probability that the setup error will fall within the modified set. Because the constraints enforce the clinical goals to be satisfied under all setup errors within the uncertainty set, this is equivalent to maximizing the probability of satisfying the clinical goals. This type of robust optimization is studied with respect to photon and proton therapy applied to a prostate case and compared to robust optimization using an a priori defined uncertainty set. Results: Slight reductions of the uncertainty sets resulted in plans that satisfied a larger number of clinical goals than optimization with respect to a priori defined uncertainty sets, both within the reduced uncertainty sets and within the a priori, nonreduced, uncertainty sets. For the prostate case, the plans taking reduced uncertainty sets into account satisfied 1.4 (photons) and 1.5 (protons) times as many clinical goals over the scenarios as the method taking a priori uncertainty sets into account. Conclusions: Reducing the uncertainty sets enabled the optimization to find better solutions with respect to the errors within the reduced as well as the nonreduced uncertainty sets and thereby achieve higher probability of satisfying the clinical goals. This shows that asking for a little less in the optimization sometimes leads to better overall plan quality
Background: pulmonary embolism (PE) remains an elusive diagnosis, and still causes too many unexpected deaths. Because of this, noninvasive investigations are done when pulmonary embolism is suspected. Objective: to determine the clinical and x-rays findings in patients with diagnosis of pulmonary embolism by high probability ventilation/perfusion (V/Q) lung scan. Materials and methods: inpatient medical records of 91 patients with clinical suspected PE and high and low probability V/Q lung scan were analyzed (PIOPED criterion). Results: there were statistics correlation with four clinical findings: hemoptysis (p value=0,02, odds ratio=8,925), taquicardia (p value=0,02 odds ratio=3,5), chest pain (p value=0,01, odds ratio=1,87), and recent surgery (p value=0,02, odds ratio=2,762). The 70,7% chest x-rays were normal (p value < 0,001). Conclusion: the clinical and x-rays findings in patients with diagnosis of PE by high probability V/Q lung scan were: hemoptysis, taquicardia, chest pain, recent surgery and normal chest x-ray. This is important because would help to choose the patients in whom the V/Q lung scan will have the maximal performance (Au)
Low-probability ventilation-perfusion scintigrams: clinical outcomes in 99 patients
To evaluate the reliability of low probability ventilation-perfursion (V-P) scintigrams in excluding pulmonary embolism (PE), the authors reviewed the clinical records of 99 consecutive patients whose V-P studies had been interpreted as indicative of a low probability of PE. None of the 99 patients were referred for pulmonary angiography. Seven of the hospitalized patients died during the index admission and seven additional hospitalized patients died 1-5 months after discharge from the hospital. None were thought clinically to have died as a result of PE, and autopsy disclosed no PE in two. Follow-up information was obtained for 69 surviving patients not treated with anticoagulants. None of these patients were thought clinically to have had PE during follow-up of a least 2 weeks duration (greater than 2 months in 93% and greater than 6 months in 75%). The results suggest that major short-term morbidity or death attributable to PE are quite infrequent in patients with low-probability V-P scintigrams
Effects of pathogen-specific clinical mastitis on probability of conception in Holstein dairy cows.
Hertl, J A; Schukken, Y H; Welcome, F L; Tauer, L W; Gröhn, Y T
2014-11-01
The objective of this study was to estimate the effects of pathogen-specific clinical mastitis (CM), occurring in different weekly intervals before or after artificial insemination (AI), on the probability of conception in Holstein cows. Clinical mastitis occurring in weekly intervals from 6 wk before until 6 wk after AI was modeled. The first 4 AI in a cow's lactation were included. The following categories of pathogens were studied: Streptococcus spp. (comprising Streptococcus dysgalactiae, Streptococcus uberis, and other Streptococcus spp.); Staphylococcus aureus; coagulase-negative staphylococci (CNS); Escherichia coli; Klebsiella spp.; cases with CM signs but no bacterial growth (above the level that can be detected from our microbiological procedures) observed in the culture sample and cases with contamination (≥ 3 pathogens in the sample); and other pathogens [including Citrobacter, yeasts, Trueperella pyogenes, gram-negative bacilli (i.e., gram-negative organisms other than E. coli, Klebsiella spp., Enterobacter, and Citrobacter), Corynebacterium bovis, Corynebacterium spp., Pasteurella, Enterococcus, Pseudomonas, Mycoplasma, Prototheca, and others]. Other factors included in the model were parity (1, 2, 3, 4 and higher), season of AI (winter, spring, summer, autumn), day in lactation of first AI, farm, and other non-CM diseases (retained placenta, metritis, ketosis, displaced abomasum). Data from 90,271 AI in 39,361 lactations in 20,328 cows collected from 2003/2004 to 2011 from 5 New York State dairy farms were analyzed in a generalized linear mixed model with a Poisson distribution. The largest reductions in probability of conception were associated with CM occurring in the week before AI or in the 2 wk following AI. Escherichia coli and Klebsiella spp. had the greatest adverse effects on probability of conception. The probability of conception for a cow with any combination of characteristics may be calculated based on the parameter estimates. These
In order to investigate the occurrence and nature of CT abnormality and its correlation with clinical manifestations in multiple sclerosis, 34 CT records obtained from 28 consecutive patients with probable multiple sclerosis were reviewed. Forty-six percent of all cases showed abnormal CT. Dilatation of cortical sulci was found in 39%; dilatation of the lateral ventricle in 36%; dilatation of prepontine or cerebello-pontine cistern and the fourth ventricle, suggesting brainstem atrophy, in 18%; dilatation of cerebellar sulci, superior cerebellar cistern and cisterna magna, suggesting cerebellar atrophy, in 11%. Low density area was found in the cerebral hemisphere in 11% of cases. Contrast enhancement, performed on 25 CT records, did not show any change. There was no correlation between CT abnormality and duration of the illness. Although abnormal CT tended to occur more frequently during exacerbations and chronic stable state than during remissions, the difference was not statistically significant. CT abnormalities suggesting brainstem atrophy, cerebellar atrophy or plaques were found exclusively during exacerbations and chronic stable state. The occurrence of CT abnormalities was not significantly different among various clinical forms which were classified based on clinically estimated sites of lesion, except that abnormal CT tended to occur less frequently in cases classified as the optic-spinal form. It is noteworthy that cerebral cortical atrophy and/or dilatation of the lateral ventricle were found in 31% of cases who did not show any clinical sign of cerebral involvement. There was a statistically significant correlation between CT abnormalities and levels of clinical disability. Eighty percent of the bedridden or severely disabled patients showed abnormal CT, in contrast with only 29% of those with moderate, slight or no disability. (author)
Disseminated lesions in the white matter of the cerebral hemispheres and confluent lesions at the borders of the lateral ventricles as seen on MRI are both considered acceptable paraclinical evidence for the diagnosis of multiple sclerosis. Similar changes are, however, also found in vascular diseases of the brain. We therefore aimed at identifying those additional traits in the infratentorial region, which in our experience are not frequently found in cerebrovascular pathology. We evaluated MR brain scans of 68 patients and found pontine lesions in 71% of cases with a clinically definite diagnosis (17 out of 24) and in 33% of cases with a probable diagnosis (14 out of 43). Lesions in the medulla oblongata were present in 50% and 16%, respectively, and in the midbrain in 25% and 7%, respectively. With rare exceptions all brainstem lesions were contiguous with the cisternal or ventricular cerebrospinal fluid spaces. In keeping with post-mortem reports the morphological spectrum ranged from large confluent patches to solitary, well delineated paramedian lesions or discrete linings of the cerebrospinal fluid border zones and were most clearly depicted from horizontal and sagittal T2 weighted SE-sequences. If there is a predilection for the outer or inner surfaces of the brainstem, such lesions can be considered an additional typical feature of multiple sclerosis and can be more reliably weighted as paraclinical evidence for a definite diagnosis. (orig.)
Minyong Kang; Chang Wook Jeong; Woo Suk Choi; Yong Hyun Park; Sung Yong Cho; Sangchul Lee; Seung Bae Lee; Ja Hyeon Ku; Sung Kyu Hong; Seok-Soo Byun; Hyeon Jeong; Cheol Kwak; Hyeon Hoe Kim; Eunsik Lee; Sang Eun Lee
2014-01-01
OBJECTIVES: Although the incidence of prostate cancer (PCa) is rapidly increasing in Korea, there are few suitable prediction models for disease recurrence after radical prostatectomy (RP). We established pre- and post-operative nomograms estimating biochemical recurrence (BCR)-free probability after RP in Korean men with clinically localized PCa. PATIENTS AND METHODS: Our sampling frame included 3,034 consecutive men with clinically localized PCa who underwent RP at our tertiary centers from...
Pedicini, Piernicola [I.R.C.C.S.-Regional-Cancer-Hospital-C.R.O.B, Unit of Nuclear Medicine, Department of Radiation and Metabolic Therapies, Rionero-in-Vulture (Italy); Department of Radiation and Metabolic Therapies, I.R.C.C.S.-Regional-Cancer-Hospital-C.R.O.B, Unit of Radiotherapy, Rionero-in-Vulture (Italy); Fiorentino, Alba [Sacro Cuore - Don Calabria Hospital, Radiation Oncology Department, Negrar, Verona (Italy); Simeon, Vittorio [I.R.C.C.S.-Regional-Cancer-Hospital-C.R.O.B, Laboratory of Preclinical and Translational Research, Rionero-in-Vulture (Italy); Tini, Paolo; Pirtoli, Luigi [University of Siena and Tuscany Tumor Institute, Unit of Radiation Oncology, Department of Medicine Surgery and Neurological Sciences, Siena (Italy); Chiumento, Costanza [Department of Radiation and Metabolic Therapies, I.R.C.C.S.-Regional-Cancer-Hospital-C.R.O.B, Unit of Radiotherapy, Rionero-in-Vulture (Italy); Salvatore, Marco [I.R.C.C.S. SDN Foundation, Unit of Nuclear Medicine, Napoli (Italy); Storto, Giovanni [I.R.C.C.S.-Regional-Cancer-Hospital-C.R.O.B, Unit of Nuclear Medicine, Department of Radiation and Metabolic Therapies, Rionero-in-Vulture (Italy)
2014-10-15
The aim of this study was to estimate a radiobiological set of parameters from the available clinical data on glioblastoma (GB). A number of clinical trial outcomes from patients affected by GB and treated with surgery and adjuvant radiochemotherapy were analyzed to estimate a set of radiobiological parameters for a tumor control probability (TCP) model. The analytical/graphical method employed to fit the clinical data allowed us to estimate the intrinsic tumor radiosensitivity (α), repair capability (b), and repopulation doubling time (T{sub d}) in a first phase, and subsequently the number of clonogens (N) and kick-off time for accelerated proliferation (T{sub k}). The results were used to formulate a hypothesis for a scheduleexpected to significantly improve local control. The 95 % confidence intervals (CI{sub 95} {sub %}) of all parameters are also discussed. The pooled analysis employed to estimate the parameters summarizes the data of 559 patients, while the studies selected to verify the results summarize data of 104 patients. The best estimates and the CI{sub 95} {sub %} are α = 0.12 Gy{sup -1} (0.10-0.14), b = 0.015 Gy{sup -2} (0.013-0.020), α/b = 8 Gy (5.0-10.8), T{sub d} = 15.4 days (13.2-19.5), N = 1 . 10{sup 4} (1.2 . 10{sup 3} - 1 . 10{sup 5}), and T{sub k} = 37 days (29-46). The dose required to offset the repopulation occurring after 1 day (D{sub prolif}) and starting after T{sub k} was estimated as 0.30 Gy/day (0.22-0.39). The analysis confirms a high value for the α/b ratio. Moreover, a high intrinsic radiosensitivity together with a long kick-off time for accelerated repopulation and moderate repopulation kinetics were found. The results indicate a substantial independence of the duration of the overall treatment and an improvement in the treatment effectiveness by increasing the total dose without increasing the dose fraction. (orig.) [German] Schaetzung eines strahlenbiologischen Parametersatzes auf der Grundlage klinischer Daten bei
Planning and pre-testing: the key to effective AIDS education materials.
Ostfield, M L; Romocki, L S
1991-06-01
The steps in designing and producing effective AIDS prevention educational materials are outlines, using as an example a brochure originated in St. Lucia for clients at STD clinics. The brochure was intended to be read by clients as they waited for their consultation, thus it was targeted to a specific audience delimited by age, sex, language, educational level, religion and associated medical or behavioral characteristics. When researching the audience, it is necessary to learn the medium they best respond to, what they know already, what is their present behavior, how they talk about AIDS, what terms they use, how they perceive the benefits of AIDS prevention behavior, what sources of information they trust. The minimum number of key messages should be selected. Next the most appropriate channel of communication is identified. Mass media are not always best for a target audience, "little media" such as flyers and give-always may be better. The draft is then pre-tested by focus groups and interviews, querying about the text separately, then images, color, format, style. Listen to the way the respondents talk about the draft. Modify the draft and pre-test again. Fine-tune implications of the message for realism in emotional responses, respect, self-esteem, admiration and trust. To achieve wide distribution it is a good idea to involve community leaders to production of the materials, so they will be more likely to take part in the distribution process. PMID:12316892
A Clinical model to identify patients with high-risk coronary artery disease
Y. Yang (Yelin); L. Chen (Li); Y. Yam (Yeung); S. Achenbach (Stephan); M. Al-Mallah (Mouaz); D.S. Berman (Daniel); M.J. Budoff (Matthew); F. Cademartiri (Filippo); T.Q. Callister (Tracy); H.-J. Chang (Hyuk-Jae); V.Y. Cheng (Victor); K. Chinnaiyan (Kavitha); R.C. Cury (Ricardo); A. Delago (Augustin); A. Dunning (Allison); G.M. Feuchtner (Gudrun); M. Hadamitzky (Martin); J. Hausleiter (Jörg); R.P. Karlsberg (Ronald); P.A. Kaufmann (Philipp); Y.-J. Kim (Yong-Jin); J. Leipsic (Jonathon); T.M. LaBounty (Troy); F.Y. Lin (Fay); E. Maffei (Erica); G.L. Raff (Gilbert); L.J. Shaw (Leslee); T.C. Villines (Todd); J.K. Min (James K.); B.J.W. Chow (Benjamin)
2015-01-01
textabstractObjectives This study sought to develop a clinical model that identifies patients with and without high-risk coronary artery disease (CAD). Background Although current clinical models help to estimate a patient's pre-test probability of obstructive CAD, they do not accurately identify th
Angelman Syndrome is a severe neurological disorder. No other case has been reported in our country until now. There are two children reported with the clinical suspicion of Angelman Syndrome. They were treated at the Departamento de Neurologia del Hospital Nacional de Ninos. The information was taken from their medical records. The two patients present the four cardinal clinical features, including severe developmental delay, profound speech impairment, ataxia and a happy, sociable disposition. In addition, the patients displayed other characteristics: seizures associated with a typical spike and slow wave activity on EEG an love for water. The clinical diagnosis is difficult because other disorders can mimic the features of Angelman Syndrome. Nonetheless, at an early age, the behavioral phenotype of happy disposition and hyperexcitability is the most important manifestation and appears to be decisive in the differential diagnosis of patients with psychomotor and language delay. (author)
Frasson, Maria; Calixto, Nassim; Cronemberger, Sebastião; de Aguiar, Regina Amélia Lopes Pessoa; Leão, Letícia Lima; de Aguiar, Marcos José Burle
2004-09-01
Oculodentodigital dysplasia (ODDD) is a rare inherited disorder affecting the development of the face, eyes, teeth, and limbs. The majority of cases of ODDD are inherited as an autosomal dominant condition. There are few reports of probable autosomal recessive transmission. Affected patients exhibit a distinctive physiognomy with a narrow nose, hypoplastic alae nasi, and anteverted nostrils, bilateral microphthalmos, and microcornea. Sometimes iris anomalies and secondary glaucoma are present. There are malformations of the distal extremities such as syndactyly. In addition, there are defects in the dental enamel with hypoplasia and yellow discoloration of the teeth. Less common features include hypotrichosis, intracranial calcifications, and conductive deafness secondary to recurrent otitis media. We describe three brothers with ODDD. Their parents are first cousins and present no features of ODDD. These data are in favor of autosomal recessive inheritance and suggest genetic heterogeneity for this entity. PMID:15512999
Tony Wu
2004-07-01
Full Text Available Background: The X-linked dominant Charcot-Marie-Tooth neuropathy (CMTX is ahereditary motor and sensory neuropathy linked to a variety of mutations inthe connexin32 (Cx32 gene. Clinical and genetic features of CMTX havenot previously been reported in Taiwanese.Methods: Clinical evaluations and electrophysiological studies were carried out on 25family members of a Taiwanese family group. Molecular genetic analysis ofthe Cx32 gene was performed. A sural nerve biopsy was obtained from 1patient.Results: Nine patients had clinical features of X-linked dominant inheritance and amoderate Charcot-Marie-Tooth (CMT neuropathy phenotype. Moleculargenetic analysis showed no mutation of the Cx32 coding region, but revealeda G-to-A transition at position -215 of the nerve-specific promoter P2 of theCx32 gene. Ptosis is 1 clinical manifestation of neuropathy in this probableCMTX family. Familial hyperthyroidism is an additional independent featureof the family. Electrophysiological and histological studies showed featuresof axonal neuropathy. Multimodality evoked potential studies revealed normalcentral motor and sensory conduction velocities.Conclusions: The presence of ptosis in this family illustrates the existence of clinical heterogeneityamong related family members with CMTX similar to that inCMT of autosomal inheritance. Electrophysiological and histological findingsrevealed normal central conduction and axonal neuropathy.
Minyong Kang
Full Text Available OBJECTIVES: Although the incidence of prostate cancer (PCa is rapidly increasing in Korea, there are few suitable prediction models for disease recurrence after radical prostatectomy (RP. We established pre- and post-operative nomograms estimating biochemical recurrence (BCR-free probability after RP in Korean men with clinically localized PCa. PATIENTS AND METHODS: Our sampling frame included 3,034 consecutive men with clinically localized PCa who underwent RP at our tertiary centers from June 2004 through July 2011. After inappropriate data exclusion, we evaluated 2,867 patients for the development of nomograms. The Cox proportional hazards regression model was used to develop pre- and post-operative nomograms that predict BCR-free probability. Finally, we resampled from our study cohort 200 times to determine the accuracy of our nomograms on internal validation, which were designated with concordance index (c-index and further represented by calibration plots. RESULTS: Over a median of 47 months of follow-up, the estimated BCR-free rate was 87.8% (1 year, 83.8% (2 year, and 72.5% (5 year. In the pre-operative model, Prostate-Specific Antigen (PSA, the proportion of positive biopsy cores, clinical T3a and biopsy Gleason score (GS were independent predictive factors for BCR, while all relevant predictive factors (PSA, extra-prostatic extension, seminal vesicle invasion, lymph node metastasis, surgical margin, and pathologic GS were associated with BCR in the post-operative model. The c-index representing predictive accuracy was 0.792 (pre- and 0.821 (post-operative, showing good fit in the calibration plots. CONCLUSIONS: In summary, we developed pre- and post-operative nomograms predicting BCR-free probability after RP in a large Korean cohort with clinically localized PCa. These nomograms will be provided as the mobile application-based SNUH Prostate Cancer Calculator. Our nomograms can determine patients at high risk of disease recurrence
Purpose: To study the impact of clinical predisposing factors on rectal normal tissue complication probability modeling using the updated results of the Dutch prostate dose-escalation trial. Methods and Materials: Toxicity data of 512 patients (conformally treated to 68 Gy [n = 284] and 78 Gy [n = 228]) with complete follow-up at 3 years after radiotherapy were studied. Scored end points were rectal bleeding, high stool frequency, and fecal incontinence. Two traditional dose-based models (Lyman-Kutcher-Burman (LKB) and Relative Seriality (RS) and a logistic model were fitted using a maximum likelihood approach. Furthermore, these model fits were improved by including the most significant clinical factors. The area under the receiver operating characteristic curve (AUC) was used to compare the discriminating ability of all fits. Results: Including clinical factors significantly increased the predictive power of the models for all end points. In the optimal LKB, RS, and logistic models for rectal bleeding and fecal incontinence, the first significant (p = 0.011–0.013) clinical factor was “previous abdominal surgery.” As second significant (p = 0.012–0.016) factor, “cardiac history” was included in all three rectal bleeding fits, whereas including “diabetes” was significant (p = 0.039–0.048) in fecal incontinence modeling but only in the LKB and logistic models. High stool frequency fits only benefitted significantly (p = 0.003–0.006) from the inclusion of the baseline toxicity score. For all models rectal bleeding fits had the highest AUC (0.77) where it was 0.63 and 0.68 for high stool frequency and fecal incontinence, respectively. LKB and logistic model fits resulted in similar values for the volume parameter. The steepness parameter was somewhat higher in the logistic model, also resulting in a slightly lower D50. Anal wall DVHs were used for fecal incontinence, whereas anorectal wall dose best described the other two endpoints. Conclusions
Soto, A.; Solari, L.; Díaz, J.; Mantilla, A.; Matthys, F.; Van der Stuyft, P.
2011-01-01
Background: Clinical suspects of pulmonary tuberculosis in which the sputum smears are negative for acid fast bacilli represent a diagnostic challenge in resource constrained settings. Our objective was to validate an existing clinical-radiographic score that assessed the probability of smear-negative pulmonary tuberculosis (SNPT) in high incidence settings in Peru. Methodology/Principal Findings: We included in two referral hospitals in Lima patients with clinical suspicion of pulmonary ...
Lee, Gee Won; Jeong, Yeon Joo; Kim, Chang Won [Dept. of Radiology, Pusan National University Hospital, Pusan National University School of Medicine and Medical Research Institutute, Pusan (Korea, Republic of); Chun, Sung Won; Kim, Yeong Dae [Dept. of Cardiovascular and Thoracic Surgery, Pusan National University Hospital, Pusan National University School of Medicine and Medical Research Institutute, Pusan (Korea, Republic of); Kim, Kun Il [Dept. of Radiology, Pusan National University Yangsan Hospital, Pusan National University School of Medicine and Medical Research Institutute, Yangsan (Korea, Republic of); Song, Jong Woon [Dept. of Radiology, Haeundae Paik Hospital, Inje University School of Medicine, Pusan (Korea, Republic of)
2011-05-15
To assess the use of CT angiography (CTA) in the diagnostic evaluation of pulmonary thromboembolism (PE) in a country with low PE prevalence and correlate the diagnostic performance of CTA with the clinical pretest probability and D-dimer values. The institutional review board approved this retrospective study. The observers reviewed all 660 CTAs and calculated the PE clot burden scores. The pretest probability of PE according to the Wells criteria and D-dimer values were calculated (clinical data were available for 371 of the 660 patients). We correlated the PE positivity rates of CTA and a PE clot burden score with the D-dimer values and pretest probability using Pearson's correlation coefficient. Of the 371 patients whose clinical data were available, 122 (32.8%) had PEs. None of the patients with both a normal D-dimer value and a low clinical probability had a PE. PE positivity rates of CTA were correlated with clinical pretest probability (r = 0.164, p = 0.002) and D-dimer values (r = 0.361, p < 0.001). PE clot burden scores were correlated with D-dimer values (r = 0.296, p < 0.001). Although PE positivity rates of CTA in a country with low prevalence were higher than those in a country with a higher prevalence, approximately 30% of the yield still represents an overuse of CTA. CTA should be performed after the pretest probability has been assigned and if the result of a D-dimer assay is abnormal.
Lexicographic probability, conditional probability, and nonstandard probability
Halpern, Joseph Y.
2003-01-01
The relationship between Popper spaces (conditional probability spaces that satisfy some regularity conditions), lexicographic probability systems (LPS's), and nonstandard probability spaces (NPS's) is considered. If countable additivity is assumed, Popper spaces and a subclass of LPS's are equivalent; without the assumption of countable additivity, the equivalence no longer holds. If the state space is finite, LPS's are equivalent to NPS's. However, if the state space is infinite, NPS's are ...
40 CFR 86.1334-84 - Pre-test engine and dynamometer preparation.
2010-07-01
... 40 Protection of Environment 19 2010-07-01 2010-07-01 false Pre-test engine and dynamometer preparation. 86.1334-84 Section 86.1334-84 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... from the secondary dilution tunnel . Particulate sample filters need not be stabilized or weighed,...
Free Fall Misconceptions: Results of a Graph Based Pre-Test of Sophomore Civil Engineering Students
Montecinos, Alicia M.
2014-01-01
A partially unusual behaviour was found among 14 sophomore students of civil engineering who took a pre test for a free fall laboratory session, in the context of a general mechanics course. An analysis contemplating mathematics models and physics models consistency was made. In all cases, the students presented evidence favoring a correct free…
Pre-test analyses for the NESC1 spinning cylinder experiment
The spinning cylinder experiment organised by the Network for the Evaluation of Steel Components (NESC) is designed to investigate the cleavage initiation behaviour of both surface breaking and subclad defects in simulated end of life RPV material, exposed to a pressurised thermal shock transient. Pre-test structural integrity assessments are performed by the NESC Structural Analysis Task Group (TG3). The results of these structural integrity assessments are used to determine the design of the experiment and especially the sizes of the introduced defects. In this report the results of the pre-test analyses performed by the group Applied Mechanics at ECN - Nuclear Energy are described. Elastic as well as elasto-plastic structural analyses are performed for a surface breaking and a subclad defect in a forged cylinder with a 4 mm cladding. The semi elliptical defects have a depth of 40 mm and an aspect ratio of 1:3. (orig.)
Hammond, Christopher J; Hassan, Tajek B.
2005-01-01
Clinical risk stratification and D-dimer assay can be of use in excluding pulmonary embolism in patients presenting to emergency departments but many D-dimer assays exist and their accuracy varies. We used clinical risk stratification combined with a quantitative latex-agglutination D-dimer assay to screen patients before arranging further imaging if required. Retrospective analysis of a sequential series of 376 patients revealed that no patient with a D-dimer of
Pre-test calculations of SPES experiment - a loss of main feedwater transient
Results of a pre-test calculation of international standard experiment ISP-22 SPES are shown in this paper. SPES facility represents a model of three-loop PWR power plant which was used to perform an experimental loss of main feedwater transient with emergency feedwater delayed. calculation was performed by RELAP5/MOD2/36.1 computer code which we had converted to VAX computers. (author)
van Vugt, Heidi A.; Kranse, Ries; Steyerberg, Ewout W.; van der Poel, Henk G.; Busstra, Martijn; Kil, Paul; Oomens, Eric H.; de Jong, Igle J.; Bangma, Chris H.; Roobol, Monique J.
2012-01-01
Background: Prediction models need validation to assess their value outside the development setting. Objective: To assess the external validity of the European Randomised study of Screening for Prostate Cancer (ERSPC) Risk Calculator (RC) in a contemporary clinical cohort. Methods: The RC calculates
Gudder, Stanley P
2014-01-01
Quantum probability is a subtle blend of quantum mechanics and classical probability theory. Its important ideas can be traced to the pioneering work of Richard Feynman in his path integral formalism.Only recently have the concept and ideas of quantum probability been presented in a rigorous axiomatic framework, and this book provides a coherent and comprehensive exposition of this approach. It gives a unified treatment of operational statistics, generalized measure theory and the path integral formalism that can only be found in scattered research articles.The first two chapters survey the ne
Asmussen, Søren; Albrecher, Hansjörg
, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle and the connection to other applied probability areas, like queueing theory. In this substantially......The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities...... updated and extended second version, new topics include stochastic control, fluctuation theory for Levy processes, Gerber–Shiu functions and dependence....
Shiryaev, Albert N
2016-01-01
This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.
R. Gerber
2006-06-01
Full Text Available Two out of a group of 23 mares exposed to tef hay contaminated with Datura ferox (and possibly D. stramonium developed colic. The 1st animal was unresponsive to conservative treatment, underwent surgery for severe intestinal atony and had to be euthanased. The 2nd was less seriously affected, responded well to analgesics and made an uneventful recovery. This horse exhibited marked mydriasis on the first 2 days of being poisoned and showed protracted, milder mydriasis for a further 7 days. Scopolamine was chemically confirmed in urine from this horse for 3 days following the colic attack, while atropine could just be detected for 2 days. Scopolamine was also the main tropane alkaloid found in the contaminating plant material, confirming that this had most probably been a case of D. ferox poisoning. Although Datura intoxication of horses from contaminated hay was suspected previously, this is the 1st case where the intoxication could be confirmed by urine analysis for tropane alkaloids. Extraction and detection methods for atropine and scopolamine in urine are described employing enzymatic hydrolysis followed by liquid-liquid extraction and liquid chromatography tandem mass spectrometry (LC/MS/MS.
Pobiruchin, Monika; Bochum, Sylvia; Martens, Uwe M; Kieser, Meinhard; Schramm, Wendelin
2016-06-01
Records of female breast cancer patients were selected from a clinical cancer registry and separated into three cohorts according to HER2-status (human epidermal growth factor receptor 2) and treatment with or without Trastuzumab (a humanized monoclonal antibody). Propensity score matching was used to balance the cohorts. Afterwards, documented information about disease events (recurrence of cancer, metastases, remission of local/regional recurrences, remission of metastases and death) found in the dataset was leveraged to calculate the annual transition probabilities for every cohort. PMID:27054173
Pre-test analytical support for experiments quench-10, -11 and -12
Pre-test analyses using MELCOR1.8.5, SCDAP/RELAP5 and SCDAPSIM have been performed in collaboration of PSI and FZK to support FZK QUENCH programme of electrically-heated bundle tests on reflood of a degraded core. The experiments include QUENCH-10 and -11, recently carried out in the EU 5th Framework LACOMERA programme, with analytical support in the 6th Framework SARNET network of excellence, and QUENCH-12 to be performed in 2006 in support of the project ISTC1648-2. Each test involves novel features that pose challenges in the planning analyses to determine the test protocol and that require code and input changes to accommodate the test conditions. Special versions of the SCDAP codes were developed to simulate the effect of air on Zircaloy oxidation in the PWR air ingress test QUENCH-10, following pre-oxidation in steam. The analyses highlighted potential difficulties during the air oxidation and reflood phases that were avoided by changes in the test protocol. A more gradual thermal excursion could be achieved, facilitating control of the test, interpretation of data, and minimising the risk of a major excursion during quench. QUENCH-11 involved the steady boildown of an initially water-filled PWR bundle. Additional heating and water supplies were needed to give the desired conditions, and these needed to be tightly specified. Data from pre-tests with lower maximum temperatures were used to benchmark the models for defining the main test. QUENCH-12 examines the effect of WWER bundle configuration and cladding on heat-up, oxidation, and quench response. The bundle is significantly modified with changes to cladding material (Zr/1%Nb instead of Zry-4), electrical heating, and geometry, hence to radiative heat transfer, hydraulics and oxidation characteristics. Oxidation correlations for Zr/1%Nb in steam were introduced into special versions of SCDAP. Pre-test calculations suggest that the modified kinetics have only a minor effect on the thermal response, but
Pre-test analysis for the KNGR DVI performance test facility using FLUENT
Pre-test analysis using a FLUENT code has been performed for the KGNR(Korean Next Generation Reactor) DVI(Direct Vessel Injection) performance test facility which is a full height and 1/24.3 volume scaled separate effect test facility. The ideal gas discharge condition is considered to simulation a steam discharge condition. The scale effects on the flow pattern, pressure distribution, and similarity for scaled model are numerically tested. From the various results for the scale effects, it was found that the similarity of hydraulics is founded
Tatiana Vladimirovna Mokina
2015-10-01
Full Text Available Cerebral autosomal dominant arteriopathy with subcortical infarcts and leukoencephalopathy (CADASIL syndrome is a congenital small-vessel disease running with recurrent lacunar infarcts and leading to gradually progressive subcortical, pseudobulbar, and cerebellar syndromes and dementia. Neuroimaging reveal multiple lacunar infarcts in the basal ganglia, thalamus, pons Varolii, and cerebral hemispheric white matter, as well as cerebral atrophy. The specific feature of the disease is white matter lesion adjacent to the temporal horns of the lateral ventricles and to the external capsules. The paper describes a patient with CADASIL syndrome. The latter runs a progressive course and includes the following neurological disorders: cognitive, pyramidal, extrapyramidal, and axial ones. This clinical case was differentially diagnosed with multiple sclerosis, including with consideration for neuroimaging findings. The CADASIL syndrome is a rare potentially menacing neurological condition that is observed in young patients and requires a detailed examination using current diagnostic techniques.
Fichman-Charchat, Helenice; Miranda, Cristina Vieira; Fernandes, Conceição Santos; Mograbi, Daniel; Oliveira, Rosinda Martins; Novaes, Regina; Aguiar, Daniele
2016-02-01
The diagnosis of early signs of Alzheimer's disease (AD) is a major challenge in a heterogeneous population. Objective To investigate the use of the Brief Cognitive Screening Battery (BCSB) for the diagnosis of mild AD in a geriatric outpatient unit of a public hospital in the city of Rio de Janeiro. Method BCSB was administered to 51 elderly adults with a clinical diagnosis of probable AD and 123 older adults without dementia (non-AD). Results AD patients performed worse than non-AD group in all BCSB tests, except Clock Drawing (p = 0.10). The ROC curves and Logistic Regression analysis indicated that delayed recall in the figure memory test was the best predictor, screening mild AD patients with sensibility and specificity superior to 80%. Conclusion The BCSB was accurate in identifying people with AD in a geriatric outpatient clinic at a public hospital, including elderly people with chronic diseases, physical frailty and cognitive impairment. PMID:26690839
Pre-test analysis results of a PWR steel lined pre-stressed concrete containment model
Pre-stressed concrete nuclear containment serves as the ultimate barrier against the release of radioactivity to the environment. This ultimate barrier must be checked for its ultimate load carrying capacity. BARC participated in a Round Robin analysis activity which is co-sponsored by Sandia National Laboratory, USA and Nuclear Power Engineering Corporation Japan for the pre-test prediction of a 1:4 size Pre-stressed Concrete Containment Vessel. In house finite element code ULCA was used to make the test predictions of displacements and strains at the standard output locations. The present report focuses on the important landmarks of the pre-test results, in sequential terms of first crack appearance, loss of pre-stress, first through thickness crack, rebar and liner yielding and finally liner tearing at the ultimate load. Global and local failure modes of the containment have been obtained from the analysis. Finally sensitivity of the numerical results with respect to different types of liners and different constitutive models in terms of bond strength between concrete and steel and tension-stiffening parameters are examined. The report highlights the important features which could be observed during the test and guidelines are given for improving the prediction in the post test computation after the test data is available. (author)
A well test analysis method accounting for pre-test operations
We propose to use regular monitoring data from a production or injection well for estimating the formation hydraulic properties in the vicinity of the wellbore without interrupting the operations. In our approach, we select a portion of the pumping data over a certain time interval and then derive our conclusions from analysis of these data. A distinctive feature of the proposed approach differing it form conventional methods is in the introduction of an additional parameter, an effective pre-test pumping rate. The additional parameter is derived based on a rigorous asymptotic analysis of the flow model. Thus, we account for the non-uniform pressure distribution at the beginning of testing time interval caused by pre-test operations at the well. By synthetic and field examples, we demonstrate that deviation of the matching curve from the data that is usually attributed to skin and wellbore storage effects, can also be interpreted through this new parameter. Moreover, with our method, the data curve is matched equally well and the results of the analysis remain stable when the analyzed data interval is perturbed, whereas traditional methods are sensitive to the choice of the data interval. A special efficient minimization procedure has been developed for searching the best fitting parameters. We enhanced our analysis above with a procedure of estimating ambient reservoir pressure and dimensionless wellbore radius. The methods reported here have been implemented in code ODA (Operations Data Analysis). A beta version of the code is available for free testing and evaluation to interested parties
Pre-test CFD Calculations for a Bypass Flow Standard Problem
Rich Johnson
2011-11-01
The bypass flow in a prismatic high temperature gas-cooled reactor (HTGR) is the flow that occurs between adjacent graphite blocks. Gaps exist between blocks due to variances in their manufacture and installation and because of the expansion and shrinkage of the blocks from heating and irradiation. Although the temperature of fuel compacts and graphite is sensitive to the presence of bypass flow, there is great uncertainty in the level and effects of the bypass flow. The Next Generation Nuclear Plant (NGNP) program at the Idaho National Laboratory has undertaken to produce experimental data of isothermal bypass flow between three adjacent graphite blocks. These data are intended to provide validation for computational fluid dynamic (CFD) analyses of the bypass flow. Such validation data sets are called Standard Problems in the nuclear safety analysis field. Details of the experimental apparatus as well as several pre-test calculations of the bypass flow are provided. Pre-test calculations are useful in examining the nature of the flow and to see if there are any problems associated with the flow and its measurement. The apparatus is designed to be able to provide three different gap widths in the vertical direction (the direction of the normal coolant flow) and two gap widths in the horizontal direction. It is expected that the vertical bypass flow will range from laminar to transitional to turbulent flow for the different gap widths that will be available.
Purpose. - Conformal irradiation of non-small cell lung carcinoma (NSCLC) is largely based on a precise definition of the nodal clinical target volume (CTVn). The reduction of the number of nodal stations to be irradiated would render tumor dose escalation more achievable. The aim of this work was to design an mathematical tool based on documented data, that would predict the risk of metastatic involvement for each nodal station. Methods and material. - From the large surgical series published in the literature we looked at the main pre-treatment parameters that modify the risk of nodal invasion. The probability of involvement for the 17 nodal stations described by the American Thoracic Society (ATS) was computed from all these publications and then weighted according to the French epidemiological data. Starting from the primitive location of the tumour as the main characteristic, we built a probabilistic tree for each nodal station representing the risk distribution as a function of each tumor feature. From the statistical point of view, we used the inversion of probability trees method described by Weinstein and Feinberg. Results. -Taking into account all the different parameters of I the pre-treatment staging relative to each level of the ATS map brings up to 20,000 different combinations. The first chosen parameters in the tree were, depending on the tumour location, the histological classification, the metastatic stage, the nodal stage weighted in function of the sensitivity and specificity of the diagnostic examination used (PET scan, CAT scan) and the tumoral stage. A software is proposed to compute a predicted probability of involvement of each nodal station for any given clinical presentation.Conclusion. -To better define the CTVn in NSCLC 3DRT, we propose a software that evaluates the mediastinal nodal involvement risk from easily accessible individual pretreatment parameters. (authors)
Wood Stephen
2009-10-01
Full Text Available Abstract Background The rate of caesarean sections is increasing worldwide, yet medical literature informing women with uncomplicated pregnancies about relative risks and benefits of elective caesarean section (CS compared with vaginal delivery (VD remains scarce. A decision board may address this gap, providing systematic evidence-based information so that patients can more fully understand their treatment options. The objective of our study was to design and pre-test a decision board to guide clinical discussions and enhance informed decision-making related to delivery approach (CS or VD in uncomplicated pregnancy. Methods Development of the decision board involved two preliminary studies to determine women's preferred mode of risk presentation and a systematic literature review for the most comprehensive presentation of medical risks at the time (VD and CS. Forty women were recruited to pre-test the tool. Eligible subjects were of childbearing age (18-40 years but were not pregnant in order to avoid raising the expectation among pregnant women that CS was a universally available birth option. Women selected their preferred delivery approach and completed the Decisional Conflict Scale to measure decisional uncertainty before and after reviewing the decision board. They also answered open-ended questions reflecting what they had learned, whether or not the information had helped them to choose between birth methods, and additional information that should be included. Descriptive statistics were used to analyse sample characteristics and women's choice of delivery approach pre/post decision board. Change in decisional conflict was measured using Wilcoxon's sign rank test for each of the three subscales. Results The majority of women reported that they had learned something new (n = 37, 92% and that the tool had helped them make a hypothetical choice between delivery approaches (n = 34, 85%. Women wanted more information about neonatal risks and
TOPFLOW-PTS experiments. pre-test calculations with NEPTUNECFD code
Hypothetical Small Break Loss Of Coolant Accident is identified as one of the most severe transients leading to a potential huge Pressurized Thermal Shock on the Reactor Pressure Vessel (RPV). This may result in two-phase flow configurations in the cold legs, according to the operating conditions, and to reliably assess the RPV wall integrity, advanced two-phase flow simulations are required. Related needs in development and/or validation of these advanced models are important, and the on-going TOPFLOW-PTS experimental program was designed to provide a well documented data base to meet these needs. This paper focuses on pre-test NEPTUNECFD simulations of TOPFLOW-PTS experiments; these simulations were performed to (i) help in the definition of the test matrix and test procedure, and (ii) check the presence of the different key physical phenomena at the mock-up scale. (author)
NESC-1 spinning cylinder experiment. Pre-test fracture analysis evaluation
A pre-test structural analysis evaluation has been conducted by Electricite de France (EDF) including several three dimensional elastic and elastic-plastic computations. Two cylinder geometries have been studied. Higher values of the stress intensity factor are obtained in both geometries in the elastic-plastic computations due to the yielding of the cladding during the thermal transient. The comparisons between the stress intensity factors and the expected base metal toughness show that cleavage initiation must occur preferably in base metal near the interface with cladding. The comparison between both geometries show also that the thicker vessel with a deeper semi-elliptical sub-clad flaw (70 mm deep) is more favourable to cleavage initiation near the base metal - cladding interface. (K.A.)
Pre-test prediction for LOBI test A1-04 (PREX test)
This report contains the pre-test prediction for the first LOBI test A1-04, which has been chosen as Pre-Prediction Exercise (LOBI-PREX). The test A1-04 will be a simulation of a nearly 200% double ended off-set shear break in the cold leg of the primary system of a four loop PWR. The prediction was performed with the RELAP4/Mod 6 computer code. The report gives the test specification (initial and boundary conditions), a brief description of the RELAP4 model used for the LOBI test facility and a short analysis of the predicted system behaviour. A complete RELAP4 input listing is given in Appendix A
FUMEX cases 1, 2, and 3 calculated pre-test and post-test results
Two versions (modified pre-test and modified post-test) of PIN-micro code were used to analyse the fuel rod behaviour of three FUMEX experiments. The experience of applying PIN-micro code with its simple structure and old conception of the steady-state operation shows significant difficulties in treating the complex processes like those in FUMEX experiments. These difficulties were partially overcame through different model modifications and corrections based on special engineering estimations and the results obtained as a whole do not seem unreasonable. The calculations have been performed by a group from two Bulgarian institutions in collaboration with specialists from the Kurchatov Research Center. 1 tab., 14 figs., 8 refs
BNL NONLINEAR PRE TEST SEISMIC ANALYSIS FOR THE NUPEC ULTIMATE STRENGTH PIPING TEST PROGRAM
The Nuclear Power Engineering Corporation (NUPEC) of Japan has been conducting a multi-year research program to investigate the behavior of nuclear power plant piping systems under large seismic loads. The objectives of the program are: to develop a better understanding of the elasto-plastic response and ultimate strength of nuclear piping; to ascertain the seismic safety margin of current piping design codes; and to assess new piping code allowable stress rules. Under this program, NUPEC has performed a large-scale seismic proving test of a representative nuclear power plant piping system. In support of the proving test, a series of materials tests, static and dynamic piping component tests, and seismic tests of simplified piping systems have also been performed. As part of collaborative efforts between the United States and Japan on seismic issues, the US Nuclear Regulatory Commission (USNRC) and its contractor, the Brookhaven National Laboratory (BNL), are participating in this research program by performing pre-test and post-test analyses, and by evaluating the significance of the program results with regard to safety margins. This paper describes BNL's pre-test analysis to predict the elasto-plastic response for one of NUPEC's simplified piping system seismic tests. The capability to simulate the anticipated ratcheting response of the system was of particular interest. Analyses were performed using classical bilinear and multilinear kinematic hardening models as well as a nonlinear kinematic hardening model. Comparisons of analysis results for each plasticity model against test results for a static cycling elbow component test and for a simplified piping system seismic test are presented in the paper
Robson, Barry; Boray, Srinidhi
2015-11-01
We extend Q-UEL, our universal exchange language for interoperability and inference in healthcare and biomedicine, to the more traditional fields of public health surveys. These are the type associated with screening, epidemiological and cross-sectional studies, and cohort studies in some cases similar to clinical trials. There is the challenge that there is some degree of split between frequentist notions of probability as (a) classical measures based only on the idea of counting and proportion and on classical biostatistics as used in the above conservative disciplines, and (b) more subjectivist notions of uncertainty, belief, reliability, or confidence often used in automated inference and decision support systems. Samples in the above kind of public health survey are typically small compared with our earlier "Big Data" mining efforts. An issue addressed here is how much impact on decisions should sparse data have. We describe a new Q-UEL compatible toolkit including a data analytics application DiracMiner that also delivers more standard biostatistical results, DiracBuilder that uses its output to build Hyperbolic Dirac Nets (HDN) for decision support, and HDNcoherer that ensures that probabilities are mutually consistent. Use is exemplified by participating in a real word health-screening project, and also by deployment in a industrial platform called the BioIngine, a cognitive computing platform for health management. PMID:26386548
Stokes, Donna
2012-10-01
The student success rate in the algebra-based Introductory General Physics I course at the University of Houston (UH) and across the United States is low in comparison to success rates in other service courses. In order to improve student success rates, we have implemented, in addition to interactive teaching techniques, pre-testing as an early intervention process to identify and remediate at-risk students. The pre-testing includes a math and problem-solving skills diagnostic exam and pre-tests administered prior to all regular exams. Students identified as at risk based on their scores on these pre-tests are given incentives to utilize a tutoring intervention consisting of on-line math tutoring to address math deficiencies and tutoring by graduate Physics Teaching Assistants to address student understanding of the physics concepts. Results from 503 students enrolled in three sections of the course showed that 78% of the students identified as at-risk students by the diagnostic exam who completed the math tutorial successfully completed the course, as compared to 45% of at-risk students who did not complete the math tutorial. Results of the pre-testing before each regular exam showed that all students who were identified as at risk based on pre-test scores had positive gains ranging from 9 -- 32% for the three regular exams. However, the large standard deviations of these gains indicate that they are not statistically significant; therefore, pretesting before exams will not be offer in the course. However, utilization of the math tutorials as remediation will continue to be offered to all sections of the algebra-based course at UH with the goal of significantly improving the overall success rates for the introductory physics courses.
Pre-test calculations for FAL-19 and FAL-20 using the INSPECT code
Pre-test calculations have been carried out for tests FAL-19 and FAL-20. These experiments will be conducted in early 1993 as part of the Falcon test matrix, and have the objective of studying iodine chemistry within the containment under conditions simulating aspects of a severe accident within a light water reactor. In order to make these predictions it was assumed that 10% of the iodine inventory entered the containment as I2, and that in FAL-19 (high humidity in the containment) reaction of I2 with steel was irreversible and in FAL-20 (low humidity) it was reversible. In FAL-20, I2 was predicted to transfer from the steel to paint and sump. Results predict rapid uptake by walls and very little long term volatility apart from a low level of CH3I. A final report of this work will be issued in December 1992 that also takes account of the role of non-aqueous aerosols on the iodine behaviour. (author)
Pre-test prediction report LOBI-MOD2 Test BT-12 large steam line break
The RETRAN-02 code has been selected by the CEGB for independent assessment of the thermal hydraulic component of the intact circuit fault safety case for Sizewell B. An important source of validation data for RETRAN is the European Community sponsored LOB1-MOD2 Integral Test Facility. One component of the agreed LOB1 test matrix is the large (100%) steam line break test BT-12 for which the UK has been designated as partner country. This report details the pre-test predictions undertaken in support of Test BT-12 using the RETRAN-02/Mod 3 code. Three separate analyses are presented. In addition to the best estimate prediction, two scoping predictions are presented which respectively minimise and maximise the primary cooldown. The best estimate calculation was undertaken using dynamic slip with multi-node steam generator representations. The maximum cooldown was obtained using a single bubble rise volume broken loop steam generator model to minimise the liquid carryover to the break. The minimum cooldown used full noding for the broken loop steam generator but without slip (ie equal phase velocities) to maximise the carryover. A number of modelling difficulties had to be overcome including steady state initialisation at the zero feed and steam flow hot standby condition. (author)
Epco Hasker
Full Text Available INTRODUCTION: Asymptomatic persons infected with the parasites causing visceral leishmaniasis (VL usually outnumber clinically apparent cases by a ratio of 4-10 to 1. We assessed the risk of progression from infection to disease as a function of DAT and rK39 serological titers. METHODS: We used available data on four cohorts from villages in India and Nepal that are highly endemic for Leishmania donovani. In each cohort two serosurveys had been conducted. Based on results of initial surveys, subjects were classified as seronegative, moderately seropositive or strongly seropositive using both DAT and rK39. Based on the combination of first and second survey results we identified seroconvertors for both markers. Seroconvertors were subdivided in high and low titer convertors. Subjects were followed up for at least one year following the second survey. Incident VL cases were recorded and verified. RESULTS: We assessed a total of 32,529 enrolled subjects, for a total follow-up time of 72,169 person years. Altogether 235 incident VL cases were documented. The probability of progression to disease was strongly associated with initial serostatus and with seroconversion; this was particularly the case for those with high titers and most prominently among seroconvertors. For high titer DAT convertors the hazard ratio reached as high as 97.4 when compared to non-convertors. The strengths of the associations varied between cohorts and between markers but similar trends were observed between the four cohorts and the two markers. DISCUSSION: There is a strongly increased risk of progressing to disease among DAT and/or rK39 seropositives with high titers. The options for prophylactic treatment for this group merit further investigation, as it could be of clinical benefit if it prevents progression to disease. Prophylactic treatment might also have a public health benefit if it can be corroborated that these asymptomatically infected individuals are infectious
Okazaki, Shintaro; Alonso Rivas, Javier
2002-01-01
Discussion of research methodology for evaluating the degree of standardization in multinational corporations' online communication strategies across differing cultures focuses on a research framework for cross-cultural comparison of corporate Web pages, applying traditional advertising content study techniques. Describes pre-tests that examined…
Pre-test CFD simulations on TOPFLOW-PTS experiments with ANSYS CFX 12.0
Some scenarios for Small Break Loss Of Coolant Accidents (SB-LOCA) lead to an Emergency Core Cooling (ECC) water injection into the cold leg of a Pressurized Water Reactor (PWR). The cold water mixes there with a hot coolant present in the primary circuit. The mixture flows to the downcomer where further mixing of fluids takes place. Single-phase as well as two-phase PTS (Pressurized Thermal Shock) situations have to be considered. Pressurized Thermal Shock implies the occurrence of thermal loads on the Reactor Pressure Vessel wall. In order to predict thermal gradients in the structural components of the Reactor Pressure Vessel (RPV) wall knowledge of transient temperature distribution in the downcomer is needed. The prediction of the temperature distribution requires reliable Computational Fluid Dynamic simulations. In case of two-phase PTS situations the water level in the RPV has dropped down to the height position of the cold leg nozzle or below leading to a partially filled or totally uncovered cold leg. In the frame of the EU project NURISP (Nuclear Reactor Integrated Simulation Project) attempts are made to improve the CFD modelling for two-phase PTS situations. This paper presents pre-test simulations on TOPFLOW-PTS experiments. The experiments will be carried out on the TOPFLOW-PTS test facility of the Forschungszentrum Dresden-Rossendorf. For the numerical investigations in the frame of NURISP two reference cases were defined: one for steady air-water and one for steady steam-water flow. The simulations were performed by using the CFD-code ANSYS CFX 12.0. Best practice guidelines were considered as far as possible. A homogeneous model was applied for momentum equations. Turbulence was modelled with the homogeneous Shear Stress Transport turbulence model. In all simulations the cold leg was 50 pc full of water. In case of air-water simulation the operating conditions for both fluids were 40 deg. C - 50 deg. C and 22.5 bar for the temperature and pressure
Clark Melissa A
2007-09-01
Full Text Available Abstract Background Two trials were conducted to compare emergency department patient comprehension of rapid HIV pre-test information using different methods to deliver this information. Methods Patients were enrolled for these two trials at a US emergency department between February 2005 and January 2006. In Trial One, patients were randomized to a no pre-test information or an in-person discussion arm. In Trial Two, a separate group of patients were randomized to an in-person discussion arm or a Tablet PC-based video arm. The video, "Do you know about rapid HIV testing?", and the in-person discussion contained identical Centers for Disease Control and Prevention-suggested pre-test information components as well as information on rapid HIV testing with OraQuick®. Participants were compared by information arm on their comprehension of the pre-test information by their score on a 26-item questionnaire using the Wilcoxon rank-sum test. Results In Trial One, 38 patients completed the no-information arm and 31 completed the in-person discussion arm. Of these 69 patients, 63.8% had twelve years or fewer of formal education and 66.7% had previously been tested for HIV. The mean score on the questionnaire for the in-person discussion arm was higher than for the no information arm (18.7 vs. 13.3, p ≤ 0.0001. In Trial Two, 59 patients completed the in-person discussion and 55 completed the video arms. Of these 114 patients, 50.9% had twelve years or fewer of formal education and 68.4% had previously been tested for HIV. The mean score on the questionnaire for the video arm was similar to the in-person discussion arm (20.0 vs. 19.2; p ≤ 0.33. Conclusion The video "Do you know about rapid HIV testing?" appears to be an acceptable substitute for an in-person pre-test discussion on rapid HIV testing with OraQuick®. In terms of adequately informing ED patients about rapid HIV testing, either form of pre-test information is preferable than for patients
NESC-1 spinning cylinder experiment: Pre-test fracture analysis evaluation
The NESC project (Network for the Evaluation of Steel Components) has been started in Europe with fundings from European Community (EC) and Health and Safety Executive (HSE, UK). This project contains several aspects of the structural integrity assessment procedure and more specifically nondestructive examination, fracture mechanics and materials characterization. A first test is being planned at the AEA Technology Laboratories (Risley, UK) on the Spinning Cylinder test facility. The experiment will be conducted on a large scale cladded cylinder containing surface and subclad cracks exposed to a pressurized thermal shock transient (PTS). The main purpose of the test is to obtain the cleavage initiation in base metal. Within the framework of this project, a pre-test structural analysis evaluation has been conducted by Electricite de France (EDF) including several three dimensional elastic and elastic-plastic computations. Two cylinder geometries have been studied, the first one with a 40 mm deep semi-elliptical subclad flaw, the second with a 70 mm deep semi-elliptical subclad flaw in a thicker vessel. Higher values of the stress intensity factor are obtained in both geometries in the elastic-plastic computations due to the yielding of the cladding during the thermal transient. The comparisons between the stress intensity factors and the expected base metal toughness show that cleavage initiation must occur preferably in base metal near the interface with cladding. The comparison between both geometries show also that the thicker vessel with a deeper semi-elliptical subclad flaw (70 mm deep) is more favorable to cleavage initiation near the base metal-cladding interface
The Quality of Working Life Questionnaire for Cancer Survivors (QWLQ-CS): a Pre-test Study
de Jong, Merel; Tamminga, Sietske J; de Boer, Angela G E M; Frings-Dresen, Monique H.W.
2016-01-01
Background Returning to and continuing work is important to many cancer survivors, but also represents a challenge. We know little about subjective work outcomes and how cancer survivors perceive being returned to work. Therefore, we developed the Quality of Working Life Questionnaire for Cancer Survivors (QWLQ-CS). Our aim was to pre-test the items of the initial QWLQ-CS on acceptability and comprehensiveness. In addition, item retention was performed by pre-assessing the relevance scores an...
Comparison of different coupling CFD–STH approaches for pre-test analysis of a TALL-3D experiment
Papukchiev, Angel, E-mail: angel.papukchiev@grs.de [Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS) mbH, Garching n. Munich (Germany); Jeltsov, Marti; Kööp, Kaspar; Kudinov, Pavel [KTH Royal Institute of Technology, Stockholm (Sweden); Lerchl, Georg [Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS) mbH, Garching n. Munich (Germany)
2015-08-15
Highlights: • Thermal-hydraulic system codes and CFD tools are coupled. • Pre-test calculations for the TALL-3D facility are performed. • Complex flow and heat transfer phenomena are modeled. • Comparative analyses have been performed. - Abstract: The system thermal-hydraulic (STH) code ATHLET was coupled with the commercial 3D computational fluid dynamics (CFD) software package ANSYS CFX to improve ATHLET simulation capabilities for flows with pronounced 3D phenomena such as flow mixing and thermal stratification. Within the FP7 European project THINS (Thermal Hydraulics of Innovative Nuclear Systems), validation activities for coupled thermal-hydraulic codes are being carried out. The TALL-3D experimental facility, operated by KTH Royal Institute of Technology in Stockholm, is designed for thermal-hydraulic experiments with lead-bismuth eutectic (LBE) coolant at natural and forced circulation conditions. GRS carried out pre-test simulations with ATHLET–ANSYS CFX for the TALL-3D experiment T01, while KTH scientists perform these analyses with the coupled code RELAP5/STAR CCM+. In the experiment T01 the main circulation pump is stopped, which leads to interesting thermal-hydraulic transient with local 3D phenomena. In this paper, the TALL-3D behavior during T01 is analyzed and the results of the coupled pre-test calculations, performed by GRS (ATHLET–ANSYS CFX) and KTH (RELAP5/STAR CCM+) are directly compared.
Interpretations of Negative Probabilities
Burgin, Mark
2010-01-01
In this paper, we give a frequency interpretation of negative probability, as well as of extended probability, demonstrating that to a great extent, these new types of probabilities, behave as conventional probabilities. Extended probability comprises both conventional probability and negative probability. The frequency interpretation of negative probabilities gives supportive evidence to the axiomatic system built in (Burgin, 2009; arXiv:0912.4767) for extended probability as it is demonstra...
Probability Aggregates in Probability Answer Set Programming
Saad, Emad
2013-01-01
Probability answer set programming is a declarative programming that has been shown effective for representing and reasoning about a variety of probability reasoning tasks. However, the lack of probability aggregates, e.g. {\\em expected values}, in the language of disjunctive hybrid probability logic programs (DHPP) disallows the natural and concise representation of many interesting problems. In this paper, we extend DHPP to allow arbitrary probability aggregates. We introduce two types of p...
Briggs, William M
2012-01-01
The probability leakage of model M with respect to evidence E is defined. Probability leakage is a kind of model error. It occurs when M implies that events $y$, which are impossible given E, have positive probability. Leakage does not imply model falsification. Models with probability leakage cannot be calibrated empirically. Regression models, which are ubiquitous in statistical practice, often evince probability leakage.
Koo, Reginald; Jones, Martin L.
2011-01-01
Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.
Goldberg, Samuel
2013-01-01
Excellent basic text covers set theory, probability theory for finite sample spaces, binomial theorem, probability distributions, means, standard deviations, probability function of binomial distribution, more. Includes 360 problems with answers for half.
Muecke, Ralph [Dept. of Radiotherapy, St. Josefs-Hospital. Wiesbaden (Germany); Micke, Oliver [Dept. of Radiotherapy, Muenster Univ. Hospital (Germany); Reichl, Berthold [Dept. of Radiotherapy, Weiden Hospital (DE)] (and others)
2007-03-15
A total of 502 patients treated between 1990 and 2002 with low-dose radiotherapy (RT) for painful heel spurs were analysed for prognostic factors for long-term treatment success. The median follow-up was 26 months, ranging from 1 to 103 months. Events were defined as (1) slightly improved or unchanged pain after therapy, or (2) recurrent pain sensations during the follow-up period. Overall 8-year event-free probability was 60.9%. Event-free probabilities of patients with one/two series (414/88) were 69.7%/32.2% (p <0.001); >58/{<=}58 years (236/266), 81.3%/47.9% (p =0.001); high voltage/orthovoltage (341/161), 67.9%/60.6% (p =0.019); pain anamnesis {<=} 6 months/ >6 months (308/194), 76.3%/43.9% (p =0.001); single dose 0.5/1.0 Gy (100/401), 86.2%/55.1% (p =0.009); without/with prior treatment (121/381), 83.1%/54.9% (p =0.023); men/women (165/337), 61.2%/61.5% (p =0.059). The multivariate Cox regression analysis with inclusion of the number of treatment series, age, photon energy, pain history, single-dose and prior treatments revealed patients with only one treatment series (p <0.001), an age >58 years (p =0.011) and therapy with high voltage photons (p =0.050) to be significant prognostic factors for pain relief. Overall low-dose RT is a very effective treatment in painful heel spurs.
A total of 502 patients treated between 1990 and 2002 with low-dose radiotherapy (RT) for painful heel spurs were analysed for prognostic factors for long-term treatment success. The median follow-up was 26 months, ranging from 1 to 103 months. Events were defined as (1) slightly improved or unchanged pain after therapy, or (2) recurrent pain sensations during the follow-up period. Overall 8-year event-free probability was 60.9%. Event-free probabilities of patients with one/two series (414/88) were 69.7%/32.2% (p 58/≤58 years (236/266), 81.3%/47.9% (p =0.001); high voltage/orthovoltage (341/161), 67.9%/60.6% (p =0.019); pain anamnesis ≤ 6 months/ >6 months (308/194), 76.3%/43.9% (p =0.001); single dose 0.5/1.0 Gy (100/401), 86.2%/55.1% (p =0.009); without/with prior treatment (121/381), 83.1%/54.9% (p =0.023); men/women (165/337), 61.2%/61.5% (p =0.059). The multivariate Cox regression analysis with inclusion of the number of treatment series, age, photon energy, pain history, single-dose and prior treatments revealed patients with only one treatment series (p 58 years (p =0.011) and therapy with high voltage photons (p =0.050) to be significant prognostic factors for pain relief. Overall low-dose RT is a very effective treatment in painful heel spurs
Asbestos and Probable Microscopic Polyangiitis
George S Rashed Philteos; Kelly Coverett; Rajni Chibbar; Ward, Heather A; Cockcroft, Donald W
2004-01-01
Several inorganic dust lung diseases (pneumoconioses) are associated with autoimmune diseases. Although autoimmune serological abnormalities are common in asbestosis, clinical autoimmune/collagen vascular diseases are not commonly reported. A case of pulmonary asbestosis complicated by perinuclear-antineutrophil cytoplasmic antibody (myeloperoxidase) positive probable microscopic polyangiitis (glomerulonephritis, pericarditis, alveolitis, multineuritis multiplex) is described and the possible...
Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia
2011-01-01
We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and zero- and unit-probability events pose no particular epistemological problems. We use a non-Archimedean field as the range of the probability function. As a result, the property of countable additivity in Kolmogorov's axiomatization o...
Probability and paternity testing.
Elston, R C
1986-01-01
A probability can be viewed as an estimate of a variable that is sometimes 1 and sometimes 0. To have validity, the probability must equal the expected value of that variable. To have utility, the average squared deviation of the probability from the value of that variable should be small. It is shown that probabilities of paternity calculated by the use of Bayes' theorem under appropriate assumptions are valid, but they can vary in utility. In particular, a recently proposed probability of p...
Logical Probability Preferences
Saad, Emad
2013-01-01
We present a unified logical framework for representing and reasoning about both probability quantitative and qualitative preferences in probability answer set programming, called probability answer set optimization programs. The proposed framework is vital to allow defining probability quantitative preferences over the possible outcomes of qualitative preferences. We show the application of probability answer set optimization programs to a variant of the well-known nurse restoring problem, c...
Raundal, P. M.; Andersen, P. H.; Toft, Nils;
2015-01-01
Mechanical nociceptive threshold (MNT) testing has been used to investigate aspects of painful states in bovine claws. We investigated a handheld tool, where the applied stimulation force was monitored continuously relative to a pre-encoded based target force. The effect on MNT of two pre......-testing habituation procedures was performed in two different experiments comprising a total of 88 sound Holsteins dairy cows kept either inside or outside their home environment. MNT testing was performed using five consecutive mechanical nociceptive stimulations per cow per test at a fixed pre-encoded target rate...... of 2.1 N/s. The habituation procedure performed in dairy cows kept in their home environment led to lowered intra-individual coefficient of variation of MNT (P < 0.001), increased MNT (P < 0.001) and decreased the discrepancy between applied and target force during stimulations (P < 0.001). Pre...
Mineralogic and petrologic investigation of pre-test core samples from the spent fuel test-climax
Ryerson, F.J.; Qualheim, B.J.
1983-12-01
Pre-test samples obtained from just inside the perimeter of the canister emplacement holes of the Spent Fuel Test-Climax have been characterized by petrographic and microanalytical techniques. The primary quartz monzonite has undergone various degrees of hydrothermal alteration as a result of natural processes. Alteration is most apparent on primary plagioclase and biotite. The most common secondary phases on plagioclase are muscovite and calcite, while the most common secondary phases on biotite are epidote and chlorite. The major alteration zones encountered are localized along filled fractures, i.e. veins. The thickness and mineralogy of the alteration zones can be correlated with the vein mineralogy, becoming wider and more complex mineralogically when the veins contain calcite. 7 references, 10 figures, 4 tables.
Mineralogic and petrologic investigation of pre-test core samples from the spent fuel test-climax
Pre-test samples obtained from just inside the perimeter of the canister emplacement holes of the Spent Fuel Test-Climax have been characterized by petrographic and microanalytical techniques. The primary quartz monzonite has undergone various degrees of hydrothermal alteration as a result of natural processes. Alteration is most apparent on primary plagioclase and biotite. The most common secondary phases on plagioclase are muscovite and calcite, while the most common secondary phases on biotite are epidote and chlorite. The major alteration zones encountered are localized along filled fractures, i.e. veins. The thickness and mineralogy of the alteration zones can be correlated with the vein mineralogy, becoming wider and more complex mineralogically when the veins contain calcite. 7 references, 10 figures, 4 tables
Pre-Test Analysis Predictions for the Shell Buckling Knockdown Factor Checkout Tests - TA01 and TA02
Thornburgh, Robert P.; Hilburger, Mark W.
2011-01-01
This report summarizes the pre-test analysis predictions for the SBKF-P2-CYL-TA01 and SBKF-P2-CYL-TA02 shell buckling tests conducted at the Marshall Space Flight Center (MSFC) in support of the Shell Buckling Knockdown Factor (SBKF) Project, NASA Engineering and Safety Center (NESC) Assessment. The test article (TA) is an 8-foot-diameter aluminum-lithium (Al-Li) orthogrid cylindrical shell with similar design features as that of the proposed Ares-I and Ares-V barrel structures. In support of the testing effort, detailed structural analyses were conducted and the results were used to monitor the behavior of the TA during the testing. A summary of predicted results for each of the five load sequences is presented herein.
Agreeing Probability Measures for Comparative Probability Structures
Wakker, Peter
1981-01-01
It is proved that fine and tight comparative probability structures (where the set of events is assumed to be an algebra, not necessarily a $\\sigma$-algebra) have agreeing probability measures. Although this was often claimed in the literature, all proofs the author encountered are not valid for the general case, but only for $\\sigma$-algebras. Here the proof of Niiniluoto (1972) is supplemented. Furthermore an example is presented that reveals many misunderstandings in the literature. At the...
Benci, Vieri; Wenmackers, Sylvia
2011-01-01
We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and zero- and unit-probability events pose no particular epistemological problems. We use a non-Archimedean field as the range of the probability function. As a result, the property of countable additivity in Kolmogorov's axiomatization of probability is replaced by a different type of infinite additivity.
Elements of probability theory
Rumshiskii, L Z
1965-01-01
Elements of Probability Theory presents the methods of the theory of probability. This book is divided into seven chapters that discuss the general rule for the multiplication of probabilities, the fundamental properties of the subject matter, and the classical definition of probability. The introductory chapters deal with the functions of random variables; continuous random variables; numerical characteristics of probability distributions; center of the probability distribution of a random variable; definition of the law of large numbers; stability of the sample mean and the method of moments
Evaluating probability forecasts
Lai, Tze Leung; Shen, David Bo; 10.1214/11-AOS902
2012-01-01
Probability forecasts of events are routinely used in climate predictions, in forecasting default probabilities on bank loans or in estimating the probability of a patient's positive response to treatment. Scoring rules have long been used to assess the efficacy of the forecast probabilities after observing the occurrence, or nonoccurrence, of the predicted events. We develop herein a statistical theory for scoring rules and propose an alternative approach to the evaluation of probability forecasts. This approach uses loss functions relating the predicted to the actual probabilities of the events and applies martingale theory to exploit the temporal structure between the forecast and the subsequent occurrence or nonoccurrence of the event.
The aim of this study was to investigate the factors influencing the development of early radiation adverse events in cervical and endometrial cancer patients after postoperative radiotherapy. The study included 109 patients with cervical and endometrial carcinoma after radical surgery. External beam pelvic irradiation was performed on a Co-60 machine with 'Box' technique to 50 Gy total dose in 2 Gy daily fractions. Early radiation adverse events were assessed according to Common Terminology Criteria for Adverse Events, v.3.0, for first time applied in Bulgaria by own designed questionnaire. Information on smoking habits, previous abdominal surgery, sensitivity to sunlight, family history and concomitant diseases was generated. DNA isolated from venous blood was used for genotype analysis with Polymerise Chain Reaction Fragment Length Polymorphisms (PCR-REFP). Standard statistical package and logistic regression analysis was applied for statistical evaluation. Only 2% of the patients did not developed any early radiation adverse events Majority of patients suffer grade 1 and 2 adverse events. No grade 4 and 5 events were recorded. Smoking increases the grade of gastrointestinal events and the summarized clinical radiosensitivity Sensitivity to sunlight was associated with moderate and severe skin reactions. Genetic factors influence the severity of genitourinary (XRCC1 194 (C>T), XRCC1 280 (G>A)) and skin adverse events (XRCC1 194 (C>T), XRCC1 280 (G>A)) and also the summarized clinical radiosensitivity (XRCC1 194 (C>T)). The risk factors for development of early radiation adverse events found in the present study are smoking, sensitivity to sunlight and the following SNPs XRCC1 194 (C>T), XRCC1 280 (G>A) u XRCC1 194 (C>T), XRCC1 280 (G>A).
Estimating extreme flood probabilities
Estimates of the exceedance probabilities of extreme floods are needed for the assessment of flood hazard at Department of Energy facilities. A new approach using a joint probability distribution of extreme rainfalls and antecedent soil moisture conditions, along with a rainfall runoff model, provides estimates of probabilities for floods approaching the probable maximum flood. This approach is illustrated for a 570 km2 catchment in Wisconsin and a 260 km2 catchment in Tennessee
Roussas, George G
2006-01-01
Roussas's Introduction to Probability features exceptionally clear explanations of the mathematics of probability theory and explores its diverse applications through numerous interesting and motivational examples. It provides a thorough introduction to the subject for professionals and advanced students taking their first course in probability. The content is based on the introductory chapters of Roussas's book, An Intoduction to Probability and Statistical Inference, with additional chapters and revisions. Written by a well-respected author known for great exposition an
Edwards, William F.; Shiflett, Ray C.; Shultz, Harris
2008-01-01
The mathematical model used to describe independence between two events in probability has a non-intuitive consequence called dependent spaces. The paper begins with a very brief history of the development of probability, then defines dependent spaces, and reviews what is known about finite spaces with uniform probability. The study of finite…
Interpretations of probability
Khrennikov, Andrei
2009-01-01
This is the first fundamental book devoted to non-Kolmogorov probability models. It provides a mathematical theory of negative probabilities, with numerous applications to quantum physics, information theory, complexity, biology and psychology. The book also presents an interesting model of cognitive information reality with flows of information probabilities, describing the process of thinking, social, and psychological phenomena.
Childers, Timothy
2013-01-01
Probability is increasingly important for our understanding of the world. What is probability? How do we model it, and how do we use it? Timothy Childers presents a lively introduction to the foundations of probability and to philosophical issues it raises. He keeps technicalities to a minimum, and assumes no prior knowledge of the subject. He explains the main interpretations of probability-frequentist, propensity, classical, Bayesian, and objective Bayesian-and uses stimulatingexamples to bring the subject to life. All students of philosophy will benefit from an understanding of probability,
Introduction to probability models
Ross, Sheldon M
2006-01-01
Introduction to Probability Models, Tenth Edition, provides an introduction to elementary probability theory and stochastic processes. There are two approaches to the study of probability theory. One is heuristic and nonrigorous, and attempts to develop in students an intuitive feel for the subject that enables him or her to think probabilistically. The other approach attempts a rigorous development of probability by using the tools of measure theory. The first approach is employed in this text. The book begins by introducing basic concepts of probability theory, such as the random v
Pre-test analysis for the KNGR LBLOCA DVI performance test using a best estimate code MARS
Pre-test analysis using a MARS code has been performed for the KNGR (Korean Next Generation Reactor) DVI (Direct Vessel Injection) performance test facility which is a full height and 1/24.3 volume scaled separate effects test facility focusing on the identification of multi-dimensional thermal-hydraulic phenomena in the downcomer during the reflood conditions of a large break LOCA. From the steady state analyses for various test cases at late reflood condition, the degree of major thermal-hydraulic phenomena such as ECC bypass, ECC penetration, steam condensation, and water level sweep-out are quantified. The MARS code analysis results showed that: (a) multi-dimensional flow and temperature behaviors occurred in the downcomer region as expected, (b) the proximity of ECC injection to the break caused more ECC bypass and less steam condensation efficiency, (c) increasing steam flow rate resulted in more ECC bypass and less steam condensation, (d) and the high velocity of steam flow swept-out the water in the downcomer just below the cold leg nozzle. These results are comparable with those observed in the previous tests such as UPTF and CCTF. (author)
Rod bundles simulating the LOFT Core-1, both with and without rod external thermocouple simulators, will be tested to determine the effect of rod external thermocouples on time-to-DNB under blowdown conditions similar to those in LOFT. Pre-test predictions have been made using the RELAP4 computer code. The purposes of this analysis were (1) to predict blowdown orifice sizes which result in the closest simulation of coolant pressure, quality, and flow rate between the test section and the LOFT core for a LOFT 200 percent simulated cold leg break at a peak linear heat generation rate of 19 kw/ft, (2) to determine ranges for instrumentation, and (3) to estimate the time-to-DNB in the rod bundles. An exact simulation of the LOFT blowdown conditions, however, can not be obtained in the test section because the rod bundles have a uniform axial power profile and the Columbia test loop is not scaled to the LOFT configuration
Pre-test of the KYLIN-II thermal-hydraulics mixed circulation LBE loop using RELAP5
To investigate the behavior of lead bismuth eutectic (LBE) as coolant in China LEAd-based Research Reactor, Institute of Nuclear Energy Safety Institute (INEST), Chinese Academy of Sciences has built a multi-functional LBE experiment facility KYLIN-II. Mixed circulation loop, which is one of the KYLIN-II thermal-hydraulics loops, has the capability to drive the flowing LBE in different ways such as pump, gas lift and temperature difference (natural circulation). In this contribution, preliminary numerical simulations in support of the operation and experiment of KYLIN-II thermal-hydraulics mixed circulation LBE loop have been carried out and the obtained results have been studied. The RELAP5 Mod4.0 with LBE model has been utilized. Pre-test analysis showed the LBE circulation capability can reach the object under several driven patterns. The maximum velocity in fuel pin bundles can be larger than 0.15 m/s for natural circulation, 0.5 m/s for gas enhanced circulation, and 2 m/s for pump driven circulation. (author)
Asbestos and Probable Microscopic Polyangiitis
George S Rashed Philteos
2004-01-01
Full Text Available Several inorganic dust lung diseases (pneumoconioses are associated with autoimmune diseases. Although autoimmune serological abnormalities are common in asbestosis, clinical autoimmune/collagen vascular diseases are not commonly reported. A case of pulmonary asbestosis complicated by perinuclear-antineutrophil cytoplasmic antibody (myeloperoxidase positive probable microscopic polyangiitis (glomerulonephritis, pericarditis, alveolitis, multineuritis multiplex is described and the possible immunological mechanisms whereby asbestosis fibres might be relevant in induction of antineutrophil cytoplasmic antibodies are reviewed in the present report.
Choice Probability Generating Functions
Fosgerau, Mogens; McFadden, Daniel L; Bierlaire, Michel
This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice...... probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications....
Florescu, Ionut
2013-01-01
THE COMPLETE COLLECTION NECESSARY FOR A CONCRETE UNDERSTANDING OF PROBABILITY Written in a clear, accessible, and comprehensive manner, the Handbook of Probability presents the fundamentals of probability with an emphasis on the balance of theory, application, and methodology. Utilizing basic examples throughout, the handbook expertly transitions between concepts and practice to allow readers an inclusive introduction to the field of probability. The book provides a useful format with self-contained chapters, allowing the reader easy and quick reference. Each chapter includes an introductio
Choice probability generating functions
Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel
2013-01-01
This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice...... probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended to...
Ash, Robert B; Lukacs, E
1972-01-01
Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory.Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var
In this work, I formulate the persistence probability for a qubit device as the probability of measuring its computational degrees of freedom in the unperturbed state without the decoherence arising from environmental interactions. A decoherence time can be obtained from the persistence probability. Drawing on recent work of Garg, and also Palma, Suomine, and Ekert, I apply the persistence probability formalism to a generic single-qubit device coupled to a thermal environment, and also apply it to a trapped-ion quantum register coupled to the ion vibrational modes. (author)
Mariella B.L. Careaga
2015-03-01
Full Text Available Cognitive processes, such as learning and memory, are essential for our adaptation to environmental changes and consequently for survival. Numerous studies indicate that hormones secreted during stressful situations, such as glucocorticoids (GCs, adrenaline and noradrenaline, regulate memory functions, modulating aversive memory consolidation and retrieval, in an interactive and complementary way. Thus, the facilitatory effects of GCs on memory consolidation as well as their suppressive effects on retrieval are substantially explained by this interaction. On the other hand, low levels of GCs are also associated with negative effects on memory consolidation and retrieval and the mechanisms involved are not well understood. The present study sought to investigate the consequences of blocking the rise of GCs on fear memory retrieval in multiple tests, assessing the participation of β-adrenergic signaling on this effect. Metyrapone (GCs synthesis inhibitor, administered 90 min before the first test of contextual or auditory fear conditioning, negatively affected animals’ performances, but this effect did not persist on a subsequent test, when the conditioned response was again expressed. This result suggested that the treatment impaired fear memory retrieval during the first evaluation. The administration immediately after the first test did not affect the animals’ performances in contextual fear conditioning, suggesting that the drug did not interfere with processes triggered by memory reactivation. Moreover, metyrapone effects were independent of β-adrenergic signaling, since concurrent administration with propranolol, a β-adrenergic antagonist, did not modify the effects induced by metyrapone alone. These results demonstrate that pre-test metyrapone administration led to negative effects on fear memory retrieval and this action was independent of a β-adrenergic signaling.
Pre-Test Assessment of the Use Envelope of the Normal Force of a Wind Tunnel Strain-Gage Balance
Ulbrich, N.
2016-01-01
The relationship between the aerodynamic lift force generated by a wind tunnel model, the model weight, and the measured normal force of a strain-gage balance is investigated to better understand the expected use envelope of the normal force during a wind tunnel test. First, the fundamental relationship between normal force, model weight, lift curve slope, model reference area, dynamic pressure, and angle of attack is derived. Then, based on this fundamental relationship, the use envelope of a balance is examined for four typical wind tunnel test cases. The first case looks at the use envelope of the normal force during the test of a light wind tunnel model at high subsonic Mach numbers. The second case examines the use envelope of the normal force during the test of a heavy wind tunnel model in an atmospheric low-speed facility. The third case reviews the use envelope of the normal force during the test of a floor-mounted semi-span model. The fourth case discusses the normal force characteristics during the test of a rotated full-span model. The wind tunnel model's lift-to-weight ratio is introduced as a new parameter that may be used for a quick pre-test assessment of the use envelope of the normal force of a balance. The parameter is derived as a function of the lift coefficient, the dimensionless dynamic pressure, and the dimensionless model weight. Lower and upper bounds of the use envelope of a balance are defined using the model's lift-to-weight ratio. Finally, data from a pressurized wind tunnel is used to illustrate both application and interpretation of the model's lift-to-weight ratio.
Freund, John E
1993-01-01
Thorough, lucid coverage of permutations and factorials, probabilities and odds, frequency interpretation, mathematical expectation, decision making, postulates of probability, rule of elimination, binomial distribution, geometric distribution, standard deviation, law of large numbers, and much more. Exercises with some solutions. Summary. Bibliography. Includes 42 black-and-white illustrations. 1973 edition.
On Quantum Conditional Probability
Isabel Guerra Bobo
2013-02-01
Full Text Available We argue that quantum theory does not allow for a generalization of the notion of classical conditional probability by showing that the probability defined by the Lüders rule, standardly interpreted in the literature as the quantum-mechanical conditionalization rule, cannot be interpreted as such.
Prabhu, Narahari
2011-01-01
Recent research in probability has been concerned with applications such as data mining and finance models. Some aspects of the foundations of probability theory have receded into the background. Yet, these aspects are very important and have to be brought back into prominence.
Probability, Nondeterminism and Concurrency
Varacca, Daniele
Nondeterminism is modelled in domain theory by the notion of a powerdomain, while probability is modelled by that of the probabilistic powerdomain. Some problems arise when we want to combine them in order to model computation in which both nondeterminism and probability are present. In particula...
Elena Druica
2007-05-01
Full Text Available The science of probabilities has earned a special place because it tried through its concepts to build a bridge between theory and experimenting.As a formal notion which by definition does not lead to polemic, probability, nevertheless, meets a series of difficulties of interpretation whenever the probability must be applied to certain particular situations.Usually, the economic literature brings into discussion two interpretations of the concept of probability:the objective interpretation often found under the name of frequency or statistical interpretation and the subjective or personal interpretation. Surprisingly, the third appproach is excluded:the logical interpretation.The purpose of the present paper is to study some aspects of the subjective and logical interpretation of the probability, aswell as the implications in the economics.
Siew Nyet Moi; Abdullah Sopiah; Kueh Ngie King
2013-01-01
Aims: 1. To investigate the effects of concrete learning aids (Colour Balls) with Student Teams-Achievement Division (STAD) cooperative learning (CBCL) method on Form Four Arts Stream students’ performance in probability; 2. To find out students’ perception towards the use of CBCL method in learning probability. Study Design: Quasi experimental pre-test post-test control group design. Two treatment groups were employed in this design, they were CBCL (experimental gr...
The concept of probability is now, and always has been, central to the debate on the interpretation of quantum mechanics. Furthermore, probability permeates all of science, as well as our every day life. The papers included in this volume, written by leading proponents of the ideas expressed, embrace a broad spectrum of thought and results: mathematical, physical epistemological, and experimental, both specific and general. The contributions are arranged in parts under the following headings: Following Schroedinger's thoughts; Probability and quantum mechanics; Aspects of the arguments on nonlocality; Bell's theorem and EPR correlations; Real or Gedanken experiments and their interpretation; Questions about irreversibility and stochasticity; and Epistemology, interpretation and culture. (author). refs.; figs.; tabs
Billingsley, Patrick
2012-01-01
Praise for the Third Edition "It is, as far as I'm concerned, among the best books in math ever written....if you are a mathematician and want to have the top reference in probability, this is it." (Amazon.com, January 2006) A complete and comprehensive classic in probability and measure theory Probability and Measure, Anniversary Edition by Patrick Billingsley celebrates the achievements and advancements that have made this book a classic in its field for the past 35 years. Now re-issued in a new style and format, but with the reliable content that the third edition was revered for, this
Probability and Bayesian statistics
1987-01-01
This book contains selected and refereed contributions to the "Inter national Symposium on Probability and Bayesian Statistics" which was orga nized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985 the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the pa pers are published especially by the relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows growing importance of probability as coherent assessment of numerical values as degrees of believe in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability across psychological aspects of formulating sub jective probability statements, abstract measure theoretical considerations, contributions to theoretical statistics an...
Hartmann, Stephan
2011-01-01
Many results of modern physics--those of quantum mechanics, for instance--come in a probabilistic guise. But what do probabilistic statements in physics mean? Are probabilities matters of objective fact and part of the furniture of the world, as objectivists think? Or do they only express ignorance or belief, as Bayesians suggest? And how are probabilistic hypotheses justified and supported by empirical evidence? Finally, what does the probabilistic nature of physics imply for our understanding of the world? This volume is the first to provide a philosophical appraisal of probabilities in all of physics. Its main aim is to make sense of probabilistic statements as they occur in the various physical theories and models and to provide a plausible epistemology and metaphysics of probabilities. The essays collected here consider statistical physics, probabilistic modelling, and quantum mechanics, and critically assess the merits and disadvantages of objectivist and subjectivist views of probabilities in these fie...
Stochastic Programming with Probability
Andrieu, Laetitia; Vázquez-Abad, Felisa
2007-01-01
In this work we study optimization problems subject to a failure constraint. This constraint is expressed in terms of a condition that causes failure, representing a physical or technical breakdown. We formulate the problem in terms of a probability constraint, where the level of "confidence" is a modelling parameter and has the interpretation that the probability of failure should not exceed that level. Application of the stochastic Arrow-Hurwicz algorithm poses two difficulties: one is structural and arises from the lack of convexity of the probability constraint, and the other is the estimation of the gradient of the probability constraint. We develop two gradient estimators with decreasing bias via a convolution method and a finite difference technique, respectively, and we provide a full analysis of convergence of the algorithms. Convergence results are used to tune the parameters of the numerical algorithms in order to achieve best convergence rates, and numerical results are included via an example of ...
Grimmett, Geoffrey
2014-01-01
Probability is an area of mathematics of tremendous contemporary importance across all aspects of human endeavour. This book is a compact account of the basic features of probability and random processes at the level of first and second year mathematics undergraduates and Masters' students in cognate fields. It is suitable for a first course in probability, plus a follow-up course in random processes including Markov chains. A special feature is the authors' attention to rigorous mathematics: not everything is rigorous, but the need for rigour is explained at difficult junctures. The text is enriched by simple exercises, together with problems (with very brief hints) many of which are taken from final examinations at Cambridge and Oxford. The first eight chapters form a course in basic probability, being an account of events, random variables, and distributions - discrete and continuous random variables are treated separately - together with simple versions of the law of large numbers and the central limit th...
Hemmo, Meir
2012-01-01
What is the role and meaning of probability in physical theory, in particular in two of the most successful theories of our age, quantum physics and statistical mechanics? Laws once conceived as universal and deterministic, such as Newton‘s laws of motion, or the second law of thermodynamics, are replaced in these theories by inherently probabilistic laws. This collection of essays by some of the world‘s foremost experts presents an in-depth analysis of the meaning of probability in contemporary physics. Among the questions addressed are: How are probabilities defined? Are they objective or subjective? What is their explanatory value? What are the differences between quantum and classical probabilities? The result is an informative and thought-provoking book for the scientifically inquisitive.
Estimating Subjective Probabilities
Andersen, Steffen; Fountain, John; Harrison, Glenn W.;
2014-01-01
either construct elicitation mechanisms that control for risk aversion, or construct elicitation mechanisms which undertake 'calibrating adjustments' to elicited reports. We illustrate how the joint estimation of risk attitudes and subjective probabilities can provide the calibration adjustments that...
Estimating Subjective Probabilities
Andersen, Steffen; Fountain, John; Harrison, Glenn W.;
either construct elicitation mechanisms that control for risk aversion, or construct elicitation mechanisms which undertake “calibrating adjustments” to elicited reports. We illustrate how the joint estimation of risk attitudes and subjective probabilities can provide the calibration adjustments that...
Probability and Statistical Inference
Prosper, Harrison B.
2006-01-01
These lectures introduce key concepts in probability and statistical inference at a level suitable for graduate students in particle physics. Our goal is to paint as vivid a picture as possible of the concepts covered.
Marshall, Jennings B.
2007-01-01
This article describes how roulette can be used to teach basic concepts of probability. Various bets are used to illustrate the computation of expected value. A betting system shows variations in patterns that often appear in random events.
Monte Carlo transition probabilities
Lucy, L. B.
2001-01-01
Transition probabilities governing the interaction of energy packets and matter are derived that allow Monte Carlo NLTE transfer codes to be constructed without simplifying the treatment of line formation. These probabilities are such that the Monte Carlo calculation asymptotically recovers the local emissivity of a gas in statistical equilibrium. Numerical experiments with one-point statistical equilibrium problems for Fe II and Hydrogen confirm this asymptotic behaviour. In addition, the re...
Probability in quantum mechanics
J. G. Gilson
1982-01-01
Full Text Available By using a fluid theory which is an alternative to quantum theory but from which the latter can be deduced exactly, the long-standing problem of how quantum mechanics is related to stochastic processes is studied. It can be seen how the Schrödinger probability density has a relationship to time spent on small sections of an orbit, just as the probability density has in some classical contexts.
Bayesian default probability models
Andrlíková, Petra
2014-01-01
This paper proposes a methodology for default probability estimation for low default portfolios, where the statistical inference may become troublesome. The author suggests using logistic regression models with the Bayesian estimation of parameters. The piecewise logistic regression model and Box-Cox transformation of credit risk score is used to derive the estimates of probability of default, which extends the work by Neagu et al. (2009). The paper shows that the Bayesian models are more acc...
Iida Mariko
2011-06-01
Full Text Available Abstract Background Sub-Saharan Africa is among the countries where 10% of girls become mothers by the age of 16 years old. The United Republic of Tanzania located in Sub-Saharan Africa is one country where teenage pregnancy is a problem facing adolescent girls. Adolescent pregnancy has been identified as one of the reasons for girls dropping out from school. This study's purpose was to evaluate a reproductive health awareness program for the improvement of reproductive health for adolescents in urban Tanzania. Methods A quasi-experimental pre-test and post-test research design was conducted to evaluate adolescents' knowledge, attitude, and behavior about reproductive health before and after the program. Data were collected from students aged 11 to 16, at Ilala Municipal, Dar es Salaam, Tanzania. An anonymous 23-item questionnaire provided the data. The program was conducted using a picture drama, reproductive health materials and group discussion. Results In total, 313 questionnaires were distributed and 305 (97.4% were useable for the final analysis. The mean age for girls was 12.5 years and 13.2 years for boys. A large minority of both girls (26.8% and boys (41.4% had experienced sex and among the girls who had experienced sex, 51.2% reported that it was by force. The girls' mean score in the knowledge pre-test was 5.9, and 6.8 in post-test, which increased significantly (t = 7.9, p = 0.000. The mean behavior pre-test score was 25.8 and post-test was 26.6, which showed a significant increase (t = 3.0, p = 0.003. The boys' mean score in the knowledge pre-test was 6.4 and 7.0 for the post-test, which increased significantly (t = 4.5, p = 0.000. The mean behavior pre-test score was 25.6 and 26.4 in post-test, which showed a significant increase (t = 2.4, p = 0.019. However, the pre-test and post-test attitude scores showed no statistically significant difference for either girls or boys. Conclusions Teenagers have sexual experiences including
Experimental Probability in Elementary School
Andrew, Lane
2009-01-01
Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.
Isaac, Richard
1995-01-01
The ideas of probability are all around us. Lotteries, casino gambling, the al most non-stop polling which seems to mold public policy more and more these are a few of the areas where principles of probability impinge in a direct way on the lives and fortunes of the general public. At a more re moved level there is modern science which uses probability and its offshoots like statistics and the theory of random processes to build mathematical descriptions of the real world. In fact, twentieth-century physics, in embrac ing quantum mechanics, has a world view that is at its core probabilistic in nature, contrary to the deterministic one of classical physics. In addition to all this muscular evidence of the importance of probability ideas it should also be said that probability can be lots of fun. It is a subject where you can start thinking about amusing, interesting, and often difficult problems with very little mathematical background. In this book, I wanted to introduce a reader with at least a fairl...
Improving Ranking Using Quantum Probability
Melucci, Massimo
2011-01-01
The paper shows that ranking information units by quantum probability differs from ranking them by classical probability provided the same data used for parameter estimation. As probability of detection (also known as recall or power) and probability of false alarm (also known as fallout or size) measure the quality of ranking, we point out and show that ranking by quantum probability yields higher probability of detection than ranking by classical probability provided a given probability of ...
Collision Probability Analysis
Hansen, Peter Friis; Pedersen, Preben Terndrup
1998-01-01
It is the purpose of this report to apply a rational model for prediction of ship-ship collision probabilities as function of the ship and the crew characteristics and the navigational environment for MS Dextra sailing on a route between Cadiz and the Canary Islands.The most important ship and crew...... characteristics are: ship speed, ship manoeuvrability, the layout of the navigational bridge, the radar system, the number and the training of navigators, the presence of a look out etc. The main parameters affecting the navigational environment are ship traffic density, probability distributions of wind speeds...... probability, i.e. a study of the navigator's role in resolving critical situations, a causation factor is derived as a second step.The report documents the first step in a probabilistic collision damage analysis. Future work will inlcude calculation of energy released for crushing of structures giving a...
New Zealand population and cancer statistics have been used to derive the probability that an existing cancer in an individual was the result of a known exposure to radiation. Hypothetical case histories illustrate how sex, race, age at exposure, age at presentation with disease, and the type of cancer affect this probability. The method can be used now to identify claims in which a link between exposure and disease is very strong or very weak, and the types of cancer and population sub-groups for which radiation is most likely to be the causative agent. Advantages and difficulties in using a probability of causation approach in legal or compensation hearings are outlined. The approach is feasible for any carcinogen for which reasonable risk estimates can be made
Minimum Probability Flow Learning
Sohl-Dickstein, Jascha; DeWeese, Michael R
2009-01-01
Learning in probabilistic models is often severely hampered by the general intractability of the normalization factor and its derivatives. Here we propose a new learning technique that obviates the need to compute an intractable normalization factor or sample from the equilibrium distribution of the model. This is achieved by establishing dynamics that would transform the observed data distribution into the model distribution, and then setting as the objective the minimization of the initial flow of probability away from the data distribution. Score matching, minimum velocity learning, and certain forms of contrastive divergence are shown to be special cases of this learning technique. We demonstrate the application of minimum probability flow learning to parameter estimation in Ising models, deep belief networks, multivariate Gaussian distributions and a continuous model with a highly general energy function defined as a power series. In the Ising model case, minimum probability flow learning outperforms cur...
Introduction to imprecise probabilities
Augustin, Thomas; de Cooman, Gert; Troffaes, Matthias C M
2014-01-01
In recent years, the theory has become widely accepted and has been further developed, but a detailed introduction is needed in order to make the material available and accessible to a wide audience. This will be the first book providing such an introduction, covering core theory and recent developments which can be applied to many application areas. All authors of individual chapters are leading researchers on the specific topics, assuring high quality and up-to-date contents. An Introduction to Imprecise Probabilities provides a comprehensive introduction to imprecise probabilities, includin
Choice probability generating functions
Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel
2010-01-01
This paper establishes that every random utility discrete choice model (RUM) has a representation that can be characterized by a choice-probability generating function (CPGF) with specific properties, and that every function with these specific properties is consistent with a RUM. The choice...... probabilities from the RUM are obtained from the gradient of the CPGF. Mixtures of RUM are characterized by logarithmic mixtures of their associated CPGF. The paper relates CPGF to multivariate extreme value distributions, and reviews and extends methods for constructing generating functions for applications...
Negative Probabilities and Contextuality
de Barros, J Acacio; Oas, Gary
2015-01-01
There has been a growing interest, both in physics and psychology, in understanding contextuality in experimentally observed quantities. Different approaches have been proposed to deal with contextual systems, and a promising one is contextuality-by-default, put forth by Dzhafarov and Kujala. The goal of this paper is to present a tutorial on a different approach: negative probabilities. We do so by presenting the overall theory of negative probabilities in a way that is consistent with contextuality-by-default and by examining with this theory some simple examples where contextuality appears, both in physics and psychology.
Classic Problems of Probability
Gorroochurn, Prakash
2012-01-01
"A great book, one that I will certainly add to my personal library."—Paul J. Nahin, Professor Emeritus of Electrical Engineering, University of New Hampshire Classic Problems of Probability presents a lively account of the most intriguing aspects of statistics. The book features a large collection of more than thirty classic probability problems which have been carefully selected for their interesting history, the way they have shaped the field, and their counterintuitive nature. From Cardano's 1564 Games of Chance to Jacob Bernoulli's 1713 Golden Theorem to Parrondo's 1996 Perplexin
Probably Almost Bayes Decisions
Anoulova, S.; Fischer, Paul; Poelt, S.; Simon, H.- U.
1996-01-01
In this paper, we investigate the problem of classifying objects which are given by feature vectors with Boolean entries. Our aim is to "(efficiently) learn probably almost optimal classifications" from examples. A classical approach in pattern recognition uses empirical estimations of the Bayesian...
Plotnitsky, Arkady
2010-01-01
Offers an exploration of the relationships between epistemology and probability in the work of Niels Bohr, Werner Heisenberg, and Erwin Schrodinger; in quantum mechanics; and in modern physics. This book considers the implications of these relationships and of quantum theory for our understanding of the nature of thinking and knowledge in general
Quznetsov, Gunn
1998-01-01
The propositional logic is generalized on the real numbers field. The logical analog of the Bernoulli independent tests scheme is constructed. The variant of the nonstandard analysis is adopted for the definition of the logical function, which has all properties of the classical probability function. The logical analog of the Large Number Law is deduced from properties of this function.
Quznetsov, G. A.
2003-01-01
The propositional logic is generalized on the real numbers field. The logical analog of the Bernoulli independent tests scheme is constructed. The variant of the nonstandard analysis is adopted for the definition of the logical function, which has all properties of the classical probability function. The logical analog of the Large Number Law is deduced from properties of this function.
Transition probabilities for atoms
Current status of advanced theoretical methods for transition probabilities for atoms and ions is discussed. An experiment on the f values of the resonance transitions of the Kr and Xe isoelectronic sequences is suggested as a test for the theoretical methods
Counterexamples in probability
Stoyanov, Jordan M
2013-01-01
While most mathematical examples illustrate the truth of a statement, counterexamples demonstrate a statement's falsity. Enjoyable topics of study, counterexamples are valuable tools for teaching and learning. The definitive book on the subject in regards to probability, this third edition features the author's revisions and corrections plus a substantial new appendix.
Negative probability in the framework of combined probability
Burgin, Mark
2013-01-01
Negative probability has found diverse applications in theoretical physics. Thus, construction of sound and rigorous mathematical foundations for negative probability is important for physics. There are different axiomatizations of conventional probability. So, it is natural that negative probability also has different axiomatic frameworks. In the previous publications (Burgin, 2009; 2010), negative probability was mathematically formalized and rigorously interpreted in the context of extende...
Waste Package Misload Probability
J.K. Knudsen
2001-11-20
The objective of this calculation is to calculate the probability of occurrence for fuel assembly (FA) misloads (i.e., Fa placed in the wrong location) and FA damage during FA movements. The scope of this calculation is provided by the information obtained from the Framatome ANP 2001a report. The first step in this calculation is to categorize each fuel-handling events that occurred at nuclear power plants. The different categories are based on FAs being damaged or misloaded. The next step is to determine the total number of FAs involved in the event. Using the information, a probability of occurrence will be calculated for FA misload and FA damage events. This calculation is an expansion of preliminary work performed by Framatome ANP 2001a.
Contributions to quantum probability
Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It also shown that generally, none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels ''possible to occur'' or ''impossible to occur'' to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a finite set can occur as the outcome
Contributions to quantum probability
Fritz, Tobias
2010-06-25
Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It also shown that generally, none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels ''possible to occur'' or ''impossible to occur'' to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a
Measurement uncertainty and probability
Willink, Robin
2013-01-01
A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.
Waste Package Misload Probability
The objective of this calculation is to calculate the probability of occurrence for fuel assembly (FA) misloads (i.e., Fa placed in the wrong location) and FA damage during FA movements. The scope of this calculation is provided by the information obtained from the Framatome ANP 2001a report. The first step in this calculation is to categorize each fuel-handling events that occurred at nuclear power plants. The different categories are based on FAs being damaged or misloaded. The next step is to determine the total number of FAs involved in the event. Using the information, a probability of occurrence will be calculated for FA misload and FA damage events. This calculation is an expansion of preliminary work performed by Framatome ANP 2001a
Paradoxes in probability theory
Eckhardt, William
2013-01-01
Paradoxes provide a vehicle for exposing misinterpretations and misapplications of accepted principles. This book discusses seven paradoxes surrounding probability theory. Some remain the focus of controversy; others have allegedly been solved, however the accepted solutions are demonstrably incorrect. Each paradox is shown to rest on one or more fallacies. Instead of the esoteric, idiosyncratic, and untested methods that have been brought to bear on these problems, the book invokes uncontroversial probability principles, acceptable both to frequentists and subjectivists. The philosophical disputation inspired by these paradoxes is shown to be misguided and unnecessary; for instance, startling claims concerning human destiny and the nature of reality are directly related to fallacious reasoning in a betting paradox, and a problem analyzed in philosophy journals is resolved by means of a computer program.
Chen, Po-Hao; Botzolakis, Emmanuel; Mohan, Suyash; Bryan, R. N.; Cook, Tessa
2016-03-01
In radiology, diagnostic errors occur either through the failure of detection or incorrect interpretation. Errors are estimated to occur in 30-35% of all exams and contribute to 40-54% of medical malpractice litigations. In this work, we focus on reducing incorrect interpretation of known imaging features. Existing literature categorizes cognitive bias leading a radiologist to an incorrect diagnosis despite having correctly recognized the abnormal imaging features: anchoring bias, framing effect, availability bias, and premature closure. Computational methods make a unique contribution, as they do not exhibit the same cognitive biases as a human. Bayesian networks formalize the diagnostic process. They modify pre-test diagnostic probabilities using clinical and imaging features, arriving at a post-test probability for each possible diagnosis. To translate Bayesian networks to clinical practice, we implemented an entirely web-based open-source software tool. In this tool, the radiologist first selects a network of choice (e.g. basal ganglia). Then, large, clearly labeled buttons displaying salient imaging features are displayed on the screen serving both as a checklist and for input. As the radiologist inputs the value of an extracted imaging feature, the conditional probabilities of each possible diagnosis are updated. The software presents its level of diagnostic discrimination using a Pareto distribution chart, updated with each additional imaging feature. Active collaboration with the clinical radiologist is a feasible approach to software design and leads to design decisions closely coupling the complex mathematics of conditional probability in Bayesian networks with practice.
Objectifying Subjective Probabilities
Childers, Timothy
Dordrecht: Springer, 2012 - ( Weber , M.; Dieks, D.; Gonzalez, W.; Hartman, S.; Stadler, F.; Stöltzner, M.), s. 19-28. (The Philosophy of Science in a European Perspective. 3). ISBN 978-94-007-3029-8. [Pluralism in the Foundations of Statistics. Canterbury (GB), 09.09.2010-10.09.2010] R&D Projects: GA ČR(CZ) GAP401/10/1504 Institutional support: RVO:67985955 Keywords : probabilities * direct Inference Subject RIV: AA - Philosophy ; Religion
Whittle, Peter
1992-01-01
This book is a complete revision of the earlier work Probability which ap peared in 1970. While revised so radically and incorporating so much new material as to amount to a new text, it preserves both the aim and the approach of the original. That aim was stated as the provision of a 'first text in probability, de manding a reasonable but not extensive knowledge of mathematics, and taking the reader to what one might describe as a good intermediate level'. In doing so it attempted to break away from stereotyped applications, and consider applications of a more novel and significant character. The particular novelty of the approach was that expectation was taken as the prime concept, and the concept of expectation axiomatized rather than that of a probability measure. In the preface to the original text of 1970 (reproduced below, together with that to the Russian edition of 1982) I listed what I saw as the advantages of the approach in as unlaboured a fashion as I could. I also took the view that the text...
Probability mapping of contaminants
Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. The probability mapping approach illustrated in this paper appears to offer site operators a reasonable, quantitative methodology for many environmental remediation decisions and allows evaluation of the risk associated with those decisions. For example, output from this approach can be used in quantitative, cost-based decision models for evaluating possible site characterization and/or remediation plans, resulting in selection of the risk-adjusted, least-cost alternative. The methodology is completely general, and the techniques are applicable to a wide variety of environmental restoration projects. The probability-mapping approach is illustrated by application to a contaminated site at the former DOE Feed Materials Production Center near Fernald, Ohio. Soil geochemical data, collected as part of the Uranium-in-Soils Integrated Demonstration Project, have been used to construct a number of geostatistical simulations of potential contamination for parcels approximately the size of a selective remediation unit (the 3-m width of a bulldozer blade). Each such simulation accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination (potential clean-up or personnel-hazard thresholds)
Sirca, Simon
2016-01-01
This book is designed as a practical and intuitive introduction to probability, statistics and random quantities for physicists. The book aims at getting to the main points by a clear, hands-on exposition supported by well-illustrated and worked-out examples. A strong focus on applications in physics and other natural sciences is maintained throughout. In addition to basic concepts of random variables, distributions, expected values and statistics, the book discusses the notions of entropy, Markov processes, and fundamentals of random number generation and Monte-Carlo methods.
Measurement Uncertainty and Probability
Willink, Robin
2013-02-01
Part I. Principles: 1. Introduction; 2. Foundational ideas in measurement; 3. Components of error or uncertainty; 4. Foundational ideas in probability and statistics; 5. The randomization of systematic errors; 6. Beyond the standard confidence interval; Part II. Evaluation of Uncertainty: 7. Final preparation; 8. Evaluation using the linear approximation; 9. Evaluation without the linear approximations; 10. Uncertainty information fit for purpose; Part III. Related Topics: 11. Measurement of vectors and functions; 12. Why take part in a measurement comparison?; 13. Other philosophies; 14. An assessment of objective Bayesian methods; 15. A guide to the expression of uncertainty in measurement; 16. Measurement near a limit - an insoluble problem?; References; Index.
Integration, measure and probability
Pitt, H R
2012-01-01
This text provides undergraduate mathematics students with an introduction to the modern theory of probability as well as the roots of the theory's mathematical ideas and techniques. Centered around the concept of measure and integration, the treatment is applicable to other branches of analysis and explores more specialized topics, including convergence theorems and random sequences and functions.The initial part is devoted to an exploration of measure and integration from first principles, including sets and set functions, general theory, and integrals of functions of real variables. These t
After the reflooding tests in an extremely tight bundle (p/d=1.06, FLORESTAN 1) have been completed, new experiments for a wider lattice (p/d=1.242, FLORESTAN 2), which is employed in the recent APWR design of KfK, are planned at KfK to obtain the benchmark data for validation and improvement of calculation methods. This report presents the results of pre-test calculations for the FLORESTAN 2 experiment using FLUT-FDWR, a modified version of the GRS computer code FLUT for analysis of the most important behaviour during the reflooding phase after a LOCA in the APWR design. (orig.)
张晓辉; 陈成; 曾辉; 宁卫卫; 张楠; 黄建安
2016-01-01
Objective To screen the clinical risk factors of lung cancer in the patients with solitary pulmonary nodules ( SPN) ,and build the clinical prediction model to estimate the probability of malignancy.Methods A retrospective analysis was performed on the clinical data and chest imaging characteristics of 270 patients with SPN.Results Among 270 patients,there had 110 (40.7%) cases of lung cancer,and 160 (59.3%) benign lesions.On the analysis of imaging characteristics,lobulation, spiculated sign, pleural indentation sign, contrast enhancement, air bronchogram sign were associated with lung cancer ( P <0.05).Nodules with clear boundary,calcification,homogeneous density were associated with benign lesions (P<0.05).Single factor analysis showed that age, smoking history, malignant imaging characteristics and diameter were significantly affected the judgment of SPN whether it was benign or malignant(P<0.05).The multivariate analysis revealed that age,malignant imaging characteristics and diameter were independent risk factors of lung cancer in the patients with SPN (P<0.01).The clinical pre-diction model to estimate the probability of malignancy as following:P=ex/(1+ex),X=-5.882+0.050* age+1.672*ima-ging characteristic+0.123* the maximum diameter,where the e is the base of the natural logarithm.The cut-off value was 0.46. The sensitivity was 82%,specificity 85%,positive balue 80%,and negative predictive value 87%.The area under the ROC curve for our model was 0.901.Conclusion Age,malignant imaging characteristics and diameter are independent risk factors of lung cancer in the patients with SPN.Our prediction model is accurate and sufficient to estimate the malignancy of patients with SPN.%目的：筛选恶性孤立性肺结节（solitary pulmonary nodules，SPN）的危险因素，构建判断SPN良恶性的临床预测模型。方法回顾性分析孤立性肺结节患者270例的临床资料及胸部影像学特征。结果270例患者中，肺癌110例（40．7
Probable maximum flood control
This study proposes preliminary design concepts to protect the waste-handling facilities and all shaft and ramp entries to the underground from the probable maximum flood (PMF) in the current design configuration for the proposed Nevada Nuclear Waste Storage Investigation (NNWSI) repository protection provisions were furnished by the United States Bureau of Reclamation (USSR) or developed from USSR data. Proposed flood protection provisions include site grading, drainage channels, and diversion dikes. Figures are provided to show these proposed flood protection provisions at each area investigated. These areas are the central surface facilities (including the waste-handling building and waste treatment building), tuff ramp portal, waste ramp portal, men-and-materials shaft, emplacement exhaust shaft, and exploratory shafts facility
Emptiness Formation Probability
Crawford, Nicholas; Ng, Stephen; Starr, Shannon
2016-08-01
We present rigorous upper and lower bounds on the emptiness formation probability for the ground state of a spin-1/2 Heisenberg XXZ quantum spin system. For a d-dimensional system we find a rate of decay of the order {exp(-c L^{d+1})} where L is the sidelength of the box in which we ask for the emptiness formation event to occur. In the {d=1} case this confirms previous predictions made in the integrable systems community, though our bounds do not achieve the precision predicted by Bethe ansatz calculations. On the other hand, our bounds in the case {d ≥ 2} are new. The main tools we use are reflection positivity and a rigorous path integral expansion, which is a variation on those previously introduced by Toth, Aizenman-Nachtergaele and Ueltschi.
Accidents, probabilities and consequences
Following brief discussion of the safety of wind-driven power plants and solar power plants, some aspects of the safety of fast breeder and thermonuclear power plants are presented. It is pointed out that no safety evaluation of breeders comparable to the Rasmussen investigation has been carried out and that discussion of the safety aspects of thermonuclear power is only just begun. Finally, as an illustration of the varying interpretations of risk and safety analyses, four examples are given of predicted probabilities and consequences in Copenhagen of the maximum credible accident at the Barsebaeck plant, under the most unfavourable meterological conditions. These are made by the Environment Commission, Risoe Research Establishment, REO (a pro-nuclear group) and OOA (an anti-nuclear group), and vary by a factor of over 1000. (JIW)
Measure, integral and probability
Capiński, Marek
2004-01-01
Measure, Integral and Probability is a gentle introduction that makes measure and integration theory accessible to the average third-year undergraduate student. The ideas are developed at an easy pace in a form that is suitable for self-study, with an emphasis on clear explanations and concrete examples rather than abstract theory. For this second edition, the text has been thoroughly revised and expanded. New features include: · a substantial new chapter, featuring a constructive proof of the Radon-Nikodym theorem, an analysis of the structure of Lebesgue-Stieltjes measures, the Hahn-Jordan decomposition, and a brief introduction to martingales · key aspects of financial modelling, including the Black-Scholes formula, discussed briefly from a measure-theoretical perspective to help the reader understand the underlying mathematical framework. In addition, further exercises and examples are provided to encourage the reader to become directly involved with the material.
薏苡叶化学成分的预试验%The Coix Leaves Chemical Composition of the Pre-test
谭冰; 黄锁义; 严焕宁; 史柳芝; 吕龙祥
2014-01-01
Experimental study on the Guangxi Coix leaves of the chemical composition of pre-test Chemical reaction identification method of production of the Coix leaves water extract , ethanol extract and petroleum ether extract of Guangxi Coix leaves chemical composition of the pretest. Through the pre-test , suggesting that from Guangxi Coix leaves contain flavonoids,Phenolic,Coumarin,Volatile oil,Phytosterol,Carbohydrate,Glycosides, Tannin,Organic acids,Alkaloids and other chemical constituents. This test provided the experimental basis for further biologically active constituents of the plant.%对广西薏苡叶的化学成分进行预试验研究。采用化学反应鉴别法分别对广西产薏苡叶水提取液、乙醇提取液和石油醚提取液进行化学成分预试。通过预试验，提示广西产薏苡叶中可能含有黄酮类、酚类、香豆素类、挥发油、植物甾醇、糖类、苷类、鞣质、有机酸、生物碱等化学成分。此试验为进一步进行该植物的生物活性成分研究提供了实验基础。
Savage s Concept of Probability
熊卫
2003-01-01
Starting with personal preference, Savage [3] constructs a foundation theory for probability from the qualitative probability to the quantitative probability and to utility. There are some profound logic connections between three steps in Savage's theory; that is, quantitative concepts properly represent qualitative concepts. Moreover, Savage's definition of subjective probability is in accordance with probability theory, and the theory gives us a rational decision model only if we assume that the weak ...
RANDOM VARIABLE WITH FUZZY PROBABILITY
吕恩琳; 钟佑明
2003-01-01
Mathematic description about the second kind fuzzy random variable namely the random variable with crisp event-fuzzy probability was studied. Based on the interval probability and using the fuzzy resolution theorem, the feasible condition about a probability fuzzy number set was given, go a step further the definition arid characters of random variable with fuzzy probability ( RVFP ) and the fuzzy distribution function and fuzzy probability distribution sequence of the RVFP were put forward. The fuzzy probability resolution theorem with the closing operation of fuzzy probability was given and proved. The definition and characters of mathematical expectation and variance of the RVFP were studied also. All mathematic description about the RVFP has the closing operation for fuzzy probability, as a result, the foundation of perfecting fuzzy probability operation method is laid.
The Logic of Parametric Probability
Norman, Joseph W
2012-01-01
The computational method of parametric probability analysis is introduced. It is demonstrated how to embed logical formulas from the propositional calculus into parametric probability networks, thereby enabling sound reasoning about the probabilities of logical propositions. An alternative direct probability encoding scheme is presented, which allows statements of implication and quantification to be modeled directly as constraints on conditional probabilities. Several example problems are solved, from Johnson-Laird's aces to Smullyan's zombies. Many apparently challenging problems in logic turn out to be simple problems in algebra and computer science; often just systems of polynomial equations or linear optimization problems. This work extends the mathematical logic and parametric probability methods invented by George Boole.
Highlights: ► As preparation for the HCPB-TBM Breeder Unit out-of-pile testing campaign, a pre-test experiment (PREMUX) has been prepared and described. ► A new heater system based on a wire heater matrix has been developed for imitating the neutronic volumetric heating and it is compared with the conventional plate heaters. ► The test section is described and preliminary thermal results with the available models are presented and are to be benchmarked with PREMUX. ► The PREMUX integration in the air cooling loop L-STAR/LL in the Karlsruhe Institute for Technology is shown and future steps are discussed. -- Abstract: The complexity of the experimental set-up for testing a full-scaled Breeder Unit (BU) mock-up for the European Helium Cooled Pebble Bed Test Blanket Module (HCPB-TBM) has motivated to build a pre-test mock-up experiment (PREMUX) consisting of a slice of the BU in the Li4SiO4 region. This pre-test aims at verifying the feasibility of the methods to be used for the subsequent testing of the full-scaled BU mock-up. Key parameters needed for the modeling of the breeder material is also to be determined by the Hot Wire Method (HWM). The modeling tools for the thermo-mechanics of the pebble beds and for the mock-up structure are to be calibrated and validated as well. This paper presents the setting-up of PREMUX in the L-STAR/LL facility at the Karlsruhe Institute of Technology. A key requirement of the experiments is to mimic the neutronic volumetric heating. A new heater concept is discussed and compared to several conventional heater configurations with respect to the estimated temperature distribution in the pebble beds. The design and integration of the thermocouple system in the heater matrix and pebble beds is also described, as well as other key aspects of the mock-up (dimensions, layout, cooling system, purge gas line, boundary conditions and integration in the test facility). The adequacy of these methods for the full-scaled BU mock-up is
Physics with exotic probability theory
Youssef, Saul
2001-01-01
Probability theory can be modified in essentially one way while maintaining consistency with the basic Bayesian framework. This modification results in copies of standard probability theory for real, complex or quaternion probabilities. These copies, in turn, allow one to derive quantum theory while restoring standard probability theory in the classical limit. The argument leading to these three copies constrain physical theories in the same sense that Cox's original arguments constrain alter...
Quantum Foundations : Is Probability Ontological ?
Rosinger, Elemer E
2007-01-01
It is argued that the Copenhagen Interpretation of Quantum Mechanics, founded ontologically on the concept of probability, may be questionable in view of the fact that within Probability Theory itself the ontological status of the concept of probability has always been, and is still under discussion.
Probability workshop to be better in probability topic
Asmat, Aszila; Ujang, Suriyati; Wahid, Sharifah Norhuda Syed
2015-02-01
The purpose of the present study was to examine whether statistics anxiety and attitudes towards probability topic among students in higher education level have an effect on their performance. 62 fourth semester science students were given statistics anxiety questionnaires about their perception towards probability topic. Result indicated that students' performance in probability topic is not related to anxiety level, which means that the higher level in statistics anxiety will not cause lower score in probability topic performance. The study also revealed that motivated students gained from probability workshop ensure that their performance in probability topic shows a positive improvement compared before the workshop. In addition there exists a significance difference in students' performance between genders with better achievement among female students compared to male students. Thus, more initiatives in learning programs with different teaching approaches is needed to provide useful information in improving student learning outcome in higher learning institution.
Propensity, Probability, and Quantum Theory
Ballentine, Leslie E.
2016-08-01
Quantum mechanics and probability theory share one peculiarity. Both have well established mathematical formalisms, yet both are subject to controversy about the meaning and interpretation of their basic concepts. Since probability plays a fundamental role in QM, the conceptual problems of one theory can affect the other. We first classify the interpretations of probability into three major classes: (a) inferential probability, (b) ensemble probability, and (c) propensity. Class (a) is the basis of inductive logic; (b) deals with the frequencies of events in repeatable experiments; (c) describes a form of causality that is weaker than determinism. An important, but neglected, paper by P. Humphreys demonstrated that propensity must differ mathematically, as well as conceptually, from probability, but he did not develop a theory of propensity. Such a theory is developed in this paper. Propensity theory shares many, but not all, of the axioms of probability theory. As a consequence, propensity supports the Law of Large Numbers from probability theory, but does not support Bayes theorem. Although there are particular problems within QM to which any of the classes of probability may be applied, it is argued that the intrinsic quantum probabilities (calculated from a state vector or density matrix) are most naturally interpreted as quantum propensities. This does not alter the familiar statistical interpretation of QM. But the interpretation of quantum states as representing knowledge is untenable. Examples show that a density matrix fails to represent knowledge.
Pretest probability assessment derived from attribute matching
Hollander Judd E; Diercks Deborah B; Pollack Charles V; Johnson Charles L; Kline Jeffrey A; Newgard Craig D; Garvey J Lee
2005-01-01
Abstract Background Pretest probability (PTP) assessment plays a central role in diagnosis. This report compares a novel attribute-matching method to generate a PTP for acute coronary syndrome (ACS). We compare the new method with a validated logistic regression equation (LRE). Methods Eight clinical variables (attributes) were chosen by classification and regression tree analysis of a prospectively collected reference database of 14,796 emergency department (ED) patients evaluated for possib...
Applied probability and stochastic processes
Sumita, Ushio
1999-01-01
Applied Probability and Stochastic Processes is an edited work written in honor of Julien Keilson. This volume has attracted a host of scholars in applied probability, who have made major contributions to the field, and have written survey and state-of-the-art papers on a variety of applied probability topics, including, but not limited to: perturbation method, time reversible Markov chains, Poisson processes, Brownian techniques, Bayesian probability, optimal quality control, Markov decision processes, random matrices, queueing theory and a variety of applications of stochastic processes. The book has a mixture of theoretical, algorithmic, and application chapters providing examples of the cutting-edge work that Professor Keilson has done or influenced over the course of his highly-productive and energetic career in applied probability and stochastic processes. The book will be of interest to academic researchers, students, and industrial practitioners who seek to use the mathematics of applied probability i...
Probabilities of multiple quantum teleportation
Woesler, Richard
2002-01-01
Using quantum teleportation a quantum state can be teleported with a certain probability. Here the probabilities for multiple teleportation are derived, i. e. for the case that a teleported quantum state is teleported again or even more than two times, for the two-dimensional case, e. g., for the two orthogonal direcations of the polarization of photons. It is shown that the probability for an exact teleportation, except for an irrelevant phase factor, is 25 %, i. e., surprisingly, this resul...
Probability and statistics: selected problems
Machado, J.A. Tenreiro; Pinto, Carla M. A.
2014-01-01
Probability and Statistics—Selected Problems is a unique book for senior undergraduate and graduate students to fast review basic materials in Probability and Statistics. Descriptive statistics are presented first, and probability is reviewed secondly. Discrete and continuous distributions are presented. Sample and estimation with hypothesis testing are presented in the last two chapters. The solutions for proposed excises are listed for readers to references.
Free Probability on a Direct Product of Noncommutative Probability Spaces
Cho, Ilwoo
2005-01-01
In this paper, we observevd the amalgamated free probability of direct product of noncommutative probability spaces. We defined the amalgamated R-transforms, amalgamated moment series and the amalgamated boxed convolution. They maks us to do the amalgamated R-transform calculus, like the scalar-valued case.
Expected utility with lower probabilities
Hendon, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte;
1994-01-01
An uncertain and not just risky situation may be modeled using so-called belief functions assigning lower probabilities to subsets of outcomes. In this article we extend the von Neumann-Morgenstern expected utility theory from probability measures to belief functions. We use this theory to...
Probability theory and its models
Humphreys, Paul
2008-01-01
This paper argues for the status of formal probability theory as a mathematical, rather than a scientific, theory. David Freedman and Philip Stark's concept of model based probabilities is examined and is used as a bridge between the formal theory and applications.
Decision analysis with approximate probabilities
Whalen, Thomas
1992-01-01
This paper concerns decisions under uncertainty in which the probabilities of the states of nature are only approximately known. Decision problems involving three states of nature are studied. This is due to the fact that some key issues do not arise in two-state problems, while probability spaces with more than three states of nature are essentially impossible to graph. The primary focus is on two levels of probabilistic information. In one level, the three probabilities are separately rounded to the nearest tenth. This can lead to sets of rounded probabilities which add up to 0.9, 1.0, or 1.1. In the other level, probabilities are rounded to the nearest tenth in such a way that the rounded probabilities are forced to sum to 1.0. For comparison, six additional levels of probabilistic information, previously analyzed, were also included in the present analysis. A simulation experiment compared four criteria for decisionmaking using linearly constrained probabilities (Maximin, Midpoint, Standard Laplace, and Extended Laplace) under the eight different levels of information about probability. The Extended Laplace criterion, which uses a second order maximum entropy principle, performed best overall.
A graduate course in probability
Tucker, Howard G
2014-01-01
Suitable for a graduate course in analytic probability, this text requires only a limited background in real analysis. Topics include probability spaces and distributions, stochastic independence, basic limiting options, strong limit theorems for independent random variables, central limit theorem, conditional expectation and Martingale theory, and an introduction to stochastic processes.
Subjective probability models for lifetimes
Spizzichino, Fabio
2001-01-01
Bayesian methods in reliability cannot be fully utilized and understood without full comprehension of the essential differences that exist between frequentist probability and subjective probability. Switching from the frequentist to the subjective approach requires that some fundamental concepts be rethought and suitably redefined. Subjective Probability Models for Lifetimes details those differences and clarifies aspects of subjective probability that have a direct influence on modeling and drawing inference from failure and survival data. In particular, within a framework of Bayesian theory, the author considers the effects of different levels of information in the analysis of the phenomena of positive and negative aging.The author coherently reviews and compares the various definitions and results concerning stochastic ordering, statistical dependence, reliability, and decision theory. He offers a detailed but accessible mathematical treatment of different aspects of probability distributions for exchangea...
Invariant probabilities of transition functions
Zaharopol, Radu
2014-01-01
The structure of the set of all the invariant probabilities and the structure of various types of individual invariant probabilities of a transition function are two topics of significant interest in the theory of transition functions, and are studied in this book. The results obtained are useful in ergodic theory and the theory of dynamical systems, which, in turn, can be applied in various other areas (like number theory). They are illustrated using transition functions defined by flows, semiflows, and one-parameter convolution semigroups of probability measures. In this book, all results on transition probabilities that have been published by the author between 2004 and 2008 are extended to transition functions. The proofs of the results obtained are new. For transition functions that satisfy very general conditions the book describes an ergodic decomposition that provides relevant information on the structure of the corresponding set of invariant probabilities. Ergodic decomposition means a splitting of t...
Survival probability and ruin probability of a risk model
LUO Jian-hua
2008-01-01
In this paper, a new risk model is studied in which the rate of premium income is regarded as a random variable, the arrival of insurance policies is a Poisson process and the process of claim occurring is p-thinning process. The integral representations of the survival probability are gotten. The explicit formula of the survival probability on the infinite interval is obtained in the special casc--exponential distribution.The Lundberg inequality and the common formula of the ruin probability are gotten in terms of some techniques from martingale theory.
无
2005-01-01
People much given to gambling usually manage to work out rough-and-ready ways of measuring the likelihood of certain situations so as to know which way to bet their money, and how much. If they did not do this., they would quickly lose all their money to those who did.
Tagliafico, Alberto, E-mail: alberto.tagliafico@unige.it [Institute of Anatomy, Department of Experimental Medicine, University of Genoa, Largo Rosanna Benzi 8, 16132 Genoa (Italy); Succio, Giulia; Serafini, Giovanni [Department of Radiology, Santa Corona Hospital, Pietra Ligure, Italy via XXV Aprile, 38- Pietra Ligure, 17027 Savona (Italy); Martinoli, Carlo [Radiology Department, DISC, Università di Genova, Largo Rosanna Benzi 8, 16138 Genova (Italy)
2012-10-15
Objective: To evaluate brachial plexus MRI accuracy with surgical findings and clinical follow-up as reference standard in a large multicentre study. Materials and methods: The research was approved by the Institutional Review Boards, and all patients provided their written informed consent. A multicentre retrospective trial that included three centres was performed between March 2006 and April 2011. A total of 157 patients (men/women: 81/76; age range, 18–84 years) were evaluated: surgical findings and clinical follow-up of at least 12 months were used as the reference standard. MR imaging was performed with different equipment at 1.5 T and 3.0 T. The patient group was divided in five subgroups: mass lesion, traumatic injury, entrapment syndromes, post-treatment evaluation, and other. Sensitivity, specificity with 95% confidence intervals (CIs), positive predictive value (PPV), pre-test-probability (the prevalence), negative predictive value (NPV), pre- and post-test odds (OR), likelihood ratio for positive results (LH+), likelihood ratio for negative results (LH−), accuracy and post-test probability (post-P) were reported on a per-patient basis. Results: The overall sensitivity and specificity with 95% CIs were: 0.810/0.914; (0.697–0.904). Overall PPV, pre-test probability, NPV, LH+, LH−, and accuracy: 0.823, 0.331, 0.905, 9.432, 0.210, 0.878. Conclusions: The overall diagnostic accuracy of brachial plexus MRI calculated on a per-patient base is relatively high. The specificity of brachial plexus MRI in patients suspected of having a space-occupying mass is very high. The sensitivity is also high, but there are false-positive interpretations as well.
Objective: To evaluate brachial plexus MRI accuracy with surgical findings and clinical follow-up as reference standard in a large multicentre study. Materials and methods: The research was approved by the Institutional Review Boards, and all patients provided their written informed consent. A multicentre retrospective trial that included three centres was performed between March 2006 and April 2011. A total of 157 patients (men/women: 81/76; age range, 18–84 years) were evaluated: surgical findings and clinical follow-up of at least 12 months were used as the reference standard. MR imaging was performed with different equipment at 1.5 T and 3.0 T. The patient group was divided in five subgroups: mass lesion, traumatic injury, entrapment syndromes, post-treatment evaluation, and other. Sensitivity, specificity with 95% confidence intervals (CIs), positive predictive value (PPV), pre-test-probability (the prevalence), negative predictive value (NPV), pre- and post-test odds (OR), likelihood ratio for positive results (LH+), likelihood ratio for negative results (LH−), accuracy and post-test probability (post-P) were reported on a per-patient basis. Results: The overall sensitivity and specificity with 95% CIs were: 0.810/0.914; (0.697–0.904). Overall PPV, pre-test probability, NPV, LH+, LH−, and accuracy: 0.823, 0.331, 0.905, 9.432, 0.210, 0.878. Conclusions: The overall diagnostic accuracy of brachial plexus MRI calculated on a per-patient base is relatively high. The specificity of brachial plexus MRI in patients suspected of having a space-occupying mass is very high. The sensitivity is also high, but there are false-positive interpretations as well
Probable Inference and Quantum Mechanics
In its current very successful interpretation the quantum theory is fundamentally statistical in nature. Although commonly viewed as a probability amplitude whose (complex) square is a probability, the wavefunction or state vector continues to defy consensus as to its exact meaning, primarily because it is not a physical observable. Rather than approach this problem directly, it is suggested that it is first necessary to clarify the precise role of probability theory in quantum mechanics, either as applied to, or as an intrinsic part of the quantum theory. When all is said and done the unsurprising conclusion is that quantum mechanics does not constitute a logic and probability unto itself, but adheres to the long-established rules of classical probability theory while providing a means within itself for calculating the relevant probabilities. In addition, the wavefunction is seen to be a description of the quantum state assigned by an observer based on definite information, such that the same state must be assigned by any other observer based on the same information, in much the same way that probabilities are assigned.
Probability with applications and R
Dobrow, Robert P
2013-01-01
An introduction to probability at the undergraduate level Chance and randomness are encountered on a daily basis. Authored by a highly qualified professor in the field, Probability: With Applications and R delves into the theories and applications essential to obtaining a thorough understanding of probability. With real-life examples and thoughtful exercises from fields as diverse as biology, computer science, cryptology, ecology, public health, and sports, the book is accessible for a variety of readers. The book's emphasis on simulation through the use of the popular R software language c
A philosophical essay on probabilities
Laplace, Marquis de
1996-01-01
A classic of science, this famous essay by ""the Newton of France"" introduces lay readers to the concepts and uses of probability theory. It is of especial interest today as an application of mathematical techniques to problems in social and biological sciences.Generally recognized as the founder of the modern phase of probability theory, Laplace here applies the principles and general results of his theory ""to the most important questions of life, which are, in effect, for the most part, problems in probability."" Thus, without the use of higher mathematics, he demonstrates the application
Transition probabilities of Br II
Bengtson, R. D.; Miller, M. H.
1976-01-01
Absolute transition probabilities of the three most prominent visible Br II lines are measured in emission. Results compare well with Coulomb approximations and with line strengths extrapolated from trends in homologous atoms.
Induction, of and by Probability
Rendell, Larry
2013-01-01
This paper examines some methods and ideas underlying the author's successful probabilistic learning systems(PLS), which have proven uniquely effective and efficient in generalization learning or induction. While the emerging principles are generally applicable, this paper illustrates them in heuristic search, which demands noise management and incremental learning. In our approach, both task performance and learning are guided by probability. Probabilities are incrementally normalized and re...
Trajectory probability hypothesis density filter
García-Fernández, Ángel F.; Svensson, Lennart
2016-01-01
This paper presents the probability hypothesis density (PHD) filter for sets of trajectories. The resulting filter, which is referred to as trajectory probability density filter (TPHD), is capable of estimating trajectories in a principled way without requiring to evaluate all measurement-to-target association hypotheses. As the PHD filter, the TPHD filter is based on recursively obtaining the best Poisson approximation to the multitrajectory filtering density in the sense of minimising the K...
Hf Transition Probabilities and Abundances
Lawler, J. E.; Hartog, E.A. den; Labby, Z. E.; Sneden, C.; Cowan, J. J.; Ivans, I. I.
2006-01-01
Radiative lifetimes from laser-induced fluorescence measurements, accurate to about +/- 5 percent, are reported for 41 odd-parity levels of Hf II. The lifetimes are combined with branching fractions measured using Fourier transform spectrometry to determine transition probabilities for 150 lines of Hf II. Approximately half of these new transition probabilities overlap with recent independent measurements using a similar approach. The two sets of measurements are found to be in good agreement...
Gd Transition Probabilities and Abundances
Hartog, E.A. den; Lawler, J. E.; Sneden, C.; Cowan, J. J.
2006-01-01
Radiative lifetimes, accurate to +/- 5%, have been measured for 49 even-parity and 14 odd-parity levels of Gd II using laser-induced fluorescence. The lifetimes are combined with branching fractions measured using Fourier transform spectrometry to determine transition probabilities for 611 lines of Gd II. This work is the largest-scale laboratory study to date of Gd II transition probabilities and the first using a high performance Fourier transform spectrometer. This improved data set has be...
Sm Transition Probabilities and Abundances
Lawler, J. E.; Hartog, E.A. den; Sneden, C.; Cowan, J. J.
2005-01-01
Radiative lifetimes, accurate to +/- 5%, have been measured for 212 odd-parity levels of Sm II using laser-induced fluorescence. The lifetimes are combined with branching fractions measured using Fourier-transform spectrometry to determine transition probabilities for more than 900 lines of Sm II. This work is the largest-scale laboratory study to date of Sm II transition probabilities using modern methods. This improved data set has been used to determine a new solar photospheric Sm abundanc...
Gaussian Probabilities and Expectation Propagation
Cunningham, John P.; Hennig, Philipp; Lacoste-Julien, Simon
2011-01-01
While Gaussian probability densities are omnipresent in applied mathematics, Gaussian cumulative probabilities are hard to calculate in any but the univariate case. We study the utility of Expectation Propagation (EP) as an approximate integration method for this problem. For rectangular integration regions, the approximation is highly accurate. We also extend the derivations to the more general case of polyhedral integration regions. However, we find that in this polyhedral case, EP's answer...
Default probabilities and default correlations
Erlenmaier, Ulrich; Gersbach, Hans
2001-01-01
Starting from the Merton framework for firm defaults, we provide the analytics and robustness of the relationship between default correlations. We show that loans with higher default probabilities will not only have higher variances but also higher correlations between loans. As a consequence, portfolio standard deviation can increase substantially when loan default probabilities rise. This result has two important implications. First, relative prices of loans with different default probabili...
Compliance with endogenous audit probabilities
Konrad, Kai A.; Lohse, Tim; Qari, Salmai
2015-01-01
This paper studies the effect of endogenous audit probabilities on reporting behavior in a face-to-face compliance situation such as at customs. In an experimental setting in which underreporting has a higher expected payoff than truthful reporting we find an increase in compliance of about 80% if subjects have reason to believe that their behavior towards an officer influences their endogenous audit probability. Higher compliance is driven by considerations about how own appearance and perfo...
Novel Bounds on Marginal Probabilities
Mooij, Joris M.; Kappen, Hilbert J
2008-01-01
We derive two related novel bounds on single-variable marginal probability distributions in factor graphs with discrete variables. The first method propagates bounds over a subtree of the factor graph rooted in the variable, and the second method propagates bounds over the self-avoiding walk tree starting at the variable. By construction, both methods not only bound the exact marginal probability distribution of a variable, but also its approximate Belief Propagation marginal (``belief''). Th...
Snyder, Hannah; Yeldandi, Vijay V.; Kumar, G. Prem; Liao, Chuanhong; Lakshmi, Vemu; Gandham, Sabitha R.; Muppudi, Uma; Oruganti, Ganesh; Schneider, John A.
2012-01-01
In India, men who have sex with men (MSM) and truck drivers are high-risk groups that often do not access HIV testing due to stigma and high mobility. This study evaluated a field testing package (FTP) that identified HIV positive participants through video pre-test counseling, OraQuick oral fluid HIV testing, and telephonic post-test counseling…
Probably not future prediction using probability and statistical inference
Dworsky, Lawrence N
2008-01-01
An engaging, entertaining, and informative introduction to probability and prediction in our everyday lives Although Probably Not deals with probability and statistics, it is not heavily mathematical and is not filled with complex derivations, proofs, and theoretical problem sets. This book unveils the world of statistics through questions such as what is known based upon the information at hand and what can be expected to happen. While learning essential concepts including "the confidence factor" and "random walks," readers will be entertained and intrigued as they move from chapter to chapter. Moreover, the author provides a foundation of basic principles to guide decision making in almost all facets of life including playing games, developing winning business strategies, and managing personal finances. Much of the book is organized around easy-to-follow examples that address common, everyday issues such as: How travel time is affected by congestion, driving speed, and traffic lights Why different gambling ...
Joint probability distributions for projection probabilities of random orthonormal states
The quantum chaos conjecture applied to a finite dimensional quantum system implies that such a system has eigenstates that show similar statistical properties as the column vectors of random orthogonal or unitary matrices. Here, we consider the different probabilities for obtaining a specific outcome in a projective measurement, provided the system is in one of its eigenstates. We then give analytic expressions for the joint probability density for these probabilities, with respect to the ensemble of random matrices. In the case of the unitary group, our results can be applied, also, to the phenomenon of universal conductance fluctuations, where the same mathematical quantities describe partial conductances in a two-terminal mesoscopic scattering problem with a finite number of modes in each terminal. (paper)
Joint probability distributions for projection probabilities of random orthonormal states
Alonso, L.; Gorin, T.
2016-04-01
The quantum chaos conjecture applied to a finite dimensional quantum system implies that such a system has eigenstates that show similar statistical properties as the column vectors of random orthogonal or unitary matrices. Here, we consider the different probabilities for obtaining a specific outcome in a projective measurement, provided the system is in one of its eigenstates. We then give analytic expressions for the joint probability density for these probabilities, with respect to the ensemble of random matrices. In the case of the unitary group, our results can be applied, also, to the phenomenon of universal conductance fluctuations, where the same mathematical quantities describe partial conductances in a two-terminal mesoscopic scattering problem with a finite number of modes in each terminal.
Probability on real Lie algebras
Franz, Uwe
2016-01-01
This monograph is a progressive introduction to non-commutativity in probability theory, summarizing and synthesizing recent results about classical and quantum stochastic processes on Lie algebras. In the early chapters, focus is placed on concrete examples of the links between algebraic relations and the moments of probability distributions. The subsequent chapters are more advanced and deal with Wigner densities for non-commutative couples of random variables, non-commutative stochastic processes with independent increments (quantum Lévy processes), and the quantum Malliavin calculus. This book will appeal to advanced undergraduate and graduate students interested in the relations between algebra, probability, and quantum theory. It also addresses a more advanced audience by covering other topics related to non-commutativity in stochastic calculus, Lévy processes, and the Malliavin calculus.
Approximation methods in probability theory
Čekanavičius, Vydas
2016-01-01
This book presents a wide range of well-known and less common methods used for estimating the accuracy of probabilistic approximations, including the Esseen type inversion formulas, the Stein method as well as the methods of convolutions and triangle function. Emphasising the correct usage of the methods presented, each step required for the proofs is examined in detail. As a result, this textbook provides valuable tools for proving approximation theorems. While Approximation Methods in Probability Theory will appeal to everyone interested in limit theorems of probability theory, the book is particularly aimed at graduate students who have completed a standard intermediate course in probability theory. Furthermore, experienced researchers wanting to enlarge their toolkit will also find this book useful.
VIBRATION ISOLATION SYSTEM PROBABILITY ANALYSIS
Smirnov Vladimir Alexandrovich
2012-10-01
Full Text Available The article deals with the probability analysis for a vibration isolation system of high-precision equipment, which is extremely sensitive to low-frequency oscillations even of submicron amplitude. The external sources of low-frequency vibrations may include the natural city background or internal low-frequency sources inside buildings (pedestrian activity, HVAC. Taking Gauss distribution into account, the author estimates the probability of the relative displacement of the isolated mass being still lower than the vibration criteria. This problem is being solved in the three dimensional space, evolved by the system parameters, including damping and natural frequency. According to this probability distribution, the chance of exceeding the vibration criteria for a vibration isolation system is evaluated. Optimal system parameters - damping and natural frequency - are being developed, thus the possibility of exceeding vibration criteria VC-E and VC-D is assumed to be less than 0.04.
Born Rule and Noncontextual Probability
Logiurato, Fabrizio
2012-01-01
The probabilistic rule that links the formalism of Quantum Mechanics (QM) to the real world was stated by Born in 1926. Since then, there were many attempts to derive the Born postulate as a theorem, Gleason's one being the most prominent. The Gleason derivation, however, is generally considered as rather intricate and its physical meaning, in particular in relation with the noncontextuality of probability (NP), is not quite evident. More recently, we are witnessing a revival of interest on possible demonstrations of the Born rule, like Zurek's and Deutsch's based on the decoherence and on the theory of decisions, respectively. Despite an ongoing debate about the presence of hidden assumptions and circular reasonings, these have the merit of prompting more physically oriented approaches to the problem. Here we suggest a new proof of the Born rule based on the noncontextuality of probability. Within the theorem we also demonstrate the continuity of probability with respect to the amplitudes, which has been sug...
Probability, Statistics, and Stochastic Processes
Olofsson, Peter
2011-01-01
A mathematical and intuitive approach to probability, statistics, and stochastic processes This textbook provides a unique, balanced approach to probability, statistics, and stochastic processes. Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area. This text combines a rigorous, calculus-based development of theory with a more intuitive approach that appeals to readers' sense of reason and logic, an approach developed through the author's many years of classroom experience. The text begins with three chapters that d
Probability and Statistics: 5 Questions
Probability and Statistics: 5 Questions is a collection of short interviews based on 5 questions presented to some of the most influential and prominent scholars in probability and statistics. We hear their views on the fields, aims, scopes, the future direction of research and how their work fit...... in these respects. Interviews with Nick Bingham, Luc Bovens, Terrence L. Fine, Haim Gaifman, Donald Gillies, James Hawthorne, Carl Hoefer, James M. Joyce, Joseph B. Kadane Isaac Levi, D.H. Mellor, Patrick Suppes, Jan von Plato, Carl Wagner, Sandy Zabell...
Knowledge typology for imprecise probabilities.
Wilson, G. D. (Gregory D.); Zucker, L. J. (Lauren J.)
2002-01-01
When characterizing the reliability of a complex system there are often gaps in the data available for specific subsystems or other factors influencing total system reliability. At Los Alamos National Laboratory we employ ethnographic methods to elicit expert knowledge when traditional data is scarce. Typically, we elicit expert knowledge in probabilistic terms. This paper will explore how we might approach elicitation if methods other than probability (i.e., Dempster-Shafer, or fuzzy sets) prove more useful for quantifying certain types of expert knowledge. Specifically, we will consider if experts have different types of knowledge that may be better characterized in ways other than standard probability theory.
Probability, statistics, and queueing theory
Allen, Arnold O
1990-01-01
This is a textbook on applied probability and statistics with computer science applications for students at the upper undergraduate level. It may also be used as a self study book for the practicing computer science professional. The successful first edition of this book proved extremely useful to students who need to use probability, statistics and queueing theory to solve problems in other fields, such as engineering, physics, operations research, and management science. The book has also been successfully used for courses in queueing theory for operations research students. This second edit
Fusion Probability in Dinuclear System
Hong, Juhee
2015-01-01
Fusion can be described by the time evolution of a dinuclear system with two degrees of freedom, the relative motion and transfer of nucleons. In the presence of the coupling between two collective modes, we solve the Fokker-Planck equation in a locally harmonic approximation. The potential of a dinuclear system has the quasifission barrier and the inner fusion barrier, and the escape rates can be calculated by the Kramers' model. To estimate the fusion probability, we calculate the quasifission rate and the fusion rate. We investigate the coupling effects on the fusion probability and the cross section of evaporation residue.
Interference of probabilities in dynamics
Zak, Michail, E-mail: michail.zak@gmail.com [Jet Propulsion Laboratory California Institute of Technology, Pasadena, CA 91109 (United States)
2014-08-15
A new class of dynamical systems with a preset type of interference of probabilities is introduced. It is obtained from the extension of the Madelung equation by replacing the quantum potential with a specially selected feedback from the Liouville equation. It has been proved that these systems are different from both Newtonian and quantum systems, but they can be useful for modeling spontaneous collective novelty phenomena when emerging outputs are qualitatively different from the weighted sum of individual inputs. Formation of language and fast decision-making process as potential applications of the probability interference is discussed.
Drozda, Tomasz G.; Axdahl, Erik L.; Cabell, Karen F.
2014-01-01
With the increasing costs of physics experiments and simultaneous increase in availability and maturity of computational tools it is not surprising that computational fluid dynamics (CFD) is playing an increasingly important role, not only in post-test investigations, but also in the early stages of experimental planning. This paper describes a CFD-based effort executed in close collaboration between computational fluid dynamicists and experimentalists to develop a virtual experiment during the early planning stages of the Enhanced Injection and Mixing project at NASA Langley Research Center. This projects aims to investigate supersonic combustion ramjet (scramjet) fuel injection and mixing physics, improve the understanding of underlying physical processes, and develop enhancement strategies and functional relationships relevant to flight Mach numbers greater than 8. The purpose of the virtual experiment was to provide flow field data to aid in the design of the experimental apparatus and the in-stream rake probes, to verify the nonintrusive measurements based on NO-PLIF, and to perform pre-test analysis of quantities obtainable from the experiment and CFD. The approach also allowed for the joint team to develop common data processing and analysis tools, and to test research ideas. The virtual experiment consisted of a series of Reynolds-averaged simulations (RAS). These simulations included the facility nozzle, the experimental apparatus with a baseline strut injector, and the test cabin. Pure helium and helium-air mixtures were used to determine the efficacy of different inert gases to model hydrogen injection. The results of the simulations were analyzed by computing mixing efficiency, total pressure recovery, and stream thrust potential. As the experimental effort progresses, the simulation results will be compared with the experimental data to calibrate the modeling constants present in the CFD and validate simulation fidelity. CFD will also be used to
Pollock on probability in epistemology
Fitelson, Branden
2010-01-01
In Thinking and Acting John Pollock offers some criticisms of Bayesian epistemology, and he defends an alternative understanding of the role of probability in epistemology. Here, I defend the Bayesian against some of Pollock's criticisms, and I discuss a potential problem for Pollock's alternative account.
ESTIMATION OF AGE TRANSITION PROBABILITIES.
ZINTER, JUDITH R.
THIS NOTE DESCRIBES THE PROCEDURES USED IN DETERMINING DYNAMOD II AGE TRANSITION MATRICES. A SEPARATE MATRIX FOR EACH SEX-RACE GROUP IS DEVELOPED. THESE MATRICES WILL BE USED AS AN AID IN ESTIMATING THE TRANSITION PROBABILITIES IN THE LARGER DYNAMOD II MATRIX RELATING AGE TO OCCUPATIONAL CATEGORIES. THREE STEPS WERE USED IN THE PROCEDURE--(1)…
Transition probability and preferential gauge
Chen, C.Y.
1999-01-01
This paper is concerned with whether or not the preferential gauge can ensure the uniqueness and correctness of results obtained from the standard time-dependent perturbation theory, in which the transition probability is formulated in terms of matrix elements of Hamiltonian.
Quantum correlations; quantum probability approach
Majewski, W A
2014-01-01
This survey gives a comprehensive account of quantum correlations understood as a phenomenon stemming from the rules of quantization. Centered on quantum probability it describes the physical concepts related to correlations (both classical and quantum), mathematical structures, and their consequences. These include the canonical form of classical correlation functionals, general definitions of separable (entangled) states, definition and analysis of quantumness of correlations, description o...
Diverse Consequences of Algorithmic Probability
Özkural, Eray
2011-01-01
We reminisce and discuss applications of algorithmic probability to a wide range of problems in artificial intelligence, philosophy and technological society. We propose that Solomonoff has effectively axiomatized the field of artificial intelligence, therefore establishing it as a rigorous scientific discipline. We also relate to our own work in incremental machine learning and philosophy of complexity.
Exact Probability Distribution versus Entropy
Kerstin Andersson
2014-10-01
Full Text Available The problem addressed concerns the determination of the average number of successive attempts of guessing a word of a certain length consisting of letters with given probabilities of occurrence. Both first- and second-order approximations to a natural language are considered. The guessing strategy used is guessing words in decreasing order of probability. When word and alphabet sizes are large, approximations are necessary in order to estimate the number of guesses. Several kinds of approximations are discussed demonstrating moderate requirements regarding both memory and central processing unit (CPU time. When considering realistic sizes of alphabets and words (100, the number of guesses can be estimated within minutes with reasonable accuracy (a few percent and may therefore constitute an alternative to, e.g., various entropy expressions. For many probability distributions, the density of the logarithm of probability products is close to a normal distribution. For those cases, it is possible to derive an analytical expression for the average number of guesses. The proportion of guesses needed on average compared to the total number decreases almost exponentially with the word length. The leading term in an asymptotic expansion can be used to estimate the number of guesses for large word lengths. Comparisons with analytical lower bounds and entropy expressions are also provided.
Stretching Probability Explorations with Geoboards
Wheeler, Ann; Champion, Joe
2016-01-01
Students are faced with many transitions in their middle school mathematics classes. To build knowledge, skills, and confidence in the key areas of algebra and geometry, students often need to practice using numbers and polygons in a variety of contexts. Teachers also want students to explore ideas from probability and statistics. Teachers know…
GPS: Geometry, Probability, and Statistics
Field, Mike
2012-01-01
It might be said that for most occupations there is now less of a need for mathematics than there was say fifty years ago. But, the author argues, geometry, probability, and statistics constitute essential knowledge for everyone. Maybe not the geometry of Euclid, but certainly geometrical ways of thinking that might enable us to describe the world…
Fuzzy Markov chains: uncertain probabilities
James J. Buckley; Eslami, Esfandiar
2002-01-01
We consider finite Markov chains where there are uncertainties in some of the transition probabilities. These uncertainties are modeled by fuzzy numbers. Using a restricted fuzzy matrix multiplication we investigate the properties of regular, and absorbing, fuzzy Markov chains and show that the basic properties of these classical Markov chains generalize to fuzzy Markov chains.
Probability representations of fuzzy systems
LI Hongxing
2006-01-01
In this paper, the probability significance of fuzzy systems is revealed. It is pointed out that COG method, a defuzzification technique used commonly in fuzzy systems, is reasonable and is the optimal method in the sense of mean square. Based on different fuzzy implication operators, several typical probability distributions such as Zadeh distribution, Mamdani distribution, Lukasiewicz distribution, etc. are given. Those distributions act as "inner kernels" of fuzzy systems. Furthermore, by some properties of probability distributions of fuzzy systems, it is also demonstrated that CRI method, proposed by Zadeh, for constructing fuzzy systems is basically reasonable and effective. Besides, the special action of uniform probability distributions in fuzzy systems is characterized. Finally, the relationship between CRI method and triple I method is discussed. In the sense of construction of fuzzy systems, when restricting three fuzzy implication operators in triple I method to the same operator, CRI method and triple I method may be related in the following three basic ways: 1) Two methods are equivalent; 2) the latter is a degeneration of the former; 3) the latter is trivial whereas the former is not. When three fuzzy implication operators in triple I method are not restricted to the same operator, CRI method is a special case of triple I method; that is, triple I method is a more comprehensive algorithm. Since triple I method has a good logical foundation and comprises an idea of optimization of reasoning, triple I method will possess a beautiful vista of application.
A Novel Approach to Probability
Kafri, Oded
2016-01-01
When P indistinguishable balls are randomly distributed among L distinguishable boxes, and considering the dense system in which P much greater than L, our natural intuition tells us that the box with the average number of balls has the highest probability and that none of boxes are empty; however in reality, the probability of the empty box is always the highest. This fact is with contradistinction to sparse system in which the number of balls is smaller than the number of boxes (i.e. energy distribution in gas) in which the average value has the highest probability. Here we show that when we postulate the requirement that all possible configurations of balls in the boxes have equal probabilities, a realistic "long tail" distribution is obtained. This formalism when applied for sparse systems converges to distributions in which the average is preferred. We calculate some of the distributions resulted from this postulate and obtain most of the known distributions in nature, namely, Zipf law, Benford law, part...
Probability as a Physical Motive
Peter Martin
2007-04-01
Full Text Available Recent theoretical progress in nonequilibrium thermodynamics, linking thephysical principle of Maximum Entropy Production (Ã¢Â€ÂœMEPÃ¢Â€Â to the information-theoreticalÃ¢Â€ÂœMaxEntÃ¢Â€Â principle of scientific inference, together with conjectures from theoreticalphysics that there may be no fundamental causal laws but only probabilities for physicalprocesses, and from evolutionary theory that biological systems expand Ã¢Â€Âœthe adjacentpossibleÃ¢Â€Â as rapidly as possible, all lend credence to the proposition that probability shouldbe recognized as a fundamental physical motive. It is further proposed that spatial order andtemporal order are two aspects of the same thing, and that this is the essence of the secondlaw of thermodynamics.
Probability densities in strong turbulence
Yakhot, Victor
2006-03-01
In this work we, using Mellin’s transform combined with the Gaussian large-scale boundary condition, calculate probability densities (PDFs) of velocity increments P(δu,r), velocity derivatives P(u,r) and the PDF of the fluctuating dissipation scales Q(η,Re), where Re is the large-scale Reynolds number. The resulting expressions strongly deviate from the Log-normal PDF P(δu,r) often quoted in the literature. It is shown that the probability density of the small-scale velocity fluctuations includes information about the large (integral) scale dynamics which is responsible for the deviation of P(δu,r) from P(δu,r). An expression for the function D(h) of the multifractal theory, free from spurious logarithms recently discussed in [U. Frisch, M. Martins Afonso, A. Mazzino, V. Yakhot, J. Fluid Mech. 542 (2005) 97] is also obtained.
Probability, Information and Statistical Physics
Kuzemsky, A. L.
2016-03-01
In this short survey review we discuss foundational issues of the probabilistic approach to information theory and statistical mechanics from a unified standpoint. Emphasis is on the inter-relations between theories. The basic aim is tutorial, i.e. to carry out a basic introduction to the analysis and applications of probabilistic concepts to the description of various aspects of complexity and stochasticity. We consider probability as a foundational concept in statistical mechanics and review selected advances in the theoretical understanding of interrelation of the probability, information and statistical description with regard to basic notions of statistical mechanics of complex systems. It includes also a synthesis of past and present researches and a survey of methodology. The purpose of this terse overview is to discuss and partially describe those probabilistic methods and approaches that are used in statistical mechanics with the purpose of making these ideas easier to understanding and to apply.
Sm Transition Probabilities and Abundances
Lawler, J E; Sneden, C; Cowan, J J
2005-01-01
Radiative lifetimes, accurate to +/- 5%, have been measured for 212 odd-parity levels of Sm II using laser-induced fluorescence. The lifetimes are combined with branching fractions measured using Fourier-transform spectrometry to determine transition probabilities for more than 900 lines of Sm II. This work is the largest-scale laboratory study to date of Sm II transition probabilities using modern methods. This improved data set has been used to determine a new solar photospheric Sm abundance, log epsilon = 1.00 +/- 0.03, from 26 lines. The spectra of three very metal-poor, neutron-capture-rich stars also have been analyzed, employing between 55 and 72 Sm II lines per star. The abundance ratios of Sm relative to other rare earth elements in these stars are in agreement, and are consistent with ratios expected from rapid neutron-capture nucleosynthesis (the r-process).
Probability biases as Bayesian inference
Andre; C. R. Martins
2006-11-01
Full Text Available In this article, I will show how several observed biases in human probabilistic reasoning can be partially explained as good heuristics for making inferences in an environment where probabilities have uncertainties associated to them. Previous results show that the weight functions and the observed violations of coalescing and stochastic dominance can be understood from a Bayesian point of view. We will review those results and see that Bayesian methods should also be used as part of the explanation behind other known biases. That means that, although the observed errors are still errors under the be understood as adaptations to the solution of real life problems. Heuristics that allow fast evaluations and mimic a Bayesian inference would be an evolutionary advantage, since they would give us an efficient way of making decisions. %XX In that sense, it should be no surprise that humans reason with % probability as it has been observed.
The probability of extraterrestrial life
Since beginning of times, human beings need to live in the company of other humans, beings developing what we now know as human societies. According to this idea there has been speculation, specially in the present century, about the possibility that human society has company of the other thinking creatures living in other planets somewhere in our galaxy. In this talk we will only use reliable data from scientific observers in order to establish a probability. We will explain the analysis on the physico-chemical principles which allow the evolution of organic molecules in our planet and establish these as the forerunners of life in our planet. On the other hand, the physical process governing stars, their characteristics and their effects on planets will also be explained as well as the amount of energy that a planet receives, its mass, atmosphere and kind of orbit. Finally, considering all this information, a probability of life from outer space will be given. (Author)
Classical Probability and Quantum Outcomes
James D. Malley
2014-05-01
Full Text Available There is a contact problem between classical probability and quantum outcomes. Thus, a standard result from classical probability on the existence of joint distributions ultimately implies that all quantum observables must commute. An essential task here is a closer identification of this conflict based on deriving commutativity from the weakest possible assumptions, and showing that stronger assumptions in some of the existing no-go proofs are unnecessary. An example of an unnecessary assumption in such proofs is an entangled system involving nonlocal observables. Another example involves the Kochen-Specker hidden variable model, features of which are also not needed to derive commutativity. A diagram is provided by which user-selected projectors can be easily assembled into many new, graphical no-go proofs.
Large deviations and idempotent probability
Puhalskii, Anatolii
2001-01-01
In the view of many probabilists, author Anatolii Puhalskii''s research results stand among the most significant achievements in the modern theory of large deviations. In fact, his work marked a turning point in the depth of our understanding of the connections between the large deviation principle (LDP) and well-known methods for establishing weak convergence results.Large Deviations and Idempotent Probability expounds upon the recent methodology of building large deviation theory along the lines of weak convergence theory. The author develops an idempotent (or maxitive) probability theory, introduces idempotent analogues of martingales (maxingales), Wiener and Poisson processes, and Ito differential equations, and studies their properties. The large deviation principle for stochastic processes is formulated as a certain type of convergence of stochastic processes to idempotent processes. The author calls this large deviation convergence.The approach to establishing large deviation convergence uses novel com...
Relative transition probabilities of cobalt
Roig, R. A.; Miller, M. H.
1974-01-01
Results of determinations of neutral-cobalt transition probabilities measured relative to Co I 4150.43 A and Co II 4145.15 A, using a gas-driven shock tube as the spectroscopic light source. Results are presented for 139 Co I lines in the range from 3940 to 6640 A and 11 Co II lines in the range from 3840 to 4730 A, which are estimated to have reliabilities ranging from 8 to 50%.
Probability for primordial black holes
Bousso, R.; Hawking, S. W.
1995-11-01
We consider two quantum cosmological models with a massive scalar field: an ordinary Friedmann universe and a universe containing primordial black holes. For both models we discuss the complex solutions to the Euclidean Einstein equations. Using the probability measure obtained from the Hartle-Hawking no-boundary proposal we find that the only unsuppressed black holes start at the Planck size but can grow with the horizon scale during the roll down of the scalar field to the minimum.
Tight Bernoulli tail probability bounds
Dzindzalieta, Dainius
2014-01-01
The purpose of the dissertation is to prove universal tight bounds for deviation from the mean probability inequalities for functions of random variables. Universal bounds shows that they are uniform with respect to some class of distributions and quantity of variables and other parameters. The bounds are called tight, if we can construct a sequence of random variables, such that the upper bounds are achieved. Such inequalities are useful for example in insurance mathematics, for constructing...
Probability distributions of landslide volumes
M. T. Brunetti; Guzzetti, F.; M. Rossi
2009-01-01
We examine 19 datasets with measurements of landslide volume, VL, for sub-aerial, submarine, and extraterrestrial mass movements. Individual datasets include from 17 to 1019 landslides of different types, including rock fall, rock slide, rock avalanche, soil slide, slide, and debris flow, with individual landslide volumes ranging over 10−4 m3≤VL≤1013 m3. We determine the probability density of landslide volumes, p(VL), using kernel density estimation. Each landslide...
Snyder, Hannah; Yeldandi, Vijay V.; Kumar, G. Prem; Liao, Chuanhong; Lakshmi, Vemu; Gandham, Sabitha R.; Muppudi, Uma; Oruganti, Ganesh; Schneider, John A
2012-01-01
In India, men who have sex with men (MSM) and truck drivers are high-risk groups that often do not access HIV testing due to stigma and high mobility. This study evaluated a field testing package (FTP) that identified HIV positive participants through video pre-test counseling, OraQuick oral fluid HIV testing, and telephonic post-test counseling and then connected them to government facilities. 598 MSM and truck drivers participated in the FTP and completed surveys covering sociodemographics,...
Generating target probability sequences and events
Ella, Vaignana Spoorthy
2013-01-01
Cryptography and simulation of systems require that events of pre-defined probability be generated. This paper presents methods to generate target probability events based on the oblivious transfer protocol and target probabilistic sequences using probability distribution functions.
To assess the performance of acoustic radiation force impulse (ARFI) imaging for identification of malignant liver lesions using meta-analysis. PubMed, the Cochrane Library, the ISI Web of Knowledge and the China National Knowledge Infrastructure were searched. The studies published in English or Chinese relating to evaluation accuracy of ARFI imaging for identification of malignant liver lesions were collected. A hierarchical summary receiver operating characteristic (HSROC) curve was used to examine the ARFI imaging accuracy. Clinical utility of ARFI imaging for identification of malignant liver lesions was evaluated by Fagan plot analysis. A total of eight studies which included 590 liver lesions were analysed. The summary sensitivity and specificity for identification of malignant liver lesions were 0.86 (95 % confidence interval (CI) 0.74-0.93) and 0.89 (95 % CI 0.81-0.94), respectively. The HSROC was 0.94 (95 % CI 0.91-0.96). After ARFI imaging results over the cut-off value for malignant liver lesions (''positive'' result), the corresponding post-test probability for the presence (if pre-test probability was 50 %) was 89 %; in ''negative'' measurement, the post-test probability was 13 %. ARFI imaging has a high accuracy in the classification of liver lesions. (orig.)
Probability, Statistics, and Stochastic Processes
Olofsson, Peter
2012-01-01
This book provides a unique and balanced approach to probability, statistics, and stochastic processes. Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area. The Second Edition features new coverage of analysis of variance (ANOVA), consistency and efficiency of estimators, asymptotic theory for maximum likelihood estimators, empirical distribution function and the Kolmogorov-Smirnov test, general linear models, multiple comparisons, Markov chain Monte Carlo (MCMC), Brownian motion, martingales, and
Probability of Detection Demonstration Transferability
Parker, Bradford H.
2008-01-01
The ongoing Mars Science Laboratory (MSL) Propellant Tank Penetrant Nondestructive Evaluation (NDE) Probability of Detection (POD) Assessment (NESC activity) has surfaced several issues associated with liquid penetrant POD demonstration testing. This presentation lists factors that may influence the transferability of POD demonstration tests. Initial testing will address the liquid penetrant inspection technique. Some of the factors to be considered in this task are crack aspect ratio, the extent of the crack opening, the material and the distance between the inspection surface and the inspector's eye.
Estimating Probabilities in Recommendation Systems
Sun, Mingxuan; Kidwell, Paul
2010-01-01
Recommendation systems are emerging as an important business application with significant economic impact. Currently popular systems include Amazon's book recommendations, Netflix's movie recommendations, and Pandora's music recommendations. In this paper we address the problem of estimating probabilities associated with recommendation system data using non-parametric kernel smoothing. In our estimation we interpret missing items as randomly censored observations and obtain efficient computation schemes using combinatorial properties of generating functions. We demonstrate our approach with several case studies involving real world movie recommendation data. The results are comparable with state-of-the-art techniques while also providing probabilistic preference estimates outside the scope of traditional recommender systems.