WorldWideScience

Sample records for intermediate pre-test probability

  1. The accuracy of clinical and biochemical estimates in defining the pre-test probability of pulmonary embolism

    International Nuclear Information System (INIS)

    Garvie, N.W.; Salehzahi, F.; Kuitert, L.

    2002-01-01

    Full text: The PIOPED survey confirmed the significance of the high probability ventilation/perfusion scan (HP V/Q scan) in establishing the diagnosis of pulmonary embolism (PE). In an interesting sentence, however, the authors indicated that 'the clinicians' assessment of the likelihood of PE (prior probability)' can substantially increase the predictive value of the investigation. The criteria used for this assessment were not published, and this statement conflicts with the belief that the clinical diagnosis of pulmonary embolism is unreliable. A medical history was obtained from 668 patients undergoing V/Q lung scans for suspected PE, and certain clinical features linked to PE were, when present, documented. These included pleuritic chest pain, haemoptysis, dyspnoea, clinical evidence of DVT, recent surgery and a right ventricular strain pattern on ECG. D-Dimer levels and initial arterial oxygen saturation (PaO2) levels were also obtained. The prevalence of these clinical and biochemical criteria was then compared between HP (61) and normal (171) scans, after exclusion of all equivocal or intermediate scan outcomes (436), where lung scintigraphy was unable to provide a definite diagnosis. D-Dimer and/or oxygen saturation levels were similarly compared in each group. A true positive result was scored for each clinical or biochemical criterion when linked with a high probability scan and, conversely, a false positive score when the scan outcome was normal. In this fashion, the positive predictive value (PPV) and, when appropriate, the negative predictive value (NPV) were obtained for each risk factor. In the context of PE, DVT and post-operative status proved the most reliable predictors of a high probability outcome. Where both features were present, the PPV rose to 0.57. A normal D-Dimer level was a better excluder of PE than a normal oxygen saturation level (NPV 0.78 vs 0.44). Conversely, a raised D-Dimer, or reduced oxygen saturation, were both of little value in
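As a worked illustration of the scoring described above, the sketch below computes PPV and NPV for a single risk factor from a 2x2 table of criterion presence against scan outcome. The counts are hypothetical, not taken from the study.

```python
# Hedged sketch: PPV/NPV for one clinical criterion scored against scan outcome.
# tp = criterion present & HP scan,  fp = criterion present & normal scan,
# tn = criterion absent & normal scan, fn = criterion absent & HP scan.

def predictive_values(tp: int, fp: int, tn: int, fn: int) -> tuple:
    """Return (PPV, NPV) from a 2x2 table of criterion vs scan outcome."""
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    return ppv, npv

# Hypothetical counts spread over the 61 HP and 171 normal scans:
ppv, npv = predictive_values(tp=20, fp=15, tn=156, fn=41)
print(f"PPV = {ppv:.2f}, NPV = {npv:.2f}")
```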

  2. Pre-test probability risk scores and their use in contemporary management of patients with chest pain: One year stress echo cohort study

    Science.gov (United States)

    Demarco, Daniela Cassar; Papachristidis, Alexandros; Roper, Damian; Tsironis, Ioannis; Byrne, Jonathan; Monaghan, Mark

    2015-01-01

    Objectives: To compare how patients with chest pain would be investigated based on the two guidelines available to UK cardiologists for the management of patients with stable chest pain: the UK National Institute for Health and Clinical Excellence (NICE) guideline published in 2010, and the European Society of Cardiology (ESC) guideline published in 2013. Both guidelines use pre-test probability risk scores to guide the choice of investigation. Design: We undertook a large retrospective study to investigate the outcomes of stress echocardiography. Setting: A large tertiary centre in the UK in contemporary clinical practice. Participants: Two thirds of the patients in the cohort were referred from our rapid access chest pain clinics. Results: We found that the NICE risk score overestimates risk by 20% compared with the ESC risk score. We also found that, based on the NICE guidelines, 44% of the patients presenting with chest pain in this cohort would have been investigated invasively, with diagnostic coronary angiography. Using the ESC guidelines, only 0.3% of the patients would be investigated invasively. Conclusion: The large discrepancy between the two guidelines could easily be reduced if NICE adopted the ESC risk score. PMID:26673458
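To make the role of the pre-test probability concrete, here is a minimal, hypothetical triage sketch in which a single probability estimate routes the patient to an investigation. The thresholds below are invented placeholders for illustration only, not the actual NICE 2010 or ESC 2013 cut-offs.

```python
# Illustrative only: thresholds are placeholders, not the guideline values.

def choose_investigation(ptp: float) -> str:
    """Route a patient given a pre-test probability of CAD in [0, 1]."""
    if ptp < 0.15:
        return "consider other diagnoses / no routine testing"
    elif ptp < 0.85:
        return "non-invasive imaging (e.g. stress echocardiography)"
    else:
        return "invasive coronary angiography"

print(choose_investigation(0.40))  # intermediate estimate -> non-invasive test
```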

  3. Subsequent investigation and management of patients with intermediate-category and -probability ventilation-perfusion scintigraphy

    International Nuclear Information System (INIS)

    Walsh, G.; Jones, D.N.

    2000-01-01

    The authors wished to determine the proportion of patients with intermediate-category and intermediate-probability ventilation-perfusion scintigraphy (IVQS) who proceed to further imaging for investigation of thromboembolism, to identify the defining clinical parameters and to determine the proportion of patients who have a definite imaging diagnosis of thromboembolism prior to discharge from hospital on anticoagulation therapy. One hundred and twelve VQS studies performed at the Flinders Medical Centre over a 9-month period were reported as having intermediate category and probability for pulmonary embolism. Medical case notes were available for review in 99 of these patients, and from these the pretest clinical probability, subsequent patient progress and treatment were recorded. Eight cases were excluded because they were already receiving anticoagulation therapy. In the remaining 91 patients the pretest clinical probability was considered to be low in 25, intermediate in 30 and high in 36 cases. In total, 51.6% (n = 47) of these patients (8% (n = 2) with low, 66% (n = 20) with intermediate, and 69.4% (n = 25) with high pretest probability) proceeded to CT pulmonary angiography (CTPA) and/or lower limb duplex Doppler ultrasound (DUS) evaluation. Of the patients with IVQS results, 30.7% (n = 28) were evaluated with CTPA. No patient with a low, all patients with a high and 46% of patients with an intermediate pretest probability initially received anticoagulation therapy. This was discontinued in three patients with high and in 12 patients with intermediate clinical probability prior to discharge from hospital. Overall, 40% of patients discharged on anticoagulation therapy (including 39% of those with a high pretest probability) had a positive imaging diagnosis of thromboembolism. The results suggest that, although the majority of patients with intermediate-to-high pretest probability and IVQS proceed to further imaging investigation, CTPA is relatively underused in

  4. Probability

    CERN Document Server

    Shiryaev, A N

    1996-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, ergodic theory, weak convergence of probability measures, stationary stochastic processes, and the Kalman-Bucy filter. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for self-study. This new edition contains substantial revisions and updated references. The reader will find a deeper study of topics such as the distance between probability measures, metrization of weak convergence, and contiguity of probability measures. Proofs for a number of important results which were merely stated in the first edition have been added. The author has included new material on the probability of large deviations and on the central limit theorem for sums of dependent random variables.

  5. The role of spiral CT in patients with intermediate probability V/Q scans: can spiral CT replace pulmonary angiography?

    International Nuclear Information System (INIS)

    Vu, T.; Glenn, D.; Lovett, I.; Moses, J.; Wadhwa, S.S.; Nour, R.

    2000-01-01

    Full text: Spiral CT (SCT) has been advocated as a replacement for pulmonary angiography (PA) in patients with intermediate probability (IP) ventilation-perfusion lung scans (V/Q). More generally, it has been proposed as a replacement for V/Q to detect pulmonary embolism. This study investigates the accuracy of SCT in the IP patient group. Thirty-one patients with IP scans (modified PIOPED criteria) who were not at high risk of contrast nephrotoxicity were enrolled to have both SCT and PA within the 24 hours following their V/Q. Patients were classified as IP due to a single segmental mismatch (n=7), a matched V/Q abnormality corresponding to a CXR opacity (n=21), or both (n=3). PA was taken as the gold standard for the detection of PE. SCT was read by an experienced radiologist blinded to the PA results. SCT was performed according to standard protocol. All SCT were technically satisfactory for interpretation. Pulmonary embolism was present in 9/31 patients (29%). Of the patients with PE detected by PA, SCT was positive in 4 (44% sensitivity). Of the 22 patients who did not have PE, SCT was negative in 21 and positive in one (96% specificity). In conclusion, SCT has limited sensitivity for the detection of PE in patients with IP lung scans. SCT may not be an adequate replacement for PA. Copyright (2000) The Australian and New Zealand Society of Nuclear Medicine Inc
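The accuracy figures above follow directly from the 2x2 counts reported against the PA reference standard; the sketch below reproduces that arithmetic (TP=4, FN=5, TN=21, FP=1).

```python
# Sensitivity/specificity from the study's reported counts (PA as reference).
tp, fn, tn, fp = 4, 5, 21, 1

sensitivity = tp / (tp + fn)   # 4/9  ~= 0.44
specificity = tn / (tn + fp)   # 21/22 ~= 0.955 (reported as 96% in the abstract)
print(f"sensitivity = {sensitivity:.1%}, specificity = {specificity:.1%}")
```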

  6. 40 CFR 1065.520 - Pre-test verification procedures and pre-test data collection.

    Science.gov (United States)

    2010-07-01

    ... corrective action does not resolve the deficiency, you may request to use the contaminated system as an... 40 Protection of Environment 32 2010-07-01 2010-07-01 false Pre-test verification procedures and pre-test data collection. 1065.520 Section 1065.520 Protection of Environment ENVIRONMENTAL PROTECTION...

  7. L-myo-inosose-1 as a probable intermediate in the reaction catalyzed by myo-inositol oxygenase

    International Nuclear Information System (INIS)

    Naber, N.I.; Swan, J.S.; Hamilton, G.A.

    1986-01-01

    In previous investigations, it was necessary to have Fe(II) and cysteine present in order to assay the catalytic activity of purified hog kidney myo-inositol oxygenase. In the present study it was found that, if this purified nonheme iron enzyme is slowly frozen in solution with glutathione and stored at -20 degrees C, it is fully active in the absence of activators if catalase is present to remove adventitious H2O2. With this simpler assay system it was possible to clarify the effects of several variables on the enzymic reaction. Thus, the maximum velocity is pH-dependent with a maximum around pH 9.5, but the apparent Km for myo-inositol (air atmosphere) remains constant at 5.0 mM throughout a broad pH range. The enzyme is quite specific for its substrate myo-inositol, is very sensitive to oxidants and reductants, but is not affected by a variety of complexing agents, nucleotides, sulfhydryl reagents, etc. In other experiments it was found that L-myo-inosose-1, a potential intermediate in the enzymic reaction, is a potent competitive inhibitor (Ki = 62 microM), while other inososes and a solution thought to contain D-glucodialdehyde, another potential intermediate, are weak inhibitors. Also, both a kinetic deuterium isotope effect (kH/kD = 2.1) and a tritium isotope effect (kH/kT = 7.5) are observed for the enzymic reaction when [1-2H]- and [1-3H]-myo-inositol are used as reactants. These latter results are considered strong evidence that the oxygenase reaction proceeds by a pathway involving L-myo-inosose-1 as an intermediate rather than by an alternative pathway that would have D-glucodialdehyde as the intermediate
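For reference, competitive inhibition of the kind reported for L-myo-inosose-1 (Ki = 62 microM) enters the standard Michaelis-Menten rate law as shown below; this is the general textbook form, not an equation quoted from the paper.

```latex
% Michaelis-Menten kinetics with a competitive inhibitor I of constant K_i:
v = \frac{V_{\max}\,[S]}{K_m\!\left(1 + \frac{[I]}{K_i}\right) + [S]}
```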

  8. Determination of diffuseness parameter to estimate the survival probability of projectile using Woods-Saxon formula at intermediate beam energies

    International Nuclear Information System (INIS)

    Kumar, Rajiv; Goyal, Monika; Roshni; Singh, Pradeep; Kharab, Rajesh

    2017-01-01

    In the present work, the S-matrix has been evaluated by using the simple Woods-Saxon formula as well as the realistic expression for a number of projectiles varying from 26Ne to 76Ge at intermediate incident beam energies ranging from 30 MeV/A to 300 MeV/A. The target is 197Au in each case. The realistic S-matrix is compared with that obtained by using the simple Woods-Saxon formula. The motive of this comparison is to fix the value of the otherwise free parameter Δ, so that the much more involved evaluation of the realistic S-matrix can be replaced by the simple Woods-Saxon formula.
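A common Woods-Saxon (strong-absorption) parameterization of the S-matrix profile in impact parameter b has the form below, with radius R0 and diffuseness Δ; the exact convention used in the paper may differ, so treat this as a generic sketch.

```latex
% Woods-Saxon profile: |S(b)| -> 0 for b << R_0 (absorption), -> 1 for b >> R_0.
|S(b)| = \left[\,1 + \exp\!\left(\frac{R_0 - b}{\Delta}\right)\right]^{-1}
```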

  9. Relation between exercise-induced ventricular arrhythmias and myocardial perfusion abnormalities in patients with intermediate pretest probability of coronary artery disease

    International Nuclear Information System (INIS)

    Elhendy, A.; Sozzi, F.B.; Van Domburg, R.T.; Bax, J.J.; Roelandt, J.R.T.C.

    2000-01-01

    We studied 302 patients (mean age 54±9 years, 152 men and 150 women) with intermediate pretest probability of CAD (range = 0.25-0.80, mean = 0.43±0.20) by upright bicycle exercise stress test in conjunction with technetium-99m single-photon emission tomography (SPET) imaging. Exercise-induced VAs (frequent or complex premature ventricular contractions or ventricular tachycardia) occurred in 65 patients (22%). No significant difference was found between patients with and patients without VAs regarding the pretest probability of CAD (0.45±0.21 vs 0.43±0.20). Patients with exercise-induced VAs had a higher prevalence of perfusion abnormalities (52% vs 26%, P=0.002) and ischaemic electrocardiographic changes (31% vs 16%, P<0.05) compared with patients without VAs. A higher prevalence of perfusion abnormalities in patients with VAs was observed in both men (67% vs 35%, P<0.01) and women (38% vs 16%, P<0.05). However, the positive predictive value of exercise-induced VAs for the presence of myocardial perfusion abnormalities was higher in men than in women (67% vs 38%, P<0.05). The presence of abnormal myocardial perfusion was the only independent predictor of exercise-induced VAs (OR 2.2; 95% CI, 1.2-4.2) by multivariate analysis of clinical and stress test variables. It is concluded that in patients with intermediate pretest probability of CAD, exercise-induced VAs are predictive of a higher prevalence of myocardial perfusion abnormalities in both men and women. However, the positive predictive value of exercise-induced VAs for perfusion abnormalities is higher in men. Because of the underestimation of ischaemia by electrocardiographic changes, exercise-induced VAs should be interpreted as a marker of a higher probability of CAD.
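As a sketch of the kind of association measure reported above, the snippet below computes a univariate odds ratio with a Woolf 95% confidence interval from a 2x2 table. The counts are reconstructed approximately from the reported prevalences (52% of 65 patients with VAs vs 26% of 237 without) and are illustrative; they are not the paper's multivariate model.

```python
# Hedged sketch: univariate OR with a Woolf (log) 95% CI from approximate counts.
import math

def odds_ratio_ci(a: int, b: int, c: int, d: int, z: float = 1.96):
    """a,b: abnormal/normal perfusion with VAs; c,d: the same without VAs."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)      # standard error of log(OR)
    return or_, or_ * math.exp(-z * se), or_ * math.exp(z * se)

print(odds_ratio_ci(34, 31, 62, 175))           # approximate, illustrative counts
```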

  10. IPHAS J025827.88+635234.9 and IPHAS J051814.33+294113.0: Two probable eclipsing intermediate polars

    Science.gov (United States)

    Joshi, Arti; Pandey, Jeewan Chandra

    2018-04-01

    We present photometry in the R band and linear polarimetry of two cataclysmic variables, namely IPHAS J025827.88+635234.9 and IPHAS J051814.33+294113.0. The data were obtained from 1-m class telescopes of the Aryabhatta Research Institute of Observational Sciences (ARIES; Nainital, India). In addition to the deep eclipse, strong short-period oscillations are also found. We derived a pulse period of (1203 ± 25) s for IPHAS J025827.88+635234.9 and (3277 ± 81) s for IPHAS J051814.33+294113.0. The presence of both orbital and spin modulations in these systems indicates that they belong to the class of intermediate polars. The full width at half depth of the eclipse is also found to vary from epoch to epoch for IPHAS J025827.88+635234.9. The presence of strong, variable linear polarization in these two sources indicates that the systems possess strong magnetic fields.

  11. Ocular oxyspirurosis of primates in zoos: intermediate host, worm morphology, and probable origin of the infection in the Moscow zoo

    Directory of Open Access Journals (Sweden)

    Ivanova E.

    2007-12-01

    Full text: Over the last century, only two cases of ocular oxyspirurosis were recorded in primates, both in zoos, and two species were described: in Berlin, Germany, Oxyspirura (O.) conjunctivalis from the lemurid Microcebus murinus, later also found in the lorisid Loris gracilis; in Jacksonville, Florida, O. (O.) youngi from the cercopithecid monkey Erythrocebus patas. In the present case from the Moscow zoo, oxyspirurosis was recorded in several species of Old World lemuriforms and lorisiforms, and some South American monkeys. (i) The intermediate host was discovered to be a cockroach, as for O. (O.) mansoni, a parasite of poultry. The complete sequence identity between ITS-1 rDNA from adult nematodes of the primates and that of the larval worms from the vector, Nauphoete cinerea, confirmed their conspecificity. (ii) Parasites from the Moscow zoo recovered from Nycticebus c. coucang were compared morphologically to those from other zoos. The length and shape of the gubernaculum, used previously as a distinguishing character, were found to be variable. However, the arrangement of the vulvar bosses, the distal extremity of the left spicule and the position of the papillae of the first postcloacal pair showed that the worms in the different samples were not exactly identical and that each set seemed characteristic of a particular zoo. (iii) The presence of longitudinal cuticular crests in the infective stage as well as in adult worms was recorded. Together with several other morphological and biological characters (long tail and oesophagus, cockroach vector), this confirmed that Oxyspirura is not closely related to Thelazia, another ocular parasite genus. (iv) The disease in the Moscow zoo is thought to have started with Nycticebus pygmaeus imported from Vietnam, thus the suggestion is that Asiatic lorisids were at the origin of the Moscow set of cases. The natural host(s) for the Berlin and Jacksonville cases remain unknown, but they are unlikely to be the species found infected in zoos

  12. 40 CFR 89.406 - Pre-test procedures.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Pre-test procedures. 89.406 Section 89.406 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED... Procedures § 89.406 Pre-test procedures. (a) Allow a minimum of 30 minutes warmup in the standby or operating...

  13. 40 CFR 90.408 - Pre-test procedures.

    Science.gov (United States)

    2010-07-01

    ....408 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED... during service accumulation is allowed only in accordance with § 90.118. (b) Engine pre-test preparation... by § 90.324(a). If necessary, allow the heated sample line, filters, and pumps to reach operating...

  14. 40 CFR 91.408 - Pre-test procedures.

    Science.gov (United States)

    2010-07-01

    ....408 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED... accordance with § 91.117. (b) Engine pre-test preparation. (1) Drain and charge the fuel tank(s) with the..., including the sample probe, using mode 1 from Table 2 in appendix A of this subpart. The emission sampling...

  15. Cognitive Laboratory Experiences: On Pre-testing Computerised Questionnaires

    NARCIS (Netherlands)

    Snijkers, G.J.M.E.

    2002-01-01

    In the literature on questionnaire design and survey methodology, pre-testing is mentioned as a way to evaluate questionnaires (i.e. investigate whether they work as intended) and control for measurement errors (i.e. assess data quality). As the American Statistical Association puts it (ASA, 1999,

  16. Computed tomography coronary angiography accuracy in women and men at low to intermediate risk of coronary artery disease

    International Nuclear Information System (INIS)

    Dharampal, Anoeshka S.; Papadopoulou, Stella L.; Rossi, Alexia; Weustink, Annick C.; Mollet, Nico R.A.; Meijboom, W. Bob; Neefjes, Lisan A.; Nieman, Koen; Feijter, Pim J. de; Boersma, Eric; Krestin, Gabriel P.

    2012-01-01

    To investigate the diagnostic accuracy of CT coronary angiography (CTCA) in women at low to intermediate pre-test probability of coronary artery disease (CAD) compared with men. In this retrospective study we included symptomatic patients at low to intermediate risk who underwent both invasive coronary angiography and CTCA. Exclusion criteria were previous revascularisation or myocardial infarction. The pre-test probability of CAD was estimated using the Duke risk score. Thresholds of less than 30% and 30-90% were used for determining low and intermediate risk, respectively. The diagnostic accuracy of CTCA in detecting obstructive CAD (≥50% lumen diameter narrowing) was calculated at the patient level. P < 0.05 was considered significant. A total of 570 patients (46% women [262/570]) were included and stratified as low risk (73% women [80/109]) or intermediate risk (39% women [182/461]). Sensitivity, specificity, PPV and NPV were not significantly different between women and men at low and intermediate risk. For women vs. men at low risk they were 97% vs. 100%, 79% vs. 90%, 80% vs. 80% and 97% vs. 100%, respectively. For intermediate risk they were 99% vs. 99%, 72% vs. 83%, 88% vs. 93% and 98% vs. 99%, respectively. CTCA has similar diagnostic accuracy in women and men at low and intermediate risk.
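The stratification rule stated above translates directly into code; the sketch below applies the Duke pre-test probability thresholds (<30% low, 30-90% intermediate) used to bin patients in this study.

```python
# Sketch of the study's risk stratification by Duke pre-test probability.

def stratify(duke_ptp: float) -> str:
    """Classify a Duke pre-test probability expressed as a fraction in [0, 1]."""
    if duke_ptp < 0.30:
        return "low risk"
    elif duke_ptp <= 0.90:
        return "intermediate risk"
    return "high risk (outside this study population)"

assert stratify(0.25) == "low risk"
assert stratify(0.43) == "intermediate risk"
```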

  17. Prediction model for recurrence probabilities after intravesical chemotherapy in patients with intermediate-risk non-muscle-invasive bladder cancer, including external validation

    NARCIS (Netherlands)

    Lammers, R.J.M.; Hendriks, J.C.M.; Rodriguez Faba, O.; Witjes, W.P.J.; Palou, J.; Witjes, J.A.

    2016-01-01

    PURPOSE: To develop a model to predict recurrence for patients with intermediate-risk (IR) non-muscle-invasive bladder cancer (NMIBC) treated with intravesical chemotherapy which can be challenging because of the heterogeneous characteristics of these patients. METHODS: Data from three Dutch trials

  18. Qualitative pre-test of Energy Star advertising : final report

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2003-01-01

    Natural Resources Canada launched a print advertising campaign and one 30-second television commercial to promote the Energy Star symbol and to acquaint the public with the program, which identifies energy-efficient products that reduce energy use, save money and reduce the greenhouse gas emissions that contribute to climate change. The Communications Branch of Natural Resources Canada wanted to pre-test the television and print ads. Each print ad focused on a particular product category, including home comfort, appliances, electronics and office equipment. A qualitative research methodology was used in the pre-testing because it is the best learning tool for understanding the range and depth of reactions toward a subject at any given time. The findings were not quantifiable because they are not representative of the population at large. Ten focus groups were surveyed in January 2003 in 5 Canadian centres, with a total of 83 participants aged 18 to 54. The target groups included people who were informed about climate change issues as well as those who were not. Participants were questioned about the Energy Star product. Findings were consistent across all 5 locations. There was some general awareness of EnerGuide on appliances in all groups, but generally a low awareness of the Energy Star symbol. Most people did not place energy efficiency as a high priority when purchasing appliances. This report presented the main findings on attitudes towards climate change, Kyoto and energy efficiency. The reaction to the television and print ads was also included, along with opinions regarding their main weaknesses and strengths. Some recommendations for improvement were also included. Samples of the print advertisements were included in both English and French. tabs., figs.

  19. Concepts of probability theory

    CERN Document Server

    Pfeiffer, Paul E

    1979-01-01

    Using the Kolmogorov model, this intermediate-level text discusses random variables, probability distributions, mathematical expectation, random processes, and more. For advanced undergraduate students of science, engineering, or math. Includes problems with answers and six appendixes. 1965 edition.

  20. Choreographer Pre-Testing Code Analysis and Operational Testing.

    Energy Technology Data Exchange (ETDEWEB)

    Fritz, David J. [Sandia National Laboratories (SNL-CA), Livermore, CA (United States); Harrison, Christopher B. [Sandia National Laboratories (SNL-CA), Livermore, CA (United States); Perr, C. W. [Sandia National Laboratories (SNL-CA), Livermore, CA (United States); Hurd, Steven A [Sandia National Laboratories (SNL-CA), Livermore, CA (United States)

    2014-07-01

    Choreographer is a "moving target defense system", designed to protect against attacks aimed at IP addresses without corresponding domain name system (DNS) lookups. It coordinates actions between a DNS server and a Network Address Translation (NAT) device to regularly change which publicly available IP addresses' traffic will be routed to the protected device versus routed to a honeypot. More details about how Choreographer operates can be found in Section 2: Introducing Choreographer. Operational considerations for the successful deployment of Choreographer can be found in Section 3. The Testing & Evaluation (T&E) for Choreographer involved 3 phases: Pre-testing, Code Analysis, and Operational Testing. Pre-testing, described in Section 4, involved installing and configuring an instance of Choreographer and verifying it would operate as expected for a simple use case. Our findings were that it was simple and straightforward to prepare a system for a Choreographer installation as well as configure Choreographer to work in a representative environment. Code Analysis, described in Section 5, consisted of running a static code analyzer (HP Fortify) and conducting dynamic analysis tests using the Valgrind instrumentation framework. Choreographer performed well, such that only a few errors that might possibly be problematic in a given operating situation were identified. Operational Testing, described in Section 6, involved operating Choreographer in a representative environment created through Emulytics™. Depending upon the amount of server resources dedicated to Choreographer vis-à-vis the amount of client traffic handled, Choreographer had varying degrees of operational success. In an environment with a poorly resourced Choreographer server and as few as 50-100 clients, Choreographer failed to properly route traffic over half the time. Yet, with a well-resourced server, Choreographer handled over 1000 clients without misrouting.

  1. On pre-test sensitisation and peer assessment to enhance learning gain in science education

    NARCIS (Netherlands)

    Bos, Floor/Floris

    2009-01-01

    *The main part of this thesis focuses on designing, optimising, and studying the embedding of two types of interventions: pre-testing and peer assessment, both supported by or combined with ICT-tools. * Pre-test sensitisation is used intentionally to boost the learning gain of the main intervention,

  2. Intermediate treatments

    Science.gov (United States)

    John R. Jones; Wayne D. Shepperd

    1985-01-01

    Intermediate treatments are those applied after a new stand is successfully established and before the final harvest. These include not only intermediate cuttings - primarily thinning - but also fertilization, irrigation, and protection of the stand from damaging agents.

  3. Pre-test evaluation of LLTR series II Test A-7

    International Nuclear Information System (INIS)

    Knittle, D.

    1981-03-01

    The purpose of this report is to present pre-test predictions of pressure histories for the A-7 test to be conducted in the Large Leak Test Rig (LLTR) at the Energy Technology Engineering Center (ETEC) in April 1981

  4. Targeting as the basis for pre-test market of lithium-ion battery

    Science.gov (United States)

    Yuniaristanto, Zakaria, R.; Saputri, V. H. L.; Sutopo, W.; Kadir, E. A.

    2017-11-01

    This article discusses market segmentation and targeting as a first step in the pre-test market of a new technology. The benefit of targeting ahead of a pre-test market is that the pre-test can then focus on the selected target markets, so that no bias is introduced during the pre-test. Determining the target market involves surveys to identify the future state of the market, so that the marketing process is not misdirected. The case study is a lithium-ion battery commercialized through a start-up company. The start-up company must be able to respond to change and to bring in and retain customers, so that it can survive, evolve and achieve its objectives. The research aims to determine the market segments and the target market effectively. The marketing strategy (segmentation and targeting) informed the questionnaire design, and cluster analysis was used in data processing. Respondents were selected by purposive sampling, yielding 80 samples. The study found three segments for the lithium-ion battery, each with its own distinguishing characteristics, two of which can be used as the company's target market.
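A minimal sketch of the clustering step described above: survey responses grouped into three segments with k-means. The feature matrix is randomly generated here; the study's actual questionnaire variables are not reproduced.

```python
# Hedged sketch: segmenting 80 questionnaire respondents into 3 clusters.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
X = rng.random((80, 5))                  # 80 respondents x 5 features (hypothetical)
model = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
print(np.bincount(model.labels_))        # respondents per segment
```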

  5. Achievement of course outcome in vector calculus pre-test questions ...

    African Journals Online (AJOL)

    No abstract. Keywords: pre-test; course outcome; Bloom taxonomy; Rasch measurement model; vector calculus.

  6. Free Fall Misconceptions: Results of a Graph Based Pre-Test of Sophomore Civil Engineering Students

    Science.gov (United States)

    Montecinos, Alicia M.

    2014-01-01

    A partially unusual behaviour was found among 14 sophomore students of civil engineering who took a pre-test for a free fall laboratory session, in the context of a general mechanics course. An analysis of the consistency between the students' mathematical models and physics models was made. In all cases, the students presented evidence favoring a correct free…

  7. Ruin probabilities

    DEFF Research Database (Denmark)

    Asmussen, Søren; Albrecher, Hansjörg

    The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle and the connection to other applied probability areas, like queueing theory. In this substantially updated and extended second version, new topics include stochastic control, fluctuation theory for Lévy processes, Gerber-Shiu functions and dependence.
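A crude Monte Carlo sketch of one of the core quantities here: the finite-horizon ruin probability in the classical Cramér-Lundberg model, with Poisson claim arrivals, Exp(1) claim sizes and premium rate c. All parameter values are illustrative.

```python
# Estimate P(ruin before T) for the reserve R(t) = u + c*t - S(t) by simulation.
import numpy as np

def ruin_prob(u=10.0, c=1.2, lam=1.0, T=100.0, n_paths=20_000, seed=0):
    rng = np.random.default_rng(seed)
    ruined = 0
    for _ in range(n_paths):
        t, s = 0.0, 0.0                      # time, aggregate claims so far
        while True:
            t += rng.exponential(1.0 / lam)  # next claim arrival (rate lam)
            if t > T:
                break
            s += rng.exponential(1.0)        # Exp(1) claim size
            if u + c * t - s < 0:            # ruin can only occur at claim times
                ruined += 1
                break
    return ruined / n_paths

print(ruin_prob())
```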

  8. Generalized Probability-Probability Plots

    NARCIS (Netherlands)

    Mushkudiani, N.A.; Einmahl, J.H.J.

    2004-01-01

    We introduce generalized Probability-Probability (P-P) plots in order to study the one-sample goodness-of-fit problem and the two-sample problem, for real-valued data. These plots, that are constructed by indexing with the class of closed intervals, globally preserve the properties of classical P-P
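For orientation, the classical one-sample P-P plot that these generalize pairs empirical CDF values with fitted model CDF values at the ordered data; the sketch below draws one for a normal sample (the standard construction, not the paper's interval-indexed version).

```python
# Classical P-P plot: empirical vs theoretical probabilities at the order statistics.
import numpy as np
from scipy import stats
import matplotlib.pyplot as plt

x = np.sort(stats.norm.rvs(size=200, random_state=0))
emp = (np.arange(1, x.size + 1) - 0.5) / x.size   # empirical probabilities
theo = stats.norm.cdf(x)                          # model probabilities
plt.plot(theo, emp, ".", ms=4)
plt.plot([0, 1], [0, 1], "k--")                   # 45-degree reference line
plt.xlabel("theoretical CDF")
plt.ylabel("empirical CDF")
plt.show()
```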

  9. Probability-1

    CERN Document Server

    Shiryaev, Albert N

    2016-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.

  10. Ignition Probability

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — USFS, State Forestry, BLM, and DOI fire occurrence point locations from 1987 to 2008 were combined and converted into a fire occurrence probability or density grid...
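The conversion described, point locations into a probability or density grid, is essentially 2D binning plus normalization; below is a sketch with synthetic coordinates (the actual extent and resolution of the dataset are not specified here).

```python
# Hedged sketch: bin fire-occurrence points into a normalized density grid.
import numpy as np

rng = np.random.default_rng(1)
lon = rng.uniform(-109.0, -103.0, 5000)           # synthetic point locations
lat = rng.uniform(31.0, 37.0, 5000)
counts, _, _ = np.histogram2d(lon, lat, bins=120)
density = counts / counts.sum()                   # empirical probability surface
```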

  11. Intermediate Fragment

    DEFF Research Database (Denmark)

    Kruse Aagaard, Anders

    2015-01-01

    This text and its connected exhibition aim to reflect on the thoughts, the processes and the outcome of the design and production of the artefact 'Intermediate Fragment', and on making as a contemporary architectural tool in general. Intermediate Fragment was made for the exhibition 'Enga... of realising an exhibition object was conceived, but expanded, refined and concretised through this process. The context of the work shown here is an interest in a tighter, deeper connection between experimentally obtained material knowledge and architectural design.

  12. Quantum Probabilities as Behavioral Probabilities

    Directory of Open Access Journals (Sweden)

    Vyacheslav I. Yukalov

    2017-03-01

    Full text: We demonstrate that behavioral probabilities of human decision makers share many common features with quantum probabilities. This does not imply that humans are quantum objects, but simply shows that the mathematics of quantum theory is applicable to the description of human decision making. The applicability of quantum rules for describing decision making is connected with the nontrivial process of making decisions in the case of composite prospects under uncertainty. Such a process involves deliberations of a decision maker when making a choice. In addition to evaluating the utilities of the considered prospects, real decision makers also appreciate their respective attractiveness. Therefore, human choice is not based solely on the utility of prospects, but includes the necessity of resolving the utility-attraction duality. In order to justify that human consciousness really functions similarly to the rules of quantum theory, we develop an approach defining human behavioral probabilities as the probabilities determined by quantum rules. We show that quantum behavioral probabilities of humans do not merely explain qualitatively how human decisions are made, but they predict quantitative values of the behavioral probabilities. Analyzing a large set of empirical data, we find good quantitative agreement between theoretical predictions and observed experimental data.

  13. Pre-test analyses for the NESC1 spinning cylinder experiment

    International Nuclear Information System (INIS)

    Fokkens, J.H.

    1995-10-01

    The spinning cylinder experiment organised by the Network for the Evaluation of Steel Components (NESC) is designed to investigate the cleavage initiation behaviour of both surface-breaking and subclad defects in simulated end-of-life RPV material exposed to a pressurised thermal shock transient. Pre-test structural integrity assessments are performed by the NESC Structural Analysis Task Group (TG3). The results of these structural integrity assessments are used to determine the design of the experiment and especially the sizes of the introduced defects. In this report the results of the pre-test analyses performed by the Applied Mechanics group at ECN - Nuclear Energy are described. Elastic as well as elasto-plastic structural analyses are performed for a surface-breaking and a subclad defect in a forged cylinder with a 4 mm cladding. The semi-elliptical defects have a depth of 40 mm and an aspect ratio of 1:3.

  14. Risk Probabilities

    DEFF Research Database (Denmark)

    Rojas-Nandayapa, Leonardo

    Tail probabilities of sums of heavy-tailed random variables are of major importance in various branches of Applied Probability, such as Risk Theory, Queueing Theory and Financial Management, and are subject to intense research nowadays. To understand their relevance one just needs to think... analytic expression for the distribution function of a sum of random variables. The presence of heavy-tailed random variables complicates the problem even more. The objective of this dissertation is to provide better approximations by means of sharp asymptotic expressions and Monte Carlo estimators
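As a point of reference for the Monte Carlo side, the crude estimator below targets a tail probability P(S_n > x) for a sum of i.i.d. Pareto variables, together with the one-big-jump asymptote n·P(X1 > x) that sharp heavy-tailed approximations refine. Parameters are illustrative.

```python
# Crude Monte Carlo for P(S_n > x) with Pareto(1.5) summands supported on [1, inf).
import numpy as np

rng = np.random.default_rng(0)
n, x, reps = 10, 100.0, 200_000
samples = rng.pareto(1.5, size=(reps, n)) + 1.0   # P(X > t) = t**-1.5 for t >= 1
p_hat = (samples.sum(axis=1) > x).mean()
print(p_hat)                 # naive estimate of P(S_n > x)
print(n * x ** -1.5)         # one-big-jump asymptote for comparison
```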

  15. Pre-test evaluation of LLTR Series II Test A-6

    International Nuclear Information System (INIS)

    Knittle, D.

    1980-11-01

    The purpose of this report is to present pre-test predictions of pressure histories for the A6 test to be conducted in the Large Leak Test Facility (LLTF) at the Energy Technology Engineering Center. A6 is part of a test program being conducted to evaluate the effects of leaks produced by a double-ended guillotine rupture of a single tube. A6 will provide data on the performance of the CRBR-prototypical double rupture disc.

  16. Probability tales

    CERN Document Server

    Grinstead, Charles M; Snell, J Laurie

    2011-01-01

    This book explores four real-world topics through the lens of probability theory. It can be used to supplement a standard text in probability or statistics. Most elementary textbooks present the basic theory and then illustrate the ideas with some neatly packaged examples. Here the authors assume that the reader has seen, or is learning, the basic theory from another book and concentrate in some depth on the following topics: streaks, the stock market, lotteries, and fingerprints. This extended format allows the authors to present multiple approaches to problems and to pursue promising side discussions in ways that would not be possible in a book constrained to cover a fixed set of topics. To keep the main narrative accessible, the authors have placed the more technical mathematical details in appendices. The appendices can be understood by someone who has taken one or two semesters of calculus.

  17. Probability theory

    CERN Document Server

    Dorogovtsev, A Ya; Skorokhod, A V; Silvestrov, D S; Skorokhod, A V

    1997-01-01

    This book of problems is intended for students in pure and applied mathematics. There are problems in traditional areas of probability theory and problems in the theory of stochastic processes, which has wide applications in the theory of automatic control, queuing and reliability theories, and in many other modern science and engineering fields. Answers to most of the problems are given, and the book provides hints and solutions for more complicated problems.

  18. Pre-test analysis results of a PWR steel lined pre-stressed concrete containment model

    International Nuclear Information System (INIS)

    Basha, S.M.; Ghosh, Barnali; Patnaik, R.; Ramanujam, S.; Singh, R.K.; Kushwaha, H.S.; Venkat Raj, V.

    2000-02-01

    Pre-stressed concrete nuclear containment serves as the ultimate barrier against the release of radioactivity to the environment, and this barrier must be checked for its ultimate load-carrying capacity. BARC participated in a Round Robin analysis activity, co-sponsored by Sandia National Laboratories, USA, and the Nuclear Power Engineering Corporation, Japan, for the pre-test prediction of a 1:4-scale pre-stressed concrete containment vessel. The in-house finite element code ULCA was used to make the test predictions of displacements and strains at the standard output locations. The present report focuses on the important landmarks of the pre-test results, in sequence: first crack appearance, loss of pre-stress, first through-thickness crack, rebar and liner yielding, and finally liner tearing at the ultimate load. Global and local failure modes of the containment have been obtained from the analysis. Finally, the sensitivity of the numerical results to different types of liners and to different constitutive models, in terms of the bond strength between concrete and steel and the tension-stiffening parameters, is examined. The report highlights the important features which could be observed during the test, and guidelines are given for improving the prediction in the post-test computation after the test data become available.

  19. A well test analysis method accounting for pre-test operations

    International Nuclear Information System (INIS)

    Silin, D.B.; Tsang, C.-F.

    2003-01-01

    We propose to use regular monitoring data from a production or injection well for estimating the formation hydraulic properties in the vicinity of the wellbore without interrupting the operations. In our approach, we select a portion of the pumping data over a certain time interval and then derive our conclusions from analysis of these data. A distinctive feature of the proposed approach, differing from conventional methods, is the introduction of an additional parameter: an effective pre-test pumping rate. The additional parameter is derived from a rigorous asymptotic analysis of the flow model. Thus, we account for the non-uniform pressure distribution at the beginning of the testing interval caused by pre-test operations at the well. Using synthetic and field examples, we demonstrate that deviation of the matching curve from the data that is usually attributed to skin and wellbore storage effects can also be interpreted through this new parameter. Moreover, with our method the data curve is matched equally well and the results of the analysis remain stable when the analyzed data interval is perturbed, whereas traditional methods are sensitive to the choice of the data interval. A special efficient minimization procedure has been developed for searching for the best-fitting parameters. We enhanced the analysis with a procedure for estimating ambient reservoir pressure and dimensionless wellbore radius. The methods reported here have been implemented in the code ODA (Operations Data Analysis). A beta version of the code is available for free testing and evaluation to interested parties
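Below is a conceptual sketch of this kind of history matching, not the ODA algorithm itself: a Theis drawdown model is fitted by least squares with one extra unknown, an effective pre-test pumping rate q0 assumed to have started a fixed time before the analyzed window. All names and values are illustrative.

```python
# Hedged sketch: least-squares fit with an extra "effective pre-test rate" q0.
import numpy as np
from scipy.optimize import least_squares
from scipy.special import exp1

def theis(t, q, T, S, r=0.1):
    """Theis line-source drawdown at radius r for a constant rate q."""
    return q / (4 * np.pi * T) * exp1(r * r * S / (4 * T * t))

def residuals(params, t, observed, q_test, t_pre=50.0):
    T, S, q0 = params
    # superpose pre-test pumping (rate q0, started t_pre earlier) and the test rate
    return theis(t + t_pre, q0, T, S) + theis(t, q_test, T, S) - observed

t = np.linspace(0.1, 10.0, 50)
observed = theis(t + 50.0, 0.3, 1e-3, 1e-4) + theis(t, 1.0, 1e-3, 1e-4)  # synthetic
fit = least_squares(residuals, x0=[5e-4, 5e-5, 0.1], args=(t, observed, 1.0),
                    bounds=(1e-8, np.inf))
print(fit.x)   # recovered T, S and effective pre-test rate q0
```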

  20. Pre-test analysis of ATLAS SBO with RCP seal leakage scenario using MARS code

    Energy Technology Data Exchange (ETDEWEB)

    Pham, Quang Huy; Lee, Sang Young; Oh, Seung Jong [KEPCO International Nuclear Graduate School, Ulsan (Korea, Republic of)

    2015-10-15

    This study presents a pre-test calculation for the Advanced Thermal-hydraulic Test Loop for Accident Simulation (ATLAS) SBO experiment with an RCP seal leakage scenario. Initially, turbine-driven auxiliary feedwater pumps are used; then, outside cooling water injection is used for long-term cooling. The analysis results would be useful for conducting the experiment to verify the APR 1400 extended SBO optimum mitigation strategy using outside cooling water injection in the future. The pre-test calculation for the ATLAS extended SBO with RCP seal leakage and outside cooling water injection scenario is performed. After the Fukushima nuclear accident, the capability of coping with an extended station blackout (SBO) has become important. Many NPPs are applying the FLEX approach as the main coping strategy for extended SBO scenarios. In FLEX strategies, outside cooling water injection into the reactor cooling system (RCS) and steam generators (SGs) is considered an effective method to remove residual heat and maintain the inventory of the systems during the accident. It is worthwhile to examine the soundness of the outside cooling water injection method for extended SBO mitigation by both calculation and experimental demonstration. From the calculation results, outside cooling water injection into the RCS and SGs is verified as an effective method during an extended SBO, provided that RCS and SG depressurization is sufficiently performed.

  1. Planning and pre-testing: the key to effective AIDS education materials.

    Science.gov (United States)

    Ostfield, M L; Romocki, L S

    1991-06-01

    The steps in designing and producing effective AIDS prevention educational materials are outlined, using as an example a brochure originating in St. Lucia for clients at STD clinics. The brochure was intended to be read by clients as they waited for their consultation; thus it was targeted to a specific audience delimited by age, sex, language, educational level, religion and associated medical or behavioral characteristics. When researching the audience, it is necessary to learn which medium they respond to best, what they already know, what their present behavior is, how they talk about AIDS, what terms they use, how they perceive the benefits of AIDS prevention behavior, and which sources of information they trust. The minimum number of key messages should be selected. Next, the most appropriate channel of communication is identified. Mass media are not always best for a target audience; "little media" such as flyers and give-aways may be better. The draft is then pre-tested through focus groups and interviews, asking about the text separately, then images, color, format and style. Listen to the way the respondents talk about the draft. Modify the draft and pre-test again. Fine-tune the implications of the message for realism in emotional responses, respect, self-esteem, admiration and trust. To achieve wide distribution it is a good idea to involve community leaders in the production of the materials, so they will be more likely to take part in the distribution process.

  2. Intermediate uveitis

    Directory of Open Access Journals (Sweden)

    Babu B

    2010-01-01

    Full text: Intermediate uveitis (IU) is described as inflammation in the anterior vitreous, ciliary body and the peripheral retina. In the Standardization of Uveitis Nomenclature (SUN) working group's international workshop for reporting clinical data, the consensus reached was that the term IU should be used for the subset of uveitis where the vitreous is the major site of the inflammation, even when there is an associated infection (for example, Lyme disease) or systemic disease (for example, sarcoidosis). The diagnostic term pars planitis should be used only for the subset of IU where there is snowbank or snowball formation occurring in the absence of an associated infection or systemic disease (that is, "idiopathic"). This article discusses the clinical features, etiology, pathogenesis, investigations and treatment of IU.

  3. Pre-test calculations for FAL-19 and FAL-20 using the ITHACA code

    International Nuclear Information System (INIS)

    Bradley, S.J.; Ketchell, N.

    1992-08-01

    Falcon is a small-scale experimental apparatus designed to simulate the transport of fission products through the primary circuit and containment of a nuclear power reactor under severe accident conditions. Information gained from the experiments in Falcon will be used to guide and assist in understanding the much larger Phebus-FP experiments. This report presents the results of pre-test calculations performed using ITHACA for the two tests FAL-19 and FAL-20. Initial calculations were concerned solely with the thermal-hydraulic conditions in the containment, while later ones briefly investigated the effect of injecting an insoluble aerosol into the containment under the same thermal-hydraulic conditions.

  4. NESC-1 spinning cylinder experiment. Pre-test fracture analysis evaluation

    International Nuclear Information System (INIS)

    Moinereau, D.; Pitard-Bouet, J.M.

    1996-10-01

    A pre-test structural analysis evaluation has been conducted by Electricite de France (EDF), including several three-dimensional elastic and elastic-plastic computations. Two cylinder geometries have been studied. Higher values of the stress intensity factor are obtained in both geometries in the elastic-plastic computations, due to yielding of the cladding during the thermal transient. The comparisons between the stress intensity factors and the expected base metal toughness show that cleavage initiation should occur preferentially in the base metal near the interface with the cladding. The comparison between the two geometries also shows that the thicker vessel, with a deeper semi-elliptical sub-clad flaw (70 mm deep), is more favourable to cleavage initiation near the base metal - cladding interface.

  5. FUMEX cases 1, 2, and 3 calculated pre-test and post-test results

    Energy Technology Data Exchange (ETDEWEB)

    Stefanova, S; Vitkova, M; Passage, G; Manolova, M; Simeonova, V [Bylgarska Akademiya na Naukite, Sofia (Bulgaria). Inst. za Yadrena Izsledvaniya i Yadrena Energetika; Scheglov, A; Proselkov, V [Russian Research Centre Kurchatov Inst., Moscow (Russian Federation); Kharalampieva, Ts [Kombinat Atomna Energetika, Kozloduj (Bulgaria)

    1994-12-31

    Two versions (modified pre-test and modified post-test) of the PIN-micro code were used to analyse the fuel rod behaviour in three FUMEX experiments. Experience in applying the PIN-micro code, with its simple structure and old conception of steady-state operation, shows significant difficulties in treating complex processes like those in the FUMEX experiments. These difficulties were partially overcome through different model modifications and corrections based on special engineering estimations, and the results obtained, as a whole, do not seem unreasonable. The calculations were performed by a group from two Bulgarian institutions in collaboration with specialists from the Kurchatov Research Center. 1 tab., 14 figs., 8 refs.

  6. BNL NONLINEAR PRE TEST SEISMIC ANALYSIS FOR THE NUPEC ULTIMATE STRENGTH PIPING TEST PROGRAM

    International Nuclear Information System (INIS)

    DEGRASSI, G.; HOFMAYER, C.; MURPHY, C.; SUZUKI, K.; NAMITA, Y.

    2003-01-01

    The Nuclear Power Engineering Corporation (NUPEC) of Japan has been conducting a multi-year research program to investigate the behavior of nuclear power plant piping systems under large seismic loads. The objectives of the program are: to develop a better understanding of the elasto-plastic response and ultimate strength of nuclear piping; to ascertain the seismic safety margin of current piping design codes; and to assess new piping code allowable stress rules. Under this program, NUPEC has performed a large-scale seismic proving test of a representative nuclear power plant piping system. In support of the proving test, a series of materials tests, static and dynamic piping component tests, and seismic tests of simplified piping systems have also been performed. As part of collaborative efforts between the United States and Japan on seismic issues, the US Nuclear Regulatory Commission (USNRC) and its contractor, the Brookhaven National Laboratory (BNL), are participating in this research program by performing pre-test and post-test analyses, and by evaluating the significance of the program results with regard to safety margins. This paper describes BNL's pre-test analysis to predict the elasto-plastic response for one of NUPEC's simplified piping system seismic tests. The capability to simulate the anticipated ratcheting response of the system was of particular interest. Analyses were performed using classical bilinear and multilinear kinematic hardening models as well as a nonlinear kinematic hardening model. Comparisons of analysis results for each plasticity model against test results for a static cycling elbow component test and for a simplified piping system seismic test are presented in the paper
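The bilinear kinematic hardening model mentioned above has a compact 1D form; the sketch below shows the classical elastic-predictor/return-mapping update for a single strain increment. Material constants are illustrative and not taken from the NUPEC tests.

```python
# Hedged 1D sketch: bilinear kinematic hardening (elastic predictor + return map).

def update(stress, backstress, d_eps, E=200e3, H=10e3, sigma_y=250.0):
    """Advance (stress, backstress) by a strain increment d_eps (stress in MPa)."""
    trial = stress + E * d_eps              # elastic predictor
    xi = trial - backstress                 # relative (shifted) stress
    f = abs(xi) - sigma_y                   # yield function
    if f <= 0.0:
        return trial, backstress            # elastic step
    d_gamma = f / (E + H)                   # plastic multiplier
    sign = 1.0 if xi > 0.0 else -1.0
    stress = trial - E * d_gamma * sign     # plastic corrector
    backstress += H * d_gamma * sign        # kinematic shift of the yield surface
    return stress, backstress

# Cyclic straining drives the ratcheting-type response such models approximate.
s, b = 0.0, 0.0
for d_eps in [0.002, -0.002, 0.002, -0.002]:
    s, b = update(s, b, d_eps)
    print(round(s, 1), round(b, 1))
```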

  7. Probability via expectation

    CERN Document Server

    Whittle, Peter

    1992-01-01

    This book is a complete revision of the earlier work Probability which appeared in 1970. While revised so radically and incorporating so much new material as to amount to a new text, it preserves both the aim and the approach of the original. That aim was stated as the provision of a 'first text in probability, demanding a reasonable but not extensive knowledge of mathematics, and taking the reader to what one might describe as a good intermediate level'. In doing so it attempted to break away from stereotyped applications, and consider applications of a more novel and significant character. The particular novelty of the approach was that expectation was taken as the prime concept, and the concept of expectation axiomatized rather than that of a probability measure. In the preface to the original text of 1970 (reproduced below, together with that to the Russian edition of 1982) I listed what I saw as the advantages of the approach in as unlaboured a fashion as I could. I also took the view that the text...

  8. 3D Orthorhombic Elastic Wave Propagation Pre-Test Simulation of SPE DAG-1 Test

    Science.gov (United States)

    Jensen, R. P.; Preston, L. A.

    2017-12-01

    A more realistic representation of many geologic media can be characterized as a dense system of vertically-aligned microfractures superimposed on a finely-layered horizontal geology found in shallow crustal rocks. This seismic anisotropy representation lends itself to being modeled as an orthorhombic elastic medium comprising three mutually orthogonal symmetry planes containing nine independent moduli. These moduli can be determined by observing (or prescribing) nine independent P-wave and S-wave phase speeds along different propagation directions. We have developed an explicit time-domain finite-difference (FD) algorithm for simulating 3D elastic wave propagation in a heterogeneous orthorhombic medium. The components of the particle velocity vector and the stress tensor are governed by a set of nine, coupled, first-order, linear, partial differential equations (PDEs) called the velocity-stress system. All time and space derivatives are discretized with centered and staggered FD operators possessing second- and fourth-order numerical accuracy, respectively. Additionally, we have implemented novel perfectly matched layer (PML) absorbing boundary conditions, specifically designed for orthorhombic media, to effectively suppress grid boundary reflections. In support of the Source Physics Experiment (SPE) Phase II, a series of underground chemical explosions at the Nevada National Security Site, the code has been used to perform pre-test estimates of the Dry Alluvium Geology - Experiment 1 (DAG-1). Based on literature searches, realistic geologic structure and values for orthorhombic P-wave and S-wave speeds have been estimated. Results and predictions from the simulations are presented.
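The update structure of the velocity-stress scheme is easiest to see in one dimension; the toy below uses second-order staggered-grid stencils and a single P-wave modulus (the real code is 3D orthorhombic, fourth-order in space, with PML boundaries).

```python
# 1D velocity-stress staggered-grid FD sketch (2nd order in time and space).
import numpy as np

nx, dx, dt, nt = 400, 5.0, 5e-4, 1000         # grid and step sizes (illustrative)
rho = 2000.0                                  # density [kg/m^3]
c33 = rho * 3000.0**2                         # P-wave modulus for vp = 3000 m/s
v = np.zeros(nx)                              # particle velocity
s = np.zeros(nx + 1)                          # stress, staggered half a cell

for it in range(nt):
    s[1:-1] += dt * c33 * (v[1:] - v[:-1]) / dx        # constitutive (Hooke) update
    v += dt * (s[1:] - s[:-1]) / (rho * dx)            # momentum update
    v[nx // 2] += np.exp(-((it * dt - 0.05) / 0.01) ** 2)  # Gaussian source wavelet
```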

  9. Pre-test analysis for identification of natural circulation instabilities in TALL-3D facility

    Energy Technology Data Exchange (ETDEWEB)

    Kööp, Kaspar, E-mail: kaspar@safety.sci.kth.se; Jeltsov, Marti, E-mail: marti@safety.sci.kth.se; Grishchenko, Dmitry, E-mail: dmitry@safety.sci.kth.se; Kudinov, Pavel, E-mail: pavel@safety.sci.kth.se

    2017-04-01

    Highlights: • Global optimum search method was used to identify a region of instability. • Parametric study was used for detailed investigation of system behavior modes. • The results include identification of sustained mass flow rate oscillations. • Recommendations are made for selection of optimal experimental conditions. - Abstract: The TALL-3D facility is a lead-bismuth eutectic (LBE) thermal-hydraulic loop designed to provide experimental data on thermal-hydraulics phenomena for validation of stand-alone and coupled System Thermal Hydraulics (STH) and Computational Fluid Dynamics (CFD) codes. Pre-test analysis is crucial for a proper choice of the experimental conditions at which the experimental data would be most useful for code validation and benchmarking. The goal of this work is to identify the conditions at which the experiment is challenging for the STH codes yet minimizes the 3D effects of the test section on the loop dynamics. The analysis is focused on the identification of limit cycle flow oscillations in the TALL-3D facility main heater leg, using the global optimum search tool GA-NPO to find a general region in the parameter space where oscillatory behavior is expected. As a second step, a grid study is conducted outlining the boundaries between different stability modes. Phenomena, simulation results and the methodology for selection of the test parameters are discussed in detail, and recommendations for experiments are provided.

  10. Probability Aggregates in Probability Answer Set Programming

    OpenAIRE

    Saad, Emad

    2013-01-01

    Probability answer set programming is a declarative programming paradigm that has been shown effective for representing and reasoning about a variety of probability reasoning tasks. However, the lack of probability aggregates, e.g. expected values, in the language of disjunctive hybrid probability logic programs (DHPP) disallows the natural and concise representation of many interesting problems. In this paper, we extend DHPP to allow arbitrary probability aggregates. We introduce two types of p

  11. Contributions to quantum probability

    International Nuclear Information System (INIS)

    Fritz, Tobias

    2010-01-01

    distribution of a quantum-mechanical von Neumann measurement with postselection, given that the scalar product between the initial and the final state is known as well as the success probability of the postselection. An intermediate von Neumann measurement can enhance transition probabilities between states such that the error probability shrinks by a factor of up to 2. Chapter 4: A presentation of the category of stochastic matrices. This chapter gives generators and relations for the strict monoidal category of probabilistic maps on finite cardinals (i.e., stochastic matrices). Chapter 5: Convex Spaces: Definition and Examples. We try to promote convex spaces as an abstract concept of convexity which was introduced by Stone as ''barycentric calculus''. A convex space is a set where one can take convex combinations in a consistent way. By identifying the corresponding Lawvere theory as the category from chapter 4 and using the results obtained there, we give a different proof of a result of Swirszcz which shows that convex spaces can be identified with algebras of a finitary version of the Giry monad. After giving an extensive list of examples of convex sets as they appear throughout mathematics and theoretical physics, we note that there also exist convex spaces that cannot be embedded into a vector space: semilattices are a class of examples of purely combinatorial type. In an information-theoretic interpretation, convex subsets of vector spaces are probabilistic, while semilattices are possibilistic. Convex spaces unify these two concepts. (orig.)
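
    For readers who want the definition spelled out, the LaTeX fragment below gives one standard axiomatization of a convex space and the semilattice counterexample mentioned; it is a reconstruction and may differ in detail from the thesis.

```latex
% One standard axiomatization of a convex space (a reconstruction).
% Operations c_lambda encode formal convex combinations on a set X.
\[
  c_\lambda : X \times X \to X, \qquad
  c_\lambda(x,y) \;\text{read as}\; \lambda x + (1-\lambda)\,y,
  \qquad \lambda \in [0,1],
\]
\[
  c_\lambda(x,x) = x, \qquad c_1(x,y) = x, \qquad
  c_\lambda(x,y) = c_{1-\lambda}(y,x),
\]
together with an associativity law for iterated combinations. A join
semilattice $(S,\vee)$ becomes a convex space via
$c_\lambda(x,y) = x \vee y$ for all $\lambda \in (0,1)$; it cannot embed
into a vector space, where $c_\lambda(x,y)$ with $x \neq y$ determines
$\lambda$ uniquely, while here every $\lambda \in (0,1)$ yields the
same point.
```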

  12. Contributions to quantum probability

    Energy Technology Data Exchange (ETDEWEB)

    Fritz, Tobias

    2010-06-25

    finite set can occur as the outcome distribution of a quantum-mechanical von Neumann measurement with postselection, given that the scalar product between the initial and the final state is known as well as the success probability of the postselection. An intermediate von Neumann measurement can enhance transition probabilities between states such that the error probability shrinks by a factor of up to 2. Chapter 4: A presentation of the category of stochastic matrices. This chapter gives generators and relations for the strict monoidal category of probabilistic maps on finite cardinals (i.e., stochastic matrices). Chapter 5: Convex Spaces: Definition and Examples. We try to promote convex spaces as an abstract concept of convexity which was introduced by Stone as ''barycentric calculus''. A convex space is a set where one can take convex combinations in a consistent way. By identifying the corresponding Lawvere theory as the category from chapter 4 and using the results obtained there, we give a different proof of a result of Swirszcz which shows that convex spaces can be identified with algebras of a finitary version of the Giry monad. After giving an extensive list of examples of convex sets as they appear throughout mathematics and theoretical physics, we note that there also exist convex spaces that cannot be embedded into a vector space: semilattices are a class of examples of purely combinatorial type. In an information-theoretic interpretation, convex subsets of vector spaces are probabilistic, while semilattices are possibilistic. Convex spaces unify these two concepts. (orig.)

  13. A Teaching Method on Basic Chemistry for Freshman (II) : Teaching Method with Pre-test and Post-test

    OpenAIRE

    立木, 次郎; 武井, 庚二

    2004-01-01

    This report deals with review of a teaching method on basic chemistry for freshman in this first semester. We tried to review this teaching method with pre-test and post-test by means of the official and private questionnaires. Several hints and thoughts on teaching skills are obtained from this analysis.

  14. A Teaching Method on Basic Chemistry for Freshman : Teaching Method with Pre-test and Post-test

    OpenAIRE

    立木, 次郎; 武井, 庚二

    2003-01-01

    This report deals with a teaching method on basic chemistry for freshman. This teaching method contains guidance and instruction to how to understand basic chemistry. Pre-test and post-test have been put into practice each time. Each test was returned to students at class in the following weeks.

  15. Linear positivity and virtual probability

    International Nuclear Information System (INIS)

    Hartle, James B.

    2004-01-01

    We investigate the quantum theory of closed systems based on the linear positivity decoherence condition of Goldstein and Page. The objective of any quantum theory of a closed system, most generally the universe, is the prediction of probabilities for the individual members of sets of alternative coarse-grained histories of the system. Quantum interference between members of a set of alternative histories is an obstacle to assigning probabilities that are consistent with the rules of probability theory. A quantum theory of closed systems therefore requires two elements: (1) a condition specifying which sets of histories may be assigned probabilities and (2) a rule for those probabilities. The linear positivity condition of Goldstein and Page is the weakest of the general conditions proposed so far. Its general properties relating to exact probability sum rules, time neutrality, and conservation laws are explored. Its inconsistency with the usual notion of independent subsystems in quantum mechanics is reviewed. Its relation to the stronger condition of medium decoherence necessary for classicality is discussed. The linear positivity of histories in a number of simple model systems is investigated with the aim of exhibiting linearly positive sets of histories that are not decoherent. The utility of extending the notion of probability to include values outside the range of 0-1 is described. Alternatives with such virtual probabilities cannot be measured or recorded, but can be used in the intermediate steps of calculations of real probabilities. Extended probabilities give a simple and general way of formulating quantum theory. The various decoherence conditions are compared in terms of their utility for characterizing classicality and the role they might play in further generalizations of quantum mechanics
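
    A LaTeX sketch of the two conditions compared, in the notation standard in this literature (reconstructed, not quoted from the paper):

```latex
% Histories are represented by class operators
%   C_alpha = P^n_{alpha_n}(t_n) ... P^1_{alpha_1}(t_1).
Linear positivity (Goldstein--Page) takes
\[
  p(\alpha) \;=\; \operatorname{Re}\,\langle \Psi \,|\, C_\alpha \,|\, \Psi \rangle
\]
as the probability of history $\alpha$ and admits a set of histories
when $p(\alpha) \ge 0$ for all $\alpha$. Medium decoherence is the
stronger condition
\[
  \langle \Psi \,|\, C_\alpha^\dagger C_\beta \,|\, \Psi \rangle \;\approx\;
  \delta_{\alpha\beta}\, p(\alpha),
\]
under which interference terms vanish; every medium-decoherent set is
linearly positive, but not conversely.
```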

  16. Comparison of patient comprehension of rapid HIV pre-test fundamentals by information delivery format in an emergency department setting

    Directory of Open Access Journals (Sweden)

    Clark Melissa A

    2007-09-01

    Full Text Available Abstract Background Two trials were conducted to compare emergency department patient comprehension of rapid HIV pre-test information using different methods to deliver this information. Methods Patients were enrolled for these two trials at a US emergency department between February 2005 and January 2006. In Trial One, patients were randomized to a no pre-test information or an in-person discussion arm. In Trial Two, a separate group of patients were randomized to an in-person discussion arm or a Tablet PC-based video arm. The video, "Do you know about rapid HIV testing?", and the in-person discussion contained identical Centers for Disease Control and Prevention-suggested pre-test information components as well as information on rapid HIV testing with OraQuick®. Participants were compared by information arm on their comprehension of the pre-test information by their score on a 26-item questionnaire using the Wilcoxon rank-sum test. Results In Trial One, 38 patients completed the no-information arm and 31 completed the in-person discussion arm. Of these 69 patients, 63.8% had twelve years or fewer of formal education and 66.7% had previously been tested for HIV. The mean score on the questionnaire for the in-person discussion arm was higher than for the no information arm (18.7 vs. 13.3, p ≤ 0.0001). In Trial Two, 59 patients completed the in-person discussion and 55 completed the video arms. Of these 114 patients, 50.9% had twelve years or fewer of formal education and 68.4% had previously been tested for HIV. The mean score on the questionnaire for the video arm was similar to the in-person discussion arm (20.0 vs. 19.2; p ≤ 0.33). Conclusion The video "Do you know about rapid HIV testing?" appears to be an acceptable substitute for an in-person pre-test discussion on rapid HIV testing with OraQuick®. In terms of adequately informing ED patients about rapid HIV testing, either form of pre-test information is preferable than for patients
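
    A hedged sketch of the reported comparison: scipy.stats.ranksums implements the Wilcoxon rank-sum test used above; the questionnaire scores below are fabricated, not the trial data.

```python
from scipy.stats import ranksums

# Hypothetical 26-item questionnaire scores for the two arms.
no_info_scores    = [12, 14, 11, 15, 13, 16, 12, 13]
discussion_scores = [19, 18, 20, 17, 21, 18, 19, 20]

stat, p_value = ranksums(no_info_scores, discussion_scores)
print(f"rank-sum statistic = {stat:.2f}, p = {p_value:.4f}")
```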

  17. Combined use of clinical pre-test probability and D-dimer test in the diagnosis of preoperative deep venous thrombosis in colorectal cancer patients

    DEFF Research Database (Denmark)

    Stender, Mogens; Frøkjaer, Jens Brøndum; Hagedorn Nielsen, Tina Sandie

    2008-01-01

    The preoperative prevalence of deep venous thrombosis (DVT) in patients with colorectal cancer may be as high as 8%. In order to minimize the risk of pulmonary embolism, it is important to rule out preoperative DVT. A large study has confirmed that a negative D-dimer test in combination with a low...... preoperative DVT in colorectal cancer patients admitted for surgery. Preoperative D-dimer test and compression ultrasonography for DVT were performed in 193 consecutive patients with newly diagnosed colorectal cancer. Diagnostic accuracy indices of the D-dimer test were assessed according to the PTP score...... in ruling out preoperative DVT in colorectal cancer patients admitted for surgery....
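
    The diagnostic accuracy indices mentioned reduce to a 2x2 table; the sketch below uses invented counts that merely mimic the study size.

```python
# D-dimer test against compression ultrasonography as reference;
# hypothetical 2x2 counts (n = 193 only mimics the cohort size).
tp, fp, fn, tn = 9, 40, 1, 143

sensitivity = tp / (tp + fn)   # P(D-dimer positive | DVT present)
specificity = tn / (tn + fp)   # P(D-dimer negative | DVT absent)
ppv = tp / (tp + fp)           # post-test probability of DVT if positive
npv = tn / (tn + fn)           # probability of no DVT if negative
print(sensitivity, specificity, ppv, npv)
```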

  18. Scaling Qualitative Probability

    OpenAIRE

    Burgin, Mark

    2017-01-01

    There are different approaches to qualitative probability, which includes subjective probability. We developed a representation of qualitative probability based on relational systems, which allows modeling uncertainty by probability structures and is more coherent than existing approaches. This setting makes it possible to prove that any comparative probability is induced by some probability structure (Theorem 2.1), that classical probability is a probability structure (Theorem 2.2) and that i...

  19. Approximation methods in probability theory

    CERN Document Server

    Čekanavičius, Vydas

    2016-01-01

    This book presents a wide range of well-known and less common methods used for estimating the accuracy of probabilistic approximations, including the Esseen type inversion formulas, the Stein method as well as the methods of convolutions and triangle function. Emphasising the correct usage of the methods presented, each step required for the proofs is examined in detail. As a result, this textbook provides valuable tools for proving approximation theorems. While Approximation Methods in Probability Theory will appeal to everyone interested in limit theorems of probability theory, the book is particularly aimed at graduate students who have completed a standard intermediate course in probability theory. Furthermore, experienced researchers wanting to enlarge their toolkit will also find this book useful.

  20. Using cognitive pre-testing methods in the development of a new evidenced-based pressure ulcer risk assessment instrument

    Directory of Open Access Journals (Sweden)

    S. Coleman

    2016-11-01

    Full Text Available Abstract Background Variation in development methods of Pressure Ulcer Risk Assessment Instruments has led to inconsistent inclusion of risk factors and concerns about content validity. A new evidenced-based Risk Assessment Instrument, the Pressure Ulcer Risk Primary Or Secondary Evaluation Tool - PURPOSE-T, was developed as part of a National Institute for Health Research (NIHR) funded Pressure Ulcer Research Programme (PURPOSE: RP-PG-0407-10056). This paper reports the pre-test phase to assess and improve PURPOSE-T acceptability and usability and to confirm content validity. Methods A descriptive study incorporating cognitive pre-testing methods and integration of service user views was undertaken over 3 cycles comprising PURPOSE-T training, a focus group and one-to-one think-aloud interviews. Clinical nurses from 2 acute and 2 community NHS Trusts were grouped according to job role. Focus group participants used 3 vignettes to complete PURPOSE-T assessments and then participated in the focus group. Think-aloud participants were interviewed during their completion of PURPOSE-T. After each pre-test cycle, analysis was undertaken and adjustments/improvements were made to PURPOSE-T in an iterative process. This incorporated the use of descriptive statistics for data completeness and decision rule compliance, and directed content analysis for interview and focus group data. Data were collected April 2012-June 2012. Results Thirty-four nurses participated in 3 pre-test cycles. Data from 3 focus groups and 12 think-aloud interviews, incorporating 101 PURPOSE-T assessments, led to changes to improve instrument content and design, flow and format, decision support and item-specific wording. Acceptability and usability were demonstrated by improved data completion and appropriate risk pathway allocation. The pre-test also confirmed content validity with clinical nurses. Conclusions The pre-test was an important step in the development of the preliminary PURPOSE-T and the

  1. On Probability Leakage

    OpenAIRE

    Briggs, William M.

    2012-01-01

    The probability leakage of model M with respect to evidence E is defined. Probability leakage is a kind of model error. It occurs when M implies that events $y$, which are impossible given E, have positive probability. Leakage does not imply model falsification. Models with probability leakage cannot be calibrated empirically. Regression models, which are ubiquitous in statistical practice, often evince probability leakage.
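
    A toy instance of the definition: a normal regression model for a strictly positive outcome leaks probability onto impossible negative values. The fitted parameters below are arbitrary.

```python
from scipy.stats import norm

mu_hat, sigma_hat = 2.0, 1.5                          # predictive mean, sd
leakage = norm.cdf(0.0, loc=mu_hat, scale=sigma_hat)  # P(y < 0 | M)
print(f"probability leakage: {leakage:.3f}")          # about 0.091
```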

  2. HIV pre-test information, discussion or counselling? A review of guidance relevant to the WHO European Region.

    Science.gov (United States)

    Bell, Stephen A; Delpech, Valerie; Raben, Dorthe; Casabona, Jordi; Tsereteli, Nino; de Wit, John

    2016-02-01

    In the context of a shift from exceptionalism to normalisation, this study examines recommendations/evidence in current pan-European/global guidelines regarding pre-test HIV testing and counselling practices in health care settings. It also reviews new research not yet included in guidelines. There is consensus that verbal informed consent must be gained prior to testing, individually, in private, confidentially, in the presence of a health care provider. All guidelines recommend pre-test information/discussion delivered verbally or via other methods (information sheet). There is agreement about a minimum standard of information to be provided before a test, but guidelines differ regarding discussion about issues encouraging patients to think about implications of the result. There is heavy reliance on expert consultation in guideline development. Referenced scientific evidence is often more than ten years old and based on US/UK research. Eight new papers are reviewed. Current HIV testing and counselling guidelines have inconsistencies regarding the extent and type of information that is recommended during pre-test discussions. The lack of new research underscores a need for new evidence from a range of European settings to support the process of expert consultation in guideline development. © The Author(s) 2015.

  3. CONSTOR registered V/TC drop tests. Pre-test analysis by finite element method

    International Nuclear Information System (INIS)

    Voelzer, W.; Koenig, S.; Klein, K.; Tso, C.F.; Owen, S.; Monk, C.

    2004-01-01

    The CONSTOR registered family of steel-concrete-steel sandwich cask designs has been developed to fulfil both the internationally valid IAEA criteria for transportation and the requirements for long-term intermediate storage in the US and various European countries. A comprehensive drop testing programme using a full-scale prototype test cask (CONSTOR registered V/TC) has been developed as part of the application for a transport license in both Germany and the US. The drop tests using the full-scale cask will be performed by BAM at test facilities in Horstwalde. The tests will include five different 9m drops onto flat unyielding targets and seven different 1m drops onto a punch. The first drop test, a 9m side drop, will be performed during PATRAM 2004. The other drop tests will take place during the following year. The development of the cask design and the formulation of the drop test programme have been supported by an extensive series of finite element analyses. The objectives of the finite element analyses were: to provide an intermediate step in demonstrating the performance of the CONSTOR registered in fulfilling the requirements of 10 CFR 71 and the IAEA transport regulations; to justify the selection of drop tests; to predict the performance of V/TC during the drop tests; to estimate the strain and acceleration time histories at measuring points on the test cask and to aid in the setting up of the test instrumentation; and to develop an analysis model that can be used in future safety analyses for transport and storage license applications and which can confidently be used to demonstrate the performance of the package. This paper presents an overview of the analyses performed, including a summary of all the different drop orientations that were considered. The major assumptions employed during the analyses are also discussed, as are the specifics of the modelling techniques that were employed. At the end of the paper, the key results obtained from the analyses

  4. Probability 1/e

    Science.gov (United States)

    Koo, Reginald; Jones, Martin L.

    2011-01-01

    Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.
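
    One of the classic problems of this kind is easy to simulate; the sketch below estimates the derangement probability, which tends to 1/e as n grows.

```python
import math
import random

# Estimate P(random permutation of n items has no fixed point).
n, trials, hits = 20, 100_000, 0
for _ in range(trials):
    p = list(range(n))
    random.shuffle(p)
    hits += all(p[i] != i for i in range(n))

print(hits / trials, 1 / math.e)   # both close to 0.3679
```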

  5. Probability an introduction

    CERN Document Server

    Goldberg, Samuel

    1960-01-01

    Excellent basic text covers set theory, probability theory for finite sample spaces, binomial theorem, probability distributions, means, standard deviations, probability function of binomial distribution, more. Includes 360 problems with answers for half.

  6. Quantum probability measures and tomographic probability densities

    NARCIS (Netherlands)

    Amosov, GG; Man'ko, [No Value

    2004-01-01

    Using a simple relation between the Dirac delta-function and the generalized theta-function, the relationship between the tomographic probability approach and the quantum probability measure approach to the description of quantum states is discussed. The quantum state tomogram expressed in terms of the

  7. Toward a generalized probability theory: conditional probabilities

    International Nuclear Information System (INIS)

    Cassinelli, G.

    1979-01-01

    The main mathematical object of interest in the quantum logic approach to the foundations of quantum mechanics is the orthomodular lattice and a set of probability measures, or states, defined by the lattice. This mathematical structure is studied per se, independently from the intuitive or physical motivation of its definition, as a generalized probability theory. It is thought that the building-up of such a probability theory could eventually throw light on the mathematical structure of Hilbert-space quantum mechanics as a particular concrete model of the generalized theory. (Auth.)

  8. Assessing the clinical probability of pulmonary embolism

    International Nuclear Information System (INIS)

    Miniati, M.; Pistolesi, M.

    2001-01-01

    Clinical assessment is a cornerstone of the recently validated diagnostic strategies for pulmonary embolism (PE). Although the diagnostic yield of individual symptoms, signs, and common laboratory tests is limited, the combination of these variables, either by empirical assessment or by a prediction rule, can be used to express a clinical probability of PE. The latter may serve as pretest probability to predict the probability of PE after further objective testing (posterior or post-test probability). Over the last few years, attempts have been made to develop structured prediction models for PE. In a Canadian multicenter prospective study, the clinical probability of PE was rated as low, intermediate, or high according to a model which included assessment of presenting symptoms and signs, risk factors, and presence or absence of an alternative diagnosis at least as likely as PE. Recently, a simple clinical score was developed to stratify outpatients with suspected PE into groups with low, intermediate, or high clinical probability. Logistic regression was used to predict parameters associated with PE. A score ≤ 4 identified patients with low probability of whom 10% had PE. The prevalence of PE in patients with intermediate (score 5-8) and high probability (score ≥ 9) was 38 and 81%, respectively. As opposed to the Canadian model, this clinical score is standardized. The predictor variables identified in the model, however, were derived from a database of emergency ward patients. This model may, therefore, not be valid in assessing the clinical probability of PE in inpatients. In the PISA-PED study, a clinical diagnostic algorithm was developed which rests on the identification of three relevant clinical symptoms and on their association with electrocardiographic and/or radiographic abnormalities specific for PE. Among patients who, according to the model, had been rated as having a high clinical probability, the prevalence of proven PE was 97%, while it was 3
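
    The standardized score described stratifies patients with simple cut-offs; a sketch using the thresholds and prevalences quoted in the abstract:

```python
# Cut-offs from the abstract: <=4 low, 5-8 intermediate, >=9 high.
def clinical_probability(score: int) -> tuple:
    """Return (category, approximate PE prevalence reported)."""
    if score <= 4:
        return ("low", 0.10)
    if score <= 8:
        return ("intermediate", 0.38)
    return ("high", 0.81)

print(clinical_probability(3))    # ('low', 0.1)
print(clinical_probability(7))    # ('intermediate', 0.38)
print(clinical_probability(10))   # ('high', 0.81)
```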

  9. Intermediality and media change

    OpenAIRE

    2012-01-01

    This book is about intermediality as an approach to analysing and understanding media change. Intermediality and Media Change is critical of technological determinism that characterises 'new media discourse' about the ongoing digitalization, framed as a revolution and creating sharp contrasts between old and new media. Intermediality instead emphasises paying attention to continuities between media of all types and privileges a comparative perspective on technological changes in media over ti...

  10. Philosophical theories of probability

    CERN Document Server

    Gillies, Donald

    2000-01-01

    The Twentieth Century has seen a dramatic rise in the use of probability and statistics in almost all fields of research. This has stimulated many new philosophical ideas on probability. Philosophical Theories of Probability is the first book to present a clear, comprehensive and systematic account of these various theories and to explain how they relate to one another. Gillies also offers a distinctive version of the propensity theory of probability, and the intersubjective interpretation, which develops the subjective theory.

  11. Non-Archimedean Probability

    NARCIS (Netherlands)

    Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia

    We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned

  12. Interpretations of probability

    CERN Document Server

    Khrennikov, Andrei

    2009-01-01

    This is the first fundamental book devoted to non-Kolmogorov probability models. It provides a mathematical theory of negative probabilities, with numerous applications to quantum physics, information theory, complexity, biology and psychology. The book also presents an interesting model of cognitive information reality with flows of information probabilities, describing the process of thinking, social, and psychological phenomena.

  13. Pre-Test Analysis Predictions for the Shell Buckling Knockdown Factor Checkout Tests - TA01 and TA02

    Science.gov (United States)

    Thornburgh, Robert P.; Hilburger, Mark W.

    2011-01-01

    This report summarizes the pre-test analysis predictions for the SBKF-P2-CYL-TA01 and SBKF-P2-CYL-TA02 shell buckling tests conducted at the Marshall Space Flight Center (MSFC) in support of the Shell Buckling Knockdown Factor (SBKF) Project, a NASA Engineering and Safety Center (NESC) Assessment. The test article (TA) is an 8-foot-diameter aluminum-lithium (Al-Li) orthogrid cylindrical shell with design features similar to those of the proposed Ares-I and Ares-V barrel structures. In support of the testing effort, detailed structural analyses were conducted and the results were used to monitor the behavior of the TA during testing. A summary of predicted results for each of the five load sequences is presented herein.

  14. Mineralogic and petrologic investigation of pre-test core samples from the spent fuel test-climax

    International Nuclear Information System (INIS)

    Ryerson, F.J.; Qualheim, B.J.

    1983-12-01

    Pre-test samples obtained from just inside the perimeter of the canister emplacement holes of the Spent Fuel Test-Climax have been characterized by petrographic and microanalytical techniques. The primary quartz monzonite has undergone various degrees of hydrothermal alteration as a result of natural processes. Alteration is most apparent on primary plagioclase and biotite. The most common secondary phases on plagioclase are muscovite and calcite, while the most common secondary phases on biotite are epidote and chlorite. The major alteration zones encountered are localized along filled fractures, i.e. veins. The thickness and mineralogy of the alteration zones can be correlated with the vein mineralogy, becoming wider and more complex mineralogically when the veins contain calcite. 7 references, 10 figures, 4 tables

  15. Pre-test habituation improves the reliability of a handheld test of mechanical nociceptive threshold in dairy cows

    DEFF Research Database (Denmark)

    Raundal, P. M.; Andersen, P. H.; Toft, Nils

    2015-01-01

    Mechanical nociceptive threshold (MNT) testing has been used to investigate aspects of painful states in bovine claws. We investigated a handheld tool in which the applied stimulation force is monitored continuously relative to a pre-encoded target force. The effect on MNT of two pre-testing... habituation procedures was investigated in two different experiments comprising a total of 88 sound Holstein dairy cows kept either inside or outside their home environment. MNT testing was performed using five consecutive mechanical nociceptive stimulations per cow per test at a fixed pre-encoded target rate... of 2.1 N/s. The habituation procedure performed in dairy cows kept in their home environment led to a lowered intra-individual coefficient of variation of MNT (P test...
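
    The repeatability measure used, the intra-individual coefficient of variation over five consecutive stimulations, is a one-liner; the forces below are invented.

```python
import statistics

# Five consecutive MNT stimulations (N) for one cow, one test session.
forces = [14.2, 15.1, 13.8, 14.9, 14.5]
cv = statistics.stdev(forces) / statistics.mean(forces)
print(f"intra-individual CV = {cv:.1%}")
```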

  16. Foundations of probability

    International Nuclear Information System (INIS)

    Fraassen, B.C. van

    1979-01-01

    The interpretation of probabilities in physical theories, whether quantum or classical, is considered. The following points are discussed: 1) the functions P(μ, Q), in terms of which states and propositions can be represented, are classical (Kolmogoroff) probabilities, formally speaking; 2) these probabilities are generally interpreted as themselves conditional, and the conditions are mutually incompatible where the observables are maximal; and 3) testing of the theory typically takes the form of confronting the expectation values of observable Q calculated with probability measures P(μ, Q) for states μ; hence, of comparing the probabilities P(μ, Q)(E) with the frequencies of occurrence of the corresponding events. It seems that even the interpretation of quantum mechanics, in so far as it concerns what the theory says about the empirical (i.e. actual, observable) phenomena, deals with the confrontation of classical probability measures with observable frequencies. This confrontation is studied. (Auth./C.F.)

  17. The quantum probability calculus

    International Nuclear Information System (INIS)

    Jauch, J.M.

    1976-01-01

    The Wigner anomaly (1932) for the joint distribution of noncompatible observables is an indication that the classical probability calculus is not applicable for quantum probabilities. It should, therefore, be replaced by another, more general calculus, which is specifically adapted to quantal systems. In this article this calculus is exhibited and its mathematical axioms and the definitions of the basic concepts such as probability field, random variable, and expectation values are given. (B.R.H)

  18. Choice Probability Generating Functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel L; Bierlaire, Michel

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice...... probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications....
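
    For the multinomial logit special case, a valid CPGF and its gradient can be written down directly; the sketch below uses made-up utilities.

```python
import numpy as np

# G(u) = log sum_j exp(u_j) is a CPGF for the logit ARUM; its gradient
# (the softmax) recovers the choice probabilities.
def cpgf(u):
    return np.log(np.sum(np.exp(u)))

def choice_probs(u):
    e = np.exp(u - u.max())              # numerically stable softmax
    return e / e.sum()

u = np.array([1.0, 0.5, -0.2])           # systematic utilities (invented)
print(choice_probs(u), choice_probs(u).sum())   # probabilities sum to 1
```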

  19. Probability of satellite collision

    Science.gov (United States)

    Mccarter, J. W.

    1972-01-01

    A method is presented for computing the probability of a collision between a particular artificial earth satellite and any one of the total population of earth satellites. The collision hazard incurred by the proposed modular Space Station is assessed using the technique presented. The results of a parametric study to determine what type of satellite orbits produce the greatest contribution to the total collision probability are presented. Collision probability for the Space Station is given as a function of Space Station altitude and inclination. Collision probability was also parameterized over miss distance and mission duration.

  20. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2013-01-01

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice...... probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended...

  1. Handbook of probability

    CERN Document Server

    Florescu, Ionut

    2013-01-01

    THE COMPLETE COLLECTION NECESSARY FOR A CONCRETE UNDERSTANDING OF PROBABILITY Written in a clear, accessible, and comprehensive manner, the Handbook of Probability presents the fundamentals of probability with an emphasis on the balance of theory, application, and methodology. Utilizing basic examples throughout, the handbook expertly transitions between concepts and practice to allow readers an inclusive introduction to the field of probability. The book provides a useful format with self-contained chapters, allowing the reader easy and quick reference. Each chapter includes an introductio

  2. Real analysis and probability

    CERN Document Server

    Ash, Robert B; Lukacs, E

    1972-01-01

    Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory.Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var

  3. an intermediate moisture meat

    African Journals Online (AJOL)

    STORAGESEVER

    2008-07-04

    Jul 4, 2008 ... traditional SM muscle without compromising quality. ... technique is intermediate moisture food processing. ... Traditionally, most tsire suya producers use ..... quality of Chinese purebred and European X Chinese crossbred ...

  4. Bacterial intermediate filaments

    DEFF Research Database (Denmark)

    Charbon, Godefroid; Cabeen, M.; Jacobs-Wagner, C.

    2009-01-01

    Crescentin, which is the founding member of a rapidly growing family of bacterial cytoskeletal proteins, was previously proposed to resemble eukaryotic intermediate filament (IF) proteins based on structural prediction and in vitro polymerization properties. Here, we demonstrate that crescentin...

  5. Mapping Intermediality in Performance

    NARCIS (Netherlands)

    2010-01-01

    Mapping Intermediality in Performance approaches the question of intermediality in relation to performance (especially theatre) from five different angles: performativity and the body; time and space; digital culture and posthumanism; networks; pedagogy and praxis. In this engaging

  6. Intermediate algebra & analytic geometry

    CERN Document Server

    Gondin, William R

    1967-01-01

    Intermediate Algebra & Analytic Geometry Made Simple focuses on the principles, processes, calculations, and methodologies involved in intermediate algebra and analytic geometry. The publication first offers information on linear equations in two unknowns and variables, functions, and graphs. Discussions focus on graphic interpretations, explicit and implicit functions, first quadrant graphs, variables and functions, determinate and indeterminate systems, independent and dependent equations, and defective and redundant systems. The text then examines quadratic equations in one variable, system

  7. Introduction to probability

    CERN Document Server

    Freund, John E

    1993-01-01

    Thorough, lucid coverage of permutations and factorials, probabilities and odds, frequency interpretation, mathematical expectation, decision making, postulates of probability, rule of elimination, binomial distribution, geometric distribution, standard deviation, law of large numbers, and much more. Exercises with some solutions. Summary. Bibliography. Includes 42 black-and-white illustrations. 1973 edition.

  8. Probability, Nondeterminism and Concurrency

    DEFF Research Database (Denmark)

    Varacca, Daniele

    Nondeterminism is modelled in domain theory by the notion of a powerdomain, while probability is modelled by that of the probabilistic powerdomain. Some problems arise when we want to combine them in order to model computation in which both nondeterminism and probability are present. In particula...

  9. Janus-faced probability

    CERN Document Server

    Rocchi, Paolo

    2014-01-01

    The problem of probability interpretation was long overlooked before exploding in the 20th century, when the frequentist and subjectivist schools formalized two conflicting conceptions of probability. Beyond the radical followers of the two schools, a circle of pluralist thinkers tends to reconcile the opposing concepts. The author uses two theorems in order to prove that the various interpretations of probability do not come into opposition and can be used in different contexts. The goal here is to clarify the multifold nature of probability by means of a purely mathematical approach and to show how philosophical arguments can only serve to deepen actual intellectual contrasts. The book can be considered as one of the most important contributions in the analysis of probability interpretation in the last 10-15 years.

  10. Developing and pre-testing a decision board to facilitate informed choice about delivery approach in uncomplicated pregnancy

    Directory of Open Access Journals (Sweden)

    Wood Stephen

    2009-10-01

    Full Text Available Abstract Background The rate of caesarean sections is increasing worldwide, yet medical literature informing women with uncomplicated pregnancies about relative risks and benefits of elective caesarean section (CS) compared with vaginal delivery (VD) remains scarce. A decision board may address this gap, providing systematic evidence-based information so that patients can more fully understand their treatment options. The objective of our study was to design and pre-test a decision board to guide clinical discussions and enhance informed decision-making related to delivery approach (CS or VD) in uncomplicated pregnancy. Methods Development of the decision board involved two preliminary studies to determine women's preferred mode of risk presentation and a systematic literature review for the most comprehensive presentation of medical risks at the time (VD and CS). Forty women were recruited to pre-test the tool. Eligible subjects were of childbearing age (18-40 years) but were not pregnant, in order to avoid raising the expectation among pregnant women that CS was a universally available birth option. Women selected their preferred delivery approach and completed the Decisional Conflict Scale to measure decisional uncertainty before and after reviewing the decision board. They also answered open-ended questions reflecting what they had learned, whether or not the information had helped them to choose between birth methods, and additional information that should be included. Descriptive statistics were used to analyse sample characteristics and women's choice of delivery approach pre/post decision board. Change in decisional conflict was measured using Wilcoxon's sign rank test for each of the three subscales. Results The majority of women reported that they had learned something new (n = 37, 92%) and that the tool had helped them make a hypothetical choice between delivery approaches (n = 34, 85%). Women wanted more information about neonatal risks and
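
    A hedged sketch of the pre/post analysis named above, using scipy's paired Wilcoxon signed-rank test on one Decisional Conflict subscale; the scores are fabricated.

```python
from scipy.stats import wilcoxon

# Hypothetical subscale scores before and after viewing the board.
pre  = [3.2, 2.8, 3.5, 3.0, 2.9, 3.6, 3.1, 2.7]
post = [2.1, 2.0, 2.6, 2.4, 2.2, 2.8, 2.3, 2.1]

stat, p_value = wilcoxon(pre, post)   # paired, non-parametric
print(f"W = {stat}, p = {p_value:.4f}")
```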

  11. Probability and Measure

    CERN Document Server

    Billingsley, Patrick

    2012-01-01

    Praise for the Third Edition "It is, as far as I'm concerned, among the best books in math ever written....if you are a mathematician and want to have the top reference in probability, this is it." (Amazon.com, January 2006) A complete and comprehensive classic in probability and measure theory Probability and Measure, Anniversary Edition by Patrick Billingsley celebrates the achievements and advancements that have made this book a classic in its field for the past 35 years. Now re-issued in a new style and format, but with the reliable content that the third edition was revered for, this

  12. The concept of probability

    International Nuclear Information System (INIS)

    Bitsakis, E.I.; Nicolaides, C.A.

    1989-01-01

    The concept of probability is now, and always has been, central to the debate on the interpretation of quantum mechanics. Furthermore, probability permeates all of science, as well as our everyday life. The papers included in this volume, written by leading proponents of the ideas expressed, embrace a broad spectrum of thought and results: mathematical, physical, epistemological, and experimental, both specific and general. The contributions are arranged in parts under the following headings: Following Schroedinger's thoughts; Probability and quantum mechanics; Aspects of the arguments on nonlocality; Bell's theorem and EPR correlations; Real or Gedanken experiments and their interpretation; Questions about irreversibility and stochasticity; and Epistemology, interpretation and culture. (author). refs.; figs.; tabs

  13. Probability for statisticians

    CERN Document Server

    Shorack, Galen R

    2017-01-01

    This 2nd edition textbook offers a rigorous introduction to measure theoretic probability with particular attention to topics of interest to mathematical statisticians—a textbook for courses in probability for students in mathematical statistics. It is recommended to anyone interested in the probability underlying modern statistics, providing a solid grounding in the probabilistic tools and techniques necessary to do theoretical research in statistics. For the teaching of probability theory to post graduate statistics students, this is one of the most attractive books available. Of particular interest is a presentation of the major central limit theorems via Stein's method either prior to or alternative to a characteristic function presentation. Additionally, there is considerable emphasis placed on the quantile function as well as the distribution function. The bootstrap and trimming are both presented. Martingale coverage includes coverage of censored data martingales. The text includes measure theoretic...

  14. Probability and Bayesian statistics

    CERN Document Server

    1987-01-01

    This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics" which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985, the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are published especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability across psychological aspects of formulating subjective probability statements, abstract measure theoretical considerations, contributions to theoretical statistics an...

  15. Probability and Statistical Inference

    OpenAIRE

    Prosper, Harrison B.

    2006-01-01

    These lectures introduce key concepts in probability and statistical inference at a level suitable for graduate students in particle physics. Our goal is to paint as vivid a picture as possible of the concepts covered.

  16. Probabilities in physics

    CERN Document Server

    Hartmann, Stephan

    2011-01-01

    Many results of modern physics--those of quantum mechanics, for instance--come in a probabilistic guise. But what do probabilistic statements in physics mean? Are probabilities matters of objective fact and part of the furniture of the world, as objectivists think? Or do they only express ignorance or belief, as Bayesians suggest? And how are probabilistic hypotheses justified and supported by empirical evidence? Finally, what does the probabilistic nature of physics imply for our understanding of the world? This volume is the first to provide a philosophical appraisal of probabilities in all of physics. Its main aim is to make sense of probabilistic statements as they occur in the various physical theories and models and to provide a plausible epistemology and metaphysics of probabilities. The essays collected here consider statistical physics, probabilistic modelling, and quantum mechanics, and critically assess the merits and disadvantages of objectivist and subjectivist views of probabilities in these fie...

  17. Probability an introduction

    CERN Document Server

    Grimmett, Geoffrey

    2014-01-01

    Probability is an area of mathematics of tremendous contemporary importance across all aspects of human endeavour. This book is a compact account of the basic features of probability and random processes at the level of first and second year mathematics undergraduates and Masters' students in cognate fields. It is suitable for a first course in probability, plus a follow-up course in random processes including Markov chains. A special feature is the authors' attention to rigorous mathematics: not everything is rigorous, but the need for rigour is explained at difficult junctures. The text is enriched by simple exercises, together with problems (with very brief hints) many of which are taken from final examinations at Cambridge and Oxford. The first eight chapters form a course in basic probability, being an account of events, random variables, and distributions - discrete and continuous random variables are treated separately - together with simple versions of the law of large numbers and the central limit th...

  18. Probability in physics

    CERN Document Server

    Hemmo, Meir

    2012-01-01

    What is the role and meaning of probability in physical theory, in particular in two of the most successful theories of our age, quantum physics and statistical mechanics? Laws once conceived as universal and deterministic, such as Newton's laws of motion, or the second law of thermodynamics, are replaced in these theories by inherently probabilistic laws. This collection of essays by some of the world's foremost experts presents an in-depth analysis of the meaning of probability in contemporary physics. Among the questions addressed are: How are probabilities defined? Are they objective or subjective? What is their explanatory value? What are the differences between quantum and classical probabilities? The result is an informative and thought-provoking book for the scientifically inquisitive.

  19. [Therapy of intermediate uveitis].

    Science.gov (United States)

    Doycheva, D; Deuter, C; Zierhut, M

    2014-12-01

    Intermediate uveitis is a form of intraocular inflammation in which the vitreous body is the major site of inflammation. Intermediate uveitis is primarily treated medicinally, and systemic corticosteroids are the mainstay of therapy. When recurrence of uveitis or side effects occur during corticosteroid therapy, an immunosuppressive treatment is required. Cyclosporine A is the only immunosuppressive agent that is approved for therapy of uveitis in Germany; however, other immunosuppressive drugs have also been shown to be effective and well tolerated in patients with intermediate uveitis. In severe therapy-refractory cases, when conventional immunosuppressive therapy has failed, biologics can be used. In patients with unilateral uveitis, or when systemic therapy is contraindicated because of side effects, an intravitreal steroid treatment can be carried out. In certain cases a vitrectomy may be used.

  20. Bentonite buffer pre-test. Core drilling of drillholes ONK-PP264...267 in ONKALO at Olkiluoto 2010

    International Nuclear Information System (INIS)

    Toropainen, V.

    2010-12-01

    Suomen Malmi Oy (Smoy) core drilled four drillholes for bentonite buffer pre-test in ONKALO at Eurajoki, Olkiluoto in July 2010. The identification numbers of the holes are ONK-PP264..267, and the lengths of the drillholes are approximately 4.30 metres each. The drillholes are 75.7 mm by diameter. The drillholes were drilled in a niche at access tunnel chainage 1475. The hydraulic DE 130 drilling rig was used for the work. The drilling water was taken from the ONKALO drilling water pipeline and premixed sodium fluorescein was used as a label agent in the drilling water. In addition to drilling, the drillcores were logged and reported by geologist. Geological logging included the following parameters: lithology, foliation, fracture parameters, fractured zones, core loss, weathering, fracture frequency, RQD and rock quality. The main rock type in the drillholes is pegmatitic granite. The average fracture frequency in the drill cores is 4.0 pcs / m and the average RQD value 94.2 %. (orig.)
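
    The two core-logging indices reported, fracture frequency and RQD, follow from the piece lengths in a drill run; the sketch below uses invented lengths and, for simplicity, ignores core loss (which the full RQD definition accounts for).

```python
# Piece lengths (cm) recovered from one hypothetical drill run.
pieces_cm = [35, 8, 22, 5, 60, 12, 9, 41, 18]
run_m = sum(pieces_cm) / 100.0

# RQD: percentage of the run made up of pieces >= 10 cm.
rqd = 100.0 * sum(p for p in pieces_cm if p >= 10) / sum(pieces_cm)
# Fracture frequency: breaks per metre of core.
fracture_frequency = (len(pieces_cm) - 1) / run_m
print(f"RQD = {rqd:.1f} %, fracture frequency = {fracture_frequency:.1f} pcs/m")
```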

  1. Probability in quantum mechanics

    Directory of Open Access Journals (Sweden)

    J. G. Gilson

    1982-01-01

    Full Text Available By using a fluid theory which is an alternative to quantum theory but from which the latter can be deduced exactly, the long-standing problem of how quantum mechanics is related to stochastic processes is studied. It can be seen how the Schrödinger probability density has a relationship to time spent on small sections of an orbit, just as the probability density has in some classical contexts.

  2. Quantum computing and probability.

    Science.gov (United States)

    Ferry, David K

    2009-11-25

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.

  3. Quantum computing and probability

    International Nuclear Information System (INIS)

    Ferry, David K

    2009-01-01

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction. (viewpoint)

  4. Mobile communication and intermediality

    DEFF Research Database (Denmark)

    Helles, Rasmus

    2013-01-01

    The article argues the importance of intermediality as a concept for research in mobile communication and media. The constant availability of several, partially overlapping channels for communication (texting, calls, email, Facebook, etc.) requires that we adopt an integrated view of the various communicative affordances of mobile devices in order to understand how people choose between them for different purposes. It is argued that mobile communication makes intermediality especially central, as the choice of medium is detached from the location of stationary media and begins to follow the user across...

  5. Money distribution with intermediation

    OpenAIRE

    Teles, Caio Augusto Colnago

    2013-01-01

    This paper analyzes the distribution of money holdings in a commodity money search-based model with intermediation. Introducing heterogeneity of costs into the Kiyotaki and Wright (1989) model, Cavalcanti and Puzzello (2010) gives rise to a non-degenerate distribution of money. We extend this model further by introducing intermediation in the trading process. We show that the distribution of money matters for savings decisions. This gives rise to a fixed point problem for the

  6. The perception of probability.

    Science.gov (United States)

    Gallistel, C R; Krishan, Monika; Liu, Ye; Miller, Reilly; Latham, Peter E

    2014-01-01

    We present a computational model to explain the results from experiments in which subjects estimate the hidden probability parameter of a stepwise nonstationary Bernoulli process outcome by outcome. The model captures the following results qualitatively and quantitatively, with only 2 free parameters: (a) Subjects do not update their estimate after each outcome; they step from one estimate to another at irregular intervals. (b) The joint distribution of step widths and heights cannot be explained on the assumption that a threshold amount of change must be exceeded in order for them to indicate a change in their perception. (c) The mapping of observed probability to the median perceived probability is the identity function over the full range of probabilities. (d) Precision (how close estimates are to the best possible estimate) is good and constant over the full range. (e) Subjects quickly detect substantial changes in the hidden probability parameter. (f) The perceived probability sometimes changes dramatically from one observation to the next. (g) Subjects sometimes have second thoughts about a previous change perception, after observing further outcomes. (h) The frequency with which they perceive changes moves in the direction of the true frequency over sessions. (Explaining this finding requires 2 additional parametric assumptions.) The model treats the perception of the current probability as a by-product of the construction of a compact encoding of the experienced sequence in terms of its change points. It illustrates the why and the how of intermittent Bayesian belief updating and retrospective revision in simple perception. It suggests a reinterpretation of findings in the recent literature on the neurobiology of decision making. (PsycINFO Database Record (c) 2014 APA, all rights reserved).
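
    A minimal sketch, not the authors' change-point model: one simple way to track a stepwise-changing Bernoulli parameter outcome by outcome is a forgetful Beta-Bernoulli update, shown below with invented settings.

```python
import random

random.seed(1)
alpha, beta_, gamma = 1.0, 1.0, 0.98   # uniform prior; forgetting factor
true_p, estimates = 0.2, []
for t in range(400):
    if t == 200:
        true_p = 0.7                   # hidden step change
    x = random.random() < true_p
    alpha = gamma * alpha + x          # decayed success count
    beta_ = gamma * beta_ + (1 - x)    # decayed failure count
    estimates.append(alpha / (alpha + beta_))

print(round(estimates[190], 2), round(estimates[399], 2))  # ~0.2, ~0.7
```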

  7. Irreversibility and conditional probability

    International Nuclear Information System (INIS)

    Stuart, C.I.J.M.

    1989-01-01

    The mathematical entropy - unlike physical entropy - is simply a measure of uniformity for probability distributions in general. So understood, conditional entropies have the same logical structure as conditional probabilities. If, as is sometimes supposed, conditional probabilities are time-reversible, then so are conditional entropies and, paradoxically, both then share this symmetry with physical equations of motion. The paradox is, of course, that probabilities yield a direction to time both in statistical mechanics and quantum mechanics, while the equations of motion do not. The supposed time-reversibility of both conditionals seems also to involve a form of retrocausality that is related to, but possibly not the same as, that described by Costa de Beauregard. The retrocausality is paradoxically at odds with the generally presumed irreversibility of the quantum mechanical measurement process. Further paradox emerges if the supposed time-reversibility of the conditionals is linked with the idea that the thermodynamic entropy is the same thing as 'missing information', since this confounds the thermodynamic and mathematical entropies. However, it is shown that irreversibility is a formal consequence of conditional entropies and, hence, of conditional probabilities also. 8 refs. (Author)

  8. The pleasures of probability

    CERN Document Server

    Isaac, Richard

    1995-01-01

    The ideas of probability are all around us. Lotteries, casino gambling, the almost non-stop polling which seems to mold public policy more and more: these are a few of the areas where principles of probability impinge in a direct way on the lives and fortunes of the general public. At a more removed level there is modern science which uses probability and its offshoots like statistics and the theory of random processes to build mathematical descriptions of the real world. In fact, twentieth-century physics, in embracing quantum mechanics, has a world view that is at its core probabilistic in nature, contrary to the deterministic one of classical physics. In addition to all this muscular evidence of the importance of probability ideas it should also be said that probability can be lots of fun. It is a subject where you can start thinking about amusing, interesting, and often difficult problems with very little mathematical background. In this book, I wanted to introduce a reader with at least a fairl...

  9. Experimental Probability in Elementary School

    Science.gov (United States)

    Andrew, Lane

    2009-01-01

    Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.
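
    A sketch of the experiment-first approach with two dice, comparing an empirical estimate against the exact probability:

```python
import random

# Estimate P(sum of two dice == 7) by experiment, then compare to 6/36.
trials = 10_000
hits = sum(random.randint(1, 6) + random.randint(1, 6) == 7
           for _ in range(trials))
print(hits / trials, 6 / 36)
```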

  10. Improving Ranking Using Quantum Probability

    OpenAIRE

    Melucci, Massimo

    2011-01-01

    The paper shows that ranking information units by quantum probability differs from ranking them by classical probability, provided the same data are used for parameter estimation. As probability of detection (also known as recall or power) and probability of false alarm (also known as fallout or size) measure the quality of ranking, we point out and show that ranking by quantum probability yields higher probability of detection than ranking by classical probability provided a given probability of ...

  11. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2010-01-01

    This paper establishes that every random utility discrete choice model (RUM) has a representation that can be characterized by a choice-probability generating function (CPGF) with specific properties, and that every function with these specific properties is consistent with a RUM. The choice probabilities from the RUM are obtained from the gradient of the CPGF. Mixtures of RUM are characterized by logarithmic mixtures of their associated CPGF. The paper relates CPGF to multivariate extreme value distributions, and reviews and extends methods for constructing generating functions for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended to competing risk survival models.
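
    One standard special case, given as an illustration rather than taken from the paper's text: for the multinomial logit model the CPGF is the logsum, and differentiating it does return the familiar choice probabilities,

        G(v_1,\dots,v_J) = \log \sum_{j=1}^{J} e^{v_j},
        \qquad
        P_i = \frac{\partial G}{\partial v_i} = \frac{e^{v_i}}{\sum_{j=1}^{J} e^{v_j}}.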

  12. Probability and stochastic modeling

    CERN Document Server

    Rotar, Vladimir I

    2012-01-01

    Basic Notions; Sample Space and Events; Probabilities; Counting Techniques; Independence and Conditional Probability; Independence; Conditioning; The Borel-Cantelli Theorem; Discrete Random Variables; Random Variables and Vectors; Expected Value; Variance and Other Moments; Inequalities for Deviations; Some Basic Distributions; Convergence of Random Variables; The Law of Large Numbers; Conditional Expectation; Generating Functions; Branching Processes; Random Walk Revisited; Branching Processes; Generating Functions; Branching Processes Revisited; More on Random Walk; Markov Chains; Definitions and Examples; Probability Distributions of Markov Chains; The First Step Analysis; Passage Times; Variables Defined on a Markov Chain; Ergodicity and Stationary Distributions; A Classification of States and Ergodicity; Continuous Random Variables; Continuous Distributions; Some Basic Distributions; Continuous Multivariate Distributions; Sums of Independent Random Variables; Conditional Distributions and Expectations; Distributions in the General Case; Simulation; Distribution F...

  13. Collision Probability Analysis

    DEFF Research Database (Denmark)

    Hansen, Peter Friis; Pedersen, Preben Terndrup

    1998-01-01

    It is the purpose of this report to apply a rational model for prediction of ship-ship collision probabilities as a function of the ship and the crew characteristics and the navigational environment for MS Dextra sailing on a route between Cadiz and the Canary Islands. The most important ship and crew characteristics are: ship speed, ship manoeuvrability, the layout of the navigational bridge, the radar system, the number and the training of navigators, the presence of a look out etc. The main parameters affecting the navigational environment are ship traffic density, probability distributions of wind speeds ... probability, i.e. a study of the navigator's role in resolving critical situations; a causation factor is derived as a second step. The report documents the first step in a probabilistic collision damage analysis. Future work will include calculation of energy released for crushing of structures giving ...

  14. Grouped to Achieve: Are There Benefits to Assigning Students to Heterogeneous Cooperative Learning Groups Based on Pre-Test Scores?

    Science.gov (United States)

    Werth, Arman Karl

    Cooperative learning has been one of the most widely used instructional practices around the world since the early 1980s. Small learning groups have been in existence since the beginning of the human race. These groups have grown in their variance and complexity over time. Classrooms are getting more diverse every year and instructors need a way to take advantage of this diversity to improve learning. The purpose of this study was to see if heterogeneous cooperative learning groups based on student achievement can be used as a differentiated instructional strategy to increase students' ability to demonstrate knowledge of science concepts and ability to do engineering design. This study includes two different groups made up of two different middle school science classrooms of 25-30 students. These students were given an engineering design problem to solve within cooperative learning groups. One class was put into heterogeneous cooperative learning groups based on students' pre-test scores. The other class was grouped based on random assignment. The study measured the difference between each class's pre-post gains, students' responses to a group interaction form and interview questions addressing their perceptions of the makeup of their groups. The findings of the study were that there was no significant difference between learning gains for the treatment and comparison groups. There was a significant difference between the treatment and comparison groups in student perceptions of their group's ability to stay on task and manage their time efficiently. Both the comparison and treatment groups had a positive perception of the composition of their cooperative learning groups.

  15. Estimating Subjective Probabilities

    DEFF Research Database (Denmark)

    Andersen, Steffen; Fountain, John; Harrison, Glenn W.

    2014-01-01

    ... either construct elicitation mechanisms that control for risk aversion, or construct elicitation mechanisms which undertake 'calibrating adjustments' to elicited reports. We illustrate how the joint estimation of risk attitudes and subjective probabilities can provide the calibration adjustments that theory calls for. We illustrate this approach using data from a controlled experiment with real monetary consequences to the subjects. This allows the observer to make inferences about the latent subjective probability, under virtually any well-specified model of choice under subjective risk, while still ...

  16. Introduction to imprecise probabilities

    CERN Document Server

    Augustin, Thomas; de Cooman, Gert; Troffaes, Matthias C M

    2014-01-01

    In recent years, the theory has become widely accepted and has been further developed, but a detailed introduction is needed in order to make the material available and accessible to a wide audience. This will be the first book providing such an introduction, covering core theory and recent developments which can be applied to many application areas. All authors of individual chapters are leading researchers on the specific topics, assuring high quality and up-to-date contents. An Introduction to Imprecise Probabilities provides a comprehensive introduction to imprecise probabilities, includin

  17. Classic Problems of Probability

    CERN Document Server

    Gorroochurn, Prakash

    2012-01-01

    "A great book, one that I will certainly add to my personal library."—Paul J. Nahin, Professor Emeritus of Electrical Engineering, University of New Hampshire Classic Problems of Probability presents a lively account of the most intriguing aspects of statistics. The book features a large collection of more than thirty classic probability problems which have been carefully selected for their interesting history, the way they have shaped the field, and their counterintuitive nature. From Cardano's 1564 Games of Chance to Jacob Bernoulli's 1713 Golden Theorem to Parrondo's 1996 Perplexin

  18. Pre-Test and Work Plan as an Effective Learning Strategy in the Advanced Engineering Materials Practicum of the Department of Mechanical Engineering Education, FT UNY

    Directory of Open Access Journals (Sweden)

    Nurdjito Nurdjito

    2013-09-01

    Full Text Available To find the most effective learning strategy for the practicum in the materials laboratory of the Department of Mechanical Engineering Education, Faculty of Engineering, Yogyakarta State University (YSU), a study was conducted to determine the effect of applying a pre-test and work plan on the learning activities and achievement of students in the laboratory. This action research used the purposive random sampling technique, with the pre-test and work plan administered as the treatment. The data were collected through an achievement test and analyzed using a t-test in SPSS. The results indicated that applying the pre-test and work plan in addition to the standard module was more effective than normative learning using the module alone (t = 3.055, p = 0.003 < 0.05). Implementing the pre-test and work plan alongside the standard modules improved the students' motivation, independence and readiness to learn as well as cooperation among the students, and achievement therefore improved as well. The mastery of competencies increased significantly, as shown by the rise of the mode from 66 to 85 and of the mean from 73.12 to 79.32 in the experimental group.
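
    A minimal sketch of the reported comparison (the scores below are hypothetical, invented for illustration; the study's raw data are not given in the abstract, and the analysis there was run in SPSS rather than Python):

        from scipy import stats

        # Hypothetical post-test scores, for illustration only.
        pretest_workplan_group = [78, 85, 81, 90, 74, 88, 83, 79, 86, 80]
        module_only_group      = [70, 75, 72, 68, 77, 74, 69, 73, 71, 76]

        # Independent-samples t-test, analogous to the reported t = 3.055, p = 0.003.
        t_stat, p_value = stats.ttest_ind(pretest_workplan_group, module_only_group)
        print(f"t = {t_stat:.3f}, p = {p_value:.3f}")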

  19. Counterexamples in probability

    CERN Document Server

    Stoyanov, Jordan M

    2013-01-01

    While most mathematical examples illustrate the truth of a statement, counterexamples demonstrate a statement's falsity. Enjoyable topics of study, counterexamples are valuable tools for teaching and learning. The definitive book on the subject in regards to probability, this third edition features the author's revisions and corrections plus a substantial new appendix.

  20. Epistemology and Probability

    CERN Document Server

    Plotnitsky, Arkady

    2010-01-01

    Offers an exploration of the relationships between epistemology and probability in the work of Niels Bohr, Werner Heisenberg, and Erwin Schrodinger; in quantum mechanics; and in modern physics. This book considers the implications of these relationships and of quantum theory for our understanding of the nature of thinking and knowledge in general

  1. Transition probabilities for atoms

    International Nuclear Information System (INIS)

    Kim, Y.K.

    1980-01-01

    Current status of advanced theoretical methods for transition probabilities for atoms and ions is discussed. An experiment on the f values of the resonance transitions of the Kr and Xe isoelectronic sequences is suggested as a test for the theoretical methods

  2. The Intermediate Neutrino Program

    CERN Document Server

    Adams, C.; Ankowski, A.M.; Asaadi, J.A.; Ashenfelter, J.; Axani, S.N.; Babu, K.; Backhouse, C.; Band, H.R.; Barbeau, P.S.; Barros, N.; Bernstein, A.; Betancourt, M.; Bishai, M.; Blucher, E.; Bouffard, J.; Bowden, N.; Brice, S.; Bryan, C.; Camilleri, L.; Cao, J.; Carlson, J.; Carr, R.E.; Chatterjee, A.; Chen, M.; Chen, S.; Chiu, M.; Church, E.D.; Collar, J.I.; Collin, G.; Conrad, J.M.; Convery, M.R.; Cooper, R.L.; Cowen, D.; Davoudiasl, H.; de Gouvea, A.; Dean, D.J.; Deichert, G.; Descamps, F.; DeYoung, T.; Diwan, M.V.; Djurcic, Z.; Dolinski, M.J.; Dolph, J.; Donnelly, B.; Dwyer, D.A.; Dytman, S.; Efremenko, Y.; Everett, L.L.; Fava, A.; Figueroa-Feliciano, E.; Fleming, B.; Friedland, A.; Fujikawa, B.K.; Gaisser, T.K.; Galeazzi, M.; Galehouse, D.C.; Galindo-Uribarri, A.; Garvey, G.T.; Gautam, S.; Gilje, K.E.; Gonzalez-Garcia, M.; Goodman, M.C.; Gordon, H.; Gramellini, E.; Green, M.P.; Guglielmi, A.; Hackenburg, R.W.; Hackenburg, A.; Halzen, F.; Han, K.; Hans, S.; Harris, D.; Heeger, K.M.; Herman, M.; Hill, R.; Holin, A.; Huber, P.; Jaffe, D.E.; Johnson, R.A.; Joshi, J.; Karagiorgi, G.; Kaufman, L.J.; Kayser, B.; Kettell, S.H.; Kirby, B.J.; Klein, J.R.; Kolomensky, Yu. G.; Kriske, R.M.; Lane, C.E.; Langford, T.J.; Lankford, A.; Lau, K.; Learned, J.G.; Ling, J.; Link, J.M.; Lissauer, D.; Littenberg, L.; Littlejohn, B.R.; Lockwitz, S.; Lokajicek, M.; Louis, W.C.; Luk, K.; Lykken, J.; Marciano, W.J.; Maricic, J.; Markoff, D.M.; Martinez Caicedo, D.A.; Mauger, C.; Mavrokoridis, K.; McCluskey, E.; McKeen, D.; McKeown, R.; Mills, G.; Mocioiu, I.; Monreal, B.; Mooney, M.R.; Morfin, J.G.; Mumm, P.; Napolitano, J.; Neilson, R.; Nelson, J.K.; Nessi, M.; Norcini, D.; Nova, F.; Nygren, D.R.; Orebi Gann, G.D.; Palamara, O.; Parsa, Z.; Patterson, R.; Paul, P.; Pocar, A.; Qian, X.; Raaf, J.L.; Rameika, R.; Ranucci, G.; Ray, H.; Reyna, D.; Rich, G.C.; Rodrigues, P.; Romero, E.Romero; Rosero, R.; Rountree, S.D.; Rybolt, B.; Sanchez, M.C.; Santucci, G.; Schmitz, D.; Scholberg, K.; Seckel, D.; Shaevitz, M.; Shrock, R.; Smy, M.B.; Soderberg, M.; Sonzogni, A.; Sousa, A.B.; Spitz, J.; St. John, J.M.; Stewart, J.; Strait, J.B.; Sullivan, G.; Svoboda, R.; Szelc, A.M.; Tayloe, R.; Thomson, M.A.; Toups, M.; Vacheret, A.; Vagins, M.; Van de Water, R.G.; Vogelaar, R.B.; Weber, M.; Weng, W.; Wetstein, M.; White, C.; White, B.R.; Whitehead, L.; Whittington, D.W.; Wilking, M.J.; Wilson, R.J.; Wilson, P.; Winklehner, D.; Winn, D.R.; Worcester, E.; Yang, L.; Yeh, M.; Yokley, Z.W.; Yoo, J.; Yu, B.; Yu, J.; Zhang, C.

    2015-01-01

    The US neutrino community gathered at the Workshop on the Intermediate Neutrino Program (WINP) at Brookhaven National Laboratory February 4-6, 2015 to explore opportunities in neutrino physics over the next five to ten years. Scientists from particle, astroparticle and nuclear physics participated in the workshop. The workshop examined promising opportunities for neutrino physics in the intermediate term, including possible new small to mid-scale experiments, US contributions to large experiments, upgrades to existing experiments, R&D plans and theory. The workshop was organized into two sets of parallel working group sessions, divided by physics topics and technology. Physics working groups covered topics on Sterile Neutrinos, Neutrino Mixing, Neutrino Interactions, Neutrino Properties and Astrophysical Neutrinos. Technology sessions were organized into Theory, Short-Baseline Accelerator Neutrinos, Reactor Neutrinos, Detector R&D and Source, Cyclotron and Meson Decay at Rest sessions. This report summ...

  3. The Intermediate Neutrino Program

    Energy Technology Data Exchange (ETDEWEB)

    Adams, C.; et al.

    2015-03-23

    The US neutrino community gathered at the Workshop on the Intermediate Neutrino Program (WINP) at Brookhaven National Laboratory February 4-6, 2015 to explore opportunities in neutrino physics over the next five to ten years. Scientists from particle, astroparticle and nuclear physics participated in the workshop. The workshop examined promising opportunities for neutrino physics in the intermediate term, including possible new small to mid-scale experiments, US contributions to large experiments, upgrades to existing experiments, R&D plans and theory. The workshop was organized into two sets of parallel working group sessions, divided by physics topics and technology. Physics working groups covered topics on Sterile Neutrinos, Neutrino Mixing, Neutrino Interactions, Neutrino Properties and Astrophysical Neutrinos. Technology sessions were organized into Theory, Short-Baseline Accelerator Neutrinos, Reactor Neutrinos, Detector R&D and Source, Cyclotron and Meson Decay at Rest sessions. This report summarizes discussion and conclusions from the workshop.

  4. The Intermediate Neutrino Program

    Energy Technology Data Exchange (ETDEWEB)

    Adams, C. [Yale Univ., New Haven, CT (United States); Alonso, J. R. [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States); Ankowski, A. M. [Virginia Polytechnic Inst. and State Univ. (Virginia Tech), Blacksburg, VA (United States); Asaadi, J. A. [Syracuse Univ., NY (United States); Ashenfelter, J. [Yale Univ., New Haven, CT (United States); Axani, S. N. [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States); Babu, K [Oklahoma State Univ., Stillwater, OK (United States); Backhouse, C. [California Inst. of Technology (CalTech), Pasadena, CA (United States); Band, H. R. [Yale Univ., New Haven, CT (United States); Barbeau, P. S. [Duke Univ., Durham, NC (United States); Barros, N. [Univ. of Pennsylvania, Philadelphia, PA (United States); Bernstein, A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Betancourt, M. [Illinois Inst. of Technology, Chicago, IL (United States); Bishai, M. [Brookhaven National Lab. (BNL), Upton, NY (United States); Blucher, E. [Univ. of Chicago, IL (United States); Bouffard, J. [State Univ. of New York (SUNY), Albany, NY (United States); Bowden, N. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Brice, S. [Illinois Inst. of Technology, Chicago, IL (United States); Bryan, C. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Camilleri, L. [Columbia Univ., New York, NY (United States); Cao, J. [Inst. of High Energy Physics, Beijing (China); Carlson, J. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Carr, R. E. [Columbia Univ., New York, NY (United States); Chatterjee, A. [Univ. of Texas, Arlington, TX (United States); Chen, M. [Univ. of California, Irvine, CA (United States); Chen, S. [Tsinghua Univ., Beijing (China); Chiu, M. [Brookhaven National Lab. (BNL), Upton, NY (United States); Church, E. D. [Illinois Inst. of Technology, Chicago, IL (United States); Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Collar, J. I. [Univ. of Chicago, IL (United States); Collin, G. [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States); Conrad, J. M. [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States); Convery, M. R. [SLAC National Accelerator Lab., Menlo Park, CA (United States); Cooper, R. L. [Indiana Univ., Bloomington, IN (United States); Cowen, D. [Pennsylvania State Univ., University Park, PA (United States); Davoudiasl, H. [Brookhaven National Lab. (BNL), Upton, NY (United States); Gouvea, A. D. [Northwestern Univ., Evanston, IL (United States); Dean, D. J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Deichert, G. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Descamps, F. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); DeYoung, T. [Michigan State Univ., East Lansing, MI (United States); Diwan, M. V. [Brookhaven National Lab. (BNL), Upton, NY (United States); Djurcic, Z. [Argonne National Lab. (ANL), Argonne, IL (United States); Dolinski, M. J. [Drexel Univ., Philadelphia, PA (United States); Dolph, J. [Brookhaven National Lab. (BNL), Upton, NY (United States); Donnelly, B. [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States); Dwyer, D. A. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Dytman, S. [Univ. of Pittsburgh, PA (United States); Efremenko, Y. [Univ. of Tennessee, Knoxville, TN (United States); Everett, L. L. [Univ. of Wisconsin, Madison, WI (United States); Fava, A. 
[University of Padua, Padova (Italy); Figueroa-Feliciano, E. [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States); Fleming, B. [Yale Univ., New Haven, CT (United States); Friedland, A. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Fujikawa, B. K. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Gaisser, T. K. [Univ. of Delaware, Newark, DE (United States); Galeazzi, M. [Univ. of Miami, FL (United States); Galehouse, DC [Univ. of Akron, OH (United States); Galindo-Uribarri, A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Garvey, G. T. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Gautam, S. [Tribhuvan Univ., Kirtipur (Nepal); Gilje, K. E. [Illinois Inst. of Technology, Chicago, IL (United States); Gonzalez-Garcia, M. [Stony Brook Univ., NY (United States); Goodman, M. C. [Argonne National Lab. (ANL), Argonne, IL (United States); Gordon, H. [Brookhaven National Lab. (BNL), Upton, NY (United States); Gramellini, E. [Yale Univ., New Haven, CT (United States); Green, M. P. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Guglielmi, A. [University of Padua, Padova (Italy); Hackenburg, R. W. [Brookhaven National Lab. (BNL), Upton, NY (United States); Hackenburg, A. [Yale Univ., New Haven, CT (United States); Halzen, F. [Univ. of Wisconsin, Madison, WI (United States); Han, K. [Yale Univ., New Haven, CT (United States); Hans, S. [Brookhaven National Lab. (BNL), Upton, NY (United States); Harris, D. [Illinois Inst. of Technology, Chicago, IL (United States); Heeger, K. M. [Yale Univ., New Haven, CT (United States); Herman, M. [Brookhaven National Lab. (BNL), Upton, NY (United States); Hill, R. [Univ. of Chicago, IL (United States); Holin, A. [Univ. College London, Bloomsbury (United Kingdom); Huber, P. [Virginia Polytechnic Inst. and State Univ. (Virginia Tech), Blacksburg, VA (United States); Jaffe, D. E. [Brookhaven National Lab. (BNL), Upton, NY (United States); Johnson, R. A. [Univ. of Cincinnati, OH (United States); Joshi, J. [Brookhaven National Lab. (BNL), Upton, NY (United States); Karagiorgi, G. [Univ. of Manchester (United Kingdom); Kaufman, L. J. [Indiana Univ., Bloomington, IN (United States); Kayser, B. [Illinois Inst. of Technology, Chicago, IL (United States); Kettell, S. H. [Brookhaven National Lab. (BNL), Upton, NY (United States); Kirby, B. J. [Brookhaven National Lab. (BNL), Upton, NY (United States); Klein, J. R. [Univ. of Texas, Arlington, TX (United States); Kolomensky, Y. G. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Univ. of California, Berkeley, CA (United States); Kriske, R. M. [Univ. of Minnesota, Minneapolis, MN (United States); Lane, C. E. [Drexel Univ., Philadelphia, PA (United States); Langford, T. J. [Yale Univ., New Haven, CT (United States); Lankford, A. [Univ. of California, Irvine, CA (United States); Lau, K. [Univ. of Houston, TX (United States); Learned, J. G. [Univ. of Hawaii, Honolulu, HI (United States); Ling, J. [Univ. of Illinois, Urbana-Champaign, IL (United States); Link, J. M. [Virginia Polytechnic Inst. and State Univ. (Virginia Tech), Blacksburg, VA (United States); Lissauer, D. [Brookhaven National Lab. (BNL), Upton, NY (United States); Littenberg, L. [Brookhaven National Lab. (BNL), Upton, NY (United States); Littlejohn, B. R. [Illinois Inst. of Technology, Chicago, IL (United States); Lockwitz, S. [Illinois Inst. of Technology, Chicago, IL (United States); Lokajicek, M. [Inst. 
of Physics of the Academy of Sciences of Czech Republic, Prague (Czech Republic); Louis, W. C. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Luk, K. [Univ. of California, Berkeley, CA (United States); Lykken, J. [Illinois Inst. of Technology, Chicago, IL (United States); Marciano, W. J. [Brookhaven National Lab. (BNL), Upton, NY (United States); Maricic, J. [Univ. of Hawaii, Honolulu, HI (United States); Markoff, D. M. [North Carolina Central Univ., Durham, NC (United States); Caicedo, D. A. M. [Illinois Inst. of Technology, Chicago, IL (United States); Mauger, C. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Mavrokoridis, K. [Univ. of Liverpool (United Kingdom); McCluskey, E. [Illinois Inst. of Technology, Chicago, IL (United States); McKeen, D. [Univ. of Washington, Seattle, WA (United States); McKeown, R. [Thomas Jefferson National Accelerator Facility (TJNAF), Newport News, VA (United States); Mills, G. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Mocioiu, I. [Pennsylvania State Univ., University Park, PA (United States); Monreal, B. [Univ. of California, Santa Barbara, CA (United States); Mooney, M. R. [Brookhaven National Lab. (BNL), Upton, NY (United States); Morfin, J. G. [Illinois Inst. of Technology, Chicago, IL (United States); Mumm, P. [National Inst. of Standards and Technology (NIST), Boulder, CO (United States); Napolitano, J. [Temple Univ., Philadelphia, PA (United States); Neilson, R. [Drexel Univ., Philadelphia, PA (United States); Nelson, J. K. [College of William and Mary, Williamsburg, VA (United States); Nessi, M. [European Organization for Nuclear Research (CERN), Geneva (Switzerland); Norcini, D. [Yale Univ., New Haven, CT (United States); Nova, F. [Univ. of Texas, Austin, TX (United States); Nygren, D. R. [Univ. of Texas, Arlington, TX (United States); Gann, GDO [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Univ. of California, Berkeley, CA (United States); Palamara, O. [Illinois Inst. of Technology, Chicago, IL (United States); Parsa, Z. [Brookhaven National Lab. (BNL), Upton, NY (United States); Patterson, R. [California Inst. of Technology (CalTech), Pasadena, CA (United States); Paul, P. [Stony Brook Univ., NY (United States); Pocar, A. [Univ. of Massachusetts, Amherst, MA (United States); Qian, X. [Brookhaven National Lab. (BNL), Upton, NY (United States); Raaf, J. L. [Illinois Inst. of Technology, Chicago, IL (United States); Rameika, R. [Illinois Inst. of Technology, Chicago, IL (United States); Ranucci, G. [National Inst. of Nuclear Physics, Milano (Italy); Ray, H. [Univ. of Florida, Gainesville, FL (United States); Reyna, D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Rich, G. C. [Triangle Universities Nuclear Lab., Durham, NC (United States); Rodrigues, P. [Univ. of Rochester, NY (United States); Romero, E. R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Univ. of Tennessee, Knoxville, TN (United States); Rosero, R. [Brookhaven National Lab. (BNL), Upton, NY (United States); Rountree, S. D. [Virginia Polytechnic Inst. and State Univ. (Virginia Tech), Blacksburg, VA (United States); Rybolt, B. [Univ. of Tennessee, Knoxville, TN (United States); Sanchez, M. C. [Iowa State Univ., Ames, IA (United States); Santucci, G. [Stony Brook Univ., NY (United States); Schmitz, D. [Univ. of Chicago, IL (United States); Scholberg, K. [Duke Univ., Durham, NC (United States); Seckel, D. [Univ. of Delaware, Newark, DE (United States); Shaevitz, M. 
[Columbia Univ., New York, NY (United States); Shrock, R. [Stony Brook Univ., NY (United States); Smy, M. B. [Univ. of California, Irvine, CA (United States); Soderberg, M. [Syracuse Univ., NY (United States); Sonzogni, A. [Brookhaven National Lab. (BNL), Upton, NY (United States); Sousa, A. B. [Univ. of Cincinnati, OH (United States); Spitz, J. [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States); John, J. M. S. [Univ. of Cincinnati, OH (United States); Stewart, J. [Brookhaven National Lab. (BNL), Upton, NY (United States); Strait, J. B. [Illinois Inst. of Technology, Chicago, IL (United States); Sullivan, G. [Univ. of Maryland, College Park, MD (United States); Svoboda, R. [Univ. of California, Davis, CA (United States); Szelc, A. M. [Yale Univ., New Haven, CT (United States); Tayloe, R. [Indiana Univ., Bloomington, IN (United States); Thomson, M. A. [Univ. of Cambridge (United Kingdom); Toups, M. [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States); Vacheret, A. [Univ. of Oxford (United Kingdom); Vagins, M. [Univ. of California, Irvine, CA (United States); Water, R. G. V. D. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Vogelaar, R. B. [Virginia Polytechnic Inst. and State Univ. (Virginia Tech), Blacksburg, VA (United States); Weber, M. [Bern (Switzerland); Weng, W. [Brookhaven National Lab. (BNL), Upton, NY (United States); Wetstein, M. [Univ. of Chicago, IL (United States); White, C. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); White, B. R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Whitehead, L. [Univ. of Houston, TX (United States); Whittington, D. W. [Indiana Univ., Bloomington, IN (United States); Wilking, M. J. [Stony Brook Univ., NY (United States); Wilson, R. J. [Colorado State Univ., Fort Collins, CO (United States); Wilson, P. [Illinois Inst. of Technology, Chicago, IL (United States); Winklehner, D. [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States); Winn, D. R. [Fairfield Univ., CT (United States); Worcester, E. [Brookhaven National Lab. (BNL), Upton, NY (United States); Yang, L. [Univ. of Illinois, Urbana-Champaign, IL (United States); Yeh, M [Brookhaven National Lab. (BNL), Upton, NY (United States); Yokley, Z. W. [Virginia Polytechnic Inst. and State Univ. (Virginia Tech), Blacksburg, VA (United States); Yoo, J. [Illinois Inst. of Technology, Chicago, IL (United States); Yu, B. [Brookhaven National Lab. (BNL), Upton, NY (United States); Yu, J. [Univ. of Texas, Arlington, TX (United States); Zhang, C. [Brookhaven National Lab. (BNL), Upton, NY (United States)

    2017-04-03

    The US neutrino community gathered at the Workshop on the Intermediate Neutrino Program (WINP) at Brookhaven National Laboratory February 4-6, 2015 to explore opportunities in neutrino physics over the next five to ten years. Scientists from particle, astroparticle and nuclear physics participated in the workshop. The workshop examined promising opportunities for neutrino physics in the intermediate term, including possible new small to mid-scale experiments, US contributions to large experiments, upgrades to existing experiments, R&D plans and theory. The workshop was organized into two sets of parallel working group sessions, divided by physics topics and technology. Physics working groups covered topics on Sterile Neutrinos, Neutrino Mixing, Neutrino Interactions, Neutrino Properties and Astrophysical Neutrinos. Technology sessions were organized into Theory, Short-Baseline Accelerator Neutrinos, Reactor Neutrinos, Detector R&D and Source, Cyclotron and Meson Decay at Rest sessions. This report summarizes discussion and conclusions from the workshop.

  5. Intermediate energy data

    International Nuclear Information System (INIS)

    Koning, A.J.; Fukahori, T.; Hasegawa, A.

    1998-01-01

    Subgroup 13 (SG13) on Intermediate Energy Nuclear data was formed by NEA Nuclear Science Committee to solve common problems of these types of data for nuclear applications. An overview is presented in this final report of the present activities of SG13, including data needs, high-priority nuclear data request list (nuclides), compilation of experimental data, specialists meetings and benchmarks, data formats and data libraries. Some important accomplishments are summarized, and recommendations are presented. (R.P.)

  6. Negative probability in the framework of combined probability

    OpenAIRE

    Burgin, Mark

    2013-01-01

    Negative probability has found diverse applications in theoretical physics. Thus, construction of sound and rigorous mathematical foundations for negative probability is important for physics. There are different axiomatizations of conventional probability. So, it is natural that negative probability also has different axiomatic frameworks. In the previous publications (Burgin, 2009; 2010), negative probability was mathematically formalized and rigorously interpreted in the context of extende...

  7. A Study into the Effects of Competitive Team-Based Learning and 'Learning Together' on the Oral Performance of Intermediate EFL Learners

    Directory of Open Access Journals (Sweden)

    Mahdi Mardani

    2015-08-01

    Full Text Available The present study intended to look into and compare the possible effects of Competitive Team-Based Learning (CTBL) and Learning Together (LT), or Cooperative Group-Based Learning (CGBL), the most popular method of Cooperative Learning (CL), on the oral performance of Iranian EFL intermediate students. After administering the oral interview, this researcher selected a group of 40 almost homogeneous Iranian intermediate students and randomly assigned them to control and experimental groups (20 per group). Based on their scores, the experimental class was divided into 5 almost heterogeneous teams of four members each, whereas in the control group the participants were allowed to form their own preferred groups. For six weeks (18 sessions of 90 minutes each), both groups received the same course materials, instructor, curriculum, out-of-class and in-class assignments, schedule of instruction and equivalent methods of evaluation, but the experimental group experienced language learning via CTBL rather than via CGBL, as their counterparts in the control group did. At the end of the course the oral interview was again administered to both groups, and the scores obtained on the pre-test and post-test were analyzed through different statistical procedures. The results of the study rejected the null hypothesis and provided evidence supporting the hypothesis that CTBL can have a more significant effect on improving the oral performance of Iranian intermediate students. This researcher will discuss the probable causes for the results of the study, shed light on the pedagogical implications, and suggest recommendations for further research.

  8. Bayesian Probability Theory

    Science.gov (United States)

    von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo

    2014-06-01

    Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.

  9. Waste Package Misload Probability

    International Nuclear Information System (INIS)

    Knudsen, J.K.

    2001-01-01

    The objective of this calculation is to calculate the probability of occurrence of fuel assembly (FA) misloads (i.e., an FA placed in the wrong location) and of FA damage during FA movements. The scope of this calculation is provided by the information obtained from the Framatome ANP 2001a report. The first step in this calculation is to categorize each fuel-handling event that occurred at nuclear power plants; the categories are based on FAs being damaged or misloaded. The next step is to determine the total number of FAs involved in each event. Using this information, a probability of occurrence will be calculated for FA misload and FA damage events. This calculation is an expansion of preliminary work performed by Framatome ANP 2001a
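
    The calculation described is, at its core, a frequency estimate. A sketch with invented counts (the actual event and movement counts come from the Framatome ANP report cited in the record, not from here):

        # Hypothetical event counts, for illustration only.
        misload_events     = 5
        damage_events      = 12
        total_fa_movements = 250_000

        # Frequency estimates of the per-movement probabilities.
        p_misload = misload_events / total_fa_movements
        p_damage  = damage_events / total_fa_movements
        print(f"P(misload per FA movement) ~ {p_misload:.2e}")
        print(f"P(damage per FA movement)  ~ {p_damage:.2e}")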

  10. Probability theory and applications

    CERN Document Server

    Hsu, Elton P

    1999-01-01

    This volume, with contributions by leading experts in the field, is a collection of lecture notes of the six minicourses given at the IAS/Park City Summer Mathematics Institute. It introduces advanced graduates and researchers in probability theory to several of the currently active research areas in the field. Each course is self-contained with references and contains basic materials and recent results. Topics include interacting particle systems, percolation theory, analysis on path and loop spaces, and mathematical finance. The volume gives a balanced overview of the current status of probability theory. An extensive bibliography for further study and research is included. This unique collection presents several important areas of current research and a valuable survey reflecting the diversity of the field.

  11. Paradoxes in probability theory

    CERN Document Server

    Eckhardt, William

    2013-01-01

    Paradoxes provide a vehicle for exposing misinterpretations and misapplications of accepted principles. This book discusses seven paradoxes surrounding probability theory. Some remain the focus of controversy; others have allegedly been solved, though the accepted solutions are demonstrably incorrect. Each paradox is shown to rest on one or more fallacies. Instead of the esoteric, idiosyncratic, and untested methods that have been brought to bear on these problems, the book invokes uncontroversial probability principles, acceptable both to frequentists and subjectivists. The philosophical disputation inspired by these paradoxes is shown to be misguided and unnecessary; for instance, startling claims concerning human destiny and the nature of reality are directly related to fallacious reasoning in a betting paradox, and a problem analyzed in philosophy journals is resolved by means of a computer program.

  12. Measurement uncertainty and probability

    CERN Document Server

    Willink, Robin

    2013-01-01

    A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.

  13. Model uncertainty and probability

    International Nuclear Information System (INIS)

    Parry, G.W.

    1994-01-01

    This paper discusses the issue of model uncertainty. The use of probability as a measure of an analyst's uncertainty, as well as a means of describing random processes, has caused some confusion, even though the two uses represent different types of uncertainty with respect to modeling a system. The importance of maintaining the distinction between the two types is illustrated with a simple example

  14. Retrocausality and conditional probability

    International Nuclear Information System (INIS)

    Stuart, C.I.J.M.

    1989-01-01

    Costa de Beauregard has proposed that physical causality be identified with conditional probability. The proposal is shown to be vulnerable on two accounts. The first, though mathematically trivial, seems to be decisive so far as the current formulation of the proposal is concerned. The second lies in a physical inconsistency which seems to have its source in a Copenhagenlike disavowal of realism in quantum mechanics. 6 refs. (Author)

  15. The intermediate state in Patd

    African Journals Online (AJOL)

    ... Jesus had assumed (concerning the 'intermediate state') as existing anything which does not exist. Three basic things about the intermediate state emerge from the parable: (a) Jesus recognizes that at the moment of death, in ipso articulo ...

  16. [Intermediate energy nuclear physics

    International Nuclear Information System (INIS)

    1989-01-01

    This report summarizes work in experimental Intermediate Energy Nuclear Physics carried out between October 1, 1988 and October 1, 1989 at the Nuclear Physics Laboratory of the University of Colorado, Boulder, under grant DE-FG02-86ER-40269 with the United States Department of Energy. The experimental program is very broadly based, including pion-nucleon studies at TRIUMF, inelastic pion scattering and charge exchange reactions at LAMPF, and nucleon charge exchange at LAMPF/WNR. In addition, a number of other topics related to accelerator physics are described in this report

  17. Probability mapping of contaminants

    Energy Technology Data Exchange (ETDEWEB)

    Rautman, C.A.; Kaplan, P.G. [Sandia National Labs., Albuquerque, NM (United States); McGraw, M.A. [Univ. of California, Berkeley, CA (United States); Istok, J.D. [Oregon State Univ., Corvallis, OR (United States); Sigda, J.M. [New Mexico Inst. of Mining and Technology, Socorro, NM (United States)

    1994-04-01

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. The probability mapping approach illustrated in this paper appears to offer site operators a reasonable, quantitative methodology for many environmental remediation decisions and allows evaluation of the risk associated with those decisions. For example, output from this approach can be used in quantitative, cost-based decision models for evaluating possible site characterization and/or remediation plans, resulting in selection of the risk-adjusted, least-cost alternative. The methodology is completely general, and the techniques are applicable to a wide variety of environmental restoration projects. The probability-mapping approach is illustrated by application to a contaminated site at the former DOE Feed Materials Production Center near Fernald, Ohio. Soil geochemical data, collected as part of the Uranium-in-Soils Integrated Demonstration Project, have been used to construct a number of geostatistical simulations of potential contamination for parcels approximately the size of a selective remediation unit (the 3-m width of a bulldozer blade). Each such simulation accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination (potential clean-up or personnel-hazard thresholds).
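
    The post-processing step lends itself to a compact sketch: given a stack of equally likely simulated concentration fields, the probability map is just the per-parcel fraction of realizations above the threshold. (Synthetic lognormal fields stand in here for the geostatistical simulations; the grid size, parameters and threshold are invented for illustration.)

        import numpy as np

        rng = np.random.default_rng(0)
        # 200 equally likely realizations on a 50 x 50 grid of remediation parcels.
        realizations = rng.lognormal(mean=3.0, sigma=1.0, size=(200, 50, 50))

        threshold = 35.0  # hypothetical clean-up threshold
        # Per-parcel probability of exceeding the threshold.
        p_exceed = (realizations > threshold).mean(axis=0)
        print(p_exceed.shape, float(p_exceed.min()), float(p_exceed.max()))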

  18. Probability mapping of contaminants

    International Nuclear Information System (INIS)

    Rautman, C.A.; Kaplan, P.G.; McGraw, M.A.; Istok, J.D.; Sigda, J.M.

    1994-01-01

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. The probability mapping approach illustrated in this paper appears to offer site operators a reasonable, quantitative methodology for many environmental remediation decisions and allows evaluation of the risk associated with those decisions. For example, output from this approach can be used in quantitative, cost-based decision models for evaluating possible site characterization and/or remediation plans, resulting in selection of the risk-adjusted, least-cost alternative. The methodology is completely general, and the techniques are applicable to a wide variety of environmental restoration projects. The probability-mapping approach is illustrated by application to a contaminated site at the former DOE Feed Materials Production Center near Fernald, Ohio. Soil geochemical data, collected as part of the Uranium-in-Soils Integrated Demonstration Project, have been used to construct a number of geostatistical simulations of potential contamination for parcels approximately the size of a selective remediation unit (the 3-m width of a bulldozer blade). Each such simulation accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination (potential clean-up or personnel-hazard thresholds)

  19. Probability of causation approach

    International Nuclear Information System (INIS)

    Jose, D.E.

    1988-01-01

    Probability of causation (PC) is sometimes viewed as a great improvement by those persons who are not happy with the present rulings of courts in radiation cases. The author does not share that hope and expects that PC will not play a significant role in these issues for at least the next decade. If it is ever adopted in a legislative compensation scheme, it will be used in a way that is unlikely to please most scientists. Consequently, PC is a false hope for radiation scientists, and its best contribution may well lie in some of the spin-off effects, such as an influence on medical practice

  20. Generalized Probability Functions

    Directory of Open Access Journals (Sweden)

    Alexandre Souto Martinez

    2009-01-01

    Full Text Available From the integration of non-symmetric hyperbolas, a one-parameter generalization of the logarithmic function is obtained. Inverting this function, one obtains the generalized exponential function. Motivated by mathematical curiosity, we show that these generalized functions are suitable for generalizing some probability density functions (pdfs). A very reliable rank distribution can be conveniently described by the generalized exponential function. Finally, we turn our attention to the generalization of one- and two-tail stretched exponential functions. We obtain, as particular cases, the generalized error function, the Zipf-Mandelbrot pdf, and the generalized Gaussian and Laplace pdfs. Their cumulative functions and moments were also obtained analytically.
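
    One common one-parameter generalization consistent with this description (shown as an illustration; the paper's own notation may differ) comes from integrating the non-symmetric hyperbola t^{\tilde q - 1}:

        \ln_{\tilde q}(x) = \int_1^x t^{\,\tilde q - 1}\,dt = \frac{x^{\tilde q} - 1}{\tilde q},
        \qquad
        \lim_{\tilde q \to 0} \ln_{\tilde q}(x) = \ln x,

    with the generalized exponential as its inverse:

        e_{\tilde q}(x) = (1 + \tilde q\,x)^{1/\tilde q},
        \qquad
        \lim_{\tilde q \to 0} e_{\tilde q}(x) = e^x.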

  1. Probability in High Dimension

    Science.gov (United States)

    2014-06-30

    precisely the content of the following result. The price we pay is that the assumption that A is a packing in (F, ‖·‖₁) is too weak to make this happen... Régularité des trajectoires des fonctions aléatoires gaussiennes. In: École d'Été de Probabilités de Saint-Flour, IV-1974, pp. 1–96. Lecture Notes in... Lectures on probability theory and statistics (Saint-Flour, 1994), Lecture Notes in Math., vol. 1648, pp. 165–294. Springer, Berlin (1996). 50. Ledoux ...

  2. An assessment of time involved in pre-test case review and counseling for a whole genome sequencing clinical research program.

    Science.gov (United States)

    Williams, Janet L; Faucett, W Andrew; Smith-Packard, Bethanny; Wagner, Monisa; Williams, Marc S

    2014-08-01

    Whole genome sequencing (WGS) is being used for evaluation of individuals with undiagnosed disease of suspected genetic origin. Implementing WGS into clinical practice will place an increased burden upon care teams with regard to pre-test patient education and counseling about results. To quantitate the time needed for appropriate pre-test evaluation of participants in WGS testing, we documented the time spent by our clinical research group on various activities related to program preparation, participant screening, and consent prior to WGS. Participants were children or young adults with autism, intellectual or developmental disability, and/or congenital anomalies, who have remained undiagnosed despite previous evaluation, and their biologic parents. Results showed that significant time was spent in securing allocation of clinical research space to counsel participants and families, and in acquisition and review of participant's medical records. Pre-enrollment chart review identified two individuals with existing diagnoses resulting in savings of $30,000 for the genome sequencing alone, as well as saving hours of personnel time for genome interpretation and communication of WGS results. New WGS programs should plan for costs associated with additional pre-test administrative planning and patient evaluation time that will be required to provide high quality care.

  3. Probable maximum flood control

    International Nuclear Information System (INIS)

    DeGabriele, C.E.; Wu, C.L.

    1991-11-01

    This study proposes preliminary design concepts to protect the waste-handling facilities and all shaft and ramp entries to the underground from the probable maximum flood (PMF) in the current design configuration for the proposed Nevada Nuclear Waste Storage Investigation (NNWSI) repository. Flood protection provisions were furnished by the United States Bureau of Reclamation (USBR) or developed from USBR data. Proposed flood protection provisions include site grading, drainage channels, and diversion dikes. Figures are provided to show these proposed flood protection provisions at each area investigated. These areas are the central surface facilities (including the waste-handling building and waste treatment building), tuff ramp portal, waste ramp portal, men-and-materials shaft, emplacement exhaust shaft, and exploratory shafts facility

  4. Prognostic value of stress echocardiography in women with high (⩾80%) probability of coronary artery disease

    OpenAIRE

    Davar, J; Roberts, E; Coghlan, J; Evans, T; Lipkin, D

    2001-01-01

    OBJECTIVE—To assess the prognostic significance of stress echocardiography in women with a high probability of coronary artery disease (CAD).
SETTING—Secondary and tertiary cardiology unit at a university teaching hospital.
PARTICIPANTS—A total of 135 women (mean (SD) age 63 (9) years) with pre-test probability of CAD ⩾80% were selected from a database of patients investigated by treadmill or dobutamine stress echocardiography between 1995 and 1998.
MAIN OUTCOME MEASURES—Patients were followe...

  5. Impact of MCNP Unresolved Resonance Probability-Table Treatment on Uranium and Plutonium Benchmarks

    International Nuclear Information System (INIS)

    Mosteller, R.D.; Little, R.C.

    1999-01-01

    A probability-table treatment recently has been incorporated into an intermediate version of the MCNP Monte Carlo code named MCNP4XS. This paper presents MCNP4XS results for a variety of uranium and plutonium criticality benchmarks, calculated with and without the probability-table treatment. It is shown that the probability-table treatment can produce small but significant reactivity changes for plutonium and 233U systems with intermediate spectra. More importantly, it can produce substantial reactivity increases for systems with large amounts of 238U and intermediate spectra
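
    A schematic of what a probability-table treatment does in the unresolved resonance range: rather than using a single smooth cross section, the code samples a cross-section value from a precomputed table for each energy band. (The table values and band structure below are invented for illustration and are not MCNP data or MCNP's actual implementation.)

        import random

        # Hypothetical probability table for one unresolved-resonance energy band:
        # (cumulative probability, total cross section in barns).
        TABLE = [(0.2, 8.0), (0.5, 11.0), (0.8, 15.0), (1.0, 24.0)]

        def sample_cross_section(rng, table):
            """Sample one cross-section value from the band's probability table."""
            xi = rng.random()
            for cum_p, xs in table:
                if xi <= cum_p:
                    return xs
            return table[-1][1]

        rng = random.Random(0)
        samples = [sample_cross_section(rng, TABLE) for _ in range(100_000)]
        print(sum(samples) / len(samples))  # band-averaged cross section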

  6. Probability and rational choice

    Directory of Open Access Journals (Sweden)

    David Botting

    2014-05-01

    Full Text Available http://dx.doi.org/10.5007/1808-1711.2014v18n1p1 In this paper I will discuss the rationality of reasoning about the future. There are two things that we might like to know about the future: which hypotheses are true and what will happen next. To put it in philosophical language, I aim to show that there are methods by which inferring to a generalization (selecting a hypothesis) and inferring to the next instance (singular predictive inference) can be shown to be normative and the method itself shown to be rational, where this is due in part to being based on evidence (although not in the same way) and in part on a prior rational choice. I will also argue that these two inferences have been confused, being distinct not only conceptually (as nobody disputes) but also in their results (the value given to the probability of the hypothesis being not in general that given to the next instance), and that methods that are adequate for one are not by themselves adequate for the other. A number of debates over method founder on this confusion and do not show what the debaters think they show.
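
    A standard Bayesian illustration of the distinction the abstract insists on (an example added here, not the paper's own): with a uniform prior on a Bernoulli parameter \theta and n successes observed in n trials, Laplace's rule of succession gives the predictive probability of the next instance as

        P(\text{success on trial } n+1 \mid \text{data}) = \frac{n+1}{n+2},

    while the posterior probability of the universal generalization \theta = 1 stays at 0, because that hypothesis carries zero prior mass under a continuous prior. The probability of the hypothesis and the probability of the next instance genuinely come apart.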

  7. Discourses and Models of Intermediality

    OpenAIRE

    Schröter, Jens

    2011-01-01

    In his article "Discourses and Models of Intermediality" Jens Schröter discusses the question of what relations different discourses pose between different "media." Schröter identifies four models of discourse: 1) synthetic intermediality: a "fusion" of different media into super-media, a model with roots in the Wagnerian concept of the Gesamtkunstwerk and with political connotations; 2) formal (or transmedial) intermediality: a concept based on formal structures not "specific" to one medium but ...

  8. Information acquisition and financial intermediation

    OpenAIRE

    Boyarchenko, Nina

    2012-01-01

    This paper considers the problem of information acquisition in an intermediated market, where the specialists have access to superior technology for acquiring information. These informational advantages of specialists relative to households lead to disagreement between the two groups, changing the shape of the intermediation-constrained region of the economy and increasing the frequency of periods when the intermediation constraint binds. Acquiring the additional information is, however, cost...

  9. COVAL, Compound Probability Distribution for Function of Probability Distribution

    International Nuclear Information System (INIS)

    Astolfi, M.; Elbaz, J.

    1979-01-01

    1 - Nature of the physical problem solved: Computation of the probability distribution of a function of variables, given the probability distribution of the variables themselves. 'COVAL' has been applied to reliability analysis of a structure subject to random loads. 2 - Method of solution: Numerical transformation of probability distributions
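
    COVAL itself uses a numerical transformation of the input distributions; as a rough stand-in, the same question can be illustrated by Monte Carlo, in the spirit of the reliability application mentioned (the distributions and parameters below are invented):

        import numpy as np

        rng = np.random.default_rng(42)
        # Hypothetical inputs: load L and strength S of a structure.
        L = rng.normal(100.0, 15.0, size=1_000_000)
        S = rng.lognormal(np.log(150.0), 0.2, size=1_000_000)

        margin = S - L                          # the function of the random variables
        p_failure = float((margin < 0).mean())  # estimate of P(S < L)
        print(f"P(failure) ~ {p_failure:.4f}")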

  10. The MHD intermediate shock interaction with an intermediate wave: Are intermediate shocks physical?

    International Nuclear Information System (INIS)

    Wu, C.C.

    1988-01-01

    Contrary to the usual belief that MHD intermediate shocks are extraneous, the author has recently shown by numerical solutions of dissipative MHD equations that intermediate shocks are admissible and can be formed through nonlinear steepening from a continuous wave. In this paper, he clarifies the differences between the conventional view and these results by studying the interaction of an MHD intermediate shock with an intermediate wave. The study reaffirms his results. In addition, the study shows that there exists a larger class of shock-like solutions in the time-dependent dissipative MHD equations than are given by the MHD Rankine-Hugoniot relations. It also suggests a mechanism for forming rotational discontinuities through the interaction of an intermediate shock with an intermediate wave. The results are of importance not only to MHD shock theory but also to studies such as magnetic field reconnection models

  11. Comparison of pre-test analyses with the Sizewell-B 1:10 scale prestressed concrete containment test

    International Nuclear Information System (INIS)

    Dameron, R.A.; Rashid, Y.R.; Parks, M.B.

    1991-01-01

    This paper describes pretest analyses of a one-tenth scale model of the Sizewell-B prestressed concrete containment building. The work was performed by ANATECH Research Corp. under contract with Sandia National Laboratories (SNL). Hydraulic testing of the model was conducted in the United Kingdom by the Central Electricity Generating Board (CEGB). In order to further their understanding of containment behavior, the USNRC, through an agreement with the United Kingdom Atomic Energy Authority (UKAEA), also participated in the test program with SNL serving as their technical agent. The analyses that were conducted included two global axisymmetric models with 'bonded' and 'unbonded' analytical treatment of meridional tendons, a 3D quarter model of the structure, an axisymmetric representation of the equipment hatch region, and local plane stress and r-θ models of a buttress. Results of these analyses are described and compared with the results of the test. A global hoop failure at midheight of the cylinder and a shear/bending type failure at the base of the cylinder wall were both found to have roughly equal probability of occurrence; however, the shear failure mode had higher uncertainty associated with it. Consequently, significant effort was dedicated to improving the modeling capability for concrete shear behavior. This work is also described briefly. 5 refs., 7 figs

  13. A Tale of Two Probabilities

    Science.gov (United States)

    Falk, Ruma; Kendig, Keith

    2013-01-01

    Two contestants debate the notorious probability problem of the sex of the second child. The conclusions boil down to explication of the underlying scenarios and assumptions. Basic principles of probability theory are highlighted.

  14. Intermediate valence spectroscopy

    International Nuclear Information System (INIS)

    Gunnarsson, O.; Schoenhammer, K.

    1987-01-01

    Spectroscopic properties of intermediate valence compounds are studied using the Anderson model. Due to the large orbital and spin degeneracy N_f of the 4f level, 1/N_f can be treated as a small parameter. This approach provides exact T = 0 results for the Anderson impurity model in the limit N_f → ∞, and by adding 1/N_f corrections some properties can be calculated accurately even for N_f = 1 or 2. In particular, valence photoemission and resonance photoemission spectroscopies are studied. A comparison of theoretical and experimental spectra provides an estimate of the parameters in the model. Core level photoemission spectra provide estimates of the coupling between the f-level and the conduction states and of the f-level occupancy. With these parameters the model gives a fair description of other electron spectroscopies. For typical parameters the model predicts two structures in the f-spectrum, namely one structure at the f-level and one at the Fermi energy. The resonance photoemission calculation gives a photon energy dependence for these two peaks in fair agreement with experiment. The peak at the Fermi energy is partly due to a narrow Kondo resonance, resulting from many-body effects and the presence of a continuous, partly filled conduction band. This resonance is related to a large density of low-lying excitations, which explains the large susceptibility and specific heat observed for these systems at low temperatures. 38 references, 11 figures, 2 tables
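
    For orientation, the model referred to here is the N_f-fold degenerate Anderson impurity model; a standard textbook form (not quoted from this record) is

```latex
% Degenerate Anderson impurity model (standard form; not copied from the record).
% m = 1, ..., N_f labels the combined orbital/spin degeneracy of the 4f level.
% In the 1/N_f expansion the limit N_f -> infinity is usually taken with
% N_f V^2 held fixed, and U -> infinity suppresses double f occupancy.
H = \sum_{k,m} \varepsilon_k \, c^{\dagger}_{km} c_{km}
  + \varepsilon_f \sum_{m} f^{\dagger}_{m} f_{m}
  + V \sum_{k,m} \left( c^{\dagger}_{km} f_{m} + f^{\dagger}_{m} c_{km} \right)
  + U \sum_{m < m'} n^{f}_{m}\, n^{f}_{m'}
```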

  15. Introduction to probability with R

    CERN Document Server

    Baclawski, Kenneth

    2008-01-01

    Foreword; Preface; Sets, Events, and Probability; The Algebra of Sets; The Bernoulli Sample Space; The Algebra of Multisets; The Concept of Probability; Properties of Probability Measures; Independent Events; The Bernoulli Process; The R Language; Finite Processes; The Basic Models; Counting Rules; Computing Factorials; The Second Rule of Counting; Computing Probabilities; Discrete Random Variables; The Bernoulli Process: Tossing a Coin; The Bernoulli Process: Random Walk; Independence and Joint Distributions; Expectations; The Inclusion-Exclusion Principle; General Random Variable

  16. A first course in probability

    CERN Document Server

    Ross, Sheldon

    2014-01-01

    A First Course in Probability, Ninth Edition, features clear and intuitive explanations of the mathematics of probability theory, outstanding problem sets, and a variety of diverse examples and applications. This book is ideal for an upper-level undergraduate or graduate level introduction to probability for math, science, engineering and business students. It assumes a background in elementary calculus.

  17. Welding. Performance Objectives. Intermediate Course.

    Science.gov (United States)

    Vincent, Kenneth

    Several intermediate performance objectives and corresponding criterion measures are listed for each of nine terminal objectives for an intermediate welding course. The materials were developed for a 36-week (3 hours daily) course designed to prepare the student for employment in the field of welding. Electric welding and specialized (TIG & MIG)…

  18. Intermediate structure and threshold phenomena

    International Nuclear Information System (INIS)

    Hategan, Cornel

    2004-01-01

    The Intermediate Structure, evidenced through microstructures of the neutron strength function, is reflected in open reaction channels as fluctuations in excitation function of nuclear threshold effects. The intermediate state supporting both neutron strength function and nuclear threshold effect is a micro-giant neutron threshold state. (author)

  19. Intermediate neutron spectrum problems and the intermediate neutron spectrum experiment

    International Nuclear Information System (INIS)

    Jaegers, P.J.; Sanchez, R.G.

    1996-01-01

    Criticality benchmark data for intermediate-energy-spectrum systems do not exist. These systems are dominated by scattering and fission events induced by neutrons with energies between 1 eV and 1 MeV. Nuclear data uncertainties have been reported for such systems that cannot be resolved without benchmark critical experiments. Intermediate-energy-spectrum systems have been proposed for the geological disposition of surplus fissile materials. Without proper benchmarking of the nuclear data in the intermediate energy spectrum, adequate criticality safety margins cannot be guaranteed. The Zeus critical experiment now under construction will provide this necessary benchmark data.

  20. A brief introduction to probability.

    Science.gov (United States)

    Di Paola, Gioacchino; Bertani, Alessandro; De Monte, Lavinia; Tuzzolino, Fabio

    2018-02-01

    The theory of probability has been debated for centuries: back in the 1600s, French mathematicians used the rules of probability to place and win bets. Subsequently, the knowledge of probability has significantly evolved and is now an essential tool for statistics. In this paper, the basic theoretical principles of probability will be reviewed, with the aim of facilitating the comprehension of statistical inference. After a brief general introduction on probability, we will review the concept of the "probability distribution", that is, a function providing the probabilities of occurrence of the different possible outcomes of a categorical or continuous variable. Specific attention will be focused on the normal distribution, the distribution most relevant to statistical analysis.
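
    As a minimal illustration of the "probability distribution" concept reviewed here, the sketch below evaluates probabilities under a normal distribution with SciPy (our own example; the parameter values are invented, not taken from the paper):

```python
# Minimal sketch: the normal distribution as a probability distribution.
# Parameters are invented for illustration.
from scipy.stats import norm

mu, sigma = 100.0, 15.0            # hypothetical mean and standard deviation
dist = norm(loc=mu, scale=sigma)

# P(85 <= X <= 115): probability within one standard deviation of the mean
p = dist.cdf(115) - dist.cdf(85)
print(f"P(85 <= X <= 115) = {p:.3f}")   # ~0.683

# Density at the mean (note: a density value, not a probability)
print(f"f(100) = {dist.pdf(100.0):.4f}")
```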

  1. Propensity, Probability, and Quantum Theory

    Science.gov (United States)

    Ballentine, Leslie E.

    2016-08-01

    Quantum mechanics and probability theory share one peculiarity. Both have well established mathematical formalisms, yet both are subject to controversy about the meaning and interpretation of their basic concepts. Since probability plays a fundamental role in QM, the conceptual problems of one theory can affect the other. We first classify the interpretations of probability into three major classes: (a) inferential probability, (b) ensemble probability, and (c) propensity. Class (a) is the basis of inductive logic; (b) deals with the frequencies of events in repeatable experiments; (c) describes a form of causality that is weaker than determinism. An important, but neglected, paper by P. Humphreys demonstrated that propensity must differ mathematically, as well as conceptually, from probability, but he did not develop a theory of propensity. Such a theory is developed in this paper. Propensity theory shares many, but not all, of the axioms of probability theory. As a consequence, propensity supports the Law of Large Numbers from probability theory, but does not support Bayes' theorem. Although there are particular problems within QM to which any of the classes of probability may be applied, it is argued that the intrinsic quantum probabilities (calculated from a state vector or density matrix) are most naturally interpreted as quantum propensities. This does not alter the familiar statistical interpretation of QM. But the interpretation of quantum states as representing knowledge is untenable. Examples show that a density matrix fails to represent knowledge.

  2. Prediction and probability in sciences

    International Nuclear Information System (INIS)

    Klein, E.; Sacquin, Y.

    1998-01-01

    This book reports the seven presentations made at the third 'Physics and Fundamental Questions' meeting, whose theme was probability and prediction. The concept of probability, invented to apprehend random phenomena, has become an important branch of mathematics, and its applications range from radioactivity to species evolution, via cosmology and the management of very weak risks. The notion of probability is the basis of quantum mechanics and is thus bound to the very nature of matter. The seven topics are: - radioactivity and probability, - statistical and quantum fluctuations, - quantum mechanics as a generalized probability theory, - probability and the irrational efficiency of mathematics, - can we foresee the future of the universe?, - chance, eventuality and necessity in biology, - how to manage weak risks? (A.C.)

  3. Applied probability and stochastic processes

    CERN Document Server

    Sumita, Ushio

    1999-01-01

    Applied Probability and Stochastic Processes is an edited work written in honor of Julien Keilson. This volume has attracted a host of scholars in applied probability, who have made major contributions to the field, and have written survey and state-of-the-art papers on a variety of applied probability topics, including, but not limited to: perturbation method, time reversible Markov chains, Poisson processes, Brownian techniques, Bayesian probability, optimal quality control, Markov decision processes, random matrices, queueing theory and a variety of applications of stochastic processes. The book has a mixture of theoretical, algorithmic, and application chapters providing examples of the cutting-edge work that Professor Keilson has done or influenced over the course of his highly-productive and energetic career in applied probability and stochastic processes. The book will be of interest to academic researchers, students, and industrial practitioners who seek to use the mathematics of applied probability i...

  4. Poisson Processes in Free Probability

    OpenAIRE

    An, Guimei; Gao, Mingchu

    2015-01-01

    We prove a multidimensional Poisson limit theorem in free probability and define joint free Poisson distributions in a non-commutative probability space. We define (compound) free Poisson processes explicitly, similar to the definitions of (compound) Poisson processes in classical probability. We prove that the sum of finitely many freely independent compound free Poisson processes is a compound free Poisson process. We give a step-by-step procedure for constructing a (compound) free Poisso...

  5. PROBABILITY SURVEYS, CONDITIONAL PROBABILITIES AND ECOLOGICAL RISK ASSESSMENT

    Science.gov (United States)

    We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...

  6. Intermediate Levels of Visual Processing

    National Research Council Canada - National Science Library

    Nakayama, Ken

    1998-01-01

    ...) surface representation, here we have shown that there is an intermediate level of visual processing, between the analysis of the image and higher order representations related to specific objects; (2...

  7. The Effectiveness of Problem Based Learning (PBL on Intermediate Financial Accounting Subject

    Directory of Open Access Journals (Sweden)

    Nunuk Suryanti

    2016-12-01

    Full Text Available This research aims to determine the effectiveness of the Problem Based Learning (PBL) model compared with the Drill model in the Intermediate Financial Accounting subject. The research was a quasi-experimental study. The population was four classes of Accounting Education students in the 2014/2015 academic year at the Faculty of Educational Science and Teaching of Riau Islamic University (UIR). The sample was taken by purposive sampling; Problem Based Learning (PBL) was then used in the experimental class and the Drill model in the control class. Data were collected by interview, observation, and tests (pre-test and post-test), and analyzed with an independent-samples test. The findings show no difference in learning outcomes between students taught with the Problem Based Learning (PBL) model and those taught with the Drill model in Intermediate Financial Accounting.
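
    The final between-groups comparison described here is typically run as an independent-samples t-test on post-test scores; a minimal sketch with invented scores (the study's raw data are not reproduced in this record):

```python
# Independent-samples comparison of post-test scores, PBL vs. Drill class.
# Scores are invented for illustration only.
from scipy import stats

pbl_scores   = [72, 80, 65, 78, 85, 70, 74, 69, 81, 77]
drill_scores = [70, 75, 68, 74, 82, 71, 73, 66, 79, 76]

# Welch's t-test (does not assume equal variances)
t, p = stats.ttest_ind(pbl_scores, drill_scores, equal_var=False)
print(f"t = {t:.2f}, p = {p:.3f}")   # p > 0.05 -> no detectable difference
```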

  8. Set-up of a pre-test mock-up experiment in preparation for the HCPB Breeder Unit mock-up experimental campaign

    Energy Technology Data Exchange (ETDEWEB)

    Hernández, F., E-mail: francisco.hernandez@kit.edu [Karlsruhe Institute of Technology (KIT), Institute for Neutron Physics and Reactor Technology (INR) (Germany); Kolb, M. [Karlsruhe Institute of Technology (KIT), Institute for Applied Materials (IAM-WPT) (Germany); Ilić, M.; Kunze, A. [Karlsruhe Institute of Technology (KIT), Institute for Neutron Physics and Reactor Technology (INR) (Germany); Németh, J. [KFKI Research Institute for Particle and Nuclear Physics (Hungary); Weth, A. von der [Karlsruhe Institute of Technology (KIT), Institute for Neutron Physics and Reactor Technology (INR) (Germany)

    2013-10-15

    Highlights: ► As preparation for the HCPB-TBM Breeder Unit out-of-pile testing campaign, a pre-test experiment (PREMUX) has been prepared and is described. ► A new heater system based on a wire heater matrix has been developed to imitate the neutronic volumetric heating, and it is compared with conventional plate heaters. ► The test section is described; preliminary thermal results with the available models are presented and are to be benchmarked against PREMUX. ► The integration of PREMUX in the L-STAR/LL air cooling loop at the Karlsruhe Institute of Technology is shown and future steps are discussed. -- Abstract: The complexity of the experimental set-up for testing a full-scale Breeder Unit (BU) mock-up for the European Helium Cooled Pebble Bed Test Blanket Module (HCPB-TBM) has motivated the construction of a pre-test mock-up experiment (PREMUX) consisting of a slice of the BU in the Li₄SiO₄ region. This pre-test aims at verifying the feasibility of the methods to be used for the subsequent testing of the full-scale BU mock-up. Key parameters needed for the modeling of the breeder material are also to be determined by the Hot Wire Method (HWM). The modeling tools for the thermo-mechanics of the pebble beds and for the mock-up structure are to be calibrated and validated as well. This paper presents the setting-up of PREMUX in the L-STAR/LL facility at the Karlsruhe Institute of Technology. A key requirement of the experiments is to mimic the neutronic volumetric heating. A new heater concept is discussed and compared to several conventional heater configurations with respect to the estimated temperature distribution in the pebble beds. The design and integration of the thermocouple system in the heater matrix and pebble beds is also described, as well as other key aspects of the mock-up (dimensions, layout, cooling system, purge gas line, boundary conditions and integration in the test facility). The adequacy of these methods for the full-scale BU

  9. Use of a National Continuing Medical Education Meeting to Provide Simulation-Based Training in Temporary Hemodialysis Catheter Insertion Skills: A Pre-Test Post-Test Study

    Directory of Open Access Journals (Sweden)

    Edward G Clark

    2014-10-01

    Full Text Available Background: Simulation-based-mastery-learning (SBML is an effective method to train nephrology fellows to competently insert temporary, non-tunneled hemodialysis catheters (NTHCs. Previous studies of SBML for NTHC-insertion have been conducted at a local level. Objectives: Determine if SBML for NTHC-insertion can be effective when provided at a national continuing medical education (CME meeting. Describe the correlation of demographic factors, prior experience with NTHC-insertion and procedural self-confidence with simulated performance of the procedure. Design: Pre-test – post-test study. Setting: 2014 Canadian Society of Nephrology annual meeting. Participants: Nephrology fellows, internal medicine residents and medical students. Measurements: Participants were surveyed regarding demographics, prior NTHC-insertion experience, procedural self-confidence and attitudes regarding the training they received. NTHC-insertion skills were assessed using a 28-item checklist. Methods: Participants underwent a pre-test of their NTHC-insertion skills at the internal jugular site using a realistic patient simulator and ultrasound machine. Participants then had a training session that included a didactic presentation and 2 hours of deliberate practice using the simulator. On the following day, trainees completed a post-test of their NTHC-insertion skills. All participants were required to meet or exceed a minimum passing score (MPS previously set at 79%. Trainees who did not reach the MPS were required to perform more deliberate practice until the MPS was achieved. Results: Twenty-two individuals participated in SBML training. None met or exceeded the MPS at baseline with a median checklist score of 20 (IQR, 7.25 to 21. Seventeen of 22 participants (77% completed post-testing and improved their scores to a median of 27 (IQR, 26 to 28; p < 0.001. All met or exceeded the MPS on their first attempt. There were no significant correlations between demographics

  10. Use of a national continuing medical education meeting to provide simulation-based training in temporary hemodialysis catheter insertion skills: a pre-test post-test study.

    Science.gov (United States)

    Clark, Edward G; Paparello, James J; Wayne, Diane B; Edwards, Cedric; Hoar, Stephanie; McQuillan, Rory; Schachter, Michael E; Barsuk, Jeffrey H

    2014-01-01

    Simulation-based-mastery-learning (SBML) is an effective method to train nephrology fellows to competently insert temporary, non-tunneled hemodialysis catheters (NTHCs). Previous studies of SBML for NTHC-insertion have been conducted at a local level. Determine if SBML for NTHC-insertion can be effective when provided at a national continuing medical education (CME) meeting. Describe the correlation of demographic factors, prior experience with NTHC-insertion and procedural self-confidence with simulated performance of the procedure. Pre-test - post-test study. 2014 Canadian Society of Nephrology annual meeting. Nephrology fellows, internal medicine residents and medical students. Participants were surveyed regarding demographics, prior NTHC-insertion experience, procedural self-confidence and attitudes regarding the training they received. NTHC-insertion skills were assessed using a 28-item checklist. Participants underwent a pre-test of their NTHC-insertion skills at the internal jugular site using a realistic patient simulator and ultrasound machine. Participants then had a training session that included a didactic presentation and 2 hours of deliberate practice using the simulator. On the following day, trainees completed a post-test of their NTHC-insertion skills. All participants were required to meet or exceed a minimum passing score (MPS) previously set at 79%. Trainees who did not reach the MPS were required to perform more deliberate practice until the MPS was achieved. Twenty-two individuals participated in SBML training. None met or exceeded the MPS at baseline with a median checklist score of 20 (IQR, 7.25 to 21). Seventeen of 22 participants (77%) completed post-testing and improved their scores to a median of 27 (IQR, 26 to 28; p < 0.001). All met or exceeded the MPS on their first attempt. There were no significant correlations between demographics, prior experience or procedural self-confidence with pre-test performance. Small sample-size and
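
    With small samples and skewed checklist scores reported as medians and IQRs, the paired pre/post comparison is usually done with a nonparametric test; a minimal sketch with invented checklist scores (not the study's data), assuming a Wilcoxon signed-rank test:

```python
# Paired pre/post comparison of 28-item checklist scores for 17 trainees.
# Scores are invented for illustration; the skewed medians/IQRs in the record
# motivate a nonparametric (Wilcoxon signed-rank) test.
import numpy as np
from scipy import stats

pre  = np.array([20, 7, 15, 21, 9, 18, 12, 20, 16, 8, 21, 14, 19, 10, 17, 13, 20])
post = np.array([27, 26, 28, 27, 26, 28, 27, 26, 27, 26, 28, 27, 26, 27, 28, 26, 27])

w, p = stats.wilcoxon(pre, post)
print(f"median pre = {np.median(pre):.0f}, median post = {np.median(post):.0f}")
print(f"W = {w}, p = {p:.4f}")
```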

  11. Design of an intermediate-scale experiment to validate unsaturated-zone transport models

    International Nuclear Information System (INIS)

    Siegel, M.D.; Hopkins, P.L.; Glass, R.J.; Ward, D.B.

    1991-01-01

    An intermediate-scale experiment is being carried out to evaluate instrumentation and models that might be used for transport-model validation for the Yucca Mountain Site Characterization Project. The experimental test bed is a 6-m high x 3-m diameter caisson filled with quartz sand with a sorbing layer at an intermediate depth. The experiment involves the detection and prediction of the migration of fluid and tracers through an unsaturated porous medium. Pre-test design requires estimation of physical properties of the porous medium, such as the relative permeability, saturation/pressure relations, porosity, and saturated hydraulic conductivity, as well as geochemical properties such as surface complexation constants and empirical K_d's. The pre-test characterization data will be used as input to several computer codes to predict the fluid flow and tracer migration. These include a coupled chemical-reaction/transport model, a stochastic model, and a deterministic model using retardation factors. The calculations will be completed prior to elution of the tracers, providing a basis for validation by comparing the predictions to observed moisture and tracer behavior

  12. The Effect of Problem Solving Task on Critical Reading of Intermediate EFL Learners in Iranian Context

    Directory of Open Access Journals (Sweden)

    Masoud Khalili Sabet

    2017-12-01

    Full Text Available This study investigates the effect of teaching critical thinking through problem solving on the reading comprehension performance of intermediate EFL learners. To this end, forty intermediate students (twenty male and twenty female) studying English at an institute in Ardabil, Iran, were selected based on their scores on the Preliminary English Test and assigned to control and experimental groups. Afterwards, a sample TOEFL reading comprehension pre-test was administered to both groups to ensure homogeneity. The learners in the experimental group were taught through problem-solving instruction, and the learners in the control group were taught through the traditional method of instructing reading comprehension. After ten sessions of instruction, the same sample TOEFL reading comprehension test was given as a post-test to measure possible differences between pre-test and post-test. The findings revealed that teaching problem solving had a statistically significant effect on the EFL learners' reading comprehension performance. It can be concluded that teaching critical thinking through problem solving brings better understanding of the text.

  13. Probability inequalities for decomposition integrals

    Czech Academy of Sciences Publication Activity Database

    Agahi, H.; Mesiar, Radko

    2017-01-01

    Vol. 315, No. 1 (2017), pp. 240-248 ISSN 0377-0427 Institutional support: RVO:67985556 Keywords: Decomposition integral * Superdecomposition integral * Probability inequalities Subject RIV: BA - General Mathematics OBOR OECD: Statistics and probability Impact factor: 1.357, year: 2016 http://library.utia.cas.cz/separaty/2017/E/mesiar-0470959.pdf

  14. Expected utility with lower probabilities

    DEFF Research Database (Denmark)

    Hendon, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte

    1994-01-01

    An uncertain and not just risky situation may be modeled using so-called belief functions assigning lower probabilities to subsets of outcomes. In this article we extend the von Neumann-Morgenstern expected utility theory from probability measures to belief functions. We use this theory...

  15. Deep and intermediate mediterranean water in the western Alboran Sea

    Science.gov (United States)

    Parrilla, Gregorio; Kinder, Thomas H.; Preller, Ruth H.

    1986-01-01

    Hydrographic and current meter data, obtained during June to October 1982, and numerical model experiments are used to study the distribution and flow of Mediterranean waters in the western Alboran Sea. The Intermediate Water is more pronounced in the northern three-fourths of the sea, but its distribution is patchy as manifested by variability of the temperature and salinity maxima at scales ≤10 km. Current meters in the lower Intermediate Water showed mean flow toward the Strait at 2 cm s⁻¹. A reversal of this flow lasted about 2 weeks. A rough estimate of the mean westward Intermediate Water transport was 0.4 × 10⁶ m³ s⁻¹, about one-third of the total outflow, so that the best estimates of the contributions of traditionally defined Intermediate Water and Deep Water account for only about one-half of the total outflow. The Deep Water was uplifted against the southern continental slope from Alboran Island (3°W) to the Strait. There was also a similar but much weaker banking against the Spanish slope, but a deep current record showed that the eastward recirculation implied by this banking is probably intermittent. Two-layer numerical model experiments simulated the Intermediate Water flow with a flat bottom and the Deep Water with realistic bottom topography. Both experiments replicated the major circulation features, and the Intermediate Water flow was concentrated in the north because of rotation and the Deep Water flow in the south because of topographic control.

  16. Invariant probabilities of transition functions

    CERN Document Server

    Zaharopol, Radu

    2014-01-01

    The structure of the set of all the invariant probabilities and the structure of various types of individual invariant probabilities of a transition function are two topics of significant interest in the theory of transition functions, and are studied in this book. The results obtained are useful in ergodic theory and the theory of dynamical systems, which, in turn, can be applied in various other areas (like number theory). They are illustrated using transition functions defined by flows, semiflows, and one-parameter convolution semigroups of probability measures. In this book, all results on transition probabilities that have been published by the author between 2004 and 2008 are extended to transition functions. The proofs of the results obtained are new. For transition functions that satisfy very general conditions the book describes an ergodic decomposition that provides relevant information on the structure of the corresponding set of invariant probabilities. Ergodic decomposition means a splitting of t...

  17. Introduction to probability with Mathematica

    CERN Document Server

    Hastings, Kevin J

    2009-01-01

    Discrete Probability: The Cast of Characters; Properties of Probability; Simulation; Random Sampling; Conditional Probability; Independence; Discrete Distributions: Discrete Random Variables, Distributions, and Expectations; Bernoulli and Binomial Random Variables; Geometric and Negative Binomial Random Variables; Poisson Distribution; Joint, Marginal, and Conditional Distributions; More on Expectation; Continuous Probability: From the Finite to the (Very) Infinite; Continuous Random Variables and Distributions; Continuous Expectation; Continuous Distributions: The Normal Distribution; Bivariate Normal Distribution; New Random Variables from Old; Order Statistics; Gamma Distributions; Chi-Square, Student's t, and F-Distributions; Transformations of Normal Random Variables; Asymptotic Theory: Strong and Weak Laws of Large Numbers; Central Limit Theorem; Stochastic Processes and Applications: Markov Chains; Poisson Processes; Queues; Brownian Motion; Financial Mathematics; Appendix: Introduction to Mathematica; Glossary of Mathematica Commands for Probability; Short Answers...

  18. Reservoir characterization and final pre-test analysis in support of the compressed-air-energy-storage Pittsfield aquifer field test in Pike County, Illinois

    Energy Technology Data Exchange (ETDEWEB)

    Wiles, L.E.; McCann, R.A.

    1983-06-01

    The work reported is part of a field experimental program to demonstrate and evaluate compressed air energy storage in a porous-media aquifer reservoir near Pittsfield, Illinois. The reservoir is described. Numerical modeling of the reservoir was performed concurrently with site development. The numerical models were applied to predict the thermohydraulic performance of the porous-media reservoir. This reservoir characterization and pre-test analysis included evaluations of bubble development, water coning, thermal development, and near-wellbore desaturation. The work was undertaken to define the time required to develop an air storage bubble of adequate size, to assess the specification of instrumentation and above-ground equipment, and to develop and evaluate operational strategies for air cycling. A parametric analysis was performed for the field test reservoir. (LEW)

  19. Influence of nucleon density distribution in nucleon emission probability

    International Nuclear Information System (INIS)

    Paul, Sabyasachi; Nandy, Maitreyee; Mohanty, A.K.; Sarkar, P.K.; Gambhir, Y.K.

    2014-01-01

    Different decay modes are observed in heavy ion reactions at low to intermediate energies. It is interesting to study total neutron emission in these reactions, which may be contributed by all or many of these decay modes. In an attempt to understand the importance of the mean field and the entrance channel angular momentum, we study their influence on the emission probability of nucleons in heavy ion reactions in this work. This study owes its significance to the fact that once the populations of different states are determined, the emission probability governs the double differential neutron yield

  20. Probability Machines: Consistent Probability Estimation Using Nonparametric Learning Machines

    Science.gov (United States)

    Malley, J. D.; Kruppa, J.; Dasgupta, A.; Malley, K. G.; Ziegler, A.

    2011-01-01

    Summary. Background: Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. Objectives: The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Methods: Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Results: Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Conclusions: Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications. PMID:21915433
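
    The core idea, that a consistent nonparametric regression of a 0/1 response estimates the conditional probability P(Y=1|x), can be sketched with scikit-learn (our own illustration on synthetic data; the paper itself points to R packages):

```python
# Sketch: a regression forest fitted to a binary response acts as a
# "probability machine", estimating P(Y = 1 | x). Synthetic data; illustrative only.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(2000, 2))
true_p = 1.0 / (1.0 + np.exp(-(X[:, 0] + X[:, 1])))   # known P(Y=1|x)
y = rng.binomial(1, true_p)                            # observed binary response

rf = RandomForestRegressor(n_estimators=300, min_samples_leaf=20, random_state=0)
rf.fit(X, y)

x_new = np.array([[0.5, -0.3]])
print(f"estimated P(Y=1|x) = {rf.predict(x_new)[0]:.3f}")
print(f"true      P(Y=1|x) = {1 / (1 + np.exp(-0.2)):.3f}")   # 0.5 + (-0.3) = 0.2
```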

  1. Reactions of stabilized Criegee Intermediates

    Science.gov (United States)

    Vereecken, Luc; Harder, Hartwig; Novelli, Anna

    2014-05-01

    Carbonyl oxides (Criegee intermediates) were proposed as key intermediates in the gas phase ozonolysis of alkenes in 1975 by Rudolf Criegee. Despite the importance of ozonolysis in atmospheric chemistry, direct observation of these intermediates remained elusive, with only indirect experimental evidence for their role in the oxidation of hydrocarbons, e.g. through scavenging experiments. Direct experimental observation of stabilized CIs has only been achieved since 2008. Since then, a concerted effort using experimental and theoretical means has been under way to characterize the chemistry and kinetics of these reactive intermediates. We present the results of theoretical investigations of the chemistry of Criegee intermediates with a series of coreactants which may be of importance in the atmosphere, in experimental setups, or both. This includes the CI + CI cross-reaction, which proceeds with a rate coefficient near the collision limit and can be important under experimental conditions. The CI + alkene reactions show a strong dependence of the rate coefficient on the coreactants, but are generally found to be rather slow. The CI + ozone reaction is sufficiently fast to occur both in experiments and in the free troposphere, and acts as a sink for CIs. The reaction of CIs with hydroperoxides, ROOH, is complex, and leads both to the formation of oligomers and to the formation of reactive etheroxides, with a moderately fast rate coefficient. The importance of these reactions is placed in the context of the reaction conditions in different atmospheric environments ranging from unpolluted to highly polluted.

  2. Probable Inference and Quantum Mechanics

    International Nuclear Information System (INIS)

    Grandy, W. T. Jr.

    2009-01-01

    In its current very successful interpretation the quantum theory is fundamentally statistical in nature. Although commonly viewed as a probability amplitude whose (complex) square is a probability, the wavefunction or state vector continues to defy consensus as to its exact meaning, primarily because it is not a physical observable. Rather than approach this problem directly, it is suggested that it is first necessary to clarify the precise role of probability theory in quantum mechanics, either as applied to, or as an intrinsic part of the quantum theory. When all is said and done the unsurprising conclusion is that quantum mechanics does not constitute a logic and probability unto itself, but adheres to the long-established rules of classical probability theory while providing a means within itself for calculating the relevant probabilities. In addition, the wavefunction is seen to be a description of the quantum state assigned by an observer based on definite information, such that the same state must be assigned by any other observer based on the same information, in much the same way that probabilities are assigned.

  3. Failure probability under parameter uncertainty.

    Science.gov (United States)

    Gerrard, R; Tsanakas, A

    2011-05-01

    In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decision-maker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications. © 2010 Society for Risk Analysis.
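
    The headline effect, that estimating parameters from data inflates the expected frequency of threshold exceedances above the nominal level, is easy to reproduce by simulation; a toy sketch for a log-normal risk factor (our own example, not the authors' calculation):

```python
# Toy Monte Carlo: a threshold set from estimated log-normal parameters is
# exceeded, on average, more often than the nominal failure probability.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
mu, sigma = 0.0, 1.0        # true parameters, unknown to the decision-maker
p_nominal = 0.01            # target failure probability
n, reps = 50, 10_000        # sample size and Monte Carlo repetitions

z = stats.norm.ppf(1 - p_nominal)
exceed = np.empty(reps)
for r in range(reps):
    logs = np.log(rng.lognormal(mu, sigma, size=n))
    threshold = np.exp(logs.mean() + logs.std(ddof=1) * z)   # fitted quantile
    # true probability that the risk factor exceeds the fitted threshold
    exceed[r] = stats.lognorm.sf(threshold, s=sigma, scale=np.exp(mu))

print(f"nominal failure probability:   {p_nominal:.4f}")
print(f"expected frequency of failure: {exceed.mean():.4f}")   # noticeably larger
```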

  4. Probability with applications and R

    CERN Document Server

    Dobrow, Robert P

    2013-01-01

    An introduction to probability at the undergraduate level Chance and randomness are encountered on a daily basis. Authored by a highly qualified professor in the field, Probability: With Applications and R delves into the theories and applications essential to obtaining a thorough understanding of probability. With real-life examples and thoughtful exercises from fields as diverse as biology, computer science, cryptology, ecology, public health, and sports, the book is accessible for a variety of readers. The book's emphasis on simulation through the use of the popular R software language c

  5. A philosophical essay on probabilities

    CERN Document Server

    Laplace, Marquis de

    1996-01-01

    A classic of science, this famous essay by "the Newton of France" introduces lay readers to the concepts and uses of probability theory. It is of especial interest today as an application of mathematical techniques to problems in social and biological sciences. Generally recognized as the founder of the modern phase of probability theory, Laplace here applies the principles and general results of his theory "to the most important questions of life, which are, in effect, for the most part, problems in probability." Thus, without the use of higher mathematics, he demonstrates the application

  6. Search for intermediate vector bosons

    International Nuclear Information System (INIS)

    Cline, D.B.; Rubbia, C.; van der Meer, S.

    1982-01-01

    Over the past 15 years a new class of unified theories has been developed to describe the forces acting between elementary particles. The most successful of the new theories establishes a link between electromagnetism and the weak force. A crucial prediction of this unified electroweak theory is the existence of three massive particles called intermediate vector bosons. If these intermediate vector bosons exist and if they have properties attributed to them by electroweak theory, they should soon be detected, as the world's first particle accelerator with enough energy to create such particles has recently been completed at the European Organization for Nuclear Research (CERN) in Geneva. The accelerator has been converted to a colliding beam machine in which protons and antiprotons collide head on. According to electroweak theory, intermediate vector bosons can be created in proton-antiproton collisions. (SC)

  7. Search for intermediate vector bosons

    International Nuclear Information System (INIS)

    Klajn, D.B.; Rubbia, K.; Meer, S.

    1983-01-01

    The problem of registering and searching for intermediate vector bosons is discussed. According to weak-current theory there are three intermediate vector bosons, with electric charges of +1 (W⁺), -1 (W⁻) and zero (Z⁰). The investigation of these particles using proton-antiproton beams was proposed in 1976 by Cline, Rubbia and McIntyre. Major difficulties of the experiment relate to the need to produce a sufficient number of antiparticles and to the method of 'cooling' the antiproton beam to reduce its random motions. The stochastic cooling method was suggested by van der Meer in 1968 as one possible approach. Several large detectors were designed for the search for intermediate vector bosons

  8. Gravity with Intermediate Goods Trade

    Directory of Open Access Journals (Sweden)

    Sujin Jang

    2017-12-01

    Full Text Available This paper derives the gravity equation with intermediate goods trade. We extend a standard monopolistic competition model to incorporate intermediate goods trade, and show that the gravity equation with intermediates trade is identical to the one without it, except that gross output should be used as the output measure instead of value added. We also show that the output elasticity of trade is significantly underestimated when value added is used as the output measure. This implies that with the conventional gravity equation, the contribution of output growth can be substantially underestimated and the role of trade cost reduction exaggerated in explaining trade expansion, as we demonstrate for the case of Korea's trade growth between 1995 and 2007.
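
    For reference, a standard log-linear estimating form of the gravity equation (a textbook form, not copied from the paper; its point is that Y_i and Y_j should be gross output rather than value added once intermediates trade is allowed) is

```latex
% Standard log-linear gravity equation (textbook form, not from the paper).
% X_ij: bilateral trade flow; Y_i, Y_j: exporter/importer output
% (gross output with intermediates trade); D_ij: distance / trade costs.
\ln X_{ij} = \beta_0 + \beta_1 \ln Y_i + \beta_2 \ln Y_j
           - \beta_3 \ln D_{ij} + \varepsilon_{ij}
```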

  9. Logic, probability, and human reasoning.

    Science.gov (United States)

    Johnson-Laird, P N; Khemlani, Sangeet S; Goodwin, Geoffrey P

    2015-04-01

    This review addresses the long-standing puzzle of how logic and probability fit together in human reasoning. Many cognitive scientists argue that conventional logic cannot underlie deductions, because it never requires valid conclusions to be withdrawn - not even if they are false; it treats conditional assertions implausibly; and it yields many vapid, although valid, conclusions. A new paradigm of probability logic allows conclusions to be withdrawn and treats conditionals more plausibly, although it does not address the problem of vapidity. The theory of mental models solves all of these problems. It explains how people reason about probabilities and postulates that the machinery for reasoning is itself probabilistic. Recent investigations accordingly suggest a way to integrate probability and deduction. Copyright © 2015 Elsevier Ltd. All rights reserved.

  10. Free probability and random matrices

    CERN Document Server

    Mingo, James A

    2017-01-01

    This volume opens the world of free probability to a wide variety of readers. From its roots in the theory of operator algebras, free probability has intertwined with non-crossing partitions, random matrices, applications in wireless communications, representation theory of large groups, quantum groups, the invariant subspace problem, large deviations, subfactors, and beyond. This book puts a special emphasis on the relation of free probability to random matrices, but also touches upon the operator algebraic, combinatorial, and analytic aspects of the theory. The book serves as a combination textbook/research monograph, with self-contained chapters, exercises scattered throughout the text, and coverage of important ongoing progress of the theory. It will appeal to graduate students and all mathematicians interested in random matrices and free probability from the point of view of operator algebras, combinatorics, analytic functions, or applications in engineering and statistical physics.

  11. Introduction to probability and measure

    CERN Document Server

    Parthasarathy, K R

    2005-01-01

    According to a remark attributed to Mark Kac 'Probability Theory is a measure theory with a soul'. This book with its choice of proofs, remarks, examples and exercises has been prepared taking both these aesthetic and practical aspects into account.

  12. Joint probabilities and quantum cognition

    International Nuclear Information System (INIS)

    Acacio de Barros, J.

    2012-01-01

    In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.

  13. Joint probabilities and quantum cognition

    Energy Technology Data Exchange (ETDEWEB)

    Acacio de Barros, J. [Liberal Studies, 1600 Holloway Ave., San Francisco State University, San Francisco, CA 94132 (United States)

    2012-12-18

    In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.

  14. Default probabilities and default correlations

    OpenAIRE

    Erlenmaier, Ulrich; Gersbach, Hans

    2001-01-01

    Starting from the Merton framework for firm defaults, we provide the analytics and robustness of the relationship between default probabilities and default correlations. We show that loans with higher default probabilities will not only have higher variances but also higher correlations between loans. As a consequence, portfolio standard deviation can increase substantially when loan default probabilities rise. This result has two important implications. First, relative prices of loans with different default probabili...
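
    The qualitative claim, that default correlation rises with default probability for a fixed asset correlation, can be checked with a minimal one-factor Merton-style simulation (our own sketch, not the authors' model):

```python
# One-factor Merton sketch: two borrowers default when latent asset values fall
# below the threshold implied by the default probability PD. For fixed asset
# correlation rho, the default-event correlation grows with PD. Illustrative only.
import numpy as np
from scipy import stats

def default_correlation(pd_, rho, n=500_000, seed=0):
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(n)                        # common systematic factor
    e1, e2 = rng.standard_normal(n), rng.standard_normal(n)
    a1 = np.sqrt(rho) * z + np.sqrt(1 - rho) * e1     # latent asset values
    a2 = np.sqrt(rho) * z + np.sqrt(1 - rho) * e2
    k = stats.norm.ppf(pd_)                           # default threshold
    return np.corrcoef(a1 < k, a2 < k)[0, 1]

for pd_ in (0.005, 0.02, 0.10):
    print(f"PD = {pd_:5.3f}: default correlation = {default_correlation(pd_, rho=0.3):.3f}")
```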

  15. The Probabilities of Unique Events

    Science.gov (United States)

    2012-08-30

    Max Lotstein and Phil Johnson-Laird, Department of Psychology, Princeton University, Princeton, NJ, USA; Washington, DC, USA; August 30th, 2012. ...social justice and also participated in antinuclear demonstrations. The participants ranked the probability that Linda is a feminist bank teller as...retorted that such a flagrant violation of the probability calculus was a result of a psychological experiment that obscured the rationality of the

  16. Probability Matching, Fast and Slow

    OpenAIRE

    Koehler, Derek J.; James, Greta

    2014-01-01

    A prominent point of contention among researchers regarding the interpretation of probability-matching behavior is whether it represents a cognitively sophisticated, adaptive response to the inherent uncertainty of the tasks or settings in which it is observed, or whether instead it represents a fundamental shortcoming in the heuristics that support and guide human decision making. Put crudely, researchers disagree on whether probability matching is "smart" or "dumb." Here, we consider eviden...

  17. Probably not future prediction using probability and statistical inference

    CERN Document Server

    Dworsky, Lawrence N

    2008-01-01

    An engaging, entertaining, and informative introduction to probability and prediction in our everyday lives Although Probably Not deals with probability and statistics, it is not heavily mathematical and is not filled with complex derivations, proofs, and theoretical problem sets. This book unveils the world of statistics through questions such as what is known based upon the information at hand and what can be expected to happen. While learning essential concepts including "the confidence factor" and "random walks," readers will be entertained and intrigued as they move from chapter to chapter. Moreover, the author provides a foundation of basic principles to guide decision making in almost all facets of life including playing games, developing winning business strategies, and managing personal finances. Much of the book is organized around easy-to-follow examples that address common, everyday issues such as: How travel time is affected by congestion, driving speed, and traffic lights Why different gambling ...

  18. Larval helminths in intermediate hosts

    DEFF Research Database (Denmark)

    Fredensborg, Brian Lund; Poulin, R

    2005-01-01

    Density-dependent effects on parasite fitness have been documented from adult helminths in their definitive hosts. There have, however, been no studies on the cost of sharing an intermediate host with other parasites in terms of reduced adult parasite fecundity. Even if larval parasites suffer a ...

  19. Intermediate statistics in quantum maps

    Energy Technology Data Exchange (ETDEWEB)

    Giraud, Olivier [H H Wills Physics Laboratory, University of Bristol, Tyndall Avenue, Bristol BS8 1TL (United Kingdom); Marklof, Jens [School of Mathematics, University of Bristol, University Walk, Bristol BS8 1TW (United Kingdom); O'Keefe, Stephen [School of Mathematics, University of Bristol, University Walk, Bristol BS8 1TW (United Kingdom)

    2004-07-16

    We present a one-parameter family of quantum maps whose spectral statistics are of the same intermediate type as observed in polygonal quantum billiards. Our central result is the evaluation of the spectral two-point correlation form factor at small argument, which in turn yields the asymptotic level compressibility for macroscopic correlation lengths. (letter to the editor)

  20. Intermediality and the Child Performer

    Science.gov (United States)

    Budd, Natasha

    2016-01-01

    This report details examples of praxis in the creation and presentation of "Joy Fear and Poetry": an intermedial theatre performance in which children aged 7-12 years generated aesthetic gestures using a range of new media forms. The impetus for the work's development was a desire to make an intervention into habituated patterns of…

  1. Material Voices: Intermediality and Autism

    Science.gov (United States)

    Trimingham, Melissa; Shaughnessy, Nicola

    2016-01-01

    Autism continues to be regarded enigmatically; a community that is difficult to access due to perceived disruptions of interpersonal connectedness. Through detailed observations of two children participating in the Arts and Humanities Research Council funded project "Imagining Autism: Drama, Performance and Intermediality as Interventions for…

  2. White blood cell and platelet count as adjuncts to standard clinical evaluation for risk assessment in patients at low probability of acute aortic syndrome.

    Science.gov (United States)

    Morello, Fulvio; Cavalot, Giulia; Giachino, Francesca; Tizzani, Maria; Nazerian, Peiman; Carbone, Federica; Pivetta, Emanuele; Mengozzi, Giulio; Moiraghi, Corrado; Lupia, Enrico

    2017-08-01

    Pre-test probability assessment is key in the approach to suspected acute aortic syndromes (AASs). However, most patients with AAS-compatible symptoms are classified at low probability, warranting further evaluation for decision on aortic imaging. White blood cell count, platelet count and fibrinogen explore pathophysiological pathways mobilized in AASs and are routinely assayed in the workup of AASs. However, the diagnostic performance of these variables for AASs, alone and as a bundle, is unknown. We tested the hypothesis that white blood cell count, platelet count and/or fibrinogen at presentation may be applied as additional tools to standard clinical evaluation for pre-test risk assessment in patients at low probability of AAS. This was a retrospective observational study conducted on consecutive patients managed in our Emergency Department from 2009 to 2014 for suspected AAS. White blood cell count, platelet count and fibrinogen were assayed during evaluation in the Emergency Department. The final diagnosis was obtained by computed tomography angiography. The pre-test probability of AAS was defined according to guidelines. Of 1210 patients with suspected AAS, 1006 (83.1%) were classified at low probability, and 271 (22.4%) were diagnosed with AAS. Within patients at low probability, the presence of at least one alteration among white blood cell count >9×10³/µl, platelet count ... probability, white blood cell count >9×10³/µl and platelet count ... probability, the estimated risk of AAS based on the number of alterations amongst white blood cell count >9×10³/µl and platelet count ... probability to fine-tune risk assessment of AAS.
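
    The decision rule described, counting simple laboratory alterations to refine risk in low-probability patients, can be sketched as below; note that only the white blood cell cutoff (>9×10³/µl) is legible in this record, while the platelet and fibrinogen cutoffs shown are hypothetical placeholders:

```python
# Sketch of an "alteration count" adjunct to pre-test probability of AAS.
# Only the WBC cutoff (>9e3/ul) is legible in the record; the platelet and
# fibrinogen cutoffs below are HYPOTHETICAL placeholders, not the study's values.
WBC_CUTOFF = 9.0     # 10^3 cells/ul, from the record
PLT_CUTOFF = 200.0   # 10^3/ul, placeholder (original value garbled in this extract)
FIB_CUTOFF = 350.0   # mg/dl, placeholder (original value garbled in this extract)

def alteration_count(wbc, platelets, fibrinogen):
    """Number of laboratory alterations suggestive of AAS."""
    return sum([
        wbc > WBC_CUTOFF,          # leukocytosis
        platelets < PLT_CUTOFF,    # platelet consumption
        fibrinogen < FIB_CUTOFF,   # fibrinogen consumption
    ])

# Example: all three markers altered -> highest within-low-probability risk tier
print(alteration_count(wbc=11.2, platelets=150.0, fibrinogen=300.0))   # -> 3
```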

  3. Unrelated Hematopoietic Stem Cell Donor Matching Probability and Search Algorithm

    Directory of Open Access Journals (Sweden)

    J.-M. Tiercy

    2012-01-01

    Full Text Available In transplantation of hematopoietic stem cells (HSCs) from unrelated donors, a high HLA compatibility level decreases the risk of acute graft-versus-host disease and mortality. The diversity of the HLA system at the allelic and haplotypic level and the heterogeneity of the HLA typing data of registered donors render the search process a complex task. This paper summarizes our experience with a search algorithm that includes, at the start of the search, a probability estimate (high/intermediate/low) of identifying an HLA-A, B, C, DRB1, DQB1-compatible donor (a 10/10 match). Based on 2002–2011 searches, about 30% of patients have a high, 30% an intermediate, and 40% a low probability search. Search success rate and duration are presented and discussed in light of the experience of other centers. Overall a 9-10/10 matched HSC donor can now be identified for 60–80% of patients of European descent. For high probability searches donors can be selected on the basis of DPB1 matching, with an estimated success rate of >40%. For low probability searches there is no consensus on which HLA incompatibilities are more permissive, although HLA-DQB1 mismatches are generally considered acceptable. Models for the discrimination of more detrimental mismatches based on specific amino acid residues rather than specific HLA alleles are presented.

  4. Evaluation of the theory-based Quality Improvement in Physical Therapy (QUIP) programme: a one-group, pre-test post-test pilot study.

    Science.gov (United States)

    Rutten, Geert M; Harting, Janneke; Bartholomew, L Kay; Schlief, Angelique; Oostendorp, Rob A B; de Vries, Nanne K

    2013-05-25

    Guideline adherence in physical therapy is far from optimal, which has consequences for the effectiveness and efficiency of physical therapy care. Programmes to enhance guideline adherence have, so far, been relatively ineffective. We systematically developed a theory-based Quality Improvement in Physical Therapy (QUIP) programme aimed at the individual performance level (practicing physiotherapists; PTs) and the practice organization level (practice quality manager; PQM). The aim of the study was to pilot test the multilevel QUIP programme's effectiveness and the fidelity, acceptability and feasibility of its implementation. A one-group, pre-test, post-test pilot study (N = 8 practices; N = 32 PTs, 8 of whom were also PQMs) done between September and December 2009. Guideline adherence was measured using clinical vignettes that addressed 12 quality indicators reflecting the guidelines' main recommendations. Determinants of adherence were measured using quantitative methods (questionnaires). Delivery of the programme and management changes were assessed using qualitative methods (observations, group interviews, and document analyses). Changes in adherence and determinants were tested with paired-samples t-tests and expressed as effect sizes (Cohen's d). Overall adherence did not change (3.1%; p = .138). Adherence to three quality indicators improved (8%, 24%, 43%; .000 ≤ p ≤ .023). Adherence to one quality indicator decreased (-15.7%; p = .004). Scores on various determinants of individual performance improved and favourable changes at the practice organizational level were observed. Improvements were associated with the programme's multilevel approach, collective goal setting, and the application of self-regulation; unfavourable findings with programme deficits. The one-group pre-test post-test design limits the internal validity of the study, the self-selected sample its external validity. The QUIP programme has the potential to change physical
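
    The analysis described, paired-samples t-tests with effects expressed as Cohen's d, can be sketched as follows (adherence scores invented for illustration):

```python
# Paired pre/post comparison of guideline adherence with effect size (Cohen's d).
# Adherence scores (%) are invented for illustration only.
import numpy as np
from scipy import stats

pre  = np.array([52, 61, 48, 70, 55, 63, 58, 66], dtype=float)
post = np.array([55, 60, 53, 74, 58, 62, 61, 70], dtype=float)

t, p = stats.ttest_rel(pre, post)
diff = post - pre
d = diff.mean() / diff.std(ddof=1)   # Cohen's d for paired samples (d_z)
print(f"t = {t:.2f}, p = {p:.3f}, d = {d:.2f}")
```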

  5. Classical model of intermediate statistics

    International Nuclear Information System (INIS)

    Kaniadakis, G.

    1994-01-01

    In this work we present a classical kinetic model of intermediate statistics. In the case of Brownian particles we show that the Fermi-Dirac (FD) and Bose-Einstein (BE) distributions can be obtained, just as the Maxwell-Boltzmann (MB) distribution, as steady states of a classical kinetic equation that intrinsically takes into account an exclusion-inclusion principle. In our model the intermediate statistics are obtained as steady states of a system of coupled nonlinear kinetic equations, where the coupling constants are the transmutational potentials η_κκ′. We show that, besides the FD-BE intermediate statistics extensively studied from the quantum point of view, we can also study the MB-FD and MB-BE ones. Moreover, our model allows us to treat the three-state mixing FD-MB-BE intermediate statistics. For boson and fermion mixing in a D-dimensional space, we obtain a family of FD-BE intermediate statistics by varying the transmutational potential η_BF. This family contains, as a particular case when η_BF = 0, the quantum statistics recently proposed by L. Wu, Z. Wu, and J. Sun [Phys. Lett. A 170, 280 (1992)]. When we consider the two-dimensional FD-BE statistics, we derive an analytic expression for the fraction of fermions. When the temperature T → ∞, the system is composed of an equal number of bosons and fermions, regardless of the value of η_BF. On the contrary, when T = 0, η_BF becomes important and, according to its value, the system can be completely bosonic or fermionic, or composed of both bosons and fermions
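
    For orientation, the three classical limits interpolated here can be written in one formula; a common parametrization (a standard interpolating form with an exclusion-inclusion parameter κ, not quoted from the paper) is

```latex
% Stationary occupation number with an exclusion-inclusion parameter kappa
% (standard interpolating form; not quoted from the record).
n(E) = \frac{1}{e^{(E-\mu)/k_B T} + \kappa},
\qquad
\kappa = \begin{cases}
 +1 & \text{Fermi--Dirac (FD)} \\
 \phantom{+}0 & \text{Maxwell--Boltzmann (MB)} \\
 -1 & \text{Bose--Einstein (BE)}
\end{cases}
```

    Intermediate statistics then correspond to intermediate values of κ between -1 and +1.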

  6. Insignificant disease among men with intermediate-risk prostate cancer.

    Science.gov (United States)

    Hong, Sung Kyu; Vertosick, Emily; Sjoberg, Daniel D; Scardino, Peter T; Eastham, James A

    2014-12-01

    A paucity of data exists on the insignificant disease potentially suitable for active surveillance (AS) among men with intermediate-risk prostate cancer (PCa). We tried to identify pathologically insignificant disease and its preoperative predictors in men who underwent radical prostatectomy (RP) for intermediate-risk PCa. We analyzed data from 1,630 men who underwent RP for intermediate-risk disease. Total tumor volume (TTV) data were available for 332 men. We examined factors associated with classically defined pathologically insignificant cancer (organ-confined disease with TTV ≤0.5 ml and no Gleason pattern 4 or 5) and pathologically favorable cancer (organ-confined disease with no Gleason pattern 4 or 5) potentially suitable for AS. Decision curve analysis was used to assess the clinical utility of a multivariable model including preoperative variables for predicting pathologically unfavorable cancer. In the entire cohort, 221 of the 1,630 patients (13.6%) had pathologically favorable cancer. Among the 332 patients with TTV data available, 26 (7.8%) had classically defined pathologically insignificant cancer. Between threshold probabilities of 20% and 40%, decision curve analysis demonstrated that using the multivariable model to identify AS candidates would not provide any benefit over simply treating all men who have intermediate-risk disease with RP. Although a minority of patients with intermediate-risk disease may harbor pathologically favorable or insignificant cancer, currently available conventional tools are not sufficiently able to identify those patients.

  7. Mechanisms of deterioration of intermediate moisture food systems

    Science.gov (United States)

    Labuza, T. P.

    1972-01-01

    A study of shelf stability in intermediate moisture foods was made. Major efforts were made to control lipid oxidation and nonenzymatic browning. In order to determine means of preventing these reactions, model systems were developed having the same water activity-content relationship as intermediate moisture foods. Models were based on cellulose-lipid and protein-lipid systems with glycerol added as the humectant. Experiments with both systems indicate that lipid oxidation is promoted significantly in the intermediate moisture range. The effect appeared to be related to increased mobility of either reactants or catalysts: when the amount of water in the system reached a level where capillary condensation occurred, and thus free water was present, the rates of oxidation increased. With added glycerol, which is water soluble and thus increases the amount of mobile phase, the increase in oxidation rate occurs at a lower relative humidity. The rates of oxidation were maximized at 61% RH and decreased again at 75% RH, probably due to dilution. No significant non-enzymatic browning occurred in the protein-lipid systems. Prevention of oxidation by the use of metal chelating agents was enhanced in the cellulose system, whereas, with protein present, the lipid-soluble chain-terminating antioxidants (such as BHA) worked equally well. Preliminary studies of foods adjusted to the intermediate moisture range bear out the results of oxidation in model systems. It can be concluded that for most fat-containing intermediate moisture foods, rancidity will be the reaction most limiting stability.

  8. Normal probability plots with confidence.

    Science.gov (United States)

    Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang

    2015-01-01

    Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means to judge whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal-probability-plot-based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
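
    A rough Python sketch of the construction is given below. It uses the fact that the i-th order statistic of a uniform sample is Beta(i, n-i+1)-distributed to build an interval for each plotted point; a Bonferroni adjustment (α/n per point) stands in for the exact simultaneous 1-α calibration developed in the paper, and standardizing by the sample mean and SD is a further approximation:

        import numpy as np
        from scipy import stats

        def npp_with_intervals(x, alpha=0.05):
            """Normal probability plot coordinates plus per-point intervals."""
            x = np.sort(np.asarray(x, dtype=float))
            n = len(x)
            i = np.arange(1, n + 1)
            z = stats.norm.ppf((i - 0.375) / (n + 0.25))   # Blom plotting positions
            a = alpha / n                                  # Bonferroni adjustment
            lo = stats.norm.ppf(stats.beta.ppf(a / 2, i, n - i + 1))
            hi = stats.norm.ppf(stats.beta.ppf(1 - a / 2, i, n - i + 1))
            s = (x - x.mean()) / x.std(ddof=1)             # standardized sample
            # Normality is rejected if any point leaves its interval.
            return z, s, lo, hi, bool(((lo <= s) & (s <= hi)).all())

        rng = np.random.default_rng(1)
        print(npp_with_intervals(rng.normal(size=50))[-1])  # usually True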

  9. Experimental evidence for the reducibility of multifragment emission probabilities

    International Nuclear Information System (INIS)

    Wozniak, G.J.; Tso, K.; Phair, L.

    1995-01-01

    Multifragmentation has been studied for 36Ar-induced reactions on a 197Au target at E/A = 80 and 110 MeV and for 129Xe-induced reactions on several targets (natCu, 89Y, 165Ho, 197Au) at E/A = 40, 50 and 60 MeV. The probability of emitting n intermediate-mass fragments is shown to be binomial at each transverse energy and reducible to an elementary binary probability p. For each target and at each bombarding energy, this probability p shows a thermal nature by giving linear Arrhenius plots. For the 129Xe-induced reactions, a nearly universal linear Arrhenius plot is observed at each bombarding energy, indicating a large degree of target independence
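
    The reducibility claim has a compact algebraic form: at fixed transverse energy E_t, the n-fragment probability is P_n = C(m, n) p^n (1-p)^(m-n), and thermal behaviour shows up as ln(1/p) being linear in 1/sqrt(E_t) (the Arrhenius plot). The sketch below illustrates the bookkeeping with invented numbers; m, the E_t bins and the mean multiplicities are all hypothetical, not values from the paper:

        import numpy as np
        from math import comb

        def binomial_pn(m, p, n):
            """P(n fragments) for m elementary tries of probability p."""
            return comb(m, n) * p**n * (1 - p)**(m - n)

        m = 20                                       # assumed number of elementary tries
        Et = np.array([100.0, 200.0, 400.0, 800.0])  # transverse energy (MeV), illustrative
        mean_n = np.array([0.6, 1.4, 2.9, 4.8])      # illustrative mean IMF multiplicities
        p = mean_n / m                               # extract p from <n> = m*p
        slope, intercept = np.polyfit(1.0 / np.sqrt(Et), np.log(1.0 / p), 1)
        print(slope, intercept)                      # linearity = "thermal" behaviour of p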

  10. How Life History Can Sway the Fixation Probability of Mutants

    Science.gov (United States)

    Li, Xiang-Yi; Kurokawa, Shun; Giaimo, Stefano; Traulsen, Arne

    2016-01-01

    In this work, we study the effects of demographic structure on evolutionary dynamics when selection acts on reproduction, survival, or both. In contrast to the previously discovered pattern that the fixation probability of a neutral mutant decreases as the population becomes younger, we show that a mutant with a constant selective advantage may have a maximum or a minimum of its fixation probability in populations with an intermediate fraction of young individuals. This highlights the importance of life history and demographic structure in studying evolutionary dynamics. We also illustrate the fundamental differences between selection on reproduction and selection on survival when age structure is present. In addition, we evaluate the relative importance of size and structure of the population in determining the fixation probability of the mutant. Our work lays the foundation for also studying density- and frequency-dependent effects in populations when demographic structures cannot be neglected. PMID:27129737
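
    The unstructured baseline against which such results are usually compared is the classical Moran process, where a single mutant with constant relative fitness r in a population of fixed size N fixes with probability ρ = (1 - 1/r)/(1 - r^(-N)), reducing to 1/N for a neutral mutant. A minimal sketch follows; the paper's age-structured models modify this baseline, which the sketch does not capture:

        def moran_fixation(r, N):
            """Fixation probability of one mutant of relative fitness r
            in the classical (unstructured) Moran process of constant size N."""
            if r == 1.0:
                return 1.0 / N              # neutral mutant
            return (1.0 - 1.0 / r) / (1.0 - r ** (-N))

        print(moran_fixation(1.00, 100))    # 0.01: neutral baseline
        print(moran_fixation(1.02, 100))    # ~0.023: a 2% selective advantage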

  11. Probability theory a foundational course

    CERN Document Server

    Pakshirajan, R P

    2013-01-01

    This book shares the dictum of J. L. Doob in treating Probability Theory as a branch of Measure Theory and establishes this relation early. Probability measures in product spaces are introduced right at the start by way of laying the groundwork to later claim the existence of stochastic processes with prescribed finite dimensional distributions. Other topics analysed in the book include supports of probability measures, zero-one laws in product measure spaces, Erdos-Kac invariance principle, functional central limit theorem and functional law of the iterated logarithm for independent variables, Skorohod embedding, and the use of analytic functions of a complex variable in the study of geometric ergodicity in Markov chains. This book is offered as a textbook for students pursuing graduate programs in Mathematics and/or Statistics. The book aims to help the teacher present the theory with ease, and to help the student sustain his interest and joy in learning the subject.

  12. VIBRATION ISOLATION SYSTEM PROBABILITY ANALYSIS

    Directory of Open Access Journals (Sweden)

    Smirnov Vladimir Alexandrovich

    2012-10-01

    The article deals with the probability analysis for a vibration isolation system of high-precision equipment, which is extremely sensitive to low-frequency oscillations even of submicron amplitude. The external sources of low-frequency vibrations may include the natural city background or internal low-frequency sources inside buildings (pedestrian activity, HVAC). Assuming a Gaussian distribution, the author estimates the probability that the relative displacement of the isolated mass remains lower than the vibration criterion. This problem is solved in the three-dimensional space spanned by the system parameters, including damping and natural frequency. From this probability distribution, the chance of exceeding the vibration criterion for a vibration isolation system is evaluated. Optimal system parameters - damping and natural frequency - are derived such that the probability of exceeding the vibration criteria VC-E and VC-D is less than 0.04.
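
    For a zero-mean Gaussian response, the computation sketched in the abstract reduces to an error-function evaluation: the probability that the displacement stays within a criterion x_crit is erf(x_crit/(σ·sqrt(2))), where the RMS response σ follows from the damping and natural frequency. The numbers below are illustrative, not the article's:

        import math

        def p_exceed(sigma, x_crit):
            """P(|x| > x_crit) for a zero-mean Gaussian displacement with RMS sigma."""
            return 1.0 - math.erf(x_crit / (sigma * math.sqrt(2.0)))

        # Hypothetical RMS responses (micrometres) against a 0.4 um criterion;
        # sigma = 0.2 gives roughly the 0.04 exceedance level quoted above.
        for sigma in (0.08, 0.12, 0.20):
            print(sigma, round(p_exceed(sigma, x_crit=0.4), 4))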

  13. Model uncertainty: Probabilities for models?

    International Nuclear Information System (INIS)

    Winkler, R.L.

    1994-01-01

    Like any other type of uncertainty, model uncertainty should be treated in terms of probabilities. The question is how to do this. The most commonly-used approach has a drawback related to the interpretation of the probabilities assigned to the models. If we step back and look at the big picture, asking what the appropriate focus of the model uncertainty question should be in the context of risk and decision analysis, we see that a different probabilistic approach makes more sense, although it raises some implementation questions. Current work that is underway to address these questions looks very promising

  14. Knowledge typology for imprecise probabilities.

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, G. D. (Gregory D.); Zucker, L. J. (Lauren J.)

    2002-01-01

    When characterizing the reliability of a complex system there are often gaps in the data available for specific subsystems or other factors influencing total system reliability. At Los Alamos National Laboratory we employ ethnographic methods to elicit expert knowledge when traditional data is scarce. Typically, we elicit expert knowledge in probabilistic terms. This paper will explore how we might approach elicitation if methods other than probability (e.g., Dempster-Shafer theory or fuzzy sets) prove more useful for quantifying certain types of expert knowledge. Specifically, we will consider whether experts have different types of knowledge that may be better characterized in ways other than standard probability theory.

  15. Probability, Statistics, and Stochastic Processes

    CERN Document Server

    Olofsson, Peter

    2011-01-01

    A mathematical and intuitive approach to probability, statistics, and stochastic processes This textbook provides a unique, balanced approach to probability, statistics, and stochastic processes. Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area. This text combines a rigorous, calculus-based development of theory with a more intuitive approach that appeals to readers' sense of reason and logic, an approach developed through the author's many years of classroom experience. The text begins with three chapters that d

  16. Statistical probability tables CALENDF program

    International Nuclear Information System (INIS)

    Ribon, P.

    1989-01-01

    The purpose of the probability tables is twofold: to obtain a dense representation of the data, and to calculate integrals by quadratures. They are mainly used in the USA for Monte Carlo calculations and in the USSR and Europe for self-shielding calculations by the sub-group method. The moment probability tables, in addition to providing a more substantial mathematical basis and calculation methods, are adapted for condensation and mixture calculations, which are the crucial operations for reactor physics specialists. However, their extension is limited by the statistical hypothesis they imply. Efforts are being made to remove this obstacle, at the cost, it must be said, of greater complexity

  17. Probability, statistics, and queueing theory

    CERN Document Server

    Allen, Arnold O

    1990-01-01

    This is a textbook on applied probability and statistics with computer science applications for students at the upper undergraduate level. It may also be used as a self study book for the practicing computer science professional. The successful first edition of this book proved extremely useful to students who need to use probability, statistics and queueing theory to solve problems in other fields, such as engineering, physics, operations research, and management science. The book has also been successfully used for courses in queueing theory for operations research students. This second edit

  18. Probability and Statistics: 5 Questions

    DEFF Research Database (Denmark)

    Probability and Statistics: 5 Questions is a collection of short interviews based on 5 questions presented to some of the most influential and prominent scholars in probability and statistics. We hear their views on the fields, aims, scopes, the future direction of research and how their work fits...... in these respects. Interviews with Nick Bingham, Luc Bovens, Terrence L. Fine, Haim Gaifman, Donald Gillies, James Hawthorne, Carl Hoefer, James M. Joyce, Joseph B. Kadane Isaac Levi, D.H. Mellor, Patrick Suppes, Jan von Plato, Carl Wagner, Sandy Zabell...

  19. Pre-test genetic counseling services for hereditary breast and ovarian cancer delivered by non-genetics professionals in the state of Florida.

    Science.gov (United States)

    Vadaparampil, S T; Scherr, C L; Cragun, D; Malo, T L; Pal, T

    2015-05-01

    Genetic counseling and testing for hereditary breast and ovarian cancer now includes practitioners from multiple healthcare professions, specialties, and settings. This study examined whether non-genetics professionals (NGPs) perform guideline-based patient intake and informed consent before genetic testing. NGPs offering BRCA testing services in Florida (n = 386) were surveyed about clinical practices. Among 81 respondents (response rate = 22%), approximately half reported sometimes scheduling a separate pre-test counseling session lasting 11-30 min prior to testing, discussing the familial implications of testing and the benefits and limitations of risk management options, and discussing the potential psychological impact and insurance-related issues. Few constructed a three-generation pedigree, discussed alternative hereditary cancer syndromes, or the meaning of a variant result. This lack of adherence to guideline-based practice may result in direct harm to patients and their family members. NGPs who are unable to deliver guideline-adherent cancer genetics services should focus on identification and referral of at-risk patients to in-person or telephone services provided by genetics professionals. © 2014 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  20. Changes in screening behaviors and attitudes toward screening from pre-test genetic counseling to post-disclosure in Lynch syndrome families

    Science.gov (United States)

    Burton-Chase, Allison M.; Hovick, Shelly R.; Peterson, Susan K.; Marani, Salma K.; Vernon, Sally W.; Amos, Christopher I.; Frazier, Marsha L.; Lynch, Patrick M.; Gritz, Ellen R.

    2013-01-01

    Purpose: This study examined colonoscopy adherence and attitudes towards colorectal cancer (CRC) screening in individuals who underwent Lynch syndrome genetic counseling and testing. Methods: We evaluated changes in colonoscopy adherence and CRC screening attitudes in 78 cancer-unaffected relatives of Lynch syndrome mutation carriers before pre-test genetic counseling (baseline) and at 6 and 12 months post-disclosure of test results (52 mutation-negative, 26 mutation-positive). Results: While both groups were similar at baseline, at 12 months post-disclosure, a greater number of mutation-positive individuals had had a colonoscopy compared with mutation-negative individuals. From baseline to 12 months post-disclosure, the mutation-positive group demonstrated an increase in mean scores on measures of colonoscopy commitment, self-efficacy, and perceived benefits of CRC screening, and a decrease in mean scores for perceived barriers to CRC screening. Mean scores on colonoscopy commitment decreased from baseline to 6 months in the mutation-negative group. Conclusion: Adherence to risk-appropriate guidelines for CRC surveillance improved after genetic counseling and testing for Lynch syndrome. Mutation-positive individuals reported increasingly positive attitudes toward CRC screening after receiving genetic test results, potentially reinforcing longer term colonoscopy adherence. PMID:23414081

  1. Demand characteristics, pre-test attitudes and time-on-task trends in the effects of chewing gum on attention and reported mood in healthy volunteers.

    Science.gov (United States)

    Allen, A P; Smith, A P

    2012-10-01

    Previous research has indicated that chewing gum enhances reported alertness, but has variable effects on attention. Demand characteristics may explain these effects. The current study investigated the effects of gum and demand characteristics on attention and reported mood over time. Participants completed measures of mood and attention, with and without chewing gum. To manipulate demand characteristics, they were told that the hypothesised effect of gum was either positive or negative, or no hypothesis was mentioned. Attitudes towards gum were assessed pre- and post-manipulation. Gum increased reported alertness; this effect was only significant for positive and neutral demand characteristics. Vigilance accuracy was reduced for chewing gum, but only in the fourth minute of the task, and gum reduced focussed attention accuracy, but only for the first 64 trials. Demand characteristics did not moderate time-on-task effects. Gum improved selective attention. A positive effect on response organisation was observed; this was significant when demand characteristics and pre-test attitudes to gum were both negative. The results suggest that demand characteristics moderate effects on self-reported alertness and response organisation, but cannot explain time-on-task effects or variable main effects on other aspects of attention. Copyright © 2012 Elsevier Ltd. All rights reserved.

  2. Dynamic SEP event probability forecasts

    Science.gov (United States)

    Kahler, S. W.; Ling, A.

    2015-10-01

    The forecasting of solar energetic particle (SEP) event probabilities at Earth has been based primarily on the estimates of magnetic free energy in active regions and on the observations of peak fluxes and fluences of large (≥ M2) solar X-ray flares. These forecasts are typically issued for the next 24 h or with no definite expiration time, which can be deficient for time-critical operations when no SEP event appears following a large X-ray flare. It is therefore important to decrease the event probability forecast with time as a SEP event fails to appear. We use the NOAA listing of major (≥10 pfu) SEP events from 1976 to 2014 to plot the delay times from X-ray peaks to SEP threshold onsets as a function of solar source longitude. An algorithm is derived to decrease the SEP event probabilities with time when no event is observed to reach the 10 pfu threshold. In addition, we use known SEP event size distributions to modify probability forecasts when SEP intensity increases occur below the 10 pfu event threshold. An algorithm to provide a dynamic SEP event probability forecast, P_d, is derived for both of these situations following a large flare.
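
    The gist of such a dynamic forecast can be written as a Bayesian update: if a fraction p0 of comparable flares produce SEP events and the onset delay has survival function S(t), then after t hours with no onset the probability becomes P_d(t) = p0·S(t)/(p0·S(t) + 1 - p0). The sketch below assumes an exponential delay-time distribution purely for illustration; the paper derives the actual decay from the observed, longitude-dependent delay distributions:

        import math

        def dynamic_sep_probability(p0, t_hours, tau_hours):
            """Post-flare SEP probability after t hours without a 10 pfu onset,
            assuming exponential onset delays with mean tau (illustrative only)."""
            s = math.exp(-t_hours / tau_hours)        # survival of the delay time
            return p0 * s / (p0 * s + (1.0 - p0))

        for t in (0, 6, 12, 24, 48):
            print(t, round(dynamic_sep_probability(0.4, t, tau_hours=12.0), 3))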

  3. Conditional Independence in Applied Probability.

    Science.gov (United States)

    Pfeiffer, Paul E.

    This material assumes the user has the background provided by a good undergraduate course in applied probability. It is felt that introductory courses in calculus, linear algebra, and perhaps some differential equations should provide the requisite experience and proficiency with mathematical concepts, notation, and argument. The document is…

  4. Stretching Probability Explorations with Geoboards

    Science.gov (United States)

    Wheeler, Ann; Champion, Joe

    2016-01-01

    Students are faced with many transitions in their middle school mathematics classes. To build knowledge, skills, and confidence in the key areas of algebra and geometry, students often need to practice using numbers and polygons in a variety of contexts. Teachers also want students to explore ideas from probability and statistics. Teachers know…

  5. GPS: Geometry, Probability, and Statistics

    Science.gov (United States)

    Field, Mike

    2012-01-01

    It might be said that for most occupations there is now less of a need for mathematics than there was say fifty years ago. But, the author argues, geometry, probability, and statistics constitute essential knowledge for everyone. Maybe not the geometry of Euclid, but certainly geometrical ways of thinking that might enable us to describe the world…

  6. Swedish earthquakes and acceleration probabilities

    International Nuclear Information System (INIS)

    Slunga, R.

    1979-03-01

    A method to assign probabilities to ground accelerations for Swedish sites is described. As hardly any near-field instrumental data are available, we are left with the problem of interpreting macroseismic data in terms of acceleration. By theoretical wave propagation computations, the relations between the seismic strength of the earthquake, focal depth, distance and ground acceleration are calculated. We found that most Swedish earthquakes are shallow; the largest earthquake of the area, the 1904 earthquake 100 km south of Oslo, is an exception and probably had a focal depth exceeding 25 km. For the nuclear power plant sites an annual probability of 10^-5 has been proposed as interesting. This probability gives ground accelerations in the range 5-20% g for the sites. This acceleration is for a free bedrock site. For consistency, all acceleration results in this study are given for bedrock sites. When applying our model to the 1904 earthquake and assuming the focal zone to be in the lower crust, we get the epicentral acceleration of this earthquake to be 5-15% g. The results above are based on an analysis of macroseismic data, as relevant instrumental data are lacking. However, the macroseismic acceleration model deduced in this study gives epicentral ground accelerations of small Swedish earthquakes in agreement with existing distant instrumental data. (author)

  7. DECOFF Probabilities of Failed Operations

    DEFF Research Database (Denmark)

    Gintautas, Tomas

    2015-01-01

    A statistical procedure of estimation of Probabilities of Failed Operations is described and exemplified using ECMWF weather forecasts and SIMO output from Rotor Lift test case models. Also safety factor influence is investigated. DECOFF statistical method is benchmarked against standard Alpha-factor...

  8. Risk estimation using probability machines

    Science.gov (United States)

    2014-01-01

    Background: Logistic regression has been the de facto, and often the only, model used in the description and analysis of relationships between a binary outcome and observed features. It is widely used to obtain the conditional probabilities of the outcome given predictors, as well as predictor effect size estimates using conditional odds ratios. Results: We show how statistical learning machines for binary outcomes, provably consistent for the nonparametric regression problem, can be used to provide both consistent conditional probability estimation and conditional effect size estimates. Effect size estimates from learning machines leverage our understanding of counterfactual arguments central to the interpretation of such estimates. We show that, if the data generating model is logistic, we can recover accurate probability predictions and effect size estimates with nearly the same efficiency as a correct logistic model, both for main effects and interactions. We also propose a method using learning machines to scan for possible interaction effects quickly and efficiently. Simulations using random forest probability machines are presented. Conclusions: The models we propose make no assumptions about the data structure, and capture the patterns in the data by specifying only the predictors involved and not any particular model structure. So they do not run the same risks of model mis-specification and the resultant estimation biases as a logistic model. This methodology, which we call a “risk machine”, will share properties of the statistical machine from which it is derived. PMID:24581306
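
    A minimal sketch of the idea, using scikit-learn (the synthetic dataset and model settings are arbitrary; this only illustrates using a random forest as a "probability machine", not the authors' implementation):

        from sklearn.datasets import make_classification
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import train_test_split

        # Synthetic binary-outcome data; the forest estimates P(y=1|x) directly,
        # with logistic regression as the parametric reference model.
        X, y = make_classification(n_samples=2000, n_features=10, random_state=0)
        Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

        rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(Xtr, ytr)
        lr = LogisticRegression(max_iter=1000).fit(Xtr, ytr)

        print(rf.predict_proba(Xte)[:5, 1].round(2))   # nonparametric estimates
        print(lr.predict_proba(Xte)[:5, 1].round(2))   # parametric estimates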

  9. Probability and statistics: A reminder

    International Nuclear Information System (INIS)

    Clement, B.

    2013-01-01

    The main purpose of these lectures is to provide the reader with the tools needed for data analysis in the framework of physics experiments. Basic concepts are introduced together with examples of application in experimental physics. The lecture is divided into two parts: probability and statistics. It is built on the introduction from 'data analysis in experimental sciences' given in [1]. (authors)

  10. Nash equilibrium with lower probabilities

    DEFF Research Database (Denmark)

    Groes, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte

    1998-01-01

    We generalize the concept of Nash equilibrium in mixed strategies for strategic form games to allow for ambiguity in the players' expectations. In contrast to other contributions, we model ambiguity by means of so-called lower probability measures or belief functions, which makes it possible...

  11. On probability-possibility transformations

    Science.gov (United States)

    Klir, George J.; Parviz, Behzad

    1992-01-01

    Several probability-possibility transformations are compared in terms of the closeness of preserving second-order properties. The comparison is based on experimental results obtained by computer simulation. Two second-order properties are involved in this study: noninteraction of two distributions and projections of a joint distribution.

  12. Correlated Default and Financial Intermediation

    OpenAIRE

    Gregory Phelan

    2015-01-01

    Financial intermediation naturally arises when knowledge about the aggregate state is valuable for managing investments and lenders cannot easily observe the aggregate state. I show this using a costly enforcement model in which lenders need ex-post incentives to enforce payments from defaulted loans and borrowers' payoffs are correlated. When projects have correlated outcomes, learning the state of one project (via enforcement) provides information about the states of other projects. A large...

  13. MHD intermediate shock discontinuities: Pt. 1

    International Nuclear Information System (INIS)

    Kennel, C.F.; Blandford, R.D.; Coppi, P.

    1989-01-01

    Recent numerical investigations have focused attention once more on the role of intermediate shocks in MHD. Four types of intermediate shock are identified using a graphical representation of the MHD Rankine-Hugoniot conditions. This same representation can be used to exhibit the close relationship of intermediate shocks to switch-on shocks and rotational discontinuities. The conditions under which intermediate discontinuities can be found are elucidated. The variations in velocity, pressure, entropy and magnetic-field jumps with upstream parameters in intermediate shocks are exhibited graphically. The evolutionary arguments traditionally advanced against intermediate shocks may fail because the equations of classical MHD are not strictly hyperbolic. (author)

  14. Can education improve clinical practice concerning delirium in older hospitalised patients? Results of a pre-test post-test study on an educational intervention for nursing staff.

    Science.gov (United States)

    van Velthuijsen, Eveline L; Zwakhalen, Sandra M G; Warnier, Ron M J; Ambergen, Ton; Mulder, Wubbo J; Verhey, Frans R J; Kempen, Gertrudis I J M

    2018-04-02

    Delirium is a common and serious complication of hospitalisation in older adults. It can lead to prolonged hospital stay, institutionalisation, and even death. However, it often remains unrecognised or is not managed adequately. The aim of this study was to evaluate the effects of an educational intervention for nursing staff on three aspects of clinical practice concerning delirium in older hospitalised patients: the frequency and correctness of screening for delirium using the 13-item Delirium Observation Screening score (DOS), and the frequency of geriatric consultations requested for older patients. The a priori expectations were that there would be an increase in all three of these outcomes. We designed an educational intervention and implemented this on two inpatient hospital units. Before providing the educational session, the nursing staff was asked to fill out two questionnaires about delirium in older hospitalised patients. The educational session was then tailored to each unit based on the results of these questionnaires. Additionally, posters and flyers with information on the screening and management of delirium were provided and participants were shown where to find additional information. Relevant data (outcomes, demographics and background patient data) were collected retrospectively from digital medical files. Data were collected retrospectively for four different time points: three pre-test and one post-test. There was a significant increase in the frequency of delirium screening (P = 0.001), and both units showed an increase in the correctness of the screening. No significant effect of the educational intervention was found for the proportion of patients who received a geriatric consultation (P = 0.083). The educational intervention was fairly successful in making positive changes in clinical practice: after the educational session an improvement in the frequency and correctness of screening for delirium was observed. A trend, though not

  15. Effects of video-feedback on the communication, clinical competence and motivational interviewing skills of practice nurses: a pre-test posttest control group study.

    Science.gov (United States)

    Noordman, Janneke; van der Weijden, Trudy; van Dulmen, Sandra

    2014-10-01

    To examine the effects of individual video-feedback on the generic communication skills, clinical competence (i.e. adherence to practice guidelines) and motivational interviewing skills of experienced practice nurses working in primary care. Continuing professional education may be necessary to refresh and reflect on the communication and motivational interviewing skills of experienced primary care practice nurses. A video-feedback method was designed to improve these skills. Pre-test/posttest control group design. Seventeen Dutch practice nurses and 325 patients participated between June 2010 and June 2011. Nurse-patient consultations were videotaped at two moments (T0 and T1), with an interval of 3-6 months. The videotaped consultations were rated using two protocols: the Maastrichtse Anamnese en Advies Scorelijst met globale items (MAAS-global) and the Behaviour Change Counselling Index. Before the recordings, nurses were allocated to a control or video-feedback group. Nurses allocated to the video-feedback group received video-feedback between T0 and T1. Data were analysed using multilevel linear or logistic regression. Nurses who received video-feedback appeared to pay significantly more attention to patients' request for help and their physical examination, and gave significantly more understandable information. With respect to motivational interviewing, nurses who received video-feedback appeared to pay more attention to 'agenda setting and permission seeking' during their consultations. Video-feedback is a potentially effective method to improve practice nurses' generic communication skills. Although a single video-feedback session does not seem sufficient to increase all motivational interviewing skills, significant improvement in some specific skills was found. Nurses' clinical competence was not altered after feedback, due to already high baseline standards. © 2014 John Wiley & Sons Ltd.

  16. Smokers' responses to television advertisements about the serious harms of tobacco use: pre-testing results from 10 low- to middle-income countries.

    Science.gov (United States)

    Wakefield, Melanie; Bayly, Megan; Durkin, Sarah; Cotter, Trish; Mullin, Sandra; Warne, Charles

    2013-01-01

    While television advertisements (ads) that communicate the serious harms of smoking are effective in prompting quitting-related thoughts and actions, little research has been conducted among smokers in low- to middle-income countries to guide public education efforts. 2399 smokers aged 18-34 years in 10 low- to middle-income countries (Bangladesh, China, Egypt, India, Indonesia, Mexico, Philippines, Russia, Turkey and Vietnam) viewed and individually rated the same five anti-smoking ads on a standard questionnaire and then engaged in a structured group discussion about each ad. Multivariate logistic regression analysis, with robust SEs to account for the same individual rating multiple ads, was performed to compare outcomes (message acceptance, perceived personalised effectiveness, feeling uncomfortable, likelihood of discussing the ad) across ads and countries, adjusting for covariates. Ad-by-country interactions were examined to assess the consistency of ratings across countries. Three ads with graphic imagery performed consistently well across all countries. Two of these ads showed diseased human tissue or body parts, and a third used a disgust-provoking metaphor to demonstrate tar accumulation in smokers' lungs. A personal testimonial ad performed more variably, as many smokers did not appreciate that the featured woman's lung cancer was due to smoking or that her altered physical appearance was due to chemotherapy. An ad using a visual metaphor for lung disease was also more variable, mostly due to lack of understanding of the term 'emphysema'. Television ads that graphically communicate the serious harms of tobacco use are likely to be effective with smokers in low- to middle-income countries and can be readily translated and adapted for local use. Ads with complex medical terms or metaphors, or those that feature personal testimonials, are more variable and at least require more careful pre-testing and adaptation to maximise their potential.

  17. Large deviations and idempotent probability

    CERN Document Server

    Puhalskii, Anatolii

    2001-01-01

    In the view of many probabilists, author Anatolii Puhalskii's research results stand among the most significant achievements in the modern theory of large deviations. In fact, his work marked a turning point in the depth of our understanding of the connections between the large deviation principle (LDP) and well-known methods for establishing weak convergence results. Large Deviations and Idempotent Probability expounds upon the recent methodology of building large deviation theory along the lines of weak convergence theory. The author develops an idempotent (or maxitive) probability theory, introduces idempotent analogues of martingales (maxingales), Wiener and Poisson processes, and Ito differential equations, and studies their properties. The large deviation principle for stochastic processes is formulated as a certain type of convergence of stochastic processes to idempotent processes. The author calls this large deviation convergence. The approach to establishing large deviation convergence uses novel com...

  18. Probability biases as Bayesian inference

    Directory of Open Access Journals (Sweden)

    Andre; C. R. Martins

    2006-11-01

    In this article, I will show how several observed biases in human probabilistic reasoning can be partially explained as good heuristics for making inferences in an environment where probabilities have uncertainties associated with them. Previous results show that the weight functions and the observed violations of coalescing and stochastic dominance can be understood from a Bayesian point of view. We will review those results and see that Bayesian methods should also be used as part of the explanation behind other known biases. That means that, although the observed errors are still errors, they can be understood as adaptations to the solution of real life problems. Heuristics that allow fast evaluations and mimic a Bayesian inference would be an evolutionary advantage, since they would give us an efficient way of making decisions. In that sense, it should be no surprise that humans reason with probability as it has been observed.

  19. Probability matching and strategy availability.

    Science.gov (United States)

    Koehler, Derek J; James, Greta

    2010-09-01

    Findings from two experiments indicate that probability matching in sequential choice arises from an asymmetry in strategy availability: The matching strategy comes readily to mind, whereas a superior alternative strategy, maximizing, does not. First, compared with the minority who spontaneously engage in maximizing, the majority of participants endorse maximizing as superior to matching in a direct comparison when both strategies are described. Second, when the maximizing strategy is brought to their attention, more participants subsequently engage in maximizing. Third, matchers are more likely than maximizers to base decisions in other tasks on their initial intuitions, suggesting that they are more inclined to use a choice strategy that comes to mind quickly. These results indicate that a substantial subset of probability matchers are victims of "underthinking" rather than "overthinking": They fail to engage in sufficient deliberation to generate a superior alternative to the matching strategy that comes so readily to mind.
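
    The cost of matching is easy to quantify. If one outcome occurs with probability p, predicting each outcome at its base rate (matching) is correct with probability p^2 + (1-p)^2, while always predicting the likelier outcome (maximizing) is correct with probability max(p, 1-p). A one-function sketch:

        def expected_accuracy(p):
            """Expected hit rates for matching vs. maximizing at base rate p."""
            matching = p * p + (1 - p) * (1 - p)   # predict outcomes at their base rates
            maximizing = max(p, 1 - p)             # always predict the likelier outcome
            return matching, maximizing

        for p in (0.6, 0.7, 0.8):
            m, M = expected_accuracy(p)
            print(f"p = {p}: matching = {m:.2f}, maximizing = {M:.2f}")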

  20. Probability as a Physical Motive

    Directory of Open Access Journals (Sweden)

    Peter Martin

    2007-04-01

    Recent theoretical progress in nonequilibrium thermodynamics, linking the physical principle of Maximum Entropy Production (“MEP”) to the information-theoretical “MaxEnt” principle of scientific inference, together with conjectures from theoretical physics that there may be no fundamental causal laws but only probabilities for physical processes, and from evolutionary theory that biological systems expand “the adjacent possible” as rapidly as possible, all lend credence to the proposition that probability should be recognized as a fundamental physical motive. It is further proposed that spatial order and temporal order are two aspects of the same thing, and that this is the essence of the second law of thermodynamics.

  1. Logic, Probability, and Human Reasoning

    Science.gov (United States)

    2015-01-01

    accordingly suggest a way to integrate probability and deduction. The nature of deductive reasoning: to be rational is to be able to make deductions...3–6] and they underlie mathematics, science, and technology [7–10]. Plato claimed that emotions upset reasoning. However, individuals in the grip...fundamental to human rationality. So, if counterexamples to its principal predictions occur, the theory will at least explain its own refutation

  2. Probability Measures on Groups IX

    CERN Document Server

    1989-01-01

    The latest in this series of Oberwolfach conferences focussed on the interplay between structural probability theory and various other areas of pure and applied mathematics such as Tauberian theory, infinite-dimensional rotation groups, central limit theorems, harmonizable processes, and spherical data. Thus it was attended by mathematicians whose research interests range from number theory to quantum physics in conjunction with structural properties of probabilistic phenomena. This volume contains 5 survey articles submitted on special invitation and 25 original research papers.

  3. Probability matching and strategy availability

    OpenAIRE

    Koehler, Derek J.; James, Greta

    2010-01-01

    Findings from two experiments indicate that probability matching in sequential choice arises from an asymmetry in strategy availability: The matching strategy comes readily to mind, whereas a superior alternative strategy, maximizing, does not. First, compared with the minority who spontaneously engage in maximizing, the majority of participants endorse maximizing as superior to matching in a direct comparison when both strategies are described. Second, when the maximizing strategy is brought...

  4. Role of Intermediate Filaments in Vesicular Traffic

    Directory of Open Access Journals (Sweden)

    Azzurra Margiotta

    2016-04-01

    Intermediate filaments are an important component of the cellular cytoskeleton. The first established role attributed to intermediate filaments was mechanical support for cells. However, it is now clear that intermediate filaments have many different roles affecting a variety of other biological functions, such as the organization of microtubules and microfilaments, the regulation of nuclear structure and activity, the control of the cell cycle and the regulation of signal transduction pathways. Furthermore, a number of intermediate filament proteins have been involved in the acquisition of tumorigenic properties. Over the last years, a strong involvement of intermediate filament proteins in the regulation of several aspects of intracellular trafficking has strongly emerged. Here, we review the functions of intermediate filament proteins, focusing mainly on the recent knowledge gained from the discovery that intermediate filaments associate with key proteins of the vesicular membrane transport machinery. In particular, we analyze the current understanding of the contribution of intermediate filaments to the endocytic pathway.

  5. Influence of an Intermediate Option on the Description-Experience Gap and Information Search

    OpenAIRE

    Neha Sharma; Shoubhik Debnath; Varun Dutt

    2018-01-01

    Research shows that people tend to overweight small probabilities in description and underweight them in experience, thereby leading to a different pattern of choices between description and experience; a phenomenon known as the Description-Experience (DE) gap. However, little is known on how the addition of an intermediate option and contextual framing influences the DE gap and people’s search strategies. This paper tests the effects of an intermediate option and contextual framing on the DE...

  6. ESL intermediate/advanced writing

    CERN Document Server

    Munoz Page, Mary Ellen; Jaskiewicz, Mary

    2011-01-01

    Master ESL (English as a Second Language) Writing with the study guide designed for non-native speakers of English. Skill-building lessons relevant to today's topics help ESL students write complete sentences, paragraphs, and even multi-paragraph essays. It's perfect for classroom use or self-guided writing preparation.DETAILS- Intermediate drills for improving skills with parallel structure, mood, correct shifting errors & dangling participles- Advanced essay drills focusing on narrative, descriptive, process, reaction, comparison and contrast- Superb preparation for students taking the TOEFL

  7. Photonuclear reactions at intermediate energy

    International Nuclear Information System (INIS)

    Koch, J.H.

    1982-01-01

    The dominant feature of photonuclear reactions at intermediate energies is the excitation of the Δ resonance, and one can therefore use such reactions to study the dynamics of Δ propagation in a nucleus. Following an introductory section the author comments on photoabsorption on a single nucleon in Section II. A review of the Δ-N Green's function and of the photonuclear amplitude is given in Section III. Results for photoabsorption on 4He are shown in Section IV and compared with the data. Coherent π0 photoproduction is discussed in Section V and calculations for 12C are compared to recent measurements. (Auth.)

  8. Pelamis WEC - intermediate scale demonstration

    Energy Technology Data Exchange (ETDEWEB)

    Yemm, R.

    2003-07-01

    This report describes the successful building and commissioning of an intermediate 1/7th scale model of the Pelamis Wave Energy Converter (WEC) and its testing in the wave climate of the Firth of Forth. Details are given of the design of the semi-submerged articulated structure of cylindrical elements linked by hinged joints. The specific programme objectives and conclusions, development issues addressed, and key remaining risks are discussed along with development milestones to be passed before the Pelamis WEC is ready for full-scale prototype testing.

  9. Intermediality: Bridge to Critical Media Literacy.

    Science.gov (United States)

    Pailliotet, Ann Watts; Semali, Ladislaus; Rodenberg, Rita K.; Giles, Jackie K.; Macaul, Sherry L.

    2000-01-01

    Defines "intermediality" as the ability to critically read and write with and across varied symbol systems. Relates it to critical media literacy. Offers rationales for teaching critical media literacy in general, and intermedial instruction in particular. Identifies seven guiding intermedial elements: theory, texts, processes, contexts,…

  10. [Biometric bases: basic concepts of probability calculation].

    Science.gov (United States)

    Dinya, E

    1998-04-26

    The author gives an outline of the basic concepts of probability theory. The bases of the event algebra, the definition of probability, the classical probability model and the random variable are presented.

  11. The Effect of Using Video Technology on Improving Reading Comprehension of Iranian Intermediate EFL Learners

    Directory of Open Access Journals (Sweden)

    Amir Mohammadian

    2018-04-01

    With the development of educational technology, the concept of technology-enhanced multimedia instruction is widely used in educational settings. Technology can be employed in teaching different skills such as listening, reading, speaking and writing. Among these skills, reading comprehension is one that EFL learners often have problems mastering. Regarding this issue, the present study aimed at investigating the effect of video materials on improving the reading comprehension of Iranian intermediate EFL learners. A Longman Placement Test was administered to 30 EFL learners to ensure that the learners were at the same level of proficiency. The students were chosen from state high schools in Chabahar. The participants were regarded as intermediate learners and were divided into two groups (one experimental group and one control group). Then, a pre-test of reading comprehension was administered to assess the participants' reading comprehension. The participants of the experimental group used video files to improve their reading comprehension while the control group received conventional approaches to teaching reading comprehension. Finally, all the participants were given a 40-item multiple-choice reading comprehension post-test. The results of the study indicated that video materials had a significant effect on promoting the reading comprehension of Iranian intermediate EFL learners (p = .000 < .05).

  12. Post-crisis financial intermediation

    Directory of Open Access Journals (Sweden)

    Ilie MIHAI

    2015-09-01

    The recent financial crisis that began in 2007 in the US, and then swept around the world, has left deep scars on the already wrinkled face of the global economy. Some national and regional economies, which had money for expensive makeup, or created money[1], managed to blur or hide the scars left by the crisis; others are still facing difficulties in overcoming its effects. The rapacity of banks, their greed and ignorance of risk, were at the origin of the outbreak of the last major economic and financial crisis, but unfortunately those who were responsible or, rather, irresponsible, paid little or nothing at all for the burden of their bad loan portfolios. This cost has been borne by the population, either directly by paying high interest and fees [Mihai I., 2007], or indirectly, through the use of public budgets to cover the losses of banks, most of which had private capital. In this context, we intend to examine the state of financial intermediation in Romania in the post-crisis period, and primarily to follow: (i) the structure and evolution of the banking system; (ii) the non-government credit situation; (iii) the level of savings; (iv) the loan-deposit ratio; (v) the degree of financial intermediation and the disintermediation phenomenon, etc., and to articulate some conclusions and suggestions on the matters that have been explored.

  13. Intermediate-Mass Black Holes

    Science.gov (United States)

    Miller, M. Coleman; Colbert, E. J. M.

    2004-01-01

    The mathematical simplicity of black holes, combined with their links to some of the most energetic events in the universe, means that black holes are key objects for fundamental physics and astrophysics. Until recently, it was generally believed that black holes in nature appear in two broad mass ranges: stellar-mass (M ~ 3-20 M⊙), which are produced by the core collapse of massive stars, and supermassive (M ~ 10^6-10^10 M⊙), which are found in the centers of galaxies and are produced by a still uncertain combination of processes. In the last few years, however, evidence has accumulated for an intermediate-mass class of black holes, with M ~ 10^2-10^4 M⊙. If such objects exist they have important implications for the dynamics of stellar clusters, the formation of supermassive black holes, and the production and detection of gravitational waves. We review the evidence for intermediate-mass black holes and discuss future observational and theoretical work that will help clarify numerous outstanding questions about these objects.

  14. Probability for Weather and Climate

    Science.gov (United States)

    Smith, L. A.

    2013-12-01

    Over the last 60 years, the availability of large-scale electronic computers has stimulated rapid and significant advances both in meteorology and in our understanding of the Earth System as a whole. The speed of these advances was due, in large part, to the sudden ability to explore nonlinear systems of equations. The computer allows the meteorologist to carry a physical argument to its conclusion; the time scales of weather phenomena then allow the refinement of physical theory, numerical approximation or both in light of new observations. Prior to this extension, as Charney noted, the practicing meteorologist could ignore the results of theory with good conscience. Today, neither the practicing meteorologist nor the practicing climatologist can do so, but to what extent, and in what contexts, should they place the insights of theory above quantitative simulation? And in what circumstances can one confidently estimate the probability of events in the world from model-based simulations? Despite solid advances of theory and insight made possible by the computer, the fidelity of our models of climate differs in kind from the fidelity of models of weather. While all prediction is extrapolation in time, weather resembles interpolation in state space, while climate change is fundamentally an extrapolation. The trichotomy of simulation, observation and theory, which has proven essential in meteorology, will remain incomplete in climate science. Operationally, the roles of probability, indeed the kinds of probability one has access to, are different in operational weather forecasting and climate services. Significant barriers to forming probability forecasts (which can be used rationally as probabilities) are identified. Monte Carlo ensembles can explore sensitivity, diversity, and (sometimes) the likely impact of measurement uncertainty and structural model error. The aims of different ensemble strategies, and fundamental differences in ensemble design to support of

  15. Probability, Statistics, and Stochastic Processes

    CERN Document Server

    Olofsson, Peter

    2012-01-01

    This book provides a unique and balanced approach to probability, statistics, and stochastic processes.   Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area.  The Second Edition features new coverage of analysis of variance (ANOVA), consistency and efficiency of estimators, asymptotic theory for maximum likelihood estimators, empirical distribution function and the Kolmogorov-Smirnov test, general linear models, multiple comparisons, Markov chain Monte Carlo (MCMC), Brownian motion, martingales, and

  16. Probability, statistics, and computational science.

    Science.gov (United States)

    Beerenwinkel, Niko; Siebourg, Juliane

    2012-01-01

    In this chapter, we review basic concepts from probability theory and computational statistics that are fundamental to evolutionary genomics. We provide a very basic introduction to statistical modeling and discuss general principles, including maximum likelihood and Bayesian inference. Markov chains, hidden Markov models, and Bayesian network models are introduced in more detail as they occur frequently and in many variations in genomics applications. In particular, we discuss efficient inference algorithms and methods for learning these models from partially observed data. Several simple examples are given throughout the text, some of which point to models that are discussed in more detail in subsequent chapters.

  17. Sensitivity analysis using probability bounding

    International Nuclear Information System (INIS)

    Ferson, Scott; Troy Tucker, W.

    2006-01-01

    Probability bounds analysis (PBA) provides analysts a convenient means to characterize the neighborhood of possible results that would be obtained from plausible alternative inputs in probabilistic calculations. We show the relationship between PBA and the methods of interval analysis and probabilistic uncertainty analysis from which it is jointly derived, and indicate how the method can be used to assess the quality of probabilistic models such as those developed in Monte Carlo simulations for risk analyses. We also illustrate how a sensitivity analysis can be conducted within a PBA by pinching inputs to precise distributions or real values
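
    A toy version of the pinching idea, with plain intervals standing in for full p-boxes (the component probabilities and the series model below are hypothetical, chosen only to show the mechanics):

        def mul_interval(a, b):
            """Product of two intervals a = (lo, hi), b = (lo, hi)."""
            prods = [x * y for x in a for y in b]
            return (min(prods), max(prods))

        def width(iv):
            return iv[1] - iv[0]

        # System failure probability p1*p2 for two components that must both fail.
        p1, p2 = (0.01, 0.05), (0.10, 0.30)
        base = mul_interval(p1, p2)

        # "Pinch" p1 to a precise value; the fraction of output width removed is a
        # crude sensitivity measure of the kind PBA supports.
        pinched = mul_interval((0.03, 0.03), p2)
        print(base, pinched, round(1.0 - width(pinched) / width(base), 3))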

  18. Transport description of intermediate processes in heavy ion collisions

    International Nuclear Information System (INIS)

    Ayik, S.; Shivakumar, B.; Shapira, D.

    1986-01-01

    An extension of the diffusion model is proposed in order to describe the intermediate processes and the compound nucleus formation in heavy ion collisions. The model describes the intermediate processes and fusion in terms of the formation and the evolution of a long-lived dinuclear molecular complex (DMC) and its subsequent decay by fragmentation. The colliding ions can be trapped into the pocket of the entrance channel nucleus-nucleus potential and a DMC is formed. This DMC acts as a doorway state towards formation of a completely equilibrated compound nucleus (CN). It evolves through the exchange of nucleons to different dinuclear configurations. At each stage of its evolution, there is a finite probability for direct fragmentation into outgoing channels by thermal penetration over the barrier. The doorway states that do not fragment relax into a CN configuration and are identified as the fusion yield. 8 refs

  19. Pre-Test CFD for the Design and Execution of the Enhanced Injection and Mixing Project at NASA Langley Research Center

    Science.gov (United States)

    Drozda, Tomasz G.; Axdahl, Erik L.; Cabell, Karen F.

    2014-01-01

    With the increasing costs of physics experiments and the simultaneous increase in availability and maturity of computational tools, it is not surprising that computational fluid dynamics (CFD) is playing an increasingly important role, not only in post-test investigations, but also in the early stages of experimental planning. This paper describes a CFD-based effort executed in close collaboration between computational fluid dynamicists and experimentalists to develop a virtual experiment during the early planning stages of the Enhanced Injection and Mixing project at NASA Langley Research Center. This project aims to investigate supersonic combustion ramjet (scramjet) fuel injection and mixing physics, improve the understanding of underlying physical processes, and develop enhancement strategies and functional relationships relevant to flight Mach numbers greater than 8. The purpose of the virtual experiment was to provide flow field data to aid in the design of the experimental apparatus and the in-stream rake probes, to verify the nonintrusive measurements based on NO-PLIF, and to perform pre-test analysis of quantities obtainable from the experiment and CFD. The approach also allowed the joint team to develop common data processing and analysis tools, and to test research ideas. The virtual experiment consisted of a series of Reynolds-averaged simulations (RAS). These simulations included the facility nozzle, the experimental apparatus with a baseline strut injector, and the test cabin. Pure helium and helium-air mixtures were used to determine the efficacy of different inert gases to model hydrogen injection. The results of the simulations were analyzed by computing mixing efficiency, total pressure recovery, and stream thrust potential. As the experimental effort progresses, the simulation results will be compared with the experimental data to calibrate the modeling constants present in the CFD and validate simulation fidelity. CFD will also be used to

  20. Pre-test analysis of a LBLOCA using the design data of the ATLAS facility, a reduced-height integral effect test loop for PWRs

    International Nuclear Information System (INIS)

    Hyun-Sik Park; Ki-Yong Choi; Dong-Jin Euh; Tae-Soon Kwon; Won-Pil Baek

    2005-01-01

    Full text of publication follows: The simulation capability of the KAERI integral effect test facility, ATLAS (Advanced Thermalhydraulic Test Loop for Accident Simulation), has been assessed for a large-break loss-of-coolant accident (LBLOCA) transient. The ATLAS facility is a 1/2 height-scaled, 1/144 area-scaled (1/288 in volume scale), full-pressure test loop based on the design features of the APR1400, an evolutionary pressurized water reactor developed by Korean industry. The APR1400 has four mechanically separated hydraulic trains for the emergency core cooling system (ECCS) with direct vessel injection (DVI). These design features have brought about several new safety issues related to the LBLOCA, including steam-water interaction, ECC bypass, and boiling in the reactor vessel downcomer. The ATLAS facility will be used to investigate the multiple responses between systems or between components during various anticipated transients. The ATLAS facility has been designed according to a scaling method mainly based on the model suggested by Ishii and Kataoka. The ATLAS facility is being evaluated against the prototype plant APR1400, with the same control logics and accident scenarios, using the best-estimate code MARS. This paper briefly introduces the basic design features of the ATLAS facility and presents the results of a pre-test analysis for a postulated LBLOCA of a cold leg. The LBLOCA analyses have been conducted to assess the validity of the applied scaling law and the similarity between the ATLAS facility and the APR1400. Because the core simulator of the ATLAS facility is limited to 10% of the scaled full power, the blowdown phase cannot be simulated, and the accident scenario therefore starts around the end of blowdown; finding the correct initial conditions is thus an important problem. For the analyzed LBLOCA scenario, the ATLAS facility showed very similar thermal-hydraulic characteristics to the APR1400.
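
    For orientation, the quoted geometric ratios fix the remaining similarity ratios once the reduced-height scaling relations are assumed. The sketch below is my own illustration under the standard Ishii-Kataoka relations; the derived flow and power ratios are not quoted from the paper.

    ```python
    # Reduced-height similarity ratios implied by ATLAS's quoted geometry,
    # assuming the standard Ishii-Kataoka relations (illustrative only).
    length_ratio = 1 / 2          # model-to-prototype height
    area_ratio = 1 / 144          # flow-area ratio

    volume_ratio = area_ratio * length_ratio      # = 1/288, as quoted
    velocity_ratio = length_ratio ** 0.5          # u_R = sqrt(l_R)
    time_ratio = length_ratio / velocity_ratio    # t_R = sqrt(l_R), ~0.707
    flow_ratio = area_ratio * velocity_ratio      # mass flow, ~1/203.6
    power_ratio = flow_ratio                      # equal-enthalpy-rise assumption

    print(f"volume 1:{1 / volume_ratio:.0f}, time ratio {time_ratio:.3f}, "
          f"flow and power 1:{1 / power_ratio:.1f}")
    ```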

  1. The impact of Nursing Rounds on the practice environment and nurse satisfaction in intensive care: pre-test post-test comparative study.

    Science.gov (United States)

    Aitken, Leanne M; Burmeister, Elizabeth; Clayton, Samantha; Dalais, Christine; Gardner, Glenn

    2011-08-01

    Factors previously shown to influence patient care include effective decision making, team work, evidence based practice, staffing and job satisfaction. Clinical rounds have the potential to optimise these factors and impact on patient outcomes, but use of this strategy by intensive care nurses has not been reported. To determine the effect of implementing Nursing Rounds in the intensive care environment on patient care planning and nurses' perceptions of the practice environment and work satisfaction. Pre-test post-test 2 group comparative design. Two intensive care units in tertiary teaching hospitals in Australia. A convenience sample of registered nurses (n=244) working full time or part time in the participating intensive care units. Nurses in participating intensive care units were asked to complete the Practice Environment Scale-Nursing Work Index (PES-NWI) and the Nursing Worklife Satisfaction Scale (NWSS) prior to and after a 12 month period during which regular Nursing Rounds were conducted in the intervention unit. Issues raised during Nursing Rounds were described and categorised. The characteristics of the sample and scale scores were summarised with differences between pre and post scores analysed using t-tests for continuous variables and chi-square tests for categorical variables. Independent predictors of the PES-NWI were determined using multivariate linear regression. Nursing Rounds resulted in 577 changes being initiated for 171 patients reviewed; these changes related to the physical, psychological--individual, psychological--family, or professional practice aspects of care. Total PES-NWI and NWSS scores were similar before and after the study period in both participating units. The NWSS sub-scale of interaction between nurses improved in the intervention unit during the study period (pre--4.85±0.93; post--5.36±0.89, p=0.002) with no significant increase in the control group. Factors independently related to higher PES-NWI included

  2. Patient perspectives with abbreviated versus standard pre-test HIV counseling in the prenatal setting: a randomized-controlled, non-inferiority trial.

    Science.gov (United States)

    Cohan, Deborah; Gomez, Elvira; Greenberg, Mara; Washington, Sierra; Charlebois, Edwin D

    2009-01-01

    In the US, an unacceptably high percentage of pregnant women do not undergo prenatal HIV testing. Previous studies have found increased uptake of prenatal HIV testing with abbreviated pre-test counseling; however, little is known about patient decision making, testing satisfaction and knowledge in this setting. A randomized-controlled, non-inferiority trial was conducted from October 2006 through February 2008 at San Francisco General Hospital (SFGH), the public teaching hospital of the City and County of San Francisco. A total of 278 English- and Spanish-speaking pregnant women were randomized to receive either abbreviated or standard nurse-performed HIV test counseling at the initial prenatal visit. Patient decision making experience was compared between abbreviated versus standard HIV counseling strategies among a sample of low-income, urban, ethnically diverse prenatal patients. The primary outcome was the decisional conflict score (DCS) using O'Connor's low-literacy scale, and secondary outcomes included satisfaction with the test decision, basic HIV knowledge and HIV testing uptake. We conducted an intention-to-treat analysis of 278 women--134 (48.2%) in the abbreviated arm (AA) and 144 (51.8%) in the standard arm (SA). There was no significant difference in the proportion of women with low decisional conflict (71.6% in AA vs. 76.4% in SA, p = .37), and the observed mean difference between the groups of 3.88 (95% CI: -0.65, 8.41) did not exceed the non-inferiority margin. HIV testing uptake was very high (97.8%) and did not differ significantly between the 2 groups (99.3% in AA vs. 96.5% in SA, p = .12). Likewise, there was no difference in satisfaction with the testing decision (97.8% in AA vs. 99.3% in SA, p = .36). However, women in AA had significantly lower mean HIV knowledge scores (78.4%) than women in SA (83.7%). The abbreviated counseling process, while associated with slightly lower knowledge, does not compromise patient decision making or satisfaction regarding HIV testing.

  3. Lectures on probability and statistics

    International Nuclear Information System (INIS)

    Yost, G.P.

    1984-09-01

    These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. We begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. We finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another
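
    The dice example can be run in both directions in a few lines: the forward (probability) problem by enumeration, and the inverse (statistics) problem by a maximum-likelihood estimate from invented observations.

    ```python
    from itertools import product
    from fractions import Fraction
    from collections import Counter

    # Forward problem: a priori distribution of the sum of two fair dice.
    counts = Counter(a + b for a, b in product(range(1, 7), repeat=2))
    p_sum = {s: Fraction(n, 36) for s, n in counts.items()}
    print(p_sum[7])                                  # 1/6

    # Inverse problem: infer P(six) for one die from (hypothetical) data.
    rolls = [6, 2, 6, 5, 6, 1, 3, 6, 4, 6]
    print(sum(r == 6 for r in rolls) / len(rolls))   # ML estimate: 0.5
    ```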

  4. On spallation and fragmentation of heavy ions at intermediate energies

    International Nuclear Information System (INIS)

    Musulmanbekov, G.; Al-Haidary, A.

    2002-01-01

    A new code for the simulation of spallation and (multi)fragmentation of nuclei in proton- and nucleus-induced collisions at intermediate and high energies is developed. The code combines a modified intranuclear cascade model, a traditional fission-evaporation part, and a multifragmentation part based on a lattice representation of nuclear structure and a percolation approach. The production of s-wave resonances and a formation-time concept, included in the standard intranuclear cascade code, provide a correct calculation of the excitation energy of residues. This modified cascade code serves as a bridge between low- and high-energy model descriptions of nucleus-nucleus collisions. Good agreement with experiments has been obtained for multiparticle production at intermediate and relatively high energies. The nuclear structure of the colliding nuclei is represented as a face-centered cubic lattice. This representation, being isomorphic to the shell model of nuclear structure, allows a percolation approach to be applied to nuclear fragmentation. The proposed percolation model includes both site and bond percolation. In the first, fast cascade stage, mutual rescattering of the colliding nuclei knocks some nucleons out of them; broken sites represent the holes left by these knocked-out nucleons, and partially destroyed, excited residues remain. In the second stage, the residual nuclei either evaporate nucleons and light nuclei up to alpha particles or fragment into pieces with intermediate masses. The choice depends on the residue's degree of destruction. At low excitation energy and small destruction of the residue, the evaporation and fission mechanisms are preferred; the higher the excitation energy and the destruction, the higher the probability of the (multi)fragmentation process, and the higher the degree of destruction of the residue, the higher the site percolation probability. It is concluded that at low and intermediate excitation energies the fragmentation of nuclei is slow
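
    The site-percolation step lends itself to a quick numerical sketch. The toy below uses a simple cubic lattice rather than the paper's face-centered cubic one, and an arbitrary knockout fraction; it only illustrates how holes left by knocked-out nucleons break the residue into clusters that play the role of fragments.

    ```python
    import numpy as np
    from scipy.ndimage import label

    rng = np.random.default_rng(0)
    nucleus = np.ones((8, 8, 8), dtype=bool)        # occupied sites = nucleons
    knocked_out = rng.random(nucleus.shape) < 0.35  # cascade holes (assumed 35%)
    survivors = nucleus & ~knocked_out

    fragments, n_frag = label(survivors)            # 6-connected clusters
    sizes = np.bincount(fragments.ravel())[1:]
    print(n_frag, "fragments; largest:", np.sort(sizes)[-3:])
    ```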

  5. Probability theory a comprehensive course

    CERN Document Server

    Klenke, Achim

    2014-01-01

    This second edition of the popular textbook contains a comprehensive course in modern probability theory. Overall, probabilistic concepts play an increasingly important role in mathematics, physics, biology, financial engineering and computer science. They help us in understanding magnetism, amorphous media, genetic diversity and the perils of random developments at financial markets, and they guide us in constructing more efficient algorithms.   To address these concepts, the title covers a wide variety of topics, many of which are not usually found in introductory textbooks, such as:   • limit theorems for sums of random variables • martingales • percolation • Markov chains and electrical networks • construction of stochastic processes • Poisson point process and infinite divisibility • large deviation principles and statistical physics • Brownian motion • stochastic integral and stochastic differential equations. The theory is developed rigorously and in a self-contained way, with the c...

  6. Estimation of functional failure probability of passive systems based on subset simulation method

    International Nuclear Information System (INIS)

    Wang Dongqing; Wang Baosheng; Zhang Jianmin; Jiang Jing

    2012-01-01

    In order to solve the problem of multi-dimensional epistemic uncertainties and the small functional failure probability of passive systems, an innovative reliability analysis algorithm, subset simulation based on Markov chain Monte Carlo, was presented. The method is founded on the idea that a small failure probability can be expressed as a product of larger conditional failure probabilities by introducing a proper choice of intermediate failure events. Markov chain Monte Carlo simulation was implemented to efficiently generate conditional samples for estimating the conditional failure probabilities. Taking the AP1000 passive residual heat removal system as an example, the uncertainties related to the model of a passive system and the numerical values of its input parameters were considered in this paper, and the probability of functional failure was then estimated with the subset simulation method. The numerical results demonstrate that the subset simulation method offers high computational efficiency and excellent accuracy compared with traditional probability analysis methods. (authors)
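
    The product-of-conditional-probabilities idea can be shown in a self-contained toy. The sketch below is my own illustration, not the authors' implementation: it estimates a small failure probability for an analytic limit state with standard normal inputs, growing component-wise Metropolis chains inside each intermediate failure event.

    ```python
    import numpy as np

    def subset_simulation(g, dim, n=1000, p0=0.1, max_levels=10, seed=0):
        """Estimate P(g(X) <= 0), X ~ N(0, I), as a product of conditional
        failure probabilities defined by intermediate thresholds."""
        rng = np.random.default_rng(seed)
        x = rng.standard_normal((n, dim))
        pf = 1.0
        for _ in range(max_levels):
            y = g(x)
            b = np.quantile(y, p0)                 # intermediate failure event
            if b <= 0.0:                           # true failure domain reached
                return pf * np.mean(y <= 0.0)
            pf *= p0
            seeds = x[y <= b]
            samples = []
            for s in seeds:                        # Markov chains on {g <= b}
                cur = s.copy()
                for _ in range(n // len(seeds)):
                    cand = cur + rng.uniform(-1.0, 1.0, dim)
                    ratio = np.exp(0.5 * (cur**2 - cand**2))  # N(0,1) density ratio
                    reject = rng.random(dim) >= np.minimum(1.0, ratio)
                    cand[reject] = cur[reject]     # component-wise accept/reject
                    if g(cand[None, :])[0] <= b:   # stay inside the event
                        cur = cand
                    samples.append(cur.copy())
            x = np.asarray(samples)[:n]
        return pf

    g = lambda x: 4.0 - x.sum(axis=-1) / np.sqrt(x.shape[-1])  # pf = Phi(-4)
    print(subset_simulation(g, dim=10))            # roughly 3e-5 vs exact 3.17e-5
    ```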

  7. Excluding joint probabilities from quantum theory

    Science.gov (United States)

    Allahverdyan, Armen E.; Danageozian, Arshag

    2018-03-01

    Quantum theory does not provide a unique definition for the joint probability of two noncommuting observables, which is the next important question after Born's probability for a single observable. Instead, various definitions were suggested, e.g., via quasiprobabilities or via hidden-variable theories. After reviewing open issues of the joint probability, we relate it to quantum imprecise probabilities, which are noncontextual and are consistent with all constraints expected from a quantum probability. We study two noncommuting observables in a two-dimensional Hilbert space and show that there is no precise joint probability that applies for any quantum state and is consistent with imprecise probabilities. This contrasts with theorems by Bell and Kochen-Specker that exclude joint probabilities for more than two noncommuting observables, in Hilbert spaces with dimension larger than two. If measurement contexts are included in the definition, joint probabilities are not excluded anymore, but they are still constrained by imprecise probabilities.

  8. Offsetting the difficulties of the molecular model of atomic collisions in the intermediate velocity range

    International Nuclear Information System (INIS)

    Errea, L.F.; Mendez, L.; Riera, A.

    1991-01-01

    To offset the defective behavior of the molecular method of atomic collisions at intermediate energies, we propose a method to approximate the probability flux towards continuum and discrete states not included in the molecular basis. We check the degree of accuracy and the limitations of the method for a model case where transition probabilities can be calculated exactly. An application to the benchmark case of He⁺ + H⁺ collisions is also presented, and yields complementary information on the properties of this approach

  9. Probability theory and mathematical statistics for engineers

    CERN Document Server

    Pugachev, V S

    1984-01-01

    Probability Theory and Mathematical Statistics for Engineers focuses on the concepts of probability theory and mathematical statistics for finite-dimensional random variables.The publication first underscores the probabilities of events, random variables, and numerical characteristics of random variables. Discussions focus on canonical expansions of random vectors, second-order moments of random vectors, generalization of the density concept, entropy of a distribution, direct evaluation of probabilities, and conditional probabilities. The text then examines projections of random vector

  10. Introduction to probability theory with contemporary applications

    CERN Document Server

    Helms, Lester L

    2010-01-01

    This introduction to probability theory transforms a highly abstract subject into a series of coherent concepts. Its extensive discussions and clear examples, written in plain language, expose students to the rules and methods of probability. Suitable for an introductory probability course, this volume requires abstract and conceptual thinking skills and a background in calculus.Topics include classical probability, set theory, axioms, probability functions, random and independent random variables, expected values, and covariance and correlations. Additional subjects include stochastic process

  11. RHIZOME AND DISCOURSE OF INTERMEDIALITY

    Directory of Open Access Journals (Sweden)

    Л Н Синельникова

    2017-12-01

    Full Text Available Rhizomaticity is a strategy and a regularity of text creation in many modern communicative discourse practices. What remains urgent is the problem of the systematic interdisciplinary description of texts whose structure and language qualities are determined by the signs of the rhizome - a concept of post-modern philosophy introduced into the scientific field by the French philosopher Gilles Deleuze and the psychotherapist Félix Guattari (Deleuze, Guattari 1996). The rhizome (Fr. rhizome - rootstock, tuber, bulb, mycelium) possesses the following qualities: it is non-linear, open and directed towards the unpredictability of discourse transformations through the possibilities of structure development in any direction; there is no centre or periphery in the rhizome, and any discourse element can become ‘a vital structure’ for text-creation. The rhizome does not have non-intersecting boundaries; and in the space of the rhizomatic discourse environment, an increase of reality facets takes place, non-standard associative connections appear, and multiplication effects are formed, which create new meanings. Rhizomaticity is the quality of texts organised by the laws of rhizomatic logic (V.F. Sharkov 2007), by the terms of which a ‘superposition’ of discourses can take place, a transition from one semiotic system to another. The article makes an attempt to correlate the qualities of the rhizome with the signs of the intermedia discourse, which is built on the semiotic interaction of different media. The moving lines of the rhizome, its ‘branching’ qualities, can be found in poetic texts, in the evaluating segments of political discourse, in advertising discourse, and in internet communications, which represent rhizomorphic environments. An analysis of examples from these spheres has shown that the rhizomatic approach opens new facets of intermediality. The author uses the methods of discourse analysis to prove that the openness and non

  12. Intermediate Syndrome Following Organophosphate Insecticide Poisoning

    Directory of Open Access Journals (Sweden)

    Chen-Chang Yang

    2007-11-01

    Full Text Available Acute organophosphate insecticide poisoning can manifest 3 different phases of toxic effects, namely, acute cholinergic crisis, intermediate syndrome (IMS), and delayed neuropathy. Among them, IMS has been considered a major contributing factor to organophosphate-related morbidity and mortality because of its frequent occurrence and its probable consequence, respiratory failure. Despite a high incidence, the pathophysiology that underlies IMS remains unclear. Previously proposed mechanisms of IMS include differing susceptibility of various cholinergic receptors, muscle necrosis, prolonged acetylcholinesterase inhibition, inadequate oxime therapy, downregulation or desensitization of postsynaptic acetylcholine receptors, failure of postsynaptic acetylcholine release, and oxidative stress-related myopathy. The clinical manifestations of IMS typically occur within 24 to 96 hours, affecting conscious patients without cholinergic signs, and involve the muscles of respiration, proximal limb muscles, neck flexors, and muscles innervated by motor cranial nerves. With appropriate therapy, which commonly includes artificial respiration, complete recovery develops 5–18 days later. Patients with atypical manifestations of IMS, especially a relapse or a continuum of acute cholinergic crisis, however, were frequently reported in clinical studies of IMS. The treatment of IMS is mainly supportive. Nevertheless, because IMS generally concurs with severe organophosphate toxicity and persistent inhibition of acetylcholinesterase, early aggressive decontamination, appropriate antidotal therapy, and prompt institution of ventilatory support should be helpful in ameliorating the magnitude and/or the incidence of IMS. Although IMS is well recognized as a disorder of neuromuscular junctions, its exact etiology, incidence, and risk factors are not clearly defined because existing studies are largely small-scale case series and do not employ a consistent and rigorous

  13. FINANCIAL INTERMEDIATION, ENTREPRENEURSHIP AND ECONOMIC GROWTH

    OpenAIRE

    Wenli Cheng

    2007-01-01

    This paper presents a simple general equilibrium model of financial intermediation, entrepreneurship and economic growth. In this model, the role of financial intermediation is to pool savings and to lend the pooled funds to an entrepreneur, who in turn invests the funds in a new production technology. The adoption of the new production technology improves individual real income. Thus financial intermediation promotes economic growth through affecting individuals’ saving behaviour and enabl...

  14. Biocatalytic Synthesis of Chiral Pharmaceutical Intermediates

    Directory of Open Access Journals (Sweden)

    Ramesh N. Patel

    2004-01-01

    Full Text Available The production of single enantiomers of drug intermediates has become increasingly important in the pharmaceutical industry. Chiral intermediates and fine chemicals are in high demand from both the pharmaceutical and agrochemical industries for the preparation of bulk drug substances and agricultural products. The enormous potential of microorganisms and enzymes for the transformation of synthetic chemicals with high chemo-, regio- and enantioselectivities has been demonstrated. In this article, biocatalytic processes are described for the synthesis of chiral pharmaceutical intermediates.

  15. Regularities of intermediate adsorption complex relaxation

    International Nuclear Information System (INIS)

    Manukova, L.A.

    1982-01-01

    Experimental data characterizing the regularities of intermediate adsorption complex relaxation in the polycrystalline Mo-N₂ system at 77 K are given. The molecular-beam method has been used in the investigation. Analytical expressions are obtained for the regularities of change, during the relaxation process, of the full and specific rates of transition from the intermediate state into the ''non-reversible'' state, of desorption into the gas phase, and of accumulation of particles in the intermediate state

  16. K-forbidden transition probabilities

    International Nuclear Information System (INIS)

    Saitoh, T.R.; Sletten, G.; Bark, R.A.; Hagemann, G.B.; Herskind, B.; Saitoh-Hashimoto, N.; Tsukuba Univ., Ibaraki

    2000-01-01

    Reduced hindrance factors of K-forbidden transitions are compiled for nuclei with A ≈ 180 where γ-vibrational states are observed. Correlations between these reduced hindrance factors and Coriolis forces, statistical level mixing and γ-softness have been studied. It is demonstrated that the K-forbidden transition probabilities are related to γ-softness. The decay of the high-K bandheads has been studied by means of two-state mixing, which would be induced by the γ-softness, with the use of a number of K-forbidden transitions compiled in the present work, where high-K bandheads are depopulated by both E2 and ΔI=1 transitions. The validity of the two-state mixing scheme has been examined by using the proposed identity of the B(M1)/B(E2) ratios of transitions depopulating high-K bandheads and levels of low-K bands. A breakdown of the identity might indicate that other levels mediate transitions between high- and low-K states. (orig.)

  17. Direct probability mapping of contaminants

    International Nuclear Information System (INIS)

    Rautman, C.A.

    1993-01-01

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. Geostatistical simulation provides powerful tools for investigating contaminant levels, and in particular, for identifying and using the spatial interrelationships among a set of isolated sample values. This additional information can be used to assess the likelihood of encountering contamination at unsampled locations and to evaluate the risk associated with decisions to remediate or not to remediate specific regions within a site. Past operation of the DOE Feed Materials Production Center has contaminated a site near Fernald, Ohio, with natural uranium. Soil geochemical data have been collected as part of the Uranium-in-Soils Integrated Demonstration Project. These data have been used to construct a number of stochastic images of potential contamination for parcels approximately the size of a selective remediation unit. Each such image accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely, statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination. Evaluation of the geostatistical simulations can yield maps representing the expected magnitude of the contamination for various regions and other information that may be important in determining a suitable remediation process or in sizing equipment to accomplish the restoration
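
    The post-processing step reduces, in essence, to one array operation over the stack of equally likely images. Below is a minimal sketch with a fabricated ensemble; the lognormal field, action level, and decision rule are placeholders, not the Fernald data.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    # stand-in for 500 geostatistical realizations of a 40 x 40 parcel
    realizations = rng.lognormal(mean=3.0, sigma=0.8, size=(500, 40, 40))

    action_level = 35.0                                       # assumed threshold
    prob_exceed = (realizations > action_level).mean(axis=0)  # probability map
    expected = realizations.mean(axis=0)                      # expected magnitude map
    remediate = prob_exceed > 0.5                             # one possible decision rule
    print(prob_exceed.max(), remediate.sum(), "cells flagged")
    ```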

  18. Psychophysics of the probability weighting function

    Science.gov (United States)

    Takahashi, Taiki

    2011-03-01

    A probability weighting function w(p) for an objective probability p in decision under risk plays a pivotal role in Kahneman-Tversky prospect theory. Although recent studies in econophysics and neuroeconomics widely utilized probability weighting functions, psychophysical foundations of the probability weighting functions have been unknown. Notably, a behavioral economist Prelec (1998) [4] axiomatically derived the probability weighting function w(p) = exp(-(-ln p)^α) (0 < α < 1; w(1/e) = 1/e, w(1) = 1), which has extensively been studied in behavioral neuroeconomics. The present study utilizes psychophysical theory to derive Prelec's probability weighting function from psychophysical laws of perceived waiting time in probabilistic choices. Also, the relations between the parameters in the probability weighting function and the probability discounting function in behavioral psychology are derived. Future directions in the application of the psychophysical theory of the probability weighting function in econophysics and neuroeconomics are discussed.
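
    With the formula restored, the fixed point that the axiomatization pins down can be checked numerically; the α value below is arbitrary.

    ```python
    import numpy as np

    def prelec_w(p, alpha=0.65):
        """Prelec's (1998) weighting function w(p) = exp(-(-ln p)^alpha)."""
        p = np.asarray(p, dtype=float)
        return np.exp(-(-np.log(p)) ** alpha)

    p = np.array([0.01, 1 / np.e, 0.5, 0.99])
    print(prelec_w(p))   # small p overweighted; w(1/e) = 1/e for every alpha
    ```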

  19. Experiments in intermediate energy physics

    International Nuclear Information System (INIS)

    Dehnhard, D.

    2003-01-01

    Research in experimental nuclear physics was done from 1979 to 2002 primarily at intermediate energy facilities that provide pion, proton, and kaon beams. Particularly successful has been the work at the Los Alamos Meson Physics Facility (LAMPF) on unraveling the neutron and proton contributions to nuclear ground state and transition densities. This work was done on a wide variety of nuclei and with great detail on the carbon, oxygen, and helium isotopes. Some of the investigations involved the use of polarized targets, which allowed the extraction of information on the spin-dependent part of the Δ-nucleon interaction. At the Indiana University Cyclotron Facility (IUCF) we studied proton-induced charge exchange reactions with results of importance to astrophysics and the nuclear few-body problem. During the first few years, the analysis of heavy-ion nucleus scattering data that had been taken prior to 1979 was completed. During the last few years we created hypernuclei by use of a kaon beam at Brookhaven National Laboratory (BNL) and an electron beam at Jefferson Laboratory (JLab). The data taken at BNL for a study of the non-mesonic weak decay of the Λ particle in a nucleus are still under analysis by our collaborators. The work at JLab resulted in the best resolution hypernuclear spectra measured thus far with magnetic spectrometers

  20. Experiments in intermediate energy physics

    Energy Technology Data Exchange (ETDEWEB)

    Dehnhard, D.

    2003-02-28

    Research in experimental nuclear physics was done from 1979 to 2002 primarily at intermediate energy facilities that provide pion, proton, and kaon beams. Particularly successful has been the work at the Los Alamos Meson Physics Facility (LAMPF) on unraveling the neutron and proton contributions to nuclear ground state and transition densities. This work was done on a wide variety of nuclei and with great detail on the carbon, oxygen, and helium isotopes. Some of the investigations involved the use of polarized targets, which allowed the extraction of information on the spin-dependent part of the Δ-nucleon interaction. At the Indiana University Cyclotron Facility (IUCF) we studied proton-induced charge exchange reactions with results of importance to astrophysics and the nuclear few-body problem. During the first few years, the analysis of heavy-ion nucleus scattering data that had been taken prior to 1979 was completed. During the last few years we created hypernuclei by use of a kaon beam at Brookhaven National Laboratory (BNL) and an electron beam at Jefferson Laboratory (JLab). The data taken at BNL for a study of the non-mesonic weak decay of the Λ particle in a nucleus are still under analysis by our collaborators. The work at JLab resulted in the best resolution hypernuclear spectra measured thus far with magnetic spectrometers.

  1. THE BLACK HOLE FORMATION PROBABILITY

    Energy Technology Data Exchange (ETDEWEB)

    Clausen, Drew; Piro, Anthony L.; Ott, Christian D., E-mail: dclausen@tapir.caltech.edu [TAPIR, Walter Burke Institute for Theoretical Physics, California Institute of Technology, Mailcode 350-17, Pasadena, CA 91125 (United States)

    2015-02-01

    A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P_BH(M_ZAMS). Although we find that it is difficult to derive a unique P_BH(M_ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P_BH(M_ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P_BH(M_ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment.

  2. THE BLACK HOLE FORMATION PROBABILITY

    International Nuclear Information System (INIS)

    Clausen, Drew; Piro, Anthony L.; Ott, Christian D.

    2015-01-01

    A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P_BH(M_ZAMS). Although we find that it is difficult to derive a unique P_BH(M_ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P_BH(M_ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P_BH(M_ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment.

  3. The Black Hole Formation Probability

    Science.gov (United States)

    Clausen, Drew; Piro, Anthony L.; Ott, Christian D.

    2015-02-01

    A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P_BH(M_ZAMS). Although we find that it is difficult to derive a unique P_BH(M_ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P_BH(M_ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P_BH(M_ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment.

  4. Nuclear structure at intermediate energies

    International Nuclear Information System (INIS)

    Bonner, B.E.; Mutchler, G.S.

    1991-01-01

    The theme that unites the sometimes seemingly disparate experiments undertaken by the Bonner Lab Medium Energy Group is a determination to understand in detail the many facets and manifestations of the strong interaction, that which is now referred to as nonperturbative QCD. Whether we are investigating the question of just what does carry the spin of baryons, or the extent of the validity of the SU(6) wavefunctions for the excited hyperons (as will be measured in their radiative decays in our CEBAF experiment), or questions associated with the formation of a new state of matter predicted by QCD (the subject of our BNL experiments E810, E854, as well as our approved experiment at RHIC) -- all these projects share this common goal. Our other experiments represent different approaches to the same broad undertaking. LAMPF E1097 will provide definitive answers to the question of the spin dependence of the inelastic channel of pion production in the n-p interaction. FNAL E683 may well open a new field of investigation in nuclear physics: that of just how quarks and gluons interact with nuclear matter as they traverse nuclei of different sizes. In most all of the experiments mentioned above, the Bonner Lab Group is playing major leadership roles as well as doing a large fraction of the hard work that such experiments require. We use many of the facilities that are available to the intermediate energy physics community, and we use our expertise to design and fabricate the detectors and instrumentation required to perform the measurements we decide to do

  5. Foundations of the theory of probability

    CERN Document Server

    Kolmogorov, AN

    2018-01-01

    This famous little book remains a foundational text for the understanding of probability theory, important both to students beginning a serious study of probability and to historians of modern mathematics. 1956 second edition.

  6. The Probability Distribution for a Biased Spinner

    Science.gov (United States)

    Foster, Colin

    2012-01-01

    This article advocates biased spinners as an engaging context for statistics students. Calculating the probability of a biased spinner landing on a particular side makes valuable connections between probability and other areas of mathematics. (Contains 2 figures and 1 table.)

  7. Conditional Probability Modulates Visual Search Efficiency

    Directory of Open Access Journals (Sweden)

    Bryan eCort

    2013-10-01

    Full Text Available We investigated the effects of probability on visual search. Previous work has shown that people can utilize spatial and sequential probability information to improve target detection. We hypothesized that performance improvements from probability information would extend to the efficiency of visual search. Our task was a simple visual search in which the target was always present among a field of distractors, and could take one of two colors. The absolute probability of the target being either color was 0.5; however, the conditional probability – the likelihood of a particular color given a particular combination of two cues – varied from 0.1 to 0.9. We found that participants searched more efficiently for high conditional probability targets and less efficiently for low conditional probability targets, but only when they were explicitly informed of the probability relationship between cues and target color.

  8. Analytic Neutrino Oscillation Probabilities in Matter: Revisited

    Energy Technology Data Exchange (ETDEWEB)

    Parke, Stephen J. [Fermilab; Denton, Peter B. [Copenhagen U.; Minakata, Hisakazu [Madrid, IFT

    2018-01-02

    We summarize our recent paper on neutrino oscillation probabilities in matter, explaining the importance, relevance and need for simple, highly accurate approximations to the neutrino oscillation probabilities in matter.

  9. Using Peephole Optimization on Intermediate Code

    NARCIS (Netherlands)

    Tanenbaum, A.S.; van Staveren, H.; Stevenson, J.W.

    1982-01-01

    Many portable compilers generate an intermediate code that is subsequently translated into the target machine's assembly language. In this paper a stack-machine-based intermediate code suitable for algebraic languages (e.g., PASCAL, C, FORTRAN) and most byte-addressed mini- and microcomputers is
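
    To make the idea concrete, here is a toy peephole pass over an invented stack-machine code; the instruction names and rewrite rules are illustrative and are not the EM intermediate code the paper describes.

    ```python
    # Rewrite rules: (pattern, replacement) over adjacent instructions.
    RULES = [
        (("PUSH 0", "ADD"), ()),               # x + 0  ->  x
        (("PUSH 1", "MUL"), ()),               # x * 1  ->  x
        (("NEG", "NEG"), ()),                  # double negation cancels
        (("PUSH 2", "MUL"), ("DUP", "ADD")),   # strength reduction: 2*x -> x+x
    ]

    def peephole(code):
        changed = True
        while changed:                         # iterate to a fixed point
            changed = False
            for pattern, replacement in RULES:
                k = len(pattern)
                for i in range(len(code) - k + 1):
                    if tuple(code[i:i + k]) == pattern:
                        code[i:i + k] = list(replacement)
                        changed = True
                        break
        return code

    print(peephole(["PUSH a", "PUSH 2", "MUL", "PUSH 0", "ADD"]))
    # -> ['PUSH a', 'DUP', 'ADD']
    ```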

  10. Gasoline Engine Mechanics. Performance Objectives. Intermediate Course.

    Science.gov (United States)

    Jones, Marion

    Several intermediate performance objectives and corresponding criterion measures are listed for each of six terminal objectives presented in this curriculum guide for an intermediate gasoline engine mechanics course at the secondary level. (For the beginning course guide see CE 010 947.) The materials were developed for a two-semester (2 hour…

  11. Some Intermediate-Level Violin Concertos.

    Science.gov (United States)

    Abramson, Michael

    1997-01-01

    Contends that many violin students attempt difficult concertos before they are technically or musically prepared. Identifies a variety of concertos at the intermediate and advanced intermediate-level for students to study and master before attempting the advanced works by Bach and Mozart. Includes concertos by Vivaldi, Leclair, Viotti, Haydn,…

  12. 39 CFR 3001.39 - Intermediate decisions.

    Science.gov (United States)

    2010-07-01

    39 CFR 3001.39 (Postal Regulatory Commission, Rules of Practice and Procedure, Rules of General Applicability): Intermediate decisions. (a) Initial decision by presiding officer. In any proceedings in

  13. Pair production of intermediate vector bosons

    International Nuclear Information System (INIS)

    Mikaelian, K.O.

    1979-01-01

    The production of intermediate vector boson pairs W⁺W⁻, Z⁰Z⁰, W±Z⁰ and W±γ in pp and p anti-p collisions is discussed. The motivation is to detect the self-interactions among the four intermediate vector bosons

  14. Void probability scaling in hadron nucleus interactions

    International Nuclear Information System (INIS)

    Ghosh, Dipak; Deb, Argha; Bhattacharyya, Swarnapratim; Ghosh, Jayita; Bandyopadhyay, Prabhat; Das, Rupa; Mukherjee, Sima

    2002-01-01

    Hegyi, while investigating the rapidity gap probability (which measures the chance of finding no particle in the pseudo-rapidity interval Δη), found that a scaling behavior in the rapidity gap probability has a close correspondence with the scaling of the void probability in galaxy correlation studies. The main aim of this paper is to study the scaling behavior of the rapidity gap probability

  15. Bayesian probability analysis: a prospective demonstration of its clinical utility in diagnosing coronary disease

    International Nuclear Information System (INIS)

    Detrano, R.; Yiannikas, J.; Salcedo, E.E.; Rincon, G.; Go, R.T.; Williams, G.; Leatherman, J.

    1984-01-01

    One hundred fifty-four patients referred for coronary arteriography were prospectively studied with stress electrocardiography, stress thallium scintigraphy, cine fluoroscopy (for coronary calcifications), and coronary angiography. Pretest probabilities of coronary disease were determined based on age, sex, and type of chest pain. These and pooled literature values for the conditional probabilities of test results based on disease state were used in Bayes theorem to calculate posttest probabilities of disease. The results of the three noninvasive tests were compared for statistical independence, a necessary condition for their simultaneous use in Bayes theorem. The test results were found to demonstrate pairwise independence in patients with and those without disease. Some dependencies that were observed between the test results and the clinical variables of age and sex were not sufficient to invalidate application of the theorem. Sixty-eight of the study patients had at least one major coronary artery obstruction of greater than 50%. When these patients were divided into low-, intermediate-, and high-probability subgroups according to their pretest probabilities, noninvasive test results analyzed by Bayesian probability analysis appropriately advanced 17 of them by at least one probability subgroup while only seven were moved backward. Of the 76 patients without disease, 34 were appropriately moved into a lower probability subgroup while 10 were incorrectly moved up. We conclude that posttest probabilities calculated from Bayes theorem more accurately classified patients with and without disease than did pretest probabilities, thus demonstrating the utility of the theorem in this application
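
    The core calculation behind the record is compact. The sketch below applies Bayes' theorem to turn a pretest probability into a posttest probability for one dichotomous test result; the sensitivity and specificity numbers are placeholders, not the pooled literature values the study used.

    ```python
    def posttest_probability(pretest, sensitivity, specificity, positive=True):
        """Bayes' theorem for one dichotomous test result."""
        if positive:
            p_r_d, p_r_nd = sensitivity, 1.0 - specificity
        else:
            p_r_d, p_r_nd = 1.0 - sensitivity, specificity
        joint = p_r_d * pretest
        return joint / (joint + p_r_nd * (1.0 - pretest))

    # intermediate pretest probability, positive stress ECG (placeholder numbers)
    p = posttest_probability(0.50, sensitivity=0.68, specificity=0.77)
    print(round(p, 2))                       # ~0.75

    # sequential use of a second, independent test: posttest becomes pretest
    p = posttest_probability(p, sensitivity=0.85, specificity=0.90)
    print(round(p, 2))                       # ~0.96
    ```

    The sequential step is exactly why the study had to verify pairwise independence of the test results: without it, feeding one posttest probability in as the next pretest probability overstates the evidence.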

  16. Influence of an Intermediate Option on the Description-Experience Gap and Information Search.

    Science.gov (United States)

    Sharma, Neha; Debnath, Shoubhik; Dutt, Varun

    2018-01-01

    Research shows that people tend to overweight small probabilities in description and underweight them in experience, thereby leading to a different pattern of choices between description and experience; a phenomenon known as the Description-Experience (DE) gap. However, little is known on how the addition of an intermediate option and contextual framing influences the DE gap and people's search strategies. This paper tests the effects of an intermediate option and contextual framing on the DE gap and people's search strategies, where problems require search for information before a consequential choice. In the first experiment, 120 participants made choice decisions across investment problems that differed in the absence or presence of an intermediate option. Results showed that adding an intermediate option did not reduce the DE gap on the maximizing option across a majority of problems. There were a large majority of choices for the intermediate option. Furthermore, there was an increase in switching between options due to the presence of the intermediate option. In the second experiment, 160 participants made choice decisions in problems like those presented in experiment 1; however, problems lacked the investment framing. Results replicated findings from the first experiment and showed a similar DE gap on the maximizing option in a majority of problems in both the absence and presence of the intermediate option. Again, there were a large majority of choices for the intermediate option. Also, there was an increase in switching between options due to the presence of the intermediate option. Meta-analyses revealed that the absence or presence of the intermediate option created certain differences in the strength of frequency and recency processes. Also, a single natural-mean heuristic model was able to account for the experimental results across both experiments. We discuss implications of our findings to consequential decisions made after information search.

  17. Influence of an Intermediate Option on the Description-Experience Gap and Information Search

    Directory of Open Access Journals (Sweden)

    Neha Sharma

    2018-03-01

    Full Text Available Research shows that people tend to overweight small probabilities in description and underweight them in experience, thereby leading to a different pattern of choices between description and experience; a phenomenon known as the Description-Experience (DE gap. However, little is known on how the addition of an intermediate option and contextual framing influences the DE gap and people’s search strategies. This paper tests the effects of an intermediate option and contextual framing on the DE gap and people’s search strategies, where problems require search for information before a consequential choice. In the first experiment, 120 participants made choice decisions across investment problems that differed in the absence or presence of an intermediate option. Results showed that adding an intermediate option did not reduce the DE gap on the maximizing option across a majority of problems. There were a large majority of choices for the intermediate option. Furthermore, there was an increase in switching between options due to the presence of the intermediate option. In the second experiment, 160 participants made choice decisions in problems like those presented in experiment 1; however, problems lacked the investment framing. Results replicated findings from the first experiment and showed a similar DE gap on the maximizing option in a majority of problems in both the absence and presence of the intermediate option. Again, there were a large majority of choices for the intermediate option. Also, there was an increase in switching between options due to the presence of the intermediate option. Meta-analyses revealed that the absence or presence of the intermediate option created certain differences in the strength of frequency and recency processes. Also, a single natural-mean heuristic model was able to account for the experimental results across both experiments. We discuss implications of our findings to consequential decisions made after

  18. Pre-Service Teachers' Conceptions of Probability

    Science.gov (United States)

    Odafe, Victor U.

    2011-01-01

    Probability knowledge and skills are needed in science and in making daily decisions that are sometimes made under uncertain conditions. Hence, there is the need to ensure that the pre-service teachers of our children are well prepared to teach probability. Pre-service teachers' conceptions of probability are identified, and ways of helping them…

  19. Using Playing Cards to Differentiate Probability Interpretations

    Science.gov (United States)

    López Puga, Jorge

    2014-01-01

    The aprioristic (classical, naïve and symmetric) and frequentist interpretations of probability are commonly known. Bayesian or subjective interpretation of probability is receiving increasing attention. This paper describes an activity to help students differentiate between the three types of probability interpretations.
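
    A few lines of simulation make the contrast concrete (my own example, not the activity from the paper): the aprioristic answer comes from symmetry, the frequentist one from long-run relative frequency.

    ```python
    import random

    classical = 4 / 52                       # aprioristic: 4 aces, 52 equally likely cards
    deck = [rank for rank in range(13) for _ in range(4)]    # 0 encodes "ace"
    draws = [random.choice(deck) == 0 for _ in range(100_000)]
    frequentist = sum(draws) / len(draws)    # long-run relative frequency
    print(round(classical, 4), round(frequentist, 4))        # both ~0.0769
    ```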

  20. Electron-atom scattering at intermediate energies

    International Nuclear Information System (INIS)

    Kingston, A.E.; Walters, H.R.J.

    1982-01-01

    The problems of intermediate energy scattering are approached from the low and high energy ends. At low intermediate energies difficulties associated with the use of pseudostates and correlation terms are discussed, special consideration being given to nonphysical pseudoresonances. Perturbation methods appropriate to high intermediate energies are described and attempts to extend these high energy approximations down to low intermediate energies are studied. It is shown how the importance of electron exchange effects develops with decreasing energy. The problem of assessing the 'effective completeness' of pseudostate sets at intermediate energies is mentioned and an instructive analysis of a 2p pseudostate approximation to elastic e⁻-H scattering is given. It is suggested that at low energies the Pauli Exclusion Principle can act to hide short range defects in pseudostate approximations. (author)

  1. The Effects of Using Podcast on Listening Comprehension among Iranian Pre-intermediate EFL Learners

    Directory of Open Access Journals (Sweden)

    Islam NamazianDost

    2017-09-01

    Full Text Available The purpose of the present study was to explore the effects of using podcasts on listening comprehension among Iranian pre-intermediate EFL learners. To fulfill the objectives of the study, a homogeneity test (Oxford Quick Placement Test) was administered to 90 students at the pre-intermediate level of the Poyesh Language Institute, and 60 participants were finally selected. They were then non-randomly divided into two subgroups, namely a control and an experimental group. Before starting the treatment, a validated teacher-made listening comprehension test was administered as a pre-test to assess the participants' listening comprehension at the beginning of the course. The experimental group then received the treatment, which was teaching listening comprehension through podcasts, while the control group was taught using traditional methods of teaching listening with no multimedia source. After 20 sessions of treatment, the two groups were administered the same teacher-made listening test as a post-test. Data were analyzed by Paired and Independent Samples t-tests. The findings showed that the experimental group significantly outperformed the control group. The results suggest that podcasts can be used in English classes to develop listening ability among Iranian EFL learners.

  2. The Effect of Presentation Strategy on Reading Comprehension of Iranian Intermediate EFL Learners

    Directory of Open Access Journals (Sweden)

    Hooshang Khoshsima

    2014-06-01

    Full Text Available The present experimental study primarily aimed at examining the effect of a presentation strategy on the reading comprehension of Iranian intermediate EFL learners. To determine the effect of this strategy, 61 students enrolled at the English Language Center of Chabahar Maritime University were initially selected and then divided randomly into two classes, one serving as the experimental group and the other as the control group. The homogeneity of their proficiency level was established via the administration of a TOEFL (PBT) proficiency test. A reading comprehension pre-test was administered to the subjects of both groups one week before the initiation of the study. The experimental group used the strategy three sessions each week for ten weeks, while the control group was trained using ordinary approaches to teaching reading comprehension. The presentation strategy was practiced for every two weeks during the experiment, and at the end of each two-week instruction an immediate post-test was administered according to the strategy worked on. At the end of the study, a post-test was administered to both groups. The statistical techniques applied were the Paired Sample t-test and the Independent Sample t-test. The results of the study revealed that the presentation strategy had a significant effect on promoting the reading comprehension of intermediate EFL learners.

  3. Effect of Intermediate Hosts on Emerging Zoonoses.

    Science.gov (United States)

    Cui, Jing-An; Chen, Fangyuan; Fan, Shengjie

    2017-08-01

    Most emerging zoonotic pathogens originate from animals. They can directly infect humans through natural reservoirs or indirectly through intermediate hosts. As a bridge, an intermediate host plays different roles in the transmission of zoonotic pathogens. In this study, we present three types of pathogen transmission to evaluate the effect of intermediate hosts on emerging zoonotic diseases in human epidemics. These types are identified as follows: TYPE 1, pathogen transmission without an intermediate host for comparison; TYPE 2, pathogen transmission with an intermediate host as an amplifier; and TYPE 3, pathogen transmission with an intermediate host as a vessel for genetic variation. In addition, we established three mathematical models to elucidate the mechanisms underlying zoonotic disease transmission according to these three types. Stability analysis indicated that the existence of intermediate hosts increased the difficulty of controlling zoonotic diseases because of more difficult conditions to satisfy for the disease to die out. The human epidemic would die out under the following conditions: TYPE 1: [Formula: see text] and [Formula: see text]; TYPE 2: [Formula: see text], [Formula: see text], and [Formula: see text]; and TYPE 3: [Formula: see text], [Formula: see text], [Formula: see text], and [Formula: see text] Simulation with similar parameters demonstrated that intermediate hosts could change the peak time and number of infected humans during a human epidemic; intermediate hosts also exerted different effects on controlling the prevalence of a human epidemic with natural reservoirs in different periods, which is important in addressing problems in public health. Monitoring and controlling the number of natural reservoirs and intermediate hosts at the right time would successfully manage and prevent the prevalence of emerging zoonoses in humans.
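
    A compartmental caricature of the TYPE 2 route (reservoir to amplifying intermediate host to humans) reproduces the peak-shifting effect described above. All parameters are invented for illustration; the paper's actual models and threshold conditions (the '[Formula: see text]' expressions) are not reproduced here.

    ```python
    import numpy as np
    from scipy.integrate import odeint

    def deriv(y, t, beta_ri, beta_ih, delta, gamma):
        i_i, s_h, i_h = y                    # intermediate-host prevalence, humans S/I
        i_r = 0.05                           # reservoir prevalence held constant
        di_i = beta_ri * i_r * (1 - i_i) - delta * i_i
        ds_h = -beta_ih * i_i * s_h
        di_h = beta_ih * i_i * s_h - gamma * i_h
        return [di_i, ds_h, di_h]

    t = np.linspace(0.0, 300.0, 3000)
    sol = odeint(deriv, [0.0, 0.999, 0.001], t, args=(0.3, 0.4, 0.1, 0.2))
    print("human epidemic peaks at t =", t[sol[:, 2].argmax()])
    ```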

  4. Dependent Human Error Probability Assessment

    International Nuclear Information System (INIS)

    Simic, Z.; Mikulicic, V.; Vukovic, I.

    2006-01-01

    This paper presents an assessment of the dependence between dynamic operator actions modeled in a Nuclear Power Plant (NPP) PRA and estimates the associated impact on core damage frequency (CDF). This assessment was done to improve the implementation of HEP dependencies in the existing PRA. All of the dynamic operator actions modeled in the NPP PRA are included in this assessment. Determining the level of HEP dependence and the associated influence on CDF are the major steps of this assessment. A decision on how to apply the results, i.e., whether permanent HEP model changes should be made, is based on the resulting relative CDF increase. A CDF increase threshold was selected based on the NPP base CDF value and acceptance guidelines from Regulatory Guide 1.174. HEP dependences resulting in a CDF increase of > 5E-07 would be considered potential candidates for specific incorporation into the baseline model. The approach used to judge the level of dependence between operator actions is based on the dependency level categories and conditional probabilities developed in the Handbook of Human Reliability Analysis with Emphasis on Nuclear Power Plant Applications, NUREG/CR-1278. To simplify the process, NUREG/CR-1278 identifies five levels of dependence: ZD (zero dependence), LD (low dependence), MD (moderate dependence), HD (high dependence), and CD (complete dependence). NUREG/CR-1278 also identifies several qualitative factors that could be involved in determining the level of dependence. Based on the NUREG/CR-1278 information, Time, Function, and Spatial attributes were judged to be the most important considerations when determining the level of dependence between operator actions within an accident sequence. These attributes were used to develop qualitative criteria (rules) that were used to judge the level of dependence (CD, HD, MD, LD, ZD) between the operator actions. After the level of dependence between the various HEPs is judged, quantitative values associated with the
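
    Once a level is judged, the NUREG/CR-1278 dependence equations make the conditional HEPs mechanical to compute. A small sketch follows; the equations are THERP's, while the nominal HEP is a made-up example.

    ```python
    def conditional_hep(p, level):
        """THERP conditional probability of failing task N given failure of N-1."""
        return {
            "ZD": p,                    # zero dependence
            "LD": (1 + 19 * p) / 20,    # low
            "MD": (1 + 6 * p) / 7,      # moderate
            "HD": (1 + p) / 2,          # high
            "CD": 1.0,                  # complete dependence
        }[level]

    p = 1e-3                            # nominal HEP of the second action (example)
    for lv in ("ZD", "LD", "MD", "HD", "CD"):
        print(lv, round(conditional_hep(p, lv), 4))
    # ZD 0.001, LD ~0.051, MD ~0.1437, HD ~0.5005, CD 1.0
    ```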

  5. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2014-01-01

    The long-awaited revision of Fundamentals of Applied Probability and Random Processes expands on the central components that made the first edition a classic. The title is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability to real-world problems, and introduce the basics of statistics.

  6. Probability of Failure in Random Vibration

    DEFF Research Database (Denmark)

    Nielsen, Søren R.K.; Sørensen, John Dalsgaard

    1988-01-01

    Close approximations to the first-passage probability of failure in random vibration can be obtained by integral equation methods. A simple relation exists between the first-passage probability density function and the distribution function for the time interval spent below a barrier before out-crossing. An integral equation for the probability density function of the time interval is formulated, and adequate approximations for the kernel are suggested. The kernel approximation results in approximate solutions for the probability density function of the time interval and thus for the first-passage probability...

  7. An Objective Theory of Probability (Routledge Revivals)

    CERN Document Server

    Gillies, Donald

    2012-01-01

    This reissue of D. A. Gillies' highly influential work, first published in 1973, is a philosophical theory of probability which seeks to develop von Mises' views on the subject. In agreement with von Mises, the author regards probability theory as a mathematical science like mechanics or electrodynamics, and probability as an objective, measurable concept like force, mass or charge. On the other hand, Dr Gillies rejects von Mises' definition of probability in terms of limiting frequency and claims that probability should be taken as a primitive or undefined term in accordance with modern axioma

  8. Paraconsistent Probabilities: Consistency, Contradictions and Bayes’ Theorem

    Directory of Open Access Journals (Sweden)

    Juliana Bueno-Soler

    2016-09-01

    Full Text Available This paper represents the first steps towards constructing a paraconsistent theory of probability based on the Logics of Formal Inconsistency (LFIs). We show that LFIs encode very naturally an extension of the notion of probability able to express sophisticated probabilistic reasoning under contradictions employing appropriate notions of conditional probability and paraconsistent updating, via a version of Bayes’ theorem for conditionalization. We argue that the dissimilarity between the notions of inconsistency and contradiction, one of the pillars of LFIs, plays a central role in our extended notion of probability. Some critical historical and conceptual points about probability theory are also reviewed.

  9. Intermediate filaments and gene regulation.

    Science.gov (United States)

    Traub, P

    1995-01-01

    The biological role of intermediate filaments (IFs) of eukaryotic cells is still a matter of conjecture. On the basis of immunofluorescence and electron microscopic observations, they appear to play a cytoskeletal role in that they stabilize cellular structure and organize the distribution and interactions of intracellular organelles and components. The expression of a large number of cell type-specific and developmentally regulated subunit proteins is believed to provide multicellular organisms with different IF systems capable of differential interactions with the various substructures and components of their multiple, differentiated cells. However, the destruction of distinct IF systems by manipulation of cultured cells or by knock-out mutation of IF subunit proteins in transgenic mice exerts relatively little influence on cellular morphology and physiology and on development of mutant animals. In order to rationalize this dilemma, the cytoskeletal concept of IF function has been extended to purport that cytoplasmic (c) IFs and their subunit proteins also play fundamental roles in gene regulation. It is based on the in vitro capacity of cIF(protein)s to interact with guanine-rich, single-stranded DNA, supercoiled DNA and histones, as well as on their close structural relatedness to gene-regulatory DNA-binding and nuclear matrix proteins. Since cIF proteins do not possess classical nuclear localization signals, it is proposed that cIFs directly penetrate the double nuclear membrane, exploiting the amphiphilic, membrane-active character of their subunit proteins. Since they can establish metastable multisite contacts with nuclear matrix structures and/or chromatin areas containing highly repetitive DNA sequence elements at the nuclear periphery, they are supposed to participate in chromosome distribution and chromatin organization in interphase nuclei of differentiated cells. Owing to their different DNA-binding specificities, the various cIF systems may in this

  10. Language in use intermediate : classroom book

    CERN Document Server

    Doff, Adrian

    1995-01-01

    Each of the four levels comprises about 80 hours of class work, with additional time for the self-study work. The Teacher's Book contains all the pages from the Classroom Book, with interleaved teaching notes including optional activities to cater for different abilities. There is a video to accompany the Beginner, Pre-intermediate and Intermediate levels. Each video contains eight stimulating and entertaining short programmes, as well as a booklet of photocopiable activities. Free test material is available in booklet and web format for Beginner and Pre-intermediate levels. Visit www.cambridge.org/elt/liu or contact your local Cambridge University Press representative.

  11. Language in use intermediate : teacher's book

    CERN Document Server

    Doff, Adrian

    1998-01-01

    Each of the four levels comprises about 80 hours of class work, with additional time for the self-study work. The Teacher's Book contains all the pages from the Classroom Book, with interleaved teaching notes including optional activities to cater for different abilities. There is a video to accompany the Beginner, Pre-intermediate and Intermediate levels. Each video contains eight stimulating and entertaining short programmes, as well as a booklet of photocopiable activities. Free test material is available in booklet and web format for Beginner and Pre-intermediate levels. Visit www.cambridge.org/elt/liu or contact your local Cambridge University Press representative.

  12. Statistical and dynamical aspects of intermediate energy nuclear collisions

    International Nuclear Information System (INIS)

    Ghetti, R.

    1997-01-01

    Studies of intermediate energy heavy ion reactions have revealed that the probability of emitting n-fragments is reducible to the probability of emitting a single fragment through the binomial distribution. The resulting one-fragment probability shows a dependence on the thermal energy that is characteristic of statistical decay. Similarly, the charge distributions associated with n-fragment emission are reducible to the one-fragment charge distribution, and thermal scaling is observed. The reducibility equation for the n-fragment charge distribution contains a quantity with a value that starts from zero, at low transverse energies, and saturates at high transverse energies. This evolution may signal a transition from a coexistence phase to a vapour phase. In the search for a signal of liquid-gas phase transition, the appearance of intermittency is reconsidered. Percolation calculations, as well as data analysis, indicate that an intermittent-like signal appears from classes of events that do not coincide with the critical one. 232 refs
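
    The reducibility statement has a compact arithmetic core: if a single fragment is emitted with probability p in each of m chances, the n-fragment probability is binomial. A toy check (m and p are invented, not fitted to data):

```python
# Binomial reducibility in miniature: P(n) = C(m, n) p^n (1 - p)^(m - n).
from math import comb

def p_n_fragments(n, m, p):
    return comb(m, n) * p**n * (1 - p) ** (m - n)

m, p = 10, 0.15  # assumed number of emission chances and one-fragment probability
dist = [p_n_fragments(n, m, p) for n in range(m + 1)]
print([f"{q:.3f}" for q in dist])
print("normalization check:", f"{sum(dist):.6f}")
```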

  13. Statistical and dynamical aspects of intermediate energy nuclear collisions

    Energy Technology Data Exchange (ETDEWEB)

    Ghetti, R.

    1997-01-01

    Studies of intermediate energy heavy ion reactions have revealed that the probability of emitting n-fragments is reducible to the probability of emitting a single fragment through the binomial distribution. The resulting one-fragment probability shows a dependence on the thermal energy that is characteristic of statistical decay. Similarly, the charge distributions associated with n-fragment emission are reducible to the one-fragment charge distribution, and thermal scaling is observed. The reducibility equation for the n-fragment charge distribution contains a quantity with a value that starts from zero, at low transverse energies, and saturates at high transverse energies. This evolution may signal a transition from a coexistence phase to a vapour phase. In the search for a signal of liquid-gas phase transition, the appearance of intermittency is reconsidered. Percolation calculations, as well as data analysis, indicate that an intermittent-like signal appears from classes of events that do not coincide with the critical one. 232 refs.

  14. Probability concepts in quality risk management.

    Science.gov (United States)

    Claycamp, H Gregg

    2012-01-01

    Essentially any concept of risk is built on fundamental concepts of chance, likelihood, or probability. Although "risk" generally describes a probability of loss of something of value, given that a risk-generating event will occur or has occurred, it is ironic that the quality risk management literature and guidelines on quality risk management methodologies and respective tools focus on managing severity but are relatively silent on the in-depth meaning and uses of "probability." The probability concept is typically applied by risk managers as a combination of data-based measures of probability and a subjective "degree of belief" meaning of probability. Probability as a concept that is crucial for understanding and managing risk is discussed through examples, from the most general scenario-defining and ranking tools that use probability implicitly to more specific probabilistic tools in risk management. Pharmaceutical manufacturers are expanding their use of quality risk management to identify and manage risks to the patient that might occur in phases of the pharmaceutical life cycle from drug development to manufacture, and from marketing to product discontinuation. A rich history of probability in risk management applied to other fields suggests that high-quality risk management decisions benefit from the implementation of more thoughtful probability concepts in both risk modeling and risk management.

  15. Transition probability spaces in loop quantum gravity

    Science.gov (United States)

    Guo, Xiao-Kan

    2018-03-01

    We study the (generalized) transition probability spaces, in the sense of Mielnik and Cantoni, for spacetime quantum states in loop quantum gravity. First, we show that loop quantum gravity admits the structures of transition probability spaces. This is exemplified by first checking such structures in covariant quantum mechanics and then identifying the transition probability spaces in spin foam models via a simplified version of general boundary formulation. The transition probability space thus defined gives a simple way to reconstruct the discrete analog of the Hilbert space of the canonical theory and the relevant quantum logical structures. Second, we show that the transition probability space and in particular the spin foam model are 2-categories. Then we discuss how to realize in spin foam models two proposals by Crane about the mathematical structures of quantum gravity, namely, the quantum topos and causal sites. We conclude that transition probability spaces provide us with an alternative framework to understand various foundational questions of loop quantum gravity.

  16. Towards a Categorical Account of Conditional Probability

    Directory of Open Access Journals (Sweden)

    Robert Furber

    2015-11-01

    Full Text Available This paper presents a categorical account of conditional probability, covering both the classical and the quantum case. Classical conditional probabilities are expressed as a certain "triangle-fill-in" condition, connecting marginal and joint probabilities, in the Kleisli category of the distribution monad. The conditional probabilities are induced by a map together with a predicate (the condition). The latter is a predicate in the logic of effect modules on this Kleisli category. This same approach can be transferred to the category of C*-algebras (with positive unital maps), whose predicate logic is also expressed in terms of effect modules. Conditional probabilities can again be expressed via a triangle-fill-in property. In the literature, there are several proposals for what quantum conditional probability should be, and also there are extra difficulties not present in the classical case. At this stage, we only describe quantum systems with classical parametrization.

  17. UT Biomedical Informatics Lab (BMIL) probability wheel

    Science.gov (United States)

    Huang, Sheng-Cheng; Lee, Sara; Wang, Allen; Cantor, Scott B.; Sun, Clement; Fan, Kaili; Reece, Gregory P.; Kim, Min Soon; Markey, Mia K.

    A probability wheel app is intended to facilitate communication between two people, an "investigator" and a "participant", about uncertainties inherent in decision-making. Traditionally, a probability wheel is a mechanical prop with two colored slices. A user adjusts the sizes of the slices to indicate the relative value of the probabilities assigned to them. A probability wheel can improve the adjustment process and attenuate the effect of anchoring bias when it is used to estimate or communicate probabilities of outcomes. The goal of this work was to develop a mobile application of the probability wheel that is portable, easily available, and more versatile. We provide a motivating example from medical decision-making, but the tool is widely applicable for researchers in the decision sciences.

  18. A probability space for quantum models

    Science.gov (United States)

    Lemmens, L. F.

    2017-06-01

    A probability space contains a set of outcomes, a collection of events formed by subsets of the set of outcomes, and probabilities defined for all events. A reformulation in terms of propositions allows one to use the maximum entropy method to assign the probabilities, taking some constraints into account. The construction of a probability space for quantum models is determined by the choice of propositions, the choice of constraints, and the probability assignment by the maximum entropy method. This approach shows how typical quantum distributions such as Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein are partly related to well-known classical distributions. The relation between the conditional probability density, given some averages as constraints, and the appropriate ensemble is elucidated.
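
    A minimal numerical sketch of the assignment step: maximizing entropy subject to a single mean-value constraint yields the Boltzmann-like family p_k ∝ exp(-λ E_k), with λ fixed by the constraint. The levels and target mean below are assumed values:

```python
# Maximum-entropy probabilities over discrete "energy" levels with a fixed
# mean <E> as the only constraint; lambda is found by root-finding.
import numpy as np
from scipy.optimize import brentq

E = np.array([0.0, 1.0, 2.0, 3.0])  # assumed levels
target_mean = 1.2                   # assumed constraint <E>

def mean_at(lam):
    w = np.exp(-lam * E)
    return float((E * w).sum() / w.sum())

lam = brentq(lambda l: mean_at(l) - target_mean, -50.0, 50.0)
p = np.exp(-lam * E)
p /= p.sum()
print("lambda =", round(lam, 4), " p =", np.round(p, 4), " <E> =", round(mean_at(lam), 4))
```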

  19. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2005-01-01

    This book is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability to real-world problems, and introduce the basics of statistics. The book's clear writing style and homework problems make it ideal for the classroom or for self-study. * Good and solid introduction to probability theory and stochastic processes * Logically organized; writing is presented in a clear manner * Choice of topics is comprehensive within the area of probability * Ample homework problems are organized into chapter sections

  20. Striatal activity is modulated by target probability.

    Science.gov (United States)

    Hon, Nicholas

    2017-06-14

    Target probability has well-known neural effects. In the brain, target probability is known to affect frontal activity, with lower probability targets producing more prefrontal activation than those that occur with higher probability. Although the effect of target probability on cortical activity is well specified, its effect on subcortical structures such as the striatum is less well understood. Here, I examined this issue and found that the striatum was highly responsive to target probability. This is consistent with its hypothesized role in the gating of salient information into higher-order task representations. The current data are interpreted in light of that fact that different components of the striatum are sensitive to different types of task-relevant information.

  1. Defining Probability in Sex Offender Risk Assessment.

    Science.gov (United States)

    Elwood, Richard W

    2016-12-01

    There is ongoing debate and confusion over using actuarial scales to predict individuals' risk of sexual recidivism. Much of the debate comes from not distinguishing Frequentist from Bayesian definitions of probability. Much of the confusion comes from applying Frequentist probability to individuals' risk. By definition, only Bayesian probability can be applied to the single case. The Bayesian concept of probability resolves most of the confusion and much of the debate in sex offender risk assessment. Although Bayesian probability is well accepted in risk assessment generally, it has not been widely used to assess the risk of sex offenders. I review the two concepts of probability and show how the Bayesian view alone provides a coherent scheme to conceptualize individuals' risk of sexual recidivism.

  2. The deterioration of intermediate moisture foods

    Science.gov (United States)

    Labruza, T. P.

    1971-01-01

    Deteriorative reaction rates are low and food quality remains high if the moisture content of an intermediate moisture food is held at a water activity of 0.6 to 0.75. This information is of interest to the food processing and packaging industry.

  3. Intermediate/Advanced Research Design and Statistics

    Science.gov (United States)

    Ploutz-Snyder, Robert

    2009-01-01

    The purpose of this module is to provide Institutional Researchers (IRs) with an understanding of the principles of advanced research design and of the intermediate/advanced statistical procedures consistent with such designs.

  4. Simplifying biochemical models with intermediate species

    DEFF Research Database (Denmark)

    Feliu, Elisenda; Wiuf, Carsten

    2013-01-01

    techniques, we study systematically the effects of intermediate, or transient, species in biochemical systems and provide a simple, yet rigorous mathematical classification of all models obtained from a core model by including intermediates. Main examples include enzymatic and post-translational modification systems, where intermediates often are considered insignificant and neglected in a model, or they are not included because we are unaware of their existence. All possible models obtained from the core model are classified into a finite number of classes. Each class is defined by a mathematically simple canonical model that characterizes crucial dynamical properties, such as mono- and multistationarity and stability of steady states, of all models in the class. We show that if the core model does not have conservation laws, then the introduction of intermediates does not change the steady...

  5. On intermediate structures in heavy ion reactions

    International Nuclear Information System (INIS)

    Rotter, I.

    1977-01-01

    The conceptions of the nuclear reaction theory are reinvestigated on the basis of the continuum shell model. The correlation of the resonance states via the continuum can lead to intermediate structures in the cross section. (Auth.)

  6. Spatial probability aids visual stimulus discrimination

    Directory of Open Access Journals (Sweden)

    Michael Druker

    2010-08-01

    Full Text Available We investigated whether the statistical predictability of a target's location would influence how quickly and accurately it was classified. Recent results have suggested that spatial probability can be a cue for the allocation of attention in visual search. One explanation for probability cuing is spatial repetition priming. In our two experiments we used probability distributions that were continuous across the display rather than relying on a few arbitrary screen locations. This produced fewer spatial repeats and allowed us to dissociate the effect of a high probability location from that of short-term spatial repetition. The task required participants to quickly judge the color of a single dot presented on a computer screen. In Experiment 1, targets were more probable in an off-center hotspot of high probability that gradually declined to a background rate. Targets garnered faster responses if they were near earlier target locations (priming) and if they were near the high probability hotspot (probability cuing). In Experiment 2, target locations were chosen on three concentric circles around fixation. One circle contained 80% of targets. The value of this ring distribution is that it allowed for a spatially restricted high probability zone in which sequentially repeated trials were not likely to be physically close. Participant performance was sensitive to the high-probability circle in addition to the expected effects of eccentricity and the distance to recent targets. These two experiments suggest that inhomogeneities in spatial probability can be learned and used by participants on-line and without prompting as an aid for visual stimulus discrimination and that spatial repetition priming is not a sufficient explanation for this effect. Future models of attention should consider explicitly incorporating the probabilities of target locations and features.

  7. Has Banks’ Financial Intermediation Improved in Russia?

    OpenAIRE

    Fungachova, Z.; Solanko, L.

    2010-01-01

    The aim of this paper is to analyze the increasing importance of banks in the Russian economy over the period following the financial crisis of 1998. We use several measures to assess the role of banks in domestic financial intermediation in Russia. The traditional macro-level view is complemented by the analysis of sectoral financial flows as well as by insights from micro-level studies. All of these confirm that banks are becoming increasingly important in financial intermediation. We find ...

  8. Intermediate Inflation or Late Time Acceleration?

    International Nuclear Information System (INIS)

    Sanyal, A.K.

    2008-01-01

    The expansion rate of intermediate inflation lies between exponential and power-law expansion, but the corresponding accelerated expansion does not start at the onset of cosmological evolution. The present study of intermediate inflation reveals that it admits a scaling solution and has a natural exit from it at a later epoch of cosmic evolution, leading to late time acceleration. The corresponding scalar field responsible for this feature is also found to behave as a tracker field for gravity with a canonical kinetic term.

  9. Assessment of clinical utility of 18F-FDG PET in patients with head and neck cancer: a probability analysis

    International Nuclear Information System (INIS)

    Goerres, Gerhard W.; Mosna-Firlejczyk, Katarzyna; Schulthess, Gustav K. von; Steurer, Johann; Bachmann, Lucas M.

    2003-01-01

    The purpose of this study was to calculate disease probabilities based on data of patients with head and neck cancer in the register of our institution and to perform a systematic review of the available data on the accuracy of PET in the primary assessment and follow-up of patients with head and neck cancer. The pre-test probability of head and neck cancer among patients in our institutional data registry was assessed. Then the published literature was selected and appraised according to a standard protocol of systematic reviews. Two reviewers independently selected and extracted data on study characteristics, quality and accuracy. Accuracy data were used to form 2 x 2 contingency tables and were pooled to produce summary receiver operating characteristic (ROC) curves and summary likelihood ratios for positive and negative testing. Finally post-test probabilities were calculated on the basis of the pre-test probabilities of this patient group. All patients had cytologically or histologically proven cancer. The prevalence of additional lymph node metastases on PET in staging examinations was 19.6% (11/56), and that of locoregional recurrence on restaging PET was 28.6% (12/42). In the primary assessment of patients, PET had positive and negative likelihood ratios of 3.9 (2.56-5.93) and 0.24 (0.14-0.41), respectively. Disease probabilities were therefore 49.4% for a positive test result and 5.7% for a negative test result. In the assessment of recurrence these values were 3.96 (2.8-5.6) and 0.16 (0.1-0.25), resulting in probabilities of 49.7% and 3.8%. PET evaluation for involvement of lymph nodes had positive and negative likelihood ratios of 17.26 (10.9-27.3) and 0.19 (0.13-0.27) for primary assessment and 11.0 (2.93-41.24) and 0.14 (0.01-1.88) for detection of recurrence. The probabilities were 81.2% and 4.5% for primary assessment and 73.3% and 3.4% for assessment of recurrence. It is concluded that in this clinical setting the main advantage of PET is the
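
    The post-test figures follow from the pre-test probability and the likelihood ratios via the odds form of Bayes' theorem. A short check against the abstract's own numbers (the helper name is ours):

```python
# Post-test probability from pre-test probability and a likelihood ratio.
def post_test(pretest, lr):
    odds = pretest / (1 - pretest)   # pre-test odds
    post_odds = odds * lr            # Bayes in odds form
    return post_odds / (1 + post_odds)

pretest = 0.196  # prevalence of additional lymph node metastases at staging
print("positive PET:", round(post_test(pretest, 3.9), 3))   # ~0.49, as reported
print("negative PET:", round(post_test(pretest, 0.24), 3))  # ~0.06, as reported
```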

  10. Is probability of frequency too narrow?

    International Nuclear Information System (INIS)

    Martz, H.F.

    1993-01-01

    Modern methods of statistical data analysis, such as empirical and hierarchical Bayesian methods, should find increasing use in future Probabilistic Risk Assessment (PRA) applications. In addition, there will be a more formalized use of expert judgment in future PRAs. These methods require an extension of the probabilistic framework of PRA, in particular, the popular notion of probability of frequency, to consideration of frequency of frequency, frequency of probability, and probability of probability. The genesis, interpretation, and examples of these three extended notions are discussed
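
    A hedged miniature of the "probability of frequency" notion: the frequency itself carries a distribution rather than a point value. Here, a Beta posterior over a failure frequency per demand, from invented count data and a flat prior:

```python
# Beta posterior over an event frequency: Beta(1 + k, 1 + n - k) after
# observing k events in n demands, starting from a flat Beta(1, 1) prior.
from scipy.stats import beta

failures, demands = 3, 1000  # invented data
post = beta(1 + failures, 1 + demands - failures)
print("posterior mean:", round(post.mean(), 5))
print("90% interval  :", [round(x, 5) for x in post.interval(0.90)])
```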

  11. Associations of Systemic Diseases with Intermediate Uveitis.

    Science.gov (United States)

    Shoughy, Samir S; Kozak, Igor; Tabbara, Khalid F

    2016-01-01

    To determine the associations of systemic diseases with intermediate uveitis. The medical records of 50 consecutive cases with intermediate uveitis referred to The Eye Center in Riyadh, Saudi Arabia, were reviewed. Age- and sex-matched patients without uveitis served as controls. Patients had complete ophthalmic and medical examinations. There were 27 male and 23 female patients. Mean age was 29 years with a range of 5-62 years. Overall, 21 cases (42%) had systemic disorders associated with intermediate uveitis and 29 cases (58%) had no associated systemic disease. A total of 11 patients (22%) had asthma, 4 (8%) had multiple sclerosis, 3 (6%) had presumed ocular tuberculosis, 1 (2%) had inflammatory bowel disease, 1 (2%) had non-Hodgkin lymphoma and 1 (2%) had sarcoidosis. Evidence of systemic disease was found in 50 (5%) of the 1,000 control subjects. Bronchial asthma was found in 37 patients (3.7%), multiple sclerosis in 9 patients (0.9%), inflammatory bowel disease in 3 patients (0.3%), and tuberculosis in 1 patient (0.1%). None of the control patients had sarcoidosis or lymphoma. There were statistically significant associations between intermediate uveitis and bronchial asthma (p = 0.0001), multiple sclerosis (p = 0.003) and tuberculosis (p = 0.0005). Bronchial asthma and multiple sclerosis were the most frequently encountered systemic diseases associated with intermediate uveitis in our patient population. Patients with intermediate uveitis should undergo careful history-taking and investigations to rule out associated systemic illness.

  12. Higher order antibunching in intermediate states

    International Nuclear Information System (INIS)

    Verma, Amit; Sharma, Navneet K.; Pathak, Anirban

    2008-01-01

    Since the introduction of the binomial state as an intermediate state, various other intermediate states have been proposed, and different nonclassical effects have been reported in them. Until now, however, higher order antibunching had been predicted in only one type of intermediate state, known as the shadowed negative binomial state. Recently we have shown that higher order antibunching is not a rare phenomenon [P. Gupta, P. Pandey, A. Pathak, J. Phys. B 39 (2006) 1137]. To establish our earlier claim further, here we show that higher order antibunching can be seen in different intermediate states, such as the binomial state, reciprocal binomial state, hypergeometric state, generalized binomial state, negative binomial state and photon added coherent state. We have studied the possibility of observing higher order sub-Poissonian photon statistics in different limits of intermediate states. The effects of different control parameters on the depth of nonclassicality have also been studied in this connection, and it has been shown that the depth of nonclassicality can be tuned by controlling various physical parameters.

  13. Use of tactual materials on the achievement of content specific vocabulary and terminology acquisition within an intermediate level science curriculum

    Science.gov (United States)

    Terry, Brian H.

    In this quasi-experimental study, the researcher investigated the effectiveness of three tactual strategies and one non-tactual strategy for content-specific vocabulary acquisition. Flash cards, task cards, and learning wheels served as the tactual strategies, and vocabulary review sheets served as the non-tactual strategy. The sample (n=85) consisted of all middle school students in a small, high-performing middle school located in the northern suburbs of New York City. All of the vocabulary words and terms came from the New York State Intermediate Level Science Core Curriculum. Pre-tests and post-tests were used to collect the data. A repeated measures ANOVA was conducted on the gain scores from each of the treatments, and multiple paired sample t-tests were conducted to analyze the results. Repeated measures ANOVAs were used to determine whether there was variance between the academic achievement levels of the students, gender, and grade level for each of the treatments. All of the treatments significantly improved the science achievement of the students, but a significant difference was found between them. Significance was found between the achievement groups, with the above average students attaining a higher mean on the pre-test and post-test for each treatment, whereas the below average students had the lowest mean on both assessments. The sixth grade students showed significant improvement over the seventh grade students while using the flash cards (p=.004) and the learning wheel (p=.007). During the learning wheel treatment, the males scored significantly better (p=.021) than the females on the pre-test and post-test. During the worksheet treatment, significance (p=.034) was found between gender and achievement group: the below average male students had the greatest gain from the pre-test to the post-test, but their post-test mean was still the lowest of the groups. Limitations and implications for future research and current practice are discussed. Key words are: flash cards, task cards

  14. Rare gases transition probabilities for plasma diagnostics

    International Nuclear Information System (INIS)

    Katsonis, K.; Siskos, A.; Ndiaye, A.; Clark, R.E.H.; Cornille, M.; Abdallah, J. Jr

    2005-01-01

    Emission spectroscopy is a powerful optical diagnostics tool which has been widely used in studying and monitoring various industrial, laboratory and natural plasmas. As these plasmas are rarely in Local Thermodynamic Equilibrium (LTE), a prerequisite for satisfactory evaluation of the plasma electron density n_e and temperature T_e is the existence of a detailed Collisional-Radiative (C-R) model taking into account the main physical processes influencing the plasma state and the dynamics of its main constituents. The theoretical spectra which such a model generates match the experimental ones whenever the experimental values of n_e and T_e are introduced. In practice, in validating such models, discrepancies are observed which often are due to the atomic data included in the C-R model. In generating theoretical spectra pertaining to each atom (ion) multiplet, the most sensitive atomic data are the relevant transition probabilities A_j→i and electron collision excitation cross sections σ_i→j. We note that the latter are actually poorly known, especially for low ionization stages and near the excitation threshold. We address here the evaluation of the former, especially of the A_j→i of the Ar2+ ion responsible for the Ar III spectra and of those of the Xe2+ ion, which are evaluated in an analogous way. Extensive studies of the Ar III and Xe III spectra exist, but the present status of A_j→i cannot be considered sufficient for the generation of the theoretical spectra even of the most prominent visible lines coming from the Ar III multiplets 4s - 4p, 5p (corresponding to the well known 'red' and 'blue' lines of Ar I), 4p - 4d, 5d and 3p - 4s, 5s (resonant) and the analogous Xe III multiplets (which have principal quantum numbers increased by two). Due to the gap observed in the Grotrian diagrams, the resonant lines which, together with the important metastable ones, belong to the 3p - 4s, 5s multiplets (5p - 6s, 7s for Xe III), give spectra in the UV region. On

  15. Probability of Grounding and Collision Events

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup

    1996-01-01

    To quantify the risks involved in ship traffic, rational criteria for collision and grounding accidents are developed. This implies that probabilities as well as inherent consequences can be analysed and assessed. The present paper outlines a method for evaluation of the probability of ship...

  16. Probability of Grounding and Collision Events

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup

    1996-01-01

    To quantify the risks involved in ship traffic, rational criteria for collision and grounding accidents have to be developed. This implies that probabilities as well as inherent consequences have to be analyzed and assessed. The present notes outline a method for evaluation of the probability...

  17. Introducing Disjoint and Independent Events in Probability.

    Science.gov (United States)

    Kelly, I. W.; Zwiers, F. W.

    Two central concepts in probability theory are those of independence and mutually exclusive events. This document is intended to provide suggestions to teachers that can be used to equip students with an intuitive, comprehensive understanding of these basic concepts in probability. The first section of the paper delineates mutually exclusive and…
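
    The distinction can be made concrete with a single die roll; a small sketch (the event choices are ours):

```python
# Disjoint vs. independent on one fair die: A = "even" and B = "odd" are
# disjoint (P(A and B) = 0), while A = "even" and C = "at most 2" are
# independent (P(A and C) = P(A) P(C)).
from fractions import Fraction

outcomes = range(1, 7)

def prob(event):
    return Fraction(sum(1 for o in outcomes if event(o)), 6)

A = lambda o: o % 2 == 0
B = lambda o: o % 2 == 1
C = lambda o: o <= 2

print("P(A and B) =", prob(lambda o: A(o) and B(o)))          # 0
print("P(A and C) =", prob(lambda o: A(o) and C(o)),
      " P(A)P(C) =", prob(A) * prob(C))                       # 1/6 on both sides
```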

  18. Selected papers on probability and statistics

    CERN Document Server

    2009-01-01

    This volume contains translations of papers that originally appeared in the Japanese journal Sūgaku. The papers range over a variety of topics in probability theory, statistics, and applications. This volume is suitable for graduate students and research mathematicians interested in probability and statistics.

  19. Collective probabilities algorithm for surface hopping calculations

    International Nuclear Information System (INIS)

    Bastida, Adolfo; Cruz, Carlos; Zuniga, Jose; Requena, Alberto

    2003-01-01

    General equations are derived that the transition probabilities of the hopping algorithms in surface hopping calculations must obey in order to assure equality between the average quantum and classical populations. These equations are solved for two particular cases. In the first, it is assumed that the probabilities are the same for all trajectories and that the number of hops is kept to a minimum. These assumptions specify the collective probabilities (CP) algorithm, for which the transition probabilities depend on the average populations over all trajectories. In the second case, the probabilities for each trajectory are supposed to be completely independent of the results from the other trajectories. There is then a unique solution of the general equations assuring that the transition probabilities are equal to the quantum population of the target state, which is referred to as the independent probabilities (IP) algorithm. The fewest switches (FS) algorithm developed by Tully is accordingly understood as an approximate hopping algorithm which takes elements from the accurate CP and IP solutions. A numerical test of all these hopping algorithms is carried out for a one-dimensional two-state problem with two avoided crossings, which shows the accuracy and computational efficiency of the collective probabilities algorithm proposed, the limitations of the FS algorithm and the similarity between the results offered by the IP algorithm and those obtained with the Ehrenfest method
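
    The consistency requirement, ensemble-averaged classical populations matching the quantum ones, can be illustrated with a two-state toy in which hop probabilities follow the change of an assumed Rabi-type quantum population. This is a sketch in the spirit of such population-based hopping rules, not a reproduction of the paper's CP or IP equations:

```python
# Two-state toy: hop probabilities are chosen from the quantum population
# change so that the fraction of trajectories on each state tracks it.
import numpy as np

rng = np.random.default_rng(0)
n_traj, dt, steps, omega = 5000, 0.01, 600, 1.0
state = np.zeros(n_traj, dtype=int)  # all trajectories start on state 0

def pop0(t):  # assumed quantum population of state 0 (Rabi oscillation)
    return np.cos(0.5 * omega * t) ** 2

for k in range(steps):
    t = k * dt
    p_now, p_next = pop0(t), pop0(t + dt)
    if p_next < p_now:   # population flows 0 -> 1
        g = (p_now - p_next) / p_now
        hop = (state == 0) & (rng.random(n_traj) < g)
    else:                # population flows 1 -> 0
        g = (p_next - p_now) / (1.0 - p_now)
        hop = (state == 1) & (rng.random(n_traj) < g)
    state[hop] ^= 1      # flip the trajectories that hop

print("classical fraction on state 0:", (state == 0).mean())
print("quantum population of state 0:", round(float(pop0(steps * dt)), 4))
```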

  20. Examples of Neutrosophic Probability in Physics

    Directory of Open Access Journals (Sweden)

    Fu Yuhua

    2015-01-01

    Full Text Available This paper re-discusses the problems of the so-called “law of nonconservation of parity” and “accelerating expansion of the universe”, and presents examples of determining the Neutrosophic Probability of the experiment of Chien-Shiung Wu et al. in 1957, and of determining the Neutrosophic Probability of accelerating expansion of the partial universe.

  1. Eliciting Subjective Probabilities with Binary Lotteries

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd

    objective probabilities. Drawing a sample from the same subject population, we find evidence that the binary lottery procedure induces linear utility in a subjective probability elicitation task using the Quadratic Scoring Rule. We also show that the binary lottery procedure can induce direct revelation...

  2. Probability Issues in without Replacement Sampling

    Science.gov (United States)

    Joarder, A. H.; Al-Sabah, W. S.

    2007-01-01

    Sampling without replacement is an important aspect in teaching conditional probabilities in elementary statistics courses. Different methods proposed in different texts for calculating probabilities of events in this context are reviewed and their relative merits and limitations in applications are pinpointed. An alternative representation of…

  3. Some open problems in noncommutative probability

    International Nuclear Information System (INIS)

    Kruszynski, P.

    1981-01-01

    A generalization of probability measures to non-Boolean structures is discussed. The starting point of the theory is the Gleason theorem about the form of measures on closed subspaces of a Hilbert space. The problems are formulated in terms of probability on lattices of projections in arbitrary von Neumann algebras. (Auth.)

  4. Probability: A Matter of Life and Death

    Science.gov (United States)

    Hassani, Mehdi; Kippen, Rebecca; Mills, Terence

    2016-01-01

    Life tables are mathematical tables that document probabilities of dying and life expectancies at different ages in a society. Thus, the life table contains some essential features of the health of a population. Probability is often regarded as a difficult branch of mathematics. Life tables provide an interesting approach to introducing concepts…

  5. Teaching Probability: A Socio-Constructivist Perspective

    Science.gov (United States)

    Sharma, Sashi

    2015-01-01

    There is a considerable and rich literature on students' misconceptions in probability. However, less attention has been paid to the development of students' probabilistic thinking in the classroom. This paper offers a sequence, grounded in a socio-constructivist perspective, for teaching probability.

  6. Stimulus Probability Effects in Absolute Identification

    Science.gov (United States)

    Kent, Christopher; Lamberts, Koen

    2016-01-01

    This study investigated the effect of stimulus presentation probability on accuracy and response times in an absolute identification task. Three schedules of presentation were used to investigate the interaction between presentation probability and stimulus position within the set. Data from individual participants indicated strong effects of…

  7. 47 CFR 1.1623 - Probability calculation.

    Science.gov (United States)

    2010-10-01

    47 CFR § 1.1623, Probability calculation (Title 47, Telecommunication; Federal Communications Commission; General Practice and Procedure; Random Selection Procedures for Mass Media Services; General Procedures). (a) All calculations shall be...

  8. Simulations of Probabilities for Quantum Computing

    Science.gov (United States)

    Zak, M.

    1996-01-01

    It has been demonstrated that classical probabilities, and in particular a probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities and their link to quantum computations are discussed.

  9. Against All Odds: When Logic Meets Probability

    NARCIS (Netherlands)

    van Benthem, J.; Katoen, J.-P.; Langerak, R.; Rensink, A.

    2017-01-01

    This paper is a light walk along interfaces between logic and probability, triggered by a chance encounter with Ed Brinksma. It is not a research paper, or a literature survey, but a pointer to issues. I discuss both direct combinations of logic and probability and structured ways in which logic can

  10. An introduction to probability and stochastic processes

    CERN Document Server

    Melsa, James L

    2013-01-01

    Geared toward college seniors and first-year graduate students, this text is designed for a one-semester course in probability and stochastic processes. Topics covered in detail include probability theory, random variables and their functions, stochastic processes, linear system response to stochastic processes, Gaussian and Markov processes, and stochastic differential equations. 1973 edition.

  11. The probability of the false vacuum decay

    International Nuclear Information System (INIS)

    Kiselev, V.; Selivanov, K.

    1983-01-01

    The closed expression for the probability of the false vacuum decay in (1+1) dimensions is given. The probability of false vacuum decay is expressed as the product of an exponential quasiclassical factor and a functional determinant of the given form. The method for calculation of this determinant is developed and a complete answer for (1+1) dimensions is given

  12. Probability elements of the mathematical theory

    CERN Document Server

    Heathcote, C R

    2000-01-01

    Designed for students studying mathematical statistics and probability after completing a course in calculus and real variables, this text deals with basic notions of probability spaces, random variables, distribution functions and generating functions, as well as joint distributions and the convergence properties of sequences of random variables. Includes worked examples and over 250 exercises with solutions.

  13. The transition probabilities of the reciprocity model

    NARCIS (Netherlands)

    Snijders, T.A.B.

    1999-01-01

    The reciprocity model is a continuous-time Markov chain model used for modeling longitudinal network data. A new explicit expression is derived for its transition probability matrix. This expression can be checked relatively easily. Some properties of the transition probabilities are given, as well
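
    A generic numerical cross-check for any such closed form: for a continuous-time Markov chain with rate matrix Q, the transition probability matrix is P(t) = expm(Qt). The 3-state Q below is illustrative and is not the reciprocity model's rate matrix:

```python
# Transition probabilities of a continuous-time Markov chain via the
# matrix exponential P(t) = expm(Q t); rows of P(t) sum to one.
import numpy as np
from scipy.linalg import expm

Q = np.array([[-0.5,  0.3,  0.2],
              [ 0.1, -0.4,  0.3],
              [ 0.2,  0.2, -0.4]])  # illustrative rate matrix

P = expm(Q * 2.0)  # transition probabilities over t = 2
print(np.round(P, 4))
print("rows sum to 1:", np.allclose(P.sum(axis=1), 1.0))
```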

  14. Probability numeracy and health insurance purchase

    NARCIS (Netherlands)

    Dillingh, Rik; Kooreman, Peter; Potters, Jan

    2016-01-01

    This paper provides new field evidence on the role of probability numeracy in health insurance purchase. Our regression results, based on rich survey panel data, indicate that the expenditure on two out of three measures of health insurance first rises with probability numeracy and then falls again.

  15. Probability mapping of scarred myocardium using texture and intensity features in CMR images

    Science.gov (United States)

    2013-01-01

    Background: The myocardium exhibits a heterogeneous nature due to scarring after Myocardial Infarction (MI). In Cardiac Magnetic Resonance (CMR) imaging, Late Gadolinium (LG) contrast agent enhances the intensity of the scarred area in the myocardium. Methods: In this paper, we propose a probability mapping technique using texture and intensity features to describe the heterogeneous nature of the scarred myocardium in CMR images after MI. Scarred tissue and non-scarred tissue are represented with high and low probabilities, respectively. Intermediate values possibly indicate areas where the scarred and healthy tissues are interwoven. The probability map of scarred myocardium is calculated by using a probability function based on Bayes' rule. Any set of features can be used in the probability function. Results: In the present study, we demonstrate the use of two different types of features. One is based on the mean intensity of a pixel and the other on the underlying texture information of the scarred and non-scarred myocardium. Examples of probability maps computed using the mean pixel intensity and the underlying texture information are presented. We hypothesize that the probability mapping of the myocardium offers an alternate visualization, possibly showing details of physiological significance that are difficult to detect visually in the original CMR image. Conclusion: The probability mapping obtained from the two features provides a way to define different cardiac segments, which offers a way to identify areas in the myocardium of diagnostic importance (like core and border areas in scarred myocardium). PMID:24053280
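
    A minimal sketch of the intensity branch of such a probability map: Bayes' rule with Gaussian class likelihoods over pixel intensity. The class means, standard deviations, and prior are invented, and the paper additionally uses texture features:

```python
# Pixel-wise P(scar | intensity) from Gaussian likelihoods and a prior.
import numpy as np
from scipy.stats import norm

def scar_probability(img, mu_scar=180.0, sd_scar=25.0,
                     mu_norm=90.0, sd_norm=20.0, prior=0.3):
    l_scar = norm.pdf(img, mu_scar, sd_scar) * prior
    l_norm = norm.pdf(img, mu_norm, sd_norm) * (1.0 - prior)
    return l_scar / (l_scar + l_norm)

img = np.array([[80.0, 120.0],
                [160.0, 200.0]])           # toy "LG-enhanced" intensity patch
print(np.round(scar_probability(img), 3))  # values near 1 are scar-like
```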

  16. The enigma of probability and physics

    International Nuclear Information System (INIS)

    Mayants, L.

    1984-01-01

    This volume contains a coherent exposition of the elements of two unique sciences: probabilistics (the science of probability) and probabilistic physics (the application of probabilistics to physics). Proceeding from a key methodological principle, it starts with the disclosure of the true content of probability and the interrelation between probability theory and experimental statistics. This makes it possible to introduce a proper order in all the sciences dealing with probability and, by conceiving the real content of statistical mechanics and quantum mechanics in particular, to construct both as two interconnected domains of probabilistic physics. Consistent theories of kinetics of physical transformations, decay processes, and intramolecular rearrangements are also outlined. The interrelation between the electromagnetic field, photons, and the theoretically discovered subatomic particle 'emon' is considered. Numerous internal imperfections of conventional probability theory, statistical physics, and quantum physics are exposed and removed - quantum physics no longer needs special interpretation. EPR, Bohm, and Bell paradoxes are easily resolved, among others. (Auth.)

  17. Optimizing Probability of Detection Point Estimate Demonstration

    Science.gov (United States)

    Koshti, Ajay M.

    2017-01-01

    Probability of detection (POD) analysis is used in assessing the reliably detectable flaw size in nondestructive evaluation (NDE). MIL-HDBK-1823 and the associated mh1823 POD software give the most common methods of POD analysis. These NDE methods are intended to detect real flaws such as cracks and crack-like flaws. A reliably detectable crack size is required for safe-life analysis of fracture-critical parts. The paper discusses optimizing probability of detection (POD) demonstration experiments using the point estimate method. The POD point estimate method is used by NASA for qualifying special NDE procedures. The point estimate method uses the binomial distribution for the probability density. Normally, a set of 29 flaws of the same size within some tolerance is used in the demonstration. The optimization is performed to provide an acceptable value for the probability of passing the demonstration (PPD) and an acceptable value for the probability of false (POF) calls while keeping the flaw sizes in the set as small as possible.
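
    Assuming the usual pass criterion of detecting all 29 flaws, the probability of passing the demonstration is PPD = POD**29, and a 29-of-29 outcome supports a 90% POD at 95% confidence because 0.90**29 < 0.05. A short check:

```python
# Point-estimate demonstration arithmetic for n = 29 flaws, assuming the
# pass criterion is "all flaws detected".
n = 29
for true_pod in (0.85, 0.90, 0.95, 0.99):
    ppd = true_pod ** n
    print(f"true POD {true_pod:.2f}: probability of passing = {ppd:.3f}")

print("0.90**29 =", round(0.90 ** n, 4), "(< 0.05, hence the 90/95 claim)")
```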

  18. Alternative probability theories for cognitive psychology.

    Science.gov (United States)

    Narens, Louis

    2014-01-01

    Various proposals for generalizing event spaces for probability functions have been put forth in the mathematical, scientific, and philosophic literatures. In cognitive psychology such generalizations are used for explaining puzzling results in decision theory and for modeling the influence of context effects. This commentary discusses proposals for generalizing probability theory to event spaces that are not necessarily boolean algebras. Two prominent examples are quantum probability theory, which is based on the set of closed subspaces of a Hilbert space, and topological probability theory, which is based on the set of open sets of a topology. Both have been applied to a variety of cognitive situations. This commentary focuses on how event space properties can influence probability concepts and impact cognitive modeling. Copyright © 2013 Cognitive Science Society, Inc.

  19. Tracer signals of the intermediate layer of the Arabian Sea

    Science.gov (United States)

    Rhein, Monika; Stramma, Lothar; Plähn, Olaf

    In 1995, hydrographic and chlorofluorocarbon (CFCs, components F11, F12) measurements were carried out in the Gulf of Aden, in the Gulf of Oman, and in the Arabian Sea. In the Gulf of Oman, the F12 concentrations in the Persian Gulf outflow (PGW) at about 300m depth were significantly higher than in ambient surface water, with saturations reaching 270%. These high values could not be caused by air-sea gas exchange. The outflow was probably contaminated with oil, and the lipophilic character of the CFCs could then lead to the observed supersaturations. The intermediate F12 maximum decreased rapidly further east and south. At the Strait of Bab el Mandeb in the Gulf of Aden, the Red Sea outflow (RSW) was saturated with F12 to about 65% at 400m depth, decreasing to 50% while descending to 800m depth. The low saturation is not surprising, because the outflow contains deep and intermediate water masses from the Red Sea which were isolated from the surface for some time. The tracer contributions to the Arabian Sea for Indian Central Water (ICW) and PGW are about equal, while below 500m depth the RSW contribution greatly exceeds that of ICW. From a model of the CFC budget of the Arabian Sea, the inflow of ICW north of 12°N is estimated to be 1-6 Sv, depending mainly on the strength of the flow of Red Sea Water into the Arabian Sea.

  20. Intermediate-level crossings of a first-passage path

    International Nuclear Information System (INIS)

    Bhat, Uttam; Redner, S

    2015-01-01

    We investigate some simple and surprising properties of a one-dimensional Brownian trajectory with diffusion coefficient D that starts at the origin and: (i) is at X at time T, or (ii) first reaches X at time T. We determine the most likely location of the first-passage trajectory from (0, 0) to (X, T) and its distribution at any intermediate time t < T. A first-passage path typically starts out by being repelled from its final location when X²/DT ≪ 1. We also determine the distribution of times when the trajectory first crosses and last crosses an arbitrary intermediate position x < X. The distribution of first-crossing times may be unimodal or bimodal, depending on whether X²/DT ≪ 1 or X²/DT ≫ 1. The form of the first-crossing probability in the bimodal regime is qualitatively similar to, but more singular than, the well-known arcsine law. (paper)
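
    The first-passage conditioning is easy to probe by Monte Carlo, with a +/-1 random walk standing in for Brownian motion; the sample sizes and the X, T values below are illustrative choices:

```python
# Sample random walks, keep those whose FIRST visit to X occurs exactly at
# step T, and inspect the conditioned paths at the intermediate time T/2.
import numpy as np

rng = np.random.default_rng(1)
n_walks, T, X = 200_000, 100, 10
steps = rng.choice((-1, 1), size=(n_walks, T))
paths = steps.cumsum(axis=1)

hit = paths >= X
first_hit = np.where(hit.any(axis=1), hit.argmax(axis=1) + 1, -1)
cond = paths[first_hit == T]   # first passage exactly at step T

mid = cond[:, T // 2 - 1]      # position after T/2 steps
print("accepted first-passage paths:", len(cond))
print("mean position at T/2:", round(float(mid.mean()), 2), "(target X =", X, ")")
```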

  1. Intermediate filament protein nestin is expressed in developing meninges.

    Science.gov (United States)

    Yay, A; Ozdamar, S; Canoz, O; Baran, M; Tucer, B; Sonmez, M F

    2014-01-01

    Nestin is a type VI intermediate filament protein known as a marker for progenitor cells that can be mostly found in tissues during the embryonic and fetal periods. In our study, we aimed to determine the expression of nestin in meninges covering the brain tissue at different developmental stages and in the new born. In this study 10 human fetuses in different development stages between developmental weeks 9-34 and a newborn brain tissue were used. Fetuses in paraffin section were stained with H+E and nestin immunohistochemical staining protocol was performed. In this study, in the human meninges intense nestin expression was detected as early as in the 9th week of development. Intensity of this expression gradually decreased in later stages of development and nestin expression still persisted in a small population of newborn meningeal cells. In the present study, nestin positive cells gradually diminished in the developing and maturing meninges during the fetal period. This probably depends on initiation of a decrease in nestin expression and replacement with other tissue-specific intermediate filaments while the differentiation process continues. These differences can make significant contributions to the investigation and diagnosis of various pathological disorders (Tab. 1, Fig. 3, Ref. 36).

  2. Evaluation of nuclear power plant component failure probability and core damage probability using simplified PSA model

    International Nuclear Information System (INIS)

    Shimada, Yoshio

    2000-01-01

    It is anticipated that changes in the frequency of surveillance tests, preventive maintenance or parts replacement of safety related components may change component failure probabilities and, in turn, the core damage probability, and that the change differs depending on the initiating event frequency and the component type. This study assessed the change in core damage probability using a simplified PSA model capable of calculating core damage probability in a short time period, developed by the US NRC to process accident sequence precursors, when various components' failure probabilities are changed between 0 and 1, or when Japanese or American initiating event frequency data are used. As a result of the analysis: (1) It was clarified that the frequency of surveillance tests, preventive maintenance or parts replacement of motor driven pumps (high pressure injection pumps, residual heat removal pumps, auxiliary feedwater pumps) should be changed carefully, since the change in core damage probability is large when the base failure probability increases. (2) Core damage probability is insensitive to changes in surveillance test frequency for motor operated valves and the turbine driven auxiliary feedwater pump, since the change in core damage probability is small when their failure probabilities change by about one order of magnitude. (3) The change in core damage probability is small when Japanese failure probability data are applied to the emergency diesel generator, even if the failure probability changes by one order of magnitude from the base value. On the other hand, when American failure probability data are applied, the increase in core damage probability is large even for increases of that size. Therefore, when Japanese failure probability data are applied, core damage probability is insensitive to changes in surveillance test frequency etc. (author)
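
    The sensitivity exercise can be miniaturized with a two-sequence cut-set model in which the core damage frequency is a sum of initiating-event frequencies times failure-probability products; all numbers below are invented:

```python
# Toy CDF model: two accident sequences sharing one pump. Sweeping the
# pump's failure probability from 0 to 1 mirrors the study's exercise.
def cdf(p_pump, p_valve=3e-2, p_dg=2e-2, f_loca=1e-4, f_loop=5e-2):
    seq1 = f_loca * p_pump * p_valve  # initiator 1 with pump + valve failure
    seq2 = f_loop * p_dg * p_pump     # initiator 2 with diesel + pump failure
    return seq1 + seq2

base = cdf(p_pump=1e-3)
for p in (0.0, 1e-3, 1e-2, 1e-1, 1.0):
    print(f"p_pump = {p:.0e}: CDF = {cdf(p):.2e} (delta = {cdf(p) - base:+.2e})")
```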

  3. Partially folded intermediates during trypsinogen denaturation

    Directory of Open Access Journals (Sweden)

    Martins N.F.

    1999-01-01

    Full Text Available The equilibrium unfolding of bovine trypsinogen was studied by circular dichroism, differential spectra and size exclusion HPLC. The change in free energy of denaturation was ΔG = 6.99 ± 1.40 kcal/mol for guanidine hydrochloride and ΔG = 6.37 ± 0.57 kcal/mol for urea. Satisfactory fits of equilibrium unfolding transitions required a three-state model involving an intermediate in addition to the native and unfolded forms. Size exclusion HPLC allowed the detection of an intermediate population of trypsinogen whose Stokes radii varied from 24.1 ± 0.4 Å to 26.0 ± 0.3 Å for 1.5 M and 2.5 M guanidine hydrochloride, respectively. During urea denaturation, the range of Stokes radii varied from 23.9 ± 0.3 Å to 25.7 ± 0.6 Å for 4.0 M and 6.0 M urea, respectively. Maximal intrinsic fluorescence was observed at about 3.8 M urea, with 8-anilino-1-naphthalene sulfonate (ANS) binding. These experimental data indicate that the unfolding of bovine trypsinogen is not a simple transition and suggest that the equilibrium intermediate population comprises one intermediate that may be characterized as a molten globule. By studying intermediates representing different stages of unfolding, we hope to gain a better understanding of the complex interrelations between protein conformation and energetics.

  4. Upgrading Probability via Fractions of Events

    Directory of Open Access Journals (Sweden)

    Frič Roman

    2016-08-01

    Full Text Available The influence of “Grundbegriffe” by A. N. Kolmogorov (published in 1933) on education in the area of probability and its impact on research in stochastics cannot be overestimated. We would like to point out three aspects of the classical probability theory “calling for” an upgrade: (i) classical random events are black-and-white (Boolean); (ii) classical random variables do not model quantum phenomena; (iii) basic maps (probability measures and observables, the dual maps to random variables) have very different “mathematical nature”. Accordingly, we propose an upgraded probability theory based on Łukasiewicz operations (multivalued logic) on events and elementary category theory, covering the classical probability theory as a special case. The upgrade can be compared to replacing calculations with integers by calculations with rational (and real) numbers. Namely, to avoid the three objections, we embed the classical (Boolean) random events (represented by the {0, 1}-valued indicator functions of sets) into upgraded random events (represented by measurable [0, 1]-valued functions), the minimal domain of probability containing “fractions” of classical random events, and we upgrade the notions of probability measure and random variable.

  5. Failure probability analysis of optical grid

    Science.gov (United States)

    Zhong, Yaoquan; Guo, Wei; Sun, Weiqiang; Jin, Yaohui; Hu, Weisheng

    2008-11-01

    Optical grid, the integrated computing environment based on optical network, is expected to be an efficient infrastructure to support advanced data-intensive grid applications. In optical grid, faults of both computational and network resources are inevitable due to the large scale and high complexity of the system. With optical-network-based distributed computing systems widely applied to data processing, application failure probability has become an important indicator of application quality and an important aspect for operators to consider. This paper presents a task-based method for analyzing application failure probability in optical grid. The failure probability of an entire application can then be quantified, and the performance of different backup strategies in reducing it can be compared, so that the different requirements of different clients can be satisfied. When an application based on a DAG (directed acyclic graph) is executed under different backup strategies, the application failure probability and the application completion time differ. This paper proposes a new multi-objective differentiated services algorithm (MDSA) that guarantees the failure probability requirement, improves network resource utilization, and realizes a compromise between the network operator and the application submitter. Differentiated services can thus be achieved in optical grid.
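
    As a rough illustration of the task-based view, here is a minimal sketch; the task names, failure probabilities, independence assumption, and replication model are all hypothetical, and the scheduling logic of MDSA itself is not modeled:

    ```python
    # Application failure probability over a set of DAG tasks: the application fails
    # if any task fails on all of its (independently executed) replicas.

    def app_failure_probability(task_fail, replicas):
        """task_fail: dict task -> failure probability of one execution;
        replicas: dict task -> number of independent executions (default 1)."""
        p_success = 1.0
        for task, p in task_fail.items():
            r = replicas.get(task, 1)
            p_success *= 1.0 - p ** r          # task survives unless all replicas fail
        return 1.0 - p_success

    tasks = {"t1": 0.02, "t2": 0.05, "t3": 0.01}
    print(app_failure_probability(tasks, {}))           # no backup strategy
    print(app_failure_probability(tasks, {"t2": 2}))    # back up the weakest task
    ```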

  6. Uncertainty about probability: a decision analysis perspective

    International Nuclear Information System (INIS)

    Howard, R.A.

    1988-01-01

    The issue of how to think about uncertainty about probability is framed and analyzed from the viewpoint of a decision analyst. The failure of nuclear power plants is used as an example. The key idea is to think of probability as describing a state of information on an uncertain event, and to pose the issue of uncertainty in this quantity as uncertainty about a number that would be definitive: it has the property that you would assign it as the probability if you knew it. Logical consistency requires that the probability to assign to a single occurrence in the absence of further information be the mean of the distribution of this definitive number, not the median as is sometimes suggested. Any decision that must be made without the benefit of further information must also be made using the mean of the definitive number's distribution. With this formulation, we find further that the probability of r occurrences in n exchangeable trials will depend on the first n moments of the definitive number's distribution. In making decisions, the expected value of clairvoyance on the occurrence of the event must be at least as great as that on the definitive number. If one of the events in question occurs, then the increase in probability of another such event is readily computed. This means, in terms of coin tossing, that unless one is absolutely sure of the fairness of a coin, seeing a head must increase the probability of heads, in distinction to usual thought. A numerical example for nuclear power shows that the failure of one plant of a group with a low probability of failure can significantly increase the probability that must be assigned to failure of a second plant in the group
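
    The coin-tossing point is easy to verify numerically. A minimal sketch, assuming an arbitrary three-point prior over the definitive number:

    ```python
    # Seeing a head increases the probability of heads whenever the prior over the
    # definitive number theta is not a point mass (E[theta^2]/E[theta] >= E[theta]).
    import numpy as np

    theta = np.array([0.3, 0.5, 0.7])      # possible values of the definitive number
    prior = np.array([0.2, 0.6, 0.2])      # state of information about theta

    p_head = np.sum(prior * theta)          # probability to assign: the MEAN, not the median
    posterior = prior * theta / p_head      # Bayes' rule after observing one head
    p_head_next = np.sum(posterior * theta)

    print(f"P(head on toss 1)        = {p_head:.4f}")
    print(f"P(head on toss 2 | head) = {p_head_next:.4f}  (strictly larger here)")
    ```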

  7. The Intermediate Velocity Source in the 40Ca + 197Au Reaction at 35 AMeV

    International Nuclear Information System (INIS)

    Planeta, R.; Sosin, Z.; Hachaj, P.

    2001-01-01

    The creation of hot Ca-like fragments and the emission of intermediate velocity particles was studied in the ⁴⁰Ca + ¹⁹⁷Au reaction at 35 AMeV. For peripheral collisions the primary projectile-like fragment was reconstructed using the AMPHORA 4π detector system. The particle distributions are compared with the predictions of a Monte Carlo code which calculates the nucleon transfer and clustering probabilities according to the system density of states. The velocity distributions of charged particles projected on the beam direction can be explained if emissions from the hot projectile-like fragment and the target-like fragment are supplemented by an emission from an intermediate velocity source located between them. The properties of the intermediate velocity source are properly described, including the ²D/³T/³He effect. (author)

  8. The ARES High-level Intermediate Representation

    Energy Technology Data Exchange (ETDEWEB)

    Moss, Nicholas David [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-03-03

    The LLVM intermediate representation (IR) lacks semantic constructs for depicting common high-performance operations such as parallel and concurrent execution, communication and synchronization. Currently, representing such semantics in LLVM requires either extending the intermediate form (a significant undertaking) or the use of ad hoc indirect means such as encoding them as intrinsics and/or the use of metadata constructs. In this paper we discuss a work in progress to explore the design and implementation of a new compilation stage and associated high-level intermediate form that is placed between the abstract syntax tree and its lowering to LLVM's IR. This high-level representation is a superset of LLVM IR and supports the direct representation of these common parallel computing constructs along with the infrastructure for supporting analysis and transformation passes on this representation.

  9. Probability an introduction with statistical applications

    CERN Document Server

    Kinney, John J

    2014-01-01

    Praise for the First Edition: "This is a well-written and impressively presented introduction to probability and statistics. The text throughout is highly readable, and the author makes liberal use of graphs and diagrams to clarify the theory." - The Statistician. Thoroughly updated, Probability: An Introduction with Statistical Applications, Second Edition features a comprehensive exploration of statistical data analysis as an application of probability. The new edition provides an introduction to statistics with accessible coverage of reliability, acceptance sampling, confidence intervals, …

  10. Dependency models and probability of joint events

    International Nuclear Information System (INIS)

    Oerjasaeter, O.

    1982-08-01

    Probabilistic dependencies between components/systems are discussed with reference to a broad classification of potential failure mechanisms. Further, a generalized time-dependency model, based on conditional probabilities for estimation of the probability of joint events and event sequences is described. The applicability of this model is clarified/demonstrated by various examples. It is concluded that the described model of dependency is a useful tool for solving a variety of practical problems concerning the probability of joint events and event sequences where common cause and time-dependent failure mechanisms are involved. (Auth.)

  11. Handbook of probability theory and applications

    CERN Document Server

    Rudas, Tamas

    2008-01-01

    ""This is a valuable reference guide for readers interested in gaining a basic understanding of probability theory or its applications in problem solving in the other disciplines.""-CHOICEProviding cutting-edge perspectives and real-world insights into the greater utility of probability and its applications, the Handbook of Probability offers an equal balance of theory and direct applications in a non-technical, yet comprehensive, format. Editor Tamás Rudas and the internationally-known contributors present the material in a manner so that researchers of vari

  12. Probabilities on Streams and Reflexive Games

    Directory of Open Access Journals (Sweden)

    Andrew Schumann

    2014-01-01

    Full Text Available Probability measures on streams (e.g., on hypernumbers and p-adic numbers) have been defined. It was shown that these probabilities can be used for simulations of reflexive games. In particular, it can be proved that Aumann's agreement theorem does not hold for these probabilities. Instead of this theorem, there is a statement that is called the reflexion disagreement theorem. Based on this theorem, probabilistic and knowledge conditions can be defined for reflexive games at various reflexion levels up to the infinite level. (original abstract)

  13. Concept of probability in statistical physics

    CERN Document Server

    Guttmann, Y M

    1999-01-01

    Foundational issues in statistical mechanics and the more general question of how probability is to be understood in the context of physical theories are both areas that have been neglected by philosophers of physics. This book fills an important gap in the literature by providing a most systematic study of how to interpret probabilistic assertions in the context of statistical mechanics. The book explores both subjectivist and objectivist accounts of probability, and takes full measure of work in the foundations of probability theory, in statistical mechanics, and in mathematical theory. It will be of particular interest to philosophers of science, physicists and mathematicians interested in foundational issues, and also to historians of science.

  14. Computation of the Complex Probability Function

    Energy Technology Data Exchange (ETDEWEB)

    Trainer, Amelia Jo [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Ledwith, Patrick John [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-22

    The complex probability function is important in many areas of physics, and many techniques have been developed in an attempt to compute it for some z quickly and efficiently. Most prominent are the methods that use Gauss-Hermite quadrature, which uses the roots of the nth degree Hermite polynomial and corresponding weights to approximate the complex probability function. This document serves as an overview and discussion of the use, shortcomings, and potential improvements on the Gauss-Hermite quadrature for the complex probability function.
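
    To make the quadrature idea concrete, here is a minimal sketch assuming the standard integral representation of the Faddeeva function for Im(z) > 0; the node count n = 40 is an arbitrary choice, and scipy's wofz is used only as a reference:

    ```python
    # For Im(z) > 0 the complex probability (Faddeeva) function satisfies
    #   w(z) = (i/pi) * Integral_{-inf}^{inf} exp(-t^2) / (z - t) dt,
    # which Gauss-Hermite quadrature turns into a weighted sum over Hermite roots.
    import numpy as np
    from numpy.polynomial.hermite import hermgauss
    from scipy.special import wofz  # reference implementation for comparison

    def w_gauss_hermite(z, n=40):
        nodes, weights = hermgauss(n)   # roots/weights of the degree-n Hermite polynomial
        return (1j / np.pi) * np.sum(weights / (z - nodes))

    z = 1.5 + 0.8j
    print(w_gauss_hermite(z))   # quadrature approximation
    print(wofz(z))              # scipy reference value
    ```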

  15. Pre-aggregation for Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    Motivated by the increasing need to analyze complex uncertain multidimensional data (e.g., in order to optimize and personalize location-based services), this paper proposes novel types of probabilistic OLAP queries that operate on aggregate values that are probability distributions, and the techniques to process these queries. The paper also presents the methods for computing the probability distributions, which enables pre-aggregation, and for using the pre-aggregated distributions for further aggregation. In order to achieve good time and space efficiency, the methods perform approximate … multidimensional data analysis that is considered in this paper (i.e., approximate processing of probabilistic OLAP queries over probability distributions).

  16. Comparing linear probability model coefficients across groups

    DEFF Research Database (Denmark)

    Holm, Anders; Ejrnæs, Mette; Karlson, Kristian Bernt

    2015-01-01

    This article offers a formal identification analysis of the problem in comparing coefficients from linear probability models between groups. We show that differences in coefficients from these models can result not only from genuine differences in effects, but also from differences in one or more of the following three components: outcome truncation, scale parameters and distributional shape of the predictor variable. These results point to limitations in using linear probability model coefficients for group comparisons. We also provide Monte Carlo simulations and real examples to illustrate these limitations, and we suggest a restricted approach to using linear probability model coefficients in group comparisons.

  17. Modeling experiments using quantum and Kolmogorov probability

    International Nuclear Information System (INIS)

    Hess, Karl

    2008-01-01

    Criteria are presented that permit a straightforward partition of experiments into sets that can be modeled using both quantum probability and the classical probability framework of Kolmogorov. These new criteria concentrate on the operational aspects of the experiments and lead beyond the commonly appreciated partition by relating experiments to commuting and non-commuting quantum operators as well as non-entangled and entangled wavefunctions. In other words the space of experiments that can be understood using classical probability is larger than usually assumed. This knowledge provides advantages for areas such as nanoscience and engineering or quantum computation.

  18. The probability outcome correpondence principle : a dispositional view of the interpretation of probability statements

    NARCIS (Netherlands)

    Keren, G.; Teigen, K.H.

    2001-01-01

    This article presents a framework for lay people's internal representations of probabilities, which supposedly reflect the strength of underlying dispositions, or propensities, associated with the predicted event. From this framework, we derive the probability-outcome correspondence principle, which

  19. Intermediate-energy nuclear chemistry workshop

    Energy Technology Data Exchange (ETDEWEB)

    Butler, G.W.; Giesler, G.C.; Liu, L.C.; Dropesky, B.J.; Knight, J.D.; Lucero, F.; Orth, C.J.

    1981-05-01

    This report contains the proceedings of the LAMPF Intermediate-Energy Nuclear Chemistry Workshop held in Los Alamos, New Mexico, June 23-27, 1980. The first two days of the Workshop were devoted to invited review talks highlighting current experimental and theoretical research activities in intermediate-energy nuclear chemistry and physics. Working panels representing major topic areas carried out indepth appraisals of present research and formulated recommendations for future research directions. The major topic areas were Pion-Nucleus Reactions, Nucleon-Nucleus Reactions and Nuclei Far from Stability, Mesonic Atoms, Exotic Interactions, New Theoretical Approaches, and New Experimental Techniques and New Nuclear Chemistry Facilities.

  20. Intermediate-energy nuclear chemistry workshop

    International Nuclear Information System (INIS)

    Butler, G.W.; Giesler, G.C.; Liu, L.C.; Dropesky, B.J.; Knight, J.D.; Lucero, F.; Orth, C.J.

    1981-05-01

    This report contains the proceedings of the LAMPF Intermediate-Energy Nuclear Chemistry Workshop held in Los Alamos, New Mexico, June 23-27, 1980. The first two days of the Workshop were devoted to invited review talks highlighting current experimental and theoretical research activities in intermediate-energy nuclear chemistry and physics. Working panels representing major topic areas carried out indepth appraisals of present research and formulated recommendations for future research directions. The major topic areas were Pion-Nucleus Reactions, Nucleon-Nucleus Reactions and Nuclei Far from Stability, Mesonic Atoms, Exotic Interactions, New Theoretical Approaches, and New Experimental Techniques and New Nuclear Chemistry Facilities

  1. MNE Entrepreneurial Capabilities at Intermediate Levels

    DEFF Research Database (Denmark)

    Hoenen, Anne K.; Nell, Phillip Christopher; Ambos, Björn

    2014-01-01

    … at intermediate geographical levels differ from local subsidiaries and global corporate headquarters, and why those differences are important. We illustrate our arguments using data on European regional headquarters (RHQs). We find that RHQs' entrepreneurial capabilities depend on their external embeddedness and on the heterogeneous information that is generated through dissimilar markets within the region. Our study opens up an interesting discussion of the independence of these mechanisms. In sum, we contribute to the understanding of the entrepreneurial role of intermediate units in general and RHQs in particular.

  2. On financial equilibrium with intermediation costs

    DEFF Research Database (Denmark)

    Markeprand, Tobias Ejnar

    2008-01-01

    This paper studies the set of competitive equilibria in financial economies with intermediation costs. We consider an arbitrary dividend structure, which includes options and equity with limited liabilities. We show a general existence result and upper-hemi continuity of the equilibrium correspondence.

  3. Governance-Default Risk Relationship and the Demand for Intermediated and Non-Intermediated Debt

    Directory of Open Access Journals (Sweden)

    Husam Aldamen

    2012-09-01

    Full Text Available This paper explores the impact of corporate governance on the demand for intermediated debt (asset finance, bank debt, non-bank private debt) and non-intermediated debt (public debt) in the Australian debt market. Relative to other countries, the Australian debt market is characterised by higher proportions of intermediated or private debt with a lower inherent level of information asymmetry, in that private lenders have greater access to financial information (Gray, Koh & Tong 2009). Our firm-level, cross-sectional evidence suggests that higher corporate governance impacts demand for debt via the mitigation of default risk. However, this relationship is not uniform across all debt types. Intermediated debt such as bank and asset finance debt is more responsive to changes in the governance-default risk relationship than non-bank and non-intermediated debt. The implication is that a firm's demand for different debt types will reflect its governance-default risk profile.

  4. Modelling the probability of building fires

    Directory of Open Access Journals (Sweden)

    Vojtěch Barták

    2014-12-01

    Full Text Available Systematic spatial risk analysis plays a crucial role in preventing emergencies. In the Czech Republic, risk mapping is currently based on the risk accumulation principle, area vulnerability, and preparedness levels of Integrated Rescue System components. Expert estimates are used to determine risk levels for individual hazard types, while statistical modelling based on data from actual incidents and their possible causes is not used. Our model study, conducted in cooperation with the Fire Rescue Service of the Czech Republic as a model within the Liberec and Hradec Králové regions, presents an analytical procedure leading to the creation of building fire probability maps based on recent incidents in the studied areas and on building parameters. In order to estimate the probability of building fires, a prediction model based on logistic regression was used. Probability of fire calculated by means of model parameters and attributes of specific buildings can subsequently be visualized in probability maps.
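
    A minimal sketch of this kind of model, assuming entirely synthetic building attributes and incident labels (the real covariates and data are those of the Czech study):

    ```python
    # Logistic regression mapping building attributes to a fire probability.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(42)
    n = 1000
    X = np.column_stack([
        rng.integers(1, 12, n),      # number of floors
        rng.uniform(0.0, 12.0, n),   # building age in decades
        rng.integers(0, 2, n),       # 1 = wooden construction
    ])
    # Synthetic ground truth: older, taller, wooden buildings burn more often.
    logit = -4 + 0.15 * X[:, 0] + 0.2 * X[:, 1] + 1.0 * X[:, 2]
    y = rng.random(n) < 1 / (1 + np.exp(-logit))

    model = LogisticRegression(max_iter=1000).fit(X, y)
    # Fire probability for one hypothetical building (8 floors, 95 years old, wooden):
    print(model.predict_proba([[8, 9.5, 1]])[0, 1])
    ```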

  5. Encounter Probability of Individual Wave Height

    DEFF Research Database (Denmark)

    Liu, Z.; Burcharth, H. F.

    1998-01-01

    … wave height corresponding to a certain exceedence probability within a structure lifetime (encounter probability), based on the statistical analysis of long-term extreme significant wave height. Then the design individual wave height is calculated as the expected maximum individual wave height associated with the design significant wave height, with the assumption that the individual wave heights follow the Rayleigh distribution. However, the exceedence probability of such a design individual wave height within the structure lifetime is unknown. The paper presents a method for the determination of the design individual wave height corresponding to an exceedence probability within the structure lifetime, given the long-term extreme significant wave height. The method can also be applied for estimation of the number of relatively large waves for fatigue analysis of constructions.
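
    Two of the quantities involved are simple to compute. A minimal sketch using the standard textbook formulas, not necessarily those of the paper (the return period, lifetime, and sea-state values are hypothetical):

    ```python
    # Encounter probability of a T-year event within an L-year lifetime, plus the
    # classical Rayleigh-based approximation for the expected maximum of N waves.
    import numpy as np

    T = 100      # return period of the design significant wave height, years
    L = 50       # structure lifetime, years
    encounter_prob = 1 - (1 - 1 / T) ** L
    print(f"Encounter probability: {encounter_prob:.2f}")

    Hs, N = 6.0, 1000.0   # significant wave height [m], number of waves in the storm
    h_max = Hs * (np.sqrt(np.log(N) / 2) + 0.5772 / (2 * np.sqrt(2 * np.log(N))))
    print(f"Expected maximum individual wave height: {h_max:.2f} m")
    ```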

  6. Predicting binary choices from probability phrase meanings.

    Science.gov (United States)

    Wallsten, Thomas S; Jang, Yoonhee

    2008-08-01

    The issues of how individuals decide which of two events is more likely and of how they understand probability phrases both involve judging relative likelihoods. In this study, we investigated whether derived scales representing probability phrase meanings could be used within a choice model to predict independently observed binary choices. If they can, this simultaneously provides support for our model and suggests that the phrase meanings are measured meaningfully. The model assumes that, when deciding which of two events is more likely, judges take a single sample from memory regarding each event and respond accordingly. The model predicts choice probabilities by using the scaled meanings of individually selected probability phrases as proxies for confidence distributions associated with sampling from memory. Predictions are sustained for 34 of 41 participants but, nevertheless, are biased slightly low. Sequential sampling models improve the fit. The results have both theoretical and applied implications.

  7. Certainties and probabilities of the IPCC

    International Nuclear Information System (INIS)

    2004-01-01

    Based on an analysis of information about the climate evolution, simulations of a global warming and the snow coverage monitoring of Meteo-France, the IPCC presented its certainties and probabilities concerning the greenhouse effect. (A.L.B.)

  8. The probability factor in establishing causation

    International Nuclear Information System (INIS)

    Hebert, J.

    1988-01-01

    This paper discusses the possibilities and limitations of methods using the probability factor in establishing the causal link between bodily injury, whether immediate or delayed, and the nuclear incident presumed to have caused it (NEA)

  9. Bayesian optimization for computationally extensive probability distributions.

    Science.gov (United States)

    Tamura, Ryo; Hukushima, Koji

    2018-01-01

    An efficient method for finding a better maximizer of computationally extensive probability distributions is proposed on the basis of a Bayesian optimization technique. A key idea of the proposed method is to use extreme values of acquisition functions by Gaussian processes for the next training phase, which should be located near a local maximum or a global maximum of the probability distribution. Our Bayesian optimization technique is applied to the posterior distribution in the effective physical model estimation, which is a computationally extensive probability distribution. Even when the number of sampling points on the posterior distributions is fixed to be small, the Bayesian optimization provides a better maximizer of the posterior distributions in comparison to those by the random search method, the steepest descent method, or the Monte Carlo method. Furthermore, the Bayesian optimization improves the results efficiently by combining the steepest descent method and thus it is a powerful tool to search for a better maximizer of computationally extensive probability distributions.
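
    A minimal sketch of the idea, with a cheap stand-in for the computationally extensive distribution, an RBF kernel, and expected improvement as the acquisition function (all illustrative choices, not necessarily the authors'):

    ```python
    # Bayesian optimization loop: fit a GP to the sampled points, maximize an
    # acquisition function on a grid, evaluate the target there, repeat.
    import numpy as np
    from scipy.stats import norm
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    def target(x):  # cheap stand-in for an expensive (log-)posterior density
        return np.exp(-(x - 2) ** 2) + 0.8 * np.exp(-(x + 2) ** 2 / 0.5)

    rng = np.random.default_rng(0)
    X = rng.uniform(-4, 4, 5).reshape(-1, 1)    # small fixed budget of sampling points
    y = target(X).ravel()
    grid = np.linspace(-4, 4, 400).reshape(-1, 1)

    for _ in range(15):
        gp = GaussianProcessRegressor(kernel=RBF(1.0), alpha=1e-6, normalize_y=True).fit(X, y)
        mu, sd = gp.predict(grid, return_std=True)
        z = (mu - y.max()) / np.maximum(sd, 1e-9)
        ei = (mu - y.max()) * norm.cdf(z) + sd * norm.pdf(z)   # expected improvement
        x_next = grid[np.argmax(ei)].reshape(1, -1)            # next training point
        X, y = np.vstack([X, x_next]), np.append(y, target(x_next).ravel())

    print(f"best maximizer found: x = {X[np.argmax(y)][0]:.3f}, value = {y.max():.3f}")
    ```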

  10. Characteristic length of the knotting probability revisited

    International Nuclear Information System (INIS)

    Uehara, Erica; Deguchi, Tetsuo

    2015-01-01

    We present a self-avoiding polygon (SAP) model for circular DNA in which the radius of impermeable cylindrical segments corresponds to the screening length of double-stranded DNA surrounded by counter ions. For the model we evaluate the probability for a generated SAP with N segments having a given knot K through simulation. We call it the knotting probability of a knot K with N segments for the SAP model. We show that when N is large the most significant factor in the knotting probability is given by the exponentially decaying part exp(−N/N_K), where the estimates of parameter N_K are consistent with the same value for all the different knots we investigated. We thus call it the characteristic length of the knotting probability. We give formulae expressing the characteristic length as a function of the cylindrical radius r_ex, i.e. the screening length of double-stranded DNA. (paper)
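
    Extracting N_K from simulation output is a one-line curve fit. A minimal sketch with synthetic knotting probabilities (the data points are invented for illustration):

    ```python
    # Fit P(K; N) ~ C * exp(-N / N_K) to estimate the characteristic length N_K.
    import numpy as np
    from scipy.optimize import curve_fit

    def model(N, C, NK):
        return C * np.exp(-N / NK)   # the dominant exponentially decaying factor

    N = np.array([100, 200, 400, 800, 1600], dtype=float)
    P_knot = np.array([0.061, 0.058, 0.051, 0.041, 0.026])   # synthetic P(K; N)

    (C, NK), _ = curve_fit(model, N, P_knot, p0=(0.1, 2000))
    print(f"characteristic length N_K ~ {NK:.0f} segments")
    ```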

  11. Probability of Survival Decision Aid (PSDA)

    National Research Council Canada - National Science Library

    Xu, Xiaojiang; Amin, Mitesh; Santee, William R

    2008-01-01

    A Probability of Survival Decision Aid (PSDA) is developed to predict survival time for hypothermia and dehydration during prolonged exposure at sea in both air and water for a wide range of environmental conditions...

  12. Probability and statistics with integrated software routines

    CERN Document Server

    Deep, Ronald

    2005-01-01

    Probability & Statistics with Integrated Software Routines is a calculus-based treatment of probability concurrent with and integrated with statistics through interactive, tailored software applications designed to enhance the phenomena of probability and statistics. The software programs make the book unique. The book comes with a CD containing the interactive software leading to the Statistical Genie. The student can issue commands repeatedly while making parameter changes to observe the effects. Computer programming is an excellent skill for problem solvers, involving design, prototyping, data gathering, testing, redesign, validating, etc., all wrapped up in the scientific method. See also: CD to accompany Probability and Stats with Integrated Software Routines (0123694698). * Incorporates more than 1,000 engaging problems with answers * Includes more than 300 solved examples * Uses varied problem solving methods

  13. Determining probabilities of geologic events and processes

    International Nuclear Information System (INIS)

    Hunter, R.L.; Mann, C.J.; Cranwell, R.M.

    1985-01-01

    The Environmental Protection Agency has recently published a probabilistic standard for releases of high-level radioactive waste from a mined geologic repository. The standard sets limits for contaminant releases with more than one chance in 100 of occurring within 10,000 years, and less strict limits for releases of lower probability. The standard offers no methods for determining probabilities of geologic events and processes, and no consensus exists in the waste-management community on how to do this. Sandia National Laboratories is developing a general method for determining probabilities of a given set of geologic events and processes. In addition, we will develop a repeatable method for dealing with events and processes whose probability cannot be determined. 22 refs., 4 figs

  14. Pre-Aggregation with Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    2006-01-01

    Motivated by the increasing need to analyze complex, uncertain multidimensional data, this paper proposes probabilistic OLAP queries that are computed using probability distributions rather than atomic values. The paper describes how to create probability distributions from base data, and how the distributions can be subsequently used in pre-aggregation. Since the probability distributions can become large, we show how to achieve good time and space efficiency by approximating the distributions. We present the results of several experiments that demonstrate the effectiveness of our methods. The work is motivated with a real-world case study, based on our collaboration with a leading Danish vendor of location-based services. This paper is the first to consider the approximate processing of probabilistic OLAP queries over probability distributions.

  15. Probability of spent fuel transportation accidents

    International Nuclear Information System (INIS)

    McClure, J.D.

    1981-07-01

    The transported volume of spent fuel, incident/accident experience and accident environment probabilities were reviewed in order to provide an estimate of spent fuel accident probabilities. In particular, the accident review assessed the accident experience for large casks of the type that could transport spent (irradiated) nuclear fuel. This review determined that since 1971, the beginning of official US Department of Transportation record keeping for accidents/incidents, there has been one spent fuel transportation accident. This information, coupled with estimated annual shipping volumes for spent fuel, indicated an estimated annual probability of a spent fuel transport accident of 5 × 10⁻⁷ spent fuel accidents per mile. This is consistent with ordinary truck accident rates. A comparison of accident environments and regulatory test environments suggests that the probability of truck accidents exceeding regulatory test for impact is approximately 10⁻⁹/mile
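
    A back-of-the-envelope check of what such a per-mile rate implies; a minimal sketch treating accidents as a simple Poisson process, with a hypothetical annual shipment mileage:

    ```python
    # Expected accidents and the probability of at least one, given a per-mile rate.
    import math

    rate_per_mile = 5e-7     # accident rate quoted in the review
    miles_per_year = 1e5     # hypothetical annual shipment mileage
    lam = rate_per_mile * miles_per_year

    print(f"expected accidents per year: {lam:.3f}")
    print(f"P(at least one accident in a year): {1 - math.exp(-lam):.3f}")
    ```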

  16. Sampling, Probability Models and Statistical Reasoning: Statistical Inference

    Indian Academy of Sciences (India)

    Sampling, Probability Models and Statistical Reasoning: Statistical Inference. Mohan Delampady and V. R. Padmawar. General Article, Resonance – Journal of Science Education, Volume 1, Issue 5, May 1996, pp. 49-58.

  17. Imprecise Probability Methods for Weapons UQ

    Energy Technology Data Exchange (ETDEWEB)

    Picard, Richard Roy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Vander Wiel, Scott Alan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-13

    Building on recent work in uncertainty quantification, we examine the use of imprecise probability methods to better characterize expert knowledge and to improve on misleading aspects of Bayesian analysis with informative prior distributions. Quantitative approaches to incorporate uncertainties in weapons certification are subject to rigorous external peer review, and in this regard, certain imprecise probability methods are well established in the literature and attractive. These methods are illustrated using experimental data from LANL detonator impact testing.

  18. Escape and transmission probabilities in cylindrical geometry

    International Nuclear Information System (INIS)

    Bjerke, M.A.

    1980-01-01

    An improved technique for the generation of escape and transmission probabilities in cylindrical geometry was applied to the existing resonance cross section processing code ROLAIDS. The algorithm of Hwang and Toppel, [ANL-FRA-TM-118] (with modifications) was employed. The probabilities generated were found to be as accurate as those given by the method previously applied in ROLAIDS, while requiring much less computer core storage and CPU time

  19. Probability and statistics for computer science

    CERN Document Server

    Johnson, James L

    2011-01-01

    Comprehensive and thorough development of both probability and statistics for serious computer scientists; goal-oriented: "to present the mathematical analysis underlying probability results." Special emphases on simulation and discrete decision theory. Mathematically-rich, but self-contained text, at a gentle pace. Review of calculus and linear algebra in an appendix. Mathematical interludes (in each chapter) which examine mathematical techniques in the context of probabilistic or statistical importance. Numerous section exercises, summaries, historical notes, and Further Readings for reinforcement.

  20. Collision Probabilities for Finite Cylinders and Cuboids

    Energy Technology Data Exchange (ETDEWEB)

    Carlvik, I

    1967-05-15

    Analytical formulae have been derived for the collision probabilities of homogeneous finite cylinders and cuboids. The formula for the finite cylinder contains double integrals, and the formula for the cuboid only single integrals. Collision probabilities have been calculated by means of the formulae and compared with values obtained by other authors. It was found that the calculations using the analytical formulae are much quicker and give higher accuracy than Monte Carlo calculations.

  1. High throughput nonparametric probability density estimation.

    Science.gov (United States)

    Farmer, Jenny; Jacobs, Donald

    2018-01-01

    In high throughput applications, such as those found in bioinformatics and finance, it is important to determine accurate probability distribution functions despite only minimal information about data characteristics, and without using human subjectivity. Such an automated process for univariate data is implemented to achieve this goal by merging the maximum entropy method with single order statistics and maximum likelihood. The only required properties of the random variables are that they are continuous and that they are, or can be approximated as, independent and identically distributed. A quasi-log-likelihood function based on single order statistics for sampled uniform random data is used to empirically construct a sample size invariant universal scoring function. Then a probability density estimate is determined by iteratively improving trial cumulative distribution functions, where better estimates are quantified by the scoring function that identifies atypical fluctuations. This criterion resists under- and over-fitting of data as an alternative to employing the Bayesian or Akaike information criterion. Multiple estimates for the probability density reflect uncertainties due to statistical fluctuations in random samples. Scaled quantile residual plots are also introduced as an effective diagnostic to visualize the quality of the estimated probability densities. Benchmark tests show that estimates for the probability density function (PDF) converge to the true PDF as sample size increases on particularly difficult test probability densities that include cases with discontinuities, multi-resolution scales, heavy tails, and singularities. These results indicate the method has general applicability for high throughput statistical inference.
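
    The scaled quantile residual diagnostic mentioned at the end can be sketched directly: transform the sorted sample with a trial CDF and standardize against the known moments of uniform order statistics. A minimal sketch (the normal trial distribution and sample are placeholders):

    ```python
    # Scaled quantile residuals: under a correct trial CDF, the transformed order
    # statistics behave like uniform order statistics with known mean and variance.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    x = np.sort(rng.normal(size=500))
    n = x.size

    u = stats.norm.cdf(x)                  # trial CDF evaluated at the order statistics
    k = np.arange(1, n + 1)
    mean = k / (n + 1)                     # E[U_(k)] for uniform order statistics
    sd = np.sqrt(k * (n - k + 1)) / ((n + 1) * np.sqrt(n + 2))   # sqrt(Var[U_(k)])

    sqr = (u - mean) / sd                  # scaled quantile residuals
    print(f"max |SQR| = {np.abs(sqr).max():.2f}  (large values flag a poor estimate)")
    ```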

  2. Trusted intermediating agents in electronic trade networks

    NARCIS (Netherlands)

    T.B. Klos (Tomas); F. Alkemade (Floortje)

    2005-01-01

    Electronic commerce and trading of information goods significantly impact the role of intermediaries: consumers can bypass intermediating agents by forming direct links to producers. One reason that traditional intermediaries can still make a profit, is that they have more knowledge of

  3. What Should be Taught in Intermediate Macroeconomics?

    Science.gov (United States)

    de Araujo, Pedro; O'Sullivan, Roisin; Simpson, Nicole B.

    2013-01-01

    A lack of consensus remains on what should form the theoretical core of the undergraduate intermediate macroeconomic course. In determining how to deal with the Keynesian/classical divide, instructors must decide whether to follow the modern approach of building macroeconomic relationships from micro foundations, or to use the traditional approach…

  4. Bridge: Intelligent Tutoring with Intermediate Representations

    Science.gov (United States)

    1988-05-01

    Research and Development Center and Psychology Department, University of Pittsburgh, Pittsburgh, PA 15260. The Artificial Intelligence and Psychology … problem never introduces more than one unfamiliar plan. … You must have a … The requirements are specified at four different levels, corresponding to …

  5. Essays in corporate finance and financial intermediation

    NARCIS (Netherlands)

    Kempf, Elisabeth

    2016-01-01

    This thesis consists of three chapters in corporate finance and financial intermediation. The first two chapters explore sources of incentives and learning for finance professionals. Specifically, the first chapter studies how the option to go work for an investment bank affects the incentives of

  6. Being back home after intermediate care

    DEFF Research Database (Denmark)

    Martinsen, Bente; Harder, Ingegerd; Norlyk, Annelise

    2015-01-01

    Older people may face many challenges and experience insecurity after discharge from hospital to home. To bridge the potential gap between general hospital and home, the concept ‘Intermediate Care’ (IC) was developed at the beginning of 2000. IC aims to safeguard older people from being discharge...

  7. Financial intermediation with credit constrained agents

    Czech Academy of Sciences Publication Activity Database

    Boháček, Radim

    2007-01-01

    Roč. 29, č. 4 (2007), s. 741-759 ISSN 0164-0704 R&D Projects: GA AV ČR IAA700850602 Institutional research plan: CEZ:AV0Z70850503 Keywords : financial intermediation * occupational choice * general equilibrium Subject RIV: AH - Economics Impact factor: 0.360, year: 2007

  8. Changes to the Intermediate Accounting Course Sequence

    Science.gov (United States)

    Davidson, Lesley H.; Francisco, William H.

    2009-01-01

    There is an ever-growing amount of information that must be covered in Intermediate Accounting courses. Due to recent accounting standards and the implementation of IFRS this trend is likely to continue. This report incorporates the results of a recent survey to examine the trend of spending more course time to cover this additional material.…

  9. Essays in financial intermediation and political economy

    NARCIS (Netherlands)

    Luo, Mancy

    2017-01-01

    This thesis consists of three chapters in financial intermediation and political economy. The first chapter studies how investors’ preference for local stocks affects global mutual funds’ investment behaviors, and shows that mutual funds overweight stocks from their client countries (i.e., where

  10. Intermediality and politics in theatre and performance

    NARCIS (Netherlands)

    Dapp, G.S.

    2013-01-01

    This dissertation applies the concepts of intermediality and politics to five performances by Rimini Protokoll, Christoph Schlingensief, and Igneous, and analyzes the implications that emerge on both a significational and a theoretical level. Based on the specific mediality involved, it argues that

  11. Intermediates, Catalysts, Persistence, and Boundary Steady States

    DEFF Research Database (Denmark)

    Marcondes de Freitas, Michael; Feliu, Elisenda; Wiuf, Carsten

    2017-01-01

    networks without breaking known necessary or sufficient conditions for persistence, by iteratively removing so-called intermediates and catalysts from the network. The procedures are easy to apply and, in many cases, lead to highly simplified network structures, such as monomolecular networks. For specific...

  12. Intermediates and Generic Convergence to Equilibria

    DEFF Research Database (Denmark)

    Marcondes de Freitas, Michael; Wiuf, Carsten; Feliu, Elisenda

    2017-01-01

    Known graphical conditions for the generic and global convergence to equilibria of the dynamical system arising from a reaction network are shown to be invariant under the so-called successive removal of intermediates, a systematic procedure to simplify the network, making the graphical conditions...

  13. Software Testing An ISEB Intermediate Certificate

    CERN Document Server

    Hambling, Brian

    2009-01-01

    Covering testing fundamentals, reviews, testing and risk, test management and test analysis, this book helps newly qualified software testers to learn the skills and techniques to take them to the next level. Written by leading authors in the field, this is the only official textbook of the ISEB Intermediate Certificate in Software Testing.

  14. C and C* among intermediate rings

    NARCIS (Netherlands)

    Sack, J.; Watson, S.

    2014-01-01

    Given a completely regular Hausdorff space X, an intermediate ring A(X) is a ring of real valued continuous functions between C*(X) and C(X). We discuss two correspondences between ideals in A(X) and z-filters on X, both reviewing old results and introducing new results. One correspondence, Z_A,

  15. Opening the Black Box of Intermediation

    DEFF Research Database (Denmark)

    Nowinska, Agnieszka

    This paper attempts to answer how external environmental factors affect intermediating firms within the maritime industry - the middlemen that play a very important role in the sector. The category encompasses firms such as liner and port agencies, freight forwarders and shipbrokers, who link sh…

  16. The Effect of Digital Stories on Enhancing Iranian Pre-intermediate EFL Learners' Listening Comprehension

    Directory of Open Access Journals (Sweden)

    Nasrin Hadidi Tamjid

    2012-11-01

    Full Text Available Learning a foreign language is a challenging process in which learners need motivation and encouragement through the use of modern techniques. The present paper investigates the effects digital stories may have on Iranian pre-intermediate EFL learners' listening comprehension. To this end, the researchers carried out a quasi-experimental study in a language institute in Tabriz (Iran). In total, forty 11- to 14-year-old female students participated in this research. Twenty students were in the experimental group and twenty in the control group. The Preliminary English Test (PET) was administered at the beginning of the study to check whether all participants were homogeneous in terms of English language proficiency. A pre-test of listening comprehension was designed to gather initial data on the learners' listening skill prior to the treatment. The experimental group was presented with digital stories in a technology-equipped classroom. After the treatment, a post-test was administered to both groups to test the learners' progress in listening comprehension. Then, using an ANCOVA test, the performance of the two groups was compared. The findings indicated that the experimental group outperformed the control group in the final test. The results raise interesting issues related to the use of technology in the context of foreign language learning, substantiating the link between a technology-rich environment and improved language learning.

  17. Topological probability and connection strength induced activity in complex neural networks

    International Nuclear Information System (INIS)

    Du-Qu, Wei; Bo, Zhang; Dong-Yuan, Qiu; Xiao-Shu, Luo

    2010-01-01

    Recent experimental evidence suggests that some brain activities can be assigned to small-world networks. In this work, we investigate how the topological probability p and connection strength C affect the activities of discrete neural networks with small-world (SW) connections. Network elements are described by two-dimensional map neurons (2DMNs) with parameter values at which no activity occurs. It is found that when p is too small or too large, there are no active neurons in the network, no matter what the value of the connection strength is; for a given appropriate connection strength, there is an intermediate range of topological probability where the activity of the 2DMN network is induced and enhanced. On the other hand, for a given intermediate topological probability level, there exists an optimal value of connection strength such that the frequency of activity reaches its maximum. The possible mechanism behind the action of topological probability and connection strength is addressed based on the bifurcation method. Furthermore, the effects of noise and transmission delay on the activity of the neural network are also studied. (general)
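
    The setup is straightforward to explore numerically. A minimal sketch, with Rulkov map neurons standing in for the paper's 2DMNs; the network model, coupling form, and all parameter values are illustrative choices, not tuned to reproduce the paper's regime:

    ```python
    # Activity of map neurons diffusively coupled on a Watts-Strogatz small-world
    # network, as a function of the rewiring probability p and coupling strength C.
    import networkx as nx
    import numpy as np

    def mean_activity(p, C, n=100, k=4, steps=3000, alpha=4.3, sigma=-1.2, mu=0.001):
        """Fraction of (neuron, time) pairs with x > 0 after a transient."""
        A = nx.to_numpy_array(nx.watts_strogatz_graph(n, k, p, seed=1))
        deg = A.sum(axis=1)
        rng = np.random.default_rng(0)
        x = -1.0 + 0.1 * rng.random(n)
        y = np.full(n, -2.9)
        active = 0
        for t in range(steps):
            x_new = alpha / (1 + x ** 2) + y + C * (A @ x - deg * x)  # fast variable
            y = y - mu * (x - sigma)                                   # slow variable
            x = x_new
            if t >= 1000:
                active += np.count_nonzero(x > 0)                      # crude spike count
        return active / (n * (steps - 1000))

    for p in (0.01, 0.1, 0.3, 1.0):
        print(f"p = {p:>4}: activity = {mean_activity(p, C=0.02):.4f}")
    ```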

  18. Causal inference, probability theory, and graphical insights.

    Science.gov (United States)

    Baker, Stuart G

    2013-11-10

    Causal inference from observational studies is a fundamental topic in biostatistics. The causal graph literature typically views probability theory as insufficient to express causal concepts in observational studies. In contrast, the view here is that probability theory is a desirable and sufficient basis for many topics in causal inference for the following two reasons. First, probability theory is generally more flexible than causal graphs: Besides explaining such causal graph topics as M-bias (adjusting for a collider) and bias amplification and attenuation (when adjusting for instrumental variable), probability theory is also the foundation of the paired availability design for historical controls, which does not fit into a causal graph framework. Second, probability theory is the basis for insightful graphical displays including the BK-Plot for understanding Simpson's paradox with a binary confounder, the BK2-Plot for understanding bias amplification and attenuation in the presence of an unobserved binary confounder, and the PAD-Plot for understanding the principal stratification component of the paired availability design. Published 2013. This article is a US Government work and is in the public domain in the USA.
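
    A BK-Plot is a graphical device, but the Simpson's paradox it depicts is easy to reproduce numerically. A minimal sketch using counts in the style of the classic kidney-stone example (the numbers are illustrative):

    ```python
    # Simpson's paradox with a binary confounder (severity): the treatment looks
    # better within each stratum, yet worse in the pooled table.
    groups = {  # stratum: (treated_success, treated_total, control_success, control_total)
        "mild":   (81, 87, 234, 270),
        "severe": (192, 263, 55, 80),
    }

    t_s = sum(g[0] for g in groups.values()); t_n = sum(g[1] for g in groups.values())
    c_s = sum(g[2] for g in groups.values()); c_n = sum(g[3] for g in groups.values())

    for name, (ts, tn, cs, cn) in groups.items():
        print(f"{name:>6}: treated {ts/tn:.2f} vs control {cs/cn:.2f}")
    print(f"pooled: treated {t_s/t_n:.2f} vs control {c_s/c_n:.2f}  (direction reverses)")
    ```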

  19. On the Possibility of Assigning Probabilities to Singular Cases, or: Probability Is Subjective Too!

    Directory of Open Access Journals (Sweden)

    Mark R. Crovelli

    2009-06-01

    Full Text Available Both Ludwig von Mises and Richard von Mises claimed that numerical probability could not be legitimately applied to singular cases. This paper challenges this aspect of the von Mises brothers' theory of probability. It is argued that their denial that numerical probability could be applied to singular cases was based solely upon Richard von Mises' exceptionally restrictive definition of probability. This paper challenges Richard von Mises' definition of probability by arguing that the definition of probability necessarily depends upon whether the world is governed by time-invariant causal laws. It is argued that if the world is governed by time-invariant causal laws, a subjective definition of probability must be adopted. It is further argued that the nature of human action and the relative frequency method for calculating numerical probabilities both presuppose that the world is indeed governed by time-invariant causal laws. It is finally argued that the subjective definition of probability undercuts the von Mises claim that numerical probability cannot legitimately be applied to singular, non-replicable cases.

  20. Uncertainty relation and probability. Numerical illustration

    International Nuclear Information System (INIS)

    Fujikawa, Kazuo; Umetsu, Koichiro

    2011-01-01

    The uncertainty relation and the probability interpretation of quantum mechanics are intrinsically connected, as is evidenced by the evaluation of standard deviations. It is thus natural to ask if one can associate a very small uncertainty product of suitably sampled events with a very small probability. We have shown elsewhere that some examples of the evasion of the uncertainty relation noted in the past are in fact understood in this way. We here numerically illustrate that a very small uncertainty product is realized if one performs a suitable sampling of measured data that occur with a very small probability. We introduce a notion of cyclic measurements. It is also shown that our analysis is consistent with the Landau-Pollak-type uncertainty relation. It is suggested that the present analysis may help reconcile the contradicting views about the 'standard quantum limit' in the detection of gravitational waves. (author)

  1. Scoring Rules for Subjective Probability Distributions

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd

    The theoretical literature has a rich characterization of scoring rules for eliciting the subjective beliefs that an individual has for continuous events, but under the restrictive assumption of risk neutrality. It is well known that risk aversion can dramatically affect the incentives to correctly report the true subjective probability of a binary event, even under Subjective Expected Utility. To address this one can "calibrate" inferences about true subjective probabilities from elicited subjective probabilities over binary events, recognizing the incentives that risk averse agents have to distort reports. We characterize the comparable implications of the general case of a risk averse agent when facing a popular scoring rule over continuous events, and find that these concerns do not apply with anything like the same force. For empirically plausible levels of risk aversion, one can …
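
    The binary-event distortion is easy to see numerically. A minimal sketch assuming a quadratic scoring rule and CRRA utility (both standard modeling choices, not necessarily the paper's):

    ```python
    # A risk-neutral agent reports the true probability under a quadratic scoring
    # rule; a risk-averse agent's optimal report drifts toward 0.5.
    import numpy as np

    def quadratic_score(report, outcome):          # outcome in {0, 1}
        return 1 - (outcome - report) ** 2

    def expected_utility(report, true_p, crra):
        u = lambda w: np.log(w) if crra == 1 else w ** (1 - crra) / (1 - crra)
        return (true_p * u(1 + quadratic_score(report, 1))
                + (1 - true_p) * u(1 + quadratic_score(report, 0)))

    true_p = 0.7
    grid = np.linspace(0.01, 0.99, 981)
    for crra in (0.0, 0.5, 2.0):
        best = grid[np.argmax([expected_utility(r, true_p, crra) for r in grid])]
        print(f"CRRA {crra}: optimal report {best:.2f}")
    ```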

  2. Comparing coefficients of nested nonlinear probability models

    DEFF Research Database (Denmark)

    Kohler, Ulrich; Karlson, Kristian Bernt; Holm, Anders

    2011-01-01

    In a series of recent articles, Karlson, Holm and Breen have developed a method for comparing the estimated coefficients of two nested nonlinear probability models. This article describes this method and the user-written program khb that implements the method. The KHB-method is a general decomposition method that is unaffected by the rescaling or attenuation bias that arise in cross-model comparisons in nonlinear models. It recovers the degree to which a control variable, Z, mediates or explains the relationship between X and a latent outcome variable, Y*, underlying the nonlinear probability …

  3. A basic course in probability theory

    CERN Document Server

    Bhattacharya, Rabi

    2016-01-01

    This text develops the necessary background in probability theory underlying diverse treatments of stochastic processes and their wide-ranging applications. In this second edition, the text has been reorganized for didactic purposes, new exercises have been added and basic theory has been expanded. General Markov dependent sequences and their convergence to equilibrium is the subject of an entirely new chapter. The introduction of conditional expectation and conditional probability very early in the text maintains the pedagogic innovation of the first edition; conditional expectation is illustrated in detail in the context of an expanded treatment of martingales, the Markov property, and the strong Markov property. Weak convergence of probabilities on metric spaces and Brownian motion are two topics to highlight. A selection of large deviation and/or concentration inequalities ranging from those of Chebyshev, Cramer–Chernoff, Bahadur–Rao, to Hoeffding has been added, with illustrative comparisons of their …

  4. Ignition probabilities for Compact Ignition Tokamak designs

    International Nuclear Information System (INIS)

    Stotler, D.P.; Goldston, R.J.

    1989-09-01

    A global power balance code employing Monte Carlo techniques has been developed to study the ''probability of ignition'' and has been applied to several different configurations of the Compact Ignition Tokamak (CIT). Probability distributions for the critical physics parameters in the code were estimated using existing experimental data. This included a statistical evaluation of the uncertainty in extrapolating the energy confinement time. A substantial probability of ignition is predicted for CIT if peaked density profiles can be achieved or if one of the two higher plasma current configurations is employed. In other cases, values of the energy multiplication factor Q of order 10 are generally obtained. The Ignitor-U and ARIES designs are also examined briefly. Comparisons of our empirically based confinement assumptions with two theory-based transport models yield conflicting results. 41 refs., 11 figs
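
    The Monte Carlo idea itself is compact: draw the uncertain physics parameters from assumed distributions and count the fraction of draws that ignite. A minimal toy sketch (the distributions, proxy power balance, and threshold are invented, not the actual CIT model):

    ```python
    # Toy "probability of ignition": sample uncertain parameters, apply a criterion.
    import numpy as np

    rng = np.random.default_rng(7)
    n = 100_000
    h_factor = rng.lognormal(mean=0.0, sigma=0.25, size=n)   # confinement multiplier
    density_peaking = rng.uniform(1.0, 2.0, size=n)          # profile peaking factor

    q_proxy = 10 * h_factor * density_peaking    # stand-in for the power balance output
    ignited = q_proxy >= 18                      # toy criterion for Q -> infinity

    print(f"probability of ignition ~ {ignited.mean():.2f}")
    print(f"median Q proxy ~ {np.median(q_proxy):.1f}")
    ```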

  5. Independent events in elementary probability theory

    Science.gov (United States)

    Csenki, Attila

    2011-07-01

    In Probability and Statistics taught to mathematicians as a first introduction or to a non-mathematical audience, joint independence of events is introduced by requiring that the multiplication rule is satisfied. The following statement is usually tacitly assumed to hold (and, at best, intuitively motivated): If the n events E_1, E_2, …, E_n are jointly independent then any two events A and B built in finitely many steps from two disjoint subsets of E_1, E_2, …, E_n are also independent. The operations 'union', 'intersection' and 'complementation' are permitted only when forming the events A and B. Here we examine this statement from the point of view of elementary probability theory. The approach described here is accessible also to users of probability theory and is believed to be novel.
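
    For small n the statement can be checked by brute force. A minimal sketch for n = 3, with A = E_1 ∪ E_2 and B = the complement of E_3 (the probabilities are arbitrary):

    ```python
    # Enumerate the 8-point product space of three independent events and verify
    # that A (built from E_1, E_2) and B (built from E_3) are independent.
    from itertools import product

    p = [0.3, 0.5, 0.8]                      # P(E_1), P(E_2), P(E_3)
    omega = list(product([0, 1], repeat=3))  # coordinate i = 1 means E_i occurs
    prob = {w: (p[0] if w[0] else 1 - p[0])
               * (p[1] if w[1] else 1 - p[1])
               * (p[2] if w[2] else 1 - p[2]) for w in omega}

    A = {w for w in omega if w[0] or w[1]}   # A = E_1 union E_2
    B = {w for w in omega if not w[2]}       # B = complement of E_3

    P = lambda event: sum(prob[w] for w in event)
    print(P(A & B), P(A) * P(B))             # equal: A and B are independent
    ```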

  6. Pointwise probability reinforcements for robust statistical inference.

    Science.gov (United States)

    Frénay, Benoît; Verleysen, Michel

    2014-02-01

    Statistical inference using machine learning techniques may be difficult with small datasets because of abnormally frequent data (AFDs). AFDs are observations that are much more frequent in the training sample than they should be, with respect to their theoretical probability, and include e.g. outliers. Estimates of parameters tend to be biased towards models which support such data. This paper proposes to introduce pointwise probability reinforcements (PPRs): the probability of each observation is reinforced by a PPR and a regularisation allows controlling the amount of reinforcement which compensates for AFDs. The proposed solution is very generic, since it can be used to robustify any statistical inference method which can be formulated as a likelihood maximisation. Experiments show that PPRs can be easily used to tackle regression, classification and projection: models are freed from the influence of outliers. Moreover, outliers can be filtered manually since an abnormality degree is obtained for each observation. Copyright © 2013 Elsevier Ltd. All rights reserved.

  7. Uncertainty the soul of modeling, probability & statistics

    CERN Document Server

    Briggs, William

    2016-01-01

    This book presents a philosophical approach to probability and probabilistic thinking, considering the underpinnings of probabilistic reasoning and modeling, which effectively underlie everything in data science. The ultimate goal is to call into question many standard tenets and lay the philosophical and probabilistic groundwork and infrastructure for statistical modeling. It is the first book devoted to the philosophy of data aimed at working scientists and calls for a new consideration in the practice of probability and statistics to eliminate what has been referred to as the "Cult of Statistical Significance". The book explains the philosophy of these ideas and not the mathematics, though there are a handful of mathematical examples. The topics are logically laid out, starting with basic philosophy as related to probability, statistics, and science, and stepping through the key probabilistic ideas and concepts, and ending with statistical models. Its jargon-free approach asserts that standard methods, suc...

  8. Introduction to probability with statistical applications

    CERN Document Server

    Schay, Géza

    2016-01-01

    Now in its second edition, this textbook serves as an introduction to probability and statistics for non-mathematics majors who do not need the exhaustive detail and mathematical depth provided in more comprehensive treatments of the subject. The presentation covers the mathematical laws of random phenomena, including discrete and continuous random variables, expectation and variance, and common probability distributions such as the binomial, Poisson, and normal distributions. More classical examples such as Montmort's problem, the ballot problem, and Bertrand’s paradox are now included, along with applications such as the Maxwell-Boltzmann and Bose-Einstein distributions in physics. Key features in new edition: * 35 new exercises * Expanded section on the algebra of sets * Expanded chapters on probabilities to include more classical examples * New section on regression * Online instructors' manual containing solutions to all exercises

  9. Python for probability, statistics, and machine learning

    CERN Document Server

    Unpingco, José

    2016-01-01

    This book covers the key ideas that link probability, statistics, and machine learning illustrated using Python modules in these areas. The entire text, including all the figures and numerical results, is reproducible using the Python codes and their associated Jupyter/IPython notebooks, which are provided as supplementary downloads. The author develops key intuitions in machine learning by working meaningful examples using multiple analytical methods and Python codes, thereby connecting theoretical concepts to concrete implementations. Modern Python modules like Pandas, Sympy, and Scikit-learn are applied to simulate and visualize important machine learning concepts like the bias/variance trade-off, cross-validation, and regularization. Many abstract mathematical ideas, such as convergence in probability theory, are developed and illustrated with numerical examples. This book is suitable for anyone with an undergraduate-level exposure to probability, statistics, or machine learning and with rudimentary knowl...

  10. EARLY HISTORY OF GEOMETRIC PROBABILITY AND STEREOLOGY

    Directory of Open Access Journals (Sweden)

    Magdalena Hykšová

    2012-03-01

    The paper provides an account of the history of geometric probability and stereology from the time of Newton to the early 20th century. It depicts the development along two parallel ways: on the one hand, the theory of geometric probability was formed with minor attention paid to applications other than those concerning spatial chance games. On the other hand, practical rules for the estimation of area or volume fraction and other characteristics, easily deducible from geometric probability theory, were proposed without knowledge of this branch. Special attention is paid to the paper of J.-É. Barbier published in 1860, which contained the fundamental stereological formulas but remained almost unnoticed both by mathematicians and practitioners.

  11. Probability analysis of nuclear power plant hazards

    International Nuclear Information System (INIS)

    Kovacs, Z.

    1985-01-01

    The probability analysis of risk, used for quantifying the risk of complex technological systems, especially nuclear power plants, is described. Risk is defined as the product of the probability of the occurrence of a dangerous event and the significance of its consequences. The process of the analysis may be divided into the stage of power plant analysis up to the point of release of harmful material into the environment (reliability analysis) and the stage of the analysis of the consequences of this release and the assessment of the risk. The sequence of operations in the individual stages is characterized. The tasks facing Czechoslovakia in the development of the probability analysis of risk are listed, and the composition of the work team for coping with the task is recommended. (J.C.)

  12. Correlations and Non-Linear Probability Models

    DEFF Research Database (Denmark)

    Breen, Richard; Holm, Anders; Karlson, Kristian Bernt

    2014-01-01

    Although the parameters of logit and probit and other non-linear probability models are often explained and interpreted in relation to the regression coefficients of an underlying linear latent variable model, we argue that they may also be usefully interpreted in terms of the correlations between the dependent variable of the latent variable model and its predictor variables. We show how this correlation can be derived from the parameters of non-linear probability models, develop tests for the statistical significance of the derived correlation, and illustrate its usefulness in two applications. Under certain circumstances, which we explain, the derived correlation provides a way of overcoming the problems inherent in cross-sample comparisons of the parameters of non-linear probability models.

  13. Geometric modeling in probability and statistics

    CERN Document Server

    Calin, Ovidiu

    2014-01-01

    This book covers topics of Informational Geometry, a field which deals with the differential geometric study of the manifold of probability density functions. This is a field that is increasingly attracting the interest of researchers from many different areas of science, including mathematics, statistics, geometry, computer science, signal processing, physics and neuroscience. It is the authors’ hope that the present book will be a valuable reference for researchers and graduate students in one of the aforementioned fields. This textbook is a unified presentation of differential geometry and probability theory, and constitutes a text for a course directed at graduate or advanced undergraduate students interested in applications of differential geometry in probability and statistics. The book contains over 100 proposed exercises meant to help students deepen their understanding, and it is accompanied by software that is able to provide numerical computations of several information geometric objects. The reader...

  14. Fixation probability on clique-based graphs

    Science.gov (United States)

    Choi, Jeong-Ok; Yu, Unjong

    2018-02-01

    The fixation probability of a mutant in the evolutionary dynamics of the Moran process is calculated by the Monte-Carlo method on a few families of clique-based graphs. It is shown that the complete suppression of fixation can be realized with the generalized clique-wheel graph in the limit of small wheel-clique ratio and infinite size. The family of clique-star graphs is an amplifier, and the clique-arms graph changes from amplifier to suppressor as the fitness of the mutant increases. We demonstrate that the overall structure of a graph can be more important in determining the fixation probability than the degree or the heat heterogeneity. The dependence of the fixation probability on the position of the first mutant is discussed.
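
    For the well-mixed baseline (a complete graph), the fixation probability of a single mutant with relative fitness r has the classical closed form ρ = (1 − 1/r)/(1 − r^(−N)), so a Monte Carlo estimate of the kind used in the paper can be validated against it. A minimal sketch for the complete graph only; the clique-based graphs studied above would need an explicit adjacency structure:

    ```python
    import random

    def moran_fixation(N=20, r=1.5, trials=20_000, seed=1):
        """Monte Carlo fixation probability of a single mutant with fitness r
        in a well-mixed (complete-graph) Moran process of population size N."""
        rng = random.Random(seed)
        fixed = 0
        for _ in range(trials):
            k = 1  # current number of mutants
            while 0 < k < N:
                # reproducer chosen proportional to fitness; victim uniformly
                birth_is_mutant = rng.random() < k * r / (k * r + (N - k))
                death_is_mutant = rng.random() < k / N
                k += int(birth_is_mutant and not death_is_mutant)
                k -= int(death_is_mutant and not birth_is_mutant)
            fixed += (k == N)
        return fixed / trials

    N, r = 20, 1.5
    exact = (1 - 1 / r) / (1 - r ** -N)   # closed form for the complete graph
    print(f"simulated {moran_fixation(N, r):.3f} vs exact {exact:.3f}")
    ```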

  15. Isoporphyrin Intermediate in Heme Oxygenase Catalysis

    Science.gov (United States)

    Evans, John P.; Niemevz, Fernando; Buldain, Graciela; de Montellano, Paul Ortiz

    2008-01-01

    Human heme oxygenase-1 (hHO-1) catalyzes the O2- and NADPH-dependent oxidation of heme to biliverdin, CO, and free iron. The first step involves regiospecific insertion of an oxygen atom at the α-meso carbon by a ferric hydroperoxide and is predicted to proceed via an isoporphyrin π-cation intermediate. Here we report spectroscopic detection of a transient intermediate during oxidation by hHO-1 of α-meso-phenylheme-IX, α-meso-(p-methylphenyl)-mesoheme-III, and α-meso-(p-trifluoromethylphenyl)-mesoheme-III. In agreement with previous experiments (Wang, J., Niemevz, F., Lad, L., Huang, L., Alvarez, D. E., Buldain, G., Poulos, T. L., and Ortiz de Montellano, P. R. (2004) J. Biol. Chem. 279, 42593–42604), only the α-biliverdin isomer is produced, with concomitant formation of the corresponding benzoic acid. The transient intermediate observed in the NADPH-P450 reductase-catalyzed reaction accumulated when the reaction was supported by H2O2 and exhibited the absorption maxima at 435 and 930 nm characteristic of an isoporphyrin. Product analysis by reversed-phase high performance liquid chromatography and liquid chromatography electrospray ionization mass spectrometry of the product generated with H2O2 identified it as an isoporphyrin that, on quenching, decayed to benzoylbiliverdin. In the presence of H2(18)O2, one labeled oxygen atom was incorporated into these products. The hHO-1-isoporphyrin complexes were found to have half-lives of 1.7 and 2.4 h for the p-trifluoromethyl- and p-methyl-substituted phenylhemes, respectively. The addition of NADPH-P450 reductase to the H2O2-generated hHO-1-isoporphyrin complex produced α-biliverdin, confirming its role as a reaction intermediate. Identification of an isoporphyrin intermediate in the catalytic sequence of hHO-1, the first such intermediate observed in hemoprotein catalysis, completes our understanding of the critical first step of heme oxidation. PMID:18487208

  16. Time-resolved resonance Raman spectroscopy of intermediates of bacteriorhodopsin: The bK(590) intermediate.

    Science.gov (United States)

    Terner, J; Hsieh, C L; Burns, A R; El-Sayed, M A

    1979-07-01

    We have combined microbeam and flow techniques with computer subtraction methods to obtain the resonance Raman spectrum of the short lived batho-intermediate (bK(590)) of bacteriorhodopsin. Comparison of the spectra obtained in (1)H(2)O and (2)H(2)O, as well as the fact that the bK(590) intermediate shows large optical red shifts, suggests that the Schiff base linkage of this intermediate is protonated. The fingerprint region of the spectrum of bK(590), sensitive to the isomeric configuration of the retinal chromophore, does not resemble the corresponding region of the parent bR(570) form. The resonance Raman spectrum of bK(590) as well as the spectra of all of the other main intermediates in the photoreaction cycle of bacteriorhodopsin are discussed and compared with resonance Raman spectra of published model compounds.

  17. Duelling idiots and other probability puzzlers

    CERN Document Server

    Nahin, Paul J

    2002-01-01

    What are your chances of dying on your next flight, being called for jury duty, or winning the lottery? We all encounter probability problems in our everyday lives. In this collection of twenty-one puzzles, Paul Nahin challenges us to think creatively about the laws of probability as they apply in playful, sometimes deceptive, ways to a fascinating array of speculative situations. Games of Russian roulette, problems involving the accumulation of insects on flypaper, and strategies for determining the odds of the underdog winning the World Series all reveal intriguing dimensions to the worki...

  18. Proposal for Modified Damage Probability Distribution Functions

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup; Hansen, Peter Friis

    1996-01-01

    Immediately following the Estonia disaster, the Nordic countries established a project entitled "Safety of Passenger/RoRo Vessels". As part of this project, the present proposal for modified damage stability probability distribution functions has been developed and submitted to the "Sub-committee on st..."

  19. Probability densities and Lévy densities

    DEFF Research Database (Denmark)

    Barndorff-Nielsen, Ole Eiler

    For positive Lévy processes (i.e. subordinators) formulae are derived that express the probability density or the distribution function in terms of power series in time t. The applicability of the results to finance and to turbulence is briefly indicated.

  20. Probabilities from entanglement, Born's rule from envariance

    International Nuclear Information System (INIS)

    Zurek, W.

    2005-01-01

    I shall discuss consequences of envariance (environment-assisted invariance), a symmetry exhibited by entangled quantum states. I shall focus on the implications of envariance for the understanding of the origins and nature of ignorance, and, hence, for the origin of probabilities in physics. While the derivation of the Born rule for probabilities (p_k = |ψ_k|^2) is the principal accomplishment of this research, I shall explore the possibility that several other symptoms of the quantum-classical transition that are a consequence of decoherence can be justified directly by envariance -- i.e., without invoking Born's rule. (author)

  1. Risk Probability Estimating Based on Clustering

    DEFF Research Database (Denmark)

    Chen, Yong; Jensen, Christian D.; Gray, Elizabeth

    2003-01-01

    ...of prior experiences, recommendations from a trusted entity or the reputation of the other entity. In this paper we propose a dynamic mechanism for estimating the risk probability of a certain interaction in a given environment using hybrid neural networks. We argue that traditional risk assessment models from the insurance industry do not directly apply to ubiquitous computing environments. Instead, we propose a dynamic mechanism for risk assessment, which is based on pattern matching, classification and prediction procedures. This mechanism uses an estimator of risk probability, which is based...

  2. Fifty challenging problems in probability with solutions

    CERN Document Server

    Mosteller, Frederick

    1987-01-01

    Can you solve the problem of "The Unfair Subway"? Marvin gets off work at random times between 3 and 5 p.m. His mother lives uptown, his girlfriend downtown. He takes the first subway that comes in either direction and eats dinner with the one he is delivered to. His mother complains that he never comes to see her, but he says she has a 50-50 chance. He has had dinner with her twice in the last 20 working days. Explain. Marvin's adventures in probability are one of the fifty intriguing puzzles that illustrate both elementary and advanced aspects of probability, each problem designed to chall...

  3. Path probabilities of continuous time random walks

    International Nuclear Information System (INIS)

    Eule, Stephan; Friedrich, Rudolf

    2014-01-01

    Employing the path integral formulation of a broad class of anomalous diffusion processes, we derive the exact relations for the path probability densities of these processes. In particular, we obtain a closed analytical solution for the path probability distribution of a Continuous Time Random Walk (CTRW) process. This solution is given in terms of its waiting time distribution and short time propagator of the corresponding random walk as a solution of a Dyson equation. Applying our analytical solution we derive generalized Feynman–Kac formulae. (paper)

  4. Probable Gastrointestinal Toxicity of Kombucha Tea

    Science.gov (United States)

    Srinivasan, Radhika; Smolinske, Susan; Greenbaum, David

    1997-01-01

    Kombucha tea is a health beverage made by incubating the Kombucha “mushroom” in tea and sugar. Although therapeutic benefits have been attributed to the drink, neither its beneficial effects nor adverse side effects have been reported widely in the scientific literature. Side effects probably related to consumption of Kombucha tea are reported in four patients. Two presented with symptoms of allergic reaction, the third with jaundice, and the fourth with nausea, vomiting, and head and neck pain. In all four, use of Kombucha tea in proximity to onset of symptoms and symptom resolution on cessation of tea drinking suggest a probable etiologic association. PMID:9346462

  5. Quantum probability and quantum decision-making.

    Science.gov (United States)

    Yukalov, V I; Sornette, D

    2016-01-13

    A rigorous general definition of quantum probability is given, which is valid not only for elementary events but also for composite events, for operationally testable measurements as well as for inconclusive measurements, and also for non-commuting observables in addition to commutative observables. Our proposed definition of quantum probability makes it possible to describe quantum measurements and quantum decision-making on the same common mathematical footing. Conditions are formulated for the case when quantum decision theory reduces to its classical counterpart and for the situation where the use of quantum decision theory is necessary. © 2015 The Author(s).

  6. Lady luck the theory of probability

    CERN Document Server

    Weaver, Warren

    1982-01-01

    ""Should I take my umbrella?"" ""Should I buy insurance?"" ""Which horse should I bet on?"" Every day ― in business, in love affairs, in forecasting the weather or the stock market questions arise which cannot be answered by a simple ""yes"" or ""no."" Many of these questions involve probability. Probabilistic thinking is as crucially important in ordinary affairs as it is in the most abstruse realms of science. This book is the best nontechnical introduction to probability ever written. Its author, the late Dr. Warren Weaver, was a professor of mathematics, active in the Rockefeller and Sloa

  7. Bayesian estimation of core-melt probability

    International Nuclear Information System (INIS)

    Lewis, H.W.

    1984-01-01

    A very simple application of the canonical Bayesian algorithm is made to the problem of estimation of the probability of core melt in a commercial power reactor. An approximation to the results of the Rasmussen study on reactor safety is used as the prior distribution, and the observation that there has been no core melt yet is used as the single experiment. The result is a substantial decrease in the mean probability of core melt--factors of 2 to 4 for reasonable choices of parameters. The purpose is to illustrate the procedure, not to argue for the decrease
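
    The update described is easy to reproduce numerically: multiply a prior density over the core-melt rate by the Poisson probability of observing zero events, then renormalize. In the sketch below the lognormal prior parameters and the number of accumulated reactor-years are illustrative assumptions, not the values used in the paper.

    ```python
    import numpy as np

    def integrate(y, x):
        """Simple trapezoid rule (kept explicit for portability)."""
        return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

    # Hypothetical stand-in for the Rasmussen-study prior: a lognormal density
    # over the core-melt rate lam (events per reactor-year).
    lam = np.logspace(-7, -2, 4000)
    prior = np.exp(-(np.log(lam) - np.log(5e-5)) ** 2 / (2 * 1.4 ** 2)) / lam
    prior /= integrate(prior, lam)

    T = 2000.0                             # assumed: T reactor-years, no core melt
    posterior = prior * np.exp(-lam * T)   # Poisson likelihood of zero events
    posterior /= integrate(posterior, lam)

    print("prior mean     :", integrate(lam * prior, lam))
    print("posterior mean :", integrate(lam * posterior, lam))
    ```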

  8. The Impact of Embedded Story Structures versus Sequential Story Structures on Critical Thinking of Iranian Intermediate EFL Learners

    Directory of Open Access Journals (Sweden)

    Sara Samadi

    2016-09-01

    Confirming the constructive effects of reading comprehension on critical thinking, this paper investigates the impact of story structures on the critical thinking of Iranian EFL learners. The researcher utilized a quasi-experimental design with 60 intermediate students who were divided into two experimental groups: an embedded story structures group and a sequential story structures group. After taking the PET, a critical thinking questionnaire was administered as a pre-test. The two groups received 16 sessions of treatment. All participants received a similar amount of instruction, but one group was given embedded short stories and the other group sequential short stories. To compare the two groups, a parallel critical thinking questionnaire was administered as a post-test. The two null hypotheses in this study were rejected due to the different performance of the two groups. Statistical results did not support the superiority of either structure; therefore, the researcher was not able to suggest which structure caused a higher impact on critical thinking. However, the findings reveal that teaching story structures in an EFL context can develop the critical thinking of intermediate EFL learners. The study has implications for test designers, teachers, and students.

  9. Bioorthogonal Cycloadditions with Sub-Millisecond Intermediates.

    Science.gov (United States)

    Qing, Yujia; Pulcu, Gökçe Su; Bell, Nicholas A W; Bayley, Hagan

    2018-01-26

    Tetrazine- and sydnone-based click reactions have emerged as important bioconjugation strategies with fast kinetics and N2 or CO2 as the only byproduct. Mechanistic studies of these reactions have focused on the initial rate-determining cycloaddition steps. The subsequent N2 or CO2 release from the bicyclic intermediates has been approached mainly through computational studies, which have predicted lifetimes of femtoseconds. In the present study, bioorthogonal cycloadditions involving N2 or CO2 extrusion have been examined experimentally at the single-molecule level by using a protein nanoreactor. At the resolution of this approach, the reactions appeared to occur in a single step, which places an upper limit on the lifetimes of the intermediates of about 80 μs, which is consistent with the computational work. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. Hγ Line Spectrum of Intermediate Polars

    Directory of Open Access Journals (Sweden)

    Yonggi Kim

    1998-06-01

    Kim & Beuermann (1995, 1996) have developed a model for the propagation of X-rays from the accreting white dwarf through the infalling material and the re-emission of the energy deposited by photo-absorption in the optical (and UV) spectral range. By using this model, we calculate the profiles of the Hγ emission-line spectrum of intermediate polars. Photoabsorption of X-rays by the infalling material is the dominant process in forming the observed energy-dependent rotational modulation of the X-ray flux. X-ray and optical modulations are sensitive to model parameters in different ways. In principle, these dependencies allow us to obtain improved insight into the accretion geometry of intermediate polars. We present the results of our calculations and compare them with the Hβ line spectrum (Kim & Beuermann 1996).

  11. Comments on intermediate-scale models

    Energy Technology Data Exchange (ETDEWEB)

    Ellis, J.; Enqvist, K.; Nanopoulos, D.V.; Olive, K.

    1987-04-23

    Some superstring-inspired models employ intermediate scales m_I of gauge symmetry breaking. Such scales should exceed 10^16 GeV in order to avoid prima facie problems with baryon decay through heavy particles and non-perturbative behaviour of the gauge couplings above m_I. However, the intermediate-scale phase transition does not occur until the temperature of the Universe falls below O(m_W), after which an enormous excess of entropy is generated. Moreover, gauge symmetry breaking by renormalization group-improved radiative corrections is inapplicable because the symmetry-breaking field has no renormalizable interactions at scales below m_I. We also comment on the danger of baryon and lepton number violation in the effective low-energy theory.

  12. Comments on intermediate-scale models

    International Nuclear Information System (INIS)

    Ellis, J.; Enqvist, K.; Nanopoulos, D.V.; Olive, K.

    1987-01-01

    Some superstring-inspired models employ intermediate scales m_I of gauge symmetry breaking. Such scales should exceed 10^16 GeV in order to avoid prima facie problems with baryon decay through heavy particles and non-perturbative behaviour of the gauge couplings above m_I. However, the intermediate-scale phase transition does not occur until the temperature of the Universe falls below O(m_W), after which an enormous excess of entropy is generated. Moreover, gauge symmetry breaking by renormalization group-improved radiative corrections is inapplicable because the symmetry-breaking field has no renormalizable interactions at scales below m_I. We also comment on the danger of baryon and lepton number violation in the effective low-energy theory. (orig.)

  13. Carbon monosulfide: a useful synthetic intermediate

    International Nuclear Information System (INIS)

    Kramer, M.P.

    1986-01-01

    The physical properties of carbon monosulfide, CS, are well documented. The molecule has been observed in interstellar space and is found to be a common intermediate in the thermal decomposition of carbon disulfide and other sulfur compounds. Interestingly enough, the chemistry of carbon monosulfide, a molecule that is isovalent with carbon monoxide, has received little attention. The explosive nature of the carbon monosulfide monomer, which hindered previous workers, was overcome by the development of special handling techniques. The ability to produce carbon monosulfide in gram quantities has led to the synthesis of novel compounds and to a more direct synthetic route for certain known compounds. Specifically, the following general reaction demonstrates the capabilities of carbon monosulfide on the synthetic scale: CS + RXY → RXC(S)Y; (X = N, S), (Y = H, Cl). Note: The initial product formed in the reaction can be an unstable intermediate

  14. Quantum probabilities as Dempster-Shafer probabilities in the lattice of subspaces

    International Nuclear Information System (INIS)

    Vourdas, A.

    2014-01-01

    The orthocomplemented modular lattice of subspaces L[H(d)] of a quantum system with d-dimensional Hilbert space H(d) is considered. A generalized additivity relation which holds for Kolmogorov probabilities is violated by quantum probabilities in the full lattice L[H(d)] (it is only valid within the Boolean subalgebras of L[H(d)]). This suggests the use of more general (than Kolmogorov) probability theories, and here the Dempster-Shafer probability theory is adopted. An operator D(H_1, H_2), which quantifies deviations from Kolmogorov probability theory, is introduced, and it is shown to be intimately related to the commutator of the projectors P(H_1), P(H_2) onto the subspaces H_1, H_2. As an application, it is shown that the proof of the inequalities of Clauser, Horne, Shimony, and Holt for a system of two spin-1/2 particles is valid for Kolmogorov probabilities, but not for Dempster-Shafer probabilities. The violation of these inequalities in experiments supports the interpretation of quantum probabilities as Dempster-Shafer probabilities

  15. [Studies in intermediate energy nuclear physics]

    International Nuclear Information System (INIS)

    Peterson, R.J.

    1993-01-01

    This report summarizes work carried out between October 1, 1992 and September 30, 1993 at the Nuclear Physics Laboratory of the University of Colorado, Boulder. The experimental program in intermediate-energy nuclear physics is very broadly based; it includes pion-nucleon and pion-nucleus studies at LAMPF and TRIUMF, kaon-nucleus scattering at the AGS, and equipment development for experiments at the next generation of accelerator facilities

  16. Far from the intermediate nuclear field

    International Nuclear Information System (INIS)

    Dietrich, K.; Wagner, G.J.; Gregoire, C.; Campi, X.; Silvestre-Brac, B.; Platchkov, S.; Mayer, B.; Abgrall, Y.; Bohigas, O.; Grange, P.; Signarbieux, C.

    1988-01-01

    Pairing correlations in nuclear physics; the BCS state and quasi-particles; the layer model; collision effects on nuclear dynamics; the theory of cluster formation (application to nucleus fragmentation); short range correlations (few-particle systems); deuterium electron scattering; dibaryonic resonances; traditional and exotic hadron probes of nuclear structure; spectral fluctuations and chaotic motion; corrections to the intermediate nuclear field (nonrelativistic and other effects); and heavy nuclei splitting and nuclear superfluidity are introduced. [fr]

  17. Intermedial Strategies of Memory in Contemporary Novels

    DEFF Research Database (Denmark)

    Tanderup, Sara

    2014-01-01

    ...and Judd Morrissey, and drawing on the theoretical perspectives of N. Katherine Hayles (media studies) and Andreas Huyssen (cultural memory studies), Tanderup argues that recent intermedial novels reflect a certain nostalgia, celebrating and remembering the book as a visual and material object in the age of digital media, while also highlighting the influence of new media on our cultural understanding and representation of memory and the past.

  18. UEP LT Codes with Intermediate Feedback

    DEFF Research Database (Denmark)

    Sørensen, Jesper Hemming; Popovski, Petar; Østergaard, Jan

    2013-01-01

    We analyze a class of rateless codes, called Luby transform (LT) codes with unequal error protection (UEP). We show that while these codes successfully provide UEP, there is a significant price in terms of redundancy in the lower prioritized segments. We propose a modification with a single intermediate feedback message. Our analysis shows a dramatic improvement on the decoding performance of the lower prioritized segment.

  19. International express student's book : pre-intermediate

    CERN Document Server

    Taylor, Liz

    1996-01-01

    The New Edition of International Express Pre-Intermediate retains all the key features of this popular and successful four-level course. It combines engaging, up-to-date topics with a time-efficient and student-centred approach to language work, and clearly focused activities that reflect learners' real communicative needs - the ideal course for professional adults who use English for work, travel, and socializing.

  20. Probabilistic assessment of steel moment frames incremental collapse (ordinary, intermediate and special under earthquake

    Directory of Open Access Journals (Sweden)

    Kourosh Mehdizadeh

    2017-11-01

    Building collapse is the level of structural performance at which financial and life losses are greatest, so this event can be the worst incident in construction. Given the possibility of destructive earthquakes in different parts of the world, detailed assessment of structural collapse has been one of the major challenges of structural engineering. In this regard, offering models based on laboratory studies, considering the effective parameters and appropriate earthquakes, is a step towards achieving this goal. In this research, a five-story steel structure was designed based on the local regulations with ordinary, intermediate and special moment frame systems (low, intermediate and high ductility). The effect of strength and stiffness deterioration of the structural elements was considered based on the results of laboratory models, and the role of ductility in the collapse capacity of steel moment frames was investigated probabilistically. For this purpose, incremental dynamic analysis was carried out under 50 pairs of earthquake records following the FEMA P695 guidelines, and fragility curves for various performance levels were developed, as sketched below. Results showed a higher collapse capacity for the special moment frame than for the intermediate and ordinary moment frames. At the 50 percent probability level, the collapse capacity of the special moment frame increased by 34% compared to the intermediate moment frame and by 66% compared to the ordinary moment frame. The results also showed that, for different collapse spectral accelerations, using the special moment frame instead of the intermediate and ordinary moment frames reduces the collapse probability by 30% and 50%, respectively.
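
    Fragility curves of the kind mentioned above are commonly summarized by fitting a lognormal CDF to the collapse intensities obtained from incremental dynamic analysis. A minimal sketch of that standard moment fit, with made-up collapse intensities standing in for the paper's actual IDA results:

    ```python
    import math
    import numpy as np

    # Hypothetical collapse intensities (Sa in g) from incremental dynamic
    # analysis, one per ground-motion record pair; placeholders, not real data.
    sa_collapse = np.array([0.62, 0.71, 0.80, 0.85, 0.93,
                            1.02, 1.10, 1.24, 1.35, 1.52])

    # Lognormal fragility fit by the method of moments on ln(Sa):
    theta = float(np.exp(np.mean(np.log(sa_collapse))))  # median collapse capacity
    beta = float(np.std(np.log(sa_collapse), ddof=1))    # lognormal dispersion

    def p_collapse(sa):
        """P(collapse | Sa) = Phi((ln sa - ln theta) / beta)."""
        z = (math.log(sa) - math.log(theta)) / beta
        return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

    print(f"median capacity {theta:.2f} g, beta {beta:.2f}, "
          f"P(collapse | Sa=1.0g) = {p_collapse(1.0):.2f}")
    ```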

  1. Multifragmentation in intermediate energy heavy ion collisions

    International Nuclear Information System (INIS)

    Jacak, B.V.; Britt, H.C.; Claesson, G.

    1986-01-01

    There has been considerable recent interest in the production of intermediate mass fragments (A > 4) in intermediate and high energy nucleus-nucleus collisions. The mechanism for production of these fragments is not well understood and has been described by models employing a variety of assumptions. Some examples are: disassembly of a system in thermal equilibrium into nucleons and nuclear fragments, liquid-vapor phase transitions in nuclear matter, final state coalescence of nucleons and dynamical correlations between nucleons at breakup. Previous studies of fragment production, with one exception, have been single particle inclusive measurements; the observed fragment mass (or charge) distributions can be described by all of the models above. To gain insight into the fragment production mechanism, the authors used the GSI/LBL Plastic Ball detector system to get full azimuthal coverage for intermediate mass fragments in the forward hemisphere in the center of mass system while measuring all the light particles in each event. The authors studied the systems 200 MeV/nucleon Au + Au and Au + Fe

  2. Reactivity of Criegee Intermediates toward Carbon Dioxide.

    Science.gov (United States)

    Lin, Yen-Hsiu; Takahashi, Kaito; Lin, Jim Jr-Min

    2018-01-04

    Recent theoretical work by Kumar and Francisco suggested that the high reactivity of Criegee intermediates (CIs) could be utilized for designing efficient carbon capture technologies. Because the anti-CH3CHOO + CO2 reaction has the lowest barrier in their study, we chose to investigate it experimentally. We probed anti-CH3CHOO with its strong UV absorption at 365 nm and measured the rate coefficient to be ≤2 × 10^-17 cm^3 molecule^-1 s^-1 at 298 K, which is consistent with our theoretical value of 2.1 × 10^-17 cm^3 molecule^-1 s^-1 at the QCISD(T)/CBS//B3LYP/6-311+G(2d,2p) level but inconsistent with their results obtained at the M06-2X/aug-cc-pVTZ level, which tends to underestimate barrier heights. The experimental result indicates that the reaction of a Criegee intermediate with atmospheric CO2 (400 ppmv) would be inefficient (k_eff < 0.2 s^-1) and cannot compete with other decay processes of Criegee intermediates, such as reaction with water vapor (~10^3 s^-1) or thermal decomposition (~10^2 s^-1).
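
    The quoted k_eff < 0.2 s^-1 follows directly from the measured rate coefficient and the CO2 number density at 400 ppmv; a worked check, assuming 1 atm and 298 K (total number density about 2.46 × 10^19 cm^-3):

    ```latex
    \[
    n_{\mathrm{CO_2}} \approx \left(2.46\times10^{19}\,\mathrm{cm^{-3}}\right)
      \times 400\times10^{-6}
      \approx 9.8\times10^{15}\ \mathrm{molecule\ cm^{-3}},
    \]
    \[
    k_{\mathrm{eff}} = k\, n_{\mathrm{CO_2}}
      \le \left(2\times10^{-17}\,\mathrm{cm^3\,molecule^{-1}\,s^{-1}}\right)
          \left(9.8\times10^{15}\,\mathrm{molecule\ cm^{-3}}\right)
      \approx 0.2\ \mathrm{s^{-1}}.
    \]
    ```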

  3. Tropical Cyclone Wind Probability Forecasting (WINDP).

    Science.gov (United States)

    1981-04-01

    The accuracy at small probabilities (below 10%) is limited by the number of significant digits given; the output should therefore be regarded as being...

  4. The Probability Heuristics Model of Syllogistic Reasoning.

    Science.gov (United States)

    Chater, Nick; Oaksford, Mike

    1999-01-01

    Proposes a probability heuristic model for syllogistic reasoning and confirms the rationality of this heuristic by an analysis of the probabilistic validity of syllogistic reasoning that treats logical inference as a limiting case of probabilistic inference. Meta-analysis and two experiments involving 40 adult participants and using generalized…

  5. Probability & Perception: The Representativeness Heuristic in Action

    Science.gov (United States)

    Lu, Yun; Vasko, Francis J.; Drummond, Trevor J.; Vasko, Lisa E.

    2014-01-01

    If the prospective students of probability lack a background in mathematical proofs, hands-on classroom activities may work well to help them to learn to analyze problems correctly. For example, students may physically roll a die twice to count and compare the frequency of the sequences. Tools such as graphing calculators or Microsoft Excel®…

  6. Critique of `Elements of Quantum Probability'

    NARCIS (Netherlands)

    Gill, R.D.

    1998-01-01

    We analyse the thesis of Kummerer and Maassen that classical probability is unable to model the stochastic nature of the Aspect experiment, in which violation of Bell's inequality was experimentally demonstrated. According to these authors, the experiment shows the need to introduce the extension...

  7. Independent Events in Elementary Probability Theory

    Science.gov (United States)

    Csenki, Attila

    2011-01-01

    In Probability and Statistics taught to mathematicians as a first introduction or to a non-mathematical audience, joint independence of events is introduced by requiring that the multiplication rule is satisfied. The following statement is usually tacitly assumed to hold (and, at best, intuitively motivated): If the n events E_1,…

  8. Probable Unusual Transmission of Zika Virus

    Centers for Disease Control (CDC) Podcasts

    This podcast discusses a study about the probable unusual transmission of Zika Virus Infection from a scientist to his wife, published in the May 2011 issue of Emerging Infectious Diseases. Dr. Brian Foy, Associate Professor at Colorado State University, shares details of this event.

  9. Error probabilities in default Bayesian hypothesis testing

    NARCIS (Netherlands)

    Gu, Xin; Hoijtink, Herbert; Mulder, J,

    2016-01-01

    This paper investigates the classical type I and type II error probabilities of default Bayes factors for a Bayesian t test. Default Bayes factors quantify the relative evidence between the null hypothesis and the unrestricted alternative hypothesis without needing to specify prior distributions for...

  10. Spatial Probability Cuing and Right Hemisphere Damage

    Science.gov (United States)

    Shaqiri, Albulena; Anderson, Britt

    2012-01-01

    In this experiment we studied statistical learning, inter-trial priming, and visual attention. We assessed healthy controls and right brain damaged (RBD) patients with and without neglect, on a simple visual discrimination task designed to measure priming effects and probability learning. All participants showed a preserved priming effect for item…

  11. Sampling, Probability Models and Statistical Reasoning -RE ...

    Indian Academy of Sciences (India)

    Random sampling allows data to be modelled with the help of probability models... based on different trials to get an estimate of the experimental error... if e is indeed the true value of the proportion of defectives in the...

  12. Virus isolation: Specimen type and probable transmission

    Indian Academy of Sciences (India)

    Over 500 CHIK virus isolations were made: 4 from male Ae. aegypti (?TOT), 6 from CSF (neurological involvement), and 1 from a 4-day-old child (transplacental transmission).

  13. Estimating the Probability of Negative Events

    Science.gov (United States)

    Harris, Adam J. L.; Corner, Adam; Hahn, Ulrike

    2009-01-01

    How well we are attuned to the statistics of our environment is a fundamental question in understanding human behaviour. It seems particularly important to be able to provide accurate assessments of the probability with which negative events occur so as to guide rational choice of preventative actions. One question that arises here is whether or…

  14. Concurrency meets probability: theory and practice (abstract)

    NARCIS (Netherlands)

    Katoen, Joost P.

    Treating random phenomena in concurrency theory has a long tradition. Petri nets [18, 10] and process algebras [14] have been extended with probabilities. The same applies to behavioural semantics such as strong and weak (bi)simulation [1], and testing pre-orders [5]. Beautiful connections between...

  15. Confusion between Odds and Probability, a Pandemic?

    Science.gov (United States)

    Fulton, Lawrence V.; Mendez, Francis A.; Bastian, Nathaniel D.; Musal, R. Muzaffer

    2012-01-01

    This manuscript discusses the common confusion between the terms probability and odds. To emphasize the importance and responsibility of being meticulous in the dissemination of information and knowledge, this manuscript reveals five cases of sources of inaccurate statistical language imbedded in the dissemination of information to the general…

  16. Probability in Action: The Red Traffic Light

    Science.gov (United States)

    Shanks, John A.

    2007-01-01

    Emphasis on problem solving in mathematics has gained considerable attention in recent years. While statistics teaching has always been problem driven, the same cannot be said for the teaching of probability where discrete examples involving coins and playing cards are often the norm. This article describes an application of simple probability…

  17. Probability & Statistics: Modular Learning Exercises. Teacher Edition

    Science.gov (United States)

    Actuarial Foundation, 2012

    2012-01-01

    The purpose of these modules is to provide an introduction to the world of probability and statistics to accelerated mathematics students at the high school level. The modules also introduce students to real world math concepts and problems that property and casualty actuaries come across in their work. They are designed to be used by teachers and…

  18. Probability & Statistics: Modular Learning Exercises. Student Edition

    Science.gov (United States)

    Actuarial Foundation, 2012

    2012-01-01

    The purpose of these modules is to provide an introduction to the world of probability and statistics to accelerated mathematics students at the high school level. The materials are centered on the fictional town of Happy Shores, a coastal community which is at risk for hurricanes. Actuaries at an insurance company figure out the risks and…

  19. Conditional probability on MV-algebras

    Czech Academy of Sciences Publication Activity Database

    Kroupa, Tomáš

    2005-01-01

    Roč. 149, č. 2 (2005), s. 369-381 ISSN 0165-0114 R&D Projects: GA AV ČR IAA2075302 Institutional research plan: CEZ:AV0Z10750506 Keywords : conditional probability * tribe * MV-algebra Subject RIV: BA - General Mathematics Impact factor: 1.039, year: 2005

  20. Investigating Probability with the NBA Draft Lottery.

    Science.gov (United States)

    Quinn, Robert J.

    1997-01-01

    Investigates an interesting application of probability in the world of sports. Considers the role of permutations in the lottery system used by the National Basketball Association (NBA) in the United States to determine the order in which nonplayoff teams select players from the college ranks. Presents a lesson on this topic in which students work…

  1. Probability from a Socio-Cultural Perspective

    Science.gov (United States)

    Sharma, Sashi

    2016-01-01

    There exists considerable and rich literature on students' misconceptions about probability; less attention has been paid to the development of students' probabilistic thinking in the classroom. Grounded in an analysis of the literature, this article offers a lesson sequence for developing students' probabilistic understanding. In particular, a…

  2. Neutrosophic Probability, Set, And Logic (first version)

    OpenAIRE

    Smarandache, Florentin

    2000-01-01

    This project is a part of a National Science Foundation interdisciplinary project proposal. Starting from a new viewpoint in philosophy, the neutrosophy, one extends the classical "probability theory", "fuzzy set" and "fuzzy logic" to "neutrosophic probability", "neutrosophic set" and "neutrosophic logic", respectively. They are useful in artificial intelligence, neural networks, evolutionary programming, neutrosophic dynamic systems, and quantum mechanics.

  3. Pade approximant calculations for neutron escape probability

    International Nuclear Information System (INIS)

    El Wakil, S.A.; Saad, E.A.; Hendi, A.A.

    1984-07-01

    The neutron escape probability from a non-multiplying slab containing an internal source is defined in terms of a functional relation for the scattering function of the diffuse reflection problem. The Pade approximant technique is used to obtain numerical results, which are compared with exact results. (author)

  4. On a paradox of probability theory

    International Nuclear Information System (INIS)

    Stuart, C.I.J.M.

    1989-01-01

    Costa de Beauregard's proposal concerning physical retrocausality has been shown to fail on two crucial points. However, it is argued that his proposal still merits serious attention. The argument arises from showing that his proposal reveals a paradox involving relations between conditional probabilities, statistical correlations and reciprocal causalities of the type exhibited by cooperative dynamics in physical systems. 4 refs. (Author)

  5. Escape probabilities for fluorescent x-rays

    International Nuclear Information System (INIS)

    Dance, D.R.; Day, G.J.

    1985-01-01

    Computation of the energy absorption efficiency of an x-ray photon detector involves consideration of the histories of the secondary particles produced in any initial or secondary interaction which may occur within the detector. In particular, the K or higher shell fluorescent x-rays which may be emitted following a photoelectric interaction can carry away a large fraction of the energy of the incident photon, especially if this energy is just above an absorption edge. The effects of such photons cannot be ignored and a correction term, depending upon the probability that the fluorescent x-rays will escape from the detector, must be applied to the energy absorption efficiency. For detectors such as x-ray intensifying screens, it has been usual to calculate this probability by numerical integration. In this note analytic expressions are derived for the escape probability of fluorescent photons from planar detectors in terms of exponential integral functions. Rational approximations for these functions are readily available and these analytic expressions therefore facilitate the computation of photon absorption efficiencies. A table is presented which should obviate the need for calculating the escape probability for most cases of interest. (author)

  6. Sequential Probability Ratio Tests: Conservative and Robust

    NARCIS (Netherlands)

    Kleijnen, J.P.C.; Shi, Wen

    2017-01-01

    In practice, most computers generate simulation outputs sequentially, so it is attractive to analyze these outputs through sequential statistical methods such as sequential probability ratio tests (SPRTs). We investigate several SPRTs for choosing between two hypothesized values for the mean output...
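
    For readers unfamiliar with SPRTs, the classic Wald test for choosing between two hypothesized means of a normal output accumulates a log-likelihood ratio and stops at the first boundary crossing. A minimal sketch using Wald's standard threshold approximations; the paper's conservative and robust variants are not reproduced here:

    ```python
    import math, random

    def sprt(samples, mu0, mu1, sigma=1.0, alpha=0.05, beta=0.05):
        """Wald's SPRT between N(mu0, sigma^2) and N(mu1, sigma^2).
        Returns the accepted hypothesis and the number of observations used."""
        lo = math.log(beta / (1 - alpha))   # accept-H0 boundary (log scale)
        hi = math.log((1 - beta) / alpha)   # accept-H1 boundary (log scale)
        llr, n = 0.0, 0
        for x in samples:
            n += 1
            # log-likelihood-ratio increment for one normal observation
            llr += ((x - mu0) ** 2 - (x - mu1) ** 2) / (2 * sigma ** 2)
            if llr <= lo:
                return "H0", n
            if llr >= hi:
                return "H1", n
        return "inconclusive", n

    rng = random.Random(42)
    data = (rng.gauss(0.5, 1.0) for _ in range(10_000))  # true mean is 0.5
    print(sprt(data, mu0=0.0, mu1=0.5))                  # typically accepts H1
    ```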

  7. Applied probability models with optimization applications

    CERN Document Server

    Ross, Sheldon M

    1992-01-01

    Concise advanced-level introduction to stochastic processes that frequently arise in applied probability. Largely self-contained text covers Poisson process, renewal theory, Markov chains, inventory theory, Brownian motion and continuous time optimization models, much more. Problems and references at chapter ends. ""Excellent introduction."" - Journal of the American Statistical Association. Bibliography. 1970 edition.

  8. Quantum probability and conceptual combination in conjunctions.

    Science.gov (United States)

    Hampton, James A

    2013-06-01

    I consider the general problem of category conjunctions in the light of Pothos & Busemeyer (P&B)'s quantum probability (QP) account of the conjunction fallacy. I argue that their account as presented cannot capture the "guppy effect" - the case in which a class is a better member of a conjunction A^B than it is of either A or B alone.

  9. Monte Carlo methods to calculate impact probabilities

    Science.gov (United States)

    Rickman, H.; Wiśniowski, T.; Wajer, P.; Gabryszewski, R.; Valsecchi, G. B.

    2014-09-01

    Context: Unraveling the events that took place in the solar system during the period known as the late heavy bombardment requires the interpretation of the cratered surfaces of the Moon and terrestrial planets. This, in turn, requires good estimates of the statistical impact probabilities for different source populations of projectiles, a subject that has received relatively little attention since the works of Öpik (1951, Proc. R. Irish Acad. Sect. A, 54, 165) and Wetherill (1967, J. Geophys. Res., 72, 2429). Aims: We aim to work around the limitations of the Öpik and Wetherill formulae, which are caused by singularities due to zero denominators under special circumstances. Using modern computers, it is possible to make good estimates of impact probabilities by means of Monte Carlo simulations, and in this work, we explore the available options. Methods: We describe three basic methods to derive the average impact probability for a projectile with a given semi-major axis, eccentricity, and inclination with respect to a target planet on an elliptic orbit. One is a numerical averaging of the Wetherill formula; the next is a Monte Carlo super-sizing method using the target's Hill sphere. The third uses extensive minimum orbit intersection distance (MOID) calculations for a Monte Carlo sampling of potentially impacting orbits, along with calculations of the relevant interval for the timing of the encounter allowing collision. Numerical experiments are carried out for an intercomparison of the methods and to scrutinize their behavior near the singularities (zero relative inclination and equal perihelion distances). Results: We find an excellent agreement between all methods in the general case, while large differences appear in the immediate vicinity of the singularities. With respect to the MOID method, which is the only one that does not involve simplifying assumptions and approximations, the Wetherill averaging impact probability departs by diverging toward...

  10. Bounding probabilistic safety assessment probabilities by reality

    International Nuclear Information System (INIS)

    Fragola, J.R.; Shooman, M.L.

    1991-01-01

    Investigating failure in systems where failure is a rare event makes continual comparison between the developed probabilities and empirical evidence difficult. The comparison of the predictions of rare event risk assessments with historical reality is essential to prevent probabilistic safety assessment (PSA) predictions from drifting into fantasy. One approach to performing such comparisons is to search out and assign probabilities to natural events which, while extremely rare, have a basis in the history of natural phenomena or human activities. For example, the Segovian aqueduct and some of the Roman fortresses in Spain have existed for several millennia and in many cases show no physical signs of earthquake damage. This evidence could be used to bound the probability of earthquakes above a certain magnitude to less than 10^-3 per year. On the other hand, there is evidence that some repetitive actions can be performed with extremely low historical probabilities when operators are properly trained and motivated, and sufficient warning indicators are provided. The point is not that low probability estimates are impossible, but that analysis assumptions require continual reassessment and analysis predictions must be bounded by historical reality. This paper reviews the probabilistic predictions of PSA in this light, attempts to develop, in a general way, the limits which can be historically established and the consequent bounds that these limits place upon the predictions, and illustrates the methodology used in computing such limits. Further, the paper discusses the use of empirical evidence and the requirement for disciplined systematic approaches within the bounds of reality and the associated impact on PSA probabilistic estimates.

  11. Establishment probability in newly founded populations

    Directory of Open Access Journals (Sweden)

    Gusset Markus

    2012-06-01

    Background: Establishment success in newly founded populations relies on reaching the established phase, which is defined by characteristic fluctuations of the population's state variables. Stochastic population models can be used to quantify the establishment probability of newly founded populations; however, so far no simple but robust method for doing so existed. To determine a critical initial number of individuals that need to be released to reach the established phase, we used a novel application of the "Wissel plot", where −ln(1 − P0(t)) is plotted against time t. This plot is based on the equation P0(t) = 1 − c1·e^(−ω1·t), which relates the probability of extinction by time t, P0(t), to two constants: c1 describes the probability of a newly founded population reaching the established phase, whereas ω1 describes the population's probability of extinction per short time interval once established. Results: For illustration, we applied the method to a previously developed stochastic population model of the endangered African wild dog (Lycaon pictus). A newly founded population reaches the established phase if the intercept of the (extrapolated) linear part of the "Wissel plot" with the y-axis, which is −ln(c1), is negative. For wild dogs in our model, this is the case if a critical initial number of four packs, consisting of eight individuals each, is released. Conclusions: The method we present to quantify the establishment probability of newly founded populations is generic, and inferences are thus transferable to other systems across the field of conservation biology. In contrast to other methods, our approach disaggregates the components of a population's viability by distinguishing establishment from persistence.
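
    The "Wissel plot" construction lends itself to a few lines of code: plot −ln(1 − P0(t)) against t, fit a line, and read ω1 from the slope and −ln(c1) from the intercept. The sketch below uses synthetic P0(t) values in place of real simulation output:

    ```python
    import numpy as np

    # Synthetic extinction-probability curve standing in for model output,
    # generated with c1 = 1.05 and omega1 = 0.01 so the fit can be checked.
    t = np.arange(5.0, 50.0, 5.0)
    P0 = 1.0 - 1.05 * np.exp(-0.01 * t)

    # "Wissel plot": -ln(1 - P0(t)) is linear in t, slope omega1, intercept -ln(c1).
    y = -np.log(1.0 - P0)
    omega1, intercept = np.polyfit(t, y, 1)
    c1 = np.exp(-intercept)

    # Establishment is reached when the intercept -ln(c1) is negative (c1 > 1).
    print(f"omega1 ~ {omega1:.3f}, c1 ~ {c1:.2f}, established: {intercept < 0}")
    ```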

  12. The probability and severity of decompression sickness

    Science.gov (United States)

    Hada, Ethan A.; Vann, Richard D.; Denoble, Petar J.

    2017-01-01

    Decompression sickness (DCS), which is caused by inert gas bubbles in tissues, is an injury of concern for scuba divers, compressed air workers, astronauts, and aviators. Case reports for 3322 air and N2-O2 dives, resulting in 190 DCS events, were retrospectively analyzed and the outcomes were scored as (1) serious neurological, (2) cardiopulmonary, (3) mild neurological, (4) pain, (5) lymphatic or skin, and (6) constitutional or nonspecific manifestations. Following standard U.S. Navy medical definitions, the data were grouped into mild, Type I (manifestations 4-6), and serious, Type II (manifestations 1-3). Additionally, we considered an alternative grouping of mild, Type A (manifestations 3-6), and serious, Type B (manifestations 1 and 2). The current U.S. Navy guidance allows for a 2% probability of mild DCS and a 0.1% probability of serious DCS. We developed a hierarchical trinomial (3-state) probabilistic DCS model that simultaneously predicts the probability of mild and serious DCS given a dive exposure. Both the Type I/II and Type A/B discriminations of mild and serious DCS resulted in a highly significant (p ...) fit; the probability of 'mild' DCS resulted in a longer allowable bottom time for the same 2% limit. However, for the 0.1% serious DCS limit, we found a vastly decreased allowable bottom dive time for all dive depths. If the Type A/B scoring was assigned to outcome severity, the no-decompression limits (NDL) for air dives were still controlled by the acceptable serious DCS risk limit rather than the acceptable mild DCS risk limit. However, in this case, longer NDL limits were allowed than with the Type I/II scoring. The trinomial model mild and serious probabilities agree reasonably well with the current air NDL only with the Type A/B scoring and when 0.2% risk of serious DCS is allowed. PMID:28296928

  13. An Alternative Version of Conditional Probabilities and Bayes' Rule: An Application of Probability Logic

    Science.gov (United States)

    Satake, Eiki; Amato, Philip P.

    2008-01-01

    This paper presents an alternative version of formulas of conditional probabilities and Bayes' rule that demonstrate how the truth table of elementary mathematical logic applies to the derivations of the conditional probabilities of various complex, compound statements. This new approach is used to calculate the prior and posterior probabilities…
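
    The core identity the paper builds on can be illustrated by computing conditional probabilities directly from a truth-table-like joint distribution; the numbers below are arbitrary, and the paper's logical derivation is of course richer than this check:

    ```python
    # A toy joint distribution over two propositions (A, B), laid out like a
    # truth table; the probabilities are made up for illustration only.
    joint = {(True, True): 0.12, (True, False): 0.08,
             (False, True): 0.18, (False, False): 0.62}

    P = lambda pred: sum(p for w, p in joint.items() if pred(*w))

    p_a = P(lambda a, b: a)
    p_b = P(lambda a, b: b)
    p_a_and_b = P(lambda a, b: a and b)

    # Conditional probability and Bayes' rule computed from the truth table.
    p_a_given_b = p_a_and_b / p_b
    bayes = (p_a_and_b / p_a) * p_a / p_b   # P(B|A) * P(A) / P(B)
    assert abs(p_a_given_b - bayes) < 1e-12
    print(f"P(A|B) = {p_a_given_b:.3f}")    # 0.400 for these numbers
    ```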

  14. A Comprehensive Probability Project for the Upper Division One-Semester Probability Course Using Yahtzee

    Science.gov (United States)

    Wilson, Jason; Lawman, Joshua; Murphy, Rachael; Nelson, Marissa

    2011-01-01

    This article describes a probability project used in an upper division, one-semester probability course with third-semester calculus and linear algebra prerequisites. The student learning outcome focused on developing the skills necessary for approaching project-sized math/stat application problems. These skills include appropriately defining…
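
    As a flavor of the kind of computation such a project involves (illustrative, not taken from the article): the probability of rolling a Yahtzee, five of a kind, in a single roll of five fair dice, computed exactly and checked by simulation.

      import random

      P_EXACT = 6 / 6**5       # 6 favorable outcomes of 6^5 rolls = 1/1296

      def p_simulated(trials=1_000_000, seed=0):
          rng = random.Random(seed)
          hits = sum(len({rng.randint(1, 6) for _ in range(5)}) == 1
                     for _ in range(trials))
          return hits / trials

      # P_EXACT ≈ 0.000772; p_simulated() should agree to about 3 decimals.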

  15. Constraint-based Student Modelling in Probability Story Problems with Scaffolding Techniques

    Directory of Open Access Journals (Sweden)

    Nabila Khodeir

    2018-01-01

    Full Text Available Constraint-based student modelling (CBM) is an important technique employed in intelligent tutoring systems to model student knowledge in order to provide relevant assistance. This paper introduces the Math Story Problem Tutor (MAST), a Web-based intelligent tutoring system for probability story problems, which is able to generate problems of different contexts, types and difficulty levels for self-paced learning. Constraints in MAST are specified at a low level of granularity to allow fine-grained diagnosis of student errors. Furthermore, MAST extends CBM to address errors due to misunderstanding of the narrative story: it can locate and highlight keywords that may have been overlooked or misunderstood, leading to an error. This is achieved by utilizing the roles of sentences and keywords that are defined through the Natural Language Generation (NLG) methods deployed in the story problem generation. MAST also integrates CBM with scaffolding questions and feedback to provide various forms of help and guidance to the student, allowing the student to discover and correct errors in his/her solution. MAST has been preliminarily evaluated empirically, and the results show its potential effectiveness in tutoring students, with a decrease in the percentage of violated constraints along the learning curve. Additionally, students using MAST showed a significantly larger improvement from pre-test to post-test exam results than those relying on the textbook.
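
    The core CBM mechanism the abstract refers to pairs a relevance condition with a satisfaction condition; a constraint is violated when it applies to the student's solution but is not satisfied by it. The sketch below illustrates only that generic idea; it is not MAST's implementation, and the example constraint is hypothetical.

      from dataclasses import dataclass
      from typing import Callable, Dict, List

      @dataclass
      class Constraint:
          name: str
          relevant: Callable[[Dict], bool]    # does the constraint apply?
          satisfied: Callable[[Dict], bool]   # does the solution respect it?

      def violated(solution: Dict, constraints: List[Constraint]) -> List[str]:
          return [c.name for c in constraints
                  if c.relevant(solution) and not c.satisfied(solution)]

      # Hypothetical constraint for a probability story problem: if two
      # events are treated as independent, P(A and B) must equal P(A)*P(B).
      indep = Constraint(
          "independent-multiplication",
          relevant=lambda s: s.get("assumes_independent", False),
          satisfied=lambda s: abs(s["p_joint"] - s["p_a"] * s["p_b"]) < 1e-9,
      )
      # violated({"assumes_independent": True, "p_a": 0.5, "p_b": 0.4,
      #           "p_joint": 0.3}, [indep]) -> ["independent-multiplication"]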

  16. VOLCANIC RISK ASSESSMENT - PROBABILITY AND CONSEQUENCES

    International Nuclear Information System (INIS)

    G.A. Valentine; F.V. Perry; S. Dartevelle

    2005-01-01

    Risk is the product of the probability and consequences of an event. Both of these must be based upon sound science that integrates field data, experiments, and modeling, but must also be useful to decision makers who likely do not understand all aspects of the underlying science. We review a decision framework used in many fields, such as performance assessment for hazardous and/or radioactive waste disposal sites, that can serve to guide the volcanological community towards integrated risk assessment. In this framework the underlying scientific understanding of processes that affect probability and consequences drives the decision-level results, but in turn these results can drive focused research in areas that cause the greatest level of uncertainty at the decision level. We review two examples of the determination of volcanic event probability: (1) the probability of a new volcano forming at the proposed Yucca Mountain radioactive waste repository, and (2) the probability that a subsurface repository in Japan would be affected by the nearby formation of a new stratovolcano. We also provide examples of work on the consequences of explosive eruptions, within the framework mentioned above. These include field-based studies aimed at providing data for 'closure' of wall-rock erosion terms in a conduit flow model, predictions of dynamic pressure and other variables related to damage by pyroclastic flow into underground structures, and vulnerability criteria for structures subjected to conditions of explosive eruption. Process models (e.g., multiphase flow) are important for testing the validity or relative importance of possible scenarios in a volcanic risk assessment. We show how time-dependent multiphase modeling of explosive 'eruption' of basaltic magma into an open tunnel (drift) at the Yucca Mountain repository provides insight into proposed scenarios that include the development of secondary pathways to the Earth's surface. Addressing volcanic risk within a decision…

  17. Joint survival probability via truncated invariant copula

    International Nuclear Information System (INIS)

    Kim, Jeong-Hoon; Ma, Yong-Ki; Park, Chan Yeol

    2016-01-01

    Highlights: • We study the dependence structure between default intensities. • We use a multivariate shot noise intensity process, where jumps occur simultaneously and their sizes are correlated. • We obtain the joint survival probability of the integrated intensities by using a copula. • We apply our theoretical result to pricing basket default swap spreads. - Abstract: Given an intensity-based credit risk model, this paper studies the dependence structure between default intensities. To model this structure, we use a multivariate shot noise intensity process, where jumps occur simultaneously and their sizes are correlated. Through lengthy algebra, we obtain explicitly the joint survival probability of the integrated intensities by using the truncated invariant Farlie–Gumbel–Morgenstern copula with exponential marginal distributions. We also apply our theoretical result to pricing basket default swap spreads. This result can provide a useful guide for credit risk management.
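
    To make the copula construction concrete, the sketch below evaluates a joint survival probability using the standard (untruncated) FGM copula with exponential marginals as a stand-in for the paper's truncated invariant variant; parameter values are illustrative.

      # Joint survival via a survival copula: S(x, y) = C(S1(x), S2(y)) with
      # the FGM form C(u, v) = u*v*(1 + theta*(1 - u)*(1 - v)), |theta| <= 1.
      import math

      def joint_survival(x, y, lam1, lam2, theta):
          u = math.exp(-lam1 * x)   # exponential marginal survival S1(x)
          v = math.exp(-lam2 * y)   # exponential marginal survival S2(y)
          return u * v * (1.0 + theta * (1.0 - u) * (1.0 - v))

      # e.g. joint_survival(1.0, 2.0, 0.3, 0.5, 0.4); setting theta = 0
      # recovers the independent case S1(x) * S2(y).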

  18. Probabilities, causes and propensities in physics

    CERN Document Server

    Suárez, Mauricio

    2010-01-01

    This volume defends a novel approach to the philosophy of physics: it is the first book devoted to a comparative study of probability, causality, and propensity, and their various interrelations, within the context of contemporary physics, particularly quantum and statistical physics. The philosophical debates and distinctions are firmly grounded in examples from actual physics, exemplifying a robustly empiricist approach. The essays, by both prominent scholars in the field and promising young researchers, constitute a pioneering effort to bring out the connections between the probabilistic, causal and dispositional aspects of the quantum domain. This book will appeal to specialists in the philosophy and foundations of physics, philosophy of science in general, metaphysics, the ontology of physical theories, and the philosophy of probability.

  19. Foundations of quantization for probability distributions

    CERN Document Server

    Graf, Siegfried

    2000-01-01

    Due to the rapidly increasing need for methods of data compression, quantization has become a flourishing field in signal and image processing and in information theory. The same techniques are also used in statistics (cluster analysis), pattern recognition, and operations research (optimal location of service centers). The book gives the first mathematically rigorous account of the fundamental theory underlying these applications. The emphasis is on the asymptotics of quantization errors for absolutely continuous distributions and for special classes of singular probabilities (surface measures, self-similar measures), presenting some new results for the first time. Written for researchers and graduate students in probability theory, the monograph is also of potential interest to everyone working in the disciplines mentioned above.
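
    The optimization problem the book analyzes, finding n points that minimize the expected quantization error, can be approximated numerically with Lloyd's algorithm, a standard method in this literature. The sketch below is illustrative, not code from the book.

      import numpy as np

      def lloyd_quantizer(samples, n_points, iters=100, seed=0):
          samples = np.asarray(samples, dtype=float)
          rng = np.random.default_rng(seed)
          centers = np.sort(rng.choice(samples, n_points, replace=False))
          for _ in range(iters):
              # Assign each sample to its nearest center (Voronoi step) ...
              idx = np.abs(samples[:, None] - centers[None, :]).argmin(axis=1)
              # ... then move each center to its region's mean (centroid step).
              for k in range(n_points):
                  if np.any(idx == k):
                      centers[k] = samples[idx == k].mean()
              centers = np.sort(centers)
          return centers

      # e.g. a 4-point quantizer for a standard normal, from 100k samples:
      # lloyd_quantizer(np.random.default_rng(1).normal(size=100_000), 4)
      # For absolutely continuous laws on the line, the n-point mean squared
      # error decays at the classical rate of order n^(-2).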

  20. Measurement of the resonance escape probability

    International Nuclear Information System (INIS)

    Anthony, J.P.; Bacher, P.; Lheureux, L.; Moreau, J.; Schmitt, A.P.

    1957-01-01

    The average cadmium ratio in natural uranium rods has been measured using equal-diameter natural uranium disks. These values, correlated with independent measurements of the lattice buckling, enabled us to calculate values of the resonance escape probability for the G1 reactor under one or the other of two definitions. Measurements were performed on 26 mm and 32 mm rods, giving the following values for the resonance escape probability p: 0.8976 ± 0.005 and 0.912 ± 0.006 (d = 26 mm), 0.8627 ± 0.009 and 0.884 ± 0.01 (d = 32 mm). The influence of either definition on the lattice parameters is discussed, leading to values of the effective resonance integral. Similar experiments have been performed with thorium rods. (author) [fr]
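
    For context on where the measured quantity sits (standard reactor physics, not this paper's procedure): the resonance escape probability p is one factor of the four-factor formula for the infinite-medium multiplication factor. The sketch below combines the paper's 26 mm value of p with illustrative values of the other three factors.

      # Four-factor formula: k_inf = eta * epsilon * p * f, where eta is the
      # reproduction factor, epsilon the fast fission factor, p the resonance
      # escape probability, and f the thermal utilization.
      def k_infinity(eta, epsilon, p, f):
          return eta * epsilon * p * f

      # With illustrative eta, epsilon, f and the measured p for 26 mm rods:
      # k_infinity(1.34, 1.03, 0.898, 0.90) ≈ 1.115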