WorldWideScience

Sample records for significantly increased probability

  1. Gas revenue increasingly significant

    International Nuclear Information System (INIS)

    Megill, R.E.

    1991-01-01

    This paper briefly describes the wellhead prices of natural gas compared to crude oil over the past 70 years. Although natural gas prices have never reached parity with crude oil prices, the relative value of a gas BTU has been increasing, which is one reason the total revenue from natural gas wells is becoming more significant. From 1920 to 1955, wellhead revenue from natural gas was only about 10% of the money received by producers; most of the money needed for exploration, development, and production came from crude oil. At present, however, over 40% of the money from the upstream portion of the petroleum industry comes from natural gas. As a result, within a few short years natural gas may account for 50% of the revenue generated from wellhead production facilities.

  2. Encounter Probability of Significant Wave Height

    DEFF Research Database (Denmark)

    Liu, Z.; Burcharth, H. F.

    The determination of the design wave height (often given as the significant wave height) is usually based on statistical analysis of long-term extreme wave height measurement or hindcast. The result of such extreme wave height analysis is often given as the design wave height corresponding to a c...
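
    The abstract is truncated here, but the encounter probability it refers to has a standard closed form. A minimal sketch, assuming the usual model of independent yearly maxima; the return period and lifetime in the comment are illustrative values, not numbers from the paper:

    ```latex
    % Encounter probability: the chance that the T-year design wave height
    % is exceeded at least once during a structure lifetime of L years,
    % assuming exceedances in different years are independent.
    p = 1 - \left(1 - \frac{1}{T}\right)^{L}
    % Example: T = 100 years, L = 50 years gives
    % p = 1 - 0.99^{50} \approx 0.39.
    ```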

  3. Probability of seeing increases saccadic readiness.

    Directory of Open Access Journals (Sweden)

    Thérèse Collins

    Associating movement directions or endpoints with monetary rewards or costs influences movement parameters in humans, and associating movement directions or endpoints with food reward influences movement parameters in non-human primates. Rewarded movements are facilitated relative to non-rewarded movements. The present study examined to what extent successful foveation facilitated saccadic eye movement behavior, with the hypothesis that foveation may constitute an informational reward. Human adults performed saccades to peripheral targets that either remained visible after saccade completion or were extinguished, preventing visual feedback. Saccades to targets that were systematically extinguished were slower and easier to inhibit than saccades to targets that afforded successful foveation, and this effect was modulated by the probability of successful foveation. These results suggest that successful foveation facilitates behavior, and that obtaining the expected sensory consequences of a saccadic eye movement may serve as a reward for the oculomotor system.

  4. Probability

    CERN Document Server

    Shiryaev, A N

    1996-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, ergodic theory, weak convergence of probability measures, stationary stochastic processes, and the Kalman-Bucy filter. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for self-study. This new edition contains substantial revisions and updated references. The reader will find a deeper study of topics such as the distance between probability measures, metrization of weak convergence, and contiguity of probability measures. Proofs for a number of important results which were merely stated in the first edition have been added. The author has also included new material on the probability of large deviations and on the central limit theorem for sums of dependent random variables.

  5. The statistical significance of error probability as determined from decoding simulations for long codes

    Science.gov (United States)

    Massey, J. L.

    1976-01-01

    The very low error probability obtained with long error-correcting codes results in a very small number of observed errors in simulation studies of practical size and renders the usual confidence interval techniques inapplicable to the observed error probability. A natural extension of the notion of a 'confidence interval' is made and applied to such determinations of error probability by simulation. An example is included to show the surprisingly great significance of as few as two decoding errors in a very large number of decoding trials.
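
    The regime described here, a handful of observed errors in a huge number of trials, is the one where exact binomial (Clopper-Pearson) bounds are commonly used today. The sketch below shows that standard interval rather than Massey's own extension, which the abstract does not spell out; the function name is ours:

    ```python
    from scipy.stats import beta

    def clopper_pearson(k: int, n: int, alpha: float = 0.05):
        """Exact two-sided (1 - alpha) confidence interval for an error
        probability, given k observed errors in n decoding trials."""
        lower = beta.ppf(alpha / 2, k, n - k + 1) if k > 0 else 0.0
        upper = beta.ppf(1 - alpha / 2, k + 1, n - k) if k < n else 1.0
        return lower, upper

    # Two errors in a million trials: point estimate 2e-6, but the 95%
    # interval still spans well over an order of magnitude.
    print(clopper_pearson(2, 1_000_000))  # approx (2.4e-07, 7.2e-06)
    ```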

  6. An array-based study of increased system lifetime probability

    DEFF Research Database (Denmark)

    Nesgaard, Carsten

    2003-01-01

    Society's increased dependence on electronic systems calls for highly reliable power supplies comprising multiple converters working in parallel. This paper describes a redundancy control scheme, based on array technology, that increases the overall reliability quite considerably and thereby...

  7. An array-based study of increased system lifetime probability

    DEFF Research Database (Denmark)

    Nesgaard, Carsten

    2002-01-01

    Society's increased dependence on electronic systems calls for highly reliable power supplies comprising multiple converters working in parallel. This paper describes a redundancy control scheme, based on array technology, that increases the overall reliability quite considerably and thereby...

  8. Burden of high fracture probability worldwide: secular increases 2010-2040.

    Science.gov (United States)

    Odén, A; McCloskey, E V; Kanis, J A; Harvey, N C; Johansson, H

    2015-09-01

    The number of individuals aged 50 years or more at high risk of osteoporotic fracture worldwide in 2010 was estimated at 158 million and is set to double by 2040. The aim of this study was to quantify the number of individuals worldwide aged 50 years or more at high risk of osteoporotic fracture in 2010 and 2040. A threshold of high fracture probability was set at the age-specific 10-year probability of a major fracture (clinical vertebral, forearm, humeral or hip fracture) equivalent to that of a woman with a BMI of 24 kg/m² and a prior fragility fracture but no other clinical risk factors. The prevalence of high risk was determined worldwide and by continent using all available country-specific FRAX models, applied to the population demography of each country. Worldwide, 21 million men and 137 million women had a fracture probability at or above the threshold in 2010. The greatest number of men and women at high risk were in Asia (55%). Worldwide, the number of high-risk individuals is expected to double over the next 40 years. We conclude that individuals with a high probability of osteoporotic fracture comprise a very significant disease burden to society, particularly in Asia, and that this burden is set to increase markedly in the future. These analyses provide a platform for the evaluation of risk assessment and intervention strategies.
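
    The head-count logic of the abstract, multiplying the age-specific prevalence of exceeding the FRAX-based threshold by each country's demography and summing, can be sketched in a few lines. Every figure below is an invented placeholder, not a value from the study:

    ```python
    # Hypothetical age-banded population and prevalence of a 10-year
    # fracture probability at or above the high-risk threshold.
    population_by_age = {50: 1_200_000, 60: 1_000_000, 70: 700_000, 80: 350_000}
    prevalence_high_risk = {50: 0.02, 60: 0.06, 70: 0.15, 80: 0.30}

    # Individuals at high risk = sum over age bands of population x prevalence.
    at_high_risk = sum(population_by_age[age] * prevalence_high_risk[age]
                       for age in population_by_age)
    print(f"{at_high_risk:,.0f} individuals at or above the threshold")
    ```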

  9. Econometric analysis of the changing effects in wind strength and significant wave height on the probability of casualty in shipping.

    Science.gov (United States)

    Knapp, Sabine; Kumar, Shashi; Sakurada, Yuri; Shen, Jiajun

    2011-05-01

    This study uses econometric models to measure the effects of significant wave height and wind strength on the probability of casualty and tests whether these effects have changed. While both effects are particularly relevant for stability and strength calculations of vessels, they are also helpful for the development of ship construction standards in general, to counteract increased risk resulting from changing oceanographic conditions. The authors analyzed a unique dataset of 3.2 million observations from 20,729 individual vessels in the North Atlantic and Arctic regions gathered during the period 1979-2007. The results show that although there is a seasonal pattern in the probability of casualty, especially during the winter months, the effects of wind strength and significant wave height do not follow the same seasonal pattern. Additionally, over time, significant wave height shows an increasing effect in January, March, May and October, while wind strength shows a decreasing effect, especially in January, March and May. The models can be used to simulate and better understand these relationships. This is of particular interest to naval architects and ship designers, as well as to multilateral agencies such as the International Maritime Organization (IMO) that establish global standards in ship design and construction. Copyright © 2011 Elsevier Ltd. All rights reserved.
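
    The abstract does not specify the model form, so as a hedged illustration of the general setup: a binary casualty outcome modeled with a logistic link in significant wave height and wind strength. The coefficients below are invented for illustration only:

    ```python
    import math

    def casualty_probability(wave_height_m: float, wind_strength: float,
                             b0: float = -8.0, b_wave: float = 0.35,
                             b_wind: float = 0.10) -> float:
        """Toy logistic model of P(casualty) as a function of significant
        wave height (m) and wind strength; coefficients are illustrative."""
        z = b0 + b_wave * wave_height_m + b_wind * wind_strength
        return 1.0 / (1.0 + math.exp(-z))

    print(casualty_probability(4.0, 6.0))    # moderate conditions
    print(casualty_probability(9.0, 11.0))   # severe winter-storm conditions
    ```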

  10. Increasing Classroom Compliance: Using a High-Probability Command Sequence with Noncompliant Students

    Science.gov (United States)

    Axelrod, Michael I.; Zank, Amber J.

    2012-01-01

    Noncompliance is one of the most problematic behaviors within the school setting. One strategy to increase compliance of noncompliant students is a high-probability command sequence (HPCS; i.e., a set of simple commands in which an individual is likely to comply immediately prior to the delivery of a command that has a lower probability of…

  11. Head multidetector computed tomography: emergency medicine physicians overestimate the pretest probability and legal risk of significant findings.

    Science.gov (United States)

    Baskerville, Jerry Ray; Herrick, John

    2012-02-01

    This study focuses on clinically assigned prospective estimated pretest probability and pretest perception of legal risk as independent variables in the ordering of multidetector computed tomographic (MDCT) head scans. Our primary aim is to measure the association between pretest probability of a significant finding and pretest perception of legal risk. Secondarily, we measure the percentage of MDCT scans that physicians would not order if there were no legal risk. This study is a prospective, cross-sectional, descriptive analysis of patients 18 years and older for whom emergency medicine physicians ordered a head MDCT. We collected a sample of 138 patients subjected to head MDCT scans. The prevalence of a significant finding in our population was 6%, yet the pretest probability expectation of a significant finding was 33%. The presumed legal risk was even more dramatic at 54%. These data support the hypothesis that physicians presume the legal risk to be significantly higher than the risk of a significant finding. A total of 21 patients (15%; 95% confidence interval, ±5.9%) would not have been subjected to MDCT if there were no legal risk. Physicians overestimated the probability that the computed tomographic scan would yield a significant result and indicated an even greater perceived medicolegal risk if the scan was not obtained. Physician test-ordering behavior is complex, and our study queries pertinent aspects of MDCT testing. The magnification of legal risk vs the pretest probability of a significant finding is demonstrated. Physicians significantly overestimated the pretest probability of a significant finding on head MDCT scans and the presumed legal risk. Copyright © 2012 Elsevier Inc. All rights reserved.

  12. Sixteen-Day Bedrest Significantly Increases Plasma Colloid Osmotic Pressure

    Science.gov (United States)

    Hargens, Alan R.; Hsieh, S. T.; Murthy, G.; Ballard, R. E.; Convertino, V. A.; Wade, Charles E. (Technical Monitor)

    1994-01-01

    Upon exposure to microgravity, astronauts lose up to 10% of their total plasma volume, which may contribute to orthostatic intolerance after space flight. Because plasma colloid osmotic pressure (COP) is a primary factor maintaining plasma volume, our objective was to measure time-course changes in COP during microgravity simulated by 6 deg head-down tilt (HDT). Seven healthy male subjects (30-55 years of age) were placed in HDT for 16 days. For the purpose of another study, three of the seven subjects were chosen to exercise on a cycle ergometer on day 16. Blood samples were drawn immediately before bedrest, on day 14 of bedrest, 18-24 hours following exercise (while all subjects were still in HDT), and 1 hour following bedrest termination. Plasma COP was measured in all 20-microliter EDTA-treated samples using an osmometer fitted with a PM 30 membrane. Data were analyzed with paired and unpaired t-tests. Plasma COP on day 14 of bedrest (29.9 +/- 0.69 mmHg) was significantly higher (p less than 0.005) than the control, pre-bedrest value (23.1 +/- 0.76 mmHg). At one hour of upright recovery after HDT, plasma COP remained significantly elevated (exercise: 26.9 +/- 0.87 mmHg; no exercise: 26.3 +/- 0.85 mmHg). Additionally, exercise had no significant effect on plasma COP 18-24 hours following exercise (exercise: 27.8 +/- 1.09 mmHg; no exercise: 27.1 +/- 0.78 mmHg). Our results demonstrate that plasma COP increases significantly with microgravity simulated by HDT. However, preliminary results indicate exercise during HDT does not significantly affect plasma COP.
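
    The abstract states that the data were analyzed with paired and unpaired t-tests; a minimal sketch of the paired comparison. The per-subject values are invented (n = 7 as in the study), loosely guided by the group means quoted above:

    ```python
    from scipy.stats import ttest_rel

    # Invented per-subject plasma COP values (mmHg): pre-bedrest vs.
    # day 14 of head-down tilt. Placeholder data, not the study's.
    pre_bedrest = [22.4, 23.9, 23.0, 24.1, 22.7, 23.5, 22.1]
    day_14_hdt = [29.1, 30.5, 29.8, 31.0, 29.3, 30.2, 29.4]

    t_stat, p_value = ttest_rel(day_14_hdt, pre_bedrest)
    print(f"paired t = {t_stat:.2f}, p = {p_value:.2g}")
    ```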

  13. Significance in the increase of women psychiatrists in Korea.

    Science.gov (United States)

    Kim, Ha Kyoung; Kim, Soo In

    2008-01-01

    The number of female doctors has increased in Korea: in 2006, 18.9% (13,083) of the 69,097 registered medical doctors were women, compared to 13.6% (2,216) in 1975. The proportion of female doctors will rise further by 2010, considering that nearly 40% of today's medical students are women. This trend has had a strong influence on the field of psychiatry: the percentage of women psychiatrists rose from 1.6% (6) in 1975 to 18% (453) in 2006, and women now comprise 39% (206) of all residents. This is not only a reflection of the broader social increase in professional women but is also attributable to some specific characteristics of psychiatry; psychiatric practice may come more naturally to women. While the clinical activities of women psychiatrists are expanding, there are few women leaders, and far fewer women are as yet involved in academic activities in this field. Though there is less sexual discrimination in the field of psychiatry, women psychiatrists still have considerable difficulty balancing work and family matters, and many report having felt implicit discrimination in their careers. In this study, we identify the characteristics of women psychiatrists and explore the significance of their increasing numbers in Korea and the situation in which they find themselves.

  14. Lung scans with significant perfusion defects limited to matching pleural effusions have a low probability of pulmonary embolism

    International Nuclear Information System (INIS)

    Datz, F.L.; Bedont, R.A.; Taylor, A.

    1985-01-01

    Patients with a pleural effusion on chest x-ray often undergo a lung scan to exclude pulmonary embolism (PE). According to other studies, when the scan shows a perfusion defect equal in size to a radiographic abnormality on chest x-ray, the scan should be classified as indeterminate or intermediate probability for PE. However, since those studies dealt primarily with alveolar infiltrates rather than pleural effusions, the authors undertook a retrospective study to determine the probability of PE in patients with pleural effusion and a matching perfusion defect. The authors reviewed 451 scans and x-rays of patients studied for suspected PE. Of those, 53 had moderate or large perfusion defects secondary to pleural effusion without other significant (>25% of a segment) defects on the scan. Final diagnosis was confirmed by pulmonary angiography (16), thoracentesis (40), venography (11), other radiographic and laboratory studies, and clinical course. Of the 53 patients, only 2 had venous thrombotic disease: one had PE on pulmonary angiography, the other had thrombophlebitis on venography. The remainder of the patients had effusions due to congestive heart failure (12), malignancy (12), infection (7), trauma (7), collagen vascular disease (7), sympathetic effusion (3) and unknown etiology (3). The authors conclude that lung scans with significant perfusion defects limited to matching pleural effusions on chest x-ray have a low probability for PE.

  15. Increasing the statistical significance of entanglement detection in experiments.

    Science.gov (United States)

    Jungnitsch, Bastian; Niekamp, Sönke; Kleinmann, Matthias; Gühne, Otfried; Lu, He; Gao, Wei-Bo; Chen, Yu-Ao; Chen, Zeng-Bing; Pan, Jian-Wei

    2010-05-28

    Entanglement is often verified by a violation of an inequality like a Bell inequality or an entanglement witness. Considerable effort has been devoted to the optimization of such inequalities in order to obtain a high violation. We demonstrate theoretically and experimentally that such an optimization does not necessarily lead to a better entanglement test, if the statistical error is taken into account. Theoretically, we show for different error models that reducing the violation of an inequality can improve the significance. Experimentally, we observe this phenomenon in a four-photon experiment, testing the Mermin and Ardehali inequality for different levels of noise. Furthermore, we provide a way to develop entanglement tests with high statistical significance.

  16. Heightened fire probability in Indonesia in non-drought conditions: the effect of increasing temperatures

    Science.gov (United States)

    Fernandes, Kátia; Verchot, Louis; Baethgen, Walter; Gutierrez-Velez, Victor; Pinedo-Vasquez, Miguel; Martius, Christopher

    2017-05-01

    In Indonesia, drought-driven fires typically occur during the warm phase of the El Niño Southern Oscillation. This was the case in the events of 1997 and 2015, which resulted in months-long hazardous atmospheric pollution levels in Equatorial Asia and record greenhouse gas emissions. Nonetheless, anomalously active fire seasons have also been observed in non-drought years. In this work, we investigated the impact of temperature on fires and found that when the July-October (JASO) period is anomalously dry, the sensitivity of fires to temperature is modest. In contrast, under normal-to-wet conditions, fire probability increases sharply when JASO is anomalously warm. This describes a regime in which an active fire season is not limited to drought years. Greater susceptibility to fires in response to a warmer environment finds support in the high evapotranspiration rates observed in normal-to-wet and warm conditions in Indonesia. We also find that fire probability in wet JASOs would be considerably less sensitive to temperature were it not for the added effect of recent positive trends. Near-term regional climate projections reveal that, despite negligible changes in precipitation, a continuing warming trend will heighten fire probability over the next few decades, especially in non-drought years. Mild fire seasons, currently observed in association with wet conditions and cool temperatures, will become rare events in Indonesia.

  17. Increasing the statistical significance of entanglement detection in experiments

    Energy Technology Data Exchange (ETDEWEB)

    Jungnitsch, Bastian; Niekamp, Soenke; Kleinmann, Matthias; Guehne, Otfried [Institut fuer Quantenoptik und Quanteninformation, Innsbruck (Austria); Lu, He; Gao, Wei-Bo; Chen, Zeng-Bing [Hefei National Laboratory for Physical Sciences at Microscale and Department of Modern Physics, University of Science and Technology of China, Hefei (China); Chen, Yu-Ao; Pan, Jian-Wei [Hefei National Laboratory for Physical Sciences at Microscale and Department of Modern Physics, University of Science and Technology of China, Hefei (China); Physikalisches Institut, Universitaet Heidelberg (Germany)

    2010-07-01

    Entanglement is often verified by a violation of an inequality like a Bell inequality or an entanglement witness. Considerable effort has been devoted to the optimization of such inequalities in order to obtain a high violation. We demonstrate theoretically and experimentally that such an optimization does not necessarily lead to a better entanglement test, if the statistical error is taken into account. Theoretically, we show for different error models that reducing the violation of an inequality can improve the significance. We show this to be the case for an error model in which the variance of an observable is interpreted as its error and for the standard error model in photonic experiments. Specifically, we demonstrate that the Mermin inequality yields a Bell test which is statistically more significant than the Ardehali inequality in the case of a photonic four-qubit state that is close to a GHZ state. Experimentally, we observe this phenomenon in a four-photon experiment, testing the above inequalities for different levels of noise.
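
    Both records of this paper turn on the same figure of merit: the violation measured in units of its statistical error rather than the raw violation. A sketch of that quantity in standard notation (the paper's own notation may differ):

    ```latex
    % Statistical significance of a witness or Bell-inequality violation:
    % the excess of the measured expectation value over the classical
    % (or separable) bound b, in units of its standard error.
    S = \frac{\langle \mathcal{B} \rangle - b}{\sigma\!\left(\langle \mathcal{B} \rangle\right)}
    % A smaller violation with a much smaller error can yield a larger S,
    % which is the effect demonstrated in the experiment above.
    ```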

  18. Diagnosis of time of increased probability of volcanic earthquakes at Mt. Vesuvius zone

    CERN Document Server

    Rotwain, I; Kuznetsov, I V; Panza, G F; Peresan, A

    2003-01-01

    The possibility of intermediate-term earthquake prediction at Mt. Vesuvius by means of the algorithm CN is explored. CN was originally designed to identify the Times of Increased Probability (TIPs) for the occurrence of strong tectonic earthquakes, with magnitude M ≥ M₀, within a region delimited a priori. Here the algorithm CN is applied, for the first time, to the analysis of volcanic seismicity. The earthquakes recorded at Mt. Vesuvius during the period from February 1972 to October 2002 are considered, and the magnitude threshold M₀, selecting the events to be predicted, is varied within the range 3.0-3.3. Satisfactory prediction results are obtained, by retrospective analysis, when a time scaling is introduced. In particular, when the length of the time windows is reduced by a factor 2.5-3 with respect to the standard version of the CN algorithm, more than 90% of the events with M ≥ M₀ occur within the TIP intervals, with TIPs occupying about 30% of the total time considered. The co...

  19. APP Homodimers Transduce an Amyloid-β-Mediated Increase in Release Probability at Excitatory Synapses

    Directory of Open Access Journals (Sweden)

    Hilla Fogel

    2014-06-01

    Accumulation of amyloid-β peptides (Aβ), the proteolytic products of the amyloid precursor protein (APP), induces a variety of synaptic dysfunctions ranging from hyperactivity to depression that are thought to cause cognitive decline in Alzheimer's disease. While depression of synaptic transmission has been extensively studied, the mechanisms underlying synaptic hyperactivity remain unknown. Here, we show that Aβ40 monomers and dimers augment release probability through local fine-tuning of APP-APP interactions at excitatory hippocampal boutons. Aβ40 binds to the APP, increases the APP homodimer fraction at the plasma membrane, and promotes APP-APP interactions. The APP activation induces structural rearrangements in the APP/Gi/o-protein complex, boosting presynaptic calcium flux and vesicle release. The APP growth-factor-like domain (GFLD) mediates APP-APP conformational changes and presynaptic enhancement. Thus, the APP homodimer constitutes a presynaptic receptor that transduces signal from Aβ40 to glutamate release. Excessive APP activation may initiate a positive feedback loop, contributing to hippocampal hyperactivity in Alzheimer's disease.

  20. Diagnosis of time of increased probability of volcanic earthquakes at Mt. Vesuvius zone

    International Nuclear Information System (INIS)

    Rotwain, I.; Kuznetsov, I.; De Natale, G.; Peresan, A.; Panza, G.F.

    2003-06-01

    The possibility of intermediate-term earthquake prediction at Mt. Vesuvius by means of the algorithm CN is explored. CN was originally designed to identify the Times of Increased Probability (TIPs) for the occurrence of strong tectonic earthquakes, with magnitude M ≥ M₀, within a region delimited a priori. Here the algorithm CN is applied, for the first time, to the analysis of volcanic seismicity. The earthquakes recorded at Mt. Vesuvius during the period from February 1972 to October 2002 are considered, and the magnitude threshold M₀, selecting the events to be predicted, is varied within the range 3.0-3.3. Satisfactory prediction results are obtained, by retrospective analysis, when a time scaling is introduced. In particular, when the length of the time windows is reduced by a factor 2.5-3 with respect to the standard version of the CN algorithm, more than 90% of the events with M ≥ M₀ occur within the TIP intervals, with TIPs occupying about 30% of the total time considered. The control experiment 'Seismic History' demonstrates the stability of the obtained results and indicates that the algorithm CN can be applied to monitor the preparation of impending earthquakes with M ≥ 3.0 at Mt. Vesuvius. (author)

  1. The Brazilian policy of withholding treatment for ADHD is probably increasing health and social costs

    Directory of Open Access Journals (Sweden)

    Carlos R. Maia

    2015-03-01

    Objective: To estimate the economic consequences of the current Brazilian government policy for attention-deficit/hyperactivity disorder (ADHD) treatment and how much the country would save if treatment with immediate-release methylphenidate (MPH-IR), as suggested by the World Health Organization (WHO), was offered to patients with ADHD. Method: Based on conservative previous analyses, we assumed that 257,662 patients aged 5 to 19 years are not receiving ADHD treatment in Brazil. We estimated the direct costs and savings of treating and not treating ADHD on the basis of the following data: (a) spending on ADHD patients directly attributable to grade retention and emergency department visits; and (b) savings due to the impact of ADHD treatment on these outcomes. Results: Considering outcomes for which data on the impact of MPH-IR treatment are available, Brazil is probably wasting approximately R$ 1.841 billion/year on the direct consequences of not treating ADHD in this age range alone. On the other hand, treating ADHD in accordance with WHO recommendations would save approximately R$ 1.163 billion/year. Conclusions: By increasing investment in MPH-IR treatment for ADHD to around R$ 377 million/year, the country would save approximately 3.1 times more than is currently spent on the consequences of not treating ADHD in patients aged 5 to 19 years.

  2. Resveratrol enhances airway surface liquid depth in sinonasal epithelium by increasing cystic fibrosis transmembrane conductance regulator open probability.

    Directory of Open Access Journals (Sweden)

    Shaoyan Zhang

    Chronic rhinosinusitis engenders enormous morbidity in the general population, and is often refractory to medical intervention. Compounds that augment mucociliary clearance in airway epithelia represent a novel treatment strategy for diseases of mucus stasis. A dominant fluid and electrolyte secretory pathway in the nasal airways is governed by the cystic fibrosis transmembrane conductance regulator (CFTR). The objectives of the present study were to test resveratrol, a strong potentiator of CFTR channel open probability, in preparation for a clinical trial of mucociliary activators in human sinus disease. Primary sinonasal epithelial cells, immortalized bronchoepithelial cells (wild type and F508del CFTR), and HEK293 cells expressing exogenous human CFTR were investigated by Ussing chamber as well as patch clamp technique under non-phosphorylating conditions. Effects on airway surface liquid depth were measured using confocal laser scanning microscopy. Impact on CFTR gene expression was measured by quantitative reverse transcriptase polymerase chain reaction. Resveratrol is a robust CFTR channel potentiator in numerous mammalian species. The compound also activated temperature-corrected F508del CFTR and enhanced CFTR-dependent chloride secretion in human sinus epithelium ex vivo to an extent comparable to the recently approved CFTR potentiator, ivacaftor. Using inside-out patches from apical membranes of murine cells, resveratrol stimulated an ~8 picosiemens chloride channel consistent with CFTR. This observation was confirmed in HEK293 cells expressing exogenous CFTR. Treatment of sinonasal epithelium resulted in a significant increase in airway surface liquid depth (8.08 ± 1.68 µm vs. 6.11 ± 0.47 µm in controls, p<0.05). There was no increase in CFTR mRNA. Resveratrol is a potent chloride secretagogue at the mucosal surface of sinonasal epithelium, and hydrates airway surface liquid by increasing CFTR channel open probability. The foundation for a...

  3. Significant increase of Echinococcus multilocularis prevalence in foxes, but no increased predicted risk for humans

    NARCIS (Netherlands)

    Maas, M.; Dam-Deisz, W.D.C.; Roon, van A.M.; Takumi, K.; Giessen, van der J.W.B.

    2014-01-01

    The emergence of the zoonotic tapeworm Echinococcus multilocularis, causative agent of alveolar echinococcosis (AE), poses a public health risk. A previously designed risk map model predicted a spread of E. multilocularis and increasing numbers of alveolar echinococcosis patients in the province of

  4. Increasing vaginal progesterone gel supplementation after frozen-thawed embryo transfer significantly increases the delivery rate

    DEFF Research Database (Denmark)

    Alsbjerg, Birgit; Polyzos, Nikolaos P; Elbaek, Helle Olesen

    2013-01-01

    The aim of this study was to evaluate the reproductive outcome in patients receiving frozen-thawed embryo transfer before and after a doubling of the vaginal progesterone gel supplementation. It was a retrospective study performed at the Fertility Clinic, Skive Regional Hospital, Denmark. ... A total of 346 infertility patients with oligoamenorrhoea undergoing frozen-thawed embryo transfer after priming with oestradiol and vaginal progesterone gel were included. The vaginal progesterone dose was changed from 90 mg (Crinone) once a day to twice a day, and the reproductive outcome during the two ... rate (8.7% versus 20.5%, respectively; P=0.002). Doubling the vaginal progesterone gel supplementation during frozen-thawed embryo transfer cycles decreased the early pregnancy loss rate, resulting in a significantly higher delivery rate. This study evaluated the reproductive outcome of 346 women...

  5. Reduced probability of smoking cessation in men with increasing number of job losses and partnership breakdowns

    DEFF Research Database (Denmark)

    Kriegbaum, Margit; Larsen, Anne Mette; Christensen, Ulla

    2011-01-01

    and to study joint exposure to both. Methods: Birth cohort study of smoking cessation among 6232 Danish men born in 1953, with follow-up at age 51 (response rate 66.2%). History of unemployment and cohabitation was measured annually using register data. Information on smoking cessation was obtained ... by a questionnaire. Results: The probability of smoking cessation decreased with the number of job losses (ranging from OR 0.54 (95% CI 0.46 to 0.64) for 1 to OR 0.41 (95% CI 0.30 to 0.55) for 3+) and with the number of broken partnerships (ranging from OR 0.74 (95% CI 0.63 to 0.85) for 1 to OR 0.50 (95% CI 0.39 to 0.63) for 3+). Furthermore ...-23 years (OR 0.44, 95% CI 0.37 to 0.52). Those who never cohabited and experienced one or more job losses had a particularly low chance of smoking cessation (OR 0.19, 95% CI 0.12 to 0.30). Conclusion: The numbers of job losses and of broken partnerships were both inversely associated with the probability of smoking cessation.

  6. Multiple fields may offer better esophagus sparing without increased probability of lung toxicity in optimized IMRT of lung tumors

    International Nuclear Information System (INIS)

    Chapet, Olivier; Fraass, Benedick A.; Haken, Randall K. ten

    2006-01-01

    Purpose: To evaluate whether increasing numbers of intensity-modulated radiation therapy (IMRT) fields enhance lung-tumor dose without additional predicted toxicity for difficult planning geometries. Methods and Materials: Data from 8 previous three-dimensional conformal radiation therapy (3D-CRT) patients, with tumors located in various regions of each lung but with planning target volumes (PTVs) overlapping part of the esophagus, were used as input. Four optimized-beamlet IMRT plans (1 plan that used the 3D-CRT beam arrangement and 3 plans with 3, 5, or 7 axial, but predominantly one-sided, fields) were compared. For IMRT, the equivalent uniform dose (EUD) in the whole PTV was optimized simultaneously with that in a reduced PTV exclusive of the esophagus. Normal-tissue complication probability-based costlets were used for the esophagus, heart, and lung. Results: Overall, IMRT plans (optimized by use of EUD to judiciously allow relaxed PTV dose homogeneity) result in better minimum PTV isodose surface coverage and better average EUD values than does conformal planning; dose generally increases with the number of fields. Even 7-field plans do not significantly alter normal-lung mean-dose values or lung volumes that receive more than 13, 20, or 30 Gy. Conclusion: Optimized many-field IMRT plans can lead to escalated lung-tumor dose in the special case of esophagus overlapping PTV, without unacceptable alteration of the dose distribution in normal lung.
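
    The plan scoring above is based on equivalent uniform dose. For background, the generalized EUD commonly used for such costlets is Niemierko's power-law form; the paper may use a variant, so this is stated as an assumption:

    ```latex
    % Generalized EUD over a structure whose fractional volumes v_i
    % receive doses D_i; the exponent a is tissue-specific
    % (a >> 1 for serial organs, a < 0 penalizes cold spots in targets).
    \mathrm{EUD} = \left( \sum_i v_i \, D_i^{\,a} \right)^{1/a}
    ```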

  7. Probability of burn-through of defective 13 kA splices at increased energy levels

    CERN Document Server

    Verweij, A

    2011-01-01

    In many 13 kA splices in the machine there is a lack of bonding between the superconducting cable and the stabilising copper, along with a bad contact between the bus stabiliser and the splice stabiliser. In case of a quench of such a defective splice, the current cannot bypass the cable through the copper, leading to excessive local heating of the cable. This may result in a thermal runaway and burn-through of the cable in a time smaller than the time constant of the circuit. Since it is not possible to protect against this fast thermal runaway, one has to limit the current to a level small enough that a burn-through cannot occur. Prompt quenching of the joint, and quenching due to heat propagation through the bus and through the helium, are considered. Probabilities of joint burn-through are given for the RB circuit for beam energies of 3.5, 4 and 4.5 TeV, and a decay time constant of the RB circuit of 50 and 68 s.

  8. Reading the copepod personal ads : increasing encounter probability with hydromechanical signals

    NARCIS (Netherlands)

    van Duren, LA; Stamhuis, EJ; Videler, JJ

    1998-01-01

    Females of the calanoid copepod Temora longicornis react to chemical exudates of male conspecifics with little hops, quite distinct from their normal smooth uniform swimming motion. These hops possibly serve to create a hydrodynamical signal in the surrounding water, to increase encounter

  9. Probable causes of increasing brucellosis in free-ranging elk of the Greater Yellowstone Ecosystem

    Science.gov (United States)

    Cross, P.C.; Cole, E.K.; Dobson, A.P.; Edwards, W.H.; Hamlin, K.L.; Luikart, G.; Middleton, A.D.; Scurlock, B.M.; White, P.J.

    2010-01-01

    While many wildlife species are threatened, some populations have recovered from previous overexploitation, and data linking these population increases with disease dynamics are limited. We present data suggesting that free-ranging elk (Cervus elaphus) are a maintenance host for Brucella abortus in new areas of the Greater Yellowstone Ecosystem (GYE). Brucellosis seroprevalence in free-ranging elk increased from 0-7% in 1991-1992 to 8-20% in 2006-2007 in four of six herd units around the GYE. These levels of brucellosis are comparable to some herd units where elk are artificially aggregated on supplemental feeding grounds. There are several possible mechanisms for this increase that we evaluated using statistical and population modeling approaches. Simulations of an age-structured population model suggest that the observed levels of seroprevalence are unlikely to be sustained by dispersal from supplemental feeding areas with relatively high seroprevalence or an older age structure. Increases in brucellosis seroprevalence and the total elk population size in areas with feeding grounds have not been statistically detectable. Meanwhile, the rate of seroprevalence increase outside the feeding grounds was related to the population size and density of each herd unit. Therefore, the data suggest that enhanced elk-to-elk transmission in free-ranging populations may be occurring due to larger winter elk aggregations. Elk populations inside and outside of the GYE that traditionally did not maintain brucellosis may now be at risk due to recent population increases. In particular, some neighboring populations of Montana elk were 5-9 times larger in 2007 than in the 1970s, with some aggregations comparable to the Wyoming feeding-ground populations. Addressing the unintended consequences of these increasing populations is complicated by limited hunter access to private lands, which places many ungulate populations out of administrative control. Agency-landowner hunting access

  10. Sugar administration to newly emerged Aedes albopictus males increases their survival probability and mating performance.

    Science.gov (United States)

    Bellini, Romeo; Puggioli, Arianna; Balestrino, Fabrizio; Brunelli, Paolo; Medici, Anna; Urbanelli, Sandra; Carrieri, Marco

    2014-04-01

    Aedes albopictus male survival in laboratory cages is no more than 4-5 days when the males are kept without access to sugar, indicating their need to feed on a sugar source soon after emergence. We therefore developed a device to administer energetic substances to newly emerged males when released as pupae as part of a sterile insect technique (SIT) programme. The device is made of a polyurethane sponge 4 cm thick, perforated with holes 2 cm in diameter. The sponge was imbibed with the required sugar solution and, owing to its high retention capacity, the solution remained available for males to feed on for at least 48 h. When evaluated in lab cages, comparing adults emerged from the device with sugar solution vs the device with water only (as negative control), about half of the males tested positive for fructose using the Van Handel anthrone test, compared to none of the males in the control cage. We then tested the tool in semi-field and field conditions with different sugar concentrations (10%, 15%, and 20%) and compared results to controls fed with water only. Males were recaptured with a battery-operated manual aspirator at 24 and 48 h after pupae release. A rather high share, 10-25%, of males captured in the vicinity of the control stations tested positive for fructose, while around 40-55% of males in the vicinity of the sugar stations were positive, though variability between replicates was large. The sugar-positive males in the control test may have been released males that had access to natural sugar sources close to the release station and/or wild males present in the environment. Only a slight increase in the proportion of positive males was obtained by increasing the sugar concentration in the feeding device from 10% to 20%. Surprisingly, modifying the device by adding an inverted black plastic funnel above the container reduced rather than increased the proportion of fructose-positive males collected around the station. No evidence of difference in the...

  11. Increasing Fractional Doses Increases the Probability of Benign PSA Bounce in Patients Undergoing Definitive HDR Brachytherapy for Prostate Cancer

    Energy Technology Data Exchange (ETDEWEB)

    Hauck, Carlin R.; Ye, Hong; Chen, Peter Y.; Gustafson, Gary S.; Limbacher, Amy; Krauss, Daniel J., E-mail: Daniel.krauss@beaumont.edu

    2017-05-01

    Purpose: Prostate-specific antigen (PSA) bounce is a temporary elevation of the PSA level above a prior nadir. The purpose of this study was to determine whether the frequency of a PSA bounce following high-dose-rate (HDR) interstitial brachytherapy for the treatment of prostate cancer is associated with individual treatment fraction size. Methods and Materials: Between 1999 and 2014, 554 patients underwent treatment of low- or intermediate-risk prostate cancer with definitive HDR brachytherapy as monotherapy and had ≥3 subsequent PSA measurements. Four different fraction sizes were used: 950 cGy × 4 fractions, 1200 cGy × 2 fractions, 1350 cGy × 2 fractions, 1900 cGy × 1 fraction. Four definitions of PSA bounce were applied: ≥0.2, ≥0.5, ≥1.0, and ≥2.0 ng/mL above the prior nadir with a subsequent return to the nadir. Results: The median follow-up period was 3.7 years. The actuarial 3-year rate of PSA bounce for the entire cohort was 41.3%, 28.4%, 17.4%, and 6.8% for nadir +0.2, +0.5, +1.0, and +2.0 ng/mL, respectively. The 3-year rate of PSA bounce >0.2 ng/mL was 42.2%, 32.1%, 41.0%, and 59.1% for the 950-, 1200-, 1350-, and 1900-cGy/fraction levels, respectively (P=.002). The hazard ratio for bounce >0.2 ng/mL for patients receiving a single fraction of 1900 cGy compared with those receiving treatment in multiple fractions was 1.786 (P=.024). For patients treated with a single 1900-cGy fraction, the 1-, 2-, and 3-year rates of PSA bounce exceeding the Phoenix biochemical failure definition (nadir +2 ng/mL) were 4.5%, 18.7%, and 18.7%, respectively, higher than the rates for all other administered dose levels (P=.025). Conclusions: The incidence of PSA bounce increases with single-fraction HDR treatment. Knowledge of posttreatment PSA kinetics may aid in decision making regarding management of potential biochemical failures.
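
    The four bounce definitions above share one operational rule: a rise of at least some threshold above the running PSA nadir, followed by a return to that nadir. A minimal sketch of that rule; the function name and example series are illustrative, not from the study:

    ```python
    def has_psa_bounce(psa_series, delta=0.2):
        """Flag a PSA bounce: a rise of >= delta ng/mL above the running
        nadir followed by a later return to (or below) that nadir."""
        nadir = psa_series[0]
        rose_above = False
        for value in psa_series[1:]:
            if rose_above and value <= nadir:
                return True             # bounce completed: back at nadir
            if value >= nadir + delta:
                rose_above = True       # candidate bounce in progress
            nadir = min(nadir, value)   # the nadir only ever decreases
        return False

    # Nadir 1.0, rise to 1.6 (>= nadir + 0.5), return to 0.9 -> bounce.
    print(has_psa_bounce([4.1, 1.0, 1.6, 0.9, 0.7], delta=0.5))  # True
    ```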

  12. Increasing Fractional Doses Increases the Probability of Benign PSA Bounce in Patients Undergoing Definitive HDR Brachytherapy for Prostate Cancer

    International Nuclear Information System (INIS)

    Hauck, Carlin R.; Ye, Hong; Chen, Peter Y.; Gustafson, Gary S.; Limbacher, Amy; Krauss, Daniel J.

    2017-01-01

    Purpose: Prostate-specific antigen (PSA) bounce is a temporary elevation of the PSA level above a prior nadir. The purpose of this study was to determine whether the frequency of a PSA bounce following high-dose-rate (HDR) interstitial brachytherapy for the treatment of prostate cancer is associated with individual treatment fraction size. Methods and Materials: Between 1999 and 2014, 554 patients underwent treatment of low- or intermediate-risk prostate cancer with definitive HDR brachytherapy as monotherapy and had ≥3 subsequent PSA measurements. Four different fraction sizes were used: 950 cGy × 4 fractions, 1200 cGy × 2 fractions, 1350 cGy × 2 fractions, 1900 cGy × 1 fraction. Four definitions of PSA bounce were applied: ≥0.2, ≥0.5, ≥1.0, and ≥2.0 ng/mL above the prior nadir with a subsequent return to the nadir. Results: The median follow-up period was 3.7 years. The actuarial 3-year rate of PSA bounce for the entire cohort was 41.3%, 28.4%, 17.4%, and 6.8% for nadir +0.2, +0.5, +1.0, and +2.0 ng/mL, respectively. The 3-year rate of PSA bounce >0.2 ng/mL was 42.2%, 32.1%, 41.0%, and 59.1% for the 950-, 1200-, 1350-, and 1900-cGy/fraction levels, respectively (P=.002). The hazard ratio for bounce >0.2 ng/mL for patients receiving a single fraction of 1900 cGy compared with those receiving treatment in multiple fractions was 1.786 (P=.024). For patients treated with a single 1900-cGy fraction, the 1-, 2-, and 3-year rates of PSA bounce exceeding the Phoenix biochemical failure definition (nadir +2 ng/mL) were 4.5%, 18.7%, and 18.7%, respectively, higher than the rates for all other administered dose levels (P=.025). Conclusions: The incidence of PSA bounce increases with single-fraction HDR treatment. Knowledge of posttreatment PSA kinetics may aid in decision making regarding management of potential biochemical failures.

  13. [False positive results or what's the probability that a significant P-value indicates a true effect?

    Science.gov (United States)

    Cucherat, Michel; Laporte, Silvy

    2017-09-01

    The use of statistical tests is central to the clinical trial. At the statistical level, a significant P-value provides only limited information about the plausibility of the existence of a treatment effect. Even with a significant P-value, the probability that the result is a false positive can be very high: this is the case if the power is low, if there is inflation of the alpha risk, or if the result is exploratory or a chance discovery. This possibility is important to take into consideration when interpreting the results of clinical trials, in order to avoid promoting results that are significant in appearance but likely to be false positives. Copyright © 2017 Société française de pharmacologie et de thérapeutique. Published by Elsevier Masson SAS. All rights reserved.
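
    The abstract's argument is the standard positive-predictive-value calculation for a significant result: with prior probability π that a real effect exists, power 1−β, and significance level α, Bayes' rule gives the textbook formula below (not quoted from the paper):

    ```latex
    % Probability that a significant result reflects a true effect:
    P(\text{true effect} \mid p < \alpha)
      = \frac{(1-\beta)\,\pi}{(1-\beta)\,\pi + \alpha\,(1-\pi)}
    % Example: pi = 0.1, power = 0.30, alpha = 0.05 gives
    % 0.03 / (0.03 + 0.045) = 0.40, i.e. 60% of "significant"
    % findings would be false positives.
    ```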

  14. Test the Overall Significance of p-values by Using Joint Tail Probability of Ordered p-values as Test Statistic

    NARCIS (Netherlands)

    Fang, Yongxiang; Wit, Ernst

    2008-01-01

    Fisher's combined probability test is the most commonly used method to test the overall significance of a set of independent p-values. However, Fisher's statistic is clearly more sensitive to smaller p-values than to larger ones, and a single small p-value may overrule the other p-values

  15. Test the Overall Significance of p-values by Using Joint Tail Probability of Ordered p-values as Test Statistic

    OpenAIRE

    Fang, Yongxiang; Wit, Ernst

    2008-01-01

    Fisher's combined probability test is the most commonly used method to test the overall significance of a set of independent p-values. However, Fisher's statistic is clearly more sensitive to smaller p-values than to larger ones, and a single small p-value may overrule the other p-values and decide the test result. This is, in some cases, viewed as a flaw. In order to overcome this flaw and improve the power of the test, the joint tail probability of a set of p-values is proposed as a ...
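
    For reference, the Fisher statistic that both of these records criticize combines k independent p-values as X² = −2 Σ ln pᵢ, which is chi-square distributed with 2k degrees of freedom under the global null. A minimal sketch (the authors' proposed joint-tail-probability statistic for ordered p-values is not reproduced here):

    ```python
    import math
    from scipy.stats import chi2

    def fisher_combined(p_values):
        """Fisher's method: X^2 = -2 * sum(ln p_i) follows a chi-square
        distribution with 2k degrees of freedom under the global null."""
        statistic = -2.0 * sum(math.log(p) for p in p_values)
        return chi2.sf(statistic, df=2 * len(p_values))

    # A single tiny p-value dominates the combination -- exactly the
    # sensitivity the abstract above objects to:
    print(fisher_combined([1e-6, 0.8, 0.9, 0.7]))  # ~3e-4, "significant"
    ```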

  16. Global Trends in Alzheimer Disease Clinical Development: Increasing the Probability of Success.

    Science.gov (United States)

    Sugino, Haruhiko; Watanabe, Akihito; Amada, Naoki; Yamamoto, Miho; Ohgi, Yuta; Kostic, Dusan; Sanchez, Raymond

    2015-08-01

    Alzheimer disease (AD) is a growing global health and economic issue as elderly populations increase dramatically across the world. Despite the many clinical trials conducted, currently no approved disease-modifying treatment exists. In this commentary, the present status of AD drug development and the grounds for collaborations between government, academia, and industry to accelerate the development of disease-modifying AD therapies are discussed. Official government documents, literature, and news releases were surveyed by MEDLINE and website research. Currently approved anti-AD drugs provide only short-lived symptomatic improvements, which have no effect on the underlying pathogenic mechanisms or progression of the disease. The failure to approve a disease-modifying drug for AD may be because the progression of AD in the patient populations enrolled in clinical studies was too advanced for drugs to demonstrate cognitive and functional improvements. The US Food and Drug Administration and the European Medicines Agency recently published draft guidance for industry which discusses approaches for conducting clinical studies with patients in early AD stages. For successful clinical trials in early-stage AD, however, it will be necessary to identify biomarkers highly correlated with the clinical onset and the longitudinal progress of AD. In addition, because of the high cost and length of clinical AD studies, support in the form of global initiatives and collaborations between government, industry, and academia is needed. In response to this situation, national guidance and international collaborations have been established. Global initiatives are focusing on 2025 as a goal to provide new treatment options, and early signs of success in biomarker and drug development are already emerging. Copyright © 2015 Elsevier HS Journals, Inc. All rights reserved.

  17. Alcohol Increases Delay and Probability Discounting of Condom-Protected Sex: A Novel Vector for Alcohol-Related HIV Transmission.

    Science.gov (United States)

    Johnson, Patrick S; Sweeney, Mary M; Herrmann, Evan S; Johnson, Matthew W

    2016-06-01

    Alcohol use, especially at binge levels, is associated with sexual HIV risk behavior, but the mechanisms through which alcohol increases sexual risk taking are not well-examined. Delay discounting, that is, devaluation of future consequences as a function of delay to their occurrence, has been implicated in a variety of problem behaviors, including risky sexual behavior. Probability discounting is studied within a similar framework as delay discounting, but is a distinct process in which a consequence is devalued because it is uncertain or probabilistic. Twenty-three nondependent alcohol users (13 male, 10 female; mean age = 25.3 years) orally consumed alcohol (1 g/kg) or placebo in 2 separate experimental sessions. During sessions, participants completed tasks examining delay and probability discounting of hypothetical condom-protected sex (Sexual Delay Discounting Task, Sexual Probability Discounting Task) and of hypothetical and real money. Alcohol decreased the likelihood that participants would wait to have condom-protected sex versus having immediate, unprotected sex. Alcohol also decreased the likelihood that participants would use an immediately available condom given a specified level of sexually transmitted infection (STI) risk. Alcohol did not affect delay discounting of money, but it did increase participants' preferences for larger, probabilistic monetary rewards over smaller, certain rewards. Acute, binge-level alcohol intoxication may increase sexual HIV risk by decreasing willingness to delay sex in order to acquire a condom in situations where one is not immediately available, and by decreasing sensitivity to perceived risk of STI contraction. These findings suggest that delay and probability discounting are critical, but heretofore unrecognized, processes that may mediate the relations between alcohol use and HIV risk. Copyright © 2016 by the Research Society on Alcoholism.
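
    The two discounting constructs in this study are conventionally fit with hyperbolic forms. For background, the standard behavioral-economics models are shown below; the paper's exact task specification may differ:

    ```latex
    % Delay discounting: subjective value V of an outcome of amount A
    % delivered after delay D, with discount rate k.
    V_{\text{delay}} = \frac{A}{1 + kD}
    % Probability discounting: delay is replaced by the odds against
    % receiving the outcome, theta = (1 - p)/p, with discount rate h.
    V_{\text{prob}} = \frac{A}{1 + h\,\theta}, \qquad \theta = \frac{1 - p}{p}
    ```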

  18. Increased probability of repetitive spinal motoneuron activation by transcranial magnetic stimulation after muscle fatigue in healthy subjects

    DEFF Research Database (Denmark)

    Andersen, Birgit; Felding, Ulrik Ascanius; Krarup, Christian

    2012-01-01

    Triple stimulation technique (TST) has previously shown that transcranial magnetic stimulation (TMS) fails to activate a proportion of spinal motoneurons (MNs) during motor fatigue. The TST response depression without attenuation of the conventional motor evoked potential suggested an increased probability of repetitive spinal MN activation during exercise, even if some MNs failed to discharge in response to the brain stimulus. Here we used a modified TST (quadruple stimulation, QuadS, and quintuple stimulation, QuintS) to examine the influence of fatiguing exercise on second and third MN discharges after ..., reflecting that a greater proportion of spinal MNs were activated 2 or 3 times by the transcranial stimulus. The size of QuadS responses did not return to pre-contraction levels during the 10 min observation time, indicating a long-lasting increase in excitatory input to spinal MNs. In addition, the post...

  19. Continuous background light significantly increases flashing-light enhancement of photosynthesis and growth of microalgae.

    Science.gov (United States)

    Abu-Ghosh, Said; Fixler, Dror; Dubinsky, Zvy; Iluz, David

    2015-01-01

    Under specific conditions, flashing light enhances the photosynthesis rate in comparison to continuous illumination. Here we show that a combination of flashing light and continuous background light, with the same integrated photon dose as continuous or flashing light alone, can be used to significantly enhance photosynthesis and increase microalgae growth. To test this hypothesis, the green microalga Dunaliella salina was exposed to three different light regimes: continuous light, flashing light, and the concomitant application of both. Algal growth was compared under three different integrated light quantities: low, intermediate, and moderately high. Under the combined light regime, there was a substantial increase in all algal growth parameters, with an enhanced photosynthesis rate, within 3 days. Our strategy demonstrates a hitherto undescribed significant increase in photosynthesis and algal growth rates, beyond the increase achieved by flashing light alone. Copyright © 2015 Elsevier Ltd. All rights reserved.

  20. St. John's wort significantly increased the systemic exposure and toxicity of methotrexate in rats

    International Nuclear Information System (INIS)

    Yang, Shih-Ying; Juang, Shin-Hun; Tsai, Shang-Yuan; Chao, Pei-Dawn Lee; Hou, Yu-Chi

    2012-01-01

    St. John's wort (SJW, Hypericum perforatum) is one of the popular nutraceuticals for treating depression. Methotrexate (MTX) is an immunosuppressant with a narrow therapeutic window. This study investigated the effect of SJW on MTX pharmacokinetics in rats. Rats were orally given MTX alone or coadministered with 300 or 150 mg/kg of SJW, or 25 mg/kg of diclofenac. Blood was withdrawn at specific time points and serum MTX concentrations were assayed by a specific monoclonal fluorescence polarization immunoassay method. The results showed that 300 mg/kg of SJW significantly increased the AUC(0-t) and Cmax of MTX by 163% and 60%, respectively, and 150 mg/kg of SJW significantly increased the AUC(0-t) of MTX by 55%. In addition, diclofenac enhanced the Cmax of MTX by 110%. The mortality of rats treated with SJW was higher than that of controls. In conclusion, coadministration of SJW significantly increased the systemic exposure and toxicity of MTX. The combined use of MTX with SJW would need to be undertaken with caution. -- Highlights: ► St. John's wort significantly increased the AUC(0-t) and Cmax of methotrexate. ► Coadministration of St. John's wort increased the exposure and toxicity of methotrexate. ► The combined use of methotrexate with St. John's wort will need to be undertaken with caution.
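
    The AUC(0-t) reported above is conventionally computed from the concentration-time curve by the linear trapezoidal rule; a minimal sketch with invented sampling times and concentrations (not data from this study):

    ```python
    import numpy as np

    # Invented serum MTX concentration-time data.
    t = np.array([0.0, 0.5, 1.0, 2.0, 4.0, 8.0, 12.0])   # hours
    c = np.array([0.0, 3.2, 4.1, 2.9, 1.5, 0.6, 0.2])    # ug/mL

    # Linear trapezoidal rule for AUC(0-t), written out for clarity.
    auc_0_t = float(np.sum(0.5 * (c[1:] + c[:-1]) * np.diff(t)))
    cmax = float(c.max())
    print(f"AUC(0-t) = {auc_0_t:.2f} ug*h/mL, Cmax = {cmax:.2f} ug/mL")
    ```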

  1. Increased frequency of retinopathy of prematurity over the last decade and significant regional differences.

    Science.gov (United States)

    Holmström, Gerd; Tornqvist, Kristina; Al-Hawasi, Abbas; Nilsson, Åsa; Wallin, Agneta; Hellström, Ann

    2018-03-01

    Retinopathy of prematurity (ROP) causes childhood blindness globally in prematurely born infants. Although increased levels of oxygen supply lead to increased survival and reduced frequency of cerebral palsy, increased incidence of ROP is reported. With the help of a Swedish register for ROP, SWEDROP, national and regional incidences of ROP and frequencies of treatment were evaluated from 2008 to 2015 (n = 5734), as well as before and after targets of provided oxygen changed from 85-89% to 91-95% in 2014. Retinopathy of prematurity (ROP) was found in 31.9% (1829/5734) of all infants with a gestational age (GA) of <31 weeks at birth and 5.7% of the infants (329/5734) had been treated for ROP. Analyses of the national data revealed an increased incidence of ROP during the 8-year study period (p = 0.003), but there was no significant increase in the frequency of treatment. There were significant differences between the seven health regions of Sweden, regarding both incidence of ROP and frequency of treatment (p < 0.001). Comparison of regional data before and after the new oxygen targets revealed a significant increase in treated ROP in one region [OR: 2.24 (CI: 1.11-4.49), p = 0.024] and a borderline increase in one other [OR: 3.08 (CI: 0.99-9.60), p = 0.052]. The Swedish national ROP register revealed an increased incidence of ROP during an 8-year period and significant regional differences regarding the incidence of ROP and frequency of treatment. © 2017 Acta Ophthalmologica Scandinavica Foundation. Published by John Wiley & Sons Ltd.

  2. THE SMALL BUT SIGNIFICANT AND NONTRANSITORY INCREASE IN PRICES (SSNIP) TEST

    Directory of Open Access Journals (Sweden)

    Liviana Niminet

    2008-12-01

    The Small but Significant and Nontransitory Increase in Price (SSNIP) test was designed to define the relevant market in terms of product, geographical area and time. This test, also called the "hypothetical monopolist test", is the subject of much research, both economic and legal, as it deals with economic concepts as well as legal aspects.

  3. Evaluation of Significance of Diffusely Increased Bilateral Renal Uptake on Bone Scan

    Energy Technology Data Exchange (ETDEWEB)

    Sung, Mi Sook; Yang, Woo Jin; Byun, Jae Young; Park, Jung Mi; Shinn, Kyung Sub; Bahk, Yong Whee [Catholic University College of Medicine, Seoul (Korea, Republic of)

    1990-03-15

    Unexpected renal abnormalities can be detected on bone scans using 99mTc-MDP. The purpose of this study was to evaluate the diagnostic significance of diffusely increased bilateral renal uptake on bone scan. 1,500 bone scans were reviewed, and 43 scans showing diffusely increased bilateral renal uptake were selected for analysis. Laboratory findings for renal and liver function tests, including routine urinalysis, were reviewed in the 43 patients. 26 of 43 cases showed abnormalities in urinalysis and renal function studies. 20 of 43 cases showed abnormal liver function studies, and 3 of these cases were later diagnosed as hepatorenal syndrome; 13 of those 20 cases had liver cirrhosis with or without hepatoma. 12 of 43 cases showed abnormalities in both renal and liver function studies. 2 of 43 cases showed diffusely increased bilateral renal uptake after chemotherapy for cancer but not on scans taken before chemotherapy. 2 of 43 cases showed hypercalcaemia, and 8 of 43 cases had multifocal bone uptake due to metastasis or benign bone lesions, but the latter showed no hypercalcaemia at all. There was no significant correlation between increased renal uptake and MDP uptake in soft tissue other than the kidneys. This study raises the possibility that impaired liver and/or renal function may result in a diffuse increase of bilateral renal uptake of MDP through an unknown mechanism. Further study of this correlation is needed.

  4. Evaluation of Significance of Diffusely Increased Bilateral Renal Uptake on Bone Scan

    International Nuclear Information System (INIS)

    Sung, Mi Sook; Yang, Woo Jin; Byun, Jae Young; Park, Jung Mi; Shinn, Kyung Sub; Bahk, Yong Whee

    1990-01-01

    Unexpected renal abnormalities can be detected on bone scans using 99mTc-MDP. The purpose of this study was to evaluate the diagnostic significance of diffusely increased bilateral renal uptake on bone scan. 1,500 bone scans were reviewed, and 43 scans showing diffusely increased bilateral renal uptake were selected for analysis. Laboratory findings for renal and liver function tests, including routine urinalysis, were reviewed in the 43 patients. 26 of 43 cases showed abnormalities in urinalysis and renal function studies. 20 of 43 cases showed abnormal liver function studies, and 3 of these cases were later diagnosed as hepatorenal syndrome; 13 of those 20 cases had liver cirrhosis with or without hepatoma. 12 of 43 cases showed abnormalities in both renal and liver function studies. 2 of 43 cases showed diffusely increased bilateral renal uptake after chemotherapy for cancer but not on scans taken before chemotherapy. 2 of 43 cases showed hypercalcaemia, and 8 of 43 cases had multifocal bone uptake due to metastasis or benign bone lesions, but the latter showed no hypercalcaemia at all. There was no significant correlation between increased renal uptake and MDP uptake in soft tissue other than the kidneys. This study raises the possibility that impaired liver and/or renal function may result in a diffuse increase of bilateral renal uptake of MDP through an unknown mechanism. Further study of this correlation is needed.

  5. Noninvasive Coronary Angiography using 64-Detector-Row Computed Tomography in Patients with a Low to Moderate Pretest Probability of Significant Coronary Artery Disease

    Energy Technology Data Exchange (ETDEWEB)

    Schlosser, T.; Mohrs, O.K.; Magedanz, A.; Nowak, B.; Voigtlaender, T.; Barkhausen, J.; Schmermund, A. [Dept. of Diagnostic and Interventional Radiology and Neuroradiology, University Hospital Essen, Essen (Germany)]

    2007-04-15

    Purpose: To evaluate the value of 64-detector-row computed tomography for ruling out high-grade coronary stenoses in patients with a low to moderate pretest probability of significant coronary artery disease. Material and Methods: The study included 61 patients with a suspicion of coronary artery disease on the basis of atypical angina or ambiguous findings in noninvasive stress testing and a class II indication for invasive coronary angiography (ICA). All patients were examined by 64-detector-row computed tomography angiography (CTA) and ICA. On a coronary segmental level, the presence of significant (>50% diameter) stenoses was examined. Results: In a total of 915 segments, CTA detected 62 significant stenoses. Thirty-four significant stenoses were confirmed by ICA, whereas 28 stenoses could not be confirmed by ICA. Twenty-two of them showed wall irregularities on ICA, and six were angiographically normal. Accordingly, on a coronary segmental basis, 28 false-positive and 0 false-negative findings resulted in a sensitivity of 100%, a specificity of 96.8%, a positive predictive value of 54.8%, and a negative predictive value of 100%. The diagnostic accuracy was 96.9%. Conclusion: Sixty-four-detector-row computed tomography reliably detects significant coronary stenoses in patients with suspected coronary artery disease and appears to be helpful in the selection of patients who need to undergo ICA. Calcified and non-calcified plaques are detected. Grading of stenoses in areas with calcification is difficult. Frequently, stenosis severity is overestimated by 64-detector-row computed tomography.
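    The per-segment accuracy figures reported above follow directly from the stated counts: 34 true positives, 28 false positives, and 0 false negatives among 915 segments, leaving 853 true negatives. A quick check:

    ```python
    tp, fp, fn, total = 34, 28, 0, 915
    tn = total - tp - fp - fn        # 853 segments correctly read as negative

    sensitivity = tp / (tp + fn)     # 34/34   = 100%
    specificity = tn / (tn + fp)     # 853/881 = 96.8%
    ppv         = tp / (tp + fp)     # 34/62   = 54.8%
    npv         = tn / (tn + fn)     # 853/853 = 100%
    accuracy    = (tp + tn) / total  # 887/915 = 96.9%

    print(f"Se={sensitivity:.1%}  Sp={specificity:.1%}  PPV={ppv:.1%}  "
          f"NPV={npv:.1%}  Acc={accuracy:.1%}")
    ```

    The low positive predictive value despite perfect sensitivity illustrates why CTA works here as a rule-out test: a negative scan is highly reliable, while positive findings still require ICA confirmation.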

  6. Noninvasive Coronary Angiography using 64-Detector-Row Computed Tomography in Patients with a Low to Moderate Pretest Probability of Significant Coronary Artery Disease

    International Nuclear Information System (INIS)

    Schlosser, T.; Mohrs, O.K.; Magedanz, A.; Nowak, B.; Voigtlaender, T.; Barkhausen, J.; Schmermund, A.

    2007-01-01

    Purpose: To evaluate the value of 64-detector-row computed tomography for ruling out high-grade coronary stenoses in patients with a low to moderate pretest probability of significant coronary artery disease. Material and Methods: The study included 61 patients with a suspicion of coronary artery disease on the basis of atypical angina or ambiguous findings in noninvasive stress testing and a class II indication for invasive coronary angiography (ICA). All patients were examined by 64-detector-row computed tomography angiography (CTA) and ICA. On a coronary segmental level, the presence of significant (>50% diameter) stenoses was examined. Results: In a total of 915 segments, CTA detected 62 significant stenoses. Thirty-four significant stenoses were confirmed by ICA, whereas 28 stenoses could not be confirmed by ICA. Twenty-two of them showed wall irregularities on ICA, and six were angiographically normal. Accordingly, on a coronary segmental basis, 28 false-positive and 0 false-negative findings resulted in a sensitivity of 100%, a specificity of 96.8%, a positive predictive value of 54.8%, and a negative predictive value of 100%. The diagnostic accuracy was 96.9%. Conclusion: Sixty-four-detector-row computed tomography reliably detects significant coronary stenoses in patients with suspected coronary artery disease and appears to be helpful in the selection of patients who need to undergo ICA. Calcified and non-calcified plaques are detected. Grading of stenoses in areas with calcification is difficult. Frequently, stenosis severity is overestimated by 64-detector-row computed tomography.

  7. Introducing extra NADPH consumption ability significantly increases the photosynthetic efficiency and biomass production of cyanobacteria.

    Science.gov (United States)

    Zhou, Jie; Zhang, Fuliang; Meng, Hengkai; Zhang, Yanping; Li, Yin

    2016-11-01

    Increasing photosynthetic efficiency is crucial to increasing biomass production to meet the growing demands for food and energy. Previous theoretical analysis suggests that the light reactions and dark reactions are imperfectly coupled, owing to a shortage of ATP supply or an accumulation of NADPH. Here we hypothesized that solely increasing NADPH consumption might improve the coupling of the light and dark reactions, thereby increasing photosynthetic efficiency and biomass production. To test this hypothesis, an NADPH consumption pathway was constructed in the cyanobacterium Synechocystis sp. PCC 6803. The resulting extra NADPH-consuming mutant grew much faster and achieved a higher biomass concentration. Analyses of photosynthesis characteristics showed that the activities of photosystem II and photosystem I and the light saturation point of the NADPH-consuming mutant all significantly increased. Thus, we demonstrated that introducing extra NADPH consumption ability is a promising strategy to increase photosynthetic efficiency and to enable utilization of high-intensity light. Copyright © 2016 International Metabolic Engineering Society. Published by Elsevier Inc. All rights reserved.

  8. Hydrologic effects of large southwestern USA wildfires significantly increase regional water supply: fact or fiction?

    Science.gov (United States)

    Wine, M. L.; Cadol, D.

    2016-08-01

    In recent years climate change and historic fire suppression have increased the frequency of large wildfires in the southwestern USA, motivating study of the hydrological consequences of these wildfires at point and watershed scales, typically over short periods of time. These studies have revealed that reduced soil infiltration capacity and reduced transpiration due to tree canopy combustion increase streamflow at the watershed scale. However, the degree to which these local increases in runoff propagate to larger scales—relevant to urban and agricultural water supply—remains largely unknown, particularly in semi-arid mountainous watersheds co-dominated by winter snowmelt and the North American monsoon. To address this question, we selected three New Mexico watersheds—the Jemez (1223 km2), Mogollon (191 km2), and Gila (4807 km2)—that together have been affected by over 100 wildfires since 1982. We then applied climate-driven linear models to test for effects of fire on streamflow metrics after controlling for climatic variability. Here we show that, after controlling for climatic and snowpack variability, significantly more streamflow discharged from the Gila watershed for three to five years following wildfires, consistent with increased regional water yield due to enhanced infiltration-excess overland flow and groundwater recharge at the large watershed scale. In contrast, we observed no such increase in discharge from the Jemez watershed following wildfires. Fire regimes represent a key difference between the contrasting responses of the Jemez and Gila watersheds with the latter experiencing more frequent wildfires, many caused by lightning strikes. While hydrologic dynamics at the scale of large watersheds were previously thought to be climatically dominated, these results suggest that if one fifth or more of a large watershed has been burned in the previous three to five years, significant increases in water yield can be expected.
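    A minimal sketch of the climate-driven linear-model approach described above: regress annual water yield on climate covariates plus a post-fire indicator, so that the fire term is estimated after climatic and snowpack variability are controlled for. All data below are synthetic placeholders, not the study's observations:

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 34  # e.g., water years 1982-2015

    # Synthetic annual series (placeholders, not the study's data)
    precip = rng.normal(400, 80, n)              # mm
    snow = rng.normal(150, 50, n)                # mm snow-water equivalent
    fire = (rng.random(n) < 0.3).astype(int)     # 1 if burned in prior 3-5 yr
    flow = 0.3 * precip + 0.4 * snow + 25 * fire + rng.normal(0, 20, n)

    df = pd.DataFrame({"flow": flow, "precip": precip, "snow": snow, "fire": fire})
    fit = smf.ols("flow ~ precip + snow + fire", data=df).fit()
    print(fit.params["fire"], fit.pvalues["fire"])  # fire effect, climate controlled
    ```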

  9. Breast-cancer-associated metastasis is significantly increased in a model of autoimmune arthritis.

    Science.gov (United States)

    Das Roy, Lopamudra; Pathangey, Latha B; Tinder, Teresa L; Schettini, Jorge L; Gruber, Helen E; Mukherjee, Pinku

    2009-01-01

    Sites of chronic inflammation are often associated with the establishment and growth of various malignancies, including breast cancer. A common inflammatory condition in humans is autoimmune arthritis (AA), which causes inflammation and deformity of the joints. Other systemic effects associated with arthritis include increased cellular infiltration and inflammation of the lungs. Several studies have reported statistically significant risk ratios between AA and breast cancer. Despite this knowledge, available for a decade, it has never been asked whether the site of chronic inflammation linked to AA creates a milieu that attracts tumor cells to home and grow in the inflamed bones and lungs, which are frequent sites of breast cancer metastasis. To determine whether chronic inflammation induced by autoimmune arthritis contributes to increased breast cancer-associated metastasis, we generated mammary gland tumors in SKG mice that were genetically prone to develop AA. Two breast cancer cell lines, one highly metastatic (4T1) and the other non-metastatic (TUBO), were used to generate the tumors in the mammary fat pad. Lung and bone metastasis and the associated inflammatory milieu were evaluated in the arthritic versus the non-arthritic mice. We report a three-fold increase in lung metastasis and a significant increase in the incidence of bone metastasis in the pro-arthritic and arthritic mice compared to non-arthritic control mice. We also report that the metastatic breast cancer cells augment the severity of arthritis, resulting in a vicious cycle that increases both bone destruction and metastasis. Enhanced neutrophilic and granulocytic infiltration in the lungs and bone of the pro-arthritic and arthritic mice, and the subsequent increase in circulating levels of proinflammatory cytokines, such as macrophage colony stimulating factor (M-CSF), interleukin-17 (IL-17), interleukin-6 (IL-6), vascular endothelial growth factor (VEGF), and tumor necrosis factor-alpha (TNF-alpha) may contribute

  10. Breast cancer-associated metastasis is significantly increased in a model of autoimmune arthritis

    Science.gov (United States)

    Das Roy, Lopamudra; Pathangey, Latha B; Tinder, Teresa L; Schettini, Jorge L; Gruber, Helen E; Mukherjee, Pinku

    2009-01-01

    Introduction Sites of chronic inflammation are often associated with the establishment and growth of various malignancies, including breast cancer. A common inflammatory condition in humans is autoimmune arthritis (AA), which causes inflammation and deformity of the joints. Other systemic effects associated with arthritis include increased cellular infiltration and inflammation of the lungs. Several studies have reported statistically significant risk ratios between AA and breast cancer. Despite this knowledge, available for a decade, it has never been asked whether the site of chronic inflammation linked to AA creates a milieu that attracts tumor cells to home and grow in the inflamed bones and lungs, which are frequent sites of breast cancer metastasis. Methods To determine whether chronic inflammation induced by autoimmune arthritis contributes to increased breast cancer-associated metastasis, we generated mammary gland tumors in SKG mice that were genetically prone to develop AA. Two breast cancer cell lines, one highly metastatic (4T1) and the other non-metastatic (TUBO), were used to generate the tumors in the mammary fat pad. Lung and bone metastasis and the associated inflammatory milieu were evaluated in the arthritic versus the non-arthritic mice. Results We report a three-fold increase in lung metastasis and a significant increase in the incidence of bone metastasis in the pro-arthritic and arthritic mice compared to non-arthritic control mice. We also report that the metastatic breast cancer cells augment the severity of arthritis, resulting in a vicious cycle that increases both bone destruction and metastasis. Enhanced neutrophilic and granulocytic infiltration in the lungs and bone of the pro-arthritic and arthritic mice, and the subsequent increase in circulating levels of proinflammatory cytokines, such as macrophage colony stimulating factor (M-CSF), interleukin-17 (IL-17), interleukin-6 (IL-6), vascular endothelial growth factor (VEGF), and tumor necrosis factor

  11. Significant increase of surface ozone at a rural site, north of eastern China

    Directory of Open Access Journals (Sweden)

    Z. Ma

    2016-03-01

    Ozone pollution in eastern China has become one of the top environmental issues. Quantifying the temporal trend of surface ozone helps to assess the impact of anthropogenic precursor reductions and the likely effects of the emission control strategies implemented. In this paper, ozone data collected at the Shangdianzi (SDZ) regional atmospheric background station from 2003 to 2015 are presented and analyzed to obtain the trend of surface ozone in the most polluted region of China, the north of eastern China (the North China Plain). A modified Kolmogorov-Zurbenko (KZ) filter method was applied to the maximum daily average 8 h (MDA8) ozone concentrations to separate the contributions of different factors to the variation of surface ozone and to remove the influence of meteorological fluctuations. Results reveal that the short-term, seasonal and long-term components of ozone account for 36.4, 57.6 and 2.2% of the total variance, respectively. The long-term trend indicates that the MDA8 underwent a significant increase over the period 2003-2015, at an average rate of 1.13 ± 0.01 ppb year⁻¹ (R² = 0.92). It is found that meteorological factors did not significantly influence the long-term variation of ozone, so the increase may be attributed entirely to changes in emissions. Furthermore, there is no significant correlation between the long-term O3 and NO2 trends. This study suggests that emission changes in VOCs might have played a more important role in the observed increase of surface ozone at SDZ.
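    The Kolmogorov-Zurbenko filter used above is simply an iterated moving average: KZ(m, k) applies a centered m-point window k times. A minimal sketch of the three-way decomposition; the window choices KZ(15,5) and KZ(365,3) are the ones commonly used for daily MDA8 ozone and are assumed here, not taken from the paper:

    ```python
    import numpy as np
    import pandas as pd

    def kz_filter(x: pd.Series, m: int, k: int) -> pd.Series:
        """Kolmogorov-Zurbenko filter: k passes of a centered m-point moving average."""
        for _ in range(k):
            x = x.rolling(window=m, center=True, min_periods=1).mean()
        return x

    # Synthetic daily MDA8 series: linear trend + seasonal cycle + noise
    days = pd.date_range("2003-01-01", "2015-12-31", freq="D")
    t = np.arange(len(days))
    mda8 = pd.Series(35 + 1.13 * t / 365.25                  # ~1.13 ppb/yr trend
                     + 15 * np.sin(2 * np.pi * t / 365.25)   # seasonal cycle
                     + np.random.default_rng(1).normal(0, 5, len(t)), index=days)

    baseline = kz_filter(mda8, 15, 5)    # strips synoptic-scale (weather) variation
    longterm = kz_filter(mda8, 365, 3)   # long-term component
    seasonal = baseline - longterm       # seasonal component
    short = mda8 - baseline              # short-term component
    ```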

  12. Neurite outgrowth is significantly increased by the simultaneous presentation of Schwann cells and moderate exogenous electric fields

    Science.gov (United States)

    Koppes, Abigail N.; Seggio, Angela M.; Thompson, Deanna M.

    2011-08-01

    Axonal extension is influenced by a variety of external guidance cues; therefore, the development and optimization of a multi-faceted approach is probably necessary to address the intricacy of functional regeneration following nerve injury. In this study, primary dissociated neonatal rat dorsal root ganglia neurons and Schwann cells were examined in response to an 8 h dc electrical stimulation (0-100 mV mm⁻¹). Stimulated samples were then fixed immediately, immunostained, imaged and analyzed to determine Schwann cell orientation and characterize neurite outgrowth relative to electric field strength and direction. Results indicate that Schwann cells are viable following electrical stimulation with 10-100 mV mm⁻¹, and retain a normal morphology relative to unstimulated cells; however, no directional bias is observed. Neurite outgrowth was significantly enhanced by twofold following exposure to either a 50 mV mm⁻¹ electric field (EF) or co-culture with unstimulated Schwann cells by comparison to neurons cultured alone. Neurite outgrowth was further increased in the presence of simultaneously applied cues (Schwann cells + 50 mV mm⁻¹ dc EF), exhibiting a 3.2-fold increase over unstimulated control neurons, and a 1.2-fold increase over either neurons cultured with unstimulated Schwann cells or the electrical stimulus alone. These results indicate that dc electric stimulation in combination with Schwann cells may provide synergistic guidance cues for improved axonal growth relevant to nerve injuries in the peripheral nervous system.

  13. Application of Bioorganic Fertilizer Significantly Increased Apple Yields and Shaped Bacterial Community Structure in Orchard Soil.

    Science.gov (United States)

    Wang, Lei; Li, Jing; Yang, Fang; E, Yaoyao; Raza, Waseem; Huang, Qiwei; Shen, Qirong

    2017-02-01

    and Rhodospirillaceae were found to be significantly increased by the BOF addition; the genus Lysobacter may identify members of this group effective in biological control-based plant disease management, and members of the family Rhodospirillaceae have an important role in fixing molecular nitrogen. These results strengthen the understanding of responses to the BOF and of possible interactions within soil bacterial communities that can be associated with disease suppression and the accumulation of carbon and nitrogen. The increase in apple yields after the application of BOF might be attributed to the fact that the application of BOF increased SOM and soil total nitrogen, and changed the bacterial community by enriching Rhodospirillaceae, Alphaproteobacteria, and Proteobacteria.

  14. One stone, two birds: silica nanospheres significantly increase photocatalytic activity and colloidal stability of photocatalysts

    Science.gov (United States)

    Rasamani, Kowsalya D.; Foley, Jonathan J., IV; Sun, Yugang

    2018-03-01

    Silver-doped silver chloride [AgCl(Ag)] nanoparticles represent a unique class of visible-light-driven photocatalysts, in which the silver dopants introduce electron-abundant mid-gap energy levels to lower the bandgap of AgCl. However, free-standing AgCl(Ag) nanoparticles, particularly those with small sizes and large surface areas, exhibit low colloidal stability and low compositional stability upon exposure to light irradiation, leading to easy aggregation and conversion to metallic silver and thus a loss of photocatalytic activity. These problems can be eliminated by attaching the small AgCl(Ag) nanoparticles to the surfaces of spherical dielectric silica particles of submicrometer size. The high optical transparency in the visible spectral region (400-800 nm), colloidal stability, and chemical/electronic inertness displayed by the silica spheres make them ideal for supporting photocatalysts and significantly improving their stability. The spherical morphology of the dielectric silica particles can support light-scattering resonances that generate significantly enhanced electric fields near the silica particle surfaces, where the optical absorption cross-section of the AgCl(Ag) nanoparticles is dramatically increased to promote their photocatalytic activity. The hybrid silica/AgCl(Ag) structures exhibit superior photocatalytic activity and stability, suitable for sustained photocatalysis; for instance, their efficiency in the photocatalytic decomposition of methylene blue decreases by only ~9% even after ten cycles of operation.

  15. Phytohormone supplementation significantly increases growth of Chlamydomonas reinhardtii cultivated for biodiesel production.

    Science.gov (United States)

    Park, Won-Kun; Yoo, Gursong; Moon, Myounghoon; Kim, Chul Woong; Choi, Yoon-E; Yang, Ji-Won

    2013-11-01

    Cultivation is the most expensive step in the production of biodiesel from microalgae, and substantial research has been devoted to developing more cost-effective cultivation methods. Plant hormones (phytohormones) are chemical messengers that regulate various aspects of growth and development and are typically active at very low concentrations. In this study, we investigated the effect of different phytohormones on microalgal growth and biodiesel production in Chlamydomonas reinhardtii and their potential to lower the overall cost of commercial biofuel production. The results indicated that all five of the tested phytohormones (indole-3-acetic acid, gibberellic acid, kinetin, 1-triacontanol, and abscisic acid) promoted microalgal growth. In particular, hormone treatment increased biomass production by 54 to 69 % relative to the control growth medium (Tris-acetate-phosphate, TAP). Phytohormone treatments also affected microalgal cell morphology but had no effect on the yields of fatty acid methyl esters (FAMEs) as a percent of biomass. We also tested the effect of these phytohormones on microalgal growth in nitrogen-limited media by supplementation in the early stationary phase. Maximum cell densities after addition of phytohormones were higher than in TAP medium, even when the nitrogen source was reduced to 40 % of that in TAP medium. Taken together, our results indicate that phytohormones significantly increased microalgal growth, particularly in nitrogen-limited media, and have potential for use in the development of efficient microalgal cultivation for biofuel production.

  16. [Significant increase in the colonisation of Staphylococcus aureus among medical students during their hospital practices].

    Science.gov (United States)

    Rodríguez-Avial, Carmen; Alvarez-Novoa, Andrea; Losa, Azucena; Picazo, Juan J

    2013-10-01

    Staphylococcus aureus is a pathogen of major concern. The emergence of methicillin-resistant S. aureus (MRSA) has increasingly complicated the therapeutic approach to hospital-acquired infections. Surveillance of MRSA and control measures must be implemented in different healthcare settings, including screening programs for carriers. Our first aim was to determine the prevalence of methicillin-susceptible S. aureus (MSSA) and MRSA nasal carriage in medical students from the Clínico San Carlos Hospital (Madrid). As the MRSA carrier rate in healthcare workers is higher than in the general population, we hypothesised that the carrier rate could increase during the clinical practice of their last three years. We performed an epidemiological study of the prevalence of S. aureus colonisation among a group of medical students, sampled in 2008 in their third year and again in 2012 when the same class was in its sixth year. We found a significant increase in MSSA carriage, from 27% to 46%. There were no MRSA colonisations in the third-year group, but one was found in the sixth-year group. The large majority of strains (89%) were resistant to penicillin, and 27% to erythromycin and clindamycin. As 19 methicillin-resistant coagulase-negative staphylococci were also identified, horizontal transfer of genes such as mecA to S. aureus could have occurred. Medical students are both at risk of acquiring nosocomial pathogens, mainly MSSA, and a potential source of them. They should therefore take special care with hygienic precautions, such as frequent and proper hand washing, while working in the hospital. Copyright © 2012 Elsevier España, S.L. All rights reserved.
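    The rise in MSSA carriage from 27% to 46% can be checked with a standard two-proportion z-test; a minimal sketch, using a hypothetical cohort size of 100 students per survey since the abstract does not state the denominators:

    ```python
    from math import sqrt
    from statistics import NormalDist

    def two_prop_ztest(x1, n1, x2, n2):
        """Two-sided z-test for a difference between two proportions."""
        p1, p2 = x1 / n1, x2 / n2
        p = (x1 + x2) / (n1 + n2)                    # pooled proportion
        se = sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
        z = (p2 - p1) / se
        return z, 2 * (1 - NormalDist().cdf(abs(z)))

    # 27% of third-years vs 46% of sixth-years (hypothetical n = 100 each)
    z, p = two_prop_ztest(27, 100, 46, 100)
    print(f"z = {z:.2f}, p = {p:.4f}")
    ```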

  17. Skipping one or more dialysis sessions significantly increases mortality: measuring the impact of non-adherence

    Directory of Open Access Journals (Sweden)

    Eduardo Gottlieb

    2014-06-01

    Introduction: Non-adherence to the prescribed frequency of dialysis sessions occurs in 2% to 50% of patients. The objective of this study was to evaluate the impact of detecting and measuring non-adherence to the prescribed dialysis frequency and to determine the importance of a multidisciplinary approach aimed at improving adherence. Methods: longitudinal cohort study including 8,164 prevalent hemodialysis patients in April 2010, with more than 90 days of treatment, in Fresenius Medical Care Argentina units, monitored for 3 years. The survey evaluated interruption of at least one dialysis session in a month, or shortening of a dialysis session by at least 10 minutes in a month, during the 6 months prior to the survey. Relative mortality risks were evaluated among groups. Results: 648 patients (7.9%) interrupted dialysis sessions: 320 (3.9%) interrupted one session per month and 328 (4.01%) interrupted more than one session per month. After 3 years of monitoring, 349 patients (53.8%) remained active in hemodialysis and 299 were inactive for different reasons: 206 deceased (31.8%), 47 transferred or lost to follow-up (7.25%), 36 transplanted (5.55%), 8 changed to PD modality (1.2%) and 2 recovered their kidney function (0.3%). Interrupting one session per month significantly increased the mortality risk of interrupters compared with non-interrupters: RR 2.65 (95% CI 2.24-3.14). Interrupting more than one dialysis session per month also significantly increased the mortality risk compared with non-interrupters: RR 2.8 (95% CI 2.39-3.28). After 3 years of monitoring, 41.6% of the initial interrupters had improved their adherence through a multidisciplinary quality-improvement program. Conclusion: Overall mortality was greater among patients who interrupted dialysis sessions. A considerable proportion of initially interrupting patients modified their behavior through the implementation of a multidisciplinary quality-improvement program.

  18. Maternal undernutrition significantly impacts ovarian follicle number and increases ovarian oxidative stress in adult rat offspring.

    Directory of Open Access Journals (Sweden)

    Angelica B Bernal

    BACKGROUND: We have shown recently that maternal undernutrition (UN) advanced female pubertal onset in a manner that is dependent upon the timing of UN. The long-term consequence of this accelerated puberty for ovarian function is unknown. Recent findings suggest that oxidative stress may be one mechanism whereby early life events impact later physiological functioning. Therefore, using an established rodent model of maternal UN at critical windows of development, we examined maternal UN-induced changes in offspring ovarian function and determined whether these changes were underpinned by ovarian oxidative stress. METHODOLOGY/PRINCIPAL FINDINGS: Our study is the first to show that maternal UN significantly reduced primordial and secondary follicle numbers in offspring in a manner dependent upon the timing of maternal UN. Specifically, a reduction in these early stage follicles was observed in offspring born to mothers undernourished throughout both pregnancy and lactation. Additionally, antral follicle number was reduced in offspring born to all mothers that were UN, regardless of whether the period of UN was restricted to pregnancy, lactation or both. These reductions were associated with decreased mRNA levels of genes critical for follicle maturation and ovulation. Increased ovarian protein carbonyls were observed in offspring born to mothers UN during pregnancy and/or lactation, and this was associated with peroxiredoxin 3 hyperoxidation and reduced mRNA levels, suggesting compromised antioxidant defence. This was not observed in offspring of mothers UN during lactation alone. CONCLUSIONS: We propose that maternal UN, particularly at a time-point that includes pregnancy, results in reduced offspring ovarian follicle numbers and mRNA levels of regulatory genes, and that this may be mediated by increased ovarian oxidative stress coupled with a decreased ability to repair the resultant oxidative damage. Together these data are suggestive of

  19. Free ammonia pre-treatment of secondary sludge significantly increases anaerobic methane production.

    Science.gov (United States)

    Wei, Wei; Zhou, Xu; Wang, Dongbo; Sun, Jing; Wang, Qilin

    2017-07-01

    Energy recovery in the form of methane from sludge/wastewater is restricted by the poor and slow biodegradability of secondary sludge. An innovative pre-treatment technology using free ammonia (FA, i.e. NH3) was proposed in this study to increase anaerobic methane production. The solubilisation of secondary sludge was significantly increased after FA pre-treatment at up to 680 mg NH3-N/L for 1 day, under which the solubilisation (0.4 mg SCOD/mg VS; SCOD: soluble chemical oxygen demand; VS: volatile solids) was >10 times higher than that without FA pre-treatment (0.03 mg SCOD/mg VS). Biochemical methane potential assays showed that FA pre-treatment above 250 mg NH3-N/L is effective in improving anaerobic methane production. The highest improvement in biochemical methane potential (B0) and hydrolysis rate (k) was achieved at FA concentrations of 420-680 mg NH3-N/L, and was determined as approximately 22% (from 160 to 195 L CH4/kg VS added) and 140% (from 0.22 to 0.53 d⁻¹) compared to the secondary sludge without pre-treatment. Further analysis revealed that the FA-induced improvement in B0 and k could be attributed to the rapidly biodegradable substances rather than the slowly biodegradable substances. Economic and environmental analyses showed that the FA-based technology is economically favourable and environmentally friendly. Since this FA technology aims to use a waste stream of the wastewater treatment plants (WWTPs), anaerobic digestion liquor, to enhance methane production from the WWTPs, it sets an example for the paradigm shift of WWTPs from a 'linear economy' to a 'circular economy'. Copyright © 2017 Elsevier Ltd. All rights reserved.
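    The B0 and k values quoted above are the parameters of the usual first-order model for cumulative methane yield, B(t) = B0(1 - e^(-kt)), fitted to biochemical methane potential assay curves. A minimal fitting sketch on synthetic data generated with the reported post-treatment values; the time grid and noise level are assumptions:

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def first_order(t, b0, k):
        """Cumulative methane yield (L CH4/kg VS added) after t days."""
        return b0 * (1.0 - np.exp(-k * t))

    t = np.linspace(0, 30, 31)                                  # assay days
    rng = np.random.default_rng(2)
    y = first_order(t, 195.0, 0.53) + rng.normal(0, 4, t.size)  # synthetic curve

    (b0_hat, k_hat), _ = curve_fit(first_order, t, y, p0=(150.0, 0.3))
    print(f"B0 = {b0_hat:.0f} L CH4/kg VS, k = {k_hat:.2f} per day")
    ```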

  20. Myriocin significantly increases the mortality of a non-mammalian model host during Candida pathogenesis.

    Directory of Open Access Journals (Sweden)

    Nadja Rodrigues de Melo

    Candida albicans is a major human pathogen whose treatment is challenging due to antifungal drug toxicity, drug resistance and the paucity of available antifungal agents. Myriocin (MYR) inhibits the synthesis of sphingosine, a precursor of sphingolipids, which are important cell membrane and signaling molecule components. MYR also has dual immunosuppressive and antifungal properties, potentially modulating mammalian immunity while simultaneously reducing fungal infection risk. Wax moth (Galleria mellonella) larvae, an alternative to mice, were used to establish whether MYR suppressed insect immunity and increased the survival of C. albicans-infected insects. MYR effects were studied in vivo and in vitro, alone and in combination with the approved antifungal drugs fluconazole (FLC) and amphotericin B (AMPH). Insect immune defenses failed to inhibit C. albicans, with high mortalities. In insects pretreated with drug and then inoculated with C. albicans, MYR+C. albicans significantly increased mortality at 48 h post-infection to 93%, from 67% with C. albicans alone, whilst AMPH+C. albicans and FLC+C. albicans showed only 26% and 0% mortalities, respectively. MYR combinations with other antifungal drugs in vivo also enhanced larval mortalities, contrasting with the synergistic antifungal effect of the MYR+AMPH combination in vitro. MYR treatment influenced immunity and stress management gene expression during C. albicans pathogenesis, modulating transcripts putatively associated with signal transduction/regulation of cytokines, the I-kappaB kinase/NF-kappaB cascade, G-protein coupled receptors and inflammation. In contrast, all stress management gene expression was down-regulated in FLC- and AMPH-pretreated C. albicans-infected insects. The results are discussed with their implications for the clinical use of MYR to treat sphingolipid-associated disorders.

  1. Significantly increased risk of carotid atherosclerosis with arsenic exposure and polymorphisms in arsenic metabolism genes

    International Nuclear Information System (INIS)

    Hsieh, Yi-Chen; Lien, Li-Ming; Chung, Wen-Ting; Hsieh, Fang-I; Hsieh, Pei-Fan; Wu, Meei-Maan; Tseng, Hung-Pin; Chiou, Hung-Yi; Chen, Chien-Jen

    2011-01-01

    Individual susceptibility to arsenic-induced carotid atherosclerosis might be associated with genetic variations in arsenic metabolism. The purpose of this study was to explore the interaction effect on the risk of carotid atherosclerosis between arsenic exposure and risk genotypes of purine nucleoside phosphorylase (PNP), arsenic (+3) methyltransferase (As3MT), and glutathione S-transferase omega 1 (GSTO1) and omega 2 (GSTO2). A community-based case-control study was conducted in northeastern Taiwan to investigate the arsenic metabolism-related genetic susceptibility to carotid atherosclerosis. In total, 863 subjects, who had been genotyped and for whom the severity of carotid atherosclerosis had been determined, were included in the present study. Individual well water was collected and the arsenic concentration determined using hydride generation combined with flame atomic absorption spectrometry. The results showed a significant dose-response trend (P=0.04) of carotid atherosclerosis risk with increasing arsenic concentration. No significant association was observed between the risk of carotid atherosclerosis and the individual genetic polymorphisms PNP Gly51Ser, Pro57Pro, As3MT Met287Thr, GSTO1 Ala140Asp, and GSTO2 A-183G. However, a significant interaction effect on carotid atherosclerosis risk was found for arsenic exposure (>50 μg/l) and the haplotypes of PNP (p=0.0115). A markedly elevated risk of carotid atherosclerosis was observed in subjects with arsenic exposure of >50 μg/l in drinking water who carried the PNP A-T haplotype together with at least one of the As3MT risk polymorphism or the GSTO risk haplotypes (OR, 6.43; 95% CI, 1.79-23.19). In conclusion, the arsenic metabolism genes PNP, As3MT, and GSTO may exacerbate the formation of atherosclerosis in individuals with high arsenic concentrations in well water (>50 μg/l). - Highlights: ► Arsenic metabolic genes might be associated with carotid atherosclerosis. ► A case

  2. Significantly increased risk of carotid atherosclerosis with arsenic exposure and polymorphisms in arsenic metabolism genes

    Energy Technology Data Exchange (ETDEWEB)

    Hsieh, Yi-Chen [School of Public Health, College of Public Health and Nutrition, Taipei Medical University, 250 Wusing St., Taipei 11031, Taiwan (China); Lien, Li-Ming [Graduate Institute of Clinical Medicine, College of Medicine, Taipei Medical University, Taipei, Taiwan (China); School of Medicine, Taipei Medical University, Taipei, Taiwan (China); Department of Neurology, Shin Kong WHS Memorial Hospital, Taipei, Taiwan (China); Chung, Wen-Ting [Department of Neurology, Wanfang Hospital, Taipei Medical University, Taipei, Taiwan (China); Graduate Institute of Clinical Medicine, Taipei Medical University, Taipei, Taiwan (China); Hsieh, Fang-I; Hsieh, Pei-Fan [School of Public Health, College of Public Health and Nutrition, Taipei Medical University, 250 Wusing St., Taipei 11031, Taiwan (China); Wu, Meei-Maan [School of Public Health, College of Public Health and Nutrition, Taipei Medical University, 250 Wusing St., Taipei 11031, Taiwan (China); Graduate Institute of Basic Medicine, College of Medicine, Fu-Jen Catholic University, Taipei, Taiwan (China); Tseng, Hung-Pin [Department of Neurology, Lotung Poh-Ai Hospital, I-Lan, Taiwan (China); Chiou, Hung-Yi, E-mail: hychiou@tmu.edu.tw [School of Public Health, College of Public Health and Nutrition, Taipei Medical University, 250 Wusing St., Taipei 11031, Taiwan (China); Chen, Chien-Jen [Genomics Research Center, Academia Sinica, Taipei, Taiwan (China)

    2011-08-15

    Individual susceptibility to arsenic-induced carotid atherosclerosis might be associated with genetic variations in arsenic metabolism. The purpose of this study was to explore the interaction effect on the risk of carotid atherosclerosis between arsenic exposure and risk genotypes of purine nucleoside phosphorylase (PNP), arsenic (+3) methyltransferase (As3MT), and glutathione S-transferase omega 1 (GSTO1) and omega 2 (GSTO2). A community-based case-control study was conducted in northeastern Taiwan to investigate the arsenic metabolism-related genetic susceptibility to carotid atherosclerosis. In total, 863 subjects, who had been genotyped and for whom the severity of carotid atherosclerosis had been determined, were included in the present study. Individual well water was collected and the arsenic concentration determined using hydride generation combined with flame atomic absorption spectrometry. The results showed a significant dose-response trend (P=0.04) of carotid atherosclerosis risk with increasing arsenic concentration. No significant association was observed between the risk of carotid atherosclerosis and the individual genetic polymorphisms PNP Gly51Ser, Pro57Pro, As3MT Met287Thr, GSTO1 Ala140Asp, and GSTO2 A-183G. However, a significant interaction effect on carotid atherosclerosis risk was found for arsenic exposure (>50 μg/l) and the haplotypes of PNP (p=0.0115). A markedly elevated risk of carotid atherosclerosis was observed in subjects with arsenic exposure of >50 μg/l in drinking water who carried the PNP A-T haplotype together with at least one of the As3MT risk polymorphism or the GSTO risk haplotypes (OR, 6.43; 95% CI, 1.79-23.19). In conclusion, the arsenic metabolism genes PNP, As3MT, and GSTO may exacerbate the formation of atherosclerosis in individuals with high arsenic concentrations in well water (>50 μg/l). - Highlights: ► Arsenic metabolic genes might be associated with carotid atherosclerosis. ►

  3. Templated assembly of photoswitches significantly increases the energy-storage capacity of solar thermal fuels.

    Science.gov (United States)

    Kucharski, Timothy J; Ferralis, Nicola; Kolpak, Alexie M; Zheng, Jennie O; Nocera, Daniel G; Grossman, Jeffrey C

    2014-05-01

    Large-scale utilization of solar-energy resources will require considerable advances in energy-storage technologies to meet ever-increasing global energy demands. Other than liquid fuels, existing energy-storage materials do not provide the requisite combination of high energy density, high stability, easy handling, transportability and low cost. New hybrid solar thermal fuels, composed of photoswitchable molecules on rigid, low-mass nanostructures, transcend the physical limitations of molecular solar thermal fuels by introducing local sterically constrained environments in which interactions between chromophores can be tuned. We demonstrate this principle of a hybrid solar thermal fuel using azobenzene-functionalized carbon nanotubes. We show that, on composite bundling, the amount of energy stored per azobenzene more than doubles from 58 to 120 kJ mol⁻¹, and the material also maintains robust cyclability and stability. Our results demonstrate that solar thermal fuels composed of molecule-nanostructure hybrids can exhibit significantly enhanced energy-storage capabilities through the generation of template-enforced steric strain.

  4. Increased Mortality in Diabetic Foot Ulcer Patients: The Significance of Ulcer Type

    Science.gov (United States)

    Chammas, N. K.; Hill, R. L. R.; Edmonds, M. E.

    2016-01-01

    Diabetic foot ulcer (DFU) patients have a more than twofold increase in mortality compared with nonulcerated diabetic patients. We investigated (a) the cause of death in DFU patients, (b) age at death, and (c) the relationship between cause of death and ulcer type. This was an eleven-year retrospective study of DFU patients who attended King's College Hospital Foot Clinic and subsequently died. A control group of nonulcerated diabetic patients was matched for age and type of diabetes mellitus. The cause of death was identified from death certificates (DC) and postmortem (PM) examinations. There were 243 DFU patient deaths during this period. Ischaemic heart disease (IHD) was the major cause of death, in 62.5% on PM compared to 45.7% on DC. Mean age at death from IHD on PM was 5 years lower in DFU patients compared to controls (68.2 ± 8.7 years versus 73.1 ± 8.0 years, P = 0.015). IHD as a cause of death at PM was significantly linked to neuropathic foot ulcers (OR 3.064, 95% CI 1.003-9.366, P = 0.049). Conclusions. IHD is the major cause of premature mortality in DFU patients, with neuropathic foot ulcer patients being at greater risk. PMID:27213157

  5. Factors associated with an increased risk of vertebral fracture in monoclonal gammopathies of undetermined significance

    International Nuclear Information System (INIS)

    Piot, J M; Royer, M; Schmidt-Tanguy, A; Hoppé, E; Gardembas, M; Bourrée, T; Hunault, M; François, S; Boyer, F; Ifrah, N; Renier, G; Chevailler, A; Audran, M; Chappard, D; Libouban, H; Mabilleau, G; Legrand, E; Bouvard, B

    2015-01-01

    Monoclonal gammopathies of undetermined significance (MGUS) have been shown to be associated with an increased risk of fractures. This study prospectively describes the bone status of MGUS patients and determines the factors associated with vertebral fracture. We prospectively included 201 patients with incidentally discovered MGUS and no known history of osteoporosis: mean age 66.6±12.5 years, 48.3% women, 51.7% immunoglobulin G (IgG), 33.3% IgM and 10.4% IgA. The light chain was kappa in 64.2% of patients. All patients had spinal radiographs and bone mineral density measurements in addition to gammopathy assessment. At least one prevalent non-traumatic vertebral fracture was discovered in 18.4% of patients, equally distributed between men and women. Fractured patients were older, had lower bone density and more frequently had a lambda light chain isotype. Compared with patients with a κ light chain, the odds ratio of being fractured for patients with a λ light chain was 4.32 (95% confidence interval 1.80-11.16; P=0.002). These results suggest a high prevalence of non-traumatic vertebral fractures in MGUS, associated with the lambda light chain isotype and not explained solely by low bone density.

  6. Constrained parameterisation of photosynthetic capacity causes significant increase of modelled tropical vegetation surface temperature

    Science.gov (United States)

    Kattge, J.; Knorr, W.; Raddatz, T.; Wirth, C.

    2009-04-01

    Photosynthetic capacity is one of the most sensitive parameters of terrestrial biosphere models, and its representation in global-scale simulations has been severely hampered by a lack of systematic analyses using a sufficiently broad database. Because of its coupling to stomatal conductance, changes in the parameterisation of photosynthetic capacity may influence transpiration rates and vegetation surface temperature. Here, we provide a constrained parameterisation of photosynthetic capacity for different plant functional types (PFTs) in the context of the photosynthesis model proposed by Farquhar et al. (1980), based on a comprehensive compilation of leaf photosynthesis rates and leaf nitrogen content. Mean values of photosynthetic capacity were implemented into the coupled climate-vegetation model ECHAM5/JSBACH, and modelled gross primary production (GPP) was compared to a compilation of independent observations at stand scale. Compared with the current standard parameterisation, the root-mean-squared difference between modelled and observed GPP is substantially reduced for almost all PFTs by the new parameterisation of photosynthetic capacity. We find a systematic depression of NUE (photosynthetic capacity divided by leaf nitrogen content) on certain tropical soils that are known to be deficient in phosphorus. The photosynthetic capacity of tropical trees derived in this study is substantially lower than the standard estimates currently used in terrestrial biosphere models. This decreases modelled GPP while significantly increasing modelled tropical vegetation surface temperatures, by up to 0.8°C. These results emphasise the importance of a constrained parameterisation of photosynthetic capacity not only for the carbon cycle, but also for the climate system.
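    For context, the Rubisco-limited carboxylation rate in the Farquhar et al. (1980) model makes the direct dependence on photosynthetic capacity (the maximum carboxylation rate, Vcmax) explicit: lowering Vcmax, as the revised tropical estimates imply, scales this rate down directly. A minimal sketch using widely cited 25 °C kinetic constants (assumed textbook values, not taken from this study) and ignoring dark respiration:

    ```python
    def rubisco_limited_a(vcmax, ci, gamma_star=42.75, kc=404.9, ko=278.4, o=210.0):
        """Rubisco-limited assimilation (umol CO2 m-2 s-1), dark respiration ignored.
        ci, gamma_star, kc in umol/mol; ko and o (O2 level) in mmol/mol."""
        return vcmax * (ci - gamma_star) / (ci + kc * (1.0 + o / ko))

    # Effect of lowering Vcmax, e.g. on phosphorus-deficient tropical soils
    for vcmax in (60.0, 30.0):              # umol m-2 s-1, hypothetical values
        print(vcmax, round(rubisco_limited_a(vcmax, ci=280.0), 1))
    ```

    Because stomatal conductance is coupled to this assimilation rate, a lower Vcmax also lowers transpiration, which is the pathway by which the modelled surface temperature rises.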

  7. Significant yield increases from control of leaf diseases in maize - an overlooked problem?!

    DEFF Research Database (Denmark)

    Jørgensen, Lise Nistrup

    2012-01-01

    The area of maize has increased in several European countries in recent years. In Denmark, the area has increased from 10,000 ha in 1980 to 185,000 ha in 2011. Initially only silage maize was cultivated in Denmark, but in more recent years the area of grain maize has also increased. Farms growing...

  8. Fetal rat metabonome alteration by prenatal caffeine ingestion probably due to the increased circulatory glucocorticoid level and altered peripheral glucose and lipid metabolic pathways

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Yansong [Department of Pharmacology, Basic Medical School of Wuhan University, Wuhan University, Wuhan, 430071 (China); Xu, Dan [Department of Pharmacology, Basic Medical School of Wuhan University, Wuhan University, Wuhan, 430071 (China); Research Center of Food and Drug Evaluation, Wuhan University, Wuhan, 430071 (China); Feng, Jianghua, E-mail: jianghua.feng@xmu.edu.cn [Wuhan Institute of Physics and Mathematics, Chinese Academy of Sciences, Wuhan, 430071 (China); Department of Electronic Science, Fujian Provincial Key Laboratory of Plasma and Magnetic Resonance, Xiamen University, Xiamen, 361005 (China); Kou, Hao; Liang, Gai [Department of Pharmacology, Basic Medical School of Wuhan University, Wuhan University, Wuhan, 430071 (China); Yu, Hong; He, Xiaohua; Zhang, Baifang; Chen, Liaobin [Research Center of Food and Drug Evaluation, Wuhan University, Wuhan, 430071 (China); Magdalou, Jacques [UMR 7561 CNRS-Nancy Université, Faculté de Médicine, Vandoeuvre-lès-Nancy (France); Wang, Hui, E-mail: wanghui19@whu.edu.cn [Department of Pharmacology, Basic Medical School of Wuhan University, Wuhan University, Wuhan, 430071 (China); Research Center of Food and Drug Evaluation, Wuhan University, Wuhan, 430071 (China)

    2012-07-15

    The aims of this study were to clarify the metabonome alteration in fetal rats after prenatal caffeine ingestion and to explore the underlying mechanism pertaining to the increased fetal circulatory glucocorticoid (GC) level. Pregnant Wistar rats were intragastrically administered daily with different doses of caffeine (0, 20, 60 and 180 mg/kg) from gestational days (GD) 11 to 20. The metabonomes of fetal plasma and amniotic fluid on GD20 were analyzed by ¹H nuclear magnetic resonance-based metabonomics. Gene and protein expression involved in GC metabolism and glucose and lipid metabolic pathways in fetal liver and gastrocnemius was measured by real-time RT-PCR and immunohistochemistry. The fetal plasma metabonome was significantly altered by caffeine, presenting as elevated α- and β-glucose, reduced contents of multiple lipids, varied apolipoprotein contents and increased levels of a number of amino acids. The metabonome of amniotic fluid showed changes similar to those in fetal plasma. Furthermore, the expression of 11β-hydroxysteroid dehydrogenase 2 (11β-HSD-2) was decreased, while the level of blood GC and the expression of 11β-HSD-1 and the glucocorticoid receptor (GR) were increased in fetal liver and gastrocnemius. Meanwhile, the expression of insulin-like growth factor 1 (IGF-1), IGF-1 receptor and insulin receptor was decreased, while the expression of adiponectin receptor 2, leptin receptors and AMP-activated protein kinase α2 was increased after caffeine treatment. Prenatal caffeine ingestion characteristically changed the fetal metabonome, which is probably attributable to alterations of glucose and lipid metabolic pathways induced by the increased circulatory GC, activated GC metabolism and enhanced GR expression in peripheral metabolic tissues. -- Highlights: ► Prenatal caffeine ingestion altered the metabonome of IUGR fetal rats. ► Caffeine altered the glucose and lipid metabolic pathways of IUGR fetal rats. ► Prenatal caffeine

  9. Fetal rat metabonome alteration by prenatal caffeine ingestion probably due to the increased circulatory glucocorticoid level and altered peripheral glucose and lipid metabolic pathways

    International Nuclear Information System (INIS)

    Liu, Yansong; Xu, Dan; Feng, Jianghua; Kou, Hao; Liang, Gai; Yu, Hong; He, Xiaohua; Zhang, Baifang; Chen, Liaobin; Magdalou, Jacques; Wang, Hui

    2012-01-01

    The aims of this study were to clarify the metabonome alteration in fetal rats after prenatal caffeine ingestion and to explore the underlying mechanism pertaining to the increased fetal circulatory glucocorticoid (GC) level. Pregnant Wistar rats were intragastrically administered daily with different doses of caffeine (0, 20, 60 and 180 mg/kg) from gestational days (GD) 11 to 20. The metabonomes of fetal plasma and amniotic fluid on GD20 were analyzed by ¹H nuclear magnetic resonance-based metabonomics. Gene and protein expression involved in GC metabolism and glucose and lipid metabolic pathways in fetal liver and gastrocnemius was measured by real-time RT-PCR and immunohistochemistry. The fetal plasma metabonome was significantly altered by caffeine, presenting as elevated α- and β-glucose, reduced contents of multiple lipids, varied apolipoprotein contents and increased levels of a number of amino acids. The metabonome of amniotic fluid showed changes similar to those in fetal plasma. Furthermore, the expression of 11β-hydroxysteroid dehydrogenase 2 (11β-HSD-2) was decreased, while the level of blood GC and the expression of 11β-HSD-1 and the glucocorticoid receptor (GR) were increased in fetal liver and gastrocnemius. Meanwhile, the expression of insulin-like growth factor 1 (IGF-1), IGF-1 receptor and insulin receptor was decreased, while the expression of adiponectin receptor 2, leptin receptors and AMP-activated protein kinase α2 was increased after caffeine treatment. Prenatal caffeine ingestion characteristically changed the fetal metabonome, which is probably attributable to alterations of glucose and lipid metabolic pathways induced by the increased circulatory GC, activated GC metabolism and enhanced GR expression in peripheral metabolic tissues. -- Highlights: ► Prenatal caffeine ingestion altered the metabonome of IUGR fetal rats. ► Caffeine altered the glucose and lipid metabolic pathways of IUGR fetal rats. ► Prenatal caffeine ingestion

  10. Unequivocal detection of ozone recovery in the Antarctic Ozone Hole through significant increases in atmospheric layers with minimum ozone

    Science.gov (United States)

    de Laat, Jos; van Weele, Michiel; van der A, Ronald

    2015-04-01

    An important new landmark in present-day ozone research is presented: MLS satellite observations of significant ozone increases during the ozone hole season that can be attributed unequivocally to declining ozone-depleting substances. For many decades the Antarctic ozone hole has been the prime example both of the detrimental effects of human activities on our environment and of how to construct effective and successful environmental policies. Nowadays atmospheric concentrations of ozone-depleting substances are declining, and the first signs of recovery of stratospheric ozone and of ozone in the Antarctic ozone hole have been observed. The claimed detection of significant recovery, however, is still a subject of debate. In this talk we first discuss current uncertainties in the assessment of ozone recovery in the Antarctic ozone hole using multivariate regression methods, and second, we present an alternative approach that identifies ozone hole recovery unequivocally. Even though multivariate regression methods help to reduce uncertainties in estimates of ozone recovery, great care has to be taken in their application because of the uncertainties and degrees of freedom in the choice of independent variables. We show that, taking all uncertainties in the regressions into account, formal recovery of ozone in the Antarctic ozone hole cannot yet be established, though it is likely before the end of the decade (before 2020). Rather than focusing on time and area averages of total ozone columns or ozone profiles, we argue that the time evolution of the probability distribution of vertically resolved ozone in the Antarctic ozone hole contains a better fingerprint for the detection of ozone recovery. The advantages of this method over more traditional trend analyses based on spatio-temporally averaged ozone are discussed. The 10-year record of MLS satellite measurements of ozone in the Antarctic ozone hole shows a

  11. Presence of gingivitis and periodontitis significantly increases hospital charges in patients undergoing heart valve surgery.

    Science.gov (United States)

    Allareddy, Veerasathpurush; Elangovan, Satheesh; Rampa, Sankeerth; Shin, Kyungsup; Nalliah, Romesh P; Allareddy, Veerajalandhar

    2015-01-01

    To examine the prevalence and impact of gingivitis and periodontitis in patients having heart valve surgical procedures. The Nationwide Inpatient Sample for the years 2004-2010 was used. All patients who had heart valve surgical procedures were selected. The prevalence of gingivitis/periodontitis was examined in these patients, and its impact on hospital charges, length of stay, and infectious complications was assessed. 596,190 patients had heart valve surgical procedures. Gingivitis/periodontitis was present in 0.2 percent. Outcomes included: median hospital charges ($175,418 with gingivitis/periodontitis versus $149,353 without) and median length of stay (14 days with gingivitis/periodontitis versus 8 days without). After adjusting for the effects of patient- and hospital-level confounding factors, hospital charges and length of stay were significantly higher in patients with gingivitis/periodontitis compared to their counterparts. Further, patients with gingivitis/periodontitis had significantly higher odds of bacterial infections (OR = 3.41, 95% CI = 2.33-4.98) than those without gingivitis/periodontitis. The presence of gingivitis and periodontitis is associated with a higher risk of bacterial infections and significant hospital resource utilization.

  12. Transabdominal cerclage: the significance of dual pathology and increased preterm delivery.

    Science.gov (United States)

    Farquharson, Roy G; Topping, Joanne; Quenby, Siobhan M

    2005-10-01

    Transabdominal cerclage is a recognised treatment for cervical weakness in women with a history of recurrent mid-trimester loss and a failed elective vaginal suture. The emergence of dual pathology, such as antiphospholipid syndrome and bacterial vaginosis, is associated with an increased risk of preterm delivery (RR 2.34, 95% CI 1.15-5.8). The first 40 cases, managed with strict adherence to an investigation protocol and a consistent treatment plan, are described.

  13. Clinical significance of increased lung/heart ratio in 201Tl stress myocardial imaging

    International Nuclear Information System (INIS)

    Liu Zaoli; Chang Fengqin; Zhang Fengge; Wang Xiaoyuan; Liu Liuhua

    1990-01-01

    230 cases were studied with 201Tl stress imaging. The results showed that the lung/heart ratio closely correlated with the presence and severity of coronary heart disease (CHD). Among them, 18 cases (7.8%) showed a significantly elevated lung/heart ratio (> 0.50). It was confirmed that all of the 18 cases had severe CHD with left ventricular insufficiency. The author emphasizes that measurement of the lung/heart ratio during 201Tl stress myocardial imaging may be useful for assessing the severity of CHD, evaluating left ventricular function, and judging prognosis.

  14. Strong Selection Significantly Increases Epistatic Interactions in the Long-Term Evolution of a Protein.

    Directory of Open Access Journals (Sweden)

    Aditi Gupta

    2016-03-01

    Epistatic interactions between residues determine a protein's adaptability and shape its evolutionary trajectory. When a protein experiences a changed environment, it is under strong selection to find a peak in the new fitness landscape. It has been shown that strong selection increases epistatic interactions as well as the ruggedness of the fitness landscape, but little is known about how the epistatic interactions change under selection in the long-term evolution of a protein. Here we analyze the evolution of epistasis in the protease of the human immunodeficiency virus type 1 (HIV-1), using protease sequences collected for almost a decade from both treated and untreated patients, to understand how epistasis changes and how those changes impact the long-term evolvability of a protein. We use an information-theoretic proxy for epistasis that quantifies the co-variation between sites, and show that positive information is a necessary (but not sufficient) condition for detecting epistasis in most cases. We analyze the "fossils" of the evolutionary trajectories of the protein contained in the sequence data, and show that epistasis continues to accumulate under strong selection, but not for proteins whose environment is unchanged. The increase in epistasis compensates for the information loss due to sequence variability brought about by treatment, and facilitates adaptation in the increasingly rugged fitness landscape of treatment. While epistasis is thought to enhance evolvability via valley-crossing early on in adaptation, it can hinder adaptation later when the landscape has turned rugged. However, we find no evidence that the HIV-1 protease has reached its potential for evolution after 9 years of adapting to a drug environment that itself is constantly changing. We suggest that the mechanism of encoding new information into pairwise interactions is central to protein evolution not just in HIV-1 protease, but for any protein adapting to a changing

  15. Corruption Significantly Increases the Capital Cost of Power Plants in Developing Contexts

    Directory of Open Access Journals (Sweden)

    Kumar Biswajit Debnath

    2018-03-01

    Emerging economies, with rapidly growing populations and energy demand, own some of the most expensive power plants in the world. We hypothesized that corruption has a relationship with the capital cost of power plants in developing countries such as Bangladesh. For this study, we analyzed the capital cost of 61 operational and planned power plants in Bangladesh. An initial comparison study revealed that the mean capital cost of a power plant in Bangladesh is twice the global average. Statistical analysis then revealed a significant correlation between corruption and the cost of power plants, indicating that higher corruption leads to greater capital cost. The high up-front cost can be a significant burden on the economy, at present and in the future, as most plants are financed through international loans with extended repayment terms. There is, therefore, an urgent need for a review of the procurement and due diligence process for establishing power plants, and for the implementation of a more transparent system to mitigate the adverse effects of corruption on megaprojects.

  16. Modern environmental health hazards: a public health issue of increasing significance in Africa.

    Science.gov (United States)

    Nweke, Onyemaechi C; Sanders, William H

    2009-06-01

    Traditional hazards such as poor sanitation currently account for most of Africa's environmentally related disease burden. However, with rapid development absent appropriate safeguards for environment and health, modern environmental health hazards (MEHHs) may emerge as critical contributors to the continent's disease burden. We review recent evidence of human exposure to and health effects from MEHHs, and their occurrence in environmental media and consumer products. Our purpose is to highlight the growing significance of these hazards as African countries experience urbanization, industrial growth, and development. We reviewed published epidemiologic, exposure, and environmental studies of chemical agents such as heavy metals and pesticides. The body of evidence demonstrates ongoing environmental releases of MEHHs and human exposures sometimes at toxicologically relevant levels. Several sources of MEHHs in environmental media have been identified, including natural resource mining and processing and automobile exhaust. Biomonitoring studies provided direct evidence of human exposure to metals such as mercury and lead and pesticides such as p,p'-dichlorodiphenyltrichloroethane (DDT) and organophosphates. Land and water resource pollution and industrial air toxics are areas of significant data gaps, notwithstanding the presence of several emitting sources. Unmitigated MEHH releases and human exposure have implications for Africa's disease burden. For Africans encumbered by conditions such as malnutrition that impair resilience to toxicologic challenges, the burden may be higher. A shift in public health policy toward accommodating the emerging diversity in Africa's environmental health issues is necessary to successfully alleviate the burden of avoidable ill health and premature death for all its communities now and in the future.

  17. Combining modularity, conservation, and interactions of proteins significantly increases precision and coverage of protein function prediction

    Directory of Open Access Journals (Sweden)

    Sers Christine T

    2010-12-01

    Background: While the number of newly sequenced genomes and genes is constantly increasing, elucidation of their function is still a laborious and time-consuming task. This has led to the development of a wide range of methods for predicting protein functions in silico. We report on a new method that predicts function based on a combination of information about protein interactions, orthology, and the conservation of protein networks in different species. Results: We show that aggregation of these independent sources of evidence leads to a drastic increase in the number and quality of predictions when compared to baselines and other methods reported in the literature. For instance, our method generates more than 12,000 novel protein functions for human with an estimated precision of ~76%, among which are 7,500 new functional annotations for 1,973 human proteins that previously had zero or only one function annotated. We also verified our predictions on a set of genes that play an important role in colorectal cancer (MLH1, PMS2, EPHB4) and could confirm more than 73% of them based on evidence in the literature. Conclusions: The combination of different methods into a single, comprehensive prediction method infers thousands of protein functions for every species included in the analysis at varying, yet always high, levels of precision and very good coverage.

  18. Increased Body Mass Index during Therapy for Childhood Acute Lymphoblastic Leukemia: A Significant and Underestimated Complication

    Directory of Open Access Journals (Sweden)

    Helen C. Atkinson

    2015-01-01

    Objective & Design. We undertook a retrospective review of children diagnosed with acute lymphoblastic leukemia (ALL) and treated with modern COG protocols (n=80) to determine longitudinal changes in body mass index (BMI) and the prevalence of obesity compared with a healthy reference population. Results. At diagnosis, the majority of patients (77.5%) were in the healthy weight category. During treatment, increases in BMI z-scores were greater for females than males; the prevalence of obesity increased from 10.3% to 44.8% (P<0.004) for females but remained relatively unchanged for males (9.8% to 13.7%, P=0.7). Longitudinal analysis using linear mixed-effects models identified associations between BMI z-scores and time-dependent interactions with sex (P=0.0005), disease risk (P<0.0001), age (P=0.0001), BMI z-score at diagnosis (P<0.0001), and total dose of steroid during maintenance (P=0.01). Predicted mean BMI z-scores at the end of therapy were greater for females with standard risk ALL irrespective of age at diagnosis and for males younger than 4 years of age at diagnosis with standard risk ALL. Conclusion. Females treated on standard risk protocols and younger males may be at greatest risk of becoming obese during treatment for ALL. These subgroups may benefit from intervention strategies to manage BMI during treatment for ALL.

  19. Big data integration shows Australian bush-fire frequency is increasing significantly.

    Science.gov (United States)

    Dutta, Ritaban; Das, Aruneema; Aryal, Jagannath

    2016-02-01

    Increasing Australian bush-fire frequency over the last decade indicates a major climatic change in the coming future. Understanding of such climatic change in relation to Australian bush-fire is limited, and there is an urgent need for scientific research capable of contributing to Australian society. The frequency of bush-fire carries information on spatial, temporal, and climatic aspects of bush-fire events and provides contextual information for modelling various climate data to accurately predict future bush-fire hot spots. In this study, we develop an ensemble method based on a two-layered machine learning model to establish a relationship between fire incidence and climatic data. In a 336-week data trial, we demonstrate that the model provides highly accurate bush-fire incidence hot-spot estimation (91% global accuracy) from the weekly climatic surfaces. Our analysis also indicates that Australian weekly bush-fire frequencies increased by 40% over the last 5 years, particularly during summer months, indicating a serious climatic shift.
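    The paper's two-layered ensemble is not published as code; purely as an illustration of the general pattern (base learners whose outputs feed a meta-learner), here is a hedged scikit-learn sketch with made-up weekly climate features X and binary hot-spot labels y.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
X = rng.normal(size=(336, 6))   # 336 weeks x 6 hypothetical climate variables
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=336) > 0).astype(int)

# Layer 1: two diverse base learners; layer 2: a logistic meta-learner.
model = StackingClassifier(
    estimators=[
        ("rf", RandomForestClassifier(n_estimators=200, random_state=0)),
        ("gb", GradientBoostingClassifier(random_state=0)),
    ],
    final_estimator=LogisticRegression(),
)
print("cross-validated accuracy:", cross_val_score(model, X, y, cv=5).mean())
```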

  20. Significantly Increased Extreme Precipitation Expected in Europe and North America from Extratropical Storms

    Science.gov (United States)

    Hawcroft, M.; Hodges, K.; Walsh, E.; Zappa, G.

    2017-12-01

    For the Northern Hemisphere extratropics, changes in circulation are key to determining the impacts of climate warming. The mechanisms governing these circulation changes are complex, leading to the well documented uncertainty in projections of the future location of the mid-latitude storm tracks simulated by climate models. These storms are the primary source of precipitation for North America and Europe and generate many of the large-scale precipitation extremes associated with flooding and severe economic loss. Here, we show that in spite of the uncertainty in circulation changes, by analysing the behaviour of the storms themselves, we find entirely consistent and robust projections across an ensemble of climate models. In particular, we find that projections of change in the most intensely precipitating storms (above the present day 99th percentile) in the Northern Hemisphere are substantial and consistent across models, with large increases in the frequency of both summer (June-August, +226±68%) and winter (December-February, +186±34%) extreme storms by the end of the century. Regionally, both North America (summer +202±129%, winter +232±135%) and Europe (summer +390±148%, winter +318±114%) are projected to experience large increases in the frequency of intensely precipitating storms. These changes are thermodynamic and driven by surface warming, rather than by changes in the dynamical behaviour of the storms. Such changes in storm behaviour have the potential to have major impacts on society given intensely precipitating storms are responsible for many large-scale flooding events.

  1. Circulatory nucleosome levels are significantly increased in early and late-onset preeclampsia.

    Science.gov (United States)

    Zhong, Xiao Yan; Gebhardt, Stefan; Hillermann, Renate; Tofa, Kashefa Carelse; Holzgreve, Wolfgang; Hahn, Sinuhe

    2005-08-01

    Elevations in circulatory DNA, as measured by real-time PCR, have been observed in pregnancies with manifest preeclampsia. Recent reports have indicated that circulatory nucleosome levels are elevated in the periphery of cancer patients. We have now examined whether circulatory nucleosome levels are similarly elevated in cases with preeclampsia. Maternal plasma samples were prepared from 17 cases with early-onset preeclampsia (<34 weeks gestation) and from cases with late-onset preeclampsia (>34 weeks gestation), together with 10 matched normotensive controls. Levels of circulatory nucleosomes were quantified by commercial ELISA (enzyme-linked immunosorbent assay). The level of circulatory nucleosomes was significantly elevated in both preeclampsia study groups, compared to the matched normotensive control group (p = 0.000 and p = 0.001, respectively). Our data suggest that preeclampsia is associated with the elevated presence of circulatory nucleosomes, and that this phenomenon occurs in both early- and late-onset forms of the disorder. Copyright 2005 John Wiley & Sons, Ltd.

  2. TNFRSF14 aberrations in follicular lymphoma increase clinically significant allogeneic T-cell responses.

    Science.gov (United States)

    Kotsiou, Eleni; Okosun, Jessica; Besley, Caroline; Iqbal, Sameena; Matthews, Janet; Fitzgibbon, Jude; Gribben, John G; Davies, Jeffrey K

    2016-07-07

    Donor T-cell immune responses can eradicate lymphomas after allogeneic hematopoietic stem cell transplantation (AHSCT), but can also damage healthy tissues resulting in harmful graft-versus-host disease (GVHD). Next-generation sequencing has recently identified many new genetic lesions in follicular lymphoma (FL). One such gene, tumor necrosis factor receptor superfamily 14 (TNFRSF14), abnormal in 40% of FL patients, encodes the herpes virus entry mediator (HVEM) which limits T-cell activation via ligation of the B- and T-lymphocyte attenuator. As lymphoma B cells can act as antigen-presenting cells, we hypothesized that TNFRSF14 aberrations that reduce HVEM expression could alter the capacity of FL B cells to stimulate allogeneic T-cell responses and impact the outcome of AHSCT. In an in vitro model of alloreactivity, human lymphoma B cells with TNFRSF14 aberrations had reduced HVEM expression and greater alloantigen-presenting capacity than wild-type lymphoma B cells. The increased immune-stimulatory capacity of lymphoma B cells with TNFRSF14 aberrations had clinical relevance, associating with higher incidence of acute GVHD in patients undergoing AHSCT. FL patients with TNFRSF14 aberrations may benefit from more aggressive immunosuppression to reduce harmful GVHD after transplantation. Importantly, this study is the first to demonstrate the impact of an acquired genetic lesion on the capacity of tumor cells to stimulate allogeneic T-cell immune responses which may have wider consequences for adoptive immunotherapy strategies. © 2016 by The American Society of Hematology.

  3. Exposure to Tumescent Solution Significantly Increases Phosphorylation of Perilipin in Adipocytes.

    Science.gov (United States)

    Keskin, Ilknur; Sutcu, Mustafa; Eren, Hilal; Keskin, Mustafa

    2017-02-01

    Lidocaine and epinephrine could potentially decrease adipocyte viability, but these effects have not been substantiated. The phosphorylation status of perilipin in adipocytes may be predictive of cell viability. Perilipin coats lipid droplets and restricts access of lipases; phospho-perilipin lacks this protective function. The authors investigated the effects of tumescent solution containing lidocaine and epinephrine on the phosphorylation status of perilipin in adipocytes. In this in vitro study, lipoaspirates were collected before and after tumescence from 15 women who underwent abdominoplasty. Fat samples were fixed, sectioned, and stained for histologic and immunohistochemical analyses. Relative phosphorylation of perilipin was inferred from pixel intensities of immunostained adipocytes observed with confocal microscopy. For adipocytes collected before tumescent infiltration, 10.08% of total perilipin was phosphorylated. In contrast, 30.62% of total perilipin was phosphorylated for adipocytes collected from tumescent tissue (P < .01). The tumescent technique increases the relative phosphorylation of perilipin in adipocytes, making these cells more vulnerable to lipolysis. Tumescent solution applied for analgesia or hemostasis of the donor site should contain the lowest possible concentrations of lidocaine and epinephrine. LEVEL OF EVIDENCE 5. © 2016 The American Society for Aesthetic Plastic Surgery, Inc. Reprints and permission: journals.permissions@oup.com.

  4. Significance of Increasing n-3 PUFA Content in Pork on Human Health.

    Science.gov (United States)

    Ma, Xianyong; Jiang, Zongyong; Lai, Chaoqiang

    2016-01-01

    Evidence for the health-promoting effects of food rich in n-3 polyunsaturated fatty acids (n-3 PUFA) is reviewed. Pork is an important meat source for humans. According to a report by the US Department of Agriculture ( http://www.ers.usda.gov/topics ), worldwide pork consumption in 2011 was about 79.3 million tons, much higher than that of beef (48.2 million tons). Pork also contains high levels of unsaturated fatty acids relative to ruminant meats (Enser, M., Hallett, K., Hewett, B., Fursey, G. A. J. and Wood, J. D. (1996). Fatty acid content and composition of English beef, lamb, and pork at retail. Meat Sci. 44:443-458). The available literature indicates that the levels of eicosatetraenoic and docosahexaenoic acids in pork may be increased by fish-derived or linseed products, the extent of the increase depending on the nature of the supplementation. Transgenic pigs and plants show promise, with high contents of n-3 PUFA and a low ratio of n-6/n-3 fatty acids in their tissues. The approaches mentioned for decreasing n-6/n-3 ratios have both advantages and disadvantages. Selected articles are critically reviewed and summarized.

  5. Continued administration of Nano-PSO significantly increased survival of genetic CJD mice.

    Science.gov (United States)

    Binyamin, Orli; Keller, Guy; Frid, Kati; Larush, Liraz; Magdassi, Shlomo; Gabizon, Ruth

    2017-12-01

    We have shown previously that Nano-PSO, a nanodroplet formulation of pomegranate seed oil, delayed the progression of neurodegeneration signs when administered for a designated period of time to TgMHu2ME199K mice, a model of genetic prion disease. In the present work, we treated these mice with a self-emulsion formulation of Nano-PSO or a parallel soybean oil formulation from their day of birth until a terminal disease stage. We found that long-term Nano-PSO administration increased the survival of TgMHu2ME199K lines by several months. Interestingly, initiation of treatment at day 1 had no clinical advantage over initiation at day 70; however, cessation of treatment at 9 months of age resulted in the rapid loss of the beneficial clinical effect. Pathological studies revealed that treatment with Nano-PSO resulted in the reduction of GAG accumulation and lipid oxidation, indicating a strong neuroprotective effect. In contrast, the clinical effect of Nano-PSO did not correlate with reduction in the levels of disease-related PrP, the main prion marker. We conclude that long-term administration of Nano-PSO is safe and may be effective in the prevention/delay of onset of neurodegenerative conditions such as genetic CJD. Copyright © 2017. Published by Elsevier Inc.

  6. Elicitor Mixtures Significantly Increase Bioactive Compounds, Antioxidant Activity, and Quality Parameters in Sweet Bell Pepper

    Directory of Open Access Journals (Sweden)

    Lina Garcia-Mier

    2015-01-01

    Sweet bell peppers are greatly appreciated for their taste, color, pungency, and aroma. Additionally, they are good sources of bioactive compounds with antioxidant activity, which can be improved by the use of elicitors. Elicitors act as metabolite-inducing factors (MIF) by mimicking stress conditions. Since plants rarely experience a single stress condition in isolation but are more likely to be exposed to simultaneous stresses, it is important to evaluate the effect of elicitors on plant secondary metabolism as mixtures. Jasmonic acid (JA), hydrogen peroxide (HP), and chitosan (CH) were applied to fruits and plants of bell pepper as mixtures. Bioactive compounds, antioxidant activity, and quality parameters were evaluated. The assessed elicitor cocktail led to an increase in the variables evaluated (P ≤ 0.05) when applied to mature fruits after harvest, whereas the lowest values were observed in the treatment applied to immature fruits. Therefore, the application of the elicitor cocktail to harvested mature fruits is recommended in order to improve the bioactive compounds and antioxidant activity of sweet bell peppers.

  7. Ruin probabilities

    DEFF Research Database (Denmark)

    Asmussen, Søren; Albrecher, Hansjörg

    The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle, and the connection to other applied probability areas, like queueing theory. In this substantially updated and extended second version, new topics include stochastic control, fluctuation theory for Levy processes, Gerber–Shiu functions and dependence.
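    For a concrete feel for the classical quantities named here, the sketch below (not taken from the book) computes the exact infinite-horizon ruin probability for the compound Poisson model with exponential claims and checks it by simulation; it also exhibits the adjustment coefficient R of Lundberg's inequality psi(u) <= exp(-R*u).

```python
import numpy as np

lam, mu, theta = 1.0, 1.0, 0.2      # claim rate, mean claim size, safety loading
c = (1 + theta) * lam * mu          # premium rate
u = 5.0                             # initial reserve

# Exponential claims admit the closed form psi(u) = exp(-R*u) / (1 + theta),
# with adjustment (Lundberg) coefficient R = theta / (mu * (1 + theta)).
R = theta / (mu * (1 + theta))
print("exact ruin probability  :", np.exp(-R * u) / (1 + theta))

# Monte Carlo check: ruin occurs iff the random walk of per-claim deficits
# X_i - c*A_i ever climbs above u (horizon truncated; the bias is negligible).
rng = np.random.default_rng(2)
n_paths, horizon = 10_000, 500
inter = rng.exponential(1 / lam, size=(n_paths, horizon))
claims = rng.exponential(mu, size=(n_paths, horizon))
walk = np.cumsum(claims - c * inter, axis=1)
print("simulated ruin frequency:", np.mean(walk.max(axis=1) > u))
```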

  8. Generalized Probability-Probability Plots

    NARCIS (Netherlands)

    Mushkudiani, N.A.; Einmahl, J.H.J.

    2004-01-01

    We introduce generalized Probability-Probability (P-P) plots in order to study the one-sample goodness-of-fit problem and the two-sample problem, for real-valued data. These plots, constructed by indexing with the class of closed intervals, globally preserve the properties of classical P-P plots.
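    The classical plot that these generalized plots extend can be built in a few lines; the following sketch is our illustration, not the paper's construction, and plots the empirical CDF of a sample against a fitted normal CDF.

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy import stats

rng = np.random.default_rng(3)
x = np.sort(rng.normal(loc=2.0, scale=1.5, size=200))

empirical = np.arange(1, x.size + 1) / x.size                      # F_n at x_(i)
theoretical = stats.norm.cdf(x, loc=x.mean(), scale=x.std(ddof=1)) # fitted CDF

plt.plot(theoretical, empirical, ".", label="P-P points")
plt.plot([0, 1], [0, 1], "k--", label="perfect fit")
plt.xlabel("fitted normal CDF")
plt.ylabel("empirical CDF")
plt.legend()
plt.show()
```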

  9. Increase in tumor control and normal tissue complication probabilities in advanced head-and-neck cancer for dose-escalated intensity-modulated photon and proton therapy

    Directory of Open Access Journals (Sweden)

    Annika Jakobi

    2015-11-01

    Introduction: Presently used radio-chemotherapy regimens result in moderate local control rates for patients with advanced head and neck squamous cell carcinoma (HNSCC). Dose escalation (DE) may be an option to improve patient outcome, but may also increase the risk of toxicities in healthy tissue. The presented treatment planning study evaluated the feasibility of two DE levels for advanced HNSCC patients, planned with either intensity-modulated photon therapy (IMXT) or proton therapy (IMPT). Materials and Methods: For 45 HNSCC patients, IMXT and IMPT treatment plans were created including DE via a simultaneous integrated boost (SIB) in the high-risk volume, while maintaining standard fractionation with 2 Gy per fraction in the remaining target volume. Two DE levels for the SIB were compared: 2.3 Gy and 2.6 Gy. Treatment plan evaluation included assessment of tumor control probabilities (TCP) and normal tissue complication probabilities (NTCP). Results: An increase of approximately 10% in TCP was estimated between the DE levels. A pronounced high-dose rim surrounding the SIB volume was identified in IMXT treatment. Compared to IMPT, this extra dose slightly increased the TCP values and, to a larger extent, the NTCP values. For both modalities, the higher DE level led only to a small increase in NTCP values (mean differences < 2% in all models), except for the risk of aspiration, which increased on average by 8% and 6% with IMXT and IMPT, respectively, but showed considerable patient dependence. Conclusions: Both DE levels appear applicable to patients with IMXT and IMPT since all calculated NTCP values, except for one, increased only slightly for the higher DE level. The estimated TCP increase is of relevant magnitude. The higher DE schedule needs to be investigated carefully in the setting of a prospective clinical trial, especially regarding toxicities caused by high local doses that lack a sound dose-response description, e.g., ulcers.
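    The abstract does not specify which TCP and NTCP models were used; as a hedged sketch of the standard model families usually behind such numbers, the following computes a Poisson TCP with linear-quadratic cell kill and a Lyman-Kutcher-Burman NTCP for the two boost levels. All radiobiological parameters and the EUD surrogate are placeholders, not study values.

```python
import numpy as np
from scipy.stats import norm

def tcp_poisson(d_per_fx, n_fx, alpha=0.15, beta=0.015, n_clonogens=1e7):
    """Poisson TCP with linear-quadratic cell survival (placeholder parameters)."""
    surviving_fraction = np.exp(-(alpha * d_per_fx + beta * d_per_fx**2) * n_fx)
    return np.exp(-n_clonogens * surviving_fraction)

def ntcp_lkb(eud, td50=65.0, m=0.17):
    """Lyman-Kutcher-Burman NTCP as a probit function of EUD in Gy."""
    return norm.cdf((eud - td50) / (m * td50))

# Compare the two SIB dose levels over a hypothetical 35-fraction course;
# the organ-at-risk EUD is crudely approximated as 60% of the boost dose.
for d in (2.3, 2.6):
    print(f"{d} Gy/fx: TCP = {tcp_poisson(d, 35):.2f}, "
          f"NTCP = {ntcp_lkb(0.6 * d * 35):.2f}")
```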

  10. Probability-1

    CERN Document Server

    Shiryaev, Albert N

    2016-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.

  11. Ignition Probability

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — USFS, State Forestry, BLM, and DOI fire occurrence point locations from 1987 to 2008 were combined and converted into a fire occurrence probability or density grid...

  12. Quantum Probabilities as Behavioral Probabilities

    Directory of Open Access Journals (Sweden)

    Vyacheslav I. Yukalov

    2017-03-01

    We demonstrate that behavioral probabilities of human decision makers share many common features with quantum probabilities. This does not imply that humans are quantum objects, but simply shows that the mathematics of quantum theory is applicable to the description of human decision making. The applicability of quantum rules for describing decision making is connected with the nontrivial process of making decisions in the case of composite prospects under uncertainty. Such a process involves deliberations of a decision maker when making a choice. In addition to evaluating the utilities of the considered prospects, real decision makers also appreciate their respective attractiveness. Therefore, human choice is not based solely on the utility of prospects, but includes the necessity of resolving the utility-attraction duality. In order to justify that human consciousness really functions similarly to the rules of quantum theory, we develop an approach defining human behavioral probabilities as the probabilities determined by quantum rules. We show that quantum behavioral probabilities of humans do not merely explain qualitatively how human decisions are made, but they predict quantitative values of the behavioral probabilities. Analyzing a large set of empirical data, we find good quantitative agreement between theoretical predictions and observed experimental data.

  13. Risk Probabilities

    DEFF Research Database (Denmark)

    Rojas-Nandayapa, Leonardo

    Tail probabilities of sums of heavy-tailed random variables are of major importance in various branches of Applied Probability, such as Risk Theory, Queueing Theory, and Financial Management, and are subject to intense research nowadays. To understand their relevance one just needs to recall that there is, in general, no analytic expression for the distribution function of a sum of random variables. The presence of heavy-tailed random variables complicates the problem even more. The objective of this dissertation is to provide better approximations by means of sharp asymptotic expressions and Monte Carlo estimators.
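    One well-known Monte Carlo estimator from this literature is the Asmussen-Kroese conditional estimator for P(X1+...+Xn > u) with heavy-tailed summands; a minimal sketch for lognormal summands follows (all parameter values are arbitrary).

```python
import numpy as np
from scipy.stats import lognorm

n, u, reps = 10, 50.0, 100_000
dist = lognorm(s=1.0)               # lognormal summands, sigma = 1

rng = np.random.default_rng(4)
x = dist.rvs(size=(reps, n - 1), random_state=rng)
s, m = x.sum(axis=1), x.max(axis=1)

# Asmussen-Kroese: P(S_n > u) = n * E[ sf(max(M_{n-1}, u - S_{n-1})) ],
# which has far lower variance in the tail than naive simulation.
est = n * dist.sf(np.maximum(m, u - s))
half_width = 1.96 * est.std(ddof=1) / np.sqrt(reps)
print(f"P(S_n > u) ~ {est.mean():.3e} +/- {half_width:.1e}")
```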

  14. Probability tales

    CERN Document Server

    Grinstead, Charles M; Snell, J Laurie

    2011-01-01

    This book explores four real-world topics through the lens of probability theory. It can be used to supplement a standard text in probability or statistics. Most elementary textbooks present the basic theory and then illustrate the ideas with some neatly packaged examples. Here the authors assume that the reader has seen, or is learning, the basic theory from another book and concentrate in some depth on the following topics: streaks, the stock market, lotteries, and fingerprints. This extended format allows the authors to present multiple approaches to problems and to pursue promising side discussions in ways that would not be possible in a book constrained to cover a fixed set of topics. To keep the main narrative accessible, the authors have placed the more technical mathematical details in appendices. The appendices can be understood by someone who has taken one or two semesters of calculus.

  15. Probability theory

    CERN Document Server

    Dorogovtsev, A Ya; Skorokhod, A V; Silvestrov, D S; Skorokhod, A V

    1997-01-01

    This book of problems is intended for students in pure and applied mathematics. There are problems in traditional areas of probability theory and problems in the theory of stochastic processes, which has wide applications in the theory of automatic control, queuing and reliability theories, and in many other modern science and engineering fields. Answers to most of the problems are given, and the book provides hints and solutions for more complicated problems.

  16. Triglyceride content in remnant lipoproteins is significantly increased after food intake and is associated with plasma lipoprotein lipase.

    Science.gov (United States)

    Nakajima, Katsuyuki; Tokita, Yoshiharu; Sakamaki, Koji; Shimomura, Younosuke; Kobayashi, Junji; Kamachi, Keiko; Tanaka, Akira; Stanhope, Kimber L; Havel, Peter J; Wang, Tao; Machida, Tetsuo; Murakami, Masami

    2017-02-01

    Previous large population studies reported that non-fasting plasma triglycerides (TG) reflect a higher risk for cardiovascular disease than fasting TG, suggestive of a higher concentration of remnant lipoproteins (RLP) in postprandial plasma. TG and RLP-TG, together with other lipids, lipoproteins and lipoprotein lipase (LPL), were determined in both fasting and postprandial plasma in generally healthy volunteers and in patients with coronary artery disease (CAD) after consuming a fat load or a more typical moderate meal. The RLP-TG/TG ratio (concentration) and the RLP-TG/RLP-C ratio (particle size) were significantly increased in the postprandial plasma of both healthy controls and CAD patients compared with fasting plasma. The LPL/RLP-TG ratio demonstrated the interaction between RLP concentration and LPL activity. The increased RLP-TG after fat consumption contributed approximately 90% of the increase in plasma TG, versus approximately 60% after a typical meal. Plasma LPL in postprandial plasma was not significantly altered after either type of meal. The concentration of RLP-TG, along with RLP particle size, is significantly increased in postprandial plasma compared with fasting plasma. Therefore, non-fasting TG determination better reflects the presence of higher RLP concentrations in plasma. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.

  17. St. John's wort significantly increased the systemic exposure and toxicity of methotrexate in rats

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Shih-Ying [Graduate Institute of Pharmaceutical Chemistry, China Medical University, Taichung, Taiwan (China); Juang, Shin-Hun [Graduate Institute of Pharmaceutical Chemistry, China Medical University, Taichung, Taiwan (China); Department of Medical Research, China Medical University Hospital, Taichung, Taiwan (China); Tsai, Shang-Yuan; Chao, Pei-Dawn Lee [School of Pharmacy, China Medical University, Taichung, Taiwan (China); Hou, Yu-Chi, E-mail: hou5133@gmail.com [School of Pharmacy, China Medical University, Taichung, Taiwan (China); Department of Medical Research, China Medical University Hospital, Taichung, Taiwan (China)

    2012-08-15

    St. John's wort (SJW, Hypericum perforatum) is one of the popular nutraceuticals for treating depression. Methotrexate (MTX) is an immunosuppressant with a narrow therapeutic window. This study investigated the effect of SJW on MTX pharmacokinetics in rats. Rats were orally given MTX alone or coadministered with 300 or 150 mg/kg of SJW, or 25 mg/kg of diclofenac, respectively. Blood was withdrawn at specific time points and serum MTX concentrations were assayed by a specific monoclonal fluorescence polarization immunoassay method. The results showed that 300 mg/kg of SJW significantly increased the AUC(0-t) and Cmax of MTX by 163% and 60%, respectively, and 150 mg/kg of SJW significantly increased the AUC(0-t) of MTX by 55%. In addition, diclofenac enhanced the Cmax of MTX by 110%. The mortality of rats treated with SJW was higher than that of controls. In conclusion, coadministration of SJW significantly increased the systemic exposure and toxicity of MTX. The combined use of MTX with SJW should be undertaken with caution. -- Highlights: ► St. John's wort significantly increased the AUC(0-t) and Cmax of methotrexate. ► Coadministration of St. John's wort increased the exposure and toxicity of methotrexate. ► The combined use of methotrexate with St. John's wort will need to be with caution.

  18. Significant social events and increasing use of life-sustaining treatment: trend analysis using extracorporeal membrane oxygenation as an example.

    Science.gov (United States)

    Chen, Yen-Yuan; Chen, Likwang; Huang, Tien-Shang; Ko, Wen-Je; Chu, Tzong-Shinn; Ni, Yen-Hsuan; Chang, Shan-Chwen

    2014-03-04

    Most studies have examined the outcomes of patients supported by extracorporeal membrane oxygenation as a life-sustaining treatment. It is unclear whether significant social events are associated with the use of life-sustaining treatment. This study aimed to compare the trend of extracorporeal membrane oxygenation use in Taiwan with that in the world, and to examine the influence of significant social events on the trend of extracorporeal membrane oxygenation use in Taiwan. Taiwan's extracorporeal membrane oxygenation use from 2000 to 2009 was collected from the National Health Insurance Research Dataset. The number of worldwide extracorporeal membrane oxygenation cases was mainly estimated using the Extracorporeal Life Support Registry Report International Summary July 2012. The trend of Taiwan's crude annual incidence rate of extracorporeal membrane oxygenation use was compared with that of the rest of the world. Each trend of extracorporeal membrane oxygenation use was examined using joinpoint regression. The measurement was the crude annual incidence rate of extracorporeal membrane oxygenation use. Each of Taiwan's crude annual incidence rates was much higher than the worldwide one in the same year. Both the trends of Taiwan's and worldwide crude annual incidence rates have significantly increased since 2000. Joinpoint regression selected the model of Taiwan's trend with one joinpoint in 2006 as the best-fitted model, implying that the significant social events in 2006 were significantly associated with the trend change in extracorporeal membrane oxygenation use after 2006. In addition, significant social events highlighted by the media are more likely to be associated with the increase in extracorporeal membrane oxygenation use than is full coverage by National Health Insurance. Significant social events, such as a well-known person's successful extracorporeal membrane oxygenation use highlighted by the mass media, are associated with the use of
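    Joinpoint regression, as used here, fits piecewise-linear trends and selects the break year by goodness of fit; a minimal one-joinpoint sketch with made-up incidence rates (not the study's data) is given below.

```python
import numpy as np

years = np.arange(2000, 2010, dtype=float)
rate = np.array([1.0, 1.2, 1.3, 1.5, 1.6, 1.8, 1.9, 2.6, 3.3, 4.1])  # hypothetical

def fit_one_joinpoint(x, y):
    """Grid-search the joinpoint minimizing the least-squares error."""
    best = None
    for jp in x[1:-1]:
        # Columns: intercept, pre-joinpoint slope, slope change after jp.
        X = np.column_stack([np.ones_like(x), x - x[0], np.clip(x - jp, 0, None)])
        beta, res, *_ = np.linalg.lstsq(X, y, rcond=None)
        sse = float(res[0])
        if best is None or sse < best[0]:
            best = (sse, jp, beta)
    return best

sse, jp, beta = fit_one_joinpoint(years, rate)
print(f"joinpoint at {jp:.0f}: slope {beta[1]:.2f}/yr before, "
      f"{beta[1] + beta[2]:.2f}/yr after")
```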

  19. Social marketing campaign significantly associated with increases in syphilis testing among gay and bisexual men in San Francisco.

    Science.gov (United States)

    Montoya, Jorge A; Kent, Charlotte K; Rotblatt, Harlan; McCright, Jacque; Kerndt, Peter R; Klausner, Jeffrey D

    2005-07-01

    Between 1999 and 2002, San Francisco experienced a sharp increase in early syphilis among gay and bisexual men. In response, the San Francisco Department of Public Health launched a social marketing campaign to increase testing for syphilis, and awareness and knowledge about syphilis, among gay and bisexual men. A convenience sample of 244 gay and bisexual men (18-60 years of age) was surveyed to evaluate the effectiveness of the campaign. Respondents were interviewed to elicit unaided and aided awareness of the campaign, knowledge about syphilis, recent sexual behaviors, and syphilis testing behavior. After controlling for other potential confounders, unaided campaign awareness was a significant correlate of having had a syphilis test in the last 6 months (odds ratio, 3.21; 95% confidence interval, 1.30-7.97) compared with no awareness of the campaign. A comparison of respondents aware of the campaign with those not aware also revealed significant increases in awareness and knowledge about syphilis. The Healthy Penis 2002 campaign achieved its primary objective of increasing syphilis testing, and awareness and knowledge about syphilis, among gay and bisexual men in San Francisco.

  20. Active Brown Fat During 18F-FDG PET/CT Imaging Defines a Patient Group with Characteristic Traits and an Increased Probability of Brown Fat Redetection.

    Science.gov (United States)

    Gerngroß, Carlos; Schretter, Johanna; Klingenspor, Martin; Schwaiger, Markus; Fromme, Tobias

    2017-07-01

    Brown adipose tissue (BAT) provides a means of nonshivering thermogenesis. In humans, active BAT can be visualized by 18F-FDG uptake as detected by PET combined with CT. The retrospective analysis of clinical scans is a valuable source to identify anthropometric parameters that influence BAT mass and activity and thus the potential efficacy of envisioned drugs targeting this tissue to treat metabolic disease. Methods: We analyzed 2,854 18F-FDG PET/CT scans from 1,644 patients and identified 98 scans from 81 patients with active BAT. We quantified the volume of active BAT depots (mean values in mL ± SD: total BAT, 162 ± 183 [n = 98]; cervical, 40 ± 37 [n = 53]; supraclavicular, 66 ± 68 [n = 71]; paravertebral, 51 ± 53 [n = 69]; mediastinal, 43 ± 40 [n = 51]; subphrenic, 21 ± 21 [n = 29]). Because only active BAT is detectable by 18F-FDG uptake, these numbers underestimate the total amount of BAT. Considering only the 32 scans of the highest activity as categorized by a visual scoring strategy, we determined a mean total BAT volume of 308 ± 208 mL. In 30 BAT-positive patients with 3 or more repeated scans, we calculated a much higher mean probability of redetecting active BAT (52% ± 25%) as compared with the overall prevalence of 4.9%. We calculated a BAT activity index (BFI) based on volume and intensity of individual BAT depots. Results: We detected higher total BFI in younger patients (P = 0.009), whereas sex, body mass index, height, mass, outdoor temperature, and blood parameters did not affect total or depot-specific BAT activity. Surprisingly, renal creatinine clearance as estimated from mass, age, and plasma creatinine was a significant predictor of BFI on the total level (P = 0.005) as well as on the level of several individual depots. In summary, we detected a high amount of more than 300 mL of BAT tissue. Conclusion: BAT-positive patients represent a group with a higher than usual probability to activate BAT during a scan. Estimated renal creatinine
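    The creatinine clearance estimate "from mass, age, and plasma creatinine" mentioned in the abstract matches the form of the Cockcroft-Gault equation; whether the authors used exactly this variant is our assumption. A minimal sketch:

```python
def cockcroft_gault(age_years, weight_kg, serum_creatinine_mg_dl, female=False):
    """Estimated creatinine clearance in mL/min (Cockcroft-Gault)."""
    crcl = (140 - age_years) * weight_kg / (72 * serum_creatinine_mg_dl)
    return 0.85 * crcl if female else crcl

# Example: a 60-year-old, 75 kg male with plasma creatinine 1.0 mg/dL.
print(f"{cockcroft_gault(60, 75, 1.0):.0f} mL/min")   # ~83 mL/min
```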

  1. Significance and prognostic value of increased serum direct bilirubin level for lymph node metastasis in Chinese rectal cancer patients.

    Science.gov (United States)

    Gao, Chun; Fang, Long; Li, Jing-Tao; Zhao, Hong-Chuan

    2016-02-28

    To determine the significance of increased serum direct bilirubin level for lymph node metastasis (LNM) in Chinese rectal cancer patients, after those with known hepatobiliary and pancreatic diseases were excluded. A cohort of 469 patients, treated at the China-Japan Friendship Hospital, Ministry of Health (Beijing, China), in the period from January 2003 to June 2011 and with a pathological diagnosis of rectal adenocarcinoma, was recruited. The cohort included 231 patients with LNM (49.3%) and 238 patients without LNM. Follow-up for these patients ran through December 31, 2012. The baseline serum direct bilirubin concentration was (median/inter-quartile range) 2.30/1.60-3.42 μmol/L. Univariate analysis showed that compared with patients without LNM, the patients with LNM had an increased level of direct bilirubin (2.50/1.70-3.42 vs 2.10/1.40-3.42, P = 0.025). Multivariate analysis showed that direct bilirubin was independently associated with LNM (OR = 1.602; 95%CI: 1.098-2.338, P = 0.015). Moreover, we found that: (1) serum direct bilirubin differs between male and female patients, and a higher concentration was associated with poor tumor classification; (2) as the baseline serum direct bilirubin concentration increased, the percentage of patients with LNM increased; and (3) serum direct bilirubin was associated with the prognosis of rectal cancer patients, with higher values indicating poor prognosis. A higher serum direct bilirubin concentration was associated with increased risk of LNM and poor prognosis in our rectal cancer patients.
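    To make the reported multivariate result concrete: an odds ratio and its 95% CI come directly from a logistic-regression coefficient and its standard error. In the sketch below the coefficient and standard error are back-solved from the published OR = 1.602 (95% CI 1.098-2.338) purely for illustration.

```python
import numpy as np

beta, se = 0.471, 0.193   # back-solved from OR = 1.602, CI 1.098-2.338
odds_ratio = np.exp(beta)
ci_low, ci_high = np.exp(beta - 1.96 * se), np.exp(beta + 1.96 * se)
print(f"OR = {odds_ratio:.3f}, 95% CI = {ci_low:.3f}-{ci_high:.3f}")
```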

  2. Expression of a bacterial catalase in a strictly anaerobic methanogen significantly increases tolerance to hydrogen peroxide but not oxygen

    Science.gov (United States)

    Jennings, Matthew E.; Schaff, Cody W.; Horne, Alexandra J.; Lessner, Faith H.

    2014-01-01

    Haem-dependent catalase is an antioxidant enzyme that degrades H2O2, producing H2O and O2, and is common in aerobes. Catalase is present in some strictly anaerobic methane-producing archaea (methanogens), but the importance of catalase to the antioxidant system of methanogens is poorly understood. We report here that a survey of the sequenced genomes of methanogens revealed that the majority of species lack genes encoding catalase. Moreover, Methanosarcina acetivorans is a methanogen capable of synthesizing haem and encodes haem-dependent catalase in its genome; yet, Methanosarcina acetivorans cells lack detectable catalase activity. However, inducible expression of the haem-dependent catalase from Escherichia coli (EcKatG) in the chromosome of Methanosarcina acetivorans resulted in a 100-fold increase in the endogenous catalase activity compared with uninduced cells. The increased catalase activity conferred a 10-fold increase in the resistance of EcKatG-induced cells to H2O2 compared with uninduced cells. The EcKatG-induced cells were also able to grow when exposed to levels of H2O2 that inhibited or killed uninduced cells. However, despite the significant increase in catalase activity, growth studies revealed that EcKatG-induced cells did not exhibit increased tolerance to O2 compared with uninduced cells. These results support the lack of catalase in the majority of methanogens, since methanogens are more likely to encounter O2 rather than high concentrations of H2O2 in the natural environment. Catalase appears to be a minor component of the antioxidant system in methanogens, even those that are aerotolerant, including Methanosarcina acetivorans. Importantly, the experimental approach used here demonstrated the feasibility of engineering beneficial traits, such as H2O2 tolerance, in methanogens. PMID:24222618

  3. The adipokine leptin increases skeletal muscle mass and significantly alters skeletal muscle miRNA expression profile in aged mice

    International Nuclear Information System (INIS)

    Hamrick, Mark W.; Herberg, Samuel; Arounleut, Phonepasong; He, Hong-Zhi; Shiver, Austin; Qi, Rui-Qun; Zhou, Li; Isales, Carlos M.

    2010-01-01

    Research highlights: → Aging is associated with muscle atrophy and loss of muscle mass, known as the sarcopenia of aging. → We demonstrate that age-related muscle atrophy is associated with marked changes in miRNA expression in muscle. → Treating aged mice with the adipokine leptin significantly increased muscle mass and the expression of miRNAs involved in muscle repair. → Recombinant leptin therapy may therefore be a novel approach for treating age-related muscle atrophy. -- Abstract: Age-associated loss of muscle mass, or sarcopenia, contributes directly to frailty and an increased risk of falls and fractures among the elderly. Aged mice and elderly adults both show decreased muscle mass as well as relatively low levels of the fat-derived hormone leptin. Here we demonstrate that loss of muscle mass and myofiber size with aging in mice is associated with significant changes in the expression of specific miRNAs. Aging altered the expression of 57 miRNAs in mouse skeletal muscle, and many of these miRNAs are now reported to be associated specifically with age-related muscle atrophy. These include miR-221, previously identified in studies of myogenesis and muscle development as playing a role in the proliferation and terminal differentiation of myogenic precursors. We also treated aged mice with recombinant leptin, to determine whether leptin therapy could improve muscle mass and alter the miRNA expression profile of aging skeletal muscle. Leptin treatment significantly increased hindlimb muscle mass and extensor digitorum longus fiber size in aged mice. Furthermore, the expression of 37 miRNAs was altered in muscles of leptin-treated mice. In particular, leptin treatment increased the expression of miR-31 and miR-223, miRNAs known to be elevated during muscle regeneration and repair. These findings suggest that aging in skeletal muscle is associated with marked changes in the expression of specific miRNAs, and that nutrient-related hormones such as leptin

  4. The adipokine leptin increases skeletal muscle mass and significantly alters skeletal muscle miRNA expression profile in aged mice

    Energy Technology Data Exchange (ETDEWEB)

    Hamrick, Mark W., E-mail: mhamrick@mail.mcg.edu [Department of Cellular Biology and Anatomy, Institute of Molecular Medicine and Genetics, Medical College of Georgia, Augusta, GA (United States); Department of Orthopaedic Surgery, Institute of Molecular Medicine and Genetics, Medical College of Georgia, Augusta, GA (United States); Herberg, Samuel; Arounleut, Phonepasong [Department of Cellular Biology and Anatomy, Institute of Molecular Medicine and Genetics, Medical College of Georgia, Augusta, GA (United States); Department of Orthopaedic Surgery, Institute of Molecular Medicine and Genetics, Medical College of Georgia, Augusta, GA (United States); He, Hong-Zhi [Henry Ford Immunology Program, Henry Ford Health System, Detroit, MI (United States); Department of Dermatology, Henry Ford Health System, Detroit, MI (United States); Shiver, Austin [Department of Cellular Biology and Anatomy, Institute of Molecular Medicine and Genetics, Medical College of Georgia, Augusta, GA (United States); Department of Orthopaedic Surgery, Institute of Molecular Medicine and Genetics, Medical College of Georgia, Augusta, GA (United States); Qi, Rui-Qun [Henry Ford Immunology Program, Henry Ford Health System, Detroit, MI (United States); Department of Dermatology, Henry Ford Health System, Detroit, MI (United States); Zhou, Li [Henry Ford Immunology Program, Henry Ford Health System, Detroit, MI (United States); Department of Dermatology, Henry Ford Health System, Detroit, MI (United States); Department of Internal Medicine, Henry Ford Health System, Detroit, MI (United States); Isales, Carlos M. [Department of Cellular Biology and Anatomy, Institute of Molecular Medicine and Genetics, Medical College of Georgia, Augusta, GA (United States); Department of Orthopaedic Surgery, Institute of Molecular Medicine and Genetics, Medical College of Georgia, Augusta, GA (United States); and others

    2010-09-24

    Research highlights: → Aging is associated with muscle atrophy and loss of muscle mass, known as the sarcopenia of aging. → We demonstrate that age-related muscle atrophy is associated with marked changes in miRNA expression in muscle. → Treating aged mice with the adipokine leptin significantly increased muscle mass and the expression of miRNAs involved in muscle repair. → Recombinant leptin therapy may therefore be a novel approach for treating age-related muscle atrophy. -- Abstract: Age-associated loss of muscle mass, or sarcopenia, contributes directly to frailty and an increased risk of falls and fractures among the elderly. Aged mice and elderly adults both show decreased muscle mass as well as relatively low levels of the fat-derived hormone leptin. Here we demonstrate that loss of muscle mass and myofiber size with aging in mice is associated with significant changes in the expression of specific miRNAs. Aging altered the expression of 57 miRNAs in mouse skeletal muscle, and many of these miRNAs are now reported to be associated specifically with age-related muscle atrophy. These include miR-221, previously identified in studies of myogenesis and muscle development as playing a role in the proliferation and terminal differentiation of myogenic precursors. We also treated aged mice with recombinant leptin, to determine whether leptin therapy could improve muscle mass and alter the miRNA expression profile of aging skeletal muscle. Leptin treatment significantly increased hindlimb muscle mass and extensor digitorum longus fiber size in aged mice. Furthermore, the expression of 37 miRNAs was altered in muscles of leptin-treated mice. In particular, leptin treatment increased the expression of miR-31 and miR-223, miRNAs known to be elevated during muscle regeneration and repair. These findings suggest that aging in skeletal muscle is associated with marked changes in the expression of specific miRNAs, and that nutrient

  5. Prognostic significance of increased bone marrow microcirculation in newly diagnosed multiple myeloma: results of a prospective DCE-MRI study

    Energy Technology Data Exchange (ETDEWEB)

    Merz, Maximilian; Hillengass, Jens [Department of Radiology, German Cancer Research Center, Heidelberg (Germany); University of Heidelberg, Department of Hematology, Oncology and Rheumatology, Heidelberg (Germany); Moehler, Thomas M.; Ritsch, Judith; Delorme, Stefan [Department of Radiology, German Cancer Research Center, Heidelberg (Germany); Baeuerle, Tobias [University of Erlangen-Nuremberg, Department of Radiology, Erlangen (Germany); Zechmann, Christian M. [Rinecker Proton Therapy, Muenchen (Germany); Wagner, Barbara; Hose, Dirk [University of Heidelberg, Department of Hematology, Oncology and Rheumatology, Heidelberg (Germany); Jauch, Anna [University of Heidelberg, Institute of Human Genetics, Heidelberg (Germany); Kunz, Christina; Hielscher, Thomas [German Cancer Research Center, Department of Biostatistics, Heidelberg (Germany); Laue, Hendrik [Fraunhofer MEVIS, Bremen (Germany); Goldschmidt, Hartmut [University of Heidelberg, Department of Hematology, Oncology and Rheumatology, Heidelberg (Germany); National Center for Tumor Diseases, Heidelberg (Germany)

    2016-05-15

    The aim of this prospective study was to investigate the prognostic significance of increased bone marrow microcirculation, as detected by dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI), for survival and local complications in patients with multiple myeloma (MM). We performed DCE-MRI of the lumbar spine in 131 patients with newly diagnosed MM and analysed data according to the Brix model to acquire the amplitude A and the exchange rate constant kep. In 61 patients a second MRI performed after therapy was evaluated to assess changes in vertebral height and identify vertebral fractures. Correlation analysis revealed a significant positive association of beta2-microglobulin as well as immunoparesis with the DCE-MRI parameters A and kep. Additionally, A was negatively correlated with haemoglobin levels and kep was positively correlated with LDH levels. Higher baseline kep values were associated with decreased vertebral height in a second MRI (P = 0.007) and A values were associated with new vertebral fractures in the lower lumbar spine (P = 0.03 for L4). Pre-existing lytic bone lesions or remission after therapy had no impact on the occurrence of vertebral fractures. Multivariate analysis revealed that the amplitude A is an independent adverse risk factor for overall survival. DCE-MRI is a non-invasive tool with significance for systemic prognosis and vertebral complications. (orig.)
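    For readers unfamiliar with the Brix parameters A and kep reported above, this hedged sketch fits the standard Brix two-compartment enhancement curve to synthetic data; the elimination constant kel and all numbers are placeholders, not study values.

```python
import numpy as np
from scipy.optimize import curve_fit

def brix(t, A, kep, kel=0.05):
    """Relative signal enhancement S(t)/S0 - 1 in the Brix model."""
    return A * kep * (np.exp(-kep * t) - np.exp(-kel * t)) / (kel - kep)

t = np.linspace(0.1, 10, 60)                     # minutes after injection
rng = np.random.default_rng(5)
signal = brix(t, A=1.4, kep=0.8) + rng.normal(0, 0.02, t.size)

# Fit amplitude A and exchange rate constant kep (kel held at its default).
(A_fit, kep_fit), _ = curve_fit(brix, t, signal, p0=(1.0, 0.5))
print(f"A = {A_fit:.2f}, kep = {kep_fit:.2f} per minute")
```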

  6. Balance disorder and increased risk of falls in osteoporosis and kyphosis: significance of kyphotic posture and muscle strength.

    Science.gov (United States)

    Sinaki, Mehrsheed; Brey, Robert H; Hughes, Christine A; Larson, Dirk R; Kaufman, Kenton R

    2005-08-01

    This controlled trial was designed to investigate the influence of osteoporosis-related kyphosis (O-K) on falls. Twelve community-dwelling women with O-K (Cobb angle, 50-65 degrees, measured from spine radiographs) and 13 healthy women serving as controls were enrolled. Mean age of the O-K group was 76 years (+/-5.1), height 158 cm (+/-5), and weight 61 kg (+/-7.9); mean age of the control group was 71 years (+/-4.6), height 161 cm (+/-3.8), and weight 66 kg (+/-11.7). Quantitative isometric strength data were collected. Gait was monitored during unobstructed level walking and during stepping over an obstacle of four different heights randomly assigned (2.5%, 5%, 10%, and 15% of the subject's height). Balance was objectively assessed with computerized dynamic posturography consisting of the sensory organization test. Back extensor strength, grip strength, and all lower extremity muscle groups were significantly weaker in the O-K group than in the control group (P < 0.05), and body sway was greater than in controls for all conditions of unobstructed and obstructed level walking. Obstacle height had a significant effect on all center-of-mass variables. The O-K subjects had significantly greater balance abnormalities on computerized dynamic posturography than the control group (P = 0.002). Data show that thoracic hyperkyphosis on a background of reduced muscle strength plays an important role in increasing body sway, gait unsteadiness, and risk of falls in osteoporosis.

  7. The contribution of human agricultural activities to increasing evapotranspiration is significantly greater than climate change effect over Heihe agricultural region.

    Science.gov (United States)

    Zou, Minzhong; Niu, Jun; Kang, Shaozhong; Li, Xiaolin; Lu, Hongna

    2017-08-18

    Evapotranspiration (ET) is a major component linking the water, energy, and carbon cycles. Understanding changes in ET and the relative contribution rates of human activity and of climate change at the basin scale is important for sound water resources management. In this study, changes in ET in the Heihe agricultural region in northwest China during 1984-2014 were examined using remotely-sensed ET data with the Soil and Water Assessment Tool (SWAT). Correlation analysis identified the dominant factors that influence change in ET per unit area and those that influence change in total ET. Factor analysis identified the relative contribution rates of the dominant factors in each case. The results show that human activity, which includes factors for agronomy and irrigation, and climate change, including factors for precipitation and relative humidity, both contribute to increases in ET per unit area at rates of 60.93% and 28.01%, respectively. Human activity, including the same factors, and climate change, including factors for relative humidity and wind speed, contribute to increases in total ET at rates of 53.86% and 35.68%, respectively. Overall, in the Heihe agricultural region, the contribution of human agricultural activities to increased ET was significantly greater than that of climate change.

  8. Plucking the Golden Goose: Higher Royalty Rates on the Oil Sands Generate Significant Increases in Government Revenue

    Directory of Open Access Journals (Sweden)

    Kenneth J. McKenzie

    2011-09-01

    Full Text Available The Alberta government’s 2009 New Royalty Framework elicited resistance on the part of the energy industry, leading to subsequent reductions in the royalties imposed on natural gas and conventional oil. However, the oil sands sector, subject to different terms, quickly accepted the new arrangement with little complaint, recognizing it as a win-win situation for industry and the government. Under the framework, Alberta recoups much more money in royalties — about $1 billion over the two-year period of 2009 and 2010 — without impinging significantly on investment in the oil sands. This brief paper demonstrates that, by spreading the financial risks and benefits among everyone involved, the new framework makes it possible to generate increased revenue without frightening off future investment. The same model could conceivably be applied to the conventional oil and natural gas sectors.

  9. "Not just another Wii training": a graded Wii protocol to increase physical fitness in adolescent girls with probable developmental coordination disorder-a pilot study.

    Science.gov (United States)

    Bonney, Emmanuel; Rameckers, Eugene; Ferguson, Gillian; Smits-Engelsman, Bouwien

    2018-02-22

    Adolescents with low motor competence participate less in physical activity and tend to exhibit decreased physical fitness compared to their peers with high motor competence. It is therefore essential to identify new methods of enhancing physical fitness in this population. Active video games (AVG) have been shown to improve motor performance, yet investigations of their impact on physical fitness are limited. The objective of this study was to examine the impact of a graded Wii protocol on physical fitness in adolescent girls with probable Developmental Coordination Disorder (p-DCD). A single-group pre-post design was used to assess the impact of a newly developed Wii protocol in adolescent girls attending school in a low-income community of Cape Town, South Africa. Sixteen participants (aged 13-16 years) with p-DCD (≤16th percentile on the MABC-2 test) were recruited. Participants received 45 min of Wii training for 14 weeks. Outcome measures included the six-minute walk distance and repeated sprint ability. Information on heart rate, enjoyment and perceived exertion ratings was also collected. Significant improvements in aerobic and anaerobic fitness were observed. The participants reported high enjoyment scores and low perceived exertion ratings. The graded Wii protocol was easily adaptable and required few resources (space, equipment and expertise) to administer. The findings provide preliminary evidence to support the use of the graded Wii protocol for promoting physical fitness in adolescent girls with p-DCD. Further studies are needed to confirm these results and to validate the clinical efficacy of the protocol in a larger sample with a more robust design.

  10. A Prolonged Time Interval Between Trauma and Prophylactic Radiation Therapy Significantly Increases the Risk of Heterotopic Ossification

    Energy Technology Data Exchange (ETDEWEB)

    Mourad, Waleed F., E-mail: Waleed246@gmail.com [Department of Radiation Oncology, University of Mississippi Medical Center, Jackson, MS (United States); Department of Radiation Oncology, Beth Israel Medical Center, New York, NY (United States); Packianathan, Satyaseelan [Department of Radiation Oncology, University of Mississippi Medical Center, Jackson, MS (United States); Shourbaji, Rania A. [Department of Epidemiology and Biostatistics, Jackson State University, Jackson, MS (United States); Zhang Zhen; Graves, Mathew [Department of Orthopedic Surgery, University of Mississippi Medical Center, Jackson, MS (United States); Khan, Majid A. [Department of Radiology, University of Mississippi Medical Center, Jackson, MS (United States); Baird, Michael C. [Department of Radiation Oncology, University of Mississippi Medical Center, Jackson, MS (United States); Russell, George [Department of Orthopedic Surgery, University of Mississippi Medical Center, Jackson, MS (United States); Vijayakumar, Srinivasan [Department of Radiation Oncology, University of Mississippi Medical Center, Jackson, MS (United States)

    2012-03-01

    Purpose: To ascertain whether the time from injury to prophylactic radiation therapy (RT) influences the rate of heterotopic ossification (HO) after operative treatment of displaced acetabular fractures. Methods and Materials: This is a single-institution, retrospective analysis of patients referred for RT for the prevention of HO. Between January 2000 and January 2009, 585 patients with displaced acetabular fractures were treated surgically, followed by RT for HO prevention. We analyzed the effect of time from injury on prevention of HO by RT. In all patients, 700 cGy was prescribed in a single fraction and delivered within 72 hours postsurgery. The patients were stratified into five groups according to the time interval (in days) from the date of their accident to the date of RT: Groups A (≤3), B (≤7), C (≤14), D (≤21), and E (>21 days). Results: Of the 585 patients with displaced acetabular fractures treated with RT, 106 patients (18%) developed HO within the irradiated field. The risk of HO after RT increased from 10% for RT delivered ≤3 days to 92% for treatment delivered >21 days after the initial injury. The Wilcoxon test showed a significant correlation between the risk of HO and the length of time from injury to RT (p < 0.0001). Chi-square testing and multiple logistic regression analysis showed no significant association between any other factor and the risk of HO (race, gender, cause and type of fracture, surgical approach, or the use of indomethacin). Conclusions: Our data suggest that there is a higher incidence and risk of HO if prophylactic RT is significantly delayed after a displaced acetabular fracture. Thus, RT should be administered as early as clinically possible after the trauma. Patients undergoing RT >3 weeks from their displaced acetabular fracture should be informed of the higher risk (>90%) of developing HO despite prophylaxis.

  11. Significance of increased lung thallium-201 activity on serial cardiac images after dipyridamole treatment in coronary heart disease

    International Nuclear Information System (INIS)

    Okada, R.D.; Dai, Y.H.; Boucher, C.A.; Pohost, G.M.

    1984-01-01

    Increased lung thallium-201 (Tl-201) activity occurs in patients with severe coronary artery disease (CAD) on initial postexercise images. To determine the significance of assessing lung Tl-201 on serial imaging after dipyridamole therapy, initial and delayed (2 to 3 hours) Tl-201 imaging was performed in 40 patients with CAD and 26 normal control subjects. Lung Tl-201 activity was quantitated as a percentage of maximal myocardial activity for each imaging time (lung Tl-201 index). The mean initial lung Tl-201 activity was 42 +/- 2% (+/- standard error of the mean) in 26 control subjects, 56 +/- 2% in 25 patients with 2- or 3-vessel CAD (p less than 0.001) and 53 +/- 2% in 15 patients with 1-vessel CAD (p less than 0.005 compared with control subjects) (difference not significant between 1-vessel and multivessel CAD). Dipyridamole lung Tl-201 activity decreased relative to the myocardium from initial to delayed images (p less than 0.001) in patients with CAD but not in control subjects. When a dipyridamole lung Tl-201 index of 58% (mean +/- 2 standard deviations for control subjects) was chosen as the upper limit of normal, 14 of 40 CAD patients (35%) had abnormal values, and all control subjects had values within normal limits. These 14 patients with CAD and abnormal initial lung Tl-201 indexes had rest ejection fractions that were not significantly different from those of patients with CAD and normal initial dipyridamole lung Tl-201 indexes (58 +/- 4% and 63 +/- 2%, respectively)

  12. Pressure sores significantly increase the risk of developing a Fournier's gangrene in patients with spinal cord injury.

    Science.gov (United States)

    Backhaus, M; Citak, M; Tilkorn, D-J; Meindl, R; Schildhauer, T A; Fehmer, T

    2011-11-01

    Retrospective chart review. The aim of our study was to evaluate the mortality rate and further specific risk factors for Fournier's gangrene in patients with spinal cord injury (SCI). Division of Spinal Cord Injury, BG-University Hospital Bergmannsheil Bochum, Ruhr-University Bochum, Germany. All patients with an SCI and a Fournier's gangrene treated in our hospital were enrolled in this study. The following parameters were taken from patients' medical records: age, type of SCI, cause of Fournier's gangrene, number of surgical debridements, length of hospital and intensive care unit stay, comorbidity factors and mortality rate. In addition, laboratory parameters, including the laboratory risk indicator for necrotizing fasciitis (LRINEC) score, and microbiological findings were analyzed. Diagnosis was confirmed via histological examination. A total of 16 male patients (15 paraplegic and one tetraplegic) were included in the study. In 81% of all cases, the origin of the Fournier's gangrene was a pressure sore. The median LRINEC score on admission was 6.5. In the vast majority of cases, a polybacterial infection was found. No patient died during the hospital stay. The mean number of surgical debridements before soft tissue closure was 1.9, and wound closure was performed in all patients after a mean interval of 39.1 days. Pressure sores significantly increase the risk of developing Fournier's gangrene in patients with SCI. We report our results to increase awareness among physicians and staff working with patients with an SCI in order to expedite the diagnosis.

  13. Six weeks of structured exercise training and hypocaloric diet increases the probability of ovulation after clomiphene citrate in overweight and obese patients with polycystic ovary syndrome: a randomized controlled trial.

    Science.gov (United States)

    Palomba, S; Falbo, A; Giallauria, F; Russo, T; Rocca, M; Tolino, A; Zullo, F; Orio, F

    2010-11-01

    Clomiphene citrate (CC) is the first-line therapy for the induction of ovulation in infertile women with polycystic ovary syndrome (PCOS), but ∼20% of patients are unresponsive. The aim of the current study was to test the hypothesis that a 6-week intervention consisting of structured exercise training (SET) and a hypocaloric diet increases the probability of ovulation after CC in overweight and obese CC-resistant PCOS patients. A cohort of 96 overweight and obese CC-resistant PCOS patients was enrolled consecutively in a three-arm randomized, parallel, controlled, assessor-blinded clinical trial. The three interventions were: SET plus hypocaloric diet for 6 weeks (Group A); 2 weeks of observation followed by one cycle of CC therapy (Group B); and SET plus hypocaloric diet for 6 weeks, with one cycle of CC after the first 2 weeks (Group C). The primary end point was the ovulation rate. Other reproductive data, as well as anthropometric, hormonal and metabolic data, were also collected and considered as secondary end points. After 6 weeks of SET plus hypocaloric diet, the ovulation rate was significantly (P = 0.008) higher in Group C [12/32 (37.5%)] than in Groups A [4/32 (12.5%)] and B [3/32 (9.4%)], with relative risks of 3.9 [95% confidence interval (CI) 1.1-8.3; P = 0.035] and 4.0 (95% CI 1.2-12.8; P = 0.020) compared with Groups A and B, respectively. Compared with baseline, in Groups A and C, a significant improvement in clinical and biochemical androgen and insulin sensitivity indexes was observed. In the same two groups, the insulin sensitivity index also improved significantly. In conclusion, a 6-week intervention of SET plus a hypocaloric diet was effective in increasing the probability of ovulation under CC treatment. The study was registered at Clinicaltrials.gov:NCT0100468.

  14. Association between probable postnatal depression and increased infant mortality and morbidity: findings from the DON population-based cohort study in rural Ghana

    NARCIS (Netherlands)

    Weobong, Benedict; ten Asbroek, Augustinus H. A.; Soremekun, Seyi; Gram, Lu; Amenga-Etego, Seeba; Danso, Samuel; Owusu-Agyei, Seth; Prince, Martin; Kirkwood, Betty R.

    2015-01-01

    To assess the impact of probable depression in the immediate postnatal period on subsequent infant mortality and morbidity. A cohort study nested within 4-weekly surveillance of all women of reproductive age to identify pregnancies and collect data on births and deaths. Rural/periurban communities

  15. Marrying Step Feed with Secondary Clarifier Improvements to Significantly Increase Peak Wet Weather Treatment Capacity: An Integrated Methodology.

    Science.gov (United States)

    Daigger, Glen T; Siczka, John S; Smith, Thomas F; Frank, David A; McCorquodale, J A

    2017-08-01

      The need to increase the peak wet weather secondary treatment capacity of the City of Akron, Ohio, Water Reclamation Facility (WRF) provided the opportunity to test an integrated methodology for maximizing the peak wet weather secondary treatment capacity of activated sludge systems. An initial investigation, consisting of process modeling of the secondary treatment system and computational fluid dynamics (CFD) analysis of the existing relatively shallow secondary clarifiers (3.3 and 3.7 m sidewater depth in 30.5 m diameter units), indicated that a significant increase in capacity from 416 000 to 684 000 m3/d or more was possible by adding step feed capabilities to the existing bioreactors and upgrading the existing secondary clarifiers. One of the six treatment units at the WRF was modified, and an extensive 2-year testing program was conducted to determine the total peak wet weather secondary treatment capacity achievable. The results demonstrated that a peak wet weather secondary treatment capacity approaching 974 000 m3/d is possible as long as secondary clarifier solids and hydraulic loadings could be separately controlled using the step feed capability provided. Excellent sludge settling characteristics are routinely experienced at the City of Akron WRF, raising concerns that the identified peak wet weather secondary treatment capacity could not be maintained should sludge settling characteristics deteriorate for some reason. Computational fluid dynamics analysis indicated that the impact of the deterioration of sludge settling characteristics could be mitigated and the identified peak wet weather secondary treatment capacity maintained by further use of the step feed capability provided to further reduce secondary clarifier solids loading rates at the identified high surface overflow rates. The results also demonstrated that effluent limits not only for total suspended solids (TSS) and five-day carbonaceous biochemical oxygen demand (cBOD5) could be

  16. Influence of anthropogenic activities on PAHs in sediments in a significant gulf of low-latitude developing regions, the Beibu Gulf, South China Sea: distribution, sources, inventory and probability risk.

    Science.gov (United States)

    Li, Pingyang; Xue, Rui; Wang, Yinghui; Zhang, Ruijie; Zhang, Gan

    2015-01-15

    Fifteen polycyclic aromatic hydrocarbons (PAHs) in 41 surface sediment samples and a sediment core (50 cm) from the Beibu Gulf, a significant low-latitude developing gulf, were analyzed. PAH concentrations were 3.01-388 ng g(-1) (mean 95.5 ng g(-1)) in the surface sediments and 10.5-87.1 ng g(-1) (average 41.1 ng g(-1)) in the sediment core. Source apportionment indicated that PAHs were generated from coke production and vehicular emissions (39.4%), coal and biomass combustion (35.8%), and petrogenic sources (24.8%). PAHs were mainly concentrated in the industrialized and urbanized regions and the harbor, and were transported by atmospheric deposition to the marine matrix. The mass inventory (1.57-2.62 t) and probability risk assessment showed that the sediments here serve as an important PAH reservoir but pose low risk. Unlike in developed regions, where oil and natural gas dominate, coal combustion has remained a significant energy consumption pattern in this developing region over the past 30 years (56 ± 5%). Copyright © 2014 Elsevier Ltd. All rights reserved.

  17. Clinical significance of increased gelatinolytic activity in the rectal mucosa during external beam radiation therapy of prostate cancer

    International Nuclear Information System (INIS)

    Hovdenak, Nils; Wang Junru; Sung, C.-C.; Kelly, Thomas; Fajardo, Luis F.; Hauer-Jensen, Martin

    2002-01-01

    Purpose: Rectal toxicity (proctitis) is a dose-limiting factor in pelvic radiation therapy. Mucosal atrophy, i.e., net extracellular matrix degradation, is a prominent feature of radiation proctitis, but the underlying mechanisms are not known. We prospectively examined changes in matrix metalloproteinase (MMP)-2 and MMP-9 (gelatinase A and B) in the rectal mucosa during radiation therapy of prostate cancer, as well as the relationships of these changes with symptomatic, structural, and cellular evidence of radiation proctitis. Methods and Materials: Seventeen patients scheduled for external beam radiation therapy for prostate cancer were prospectively enrolled. Symptoms of gastrointestinal toxicity were recorded, and endoscopy with biopsy of the rectal mucosa was performed before radiation therapy, as well as 2 and 6 weeks into the treatment course. Radiation proctitis was assessed by endoscopic scoring, quantitative histology, and quantitative immunohistochemistry. MMP-2 and MMP-9 were localized immunohistochemically, and activities were determined by gelatin zymography. Results: Symptoms, endoscopic scores, histologic injury, and mucosal macrophages and neutrophils increased from baseline to 2 weeks. Symptoms increased further from 2 weeks to 6 weeks, whereas endoscopic and cellular evidence of proctitis did not. Compared to pretreatment values, there was increased total gelatinolytic activity of MMP-2 and MMP-9 at 2 weeks (p=0.02 and p=0.004, respectively) and 6 weeks (p=0.006 and p=0.001, respectively). Active MMP-2 was increased at both time points (p=0.0001 and p=0.002). Increased MMP-9 and MMP-2 at 6 weeks was associated with radiation-induced diarrhea (p=0.007 and p=0.02, respectively) and with mucosal neutrophil infiltration (rho=0.62). Conclusions: Pelvic radiation therapy causes increased MMP-2 and MMP-9 activity in the rectal mucosa. These changes correlate with radiation-induced diarrhea and granulocyte infiltration and may contribute to abnormal

  18. Determination of relative CMRO2 from CBF and BOLD changes: significant increase of oxygen consumption rate during visual stimulation

    DEFF Research Database (Denmark)

    Kim, S.G.; Rostrup, Egill; Larsson, H.B.

    1999-01-01

    The blood oxygenation level-dependent (BOLD) effect in functional magnetic resonance imaging depends on at least partial uncoupling between cerebral blood flow (CBF) and cerebral metabolic rate of oxygen (CMRO2) changes. By measuring CBF and BOLD simultaneously, the relative change in CMRO2 can be determined. CBF and BOLD signal changes were measured simultaneously using the flow-sensitive alternating inversion recovery (FAIR) technique. During hypercapnia established by an end-tidal CO2 increase of 1.46 kPa, CBF in the visual cortex increased by 47.3 +/- 17.3% (mean +/- SD; n = 9), and deltaR2* was -0.478 +/- 0.147 sec ...

  19. Increased Risk of Clinically Significant Gallstones following an Appendectomy: A Five-Year Follow-Up Study.

    Directory of Open Access Journals (Sweden)

    Shiu-Dong Chung

    Full Text Available Although the vermiform appendix is commonly considered a vestigial organ, adverse health consequences after an appendectomy have garnered increasing attention. In this study, we investigated the risks of gallstone occurrence during a 5-year follow-up period after an appendectomy, using a population-based dataset. We used data from the Taiwan Longitudinal Health Insurance Database 2005. The exposed cohort included 4916 patients who underwent an appendectomy. The unexposed cohort was retrieved by randomly selecting 4916 patients matched with the exposed cohort in terms of sex, age, and year. We individually tracked each patient for a 5-year period to identify those who received a diagnosis of gallstones during the follow-up period. Cox proportional hazard regressions were performed for the analysis. During the 5-year follow-up period, the incidence rate per 1000 person-years was 4.71 for patients who had undergone an appendectomy, compared to a rate of 2.59 for patients in the unexposed cohort (p < 0.001). Patients who had undergone an appendectomy were independently associated with a 1.79-fold (95% CI = 1.29-2.48) increased risk of being diagnosed with gallstones during the 5-year follow-up period. We found that among female patients, the adjusted hazard ratio of gallstones was 2.25 (95% CI = 1.41-3.59) for patients who underwent an appendectomy compared to unexposed patients. However, for male patients, we failed to observe an increased hazard for gallstones among patients who underwent an appendectomy compared to unexposed patients. We found an increased risk of a subsequent gallstone diagnosis within 5 years after an appendectomy.
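
    The hazard ratios above come from a Cox proportional hazards regression. As a rough illustration of that kind of analysis (not the study's actual code or data), a sketch using the lifelines package with an invented data frame might look like this:

    ```python
    # Minimal sketch of a Cox proportional hazards analysis of the kind
    # described above; the data frame is invented for illustration only.
    import pandas as pd
    from lifelines import CoxPHFitter

    df = pd.DataFrame({
        "followup_years": [5.0, 2.1, 5.0, 3.4, 5.0, 1.2, 4.2, 5.0],
        "gallstones":     [0,   1,   0,   1,   0,   1,   1,   0],  # event flag
        "appendectomy":   [0,   1,   0,   1,   1,   0,   1,   0],  # exposure
    })

    cph = CoxPHFitter()
    cph.fit(df, duration_col="followup_years", event_col="gallstones")
    cph.print_summary()  # exp(coef) for "appendectomy" estimates the hazard ratio
    ```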

  20. Modifications resulting in significant increases in the beam usage time of a 60 keV electron beam welder

    International Nuclear Information System (INIS)

    Zielinski, R.E.; Harrison, J.L.

    1976-01-01

    Short beam usage times were encountered using a 60 keV electron beam welder. These short times were the direct result of the buildup of a reaction product (WO2.90) on the graphite washers which housed the tungsten emitter plate. While the reaction product could not be prevented, its growth rate was substantially reduced by changing the graphite material and by minor design changes to the washers. With these modifications, beam usage times increased from the original 40 min to approximately 675 min

  1. Was there significant tax evasion after the 1999 50 cent per pack cigarette tax increase in California?

    Science.gov (United States)

    Emery, S; White, M; Gilpin, E; Pierce, J

    2002-01-01

    Objectives: Several states, including California, have implemented large cigarette excise tax increases, which may encourage smokers to purchase their cigarettes in other lower taxed states, or from other lower or non-taxed sources. Such tax evasion thwarts tobacco control objectives and may cost the state substantial tax revenues. Thus, this study investigates the extent of tax evasion in the 6–12 months after the implementation of California's $0.50/pack excise tax increase. Design and setting: Retrospective data analysis from the 1999 California Tobacco Surveys (CTS), a random digit dialled telephone survey of California households. Main outcome measures: Sources of cigarettes, average daily cigarette consumption, and reported price paid. Results: Very few (5.1 (0.7)% (±95% confidence limits)) of California smokers avoided the excise tax by usually purchasing cigarettes from non- or lower taxed sources, such as out-of-state outlets, military commissaries, or the internet. The vast majority of smokers purchased their cigarettes from the most convenient and expensive sources: convenience stores/gas (petrol) stations (45.0 (1.9)%), liquor/drug stores (16.4 (1.6)%), and supermarkets (8.8 (1.2)%). Conclusions: Despite the potential savings, tax evasion by individual smokers does not appear to pose a serious threat to California's excise tax revenues or its tobacco control objectives. PMID:12035006

  2. Love is the triumph of the imagination: Daydreams about significant others are associated with increased happiness, love and connection.

    Science.gov (United States)

    Poerio, Giulia L; Totterdell, Peter; Emerson, Lisa-Marie; Miles, Eleanor

    2015-05-01

    Social relationships and interactions contribute to daily emotional well-being. The emotional benefits that come from engaging with others are known to arise from real events, but do they also come from the imagination during daydreaming activity? Using experience sampling methodology with 101 participants, we obtained 371 reports of naturally occurring daydreams with social and non-social content and self-reported feelings before and after daydreaming. Social, but not non-social, daydreams were associated with increased happiness, love and connection and this effect was not solely attributable to the emotional content of the daydreams. These effects were only present when participants were lacking in these feelings before daydreaming and when the daydream involved imagining others with whom the daydreamer had a high quality relationship. Findings are consistent with the idea that social daydreams may function to regulate emotion: imagining close others may serve the current emotional needs of daydreamers by increasing positive feelings towards themselves and others. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.

  3. Electronic prompts significantly increase response rates to postal questionnaires: a randomized trial within a randomized trial and meta-analysis.

    Science.gov (United States)

    Clark, Laura; Ronaldson, Sarah; Dyson, Lisa; Hewitt, Catherine; Torgerson, David; Adamson, Joy

    2015-12-01

    To assess the effectiveness of sending electronic prompts to randomized controlled trial participants to return study questionnaires. A "trial within a trial" embedded within a study determining the effectiveness of screening for chronic obstructive pulmonary disease on smoking cessation (the DOC trial). Those participants taking part in DOC who provided a mobile phone number and/or an e-mail address were randomized to either receive an electronic prompt or no electronic prompt to return a study questionnaire. The results were combined with two previous studies in a meta-analysis. A total of 437 participants were randomized: 226 to the electronic prompt group and 211 to the control group. A total of 285 (65.2%) participants returned the follow-up questionnaire: 157 (69.5%) in the electronic prompt group and 128 (60.7%) in the control group [difference 8.8%; 95% confidence interval (CI): -0.11%, 17.7%; P = 0.05]. The mean time to response was 23 days in the electronic prompt group and 33 days in the control group (hazard ratio = 1.27; 95% CI: 1.105, 1.47). The meta-analysis of all three studies showed an increase in response rate of 7.1% (95% CI: 0.8%, 13.3%). The use of electronic prompts increased response rates and reduced the time to response. Copyright © 2015 Elsevier Inc. All rights reserved.
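
    The pooled 7.1% figure is the kind of result a fixed-effect (inverse-variance) meta-analysis produces. A generic sketch of that calculation follows; the per-study numbers are placeholders, not the three studies' actual estimates:

    ```python
    # Fixed-effect (inverse-variance) pooling of risk differences, the generic
    # approach behind a meta-analysis like the one above. Illustrative numbers.
    import math

    # (risk difference, 95% CI half-width) per study -- placeholders only
    studies = [(0.088, 0.089), (0.06, 0.10), (0.07, 0.09)]

    weights, estimates = [], []
    for diff, half_width in studies:
        se = half_width / 1.96          # recover standard error from the CI
        weights.append(1.0 / se**2)     # inverse-variance weight
        estimates.append(diff)

    pooled = sum(w * d for w, d in zip(weights, estimates)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    print(f"pooled difference: {pooled:.3f} "
          f"(95% CI {pooled - 1.96 * pooled_se:.3f} to {pooled + 1.96 * pooled_se:.3f})")
    ```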

  4. Probability Aggregates in Probability Answer Set Programming

    OpenAIRE

    Saad, Emad

    2013-01-01

    Probability answer set programming is a declarative programming paradigm that has been shown effective for representing and reasoning about a variety of probability reasoning tasks. However, the lack of probability aggregates, e.g. expected values, in the language of disjunctive hybrid probability logic programs (DHPP) disallows the natural and concise representation of many interesting problems. In this paper, we extend DHPP to allow arbitrary probability aggregates. We introduce two types of p...

  5. 'Knowledge for better health' revisited - the increasing significance of health research systems: a review by departing Editors-in-Chief.

    Science.gov (United States)

    Hanney, Stephen R; González-Block, Miguel A

    2017-10-02

    How can nations organise research investments to obtain the best bundle of knowledge and the maximum level of improved health, spread as equitably as possible? This question was the central focus of a major initiative from WHO led by Prof Tikki Pang, which resulted in a range of developments, including the publication of a conceptual framework for national health research systems - Knowledge for better health - in 2003, and in the founding of the journal Health Research Policy and Systems (HARPS). As Editors-in-Chief of the journal since 2006, we mark our retirement by tracking both the progress of the journal and the development of national health research systems. HARPS has maintained its focus on a range of central themes that are key components of a national health research system in any country. These include building capacity to conduct and use health research, identifying appropriate priorities, securing funds and allocating them accountably, producing scientifically valid research outputs, promoting the use of research in policies and practice in order to improve health, and monitoring and evaluating the health research system. Some of the themes covered in HARPS are now receiving increased attention; for example, the journal has tracked the growing interest in research impact assessment and in knowledge translation platforms. In addition, there is increasing recognition of new imperatives, including the importance of promoting gender equality in health research if benefits are to be maximised. In this Editorial, we outline some of the diverse and developing perspectives considered within each theme, as well as considering how they are held together by the growing desire to build effective health research systems in all countries. From 2003 until mid-June 2017, HARPS published 590 articles on the above and related themes, with authors located in 76 countries. We present quantitative data tracing

  6. Irrigation Is Significantly Associated with an Increased Prevalence of Listeria monocytogenes in Produce Production Environments in New York State.

    Science.gov (United States)

    Weller, Daniel; Wiedmann, Martin; Strawn, Laura K

    2015-06-01

    Environmental (i.e., meteorological and landscape) factors and management practices can affect the prevalence of foodborne pathogens in produce production environments. This study was conducted to determine the prevalence of Listeria monocytogenes, Listeria species (including L. monocytogenes), Salmonella, and Shiga toxin-producing Escherichia coli (STEC) in produce production environments and to identify environmental factors and management practices associated with their isolation. Ten produce farms in New York State were sampled during a 6-week period in 2010, and 124 georeferenced samples (80 terrestrial, 33 water, and 11 fecal) were collected. L. monocytogenes, Listeria spp., Salmonella, and STEC were detected in 16, 44, 4, and 5% of terrestrial samples, 30, 58, 12, and 3% of water samples, and 45, 45, 27, and 9% of fecal samples, respectively. Environmental factors and management practices were evaluated for their association with terrestrial samples positive for L. monocytogenes or other Listeria species by univariate logistic regression; analysis was not conducted for Salmonella or STEC because the number of samples positive for these pathogens was low. Although univariate analysis identified associations between isolation of L. monocytogenes or Listeria spp. from terrestrial samples and various water-related factors (e.g., proximity to wetlands and precipitation), multivariate analysis revealed that only irrigation within 3 days of sample collection was significantly associated with isolation of L. monocytogenes (odds ratio = 39) and Listeria spp. (odds ratio = 5) from terrestrial samples. These findings suggest that intervention at the irrigation level may reduce the risk of produce contamination.
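
    The univariate step described above amounts to a logistic regression of sample positivity on a single exposure factor, with the odds ratio recovered as exp(beta). A sketch using statsmodels with invented data (the study's dataset is not reproduced here) would look like this:

    ```python
    # Sketch of a univariate logistic regression of the kind described above:
    # exposure factor vs. Listeria-positive sample. Data are invented.
    import numpy as np
    import statsmodels.api as sm

    irrigated = np.array([1, 1, 1, 0, 0, 0, 1, 0, 1, 0])  # exposure indicator
    positive  = np.array([1, 1, 0, 0, 0, 1, 1, 0, 1, 0])  # L. monocytogenes found

    X = sm.add_constant(irrigated)          # intercept + exposure term
    model = sm.Logit(positive, X).fit(disp=False)
    odds_ratio = np.exp(model.params[1])    # exp(beta) for the exposure term
    print(f"odds ratio for recent irrigation: {odds_ratio:.2f}")
    ```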

  7. Plant Explants Grown on Medium Supplemented with Fe3O4 Nanoparticles Have a Significant Increase in Embryogenesis

    Directory of Open Access Journals (Sweden)

    Inese Kokina

    2017-01-01

    Full Text Available The development of nanotechnology is leading to the increasing release of nanoparticles (NPs) into the environment, resulting in the accumulation of different NPs in living organisms, including plants. This can lead to serious changes in plant cultures, including genotoxicity. The aims of the present study were to detect whether iron oxide NPs pass through the flax cell wall, to compare callus morphology, and to estimate the genotoxicity induced in Linum usitatissimum L. callus cultures by different concentrations of Fe3O4 nanoparticles. Two parallel experiments were performed: experiment A, in which flax explants were grown on medium supplemented with 0.5 mg/l, 1 mg/l, or 1.5 mg/l Fe3O4 NPs to obtain callus cultures, and experiment B, in which calluses obtained from basal MS medium were transferred onto medium supplemented with the same concentrations of NPs as in experiment A. The results of both experiments demonstrate that 25 nm Fe3O4 NPs pass into callus cells and induce a low level of toxicity in the callus cultures. Nevertheless, calluses from experiment A showed 100% embryogenesis, whereas in experiment B 100% rhizogenesis was observed. This may be associated with the different stress levels and adaptation times of the explants and calluses transferred onto medium supplemented with Fe3O4 NPs.

  8. Clinical significance of stress-related increase in blood pressure: current evidence in office and out-of-office settings.

    Science.gov (United States)

    Munakata, Masanori

    2018-05-29

    High blood pressure is the most significant risk factor for cardiovascular and cerebrovascular diseases worldwide. Both blood pressure and its variability are recognized as risk factors. Thus, hypertension control should focus not only on maintaining optimal levels but also on achieving less variability in blood pressure. Psychosocial stress is known to contribute to the development and worsening of hypertension. Stress is perceived by the brain and induces neuroendocrine responses in either a rapid or long-term manner. Moreover, endothelial dysfunction and inflammation might be further involved in the modulation of blood pressure elevation associated with stress. White-coat hypertension, defined as high clinic blood pressure but normal out-of-office blood pressure, is the best-known stress-related blood pressure response. Careful follow-up is necessary for these hypertensive patients because some show organ damage or a worse prognosis. On the other hand, masked hypertension, defined as high out-of-office blood pressure but normal office blood pressure, has received considerable interest as a poor prognostic condition. The cause of masked hypertension is complex, but evidence suggests that chronic stress at the workplace or home could be involved. Chronic psychological stress could be associated with distorted lifestyle and mental distress as well as long-lasting allostatic load, contributing to the maintenance of blood pressure elevation. Stress issues are common in patients in modern society. Considering psychosocial stress in the pathogenesis of blood pressure elevation is useful for achieving an individual-focused approach and 24-h blood pressure control.

  9. High Dose Atorvastatin Associated with Increased Risk of Significant Hepatotoxicity in Comparison to Simvastatin in UK GPRD Cohort.

    Directory of Open Access Journals (Sweden)

    Alan T Clarke

    Full Text Available Occasional risk of serious liver dysfunction and autoimmune hepatitis during atorvastatin therapy has been reported. We compared the risk of hepatotoxicity in atorvastatin relative to simvastatin treatment. The UK GPRD identified patients with a first prescription for simvastatin [164,407] or atorvastatin [76,411] between 1997 and 2006, but with no prior record of liver disease, alcohol-related diagnosis, or liver dysfunction. Incident liver dysfunction in the following six months was identified by biochemical value and compared between statin groups by a Cox regression model adjusting for age, sex, year treatment started, dose, alcohol consumption, smoking, body mass index and comorbid conditions. Moderate to severe hepatotoxicity [bilirubin >60μmol/L, AST or ALT >200U/L or alkaline phosphatase >1200U/L] developed in 71 patients on atorvastatin versus 101 on simvastatin. Adjusted hazard ratio [AHR] for all atorvastatin relative to simvastatin was 1.9 [95% confidence interval 1.4-2.6]. High dose was classified as 40-80mg daily and low dose 10-20mg daily. Hepatotoxicity occurred in 0.44% of 4075 patients on high dose atorvastatin [HDA], 0.07% of 72,336 on low dose atorvastatin [LDA], 0.09% of 44,675 on high dose simvastatin [HDS] and 0.05% of 119,732 on low dose simvastatin [LDS]. AHRs compared to LDS were 7.3 [4.2-12.7] for HDA, 1.4 [0.9-2.0] for LDA and 1.5 [1.0-2.2] for HDS. The risk of hepatotoxicity was increased in the first six months of atorvastatin compared to simvastatin treatment, with the greatest difference between high dose atorvastatin and low dose simvastatin. The numbers of events in the analyses were small.

  10. High Dose Atorvastatin Associated with Increased Risk of Significant Hepatotoxicity in Comparison to Simvastatin in UK GPRD Cohort

    Science.gov (United States)

    Clarke, Alan T.; Johnson, Paul C. D.; Hall, Gillian C.; Ford, Ian; Mills, Peter R.

    2016-01-01

    Background & Aims Occasional risk of serious liver dysfunction and autoimmune hepatitis during atorvastatin therapy has been reported. We compared the risk of hepatotoxicity in atorvastatin relative to simvastatin treatment. Methods The UK GPRD identified patients with a first prescription for simvastatin [164,407] or atorvastatin [76,411] between 1997 and 2006, but with no prior record of liver disease, alcohol-related diagnosis, or liver dysfunction. Incident liver dysfunction in the following six months was identified by biochemical value and compared between statin groups by Cox regression model adjusting for age, sex, year treatment started, dose, alcohol consumption, smoking, body mass index and comorbid conditions. Results Moderate to severe hepatotoxicity [bilirubin >60μmol/L, AST or ALT >200U/L or alkaline phosphatase >1200U/L] developed in 71 patients on atorvastatin versus 101 on simvastatin. Adjusted hazard ratio [AHR] for all atorvastatin relative to simvastatin was 1.9 [95% confidence interval 1.4–2.6]. High dose was classified as 40–80mg daily and low dose 10–20mg daily. Hepatotoxicity occurred in 0.44% of 4075 patients on high dose atorvastatin [HDA], 0.07% of 72,336 on low dose atorvastatin [LDA], 0.09% of 44,675 on high dose simvastatin [HDS] and 0.05% of 119,732 on low dose simvastatin [LDS]. AHRs compared to LDS were 7.3 [4.2–12.7] for HDA, 1.4 [0.9–2.0] for LDA and 1.5 [1.0–2.2] for HDS. Conclusions The risk of hepatotoxicity was increased in the first six months of atorvastatin compared to simvastatin treatment, with the greatest difference between high dose atorvastatin and low dose simvastatin. The numbers of events in the analyses were small. PMID:26983033

  11. A significant increase in the pepsinogen I/II ratio is a reliable biomarker for successful Helicobacter pylori eradication.

    Directory of Open Access Journals (Sweden)

    Hiroki Osumi

    Full Text Available Helicobacter pylori (H. pylori) eradication is usually assessed using the 13C-urea breath test (UBT), anti-H. pylori antibody and the H. pylori stool antigen test. However, a few reports have used pepsinogen (PG), in particular the percentage change in the PG I/II ratio. Here, we evaluated the usefulness of percentage changes in serum PG I/II ratios for determining the success of eradication therapy for H. pylori. In total, 650 patients received eradication therapy from October 2008 to March 2013 in our Cancer Institute Hospital. We evaluated the relationship between H. pylori eradication and percentage changes in serum PG I/II ratios before and 3 months after treatment with CLEIA® (FUJIREBIO Inc, Tokyo, Japan). The gold standard for H. pylori eradication was defined as a negative UBT performed 3 months after completion of eradication treatment. Cut-off values for percentage changes in serum PG I/II ratios were set at +40%, +25% and +10% when the serum PG I/II ratio before treatment was below 3.0, above 3.0 but below 5.0, and 5.0 or above, respectively. Serum PG I and PG II levels were measured in 562 patients with H. pylori infection before and after eradication therapy. Eradication of H. pylori was achieved in 433 of the patients studied (77.0%). The eradication rates for first-line, second-line, third-line and penicillin-allergy treatments were 73.8% (317/429), 88.3% (99/112), 75% (12/16) and 100% (5/5), respectively. An increase in the serum PG I/II ratio after treatment, expressed as a percentage of the pretreatment value, clearly distinguished success from failure of eradication (108.2±57.2 vs. 6.8±30.7, p<0.05). Using the above cut-off values, the sensitivity, specificity and validity for determining H. pylori eradication were 93.1%, 93.8% and 93.2%, respectively. In conclusion, percentage changes in serum PG I/II ratios are useful as evaluation criteria for assessing the success of eradication therapy for H. pylori.
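
    The stated cut-off rule translates directly into code. The following sketch is an illustrative transcription (function name and interface are mine, not the authors', and the behaviour at a pretreatment ratio of exactly 3.0 is not specified in the abstract):

    ```python
    # Cut-off rule from the abstract: the required percentage rise in the
    # PG I/II ratio depends on the pretreatment ratio.
    def eradication_predicted(pg_ratio_before: float, pg_ratio_after: float) -> bool:
        """Return True if the percentage rise in the PG I/II ratio meets the
        study's cut-off for successful H. pylori eradication."""
        pct_change = 100.0 * (pg_ratio_after - pg_ratio_before) / pg_ratio_before
        if pg_ratio_before < 3.0:
            cutoff = 40.0
        elif pg_ratio_before < 5.0:   # boundary at exactly 3.0 assumed here
            cutoff = 25.0
        else:
            cutoff = 10.0
        return pct_change >= cutoff

    # Example: a ratio rising from 2.0 to 3.5 is a 75% increase, above the
    # 40% cut-off that applies when the pretreatment ratio is below 3.0.
    print(eradication_predicted(2.0, 3.5))  # True
    ```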

  12. Scaling Qualitative Probability

    OpenAIRE

    Burgin, Mark

    2017-01-01

    There are different approaches to qualitative probability, including subjective probability. We developed a representation of qualitative probability based on relational systems, which allows modeling uncertainty by probability structures and is more coherent than existing approaches. This setting makes it possible to prove that any comparative probability is induced by some probability structure (Theorem 2.1), that classical probability is a probability structure (Theorem 2.2) and that i...

  13. Cold-knife conisation and large loop excision of transformation zone significantly increase the risk for spontaneous preterm birth: a population-based cohort study.

    Science.gov (United States)

    Jančar, Nina; Mihevc Ponikvar, Barbara; Tomšič, Sonja

    2016-08-01

    Our aim was to explore the association of cold-knife conisation and large loop excision of the transformation zone (LLETZ) with spontaneous preterm birth in a large 10-year national sample. We further explored the association of these procedures with preterm birth according to gestation. We conducted a population-based retrospective cohort study, using data from the national Medical Birth Registry. The study population consisted of all women giving birth to singletons in the period 2003-2012 in Slovenia, excluding all induced labors and elective cesarean sections before 37 weeks of gestation (N=192730). We compared the prevalence of spontaneous preterm births (before 28 weeks, before 32 weeks, before 34 weeks and before 37 weeks of gestation) in women with cold-knife conisation or LLETZ with that in women without a history of conisation, calculating odds ratios (OR) adjusted for potential confounders. The chi-square test was used for descriptive analysis. Logistic regression analyses were performed to estimate crude odds ratios (OR) and adjusted odds ratios (aOR) and their 95% confidence intervals (95% CI) with two-sided probability (p) values. A total of 8420 (4.4%) women had a preterm birth before 37 weeks of gestation, 2250 (1.2%) before 34 weeks of gestation, 1333 (0.7%) before 32 weeks of gestation and 603 (0.3%) before 28 weeks of gestation. A total of 4580 (2.4%) women had some type of conisation in their medical history: 2083 (1.1%) had cold-knife conisation and 2498 (1.3%) had LLETZ. In women with a history of cold-knife conisation, the adjusted OR for preterm birth before 37 weeks of gestation was 3.13 (95% CI: 2.74-3.57) and for preterm birth before 28 weeks of gestation 5.96 (95% CI: 4.3-8.3). In women with a history of LLETZ, the adjusted OR was 1.95 (95% CI: 1.68-2.25) and 2.88 (95% CI: 1.87-4.43), respectively. Women with a cervical excision procedure of any kind have significantly increased odds of preterm birth, especially preterm birth before 28 weeks of gestation.

  14. A brief introduction to probability.

    Science.gov (United States)

    Di Paola, Gioacchino; Bertani, Alessandro; De Monte, Lavinia; Tuzzolino, Fabio

    2018-02-01

    The theory of probability has been debated for centuries: as early as the 1600s, French mathematicians used the rules of probability to place and win bets. Since then, knowledge of probability has significantly evolved and is now an essential tool for statistics. In this paper, the basic theoretical principles of probability will be reviewed, with the aim of facilitating the comprehension of statistical inference. After a brief general introduction to probability, we will review the concept of the "probability distribution", a function providing the probabilities of occurrence of the different possible outcomes of a categorical or continuous variable. Specific attention will be focused on the normal distribution, which is the most relevant distribution applied to statistical analysis.
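
    Since the abstract singles out the normal distribution, its density is worth recalling (this is the standard textbook formula, not taken from the paper itself):

    ```latex
    % Density of the normal distribution with mean \mu and variance \sigma^2:
    f(x) \;=\; \frac{1}{\sigma\sqrt{2\pi}}\,
        \exp\!\left(-\frac{(x-\mu)^2}{2\sigma^2}\right),
        \qquad x \in \mathbb{R}.
    ```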

  15. Uncertainty about probability: a decision analysis perspective

    International Nuclear Information System (INIS)

    Howard, R.A.

    1988-01-01

    The issue of how to think about uncertainty about probability is framed and analyzed from the viewpoint of a decision analyst. The failure of nuclear power plants is used as an example. The key idea is to think of probability as describing a state of information on an uncertain event, and to pose the issue of uncertainty in this quantity as uncertainty about a number that would be definitive: it has the property that you would assign it as the probability if you knew it. Logical consistency requires that the probability to assign to a single occurrence in the absence of further information be the mean of the distribution of this definitive number, not the median as is sometimes suggested. Any decision that must be made without the benefit of further information must also be made using the mean of the definitive number's distribution. With this formulation, we find further that the probability of r occurrences in n exchangeable trials will depend on the first n moments of the definitive number's distribution. In making decisions, the expected value of clairvoyance on the occurrence of the event must be at least as great as that on the definitive number. If one of the events in question occurs, then the increase in probability of another such event is readily computed. This means, in terms of coin tossing, that unless one is absolutely sure of the fairness of a coin, seeing a head must increase the probability of heads, contrary to common intuition. A numerical example for nuclear power shows that the failure of one plant in a group with a low probability of failure can significantly increase the probability that must be assigned to failure of a second plant in the group
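
    The coin-tossing claim can be made concrete with a conjugate worked example (the Beta prior is an illustrative choice of mine; the abstract does not specify a distribution):

    ```latex
    % Let the definitive number p (the long-run chance of heads) have a
    % Beta(\alpha, \beta) distribution. The probability assigned to heads on a
    % single toss is the mean of that distribution:
    P(H) \;=\; E[p] \;=\; \frac{\alpha}{\alpha+\beta}.
    % After observing one head, the posterior is Beta(\alpha+1, \beta), so
    P(H \mid H) \;=\; \frac{\alpha+1}{\alpha+\beta+1} \;>\; \frac{\alpha}{\alpha+\beta}
    % whenever \beta > 0, i.e. whenever one is not already certain that p = 1.
    % Seeing a head therefore raises the probability assigned to heads.
    ```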

  16. Changes in rocket salad phytochemicals within the commercial supply chain: Glucosinolates, isothiocyanates, amino acids and bacterial load increase significantly after processing.

    Science.gov (United States)

    Bell, Luke; Yahya, Hanis Nadia; Oloyede, Omobolanle Oluwadamilola; Methven, Lisa; Wagstaff, Carol

    2017-04-15

    Five cultivars of Eruca sativa and a commercial variety of Diplotaxis tenuifolia were grown in the UK (summer) and subjected to commercial growth, harvesting and processing, with subsequent shelf life storage. Glucosinolates (GSL), isothiocyanates (ITC), amino acids (AA), free sugars, and bacterial loads were analysed throughout the supply chain to determine the effects on phytochemical compositions. Bacterial load of leaves increased significantly over time and peaked during shelf life storage. Significant correlations were observed with GSL and AA concentrations, suggesting a previously unknown relationship between plants and endemic leaf bacteria. GSLs, ITCs and AAs increased significantly after processing and during shelf life. The supply chain did not significantly affect glucoraphanin concentrations, and its ITC sulforaphane significantly increased during shelf life in E. sativa cultivars. We hypothesise that commercial processing may increase the nutritional value of the crop, and have added health benefits for the consumer. Copyright © 2016 The Authors. Published by Elsevier Ltd.. All rights reserved.

  17. On Probability Leakage

    OpenAIRE

    Briggs, William M.

    2012-01-01

    The probability leakage of model M with respect to evidence E is defined. Probability leakage is a kind of model error. It occurs when M implies that events y, which are impossible given E, have positive probability. Leakage does not imply model falsification. Models with probability leakage cannot be calibrated empirically. Regression models, which are ubiquitous in statistical practice, often evince probability leakage.

  18. Climate change increases the probability of heavy rains in Northern England/Southern Scotland like those of storm Desmond—a real-time event attribution revisited

    Science.gov (United States)

    Otto, Friederike E. L.; van der Wiel, Karin; van Oldenborgh, Geert Jan; Philip, Sjoukje; Kew, Sarah F.; Uhe, Peter; Cullen, Heidi

    2018-02-01

    On 4-6 December 2015, storm Desmond caused very heavy rainfall in Northern England and Southern Scotland, which led to widespread flooding. A week after the event, we provided an initial assessment of the influence of anthropogenic climate change on the likelihood of one-day precipitation events averaged over an area encompassing Northern England and Southern Scotland, using data and methods available immediately after the event occurred. The analysis was based on three independent methods of extreme event attribution: historical observed trends, coupled climate model simulations and a large ensemble of regional model simulations. All three methods agreed that the effect of climate change was positive, making precipitation events like this about 40% more likely, with a provisional 2.5%-97.5% confidence interval of 5%-80%. Here we revisit the assessment using more station data, an additional monthly event definition, a second global climate model and regional model simulations of winter 2015/16. The overall result of the analysis is similar to the real-time analysis, with a best estimate of a 59% increase in event frequency, but a larger confidence interval that does include no change. It is important to highlight that the observational data in the additional monthly analysis represent not only the rainfall associated with storm Desmond but also that of storms Eve and Frank, which occurred towards the end of the month.
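
    The quoted percentages are statements about the probability ratio, the standard metric in such attribution studies, which compares the event's probability with and without anthropogenic forcing (the notation below is mine, not the paper's):

    ```latex
    % p_1: probability of the event in the actual climate;
    % p_0: probability in a counterfactual climate without human influence.
    PR \;=\; \frac{p_1}{p_0}.
    % "About 40% more likely" corresponds to PR \approx 1.4; the revised best
    % estimate of a 59% increase in event frequency to PR \approx 1.59.
    ```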

  19. Mechanistic considerations used in the development of the probability of failure in transient increases in power (PROFIT) pellet-zircaloy cladding (thermo-mechanical-chemical) interactions (pci) fuel failure model

    International Nuclear Information System (INIS)

    Pankaskie, P.J.

    1980-05-01

    A fuel Pellet-Zircaloy Cladding (thermo-mechanical-chemical) interactions (PCI) failure model for estimating the Probability of Failure in Transient Increases in Power (PROFIT) was developed. PROFIT is based on (1) standard statistical methods applied to available PCI fuel failure data and (2) a mechanistic analysis of the environmental and strain-rate-dependent stress versus strain characteristics of Zircaloy cladding. The statistical analysis of fuel failures attributable to PCI suggested that parameters in addition to power, transient increase in power, and burnup are needed to define PCI fuel failures in terms of probability estimates with known confidence limits. The PROFIT model, therefore, introduces an environmental and strain-rate dependent Strain Energy Absorption to Failure (SEAF) concept to account for the stress versus strain anomalies attributable to interstitial-dislocation interaction effects in the Zircaloy cladding

  20. Pattern of Prostate-Specific Antigen (PSA) Failure Dictates the Probability of a Positive Bone Scan in Patients With an Increasing PSA After Radical Prostatectomy

    Science.gov (United States)

    Dotan, Zohar A.; Bianco, Fernando J.; Rabbani, Farhang; Eastham, James A.; Fearn, Paul; Scher, Howard I.; Kelly, Kevin W.; Chen, Hui-Ni; Schöder, Heiko; Hricak, Hedvig; Scardino, Peter T.; Kattan, Michael W.

    2007-01-01

    Purpose Physicians often order periodic bone scans (BS) to check for metastases in patients with an increasing prostate-specific antigen (PSA; biochemical recurrence [BCR]) after radical prostatectomy (RP), but most scans are negative. We studied patient characteristics to build a predictive model for a positive scan. Patients and Methods From our prostate cancer database we identified all patients with detectable PSA after RP. We analyzed the following features at the time of each bone scan for association with a positive BS: preoperative PSA, time to BCR, pathologic findings of the RP, PSA before the BS (trigger PSA), PSA kinetics (PSA doubling time, PSA slope, and PSA velocity), and time from BCR to BS. The results were incorporated into a predictive model. Results There were 414 BS performed in 239 patients with BCR and no history of androgen deprivation therapy. Only 60 (14.5%) were positive for metastases. In univariate analysis, preoperative PSA (P = .04), seminal vesicle invasion (P = .02), PSA velocity (P < .001), and trigger PSA (P < .001) predicted a positive BS. In multivariate analysis, only PSA slope (odds ratio [OR], 2.71; P = .03), PSA velocity (OR, 0.93; P = .003), and trigger PSA (OR, 1.022; P < .001) predicted a positive BS. A nomogram for predicting the bone scan result was constructed with an overfit-corrected concordance index of 0.93. Conclusion Trigger PSA, PSA velocity, and slope were associated with a positive BS. A highly discriminating nomogram can be used to select patients according to their risk for a positive scan. Omitting scans in low-risk patients could reduce substantially the number of scans ordered. PMID:15774789
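
    The PSA kinetics named above are not defined in the abstract; the usual working definitions, assuming exponential growth for the doubling time, are sketched here for reference:

    ```python
    # Common definitions of PSA kinetics (illustrative; the study does not
    # give its exact formulas).
    import math

    def psa_doubling_time(psa1: float, psa2: float, months_apart: float) -> float:
        """PSA doubling time in months, assuming exponential growth."""
        return months_apart * math.log(2) / (math.log(psa2) - math.log(psa1))

    def psa_velocity(psa1: float, psa2: float, months_apart: float) -> float:
        """Linear PSA velocity in ng/mL per month."""
        return (psa2 - psa1) / months_apart

    # Example: PSA rising from 0.4 to 1.6 ng/mL over 12 months doubles twice,
    # giving a doubling time of 6 months.
    print(psa_doubling_time(0.4, 1.6, 12.0))  # 6.0
    ```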

  1. Probability 1/e

    Science.gov (United States)

    Koo, Reginald; Jones, Martin L.

    2011-01-01

    Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.
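
    The article's three problems are not listed in this record, but one classic event with probability close to 1/e (chosen here purely for illustration) is the matching problem: a uniformly random permutation of n items has no fixed point with probability tending to 1/e as n grows. A quick simulation confirms it:

    ```python
    # Simulation of the matching problem: the fraction of random permutations
    # with no fixed point approaches 1/e ~ 0.3679 for moderately large n.
    import math
    import random

    def has_no_fixed_point(n: int) -> bool:
        perm = list(range(n))
        random.shuffle(perm)
        return all(perm[i] != i for i in range(n))

    trials = 100_000
    frac = sum(has_no_fixed_point(20) for _ in range(trials)) / trials
    print(f"simulated: {frac:.4f}   1/e = {1 / math.e:.4f}")  # both ~0.3679
    ```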

  2. Probability an introduction

    CERN Document Server

    Goldberg, Samuel

    1960-01-01

    Excellent basic text covers set theory, probability theory for finite sample spaces, binomial theorem, probability distributions, means, standard deviations, probability function of binomial distribution, more. Includes 360 problems with answers for half.

  3. Quantum probability measures and tomographic probability densities

    NARCIS (Netherlands)

    Amosov, GG; Man'ko, [No Value

    2004-01-01

    Using a simple relation of the Dirac delta-function to the generalized theta-function, the relationship between the tomographic probability approach and the quantum probability measure approach to the description of quantum states is discussed. The quantum state tomogram expressed in terms of the

  4. Probability via expectation

    CERN Document Server

    Whittle, Peter

    1992-01-01

    This book is a complete revision of the earlier work Probability which appeared in 1970. While revised so radically and incorporating so much new material as to amount to a new text, it preserves both the aim and the approach of the original. That aim was stated as the provision of a 'first text in probability, demanding a reasonable but not extensive knowledge of mathematics, and taking the reader to what one might describe as a good intermediate level'. In doing so it attempted to break away from stereotyped applications, and consider applications of a more novel and significant character. The particular novelty of the approach was that expectation was taken as the prime concept, and the concept of expectation axiomatized rather than that of a probability measure. In the preface to the original text of 1970 (reproduced below, together with that to the Russian edition of 1982) I listed what I saw as the advantages of the approach in as unlaboured a fashion as I could. I also took the view that the text...

  5. Probability of causation approach

    International Nuclear Information System (INIS)

    Jose, D.E.

    1988-01-01

    Probability of causation (PC) is sometimes viewed as a great improvement by those persons who are not happy with the present rulings of courts in radiation cases. The author does not share that hope and expects that PC will not play a significant role in these issues for at least the next decade. If it is ever adopted in a legislative compensation scheme, it will be used in a way that is unlikely to please most scientists. Consequently, PC is a false hope for radiation scientists, and its best contribution may well lie in some of the spin-off effects, such as an influence on medical practice
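
    The abstract does not define PC; in radioepidemiological practice it is commonly computed from the excess relative risk (ERR) attributable to the exposure, as sketched below (a standard formulation, not taken from this abstract):

    ```latex
    % Commonly used definition of the probability of causation for a given
    % exposure, expressed via the excess relative risk ERR:
    PC \;=\; \frac{\text{excess risk due to exposure}}
                  {\text{baseline risk} + \text{excess risk due to exposure}}
       \;=\; \frac{ERR}{1 + ERR}.
    ```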

  6. Toward a generalized probability theory: conditional probabilities

    International Nuclear Information System (INIS)

    Cassinelli, G.

    1979-01-01

    The main mathematical object of interest in the quantum logic approach to the foundations of quantum mechanics is the orthomodular lattice and a set of probability measures, or states, defined by the lattice. This mathematical structure is studied per se, independently from the intuitive or physical motivation of its definition, as a generalized probability theory. It is thought that the building-up of such a probability theory could eventually throw light on the mathematical structure of Hilbert-space quantum mechanics as a particular concrete model of the generalized theory. (Auth.)

  7. Invariant probabilities of transition functions

    CERN Document Server

    Zaharopol, Radu

    2014-01-01

    The structure of the set of all the invariant probabilities and the structure of various types of individual invariant probabilities of a transition function are two topics of significant interest in the theory of transition functions, and are studied in this book. The results obtained are useful in ergodic theory and the theory of dynamical systems, which, in turn, can be applied in various other areas (like number theory). They are illustrated using transition functions defined by flows, semiflows, and one-parameter convolution semigroups of probability measures. In this book, all results on transition probabilities that have been published by the author between 2004 and 2008 are extended to transition functions. The proofs of the results obtained are new. For transition functions that satisfy very general conditions the book describes an ergodic decomposition that provides relevant information on the structure of the corresponding set of invariant probabilities. Ergodic decomposition means a splitting of t...

  8. Philosophical theories of probability

    CERN Document Server

    Gillies, Donald

    2000-01-01

    The Twentieth Century has seen a dramatic rise in the use of probability and statistics in almost all fields of research. This has stimulated many new philosophical ideas on probability. Philosophical Theories of Probability is the first book to present a clear, comprehensive and systematic account of these various theories and to explain how they relate to one another. Gillies also offers a distinctive version of the propensity theory of probability, and the intersubjective interpretation, which develops the subjective theory.

  9. Non-Archimedean Probability

    NARCIS (Netherlands)

    Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia

    We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned probability zero.

  10. Interpretations of probability

    CERN Document Server

    Khrennikov, Andrei

    2009-01-01

    This is the first fundamental book devoted to non-Kolmogorov probability models. It provides a mathematical theory of negative probabilities, with numerous applications to quantum physics, information theory, complexity, biology and psychology. The book also presents an interesting model of cognitive information reality with flows of information probabilities, describing the process of thinking, social, and psychological phenomena.

  11. Default probabilities and default correlations

    OpenAIRE

    Erlenmaier, Ulrich; Gersbach, Hans

    2001-01-01

    Starting from the Merton framework for firm defaults, we provide the analytics and robustness of the relationship between default probabilities and default correlations. We show that loans with higher default probabilities will not only have higher variances but also higher correlations between loans. As a consequence, portfolio standard deviation can increase substantially when loan default probabilities rise. This result has two important implications. First, relative prices of loans with different default probabili...
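
    The record is truncated, but the textbook Merton setup behind it is easy to state: a firm defaults when its lognormal asset value falls below the face value of its debt at the horizon, so the default probability is a normal tail probability. A minimal sketch under those standard assumptions (parameter names illustrative; this is not necessarily the authors' exact parameterization):

```python
from math import log, sqrt
from statistics import NormalDist

N = NormalDist().cdf  # standard normal CDF

def merton_pd(v0, debt, mu, sigma, horizon=1.0):
    """Default probability in the textbook Merton model:
    default iff the lognormal asset value V_T falls below the debt
    face value at the horizon. Illustrative only."""
    d = (log(v0 / debt) + (mu - 0.5 * sigma**2) * horizon) / (sigma * sqrt(horizon))
    return N(-d)  # P(V_T < debt)

# Higher asset volatility -> higher default probability
print(merton_pd(100, 70, mu=0.05, sigma=0.2))  # ~0.027
print(merton_pd(100, 70, mu=0.05, sigma=0.4))  # ~0.21, considerably larger
```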

  12. Reduced memory skills and increased hair cortisol levels in recent Ecstasy/MDMA users: significant but independent neurocognitive and neurohormonal deficits.

    Science.gov (United States)

    Downey, Luke A; Sands, Helen; Jones, Lewis; Clow, Angela; Evans, Phil; Stalder, Tobias; Parrott, Andrew C

    2015-05-01

    The goals of this study were to measure the neurocognitive performance of recent users of recreational Ecstasy and investigate whether it was associated with the stress hormone cortisol. The 101 participants included 27 recent light users of Ecstasy (one to four times in the last 3 months), 23 recent heavier Ecstasy users (five or more times) and 51 non-users. Rivermead paragraph recall provided an objective measure for immediate and delayed recall. The prospective and retrospective memory questionnaire provided a subjective index of memory deficits. Cortisol levels were taken from near-scalp 3-month hair samples. Cortisol was significantly raised in recent heavy Ecstasy users compared with controls, whereas hair cortisol in light Ecstasy users was not raised. Both Ecstasy groups were significantly impaired on the Rivermead delayed word recall, and both groups reported significantly more retrospective and prospective memory problems. Stepwise regression confirmed that lifetime Ecstasy predicted the extent of these memory deficits. Recreational Ecstasy is associated with increased levels of the bio-energetic stress hormone cortisol and significant memory impairments. No significant relationship between cortisol and the cognitive deficits was observed. Ecstasy users did display evidence of a metacognitive deficit, with the strength of the correlations between objective and subjective memory performances being significantly lower in the Ecstasy users. Copyright © 2015 John Wiley & Sons, Ltd.

  13. Failure probability under parameter uncertainty.

    Science.gov (United States)

    Gerrard, R; Tsanakas, A

    2011-05-01

    In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decision-maker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications. © 2010 Society for Risk Analysis.
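
    The article's pivotal argument for location-scale families can be checked numerically. With normal losses and a threshold set at x̄ + k·s from the sample, the quantity (X_new − x̄)/s is a pivot, so the failure probability is the same whatever the true parameters. A small Monte Carlo sketch of that invariance (all constants illustrative):

```python
import random, statistics

def failure_rate(mu, sigma, n=20, k=2.0, trials=50_000, seed=1):
    """Frequency with which a fresh observation exceeds the estimated
    threshold xbar + k*s. Illustrates the article's point for a normal
    location-scale family: the result does not depend on (mu, sigma)."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(trials):
        sample = [rng.gauss(mu, sigma) for _ in range(n)]
        xbar = statistics.fmean(sample)
        s = statistics.stdev(sample)
        if rng.gauss(mu, sigma) > xbar + k * s:
            failures += 1
    return failures / trials

print(failure_rate(0, 1))    # ~0.033
print(failure_rate(50, 9))   # essentially the same, up to Monte Carlo noise
```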

  14. Foundations of probability

    International Nuclear Information System (INIS)

    Fraassen, B.C. van

    1979-01-01

    The interpretation of probabilities in physical theories is considered, whether quantum or classical. The following points are discussed: 1) the functions P(μ, Q), in terms of which states and propositions can be represented, are classical (Kolmogorov) probabilities, formally speaking; 2) these probabilities are generally interpreted as themselves conditional, and the conditions are mutually incompatible where the observables are maximal; and 3) testing of the theory typically takes the form of confronting the expectation values of an observable Q calculated with probability measures P(μ, Q) for states μ; hence, of comparing the probabilities P(μ, Q)(E) with the frequencies of occurrence of the corresponding events. It seems that even the interpretation of quantum mechanics, in so far as it concerns what the theory says about the empirical (i.e. actual, observable) phenomena, deals with the confrontation of classical probability measures with observable frequencies. This confrontation is studied. (Auth./C.F.)

  15. The quantum probability calculus

    International Nuclear Information System (INIS)

    Jauch, J.M.

    1976-01-01

    The Wigner anomaly (1932) for the joint distribution of noncompatible observables is an indication that the classical probability calculus is not applicable for quantum probabilities. It should, therefore, be replaced by another, more general calculus, which is specifically adapted to quantal systems. In this article this calculus is exhibited and its mathematical axioms and the definitions of the basic concepts such as probability field, random variable, and expectation values are given. (B.R.H)

  16. Choice Probability Generating Functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel L; Bierlaire, Michel

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications.

  17. Probability of satellite collision

    Science.gov (United States)

    Mccarter, J. W.

    1972-01-01

    A method is presented for computing the probability of a collision between a particular artificial earth satellite and any one of the total population of earth satellites. The collision hazard incurred by the proposed modular Space Station is assessed using the technique presented. The results of a parametric study to determine what type of satellite orbits produce the greatest contribution to the total collision probability are presented. Collision probability for the Space Station is given as a function of Space Station altitude and inclination. Collision probability was also parameterized over miss distance and mission duration.
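
    A common simplified encounter model (not necessarily the method of this 1972 report) projects the relative position of the two objects onto the encounter plane, treats it as an isotropic Gaussian centered at the nominal miss distance, and integrates that density over the combined hard-body circle. A numerical sketch under those assumptions (all parameter values illustrative):

```python
import math

def collision_probability(miss, sigma, radius, n=400):
    """2-D encounter-plane model: relative position ~ isotropic Gaussian
    with standard deviation sigma, centered at the nominal miss distance;
    collision iff it falls within the combined hard-body radius.
    A common simplification, not necessarily this report's method."""
    # Integrate the bivariate normal density over the hard-body disc.
    total = 0.0
    step = 2 * radius / n
    for i in range(n):
        x = -radius + (i + 0.5) * step
        half = math.sqrt(radius**2 - x**2)
        dy = 2 * half / n
        for j in range(n):
            y = -half + (j + 0.5) * dy
            r2 = (x - miss) ** 2 + y ** 2
            total += math.exp(-r2 / (2 * sigma**2)) * step * dy
    return total / (2 * math.pi * sigma**2)

# 10 m combined radius, 200 m expected miss, 100 m position uncertainty
print(collision_probability(miss=200.0, sigma=100.0, radius=10.0))  # ~7e-4
```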

  18. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2013-01-01

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended...
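
    The canonical worked example of a CPGF is the multinomial logit: with i.i.d. Gumbel shocks, G(u) = log Σ_i exp(u_i), and its gradient is the softmax vector of choice probabilities. A minimal sketch consistent with the definition described in the abstract (not code from the paper):

```python
import math

def cpgf(u):
    """CPGF of multinomial logit: G(u) = log(sum_i exp(u_i)),
    the classic ARUM with i.i.d. Gumbel location shifters."""
    m = max(u)  # stabilize the exponentials
    return m + math.log(sum(math.exp(x - m) for x in u))

def choice_probabilities(u):
    """Gradient of the CPGF = softmax = the logit choice probabilities."""
    m = max(u)
    w = [math.exp(x - m) for x in u]
    z = sum(w)
    return [x / z for x in w]

u = [1.0, 0.5, -0.2]
print(cpgf(u))                  # ~1.65
print(choice_probabilities(u))  # sums to 1; ~[0.52, 0.32, 0.16]
```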

  19. Handbook of probability

    CERN Document Server

    Florescu, Ionut

    2013-01-01

    THE COMPLETE COLLECTION NECESSARY FOR A CONCRETE UNDERSTANDING OF PROBABILITY Written in a clear, accessible, and comprehensive manner, the Handbook of Probability presents the fundamentals of probability with an emphasis on the balance of theory, application, and methodology. Utilizing basic examples throughout, the handbook expertly transitions between concepts and practice to allow readers an inclusive introduction to the field of probability. The book provides a useful format with self-contained chapters, allowing the reader easy and quick reference. Each chapter includes an introductio

  20. Real analysis and probability

    CERN Document Server

    Ash, Robert B; Lukacs, E

    1972-01-01

    Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory.Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var

  1. Final report of a randomized trial on altered-fractionated radiotherapy in nasopharyngeal carcinoma prematurely terminated by significant increase in neurologic complications

    International Nuclear Information System (INIS)

    Teo, Peter Man Lung; Leung, Sing Fai; Chan, Anthony Tak Cheung; Leung, Thomas Wai Tong; Choi, Peter Ho Keung; Kwan, Wing Hong; Lee, Wai Yee; Chau, Ricky Ming Chun; Yu, Peter Kau Wing; Johnson, Philip James

    2000-01-01

    Purpose: The aim of the present study was to compare the survival, local control, and complications of conventional/accelerated-hyperfractionated radiotherapy and conventional radiotherapy in nonmetastatic nasopharyngeal carcinoma (NPC). Methods and Materials: From February 1993 to October 1995, 159 patients with newly diagnosed nonmetastatic (M0) NPC with N0 or 4 cm or less N1 disease (Ho's N-stage classification, 1978) were randomized to receive either conventional radiotherapy (Arm I, n = 82) or conventional/accelerated-hyperfractionated radiotherapy (Arm II, n = 77). Stratification was according to T stage. The biologic effective doses (BED10) to the primary and the upper cervical lymphatics were 75.0 and 73.1 for Arm I and 84.4 and 77.2 for Arm II, respectively. Results: With comparable distribution among the T stages between the two arms, the freedom-from-local-failure rate at 5 years after radiotherapy was not significantly different between the two arms (85.3%; 95% confidence interval, 77.2-93.4% for Arm I; and 88.9%; 95% confidence interval, 81.7-96.2% for Arm II). The two arms were also comparable in overall survival, relapse-free survival, and rates of distant metastasis and regional relapse. Conventional/accelerated-hyperfractionated radiotherapy (Arm II) was associated with significantly increased radiation-induced damage to the central nervous system (including temporal lobe, cranial nerves, optic nerve/chiasma, and brainstem/spinal cord). Although not statistically significant, radiation-induced cranial nerve palsy (typically involving VIII-XII), trismus, neck soft tissue fibrosis, and hypopituitarism and hypothyroidism occurred more often in Arm II. In addition, the complications occurred at significantly shorter intervals after radiotherapy in Arm II. Conclusion: Accelerated hyperfractionation, when used in conjunction with a two-dimensional radiotherapy planning technique, in this case the Ho's technique, resulted in increased radiation damage to the central

  2. Editorial Commentary: Big Data Suggest That Because of a Significant Increased Risk of Postoperative Infection, Steroid Injection Is Not Recommended After Ankle Arthroscopy.

    Science.gov (United States)

    Brand, Jefferson C

    2016-02-01

    A recent study addressing infection rate after intra-articular steroid injection during ankle arthroscopy gives pause to this practice, with an odds ratio of 2.2 in the entire population that was injected with a steroid simultaneously with ankle arthroscopy compared with patients who did not receive an ankle injection. Big data, used in the study upon which the Editor comments here, suggest that because of a significant increased risk of postoperative infection, steroid injection is not recommended after ankle arthroscopy. Copyright © 2016 Arthroscopy Association of North America. Published by Elsevier Inc. All rights reserved.

  3. Prenatal prochloraz treatment significantly increases pregnancy length and reduces offspring weight but does not affect social-olfactory memory in rats

    DEFF Research Database (Denmark)

    Dmytriyeva, Oksana; Klementiev, Boris; Berezin, Vladimir

    2013-01-01

    Metabolites of the commonly used imidazole fungicide prochloraz are androgen receptor antagonists. They have been shown to block androgen-driven development and compromise reproductive function. We tested the effect of prochloraz on cognitive behavior following exposure to this fungicide during the perinatal period. Pregnant Wistar rats were administered a 200 mg/kg dose of prochloraz on gestational day (GD) 7, GD11, and GD15. The social recognition test (SRT) was performed on 7-week-old male rat offspring. We found an increase in pregnancy length and a significantly reduced pup weight on PND15 and PND40...

  4. Prognostic significance of repeat biopsy in lupus nephritis: Histopathologic worsening and a short time between biopsies is associated with significantly increased risk for end stage renal disease and death.

    Science.gov (United States)

    Arriens, Cristina; Chen, Sixia; Karp, David R; Saxena, Ramesh; Sambandam, Kamalanathan; Chakravarty, Eliza; James, Judith A; Merrill, Joan T

    2017-12-01

    histopathology had died compared to 2 (3.2%) of non-worsening patients. Biopsy worsening was associated with a significantly greater 15-year risk of ESRD (Hazard Ratio 4.2, p=0.0001) and death (Hazard Ratio 4.3, p=0.022), adjusting for age, gender, race, biopsy class, and treatment. Time between first and second biopsies was 5 years in 28. Over a 15-year period, those with <1 year between first and second biopsies (presumably enriched for patients with early clinical signs of progression) had a significantly greater risk of ESRD (Hazard Ratio 13.7, p<0.0001) and death (Hazard Ratio 16.9, p=0.0022) after adjusting for age, gender, race, biopsy class, and treatment. A repeat renal biopsy demonstrating worsening pathology increases the risk of ESRD and death more than four-fold compared to non-worsening patients. Given known potential mismatch between biopsy and clinical data, repeat biopsies may add important information and justify changes in treatment not considered on clinical grounds. Earlier detection of poor prognostic signs in those without early clinical deterioration might improve outcomes in enough patients to reconsider cost effectiveness of routine repeat biopsy. Copyright © 2016 Elsevier Inc. All rights reserved.

  5. Among Metabolic Factors, Significance of Fasting and Postprandial Increases in Acyl and Desacyl Ghrelin and the Acyl/Desacyl Ratio in Obstructive Sleep Apnea before and after Treatment.

    Science.gov (United States)

    Chihara, Yuichi; Akamizu, Takashi; Azuma, Masanori; Murase, Kimihiko; Harada, Yuka; Tanizawa, Kiminobu; Handa, Tomohiro; Oga, Toru; Mishima, Michiaki; Chin, Kazuo

    2015-08-15

    There are reports suggesting that obstructive sleep apnea (OSA) may itself cause weight gain. However, recent reports showed increases in body mass index (BMI) following continuous positive airway pressure (CPAP) treatments. When considering weight changes, changes in humoral factors that have significant effects on appetite, such as acyl (AG) and desacyl ghrelin (DAG), leptin, insulin, and glucose, and their interactions, examples of which are AG/DAG and AG/insulin, are important. The aim of this study was to test the hypothesis that some appetite-related factors had a specific profile before and after CPAP treatment. Metabolic parameters were measured cross-sectionally while fasting and 30, 60, 90, and 120 min following breakfast in no or mild OSA and in moderate-to-severe OSA participants (classified by apnea-hypopnea index). Although fasting and postprandial glucose, insulin, and leptin levels did not differ between no or mild OSA and moderate-to-severe OSA participants, AG and DAG, including AG/DAG and AG/insulin, under fasting and postprandial conditions were significantly increased in the moderate-to-severe OSA patients. Continuous changes in ghrelin secretion in OSA patients existed at least within 3 months of CPAP treatment. Methods to prevent OSA as well as treatment in its early stage may be recommended. © 2015 American Academy of Sleep Medicine.

  6. Introduction to probability

    CERN Document Server

    Freund, John E

    1993-01-01

    Thorough, lucid coverage of permutations and factorials, probabilities and odds, frequency interpretation, mathematical expectation, decision making, postulates of probability, rule of elimination, binomial distribution, geometric distribution, standard deviation, law of large numbers, and much more. Exercises with some solutions. Summary. Bibliography. Includes 42 black-and-white illustrations. 1973 edition.

  7. Probability, Nondeterminism and Concurrency

    DEFF Research Database (Denmark)

    Varacca, Daniele

    Nondeterminism is modelled in domain theory by the notion of a powerdomain, while probability is modelled by that of the probabilistic powerdomain. Some problems arise when we want to combine them in order to model computation in which both nondeterminism and probability are present. In particula...

  8. Sunitinib significantly suppresses the proliferation, migration, apoptosis resistance, tumor angiogenesis and growth of triple-negative breast cancers but increases breast cancer stem cells.

    Science.gov (United States)

    Chinchar, Edmund; Makey, Kristina L; Gibson, John; Chen, Fang; Cole, Shelby A; Megason, Gail C; Vijayakumar, Srinivassan; Miele, Lucio; Gu, Jian-Wei

    2014-01-01

    The majority of triple-negative breast cancers (TNBCs) are basal-like breast cancers. However, there is no reported study on anti-tumor effects of sunitinib in xenografts of basal-like TNBC (MDA-MB-468) cells. In the present study, MDA-MB-231, MDA-MB-468, and MCF-7 cells were cultured using RPMI 1640 media with 10% FBS. Vascular endothelial growth factor (VEGF) protein levels were detected using ELISA (R & D Systems). MDA-MB-468 cells were exposed to sunitinib for 18 hours for measuring proliferation (3H-thymidine incorporation), migration (BD Invasion Chamber), and apoptosis (ApopTag and ApoScreen Annexin V Kit). The effect of sunitinib on Notch-1 expression was determined by Western blot in cultured MDA-MB-468 cells. 10^6 MDA-MB-468 cells were inoculated into the left fourth mammary gland fat pad in athymic nude-foxn1 mice. When the tumor volume reached 100 mm^3, sunitinib was given by gavage at 80 mg/kg/2 days for 4 weeks. Tumor angiogenesis was determined by CD31 immunohistochemistry. Breast cancer stem cells (CSCs) isolated from the tumors were determined by flow cytometry analysis using CD44(+)/CD24(-/low). ELISA indicated that VEGF was much more highly expressed in MDA-MB-468 cells than in MDA-MB-231 and MCF-7 cells. Sunitinib significantly inhibited the proliferation, invasion, and apoptosis resistance in cultured basal-like breast cancer cells. Sunitinib significantly increased the expression of Notch-1 protein in cultured MDA-MB-468 or MDA-MB-231 cells. The xenograft models showed that oral sunitinib significantly reduced the tumor volume of TNBCs in association with the inhibition of tumor angiogenesis, but increased breast CSCs. These findings support the possibility that sunitinib increases breast CSCs even though it inhibits TNBC tumor angiogenesis and growth/progression, and that effects of sunitinib on Notch expression and hypoxia may increase breast cancer stem cells. This work provides the groundwork for an

  9. Janus-faced probability

    CERN Document Server

    Rocchi, Paolo

    2014-01-01

    The problem of probability interpretation was long overlooked before exploding in the 20th century, when the frequentist and subjectivist schools formalized two conflicting conceptions of probability. Beyond the radical followers of the two schools, a circle of pluralist thinkers tends to reconcile the opposing concepts. The author uses two theorems in order to prove that the various interpretations of probability do not come into opposition and can be used in different contexts. The goal here is to clarify the multifold nature of probability by means of a purely mathematical approach and to show how philosophical arguments can only serve to deepen actual intellectual contrasts. The book can be considered as one of the most important contributions in the analysis of probability interpretation in the last 10-15 years.

  10. MaquiBright™ standardized maqui berry extract significantly increases tear fluid production and ameliorates dry eye-related symptoms in a clinical pilot trial.

    Science.gov (United States)

    Hitoe, S; Tanaka, J; Shimoda, H

    2014-09-01

    Dry eye symptoms, resulting from insufficient tear fluid generation, represent a considerable burden for a largely underestimated number of people. We concluded from earlier pre-clinical investigations that the etiology of dry eyes encompasses oxidative stress burden to lachrymal glands and that the antioxidant MaquiBright™ Aristotelia chilensis berry extract helps restore glandular activity. In this pilot trial we investigated 13 healthy volunteers with moderately dry eyes using the Schirmer test, as well as a questionnaire which allows for estimating the impact of dry eyes on daily routines. Study participants were assigned to one of two groups, receiving MaquiBright™ at a daily dosage of either 30 mg (N.=7) or 60 mg (N.=6) over a period of 60 days. Both groups presented with significantly increased tear fluid production. The impact of eye dryness on daily routines was evaluated employing the "Dry Eye-related Quality of life Score" (DEQS), with values spanning from zero (no impact) to a maximum score of 60. Participants had comparable baseline values of 41.0±7.7 (30 mg) and 40.2±6.3 (60 mg). With 30 mg treatment the score significantly decreased to 21.8±3.9 and 18.9±3.9 after 30 and 60 days, respectively. With 60 mg treatment the DEQS significantly decreased to 26.9±5.3 and 11.1±2.7 after 30 and 60 days, respectively. Blood was drawn for safety analyses (complete blood rheology and chemistry) at all three investigative time points, without negative findings. In conclusion, while daily supplementation with 30 mg MaquiBright™ is effective, the dosage of 60 mg significantly increased tear fluid volume at all investigative time points and decreased dry eye symptoms to almost a quarter of initial values after two months of treatment.

  11. The In Vitro Mass-Produced Model Mycorrhizal Fungus, Rhizophagus irregularis, Significantly Increases Yields of the Globally Important Food Security Crop Cassava

    Science.gov (United States)

    Ceballos, Isabel; Ruiz, Michael; Fernández, Cristhian; Peña, Ricardo

    2013-01-01

    The arbuscular mycorrhizal symbiosis is formed between arbuscular mycorrhizal fungi (AMF) and plant roots. The fungi provide the plant with inorganic phosphate (P). The symbiosis can result in increased plant growth. Although most global food crops naturally form this symbiosis, very few studies have shown that their practical application can lead to large-scale increases in food production. Application of AMF to crops in the tropics is potentially effective for improving yields. However, a main problem of using AMF on a large scale is producing cheap inoculum in a clean, sterile carrier that is sufficiently concentrated to transport cheaply. Recently, mass-produced in vitro inoculum of the model mycorrhizal fungus Rhizophagus irregularis became available, potentially making its use viable in tropical agriculture. One of the most globally important food plants in the tropics is cassava. We evaluated the effect of in vitro mass-produced R. irregularis inoculum on the yield of cassava crops at two locations in Colombia. A significant effect of R. irregularis inoculation on yield occurred at both sites. At one site, yield increases were observed irrespective of P fertilization. At the other site, inoculation with AMF and 50% of the normally applied P gave the highest yield. Although AMF inoculation resulted in greater food production, economic analyses revealed that AMF inoculation did not give a greater return on investment than conventional cultivation. However, the amount of AMF inoculum used was double the recommended dose and was calculated with European, not Colombian, inoculum prices. R. irregularis can also be manipulated genetically in vitro, leading to improved plant growth. We conclude that application of in vitro R. irregularis is currently a way of increasing cassava yields, that there is a strong potential for it to be economically profitable, and that there is enormous potential to improve this efficiency further in the future. PMID:23950975

  12. Addition of sodium caseinate to skim milk increases nonsedimentable casein and causes significant changes in rennet-induced gelation, heat stability, and ethanol stability.

    Science.gov (United States)

    Lin, Yingchen; Kelly, Alan L; O'Mahony, James A; Guinee, Timothy P

    2017-02-01

    The protein content of skim milk was increased from 3.3 to 4.1% (wt/wt) by the addition of a blend of skim milk powder and sodium caseinate (NaCas), in which the weight ratio of skim milk powder to NaCas was varied from 0.8:0.0 to 0.0:0.8. Addition of NaCas increased the levels of nonsedimentable casein (from ∼6 to 18% of total casein) and calcium (from ∼36 to 43% of total calcium) and reduced the turbidity of the fortified milk, to a degree depending on level of NaCas added. Rennet gelation was adversely affected by the addition of NaCas at 0.2% (wt/wt) and completely inhibited at NaCas ≥0.4% (wt/wt). Rennet-induced hydrolysis was not affected by added NaCas. The proportion of total casein that was nonsedimentable on centrifugation (3,000 × g, 1 h, 25°C) of the rennet-treated milk after incubation for 1 h at 31°C increased significantly on addition of NaCas at ≥0.4% (wt/wt). Heat stability in the pH range 6.7 to 7.2 and ethanol stability at pH 6.4 were enhanced by the addition of NaCas. It is suggested that the negative effect of NaCas on rennet gelation is due to the increase in nonsedimentable casein, which upon hydrolysis by chymosin forms into small nonsedimentable particles that physically come between, and impede the aggregation of, rennet-altered para-casein micelles, and thereby inhibit the development of a gel network. Copyright © 2017 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  13. The in vitro mass-produced model mycorrhizal fungus, Rhizophagus irregularis, significantly increases yields of the globally important food security crop cassava.

    Directory of Open Access Journals (Sweden)

    Isabel Ceballos

    Full Text Available The arbuscular mycorrhizal symbiosis is formed between arbuscular mycorrhizal fungi (AMF) and plant roots. The fungi provide the plant with inorganic phosphate (P). The symbiosis can result in increased plant growth. Although most global food crops naturally form this symbiosis, very few studies have shown that their practical application can lead to large-scale increases in food production. Application of AMF to crops in the tropics is potentially effective for improving yields. However, a main problem of using AMF on a large scale is producing cheap inoculum in a clean, sterile carrier that is sufficiently concentrated to transport cheaply. Recently, mass-produced in vitro inoculum of the model mycorrhizal fungus Rhizophagus irregularis became available, potentially making its use viable in tropical agriculture. One of the most globally important food plants in the tropics is cassava. We evaluated the effect of in vitro mass-produced R. irregularis inoculum on the yield of cassava crops at two locations in Colombia. A significant effect of R. irregularis inoculation on yield occurred at both sites. At one site, yield increases were observed irrespective of P fertilization. At the other site, inoculation with AMF and 50% of the normally applied P gave the highest yield. Although AMF inoculation resulted in greater food production, economic analyses revealed that AMF inoculation did not give a greater return on investment than conventional cultivation. However, the amount of AMF inoculum used was double the recommended dose and was calculated with European, not Colombian, inoculum prices. R. irregularis can also be manipulated genetically in vitro, leading to improved plant growth. We conclude that application of in vitro R. irregularis is currently a way of increasing cassava yields, that there is a strong potential for it to be economically profitable, and that there is enormous potential to improve this efficiency further in the future.

  14. Prognostic Significance of Creatinine Increases During an Acute Heart Failure Admission in Patients With and Without Residual Congestion: A Post Hoc Analysis of the PROTECT Data.

    Science.gov (United States)

    Metra, Marco; Cotter, Gad; Senger, Stefanie; Edwards, Christopher; Cleland, John G; Ponikowski, Piotr; Cursack, Guillermo C; Milo, Olga; Teerlink, John R; Givertz, Michael M; O'Connor, Christopher M; Dittrich, Howard C; Bloomfield, Daniel M; Voors, Adriaan A; Davison, Beth A

    2018-05-01

    The importance of a serum creatinine increase, traditionally considered worsening renal function (WRF), during admission for acute heart failure has been recently debated, with data suggesting an interaction between congestion and creatinine changes. In post hoc analyses, we analyzed the association of WRF with length of hospital stay, 30-day death or cardiovascular/renal readmission, and 90-day mortality in the PROTECT study (Placebo-Controlled Randomized Study of the Selective A1 Adenosine Receptor Antagonist Rolofylline for Patients Hospitalized With Acute Decompensated Heart Failure and Volume Overload to Assess Treatment Effect on Congestion and Renal Function). Daily creatinine changes from baseline were categorized as WRF (an increase of 0.3 mg/dL or more) or not. Daily congestion scores were computed by summing scores for orthopnea, edema, and jugular venous pressure. Of the 2033 total patients randomized, 1537 patients had both creatinine and congestion data available at study day 14. Length of hospital stay was longer and 30-day cardiovascular/renal readmission or death more common in patients with WRF. However, these were driven by significant associations in patients with concomitant congestion at the time of assessment of renal function. The mean difference in length of hospital stay because of WRF was 3.51 (95% confidence interval, 1.29-5.73) more days (P=0.0019), and the hazard ratio for WRF on 30-day death or heart failure hospitalization was 1.49 (95% confidence interval, 1.06-2.09) times higher (P=0.0205), in significantly congested than in nonsignificantly congested patients. A similar trend was observed with 90-day mortality, although not statistically significant. In patients admitted for acute heart failure, WRF defined as a creatinine increase of ≥0.3 mg/dL was associated with longer length of hospital stay and worse 30- and 90-day outcomes. However, effects were largely driven by patients who had residual congestion at the time of renal function assessment. URL: https

  15. Using Playing Cards to Differentiate Probability Interpretations

    Science.gov (United States)

    López Puga, Jorge

    2014-01-01

    The aprioristic (classical, naïve and symmetric) and frequentist interpretations of probability are commonly known. Bayesian or subjective interpretation of probability is receiving increasing attention. This paper describes an activity to help students differentiate between the three types of probability interpretations.
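
    The activity itself is not reproduced in the abstract, but the three readings can be illustrated with a deck of cards: the classical answer to "probability of drawing an ace" is 4/52 by symmetry, the frequentist answer is the long-run relative frequency, and the Bayesian answer is a degree of belief updated by evidence. A toy sketch of all three (the trick-deck scenario is an invented example, not from the paper):

```python
import random
from fractions import Fraction

# Classical (aprioristic): symmetry of a well-shuffled 52-card deck
p_ace_classical = Fraction(4, 52)   # = 1/13

# Frequentist: long-run relative frequency of drawing an ace
rng = random.Random(0)
deck = ["ace"] * 4 + ["other"] * 48
draws = 100_000
freq = sum(rng.choice(deck) == "ace" for _ in range(draws)) / draws

# Bayesian: posterior belief that a deck is a trick deck (all aces)
# after seeing one ace drawn, starting from a 50/50 prior
prior = 0.5
likelihood_trick, likelihood_fair = 1.0, 4 / 52
posterior = prior * likelihood_trick / (
    prior * likelihood_trick + (1 - prior) * likelihood_fair)

print(p_ace_classical, round(freq, 4), round(posterior, 3))  # 1/13, ~0.077, 0.929
```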

  16. In vivo topical application of acetyl aspartic acid increases fibrillin-1 and collagen IV deposition leading to a significant improvement of skin firmness.

    Science.gov (United States)

    Gillbro, J M; Merinville, E; Cattley, K; Al-Bader, T; Hagforsen, E; Nilsson, M; Mavon, A

    2015-10-01

    Acetyl aspartic acid (A-A-A) was discovered through gene array analysis with corresponding Cmap analysis. We found that A-A-A increased keratinocyte regeneration, inhibited dermal matrix metalloprotease (MMP) expression, and relieved fibroblast stiffness through reduction of the fibroblast stiffness marker F-actin. Dermal absorption studies showed successful delivery to both the epidermal and dermal regions, and an in-use trial demonstrated that 1% A-A-A was well tolerated. In this study, the aim was to investigate whether A-A-A could stimulate the synthesis of extracellular matrix supporting proteins in vivo, and thereby improve the viscoelastic properties of human skin, by conducting a dual histological and biophysical clinical study. Two separate double-blind vehicle-controlled in vivo studies were conducted using a 1% A-A-A containing oil-in-water (o/w) emulsion. In the histological study, 16 female volunteers (>55 years of age) exhibiting photodamaged skin on their forearm were included, investigating the effect of a 12-day treatment of A-A-A on collagen IV (COLIV) and fibrillin-1. In a subsequent pilot study, 0.1% retinol was used for comparison to A-A-A (1%). The biomechanical properties of the skin were assessed in a panel of 16 women (>45 years of age) using the standard Cutometer MPA580 after topical application of the test products for 28 days. The use of multiple suction enabled the assessment of F4, an area parameter specifically representing skin firmness. Twelve-day topical application of 1% A-A-A significantly increased COLIV and fibrillin-1 by 13% and 6%, respectively, compared to vehicle. 1% A-A-A and 0.1% retinol were found to significantly reduce F4 after 28 days of treatment, by 15.8% and 14.7%, respectively, in the pilot Cutometer study. No significant difference was found between retinol and A-A-A. However, only A-A-A exhibited a significant effect vs. vehicle on skin firmness, which indicated the incremental benefit of A-A-A as a skin

  17. Prenatal prochloraz treatment significantly increases pregnancy length and reduces offspring weight but does not affect social-olfactory memory in rats.

    Science.gov (United States)

    Dmytriyeva, Oksana; Klementiev, Boris; Berezin, Vladimir; Bock, Elisabeth

    2013-07-01

    Metabolites of the commonly used imidazole fungicide prochloraz are androgen receptor antagonists. They have been shown to block androgen-driven development and compromise reproductive function. We tested the effect of prochloraz on cognitive behavior following exposure to this fungicide during the perinatal period. Pregnant Wistar rats were administered a 200 mg/kg dose of prochloraz on gestational day (GD) 7, GD11, and GD15. The social recognition test (SRT) was performed on 7-week-old male rat offspring. We found an increase in pregnancy length and a significantly reduced pup weight on PND15 and PND40 but no effect of prenatal prochloraz exposure on social investigation or acquisition of social-olfactory memory. Copyright © 2012 Elsevier GmbH. All rights reserved.

  18. Probability and Measure

    CERN Document Server

    Billingsley, Patrick

    2012-01-01

    Praise for the Third Edition "It is, as far as I'm concerned, among the best books in math ever written....if you are a mathematician and want to have the top reference in probability, this is it." (Amazon.com, January 2006) A complete and comprehensive classic in probability and measure theory Probability and Measure, Anniversary Edition by Patrick Billingsley celebrates the achievements and advancements that have made this book a classic in its field for the past 35 years. Now re-issued in a new style and format, but with the reliable content that the third edition was revered for, this

  19. The concept of probability

    International Nuclear Information System (INIS)

    Bitsakis, E.I.; Nicolaides, C.A.

    1989-01-01

    The concept of probability is now, and always has been, central to the debate on the interpretation of quantum mechanics. Furthermore, probability permeates all of science, as well as our everyday life. The papers included in this volume, written by leading proponents of the ideas expressed, embrace a broad spectrum of thought and results: mathematical, physical, epistemological, and experimental, both specific and general. The contributions are arranged in parts under the following headings: Following Schroedinger's thoughts; Probability and quantum mechanics; Aspects of the arguments on nonlocality; Bell's theorem and EPR correlations; Real or Gedanken experiments and their interpretation; Questions about irreversibility and stochasticity; and Epistemology, interpretation and culture. (author). refs.; figs.; tabs

  20. Long-term use of amiodarone before heart transplantation significantly reduces early post-transplant atrial fibrillation and is not associated with increased mortality after heart transplantation

    Directory of Open Access Journals (Sweden)

    Rivinius R

    2016-02-01

    group (P=0.0123). There was no statistically significant difference between patients with and without long-term use of amiodarone prior to HTX in 1-year (P=0.8596), 2-year (P=0.8620), 5-year (P=0.2737), or overall follow-up mortality after HTX (P=0.1049). Moreover, Kaplan–Meier survival analysis showed no statistically significant difference in overall survival (P=0.1786). Conclusion: Long-term use of amiodarone in patients before HTX significantly reduces early post-transplant AF and is not associated with increased mortality after HTX. Keywords: amiodarone, atrial fibrillation, heart failure, heart transplantation, mortality

  1. Loading of the knee during 3.0 T MRI is associated with significantly increased medial meniscus extrusion in mild and moderate osteoarthritis

    Energy Technology Data Exchange (ETDEWEB)

    Stehling, Christoph, E-mail: christoph.stehling@radiology.ucsf.edu [Musculoskeletal and Quantitative Imaging Group (MQIR), Department of Radiology and Biomedical Imaging, University of California San Francisco, San Francisco, CA (United States); Department of Clinical Radiology, University of Muenster, Muenster (Germany); Souza, Richard B. [Musculoskeletal and Quantitative Imaging Group (MQIR), Department of Radiology and Biomedical Imaging, University of California San Francisco, San Francisco, CA (United States); Graverand, Marie-Pierre Hellio Le; Wyman, Bradley T. [Pfizer Inc. New London, CT (United States); Li, Xiaojuan; Majumdar, Sharmila; Link, Thomas M. [Musculoskeletal and Quantitative Imaging Group (MQIR), Department of Radiology and Biomedical Imaging, University of California San Francisco, San Francisco, CA (United States)

    2012-08-15

    Purpose: Standard knee MRI is performed under unloading (ULC) conditions, and not much is known about changes of the meniscus, ligaments, or cartilage under loading conditions (LC). The aim is to study the influence of loading on different knee structures at 3 Tesla (T) in subjects with osteoarthritis (OA) and in normal controls. Materials and methods: 30 subjects, 10 healthy and 20 with radiographic evidence of OA (10 mild and 10 moderate), underwent 3 T MRI under ULC and LC at 50% body weight. All images were analyzed by two musculoskeletal radiologists, identifying and grading cartilage, meniscal, and ligamentous abnormalities. The changes between ULC and LC were assessed. For the meniscus, cartilage, and ligaments, changes of lesions, signal, and shape were evaluated; in addition, changes in meniscus extrusion were examined. A multivariate regression model was used to correct the data for the impact of age, gender, and BMI. A paired t-test was performed to calculate the differences in meniscus extrusion. Results: Subjects with degenerative knee abnormalities demonstrated significantly increased meniscus extrusion under LC when compared to normal subjects (p = 0.0008-0.0027). Subjects with knee abnormalities and higher KL scores showed significantly more changes in lesion, signal, and shape of the meniscus (80% (16/20) vs. 20% (2/10); p = 0.0025), ligaments, and cartilage during LC. Conclusion: The study demonstrates that axial loading has an effect on articular cartilage, ligament, and meniscus morphology, which is more significant in subjects with degenerative disease and may serve as an additional diagnostic tool for disease diagnosis and assessing progression in subjects with knee OA.

  2. Loading of the knee during 3.0 T MRI is associated with significantly increased medial meniscus extrusion in mild and moderate osteoarthritis

    International Nuclear Information System (INIS)

    Stehling, Christoph; Souza, Richard B.; Graverand, Marie-Pierre Hellio Le; Wyman, Bradley T.; Li, Xiaojuan; Majumdar, Sharmila; Link, Thomas M.

    2012-01-01

    Purpose: Standard knee MRI is performed under unloading (ULC) conditions, and not much is known about changes of the meniscus, ligaments, or cartilage under loading conditions (LC). The aim is to study the influence of loading on different knee structures at 3 Tesla (T) in subjects with osteoarthritis (OA) and in normal controls. Materials and methods: 30 subjects, 10 healthy and 20 with radiographic evidence of OA (10 mild and 10 moderate), underwent 3 T MRI under ULC and LC at 50% body weight. All images were analyzed by two musculoskeletal radiologists, identifying and grading cartilage, meniscal, and ligamentous abnormalities. The changes between ULC and LC were assessed. For the meniscus, cartilage, and ligaments, changes of lesions, signal, and shape were evaluated; in addition, changes in meniscus extrusion were examined. A multivariate regression model was used to correct the data for the impact of age, gender, and BMI. A paired t-test was performed to calculate the differences in meniscus extrusion. Results: Subjects with degenerative knee abnormalities demonstrated significantly increased meniscus extrusion under LC when compared to normal subjects (p = 0.0008–0.0027). Subjects with knee abnormalities and higher KL scores showed significantly more changes in lesion, signal, and shape of the meniscus (80% (16/20) vs. 20% (2/10); p = 0.0025), ligaments, and cartilage during LC. Conclusion: The study demonstrates that axial loading has an effect on articular cartilage, ligament, and meniscus morphology, which is more significant in subjects with degenerative disease and may serve as an additional diagnostic tool for disease diagnosis and assessing progression in subjects with knee OA.

  3. Symmetric dimeric bisbenzimidazoles DBP(n) reduce methylation of RARB and PTEN while significantly increasing methylation of rRNA genes in MCF-7 cancer cells.

    Directory of Open Access Journals (Sweden)

    Svetlana V Kostyuk

    Full Text Available Hypermethylation is observed in the promoter regions of suppressor genes in tumor cancer cells. Reactivation of these genes by demethylation of their promoters is a prospective strategy of anticancer therapy. Previous experiments have shown that symmetric dimeric bisbenzimidazoles DBP(n) are able to block DNA methyltransferase activities. It was also found that DBP(n) produce a moderate effect on the activation of total gene expression in the HeLa-TI population containing an epigenetically repressed avian sarcoma genome. It is shown that DBP(n) are able to penetrate the cellular membranes and accumulate in the breast carcinoma cell line MCF-7, mainly in the mitochondria and in the nucleus, excluding the nucleolus. The DBP(n) are non-toxic to the cells and have a weak overall demethylation effect on genomic DNA. DBP(n) demethylate the promoter regions of the tumor suppressor genes PTEN and RARB. DBP(n) promote expression of the genes RARB, PTEN, CDKN2A, RUNX3, Apaf-1 and APC, "silent" in MCF-7 because of the hypermethylation of their promoter regions. Simultaneously with the demethylation of the DNA in the nucleus, a significant increase in the methylation level of rRNA genes in the nucleolus was detected. Increased rDNA methylation correlated with a reduction of the rRNA amount in the cells by 20-30%. It is assumed that during DNA methyltransferase inhibition by DBP(n) in the nucleus, the enzyme is sequestered in the nucleolus and provides additional methylation of the rDNA that is not shielded by DBP(n). It is concluded that DBP(n) are able to accumulate in the nucleus (excluding the nucleolus area) and in the mitochondria of cancer cells, reducing mitochondrial potential. The DBP(n) induce demethylation of a cancer cell's genome, including demethylation of the promoters of tumor suppressor genes. DBP(n) significantly increase the methylation of ribosomal RNA genes in the nucleoli. Therefore further study of these compounds is needed

  4. Probability for statisticians

    CERN Document Server

    Shorack, Galen R

    2017-01-01

    This 2nd edition textbook offers a rigorous introduction to measure theoretic probability with particular attention to topics of interest to mathematical statisticians—a textbook for courses in probability for students in mathematical statistics. It is recommended to anyone interested in the probability underlying modern statistics, providing a solid grounding in the probabilistic tools and techniques necessary to do theoretical research in statistics. For the teaching of probability theory to post graduate statistics students, this is one of the most attractive books available. Of particular interest is a presentation of the major central limit theorems via Stein's method either prior to or alternative to a characteristic function presentation. Additionally, there is considerable emphasis placed on the quantile function as well as the distribution function. The bootstrap and trimming are both presented. Martingale coverage includes coverage of censored data martingales. The text includes measure theoretic...

  5. Concepts of probability theory

    CERN Document Server

    Pfeiffer, Paul E

    1979-01-01

    Using the Kolmogorov model, this intermediate-level text discusses random variables, probability distributions, mathematical expectation, random processes, and more. For advanced undergraduate students of science, engineering, or math. Includes problems with answers and six appendixes. 1965 edition.

  6. Probability and Bayesian statistics

    CERN Document Server

    1987-01-01

    This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics" which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985, the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are published especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability across psychological aspects of formulating subjective probability statements, abstract measure theoretical considerations, contributions to theoretical statistics an...

  7. Probability and Statistical Inference

    OpenAIRE

    Prosper, Harrison B.

    2006-01-01

    These lectures introduce key concepts in probability and statistical inference at a level suitable for graduate students in particle physics. Our goal is to paint as vivid a picture as possible of the concepts covered.

  8. Probabilities in physics

    CERN Document Server

    Hartmann, Stephan

    2011-01-01

    Many results of modern physics--those of quantum mechanics, for instance--come in a probabilistic guise. But what do probabilistic statements in physics mean? Are probabilities matters of objective fact and part of the furniture of the world, as objectivists think? Or do they only express ignorance or belief, as Bayesians suggest? And how are probabilistic hypotheses justified and supported by empirical evidence? Finally, what does the probabilistic nature of physics imply for our understanding of the world? This volume is the first to provide a philosophical appraisal of probabilities in all of physics. Its main aim is to make sense of probabilistic statements as they occur in the various physical theories and models and to provide a plausible epistemology and metaphysics of probabilities. The essays collected here consider statistical physics, probabilistic modelling, and quantum mechanics, and critically assess the merits and disadvantages of objectivist and subjectivist views of probabilities in these fie...

  9. Probability an introduction

    CERN Document Server

    Grimmett, Geoffrey

    2014-01-01

    Probability is an area of mathematics of tremendous contemporary importance across all aspects of human endeavour. This book is a compact account of the basic features of probability and random processes at the level of first and second year mathematics undergraduates and Masters' students in cognate fields. It is suitable for a first course in probability, plus a follow-up course in random processes including Markov chains. A special feature is the authors' attention to rigorous mathematics: not everything is rigorous, but the need for rigour is explained at difficult junctures. The text is enriched by simple exercises, together with problems (with very brief hints) many of which are taken from final examinations at Cambridge and Oxford. The first eight chapters form a course in basic probability, being an account of events, random variables, and distributions - discrete and continuous random variables are treated separately - together with simple versions of the law of large numbers and the central limit th...

  10. Probability in physics

    CERN Document Server

    Hemmo, Meir

    2012-01-01

    What is the role and meaning of probability in physical theory, in particular in two of the most successful theories of our age, quantum physics and statistical mechanics? Laws once conceived as universal and deterministic, such as Newton's laws of motion, or the second law of thermodynamics, are replaced in these theories by inherently probabilistic laws. This collection of essays by some of the world's foremost experts presents an in-depth analysis of the meaning of probability in contemporary physics. Among the questions addressed are: How are probabilities defined? Are they objective or subjective? What is their explanatory value? What are the differences between quantum and classical probabilities? The result is an informative and thought-provoking book for the scientifically inquisitive.

  11. Probability in quantum mechanics

    Directory of Open Access Journals (Sweden)

    J. G. Gilson

    1982-01-01

    Full Text Available By using a fluid theory which is an alternative to quantum theory but from which the latter can be deduced exactly, the long-standing problem of how quantum mechanics is related to stochastic processes is studied. It can be seen how the Schrödinger probability density has a relationship to time spent on small sections of an orbit, just as the probability density has in some classical contexts.
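
    The "time spent on small sections of an orbit" picture can be made concrete with a standard classical example (not taken from the paper itself): for a periodic orbit of period T, the probability of finding the particle in [x, x+dx] is proportional to the time spent there, and for a harmonic oscillator x(t) = A sin(ωt) this gives the familiar arcsine density:

\[
p(x)\,dx = \frac{2\,dt}{T} = \frac{2\,dx}{T\,|v(x)|}, \qquad v(x) = \omega\sqrt{A^2 - x^2} \;\Rightarrow\; p(x) = \frac{1}{\pi\sqrt{A^2 - x^2}}, \quad |x| < A.
\]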

  12. Quantum computing and probability.

    Science.gov (United States)

    Ferry, David K

    2009-11-25

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.
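
    One concrete way probability enters, as the article discusses informally, is the Born rule: measurement outcome probabilities are the squared magnitudes of complex amplitudes. A minimal single-qubit sketch (illustrative, not tied to the article's examples):

```python
# Born rule: measurement probabilities are squared amplitude magnitudes.
import math, random

# State (|0> + |1>)/sqrt(2), written as complex amplitudes
state = [1 / math.sqrt(2), 1 / math.sqrt(2)]

probs = [abs(a) ** 2 for a in state]          # p_i = |a_i|^2
assert abs(sum(probs) - 1.0) < 1e-12          # normalization check

rng = random.Random(42)
counts = [0, 0]
for _ in range(10_000):                       # repeated measurements
    counts[rng.random() < probs[1]] += 1
print(probs, counts)                          # ~[0.5, 0.5], roughly even counts
```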

  13. Quantum computing and probability

    International Nuclear Information System (INIS)

    Ferry, David K

    2009-01-01

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction. (viewpoint)

  14. PARP-1 depletion in combination with carbon ion exposure significantly reduces MMPs activity and overall increases TIMPs expression in cultured HeLa cells

    International Nuclear Information System (INIS)

    Ghorai, Atanu; Sarma, Asitikantha; Chowdhury, Priyanka; Ghosh, Utpal

    2016-01-01

    Hadron therapy is an innovative technique in which cancer cells are precisely killed, leaving surrounding healthy cells least affected, by high linear energy transfer (LET) radiation such as a carbon ion beam. The anti-metastatic effect of carbon ion exposure attracts investigators to the field of hadron biology, although details remain poorly understood. Poly(ADP-ribose) polymerase-1 (PARP-1) inhibitors are well-known radiosensitizers, and several PARP-1 inhibitors are in clinical trials. Our previous studies showed that PARP-1 depletion makes cells more radiosensitive towards carbon ions than gamma rays. The purpose of the present study was to investigate the combined effects of PARP-1 inhibition and carbon ion exposure on the metastatic properties of HeLa cells. Activities of matrix metalloproteinases-2 and -9 (MMP-2, MMP-9) were measured using gelatin zymography after 85 MeV carbon ion exposure or gamma irradiation (0-4 Gy) to compare metastatic potential between PARP-1 knock-down (HsiI) and control cells (H-vector - HeLa transfected with the vector without the shRNA construct). Expression of MMP-2, MMP-9 and the tissue inhibitors of MMPs, TIMP-1, TIMP-2 and TIMP-3, was checked by immunofluorescence and western blot. Cell death by trypan blue, apoptosis and autophagy induction were studied after carbon ion exposure in each cell type. The data were analyzed using one-way ANOVA and a two-tailed paired-samples t-test. PARP-1 silencing significantly reduced MMP-2 and MMP-9 activities, and carbon ion exposure further diminished their activities to less than 3% of the control H-vector. On the contrary, gamma radiation enhanced both MMP-2 and MMP-9 activities in H-vector but not in HsiI cells. The expression of MMP-2 and MMP-9 in H-vector and HsiI showed different patterns after carbon ion exposure. All three TIMPs were increased in HsiI, whereas only TIMP-1 was up-regulated in H-vector after irradiation. Notably, the expression of all TIMPs was significantly higher in HsiI than in H-vector at 4 Gy. Apoptosis was

  15. Treatment of rheumatoid arthritis with tumor necrosis factor inhibitors may predispose to significant increase in tuberculosis risk: a multicenter active-surveillance report.

    Science.gov (United States)

    Gómez-Reino, Juan J; Carmona, Loreto; Valverde, Vicente Rodríguez; Mola, Emilio Martín; Montero, Maria Dolores

    2003-08-01

    The long-term safety of therapeutic agents that neutralize tumor necrosis factor (TNF) is uncertain. Recent evidence based on spontaneous reporting shows an association with active tuberculosis (TB). We undertook this study to determine and describe the long-term safety of 2 of these agents, infliximab and etanercept, in rheumatic diseases, based on a national active-surveillance system following the commercialization of the drugs. We analyzed the safety data actively collected in the BIOBADASER (Base de Datos de Productos Biológicos de la Sociedad Española de Reumatología) database, which was launched in February 2000 by the Spanish Society of Rheumatology. For the estimation of TB risk, the annual incidence rate in patients treated with these agents was compared with the background rate and with the rate in a cohort of patients with rheumatoid arthritis (RA) assembled before the era of anti-TNF treatment. Seventy-one participating centers sent data on 1,578 treatments with infliximab (86%) or etanercept (14%) in 1,540 patients. Drug survival rates (reported as the cumulative percentage of patients still receiving medication) for infliximab and etanercept pooled together were 85% and 81% at 1 year and 2 years, respectively. Instances of discontinuation were essentially due to adverse events. Seventeen cases of TB were found in patients treated with infliximab. The estimated incidence of TB associated with infliximab in RA patients was 1,893 per 100,000 in the year 2000 and 1,113 per 100,000 in the year 2001. These findings represent a significantly increased risk compared with background rates. In the first 5 months of 2002, after official guidelines were established for TB prevention in patients treated with biologics, only 1 new TB case was registered (in January). Therapy with infliximab is associated with an increased risk of active TB. Proper measures are needed to prevent and manage this adverse event.

  16. Dephytinisation with Intrinsic Wheat Phytase and Iron Fortification Significantly Increase Iron Absorption from Fonio (Digitaria exilis) Meals in West African Women

    Science.gov (United States)

    Moretti, Diego; Schuth, Stephan; Egli, Ines; Zimmermann, Michael B.; Brouwer, Inge D.

    2013-01-01

    Low iron and high phytic acid content make fonio-based meals a poor source of bioavailable iron. We investigated phytic acid degradation in fonio porridge using whole-grain cereals as a phytase source, and its effect on iron bioavailability when added to iron-fortified fonio meals. Grains, nuts and seeds collected in Mali markets were screened for phytic acid and phytase activity. We performed an iron absorption study in Beninese women (n = 16), using non-dephytinised fonio porridge (FFP) and dephytinised fonio porridge (FWFP; 75% fonio-25% wheat), each fortified with 57Fe- or 58Fe-labeled FeSO4. Iron absorption was quantified by measuring the erythrocyte incorporation of stable iron isotopes. Phytic acid varied from 0.39 (bambara nut) to 4.26 g/100 g DM (pumpkin seed), with oilseed values higher than those of grains and nuts. Phytase activity ranged from 0.17±1.61 (fonio) to 2.9±1.3 phytase units (PU) per g (whole wheat). Phytic acid was almost completely degraded in FWFP after 60 min of incubation (pH≈5.0, 50°C). Phytate:iron molar ratios decreased from 23.7:1 in FFP to 2.7:1 in FWFP. Iron fortification further reduced the phytate:iron molar ratio to 1.9:1 in FFP and 0.3:1 in FWFP, respectively. Geometric mean (95% CI) iron absorption significantly increased from 2.6% (0.8-7.8) in FFP to 8.3% (3.8-17.9) in FWFP. Dephytinisation with intrinsic wheat phytase thus increased fractional iron absorption 3.2 times, suggesting it could be a possible strategy to decrease phytic acid in cereal-based porridges. PMID:24124445
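
    As an illustration of the phytate:iron molar ratios quoted above, a minimal sketch assuming standard molar masses (the function name and example amounts are invented for illustration, not the study's data):

```python
# Phytate:iron molar ratio, the standard predictor of iron absorption
# inhibition used in the abstract. Molar masses: phytic acid ~660.04 g/mol,
# iron ~55.85 g/mol.
M_PHYTIC_ACID = 660.04  # g/mol
M_IRON = 55.85          # g/mol

def phytate_iron_molar_ratio(phytic_acid_mg: float, iron_mg: float) -> float:
    """Moles of phytic acid per mole of iron in a meal."""
    return (phytic_acid_mg / M_PHYTIC_ACID) / (iron_mg / M_IRON)

# Hypothetical meal: 800 mg phytic acid, 4 mg iron -> ratio ~16.9
print(round(phytate_iron_molar_ratio(800.0, 4.0), 1))
```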

  17. The perception of probability.

    Science.gov (United States)

    Gallistel, C R; Krishan, Monika; Liu, Ye; Miller, Reilly; Latham, Peter E

    2014-01-01

    We present a computational model to explain the results from experiments in which subjects estimate the hidden probability parameter of a stepwise nonstationary Bernoulli process outcome by outcome. The model captures the following results qualitatively and quantitatively, with only 2 free parameters: (a) Subjects do not update their estimate after each outcome; they step from one estimate to another at irregular intervals. (b) The joint distribution of step widths and heights cannot be explained on the assumption that a threshold amount of change must be exceeded in order for them to indicate a change in their perception. (c) The mapping of observed probability to the median perceived probability is the identity function over the full range of probabilities. (d) Precision (how close estimates are to the best possible estimate) is good and constant over the full range. (e) Subjects quickly detect substantial changes in the hidden probability parameter. (f) The perceived probability sometimes changes dramatically from one observation to the next. (g) Subjects sometimes have second thoughts about a previous change perception, after observing further outcomes. (h) The frequency with which they perceive changes moves in the direction of the true frequency over sessions. (Explaining this finding requires 2 additional parametric assumptions.) The model treats the perception of the current probability as a by-product of the construction of a compact encoding of the experienced sequence in terms of its change points. It illustrates the why and the how of intermittent Bayesian belief updating and retrospective revision in simple perception. It suggests a reinterpretation of findings in the recent literature on the neurobiology of decision making. (PsycINFO Database Record (c) 2014 APA, all rights reserved).
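
    For contrast with the authors' change-point model, a minimal trial-by-trial baseline is the conjugate Beta-Bernoulli tracker sketched below; it smooths continuously rather than stepping, which is exactly the behaviour that result (a) argues against (this is not the authors' model):

```python
# Beta-Bernoulli tracking of a hidden probability parameter: after each
# outcome x in {0, 1}, update the Beta(a, b) posterior and report its mean.
def track_p(outcomes, a=1.0, b=1.0):
    estimates = []
    for x in outcomes:
        a, b = a + x, b + (1 - x)        # conjugate update
        estimates.append(a / (a + b))    # posterior mean of p
    return estimates

print(track_p([1, 1, 0, 1, 0, 0, 0]))    # drifts smoothly, never "steps"
```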

  18. Significantly increased detection rate of drugs of abuse in urine following the introduction of new German driving licence re-granting guidelines.

    Science.gov (United States)

    Agius, Ronald; Nadulski, Thomas; Kahl, Hans-Gerhard; Dufaux, Bertin

    2012-02-10

    In this paper we present the first assessment of the new German driving licence re-granting medical and psychological assessment (MPA) guidelines, comparing over 3500 urine samples tested under the old MPA cut-offs with over 5000 samples tested under the new MPA cut-offs. Since the enzyme multiplied immunoassay technique (EMIT) used previously was not sensitive enough to screen for drugs at the low concentrations suggested by the new MPA guidelines, enzyme-linked immunosorbent assay (ELISA) screening kits were used to screen for the drugs of abuse at the new MPA cut-offs. The comparison revealed significantly increased detection rates of drug use or exposure during the rehabilitation period: 1.61, 2.33, 3.33, and 7 times higher for 11-nor-delta-9-tetrahydrocannabinol-9-carboxylic acid (THC-COOH), morphine, benzoylecgonine and amphetamine, respectively. The present MPA guidelines appear more effective at detecting non-abstinence from drugs of abuse, and hence at detecting drivers who do not yet fulfil the MPA requirements to regain their revoked driving licence. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  19. The Oral Bioavailability of Trans-Resveratrol from a Grapevine-Shoot Extract in Healthy Humans is Significantly Increased by Micellar Solubilization.

    Science.gov (United States)

    Calvo-Castro, Laura A; Schiborr, Christina; David, Franziska; Ehrt, Heidi; Voggel, Jenny; Sus, Nadine; Behnam, Dariush; Bosy-Westphal, Anja; Frank, Jan

    2018-05-01

    Grapevine-shoot extract Vineatrol30 contains abundant resveratrol monomers and oligomers with health-promoting potential. However, the oral bioavailability of these compounds in humans is low (<1-2%). The aim of this study was to improve the oral bioavailability of resveratrol from vineatrol by micellar solubilization. Twelve healthy volunteers (six women, six men) randomly ingested a single dose of 500 mg vineatrol (30 mg trans-resveratrol, 75 mg trans-ε-viniferin) as native powder or liquid micelles. Plasma and urine were collected at baseline and over 24 h after intake. Resveratrol and viniferin were analyzed by HPLC. The area under the plasma concentration-time curve (AUC) and the mean maximum plasma trans-resveratrol concentration were 5.0-fold and 10.6-fold higher, respectively, after micellar supplementation relative to the native powder. However, no detectable amounts of trans-ε-viniferin were found in either plasma or urine. The transepithelial permeability of trans-resveratrol and trans-ε-viniferin across differentiated Caco-2 monolayers was consistent with the fractions absorbed in vivo. The oral bioavailability of trans-resveratrol from the grapevine-shoot extract Vineatrol30 was significantly increased using a liquid micellar formulation, without any treatment-related adverse effects, making it a suitable system for improved supplementation of trans-resveratrol. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
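
    The AUC figures above come from standard pharmacokinetic practice; a generic sketch using the trapezoidal rule (times and concentrations are invented, not the study's data):

```python
# AUC from a plasma concentration-time profile via the trapezoidal rule.
import numpy as np

t = np.array([0, 0.5, 1, 2, 4, 8, 12, 24], dtype=float)    # sampling times [h]
c = np.array([0, 35, 60, 48, 30, 14, 6, 1], dtype=float)   # conc. [ng/mL], made up

auc = np.trapz(c, t)                                        # [ng*h/mL]
print(f"AUC(0-24 h) = {auc:.1f} ng*h/mL; Cmax = {c.max():.0f} ng/mL")
```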

  20. Irreversibility and conditional probability

    International Nuclear Information System (INIS)

    Stuart, C.I.J.M.

    1989-01-01

    The mathematical entropy - unlike physical entropy - is simply a measure of uniformity for probability distributions in general. So understood, conditional entropies have the same logical structure as conditional probabilities. If, as is sometimes supposed, conditional probabilities are time-reversible, then so are conditional entropies and, paradoxically, both then share this symmetry with physical equations of motion. The paradox is, of course, that probabilities yield a direction to time both in statistical mechanics and quantum mechanics, while the equations of motion do not. The supposed time-reversibility of both conditionals seems also to involve a form of retrocausality that is related to, but possibly not the same as, that described by Costa de Beauregard. The retrocausality is paradoxically at odds with the generally presumed irreversibility of the quantum mechanical measurement process. Further paradox emerges if the supposed time-reversibility of the conditionals is linked with the idea that the thermodynamic entropy is the same thing as 'missing information', since this confounds the thermodynamic and mathematical entropies. However, it is shown that irreversibility is a formal consequence of conditional entropies and, hence, of conditional probabilities also. 8 refs. (Author)
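
    For reference, the textbook definitions behind the parallel drawn above (standard formulas, not taken from the paper itself); conditional entropy has the same compositional form as conditional probability:

```latex
% Conditional probability and conditional entropy (standard definitions):
p(x \mid y) = \frac{p(x,y)}{p(y)}, \qquad
H(X \mid Y) = -\sum_{x,y} p(x,y)\,\log p(x \mid y) = H(X,Y) - H(Y)
```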

  1. The pleasures of probability

    CERN Document Server

    Isaac, Richard

    1995-01-01

    The ideas of probability are all around us. Lotteries, casino gambling, the almost non-stop polling which seems to mold public policy more and more--these are a few of the areas where principles of probability impinge in a direct way on the lives and fortunes of the general public. At a more removed level there is modern science which uses probability and its offshoots like statistics and the theory of random processes to build mathematical descriptions of the real world. In fact, twentieth-century physics, in embracing quantum mechanics, has a world view that is at its core probabilistic in nature, contrary to the deterministic one of classical physics. In addition to all this muscular evidence of the importance of probability ideas it should also be said that probability can be lots of fun. It is a subject where you can start thinking about amusing, interesting, and often difficult problems with very little mathematical background. In this book, I wanted to introduce a reader with at least a fairl...

  2. Experimental Probability in Elementary School

    Science.gov (United States)

    Andrew, Lane

    2009-01-01

    Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.

  3. Improving Ranking Using Quantum Probability

    OpenAIRE

    Melucci, Massimo

    2011-01-01

    The paper shows that ranking information units by quantum probability differs from ranking them by classical probability, provided the same data are used for parameter estimation. As probability of detection (also known as recall or power) and probability of false alarm (also known as fallout or size) measure the quality of ranking, we point out and show that ranking by quantum probability yields higher probability of detection than ranking by classical probability provided a given probability of ...

  4. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2010-01-01

    This paper establishes that every random utility discrete choice model (RUM) has a representation that can be characterized by a choice-probability generating function (CPGF) with specific properties, and that every function with these specific properties is consistent with a RUM. The choice probabilities from the RUM are obtained from the gradient of the CPGF. Mixtures of RUM are characterized by logarithmic mixtures of their associated CPGF. The paper relates CPGF to multivariate extreme value distributions, and reviews and extends methods for constructing generating functions for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended to competing risk survival models.
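
    A concrete special case, assuming the canonical multinomial logit: its CPGF is the log-sum-exp of the utilities, and the choice probabilities are its gradient (the softmax). A quick numerical check (the paper's exact CPGF conventions, e.g. normalizations, may differ):

```python
import numpy as np

def cpgf(v):
    """G(v) = log sum_j exp(v_j), the multinomial-logit generating function."""
    return np.log(np.sum(np.exp(v)))

def choice_probs(v):
    """Gradient of G, i.e. the softmax choice probabilities."""
    e = np.exp(v - v.max())          # shift for numerical stability
    return e / e.sum()

v = np.array([1.0, 0.5, -0.2])       # hypothetical utilities
eps = 1e-6
grad = np.array([(cpgf(v + eps * np.eye(3)[i]) - cpgf(v)) / eps
                 for i in range(3)]) # finite-difference gradient of G
print(np.allclose(grad, choice_probs(v), atol=1e-4))  # True
```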

  5. Probability and stochastic modeling

    CERN Document Server

    Rotar, Vladimir I

    2012-01-01

    Basic Notions: Sample Space and Events; Probabilities; Counting Techniques. Independence and Conditional Probability: Independence; Conditioning; The Borel-Cantelli Theorem. Discrete Random Variables: Random Variables and Vectors; Expected Value; Variance and Other Moments; Inequalities for Deviations; Some Basic Distributions; Convergence of Random Variables; The Law of Large Numbers; Conditional Expectation. Generating Functions, Branching Processes, Random Walk Revisited: Branching Processes; Generating Functions; Branching Processes Revisited; More on Random Walk. Markov Chains: Definitions and Examples; Probability Distributions of Markov Chains; The First Step Analysis; Passage Times; Variables Defined on a Markov Chain; Ergodicity and Stationary Distributions; A Classification of States and Ergodicity. Continuous Random Variables: Continuous Distributions; Some Basic Distributions; Continuous Multivariate Distributions; Sums of Independent Random Variables; Conditional Distributions and Expectations; Distributions in the General Case; Simulation; Distribution F...

  6. Collision Probability Analysis

    DEFF Research Database (Denmark)

    Hansen, Peter Friis; Pedersen, Preben Terndrup

    1998-01-01

    It is the purpose of this report to apply a rational model for the prediction of ship-ship collision probabilities as a function of the ship and crew characteristics and the navigational environment, for MS Dextra sailing on a route between Cadiz and the Canary Islands. The most important ship and crew characteristics are: ship speed, ship manoeuvrability, the layout of the navigational bridge, the radar system, the number and the training of navigators, the presence of a look-out etc. The main parameters affecting the navigational environment are ship traffic density, probability distributions of wind speeds ... probability, i.e. a study of the navigator's role in resolving critical situations, a causation factor is derived as a second step. The report documents the first step in a probabilistic collision damage analysis. Future work will include calculation of the energy released for crushing of structures giving...

  7. Estimating Subjective Probabilities

    DEFF Research Database (Denmark)

    Andersen, Steffen; Fountain, John; Harrison, Glenn W.

    2014-01-01

    either construct elicitation mechanisms that control for risk aversion, or construct elicitation mechanisms which undertake 'calibrating adjustments' to elicited reports. We illustrate how the joint estimation of risk attitudes and subjective probabilities can provide the calibration adjustments that theory calls for. We illustrate this approach using data from a controlled experiment with real monetary consequences to the subjects. This allows the observer to make inferences about the latent subjective probability, under virtually any well-specified model of choice under subjective risk, while still

  8. Introduction to imprecise probabilities

    CERN Document Server

    Augustin, Thomas; de Cooman, Gert; Troffaes, Matthias C M

    2014-01-01

    In recent years, the theory has become widely accepted and has been further developed, but a detailed introduction is needed in order to make the material available and accessible to a wide audience. This will be the first book providing such an introduction, covering core theory and recent developments which can be applied to many application areas. All authors of individual chapters are leading researchers on the specific topics, assuring high quality and up-to-date contents. An Introduction to Imprecise Probabilities provides a comprehensive introduction to imprecise probabilities, includin

  9. Classic Problems of Probability

    CERN Document Server

    Gorroochurn, Prakash

    2012-01-01

    "A great book, one that I will certainly add to my personal library."—Paul J. Nahin, Professor Emeritus of Electrical Engineering, University of New Hampshire Classic Problems of Probability presents a lively account of the most intriguing aspects of statistics. The book features a large collection of more than thirty classic probability problems which have been carefully selected for their interesting history, the way they have shaped the field, and their counterintuitive nature. From Cardano's 1564 Games of Chance to Jacob Bernoulli's 1713 Golden Theorem to Parrondo's 1996 Perplexin

  10. Modeling the probability distribution of peak discharge for infiltrating hillslopes

    Science.gov (United States)

    Baiamonte, Giorgio; Singh, Vijay P.

    2017-07-01

    Hillslope response plays a fundamental role in the prediction of peak discharge at the basin outlet. The peak discharge for the critical duration of rainfall and its probability distribution are needed for designing urban infrastructure facilities. This study derives the probability distribution, denoted as the GABS model, by coupling three models: (1) the Green-Ampt model for computing infiltration, (2) the kinematic wave model for computing the discharge hydrograph from the hillslope, and (3) the intensity-duration-frequency (IDF) model for computing the design rainfall intensity. The Hortonian mechanism for runoff generation is employed for computing the surface runoff hydrograph. Since the antecedent soil moisture condition (ASMC) significantly affects the rate of infiltration, its effect on the probability distribution of peak discharge is investigated. Application to a watershed in Sicily, Italy, shows that as the probability increases, the expected effect of ASMC in increasing the maximum discharge diminishes. Only for low values of probability is the critical duration of rainfall influenced by ASMC, whereas its effect on the peak discharge appears small for any probability. For a given set of parameters, the derived probability distribution of peak discharge is fitted well by the gamma distribution. Finally, an application to a small watershed, with the aim of testing whether rational runoff coefficient tables can be arranged in advance for use with the rational method, and a comparison between peak discharges obtained by the GABS model and those measured in an experimental flume for a loamy-sand soil, were carried out.
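
    A minimal sketch of the Green-Ampt component of such a coupling, assuming ponded conditions and illustrative parameter values (not those of the Sicilian watershed):

```python
# Green-Ampt cumulative infiltration F(t) under ponding satisfies the implicit
# relation F = K*t + psi*dtheta*ln(1 + F/(psi*dtheta)); solve by fixed-point
# iteration (the map is a contraction for F > 0).
import math

def green_ampt_F(t_hr, K=1.0, psi=10.0, dtheta=0.3, iters=60):
    """K [cm/h] conductivity, psi [cm] wetting-front suction, dtheta [-] deficit."""
    pd = psi * dtheta
    F = K * t_hr                      # initial guess
    for _ in range(iters):
        F = K * t_hr + pd * math.log(1.0 + F / pd)
    return F                          # cumulative infiltration [cm]

def infiltration_rate(F, K=1.0, psi=10.0, dtheta=0.3):
    return K * (1.0 + psi * dtheta / F)   # f = K * (1 + psi*dtheta/F) [cm/h]

F2 = green_ampt_F(2.0)
print(f"F(2 h) = {F2:.2f} cm, f = {infiltration_rate(F2):.2f} cm/h")
```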

  11. Counterexamples in probability

    CERN Document Server

    Stoyanov, Jordan M

    2013-01-01

    While most mathematical examples illustrate the truth of a statement, counterexamples demonstrate a statement's falsity. Enjoyable topics of study, counterexamples are valuable tools for teaching and learning. The definitive book on the subject in regards to probability, this third edition features the author's revisions and corrections plus a substantial new appendix.

  12. Epistemology and Probability

    CERN Document Server

    Plotnitsky, Arkady

    2010-01-01

    Offers an exploration of the relationships between epistemology and probability in the work of Niels Bohr, Werner Heisenberg, and Erwin Schrodinger; in quantum mechanics; and in modern physics. This book considers the implications of these relationships and of quantum theory for our understanding of the nature of thinking and knowledge in general

  13. Transition probabilities for atoms

    International Nuclear Information System (INIS)

    Kim, Y.K.

    1980-01-01

    Current status of advanced theoretical methods for transition probabilities for atoms and ions is discussed. An experiment on the f values of the resonance transitions of the Kr and Xe isoelectronic sequences is suggested as a test for the theoretical methods

  14. Future southcentral US wildfire probability due to climate change

    Science.gov (United States)

    Stambaugh, Michael C.; Guyette, Richard P.; Stroh, Esther D.; Struckhoff, Matthew A.; Whittier, Joanna B.

    2018-01-01

    Globally, changing fire regimes due to climate is one of the greatest threats to ecosystems and society. In this paper, we present projections of future fire probability for the southcentral USA using downscaled climate projections and the Physical Chemistry Fire Frequency Model (PC2FM). Future fire probability is projected to both increase and decrease across the study region of Oklahoma, New Mexico, and Texas. Among all end-of-century projections, changes in fire probability (CFPs) range from −51% to +240%. The greatest absolute increases in fire probability are shown for areas within the range of approximately 75 to 160 cm mean annual precipitation (MAP), regardless of climate model. Although fire is likely to become more frequent across the southcentral USA, spatial patterns may remain similar unless significant increases in precipitation occur, whereby more extensive areas with increased fire probability are predicted. Perhaps one of the most important results is the identification of climate changes at which the fire probability response (+, −) may reverse (i.e., tipping points). Fire regimes of southcentral US ecosystems occur in a geographic transition zone from reactant- to reaction-limited conditions, potentially making them uniquely responsive to different scenarios of temperature and precipitation changes. Identification and description of these conditions may help anticipate fire regime changes that will affect human health, agriculture, species conservation, and nutrient and water cycling.

  15. Single-source dual-energy spectral multidetector CT of pancreatic adenocarcinoma: Optimization of energy level viewing significantly increases lesion contrast

    International Nuclear Information System (INIS)

    Patel, B.N.; Thomas, J.V.; Lockhart, M.E.; Berland, L.L.; Morgan, D.E.

    2013-01-01

    V was 31 ± 25 HU (p = 0.007). Conclusion: Significantly increased pancreatic lesion contrast was noted at lower viewing energies using spectral MDCT. Individual patient CNR-optimized energy level images have the potential to improve lesion conspicuity.

  16. Single-source dual-energy spectral multidetector CT of pancreatic adenocarcinoma: optimization of energy level viewing significantly increases lesion contrast.

    Science.gov (United States)

    Patel, B N; Thomas, J V; Lockhart, M E; Berland, L L; Morgan, D E

    2013-02-01

    .007). Significantly increased pancreatic lesion contrast was noted at lower viewing energies using spectral MDCT. Individual patient CNR-optimized energy level images have the potential to improve lesion conspicuity. Copyright © 2012 The Royal College of Radiologists. Published by Elsevier Ltd. All rights reserved.

  17. Evaluation of nuclear power plant component failure probability and core damage probability using simplified PSA model

    International Nuclear Information System (INIS)

    Shimada, Yoshio

    2000-01-01

    It is anticipated that changing the frequency of surveillance tests, preventive maintenance or parts replacement of safety-related components may change component failure probabilities and, in turn, the core damage probability, and that the change differs depending on the initiating event frequency and the component type. This study assessed the change in core damage probability using a simplified PSA model capable of calculating core damage probability in a short time period, developed by the US NRC to process accident sequence precursors, when various components' failure probabilities are varied between 0 and 1, or when Japanese or American initiating event frequency data are used. The analysis showed the following. (1) The frequency of surveillance tests, preventive maintenance or parts replacement of motor-driven pumps (high pressure injection pumps, residual heat removal pumps, auxiliary feedwater pumps) should be changed with care, since the change in core damage probability is large when the base failure probability moves in the increasing direction. (2) Core damage probability is insensitive to changes in surveillance test frequency for motor-operated valves and the turbine-driven auxiliary feedwater pump, since the change in core damage probability is small when their failure probabilities change by about one order of magnitude. (3) The change in core damage probability is small when Japanese failure probability data are applied to the emergency diesel generator, even if the failure probability changes by one order of magnitude from the base value; when American failure probability data are applied, however, the increase in core damage probability is large when the failure probability moves in the increasing direction. Therefore, when Japanese failure probability data are applied, core damage probability is insensitive to changes in surveillance test frequency etc. (author)
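
    A toy illustration of the kind of sensitivity sweep described, assuming a drastically simplified two-train model (this is not the NRC accident sequence precursor model; all numbers are invented):

```python
# Core damage here requires an initiating event AND failure of two redundant
# trains; sweep one component's failure probability from ~0 to 1 and watch
# the core damage frequency respond.
def core_damage_freq(p_component, init_freq=1e-2, p_other_train=1e-3):
    # both trains must fail, given the initiating event
    return init_freq * p_component * p_other_train

for p in (1e-4, 1e-3, 1e-2, 1e-1, 1.0):
    print(f"p_fail={p:8.0e}  core damage freq={core_damage_freq(p):.2e} /yr")
```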

  18. Negative probability in the framework of combined probability

    OpenAIRE

    Burgin, Mark

    2013-01-01

    Negative probability has found diverse applications in theoretical physics. Thus, construction of sound and rigorous mathematical foundations for negative probability is important for physics. There are different axiomatizations of conventional probability. So, it is natural that negative probability also has different axiomatic frameworks. In the previous publications (Burgin, 2009; 2010), negative probability was mathematically formalized and rigorously interpreted in the context of extende...

  19. Dynamic SEP event probability forecasts

    Science.gov (United States)

    Kahler, S. W.; Ling, A.

    2015-10-01

    The forecasting of solar energetic particle (SEP) event probabilities at Earth has been based primarily on the estimates of magnetic free energy in active regions and on the observations of peak fluxes and fluences of large (≥ M2) solar X-ray flares. These forecasts are typically issued for the next 24 h or with no definite expiration time, which can be deficient for time-critical operations when no SEP event appears following a large X-ray flare. It is therefore important to decrease the event probability forecast with time as a SEP event fails to appear. We use the NOAA listing of major (≥10 pfu) SEP events from 1976 to 2014 to plot the delay times from X-ray peaks to SEP threshold onsets as a function of solar source longitude. An algorithm is derived to decrease the SEP event probabilities with time when no event is observed to reach the 10 pfu threshold. In addition, we use known SEP event size distributions to modify probability forecasts when SEP intensity increases occur below the 10 pfu event threshold. An algorithm to provide a dynamic SEP event forecast, Pd, for both situations of SEP intensities following a large flare is derived.
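
    A sketch of the dynamic-forecast idea, assuming a log-normal onset-delay distribution purely for illustration (the paper derives its algorithm from the observed NOAA delay times; all parameters below are hypothetical):

```python
# Bayesian decay of an SEP event probability: if no event has been seen up to
# time t after the flare, update the initial forecast p0 using the survival
# function S(t) of the onset-delay distribution.
import math

def survival_lognorm(t_hr, mu=1.0, sigma=0.8):
    """P(onset delay > t) for a log-normal delay distribution."""
    if t_hr <= 0:
        return 1.0
    z = (math.log(t_hr) - mu) / (sigma * math.sqrt(2.0))
    return 0.5 * math.erfc(z)

def dynamic_sep_probability(p0, t_since_flare_hr):
    """P(event still to come | none observed up to t)."""
    s = survival_lognorm(t_since_flare_hr)
    return p0 * s / (p0 * s + (1.0 - p0))

for t in (0, 6, 12, 24, 48):
    print(f"t = {t:2d} h  Pd = {dynamic_sep_probability(0.4, t):.3f}")
```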

  20. Contributions to quantum probability

    International Nuclear Information System (INIS)

    Fritz, Tobias

    2010-01-01

    Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that, generally, none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels 'possible to occur' or 'impossible to occur' to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a finite set can occur as the outcome

  1. Bayesian Probability Theory

    Science.gov (United States)

    von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo

    2014-06-01

    Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.

  2. Contributions to quantum probability

    Energy Technology Data Exchange (ETDEWEB)

    Fritz, Tobias

    2010-06-25

    Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that, generally, none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels 'possible to occur' or 'impossible to occur' to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a

  3. Waste Package Misload Probability

    International Nuclear Information System (INIS)

    Knudsen, J.K.

    2001-01-01

    The objective of this calculation is to calculate the probability of occurrence of fuel assembly (FA) misloads (i.e., an FA placed in the wrong location) and of FA damage during FA movements. The scope of this calculation is provided by the information obtained from the Framatome ANP 2001a report. The first step in this calculation is to categorize each fuel-handling event that occurred at nuclear power plants. The different categories are based on FAs being damaged or misloaded. The next step is to determine the total number of FAs involved in the event. Using this information, a probability of occurrence is calculated for FA misload and FA damage events. This calculation is an expansion of preliminary work performed by Framatome ANP 2001a
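
    A sketch of the resulting frequency estimate, with a rule-of-three upper bound for the zero-event case (the counts below are invented, not Framatome's data):

```python
# Probability of occurrence per FA movement, estimated from event counts.
def event_probability(events: int, fa_movements: int) -> float:
    if events == 0:
        # "rule of three": approximate 95% upper bound when nothing observed
        return 3.0 / fa_movements
    return events / fa_movements

print(event_probability(4, 250_000))   # hypothetical misloads per movement
print(event_probability(0, 250_000))   # upper bound with no observed events
```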

  4. Probability theory and applications

    CERN Document Server

    Hsu, Elton P

    1999-01-01

    This volume, with contributions by leading experts in the field, is a collection of lecture notes of the six minicourses given at the IAS/Park City Summer Mathematics Institute. It introduces advanced graduates and researchers in probability theory to several of the currently active research areas in the field. Each course is self-contained with references and contains basic materials and recent results. Topics include interacting particle systems, percolation theory, analysis on path and loop spaces, and mathematical finance. The volume gives a balanced overview of the current status of probability theory. An extensive bibliography for further study and research is included. This unique collection presents several important areas of current research and a valuable survey reflecting the diversity of the field.

  5. Paradoxes in probability theory

    CERN Document Server

    Eckhardt, William

    2013-01-01

    Paradoxes provide a vehicle for exposing misinterpretations and misapplications of accepted principles. This book discusses seven paradoxes surrounding probability theory.  Some remain the focus of controversy; others have allegedly been solved, however the accepted solutions are demonstrably incorrect. Each paradox is shown to rest on one or more fallacies.  Instead of the esoteric, idiosyncratic, and untested methods that have been brought to bear on these problems, the book invokes uncontroversial probability principles, acceptable both to frequentists and subjectivists. The philosophical disputation inspired by these paradoxes is shown to be misguided and unnecessary; for instance, startling claims concerning human destiny and the nature of reality are directly related to fallacious reasoning in a betting paradox, and a problem analyzed in philosophy journals is resolved by means of a computer program.

  6. Measurement uncertainty and probability

    CERN Document Server

    Willink, Robin

    2013-01-01

    A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.
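
    A minimal sketch of the classical 95 percent interval the book builds on, for the mean of repeated readings (hypothetical data; uses scipy for the Student-t quantile):

```python
import statistics
from scipy import stats

readings = [9.98, 10.02, 10.05, 9.97, 10.01]    # invented repeated measurements
n = len(readings)
mean = statistics.mean(readings)
s = statistics.stdev(readings)                   # sample standard deviation
t975 = stats.t.ppf(0.975, df=n - 1)              # two-sided 95% coverage factor
half_width = t975 * s / n**0.5
print(f"{mean:.3f} +/- {half_width:.3f} (95% interval)")
```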

  7. Model uncertainty and probability

    International Nuclear Information System (INIS)

    Parry, G.W.

    1994-01-01

    This paper discusses the issue of model uncertainty. The use of probability as a measure of an analyst's uncertainty as well as a means of describing random processes has caused some confusion, even though the two uses are representing different types of uncertainty with respect to modeling a system. The importance of maintaining the distinction between the two types is illustrated with a simple example

  8. Retrocausality and conditional probability

    International Nuclear Information System (INIS)

    Stuart, C.I.J.M.

    1989-01-01

    Costa de Beauregard has proposed that physical causality be identified with conditional probability. The proposal is shown to be vulnerable on two accounts. The first, though mathematically trivial, seems to be decisive so far as the current formulation of the proposal is concerned. The second lies in a physical inconsistency which seems to have its source in a Copenhagenlike disavowal of realism in quantum mechanics. 6 refs. (Author)

  9. Applied Problems and Use of Technology in an Aligned Way in Basic Courses in Probability and Statistics for Engineering Students--A Way to Enhance Understanding and Increase Motivation

    Science.gov (United States)

    Zetterqvist, Lena

    2017-01-01

    Researchers and teachers often recommend motivating exercises and use of mathematics or statistics software for the teaching of basic courses in probability and statistics. Our courses are given to large groups of engineering students at Lund Institute of Technology. We found that the mere existence of real-life data and technology in a course…

  10. Experimentally-induced immune activation in natural hosts of SIV induces significant increases in viral replication and CD4+ T cell depletion

    Energy Technology Data Exchange (ETDEWEB)

    Ribeiro, Ruy M [Los Alamos National Laboratory

    2008-01-01

    Chronically SIVagm-infected African green monkeys (AGMs) have a remarkably stable non-pathogenic disease course, with levels of immune activation in chronic SIVagm infection similar to those observed in uninfected monkeys and stable viral loads (VLs) for long periods of time. In vivo administration of lipopolysaccharide (LPS) or an IL-2/diphtheria toxin fusion protein (Ontak) to chronically SIVagm-infected AGMs triggered increases in immune activation and subsequently in viral replication and depletion of intestinal CD4+ T cells. Our study indicates that circulating microbial products can increase viral replication by inducing immune activation and increasing the number of viral target cells, thus demonstrating that immune activation and T cell proliferation are key factors in AIDS pathogenesis.

  11. Increased superior frontal gyrus activation during working memory processing in psychosis: Significant relation to cumulative antipsychotic medication and to negative symptoms.

    Science.gov (United States)

    Vogel, Tobias; Smieskova, Renata; Schmidt, André; Walter, Anna; Harrisberger, Fabienne; Eckert, Anne; Lang, Undine E; Riecher-Rössler, Anita; Graf, Marc; Borgwardt, Stefan

    2016-08-01

    Impairment in working memory (WM) is a core symptom in schizophrenia. However, little is known about how clinical features influence functional brain activity specific to WM processing during the development of first-episode psychosis (FEP) to schizophrenia (SZ). We compared functional WM-specific brain activity in FEP and SZ patients, including the effects of the duration of illness, psychopathological factors and antipsychotic medication. This was a cross-sectional study of male FEP (n=22) and SZ (n=20) patients performing an n-back task while undergoing functional magnetic resonance imaging (fMRI). Clinical features were collected by semi-structured interviews and medical records. The SZ group performed significantly worse than the FEP group in the 2-back condition. The SZ group also showed significantly higher activation in the left superior frontal gyrus in the 2-back versus 0-back condition (2-back>0-back). This frontal activation correlated positively with negative symptoms and with cumulative antipsychotic medication during the year before the fMRI examination. There were no significant correlations between activation and duration of illness. There was greater frontal neural activation in SZ than in FEP. This indicated differences in WM processing, and was significantly related to cumulative antipsychotic exposure and negative symptoms, but not to the duration of illness. Copyright © 2016 Elsevier B.V. All rights reserved.

  12. Large deviations and idempotent probability

    CERN Document Server

    Puhalskii, Anatolii

    2001-01-01

    In the view of many probabilists, author Anatolii Puhalskii's research results stand among the most significant achievements in the modern theory of large deviations. In fact, his work marked a turning point in the depth of our understanding of the connections between the large deviation principle (LDP) and well-known methods for establishing weak convergence results. Large Deviations and Idempotent Probability expounds upon the recent methodology of building large deviation theory along the lines of weak convergence theory. The author develops an idempotent (or maxitive) probability theory, introduces idempotent analogues of martingales (maxingales), Wiener and Poisson processes, and Ito differential equations, and studies their properties. The large deviation principle for stochastic processes is formulated as a certain type of convergence of stochastic processes to idempotent processes. The author calls this large deviation convergence. The approach to establishing large deviation convergence uses novel com...

  13. Co-downregulation of the hydroxycinnamoyl-CoA:shikimate hydroxycinnamoyl transferase and coumarate 3-hydroxylase significantly increases cellulose content in transgenic alfalfa (Medicago sativa L.).

    Science.gov (United States)

    Tong, Zongyong; Li, Heng; Zhang, Rongxue; Ma, Lei; Dong, Jiangli; Wang, Tao

    2015-10-01

    Lignin is a component of the cell wall that is essential for growth, development, structure and pathogen resistance in plants, but high lignin content is an obstacle to the conversion of cellulose to ethanol for biofuel. Genetically modifying lignin and cellulose contents can be a good approach to overcoming that obstacle. Alfalfa (Medicago sativa L.) is rich in lignocellulosic biomass and was used as a model plant for the genetic modification of lignin in this study. Two key enzymes in the lignin biosynthesis pathway, hydroxycinnamoyl-CoA:shikimate hydroxycinnamoyl transferase (HCT) and coumarate 3-hydroxylase (C3H), were co-downregulated. Compared to wild-type plants, the lignin content in the modified strain was reduced by 38%, cellulose was increased by 86.1%, enzymatic saccharification efficiency was increased by 10.9%, and cell wall digestibility was increased by 13.0%. The modified alfalfa exhibited a dwarf phenotype, but normal above-ground biomass. This approach provides a new strategy for reducing lignin and increasing cellulose contents and creates a new genetically modified crop with enhanced value for biofuel. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  14. The significance of the European beaver (Castor fibre activity for the process of renaturalization of river valleys in the era of increasing

    Directory of Open Access Journals (Sweden)

    Kusztal Piotr

    2017-03-01

    Changes in the environment caused by the activity of beavers bring numerous advantages. They increase biodiversity, contribute to improving the cleanliness of watercourses, improve local water relations, and restore the natural landscape of river valleys.

  15. Significant increase in cultivation of Gardnerella vaginalis, Alloscardovia omnicolens, Actinotignum schaalii, and Actinomyces spp. in urine samples with total laboratory automation.

    Science.gov (United States)

    Klein, Sabrina; Nurjadi, Dennis; Horner, Susanne; Heeg, Klaus; Zimmermann, Stefan; Burckhardt, Irene

    2018-04-13

    While total laboratory automation (TLA) is well established in laboratory medicine, only a few microbiological laboratories are using TLA systems. Especially in terms of speed and accuracy, working with TLA is expected to be superior to conventional microbiology. We retrospectively compared a total of 35,564 microbiological urine cultures incubated and processed with and without the BD Kiestra TLA, over a 6-month period for each arm. Sixteen thousand three hundred thirty-eight urine samples were analyzed in the pre-TLA period and 19,226 with TLA. Sixty-two percent (n = 10,101/16,338) of the cultures processed without TLA and 68% (n = 13,102/19,226) of the cultures processed with TLA showed growth. There were significantly more samples with two or more species per sample and with low numbers of colony-forming units (CFU) after incubation with TLA. Regarding the type of bacteria, there were comparable numbers of Enterobacteriaceae in the samples, slightly fewer non-fermenting Gram-negative bacteria, but significantly more Gram-positive cocci and Gram-positive rods. In particular, Alloscardovia omnicolens, Gardnerella vaginalis, Actinomyces spp., and Actinotignum schaalii were significantly more abundant in the samples incubated and processed with TLA. The time to report was significantly lower, by 1.5 h, in the TLA-processed samples. We provide the first report in Europe of a large number of urine samples processed with TLA. TLA showed enhanced growth of non-classical and rarely cultured bacteria from urine samples. Our findings suggest that previously underestimated bacteria may be relevant pathogens for urinary tract infections. Further studies are needed to confirm our findings.

  16. Increased levels of IL-6 in the cerebrospinal fluid of patients with chronic schizophrenia — significance for activation of the kynurenine pathway

    Science.gov (United States)

    Schwieler, Lilly; Larsson, Markus K.; Skogh, Elisabeth; Kegel, Magdalena E.; Orhan, Funda; Abdelmoaty, Sally; Finn, Anja; Bhat, Maria; Samuelsson, Martin; Lundberg, Kristina; Dahl, Marja-Liisa; Sellgren, Carl; Schuppe-Koistinen, Ina; Svensson, Camilla I.; Erhardt, Sophie; Engberg, Göran

    2015-01-01

    Background Accumulating evidence indicates that schizophrenia is associated with brain immune activation. While a number of reports suggest increased cytokine levels in patients with schizophrenia, many of these studies have been limited by their focus on peripheral cytokines or confounded by various antipsychotic treatments. Here, well-characterized patients with schizophrenia, all receiving olanzapine treatment, and healthy volunteers were analyzed with regard to cerebrospinal fluid (CSF) levels of cytokines. We correlated the CSF cytokine levels to previously analyzed metabolites of the kynurenine (KYN) pathway. Methods We analyzed the CSF from patients and controls using electrochemiluminescence detection with regard to cytokines. Cell culture media from human cortical astrocytes were analyzed for KYN and kynurenic acid (KYNA) using high-pressure liquid chromatography or liquid chromatography/mass spectrometry. Results We included 23 patients and 37 controls in our study. Patients with schizophrenia had increased CSF levels of interleukin (IL)-6 compared with healthy volunteers. In patients, we also observed a positive correlation between IL-6 and the tryptophan:KYNA ratio, indicating that IL-6 activates the KYN pathway. In line with this, application of IL-6 to cultured human astrocytes increased cell medium concentration of KYNA. Limitations The CSF samples had been frozen and thawed twice before analysis of cytokines. Median age differed between patients and controls. When appropriate, all present analyses were adjusted for age. Conclusion We have shown that IL-6, KYN and KYNA are elevated in patients with chronic schizophrenia, strengthening the idea of brain immune activation in patients with this disease. Our concurrent cell culture and clinical findings suggest that IL-6 induces the KYN pathway, leading to increased production of the N-methyl-d-aspartate receptor antagonist KYNA in patients with schizophrenia. PMID:25455350

  17. Probability mapping of contaminants

    Energy Technology Data Exchange (ETDEWEB)

    Rautman, C.A.; Kaplan, P.G. [Sandia National Labs., Albuquerque, NM (United States); McGraw, M.A. [Univ. of California, Berkeley, CA (United States); Istok, J.D. [Oregon State Univ., Corvallis, OR (United States); Sigda, J.M. [New Mexico Inst. of Mining and Technology, Socorro, NM (United States)

    1994-04-01

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. The probability mapping approach illustrated in this paper appears to offer site operators a reasonable, quantitative methodology for many environmental remediation decisions and allows evaluation of the risk associated with those decisions. For example, output from this approach can be used in quantitative, cost-based decision models for evaluating possible site characterization and/or remediation plans, resulting in selection of the risk-adjusted, least-cost alternative. The methodology is completely general, and the techniques are applicable to a wide variety of environmental restoration projects. The probability-mapping approach is illustrated by application to a contaminated site at the former DOE Feed Materials Production Center near Fernald, Ohio. Soil geochemical data, collected as part of the Uranium-in-Soils Integrated Demonstration Project, have been used to construct a number of geostatistical simulations of potential contamination for parcels approximately the size of a selective remediation unit (the 3-m width of a bulldozer blade). Each such simulation accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination (potential clean-up or personnel-hazard thresholds).

  18. Probability mapping of contaminants

    International Nuclear Information System (INIS)

    Rautman, C.A.; Kaplan, P.G.; McGraw, M.A.; Istok, J.D.; Sigda, J.M.

    1994-01-01

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. The probability mapping approach illustrated in this paper appears to offer site operators a reasonable, quantitative methodology for many environmental remediation decisions and allows evaluation of the risk associated with those decisions. For example, output from this approach can be used in quantitative, cost-based decision models for evaluating possible site characterization and/or remediation plans, resulting in selection of the risk-adjusted, least-cost alternative. The methodology is completely general, and the techniques are applicable to a wide variety of environmental restoration projects. The probability-mapping approach is illustrated by application to a contaminated site at the former DOE Feed Materials Production Center near Fernald, Ohio. Soil geochemical data, collected as part of the Uranium-in-Soils Integrated Demonstration Project, have been used to construct a number of geostatistical simulations of potential contamination for parcels approximately the size of a selective remediation unit (the 3-m width of a bulldozer blade). Each such simulation accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination (potential clean-up or personnel-hazard thresholds)

  19. Human circulating plasma DNA significantly decreases while lymphocyte DNA damage increases under chronic occupational exposure to low-dose gamma-neutron and tritium β-radiation

    Energy Technology Data Exchange (ETDEWEB)

    Korzeneva, Inna B., E-mail: inna.korzeneva@molgen.vniief.ru [Russian Federal Nuclear Center – All-Russian Research Institute of Experimental Physics (RFNC-VNIIEF) 607190, Sarov, 37 Mira ave., Nizhniy Novgorod Region (Russian Federation); Kostuyk, Svetlana V.; Ershova, Liza S. [Research Centre for Medical Genetics, Russian Academy of Medical Sciences, 115478 Moscow, 1 Moskvorechye str. (Russian Federation); Osipov, Andrian N. [Federal Medical and Biological Center named after Burnazyan of the Federal Medical and Biological Agency (FMBTz named after Burnazyan of FMBA), Moscow (Russian Federation); State Research Center - Burnasyan Federal Medical Biophysical Center of Federal Medical Biological Agency, Zhivopisnaya, 46, Moscow, 123098 (Russian Federation); Zhuravleva, Veronika F.; Pankratova, Galina V. [Russian Federal Nuclear Center – All-Russian Research Institute of Experimental Physics (RFNC-VNIIEF) 607190, Sarov, 37 Mira ave., Nizhniy Novgorod Region (Russian Federation); Porokhovnik, Lev N.; Veiko, Natalia N. [Research Centre for Medical Genetics, Russian Academy of Medical Sciences, 115478 Moscow, 1 Moskvorechye str. (Russian Federation)

    2015-09-15

    Highlights: • The chronic exposure to low-dose IR induces DSBs in human lymphocytes (TM index). • Exposure to IR decreases the level of human circulating DNA (cfDNA index). • IR induces an increase of DNase1 activity (DNase1 index) in plasma. • IR induces an increase of the level of antibodies to DNA (Ab DNA index) in plasma. • The ratio cfDNA/(DNase1 × Ab DNA × TM) is a potential marker of human exposure to IR. - Abstract: The blood plasma of healthy people contains cell-free (circulating) DNA (cfDNA). Apoptotic cells are the main source of the cfDNA. The cfDNA concentration increases when the organism's cell death rate increases, for example after exposure to high-dose ionizing radiation (IR). The objects of the present research are the blood plasma and blood lymphocytes of people who worked occupationally with sources of external gamma/neutron radiation or internal β-radiation of tritium (N = 176). As controls (references), blood samples of people who had never been occupationally exposed to IR sources were used (N = 109). For the plasma sample of each donor, the following were determined: the cfDNA concentration (the cfDNA index), DNase1 activity (the DNase1 index) and the titre of antibodies to DNA (the Ab DNA index). The overall DNA damage in the cells was determined using the Comet assay (the tail moment (TM) index). Chronic exposure of a person to low-dose ionizing radiation is accompanied by enhanced DNA damage in lymphocytes along with a considerable reduction in cfDNA content, while the DNase1 content and the concentration of antibodies to DNA (Ab DNA) increase. All the aforementioned changes were also observed in people who had not worked with IR sources for more than a year. The ratio cfDNA/(DNase1 × Ab DNA × TM) is proposed as a marker of chronic exposure of a person to external low-dose IR. The assumption was formulated that the joint analysis of the cfDNA, DNase1, Ab DNA and TM values may provide information about the resistance of the organism's cells to chronic low-dose IR and about the development of an adaptive response in the organism aimed, firstly, at effective elimination of cfDNA from the blood circulation and, secondly, at survival of the cells, including cells with damaged DNA.

  20. Generalized Probability Functions

    Directory of Open Access Journals (Sweden)

    Alexandre Souto Martinez

    2009-01-01

    Full Text Available From the integration of nonsymmetrical hyperboles, a one-parameter generalization of the logarithmic function is obtained. Inverting this function, one obtains the generalized exponential function. Motivated by mathematical curiosity, we show that these generalized functions are suitable for generalizing some probability density functions (pdfs). A very reliable rank distribution can be conveniently described by the generalized exponential function. Finally, we turn our attention to the generalization of one- and two-tail stretched exponential functions. We obtain, as particular cases, the generalized error function, the Zipf-Mandelbrot pdf, and the generalized Gaussian and Laplace pdfs. Their cumulative functions and moments were also obtained analytically.
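
    One common one-parameter family consistent with this description (assumed here as an illustration, not taken from the paper) is ln_q(x) = (x^q − 1)/q with inverse exp_q(y) = (1 + q·y)^(1/q), both reducing to the ordinary log and exp as q → 0. A quick Python check:

        # Generalized log/exp pair; q = 0 recovers the ordinary functions.
        import numpy as np

        def gen_log(x, q):
            if q == 0.0:
                return np.log(x)
            return (x**q - 1.0) / q

        def gen_exp(y, q):
            if q == 0.0:
                return np.exp(y)
            return (1.0 + q * y) ** (1.0 / q)

        x = np.linspace(0.5, 3.0, 6)
        for q in (0.0, 0.3, -0.3):
            assert np.allclose(gen_exp(gen_log(x, q), q), x)  # inverse pair
        print("gen_log(2, q=0.3) =", gen_log(2.0, 0.3))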

  1. Probability in High Dimension

    Science.gov (United States)

    2014-06-30


  2. Ibuprofen therapy resulted in significantly decreased tissue bacillary loads and increased survival in a new murine experimental model of active tuberculosis.

    Science.gov (United States)

    Vilaplana, Cristina; Marzo, Elena; Tapia, Gustavo; Diaz, Jorge; Garcia, Vanesa; Cardona, Pere-Joan

    2013-07-15

    C3HeB/FeJ mice infected with Mycobacterium tuberculosis were used in an experimental animal model mimicking active tuberculosis in humans to evaluate the effect of antiinflammatory agents. No other treatment but ibuprofen was given, and it was administered when the animals' health started to deteriorate. Animals treated with ibuprofen had statistically significant decreases in the size and number of lung lesions, decreases in the bacillary load, and improvements in survival, compared with findings for untreated animals. Because antiinflammatory agents are already on the market, further clinical trials should be done to evaluate this effect in humans as soon as possible, to determine their suitability as coadjuvant tuberculosis treatment.

  3. Is probability of frequency too narrow?

    International Nuclear Information System (INIS)

    Martz, H.F.

    1993-01-01

    Modern methods of statistical data analysis, such as empirical and hierarchical Bayesian methods, should find increasing use in future Probabilistic Risk Assessment (PRA) applications. In addition, there will be a more formalized use of expert judgment in future PRAs. These methods require an extension of the probabilistic framework of PRA, in particular, the popular notion of probability of frequency, to consideration of frequency of frequency, frequency of probability, and probability of probability. The genesis, interpretation, and examples of these three extended notions are discussed

  4. The stage-classified matrix models project a significant increase in biomass carbon stocks in China's forests between 2005 and 2050.

    Science.gov (United States)

    Hu, Huifeng; Wang, Shaopeng; Guo, Zhaodi; Xu, Bing; Fang, Jingyun

    2015-06-25

    China's forests are characterized by young age, low carbon (C) density and a large plantation area, implying a high potential for increasing C sinks in the future. Using data of provincial forest area and biomass C density from China's forest inventories between 1994 and 2008 and the planned forest coverage of the country by 2050, we developed a stage-classified matrix model to predict biomass C stocks of China's forests from 2005 to 2050. The results showed that total forest biomass C stock would increase from 6.43 Pg C (1 Pg = 10^15 g) in 2005 to 9.97 Pg C (95% confidence interval: 8.98 ~ 11.07 Pg C) in 2050, with an overall net C gain of 78.8 Tg C yr^-1 (56.7 ~ 103.3 Tg C yr^-1; 1 Tg = 10^12 g). Our findings suggest that China's forests will be a large and persistent biomass C sink through 2050.
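
    The projection machinery itself is compact. The sketch below (Python) iterates a small stage-classified (Lefkovitch-type) matrix; the stages, transition rates, growth multipliers and initial stocks are invented for illustration and are not the paper's parameters:

        # Annual step: grow biomass within each stage, then move stage shares
        # with a column-conserving transition matrix.
        import numpy as np

        # Columns: young, middle-aged, mature forest. A[i, j] is the share of
        # stage-j carbon that ends up in stage i each year.
        A = np.array([
            [0.90, 0.00, 0.05],   # young: most stays; some mature area is re-established
            [0.10, 0.93, 0.00],   # middle-aged: inflow from young, most persists
            [0.00, 0.07, 0.95],   # mature: inflow from middle-aged, slow turnover
        ])
        growth = np.array([1.06, 1.03, 1.01])  # per-stage biomass C growth multipliers

        c = np.array([2.0, 2.5, 1.9])          # initial stocks by stage (Pg C), illustrative
        for year in range(2005, 2051):
            c = A @ (growth * c)
        print(f"projected total biomass C in 2050: {c.sum():.2f} Pg C")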

  5. The stage-classified matrix models project a significant increase in biomass carbon stocks in China’s forests between 2005 and 2050

    Science.gov (United States)

    Hu, Huifeng; Wang, Shaopeng; Guo, Zhaodi; Xu, Bing; Fang, Jingyun

    2015-01-01

    China’s forests are characterized by young age, low carbon (C) density and a large plantation area, implying a high potential for increasing C sinks in the future. Using data of provincial forest area and biomass C density from China’s forest inventories between 1994 and 2008 and the planned forest coverage of the country by 2050, we developed a stage-classified matrix model to predict biomass C stocks of China’s forests from 2005 to 2050. The results showed that total forest biomass C stock would increase from 6.43 Pg C (1 Pg = 10^15 g) in 2005 to 9.97 Pg C (95% confidence interval: 8.98 ~ 11.07 Pg C) in 2050, with an overall net C gain of 78.8 Tg C yr^-1 (56.7 ~ 103.3 Tg C yr^-1; 1 Tg = 10^12 g). Our findings suggest that China’s forests will be a large and persistent biomass C sink through 2050. PMID:26110831

  6. Human circulating plasma DNA significantly decreases while lymphocyte DNA damage increases under chronic occupational exposure to low-dose gamma-neutron and tritium β-radiation.

    Science.gov (United States)

    Korzeneva, Inna B; Kostuyk, Svetlana V; Ershova, Liza S; Osipov, Andrian N; Zhuravleva, Veronika F; Pankratova, Galina V; Porokhovnik, Lev N; Veiko, Natalia N

    2015-09-01

    The blood plasma of healthy people contains cell-free (circulating) DNA (cfDNA). Apoptotic cells are the main source of the cfDNA. The cfDNA concentration increases when the organism's cell death rate increases, for example after exposure to high-dose ionizing radiation (IR). The objects of the present research are the blood plasma and blood lymphocytes of people who worked occupationally with sources of external gamma/neutron radiation or internal β-radiation of tritium (N = 176). As controls (references), blood samples of people who had never been occupationally exposed to IR sources were used (N = 109). For the plasma sample of each donor, the following were determined: the cfDNA concentration (the cfDNA index), DNase1 activity (the DNase1 index) and the titre of antibodies to DNA (the Ab DNA index). The overall DNA damage in the cells was determined using the Comet assay (the tail moment (TM) index). Chronic exposure of a person to low-dose ionizing radiation is accompanied by enhanced DNA damage in lymphocytes along with a considerable reduction in cfDNA content, while the DNase1 content and the concentration of antibodies to DNA (Ab DNA) increase. All the aforementioned changes were also observed in people who had not worked with IR sources for more than a year. The ratio cfDNA/(DNase1 × Ab DNA × TM) is proposed as a marker of chronic exposure of a person to external low-dose IR. The assumption was formulated that the joint analysis of the cfDNA, DNase1, Ab DNA and TM values may provide information about the resistance of the organism's cells to chronic exposure to low-dose IR and about the development of an adaptive response in the organism aimed, firstly, at effective elimination of cfDNA from the blood circulation and, secondly, at survival of the cells, including cells with damaged DNA. Copyright © 2015. Published by Elsevier B.V.
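
    The proposed composite index is a one-line formula. A trivial sketch (placeholder numbers in arbitrary units, not the study's assay values):

        # Composite exposure marker cfDNA/(DNase1 x AbDNA x TM) from the abstract.
        def exposure_marker(cfdna, dnase1, ab_dna, tm):
            return cfdna / (dnase1 * ab_dna * tm)

        # Under chronic low-dose IR the abstract reports cfDNA down and the
        # other three indices up, so the ratio should fall relative to controls:
        control = exposure_marker(cfdna=1.0, dnase1=1.0, ab_dna=1.0, tm=1.0)
        exposed = exposure_marker(cfdna=0.6, dnase1=1.4, ab_dna=1.3, tm=1.5)
        print(exposed < control)  # True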

  7. Sieve-based device for MALDI sample preparation. I. Influence of sample deposition conditions in oligonucleotide analysis to achieve significant increases in both sensitivity and resolution.

    Science.gov (United States)

    Molin, Laura; Cristoni, Simone; Crotti, Sara; Bernardi, Luigi Rossi; Seraglia, Roberta; Traldi, Pietro

    2008-11-01

    Spraying of oligonucleotide-matrix solutions through a stainless steel (ss) sieve (38 microm, 450 mesh) leads to the formation, on the matrix-assisted laser desorption/ionization (MALDI) sample holder, of uniformly distributed microcrystals, well separated from each other. When the resulting sample holder surface is irradiated by laser, abundant molecular species form, with a clear increase in both intensity and resolution with respect to values obtained by 'Dried Droplet', 'Double Layer', and 'Sandwich' deposition methods. In addition, unlike the usual situation, the sample is perfectly homogeneous, and identical spectra are obtained by irradiating different areas. On one hand, the data indicate that this method is highly effective for oligonucleotide MALDI analysis, and on the other, that it can be validly employed for fully automated MALDI procedures.

  8. Reinnervation of Vastus lateralis is increased significantly in seniors (70-years old) with a lifelong history of high-level exercise

    Directory of Open Access Journals (Sweden)

    Simone Mosole

    2013-12-01

    Full Text Available It has long been recognized that histological changes observed in aging muscle suggest that denervation contributes to muscle deterioration and that disuse accelerates the process, while running activity, sustained for decades, protects against age-related loss of motor units. Here we show at the histological level that lifelong increased physical activity promotes reinnervation of muscle fibers. In muscle biopsies from 70-year-old men with a lifelong history of high-level physical activity, we observed a considerable increase in fiber-type groupings (almost exclusively of the slow type) in comparison to sedentary seniors, revealing a large population of reinnervated muscle fibers in the sportsmen. Slow-type transformation by reinnervation in senior sportsmen seems to be a clinically relevant mechanism: the muscle biopsies range from those with scarce fiber-type transformation and groupings to almost fully transformed muscle, going through a process in which isolated fibers co-expressing fast and slow MHCs seem to fill the gaps. Taken together, our results suggest that, beyond the direct effects of aging on the muscle fibers, changes occurring in skeletal muscle tissue appear to be largely, although not solely, a result of sparse denervation. Our data suggest that lifelong exercise allows the body to adapt to the consequences of the age-related denervation and to preserve muscle structure and function by saving otherwise lost muscle fibers through recruitment to different, mainly slow, motor units. These beneficial effects on motoneurons and, subsequently on muscle fibers, serve to maintain size, structure and function of muscle fibers, delaying the functional decline and loss of independence that are commonly seen in late aging.

  9. Conjugation of the CRM197-inulin conjugate significantly increases the immunogenicity of Mycobacterium tuberculosis CFP10-TB10.4 fusion protein.

    Science.gov (United States)

    Hu, Shun; Yu, Weili; Hu, Chunyang; Wei, Dong; Shen, Lijuan; Hu, Tao; Yi, Youjin

    2017-11-01

    Mycobacterium tuberculosis (Mtb) is a serious fatal pathogen that causes tuberculosis (TB). Effective vaccination is urgently needed to deal with the serious threat from TB. Mtb-secreted protein antigens are important virulence determinants of Mtb with poor immunogenicity. Adjuvants and antigen delivery systems are thus highly desired to improve the immunogenicity of protein antigens. Inulin is a biocompatible polysaccharide (PS) adjuvant that can stimulate a strong cellular and humoral immunity. Bacterial capsular PS and haptens have been conjugated with cross-reacting material 197 (CRM197) to improve their immunogenicity. CFP10 and TB10.4 were two Mtb-secreted immunodominant protein antigens. A CFP10-TB10.4 fusion protein (CT) was used as the antigen for covalent conjugation with the CRM197-inulin conjugate (CRM-inu). The resultant conjugate (CT-CRM-inu) elicited high CT-specific IgG titers, stimulated splenocyte proliferation and provoked the secretion of Th1-type and Th2-type cytokines. Conjugation with CRM-inu significantly prolonged the systemic circulation of CT and exposure to the immune system. Moreover, CT-CRM-inu showed no apparent toxicity to cardiac, hepatic and renal functions. Thus, conjugation of CT with CRM-inu provided an effective strategy for development of protein-based vaccines against Mtb infection. Copyright © 2017 Elsevier Ltd. All rights reserved.

  10. Repeated oral administration of a cathepsin K inhibitor significantly suppresses bone resorption in exercising horses with evidence of increased bone formation and maintained bone turnover.

    Science.gov (United States)

    Hussein, H; Dulin, J; Smanik, L; Drost, W T; Russell, D; Wellman, M; Bertone, A

    2017-08-01

    Our investigations evaluated the effect of VEL-0230, a highly specific irreversible inhibitor of cathepsin K (CatK). The objectives of our study were to determine whether repeated dosing of a CatK inhibitor (CatKI) produced a desired inhibition of the bone resorption biomarker (CTX-1), and document the effect of repeated dosing on bone homeostasis, structure, and dynamics of bone resorption and formation in horses. Twelve young exercising horses were randomized in a prospective, controlled clinical trial and received 4 weekly doses of a CatKI or vehicle. Baseline and poststudy nuclear scintigraphy, blood sampling and analysis of plasma bone biomarkers (CTX-1 and osteocalcin), poststudy bone fluorescent labeling, and bone biopsy were performed. Bone specimens were further processed for microcomputed tomography and bone histomorphometry. Each dose of this CatKI transiently inhibited plasma CTX-1 (reflecting inhibition of bone collagen resorption) and increased bone plasma osteocalcin concentrations, with no detectable adverse effect on normal bone turnover in the face of exercise. Bone morphology, density, and formation rate were not different between control and treated group. Further investigation of CatK inhibition in abnormal bone turnover is required in animals with bone diseases. © 2016 John Wiley & Sons Ltd.

  11. Probable maximum flood control

    International Nuclear Information System (INIS)

    DeGabriele, C.E.; Wu, C.L.

    1991-11-01

    This study proposes preliminary design concepts to protect the waste-handling facilities and all shaft and ramp entries to the underground from the probable maximum flood (PMF) in the current design configuration for the proposed Nevada Nuclear Waste Storage Investigation (NNWSI) repository. Flood protection provisions were furnished by the United States Bureau of Reclamation (USBR) or developed from USBR data. Proposed flood protection provisions include site grading, drainage channels, and diversion dikes. Figures are provided to show these proposed flood protection provisions at each area investigated. These areas are the central surface facilities (including the waste-handling building and waste treatment building), tuff ramp portal, waste ramp portal, men-and-materials shaft, emplacement exhaust shaft, and exploratory shafts facility

  12. Deficiencies in both starch synthase IIIa and branching enzyme IIb lead to a significant increase in amylose in SSIIa-inactive japonica rice seeds.

    Science.gov (United States)

    Asai, Hiroki; Abe, Natsuko; Matsushima, Ryo; Crofts, Naoko; Oitome, Naoko F; Nakamura, Yasunori; Fujita, Naoko

    2014-10-01

    Starch synthase (SS) IIIa has the second highest activity of the total soluble SS activity in developing rice endosperm. Branching enzyme (BE) IIb is the major BE isozyme, and is strongly expressed in developing rice endosperm. A mutant (ss3a/be2b) was generated from wild-type japonica rice which lacks SSIIa activity. The seed weight of ss3a/be2b was 74-94% of that of the wild type, whereas the be2b seed weight was 59-73% of that of the wild type. There were significantly fewer amylopectin short chains [degree of polymerization (DP) ≤13] in ss3a/be2b compared with the wild type. In contrast, the amount of long chains (DP ≥25) connecting clusters of amylopectin in ss3a/be2b was higher than in the wild type and lower than in be2b. The apparent amylose content of ss3a/be2b was 45%, which was >1.5 times greater than that of either ss3a or be2b. Both SSIIIa and BEIIb deficiencies led to higher activity of ADP-glucose pyrophosphorylase (AGPase) and granule-bound starch synthase I (GBSSI), which partly explains the high amylose content in the ss3a/be2b endosperm. The percentage apparent amylose content of ss3a and ss3a/be2b at 10 days after flowering (DAF) was higher than that of the wild type and be2b. At 20 DAF, amylopectin biosynthesis in be2b and ss3a/be2b was not observed, whereas amylose biosynthesis in these lines was accelerated at 30 DAF. These data suggest that the high amylose content in the ss3a/be2b mutant results from higher amylose biosynthesis at two stages, up to 20 DAF and from 30 DAF to maturity. © The Author 2014. Published by Oxford University Press on behalf of the Society for Experimental Biology.

  13. High fasting blood glucose and obesity significantly and independently increase risk of breast cancer death in hormone receptor-positive disease.

    Science.gov (United States)

    Minicozzi, Pamela; Berrino, Franco; Sebastiani, Federica; Falcini, Fabio; Vattiato, Rosa; Cioccoloni, Francesca; Calagreti, Gioia; Fusco, Mario; Vitale, Maria Francesca; Tumino, Rosario; Sigona, Aurora; Budroni, Mario; Cesaraccio, Rosaria; Candela, Giuseppa; Scuderi, Tiziana; Zarcone, Maurizio; Campisi, Ildegarda; Sant, Milena

    2013-12-01

    We investigated the effect of fasting blood glucose and body mass index (BMI) at diagnosis on risk of breast cancer death for cases diagnosed in five Italian cancer registries in 2003-2005 and followed up to the end of 2008. For 1607 Italian women (≥15 years) with information on BMI or blood glucose or diabetes, we analysed the risk of breast cancer death in relation to glucose tertiles (≤84.0, 84.1-94.0, >94.0 mg/dl) plus diabetic and unspecified categories; BMI tertiles (≤23.4, 23.5-27.3, >27.3 kg/m^2, unspecified), stage (T1-3N0M0, T1-3N+M0 plus T4anyNM0, M1, unspecified), oestrogen (ER) and progesterone (PR) status (ER+PR+, ER-PR-, ER and PR unspecified, other), age, chemotherapy and endocrine therapy, using multiple regression models. Separate models for ER+PR+ and ER-PR- cases were also run. Patients often had T1-3N0M0, ER+PR+ cancers and received chemotherapy or endocrine therapy; only 6% were M1 and 17% ER-PR-. Diabetic patients were older and had more often high BMI (>27 kg/m^2), ER-PR-, M1 cancers than other patients. For ER+PR+ cases, with adjustment for other variables, breast cancer mortality was higher in women with high BMI than those with BMI 23.5-27.3 kg/m^2 (hazard ratio (HR)=2.9, 95% confidence interval (CI) 1.2-6.9). Breast cancer mortality was also higher in women with high (>94 mg/dl) blood glucose compared to those with glucose 84.1-94.0 mg/dl (HR=2.6, 95% CI 1.2-5.7). Our results provide evidence that in ER+PR+ patients, high blood glucose and high BMI are independently associated with increased risk of breast cancer death. Detection and correction of these factors in such patients may improve prognosis. Copyright © 2013 Elsevier Ltd. All rights reserved.
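
    A sketch of the kind of multivariable survival model behind the hazard ratios quoted above, using the third-party lifelines package; the data frame, column names and values are invented, and the study's actual covariate coding (tertiles, stage, receptor status) is richer:

        # Cox proportional hazards fit on toy follow-up data.
        import pandas as pd
        from lifelines import CoxPHFitter

        df = pd.DataFrame({
            "years":        [4.9, 2.1, 5.0, 1.3, 4.5, 3.8, 5.0, 0.9],  # follow-up time
            "died":         [0,   1,   0,   1,   0,   1,   0,   1],    # breast cancer death
            "high_glucose": [0,   1,   0,   1,   0,   0,   1,   1],    # >94 mg/dl
            "high_bmi":     [0,   1,   0,   0,   1,   1,   0,   1],    # >27.3 kg/m^2
        })

        cph = CoxPHFitter()
        cph.fit(df, duration_col="years", event_col="died")
        cph.print_summary()  # exponentiated coefficients are hazard ratios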

  14. Site-directed immobilization of a genetically engineered anti-methotrexate antibody via an enzymatically introduced biotin label significantly increases the binding capacity of immunoaffinity columns.

    Science.gov (United States)

    Davenport, Kaitlynn R; Smith, Christopher A; Hofstetter, Heike; Horn, James R; Hofstetter, Oliver

    2016-05-15

    In this study, the effect of random vs. site-directed immobilization techniques on the performance of antibody-based HPLC columns was investigated using a single-domain camelid antibody (VHH) directed against methotrexate (MTX) as a model system. First, the high flow-through support material POROS-OH was activated with disuccinimidyl carbonate (DSC), and the VHH was bound in a random manner via amines located on the protein's surface. The resulting column was characterized by Frontal Affinity Chromatography (FAC). Then, two site-directed techniques were explored to increase column efficiency by immobilizing the antibody via its C-terminus, i.e., away from the antigen-binding site. In one approach, a tetra-lysine tail was added, and the antibody was immobilized onto DSC-activated POROS. In the second site-directed approach, the VHH was modified with the AviTag peptide, and a biotin-residue was enzymatically incorporated at the C-terminus using the biotin ligase BirA. The biotinylated antibody was subsequently immobilized onto NeutrAvidin-derivatized POROS. A comparison of the FAC analyses, which for all three columns showed excellent linearity (R^2 > 0.999), revealed that both site-directed approaches yield better results than the random immobilization; the by far highest efficiency, however, was determined for the immunoaffinity column based on AviTag-biotinylated antibody. As proof of concept, all three columns were evaluated for quantification of MTX dissolved in phosphate buffered saline (PBS). Validation using UV-detection showed excellent linearity in the range of 0.04-12 μM (R^2 > 0.993). The lower limit of detection (LOD) and lower limit of quantification (LLOQ) were found to be independent of the immobilization strategy and were 40 nM and 132 nM, respectively. The intra- and inter-day precision was below 11.6%, and accuracy was between 90.7% and 112%. To the best of our knowledge, this is the first report of the AviTag-system in chromatography, and the first
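
    The linearity and LOD/LLOQ figures follow from a calibration fit. The sketch below uses invented calibration points and the common ICH-style conventions LOD = 3.3·s/slope and LLOQ = 10·s/slope (s = residual standard deviation); these conventions are assumptions, not the authors' stated procedure:

        # Linear calibration fit plus LOD/LLOQ estimates.
        import numpy as np

        conc = np.array([0.04, 0.5, 1.0, 3.0, 6.0, 12.0])         # microM
        signal = np.array([0.9, 10.4, 20.8, 61.5, 124.0, 246.0])  # arbitrary UV units

        slope, intercept = np.polyfit(conc, signal, 1)
        resid = signal - (slope * conc + intercept)
        s = resid.std(ddof=2)                                     # residual std (2 fit params)

        lod = 3.3 * s / slope
        lloq = 10.0 * s / slope
        r2 = np.corrcoef(conc, signal)[0, 1] ** 2
        print(f"R^2 = {r2:.4f}, LOD ~ {lod * 1000:.0f} nM, LLOQ ~ {lloq * 1000:.0f} nM")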

  15. Probability and rational choice

    Directory of Open Access Journals (Sweden)

    David Botting

    2014-05-01

    Full Text Available http://dx.doi.org/10.5007/1808-1711.2014v18n1p1 In this paper I will discuss the rationality of reasoning about the future. There are two things that we might like to know about the future: which hypotheses are true and what will happen next. To put it in philosophical language, I aim to show that there are methods by which inferring to a generalization (selecting a hypothesis) and inferring to the next instance (singular predictive inference) can be shown to be normative and the method itself shown to be rational, where this is due in part to being based on evidence (although not in the same way) and in part on a prior rational choice. I will also argue that these two inferences have been confused, being distinct not only conceptually (as nobody disputes) but also in their results (the value given to the probability of the hypothesis not being in general that given to the next instance), and that methods that are adequate for one are not by themselves adequate for the other. A number of debates over method founder on this confusion and do not show what the debaters think they show.

  16. Encounter Probability of Individual Wave Height

    DEFF Research Database (Denmark)

    Liu, Z.; Burcharth, H. F.

    1998-01-01

    wave height corresponding to a certain exceedence probability within a structure lifetime (encounter probability), based on the statistical analysis of long-term extreme significant wave height. Then the design individual wave height is calculated as the expected maximum individual wave height...... associated with the design significant wave height, with the assumption that the individual wave heights follow the Rayleigh distribution. However, the exceedence probability of such a design individual wave height within the structure lifetime is unknown. The paper presents a method for the determination...... of the design individual wave height corresponding to an exceedence probability within the structure lifetime, given the long-term extreme significant wave height. The method can also be applied for estimation of the number of relatively large waves for fatigue analysis of constructions....
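
    Both quantities in this record have standard closed forms under the stated assumptions. A sketch with illustrative numbers (the Rayleigh-based formula below gives the most probable maximum, a common stand-in for the expected maximum individual wave height):

        # Encounter probability of a return-period event, and the modal maximum
        # of N Rayleigh-distributed individual wave heights given Hs.
        import math

        def encounter_probability(return_period_years, lifetime_years):
            # P(event occurs at least once during the structure lifetime)
            return 1.0 - (1.0 - 1.0 / return_period_years) ** lifetime_years

        def max_wave_height(hs, n_waves):
            # Rayleigh exceedance P(H > h) = exp(-2 h^2 / Hs^2) gives this mode
            return hs * math.sqrt(math.log(n_waves) / 2.0)

        print(encounter_probability(100, 50))           # ~0.39 for a 100-yr event, 50-yr life
        print(max_wave_height(hs=6.0, n_waves=3000))    # ~12 m in a 3000-wave storm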

  17. Probability for Weather and Climate

    Science.gov (United States)

    Smith, L. A.

    2013-12-01

    Over the last 60 years, the availability of large-scale electronic computers has stimulated rapid and significant advances both in meteorology and in our understanding of the Earth System as a whole. The speed of these advances was due, in large part, to the sudden ability to explore nonlinear systems of equations. The computer allows the meteorologist to carry a physical argument to its conclusion; the time scales of weather phenomena then allow the refinement of physical theory, numerical approximation or both in light of new observations. Prior to this extension, as Charney noted, the practicing meteorologist could ignore the results of theory with good conscience. Today, neither the practicing meteorologist nor the practicing climatologist can do so, but to what extent, and in what contexts, should they place the insights of theory above quantitative simulation? And in what circumstances can one confidently estimate the probability of events in the world from model-based simulations? Despite solid advances of theory and insight made possible by the computer, the fidelity of our models of climate differs in kind from the fidelity of models of weather. While all prediction is extrapolation in time, weather resembles interpolation in state space, while climate change is fundamentally an extrapolation. The trichotomy of simulation, observation and theory which has proven essential in meteorology will remain incomplete in climate science. Operationally, the roles of probability, indeed the kinds of probability one has access to, are different in operational weather forecasting and climate services. Significant barriers to forming probability forecasts (which can be used rationally as probabilities) are identified. Monte Carlo ensembles can explore sensitivity, diversity, and (sometimes) the likely impact of measurement uncertainty and structural model error. The aims of different ensemble strategies, and fundamental differences in ensemble design to support of

  18. COVAL, Compound Probability Distribution for Function of Probability Distribution

    International Nuclear Information System (INIS)

    Astolfi, M.; Elbaz, J.

    1979-01-01

    1 - Nature of the physical problem solved: Computation of the probability distribution of a function of variables, given the probability distribution of the variables themselves. 'COVAL' has been applied to reliability analysis of a structure subject to random loads. 2 - Method of solution: Numerical transformation of probability distributions
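
    COVAL's problem statement, the distribution of a function of random variables, is easy to illustrate by Monte Carlo (COVAL itself uses a numerical transformation; the load and strength distributions below are invented):

        # Distribution of a function of random variables, and a reliability
        # quantity derived from it.
        import numpy as np

        rng = np.random.default_rng(1)
        load = rng.normal(100.0, 15.0, size=100_000)       # random applied load
        strength = rng.lognormal(4.8, 0.1, size=100_000)   # random structural resistance

        margin = strength - load                           # function of the two variables
        print("P(failure) = P(margin < 0) ~", (margin < 0).mean())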

  19. A Tale of Two Probabilities

    Science.gov (United States)

    Falk, Ruma; Kendig, Keith

    2013-01-01

    Two contestants debate the notorious probability problem of the sex of the second child. The conclusions boil down to explication of the underlying scenarios and assumptions. Basic principles of probability theory are highlighted.
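
    The crux of the debate is which scenario generated the data, and a short enumeration shows how the answer moves between 1/2 and 1/3:

        # Four equally likely two-child families: BB, BG, GB, GG.
        from itertools import product

        families = list(product("BG", repeat=2))

        # Scenario 1: "at least one child is a boy" -> P(both boys) = 1/3
        cond = [f for f in families if "B" in f]
        print(sum(f == ("B", "B") for f in cond) / len(cond))   # 0.333...

        # Scenario 2: "the observed (first) child is a boy" -> P(both boys) = 1/2
        cond = [f for f in families if f[0] == "B"]
        print(sum(f == ("B", "B") for f in cond) / len(cond))   # 0.5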

  20. Introduction to probability with R

    CERN Document Server

    Baclawski, Kenneth

    2008-01-01

    FOREWORD PREFACE Sets, Events, and Probability The Algebra of Sets The Bernoulli Sample Space The Algebra of Multisets The Concept of Probability Properties of Probability Measures Independent Events The Bernoulli Process The R Language Finite Processes The Basic Models Counting Rules Computing Factorials The Second Rule of Counting Computing Probabilities Discrete Random Variables The Bernoulli Process: Tossing a Coin The Bernoulli Process: Random Walk Independence and Joint Distributions Expectations The Inclusion-Exclusion Principle General Random Variable

  1. A first course in probability

    CERN Document Server

    Ross, Sheldon

    2014-01-01

    A First Course in Probability, Ninth Edition, features clear and intuitive explanations of the mathematics of probability theory, outstanding problem sets, and a variety of diverse examples and applications. This book is ideal for an upper-level undergraduate or graduate level introduction to probability for math, science, engineering and business students. It assumes a background in elementary calculus.

  2. Pre-aggregation for Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    Motivated by the increasing need to analyze complex uncertain multidimensional data (e.g., in order to optimize and personalize location-based services), this paper proposes novel types of probabilistic OLAP queries that operate on aggregate values that are probability distributions...... and the techniques to process these queries. The paper also presents the methods for computing the probability distributions, which enables pre-aggregation, and for using the pre-aggregated distributions for further aggregation. In order to achieve good time and space efficiency, the methods perform approximate...... multidimensional data analysis that is considered in this paper (i.e., approximate processing of probabilistic OLAP queries over probability distributions)....

  3. Probability theory a comprehensive course

    CERN Document Server

    Klenke, Achim

    2014-01-01

    This second edition of the popular textbook contains a comprehensive course in modern probability theory. Overall, probabilistic concepts play an increasingly important role in mathematics, physics, biology, financial engineering and computer science. They help us in understanding magnetism, amorphous media, genetic diversity and the perils of random developments at financial markets, and they guide us in constructing more efficient algorithms.   To address these concepts, the title covers a wide variety of topics, many of which are not usually found in introductory textbooks, such as:   • limit theorems for sums of random variables • martingales • percolation • Markov chains and electrical networks • construction of stochastic processes • Poisson point process and infinite divisibility • large deviation principles and statistical physics • Brownian motion • stochastic integral and stochastic differential equations. The theory is developed rigorously and in a self-contained way, with the c...

  4. Propensity, Probability, and Quantum Theory

    Science.gov (United States)

    Ballentine, Leslie E.

    2016-08-01

    Quantum mechanics and probability theory share one peculiarity. Both have well established mathematical formalisms, yet both are subject to controversy about the meaning and interpretation of their basic concepts. Since probability plays a fundamental role in QM, the conceptual problems of one theory can affect the other. We first classify the interpretations of probability into three major classes: (a) inferential probability, (b) ensemble probability, and (c) propensity. Class (a) is the basis of inductive logic; (b) deals with the frequencies of events in repeatable experiments; (c) describes a form of causality that is weaker than determinism. An important, but neglected, paper by P. Humphreys demonstrated that propensity must differ mathematically, as well as conceptually, from probability, but he did not develop a theory of propensity. Such a theory is developed in this paper. Propensity theory shares many, but not all, of the axioms of probability theory. As a consequence, propensity supports the Law of Large Numbers from probability theory, but does not support Bayes theorem. Although there are particular problems within QM to which any of the classes of probability may be applied, it is argued that the intrinsic quantum probabilities (calculated from a state vector or density matrix) are most naturally interpreted as quantum propensities. This does not alter the familiar statistical interpretation of QM. But the interpretation of quantum states as representing knowledge is untenable. Examples show that a density matrix fails to represent knowledge.

  5. Probability of detection of clinical seizures using heart rate changes.

    Science.gov (United States)

    Osorio, Ivan; Manly, B F J

    2015-08-01

    Heart rate-based seizure detection is a viable complement or alternative to ECoG/EEG. This study investigates the role of various biological factors on the probability of clinical seizure detection using heart rate. Regression models were applied to 266 clinical seizures recorded from 72 subjects to investigate if factors such as age, gender, years with epilepsy, etiology, seizure site origin, seizure class, and data collection centers, among others, shape the probability of EKG-based seizure detection. Clinical seizure detection probability based on heart rate changes is significantly (p < 0.05) shaped by several of these factors. The probability of detecting clinical seizures (>0.8 in the majority of subjects) using heart rate is highest for complex partial seizures, increases with a patient's years with epilepsy, is lower for females than for males and is unrelated to the side of hemisphere origin. Clinical seizure detection probability using heart rate is multi-factorially dependent and sufficiently high (>0.8) in most cases to be clinically useful. Knowledge of the role that these factors play in shaping said probability will enhance its applicability and usefulness. Heart rate is a reliable and practical signal for extra-cerebral detection of clinical seizures originating from or spreading to central autonomic network structures. Copyright © 2015 British Epilepsy Association. Published by Elsevier Ltd. All rights reserved.

  6. Probability of brittle failure

    Science.gov (United States)

    Kim, A.; Bosnyak, C. P.; Chudnovsky, A.

    1991-01-01

    A methodology was developed for collecting statistically representative data for crack initiation and arrest from a small number of test specimens. An epoxy (based on bisphenol A diglycidyl ether and polyglycol extended diglycyl ether and cured with diethylene triamine) is selected as a model material. A compact tension specimen with displacement controlled loading is used to observe multiple crack initiations and arrests. The energy release rate at crack initiation is significantly higher than that at a crack arrest, as has been observed elsewhere. The difference between these energy release rates is found to depend on specimen size (scale effect), and is quantitatively related to the fracture surface morphology. The scale effect, similar to that in statistical strength theory, is usually attributed to the statistics of defects which control the fracture process. Triangular shaped ripples (deltoids) are formed on the fracture surface during the slow subcritical crack growth, prior to the smooth mirror-like surface characteristic of fast cracks. The deltoids are complementary on the two crack faces, which excludes any inelastic deformation from consideration. The presence of defects is also suggested by the observed scale effect. However, there are no defects at the deltoid apexes detectable down to the 0.1 micron level.

  7. Characteristic length of the knotting probability revisited

    International Nuclear Information System (INIS)

    Uehara, Erica; Deguchi, Tetsuo

    2015-01-01

    We present a self-avoiding polygon (SAP) model for circular DNA in which the radius of impermeable cylindrical segments corresponds to the screening length of double-stranded DNA surrounded by counter ions. For the model we evaluate the probability for a generated SAP with N segments having a given knot K through simulation. We call it the knotting probability of a knot K with N segments for the SAP model. We show that when N is large the most significant factor in the knotting probability is given by the exponentially decaying part exp(−N/N_K), where the estimates of parameter N_K are consistent with the same value for all the different knots we investigated. We thus call it the characteristic length of the knotting probability. We give formulae expressing the characteristic length as a function of the cylindrical radius r_ex, i.e. the screening length of double-stranded DNA. (paper)
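
    Once knotting probabilities have been estimated by simulation, the characteristic length can be read off a log-linear fit, since for large N the dominant factor is exp(−N/N_K). A sketch with synthetic stand-in probabilities, not the paper's data:

        # Recover N_K from the slope of log P_K(N) versus N.
        import numpy as np

        N = np.array([200, 400, 600, 800, 1000])
        true_NK = 300.0
        noise = 1 + 0.02 * np.random.default_rng(2).normal(size=N.size)
        P = 0.8 * np.exp(-N / true_NK) * noise          # synthetic knotting probabilities

        slope, intercept = np.polyfit(N, np.log(P), 1)  # log-linear fit
        print("estimated N_K =", -1.0 / slope)          # close to 300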

  8. Prediction and probability in sciences

    International Nuclear Information System (INIS)

    Klein, E.; Sacquin, Y.

    1998-01-01

    This book reports the 7 presentations made at the third meeting 'physics and fundamental questions' whose theme was probability and prediction. The concept of probability that was invented to apprehend random phenomena has become an important branch of mathematics and its application range spreads from radioactivity to species evolution via cosmology or the management of very weak risks. The notion of probability is the basis of quantum mechanics and then is bound to the very nature of matter. The 7 topics are: - radioactivity and probability, - statistical and quantum fluctuations, - quantum mechanics as a generalized probability theory, - probability and the irrational efficiency of mathematics, - can we foresee the future of the universe?, - chance, eventuality and necessity in biology, - how to manage weak risks? (A.C.)

  9. Applied probability and stochastic processes

    CERN Document Server

    Sumita, Ushio

    1999-01-01

    Applied Probability and Stochastic Processes is an edited work written in honor of Julien Keilson. This volume has attracted a host of scholars in applied probability, who have made major contributions to the field, and have written survey and state-of-the-art papers on a variety of applied probability topics, including, but not limited to: perturbation method, time reversible Markov chains, Poisson processes, Brownian techniques, Bayesian probability, optimal quality control, Markov decision processes, random matrices, queueing theory and a variety of applications of stochastic processes. The book has a mixture of theoretical, algorithmic, and application chapters providing examples of the cutting-edge work that Professor Keilson has done or influenced over the course of his highly-productive and energetic career in applied probability and stochastic processes. The book will be of interest to academic researchers, students, and industrial practitioners who seek to use the mathematics of applied probability i...

  10. Pre-Aggregation with Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    2006-01-01

    Motivated by the increasing need to analyze complex, uncertain multidimensional data, this paper proposes probabilistic OLAP queries that are computed using probability distributions rather than atomic values. The paper describes how to create probability distributions from base data, and how...... the distributions can be subsequently used in pre-aggregation. Since the probability distributions can become large, we show how to achieve good time and space efficiency by approximating the distributions. We present the results of several experiments that demonstrate the effectiveness of our methods. The work...... is motivated with a real-world case study, based on our collaboration with a leading Danish vendor of location-based services. This paper is the first to consider the approximate processing of probabilistic OLAP queries over probability distributions....
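
    As a generic illustration of aggregating uncertain measures as distributions rather than atomic values (not the paper's algorithm): the SUM of two independent uncertain cell values is the convolution of their distributions, which can then be coarsened to bound storage:

        # Distribution-valued aggregation with a crude bucket approximation.
        import numpy as np

        # P(value = index) for two uncertain cells, each over counts 0..3
        cell_a = np.array([0.1, 0.4, 0.4, 0.1])
        cell_b = np.array([0.2, 0.5, 0.3, 0.0])

        sum_dist = np.convolve(cell_a, cell_b)                  # totals 0..6
        padded = np.append(sum_dist, 0.0)                       # pad to even length
        buckets = np.add.reduceat(padded, np.arange(0, 8, 2))   # width-2 buckets
        print(sum_dist, buckets, sep="\n")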

  11. Poisson Processes in Free Probability

    OpenAIRE

    An, Guimei; Gao, Mingchu

    2015-01-01

    We prove a multidimensional Poisson limit theorem in free probability, and define joint free Poisson distributions in a non-commutative probability space. We define (compound) free Poisson processes explicitly, similar to the definitions of (compound) Poisson processes in classical probability. We prove that the sum of finitely many freely independent compound free Poisson processes is a compound free Poisson process. We give a step by step procedure for constructing a (compound) free Poisso...

  12. PROBABILITY SURVEYS, CONDITIONAL PROBABILITIES AND ECOLOGICAL RISK ASSESSMENT

    Science.gov (United States)

    We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...

  13. Predicting significant torso trauma.

    Science.gov (United States)

    Nirula, Ram; Talmor, Daniel; Brasel, Karen

    2005-07-01

    Identification of motor vehicle crash (MVC) characteristics associated with thoracoabdominal injury would advance the development of automatic crash notification systems (ACNS) by improving triage and response times. Our objective was to determine the relationships between MVC characteristics and thoracoabdominal trauma to develop a torso injury probability model. Drivers involved in crashes from 1993 to 2001 within the National Automotive Sampling System were reviewed. Relationships between torso injury and MVC characteristics were assessed using multivariate logistic regression. Receiver operating characteristic curves were used to compare the model to current ACNS models. There were a total of 56,466 drivers. Age, ejection, braking, avoidance, velocity, restraints, passenger-side impact, rollover, and vehicle weight and type were associated with injury (p < 0.05). The area under the receiver operating characteristic curve (83.9) was significantly greater than current ACNS models. We have developed a thoracoabdominal injury probability model that may improve patient triage when used with ACNS.
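
    The modeling step described here is a multivariate logistic regression scored by the area under the ROC curve. A sketch on synthetic placeholder data (the feature set and coefficients are invented, not NASS estimates):

        # Logistic model of a binary torso-injury outcome on crash characteristics.
        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(3)
        n = 2000
        X = np.column_stack([
            rng.normal(40, 15, n),   # crash velocity (km/h)
            rng.integers(0, 2, n),   # ejection (0/1)
            rng.integers(0, 2, n),   # restrained (0/1)
            rng.integers(0, 2, n),   # rollover (0/1)
        ])
        logit = -4 + 0.06 * X[:, 0] + 1.5 * X[:, 1] - 0.8 * X[:, 2] + 1.0 * X[:, 3]
        y = rng.random(n) < 1 / (1 + np.exp(-logit))   # simulated injury outcomes

        model = LogisticRegression(max_iter=1000).fit(X, y)
        print("AUC:", roc_auc_score(y, model.predict_proba(X)[:, 1]))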

  14. Probability inequalities for decomposition integrals

    Czech Academy of Sciences Publication Activity Database

    Agahi, H.; Mesiar, Radko

    2017-01-01

    Roč. 315, č. 1 (2017), s. 240-248 ISSN 0377-0427 Institutional support: RVO:67985556 Keywords : Decomposition integral * Superdecomposition integral * Probability inequalities Subject RIV: BA - General Mathematics OBOR OECD: Statistics and probability Impact factor: 1.357, year: 2016 http://library.utia.cas.cz/separaty/2017/E/mesiar-0470959.pdf

  15. Expected utility with lower probabilities

    DEFF Research Database (Denmark)

    Hendon, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte

    1994-01-01

    An uncertain and not just risky situation may be modeled using so-called belief functions assigning lower probabilities to subsets of outcomes. In this article we extend the von Neumann-Morgenstern expected utility theory from probability measures to belief functions. We use this theory...

  16. Introduction to probability with Mathematica

    CERN Document Server

    Hastings, Kevin J

    2009-01-01

    Discrete ProbabilityThe Cast of Characters Properties of Probability Simulation Random SamplingConditional ProbabilityIndependenceDiscrete DistributionsDiscrete Random Variables, Distributions, and ExpectationsBernoulli and Binomial Random VariablesGeometric and Negative Binomial Random Variables Poisson DistributionJoint, Marginal, and Conditional Distributions More on ExpectationContinuous ProbabilityFrom the Finite to the (Very) Infinite Continuous Random Variables and DistributionsContinuous ExpectationContinuous DistributionsThe Normal Distribution Bivariate Normal DistributionNew Random Variables from OldOrder Statistics Gamma DistributionsChi-Square, Student's t, and F-DistributionsTransformations of Normal Random VariablesAsymptotic TheoryStrong and Weak Laws of Large Numbers Central Limit TheoremStochastic Processes and ApplicationsMarkov ChainsPoisson Processes QueuesBrownian MotionFinancial MathematicsAppendixIntroduction to Mathematica Glossary of Mathematica Commands for Probability Short Answers...

  17. High throughput nonparametric probability density estimation.

    Science.gov (United States)

    Farmer, Jenny; Jacobs, Donald

    2018-01-01

    In high throughput applications, such as those found in bioinformatics and finance, it is important to determine accurate probability distribution functions despite only minimal information about data characteristics, and without using human subjectivity. Such an automated process for univariate data is implemented to achieve this goal by merging the maximum entropy method with single order statistics and maximum likelihood. The only required properties of the random variables are that they are continuous and that they are, or can be approximated as, independent and identically distributed. A quasi-log-likelihood function based on single order statistics for sampled uniform random data is used to empirically construct a sample size invariant universal scoring function. Then a probability density estimate is determined by iteratively improving trial cumulative distribution functions, where better estimates are quantified by the scoring function that identifies atypical fluctuations. This criterion resists under- and over-fitting data as an alternative to employing the Bayesian or Akaike information criterion. Multiple estimates for the probability density reflect uncertainties due to statistical fluctuations in random samples. Scaled quantile residual plots are also introduced as an effective diagnostic to visualize the quality of the estimated probability densities. Benchmark tests show that estimates for the probability density function (PDF) converge to the true PDF as sample size increases on particularly difficult test probability densities that include cases with discontinuities, multi-resolution scales, heavy tails, and singularities. These results indicate the method has general applicability for high throughput statistical inference.
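
    The paper's estimator merges maximum entropy with order statistics; purely as a generic point of comparison, the snippet below shows the shape of any such benchmark, estimating a known density nonparametrically (here a plain Gaussian KDE, not the paper's method) and checking convergence with sample size:

        # Benchmark pattern: sample a known PDF, estimate it, measure the error.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(4)
        for n in (100, 1000, 10000):
            sample = rng.standard_normal(n)
            kde = stats.gaussian_kde(sample)
            grid = np.linspace(-3, 3, 61)
            err = np.max(np.abs(kde(grid) - stats.norm.pdf(grid)))
            print(n, round(float(err), 4))   # max error shrinks as n grows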

  18. Linear positivity and virtual probability

    International Nuclear Information System (INIS)

    Hartle, James B.

    2004-01-01

    We investigate the quantum theory of closed systems based on the linear positivity decoherence condition of Goldstein and Page. The objective of any quantum theory of a closed system, most generally the universe, is the prediction of probabilities for the individual members of sets of alternative coarse-grained histories of the system. Quantum interference between members of a set of alternative histories is an obstacle to assigning probabilities that are consistent with the rules of probability theory. A quantum theory of closed systems therefore requires two elements: (1) a condition specifying which sets of histories may be assigned probabilities and (2) a rule for those probabilities. The linear positivity condition of Goldstein and Page is the weakest of the general conditions proposed so far. Its general properties relating to exact probability sum rules, time neutrality, and conservation laws are explored. Its inconsistency with the usual notion of independent subsystems in quantum mechanics is reviewed. Its relation to the stronger condition of medium decoherence necessary for classicality is discussed. The linear positivity of histories in a number of simple model systems is investigated with the aim of exhibiting linearly positive sets of histories that are not decoherent. The utility of extending the notion of probability to include values outside the range of 0-1 is described. Alternatives with such virtual probabilities cannot be measured or recorded, but can be used in the intermediate steps of calculations of real probabilities. Extended probabilities give a simple and general way of formulating quantum theory. The various decoherence conditions are compared in terms of their utility for characterizing classicality and the role they might play in further generalizations of quantum mechanics

  19. Calculating the Probability of Returning a Loan with Binary Probability Models

    Directory of Open Access Journals (Sweden)

    Julian Vasilev

    2014-12-01

    Full Text Available The purpose of this article is to give a new approach in calculating the probability of returning a loan. A lot of factors affect the value of the probability. In this article, some influencing factors are identified using statistical and econometric models. The main approach is concerned with applying probit and logit models in loan management institutions. A new aspect of the credit risk analysis is given. Calculating the probability of returning a loan is a difficult task. We assume that specific data fields concerning the contract (month of signing, year of signing, given sum) and data fields concerning the borrower of the loan (month of birth, year of birth (age), gender, region where he/she lives) may be independent variables in a binary logistic model with the dependent variable “the probability of returning a loan”. It is proved that the month of signing a contract, the year of signing a contract, the gender and the age of the loan owner do not affect the probability of returning a loan. It is proved that the probability of returning a loan depends on the sum of contract, the remoteness of the loan owner and the month of birth. The probability of returning a loan increases with the increase of the given sum, decreases with the proximity of the customer, increases for people born in the beginning of the year and decreases for people born at the end of the year.
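
    A sketch of the binary-response setup the article describes, a logit model of repayment on contract and borrower attributes, using the third-party statsmodels package; the columns, coefficients and data are invented for illustration:

        # Logit model of a binary "repaid" outcome on loan and borrower fields.
        import numpy as np
        import pandas as pd
        import statsmodels.api as sm

        rng = np.random.default_rng(5)
        n = 500
        df = pd.DataFrame({
            "loan_sum":    rng.uniform(1, 50, n),    # thousands, illustrative
            "remoteness":  rng.uniform(0, 300, n),   # km from branch, illustrative
            "birth_month": rng.integers(1, 13, n),
        })
        eta = -0.5 + 0.04 * df.loan_sum - 0.004 * df.remoteness - 0.05 * df.birth_month
        df["repaid"] = (rng.random(n) < 1 / (1 + np.exp(-eta))).astype(int)

        X = sm.add_constant(df[["loan_sum", "remoteness", "birth_month"]])
        result = sm.Logit(df["repaid"], X).fit(disp=False)
        print(result.params)   # fitted coefficients on the simulated data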

  20. Probability Machines: Consistent Probability Estimation Using Nonparametric Learning Machines

    Science.gov (United States)

    Malley, J. D.; Kruppa, J.; Dasgupta, A.; Malley, K. G.; Ziegler, A.

    2011-01-01

    Summary Background Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. Objectives The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Methods Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Results Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Conclusions Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications. PMID:21915433
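
    The abstract's point can be shown in a few lines of scikit-learn (the paper's own examples are in R): a consistent learning machine returns individual class probabilities, not just labels, via predict_proba:

        # Random forest as a probability machine on a bundled diagnosis dataset.
        from sklearn.datasets import load_breast_cancer
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import train_test_split

        X, y = load_breast_cancer(return_X_y=True)
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

        rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X_tr, y_tr)
        probs = rf.predict_proba(X_te)[:, 1]   # individual P(class = 1) estimates
        print(probs[:5].round(3))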

  1. Probable Inference and Quantum Mechanics

    International Nuclear Information System (INIS)

    Grandy, W. T. Jr.

    2009-01-01

    In its current very successful interpretation the quantum theory is fundamentally statistical in nature. Although commonly viewed as a probability amplitude whose (complex) square is a probability, the wavefunction or state vector continues to defy consensus as to its exact meaning, primarily because it is not a physical observable. Rather than approach this problem directly, it is suggested that it is first necessary to clarify the precise role of probability theory in quantum mechanics, either as applied to, or as an intrinsic part of the quantum theory. When all is said and done the unsurprising conclusion is that quantum mechanics does not constitute a logic and probability unto itself, but adheres to the long-established rules of classical probability theory while providing a means within itself for calculating the relevant probabilities. In addition, the wavefunction is seen to be a description of the quantum state assigned by an observer based on definite information, such that the same state must be assigned by any other observer based on the same information, in much the same way that probabilities are assigned.

  2. Time Dependence of Collision Probabilities During Satellite Conjunctions

    Science.gov (United States)

    Hall, Doyle T.; Hejduk, Matthew D.; Johnson, Lauren C.

    2017-01-01

    The NASA Conjunction Assessment Risk Analysis (CARA) team has recently implemented updated software to calculate the probability of collision (P_c) for Earth-orbiting satellites. The algorithm can employ complex dynamical models for orbital motion, and account for the effects of non-linear trajectories as well as both position and velocity uncertainties. This “3D P_c” method entails computing a 3-dimensional numerical integral for each estimated probability. Our analysis indicates that the 3D method provides several new insights over the traditional “2D P_c” method, even when approximating the orbital motion using the relatively simple Keplerian two-body dynamical model. First, the formulation provides the means to estimate variations in the time derivative of the collision probability, or the probability rate, R_c. For close-proximity satellites, such as those orbiting in formations or clusters, R_c variations can show multiple peaks that repeat or blend with one another, providing insight into the ongoing temporal distribution of risk. For single, isolated conjunctions, R_c analysis provides the means to identify and bound the times of peak collision risk. Additionally, analysis of multiple actual archived conjunctions demonstrates that the commonly used “2D P_c” approximation can occasionally provide inaccurate estimates. These include cases in which the 2D method yields negligibly small probabilities (e.g., P_c < 10^-10), but the 3D estimates are sufficiently large to prompt increased monitoring or collision mitigation (e.g., P_c ≥ 10^-5). Finally, the archive analysis indicates that a relatively efficient calculation can be used to identify which conjunctions will have negligibly small probabilities. This small-P_c screening test can significantly speed the overall risk analysis computation for large numbers of conjunctions.
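
    For contrast with the 3D method, the conventional short-encounter "2D P_c" calculation integrates a 2D Gaussian (the combined position covariance projected into the encounter plane) over the combined hard-body circle. A sketch with illustrative numbers, not CARA's implementation:

        # 2D collision probability: Gaussian integrated over a hard-body disk.
        import numpy as np
        from scipy import integrate

        mu = np.array([150.0, 40.0])        # miss vector in the encounter plane (m)
        C = np.array([[2500.0, 300.0],
                      [300.0, 900.0]])      # projected position covariance (m^2)
        R = 20.0                            # combined hard-body radius (m)

        Cinv = np.linalg.inv(C)
        norm = 1.0 / (2.0 * np.pi * np.sqrt(np.linalg.det(C)))

        def pdf(y, x):
            d = np.array([x, y]) - mu
            return norm * np.exp(-0.5 * d @ Cinv @ d)

        pc, _ = integrate.dblquad(pdf, -R, R,
                                  lambda x: -np.sqrt(R**2 - x**2),
                                  lambda x: np.sqrt(R**2 - x**2))
        print(f"2D Pc ~ {pc:.2e}")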

  3. Probability with applications and R

    CERN Document Server

    Dobrow, Robert P

    2013-01-01

    An introduction to probability at the undergraduate level Chance and randomness are encountered on a daily basis. Authored by a highly qualified professor in the field, Probability: With Applications and R delves into the theories and applications essential to obtaining a thorough understanding of probability. With real-life examples and thoughtful exercises from fields as diverse as biology, computer science, cryptology, ecology, public health, and sports, the book is accessible for a variety of readers. The book's emphasis on simulation through the use of the popular R software language c

  4. A philosophical essay on probabilities

    CERN Document Server

    Laplace, Marquis de

    1996-01-01

    A classic of science, this famous essay by "the Newton of France" introduces lay readers to the concepts and uses of probability theory. It is of especial interest today as an application of mathematical techniques to problems in social and biological sciences. Generally recognized as the founder of the modern phase of probability theory, Laplace here applies the principles and general results of his theory "to the most important questions of life, which are, in effect, for the most part, problems in probability." Thus, without the use of higher mathematics, he demonstrates the application

  5. Tropical Cyclone Wind Probability Forecasting (WINDP).

    Science.gov (United States)

    1981-04-01

    The accuracy of small probabilities (below 10%) is limited by the number of significant digits given; therefore they should be regarded as being...

  6. Logic, probability, and human reasoning.

    Science.gov (United States)

    Johnson-Laird, P N; Khemlani, Sangeet S; Goodwin, Geoffrey P

    2015-04-01

    This review addresses the long-standing puzzle of how logic and probability fit together in human reasoning. Many cognitive scientists argue that conventional logic cannot underlie deductions, because it never requires valid conclusions to be withdrawn - not even if they are false; it treats conditional assertions implausibly; and it yields many vapid, although valid, conclusions. A new paradigm of probability logic allows conclusions to be withdrawn and treats conditionals more plausibly, although it does not address the problem of vapidity. The theory of mental models solves all of these problems. It explains how people reason about probabilities and postulates that the machinery for reasoning is itself probabilistic. Recent investigations accordingly suggest a way to integrate probability and deduction. Copyright © 2015 Elsevier Ltd. All rights reserved.

  7. Free probability and random matrices

    CERN Document Server

    Mingo, James A

    2017-01-01

    This volume opens the world of free probability to a wide variety of readers. From its roots in the theory of operator algebras, free probability has intertwined with non-crossing partitions, random matrices, applications in wireless communications, representation theory of large groups, quantum groups, the invariant subspace problem, large deviations, subfactors, and beyond. This book puts a special emphasis on the relation of free probability to random matrices, but also touches upon the operator algebraic, combinatorial, and analytic aspects of the theory. The book serves as a combination textbook/research monograph, with self-contained chapters, exercises scattered throughout the text, and coverage of important ongoing progress of the theory. It will appeal to graduate students and all mathematicians interested in random matrices and free probability from the point of view of operator algebras, combinatorics, analytic functions, or applications in engineering and statistical physics.

  8. Introduction to probability and measure

    CERN Document Server

    Parthasarathy, K R

    2005-01-01

    According to a remark attributed to Mark Kac 'Probability Theory is a measure theory with a soul'. This book with its choice of proofs, remarks, examples and exercises has been prepared taking both these aesthetic and practical aspects into account.

  9. Joint probabilities and quantum cognition

    International Nuclear Information System (INIS)

    Acacio de Barros, J.

    2012-01-01

    In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.

  10. Joint probabilities and quantum cognition

    Energy Technology Data Exchange (ETDEWEB)

    Acacio de Barros, J. [Liberal Studies, 1600 Holloway Ave., San Francisco State University, San Francisco, CA 94132 (United States)

    2012-12-18

    In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.

  11. The Probabilities of Unique Events

    Science.gov (United States)

    2012-08-30

    ...social justice and also participated in antinuclear demonstrations. The participants ranked the probability that Linda is a feminist bank teller as...retorted that such a flagrant violation of the probability calculus was a result of a psychological experiment that obscured the rationality of the

  12. Probability Matching, Fast and Slow

    OpenAIRE

    Koehler, Derek J.; James, Greta

    2014-01-01

    A prominent point of contention among researchers regarding the interpretation of probability-matching behavior is whether it represents a cognitively sophisticated, adaptive response to the inherent uncertainty of the tasks or settings in which it is observed, or whether instead it represents a fundamental shortcoming in the heuristics that support and guide human decision making. Put crudely, researchers disagree on whether probability matching is "smart" or "dumb." Here, we consider eviden...

  13. Impact of MCNP Unresolved Resonance Probability-Table Treatment on Uranium and Plutonium Benchmarks

    International Nuclear Information System (INIS)

    Mosteller, R.D.; Little, R.C.

    1999-01-01

    A probability-table treatment recently has been incorporated into an intermediate version of the MCNP Monte Carlo code named MCNP4XS. This paper presents MCNP4XS results for a variety of uranium and plutonium criticality benchmarks, calculated with and without the probability-table treatment. It is shown that the probability-table treatment can produce small but significant reactivity changes for plutonium and 233U systems with intermediate spectra. More importantly, it can produce substantial reactivity increases for systems with large amounts of 238U and intermediate spectra.

  14. Uncertainty the soul of modeling, probability & statistics

    CERN Document Server

    Briggs, William

    2016-01-01

    This book presents a philosophical approach to probability and probabilistic thinking, considering the underpinnings of probabilistic reasoning and modeling, which effectively underlie everything in data science. The ultimate goal is to call into question many standard tenets and lay the philosophical and probabilistic groundwork and infrastructure for statistical modeling. It is the first book devoted to the philosophy of data aimed at working scientists and calls for a new consideration in the practice of probability and statistics to eliminate what has been referred to as the "Cult of Statistical Significance". The book explains the philosophy of these ideas and not the mathematics, though there are a handful of mathematical examples. The topics are logically laid out, starting with basic philosophy as related to probability, statistics, and science, and stepping through the key probabilistic ideas and concepts, and ending with statistical models. Its jargon-free approach asserts that standard methods, suc...

  15. Probability analysis of nuclear power plant hazards

    International Nuclear Information System (INIS)

    Kovacs, Z.

    1985-01-01

    The probability analysis of risk, used for quantifying the risk of complex technological systems, especially nuclear power plants, is described. Risk is defined as the product of the probability of the occurrence of a dangerous event and the significance of its consequences. The process of the analysis may be divided into the stage of power plant analysis up to the point of release of harmful material into the environment (reliability analysis) and the stage of the analysis of the consequences of this release and the assessment of the risk. The sequence of operations in the individual stages is characterized. The tasks which Czechoslovakia faces in the development of the probability analysis of risk are listed, and the composition of the work team for coping with the task is recommended. (J.C.)

  16. Correlations and Non-Linear Probability Models

    DEFF Research Database (Denmark)

    Breen, Richard; Holm, Anders; Karlson, Kristian Bernt

    2014-01-01

    Although the parameters of logit and probit and other non-linear probability models are often explained and interpreted in relation to the regression coefficients of an underlying linear latent variable model, we argue that they may also be usefully interpreted in terms of the correlations between the dependent variable of the latent variable model and its predictor variables. We show how this correlation can be derived from the parameters of non-linear probability models, develop tests for the statistical significance of the derived correlation, and illustrate its usefulness in two applications. Under certain circumstances, which we explain, the derived correlation provides a way of overcoming the problems inherent in cross-sample comparisons of the parameters of non-linear probability models.

  17. Prospect evaluation as a function of numeracy and probability denominator.

    Science.gov (United States)

    Millroth, Philip; Juslin, Peter

    2015-05-01

    This study examines how numeracy and probability denominator (a direct-ratio probability, a relative frequency with denominator 100, a relative frequency with denominator 10,000) affect the evaluation of prospects in an expected-value based pricing task. We expected that numeracy would affect the results due to differences in the linearity of number perception and the susceptibility to denominator neglect with different probability formats. An analysis with functional measurement verified that participants integrated value and probability into an expected value. However, a significant interaction between numeracy and probability format and subsequent analyses of the parameters of cumulative prospect theory showed that the manipulation of probability denominator changed participants' psychophysical response to probability and value. Standard methods in decision research may thus confound people's genuine risk attitude with their numerical capacities and the probability format used. Copyright © 2015 Elsevier B.V. All rights reserved.
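
    For reference, a common one-parameter probability weighting function from cumulative prospect theory is the Tversky-Kahneman form sketched below; whether this exact parameterization matches the one fitted in the study is an assumption.

        def w(p, gamma):
            # Tversky-Kahneman (1992) probability weighting function;
            # gamma < 1 yields the inverse-S shape (overweighting small p,
            # underweighting large p), i.e. greater curvature as gamma falls
            return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

        for p in (0.01, 0.1, 0.5, 0.9, 0.99):
            print(p, round(w(p, 0.61), 3))   # 0.61: their estimate for gains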

  18. Geometric modeling in probability and statistics

    CERN Document Server

    Calin, Ovidiu

    2014-01-01

    This book covers topics of Informational Geometry, a field which deals with the differential geometric study of the manifold of probability density functions. This is a field that is increasingly attracting the interest of researchers from many different areas of science, including mathematics, statistics, geometry, computer science, signal processing, physics and neuroscience. It is the authors’ hope that the present book will be a valuable reference for researchers and graduate students in one of the aforementioned fields. This textbook is a unified presentation of differential geometry and probability theory, and constitutes a text for a course directed at graduate or advanced undergraduate students interested in applications of differential geometry in probability and statistics. The book contains over 100 proposed exercises meant to help students deepen their understanding, and it is accompanied by software that is able to provide numerical computations of several information geometric objects. The reader...

  19. Fixation probability on clique-based graphs

    Science.gov (United States)

    Choi, Jeong-Ok; Yu, Unjong

    2018-02-01

    The fixation probability of a mutant in the evolutionary dynamics of Moran process is calculated by the Monte-Carlo method on a few families of clique-based graphs. It is shown that the complete suppression of fixation can be realized with the generalized clique-wheel graph in the limit of small wheel-clique ratio and infinite size. The family of clique-star is an amplifier, and clique-arms graph changes from amplifier to suppressor as the fitness of the mutant increases. We demonstrate that the overall structure of a graph can be more important to determine the fixation probability than the degree or the heat heterogeneity. The dependence of the fixation probability on the position of the first mutant is discussed.
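
    A generic Monte-Carlo sketch of the Moran birth-death process on a graph given as adjacency lists follows (an illustration of the method, not the authors' code; the clique-based families would be supplied as adjacency lists in the same way).

        import random

        def moran_fixation(adj, r, trials=2000, rng=random.Random(1)):
            # Monte-Carlo fixation probability of a single mutant with
            # fitness r in the Moran birth-death process on a graph
            n = len(adj)
            fixed = 0
            for _ in range(trials):
                mutant = [False] * n
                mutant[rng.randrange(n)] = True   # first mutant at a random vertex
                count = 1
                while 0 < count < n:
                    # reproducing vertex chosen proportionally to fitness
                    weights = [r if mutant[v] else 1.0 for v in range(n)]
                    parent = rng.choices(range(n), weights=weights)[0]
                    child = rng.choice(adj[parent])  # offspring replaces a neighbour
                    if mutant[child] != mutant[parent]:
                        count += 1 if mutant[parent] else -1
                        mutant[child] = mutant[parent]
                fixed += count == n
            return fixed / trials

        # sanity check on the complete graph K5, where theory gives
        # rho = (1 - 1/r) / (1 - 1/r**n)
        adj = [[v for v in range(5) if v != u] for u in range(5)]
        print(moran_fixation(adj, r=1.5))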

  20. Probably not future prediction using probability and statistical inference

    CERN Document Server

    Dworsky, Lawrence N

    2008-01-01

    An engaging, entertaining, and informative introduction to probability and prediction in our everyday lives Although Probably Not deals with probability and statistics, it is not heavily mathematical and is not filled with complex derivations, proofs, and theoretical problem sets. This book unveils the world of statistics through questions such as what is known based upon the information at hand and what can be expected to happen. While learning essential concepts including "the confidence factor" and "random walks," readers will be entertained and intrigued as they move from chapter to chapter. Moreover, the author provides a foundation of basic principles to guide decision making in almost all facets of life including playing games, developing winning business strategies, and managing personal finances. Much of the book is organized around easy-to-follow examples that address common, everyday issues such as: How travel time is affected by congestion, driving speed, and traffic lights Why different gambling ...

  1. Infections and mixed infections with the selected species of Borrelia burgdorferi sensu lato complex in Ixodes ricinus ticks collected in eastern Poland: a significant increase in the course of 5 years.

    Science.gov (United States)

    Wójcik-Fatla, Angelina; Zając, Violetta; Sawczyn, Anna; Sroka, Jacek; Cisak, Ewa; Dutkiewicz, Jacek

    2016-02-01

    In the years 2008-2009 and 2013-2014, 1620 and 1500 questing Ixodes ricinus ticks, respectively, were examined on the territory of the Lublin province (eastern Poland). The presence of three pathogenic species causing Lyme disease was investigated: Borrelia burgdorferi sensu stricto, B. afzelii and B. garinii. The proportion of I. ricinus ticks infected with B. burgdorferi sensu lato showed a highly significant increase between 2008-2009 and 2013-2014, from 6.0 to 15.3%. A significant increase was noted with regard to all types of infections with individual species: single (4.7-7.8%), dual (1.2-6.6%), and triple (0.1-0.9%). When expressed as the percent of all infections, the frequency of mixed infections increased from 21.4 to 49.2%. Statistical analysis performed with two methods (by calculating odds ratios and by Fisher's exact test) showed that the frequencies of mixed infections in most cases proved to be significantly greater than expected. The strongest associations were found between B. burgdorferi s. s. and B. afzelii, and between B. burgdorferi s. s. and B. garinii; both appeared to be highly significant. These results indicate a marked increase in the infection of I. ricinus ticks with Lyme disease spirochetes in eastern Poland, and a dramatic enhancement of mixed infections with individual species, which may result in mixed infections of humans and exacerbation of the clinical course of Lyme disease cases in the studied area.
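
    The headline comparison can be checked approximately with Fisher's exact test, one of the two methods named above; the counts below (97 of 1620 and 230 of 1500) are reconstructed from the reported 6.0% and 15.3% and are therefore approximations.

        from scipy.stats import fisher_exact

        # infected vs. uninfected ticks in the two sampling periods
        table = [[97, 1620 - 97],     # 2008-2009: ~6.0% of 1620
                 [230, 1500 - 230]]   # 2013-2014: ~15.3% of 1500
        odds_ratio, p_value = fisher_exact(table)
        print(odds_ratio, p_value)    # a highly significant increase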

  2. Normal probability plots with confidence.

    Science.gov (United States)

    Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang

    2015-01-01

    Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means to judge whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal probability plot based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
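
    A rough Python sketch of the idea follows, using pointwise simulation envelopes; note that this is only an approximation of the paper's construction, whose simultaneous 1-α intervals are necessarily wider.

        import numpy as np
        from scipy import stats
        import matplotlib.pyplot as plt

        rng = np.random.default_rng(0)
        x = np.sort(rng.normal(size=50))
        n = len(x)
        q = stats.norm.ppf((np.arange(1, n + 1) - 0.5) / n)  # theoretical quantiles

        # pointwise envelope from many standard-normal samples (an approximation;
        # the paper constructs *simultaneous* 1-alpha intervals instead)
        sims = np.sort(rng.normal(size=(5000, n)), axis=1)
        lo, hi = np.percentile(sims, [2.5, 97.5], axis=0)

        z = (x - x.mean()) / x.std(ddof=1)   # standardise the observed sample
        plt.plot(q, z, "o")
        plt.plot(q, q, "-")
        plt.fill_between(q, lo, hi, alpha=0.3)
        plt.xlabel("theoretical quantile")
        plt.ylabel("ordered sample")
        plt.show()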

  3. Numeracy moderates the influence of task-irrelevant affect on probability weighting.

    Science.gov (United States)

    Traczyk, Jakub; Fulawka, Kamil

    2016-06-01

    Statistical numeracy, defined as the ability to understand and process statistical and probability information, plays a significant role in superior decision making. However, recent research has demonstrated that statistical numeracy goes beyond simple comprehension of numbers and mathematical operations. In contrast to previous studies, which focused on emotions integral to risky prospects, we hypothesized that highly numerate individuals would exhibit more linear probability weighting because they would be less biased by incidental and decision-irrelevant affect. Participants were instructed to make a series of insurance decisions preceded by negative (i.e., fear-inducing) or neutral stimuli. We found that incidental negative affect increased the curvature of the probability weighting function (PWF). Interestingly, this effect was significant only for less numerate individuals, while probability weighting in more numerate people was not altered by decision-irrelevant affect. We propose two candidate mechanisms for the observed effect. Copyright © 2016 Elsevier B.V. All rights reserved.

  4. Gadolinium-enhanced cardiac MR exams of human subjects are associated with significant increases in the DNA repair marker 53BP1, but not the damage marker γH2AX.

    Directory of Open Access Journals (Sweden)

    Jennifer S McDonald

    Magnetic resonance imaging is considered low risk, yet recent studies have raised a concern of potential damage to DNA in peripheral blood leukocytes. This prospective Institutional Review Board-approved study examined potential double-strand DNA damage by analyzing changes in the DNA damage and repair markers γH2AX and 53BP1 in patients who underwent a 1.5 T gadolinium-enhanced cardiac magnetic resonance (MR) exam. Sixty patients were enrolled (median age 55 years, 39 males). Patients with a history of malignancy or who were receiving chemotherapy, radiation therapy, or steroids were excluded. MR sequence data were recorded and blood samples obtained immediately before and after MR exposure. An automated immunofluorescence assay quantified γH2AX or 53BP1 foci number in isolated peripheral blood mononuclear cells. Changes in foci number were analyzed using the Wilcoxon signed-rank test. Clinical and MR procedural characteristics were compared between patients who had a >10% increase in γH2AX or 53BP1 foci numbers and patients who did not. The number of γH2AX foci did not significantly change following cardiac MR (median foci per cell pre-MR = 0.11, post-MR = 0.11, p = .90), but the number of 53BP1 foci significantly increased following MR (median foci per cell pre-MR = 0.46, post-MR = 0.54, p = .0140). Clinical and MR characteristics did not differ significantly between patients who had at least a 10% increase in foci per cell and those who did not. We conclude that MR exposure leads to a small (median 25%) increase in 53BP1 foci; however, the clinical relevance of this increase is unknown and may be attributable to normal variation instead of MR exposure.

  5. Probability theory a foundational course

    CERN Document Server

    Pakshirajan, R P

    2013-01-01

    This book shares the dictum of J. L. Doob in treating Probability Theory as a branch of Measure Theory and establishes this relation early. Probability measures in product spaces are introduced right at the start by way of laying the groundwork to later claim the existence of stochastic processes with prescribed finite dimensional distributions. Other topics analysed in the book include supports of probability measures, zero-one laws in product measure spaces, the Erdos-Kac invariance principle, the functional central limit theorem and functional law of the iterated logarithm for independent variables, Skorohod embedding, and the use of analytic functions of a complex variable in the study of geometric ergodicity in Markov chains. This book is offered as a textbook for students pursuing graduate programs in Mathematics and/or Statistics. The book aims to help the teacher present the theory with ease, and to help the student sustain his interest and joy in learning the subject.

  6. VIBRATION ISOLATION SYSTEM PROBABILITY ANALYSIS

    Directory of Open Access Journals (Sweden)

    Smirnov Vladimir Alexandrovich

    2012-10-01

    The article deals with the probability analysis of a vibration isolation system for high-precision equipment, which is extremely sensitive to low-frequency oscillations even of submicron amplitude. External sources of low-frequency vibration include the natural city background; internal sources inside buildings include pedestrian activity and HVAC equipment. Assuming a Gaussian distribution, the author estimates the probability that the relative displacement of the isolated mass remains below the vibration criterion. The problem is solved in the three-dimensional space spanned by the system parameters, including damping and natural frequency. From this probability distribution, the chance of exceeding the vibration criterion for a vibration isolation system is evaluated, and optimal system parameters (damping and natural frequency) are derived such that the probability of exceeding vibration criteria VC-E and VC-D is less than 0.04.
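
    Under the Gaussian assumption the central quantity is an exceedance probability; a minimal sketch with illustrative numbers (not those of the study):

        from scipy.stats import norm

        def p_exceed(sigma, criterion):
            # probability that a zero-mean Gaussian displacement response
            # exceeds the vibration criterion in either direction
            return 2 * norm.sf(criterion / sigma)

        # e.g. an RMS response of 0.4 um against a 1.0 um criterion
        print(p_exceed(0.4, 1.0))   # ~0.012, i.e. within a 0.04 target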

  7. Approximation methods in probability theory

    CERN Document Server

    Čekanavičius, Vydas

    2016-01-01

    This book presents a wide range of well-known and less common methods used for estimating the accuracy of probabilistic approximations, including the Esseen type inversion formulas, the Stein method as well as the methods of convolutions and triangle function. Emphasising the correct usage of the methods presented, each step required for the proofs is examined in detail. As a result, this textbook provides valuable tools for proving approximation theorems. While Approximation Methods in Probability Theory will appeal to everyone interested in limit theorems of probability theory, the book is particularly aimed at graduate students who have completed a standard intermediate course in probability theory. Furthermore, experienced researchers wanting to enlarge their toolkit will also find this book useful.

  8. Model uncertainty: Probabilities for models?

    International Nuclear Information System (INIS)

    Winkler, R.L.

    1994-01-01

    Like any other type of uncertainty, model uncertainty should be treated in terms of probabilities. The question is how to do this. The most commonly-used approach has a drawback related to the interpretation of the probabilities assigned to the models. If we step back and look at the big picture, asking what the appropriate focus of the model uncertainty question should be in the context of risk and decision analysis, we see that a different probabilistic approach makes more sense, although it raises some implementation questions. Current work that is underway to address these questions looks very promising.

  9. Knowledge typology for imprecise probabilities.

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, G. D. (Gregory D.); Zucker, L. J. (Lauren J.)

    2002-01-01

    When characterizing the reliability of a complex system there are often gaps in the data available for specific subsystems or other factors influencing total system reliability. At Los Alamos National Laboratory we employ ethnographic methods to elicit expert knowledge when traditional data is scarce. Typically, we elicit expert knowledge in probabilistic terms. This paper will explore how we might approach elicitation if methods other than probability (e.g., Dempster-Shafer theory or fuzzy sets) prove more useful for quantifying certain types of expert knowledge. Specifically, we will consider if experts have different types of knowledge that may be better characterized in ways other than standard probability theory.

  10. Probability, Statistics, and Stochastic Processes

    CERN Document Server

    Olofsson, Peter

    2011-01-01

    A mathematical and intuitive approach to probability, statistics, and stochastic processes This textbook provides a unique, balanced approach to probability, statistics, and stochastic processes. Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area. This text combines a rigorous, calculus-based development of theory with a more intuitive approach that appeals to readers' sense of reason and logic, an approach developed through the author's many years of classroom experience. The text begins with three chapters that d

  11. Statistical probability tables CALENDF program

    International Nuclear Information System (INIS)

    Ribon, P.

    1989-01-01

    The purposes of the probability tables are to obtain a dense data representation and to calculate integrals by quadratures. They are mainly used in the USA for Monte Carlo calculations and in the USSR and Europe for self-shielding calculations by the sub-group method. The moment probability tables, in addition to providing a more substantial mathematical basis and calculation methods, are adapted for condensation and mixture calculations, which are the crucial operations for reactor physics specialists. However, their extension is limited by the statistical hypothesis they imply. Efforts are being made to remove this obstacle, at the cost, it must be said, of greater complexity.

  12. Probability, statistics, and queueing theory

    CERN Document Server

    Allen, Arnold O

    1990-01-01

    This is a textbook on applied probability and statistics with computer science applications for students at the upper undergraduate level. It may also be used as a self study book for the practicing computer science professional. The successful first edition of this book proved extremely useful to students who need to use probability, statistics and queueing theory to solve problems in other fields, such as engineering, physics, operations research, and management science. The book has also been successfully used for courses in queueing theory for operations research students. This second edit

  13. Probability and Statistics: 5 Questions

    DEFF Research Database (Denmark)

    Probability and Statistics: 5 Questions is a collection of short interviews based on 5 questions presented to some of the most influential and prominent scholars in probability and statistics. We hear their views on the fields, aims, scopes, the future direction of research, and how their work fits in these respects. Interviews with Nick Bingham, Luc Bovens, Terrence L. Fine, Haim Gaifman, Donald Gillies, James Hawthorne, Carl Hoefer, James M. Joyce, Joseph B. Kadane, Isaac Levi, D.H. Mellor, Patrick Suppes, Jan von Plato, Carl Wagner, Sandy Zabell...

  14. Gas prices: realities and probabilities

    International Nuclear Information System (INIS)

    Broadfoot, M.

    2000-01-01

    An assessment of price trends suggests continuing rise in 2001, with some easing of upward price movement in 2002 and 2003. Storage levels as of Nov. 1, 2000 are expected to be at 2.77 Tcf, but if the winter of 2000/2001 proves to be more severe than usual, inventory levels could sink as low as 500 Bcf by April 1, 2001. With increasing demand for natural gas for non-utility electric power generation the major challenge will be to achieve significant supply growth, which means increased developmental drilling and inventory draw-downs, as well as more exploratory drilling in deepwater and frontier regions. Absence of a significant supply response by next summer will affect both growth in demand and in price levels, and the increased demand for electric generation in the summer will create a flatter consumption profile, erasing the traditional summer/winter spread in consumption, further intensifying price volatility. Managing price fluctuations is the second biggest challenge (after potential supply problems) facing the industry

  15. Dependent Human Error Probability Assessment

    International Nuclear Information System (INIS)

    Simic, Z.; Mikulicic, V.; Vukovic, I.

    2006-01-01

    This paper presents an assessment of the dependence between dynamic operator actions modeled in a Nuclear Power Plant (NPP) PRA and estimates the associated impact on core damage frequency (CDF). This assessment was done to improve the implementation of HEP dependencies within the existing PRA. All of the dynamic operator actions modeled in the NPP PRA are included in this assessment. Determining the level of HEP dependence and the associated influence on CDF are the major steps of this assessment. A decision on how to apply the results, i.e., whether permanent HEP model changes should be made, is based on the resulting relative CDF increase. A CDF increase threshold was selected based on the NPP base CDF value and acceptance guidelines from Regulatory Guide 1.174. HEP dependences resulting in a CDF increase of > 5E-07 would be considered potential candidates for specific incorporation into the baseline model. The approach used to judge the level of dependence between operator actions is based on dependency level categories and conditional probabilities developed in the Handbook of Human Reliability Analysis with Emphasis on Nuclear Power Plant Applications, NUREG/CR-1278. To simplify the process, NUREG/CR-1278 identifies five levels of dependence: ZD (zero dependence), LD (low dependence), MD (moderate dependence), HD (high dependence), and CD (complete dependence). NUREG/CR-1278 also identifies several qualitative factors that could be involved in determining the level of dependence. Based on the NUREG/CR-1278 information, Time, Function, and Spatial attributes were judged to be the most important considerations when determining the level of dependence between operator actions within an accident sequence. These attributes were used to develop qualitative criteria (rules) for judging the level of dependence (CD, HD, MD, LD, ZD) between the operator actions. After the level of dependence between the various HEPs is judged, quantitative values associated with the
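
    The five NUREG/CR-1278 dependence levels map to conditional probabilities through standard closed-form THERP equations; the sketch below states that mapping (the equations are from the handbook, though their use here is only illustrative of the assessment step described above).

        def conditional_hep(p, level):
            # THERP (NUREG/CR-1278) conditional probability of failure of a
            # subsequent action, given failure of the preceding one, for a
            # basic HEP p and a judged dependence level
            return {"ZD": p,
                    "LD": (1 + 19 * p) / 20,
                    "MD": (1 + 6 * p) / 7,
                    "HD": (1 + p) / 2,
                    "CD": 1.0}[level]

        for level in ("ZD", "LD", "MD", "HD", "CD"):
            print(level, round(conditional_hep(0.01, level), 4))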

  16. Conditional Independence in Applied Probability.

    Science.gov (United States)

    Pfeiffer, Paul E.

    This material assumes the user has the background provided by a good undergraduate course in applied probability. It is felt that introductory courses in calculus, linear algebra, and perhaps some differential equations should provide the requisite experience and proficiency with mathematical concepts, notation, and argument. The document is…

  17. Stretching Probability Explorations with Geoboards

    Science.gov (United States)

    Wheeler, Ann; Champion, Joe

    2016-01-01

    Students are faced with many transitions in their middle school mathematics classes. To build knowledge, skills, and confidence in the key areas of algebra and geometry, students often need to practice using numbers and polygons in a variety of contexts. Teachers also want students to explore ideas from probability and statistics. Teachers know…

  18. GPS: Geometry, Probability, and Statistics

    Science.gov (United States)

    Field, Mike

    2012-01-01

    It might be said that for most occupations there is now less of a need for mathematics than there was say fifty years ago. But, the author argues, geometry, probability, and statistics constitute essential knowledge for everyone. Maybe not the geometry of Euclid, but certainly geometrical ways of thinking that might enable us to describe the world…

  19. Swedish earthquakes and acceleration probabilities

    International Nuclear Information System (INIS)

    Slunga, R.

    1979-03-01

    A method to assign probabilities to ground accelerations for Swedish sites is described. As hardly any near-field instrumental data are available, we are left with the problem of interpreting macroseismic data in terms of acceleration. By theoretical wave propagation computations, the relations between the seismic strength of the earthquake, focal depth, distance, and ground acceleration are calculated. We found that most Swedish earthquakes are shallow; the largest earthquake of the area, the 1904 earthquake 100 km south of Oslo, is an exception and probably had a focal depth exceeding 25 km. For the nuclear power plant sites an annual probability of 10^-5 has been proposed as interesting. This probability gives ground accelerations in the range 5-20% g for the sites. This acceleration is for a free bedrock site; for consistency, all acceleration results in this study are given for bedrock sites. When applying our model to the 1904 earthquake and assuming the focal zone to be in the lower crust, we get the epicentral acceleration of this earthquake to be 5-15% g. The results above are based on an analysis of macroseismic data, as relevant instrumental data are lacking. However, the macroseismic acceleration model deduced in this study gives epicentral ground accelerations of small Swedish earthquakes in agreement with the existing distant instrumental data. (author)

  20. DECOFF Probabilities of Failed Operations

    DEFF Research Database (Denmark)

    Gintautas, Tomas

    2015-01-01

    A statistical procedure for estimating Probabilities of Failed Operations is described and exemplified using ECMWF weather forecasts and SIMO output from Rotor Lift test case models. The influence of the safety factor is also investigated. The DECOFF statistical method is benchmarked against standard Alpha-factor

  1. Risk estimation using probability machines

    Science.gov (United States)

    2014-01-01

    Background Logistic regression has been the de facto, and often the only, model used in the description and analysis of relationships between a binary outcome and observed features. It is widely used to obtain the conditional probabilities of the outcome given predictors, as well as predictor effect size estimates using conditional odds ratios. Results We show how statistical learning machines for binary outcomes, provably consistent for the nonparametric regression problem, can be used to provide both consistent conditional probability estimation and conditional effect size estimates. Effect size estimates from learning machines leverage our understanding of counterfactual arguments central to the interpretation of such estimates. We show that, if the data generating model is logistic, we can recover accurate probability predictions and effect size estimates with nearly the same efficiency as a correct logistic model, both for main effects and interactions. We also propose a method using learning machines to scan for possible interaction effects quickly and efficiently. Simulations using random forest probability machines are presented. Conclusions The models we propose make no assumptions about the data structure, and capture the patterns in the data by specifying only the predictors involved and not any particular model structure. So they do not run the same risks of model mis-specification and the resultant estimation biases as a logistic model. This methodology, which we call a “risk machine”, shares properties of the statistical machine from which it is derived. PMID:24581306
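
    A sketch of the counterfactual effect-size idea with a random forest probability machine follows (scikit-learn is assumed here for illustration; the paper does not prescribe this API): the risk difference for a binary predictor is the average change in predicted probability when that predictor is switched from 0 to 1 for every subject.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(0)
        n = 2000
        x1 = rng.integers(0, 2, n)           # binary exposure of interest
        x2 = rng.normal(size=n)              # a continuous covariate
        logit = -1.0 + 1.2 * x1 + 0.8 * x2   # logistic data-generating model
        y = rng.random(n) < 1 / (1 + np.exp(-logit))

        X = np.column_stack([x1, x2])
        rf = RandomForestClassifier(n_estimators=500, min_samples_leaf=25,
                                    random_state=0).fit(X, y)

        # counterfactual effect size: average difference in predicted
        # probability with the exposure set to 1 versus 0 for every subject
        X1, X0 = X.copy(), X.copy()
        X1[:, 0], X0[:, 0] = 1, 0
        risk_diff = (rf.predict_proba(X1)[:, 1] - rf.predict_proba(X0)[:, 1]).mean()
        print(f"estimated risk difference: {risk_diff:.3f}")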

  2. Probability and statistics: A reminder

    International Nuclear Information System (INIS)

    Clement, B.

    2013-01-01

    The main purpose of these lectures is to provide the reader with the tools needed for data analysis in the framework of physics experiments. Basic concepts are introduced together with examples of application in experimental physics. The lecture is divided into two parts: probability and statistics. It builds on the introduction from 'data analysis in experimental sciences' given in [1]. (authors)

  3. Nash equilibrium with lower probabilities

    DEFF Research Database (Denmark)

    Groes, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte

    1998-01-01

    We generalize the concept of Nash equilibrium in mixed strategies for strategic form games to allow for ambiguity in the players' expectations. In contrast to other contributions, we model ambiguity by means of so-called lower probability measures or belief functions, which makes it possible...

  4. On probability-possibility transformations

    Science.gov (United States)

    Klir, George J.; Parviz, Behzad

    1992-01-01

    Several probability-possibility transformations are compared in terms of the closeness of preserving second-order properties. The comparison is based on experimental results obtained by computer simulation. Two second-order properties are involved in this study: noninteraction of two distributions and projections of a joint distribution.
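
    Two transformations commonly studied in this literature can be written compactly, as below; whether these are exactly the ones compared in the paper is an assumption.

        def ratio_scale(p):
            # normalized (ratio-scale) transformation: pi_i = p_i / max(p)
            m = max(p)
            return [pi / m for pi in p]

        def sum_min(p):
            # pi_i = sum_j min(p_i, p_j); the largest element always maps to 1
            return [sum(min(pi, pj) for pj in p) for pi in p]

        p = [0.5, 0.3, 0.2]
        print(ratio_scale(p))   # [1.0, 0.6, 0.4]
        print(sum_min(p))       # [1.0, 0.8, 0.6]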

  5. Box-particle probability hypothesis density filtering

    OpenAIRE

    Schikora, M.; Gning, A.; Mihaylova, L.; Cremers, D.; Koch, W.

    2014-01-01

    This paper develops a novel approach for multitarget tracking, called box-particle probability hypothesis density filter (box-PHD filter). The approach is able to track multiple targets and estimates the unknown number of targets. Furthermore, it is capable of dealing with three sources of uncertainty: stochastic, set-theoretic, and data association uncertainty. The box-PHD filter reduces the number of particles significantly, which improves the runtime considerably. The small number of box-p...

  6. Testing Significance Testing

    Directory of Open Access Journals (Sweden)

    Joachim I. Krueger

    2018-04-01

    The practice of Significance Testing (ST) remains widespread in psychological science despite continual criticism of its flaws and abuses. Using simulation experiments, we address four concerns about ST, and for two of these we compare ST’s performance with prominent alternatives. We find the following: First, the 'p' values delivered by ST predict the posterior probability of the tested hypothesis well under many research conditions. Second, low 'p' values support inductive inferences because they are most likely to occur when the tested hypothesis is false. Third, 'p' values track likelihood ratios without raising the uncertainties of relative inference. Fourth, 'p' values predict the replicability of research findings better than confidence intervals do. Given these results, we conclude that 'p' values may be used judiciously as a heuristic tool for inductive inference. Yet, 'p' values cannot bear the full burden of inference. We encourage researchers to be flexible in their selection and use of statistical methods.

  7. Asymptomatic proteinuria. Clinical significance.

    Science.gov (United States)

    Papper, S

    1977-09-01

    Patients with asymptomatic proteinuria have varied reasons for the proteinuria and travel diverse courses. In the individual with normal renal function and no systemic cause, i.e., idiopathic asymptomatic proteinuria, the outlook is generally favorable. Microscopic hematuria probably raises some degree of question about prognosis. The kidney shows normal glomeruli, subtle changes, or an identifiable lesion. The initial approach includes a clinical and laboratory search for systemic disease, repeated urinalyses, quantitative measurements of proteinuria, determination of creatinine clearance, protein electrophoresis where indicated, and intravenous pyelography. The need for regularly scheduled follow-up evaluation is emphasized. Although the initial approach need not include renal biopsy, a decline in creatinine clearance, an increase in proteinuria, or both are indications for biopsy and consideration of drug therapy.

  8. Prolonged continuous intravenous infusion of the dipeptide L-alanine- L-glutamine significantly increases plasma glutamine and alanine without elevating brain glutamate in patients with severe traumatic brain injury.

    Science.gov (United States)

    Nägeli, Mirjam; Fasshauer, Mario; Sommerfeld, Jutta; Fendel, Angela; Brandi, Giovanna; Stover, John F

    2014-07-02

    Low plasma glutamine levels are associated with worse clinical outcome. Intravenous glutamine infusion dose-dependently increases plasma glutamine levels, thereby correcting hypoglutaminemia. Glutamine may be transformed to glutamate, which might limit its application at a higher dose in patients with severe traumatic brain injury (TBI). To date, the optimal glutamine dose required to normalize plasma glutamine levels without increasing plasma and cerebral glutamate has not yet been defined. Changes in plasma and cerebral glutamine, alanine, and glutamate, as well as indirect signs of metabolic impairment reflected by increased intracranial pressure (ICP), lactate, lactate-to-pyruvate ratio, and electroencephalogram (EEG) activity, were determined before, during, and after continuous intravenous infusion of 0.75 g/kg/d L-alanine-L-glutamine, which was given either for 24 hours (group 1, n = 6) or 5 days (group 2, n = 6) in addition to regular enteral nutrition. Lab values including nitrogen balance, urea, and ammonia were determined daily. Continuous L-alanine-L-glutamine infusion significantly increased plasma and cerebral glutamine as well as alanine levels, and the increase was mostly sustained during the 5-day infusion phase (plasma glutamine: from 295 ± 62 to 500 ± 145 μmol/l; brain glutamine: from 183 ± 188 to 549 ± 120 μmol/l; plasma alanine: from 327 ± 91 to 622 ± 182 μmol/l; brain alanine: from 48 ± 55 to 89 ± 129 μmol/l; p < 0.05). Prolonged L-alanine-L-glutamine infusion (0.75 g/kg/d, up to 5 days) increased plasma and brain glutamine and alanine levels. This was not associated with elevated glutamate or signs of potential glutamate-mediated cerebral injury. The increased nitrogen load should be considered in patients with renal and hepatic dysfunction. Clinicaltrials.gov NCT02130674. Registered 5 April 2014.

  9. FTY720-loaded poly(DL-lactide-co-glycolide) electrospun scaffold significantly increases microvessel density over 7 days in streptozotocin-induced diabetic C57b16/J mice: preliminary results.

    Science.gov (United States)

    Bowers, D T; Chhabra, P; Langman, L; Botchwey, E A; Brayman, K L

    2011-11-01

    Nanofiber scaffolds could improve islet transplant success by physically mimicking the shape of extracellular matrix and by acting as a drug-delivery vehicle. Scaffolds implanted in alternate transplant sites must be prevascularized or very quickly vascularized following transplantation to prevent hypoxia-induced islet necrosis. The local release of the S1P prodrug FTY720 induces diameter enlargement and increases in length density. The objective of this preliminary study was to evaluate length and diameter differences between diabetic and nondiabetic animals implanted with FTY720-containing electrospun scaffolds using intravital imaging of dorsal skinfold window chambers. Electrospun mats of randomly oriented fibers were created from polymer solutions of PLAGA (50:50 LA:GA) with and without FTY720 loaded at a ratio of 1:200 (FTY720:PLAGA by wt). The implanted fiber mats were 4 mm in diameter and ∼0.2 mm thick. Increases in length density and vessel diameter were assessed by automated analysis of images over 7 days in RAVE, a Matlab program. Image analysis of repeated measures of microvessel metrics demonstrated a significant increase in the length density from day 0 to day 7 in the moderately diabetic animals of this preliminary study (P < .05). Furthermore, significant differences in length density at day 0 and day 3 were found between recently STZ-induced moderately diabetic and nondiabetic animals in response to FTY720 local release (P < .05, Student t test). Driving the islet revascularization process using local release of factors, such as FTY720, from biodegradable polymers makes an attractive system for the improvement of islet transplant success. Preliminary study results suggest that a recently induced moderately diabetic state may potentiate the mechanism by which local release of FTY720 from polymer fibers increases length density of microvessels. Therefore, local release of S1P receptor-targeted drugs is under further investigation for improvement of

  10. Probability biases as Bayesian inference

    Directory of Open Access Journals (Sweden)

    Andre; C. R. Martins

    2006-11-01

    In this article, I will show how several observed biases in human probabilistic reasoning can be partially explained as good heuristics for making inferences in an environment where probabilities have uncertainties associated to them. Previous results show that the weight functions and the observed violations of coalescing and stochastic dominance can be understood from a Bayesian point of view. We will review those results and see that Bayesian methods should also be used as part of the explanation behind other known biases. That means that, although the observed errors are still errors from a normative standpoint, they can be understood as adaptations to the solution of real-life problems. Heuristics that allow fast evaluations and mimic a Bayesian inference would be an evolutionary advantage, since they would give us an efficient way of making decisions. In that sense, it should be no surprise that humans reason with probability as it has been observed.

  11. Probability matching and strategy availability.

    Science.gov (United States)

    Koehler, Derek J; James, Greta

    2010-09-01

    Findings from two experiments indicate that probability matching in sequential choice arises from an asymmetry in strategy availability: The matching strategy comes readily to mind, whereas a superior alternative strategy, maximizing, does not. First, compared with the minority who spontaneously engage in maximizing, the majority of participants endorse maximizing as superior to matching in a direct comparison when both strategies are described. Second, when the maximizing strategy is brought to their attention, more participants subsequently engage in maximizing. Third, matchers are more likely than maximizers to base decisions in other tasks on their initial intuitions, suggesting that they are more inclined to use a choice strategy that comes to mind quickly. These results indicate that a substantial subset of probability matchers are victims of "underthinking" rather than "overthinking": They fail to engage in sufficient deliberation to generate a superior alternative to the matching strategy that comes so readily to mind.

  12. Probability as a Physical Motive

    Directory of Open Access Journals (Sweden)

    Peter Martin

    2007-04-01

    Recent theoretical progress in nonequilibrium thermodynamics, linking the physical principle of Maximum Entropy Production (“MEP”) to the information-theoretical “MaxEnt” principle of scientific inference, together with conjectures from theoretical physics that there may be no fundamental causal laws but only probabilities for physical processes, and from evolutionary theory that biological systems expand “the adjacent possible” as rapidly as possible, all lend credence to the proposition that probability should be recognized as a fundamental physical motive. It is further proposed that spatial order and temporal order are two aspects of the same thing, and that this is the essence of the second law of thermodynamics.

  13. A single amino acid change (Y318F) in the L-arabitol dehydrogenase (LadA) from Aspergillus niger results in a significant increase in affinity for D-sorbitol

    Science.gov (United States)

    2009-01-01

    Background L-arabitol dehydrogenase (LAD) and xylitol dehydrogenase (XDH) are involved in the degradation of L-arabinose and D-xylose, which are among the most abundant monosaccharides on earth. Previous data demonstrated that LAD and XDH not only differ in the activity on their biological substrate, but also that only XDH has significant activity on D-sorbitol and may therefore be more closely related to D-sorbitol dehydrogenases (SDH). In this study we aimed to identify residues involved in the difference in substrate specificity. Results Phylogenetic analysis demonstrated that LAD, XDH and SDH form 3 distinct groups of the family of dehydrogenases containing an Alcohol dehydrogenase GroES-like domain (pfam08240) and likely have evolved from a common ancestor. Modelling of LadA and XdhA of the saprobic fungus Aspergillus niger on human SDH identified two residues in LadA (M70 and Y318) that may explain the absence of activity on D-sorbitol. While introduction of the mutation M70F in LadA of A. niger resulted in nearly complete enzyme inactivation, the Y318F mutation resulted in increased activity for L-arabitol and xylitol. Moreover, the affinity for D-sorbitol was increased in this mutant. Conclusion These data demonstrate that Y318 of LadA contributes significantly to the substrate specificity difference between LAD and XDH/SDH. PMID:19674460

  14. Logic, Probability, and Human Reasoning

    Science.gov (United States)

    2015-01-01

    accordingly suggest a way to integrate probability and deduction. The nature of deductive reasoning To be rational is to be able to make deductions...3–6] and they underlie mathematics, science, and technology [7–10]. Plato claimed that emotions upset reasoning. However, individuals in the grip...fundamental to human rationality. So, if counterexamples to its principal predictions occur, the theory will at least explain its own refutation

  15. Probability Measures on Groups IX

    CERN Document Server

    1989-01-01

    The latest in this series of Oberwolfach conferences focussed on the interplay between structural probability theory and various other areas of pure and applied mathematics such as Tauberian theory, infinite-dimensional rotation groups, central limit theorems, harmonizable processes, and spherical data. Thus it was attended by mathematicians whose research interests range from number theory to quantum physics in conjunction with structural properties of probabilistic phenomena. This volume contains 5 survey articles submitted on special invitation and 25 original research papers.

  16. Probability matching and strategy availability

    OpenAIRE

    J. Koehler, Derek; Koehler, Derek J.; James, Greta

    2010-01-01

    Findings from two experiments indicate that probability matching in sequential choice arises from an asymmetry in strategy availability: The matching strategy comes readily to mind, whereas a superior alternative strategy, maximizing, does not. First, compared with the minority who spontaneously engage in maximizing, the majority of participants endorse maximizing as superior to matching in a direct comparison when both strategies are described. Second, when the maximizing strategy is brought...

  17. The probability and severity of decompression sickness

    Science.gov (United States)

    Hada, Ethan A.; Vann, Richard D.; Denoble, Petar J.

    2017-01-01

    Decompression sickness (DCS), which is caused by inert gas bubbles in tissues, is an injury of concern for scuba divers, compressed air workers, astronauts, and aviators. Case reports for 3322 air and N2-O2 dives, resulting in 190 DCS events, were retrospectively analyzed and the outcomes were scored as (1) serious neurological, (2) cardiopulmonary, (3) mild neurological, (4) pain, (5) lymphatic or skin, and (6) constitutional or nonspecific manifestations. Following standard U.S. Navy medical definitions, the data were grouped into mild (Type I: manifestations 4-6) and serious (Type II: manifestations 1-3). Additionally, we considered an alternative grouping of mild (Type A: manifestations 3-6) and serious (Type B: manifestations 1 and 2). The current U.S. Navy guidance allows for a 2% probability of mild DCS and a 0.1% probability of serious DCS. We developed a hierarchical trinomial (3-state) probabilistic DCS model that simultaneously predicts the probability of mild and serious DCS given a dive exposure. Both the Type I/II and Type A/B discriminations of mild and serious DCS resulted in a highly significant model fit. Because the trinomial model separates mild from serious outcomes, the probability of 'mild' DCS alone resulted in a longer allowable bottom time for the same 2% limit. However, for the 0.1% serious DCS limit, we found a vastly decreased allowable bottom dive time for all dive depths. If the Type A/B scoring was assigned to outcome severity, the no decompression limits (NDL) for air dives were still controlled by the acceptable serious DCS risk limit rather than the acceptable mild DCS risk limit. However, in this case, longer NDL limits were allowed than with the Type I/II scoring. The trinomial model mild and serious probabilities agree reasonably well with the current air NDL only with the Type A/B scoring and when 0.2% risk of serious DCS is allowed. PMID:28296928

  18. Internal Medicine residents use heuristics to estimate disease probability

    OpenAIRE

    Phang, Sen Han; Ravani, Pietro; Schaefer, Jeffrey; Wright, Bruce; McLaughlin, Kevin

    2015-01-01

    Background: Training in Bayesian reasoning may have limited impact on accuracy of probability estimates. In this study, our goal was to explore whether residents previously exposed to Bayesian reasoning use heuristics rather than Bayesian reasoning to estimate disease probabilities. We predicted that if residents use heuristics then post-test probability estimates would be increased by non-discriminating clinical features or a high anchor for a target condition. Method: We randomized 55 In...

  19. Assessing the Probability that a Finding Is Genuine for Large-Scale Genetic Association Studies.

    Science.gov (United States)

    Kuo, Chia-Ling; Vsevolozhskaya, Olga A; Zaykin, Dmitri V

    2015-01-01

    Genetic association studies routinely involve massive numbers of statistical tests accompanied by P-values. Whole genome sequencing technologies increased the potential number of tested variants to tens of millions. The more tests are performed, the smaller the P-value required to be deemed significant. However, a small P-value is not equivalent to small chances of a spurious finding, and significance thresholds may fail to serve as efficient filters against false results. While the Bayesian approach can provide a direct assessment of the probability that a finding is spurious, its adoption in association studies has been slow, due in part to the ubiquity of P-values and the automated way they are, as a rule, produced by software packages. Attempts to design simple ways to convert an association P-value into the probability that a finding is spurious have been met with difficulties. The False Positive Report Probability (FPRP) method has gained increasing popularity. However, FPRP is not designed to estimate the probability for a particular finding, because it is defined for an entire region of hypothetical findings with P-values at least as small as the one observed for that finding. Here we propose a method that lets researchers extract the probability that a finding is spurious directly from a P-value. Considering the counterpart of that probability, we term this method POFIG: the Probability that a Finding is Genuine. Our approach shares FPRP's simplicity, but gives a valid probability that a finding is spurious given a P-value. In addition to straightforward interpretation, POFIG has desirable statistical properties. The POFIG average across a set of tentative associations provides an estimated proportion of false discoveries in that set. POFIGs are easily combined across studies and are immune to multiple testing and selection bias. We illustrate an application of the POFIG method via analysis of GWAS associations with Crohn's disease.
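
    For comparison, the FPRP discussed above has a simple closed form (the standard definition; POFIG itself is the paper's contribution and is not reproduced here):

        def fprp(alpha, power, prior):
            # False Positive Report Probability: the chance that a finding
            # declared significant at level alpha is spurious, given the
            # study's power and the prior probability of a true association
            return alpha * (1 - prior) / (alpha * (1 - prior) + power * prior)

        # a genome-wide hit at p = 5e-8 with 50% power and a 1-in-10,000 prior
        print(fprp(5e-8, 0.5, 1e-4))   # ~1e-3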

  20. Foundations of quantization for probability distributions

    CERN Document Server

    Graf, Siegfried

    2000-01-01

    Due to the rapidly increasing need for methods of data compression, quantization has become a flourishing field in signal and image processing and information theory. The same techniques are also used in statistics (cluster analysis), pattern recognition, and operations research (optimal location of service centers). The book gives the first mathematically rigorous account of the fundamental theory underlying these applications. The emphasis is on the asymptotics of quantization errors for absolutely continuous and special classes of singular probabilities (surface measures, self-similar measures), presenting some new results for the first time. Written for researchers and graduate students in probability theory, the monograph is of potential interest to all people working in the disciplines mentioned above.

  1. [Biometric bases: basic concepts of probability calculation].

    Science.gov (United States)

    Dinya, E

    1998-04-26

    The author gives an outline of the basic concepts of probability theory. The basics of event algebra, the definition of probability, the classical probability model, and the random variable are presented.

  2. Probability, Statistics, and Stochastic Processes

    CERN Document Server

    Olofsson, Peter

    2012-01-01

    This book provides a unique and balanced approach to probability, statistics, and stochastic processes.   Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area.  The Second Edition features new coverage of analysis of variance (ANOVA), consistency and efficiency of estimators, asymptotic theory for maximum likelihood estimators, empirical distribution function and the Kolmogorov-Smirnov test, general linear models, multiple comparisons, Markov chain Monte Carlo (MCMC), Brownian motion, martingales, and

  3. Probability, statistics, and computational science.

    Science.gov (United States)

    Beerenwinkel, Niko; Siebourg, Juliane

    2012-01-01

    In this chapter, we review basic concepts from probability theory and computational statistics that are fundamental to evolutionary genomics. We provide a very basic introduction to statistical modeling and discuss general principles, including maximum likelihood and Bayesian inference. Markov chains, hidden Markov models, and Bayesian network models are introduced in more detail as they occur frequently and in many variations in genomics applications. In particular, we discuss efficient inference algorithms and methods for learning these models from partially observed data. Several simple examples are given throughout the text, some of which point to models that are discussed in more detail in subsequent chapters.

  4. Sensitivity analysis using probability bounding

    International Nuclear Information System (INIS)

    Ferson, Scott; Troy Tucker, W.

    2006-01-01

    Probability bounds analysis (PBA) provides analysts a convenient means to characterize the neighborhood of possible results that would be obtained from plausible alternative inputs in probabilistic calculations. We show the relationship between PBA and the methods of interval analysis and probabilistic uncertainty analysis from which it is jointly derived, and indicate how the method can be used to assess the quality of probabilistic models such as those developed in Monte Carlo simulations for risk analyses. We also illustrate how a sensitivity analysis can be conducted within a PBA by pinching inputs to precise distributions or real values

  5. Quantum Probability Zero-One Law for Sequential Terminal Events

    Science.gov (United States)

    Rehder, Wulf

    1980-07-01

    On the basis of the Jauch-Piron quantum probability calculus a zero-one law for sequential terminal events is proven, and the significance of certain crucial axioms in the quantum probability calculus is discussed. The result shows that the Jauch-Piron set of axioms is appropriate for the non-Boolean algebra of sequential events.

  6. Foreshocks, aftershocks, and earthquake probabilities: Accounting for the landers earthquake

    Science.gov (United States)

    Jones, Lucile M.

    1994-01-01

    The equation to determine the probability that an earthquake occurring near a major fault will be a foreshock to a mainshock on that fault is modified to include the case of aftershocks to a previous earthquake occurring near the fault. The addition of aftershocks to the background seismicity makes it less probable that an earthquake will be a foreshock, because nonforeshocks have become more common. As the aftershocks decay with time, the probability that an earthquake will be a foreshock increases. However, fault interactions between the first mainshock and the major fault can increase the long-term probability of a characteristic earthquake on that fault, which will, in turn, increase the probability that an event is a foreshock, compensating for the decrease caused by the aftershocks.

  7. Lectures on probability and statistics

    International Nuclear Information System (INIS)

    Yost, G.P.

    1984-09-01

    These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. We begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. We finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another

  8. Effects of NMDA receptor antagonists on probability discounting depend on the order of probability presentation.

    Science.gov (United States)

    Yates, Justin R; Breitenstein, Kerry A; Gunkel, Benjamin T; Hughes, Mallory N; Johnson, Anthony B; Rogers, Katherine K; Shape, Sara M

    Risky decision making can be measured using a probability-discounting procedure, in which animals choose between a small, certain reinforcer and a large, uncertain reinforcer. Recent evidence has identified glutamate as a mediator of risky decision making, as blocking the N-methyl-d-aspartate (NMDA) receptor with MK-801 increases preference for a large, uncertain reinforcer. Because the order in which the probabilities associated with the large reinforcer are presented can modulate the effects of drugs on choice, the current study determined if NMDA receptor ligands alter probability discounting using ascending and descending schedules. Sixteen rats were trained in a probability-discounting procedure in which the odds against obtaining the large reinforcer increased (n=8) or decreased (n=8) across blocks of trials. Following behavioral training, rats received treatments of the NMDA receptor ligands MK-801 (uncompetitive antagonist; 0, 0.003, 0.01, or 0.03 mg/kg), ketamine (uncompetitive antagonist; 0, 1.0, 5.0, or 10.0 mg/kg), and ifenprodil (NR2B-selective non-competitive antagonist; 0, 1.0, 3.0, or 10.0 mg/kg). Results showed discounting was steeper (indicating increased risk aversion) for rats on the ascending schedule relative to rats on the descending schedule. Furthermore, the effects of MK-801, ketamine, and ifenprodil on discounting were dependent on the schedule used. Specifically, the highest dose of each drug decreased risk taking in rats on the descending schedule, but only MK-801 (0.03 mg/kg) increased risk taking in rats on the ascending schedule. These results show that probability presentation order modulates the effects of NMDA receptor ligands on risky decision making. Copyright © 2016 Elsevier Inc. All rights reserved.

  9. In Vivo Imaging Reveals Significant Tumor Vascular Dysfunction and Increased Tumor Hypoxia-Inducible Factor-1α Expression Induced by High Single-Dose Irradiation in a Pancreatic Tumor Model

    Energy Technology Data Exchange (ETDEWEB)

    Maeda, Azusa [Princess Margaret Cancer Centre, University Health Network, Toronto, Ontario (Canada); Department of Medical Biophysics, University of Toronto, Toronto, Ontario (Canada); Chen, Yonghong; Bu, Jiachuan; Mujcic, Hilda [Princess Margaret Cancer Centre, University Health Network, Toronto, Ontario (Canada); Wouters, Bradly G. [Princess Margaret Cancer Centre, University Health Network, Toronto, Ontario (Canada); Department of Medical Biophysics, University of Toronto, Toronto, Ontario (Canada); Department of Radiation Oncology, University of Toronto, Toronto, Ontario (Canada); DaCosta, Ralph S., E-mail: rdacosta@uhnres.utoronto.ca [Princess Margaret Cancer Centre, University Health Network, Toronto, Ontario (Canada); Department of Medical Biophysics, University of Toronto, Toronto, Ontario (Canada); Techna Institute, University Health Network, Toronto, Ontario (Canada)

    2017-01-01

    Purpose: To investigate the effect of high-dose irradiation on pancreatic tumor vasculature and microenvironment using in vivo imaging techniques. Methods and Materials: A BxPC3 pancreatic tumor xenograft was established in a dorsal skinfold window chamber model and a subcutaneous hind leg model. Tumors were irradiated with a single dose of 4, 12, or 24 Gy. The dorsal skinfold window chamber model was used to assess tumor response, vascular function and permeability, platelet and leukocyte adhesion to the vascular endothelium, and tumor hypoxia for up to 14 days after 24-Gy irradiation. The hind leg model was used to monitor tumor size, hypoxia, and vascularity for up to 65 days after 24-Gy irradiation. Tumors were assessed histologically to validate in vivo observations. Results: In vivo fluorescence imaging revealed temporary vascular dysfunction in tumors irradiated with a single dose of 4 to 24 Gy, but most significantly with a single dose of 24 Gy. Vascular functional recovery was observed by 14 days after irradiation in a dose-dependent manner. Furthermore, irradiation with 24 Gy caused platelet and leukocyte adhesion to the vascular endothelium within hours to days after irradiation. Vascular permeability was significantly higher in irradiated tumors compared with nonirradiated controls 14 days after irradiation. This observation corresponded with increased expression of hypoxia-inducible factor-1α in irradiated tumors. In the hind leg model, irradiation with a single dose of 24 Gy led to tumor growth delay, followed by tumor regrowth. Conclusions: Irradiation of the BxPC3 tumors with a single dose of 24 Gy caused transient vascular dysfunction and increased expression of hypoxia-inducible factor-1α. Such biological changes may impact tumor response to high single-dose and hypofractionated irradiation, and further investigations are needed to better understand the clinical outcomes of stereotactic body radiation therapy.

  10. EMPIRICAL STUDY OF THE PROBABILITY OF DEFAULT IN CASE OF ROMANIAN COMPANIES LISTED ON STOCK EXCHANGE

    Directory of Open Access Journals (Sweden)

    Marton Noemi, Racz Timea Erzsebet

    2011-07-01

    Full Text Available The importance of estimating a firm's probability of default increased significantly during the economic and financial crisis for financial institutions, which can be explained by the fact that the share of nonperforming loans increased in this period. The probability of default can be estimated with structural models, which are based on the methodology developed by Merton (1974), a methodology also used by Moody's Corporation (known as the KMV-Merton model). The aim of this study is to estimate the probability of default of companies listed on the Bucharest Stock Exchange using this methodology. This approach has been widely used in the literature (e.g., Kealhofer and Kurbat (2000), Crosbie and Bohn (2002), Duffie and Wang (2004), Bharath and Shumway (2004, 2008)). In Romania this methodology was empirically tested by Codirlaşu (2007), who used Merton's methodology to estimate the probability of default of companies listed on the Bucharest Stock Exchange, and by Bobircă et al. (2008), who estimated the probabilities of default of 42 companies listed on the Bucharest Stock Exchange for the 2000-2008 time period. In this paper we used Merton's model, which assumes that a company defaults if the value of its assets is less than the promised debt repayment at time T. The process of estimating the probability of default starts from the following firm-specific variables: the market value of the firm's assets, the share prices, the value of the liabilities, and the risk-free rate. The analyzed period is 2003-2010, which includes the economic and financial crisis. Analyzing the financial statements of the companies listed on the Bucharest Stock Exchange, we determined the input parameters of the model and calculated the quarterly probabilities of default of each analyzed company. According to our results, the probabilities of default are low in the majority of cases.
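
    A minimal sketch of the core Merton calculation described above (the firm's numbers are hypothetical, and the step of backing the unobserved asset value and volatility out of equity prices is omitted):

    ```python
    from math import log, sqrt
    from statistics import NormalDist

    def merton_pd(asset_value, debt, mu, sigma, horizon=1.0):
        """Merton (1974) default probability: the firm defaults if its
        asset value falls below the promised debt repayment at horizon T.
        Assets follow geometric Brownian motion with drift mu, vol sigma."""
        dd = ((log(asset_value / debt) + (mu - 0.5 * sigma ** 2) * horizon)
              / (sigma * sqrt(horizon)))    # distance to default
        return NormalDist().cdf(-dd)        # P(V_T < D)

    # Hypothetical firm: assets 120, debt due in one year 100, 5% drift, 25% vol
    print(f"{merton_pd(120, 100, 0.05, 0.25):.4f}")  # ~0.21
    ```

    In KMV-style implementations such as those cited above, the asset value and volatility are themselves estimated by treating equity as a call option on the firm's assets.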

  11. Excluding joint probabilities from quantum theory

    Science.gov (United States)

    Allahverdyan, Armen E.; Danageozian, Arshag

    2018-03-01

    Quantum theory does not provide a unique definition for the joint probability of two noncommuting observables, which is the next important question after Born's probability for a single observable. Instead, various definitions have been suggested, e.g., via quasiprobabilities or via hidden-variable theories. After reviewing open issues of the joint probability, we relate it to quantum imprecise probabilities, which are noncontextual and are consistent with all constraints expected from a quantum probability. We study two noncommuting observables in a two-dimensional Hilbert space and show that there is no precise joint probability that applies for any quantum state and is consistent with imprecise probabilities. This contrasts with theorems by Bell and Kochen-Specker that exclude joint probabilities for more than two noncommuting observables, in Hilbert spaces with dimension larger than two. If measurement contexts are included in the definition, joint probabilities are no longer excluded, but they are still constrained by imprecise probabilities.

  12. Probability theory and mathematical statistics for engineers

    CERN Document Server

    Pugachev, V S

    1984-01-01

    Probability Theory and Mathematical Statistics for Engineers focuses on the concepts of probability theory and mathematical statistics for finite-dimensional random variables.The publication first underscores the probabilities of events, random variables, and numerical characteristics of random variables. Discussions focus on canonical expansions of random vectors, second-order moments of random vectors, generalization of the density concept, entropy of a distribution, direct evaluation of probabilities, and conditional probabilities. The text then examines projections of random vector

  13. Introduction to probability theory with contemporary applications

    CERN Document Server

    Helms, Lester L

    2010-01-01

    This introduction to probability theory transforms a highly abstract subject into a series of coherent concepts. Its extensive discussions and clear examples, written in plain language, expose students to the rules and methods of probability. Suitable for an introductory probability course, this volume requires abstract and conceptual thinking skills and a background in calculus.Topics include classical probability, set theory, axioms, probability functions, random and independent random variables, expected values, and covariance and correlations. Additional subjects include stochastic process

  14. Posterior Probability Matching and Human Perceptual Decision Making.

    Directory of Open Access Journals (Sweden)

    Richard F Murray

    2015-06-01

    Full Text Available Probability matching is a classic theory of decision making that was first developed in models of cognition. Posterior probability matching, a variant in which observers match their response probabilities to the posterior probability of each response being correct, is being used increasingly often in models of perception. However, little is known about whether posterior probability matching is consistent with the vast literature on vision and hearing that has developed within signal detection theory. Here we test posterior probability matching models using two tools from detection theory. First, we examine the models' performance in a two-pass experiment, where each block of trials is presented twice, and we measure the proportion of times that the model gives the same response twice to repeated stimuli. We show that at low performance levels, posterior probability matching models give highly inconsistent responses across repeated presentations of identical trials. We find that practised human observers are more consistent across repeated trials than these models predict, and we find some evidence that less practised observers are more consistent as well. Second, we compare the performance of posterior probability matching models on a discrimination task to the performance of a theoretical ideal observer that achieves the best possible performance. We find that posterior probability matching is very inefficient at low-to-moderate performance levels, and that human observers can be more efficient than is ever possible according to posterior probability matching models. These findings support classic signal detection models, and rule out a broad class of posterior probability matching models for expert performance on perceptual tasks that range in complexity from contrast discrimination to symmetry detection. However, our findings leave open the possibility that inexperienced observers may show posterior probability matching behaviour, and our methods
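
    The two-pass logic is easy to verify numerically: an observer who responds 'A' with probability q agrees with itself across two passes with probability q² + (1-q)², which approaches 0.5 near threshold, whereas a deterministic maximum-a-posteriori observer always agrees. A minimal simulation sketch (the 0.3-0.7 posterior range standing in for "low performance" is an assumption):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def two_pass_agreement(matching, n_trials=100_000):
        """Fraction of identical responses across two presentations of the
        same trials, for an observer whose per-trial posterior probability
        of responding 'A' is drawn near threshold."""
        post = rng.uniform(0.3, 0.7, n_trials)
        if matching:   # sample each response from the posterior
            r1 = rng.random(n_trials) < post
            r2 = rng.random(n_trials) < post
        else:          # MAP observer: deterministic, hence fully consistent
            r1 = r2 = post > 0.5
        return (r1 == r2).mean()

    print(two_pass_agreement(True))   # ~0.53: matching is inconsistent
    print(two_pass_agreement(False))  # 1.0
    ```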

  15. Most probable degree distribution at fixed structural entropy

    Indian Academy of Sciences (India)

    Here we derive the most probable degree distribution emerging ... the structural entropy of power-law networks is an increasing function of the exponent ... partition function Z of the network as the sum over all degree distributions, with given energy.

  16. Significant reduction of peripheral blood interleukin-35 and CD4+EBI3+ T cells, which are negatively correlated with an increase in the plasma IL-17 and cTnI level, in viral myocarditis patients

    Directory of Open Access Journals (Sweden)

    Han Ouyang

    2017-02-01

    Full Text Available Introduction: Viral myocarditis (VMC) has become an increasingly common heart disease that endangers human health. In the present study, the plasma interleukin-35 (IL-35) level and the percentage of CD4+EBI3+ T cells in VMC patients were measured to investigate the significance of changes in these parameters and their association with the disease. Material and methods: ELISA was performed to detect the plasma IL-35 level and the percentage of peripheral blood CD4+EBI3+ T cells in 40 VMC patients and in 20 healthy individuals. Moreover, the plasma IL-17 levels in the VMC patients and in the healthy individuals were detected using an ELISA, and the cardiac troponin-I (cTnI) levels were detected using a chemiluminescent microparticle immunoassay to compare the differences between the groups. Results: The plasma IL-35 level and the percentage of CD4+EBI3+ T cells in acute phase VMC patients were lower than those in the healthy control group and the convalescent phase VMC patients. Additionally, the plasma IL-35 level in the VMC patients exhibited a negative correlation with the levels of cTnI and IL-17. The percentage of CD4+EBI3+ T cells also showed a negative correlation with the levels of cTnI and IL-17. Conclusions: The plasma IL-35 level and the percentage of CD4+EBI3+ T cells in VMC patients were reduced, and the magnitude of the decrease was associated with the severity of the disease. These results suggest that IL-35 and CD4+EBI3+ T cells might play important roles in the progression of VMC and could be used as indicators of the disease.

  17. Probabilities for profitable fungicide use against gray leaf spot in hybrid maize.

    Science.gov (United States)

    Munkvold, G P; Martinson, C A; Shriver, J M; Dixon, P M

    2001-05-01

    ABSTRACT Gray leaf spot, caused by the fungus Cercospora zeae-maydis, causes considerable yield losses in hybrid maize grown in the north-central United States and elsewhere. Nonchemical management tactics have not adequately prevented these losses. The probability of profitably using fungicide application as a management tool for gray leaf spot was evaluated in 10 field experiments under conditions of natural inoculum in Iowa. Gray leaf spot severity in untreated control plots ranged from 2.6 to 72.8% for the ear leaf and from 3.0 to 7.7 (1 to 9 scale) for whole-plot ratings. In each experiment, fungicide applications with propiconazole or mancozeb significantly reduced gray leaf spot severity. Fungicide treatment significantly (P < 0.05) increased yield by as much as 1.65 t/ha with a single propiconazole application. There were significant (P < 0.05) correlations between gray leaf spot severity and yield. We used a Bayesian inference method to calculate for each experiment the probability of achieving a positive net return with one or two propiconazole applications, based on the mean yields and standard deviations for treated and untreated plots, the price of grain, and the costs of the fungicide applications. For one application, the probability ranged from approximately 0.06 to more than 0.99, and exceeded 0.50 in six of nine scenarios (specific experiment/hybrid combinations). The highest probabilities occurred in the 1995 experiments with the most susceptible hybrid. Probabilities were almost always higher for a single application of propiconazole than for two applications. These results indicate that a single application of propiconazole frequently can be profitable for gray leaf spot management in Iowa, but the probability of a profitable application is strongly influenced by hybrid susceptibility. The calculation of probabilities for positive net returns was more informative than mean separation in terms of assessing the economic success of the fungicide applications.
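
    The quantity described reduces to a probability statement, P(price × (Y_treated − Y_untreated) − cost > 0), which can be approximated by Monte Carlo once yield distributions are assumed. A minimal sketch with independent normal yields and entirely hypothetical numbers (the paper's actual Bayesian calculation is not reproduced in the record):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def prob_positive_net_return(mean_treated, sd_treated, mean_untreated,
                                 sd_untreated, price_per_t, app_cost,
                                 n_draws=100_000):
        """Monte Carlo estimate that the yield gain from treatment pays
        for the application. Yields (t/ha) are modeled as independent
        normals - a simplifying assumption."""
        y_t = rng.normal(mean_treated, sd_treated, n_draws)
        y_u = rng.normal(mean_untreated, sd_untreated, n_draws)
        net = price_per_t * (y_t - y_u) - app_cost
        return (net > 0).mean()

    # Hypothetical: 1.0 t/ha mean gain, $90/t grain, $45/ha application cost
    print(prob_positive_net_return(9.5, 1.2, 8.5, 1.2, 90.0, 45.0))  # ~0.62
    ```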

  18. Pretest probability of a normal echocardiography: validation of a simple and practical algorithm for routine use.

    Science.gov (United States)

    Hammoudi, Nadjib; Duprey, Matthieu; Régnier, Philippe; Achkar, Marc; Boubrit, Lila; Preud'homme, Gisèle; Healy-Brucker, Aude; Vignalou, Jean-Baptiste; Pousset, Françoise; Komajda, Michel; Isnard, Richard

    2014-02-01

    Management of increased referrals for transthoracic echocardiography (TTE) examinations is a challenge. Patients with normal TTE examinations take less time to explore than those with heart abnormalities. A reliable method for assessing the pretest probability of a normal TTE may optimize management of requests. The aim was to establish and validate, based on requests for examinations, a simple algorithm for defining the pretest probability of a normal TTE. In a retrospective phase, factors associated with normality were investigated and an algorithm was designed. In a prospective phase, patients were classified in accordance with the algorithm as being at high or low probability of having a normal TTE. In the retrospective phase, 42% of 618 examinations were normal. In multivariable analysis, age and absence of cardiac history were associated with normality. Low pretest probability of a normal TTE was defined by known cardiac history or, in case of doubt about cardiac history, by age > 70 years. In the prospective phase, the prevalences of normality were 72% and 25% in the high (n=167) and low (n=241) pretest probability of normality groups, respectively. The mean duration of normal examinations was significantly shorter than that of abnormal examinations (13.8 ± 9.2 min vs 17.6 ± 11.1 min; P=0.0003). A simple algorithm can classify patients referred for TTE as being at high or low pretest probability of having a normal examination. This algorithm might help to optimize management of requests in routine practice. Copyright © 2014 Elsevier Masson SAS. All rights reserved.
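
    The rule itself is fully specified by the abstract and is small enough to state as code (the function and argument names are, of course, assumptions):

    ```python
    def normal_tte_pretest(age, cardiac_history):
        """Pretest probability of a normal transthoracic echocardiogram,
        per the algorithm described above. `cardiac_history` is True,
        False, or None when the request leaves it in doubt."""
        if cardiac_history:                       # known cardiac history
            return "low"
        if cardiac_history is None and age > 70:  # doubtful history, age > 70
            return "low"
        return "high"

    print(normal_tte_pretest(55, False))  # 'high': no history, younger patient
    print(normal_tte_pretest(76, None))   # 'low'
    ```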

  19. K-forbidden transition probabilities

    International Nuclear Information System (INIS)

    Saitoh, T.R.; Sletten, G.; Bark, R.A.; Hagemann, G.B.; Herskind, B.; Saitoh-Hashimoto, N.; Tsukuba Univ., Ibaraki

    2000-01-01

    Reduced hindrance factors of K-forbidden transitions are compiled for nuclei with A ≈ 180 where γ-vibrational states are observed. Correlations between these reduced hindrance factors and Coriolis forces, statistical level mixing and γ-softness have been studied. It is demonstrated that the K-forbidden transition probabilities are related to γ-softness. The decay of the high-K bandheads has been studied by means of two-state mixing, which would be induced by the γ-softness, with the use of a number of K-forbidden transitions compiled in the present work, where high-K bandheads are depopulated by both E2 and ΔI=1 transitions. The validity of the two-state mixing scheme has been examined by using the proposed identity of the B(M1)/B(E2) ratios of transitions depopulating high-K bandheads and levels of low-K bands. A breakdown of the identity might indicate that other levels mediate transitions between high- and low-K states. (orig.)

  20. Direct probability mapping of contaminants

    International Nuclear Information System (INIS)

    Rautman, C.A.

    1993-01-01

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. Geostatistical simulation provides powerful tools for investigating contaminant levels, and in particular, for identifying and using the spatial interrelationships among a set of isolated sample values. This additional information can be used to assess the likelihood of encountering contamination at unsampled locations and to evaluate the risk associated with decisions to remediate or not to remediate specific regions within a site. Past operation of the DOE Feed Materials Production Center has contaminated a site near Fernald, Ohio, with natural uranium. Soil geochemical data have been collected as part of the Uranium-in-Soils Integrated Demonstration Project. These data have been used to construct a number of stochastic images of potential contamination for parcels approximately the size of a selective remediation unit. Each such image accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely, statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination. Evaluation of the geostatistical simulations can yield maps representing the expected magnitude of the contamination for various regions and other information that may be important in determining a suitable remediation process or in sizing equipment to accomplish the restoration
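
    The post-processing step described here reduces to averaging indicator maps over the simulation ensemble. A minimal numpy sketch with a synthetic stand-in for the geostatistical realizations (grid size, field parameters, and the action level are all hypothetical):

    ```python
    import numpy as np

    def exceedance_probability_map(realizations, threshold):
        """Probability map from geostatistical simulation output.
        `realizations` has shape (n_sims, ny, nx): equally likely images
        of the contaminant field. The probability that a cell exceeds the
        action level is the fraction of realizations exceeding it."""
        return (realizations > threshold).mean(axis=0)

    # 500 simulated contamination maps on a 100 x 100 grid (synthetic stand-in)
    rng = np.random.default_rng(7)
    sims = rng.lognormal(mean=3.0, sigma=0.8, size=(500, 100, 100))
    p_map = exceedance_probability_map(sims, threshold=35.0)
    print(p_map.shape, p_map.min(), p_map.max())
    ```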

  1. The thresholds for statistical and clinical significance

    DEFF Research Database (Denmark)

    Jakobsen, Janus Christian; Gluud, Christian; Winkel, Per

    2014-01-01

    BACKGROUND: Thresholds for statistical significance are insufficiently demonstrated by 95% confidence intervals or P-values when assessing results from randomised clinical trials. First, a P-value only shows the probability of getting a result assuming that the null hypothesis is true and does not reflect the probability of getting a result assuming an alternative hypothesis to the null hypothesis is true. Second, a confidence interval or a P-value showing significance may be caused by multiplicity. Third, statistical significance does not necessarily result in clinical significance. Therefore ... of the probability that a given trial result is compatible with a 'null' effect (corresponding to the P-value) divided by the probability that the trial result is compatible with the intervention effect hypothesised in the sample size calculation; (3) adjust the confidence intervals and the statistical significance...
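
    The ratio described above has a direct normal-approximation reading: compare the likelihood of the observed effect estimate under a zero effect with its likelihood under the effect assumed in the sample size calculation. A minimal sketch of that interpretation (the numbers are hypothetical, and this is a simplification of the authors' procedure):

    ```python
    from statistics import NormalDist

    def null_vs_design_likelihood_ratio(estimate, se, design_effect):
        """Likelihood of the observed estimate under a 'null' (zero)
        effect, divided by its likelihood under the effect hypothesised
        in the sample size calculation (normal approximation)."""
        z = NormalDist()
        like_null = z.pdf(estimate / se) / se
        like_design = z.pdf((estimate - design_effect) / se) / se
        return like_null / like_design

    # Observed risk difference 0.04 (SE 0.02) vs a designed effect of 0.06:
    # a ratio well below 1 favours the designed effect over the null
    print(null_vs_design_likelihood_ratio(0.04, 0.02, 0.06))  # ~0.22
    ```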

  2. Psychophysics of the probability weighting function

    Science.gov (United States)

    Takahashi, Taiki

    2011-03-01

    A probability weighting function w(p) for an objective probability p in decision under risk plays a pivotal role in Kahneman-Tversky prospect theory. Although recent studies in econophysics and neuroeconomics have widely utilized probability weighting functions, the psychophysical foundations of these functions have been unknown. Notably, the behavioral economist Prelec (1998) [4] axiomatically derived the probability weighting function w(p) = exp(-(-ln p)^α) (0 < α < 1), which satisfies w(0) = 0, w(1/e) = 1/e, and w(1) = 1 and which has been extensively studied in behavioral neuroeconomics. The present study utilizes psychophysical theory to derive Prelec's probability weighting function from psychophysical laws of perceived waiting time in probabilistic choices. Also, the relations between the parameters in the probability weighting function and the probability discounting function in behavioral psychology are derived. Future directions in the application of the psychophysical theory of the probability weighting function in econophysics and neuroeconomics are discussed.
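
    Prelec's function is one line of code, and its fixed point at 1/e is easy to check numerically (the α value below is just an illustrative choice):

    ```python
    import math

    def prelec_w(p, alpha):
        """Prelec (1998) probability weighting function
        w(p) = exp(-(-ln p)^alpha); for 0 < alpha < 1 it gives the typical
        inverse-S shape: small probabilities overweighted, large ones
        underweighted."""
        if p == 0.0:
            return 0.0
        return math.exp(-((-math.log(p)) ** alpha))

    alpha = 0.65
    for p in (0.01, 1 / math.e, 0.5, 0.99):
        print(f"w({p:.3f}) = {prelec_w(p, alpha):.3f}")
    # w(1/e) = 1/e for every alpha: the fixed point noted above
    ```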

  3. Sensitivity of probability-of-failure estimates with respect to probability of detection curve parameters

    Energy Technology Data Exchange (ETDEWEB)

    Garza, J. [University of Texas at San Antonio, Mechanical Engineering, 1 UTSA circle, EB 3.04.50, San Antonio, TX 78249 (United States); Millwater, H., E-mail: harry.millwater@utsa.edu [University of Texas at San Antonio, Mechanical Engineering, 1 UTSA circle, EB 3.04.50, San Antonio, TX 78249 (United States)

    2012-04-15

    A methodology has been developed and demonstrated that can be used to compute the sensitivity of the probability-of-failure (POF) with respect to the parameters of inspection processes that are simulated using probability of detection (POD) curves. The formulation is such that the probabilistic sensitivities can be obtained at negligible cost using sampling methods by reusing the samples used to compute the POF. As a result, the methodology can be implemented for negligible cost in a post-processing non-intrusive manner thereby facilitating implementation with existing or commercial codes. The formulation is generic and not limited to any specific random variables, fracture mechanics formulation, or any specific POD curve as long as the POD is modeled parametrically. Sensitivity estimates for the cases of different POD curves at multiple inspections, and the same POD curves at multiple inspections have been derived. Several numerical examples are presented and show excellent agreement with finite difference estimates with significant computational savings. - Highlights: ► Sensitivity of the probability-of-failure with respect to the probability-of-detection curve. ► The sensitivities are computed with negligible cost using Monte Carlo sampling. ► The change in the POF due to a change in the POD curve parameters can be easily estimated.

  4. Sensitivity of probability-of-failure estimates with respect to probability of detection curve parameters

    International Nuclear Information System (INIS)

    Garza, J.; Millwater, H.

    2012-01-01

    A methodology has been developed and demonstrated that can be used to compute the sensitivity of the probability-of-failure (POF) with respect to the parameters of inspection processes that are simulated using probability of detection (POD) curves. The formulation is such that the probabilistic sensitivities can be obtained at negligible cost using sampling methods by reusing the samples used to compute the POF. As a result, the methodology can be implemented for negligible cost in a post-processing non-intrusive manner thereby facilitating implementation with existing or commercial codes. The formulation is generic and not limited to any specific random variables, fracture mechanics formulation, or any specific POD curve as long as the POD is modeled parametrically. Sensitivity estimates for the cases of different POD curves at multiple inspections, and the same POD curves at multiple inspections have been derived. Several numerical examples are presented and show excellent agreement with finite difference estimates with significant computational savings. - Highlights: ► Sensitivity of the probability-of-failure with respect to the probability-of-detection curve. ► The sensitivities are computed with negligible cost using Monte Carlo sampling. ► The change in the POF due to a change in the POD curve parameters can be easily estimated.

  5. Probability intervals for the top event unavailability of fault trees

    International Nuclear Information System (INIS)

    Lee, Y.T.; Apostolakis, G.E.

    1976-06-01

    The evaluation of probabilities of rare events is of major importance in the quantitative assessment of the risk from large technological systems. In particular, for nuclear power plants the complexity of the systems, their high reliability and the lack of significant statistical records have led to the extensive use of logic diagrams in the estimation of low probabilities. The estimation of probability intervals for the probability of existence of the top event of a fault tree is examined. Given the uncertainties of the primary input data, a method is described for the evaluation of the first four moments of the top event occurrence probability. These moments are then used to estimate confidence bounds by several approaches which are based on standard inequalities (e.g., Tchebycheff, Cantelli, etc.) or on empirical distributions (the Johnson family). Several examples indicate that the Johnson family of distributions yields results which are in good agreement with those produced by Monte Carlo simulation
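
    Johnson-family fitting takes some machinery, but the standard-inequality route mentioned above is compact: from the first two moments alone, the one-sided Cantelli inequality bounds the chance that the top-event probability exceeds any threshold. A sketch (the example moments are hypothetical):

    ```python
    def cantelli_upper_bound(mean, variance, threshold):
        """One-sided Cantelli inequality: for any distribution with the
        given mean and variance, and threshold > mean,
        P(X >= threshold) <= var / (var + (threshold - mean)^2).
        Gives a conservative bound from the first two moments alone."""
        assert threshold > mean
        return variance / (variance + (threshold - mean) ** 2)

    # Top-event probability with mean 1e-5 and standard deviation 2e-5:
    # an upper bound on the chance it actually exceeds 1e-4
    print(cantelli_upper_bound(1e-5, (2e-5) ** 2, 1e-4))  # ~0.047
    ```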

  6. THE BLACK HOLE FORMATION PROBABILITY

    Energy Technology Data Exchange (ETDEWEB)

    Clausen, Drew; Piro, Anthony L.; Ott, Christian D., E-mail: dclausen@tapir.caltech.edu [TAPIR, Walter Burke Institute for Theoretical Physics, California Institute of Technology, Mailcode 350-17, Pasadena, CA 91125 (United States)

    2015-02-01

    A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P_BH(M_ZAMS). Although we find that it is difficult to derive a unique P_BH(M_ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P_BH(M_ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P_BH(M_ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment.

  7. THE BLACK HOLE FORMATION PROBABILITY

    International Nuclear Information System (INIS)

    Clausen, Drew; Piro, Anthony L.; Ott, Christian D.

    2015-01-01

    A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P_BH(M_ZAMS). Although we find that it is difficult to derive a unique P_BH(M_ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P_BH(M_ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P_BH(M_ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment.

  8. The Black Hole Formation Probability

    Science.gov (United States)

    Clausen, Drew; Piro, Anthony L.; Ott, Christian D.

    2015-02-01

    A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P_BH(M_ZAMS). Although we find that it is difficult to derive a unique P_BH(M_ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P_BH(M_ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P_BH(M_ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment.

  9. Future probabilities of coastal floods in Finland

    Science.gov (United States)

    Pellikka, Havu; Leijala, Ulpu; Johansson, Milla M.; Leinonen, Katri; Kahma, Kimmo K.

    2018-04-01

    Coastal planning requires detailed knowledge of future flooding risks, and effective planning must consider both short-term sea level variations and the long-term trend. We calculate distributions that combine short- and long-term effects to provide estimates of flood probabilities in 2050 and 2100 on the Finnish coast in the Baltic Sea. Our distributions of short-term sea level variations are based on 46 years (1971-2016) of observations from the 13 Finnish tide gauges. The long-term scenarios of mean sea level combine postglacial land uplift, regionally adjusted scenarios of global sea level rise, and the effect of changes in the wind climate. The results predict that flooding risks will clearly increase by 2100 in the Gulf of Finland and the Bothnian Sea, while only a small increase or no change compared to present-day conditions is expected in the Bothnian Bay, where the land uplift is stronger.

  10. Oil spill contamination probability in the southeastern Levantine basin.

    Science.gov (United States)

    Goldman, Ron; Biton, Eli; Brokovich, Eran; Kark, Salit; Levin, Noam

    2015-02-15

    Recent gas discoveries in the eastern Mediterranean Sea led to multiple operations with substantial economic interest, and with them there is a risk of oil spills and their potential environmental impacts. To examine the potential spatial distribution of this threat, we created seasonal maps of the probability of oil spill pollution reaching an area in the Israeli coastal and exclusive economic zones, given knowledge of its initial sources. We performed simulations of virtual oil spills using realistic atmospheric and oceanic conditions. The resulting maps show dominance of the alongshore northerly current, which causes the high probability areas to be stretched parallel to the coast, increasing contamination probability downstream of source points. The seasonal westerly wind forcing determines how wide the high probability areas are, and may also restrict these to a small coastal region near source points. Seasonal variability in probability distribution, oil state, and pollution time is also discussed. Copyright © 2014 Elsevier Ltd. All rights reserved.

  11. Influence of interference terms on probability of color connections among partons in e+e-→qq-bar + ng process

    International Nuclear Information System (INIS)

    Jin Yi; Xie Qubing; Li Shiyuan

    2003-01-01

    The large-N_c approximation is adopted in popular e+e- event generators, where the production probability of singlet chain states is 100% and that of color separate states is 0. In the real world N_c = 3, and we investigate the origin and character of the color and kinematic aspects of the interference terms. We find that, after the interference terms are included, the production probability of color singlet chain states decreases from 83% to 67% for the qq-bar + 2g system and from 77% to 58% for the qq-bar + 3g system. In particular, the production probability of color separate states increases to twice that without interference terms for qq-bar + 2g. Hence, for larger n, we can expect the production probability of singlet chain states to be far less than 1 and that of color separate states to increase significantly.

  12. Foundations of the theory of probability

    CERN Document Server

    Kolmogorov, AN

    2018-01-01

    This famous little book remains a foundational text for the understanding of probability theory, important both to students beginning a serious study of probability and to historians of modern mathematics. 1956 second edition.

  13. The Probability Distribution for a Biased Spinner

    Science.gov (United States)

    Foster, Colin

    2012-01-01

    This article advocates biased spinners as an engaging context for statistics students. Calculating the probability of a biased spinner landing on a particular side makes valuable connections between probability and other areas of mathematics. (Contains 2 figures and 1 table.)

  14. Conditional Probability Modulates Visual Search Efficiency

    Directory of Open Access Journals (Sweden)

    Bryan eCort

    2013-10-01

    Full Text Available We investigated the effects of probability on visual search. Previous work has shown that people can utilize spatial and sequential probability information to improve target detection. We hypothesized that performance improvements from probability information would extend to the efficiency of visual search. Our task was a simple visual search in which the target was always present among a field of distractors, and could take one of two colors. The absolute probability of the target being either color was 0.5; however, the conditional probability – the likelihood of a particular color given a particular combination of two cues – varied from 0.1 to 0.9. We found that participants searched more efficiently for high conditional probability targets and less efficiently for low conditional probability targets, but only when they were explicitly informed of the probability relationship between cues and target color.

  15. Analytic Neutrino Oscillation Probabilities in Matter: Revisited

    Energy Technology Data Exchange (ETDEWEB)

    Parke, Stephen J. [Fermilab; Denton, Peter B. [Copenhagen U.; Minakata, Hisakazu [Madrid, IFT

    2018-01-02

    We summarize our recent paper on neutrino oscillation probabilities in matter, explaining the importance, relevance and need for simple, highly accurate approximations to the neutrino oscillation probabilities in matter.

  16. PROBABLE FORECASTING IN THE COURSE OF INTERPRETING

    Directory of Open Access Journals (Sweden)

    Ye. B. Kagan

    2017-01-01

    Full Text Available Introduction. Translation practice is heuristic in nature and engages the cognitive structures of the interpreter's consciousness. When training translators, special attention is paid to developing their skill of probable forecasting. The aim of the present publication is to understand the process of anticipation from the position of the cognitive model of translation and to develop exercises aimed at building the prognostic abilities of students and interpreters when working with newspaper articles containing metaphorical headlines. Methodology and research methods. The study is based on the competence approach to the training of student translators and on a complex of interrelated scientific methods, chief among them the psycholinguistic experiment. Using quantitative data, the features of the perception of newspaper texts through their metaphorical titles are characterized. Results and scientific novelty. On the basis of the experiment conducted to predict the content of newspaper articles with metaphorical headlines, it is concluded that the main condition of predictability is expectation. Probable forecasting as a professional competence of a future translator is formed in the process of training activities by integrating the efforts of various departments of a language university. Specific exercises for developing the anticipation of students in the course of translation and interpretation are offered. Practical significance. The results of the study can be used by foreign language teachers of both language and non-language universities in teaching students of different specialties to translate foreign texts.

  17. Void probability scaling in hadron nucleus interactions

    International Nuclear Information System (INIS)

    Ghosh, Dipak; Deb, Argha; Bhattacharyya, Swarnapratim; Ghosh, Jayita; Bandyopadhyay, Prabhat; Das, Rupa; Mukherjee, Sima

    2002-01-01

    Hegyi, while investigating the rapidity gap probability (which measures the chance of finding no particle in the pseudo-rapidity interval Δη), found that a scaling behavior in the rapidity gap probability has a close correspondence with the scaling of the void probability in galaxy correlation studies. The main aim of this paper is to study the scaling behavior of the rapidity gap probability.

  18. Pre-Service Teachers' Conceptions of Probability

    Science.gov (United States)

    Odafe, Victor U.

    2011-01-01

    Probability knowledge and skills are needed in science and in making daily decisions that are sometimes made under uncertain conditions. Hence, there is the need to ensure that the pre-service teachers of our children are well prepared to teach probability. Pre-service teachers' conceptions of probability are identified, and ways of helping them…

  19. Cladding failure probability modeling for risk evaluations of fast reactors

    International Nuclear Information System (INIS)

    Mueller, C.J.; Kramer, J.M.

    1987-01-01

    This paper develops the methodology to incorporate cladding failure data and associated modeling into risk evaluations of liquid metal-cooled fast reactors (LMRs). Current US innovative designs for metal-fueled pool-type LMRs take advantage of inherent reactivity feedback mechanisms to limit reactor temperature increases in response to classic anticipated-transient-without-scram (ATWS) initiators. Final shutdown without reliance on engineered safety features can then be accomplished if sufficient time is available for operator intervention to terminate fission power production and/or provide auxiliary cooling prior to significant core disruption. Coherent cladding failure under the sustained elevated temperatures of ATWS events serves as one indicator of core disruption. In this paper we combine uncertainties in cladding failure data with uncertainties in calculations of ATWS cladding temperature conditions to calculate probabilities of cladding failure as a function of the time for accident recovery

  20. Cladding failure probability modeling for risk evaluations of fast reactors

    International Nuclear Information System (INIS)

    Mueller, C.J.; Kramer, J.M.

    1987-01-01

    This paper develops the methodology to incorporate cladding failure data and associated modeling into risk evaluations of liquid metal-cooled fast reactors (LMRs). Current U.S. innovative designs for metal-fueled pool-type LMRs take advantage of inherent reactivity feedback mechanisms to limit reactor temperature increases in response to classic anticipated-transient-without-scram (ATWS) initiators. Final shutdown without reliance on engineered safety features can then be accomplished if sufficient time is available for operator intervention to terminate fission power production and/or provide auxiliary cooling prior to significant core disruption. Coherent cladding failure under the sustained elevated temperatures of ATWS events serves as one indicator of core disruption. In this paper we combine uncertainties in cladding failure data with uncertainties in calculations of ATWS cladding temperature conditions to calculate probabilities of cladding failure as a function of the time for accident recovery. (orig.)

  1. Subspace Learning via Local Probability Distribution for Hyperspectral Image Classification

    Directory of Open Access Journals (Sweden)

    Huiwu Luo

    2015-01-01

    Full Text Available The computational processing of hyperspectral images (HSI) is extremely complex, not only due to the high-dimensional information, but also due to the highly correlated data structure. The need for effective processing and analysis of HSI has met many difficulties. It has been evidenced that dimensionality reduction is a powerful tool for high-dimensional data analysis. Local Fisher's linear discriminant analysis (LFDA) is an effective method for HSI processing. In this paper, a novel approach, called PD-LFDA, is proposed to overcome the weakness of LFDA. PD-LFDA emphasizes the probability distribution (PD) in LFDA, where the maximum distance is replaced with local variance for the construction of the weight matrix and the class prior probability is applied to compute the affinity matrix. The proposed approach increases the discriminant ability of the transformed features in low-dimensional space. Experimental results on the Indian Pines 1992 data indicate that the proposed approach significantly outperforms the traditional alternatives.

  2. Probability of Alzheimer's disease in breast cancer survivors based on gray-matter structural network efficiency.

    Science.gov (United States)

    Kesler, Shelli R; Rao, Vikram; Ray, William J; Rao, Arvind

    2017-01-01

    Breast cancer chemotherapy is associated with accelerated aging and potentially increased risk for Alzheimer's disease (AD). We calculated the probability of AD diagnosis from brain network, demographic, and genetic data obtained from 47 female AD converters and 47 matched healthy controls. We then applied this algorithm to data from 78 breast cancer survivors. The classifier discriminated between AD and healthy controls with 86% accuracy (P < .0001). Chemotherapy-treated breast cancer survivors demonstrated significantly higher probability of AD compared to healthy controls (P < .0001) and chemotherapy-naïve survivors (P = .007), even after stratifying for apolipoprotein e4 genotype. Chemotherapy-naïve survivors also showed higher AD probability compared to healthy controls (P = .014). Chemotherapy-treated breast cancer survivors who have a particular profile of brain structure may have a higher risk for AD, especially those who are older and have lower cognitive reserve.

  3. Incorporation of various uncertainties in dependent failure-probability estimation

    International Nuclear Information System (INIS)

    Samanta, P.K.; Mitra, S.P.

    1982-01-01

    This paper describes an approach that allows the incorporation of various types of uncertainties in the estimation of dependent failure (common mode failure) probability. The types of uncertainties considered are attributable to data, modeling and coupling. The method developed is applied to a class of dependent failures, i.e., multiple human failures during testing, maintenance and calibration. Estimation of these failures is critical as they have been shown to be significant contributors to core melt probability in pressurized water reactors

  4. INVESTIGATION OF INFLUENCE OF ENCODING FUNCTION COMPLEXITY ON DISTRIBUTION OF ERROR MASKING PROBABILITY

    Directory of Open Access Journals (Sweden)

    A. B. Levina

    2016-03-01

    Full Text Available Error detection codes are mechanisms that enable robust delivery of data over unreliable communication channels and devices. Unreliable channels and devices are error-prone objects, and error detection codes allow such errors to be detected. There are two classes of error detecting codes: classical codes and security-oriented codes. Classical codes detect a high percentage of errors; however, they have a high probability of missing an error introduced by algebraic manipulation. Security-oriented codes, in turn, are codes with a small Hamming distance and high protection against algebraic manipulation. The probability of error masking is a fundamental parameter of security-oriented codes. A detailed study of this parameter allows analyzing the behavior of the error-correcting code in the case of error injection into the encoding device. The complexity of the encoding function also plays an important role in security-oriented codes. Encoding functions with less computational complexity and a low probability of masking offer the best protection of the encoding device against malicious acts. This paper investigates the influence of encoding function complexity on the error masking probability distribution. It will be shown that a more complex encoding function reduces the maximum of the error masking probability. It is also shown that increasing the function complexity changes the error masking probability distribution; in particular, it decreases the difference between the maximum and average values of the error masking probability. Our results have shown that functions with greater complexity have smoothed maxima of the error masking probability, which significantly complicates the analysis of the error-correcting code by an attacker. As a result, for a complex encoding function the probability of algebraic manipulation is reduced. The paper discusses an approach to measuring the error masking probability.

  5. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2014-01-01

    The long-awaited revision of Fundamentals of Applied Probability and Random Processes expands on the central components that made the first edition a classic. The title is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability to real-world problems, and introduce the basics of statistics.

  6. Probability of Failure in Random Vibration

    DEFF Research Database (Denmark)

    Nielsen, Søren R.K.; Sørensen, John Dalsgaard

    1988-01-01

    Close approximations to the first-passage probability of failure in random vibration can be obtained by integral equation methods. A simple relation exists between the first-passage probability density function and the distribution function for the time interval spent below a barrier before out-crossing. An integral equation for the probability density function of the time interval is formulated, and adequate approximations for the kernel are suggested. The kernel approximation results in approximate solutions for the probability density function of the time interval and thus for the first-passage probability...
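    For orientation, a crude Monte Carlo cross-check of the quantity being approximated, not the paper's integral-equation method: the probability that a stationary Ornstein-Uhlenbeck response crosses an assumed barrier b within time T.

```python
# First-passage probability of a stationary Ornstein-Uhlenbeck process
# (dx = -x dt + sqrt(2) dW, unit stationary variance) above barrier b.
# All parameters are illustrative.
import numpy as np

rng = np.random.default_rng(2)
b, T, dt, n_paths = 2.0, 10.0, 0.01, 20_000

x = rng.normal(size=n_paths)          # start from the stationary distribution
alive = np.ones(n_paths, dtype=bool)  # paths that have not yet crossed b
for _ in range(int(T / dt)):
    x = x - x * dt + np.sqrt(2 * dt) * rng.normal(size=n_paths)
    alive &= (x < b)

print(f"first-passage probability within T={T}: {1 - alive.mean():.3f}")
```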

  7. An Objective Theory of Probability (Routledge Revivals)

    CERN Document Server

    Gillies, Donald

    2012-01-01

    This reissue of D. A. Gillies' highly influential work, first published in 1973, is a philosophical theory of probability which seeks to develop von Mises' views on the subject. In agreement with von Mises, the author regards probability theory as a mathematical science like mechanics or electrodynamics, and probability as an objective, measurable concept like force, mass or charge. On the other hand, Dr Gillies rejects von Mises' definition of probability in terms of limiting frequency and claims that probability should be taken as a primitive or undefined term in accordance with modern axioma

  8. Paraconsistent Probabilities: Consistency, Contradictions and Bayes’ Theorem

    Directory of Open Access Journals (Sweden)

    Juliana Bueno-Soler

    2016-09-01

    Full Text Available This paper represents the first steps towards constructing a paraconsistent theory of probability based on the Logics of Formal Inconsistency (LFIs. We show that LFIs encode very naturally an extension of the notion of probability able to express sophisticated probabilistic reasoning under contradictions employing appropriate notions of conditional probability and paraconsistent updating, via a version of Bayes’ theorem for conditionalization. We argue that the dissimilarity between the notions of inconsistency and contradiction, one of the pillars of LFIs, plays a central role in our extended notion of probability. Some critical historical and conceptual points about probability theory are also reviewed.

  9. Collection of offshore human error probability data

    International Nuclear Information System (INIS)

    Basra, Gurpreet; Kirwan, Barry

    1998-01-01

    Accidents such as Piper Alpha have increased concern about the effects of human errors in complex systems. Such accidents can in theory be predicted and prevented by risk assessment, and in particular human reliability assessment (HRA), but HRA ideally requires qualitative and quantitative human error data. A research initiative at the University of Birmingham led to the development of CORE-DATA, a Computerised Human Error Data Base. This system currently contains a reasonably large number of human error data points, collected from a variety of mainly nuclear-power related sources. This article outlines a recent offshore data collection study, concerned with collecting lifeboat evacuation data. Data collection methods are outlined and a selection of human error probabilities generated as a result of the study are provided. These data give insights into the type of errors and human failure rates that could be utilised to support offshore risk analyses

  10. Release of Hormones from Conjugates: Chloroplast Expression of β-Glucosidase Results in Elevated Phytohormone Levels Associated with Significant Increase in Biomass and Protection from Aphids or Whiteflies Conferred by Sucrose Esters

    Science.gov (United States)

    Jin, Shuangxia; Kanagaraj, Anderson; Verma, Dheeraj; Lange, Theo; Daniell, Henry

    2011-01-01

    Transplastomic tobacco (Nicotiana tabacum) plants expressing β-glucosidase (Bgl-1) show modified development. They flower 1 month earlier and show increases in biomass (1.9-fold), height (1.5-fold), and leaf area (1.6-fold) relative to untransformed plants. Trichome densities on the upper and lower leaf surfaces of BGL-1 plants increase by 10- and 7-fold, respectively, harboring 5-fold more glandular trichomes (as determined by rhodamine B staining), suggesting that BGL-1 lines produce more sugar esters than control plants. Gibberellin (GA) levels were investigated because GA is a known regulator of flowering time, plant height, and trichome development. Both GA1 and GA4 levels are 2-fold higher in BGL-1 leaves than in untransformed plants but do not increase in other organs. In addition, elevated levels of other plant hormones, including zeatin and indole-3-acetic acid, are observed in BGL-1 lines. Protoplasts from BGL-1 lines divide and form calli without exogenous hormones. Cell division in protoplasts is enhanced 7-fold in the presence of exogenously applied zeatin-O-glucoside conjugate, indicating the release of active hormones from their conjugates. Whitefly (Bemisia tabaci) and aphid (Myzus persicae) populations in control plants are 18 and 15 times higher than in transplastomic lines, respectively. LD50 values (the dose lethal to 50% of the test population) of 26.3 and 39.2 μg per whitefly and 23.1 and 35.2 μg per aphid for BGL-1 and untransformed control exudates, respectively, confirm the enhanced toxicity of transplastomic exudates. These data indicate that the increase in sugar ester levels in BGL-1 lines might function as an effective biopesticide. This study provides a novel strategy for designing plants for enhanced biomass production and insect control by releasing plant hormones or sugar esters from their conjugates stored within their chloroplasts. PMID:21068365

  12. Measures, Probability and Holography in Cosmology

    Science.gov (United States)

    Phillips, Daniel

    This dissertation compiles four research projects on predicting values for cosmological parameters and models of the universe on the broadest scale. The first examines the Causal Entropic Principle (CEP) in inhomogeneous cosmologies. The CEP aims to predict the unexpectedly small value of the cosmological constant Λ using a weighting by entropy increase on causal diamonds. The original work assumed a purely isotropic and homogeneous cosmology. But even the level of inhomogeneity observed in our universe forces reconsideration of certain arguments about entropy production. In particular, we must consider an ensemble of causal diamonds associated with each background cosmology, and we can no longer immediately discard entropy production in the far future of the universe. Depending on our choices for a probability measure and our treatment of black hole evaporation, the prediction for Λ may be left intact or dramatically altered. The second related project extends the CEP to universes with curvature. We have found that curvature values larger than ρ_k = 40ρ_m are disfavored by more than 99.99%, with a peak value at ρ_Λ = 7.9 × 10^-123 and ρ_k = 4.3ρ_m for open universes. For universes that allow only positive curvature or both positive and negative curvature, we find a correlation between curvature and dark energy that leads to an extended region of preferred values. Our universe is found to be disfavored to an extent depending on the priors on curvature. We also provide a comparison to previous anthropic constraints on open universes and discuss future directions for this work. The third project examines how cosmologists should formulate basic questions of probability. We argue using simple models that all successful practical uses of probabilities originate in quantum fluctuations in the microscopic physical world around us, often propagated to macroscopic scales. Thus we claim there is no physically verified fully classical theory of probability. We

  13. Probability Distribution and Projected Trends of Daily Precipitation in China

    Institute of Scientific and Technical Information of China (English)

    CAO Li-Ge; ZHONG Jun; SU Bu-Da; ZHAI Jian-Qing; Marco GEMMER

    2013-01-01

    Based on observed daily precipitation data from 540 stations and 3,839 gridded data points from the high-resolution regional climate model COSMO-Climate Limited-area Modeling (CCLM) for 1961–2000, the ability of CCLM to simulate daily precipitation in China is examined, and the variation of the daily precipitation distribution pattern is revealed. By applying probability distribution and extreme value theory to the projected daily precipitation (2011–2050) under the SRES A1B scenario with CCLM, trends of daily precipitation series and daily precipitation extremes are analyzed. Results show that, except for the western Qinghai-Tibetan Plateau and South China, the distribution patterns of the kurtosis and skewness calculated from the simulated and observed series are consistent with each other; their spatial correlation coefficients are above 0.75. CCLM captures the distribution characteristics of daily precipitation over China well. It is projected that in some parts of the Jianghuai region, central-eastern Northeast China and Inner Mongolia, the kurtosis and skewness will increase significantly, and precipitation extremes will increase during 2011–2050. The projected increases of maximum daily rainfall and of the longest non-precipitation period during the flood season in the aforementioned regions also point to increasing trends of droughts and floods in the next 40 years.
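    A minimal sketch of the distribution diagnostics used above, skewness and kurtosis of a daily precipitation series, computed here on synthetic gamma-distributed wet-day amounts rather than on CCLM output.

```python
# Skewness and kurtosis of a synthetic 40-year daily precipitation series:
# ~30% wet days, gamma-distributed wet-day amounts (illustrative only).
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n_days = 40 * 365
wet = rng.random(n_days) < 0.3
precip = np.where(wet, rng.gamma(0.8, 8.0, n_days), 0.0)  # mm/day

print("skewness:", round(stats.skew(precip), 2))
print("kurtosis:", round(stats.kurtosis(precip), 2))      # excess kurtosis
```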

  14. Misclassification probability as obese or lean in hypercaloric and normocaloric diet

    Directory of Open Access Journals (Sweden)

    ANDRÉ F NASCIMENTO

    2008-01-01

    Full Text Available The aim of the present study was to determine the classification error probabilities, as lean or obese, in hypercaloric diet-induced obesity, which depend on the variable used to characterize animal obesity. In addition, the misclassification probabilities in animals submitted to a normocaloric diet were also evaluated. Male Wistar rats were randomly distributed into two groups: normal diet (ND; n=31; 3.5 kcal/g) and hypercaloric diet (HD; n=31; 4.6 kcal/g). The ND group received commercial Labina rat feed and HD animals a cycle of five hypercaloric diets over a 14-week period. The variables analysed were body weight, body composition, body weight to length ratio, Lee index, body mass index and misclassification probability. A 5% significance level was used. The hypercaloric pellet-diet cycle promoted increases in body weight, carcass fat, body weight to length ratio and Lee index. The total misclassification probabilities ranged from 19.21% to 40.91%. In conclusion, the results of this experiment show that misclassification probabilities occur when dietary manipulation is used to promote obesity in animals. This misjudgement ranges from 19.49% to 40.52% in the hypercaloric diet and from 18.94% to 41.30% in the normocaloric diet.
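    An illustrative calculation of how such misclassification probabilities arise (the distributions and the cutoff below are assumed, not the study's data): the overlap of the two group distributions at a classification cutoff gives the error rates directly.

```python
# If an adiposity index is roughly normal within each diet group, the
# misclassification probabilities follow from the overlap at a cutoff.
from scipy import stats

lean = stats.norm(300, 25)    # e.g., Lee index, normocaloric group (assumed)
obese = stats.norm(340, 30)   # hypercaloric group (assumed)
cutoff = 320.0

p_lean_as_obese = lean.sf(cutoff)    # lean animal classified as obese
p_obese_as_lean = obese.cdf(cutoff)  # obese animal classified as lean
print(f"lean -> obese: {p_lean_as_obese:.1%}, obese -> lean: {p_obese_as_lean:.1%}")
```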

  15. Significant increase of Curie temperature and large piezoelectric coefficient in Ba(Ti0.80Zr0.20)O3-0.5(Ba0.70Ca0.30)TiO3 nanofibers

    Science.gov (United States)

    Fu, Bi; Yang, Yaodong; Gao, Kun; Wang, Yaping

    2015-07-01

    Ba(Ti0.80Zr0.20)O3-0.5(Ba0.7Ca0.3)TiO3 (abbreviated as BTZ-0.5BCT) is a piezoelectric ceramic with a high piezoelectric coefficient d33 (~620 pC N-1) and has been regarded as one of the most promising candidates to replace PZT-based materials (200-710 pC N-1). However, its Curie temperature TC is relatively low (93 °C), limiting its applications. In this letter, we found a temperature dependent Raman spectrum in BTZ-0.5BCT nanofibers (NFs), demonstrating a diffused tetragonal-to-cubic phase transition at 300 °C. This means that the TC of the NFs is nearly 207 °C higher than that of the normal bulk material. The increased TC is considered to be associated with the size effect of the BTZ-0.5BCT nanoceramic subunits and the nanoporous nature of the fiber, resulting in discontinuous physical properties. The variation of the ferro/piezoelectricity over the fiber surface is attributed to the polycrystalline structure. The d33 (173.32 pm V-1) is improved after polarization: the decreased Q factor results in an increase of d33 to 236.54 pm V-1. With a high TC and a very large d33, BTZ-0.5BCT NFs are capable of providing electromechanical behavior at moderate temperatures.

  16. Probability concepts in quality risk management.

    Science.gov (United States)

    Claycamp, H Gregg

    2012-01-01

    Essentially any concept of risk is built on fundamental concepts of chance, likelihood, or probability. Although "risk" generally describes a probability of loss of something of value, given that a risk-generating event will occur or has occurred, it is ironic that the quality risk management literature and guidelines on quality risk management methodologies and respective tools focus on managing severity but are relatively silent on the in-depth meaning and uses of "probability." Pharmaceutical manufacturers are expanding their use of quality risk management to identify and manage risks to the patient that might occur in phases of the pharmaceutical life cycle from drug development to manufacture, marketing to product discontinuation. The probability concept is typically applied by risk managers as a combination of data-based measures of probability and a subjective "degree of belief" meaning of probability. Probability as a concept that is crucial for understanding and managing risk is discussed through examples, from the most general, scenario-defining and ranking tools that use probability implicitly to more specific probabilistic tools in risk management. A rich history of probability in risk management applied to other fields suggests that high-quality risk management decisions benefit from the implementation of more thoughtful probability concepts in both risk modeling and risk management.

  17. Guanylin and uroguanylin mRNA expression is increased following Roux-en-Y gastric bypass, but guanylins do not play a significant role in body weight regulation and glycemic control

    DEFF Research Database (Denmark)

    Fernandez-Cachon, María L; Pedersen, Søren L; Rigbolt, Kristoffer T

    2018-01-01

    AIM: To determine whether intestinal expression of guanylate cyclase activator 2A (GUCA2A) and guanylate cyclase activator 2B (GUCA2B) genes is regulated in obese humans following Roux-en-Y gastric bypass (RYGB), and to evaluate the corresponding guanylin (GN) and uroguanylin (UGN) peptides...... for potentially contributing to the beneficial metabolic effects of RYGB. METHODS: Enteroendocrine cells were harvested peri- and post-RYGB, and GUCA2A/GUCA2B mRNA expression was compared. GN, UGN and their prohormones (proGN, proUGN) were administered subcutaneously in normal-weight mice to evaluate effects...... on food intake. GN and UGN, as well as their prohormones, were evaluated for effects on glucose-stimulated insulin secretion (GSIS) in rat pancreatic islets and perfused rat pancreas. RESULTS: GUCA2A and GUCA2B mRNA expression was significantly upregulated in enteroendocrine cells after RYGB. Peripheral...

  18. Human circulating ribosomal DNA content significantly increases while circulating satellite III (1q12) content decreases under chronic occupational exposure to low-dose gamma-neutron and tritium beta-radiation.

    Science.gov (United States)

    Korzeneva, Inna B; Kostuyk, Svetlana V; Ershova, Elizaveta S; Skorodumova, Elena N; Zhuravleva, Veronika F; Pankratova, Galina V; Volkova, Irina V; Stepanova, Elena V; Porokhovnik, Lev N; Veiko, Natalia N

    A single exposure to ionizing radiation (IR) results in an elevated cell-free DNA (cfDNA) content in the blood plasma. In this case, the cfDNA concentration can be a marker of cell death in the organism. However, a chronic exposure to a low-dose IR enhances both the endonuclease activity and titer of antibodies to DNA in blood plasma, resulting in a decrease of the total concentration of circulating cfDNA in exposed people. In this case, the total cfDNA concentration should not be considered as a marker of cell death in an exposed body. We assumed that a pool of the cfDNA circulating in the exposed people contains DNA fragments, which are resistant to double-strand break formation in the environment of the elevated plasma endonuclease activity, and can be accumulated in the blood plasma. In order to test this hypothesis, we studied the content of GC-rich sequences (69%GC) of the transcribed region of human ribosomal repeat (rDNA), as well as the content of AT-rich repeat (63%AT) of satellite III (1q12) in the cfDNA samples obtained from 285 individuals. We have found that a chronic exposure to gamma-neutron radiation (N=88) and tritium β-radiation (N=88) evokes an increase of the rDNA content (RrDNA index) and a decrease of the satellite III content (RsatIII index) in the circulating cfDNA as compared with the cfDNA of non-exposed people (N=109). An index that simultaneously displays both the increase of rDNA content and the decrease of satellite III content in the cfDNA (RrDNA/RsatIII) can be recommended as a marker of chronic processes in the body that involve an elevated cell death rate and/or increased blood plasma endonuclease activity. Copyright © 2016 Elsevier B.V. All rights reserved.

  19. Transition probability spaces in loop quantum gravity

    Science.gov (United States)

    Guo, Xiao-Kan

    2018-03-01

    We study the (generalized) transition probability spaces, in the sense of Mielnik and Cantoni, for spacetime quantum states in loop quantum gravity. First, we show that loop quantum gravity admits the structures of transition probability spaces. This is exemplified by first checking such structures in covariant quantum mechanics and then identifying the transition probability spaces in spin foam models via a simplified version of general boundary formulation. The transition probability space thus defined gives a simple way to reconstruct the discrete analog of the Hilbert space of the canonical theory and the relevant quantum logical structures. Second, we show that the transition probability space and in particular the spin foam model are 2-categories. Then we discuss how to realize in spin foam models two proposals by Crane about the mathematical structures of quantum gravity, namely, the quantum topos and causal sites. We conclude that transition probability spaces provide us with an alternative framework to understand various foundational questions of loop quantum gravity.

  20. Towards a Categorical Account of Conditional Probability

    Directory of Open Access Journals (Sweden)

    Robert Furber

    2015-11-01

    Full Text Available This paper presents a categorical account of conditional probability, covering both the classical and the quantum case. Classical conditional probabilities are expressed as a certain "triangle-fill-in" condition, connecting marginal and joint probabilities, in the Kleisli category of the distribution monad. The conditional probabilities are induced by a map together with a predicate (the condition). The latter is a predicate in the logic of effect modules on this Kleisli category. This same approach can be transferred to the category of C*-algebras (with positive unital maps), whose predicate logic is also expressed in terms of effect modules. Conditional probabilities can again be expressed via a triangle-fill-in property. In the literature, there are several proposals for what quantum conditional probability should be, and also there are extra difficulties not present in the classical case. At this stage, we only describe quantum systems with classical parametrization.

  1. UT Biomedical Informatics Lab (BMIL) probability wheel

    Science.gov (United States)

    Huang, Sheng-Cheng; Lee, Sara; Wang, Allen; Cantor, Scott B.; Sun, Clement; Fan, Kaili; Reece, Gregory P.; Kim, Min Soon; Markey, Mia K.

    A probability wheel app is intended to facilitate communication between two people, an "investigator" and a "participant", about uncertainties inherent in decision-making. Traditionally, a probability wheel is a mechanical prop with two colored slices. A user adjusts the sizes of the slices to indicate the relative value of the probabilities assigned to them. A probability wheel can improve the adjustment process and attenuate the effect of anchoring bias when it is used to estimate or communicate probabilities of outcomes. The goal of this work was to develop a mobile application of the probability wheel that is portable, easily available, and more versatile. We provide a motivating example from medical decision-making, but the tool is widely applicable for researchers in the decision sciences.
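    A hypothetical miniature of such a tool, not the BMIL app itself: a two-slice wheel whose slice sizes encode complementary probabilities.

```python
# Draw a two-slice "probability wheel" for an event probability p.
import matplotlib.pyplot as plt

p = 0.3  # probability assigned to the outcome of interest
fig, ax = plt.subplots()
ax.pie([p, 1 - p], colors=["tab:orange", "tab:blue"],
       labels=[f"event ({p:.0%})", f"complement ({1 - p:.0%})"],
       startangle=90)
ax.set_title("Probability wheel")
plt.show()
```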

  2. A probability space for quantum models

    Science.gov (United States)

    Lemmens, L. F.

    2017-06-01

    A probability space contains a set of outcomes, a collection of events formed by subsets of the set of outcomes, and probabilities defined for all events. A reformulation in terms of propositions allows one to use the maximum entropy method to assign the probabilities, taking some constraints into account. The construction of a probability space for quantum models is determined by the choice of propositions, the choice of constraints, and the probability assignment by the maximum entropy method. This approach shows how typical quantum distributions such as Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein are partly related to well-known classical distributions. The relation between the conditional probability density, given some averages as constraints, and the appropriate ensemble is elucidated.
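    A minimal sketch of the maximum-entropy assignment step described above: probabilities on discrete outcomes constrained to a given mean take the familiar exponential form, with the multiplier found numerically. The outcome values and the constraint are arbitrary choices.

```python
# Maximum-entropy probabilities for discrete outcomes under a mean
# constraint: p_i ~ exp(-beta * E_i), with beta fixed by the constraint.
import numpy as np
from scipy.optimize import brentq

levels = np.array([0.0, 1.0, 2.0, 3.0])   # outcome values (e.g., energies)
target_mean = 1.2                         # constraint on the average

def mean_at(beta):
    w = np.exp(-beta * levels)
    return (levels * w).sum() / w.sum()

beta = brentq(lambda b: mean_at(b) - target_mean, -10.0, 10.0)
p = np.exp(-beta * levels)
p /= p.sum()
print("beta:", round(beta, 4), "probabilities:", p.round(4))
```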

  3. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2005-01-01

    This book is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability to real-world problems, and introduce the basics of statistics. The book's clear writing style and homework problems make it ideal for the classroom or for self-study. Features: a good and solid introduction to probability theory and stochastic processes; logically organized, with writing presented in a clear manner; a comprehensive choice of topics within the area of probability; ample homework problems organized into chapter sections.

  4. Striatal activity is modulated by target probability.

    Science.gov (United States)

    Hon, Nicholas

    2017-06-14

    Target probability has well-known neural effects. In the brain, target probability is known to affect frontal activity, with lower probability targets producing more prefrontal activation than those that occur with higher probability. Although the effect of target probability on cortical activity is well specified, its effect on subcortical structures such as the striatum is less well understood. Here, I examined this issue and found that the striatum was highly responsive to target probability. This is consistent with its hypothesized role in the gating of salient information into higher-order task representations. The current data are interpreted in light of that fact that different components of the striatum are sensitive to different types of task-relevant information.

  5. Defining Probability in Sex Offender Risk Assessment.

    Science.gov (United States)

    Elwood, Richard W

    2016-12-01

    There is ongoing debate and confusion over using actuarial scales to predict individuals' risk of sexual recidivism. Much of the debate comes from not distinguishing Frequentist from Bayesian definitions of probability. Much of the confusion comes from applying Frequentist probability to individuals' risk. By definition, only Bayesian probability can be applied to the single case. The Bayesian concept of probability resolves most of the confusion and much of the debate in sex offender risk assessment. Although Bayesian probability is well accepted in risk assessment generally, it has not been widely used to assess the risk of sex offenders. I review the two concepts of probability and show how the Bayesian view alone provides a coherent scheme to conceptualize individuals' risk of sexual recidivism.
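    A hedged numerical illustration of the Bayesian view endorsed above (all numbers invented): an individual's risk is a posterior probability, obtained by updating a base rate with a likelihood ratio attached to an actuarial score.

```python
# Bayesian single-case risk: prior odds times likelihood ratio gives
# posterior odds, hence a posterior probability for this individual.
base_rate = 0.10   # assumed base rate of recidivism
lr = 4.0           # assumed likelihood ratio for a high actuarial score

prior_odds = base_rate / (1 - base_rate)
post_odds = prior_odds * lr
posterior = post_odds / (1 + post_odds)
print(f"posterior probability of recidivism: {posterior:.2f}")  # ~0.31
```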

  6. Analysis of Drop Call Probability in Well Established Cellular ...

    African Journals Online (AJOL)

    Technology in Africa has increased over the past decade. The increase in modern cellular networks requires stringent quality of service (QoS). Drop call probability is one of the most important indices of QoS evaluation in a large scale well-established cellular network. In this work we started from an accurate statistical ...

  7. The probable effect of integrated reporting on audit quality

    Directory of Open Access Journals (Sweden)

    Tamer A. El Nashar

    2016-06-01

    Full Text Available This paper examines a probable effect of integrated reporting on improving the audit quality of organizations. I relate the hypothesis of this paper to current trends in protecting economies, financial markets and societies. I predict an improvement in audit quality as a result of an estimated percentage of organizations relying on integrated reporting in their accountability perspective. I used a decision tree and a Bayes' theorem approach to predict the probabilities of a significant effect on improving audit quality. The overall result of this paper indicates that if a significant percentage of organizations rely on integrated reporting, a significant improvement in audit quality is also predicted.
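    A toy Bayes'-theorem calculation in the spirit of the paper's decision-tree approach; all of the numbers are invented for illustration.

```python
# Total probability and Bayes' rule on a two-branch decision tree:
# adoption of integrated reporting vs. not, then audit-quality improvement.
p_adopt = 0.6                 # assumed share relying on integrated reporting
p_improve_given_adopt = 0.8   # assumed improvement probability if adopted
p_improve_given_not = 0.3     # assumed improvement probability otherwise

p_improve = (p_adopt * p_improve_given_adopt
             + (1 - p_adopt) * p_improve_given_not)
p_adopt_given_improve = p_adopt * p_improve_given_adopt / p_improve
print(f"P(improve) = {p_improve:.2f}, P(adopt | improve) = {p_adopt_given_improve:.2f}")
```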

  8. Reliability of structures by using probability and fatigue theories

    International Nuclear Information System (INIS)

    Lee, Ouk Sub; Kim, Dong Hyeok; Park, Yeon Chang

    2008-01-01

    Methodologies to calculate failure probability and to estimate the reliability of fatigue loaded structures are developed. The applicability of the methodologies is evaluated with the help of the fatigue crack growth models suggested by Paris and Walker. Probability theories such as the FORM (first order reliability method), the SORM (second order reliability method) and the MCS (Monte Carlo simulation) are utilized. It is found that the failure probability decreases with an increase of the design fatigue life and the applied minimum stress, and with a decrease of the initial edge crack size, the applied maximum stress and the slope of the Paris equation. Furthermore, according to the sensitivity analysis of random variables, the slope of the Paris equation affects the failure probability dominantly among the random variables in the Paris and the Walker models.
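    A hedged Monte Carlo sketch in the spirit of the MCS approach mentioned above (the parameter distributions are invented, not the paper's): propagate a random initial crack size and Paris coefficient through the closed-form integral of Paris' law and count crossings of a critical crack size.

```python
# Paris' law da/dN = C * (dK)^m with dK = dsig * sqrt(pi * a), integrated
# in closed form for m != 2; failure when the crack reaches a_crit.
import numpy as np

rng = np.random.default_rng(4)
n = 50_000
a0 = rng.lognormal(np.log(1e-3), 0.2, n)   # initial crack size [m]
C = rng.lognormal(np.log(1e-11), 0.4, n)   # Paris coefficient (MPa*sqrt(m) units)
m_exp, dsig, N, a_crit = 3.0, 100.0, 8e5, 0.02  # exponent, MPa, cycles, m

k = 1 - m_exp / 2
base = a0**k + k * C * (dsig * np.sqrt(np.pi)) ** m_exp * N
aN = np.full(n, np.inf)                    # base <= 0 means runaway growth
ok = base > 0
aN[ok] = base[ok] ** (1 / k)               # crack size after N cycles
print(f"estimated failure probability: {np.mean(aN >= a_crit):.3f}")
```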

  9. Spatial probability aids visual stimulus discrimination

    Directory of Open Access Journals (Sweden)

    Michael Druker

    2010-08-01

    Full Text Available We investigated whether the statistical predictability of a target's location would influence how quickly and accurately it was classified. Recent results have suggested that spatial probability can be a cue for the allocation of attention in visual search. One explanation for probability cuing is spatial repetition priming. In our two experiments we used probability distributions that were continuous across the display rather than relying on a few arbitrary screen locations. This produced fewer spatial repeats and allowed us to dissociate the effect of a high probability location from that of short-term spatial repetition. The task required participants to quickly judge the color of a single dot presented on a computer screen. In Experiment 1, targets were more probable in an off-center hotspot of high probability that gradually declined to a background rate. Targets garnered faster responses if they were near earlier target locations (priming) and if they were near the high probability hotspot (probability cuing). In Experiment 2, target locations were chosen on three concentric circles around fixation. One circle contained 80% of targets. The value of this ring distribution is that it allowed for a spatially restricted high probability zone in which sequentially repeated trials were not likely to be physically close. Participant performance was sensitive to the high-probability circle in addition to the expected effects of eccentricity and the distance to recent targets. These two experiments suggest that inhomogeneities in spatial probability can be learned and used by participants on-line and without prompting as an aid for visual stimulus discrimination, and that spatial repetition priming is not a sufficient explanation for this effect. Future models of attention should consider explicitly incorporating the probabilities of target locations and features.

  10. Human circulating ribosomal DNA content significantly increases while circulating satellite III (1q12) content decreases under chronic occupational exposure to low-dose gamma-neutron and tritium beta-radiation

    International Nuclear Information System (INIS)

    Korzeneva, Inna B.; Kostuyk, Svetlana V.; Ershova, Elizaveta S.; Skorodumova, Elena N.; Zhuravleva, Veronika F.; Pankratova, Galina V.; Volkova, Irina V.; Stepanova, Elena V.; Porokhovnik, Lev N.; Veiko, Natalia N.

    2016-01-01

    Highlights: • A transcribed region of human ribosomal repeat is resistant to double-strand breaks in the environment of a raised endonuclease activity. • Hybridization-based techniques are preferable for the analysis of damaged and/or oxidized genomic fragments, rather than the qRT-PCR method. • A chronic exposure to the low-dose IR induces an elevation of the rDNA content in the human circulating cfDNA as compared to cellular DNA. • An exposure to IR entails a decrease of the level of the human circulating satellite III (1q12) as compared to cellular DNA (RsatIII index). • The RrDNA/RsatIII ratio is a potential marker of a chronic IR individual exposure. - Abstract: A single exposure to ionizing radiation (IR) results in an elevated cell-free DNA (cfDNA) content in the blood plasma. In this case, the cfDNA concentration can be a marker of the cell death in the organism. However, a chronic exposure to a low-dose IR enhances both the endonuclease activity and titer of antibodies to DNA in blood plasma, resulting in a decrease of the total concentration of circulating cfDNA in exposed people. In this case, the total cfDNA concentration should not be considered as a marker of the cell death in an exposed body. We assumed that a pool of the cfDNA circulating in the exposed people contains DNA fragments, which are resistant to a double-strand break formation in the environment of the elevated plasma endonuclease activity, and can be accumulated in the blood plasma. In order to test this hypothesis, we studied the content of GC-rich sequences (69%GC) of the transcribed region of human ribosomal repeat (rDNA), as well as the content of AT-rich repeat (63%AT) of satellite III (1q12) in the cfDNA samples obtained from 285 individuals. We have found that a chronic exposure to gamma-neutron radiation (N = 88) and tritium β-radiation (N = 88) evokes an increase of the rDNA content (RrDNA index) and a decrease of the satellite III content (RsatIII index) in the

  12. Decreased Serum Lipids in Patients with Probable Alzheimer's Disease

    Directory of Open Access Journals (Sweden)

    Orhan Lepara

    2009-08-01

    Full Text Available Alzheimer's disease (AD) is a multifactorial disease, but its aetiology and pathophysiology are still not fully understood. Epidemiologic studies examining the association between lipids and dementia have reported conflicting results. High total cholesterol has been associated with both an increased and a decreased risk of AD and/or vascular dementia (VAD), whereas other studies found no association. The aim of this study was to investigate serum lipid concentrations in patients with probable AD, as well as a possible correlation between serum lipid concentrations and cognitive impairment. Our cross-sectional study included 30 patients with probable AD and 30 age- and sex-matched control subjects. Probable AD was clinically diagnosed by NINCDS-ADRDA criteria. Serum total cholesterol (TC), high-density lipoprotein cholesterol (HDL-C) and triglyceride (TG) levels were determined at the initial assessment using standard enzymatic colorimetric techniques. Low-density lipoprotein cholesterol (LDL-C) and very low density lipoprotein cholesterol (VLDL-C) levels were calculated. Subjects with probable AD had significantly lower serum TG (p<0.01), TC (p<0.05), LDL-C (p<0.05) and VLDL-C (p<0.01) compared to the control group. We did not observe a significant difference in HDL-C level between patients with probable AD and control subjects. A negative, although not significant, correlation between TG, TC and VLDL-C and MMSE in patients with AD was observed. In the control group of subjects there was a negative correlation between TC and MMSE, but it was not statistically significant (r = -0.28). Further studies are required to explore the possibility for serum lipids to serve as diagnostic and therapeutic markers of AD.

  13. The transmission probability method in one-dimensional cylindrical geometry

    International Nuclear Information System (INIS)

    Rubin, I.E.

    1983-01-01

    The collision probability method widely used in solving neutron transport problems in a reactor cell is reliable for simple cells with a small number of zones. Increasing the number of zones, and also taking into account the anisotropy of scattering, greatly increases the volume of calculations. In order to reduce the calculation time, the transmission probability method is suggested for flux calculations in one-dimensional cylindrical geometry, taking into account scattering anisotropy. The efficiency of the suggested method is verified using one-group calculations for cylindrical cells. The use of the transmission probability method allows the angular and spatial dependences of the neutron distributions to be represented completely, without an increase in the volume of calculations. The method is especially effective in solving multi-group problems.

  14. Preimplantation genetic screening for all 24 chromosomes by microarray comparative genomic hybridization significantly increases implantation rates and clinical pregnancy rates in patients undergoing in vitro fertilization with poor prognosis

    Science.gov (United States)

    Majumdar, Gaurav; Majumdar, Abha; Lall, Meena; Verma, Ishwar C.; Upadhyaya, Kailash C.

    2016-01-01

    CONTEXT: A majority of human embryos produced in vitro are aneuploid, especially in couples undergoing in vitro fertilization (IVF) with poor prognosis. Preimplantation genetic screening (PGS) for all 24 chromosomes has the potential to select the most euploid embryos for transfer in such cases. AIM: To study the efficacy of PGS for all 24 chromosomes by microarray comparative genomic hybridization (array CGH) in Indian couples undergoing IVF cycles with poor prognosis. SETTINGS AND DESIGN: A retrospective, case–control study was undertaken in an institution-based tertiary care IVF center to compare the clinical outcomes of twenty patients, who underwent 21 PGS cycles with poor prognosis, with 128 non-PGS patients in the control group, with the same inclusion criterion as for the PGS group. MATERIALS AND METHODS: Single cells were obtained by laser-assisted embryo biopsy from day 3 embryos and subsequently analyzed by array CGH for all 24 chromosomes. Once the array CGH results were available on the morning of day 5, only chromosomally normal embryos that had progressed to blastocyst stage were transferred. RESULTS: The implantation rate and clinical pregnancy rate (PR) per transfer were found to be significantly higher in the PGS group than in the control group (63.2% vs. 26.2%, P = 0.001 and 73.3% vs. 36.7%, P = 0.006, respectively), while the multiple PRs sharply declined from 31.9% to 9.1% in the PGS group. CONCLUSIONS: In this pilot study, we have shown that PGS by array CGH can improve the clinical outcome in patients undergoing IVF with poor prognosis. PMID:27382234

  15. Cross Check of NOvA Oscillation Probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Parke, Stephen J. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States). Dept. of Theoretical Physics; Messier, Mark D. [Indiana Univ., Bloomington, IN (United States). Dept. of Physics

    2018-01-12

    In this note we perform a cross check of the programs used by NOvA to calculate the 3-flavor oscillation probabilities against an independent program using a different method. The comparison is performed at 6 significant figures, and the agreement, $|\Delta P|/P$, is better than $10^{-5}$, as good as can be expected with 6 significant figures. In addition, a simple and accurate alternative method to calculate the oscillation probabilities is outlined and compared in the L/E range and matter density relevant for the NOvA experiment.
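    For orientation only, a two-flavor vacuum approximation of an appearance probability, far simpler than the full three-flavor matter calculation cross-checked in the note; the parameter values are typical published ones, inserted here as assumptions.

```python
# Two-flavor vacuum appearance probability P(nu_mu -> nu_e):
# P = sin^2(2*theta) * sin^2(1.267 * dm2[eV^2] * L[km] / E[GeV]).
import numpy as np

def p_mue(L_km, E_GeV, sin2_2theta=0.085, dm2_eV2=2.5e-3):
    return sin2_2theta * np.sin(1.267 * dm2_eV2 * L_km / E_GeV) ** 2

print(f"P(810 km, 2 GeV) = {p_mue(810.0, 2.0):.4f}")  # NOvA-like baseline/energy
```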

  16. A probability model for the failure of pressure containing parts

    International Nuclear Information System (INIS)

    Thomas, H.M.

    1978-01-01

    The model provides a method of estimating the order of magnitude of the leakage failure probability of pressure containing parts. It is a fatigue-based model which makes use of the statistics available for both specimens and vessels. Some novel concepts are introduced, but essentially the model simply quantifies the obvious, i.e., that failure probability increases with increases in stress levels, number of cycles, volume of material and volume of weld metal. A further model based on fracture mechanics estimates the catastrophic fraction of leakage failures. (author)

  17. Probability of Grounding and Collision Events

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup

    1996-01-01

    To quantify the risks involved in ship traffic, rational criteria for collision and grounding accidents are developed. This implies that probabilities as well as inherent consequences can be analysed and assessed. The present paper outlines a method for evaluation of the probability of ship...

  18. Probability of Grounding and Collision Events

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup

    1996-01-01

    To quantify the risks involved in ship traffic, rational criteria for collision and grounding accidents have to be developed. This implies that probabilities as well as inherent consequences have to be analyzed and assessed. The present notes outline a method for evaluation of the probability...

  19. Introducing Disjoint and Independent Events in Probability.

    Science.gov (United States)

    Kelly, I. W.; Zwiers, F. W.

    Two central concepts in probability theory are those of independence and mutually exclusive events. This document is intended to provide suggestions to teachers that can be used to equip students with an intuitive, comprehensive understanding of these basic concepts in probability. The first section of the paper delineates mutually exclusive and…

  20. Selected papers on probability and statistics

    CERN Document Server

    2009-01-01

    This volume contains translations of papers that originally appeared in the Japanese journal Sūgaku. The papers range over a variety of topics in probability theory, statistics, and applications. This volume is suitable for graduate students and research mathematicians interested in probability and statistics.

  1. Collective probabilities algorithm for surface hopping calculations

    International Nuclear Information System (INIS)

    Bastida, Adolfo; Cruz, Carlos; Zuniga, Jose; Requena, Alberto

    2003-01-01

    General equations are derived that the transition probabilities of the hopping algorithms in surface hopping calculations must obey in order to assure equality between the average quantum and classical populations. These equations are solved for two particular cases. In the first, it is assumed that the probabilities are the same for all trajectories and that the number of hops is kept to a minimum. These assumptions specify the collective probabilities (CP) algorithm, for which the transition probabilities depend on the average populations over all trajectories. In the second case, the probabilities for each trajectory are supposed to be completely independent of the results from the other trajectories. There is then a unique solution of the general equations assuring that the transition probabilities are equal to the quantum population of the target state, which is referred to as the independent probabilities (IP) algorithm. The fewest switches (FS) algorithm developed by Tully is accordingly understood as an approximate hopping algorithm which takes elements from the accurate CP and IP solutions. A numerical test of all these hopping algorithms is carried out for a one-dimensional two-state problem with two avoided crossings, which shows the accuracy and computational efficiency of the collective probabilities algorithm proposed, the limitations of the FS algorithm, and the similarity between the results offered by the IP algorithm and those obtained with the Ehrenfest method.
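    An orientation-only sketch of a single fewest-switches-style hop decision, the ingredient whose probabilities the CP and IP algorithms redistribute across trajectories; all input values here are invented, and sign conventions are simplified.

```python
# One fewest-switches hop test: the flux of quantum population out of the
# occupied state sets the hop probability g = max(0, dt * b21 / a11).
import numpy as np

rng = np.random.default_rng(5)
a11, a12 = 0.7, 0.2 + 0.1j   # density-matrix elements on the current state
v_dot_d21 = -0.05            # velocity dotted into the nonadiabatic coupling
dt = 0.1

b21 = -2.0 * (a12 * v_dot_d21).real   # population flux from state 1 to 2
g12 = max(0.0, dt * b21 / a11)        # hop probability for this step
print(f"hop probability: {g12:.4f}, hop: {rng.random() < g12}")
```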

  2. Examples of Neutrosophic Probability in Physics

    Directory of Open Access Journals (Sweden)

    Fu Yuhua

    2015-01-01

    Full Text Available This paper re-discusses the problems of the so-called “law of nonconservation of parity” and “accelerating expansion of the universe”, and presents the examples of determining Neutrosophic Probability of the experiment of Chien-Shiung Wu et al in 1957, and determining Neutrosophic Probability of accelerating expansion of the partial universe.

  3. Eliciting Subjective Probabilities with Binary Lotteries

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd

    objective probabilities. Drawing a sample from the same subject population, we find evidence that the binary lottery procedure induces linear utility in a subjective probability elicitation task using the Quadratic Scoring Rule. We also show that the binary lottery procedure can induce direct revelation...

  4. Probability Issues in without Replacement Sampling

    Science.gov (United States)

    Joarder, A. H.; Al-Sabah, W. S.

    2007-01-01

    Sampling without replacement is an important aspect in teaching conditional probabilities in elementary statistics courses. Different methods proposed in different texts for calculating probabilities of events in this context are reviewed and their relative merits and limitations in applications are pinpointed. An alternative representation of…
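    One standard example of the kind reviewed above, computed exactly (the urn composition is assumed): when sampling without replacement, the probability that the second draw is red equals the first-draw marginal.

```python
# P(second draw red) without replacement, by total probability over the
# first draw; exact arithmetic with fractions.
from fractions import Fraction

red, blue = 4, 6
total = red + blue

p_second_red = (Fraction(red, total) * Fraction(red - 1, total - 1)
                + Fraction(blue, total) * Fraction(red, total - 1))
print(p_second_red)   # 2/5, the same as P(first draw red) = 4/10
```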

  5. Some open problems in noncommutative probability

    International Nuclear Information System (INIS)

    Kruszynski, P.

    1981-01-01

    A generalization of probability measures to non-Boolean structures is discussed. The starting point of the theory is the Gleason theorem about the form of measures on closed subspaces of a Hilbert space. The problems are formulated in terms of probability on lattices of projections in arbitrary von Neumann algebras. (Auth.)

  6. Probability: A Matter of Life and Death

    Science.gov (United States)

    Hassani, Mehdi; Kippen, Rebecca; Mills, Terence

    2016-01-01

    Life tables are mathematical tables that document probabilities of dying and life expectancies at different ages in a society. Thus, the life table contains some essential features of the health of a population. Probability is often regarded as a difficult branch of mathematics. Life tables provide an interesting approach to introducing concepts…
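    A toy life-table fragment illustrating the idea (the q_x values are invented): the death probabilities q_x generate the survivorship column l_x, from which survival probabilities between ages follow.

```python
# Build l_x from q_x with a radix of 100,000 and read off a survival
# probability between two ages.
qx = [0.010, 0.002, 0.001, 0.001, 0.002]   # illustrative q_x for ages 0..4

lx = [100_000]                             # survivors at age 0 (radix)
for q in qx[:-1]:
    lx.append(round(lx[-1] * (1 - q)))     # survivors at the next age

print("l_x:", lx)
print("P(survive age 0 -> 4):", round(lx[4] / lx[0], 4))
```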

  7. Teaching Probability: A Socio-Constructivist Perspective

    Science.gov (United States)

    Sharma, Sashi

    2015-01-01

    There is a considerable and rich literature on students' misconceptions in probability. However, less attention has been paid to the development of students' probabilistic thinking in the classroom. This paper offers a sequence, grounded in socio-constructivist perspective for teaching probability.

  8. Stimulus Probability Effects in Absolute Identification

    Science.gov (United States)

    Kent, Christopher; Lamberts, Koen

    2016-01-01

    This study investigated the effect of stimulus presentation probability on accuracy and response times in an absolute identification task. Three schedules of presentation were used to investigate the interaction between presentation probability and stimulus position within the set. Data from individual participants indicated strong effects of…

  9. 47 CFR 1.1623 - Probability calculation.

    Science.gov (United States)

    2010-10-01

    Title 47 (Telecommunication), Federal Communications Commission, General Practice and Procedure, Random Selection Procedures for Mass Media Services, General Procedures, § 1.1623 Probability calculation (2010): (a) All calculations shall be...

  10. Simulations of Probabilities for Quantum Computing

    Science.gov (United States)

    Zak, M.

    1996-01-01

    It has been demonstrated that classical probabilities, and in particular a probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities and their link to quantum computations are discussed.

  11. Against All Odds: When Logic Meets Probability

    NARCIS (Netherlands)

    van Benthem, J.; Katoen, J.-P.; Langerak, R.; Rensink, A.

    2017-01-01

    This paper is a light walk along interfaces between logic and probability, triggered by a chance encounter with Ed Brinksma. It is not a research paper, or a literature survey, but a pointer to issues. I discuss both direct combinations of logic and probability and structured ways in which logic can

  12. An introduction to probability and stochastic processes

    CERN Document Server

    Melsa, James L

    2013-01-01

    Geared toward college seniors and first-year graduate students, this text is designed for a one-semester course in probability and stochastic processes. Topics covered in detail include probability theory, random variables and their functions, stochastic processes, linear system response to stochastic processes, Gaussian and Markov processes, and stochastic differential equations. 1973 edition.

  13. The probability of the false vacuum decay

    International Nuclear Information System (INIS)

    Kiselev, V.; Selivanov, K.

    1983-01-01

    The closed expression for the probability of the false vacuum decay in (1+1) dimensions is given. The probability of false vacuum decay is expressed as the product of an exponential quasiclassical factor and a functional determinant of the given form. The method for calculation of this determinant is developed and a complete answer for (1+1) dimensions is given.

  14. Probability elements of the mathematical theory

    CERN Document Server

    Heathcote, C R

    2000-01-01

    Designed for students studying mathematical statistics and probability after completing a course in calculus and real variables, this text deals with basic notions of probability spaces, random variables, distribution functions and generating functions, as well as joint distributions and the convergence properties of sequences of random variables. Includes worked examples and over 250 exercises with solutions.

  15. The transition probabilities of the reciprocity model

    NARCIS (Netherlands)

    Snijders, T.A.B.

    1999-01-01

    The reciprocity model is a continuous-time Markov chain model used for modeling longitudinal network data. A new explicit expression is derived for its transition probability matrix. This expression can be checked relatively easily. Some properties of the transition probabilities are given, as well

  16. Probability numeracy and health insurance purchase

    NARCIS (Netherlands)

    Dillingh, Rik; Kooreman, Peter; Potters, Jan

    2016-01-01

    This paper provides new field evidence on the role of probability numeracy in health insurance purchase. Our regression results, based on rich survey panel data, indicate that the expenditure on two out of three measures of health insurance first rises with probability numeracy and then falls again.

  17. Constructing diagnostic likelihood: clinical decisions using subjective versus statistical probability.

    Science.gov (United States)

    Kinnear, John; Jackson, Ruth

    2017-07-01

    Although physicians are highly trained in the application of evidence-based medicine, and are assumed to make rational decisions, there is evidence that their decision making is prone to biases. One of the biases that has been shown to affect accuracy of judgements is that of representativeness and base-rate neglect, where the saliency of a person's features leads to overestimation of their likelihood of belonging to a group. This results in the substitution of 'subjective' probability for statistical probability. This study examines clinicians' propensity to make estimations of subjective probability when presented with clinical information that is considered typical of a medical condition. The strength of the representativeness bias is tested by presenting choices in textual and graphic form. Understanding of statistical probability is also tested by omitting all clinical information. For the questions that included clinical information, 46.7% and 45.5% of clinicians made judgements of statistical probability, respectively. Where the question omitted clinical information, 79.9% of clinicians made a judgement consistent with statistical probability. There was a statistically significant difference in responses to the questions with and without representativeness information (χ2(1, n=254) = 54.45, p < .001), consistent with the substitution of subjective for statistical probability. One of the causes for this representativeness bias may be the way clinical medicine is taught, where stereotypic presentations are emphasised in diagnostic decision making. Published by the BMJ Publishing Group Limited.

  18. The enigma of probability and physics

    International Nuclear Information System (INIS)

    Mayants, L.

    1984-01-01

    This volume contains a coherent exposition of the elements of two unique sciences: probabilistics (science of probability) and probabilistic physics (application of probabilistics to physics). Proceeding from a key methodological principle, it starts with the disclosure of the true content of probability and the interrelation between probability theory and experimental statistics. This makes it possible to introduce a proper order in all the sciences dealing with probability and, by conceiving the real content of statistical mechanics and quantum mechanics in particular, to construct both as two interconnected domains of probabilistic physics. Consistent theories of kinetics of physical transformations, decay processes, and intramolecular rearrangements are also outlined. The interrelation between the electromagnetic field, photons, and the theoretically discovered subatomic particle 'emon' is considered. Numerous internal imperfections of conventional probability theory, statistical physics, and quantum physics are exposed and removed - quantum physics no longer needs special interpretation. EPR, Bohm, and Bell paradoxes are easily resolved, among others. (Auth.)

  19. Optimizing Probability of Detection Point Estimate Demonstration

    Science.gov (United States)

    Koshti, Ajay M.

    2017-01-01

    Probability of detection (POD) analysis is used in assessing the reliably detectable flaw size in nondestructive evaluation (NDE). MIL-HDBK-1823 and the associated mh1823 POD software give the most common methods of POD analysis. Real flaws such as cracks and crack-like flaws need to be detected using these NDE methods, and a reliably detectable crack size is required for safe-life analysis of fracture-critical parts. The paper discusses optimizing probability of detection (POD) demonstration experiments using the point estimate method, which is used by NASA for qualifying special NDE procedures. The point estimate method uses the binomial distribution for the probability density. Normally, a set of 29 flaws of the same size (within some tolerance) is used in the demonstration. The optimization is performed to provide an acceptable value for the probability of passing the demonstration (PPD) and an acceptable value for the probability of false calls (POF) while keeping the flaw sizes in the set as small as possible.
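
    As a worked illustration of the binomial point estimate method named in the abstract, the sketch below computes the probability of passing an all-hit demonstration as a function of the true POD; the POD values chosen are illustrative, and only the 29-flaw design comes from the abstract.

```python
# Point-estimate POD demonstration: with n flaws of the target size and a
# pass criterion of zero misses, the chance of passing is pod**n. The classic
# 29-of-29 design is shown; the specific POD values are illustrative.
from scipy.stats import binom

n = 29                              # flaws in the demonstration set
for pod in (0.85, 0.90, 0.95, 0.98):
    ppd = binom.pmf(n, n, pod)      # probability of passing (all n hits)
    print(f"true POD={pod:.2f}  P(pass 29/29)={ppd:.3f}")

# The 29/29 criterion demonstrates 90% POD at 95% confidence because a
# procedure with POD below 0.90 passes with probability < 0.90**29 ~ 0.047.
print(0.90**29)
```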

  20. Alternative probability theories for cognitive psychology.

    Science.gov (United States)

    Narens, Louis

    2014-01-01

    Various proposals for generalizing event spaces for probability functions have been put forth in the mathematical, scientific, and philosophic literatures. In cognitive psychology such generalizations are used for explaining puzzling results in decision theory and for modeling the influence of context effects. This commentary discusses proposals for generalizing probability theory to event spaces that are not necessarily Boolean algebras. Two prominent examples are quantum probability theory, which is based on the set of closed subspaces of a Hilbert space, and topological probability theory, which is based on the set of open sets of a topology. Both have been applied to a variety of cognitive situations. This commentary focuses on how event space properties can influence probability concepts and impact cognitive modeling.

  1. Zero field reversal probability in thermally assisted magnetization reversal

    Science.gov (United States)

    Prasetya, E. B.; Utari; Purnama, B.

    2017-11-01

    This paper discusses the zero-field reversal probability in thermally assisted magnetization reversal (TAMR). The appearance of a reversal probability at zero field is investigated through micromagnetic simulation by solving the stochastic Landau-Lifshitz-Gilbert (LLG) equation. A perpendicular-anisotropy magnetic dot of 50×50×20 nm3 is considered as a single-cell magnetic storage element of magnetic random access memory (MRAM). Thermally assisted magnetization reversal was performed by cooling the writing process from near the Curie point to room temperature, over 20 runs with different randomly magnetized initial states. The results show that the reversal probability under zero magnetic field decreases as the energy barrier increases: a zero-field switching probability of 55% is attained for an energy barrier of 60 kBT, corresponding to a switching field of 150 Oe, and the reversal probability falls to zero at an energy barrier of 2348 kBT.

  2. Human Error Probability Assessment During Maintenance Activities of Marine Systems

    Directory of Open Access Journals (Sweden)

    Rabiul Islam

    2018-03-01

    Full Text Available Background: Maintenance operations on-board ships are highly demanding. Maintenance operations are intensive activities requiring high man–machine interactions in challenging and evolving conditions. The evolving conditions include weather conditions, workplace temperature, ship motion, noise and vibration, and workload and stress. For example, extreme weather conditions affect seafarers' performance, increasing the chances of error, and, consequently, can cause injuries or fatalities to personnel. An effective human error probability model is required to better manage maintenance on-board ships. The developed model would assist in developing and maintaining effective risk management protocols. Thus, the objective of this study is to develop a human error probability model considering various internal and external factors affecting seafarers' performance. Methods: The human error probability model is developed using probability theory applied to a Bayesian network. The model is tested using data from a questionnaire survey of >200 experienced seafarers with >5 years of experience. The model developed in this study is used to find the reliability of human performance on particular maintenance activities. Results: The developed methodology is tested on the maintenance of a marine engine's cooling water pump for the engine department and an anchor windlass for the deck department. In the considered case studies, human error probabilities are estimated in various scenarios and the results are compared between the scenarios and the different seafarer categories. The results of the case studies for both departments are also compared. Conclusion: The developed model is effective in assessing human error probabilities. These probabilities would be dynamically updated as and when new information is available on changes in either internal (i.e., training, experience, and fatigue) or external (i.e., environmental and operational) conditions.
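
    The paper's calibrated network is not reproduced in this record. The minimal sketch below shows the kind of computation a Bayesian network of this sort performs, with two invented parent factors (fatigue and weather state) and made-up conditional probability tables; none of these numbers come from the study.

```python
# Minimal human-error Bayesian-network sketch with two illustrative parents.
# All structure and probabilities are invented for illustration and are not
# the paper's calibrated model.
from itertools import product

p_fatigue = {True: 0.3, False: 0.7}
p_weather = {'rough': 0.4, 'calm': 0.6}
# P(error | fatigue, weather) -- hypothetical conditional probability table
p_error = {(True, 'rough'): 0.20, (True, 'calm'): 0.08,
           (False, 'rough'): 0.06, (False, 'calm'): 0.02}

# Marginal human error probability by enumeration over parent states
hep = sum(p_fatigue[f] * p_weather[w] * p_error[(f, w)]
          for f, w in product(p_fatigue, p_weather))
print(f"marginal HEP = {hep:.4f}")

# Dynamic updating: condition on an observed external state, e.g. rough seas
hep_rough = sum(p_fatigue[f] * p_error[(f, 'rough')] for f in p_fatigue)
print(f"HEP given rough weather = {hep_rough:.4f}")
```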

  3. Assessing the clinical probability of pulmonary embolism

    International Nuclear Information System (INIS)

    Miniati, M.; Pistolesi, M.

    2001-01-01

    Clinical assessment is a cornerstone of the recently validated diagnostic strategies for pulmonary embolism (PE). Although the diagnostic yield of individual symptoms, signs, and common laboratory tests is limited, the combination of these variables, either by empirical assessment or by a prediction rule, can be used to express a clinical probability of PE. The latter may serve as the pretest probability to predict the probability of PE after further objective testing (posterior or post-test probability). Over the last few years, attempts have been made to develop structured prediction models for PE. In a Canadian multicenter prospective study, the clinical probability of PE was rated as low, intermediate, or high according to a model which included assessment of presenting symptoms and signs, risk factors, and presence or absence of an alternative diagnosis at least as likely as PE. Recently, a simple clinical score was developed to stratify outpatients with suspected PE into groups with low, intermediate, or high clinical probability. Logistic regression was used to identify parameters associated with PE. A score ≤ 4 identified patients with low probability, of whom 10% had PE. The prevalence of PE in patients with intermediate (score 5-8) and high probability (score ≥ 9) was 38 and 81%, respectively. As opposed to the Canadian model, this clinical score is standardized. The predictor variables identified in the model, however, were derived from a database of emergency ward patients. This model may, therefore, not be valid for assessing the clinical probability of PE in inpatients. In the PISA-PED study, a clinical diagnostic algorithm was developed which rests on the identification of three relevant clinical symptoms and on their association with electrocardiographic and/or radiographic abnormalities specific for PE. Among patients who, according to the model, had been rated as having a high clinical probability, the prevalence of proven PE was 97%, while it was 3

  4. Can confidence indicators forecast the probability of expansion in Croatia?

    Directory of Open Access Journals (Sweden)

    Mirjana Čižmešija

    2016-04-01

    Full Text Available The aim of this paper is to investigate how reliable confidence indicators are in forecasting the probability of expansion. We consider three Croatian Business Survey indicators: the Industrial Confidence Indicator (ICI), the Construction Confidence Indicator (BCI) and the Retail Trade Confidence Indicator (RTCI). The quarterly data used in the research cover the period from 1999/Q1 to 2014/Q1. The empirical analysis consists of two parts. First, the non-parametric Bry-Boschan algorithm is used to distinguish periods of expansion from periods of recession in the Croatian economy. Then, various nonlinear probit models are estimated, differing with respect to the regressors (confidence indicators) and the time lags. The positive signs of the estimated parameters suggest that the probability of expansion increases with an increase in the confidence indicators. Based on the results obtained, the conclusion is that the ICI is the most powerful predictor of the probability of expansion in Croatia.
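
    A minimal sketch of the estimation step follows, assuming synthetic quarterly data in place of the Croatian Business Survey series; the real ICI values and the Bry-Boschan expansion dating are not reproduced here.

```python
# Probit of an expansion indicator on a confidence indicator, mirroring the
# paper's setup. The data are synthetic stand-ins, not the Croatian series.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 60                                   # ~15 years of quarterly observations
ici = rng.normal(0, 1, n)                # standardized confidence indicator
latent = 0.8 * ici + rng.normal(0, 1, n)
expansion = (latent > 0).astype(int)     # 1 = expansion, 0 = recession

X = sm.add_constant(ici)
res = sm.Probit(expansion, X).fit(disp=0)
print(res.params)                        # positive slope: higher ICI, higher P(expansion)
print(res.predict(X)[:5])                # fitted probabilities of expansion
```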

  5. Implications of Cognitive Load for Hypothesis Generation and Probability Judgment

    Science.gov (United States)

    Sprenger, Amber M.; Dougherty, Michael R.; Atkins, Sharona M.; Franco-Watkins, Ana M.; Thomas, Rick P.; Lange, Nicholas; Abbs, Brandon

    2011-01-01

    We tested the predictions of HyGene (Thomas et al., 2008) that divided attention at both encoding and judgment should affect the degree to which participants’ probability judgments violate the principle of additivity. In two experiments, we showed that divided attention during judgment leads to an increase in subadditivity, suggesting that the comparison process for probability judgments is capacity limited. Contrary to the predictions of HyGene, a third experiment revealed that divided attention during encoding leads to an increase in later probability judgments made under full attention. The effect of divided attention during encoding on judgment was completely mediated by the number of hypotheses participants generated, indicating that limitations in both encoding and recall can cascade into biases in judgments. PMID:21734897

  6. Implications of Cognitive Load for Hypothesis Generation and Probability Judgment.

    Directory of Open Access Journals (Sweden)

    Amber M Sprenger

    2011-06-01

    Full Text Available We tested the predictions of HyGene (Thomas, Dougherty, Sprenger, & Harbison, 2008) that divided attention at both encoding and judgment should affect the degree to which participants’ probability judgments violate the principle of additivity. In two experiments, we showed that divided attention during judgment leads to an increase in subadditivity, suggesting that the comparison process for probability judgments is capacity limited. Contrary to the predictions of HyGene, a third experiment revealed that divided attention during encoding leads to an increase in later probability judgments made under full attention. The effect of divided attention at encoding on judgment was completely mediated by the number of hypotheses participants generated, indicating that limitations in both encoding and recall can cascade into biases in judgments.

  7. Probability, statistics, and associated computing techniques

    International Nuclear Information System (INIS)

    James, F.

    1983-01-01

    This chapter attempts to explore the extent to which it is possible for the experimental physicist to find optimal statistical techniques to provide a unique and unambiguous quantitative measure of the significance of raw data. Discusses statistics as the inverse of probability; normal theory of parameter estimation; normal theory (Gaussian measurements); the universality of the Gaussian distribution; real-life resolution functions; combination and propagation of uncertainties; the sum or difference of 2 variables; local theory, or the propagation of small errors; error on the ratio of 2 discrete variables; the propagation of large errors; confidence intervals; classical theory; Bayesian theory; use of the likelihood function; the second derivative of the log-likelihood function; multiparameter confidence intervals; the method of MINOS; least squares; the Gauss-Markov theorem; maximum likelihood for uniform error distribution; the Chebyshev fit; the parameter uncertainties; the efficiency of the Chebyshev estimator; error symmetrization; robustness vs. efficiency; testing of hypotheses (e.g., the Neyman-Pearson test); goodness-of-fit; distribution-free tests; comparing two one-dimensional distributions; comparing multidimensional distributions; and permutation tests for comparing two point sets

  8. Introduction to Probability, Part 1 - Basic Concepts. Student Text. Revised Edition.

    Science.gov (United States)

    Blakeslee, David W.; And Others

    This book is designed to introduce the reader to some fundamental ideas about probability. The mathematical theory of probability plays an increasingly important role in science, government, industry, business, and economics. An understanding of the basic concepts of probability is essential for the study of statistical methods that are widely…

  9. Meaning and significance of "public accountability"

    Directory of Open Access Journals (Sweden)

    Ph D Student Roman Mihaela

    2011-05-01

    Full Text Available The concept of "public accountability" is a challenge for political science: a new concept in this area, in full debate and development both in theory and practice. This paper is a theoretical approach displaying some definitions, relevant meanings and the significance of the concept in political science. The importance of this concept is that although it was originally used as a tool to improve the effectiveness and efficiency of public governance, it has gradually become a purpose in itself. "Accountability" has become an image of good governance, first in the United States of America, then in the European Union. Nevertheless, the concept is vaguely defined and provides ambiguous images of good governance. This paper begins with the presentation of some general meanings of the concept as they emerge from specialized dictionaries and encyclopaedias and continues with the meanings developed in political science. The concept of "public accountability" is rooted in the economics and management literature, becoming increasingly relevant in today's political science, both in theory and discourse as well as in practice in formulating and evaluating public policies. A first conclusion that emerges from the analysis of the evolution of this term is that it requires conceptual clarification in political science. A clear definition will then enable an appropriate model for improving the system of public accountability in formulating and assessing public policies, in order to implement a system of assessment and monitoring thereof.

  10. Software Tool for Significantly Increasing Airport Throughput, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — NASA's Next Generation Air Transportation System (NextGen) Airportal effort seeks to optimize aircraft surface movements through approaches that could double or...

  11. Significantly increased lifetime of recent microchannel-plate photomultipliers

    Energy Technology Data Exchange (ETDEWEB)

    Britting, Alexander; Eyrich, Wolfgang; Lehmann, Albert; Uhlig, Fred [Physikalisches Institut, Universitaet Erlangen-Nuernberg (Germany)

    2013-07-01

    Micro-channel plate photomultipliers (MCP-PMTs) are the favored sensors for the DIRC detectors (Detection of Internally Reflected Cherenkov Light) of the PANDA experiment. The main reasons for this are their usability in high magnetic fields of up to 2 T, a time resolution of better than σ = 50 ps, and a rate capability high enough to withstand a detected photon rate of about 200 kHz cm{sup -2} at the MCP-PMT surface, which is anticipated at the average luminosity of 2·10{sup 32} cm{sup -2}s{sup -1} in PANDA. Moreover, for the reconstruction of the Cherenkov angle using the planned optics for the barrel DIRC, a spatial resolution of about 5 mm at the focal plane is needed. Until recently the major drawback of MCP-PMTs was their limited lifetime, which was by far not sufficient to withstand the integrated anode charge of ∼ 5 C/cm{sup 2} for the Barrel-DIRC and even more for the Disc-DIRC. However, the latest MCP prototype devices show a huge step forward in this respect. The results of these lifetime measurements are presented. The achieved values are meanwhile close to the PANDA requirements for the Barrel-DIRC.

  12. Upgrading Probability via Fractions of Events

    Directory of Open Access Journals (Sweden)

    Frič Roman

    2016-08-01

    Full Text Available The influence of “Grundbegriffe” by A. N. Kolmogorov (published in 1933) on education in the area of probability and its impact on research in stochastics cannot be overestimated. We would like to point out three aspects of the classical probability theory “calling for” an upgrade: (i) classical random events are black-and-white (Boolean); (ii) classical random variables do not model quantum phenomena; (iii) basic maps (probability measures and observables, dual maps to random variables) have very different “mathematical nature”. Accordingly, we propose an upgraded probability theory based on Łukasiewicz operations (multivalued logic) on events, elementary category theory, and covering the classical probability theory as a special case. The upgrade can be compared to replacing calculations with integers by calculations with rational (and real) numbers. Namely, to avoid the three objections, we embed the classical (Boolean) random events (represented by the {0, 1}-valued indicator functions of sets) into upgraded random events (represented by measurable [0, 1]-valued functions), the minimal domain of probability containing “fractions” of classical random events, and we upgrade the notions of probability measure and random variable.
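
    A small sketch of the "fractions of events" idea follows, under the assumption that upgraded events are [0, 1]-valued functions combined with Łukasiewicz operations and that probability extends as expectation; the sample space and measure are toy choices, not taken from the paper.

```python
# 'Fractional' events: classical events are {0,1}-valued indicator functions;
# upgraded events are [0,1]-valued functions combined with Lukasiewicz
# operations, and probability extends as the expectation of the event.
import numpy as np

omega = np.arange(6)                 # a toy finite sample space
mu = np.full(6, 1 / 6)               # uniform probability measure

a = (omega < 3).astype(float)        # classical event {0, 1, 2}
b = np.array([0.0, 0.5, 1.0, 0.5, 0.0, 0.25])   # a genuine 'fraction' of an event

def luk_or(f, g):  return np.minimum(1.0, f + g)        # truncated sum
def luk_and(f, g): return np.maximum(0.0, f + g - 1.0)  # truncated difference
def luk_not(f):    return 1.0 - f

prob = lambda f: float(np.sum(f * mu))   # probability = expectation

print(prob(a), prob(b))
print(prob(luk_or(a, b)), prob(luk_and(a, b)))
# Restricted to {0,1}-valued events these reduce to the classical
# union/intersection probabilities, i.e. the Boolean case is recovered.
```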

  13. Failure probability analysis of optical grid

    Science.gov (United States)

    Zhong, Yaoquan; Guo, Wei; Sun, Weiqiang; Jin, Yaohui; Hu, Weisheng

    2008-11-01

    Optical grid, an integrated computing environment based on optical networks, is expected to be an efficient infrastructure to support advanced data-intensive grid applications. In an optical grid, faults of both computational and network resources are inevitable due to the large scale and high complexity of the system. As optical-network-based distributed computing systems are extensively applied to data processing, the application failure probability has become an important indicator of application quality and an important aspect operators consider. This paper presents a task-based method for analyzing the application failure probability in an optical grid. The failure probability of the entire application can then be quantified, and the performance of different backup strategies in reducing the application failure probability can be compared, so that the different requirements of different clients can be satisfied. When an application based on a DAG (directed acyclic graph) is executed under different backup strategies, the application failure probability and the application completion time differ. This paper proposes a new multi-objective differentiated services algorithm (MDSA). The new application scheduling algorithm can guarantee the required failure probability and improve network resource utilization, realizing a compromise between the network operator and the application submitter. Differentiated services can thereby be achieved in an optical grid.

  14. Statistical physics of pairwise probability models

    Directory of Open Access Journals (Sweden)

    Yasser Roudi

    2009-11-01

    Full Text Available Statistical models for describing the probability distribution over the states of biological systems are commonly used for dimensional reduction. Among these models, pairwise models are very attractive in part because they can be fit using a reasonable amount of data: knowledge of the means and correlations between pairs of elements in the system is sufficient. Not surprisingly, then, using pairwise models for studying neural data has been the focus of many studies in recent years. In this paper, we describe how tools from statistical physics can be employed for studying and using pairwise models. We build on our previous work on the subject and study the relation between different methods for fitting these models and evaluating their quality. In particular, using data from simulated cortical networks we study how the quality of various approximate methods for inferring the parameters in a pairwise model depends on the time bin chosen for binning the data. We also study the effect of the size of the time bin on the model quality itself, again using simulated data. We show that using finer time bins increases the quality of the pairwise model. We offer new ways of deriving the expressions reported in our previous work for assessing the quality of pairwise models.

  15. Probability based calibration of pressure coefficients

    DEFF Research Database (Denmark)

    Hansen, Svend Ole; Pedersen, Marie Louise; Sørensen, John Dalsgaard

    2015-01-01

    Normally, a consistent basis for calculating partial factors focuses on a homogeneous reliability index neither depending on which material the structure is constructed of nor the ratio between the permanent and variable actions acting on the structure. Furthermore, the reliability index should not depend on the type of variable action. A probability based calibration of pressure coefficients has been carried out using pressure measurements on the standard CAARC building modelled at a scale of 1:383. The extreme pressures measured on the CAARC building model in the wind tunnel have been fitted, and the characteristic shape coefficients are based on mean values as specified in background documents to the Eurocodes. The Eurocode partial factor of 1.5 for variable actions agrees well with the inherent uncertainties of wind actions when the pressure coefficients are determined using wind tunnel test results, whereas the increased bias and uncertainty when pressure coefficients are mainly based on structural codes lead to a larger required partial factor. The importance of hidden safeties in judging the reliability is discussed for wind actions on low-rise structures.

  16. Effect of Urban Green Spaces and Flooded Area Type on Flooding Probability

    Directory of Open Access Journals (Sweden)

    Hyomin Kim

    2016-01-01

    Full Text Available Countermeasures to urban flooding should consider long-term perspectives, because climate change impacts are unpredictable and complex. Urban green spaces have emerged as a potential option to reduce urban flood risks, and their effectiveness has been highlighted in notable urban water management studies. In this study, flooded areas in Seoul, Korea, were divided into four flooded area types by cluster analysis based on topographic and physical characteristics and verified using discriminant analysis. After division by flooded area type, logistic regression analysis was performed to determine how the flooding probability changes with variations in green space area. Type 1 included regions where flooding occurred in a drainage basin that had a flood risk management infrastructure (FRMI). In Type 2, the slope was steep, the TWI (Topographic Wetness Index) was relatively low, and soil drainage was favorable. Type 3 represented the gentlest sloping areas, and these were associated with the highest TWI values. In addition, these areas had the worst soil drainage. Type 4 had moderate slopes, imperfect soil drainage and lower than average TWI values. We found that green spaces exerted a considerable influence on urban flooding probabilities in Seoul, and flooding probabilities could be reduced by over 50% depending on the green space area and the locations where green spaces were introduced. Increasing the area of green spaces was the most effective method of decreasing flooding probability in Type 3 areas. In Type 2 areas, the maximum hourly precipitation affected the flooding probability significantly, and the flooding probability in these areas was high despite the extensive green space area. These findings can contribute towards establishing guidelines for urban spatial planning to respond to urban flooding.
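
    A sketch of the modelling step follows, with synthetic stand-ins for the Seoul flooding records; the variable names and coefficients are invented for illustration and are not the study's fitted model.

```python
# Logistic regression of flood occurrence on green-space share, echoing the
# study's method on synthetic data (not the Seoul records).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 500
green = rng.uniform(0, 0.6, n)             # green space as a share of area
slope = rng.uniform(0, 20, n)              # terrain slope in degrees
logit = 1.5 - 6.0 * green - 0.05 * slope   # more green space -> lower risk
flooded = (rng.uniform(size=n) < 1 / (1 + np.exp(-logit))).astype(int)

X = np.column_stack([green, slope])
model = LogisticRegression().fit(X, flooded)

# Predicted flooding probability as green-space share grows, slope fixed
for g in (0.0, 0.2, 0.4):
    p = model.predict_proba([[g, 10.0]])[0, 1]
    print(f"green share {g:.1f}: P(flood) = {p:.2f}")
```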

  17. Improving detection probabilities for pests in stored grain.

    Science.gov (United States)

    Elmouttie, David; Kiermeier, Andreas; Hamilton, Grant

    2010-12-01

    The presence of insects in stored grain is a significant problem for grain farmers, bulk grain handlers and distributors worldwide. Inspection of bulk grain commodities is essential to detect pests and thereby to reduce the risk of their presence in exported goods. It has been well documented that insect pests cluster in response to factors such as microclimatic conditions within bulk grain. Statistical sampling methodologies for grain, however, have typically considered pests and pathogens to be homogeneously distributed throughout grain commodities. In this paper, a sampling methodology is demonstrated that accounts for the heterogeneous distribution of insects in bulk grain. It is shown that failure to account for the heterogeneous distribution of pests may lead to overestimates of the capacity for a sampling programme to detect insects in bulk grain. The results indicate the importance of the proportion of grain that is infested in addition to the density of pests within the infested grain. It is also demonstrated that the probability of detecting pests in bulk grain increases as the number of subsamples increases, even when the total volume or mass of grain sampled remains constant. This study underlines the importance of considering an appropriate biological model when developing sampling methodologies for insect pests. Accounting for a heterogeneous distribution of pests leads to a considerable improvement in the detection of pests over traditional sampling models.
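
    The abstract's central claim, that many small subsamples beat few large ones at constant total mass when pests cluster, can be reproduced with a simple mixture model; the parameter values below are illustrative, not the paper's fitted ones.

```python
# Detection probability when insects cluster: a proportion theta of the bulk
# is infested at density lam (insects/kg); each of n subsamples of mass m
# either misses the infested fraction or draws Poisson(lam*m) insects.
import numpy as np

def p_detect(theta, lam, n, total_mass):
    m = total_mass / n                      # mass per subsample, total fixed
    p_miss_one = (1 - theta) + theta * np.exp(-lam * m)
    return 1 - p_miss_one**n

theta, lam, total = 0.05, 2.0, 10.0         # 5% infested, 2 insects/kg, 10 kg sampled
for n in (1, 5, 20, 50):
    print(f"n={n:2d} subsamples: P(detect) = {p_detect(theta, lam, n, total):.3f}")
# More, smaller subsamples raise the detection probability at constant total
# mass, matching the paper's conclusion for heterogeneously distributed pests.
```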

  18. Probability an introduction with statistical applications

    CERN Document Server

    Kinney, John J

    2014-01-01

    Praise for the First Edition: “This is a well-written and impressively presented introduction to probability and statistics. The text throughout is highly readable, and the author makes liberal use of graphs and diagrams to clarify the theory.” - The Statistician. Thoroughly updated, Probability: An Introduction with Statistical Applications, Second Edition features a comprehensive exploration of statistical data analysis as an application of probability. The new edition provides an introduction to statistics with accessible coverage of reliability, acceptance sampling, confidence intervals, h

  19. Dependency models and probability of joint events

    International Nuclear Information System (INIS)

    Oerjasaeter, O.

    1982-08-01

    Probabilistic dependencies between components/systems are discussed with reference to a broad classification of potential failure mechanisms. Further, a generalized time-dependency model, based on conditional probabilities for estimation of the probability of joint events and event sequences is described. The applicability of this model is clarified/demonstrated by various examples. It is concluded that the described model of dependency is a useful tool for solving a variety of practical problems concerning the probability of joint events and event sequences where common cause and time-dependent failure mechanisms are involved. (Auth.)

  20. Handbook of probability theory and applications

    CERN Document Server

    Rudas, Tamas

    2008-01-01

    ""This is a valuable reference guide for readers interested in gaining a basic understanding of probability theory or its applications in problem solving in the other disciplines.""-CHOICEProviding cutting-edge perspectives and real-world insights into the greater utility of probability and its applications, the Handbook of Probability offers an equal balance of theory and direct applications in a non-technical, yet comprehensive, format. Editor Tamás Rudas and the internationally-known contributors present the material in a manner so that researchers of vari

  1. Probabilities on Streams and Reflexive Games

    Directory of Open Access Journals (Sweden)

    Andrew Schumann

    2014-01-01

    Full Text Available Probability measures on streams (e.g. on hypernumbers and p-adic numbers) have been defined. It was shown that these probabilities can be used for simulations of reflexive games. In particular, it can be proved that Aumann's agreement theorem does not hold for these probabilities. Instead of this theorem, there is a statement that is called the reflexion disagreement theorem. Based on this theorem, probabilistic and knowledge conditions can be defined for reflexive games at various reflexion levels up to the infinite level. (original abstract)

  2. Concept of probability in statistical physics

    CERN Document Server

    Guttmann, Y M

    1999-01-01

    Foundational issues in statistical mechanics and the more general question of how probability is to be understood in the context of physical theories are both areas that have been neglected by philosophers of physics. This book fills an important gap in the literature by providing a most systematic study of how to interpret probabilistic assertions in the context of statistical mechanics. The book explores both subjectivist and objectivist accounts of probability, and takes full measure of work in the foundations of probability theory, in statistical mechanics, and in mathematical theory. It will be of particular interest to philosophers of science, physicists and mathematicians interested in foundational issues, and also to historians of science.

  3. Computation of the Complex Probability Function

    Energy Technology Data Exchange (ETDEWEB)

    Trainer, Amelia Jo [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Ledwith, Patrick John [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-22

    The complex probability function is important in many areas of physics, and many techniques have been developed in an attempt to compute it for some z quickly and efficiently. Most prominent are the methods that use Gauss-Hermite quadrature, which uses the roots of the nth degree Hermite polynomial and corresponding weights to approximate the complex probability function. This document serves as an overview and discussion of the use, shortcomings, and potential improvements on the Gauss-Hermite quadrature for the complex probability function.
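
    A sketch of the method under discussion follows: approximating w(z) = (i/π) ∫ exp(-t²)/(z - t) dt (valid for Im z > 0) by Gauss-Hermite quadrature and checking against SciPy's Faddeeva routine; the test point and node counts are arbitrary choices.

```python
# Gauss-Hermite approximation of the complex probability (Faddeeva) function
# w(z) = (i/pi) * Integral[ exp(-t^2) / (z - t), t = -inf..inf ], Im z > 0,
# checked against scipy.special.wofz.
import numpy as np
from numpy.polynomial.hermite import hermgauss
from scipy.special import wofz

def w_gh(z, n=32):
    x, wts = hermgauss(n)                   # nodes/weights for degree-n rule
    return 1j / np.pi * np.sum(wts / (z - x))

z = 1.5 + 0.8j
print(w_gh(z, 16))
print(w_gh(z, 64))
print(wofz(z))        # reference value
# Accuracy improves with n but degrades as z approaches the real axis,
# one of the shortcomings the report discusses.
```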

  4. Comparing linear probability model coefficients across groups

    DEFF Research Database (Denmark)

    Holm, Anders; Ejrnæs, Mette; Karlson, Kristian Bernt

    2015-01-01

    This article offers a formal identification analysis of the problem in comparing coefficients from linear probability models between groups. We show that differences in coefficients from these models can result not only from genuine differences in effects, but also from differences in one or more of the following three components: outcome truncation, scale parameters and distributional shape of the predictor variable. These results point to limitations in using linear probability model coefficients for group comparisons. We also provide Monte Carlo simulations and real examples to illustrate these limitations, and we suggest a restricted approach to using linear probability model coefficients in group comparisons.

  5. Modeling experiments using quantum and Kolmogorov probability

    International Nuclear Information System (INIS)

    Hess, Karl

    2008-01-01

    Criteria are presented that permit a straightforward partition of experiments into sets that can be modeled using both quantum probability and the classical probability framework of Kolmogorov. These new criteria concentrate on the operational aspects of the experiments and lead beyond the commonly appreciated partition by relating experiments to commuting and non-commuting quantum operators as well as non-entangled and entangled wavefunctions. In other words the space of experiments that can be understood using classical probability is larger than usually assumed. This knowledge provides advantages for areas such as nanoscience and engineering or quantum computation.

  6. Simplified Freeman-Tukey test statistics for testing probabilities in ...

    African Journals Online (AJOL)

    This paper presents a simplified version of the Freeman-Tukey test statistic for testing hypotheses about multinomial probabilities in one-, two- and multi-dimensional contingency tables that does not require calculating the expected cell frequencies before the test of significance. The simplified method established new criteria of ...

  7. Efficient Simulation of the Outage Probability of Multihop Systems

    KAUST Repository

    Ben Issaid, Chaouki; Alouini, Mohamed-Slim; Tempone, Raul

    2017-01-01

    In this paper, we present an efficient importance sampling estimator for the evaluation of the outage probability of multihop systems with channel-state-information-assisted amplify-and-forward relays. The proposed estimator is endowed with the bounded relative error property. Simulation results show a significant reduction in the number of simulation runs compared to naive Monte Carlo.
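
    The paper's estimator is specific to multihop amplify-and-forward systems; the generic mean-shift example below illustrates only the underlying variance-reduction principle, on a Gaussian tail probability rather than an outage event.

```python
# Generic importance-sampling sketch of rare-event estimation: P(Z > 6) for
# Z ~ N(0,1). Textbook mean-shift IS, not the paper's multihop estimator.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(42)
N, a = 100_000, 6.0

# Naive Monte Carlo: essentially no samples land in the tail
z = rng.standard_normal(N)
print("naive MC estimate:", np.mean(z > a))

# Importance sampling: draw from N(a, 1), reweight by the likelihood ratio
y = rng.standard_normal(N) + a
weights = np.exp(-a * y + a**2 / 2)        # phi(y) / phi(y - a)
est = np.mean((y > a) * weights)
print("IS estimate:", est, " exact:", norm.sf(a))
```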

  8. Fostering Positive Attitude in Probability Learning Using Graphing Calculator

    Science.gov (United States)

    Tan, Choo-Kim; Harji, Madhubala Bava; Lau, Siong-Hoe

    2011-01-01

    Although a plethora of research evidence highlights positive and significant outcomes of the incorporation of the Graphing Calculator (GC) in mathematics education, its use in the teaching and learning process appears to be limited. The obvious need to revisit the teaching and learning of Probability has resulted in this study, i.e. to incorporate…

  9. Determination of bounds on failure probability in the presence of ...

    Indian Academy of Sciences (India)

    In particular, fuzzy set theory provides a more rational framework for ..... indicating that the random variations in T and O2 do not affect the failure probability significantly. ... The upper bound for PF shown in figure 6 can be used in decision-making.

  10. Efficient Simulation of the Outage Probability of Multihop Systems

    KAUST Repository

    Ben Issaid, Chaouki

    2017-10-23

    In this paper, we present an efficient importance sampling estimator for the evaluation of the outage probability of multihop systems with channel-state-information-assisted amplify-and-forward relays. The proposed estimator is endowed with the bounded relative error property. Simulation results show a significant reduction in the number of simulation runs compared to naive Monte Carlo.

  11. The probability-outcome correspondence principle: a dispositional view of the interpretation of probability statements

    NARCIS (Netherlands)

    Keren, G.; Teigen, K.H.

    2001-01-01

    This article presents a framework for lay people's internal representations of probabilities, which supposedly reflect the strength of underlying dispositions, or propensities, associated with the predicted event. From this framework, we derive the probability-outcome correspondence principle, which

  12. Cultural Differences in Young Adults' Perceptions of the Probability of Future Family Life Events.

    Science.gov (United States)

    Speirs, Calandra; Huang, Vivian; Konnert, Candace

    2017-09-01

    Most young adults are exposed to family caregiving; however, little is known about their perceptions of their future caregiving activities such as the probability of becoming a caregiver for their parents or providing assistance in relocating to a nursing home. This study examined the perceived probability of these events among 182 young adults and the following predictors of their probability ratings: gender, ethnicity, work or volunteer experience, experiences with caregiving and nursing homes, expectations about these transitions, and filial piety. Results indicated that Asian or South Asian participants rated the probability of being a caregiver as significantly higher than Caucasian participants, and the probability of placing a parent in a nursing home as significantly lower. Filial piety was the strongest predictor of the probability of these life events, and it mediated the relationship between ethnicity and probability ratings. These findings indicate the significant role of filial piety in shaping perceptions of future life events.

  13. Probable leaching mechanisms for spent fuel

    International Nuclear Information System (INIS)

    Wang, R.; Katayama, Y.B.

    1981-01-01

    At the Pacific Northwest Laboratory, researchers in the Waste/Rock Interaction Technology Program are studying spent fuel as a possible waste form for the Office of Nuclear Waste Isolation. This paper presents probable leaching mechanisms for spent fuel and discusses current progress in identifying and understanding the leaching process. During the past year, experiments were begun to study the complex leaching mechanism of spent fuel. The initial work in this investigation was done with UO2, which provided the most information possible on the behavior of the spent-fuel matrix without encountering the very high radiation levels associated with spent fuel. Both single-crystal and polycrystalline UO2 samples were used for this study, and techniques applicable to remote experimentation in a hot cell are being developed. The effects of radiation are being studied in terms of radiolysis of water and surface activation of the UO2. Dissolution behavior and kinetics of UO2 were also investigated by electrochemical measurement techniques. These data will be correlated with those acquired when spent fuel is tested in a hot cell. Oxidation effects represent a major area of concern in evaluating the stability of spent fuel. Dissolution of UO2 is greatly increased in an oxidizing solution because the dissolution is then controlled by the formation of hexavalent uranium. In solutions containing very low oxygen levels (i.e., reducing solutions), oxidation-induced dissolution may be possible via a previously oxidized surface, through exposure to air during storage, or by local oxidants such as O2 and H2O2 produced from radiolysis of water and radiation-activated UO2 surfaces. The effects of oxidation not only increase the dissolution rate, but could lead to the disintegration of spent fuel into fine fragments.

  14. Failure-probability driven dose painting

    International Nuclear Information System (INIS)

    Vogelius, Ivan R.; Håkansson, Katrin; Due, Anne K.; Aznar, Marianne C.; Kristensen, Claus A.; Rasmussen, Jacob; Specht, Lena; Berthelsen, Anne K.; Bentzen, Søren M.

    2013-01-01

    Purpose: To demonstrate a data-driven dose-painting strategy based on the spatial distribution of recurrences in previously treated patients. The result is a quantitative way to define a dose prescription function, optimizing the predicted local control at constant treatment intensity. A dose planning study using the optimized dose prescription in 20 patients is performed. Methods: Patients treated at our center have five tumor subvolumes delineated, from the center of the tumor (PET-positive volume) outward. The spatial distribution of 48 failures in patients with complete clinical response after (chemo)radiation is used to derive a model for tumor control probability (TCP). The total TCP is fixed to the clinically observed 70% actuarial TCP at five years. Additionally, the authors match the distribution of failures between the five subvolumes to the observed distribution. The steepness of the dose–response is extracted from the literature, and the authors assume 30% and 20% risk of subclinical involvement in the elective volumes. The result is a five-compartment dose response model matching the observed distribution of failures. The model is used to optimize the distribution of dose in individual patients, while keeping the treatment intensity constant and the maximum prescribed dose below 85 Gy. Results: The vast majority of failures occur centrally despite the small volumes of the central regions. Thus, optimizing the dose prescription yields higher doses to the central target volumes and lower doses to the elective volumes. The dose planning study shows that the modified prescription is clinically feasible. The optimized TCP is 89% (range: 82%–91%) as compared to the observed TCP of 70%. Conclusions: The observed distribution of locoregional failures was used to derive an objective, data-driven dose prescription function. The optimized dose is predicted to result in a substantial increase in local control without increasing the predicted risk of toxicity.

  15. Talking probabilities: communicating probabilistic information with words and numbers

    NARCIS (Netherlands)

    Renooij, S.; Witteman, C.L.M.

    1999-01-01

    The number of knowledge-based systems that build on Bayesian belief networks is increasing. The construction of such a network however requires a large number of probabilities in numerical form. This is often considered a major obstacle, one of the reasons being that experts are reluctant to provide

  16. Talking probabilities: communicating probabilistic information with words and numbers

    NARCIS (Netherlands)

    Renooij, S.; Witteman, C.L.M.

    1999-01-01

    The number of knowledge-based systems that build on Bayesian belief networks is increasing. The construction of such a network however requires a large number of probabilities in numerical form. This is often considered a major obstacle, one of the reasons being that experts are reluctant to

  17. Statistical significance of cis-regulatory modules

    Directory of Open Access Journals (Sweden)

    Smith Andrew D

    2007-01-01

    Full Text Available Abstract Background It is becoming increasingly important for researchers to be able to scan through large genomic regions for transcription factor binding sites or clusters of binding sites forming cis-regulatory modules. Correspondingly, there has been a push to develop algorithms for the rapid detection and assessment of cis-regulatory modules. While various algorithms for this purpose have been introduced, most are not well suited for rapid, genome scale scanning. Results We introduce methods designed for the detection and statistical evaluation of cis-regulatory modules, modeled as either clusters of individual binding sites or as combinations of sites with constrained organization. In order to determine the statistical significance of module sites, we first need a method to determine the statistical significance of single transcription factor binding site matches. We introduce a straightforward method of estimating the statistical significance of single site matches using a database of known promoters to produce data structures that can be used to estimate p-values for binding site matches. We next introduce a technique to calculate the statistical significance of the arrangement of binding sites within a module using a max-gap model. If the module scanned for has defined organizational parameters, the probability of the module is corrected to account for organizational constraints. The statistical significance of single site matches and the architecture of sites within the module can be combined to provide an overall estimation of statistical significance of cis-regulatory module sites. Conclusion The methods introduced in this paper allow for the detection and statistical evaluation of single transcription factor binding sites and cis-regulatory modules. The features described are implemented in the Search Tool for Occurrences of Regulatory Motifs (STORM and MODSTORM software.

  18. Modelling the probability of building fires

    Directory of Open Access Journals (Sweden)

    Vojtěch Barták

    2014-12-01

    Full Text Available Systematic spatial risk analysis plays a crucial role in preventing emergencies. In the Czech Republic, risk mapping is currently based on the risk accumulation principle, area vulnerability, and preparedness levels of Integrated Rescue System components. Expert estimates are used to determine risk levels for individual hazard types, while statistical modelling based on data from actual incidents and their possible causes is not used. Our model study, conducted in cooperation with the Fire Rescue Service of the Czech Republic within the Liberec and Hradec Králové regions, presents an analytical procedure leading to the creation of building fire probability maps based on recent incidents in the studied areas and on building parameters. In order to estimate the probability of building fires, a prediction model based on logistic regression was used. Probability of fire calculated by means of model parameters and attributes of specific buildings can subsequently be visualized in probability maps.

  19. Predicting binary choices from probability phrase meanings.

    Science.gov (United States)

    Wallsten, Thomas S; Jang, Yoonhee

    2008-08-01

    The issues of how individuals decide which of two events is more likely and of how they understand probability phrases both involve judging relative likelihoods. In this study, we investigated whether derived scales representing probability phrase meanings could be used within a choice model to predict independently observed binary choices. If they can, this simultaneously provides support for our model and suggests that the phrase meanings are measured meaningfully. The model assumes that, when deciding which of two events is more likely, judges take a single sample from memory regarding each event and respond accordingly. The model predicts choice probabilities by using the scaled meanings of individually selected probability phrases as proxies for confidence distributions associated with sampling from memory. Predictions are sustained for 34 of 41 participants but, nevertheless, are biased slightly low. Sequential sampling models improve the fit. The results have both theoretical and applied implications.

  20. Certainties and probabilities of the IPCC

    International Nuclear Information System (INIS)

    2004-01-01

    Based on an analysis of information about climate evolution, simulations of global warming, and the snow-coverage monitoring of Meteo-France, the IPCC presented its certainties and probabilities concerning the greenhouse effect. (A.L.B.)

  1. The probability factor in establishing causation

    International Nuclear Information System (INIS)

    Hebert, J.

    1988-01-01

    This paper discusses the possibilities and limitations of methods using the probability factor in establishing the causal link between bodily injury, whether immediate or delayed, and the nuclear incident presumed to have caused it (NEA) [fr]

  2. Bayesian optimization for computationally extensive probability distributions.

    Science.gov (United States)

    Tamura, Ryo; Hukushima, Koji

    2018-01-01

    An efficient method for finding a better maximizer of computationally extensive probability distributions is proposed on the basis of a Bayesian optimization technique. A key idea of the proposed method is to use extreme values of acquisition functions by Gaussian processes for the next training phase, which should be located near a local maximum or a global maximum of the probability distribution. Our Bayesian optimization technique is applied to the posterior distribution in the effective physical model estimation, which is a computationally extensive probability distribution. Even when the number of sampling points on the posterior distributions is fixed to be small, the Bayesian optimization provides a better maximizer of the posterior distributions in comparison to those by the random search method, the steepest descent method, or the Monte Carlo method. Furthermore, the Bayesian optimization improves the results efficiently by combining the steepest descent method and thus it is a powerful tool to search for a better maximizer of computationally extensive probability distributions.
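
    A minimal sketch of the loop described follows, with a cheap stand-in for the computationally extensive distribution and a standard expected-improvement acquisition; the kernel, target function, and settings are arbitrary illustrative choices, not the paper's.

```python
# Minimal Bayesian-optimization loop: GP surrogate + expected improvement,
# searching for a better maximizer of a (here cheap, stand-in) log-density.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def target(x):                       # stand-in 'expensive' unnormalized log-density
    return -0.5 * (x - 2.0)**2 + 0.3 * np.sin(5 * x)

rng = np.random.default_rng(0)
grid = np.linspace(-5, 5, 400).reshape(-1, 1)
X = rng.uniform(-5, 5, 4).reshape(-1, 1)     # a few initial evaluations
y = target(X).ravel()

gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), normalize_y=True)
for _ in range(15):
    gp.fit(X, y)
    mu, sd = gp.predict(grid, return_std=True)
    best = y.max()
    z = (mu - best) / np.maximum(sd, 1e-9)
    ei = (mu - best) * norm.cdf(z) + sd * norm.pdf(z)   # expected improvement
    x_next = grid[np.argmax(ei)]                        # next point to evaluate
    X = np.vstack([X, [x_next]])
    y = np.append(y, target(x_next))

print("best x found:", X[np.argmax(y)].item(), " log-density:", y.max())
```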

  3. Probability of Survival Decision Aid (PSDA)

    National Research Council Canada - National Science Library

    Xu, Xiaojiang; Amin, Mitesh; Santee, William R

    2008-01-01

    A Probability of Survival Decision Aid (PSDA) is developed to predict survival time for hypothermia and dehydration during prolonged exposure at sea in both air and water for a wide range of environmental conditions...

  4. Probability and statistics with integrated software routines

    CERN Document Server

    Deep, Ronald

    2005-01-01

    Probability & Statistics with Integrated Software Routines is a calculus-based treatment of probability concurrent with and integrated with statistics through interactive, tailored software applications designed to enhance the phenomena of probability and statistics. The software programs make the book unique. The book comes with a CD containing the interactive software leading to the Statistical Genie. The student can issue commands repeatedly while making parameter changes to observe the effects. Computer programming is an excellent skill for problem solvers, involving design, prototyping, data gathering, testing, redesign, validating, etc., all wrapped up in the scientific method. See also: CD to accompany Probability and Stats with Integrated Software Routines (0123694698). Incorporates more than 1,000 engaging problems with answers; includes more than 300 solved examples; uses varied problem solving methods.

  5. Determining probabilities of geologic events and processes

    International Nuclear Information System (INIS)

    Hunter, R.L.; Mann, C.J.; Cranwell, R.M.

    1985-01-01

    The Environmental Protection Agency has recently published a probabilistic standard for releases of high-level radioactive waste from a mined geologic repository. The standard sets limits for contaminant releases with more than one chance in 100 of occurring within 10,000 years, and less strict limits for releases of lower probability. The standard offers no methods for determining probabilities of geologic events and processes, and no consensus exists in the waste-management community on how to do this. Sandia National Laboratories is developing a general method for determining probabilities of a given set of geologic events and processes. In addition, we will develop a repeatable method for dealing with events and processes whose probability cannot be determined. 22 refs., 4 figs

  6. Probability of spent fuel transportation accidents

    International Nuclear Information System (INIS)

    McClure, J.D.

    1981-07-01

    The transported volume of spent fuel, incident/accident experience and accident environment probabilities were reviewed in order to provide an estimate of spent fuel accident probabilities. In particular, the accident review assessed the accident experience for large casks of the type that could transport spent (irradiated) nuclear fuel. This review determined that since 1971, the beginning of official US Department of Transportation record keeping for accidents/incidents, there has been one spent fuel transportation accident. This information, coupled with estimated annual shipping volumes for spent fuel, indicated an estimated probability of a spent fuel transport accident of 5 x 10^-7 accidents per mile. This is consistent with ordinary truck accident rates. A comparison of accident environments and regulatory test environments suggests that the probability of truck accidents exceeding the regulatory test for impact is approximately 10^-9 per mile

  7. Sampling, Probability Models and Statistical Reasoning Statistical

    Indian Academy of Sciences (India)

    Resonance – Journal of Science Education, Volume 1, Issue 5, May 1996, pp 49-58. General Article by Mohan Delampady and V R Padmawar.

  8. Probability and information theory, with applications to radar

    CERN Document Server

    Woodward, P M; Higinbotham, W

    1964-01-01

    Electronics and Instrumentation, Second Edition, Volume 3: Probability and Information Theory with Applications to Radar provides information pertinent to developments in research carried out in electronics and applied physics. This book presents the established mathematical techniques that provide the code in which so much of the mathematical theory of electronics and radar is expressed. Organized into eight chapters, this edition begins with an overview of the geometry of probability distributions, in which moments play a significant role. This text then examines the mathematical methods in

  9. Influence of nucleon density distribution in nucleon emission probability

    International Nuclear Information System (INIS)

    Paul, Sabyasachi; Nandy, Maitreyee; Mohanty, A.K.; Sarkar, P.K.; Gambhir, Y.K.

    2014-01-01

    Different decay modes are observed in heavy ion reactions at low to intermediate energies. It is interesting to study the total neutron emission in these reactions, which may be contributed to by all or many of these decay modes. In an attempt to understand the importance of the mean field and the entrance channel angular momentum, we study their influence on the emission probability of nucleons in heavy ion reactions in this work. This study owes its significance to the fact that once the populations of different states are determined, the emission probability governs the double differential neutron yield.

  10. On the probability of occurrence of rogue waves

    Directory of Open Access Journals (Sweden)

    E. M. Bitner-Gregersen

    2012-03-01

    Full Text Available A number of extreme and rogue wave studies have been conducted theoretically, numerically, experimentally and based on field data in recent years, which have significantly advanced our knowledge of ocean waves. So far, however, consensus on the probability of occurrence of rogue waves has not been achieved. The present investigation addresses this topic from the perspective of design needs. The probability of occurrence of extreme and rogue wave crests in deep water is discussed here based on higher order time simulations, experiments and hindcast data. Focus is given to the occurrence of rogue waves in high sea states.

  11. Internal Medicine residents use heuristics to estimate disease probability.

    Science.gov (United States)

    Phang, Sen Han; Ravani, Pietro; Schaefer, Jeffrey; Wright, Bruce; McLaughlin, Kevin

    2015-01-01

    Training in Bayesian reasoning may have limited impact on accuracy of probability estimates. In this study, our goal was to explore whether residents previously exposed to Bayesian reasoning use heuristics rather than Bayesian reasoning to estimate disease probabilities. We predicted that if residents use heuristics then post-test probability estimates would be increased by non-discriminating clinical features or a high anchor for a target condition. We randomized 55 Internal Medicine residents to different versions of four clinical vignettes and asked them to estimate probabilities of target conditions. We manipulated the clinical data for each vignette to be consistent with either 1) using a representative heuristic, by adding non-discriminating prototypical clinical features of the target condition, or 2) using anchoring with adjustment heuristic, by providing a high or low anchor for the target condition. When presented with additional non-discriminating data the odds of diagnosing the target condition were increased (odds ratio (OR) 2.83, 95% confidence interval [1.30, 6.15], p = 0.009). Similarly, the odds of diagnosing the target condition were increased when a high anchor preceded the vignette (OR 2.04, [1.09, 3.81], p = 0.025). Our findings suggest that despite previous exposure to the use of Bayesian reasoning, residents use heuristics, such as the representative heuristic and anchoring with adjustment, to estimate probabilities. Potential reasons for attribute substitution include the relative cognitive ease of heuristics vs. Bayesian reasoning or perhaps residents in their clinical practice use gist traces rather than precise probability estimates when diagnosing.
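
    The contrast the study tests can be stated in a few lines of code: the Bayesian post-test probability depends only on the prior and the likelihood ratio, while an anchoring-and-adjustment estimate moves with the anchor. All numbers below are invented for illustration, not taken from the study's vignettes.

```python
# Normative Bayesian updating vs. an anchor-based estimate. The base rate,
# likelihood ratio, and anchors are hypothetical illustrative values.
def bayes_post(prior, lr_positive):
    """Post-test probability from prior odds times the likelihood ratio."""
    odds = prior / (1 - prior) * lr_positive
    return odds / (1 + odds)

prior = 0.02          # base rate of the target condition
lr = 8.0              # likelihood ratio of the clinical finding

print("Bayesian post-test P:", round(bayes_post(prior, lr), 3))

# Anchoring-and-adjustment caricature: start at a suggested anchor and
# adjust only part of the way toward the Bayesian answer.
for anchor in (0.10, 0.70):
    estimate = anchor + 0.5 * (bayes_post(prior, lr) - anchor)
    print(f"anchor {anchor:.2f} -> estimate {estimate:.2f}")
# The normative answer is anchor-independent; the heuristic answer is not,
# which is the signature the study detected in residents' judgments.
```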

  12. Imprecise Probability Methods for Weapons UQ

    Energy Technology Data Exchange (ETDEWEB)

    Picard, Richard Roy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Vander Wiel, Scott Alan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-13

    Building on recent work in uncertainty quantification, we examine the use of imprecise probability methods to better characterize expert knowledge and to improve on misleading aspects of Bayesian analysis with informative prior distributions. Quantitative approaches to incorporate uncertainties in weapons certification are subject to rigorous external peer review, and in this regard, certain imprecise probability methods are well established in the literature and attractive. These methods are illustrated using experimental data from LANL detonator impact testing.
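
    One well-established imprecise-probability device of the kind alluded to here is robust Bayes: replace a single informative prior with a set of priors and report the range of posterior conclusions. Below is a minimal beta-binomial sketch with hypothetical data; it is illustrative only, not the LANL analysis.

    ```python
    # Robust-Bayes sketch: instead of one informative Beta prior, sweep a set of
    # priors and report the interval of posterior means ("prior sensitivity bounds").
    import itertools

    def posterior_mean(alpha: float, beta: float, successes: int, trials: int) -> float:
        """Posterior mean of a Beta(alpha, beta) prior after binomial data."""
        return (alpha + successes) / (alpha + beta + trials)

    successes, trials = 19, 20          # hypothetical test outcomes
    prior_set = itertools.product([0.5, 1, 2, 5], [0.5, 1, 2, 5])  # credal set of Beta priors
    means = [posterior_mean(a, b, successes, trials) for a, b in prior_set]
    print(f"posterior reliability in [{min(means):.3f}, {max(means):.3f}]")
    ```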

  13. Escape and transmission probabilities in cylindrical geometry

    International Nuclear Information System (INIS)

    Bjerke, M.A.

    1980-01-01

    An improved technique for the generation of escape and transmission probabilities in cylindrical geometry was applied to the existing resonance cross-section processing code ROLAIDS. The algorithm of Hwang and Toppel [ANL-FRA-TM-118] was employed, with modifications. The probabilities generated were found to be as accurate as those given by the method previously applied in ROLAIDS, while requiring much less computer core storage and CPU time.
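
    For orientation, first-flight escape probabilities of this kind are often benchmarked against the Wigner rational approximation, which needs only the mean chord length 4V/S of the convex body. The sketch below shows the textbook approximation for a finite cylinder; it is not the Hwang-Toppel algorithm, and the parameter values are hypothetical.

    ```python
    import math

    def mean_chord_cylinder(radius: float, height: float) -> float:
        """Cauchy's formula for a convex body: l_bar = 4V / S."""
        volume = math.pi * radius**2 * height
        surface = 2.0 * math.pi * radius * (radius + height)
        return 4.0 * volume / surface

    def escape_probability_wigner(sigma_t: float, radius: float, height: float) -> float:
        """Wigner rational approximation: P_esc ~ 1 / (1 + Sigma_t * l_bar)."""
        return 1.0 / (1.0 + sigma_t * mean_chord_cylinder(radius, height))

    print(escape_probability_wigner(sigma_t=0.5, radius=1.0, height=2.0))  # cm^-1, cm
    ```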

  14. Probability and statistics for computer science

    CERN Document Server

    Johnson, James L

    2011-01-01

    Comprehensive and thorough development of both probability and statistics for serious computer scientists; goal-oriented: "to present the mathematical analysis underlying probability results". Special emphasis on simulation and discrete decision theory; a mathematically rich but self-contained text at a gentle pace, with a review of calculus and linear algebra in an appendix. Mathematical interludes in each chapter examine mathematical techniques in the context of their probabilistic or statistical importance. Numerous section exercises, summaries, historical notes, and further readings for reinforcement.

  15. Collision Probabilities for Finite Cylinders and Cuboids

    Energy Technology Data Exchange (ETDEWEB)

    Carlvik, I

    1967-05-15

    Analytical formulae have been derived for the collision probabilities of homogeneous finite cylinders and cuboids. The formula for the finite cylinder contains double integrals, and the formula for the cuboid only single integrals. Collision probabilities have been calculated by means of the formulae and compared with values obtained by other authors. It was found that the calculations using the analytical formulae are much quicker and give higher accuracy than Monte Carlo calculations.
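
    The Monte Carlo comparison mentioned can be reproduced in outline: sample uniform, isotropic source points in a homogeneous finite cylinder and average 1 - exp(-Sigma_t * s) over the chord s from the point to the boundary. The sketch below is a generic estimator of this kind with hypothetical parameters, not Carlvik's analytical formulae.

    ```python
    import math, random

    def collision_probability_mc(sigma_t: float, radius: float, height: float,
                                 samples: int = 200_000, seed: int = 1) -> float:
        """First-flight collision probability for a homogeneous finite cylinder
        with a uniform, isotropic source: P = E[1 - exp(-Sigma_t * s)]."""
        rng = random.Random(seed)
        total = 0.0
        for _ in range(samples):
            # Uniform point in the cylinder (polar sampling in the disk).
            r = radius * math.sqrt(rng.random())
            phi = 2.0 * math.pi * rng.random()
            x, y, z = r * math.cos(phi), r * math.sin(phi), height * rng.random()
            # Isotropic direction.
            w = 2.0 * rng.random() - 1.0
            psi = 2.0 * math.pi * rng.random()
            sin_t = math.sqrt(1.0 - w * w)
            u, v = sin_t * math.cos(psi), sin_t * math.sin(psi)
            # Distance to the lateral surface (positive root of the quadratic).
            a, b = u * u + v * v, x * u + y * v
            c = x * x + y * y - radius * radius
            t_side = (-b + math.sqrt(b * b - a * c)) / a if a > 0 else math.inf
            # Distance to the top or bottom cap.
            t_cap = (height - z) / w if w > 0 else (-z / w if w < 0 else math.inf)
            total += 1.0 - math.exp(-sigma_t * min(t_side, t_cap))
        return total / samples

    print(collision_probability_mc(sigma_t=1.0, radius=1.0, height=2.0))
    ```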

  16. Dynamic probability of reinforcement for cooperation: Random game termination in the centipede game.

    Science.gov (United States)

    Krockow, Eva M; Colman, Andrew M; Pulford, Briony D

    2018-03-01

    Experimental games have previously been used to study principles of human interaction. Many such games are characterized by iterated or repeated designs that model dynamic relationships, including reciprocal cooperation. To enable the study of infinite game repetitions and to avoid endgame effects of lower cooperation toward the final game round, investigators have introduced random termination rules. This study extends previous research that has focused narrowly on repeated Prisoner's Dilemma games by conducting a controlled experiment of two-player, random termination Centipede games involving probabilistic reinforcement and characterized by the longest decision sequences reported in the empirical literature to date (24 decision nodes). Specifically, we assessed mean exit points and cooperation rates, and compared the effects of four different termination rules: no random game termination, random game termination with constant termination probability, random game termination with increasing termination probability, and random game termination with decreasing termination probability. We found that although mean exit points were lower for games with shorter expected game lengths, the subjects' cooperativeness was significantly reduced only in the most extreme condition with decreasing computer termination probability and an expected game length of two decision nodes. © 2018 Society for the Experimental Analysis of Behavior.
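
    The termination rules compared can be made concrete with a small simulation: at each of up to 24 decision nodes, the computer ends the game with a probability given by a schedule that is constant, increasing, or decreasing in the node index. The schedules below are hypothetical placeholders, not the study's actual parameters.

    ```python
    import random

    def simulated_game_length(termination_p, max_nodes=24, rng=random.Random(7)):
        """Number of decision nodes reached before the computer randomly ends
        the game; termination_p(i) is the termination probability at node i."""
        for node in range(1, max_nodes + 1):
            if rng.random() < termination_p(node):
                return node
        return max_nodes

    trials = 100_000
    rules = {
        "constant":   lambda i: 1 / 24,
        "increasing": lambda i: i / 300,   # hypothetical schedule, grows with node index
        "decreasing": lambda i: 0.5 / i,   # hypothetical schedule, shrinks with node index
    }
    for name, rule in rules.items():
        mean = sum(simulated_game_length(rule) for _ in range(trials)) / trials
        print(f"{name:>10}: mean exit node ~ {mean:.1f}")
    ```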

  17. Detecting Novelty and Significance

    Science.gov (United States)

    Ferrari, Vera; Bradley, Margaret M.; Codispoti, Maurizio; Lang, Peter J.

    2013-01-01

    Studies of cognition often use an “oddball” paradigm to study effects of stimulus novelty and significance on information processing. However, an oddball tends to be perceptually more novel than the standard, repeated stimulus as well as more relevant to the ongoing task, making it difficult to disentangle effects due to perceptual novelty and stimulus significance. In the current study, effects of perceptual novelty and significance on ERPs were assessed in a passive viewing context by presenting repeated and novel pictures (natural scenes) that either signaled significant information regarding the current context or not. A fronto-central N2 component was primarily affected by perceptual novelty, whereas a centro-parietal P3 component was modulated by both stimulus significance and novelty. The data support an interpretation that the N2 reflects perceptual fluency and is attenuated when a current stimulus matches an active memory representation and that the amplitude of the P3 reflects stimulus meaning and significance. PMID:19400680

  18. Flux continuity and probability conservation in complexified Bohmian mechanics

    International Nuclear Information System (INIS)

    Poirier, Bill

    2008-01-01

    Recent years have seen increased interest in complexified Bohmian mechanical trajectory calculations for quantum systems as both a pedagogical and computational tool. In the latter context, it is essential that trajectories satisfy probability conservation to ensure they are always guided to where they are most needed. We consider probability conservation for complexified Bohmian trajectories. The analysis relies on time-reversal symmetry considerations, leading to a generalized expression for the conjugation of wave functions of complexified variables. This in turn enables meaningful discussion of complexified flux continuity, which turns out not to be satisfied in general, though a related property is found to be true. The main conclusion, though, is that even under a weak interpretation, probability is not conserved along complex Bohmian trajectories.
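
    For reference, the property at issue in the ordinary real-variable theory: writing the wave function in polar form, the Schrödinger equation yields a continuity equation for the probability density along trajectories moving with the Bohmian velocity. It is the complexified analogue of this standard relation that the paper finds fails in general.

    ```latex
    % Standard (real-variable) Bohmian mechanics: with \psi = R e^{iS/\hbar},
    % the density \rho = R^2 satisfies a continuity equation with v = \nabla S / m.
    \[
    \frac{\partial \rho}{\partial t} + \nabla \cdot \left( \rho \, \mathbf{v} \right) = 0,
    \qquad
    \mathbf{v} = \frac{\nabla S}{m}, \qquad \rho = |\psi|^2 = R^2 .
    \]
    ```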

  19. Probability theory versus simulation of petroleum potential in play analysis

    Science.gov (United States)

    Crovelli, R.A.

    1987-01-01

    An analytic probabilistic methodology for resource appraisal of undiscovered oil and gas resources in play analysis is presented. This play-analysis methodology is a geostochastic system for petroleum resource appraisal in explored as well as frontier areas. An objective was to replace an existing Monte Carlo simulation method in order to increase the efficiency of the appraisal process. Underlying the two methods is a single geologic model which considers both the uncertainty of the presence of the assessed hydrocarbon and its amount if present. The results of the model are resource estimates of crude oil, nonassociated gas, dissolved gas, and gas for a geologic play in terms of probability distributions. The analytic method is based upon conditional probability theory and a closed form solution of all means and standard deviations, along with the probabilities of occurrence. © 1987 J.C. Baltzer A.G., Scientific Publishing Company.
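
    The two routes can be contrasted on a toy version of such a geologic model: a Bernoulli presence indicator times a lognormal amount-if-present. Conditional probability gives the moments in closed form; Monte Carlo recovers them by sampling. All parameter values below are hypothetical, and this sketch is not the paper's full play-analysis system.

    ```python
    import math, random

    # Two-part model: resource = (play contains hydrocarbons?) x (amount if present).
    p_present = 0.3                  # hypothetical marginal probability of occurrence
    mu, sigma = 3.0, 0.8             # hypothetical lognormal parameters for the amount

    # Closed form via conditional probability (the "analytic" route).
    m = math.exp(mu + sigma**2 / 2)                      # E[amount | present]
    v = (math.exp(sigma**2) - 1) * m**2                  # Var[amount | present]
    mean_analytic = p_present * m
    var_analytic = p_present * (v + m**2) - mean_analytic**2

    # Monte Carlo (the route the analytic method was designed to replace).
    rng = random.Random(42)
    draws = [rng.lognormvariate(mu, sigma) if rng.random() < p_present else 0.0
             for _ in range(200_000)]
    mean_mc = sum(draws) / len(draws)

    print(f"analytic mean {mean_analytic:.2f}, MC mean {mean_mc:.2f}")
    print(f"analytic std  {math.sqrt(var_analytic):.2f}")
    ```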

  20. Significant NRC Enforcement Actions

    Data.gov (United States)

    Nuclear Regulatory Commission — This dataset provides a list of Nuclear Regulatory Commission (NRC) issued significant enforcement actions. These actions, referred to as "escalated", are issued by...