WorldWideScience

Sample records for probable essential thrombocythemia

  1. Probable essential thrombocythemia in a dog

    Energy Technology Data Exchange (ETDEWEB)

    Hopper, P.E.; Mandell, C.P.; Turrel, J.M.; Jain, N.C.; Tablin, F.; Zinkl, J.G.

    1989-04-01

Essential thrombocythemia (ET) in an 11-year-old dog was characterized by persistently high platelet counts (range, 4.19 × 10⁶/µL to 4.95 × 10⁶/µL), abnormal platelet morphology, marked megakaryocytic hyperplasia in the bone marrow, absence of circulating megakaryoblasts, and a history of splenomegaly and gastrointestinal bleeding. Increased numbers of megakaryocytes and megakaryoblasts (15% to 20%) in the bone marrow were confirmed by a positive acetylcholinesterase reaction. Another significant finding was the presence of basophilia in blood (4,836/µL) and bone marrow. The marked persistent thrombocytosis, absence of reactive (secondary) thrombocytosis, abnormal platelet morphology, and quantitative and qualitative changes in the megakaryocytic series in the bone marrow suggested the presence of a myeloproliferative disease. Cytochemical and ultrastructural findings aided in the diagnosis of ET. The dog was treated with radiophosphorus. The result was a rapid decline in the numbers of megakaryoblasts and megakaryocytes in the bone marrow and of platelets and basophils in the peripheral blood. The dog died unexpectedly of acute necrotizing pancreatitis and diabetes mellitus before a complete remission was achieved.

  2. Essential thrombocythemia: a rare disease in childhood

    Directory of Open Access Journals (Sweden)

    Julia Maimone Beatrice

    2013-01-01

    Full Text Available Essential thrombocythemia is an acquired myeloproliferative disorder characterized by the proliferation of megakaryocytes in bone marrow, leading to a persistent increase in the number of circulating platelets and thus increasing the risk for thrombotic and hemorrhagic events. The disease features leukocytosis, splenomegaly, vascular occlusive events, hemorrhages and vasomotor disorders. The intricate mechanisms underlying the molecular pathogenesis of this disorder are not completely understood and are still a matter of discussion. Essential thrombocythemia is an extremely rare disorder during childhood. We report on a case of essential thrombocythemia in a child and discuss the diagnostic approach and treatment strategy.

  3. Emerging treatments for essential thrombocythemia

    Directory of Open Access Journals (Sweden)

    Okoli S

    2011-12-01

Full Text Available Steven Okoli, Claire Harrison; Department of Haematology, Guy's and St Thomas' NHS Foundation Trust, Great Maze Pond, London, UK. Abstract: In 1934, Epstein and Goedel used the term hemorrhagic thrombocythemia to describe a disorder characterized by permanent elevation of the platelet count to more than three times normal, hyperplasia of megakaryocytes, and a tendency for venous thrombosis and spontaneous hemorrhage. Over the last 75 years, and particularly in the past 6 years, major progress has been made in our understanding of essential thrombocythemia (ET) and its pathogenesis with the identification of the highly prevalent JAK2 V617F and other mutations. Current management of this condition is based upon historical data and on treatments that have not changed significantly for nearly two decades. This study discusses this and recent progress, highlighting exciting new data with old and new drugs, as well as which patients in particular should be evaluated for these new therapies. Keywords: essential thrombocythemia, interferon, JAK inhibitor

  4. MODALITY OF TREATMENT IN ESSENTIAL THROMBOCYTHEMIA

    Directory of Open Access Journals (Sweden)

    Lana Macukanovic-Golubovic

    2008-10-01

Full Text Available Essential thrombocytosis (ET) is a clonal chronic myeloproliferative disorder which originates from an abnormality of a multipotent hematopoietic stem cell. It is characterized by an increased platelet count, megakaryocytic hyperplasia and a hemorrhagic or thrombotic tendency. Symptoms and signs may include weakness, headaches, paresthesias, bleeding, splenomegaly, and digital ischemia. ET patients showed equal or slightly shorter survival than an age- and sex-matched healthy population. Major causes of death were thrombotic and hemorrhagic complications or malignant progression due to both the natural history of the disease and, possibly, the use of chemotherapeutic agents. Diagnostic criteria for essential thrombocythemia were proposed in 2005 by the PVSG and demand a diagnosis of exclusion. Myelosuppressive therapy to lower the platelet count usually consists of hydroxyurea, interferon alpha or anagrelide. Hydroxyurea is the most commonly used treatment because of its efficacy, low cost and rare acute toxicity. Interferon alpha is a biological response modifier. It is not known to be teratogenic, does not cross the placenta, and is often the treatment of choice during pregnancy. Anagrelide suppresses bone marrow megakaryocytes by interfering with the maturation process and decreasing platelet production without affecting other blood cell lines. Low-dose aspirin may be used to control microvascular symptoms. Recommendations for management of patients with essential thrombocythemia were given by ASH. From a treatment standpoint, hydroxyurea is now confirmed to be the drug of choice for high-risk patients with essential thrombocythemia. Interferon alpha and anagrelide are reasonable second-line agents. Low-risk patients should receive low-dose aspirin alone. For the intermediate-risk patients, a consensus could not be reached on a recommendation for platelet-lowering treatment.

  5. Essential thrombocythemia in a child: diagnostic and therapeutic dilemma.

    Science.gov (United States)

    Asghar, Ramyar; Behzad, Elahi; Mohammad, Golsorkhtabar-Amiri

    2005-11-01

We report an 11-year-old child with essential thrombocythemia (ET), a very rare myeloproliferative disorder among children. Essential thrombocythemia can be complicated by life-threatening thrombosis and carries a risk of conversion into acute leukemia. Cytoreductive therapy may reduce the risk of thromboembolic complications. We usually recommend cytoreductive treatment for asymptomatic adult patients with platelet counts of more than 1.5 million/microliter, but treatment in children remains unclear. Herein, we report the results for a child with ET who was treated successfully with hydroxyurea.

  6. Assessment of rotation thromboelastometry parameters in patients with essential thrombocythemia at diagnosis and after hydroxyurea therapy.

    Science.gov (United States)

    Treliński, Jacek; Okońska, Marta; Robak, Marta; Chojnowski, Krzysztof

    2016-03-01

Patients with essential thrombocythemia suffer from thrombotic complications that are the main source of mortality. Due to its complex pathogenesis, no existing single laboratory method is able to identify the patients at highest risk for developing thrombosis. Twenty patients with essential thrombocythemia at diagnosis, 15 healthy volunteers and 20 patients treated with hydroxyurea were compared with regard to certain rotation thromboelastometry parameters. Clotting time (CT), clot formation time (CFT), α-angle, and maximum clot firmness (MCF) were assessed by using the INTEM, EXTEM, FIBTEM, and NATEM tests. Patients with essential thrombocythemia at diagnosis demonstrated a significantly higher mean platelet count and a markedly lower mean red blood cell count than controls. CT and CFT readings were found to be markedly lower in essential thrombocythemia patients at diagnosis than in the control group according to the EXTEM test. Patients at diagnosis had markedly lower CT values (EXTEM, FIBTEM) than patients on hydroxyurea therapy. Alpha-angle values were markedly higher in essential thrombocythemia patients at diagnosis than in controls, according to the EXTEM, FIBTEM and NATEM tests. MCF readings were significantly higher in essential thrombocythemia patients at diagnosis than in controls according to the EXTEM, INTEM, FIBTEM, and NATEM tests. Patients on hydroxyurea therapy had markedly lower MCF values according to the EXTEM test than patients at diagnosis. Patients with essential thrombocythemia demonstrate a prothrombotic state at the time of diagnosis, which is reflected in changes in certain rotation thromboelastometry parameters. Hydroxyurea therapy induces downregulation of the prothrombotic features seen in essential thrombocythemia patients at diagnosis.

  7. Optimal therapy for polycythemia vera and essential thrombocythemia

    DEFF Research Database (Denmark)

    Silver, Richard T; Hasselbalch, Hans K

    2016-01-01

    Objectives: To determine the value of recombinant interferon-alfa (rIFNα) in the treatment of polycythemia vera (PV) and essential thrombocythemia (ET) based on its biological activities and phase 2 clinical studies, pending completion of phase 3 trials; to determine importance of the Internet...

  8. Perspectives on chronic inflammation in essential thrombocythemia, polycythemia vera, and myelofibrosis

    DEFF Research Database (Denmark)

    Hasselbalch, Hans K

    2012-01-01

    The morbidity and mortality of patients with the chronic Philadelphia-negative myeloproliferative neoplasms (MPNs), essential thrombocythemia, polycythemia vera, and primary myelofibrosis are mainly caused by cardiovascular diseases, thrombohemorrhagic complications, and bone marrow failure because...

  9. Interferon and the treatment of polycythemia vera, essential thrombocythemia and myelofibrosis

    DEFF Research Database (Denmark)

    Silver, Richard T; Kiladjian, Jean-Jacques; Hasselbalch, Hans K

    2013-01-01

    . In polycythemia vera, this has resulted in a significant clinical, hematologic, morphologic and molecular response manifested by reduction in the JAK2(V617F) allele burden, sustained even after discontinuation of recombinant IFN. In essential thrombocythemia, platelet count reduction is prompt and durable without...

  10. Cytogenetics, JAK2 and MPL mutations in polycythemia vera, primary myelofibrosis and essential thrombocythemia

    Directory of Open Access Journals (Sweden)

    Leonardo Caires dos Santos

    2011-12-01

    Full Text Available BACKGROUND: The detection of molecular and cytogenetic alterations is important for the diagnosis, prognosis and classification of myeloproliferative neoplasms. OBJECTIVE: The aim of this study was to detect the following mutations: JAK2 V617F, JAK2 exon 12 and MPL W515K/L, besides chromosomal abnormalities. Furthermore, molecular and cytogenetic alterations were correlated with the leukocyte and platelet counts, hemoglobin levels and age in all patients and with the degree of fibrosis in primary myelofibrosis cases. METHODS: Twenty cases of polycythemia vera, 17 of essential thrombocythemia and 21 of primary myelofibrosis were selected in the Hematology Department of the Universidade Federal de São Paulo (UNIFESP between February 2008 and December 2009. The JAK2 V617F, JAK2 exon 12 mutations, MPL W515K and MPL W515L mutations were investigated by real-time PCR and direct sequencing. G-band karyotyping and fluorescence in situ hybridization were used to detect chromosomal abnormalities. RESULTS: Chromosomal abnormalities were observed only in polycythemia vera (11.8% and primary myelofibrosis cases (17.6%, without correlation to clinical data. Chromosomal abnormalities were not detected by fluorescence in situ hybridization. The JAK2 V617F mutation was observed in polycythemia vera (90%, primary myelofibrosis (42.8% and essential thrombocythemia (47%. Patients with JAK2 V617F-negative polycythemia vera had lower platelet and leukocyte counts compared to V617F-positive polycythemia vera (p-value = 0.0001 and p-value = 0.023, respectively. JAK2 V617F-positive and MPL W515L-positive primary myelofibrosis cases had a higher degree of fibrosis than V617F-negative cases (p-value = 0.022. JAK2 exon 12 mutations were not detected in polycythemia vera patients. The MPL W515L mutation was observed in one case of primary myelofibrosis and in one of essential thrombocythemia. The MPL W515K mutation was not found in patients with essential thrombocythemia

  11. Cytogenetics, JAK2 and MPL mutations in polycythemia vera, primary myelofibrosis and essential thrombocythemia.

    Science.gov (United States)

    Dos Santos, Leonardo Caires; Ribeiro, Juliana Corrêa da Costa; Silva, Neusa Pereira; Cerutti, Janete; da Silva, Maria Regina Regis; Chauffaille, Maria de Lourdes Lopes Ferrari

    2011-01-01

The detection of molecular and cytogenetic alterations is important for the diagnosis, prognosis and classification of myeloproliferative neoplasms. The aim of this study was to detect the following mutations: JAK2 V617F, JAK2 exon 12 and MPL W515K/L, besides chromosomal abnormalities. Furthermore, molecular and cytogenetic alterations were correlated with the leukocyte and platelet counts, hemoglobin levels and age in all patients and with the degree of fibrosis in primary myelofibrosis cases. Twenty cases of polycythemia vera, 17 of essential thrombocythemia and 21 of primary myelofibrosis were selected in the Hematology Department of the Universidade Federal de São Paulo (UNIFESP) between February 2008 and December 2009. The JAK2 V617F, JAK2 exon 12 mutations, MPL W515K and MPL W515L mutations were investigated by real-time PCR and direct sequencing. G-band karyotyping and fluorescence in situ hybridization were used to detect chromosomal abnormalities. Chromosomal abnormalities were observed only in polycythemia vera (11.8%) and primary myelofibrosis cases (17.6%), without correlation to clinical data. Chromosomal abnormalities were not detected by fluorescence in situ hybridization. The JAK2 V617F mutation was observed in polycythemia vera (90%), primary myelofibrosis (42.8%) and essential thrombocythemia (47%). Patients with JAK2 V617F-negative polycythemia vera had lower platelet and leukocyte counts compared to V617F-positive polycythemia vera (p-value = 0.0001 and p-value = 0.023, respectively). JAK2 V617F-positive and MPL W515L-positive primary myelofibrosis cases had a higher degree of fibrosis than V617F-negative cases (p-value = 0.022). JAK2 exon 12 mutations were not detected in polycythemia vera patients. The MPL W515L mutation was observed in one case of primary myelofibrosis and in one of essential thrombocythemia. The MPL W515K mutation was not found in patients with essential thrombocythemia or primary myelofibrosis. The MPL W515L

  12. Bone mineral density and microarchitecture in patients with essential thrombocythemia and polycythemia vera

    DEFF Research Database (Denmark)

    Farmer, S.; Shanbhogue, V. V.; Hansen, S.

    2017-01-01

    In this cross-sectional study of 45 patients with myeloproliferative neoplasms, we found no evidence of secondary osteoporosis. INTRODUCTION: Patients with essential thrombocythemia (ET) and polycythaemia vera (PV) are at increased risk of fractures but the underlying mechanisms have not been set...

  13. Essential thrombocythemia: Rare cause of chorea

    Directory of Open Access Journals (Sweden)

    Eswaradass Prasanna Venkatesan

    2014-01-01

Full Text Available Essential thrombocythemia (ET) is a clonal myeloproliferative disorder (MPD) characterized predominantly by a markedly elevated platelet count without a known cause. It is a rare hematological disorder. In ET, the clinical picture is dominated by a predisposition to vascular occlusive events and hemorrhages. Headache, transient ischemic attack, stroke, visual disturbances and lightheadedness are some of the neurological manifestations of ET. Here, we describe a 55-year-old female who presented to us with generalized chorea. On evaluation, she was found to have thrombocytosis. After ruling out the secondary causes of thrombocytosis and other MPDs, we confirmed the diagnosis of ET by bone marrow studies. Polycythemia vera (PV), another MPD closely related to ET, may also present with generalized chorea. There are a few case reports of PV presenting as chorea in the literature, but none with ET. We report the first case of ET presenting as generalized chorea.

  14. The JAK2 V617F allele burden in essential thrombocythemia, polycythemia vera and primary myelofibrosis

    DEFF Research Database (Denmark)

    Larsen, Thomas Stauffer; Pallisgaard, Niels; Møller, Michael Boe

    2007-01-01

BACKGROUND AND OBJECTIVES: The JAK2 V617F tyrosine kinase mutation is present in the great majority of patients with polycythemia vera (PV), and approximately half of the patients with essential thrombocythemia (ET) and primary myelofibrosis (PMF). The three distinct disease entities may be consi... = 0.03), CD34 counts (P = 0.03), lactate dehydrogenase and Polycythemia Rubra Vera gene 1 levels (P = 0.03 and P

  15. Peripheral circulatory disorders in essential thrombocythemia.

    Science.gov (United States)

    Małecki, Rafał; Gacka, Małgorzata; Fiodorenko-Dumas, Żanna; Dumas, Ilias; Kwiatkowski, Jacek; Adamiec, Rajmund; Kuliszkiewicz-Janus, Małgorzata

    2018-03-01

A significant number of patients with essential thrombocythemia (ET) complain of symptoms involving the distal parts of the extremities (e.g., paresthesias or Raynaud's phenomenon). The aim of the present study was to examine peripheral circulation in the upper extremities of individuals with ET. The study included 45 ET patients and 30 control subjects. All participants were subjected to thermography, photoplethysmography, impedance plethysmography, and applanation tonometry pulse wave analysis. The patients with ET differed significantly from the control subjects in terms of 3rd finger skin temperature (mean 31.04 vs. 32.45°C), skin temperature gradient (mean 1.82 vs. 0.11°C), photoplethysmographic amplitude (median 0.25 vs. 0.74%), and pulse waveform in the radial artery (more frequent occurrence of type B waveform). Pulse wave parameters correlated with the skin temperature gradient. The study findings imply altered regulation of peripheral circulation in ET, including decreased flow and increased resistance. Copyright © 2017 Elsevier Inc. All rights reserved.

  16. Essential Thrombocythemia in a Two-year-old Child, Responsive to Hydroxyurea but Not Aspirin

    Directory of Open Access Journals (Sweden)

    Tariq N. Aladily

    2017-06-01

Full Text Available Essential thrombocythemia (ET) is a myeloproliferative neoplasm that occurs mostly in patients above the age of 50 years. Its incidence in children is very rare, with around 100 cases reported in the literature. High-risk patients are defined by a previous life-threatening major thrombotic or severe hemorrhagic complication or age > 60 years. Those patients probably benefit from cytoreductive therapy. On the other hand, antiplatelet drugs are recommended for patients in the low-risk group. Although rare, ET should be considered in the differential diagnosis of persistent thrombocytosis in children, even at a very young age. A constellation of clinical, pathologic, and molecular testing is essential for diagnosis. Given the rarity of these cases, there is currently no consensus on treatment guidelines in children, especially in asymptomatic patients. We describe the case of a two-year-old girl who presented with unexplained, isolated thrombocytosis which persisted for eight years. Bone marrow biopsy demonstrated typical features of ET. Over the course of the disease, hydroxyurea, but not aspirin, showed better control of symptoms and lowered the platelet count.

  17. Acquired RhD mosaicism identifies fibrotic transformation of thrombopoietin receptor-mutated essential thrombocythemia.

    Science.gov (United States)

    Montemayor-Garcia, Celina; Coward, Rebecca; Albitar, Maher; Udani, Rupa; Jain, Prachi; Koklanaris, Eleftheria; Battiwalla, Minoo; Keel, Siobán; Klein, Harvey G; Barrett, A John; Ito, Sawa

    2017-09-01

    Acquired copy-neutral loss of heterozygosity has been described in myeloid malignant progression with an otherwise normal karyotype. A 65-year-old woman with MPL-mutated essential thrombocythemia and progression to myelofibrosis was noted upon routine pretransplant testing to have mixed field reactivity with anti-D and an historic discrepancy in RhD type. The patient had never received transfusions or transplantation. Gel immunoagglutination revealed group A red blood cells and a mixed-field reaction for the D phenotype, with a predominant D-negative population and a small subset of circulating red blood cells carrying the D antigen. Subsequent genomic microarray single nucleotide polymorphism profiling revealed copy-neutral loss of heterozygosity of chromosome 1 p36.33-p34.2, a known molecular mechanism underlying fibrotic progression of MPL-mutated essential thrombocythemia. The chromosomal region affected by this copy-neutral loss of heterozygosity encompassed the RHD, RHCE, and MPL genes. We propose a model of chronological molecular events that is supported by RHD zygosity assays in peripheral lymphoid and myeloid-derived cells. Copy-neutral loss of heterozygosity events that lead to clonal selection and myeloid malignant progression may also affect the expression of adjacent unrelated genes, including those encoding for blood group antigens. Detection of mixed-field reactions and investigation of discrepant blood typing results are important for proper transfusion support of these patients and can provide useful surrogate markers of myeloproliferative disease progression. © 2017 AABB.

  18. Immune Thrombocytopenia and JAK2V617F Positive Essential Thrombocythemia: Literature Review and Case Report

    Directory of Open Access Journals (Sweden)

    M. A. Sobas

    2017-01-01

Full Text Available We present the case where immune thrombocytopenia (ITP) and essential thrombocythemia (ET) sequentially appeared in the space of twenty-one years of follow-up. Impaired platelet production is present in both diseases, but clinical presentation and treatment are different. On the basis of this case history a possible role of autoimmunity as a predisposing factor to myeloproliferation has been discussed.

  19. Immune Thrombocytopenia and JAK2V617F Positive Essential Thrombocythemia: Literature Review and Case Report.

    Science.gov (United States)

    Sobas, M A; Wróbel, T; Zduniak, K; Podolak-Dawidziak, M; Rybka, J; Biedroń, M; Sawicki, M; Dybko, J; Kuliczkowski, K

    2017-01-01

    We present the case where immune thrombocytopenia (ITP) and essential thrombocythemia (ET) sequentially appeared in the space of twenty-one years of follow-up. Impaired platelet production is present in both diseases, but clinical presentation and treatment are different. On the basis of this case history a possible role of autoimmunity as a predisposing factor to myeloproliferation has been discussed.

  20. Circulating YKL-40 in patients with essential thrombocythemia and polycythemia vera treated with the novel histone deacetylase inhibitor vorinostat

    DEFF Research Database (Denmark)

    Andersen, Christen Lykkegaard; Bjørn, Mads Emil; McMullin, Mary Frances

    2014-01-01

    YKL-40 regulates vascular endothelial growth factors and induces tumor proliferation. We investigated YKL-40 before and after treatment with vorinostat in 31 polycythemia vera (PV) and 16 essential thrombocythemia (ET) patients. Baseline PV patient levels were 2 times higher than in healthy...

  1. An Ulceronecrotic Foot Lesion in a Patient with Essential Thrombocythemia: Successful Treatment with Hydroxyurea

    Directory of Open Access Journals (Sweden)

    Tokue Kato

    2012-01-01

    Full Text Available The patient was a 47-year-old woman with a painful ulcer that had appeared on the right 5th toe two weeks before she visited our hospital. Histopathological examination showed that thrombi were present in small blood vessels in the dermis and pancytosis was detected in a blood test, suggesting polycythemia-associated ulceration of the toe. Essential thrombocythemia was diagnosed based on bone marrow puncture and chromosomal test findings. Platelet count and the ulcer were improved by oral hydroxyurea.

  2. Application of prognostic score IPSET-thrombosis in patients with essential thrombocythemia of a Brazilian public service

    Directory of Open Access Journals (Sweden)

    Luana Magalhães Navarro

Full Text Available Summary Introduction: In patients with essential thrombocythemia (ET), vascular complications contribute to morbidity and mortality. To better predict the occurrence of thrombotic events, an International Prognostic Score for Thrombosis in Essential Thrombocythemia (IPSET-thrombosis) has recently been proposed. We present the application of this score and compare its results with the usual classification system. Method: We retrospectively evaluated the characteristics and risk factors for thrombosis of 46 patients with a diagnosis of ET seen in the last 6 years at Faculdade de Medicina do ABC (FMABC). Results: Thrombosis in the arterial territory was more prevalent than in venous sites. We observed that cardiovascular risk factors (hypertension, hypercholesterolemia, diabetes mellitus, and smoking) were also risk factors for thrombosis (p<0.001). Age over 60 years and presence of the JAK2 V617F mutation were not associated with the occurrence of thrombotic events. No patient classified by IPSET-thrombosis as low risk had a thrombotic event. Furthermore, using the IPSET-thrombosis scale, we identified two patients who had thrombotic events during follow-up and were otherwise classified in the low-risk group of the traditional classification. Leukocytosis at diagnosis was significantly associated with arterial thrombosis (p=0.02), while splenomegaly was associated with venous thrombotic events (p=0.01). Conclusion: Cardiovascular risk factors and leukocytosis were directly associated with arterial thrombosis. IPSET-thrombosis appears to be better than the traditional classification at identifying lower risk patients who do not need specific therapy.
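
The IPSET-thrombosis classification referenced in this record combines four variables (age, cardiovascular risk factors, previous thrombosis, and JAK2 V617F status). As a rough illustration only, the sketch below implements the commonly cited point weights and risk cut-offs for the score; these weights are an assumption recalled from the general literature, not taken from the abstract above.

```python
# Minimal sketch of an IPSET-thrombosis calculator.
# Point weights and risk cut-offs are an assumption (commonly cited version
# of the score), not reproduced from the abstract above.

def ipset_thrombosis(age_years, cv_risk_factors, prior_thrombosis, jak2_v617f):
    """Return (score, risk class) for a patient with essential thrombocythemia."""
    score = 0
    if age_years > 60:
        score += 1          # age > 60 years
    if cv_risk_factors:
        score += 1          # hypertension, diabetes, smoking, hypercholesterolemia
    if prior_thrombosis:
        score += 2          # previous thrombotic event
    if jak2_v617f:
        score += 2          # JAK2 V617F mutation present
    if score < 2:
        risk = "low"
    elif score == 2:
        risk = "intermediate"
    else:
        risk = "high"
    return score, risk

# Hypothetical example: 45-year-old, no CV risk factors, no prior thrombosis,
# JAK2 V617F positive -> score 2, intermediate risk.
print(ipset_thrombosis(45, False, False, True))
```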

  3. [Essential thrombocythemia: baseline characteristics and risk factors for survival and thrombosis in a series of 214 patients].

    Science.gov (United States)

    Angona, Anna; Alvarez-Larrán, Alberto; Bellosillo, Beatriz; Martínez-Avilés, Luz; Garcia-Pallarols, Francesc; Longarón, Raquel; Ancochea, Àgueda; Besses, Carles

    2015-03-15

Two prognostic models to predict overall survival and thrombosis-free survival have been proposed: the International Prognostic Score for Essential Thrombocythemia (IPSET) and IPSET-Thrombosis, respectively, based on age, leukocyte count, history of previous thrombosis, the presence of cardiovascular risk factors and the JAK2 mutational status. The aim of the present study was to assess the clinical and biological characteristics at diagnosis and during evolution in essential thrombocythemia (ET) patients, as well as the factors associated with survival and thrombosis and the usefulness of these new prognostic models. We have evaluated the clinical data and the mutation status of JAK2, MPL and calreticulin of 214 ET patients diagnosed in a single center between 1985 and 2012, classified according to the classical risk stratification, IPSET and IPSET-Thrombosis. With a median follow-up of 6.9 years, overall survival was not associated with any variable by multivariate analysis. Thrombotic history and leukocytes >10×10⁹/L were associated with thrombosis-free survival (TFS). In our series, the IPSET prognostic systems for survival and thrombosis did not provide more clinically relevant information than the classic thrombosis risk stratification. Thrombotic history and leukocytosis >10×10⁹/L were significantly associated with lower TFS, while the prognostic IPSET-Thrombosis system did not provide more information than classical thrombotic risk assessment. Copyright © 2014 Elsevier España, S.L.U. All rights reserved.

  4. Anagrelide compared with hydroxyurea in essential thrombocythemia: a meta-analysis.

    Science.gov (United States)

    Samuelson, Bethany; Chai-Adisaksopha, Chatree; Garcia, David

    2015-11-01

    Cytoreductive therapy, with or without low-dose aspirin, is the mainstay of thrombotic risk reduction in patients with essential thrombocythemia (ET), but the optimal choice of agent remains unclear. The aim of this study was to meta-analyze currently available data comparing anagrelide to hydroxyurea for reduction of rates of thrombosis, bleeding and death among patients with ET. A literature search for randomized, controlled trials comparing anagrelide to hydroxyurea among patients with ET revealed two published studies. Statistical analysis was performed using fixed effects meta-analysis. Rates of thrombosis were similar between patients treated with hydroxyurea vs anagrelide (RR 0.86, 95 % CI 0.64-1.16). Rates of major bleeding were lower in patients treated with hydroxyurea (RR 0.37, 95 % CI 0.18-0.75). Rates of progression to acute myeloid leukemia were not statistically different (RR 1.50, 95 % CI 0.43-5.29). The composite of thrombosis, major bleeding and death favored hydroxyurea (RR 0.78, 95 % CI 0.63-0.97). In conclusion, our analysis supports use of hydroxyurea as a first-line cytoreductive agent for patients with ET, based largely on decreased rates of major bleeding. Anagrelide appears to be equally effective for protection against thrombotic events and may be an appropriate alternative for patients who are intolerant of hydroxyurea.
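
For readers wanting to see the mechanics behind the pooled relative risks quoted above, the sketch below shows a standard fixed-effects (inverse-variance) pooling of log relative risks. The event counts are hypothetical placeholders, not the data of the two trials included in this meta-analysis.

```python
# Sketch of a fixed-effects (inverse-variance) pooled relative risk.
# The event counts below are hypothetical, not those of the two trials
# included in the meta-analysis above.
import math

# (events, total) in anagrelide arm, (events, total) in hydroxyurea arm
trials = [
    ((20, 400), (17, 400)),   # hypothetical trial 1
    ((12, 250), (10, 250)),   # hypothetical trial 2
]

num = den = 0.0
for (e1, n1), (e2, n2) in trials:
    log_rr = math.log((e1 / n1) / (e2 / n2))
    var = 1 / e1 - 1 / n1 + 1 / e2 - 1 / n2   # variance of log RR for a 2x2 table
    w = 1 / var                                # inverse-variance weight
    num += w * log_rr
    den += w

pooled_log_rr = num / den
se = math.sqrt(1 / den)
rr = math.exp(pooled_log_rr)
ci = (math.exp(pooled_log_rr - 1.96 * se), math.exp(pooled_log_rr + 1.96 * se))
print(f"Pooled RR = {rr:.2f}, 95% CI {ci[0]:.2f}-{ci[1]:.2f}")
```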

  5. Heparin-Induced Thrombocytopenia in a Patient with Essential Thrombocythemia: A Case Based Update

    Directory of Open Access Journals (Sweden)

    Edva Noel

    2015-01-01

Full Text Available Vascular thrombosis is a common clinical feature of both essential thrombocythemia (ET) and heparin-induced thrombocytopenia (HIT). The development of HIT in a patient with ET is rare and underrecognized. We report the case of a 77-year-old woman with preexisting ET, who was admitted with acute coronary syndrome, and IV heparin was started. She had been exposed to unfractionated heparin (UFH) 5 days prior to this admission. A decrease in platelet count was noted, and a HIT panel was sent. Heparin was discontinued. The patient developed atrial fibrillation, and dabigatran was started. On day three, the patient also developed multiple tiny cerebral infarctions and an acute right popliteal DVT. On day ten of admission, the HIT panel was positive, and dabigatran was changed to lepirudin. Two days later, lepirudin was also discontinued because the patient developed a pseudoaneurysm of the right common femoral artery at the site of cardiac catheterization access. A progressive increase in the platelet count was noted after discontinuing heparin. Physicians should be aware of the possible coexistence of HIT and ET, the accompanying challenges of prompt diagnosis, and the initiation of appropriate treatment.

  6. Frequency and clinical features of the JAK2 V617F mutation in pediatric patients with sporadic essential thrombocythemia.

    Science.gov (United States)

    Nakatani, Takuya; Imamura, Toshihiko; Ishida, Hiroyuki; Wakaizumi, Katsuji; Yamamoto, Tohru; Otabe, Osamu; Ishigami, Tsuyoshi; Adachi, Souichi; Morimoto, Akira

    2008-12-01

Pediatric essential thrombocythemia (ET) is a rare and heterogeneous disease entity. While several recent studies have focused on the role of the JAK2 V617F mutation in pediatric ET, the frequency of pediatric ET cases with this mutation and the associated clinical features remain unclear. We examined six childhood cases who had been diagnosed with ET according to WHO criteria (onset age: 0.2-14 years) for the presence of the JAK2 V617F mutation, the MPL W515L mutation and JAK2 exon 12 mutations. Two sensitive PCR-based methods were used for JAK2 V617F genotyping. We also examined the expression of polycythemia rubra vera-1 (PRV-1), which is a diagnostic marker for clonal ET. We found that three of the six cases had the JAK2 V617F mutation and that all six cases expressed PRV-1 in their peripheral granulocytes. Neither the MPL W515L mutation nor JAK2 exon 12 mutations were detected in the patients without the JAK2 V617F mutation. The two patients who developed thrombocythemia during infancy were JAK2 V617F-negative. These findings suggest that the JAK2 V617F mutation is not rare in childhood sporadic ET cases, and that these cases might be older and show myeloproliferative features.

  7. Zap-70 positive chronic lymphocytic leukemia co-existing with Jak 2 V617F positive essential thrombocythemia: A common defective stem cell?

    OpenAIRE

    Tabaczewski, Piotr; Nadesan, Sushani; Lim, Seah H

    2008-01-01

    Essential thrombocythemia (ET) co-existing with chronic lymphocytic leukemia (CLL) is extremely rare. We report two cases of ET with Jak 2 V617F in Zap-70+ CLL. ET is a myeloproliferative stem cell disease. Zap-70 expression in CLL correlates with non-mutated immunoglobulin genes. The occurrence of a less mature CLL in patients with a pluripotential stem cell disease raises the possibility that an initial “trigger hit” occurred in a pre-Jak 2 common early progenitor in these patients. Subsequ...

  8. Spontaneous Hemopericardium Leading to Cardiac Tamponade in a Patient with Essential Thrombocythemia

    Directory of Open Access Journals (Sweden)

    Anand Deshmukh

    2011-01-01

Full Text Available Spontaneous hemopericardium leading to cardiac tamponade in a patient with essential thrombocythemia (ET) has never been reported in the literature. We report a case of a 72-year-old Caucasian female who presented with spontaneous hemopericardium and tamponade requiring emergent pericardiocentesis. The patient was subsequently diagnosed to have ET. ET is characterized by elevated platelet counts that can lead to thrombosis, but paradoxically it can also lead to a bleeding diathesis. Physicians should be aware of this complication so that timely life-saving measures can be taken if this complication arises.

  9. Driver mutations (JAK2V617F, MPLW515L/K or CALR), pentraxin-3 and C-reactive protein in essential thrombocythemia and polycythemia vera

    OpenAIRE

    Lussana, Federico; Carobbio, Alessandra; Salmoiraghi, Silvia; Guglielmelli, Paola; Vannucchi, Alessandro Maria; Bottazzi, Barbara; Leone, Roberto; Mantovani, Alberto; Barbui, Tiziano; Rambaldi, Alessandro

    2017-01-01

Abstract Background The driver mutations JAK2V617F, MPLW515L/K and CALR influence the disease phenotype of myeloproliferative neoplasms (MPNs) and might sustain a condition of chronic inflammation. Pentraxin 3 (PTX3) and high-sensitivity C-reactive protein (hs-CRP) are inflammatory biomarkers potentially useful for refining the prognostic classification of MPNs. Methods We evaluated 305 patients with essential thrombocythemia (ET) and 172 with polycythemia vera (PV) diagnosed according to the 2016 WHO cri...

  10. Significance of combined detection of JAK2V617F, MPL and CALR gene mutations in patients with essential thrombocythemia.

    Science.gov (United States)

    Ji, Liying; Qian, Mengyao; Wu, Nana; Wu, Jianmin

    2017-03-01

The aim of this study was to analyze the mutation rate of the JAK2V617F, MPLW515L/K and CALR genes in adult patients with essential thrombocythemia (ET) and the accuracy of their combined detection by receiver operating characteristic curve analysis. Three hundred and forty-two cases with high platelet counts (≥300×10⁹/L) were consecutively selected. The patients were analyzed by routine blood examination, bone marrow biopsy and genetic testing. One hundred and fifty-four cases (45.03%) were diagnosed with ET and 188 cases with secondary thrombocythemia according to the hematopoietic and lymphoid tissue tumor classification standards of 2008. It was found that the mutant type of the three genes showed three bands, whereas the wild-type showed only one band. The JAK2V617F and MPL mutations did not cause a change in the open reading frame, whereas the CALR mutation resulted in a change. The mutation rate of JAK2V617F and CALR in the ET group was significantly higher than that in the secondary thrombocythemia group (p<0.05). The positive mutation rate of MPL was only 4.55%. When the JAK2V617F-positive mutation alone was used to diagnose ET, the area under the curve (AUC) was 0.721, the sensitivity was 72.4%, the specificity was 79.5% and the cut-off value was 0.25. When the CALR-positive mutation alone was used to diagnose ET, the AUC, sensitivity, specificity and cut-off value were 0.664, 68.4%, 82.4% and 0.09, respectively. When JAK2V617F was combined with CALR mutation for the diagnosis of ET, the AUC was 0.862, the sensitivity was 85.9%, the specificity was 87.8%, and the cut-off values were 0.21 and 0.07. In conclusion, the positive mutation rates of JAK2V617F and CALR in ET were higher, and the sensitivity, specificity and accuracy of the diagnosis of ET were significantly improved by the combined detection of JAK2V617F and CALR.
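
To make the reported diagnostic-accuracy measures more concrete, the sketch below computes sensitivity and specificity of an allele-burden marker at a cut-off, an empirical AUC from pairwise case-control comparisons, and a combined rule that calls ET when either marker exceeds its cut-off (0.25 for JAK2V617F and 0.09 for CALR, as in the abstract). The allele-burden values themselves are synthetic examples, not the study's data.

```python
# Sketch: sensitivity/specificity at a cut-off and an empirical AUC
# (Mann-Whitney estimate) for a continuous marker; the values are synthetic,
# not the study's measurements. Cut-offs 0.25 and 0.09 follow the abstract.

def sens_spec(cases, controls, cutoff):
    sens = sum(x >= cutoff for x in cases) / len(cases)
    spec = sum(x < cutoff for x in controls) / len(controls)
    return sens, spec

def empirical_auc(cases, controls):
    # AUC = P(case value > control value) + 0.5 * P(tie)
    wins = ties = 0
    for x in cases:
        for y in controls:
            wins += x > y
            ties += x == y
    return (wins + 0.5 * ties) / (len(cases) * len(controls))

# Hypothetical JAK2V617F allele burdens in ET vs. secondary thrombocytosis
et_jak2  = [0.42, 0.31, 0.05, 0.28, 0.50, 0.00, 0.33]
sec_jak2 = [0.00, 0.02, 0.00, 0.01, 0.00, 0.27, 0.00]

print(sens_spec(et_jak2, sec_jak2, cutoff=0.25))   # single-marker rule
print(empirical_auc(et_jak2, sec_jak2))

# Combined rule: call ET if JAK2 burden >= 0.25 OR CALR burden >= 0.09
def combined_positive(jak2_burden, calr_burden):
    return jak2_burden >= 0.25 or calr_burden >= 0.09

print(combined_positive(0.05, 0.12))  # True: CALR alone crosses its cut-off
```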

  11. JAK2, MPL, and CALR mutations in Chinese Han patients with essential thrombocythemia.

    Science.gov (United States)

    Wang, Jing; Zhang, Biao; Chen, Bing; Zhou, Rong-Fu; Zhang, Qi-Guo; Li, Juan; Yang, Yong-Gong; Zhou, Min; Shao, Xiao-Yan; Xu, Yong; Xu, Xi-Hui; Ouyang, Jian; Xu, Jingyan; Ye, Qing

    2017-04-01

Mutations in Janus kinase 2 (JAK2), myeloproliferative leukemia (MPL), and CALR are highly relevant to Philadelphia chromosome (Ph)-negative myeloproliferative neoplasms. We assessed the prevalence of molecular mutations in Chinese Han patients with essential thrombocythemia (ET) and correlated their mutational profile with disease characteristics/phenotype. Of the 110 subjects studied, 62 carried the JAK2 V617F mutation, 21 had CALR mutations, one carried an MPL (W515) mutation, and 28 had non-mutated JAK2, CALR, and MPL (so-called triple-negative ET). Mutations in JAK2 exon 12 were not detected in any patient. Two ET patients had both CALR and JAK2 V617F mutations. Comparing the hematological parameters of the patients with JAK2 mutations with those of the patients with CALR mutations showed that the ET patients with CALR mutations were younger (p = 0.045) and had higher platelet counts (p = 0.043). Genotyping for CALR could be a useful diagnostic tool for JAK2/MPL-negative ET, since the data suggest that CALR is much more prevalent than MPL; testing for CALR should therefore be considered in patients who are JAK2 negative, as its frequency is almost 20 times that of MPL mutations.

  12. Characteristics and clinical correlates of MPL 515W>L/K mutation in essential thrombocythemia.

    Science.gov (United States)

    Vannucchi, Alessandro M; Antonioli, Elisabetta; Guglielmelli, Paola; Pancrazzi, Alessandro; Guerini, Vittoria; Barosi, Giovanni; Ruggeri, Marco; Specchia, Giorgina; Lo-Coco, Francesco; Delaini, Federica; Villani, Laura; Finotto, Silvia; Ammatuna, Emanuele; Alterini, Renato; Carrai, Valentina; Capaccioli, Gloria; Di Lollo, Simonetta; Liso, Vincenzo; Rambaldi, Alessandro; Bosi, Alberto; Barbui, Tiziano

    2008-08-01

Among 994 patients with essential thrombocythemia (ET) who were genotyped for the MPLW515L/K mutation, 30 patients carrying the mutation were identified (3.0%), 8 of whom also displayed the JAK2V617F mutation. MPLW515L/K patients presented lower hemoglobin levels and higher platelet counts than did wild-type (wt) MPL patients; these differences were highly significant compared with MPLwt/JAK2V617F-positive patients. Reduced hemoglobin and increased platelet levels were preferentially associated with the W515L and W515K alleles, respectively. MPL mutation was a significant risk factor for microvessel disturbances, suggesting platelet hyperreactivity associated with constitutively active MPL; arterial thromboses were increased only in comparison to MPLwt/JAK2wt patients. MPLW515L/K patients presented reduced total and erythroid bone marrow cellularity, whereas the numbers of megakaryocytes, megakaryocytic clusters, and small-sized megakaryocytes were all significantly increased. These data indicate that MPLW515L/K mutations do not define a distinct phenotype in ET, although some differences depended on the JAK2V617F mutational status of the counterpart.

  13. Gene expression profiling with principal component analysis depicts the biological continuum from essential thrombocythemia over polycythemia vera to myelofibrosis

    DEFF Research Database (Denmark)

    Skov, Vibe; Thomassen, Mads; Riley, Caroline H

    2012-01-01

    The recent discovery of the Janus activating kinase 2 V617F mutation in most patients with polycythemia vera (PV) and half of those with essential thrombocythemia (ET) and primary myelofibrosis (PMF) has favored the hypothesis of a biological continuum from ET over PV to PMF. We performed gene...... with biological relevant overlaps between the different entities. Moreover, the analysis separates Janus activating kinase 2-negative ET patients from Janus activating kinase 2-positive ET patients. Functional annotation analysis demonstrates that clusters of gene ontology terms related to inflammation, immune...... system, apoptosis, RNA metabolism, and secretory system were the most significantly deregulated terms in the three different disease groups. Our results yield further support for the hypothesis of a biological continuum originating from ET over PV to PMF. Functional analysis suggests an important...

  14. Phosphatidylserine-exposing blood and endothelial cells contribute to the hypercoagulable state in essential thrombocythemia patients.

    Science.gov (United States)

    Tong, Dongxia; Yu, Muxin; Guo, Li; Li, Tao; Li, Jihe; Novakovic, Valerie A; Dong, Zengxiang; Tian, Ye; Kou, Junjie; Bi, Yayan; Wang, Jinghua; Zhou, Jin; Shi, Jialan

    2018-04-01

The mechanisms of thrombogenicity in essential thrombocythemia (ET) are complex and not well defined. Our objective was to explore whether phosphatidylserine (PS) exposure on blood cells and endothelial cells (ECs) can account for the increased thrombosis and distinct thrombotic risks among mutational subtypes in ET. Using flow cytometry and confocal microscopy, we found that the levels of PS-exposing erythrocytes, platelets, leukocytes, and serum-cultured ECs were significantly higher in each ET group [JAK2, CALR, and triple-negative (TN)] (all P < …); … cells and serum-cultured ECs led to markedly shortened coagulation time and dramatically increased levels of FXa, thrombin, and fibrin production. This procoagulant activity could be largely blocked by addition of lactadherin (approx. 70% inhibition). Confocal microscopy showed that the FVa/FXa complex and fibrin fibrils colocalized with PS on ET serum-cultured ECs. Additionally, we found a relationship between D-dimer, prothrombin fragment F1 + 2, and PS exposure. Our study reveals a previously unrecognized link between hypercoagulability and exposed PS on cells, which might also be associated with distinct thrombotic risks among mutational subtypes in ET. Thus, blocking PS-binding sites may represent a new therapeutic target for preventing thrombosis in ET.

  15. Genomic profile of a patient with triple negative essential thrombocythemia, unresponsive to therapy: A case report and literature review

    Directory of Open Access Journals (Sweden)

    Uzma Zaidi

    2017-07-01

Full Text Available Clonal analysis of patients with triple negative myeloproliferative neoplasms (MPN) has provided evidence of additional aberrations, including epigenetic alterations. To discover such novel genetic aberrations, patients were screened by next-generation sequencing with a 54-gene myeloid panel on a genetic analyser. Genetic variants in 28 genes, including TET2, BCOR, BCR, and ABL1, were identified in a triple negative essential thrombocythemia (ET) patient. The individual role of some of these variants in disease pathogenesis has yet to be studied. Somatic mutations in the same genes have been reported with variable frequencies in myeloid malignancies. However, no pathogenic impact of these variants could be established; therefore, long-term follow-up of patients, genetic analysis of a large cohort and the use of whole genome sequencing are required to assess the effects of these variants.

  16. Generation of human iPSCs from an essential thrombocythemia patient carrying a V501L mutation in the MPL gene.

    Science.gov (United States)

    Liu, Senquan; Ye, Zhaohui; Gao, Yongxing; He, Chaoxia; Williams, Donna W; Moliterno, Alison; Spivak, Jerry; Huang, He; Cheng, Linzhao

    2017-01-01

Activating point mutations in the MPL gene encoding the thrombopoietin receptor are found in 3%-10% of essential thrombocythemia (ET) and myelofibrosis patients. Here, we report the derivation of induced pluripotent stem cells (iPSCs) from an ET patient with a heterozygous MPL V501L mutation. Peripheral blood CD34+ progenitor cells were reprogrammed by transient plasmid expression of the OCT4, SOX2, KLF4 and c-MYC genes plus BCL2L1 (BCL-xL). The derived line M494 carries the MPL V501L mutation, displays typical iPSC morphology and characteristics, and is pluripotent and karyotypically normal. Upon differentiation, the iPSCs are able to give rise to cells of all three germ layers. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.

  17. Aspirin in polycythemia vera and essential thrombocythemia: current facts and perspectives.

    Science.gov (United States)

    Landolfi, R; Patrono, C

    1996-09-01

The role of aspirin in the antithrombotic strategy for patients with polycythemia vera (PV) and essential thrombocythemia (ET) is highly controversial. Long considered unsafe on the basis of a single clinical trial testing very high doses in PV patients, aspirin is being increasingly used at lower dosage. The rationale for the use of aspirin in patients with PV and ET is provided by the efficacy of this agent in the treatment of microcirculatory disturbances of thrombocythemic states associated with myeloproliferative disorders and by recent evidence that asymptomatic PV and ET patients have persistently increased thromboxane (TX) A2 biosynthesis. This increase, which most likely reflects enhanced platelet activation in vivo, is independent of the platelet mass and blood viscosity and largely suppressed by a short-term low-dose aspirin regimen (50 mg/day for 7 days). Since enhanced TXA2 biosynthesis may play a role in transducing the increased thrombotic risk associated with PV and ET, long-term low-dose aspirin administration has been proposed as a possible antithrombotic strategy in these subjects. The safety of this treatment in PV patients has been recently reassessed by the Gruppo Italiano per lo Studio della Policitemia Vera (GISP), which followed 112 patients randomized to receive 40 mg/day aspirin or placebo for over one year. In the same study, serum TXB2 measurements provided evidence that the low-dose aspirin regimen tested was fully effective in inhibiting platelet cyclooxygenase activity. On this basis, a large-scale trial aimed at assessing the antithrombotic efficacy of this approach is currently being organized. In patients with ET, both the minimal aspirin dose required for complete inhibition of platelet cyclooxygenase and the safety of long-term aspirin administration need to be established prior to extensive clinical evaluation of this strategy.

  18. Presence of atypical thrombopoietin receptor (MPL) mutations in triple-negative essential thrombocythemia patients.

    Science.gov (United States)

    Cabagnols, Xénia; Favale, Fabrizia; Pasquier, Florence; Messaoudi, Kahia; Defour, Jean Philippe; Ianotto, Jean Christophe; Marzac, Christophe; Le Couédic, Jean Pierre; Droin, Nathalie; Chachoua, Ilyas; Favier, Remi; Diop, M'boyba Khadija; Ugo, Valérie; Casadevall, Nicole; Debili, Najet; Raslova, Hana; Bellanné-Chantelot, Christine; Constantinescu, Stefan N; Bluteau, Olivier; Plo, Isabelle; Vainchenker, William

    2016-01-21

    Mutations in signaling molecules of the cytokine receptor axis play a central role in myeloproliferative neoplasm (MPN) pathogenesis. Polycythemia vera is mainly related to JAK2 mutations, whereas a wider mutational spectrum is detected in essential thrombocythemia (ET) with mutations in JAK2, the thrombopoietin (TPO) receptor (MPL), and the calreticulin (CALR) genes. Here, we studied the mutational profile of 17 ET patients negative for JAK2V617F, MPLW515K/L, and CALR mutations, using whole-exome sequencing and next-generation sequencing (NGS) targeted on JAK2 and MPL. We found several signaling mutations including JAK2V617F at very low allele frequency, 1 homozygous SH2B3 mutation, 1 MPLS505N, 1 MPLW515R, and 2 MPLS204P mutations. In the remaining patients, 4 presented a clonal and 7 a polyclonal hematopoiesis, suggesting that certain triple-negative ETs are not MPNs. NGS on 26 additional triple-negative ETs detected only 1 MPLY591N mutation. Functional studies on MPLS204P and MPLY591N revealed that they are weak gain-of-function mutants increasing MPL signaling and conferring either TPO hypersensitivity or independence to expressing cells, but with a low efficiency. Further studies should be performed to precisely determine the frequency of MPLS204 and MPLY591 mutants in a bigger cohort of MPN. © 2016 by The American Society of Hematology.

  19. Allogeneic hematopoietic stem cell transplantation in patients with polycythemia vera or essential thrombocythemia transformed to myelofibrosis or acute myeloid leukemia: a report from the MPN Subcommittee of the Chronic Malignancies Working Party of the European Group for Blood and Marrow Transplantation

    NARCIS (Netherlands)

    Lussana, F.; Rambaldi, A.; Finazzi, M.C.; Biezen, A. van; Scholten, M.; Oldani, E.; Carobbio, A.; Iacobelli, S.; Finke, J.; Nagler, A.; Volin, L.; Lamy, T.; Arnold, R.; Mohty, M.; Michallet, M.; Witte, T.J.M. de; Olavarria, E.; Kroger, N.

    2014-01-01

    The clinical course of polycythemia vera and essential thrombocythemia is potentially associated with long-term severe complications, such as evolution to myelofibrosis or acute myeloid leukemia. Allogeneic stem cell transplantation is currently the only potentially curative treatment for advanced

  20. Benefits and Risks of Antithrombotic Therapy in Essential Thrombocythemia: A Systematic Review.

    Science.gov (United States)

    Chu, Derek K; Hillis, Christopher M; Leong, Darryl P; Anand, Sonia S; Siegal, Deborah M

    2017-08-01

    Patients with essential thrombocythemia (ET) are at high risk for both thrombosis and hemorrhage. To evaluate the risks and benefits of antithrombotic therapy in adults with ET. Multiple databases, including MEDLINE, EMBASE, and the Cochrane Central Register of Controlled Trials, through 4 March 2017. Randomized and observational studies of antiplatelet or anticoagulant therapy, published in any language and reporting thrombotic or hemorrhagic events. Two reviewers independently extracted data, assessed risk of bias, and graded certainty of evidence. No relevant randomized trials were identified. Twenty-four observational studies (18 comparative and 6 single-group) involving 6153 patients followed for 31 711 patient-years were reviewed; most were deemed to have high risk of bias. Most patients receiving antiplatelet therapy (3613 of 4527 [80%]) received low-dose aspirin (50 to 150 mg/d); 914 (20%) received high-dose aspirin (300 to 600 mg/d), dipyridamole, or other agents. Overall, findings were inconsistent and imprecise. The reported incidence rates of thrombosis, any bleeding, and major bleeding without antiplatelet therapy ranged from 5 to 110 (median, 20), from 3 to 39 (median, 8), and from 2 to 53 (median, 6) cases per 1000 patient-years, respectively. The reported relative risks for thrombosis, any bleeding, and major bleeding with antiplatelet therapy compared with none ranged from 0.26 to 3.48 (median, 0.74), from 0.48 to 11.04 (median, 1.95), and from 0.48 to 5.17 (median, 1.30), respectively. Certainty of evidence was rated low or very low for all outcomes. No randomized trials, no extractable data on anticoagulants, lack of uniform bleeding definitions, and systematic reporting of outcomes. Available evidence about the risk-benefit ratio of antiplatelet therapy in adults with ET is highly uncertain. Regional Medical Associates. (PROSPERO: CRD42015027051).

  1. Rationale for revision and proposed changes of the WHO diagnostic criteria for polycythemia vera, essential thrombocythemia and primary myelofibrosis

    International Nuclear Information System (INIS)

    Barbui, T; Thiele, J; Vannucchi, A M; Tefferi, A

    2015-01-01

The 2001/2008 World Health Organization (WHO)-based diagnostic criteria for polycythemia vera (PV), essential thrombocythemia (ET) and primary myelofibrosis (PMF) were recently revised to accommodate new information on disease-specific mutations and underscore distinguishing morphologic features. In this context, it seems reasonable to first compare the major diagnostic criteria of the former WHO classifications for myeloproliferative neoplasms (MPN) and then to focus on details that have been discussed and will be proposed for the upcoming revision of the diagnostic guidelines. In PV, a characteristic bone marrow (BM) morphology was added as one of three major diagnostic criteria, which allowed lowering of the hemoglobin/hematocrit threshold for diagnosis, another major criterion, to 16.5 g/dl/49% in men and 16 g/dl/48% in women. The presence of a JAK2 mutation remains the third major diagnostic criterion in PV. A subnormal serum erythropoietin level is now the only minor criterion in PV and is used to capture JAK2-unmutated cases. In ET and PMF, mutations that are considered to confirm clonality and specific diagnosis now include CALR, in addition to JAK2 and MPL. Also, in the revision discussed in 2015, overtly fibrotic PMF is clearly distinguished from early/prefibrotic PMF, and each PMF variant now includes a separate list of diagnostic criteria. The main rationale for these changes was to enhance the distinction between so-called masked PV and JAK2-mutated ET, and between ET and prefibrotic early PMF. The proposed changes also underscore the complementary role, as well as the limitations, of mutation analysis in morphologic diagnosis. On the other hand, the discovery of new biological markers may be expected in the future to enhance discrimination of the different MPN subtypes in accordance with the histological BM patterns and corresponding clinical features.

  2. Clinical and molecular response to interferon-α therapy in essential thrombocythemia patients with CALR mutations.

    Science.gov (United States)

    Verger, Emmanuelle; Cassinat, Bruno; Chauveau, Aurélie; Dosquet, Christine; Giraudier, Stephane; Schlageter, Marie-Hélène; Ianotto, Jean-Christophe; Yassin, Mohammed A; Al-Dewik, Nader; Carillo, Serge; Legouffe, Eric; Ugo, Valerie; Chomienne, Christine; Kiladjian, Jean-Jacques

    2015-12-10

    Myeloproliferative neoplasms are clonal disorders characterized by the presence of several gene mutations associated with particular hematologic parameters, clinical evolution, and prognosis. Few therapeutic options are available, among which interferon α (IFNα) presents interesting properties like the ability to induce hematologic responses (HRs) and molecular responses (MRs) in patients with JAK2 mutation. We report on the response to IFNα therapy in a cohort of 31 essential thrombocythemia (ET) patients with CALR mutations (mean follow-up of 11.8 years). HR was achieved in all patients. Median CALR mutant allelic burden (%CALR) significantly decreased from 41% at baseline to 26% after treatment, and 2 patients even achieved complete MR. In contrast, %CALR was not significantly modified in ET patients treated with hydroxyurea or aspirin only. Next-generation sequencing identified additional mutations in 6 patients (affecting TET2, ASXL1, IDH2, and TP53 genes). The presence of additional mutations was associated with poorer MR on CALR mutant clones, with only minor or no MRs in this subset of patients. Analysis of the evolution of the different variant allele frequencies showed that the mutated clones had a differential sensitivity to IFNα in a given patient, but no new mutation emerged during treatment. In all, this study shows that IFNα induces high rates of HRs and MRs in CALR-mutated ET, and that the presence of additional nondriver mutations may influence the MR to therapy. © 2015 by The American Society of Hematology.
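
The mutant allelic burden (%CALR) followed in this study is, in sequencing terms, a variant allele frequency. The sketch below shows the underlying arithmetic from read counts; the read counts are invented for illustration, chosen only so that the percentages echo the 41% and 26% figures quoted above.

```python
# Sketch: CALR mutant allelic burden (%CALR) as a variant allele frequency
# computed from sequencing read counts; the read counts below are invented.

def allele_burden(mutant_reads, wildtype_reads):
    """Return the mutant allele frequency as a percentage of informative reads."""
    total = mutant_reads + wildtype_reads
    return 100.0 * mutant_reads / total if total else 0.0

# Hypothetical baseline vs. on-interferon samples
baseline = allele_burden(mutant_reads=410, wildtype_reads=590)   # ~41%
on_ifn   = allele_burden(mutant_reads=260, wildtype_reads=740)   # ~26%
print(f"baseline {baseline:.0f}% -> on IFN {on_ifn:.0f}%")
```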

  3. Recent advances in understanding myelofibrosis and essential thrombocythemia [version 1; referees: 2 approved]

    Directory of Open Access Journals (Sweden)

    William Vainchenker

    2016-04-01

    Full Text Available The classic BCR-ABL-negative myeloproliferative neoplasms (MPNs), a form of chronic malignant hemopathies, have been classified into polycythemia vera (PV), essential thrombocythemia (ET), and primary myelofibrosis (PMF). ET and PMF are two similar disorders in their pathogenesis, which is marked by a key role of the megakaryocyte (MK) lineage. Whereas ET is characterized by MK proliferation, PMF is also associated with aberrant MK differentiation (myelodysplasia), leading to the release of cytokines in the marrow environment, which causes the development of myelofibrosis. Thus, PMF is associated with both myeloproliferation and different levels of myelodysplastic features. MPNs are mostly driven by mutated genes called MPN drivers, which abnormally activate the cytokine receptor/JAK2 pathway and their downstream effectors. The recent discovery of CALR mutations has closed a gap in our knowledge and has shown that this mutated endoplasmic reticulum chaperone activates the thrombopoietin receptor MPL and JAK2. These genetic studies have shown that there are two main types of MPNs: JAK2V617F-MPNs, including ET, PV, and PMF, and the MPL-/CALR-MPNs, which include only ET and PMF. These MPN driver mutations are associated with additional mutations in genes involved in epigenetics, splicing, and signaling, which can precede or follow the acquisition of MPN driver mutations. They are involved in clonal expansion or phenotypic changes or both, leading to myelofibrosis or leukemic transformation or both. Only a few patients with ET exhibit mutations in non-MPN drivers, whereas the great majority of patients with PMF harbor one or several mutations in these genes. However, the entire pathogenesis of ET and PMF may also depend on other factors, such as the patient's constitutional genetics, the bone marrow microenvironment, the inflammatory response, and age. Recent advances allowed a better stratification of these diseases and new therapeutic approaches with

  4. JAK2V617F mutation is associated with special alleles in essential thrombocythemia.

    Science.gov (United States)

    Hsiao, Hui-Hua; Liu, Yi-Chang; Tsai, Hui-Jen; Lee, Ching-Ping; Hsu, Jui-Feng; Lin, Sheng-Fung

    2011-03-01

    Janus kinase 2 mutation (JAK2V617F) has been identified in myeloproliferative neoplasms. Furthermore, special single nucleotide polymorphisms (SNPs) have been found to be associated with the JAK2V617F mutation. Therefore, the associations among JAK2V617F and special SNPs and the allelic location between them were investigated in patients with essential thrombocythemia (ET). A total of 61 patients with ET and 106 healthy individuals were enrolled. The PCR-RFLP method was applied to investigate the pattern of three SNPs, rs10974944, rs12343867, and rs12340895. Allele-specific PCR was used to examine the allelic location between rs10974944 and JAK2V617F. Among the patients with ET, 34 (55.7%, 34/61) were JAK2V617F positive (heterozygous) while the other 27 (44.3%, 27/61) were negative, and there were no MPLW515L/K mutations noted. The pattern of special SNPs in JAK2V617F(+) patients was significantly different from that in normal individuals (p <0.05), while there was no difference between JAK2V617F(-) patients and normal individuals. Allele-specific PCR showed a high association of a cis-location between the special G allele of rs10974944 and JAK2V617F(+). Based on this small study, the results show an association between special SNPs and the JAK2V617F mutation and a cis-location between the special G-allelic form of rs10974944 and the JAK2V617F mutation. These data highlight a close relationship between them in patients with ET.

  5. The association of JAK2V617F mutation and leukocytosis with thrombotic events in essential thrombocythemia.

    Science.gov (United States)

    Hsiao, Hui-Hua; Yang, Ming-Yu; Liu, Yi-Chang; Lee, Ching-Ping; Yang, Wen-Chi; Liu, Ta-Chih; Chang, Chao-Sung; Lin, Sheng-Fung

    2007-11-01

    The Janus kinase 2 mutation, JAK2 (V617F), and megakaryocytic mutations, MPL (W515L/K), have been identified and correlated with a subtype of essential thrombocythemia (ET) patients. We investigated the frequency of mutations in ET patients and analyzed the relationship with their clinical features. Fifty-three ET patients were enrolled in the study. The amplification refractory mutation system was applied for the mutation survey of JAK2V617F, while polymerase chain reaction with sequencing was used for the mutation survey of MPLW515L/K. Thirty-five (66%) patients were found to harbor the JAK2 (V617F) mutation, including 3 homozygous and 32 heterozygous changes, while no MPLW515L/K mutations were found. During follow-up, 17 (32.1%) patients suffered from documented thrombotic events, 15 of whom had the JAK2V617F mutation. Statistical analysis showed that patients with the JAK2 mutation had significantly higher leukocyte counts, hemoglobin levels, and thrombotic event rates (p = 0.043, p = 0.001, and p = 0.029, respectively). Thrombotic events were also significantly correlated with leukocytosis and older age. The JAK2V617F mutation was noted in a subset of ET patients and correlated with leukocytosis, high hemoglobin level, and thrombosis. Therefore, detection of the JAK2V617F mutation can affect not only the diagnosis, but also the management of ET patients.

  6. Blast transformation and fibrotic progression in polycythemia vera and essential thrombocythemia: a literature review of incidence and risk factors

    Science.gov (United States)

    Cerquozzi, S; Tefferi, A

    2015-01-01

    Polycythemia vera (PV) and essential thrombocythemia (ET) constitute two of the three BCR-ABL1-negative myeloproliferative neoplasms and are characterized by relatively long median survivals (approximately 14 and 20 years, respectively). Potentially fatal disease complications in PV and ET include disease transformation into myelofibrosis (MF) or acute myeloid leukemia (AML). Reported frequencies of post-PV MF were 4.9–6% at 10 years and 6–14% at 15 years, and of post-ET MF 0.8–4.9% at 10 years and 4–11% at 15 years. The corresponding figures for post-PV AML were 2.3–14.4% at 10 years and 5.5–18.7% at 15 years, and for post-ET AML 0.7–3% at 10 years and 2.1–5.3% at 15 years. Risk factors cited for post-PV MF include advanced age, leukocytosis, reticulin fibrosis, splenomegaly and JAK2V617F allele burden, and for post-ET MF advanced age, leukocytosis, anemia, reticulin fibrosis, absence of JAK2V617F, use of anagrelide and presence of ASXL1 mutation. Risk factors for post-PV AML include advanced age, leukocytosis, reticulin fibrosis, splenomegaly, abnormal karyotype, TP53 or RUNX1 mutations as well as use of pipobroman, radiophosphorus (P32) and busulfan, and for post-ET AML advanced age, leukocytosis, anemia, extreme thrombocytosis, thrombosis, reticulin fibrosis, and TP53 or RUNX1 mutations. It is important to note that some of the aforementioned incidence figures and risk factor determinations are probably inaccurate and at times conflicting because of the retrospective nature of the studies and the inadvertent labeling, in some studies, of patients with prefibrotic primary MF or 'masked' PV as ET. Ultimately, transformation of MPN leads to poor outcomes and management remains challenging. Further understanding of the molecular events leading to disease transformation is being investigated. PMID:26565403

  7. Epidemiology of myelofibrosis, essential thrombocythemia, and polycythemia vera in the European Union.

    Science.gov (United States)

    Moulard, Odile; Mehta, Jyotsna; Fryzek, Jon; Olivares, Robert; Iqbal, Usman; Mesa, Ruben A

    2014-04-01

    Primary myelofibrosis (PMF), essential thrombocythemia (ET), and polycythemia vera (PV) are BCR ABL-negative myeloproliferative neoplasms (MPN). Published epidemiology data are scarce, and multiple sources are needed to assess the disease burden. We assembled the most recent information available on the incidence and prevalence of myelofibrosis (MF), ET, and PV by conducting a structured and exhaustive literature review of the published peer-reviewed literature in EMBASE and by reviewing online documentation from disease registries and relevant health registries in European countries. The search was restricted to human studies written in English or French and published between January 1, 2000, and December 6, 2012. Eleven articles identified from EMBASE, three online hematology or oncology registries, and two Web-based databases or reports were used to summarize epidemiological estimates for MF, PV, and ET. The incidence rate of MF ranged from 0.1 per 100,000 per year to 1 per 100,000 per year. Among the various registries, the incidence of PV ranged from 0.4 per 100,000 per year to 2.8 per 100,000 per year, while the literature estimated the range of PV incidence to be 0.68 per 100,000 to 2.6 per 100,000 per year. The estimated incidence of ET was between 0.38 per 100,000 per year and 1.7 per 100,000 per year. While a few studies reported on the MPNs' prevalences, it is difficult to compare them as various types of prevalence were calculated (point prevalence vs. period prevalence) and standardization was made according to different populations (e.g., the world population and the European population). There is a wide variation in both prevalence and incidence estimates observed across European data sources. Carefully designed studies, with standardized definitions of MPNs and complete ascertainment of patients including both primary and secondary MFs, should be conducted so that estimates of the population aimed to receive novel treatments for these neoplasms are

  8. A clinical-molecular prognostic model to predict survival in patients with post polycythemia vera and post essential thrombocythemia myelofibrosis.

    Science.gov (United States)

    Passamonti, F; Giorgino, T; Mora, B; Guglielmelli, P; Rumi, E; Maffioli, M; Rambaldi, A; Caramella, M; Komrokji, R; Gotlib, J; Kiladjian, J J; Cervantes, F; Devos, T; Palandri, F; De Stefano, V; Ruggeri, M; Silver, R T; Benevolo, G; Albano, F; Caramazza, D; Merli, M; Pietra, D; Casalone, R; Rotunno, G; Barbui, T; Cazzola, M; Vannucchi, A M

    2017-12-01

    Polycythemia vera (PV) and essential thrombocythemia (ET) are myeloproliferative neoplasms with variable risk of evolution into post-PV and post-ET myelofibrosis, from now on referred to as secondary myelofibrosis (SMF). No specific tools have been defined for risk stratification in SMF. To develop a prognostic model for predicting survival, we studied 685 JAK2-, CALR-, and MPL-annotated patients with SMF. Median survival of the whole cohort was 9.3 years (95% CI: 8–not reached [NR]). Through penalized Cox regressions we identified negative predictors of survival and according to beta risk coefficients we assigned 2 points to hemoglobin level <11 g/dl, to circulating blasts ≥3%, and to CALR-unmutated genotype, 1 point to platelet count <150 × 10⁹/l and to constitutional symptoms, and 0.15 points to any year of age. The Myelofibrosis Secondary to PV and ET-Prognostic Model (MYSEC-PM) allocated SMF patients into four risk categories with different survival (P<0.0001): low (median survival NR; 133 patients), intermediate-1 (9.3 years, 95% CI: 8.1-NR; 245 patients), intermediate-2 (4.4 years, 95% CI: 3.2-7.9; 126 patients), and high risk (2 years, 95% CI: 1.7-3.9; 75 patients). Finally, we found that the MYSEC-PM represents the most appropriate tool for SMF decision-making to be used in clinical and trial settings.
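
    The point assignments quoted above define a simple additive score, sketched below with the weights as stated (2 points for hemoglobin <11 g/dl, circulating blasts ≥3% and a CALR-unmutated genotype, 1 point for platelets <150 × 10⁹/l and for constitutional symptoms, and 0.15 points per year of age). The abstract does not give the score cut-offs separating the four risk categories, so the function returns only the raw score.

```python
# Sketch of the MYSEC-PM additive score using the weights quoted above.
# Only the raw score is returned; no category cut-offs are assumed here,
# because the abstract does not state them.
def mysec_pm_score(age_years: float,
                   hemoglobin_g_dl: float,
                   circulating_blasts_pct: float,
                   platelets_10e9_l: float,
                   constitutional_symptoms: bool,
                   calr_mutated: bool) -> float:
    score = 0.15 * age_years
    if hemoglobin_g_dl < 11:
        score += 2
    if circulating_blasts_pct >= 3:
        score += 2
    if not calr_mutated:
        score += 2
    if platelets_10e9_l < 150:
        score += 1
    if constitutional_symptoms:
        score += 1
    return score

print(mysec_pm_score(68, 10.2, 4, 120, True, False))  # -> 18.2 points
```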

  9. The effect of alpha-interferon on bone marrow megakaryocytes and platelet production rate in essential thrombocythemia

    International Nuclear Information System (INIS)

    Wadenvik, H.; Kutti, J.; Ridell, B.; Revesz, P.; Jacobsson, S.; Magnusson, B.; Westin, J.; Vilen, L.

    1991-01-01

    In 10 patients with previously untreated essential thrombocythemia (ET), the relation between platelet production rate and bone marrow megakaryocytes was evaluated before and during alpha-2b-interferon (IFN) therapy by using 111In-labeled platelets and megakaryocyte morphometry. A highly significant decrease in platelet count occurred during IFN therapy; the platelet counts at baseline and after 2 and 6 months of IFN therapy were 1,102 +/- 345 x 10(9)/L, 524 +/- 169 x 10(9)/L (P less than .0001), and 476 +/- 139 x 10(9)/L (P less than .0001), respectively. The decrement in platelet count was mainly a result of a diminished platelet production rate, which at baseline and after 2 and 6 months of IFN therapy was 89 +/- 30 x 10(10) platelets/d, 53 +/- 18 x 10(10) platelets/d (P = .0033), and 45 +/- 20 x 10(10) platelets/d (P less than .0001), respectively. Also, a slight shortening of platelet mean life-span (MLS) was observed in response to IFN treatment; platelet MLS was 7.96 +/- 0.69 days at baseline and 6.68 +/- 1.30 days (P = .012) after 6 months of IFN therapy. IFN induced a significant decrease in bone marrow megakaryocyte volume; both megakaryocyte nuclear and cytoplasmic volumes were affected. The mean megakaryocyte volume was 372 +/- 126 x 10(2) pL/microL at baseline and 278 +/- 147 x 10(2) pL/microL (P = .049) after 6 months of IFN therapy. However, the number of megakaryocytes did not show any significant change in response to IFN. It is concluded that alpha-IFN reduces the platelet production rate and the peripheral platelet count in ET mainly through an anti-proliferative action on the megakaryocytes and to a considerably lesser degree by a shortening of platelet MLS.
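
    The kinetic quantities reported above are tied together by a steady-state turnover relation, production ≈ (platelet count × blood volume) / (mean life-span × recovery); the back-of-envelope sketch below reproduces the order of magnitude of the reported baseline production rate. The blood volume (5 L) and the splenic-pooling recovery factor (0.65) are assumptions for illustration, not values from the study.

```python
# Back-of-envelope steady-state platelet turnover, using the relation
#   production ~ count * blood_volume / (mean life-span * recovery).
# Blood volume (5 L) and the recovery factor (0.65) are illustrative
# assumptions, not values from the study above.
def platelet_production_per_day(count_per_l: float,
                                mean_lifespan_days: float,
                                blood_volume_l: float = 5.0,
                                recovery: float = 0.65) -> float:
    return count_per_l * blood_volume_l / (mean_lifespan_days * recovery)

baseline = platelet_production_per_day(1102e9, 7.96)
# Crude estimate lands in the same range as the reported 89 x 10^10/day.
print(f"~{baseline / 1e10:.0f} x 10^10 platelets/day at baseline")
```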

  10. Lung Clear “Sugar” Cell Tumor and JAK2 V617F-Positive Essential Thrombocythemia: A Simple Coincidence?

    Directory of Open Access Journals (Sweden)

    Volkan Yazak

    2013-04-01

    Full Text Available The primary clear cell tumor of the lung is an extremely rare benign tumor, called "sugar tumor" because of its large glycogen content. Here we present a case of essential thrombocythemia together with a clear cell tumor of the lung, an association which, to the best of our knowledge, has not been reported before. A 44-year-old woman was admitted to the clinic complaining of lassitude lasting for 2 months. On physical examination the spleen was palpable 3 cm below the costal arch. Laboratory findings showed a platelet count of 1,014,000/mm³. A pulmonary nodule 3.5 cm in diameter was detected in the right upper lobe on chest radiography, and computed tomography (CT) of the thorax was subsequently carried out. Because the detected nodule had benign imaging features, total excision with curative and diagnostic intent was performed. Microscopically, the tumor was composed of nests of rounded or oval cells with distinct cell borders and optically clear cytoplasm; the nuclei were small. Immunohistochemically the tumor cells expressed HMB-45, NSE and focally S100 antigen. It was diagnosed as a clear "sugar" cell tumor. In conclusion, in adults with lung clear cell tumor whose thrombocytosis persists after treatment, it is important to evaluate for myeloproliferative disease.

  11. Genetic Alterations in Essential Thrombocythemia Progression to Acute Myeloid Leukemia: A Case Series and Review of the Literature

    Directory of Open Access Journals (Sweden)

    Jackline P. Ayres-Silva

    2018-02-01

    Full Text Available The genetic events associated with transformation of myeloproliferative neoplasms (MPNs) to secondary acute myeloid leukemia (sAML), particularly in the subgroup of essential thrombocythemia (ET) patients, remain incompletely understood. Deep studies using high-throughput methods might lead to a better understanding of the genetic landscape of ET patients who transformed to sAML. We performed array-based comparative genomic hybridization (aCGH) and whole exome sequencing (WES) to analyze paired samples from the ET and sAML phases. We investigated five patients with a previous history of MPN, of which four had an initial diagnosis of ET (one case harboring JAK2 p.Val617Phe and the remaining three CALR type II p.Lys385fs*47) and one was diagnosed with MPN/myelodysplastic syndrome with thrombocytosis (SF3B1 p.Lys700Glu). All were homogeneously treated with hydroxyurea but subsequently transformed to sAML (mean time of 6 years/median of 4 years to transformation). Two of them had chromosomal abnormalities, and both acquired 2p gain and 5q deletion at the sAML stage. The molecular mechanisms associated with leukemic progression in MPN patients are not clear. Our WES data showed TP53 alterations recurrently observed as mutations (missense and frameshift) and monoallelic loss. On the other hand, aCGH showed novel chromosome abnormalities (+2p and del5q) potentially associated with disease progression. The results reported here add valuable information to the still fragmented molecular basis of ET to sAML evolution. Further studies are necessary to identify the minimal deleted/amplified region and genes relevant to sAML transformation.

  12. European LeukemiaNet study on the reproducibility of bone marrow features in masked polycythemia vera and differentiation from essential thrombocythemia.

    Science.gov (United States)

    Kvasnicka, Hans Michael; Orazi, Attilio; Thiele, Juergen; Barosi, Giovanni; Bueso-Ramos, Carlos E; Vannucchi, Alessandro M; Hasserjian, Robert P; Kiladjian, Jean-Jacques; Gianelli, Umberto; Silver, Richard; Mughal, Tariq I; Barbui, Tiziano

    2017-10-01

    The purpose of the study was to assess consensus and interobserver agreement among an international panel of six hematopathologists regarding the characterization and reproducibility of bone marrow (BM) histologic features used to diagnose early stage myeloproliferative neoplasms, in particular the differentiation of so-called masked/prodromal polycythemia vera (mPV) from JAK2-mutated essential thrombocythemia (ET). The six members of the hematopathology panel evaluated 98 BM specimens independently and in a blinded fashion without knowledge of clinical data. The specimens included 48 cases of mPV according to the originally published hemoglobin threshold values for this entity (male: 16.0-18.4 g/dL, female: 15.0-16.4 g/dL), 31 cases with overt PV according to the updated 2016 WHO criteria, and 19 control cases. The latter group included cases of JAK2-mutated ET, primary myelofibrosis, myelodysplastic syndrome, and various reactive conditions. Inter-rater agreement between the panelists was very high (overall agreement 92.6%, kappa 0.812), particularly with respect to separating mPV from ET. Virtually all cases of mPV were correctly classified as PV according to their BM morphology. In conclusion, a central blinded review of histology slides by six hematopathologists demonstrated that a highly reproducible, specific histological pattern characterizes PV and confirmed the notion that there are no significant differences between mPV and overt PV in relation to BM morphology. © 2017 Wiley Periodicals, Inc.
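
    The agreement figures quoted above (92.6% overall agreement, kappa 0.812) are chance-corrected statistics; the sketch below illustrates the principle with Cohen's kappa for two raters. The study itself used six raters, for which a multi-rater statistic such as Fleiss' kappa would be computed, and the ratings below are hypothetical.

```python
# Simplified sketch of chance-corrected agreement (Cohen's kappa) between two
# raters; ratings are hypothetical diagnostic calls per specimen.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    marg_a, marg_b = Counter(rater_a), Counter(rater_b)
    categories = set(rater_a) | set(rater_b)
    expected = sum((marg_a[c] / n) * (marg_b[c] / n) for c in categories)
    return (observed - expected) / (1 - expected)

a = ["mPV", "mPV", "ET", "PV", "ET", "mPV", "control", "PV"]
b = ["mPV", "ET",  "ET", "PV", "ET", "mPV", "control", "PV"]
print(round(cohens_kappa(a, b), 2))   # -> 0.83 for these toy ratings
```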

  13. Essential Thrombocythemia

    Science.gov (United States)


  14. Impact of JAK2V617F Mutational Status on Phenotypic Features in Essential Thrombocythemia and Primary Myelofibrosis

    Directory of Open Access Journals (Sweden)

    İpek Yönal

    2016-05-01

    Full Text Available Objective: The JAK2V617F mutation is present in the majority of patients with essential thrombocythemia (ET) and primary myelofibrosis (PMF). The impact of this mutation on disease phenotype in ET and PMF is still a matter of discussion. This study aims to determine whether there are differences in clinical presentation and disease outcome between ET and PMF patients with and without the JAK2V617F mutation. Materials and Methods: In this single-center study, a total of 184 consecutive Philadelphia-negative chronic myeloproliferative neoplasms, 107 cases of ET and 77 cases of PMF, were genotyped for the JAK2V617F mutation using the JAK2 Ipsogen MutaScreen assay, which involves allele-specific polymerase chain reaction. Results: ET patients positive for the JAK2V617F mutation had higher hemoglobin (Hb) and hematocrit (Hct) levels, lower platelet counts, and more prevalent splenomegaly at diagnosis compared to patients negative for the JAK2V617F mutation, but rates of major thrombotic events, arterial thrombosis, and venous thrombosis were comparable between the groups. At presentation, PMF patients with the JAK2V617F mutation had significantly higher Hb and Hct levels and leukocyte counts than patients without the mutation. Similar to the findings in ET patients, thromboembolic rates were similar in PMF patients with and without the JAK2V617F mutation. For ET and PMF patients, no difference was observed in rates of death with respect to JAK2V617F mutational status. Moreover, the leukemic transformation rate was not different in our PMF patients with and without the JAK2V617F mutation. Conclusion: We conclude that JAK2V617F-mutated ET patients express a polycythemia vera-like phenotype and that the JAK2V617F mutation in PMF patients is associated with a more pronounced myeloproliferative phenotype.

  15. Calreticulin mutation analysis in non-mutated Janus kinase 2 essential thrombocythemia patients in Chiang Mai University: analysis of three methods and clinical correlations.

    Science.gov (United States)

    Rattarittamrong, Ekarat; Tantiworawit, Adisak; Kumpunya, Noppamas; Wongtagan, Ornkamon; Tongphung, Ratchanoo; Phusua, Arunee; Chai-Adisaksopha, Chatree; Hantrakool, Sasinee; Rattanathammethee, Thanawat; Norasetthada, Lalita; Charoenkwan, Pimlak; Lekawanvijit, Suree

    2018-03-09

    The primary objective was to determine the prevalence of calreticulin (CALR) mutation in patients with non-JAK2V617F mutated essential thrombocythemia (ET). The secondary objectives were to evaluate the accuracy of CALR mutation analysis by high-resolution melting (HRM) analysis and real-time polymerase chain reaction (PCR) compared with DNA sequencing and to compare clinical characteristics of CALR mutated and JAK2V617F mutated ET. This was a prospective cohort study involving ET patients registered at Chiang Mai University in the period September 2015-September 2017 who were aged more than 2 years and did not harbor the JAK2V617F mutation. The presence of CALR mutation was established by DNA sequencing, HRM, and real-time PCR for type 1 and type 2 mutations. Clinical data were compared with those from ET patients with mutated JAK2V617F. Twenty-eight patients were enrolled onto the study. CALR mutations were found in 10 patients (35.7%). Three patients had type 1 mutation, 5 patients had type 2 mutation, 1 patient had type 18 mutation, and 1 patient had novel mutations (c.1093 C-G, c.1098_1131 del, c.1135 G-A). HRM could differentiate between the types of mutation in complete agreement with DNA sequencing. Patients with a CALR mutation showed a significantly greater male predominance and had a higher platelet count when compared with 42 JAK2V617F patients. The prevalence of CALR mutation in JAK2V617F-negative ET in this study is 35.7%. HRM is an effective method of detecting CALR mutation and is a more advantageous method of screening for CALR mutation.

  16. Impact of JAK2V617F Mutation Burden on Disease Phenotype in Chinese Patients with JAK2V617F-positive Polycythemia Vera (PV) and Essential thrombocythemia (ET).

    Science.gov (United States)

    Zhao, Shixiang; Zhang, Xiang; Xu, Yang; Feng, Yufeng; Sheng, Wenhong; Cen, Jiannong; Wu, Depei; Han, Yue

    2016-01-01

    Most patients with polycythemia vera (PV) and half of those with essential thrombocythemia (ET) possess an activating JAK2V617F mutation. The objective of this study was to better define the effect of JAK2V617F mutant allele burden on clinical phenotypes in Chinese patients, especially thrombosis. By real-time polymerase chain reaction (RT-PCR), the JAK2V617F mutation burden was measured in 170 JAK2V617F-positive patients, including 54 with PV and 116 with ET. The results showed that the JAK2V617F allele burden was higher in PV than in ET (68.5% vs 26.7%), and ET patients showed an increased JAK2V617F allele burden in the group with higher hemoglobin (HGB above 150 g/L). In PV patients, JAK2V617F mutation burden had an influence on WBC counts, and the clinical characteristics of ET patients, such as WBC counts, hemoglobin level, splenomegaly and thrombosis, were influenced by JAK2V617F mutation burden. Male sex, high hemoglobin (HGB above 150 g/L), and increased JAK2V617F mutation burden (allele burden ≥ 16.5%) were risk factors for thrombosis in ET patients by logistic regression.
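
    The risk-factor analysis described above is a logistic regression of thrombosis on sex, hemoglobin and JAK2V617F burden; a minimal sketch of such a model is shown below. The simulated data and coefficients are placeholders, not the study's data, and the statsmodels package is assumed to be available.

```python
# Sketch of the kind of logistic-regression analysis described above
# (thrombosis ~ male sex + high HGB + high JAK2V617F burden).
# The toy data are random, purely to show the model call, not study data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
male = rng.integers(0, 2, n)
high_hgb = rng.integers(0, 2, n)        # HGB above 150 g/L (yes/no)
high_burden = rng.integers(0, 2, n)     # JAK2V617F burden >= 16.5% (yes/no)
logit = -2.0 + 0.8 * male + 0.9 * high_hgb + 1.0 * high_burden
thrombosis = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(np.column_stack([male, high_hgb, high_burden]))
fit = sm.Logit(thrombosis, X).fit(disp=0)
print(np.exp(fit.params[1:]))           # odds ratios for the three factors
```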

  17. Current opinion and consensus statement regarding the diagnosis, prognosis, and treatment of patients with essential thrombocythemia: a survey of the Spanish Group of Ph-negative Myeloproliferative Neoplasms (GEMFIN) using the Delphi method.

    Science.gov (United States)

    Besses, C; Hernández-Boluda, J C; Pérez Encinas, M; Raya, J M; Hernández-Rivas, J M; Jiménez Velasco, A; Martínez Lopez, J; Vicente, V; Burgaleta, C

    2016-04-01

    The current consensus on the diagnosis, prognosis, and treatment of essential thrombocythemia (ET) is based on experts' recommendations. However, several aspects of the diagnosis of, prognosis of, and therapy for ET are still controversial. The Delphi method was employed with an expert panel of members of the Spanish Group of Ph-negative Myeloproliferative Neoplasms in order to identify the degree of agreement on the diagnosis, prognosis, and treatment of ET. Nine leading experts selected a total of 41 clinical hematologists with well-known expertise in ET. An electronic questionnaire was used to collect the questions rated in a four-step scale. The questions were grouped into four blocks: diagnosis, risk stratification, goals of therapy, and treatment strategy. After the first round consisting of 80 questions, a second round including 14 additional questions focused on the recommendations advocated by experts of the European LeukemiaNet in 2011 was analyzed. The median and mean values for the first and second rounds were calculated. A summary of the conclusions considered as the most representative of each block of questions is presented. The Delphi method is a powerful instrument to address the current approaches and controversies surrounding ET.

  18. [Essential thrombocythemia and pregnancy].

    Science.gov (United States)

    Baleiras, Carla; Silva, Ana; Serrano, Fátima

    2003-01-01

    Essential thrombocythemia is a rare chronic myeloproliferative disease of unknown etiology, characterized by markedly elevated platelet production (> 600,000/µL). It is more frequent among women above 50 years of age and may be associated with hemorrhagic or thrombotic tendencies. The authors report a case of essential thrombocythemia diagnosed after several consecutive spontaneous abortions. Some clinical aspects, complications, differential diagnosis and management of this condition in pregnancy are also reviewed. An individualized, multidisciplinary approach and treatment with acetylsalicylic acid, associated with interferon-alpha if necessary, appear to be the best therapeutic options for these patients.

  19. Driver mutations (JAK2V617F, MPLW515L/K or CALR), pentraxin-3 and C-reactive protein in essential thrombocythemia and polycythemia vera.

    Science.gov (United States)

    Lussana, Federico; Carobbio, Alessandra; Salmoiraghi, Silvia; Guglielmelli, Paola; Vannucchi, Alessandro Maria; Bottazzi, Barbara; Leone, Roberto; Mantovani, Alberto; Barbui, Tiziano; Rambaldi, Alessandro

    2017-02-22

    The driver mutations JAK2V617F, MPLW515L/K and CALR influence the disease phenotype of myeloproliferative neoplasms (MPNs) and might sustain a condition of chronic inflammation. Pentraxin 3 (PTX3) and high-sensitivity C-reactive protein (hs-CRP) are inflammatory biomarkers potentially useful for refining the prognostic classification of MPNs. We evaluated 305 patients with essential thrombocythemia (ET) and 172 with polycythemia vera (PV), diagnosed according to the 2016 WHO criteria and with full molecular characterization for driver mutations. PTX3 levels were significantly increased in carriers of the homozygous JAK2V617F mutation compared to all the other genotypes and to triple-negative ET patients, while hs-CRP levels were independent of the mutational profile. The risk of haematological evolution and death from any cause was about 2- and 1.5-fold increased in individuals with high PTX3 levels, while the thrombosis rate tended to be lower. High hs-CRP levels were associated with risk of haematological evolution, death and also major thrombosis. After sequential adjustment for potential confounders (age, gender, diagnosis and treatments) and the presence of JAK2V617F homozygous status, high hs-CRP levels remained significant for all outcomes, while JAK2V617F homozygous status as well as treatments were the factors independently accounting for adverse outcomes among patients with high PTX3 levels. These results provide evidence that the JAK2V617F mutation influences MPN-associated inflammation, with a strong correlation between allele burden and PTX3 levels. Plasma levels of hs-CRP and PTX3 might be of prognostic value for patients with ET and PV, but their validation in future prospective studies is needed.
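
    The "sequential adjustment for potential confounders" described above is the kind of analysis usually carried out with a proportional-hazards model; a hedged sketch using the lifelines package is given below. The covariates mirror those named in the abstract, but the data are simulated placeholders, not the study's data.

```python
# Sketch of a proportional-hazards analysis of the kind described above
# (outcome ~ high hs-CRP, adjusted for age, sex, diagnosis and JAK2V617F
# homozygosity). All data below are simulated placeholders.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 300
df = pd.DataFrame({
    "high_hs_crp": rng.integers(0, 2, n),
    "age": rng.normal(62, 10, n),
    "male": rng.integers(0, 2, n),
    "diagnosis_pv": rng.integers(0, 2, n),    # 1 = PV, 0 = ET
    "jak2_homozygous": rng.integers(0, 2, n),
    "years": rng.exponential(10, n),          # follow-up time
    "event": rng.integers(0, 2, n),           # death / haematological evolution
})

cph = CoxPHFitter()
cph.fit(df, duration_col="years", event_col="event")
print(cph.summary[["exp(coef)", "p"]])        # hazard ratio per covariate
```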

  20. Clinical features of Japanese polycythemia vera and essential thrombocythemia patients harboring CALR, JAK2V617F, JAK2Ex12del, and MPLW515L/K mutations.

    Science.gov (United States)

    Okabe, Masahiro; Yamaguchi, Hiroki; Usuki, Kensuke; Kobayashi, Yutaka; Kawata, Eri; Kuroda, Junya; Kimura, Shinya; Tajika, Kenji; Gomi, Seiji; Arima, Nobuyoshi; Mori, Sinichiro; Ito, Shigeki; Koizumi, Masayuki; Ito, Yoshikazu; Wakita, Satoshi; Arai, Kunihito; Kitano, Tomoaki; Kosaka, Fumiko; Dan, Kazuo; Inokuchi, Koiti

    2016-01-01

    The risk of complication of polycythemia vera (PV) and essential thrombocythemia (ET) by thrombosis in Japanese patients is clearly lower than in western populations, suggesting that genetic background such as race may influence the clinical features. This study aimed to clarify the relationship between genetic mutations and haplotypes and clinical features in Japanese patients with PV and ET. Clinical features were assessed prospectively among 74 PV and 303 ET patients. There were no clinical differences, including JAK2V617F allele burden, between PV patients harboring the various genetic mutations. However, CALR mutation-positive ET patients had a significantly lower WBC count, Hb value, Ht value, and neutrophil alkaline phosphatase (NAP) score, and significantly more platelets, relative to JAK2V617F-positive ET patients and ET patients with no mutations. Compared to normal controls, the frequency of the JAK2 46/1 haplotype was significantly higher among patients with JAK2V617F, JAK2Ex12del, or MPL mutations, whereas no significant difference was found among CALR mutation-positive patients. CALR mutation-positive patients had a lower incidence of thrombosis relative to JAK2V617F-positive patients. Our findings suggest that JAK2V617F-positive ET patients and CALR mutation-positive patients have different mechanisms of occurrence and clinical features of ET, suggesting the potential need for therapy stratification in the future. Copyright © 2015 Elsevier Ltd. All rights reserved.

  1. Driver mutations (JAK2V617F, MPLW515L/K or CALR), pentraxin-3 and C-reactive protein in essential thrombocythemia and polycythemia vera

    Directory of Open Access Journals (Sweden)

    Federico Lussana

    2017-02-01

    Full Text Available Abstract Background The driver mutations JAK2V617F, MPLW515L/K and CALR influence the disease phenotype of myeloproliferative neoplasms (MPNs) and might sustain a condition of chronic inflammation. Pentraxin 3 (PTX3) and high-sensitivity C-reactive protein (hs-CRP) are inflammatory biomarkers potentially useful for refining the prognostic classification of MPNs. Methods We evaluated 305 patients with essential thrombocythemia (ET) and 172 with polycythemia vera (PV), diagnosed according to the 2016 WHO criteria and with full molecular characterization for driver mutations. Results PTX3 levels were significantly increased in carriers of the homozygous JAK2V617F mutation compared to all the other genotypes and to triple-negative ET patients, while hs-CRP levels were independent of the mutational profile. The risk of haematological evolution and death from any cause was about 2- and 1.5-fold increased in individuals with high PTX3 levels, while the thrombosis rate tended to be lower. High hs-CRP levels were associated with risk of haematological evolution, death and also major thrombosis. After sequential adjustment for potential confounders (age, gender, diagnosis and treatments) and the presence of JAK2V617F homozygous status, high hs-CRP levels remained significant for all outcomes, while JAK2V617F homozygous status as well as treatments were the factors independently accounting for adverse outcomes among patients with high PTX3 levels. Conclusions These results provide evidence that the JAK2V617F mutation influences MPN-associated inflammation, with a strong correlation between allele burden and PTX3 levels. Plasma levels of hs-CRP and PTX3 might be of prognostic value for patients with ET and PV, but their validation in future prospective studies is needed.

  2. Deregulation of apoptosis-related genes is associated with PRV1 overexpression and JAK2 V617F allele burden in Essential Thrombocythemia and Myelofibrosis

    Directory of Open Access Journals (Sweden)

    Tognon Raquel

    2012-02-01

    Full Text Available Abstract Background Essential Thrombocythemia (ET) and Primary Myelofibrosis (PMF) are Chronic Myeloproliferative Neoplasms (MPN) characterized by clonal myeloproliferation/myeloaccumulation without impairment of cell maturation. The JAK2 V617F mutation and PRV1 gene overexpression may contribute to MPN physiopathology. We hypothesized that deregulation of the apoptotic machinery may also play a role in the pathogenesis of ET and PMF. In this study we evaluated the apoptosis-related gene and protein expression of BCL2 family members in bone marrow CD34+ hematopoietic stem cells (HSC) and peripheral blood leukocytes from ET and PMF patients. We also tested whether the gene expression results were correlated with JAK2 V617F allele burden percentage, PRV1 overexpression, and clinical and laboratory parameters. Results By real-time PCR assay, we observed that A1, MCL1, BIK and BID, as well as A1, BCLW and BAK gene expression were increased in ET and PMF CD34+ cells, respectively, while pro-apoptotic BAX and anti-apoptotic BCL2 mRNA levels were found to be lower in ET and PMF CD34+ cells, respectively, in relation to controls. In patients' leukocytes, we detected an upregulation of the anti-apoptotic genes A1, BCL2, BCL-XL and BCLW. In contrast, pro-apoptotic BID and BIMEL expression were downregulated in ET leukocytes. Increased BCL-XL protein expression in PMF leukocytes and decreased BID protein expression in ET leukocytes were observed by Western blot. In ET leukocytes, we found a correlation between JAK2 V617F allele burden and BAX, BIK and BAD gene expression, and between A1, BAX and BIK and PRV1 gene expression. A negative correlation between PRV1 gene expression and platelet count was observed, as well as a positive correlation between PRV1 gene expression and splenomegaly. Conclusions Our results suggest the participation of the intrinsic apoptosis pathway in MPN physiopathology. In addition, PRV1 overexpression and JAK2 V617F allele burden were linked to deregulation of apoptosis-related gene expression.

  3. Screening for MPL mutations in essential thrombocythemia and primary myelofibrosis: normal Mpl expression and absence of constitutive STAT3 and STAT5 activation in MPLW515L-positive platelets.

    Science.gov (United States)

    Glembotsky, Ana C; Korin, Laura; Lev, Paola R; Chazarreta, Carlos D; Marta, Rosana F; Molinas, Felisa C; Heller, Paula G

    2010-05-01

    To evaluate the frequency of MPL W515L, W515K and S505N mutations in essential thrombocythemia (ET) and primary myelofibrosis (PMF) and to determine whether MPLW515L leads to impaired Mpl expression, constitutive STAT3 and STAT5 activation and enhanced response to thrombopoietin (TPO). Mutation detection was performed by allele-specific PCR and sequencing. Platelet Mpl expression was evaluated by flow cytometry, immunoblotting and real-time RT-PCR. Activation of STAT3 and STAT5 before and after stimulation with increasing concentrations of TPO was studied by immunoblotting. Plasma TPO was measured by ELISA. MPLW515L was detected in 1 of 100 patients with ET and 1 of 11 with PMF. Platelets from the PMF patient carried 100% mutant allele. Mpl surface and total protein expression were normal, and TPO levels were mildly increased in the MPLW515L-positive ET patient, while MPL transcripts did not differ from controls in both MPLW515L-positive patients. Constitutive STAT3 and STAT5 phosphorylation was absent and the dose response to TPO-induced phosphorylation was not enhanced. The low frequency of MPL mutations in this cohort is in agreement with previous studies. The finding of normal Mpl levels in MPLW515L-positive platelets indicates this mutation does not lead to dysregulated Mpl expression, as frequently shown for myeloproliferative neoplasms. The lack of spontaneous STAT3 and STAT5 activation and the normal response to TPO are unexpected, as MPLW515L leads to constitutive receptor activation and hypersensitivity to TPO in experimental models.

  4. Genetics Home Reference: essential thrombocythemia

    Science.gov (United States)

    ... splice donor mutation in the thrombopoietin gene causes hereditary thrombocythaemia. Nat Genet. 1998 Jan;18(1):49-52.

  5. Associations between gender, disease features and symptom burden in patients with myeloproliferative neoplasms: an analysis by the MPN QOL International Working Group.

    Science.gov (United States)

    Geyer, Holly L; Kosiorek, Heidi; Dueck, Amylou C; Scherber, Robyn; Slot, Stefanie; Zweegman, Sonja; Te Boekhorst, Peter Aw; Senyak, Zhenya; Schouten, Harry C; Sackmann, Federico; Fuentes, Ana Kerguelen; Hernández-Maraver, Dolores; Pahl, Heike L; Griesshammer, Martin; Stegelmann, Frank; Döhner, Konstanze; Lehmann, Thomas; Bonatz, Karin; Reiter, Andreas; Boyer, Francoise; Etienne, Gabriel; Ianotto, Jean-Christophe; Ranta, Dana; Roy, Lydia; Cahn, Jean-Yves; Harrison, Claire N; Radia, Deepti; Muxi, Pablo; Maldonado, Norman; Besses, Carlos; Cervantes, Francisco; Johansson, Peter L; Barbui, Tiziano; Barosi, Giovanni; Vannucchi, Alessandro M; Paoli, Chiara; Passamonti, Francesco; Andreasson, Bjorn; Ferrari, Maria L; Rambaldi, Alessandro; Samuelsson, Jan; Cannon, Keith; Birgegard, Gunnar; Xiao, Zhijian; Xu, Zefeng; Zhang, Yue; Sun, Xiujuan; Xu, Junqing; Kiladjian, Jean-Jacques; Zhang, Peihong; Gale, Robert Peter; Mesa, Ruben A

    2017-01-01

    The myeloproliferative neoplasms, including polycythemia vera, essential thrombocythemia and myelofibrosis, are distinguished by their debilitating symptom profiles, life-threatening complications and profound impact on quality of life. The role gender plays in the symptomatology of myeloproliferative neoplasms remains under-investigated. In this study we evaluated how gender relates to patients' characteristics, disease complications and overall symptom expression. A total of 2,006 patients (polycythemia vera=711, essential thrombocythemia=830, myelofibrosis=460, unknown=5) were prospectively evaluated, with patients completing the Myeloproliferative Neoplasm-Symptom Assessment Form and Brief Fatigue Inventory Patient Reported Outcome tools. Information on the individual patients' characteristics, disease complications and laboratory data was collected. Consistent with the known literature, female patients were more likely to have essential thrombocythemia (48.6% versus 33.0%). Overall, gender contributes to the heterogeneity of myeloproliferative neoplasms by influencing phenotypic profiles and symptom expression. Copyright© Ferrata Storti Foundation.

  6. Deep sequencing reveals double mutations in cis of MPL exon 10 in myeloproliferative neoplasms.

    Science.gov (United States)

    Pietra, Daniela; Brisci, Angela; Rumi, Elisa; Boggi, Sabrina; Elena, Chiara; Pietrelli, Alessandro; Bordoni, Roberta; Ferrari, Maurizio; Passamonti, Francesco; De Bellis, Gianluca; Cremonesi, Laura; Cazzola, Mario

    2011-04-01

    Somatic mutations of MPL exon 10, mainly involving a W515 substitution, have been described in JAK2 (V617F)-negative patients with essential thrombocythemia and primary myelofibrosis. We used direct sequencing and high-resolution melt analysis to identify mutations of MPL exon 10 in 570 patients with myeloproliferative neoplasms, and allele specific PCR and deep sequencing to further characterize a subset of mutated patients. Somatic mutations were detected in 33 of 221 patients (15%) with JAK2 (V617F)-negative essential thrombocythemia or primary myelofibrosis. Only one patient with essential thrombocythemia carried both JAK2 (V617F) and MPL (W515L). High-resolution melt analysis identified abnormal patterns in all the MPL mutated cases, while direct sequencing did not detect the mutant MPL in one fifth of them. In 3 cases carrying double MPL mutations, deep sequencing analysis showed identical load and location in cis of the paired lesions, indicating their simultaneous occurrence on the same chromosome.
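
    A cis call like the one described above can be read directly from deep-sequencing data: reads spanning both positions should carry either both variants or neither if the lesions sit on the same chromosome, whereas reads carrying exactly one variant support a trans configuration. The sketch below shows that read-counting logic; the read representation and the coverage threshold are illustrative assumptions.

```python
# Sketch of the read-level logic behind a cis/trans call for two nearby
# variants. Each spanning read is represented as a hypothetical tuple
# (has_variant_1, has_variant_2); the coverage threshold is arbitrary.
from collections import Counter

def phase_call(spanning_reads, min_reads=20):
    if len(spanning_reads) < min_reads:
        return "insufficient coverage"
    counts = Counter(spanning_reads)
    both = counts[(True, True)]
    only_one = counts[(True, False)] + counts[(False, True)]
    return "cis" if both > only_one else "trans"

reads = [(True, True)] * 180 + [(False, False)] * 200 + [(True, False)] * 5
print(phase_call(reads))   # -> cis
```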

  7. DEFINITION OF ACTIVATED THROMBOCYTE NUMBER WITH ANTIBODIES FOR ACTIVATED FIBRINOGEN AND P-SELECTIN IN PATIENTS WITH ESSENTIAL THROMBOCYTHEMIA AND ANTIAGGREGATION DRUG EFFECT

    Directory of Open Access Journals (Sweden)

    Samo Zver

    2004-12-01

    Full Text Available Background. Essential thrombocythemia (ET) is a chronic myeloproliferative disease with a platelet count within the range of 400–2000 × 10⁹/L. A higher percentage of platelets in the circulation of patients with ET also express activation markers on their membranes; two such markers are P-selectin and activated fibrinogen on the platelet membrane. Because of frequent thromboembolic and also bleeding-related complications, treatment of ET is mandatory. Patients whose platelet count is less than 1000 × 10⁹/L and who have not suffered any thromboembolic complication during the course of the disease are usually treated with an antiaggregation drug, acetylsalicylic acid 100 mg/day orally. Clopidogrel is an adenosine diphosphate (ADP) receptor antagonist in platelets. There are no routine clinical data about clopidogrel treatment in patients with ET, and only sporadic case reports can be found in the literature. Patients and methods. In our clinical study we compared the antiaggregation effects of acetylsalicylic acid and clopidogrel by measuring P-selectin levels and activated fibrinogen expression on platelet membranes. Thirty-five ET patients were included, aged between 21 and 78 years and with platelet counts of 451–952 × 10⁹/L. None of the patients had suffered any thromboembolic complication during the course of the disease. During sequential 14-day periods, patients received acetylsalicylic acid 100 mg/day orally, followed by clopidogrel 75 mg/day orally and, finally, acetylsalicylic acid 100 mg/day orally plus clopidogrel 75 mg/day orally together. After each fourteen-day period, the levels of P-selectin-positive and activated fibrinogen-positive platelets were determined with monoclonal antibodies on a flow cytometer. Statistical evaluation was based on the difference of average values between two small, independent paired groups using the t-test. Results. When the patients stopped with acetylsalicylic acid and
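
    The statistical comparison described above is a t-test on activation-marker levels measured under the different drug regimens; a minimal sketch is shown below. A paired test is assumed here because the drug periods were sequential in the same patients, and the percentages of P-selectin-positive platelets are invented for illustration.

```python
# Sketch of the kind of t-test comparison described above: percentage of
# P-selectin-positive platelets in the same patients under aspirin vs
# clopidogrel. All numbers are made up; a paired test is assumed because the
# drug periods were sequential in the same patients.
import numpy as np
from scipy import stats

aspirin = np.array([18.2, 22.5, 15.9, 30.1, 24.4, 19.7, 27.3, 21.0])
clopidogrel = np.array([14.1, 19.8, 13.2, 25.0, 22.9, 16.5, 23.8, 18.6])

t_stat, p_value = stats.ttest_rel(aspirin, clopidogrel)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```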

  8. Business statistics I essentials

    CERN Document Server

    Clark, Louise

    2014-01-01

    REA's Essentials provide quick and easy access to critical information in a variety of different fields, ranging from the most basic to the most advanced. As its name implies, these concise, comprehensive study guides summarize the essentials of the field covered. Essentials are helpful when preparing for exams, doing homework and will remain a lasting reference source for students, teachers, and professionals. Business Statistics I includes descriptive statistics, introduction to probability, probability distributions, sampling and sampling distributions, interval estimation, and hypothesis t

  9. A brief introduction to probability.

    Science.gov (United States)

    Di Paola, Gioacchino; Bertani, Alessandro; De Monte, Lavinia; Tuzzolino, Fabio

    2018-02-01

    The theory of probability has been debated for centuries: back in the 1600s, French mathematicians used the rules of probability to place and win bets. Subsequently, the knowledge of probability has significantly evolved and is now an essential tool for statistics. In this paper, the basic theoretical principles of probability will be reviewed, with the aim of facilitating the comprehension of statistical inference. After a brief general introduction on probability, we will review the concept of the "probability distribution", which is a function providing the probabilities of occurrence of the different possible outcomes of a categorical or continuous variable. Specific attention will be focused on the normal distribution, which is the most relevant distribution applied to statistical analysis.
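
    As a small illustration of the "probability distribution" idea discussed above, the sketch below evaluates probabilities of outcomes under a normal distribution; the mean and standard deviation are arbitrary example values.

```python
# Tiny illustration of a probability distribution: probabilities of outcomes
# under a normal distribution. Mean and standard deviation are arbitrary.
from scipy.stats import norm

mu, sigma = 100.0, 15.0
dist = norm(loc=mu, scale=sigma)
print(dist.cdf(115) - dist.cdf(85))   # P(85 <= X <= 115), about 0.68
print(1 - dist.cdf(130))              # P(X > 130), about 0.02
```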

  10. Clinical features and course of refractory anemia with ring sideroblasts associated with marked thrombocytosis

    Science.gov (United States)

    Broseus, Julien; Florensa, Lourdes; Zipperer, Esther; Schnittger, Susanne; Malcovati, Luca; Richebourg, Steven; Lippert, Eric; Cermak, Jaroslav; Evans, Jyoti; Mounier, Morgane; Raya, José Maria; Bailly, François; Gattermann, Norbert; Haferlach, Torsten; Garand, Richard; Allou, Kaoutar; Besses, Carlos; Germing, Ulrich; Haferlach, Claudia; Travaglino, Erica; Luno, Elisa; Pinan, Maria Angeles; Arenillas, Leonor; Rozman, Maria; Perez Sirvent, Maria Luz; Favre, Bernardine; Guy, Julien; Alonso, Esther; Ahwij, Nuhri; Jerez, Andrés; Hermouet, Sylvie; Maynadié, Marc; Cazzola, Mario; Girodon, François

    2012-01-01

    Background Refractory anemia with ring sideroblasts associated with marked thrombocytosis was proposed as a provisional entity in the 2001 World Health Organization classification of myeloid neoplasms and also in the 2008 version, but its existence as a single entity is contested. We wish to define the clinical features of this rare myelodysplastic/myeloproliferative neoplasm and to compare its clinical outcome with that of refractory anemia with ring sideroblasts and essential thrombocythemia. Design and Methods We conducted a collaborative retrospective study across Europe. Our database included 200 patients diagnosed with refractory anemia with ring sideroblasts and marked thrombocytosis. Each of these patients was matched for age and sex with a patient diagnosed with refractory anemia with ring sideroblasts. At the same time, a cohort of 454 patients with essential thrombocythemia was used to compare outcomes of the two diseases. Results In patients with refractory anemia with ring sideroblasts and marked thrombocytosis, depending on the Janus Kinase 2 V617F mutational status (positive or negative) or platelet threshold (over or below 600×10⁹/L), no difference in survival was noted. However, these patients had shorter overall survival and leukemia-free survival with a lower risk of thrombotic complications than did patients with essential thrombocythemia (P<0.001) but better survival (P<0.001) and a higher risk of thrombosis (P=0.039) than patients with refractory anemia with ring sideroblasts. Conclusions The clinical course of refractory anemia with ring sideroblasts and marked thrombocytosis is better than that of refractory anemia with ring sideroblasts and worse than that of essential thrombocythemia. The higher risk of thrombotic events in this disorder suggests that anti-platelet therapy might be considered in this subset of patients. From a clinical point of view, it appears to be important to consider refractory anemia with ring sideroblasts and

  11. Probability concepts in quality risk management.

    Science.gov (United States)

    Claycamp, H Gregg

    2012-01-01

    Essentially any concept of risk is built on fundamental concepts of chance, likelihood, or probability. Although risk is generally a probability of loss of something of value, given that a risk-generating event will occur or has occurred, it is ironic that the quality risk management literature and guidelines on quality risk management tools are relatively silent on the meaning and uses of "probability." The probability concept is typically applied by risk managers as a combination of frequency-based calculation and a "degree of belief" meaning of probability. Probability as a concept that is crucial for understanding and managing risk is discussed through examples from the most general, scenario-defining and ranking tools that use probability implicitly to more specific probabilistic tools in risk management. A rich history of probability in risk management applied to other fields suggests that high-quality risk management decisions benefit from the implementation of more thoughtful probability concepts in both risk modeling and risk management. Pharmaceutical manufacturers are expanding their use of quality risk management to identify and manage risks to the patient that might occur in phases of the pharmaceutical life cycle from drug development to manufacture, marketing to product discontinuation.

  12. Probability with applications and R

    CERN Document Server

    Dobrow, Robert P

    2013-01-01

    An introduction to probability at the undergraduate level Chance and randomness are encountered on a daily basis. Authored by a highly qualified professor in the field, Probability: With Applications and R delves into the theories and applications essential to obtaining a thorough understanding of probability. With real-life examples and thoughtful exercises from fields as diverse as biology, computer science, cryptology, ecology, public health, and sports, the book is accessible for a variety of readers. The book's emphasis on simulation through the use of the popular R software language c

  13. Students' Understanding of Conditional Probability on Entering University

    Science.gov (United States)

    Reaburn, Robyn

    2013-01-01

    An understanding of conditional probability is essential for students of inferential statistics as it is used in Null Hypothesis Tests. Conditional probability is also used in Bayes' theorem, in the interpretation of medical screening tests and in quality control procedures. This study examines the understanding of conditional probability of…
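
    The use of Bayes' theorem in interpreting medical screening tests, mentioned above, is easy to make concrete: the probability of disease given a positive test depends strongly on prevalence, not only on test accuracy. The sensitivity, specificity and prevalence values below are hypothetical.

```python
# Worked example of Bayes' theorem for a screening test: the positive
# predictive value depends strongly on disease prevalence. All inputs are
# hypothetical.
def positive_predictive_value(sensitivity, specificity, prevalence):
    true_pos = sensitivity * prevalence
    false_pos = (1 - specificity) * (1 - prevalence)
    return true_pos / (true_pos + false_pos)

print(positive_predictive_value(0.95, 0.95, 0.01))   # ~0.16 despite a "95%" test
print(positive_predictive_value(0.95, 0.95, 0.20))   # ~0.83 when disease is common
```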

  14. Statistics I essentials

    CERN Document Server

    Milewski, Emil G

    2012-01-01

    REA's Essentials provide quick and easy access to critical information in a variety of different fields, ranging from the most basic to the most advanced. As its name implies, these concise, comprehensive study guides summarize the essentials of the field covered. Essentials are helpful when preparing for exams, doing homework and will remain a lasting reference source for students, teachers, and professionals. Statistics I covers include frequency distributions, numerical methods of describing data, measures of variability, parameters of distributions, probability theory, and distributions.

  15. Probably not future prediction using probability and statistical inference

    CERN Document Server

    Dworsky, Lawrence N

    2008-01-01

    An engaging, entertaining, and informative introduction to probability and prediction in our everyday lives Although Probably Not deals with probability and statistics, it is not heavily mathematical and is not filled with complex derivations, proofs, and theoretical problem sets. This book unveils the world of statistics through questions such as what is known based upon the information at hand and what can be expected to happen. While learning essential concepts including "the confidence factor" and "random walks," readers will be entertained and intrigued as they move from chapter to chapter. Moreover, the author provides a foundation of basic principles to guide decision making in almost all facets of life including playing games, developing winning business strategies, and managing personal finances. Much of the book is organized around easy-to-follow examples that address common, everyday issues such as: How travel time is affected by congestion, driving speed, and traffic lights Why different gambling ...

  16. Increased gene expression of histone deacetylases in patients with Philadelphia-negative chronic myeloproliferative neoplasms

    DEFF Research Database (Denmark)

    Skov, Vibe; Larsen, Thomas Stauffer; Thomassen, Mads

    2012-01-01

    Abstract Myeloproliferation, myeloaccumulation (decreased apoptosis), inflammation, bone marrow fibrosis and angiogenesis are cardinal features of the Philadelphia-negative chronic myeloproliferative neoplasms: essential thrombocythemia (ET), polycythemia vera (PV) and primary myelofibrosis (PMF...

  17. Disease: H01612 [KEGG MEDICUS]

    Lifescience Database Archive (English)

    Full Text Available Referenced literature: ...5 update on diagnosis, risk-stratification and management. Am J Hematol 90:162-73 (2015); Pharmacological management of essential thrombocythemia. Expert Opin Pharmacother 14: ...

  18. Bone morbidity in chronic myeloproliferative neoplasms

    DEFF Research Database (Denmark)

    Farmer, Sarah; Ocias, Lukas Frans; Vestergaard, Hanne

    2015-01-01

    Patients with the classical Philadelphia chromosome-negative chronic myeloproliferative neoplasms including essential thrombocythemia, polycythemia vera and primary myelofibrosis often suffer from comorbidities, in particular, cardiovascular diseases and thrombotic events. Apparently, there is also...

  19. Probability: A Matter of Life and Death

    Science.gov (United States)

    Hassani, Mehdi; Kippen, Rebecca; Mills, Terence

    2016-01-01

    Life tables are mathematical tables that document probabilities of dying and life expectancies at different ages in a society. Thus, the life table contains some essential features of the health of a population. Probability is often regarded as a difficult branch of mathematics. Life tables provide an interesting approach to introducing concepts…
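
    The life-table idea described above can be illustrated with a few lines of arithmetic: given probabilities of dying in each age interval, a cohort is followed forward and the person-years lived are summed into a life expectancy. The interval width and the qx values below are invented for illustration.

```python
# Minimal life-table sketch: follow a cohort through hypothetical per-interval
# probabilities of dying (qx) and sum person-years lived into a life
# expectancy at birth. The qx values (one per decade) are invented.
def life_expectancy(qx, interval_years=10):
    alive = 1.0
    expectancy = 0.0
    for q in qx:
        dying = alive * q
        # Assume deaths occur, on average, halfway through the interval.
        expectancy += (alive - dying) * interval_years + dying * interval_years / 2
        alive -= dying
    return expectancy

qx = [0.01, 0.005, 0.01, 0.02, 0.04, 0.08, 0.15, 0.30, 0.60, 1.0]
print(round(life_expectancy(qx), 1), "years")   # roughly 76 years for these qx
```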

  20. Treatment Options for Chronic Myeloproliferative Neoplasms

    Science.gov (United States)

    Prognosis (chance of recovery) and treatment options for the chronic myeloproliferative neoplasms, including primary myelofibrosis and essential thrombocythemia, depend on several factors.

  1. Treatment Option Overview (Chronic Myeloproliferative Neoplasms)

    Science.gov (United States)


  2. Algebra & trigonometry II essentials

    CERN Document Server

    REA, Editors of

    2012-01-01

    REA's Essentials provide quick and easy access to critical information in a variety of different fields, ranging from the most basic to the most advanced. As its name implies, these concise, comprehensive study guides summarize the essentials of the field covered. Essentials are helpful when preparing for exams, doing homework and will remain a lasting reference source for students, teachers, and professionals. Algebra & Trigonometry II includes logarithms, sequences and series, permutations, combinations and probability, vectors, matrices, determinants and systems of equations, mathematica

  3. Safety and efficacy of combination therapy of interferon-alpha2 + JAK1-2 inhibitor in the Philadelphia-negative chronic myeloproliferative neoplasms. Preliminary results from the Danish COMBI trial - an open-label, single-arm, non-randomized multicenter phase II study

    DEFF Research Database (Denmark)

    Mikkelsen, S. U.; Kjaer, L.; Skov, V.

    2015-01-01

    Background: The Philadelphia-negative, chronic myeloproliferative neoplasms (MPN) include essential thrombocythemia (ET), polycythemia vera (PV) and primary myelofibrosis (MF) (PMF). Chronic inflammation and a deregulated immune system are considered important for clonal evolution and disease...

  4. Whole blood transcriptional profiling reveals deregulation of oxidative and antioxidative defence genes in myelofibrosis and related neoplasms. Potential implications of downregulation of Nrf2 for genomic instability and disease progression

    DEFF Research Database (Denmark)

    Hasselbalch, Hans Carl; Thomassen, Mads; Riley, Caroline Hasselbalch

    2014-01-01

    The Philadelphia-negative chronic myeloproliferative neoplasms - essential thrombocythemia (ET), polycythemia vera (PV), and myelofibrosis (MF) (MPNs) - have recently been shown to be associated with chronic inflammation, oxidative stress and accumulation of reactive oxygen species (ROS). Using...

  5. Inheritance of the chronic myeloproliferative neoplasms. A systematic review

    DEFF Research Database (Denmark)

    Ranjan, Ajenthen; Penninga, E; Jelsig, Am

    2012-01-01

    This systematic review investigated the inheritance of the classical chronic myeloproliferative neoplasms (MPNs) including polycythemia vera (PV), essential thrombocythemia (ET), primary myelofibrosis (PMF) and chronic myelogenous leukemia (CML). Sixty-one articles were included and provided 135...

  6. Molecular Profiling of Peripheral Blood Cells from Patients with Polycythemia Vera and Related Neoplasms: Identification of Deregulated Genes of Significance for Inflammation and Immune Surveillance

    DEFF Research Database (Denmark)

    Skov, Vibe; Larsen, Thomas Stauffer; Thomassen, Mads

    2012-01-01

    Essential thrombocythemia (ET), polycythemia vera (PV) and primary myelofibrosis (PMF) are haematopoietic stem cell neoplasms that may be associated with autoimmune or chronic inflammatory disorders. Earlier gene expression profiling studies have demonstrated aberrant expression of genes involved...

  7. CD177: A member of the Ly-6 gene superfamily involved with neutrophil proliferation and polycythemia vera

    Directory of Open Access Journals (Sweden)

    Bettinotti Maria

    2004-03-01

    Full Text Available Abstract Genes in the Leukocyte Antigen 6 (Ly-6) superfamily encode glycosyl-phosphatidylinositol (GPI)-anchored glycoproteins (gp) with conserved domains of 70 to 100 amino acids and 8 to 10 cysteine residues. Murine Ly-6 genes encode important lymphocyte and hematopoietic stem cell antigens. Recently, a new member of the human Ly-6 gene superfamily has been described, CD177. CD177 is polymorphic and has at least two alleles, PRV-1 and NB1. CD177 was first described as PRV-1, a gene that is overexpressed in neutrophils from approximately 95% of patients with polycythemia vera and from about half of patients with essential thrombocythemia. CD177 encodes NB1 gp, a 58–64 kD GPI gp that is expressed by neutrophils and neutrophil precursors. NB1 gp carries Human Neutrophil Antigen (HNA-2a). Investigators working to identify the gene encoding NB1 gp called the CD177 allele they described NB1. NB1 gp is unusual in that neutrophils from some healthy people lack the NB1 gp completely and in most people NB1 gp is expressed by a subpopulation of neutrophils. The function of NB1 gp and the role of CD177 in the pathogenesis and clinical course of polycythemia vera and essential thrombocythemia are not yet known. However, measuring neutrophil CD177 mRNA levels has become an important marker for diagnosing the myeloproliferative disorders polycythemia vera and essential thrombocythemia.

  8. The Danish National Chronic Myeloid Neoplasia Registry

    DEFF Research Database (Denmark)

    Bak, Marie; Ibfelt, Else Helene; Stauffer Larsen, Thomas

    2016-01-01

    departmental levels and serve as a platform for research. STUDY POPULATION: The DCMR has nationwide coverage and contains information on patients diagnosed at hematology departments from January 2010 onward, including patients with essential thrombocythemia, polycythemia vera, myelofibrosis, unclassifiable...

  9. Chronic myeloproliferative neoplasms and subsequent cancer risk: a Danish population-based cohort study

    DEFF Research Database (Denmark)

    Frederiksen, Henrik; Farkas, Dóra Körmendiné; Christiansen, Christian Fynbo

    2011-01-01

    Patients with chronic myeloproliferative neoplasms, including essential thrombocythemia (ET), polycythemia vera (PV), and chronic myeloid leukemia (CML), are at increased risk of new hematologic malignancies, but their risk of nonhematologic malignancies remains unknown. In the present study, we...

  10. Highcharts essentials

    CERN Document Server

    Shahid, Bilal

    2014-01-01

    If you are a web developer with a basic knowledge of HTML, CSS, and JavaScript and want to quickly get started with this web charting technology, this is the book for you. This book will also serve as an essential guide to those who have probably used a similar library and are now looking at migrating to Highcharts.

  11. Whole Blood Transcriptional Profiling of Interferon-Inducible Genes Identifies Highly Upregulated IFI27 in Primary Myelofibrosis

    DEFF Research Database (Denmark)

    Skov, Vibe; Larsen, Thomas Stauffer; Thomassen, Mads

    2011-01-01

    focused upon the transcriptional profiling of interferon-associated genes in patients with essential thrombocythemia (ET) (n = 19), polycythemia vera (PV) (n = 41), and primary myelofibrosis (PMF) (n = 9). Using whole-blood transcriptional profiling and accordingly obtaining an integrated signature...

  12. Whole-blood transcriptional profiling of interferon-inducible genes identifies highly upregulated IFI27 in primary myelofibrosis

    DEFF Research Database (Denmark)

    Skov, Vibe; Larsen, Thomas Stauffer; Thomassen, Mads

    2011-01-01

    focused upon the transcriptional profiling of interferon-associated genes in patients with essential thrombocythemia (ET) (n = 19), polycythemia vera (PV) (n = 41), and primary myelofibrosis (PMF) (n = 9). Using whole-blood transcriptional profiling and accordingly obtaining an integrated signature...

  13. Myeloproliferative neoplasms

    DEFF Research Database (Denmark)

    Roaldsnes, Christina; Holst, René; Frederiksen, Henrik

    2017-01-01

    BACKGROUND: Polycythemia vera (PV), essential thrombocythemia (ET) and myelofibrosis (MF) are clonal disorders collectively named as myeloproliferative neoplasms (MPN). Published data on epidemiology of MPN after the discovery of the JAK2 mutation and the 2008 WHO classifications are scarce. We...

  14. Essentialism, social constructionism, and the history of homosexuality.

    Science.gov (United States)

    Halwani, R

    1998-01-01

    Social constructionism is the view that homosexuality is not an atemporal and acultural phenomenon. Rather, homosexuality exists only within certain cultures and within certain time periods, most obviously Europe and North America after the nineteenth century. Essentialism is the view that homosexuality is an essential feature of human beings and that it could be found, in principle at least, in any culture and in any time. In this paper, I argue that the historical evidence available to us does not show that social constructionism is the correct view, and that essentialism is fully compatible with such evidence. Furthermore, I argue that the historical evidence does not even render social constructionism more probable than essentialism, i.e., both views are equally probable in the face of this evidence.

  15. Introduction to Probability, Part 1 - Basic Concepts. Student Text. Revised Edition.

    Science.gov (United States)

    Blakeslee, David W.; And Others

    This book is designed to introduce the reader to some fundamental ideas about probability. The mathematical theory of probability plays an increasingly important role in science, government, industry, business, and economics. An understanding of the basic concepts of probability is essential for the study of statistical methods that are widely…

  16. The JAK2 V617F mutation involves B- and T-lymphocyte lineages in a subgroup of patients with Philadelphia-chromosome negative chronic myeloproliferative disorders

    DEFF Research Database (Denmark)

    Larsen, Thomas Stauffer; Christensen, Jacob Haaber; Hasselbalch, Hans Carl

    2007-01-01

    The JAK2 V617F mutation is a frequent genetic event in the three classical Philadelphia-chromosome negative chronic myeloproliferative disorders (Ph(neg.)-CMPD), polycythemia vera (PV), essential thrombocythemia (ET) and idiopathic myelofibrosis (IMF). Its occurrence varies in frequency in regards...

  17. Transcriptional Profiling of Whole Blood Identifies a Unique 5-Gene Signature for Myelofibrosis and Imminent Myelofibrosis Transformation

    DEFF Research Database (Denmark)

    Hasselbalch, Hans Carl; Skov, Vibe; Stauffer Larsen, Thomas

    2014-01-01

    Identifying a distinct gene signature for myelofibrosis may yield novel information of the genes, which are responsible for progression of essential thrombocythemia and polycythemia vera towards myelofibrosis. We aimed at identifying a simple gene signature - composed of a few genes - which were...

  18. Interferon-alpha in the treatment of Philadelphia-negative chronic myeloproliferative neoplasms. Status and perspectives

    DEFF Research Database (Denmark)

    Hasselbalch, Hans Carl; Larsen, Thomas Stauffer; Riley, Caroline Hasselbalch

    2011-01-01

    The Philadelphia-negative chronic myeloproliferative neoplasms encompass essential thrombocythemia (ET), polycythemia vera (PV) and primary myelofibrosis (PMF). A major break-through in the understanding of the pathogenesis of these neoplasms occurred in 2005 by the discovery of the JAK2 V617F...

  19. Treatment and management of myelofibrosis in the era of JAK inhibitors [Corrigendum

    Directory of Open Access Journals (Sweden)

    Keohane C

    2013-10-01

    Full Text Available Keohane C, Radia DH, Harrison CN. Biologics: Targets and Therapy. 2013;7:189–198. On page 193 note that the paragraph beginning "Pacritinib (SB1518; Cell Technology, Inc, Mountain View, CA, USA) is a JAK2 and FLT3 inhibitor currently being evaluated at a dose of 400 mg daily in a Phase II study (N=34) that included patients with low platelet counts (<50 × 10⁹/L)." should have been "Pacritinib (SB1518; Cell Therapeutics, Inc, Seattle, WA, USA) is a JAK2 and FLT3 inhibitor which was evaluated at a dose of 400 mg daily in a Phase II study (N=34) that included patients with low platelet counts (<50 × 10⁹/L)." On page 193 in the same paragraph note that "Post-Essential Thrombocythemia Myelofibrosis: PERSIST." should have been "Post-Essential Thrombocythemia Myelofibrosis-1: PERSIST-1."

  20. GPS: Geometry, Probability, and Statistics

    Science.gov (United States)

    Field, Mike

    2012-01-01

    It might be said that for most occupations there is now less of a need for mathematics than there was, say, fifty years ago. But, the author argues, geometry, probability, and statistics constitute essential knowledge for everyone. Maybe not the geometry of Euclid, but certainly geometrical ways of thinking that might enable us to describe the world…

  1. Application of probability generating function to the essentials of nondestructive nuclear materials assay system using neutron correlation

    International Nuclear Information System (INIS)

    Hosoma, Takashi

    2017-01-01

    In the previous research (JAEA-Research 2015-009), the essentials of neutron multiplicity counting mathematics were reconsidered, taking into account experience obtained at the Plutonium Conversion Development Facility, and formulae of the multiplicity distribution were algebraically derived up to septuplet order using a probability generating function, as a strategic move for the future. The principle was reported by K. Böhnel in 1985, but such a high-order expansion was the first of its kind owing to its increasing complexity. In this research, characteristics of the high-order correlations were investigated. It was found that higher-order correlation increases rapidly with increasing leakage multiplication, crossing and leaving lower-order correlations behind once leakage multiplication is > 1.3, a threshold that depends on detector efficiency and counter setting. In addition, fission rates and doubles count rates by fast neutrons and by thermal neutrons in a system where both coexist were algebraically derived, again using a probability generating function. The principle was reported by I. Pázsit and L. Pál in 2012, but such a physical interpretation, i.e. associating their stochastic variables with fission rate, doubles count rate and leakage multiplication, is the first of its kind. From the Rossi-alpha combined distribution and the measured ratio of each area obtained by Differential Die-Away Self-Interrogation (DDSI) and conventional assay data, it is possible to estimate: the number of induced fissions per unit time by fast neutrons and by thermal neutrons; the number of induced fissions (< 1) by one source neutron; and the individual doubles count rates. During the research, a hypothesis introduced in their report was proved to be true. Provisional calculations were done for UO_2 of 1∼10 kgU containing ∼0.009 wt% ²⁴⁴Cm. (author)
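
    As generic background for readers unfamiliar with the probability generating function (PGF) formalism referenced above (this is standard textbook material, not the paper's own derivation): the correlated count rates used in multiplicity counting are built from factorial moments of a multiplicity distribution, which are obtained by differentiating its PGF at z = 1. In LaTeX notation:

        G(z) = \sum_{n \ge 0} p_n z^n, \qquad
        \nu_k = \left.\frac{d^k G(z)}{dz^k}\right|_{z=1} = \sum_{n \ge k} \frac{n!}{(n-k)!}\, p_n .

    Here \nu_1 is the mean multiplicity, \nu_2 governs the doubles (pair) correlations, \nu_3 the triples, and so on; the septuplet-order formulae mentioned above involve correspondingly higher factorial moments.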

  2. Are chronic myeloproliferative neoplasms associated with age-related macular degeneration?

    DEFF Research Database (Denmark)

    Bak, M.; Sorensen, T. L.; Flachs, E. M.

    2015-01-01

    population-based matched cohort study using Danish registries. We included all patients aged 18 years or older with a first listed diagnosis of MPN in the Danish National Patient Registry between 1994 and 2013. Patients with Essential Thrombocythemia (ET), Polycythemia Vera (PV), Myelofibrosis (MF), Unclassifiable...

  3. Detection of CALR and MPL Mutations in Low Allelic Burden JAK2 V617F Essential Thrombocythemia.

    Science.gov (United States)

    Usseglio, Fabrice; Beaufils, Nathalie; Calleja, Anne; Raynaud, Sophie; Gabert, Jean

    2017-01-01

    Myeloproliferative neoplasms are clonal hematopoietic stem cell disorders characterized by aberrant proliferation and an increased tendency toward leukemic transformation. The genes JAK2, MPL, and CALR are frequently altered in these syndromes, and their mutations are often a strong argument for diagnosis. We analyzed the mutational profiles of these three genes in a cohort of 164 suspected myeloproliferative neoplasms. The JAK2 V617F mutation was detected by real-time PCR, whereas high-resolution melting analysis followed by Sanger sequencing was used to search for mutations in JAK2 exon 12, CALR, and MPL. The JAK2 V617F mutation was associated with CALR (n = 4) and MPL (n = 4) mutations in 8 of 103 essential thrombocythemia patients. These cases harbored a low JAK2 V617F allelic burden, which points to the coexistence of JAK2, CALR, and MPL mutations in myeloproliferative neoplasms and suggests that CALR and MPL should be analyzed not only in JAK2-negative patients but also in patients with a low V617F mutation burden. Follow-up of these double-mutation cases will be important for determining whether this group of patients presents particular evolution or complications. Copyright © 2017 American Society for Investigative Pathology and the Association for Molecular Pathology. Published by Elsevier Inc. All rights reserved.

  4. Philadelphia-negative classical myeloproliferative neoplasms

    DEFF Research Database (Denmark)

    Barbui, T.; Barosi, G.; Birgegard, G.

    2011-01-01

    the criterion of clinical relevance. Statements were produced using a Delphi process, and two consensus conferences involving a panel of 21 experts appointed by the European LeukemiaNet (ELN) were convened. Patients with polycythemia vera (PV) and essential thrombocythemia (ET) should be defined as high risk...

  5. A new era for IFN-α in the treatment of Philadelphia-negative chronic myeloproliferative neoplasms

    DEFF Research Database (Denmark)

    Hasselbalch, H.C.

    2011-01-01

    In recent years, several studies have shown that IFN-α2 is able to induce molecular remissions with undetectable JAK2V617F in a subset of patients with essential thrombocythemia (ET) and polycythemia vera (PV), even with normalization of the bone marrow and sustained molecular remissions after...

  6. Estimation of Extreme Response and Failure Probability of Wind Turbines under Normal Operation using Probability Density Evolution Method

    DEFF Research Database (Denmark)

    Sichani, Mahdi Teimouri; Nielsen, Søren R.K.; Liu, W. F.

    2013-01-01

    Estimation of the extreme response and failure probability of structures subjected to ultimate design loads is essential for the structural design of wind turbines according to the new standard IEC 61400-1. This task is addressed in the present paper by means of the probability density evolution method (PDEM)..., which underlies the schemes of random vibration analysis and structural reliability assessment. The short-term rare failure probability of 5-megawatt wind turbines, for illustrative purposes, for given mean wind speeds and turbulence levels is investigated through the scheme of the extreme value... distribution instead of any other approximate schemes of fitted distribution currently used in statistical extrapolation techniques. Besides, comparative studies against the classical fitted distributions and the standard Monte Carlo techniques are carried out. Numerical results indicate that PDEM exhibits...

  7. Measurement uncertainty and probability

    CERN Document Server

    Willink, Robin

    2013-01-01

    A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.

  8. JAK2V617F Somatic Mutation In The General Population

    DEFF Research Database (Denmark)

    Nielsen, Camilla; Bojesen, Stig E; Nordestgaard, Børge G

    2014-01-01

    of myeloproliferative neoplasm from no disease (n=8 at re-examination) through essential thrombocythemia (n=20) and polycythemia vera (n=13) to primary myelofibrosis (n=7). Among those diagnosed with a myeloproliferative neoplasm only at re-examination in 2012, in the preceding years JAK2V617F mutation burden increased...

  9. Minimal residual disease after long-term interferon-alpha2 treatment

    DEFF Research Database (Denmark)

    Utke Rank, Cecilie; Weis Bjerrum, Ole; Larsen, Thomas Stauffer

    2016-01-01

    Essential thrombocythemia (ET) and polycythemia vera (PV) are Philadelphia chromosome-negative chronic myeloproliferative neoplasms (MPNs) characterized by the JAK2 V617F mutation, which can be found in more than 98% of PV patients and in ∼ 50% of ET patients. Assessment of the JAK2 V617F allele...

  10. Molecular diagnostics of myeloproliferative neoplasms

    DEFF Research Database (Denmark)

    Langabeer, S. E.; Andrikovics, H.; Asp, J.

    2015-01-01

    Since the discovery of the JAK2 V617F mutation in the majority of the myeloproliferative neoplasms (MPN) of polycythemia vera, essential thrombocythemia and primary myelofibrosis ten years ago, further MPN-specific mutational events, notably in JAK2 exon 12, MPL exon 10 and CALR exon 9 have been...

  11. Bone marrow histomorphology and JAK2 mutation status in essential thrombocythemia

    DEFF Research Database (Denmark)

    Stauffer Larsen, Thomas; Hasselbalch, Hans Carl; Pallisgaard, Niels

    2007-01-01

    for evaluation. 14 patients were reclassified as having prefibrotic idiopathic myelofibrosis (IMF), whilst the ET diagnosis was sustained in 19 patients. The individual bone marrow parameters of the reviewed diagnosis showed no correlation with JAK2 V617F mutation status, which was determined by a highly...

  12. Age-Related Macular Degeneration in Patients With Chronic Myeloproliferative Neoplasms

    DEFF Research Database (Denmark)

    Bak, Marie; Sørensen, Torben Lykke; Flachs, Esben Meulengracht

    2017-01-01

    , and December 31, 2013, of essential thrombocythemia, polycythemia vera, myelofibrosis, or unclassifiable MPNs. For each patient, 10 age- and sex-matched controls were included. All patients without prior AMD were followed up from the date of diagnosis (or corresponding entry date for the controls) until......, 3063 with polycythemia vera, 547 with myelofibrosis, and 1720 with unclassifiable MPNs) and 4.3 (95% CI, 4.1-4.4) for the 77445 controls, while the 10-year risk of AMD was 2.4% (95% CI, 2.1%-2.8%) for patients with MPNs and 2.3% (95% CI, 2.2%-2.4%) for the controls. The risk of AMD was increased...... overall for patients with MPNs (adjusted HR, 1.3; 95% CI, 1.1-1.5), with adjusted HRs for the subtypes of 1.2 (95% CI, 1.0-1.6) for essential thrombocythemia, 1.4 (95% CI, 1.2-1.7) for polycythemia vera, 1.7 (95% CI, 0.8-4.0) for myelofibrosis, and 1.5 (95% CI, 1.1-2.1) for unclassifiable MPNs...

  13. Focus in High School Mathematics: Statistics and Probability

    Science.gov (United States)

    National Council of Teachers of Mathematics, 2009

    2009-01-01

    Reasoning about and making sense of statistics and probability are essential to students' future success. This volume belongs to a series that supports National Council of Teachers of Mathematics' (NCTM's) "Focus in High School Mathematics: Reasoning and Sense Making" by providing additional guidance for making reasoning and sense making part of…

  14. Statistics and probability with applications for engineers and scientists

    CERN Document Server

    Gupta, Bhisham C

    2013-01-01

    Introducing the tools of statistics and probability from the ground up An understanding of statistical tools is essential for engineers and scientists who often need to deal with data analysis over the course of their work. Statistics and Probability with Applications for Engineers and Scientists walks readers through a wide range of popular statistical techniques, explaining step-by-step how to generate, analyze, and interpret data for diverse applications in engineering and the natural sciences. Unique among books of this kind, Statistics and Prob

  15. JAK2 and MPL gene mutations in V617F-negative myeloproliferative neoplasms.

    NARCIS (Netherlands)

    Siemiatkowska, A.M.; Bieniaszewska, M.; Hellmann, A.; Limon, J.

    2010-01-01

    We report three novel mutations in JAK2 exons 12, 19 and 25 in V617F-negative patients with polycythemia vera, essential thrombocythemia and idiopathic myelofibrosis. Scanning of JAK2 exons 12-25 and MPL exon 10 revealed the presence of JAK2 alterations in six and MPL W515L/K mutations in five of 34

  16. Flux continuity and probability conservation in complexified Bohmian mechanics

    International Nuclear Information System (INIS)

    Poirier, Bill

    2008-01-01

    Recent years have seen increased interest in complexified Bohmian mechanical trajectory calculations for quantum systems as both a pedagogical and computational tool. In the latter context, it is essential that trajectories satisfy probability conservation to ensure they are always guided to where they are most needed. We consider probability conservation for complexified Bohmian trajectories. The analysis relies on time-reversal symmetry considerations, leading to a generalized expression for the conjugation of wave functions of complexified variables. This in turn enables meaningful discussion of complexified flux continuity, which turns out not to be satisfied in general, though a related property is found to be true. The main conclusion, though, is that even under a weak interpretation, probability is not conserved along complex Bohmian trajectories

  17. Capturing alternative secondary structures of RNA by decomposition of base-pairing probabilities.

    Science.gov (United States)

    Hagio, Taichi; Sakuraba, Shun; Iwakiri, Junichi; Mori, Ryota; Asai, Kiyoshi

    2018-02-19

    It is known that functional RNAs often switch their functions by forming different secondary structures. Popular tools for RNA secondary structure prediction, however, predict the single 'best' structure, and do not produce alternative structures. There are bioinformatics tools to predict suboptimal structures, but it is difficult to detect which alternative secondary structures are essential. We proposed a new computational method to detect essential alternative secondary structures from RNA sequences by decomposing the base-pairing probability matrix. The decomposition is calculated by a newly implemented software tool, RintW, which efficiently computes the base-pairing probability distributions over the Hamming distance from arbitrary reference secondary structures. The proposed approach has been demonstrated on the ROSE element RNA thermometer sequence and the Lysine RNA riboswitch, showing that the proposed approach captures conformational changes in secondary structures. We have shown that alternative secondary structures are captured by decomposing base-pairing probabilities over Hamming distance. Source code is available from http://www.ncRNA.org/RintW .

  18. A unified definition of clinical resistance/intolerance to hydroxyurea in essential thrombocythemia

    DEFF Research Database (Denmark)

    Barosi, G; Besses, C; Birgegard, G

    2007-01-01

    BACKGROUND: Three main problems hamper the identification of wheat food allergens: (1) lack of a standardized procedure for extracting all of the wheat protein fractions; (2) absence of double-blind, placebo-controlled food challenge studies that compare the allergenic profile of Osborne's three ...

  19. The calreticulin (CALR) exon 9 mutations are promising targets for cancer immune therapy

    DEFF Research Database (Denmark)

    Holmström, M O; Martinenaite, E; Ahmad, S M

    2017-01-01

    The calreticulin (CALR) exon 9 mutations are found in ∼30% of patients with essential thrombocythemia and primary myelofibrosis. Recently, we reported spontaneous immune responses against the CALR mutations. Here, we describe that CALR-mutant (CALRmut)-specific T cells are able to specifically re... CALR exon 9 mutations. Leukemia advance online publication, 15 August 2017; doi:10.1038/leu.2017.214.

  20. The 2016 WHO classification and diagnostic criteria for myeloproliferative neoplasms: document summary and in-depth discussion

    OpenAIRE

    Barbui, Tiziano; Thiele, Jürgen; Gisslinger, Heinz; Kvasnicka, Hans Michael; Vannucchi, Alessandro M.; Guglielmelli, Paola; Orazi, Attilio; Tefferi, Ayalew

    2018-01-01

    The new edition of the 2016 World Health Organization (WHO) classification system for tumors of the hematopoietic and lymphoid tissues was published in September 2017. Under the category of myeloproliferative neoplasms (MPNs), the revised document includes seven subcategories: chronic myeloid leukemia, chronic neutrophilic leukemia, polycythemia vera (PV), primary myelofibrosis (PMF), essential thrombocythemia (ET), chronic eosinophilic leukemia-not otherwise specified and MPN, unclassifiable...

  1. Generalized Probability-Probability Plots

    NARCIS (Netherlands)

    Mushkudiani, N.A.; Einmahl, J.H.J.

    2004-01-01

    We introduce generalized Probability-Probability (P-P) plots in order to study the one-sample goodness-of-fit problem and the two-sample problem, for real-valued data. These plots, which are constructed by indexing with the class of closed intervals, globally preserve the properties of classical P-P
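
    The generalized construction above extends the classical one-sample P-P plot, in which the hypothesized CDF evaluated at the ordered data is plotted against uniform plotting positions. Below is a minimal sketch of the classical version only (a hypothetical standard-normal example, not the generalized plots of this record):

        import math
        import random

        def norm_cdf(x, mu=0.0, sigma=1.0):
            """CDF of a normal distribution via the error function."""
            return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

        # Simulated sample; in practice this would be the observed data
        data = sorted(random.gauss(0.0, 1.0) for _ in range(200))
        n = len(data)

        # Classical P-P points: hypothesized CDF at the ordered data vs. uniform positions
        pp_points = [((i + 0.5) / n, norm_cdf(x)) for i, x in enumerate(data)]

        # Under the hypothesized model the points should lie close to the diagonal y = x
        max_dev = max(abs(u - v) for u, v in pp_points)
        print("largest deviation from the diagonal:", max_dev)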

  2. Quantum Probabilities as Behavioral Probabilities

    Directory of Open Access Journals (Sweden)

    Vyacheslav I. Yukalov

    2017-03-01

    Full Text Available We demonstrate that behavioral probabilities of human decision makers share many common features with quantum probabilities. This does not imply that humans are some quantum objects, but just shows that the mathematics of quantum theory is applicable to the description of human decision making. The applicability of quantum rules for describing decision making is connected with the nontrivial process of making decisions in the case of composite prospects under uncertainty. Such a process involves deliberations of a decision maker when making a choice. In addition to the evaluation of the utilities of considered prospects, real decision makers also appreciate their respective attractiveness. Therefore, human choice is not based solely on the utility of prospects, but includes the necessity of resolving the utility-attraction duality. In order to justify that human consciousness really functions similarly to the rules of quantum theory, we develop an approach defining human behavioral probabilities as the probabilities determined by quantum rules. We show that quantum behavioral probabilities of humans do not merely explain qualitatively how human decisions are made, but they predict quantitative values of the behavioral probabilities. Analyzing a large set of empirical data, we find good quantitative agreement between theoretical predictions and observed experimental data.

  3. A probability model for the failure of pressure containing parts

    International Nuclear Information System (INIS)

    Thomas, H.M.

    1978-01-01

    The model provides a method of estimating the order of magnitude of the leakage failure probability of pressure containing parts. It is a fatigue-based model which makes use of the statistics available for both specimens and vessels. Some novel concepts are introduced, but essentially the model simply quantifies the obvious, i.e., that failure probability increases with increases in stress levels, number of cycles, volume of material and volume of weld metal. A further model based on fracture mechanics estimates the catastrophic fraction of leakage failures. (author)

  4. Probability Aggregates in Probability Answer Set Programming

    OpenAIRE

    Saad, Emad

    2013-01-01

    Probability answer set programming is a declarative programming approach that has been shown effective for representing and reasoning about a variety of probability reasoning tasks. However, the lack of probability aggregates, e.g. expected values, in the language of disjunctive hybrid probability logic programs (DHPP) disallows the natural and concise representation of many interesting problems. In this paper, we extend DHPP to allow arbitrary probability aggregates. We introduce two types of p...

  5. Nuclear power: accident probabilities, risks, and benefits. A bibliography

    International Nuclear Information System (INIS)

    1976-02-01

    This report is a selected listing of 396 documents pertaining to nuclear accident probability and nuclear risk. Because of the attention focused on these concepts by the recent (August 1974) publication of the draft of WASH-1400, "Reactor Safety Study," it is intended that this bibliography make conveniently available the existence of relevant literature on these concepts. Such an awareness will enhance an understanding of probability and risk as applied to nuclear power plants and is essential to their further development and/or application. This bibliography includes first a listing of the selected documents with abstracts and keywords, followed by three indexes: (1) keyword, (2) author, and (3) permuted title

  6. A Probability-based Evolutionary Algorithm with Mutations to Learn Bayesian Networks

    Directory of Open Access Journals (Sweden)

    Sho Fukuda

    2014-12-01

    Full Text Available Bayesian networks are regarded as one of the essential tools for analyzing causal relationships between events from data. Learning the structure of highly reliable Bayesian networks from data as quickly as possible is one of the important problems that several studies have tried to address. In recent years, probability-based evolutionary algorithms have been proposed as a new efficient approach to learning Bayesian networks. In this paper, we target one of the probability-based evolutionary algorithms called PBIL (Probability-Based Incremental Learning), and propose a new mutation operator. Through performance evaluation, we found that the proposed mutation operator has a good performance in learning Bayesian networks
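
    PBIL itself is a simple, generic scheme: sample candidate solutions from a probability vector, then shift that vector toward the best candidate each generation. The sketch below is a minimal generic PBIL loop in Python, with a naive mutation of the probability vector added for illustration; it is not the authors' implementation, and the one-max fitness function is only a stand-in for a real Bayesian-network scoring function.

        import random

        def pbil(fitness, n_bits, pop_size=50, generations=100,
                 learn_rate=0.1, mut_prob=0.02, mut_shift=0.05):
            """Minimal PBIL: evolve a probability vector over bit strings."""
            p = [0.5] * n_bits                      # initial probability vector
            best, best_fit = None, float("-inf")
            for _ in range(generations):
                # Sample a population of candidate bit strings from p
                pop = [[1 if random.random() < p[i] else 0 for i in range(n_bits)]
                       for _ in range(pop_size)]
                elite = max(pop, key=fitness)
                if fitness(elite) > best_fit:
                    best, best_fit = elite, fitness(elite)
                # Shift the probability vector toward the elite individual
                for i in range(n_bits):
                    p[i] = p[i] * (1 - learn_rate) + elite[i] * learn_rate
                    # Mutation: occasionally perturb the probability vector itself
                    if random.random() < mut_prob:
                        p[i] = p[i] * (1 - mut_shift) + random.choice([0, 1]) * mut_shift
            return best, best_fit

        # Toy usage: maximize the number of ones ("one-max")
        print(pbil(sum, n_bits=20))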

  7. Probability and statistics for particle physics

    CERN Document Server

    Mana, Carlos

    2017-01-01

    This book comprehensively presents the basic concepts of probability and Bayesian inference with sufficient generality to make them applicable to current problems in scientific research. The first chapter provides the fundamentals of probability theory that are essential for the analysis of random phenomena. The second chapter includes a full and pragmatic review of the Bayesian methods that constitute a natural and coherent framework with enough freedom to analyze all the information available from experimental data in a conceptually simple manner. The third chapter presents the basic Monte Carlo techniques used in scientific research, allowing a large variety of problems that are difficult to tackle by other procedures to be handled. The author also introduces a basic algorithm, which enables readers to simulate samples from simple distributions, and describes useful cases for researchers in particle physics. The final chapter is devoted to the basic ideas of Information Theory, which are important in the Bayesian me...

  8. Negative probability in the framework of combined probability

    OpenAIRE

    Burgin, Mark

    2013-01-01

    Negative probability has found diverse applications in theoretical physics. Thus, construction of sound and rigorous mathematical foundations for negative probability is important for physics. There are different axiomatizations of conventional probability. So, it is natural that negative probability also has different axiomatic frameworks. In the previous publications (Burgin, 2009; 2010), negative probability was mathematically formalized and rigorously interpreted in the context of extende...

  9. Cerebral gray matter volume losses in essential tremor: A case-control study using high resolution tissue probability maps.

    Science.gov (United States)

    Cameron, Eric; Dyke, Jonathan P; Hernandez, Nora; Louis, Elan D; Dydak, Ulrike

    2018-03-10

    Essential tremor (ET) is increasingly recognized as a multi-dimensional disorder with both motor and non-motor features. For this reason, imaging studies are more broadly examining regions outside the cerebellar motor loop. Reliable detection of cerebral gray matter (GM) atrophy requires optimized processing, adapted to high-resolution magnetic resonance imaging (MRI). We investigated cerebral GM volume loss in ET cases using automated segmentation of MRI T1-weighted images. MRI was acquired on 47 ET cases and 36 controls. Automated segmentation and voxel-wise comparisons of volume were performed using Statistical Parametric Mapping (SPM) software. To improve upon standard protocols, the high-resolution International Consortium for Brain Mapping (ICBM) 2009a atlas and tissue probability maps were used to process each subject image. Group comparisons were performed: all ET vs. controls, ET with head tremor (ETH) vs. controls, and severe ET vs. ... An analysis of variance (ANOVA) was performed between ET with and without head tremor and controls. Age, sex, and Montreal Cognitive Assessment (MoCA) score were regressed out from each comparison. We were able to consistently identify regions of cerebral GM volume loss in ET and in ET subgroups in the posterior insula, superior temporal gyri, cingulate cortex, inferior frontal gyri and other occipital and parietal regions. There were no significant increases in GM volume in ET in any comparisons with controls. This study, which uses improved methodologies, provides evidence that GM volume loss in ET is present beyond the cerebellum, and in fact, is widespread throughout the cerebrum as well. Copyright © 2018 Elsevier Ltd. All rights reserved.

  10. Applications of Algorithmic Probability to the Philosophy of Mind

    OpenAIRE

    Leuenberger, Gabriel

    2014-01-01

    This paper presents formulae that can solve various seemingly hopeless philosophical conundrums. We discuss the simulation argument, teleportation, mind-uploading, the rationality of utilitarianism, and the ethics of exploiting artificial general intelligence. Our approach arises from combining the essential ideas of formalisms such as algorithmic probability, the universal intelligence measure, space-time-embedded intelligence, and Hutter's observer localization. We argue that such universal...

  11. Improving information extraction using a probability-based approach

    DEFF Research Database (Denmark)

    Kim, S.; Ahmed, Saeema; Wallace, K.

    2007-01-01

    Information plays a crucial role during the entire life-cycle of a product. It has been shown that engineers frequently consult colleagues to obtain the information they require to solve problems. However, the industrial world is now more transient and key personnel move to other companies... or retire. It is becoming essential to retrieve vital information from archived product documents, if it is available. There is, therefore, great interest in ways of extracting relevant and sharable information from documents. A keyword-based search is commonly used, but studies have shown... To improve the recall, while maintaining the high precision, a learning approach that makes identification decisions based on a probability model, rather than simply looking up the presence of the pre-defined variations, looks promising. This paper presents the results of developing such a probability-based entity

  12. Characterization and Prognosis Significance of JAK2 (V617F), MPL, and CALR Mutations in Philadelphia-Negative Myeloproliferative Neoplasms

    OpenAIRE

    Singdong, Roongrudee; Siriboonpiputtana, Teerapong; Chareonsirisuthigul, Takol; Kongruang, Adcharee; Limsuwanachot, Nittaya; Sirirat, Tanasan; Chuncharunee, Suporn; Rerkamnuaychoke, Budsaba

    2016-01-01

    Background: The discovery of somatic acquired mutations of JAK2 (V617F) in Philadelphia-negative myeloproliferative neoplasms (Ph-negative MPNs) including polycythemia vera (PV), essential thrombocythemia (ET), and primary myelofibrosis (PMF) has not only improved rational disease classification and prognostication but also brings new understanding insight into the pathogenesis of diseases. Dosage effects of the JAK2 (V617F) allelic burden in Ph-negative MPNs may partially influence clinical ...

  13. Probability

    CERN Document Server

    Shiryaev, A N

    1996-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, ergodic theory, weak convergence of probability measures, stationary stochastic processes, and the Kalman-Bucy filter. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for self-study. This new edition contains substantial revisions and updated references. The reader will find a deeper study of topics such as the distance between probability measures, metrization of weak convergence, and contiguity of probability measures. Proofs for a number of important results which were merely stated in the first edition have been added. The author included new material on the probability of large deviations, and on the central limit theorem for sums of dependent random variables.

  14. Bounding probabilistic safety assessment probabilities by reality

    International Nuclear Information System (INIS)

    Fragola, J.R.; Shooman, M.L.

    1991-01-01

    The investigation of failure in systems where failure is a rare event makes continual comparison between the developed probabilities and empirical evidence difficult. The comparison of the predictions of rare-event risk assessments with historical reality is essential to prevent probabilistic safety assessment (PSA) predictions from drifting into fantasy. One approach to performing such comparisons is to search out and assign probabilities to natural events which, while extremely rare, have a basis in the history of natural phenomena or human activities. For example, the Segovian aqueduct and some of the Roman fortresses in Spain have existed for several millennia and in many cases show no physical signs of earthquake damage. This evidence could be used to bound the probability of earthquakes above a certain magnitude to less than 10⁻³ per year. On the other hand, there is evidence that some repetitive actions can be performed with extremely low historical probabilities when operators are properly trained and motivated, and sufficient warning indicators are provided. The point is not that low probability estimates are impossible, but that the analysis assumptions require continual reassessment and that the analysis predictions must be bounded by historical reality. This paper reviews the probabilistic predictions of PSA in this light; attempts to develop, in a general way, the limits which can be historically established and the consequent bounds that these limits place upon the predictions; and illustrates the methodology used in computing such limits. Further, the paper discusses the use of empirical evidence, the requirement for disciplined systematic approaches within the bounds of reality, and the associated impact on PSA probabilistic estimates.
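
    As a rough, back-of-the-envelope illustration of the kind of historical bound described above (the numbers are illustrative, not taken from the paper): suppose a structure has stood for roughly T = 2000 years with no sign of damage from events of a given severity, and such events are modeled as a Poisson process with annual rate λ. Observing zero events then gives an approximate 95% upper confidence bound

        P(\text{no event in } T \text{ years}) = e^{-\lambda T} \ge 0.05
        \;\Rightarrow\;
        \lambda \le \frac{-\ln 0.05}{T} \approx \frac{3}{2000} = 1.5 \times 10^{-3} \text{ per year},

    which is the same order of magnitude as the 10⁻³ per year figure quoted above.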

  15. Scaling Qualitative Probability

    OpenAIRE

    Burgin, Mark

    2017-01-01

    There are different approaches to qualitative probability, which include subjective probability. We developed a representation of qualitative probability based on relational systems, which allows modeling uncertainty by probability structures and is more coherent than existing approaches. This setting makes it possible to prove that any comparative probability is induced by some probability structure (Theorem 2.1), that classical probability is a probability structure (Theorem 2.2) and that i...

  16. Ligand-independent Thrombopoietin Mutant Receptor Requires Cell Surface Localization for Endogenous Activity*

    OpenAIRE

    Marty, Caroline; Chaligné, Ronan; Lacout, Catherine; Constantinescu, Stefan N.; Vainchenker, William; Villeval, Jean-Luc

    2009-01-01

    The activating W515L mutation in the thrombopoietin receptor (MPL) has been identified in primary myelofibrosis and essential thrombocythemia. MPL belongs to a subset of the cytokine receptor superfamily that requires the JAK2 kinase for signaling. We examined whether the ligand-independent MPLW515L mutant could signal intracellularly. Addition of the endoplasmic reticulum (ER) retention KDEL sequence to the receptor C terminus efficiently locked MPLW515L within its na...

  17. A sensitive detection method for MPLW515L or MPLW515K mutation in chronic myeloproliferative disorders with locked nucleic acid-modified probes and real-time polymerase chain reaction.

    OpenAIRE

    Alessandro, Pancrazzi; Paola, Guglielmelli; Vanessa, Ponziani; Gaetano, Bergamaschi; Alberto, Bosi; Giovanni, Barosi; Alessandro M, Vannucchi

    2008-01-01

    Acquired mutations in the juxtamembrane region of MPL (W515K or W515L), the receptor for thrombopoietin, have been described in patients with primary myelofibrosis or essential thrombocythemia, which are chronic myeloproliferative disorders. We have developed a real-time polymerase chain reaction assay for the detection and quantification of MPL mutations that is based on locked nucleic acid fluorescent probes. Mutational analysis was performed using DNA from granulocytes. Reference curves we...

  18. Essential hypertension in adolescents and children: Recent advances in causative mechanisms

    Directory of Open Access Journals (Sweden)

    Manu Raj

    2011-01-01

    Full Text Available Essential hypertension is the most common form of hypertension in adults, and it is recognized more often in adolescents than in younger children. It is well known that the probability of a diagnosis of essential hypertension increases with age from birth onward. The initiation of high blood pressure burden starts in childhood and continues through adolescence to persist in the remaining phases of life. The genesis of essential hypertension is likely to be multifactorial. Obesity, insulin resistance, activation of sympathetic nervous system, sodium homeostasis, renin-angiotensin system, vascular smooth muscle structure and reactivity, serum uric acid levels, genetic factors and fetal programming have been implicated in this disorder. In addition, erythrocyte sodium transport, the free calcium concentration in platelets and leukocytes, urine kallikrein excretion, and sympathetic nervous system receptors have also been investigated as other possible mechanisms. Obesity in children appears to be the lead contributor of essential hypertension prevalence in children and adolescents. Suggested mechanisms of obesity-related hypertension include insulin resistance, sodium retention, increased sympathetic nervous system activity, activation of renin-angiotensin-aldosterone, and altered vascular function. The etiopathogenesis of essential hypertension in children and adolescents appears to closely resemble that of adults. The minor variations seen could probably be due to the evolving nature of this condition. Many of the established mechanisms that are confirmed in adult population need to be replicated in the pediatric age group by means of definitive research for a better understanding of this condition in future.

  19. Toward a generalized probability theory: conditional probabilities

    International Nuclear Information System (INIS)

    Cassinelli, G.

    1979-01-01

    The main mathematical object of interest in the quantum logic approach to the foundations of quantum mechanics is the orthomodular lattice and a set of probability measures, or states, defined by the lattice. This mathematical structure is studied per se, independently from the intuitive or physical motivation of its definition, as a generalized probability theory. It is thought that the building-up of such a probability theory could eventually throw light on the mathematical structure of Hilbert-space quantum mechanics as a particular concrete model of the generalized theory. (Auth.)

  20. Defining the Thrombotic Risk in Patients with Myeloproliferative Neoplasms

    Directory of Open Access Journals (Sweden)

    Fabrizio Vianello

    2011-01-01

    Full Text Available Polycythemia vera (PV) and essential thrombocythemia (ET) are two Philadelphia-negative myeloproliferative neoplasms (MPN) associated with an acquired mutation in the JAK2 tyrosine kinase gene. There is a rare incidence of progression to myelofibrosis and myeloid metaplasia in both disorders, which may or may not precede transformation to acute myeloid leukemia, but thrombosis is the main cause of morbidity and mortality. The pathophysiology of thrombosis in patients with MPN is complex. Traditionally, abnormalities of platelet number and function have been claimed as the main players, but increased dynamic interactions between platelets, leukocytes, and the endothelium probably represent a fundamental interplay in generating a thrombophilic state. In addition, endothelial dysfunction, a well-known risk factor for vascular disease, may play a role in the thrombotic risk of patients with PV and ET. The identification of plasma markers translating the hemostatic imbalance in patients with PV and ET would be extremely helpful in order to define the subgroup of patients with a significant clinical risk of thrombosis.

  1. Quantum probability measures and tomographic probability densities

    NARCIS (Netherlands)

    Amosov, GG; Man'ko, [No Value

    2004-01-01

    Using a simple relation of the Dirac delta-function to the generalized theta-function, the relationship between the tomographic probability approach and the quantum probability measure approach to the description of quantum states is discussed. The quantum state tomogram expressed in terms of the

  2. Regional cerebral blood flow in childhood headache

    International Nuclear Information System (INIS)

    Roach, E.S.; Stump, D.A.

    1989-01-01

    Regional cerebral blood flow (rCBF) was measured in 16 cranial regions in 23 children and adolescents with frequent headaches using the non-invasive Xenon-133 inhalation technique. Blood flow response to 5% carbon dioxide (CO2) was also determined in 21 patients, while response to 50% oxygen was measured in the two patients with hemoglobinopathy. Included were 10 patients with a clinical diagnosis of migraine, 4 with musculoskeletal headaches, and 3 with features of both types. Also studied were 2 patients with primary thrombocythemia, 2 patients with hemoglobinopathy and headaches, 1 patient with polycythemia, and 1 with headaches following trauma. With two exceptions, rCBF determinations were done during an asymptomatic period. Baseline rCBF values tended to be higher in these young patients than in young adults studied in our laboratory. Localized reduction in the expected blood flow surge after CO2 inhalation, most often noted posteriorly, was seen in 8 of the 13 vascular headaches, but in none of the musculoskeletal headache group. Both patients with primary thrombocythemia had normal baseline flow values and altered responsiveness to CO2 similar to that seen in migraineurs; thus, the frequently reported headache and transient neurologic signs with primary thrombocythemia are probably not due to microvascular obstruction as previously suggested. These data support the concept of pediatric migraine as a disorder of vasomotor function and also add to our knowledge of normal rCBF values in younger patients. Demonstration of altered vasomotor reactivity to CO2 could prove helpful in children whose headache is atypical

  3. COVAL, Compound Probability Distribution for Function of Probability Distribution

    International Nuclear Information System (INIS)

    Astolfi, M.; Elbaz, J.

    1979-01-01

    1 - Nature of the physical problem solved: Computation of the probability distribution of a function of variables, given the probability distribution of the variables themselves. 'COVAL' has been applied to reliability analysis of a structure subject to random loads. 2 - Method of solution: Numerical transformation of probability distributions
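
    COVAL's own method of solution is a numerical transformation of the input distributions; as a quick, independent illustration of the same task (obtaining the distribution of a function of random variables), a Monte Carlo sketch is shown below. The load/strength model and its parameter values are invented for the example and are not COVAL's.

        import random
        import statistics

        def sample_margin():
            """Safety margin = strength - load, both random (illustrative distributions)."""
            strength = random.gauss(10.0, 1.0)     # e.g. component strength
            load = random.expovariate(1 / 6.0)     # e.g. random service load
            return strength - load

        samples = [sample_margin() for _ in range(100_000)]
        failure_prob = sum(m < 0 for m in samples) / len(samples)
        print("mean margin:", statistics.mean(samples))
        print("estimated probability of negative margin:", failure_prob)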

  4. On Probability Leakage

    OpenAIRE

    Briggs, William M.

    2012-01-01

    The probability leakage of model M with respect to evidence E is defined. Probability leakage is a kind of model error. It occurs when M implies that events y, which are impossible given E, have positive probability. Leakage does not imply model falsification. Models with probability leakage cannot be calibrated empirically. Regression models, which are ubiquitous in statistical practice, often evince probability leakage.

  5. Decision making generalized by a cumulative probability weighting function

    Science.gov (United States)

    dos Santos, Lindomar Soares; Destefano, Natália; Martinez, Alexandre Souto

    2018-01-01

    Typical examples of intertemporal decision making involve situations in which individuals must choose between a smaller but more immediate reward and a larger one delivered later. Analogously, probabilistic decision making involves choices between options whose consequences differ in their probability of being received. In economics, the expected utility theory (EUT) and the discounted utility theory (DUT) are traditionally accepted normative models for describing, respectively, probabilistic and intertemporal decision making. A large number of experiments confirmed that the linearity assumed by the EUT does not explain some observed behaviors, such as nonlinear preference, risk-seeking and loss aversion. That observation led to the development of new theoretical models, called non-expected utility theories (NEUT), which include a nonlinear transformation of the probability scale. An essential feature of the so-called preference function of these theories is that the probabilities are transformed by decision weights by means of a (cumulative) probability weighting function, w(p). We obtain in this article a generalized function for the probabilistic discount process. This function has as particular cases mathematical forms already consecrated in the literature, including discount models that consider effects of psychophysical perception. We also propose a new generalized function for the functional form of w. The limiting cases of this function encompass some parametric forms already proposed in the literature. Far beyond a mere generalization, our function allows the interpretation of probabilistic decision making theories based on the assumption that individuals behave similarly in the face of probabilities and delays, and is supported by phenomenological models.
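
    The generalized weighting function proposed in the record is not reproduced here. For orientation only, two parametric forms that are standard in this literature (and that such generalizations typically contain as special cases) are the Tversky-Kahneman and Prelec functions:

        w_{TK}(p) = \frac{p^{\gamma}}{\left(p^{\gamma} + (1-p)^{\gamma}\right)^{1/\gamma}}, \qquad
        w_{P}(p) = \exp\!\left(-\beta\,(-\ln p)^{\alpha}\right).

    For the parameter values commonly reported (below 1), both are inverse-S shaped, overweighting small probabilities and underweighting large ones, and both reduce to w(p) = p when γ = α = β = 1.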

  6. Quantum probabilities as Dempster-Shafer probabilities in the lattice of subspaces

    International Nuclear Information System (INIS)

    Vourdas, A.

    2014-01-01

    The orthocomplemented modular lattice of subspaces L[H(d)], of a quantum system with d-dimensional Hilbert space H(d), is considered. A generalized additivity relation which holds for Kolmogorov probabilities is violated by quantum probabilities in the full lattice L[H(d)] (it is only valid within the Boolean subalgebras of L[H(d)]). This suggests the use of more general (than Kolmogorov) probability theories, and here the Dempster-Shafer probability theory is adopted. An operator D(H1, H2), which quantifies deviations from Kolmogorov probability theory, is introduced, and it is shown to be intimately related to the commutator of the projectors P(H1), P(H2) onto the subspaces H1, H2. As an application, it is shown that the proof of the inequalities of Clauser, Horne, Shimony, and Holt for a system of two spin-1/2 particles is valid for Kolmogorov probabilities, but it is not valid for Dempster-Shafer probabilities. The violation of these inequalities in experiments supports the interpretation of quantum probabilities as Dempster-Shafer probabilities.
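
    The snippet above does not spell the additivity relation out, so for context only: in classical (Kolmogorov) probability the corresponding relation for any two events A, B is

        P(A \cup B) + P(A \cap B) = P(A) + P(B),

    and the natural analogue for subspaces replaces union and intersection with the lattice join H1 ∨ H2 and meet H1 ∧ H2; it is this kind of relation that quantum probabilities can fail to satisfy when the associated projectors do not commute.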

  7. Essentially Optimal Universally Composable Oblivious Transfer

    DEFF Research Database (Denmark)

    Damgård, Ivan Bjerre; Nielsen, Jesper Buus; Orlandi, Claudio

    2009-01-01

    Oblivious transfer is one of the most important cryptographic primitives, both for theoretical and practical reasons, and several protocols were proposed over the years. We provide the first oblivious transfer protocol which is simultaneously optimal on the following list of parameters: Security... Communication complexity: it communicates O(1) group elements to transfer one out of two group elements. The Big-O notation hides 32, meaning that the communication is probably not optimal, but is essentially optimal in that the overhead is at least constant. Our construction is based on pairings, and we assume...

  8. Probability surveys, conditional probabilities and ecological risk assessment

    Science.gov (United States)

    We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...

  9. Chronic myeloproliferative disorders: A rarest case with oral manifestations and dental management

    Directory of Open Access Journals (Sweden)

    Pritesh B Ruparelia

    2012-01-01

    Full Text Available Chronic myeloproliferative disorders (CMPD) are among the rarest hematological disorders (malignant myeloid neoplasms). The three most common chronic myeloproliferative disorders are polycythemia vera, essential thrombocythemia and chronic idiopathic myelofibrosis. Clinical manifestations (including oral manifestations) of these disorders overlap with each other and with those of other hematologic disorders, which makes the diagnosis of CMPD a challenging task. In this article we report an exceptionally rare case of CMPD presenting at the dental outpatient department, its oral manifestations and its management in dental clinics.

  10. Probability an introduction

    CERN Document Server

    Goldberg, Samuel

    1960-01-01

    Excellent basic text covers set theory, probability theory for finite sample spaces, binomial theorem, probability distributions, means, standard deviations, probability function of binomial distribution, more. Includes 360 problems with answers for half.

  11. Probability 1/e

    Science.gov (United States)

    Koo, Reginald; Jones, Martin L.

    2011-01-01

    Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.
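
    The record does not list the article's three problems; one classic example where the answer tends to 1/e is the probability that a uniformly random permutation has no fixed point (a derangement), which a short simulation confirms (the choice of n = 20 and the trial count are arbitrary):

    ```python
    import math
    import random

    def no_fixed_point(n):
        """True if a uniformly random permutation of 0..n-1 is a derangement."""
        perm = list(range(n))
        random.shuffle(perm)
        return all(perm[i] != i for i in range(n))

    n, trials = 20, 200_000
    hits = sum(no_fixed_point(n) for _ in range(trials))
    print(hits / trials, 1 / math.e)   # both close to 0.3679
    ```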

  12. Foundations of probability

    International Nuclear Information System (INIS)

    Fraassen, B.C. van

    1979-01-01

    The interpretation of probabilities in physical theories is considered, whether quantum or classical. The following points are discussed: 1) the functions P(μ, Q), in terms of which states and propositions can be represented, are classical (Kolmogorov) probabilities, formally speaking; 2) these probabilities are generally interpreted as themselves conditional, and the conditions are mutually incompatible where the observables are maximal; and 3) testing of the theory typically takes the form of confronting the expectation values of an observable Q calculated with probability measures P(μ, Q) for states μ; hence, of comparing the probabilities P(μ, Q)(E) with the frequencies of occurrence of the corresponding events. It seems that even the interpretation of quantum mechanics, in so far as it concerns what the theory says about the empirical (i.e. actual, observable) phenomena, deals with the confrontation of classical probability measures with observable frequencies. This confrontation is studied. (Auth./C.F.)

  13. Non-Archimedean Probability

    NARCIS (Netherlands)

    Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia

    We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned probability zero.

  14. Handbook of probability

    CERN Document Server

    Florescu, Ionut

    2013-01-01

    THE COMPLETE COLLECTION NECESSARY FOR A CONCRETE UNDERSTANDING OF PROBABILITY Written in a clear, accessible, and comprehensive manner, the Handbook of Probability presents the fundamentals of probability with an emphasis on the balance of theory, application, and methodology. Utilizing basic examples throughout, the handbook expertly transitions between concepts and practice to allow readers an inclusive introduction to the field of probability. The book provides a useful format with self-contained chapters, allowing the reader easy and quick reference. Each chapter includes an introduction...

  15. Probability-1

    CERN Document Server

    Shiryaev, Albert N

    2016-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.

  16. Probability Machines: Consistent Probability Estimation Using Nonparametric Learning Machines

    Science.gov (United States)

    Malley, J. D.; Kruppa, J.; Dasgupta, A.; Malley, K. G.; Ziegler, A.

    2011-01-01

    Summary Background Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. Objectives The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Methods Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Results Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Conclusions Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications. PMID:21915433
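
    A minimal sketch of the idea described above, in Python rather than the R packages the abstract points to (the data-generating model and all parameter values are made up): regress the 0/1 response with a consistent nonparametric learner, here a random forest, so that the fitted values are estimates of P(y = 1 | x).

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(0)
    X = rng.normal(size=(2000, 3))
    true_p = 1.0 / (1.0 + np.exp(-(1.5 * X[:, 0] - X[:, 1])))   # hypothetical generating model
    y = rng.binomial(1, true_p)                                  # observed binary response

    # Regression (not classification) on the 0/1 labels: predictions estimate P(y = 1 | x).
    rf = RandomForestRegressor(n_estimators=300, min_samples_leaf=25, random_state=0)
    rf.fit(X, y)
    print(np.round(rf.predict(X[:5]), 2))   # estimated probabilities
    print(np.round(true_p[:5], 2))          # true probabilities for comparison
    ```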

  17. Space Object Collision Probability via Monte Carlo on the Graphics Processing Unit

    Science.gov (United States)

    Vittaldev, Vivek; Russell, Ryan P.

    2017-09-01

    Fast and accurate collision probability computations are essential for protecting space assets. Monte Carlo (MC) simulation is the most accurate but computationally intensive method. A Graphics Processing Unit (GPU) is used to parallelize the computation and reduce the overall runtime. Using MC techniques to compute the collision probability is common in literature as the benchmark. An optimized implementation on the GPU, however, is a challenging problem and is the main focus of the current work. The MC simulation takes samples from the uncertainty distributions of the Resident Space Objects (RSOs) at any time during a time window of interest and outputs the separations at closest approach. Therefore, any uncertainty propagation method may be used and the collision probability is automatically computed as a function of RSO collision radii. Integration using a fixed time step and a quartic interpolation after every Runge Kutta step ensures that no close approaches are missed. Two orders of magnitude speedups over a serial CPU implementation are shown, and speedups improve moderately with higher fidelity dynamics. The tool makes the MC approach tractable on a single workstation, and can be used as a final product, or for verifying surrogate and analytical collision probability methods.
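
    A toy CPU-only sketch of the core Monte Carlo step (not the GPU implementation, and with straight-line relative motion standing in for the Runge-Kutta propagation and quartic interpolation described above; all numbers are hypothetical): sample the relative state uncertainty, find each sample's closest approach within the window, and count how often it falls below the combined collision radius.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n = 1_000_000
    # Hypothetical relative position (km) and velocity (km/s) uncertainties at the window start.
    r = rng.normal([1.0, 0.5, 0.2], 0.4, size=(n, 3))
    v = rng.normal([-0.01, 0.0, 0.0], 0.001, size=(n, 3))
    T = 200.0     # window length, s
    R = 0.05      # combined collision radius, km

    # Closest approach of r + v*t on [0, T] for straight-line motion.
    t_star = np.clip(-np.einsum('ij,ij->i', r, v) / np.einsum('ij,ij->i', v, v), 0.0, T)
    closest = np.linalg.norm(r + v * t_star[:, None], axis=1)
    print("estimated collision probability:", np.mean(closest < R))
    ```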

  18. Information dimension analysis of bacterial essential and nonessential genes based on chaos game representation

    International Nuclear Information System (INIS)

    Zhou, Qian; Yu, Yong-ming

    2014-01-01

    Essential genes are indispensable for the survival of an organism. Investigating and selecting features associated with gene essentiality is therefore fundamental to predicting and identifying essential genes with computational techniques. We use fractal theory to make a comparative analysis of essential and nonessential genes in bacteria. The information dimensions of the essential and nonessential genes available in the DEG database for 27 bacteria are calculated based on their gene chaos game representations (CGRs). It is found that a weak positive linear correlation exists between information dimension and gene length. Moreover, for genes of similar length, the average information dimension of essential genes is larger than that of nonessential genes. This indicates that essential genes show less regularity and higher complexity than nonessential genes. Our results show that for a bacterium with similar numbers of essential and nonessential genes, the CGR information dimension is helpful for the classification of essential and nonessential genes. Therefore, the gene CGR information dimension is very probably a useful gene feature for a genetic algorithm predicting essential genes. (paper)
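
    A rough sketch of the two ingredients the abstract combines, on a synthetic sequence (the box-counting estimator below is a generic information-dimension estimate; the paper's exact procedure and scale range may differ): build the chaos game representation and fit the slope of sum(p log p) against log(box size).

    ```python
    import numpy as np

    CORNERS = {'A': (0.0, 0.0), 'C': (0.0, 1.0), 'G': (1.0, 1.0), 'T': (1.0, 0.0)}

    def cgr_points(seq):
        """Chaos game representation: step halfway toward the corner of each successive base."""
        pts, xy = [], np.array([0.5, 0.5])
        for base in seq:
            xy = (xy + np.array(CORNERS[base])) / 2.0
            pts.append(xy.copy())
        return np.array(pts)

    def information_dimension(points, grids=(4, 8, 16, 32, 64)):
        """Slope of sum(p*log p) versus log(box size) over a range of grid resolutions."""
        xs, ys = [], []
        for k in grids:
            idx = np.floor(points * k).astype(int).clip(0, k - 1)
            _, counts = np.unique(idx[:, 0] * k + idx[:, 1], return_counts=True)
            p = counts / counts.sum()
            xs.append(np.log(1.0 / k))
            ys.append(np.sum(p * np.log(p)))
        return np.polyfit(xs, ys, 1)[0]

    rng = np.random.default_rng(0)
    seq = ''.join(rng.choice(list('ACGT'), size=20_000))   # stand-in for a real gene sequence
    print(information_dimension(cgr_points(seq)))          # close to 2 for a random sequence
    ```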

  19. Janus-faced probability

    CERN Document Server

    Rocchi, Paolo

    2014-01-01

    The problem of probability interpretation was long overlooked before exploding in the 20th century, when the frequentist and subjectivist schools formalized two conflicting conceptions of probability. Beyond the radical followers of the two schools, a circle of pluralist thinkers tends to reconcile the opposing concepts. The author uses two theorems in order to prove that the various interpretations of probability do not come into opposition and can be used in different contexts. The goal here is to clarify the multifold nature of probability by means of a purely mathematical approach and to show how philosophical arguments can only serve to deepen actual intellectual contrasts. The book can be considered as one of the most important contributions in the analysis of probability interpretation in the last 10-15 years.

  20. Ruin probabilities

    DEFF Research Database (Denmark)

    Asmussen, Søren; Albrecher, Hansjörg

    The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle and the connection to other applied probability areas, like queueing theory. In this substantially updated and extended second version, new topics include stochastic control, fluctuation theory for Levy processes, Gerber–Shiu functions and dependence.
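
    A small simulation sketch of the classical (Cramér-Lundberg) compound Poisson model the book starts from, estimating a finite-horizon ruin probability by brute force (premium rate, claim distribution and horizon below are arbitrary illustrative choices, not examples from the book):

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    def finite_horizon_ruin_prob(u=10.0, c=1.2, lam=1.0, horizon=100.0, n_paths=20_000):
        """P(ruin before `horizon`) for reserve R(t) = u + c*t - (claims by time t).

        Claims arrive as a Poisson process with rate lam and have Exp(1) sizes; ruin can
        only occur immediately after a claim, so it suffices to check the reserve there."""
        ruined = 0
        for _ in range(n_paths):
            t, claims = 0.0, 0.0
            while True:
                t += rng.exponential(1.0 / lam)     # next claim arrival
                if t > horizon:
                    break
                claims += rng.exponential(1.0)      # claim size
                if u + c * t - claims < 0.0:
                    ruined += 1
                    break
        return ruined / n_paths

    print(finite_horizon_ruin_prob())
    ```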

  1. Interpretations of probability

    CERN Document Server

    Khrennikov, Andrei

    2009-01-01

    This is the first fundamental book devoted to non-Kolmogorov probability models. It provides a mathematical theory of negative probabilities, with numerous applications to quantum physics, information theory, complexity, biology and psychology. The book also presents an interesting model of cognitive information reality with flows of information probabilities, describing the process of thinking, social, and psychological phenomena.

  2. A fast algorithm for estimating transmission probabilities in QTL detection designs with dense maps

    Directory of Open Access Journals (Sweden)

    Gilbert Hélène

    2009-11-01

    Full Text Available Abstract Background: In the case of an autosomal locus, four transmission events from the parents to the progeny are possible, specified by the grand-parental origin of the alleles inherited by this individual. Computing the probabilities of these transmission events is essential to perform QTL detection methods. Results: A fast algorithm for the estimation of these probabilities conditional on parental phases has been developed. It is adapted to classical QTL detection designs applied to outbred populations, in particular to designs composed of half- and/or full-sib families. It assumes the absence of interference. Conclusion: The theory is fully developed and an example is given.

  3. Splenomegaly in myelofibrosis—new options for therapy and the therapeutic potential of Janus kinase 2 inhibitors

    Directory of Open Access Journals (Sweden)

    Randhawa Jasleen

    2012-08-01

    Full Text Available Abstract Splenomegaly is a common sign of primary myelofibrosis (PMF), post-polycythemia vera myelofibrosis (post-PV MF), and post-essential thrombocythemia myelofibrosis (post-ET MF) that is associated with bothersome symptoms, which have a significant negative impact on patients’ quality of life. It may also be present in patients with advanced polycythemia vera (PV) or essential thrombocythemia (ET). Until recently, none of the therapies used to treat MF were particularly effective in reducing splenomegaly. The discovery of an activating Janus kinase 2 (JAK2) mutation (JAK2V617F) that is present in almost all patients with PV and in about 50-60 % of patients with ET and PMF led to the initiation of several trials investigating the clinical effectiveness of various JAK2 (or JAK1/JAK2) inhibitors for the treatment of patients with ET, PV, and MF. Some of these trials have documented significant clinical benefit of JAK inhibitors, particularly in terms of regression of splenomegaly. In November 2011, the US Food and Drug Administration approved the use of the JAK1- and JAK2-selective inhibitor ruxolitinib for the treatment of patients with intermediate or high-risk myelofibrosis, including PMF, post-PV MF, and post-ET MF. This review discusses current therapeutic options for splenomegaly associated with primary or secondary MF and the treatment potential of the JAK inhibitors in this setting.

  4. Prostate Cancer Probability Prediction By Machine Learning Technique.

    Science.gov (United States)

    Jović, Srđan; Miljković, Milica; Ivanović, Miljan; Šaranović, Milena; Arsić, Milena

    2017-11-26

    The main goal of the study was to explore the possibility of prostate cancer prediction by machine learning techniques. In order to improve the survival probability of prostate cancer patients it is essential to build suitable prediction models for prostate cancer. If one makes a relevant prediction of prostate cancer, it is easy to create a suitable treatment based on the prediction results. Machine learning techniques are the most common techniques for the creation of predictive models. Therefore, in this study several machine learning techniques were applied and compared. The obtained results were analyzed and discussed. It was concluded that machine learning techniques could be used for the relevant prediction of prostate cancer.

  5. Probability for Weather and Climate

    Science.gov (United States)

    Smith, L. A.

    2013-12-01

    Over the last 60 years, the availability of large-scale electronic computers has stimulated rapid and significant advances both in meteorology and in our understanding of the Earth System as a whole. The speed of these advances was due, in large part, to the sudden ability to explore nonlinear systems of equations. The computer allows the meteorologist to carry a physical argument to its conclusion; the time scales of weather phenomena then allow the refinement of physical theory, numerical approximation or both in light of new observations. Prior to this extension, as Charney noted, the practicing meteorologist could ignore the results of theory with good conscience. Today, neither the practicing meteorologist nor the practicing climatologist can do so, but to what extent, and in what contexts, should they place the insights of theory above quantitative simulation? And in what circumstances can one confidently estimate the probability of events in the world from model-based simulations? Despite solid advances of theory and insight made possible by the computer, the fidelity of our models of climate differs in kind from the fidelity of models of weather. While all prediction is extrapolation in time, weather resembles interpolation in state space, while climate change is fundamentally an extrapolation. The trichotomy of simulation, observation and theory which has proven essential in meteorology will remain incomplete in climate science. Operationally, the roles of probability, indeed the kinds of probability one has access to, are different in operational weather forecasting and climate services. Significant barriers to forming probability forecasts (which can be used rationally as probabilities) are identified. Monte Carlo ensembles can explore sensitivity, diversity, and (sometimes) the likely impact of measurement uncertainty and structural model error. The aims of different ensemble strategies, and fundamental differences in ensemble design to support...

  6. Improving Ranking Using Quantum Probability

    OpenAIRE

    Melucci, Massimo

    2011-01-01

    The paper shows that ranking information units by quantum probability differs from ranking them by classical probability, provided the same data are used for parameter estimation. As probability of detection (also known as recall or power) and probability of false alarm (also known as fallout or size) measure the quality of ranking, we point out and show that ranking by quantum probability yields higher probability of detection than ranking by classical probability provided a given probability of ...

  7. Evidence-Based Medicine as a Tool for Undergraduate Probability and Statistics Education.

    Science.gov (United States)

    Masel, J; Humphrey, P T; Blackburn, B; Levine, J A

    2015-01-01

    Most students have difficulty reasoning about chance events, and misconceptions regarding probability can persist or even strengthen following traditional instruction. Many biostatistics classes sidestep this problem by prioritizing exploratory data analysis over probability. However, probability itself, in addition to statistics, is essential both to the biology curriculum and to informed decision making in daily life. One area in which probability is particularly important is medicine. Given the preponderance of pre-health students, in addition to more general interest in medicine, we capitalized on students' intrinsic motivation in this area to teach both probability and statistics. We use the randomized controlled trial as the centerpiece of the course, because it exemplifies the most salient features of the scientific method, and the application of critical thinking to medicine. The other two pillars of the course are biomedical applications of Bayes' theorem and science and society content. Backward design from these three overarching aims was used to select appropriate probability and statistics content, with a focus on eliciting and countering previously documented misconceptions in their medical context. Pretest/posttest assessments using the Quantitative Reasoning Quotient and Attitudes Toward Statistics instruments are positive, bucking several negative trends previously reported in statistics education. © 2015 J. Masel et al. CBE—Life Sciences Education © 2015 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).
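
    One example of the kind of biomedical Bayes' theorem application such a course can use (the record does not give the actual exercises; the test characteristics below are hypothetical): the post-test probability of disease after a positive screening result.

    ```python
    def positive_predictive_value(prevalence, sensitivity, specificity):
        """P(disease | positive test) by Bayes' theorem."""
        p_positive = sensitivity * prevalence + (1 - specificity) * (1 - prevalence)
        return sensitivity * prevalence / p_positive

    # Hypothetical screening test: 1% prevalence, 90% sensitivity, 95% specificity.
    print(round(positive_predictive_value(0.01, 0.90, 0.95), 3))   # ~0.154, despite the "accurate" test
    ```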

  8. Calculating the Probability of Returning a Loan with Binary Probability Models

    Directory of Open Access Journals (Sweden)

    Julian Vasilev

    2014-12-01

    Full Text Available The purpose of this article is to give a new approach to calculating the probability of returning a loan. A lot of factors affect the value of this probability. In this article, by using statistical and econometric models, some influencing factors are proved. The main approach is concerned with applying probit and logit models in loan management institutions. A new aspect of credit risk analysis is given. Calculating the probability of returning a loan is a difficult task. We assume that specific data fields concerning the contract (month of signing, year of signing, given sum) and data fields concerning the borrower of the loan (month of birth, year of birth (age), gender, region where he/she lives) may be independent variables in a binary logistic model with the dependent variable “the probability of returning a loan”. It is proved that the month of signing a contract, the year of signing a contract, the gender and the age of the loan owner do not affect the probability of returning a loan. It is proved that the probability of returning a loan depends on the sum of the contract, the remoteness of the loan owner and the month of birth. The probability of returning a loan increases with the increase of the given sum, decreases with the proximity of the customer, increases for people born at the beginning of the year and decreases for people born at the end of the year.
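
    A minimal sketch of such a binary model on synthetic data (the field names mirror the abstract, but the data-generating coefficients are invented to echo the reported directions of effect; a probit or a statsmodels logit would work equally well):

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(3)
    n = 5_000
    loan_sum    = rng.uniform(1_000, 20_000, n)
    remoteness  = rng.uniform(0, 300, n)        # distance of the borrower from the lender, km
    birth_month = rng.integers(1, 13, n)

    # Hypothetical generating process: repayment probability rises with the sum and the
    # remoteness and falls for borrowers born later in the year.
    latent = -1.0 + 0.00015 * loan_sum + 0.004 * remoteness - 0.05 * birth_month
    y = rng.binomial(1, 1.0 / (1.0 + np.exp(-latent)))

    X = np.column_stack([loan_sum, remoteness, birth_month])
    model = LogisticRegression(max_iter=1000).fit(X, y)
    print(model.coef_, model.intercept_)
    print("P(return a loan):", model.predict_proba([[10_000, 150, 6]])[0, 1])
    ```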

  9. Conceptual and Statistical Issues Regarding the Probability of Default and Modeling Default Risk

    Directory of Open Access Journals (Sweden)

    Emilia TITAN

    2011-03-01

    Full Text Available In today’s rapidly evolving financial markets, risk management offers different techniques in order to implement an efficient system against market risk. Probability of default (PD) is an essential part of business intelligence and customer relationship management systems in financial institutions. Recent studies indicate that underestimating this important component, and also the loss given default (LGD), might threaten the stability and smooth running of the financial markets. From the perspective of risk management, the predictive accuracy of the estimated probability of default is more valuable than the standard binary classification: credible or non-credible clients. The Basel II Accord recognizes the methods of reducing credit risk and also PD and LGD as important components of the advanced Internal Ratings-Based (IRB) approach.
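
    How the two quantities combine is easiest to see in the expected-loss identity used throughout the IRB framework (the numbers below are purely illustrative):

    ```python
    def expected_loss(pd, lgd, ead):
        """Expected loss = probability of default x loss given default x exposure at default."""
        return pd * lgd * ead

    # Hypothetical retail exposure: 2% one-year PD, 45% LGD, 10,000 EUR exposure at default.
    print(expected_loss(0.02, 0.45, 10_000))   # 90.0 EUR expected loss
    ```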

  10. On the Possibility of Assigning Probabilities to Singular Cases, or: Probability Is Subjective Too!

    Directory of Open Access Journals (Sweden)

    Mark R. Crovelli

    2009-06-01

    Full Text Available Both Ludwig von Mises and Richard von Mises claimed that numerical probability could not be legitimately applied to singular cases. This paper challenges this aspect of the von Mises brothers’ theory of probability. It is argued that their denial that numerical probability could be applied to singular cases was based solely upon Richard von Mises’ exceptionally restrictive definition of probability. This paper challenges Richard von Mises’ definition of probability by arguing that the definition of probability necessarily depends upon whether the world is governed by time-invariant causal laws. It is argued that if the world is governed by time-invariant causal laws, a subjective definition of probability must be adopted. It is further argued that the nature of human action and the relative frequency method for calculating numerical probabilities both presuppose that the world is indeed governed by time-invariant causal laws. It is finally argued that the subjective definition of probability undercuts the von Mises claim that numerical probability cannot legitimately be applied to singular, non-replicable cases.

  11. Estimating the Probabilities of Default for Callable Bonds: A Duffie-Singleton Approach

    OpenAIRE

    David Wang

    2005-01-01

    This paper presents a model for estimating the default risks implicit in the prices of callable corporate bonds. The model considers three essential ingredients in the pricing of callable corporate bonds: stochastic interest rate, default risk, and call provision. The stochastic interest rate is modeled as a square-root diffusion process. The default risk is modeled as a constant spread, with the magnitude of this spread impacting the probability of a Poisson process governing the arrival of ...
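
    Of the three ingredients, the square-root diffusion for the short rate is the easiest to sketch (a plain Euler scheme with full truncation; all parameter values are hypothetical, and the default-intensity and call components are not shown):

    ```python
    import numpy as np

    def simulate_square_root_rate(r0=0.03, kappa=0.5, theta=0.04, sigma=0.1,
                                  T=5.0, steps=500, n_paths=10_000, seed=0):
        """Euler simulation of dr = kappa*(theta - r) dt + sigma*sqrt(r) dW."""
        rng = np.random.default_rng(seed)
        dt = T / steps
        r = np.full(n_paths, r0)
        for _ in range(steps):
            dw = rng.normal(0.0, np.sqrt(dt), n_paths)
            r = r + kappa * (theta - r) * dt + sigma * np.sqrt(np.maximum(r, 0.0)) * dw
        return r

    print("mean short rate at T:", simulate_square_root_rate().mean())   # drifts toward theta
    ```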

  12. VHA-19 is essential in Caenorhabditis elegans oocytes for embryogenesis and is involved in trafficking in oocytes.

    Directory of Open Access Journals (Sweden)

    Alison J Knight

    Full Text Available There is an urgent need to develop new drugs against parasitic nematodes, which are a significant burden on human health and agriculture. Information about the function of essential nematode-specific genes provides insight to key nematode-specific processes that could be targeted with drugs. We have characterized the function of a novel, nematode-specific Caenorhabditis elegans protein, VHA-19, and show that VHA-19 is essential in the germline and, specifically, the oocytes, for the completion of embryogenesis. VHA-19 is also involved in trafficking the oocyte receptor RME-2 to the oocyte plasma membrane and is essential for osmoregulation in the embryo, probably because VHA-19 is required for proper eggshell formation via exocytosis of cortical granules or other essential components of the eggshell. VHA-19 may also have a role in cytokinesis, either directly or as an indirect effect of its role in osmoregulation. Critically, VHA-19 is expressed in the excretory cell in both larvae and adults, suggesting that it may have a role in osmoregulation in C. elegans more generally, probably in trafficking or secretion pathways. This is the first time a role for VHA-19 has been described.

  13. Quantifying seining detection probability for fishes of Great Plains sand‐bed rivers

    Science.gov (United States)

    Mollenhauer, Robert; Logue, Daniel R.; Brewer, Shannon K.

    2018-01-01

    Species detection error (i.e., imperfect and variable detection probability) is an essential consideration when investigators map distributions and interpret habitat associations. When fish detection error that is due to highly variable instream environments needs to be addressed, sand‐bed streams of the Great Plains represent a unique challenge. We quantified seining detection probability for diminutive Great Plains fishes across a range of sampling conditions in two sand‐bed rivers in Oklahoma. Imperfect detection resulted in underestimates of species occurrence using naïve estimates, particularly for less common fishes. Seining detection probability also varied among fishes and across sampling conditions. We observed a quadratic relationship between water depth and detection probability, in which the exact nature of the relationship was species‐specific and dependent on water clarity. Similarly, the direction of the relationship between water clarity and detection probability was species‐specific and dependent on differences in water depth. The relationship between water temperature and detection probability was also species dependent, where both the magnitude and direction of the relationship varied among fishes. We showed how ignoring detection error confounded an underlying relationship between species occurrence and water depth. Despite imperfect and heterogeneous detection, our results support that determining species absence can be accomplished with two to six spatially replicated seine hauls per 200‐m reach under average sampling conditions; however, required effort would be higher under certain conditions. Detection probability was low for the Arkansas River Shiner Notropis girardi, which is federally listed as threatened, and more than 10 seine hauls per 200‐m reach would be required to assess presence across sampling conditions. Our model allows scientists to estimate sampling effort to confidently assess species occurrence, which
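
    The effort recommendations above follow from cumulative detection probability across replicated hauls; a sketch assuming a constant, independent per-haul detection probability p (the paper models p as a function of depth, clarity and temperature, so the values below are only illustrative):

    ```python
    import math

    def hauls_needed(p_per_haul, confidence=0.95):
        """Smallest number of seine hauls with P(detect at least once) >= confidence,
        assuming independent hauls with constant per-haul detection probability."""
        return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - p_per_haul))

    for p in (0.6, 0.4, 0.25, 0.1):
        print(f"per-haul p = {p}: {hauls_needed(p)} hauls for 95% cumulative detection")
    ```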

  14. Probability in physics

    CERN Document Server

    Hemmo, Meir

    2012-01-01

    What is the role and meaning of probability in physical theory, in particular in two of the most successful theories of our age, quantum physics and statistical mechanics? Laws once conceived as universal and deterministic, such as Newton‘s laws of motion, or the second law of thermodynamics, are replaced in these theories by inherently probabilistic laws. This collection of essays by some of the world‘s foremost experts presents an in-depth analysis of the meaning of probability in contemporary physics. Among the questions addressed are: How are probabilities defined? Are they objective or subjective? What is their  explanatory value? What are the differences between quantum and classical probabilities? The result is an informative and thought-provoking book for the scientifically inquisitive. 

  15. JAK2 V617F, MPL, and CALR mutations in essential thrombocythaemia and major thrombotic complications: a single-institute retrospective analysis.

    Science.gov (United States)

    Pósfai, Éva; Marton, Imelda; Király, Péter Attila; Kotosz, Balázs; Kiss-László, Zsuzsanna; Széll, Márta; Borbényi, Zita

    2015-07-01

    Thrombo-haemorrhagic events are the main cause of morbidity and mortality in essential thrombocythemia. The aim of this study was to estimate the incidence of thrombotic events and the impact of the JAK2V617F, MPL (W515L, W515K, W515R, W515A and S505N) and CALR (type-1, type-2) mutations on 101 essential thrombocythaemia patients (72 females and 29 males with a mean age of 61 years) diagnosed in a Southern Hungarian regional academic centre. The incidence of major thrombosis was 13.86 %. Sixty percent of the patients carried the JAK2V617F mutation. The MPL mutations were analysed by sequencing and the W515L was the only one we could identify with an incidence of 3.96 %. Type-2 CALR mutation could be identified in 3 cases among the patients who had JAK2/MPL-unmutated ET. Statistical analyses revealed that the JAK2V617F mutation was associated with significantly increased levels of platelet (p = 0.042), haemoglobin (p = 0.000), red blood cell (p = 0.000) and haematocrit (p = 0.000) and hepatomegaly (p = 0.045) at diagnosis compared to JAK2V617F negative counterparts, however there was no significant association between the JAK2V617F mutation status (relative risk: 1.297, 95 % CI 0.395-4.258; p = 0.668) and subsequent thrombotic complications. The impact of JAK2V617F, MPL W515L and CALR mutations on the clinical findings at the diagnosis of ET was obvious, but their statistically significant role in the prediction of thrombotic events could not be proven in this study. Our results indirectly support the concept that, besides the quantitative and qualitative changes in the platelets, the mechanisms leading to thrombosis are more complex and multifactorial.

  16. Philosophical theories of probability

    CERN Document Server

    Gillies, Donald

    2000-01-01

    The Twentieth Century has seen a dramatic rise in the use of probability and statistics in almost all fields of research. This has stimulated many new philosophical ideas on probability. Philosophical Theories of Probability is the first book to present a clear, comprehensive and systematic account of these various theories and to explain how they relate to one another. Gillies also offers a distinctive version of the propensity theory of probability, and the intersubjective interpretation, which develops the subjective theory.

  17. Multifractals embedded in short time series: An unbiased estimation of probability moment

    Science.gov (United States)

    Qiu, Lu; Yang, Tianguang; Yin, Yanhua; Gu, Changgui; Yang, Huijie

    2016-12-01

    An exact estimation of probability moments is the basis for several essential concepts, such as the multifractals, the Tsallis entropy, and the transfer entropy. By means of approximation theory we propose a new method called factorial-moment-based estimation of probability moments. Theoretical prediction and computational results show that it can provide an unbiased estimation of the probability moments of continuous order. Calculations on a probability redistribution model verify that it can extract multifractal behaviors exactly from several hundred recordings. Its power in monitoring the evolution of scaling behaviors is exemplified by two empirical cases, i.e., the gait time series for fast, normal, and slow trials of a healthy volunteer, and the closing price series for the Shanghai stock market. Using short time series with lengths of several hundred, a comparison with the well-established tools displays significant advantages of its performance over the other methods. The factorial-moment-based estimation can correctly evaluate the scaling behaviors in a scale range about three generations wider than the multifractal detrended fluctuation analysis and the basic estimation. The estimation of the partition function given by the wavelet transform modulus maxima has unacceptable fluctuations. Besides the scaling invariance focused on in the present paper, the proposed factorial moment of continuous order can find various uses, such as finding nonextensive behaviors of a complex system and reconstructing the causality relationship network between elements of a complex system.
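
    For orientation, the quantity being estimated is the probability moment sum(p_i^q) and its scaling exponent; on a binomial "probability redistribution" cascade this can be checked against a closed form (the sketch below uses the plain, basic estimator on exact box probabilities, not the factorial-moment estimator proposed in the paper):

    ```python
    import numpy as np

    def binomial_cascade(m=0.3, levels=12):
        """Box probabilities of a binomial multiplicative cascade: each box splits in two
        with weights m and 1 - m."""
        p = np.array([1.0])
        for _ in range(levels):
            p = np.concatenate([m * p, (1.0 - m) * p])
        return p

    def tau(q, m=0.3, levels=12):
        """Scaling exponent of the probability moment: sum(p_i^q) ~ eps^tau(q)."""
        xs, ys = [], []
        for k in range(4, levels + 1):
            p = binomial_cascade(m, k)
            xs.append(np.log(2.0 ** -k))          # box size eps = 2^-k
            ys.append(np.log(np.sum(p ** q)))
        return np.polyfit(xs, ys, 1)[0]

    for q in (-2.0, -0.5, 0.5, 2.0, 3.0):
        exact = -np.log2(0.3 ** q + 0.7 ** q)     # analytic tau(q) for this cascade
        print(q, round(tau(q), 4), round(exact, 4))
    ```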

  18. Probability for statisticians

    CERN Document Server

    Shorack, Galen R

    2017-01-01

    This 2nd edition textbook offers a rigorous introduction to measure theoretic probability with particular attention to topics of interest to mathematical statisticians—a textbook for courses in probability for students in mathematical statistics. It is recommended to anyone interested in the probability underlying modern statistics, providing a solid grounding in the probabilistic tools and techniques necessary to do theoretical research in statistics. For the teaching of probability theory to post graduate statistics students, this is one of the most attractive books available. Of particular interest is a presentation of the major central limit theorems via Stein's method either prior to or alternative to a characteristic function presentation. Additionally, there is considerable emphasis placed on the quantile function as well as the distribution function. The bootstrap and trimming are both presented. Martingale coverage includes coverage of censored data martingales. The text includes measure theoretic...

  19. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2013-01-01

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice...... probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended...
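
    The canonical special case (not specific to this paper) is the multinomial logit, whose CPGF is the log-sum-exp of the utilities; differentiating it recovers the familiar softmax choice probabilities, which a quick numerical check confirms:

    ```python
    import numpy as np

    def cpgf_logit(v):
        """CPGF of the multinomial logit model: G(v) = log(sum_j exp(v_j))."""
        m = v.max()
        return m + np.log(np.sum(np.exp(v - m)))

    def choice_probabilities(v, eps=1e-6):
        """Numerical gradient of the CPGF; for the logit this equals softmax(v)."""
        grad = np.empty_like(v)
        for i in range(len(v)):
            e = np.zeros_like(v)
            e[i] = eps
            grad[i] = (cpgf_logit(v + e) - cpgf_logit(v - e)) / (2.0 * eps)
        return grad

    v = np.array([1.0, 0.2, -0.5])
    print(np.round(choice_probabilities(v), 4))
    print(np.round(np.exp(v) / np.exp(v).sum(), 4))   # closed-form softmax, same numbers
    ```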

  20. Probability of satellite collision

    Science.gov (United States)

    Mccarter, J. W.

    1972-01-01

    A method is presented for computing the probability of a collision between a particular artificial earth satellite and any one of the total population of earth satellites. The collision hazard incurred by the proposed modular Space Station is assessed using the technique presented. The results of a parametric study to determine what type of satellite orbits produce the greatest contribution to the total collision probability are presented. Collision probability for the Space Station is given as a function of Space Station altitude and inclination. Collision probability was also parameterized over miss distance and mission duration.

  1. Real analysis and probability

    CERN Document Server

    Ash, Robert B; Lukacs, E

    1972-01-01

    Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory.Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var

  2. Probability and Measure

    CERN Document Server

    Billingsley, Patrick

    2012-01-01

    Praise for the Third Edition "It is, as far as I'm concerned, among the best books in math ever written....if you are a mathematician and want to have the top reference in probability, this is it." (Amazon.com, January 2006) A complete and comprehensive classic in probability and measure theory Probability and Measure, Anniversary Edition by Patrick Billingsley celebrates the achievements and advancements that have made this book a classic in its field for the past 35 years. Now re-issued in a new style and format, but with the reliable content that the third edition was revered for, this

  3. Experimental Probability in Elementary School

    Science.gov (United States)

    Andrew, Lane

    2009-01-01

    Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.

  4. Probabilities in physics

    CERN Document Server

    Hartmann, Stephan

    2011-01-01

    Many results of modern physics--those of quantum mechanics, for instance--come in a probabilistic guise. But what do probabilistic statements in physics mean? Are probabilities matters of objective fact and part of the furniture of the world, as objectivists think? Or do they only express ignorance or belief, as Bayesians suggest? And how are probabilistic hypotheses justified and supported by empirical evidence? Finally, what does the probabilistic nature of physics imply for our understanding of the world? This volume is the first to provide a philosophical appraisal of probabilities in all of physics. Its main aim is to make sense of probabilistic statements as they occur in the various physical theories and models and to provide a plausible epistemology and metaphysics of probabilities. The essays collected here consider statistical physics, probabilistic modelling, and quantum mechanics, and critically assess the merits and disadvantages of objectivist and subjectivist views of probabilities in these fie...

  5. Probability an introduction

    CERN Document Server

    Grimmett, Geoffrey

    2014-01-01

    Probability is an area of mathematics of tremendous contemporary importance across all aspects of human endeavour. This book is a compact account of the basic features of probability and random processes at the level of first and second year mathematics undergraduates and Masters' students in cognate fields. It is suitable for a first course in probability, plus a follow-up course in random processes including Markov chains. A special feature is the authors' attention to rigorous mathematics: not everything is rigorous, but the need for rigour is explained at difficult junctures. The text is enriched by simple exercises, together with problems (with very brief hints) many of which are taken from final examinations at Cambridge and Oxford. The first eight chapters form a course in basic probability, being an account of events, random variables, and distributions - discrete and continuous random variables are treated separately - together with simple versions of the law of large numbers and the central limit th...

  6. Evaluation of nuclear power plant component failure probability and core damage probability using simplified PSA model

    International Nuclear Information System (INIS)

    Shimada, Yoshio

    2000-01-01

    It is anticipated that changes in the frequency of surveillance tests, preventive maintenance or parts replacement of safety-related components may change component failure probabilities and, as a result, the core damage probability. It is also anticipated that the change differs depending on the initiating event frequency or the component type. This study assessed the change of core damage probability using a simplified PSA model capable of calculating core damage probability in a short time period, developed by the US NRC to process accident sequence precursors, when various components' failure probabilities are changed between 0 and 1, or when Japanese or American initiating event frequency data are used. As a result of the analysis: (1) It was clarified that the frequency of surveillance tests, preventive maintenance or parts replacement of motor-driven pumps (high-pressure injection pumps, residual heat removal pumps, auxiliary feedwater pumps) should be changed carefully, since the change in core damage probability is large when the base failure probability changes in the increasing direction. (2) Core damage probability is insensitive to changes in surveillance test frequency for motor-operated valves and the turbine-driven auxiliary feedwater pump, since the change in core damage probability is small when their failure probabilities change by about one order of magnitude. (3) The change in core damage probability is small when Japanese failure probability data are applied to the emergency diesel generators, even if the failure probability changes by one order of magnitude from the base value. On the other hand, when American failure probability data are applied, the increase in core damage probability is large, even if the failure probability changes in the increasing direction. Therefore, when Japanese failure probability data are applied, core damage probability is insensitive to changes in surveillance test frequency, etc. (author)
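
    A toy illustration (emphatically not the NRC simplified PSA model; the event-tree logic and all failure probabilities below are invented) of why the core damage frequency can be very sensitive to some component failure probabilities and nearly insensitive to others:

    ```python
    def core_damage_frequency(p_hpi_pump, p_mov, p_afw_mdp, p_afw_tdp, f_ie=1e-2):
        """Toy one-initiator model: core damage if high-pressure injection fails AND
        auxiliary feedwater fails.  HPI needs its pump and a motor-operated valve;
        AFW has redundant motor-driven and turbine-driven pumps."""
        p_hpi_fails = p_hpi_pump + p_mov - p_hpi_pump * p_mov   # OR gate
        p_afw_fails = p_afw_mdp * p_afw_tdp                     # AND gate (redundancy)
        return f_ie * p_hpi_fails * p_afw_fails

    base = dict(p_hpi_pump=3e-3, p_mov=1e-3, p_afw_mdp=3e-3, p_afw_tdp=3e-2)
    cdf0 = core_damage_frequency(**base)
    for name in base:
        bumped = dict(base, **{name: 10 * base[name]})          # one order of magnitude worse
        print(name, round(core_damage_frequency(**bumped) / cdf0, 2))
    ```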

  7. Introduction to probability

    CERN Document Server

    Freund, John E

    1993-01-01

    Thorough, lucid coverage of permutations and factorials, probabilities and odds, frequency interpretation, mathematical expectation, decision making, postulates of probability, rule of elimination, binomial distribution, geometric distribution, standard deviation, law of large numbers, and much more. Exercises with some solutions. Summary. Bibliography. Includes 42 black-and-white illustrations. 1973 edition.

  8. A mutação JAK2 V617F e as síndromes mieloproliferativas JAK2 V617F mutation and the myeloproliferative disorders

    Directory of Open Access Journals (Sweden)

    Bárbara C. R. Monte-Mór

    2008-01-01

    Full Text Available Myeloproliferative disorders are clonal hematopoietic diseases characterized by the amplification of one or more myeloid lineages. Polycythemia vera, essential thrombocythemia, idiopathic myelofibrosis and chronic myeloid leukemia are considered classic myeloproliferative disorders and share common clinical and biological features. While the genetic basis of chronic myeloid leukemia is shown to be the constitutively active protein BCR-ABL, the main molecular lesions in polycythemia vera, essential thrombocythemia and idiopathic myelofibrosis long remained unknown. This review focuses on the recent discovery of the JAK2 V617F mutation, its relationship to the myeloproliferative phenotype and its implications for the clinical approach to patients.

  9. Risk Probabilities

    DEFF Research Database (Denmark)

    Rojas-Nandayapa, Leonardo

    Tail probabilities of sums of heavy-tailed random variables are of major importance in various branches of Applied Probability, such as Risk Theory, Queueing Theory and Financial Management, and are subject to intense research nowadays. To understand their relevance one just needs to think... analytic expression for the distribution function of a sum of random variables. The presence of heavy-tailed random variables complicates the problem even more. The objective of this dissertation is to provide better approximations by means of sharp asymptotic expressions and Monte Carlo estimators...
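
    A crude Monte Carlo sketch of the object of study (the tail probability of a sum of heavy-tailed, here lognormal, variables; the parameters are arbitrary) also shows why sharper tools are needed: as the threshold grows, the naive estimator soon sees no hits at all.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    def tail_prob_crude_mc(u, n_terms=10, n_samples=1_000_000):
        """Crude Monte Carlo estimate of P(X1 + ... + Xn > u) for iid lognormal(0, 1) terms."""
        s = rng.lognormal(mean=0.0, sigma=1.0, size=(n_samples, n_terms)).sum(axis=1)
        return np.mean(s > u)

    for u in (30.0, 60.0, 120.0):
        print(u, tail_prob_crude_mc(u))   # the estimate degenerates as the event gets rarer
    ```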

  10. Quantum processes: probability fluxes, transition probabilities in unit time and vacuum vibrations

    International Nuclear Information System (INIS)

    Oleinik, V.P.; Arepjev, Ju D.

    1989-01-01

    Transition probabilities in unit time and probability fluxes are compared in studying the elementary quantum processes - the decay of a bound state under the action of time-varying and constant electric fields. It is shown that the difference between these quantities may be considerable, and so the use of transition probabilities W instead of probability fluxes Π, in calculating the particle fluxes, may lead to serious errors. The quantity W represents the rate of change with time of the population of the energy levels relating partly to the real states and partly to the virtual ones, and it cannot be directly measured in experiment. The vacuum background is shown to be continuously distorted when a perturbation acts on a system. Because of this the viewpoint of an observer on the physical properties of real particles continuously varies with time. This fact is not taken into consideration in the conventional theory of quantum transitions based on using the notion of probability amplitude. As a result, the probability amplitudes lose their physical meaning. All the physical information on quantum dynamics of a system is contained in the mean values of physical quantities. The existence of considerable differences between the quantities W and Π permits one in principle to make a choice of the correct theory of quantum transitions on the basis of experimental data. (author)

  11. Propensity, Probability, and Quantum Theory

    Science.gov (United States)

    Ballentine, Leslie E.

    2016-08-01

    Quantum mechanics and probability theory share one peculiarity. Both have well established mathematical formalisms, yet both are subject to controversy about the meaning and interpretation of their basic concepts. Since probability plays a fundamental role in QM, the conceptual problems of one theory can affect the other. We first classify the interpretations of probability into three major classes: (a) inferential probability, (b) ensemble probability, and (c) propensity. Class (a) is the basis of inductive logic; (b) deals with the frequencies of events in repeatable experiments; (c) describes a form of causality that is weaker than determinism. An important, but neglected, paper by P. Humphreys demonstrated that propensity must differ mathematically, as well as conceptually, from probability, but he did not develop a theory of propensity. Such a theory is developed in this paper. Propensity theory shares many, but not all, of the axioms of probability theory. As a consequence, propensity supports the Law of Large Numbers from probability theory, but does not support Bayes theorem. Although there are particular problems within QM to which any of the classes of probability may be applied, it is argued that the intrinsic quantum probabilities (calculated from a state vector or density matrix) are most naturally interpreted as quantum propensities. This does not alter the familiar statistical interpretation of QM. But the interpretation of quantum states as representing knowledge is untenable. Examples show that a density matrix fails to represent knowledge.

  12. Choice Probability Generating Functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel L; Bierlaire, Michel

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice...... probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications....

  13. Prediction and probability in sciences

    International Nuclear Information System (INIS)

    Klein, E.; Sacquin, Y.

    1998-01-01

    This book reports the 7 presentations made at the third meeting 'physics and fundamental questions' whose theme was probability and prediction. The concept of probability that was invented to apprehend random phenomena has become an important branch of mathematics and its application range spreads from radioactivity to species evolution via cosmology or the management of very weak risks. The notion of probability is the basis of quantum mechanics and then is bound to the very nature of matter. The 7 topics are: - radioactivity and probability, - statistical and quantum fluctuations, - quantum mechanics as a generalized probability theory, - probability and the irrational efficiency of mathematics, - can we foresee the future of the universe?, - chance, eventuality and necessity in biology, - how to manage weak risks? (A.C.)

  14. The quantum probability calculus

    International Nuclear Information System (INIS)

    Jauch, J.M.

    1976-01-01

    The Wigner anomaly (1932) for the joint distribution of noncompatible observables is an indication that the classical probability calculus is not applicable for quantum probabilities. It should, therefore, be replaced by another, more general calculus, which is specifically adapted to quantal systems. In this article this calculus is exhibited and its mathematical axioms and the definitions of the basic concepts such as probability field, random variable, and expectation values are given. (B.R.H)

  15. Differential Cytotoxic Activity of Essential Oil of Lippia citriodora from Different Regions in Morocco.

    Science.gov (United States)

    Oukerrou, Moulay Ali; Tilaoui, Mounir; Mouse, Hassan Ait; Bouchmaa, Najat; Zyad, Abdelmajid

    2017-07-01

    The aim of this work was to investigate the cytotoxic effect of the essential oil of dried leaves of Lippia citriodora (H.B. & K.) harvested in different regions of Morocco. This effect was evaluated against the P815 murine mastocytoma cell line using the MTT assay. Interestingly, this work demonstrated for the first time that these essential oils exhibited a strong cytotoxic activity against the P815 cell line, with IC50 values ranging from 7.75 to 13.25 μg/ml. This cytotoxicity began early and increased in a dose- and time-dependent manner. The chemical profile of these essential oils was analyzed by gas chromatography coupled to mass spectrometry. Importantly, the difference in terms of major components' contents was not significant, suggesting that the differential cytotoxicity between these essential oils can probably be attributed to differences in their minor compounds, which could interact with each other or with the main molecules. Finally, this study demonstrated for the first time that essential oils of L. citriodora from different regions in Morocco induced apoptosis in the P815 tumor cell line. © 2017 Wiley-VHCA AG, Zurich, Switzerland.

  16. Accurate step-hold tracking of smoothly varying periodic and aperiodic probability.

    Science.gov (United States)

    Ricci, Matthew; Gallistel, Randy

    2017-07-01

    Subjects observing many samples from a Bernoulli distribution are able to perceive an estimate of the generating parameter. A question of fundamental importance is how the current percept (what we think the probability now is) depends on the sequence of observed samples. Answers to this question are strongly constrained by the manner in which the current percept changes in response to changes in the hidden parameter. Subjects do not update their percept trial-by-trial when the hidden probability undergoes unpredictable and unsignaled step changes; instead, they update it only intermittently in a step-hold pattern. It could be that the step-hold pattern is not essential to the perception of probability and is only an artifact of step changes in the hidden parameter. However, we now report that the step-hold pattern obtains even when the parameter varies slowly and smoothly. It obtains even when the smooth variation is periodic (sinusoidal) and perceived as such. We elaborate on a previously published theory that accounts for: (i) the quantitative properties of the step-hold update pattern; (ii) subjects' quick and accurate reporting of changes; (iii) subjects' second thoughts about previously reported changes; (iv) subjects' detection of higher-order structure in patterns of change. We also call attention to the challenges these results pose for trial-by-trial updating theories.

  17. Excluding joint probabilities from quantum theory

    Science.gov (United States)

    Allahverdyan, Armen E.; Danageozian, Arshag

    2018-03-01

    Quantum theory does not provide a unique definition for the joint probability of two noncommuting observables, which is the next important question after Born's probability for a single observable. Instead, various definitions were suggested, e.g., via quasiprobabilities or via hidden-variable theories. After reviewing open issues of the joint probability, we relate it to quantum imprecise probabilities, which are noncontextual and are consistent with all constraints expected from a quantum probability. We study two noncommuting observables in a two-dimensional Hilbert space and show that there is no precise joint probability that applies for any quantum state and is consistent with imprecise probabilities. This contrasts with theorems by Bell and Kochen-Specker that exclude joint probabilities for more than two noncommuting observables, in Hilbert space with dimension larger than two. If measurement contexts are included into the definition, joint probabilities are not excluded anymore, but they are still constrained by imprecise probabilities.

  18. Linear positivity and virtual probability

    International Nuclear Information System (INIS)

    Hartle, James B.

    2004-01-01

    We investigate the quantum theory of closed systems based on the linear positivity decoherence condition of Goldstein and Page. The objective of any quantum theory of a closed system, most generally the universe, is the prediction of probabilities for the individual members of sets of alternative coarse-grained histories of the system. Quantum interference between members of a set of alternative histories is an obstacle to assigning probabilities that are consistent with the rules of probability theory. A quantum theory of closed systems therefore requires two elements: (1) a condition specifying which sets of histories may be assigned probabilities and (2) a rule for those probabilities. The linear positivity condition of Goldstein and Page is the weakest of the general conditions proposed so far. Its general properties relating to exact probability sum rules, time neutrality, and conservation laws are explored. Its inconsistency with the usual notion of independent subsystems in quantum mechanics is reviewed. Its relation to the stronger condition of medium decoherence necessary for classicality is discussed. The linear positivity of histories in a number of simple model systems is investigated with the aim of exhibiting linearly positive sets of histories that are not decoherent. The utility of extending the notion of probability to include values outside the range of 0-1 is described. Alternatives with such virtual probabilities cannot be measured or recorded, but can be used in the intermediate steps of calculations of real probabilities. Extended probabilities give a simple and general way of formulating quantum theory. The various decoherence conditions are compared in terms of their utility for characterizing classicality and the role they might play in further generalizations of quantum mechanics

  19. The pleasures of probability

    CERN Document Server

    Isaac, Richard

    1995-01-01

    The ideas of probability are all around us. Lotteries, casino gambling, the almost non-stop polling which seems to mold public policy more and more: these are a few of the areas where principles of probability impinge in a direct way on the lives and fortunes of the general public. At a more removed level there is modern science which uses probability and its offshoots like statistics and the theory of random processes to build mathematical descriptions of the real world. In fact, twentieth-century physics, in embracing quantum mechanics, has a world view that is at its core probabilistic in nature, contrary to the deterministic one of classical physics. In addition to all this muscular evidence of the importance of probability ideas it should also be said that probability can be lots of fun. It is a subject where you can start thinking about amusing, interesting, and often difficult problems with very little mathematical background. In this book, I wanted to introduce a reader with at least a fairl...

  20. Evaluation of plitidepsin in patients with primary myelofibrosis and post polycythemia vera/essential thrombocythemia myelofibrosis: results of preclinical studies and a phase II clinical trial

    International Nuclear Information System (INIS)

    Pardanani, A; Tefferi, A; Guglielmelli, P; Bogani, C; Bartalucci, N; Rodríguez, J; Extremera, S; Pérez, I; Alfaro, V; Vannucchi, A M

    2015-01-01

    Previous data established that plitidepsin, a cyclic depsipeptide, exerted activity in a mouse model of myelofibrosis (MF). New preclinical experiments reported herein found that low nanomolar plitidepsin concentrations potently inhibited the proliferation of JAK2V617F-mutated cell lines and reduced colony formation by CD34+ cells of individuals with MF, at least in part through modulation of p27 levels. Cells of MF patients had significantly reduced p27 content, which was modestly increased upon plitidepsin exposure. On these premises, an exploratory phase II trial evaluated plitidepsin 5 mg/m2 as a 3-h intravenous infusion administered on days 1 and 15 every 4 weeks (q4wk). The response rate (RR) according to the International Working Group for Myelofibrosis Research and Treatment consensus criteria was 9.1% (95% CI, 0.2–41.3%) in 11 evaluable patients during the first trial stage. The single responder achieved red cell transfusion independence, and stable disease was reported in nine additional patients (81.8%). Eight patients had a short-lasting improvement of splenomegaly. In conclusion, plitidepsin 5 mg/m2 as a 3-h infusion q4wk was well tolerated but had modest activity in patients with primary, post-polycythaemia vera or post-essential thrombocythaemia MF. Therefore, this trial was prematurely terminated and we concluded that further clinical trials with plitidepsin as a single agent in MF are not warranted.

  1. Probable Inference and Quantum Mechanics

    International Nuclear Information System (INIS)

    Grandy, W. T. Jr.

    2009-01-01

    In its current very successful interpretation the quantum theory is fundamentally statistical in nature. Although commonly viewed as a probability amplitude whose (complex) square is a probability, the wavefunction or state vector continues to defy consensus as to its exact meaning, primarily because it is not a physical observable. Rather than approach this problem directly, it is suggested that it is first necessary to clarify the precise role of probability theory in quantum mechanics, either as applied to, or as an intrinsic part of the quantum theory. When all is said and done the unsurprising conclusion is that quantum mechanics does not constitute a logic and probability unto itself, but adheres to the long-established rules of classical probability theory while providing a means within itself for calculating the relevant probabilities. In addition, the wavefunction is seen to be a description of the quantum state assigned by an observer based on definite information, such that the same state must be assigned by any other observer based on the same information, in much the same way that probabilities are assigned.

  2. Psychophysics of the probability weighting function

    Science.gov (United States)

    Takahashi, Taiki

    2011-03-01

    A probability weighting function w(p) for an objective probability p in decision under risk plays a pivotal role in Kahneman-Tversky prospect theory. Although recent studies in econophysics and neuroeconomics widely utilized probability weighting functions, psychophysical foundations of the probability weighting functions have been unknown. Notably, the behavioral economist Prelec (1998) [4] axiomatically derived the probability weighting function w(p) = exp(-(-ln p)^α) (with 0 < α < 1, w(0) = 0, w(1/e) = 1/e and w(1) = 1), which has extensively been studied in behavioral neuroeconomics. The present study utilizes psychophysical theory to derive Prelec's probability weighting function from psychophysical laws of perceived waiting time in probabilistic choices. Also, the relations between the parameters in the probability weighting function and the probability discounting function in behavioral psychology are derived. Future directions in the application of the psychophysical theory of the probability weighting function in econophysics and neuroeconomics are discussed.
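
    As a small illustration of the one-parameter Prelec form quoted above, the sketch below evaluates w(p) = exp(-(-ln p)^α) and checks its fixed point at p = 1/e; the value α = 0.65 is an arbitrary assumption, not an estimate from the paper.

      import numpy as np

      def prelec_weight(p, alpha=0.65):
          # Prelec (1998) probability weighting function w(p) = exp(-(-ln p)**alpha).
          p = np.asarray(p, dtype=float)
          safe = np.clip(p, 1e-12, 1.0)                       # avoid log(0)
          return np.where(p > 0, np.exp(-np.power(-np.log(safe), alpha)), 0.0)

      ps = np.array([0.0, 0.01, 1 / np.e, 0.5, 0.9, 1.0])
      print(np.round(prelec_weight(ps), 3))    # w(1/e) = 1/e ≈ 0.368 and w(1) = 1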

  3. The perception of probability.

    Science.gov (United States)

    Gallistel, C R; Krishan, Monika; Liu, Ye; Miller, Reilly; Latham, Peter E

    2014-01-01

    We present a computational model to explain the results from experiments in which subjects estimate the hidden probability parameter of a stepwise nonstationary Bernoulli process outcome by outcome. The model captures the following results qualitatively and quantitatively, with only 2 free parameters: (a) Subjects do not update their estimate after each outcome; they step from one estimate to another at irregular intervals. (b) The joint distribution of step widths and heights cannot be explained on the assumption that a threshold amount of change must be exceeded in order for them to indicate a change in their perception. (c) The mapping of observed probability to the median perceived probability is the identity function over the full range of probabilities. (d) Precision (how close estimates are to the best possible estimate) is good and constant over the full range. (e) Subjects quickly detect substantial changes in the hidden probability parameter. (f) The perceived probability sometimes changes dramatically from one observation to the next. (g) Subjects sometimes have second thoughts about a previous change perception, after observing further outcomes. (h) The frequency with which they perceive changes moves in the direction of the true frequency over sessions. (Explaining this finding requires 2 additional parametric assumptions.) The model treats the perception of the current probability as a by-product of the construction of a compact encoding of the experienced sequence in terms of its change points. It illustrates the why and the how of intermittent Bayesian belief updating and retrospective revision in simple perception. It suggests a reinterpretation of findings in the recent literature on the neurobiology of decision making. (PsycINFO Database Record (c) 2014 APA, all rights reserved).

  4. Introduction to probability with R

    CERN Document Server

    Baclawski, Kenneth

    2008-01-01

    FOREWORD PREFACE Sets, Events, and Probability The Algebra of Sets The Bernoulli Sample Space The Algebra of Multisets The Concept of Probability Properties of Probability Measures Independent Events The Bernoulli Process The R Language Finite Processes The Basic Models Counting Rules Computing Factorials The Second Rule of Counting Computing Probabilities Discrete Random Variables The Bernoulli Process: Tossing a Coin The Bernoulli Process: Random Walk Independence and Joint Distributions Expectations The Inclusion-Exclusion Principle General Random Variable

  5. Applied probability and stochastic processes

    CERN Document Server

    Sumita, Ushio

    1999-01-01

    Applied Probability and Stochastic Processes is an edited work written in honor of Julien Keilson. This volume has attracted a host of scholars in applied probability, who have made major contributions to the field, and have written survey and state-of-the-art papers on a variety of applied probability topics, including, but not limited to: perturbation method, time reversible Markov chains, Poisson processes, Brownian techniques, Bayesian probability, optimal quality control, Markov decision processes, random matrices, queueing theory and a variety of applications of stochastic processes. The book has a mixture of theoretical, algorithmic, and application chapters providing examples of the cutting-edge work that Professor Keilson has done or influenced over the course of his highly-productive and energetic career in applied probability and stochastic processes. The book will be of interest to academic researchers, students, and industrial practitioners who seek to use the mathematics of applied probability i...

  6. Collision Probability Analysis

    DEFF Research Database (Denmark)

    Hansen, Peter Friis; Pedersen, Preben Terndrup

    1998-01-01

    It is the purpose of this report to apply a rational model for prediction of ship-ship collision probabilities as a function of the ship and the crew characteristics and the navigational environment for MS Dextra sailing on a route between Cadiz and the Canary Islands. The most important ship and crew characteristics are: ship speed, ship manoeuvrability, the layout of the navigational bridge, the radar system, the number and the training of navigators, the presence of a look out etc. The main parameters affecting the navigational environment are ship traffic density, probability distributions of wind speeds...... probability, i.e. a study of the navigator's role in resolving critical situations, a causation factor is derived as a second step. The report documents the first step in a probabilistic collision damage analysis. Future work will include calculation of energy released for crushing of structures giving...

  7. Irreversibility and conditional probability

    International Nuclear Information System (INIS)

    Stuart, C.I.J.M.

    1989-01-01

    The mathematical entropy - unlike physical entropy - is simply a measure of uniformity for probability distributions in general. So understood, conditional entropies have the same logical structure as conditional probabilities. If, as is sometimes supposed, conditional probabilities are time-reversible, then so are conditional entropies and, paradoxically, both then share this symmetry with physical equations of motion. The paradox is, of course, that probabilities yield a direction to time both in statistical mechanics and quantum mechanics, while the equations of motion do not. The supposed time-reversibility of both conditionals seems also to involve a form of retrocausality that is related to, but possibly not the same as, that described by Costa de Beauregard. The retrocausality is paradoxically at odds with the generally presumed irreversibility of the quantum mechanical measurement process. Further paradox emerges if the supposed time-reversibility of the conditionals is linked with the idea that the thermodynamic entropy is the same thing as 'missing information', since this confounds the thermodynamic and mathematical entropies. However, it is shown that irreversibility is a formal consequence of conditional entropies and, hence, of conditional probabilities also. 8 refs. (Author)

  8. Introduction to probability with Mathematica

    CERN Document Server

    Hastings, Kevin J

    2009-01-01

    Discrete Probability; The Cast of Characters; Properties of Probability; Simulation; Random Sampling; Conditional Probability; Independence; Discrete Distributions; Discrete Random Variables, Distributions, and Expectations; Bernoulli and Binomial Random Variables; Geometric and Negative Binomial Random Variables; Poisson Distribution; Joint, Marginal, and Conditional Distributions; More on Expectation; Continuous Probability; From the Finite to the (Very) Infinite; Continuous Random Variables and Distributions; Continuous Expectation; Continuous Distributions; The Normal Distribution; Bivariate Normal Distribution; New Random Variables from Old; Order Statistics; Gamma Distributions; Chi-Square, Student's t, and F-Distributions; Transformations of Normal Random Variables; Asymptotic Theory; Strong and Weak Laws of Large Numbers; Central Limit Theorem; Stochastic Processes and Applications; Markov Chains; Poisson Processes; Queues; Brownian Motion; Financial Mathematics; Appendix; Introduction to Mathematica; Glossary of Mathematica Commands for Probability; Short Answers...

  9. Failure probability under parameter uncertainty.

    Science.gov (United States)

    Gerrard, R; Tsanakas, A

    2011-05-01

    In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decision-maker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications. © 2010 Society for Risk Analysis.
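
    A hedged Monte Carlo sketch of the paper's central point, under assumed numbers (log-normal losses, a data set of 30 observations, a nominal 1% failure probability): when the control threshold is set from an estimated quantile, the realised frequency of threshold exceedances is typically larger than the nominal level.

      import numpy as np

      rng = np.random.default_rng(1)
      mu, sigma = 0.0, 1.0        # true (but unknown to the decision-maker) parameters of the log-losses
      nominal = 0.01              # nominal required failure probability
      n, trials = 30, 20000       # assumed data-set size and number of Monte Carlo replications

      failures = 0
      for _ in range(trials):
          sample = rng.normal(mu, sigma, n)              # observed log-losses
          m, s = sample.mean(), sample.std(ddof=1)
          threshold = m + 2.326 * s                      # plug-in estimate of the 99% quantile (z_0.99)
          failures += rng.normal(mu, sigma) > threshold  # next loss exceeds the chosen control level

      print("nominal failure probability:", nominal)
      print("realised failure frequency :", failures / trials)   # typically noticeably above 1%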

  10. Conditional Probability Modulates Visual Search Efficiency

    Directory of Open Access Journals (Sweden)

    Bryan eCort

    2013-10-01

    Full Text Available We investigated the effects of probability on visual search. Previous work has shown that people can utilize spatial and sequential probability information to improve target detection. We hypothesized that performance improvements from probability information would extend to the efficiency of visual search. Our task was a simple visual search in which the target was always present among a field of distractors, and could take one of two colors. The absolute probability of the target being either color was 0.5; however, the conditional probability – the likelihood of a particular color given a particular combination of two cues – varied from 0.1 to 0.9. We found that participants searched more efficiently for high conditional probability targets and less efficiently for low conditional probability targets, but only when they were explicitly informed of the probability relationship between cues and target color.

  11. A first course in probability

    CERN Document Server

    Ross, Sheldon

    2014-01-01

    A First Course in Probability, Ninth Edition, features clear and intuitive explanations of the mathematics of probability theory, outstanding problem sets, and a variety of diverse examples and applications. This book is ideal for an upper-level undergraduate or graduate level introduction to probability for math, science, engineering and business students. It assumes a background in elementary calculus.

  12. Detection of MPL mutations by a novel allele-specific PCR-based strategy.

    Science.gov (United States)

    Furtado, Larissa V; Weigelin, Helmut C; Elenitoba-Johnson, Kojo S J; Betz, Bryan L

    2013-11-01

    MPL mutation testing is recommended in patients with suspected primary myelofibrosis or essential thrombocythemia who lack the JAK2 V617F mutation. MPL mutations can occur at allelic levels below 15%, which may escape detection by commonly used mutation screening methods such as Sanger sequencing. We developed a novel multiplexed allele-specific PCR assay capable of detecting most recurrent MPL exon 10 mutations associated with primary myelofibrosis and essential thrombocythemia (W515L, W515K, W515A, and S505N) down to a sensitivity of 2.5% mutant allele. Test results were reviewed from 15 reference cases and 1380 consecutive specimens referred to our laboratory for testing. Assay performance was compared to Sanger sequencing across a series of 58 specimens with MPL mutations. Positive cases consisted of 45 with W515L, 6 with S505N, 5 with W515K, 1 with W515A, and 1 with both W515L and S505N. Seven cases had mutations below 5% that were undetected by Sanger sequencing. Ten additional cases had mutation levels between 5% and 15% that were not consistently detected by sequencing. All results were easily interpreted in the allele-specific test. This assay offers a sensitive and reliable solution for MPL mutation testing. Sanger sequencing appears insufficiently sensitive for robust MPL mutation detection. Our data also suggest the relative frequency of S505N mutations may be underestimated, highlighting the necessity for inclusion of this mutation in MPL test platforms. Copyright © 2013 American Society for Investigative Pathology and the Association for Molecular Pathology. Published by Elsevier Inc. All rights reserved.

  13. Estimating Subjective Probabilities

    DEFF Research Database (Denmark)

    Andersen, Steffen; Fountain, John; Harrison, Glenn W.

    2014-01-01

    either construct elicitation mechanisms that control for risk aversion, or construct elicitation mechanisms which undertake 'calibrating adjustments' to elicited reports. We illustrate how the joint estimation of risk attitudes and subjective probabilities can provide the calibration adjustments that theory calls for. We illustrate this approach using data from a controlled experiment with real monetary consequences to the subjects. This allows the observer to make inferences about the latent subjective probability, under virtually any well-specified model of choice under subjective risk, while still...

  14. Probability and Bayesian statistics

    CERN Document Server

    1987-01-01

    This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics" which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985 the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are published especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability across psychological aspects of formulating subjective probability statements, abstract measure theoretical considerations, contributions to theoretical statistics an...

  15. Market-implied risk-neutral probabilities, actual probabilities, credit risk and news

    Directory of Open Access Journals (Sweden)

    Shashidhar Murthy

    2011-09-01

    Full Text Available Motivated by the credit crisis, this paper investigates links between risk-neutral probabilities of default implied by markets (e.g. from yield spreads) and their actual counterparts (e.g. from ratings). It discusses differences between the two and clarifies underlying economic intuition using simple representations of credit risk pricing. Observed large differences across bonds in the ratio of the two probabilities are shown to imply that apparently safer securities can be more sensitive to news.

  16. Spatial probability aids visual stimulus discrimination

    Directory of Open Access Journals (Sweden)

    Michael Druker

    2010-08-01

    Full Text Available We investigated whether the statistical predictability of a target's location would influence how quickly and accurately it was classified. Recent results have suggested that spatial probability can be a cue for the allocation of attention in visual search. One explanation for probability cuing is spatial repetition priming. In our two experiments we used probability distributions that were continuous across the display rather than relying on a few arbitrary screen locations. This produced fewer spatial repeats and allowed us to dissociate the effect of a high probability location from that of short-term spatial repetition. The task required participants to quickly judge the color of a single dot presented on a computer screen. In Experiment 1, targets were more probable in an off-center hotspot of high probability that gradually declined to a background rate. Targets garnered faster responses if they were near earlier target locations (priming) and if they were near the high probability hotspot (probability cuing). In Experiment 2, target locations were chosen on three concentric circles around fixation. One circle contained 80% of targets. The value of this ring distribution is that it allowed for a spatially restricted high probability zone in which sequentially repeated trials were not likely to be physically close. Participant performance was sensitive to the high-probability circle in addition to the expected effects of eccentricity and the distance to recent targets. These two experiments suggest that inhomogeneities in spatial probability can be learned and used by participants on-line and without prompting as an aid for visual stimulus discrimination and that spatial repetition priming is not a sufficient explanation for this effect. Future models of attention should consider explicitly incorporating the probabilities of target locations and features.

  17. A probability space for quantum models

    Science.gov (United States)

    Lemmens, L. F.

    2017-06-01

    A probability space contains a set of outcomes, a collection of events formed by subsets of the set of outcomes, and probabilities defined for all events. A reformulation in terms of propositions allows one to use the maximum entropy method to assign the probabilities, taking some constraints into account. The construction of a probability space for quantum models is determined by the choice of propositions, choosing the constraints and making the probability assignment by the maximum entropy method. This approach shows how typical quantum distributions such as Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein are partly related to well-known classical distributions. The relation between the conditional probability density, given some averages as constraints, and the appropriate ensemble is elucidated.
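
    A minimal sketch of the maximum entropy assignment mentioned above, under assumed inputs: three outcomes with assumed "energies" and an assumed mean-energy constraint. Maximizing entropy subject to the constraint yields a Boltzmann-type distribution, with the Lagrange multiplier found numerically.

      import numpy as np
      from scipy.optimize import brentq

      E = np.array([0.0, 1.0, 2.0])   # assumed energy levels of three outcomes
      target = 0.7                    # assumed constraint on the mean energy

      def mean_energy(beta):
          w = np.exp(-beta * E)
          return (w / w.sum()) @ E

      beta = brentq(lambda b: mean_energy(b) - target, -50.0, 50.0)   # solve <E>(beta) = target
      p = np.exp(-beta * E); p /= p.sum()

      print("Lagrange multiplier beta :", round(beta, 4))
      print("max-entropy probabilities:", np.round(p, 4))
      print("check mean energy        :", round(p @ E, 4))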

  18. Chemical composition and biological activities of leaf and fruit essential oils from Eucalyptus camaldulensis.

    Science.gov (United States)

    Dogan, Gulden; Kara, Nazan; Bagci, Eyup; Gur, Seher

    2017-10-26

    The chemical composition of the essential oils from the leaves and fruit of Eucalyptus camaldulensis grown in Mersin, Turkey, was analyzed using gas chromatography (GC) and gas chromatography-mass spectrometry (GC-MS) techniques. The biological activities (antibacterial and antifungal) were examined using the agar well diffusion method. The main leaf oil constituents were p-cymene (42.1%), eucalyptol (1,8-cineole) (14.1%), α-pinene (12.7%) and α-terpineol (10.7%). The main constituents of the fruit oil were eucalyptol (1,8-cineole) (34.5%), p-cymene (30.0%), α-terpineol (15.1%) and α-pinene (9.0%). Our results showed that both types of oils are rich in terms of monoterpene hydrocarbons and oxygenated monoterpenes. The leaf and fruit essential oils of E. camaldulensis significantly inhibited the growth of Gram-positive (Staphylococcus aureus and Bacillus subtilis) and Gram-negative (Escherichia coli and Streptococcus sp.) bacteria. The oils also showed fungicidal activity against Candida tropicalis and C. glabrata. Leaf essential oils showed more activity than fruit essential oils, probably due to the higher p-cymene concentration in leaves.

  19. Probability, Nondeterminism and Concurrency

    DEFF Research Database (Denmark)

    Varacca, Daniele

    Nondeterminism is modelled in domain theory by the notion of a powerdomain, while probability is modelled by that of the probabilistic powerdomain. Some problems arise when we want to combine them in order to model computation in which both nondeterminism and probability are present. In particula...

  20. Ignition Probability

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — USFS, State Forestry, BLM, and DOI fire occurrence point locations from 1987 to 2008 were combined and converted into a fire occurrence probability or density grid...

  1. Non-driver mutations in myeloproliferative neoplasm-associated myelofibrosis

    Directory of Open Access Journals (Sweden)

    Bing Li

    2017-05-01

    Full Text Available Abstract We studied non-driver mutations in 62 subjects with myeloproliferative neoplasm (MPN-associated myelofibrosis upon diagnosis, including 45 subjects with primary myelofibrosis (PMF and 17 with post-polycythemia vera or post-essential thrombocythemia myelofibrosis (post-PV/ET MF. Fifty-eight subjects had ≥1 non-driver mutation upon diagnosis. Mutations in mRNA splicing genes, especially in U2AF1, were significantly more frequent in PMF than in post-PV/ET MF (33 vs. 6%; P = 0.015. There were also striking differences in clonal architecture. These data indicate different genomic spectrums between PMF and post-PV/ET MF.

  2. Probability of Failure in Random Vibration

    DEFF Research Database (Denmark)

    Nielsen, Søren R.K.; Sørensen, John Dalsgaard

    1988-01-01

    Close approximations to the first-passage probability of failure in random vibration can be obtained by integral equation methods. A simple relation exists between the first-passage probability density function and the distribution function for the time interval spent below a barrier before out-crossing. An integral equation for the probability density function of the time interval is formulated, and adequate approximations for the kernel are suggested. The kernel approximation results in approximate solutions for the probability density function of the time interval and thus for the first-passage probability...

  3. LOGISTIC REGRESSION AS A TOOL FOR DETERMINATION OF THE PROBABILITY OF DEFAULT FOR ENTERPRISES

    Directory of Open Access Journals (Sweden)

    Erika SPUCHLAKOVA

    2017-12-01

    Full Text Available In a rapidly changing world it is necessary to adapt to new conditions, and approaches can vary from day to day. For the proper management of a company it is essential to know its financial situation. The company's financial health can be assessed by financial analysis, which provides a number of methods for evaluating it. Analysis indicators are often included in company assessments and in obtaining bank loans and other financial resources needed to keep the company functioning. Because a company focuses on the future and its planning, it is essential to forecast the future financial situation. According to the results of the financial health prediction, the company decides whether to extend or limit its business. It depends mainly on the capabilities of the company's management how the information obtained from financial analysis is used in practice. The first findings on logistic regression methods were published in the 1960s as an alternative to the least squares method. The essence of logistic regression is to determine the relationship between the explained (dependent) variable and the explanatory (independent) variables. The basic principle of this statistical method rests on regression analysis, but unlike linear regression it predicts the probability that a phenomenon has occurred or not. The aim of this paper is to determine the probability of bankruptcy of enterprises.
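
    A minimal sketch of estimating a probability of default (bankruptcy) with logistic regression, in the spirit of the method described; the synthetic financial-ratio features, the data-generating coefficients and the hypothetical firm below are illustrative assumptions, not values from the paper.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(2)
      n = 500

      # Assumed explanatory (independent) variables: liquidity, indebtedness, return on assets.
      X = np.column_stack([
          rng.normal(1.5, 0.5, n),     # liquidity ratio
          rng.uniform(0.1, 0.9, n),    # debt ratio
          rng.normal(0.05, 0.08, n),   # ROA
      ])
      # Assumed latent model used only to generate labels for this illustration.
      logit = -1.0 - 1.2 * X[:, 0] + 3.0 * X[:, 1] - 8.0 * X[:, 2]
      y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))   # explained (dependent) variable: default yes/no

      model = LogisticRegression().fit(X, y)

      new_firm = np.array([[1.1, 0.7, -0.02]])           # hypothetical firm's ratios
      pd_hat = model.predict_proba(new_firm)[0, 1]       # probability of the "default" class
      print(f"estimated probability of default: {pd_hat:.1%}")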

  4. Invariant probabilities of transition functions

    CERN Document Server

    Zaharopol, Radu

    2014-01-01

    The structure of the set of all the invariant probabilities and the structure of various types of individual invariant probabilities of a transition function are two topics of significant interest in the theory of transition functions, and are studied in this book. The results obtained are useful in ergodic theory and the theory of dynamical systems, which, in turn, can be applied in various other areas (like number theory). They are illustrated using transition functions defined by flows, semiflows, and one-parameter convolution semigroups of probability measures. In this book, all results on transition probabilities that have been published by the author between 2004 and 2008 are extended to transition functions. The proofs of the results obtained are new. For transition functions that satisfy very general conditions the book describes an ergodic decomposition that provides relevant information on the structure of the corresponding set of invariant probabilities. Ergodic decomposition means a splitting of t...

  5. Plant Essential Oils Synergize and Antagonize Toxicity of Different Conventional Insecticides against Myzus persicae (Hemiptera: Aphididae)

    Science.gov (United States)

    Faraone, Nicoletta; Hillier, N. Kirk; Cutler, G. Christopher

    2015-01-01

    Plant-derived products can play an important role in pest management programs. Essential oils from Lavandula angustifolia (lavender) and Thymus vulgaris (thyme) and their main constituents, linalool and thymol, respectively, were evaluated for insecticidal activity and synergistic action in combination with insecticides against green peach aphid, Myzus persicae (Sulzer) (Hemiptera: Aphididae). The essential oils and their main constituents exerted similar insecticidal activity when aphids were exposed by direct sprays, but were non-toxic by exposure to treated leaf discs. In synergism experiments, the toxicity of imidacloprid was synergized 16- to 20-fold by L. angustifolia and T. vulgaris essential oils, but far less synergism occurred with linalool and thymol, indicating that secondary constituents of the oils were probably responsible for the observed synergism. In contrast to results with imidacloprid, the insecticidal activity of spirotetramat was antagonized by L. angustifolia and T. vulgaris essential oils, and linalool and thymol. Our results demonstrate the potential of plant essential oils as synergists of insecticides, but show that antagonistic action against certain insecticides may occur. PMID:26010088

  6. The concept of probability

    International Nuclear Information System (INIS)

    Bitsakis, E.I.; Nicolaides, C.A.

    1989-01-01

    The concept of probability is now, and always has been, central to the debate on the interpretation of quantum mechanics. Furthermore, probability permeates all of science, as well as our everyday life. The papers included in this volume, written by leading proponents of the ideas expressed, embrace a broad spectrum of thought and results: mathematical, physical, epistemological, and experimental, both specific and general. The contributions are arranged in parts under the following headings: Following Schroedinger's thoughts; Probability and quantum mechanics; Aspects of the arguments on nonlocality; Bell's theorem and EPR correlations; Real or Gedanken experiments and their interpretation; Questions about irreversibility and stochasticity; and Epistemology, interpretation and culture. (author). refs.; figs.; tabs

  7. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2010-01-01

    This paper establishes that every random utility discrete choice model (RUM) has a representation that can be characterized by a choice-probability generating function (CPGF) with specific properties, and that every function with these specific properties is consistent with a RUM. The choice probabilities from the RUM are obtained from the gradient of the CPGF. Mixtures of RUM are characterized by logarithmic mixtures of their associated CPGF. The paper relates CPGF to multivariate extreme value distributions, and reviews and extends methods for constructing generating functions for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended to competing risk survival models....

  8. Multidetector computed tomographic pulmonary angiography in patients with a high clinical probability of pulmonary embolism.

    Science.gov (United States)

    Moores, L; Kline, J; Portillo, A K; Resano, S; Vicente, A; Arrieta, P; Corres, J; Tapson, V; Yusen, R D; Jiménez, D

    2016-01-01

    ESSENTIALS: When high probability of pulmonary embolism (PE), sensitivity of computed tomography (CT) is unclear. We investigated the sensitivity of multidetector CT among 134 patients with a high probability of PE. A normal CT alone may not safely exclude PE in patients with a high clinical pretest probability. In patients with no clear alternative diagnosis after CTPA, further testing should be strongly considered. Whether patients with a negative multidetector computed tomographic pulmonary angiography (CTPA) result and a high clinical pretest probability of pulmonary embolism (PE) should be further investigated is controversial. This was a prospective investigation of the sensitivity of multidetector CTPA among patients with a priori clinical assessment of a high probability of PE according to the Wells criteria. Among patients with a negative CTPA result, the diagnosis of PE required at least one of the following conditions: ventilation/perfusion lung scan showing a high probability of PE in a patient with no history of PE, abnormal findings on venous ultrasonography in a patient without previous deep vein thrombosis at that site, or the occurrence of venous thromboembolism (VTE) in a 3-month follow-up period after anticoagulation was withheld because of a negative multidetector CTPA result. We identified 498 patients with a priori clinical assessment of a high probability of PE and a completed CTPA study. CTPA excluded PE in 134 patients; in these patients, the pooled incidence of VTE was 5.2% (seven of 134 patients; 95% confidence interval [CI] 1.5-9.0). Five patients had VTEs that were confirmed by an additional imaging test despite a negative CTPA result (five of 48 patients; 10.4%; 95% CI 1.8-19.1), and two patients had objectively confirmed VTEs that occurred during clinical follow-up of at least 3 months (two of 86 patients; 2.3%; 95% CI 0-5.5). None of the patients had a fatal PE during follow-up. A normal multidetector CTPA result alone may not safely exclude PE in patients with a high clinical pretest probability of PE.
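
    The proportions and confidence intervals quoted above follow from a standard normal-approximation (Wald) calculation; the short sketch below reproduces that generic arithmetic and is not the authors' statistical code.

      import math

      def wald_ci(events, n, z=1.96):
          # Proportion with a normal-approximation (Wald) 95% confidence interval.
          p = events / n
          half = z * math.sqrt(p * (1 - p) / n)
          return p, max(0.0, p - half), p + half

      for events, n in [(7, 134), (5, 48), (2, 86)]:
          p, lo, hi = wald_ci(events, n)
          print(f"{events}/{n}: {p:.1%} (95% CI {lo:.1%}-{hi:.1%})")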

  9. Probability via expectation

    CERN Document Server

    Whittle, Peter

    1992-01-01

    This book is a complete revision of the earlier work Probability which appeared in 1970. While revised so radically and incorporating so much new material as to amount to a new text, it preserves both the aim and the approach of the original. That aim was stated as the provision of a 'first text in probability, demanding a reasonable but not extensive knowledge of mathematics, and taking the reader to what one might describe as a good intermediate level'. In doing so it attempted to break away from stereotyped applications, and consider applications of a more novel and significant character. The particular novelty of the approach was that expectation was taken as the prime concept, and the concept of expectation axiomatized rather than that of a probability measure. In the preface to the original text of 1970 (reproduced below, together with that to the Russian edition of 1982) I listed what I saw as the advantages of the approach in as unlaboured a fashion as I could. I also took the view that the text...

  10. The probability outcome correspondence principle: a dispositional view of the interpretation of probability statements

    NARCIS (Netherlands)

    Keren, G.; Teigen, K.H.

    2001-01-01

    This article presents a framework for lay people's internal representations of probabilities, which supposedly reflect the strength of underlying dispositions, or propensities, associated with the predicted event. From this framework, we derive the probability-outcome correspondence principle, which

  11. Poisson Processes in Free Probability

    OpenAIRE

    An, Guimei; Gao, Mingchu

    2015-01-01

    We prove a multidimensional Poisson limit theorem in free probability, and define joint free Poisson distributions in a non-commutative probability space. We define a (compound) free Poisson process explicitly, similar to the definitions of (compound) Poisson processes in classical probability. We prove that the sum of finitely many freely independent compound free Poisson processes is a compound free Poisson process. We give a step-by-step procedure for constructing a (compound) free Poisso...

  12. Solving probability reasoning based on DNA strand displacement and probability modules.

    Science.gov (United States)

    Zhang, Qiang; Wang, Xiaobiao; Wang, Xiaojun; Zhou, Changjun

    2017-12-01

    In computational biology, DNA strand displacement technology is used to simulate the computation process and has shown strong computing ability. Most researchers use it to solve logic problems, but it is only rarely used in probabilistic reasoning. To support probabilistic reasoning, a conditional probability derivation model and a total probability model based on DNA strand displacement were established in this paper. The models were assessed through the game "read your mind." It has been shown to enable the application of probabilistic reasoning in genetic diagnosis. Copyright © 2017 Elsevier Ltd. All rights reserved.

  13. Sensitivity of the probability of failure to probability of detection curve regions

    International Nuclear Information System (INIS)

    Garza, J.; Millwater, H.

    2016-01-01

    Non-destructive inspection (NDI) techniques have been shown to play a vital role in fracture control plans, structural health monitoring, and ensuring availability and reliability of piping, pressure vessels, mechanical and aerospace equipment. Probabilistic fatigue simulations are often used in order to determine the efficacy of an inspection procedure with the NDI method modeled as a probability of detection (POD) curve. These simulations can be used to determine the most advantageous NDI method for a given application. As an aid to this process, a first order sensitivity method of the probability-of-failure (POF) with respect to regions of the POD curve (lower tail, middle region, right tail) is developed and presented here. The sensitivity method computes the partial derivative of the POF with respect to a change in each region of a POD or multiple POD curves. The sensitivities are computed at no cost by reusing the samples from an existing Monte Carlo (MC) analysis. A numerical example is presented considering single and multiple inspections. - Highlights: • Sensitivities of probability-of-failure to a region of probability-of-detection curve. • The sensitivities are computed with negligible cost. • Sensitivities identify the important region of a POD curve. • Sensitivities can be used as a guide to selecting the optimal POD curve.
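
    A hedged sketch of the underlying idea: estimate the probability of failure (POF) from Monte Carlo samples combined with a probability-of-detection (POD) curve, then obtain region-wise sensitivities by reusing the same samples with the POD perturbed only in one region. The crack-size distribution, POD curve and region boundaries are illustrative assumptions, and the finite-difference reuse shown here is a simple stand-in for the paper's first-order method.

      import numpy as np

      rng = np.random.default_rng(3)
      n = 200000
      a = rng.lognormal(mean=0.0, sigma=0.6, size=n)      # assumed crack sizes at inspection
      a_crit = 4.0                                        # assumed critical crack size
      a50, slope = 1.0, 0.4                               # assumed POD curve parameters

      def pod(size):
          # Assumed log-logistic probability-of-detection curve.
          return 1.0 / (1.0 + np.exp(-(np.log(size) - np.log(a50)) / slope))

      def pof(pod_values):
          # A component fails if its crack would become critical AND inspection missed it.
          return np.mean((a > a_crit) * (1.0 - pod_values))

      base = pof(pod(a))
      regions = {"lower tail": (0.0, 1.0), "middle": (1.0, 2.5), "right tail": (2.5, np.inf)}
      eps = 0.01
      for name, (lo, hi) in regions.items():
          perturbed = pod(a) + eps * ((a >= lo) & (a < hi))          # raise POD only in this region
          sens = (pof(np.clip(perturbed, 0.0, 1.0)) - base) / eps    # reuse the same MC samples
          print(f"sensitivity of POF to POD in {name}: {sens:+.2e}")
      print(f"baseline POF: {base:.2e}")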

  14. Truth, possibility and probability new logical foundations of probability and statistical inference

    CERN Document Server

    Chuaqui, R

    1991-01-01

    Anyone involved in the philosophy of science is naturally drawn into the study of the foundations of probability. Different interpretations of probability, based on competing philosophical ideas, lead to different statistical techniques, and frequently to mutually contradictory consequences. This unique book presents a new interpretation of probability, rooted in the traditional interpretation that was current in the 17th and 18th centuries. Mathematical models are constructed based on this interpretation, and statistical inference and decision theory are applied, including some examples in artificial intelligence, solving the main foundational problems. Nonstandard analysis is extensively developed for the construction of the models and in some of the proofs. Many nonstandard theorems are proved, some of them new, in particular, a representation theorem that asserts that any stochastic process can be approximated by a process defined over a space with equiprobable outcomes.

  15. Logic, probability, and human reasoning.

    Science.gov (United States)

    Johnson-Laird, P N; Khemlani, Sangeet S; Goodwin, Geoffrey P

    2015-04-01

    This review addresses the long-standing puzzle of how logic and probability fit together in human reasoning. Many cognitive scientists argue that conventional logic cannot underlie deductions, because it never requires valid conclusions to be withdrawn - not even if they are false; it treats conditional assertions implausibly; and it yields many vapid, although valid, conclusions. A new paradigm of probability logic allows conclusions to be withdrawn and treats conditionals more plausibly, although it does not address the problem of vapidity. The theory of mental models solves all of these problems. It explains how people reason about probabilities and postulates that the machinery for reasoning is itself probabilistic. Recent investigations accordingly suggest a way to integrate probability and deduction. Copyright © 2015 Elsevier Ltd. All rights reserved.

  16. Pre-Aggregation with Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    2006-01-01

    Motivated by the increasing need to analyze complex, uncertain multidimensional data, this paper proposes probabilistic OLAP queries that are computed using probability distributions rather than atomic values. The paper describes how to create probability distributions from base data, and how the distributions can be subsequently used in pre-aggregation. Since the probability distributions can become large, we show how to achieve good time and space efficiency by approximating the distributions. We present the results of several experiments that demonstrate the effectiveness of our methods. The work is motivated with a real-world case study, based on our collaboration with a leading Danish vendor of location-based services. This paper is the first to consider the approximate processing of probabilistic OLAP queries over probability distributions....

  17. Splenic infarcts – when the etiology is multifactorial: MTHFR gene mutation and essential thrombocythemia

    Directory of Open Access Journals (Sweden)

    Marta Pereira

    2016-06-01

    Full Text Available INTRODUCTION: Essential thrombocythemia (ET) is a rare chronic myeloproliferative disease associated with an increased risk of thrombotic events in up to 50% of all patients. In patients with hyperhomocysteinemia associated with a homozygous MTHFR mutation, the risk of thrombotic events is increased by 1-2%. Therefore, the coexistence of these two clinical entities causes an exponential rise in the risk of ischemic phenomena. CASE REPORT: A 56-year-old male, a smoker with previously known dyslipidemia and cerebrovascular disease, was admitted to our hospital for epigastric and left hypochondrium pain of two months' duration. Imaging studies showed splenomegaly and several lesions suggestive of splenic infarction. Laboratory studies revealed leukocytosis (12,900/μL), thrombocythemia (570 x 10(3)/μL), reduced folic acid levels (0.90 ng/mL) and hyperhomocysteinemia (42.5 μmol/L). The MTHFR c.677C>T mutation was positive (homozygous). His bone marrow showed characteristics suggestive of ET, and JAK2 V617F was positive (heterozygous) with bcr-abl negative. Aspirin and hydroxyurea were started, as well as vitamin supplementation, with good response. DISCUSSION: The present case reflects the association between two unusual clinical entities in which thrombotic phenomena are very common, particularly in the vascular territories involved in this patient. We highlight the importance of quick diagnosis and treatment, the main keys to a survival rate similar to that of the general population.

  18. Probability and stochastic modeling

    CERN Document Server

    Rotar, Vladimir I

    2012-01-01

    Basic Notions; Sample Space and Events; Probabilities; Counting Techniques; Independence and Conditional Probability; Independence; Conditioning; The Borel-Cantelli Theorem; Discrete Random Variables; Random Variables and Vectors; Expected Value; Variance and Other Moments; Inequalities for Deviations; Some Basic Distributions; Convergence of Random Variables; The Law of Large Numbers; Conditional Expectation; Generating Functions; Branching Processes; Random Walk Revisited; Branching Processes; Generating Functions; Branching Processes Revisited; More on Random Walk; Markov Chains; Definitions and Examples; Probability Distributions of Markov Chains; The First Step Analysis; Passage Times; Variables Defined on a Markov Chain; Ergodicity and Stationary Distributions; A Classification of States and Ergodicity; Continuous Random Variables; Continuous Distributions; Some Basic Distributions; Continuous Multivariate Distributions; Sums of Independent Random Variables; Conditional Distributions and Expectations; Distributions in the General Case; Simulation; Distribution F...

  19. Transitional Probabilities Are Prioritized over Stimulus/Pattern Probabilities in Auditory Deviance Detection: Memory Basis for Predictive Sound Processing.

    Science.gov (United States)

    Mittag, Maria; Takegata, Rika; Winkler, István

    2016-09-14

    Representations encoding the probabilities of auditory events do not directly support predictive processing. In contrast, information about the probability with which a given sound follows another (transitional probability) allows predictions of upcoming sounds. We tested whether behavioral and cortical auditory deviance detection (the latter indexed by the mismatch negativity event-related potential) relies on probabilities of sound patterns or on transitional probabilities. We presented healthy adult volunteers with three types of rare tone-triplets among frequent standard triplets of high-low-high (H-L-H) or L-H-L pitch structure: proximity deviant (H-H-H/L-L-L), reversal deviant (L-H-L/H-L-H), and first-tone deviant (L-L-H/H-H-L). If deviance detection was based on pattern probability, reversal and first-tone deviants should be detected with similar latency because both differ from the standard at the first pattern position. If deviance detection was based on transitional probabilities, then reversal deviants should be the most difficult to detect because, unlike the other two deviants, they contain no low-probability pitch transitions. The data clearly showed that both behavioral and cortical auditory deviance detection uses transitional probabilities. Thus, the memory traces underlying cortical deviance detection may provide a link between stimulus probability-based change/novelty detectors operating at lower levels of the auditory system and higher auditory cognitive functions that involve predictive processing. Our research presents the first definite evidence for the auditory system prioritizing transitional probabilities over probabilities of individual sensory events. Forming representations for transitional probabilities paves the way for predictions of upcoming sounds. Several recent theories suggest that predictive processing provides the general basis of human perception, including important auditory functions, such as auditory scene analysis. Our

  20. Use of fault tree technique to determine the failure probability of electrical systems of IE class in nuclear installations

    International Nuclear Information System (INIS)

    Cruz S, W.D.

    1988-01-01

    This paper refers to the emergency safety systems of the Angra NPP (Brazil, 1626 MW(e)), such as containment, heat removal, the emergency removal system, removal of radioactive elements from the containment environment, borated water injection, etc. Associated with these systems, the failure probability of the IE Class busbars is calculated; IE Class is a safety classification for electrical equipment essential to the systems mentioned above.

  1. Probability tales

    CERN Document Server

    Grinstead, Charles M; Snell, J Laurie

    2011-01-01

    This book explores four real-world topics through the lens of probability theory. It can be used to supplement a standard text in probability or statistics. Most elementary textbooks present the basic theory and then illustrate the ideas with some neatly packaged examples. Here the authors assume that the reader has seen, or is learning, the basic theory from another book and concentrate in some depth on the following topics: streaks, the stock market, lotteries, and fingerprints. This extended format allows the authors to present multiple approaches to problems and to pursue promising side discussions in ways that would not be possible in a book constrained to cover a fixed set of topics. To keep the main narrative accessible, the authors have placed the more technical mathematical details in appendices. The appendices can be understood by someone who has taken one or two semesters of calculus.

  2. A Tale of Two Probabilities

    Science.gov (United States)

    Falk, Ruma; Kendig, Keith

    2013-01-01

    Two contestants debate the notorious probability problem of the sex of the second child. The conclusions boil down to explication of the underlying scenarios and assumptions. Basic principles of probability theory are highlighted.

  3. Impact of controlling the sum of error probability in the sequential probability ratio test

    Directory of Open Access Journals (Sweden)

    Bijoy Kumarr Pradhan

    2013-05-01

    Full Text Available A generalized modified method is proposed to control the sum of error probabilities in the sequential probability ratio test. The aim is to minimize the weighted average of the two average sample numbers under a simple null hypothesis and a simple alternative hypothesis, subject to the restriction that the sum of the error probabilities is a pre-assigned constant, and thereby to find the optimal sample size. Finally, a comparison is made with the optimal sample size obtained from the fixed sample size procedure. The results are applied to the cases where the random variate follows a normal law as well as a Bernoulli law.
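
    A minimal sketch of the classical Wald sequential probability ratio test for a Bernoulli parameter, with thresholds set from nominal error probabilities; the hypothesized values p0 and p1 and the error rates are illustrative assumptions, and this is the standard SPRT rather than the modified procedure proposed in the paper.

      import math
      import random

      def sprt_bernoulli(stream, p0=0.4, p1=0.6, alpha=0.05, beta=0.05):
          # Classical Wald SPRT for H0: p = p0 versus H1: p = p1 on a Bernoulli stream.
          upper = math.log((1 - beta) / alpha)     # accept H1 once the log-likelihood ratio exceeds this
          lower = math.log(beta / (1 - alpha))     # accept H0 once it falls below this
          llr, n = 0.0, 0
          for x in stream:
              n += 1
              llr += math.log(p1 / p0) if x else math.log((1 - p1) / (1 - p0))
              if llr >= upper:
                  return "accept H1", n
              if llr <= lower:
                  return "accept H0", n
          return "undecided", n

      random.seed(0)
      true_p = 0.6
      decision, n_used = sprt_bernoulli(random.random() < true_p for _ in range(10000))
      print(decision, "after", n_used, "observations")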

  4. Upgrading Probability via Fractions of Events

    Directory of Open Access Journals (Sweden)

    Frič Roman

    2016-08-01

    Full Text Available The influence of "Grundbegriffe" by A. N. Kolmogorov (published in 1933) on education in the area of probability and its impact on research in stochastics cannot be overestimated. We would like to point out three aspects of the classical probability theory "calling for" an upgrade: (i) classical random events are black-and-white (Boolean); (ii) classical random variables do not model quantum phenomena; (iii) basic maps (probability measures) and observables (dual maps to random variables) have very different "mathematical nature". Accordingly, we propose an upgraded probability theory based on Łukasiewicz operations (multivalued logic) on events, elementary category theory, and covering the classical probability theory as a special case. The upgrade can be compared to replacing calculations with integers by calculations with rational (and real) numbers. Namely, to avoid the three objections, we embed the classical (Boolean) random events (represented by the {0, 1}-valued indicator functions of sets) into upgraded random events (represented by measurable [0, 1]-valued functions), the minimal domain of probability containing "fractions" of classical random events, and we upgrade the notions of probability measure and random variable.

  5. A Comprehensive Probability Project for the Upper Division One-Semester Probability Course Using Yahtzee

    Science.gov (United States)

    Wilson, Jason; Lawman, Joshua; Murphy, Rachael; Nelson, Marissa

    2011-01-01

    This article describes a probability project used in an upper division, one-semester probability course with third-semester calculus and linear algebra prerequisites. The student learning outcome focused on developing the skills necessary for approaching project-sized math/stat application problems. These skills include appropriately defining…

  6. Is probability of frequency too narrow?

    International Nuclear Information System (INIS)

    Martz, H.F.

    1993-01-01

    Modern methods of statistical data analysis, such as empirical and hierarchical Bayesian methods, should find increasing use in future Probabilistic Risk Assessment (PRA) applications. In addition, there will be a more formalized use of expert judgment in future PRAs. These methods require an extension of the probabilistic framework of PRA, in particular, the popular notion of probability of frequency, to consideration of frequency of frequency, frequency of probability, and probability of probability. The genesis, interpretation, and examples of these three extended notions are discussed

  7. An Alternative Version of Conditional Probabilities and Bayes' Rule: An Application of Probability Logic

    Science.gov (United States)

    Satake, Eiki; Amato, Philip P.

    2008-01-01

    This paper presents an alternative version of formulas of conditional probabilities and Bayes' rule that demonstrate how the truth table of elementary mathematical logic applies to the derivations of the conditional probabilities of various complex, compound statements. This new approach is used to calculate the prior and posterior probabilities…
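
    As a generic illustration of deriving conditional probabilities and a Bayes'-rule posterior by enumerating elementary outcomes (in the spirit of a truth-table approach), the sketch below uses a made-up joint distribution over two propositions A and B; the numbers are assumptions for illustration and are not taken from the article.

      from fractions import Fraction

      # A tiny "truth table" over two propositions A and B: each row (A, B) gets a probability.
      # The probabilities are illustrative assumptions.
      joint = {
          (True, True): Fraction(3, 10),
          (True, False): Fraction(1, 10),
          (False, True): Fraction(2, 10),
          (False, False): Fraction(4, 10),
      }

      def prob(pred):
          return sum(p for row, p in joint.items() if pred(row))

      p_B = prob(lambda r: r[1])
      p_A = prob(lambda r: r[0])
      p_A_and_B = prob(lambda r: r[0] and r[1])

      p_A_given_B = p_A_and_B / p_B                 # conditional probability read off the table
      p_B_given_A = p_A_and_B / p_A
      posterior = p_B_given_A * p_A / p_B           # Bayes' rule recovers P(A|B)

      print("P(A|B) directly  :", p_A_given_B)      # 3/5
      print("P(A|B) via Bayes :", posterior)        # 3/5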

  8. Normal probability plots with confidence.

    Science.gov (United States)

    Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang

    2015-01-01

    Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means to judge whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal probability plot based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
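
    A rough sketch of an augmented normal probability plot; for simplicity it draws pointwise order-statistic (beta-quantile) bands around a fitted line rather than the simultaneous 1-α intervals constructed in the paper, so it only conveys the general idea.

      import numpy as np
      from scipy import stats
      import matplotlib.pyplot as plt

      rng = np.random.default_rng(4)
      x = np.sort(rng.normal(loc=10.0, scale=2.0, size=50))    # assumed sample
      n = len(x)
      k = np.arange(1, n + 1)
      theo_q = stats.norm.ppf((k - 0.5) / n)                   # theoretical normal quantiles

      # Pointwise 95% bands from the Beta(k, n-k+1) law of uniform order statistics
      # (NOT the simultaneous 1-alpha intervals of the paper).
      lo_u = stats.beta.ppf(0.025, k, n - k + 1)
      hi_u = stats.beta.ppf(0.975, k, n - k + 1)
      mu_hat, sd_hat = x.mean(), x.std(ddof=1)
      lo = mu_hat + sd_hat * stats.norm.ppf(lo_u)
      hi = mu_hat + sd_hat * stats.norm.ppf(hi_u)

      plt.plot(theo_q, x, "o", label="ordered sample")
      plt.plot(theo_q, mu_hat + sd_hat * theo_q, "-", label="fitted line")
      plt.fill_between(theo_q, lo, hi, alpha=0.2, label="pointwise 95% bands")
      plt.xlabel("theoretical normal quantiles"); plt.ylabel("ordered data")
      plt.legend(); plt.savefig("normal_probability_plot.png")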

  9. Free probability and random matrices

    CERN Document Server

    Mingo, James A

    2017-01-01

    This volume opens the world of free probability to a wide variety of readers. From its roots in the theory of operator algebras, free probability has intertwined with non-crossing partitions, random matrices, applications in wireless communications, representation theory of large groups, quantum groups, the invariant subspace problem, large deviations, subfactors, and beyond. This book puts a special emphasis on the relation of free probability to random matrices, but also touches upon the operator algebraic, combinatorial, and analytic aspects of the theory. The book serves as a combination textbook/research monograph, with self-contained chapters, exercises scattered throughout the text, and coverage of important ongoing progress of the theory. It will appeal to graduate students and all mathematicians interested in random matrices and free probability from the point of view of operator algebras, combinatorics, analytic functions, or applications in engineering and statistical physics.

  10. A philosophical essay on probabilities

    CERN Document Server

    Laplace, Marquis de

    1996-01-01

    A classic of science, this famous essay by "the Newton of France" introduces lay readers to the concepts and uses of probability theory. It is of especial interest today as an application of mathematical techniques to problems in social and biological sciences. Generally recognized as the founder of the modern phase of probability theory, Laplace here applies the principles and general results of his theory "to the most important questions of life, which are, in effect, for the most part, problems in probability." Thus, without the use of higher mathematics, he demonstrates the application

  11. Antibacterial activity and composition of essential oils from Pelargonium graveolens L'Her and Vitex agnus-castus L.

    Science.gov (United States)

    Ghannadi, A; Bagherinejad, Mr; Abedi, D; Jalali, M; Absalan, B; Sadeghi, N

    2012-12-01

    Essential oils are volatile compounds that have been used since the Middle Ages as antimicrobial, anti-inflammatory, sedative, local anesthetic and food flavoring agents. In the current study, essential oils of Pelargonium graveolens L'Her and Vitex agnus-castus L. were analyzed for their antibacterial activities. The chemical compositions of the essential oils were characterized by GC-MS. The disc diffusion method was used to study antimicrobial activity. Inhibition zones showed that the essential oils of the two plants were active against all of the studied bacteria (except Listeria monocytogenes). The susceptibility of the strains changed with the dilution of essential oils in DMSO. The pure essential oils showed the most extensive inhibition zones and they were very effective antimicrobial compounds compared to chloramphenicol and amoxicillin. The most susceptible strain against these two essential oils was Staphylococcus aureus. It seems that β-citronellol is a prominent constituent of P. graveolens volatile oil and caryophyllene oxide an important constituent of V. agnus-castus volatile oil, and that their probable synergistic effect with other constituents is responsible for the antibacterial effects of these oils. However, further studies must be performed to confirm the safety of these oils for use as antimicrobial agents and natural preservatives in different products.

  12. Alternative probability theories for cognitive psychology.

    Science.gov (United States)

    Narens, Louis

    2014-01-01

    Various proposals for generalizing event spaces for probability functions have been put forth in the mathematical, scientific, and philosophic literatures. In cognitive psychology such generalizations are used for explaining puzzling results in decision theory and for modeling the influence of context effects. This commentary discusses proposals for generalizing probability theory to event spaces that are not necessarily boolean algebras. Two prominent examples are quantum probability theory, which is based on the set of closed subspaces of a Hilbert space, and topological probability theory, which is based on the set of open sets of a topology. Both have been applied to a variety of cognitive situations. This commentary focuses on how event space properties can influence probability concepts and impact cognitive modeling. Copyright © 2013 Cognitive Science Society, Inc.

  13. Probability theory and mathematical statistics for engineers

    CERN Document Server

    Pugachev, V S

    1984-01-01

    Probability Theory and Mathematical Statistics for Engineers focuses on the concepts of probability theory and mathematical statistics for finite-dimensional random variables.The publication first underscores the probabilities of events, random variables, and numerical characteristics of random variables. Discussions focus on canonical expansions of random vectors, second-order moments of random vectors, generalization of the density concept, entropy of a distribution, direct evaluation of probabilities, and conditional probabilities. The text then examines projections of random vector

  14. Introduction to probability theory with contemporary applications

    CERN Document Server

    Helms, Lester L

    2010-01-01

    This introduction to probability theory transforms a highly abstract subject into a series of coherent concepts. Its extensive discussions and clear examples, written in plain language, expose students to the rules and methods of probability. Suitable for an introductory probability course, this volume requires abstract and conceptual thinking skills and a background in calculus.Topics include classical probability, set theory, axioms, probability functions, random and independent random variables, expected values, and covariance and correlations. Additional subjects include stochastic process

  15. Probability in quantum mechanics

    Directory of Open Access Journals (Sweden)

    J. G. Gilson

    1982-01-01

    Full Text Available By using a fluid theory which is an alternative to quantum theory but from which the latter can be deduced exactly, the long-standing problem of how quantum mechanics is related to stochastic processes is studied. It can be seen how the Schrödinger probability density has a relationship to time spent on small sections of an orbit, just as the probability density has in some classical contexts.

  16. Pre-Service Teachers' Conceptions of Probability

    Science.gov (United States)

    Odafe, Victor U.

    2011-01-01

    Probability knowledge and skills are needed in science and in making daily decisions that are sometimes made under uncertain conditions. Hence, there is the need to ensure that the pre-service teachers of our children are well prepared to teach probability. Pre-service teachers' conceptions of probability are identified, and ways of helping them…

  17. Nuclear data uncertainties: I, Basic concepts of probability

    Energy Technology Data Exchange (ETDEWEB)

    Smith, D.L.

    1988-12-01

    Some basic concepts of probability theory are presented from a nuclear-data perspective, in order to provide a foundation for thorough understanding of the role of uncertainties in nuclear data research. Topics included in this report are: events, event spaces, calculus of events, randomness, random variables, random-variable distributions, intuitive and axiomatic probability, calculus of probability, conditional probability and independence, probability distributions, binomial and multinomial probability, Poisson and interval probability, normal probability, the relationships existing between these probability laws, and Bayes' theorem. This treatment emphasizes the practical application of basic mathematical concepts to nuclear data research, and it includes numerous simple examples. 34 refs.

  18. Nuclear data uncertainties: I, Basic concepts of probability

    International Nuclear Information System (INIS)

    Smith, D.L.

    1988-12-01

    Some basic concepts of probability theory are presented from a nuclear-data perspective, in order to provide a foundation for thorough understanding of the role of uncertainties in nuclear data research. Topics included in this report are: events, event spaces, calculus of events, randomness, random variables, random-variable distributions, intuitive and axiomatic probability, calculus of probability, conditional probability and independence, probability distributions, binomial and multinomial probability, Poisson and interval probability, normal probability, the relationships existing between these probability laws, and Bayes' theorem. This treatment emphasizes the practical application of basic mathematical concepts to nuclear data research, and it includes numerous simple examples. 34 refs

  19. Astronomy essentials

    CERN Document Server

    Brass, Charles O

    2012-01-01

    REA's Essentials provide quick and easy access to critical information in a variety of different fields, ranging from the most basic to the most advanced. As its name implies, these concise, comprehensive study guides summarize the essentials of the field covered. Essentials are helpful when preparing for exams, doing homework and will remain a lasting reference source for students, teachers, and professionals. Astronomy includes the historical perspective of astronomy, sky basics and the celestial coordinate systems, a model and the origin of the solar system, the sun, the planets, Kepler'

  20. Anti-inflammatory and anti-allergic properties of the essential oil and active compounds from Cordia verbenacea.

    Science.gov (United States)

    Passos, Giselle F; Fernandes, Elizabeth S; da Cunha, Fernanda M; Ferreira, Juliano; Pianowski, Luiz F; Campos, Maria M; Calixto, João B

    2007-03-21

    The anti-inflammatory and anti-allergic effects of the essential oil of Cordia verbenacea (Boraginaceae) and some of its active compounds were evaluated. Systemic treatment with the essential oil of Cordia verbenacea (300-600 mg/kg, p.o.) reduced carrageenan-induced rat paw oedema, myeloperoxidase activity and the mouse oedema elicited by carrageenan, bradykinin, substance P, histamine and platelet-activating factor. It also prevented carrageenan-evoked exudation and the neutrophil influx to the rat pleura and the neutrophil migration into carrageenan-stimulated mouse air pouches. Moreover, Cordia verbenacea oil inhibited the oedema caused by Apis mellifera venom or ovalbumin in sensitized rats and ovalbumin-evoked allergic pleurisy. The essential oil significantly decreased TNFalpha, without affecting IL-1beta production, in carrageenan-injected rat paws. Neither the PGE(2) formation after intrapleural injection of carrageenan nor the COX-1 or COX-2 activities in vitro were affected by the essential oil. Of particular interest, the paw oedema induced by carrageenan in mice was markedly inhibited by both sesquiterpenic compounds obtained from the essential oil: alpha-humulene and trans-caryophyllene (50 mg/kg, p.o.). Collectively, the present results showed marked anti-inflammatory effects for the essential oil of Cordia verbenacea and some active compounds, probably by interfering with TNFalpha production. Cordia verbenacea essential oil or its constituents might represent new therapeutic options for the treatment of inflammatory diseases.

  1. Economic choices reveal probability distortion in macaque monkeys.

    Science.gov (United States)

    Stauffer, William R; Lak, Armin; Bossaerts, Peter; Schultz, Wolfram

    2015-02-18

    Economic choices are largely determined by two principal elements, reward value (utility) and probability. Although nonlinear utility functions have been acknowledged for centuries, nonlinear probability weighting (probability distortion) was only recently recognized as a ubiquitous aspect of real-world choice behavior. Even when outcome probabilities are known and acknowledged, human decision makers often overweight low probability outcomes and underweight high probability outcomes. Whereas recent studies measured utility functions and their corresponding neural correlates in monkeys, it is not known whether monkeys distort probability in a manner similar to humans. Therefore, we investigated economic choices in macaque monkeys for evidence of probability distortion. We trained two monkeys to predict reward from probabilistic gambles with constant outcome values (0.5 ml or nothing). The probability of winning was conveyed using explicit visual cues (sector stimuli). Choices between the gambles revealed that the monkeys used the explicit probability information to make meaningful decisions. Using these cues, we measured probability distortion from choices between the gambles and safe rewards. Parametric modeling of the choices revealed classic probability weighting functions with inverted-S shape. Therefore, the animals overweighted low probability rewards and underweighted high probability rewards. Empirical investigation of the behavior verified that the choices were best explained by a combination of nonlinear value and nonlinear probability distortion. Together, these results suggest that probability distortion may reflect evolutionarily preserved neuronal processing. Copyright © 2015 Stauffer et al.
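
    The inverted-S weighting the authors estimate is commonly parameterized with the one-parameter Tversky-Kahneman form; the sketch below uses that textbook form with an arbitrary curvature parameter, not the values fitted in the study.

        # Tversky-Kahneman one-parameter probability weighting function.
        # gamma < 1 yields the inverted-S shape: low probabilities are overweighted
        # and high probabilities underweighted. The gamma value is illustrative.

        def weight(p, gamma=0.61):
            return p**gamma / (p**gamma + (1.0 - p)**gamma) ** (1.0 / gamma)

        for p in (0.01, 0.10, 0.50, 0.90, 0.99):
            print(f"p = {p:.2f}  ->  w(p) = {weight(p):.3f}")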

  2. Expected utility with lower probabilities

    DEFF Research Database (Denmark)

    Hendon, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte

    1994-01-01

    An uncertain and not just risky situation may be modeled using so-called belief functions assigning lower probabilities to subsets of outcomes. In this article we extend the von Neumann-Morgenstern expected utility theory from probability measures to belief functions. We use this theory...

  3. Striatal activity is modulated by target probability.

    Science.gov (United States)

    Hon, Nicholas

    2017-06-14

    Target probability has well-known neural effects. In the brain, target probability is known to affect frontal activity, with lower probability targets producing more prefrontal activation than those that occur with higher probability. Although the effect of target probability on cortical activity is well specified, its effect on subcortical structures such as the striatum is less well understood. Here, I examined this issue and found that the striatum was highly responsive to target probability. This is consistent with its hypothesized role in the gating of salient information into higher-order task representations. The current data are interpreted in light of the fact that different components of the striatum are sensitive to different types of task-relevant information.

  4. Failure probability analysis of optical grid

    Science.gov (United States)

    Zhong, Yaoquan; Guo, Wei; Sun, Weiqiang; Jin, Yaohui; Hu, Weisheng

    2008-11-01

    Optical grid, the integrated computing environment based on optical networks, is expected to be an efficient infrastructure to support advanced data-intensive grid applications. In an optical grid, faults of both computational and network resources are inevitable due to the large scale and high complexity of the system. As optical-network-based distributed computing systems are applied extensively to data processing, the application failure probability has become an important indicator of application quality and an important aspect that operators consider. This paper presents a task-based method for analyzing the application failure probability in an optical grid. The failure probability of the entire application can then be quantified, and the reduction in application failure probability achieved by different backup strategies can be compared, so that the different requirements of different clients can be satisfied. When an application described by a DAG (directed acyclic graph) is executed under different backup strategies, the application failure probability and the application completion time differ. This paper proposes a new multi-objective differentiated services algorithm (MDSA). The new application scheduling algorithm can guarantee the required failure probability and improve network resource utilization, realizing a compromise between the network operator and the application submitter. Differentiated services can thus be achieved in an optical grid.
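
    As a hedged illustration of the kind of task-based calculation the abstract describes (not the MDSA algorithm itself), the sketch below treats an application as a set of independent tasks that must all succeed and shows how assigning a backup resource to a task lowers the overall application failure probability. All task probabilities are hypothetical.

        # Sketch: application failure probability over independent tasks.
        # A task with a backup fails only if both primary and backup fail.
        # Values are hypothetical; this is not the MDSA algorithm from the record.

        def app_failure_probability(task_fail_probs, backed_up=()):
            p_success = 1.0
            for i, p in enumerate(task_fail_probs):
                p_task_fail = p * p if i in backed_up else p   # independent backup copy
                p_success *= 1.0 - p_task_fail
            return 1.0 - p_success

        tasks = [0.02, 0.05, 0.01, 0.03]                       # per-task failure probabilities
        print(app_failure_probability(tasks))                  # no backups
        print(app_failure_probability(tasks, backed_up={1}))   # back up the riskiest task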

  5. An Objective Theory of Probability (Routledge Revivals)

    CERN Document Server

    Gillies, Donald

    2012-01-01

    This reissue of D. A. Gillies' highly influential work, first published in 1973, is a philosophical theory of probability which seeks to develop von Mises' views on the subject. In agreement with von Mises, the author regards probability theory as a mathematical science like mechanics or electrodynamics, and probability as an objective, measurable concept like force, mass or charge. On the other hand, Dr Gillies rejects von Mises' definition of probability in terms of limiting frequency and claims that probability should be taken as a primitive or undefined term in accordance with modern axioma

  6. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2014-01-01

    The long-awaited revision of Fundamentals of Applied Probability and Random Processes expands on the central components that made the first edition a classic. The title is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability t

  7. Optimize the Coverage Probability of Prediction Interval for Anomaly Detection of Sensor-Based Monitoring Series

    Directory of Open Access Journals (Sweden)

    Jingyue Pang

    2018-03-01

    Full Text Available Effective anomaly detection of sensing data is essential for identifying potential system failures. Because they require no prior knowledge or accumulated labels, and provide a representation of uncertainty, probability prediction methods (e.g., Gaussian process regression (GPR) and relevance vector machine (RVM)) are especially adaptable to anomaly detection for sensing series. Generally, one key parameter of prediction models is the coverage probability (CP), which controls the judging threshold of the testing sample and is generally set to a default value (e.g., 90% or 95%). There are few criteria for determining the optimal CP for anomaly detection. Therefore, this paper designs a graphic indicator, the receiver operating characteristic curve of prediction interval (ROC-PI), based on the definition of the ROC curve, which can depict the trade-off between the PI width and PI coverage probability across a series of cut-off points. Furthermore, the Youden index is modified to assess the performance of different CPs, by the minimization of which the optimal CP is derived by the simulated annealing (SA) algorithm. Experiments conducted on two simulation datasets demonstrate the validity of the proposed method. In particular, an actual case study on sensing series from an on-orbit satellite illustrates its significant performance in practical application.
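
    A much-simplified sketch of the underlying idea — scan candidate coverage probabilities and score how well the resulting prediction intervals separate anomalies from normal points — is given below. It replaces the paper's modified Youden index and simulated annealing search with the ordinary Youden index and a plain grid search; the simulated data and names are illustrative.

        # Sketch: choose the coverage probability (CP) of a Gaussian prediction interval
        # for anomaly detection by maximizing the Youden index J = sensitivity + specificity - 1.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        mu = np.zeros(2000)                          # predictive means (toy model)
        y = rng.normal(size=2000)                    # observations with unit predictive std
        anomaly = rng.random(2000) < 0.05            # 5% injected anomalies
        y[anomaly] += 4.0                            # anomalies sit about 4 sigma away

        def youden(cp):
            z = stats.norm.ppf(0.5 + cp / 2)         # half-width of the CP interval
            flagged = np.abs(y - mu) > z             # outside the interval => flagged anomalous
            sens = flagged[anomaly].mean()
            spec = (~flagged[~anomaly]).mean()
            return sens + spec - 1.0

        grid = np.linspace(0.80, 0.995, 40)
        best = grid[np.argmax([youden(cp) for cp in grid])]
        print("selected coverage probability:", round(best, 3))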

  8. Defining Probability in Sex Offender Risk Assessment.

    Science.gov (United States)

    Elwood, Richard W

    2016-12-01

    There is ongoing debate and confusion over using actuarial scales to predict individuals' risk of sexual recidivism. Much of the debate comes from not distinguishing Frequentist from Bayesian definitions of probability. Much of the confusion comes from applying Frequentist probability to individuals' risk. By definition, only Bayesian probability can be applied to the single case. The Bayesian concept of probability resolves most of the confusion and much of the debate in sex offender risk assessment. Although Bayesian probability is well accepted in risk assessment generally, it has not been widely used to assess the risk of sex offenders. I review the two concepts of probability and show how the Bayesian view alone provides a coherent scheme to conceptualize individuals' risk of sexual recidivism.
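
    A minimal sketch of the Bayesian move advocated above — updating a base rate for an individual case with a likelihood ratio attached to an actuarial result — is shown below in odds form. The base rate and likelihood ratio are invented solely for illustration.

        # Sketch: Bayesian single-case risk estimate in odds form.
        # posterior_odds = prior_odds * likelihood_ratio; all numbers are hypothetical.

        def bayes_update(base_rate, likelihood_ratio):
            prior_odds = base_rate / (1.0 - base_rate)
            post_odds = prior_odds * likelihood_ratio
            return post_odds / (1.0 + post_odds)

        # e.g. a 10% base rate and an actuarial result carrying a likelihood ratio of 3
        print(round(bayes_update(0.10, 3.0), 3))   # 0.25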

  9. UT Biomedical Informatics Lab (BMIL) probability wheel

    Science.gov (United States)

    Huang, Sheng-Cheng; Lee, Sara; Wang, Allen; Cantor, Scott B.; Sun, Clement; Fan, Kaili; Reece, Gregory P.; Kim, Min Soon; Markey, Mia K.

    A probability wheel app is intended to facilitate communication between two people, an "investigator" and a "participant", about uncertainties inherent in decision-making. Traditionally, a probability wheel is a mechanical prop with two colored slices. A user adjusts the sizes of the slices to indicate the relative value of the probabilities assigned to them. A probability wheel can improve the adjustment process and attenuate the effect of anchoring bias when it is used to estimate or communicate probabilities of outcomes. The goal of this work was to develop a mobile application of the probability wheel that is portable, easily available, and more versatile. We provide a motivating example from medical decision-making, but the tool is widely applicable for researchers in the decision sciences.

  10. Quantum computing and probability.

    Science.gov (United States)

    Ferry, David K

    2009-11-25

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.

  11. Quantum computing and probability

    International Nuclear Information System (INIS)

    Ferry, David K

    2009-01-01

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction. (viewpoint)

  12. Pathogenetic Role of JAK2 V617F Mutation in Chronic Myeloproliferative Disorders

    Directory of Open Access Journals (Sweden)

    Hui-Chi Hsu

    2007-03-01

    Full Text Available The molecular pathogenesis of chronic myeloproliferative disorders (MPDs) is poorly understood. The hematopoietic progenitor cells of patients with polycythemia vera (PV) or essential thrombocythemia (ET) are characterized by hypersensitivity to hematopoietic growth factors and formation of endogenous erythroid colonies. Recently, 4 groups almost simultaneously reported the Janus kinase 2 (JAK2) V617F mutation in more than 80% of PV patients, 30% of patients with ET and in about 50% of patients with idiopathic myelofibrosis. The identification of the JAK2 mutation represents a major advance in the understanding of the molecular pathogenesis of MPDs that will likely permit a new classification and the development of novel therapeutic strategies for these diseases.

  13. [Biometric bases: basic concepts of probability calculation].

    Science.gov (United States)

    Dinya, E

    1998-04-26

    The author gives an outline of the basic concepts of probability theory. The bases of event algebra, the definition of probability, the classical probability model and the random variable are presented.

  14. Pre-aggregation for Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    Motivated by the increasing need to analyze complex uncertain multidimensional data (e.g., in order to optimize and personalize location-based services), this paper proposes novel types of probabilistic OLAP queries that operate on aggregate values that are probability distributions... and the techniques to process these queries. The paper also presents the methods for computing the probability distributions, which enables pre-aggregation, and for using the pre-aggregated distributions for further aggregation. In order to achieve good time and space efficiency, the methods perform approximate... multidimensional data analysis that is considered in this paper (i.e., approximate processing of probabilistic OLAP queries over probability distributions)...

  15. Contributions to quantum probability

    International Nuclear Information System (INIS)

    Fritz, Tobias

    2010-01-01

    Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that, generally, none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels "possible to occur" or "impossible to occur" to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a finite set can occur as the outcome

  16. Probability inequalities for decomposition integrals

    Czech Academy of Sciences Publication Activity Database

    Agahi, H.; Mesiar, Radko

    2017-01-01

    Vol. 315, No. 1 (2017), p. 240-248 ISSN 0377-0427 Institutional support: RVO:67985556 Keywords: Decomposition integral * Superdecomposition integral * Probability inequalities Subject RIV: BA - General Mathematics OECD field: Statistics and probability Impact factor: 1.357, year: 2016 http://library.utia.cas.cz/separaty/2017/E/mesiar-0470959.pdf

  17. Probability, conditional probability and complementary cumulative distribution functions in performance assessment for radioactive waste disposal

    International Nuclear Information System (INIS)

    Helton, J.C.

    1996-03-01

    A formal description of the structure of several recent performance assessments (PAs) for the Waste Isolation Pilot Plant (WIPP) is given in terms of the following three components: a probability space (S_st, L_st, p_st) for stochastic uncertainty, a probability space (S_su, L_su, p_su) for subjective uncertainty and a function (i.e., a random variable) defined on the product space associated with (S_st, L_st, p_st) and (S_su, L_su, p_su). The explicit recognition of the existence of these three components allows a careful description of the use of probability, conditional probability and complementary cumulative distribution functions within the WIPP PA. This usage is illustrated in the context of the U.S. Environmental Protection Agency's standard for the geologic disposal of radioactive waste (40 CFR 191, Subpart B). The paradigm described in this presentation can also be used to impose a logically consistent structure on PAs for other complex systems.

  18. Probability, conditional probability and complementary cumulative distribution functions in performance assessment for radioactive waste disposal

    International Nuclear Information System (INIS)

    Helton, J.C.

    1996-01-01

    A formal description of the structure of several recent performance assessments (PAs) for the Waste Isolation Pilot Plant (WIPP) is given in terms of the following three components: a probability space (S_st, L_st, P_st) for stochastic uncertainty, a probability space (S_su, L_su, P_su) for subjective uncertainty and a function (i.e., a random variable) defined on the product space associated with (S_st, L_st, P_st) and (S_su, L_su, P_su). The explicit recognition of the existence of these three components allows a careful description of the use of probability, conditional probability and complementary cumulative distribution functions within the WIPP PA. This usage is illustrated in the context of the US Environmental Protection Agency's standard for the geologic disposal of radioactive waste (40 CFR 191, Subpart B). The paradigm described in this presentation can also be used to impose a logically consistent structure on PAs for other complex systems.
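
    A hedged sketch of the two-level structure described in the two records above — an outer loop over subjective (epistemic) uncertainty and an inner loop over stochastic (aleatory) uncertainty, each inner loop yielding one conditional complementary cumulative distribution function (CCDF) — is given below. The toy consequence model and all distributions are invented for illustration and are not the WIPP PA models.

        # Sketch: family of conditional CCDFs from nested sampling of a probability space
        # for subjective uncertainty (outer loop) and one for stochastic uncertainty (inner loop).
        import numpy as np

        rng = np.random.default_rng(0)
        levels = np.linspace(0.0, 10.0, 101)         # consequence levels for the CCDF

        def ccdf(samples, levels):
            """Estimated P(consequence > level) from inner-loop samples."""
            return np.array([(samples > c).mean() for c in levels])

        family = []
        for _ in range(50):                          # outer loop: subjective uncertainty
            rate = rng.uniform(0.5, 2.0)             # epistemically uncertain parameter
            consequences = rng.exponential(scale=rate, size=1000)   # stochastic inner loop
            family.append(ccdf(consequences, levels))

        family = np.array(family)
        mean_ccdf = family.mean(axis=0)              # summaries over subjective uncertainty
        p95_ccdf = np.percentile(family, 95, axis=0)
        print(mean_ccdf[10], p95_ccdf[10])           # exceedance probabilities at level 1.0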

  19. Void probability scaling in hadron nucleus interactions

    International Nuclear Information System (INIS)

    Ghosh, Dipak; Deb, Argha; Bhattacharyya, Swarnapratim; Ghosh, Jayita; Bandyopadhyay, Prabhat; Das, Rupa; Mukherjee, Sima

    2002-01-01

    Heygi, while investigating the rapidity gap probability (which measures the chance of finding no particle in the pseudo-rapidity interval Δη), found that a scaling behavior in the rapidity gap probability has a close correspondence with the scaling of the void probability in galaxy correlation studies. The main aim of this paper is to study the scaling behavior of the rapidity gap probability.

  20. Paraconsistent Probabilities: Consistency, Contradictions and Bayes’ Theorem

    Directory of Open Access Journals (Sweden)

    Juliana Bueno-Soler

    2016-09-01

    Full Text Available This paper represents the first steps towards constructing a paraconsistent theory of probability based on the Logics of Formal Inconsistency (LFIs). We show that LFIs encode very naturally an extension of the notion of probability able to express sophisticated probabilistic reasoning under contradictions employing appropriate notions of conditional probability and paraconsistent updating, via a version of Bayes’ theorem for conditionalization. We argue that the dissimilarity between the notions of inconsistency and contradiction, one of the pillars of LFIs, plays a central role in our extended notion of probability. Some critical historical and conceptual points about probability theory are also reviewed.

  1. Exploring non-signalling polytopes with negative probability

    International Nuclear Information System (INIS)

    Oas, G; Barros, J Acacio de; Carvalhaes, C

    2014-01-01

    Bipartite and tripartite EPR–Bell type systems are examined via joint quasi-probability distributions where probabilities are permitted to be negative. It is shown that such distributions exist only when the no-signalling condition is satisfied. A characteristic measure, the probability mass, is introduced and, via its minimization, limits the number of quasi-distributions describing a given marginal probability distribution. The minimized probability mass is shown to be an alternative way to characterize non-local systems. Non-signalling polytopes for two to eight settings in the bipartite scenario are examined and compared to prior work. Examining perfect cloning of non-local systems within the tripartite scenario suggests defining two categories of signalling. It is seen that many properties of non-local systems can be efficiently described by quasi-probability theory. (paper)

  2. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2005-01-01

    This book is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability to real-world problems, and introduce the basics of statistics. The book's clear writing style and homework problems make it ideal for the classroom or for self-study. * Good and solid introduction to probability theory and stochastic processes * Logically organized; writing is presented in a clear manner * Choice of topics is comprehensive within the area of probability * Ample homework problems are organized into chapter sections

  3. The enigma of probability and physics

    International Nuclear Information System (INIS)

    Mayants, L.

    1984-01-01

    This volume contains a coherent exposition of the elements of two unique sciences: probabilistics (science of probability) and probabilistic physics (application of probabilistics to physics). Proceeding from a key methodological principle, it starts with the disclosure of the true content of probability and the interrelation between probability theory and experimental statistics. This makes it possible to introduce a proper order in all the sciences dealing with probability and, by conceiving the real content of statistical mechanics and quantum mechanics in particular, to construct both as two interconnected domains of probabilistic physics. Consistent theories of kinetics of physical transformations, decay processes, and intramolecular rearrangements are also outlined. The interrelation between the electromagnetic field, photons, and the theoretically discovered subatomic particle 'emon' is considered. Numerous internal imperfections of conventional probability theory, statistical physics, and quantum physics are exposed and removed - quantum physics no longer needs special interpretation. EPR, Bohm, and Bell paradoxes are easily resolved, among others. (Auth.)

  4. Janus kinase inhibitors: jackpot or potluck?

    Directory of Open Access Journals (Sweden)

    Pavithran Keechilat

    2012-06-01

    Full Text Available The reports of a unique mutation in the Janus kinase-2 gene (JAK2) in polycythemia vera by several independent groups in 2005 quickly spurred the development of the Janus kinase inhibitors. In one of the great victories of translational research in recent times, the first small-molecule Janus kinase inhibitor, ruxolitinib, entered a phase I trial in 2007. With the approval of ruxolitinib by the US Food and Drug Administration in November 2011 for high-risk and intermediate-2 risk myelofibrosis, a change in paradigm has occurred in the management of a subset of myeloproliferative neoplasms (MPNs): primary myelofibrosis, post-polycythemia vera myelofibrosis, and post-essential thrombocythemia myelofibrosis. Whereas the current evidence for ruxolitinib only covers high-risk and intermediate-2 risk myelofibrosis, inhibitors with greater potency are likely to offer better disease control and survival advantage in patients belonging to these categories, and possibly to the low-risk and intermediate-1 risk categories of MPN as well. But use of the Janus kinase inhibitors also probably has certain disadvantages, such as toxicity, resistance, withdrawal phenomenon, non-reversal of histology, and an implausible goal of disease clone eradication, some of which could offset the gains. In spite of this, Janus kinase inhibitors are here to stay, and for use in more than just myeloproliferative neoplasms.

  5. F.Y. Edgeworth’s Treatise on Probabilities

    OpenAIRE

    Alberto Baccini

    2007-01-01

    Probability theory has a central role in Edgeworth’s thought; this paper examines the philosophical foundation of the theory. Starting from a frequentist position, Edgeworth introduced some innovations on the definition of primitive probabilities. He distinguished between primitive probabilities based on experience of statistical evidence, and primitive a priori probabilities based on a more general and less precise kind of experience, inherited by the human race through evolution. Given prim...

  6. Approximation methods in probability theory

    CERN Document Server

    Čekanavičius, Vydas

    2016-01-01

    This book presents a wide range of well-known and less common methods used for estimating the accuracy of probabilistic approximations, including the Esseen type inversion formulas, the Stein method as well as the methods of convolutions and triangle function. Emphasising the correct usage of the methods presented, each step required for the proofs is examined in detail. As a result, this textbook provides valuable tools for proving approximation theorems. While Approximation Methods in Probability Theory will appeal to everyone interested in limit theorems of probability theory, the book is particularly aimed at graduate students who have completed a standard intermediate course in probability theory. Furthermore, experienced researchers wanting to enlarge their toolkit will also find this book useful.

  7. Effect of Origanum heracleoticum L. essential oil on food-borne Penicillium aurantiogriseum and Penicilium chrysogenum isolates

    Directory of Open Access Journals (Sweden)

    Čabarkapa Ivana S.

    2011-01-01

    Full Text Available Molds are ubiquitously distributed in nature and their spores can be found in the atmosphere even at high altitudes. The difficulty of controlling these undesirable molds, as well as the growing interest of consumers in natural products, has been forcing the industry to find new alternatives for food preservation. The modern trends in nutrition suggest the limitation of synthetic food additives or substitution with natural ones. Aromatic herbs are probably the most important source of natural antimicrobial agents. Origanum heracleoticum L. essential oil has been known as an interesting source of antimicrobial compounds to be applied in food preservation. In this work, we have investigated the effect of essential oil obtained from O. heracleoticum on the growth of six isolates of Penicillium aurantiogriseum and four isolates of Penicillium chrysogenum isolated from a meat plant for traditional Petrovacka sausage (Petrovská klobása) production. The findings reveal that the essential oil of O. heracleoticum inhibits all of the fungal isolates tested. O. heracleoticum L. essential oil exhibited higher antifungal activity against the isolates of P. chrysogenum than against the isolates of P. aurantiogriseum. O. heracleoticum essential oil showed MIC values ranging from 25 to 100 μL/mL. The fungi cultivated in the medium with higher concentrations of essential oil showed certain morphological changes. The alterations included lack of sporulation and loss of pigmentation.

  8. Introduction to imprecise probabilities

    CERN Document Server

    Augustin, Thomas; de Cooman, Gert; Troffaes, Matthias C M

    2014-01-01

    In recent years, the theory has become widely accepted and has been further developed, but a detailed introduction is needed in order to make the material available and accessible to a wide audience. This will be the first book providing such an introduction, covering core theory and recent developments which can be applied to many application areas. All authors of individual chapters are leading researchers on the specific topics, assuring high quality and up-to-date contents. An Introduction to Imprecise Probabilities provides a comprehensive introduction to imprecise probabilities, includin

  9. Classic Problems of Probability

    CERN Document Server

    Gorroochurn, Prakash

    2012-01-01

    "A great book, one that I will certainly add to my personal library."—Paul J. Nahin, Professor Emeritus of Electrical Engineering, University of New Hampshire Classic Problems of Probability presents a lively account of the most intriguing aspects of statistics. The book features a large collection of more than thirty classic probability problems which have been carefully selected for their interesting history, the way they have shaped the field, and their counterintuitive nature. From Cardano's 1564 Games of Chance to Jacob Bernoulli's 1713 Golden Theorem to Parrondo's 1996 Perplexin

  10. Radiation risk of tissue late effects, a net consequence of probabilities of various cellular responses

    International Nuclear Information System (INIS)

    Feinendegen, L.E.

    1991-01-01

    Late effects from the exposure to low doses of ionizing radiation are hardly or not at all observed in man, mainly due to the low values of risk coefficients that preclude statistical analyses of data from populations that are exposed to doses less than 0.2 Gy. In order to arrive at an assessment of potential risk from radiation exposure in the low dose range, the microdosimetry approach is essential. In the low dose range, ionizing radiation generates particle tracks, mainly electrons, which are distributed rather heterogeneously within the exposed tissue. Taking the individual cell as the elemental unit of life, observations and calculations of cellular responses to being hit by energy deposition events of the low LET type are analysed. It emerges that besides the probability that a hit cell sustains a detrimental effect with the consequence of malignant transformation, there are probabilities of various adaptive responses that equip the hit cell with a benefit. On the one hand, an improvement of cellular radical detoxification was observed in mouse bone marrow cells; another adaptive response, pertaining to improved DNA repair, was reported for human lymphocytes. The improved radical detoxification in mouse bone marrow cells lasts for a period of 5-10 hours and improved DNA repair in human lymphocytes was seen for some 60 hours following acute irradiation. It is speculated that improved radical detoxification and improved DNA repair may reduce the probability of spontaneous carcinogenesis. Thus it is proposed to weigh the probability of detriment for a hit cell within a multicellular system against the probability of benefit through adaptive responses in other hit cells in the same system per radiation exposure. In doing this, the net effect of low doses of low LET radiation in tissue with individual cells being hit by energy deposition events could be zero or even beneficial. (orig./MG)

  11. Policy on synthetic biology: deliberation, probability, and the precautionary paradox.

    Science.gov (United States)

    Wareham, Christopher; Nardini, Cecilia

    2015-02-01

    Synthetic biology is a cutting-edge area of research that holds the promise of unprecedented health benefits. However, in tandem with these large prospective benefits, synthetic biology projects entail a risk of catastrophic consequences whose severity may exceed that of most ordinary human undertakings. This is due to the peculiar nature of synthetic biology as a 'threshold technology' which opens doors to opportunities and applications that are essentially unpredictable. Fears about these potentially unstoppable consequences have led to declarations from civil society groups calling for the use of a precautionary principle to regulate the field. Moreover, the principle is prevalent in law and international agreements. Despite widespread political recognition of a need for caution, the precautionary principle has been extensively criticized as a guide for regulatory policy. We examine a central objection to the principle: that its application entails crippling inaction and incoherence, since whatever action one takes there is always a chance that some highly improbable cataclysm will occur. In response to this difficulty, which we call the 'precautionary paradox,' we outline a deliberative means for arriving at a threshold of probability below which potential dangers can be disregarded. In addition, we describe a Bayesian mechanism with which to assign probabilities to harmful outcomes. We argue that these steps resolve the paradox. The rehabilitated PP can thus provide a viable policy option to confront the uncharted waters of synthetic biology research. © 2013 John Wiley & Sons Ltd.

  12. Inflorescence and leaves essential oil composition of hydroponically grown Ocimum basilicum L

    Directory of Open Access Journals (Sweden)

    MOHAMMAD BAGHER HASSANPOURAGHDAM

    2010-10-01

    Full Text Available In order to characterize the essential oils of leaves and inflorescences, water-distilled volatile oils of hydroponically grown Ocimum basilicum L. were analyzed by GC/EI-MS. Fifty components were identified in the inflorescence and leaf essential oils of the basil plants, accounting for 98.8 and 99.9 % of the total quantified components respectively. Phenylpropanoids (37.7 % for the inflorescence vs. 58.3 % for the leaves) were the predominant class of oil constituents, followed by sesquiterpenes (33.3 vs. 19.4 %) and monoterpenes (27.7 vs. 22.1 %). Of the monoterpenoid compounds, oxygenated monoterpenes (25.2 vs. 18.9 %) were the main subclass. Sesquiterpene hydrocarbons (25 vs. 15.9 %) were the main subclass of sesquiterpenoidal compounds. Methyl chavicol, a phenylpropane derivative (37.2 vs. 56.7 %), was the principal component of both organ oils, making up to 38 and 57 % of the total identified components of the inflorescence and leaf essential oils, respectively. Linalool (21.1 vs. 13.1 %) was the second most abundant component, followed by α-cadinol (6.1 vs. 3 %), germacrene D (6.1 vs. 2.7 %) and 1,8-cineole (2.4 vs. 3.5 %). There were significant quantitative but very small qualitative differences between the two oils. In total, considering the previous reports, it seems that the essential oil of hydroponically grown O. basilicum L. had volatile constituents comparable with those of field-grown counterparts, probably with potential applicability in the pharmaceutical and food industries.

  13. Contributions to quantum probability

    Energy Technology Data Exchange (ETDEWEB)

    Fritz, Tobias

    2010-06-25

    Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that, generally, none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels "possible to occur" or "impossible to occur" to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a

  14. Assessing the clinical probability of pulmonary embolism

    International Nuclear Information System (INIS)

    Miniati, M.; Pistolesi, M.

    2001-01-01

    Clinical assessment is a cornerstone of the recently validated diagnostic strategies for pulmonary embolism (PE). Although the diagnostic yield of individual symptoms, signs, and common laboratory tests is limited, the combination of these variables, either by empirical assessment or by a prediction rule, can be used to express a clinical probability of PE. The latter may serve as pretest probability to predict the probability of PE after further objective testing (posterior or post-test probability). Over the last few years, attempts have been made to develop structured prediction models for PE. In a Canadian multicenter prospective study, the clinical probability of PE was rated as low, intermediate, or high according to a model which included assessment of presenting symptoms and signs, risk factors, and presence or absence of an alternative diagnosis at least as likely as PE. Recently, a simple clinical score was developed to stratify outpatients with suspected PE into groups with low, intermediate, or high clinical probability. Logistic regression was used to identify parameters associated with PE. A score ≤ 4 identified patients with low probability, of whom 10% had PE. The prevalence of PE in patients with intermediate (score 5-8) and high probability (score ≥ 9) was 38 and 81%, respectively. As opposed to the Canadian model, this clinical score is standardized. The predictor variables identified in the model, however, were derived from a database of emergency ward patients. This model may, therefore, not be valid in assessing the clinical probability of PE in inpatients. In the PISA-PED study, a clinical diagnostic algorithm was developed which rests on the identification of three relevant clinical symptoms and on their association with electrocardiographic and/or radiographic abnormalities specific for PE. Among patients who, according to the model, had been rated as having a high clinical probability, the prevalence of proven PE was 97%, while it was 3
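
    The cut-offs quoted in the record (score ≤ 4 low, 5-8 intermediate, ≥ 9 high, with observed PE prevalences of roughly 10%, 38% and 81%) lend themselves to a trivial stratification helper. The sketch below merely encodes those published figures and is not a validated clinical tool.

        # Sketch: stratify a clinical score into the probability categories quoted above.
        # Cut-offs and prevalences are taken from the record; not a validated clinical tool.

        def clinical_probability(score):
            if score <= 4:
                return "low", 0.10           # ~10% prevalence of PE in the study
            if score <= 8:
                return "intermediate", 0.38  # ~38%
            return "high", 0.81              # ~81%

        for s in (3, 6, 10):
            category, prevalence = clinical_probability(s)
            print(s, category, prevalence)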

  15. Using Playing Cards to Differentiate Probability Interpretations

    Science.gov (United States)

    López Puga, Jorge

    2014-01-01

    The aprioristic (classical, naïve and symmetric) and frequentist interpretations of probability are commonly known. Bayesian or subjective interpretation of probability is receiving increasing attention. This paper describes an activity to help students differentiate between the three types of probability interpretations.
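
    For the kind of classroom activity described, the contrast between the a priori (classical) value and a frequentist estimate can be made concrete with a short simulation; the snippet below is one possible illustration and is not taken from the paper's materials.

        # Sketch: classical vs. frequentist probability of drawing an ace from a 52-card deck.
        import random

        classical = 4 / 52                      # a priori: 4 aces among 52 equally likely cards
        deck = ["ace"] * 4 + ["other"] * 48

        random.seed(0)
        n_trials = 100_000
        hits = sum(random.choice(deck) == "ace" for _ in range(n_trials))
        print("classical:", round(classical, 4), "frequentist:", round(hits / n_trials, 4))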

  16. Towards a Categorical Account of Conditional Probability

    Directory of Open Access Journals (Sweden)

    Robert Furber

    2015-11-01

    Full Text Available This paper presents a categorical account of conditional probability, covering both the classical and the quantum case. Classical conditional probabilities are expressed as a certain "triangle-fill-in" condition, connecting marginal and joint probabilities, in the Kleisli category of the distribution monad. The conditional probabilities are induced by a map together with a predicate (the condition). The latter is a predicate in the logic of effect modules on this Kleisli category. This same approach can be transferred to the category of C*-algebras (with positive unital maps), whose predicate logic is also expressed in terms of effect modules. Conditional probabilities can again be expressed via a triangle-fill-in property. In the literature, there are several proposals for what quantum conditional probability should be, and also there are extra difficulties not present in the classical case. At this stage, we only describe quantum systems with classical parametrization.

  17. Probability, conditional probability and complementary cumulative distribution functions in performance assessment for radioactive waste disposal

    Energy Technology Data Exchange (ETDEWEB)

    Helton, J.C. [Arizona State Univ., Tempe, AZ (United States)

    1996-03-01

    A formal description of the structure of several recent performance assessments (PAs) for the Waste Isolation Pilot Plant (WIPP) is given in terms of the following three components: a probability space (S_st, L_st, p_st) for stochastic uncertainty, a probability space (S_su, L_su, p_su) for subjective uncertainty and a function (i.e., a random variable) defined on the product space associated with (S_st, L_st, p_st) and (S_su, L_su, p_su). The explicit recognition of the existence of these three components allows a careful description of the use of probability, conditional probability and complementary cumulative distribution functions within the WIPP PA. This usage is illustrated in the context of the U.S. Environmental Protection Agency's standard for the geologic disposal of radioactive waste (40 CFR 191, Subpart B). The paradigm described in this presentation can also be used to impose a logically consistent structure on PAs for other complex systems.

  18. KNIME essentials

    CERN Document Server

    Bakos, Gábor

    2013-01-01

    KNIME Essentials is a practical guide aimed at getting the results you want, as quickly as possible. "KNIME Essentials" is written for data analysts looking to quickly get up to speed using the market leader in data processing tools, KNIME. No knowledge of KNIME is required, but we will assume that you have some background in data processing.

  19. Sensitivity of probability-of-failure estimates with respect to probability of detection curve parameters

    Energy Technology Data Exchange (ETDEWEB)

    Garza, J. [University of Texas at San Antonio, Mechanical Engineering, 1 UTSA circle, EB 3.04.50, San Antonio, TX 78249 (United States); Millwater, H., E-mail: harry.millwater@utsa.edu [University of Texas at San Antonio, Mechanical Engineering, 1 UTSA circle, EB 3.04.50, San Antonio, TX 78249 (United States)

    2012-04-15

    A methodology has been developed and demonstrated that can be used to compute the sensitivity of the probability-of-failure (POF) with respect to the parameters of inspection processes that are simulated using probability of detection (POD) curves. The formulation is such that the probabilistic sensitivities can be obtained at negligible cost using sampling methods by reusing the samples used to compute the POF. As a result, the methodology can be implemented for negligible cost in a post-processing non-intrusive manner thereby facilitating implementation with existing or commercial codes. The formulation is generic and not limited to any specific random variables, fracture mechanics formulation, or any specific POD curve as long as the POD is modeled parametrically. Sensitivity estimates for the cases of different POD curves at multiple inspections, and the same POD curves at multiple inspections have been derived. Several numerical examples are presented and show excellent agreement with finite difference estimates with significant computational savings. - Highlights: ► Sensitivity of the probability-of-failure with respect to the probability-of-detection curve. ► The sensitivities are computed with negligible cost using Monte Carlo sampling. ► The change in the POF due to a change in the POD curve parameters can be easily estimated.

  20. Sensitivity of probability-of-failure estimates with respect to probability of detection curve parameters

    International Nuclear Information System (INIS)

    Garza, J.; Millwater, H.

    2012-01-01

    A methodology has been developed and demonstrated that can be used to compute the sensitivity of the probability-of-failure (POF) with respect to the parameters of inspection processes that are simulated using probability of detection (POD) curves. The formulation is such that the probabilistic sensitivities can be obtained at negligible cost using sampling methods by reusing the samples used to compute the POF. As a result, the methodology can be implemented for negligible cost in a post-processing non-intrusive manner thereby facilitating implementation with existing or commercial codes. The formulation is generic and not limited to any specific random variables, fracture mechanics formulation, or any specific POD curve as long as the POD is modeled parametrically. Sensitivity estimates for the cases of different POD curves at multiple inspections, and the same POD curves at multiple inspections have been derived. Several numerical examples are presented and show excellent agreement with finite difference estimates with significant computational savings. - Highlights: ► Sensitivity of the probability-of-failure with respect to the probability-of-detection curve. ► The sensitivities are computed with negligible cost using Monte Carlo sampling. ► The change in the POF due to a change in the POD curve parameters can be easily estimated.
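
    A compact sketch of the sample-reuse idea from the two records above: write the probability of failure as an expectation over sampled crack sizes weighted by the probability of non-detection at an inspection, so that the derivative with respect to a POD-curve parameter can be taken inside the expectation and evaluated on the very same samples. The lognormal-CDF POD form, the toy crack-growth model and all numbers are assumptions for illustration, not the paper's formulation.

        # Sketch: probability of failure (POF) with one inspection modeled by a parametric
        # POD curve, and its sensitivity to a POD parameter computed by reusing the samples.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        n = 200_000
        a_insp = rng.lognormal(mean=0.0, sigma=0.4, size=n)           # crack size at inspection
        a_end = a_insp * rng.lognormal(mean=0.6, sigma=0.2, size=n)   # crack size at end of life
        a_crit = 3.0                                                  # critical crack size

        mu_pod, sigma_pod = 0.5, 0.6                  # POD(a) = Phi((ln a - mu) / sigma)
        z = (np.log(a_insp) - mu_pod) / sigma_pod
        pod = stats.norm.cdf(z)

        fails = (a_end > a_crit).astype(float)        # failure only matters if crack undetected
        pof = np.mean(fails * (1.0 - pod))            # weight failures by non-detection probability

        # d(1 - POD)/d mu = + pdf(z) / sigma, evaluated on the same Monte Carlo samples
        dpof_dmu = np.mean(fails * stats.norm.pdf(z) / sigma_pod)
        print("POF:", pof, " dPOF/dmu_pod:", dpof_dmu)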

  1. PROBABILITY CALIBRATION BY THE MINIMUM AND MAXIMUM PROBABILITY SCORES IN ONE-CLASS BAYES LEARNING FOR ANOMALY DETECTION

    Data.gov (United States)

    National Aeronautics and Space Administration — PROBABILITY CALIBRATION BY THE MINIMUM AND MAXIMUM PROBABILITY SCORES IN ONE-CLASS BAYES LEARNING FOR ANOMALY DETECTION GUICHONG LI, NATHALIE JAPKOWICZ, IAN HOFFMAN,...

  2. Collective probabilities algorithm for surface hopping calculations

    International Nuclear Information System (INIS)

    Bastida, Adolfo; Cruz, Carlos; Zuniga, Jose; Requena, Alberto

    2003-01-01

    General equations that transition probabilities of the hopping algorithms in surface hopping calculations must obey to assure the equality between the average quantum and classical populations are derived. These equations are solved for two particular cases. In the first it is assumed that probabilities are the same for all trajectories and that the number of hops is kept to a minimum. These assumptions specify the collective probabilities (CP) algorithm, for which the transition probabilities depend on the average populations for all trajectories. In the second case, the probabilities for each trajectory are supposed to be completely independent of the results from the other trajectories. There is, then, a unique solution of the general equations assuring that the transition probabilities are equal to the quantum population of the target state, which is referred to as the independent probabilities (IP) algorithm. The fewest switches (FS) algorithm developed by Tully is accordingly understood as an approximate hopping algorithm which takes elements from the accurate CP and IP solutions. A numerical test of all these hopping algorithms is carried out for a one-dimensional two-state problem with two avoiding crossings which shows the accuracy and computational efficiency of the collective probabilities algorithm proposed, the limitations of the FS algorithm and the similarity between the results offered by the IP algorithm and those obtained with the Ehrenfest method
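
    The following toy sketch illustrates only the general idea attributed here to the collective probabilities approach, namely switching the minimum number of trajectories so that the ensemble-averaged classical populations track prescribed quantum populations. The two-state "quantum" population function and all parameters are hypothetical; this is not the authors' derivation of the CP equations.

    # Toy sketch: hop the minimum number of trajectories so that the ensemble's
    # classical state populations track a prescribed quantum population.
    import numpy as np

    rng = np.random.default_rng(1)
    n_traj, n_steps = 1000, 50
    state = np.zeros(n_traj, dtype=int)            # every trajectory starts in state 0

    def quantum_pop0(step):
        # Hypothetical target population of state 0 (a Rabi-like oscillation).
        return 0.5 * (1.0 + np.cos(0.1 * step))

    for step in range(1, n_steps + 1):
        target0 = quantum_pop0(step)
        classical0 = np.mean(state == 0)
        # Minimum number of trajectories that must hop to restore agreement.
        deficit = int(round((classical0 - target0) * n_traj))
        if deficit > 0:        # too many trajectories in state 0: hop 0 -> 1
            idx = rng.choice(np.flatnonzero(state == 0), size=deficit, replace=False)
            state[idx] = 1
        elif deficit < 0:      # too few trajectories in state 0: hop 1 -> 0
            idx = rng.choice(np.flatnonzero(state == 1), size=-deficit, replace=False)
            state[idx] = 0
        if step % 10 == 0:
            print(f"step {step:2d}: quantum P0 = {target0:.3f}, "
                  f"classical P0 = {np.mean(state == 0):.3f}")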

  3. Foundations of the theory of probability

    CERN Document Server

    Kolmogorov, AN

    2018-01-01

    This famous little book remains a foundational text for the understanding of probability theory, important both to students beginning a serious study of probability and to historians of modern mathematics. 1956 second edition.

  4. Probability an introduction with statistical applications

    CERN Document Server

    Kinney, John J

    2014-01-01

    Praise for the First Edition: "This is a well-written and impressively presented introduction to probability and statistics. The text throughout is highly readable, and the author makes liberal use of graphs and diagrams to clarify the theory." - The Statistician. Thoroughly updated, Probability: An Introduction with Statistical Applications, Second Edition features a comprehensive exploration of statistical data analysis as an application of probability. The new edition provides an introduction to statistics with accessible coverage of reliability, acceptance sampling, confidence intervals, h

  5. Probability Distributome: A Web Computational Infrastructure for Exploring the Properties, Interrelations, and Applications of Probability Distributions.

    Science.gov (United States)

    Dinov, Ivo D; Siegrist, Kyle; Pearl, Dennis K; Kalinin, Alexandr; Christou, Nicolas

    2016-06-01

    Probability distributions are useful for modeling, simulation, analysis, and inference on varieties of natural processes and physical phenomena. There are uncountably many probability distributions. However, a few dozen families of distributions are commonly defined and are frequently used in practice for problem solving, experimental applications, and theoretical studies. In this paper, we present a new computational and graphical infrastructure, the Distributome , which facilitates the discovery, exploration and application of diverse spectra of probability distributions. The extensible Distributome infrastructure provides interfaces for (human and machine) traversal, search, and navigation of all common probability distributions. It also enables distribution modeling, applications, investigation of inter-distribution relations, as well as their analytical representations and computational utilization. The entire Distributome framework is designed and implemented as an open-source, community-built, and Internet-accessible infrastructure. It is portable, extensible and compatible with HTML5 and Web2.0 standards (http://Distributome.org). We demonstrate two types of applications of the probability Distributome resources: computational research and science education. The Distributome tools may be employed to address five complementary computational modeling applications (simulation, data-analysis and inference, model-fitting, examination of the analytical, mathematical and computational properties of specific probability distributions, and exploration of the inter-distributional relations). Many high school and college science, technology, engineering and mathematics (STEM) courses may be enriched by the use of modern pedagogical approaches and technology-enhanced methods. The Distributome resources provide enhancements for blended STEM education by improving student motivation, augmenting the classical curriculum with interactive webapps, and overhauling the

  6. Whole body measurements of sodium turnover in offspring of patients with sustained essential hypertension

    International Nuclear Information System (INIS)

    Henningsen, N.C.; Ohlsson, O.; Mattson, S.; Nosslin, B.; Lund Univ.; Lund Univ.

    1982-01-01

    The elimination rate (percent per day) of injected ²²Na using a whole body measurement technique was significantly lower (26%, 5.8 ± 1.5) in normotensive or borderline hypertensive offspring of essential hypertensive patients than in 15 age- and sex-matched, normotensive controls (7.3 ± 1.0). There were no significant differences in exchangeable sodium, whole body potassium or in the urinary excretion of sodium, potassium and creatinine. The basis for the difference in turnover rate during week 1 is probably an alteration in the cellular handling of sodium (i.e. increased intracellular sodium) in the still normotensive offspring of patients with essential hypertension. The long-term (more than 100 days) whole body retention of ²²Na was found to be only 0.1% of that injected, which justifies the use of the method on larger population groups. (orig.)

  7. Transition probability spaces in loop quantum gravity

    Science.gov (United States)

    Guo, Xiao-Kan

    2018-03-01

    We study the (generalized) transition probability spaces, in the sense of Mielnik and Cantoni, for spacetime quantum states in loop quantum gravity. First, we show that loop quantum gravity admits the structures of transition probability spaces. This is exemplified by first checking such structures in covariant quantum mechanics and then identifying the transition probability spaces in spin foam models via a simplified version of general boundary formulation. The transition probability space thus defined gives a simple way to reconstruct the discrete analog of the Hilbert space of the canonical theory and the relevant quantum logical structures. Second, we show that the transition probability space and in particular the spin foam model are 2-categories. Then we discuss how to realize in spin foam models two proposals by Crane about the mathematical structures of quantum gravity, namely, the quantum topos and causal sites. We conclude that transition probability spaces provide us with an alternative framework to understand various foundational questions of loop quantum gravity.

  8. Bayesian Probability Theory

    Science.gov (United States)

    von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo

    2014-06-01

    Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.

  9. Encounter Probability of Individual Wave Height

    DEFF Research Database (Denmark)

    Liu, Z.; Burcharth, H. F.

    1998-01-01

    wave height corresponding to a certain exceedence probability within a structure lifetime (encounter probability), based on the statistical analysis of long-term extreme significant wave height. Then the design individual wave height is calculated as the expected maximum individual wave height...... associated with the design significant wave height, with the assumption that the individual wave heights follow the Rayleigh distribution. However, the exceedence probability of such a design individual wave height within the structure lifetime is unknown. The paper presents a method for the determination...... of the design individual wave height corresponding to an exceedence probability within the structure lifetime, given the long-term extreme significant wave height. The method can also be applied for estimation of the number of relatively large waves for fatigue analysis of constructions....
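
    The short-term step of this calculation can be sketched under the Rayleigh assumption stated in the abstract: for N waves with significant wave height Hs, the individual wave height exceeded with probability p within the sea state follows from inverting 1 - (1 - exp(-2(H/Hs)^2))^N. The values of Hs and N below are placeholders; the paper combines this step with long-term statistics of the significant wave height, which the sketch does not attempt.

    # Short-term Rayleigh sketch: individual wave height H exceeded with a given
    # probability within N waves, assuming P(H > h) = exp(-2 (h / Hs)^2).
    import math

    def design_individual_height(hs, n_waves, p_exceed):
        # Solve 1 - (1 - exp(-2 (h/hs)^2))**n_waves = p_exceed for h.
        q = (1.0 - p_exceed) ** (1.0 / n_waves)      # per-wave non-exceedance prob.
        return hs * math.sqrt(-0.5 * math.log(1.0 - q))

    hs = 10.0          # design significant wave height [m] (hypothetical)
    n_waves = 1000     # number of waves in the design sea state (hypothetical)

    # Characteristic (roughly expected) maximum individual wave height.
    print("expected-maximum estimate:", hs * math.sqrt(0.5 * math.log(n_waves)))
    for p in (0.5, 0.1, 0.01):
        h = design_individual_height(hs, n_waves, p)
        print(f"H exceeded with prob {p:>4}: {h:.2f} m")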

  10. Two-slit experiment: quantum and classical probabilities

    International Nuclear Information System (INIS)

    Khrennikov, Andrei

    2015-01-01

    Inter-relation between quantum and classical probability models is one of the most fundamental problems of quantum foundations. Nowadays this problem also plays an important role in quantum technologies, in quantum cryptography and the theory of quantum random generators. In this letter, we compare the viewpoint of Richard Feynman that the behavior of quantum particles cannot be described by classical probability theory with the viewpoint that quantum–classical inter-relation is more complicated (cf, in particular, with the tomographic model of quantum mechanics developed in detail by Vladimir Man'ko). As a basic example, we consider the two-slit experiment, which played a crucial role in quantum foundational debates at the beginning of quantum mechanics (QM). In particular, its analysis led Niels Bohr to the formulation of the principle of complementarity. First, we demonstrate that in complete accordance with Feynman's viewpoint, the probabilities for the two-slit experiment have the non-Kolmogorovian structure, since they violate one of basic laws of classical probability theory, the law of total probability (the heart of the Bayesian analysis). However, then we show that these probabilities can be embedded in a natural way into the classical (Kolmogorov, 1933) probability model. To do this, one has to take into account the randomness of selection of different experimental contexts, the joint consideration of which led Feynman to a conclusion about the non-classicality of quantum probability. We compare this embedding of non-Kolmogorovian quantum probabilities into the Kolmogorov model with well-known embeddings of non-Euclidean geometries into Euclidean space (e.g., the Poincaré disk model for the Lobachvesky plane). (paper)
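
    The failure of the law of total probability mentioned above can be seen numerically with idealized slit amplitudes (a toy sketch, not the contextual model discussed in the paper): the both-slits-open distribution differs from the mixture 0.5·P1 + 0.5·P2 by the interference term.

    # Idealized two-slit sketch: both-slits-open intensity vs. the classical mixture.
    import numpy as np

    x = np.linspace(-5.0, 5.0, 9)             # detector positions (arbitrary units)
    k, d = 2.0, 1.0                            # wavenumber and slit half-separation (hypothetical)

    envelope = np.exp(-x**2 / 8.0)             # common single-slit envelope (illustrative)
    psi1 = envelope * np.exp(1j * k * d * x)   # amplitude through slit 1
    psi2 = envelope * np.exp(-1j * k * d * x)  # amplitude through slit 2

    p1 = np.abs(psi1) ** 2                     # only slit 1 open
    p2 = np.abs(psi2) ** 2                     # only slit 2 open
    p_both = 0.5 * np.abs(psi1 + psi2) ** 2    # both open: add amplitudes, then square
    mixture = 0.5 * p1 + 0.5 * p2              # law-of-total-probability prediction
    interference = p_both - mixture            # equals Re(psi1 * conj(psi2))

    for xi, pb, pm, intf in zip(x, p_both, mixture, interference):
        print(f"x={xi:+.2f}  both={pb:.3f}  mixture={pm:.3f}  difference={intf:+.3f}")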

  11. Handbook of probability theory and applications

    CERN Document Server

    Rudas, Tamas

    2008-01-01

    ""This is a valuable reference guide for readers interested in gaining a basic understanding of probability theory or its applications in problem solving in the other disciplines.""-CHOICEProviding cutting-edge perspectives and real-world insights into the greater utility of probability and its applications, the Handbook of Probability offers an equal balance of theory and direct applications in a non-technical, yet comprehensive, format. Editor Tamás Rudas and the internationally-known contributors present the material in a manner so that researchers of vari

  12. Computation of the Complex Probability Function

    Energy Technology Data Exchange (ETDEWEB)

    Trainer, Amelia Jo [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Ledwith, Patrick John [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-22

    The complex probability function is important in many areas of physics and many techniques have been developed in an attempt to compute it for some z quickly and efficiently. Most prominent are the methods that use Gauss-Hermite quadrature, which uses the roots of the nth degree Hermite polynomial and corresponding weights to approximate the complex probability function. This document serves as an overview and discussion of the use, shortcomings, and potential improvements on the Gauss-Hermite quadrature for the complex probability function.
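
    The Gauss-Hermite idea summarized above fits in a few lines: with the nodes x_k and weights w_k of an n-point rule, w(z) ≈ (i/π) Σ_k w_k/(z − x_k) for Im(z) > 0. The sketch below is a simple illustration (not the report's implementation) and checks the result against SciPy's Faddeeva routine; accuracy degrades close to the real axis, which is one of the shortcomings such methods face.

    # Gauss-Hermite approximation of the complex probability (Faddeeva) function
    # w(z) = (i/pi) * Integral exp(-t^2) / (z - t) dt, valid for Im(z) > 0.
    import numpy as np
    from scipy.special import wofz   # reference implementation of w(z)

    def faddeeva_gh(z, n=40):
        # Approximate w(z) with an n-point Gauss-Hermite quadrature rule.
        nodes, weights = np.polynomial.hermite.hermgauss(n)
        return (1j / np.pi) * np.sum(weights / (z - nodes))

    for z in (1.0 + 1.0j, 3.0 + 0.5j, 0.2 + 2.0j):
        approx = faddeeva_gh(z)
        exact = wofz(z)
        print(f"z={z}: GH={approx:.6f}  wofz={exact:.6f}  |err|={abs(approx - exact):.2e}")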

  13. The Probabilities of Unique Events

    Science.gov (United States)

    2012-08-30

    Washington, DC USA Max Lotstein and Phil Johnson-Laird Department of Psychology Princeton University Princeton, NJ USA August 30th 2012...social justice and also participated in antinuclear demonstrations. The participants ranked the probability that Linda is a feminist bank teller as...retorted that such a flagrant violation of the probability calculus was a result of a psychological experiment that obscured the rationality of the

  14. High throughput nonparametric probability density estimation.

    Science.gov (United States)

    Farmer, Jenny; Jacobs, Donald

    2018-01-01

    In high throughput applications, such as those found in bioinformatics and finance, it is important to determine accurate probability distribution functions despite only minimal information about data characteristics, and without using human subjectivity. Such an automated process for univariate data is implemented to achieve this goal by merging the maximum entropy method with single order statistics and maximum likelihood. The only required properties of the random variables are that they are continuous and that they are, or can be approximated as, independent and identically distributed. A quasi-log-likelihood function based on single order statistics for sampled uniform random data is used to empirically construct a sample size invariant universal scoring function. Then a probability density estimate is determined by iteratively improving trial cumulative distribution functions, where better estimates are quantified by the scoring function that identifies atypical fluctuations. This criterion resists under and over fitting data as an alternative to employing the Bayesian or Akaike information criterion. Multiple estimates for the probability density reflect uncertainties due to statistical fluctuations in random samples. Scaled quantile residual plots are also introduced as an effective diagnostic to visualize the quality of the estimated probability densities. Benchmark tests show that estimates for the probability density function (PDF) converge to the true PDF as sample size increases on particularly difficult test probability densities that include cases with discontinuities, multi-resolution scales, heavy tails, and singularities. These results indicate the method has general applicability for high throughput statistical inference.

  15. "I Don't Really Understand Probability at All": Final Year Pre-Service Teachers' Understanding of Probability

    Science.gov (United States)

    Maher, Nicole; Muir, Tracey

    2014-01-01

    This paper reports on one aspect of a wider study that investigated a selection of final year pre-service primary teachers' responses to four probability tasks. The tasks focused on foundational ideas of probability including sample space, independence, variation and expectation. Responses suggested that strongly held intuitions appeared to…

  16. Modern algebra essentials

    CERN Document Server

    Lutfiyya, Lutfi A

    2012-01-01

    REA's Essentials provide quick and easy access to critical information in a variety of different fields, ranging from the most basic to the most advanced. As its name implies, these concise, comprehensive study guides summarize the essentials of the field covered. Essentials are helpful when preparing for exams, doing homework and will remain a lasting reference source for students, teachers, and professionals. Modern Algebra includes set theory, operations, relations, basic properties of the integers, group theory, and ring theory.

  17. Calculus III essentials

    CERN Document Server

    REA, Editors of

    2012-01-01

    REA's Essentials provide quick and easy access to critical information in a variety of different fields, ranging from the most basic to the most advanced. As its name implies, these concise, comprehensive study guides summarize the essentials of the field covered. Essentials are helpful when preparing for exams, doing homework and will remain a lasting reference source for students, teachers, and professionals. Calculus III includes vector analysis, real valued functions, partial differentiation, multiple integrations, vector fields, and infinite series.

  18. Probability of spent fuel transportation accidents

    International Nuclear Information System (INIS)

    McClure, J.D.

    1981-07-01

    The transported volume of spent fuel, incident/accident experience and accident environment probabilities were reviewed in order to provide an estimate of spent fuel accident probabilities. In particular, the accident review assessed the accident experience for large casks of the type that could transport spent (irradiated) nuclear fuel. This review determined that since 1971, the beginning of official US Department of Transportation record keeping for accidents/incidents, there has been one spent fuel transportation accident. This information, coupled with estimated annual shipping volumes for spent fuel, indicated an estimated annual probability of a spent fuel transport accident of 5 x 10⁻⁷ spent fuel accidents per mile. This is consistent with ordinary truck accident rates. A comparison of accident environments and regulatory test environments suggests that the probability of truck accidents exceeding regulatory test for impact is approximately 10⁻⁹ per mile.

  19. Probabilities on Streams and Reflexive Games

    Directory of Open Access Journals (Sweden)

    Andrew Schumann

    2014-01-01

    Full Text Available Probability measures on streams (e.g. on hypernumbers and p-adic numbers) have been defined. It was shown that these probabilities can be used for simulations of reflexive games. In particular, it can be proved that Aumann's agreement theorem does not hold for these probabilities. Instead of this theorem, there is a statement that is called the reflexion disagreement theorem. Based on this theorem, probabilistic and knowledge conditions can be defined for reflexive games at various reflexion levels up to the infinite level. (original abstract)

  20. Uncertainty about probability: a decision analysis perspective

    International Nuclear Information System (INIS)

    Howard, R.A.

    1988-01-01

    The issue of how to think about uncertainty about probability is framed and analyzed from the viewpoint of a decision analyst. The failure of nuclear power plants is used as an example. The key idea is to think of probability as describing a state of information on an uncertain event, and to pose the issue of uncertainty in this quantity as uncertainty about a number that would be definitive: it has the property that you would assign it as the probability if you knew it. Logical consistency requires that the probability to assign to a single occurrence in the absence of further information be the mean of the distribution of this definitive number, not the median as is sometimes suggested. Any decision that must be made without the benefit of further information must also be made using the mean of the definitive number's distribution. With this formulation, we find further that the probability of r occurrences in n exchangeable trials will depend on the first n moments of the definitive number's distribution. In making decisions, the expected value of clairvoyance on the occurrence of the event must be at least as great as that on the definitive number. If one of the events in question occurs, then the increase in probability of another such event is readily computed. This means, in terms of coin tossing, that unless one is absolutely sure of the fairness of a coin, seeing a head must increase the probability of heads, in distinction to usual thought. A numerical example for nuclear power shows that the failure of one plant of a group with a low probability of failure can significantly increase the probability that must be assigned to failure of a second plant in the group.
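
    The coin-tossing claim at the end of this abstract is easy to check numerically. In the sketch below the distribution of the "definitive number" is assumed to be a Beta distribution (an illustrative choice, not taken from the paper): the probability assigned to heads is the mean of that distribution, and after one observed head it becomes E[p²]/E[p], which exceeds the mean unless the distribution is a point mass.

    # Sketch of the coin example: uncertainty about the "definitive" probability p
    # is described by a distribution (here Beta, an assumption).
    # P(head) = E[p]; after seeing one head, P(next head) = E[p^2] / E[p] >= E[p].
    from scipy.stats import beta

    for a, b in [(50.0, 50.0), (2.0, 2.0), (0.5, 0.5)]:
        dist = beta(a, b)
        mean = dist.mean()
        second_moment = dist.moment(2)
        p_head = mean                                # assigned before any toss
        p_head_given_head = second_moment / mean     # assigned after one observed head
        print(f"Beta({a},{b}): P(H) = {p_head:.4f}, "
              f"P(H | saw one H) = {p_head_given_head:.4f}")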

  1. Teaching Probability: A Socio-Constructivist Perspective

    Science.gov (United States)

    Sharma, Sashi

    2015-01-01

    There is a considerable and rich literature on students' misconceptions in probability. However, less attention has been paid to the development of students' probabilistic thinking in the classroom. This paper offers a sequence, grounded in socio-constructivist perspective for teaching probability.

  2. Default probabilities and default correlations

    OpenAIRE

    Erlenmaier, Ulrich; Gersbach, Hans

    2001-01-01

    Starting from the Merton framework for firm defaults, we provide the analytics and robustness of the relationship between default probabilities and default correlations. We show that loans with higher default probabilities will not only have higher variances but also higher correlations between loans. As a consequence, portfolio standard deviation can increase substantially when loan default probabilities rise. This result has two important implications. First, relative prices of loans with different default probabili...

  3. Probability Matching, Fast and Slow

    OpenAIRE

    Koehler, Derek J.; James, Greta

    2014-01-01

    A prominent point of contention among researchers regarding the interpretation of probability-matching behavior is whether it represents a cognitively sophisticated, adaptive response to the inherent uncertainty of the tasks or settings in which it is observed, or whether instead it represents a fundamental shortcoming in the heuristics that support and guide human decision making. Put crudely, researchers disagree on whether probability matching is "smart" or "dumb." Here, we consider eviden...

  4. Posterior Probability Matching and Human Perceptual Decision Making.

    Directory of Open Access Journals (Sweden)

    Richard F Murray

    2015-06-01

    Full Text Available Probability matching is a classic theory of decision making that was first developed in models of cognition. Posterior probability matching, a variant in which observers match their response probabilities to the posterior probability of each response being correct, is being used increasingly often in models of perception. However, little is known about whether posterior probability matching is consistent with the vast literature on vision and hearing that has developed within signal detection theory. Here we test posterior probability matching models using two tools from detection theory. First, we examine the models' performance in a two-pass experiment, where each block of trials is presented twice, and we measure the proportion of times that the model gives the same response twice to repeated stimuli. We show that at low performance levels, posterior probability matching models give highly inconsistent responses across repeated presentations of identical trials. We find that practised human observers are more consistent across repeated trials than these models predict, and we find some evidence that less practised observers are more consistent as well. Second, we compare the performance of posterior probability matching models on a discrimination task to the performance of a theoretical ideal observer that achieves the best possible performance. We find that posterior probability matching is very inefficient at low-to-moderate performance levels, and that human observers can be more efficient than is ever possible according to posterior probability matching models. These findings support classic signal detection models, and rule out a broad class of posterior probability matching models for expert performance on perceptual tasks that range in complexity from contrast discrimination to symmetry detection. However, our findings leave open the possibility that inexperienced observers may show posterior probability matching behaviour, and our methods
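
    The two diagnostics described above can be mimicked with a small signal-detection simulation. The equal-variance Gaussian yes/no task, the value d' = 1 and the equal priors below are hypothetical stand-ins for the authors' experiments: one simulated observer always takes the maximum-posterior response, the other samples its response with the posterior probability, and the matching observer comes out both less accurate and less consistent across two passes.

    # Posterior probability matching vs. maximum-posterior responding in an
    # equal-variance Gaussian yes/no task (hypothetical d' = 1, equal priors).
    import numpy as np

    rng = np.random.default_rng(0)
    d_prime, n_trials = 1.0, 100_000
    signal = rng.integers(0, 2, size=n_trials)     # same stimulus sequence on both passes

    def one_pass(matching):
        obs = rng.normal(loc=signal * d_prime, scale=1.0)      # fresh internal noise
        log_lr = d_prime * (obs - d_prime / 2.0)                # log likelihood ratio
        p_sig = 1.0 / (1.0 + np.exp(-log_lr))                   # posterior P(signal)
        if matching:
            return (rng.random(n_trials) < p_sig).astype(int)   # sample with posterior prob.
        return (p_sig > 0.5).astype(int)                        # maximum-posterior choice

    ideal1, ideal2 = one_pass(False), one_pass(False)
    match1, match2 = one_pass(True), one_pass(True)

    print("accuracy  ideal    :", np.mean(ideal1 == signal))
    print("accuracy  matching :", np.mean(match1 == signal))
    print("two-pass agreement  ideal    :", np.mean(ideal1 == ideal2))
    print("two-pass agreement  matching :", np.mean(match1 == match2))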

  5. Electric circuits essentials

    CERN Document Server

    REA, Editors of

    2012-01-01

    REA's Essentials provide quick and easy access to critical information in a variety of different fields, ranging from the most basic to the most advanced. As its name implies, these concise, comprehensive study guides summarize the essentials of the field covered. Essentials are helpful when preparing for exams, doing homework and will remain a lasting reference source for students, teachers, and professionals. Electric Circuits I includes units, notation, resistive circuits, experimental laws, transient circuits, network theorems, techniques of circuit analysis, sinusoidal analysis, polyph

  6. Pre-calculus essentials

    CERN Document Server

    Woodward, Ernest

    2012-01-01

    REA's Essentials provide quick and easy access to critical information in a variety of different fields, ranging from the most basic to the most advanced. As its name implies, these concise, comprehensive study guides summarize the essentials of the field covered. Essentials are helpful when preparing for exams, doing homework and will remain a lasting reference source for students, teachers, and professionals. Pre-Calculus reviews sets, numbers, operations and properties, coordinate geometry, fundamental algebraic topics, solving equations and inequalities, functions, trigonometry, exponents

  7. Electronics I essentials

    CERN Document Server

    REA, The Editors of

    2012-01-01

    REA's Essentials provide quick and easy access to critical information in a variety of different fields, ranging from the most basic to the most advanced. As its name implies, these concise, comprehensive study guides summarize the essentials of the field covered. Essentials are helpful when preparing for exams, doing homework and will remain a lasting reference source for students, teachers, and professionals. Electronics I covers fundamentals of semiconductor devices, junction diodes, bipolar junction transistors, power supplies, multitransistor circuits, small signals, low-frequency anal

  8. Multiple-event probability in general-relativistic quantum mechanics

    International Nuclear Information System (INIS)

    Hellmann, Frank; Mondragon, Mauricio; Perez, Alejandro; Rovelli, Carlo

    2007-01-01

    We discuss the definition of quantum probability in the context of 'timeless' general-relativistic quantum mechanics. In particular, we study the probability of sequences of events, or multievent probability. In conventional quantum mechanics this can be obtained by means of the 'wave function collapse' algorithm. We first point out certain difficulties of some natural definitions of multievent probability, including the conditional probability widely considered in the literature. We then observe that multievent probability can be reduced to single-event probability, by taking into account the quantum nature of the measuring apparatus. In fact, by exploiting the von-Neumann freedom of moving the quantum/classical boundary, one can always trade a sequence of noncommuting quantum measurements at different times, with an ensemble of simultaneous commuting measurements on the joint system+apparatus system. This observation permits a formulation of quantum theory based only on single-event probability, where the results of the wave function collapse algorithm can nevertheless be recovered. The discussion also bears on the nature of the quantum collapse

  9. Evaluation of probability and hazard in nuclear energy

    International Nuclear Information System (INIS)

    Novikov, V.Ya.; Romanov, N.L.

    1979-01-01

    Various methods of evaluation of accident probability at NPPs are proposed because of the unreliability of statistical evaluations of NPP safety. The conception of subjective probability for quantitative analysis of safety and hazard is described. Interpretation of probability as the real belief of an expert is assumed as the basis of the conception. It is suggested to study event uncertainty in the framework of subjective probability theory, which not only permits but demands taking expert opinions into account when evaluating probabilities. These subjective expert evaluations affect to a certain extent the calculation of the usual mathematical event probability. The above technique is advantageous for the consideration of a separate experiment or random event.

  10. p-adic probability interpretation of Bell's inequality

    International Nuclear Information System (INIS)

    Khrennikov, A.

    1995-01-01

    We study the violation of Bell's inequality using a p-adic generalization of the theory of probability. p-adic probability is introduced as a limit of relative frequencies but this limit exists with respect to a p-adic metric. In particular, negative probability distributions are well defined on the basis of the frequency definition. This new type of stochastics can be used to describe hidden-variables distributions of some quantum models. If the hidden variables have a p-adic probability distribution, Bell's inequality is not valid and it is not necessary to discuss the experimental violations of this inequality. ((orig.))

  11. Dynamic SEP event probability forecasts

    Science.gov (United States)

    Kahler, S. W.; Ling, A.

    2015-10-01

    The forecasting of solar energetic particle (SEP) event probabilities at Earth has been based primarily on the estimates of magnetic free energy in active regions and on the observations of peak fluxes and fluences of large (≥ M2) solar X-ray flares. These forecasts are typically issued for the next 24 h or with no definite expiration time, which can be deficient for time-critical operations when no SEP event appears following a large X-ray flare. It is therefore important to decrease the event probability forecast with time as a SEP event fails to appear. We use the NOAA listing of major (≥10 pfu) SEP events from 1976 to 2014 to plot the delay times from X-ray peaks to SEP threshold onsets as a function of solar source longitude. An algorithm is derived to decrease the SEP event probabilities with time when no event is observed to reach the 10 pfu threshold. In addition, we use known SEP event size distributions to modify probability forecasts when SEP intensity increases occur below the 10 pfu event threshold. An algorithm to provide a dynamic SEP event forecast, Pd, for both situations of SEP intensities following a large flare is derived.
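
    The abstract does not give the functional form of the dynamic forecast Pd, so the sketch below is purely an assumed illustration of the flavor of such a decay: if P0 is the initial event probability and S(t) is the fraction of historical onset delays longer than t, Bayes' rule gives P(event | no onset by t) = P0·S(t) / (P0·S(t) + 1 − P0). The exponential delay distribution and all numbers are hypothetical, not the authors' algorithm.

    # Illustrative decay of an SEP event probability forecast when no onset has
    # been observed by time t after the flare (hypothetical numbers throughout).
    import math

    p0 = 0.6              # initial event probability from the flare forecast (hypothetical)
    mean_delay_h = 6.0    # assumed mean onset delay in hours (hypothetical)

    def dynamic_probability(t_hours):
        # P(event | no onset by t) via Bayes, with survival S(t) = exp(-t / mean).
        survival = math.exp(-t_hours / mean_delay_h)
        return p0 * survival / (p0 * survival + (1.0 - p0))

    for t in (0, 3, 6, 12, 24, 48):
        print(f"t = {t:2d} h : P(event) = {dynamic_probability(t):.3f}")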

  12. Swedish earthquakes and acceleration probabilities

    International Nuclear Information System (INIS)

    Slunga, R.

    1979-03-01

    A method to assign probabilities to ground accelerations for Swedish sites is described. As hardly any near-field instrumental data are available we are left with the problem of interpreting macroseismic data in terms of acceleration. By theoretical wave propagation computations the relation between seismic strength of the earthquake, focal depth, distance and ground acceleration is calculated. We found that most Swedish earthquakes are shallow; the 1904 earthquake 100 km south of Oslo is an exception and probably had a focal depth exceeding 25 km. For the nuclear power plant sites an annual probability of 10⁻⁵ has been proposed as interesting. This probability gives ground accelerations in the range 5-20 % g for the sites. This acceleration is for a free bedrock site. For consistency all acceleration results in this study are given for bedrock sites. When applying our model to the 1904 earthquake and assuming the focal zone to be in the lower crust we get the epicentral acceleration of this earthquake to be 5-15 % g. The results above are based on an analysis of macroseismic data as relevant instrumental data are lacking. However, the macroseismic acceleration model deduced in this study gives epicentral ground accelerations of small Swedish earthquakes in agreement with existing distant instrumental data. (author)

  13. High-resolution elastic recoil detection utilizing Bayesian probability theory

    International Nuclear Information System (INIS)

    Neumaier, P.; Dollinger, G.; Bergmaier, A.; Genchev, I.; Goergens, L.; Fischer, R.; Ronning, C.; Hofsaess, H.

    2001-01-01

    Elastic recoil detection (ERD) analysis is improved in view of depth resolution and the reliability of the measured spectra. Good statistics at even low ion fluences is obtained utilizing a large solid angle of 5 msr at the Munich Q3D magnetic spectrograph and using a 40 MeV ¹⁹⁷Au beam. In this way the elemental depth profiles are not essentially altered during analysis even if distributions with area densities below 1×10¹⁴ atoms/cm² are measured. As the energy spread due to the angular acceptance is fully eliminated by ion-optical and numerical corrections, an accurate and reliable apparatus function is derived. It allows deconvolution of the measured spectra using the adaptive kernel method, a maximum entropy concept in the framework of Bayesian probability theory. In addition, the uncertainty of the reconstructed spectra is quantified. The concepts are demonstrated at ¹³C depth profiles measured at ultra-thin films of tetrahedral amorphous carbon (ta-C). Depth scales of those profiles are given with an accuracy of 1.4×10¹⁵ atoms/cm².

  14. Modelling the probability of building fires

    Directory of Open Access Journals (Sweden)

    Vojtěch Barták

    2014-12-01

    Full Text Available Systematic spatial risk analysis plays a crucial role in preventing emergencies. In the Czech Republic, risk mapping is currently based on the risk accumulation principle, area vulnerability, and preparedness levels of Integrated Rescue System components. Expert estimates are used to determine risk levels for individual hazard types, while statistical modelling based on data from actual incidents and their possible causes is not used. Our model study, conducted in cooperation with the Fire Rescue Service of the Czech Republic as a model within the Liberec and Hradec Králové regions, presents an analytical procedure leading to the creation of building fire probability maps based on recent incidents in the studied areas and on building parameters. In order to estimate the probability of building fires, a prediction model based on logistic regression was used. Probability of fire calculated by means of model parameters and attributes of specific buildings can subsequently be visualized in probability maps.
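
    The prediction step described above can be sketched with an off-the-shelf logistic regression. The building attributes and "incident" labels below are synthetic placeholders, not the study's data; the fitted model returns a fire probability per building that could then be mapped.

    # Sketch of a building-fire probability model with logistic regression
    # (synthetic stand-ins for real building attributes and incident records).
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(42)
    n = 5000
    age_years = rng.uniform(0, 100, n)            # hypothetical building age
    floor_area = rng.uniform(50, 2000, n)         # hypothetical floor area [m^2]
    has_heating_fault = rng.integers(0, 2, n)     # hypothetical risk indicator

    # Synthetic "true" fire occurrences, generated only so there is data to fit.
    logit = -6.0 + 0.02 * age_years + 0.001 * floor_area + 1.5 * has_heating_fault
    y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

    X = np.column_stack([age_years, floor_area, has_heating_fault])
    model = LogisticRegression(max_iter=1000).fit(X, y)

    # Predicted fire probability for one specific (hypothetical) building,
    # which is the quantity a probability map would display.
    building = np.array([[60.0, 800.0, 1]])
    print("estimated fire probability:", model.predict_proba(building)[0, 1])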

  15. Probability theory

    CERN Document Server

    Dorogovtsev, A Ya; Skorokhod, A V; Silvestrov, D S; Skorokhod, A V

    1997-01-01

    This book of problems is intended for students in pure and applied mathematics. There are problems in traditional areas of probability theory and problems in the theory of stochastic processes, which has wide applications in the theory of automatic control, queuing and reliability theories, and in many other modern science and engineering fields. Answers to most of the problems are given, and the book provides hints and solutions for more complicated problems.

  16. Trial type probability modulates the cost of antisaccades

    Science.gov (United States)

    Chiau, Hui-Yan; Tseng, Philip; Su, Jia-Han; Tzeng, Ovid J. L.; Hung, Daisy L.; Muggleton, Neil G.

    2011-01-01

    The antisaccade task, where eye movements are made away from a target, has been used to investigate the flexibility of cognitive control of behavior. Antisaccades usually have longer saccade latencies than prosaccades, the so-called antisaccade cost. Recent studies have shown that this antisaccade cost can be modulated by event probability. This may mean that the antisaccade cost can be reduced, or even reversed, if the probability of surrounding events favors the execution of antisaccades. The probabilities of prosaccades and antisaccades were systematically manipulated by changing the proportion of a certain type of trial in an interleaved pro/antisaccades task. We aimed to disentangle the intertwined relationship between trial type probabilities and the antisaccade cost with the ultimate goal of elucidating how probabilities of trial types modulate human flexible behaviors, as well as the characteristics of such modulation effects. To this end, we examined whether implicit trial type probability can influence saccade latencies and also manipulated the difficulty of cue discriminability to see how effects of trial type probability would change when the demand on visual perceptual analysis was high or low. A mixed-effects model was applied to the analysis to dissect the factors contributing to the modulation effects of trial type probabilities. Our results suggest that the trial type probability is one robust determinant of antisaccade cost. These findings highlight the importance of implicit probability in the flexibility of cognitive control of behavior. PMID:21543748

  17. Thermodynamics II essentials

    CERN Document Server

    REA, The Editors of

    2013-01-01

    REA's Essentials provide quick and easy access to critical information in a variety of different fields, ranging from the most basic to the most advanced. As its name implies, these concise, comprehensive study guides summarize the essentials of the field covered. Essentials are helpful when preparing for exams, doing homework and will remain a lasting reference source for students, teachers, and professionals. Thermodynamics II includes review of thermodynamic relations, power and refrigeration cycles, mixtures and solutions, chemical reactions, chemical equilibrium, and flow through nozzl

  18. Electronics II essentials

    CERN Document Server

    REA, The Editors of

    2012-01-01

    REA's Essentials provide quick and easy access to critical information in a variety of different fields, ranging from the most basic to the most advanced. As its name implies, these concise, comprehensive study guides summarize the essentials of the field covered. Essentials are helpful when preparing for exams, doing homework and will remain a lasting reference source for students, teachers, and professionals. Electronics II covers operational amplifiers, feedback and frequency compensation of OP amps, multivibrators, logic gates and families, Boolean algebra, registers, counters, arithmet

  19. Laplace transforms essentials

    CERN Document Server

    Shafii-Mousavi, Morteza

    2012-01-01

    REA's Essentials provide quick and easy access to critical information in a variety of different fields, ranging from the most basic to the most advanced. As its name implies, these concise, comprehensive study guides summarize the essentials of the field covered. Essentials are helpful when preparing for exams, doing homework and will remain a lasting reference source for students, teachers, and professionals. Laplace Transforms includes the Laplace transform, the inverse Laplace transform, special functions and properties, applications to ordinary linear differential equations, Fourier tr

  20. Statistics II essentials

    CERN Document Server

    Milewski, Emil G

    2012-01-01

    REA's Essentials provide quick and easy access to critical information in a variety of different fields, ranging from the most basic to the most advanced. As its name implies, these concise, comprehensive study guides summarize the essentials of the field covered. Essentials are helpful when preparing for exams, doing homework and will remain a lasting reference source for students, teachers, and professionals. Statistics II discusses sampling theory, statistical inference, independent and dependent variables, correlation theory, experimental design, count data, chi-square test, and time se

  1. Boolean algebra essentials

    CERN Document Server

    Solomon, Alan D

    2012-01-01

    REA's Essentials provide quick and easy access to critical information in a variety of different fields, ranging from the most basic to the most advanced. As its name implies, these concise, comprehensive study guides summarize the essentials of the field covered. Essentials are helpful when preparing for exams, doing homework and will remain a lasting reference source for students, teachers, and professionals. Boolean Algebra includes set theory, sentential calculus, fundamental ideas of Boolean algebras, lattices, rings and Boolean algebras, the structure of a Boolean algebra, and Boolean

  2. Set theory essentials

    CERN Document Server

    Milewski, Emil G

    2012-01-01

    REA's Essentials provide quick and easy access to critical information in a variety of different fields, ranging from the most basic to the most advanced. As its name implies, these concise, comprehensive study guides summarize the essentials of the field covered. Essentials are helpful when preparing for exams, doing homework and will remain a lasting reference source for students, teachers, and professionals. Set Theory includes elementary logic, sets, relations, functions, denumerable and non-denumerable sets, cardinal numbers, Cantor's theorem, axiom of choice, and order relations.

  3. Geometry I essentials

    CERN Document Server

    REA, The Editors of

    2012-01-01

    REA's Essentials provide quick and easy access to critical information in a variety of different fields, ranging from the most basic to the most advanced. As its name implies, these concise, comprehensive study guides summarize the essentials of the field covered. Essentials are helpful when preparing for exams, doing homework and will remain a lasting reference source for students, teachers, and professionals. Geometry I includes methods of proof, points, lines, planes, angles, congruent angles and line segments, triangles, parallelism, quadrilaterals, geometric inequalities, and geometric

  4. Joint probabilities and quantum cognition

    International Nuclear Information System (INIS)

    Acacio de Barros, J.

    2012-01-01

    In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.

  5. Joint probabilities and quantum cognition

    Energy Technology Data Exchange (ETDEWEB)

    Acacio de Barros, J. [Liberal Studies, 1600 Holloway Ave., San Francisco State University, San Francisco, CA 94132 (United States)

    2012-12-18

    In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.

  6. Probability Issues in without Replacement Sampling

    Science.gov (United States)

    Joarder, A. H.; Al-Sabah, W. S.

    2007-01-01

    Sampling without replacement is an important aspect in teaching conditional probabilities in elementary statistics courses. Different methods proposed in different texts for calculating probabilities of events in this context are reviewed and their relative merits and limitations in applications are pinpointed. An alternative representation of…
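
    A standard example of the kind of calculation at issue (a textbook illustration, not taken from the article): the probability of drawing two aces from a 52-card deck without replacement, computed once with the multiplication rule for conditional probabilities and once by counting.

    # Two ways to get P(both cards are aces) when drawing 2 cards without replacement.
    from fractions import Fraction
    from math import comb

    # Multiplication rule with conditional probability: P(A1) * P(A2 | A1).
    p_chain = Fraction(4, 52) * Fraction(3, 51)

    # Counting (hypergeometric): choose 2 of the 4 aces over choose 2 of the 52 cards.
    p_count = Fraction(comb(4, 2), comb(52, 2))

    print(p_chain, p_count, p_chain == p_count)   # 1/221 both ways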

  7. Selected papers on probability and statistics

    CERN Document Server

    2009-01-01

    This volume contains translations of papers that originally appeared in the Japanese journal Sūgaku. The papers range over a variety of topics in probability theory, statistics, and applications. This volume is suitable for graduate students and research mathematicians interested in probability and statistics.

  8. Probability of Grounding and Collision Events

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup

    1996-01-01

    To quantify the risks involved in ship traffic, rational criteria for collision and grounding accidents are developed. This implies that probabilities as well as inherent consequences can be analysed and assessed. The present paper outlines a method for evaluation of the probability of ship

  9. Probability of Grounding and Collision Events

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup

    1996-01-01

    To quantify the risks involved in ship traffic, rational criteria for collision and grounding accidents have to be developed. This implies that probabilities as well as inherent consequences have to be analyzed and assessed.The present notes outline a method for evaluation of the probability...

  10. Eliciting Subjective Probabilities with Binary Lotteries

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd

    objective probabilities. Drawing a sample from the same subject population, we find evidence that the binary lottery procedure induces linear utility in a subjective probability elicitation task using the Quadratic Scoring Rule. We also show that the binary lottery procedure can induce direct revelation...
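
    The mechanism referred to here can be checked with a small calculation (standard theory, not the paper's data): when Quadratic Scoring Rule payoffs are paid out as probability points in a binary lottery, expected points are maximized by reporting one's true belief, which is what makes the elicitation incentive compatible under the induced linear utility.

    # Expected probability points under the Quadratic Scoring Rule, paid as a
    # binary lottery: score(r, outcome) = 1 - (r - outcome)^2, outcome in {0, 1}.
    import numpy as np

    true_belief = 0.7                     # subject's subjective probability (hypothetical)
    reports = np.linspace(0.0, 1.0, 101)

    expected_points = (true_belief * (1.0 - (reports - 1.0) ** 2)
                       + (1.0 - true_belief) * (1.0 - reports ** 2))

    best = reports[np.argmax(expected_points)]
    print(f"report maximizing expected points: {best:.2f} (true belief {true_belief})")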

  11. 47 CFR 1.1623 - Probability calculation.

    Science.gov (United States)

    2010-10-01

    ... 47 Telecommunication 1 2010-10-01 2010-10-01 false Probability calculation. 1.1623 Section 1.1623 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Random Selection Procedures for Mass Media Services General Procedures § 1.1623 Probability calculation. (a) All calculations shall be...

  12. Orthogonal Algorithm of Logic Probability and Syndrome-Testable Analysis

    Institute of Scientific and Technical Information of China (English)

    1990-01-01

    A new method, the orthogonal algorithm, is presented to compute logic probabilities (i.e. signal probabilities) accurately. The transfer properties of logic probabilities are studied first, which are useful for the calculation of the logic probability of a circuit with random independent inputs. Then the orthogonal algorithm is described to compute the logic probability of a Boolean function realized by a combinational circuit. This algorithm makes the Boolean function "orthogonal" so that the logic probabilities can be easily calculated by summing up the logic probabilities of all orthogonal terms of the Boolean function.
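
    The orthogonalization idea can be shown on the smallest useful example (an illustration of the principle, not the paper's algorithm): f = A OR B is rewritten as the disjoint terms A and (NOT A)·B, so with independent inputs the signal probability of f is simply the sum of the term probabilities; exhaustive enumeration confirms the result.

    # Signal probability of f = A OR B with independent inputs, via orthogonal
    # (mutually exclusive) terms: A OR B = A + (not A)*B, so P(f) = pA + (1-pA)*pB.
    from itertools import product

    pA, pB = 0.3, 0.6     # hypothetical input signal probabilities

    p_orthogonal = pA + (1.0 - pA) * pB

    # Check by summing over all input combinations, weighted by their probabilities.
    p_enum = 0.0
    for a, b in product((0, 1), repeat=2):
        weight = (pA if a else 1 - pA) * (pB if b else 1 - pB)
        p_enum += weight * (a or b)

    print(p_orthogonal, p_enum)   # both 0.72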

  13. Evaluation of a screening instrument for essential tremor

    DEFF Research Database (Denmark)

    Lorenz, Delia; Papengut, Frank; Frederiksen, Henrik

    2008-01-01

    To evaluate a screening instrument for essential tremor (ET) consisting of a seven-item questionnaire and a spiral drawing. A total of 2,448 Danish twins aged 70 years or more and a second sample aged 60 years or more (n = 1,684) from a population-based northern German cross-sectional study (Pop....... Definite or probable ET was diagnosed in 104 patients, possible in 86 and other tremors in 98 patients. The sensitivity of the screening instrument was 70.5%, the positive predictive value was 64.9%, the specificity was 68.2%, and the negative predictive value was 73.5%. Tremor severity correlated...... significantly with higher spiral scores and more positive items. More patients were identified by spiral drawing in all tremor groups. The interrater and intrarater reliability for spirals ranged from 0.7 to 0.8 using intraclass coefficient. A cluster analysis revealed that the questionnaire can be reduced...
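
    The screening statistics quoted above all follow from a 2×2 table of screening outcome against clinical diagnosis. The counts in the sketch below are hypothetical (chosen only to give values of the same order as those reported), showing how sensitivity, specificity, PPV and NPV are computed.

    # How sensitivity, specificity, PPV and NPV follow from a 2x2 screening table.
    # The counts below are hypothetical, not the study's raw data.
    tp, fp = 74, 40     # screen-positive: true tremor cases vs. false alarms
    fn, tn = 31, 86     # screen-negative: missed cases vs. correct negatives

    sensitivity = tp / (tp + fn)     # cases correctly flagged by the screen
    specificity = tn / (tn + fp)     # non-cases correctly passed
    ppv = tp / (tp + fp)             # flagged subjects who truly have ET
    npv = tn / (tn + fn)             # passed subjects who truly do not

    print(f"sensitivity={sensitivity:.3f}, specificity={specificity:.3f}, "
          f"ppv={ppv:.3f}, npv={npv:.3f}")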

  14. Probability sampling in legal cases: Kansas cellphone users

    Science.gov (United States)

    Kadane, Joseph B.

    2012-10-01

    Probability sampling is a standard statistical technique. This article introduces the basic ideas of probability sampling, and shows in detail how probability sampling was used in a particular legal case.

  15. Effects of NMDA receptor antagonists on probability discounting depend on the order of probability presentation.

    Science.gov (United States)

    Yates, Justin R; Breitenstein, Kerry A; Gunkel, Benjamin T; Hughes, Mallory N; Johnson, Anthony B; Rogers, Katherine K; Shape, Sara M

    Risky decision making can be measured using a probability-discounting procedure, in which animals choose between a small, certain reinforcer and a large, uncertain reinforcer. Recent evidence has identified glutamate as a mediator of risky decision making, as blocking the N-methyl-d-aspartate (NMDA) receptor with MK-801 increases preference for a large, uncertain reinforcer. Because the order in which probabilities associated with the large reinforcer are presented can modulate the effects of drugs on choice, the current study determined if NMDA receptor ligands alter probability discounting using ascending and descending schedules. Sixteen rats were trained in a probability-discounting procedure in which the odds against obtaining the large reinforcer increased (n=8) or decreased (n=8) across blocks of trials. Following behavioral training, rats received treatments of the NMDA receptor ligands MK-801 (uncompetitive antagonist; 0, 0.003, 0.01, or 0.03 mg/kg), ketamine (uncompetitive antagonist; 0, 1.0, 5.0, or 10.0 mg/kg), and ifenprodil (NR2B-selective non-competitive antagonist; 0, 1.0, 3.0, or 10.0 mg/kg). Results showed discounting was steeper (indicating increased risk aversion) for rats on an ascending schedule relative to rats on the descending schedule. Furthermore, the effects of MK-801, ketamine, and ifenprodil on discounting were dependent on the schedule used. Specifically, the highest dose of each drug decreased risk taking in rats on the descending schedule, but only MK-801 (0.03 mg/kg) increased risk taking in rats on an ascending schedule. These results show that probability presentation order modulates the effects of NMDA receptor ligands on risky decision making. Copyright © 2016 Elsevier Inc. All rights reserved.

  16. Expert estimation of human error probabilities in nuclear power plant operations: a review of probability assessment and scaling

    International Nuclear Information System (INIS)

    Stillwell, W.G.; Seaver, D.A.; Schwartz, J.P.

    1982-05-01

    This report reviews probability assessment and psychological scaling techniques that could be used to estimate human error probabilities (HEPs) in nuclear power plant operations. The techniques rely on expert opinion and can be used to estimate HEPs where data do not exist or are inadequate. These techniques have been used in various other contexts and have been shown to produce reasonably accurate probabilities. Some problems do exist, and limitations are discussed. Additional topics covered include methods for combining estimates from multiple experts, the effects of training on probability estimates, and some ideas on structuring the relationship between performance shaping factors and HEPs. Preliminary recommendations are provided along with cautions regarding the costs of implementing the recommendations. Additional research is required before definitive recommendations can be made

  17. Stimulus Probability Effects in Absolute Identification

    Science.gov (United States)

    Kent, Christopher; Lamberts, Koen

    2016-01-01

    This study investigated the effect of stimulus presentation probability on accuracy and response times in an absolute identification task. Three schedules of presentation were used to investigate the interaction between presentation probability and stimulus position within the set. Data from individual participants indicated strong effects of…

  18. Probability numeracy and health insurance purchase

    NARCIS (Netherlands)

    Dillingh, Rik; Kooreman, Peter; Potters, Jan

    2016-01-01

    This paper provides new field evidence on the role of probability numeracy in health insurance purchase. Our regression results, based on rich survey panel data, indicate that the expenditure on two out of three measures of health insurance first rises with probability numeracy and then falls again.

  19. Thermodynamics I essentials

    CERN Document Server

    REA, The Editors of

    2012-01-01

    REA's Essentials provide quick and easy access to critical information in a variety of different fields, ranging from the most basic to the most advanced. As its name implies, these concise, comprehensive study guides summarize the essentials of the field covered. Essentials are helpful when preparing for exams, doing homework and will remain a lasting reference source for students, teachers, and professionals. Thermodynamics I includes review of properties and states of a pure substance, work and heat, energy and the first law of thermodynamics, entropy and the second law of thermodynamics

  20. Physics I essentials

    CERN Document Server

    REA, The Editors of

    2012-01-01

    REA's Essentials provide quick and easy access to critical information in a variety of different fields, ranging from the most basic to the most advanced. As its name implies, these concise, comprehensive study guides summarize the essentials of the field covered. Essentials are helpful when preparing for exams, doing homework and will remain a lasting reference source for students, teachers, and professionals. Physics I includes vectors and scalars, one-dimensional motion, plane motion, dynamics of a particle, work and energy, conservation of energy, dynamics of systems of particles, rotation

  1. Beyond Bayes: On the Need for a Unified and Jaynesian Definition of Probability and Information within Neuroscience

    Directory of Open Access Journals (Sweden)

    Christopher D. Fiorillo

    2012-04-01

    Full Text Available It has been proposed that the general function of the brain is inference, which corresponds quantitatively to the minimization of uncertainty (or the maximization of information). However, there has been a lack of clarity about exactly what this means. Efforts to quantify information have been in agreement that it depends on probabilities (through Shannon entropy), but there has long been a dispute about the definition of probabilities themselves. The “frequentist” view is that probabilities are (or can be) essentially equivalent to frequencies, and that they are therefore properties of a physical system, independent of any observer of the system. E.T. Jaynes developed the alternate “Bayesian” definition, in which probabilities are always conditional on a state of knowledge through the rules of logic, as expressed in the maximum entropy principle. In doing so, Jaynes and others provided the objective means for deriving probabilities, as well as a unified account of information and logic (knowledge and reason). However, neuroscience literature virtually never specifies any definition of probability, nor does it acknowledge any dispute concerning the definition. Although there has recently been tremendous interest in Bayesian approaches to the brain, even in the Bayesian literature it is common to find probabilities that are purported to come directly and unconditionally from frequencies. As a result, scientists have mistakenly attributed their own information to the neural systems they study. Here I argue that the adoption of a strictly Jaynesian approach will prevent such errors and will provide us with the philosophical and mathematical framework that is needed to understand the general function of the brain. Accordingly, our challenge becomes the identification of the biophysical basis of Jaynesian information and logic. I begin to address this issue by suggesting how we might identify a probability distribution over states of one physical

  2. Convergence of Transition Probability Matrix in CLV Markov Models

    Science.gov (United States)

    Permana, D.; Pasaribu, U. S.; Indratno, S. W.; Suprayogi, S.

    2018-04-01

    A transition probability matrix is an arrangement of the transition probabilities from one state to another in a Markov chain model (MCM). One interesting aspect of an MCM is its long-run behaviour, which is derived from a property of the n-step transition probability matrix, namely the convergence of the n-step transition matrix as n tends to infinity. Mathematically, establishing convergence means finding the limit of the transition matrix raised to the power n as n tends to infinity. The limiting form of the transition probability matrix is of particular interest because it brings the matrix to its stationary form, which is useful for predicting the probabilities of transitions between states in the future. The method usually used to find the convergence of a transition probability matrix is the limiting-distribution approach. In this paper, the convergence of the transition probability matrix is obtained using a simple concept from linear algebra, namely diagonalization of the matrix. This method has a higher level of complexity because it requires the matrix to be diagonalized, but it has the advantage of yielding a general form for the n-th power of the transition probability matrix, which shows the transition matrix before it becomes stationary. Example cases are taken from a customer lifetime value (CLV) model based on an MCM, called the CLV-Markov model. Several of its transition probability matrices are examined to find their convergence forms. The result is that the convergence of the transition probability matrix obtained through diagonalization agrees with the convergence obtained by the commonly used limiting-distribution method.
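
    The diagonalization route described above can be illustrated numerically. The sketch below, in Python, uses an invented 3-state transition matrix (not one of the CLV-Markov matrices from the paper) to show how P^n, computed from the spectral decomposition, converges to a matrix whose rows are the stationary distribution.

    ```python
    import numpy as np

    # Invented 3-state transition matrix (rows sum to 1); not from the paper.
    P = np.array([[0.7, 0.2, 0.1],
                  [0.3, 0.5, 0.2],
                  [0.2, 0.3, 0.5]])

    # Diagonalize P = V D V^-1, so that P^n = V D^n V^-1 for any power n.
    eigvals, V = np.linalg.eig(P)
    V_inv = np.linalg.inv(V)

    def n_step_matrix(n):
        """P^n computed from the spectral decomposition."""
        return (V @ np.diag(eigvals ** n) @ V_inv).real

    for n in (1, 5, 50):
        print(f"P^{n} =\n{np.round(n_step_matrix(n), 4)}")

    # Every row of P^n approaches the stationary distribution pi, the
    # normalized left eigenvector of P associated with eigenvalue 1.
    w, U = np.linalg.eig(P.T)
    pi = U[:, np.argmin(np.abs(w - 1))].real
    pi = pi / pi.sum()
    print("stationary distribution:", np.round(pi, 4))
    ```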

  3. Probability Theory Plus Noise: Descriptive Estimation and Inferential Judgment.

    Science.gov (United States)

    Costello, Fintan; Watts, Paul

    2018-01-01

    We describe a computational model of two central aspects of people's probabilistic reasoning: descriptive probability estimation and inferential probability judgment. This model assumes that people's reasoning follows standard frequentist probability theory, but it is subject to random noise. This random noise has a regressive effect in descriptive probability estimation, moving probability estimates away from normative probabilities and toward the center of the probability scale. This random noise has an anti-regressive effect in inferential judgement, however. These regressive and anti-regressive effects explain various reliable and systematic biases seen in people's descriptive probability estimation and inferential probability judgment. This model predicts that these contrary effects will tend to cancel out in tasks that involve both descriptive estimation and inferential judgement, leading to unbiased responses in those tasks. We test this model by applying it to one such task, described by Gallistel et al. Participants' median responses in this task were unbiased, agreeing with normative probability theory over the full range of responses. Our model captures the pattern of unbiased responses in this task, while simultaneously explaining systematic biases away from normatively correct probabilities seen in other tasks. Copyright © 2018 Cognitive Science Society, Inc.
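
    A toy simulation helps make the regressive effect concrete. In the sketch below, each remembered outcome is misread with a small probability d, a hypothetical noise rate chosen for illustration rather than the paper's fitted value; mean estimates are pulled toward 0.5.

    ```python
    # "Probability theory plus noise" in miniature: frequency counts are read
    # with random error, so mean estimates regress toward the scale midpoint.
    import random

    def noisy_estimate(p, n=100, d=0.15):
        """Estimate p from n Bernoulli(p) events, each misread with prob. d."""
        hits = 0
        for _ in range(n):
            event = random.random() < p
            if random.random() < d:      # noise flips the remembered outcome
                event = not event
            hits += event
        return hits / n

    random.seed(1)
    for p in (0.1, 0.3, 0.5, 0.7, 0.9):
        mean_est = sum(noisy_estimate(p) for _ in range(2000)) / 2000
        print(f"true p = {p:.1f}  mean estimate = {mean_est:.3f}")
    ```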

  4. Levels of essential and non-essential metals in Ethiopian ouzo

    African Journals Online (AJOL)

    Preferred Customer

    Key words/phrases: Alcoholic beverage, Ethiopia, essential metal, non-essential metal, ouzo. ... ing the attention of scientists and policy makers as a vital part of food security strategies and ... Canadian Government indicated that it had detected ethyl carbamate, C3H7NO2, which ...

  5. VIBRATION ISOLATION SYSTEM PROBABILITY ANALYSIS

    Directory of Open Access Journals (Sweden)

    Smirnov Vladimir Alexandrovich

    2012-10-01

    Full Text Available The article deals with a probability analysis for the vibration isolation system of high-precision equipment, which is extremely sensitive to low-frequency oscillations even of submicron amplitude. The external sources of low-frequency vibrations may include the natural city background or internal low-frequency sources inside buildings (pedestrian activity, HVAC). Assuming a Gaussian distribution, the author estimates the probability that the relative displacement of the isolated mass remains below the vibration criteria. The problem is solved in the three-dimensional space spanned by the system parameters, including damping and natural frequency. From this probability distribution, the chance of exceeding the vibration criteria for a vibration isolation system is evaluated. Optimal system parameters - damping and natural frequency - are then determined so that the probability of exceeding vibration criteria VC-E and VC-D is kept below 0.04.
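
    The core calculation, the probability that a zero-mean Gaussian relative displacement stays within a vibration criterion, can be sketched as follows; the RMS displacement and criterion values are invented placeholders, not the article's figures.

    ```python
    # P(|x| < criterion) for a zero-mean Gaussian relative displacement x.
    import math

    def prob_within(criterion, sigma):
        """Probability that |x| < criterion when x ~ N(0, sigma^2)."""
        return math.erf(criterion / (sigma * math.sqrt(2)))

    sigma = 0.8e-6        # RMS relative displacement, metres (assumed)
    criterion = 2.0e-6    # allowable displacement, metres (assumed)
    p_exceed = 1.0 - prob_within(criterion, sigma)
    print(f"probability of exceeding the criterion: {p_exceed:.4f}")
    ```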

  6. Optimizing Probability of Detection Point Estimate Demonstration

    Science.gov (United States)

    Koshti, Ajay M.

    2017-01-01

    Probability of detection (POD) analysis is used in assessing the reliably detectable flaw size in nondestructive evaluation (NDE). MIL-HDBK-1823 and the associated mh1823 POD software give the most common methods of POD analysis. Real flaws such as cracks and crack-like flaws are the targets of detection with these NDE methods. A reliably detectable crack size is required for safe-life analysis of fracture-critical parts. The paper provides a discussion on optimizing probability of detection (POD) demonstration experiments using the point estimate method, which is used by NASA for qualifying special NDE procedures. The point estimate method uses the binomial distribution for the probability density. Normally, a set of 29 flaws of the same size (within some tolerance) is used in the demonstration. The optimization is performed to provide an acceptable value for the probability of passing the demonstration (PPD) while achieving an acceptable value for the probability of false calls (POF) and keeping the flaw sizes in the set as small as possible.
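
    As a rough illustration of the trade-off discussed above, the sketch below computes the probability of passing an all-hit binomial demonstration as a function of the true POD; the 29-flaw, zero-miss case is the familiar point-estimate design, while the other set sizes and POD values are illustrative.

    ```python
    # Probability of passing a binomial POD demonstration for a given true POD.
    from math import comb

    def prob_pass(pod, n_flaws, max_misses=0):
        """P(at most max_misses missed detections out of n_flaws)."""
        return sum(comb(n_flaws, k) * (1 - pod) ** k * pod ** (n_flaws - k)
                   for k in range(max_misses + 1))

    for n in (29, 45, 59):
        for pod in (0.90, 0.95, 0.99):
            print(f"n = {n:2d}, true POD = {pod:.2f}: "
                  f"probability of passing = {prob_pass(pod, n):.3f}")
    ```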

  7. Some open problems in noncommutative probability

    International Nuclear Information System (INIS)

    Kruszynski, P.

    1981-01-01

    A generalization of probability measures to non-Boolean structures is discussed. The starting point of the theory is the Gleason theorem about the form of measures on closed subspaces of a Hilbert space. The problems are formulated in terms of probability on lattices of projections in arbitrary von Neumann algebras. (Auth.)

  8. Low blood selenium: A probable factor in essential hypertension ...

    African Journals Online (AJOL)

    Blood selenium (BSe) and plasma glutathione peroxidase (plGSH-Px) activity were measured as biochemical markers of selenium status of 103 hypertensive patients (44 males and 59 females) and 88 apparently healthy subjects (40 males and 48 females). The hypertensive patients were classified into three groups based ...

  9. Counterexamples in probability

    CERN Document Server

    Stoyanov, Jordan M

    2013-01-01

    While most mathematical examples illustrate the truth of a statement, counterexamples demonstrate a statement's falsity. Enjoyable topics of study, counterexamples are valuable tools for teaching and learning. The definitive book on the subject in regards to probability, this third edition features the author's revisions and corrections plus a substantial new appendix.

  10. On the shake-off probability for atomic systems

    Energy Technology Data Exchange (ETDEWEB)

    Santos, A.C.F., E-mail: toniufrj@gmail.com [Instituto de Física, Universidade Federal do Rio de Janeiro, P.O. Box 68528, 21941-972 Rio de Janeiro, RJ (Brazil); Almeida, D.P. [Departamento de Física, Universidade Federal de Santa Catarina, 88040-900 Florianópolis (Brazil)

    2016-07-15

    Highlights: • The scope is to find the relationship among SO probabilities, Z and electron density. • A scaling law is suggested, allowing us to find the SO probabilities for atoms. • SO probabilities have been scaled as a function of target Z and polarizability. - Abstract: The main focus of this work is the relationship between shake-off probabilities, target atomic number and electron density. By comparing the saturation values of measured double-to-single photoionization ratios from the literature, a simple scaling law has been found, which allows us to predict the shake-off probabilities for several elements up to Z = 54 within a factor of 2. The electron shake-off probabilities accompanying valence shell photoionization have been scaled as a function of the target atomic number, Z, and polarizability, α. This behavior is in qualitative agreement with the experimental results.

  11. Levels of essential and non-essential metals in Rhamnus prinoides ...

    African Journals Online (AJOL)

    The objective of this study was to assess the levels of essential and toxic metals in leaf and stem of Rhamnus prinoides which are used for bitterness of local alcoholic beverages in Ethiopia and as traditional medicine in some African countries. Levels of essential metals (Ca, Mg, Cr, Mn, Fe, Co, Ni, Cu and Zn) and toxic ...

  12. Simulations of Probabilities for Quantum Computing

    Science.gov (United States)

    Zak, M.

    1996-01-01

    It has been demonstrated that classical probabilities, and in particular a probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without the utilization of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities and their link to quantum computations are discussed.

  13. Application of Non-Kolmogorovian Probability and Quantum Adaptive Dynamics to Unconscious Inference in Visual Perception Process

    Science.gov (United States)

    Accardi, Luigi; Khrennikov, Andrei; Ohya, Masanori; Tanaka, Yoshiharu; Yamato, Ichiro

    2016-07-01

    Recently a novel quantum information formalism — quantum adaptive dynamics — was developed and applied to modelling of information processing by bio-systems including cognitive phenomena: from molecular biology (glucose-lactose metabolism for E. coli bacteria, epigenetic evolution) to cognition and psychology. From the foundational point of view, quantum adaptive dynamics describes mutual adapting of the information states of two interacting systems (physical or biological) as well as adapting of co-observations performed by the systems. In this paper we apply this formalism to model unconscious inference: the process of transition from sensation to perception. The paper combines theory and experiment. Statistical data collected in an experimental study on recognition of a particular ambiguous figure, the Schröder stairs, support the viability of the quantum(-like) model of unconscious inference including modelling of biases generated by rotation-contexts. From the probabilistic point of view, we study (for concrete experimental data) the problem of contextuality of probability, its dependence on experimental contexts. Mathematically, contextuality leads to non-Kolmogorovness: probability distributions generated by various rotation contexts cannot be treated in the Kolmogorovian framework. At the same time they can be embedded in a “big Kolmogorov space” as conditional probabilities. However, such a Kolmogorov space has too complex a structure, and the operational quantum formalism in the form of quantum adaptive dynamics essentially simplifies the modelling.

  14. Application of a weighted spatial probability model in GIS to analyse landslides in Penang Island, Malaysia

    Directory of Open Access Journals (Sweden)

    Samy Ismail Elmahdy

    2016-01-01

    Full Text Available In the current study, Penang Island, which is one of several mountainous areas in Malaysia that are often subjected to landslide hazard, was chosen for further investigation. A multi-criteria evaluation, a weighted spatial probability approach and a model builder were applied to map and analyse landslides in Penang Island. A set of automated algorithms was used to construct new essential geological and morphometric thematic maps from remote sensing data. The maps were ranked using the weighted spatial probability model based on their contribution to the landslide hazard. The results showed that sites at an elevation of 100–300 m, with steep slopes of 10°–37° and slope direction (aspect) in the E and SE directions, were areas of very high and high probability for landslide occurrence; the total areas were 21.393 km2 (11.84%) and 58.690 km2 (32.48%), respectively. The obtained map was verified by comparing variogram models of the mapped and the occurred landslide locations and showed a strong correlation with the locations of occurred landslides, indicating that the proposed method can successfully predict the unpredictable landslide hazard. The method is time and cost effective and can be used as a reference for geological and geotechnical engineers.
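
    A weighted overlay of ranked criterion layers, the essence of a weighted spatial probability model, can be sketched in a few lines; the tiny rasters and weights below are invented for illustration and are not the values derived for Penang Island.

    ```python
    # Schematic weighted overlay of criterion layers already ranked 0..1
    # (1 = most landslide-prone); all values and weights are placeholders.
    import numpy as np

    slope_score     = np.array([[0.9, 0.7, 0.2], [0.8, 0.5, 0.1], [0.6, 0.3, 0.1]])
    elevation_score = np.array([[0.8, 0.6, 0.3], [0.7, 0.5, 0.2], [0.4, 0.2, 0.1]])
    aspect_score    = np.array([[0.7, 0.7, 0.4], [0.6, 0.5, 0.3], [0.5, 0.2, 0.2]])

    weights = {"slope": 0.5, "elevation": 0.3, "aspect": 0.2}   # assumed weights

    hazard = (weights["slope"] * slope_score
              + weights["elevation"] * elevation_score
              + weights["aspect"] * aspect_score)

    # Classify the weighted score into probability classes.
    classes = np.where(hazard >= 0.6, "high", "low")
    print(hazard.round(2))
    print(classes)
    ```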

  15. Probability theory a foundational course

    CERN Document Server

    Pakshirajan, R P

    2013-01-01

    This book shares the dictum of J. L. Doob in treating Probability Theory as a branch of Measure Theory and establishes this relation early. Probability measures in product spaces are introduced right at the start by way of laying the groundwork to later claim the existence of stochastic processes with prescribed finite dimensional distributions. Other topics analysed in the book include supports of probability measures, zero-one laws in product measure spaces, the Erdos-Kac invariance principle, the functional central limit theorem and functional law of the iterated logarithm for independent variables, Skorohod embedding, and the use of analytic functions of a complex variable in the study of geometric ergodicity in Markov chains. This book is offered as a textbook for students pursuing graduate programs in Mathematics and/or Statistics. The book aims to help the teacher present the theory with ease, and to help the student sustain his interest and joy in learning the subject.

  16. Epistemology and Probability

    CERN Document Server

    Plotnitsky, Arkady

    2010-01-01

    Offers an exploration of the relationships between epistemology and probability in the work of Niels Bohr, Werner Heisenberg, and Erwin Schrodinger; in quantum mechanics; and in modern physics. This book considers the implications of these relationships and of quantum theory for our understanding of the nature of thinking and knowledge in general

  17. Essential Palatal Myoclonus

    Directory of Open Access Journals (Sweden)

    Bhuwan Raj Pandey

    2017-06-01

    Full Text Available Introduction: Palatal myoclonus is a rare condition presenting with a clicking sound in the ear or muscle tremor in the pharynx. There are two varieties: essential and symptomatic. Various treatment options exist, ranging from watchful observation to botulinum toxin injection. We have not found any reported case of palatal myoclonus from our country. Here we present a case of essential palatal myoclonus managed with clonazepam. Case report: A young female presented to the Ear, Nose and Throat clinic with a complaint of an auditory click and spontaneous rhythmic movement of the throat muscles for eight months. On examination, there was involuntary, rhythmic contraction of the bilateral soft palate, uvula, and base of the tongue. Neurological, eye, and peripheral examinations were normal. A diagnosis of essential palatal myoclonus was made. It was managed successfully with clonazepam; the patient was still on low-dose clonazepam at the time of this report. Conclusion: Essential palatal myoclonus can be clinically diagnosed and managed even in settings where MRI is not available or affordable.

  18. Analytic Neutrino Oscillation Probabilities in Matter: Revisited

    Energy Technology Data Exchange (ETDEWEB)

    Parke, Stephen J. [Fermilab; Denton, Peter B. [Copenhagen U.; Minakata, Hisakazu [Madrid, IFT

    2018-01-02

    We summarize our recent paper on neutrino oscillation probabilities in matter, explaining the importance, relevance and need for simple, highly accurate approximations to the neutrino oscillation probabilities in matter.

  19. The estimation of small probabilities and risk assessment

    International Nuclear Information System (INIS)

    Kalbfleisch, J.D.; Lawless, J.F.; MacKay, R.J.

    1982-01-01

    The primary contribution of statistics to risk assessment is in the estimation of probabilities. Frequently the probabilities in question are small, and their estimation is particularly difficult. The authors consider three examples illustrating some problems inherent in the estimation of small probabilities

  20. Probability Weighting and Loss Aversion in Futures Hedging

    NARCIS (Netherlands)

    Mattos, F.; Garcia, P.; Pennings, J.M.E.

    2008-01-01

    We analyze how the introduction of probability weighting and loss aversion in a futures hedging model affects decision making. Analytical findings indicate that probability weighting alone always affects optimal hedge ratios, while loss and risk aversion only have an impact when probability
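
    For readers unfamiliar with probability weighting, the sketch below evaluates a common one-parameter inverse-S weighting function (the Tversky-Kahneman form); it is shown only to illustrate the general idea, and the paper's own specification and parameter values may differ.

    ```python
    # Tversky-Kahneman probability weighting: overweights small probabilities
    # and underweights large ones; gamma = 0.61 is a commonly quoted value.
    def tk_weight(p, gamma=0.61):
        return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

    for p in (0.01, 0.1, 0.5, 0.9, 0.99):
        print(f"p = {p:.2f}  w(p) = {tk_weight(p):.3f}")
    ```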

  1. Probability theory and applications

    CERN Document Server

    Hsu, Elton P

    1999-01-01

    This volume, with contributions by leading experts in the field, is a collection of lecture notes of the six minicourses given at the IAS/Park City Summer Mathematics Institute. It introduces advanced graduates and researchers in probability theory to several of the currently active research areas in the field. Each course is self-contained with references and contains basic materials and recent results. Topics include interacting particle systems, percolation theory, analysis on path and loop spaces, and mathematical finance. The volume gives a balanced overview of the current status of probability theory. An extensive bibliography for further study and research is included. This unique collection presents several important areas of current research and a valuable survey reflecting the diversity of the field.

  2. Waste Package Misload Probability

    International Nuclear Information System (INIS)

    Knudsen, J.K.

    2001-01-01

    The objective of this calculation is to determine the probability of occurrence of fuel assembly (FA) misloads (i.e., an FA placed in the wrong location) and FA damage during FA movements. The scope of this calculation is provided by the information obtained from the Framatome ANP 2001a report. The first step in this calculation is to categorize each fuel-handling event that occurred at nuclear power plants. The different categories are based on FAs being damaged or misloaded. The next step is to determine the total number of FAs involved in the event. Using this information, a probability of occurrence is calculated for FA misload and FA damage events. This calculation is an expansion of preliminary work performed by Framatome ANP 2001a.

  3. Concept of probability in statistical physics

    CERN Document Server

    Guttmann, Y M

    1999-01-01

    Foundational issues in statistical mechanics and the more general question of how probability is to be understood in the context of physical theories are both areas that have been neglected by philosophers of physics. This book fills an important gap in the literature by providing a most systematic study of how to interpret probabilistic assertions in the context of statistical mechanics. The book explores both subjectivist and objectivist accounts of probability, and takes full measure of work in the foundations of probability theory, in statistical mechanics, and in mathematical theory. It will be of particular interest to philosophers of science, physicists and mathematicians interested in foundational issues, and also to historians of science.

  4. Maximum Entropy and Probability Kinematics Constrained by Conditionals

    Directory of Open Access Journals (Sweden)

    Stefan Lukits

    2015-03-01

    Full Text Available Two open questions of inductive reasoning are solved: (1) does the principle of maximum entropy (PME) give a solution to the obverse Majerník problem; and (2) is Wagner correct when he claims that Jeffrey’s updating principle (JUP) contradicts PME? Majerník shows that PME provides unique and plausible marginal probabilities, given conditional probabilities. The obverse problem posed here is whether PME also provides such conditional probabilities, given certain marginal probabilities. The theorem developed to solve the obverse Majerník problem demonstrates that in the special case introduced by Wagner PME does not contradict JUP, but elegantly generalizes it and offers a more integrated approach to probability updating.
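
    The maximum entropy principle (PME) referred to above can be illustrated with Jaynes's classic die example: find the distribution on {1,...,6} of greatest entropy subject to a prescribed mean. The target mean below is illustrative.

    ```python
    # Maximum-entropy distribution on {1,...,6} with a prescribed mean.
    import numpy as np
    from scipy.optimize import minimize

    values = np.arange(1, 7)
    target_mean = 4.5

    def neg_entropy(p):
        p = np.clip(p, 1e-12, 1.0)
        return np.sum(p * np.log(p))

    constraints = [
        {"type": "eq", "fun": lambda p: p.sum() - 1.0},
        {"type": "eq", "fun": lambda p: p @ values - target_mean},
    ]
    result = minimize(neg_entropy, x0=np.full(6, 1 / 6), bounds=[(0, 1)] * 6,
                      constraints=constraints, method="SLSQP")
    print("maximum-entropy probabilities:", result.x.round(4))
    print("mean:", (result.x @ values).round(3))
    ```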

  5. Determining probabilities of geologic events and processes

    International Nuclear Information System (INIS)

    Hunter, R.L.; Mann, C.J.; Cranwell, R.M.

    1985-01-01

    The Environmental Protection Agency has recently published a probabilistic standard for releases of high-level radioactive waste from a mined geologic repository. The standard sets limits for contaminant releases with more than one chance in 100 of occurring within 10,000 years, and less strict limits for releases of lower probability. The standard offers no methods for determining probabilities of geologic events and processes, and no consensus exists in the waste-management community on how to do this. Sandia National Laboratories is developing a general method for determining probabilities of a given set of geologic events and processes. In addition, we will develop a repeatable method for dealing with events and processes whose probability cannot be determined. 22 refs., 4 figs

  6. Probability and statistics with integrated software routines

    CERN Document Server

    Deep, Ronald

    2005-01-01

    Probability & Statistics with Integrated Software Routines is a calculus-based treatment of probability concurrent with and integrated with statistics through interactive, tailored software applications designed to enhance the phenomena of probability and statistics. The software programs make the book unique. The book comes with a CD containing the interactive software leading to the Statistical Genie. The student can issue commands repeatedly while making parameter changes to observe the effects. Computer programming is an excellent skill for problem solvers, involving design, prototyping, data gathering, testing, redesign, validating, etc., all wrapped up in the scientific method. See also: CD to accompany Probability and Stats with Integrated Software Routines (0123694698). * Incorporates more than 1,000 engaging problems with answers. * Includes more than 300 solved examples. * Uses varied problem-solving methods.

  7. Random phenomena fundamentals of probability and statistics for engineers

    CERN Document Server

    Ogunnaike, Babatunde A

    2009-01-01

    Prelude; Approach Philosophy; Four Basic Principles; I. Foundations: Two Motivating Examples; Yield Improvement in a Chemical Process; Quality Assurance in a Glass Sheet Manufacturing Process; Outline of a Systematic Approach; Random Phenomena, Variability, and Uncertainty; Two Extreme Idealizations of Natural Phenomena; Random Mass Phenomena; Introducing Probability; The Probabilistic Framework; II. Probability: Fundamentals of Probability Theory; Building Blocks; Operations; Probability; Conditional Probability; Independence; Random Variables and Distributions; Distributions; Mathematical Expectation; Characterizing Distributions; Special Derived Probability Functions; Multidimensional Random Variables; Distributions of Several Random Variables; Distributional Characteristics of Jointly Distributed Random Variables; Random Variable Transformations; Single Variable Transformations; Bivariate Transformations; General Multivariate Transformations; Application Case Studies I: Probability; Mendel and Heredity; World War II Warship Tactical Response Under Attack; III. Distributions; Ide...

  8. Examples of Neutrosophic Probability in Physics

    Directory of Open Access Journals (Sweden)

    Fu Yuhua

    2015-01-01

    Full Text Available This paper re-discusses the problems of the so-called “law of nonconservation of parity” and “accelerating expansion of the universe”, and presents the examples of determining Neutrosophic Probability of the experiment of Chien-Shiung Wu et al in 1957, and determining Neutrosophic Probability of accelerating expansion of the partial universe.

  9. Probability elements of the mathematical theory

    CERN Document Server

    Heathcote, C R

    2000-01-01

    Designed for students studying mathematical statistics and probability after completing a course in calculus and real variables, this text deals with basic notions of probability spaces, random variables, distribution functions and generating functions, as well as joint distributions and the convergence properties of sequences of random variables. Includes worked examples and over 250 exercises with solutions.

  10. Electrofishing capture probability of smallmouth bass in streams

    Science.gov (United States)

    Dauwalter, D.C.; Fisher, W.L.

    2007-01-01

    Abundance estimation is an integral part of understanding the ecology and advancing the management of fish populations and communities. Mark-recapture and removal methods are commonly used to estimate the abundance of stream fishes. Alternatively, abundance can be estimated by dividing the number of individuals sampled by the probability of capture. We conducted a mark-recapture study and used multiple repeated-measures logistic regression to determine the influence of fish size, sampling procedures, and stream habitat variables on the cumulative capture probability for smallmouth bass Micropterus dolomieu in two eastern Oklahoma streams. The predicted capture probability was used to adjust the number of individuals sampled to obtain abundance estimates. The observed capture probabilities were higher for larger fish and decreased with successive electrofishing passes for larger fish only. Model selection suggested that the number of electrofishing passes, fish length, and mean thalweg depth affected capture probabilities the most; there was little evidence for any effect of electrofishing power density and woody debris density on capture probability. Leave-one-out cross validation showed that the cumulative capture probability model predicts smallmouth abundance accurately. © Copyright by the American Fisheries Society 2007.
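
    The adjustment described above, dividing the count by the predicted cumulative capture probability, can be sketched as follows; the logistic coefficients and field values are invented placeholders rather than estimates from the study.

    ```python
    # Abundance estimate from a raw count and an assumed logistic capture model.
    import math

    def capture_probability(length_mm, depth_m, n_passes,
                            b0=-2.0, b_len=0.004, b_depth=-1.2, b_pass=0.9):
        """Cumulative capture probability from an assumed logistic model."""
        logit = b0 + b_len * length_mm + b_depth * depth_m + b_pass * n_passes
        return 1.0 / (1.0 + math.exp(-logit))

    n_sampled = 42                      # fish handled in the reach (assumed)
    p_capture = capture_probability(length_mm=220, depth_m=0.6, n_passes=3)
    abundance = n_sampled / p_capture   # N-hat = count / capture probability
    print(f"capture probability = {p_capture:.2f}, abundance estimate = {abundance:.0f}")
    ```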

  11. On the Hitting Probability of Max-Stable Processes

    OpenAIRE

    Hofmann, Martin

    2012-01-01

    The probability that a max-stable process η in C[0, 1] with identical marginal distribution function F hits x ∈ ℝ with 0 < F(x) < 1 is the hitting probability of x. We show that the hitting probability is always positive, unless the components of η are completely dependent. Moreover, we consider the event that the paths of a standard MSP hit some x ∈ ℝ twice, and we give a sufficient condition for a positive probability of this event.

  12. Risk estimation using probability machines

    Science.gov (United States)

    2014-01-01

    Background Logistic regression has been the de facto, and often the only, model used in the description and analysis of relationships between a binary outcome and observed features. It is widely used to obtain the conditional probabilities of the outcome given predictors, as well as predictor effect size estimates using conditional odds ratios. Results We show how statistical learning machines for binary outcomes, provably consistent for the nonparametric regression problem, can be used to provide both consistent conditional probability estimation and conditional effect size estimates. Effect size estimates from learning machines leverage our understanding of counterfactual arguments central to the interpretation of such estimates. We show that, if the data generating model is logistic, we can recover accurate probability predictions and effect size estimates with nearly the same efficiency as a correct logistic model, both for main effects and interactions. We also propose a method using learning machines to scan for possible interaction effects quickly and efficiently. Simulations using random forest probability machines are presented. Conclusions The models we propose make no assumptions about the data structure, and capture the patterns in the data by just specifying the predictors involved and not any particular model structure. So they do not run the same risks of model mis-specification and the resultant estimation biases as a logistic model. This methodology, which we call a “risk machine”, will share properties from the statistical machine that it is derived from. PMID:24581306
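
    A minimal version of such a probability machine can be put together with an off-the-shelf random forest; the simulated logistic data-generating model below is an assumption for illustration, not the article's data.

    ```python
    # A random forest used as a "probability machine": nonparametric estimation
    # of conditional probabilities and a crude counterfactual risk difference.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(0)
    n = 5000
    X = rng.normal(size=(n, 3))
    true_p = 1 / (1 + np.exp(-(0.8 * X[:, 0] - 1.2 * X[:, 1]
                               + 0.5 * X[:, 0] * X[:, 1])))
    y = rng.random(n) < true_p

    forest = RandomForestClassifier(n_estimators=300, min_samples_leaf=20,
                                    random_state=0).fit(X, y)
    p_hat = forest.predict_proba(X)[:, 1]

    # Effect-size analogue: change in predicted risk when x0 is shifted by one.
    X_shift = X.copy()
    X_shift[:, 0] += 1.0
    risk_difference = forest.predict_proba(X_shift)[:, 1] - p_hat
    print("mean absolute error of risk estimates:",
          np.abs(p_hat - true_p).mean().round(3))
    print("average risk difference for a unit increase in x0:",
          risk_difference.mean().round(3))
    ```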

  13. Fixation Probability in a Haploid-Diploid Population.

    Science.gov (United States)

    Bessho, Kazuhiro; Otto, Sarah P

    2017-01-01

    Classical population genetic theory generally assumes either a fully haploid or fully diploid life cycle. However, many organisms exhibit more complex life cycles, with both free-living haploid and diploid stages. Here we ask what the probability of fixation is for selected alleles in organisms with haploid-diploid life cycles. We develop a genetic model that considers the population dynamics using both the Moran model and Wright-Fisher model. Applying a branching process approximation, we obtain an accurate fixation probability assuming that the population is large and the net effect of the mutation is beneficial. We also find the diffusion approximation for the fixation probability, which is accurate even in small populations and for deleterious alleles, as long as selection is weak. These fixation probabilities from branching process and diffusion approximations are similar when selection is weak for beneficial mutations that are not fully recessive. In many cases, particularly when one phase predominates, the fixation probability differs substantially for haploid-diploid organisms compared to either fully haploid or diploid species. Copyright © 2017 by the Genetics Society of America.
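
    For comparison with the haploid-diploid results discussed above, the sketch below evaluates the classical single-population diffusion approximation for fixation probability (Kimura's formula); the population size and selection coefficients are illustrative, and this is not the paper's haploid-diploid expression.

    ```python
    # Kimura's diffusion approximation for the fixation probability of an
    # allele with selection coefficient s and effective population size N.
    import math

    def fixation_probability(s, N, p0):
        """P(fixation) for initial frequency p0."""
        if s == 0:
            return p0                      # neutral case
        return (1 - math.exp(-4 * N * s * p0)) / (1 - math.exp(-4 * N * s))

    N = 1000
    for s in (-0.001, 0.0, 0.001, 0.01):
        print(f"s = {s:+.3f}: fixation probability of a new mutant = "
              f"{fixation_probability(s, N, p0=1 / (2 * N)):.5f}")
    ```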

  14. Causal inference, probability theory, and graphical insights.

    Science.gov (United States)

    Baker, Stuart G

    2013-11-10

    Causal inference from observational studies is a fundamental topic in biostatistics. The causal graph literature typically views probability theory as insufficient to express causal concepts in observational studies. In contrast, the view here is that probability theory is a desirable and sufficient basis for many topics in causal inference for the following two reasons. First, probability theory is generally more flexible than causal graphs: Besides explaining such causal graph topics as M-bias (adjusting for a collider) and bias amplification and attenuation (when adjusting for instrumental variable), probability theory is also the foundation of the paired availability design for historical controls, which does not fit into a causal graph framework. Second, probability theory is the basis for insightful graphical displays including the BK-Plot for understanding Simpson's paradox with a binary confounder, the BK2-Plot for understanding bias amplification and attenuation in the presence of an unobserved binary confounder, and the PAD-Plot for understanding the principal stratification component of the paired availability design. Published 2013. This article is a US Government work and is in the public domain in the USA.

  15. Chance, determinism and the classical theory of probability.

    Science.gov (United States)

    Vasudevan, Anubav

    2018-02-01

    This paper situates the metaphysical antinomy between chance and determinism in the historical context of some of the earliest developments in the mathematical theory of probability. Since Hacking's seminal work on the subject, it has been a widely held view that the classical theorists of probability were guilty of an unwitting equivocation between a subjective, or epistemic, interpretation of probability, on the one hand, and an objective, or statistical, interpretation, on the other. While there is some truth to this account, I argue that the tension at the heart of the classical theory of probability is not best understood in terms of the duality between subjective and objective interpretations of probability. Rather, the apparent paradox of chance and determinism, when viewed through the lens of the classical theory of probability, manifests itself in a much deeper ambivalence on the part of the classical probabilists as to the rational commensurability of causal and probabilistic reasoning. Copyright © 2017 Elsevier Ltd. All rights reserved.

  16. Probability, statistics, and queueing theory

    CERN Document Server

    Allen, Arnold O

    1990-01-01

    This is a textbook on applied probability and statistics with computer science applications for students at the upper undergraduate level. It may also be used as a self study book for the practicing computer science professional. The successful first edition of this book proved extremely useful to students who need to use probability, statistics and queueing theory to solve problems in other fields, such as engineering, physics, operations research, and management science. The book has also been successfully used for courses in queueing theory for operations research students. This second edit

  17. Probability, Statistics, and Stochastic Processes

    CERN Document Server

    Olofsson, Peter

    2011-01-01

    A mathematical and intuitive approach to probability, statistics, and stochastic processes This textbook provides a unique, balanced approach to probability, statistics, and stochastic processes. Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area. This text combines a rigorous, calculus-based development of theory with a more intuitive approach that appeals to readers' sense of reason and logic, an approach developed through the author's many years of classroom experience. The text begins with three chapters that d

  18. Probability and Statistics: 5 Questions

    DEFF Research Database (Denmark)

    Probability and Statistics: 5 Questions is a collection of short interviews based on 5 questions presented to some of the most influential and prominent scholars in probability and statistics. We hear their views on the fields, aims, scopes, the future direction of research and how their work fits in these respects. Interviews with Nick Bingham, Luc Bovens, Terrence L. Fine, Haim Gaifman, Donald Gillies, James Hawthorne, Carl Hoefer, James M. Joyce, Joseph B. Kadane, Isaac Levi, D.H. Mellor, Patrick Suppes, Jan von Plato, Carl Wagner, Sandy Zabell...

  19. Essential Bacillus subtilis genes

    DEFF Research Database (Denmark)

    Kobayashi, K.; Ehrlich, S.D.; Albertini, A.

    2003-01-01

    To estimate the minimal gene set required to sustain bacterial life in nutritious conditions, we carried out a systematic inactivation of Bacillus subtilis genes. Among approximately 4,100 genes of the organism, only 192 were shown to be indispensable by this or previous work. Another 79 genes were predicted to be essential. The vast majority of essential genes were categorized in relatively few domains of cell metabolism, with about half involved in information processing, one-fifth involved in the synthesis of cell envelope and the determination of cell shape and division, and one-tenth related to cell energetics. Only 4% of essential genes encode unknown functions. Most essential genes are present throughout a wide range of Bacteria, and almost 70% can also be found in Archaea and Eucarya. However, essential genes related to cell envelope, shape, division, and respiration tend to be lost from...

  20. 14 CFR 417.224 - Probability of failure analysis.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 Probability of failure analysis. 417.224..., DEPARTMENT OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.224 Probability of failure..., must account for launch vehicle failure probability in a consistent manner. A launch vehicle failure...

  1. Elements of probability and statistics an introduction to probability with De Finetti’s approach and to Bayesian statistics

    CERN Document Server

    Biagini, Francesca

    2016-01-01

    This book provides an introduction to elementary probability and to Bayesian statistics using de Finetti's subjectivist approach. One of the features of this approach is that it does not require the introduction of sample space – a non-intrinsic concept that makes the treatment of elementary probability unnecessarily complicated – but introduces as fundamental the concept of random numbers directly related to their interpretation in applications. Events become a particular case of random numbers and probability a particular case of expectation when it is applied to events. The subjective evaluation of expectation and of conditional expectation is based on an economic choice of an acceptable bet or penalty. The properties of expectation and conditional expectation are derived by applying a coherence criterion that the evaluation has to follow. The book is suitable for all introductory courses in probability and statistics for students in Mathematics, Informatics, Engineering, and Physics.

  2. The collision probability modules of WIMS-E

    International Nuclear Information System (INIS)

    Roth, M.J.

    1985-04-01

    This report describes how flat source first flight collision probabilities are calculated and used in the WIMS-E modular program. It includes a description of the input to the modules W-FLU, W-THES, W-PIP, W-PERS and W-MERGE. Input to other collision probability modules is described in separate reports. WIMS-E is capable of calculating collision probabilities in a wide variety of geometries, some of them quite complicated. It can also use them for a variety of purposes. (author)

  3. Fixation probability on clique-based graphs

    Science.gov (United States)

    Choi, Jeong-Ok; Yu, Unjong

    2018-02-01

    The fixation probability of a mutant in the evolutionary dynamics of the Moran process is calculated by the Monte-Carlo method on a few families of clique-based graphs. It is shown that the complete suppression of fixation can be realized with the generalized clique-wheel graph in the limit of small wheel-clique ratio and infinite size. The clique-star family is an amplifier, and the clique-arms graph changes from amplifier to suppressor as the fitness of the mutant increases. We demonstrate that the overall structure of a graph can be more important in determining the fixation probability than the degree or the heat heterogeneity. The dependence of the fixation probability on the position of the first mutant is discussed.
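
    A bare-bones Monte-Carlo estimate of the fixation probability under the Moran birth-death process on a graph looks like the sketch below; the small complete graph and fitness value are illustrative rather than one of the clique-based families studied in the paper, and the complete graph is used because its exact fixation probability is known for checking.

    ```python
    # Monte-Carlo fixation probability of a single mutant under the Moran
    # birth-death process on an undirected graph given as adjacency lists.
    import random

    def moran_fixation(adjacency, fitness, trials=10000, seed=0):
        random.seed(seed)
        n = len(adjacency)
        fixed = 0
        for _ in range(trials):
            mutant = [False] * n
            mutant[random.randrange(n)] = True        # random initial mutant
            count = 1
            while 0 < count < n:
                # Birth: choose a reproducer proportional to fitness.
                weights = [fitness if mutant[v] else 1.0 for v in range(n)]
                parent = random.choices(range(n), weights=weights)[0]
                # Death: the offspring replaces a random neighbour.
                victim = random.choice(adjacency[parent])
                count += int(mutant[parent]) - int(mutant[victim])
                mutant[victim] = mutant[parent]
            fixed += (count == n)
        return fixed / trials

    n = 8
    complete_graph = [[j for j in range(n) if j != i] for i in range(n)]
    r = 1.2
    print("simulated fixation probability:", moran_fixation(complete_graph, r))
    # The complete graph is isothermal, so the exact value is the Moran result.
    print("Moran formula:", (1 - 1 / r) / (1 - 1 / r ** n))
    ```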

  4. Correlations and Non-Linear Probability Models

    DEFF Research Database (Denmark)

    Breen, Richard; Holm, Anders; Karlson, Kristian Bernt

    2014-01-01

    Although the parameters of logit and probit and other non-linear probability models are often explained and interpreted in relation to the regression coefficients of an underlying linear latent variable model, we argue that they may also be usefully interpreted in terms of the correlations between the dependent variable of the latent variable model and its predictor variables. We show how this correlation can be derived from the parameters of non-linear probability models, develop tests for the statistical significance of the derived correlation, and illustrate its usefulness in two applications. Under certain circumstances, which we explain, the derived correlation provides a way of overcoming the problems inherent in cross-sample comparisons of the parameters of non-linear probability models.

  5. Paradoxes in probability theory

    CERN Document Server

    Eckhardt, William

    2013-01-01

    Paradoxes provide a vehicle for exposing misinterpretations and misapplications of accepted principles. This book discusses seven paradoxes surrounding probability theory.  Some remain the focus of controversy; others have allegedly been solved, however the accepted solutions are demonstrably incorrect. Each paradox is shown to rest on one or more fallacies.  Instead of the esoteric, idiosyncratic, and untested methods that have been brought to bear on these problems, the book invokes uncontroversial probability principles, acceptable both to frequentists and subjectivists. The philosophical disputation inspired by these paradoxes is shown to be misguided and unnecessary; for instance, startling claims concerning human destiny and the nature of reality are directly related to fallacious reasoning in a betting paradox, and a problem analyzed in philosophy journals is resolved by means of a computer program.

  6. Probabilities the little numbers that rule our lives

    CERN Document Server

    Olofsson, Peter

    2014-01-01

    Praise for the First Edition"If there is anything you want to know, or remind yourself, about probabilities, then look no further than this comprehensive, yet wittily written and enjoyable, compendium of how to apply probability calculations in real-world situations."- Keith Devlin, Stanford University, National Public Radio's "Math Guy" and author of The Math Gene and The Unfinished GameFrom probable improbabilities to regular irregularities, Probabilities: The Little Numbers That Rule Our Lives, Second Edition investigates the often surprising effects of risk and chance in our lives. Featur

  7. Probability and Statistical Inference

    OpenAIRE

    Prosper, Harrison B.

    2006-01-01

    These lectures introduce key concepts in probability and statistical inference at a level suitable for graduate students in particle physics. Our goal is to paint as vivid a picture as possible of the concepts covered.

  8. On the sum of Gamma-Gamma variates with application to the fast outage probability evaluation over fading channels

    KAUST Repository

    Ben Issaid, Chaouki

    2017-04-01

    The Gamma-Gamma distribution has recently emerged in a number of applications ranging from modeling scattering and reverberation in sonar and radar systems to modeling atmospheric turbulence in wireless optical channels. In this respect, assessing the outage probability achieved by some diversity techniques over this kind of channel is of major practical importance. In many circumstances, this is intimately related to the difficult question of analyzing the statistics of a sum of Gamma-Gamma random variables. Answering this question is not a simple matter. This is essentially because outage probabilities encountered in practice are often very small, and hence the use of classical Monte Carlo methods is not a reasonable choice. This lies behind the main motivation of the present work. In particular, this paper proposes a new approach to estimate the left tail of the sum of independent and identically distributed Gamma-Gamma variates. More specifically, we propose a mean-shift importance sampling scheme that efficiently evaluates the outage probability of L-branch maximum ratio combining diversity receivers over Gamma-Gamma fading channels. The proposed estimator satisfies the well-known bounded relative error criterion. We show the accuracy and the efficiency of our approach compared to naive Monte Carlo via some selected numerical simulations.
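
    The mean-shift importance sampling scheme itself is not reproduced here, but the naive Monte Carlo baseline it improves upon is easy to sketch, and it shows why small outage probabilities are expensive to estimate; the turbulence parameters, number of branches and threshold below are illustrative assumptions.

    ```python
    # Naive Monte-Carlo estimate of P(sum of L i.i.d. Gamma-Gamma variates <
    # threshold). A Gamma-Gamma variate is the product of two independent
    # unit-mean Gamma variates with shape parameters alpha and beta.
    import numpy as np

    rng = np.random.default_rng(0)
    alpha, beta = 2.0, 1.5      # turbulence parameters (assumed)
    L = 4                       # number of combined branches (assumed)
    threshold = 1.0             # outage threshold on the combined gain (assumed)
    n_samples = 1_000_000

    x = rng.gamma(alpha, 1 / alpha, size=(n_samples, L))
    y = rng.gamma(beta, 1 / beta, size=(n_samples, L))
    combined = (x * y).sum(axis=1)

    p_out = (combined < threshold).mean()
    rel_err = np.sqrt((1 - p_out) / (p_out * n_samples)) if p_out > 0 else np.inf
    print(f"outage probability ~ {p_out:.2e}, relative error ~ {rel_err:.3f}")
    # For the much smaller thresholds of practical interest, p_out shrinks and
    # the relative error blows up, which is the motivation for importance sampling.
    ```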

  9. On the sum of Gamma-Gamma variates with application to the fast outage probability evaluation over fading channels

    KAUST Repository

    Ben Issaid, Chaouki; Rached, Nadhir B.; Kammoun, Abla; Alouini, Mohamed-Slim; Tempone, Raul

    2017-01-01

    The Gamma-Gamma distribution has recently emerged in a number of applications ranging from modeling scattering and reverberation in sonar and radar systems to modeling atmospheric turbulence in wireless optical channels. In this respect, assessing the outage probability achieved by some diversity techniques over this kind of channel is of major practical importance. In many circumstances, this is intimately related to the difficult question of analyzing the statistics of a sum of Gamma-Gamma random variables. Answering this question is not a simple matter. This is essentially because outage probabilities encountered in practice are often very small, and hence the use of classical Monte Carlo methods is not a reasonable choice. This lies behind the main motivation of the present work. In particular, this paper proposes a new approach to estimate the left tail of the sum of independent and identically distributed Gamma-Gamma variates. More specifically, we propose a mean-shift importance sampling scheme that efficiently evaluates the outage probability of L-branch maximum ratio combining diversity receivers over Gamma-Gamma fading channels. The proposed estimator satisfies the well-known bounded relative error criterion. We show the accuracy and the efficiency of our approach compared to naive Monte Carlo via some selected numerical simulations.

  10. Crash probability estimation via quantifying driver hazard perception.

    Science.gov (United States)

    Li, Yang; Zheng, Yang; Wang, Jianqiang; Kodaka, Kenji; Li, Keqiang

    2018-07-01

    Crash probability estimation is an important method to predict the potential reduction in crash probability contributed by forward collision avoidance technologies (FCATs). In this study, we propose a practical approach to estimate crash probability, which combines a field operational test and numerical simulations of a typical rear-end crash model. To consider driver hazard perception characteristics, we define a novel hazard perception measure, called driver risk response time, by considering both time-to-collision (TTC) and driver braking response to impending collision risk in a near-crash scenario. We also establish a driving database under mixed Chinese traffic conditions based on a CMBS (Collision Mitigation Braking System)-equipped vehicle. Applying the crash probability estimation to this database, we estimate the potential decrease in crash probability owing to the use of CMBS. A comparison of the results with CMBS on and off shows a 13.7% reduction in crash probability in a typical rear-end near-crash scenario with a one-second delay in the driver's braking response. These results indicate that CMBS is effective in collision prevention, especially in the case of inattentive or older drivers. The proposed crash probability estimation offers a practical way to evaluate the safety benefits in the design and testing of FCATs. Copyright © 2017 Elsevier Ltd. All rights reserved.
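
    The time-to-collision quantity underlying the hazard perception measure can be sketched for a simple rear-end scenario; the speeds, gap and braking-response delay below are invented for illustration.

    ```python
    # Time-to-collision (TTC) in a constant-speed rear-end approach scenario.
    def time_to_collision(gap_m, v_follower_mps, v_leader_mps):
        """TTC in seconds; infinite if the follower is not closing the gap."""
        closing_speed = v_follower_mps - v_leader_mps
        return gap_m / closing_speed if closing_speed > 0 else float("inf")

    gap = 18.0                          # metres between vehicles (assumed)
    v_follower, v_leader = 16.7, 11.1   # about 60 km/h vs 40 km/h (assumed)
    ttc = time_to_collision(gap, v_follower, v_leader)

    brake_response_s = 1.0              # one-second braking-response delay
    print(f"TTC = {ttc:.1f} s, margin after braking response = "
          f"{ttc - brake_response_s:.1f} s")
    ```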

  11. Escape probabilities for fluorescent x-rays

    International Nuclear Information System (INIS)

    Dance, D.R.; Day, G.J.

    1985-01-01

    Computation of the energy absorption efficiency of an x-ray photon detector involves consideration of the histories of the secondary particles produced in any initial or secondary interaction which may occur within the detector. In particular, the K or higher shell fluorescent x-rays which may be emitted following a photoelectric interaction can carry away a large fraction of the energy of the incident photon, especially if this energy is just above an absorption edge. The effects of such photons cannot be ignored and a correction term, depending upon the probability that the fluorescent x-rays will escape from the detector, must be applied to the energy absorption efficiency. For detectors such as x-ray intensifying screens, it has been usual to calculate this probability by numerical integration. In this note analytic expressions are derived for the escape probability of fluorescent photons from planar detectors in terms of exponential integral functions. Rational approximations for these functions are readily available and these analytic expressions therefore facilitate the computation of photon absorption efficiencies. A table is presented which should obviate the need for calculating the escape probability for most cases of interest. (author)

  12. Concepts of probability theory

    CERN Document Server

    Pfeiffer, Paul E

    1979-01-01

    Using the Kolmogorov model, this intermediate-level text discusses random variables, probability distributions, mathematical expectation, random processes, more. For advanced undergraduates students of science, engineering, or math. Includes problems with answers and six appendixes. 1965 edition.

  13. Divergence from, and Convergence to, Uniformity of Probability Density Quantiles

    Directory of Open Access Journals (Sweden)

    Robert G. Staudte

    2018-04-01

    Full Text Available We demonstrate that questions of convergence and divergence regarding shapes of distributions can be carried out in a location- and scale-free environment. This environment is the class of probability density quantiles (pdQs), obtained by normalizing the composition of the density with the associated quantile function. It has earlier been shown that the pdQ is representative of a location-scale family and carries essential information regarding the shape and tail behavior of the family. The class of pdQs consists of densities of continuous distributions with common domain, the unit interval, facilitating metric and semi-metric comparisons. The Kullback–Leibler divergences from uniformity of these pdQs are mapped to illustrate their relative positions with respect to uniformity. To gain more insight into the information that is conserved under the pdQ mapping, we repeatedly apply the pdQ mapping and find that further applications of it are quite generally entropy increasing, so convergence to the uniform distribution is investigated. New fixed point theorems are established with elementary probabilistic arguments and illustrated by examples.

  14. Genetics Home Reference: essential tremor

    Science.gov (United States)

    Description: Essential tremor is a movement disorder that causes involuntary, rhythmic ...

  15. Chemical composition and seasonal variability of the essential oils of leaves and morphological analysis of Hyptis carpinifolia

    Directory of Open Access Journals (Sweden)

    Stone de Sá

    Full Text Available Hyptis carpinifolia Benth., Lamiaceae, is a species known popularly as "rosmaninho" and "mata-pasto", and its leaves are employed in Brazilian folk medicine to treat colds, flu, and rheumatism. The aim of this study was to perform a morphological description of H. carpinifolia and to evaluate the seasonal chemical variability of the leaf essential oils over 12 months. Macroscopic characterization of H. carpinifolia was carried out with the naked eye and with a stereoscopic microscope. Essential oils were isolated from leaves by hydrodistillation in a Clevenger apparatus and analyzed by gas chromatography/mass spectrometry. The major compounds were found to be 1,8-cineole (39.6-61.8%), trans-cadina-1(6),4-diene (2.8-17.5%), β-caryophyllene (4.4-10.0%), prenopsan-8-ol (4.2-9.6%) and β-pinene (2.9-5.3%). Results of the essential oil compositions were processed by cluster analysis and principal component analysis. The data showed high variability in the concentration of the components. In addition, there was seasonal variability in chemical composition, probably related mainly to the rainfall regime.

  16. Dependency models and probability of joint events

    International Nuclear Information System (INIS)

    Oerjasaeter, O.

    1982-08-01

    Probabilistic dependencies between components/systems are discussed with reference to a broad classification of potential failure mechanisms. Further, a generalized time-dependency model, based on conditional probabilities for estimation of the probability of joint events and event sequences is described. The applicability of this model is clarified/demonstrated by various examples. It is concluded that the described model of dependency is a useful tool for solving a variety of practical problems concerning the probability of joint events and event sequences where common cause and time-dependent failure mechanisms are involved. (Auth.)

  17. Asymptotics of Toeplitz determinants and the emptiness formation probability for the XY spin chain

    International Nuclear Information System (INIS)

    Franchini, Fabio; Abanov, Alexander G

    2005-01-01

    We study the asymptotic behaviour of a special correlator known as the emptiness formation probability (EFP) for the one-dimensional anisotropic XY spin-1/2 chain in a transverse magnetic field. This correlator is essentially the probability of formation of a ferromagnetic string of length n in the antiferromagnetic ground state of the chain and plays an important role in the theory of integrable models. For the XY spin chain, the correlator can be expressed as the determinant of a Toeplitz matrix, and its asymptotic behaviour for n → ∞ throughout the phase diagram is obtained using known theorems and conjectures on Toeplitz determinants. We find that the decay is exponential everywhere in the phase diagram of the XY model except on the critical lines, i.e. where the spectrum is gapless. In these cases, a power-law prefactor with a universal exponent arises in addition to an exponential or Gaussian decay. The latter Gaussian behaviour holds on the critical line corresponding to the isotropic XY model, while at the critical value of the magnetic field the EFP decays exponentially. At small anisotropy one has a crossover from the Gaussian to the exponential behaviour. We study this crossover using the bosonization approach.

  18. Probability shapes perceptual precision: A study in orientation estimation.

    Science.gov (United States)

    Jabar, Syaheed B; Anderson, Britt

    2015-12-01

    Probability is known to affect perceptual estimations, but an understanding of mechanisms is lacking. Moving beyond binary classification tasks, we had naive participants report the orientation of briefly viewed gratings where we systematically manipulated contingent probability. Participants rapidly developed faster and more precise estimations for high-probability tilts. The shapes of their error distributions, as indexed by a kurtosis measure, also showed a distortion from Gaussian. This kurtosis metric was robust, capturing probability effects that were graded, contextual, and varying as a function of stimulus orientation. Our data can be understood as a probability-induced reduction in the variability or "shape" of estimation errors, as would be expected if probability affects the perceptual representations. As probability manipulations are an implicit component of many endogenous cuing paradigms, changes at the perceptual level could account for changes in performance that might have traditionally been ascribed to "attention." (c) 2015 APA, all rights reserved.

  19. Model uncertainty: Probabilities for models?

    International Nuclear Information System (INIS)

    Winkler, R.L.

    1994-01-01

    Like any other type of uncertainty, model uncertainty should be treated in terms of probabilities. The question is how to do this. The most commonly-used approach has a drawback related to the interpretation of the probabilities assigned to the models. If we step back and look at the big picture, asking what the appropriate focus of the model uncertainty question should be in the context of risk and decision analysis, we see that a different probabilistic approach makes more sense, although it raises some implementation questions. Current work that is underway to address these questions looks very promising.

  20. Exact capture probability analysis of GSC receivers over Rayleigh fading channel

    KAUST Repository

    Nam, Sungsik

    2010-01-01

    For third generation systems and ultrawideband systems, RAKE receivers have been introduced due to their ability to combine different replicas of the transmitted signal arriving at different delays in a rich multipath environment. In principle, RAKE receivers combine all resolvable paths, which gives the best performance in a rich diversity environment. However, this is usually costly in terms of the hardware required as the number of RAKE fingers increases. Therefore, generalized selection combining (GSC) RAKE reception was proposed and has been studied by many researchers as an alternative to the two classical fundamental diversity schemes: maximal ratio combining and selection combining. Previous work on performance analyses of GSC RAKE receivers based on the signal-to-noise ratio focused on the development of methodologies to derive exact closed-form expressions for various performance measures. However, the remaining set of uncombined paths affects the overall performance in terms of lost power. Therefore, to gain a full understanding of the performance of GSC RAKE receivers, we introduce in this paper the notion of capture probability, which is defined as the ratio of the captured power (essentially the combined paths' power) to the total available power. The major difficulty in these problems is to derive some joint statistics of ordered exponential variates. With this motivation in mind, we capitalize in this paper on some new order statistics results to derive exact closed-form expressions for the capture probability over independent and identically distributed Rayleigh fading channels. © 2010 IEEE.
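
    The capture probability notion, the fraction of total path power collected by combining only the strongest fingers, can be illustrated by simulation; the numbers of paths and fingers and the 90% capture target below are illustrative choices, not the paper's settings.

    ```python
    # Monte-Carlo illustration of GSC "capture probability" over i.i.d.
    # Rayleigh fading: per-path powers are i.i.d. exponential, and GSC combines
    # only the Lc strongest of L resolvable paths.
    import numpy as np

    rng = np.random.default_rng(0)
    L, Lc = 8, 3                 # resolvable paths and GSC fingers (assumed)
    n_samples = 200_000

    powers = rng.exponential(1.0, size=(n_samples, L))
    top_lc = np.sort(powers, axis=1)[:, -Lc:]        # the Lc strongest paths
    capture_ratio = top_lc.sum(axis=1) / powers.sum(axis=1)

    print(f"mean captured fraction with {Lc}/{L} fingers: {capture_ratio.mean():.3f}")
    print(f"P(captured fraction > 0.9): {(capture_ratio > 0.9).mean():.3f}")
    ```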

  1. Scoring Rules for Subjective Probability Distributions

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd

    The theoretical literature has a rich characterization of scoring rules for eliciting the subjective beliefs that an individual has for continuous events, but under the restrictive assumption of risk neutrality. It is well known that risk aversion can dramatically affect the incentives to correctly report the true subjective probability of a binary event, even under Subjective Expected Utility. To address this one can “calibrate” inferences about true subjective probabilities from elicited subjective probabilities over binary events, recognizing the incentives that risk averse agents have to distort reports. We characterize the comparable implications of the general case of a risk averse agent when facing a popular scoring rule over continuous events, and find that these concerns do not apply with anything like the same force. For empirically plausible levels of risk aversion, one can...

  2. The Probability Distribution for a Biased Spinner

    Science.gov (United States)

    Foster, Colin

    2012-01-01

    This article advocates biased spinners as an engaging context for statistics students. Calculating the probability of a biased spinner landing on a particular side makes valuable connections between probability and other areas of mathematics. (Contains 2 figures and 1 table.)

  3. A discussion on the origin of quantum probabilities

    International Nuclear Information System (INIS)

    Holik, Federico; Sáenz, Manuel; Plastino, Angel

    2014-01-01

    We study the origin of quantum probabilities as arising from non-Boolean propositional-operational structures. We apply the method developed by Cox to non-distributive lattices and develop an alternative formulation of non-Kolmogorovian probability measures for quantum mechanics. By generalizing the method presented in previous works, we outline a general framework for the deduction of probabilities in general propositional structures represented by lattices (including the non-distributive case). -- Highlights: •Several recent works use a derivation similar to that of R.T. Cox to obtain quantum probabilities. •We apply Cox’s method to the lattice of subspaces of the Hilbert space. •We obtain a derivation of quantum probabilities which includes mixed states. •The method presented in this work is susceptible to generalization. •It includes quantum mechanics and classical mechanics as particular cases

  4. The antidepressant-like effect of Mentha spicata essential oil in animal models of depression in male mice

    Directory of Open Access Journals (Sweden)

    Behnam Jedi-Behnia

    2017-06-01

    Full Text Available Background & Objective: Previous research has revealed analgesic and sedative properties of Mentha spicata (MS). The aim of the present study was to evaluate the antidepressant effects of MS essential oil in the forced swim test (FST) and tail suspension test (TST) in male mice. Materials & Methods: In this experimental study, 84 male mice were randomly divided into 14 groups of 6: negative control groups received normal saline (10 ml/kg, i.p.), positive control groups received fluoxetine (20 mg/kg, i.p.) or imipramine (30 mg/kg), and treatment groups received MS essential oil (30, 60, 120 and 240 mg/kg, i.p.). Immobility, swimming and climbing times in the FST, and immobility time in the TST, were recorded over six minutes. Results: Essential oil at doses of 120 and 240 mg/kg, fluoxetine and imipramine reduced immobility time compared to the control group in the FST and TST (p < 0.05). In contrast, imipramine increased climbing time without any significant change in swimming time (p > 0.05). Conclusion: Based on the findings of the present study, MS essential oil has antidepressant-like activity similar to fluoxetine, and its constituents (especially carvone) probably induce this effect through a serotonergic mechanism. However, further studies are needed to determine the precise mechanism of its action.

  5. More efficient integrated safeguards by applying a reasonable detection probability for maintaining low presence probability of undetected nuclear proliferating activities

    International Nuclear Information System (INIS)

    Otsuka, Naoto

    2013-01-01

    Highlights: • A theoretical foundation is presented for more efficient Integrated Safeguards (IS). • The probability of undetected nuclear proliferation activities should be maintained low. • For nations under IS, the probability of starting proliferation activities is very low. • This fact allows the detection probability of IS to be decreased by dozens of percentage points. • The cost of IS per nation can be cut down by reducing inspection frequencies etc. - Abstract: A theoretical foundation is presented for implementing more efficiently the present International Atomic Energy Agency (IAEA) integrated safeguards (ISs) on the basis of fuzzy evaluation of the probability that the evaluated nation will continue peaceful activities. It is shown that by determining the presence probability of undetected nuclear proliferating activities, nations under IS can be maintained at acceptably low proliferation risk levels even if the detection probability of current IS is decreased by dozens of percentage points from the present value. This makes it possible to reduce inspection frequency and the number of collected samples, allowing the IAEA to cut costs per nation. This will contribute to further promotion and application of IS to more nations by the IAEA, and more efficient utilization of IAEA resources from the viewpoint of the whole IS framework.

  6. Single Trial Probability Applications: Can Subjectivity Evade Frequency Limitations?

    Directory of Open Access Journals (Sweden)

    David Howden

    2009-10-01

    Full Text Available Frequency probability theorists define an event’s probability distribution as the limit of a repeated set of trials belonging to a homogeneous collective. The subsets of this collective are events about which we have deficient knowledge on an individual level, although for the larger collective we have knowledge of its aggregate behavior. Hence, probabilities can only be obtained through repeated trials of these subsets, arriving at the established frequencies that define the probabilities. Crovelli (2009) argues that this is a mistaken approach, and that a subjective assessment of individual trials should be used instead. Bifurcating between the two concepts of risk and uncertainty, Crovelli first asserts that probability is the tool used to manage uncertain situations, and then attempts to rebuild a definition of probability theory with this in mind. We show that such an attempt has little to gain, and results in an indeterminate application of entrepreneurial forecasting to uncertain decisions, a process far removed from any application of probability theory.

  7. Predicting Cumulative Incidence Probability by Direct Binomial Regression

    DEFF Research Database (Denmark)

    Scheike, Thomas H.; Zhang, Mei-Jie

    Binomial modelling; cumulative incidence probability; cause-specific hazards; subdistribution hazard.

  8. Subjective Probabilities for State-Dependent Continuous Utility

    NARCIS (Netherlands)

    P.P. Wakker (Peter)

    1987-01-01

    textabstractFor the expected utility model with state dependent utilities, Karni, Schmeidler and Vind (1983) have shown how to recover uniquely the involved subjective probabilities if the preferences, contingent on a hypothetical probability distribution over the state space, are known. This they

  9. Computer science I essentials

    CERN Document Server

    Raus, Randall

    2012-01-01

    REA's Essentials provide quick and easy access to critical information in a variety of different fields, ranging from the most basic to the most advanced. As its name implies, these concise, comprehensive study guides summarize the essentials of the field covered. Essentials are helpful when preparing for exams, doing homework and will remain a lasting reference source for students, teachers, and professionals. Computer Science I includes fundamental computer concepts, number representations, Boolean algebra, switching circuits, and computer architecture.

  10. Surprisingly rational: probability theory plus noise explains biases in judgment.

    Science.gov (United States)

    Costello, Fintan; Watts, Paul

    2014-07-01

    The systematic biases seen in people's probability judgments are typically taken as evidence that people do not use the rules of probability theory when reasoning about probability but instead use heuristics, which sometimes yield reasonable judgments and sometimes yield systematic biases. This view has had a major impact in economics, law, medicine, and other fields; indeed, the idea that people cannot reason with probabilities has become a truism. We present a simple alternative to this view, where people reason about probability according to probability theory but are subject to random variation or noise in the reasoning process. In this account the effect of noise is canceled for some probabilistic expressions. Analyzing data from 2 experiments, we find that, for these expressions, people's probability judgments are strikingly close to those required by probability theory. For other expressions, this account produces systematic deviations in probability estimates. These deviations explain 4 reliable biases in human probabilistic reasoning (conservatism, subadditivity, conjunction, and disjunction fallacies). These results suggest that people's probability judgments embody the rules of probability theory and that biases in those judgments are due to the effects of random noise. (c) 2014 APA, all rights reserved.
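
    One simple way to instantiate the "probability theory plus noise" idea is to assume each memory sample consulted during a judgment is read incorrectly with some small probability d, so the expected judged probability of an event with true probability p becomes (1 − 2d)p + d. The sketch below uses assumed values of d and p and is only meant to show how such noise alone produces conservatism (overestimating small probabilities, underestimating large ones); it is not the authors' full model.

        import numpy as np

        rng = np.random.default_rng(2)

        def judged_probability(p, d=0.1, n_samples=100, n_judgments=10_000):
            """Judge P(A) by counting memory samples, each read incorrectly with probability d."""
            truth = rng.random((n_judgments, n_samples)) < p        # true sample contents
            flipped = rng.random((n_judgments, n_samples)) < d      # read errors
            read = np.where(flipped, ~truth, truth)                 # what the judge actually 'sees'
            return read.mean()                                      # mean judged probability

        for p in (0.1, 0.5, 0.9):
            est = judged_probability(p)
            print(f"true p = {p:.1f}  mean judgment = {est:.3f}  (theory: {(1 - 2 * 0.1) * p + 0.1:.3f})")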

  11. Eruption probabilities for the Lassen Volcanic Center and regional volcanism, northern California, and probabilities for large explosive eruptions in the Cascade Range

    Science.gov (United States)

    Nathenson, Manuel; Clynne, Michael A.; Muffler, L.J. Patrick

    2012-01-01

    Chronologies for eruptive activity of the Lassen Volcanic Center and for eruptions from the regional mafic vents in the surrounding area of the Lassen segment of the Cascade Range are here used to estimate probabilities of future eruptions. For the regional mafic volcanism, the ages of many vents are known only within broad ranges, and two models are developed that should bracket the actual eruptive ages. These chronologies are used with exponential, Weibull, and mixed-exponential probability distributions to match the data for time intervals between eruptions. For the Lassen Volcanic Center, the probability of an eruption in the next year is 1.4x10-4 for the exponential distribution and 2.3x10-4 for the mixed exponential distribution. For the regional mafic vents, the exponential distribution gives a probability of an eruption in the next year of 6.5x10-4, but the mixed exponential distribution indicates that the current probability, 12,000 years after the last event, could be significantly lower. For the exponential distribution, the highest probability is for an eruption from a regional mafic vent. Data on areas and volumes of lava flows and domes of the Lassen Volcanic Center and of eruptions from the regional mafic vents provide constraints on the probable sizes of future eruptions. Probabilities of lava-flow coverage are similar for the Lassen Volcanic Center and for regional mafic vents, whereas the probable eruptive volumes for the mafic vents are generally smaller. Data have been compiled for large explosive eruptions (>≈ 5 km3 in deposit volume) in the Cascade Range during the past 1.2 m.y. in order to estimate probabilities of eruption. For erupted volumes >≈5 km3, the rate of occurrence since 13.6 ka is much higher than for the entire period, and we use these data to calculate the annual probability of a large eruption at 4.6x10-4. For erupted volumes ≥10 km3, the rate of occurrence has been reasonably constant from 630 ka to the present, giving
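
    The exponential-distribution estimates quoted above follow from a Poisson-process assumption: with mean recurrence interval μ, the probability of at least one eruption in the next Δt years is 1 − exp(−Δt/μ), approximately Δt/μ when Δt is much smaller than μ. A minimal sketch with an illustrative recurrence interval (not the paper's fitted parameters):

        import math

        def annual_eruption_probability(mean_recurrence_years, dt_years=1.0):
            """P(at least one event within dt) for an exponential inter-event time model."""
            return 1.0 - math.exp(-dt_years / mean_recurrence_years)

        # Hypothetical mean recurrence interval of 7,000 years (illustration only)
        print(annual_eruption_probability(7_000))   # ~1.4e-4 per year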

  12. Predicting binary choices from probability phrase meanings.

    Science.gov (United States)

    Wallsten, Thomas S; Jang, Yoonhee

    2008-08-01

    The issues of how individuals decide which of two events is more likely and of how they understand probability phrases both involve judging relative likelihoods. In this study, we investigated whether derived scales representing probability phrase meanings could be used within a choice model to predict independently observed binary choices. If they can, this simultaneously provides support for our model and suggests that the phrase meanings are measured meaningfully. The model assumes that, when deciding which of two events is more likely, judges take a single sample from memory regarding each event and respond accordingly. The model predicts choice probabilities by using the scaled meanings of individually selected probability phrases as proxies for confidence distributions associated with sampling from memory. Predictions are sustained for 34 of 41 participants but, nevertheless, are biased slightly low. Sequential sampling models improve the fit. The results have both theoretical and applied implications.

  13. Dietary essentiality of “nutritionally non-essential amino acids” for animals and humans

    OpenAIRE

    Hou, Yongqing; Yin, Yulong; Wu, Guoyao

    2015-01-01

    Based on growth or nitrogen balance, amino acids (AA) had traditionally been classified as nutritionally essential (indispensable) or non-essential (dispensable) for animals and humans. Nutritionally essential AA (EAA) are defined as either those AA whose carbon skeletons cannot be synthesized de novo in animal cells or those that normally are insufficiently synthesized de novo by the animal organism relative to its needs for maintenance, growth, development, and health and which must be prov...

  14. Transition probabilities for atoms

    International Nuclear Information System (INIS)

    Kim, Y.K.

    1980-01-01

    Current status of advanced theoretical methods for transition probabilities for atoms and ions is discussed. An experiment on the f values of the resonance transitions of the Kr and Xe isoelectronic sequences is suggested as a test for the theoretical methods

  15. Treatment of Essential Tremor

    Science.gov (United States)

    This fact sheet is provided to help you understand which therapies help treat essential tremor. Neurologists from the American Academy of Neurology are ...

  16. Essential Medicines in a High Income Country: Essential to Whom?

    Science.gov (United States)

    Duong, Mai; Moles, Rebekah J; Chaar, Betty; Chen, Timothy F

    2015-01-01

    To explore the perspectives of a diverse group of stakeholders engaged in medicines decision making around what constitutes an "essential" medicine, and how the Essential Medicines List (EML) concept functions in a high income country context. In-depth qualitative semi-structured interviews were conducted with 32 Australian stakeholders, recognised as decision makers, leaders or advisors in the area of medicines reimbursement or supply chain management. Participants were recruited from government, pharmaceutical industry, pharmaceutical wholesale/distribution companies, medicines non-profit organisations, academic health disciplines, hospitals, and consumer groups. Perspectives on the definition and application of the EML concept in a high income country context were thematically analysed using grounded theory approach. Stakeholders found it challenging to describe the EML concept in the Australian context because many perceived it was generally used in resource scarce settings. Stakeholders were unable to distinguish whether nationally reimbursed medicines were essential medicines in Australia. Despite frequent generic drug shortages and high prices paid by consumers, many struggled to describe how the EML concept applied to Australia. Instead, broad inclusion of consumer needs, such as rare and high cost medicines, and consumer involvement in the decision making process, has led to expansive lists of nationally subsidised medicines. Therefore, improved communication and coordination is needed around shared interests between stakeholders regarding how medicines are prioritised and guaranteed in the supply chain. This study showed that decision-making in Australia around reimbursement of medicines has strayed from the fundamental utilitarian concept of essential medicines. Many stakeholders involved in medicine reimbursement decisions and management of the supply chain did not consider the EML concept in their approach. The wide range of views of what stakeholders

  17. Essential Medicines in a High Income Country: Essential to Whom?

    Directory of Open Access Journals (Sweden)

    Mai Duong

    Full Text Available To explore the perspectives of a diverse group of stakeholders engaged in medicines decision making around what constitutes an "essential" medicine, and how the Essential Medicines List (EML) concept functions in a high income country context. In-depth qualitative semi-structured interviews were conducted with 32 Australian stakeholders, recognised as decision makers, leaders or advisors in the area of medicines reimbursement or supply chain management. Participants were recruited from government, pharmaceutical industry, pharmaceutical wholesale/distribution companies, medicines non-profit organisations, academic health disciplines, hospitals, and consumer groups. Perspectives on the definition and application of the EML concept in a high income country context were thematically analysed using grounded theory approach. Stakeholders found it challenging to describe the EML concept in the Australian context because many perceived it was generally used in resource scarce settings. Stakeholders were unable to distinguish whether nationally reimbursed medicines were essential medicines in Australia. Despite frequent generic drug shortages and high prices paid by consumers, many struggled to describe how the EML concept applied to Australia. Instead, broad inclusion of consumer needs, such as rare and high cost medicines, and consumer involvement in the decision making process, has led to expansive lists of nationally subsidised medicines. Therefore, improved communication and coordination is needed around shared interests between stakeholders regarding how medicines are prioritised and guaranteed in the supply chain. This study showed that decision-making in Australia around reimbursement of medicines has strayed from the fundamental utilitarian concept of essential medicines. Many stakeholders involved in medicine reimbursement decisions and management of the supply chain did not consider the EML concept in their approach. The wide range of views of

  18. The transition probabilities of the reciprocity model

    NARCIS (Netherlands)

    Snijders, T.A.B.

    1999-01-01

    The reciprocity model is a continuous-time Markov chain model used for modeling longitudinal network data. A new explicit expression is derived for its transition probability matrix. This expression can be checked relatively easily. Some properties of the transition probabilities are given, as well

  19. Hamiltonian theories quantization based on a probability operator

    International Nuclear Information System (INIS)

    Entral'go, E.E.

    1986-01-01

    The quantization method based on a linear mapping of classical coordinate-momentum-time functions Λ(q,p,t) onto quantum operators in a space of quantum states ψ is considered. The probability operator satisfies a system of equations representing the principles of dynamical and canonical correspondence between the classical and quantum theories. The quantization based on a probability operator leads to a quantum theory with a nonnegative joint coordinate-momentum distribution function for any state ψ. The main consequences of quantum mechanics with a probability operator are discussed in comparison with the generally accepted quantum and classical theories. It is shown that a probability operator leads to the appearance of some new notions called ''subquantum'' ones. Hence the quantum theory with a probability operator does not pretend to a complete description of physical reality in terms of classical variables and for this reason contains no problems like the Einstein-Podolsky-Rosen paradox. Results for some concrete problems are given: a free particle, a harmonic oscillator, an electron in the Coulomb field. These results give hope for the possibility of an experimental verification of the quantization based on a probability operator.

  20. Fitness Probability Distribution of Bit-Flip Mutation.

    Science.gov (United States)

    Chicano, Francisco; Sutton, Andrew M; Whitley, L Darrell; Alba, Enrique

    2015-01-01

    Bit-flip mutation is a common mutation operator for evolutionary algorithms applied to optimize functions over binary strings. In this paper, we develop results from the theory of landscapes and Krawtchouk polynomials to exactly compute the probability distribution of fitness values of a binary string undergoing uniform bit-flip mutation. We prove that this probability distribution can be expressed as a polynomial in p, the probability of flipping each bit. We analyze these polynomials and provide closed-form expressions for an easy linear problem (Onemax), and an NP-hard problem, MAX-SAT. We also discuss a connection of the results with runtime analysis.
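
    For the Onemax case mentioned above, the exact fitness distribution under uniform bit-flip mutation can be enumerated directly: if the parent of length n has k ones, the offspring fitness is k minus a Binomial(k, p) count of ones lost plus a Binomial(n − k, p) count of ones gained. The sketch below implements this elementary enumeration only; it does not reproduce the paper's Krawtchouk-polynomial treatment or the MAX-SAT analysis.

        from math import comb

        def onemax_offspring_distribution(n, k, p):
            """P(offspring has j ones | parent has k of n ones, each bit flips independently with prob. p)."""
            dist = [0.0] * (n + 1)
            for lost in range(k + 1):                      # ones flipped to zero
                p_lost = comb(k, lost) * p**lost * (1 - p)**(k - lost)
                for gained in range(n - k + 1):            # zeros flipped to one
                    p_gained = comb(n - k, gained) * p**gained * (1 - p)**(n - k - gained)
                    dist[k - lost + gained] += p_lost * p_gained
            return dist

        dist = onemax_offspring_distribution(n=10, k=7, p=0.05)
        print(sum(dist))                                    # 1.0 (sanity check)
        print(max(range(len(dist)), key=dist.__getitem__))  # most likely offspring fitness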

  1. Comparing linear probability model coefficients across groups

    DEFF Research Database (Denmark)

    Holm, Anders; Ejrnæs, Mette; Karlson, Kristian Bernt

    2015-01-01

    This article offers a formal identification analysis of the problem in comparing coefficients from linear probability models between groups. We show that differences in coefficients from these models can result not only from genuine differences in effects, but also from differences in one or more of the following three components: outcome truncation, scale parameters and distributional shape of the predictor variable. These results point to limitations in using linear probability model coefficients for group comparisons. We also provide Monte Carlo simulations and real examples to illustrate these limitations, and we suggest a restricted approach to using linear probability model coefficients in group comparisons.

  2. Modeling experiments using quantum and Kolmogorov probability

    International Nuclear Information System (INIS)

    Hess, Karl

    2008-01-01

    Criteria are presented that permit a straightforward partition of experiments into sets that can be modeled using both quantum probability and the classical probability framework of Kolmogorov. These new criteria concentrate on the operational aspects of the experiments and lead beyond the commonly appreciated partition by relating experiments to commuting and non-commuting quantum operators as well as non-entangled and entangled wavefunctions. In other words the space of experiments that can be understood using classical probability is larger than usually assumed. This knowledge provides advantages for areas such as nanoscience and engineering or quantum computation.

  3. Probability calculations for three-part mineral resource assessments

    Science.gov (United States)

    Ellefsen, Karl J.

    2017-06-27

    Three-part mineral resource assessment is a methodology for predicting, in a specified geographic region, both the number of undiscovered mineral deposits and the amount of mineral resources in those deposits. These predictions are based on probability calculations that are performed with newly implemented computer software. Compared to the previous implementation, the new implementation includes new features for the probability calculations themselves and for checks of those calculations. The development of the new implementation led to a new understanding of the probability calculations, namely the assumptions inherent in them. Several assumptions strongly affect the mineral resource predictions, so it is crucial that they are checked during an assessment. The evaluation of the new implementation leads to new findings about the probability calculations, namely findings regarding the precision of the computations, the computation time, and the sensitivity of the calculation results to the input.

  4. Independent events in elementary probability theory

    Science.gov (United States)

    Csenki, Attila

    2011-07-01

    In Probability and Statistics taught to mathematicians as a first introduction or to a non-mathematical audience, joint independence of events is introduced by requiring that the multiplication rule is satisfied. The following statement is usually tacitly assumed to hold (and, at best, intuitively motivated): if the n events E1, E2, …, En are jointly independent, then any two events A and B built in finitely many steps from two disjoint subsets of E1, E2, …, En are also independent. The operations 'union', 'intersection' and 'complementation' are permitted only when forming the events A and B. Here we examine this statement from the point of view of elementary probability theory. The approach described here is accessible also to users of probability theory and is believed to be novel.
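
    The quoted statement can also be checked by brute force on a small example: take three jointly independent events with arbitrary probabilities, build A and B from disjoint subsets of them, and verify P(A ∩ B) = P(A)P(B) by enumerating the eight atoms of the underlying space. The events and numbers below are illustrative choices, not taken from the article.

        from itertools import product

        p = {"E1": 0.3, "E2": 0.6, "E3": 0.45}   # marginal probabilities, jointly independent

        def prob(predicate):
            """Sum the probabilities of the atoms (e1, e2, e3) where the predicate holds."""
            total = 0.0
            for e1, e2, e3 in product([True, False], repeat=3):
                weight = ((p["E1"] if e1 else 1 - p["E1"])
                          * (p["E2"] if e2 else 1 - p["E2"])
                          * (p["E3"] if e3 else 1 - p["E3"]))
                if predicate(e1, e2, e3):
                    total += weight
            return total

        A = lambda e1, e2, e3: e1 or e2          # built from the subset {E1, E2}
        B = lambda e1, e2, e3: not e3            # built from the disjoint subset {E3}

        print(prob(lambda *e: A(*e) and B(*e)))  # P(A ∩ B)
        print(prob(A) * prob(B))                 # equals P(A)P(B), confirming independence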

  5. Introduction to probability with statistical applications

    CERN Document Server

    Schay, Géza

    2016-01-01

    Now in its second edition, this textbook serves as an introduction to probability and statistics for non-mathematics majors who do not need the exhaustive detail and mathematical depth provided in more comprehensive treatments of the subject. The presentation covers the mathematical laws of random phenomena, including discrete and continuous random variables, expectation and variance, and common probability distributions such as the binomial, Poisson, and normal distributions. More classical examples such as Montmort's problem, the ballot problem, and Bertrand’s paradox are now included, along with applications such as the Maxwell-Boltzmann and Bose-Einstein distributions in physics. Key features in new edition: * 35 new exercises * Expanded section on the algebra of sets * Expanded chapters on probabilities to include more classical examples * New section on regression * Online instructors' manual containing solutions to all exercises

  6. Quantum interference of probabilities and hidden variable theories

    International Nuclear Information System (INIS)

    Srinivas, M.D.

    1984-01-01

    One of the fundamental contributions of Louis de Broglie, which does not get cited often, has been his analysis of the basic difference between the calculus of the probabilities as predicted by quantum theory and the usual calculus of probabilities - the one employed by most mathematicians, in its standard axiomatised version due to Kolmogorov. This paper is basically devoted to a discussion of the 'quantum interference of probabilities', discovered by de Broglie. In particular, it is shown that it is this feature of the quantum theoretic probabilities which leads to some serious constraints on the possible 'hidden-variable formulations' of quantum mechanics, including the celebrated theorem of Bell. (Auth.)

  7. Transport phenomena II essentials

    CERN Document Server

    REA, The Editors of

    2012-01-01

    REA's Essentials provide quick and easy access to critical information in a variety of different fields, ranging from the most basic to the most advanced. As its name implies, these concise, comprehensive study guides summarize the essentials of the field covered. Essentials are helpful when preparing for exams, doing homework and will remain a lasting reference source for students, teachers, and professionals. Transport Phenomena II covers forced convection, temperature distribution, free convection, diffusivity and the mechanism of mass transfer, convective mass transfer, concentration

  8. Heat transfer II essentials

    CERN Document Server

    REA, The Editors of

    1988-01-01

    REA's Essentials provide quick and easy access to critical information in a variety of different fields, ranging from the most basic to the most advanced. As its name implies, these concise, comprehensive study guides summarize the essentials of the field covered. Essentials are helpful when preparing for exams, doing homework and will remain a lasting reference source for students, teachers, and professionals. Heat Transfer II reviews correlations for forced convection, free convection, heat exchangers, radiation heat transfer, and boiling and condensation.

  9. Differential equations I essentials

    CERN Document Server

    REA, Editors of

    2012-01-01

    REA's Essentials provide quick and easy access to critical information in a variety of different fields, ranging from the most basic to the most advanced. As its name implies, these concise, comprehensive study guides summarize the essentials of the field covered. Essentials are helpful when preparing for exams, doing homework and will remain a lasting reference source for students, teachers, and professionals. Differential Equations I covers first- and second-order equations, series solutions, higher-order linear equations, and the Laplace transform.

  10. Numerical analysis II essentials

    CERN Document Server

    REA, The Editors of; Staff of Research Education Association

    1989-01-01

    REA's Essentials provide quick and easy access to critical information in a variety of different fields, ranging from the most basic to the most advanced. As its name implies, these concise, comprehensive study guides summarize the essentials of the field covered. Essentials are helpful when preparing for exams, doing homework and will remain a lasting reference source for students, teachers, and professionals. Numerical Analysis II covers simultaneous linear systems and matrix methods, differential equations, Fourier transformations, partial differential equations, and Monte Carlo methods.

  11. Notes on the Lumped Backward Master Equation for the Neutron Extinction/Survival Probability

    Energy Technology Data Exchange (ETDEWEB)

    Prinja, Anil K [Los Alamos National Laboratory

    2012-07-02

    chains (a fission chain is defined as the initial source neutron and all its subsequent progeny) in which some chains are short lived while others propagate for unusually long times. Under these conditions, fission chains do not overlap strongly and this precludes the cancellation of neutron number fluctuations necessary for the mean to become established as the dominant measure of the neutron population. The fate of individual chains then plays a defining role in the evolution of the neutron population in strongly stochastic systems, and of particular interest and importance in supercritical systems is the extinction probability, defined as the probability that the neutron chain (initiating neutron and its progeny) will be extinguished at a particular time, or its complement, the time-dependent survival probability. The time-asymptotic limit of the latter, the probability of divergence, gives the probability that the neutron population will grow without bound, and is more commonly known as the probability of initiation or just POI. The ability to numerically compute these probabilities, with high accuracy and without overly restricting the underlying physics (e.g., fission neutron multiplicity, reactivity variation) is clearly essential in developing an understanding of the behavior of strongly stochastic systems.
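
    In the simplest time-independent picture, the extinction probability discussed above is the smallest non-negative fixed point of the fission-multiplicity generating function: for a chain started by one neutron, q = f(q), where f is the PGF of the number of next-generation neutrons per neutron, and POI = 1 − q. The sketch below uses an invented multiplicity distribution and ignores the time dependence treated by the lumped backward master equation.

        # Illustrative per-neutron multiplicity distribution P(nu = k), k = 0..3 (assumed numbers)
        multiplicity = [0.35, 0.25, 0.25, 0.15]          # mean = 1.2 > 1, so POI > 0

        def pgf(q, pk=multiplicity):
            """Probability generating function f(q) = sum_k P(nu = k) q^k."""
            return sum(p * q**k for k, p in enumerate(pk))

        def extinction_probability(tol=1e-12, max_iter=10_000):
            """Smallest non-negative fixed point of q = f(q), found by iteration from 0."""
            q = 0.0
            for _ in range(max_iter):
                q_next = pgf(q)
                if abs(q_next - q) < tol:
                    break
                q = q_next
            return q

        q = extinction_probability()
        print("extinction probability:", q)
        print("probability of initiation (POI):", 1.0 - q)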

  12. A Challenge to Ludwig von Mises’s Theory of Probability

    Directory of Open Access Journals (Sweden)

    Mark R. Crovelli

    2010-10-01

    Full Text Available The most interesting and completely overlooked aspect of Ludwig von Mises’s theory of probability is the total absence of any explicit definition for probability in his theory. This paper examines Mises’s theory of probability in light of the fact that his theory possesses no definition for probability. It is argued, first, that Mises’s theory differs in important respects from his brother’s famous theory of probability. A defense of the subjective definition for probability is then provided, which is subsequently used to critique Ludwig von Mises’s theory. It is argued that only the subjective definition for probability comports with Mises’s other philosophical positions. Since Mises did not provide an explicit definition for probability, it is suggested that he ought to have adopted a subjective definition.

  13. Statistical probability tables CALENDF program

    International Nuclear Information System (INIS)

    Ribon, P.

    1989-01-01

    The purpose of the probability tables is to obtain a dense data representation and to calculate integrals by quadratures. They are mainly used in the USA for calculations by Monte Carlo and in the USSR and Europe for self-shielding calculations by the sub-group method. The moment probability tables, in addition to providing a more substantial mathematical basis and calculation methods, are adapted for condensation and mixture calculations, which are the crucial operations for reactor physics specialists. However, their extension is limited by the statistical hypothesis they imply. Efforts are being made to remove this obstacle, at the cost, it must be said, of greater complexity.

  14. Knowledge typology for imprecise probabilities.

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, G. D. (Gregory D.); Zucker, L. J. (Lauren J.)

    2002-01-01

    When characterizing the reliability of a complex system there are often gaps in the data available for specific subsystems or other factors influencing total system reliability. At Los Alamos National Laboratory we employ ethnographic methods to elicit expert knowledge when traditional data is scarce. Typically, we elicit expert knowledge in probabilistic terms. This paper will explore how we might approach elicitation if methods other than probability (i.e., Dempster-Shafer, or fuzzy sets) prove more useful for quantifying certain types of expert knowledge. Specifically, we will consider if experts have different types of knowledge that may be better characterized in ways other than standard probability theory.

  15. A note on iterated function systems with discontinuous probabilities

    International Nuclear Information System (INIS)

    Jaroszewska, Joanna

    2013-01-01

    Highlights: ► A certain iterated function system with discontinuous probabilities is discussed. ► Existence of an invariant measure via the Schauder–Tychonov theorem is established. ► Asymptotic stability of the system under examination is proved. -- Abstract: We consider an example of an iterated function system with discontinuous probabilities. We prove that it possesses an invariant probability measure. We also prove that it is asymptotically stable provided the probabilities are positive.

  16. Rhetoric and Essentially Contested Arguments

    Science.gov (United States)

    Garver, Eugene

    1978-01-01

    Draws a connection between Gallie's essentially contested concepts and Aristotle's account of rhetorical argument by presenting a definition of Essentially Contested Argument which is used as the connecting term between rhetoric and essentially contested concepts and by demonstrating the value of making this connection. (JF)

  17. Bayesian optimization for computationally extensive probability distributions.

    Science.gov (United States)

    Tamura, Ryo; Hukushima, Koji

    2018-01-01

    An efficient method for finding a better maximizer of computationally extensive probability distributions is proposed on the basis of a Bayesian optimization technique. A key idea of the proposed method is to use extreme values of acquisition functions by Gaussian processes for the next training phase, which should be located near a local maximum or a global maximum of the probability distribution. Our Bayesian optimization technique is applied to the posterior distribution in the effective physical model estimation, which is a computationally extensive probability distribution. Even when the number of sampling points on the posterior distributions is fixed to be small, the Bayesian optimization provides a better maximizer of the posterior distributions in comparison to those by the random search method, the steepest descent method, or the Monte Carlo method. Furthermore, the Bayesian optimization improves the results efficiently by combining the steepest descent method and thus it is a powerful tool to search for a better maximizer of computationally extensive probability distributions.

  18. Characteristic length of the knotting probability revisited

    International Nuclear Information System (INIS)

    Uehara, Erica; Deguchi, Tetsuo

    2015-01-01

    We present a self-avoiding polygon (SAP) model for circular DNA in which the radius of impermeable cylindrical segments corresponds to the screening length of double-stranded DNA surrounded by counter ions. For the model we evaluate the probability for a generated SAP with N segments having a given knot K through simulation. We call it the knotting probability of a knot K with N segments for the SAP model. We show that when N is large the most significant factor in the knotting probability is given by the exponentially decaying part exp(−N/N_K), where the estimates of the parameter N_K are consistent with the same value for all the different knots we investigated. We thus call it the characteristic length of the knotting probability. We give formulae expressing the characteristic length as a function of the cylindrical radius r_ex, i.e. the screening length of double-stranded DNA. (paper)

  19. The considering of the slowing down effect in the formalism of probability tables. Application to the effective cross section calculation

    International Nuclear Information System (INIS)

    Bouhelal, O.K.A.

    1990-01-01

    The exact determination of the effective multigroup cross sections imposes the numerical solution of the slowing-down equation on a very fine energy mesh. Given the complexity of these calculations, different approximation methods have been developed, but without a satisfactory treatment of the slowing-down effect. The usual methods are essentially based on interpolations using precalculated tables. Models that use probability tables allow the amount of data and the computational effort to be reduced. The various methods proposed first by Soviet and then by American researchers, and finally the French method based on the ''moments of a probability distribution'', are incontestably valid within the framework of the statistical hypothesis. This hypothesis stipulates that the collision densities do not depend on the cross sections, so that there is no ambiguity in the effective cross section calculation. The objective of our work is to show that non-statistical phenomena, such as the slowing-down effect, can be taken into account and described by probability tables able to represent both the neutronic quantities and the collision densities. The formalism involved in the statistical hypothesis is based on the Gauss quadrature of the cross-section moments. Outside the statistical hypothesis we introduce crossed probability tables, using quadratures of double integrals of the cross-section moments. Moreover, a mathematical formalism allowing a relationship to be established between the crossed probability tables and the collision densities was developed. This method was applied to uranium-238 in the range of resolved resonances, where the slowing-down effect is significant. The validity of the method and the analysis of the obtained results are studied through a reference calculation based on the solution of a discretized slowing-down equation on a very fine mesh, in which each microgroup can be correctly described by statistical probability tables. 42 figs., 32 tabs., 49 refs. (author)

  20. A method to combine non-probability sample data with probability sample data in estimating spatial means of environmental variables

    NARCIS (Netherlands)

    Brus, D.J.; Gruijter, de J.J.

    2003-01-01

    In estimating spatial means of environmental variables of a region from data collected by convenience or purposive sampling, validity of the results can be ensured by collecting additional data through probability sampling. The precision of the pi estimator that uses the probability sample can be

  1. Chasing down the triple-negative myeloproliferative neoplasms: Implications for molecular diagnostics.

    Science.gov (United States)

    Langabeer, Stephen E

    2016-01-01

    The majority of patients with classical myeloproliferative neoplasms (MPN) of polycythemia vera, essential thrombocythemia, and primary myelofibrosis harbor distinct disease-driving mutations within the JAK2, CALR, or MPL genes. The term triple-negative has recently been applied to those MPN without evidence of these consistent mutations, prompting whole or targeted exome sequencing approaches to determine the driver mutational status of this subgroup. These strategies have identified numerous novel mutations that occur in alternative exons of both JAK2 and MPL, the majority of which result in functional activation. Current molecular diagnostic approaches may possess insufficient coverage to detect these alternative mutations, prompting consideration of incorporating targeted exon sequencing into routine diagnostic practice. How to incorporate these illuminating findings into the expanding molecular diagnostic algorithm for MPN requires continual attention.

  2. Group theory I essentials

    CERN Document Server

    Milewski, Emil G

    2012-01-01

    REA's Essentials provide quick and easy access to critical information in a variety of different fields, ranging from the most basic to the most advanced. As its name implies, these concise, comprehensive study guides summarize the essentials of the field covered. Essentials are helpful when preparing for exams, doing homework and will remain a lasting reference source for students, teachers, and professionals. Group Theory I includes sets and mapping, groupoids and semi-groups, groups, isomorphisms and homomorphisms, cyclic groups, the Sylow theorems, and finite p-groups.

  3. C programming language essentials

    CERN Document Server

    Ackermann, Ernest C

    2012-01-01

    REA's Essentials provide quick and easy access to critical information in a variety of different fields, ranging from the most basic to the most advanced. As its name implies, these concise, comprehensive study guides summarize the essentials of the field covered. Essentials are helpful when preparing for exams, doing homework and will remain a lasting reference source for students, teachers, and professionals. C Programming Language discusses fundamental notions, data types and objects, expressions, statements, declarations, function and program structure, the preprocessor, and the standar

  4. Algebra & trigonometry I essentials

    CERN Document Server

    REA, Editors of

    2012-01-01

    REA's Essentials provide quick and easy access to critical information in a variety of different fields, ranging from the most basic to the most advanced. As its name implies, these concise, comprehensive study guides summarize the essentials of the field covered. Essentials are helpful when preparing for exams, doing homework and will remain a lasting reference source for students, teachers, and professionals. Algebra & Trigonometry I includes sets and set operations, number systems and fundamental algebraic laws and operations, exponents and radicals, polynomials and rational expressions, eq

  5. Transport phenomena I essentials

    CERN Document Server

    REA, The Editors of

    2012-01-01

    REA's Essentials provide quick and easy access to critical information in a variety of different fields, ranging from the most basic to the most advanced. As its name implies, these concise, comprehensive study guides summarize the essentials of the field covered. Essentials are helpful when preparing for exams, doing homework and will remain a lasting reference source for students, teachers, and professionals. Transport Phenomena I includes viscosity, flow of Newtonian fluids, velocity distribution in laminar flow, velocity distributions with more than one independent variable, thermal con

  6. Data structures II essentials

    CERN Document Server

    Smolarski, Dennis C

    2012-01-01

    REA's Essentials provide quick and easy access to critical information in a variety of different fields, ranging from the most basic to the most advanced. As its name implies, these concise, comprehensive study guides summarize the essentials of the field covered. Essentials are helpful when preparing for exams, doing homework and will remain a lasting reference source for students, teachers, and professionals. Data Structures II includes sets, trees, advanced sorting, elementary graph theory, hashing, memory management and garbage collection, and appendices on recursion vs. iteration, alge

  7. Data structures I essentials

    CERN Document Server

    Smolarski, Dennis C

    2012-01-01

    REA's Essentials provide quick and easy access to critical information in a variety of different fields, ranging from the most basic to the most advanced. As its name implies, these concise, comprehensive study guides summarize the essentials of the field covered. Essentials are helpful when preparing for exams, doing homework and will remain a lasting reference source for students, teachers, and professionals. Data Structures I includes scalar variables, arrays and records, elementary sorting, searching, linked lists, queues, and appendices of binary notation and subprogram parameter passi

  8. Probability mapping of contaminants

    International Nuclear Information System (INIS)

    Rautman, C.A.; Kaplan, P.G.; McGraw, M.A.; Istok, J.D.; Sigda, J.M.

    1994-01-01

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. The probability mapping approach illustrated in this paper appears to offer site operators a reasonable, quantitative methodology for many environmental remediation decisions and allows evaluation of the risk associated with those decisions. For example, output from this approach can be used in quantitative, cost-based decision models for evaluating possible site characterization and/or remediation plans, resulting in selection of the risk-adjusted, least-cost alternative. The methodology is completely general, and the techniques are applicable to a wide variety of environmental restoration projects. The probability-mapping approach is illustrated by application to a contaminated site at the former DOE Feed Materials Production Center near Fernald, Ohio. Soil geochemical data, collected as part of the Uranium-in-Soils Integrated Demonstration Project, have been used to construct a number of geostatistical simulations of potential contamination for parcels approximately the size of a selective remediation unit (the 3-m width of a bulldozer blade). Each such simulation accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination (potential clean-up or personnel-hazard thresholds)

  9. Probability mapping of contaminants

    Energy Technology Data Exchange (ETDEWEB)

    Rautman, C.A.; Kaplan, P.G. [Sandia National Labs., Albuquerque, NM (United States); McGraw, M.A. [Univ. of California, Berkeley, CA (United States); Istok, J.D. [Oregon State Univ., Corvallis, OR (United States); Sigda, J.M. [New Mexico Inst. of Mining and Technology, Socorro, NM (United States)

    1994-04-01

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. The probability mapping approach illustrated in this paper appears to offer site operators a reasonable, quantitative methodology for many environmental remediation decisions and allows evaluation of the risk associated with those decisions. For example, output from this approach can be used in quantitative, cost-based decision models for evaluating possible site characterization and/or remediation plans, resulting in selection of the risk-adjusted, least-cost alternative. The methodology is completely general, and the techniques are applicable to a wide variety of environmental restoration projects. The probability-mapping approach is illustrated by application to a contaminated site at the former DOE Feed Materials Production Center near Fernald, Ohio. Soil geochemical data, collected as part of the Uranium-in-Soils Integrated Demonstration Project, have been used to construct a number of geostatistical simulations of potential contamination for parcels approximately the size of a selective remediation unit (the 3-m width of a bulldozer blade). Each such simulation accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination (potential clean-up or personnel-hazard thresholds).

  10. Factors influencing reporting and harvest probabilities in North American geese

    Science.gov (United States)

    Zimmerman, G.S.; Moser, T.J.; Kendall, W.L.; Doherty, P.F.; White, Gary C.; Caswell, D.F.

    2009-01-01

    We assessed variation in reporting probabilities of standard bands among species, populations, harvest locations, and size classes of North American geese to enable estimation of unbiased harvest probabilities. We included reward (US$10, $20, $30, $50, or $100) and control ($0) banded geese from 16 recognized goose populations of 4 species: Canada (Branta canadensis), cackling (B. hutchinsii), Ross's (Chen rossii), and snow geese (C. caerulescens). We incorporated spatially explicit direct recoveries and live recaptures into a multinomial model to estimate reporting, harvest, and band-retention probabilities. We compared various models for estimating harvest probabilities at country (United States vs. Canada), flyway (5 administrative regions), and harvest area (i.e., flyways divided into northern and southern sections) scales. Mean reporting probability of standard bands was 0.73 (95% CI 0.69–0.77). Point estimates of reporting probabilities for goose populations or spatial units varied from 0.52 to 0.93, but confidence intervals for individual estimates overlapped and model selection indicated that models with species, population, or spatial effects were less parsimonious than those without these effects. Our estimates were similar to recently reported estimates for mallards (Anas platyrhynchos). We provide current harvest probability estimates for these populations using our direct measures of reporting probability, improving the accuracy of previous estimates obtained from recovery probabilities alone. Goose managers and researchers throughout North America can use our reporting probabilities to correct recovery probabilities estimated from standard banding operations for deriving spatially explicit harvest probabilities.
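
    In its simplest form (ignoring band loss), the correction described above uses the standard band-recovery identity that the direct recovery probability is the product of the harvest probability and the reporting probability, so harvest is estimated by dividing recovery by reporting. A toy illustration (the 0.73 value is the mean reporting probability reported above; the recovery probability is hypothetical):

        def harvest_probability(recovery_prob, reporting_prob):
            """Recovery = harvest x reporting, so harvest = recovery / reporting."""
            return recovery_prob / reporting_prob

        print(harvest_probability(recovery_prob=0.05, reporting_prob=0.73))  # ~0.068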

  11. Probability functions in the context of signed involutive meadows

    NARCIS (Netherlands)

    Bergstra, J.A.; Ponse, A.

    2016-01-01

    The Kolmogorov axioms for probability functions are placed in the context of signed meadows. A completeness theorem is stated and proven for the resulting equational theory of probability calculus. Elementary definitions of probability theory are restated in this framework.

  12. VOLCANIC RISK ASSESSMENT - PROBABILITY AND CONSEQUENCES

    International Nuclear Information System (INIS)

    G.A. Valentine; F.V. Perry; S. Dartevelle

    2005-01-01

    Risk is the product of the probability and consequences of an event. Both of these must be based upon sound science that integrates field data, experiments, and modeling, but must also be useful to decision makers who likely do not understand all aspects of the underlying science. We review a decision framework used in many fields such as performance assessment for hazardous and/or radioactive waste disposal sites that can serve to guide the volcanological community towards integrated risk assessment. In this framework the underlying scientific understanding of processes that affect probability and consequences drive the decision-level results, but in turn these results can drive focused research in areas that cause the greatest level of uncertainty at the decision level. We review two examples of the determination of volcanic event probability: (1) probability of a new volcano forming at the proposed Yucca Mountain radioactive waste repository, and (2) probability that a subsurface repository in Japan would be affected by the nearby formation of a new stratovolcano. We also provide examples of work on consequences of explosive eruptions, within the framework mentioned above. These include field-based studies aimed at providing data for ''closure'' of wall rock erosion terms in a conduit flow model, predictions of dynamic pressure and other variables related to damage by pyroclastic flow into underground structures, and vulnerability criteria for structures subjected to conditions of explosive eruption. Process models (e.g., multiphase flow) are important for testing the validity or relative importance of possible scenarios in a volcanic risk assessment. We show how time-dependent multiphase modeling of explosive ''eruption'' of basaltic magma into an open tunnel (drift) at the Yucca Mountain repository provides insight into proposed scenarios that include the development of secondary pathways to the Earth's surface. Addressing volcanic risk within a decision

  13. Imprecise Probability Methods for Weapons UQ

    Energy Technology Data Exchange (ETDEWEB)

    Picard, Richard Roy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Vander Wiel, Scott Alan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-13

    Building on recent work in uncertainty quantification, we examine the use of imprecise probability methods to better characterize expert knowledge and to improve on misleading aspects of Bayesian analysis with informative prior distributions. Quantitative approaches to incorporate uncertainties in weapons certification are subject to rigorous external peer review, and in this regard, certain imprecise probability methods are well established in the literature and attractive. These methods are illustrated using experimental data from LANL detonator impact testing.

  14. Probability and statistics for computer science

    CERN Document Server

    Johnson, James L

    2011-01-01

    Comprehensive and thorough development of both probability and statistics for serious computer scientists; goal-oriented: "to present the mathematical analysis underlying probability results"; special emphasis on simulation and discrete decision theory; mathematically rich but self-contained text, at a gentle pace; review of calculus and linear algebra in an appendix; mathematical interludes (in each chapter) which examine mathematical techniques in the context of probabilistic or statistical importance; numerous section exercises, summaries, historical notes, and Further Readings for reinforcem

  15. Uncertainty relation and probability. Numerical illustration

    International Nuclear Information System (INIS)

    Fujikawa, Kazuo; Umetsu, Koichiro

    2011-01-01

    The uncertainty relation and the probability interpretation of quantum mechanics are intrinsically connected, as is evidenced by the evaluation of standard deviations. It is thus natural to ask if one can associate a very small uncertainty product of suitably sampled events with a very small probability. We have shown elsewhere that some examples of the evasion of the uncertainty relation noted in the past are in fact understood in this way. We here numerically illustrate that a very small uncertainty product is realized if one performs a suitable sampling of measured data that occur with a very small probability. We introduce a notion of cyclic measurements. It is also shown that our analysis is consistent with the Landau-Pollak-type uncertainty relation. It is suggested that the present analysis may help reconcile the contradicting views about the 'standard quantum limit' in the detection of gravitational waves. (author)

  16. A basic course in probability theory

    CERN Document Server

    Bhattacharya, Rabi

    2016-01-01

    This text develops the necessary background in probability theory underlying diverse treatments of stochastic processes and their wide-ranging applications. In this second edition, the text has been reorganized for didactic purposes, new exercises have been added and basic theory has been expanded. General Markov dependent sequences and their convergence to equilibrium is the subject of an entirely new chapter. The introduction of conditional expectation and conditional probability very early in the text maintains the pedagogic innovation of the first edition; conditional expectation is illustrated in detail in the context of an expanded treatment of martingales, the Markov property, and the strong Markov property. Weak convergence of probabilities on metric spaces and Brownian motion are two topics to highlight. A selection of large deviation and/or concentration inequalities ranging from those of Chebyshev, Cramer–Chernoff, Bahadur–Rao, to Hoeffding have been added, with illustrative comparisons of thei...

  17. Geometric modeling in probability and statistics

    CERN Document Server

    Calin, Ovidiu

    2014-01-01

    This book covers topics of Informational Geometry, a field which deals with the differential geometric study of the manifold probability density functions. This is a field that is increasingly attracting the interest of researchers from many different areas of science, including mathematics, statistics, geometry, computer science, signal processing, physics and neuroscience. It is the authors’ hope that the present book will be a valuable reference for researchers and graduate students in one of the aforementioned fields. This textbook is a unified presentation of differential geometry and probability theory, and constitutes a text for a course directed at graduate or advanced undergraduate students interested in applications of differential geometry in probability and statistics. The book contains over 100 proposed exercises meant to help students deepen their understanding, and it is accompanied by software that is able to provide numerical computations of several information geometric objects. The reader...

  18. Establishment probability in newly founded populations

    Directory of Open Access Journals (Sweden)

    Gusset Markus

    2012-06-01

    Full Text Available Abstract Background Establishment success in newly founded populations relies on reaching the established phase, which is defined by characteristic fluctuations of the population’s state variables. Stochastic population models can be used to quantify the establishment probability of newly founded populations; however, so far no simple but robust method for doing so existed. To determine a critical initial number of individuals that need to be released to reach the established phase, we used a novel application of the “Wissel plot”, where –ln(1 – P0(t)) is plotted against time t. This plot is based on the equation P0(t) = 1 – c1·e^(–ω1·t), which relates the probability of extinction by time t, P0(t), to two constants: c1 describes the probability of a newly founded population to reach the established phase, whereas ω1 describes the population’s probability of extinction per short time interval once established. Results For illustration, we applied the method to a previously developed stochastic population model of the endangered African wild dog (Lycaon pictus). A newly founded population reaches the established phase if the intercept of the (extrapolated) linear parts of the “Wissel plot” with the y-axis, which is –ln(c1), is negative. For wild dogs in our model, this is the case if a critical initial number of four packs, consisting of eight individuals each, are released. Conclusions The method we present to quantify the establishment probability of newly founded populations is generic and inferences thus are transferable to other systems across the field of conservation biology. In contrast to other methods, our approach disaggregates the components of a population’s viability by distinguishing establishment from persistence.
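
    The quantities named in this abstract translate directly into a short computation. The sketch below is a hedged illustration rather than the authors' wild-dog model: it simulates extinction times from an invented stochastic population model, builds the "Wissel plot" of –ln(1 – P0(t)) against t, and reads ω1 from the slope and c1 from the intercept of the extrapolated linear part, applying the sign-of-intercept criterion quoted above. All demographic parameters are illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(1)

        def simulate_extinction_time(n0, t_max=200):
            """Crude demographic-stochasticity model (illustrative only):
            each individual reproduces or dies with fixed probabilities."""
            n = n0
            for t in range(1, t_max + 1):
                births = rng.binomial(n, 0.45)
                deaths = rng.binomial(n, 0.40)
                n = min(n + births - deaths, 100)   # ceiling = carrying capacity
                if n == 0:
                    return t
            return np.inf                            # survived the whole horizon

        # Empirical extinction probability P0(t) from many replicate releases.
        n_reps, t_max, n0 = 2000, 200, 8             # e.g. 8 released individuals
        ext_times = np.array([simulate_extinction_time(n0, t_max)
                              for _ in range(n_reps)])
        t = np.arange(1, t_max + 1)
        P0 = np.array([(ext_times <= ti).mean() for ti in t])

        # "Wissel plot": -ln(1 - P0(t)) vs t; fit the late, linear part and
        # extrapolate to t = 0. Intercept = -ln(c1), so c1 = exp(-intercept).
        y = -np.log(1.0 - np.clip(P0, 0.0, 0.999))
        late = t > t_max // 2
        slope, intercept = np.polyfit(t[late], y[late], 1)
        c1 = np.exp(-intercept)
        print(f"omega1 ~ {slope:.4f}, c1 ~ {c1:.3f}, established: {intercept < 0}")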

  19. Introducing Disjoint and Independent Events in Probability.

    Science.gov (United States)

    Kelly, I. W.; Zwiers, F. W.

    Two central concepts in probability theory are those of independence and mutually exclusive events. This document is intended to provide suggestions to teachers that can be used to equip students with an intuitive, comprehensive understanding of these basic concepts in probability. The first section of the paper delineates mutually exclusive and…

  20. Against All Odds: When Logic Meets Probability

    NARCIS (Netherlands)

    van Benthem, J.; Katoen, J.-P.; Langerak, R.; Rensink, A.

    2017-01-01

    This paper is a light walk along interfaces between logic and probability, triggered by a chance encounter with Ed Brinksma. It is not a research paper, or a literature survey, but a pointer to issues. I discuss both direct combinations of logic and probability and structured ways in which logic can

  1. Computer science II essentials

    CERN Document Server

    Raus, Randall

    2012-01-01

    REA's Essentials provide quick and easy access to critical information in a variety of different fields, ranging from the most basic to the most advanced. As its name implies, these concise, comprehensive study guides summarize the essentials of the field covered. Essentials are helpful when preparing for exams, doing homework and will remain a lasting reference source for students, teachers, and professionals. Computer Science II includes organization of a computer, memory and input/output, coding, data structures, and program development. Also included is an overview of the most commonly

  2. Prospect evaluation as a function of numeracy and probability denominator.

    Science.gov (United States)

    Millroth, Philip; Juslin, Peter

    2015-05-01

    This study examines how numeracy and probability denominator (a direct-ratio probability, a relative frequency with denominator 100, a relative frequency with denominator 10,000) affect the evaluation of prospects in an expected-value based pricing task. We expected that numeracy would affect the results due to differences in the linearity of number perception and the susceptibility to denominator neglect with different probability formats. An analysis with functional measurement verified that participants integrated value and probability into an expected value. However, a significant interaction between numeracy and probability format and subsequent analyses of the parameters of cumulative prospect theory showed that the manipulation of probability denominator changed participants' psychophysical response to probability and value. Standard methods in decision research may thus confound people's genuine risk attitude with their numerical capacities and the probability format used. Copyright © 2015 Elsevier B.V. All rights reserved.
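
    As a hedged aside on the three probability formats mentioned above, the snippet below simply confirms that they encode the same expected value, which is why any systematic difference in pricing across formats is attributed to psychophysical effects such as denominator neglect rather than to the arithmetic itself; the prospect value is invented for illustration.

        # One illustrative prospect: win 500 (currency units) with probability 0.12.
        value = 500.0
        formats = {
            "direct ratio (0.12)":          0.12,
            "frequency 12 out of 100":      12 / 100,
            "frequency 1200 out of 10000":  1200 / 10000,
        }

        for label, p in formats.items():
            # Normatively the format is irrelevant: expected value = p * value.
            print(f"{label:30s} -> expected value = {p * value:.2f}")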

  3. On probability-possibility transformations

    Science.gov (United States)

    Klir, George J.; Parviz, Behzad

    1992-01-01

    Several probability-possibility transformations are compared in terms of the closeness of preserving second-order properties. The comparison is based on experimental results obtained by computer simulation. Two second-order properties are involved in this study: noninteraction of two distributions and projections of a joint distribution.
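
    The record does not list which transformations Klir and Parviz compared, so the sketch below is only a hedged example of the general idea: it implements one widely cited probability-to-possibility transformation (the Dubois-Prade transformation) alongside a simple ratio-scaling transformation and applies both to the same discrete distribution; neither is claimed to be among the transformations studied in the paper.

        import numpy as np

        def possibility_dubois_prade(p):
            """pi_i = sum of all p_j with p_j <= p_i (one common transformation;
            whether it appears in the paper is an assumption)."""
            p = np.asarray(p, dtype=float)
            return np.array([p[p <= pi].sum() for pi in p])

        def possibility_ratio(p):
            """Simple ratio scaling: pi_i = p_i / max(p)."""
            p = np.asarray(p, dtype=float)
            return p / p.max()

        p = np.array([0.5, 0.3, 0.1, 0.1])          # illustrative distribution
        print("probabilities        :", p)
        print("Dubois-Prade possib. :", possibility_dubois_prade(p))
        print("ratio-scale possib.  :", possibility_ratio(p))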

  4. An introduction to probability and stochastic processes

    CERN Document Server

    Melsa, James L

    2013-01-01

    Geared toward college seniors and first-year graduate students, this text is designed for a one-semester course in probability and stochastic processes. Topics covered in detail include probability theory, random variables and their functions, stochastic processes, linear system response to stochastic processes, Gaussian and Markov processes, and stochastic differential equations. 1973 edition.

  5. Essentialism goes social: belief in social determinism as a component of psychological essentialism.

    Science.gov (United States)

    Rangel, Ulrike; Keller, Johannes

    2011-06-01

    Individuals tend to explain the characteristics of others with reference to an underlying essence, a tendency that has been termed psychological essentialism. Drawing on current conceptualizations of essentialism as a fundamental mode of social thinking, and on prior studies investigating belief in genetic determinism (BGD) as a component of essentialism, we argue that BGD cannot constitute the sole basis of individuals' essentialist reasoning. Accordingly, we propose belief in social determinism (BSD) as a complementary component of essentialism, which relies on the belief that a person's essential character is shaped by social factors (e.g., upbringing, social background). We developed a scale to measure this social component of essentialism. Results of five correlational studies indicate that (a) BGD and BSD are largely independent, (b) BGD and BSD are related to important correlates of essentialist thinking (e.g., dispositionism, perceived group homogeneity), (c) BGD and BSD are associated with indicators of fundamental epistemic and ideological motives, and (d) the endorsement of each lay theory is associated with vital social-cognitive consequences (particularly stereotyping and prejudice). Two experimental studies examined the idea that the relationship between BSD and prejudice is bidirectional in nature. Study 6 reveals that rendering social-deterministic explanations salient results in increased levels of ingroup favoritism in individuals who chronically endorse BSD. Results of Study 7 show that priming of prejudice enhances endorsement of social-deterministic explanations particularly in persons habitually endorsing prejudiced attitudes. 2011 APA, all rights reserved

  6. Essential Tremor

    Science.gov (United States)

    ... Treatment There is no definitive cure for essential tremor. Symptomatic drug therapy may include propranolol or other beta blockers and primidone, an anticonvulsant drug. Eliminating tremor "triggers" ...

  7. Probability densities and Lévy densities

    DEFF Research Database (Denmark)

    Barndorff-Nielsen, Ole Eiler

    For positive Lévy processes (i.e. subordinators) formulae are derived that express the probability density or the distribution function in terms of power series in time t. The applicability of the results to finance and to turbulence is briefly indicated.

  8. Maximization of regional probabilities using Optimal Surface Graphs

    DEFF Research Database (Denmark)

    Arias Lorza, Andres M.; Van Engelen, Arna; Petersen, Jens

    2018-01-01

    Purpose: We present a segmentation method that maximizes regional probabilities enclosed by coupled surfaces using an Optimal Surface Graph (OSG) cut approach. This OSG cut determines the globally optimal solution given a graph constructed around an initial surface. While most methods for vessel wall segmentation only use edge information, we show that maximizing regional probabilities using an OSG improves the segmentation results. We applied this to automatically segment the vessel wall of the carotid artery in magnetic resonance images. Methods: First, voxel-wise regional probability maps were obtained using a Support Vector Machine classifier trained on local image features. Then, the OSG segments the regions which maximize the regional probabilities considering smoothness and topological constraints. Results: The method was evaluated on 49 carotid arteries from 30 subjects...

  9. Zero field reversal probability in thermally assisted magnetization reversal

    Science.gov (United States)

    Prasetya, E. B.; Utari; Purnama, B.

    2017-11-01

    This paper discusses the zero-field reversal probability in thermally assisted magnetization reversal (TAMR). The appearance of reversal at zero field is investigated through micromagnetic simulation by solving the stochastic Landau-Lifshitz-Gilbert (LLG) equation. A perpendicular-anisotropy magnetic dot of 50×50×20 nm3 is considered as a single-cell storage element of magnetic random access memory (MRAM). Thermally assisted magnetization reversal was performed by cooling the writing process from near the Curie point to room temperature, repeated over 20 runs with different randomly magnetized initial states. The results show that the reversal probability under zero magnetic field decreased with increasing energy barrier. A zero-field switching probability of 55% was attained for an energy barrier of 60 kBT, which corresponds to a switching field of 150 Oe, and the reversal probability became zero at an energy barrier of 2348 kBT.
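
    The record's results come from stochastic Landau-Lifshitz-Gilbert simulation, which is not reproduced here. As a much simpler, hedged point of comparison, the sketch below uses the Néel-Arrhenius picture of thermally activated switching, in which the zero-field reversal probability over a waiting time falls off with the energy barrier measured in units of kBT; the attempt frequency and waiting time are illustrative assumptions, not values from the paper, so the numbers will not reproduce the 55% figure quoted above.

        import numpy as np

        def neel_arrhenius_reversal_probability(barrier_over_kT, t=1e-9, f0=1e9):
            """P(reversal within t) = 1 - exp(-t * f0 * exp(-dE / kT)).
            f0 (attempt frequency) and t (waiting time) are assumptions."""
            rate = f0 * np.exp(-np.asarray(barrier_over_kT, dtype=float))
            return 1.0 - np.exp(-rate * t)

        for barrier in (10, 30, 60, 100, 200):
            p = neel_arrhenius_reversal_probability(barrier)
            print(f"dE = {barrier:4d} kT -> zero-field reversal probability ~ {p:.3e}")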

  10. Path probability of stochastic motion: A functional approach

    Science.gov (United States)

    Hattori, Masayuki; Abe, Sumiyoshi

    2016-06-01

    The path probability of a particle undergoing stochastic motion is studied by the use of functional technique, and the general formula is derived for the path probability distribution functional. The probability of finding paths inside a tube/band, the center of which is stipulated by a given path, is analytically evaluated in a way analogous to continuous measurements in quantum mechanics. Then, the formalism developed here is applied to the stochastic dynamics of stock price in finance.

  11. Chemical composition and antifungal activity of thyme (Thymus vulgaris essential oil

    Directory of Open Access Journals (Sweden)

    S. Farsaraei*

    2017-11-01

    Full Text Available Background and objectives: The antifungal activity of essential oils and their constituents against some phytopathogenic fungi has been reported. Thymus vulgaris (Lamiaceae) is one of the Thymus species. A large number of studies have addressed the chemical composition and antifungal activity of thyme oil. In order to reduce the use of synthetic fungicides, considerable attention has recently been given to the search for naturally occurring compounds. The aim of the present work was to determine the chemical composition and antifungal activity of T. vulgaris oil cultivated in Iran. Methods: The essential oil from aerial parts of the plant at full flowering stage was obtained by hydrodistillation and its chemical compounds were analyzed by GC/GC-MS. The in vitro antifungal activity of the oil against three phytopathogenic fungi (Drechslera spicifera, Fusarium oxysporum f.sp. ciceris and Macrophomina phaseolina) was evaluated by the agar dilution method. The data were subjected to ANOVA using SPSS 21 software. Results: In total, 45 compounds representing 96.75% of the oil were identified. Thymol (36.81%) and p-cymene (30.90%) were the main components of the thyme oil. According to the results, the antifungal activity of the oil increased with rising concentration. Growth of all tested fungi was completely inhibited at 1600 µL/L. In this study, fungicidal activity was observed only against F. oxysporum and D. spicifera, at concentrations higher than 800 µL/L. Conclusion: The antifungal activity of T. vulgaris essential oil is probably due to its high concentration of oxygenated monoterpenes (thymol) and monoterpene hydrocarbons (p-cymene).

  12. Probability and containment of turbine missiles

    International Nuclear Information System (INIS)

    Yeh, G.C.K.

    1976-01-01

    With the trend toward ever larger power generating plants with large high-speed turbines, an important plant design consideration is the potential for and consequences of mechanical failure of turbine rotors. Such rotor failure could result in high-velocity disc fragments (turbine missiles) perforating the turbine casing and jeopardizing vital plant systems. The designer must first estimate the probability of any turbine missile damaging any safety-related plant component for his turbine and his plant arrangement. If the probability is not low enough to be acceptable to the regulatory agency, he must design a shield to contain the postulated turbine missiles. Alternatively, the shield could be designed to retard (to reduce the velocity of) the missiles such that they would not damage any vital plant system. In this paper, some of the presently available references that can be used to evaluate the probability, containment and retardation of turbine missiles are reviewed; various alternative methods are compared; and subjects for future research are recommended. (Auth.)

  13. Python essential reference

    CERN Document Server

    Beazley, David M

    2009-01-01

    Python Essential Reference is the definitive reference guide to the Python programming language — the one authoritative handbook that reliably untangles and explains both the core Python language and the most essential parts of the Python library. Designed for the professional programmer, the book is concise, to the point, and highly accessible. It also includes detailed information on the Python library and many advanced subjects that is not available in either the official Python documentation or any other single reference source. Thoroughly updated to reflect the significant new programming language features and library modules that have been introduced in Python 2.6 and Python 3, the fourth edition of Python Essential Reference is the definitive guide for programmers who need to modernize existing Python code or who are planning an eventual migration to Python 3. Programmers starting a new Python project will find detailed coverage of contemporary Python programming idioms.

  14. Cytotoxic effects of Pinus eldarica essential oil and extracts on HeLa and MCF-7 cell lines.

    Science.gov (United States)

    Sarvmeili, Najmeh; Jafarian-Dehkordi, Abbas; Zolfaghari, Behzad

    2016-12-01

    Several attempts have so far been made in the search for new anticancer agents of plant origin. Some studies have reported that different species of the Pinus genus possess cytotoxic activities against various cancer cell lines. In the present study, we evaluated the cytotoxic effects of Pinus eldarica bark and leaf extracts and leaf essential oil on HeLa and MCF-7 tumor cell lines. Hydroalcoholic and phenolic extracts and the essential oil of the plant were prepared. Total phenolic contents of the extracts were measured using Folin-Ciocalteu reagent. Essential oil components were determined by gas chromatography-mass spectrometry (GC-MS). Cytotoxic activity of the extracts and essential oil against HeLa and MCF-7 tumor cell lines was assessed by the 3-(4,5-dimethylthiazol-2-yl)-2,5-diphenyl tetrazolium bromide (MTT) assay. The polyphenolic contents of the hydroalcoholic and phenolic extracts of the bark and the hydroalcoholic extract of the leaf were 48.31%, 47.2%, and 8.47%, respectively. According to the GC-MS analysis, the major components of the leaf oil of P. eldarica were: β-caryophyllene (14.8%), germacrene D (12.9%), α-terpinenyl acetate (8.15%), α-pinene (5.7%), and α-humulene (5.9%). Bark extracts and leaf essential oil of P. eldarica significantly reduced the viability of both HeLa and MCF-7 cells in a concentration-dependent manner. However, the leaf extract showed weaker inhibitory effects against both cell lines. The essential oil of P. eldarica was more cytotoxic than its hydroalcoholic and phenolic extracts. The terpenes and phenolic compounds are probably responsible for the cytotoxicity of P. eldarica. Therefore, P. eldarica might have good potential as a source of active anticancer agents.

  15. The probability of the false vacuum decay

    International Nuclear Information System (INIS)

    Kiselev, V.; Selivanov, K.

    1983-01-01

    The closed expression for the probability of the false vacuum decay in (1+1) dimensions is given. The probability of false vacuum decay is expressed as the product of an exponential quasiclassical factor and a functional determinant of the given form. A method for calculation of this determinant is developed and a complete answer for (1+1) dimensions is given.
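
    In standard quasiclassical notation (a hedged restatement of the structure described above, not necessarily the authors' exact expression), the decay probability per unit length and time is the product of an exponential bounce factor and a fluctuation prefactor built from a functional determinant:

        \frac{\Gamma}{L} \;=\; A\, e^{-S_E[\phi_b]/\hbar},
        \qquad
        A \;\propto\; \left| \frac{\det'\!\left[-\partial^2 + U''(\phi_b)\right]}
                                  {\det\!\left[-\partial^2 + U''(\phi_{\mathrm{fv}})\right]} \right|^{-1/2},

    where S_E[φ_b] is the Euclidean action of the bounce solution φ_b, φ_fv is the false vacuum, the prime denotes omission of zero modes, and zero-mode (collective-coordinate) factors have been absorbed into the proportionality; obtaining a closed form for the determinant ratio in (1+1) dimensions is the content of the record above.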

  16. Influence of the Probability Level on the Framing Effect

    Directory of Open Access Journals (Sweden)

    Kaja Damnjanovic

    2016-11-01

    Full Text Available Research on the framing effect in risky choice has mostly used tasks that examine the effect of only one probability or risk level on the choice between non-risky and risky options. The present research examined the framing effect as a function of the probability level of the risky option's outcome in three decision-making domains: health, money and human lives. It was confirmed that the decision-making domain moderates the framing effect. In the monetary domain, the general risk aversion registered in earlier research was confirmed. At high probability levels, the framing effect is registered in both frames, while no framing effect is registered at lower probability levels. In the domain of decision-making about human lives, the framing effect is registered at medium-high and medium-low probability levels. In the domain of decision-making about health, the framing effect is registered across almost the entire probability range; this domain differs from the former two. The results show that the attitude to risk is not identical at different probability levels, that the dynamics of the attitude to risk influences the framing effect, and that the framing-effect pattern differs across decision-making domains. In other words, the linguistic manipulation representing the frame in the tasks affects the change in preference order only when the possibility of gain (expressed as probability) is estimated as sufficiently high.

  17. Probability of crack-initiation and application to NDE

    Energy Technology Data Exchange (ETDEWEB)

    Prantl, G. [Nuclear Safety Inspectorate HSK (Switzerland)]

    1988-12-31

    Fracture toughness is a property with a certain variability. When a statistical distribution is assumed, the probability of crack initiation may be calculated for a given problem defined by its geometry and the applied stress. Experiments have shown that cracks which experience a certain small amount of ductile growth can reliably be detected by acoustic emission measurements. The probability of crack detection by AE-techniques may be estimated using this experimental finding and the calculated probability of crack initiation. (author).
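
    As a hedged illustration of the first step described here (the record does not give the distribution or detection data actually used), the sketch below assumes a Weibull distribution for fracture toughness, computes the probability of crack initiation for a given applied stress-intensity factor, and combines it with an assumed conditional probability of detection by acoustic emission; every numerical value is an illustrative assumption.

        import math

        # Illustrative Weibull parameters for fracture toughness K_Ic (MPa*sqrt(m)).
        K0, m = 120.0, 8.0          # scale and shape -- assumptions, not HSK data

        def p_crack_initiation(K_applied):
            """P(initiation) = P(K_Ic <= K_applied) under a Weibull toughness model."""
            return 1.0 - math.exp(-(K_applied / K0) ** m)

        # Assumed probability that a crack with some ductile growth is detected by AE.
        p_detect_given_initiation = 0.95

        for K_app in (60.0, 90.0, 110.0, 130.0):
            p_init = p_crack_initiation(K_app)
            p_detect = p_init * p_detect_given_initiation
            print(f"K_applied = {K_app:6.1f}: P(initiation) = {p_init:.3f}, "
                  f"P(detected by AE) = {p_detect:.3f}")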

  18. Introduction to probability and stochastic processes with applications

    CERN Document Server

    Castañ, Blanco; Arunachalam, Viswanathan; Dharmaraja, Selvamuthu

    2012-01-01

    An easily accessible, real-world approach to probability and stochastic processes Introduction to Probability and Stochastic Processes with Applications presents a clear, easy-to-understand treatment of probability and stochastic processes, providing readers with a solid foundation they can build upon throughout their careers. With an emphasis on applications in engineering, applied sciences, business and finance, statistics, mathematics, and operations research, the book features numerous real-world examples that illustrate how random phenomena occur in nature and how to use probabilistic t

  19. Estimating the probability that the sample mean is within a desired fraction of the standard deviation of the true mean.

    Science.gov (United States)

    Schillaci, Michael A; Schillaci, Mario E

    2009-02-01

    The use of small sample sizes in human and primate evolutionary research is commonplace. Estimating how well small samples represent the underlying population, however, is not commonplace. Because the accuracy of determinations of taxonomy, phylogeny, and evolutionary process is dependent upon how well the study sample represents the population of interest, characterizing the uncertainty, or potential error, associated with analyses of small sample sizes is essential. We present a method for estimating the probability that the sample mean is within a desired fraction of the standard deviation of the true mean using small samples. This allows researchers to determine post hoc the probability that their sample is a meaningful approximation of the population parameter. We tested the method using a large craniometric data set commonly used by researchers in the field. Given our results, we suggest that sample estimates of the population mean can be reasonable and meaningful even when based on small, and perhaps even very small, sample sizes.
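
    For a hedged sense of the kind of calculation involved (the authors' exact estimator for the unknown-variance case is not reproduced in this record), the snippet below evaluates, under a normal model with known standard deviation σ, the probability that the sample mean lies within k·σ of the true mean: P(|x̄ – μ| ≤ k·σ) = 2Φ(k·√n) – 1.

        from math import erf, sqrt

        def prob_mean_within(k, n):
            """P(|sample mean - true mean| <= k * sigma) for n i.i.d. normal
            observations with known sigma: 2 * Phi(k * sqrt(n)) - 1."""
            z = k * sqrt(n)
            return erf(z / sqrt(2))          # 2*Phi(z) - 1 == erf(z / sqrt(2))

        for n in (2, 5, 10, 25):
            print(f"n = {n:3d}: P(within 0.5 sd) = {prob_mean_within(0.5, n):.3f}, "
                  f"P(within 1.0 sd) = {prob_mean_within(1.0, n):.3f}")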

  20. Measurement of Plutonium-240 Angular Momentum Dependent Fission Probabilities Using the Alpha-Alpha' Reaction

    Science.gov (United States)

    Koglin, Johnathon

    Accurate nuclear reaction data from a few keV to tens of MeV and across the table of nuclides are essential to a number of applications of nuclear physics, including national security, nuclear forensics, nuclear astrophysics, and nuclear energy. Precise determination of (n, f) and neutron capture cross sections for reactions in high-flux environments is particularly important for a proper understanding of nuclear reactor performance and stellar nucleosynthesis. In these extreme environments reactions on short-lived and otherwise difficult-to-produce isotopes play a significant role in system evolution and provide insights into the types of nuclear processes taking place; a detailed understanding of these processes is necessary to properly determine cross sections far from stability. Indirect methods are often attempted to measure cross sections on isotopes that are difficult to separate in a laboratory setting. Using the surrogate approach, the same compound nucleus from the reaction of interest is created through a "surrogate" reaction on a different isotope and the resulting decay is measured. This result is combined with appropriate reaction theory for compound nucleus population, from which the desired cross sections can be inferred. This method has shown promise, but the theoretical framework often lacks the experimental data necessary to constrain models. In this work, dual arrays of silicon telescope particle identification detectors and photovoltaic (solar) cell fission fragment detectors have been used to measure the fission probability of the 240Pu(alpha, alpha'f) reaction - a surrogate for the 239Pu(n, f) reaction - at a beam energy of 35.9(2) MeV, at eleven scattering angles from 40° to 140° in 10° intervals and at nuclear excitation energies up to 16 MeV. Within experimental uncertainty, the maximum fission probability was observed at the neutron separation energy for each alpha scattering angle. Fission probabilities were separated into five 500 keV bins from 5.5 MeV to
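
    The surrogate logic sketched in this abstract is commonly written as follows; this is a hedged restatement of the general surrogate framework, with the Weisskopf-Ewing approximation shown as a limiting case, and is not claimed to be the angular-momentum-dependent analysis carried out in this particular work:

        \sigma_{n,f}(E_n) \;=\; \sum_{J,\pi} \sigma^{\mathrm{CN}}_{n+^{239}\mathrm{Pu}}(E_n, J, \pi)\; P_f(E^{*}, J, \pi)
        \;\;\approx\;\; \sigma^{\mathrm{CN}}_{n+^{239}\mathrm{Pu}}(E_n)\, P_f(E^{*})
        \quad \text{(Weisskopf--Ewing limit)},

    where the compound-nucleus formation cross sections σ^CN come from reaction theory (e.g., an optical-model calculation) and the fission probability P_f is taken from the surrogate measurement as P_f(E*) ≈ N_{αα'f}(E*) / [ε_f · N_{αα'}(E*)], with ε_f the fission-fragment detection efficiency; the symbols and the efficiency factor are generic notation, not quantities quoted in the record.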