WorldWideScience

Sample records for efavirenz como probable

  1. Efavirenz in pregnancy : review

    African Journals Online (AJOL)

    2010-09-01

    Sep 1, 2010 ... Merck Sharp and Dohme (MSD, who market efavirenz as Stocrin in South Africa and many other parts of the world) provides prescriber information for efavirenz that cautions against conception in patients already on efavirenz as opposed to the prescription of efavirenz in women already known to be ...

  2. Como pesquisar o perfil patentário de um fármaco: o caso Efavirenz How to determine the patent profile of a drug: a case study of Efavirenz

    Directory of Open Access Journals (Sweden)

    Jaqueline Mendes Soares

    2010-01-01

    Full Text Available The importance of the patent system for researchers, especially in chemistry and related areas, is undeniable. In this context, this work aims to guide searches in the major patent search engines, in order to map the patents related to a specific chemical compound and identify the material that each patent document protects. In this case study, a search was performed for the drug efavirenz to demonstrate how to conduct a literature search in patent databases and to map patent applications at the national and international levels.

  3. [Apropos of atypical melancholia with Sustiva (efavirenz)].

    Science.gov (United States)

    Lang, J P; Halleguen, O; Picard, A; Lang, J M; Danion, J M

    2001-01-01

    The treatment of HIV infection has changed dramatically in recent years as a result of the development of new drugs, which allow a variety of multitherapy combinations better adapted to patients' needs, thereby improving compliance. Efavirenz is a non-nucleoside reverse transcriptase inhibitor. In addition to a potent antiretroviral activity, efavirenz is an easy-to-take drug with once-daily dosing and is usually well tolerated. Efavirenz, however, may induce psychic alterations which are variable and atypical in both their clinical presentation and severity. As early as the first days of treatment, efavirenz may provoke surprising phenomena such as nightmares, vivid dreams, hallucinations or illusions, and twilight states. Depersonalization and derealization episodes, personality alterations, disturbances in the stream of thought and unusual thought contents, atypical depression and cognitive disorders have also been observed. These phenomena may occur either early or later in treatment. The prevalence of severe psychic disorders is less than 5%, but they are often responsible for harmful treatment discontinuations. Psychiatric side effects are heterogeneous and probably not related to pre-existing psychological weakness. We do not have enough data to evaluate these side effects and their etiopathogeny. The drug, which crosses the blood-brain barrier, could act directly on the central nervous system, possibly on the serotoninergic and dopaminergic systems. Some authors have compared efavirenz-induced psychic effects to those associated with LSD and found structural similarities between the two molecules. However, the heterogeneity and low prevalence of the psychiatric side effects of efavirenz suggest an individual sensitivity. In order to improve patient care, a better clinical approach, neuropsychological evaluation, and functional brain imaging should be used to progress in the analysis and comprehension of these disorders. We discuss in this paper the case of Mister H. This HIV

  4. Influence of milling process on efavirenz solubility

    Directory of Open Access Journals (Sweden)

    Erizal Zaini

    2017-01-01

    Full Text Available Introduction: The aim of this study was to investigate the influence of the milling process on the solubility of efavirenz. Materials and Methods: The milling process was carried out by nanomilling for 30, 60, and 180 min. Intact and milled efavirenz were characterized by powder X-ray diffraction, scanning electron microscopy (SEM), infrared (IR) spectroscopy, differential scanning calorimetry (DSC), and a solubility test. Results: The X-ray diffractograms showed a decline in the peak intensity of milled efavirenz compared with intact efavirenz. The SEM images depicted a change from a crystalline to an amorphous habit after the milling process. The IR spectra showed no difference between intact and milled efavirenz. Thermal analysis performed by DSC showed a reduction of the endothermic peak after milling, related to decreased crystallinity. The solubility of intact and milled efavirenz was determined in CO2-free distilled water with 0.25% sodium lauryl sulfate as the medium and measured by a high-performance liquid chromatography method with acetonitrile:distilled water (80:20) as the mobile phase. The solubility increased significantly (P < 0.05) after the milling process: intact efavirenz was 27.12 ± 2.05 μg/mL, while efavirenz milled for 30, 60, and 180 min was 75.53 ± 1.59, 82.34 ± 1.23, and 104.75 ± 0.96 μg/mL, respectively. Conclusions: Based on these results, the solubility of efavirenz improved after the milling process.
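
    As a rough illustration of the significance claim above, the sketch below runs two-sample t-tests on the reported solubility means and standard deviations; the replicate count n = 3 per batch is an assumption, since the record does not state it.

        # Hedged sketch: two-sample t-tests on the reported means and SDs.
        # Assumption: n = 3 replicates per batch (not stated in the record).
        from scipy.stats import ttest_ind_from_stats

        intact_mean, intact_sd = 27.12, 2.05                  # ug/mL
        milled = {30: (75.53, 1.59), 60: (82.34, 1.23), 180: (104.75, 0.96)}
        n = 3                                                 # assumed replicates

        for minutes, (mean, sd) in milled.items():
            t, p = ttest_ind_from_stats(intact_mean, intact_sd, n, mean, sd, n)
            print(f"milled {minutes} min vs intact: t = {t:.1f}, p = {p:.5f}")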

  5. FDA approves efavirenz. Food and Drug Administration.

    Science.gov (United States)

    Highleyman, L

    1998-10-01

    The Food and Drug Administration (FDA) approved DuPont Pharma's new non-nucleoside reverse transcriptase inhibitor (NNRTI) efavirenz (Sustiva, DMP-266). Efavirenz has shown promise in trials with over 2000 participants for up to 24 weeks, and early data suggests it may be as effective as protease inhibitors when used in a combination regimen. It is the first anti-HIV drug approved for once-daily dosing. Efavirenz is well tolerated, and the main side effects reported are dizziness, insomnia, abnormal dreams, and skin rash. Efavirenz has been approved for adults and children, but should not be used by pregnant women. Contact information is provided.

  6. Probability

    CERN Document Server

    Shiryaev, A N

    1996-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, ergodic theory, weak convergence of probability measures, stationary stochastic processes, and the Kalman-Bucy filter. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for self-study. This new edition contains substantial revisions and updated references. The reader will find a deeper study of topics such as the distance between probability measures, metrization of weak convergence, and contiguity of probability measures. Proofs for a number of important results which were merely stated in the first edition have been added. The author has included new material on the probability of large deviations and on the central limit theorem for sums of dependent random variables.

  7. Crystal structure determination of Efavirenz

    International Nuclear Information System (INIS)

    Popeneciu, Horea; Dumitru, Ristoiu; Tripon, Carmen; Borodi, Gheorghe; Pop, Mihaela Maria

    2015-01-01

    Needle-shaped single crystals of the title compound, C14H9ClF3NO2, were obtained from a co-crystallization experiment of Efavirenz with maleic acid in a (1:1) ratio, using methanol as solvent. Crystal structure determination at room temperature revealed a significant anisotropy of the lattice expansion compared to the previously reported low-temperature structure. In both the low- and room-temperature structures the cyclopropylethynyl fragment in one of the asymmetric unit molecules is disordered. While at low temperature only one C atom exhibits positional disorder, at room temperature the disorder is present for two C atoms of the cyclopropane ring.

  8. Efavirenz

    Science.gov (United States)

    ... along with practicing safer sex and making other life-style changes may decrease the risk of transmitting ( ... exist), loss of touch with reality, or other strange thoughts. Be sure your family knows which symptoms ...

  9. Efavirenz: A review of the epidemiology, severity and management ...

    African Journals Online (AJOL)

    The five classes of drugs used for the .... The three main risk factors for the development of NPSEs in HIV- positive .... Influence of efavirenz pharmacokinetics and ... efavirenz on neuropsychological performance and symptoms in HIV-infected ...

  10. Estratégias utilizadas para o incremento da solubilidade do fármaco antiretroviral classe II: Efavirenz [Strategies used to increase the solubility of the class II antiretroviral drug efavirenz]

    Directory of Open Access Journals (Sweden)

    Tarcyla Andrade Gomes

    2015-10-01

    Full Text Available Efavirenz (EFZ) is considered one of the most widely used anti-HIV drugs; however, like the great majority of antiretrovirals, it is classified as a Class II drug under the Biopharmaceutics Classification System (BCS) because of its low solubility and high permeability. It is well known that the aqueous solubility of a drug is a prerequisite for absorption and thus constitutes one of the most important barriers to drug efficacy. Accordingly, increasing the aqueous dissolution, and consequently the bioavailability, of poorly water-soluble drugs through pharmaceutical technologies is considered one of the most challenging aspects of modern drug development. This work surveys the scientific literature of the last 10 years (2004-2014) and discusses the main techniques applied to improve the dissolution of EFZ, among them solid dispersions, inclusion complexes, multicomponent systems and particulate systems. The survey found the largest number of publications employing solid dispersions, probably because this technique is considered simple and inexpensive. It is therefore clear that researchers have a strong interest in developing efficient and economical methods to improve the aqueous dissolution of this drug, and that the development of solid dispersions is undoubtedly an attractive solution. Keywords: Efavirenz. Solubility. Solid Dispersions. Inclusion Complexes. Multicomponent Systems. Particulate Systems.

  11. Efavirenz Dissolution Enhancement I: Co-Micronization

    Directory of Open Access Journals (Sweden)

    Helvécio Vinícius Antunes Rocha

    2012-12-01

    Full Text Available AIDS constitutes one of the most serious infectious diseases, representing a major public health priority. Efavirenz (EFV), one of the most widely used drugs for this pathology, belongs to the Class II of the Biopharmaceutics Classification System for drugs with very poor water solubility. To improve EFV’s dissolution profile, changes can be made to the physical properties of the drug that do not lead to any accompanying molecular modifications. Therefore, the study objective was to develop and characterize systems with efavirenz able to improve its dissolution, which were co-processed with sodium lauryl sulfate (SLS) and polyvinylpyrrolidone (PVP). The technique used was co-micronization. Three different drug:excipient ratios were tested for each of the two carriers. The drug dispersion dissolution results showed significant improvement for all the co-processed samples in comparison to non-processed material and corresponding physical mixtures. The dissolution profiles obtained for dispersion with co-micronized SLS samples proved superior to those of co-micronized PVP, with the proportion (1:0.25) proving the optimal mixture. The improvements may be explained by the hypothesis that formation of a hydrophilic layer on the surface of the micronized drug increases the wettability of the system formed, corroborated by characterization results indicating no loss of crystallinity and an absence of interaction at the molecular level.

  12. Acute Liver Toxicity due to Efavirenz/Emtricitabine/Tenofovir

    Directory of Open Access Journals (Sweden)

    Rashmee Patil

    2015-01-01

    Full Text Available The fixed-dose combination of Efavirenz/Emtricitabine/Tenofovir is a first-line agent for the treatment of HIV; however, few cases of hepatotoxicity associated with the drug have been reported. We report a case of Efavirenz/Emtricitabine/Tenofovir-associated hepatotoxicity presenting mainly with hepatocellular injury characterized by extremely elevated aminotransferase levels, which resolved without acute liver failure or the need for liver transplant referral.

  13. Biowaiver monographs for immediate release solid oral dosage forms: efavirenz.

    Science.gov (United States)

    Cristofoletti, Rodrigo; Nair, Anita; Abrahamsson, Bertil; Groot, D W; Kopp, Sabine; Langguth, Peter; Polli, James E; Shah, Vinod P; Dressman, Jennifer B

    2013-02-01

    Literature data pertaining to the decision to allow a waiver of in vivo bioequivalence testing for the approval of immediate-release (IR) solid oral dosage forms containing efavirenz as the only active pharmaceutical ingredient (API) are reviewed. Because of a lack of conclusive data about efavirenz's permeability and its failure to comply with the "high solubility" criteria according to the Biopharmaceutics Classification System (BCS), the API can be classified as BCS Class II/IV. In line with its solubility characteristics, the innovator product does not meet the dissolution criteria for a "rapidly dissolving product." Furthermore, variations in commonly used excipients or in the manufacturing process have been reported to affect the rate and extent of efavirenz absorption. Despite its wide therapeutic index, subtherapeutic levels of efavirenz can lead to treatment failure and also facilitate the emergence of efavirenz-resistant mutants. For all these reasons, a biowaiver for IR solid oral dosage forms containing efavirenz as the sole API is not scientifically justified for reformulated or multisource drug products. Copyright © 2012 Wiley Periodicals, Inc.

  14. Long term adverse drug reaction to Efavirenz in a HIV infected ...

    African Journals Online (AJOL)

    Arun Kumar Agnihotri

    2015-08-31

    Aug 31, 2015 ... presented with five-day history of Difficulty sleeping, abnormal dreams, inability to ... He regained his memory, no longer had bad dreams or demonstrated any irrational .... concentration of Efavirenz on long term. Efavirenz ...

  15. Desenvolvimento de formulações e tecnologia de obtenção de comprimidos revestidos de efavirenz: terapia anti-HIV Development of formulations and technology of efavirenz coated tablets obtention: anti-HIV therapy

    Directory of Open Access Journals (Sweden)

    Osnir de Sá Viana

    2006-12-01

    Full Text Available Efavirenz belongs to one of the most recent classes of antiretroviral agents used in the treatment of HIV infection and is among the first-choice drugs for the treatment of AIDS. Because efavirenz is hydrophobic, has low density and offers great resistance to flow, the choice of an adequate formulation is essential for tablet development and for ensuring good availability of the drug in the gastrointestinal tract, so as to achieve the desired bioavailability and therapeutic effect. In this work we present, in a logical sequence, the technological development of efavirenz coated tablets, taking into account the physical and physico-chemical characteristics of the drug. The efavirenz cores (tablets) were obtained by wet granulation. Film coating was performed with Opadry® Y-1-7000 in an aqueous system. The parameters adopted for the physical evaluation of the tablets followed official pharmacopoeial specifications, and quantitative determination was carried out using an analytical method that was developed and validated.

  16. Efavirenz poisoning in a 12 year old HIV negative African boy ...

    African Journals Online (AJOL)

    Efavirenz is an oral antiretroviral drug in the class of non-nucleoside reverse transcriptase inhibitors. Toxicity at therapeutic doses has been documented, but there is a scarcity of data on the presentation and management of Efavirenz overdose. We describe a case of Efavirenz poisoning in a 12-year-old HIV-negative African boy ...

  17. Efavirenz: A review of the epidemiology, severity and management ...

    African Journals Online (AJOL)

    The NPSEs tend to occur within the first few days of initiation of therapy and resolve spontaneously within the first 4 - 6 weeks, with the most commonly reported being dizziness, insomnia, headache, abnormal dreams and impaired concentration. The plasma level of efavirenz and genetic polymorphisms are thought to play ...

  18. Influence of the Efavirenz Micronization on Tableting and Dissolution

    Directory of Open Access Journals (Sweden)

    Lucio Mendes Cabral

    2012-09-01

    Full Text Available The purpose of this study was to propose an analytical procedure that assesses the effects of particle size and surface area on the dissolution of efavirenz. Five different batches obtained by different micronization processes and with different particle size distributions and surface areas were studied. The preformulation studies and dissolution curves were used to confirm the effect of particle size distribution on drug solubility. No polymorphic variety or amorphization was observed in the tested batches, and the particle size distribution was determined to be directly responsible for the improvement of drug dissolution. The influence of the preparation process on the tablets derived from efavirenz was observed in the final dissolution result, in which agglomeration, usually seen in non-lipophilic micronized material, was avoided through the use of an appropriate wet granulation method. For these reasons, micronization may represent one viable alternative for the formulation of brick dust drugs.

  19. Rethinking the risk-benefit ratio of efavirenz in HIV-infected children

    NARCIS (Netherlands)

    Wijer, L van de; Schellekens, A.F.A.; Burger, D.M.; Homberg, J.R.; Mast, Q. de; Ven, A.J.A.M. van der

    2016-01-01

    The non-nucleoside reverse transcriptase inhibitor efavirenz is part of the WHO guidelines for preferred first-line treatment of HIV-1-infected adults, pregnant and lactating women, and children. Efavirenz is well known to cause CNS toxicity. Although good data for CNS toxicity are available for

  20. Improvement of Depression and Anxiety After Discontinuation of Long- Term Efavirenz Treatment

    NARCIS (Netherlands)

    Mothapo, K.M.; Schellekens, A.F.A.; Crevel, R. van; Keuter, M.; Grintjes-Huisman, K.; Koopmans, P.; Ven, A. van der

    2015-01-01

    Neuropsychiatric symptoms in human immunodeficiency virus (HIV)-infected patients may be a late complication of efavirenz treatment. This study: 1) assessed the level of neuropsychiatric symptoms in HIV-infected patients on long-term efavirenz therapy; 2) explored the effect of a switch to

  1. EFAVIRENZ-INDUCED GYNAECOMASTIA IN HIV INFECTED MALES: A REPORT OF 2 CASES

    Directory of Open Access Journals (Sweden)

    Ishwar Sidappa Hasabi

    2016-08-01

    Full Text Available Highly Active Antiretroviral Therapy (HAART) has been a major leap in the treatment of HIV, improving both morbidity and mortality in HIV patients. Of late, cases of gynaecomastia secondary to initiation of ART are increasing, yet Efavirenz-induced gynaecomastia remains underreported. CASE PRESENTATION: We report two cases of Efavirenz-induced gynaecomastia in young males, with a median duration of 12 months on Efavirenz, after valid written consent. CONCLUSION: Efavirenz is used as a first-line drug for ART initiation and also when the patient has tuberculosis as an opportunistic infection. Hence, the side effects of Efavirenz should be addressed and proper guidelines should be framed to manage them.

  2. A systematic review of the psychiatric side-effects of efavirenz.

    Science.gov (United States)

    Kenedi, Christopher A; Goforth, Harold W

    2011-11-01

    Concerns regarding the use of efavirenz in patients with a history of mental illness may predispose clinicians to not offer this agent to psychiatrically ill populations in spite of the convenience of once daily dosing, which can result in improved adherence in these at-risk populations. This systematic review examines the current data regarding the neuropsychiatric effects of efavirenz, and also attempts to provide guidance to clinicians using efavirenz to treat patients with mental illness. The review identified high rates of neuropsychiatric side effects including vivid dreams, insomnia and mood changes in approximately 50% of patients who initiate efavirenz. The effects begin quickly, commonly peak in the first 2 weeks, and are generally mild and transient in nature. Isolated case reports and uncontrolled data suggest higher rates of severe side effects; however, there is no clear evidence of a broadly increased risk of suicide or dangerous behavior for patients taking efavirenz as part of their antiretroviral regimen.

  3. Licença compulsória do efavirenz no Brasil em 2007: contextualização Compulsory licensing of efavirenz in Brazil in 2007: contextualization

    Directory of Open Access Journals (Sweden)

    William C. V. Rodrigues

    2009-12-01

    Full Text Available The present article aims at contextualizing the first Brazilian experience with compulsory licensing, which functions as a defense mechanism to prevent excessive pricing by holders of patents. Under this mechanism, a government can authorize a third party to exploit the patented object (in this case a drug) without previous consent from the patent holder. On May 4, 2007, Brazil officially issued a compulsory license for the antiretroviral drug efavirenz for public, non-commercial use. Initially, generic versions of the drug were purchased from laboratories in India. The next step was the manufacture of efavirenz by Farmanguinhos, the official pharmaceutical laboratory of Fundação Oswaldo Cruz. It is concluded that the decision made by the Brazilian government to issue compulsory licensing of efavirenz was correct, taking into account the projected savings of US$ 236.8 until 2012 and the guarantee of availability of efavirenz, the most widely used antiretroviral treatment provided free of charge in Brazil.

  4. Efavirenz-induced gynecomastia in a prepubertal girl with human immunodeficiency virus infection: a case report

    Science.gov (United States)

    2013-01-01

    Background Prepubertal gynecomastia is a rare condition and most frequently classified as idiopathic. In HIV-infected adults gynecomastia is a recognised but infrequent side-effect of antiretroviral treatment (ART) and mostly attributed to efavirenz use. Gynecomastia should be distinguished from pseudogynecomastia as part of the lipodystrophy syndrome caused by Nucleoside Reverse Transcriptase Inhibitors (NRTIs) to avoid incorrect substitution of drugs. In the medical literature only five cases of prepubertal gynecomastia in children taking ART are described and underlying pathogenesis was unknown. The occurrence of adverse effects of ART may interfere with therapy adherence and long-term prognosis and for that reason requires attention. We report the first case of prepubertal gynecomastia in a young girl attributed to efavirenz use. Case presentation A seven-year-old African girl presented with true gynecomastia four months after initiation on ART (abacavir, lamivudine, efavirenz). History, physical examination and laboratory tests excluded known causes of gynecomastia and efavirenz was considered as the most likely cause. Six weeks after withdrawal of efavirenz the breast enlargement had completely resolved. Conclusions Efavirenz-induced gynecomastia may occur in children as well as in adults. With the increasing access to ART, the possibility of efavirenz-exposure and the potential occurrence of its associated side-effects may be high. In resource-poor settings, empirical change from efavirenz to nevirapine may be considered, providing no other known or alarming cause is identified, as efavirenz-induced gynecomastia can resolve quickly after withdrawal of the drug. Timely recognition of gynecomastia as a side-effect of efavirenz is important in order to intervene while the condition may still be reversible, to sustain adherence to ART and to maintain the sociopsychological health of the child. PMID:23941256

  5. Allosteric activation of cytochrome P450 3A4 by efavirenz facilitates midazolam binding.

    Science.gov (United States)

    Ichikawa, Tomohiko; Tsujino, Hirofumi; Miki, Takahiro; Kobayashi, Masaya; Matsubara, Chiaki; Miyata, Sara; Yamashita, Taku; Takeshita, Kohei; Yonezawa, Yasushige; Uno, Tadayuki

    2017-12-18

    1. The purpose of this study was to investigate the heteroactivation mechanism of CYP3A4 by efavirenz, which enhances the metabolism of midazolam in vivo, in terms of its binding to CYP3A4, using in vitro spectroscopic methods. 2. Efavirenz exhibited a type II spectral change on binding to CYP3A4, indicating a possible inhibitor. Although the dissociation constant (Kd) was approximated as 520 μM, efavirenz enhanced the binding affinity of midazolam as a co-existing drug, with an estimated iKd value of 5.6 µM, which is comparable to a clinical concentration. 3. Efavirenz stimulated the formation of 1'-hydroxymidazolam, and the product formation rate (Vmax) increased concentration-dependently without changing the Km. In addition, an efavirenz analogue, [6-chloro-1,4-dihydro-4-(1-pentynyl)-4-(trifluoromethyl)-2H-3,1-benzoxazin-2-one] (efavirenz impurity), slightly facilitated the binding affinity of midazolam in a concentration-dependent manner. These results suggest that efavirenz affects midazolam binding via binding to a peripheral site that is apart from the active site of CYP3A4. 4. A molecular dynamics simulation also suggested that the bound efavirenz was repositioned to an effector-binding site. As a consequence, our spectroscopic studies clarified the heteroactivation of CYP3A4 caused by efavirenz with a proper affinity to the peripheral site, and we conclude that the method can be a useful tool for characterising the potential for drug-drug interactions.
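
    A minimal sketch of how a dissociation constant of this kind is typically estimated from a spectral titration is shown below; the 1:1 binding isotherm is a common simplification, and the absorbance-change values are hypothetical, chosen only to be roughly consistent with the ~520 μM Kd quoted above.

        # Hedged sketch: least-squares fit of a 1:1 binding isotherm to a
        # spectral titration; the delta-absorbance data are hypothetical.
        import numpy as np
        from scipy.optimize import curve_fit

        def binding(L, dA_max, Kd):
            # dA = dA_max * [L] / (Kd + [L]) for simple 1:1 binding
            return dA_max * L / (Kd + L)

        efavirenz_uM = np.array([25, 50, 100, 200, 400, 800, 1600])
        delta_abs    = np.array([0.004, 0.007, 0.013, 0.022, 0.035, 0.048, 0.060])

        (dA_max, Kd), _ = curve_fit(binding, efavirenz_uM, delta_abs, p0=(0.08, 500))
        print(f"fitted Kd ~ {Kd:.0f} uM (record reports ~520 uM)")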

  6. HIV and mental illness in Malawi and the neuropsychiatric sequelae of efavirenz.

    Science.gov (United States)

    Drury, Andrew; Gleadow-Ware, Selena; Gilfillan, Sheila; Ahrens, Jen

    2018-03-01

    Little is published about mental disorders in Malawi, specifically in relation to Human Immunodeficiency Virus (HIV) and its treatment. Efavirenz is a medication commonly used as part of triple therapy for HIV treatment. Indeed, in 2013, Malawi introduced regimen 5A, which contains Efavirenz, as part of its first-line treatment for HIV. There exists some literature documenting known psychiatric side effects of Efavirenz, which include anxiety, mood changes, nightmares, psychosis and suicidal ideation. Little is known about which features are most common in the presentation and which patient and drug factors may make this reaction more likely. The aim of this commentary is to review the association between HIV and psychiatric disorder, and to consider the neuropsychiatric side-effects of Efavirenz. An evaluative literature review was completed by means of multiple electronic database searches as well as an additional manual search to obtain published works identified through the electronic search. Search terms used were: Efavirenz, Acquired Immunodeficiency Syndrome, Africa, Antiretroviral Therapy, Developing Countries, Malawi, Mental Disorders, Public Health, and Psychiatry. This is an important area of study, as potentially large numbers of individuals with HIV are being placed on Efavirenz as first-line treatment, yet 60% may experience some form of neuropsychiatric side effect.

  7. Safety, strength and simplicity of efavirenz in pregnancy

    Directory of Open Access Journals (Sweden)

    Prinitha Pillay

    2012-03-01

    Full Text Available The WHO recommends starting lifelong ART for all pregnant women with a CD4 count at or below 350 cells/mm³, which recognises the important component of ‘when to start’ and the role that timing of initiation plays in reducing mortality and disease progression. The data on ‘what to start’ are conflicting, and options for resource-limited settings are limited. The choice of an ART regimen for pregnant women is complicated by the need to take into account the health and safety of both the mother and the baby. Particularly contentious is whether to use a nevirapine- (NVP) or efavirenz- (EFV) based regimen. This review presents the latest evidence on the safety and efficacy of EFV and NVP in pregnancy and offers recommendations for improving maternal and child health outcomes and avoiding mother-to-child transmission as South Africa moves toward turning back the tide on its HIV epidemic.

  8. Quantifying the risks and benefits of efavirenz use in HIV-infected women of childbearing age in the USA.

    Science.gov (United States)

    Hsu, H E; Rydzak, C E; Cotich, K L; Wang, B; Sax, P E; Losina, E; Freedberg, K A; Goldie, S J; Lu, Z; Walensky, R P

    2011-02-01

    The aim of the study was to quantify the benefits (life expectancy gains) and risks (efavirenz-related teratogenicity) associated with using efavirenz in HIV-infected women of childbearing age in the USA. We used data from the Women's Interagency HIV Study in an HIV disease simulation model to estimate life expectancy in women who receive an efavirenz-based initial antiretroviral regimen compared with those who delay efavirenz use and receive a boosted protease inhibitor-based initial regimen. To estimate excess risk of teratogenic events with and without efavirenz exposure per 100,000 women, we incorporated literature-based rates of pregnancy, live births, and teratogenic events into a decision analytic model. We assumed a teratogenicity risk of 2.90 events/100 live births in women exposed to efavirenz during pregnancy and 2.68/100 live births in unexposed women. Survival for HIV-infected women who received an efavirenz-based initial antiretroviral therapy (ART) regimen was 0.89 years greater than for women receiving non-efavirenz-based initial therapy (28.91 vs. 28.02 years). The rate of teratogenic events was 77.26/100,000 exposed women, compared with 72.46/100,000 unexposed women. Survival estimates were sensitive to variations in treatment efficacy and AIDS-related mortality. Estimates of excess teratogenic events were most sensitive to pregnancy rates and number of teratogenic events/100 live births in efavirenz-exposed women. Use of non-efavirenz-based initial ART in HIV-infected women of childbearing age may reduce life expectancy gains from antiretroviral treatment, but may also prevent teratogenic events. Decision-making regarding efavirenz use presents a trade-off between these two risks; this study can inform discussions between patients and health care providers.
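
    The trade-off described here can be restated with back-of-envelope arithmetic using only the figures quoted in this record; the per-excess-event ratio in the last line is a derived illustration, not a number reported by the study.

        # Hedged sketch: restating the reported trade-off per 100,000 women.
        life_exp_efv, life_exp_no_efv = 28.91, 28.02     # years (from the record)
        terato_efv, terato_no_efv = 77.26, 72.46         # events per 100,000 women

        cohort = 100_000
        life_years_gained = (life_exp_efv - life_exp_no_efv) * cohort   # ~89,000
        excess_events = terato_efv - terato_no_efv                      # ~4.8

        print(f"life-years gained per 100,000 women: {life_years_gained:,.0f}")
        print(f"excess teratogenic events per 100,000 women: {excess_events:.1f}")
        # Derived illustration only: life-years gained per excess teratogenic event
        print(f"life-years per excess event: {life_years_gained / excess_events:,.0f}")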

  9. Quantifying the risks and benefits of efavirenz use in HIV-infected women of childbearing age in the United States

    Science.gov (United States)

    Hsu, HE; Rydzak, CE; Cotich, KL; Wang, B; Sax, PE; Losina, E; Freedberg, KA; Goldie, SJ; Lu, Z; Walensky, RP

    2010-01-01

    Objectives We quantified the benefits (life expectancy gains) and harms (efavirenz-related teratogenicity) associated with using efavirenz in HIV-infected women of childbearing age in the United States. Methods We used data from the Women’s Interagency HIV Study in an HIV disease simulation model to estimate life expectancy in women who receive an efavirenz-based initial antiretroviral regimen compared with those who delay efavirenz use and receive a boosted protease inhibitor-based initial regimen. To estimate excess risk of teratogenic events with and without efavirenz exposure per 100,000 women, we incorporated literature-based rates of pregnancy, live births, and teratogenic events into a decision analytic model. We assumed a teratogenicity risk of 2.90 events/100 live births in women exposed to efavirenz during pregnancy and 2.68/100 live births in unexposed women. Results Survival for HIV-infected women who received an efavirenz-based initial antiretroviral therapy regimen was 0.89 years greater than for women receiving non-efavirenz-based initial therapy (28.91 vs. 28.02 years). The rate of teratogenic events was 77.26/100,000 exposed women, compared with 72.46/100,000 unexposed women. Survival estimates were sensitive to variations in treatment efficacy and AIDS-related mortality. Estimates of excess teratogenic events were most sensitive to pregnancy rates and number of teratogenic events/100 live births in efavirenz-exposed women. Conclusions Use of non-efavirenz-based initial antiretroviral therapy in HIV-infected women of childbearing age may reduce life expectancy gains from antiretroviral treatment, but may also prevent teratogenic events. Decision-making regarding efavirenz use presents a tradeoff between these two risks; this study can inform discussions between patients and health care providers. PMID:20561082

  10. Pharmacokinetics of efavirenz and treatment of HIV-1 among pregnant women with and without tuberculosis coinfection.

    Science.gov (United States)

    Dooley, Kelly E; Denti, Paolo; Martinson, Neil; Cohn, Silvia; Mashabela, Fildah; Hoffmann, Jennifer; Haas, David W; Hull, Jennifer; Msandiwa, Regina; Castel, Sandra; Wiesner, Lubbe; Chaisson, Richard E; McIlleron, Helen

    2015-01-15

    Pregnancy and tuberculosis treatment or prophylaxis can affect efavirenz pharmacokinetics, maternal human immunodeficiency virus type 1 (HIV-1) treatment outcomes, and mother-to-child transmission (MTCT) risk. We evaluated a prospective cohort of pregnant, HIV-infected women with and without tuberculosis in Soweto, South Africa. Pharmacokinetic sampling was performed at gestation week 37 and during the postpartum period. Efavirenz trough concentrations (Cmin) were predicted using population pharmacokinetic models. HIV-viral load was measured at delivery for mothers and at 6 weeks of age for infants. Ninety-seven women participated; 44 had tuberculosis. Median efavirenz Cmin during pregnancy was 1.35 µg/mL (interquartile range [IQR], 0.90-2.07 µg/mL; 27% had an efavirenz Cmin of pregnant women with extensive CYP2B6 genotypes had an efavirenz Cmin of HIV-viral load at delivery was more common among pregnant women with tuberculosis, in whom ART was generally initiated later. © The Author 2014. Published by Oxford University Press on behalf of the Infectious Diseases Society of America. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  11. Chemical interactions study of antiretroviral drugs efavirenz and lamivudine concerning the development of stable fixed-dose combination formulations for AIDS treatment

    International Nuclear Information System (INIS)

    Gomes, Elionai C. de L.; Mussel, Wagner N.; Resende, Jarbas M.; Yoshida, Maria I.

    2013-01-01

    Lamivudine and efavirenz are among the drugs most widely used worldwide for the treatment of acquired immune deficiency syndrome (AIDS). Solid state nuclear magnetic resonance (ssNMR), Fourier-transform infrared spectroscopy (FTIR), differential scanning calorimetry (DSC) and thermo-optical analysis (TOA) were used to study possible interactions between these drugs, aiming at the development of a fixed-dose drug combination. DSC and TOA evidenced significant shifts in the melting points of both drugs in the mixture, which may be due to interaction between them. Although the DSC and TOA results indicated incompatibility between the drugs, the FTIR spectra were mostly unmodified due to overlapping peaks. The ssNMR analyses showed significant changes in the chemical shift values of the mixture when compared with the spectra of the pure drugs, especially in the signals relating to the electron-deficient carbon atoms of both drugs. These results confirm the interactions suggested by DSC and TOA, which are probably due to acid-base interactions between electronegative and electron-deficient atoms of both lamivudine and efavirenz. (author)

  12. Chemical interactions study of antiretroviral drugs efavirenz and lamivudine concerning the development of stable fixed-dose combination formulations for AIDS treatment

    Energy Technology Data Exchange (ETDEWEB)

    Gomes, Elionai C. de L.; Mussel, Wagner N.; Resende, Jarbas M.; Yoshida, Maria I., E-mail: mirene@ufmg.br [Universidade Federal de Minas Gerais (UFMG), Belo Horizonte, MG (Brazil). Instituto de Ciencias Exatas. Departamento de Quimica; Fialho, Silvia L.; Barbosa, Jamile; Fialho, Silvia L. [Fundacao Ezequiel Dias, Belo Horizonte, MG (Brazil)

    2013-04-15

    Lamivudine and efavirenz are among the drugs most widely used worldwide for the treatment of acquired immune deficiency syndrome (AIDS). Solid state nuclear magnetic resonance (ssNMR), Fourier-transform infrared spectroscopy (FTIR), differential scanning calorimetry (DSC) and thermo-optical analysis (TOA) were used to study possible interactions between these drugs, aiming at the development of a fixed-dose drug combination. DSC and TOA evidenced significant shifts in the melting points of both drugs in the mixture, which may be due to interaction between them. Although the DSC and TOA results indicated incompatibility between the drugs, the FTIR spectra were mostly unmodified due to overlapping peaks. The ssNMR analyses showed significant changes in the chemical shift values of the mixture when compared with the spectra of the pure drugs, especially in the signals relating to the electron-deficient carbon atoms of both drugs. These results confirm the interactions suggested by DSC and TOA, which are probably due to acid-base interactions between electronegative and electron-deficient atoms of both lamivudine and efavirenz. (author)

  13. Effects to chronic administration of Efavirenz on the body and brain ...

    African Journals Online (AJOL)

    Rats of both sexes (n=16), with an average weight of 200 g, were randomly assigned into treatment (n=8) and control (n=8) groups. The rats in the treatment group received 8.57 mg/kg body weight of Efavirenz dissolved in distilled water daily for 30 days through an orogastric tube. The control group received ...

  14. Long term adverse drug reaction to Efavirenz in a HIV infected ...

    African Journals Online (AJOL)

    There is only one published case of a serious adverse reaction to Efavirenz in an adolescent after long-term use. We report the case of a male HIV-positive Nigerian patient aged 13 years. He presented with a five-day history of difficulty sleeping, abnormal dreams, inability to concentrate, restlessness, irrational behavior and long-term ...

  15. Race/Ethnicity and the Pharmacogenetics of Reported Suicidality With Efavirenz Among Clinical Trials Participants.

    Science.gov (United States)

    Mollan, Katie R; Tierney, Camlin; Hellwege, Jacklyn N; Eron, Joseph J; Hudgens, Michael G; Gulick, Roy M; Haubrich, Richard; Sax, Paul E; Campbell, Thomas B; Daar, Eric S; Robertson, Kevin R; Ventura, Diana; Ma, Qing; Edwards, Digna R Velez; Haas, David W

    2017-09-01

    We examined associations between suicidality and genotypes that predict plasma efavirenz exposure among AIDS Clinical Trials Group study participants in the United States. Four clinical trials randomly assigned treatment-naive participants to efavirenz-containing regimens; suicidality was defined as reported suicidal ideation or attempted or completed suicide. Genotypes that predict plasma efavirenz exposure were defined by CYP2B6 and CYP2A6 polymorphisms. Associations were evaluated with weighted Cox proportional hazards models stratified by race/ethnicity. Additional analyses adjusted for genetic ancestry and selected covariates. Among 1833 participants, suicidality was documented in 41 participants in exposed analyses and 34 in on-treatment analyses. In unadjusted analyses based on 12 genotype levels, suicidality increased per level in exposed (hazard ratio, 1.11; 95% confidence interval, 0.96-1.27) and on-treatment (hazard ratio, 1.16; 95% confidence interval, 1.01-1.34) analyses. In the on-treatment analysis, the association was strongest among white participants but nearly null among black participants. Considering 3 metabolizer levels (extensive, intermediate and slow), slow metabolizers were at increased risk. Results were similar after baseline covariate adjustment for genetic ancestry, sex, age, weight, injection drug use history, and psychiatric history or recent psychoactive medication. Genotypes that predict higher plasma efavirenz exposure were associated with increased risk of suicidality. The strength of the association varied by race/ethnicity. © The Author 2017. Published by Oxford University Press for the Infectious Diseases Society of America.
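
    The analysis described (a weighted Cox proportional hazards model of time to suicidality, stratified by race/ethnicity, with genotype level as an ordinal covariate) can be sketched as follows; the data frame, column names and weights are entirely hypothetical, and the small penalizer is added only to keep the toy example numerically stable.

        # Hedged sketch: weighted, stratified Cox model of time to suicidality.
        # All values below are hypothetical; column names are placeholders.
        import pandas as pd
        from lifelines import CoxPHFitter

        df = pd.DataFrame({
            "weeks_on_study": [48, 96, 24, 72, 96, 12, 60, 96, 36, 84],
            "suicidality":    [0, 0, 1, 0, 0, 1, 0, 0, 1, 0],    # 1 = event
            "genotype_level": [3, 1, 11, 5, 2, 6, 7, 4, 9, 8],   # 1 (fast) .. 12 (slow)
            "race_ethnicity": ["white", "black", "white", "white", "black",
                               "black", "white", "black", "white", "black"],
            "ipw_weight":     [1.0, 1.2, 0.8, 1.1, 1.0, 0.9, 1.0, 1.3, 1.1, 0.9],
        })

        cph = CoxPHFitter(penalizer=0.1)   # small penalty keeps the toy fit stable
        cph.fit(df, duration_col="weeks_on_study", event_col="suicidality",
                strata=["race_ethnicity"], weights_col="ipw_weight", robust=True)
        cph.print_summary()   # hazard ratio per genotype level = exp(coef)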

  16. Enhancement of Solubility, Dissolution rate and Bioavailability of Efavirenz by Cyclodextrins and Solutol HS15 - A Factorial Study

    OpenAIRE

    R. Yogananda; K. P. R. Chowdary

    2013-01-01

    Efavirenz, a widely prescribed antiretroviral drug, belongs to Class II of the BCS and exhibits low and variable oral bioavailability due to its poor aqueous solubility; it therefore requires enhancement of its solubility and dissolution rate to increase its oral bioavailability. The objective of the present investigation was to enhance the solubility, dissolution rate and bioavailability of efavirenz by the use of cyclodextrins (βCD and HPβCD) and the surfactant Solutol HS15. The individual main effects and combin...

  17. Can voluntary pooled procurement reduce the price of antiretroviral drugs? a case study of Efavirenz.

    Science.gov (United States)

    Kim, Sung Wook; Skordis-Worrall, Jolene

    2017-05-01

    A number of strategies have aimed to assist countries in procuring antiretroviral therapy (ARV) at lower prices. In 2009 the Global Fund to Fight AIDS, Tuberculosis and Malaria (GFATM) commenced a voluntary pooled procurement (VPP) scheme; however, the impact of the scheme on ARV prices remains uncertain. This study aims to estimate the effect of VPP on drug prices using Efavirenz as a case study. The analysis uses WHO Global Price Reporting Mechanism (GPRM) data from 2004 to 2013. Because of the highly skewed distribution of drug prices, a generalized linear model (GLM) was used to conduct a difference-in-difference estimation of drug price changes over time. These analyses found that voluntary pooled procurement reduced the ex-works price of generic Efavirenz by 16.2% and the incoterms price by 19.1% (P < 0.001 in both cases). The year dummies were also statistically significant from 2006 to 2013 (P < 0.001), indicating a strong decreasing trend in the price of Efavirenz over that period. Voluntary pooled procurement significantly reduced the price of 600 mg generic Efavirenz between 2009 and 2013 and therefore offers a potentially effective strategy for reducing HIV drug prices and improving technical efficiency in HIV programming. Further work is required to establish whether these findings also hold for other drugs. © The Author 2017. Published by Oxford University Press in association with The London School of Hygiene and Tropical Medicine. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com
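
    A minimal sketch of the kind of estimation described, a difference-in-difference on skewed prices via a log-link Gamma GLM, is given below; the data frame and column names are hypothetical stand-ins for a GPRM extract, not the study's actual dataset or specification.

        # Hedged sketch: difference-in-difference with a log-link Gamma GLM.
        # The data frame is a hypothetical stand-in for a GPRM price extract.
        import pandas as pd
        import statsmodels.api as sm
        import statsmodels.formula.api as smf

        prices = pd.DataFrame({
            "unit_price": [0.30, 0.28, 0.26, 0.22, 0.31, 0.27, 0.24, 0.19],  # USD
            "vpp":        [0, 0, 0, 0, 1, 1, 1, 1],   # 1 = bought through VPP
            "post":       [0, 0, 1, 1, 0, 0, 1, 1],   # 1 = 2009 or later
        })

        did = smf.glm("unit_price ~ vpp * post", data=prices,
                      family=sm.families.Gamma(link=sm.families.links.Log())).fit()
        # With a log link, the vpp:post coefficient is a relative price change:
        # exp(coef) - 1 (the study reports roughly -16% for ex-works prices).
        print(did.summary())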

  18. Acute Liver Failure among Patients on Efavirenz-Based Antiretroviral Therapy

    Directory of Open Access Journals (Sweden)

    Innocent Lule Segamwenge

    2018-01-01

    Full Text Available Objectives. To describe the clinical characteristics of patients presenting with fulminant liver failure after varying periods of exposure to Efavirenz-containing antiretroviral medications. Methods. We report a series of 4 patients with human immunodeficiency virus (HIV) infection who were admitted with acute liver failure (ALF) over a 6-month period. All of these patients had been treated with a range of Efavirenz-containing antiretroviral regimens and were negative for hepatitis A, B, and C infections as well as other opportunistic infections; all were negative for autoimmune hepatitis, and none had evidence of chronic liver disease or use of alcohol or herbal medications. Information on patient clinical characteristics, current antiretroviral regimen, CD4 count, HIV-1 RNA levels, and clinical chemistry parameters was collected. Informed consent was provided. Results. During a 6-month period, four patients without other known risk factors for acute hepatitis presented with symptomatic drug-induced liver injury, with varying symptoms and outcomes. The pattern of liver injury was hepatocellular in all 4 cases. Liver biopsies were done for all four cases and showed a heavy mixed inflammatory cell infiltrate with eosinophils. For three patients, withdrawal of Efavirenz from their antiretroviral regimen was sufficient to restore transaminase levels to normal and led to improvement of clinical symptoms. For one patient, the clinical course was characterized by fulminant liver failure and fluctuating episodes of hepatic encephalopathy, which ultimately resulted in his death. Conclusion. Hepatotoxicity of Efavirenz is not as rare as previously described in the literature and can indeed have fatal outcomes. The key message is that liver enzymes should be monitored frequently at initiation of antiretroviral therapy and throughout the treatment period.

  19. Efavirenz Has the Highest Anti-Proliferative Effect of Non-Nucleoside Reverse Transcriptase Inhibitors against Pancreatic Cancer Cells.

    Directory of Open Access Journals (Sweden)

    Markus Hecht

    Full Text Available Cancer prevention and therapy in HIV-1-infected patients will play an important role in the future. The non-nucleoside reverse transcriptase inhibitors (NNRTIs) Efavirenz and Nevirapine are cytotoxic against cancer cells in vitro. As other NNRTIs have not been studied so far, all clinically used NNRTIs were tested and the in vitro toxic concentrations were compared to drug levels in patients to predict possible anti-cancer effects in vivo. Cytotoxicity was studied by Annexin-V-APC/7AAD staining and flow cytometry in the pancreatic cancer cell lines BxPC-3 and Panc-1 and confirmed by colony formation assays. The 50% effective cytotoxic concentrations (EC50) were calculated and compared to the blood levels in our patients and published data. The in vitro EC50 values of the different drugs in BxPC-3 pancreatic cancer cells were: Efavirenz 31.5 μmol/l (= 9944 ng/ml), Nevirapine 239 μmol/l (= 63,786 ng/ml), Etravirine 89.0 μmol/l (= 38,740 ng/ml), Lersivirine 543 μmol/l (= 168,523 ng/ml), Delavirdine 171 μmol/l (= 78,072 ng/ml), Rilpivirine 24.4 μmol/l (= 8941 ng/ml). As Efavirenz and Rilpivirine had the highest cytotoxic potential and Nevirapine is frequently used in HIV-1-positive patients, the results for these three drugs were further studied in Panc-1 pancreatic cancer cells and confirmed with colony formation assays. 205 patient blood levels of Efavirenz, 127 of Rilpivirine and 31 of Nevirapine were analyzed. The mean blood level of Efavirenz was 3587 ng/ml (range 162-15,363 ng/ml), of Rilpivirine 144 ng/ml (range 0-572 ng/ml) and of Nevirapine 4955 ng/ml (range 1856-8697 ng/ml). Blood levels from our patients and from published data reached the in vitro toxic EC50 of Efavirenz in about 1 to 5% of all patients. All studied NNRTIs were toxic against cancer cells. A low percentage of patients taking Efavirenz reached in vitro cytotoxic blood levels. It can be speculated that in HIV-1-positive patients having high Efavirenz blood levels pancreatic
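
    The comparison between in vitro EC50 values and patient blood levels rests on a unit conversion that is easy to reproduce; the sketch below converts the Efavirenz EC50 from μmol/l to ng/ml using its molar mass (315.67 g/mol) and checks what fraction of a set of blood levels reaches it. The list of blood levels is hypothetical; only the EC50 and the molar-mass-based conversion follow the record.

        # Hedged sketch: umol/L -> ng/mL conversion and a coverage check.
        # The molar mass (315.67 g/mol) and 31.5 umol/l EC50 follow the record;
        # the list of blood levels is hypothetical.
        import numpy as np

        MW_EFAVIRENZ = 315.67                        # g/mol
        ec50_ng_per_ml = 31.5 * MW_EFAVIRENZ         # ~9944 ng/ml, as quoted

        blood_levels_ng_per_ml = np.array([1620, 2480, 3587, 5100, 7950, 10500, 15363])
        frac_cytotoxic = np.mean(blood_levels_ng_per_ml >= ec50_ng_per_ml)

        print(f"EC50 = {ec50_ng_per_ml:.0f} ng/ml")
        print(f"{frac_cytotoxic:.0%} of these (hypothetical) levels reach the EC50")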

  20. Ruin probabilities

    DEFF Research Database (Denmark)

    Asmussen, Søren; Albrecher, Hansjörg

    The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle and the connection to other applied probability areas, like queueing theory. In this substantially updated and extended second version, new topics include stochastic control, fluctuation theory for Levy processes, Gerber–Shiu functions and dependence.
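
    For orientation, the standard setup behind several of these topics can be written down compactly; the following is a minimal sketch of the classical Cramér-Lundberg model and Lundberg's inequality in common textbook notation (not necessarily the book's own).

        % Surplus process: initial reserve u, premium rate c, Poisson(lambda)
        % claim arrivals N(t), i.i.d. claim sizes X_i with m.g.f. M_X.
        U(t) = u + ct - \sum_{i=1}^{N(t)} X_i,
        \qquad
        \psi(u) = \Pr\Bigl(\inf_{t \ge 0} U(t) < 0\Bigr).

        % Under the net profit condition c > \lambda\,\mathbb{E}[X], if the
        % adjustment coefficient \gamma > 0 solves
        \lambda\bigl(M_X(\gamma) - 1\bigr) = c\gamma,
        % then Lundberg's inequality bounds the ruin probability:
        \psi(u) \le e^{-\gamma u}, \qquad u \ge 0.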

  1. Generalized Probability-Probability Plots

    NARCIS (Netherlands)

    Mushkudiani, N.A.; Einmahl, J.H.J.

    2004-01-01

    We introduce generalized Probability-Probability (P-P) plots in order to study the one-sample goodness-of-fit problem and the two-sample problem, for real-valued data. These plots, which are constructed by indexing with the class of closed intervals, globally preserve the properties of classical P-P plots
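
    The classical construction these generalized plots build on is easy to state in code; the sketch below draws an ordinary one-sample P-P plot for a hypothesized standard normal distribution F0. The sample, the plotting positions (i - 0.5)/n and the choice of F0 are illustrative assumptions, not taken from the paper.

        # Hedged sketch: a classical one-sample P-P plot against a standard
        # normal F0; sample, plotting positions and F0 are illustrative choices.
        import numpy as np
        import matplotlib.pyplot as plt
        from scipy.stats import norm

        rng = np.random.default_rng(0)
        x = np.sort(rng.normal(size=200))            # sample (order statistics)
        n = len(x)

        theoretical = norm.cdf(x)                    # F0 evaluated at x_(i)
        empirical = (np.arange(1, n + 1) - 0.5) / n  # plotting positions

        plt.plot(theoretical, empirical, ".", markersize=3)
        plt.plot([0, 1], [0, 1], "k--", linewidth=1) # reference diagonal
        plt.xlabel("F0(x_(i))")
        plt.ylabel("empirical probability")
        plt.title("Classical one-sample P-P plot (sketch)")
        plt.show()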

  2. Lipid-based nutrient supplements do not affect efavirenz but lower plasma nevirapine concentrations in Ethiopian adult HIV patients

    DEFF Research Database (Denmark)

    Abdissa, A; Olsen, Mette Frahm; Yilma, D

    2015-01-01

    OBJECTIVES: Lipid-based nutrient supplements (LNSs) are increasingly used in HIV programmes in resource-limited settings. However, the possible effects of LNSs on the plasma concentrations of antiretroviral drugs have not been assessed. Here, we aimed to assess the effects of LNSs on plasma efavirenz and nevirapine trough concentrations in Ethiopian adult HIV-infected patients. METHODS: The effects of LNSs were studied in adults initiating antiretroviral therapy (ART) in a randomized trial. Patients with body mass index (BMI) > 17 kg/m(2) (n = 282) received daily supplementation of an LNS ... .9; -0.9 μg/mL; P = 0.01), respectively, compared with the group not receiving supplements. There were no differences between groups with respect to efavirenz plasma concentrations. The CYP2B6 516 G>T polymorphism was associated with a 5 μg/mL higher plasma efavirenz concentration compared with the wild type.

  3. Probability-1

    CERN Document Server

    Shiryaev, Albert N

    2016-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.

  4. Ignition Probability

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — USFS, State Forestry, BLM, and DOI fire occurrence point locations from 1987 to 2008 were combined and converted into a fire occurrence probability or density grid...

  5. A sensitive and selective liquid chromatography/tandem mass spectrometry method for quantitative analysis of efavirenz in human plasma.

    Directory of Open Access Journals (Sweden)

    Praveen Srivastava

    Full Text Available A selective and highly sensitive method for the determination of the non-nucleoside reverse transcriptase inhibitor (NNRTI) efavirenz in human plasma has been developed and fully validated based on high-performance liquid chromatography tandem mass spectrometry (LC-MS/MS). Sample preparation involved protein precipitation followed by one-to-one dilution with water. The analyte, efavirenz, was separated by high-performance liquid chromatography and detected by tandem mass spectrometry in negative ionization mode with multiple reaction monitoring. Efavirenz and ¹³C₆-efavirenz (internal standard) were detected via the MRM transitions m/z 314.20 → 243.90 and m/z 320.20 → 249.90, respectively. A gradient program was used to elute the analytes, with 0.1% formic acid in water and 0.1% formic acid in acetonitrile as mobile phase solvents at a flow rate of 0.3 mL/min. The total run time was 5 min, and the retention times for the internal standard (¹³C₆-efavirenz) and efavirenz were approximately 2.6 min. The calibration curves showed linearity (coefficient of regression, r > 0.99) over the concentration range of 1.0-2,500 ng/mL. The intra-day precision, based on the standard deviation of replicates, was 9.24% at the lower limit of quantification (LLOQ) and ranged from 2.41% to 6.42% for quality control (QC) samples, with accuracy of 112% for the LLOQ and 100-111% for QC samples. The inter-day precision was 12.3% for the LLOQ and 3.03-9.18% for QC samples, and the accuracy was 108% for the LLOQ and 95.2-108% for QC samples. Stability studies showed that efavirenz was stable under the expected conditions for sample preparation and storage. The lower limit of quantification for efavirenz was 1 ng/mL. The analytical method showed excellent sensitivity, precision, and accuracy. This method is robust and is being successfully applied to therapeutic drug monitoring and pharmacokinetic studies in HIV-infected patients.
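
    For reference, intra-day precision (%CV) and accuracy (% of nominal) in a validation like this one are usually computed from QC replicates as sketched below; the replicate values and nominal concentrations are hypothetical, not the study's data.

        # Hedged sketch: intra-day precision (%CV) and accuracy (% of nominal)
        # from QC replicates; the replicate values below are hypothetical.
        import numpy as np

        qc_runs = {            # nominal ng/mL -> replicate measurements
            1.0:    np.array([1.12, 1.05, 1.18, 1.09, 1.16]),   # LLOQ level
            75.0:   np.array([73.9, 77.2, 74.8, 76.1, 75.5]),
            2000.0: np.array([1950.0, 2105.0, 1998.0, 2040.0, 2012.0]),
        }

        for nominal, reps in qc_runs.items():
            cv = reps.std(ddof=1) / reps.mean() * 100      # precision, %CV
            accuracy = reps.mean() / nominal * 100         # accuracy, % of nominal
            print(f"QC {nominal:g} ng/mL: precision {cv:.1f}%, accuracy {accuracy:.1f}%")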

  6. Lipid profiles for etravirine versus efavirenz in treatment-naive patients in the randomized, double-blind SENSE trial

    DEFF Research Database (Denmark)

    Fätkenheuer, G; Duvivier, C; Rieger, A

    2012-01-01

    Etravirine is approved for use in treatment-experienced patients at a dose of 200 mg twice daily. Efavirenz has been associated with greater increases in serum lipids compared with other non-nucleosides in randomized trials of first-line treatment.

  7. Pharmacogenetic associations with plasma efavirenz concentrations and clinical correlates in a retrospective cohort of Ghanaian HIV-infected patients.

    Science.gov (United States)

    Sarfo, Fred S; Zhang, Yuan; Egan, Deirdre; Tetteh, Lambert A; Phillips, Richard; Bedu-Addo, George; Sarfo, Maame Anima; Khoo, Saye; Owen, Andrew; Chadwick, David R

    2014-02-01

    Efavirenz is widely used in first-line antiretroviral therapy in sub-Saharan Africa. However, exposure to efavirenz shows marked interindividual variability that is genetically mediated with potential for important pharmacodynamic consequences. The aims of this study were to assess the frequencies of CYP2B6, CYP2A6, UGT2B7 and CAR single nucleotide polymorphisms (SNPs) and their impact on plasma efavirenz concentration and clinical/immunological responses in Ghanaian patients. Genomic DNA from 800 HIV-infected patients was genotyped for selected SNPs by real-time PCR-based allelic discrimination. Mid-dose plasma efavirenz concentrations were measured for 521 patients using HPLC with UV detection. Clinical outcomes in 299 patients on efavirenz were retrospectively assessed. Univariate and multivariate linear regression were performed using best subset selection. Time-to-event outcomes were analysed using a Cox proportional hazards regression model. The variant allele frequencies for CYP2B6 516G>T (rs3745274), CYP2B6 983T>C (rs28399499), CYP2A6 -48T>G (CYP2B6*9B; rs28399433), UGT2B7 802C>T (UGT2B7*2; rs7439366), UGT2B7 735A>G (UGT2B7*1c; rs28365062) and CAR 540C>T (rs2307424) were 48%, 4%, 3%, 23%, 15% and 7%, respectively. CYP2B6 516G>T, CYP2B6 983T>C and CYP2A6 -48T>G were associated with significantly elevated efavirenz concentrations. A trend towards association between plasma efavirenz concentration and CAR 540C>T was observed. CYP2B6 516G homozygosity was associated with immunological failure [adjusted hazards ratio compared with T homozygosity, 1.70 (1.04-2.76); P = 0.03]. CYP2B6 and CYP2A6 SNPs were associated with higher plasma efavirenz concentrations due to reduction in major and minor phase I routes of elimination, respectively. Further prospective studies are needed to validate the pharmacodynamic correlates of these polymorphisms in this population.

  8. Formulation and development of bicontinuous nanostructured liquid crystalline particles of efavirenz.

    Science.gov (United States)

    Avachat, Amelia M; Parpani, Shreekrishna S

    2015-02-01

    Efavirenz is a lipophilic non-nucleoside reverse transcriptase inhibitor used in the first-line pediatric therapeutic cocktail. Due to its high lipophilicity (logP = 5.4) and poor aqueous solubility (intrinsic water solubility = 8.3 μg/mL), efavirenz has low bioavailability. A 30 mg/mL solution in a medium-chain triglyceride vehicle is the only pediatric formulation available, with an oral bioavailability 20% lower than the solid form. The current work was aimed at formulating and characterizing liquid crystal nanoparticles for oral delivery of efavirenz to improve oral bioavailability, provide sustained release, and minimize side effects and drug resistance. Formulation of cubosomes was done by two methods: sonication and spray drying. Sonication gave the highest entrapment efficiency and the smallest particle size. Further, monoolein was substituted with phytantriol, as monoolein is degraded in the presence of lipase when administered orally, with consequent loss of liquid crystalline structure. It was confirmed that there was no difference in particle size, entrapment efficiency and nature of product formed by using monoolein or phytantriol. The best formulation was found to be F9, having a particle size of 104.19 ± 0.21 nm and an entrapment efficiency of 91.40 ± 0.10%. In vitro release at the end of 12 h was found to be 56.45% and the zeta potential to be -23.14 mV, which stabilized the cubic phase dispersions. The formulation was further characterized by TEM, small angle X-ray scattering (SAXS), DSC and stability studies. SAXS revealed a Pn3m space group, indicating a diamond cubic phase, which was further confirmed by TEM. The pharmacokinetics of EFV were studied in male Wistar rats. EFV-loaded cubosome dispersions exhibited 1.93- and 1.62-fold increases in peak plasma concentration (Cmax) and 1.48- and 1.42-fold increases in AUC in comparison to those of a suspension prepared with the contents of EFV capsules suspended in 1.5% carboxymethylcellulose PBS solution (pH 5.0), and an EFV solution in medium

  9. Quantum Probabilities as Behavioral Probabilities

    Directory of Open Access Journals (Sweden)

    Vyacheslav I. Yukalov

    2017-03-01

    Full Text Available We demonstrate that behavioral probabilities of human decision makers share many common features with quantum probabilities. This does not imply that humans are some quantum objects, but just shows that the mathematics of quantum theory is applicable to the description of human decision making. The applicability of quantum rules for describing decision making is connected with the nontrivial process of making decisions in the case of composite prospects under uncertainty. Such a process involves deliberations of a decision maker when making a choice. In addition to the evaluation of the utilities of considered prospects, real decision makers also appreciate their respective attractiveness. Therefore, human choice is not based solely on the utility of prospects, but includes the necessity of resolving the utility-attraction duality. In order to justify that human consciousness really functions similarly to the rules of quantum theory, we develop an approach defining human behavioral probabilities as the probabilities determined by quantum rules. We show that quantum behavioral probabilities of humans do not merely explain qualitatively how human decisions are made, but they predict quantitative values of the behavioral probabilities. Analyzing a large set of empirical data, we find good quantitative agreement between theoretical predictions and observed experimental data.
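
    In the quantum decision theory behind this paper, the behavioral probability of a prospect is split into a utility factor and an attraction factor; a schematic form of that decomposition (a simplified rendering, not a full summary of the formalism) is

        \[
        p(\pi_n) = f(\pi_n) + q(\pi_n), \qquad
        \sum_n f(\pi_n) = 1, \qquad \sum_n q(\pi_n) = 0, \qquad 0 \le p(\pi_n) \le 1,
        \]

    where f is the utility factor and q the attraction factor capturing the "attractiveness" mentioned in the abstract; quantitative predictions in this line of work typically rely on a non-informative estimate of |q| of about 1/4.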

  10. Risk Probabilities

    DEFF Research Database (Denmark)

    Rojas-Nandayapa, Leonardo

    Tail probabilities of sums of heavy-tailed random variables are of a major importance in various branches of Applied Probability, such as Risk Theory, Queueing Theory, Financial Management, and are subject to intense research nowadays. To understand their relevance one just needs to think...... analytic expression for the distribution function of a sum of random variables. The presence of heavy-tailed random variables complicates the problem even more. The objective of this dissertation is to provide better approximations by means of sharp asymptotic expressions and Monte Carlo estimators...
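
    As a toy illustration of why such tail probabilities are difficult, a crude Monte Carlo estimate of P(S_n > x) for a sum of heavy-tailed (lognormal) variables requires a very large sample before its relative error becomes acceptable, which is what motivates the sharp asymptotics and specialised estimators mentioned above. A minimal sketch with illustrative parameters (not taken from the dissertation):

        import numpy as np

        rng = np.random.default_rng(0)

        # Crude Monte Carlo for P(S_n > x), S_n a sum of n i.i.d. lognormal
        # (heavy-tailed) random variables; all parameters are illustrative.
        n, x, num_sims = 10, 150.0, 1_000_000
        sums = rng.lognormal(mean=1.0, sigma=1.5, size=(num_sims, n)).sum(axis=1)
        hits = sums > x

        p_hat = hits.mean()
        rel_err = hits.std(ddof=1) / np.sqrt(num_sims) / p_hat  # relative std. error
        print(f"estimated tail probability {p_hat:.3e} (relative error ~ {rel_err:.2%})")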

  11. Probability tales

    CERN Document Server

    Grinstead, Charles M; Snell, J Laurie

    2011-01-01

    This book explores four real-world topics through the lens of probability theory. It can be used to supplement a standard text in probability or statistics. Most elementary textbooks present the basic theory and then illustrate the ideas with some neatly packaged examples. Here the authors assume that the reader has seen, or is learning, the basic theory from another book and concentrate in some depth on the following topics: streaks, the stock market, lotteries, and fingerprints. This extended format allows the authors to present multiple approaches to problems and to pursue promising side discussions in ways that would not be possible in a book constrained to cover a fixed set of topics. To keep the main narrative accessible, the authors have placed the more technical mathematical details in appendices. The appendices can be understood by someone who has taken one or two semesters of calculus.

  12. Probability theory

    CERN Document Server

    Dorogovtsev, A Ya; Skorokhod, A V; Silvestrov, D S; Skorokhod, A V

    1997-01-01

    This book of problems is intended for students in pure and applied mathematics. There are problems in traditional areas of probability theory and problems in the theory of stochastic processes, which has wide applications in the theory of automatic control, queuing and reliability theories, and in many other modern science and engineering fields. Answers to most of the problems are given, and the book provides hints and solutions for more complicated problems.

  13. Comparison of genotypic resistance profiles and virological response between patients starting nevirapine and efavirenz in EuroSIDA

    DEFF Research Database (Denmark)

    Bannister, Wendy P; Ruiz, Lidia; Cozzi-Lepri, Alessandro

    2008-01-01

    OBJECTIVE: To compare virological outcome and genotypic resistance profiles in HIV-1-infected patients starting non-nucleoside reverse transcriptase inhibitor (NNRTI)-containing regimens. METHODS: NNRTI-naive patients were included who started treatment with nevirapine (NVP) or efavirenz (EFV) wi...

  14. The effect of efavirenz versus nevirapine-containing regimens on immunologic, virologic and clinical outcomes in a prospective observational study

    NARCIS (Netherlands)

    HIV-CAUSAL Collaboration; Koopmans, P.P.; Brouwer, A.M.; Dofferhoff, A.S.M.; Flier, M. van der; Groot, R. de; Hofstede, H.J.M. ter; Keuter, M.; Ven, A.J.A.M. van der; et al.

    2012-01-01

    OBJECTIVE: To compare regimens consisting of either efavirenz or nevirapine and two or more nucleoside reverse transcriptase inhibitors (NRTIs) among HIV-infected, antiretroviral-naive, and AIDS-free individuals with respect to clinical, immunologic, and virologic outcomes. DESIGN: Prospective

  15. Effect of Food on the Steady-State Pharmacokinetics of Tenofovir and Emtricitabine plus Efavirenz in Ugandan Adults

    Directory of Open Access Journals (Sweden)

    Mohammed Lamorde

    2012-01-01

    Full Text Available We investigated the effect of food on the steady-state pharmacokinetics of a proprietary fixed-dose combination (FDC) tablet containing tenofovir disoproxil fumarate (TDF)/emtricitabine/efavirenz. Fifteen Ugandan HIV-1 patients at steady-state dosing with TDF/emtricitabine/efavirenz were admitted for 24-hour intensive pharmacokinetic sampling after dosing in the fasting state. Blood sampling was repeated seven days later with TDF/emtricitabine/efavirenz administered with food (19 g fat). Drug concentrations in plasma were determined by liquid chromatography and tandem mass spectrometry. Geometric mean ratios (GMRs) and confidence intervals (CIs) of parameters were calculated (reference, fasting). For efavirenz, GMRs (90% CIs) for Cmax, AUC0-24, and C24 were 1.47 (1.24–1.75), 1.13 (1.03–1.23), and 1.01 (0.91–1.11), respectively. Corresponding GMRs were 1.04 (0.84–1.27), 1.19 (1.10–1.29), and 0.99 (0.82–1.19) for tenofovir, and 0.83 (0.76–0.92), 0.87 (0.78–0.97), and 0.91 (0.73–1.14) for emtricitabine. Stable patients may take the FDC without meal restrictions. The FDC should be taken without food by patients experiencing central nervous system toxicities.
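
    The geometric mean ratios and 90% CIs above are the standard bioequivalence-style computation: analyse the log-transformed exposures of the paired (fed vs. fasted) design and exponentiate the mean difference and its confidence limits. A minimal sketch with made-up paired AUC values, not the study data:

        import numpy as np
        from scipy import stats

        # Hypothetical paired AUC0-24 values (fed vs fasted) for six subjects;
        # illustrative numbers only.
        auc_fasted = np.array([52.1, 61.3, 48.7, 70.2, 55.9, 63.4])
        auc_fed    = np.array([58.8, 66.0, 55.1, 77.9, 60.2, 74.5])

        diff = np.log(auc_fed) - np.log(auc_fasted)      # paired log differences
        mean, sem = diff.mean(), stats.sem(diff)
        t90 = stats.t.ppf(0.95, df=len(diff) - 1)        # two-sided 90% CI

        gmr = np.exp(mean)
        lo, hi = np.exp(mean - t90 * sem), np.exp(mean + t90 * sem)
        print(f"GMR (fed/fasted) = {gmr:.2f}, 90% CI ({lo:.2f}, {hi:.2f})")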

  16. Development and validation of reversed-phase HPLC gradient method for the estimation of efavirenz in plasma.

    Directory of Open Access Journals (Sweden)

    Shweta Gupta

    Full Text Available Efavirenz is an anti-viral agent of the non-nucleoside reverse transcriptase inhibitor category used as a part of highly active antiretroviral therapy for the treatment of infections with human immunodeficiency virus type-1. A simple, sensitive and rapid reversed-phase high performance liquid chromatographic gradient method was developed and validated for the determination of efavirenz in plasma. The method was developed with high performance liquid chromatography using a Waters X-Terra Shield RP18 50 x 4.6 mm, 3.5 μm column and a mobile phase consisting of phosphate buffer pH 3.5 and acetonitrile. The eluate was monitored with a UV-Visible detector at 260 nm with a flow rate of 1.5 mL/min. Tenofovir disoproxil fumarate was used as internal standard. The method was validated for linearity, precision, accuracy, specificity and robustness, and the data obtained were statistically analyzed. The calibration curve was found to be linear over the concentration range of 1-300 μg/mL. The retention times of efavirenz and tenofovir disoproxil fumarate (internal standard) were 5.941 min and 4.356 min, respectively. The regression coefficient value was found to be 0.999. The limit of detection and the limit of quantification obtained were 0.03 and 0.1 μg/mL, respectively. The developed HPLC method can be useful for quantitative pharmacokinetic parameter determination of efavirenz in plasma.
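
    Linearity, LOD and LOQ figures such as those reported above are commonly derived from a least-squares calibration line, with LOD ≈ 3.3·σ/S and LOQ ≈ 10·σ/S, where σ is the residual standard deviation and S the slope (the ICH-style formulas; whether this exact approach was used here is an assumption). A minimal sketch with hypothetical calibration data:

        import numpy as np

        # Hypothetical calibration standards (ug/mL) vs peak-area ratios;
        # not the published data set.
        conc  = np.array([1, 5, 25, 50, 100, 200, 300], dtype=float)
        ratio = np.array([0.021, 0.105, 0.519, 1.04, 2.07, 4.15, 6.21])

        slope, intercept = np.polyfit(conc, ratio, 1)
        pred = slope * conc + intercept
        resid_sd = np.sqrt(np.sum((ratio - pred) ** 2) / (len(conc) - 2))
        r2 = 1 - np.sum((ratio - pred) ** 2) / np.sum((ratio - ratio.mean()) ** 2)

        lod = 3.3 * resid_sd / slope    # ICH-style limit of detection
        loq = 10.0 * resid_sd / slope   # ICH-style limit of quantification
        print(f"r^2 = {r2:.4f}, LOD = {lod:.2f} ug/mL, LOQ = {loq:.2f} ug/mL")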

  17. Efficacy and Safety of Lopinavir/ritonavir- versus Efavirenz-based Antiretroviral Therapy in HIV-Infected Pregnant Ugandan Women

    Science.gov (United States)

    COHAN, Deborah; NATUREEBA, Paul; KOSS, Catherine A.; PLENTY, Albert; LUWEDDE, Flavia; MWESIGWA, Julia; ADES, Veronica; CHARLEBOIS, Edwin D.; GANDHI, Monica; CLARK, Tamara D.; NZARUBARA, Bridget; ACHAN, Jane; RUEL, Theodore; KAMYA, Moses R.; HAVLIR, Diane V.

    2015-01-01

    Objective Combination antiretroviral therapy (ART) is now the global standard for HIV-infected pregnant and breastfeeding women at all CD4 cell counts. We compared the efficacy and safety of an efavirenz versus lopinavir/ritonavir regimen for HIV-infected pregnant women initiating ART in rural Uganda. Design Randomized clinical trial. Methods We performed a planned secondary analysis comparing viral load suppression (HIV-1 RNA ≤400 copies/ml), safety, and HIV transmission to infants in a trial designed to test the hypothesis that lopinavir/ritonavir- versus efavirenz-based ART would reduce placental malaria (PROMOTE, ClinicalTrials.gov, NCT00993031). HIV-infected, ART-naïve pregnant women at 12–28 weeks gestation and any CD4 cell count were randomized. ART was provided and participants were counseled to breastfeed for one year postpartum. Results The median age of the 389 study participants was 29 years; median CD4 cell count was 370 cells/mm3. At delivery, virologic suppression was 97.6% in the efavirenz arm and 86.0% in the lopinavir/ritonavir arm, a statistically significant difference. Two infants acquired HIV (both in the lopinavir/ritonavir arm), and HIV-free infant survival was similar between study arms: 92.9% (lopinavir/ritonavir) versus 97.2% (efavirenz), p = 0.10. Conclusions Virologic suppression at delivery was higher with an efavirenz- versus lopinavir/ritonavir-based regimen. However, women in both arms achieved high levels of virologic suppression through one year postpartum and the risk of transmission to infants was low. PMID:25426808

  18. Efficacy and safety of lopinavir/ritonavir versus efavirenz-based antiretroviral therapy in HIV-infected pregnant Ugandan women.

    Science.gov (United States)

    Cohan, Deborah; Natureeba, Paul; Koss, Catherine A; Plenty, Albert; Luwedde, Flavia; Mwesigwa, Julia; Ades, Veronica; Charlebois, Edwin D; Gandhi, Monica; Clark, Tamara D; Nzarubara, Bridget; Achan, Jane; Ruel, Theodore; Kamya, Moses R; Havlir, Diane V

    2015-01-14

    Combination antiretroviral therapy (ART) is now the global standard for HIV-infected pregnant and breastfeeding women at all CD4⁺ cell counts. We compared the efficacy and safety of an efavirenz versus lopinavir/ritonavir regimen for HIV-infected pregnant women initiating ART in rural Uganda. Randomized clinical trial. We performed a planned secondary analysis comparing viral load suppression (HIV-1 RNA ≤400 copies/ml), safety, and HIV transmission to infants in a trial designed to test the hypothesis that lopinavir/ritonavir versus efavirenz-based ART would reduce placental malaria (PROMOTE, ClinicalTrials.gov, NCT00993031). HIV-infected, ART-naive pregnant women at 12-28 weeks gestation and any CD4⁺ cell count were randomized. ART was provided and participants were counseled to breastfeed for 1 year postpartum. The median age of the 389 study participants was 29 years; median CD4⁺ cell count was 370 cells/μl. At delivery, virologic suppression was 97.6% in the efavirenz arm and 86.0% in the lopinavir/ritonavir arm, a statistically significant difference. Two infants acquired HIV (both in the lopinavir/ritonavir arm), and HIV-free infant survival was similar between study arms: 92.9% (lopinavir/ritonavir) versus 97.2% (efavirenz) (P = 0.10). Virologic suppression at delivery was higher with an efavirenz versus lopinavir/ritonavir-based regimen. However, women in both arms achieved high levels of virologic suppression through 1 year postpartum and the risk of transmission to infants was low.

  19. Lack of association between use of efavirenz and death from suicide

    DEFF Research Database (Denmark)

    Smith, Colette; Ryom, Lene; Monforte, Antonella d'Arminio

    2014-01-01

    INTRODUCTION: A recent meta-analysis of 4 RCTs showed an increased rate of suicidality events (suicidal ideation or attempted/completed suicide) associated with efavirenz (EFV) compared to other regimens, but only a trend towards a higher rate of completed/attempted suicides, as only 17 events...... occurred. We investigated the association between EFV use and completed suicide. MATERIALS AND METHODS: All D:A:D participants were followed from study entry to the first of death, last study visit or 1 February 2013. Deaths are centrally validated using cause of death methodology, which assigns underlying......, immediate and up to four contributing causes of death. Two endpoints were considered: 1) suicide or psychiatric disease as the underlying cause, and 2) suicide or psychiatric disease mentioned as an underlying, immediate or contributing cause of death (anywhere). Adjusted rate ratios were calculated using...

  20. Efavirenz and 8-hydroxyefavirenz induce cell death via a JNK- and BimEL-dependent mechanism in primary human hepatocytes

    Energy Technology Data Exchange (ETDEWEB)

    Bumpus, Namandje N., E-mail: nbumpus1@jhmi.edu

    2011-12-15

    Chronic use of efavirenz (EFV) has been linked to incidences of hepatotoxicity in patients receiving EFV to treat HIV-1. While recent studies have demonstrated that EFV stimulates hepatic cell death, a role for the metabolites of efavirenz in this process has yet to be examined. In the present study, incubation of primary human hepatocytes with synthetic 8-hydroxyEFV (8-OHEFV), which is the primary metabolite of EFV, resulted in cell death, caspase-3 activation and reactive oxygen species formation. The metabolite exerted these effects at earlier time points and using lower concentrations than were required for the parent compound. In addition, pharmacological inhibition of cytochrome P450-dependent metabolism of EFV using 1-aminobenzotriazole markedly decreased reactive oxygen species formation and cell death. Treatment of primary human hepatocytes with EFV and 8-OHEFV also stimulated phosphorylation of c-Jun N-terminal kinase (JNK) as well as phosphorylation of the JNK substrate c-Jun. Further, the mRNA and protein expression of an isoform of Bim (Bcl-2 interacting mediator of cell death) denoted as BimEL, which is proapoptotic and has been shown to be modulated by JNK, was increased. Inhibition of JNK using SP600125 prevented the EFV- and 8-OHEFV-mediated cell death. Silencing of Bim using siRNA transfected into hepatocytes also prevented cell death resulting from 8-OHEFV treatment. These data suggest that the oxidative metabolite 8-OHEFV is a more potent inducer of hepatic cell death than the parent compound EFV. Further, activation of the JNK signaling pathway and BimEL mRNA expression appear to be required for EFV- and 8-OHEFV-mediated hepatocyte death. -- Highlights: ► 8-Hydroxyefavirenz is a more potent stimulator of cell death than efavirenz. ► Efavirenz and 8-hydroxyefavirenz increase JNK activity and BimEL mRNA expression. ► JNK and Bim are required for efavirenz- and 8

  1. Antioxidants as recipes for efavirenz-induced liver damage: A study in albino rats

    Directory of Open Access Journals (Sweden)

    Elias Adikwu

    2018-03-01

    Full Text Available Objective: Hepatotoxicity is a clinical challenge associated with the use of efavirenz (EFV). This study investigated the effects of n-acetylcysteine (NAC) and vitamins C and E on EFV-induced hepatotoxicity in albino rats. Methods: Rats were divided into groups and administered NAC (20 mg/kg), vitamin C (50 mg/kg), vitamin E (50 mg/kg), vitamins C + E, and 60 mg/kg of EFV, respectively. Rats were also divided into groups and pretreated with NAC, vitamin C, vitamin E, or combined doses of vitamins C + E prior to treatment with EFV for 15 days. After drug administration, rats were sacrificed and serum was collected and evaluated for liver function parameters. Rats were dissected and the liver was collected, weighed and evaluated for alkaline phosphatase (ALP), alanine aminotransferase (ALT), aspartate aminotransferase (AST), gamma glutamyl transferase (GGT), lactate dehydrogenase (LDH), malondialdehyde (MDA), superoxide dismutase (SOD), catalase (CAT), glutathione (GSH) and glutathione peroxidase (GPX) levels and for pathological damage. Results: Effects on body and liver weights were not significant (p>0.05); however, the levels of ALP, ALT, AST, GGT, LDH, CB, TB and MDA were increased significantly (p<0.05), whereas SOD, CAT, GSH and GPX were decreased significantly (p<0.05) in EFV-treated rats in comparison to control. The liver of EFV-treated rats showed necrosis of hepatocytes. Nevertheless, EFV-induced alterations in the above parameters were significantly (p<0.05) ameliorated in antioxidant-pretreated rats. The combined doses of vitamins C and E produced the best and significant (p<0.05) ameliorative effects in comparison to their individual doses. Conclusion: This study shows the prospects of antioxidants as candidates for the treatment of efavirenz-induced hepatotoxicity.

  2. Efficacy and safety of maraviroc versus efavirenz, both with zidovudine/lamivudine: 96-week results from the MERIT study.

    Science.gov (United States)

    Sierra-Madero, Juan; Di Perri, Giovanni; Wood, Robin; Saag, Michael; Frank, Ian; Craig, Charles; Burnside, Robert; McCracken, Jennifer; Pontani, Dennis; Goodrich, James; Heera, Jayvant; Mayer, Howard

    2010-01-01

    The MERIT study evaluated maraviroc versus efavirenz, both with zidovudine/lamivudine, in treatment-naïve patients with CCR5-tropic (R5) HIV-1. Post hoc analyses previously assessed week 48 outcomes in patients rescreened with R5 virus by a more sensitive tropism assay. Week 96 efficacy (post hoc, n = 614) and safety (n = 721) were assessed. Proportions of subjects <50 copies/mL (58.8% maraviroc, 62.7% efavirenz) and time to loss of virologic response (TLOVR) responders (<50 copies/mL: 60.5% vs 60.7%) were similar. Maraviroc recipients had greater CD4 increases (+212 vs +171 cells/mm³) and fewer adverse event discontinuations (6.1% vs 15.5%), malignancies, and category C events. Week 96 data confirm week 48 observations in MERIT.

  3. Comprehensive Evaluation for Substrate Selectivity of Cynomolgus Monkey Cytochrome P450 2C9, a New Efavirenz Oxidase.

    Science.gov (United States)

    Hosaka, Shinya; Murayama, Norie; Satsukawa, Masahiro; Uehara, Shotaro; Shimizu, Makiko; Iwasaki, Kazuhide; Iwano, Shunsuke; Uno, Yasuhiro; Yamazaki, Hiroshi

    2015-07-01

    Cynomolgus monkeys are widely used as primate models in preclinical studies because of their evolutionary closeness to humans. In humans, the cytochrome P450 (P450) 2C enzymes are important drug-metabolizing enzymes and are highly expressed in liver. The CYP2C enzymes, including CYP2C9, are also expressed abundantly in cynomolgus monkey liver and metabolize some endogenous and exogenous substances such as testosterone, S-mephenytoin, and diclofenac. However, a comprehensive evaluation of the substrate specificity of monkey CYP2C9 had not been conducted. In the present study, 89 commercially available drugs were examined to find potential monkey CYP2C9 substrates. Among the compounds screened, 20 drugs were metabolized by monkey CYP2C9 at relatively high rates. Seventeen of these compounds were substrates or inhibitors of human CYP2C9 or CYP2C19, whereas three drugs were not, indicating that the substrate specificity of monkey CYP2C9 resembles those of human CYP2C9 or CYP2C19, with some differences. Although efavirenz is known as a marker substrate for human CYP2B6, efavirenz was not oxidized by CYP2B6 but by CYP2C9 in monkeys. Liquid chromatography-mass spectrometry analysis revealed that monkey CYP2C9 and human CYP2B6 formed the same mono- and di-oxidized metabolites of efavirenz at the 8- and 14-positions. These results suggest that efavirenz 8-oxidation could be one of the selective markers for cynomolgus monkey CYP2C9 among the three major CYP2C enzymes tested. Therefore, monkey CYP2C9 may contribute to the limited species differences in drug oxidative metabolism between cynomolgus monkeys and humans. Copyright © 2015 by The American Society for Pharmacology and Experimental Therapeutics.

  4. PXR and CAR single nucleotide polymorphisms influence plasma efavirenz levels in South African HIV/AIDS patients

    Directory of Open Access Journals (Sweden)

    Swart Marelize

    2012-11-01

    Full Text Available Background: This study investigated variation in NR1I2 and NR1I3 and its effect on plasma efavirenz levels in HIV/AIDS patients. Variability in plasma drug levels has largely directed research towards identifying causative variants in drug metabolising enzyme (DME) genes, with little focus on the nuclear receptor genes NR1I2 and NR1I3, coding for PXR and CAR, respectively, which are involved in regulating DMEs. Methods: 464 Bantu-speaking South Africans, comprising HIV/AIDS patients on efavirenz-based treatment (n=301) and 163 healthy subjects, were genotyped for 6 SNPs in NR1I2 and NR1I3. 32 of the 301 patients had their DNA binding domains (DBDs) in NR1I2 and NR1I3 sequenced. Results: Significantly decreased efavirenz plasma concentrations were observed in patients carrying the NR1I3 rs3003596 C/C and T/C genotypes (P=0.015 and P=0.010, respectively). Sequencing resulted in the discovery of a further 13 SNPs, 3 of which are novel variants in the DBD of NR1I2. There were significant differences in the distribution of NR1I2 and NR1I3 SNPs between South Africans and Caucasian, Asian and Yoruba population groups. Conclusion: For the realisation of personalised medicine, PXR and CAR genetic variation should be taken into consideration because of their involvement in the regulation of DMEs.

  5. Predicting Optimal Dihydroartemisinin-Piperaquine Regimens to Prevent Malaria During Pregnancy for Human Immunodeficiency Virus-Infected Women Receiving Efavirenz.

    Science.gov (United States)

    Wallender, Erika; Vucicevic, Katarina; Jagannathan, Prasanna; Huang, Liusheng; Natureeba, Paul; Kakuru, Abel; Muhindo, Mary; Nakalembe, Mirium; Havlir, Diane; Kamya, Moses; Aweeka, Francesca; Dorsey, Grant; Rosenthal, Philip J; Savic, Radojka M

    2018-03-05

    A monthly treatment course of dihydroartemisinin-piperaquine (DHA-PQ) effectively prevents malaria during pregnancy. However, a drug-drug interaction pharmacokinetic (PK) study found that pregnant human immunodeficiency virus (HIV)-infected women receiving efavirenz-based antiretroviral therapy (ART) had markedly reduced piperaquine (PQ) exposure. This suggests the need for alternative DHA-PQ chemoprevention regimens in this population. Eighty-three HIV-infected pregnant women who received monthly DHA-PQ and efavirenz contributed longitudinal PK and corrected QT interval (QTc) (n = 25) data. Population PK and PK-QTc models for PQ were developed to consider the benefits (protective PQ coverage) and risks (QTc prolongation) of alternative DHA-PQ chemoprevention regimens. Protective PQ coverage was defined as maintaining a concentration >10 ng/mL for >95% of the chemoprevention period. PQ clearance was 4540 L/day. With monthly DHA-PQ (2880 mg PQ), 96% of women, respectively. All regimens were safe, with ≤2% of women predicted to have ≥30 msec QTc increase. For HIV-infected pregnant women receiving efavirenz, low daily DHA-PQ dosing was predicted to improve protection against parasitemia and reduce risk of toxicity compared to monthly dosing. NCT02282293. © The Author(s) 2017. Published by Oxford University Press for the Infectious Diseases Society of America. All rights reserved. For permissions, e-mail: journals.permissions@oup.com.
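
    The "protective PQ coverage" criterion above (concentration above 10 ng/mL for more than 95% of the dosing interval) can be illustrated with a toy one-compartment simulation. The sketch below is not the authors' population model: the clearance comes from the abstract, but the volume, absorption rate, bioavailability and the two regimens compared are hypothetical placeholders.

        import numpy as np

        # Toy one-compartment, first-order-absorption model used only to
        # illustrate the coverage criterion. CL is from the abstract; V, KA
        # and F are hypothetical placeholders, not fitted values.
        CL, V, KA, F = 4540.0, 50_000.0, 12.0, 1.0   # L/day, L, 1/day, unitless
        KE = CL / V

        def conc_ng_ml(t, dose_mg, dose_times):
            """Superposed oral doses; 1 mg/L equals 1000 ng/mL."""
            c = np.zeros_like(t)
            for td in dose_times:
                dt = np.clip(t - td, 0.0, None)
                c += (t >= td) * (F * dose_mg * KA) / (V * (KA - KE)) * (
                    np.exp(-KE * dt) - np.exp(-KA * dt))
            return 1000.0 * c

        t = np.linspace(0.0, 28.0, 4001)   # one 28-day chemoprevention interval
        regimens = [("3-day monthly course, 960 mg/day", 960.0, [0, 1, 2]),
                    ("hypothetical daily 160 mg", 160.0, list(range(28)))]
        for label, dose, times in regimens:
            coverage = np.mean(conc_ng_ml(t, dose, times) > 10.0)
            print(f"{label}: above 10 ng/mL for {100 * coverage:.0f}% of the interval")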

  6. The male genital tract is not a pharmacological sanctuary from efavirenz.

    Science.gov (United States)

    Avery, L B; Bakshi, R P; Cao, Y J; Hendrix, C W

    2011-07-01

    Many antiretroviral (ARV) drugs have large blood plasma-to-seminal plasma (BP/SP) concentration ratios. Concern exists that these drugs do not adequately penetrate the male genital tract (MGT), resulting in the MGT becoming a "pharmacological sanctuary" from these agents, with ineffective MGT concentrations despite effective blood concentrations. Efavirenz (EFV) is the most highly protein-bound ARV drug, with >99% binding in blood plasma and the largest BP/SP total EFV concentration ratio, reportedly ranging from 11 to 33. To evaluate protein binding as an explanation for the differences between the drug concentrations in blood and semen, we developed a novel ultrafiltration method, corrected for the duration of centrifugation, to measure protein binding in the two matrices. In six subjects, protein-free EFV concentrations were the same in blood and semen; the median (interquartile range (IQR)) protein-free EFV SP/BP ratio was 1.21 (0.99-1.35); EFV protein binding was 99.82% (99.79-99.86) in BP and 95.26% (93.24-96.67) in SP. This shows that the MGT is not a sanctuary from EFV.

  7. Solubility and dissolution performances of spray-dried solid dispersion of Efavirenz in Soluplus.

    Science.gov (United States)

    Lavra, Zênia Maria Maciel; Pereira de Santana, Davi; Ré, Maria Inês

    2017-01-01

    Efavirenz (EFV), a first-line anti-HIV drug largely used as part of antiretroviral therapies, is practically insoluble in water and belongs to BCS class II (low solubility/high permeability). The aim of this study was to improve the solubility and dissolution performance of EFV by formulating an amorphous solid dispersion of the drug in polyvinyl caprolactam-polyvinyl acetate-polyethylene glycol graft copolymer (Soluplus®) using the spray-drying technique. To this purpose, spray-dried dispersions of EFV in Soluplus® at different mass ratios (1:1.25, 1:7, 1:10) were prepared and characterized using particle size measurements, SEM, XRD, DSC, FTIR and Raman microscopy mapping. Solubility and dissolution were determined in different media. Stability was studied at accelerated conditions (40 °C/75% RH) and ambient conditions for 12 months. DSC and XRD analyses confirmed the amorphous state of EFV. FTIR spectroscopy analyses revealed possible drug-polymer molecular interaction. The solubility and dissolution rate of EFV were enhanced markedly in the developed spray-dried solid dispersions, as a function of the polymer concentration. Spray-drying was concluded to be a suitable technique to formulate a physically stable dispersion of amorphous EFV in Soluplus®, when protected from moisture.

  8. Pharmacokinetic Interactions between the Hormonal Emergency Contraception, Levonorgestrel (Plan B, and Efavirenz

    Directory of Open Access Journals (Sweden)

    Monica L. Carten

    2012-01-01

    Full Text Available Objectives. Compare the Plan B levonorgestrel (LNG) area under the concentration-time curve (AUC12) prior to and with efavirenz (EFV). Design. Prospective, open-label, single-arm, equivalence study. Methods. Healthy HIV-negative subjects underwent 12 hr intensive pharmacokinetic (PK) sampling following single-dose LNG alone and after 14 days of EFV. Geometric means, geometric mean ratios, and 90% confidence intervals (CI) are reported for PK parameters. T-tests were utilized. Clinical parameters and liver function tests (LFTs) were assessed. Results. 24 women enrolled and 21 completed the study. With EFV, LNG AUC12 was reduced 56% (95% CI: 49%, 62%) from 42.9 to 17.8 ng*hr/mL, and maximum concentration (Cmax) was reduced 41% (95% CI: 33%, 50%) from 8.4 to 4.6 ng/mL. LNG was well tolerated with no grade 3 or 4 treatment-related toxicities. Conclusions. EFV significantly reduced LNG exposures. Higher LNG doses may be required with EFV. These results reinforce the importance of effective contraception in women taking EFV.
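
    The AUC12 values compared above are typically obtained from the 12-hour concentration-time profile by the trapezoidal rule. A minimal sketch with hypothetical sampling times and concentrations (not the study's observations):

        import numpy as np

        # Hypothetical 12-h levonorgestrel concentration-time profile (ng/mL).
        t_hr = np.array([0, 0.5, 1, 1.5, 2, 3, 4, 6, 8, 10, 12], dtype=float)
        conc = np.array([0.0, 3.1, 6.8, 8.2, 7.9, 6.4, 5.1, 3.6, 2.5, 1.8, 1.3])

        auc_0_12 = np.trapz(conc, t_hr)            # linear trapezoidal AUC12
        cmax, tmax = conc.max(), t_hr[conc.argmax()]
        print(f"AUC12 = {auc_0_12:.1f} ng*hr/mL, Cmax = {cmax:.1f} ng/mL at {tmax} h")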

  9. Probability Aggregates in Probability Answer Set Programming

    OpenAIRE

    Saad, Emad

    2013-01-01

    Probability answer set programming is a declarative programming paradigm that has been shown effective for representing and reasoning about a variety of probability reasoning tasks. However, the lack of probability aggregates, e.g. expected values, in the language of disjunctive hybrid probability logic programs (DHPP) disallows the natural and concise representation of many interesting problems. In this paper, we extend DHPP to allow arbitrary probability aggregates. We introduce two types of p...

  10. Importance of ethnicity, CYP2B6 and ABCB1 genotype for efavirenz pharmacokinetics and treatment outcomes: a parallel-group prospective cohort study in two sub-Saharan Africa populations.

    Directory of Open Access Journals (Sweden)

    Eliford Ngaimisi

    Full Text Available We evaluated the importance of ethnicity and pharmacogenetic variations in determining efavirenz pharmacokinetics, auto-induction and immunological outcomes in two African populations. ART-naïve HIV patients from Ethiopia (n = 285) and Tanzania (n = 209) were prospectively enrolled in parallel to start efavirenz-based HAART. CD4+ cell counts were determined at baseline and at 12, 24 and 48 weeks. Plasma and intracellular efavirenz and 8-hydroxyefavirenz concentrations were determined at weeks 4 and 16. Genotyping for common functional CYP2B6, CYP3A5, ABCB1, UGT2B7 and SLCO1B1 variant alleles was done. Patient country, CYP2B6*6 and ABCB1 c.4036A>G (rs3842A>G) genotype were significant predictors of plasma and intracellular efavirenz concentration. CYP2B6*6 and ABCB1 c.4036A>G (rs3842) genotypes were significantly associated with higher plasma efavirenz concentration, and their allele frequencies were significantly higher in Tanzanians than Ethiopians. Tanzanians displayed significantly higher efavirenz plasma concentrations at week 4 than Ethiopians, a difference not fully explained by CYP2B6*6 or ABCB1 c.4036A>G genotype. Within-country analyses indicated a significant decrease in the mean plasma efavirenz concentration by week 16 compared with week 4 in Tanzanians (p = 0.006), whereas no significant difference in plasma concentration over time was observed in Ethiopians (p = 0.84). Intracellular efavirenz concentration and patient country were significant predictors of CD4 gain during HAART. We report substantial differences in efavirenz pharmacokinetics, extent of auto-induction and immunologic recovery between Ethiopian and Tanzanian HIV patients, partly but not solely due to pharmacogenetic variations. The observed inter-ethnic variations in efavirenz plasma exposure may possibly result in varying clinical treatment outcomes or adverse event profiles between populations.

  11. Pharmacogenetic & pharmacokinetic biomarker for efavirenz based ARV and rifampicin based anti-TB drug induced liver injury in TB-HIV infected patients.

    Directory of Open Access Journals (Sweden)

    Getnet Yimer

    Full Text Available BACKGROUND: The implication of pharmacogenetic variations and efavirenz pharmacokinetics in concomitant efavirenz-based antiviral therapy and anti-tubercular drug induced liver injury (DILI) has not yet been studied. We performed a prospective case-control association study to identify the incidence and the pharmacogenetic, pharmacokinetic and biochemical predictors of anti-tubercular and antiretroviral drug induced liver injury (DILI) in HIV and tuberculosis (TB) co-infected patients. METHODS AND FINDINGS: Newly diagnosed, treatment-naïve TB-HIV co-infected patients (n = 353) were enrolled to receive efavirenz-based ART and rifampicin-based anti-TB therapy, and were assessed clinically and biochemically for DILI up to 56 weeks. Quantification of plasma efavirenz and 8-hydroxyefavirenz levels and genotyping for NAT2, CYP2B6, CYP3A5, ABCB1, UGT2B7 and SLCO1B1 genes were done. The incidence of DILI and the identification of predictors were evaluated using survival analysis and the Cox proportional hazards model. The incidence of DILI was 30.0%, or 14.5 per 1000 person-weeks, and that of severe DILI was 18.4%, or 7.49 per 1000 person-weeks. Statistically significant associations of DILI were observed with female sex (p = 0.001), higher plasma efavirenz level (p = 0.009), efavirenz/8-hydroxyefavirenz ratio (p = 0.036), baseline AST (p = 0.022), ALT (p = 0.014), lower hemoglobin (p = 0.008), serum albumin (p = 0.007), NAT2 slow-acetylator genotype (p = 0.039) and ABCB1 3435TT genotype (p = 0.001). CONCLUSION: We report a high incidence of anti-tubercular and antiretroviral DILI in Ethiopian patients. Between-patient variability in systemic efavirenz exposure and pharmacogenetic variations in NAT2, CYP2B6 and ABCB1 genes determine susceptibility to DILI in TB-HIV co-infected patients. Close monitoring of plasma efavirenz level and liver enzymes during early therapy and/or genotyping practice in HIV clinics is recommended for early identification

  12. Scaling Qualitative Probability

    OpenAIRE

    Burgin, Mark

    2017-01-01

    There are different approaches to qualitative probability, which includes subjective probability. We developed a representation of qualitative probability based on relational systems, which allows modeling uncertainty by probability structures and is more coherent than existing approaches. This setting makes it possible to prove that any comparative probability is induced by some probability structure (Theorem 2.1), that classical probability is a probability structure (Theorem 2.2) and that i...

  13. Why did the FDA approve efavirenz 800 mg when co-administered with rifampin?

    Science.gov (United States)

    Liu, Jiang; Chan-Tack, Kirk M; Jadhav, Pravin; Seo, Shirley; Robertson, Sarah M; Kraft, Jeffrey; Singer, Mary E; Struble, Kimberly A; Arya, Vikram

    2014-06-01

    Literature reports regarding the efficacy of efavirenz (EFV) 600 mg with rifampin (RIF) are not consistent. Evaluation of a drug-drug interaction (DDI) study and supportive semi-mechanistic population pharmacokinetic (PK) analyses were undertaken to help delineate this issue. The DDI study and supportive semi-mechanistic population PK analyses were provided by BMS. The population PK analysis was based on six studies with intensive EFV PK sampling. An ACTG study with sparse PK sampling was used for model evaluation. Simulations compared EFV exposure at various doses in combination with RIF to EFV exposures at 600 mg once daily (QD). Effects of CYP2B6 genotypes on the magnitude of the EFV-RIF interaction were also explored. In the DDI study, co-administering EFV 600 mg QD with RIF reduced mean EFV exposure by ~30%. The population PK model provided acceptable predictive performance of central tendency and variability for EFV C0, Cmax, and AUC. Simulations predicted that increasing EFV to 800 mg QD with RIF would result in EFV AUC and Cmax similar to EFV 600 mg QD alone. EFV AUC and Cmax were ~2 times higher in subjects with reduced-function CYP2B6 genotypes. However, the RIF effect was consistent across all genotypes. EFV dose adjustment to 800 mg QD did not increase the risk of overexposure compared to 600 mg EFV QD within each genotype. Dose adjustment based on matching systemic exposure was recommended to mitigate the potential for sub-therapeutic EFV exposures. Our review did not reveal any safety concerns in subjects receiving EFV 800 mg QD with RIF.
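
    The rationale for the 800 mg adjustment can be seen with a simple dose-proportionality argument (a back-of-envelope check under an assumed linear dose-exposure relationship, not the FDA's population PK model): if rifampin induction reduces EFV exposure by roughly 30% at a given dose, then

        \[
        \mathrm{AUC}_{\text{800 mg + RIF}}
          \approx 0.7 \times \frac{800}{600} \times \mathrm{AUC}_{\text{600 mg alone}}
          \approx 0.93 \times \mathrm{AUC}_{\text{600 mg alone}},
        \]

    i.e. raising the dose to 800 mg approximately restores the exposure obtained with 600 mg in the absence of rifampin, consistent with the simulation result quoted above.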

  14. Improvement of lipoatrophy by switching from efavirenz to lopinavir/ritonavir.

    Science.gov (United States)

    Rojas, J; Lonca, M; Imaz, A; Estrada, V; Asensi, V; Miralles, C; Domingo, P; Montero, M; del Rio, L; Fontdevila, J; Perez, I; Cruceta, A; Gatell, J M; Arnedo, M; Martínez, E

    2016-05-01

    To assess whether changes in antiretroviral drugs other than thymidine nucleoside reverse transcriptase inhibitors (NRTI) may have a body fat impact in HIV-infected patients with lipoatrophy. Ninety-six-week phase IV, open-label, multicentre, pilot randomized trial. HIV-infected patients with moderate/severe lipoatrophy at one or more body sites despite long-term thymidine NRTI-free therapy were randomized to continue their efavirenz (EFV)-based antiretroviral regimen or to switch from EFV to lopinavir/ritonavir (LPV/r). The primary endpoint was the absolute change in limb fat mass measured by dual X-ray absorptiometry from baseline to 96 weeks. Changes in other body fat measurements, subjective perception of lipoatrophy, subcutaneous fat gene expression and plasma lipids were also assessed. Thirty-three patients (73% men, median age 52 years) were recruited. At 96 weeks, absolute limb fat mass increased in the LPV/r arm vs. the EFV arm (estimated difference +1082.1 g; 95% CI +63.7 to +2103.5; P = 0.04); this difference remained significant after adjustment by gender, age, fat mass, body mass index and CD4 cell count at baseline. Subjective lipoatrophy perception scores also improved in the LPV/r arm relative to the EFV arm. Adipogenesis, glucose and lipid metabolism, and mitochondrial gene expression increased in the LPV/r arm compared with the EFV arm at 96 weeks. HDL cholesterol decreased in the LPV/r arm relative to the EFV arm. Switching from EFV to LPV/r in HIV-infected patients with lipoatrophy may offer further limb fat gain beyond thymidine NRTI discontinuation, although this strategy decreased plasma HDL cholesterol and caused changes in subcutaneous fat gene expression that may be associated with increased insulin resistance. © 2015 British HIV Association.

  15. Design and formulation of nano-sized spray dried efavirenz-part I: influence of formulation parameters

    Energy Technology Data Exchange (ETDEWEB)

    Katata, Lebogang, E-mail: lebzakate@yahoo.com; Tshweu, Lesego; Naidoo, Saloshnee; Kalombo, Lonji; Swai, Hulda [Materials Science and Manufacturing, Centre of Polymers and Composites, Council for Scientific and Industrial Research (South Africa)

    2012-11-15

    Efavirenz (EFV) is one of the first-line antiretroviral drugs recommended by the World Health Organisation for treating HIV. It is a hydrophobic drug that suffers from low aqueous solubility (4 μg/mL), which leads to a limited oral absorption and low bioavailability. In order to improve its oral bioavailability, nano-sized polymeric delivery systems are suggested. Spray dried polycaprolactone-efavirenz (PCL-EFV) nanoparticles were prepared by the double emulsion method. The Taguchi method, a statistical design with an L8 orthogonal array, was implemented to optimise the formulation parameters of PCL-EFV nanoparticles. The types of sugar (lactose or trehalose), surfactant concentration and solvent (dichloromethane and ethyl acetate) were chosen as significant parameters affecting the particle size and polydispersity index (PDI). Small nanoparticles with an average particle size of less than 254 ± 0.95 nm in the case of ethyl acetate as organic solvent were obtained as compared to more than 360 ± 19.96 nm for dichloromethane. In this study, the type of solvent and sugar were the most influencing parameters of the particle size and PDI. The Taguchi method proved to be a quick, valuable tool in optimising the particle size and PDI of PCL-EFV nanoparticles. The optimised experimental values for the nanoparticle size and PDI were 217 ± 2.48 nm and 0.093 ± 0.02.

  16. Design and formulation of nano-sized spray dried efavirenz-part I: influence of formulation parameters

    International Nuclear Information System (INIS)

    Katata, Lebogang; Tshweu, Lesego; Naidoo, Saloshnee; Kalombo, Lonji; Swai, Hulda

    2012-01-01

    Efavirenz (EFV) is one of the first-line antiretroviral drugs recommended by the World Health Organisation for treating HIV. It is a hydrophobic drug that suffers from low aqueous solubility (4 μg/mL), which leads to a limited oral absorption and low bioavailability. In order to improve its oral bioavailability, nano-sized polymeric delivery systems are suggested. Spray dried polycaprolactone-efavirenz (PCL-EFV) nanoparticles were prepared by the double emulsion method. The Taguchi method, a statistical design with an L8 orthogonal array, was implemented to optimise the formulation parameters of PCL-EFV nanoparticles. The types of sugar (lactose or trehalose), surfactant concentration and solvent (dichloromethane and ethyl acetate) were chosen as significant parameters affecting the particle size and polydispersity index (PDI). Small nanoparticles with an average particle size of less than 254 ± 0.95 nm in the case of ethyl acetate as organic solvent were obtained as compared to more than 360 ± 19.96 nm for dichloromethane. In this study, the type of solvent and sugar were the most influencing parameters of the particle size and PDI. The Taguchi method proved to be a quick, valuable tool in optimising the particle size and PDI of PCL-EFV nanoparticles. The optimised experimental values for the nanoparticle size and PDI were 217 ± 2.48 nm and 0.093 ± 0.02.
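
    For readers unfamiliar with the design mentioned in the two records above, the Taguchi L8 layout assigns up to seven two-level factors to eight runs, and factor main effects are estimated by contrasting the mean response at each level. A minimal sketch with the standard L8(2^7) array and hypothetical particle-size responses (the factor-to-column assignment and the response values are illustrative, not the study's measurements):

        import numpy as np

        # Standard Taguchi L8(2^7) orthogonal array, levels coded 1/2.
        L8 = np.array([
            [1, 1, 1, 1, 1, 1, 1],
            [1, 1, 1, 2, 2, 2, 2],
            [1, 2, 2, 1, 1, 2, 2],
            [1, 2, 2, 2, 2, 1, 1],
            [2, 1, 2, 1, 2, 1, 2],
            [2, 1, 2, 2, 1, 2, 1],
            [2, 2, 1, 1, 2, 2, 1],
            [2, 2, 1, 2, 1, 1, 2],
        ])

        # Hypothetical mean particle sizes (nm) for the eight runs.
        size_nm = np.array([230, 250, 360, 380, 220, 245, 340, 365], dtype=float)

        factors = ["solvent", "sugar", "col3", "surfactant", "col5", "col6", "col7"]
        for j, name in enumerate(factors):
            effect = size_nm[L8[:, j] == 2].mean() - size_nm[L8[:, j] == 1].mean()
            print(f"main effect of {name}: {effect:+.1f} nm")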

  17. On Probability Leakage

    OpenAIRE

    Briggs, William M.

    2012-01-01

    The probability leakage of model M with respect to evidence E is defined. Probability leakage is a kind of model error. It occurs when M implies that events $y$, which are impossible given E, have positive probability. Leakage does not imply model falsification. Models with probability leakage cannot be calibrated empirically. Regression models, which are ubiquitous in statistical practice, often evince probability leakage.

  18. The effect of malnutrition on the pharmacokinetics and virologic outcomes of lopinavir, efavirenz and nevirapine in food insecure HIV-infected children in Tororo, Uganda

    NARCIS (Netherlands)

    Bartelink, Imke H.; Savic, Rada M.; Dorsey, Grant; Ruel, Theodore; Gingrich, David; Scherpbier, Henriette J.; Capparelli, Edmund; Jullien, Vincent; Young, Sera L.; Achan, Jane; Plenty, Albert; Charlebois, Edwin; Kamya, Moses; Havlir, Diane; Aweeka, Francesca

    2015-01-01

    Malnutrition may impact the pharmacokinetics (PKs) of antiretroviral medications and virologic responses in HIV-infected children. The authors therefore evaluated the PK of nevirapine (NVP), efavirenz (EFV) and lopinavir (LPV) in association with nutritional status in a cohort of HIV-infected

  19. Efavirenz, tenofovir and emtricitabine combined with first-line tuberculosis treatment in tuberculosis-HIV-coinfected Tanzanian patients: a pharmacokinetic and safety study

    NARCIS (Netherlands)

    Semvua, H.H.; Mtabho, C.M.; Fillekes, Q.; Boogaard, J. van den; Kisonga, R.M.; Mleoh, L.; Ndaro, A.; Kisanga, E.R.; Ven, A. van der; Aarnoutse, R.E.; Kibiki, G.S.; Boeree, M.J.; Burger, D.M.

    2013-01-01

    BACKGROUND: To evaluate the effect of rifampicin-based tuberculosis (TB) treatment on the pharmacokinetics of efavirenz/tenofovir/emtricitabine in a fixed-dose combination tablet, and vice versa, in Tanzanian TB-HIV-coinfected patients. METHODS: This was a Phase II open-label multiple dose

  20. Pregnancy rates in HIV-positive women using contraceptives and efavirenz-based or nevirapine-based antiretroviral therapy in Kenya: a retrospective cohort study.

    Science.gov (United States)

    Patel, Rena C; Onono, Maricianah; Gandhi, Monica; Blat, Cinthia; Hagey, Jill; Shade, Starley B; Vittinghoff, Eric; Bukusi, Elizabeth A; Newmann, Sara J; Cohen, Craig R

    2015-11-01

    Concerns have been raised about efavirenz reducing the effectiveness of contraceptive implants. We aimed to establish whether pregnancy rates differ between HIV-positive women who use various contraceptive methods and either efavirenz-based or nevirapine-based antiretroviral therapy (ART) regimens. We did this retrospective cohort study of HIV-positive women aged 15-45 years enrolled in 19 HIV care facilities supported by Family AIDS Care and Education Services in western Kenya between Jan 1, 2011, and Dec 31, 2013. Our primary outcome was incident pregnancy diagnosed clinically. The primary exposure was a combination of contraceptive method and efavirenz-based or nevirapine-based ART regimen. We used Poisson models, adjusting for repeated measures, and demographic, behavioural, and clinical factors, to compare pregnancy rates among women receiving different contraceptive and ART combinations. 24,560 women contributed 37,635 years of follow-up with 3337 incident pregnancies. In women using implants, adjusted pregnancy incidence was 1.1 per 100 person-years (95% CI 0.72-1.5) for nevirapine-based ART users and 3.3 per 100 person-years (1.8-4.8) for efavirenz-based ART users (adjusted incidence rate ratio [IRR] 3.0, 95% CI 1.3-4.6). In women using depot medroxyprogesterone acetate, adjusted pregnancy incidence was 4.5 per 100 person-years (95% CI 3.7-5.2) for nevirapine-based ART users and 5.4 per 100 person-years (4.0-6.8) for efavirenz-based ART users (adjusted IRR 1.2, 95% CI 0.91-1.5). Women using other contraceptive methods, except for intrauterine devices and permanent methods, had 3.1-4.1 higher rates of pregnancy than did those using implants, with 1.6-2.8 higher rates in women using efavirenz-based ART. Although HIV-positive women using implants and efavirenz-based ART had a three-times higher risk of contraceptive failure than did those using nevirapine-based ART, these women still had lower contraceptive failure rates than did those receiving all other
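
    The incidence rates and rate ratios reported above follow directly from events per person-years. A minimal sketch of the crude (unadjusted) calculation with a Wald confidence interval on the log scale; the adjusted estimates in the study additionally control for covariates via Poisson regression, and the counts below are illustrative, not the study data:

        import math

        def irr_with_ci(events_a, py_a, events_b, py_b, z=1.96):
            """Crude incidence rate ratio (group A vs B) with a Wald 95% CI."""
            rate_a, rate_b = events_a / py_a, events_b / py_b
            irr = rate_a / rate_b
            se_log = math.sqrt(1 / events_a + 1 / events_b)
            lo, hi = (irr * math.exp(s * z * se_log) for s in (-1, 1))
            return rate_a, rate_b, irr, lo, hi

        # Hypothetical counts for implant users on efavirenz- vs nevirapine-based ART.
        rate_efv, rate_nvp, irr, lo, hi = irr_with_ci(20, 600.0, 11, 1000.0)
        print(f"EFV: {100 * rate_efv:.1f}/100 p-y, NVP: {100 * rate_nvp:.1f}/100 p-y")
        print(f"crude IRR = {irr:.2f} (95% CI {lo:.2f}-{hi:.2f})")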

  1. Probability 1/e

    Science.gov (United States)

    Koo, Reginald; Jones, Martin L.

    2011-01-01

    Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.
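
    A classic instance of the 1/e phenomenon (whether it is one of the article's three problems is an assumption here) is the matching problem: the probability that a random permutation of n items has no fixed point tends to 1/e as n grows. A quick simulation:

        import math
        import random

        def prob_no_fixed_point(n, trials=200_000, seed=1):
            """Estimate P(a random permutation of n items has no fixed point)."""
            rng = random.Random(seed)
            items = list(range(n))
            hits = 0
            for _ in range(trials):
                perm = items[:]
                rng.shuffle(perm)
                hits += all(p != i for i, p in enumerate(perm))
            return hits / trials

        print(prob_no_fixed_point(10), "vs 1/e =", 1 / math.e)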

  2. Probability an introduction

    CERN Document Server

    Goldberg, Samuel

    1960-01-01

    Excellent basic text covers set theory, probability theory for finite sample spaces, binomial theorem, probability distributions, means, standard deviations, probability function of binomial distribution, more. Includes 360 problems with answers for half.

  3. Efavirenz, tenofovir and emtricitabine combined with first-line tuberculosis treatment in tuberculosis-HIV-coinfected Tanzanian patients: a pharmacokinetic and safety study.

    Science.gov (United States)

    Semvua, Hadija H; Mtabho, Charles M; Fillekes, Quirine; van den Boogaard, Jossy; Kisonga, Riziki M; Mleoh, Liberate; Ndaro, Arnold; Kisanga, Elton R; van der Ven, Andre; Aarnoutse, Rob E; Kibiki, Gibson S; Boeree, Martin J; Burger, David M

    2013-01-01

    To evaluate the effect of rifampicin-based tuberculosis (TB) treatment on the pharmacokinetics of efavirenz/tenofovir/emtricitabine in a fixed-dose combination tablet, and vice versa, in Tanzanian TB-HIV-coinfected patients. This was a Phase II open-label multiple dose pharmacokinetic and safety study. This study was conducted in TB-HIV-coinfected Tanzanian patients who started TB treatment (rifampicin/isoniazid/pyrazinamide/ethambutol) from week 1 to week 8 and continued with rifampicin and isoniazid for another 16 weeks. Antiretroviral treatment (ART) of efavirenz/tenofovir/emtricitabine in a fixed-dose combination tablet was started at week 4 after initiation of TB treatment. A 24-h pharmacokinetic sampling curve was recorded at week 8 (with TB treatment) and week 28 (ART alone). For TB drugs, blood samples at 2 and 5 h post-dose were taken at week 3 (TB treatment alone) and week 8 (with ART). A total of 25 patients (56% male) completed the study; 21 had evaluable pharmacokinetic profiles. The areas under the concentration-time curve from 0 to 24 h post-dose of efavirenz, tenofovir and emtricitabine were slightly higher when these drugs were coadministered with TB drugs; geometric mean ratios (90% CI) were 1.08 (0.90, 1.30), 1.13 (0.93, 1.38) and 1.05 (0.85, 1.29), respectively. For TB drugs, equivalence was suggested for peak plasma concentrations when administered with and without efavirenz/tenofovir/emtricitabine. Adverse events were mostly mild and no serious adverse events or drug discontinuations were reported. Coadministration of efavirenz, tenofovir and emtricitabine with a standard first-line TB treatment regimen did not significantly alter the pharmacokinetic parameters of these drugs and was tolerated well by Tanzanian TB patients who are coinfected with HIV.

  4. Quantum probability measures and tomographic probability densities

    NARCIS (Netherlands)

    Amosov, GG; Man'ko, [No Value

    2004-01-01

    Using a simple relation of the Dirac delta-function to the generalized theta-function, the relationship between the tomographic probability approach and the quantum probability measure approach to the description of quantum states is discussed. The quantum state tomogram expressed in terms of the

  5. Prevalence of etravirine-associated mutations in clinical samples with resistance to nevirapine and efavirenz.

    Science.gov (United States)

    Llibre, J M; Santos, J R; Puig, T; Moltó, J; Ruiz, L; Paredes, R; Clotet, B

    2008-11-01

    To evaluate the expected activity of etravirine in clinical samples, according to mutational patterns associated with decreased virological response (VR). We identified 1586 routine clinical samples with resistance-associated mutations (RAMs) to nevirapine and efavirenz (K103N 60%, Y181C 37%, G190A 27%, V108I 13%). Concerning in vitro identified etravirine mutations, samples with F227C, Y181I, M230L or L100I plus K103N plus Y181C were considered highly resistant. Samples with two RAMs plus Y181C or V179D or K101E or Y188L were considered intermediate. The prevalence of 13 RAMs recently associated with decreased VR to etravirine in the DUET clinical trials was also investigated. Most samples (69%) harboured more than one IAS-USA RAM to first-generation non-nucleoside reverse transcriptase inhibitors (NNRTIs): 42% harboured two RAMs, 21% three RAMs and 6% four or more RAMs. The prevalence of 13 specific etravirine RAMs was V179F 0.12%, G190S 3.9%, Y181V 0.1%, V106I 2.6%, V179D 1.6%, K101P 2.0%, K101E 10.1%, Y181C 36.9%, A98G 5.9%, V90I 6.9%, Y181I 3.6%, G190A 27% and L100I 9.1%. The five RAMs with the most impact on VR (V179F/D, G190S, Y181V and V106I) occurred less often. Overall, 8.2% of the samples had three or more etravirine RAMs and only 1.1% had four or more. In addition, patterns of RAMs previously associated with intermediate etravirine resistance were present in 26.2% of the samples, whereas 4.85% displayed patterns of high-degree resistance. For RAMs associated with decreased VR, etravirine resistance in routine clinical samples was lower than previously reported. High-degree resistance was uncommon, even in patients with resistance to first-generation NNRTIs, whereas low-to-intermediate etravirine resistance was more common.

  6. Toward a generalized probability theory: conditional probabilities

    International Nuclear Information System (INIS)

    Cassinelli, G.

    1979-01-01

    The main mathematical object of interest in the quantum logic approach to the foundations of quantum mechanics is the orthomodular lattice and a set of probability measures, or states, defined by the lattice. This mathematical structure is studied per se, independently from the intuitive or physical motivation of its definition, as a generalized probability theory. It is thought that the building-up of such a probability theory could eventually throw light on the mathematical structure of Hilbert-space quantum mechanics as a particular concrete model of the generalized theory. (Auth.)

  7. A retrospective cohort analysis comparing pregnancy rates among HIV-positive women using contraceptives and efavirenz- or nevirapine-based antiretroviral therapy in Kenya

    Science.gov (United States)

    PATEL, Rena C.; ONONO, Maricianah; GANDHI, Monica; BLAT, Cinthia; HAGEY, Jill; SHADE, Starley B.; VITTINGHOFF, Eric; BUKUSI, Elizabeth A.; NEWMANN, Sara J.; COHEN, Craig R.

    2015-01-01

    SUMMARY Background Given recent concerns of efavirenz reducing the efficacy of contraceptive implants, we sought to determine if pregnancy rates differ among HIV-positive women using various contraceptive methods and efavirenz- or nevirapine-based antiretroviral therapy (ART) regimens. Methods We conducted a retrospective cohort analysis of HIV-positive women aged 15–45 years enrolled in HIV care facilities in western Kenya from January 2011 to December 2013. Pregnancy was diagnosed clinically and the primary exposure was a combination of contraceptive method and ART regimen. We used Poisson models, adjusting for repeated measures, as well as demographic, behavioral and clinical factors, to compare pregnancy rates among women on different contraceptive/ART combinations. Findings 24,560 women contributed 37,635 years of follow-up with 3,337 incident pregnancies. Among women using implants, adjusted pregnancy incidence for nevirapine- and efavirenz-based ART users were 1·1 (95% CI 0·72–1·5) and 3·3 (95% CI 1·8–4·8) per 100 women-years (w-y), respectively (adjusted incidence rate ratio (aIRR) 3·0, 95% CI 1·3–4·6). Among women using depomedroxyprogesterone acetate (DMPA), adjusted pregnancy incidence for nevirapine- and efavirenz-based ART users were 4·5 (95% CI 3·7–5·2) and 5·4 (95% CI 4·0–6·8) per 100 w-y, respectively (aIRR 1·2, 95% CI 0·91–1·5). Women using other contraceptive methods, except for intrauterine devices and permanent methods, experienced 3·1–4·1 higher rates of pregnancy than women using implants, with 1·6–2·8 higher rates specifically among women using efavirenz-based ART. Interpretation While HIV-positive women using implants on efavirenz-based ART faced three times higher risk of contraceptive failure than those on nevirapine-based ART, these women still experienced lower contraceptive failure rates than women on all other contraceptive methods, except for intrauterine devices and permanent methods

  8. Concomitant contraceptive implant and efavirenz use in women living with HIV: perspectives on current evidence and policy implications for family planning and HIV treatment guidelines.

    Science.gov (United States)

    Patel, Rena C; Morroni, Chelsea; Scarsi, Kimberly K; Sripipatana, Tabitha; Kiarie, James; Cohen, Craig R

    2017-05-11

    Preventing unintended pregnancies is important among all women, including those living with HIV. Increasing numbers of women, including HIV-positive women, choose progestin-containing subdermal implants, which are one of the most effective forms of contraception. However, drug-drug interactions between contraceptive hormones and efavirenz-based antiretroviral therapy (ART) may reduce implant effectiveness. We present four inter-related perspectives on this issue. First, as a case study, we discuss how limited data prompted country-level guidance against the use of implants among women concomitantly using efavirenz in South Africa and its subsequent negative effects on the use of implants in general. Second, we discuss the existing clinical data on this topic, including the observational study from Kenya showing women using implants plus efavirenz-based ART had three-fold higher rates of pregnancy than women using implants plus nevirapine-based ART. However, the higher rates of pregnancy in the implant plus efavirenz group were still lower than the pregnancy rates among women using common alternative contraceptive methods, such as injectables. Third, we discuss the four pharmacokinetic studies that show 50-70% reductions in plasma progestin concentrations in women concurrently using efavirenz-based ART as compared to women not on any ART. These pharmacokinetic studies provide the biologic basis for the clinical findings. Fourth, we discuss how data on this topic have marked implications for both family planning and HIV programmes and policies globally. This controversy underlines the importance of integrating family planning services into routine HIV care, counselling women appropriately on increased risk of pregnancy with concomitant implant and efavirenz use, and expanding contraceptive method mix for all women. As global access to ART expands, greater research is needed to explore implant effectiveness when used concomitantly with newer ART regimens. Data on how

  9. Philosophical theories of probability

    CERN Document Server

    Gillies, Donald

    2000-01-01

    The Twentieth Century has seen a dramatic rise in the use of probability and statistics in almost all fields of research. This has stimulated many new philosophical ideas on probability. Philosophical Theories of Probability is the first book to present a clear, comprehensive and systematic account of these various theories and to explain how they relate to one another. Gillies also offers a distinctive version of the propensity theory of probability, and the intersubjective interpretation, which develops the subjective theory.

  10. Non-Archimedean Probability

    NARCIS (Netherlands)

    Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia

    We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned

  11. Interpretations of probability

    CERN Document Server

    Khrennikov, Andrei

    2009-01-01

    This is the first fundamental book devoted to non-Kolmogorov probability models. It provides a mathematical theory of negative probabilities, with numerous applications to quantum physics, information theory, complexity, biology and psychology. The book also presents an interesting model of cognitive information reality with flows of information probabilities, describing the process of thinking, social, and psychological phenomena.

  12. Foundations of probability

    International Nuclear Information System (INIS)

    Fraassen, B.C. van

    1979-01-01

    The interpretation of probabilities in physical theories, whether quantum or classical, is considered. The following points are discussed: 1) the functions P(μ, Q), in terms of which states and propositions can be represented, are classical (Kolmogoroff) probabilities, formally speaking; 2) these probabilities are generally interpreted as themselves conditional, and the conditions are mutually incompatible where the observables are maximal; and 3) testing of the theory typically takes the form of confronting the expectation values of observable Q calculated with probability measures P(μ, Q) for states μ; hence, of comparing the probabilities P(μ, Q)(E) with the frequencies of occurrence of the corresponding events. It seems that even the interpretation of quantum mechanics, in so far as it concerns what the theory says about the empirical (i.e. actual, observable) phenomena, deals with the confrontation of classical probability measures with observable frequencies. This confrontation is studied. (Auth./C.F.)

  13. The quantum probability calculus

    International Nuclear Information System (INIS)

    Jauch, J.M.

    1976-01-01

    The Wigner anomaly (1932) for the joint distribution of noncompatible observables is an indication that the classical probability calculus is not applicable for quantum probabilities. It should, therefore, be replaced by another, more general calculus, which is specifically adapted to quantal systems. In this article this calculus is exhibited and its mathematical axioms and the definitions of the basic concepts such as probability field, random variable, and expectation values are given. (B.R.H)

  14. Choice Probability Generating Functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel L; Bierlaire, Michel

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications....

  15. Probability of satellite collision

    Science.gov (United States)

    Mccarter, J. W.

    1972-01-01

    A method is presented for computing the probability of a collision between a particular artificial earth satellite and any one of the total population of earth satellites. The collision hazard incurred by the proposed modular Space Station is assessed using the technique presented. The results of a parametric study to determine what type of satellite orbits produce the greatest contribution to the total collision probability are presented. Collision probability for the Space Station is given as a function of Space Station altitude and inclination. Collision probability was also parameterized over miss distance and mission duration.
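    The record above treats collision probability as a function of orbital population, miss distance, and mission duration. As a hedged, generic illustration (not the method of the report itself), a kinetic-theory style estimate multiplies spatial object density, relative speed, collision cross-section, and exposure time to get an expected number of encounters, then converts that expectation to a probability:

    ```python
    import math

    def collision_probability(density_per_km3, rel_speed_km_s, cross_section_km2, years):
        """Generic kinetic-theory estimate: expected encounters = n * v_rel * A * T,
        then P(at least one collision) = 1 - exp(-expected). Illustrative only."""
        seconds = years * 365.25 * 24 * 3600
        expected_encounters = density_per_km3 * rel_speed_km_s * cross_section_km2 * seconds
        return 1.0 - math.exp(-expected_encounters)

    # Invented numbers: 1e-8 objects/km^3, 10 km/s closing speed, 100 m^2 cross-section, 10 years.
    print(collision_probability(1e-8, 10.0, 1e-4, 10.0))
    ```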

  16. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2013-01-01

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended...
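    A concrete special case may help fix ideas: for the multinomial logit, the log-sum function G(v) = log Σ_j exp(v_j) acts as a choice-probability generating function, and its gradient is exactly the vector of logit choice probabilities. The sketch below is illustrative, not code from the paper; it checks the gradient claim numerically.

    ```python
    import numpy as np

    def logsum(v):
        # CPGF of the multinomial logit: G(v) = log(sum_j exp(v_j)), computed stably.
        m = v.max()
        return m + np.log(np.exp(v - m).sum())

    def choice_probabilities(v):
        # Gradient of the log-sum = softmax = logit choice probabilities.
        e = np.exp(v - v.max())
        return e / e.sum()

    v = np.array([1.0, 0.2, -0.5])          # systematic utilities of three alternatives
    p = choice_probabilities(v)

    # Finite-difference check that p really is the gradient of the CPGF.
    eps = 1e-6
    grad = np.array([
        (logsum(v + eps * np.eye(3)[j]) - logsum(v - eps * np.eye(3)[j])) / (2 * eps)
        for j in range(3)
    ])
    print(p)
    print(grad)   # should match p to high precision
    ```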

  17. Handbook of probability

    CERN Document Server

    Florescu, Ionut

    2013-01-01

    THE COMPLETE COLLECTION NECESSARY FOR A CONCRETE UNDERSTANDING OF PROBABILITY Written in a clear, accessible, and comprehensive manner, the Handbook of Probability presents the fundamentals of probability with an emphasis on the balance of theory, application, and methodology. Utilizing basic examples throughout, the handbook expertly transitions between concepts and practice to allow readers an inclusive introduction to the field of probability. The book provides a useful format with self-contained chapters, allowing the reader easy and quick reference. Each chapter includes an introductio

  18. Real analysis and probability

    CERN Document Server

    Ash, Robert B; Lukacs, E

    1972-01-01

    Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory.Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var

  19. Introduction to probability

    CERN Document Server

    Freund, John E

    1993-01-01

    Thorough, lucid coverage of permutations and factorials, probabilities and odds, frequency interpretation, mathematical expectation, decision making, postulates of probability, rule of elimination, binomial distribution, geometric distribution, standard deviation, law of large numbers, and much more. Exercises with some solutions. Summary. Bibliography. Includes 42 black-and-white illustrations. 1973 edition.

  20. Probability, Nondeterminism and Concurrency

    DEFF Research Database (Denmark)

    Varacca, Daniele

    Nondeterminism is modelled in domain theory by the notion of a powerdomain, while probability is modelled by that of the probabilistic powerdomain. Some problems arise when we want to combine them in order to model computation in which both nondeterminism and probability are present. In particula...

  1. Janus-faced probability

    CERN Document Server

    Rocchi, Paolo

    2014-01-01

    The problem of probability interpretation was long overlooked before exploding in the 20th century, when the frequentist and subjectivist schools formalized two conflicting conceptions of probability. Beyond the radical followers of the two schools, a circle of pluralist thinkers tends to reconcile the opposing concepts. The author uses two theorems in order to prove that the various interpretations of probability do not come into opposition and can be used in different contexts. The goal here is to clarify the multifold nature of probability by means of a purely mathematical approach and to show how philosophical arguments can only serve to deepen actual intellectual contrasts. The book can be considered as one of the most important contributions in the analysis of probability interpretation in the last 10-15 years.

  2. Efficacy and safety of rilpivirine (TMC278) versus efavirenz at 48 weeks in treatment-naive HIV-1-infected patients: pooled results from the phase 3 double-blind randomized ECHO and THRIVE Trials.

    Science.gov (United States)

    Cohen, Calvin J; Molina, Jean-Michel; Cahn, Pedro; Clotet, Bonaventura; Fourie, Jan; Grinsztejn, Beatriz; Wu, Hao; Johnson, Margaret A; Saag, Michael; Supparatpinyo, Khuanchai; Crauwels, Herta; Lefebvre, Eric; Rimsky, Laurence T; Vanveggel, Simon; Williams, Peter; Boven, Katia

    2012-05-01

    Pooled analysis of phase 3, double-blind, double-dummy ECHO and THRIVE trials comparing rilpivirine (TMC278) and efavirenz. Treatment-naive HIV-1-infected adults were randomized 1:1 to rilpivirine 25 mg once daily or efavirenz 600 mg once daily, with background tenofovir disoproxil fumarate/emtricitabine (TDF/FTC) (ECHO) or TDF/FTC, zidovudine/lamivudine, or abacavir/lamivudine (THRIVE). The primary endpoint was confirmed response (viral load below 50 copies per mL) at week 48; effects of baseline viral load on virologic failure were more apparent with rilpivirine. CD4 cell count increased over time in both groups. Rilpivirine compared with efavirenz gave smaller incidences of adverse events leading to discontinuation (3% vs. 8%, respectively), treatment-related grade 2-4 adverse events (16% vs. 31%), rash (3% vs. 14%), dizziness (8% vs. 26%), abnormal dreams/nightmares (8% vs. 13%), and grade 2-4 lipid abnormalities. At week 48, rilpivirine 25 mg once daily and efavirenz 600 mg once daily had comparable response rates. Rilpivirine had more virologic failures and improved tolerability versus efavirenz.

  3. TUBERCULOSIS COMO ENFERMEDAD OCUPACIONAL

    Science.gov (United States)

    Mendoza-Ticona, Alberto

    2014-01-01

    There is sufficient evidence to declare tuberculosis an occupational disease in various professions, especially among healthcare workers. In Peru, the labor rights associated with tuberculosis as an occupational disease, such as coverage for temporary or permanent disability, are established in law and regulation. However, these rights have not yet been sufficiently publicized. This paper presents information on the risk of acquiring tuberculosis in the workplace, reviews the evidence for declaring tuberculosis an occupational disease among healthcare workers, and presents the current Peruvian legislation on the matter. PMID:22858771

  4. Probability and Measure

    CERN Document Server

    Billingsley, Patrick

    2012-01-01

    Praise for the Third Edition "It is, as far as I'm concerned, among the best books in math ever written....if you are a mathematician and want to have the top reference in probability, this is it." (Amazon.com, January 2006) A complete and comprehensive classic in probability and measure theory Probability and Measure, Anniversary Edition by Patrick Billingsley celebrates the achievements and advancements that have made this book a classic in its field for the past 35 years. Now re-issued in a new style and format, but with the reliable content that the third edition was revered for, this

  5. The concept of probability

    International Nuclear Information System (INIS)

    Bitsakis, E.I.; Nicolaides, C.A.

    1989-01-01

    The concept of probability is now, and always has been, central to the debate on the interpretation of quantum mechanics. Furthermore, probability permeates all of science, as well as our everyday life. The papers included in this volume, written by leading proponents of the ideas expressed, embrace a broad spectrum of thought and results: mathematical, physical, epistemological, and experimental, both specific and general. The contributions are arranged in parts under the following headings: Following Schroedinger's thoughts; Probability and quantum mechanics; Aspects of the arguments on nonlocality; Bell's theorem and EPR correlations; Real or Gedanken experiments and their interpretation; Questions about irreversibility and stochasticity; and Epistemology, interpretation and culture. (author). refs.; figs.; tabs

  6. Probability for statisticians

    CERN Document Server

    Shorack, Galen R

    2017-01-01

    This 2nd edition textbook offers a rigorous introduction to measure theoretic probability with particular attention to topics of interest to mathematical statisticians—a textbook for courses in probability for students in mathematical statistics. It is recommended to anyone interested in the probability underlying modern statistics, providing a solid grounding in the probabilistic tools and techniques necessary to do theoretical research in statistics. For the teaching of probability theory to post graduate statistics students, this is one of the most attractive books available. Of particular interest is a presentation of the major central limit theorems via Stein's method either prior to or alternative to a characteristic function presentation. Additionally, there is considerable emphasis placed on the quantile function as well as the distribution function. The bootstrap and trimming are both presented. Martingale coverage includes coverage of censored data martingales. The text includes measure theoretic...

  7. Concepts of probability theory

    CERN Document Server

    Pfeiffer, Paul E

    1979-01-01

    Using the Kolmogorov model, this intermediate-level text discusses random variables, probability distributions, mathematical expectation, random processes, more. For advanced undergraduates students of science, engineering, or math. Includes problems with answers and six appendixes. 1965 edition.

  8. Probability and Bayesian statistics

    CERN Document Server

    1987-01-01

    This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics" which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985 the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are included especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability across psychological aspects of formulating subjective probability statements, abstract measure theoretical considerations, contributions to theoretical statistics an...

  9. Probability and Statistical Inference

    OpenAIRE

    Prosper, Harrison B.

    2006-01-01

    These lectures introduce key concepts in probability and statistical inference at a level suitable for graduate students in particle physics. Our goal is to paint as vivid a picture as possible of the concepts covered.

  10. Probabilities in physics

    CERN Document Server

    Hartmann, Stephan

    2011-01-01

    Many results of modern physics--those of quantum mechanics, for instance--come in a probabilistic guise. But what do probabilistic statements in physics mean? Are probabilities matters of objective fact and part of the furniture of the world, as objectivists think? Or do they only express ignorance or belief, as Bayesians suggest? And how are probabilistic hypotheses justified and supported by empirical evidence? Finally, what does the probabilistic nature of physics imply for our understanding of the world? This volume is the first to provide a philosophical appraisal of probabilities in all of physics. Its main aim is to make sense of probabilistic statements as they occur in the various physical theories and models and to provide a plausible epistemology and metaphysics of probabilities. The essays collected here consider statistical physics, probabilistic modelling, and quantum mechanics, and critically assess the merits and disadvantages of objectivist and subjectivist views of probabilities in these fie...

  11. Probability an introduction

    CERN Document Server

    Grimmett, Geoffrey

    2014-01-01

    Probability is an area of mathematics of tremendous contemporary importance across all aspects of human endeavour. This book is a compact account of the basic features of probability and random processes at the level of first and second year mathematics undergraduates and Masters' students in cognate fields. It is suitable for a first course in probability, plus a follow-up course in random processes including Markov chains. A special feature is the authors' attention to rigorous mathematics: not everything is rigorous, but the need for rigour is explained at difficult junctures. The text is enriched by simple exercises, together with problems (with very brief hints) many of which are taken from final examinations at Cambridge and Oxford. The first eight chapters form a course in basic probability, being an account of events, random variables, and distributions - discrete and continuous random variables are treated separately - together with simple versions of the law of large numbers and the central limit th...

  12. Probability in physics

    CERN Document Server

    Hemmo, Meir

    2012-01-01

    What is the role and meaning of probability in physical theory, in particular in two of the most successful theories of our age, quantum physics and statistical mechanics? Laws once conceived as universal and deterministic, such as Newton‘s laws of motion, or the second law of thermodynamics, are replaced in these theories by inherently probabilistic laws. This collection of essays by some of the world‘s foremost experts presents an in-depth analysis of the meaning of probability in contemporary physics. Among the questions addressed are: How are probabilities defined? Are they objective or subjective? What is their  explanatory value? What are the differences between quantum and classical probabilities? The result is an informative and thought-provoking book for the scientifically inquisitive. 

  13. Probability in quantum mechanics

    Directory of Open Access Journals (Sweden)

    J. G. Gilson

    1982-01-01

    By using a fluid theory which is an alternative to quantum theory but from which the latter can be deduced exactly, the long-standing problem of how quantum mechanics is related to stochastic processes is studied. It can be seen how the Schrödinger probability density has a relationship to time spent on small sections of an orbit, just as the probability density has in some classical contexts.

  14. Quantum computing and probability.

    Science.gov (United States)

    Ferry, David K

    2009-11-25

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.

  15. Quantum computing and probability

    International Nuclear Information System (INIS)

    Ferry, David K

    2009-01-01

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction. (viewpoint)

  16. The perception of probability.

    Science.gov (United States)

    Gallistel, C R; Krishan, Monika; Liu, Ye; Miller, Reilly; Latham, Peter E

    2014-01-01

    We present a computational model to explain the results from experiments in which subjects estimate the hidden probability parameter of a stepwise nonstationary Bernoulli process outcome by outcome. The model captures the following results qualitatively and quantitatively, with only 2 free parameters: (a) Subjects do not update their estimate after each outcome; they step from one estimate to another at irregular intervals. (b) The joint distribution of step widths and heights cannot be explained on the assumption that a threshold amount of change must be exceeded in order for them to indicate a change in their perception. (c) The mapping of observed probability to the median perceived probability is the identity function over the full range of probabilities. (d) Precision (how close estimates are to the best possible estimate) is good and constant over the full range. (e) Subjects quickly detect substantial changes in the hidden probability parameter. (f) The perceived probability sometimes changes dramatically from one observation to the next. (g) Subjects sometimes have second thoughts about a previous change perception, after observing further outcomes. (h) The frequency with which they perceive changes moves in the direction of the true frequency over sessions. (Explaining this finding requires 2 additional parametric assumptions.) The model treats the perception of the current probability as a by-product of the construction of a compact encoding of the experienced sequence in terms of its change points. It illustrates the why and the how of intermittent Bayesian belief updating and retrospective revision in simple perception. It suggests a reinterpretation of findings in the recent literature on the neurobiology of decision making. (PsycINFO Database Record (c) 2014 APA, all rights reserved).

  17. Irreversibility and conditional probability

    International Nuclear Information System (INIS)

    Stuart, C.I.J.M.

    1989-01-01

    The mathematical entropy - unlike physical entropy - is simply a measure of uniformity for probability distributions in general. So understood, conditional entropies have the same logical structure as conditional probabilities. If, as is sometimes supposed, conditional probabilities are time-reversible, then so are conditional entropies and, paradoxically, both then share this symmetry with physical equations of motion. The paradox is, of course, that probabilities yield a direction to time both in statistical mechanics and quantum mechanics, while the equations of motion do not. The supposed time-reversibility of both conditionals seems also to involve a form of retrocausality that is related to, but possibly not the same as, that described by Costa de Beauregard. The retrocausality is paradoxically at odds with the generally presumed irreversibility of the quantum mechanical measurement process. Further paradox emerges if the supposed time-reversibility of the conditionals is linked with the idea that the thermodynamic entropy is the same thing as 'missing information', since this confounds the thermodynamic and mathematical entropies. However, it is shown that irreversibility is a formal consequence of conditional entropies and, hence, of conditional probabilities also. 8 refs. (Author)
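    For reference, the textbook definitions make the structural parallel the abstract relies on explicit (these formulas are standard, not taken from the paper):

    ```latex
    p(y \mid x) = \frac{p(x,y)}{p(x)}, \qquad
    H(Y \mid X) = -\sum_{x,y} p(x,y)\,\log p(y \mid x) = H(X,Y) - H(X).
    ```

    In both cases the conditional quantity is obtained from the joint one by dividing out (or, on the log scale, subtracting) the marginal contribution of the conditioning variable.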

  18. The pleasures of probability

    CERN Document Server

    Isaac, Richard

    1995-01-01

    The ideas of probability are all around us. Lotteries, casino gambling, the almost non-stop polling which seems to mold public policy more and more - these are a few of the areas where principles of probability impinge in a direct way on the lives and fortunes of the general public. At a more removed level there is modern science which uses probability and its offshoots like statistics and the theory of random processes to build mathematical descriptions of the real world. In fact, twentieth-century physics, in embracing quantum mechanics, has a world view that is at its core probabilistic in nature, contrary to the deterministic one of classical physics. In addition to all this muscular evidence of the importance of probability ideas it should also be said that probability can be lots of fun. It is a subject where you can start thinking about amusing, interesting, and often difficult problems with very little mathematical background. In this book, I wanted to introduce a reader with at least a fairl...

  19. Experimental Probability in Elementary School

    Science.gov (United States)

    Andrew, Lane

    2009-01-01

    Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.
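    In the same experiment-first spirit, a short simulation can stand in for (or complement) a physical classroom experiment; the example below is a generic illustration, not material from the article.

    ```python
    import random

    # Estimate P(rolling a six) by repeated experiment and compare with the theoretical 1/6.
    random.seed(1)
    trials = 10_000
    sixes = sum(1 for _ in range(trials) if random.randint(1, 6) == 6)
    print("experimental frequency:", sixes / trials)
    print("theoretical probability:", 1 / 6)
    ```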

  20. Improving Ranking Using Quantum Probability

    OpenAIRE

    Melucci, Massimo

    2011-01-01

    The paper shows that ranking information units by quantum probability differs from ranking them by classical probability provided the same data used for parameter estimation. As probability of detection (also known as recall or power) and probability of false alarm (also known as fallout or size) measure the quality of ranking, we point out and show that ranking by quantum probability yields higher probability of detection than ranking by classical probability provided a given probability of ...

  1. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2010-01-01

    This paper establishes that every random utility discrete choice model (RUM) has a representation that can be characterized by a choice-probability generating function (CPGF) with specific properties, and that every function with these specific properties is consistent with a RUM. The choice probabilities from the RUM are obtained from the gradient of the CPGF. Mixtures of RUM are characterized by logarithmic mixtures of their associated CPGF. The paper relates CPGF to multivariate extreme value distributions, and reviews and extends methods for constructing generating functions for applications. ... The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended to competing risk survival models....

  2. Probability and stochastic modeling

    CERN Document Server

    Rotar, Vladimir I

    2012-01-01

    Basic Notions; Sample Space and Events; Probabilities; Counting Techniques; Independence and Conditional Probability; Independence; Conditioning; The Borel-Cantelli Theorem; Discrete Random Variables; Random Variables and Vectors; Expected Value; Variance and Other Moments; Inequalities for Deviations; Some Basic Distributions; Convergence of Random Variables; The Law of Large Numbers; Conditional Expectation; Generating Functions; Branching Processes; Random Walk Revisited; Branching Processes; Generating Functions; Branching Processes Revisited; More on Random Walk; Markov Chains; Definitions and Examples; Probability Distributions of Markov Chains; The First Step Analysis; Passage Times; Variables Defined on a Markov Chain; Ergodicity and Stationary Distributions; A Classification of States and Ergodicity; Continuous Random Variables; Continuous Distributions; Some Basic Distributions; Continuous Multivariate Distributions; Sums of Independent Random Variables; Conditional Distributions and Expectations; Distributions in the General Case; Simulation; Distribution F...

  3. Collision Probability Analysis

    DEFF Research Database (Denmark)

    Hansen, Peter Friis; Pedersen, Preben Terndrup

    1998-01-01

    It is the purpose of this report to apply a rational model for prediction of ship-ship collision probabilities as function of the ship and the crew characteristics and the navigational environment for MS Dextra sailing on a route between Cadiz and the Canary Islands.The most important ship and crew...... characteristics are: ship speed, ship manoeuvrability, the layout of the navigational bridge, the radar system, the number and the training of navigators, the presence of a look out etc. The main parameters affecting the navigational environment are ship traffic density, probability distributions of wind speeds...... probability, i.e. a study of the navigator's role in resolving critical situations, a causation factor is derived as a second step.The report documents the first step in a probabilistic collision damage analysis. Future work will inlcude calculation of energy released for crushing of structures giving...
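    The two-step structure described in the abstract (a geometric count of collision candidates, then a causation factor for the chance that the critical situation is not resolved by the navigators) can be caricatured in a few lines. The sketch below is a deliberately crude, generic illustration with invented numbers, not the MS Dextra model itself.

    ```python
    def expected_collisions(traffic_density_per_km2, route_length_km,
                            apparent_collision_width_km, causation_factor):
        # Step 1: geometric number of collision candidates along the route,
        # assuming no evasive action (a strong simplification).
        geometric_candidates = (traffic_density_per_km2 * route_length_km
                                * apparent_collision_width_km)
        # Step 2: scale by the causation factor, i.e. the probability that the
        # crew fails to resolve the critical situation.
        return geometric_candidates * causation_factor

    # Invented numbers for one passage: 5e-4 ships/km^2, 600 km route, 0.3 km width, 2e-4 causation.
    print(expected_collisions(5e-4, 600.0, 0.3, 2e-4))
    ```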

  4. Estimating Subjective Probabilities

    DEFF Research Database (Denmark)

    Andersen, Steffen; Fountain, John; Harrison, Glenn W.

    2014-01-01

    either construct elicitation mechanisms that control for risk aversion, or construct elicitation mechanisms which undertake 'calibrating adjustments' to elicited reports. We illustrate how the joint estimation of risk attitudes and subjective probabilities can provide the calibration adjustments...... that theory calls for. We illustrate this approach using data from a controlled experiment with real monetary consequences to the subjects. This allows the observer to make inferences about the latent subjective probability, under virtually any well-specified model of choice under subjective risk, while still...

  5. Introduction to imprecise probabilities

    CERN Document Server

    Augustin, Thomas; de Cooman, Gert; Troffaes, Matthias C M

    2014-01-01

    In recent years, the theory has become widely accepted and has been further developed, but a detailed introduction is needed in order to make the material available and accessible to a wide audience. This will be the first book providing such an introduction, covering core theory and recent developments which can be applied to many application areas. All authors of individual chapters are leading researchers on the specific topics, assuring high quality and up-to-date contents. An Introduction to Imprecise Probabilities provides a comprehensive introduction to imprecise probabilities, includin

  6. Classic Problems of Probability

    CERN Document Server

    Gorroochurn, Prakash

    2012-01-01

    "A great book, one that I will certainly add to my personal library."—Paul J. Nahin, Professor Emeritus of Electrical Engineering, University of New Hampshire Classic Problems of Probability presents a lively account of the most intriguing aspects of statistics. The book features a large collection of more than thirty classic probability problems which have been carefully selected for their interesting history, the way they have shaped the field, and their counterintuitive nature. From Cardano's 1564 Games of Chance to Jacob Bernoulli's 1713 Golden Theorem to Parrondo's 1996 Perplexin

  7. Incidence of neuropsychiatric side effects of efavirenz in HIV-positive treatment-naïve patients in public-sector clinics in the Eastern Cape

    Directory of Open Access Journals (Sweden)

    Razia Gaida

    2016-06-01

    Background: It is acknowledged that almost half of patients initiated on efavirenz will experience at least one neuropsychiatric side effect. Objectives: The aim was to determine the incidence and severity of neuropsychiatric side effects associated with efavirenz use in five public-sector primary healthcare clinics in the Eastern Cape. Method: The study was a prospective drug utilisation study. A total of 126 medical records were reviewed to obtain the required information. After baseline assessment, follow-up reviews were conducted at 4 weeks, 12 weeks and 24 weeks from 2014 to 2015. Results: The participant group was 74.60% female (n = 94), and the average age was 37.57±10.60 years. There were no neuropsychiatric side effects recorded for any patient. After the full follow-up period, there were a total of 49 non-adherent patients and one patient had died. A non-adherent patient was defined as a patient who did not return to the clinic for follow-up assessment and medication refills 30 days or more after the appointed date. Some patients (n = 11) had sent a third party to the clinic to collect their antiretroviral therapy (ART). The clinic pharmacy would at times dispense a two-month supply of medication, resulting in the patient presenting only every two months. Conclusion: Further pharmacovigilance studies need to be conducted to determine the true incidence of these side effects. Healthcare staff must be encouraged to keep complete records to ensure meaningful patient assessments. Patients being initiated on ART need to personally attend the clinic monthly for at least the first 6 months of treatment. Clinic staff should receive regular training concerning ART, including changes made to guidelines as well as reminders of side effects experienced. Keywords: neuropsychiatric; side effects; efavirenz; HIV-positive patients

  8. Impact of body weight on virological and immunological responses to efavirenz-containing regimens in HIV-infected, treatment-naive adults

    DEFF Research Database (Denmark)

    Marzolini, Catia; Sabin, Caroline; Raffi, François

    2015-01-01

    OBJECTIVE: The prevalence of overweight and obesity is increasing among HIV-infected patients. Whether standard antiretroviral drug dosage is adequate in heavy individuals remains unresolved. We assessed the virological and immunological responses to initial efavirenz (EFV)-containing regimens...... individuals had significantly higher CD4 cell count at baseline, CD4 cell recovery at 6 and 12 months after EFV initiation was comparable to normal-weight individuals. CONCLUSION: Virological and immunological responses to initial EFV-containing regimens were not impaired in heavy individuals, suggesting...

  9. Counterexamples in probability

    CERN Document Server

    Stoyanov, Jordan M

    2013-01-01

    While most mathematical examples illustrate the truth of a statement, counterexamples demonstrate a statement's falsity. Enjoyable topics of study, counterexamples are valuable tools for teaching and learning. The definitive book on the subject in regards to probability, this third edition features the author's revisions and corrections plus a substantial new appendix.

  10. Epistemology and Probability

    CERN Document Server

    Plotnitsky, Arkady

    2010-01-01

    Offers an exploration of the relationships between epistemology and probability in the work of Niels Bohr, Werner Heisenberg, and Erwin Schrodinger; in quantum mechanics; and in modern physics. This book considers the implications of these relationships and of quantum theory for our understanding of the nature of thinking and knowledge in general

  11. Transition probabilities for atoms

    International Nuclear Information System (INIS)

    Kim, Y.K.

    1980-01-01

    Current status of advanced theoretical methods for transition probabilities for atoms and ions is discussed. An experiment on the f values of the resonance transitions of the Kr and Xe isoelectronic sequences is suggested as a test for the theoretical methods

  12. Anti-HIV drugs nevirapine and efavirenz affect anxiety-related behavior and cognitive performance in mice.

    Science.gov (United States)

    Romão, Pedro R T; Lemos, Joelson C; Moreira, Jeverson; de Chaves, Gisele; Moretti, Morgana; Castro, Adalberto A; Andrade, Vanessa M; Boeck, Carina R; Quevedo, João; Gavioli, Elaine C

    2011-01-01

    Nevirapine (NVP) and efavirenz (EFV) belong to the class of anti-HIV drugs called non-nucleoside reverse transcriptase inhibitors (NNRTIs), commonly used as part of highly active antiretroviral therapy (HAART). Although the HAART is able to bring down viral load to undetectable levels and restore immune function, their prolonged use causes several adverse effects. It has been demonstrated that both NVP and EFV are able to cross the blood-brain barrier, causing important central nervous system-related side effects. Thus, this study investigated the effects of chronic administration of EFV (10 mg/kg) and NVP (3.3 mg/kg) in mice submitted to two distinct series of experiments, which aimed to evaluate: (1) the emotional behavior (elevated plus-maze, forced swimming, and open-field test) and (2) the cognitive performance (object recognition and inhibitory avoidance test) of mice. Our results demonstrated that EFV, but not NVP, reduced the exploration to open arms in the elevated plus-maze test. Neither NVP nor EFV altered mouse behavior in the forced swimming and open-field tests. Both drugs reduced the recognition index in the object recognition test, but only EFV significantly impaired the aversive memory assessed in the inhibitory avoidance test 24 h after training. In conclusion, our findings point to a genuine anxiogenic-like effect to EFV, since it reduced exploration to open arms of elevated plus-maze test without affecting spontaneous locomotion. Additionally, both drugs impaired recognition memory, while only the treatment with EFV impaired significantly aversive memory.

  13. Tolerability of central nervous system symptoms among HIV-1 infected efavirenz users: analysis of patient electronic medical record data.

    Science.gov (United States)

    Rosenblatt, Lisa; Broder, Michael S; Bentley, Tanya G K; Chang, Eunice; Reddy, Sheila R; Papoyan, Elya; Myers, Joel

    2017-08-01

    Efavirenz (EFV) is a non-nucleoside reverse transcriptase inhibitor indicated for treatment of HIV-1 infection. Despite concern over EFV tolerability in clinical trials and practice, particularly related to central nervous system (CNS) adverse events, some observational studies have shown high rates of EFV continuation at one year and low rates of CNS-related EFV substitution. The objective of this study was to further examine the real-world rate of CNS-related EFV discontinuation in antiretroviral therapy naïve HIV-1 patients. This retrospective cohort study used a nationally representative electronic medical records database to identify HIV-1 patients ≥12 years old, treated with a 1st-line EFV-based regimen (single or combination antiretroviral tablet) from 1 January 2009 to 30 June 2013. Patients without prior record of EFV use during 6-month baseline (i.e., antiretroviral therapy naïve) were followed 12 months post-medication initiation. CNS-related EFV discontinuation was defined as evidence of a switch to a replacement antiretroviral coupled with record of a CNS symptom within 30 days prior, absent lab evidence of virologic failure. We identified 1742 1st-line EFV patients. Mean age was 48 years, 22.7% were female, and 8.1% had a prior report of CNS symptoms. The first year, overall discontinuation rate among new users of EFV was 16.2%. Ten percent of patients (n = 174) reported a CNS symptom and 1.1% (n = 19) discontinued EFV due to CNS symptoms: insomnia (n = 12), headache (n = 5), impaired concentration (n = 1), and somnolence (n = 1). The frequency of CNS symptoms was similar for patients who discontinued EFV compared to those who did not (10.3 vs. 9.9%; P = .86). Our study found that EFV discontinuation due to CNS symptoms was low, consistent with prior reports.

  14. Determination of Efavirenz in Human Dried Blood Spots by Reversed-Phase High Performance Liquid Chromatography with UV Detection

    Science.gov (United States)

    Hoffman, Justin T; Rossi, Steven S; Espina-Quinto, Rowena; Letendre, Scott; Capparelli, Edmund V

    2013-01-01

    Background Previously published methods for determination of efavirenz (EFV) in human dried blood spots (DBS) employ costly and complex liquid chromatography/mass spectrometry. We describe the validation and evaluation of a simple and inexpensive high-performance liquid chromatography (HPLC) method for EFV quantification in human DBS and dried plasma spots (DPS), using ultraviolet (UV) detection appropriate for resource-limited settings. Methods 100μl of heparinized whole blood or plasma were spotted onto blood collection cards, dried, punched, and eluted. Eluates are injected onto a C-18 reversed phase HPLC column. EFV is separated isocratically using a potassium phosphate and ACN mobile phase. UV detection is at 245nm. Quantitation is by use of external calibration standards. Following validation, the method was evaluated using whole blood and plasma from HIV-positive patients undergoing EFV therapy. Results Mean recovery of drug from dried blood spots is 91.5%. The method is linear over the validated concentration range of 0.3125 – 20.0μg/mL. A good correlation (Spearman r=0.96) between paired plasma and DBS EFV concentrations from the clinical samples was observed, and hematocrit level was not found to be a significant determinant of the EFV DBS level. The mean observed CDBS/Cplasma ratio was 0.68. A good correlation (Spearman r=0.96) between paired plasma and DPS EFV concentrations from the clinical samples was observed. The mean percent deviation of DPS samples from plasma samples is 1.68%. Conclusions Dried whole blood spot or dried plasma spot sampling is well suited for monitoring EFV therapy in resource limited settings, particularly when high sensitivity is not essential. PMID:23503446
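    The quantitation step described above (external calibration standards, then back-calculation of unknowns) and the DBS-versus-plasma comparison can be summarized in a few lines of analysis code. The peak areas and paired concentrations below are invented placeholders, not data from the study.

    ```python
    import numpy as np
    from scipy import stats

    # External calibration: EFV concentration (ug/mL) vs. HPLC-UV peak area (made-up values).
    conc = np.array([0.3125, 0.625, 1.25, 2.5, 5.0, 10.0, 20.0])
    area = np.array([2.1, 4.0, 8.3, 16.1, 33.0, 64.8, 131.0])
    slope, intercept, r_value, p_value, stderr = stats.linregress(conc, area)

    def quantify(peak_area):
        # Back-calculate an unknown concentration from the external calibration line.
        return (peak_area - intercept) / slope

    print("R^2 of calibration:", r_value**2, " unknown at area 25:", quantify(25.0))

    # Agreement between paired DBS and plasma concentrations (hypothetical pairs).
    dbs = np.array([1.2, 2.5, 0.9, 4.1, 3.3])
    plasma = np.array([1.8, 3.6, 1.4, 6.0, 4.7])
    rho, p = stats.spearmanr(dbs, plasma)
    print("Spearman rho:", rho, " mean DBS/plasma ratio:", np.mean(dbs / plasma))
    ```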

  15. Negative probability in the framework of combined probability

    OpenAIRE

    Burgin, Mark

    2013-01-01

    Negative probability has found diverse applications in theoretical physics. Thus, construction of sound and rigorous mathematical foundations for negative probability is important for physics. There are different axiomatizations of conventional probability. So, it is natural that negative probability also has different axiomatic frameworks. In the previous publications (Burgin, 2009; 2010), negative probability was mathematically formalized and rigorously interpreted in the context of extende...

  16. Contributions to quantum probability

    International Nuclear Information System (INIS)

    Fritz, Tobias

    2010-01-01

    Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that, generally, none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels "possible to occur" or "impossible to occur" to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a finite set can occur as the outcome

  17. Bayesian Probability Theory

    Science.gov (United States)

    von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo

    2014-06-01

    Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatrics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.

  18. Contributions to quantum probability

    Energy Technology Data Exchange (ETDEWEB)

    Fritz, Tobias

    2010-06-25

    Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that, generally, none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels "possible to occur" or "impossible to occur" to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a

  19. Waste Package Misload Probability

    International Nuclear Information System (INIS)

    Knudsen, J.K.

    2001-01-01

    The objective of this calculation is to calculate the probability of occurrence for fuel assembly (FA) misloads (i.e., an FA placed in the wrong location) and FA damage during FA movements. The scope of this calculation is provided by the information obtained from the Framatome ANP 2001a report. The first step in this calculation is to categorize each fuel-handling event that occurred at nuclear power plants. The different categories are based on FAs being damaged or misloaded. The next step is to determine the total number of FAs involved in the event. Using this information, a probability of occurrence will be calculated for FA misload and FA damage events. This calculation is an expansion of preliminary work performed by Framatome ANP 2001a.
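    The calculation outline above reduces to simple event counting; a minimal sketch with entirely hypothetical counts might look like this.

    ```python
    # Hypothetical event counts; the real calculation categorizes events from the
    # Framatome ANP 2001a data, which are not reproduced here.
    misload_events = 4           # FAs placed in the wrong location
    damage_events = 11           # FAs damaged during handling
    total_fa_movements = 250_000 # total fuel assemblies moved in the same period

    p_misload = misload_events / total_fa_movements
    p_damage = damage_events / total_fa_movements
    print(f"P(misload per FA movement) = {p_misload:.2e}")
    print(f"P(damage per FA movement)  = {p_damage:.2e}")
    ```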

  20. Probability theory and applications

    CERN Document Server

    Hsu, Elton P

    1999-01-01

    This volume, with contributions by leading experts in the field, is a collection of lecture notes of the six minicourses given at the IAS/Park City Summer Mathematics Institute. It introduces advanced graduates and researchers in probability theory to several of the currently active research areas in the field. Each course is self-contained with references and contains basic materials and recent results. Topics include interacting particle systems, percolation theory, analysis on path and loop spaces, and mathematical finance. The volume gives a balanced overview of the current status of probability theory. An extensive bibliography for further study and research is included. This unique collection presents several important areas of current research and a valuable survey reflecting the diversity of the field.

  1. Paradoxes in probability theory

    CERN Document Server

    Eckhardt, William

    2013-01-01

    Paradoxes provide a vehicle for exposing misinterpretations and misapplications of accepted principles. This book discusses seven paradoxes surrounding probability theory.  Some remain the focus of controversy; others have allegedly been solved, however the accepted solutions are demonstrably incorrect. Each paradox is shown to rest on one or more fallacies.  Instead of the esoteric, idiosyncratic, and untested methods that have been brought to bear on these problems, the book invokes uncontroversial probability principles, acceptable both to frequentists and subjectivists. The philosophical disputation inspired by these paradoxes is shown to be misguided and unnecessary; for instance, startling claims concerning human destiny and the nature of reality are directly related to fallacious reasoning in a betting paradox, and a problem analyzed in philosophy journals is resolved by means of a computer program.

  2. Measurement uncertainty and probability

    CERN Document Server

    Willink, Robin

    2013-01-01

    A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.

  3. Model uncertainty and probability

    International Nuclear Information System (INIS)

    Parry, G.W.

    1994-01-01

    This paper discusses the issue of model uncertainty. The use of probability as a measure of an analyst's uncertainty as well as a means of describing random processes has caused some confusion, even though the two uses are representing different types of uncertainty with respect to modeling a system. The importance of maintaining the distinction between the two types is illustrated with a simple example

  4. Retrocausality and conditional probability

    International Nuclear Information System (INIS)

    Stuart, C.I.J.M.

    1989-01-01

    Costa de Beauregard has proposed that physical causality be identified with conditional probability. The proposal is shown to be vulnerable on two accounts. The first, though mathematically trivial, seems to be decisive so far as the current formulation of the proposal is concerned. The second lies in a physical inconsistency which seems to have its source in a Copenhagenlike disavowal of realism in quantum mechanics. 6 refs. (Author)

  5. Probability via expectation

    CERN Document Server

    Whittle, Peter

    1992-01-01

    This book is a complete revision of the earlier work Probability which appeared in 1970. While revised so radically and incorporating so much new material as to amount to a new text, it preserves both the aim and the approach of the original. That aim was stated as the provision of a 'first text in probability, demanding a reasonable but not extensive knowledge of mathematics, and taking the reader to what one might describe as a good intermediate level'. In doing so it attempted to break away from stereotyped applications, and consider applications of a more novel and significant character. The particular novelty of the approach was that expectation was taken as the prime concept, and the concept of expectation axiomatized rather than that of a probability measure. In the preface to the original text of 1970 (reproduced below, together with that to the Russian edition of 1982) I listed what I saw as the advantages of the approach in as unlaboured a fashion as I could. I also took the view that the text...

  6. Hoy como ayer

    OpenAIRE

    Rodríguez M.

    2010-01-01

    Reading the article entitled "Los medicamentos baratos" ("Cheap medicines") in the journal La Farmacia Española, published in Madrid on Thursday, 21 December 1893, one wonders how the situation it describes can be recognized as if it were today's, how we can be in the same position as more than a hundred years ago. Back then it was the discounts beginning to spread through pharmacies, French ones above all, that threatened the professional standing of the whole profession. With phrases like these, the sit...

  7. La razonabilidad como virtud

    OpenAIRE

    Muñoz Oliveira, Luis Humberto

    2008-01-01

    This doctoral thesis explores the idea that reasonableness is a fundamental virtue if plural societies are to become, or remain, a system of cooperation in which justice is possible. The central hypothesis is that reasonableness as a virtue is a way of being tolerant in a spirit of solidarity: it means understanding one's fellow citizens, listening to them, knowing that together they agreed on the rules of cooperation and ac...

  8. Probability mapping of contaminants

    Energy Technology Data Exchange (ETDEWEB)

    Rautman, C.A.; Kaplan, P.G. [Sandia National Labs., Albuquerque, NM (United States); McGraw, M.A. [Univ. of California, Berkeley, CA (United States); Istok, J.D. [Oregon State Univ., Corvallis, OR (United States); Sigda, J.M. [New Mexico Inst. of Mining and Technology, Socorro, NM (United States)

    1994-04-01

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. The probability mapping approach illustrated in this paper appears to offer site operators a reasonable, quantitative methodology for many environmental remediation decisions and allows evaluation of the risk associated with those decisions. For example, output from this approach can be used in quantitative, cost-based decision models for evaluating possible site characterization and/or remediation plans, resulting in selection of the risk-adjusted, least-cost alternative. The methodology is completely general, and the techniques are applicable to a wide variety of environmental restoration projects. The probability-mapping approach is illustrated by application to a contaminated site at the former DOE Feed Materials Production Center near Fernald, Ohio. Soil geochemical data, collected as part of the Uranium-in-Soils Integrated Demonstration Project, have been used to construct a number of geostatistical simulations of potential contamination for parcels approximately the size of a selective remediation unit (the 3-m width of a bulldozer blade). Each such simulation accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination (potential clean-up or personnel-hazard thresholds).

  9. Probability mapping of contaminants

    International Nuclear Information System (INIS)

    Rautman, C.A.; Kaplan, P.G.; McGraw, M.A.; Istok, J.D.; Sigda, J.M.

    1994-01-01

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. The probability mapping approach illustrated in this paper appears to offer site operators a reasonable, quantitative methodology for many environmental remediation decisions and allows evaluation of the risk associated with those decisions. For example, output from this approach can be used in quantitative, cost-based decision models for evaluating possible site characterization and/or remediation plans, resulting in selection of the risk-adjusted, least-cost alternative. The methodology is completely general, and the techniques are applicable to a wide variety of environmental restoration projects. The probability-mapping approach is illustrated by application to a contaminated site at the former DOE Feed Materials Production Center near Fernald, Ohio. Soil geochemical data, collected as part of the Uranium-in-Soils Integrated Demonstration Project, have been used to construct a number of geostatistical simulations of potential contamination for parcels approximately the size of a selective remediation unit (the 3-m width of a bulldozer blade). Each such simulation accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination (potential clean-up or personnel-hazard thresholds)

  10. Probability of causation approach

    International Nuclear Information System (INIS)

    Jose, D.E.

    1988-01-01

    Probability of causation (PC) is sometimes viewed as a great improvement by those persons who are not happy with the present rulings of courts in radiation cases. The author does not share that hope and expects that PC will not play a significant role in these issues for at least the next decade. If it is ever adopted in a legislative compensation scheme, it will be used in a way that is unlikely to please most scientists. Consequently, PC is a false hope for radiation scientists, and its best contribution may well lie in some of the spin-off effects, such as an influence on medical practice

  11. Generalized Probability Functions

    Directory of Open Access Journals (Sweden)

    Alexandre Souto Martinez

    2009-01-01

    Full Text Available From the integration of nonsymmetrical hyperbolas, a one-parameter generalization of the logarithmic function is obtained. Inverting this function, one obtains the generalized exponential function. Motivated by mathematical curiosity, we show that these generalized functions are suitable to generalize some probability density functions (pdfs). A very reliable rank distribution can be conveniently described by the generalized exponential function. Finally, we turn the attention to the generalization of one- and two-tail stretched exponential functions. We obtain, as particular cases, the generalized error function, the Zipf-Mandelbrot pdf, the generalized Gaussian and Laplace pdf. Their cumulative functions and moments were also obtained analytically.
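
    A sketch of one common one-parameter deformation of the logarithm and its inverse of the kind described above; the exact parametrization below is an assumption for illustration rather than the author's definition:

      import numpy as np

      def gen_log(x, q):
          """One-parameter generalized logarithm; q -> 0 recovers np.log(x)."""
          x = np.asarray(x, dtype=float)
          if abs(q) < 1e-12:
              return np.log(x)
          return (x**q - 1.0) / q

      def gen_exp(y, q):
          """Inverse of gen_log; q -> 0 recovers np.exp(y)."""
          y = np.asarray(y, dtype=float)
          if abs(q) < 1e-12:
              return np.exp(y)
          return (1.0 + q * y) ** (1.0 / q)

      # Round-trip check and the q -> 0 limit.
      x = np.array([0.5, 1.0, 2.0, 10.0])
      assert np.allclose(gen_exp(gen_log(x, 0.3), 0.3), x)
      assert np.allclose(gen_log(x, 1e-15), np.log(x))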

  12. Probability in High Dimension

    Science.gov (United States)

    2014-06-30

    precisely the content of the following result. The price we pay is that the assumption that A is a packing in (F, ‖·‖₁) is too weak to make this happen ... Régularité des trajectoires des fonctions aléatoires gaussiennes. In: École d'Été de Probabilités de Saint-Flour, IV-1974, pp. 1–96. Lecture Notes in ... Lectures on probability theory and statistics (Saint-Flour, 1994), Lecture Notes in Math., vol. 1648, pp. 165–294. Springer, Berlin (1996) 50. Ledoux

  13. Nevirapine and efavirenz elicit different changes in lipid profiles in antiretroviral-therapy-naive patients infected with HIV-1.

    Directory of Open Access Journals (Sweden)

    Frank van Leth

    2004-10-01

    Full Text Available BACKGROUND: Patients infected with HIV-1 initiating antiretroviral therapy (ART) containing a non-nucleoside reverse transcriptase inhibitor (NNRTI) show presumably fewer atherogenic lipid changes than those initiating most ARTs containing a protease inhibitor. We analysed whether lipid changes differed between the two most commonly used NNRTIs, nevirapine (NVP) and efavirenz (EFV). METHODS AND FINDINGS: Prospective analysis of lipids and lipoproteins was performed in patients enrolled in the NVP and EFV treatment groups of the 2NN study who remained on allocated treatment during 48 wk of follow-up. Patients were allocated to NVP (n = 417) or EFV (n = 289) in combination with stavudine and lamivudine. The primary endpoint was percentage change over 48 wk in high-density lipoprotein cholesterol (HDL-c), total cholesterol (TC), TC:HDL-c ratio, non-HDL-c, low-density lipoprotein cholesterol, and triglycerides. The increase of HDL-c was significantly larger for patients receiving NVP (42.5%) than for patients receiving EFV (33.7%; p = 0.036), while the increase in TC was lower (26.9% and 31.1%, respectively; p = 0.073), resulting in a decrease of the TC:HDL-c ratio for patients receiving NVP (-4.1%) and an increase for patients receiving EFV (+5.9%; p < 0.001). The increase of non-HDL-c was smaller for patients receiving NVP (24.7%) than for patients receiving EFV (33.6%; p = 0.007), as were the increases of triglycerides (20.1% and 49.0%, respectively; p < 0.001) and low-density lipoprotein cholesterol (35.0% and 40.0%, respectively; p = 0.378). These differences remained, or even increased, after adjusting for changes in HIV-1 RNA and CD4+ cell levels, indicating an effect of the drugs on lipids over and above that which may be explained by suppression of HIV-1 infection. The increases in HDL-c were of the same order of magnitude as those seen with the use of the investigational HDL-c-increasing drugs. CONCLUSION: NVP-containing ART shows larger increases in HDL

  14. Probable maximum flood control

    International Nuclear Information System (INIS)

    DeGabriele, C.E.; Wu, C.L.

    1991-11-01

    This study proposes preliminary design concepts to protect the waste-handling facilities and all shaft and ramp entries to the underground from the probable maximum flood (PMF) in the current design configuration for the proposed Nevada Nuclear Waste Storage Investigation (NNWSI) repository. Protection provisions were furnished by the United States Bureau of Reclamation (USBR) or developed from USBR data. Proposed flood protection provisions include site grading, drainage channels, and diversion dikes. Figures are provided to show these proposed flood protection provisions at each area investigated. These areas are the central surface facilities (including the waste-handling building and waste treatment building), tuff ramp portal, waste ramp portal, men-and-materials shaft, emplacement exhaust shaft, and exploratory shafts facility

  15. Tribal ethnicity and CYP2B6 genetics in Ugandan and Zimbabwean populations in the UK: implications for efavirenz dosing in HIV infection.

    Science.gov (United States)

    Jamshidi, Y; Moreton, M; McKeown, D A; Andrews, S; Nithiyananthan, T; Tinworth, L; Holt, D W; Sadiq, S T

    2010-12-01

    To determine differences in CYP2B6 loss of function (LoF) single nucleotide polymorphisms (SNPs) and haplotypes between Zimbabweans and Ugandans, and within Ugandan populations (Bantu and Nilotic). Genetic epidemiological study enrolling adult black African Ugandan and Zimbabwean patients attending a UK HIV-1 clinic, irrespective of antiretroviral therapy status. Genomic DNA was extracted from whole blood and the presence of CYP2B6 alleles was determined by direct sequencing of all nine exons of the CYP2B6 gene. Blood was also collected, where appropriate, for determination of efavirenz concentrations. Frequency of SNPs in all patients and LoF haplotype frequencies were calculated. The relationship between the number of LoF haplotype alleles possessed and efavirenz trough concentration (ETC) was determined. Thirty-six Zimbabweans and 74 Ugandans (58 Bantu and 16 Nilotic) were recruited. The definite haplotypes determined were *6, *18, *20 and *27 as LoF and *4 as gain of function. Among those with definite genotypes, the frequency of LoF alleles was 65% [95% confidence interval (95% CI): 51-80] of Zimbabweans versus 22% (95% CI: 12-31) of Ugandan Bantus (P = 10⁻⁶) and versus 39% (95% CI: 14-64) of Ugandan Nilotics (P = 0.09). Among the 19 patients with definite genotype and with available ETCs, log ETCs were associated with a greater number of LoF haplotype alleles [848 ng/mL (n = 12), 1069 ng/mL (n = 4) and 1813 ng/mL (n = 3) for 0, 1 or 2 LoF haplotypes, respectively (P = 0.016)]. Among Zimbabweans, LoF haplotypes constitute the majority of CYP2B6 alleles and are significantly higher in prevalence compared with Ugandans. Frequencies of LoF haplotypes and SNPs in Ugandan Nilotics appear to lie between those of Zimbabweans and Ugandan Bantus. These findings may have relevance to pharmacokinetics and dosing of efavirenz in African populations.
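
    As a rough illustration of the frequency calculation (a simple Wald interval on hypothetical allele counts; the study itself may have used a different interval or denominator):

      import math

      def allele_freq_ci(n_lof_alleles, n_people, z=1.96):
          """Wald 95% CI for an allele frequency, counting 2 alleles per person."""
          n_alleles = 2 * n_people
          p = n_lof_alleles / n_alleles
          se = math.sqrt(p * (1 - p) / n_alleles)
          return p, max(0.0, p - z * se), min(1.0, p + z * se)

      # Hypothetical counts chosen only to illustrate the calculation.
      p, lo, hi = allele_freq_ci(n_lof_alleles=47, n_people=36)
      print(f"LoF allele frequency {p:.2f} (95% CI {lo:.2f}-{hi:.2f})")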

  16. Issues in resistance, adherence, and comparative efficacy of the single-tablet regimen combination of tenofovir, emtricitabine, and efavirenz in the management of HIV-1 infection

    Directory of Open Access Journals (Sweden)

    Rebick G

    2012-09-01

    Full Text Available Gabriel Rebick, Sharon L WalmsleyDivision of Infectious Diseases, Department of Medicine, University Health Network, University of Toronto, Toronto, ON, CanadaAbstract: Atripla is the first once-daily, single-tablet, triple-combination antiretroviral therapy. It is recommended for the initial treatment of the naïve patient with human immunodeficiency virus-1 (HIV-1) infection in all current guidelines, based on its proven efficacy in numerous head-to-head randomized clinical trials. Not only does it have proven efficacy, but the fixed-dose combination, Atripla, has also resulted in an improvement in adherence, quality of life, and satisfaction among naïve as well as virally suppressed patients switching from another regimen. Despite these advantages, tolerability issues can arise that are related primarily to the efavirenz component, which is known to cause central nervous system side effects such as dizziness, abnormal dreams, and anxiety. Although generally self-limited, these side effects can lead to treatment discontinuation in the short or long term. Based on the observation of neural tube defects in macaque models, and isolated case reports in human fetuses with first-trimester exposure, it is rated as Food and Drug Administration pregnancy category D, and considered contraindicated in the first trimester of pregnancy where alternatives are available. Given the low genetic barrier of each of the individual components, resistance remains an important issue for patients with poor adherence, but is balanced in part by the long half-life of the drugs. Transmitted resistance is described in up to 16% of newly infected patients in population surveys, and is particularly prevalent in men who have sex with men. Minority variants that may impart resistance to efavirenz are not detected with currently used HIV-1 genotype assays, but nonetheless may also be implicated in patients who fail initial treatment. Several single-tablet regimens are recently licensed or in

  17. Probability and rational choice

    Directory of Open Access Journals (Sweden)

    David Botting

    2014-05-01

    Full Text Available http://dx.doi.org/10.5007/1808-1711.2014v18n1p1 In this paper I will discuss the rationality of reasoning about the future. There are two things that we might like to know about the future: which hypotheses are true and what will happen next. To put it in philosophical language, I aim to show that there are methods by which inferring to a generalization (selecting a hypothesis) and inferring to the next instance (singular predictive inference) can be shown to be normative and the method itself shown to be rational, where this is due in part to being based on evidence (although not in the same way) and in part on a prior rational choice. I will also argue that these two inferences have been confused, being distinct not only conceptually (as nobody disputes) but also in their results (the value given to the probability of the hypothesis being not in general that given to the next instance), and that methods that are adequate for one are not by themselves adequate for the other. A number of debates over method founder on this confusion and do not show what the debaters think they show.

  18. COVAL, Compound Probability Distribution for Function of Probability Distribution

    International Nuclear Information System (INIS)

    Astolfi, M.; Elbaz, J.

    1979-01-01

    1 - Nature of the physical problem solved: Computation of the probability distribution of a function of variables, given the probability distribution of the variables themselves. 'COVAL' has been applied to reliability analysis of a structure subject to random loads. 2 - Method of solution: Numerical transformation of probability distributions
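
    COVAL itself performs a numerical transformation of the input distributions; a Monte Carlo sketch of the same problem, for a structure whose safety margin is a function of a random load and a random strength (the input distributions below are illustrative, not taken from the code's documentation):

      import numpy as np

      rng = np.random.default_rng(1)
      n = 200_000

      # Illustrative input distributions for the two random variables.
      load = rng.gumbel(loc=50.0, scale=8.0, size=n)                    # random load
      strength = rng.lognormal(mean=np.log(90.0), sigma=0.10, size=n)   # random resistance

      margin = strength - load   # the derived variable whose distribution is wanted

      # Empirical distribution of the function of the inputs.
      quantiles = np.percentile(margin, [1, 5, 50, 95, 99])
      p_fail = np.mean(margin < 0.0)   # probability that the load exceeds the strength

      print("margin quantiles:", np.round(quantiles, 1))
      print("failure probability ~", p_fail)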

  19. CYP2B6 genotype-based efavirenz dose recommendations during rifampicin-based antituberculosis cotreatment for a sub-Saharan Africa population.

    Science.gov (United States)

    Mukonzo, Jackson K; Bisaso, Ronald K; Ogwal-Okeng, Jasper; Gustafsson, Lars L; Owen, Joel S; Aklillu, Eleni

    2016-04-01

    To assess genotype effects on efavirenz (EFV) pharmacokinetics and treatment outcomes, and to provide genotype-based EFV dose recommendations during rifampicin-based tuberculosis (TB)-HIV-1 cotreatment. EFV concentrations from 158 HIV-TB co-infected patients treated with EFV/lamivudine/zidovudine and rifampicin were analyzed. Genotype, CD4 and viral load data were analyzed using a population PK model. Simulated AUCs for the 600 mg EFV dose were 1.2- and 2.4-times greater than the product label for Ugandans in general and CYP2B6*6/*6 genotypes, respectively. EFV daily doses of 450 and 250 mg for Ugandans and CYP2B6*6/*6 genotypes, respectively, yielded simulated exposures comparable to the product label. Around 450 and 250 mg daily doses might meet the EFV dosing needs of HIV-TB infected Ugandans in general and CYP2B6*6/*6 genotypes, respectively.
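
    The reported dose recommendations are close to what simple proportional scaling of the 600 mg reference dose by the simulated exposure ratios would give (assuming dose-proportional kinetics, which the published model need not satisfy exactly):

      reference_dose_mg = 600.0

      # AUC ratios relative to the product label, as reported in the abstract.
      auc_ratio = {"Ugandans overall": 1.2, "CYP2B6*6/*6": 2.4}

      for group, ratio in auc_ratio.items():
          # Dose that would bring simulated exposure back to the label value,
          # assuming exposure scales linearly with dose.
          adjusted = reference_dose_mg / ratio
          print(f"{group}: ~{adjusted:.0f} mg/day")

      # Prints ~500 mg/day for Ugandans overall and ~250 mg/day for CYP2B6*6/*6
      # carriers; the study's model-based recommendations were about 450 and 250 mg.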

  20. Ouabaina como Hormona

    OpenAIRE

    J. Hernando Ordoñez

    1996-01-01

    Commentary on its endogenous origin and its therapeutic applications

    Few drugs have been studied more than the group of the digitalis glycosides, the strophanthins and ouabain, whose study is the subject of the present work.

    Ouabain began to be studied in the last century. The first known reference is that of Pelikan, 1865 (1), as a poison used for arrows in Gabon (Africa). (...)

  1. Hoy como ayer

    Directory of Open Access Journals (Sweden)

    Rodríguez M.

    2010-06-01

    Full Text Available Reading the article entitled "Los medicamentos baratos" ("Cheap medicines") in the journal La Farmacia Española, published in Madrid on Thursday, 21 December 1893, one wonders how it can be that the situation it describes is recognizable as if it were today's, how it can be that we are in the same position as more than a hundred years ago. Back then it was the discounts that were beginning to spread through pharmacies, French ones above all, and that threatened the professional prestige of the whole profession. Phrases such as these define the situation of that moment: "...the discredit of a few gradually denaturing the practice of pharmacy to the point of turning it into an impure commerce of the worst kind; but it is very important to combat with a firm hand the tendency towards cut-price selling, all the more so since there can be no doubt that it signifies a debasement that is harmful in every respect and implies a disorganization that would soon lead us to complete ruin and, what I believe is even graver, to demoralization and disorder, which are in no way compatible with what the exercise of a genuinely scientific profession such as pharmacy is today and has always been." The proposal made to bring the situation under control was "the limitation of pharmacies, with close supervision by the State and an official uniform tariff", as well as "a great deal of intelligence and a great deal of unity are needed; no one must remain indifferent, and each and every one must contribute whatever they can." Our profession mixes a twofold health-care and commercial dimension that is not always easy to keep in balance and, as can be seen, this has always been the case. The problem that exists today is the commercialization of the pharmacy, a drift from health-care establishment towards a mere business.

  2. Incident AIDS or Death After Initiation of Human Immunodeficiency Virus Treatment Regimens Including Raltegravir or Efavirenz Among Adults in the United States.

    Science.gov (United States)

    Cole, Stephen R; Edwards, Jessie K; Hall, H Irene; Brookhart, M Alan; Mathews, W Christopher; Moore, Richard D; Crane, Heidi M; Kitahata, Mari M; Mugavero, Michael J; Saag, Michael S; Eron, Joseph J

    2017-06-01

    The long-term effectiveness of human immunodeficiency virus (HIV) treatments containing integrase inhibitors is unknown. We use observational data from the Centers for AIDS Research Network of Integrated Clinical Systems and the Centers for Disease Control and Prevention to estimate 4-year risk of AIDS and all-cause mortality among 415 patients starting a raltegravir regimen compared to 2646 starting an efavirenz regimen (both regimens include emtricitabine and tenofovir disoproxil fumarate). We account for confounding and selection bias as well as generalizability by standardization for measured variables, and present both observational intent-to-treat and per-protocol estimates. At treatment initiation, 12% of patients were female, 36% black, 13% Hispanic; median age was 37 years, CD4 count 321 cells/µL, and viral load 4.5 log10 copies/mL. Two hundred thirty-five patients incurred an AIDS-defining illness or died, and 741 patients left follow-up. After accounting for measured differences, the 4-year risk was similar among those starting both regimens (ie, intent-to-treat hazard ratio [HR], 0.96 [95% confidence interval {CI}, .63-1.45]; risk difference, -0.9 [95% CI, -4.5 to 2.7]), as well as among those remaining on regimens (ie, per-protocol HR, 0.95 [95% CI, .59-1.54]; risk difference, -0.5 [95% CI, -3.8 to 2.9]). Raltegravir and efavirenz-based initial antiretroviral therapy have similar 4-year clinical effects. Vigilance regarding longer-term comparative effectiveness of HIV regimens using observational data is needed because large-scale experimental data are not forthcoming. © The Author 2017. Published by Oxford University Press for the Infectious Diseases Society of America. All rights reserved. For permissions, e-mail: journals.permissions@oup.com.
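
    A toy sketch of the standardization step described above, for a binary 4-year outcome on entirely synthetic data, ignoring censoring and the time-to-event machinery the study actually used:

      import numpy as np
      import pandas as pd
      from sklearn.linear_model import LogisticRegression

      # Synthetic data: A = 1 for the raltegravir-like regimen, 0 for the
      # efavirenz-like regimen; L are confounders; Y is a binary 4-year outcome.
      rng = np.random.default_rng(2)
      n = 5000
      L = rng.normal(size=(n, 3))
      A = rng.binomial(1, 1 / (1 + np.exp(-L[:, 0])), size=n)
      # No true treatment effect (coefficient 0.0), echoing the near-null finding.
      Y = rng.binomial(1, 1 / (1 + np.exp(-(-2.5 + 0.8 * L[:, 0] + 0.0 * A))), size=n)

      df = pd.DataFrame(L, columns=["l1", "l2", "l3"])
      df["A"], df["Y"] = A, Y

      # Outcome model including treatment and measured confounders.
      X = df[["A", "l1", "l2", "l3"]]
      model = LogisticRegression(max_iter=1000).fit(X, df["Y"])

      # Standardize: predict risk for everyone under A=1 and A=0, then average.
      X1, X0 = X.copy(), X.copy()
      X1["A"], X0["A"] = 1, 0
      risk1 = model.predict_proba(X1)[:, 1].mean()
      risk0 = model.predict_proba(X0)[:, 1].mean()
      print(f"standardized risk difference: {100 * (risk1 - risk0):.1f} per 100")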

  3. A Tale of Two Probabilities

    Science.gov (United States)

    Falk, Ruma; Kendig, Keith

    2013-01-01

    Two contestants debate the notorious probability problem of the sex of the second child. The conclusions boil down to explication of the underlying scenarios and assumptions. Basic principles of probability theory are highlighted.

  4. Introduction to probability with R

    CERN Document Server

    Baclawski, Kenneth

    2008-01-01

    FOREWORD PREFACE Sets, Events, and Probability The Algebra of Sets The Bernoulli Sample Space The Algebra of Multisets The Concept of Probability Properties of Probability Measures Independent Events The Bernoulli Process The R Language Finite Processes The Basic Models Counting Rules Computing Factorials The Second Rule of Counting Computing Probabilities Discrete Random Variables The Bernoulli Process: Tossing a Coin The Bernoulli Process: Random Walk Independence and Joint Distributions Expectations The Inclusion-Exclusion Principle General Random Variable

  5. A first course in probability

    CERN Document Server

    Ross, Sheldon

    2014-01-01

    A First Course in Probability, Ninth Edition, features clear and intuitive explanations of the mathematics of probability theory, outstanding problem sets, and a variety of diverse examples and applications. This book is ideal for an upper-level undergraduate or graduate level introduction to probability for math, science, engineering and business students. It assumes a background in elementary calculus.

  6. Efavirenz, Emtricitabine, and Tenofovir

    Science.gov (United States)

    ... HIV-related illnesses such as serious infections or cancer. Taking these medications along with practicing safer sex ... of hormonal contraceptives (birth control pills, patches, rings, implants, or ... ("buffalo hump"), breasts, and around your stomach. You may notice a ...

  7. A brief introduction to probability.

    Science.gov (United States)

    Di Paola, Gioacchino; Bertani, Alessandro; De Monte, Lavinia; Tuzzolino, Fabio

    2018-02-01

    The theory of probability has been debated for centuries: back in 1600, French mathematicians used the rules of probability to place and win bets. Subsequently, the knowledge of probability has significantly evolved and is now an essential tool for statistics. In this paper, the basic theoretical principles of probability will be reviewed, with the aim of facilitating the comprehension of statistical inference. After a brief general introduction on probability, we will review the concept of the "probability distribution" that is a function providing the probabilities of occurrence of different possible outcomes of a categorical or continuous variable. Specific attention will be focused on the normal distribution, which is the most relevant distribution applied to statistical analysis.
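
    For instance, the probabilities that the normal distribution assigns to ranges of a continuous variable can be read off its cumulative distribution function; a small Python illustration (not part of the original abstract):

      from scipy.stats import norm

      # Probability that a standard normal variable (mean 0, sd 1) falls
      # within one standard deviation of the mean.
      p = norm.cdf(1) - norm.cdf(-1)
      print(f"P(-1 < Z < 1) = {p:.3f}")            # about 0.683

      # Probability of exceeding a threshold, e.g. a z-score of 1.96.
      print(f"P(Z > 1.96) = {norm.sf(1.96):.3f}")  # about 0.025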

  8. O ensaio como narrativa

    Directory of Open Access Journals (Sweden)

    Pedro Duarte

    2016-02-01

    Full Text Available This article attempts to show that all texts, even those of a theoretical nature, have some form of narrative. They are not always inhabited by characters; they may be inhabited by ideas, but even so a conceptual plot unfolds. In modern times the form of this narrative was above all the system, with the totalizing ambition present, for example, in Hegel's philosophy. Today, however, the form of the essay, which emerged already in the modern era, stands out for its discontinuous way of narrating. The aim of the article is to point out that, if the essay is a form, as Lukács, Benjamin and Adorno made explicit, it is also a form of narrating, even if what it narrates conceptually are objects of culture.

  9. La Justicia como virtud

    Directory of Open Access Journals (Sweden)

    Martínez-Sicluna y Sepúlveda, Consuelo

    2003-07-01

    Full Text Available The sentiment of justice represents the habit of conduct by which we are obliged, in any relationship, to give to each their due. This disposition of the spirit, however, is inscribed within the coordinates that define the human being: truth, freedom and the good. The human being as a rational and therefore free being: the only being that determines itself and that finds in the good the meaning of its personal projection, that is, perfection. To give to each their due is to give the subject recognition of this ontological foundation.

  10. Como comunicar la Alegria

    Directory of Open Access Journals (Sweden)

    Pablo Portales

    2015-01-01

    Full Text Available A broad analysis of the electoral industry is offered, recalling that a presidential candidate is "a product for sale". The strategies used in the Chilean plebiscite and in the US elections with the NO to BUSH are picked apart. Social marketing is a new methodology used in field-level development projects, so the article explains it and clarifies its link with communication. Topics are added such as models of message reception, whose conceptual frameworks and methodologies have not yet been adapted to the potential of this line of work. The death throes of the miners' radio stations in Bolivia, in which 42 years of history and heroism are crumbling, are also analysed.

  11. El inquisidor como profesor

    Directory of Open Access Journals (Sweden)

    Adriano PROSPERI

    2009-12-01

    Full Text Available Giovanni Botero, in a famous page of his Ragion di stato, dwelt on the theme of the force of religion in government. This function of the Christian religion, for Botero the Catholic one, guarantees public order and is also presented as the opposite of the disorder generated by Luther and Calvin, who sow everywhere discord, revolutions of states and the ruin of kingdoms. We are at the origins of the historiographical scheme of the periodization of the Modern Age that assigned precisely to the Reformation the role of wet nurse of the revolutions born in Europe.

  12. Propensity, Probability, and Quantum Theory

    Science.gov (United States)

    Ballentine, Leslie E.

    2016-08-01

    Quantum mechanics and probability theory share one peculiarity. Both have well established mathematical formalisms, yet both are subject to controversy about the meaning and interpretation of their basic concepts. Since probability plays a fundamental role in QM, the conceptual problems of one theory can affect the other. We first classify the interpretations of probability into three major classes: (a) inferential probability, (b) ensemble probability, and (c) propensity. Class (a) is the basis of inductive logic; (b) deals with the frequencies of events in repeatable experiments; (c) describes a form of causality that is weaker than determinism. An important, but neglected, paper by P. Humphreys demonstrated that propensity must differ mathematically, as well as conceptually, from probability, but he did not develop a theory of propensity. Such a theory is developed in this paper. Propensity theory shares many, but not all, of the axioms of probability theory. As a consequence, propensity supports the Law of Large Numbers from probability theory, but does not support Bayes theorem. Although there are particular problems within QM to which any of the classes of probability may be applied, it is argued that the intrinsic quantum probabilities (calculated from a state vector or density matrix) are most naturally interpreted as quantum propensities. This does not alter the familiar statistical interpretation of QM. But the interpretation of quantum states as representing knowledge is untenable. Examples show that a density matrix fails to represent knowledge.

  13. La persona como creatura

    Directory of Open Access Journals (Sweden)

    Emmanuel Housset

    2010-01-01

    Full Text Available El artículo de Emmanuel Housset implica un esfuerzo de rehabilitación del concepto «persona» para la filosofía contemporánea y la fenomenología. Para ello el autor busca mostrar cómo poco a poco «persona» tomó otra significación que la de «personaje» o sujeto de derecho. Es en autores como san Agustín y santo Tomás de Aquino que se halla un acceso diferente que pone el énfasis más bien en su carácter relacional y responsivo de la persona, antes que en su dimensión autónoma y autotélica. Tal dimensión aparece, según Housset, junto con la idea de persona como creatura y en oposición a la de individuo racional dueño de sí. La dimensión afectiva, la personalidad despertada por las diversas figuras de la alteridad son algunas de las dimensiones de la persona que examina el autor a partir del examen de la carne, las pasiones, la memoria, la historicidad y el amor. Emmanuel Housset's paper is an effort to revitalize the concept of 'person' for contemporary philosophy and phenomenology. To this end the author looks to show how little by little the understanding of 'person' took on a different meaning to that of 'character' or "right-bearing individual". It is in authors such as St. Augustine and St. Thomas Aquinas that a different approach is found, one that puts emphasis on the relational and responsive character of a person, rather than on the autonomous and autotelic dimension. According to Housset, such a dimension appears together with the idea of the person as a creation, and in opposition to the idea of the rational individual that is his own master. The emotional dimension and the personality that is awoken by the many figures of alterity are some of the dimensions of the person that the author analyzes, based on examining the flesh, passions, memory, historicity and love.

  14. Prediction and probability in sciences

    International Nuclear Information System (INIS)

    Klein, E.; Sacquin, Y.

    1998-01-01

    This book reports the 7 presentations made at the third meeting 'physics and fundamental questions' whose theme was probability and prediction. The concept of probability that was invented to apprehend random phenomena has become an important branch of mathematics and its application range spreads from radioactivity to species evolution via cosmology or the management of very weak risks. The notion of probability is the basis of quantum mechanics and then is bound to the very nature of matter. The 7 topics are: - radioactivity and probability, - statistical and quantum fluctuations, - quantum mechanics as a generalized probability theory, - probability and the irrational efficiency of mathematics, - can we foresee the future of the universe?, - chance, eventuality and necessity in biology, - how to manage weak risks? (A.C.)

  15. Applied probability and stochastic processes

    CERN Document Server

    Sumita, Ushio

    1999-01-01

    Applied Probability and Stochastic Processes is an edited work written in honor of Julien Keilson. This volume has attracted a host of scholars in applied probability, who have made major contributions to the field, and have written survey and state-of-the-art papers on a variety of applied probability topics, including, but not limited to: perturbation method, time reversible Markov chains, Poisson processes, Brownian techniques, Bayesian probability, optimal quality control, Markov decision processes, random matrices, queueing theory and a variety of applications of stochastic processes. The book has a mixture of theoretical, algorithmic, and application chapters providing examples of the cutting-edge work that Professor Keilson has done or influenced over the course of his highly-productive and energetic career in applied probability and stochastic processes. The book will be of interest to academic researchers, students, and industrial practitioners who seek to use the mathematics of applied probability i...

  16. Poisson Processes in Free Probability

    OpenAIRE

    An, Guimei; Gao, Mingchu

    2015-01-01

    We prove a multidimensional Poisson limit theorem in free probability, and define joint free Poisson distributions in a non-commutative probability space. We define (compound) free Poisson processes explicitly, similar to the definitions of (compound) Poisson processes in classical probability. We prove that the sum of finitely many freely independent compound free Poisson processes is a compound free Poisson process. We give a step-by-step procedure for constructing a (compound) free Poisso...

  17. PROBABILITY SURVEYS, CONDITIONAL PROBABILITIES AND ECOLOGICAL RISK ASSESSMENT

    Science.gov (United States)

    We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...

  18. El signo como emblema

    Directory of Open Access Journals (Sweden)

    Sáez, Carlos

    2003-06-01

    Full Text Available The aim of this article is to study the signs and symbols that appear in Hispanic medieval documents and manuscripts. These signs and symbols have usually been considered simply as mere elements to validate the charters. However, these elements were useful as a means of visual communication between the high classes, able to generate charters, and the rest of medieval society, the majority illiterate, who received those charters. Because of their inability to understand an alphabetical code, they needed graphic help to comprehend the message. Besides this, the article deals with non-diplomatic signs and their function.

    Este artículo se centra en los signos o símbolos presentes en los documentos y manuscritos medievales hispanos, que habitualmente han sido tratados como meros elementos de validación de los diplomas. Pero estos elementos servían también de nexos de comunicación visual entre las clases poderosas, capaces de producir escritos, y los demás miembros de la sociedad medieval, receptores y destinatarios de tales escritos, en su mayoría analfabetos. Precisamente por esta razón, su incapacidad de descifrar un código alfabético, necesitan de auxilio gráfico para acercarse a la comprensión del mensaje. Asimismo, tratamos de los signos no diplomáticos y de su función.

  19. O analista como testemunha

    Directory of Open Access Journals (Sweden)

    Jô Gondar

    2016-04-01

    Full Text Available Abstract: This article proposes to think of the place of the witness as a third place that the analyst, in the clinic of trauma, is able to sustain. In traumatic dreams, according to Ferenczi, a third is already being summoned. It is not the witness of the sphere of Law, nor the place of the father or of the symbolic Law. It is a third space that may be called potential: an interstitial, indeterminate and formless space in which something that at first would be incommunicable circulates and gradually takes shape. This space permits and supports the literality of the testimonial narrative, its hesitations, paradoxes and silences. More than a theory of trauma, the notion of potential space would be psychoanalysis's great contribution to theoretical and clinical research with survivors of extermination camps and of situations of torture and violence.

  20. Arte como espelho

    Directory of Open Access Journals (Sweden)

    Pedro Süssekind

    2016-12-01

    Full Text Available This article takes as its starting point the example of the mirrored relationship between a book and a painting with the same name: the portrait that Lucian Freud made of the art critic Martin Gayford and the diary that this critic wrote about his portraitist, both works titled Homem com cachecol azul (Man with a Blue Scarf). From this example, I discuss the metaphor of the mirror as a characterization of art, drawing on the theory of artistic representation elaborated by the North American philosopher Arthur Danto in the 1964 article "The Artworld" and in the first chapter of the 1981 book The Transfiguration of the Commonplace. Finally, I turn to two artistic examples of mirroring in representation analysed by Danto in The Abuse of Beauty (2003): a seventeenth-century Dutch painting and a poem by Rainer Maria Rilke.

  1. El riesgo como oportunidad

    Directory of Open Access Journals (Sweden)

    Daniela Gargantini

    2003-01-01

    Full Text Available In recent years the worldwide number of natural catastrophes has grown steadily. However, from a systemic perspective it can be seen that the great majority of disasters originate in developing countries (among them the Latin American ones), with losses there being significantly higher than in industrialized countries. From this standpoint disasters are not merely natural but socio-natural, which emphasizes the close causal relationship between models of development and urbanization and processes of risk generation, as these increase the vulnerability of the most unprotected sectors. The disaster thus exposes a situation (poverty and urban segregation) that already existed but was not considered until the moment of the catastrophe. Against this background, the disaster appears as an opportunity that precipitates three catalysts of housing policy: land, technical assistance and financing, increasing the speed and creativity of the responses. The question that arises is why wait for the disaster to set them in motion, when none of them is strictly dependent on the situation of risk, but rather subject to power struggles.

  2. Endomarketing: como diferencial competitivo

    Directory of Open Access Journals (Sweden)

    Karin Birck

    2013-05-01

    Full Text Available Nowadays, with the professionalization of companies and strong competition in the market, there is a growing demand for managers committed to the personal and professional well-being of their employees. With this in mind, the article presents some basic management ideas aimed at the application of the most diverse Endomarketing (internal marketing) techniques. It thus demonstrates the importance of the use of feedback, by employees as well as by managers, highlighting the importance of motivation work, a favourable organizational climate and effective internal communication, and the singular need to treat the employee as the differential within a company. To this end, a bibliographic survey was carried out that will help and provide support for turning these ideas into successful actions and attitudes, and also allow a confrontation of ideas in which the authors present their most diverse opinions, often drawing on accounts of the experiences of other managers and even their own, with each drawing their own conclusions.

  3. Efecto de inhibidores de etileno en la longevidad floral del clavel (Dianthus caryophyllus L. como probables sustitutos del tiosulfato de plata (STS Effect of ethylene inhibitors in extending the vase life of camation iDianthus caryophyllus L. cut flowers as substitutes of silver thiosulfate (STS

    Directory of Open Access Journals (Sweden)

    Cubillos Eliberto

    2001-12-01

    Full Text Available In Colombia the cut-flower export sector has become an economic activity of great importance. However, high production costs and low prices have caused the profitability of this industry to decline in recent years. Silver thiosulfate (STS) is one of the products most widely used in flower postharvest handling, but it is currently regarded as a potential environmental contaminant. With the aim of comparing the response of traditional postharvest products, whose main ingredient is the Ag ion, with others of greater degradability, a trial was carried out with standard carnation of the variety 'Nelson'. The flower stems were harvested from a greenhouse crop on a commercial farm in the Bogotá savanna, selected, and placed in different postharvest solutions to prolong vase life. They then underwent a simulated transport period of ten days and were taken to the Crop Physiology laboratory of the Universidad Nacional de Colombia, Bogotá, where the trial was set up and the respective measurements were made. In the laboratory, the stems were kept hydrated in distilled water under a 12-h photoperiod (artificial light from 6 a.m. to 6 p.m.) and ventilated for 30 min in the morning to avoid the accumulation of ethylene. Average laboratory conditions were 19 °C and 75% relative humidity. The commercial products used were silver thiosulfate (STS) prepared on the farm, 1-methylcyclopropene (1-MCP), Chrysal AVB, Chrysal EVB, Florissima 125, Florissima 135 and Florissant 100. The best results in flower longevity were obtained with the combination of 1-MCP + Florissima 135 (22 days), Florissima 125 (21.7 days) and farm-prepared STS (21.5 days). It was confirmed that some products that do not contain the Ag ion (Florissima 135 and

  4. Ouabaina como Hormona

    Directory of Open Access Journals (Sweden)

    J. Hernando Ordoñez

    1996-04-01

    Full Text Available

    Commentary on its endogenous origin and its therapeutic applications

    Few drugs have been studied more than the group of the digitalis glycosides, the strophanthins and ouabain, whose study is the subject of the present work.

    Ouabain began to be studied in the last century. The first known reference is that of Pelikan, 1865 (1), as a poison used for arrows in Gabon (Africa). (...)

    (...) There followed the works of Fraser, 1869 (2, 3, 6), Polaillon, 1871 (4), Arnaud, 1888 (5), Vaquez and Lutembacher, 1917 (7), Stoll, 1939 (8), Lapicque, 1929 (9), Wiggers, 1927 (10), and many others (11, 12, 13). In the work of Kisch (14) more than 700 bibliographic references on the subject appear.

    Ouabain of endogenous origin. Purification
    For many years ouabain was known as being of plant origin, produced by the plants Strophanthus glaber (k-strophanthin), Acocanthera ouabaio (ouabain) and Strophanthus kombe (k-strophanthin and k-strophanthoside). A property common to all the digitalis glycosides, strophanthin and ouabain is that they are all inhibitors of the Na-K pump, which regulates the exit of Na from and the entry of K into the cell.

    In studying the inhibitors of this pump, extraordinary results have been found in recent years concerning the endogenous origin of some of these inhibitors, among them ouabain. Because I consider them of extreme importance and scientific topicality, I allow myself to cite some of them. Hamlyn and Manunta (15, 16, 17, 18 and 19) carried out studies on the subject and succeeded in identifying in human plasma a compound identical to ouabain. These findings were later confirmed by other authors (20, 21, 22, 23 and 24).

    Hamlyn (19) gives several arguments showing that the chemical compound found is pure ouabain and, what is more interesting, that it has an endogenous origin: (a) by high-resolution spectroscopy

  5. O direito como imperativo

    Directory of Open Access Journals (Sweden)

    Cloter Miglioriani

    1988-08-01

    Full Text Available We have examined one of the facets which Law presents to society, looking at the theme through a brief history of Law, in which Roman Law stands out, up to modern times, comparing current juridical systems such as the Continental System, Common Law, and Soviet Law. We have looked at Law from the viewpoint of society's need to have basic rules for living together, with the juridical rule being one of the most important. We have highlighted the views of Hart and Kelsen on the foundations of the validity of Law. We have also considered the obligatoriness of Law, giving the point of view of Radbruch who, explaining his "Theory of the Obligatoriness of Law", concluded that the obligatoriness of Law can only be withdrawn when there is a clash between morals, law, usage and social conventions. We have looked at the notion of the imperativeness of Law, the central theme of the work, drawing on the views of Miguel Reale, for whom the juridical norm cannot be reduced to a "command of a volitional nature", but rather the obligatory character of the juridical norm arises from the pressure of social values. Del Vecchio, who is also quoted, recognized that imperativeness exists in the juridical norm, whether it is preceptive (a positive command) or permissive. Also mentioned is the opinion of Tercio Sampaio Ferraz, for whom the juridical norm has imperativeness to the extent that the imposition of behaviour is unconditionally guaranteed. Foi feita a abordagem de uma das facetas com que o Direito se apresenta à sociedade, enfocando o tema a partir de um brevíssimo histórico do Direito, onde revela a fase romana, até os períodos modernos, com comparações dos sistemas jurídicos hodiernos, como o sistema continental, o da Common Law e o soviético. Foi enfocado o Direito em face da necessidade da sociedade em ter regras básicas de convivência, despontando a regra jurídica como das mais importantes. Foi dado destaque às posições de Hart e Kelsen, sobre os

  6. Probability inequalities for decomposition integrals

    Czech Academy of Sciences Publication Activity Database

    Agahi, H.; Mesiar, Radko

    2017-01-01

    Roč. 315, č. 1 (2017), s. 240-248 ISSN 0377-0427 Institutional support: RVO:67985556 Keywords : Decomposition integral * Superdecomposition integral * Probability inequalities Subject RIV: BA - General Mathematics OBOR OECD: Statistics and probability Impact factor: 1.357, year: 2016 http://library.utia.cas.cz/separaty/2017/E/mesiar-0470959.pdf

  7. Expected utility with lower probabilities

    DEFF Research Database (Denmark)

    Hendon, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte

    1994-01-01

    An uncertain and not just risky situation may be modeled using so-called belief functions assigning lower probabilities to subsets of outcomes. In this article we extend the von Neumann-Morgenstern expected utility theory from probability measures to belief functions. We use this theory...

  8. Invariant probabilities of transition functions

    CERN Document Server

    Zaharopol, Radu

    2014-01-01

    The structure of the set of all the invariant probabilities and the structure of various types of individual invariant probabilities of a transition function are two topics of significant interest in the theory of transition functions, and are studied in this book. The results obtained are useful in ergodic theory and the theory of dynamical systems, which, in turn, can be applied in various other areas (like number theory). They are illustrated using transition functions defined by flows, semiflows, and one-parameter convolution semigroups of probability measures. In this book, all results on transition probabilities that have been published by the author between 2004 and 2008 are extended to transition functions. The proofs of the results obtained are new. For transition functions that satisfy very general conditions the book describes an ergodic decomposition that provides relevant information on the structure of the corresponding set of invariant probabilities. Ergodic decomposition means a splitting of t...

  9. Introduction to probability with Mathematica

    CERN Document Server

    Hastings, Kevin J

    2009-01-01

    Discrete ProbabilityThe Cast of Characters Properties of Probability Simulation Random SamplingConditional ProbabilityIndependenceDiscrete DistributionsDiscrete Random Variables, Distributions, and ExpectationsBernoulli and Binomial Random VariablesGeometric and Negative Binomial Random Variables Poisson DistributionJoint, Marginal, and Conditional Distributions More on ExpectationContinuous ProbabilityFrom the Finite to the (Very) Infinite Continuous Random Variables and DistributionsContinuous ExpectationContinuous DistributionsThe Normal Distribution Bivariate Normal DistributionNew Random Variables from OldOrder Statistics Gamma DistributionsChi-Square, Student's t, and F-DistributionsTransformations of Normal Random VariablesAsymptotic TheoryStrong and Weak Laws of Large Numbers Central Limit TheoremStochastic Processes and ApplicationsMarkov ChainsPoisson Processes QueuesBrownian MotionFinancial MathematicsAppendixIntroduction to Mathematica Glossary of Mathematica Commands for Probability Short Answers...

  10. Linear positivity and virtual probability

    International Nuclear Information System (INIS)

    Hartle, James B.

    2004-01-01

    We investigate the quantum theory of closed systems based on the linear positivity decoherence condition of Goldstein and Page. The objective of any quantum theory of a closed system, most generally the universe, is the prediction of probabilities for the individual members of sets of alternative coarse-grained histories of the system. Quantum interference between members of a set of alternative histories is an obstacle to assigning probabilities that are consistent with the rules of probability theory. A quantum theory of closed systems therefore requires two elements: (1) a condition specifying which sets of histories may be assigned probabilities and (2) a rule for those probabilities. The linear positivity condition of Goldstein and Page is the weakest of the general conditions proposed so far. Its general properties relating to exact probability sum rules, time neutrality, and conservation laws are explored. Its inconsistency with the usual notion of independent subsystems in quantum mechanics is reviewed. Its relation to the stronger condition of medium decoherence necessary for classicality is discussed. The linear positivity of histories in a number of simple model systems is investigated with the aim of exhibiting linearly positive sets of histories that are not decoherent. The utility of extending the notion of probability to include values outside the range of 0-1 is described. Alternatives with such virtual probabilities cannot be measured or recorded, but can be used in the intermediate steps of calculations of real probabilities. Extended probabilities give a simple and general way of formulating quantum theory. The various decoherence conditions are compared in terms of their utility for characterizing classicality and the role they might play in further generalizations of quantum mechanics

  11. Liver enzyme abnormalities and associated risk factors in HIV patients on efavirenz-based HAART with or without tuberculosis co-infection in Tanzania.

    Directory of Open Access Journals (Sweden)

    Sabina Mugusi

    Full Text Available To investigate the timing, incidence, clinical presentation, pharmacokinetics and pharmacogenetic predictors of antiretroviral and anti-tuberculosis drug induced liver injury (DILI) in HIV patients with or without TB co-infection. A total of 473 treatment-naïve HIV patients (253 HIV only and 220 with HIV-TB co-infection) were enrolled prospectively. Plasma efavirenz concentration and CYP2B6*6, CYP3A5*3, *6 and *7, ABCB1 3435C/T and SLCO1B1 genotypes were determined. Demographic, clinical and laboratory data were collected at baseline and up to 48 weeks of antiretroviral therapy. The DILI case definition was according to the Council for International Organizations of Medical Sciences (CIOMS). Incidence of DILI and identification of predictors were evaluated using a Cox proportional hazards model. The overall incidence of DILI was 7.8% (8.3 per 1000 person-weeks), being non-significantly higher among patients receiving concomitant anti-TB and HAART (10.0%, 10.7 per 1000 person-weeks) than among those receiving HAART alone (5.9%, 6.3 per 1000 person-weeks). The frequency of the CYP2B6*6 allele (p = 0.03) and the CYP2B6*6/*6 genotype (p = 0.06) was significantly higher in patients with DILI than in those without. A multivariate Cox regression model indicated CYP2B6*6/*6 genotype and anti-HCV IgG antibody positivity as significant predictors of DILI. Median time to DILI was 2 weeks after HAART initiation and no DILI onset was observed after 12 weeks. No severe DILI was seen and the gain in CD4 was similar in patients with or without DILI. Antiretroviral and anti-tuberculosis DILI does occur in our setting, presenting early after HAART initiation. The DILI seen is mild, transient and may not require treatment interruption. There is good tolerance to HAART and anti-TB with similar immunological outcomes. Genetic make-up, mainly CYP2B6 genotype, influences the development of efavirenz-based HAART liver injury in Tanzanians.
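
    The quoted incidence figures follow from the usual events-per-person-time arithmetic; a sketch with the reported event proportion and an illustrative person-time denominator (the abstract does not give the total follow-up time, so the denominator below is back-calculated for illustration only):

      def incidence_rate_per_1000(events, person_weeks):
          """Crude incidence rate per 1000 person-weeks of follow-up."""
          return 1000.0 * events / person_weeks

      # Illustrative numbers only: 37 DILI events (about 7.8% of 473 patients)
      # over a hypothetical 4,450 person-weeks at risk.
      print(round(incidence_rate_per_1000(37, 4450), 1))   # ~8.3 per 1000 person-weeks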

  12. Europa como cultura

    Directory of Open Access Journals (Sweden)

    Javier San Martín

    2012-02-01

    Full Text Available El presente texto tiene un origen muy concreto. El día 15 de marzo de 2002, con motivo de la Cumbre Europea de Barcelona, Jorge Semprún reflexionaba, en las páginas de un conocido diario madrileño, sobre el significado que para él tenía ser europeo. Para ello emprendía tres "viajes intelectuales" a Viena, Praga y Buchenwald, los tres de gran significado histórico y cultural. Dado el interés del texto y del momento, me pareció entonces oportuno glosar algunos aspectos de aquel artículo, primero, para subrayar el valor de la aportación de Semprún, luego para corregir alguna inexactitud de carácter biográfico, debida posiblemente a la rapidez de la traducción, y, tercero, para ampliar con algún comentario la valiosa contribución de Sempnín, sobre todo en lo que concierne al sentido de Europa. En este texto se toma aquel comentario como punto de partida.The origins of this text are very specific. On 15 March 2002, on the occasion of the European Summit in Barcelona, and on the pages of a well-known Madrid newspaper, Jorge Semprún reflected on the meaning that being European had for him. To do this, he embarked on three "intellectual journeys" to Vienna, Prague and Buchenwald, all three of great historical and cultural significance. Given the interest of the text and of the moment, I considered it appropriate to comment on aspects of the article -firstly, to underline the value of Semprún's contribution; secondly, to correct certain biographical inaccuracies, possibly due to a hasty translation; and thirdly, to complement Semprún's valuable contribution, essentially concerning the meaning of Europe. This text takes that comment as its starting point.

  13. Probability Machines: Consistent Probability Estimation Using Nonparametric Learning Machines

    Science.gov (United States)

    Malley, J. D.; Kruppa, J.; Dasgupta, A.; Malley, K. G.; Ziegler, A.

    2011-01-01

    Summary Background Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. Objectives The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Methods Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Results Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Conclusions Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications. PMID:21915433
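
    The paper's examples are provided as R packages; a rough Python analogue of the same idea, using a regression forest on a 0/1 outcome so that its predictions estimate individual probabilities (the dataset and settings below are illustrative, not those of the article):

      import numpy as np
      from sklearn.datasets import load_breast_cancer
      from sklearn.ensemble import RandomForestRegressor
      from sklearn.model_selection import train_test_split

      # Binary outcome treated as a 0/1 regression target: the regression forest's
      # prediction is then an estimate of P(Y = 1 | X), i.e. a "probability machine".
      X, y = load_breast_cancer(return_X_y=True)
      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

      forest = RandomForestRegressor(n_estimators=500, random_state=0)
      forest.fit(X_tr, y_tr)

      prob = np.clip(forest.predict(X_te), 0.0, 1.0)  # individual probability estimates
      print(prob[:5].round(2))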

  14. Probable Inference and Quantum Mechanics

    International Nuclear Information System (INIS)

    Grandy, W. T. Jr.

    2009-01-01

    In its current very successful interpretation the quantum theory is fundamentally statistical in nature. Although commonly viewed as a probability amplitude whose (complex) square is a probability, the wavefunction or state vector continues to defy consensus as to its exact meaning, primarily because it is not a physical observable. Rather than approach this problem directly, it is suggested that it is first necessary to clarify the precise role of probability theory in quantum mechanics, either as applied to, or as an intrinsic part of the quantum theory. When all is said and done the unsurprising conclusion is that quantum mechanics does not constitute a logic and probability unto itself, but adheres to the long-established rules of classical probability theory while providing a means within itself for calculating the relevant probabilities. In addition, the wavefunction is seen to be a description of the quantum state assigned by an observer based on definite information, such that the same state must be assigned by any other observer based on the same information, in much the same way that probabilities are assigned.

  15. Failure probability under parameter uncertainty.

    Science.gov (United States)

    Gerrard, R; Tsanakas, A

    2011-05-01

    In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decision-maker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications. © 2010 Society for Risk Analysis.
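
    A small simulation makes the effect concrete. The sketch below is only an illustration of the abstract's point, with an arbitrary log-normal risk factor, sample size and nominal level; it is not the paper's closed-form calculation.

      import numpy as np

      rng = np.random.default_rng(1)
      mu, sigma = 0.0, 1.0            # true (but unknown) log-normal parameters
      n, nominal, trials = 30, 0.01, 100_000

      failures = 0
      for _ in range(trials):
          data = rng.lognormal(mu, sigma, n)
          m, s = np.log(data).mean(), np.log(data).std(ddof=1)
          # Threshold set at the estimated 99% quantile (z_0.99 ~ 2.326).
          threshold = np.exp(m + 2.326 * s)
          failures += rng.lognormal(mu, sigma) > threshold

      # The realized failure frequency is typically above the nominal 1%,
      # illustrating how parameter uncertainty inflates the failure probability.
      print(failures / trials)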

  16. Probability with applications and R

    CERN Document Server

    Dobrow, Robert P

    2013-01-01

    An introduction to probability at the undergraduate level Chance and randomness are encountered on a daily basis. Authored by a highly qualified professor in the field, Probability: With Applications and R delves into the theories and applications essential to obtaining a thorough understanding of probability. With real-life examples and thoughtful exercises from fields as diverse as biology, computer science, cryptology, ecology, public health, and sports, the book is accessible for a variety of readers. The book's emphasis on simulation through the use of the popular R software language c

  17. A philosophical essay on probabilities

    CERN Document Server

    Laplace, Marquis de

    1996-01-01

    A classic of science, this famous essay by "the Newton of France" introduces lay readers to the concepts and uses of probability theory. It is of especial interest today as an application of mathematical techniques to problems in social and biological sciences. Generally recognized as the founder of the modern phase of probability theory, Laplace here applies the principles and general results of his theory "to the most important questions of life, which are, in effect, for the most part, problems in probability." Thus, without the use of higher mathematics, he demonstrates the application

  18. Logic, probability, and human reasoning.

    Science.gov (United States)

    Johnson-Laird, P N; Khemlani, Sangeet S; Goodwin, Geoffrey P

    2015-04-01

    This review addresses the long-standing puzzle of how logic and probability fit together in human reasoning. Many cognitive scientists argue that conventional logic cannot underlie deductions, because it never requires valid conclusions to be withdrawn - not even if they are false; it treats conditional assertions implausibly; and it yields many vapid, although valid, conclusions. A new paradigm of probability logic allows conclusions to be withdrawn and treats conditionals more plausibly, although it does not address the problem of vapidity. The theory of mental models solves all of these problems. It explains how people reason about probabilities and postulates that the machinery for reasoning is itself probabilistic. Recent investigations accordingly suggest a way to integrate probability and deduction. Copyright © 2015 Elsevier Ltd. All rights reserved.

  19. Free probability and random matrices

    CERN Document Server

    Mingo, James A

    2017-01-01

    This volume opens the world of free probability to a wide variety of readers. From its roots in the theory of operator algebras, free probability has intertwined with non-crossing partitions, random matrices, applications in wireless communications, representation theory of large groups, quantum groups, the invariant subspace problem, large deviations, subfactors, and beyond. This book puts a special emphasis on the relation of free probability to random matrices, but also touches upon the operator algebraic, combinatorial, and analytic aspects of the theory. The book serves as a combination textbook/research monograph, with self-contained chapters, exercises scattered throughout the text, and coverage of important ongoing progress of the theory. It will appeal to graduate students and all mathematicians interested in random matrices and free probability from the point of view of operator algebras, combinatorics, analytic functions, or applications in engineering and statistical physics.

  20. Introduction to probability and measure

    CERN Document Server

    Parthasarathy, K R

    2005-01-01

    According to a remark attributed to Mark Kac 'Probability Theory is a measure theory with a soul'. This book with its choice of proofs, remarks, examples and exercises has been prepared taking both these aesthetic and practical aspects into account.

  1. Joint probabilities and quantum cognition

    International Nuclear Information System (INIS)

    Acacio de Barros, J.

    2012-01-01

    In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.

  2. Joint probabilities and quantum cognition

    Energy Technology Data Exchange (ETDEWEB)

    Acacio de Barros, J. [Liberal Studies, 1600 Holloway Ave., San Francisco State University, San Francisco, CA 94132 (United States)

    2012-12-18

    In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.

  3. Default probabilities and default correlations

    OpenAIRE

    Erlenmaier, Ulrich; Gersbach, Hans

    2001-01-01

    Starting from the Merton framework for firm defaults, we provide the analytics and robustness of the relationship between default probabilities and default correlations. We show that loans with higher default probabilities will not only have higher variances but also higher correlations between loans. As a consequence, portfolio standard deviation can increase substantially when loan default probabilities rise. This result has two important implications. First, relative prices of loans with different default probabili...

  4. The Probabilities of Unique Events

    Science.gov (United States)

    2012-08-30

    Washington, DC USA Max Lotstein and Phil Johnson-Laird Department of Psychology Princeton University Princeton, NJ USA August 30th 2012...social justice and also participated in antinuclear demonstrations. The participants ranked the probability that Linda is a feminist bank teller as...retorted that such a flagrant violation of the probability calculus was a result of a psychological experiment that obscured the rationality of the

  5. Probability Matching, Fast and Slow

    OpenAIRE

    Koehler, Derek J.; James, Greta

    2014-01-01

    A prominent point of contention among researchers regarding the interpretation of probability-matching behavior is whether it represents a cognitively sophisticated, adaptive response to the inherent uncertainty of the tasks or settings in which it is observed, or whether instead it represents a fundamental shortcoming in the heuristics that support and guide human decision making. Put crudely, researchers disagree on whether probability matching is "smart" or "dumb." Here, we consider eviden...

  6. El CO2 como disolvente y como reactivo

    OpenAIRE

    La Franca Pitarresi, Vincenzo Rosario

    2016-01-01

    There are numerous advantages associated with the use of CO2, both as a solvent and as a reagent, and all of them can be summarized in four general categories: environmental benefits, health and safety benefits, process benefits and chemical benefits. Processes involving CO2 as a solvent would not increase CO2 emissions; rather, they would provide an opportunity for recycling waste CO2. In addition, efforts to sequester the CO2 produced from co...

  7. The Effect of Malnutrition on the Pharmacokinetics and Virologic Outcomes of Lopinavir, Efavirenz and Nevirapine in Food Insecure HIV-Infected Children in Tororo, Uganda

    Science.gov (United States)

    Bartelink, Imke H.; Savic, Rada M.; Dorsey, Grant; Ruel, Theodore; Gingrich, David; Scherpbier, Henriette J.; Capparelli, Edmund; Jullien, Vincent; Young, Sera L.; Achan, Jane; Plenty, Albert; Charlebois, Edwin; Kamya, Moses; Havlir, Diane; Aweeka, Francesca

    2014-01-01

    Background Malnutrition may impact the pharmacokinetics (PK) of antiretroviral medications and virologic responses in HIV-infected children. We therefore evaluated the PK of nevirapine (NVP), efavirenz (EFV) and lopinavir (LPV) in associations with nutritional status in a cohort of HIV-infected Ugandan children. Methods Sparse dried blood spot (DBS) samples from Ugandan children were used to estimate plasma concentrations. Historical PK data from children from three resource-rich countries (RRC) were utilized to develop the PK models. Results Concentrations in 330 DBS from 163 Ugandan children aged 0.7–7 years were analyzed in reference to plasma PK data (1189 samples) from 204 children from RRC aged 0.5–12 years. Among Ugandan children, 48% were malnourished (underweight, thin or stunted). Compared to RRC, Ugandan children exhibited reduced bioavailability of EFV and LPV; 11% (P=0.045) and 18% (P=0.008) respectively. In contrast, NVP bioavailability was 46% higher in Ugandan children. Children receiving LPV, EFV or NVP had comparable risk of virologic failure. Among children on NVP, low height and weight for age Z scores were associated with reduced risk of virologic failure (P=0.034 and P=0.068, respectively). Ugandan children demonstrated lower EFV and LPV and higher NVP exposure compared to children in RRC, perhaps reflecting the consequence of malnutrition on bioavailability. In children receiving NVP, the relation between exposure, malnutrition and outcome turned out to be marginally significant. Further investigations are warranted using more intensive PK measurements and adequate adherence assessments, to further assess causes of virologic failure in Ugandan children. PMID:25742090

  8. Systematic Approach for the Formulation and Optimization of Solid Lipid Nanoparticles of Efavirenz by High Pressure Homogenization Using Design of Experiments for Brain Targeting and Enhanced Bioavailability

    Science.gov (United States)

    Gupta, Shweta; Kesarla, Rajesh; Chotai, Narendra; Misra, Ambikanandan

    2017-01-01

    The nonnucleoside reverse transcriptase inhibitors, used for the treatment of HIV infections, are reported to have low bioavailability pertaining to high first-pass metabolism, high protein binding, and enzymatic metabolism. They also show low permeability across blood brain barrier. The CNS is reported to be the most important HIV reservoir site. In the present study, solid lipid nanoparticles of efavirenz were prepared with the objective of providing increased permeability and protection of drug due to biocompatible lipidic content and nanoscale size and thus developing formulation having potential for enhanced bioavailability and brain targeting. Solid lipid nanoparticles were prepared by high pressure homogenization technique using a systematic approach of design of experiments (DoE) and evaluated for particle size, polydispersity index, zeta potential, and entrapment efficiency. Particles of average size 108.5 nm having PDI of 0.172 with 64.9% entrapment efficiency were produced. Zeta potential was found to be −21.2 mV and the formulation was found stable. The in-vivo pharmacokinetic studies revealed increased concentration of the drug in brain, as desired, when administered through intranasal route indicating its potential for an attempt towards complete eradication of HIV and cure of HIV-infected patients. PMID:28243600

  9. Probably not future prediction using probability and statistical inference

    CERN Document Server

    Dworsky, Lawrence N

    2008-01-01

    An engaging, entertaining, and informative introduction to probability and prediction in our everyday lives Although Probably Not deals with probability and statistics, it is not heavily mathematical and is not filled with complex derivations, proofs, and theoretical problem sets. This book unveils the world of statistics through questions such as what is known based upon the information at hand and what can be expected to happen. While learning essential concepts including "the confidence factor" and "random walks," readers will be entertained and intrigued as they move from chapter to chapter. Moreover, the author provides a foundation of basic principles to guide decision making in almost all facets of life including playing games, developing winning business strategies, and managing personal finances. Much of the book is organized around easy-to-follow examples that address common, everyday issues such as: How travel time is affected by congestion, driving speed, and traffic lights Why different gambling ...

  10. Normal probability plots with confidence.

    Science.gov (United States)

    Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang

    2015-01-01

    Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means to judge whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal probability plot based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
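
    The construction can be mimicked by simulation. The sketch below is a generic calibration of simultaneous bands for the order statistics of a standard normal sample, not the authors' exact intervals; the sample size, number of simulations and search grid are arbitrary choices.

      import numpy as np

      def simultaneous_bands(n, alpha=0.05, B=5000, seed=0):
          """Simulation-calibrated joint 1-alpha bands for sorted N(0,1) samples."""
          rng = np.random.default_rng(seed)
          sims = np.sort(rng.standard_normal((B, n)), axis=1)
          # Shrink the per-position tail level until all n positions are covered
          # jointly in at least (1 - alpha) of the simulated samples.
          for beta in np.linspace(alpha, alpha / (10 * n), 200):
              lo = np.quantile(sims, beta / 2, axis=0)
              hi = np.quantile(sims, 1 - beta / 2, axis=0)
              if np.all((sims >= lo) & (sims <= hi), axis=1).mean() >= 1 - alpha:
                  break
          return lo, hi

      lo, hi = simultaneous_bands(n=30)
      x = np.sort(np.random.default_rng(1).standard_normal(30))  # standardized sample
      print(bool(np.all((x >= lo) & (x <= hi))))  # True: consistent with normality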

  11. A condicionalidade como zona conceitual

    Directory of Open Access Journals (Sweden)

    Taísa Peres de OLIVEIRA

    Full Text Available ABSTRACT This paper evaluates different patterns of conditional constructions in Portuguese on the basis of parameters of conditionality. The main goal is to show how the category is internally organized, not only in terms of a prototypical core but also by showing how its more peripheral members relate to that core. The theoretical foundations of the work are functional-cognitive, in the terms of Bybee (2010) and Dancygier (1998), especially as regards the relative instability of grammar and the fluidity of the category. The main conclusions point to conditionality as a rather complex category that serves as a home for multiple constructions.

  12. Probability theory a foundational course

    CERN Document Server

    Pakshirajan, R P

    2013-01-01

    This book shares the dictum of J. L. Doob in treating Probability Theory as a branch of Measure Theory and establishes this relation early. Probability measures in product spaces are introduced right at the start by way of laying the groundwork to later claim the existence of stochastic processes with prescribed finite dimensional distributions. Other topics analysed in the book include supports of probability measures, zero-one laws in product measure spaces, the Erdos-Kac invariance principle, the functional central limit theorem and functional law of the iterated logarithm for independent variables, Skorohod embedding, and the use of analytic functions of a complex variable in the study of geometric ergodicity in Markov chains. This book is offered as a textbook for students pursuing graduate programs in Mathematics and/or Statistics. The book aims to help the teacher present the theory with ease, and to help the student sustain his interest and joy in learning the subject.

  13. VIBRATION ISOLATION SYSTEM PROBABILITY ANALYSIS

    Directory of Open Access Journals (Sweden)

    Smirnov Vladimir Alexandrovich

    2012-10-01

    Full Text Available The article deals with the probability analysis for a vibration isolation system of high-precision equipment, which is extremely sensitive to low-frequency oscillations even of submicron amplitude. The external sources of low-frequency vibrations may include the natural city background or internal low-frequency sources inside buildings (pedestrian activity, HVAC). Taking Gauss distribution into account, the author estimates the probability of the relative displacement of the isolated mass being still lower than the vibration criteria. This problem is being solved in the three dimensional space, evolved by the system parameters, including damping and natural frequency. According to this probability distribution, the chance of exceeding the vibration criteria for a vibration isolation system is evaluated. Optimal system parameters - damping and natural frequency - are being developed, thus the possibility of exceeding vibration criteria VC-E and VC-D is assumed to be less than 0.04.
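
    For a zero-mean Gaussian response the probability in question reduces to a single error-function evaluation. The numbers in the sketch below (RMS displacement and criterion) are placeholders, not values from the article.

      from math import erf, sqrt

      def p_below(criterion, sigma):
          """P(|X| < criterion) for a zero-mean Gaussian displacement with RMS sigma."""
          return erf(criterion / (sigma * sqrt(2.0)))

      # Example: criterion of 6 um against an RMS relative displacement of 2 um.
      print(p_below(criterion=6.0, sigma=2.0))   # ~0.997, i.e. ~0.3% chance of exceedance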

  14. Approximation methods in probability theory

    CERN Document Server

    Čekanavičius, Vydas

    2016-01-01

    This book presents a wide range of well-known and less common methods used for estimating the accuracy of probabilistic approximations, including the Esseen type inversion formulas, the Stein method as well as the methods of convolutions and triangle function. Emphasising the correct usage of the methods presented, each step required for the proofs is examined in detail. As a result, this textbook provides valuable tools for proving approximation theorems. While Approximation Methods in Probability Theory will appeal to everyone interested in limit theorems of probability theory, the book is particularly aimed at graduate students who have completed a standard intermediate course in probability theory. Furthermore, experienced researchers wanting to enlarge their toolkit will also find this book useful.

  15. Model uncertainty: Probabilities for models?

    International Nuclear Information System (INIS)

    Winkler, R.L.

    1994-01-01

    Like any other type of uncertainty, model uncertainty should be treated in terms of probabilities. The question is how to do this. The most commonly-used approach has a drawback related to the interpretation of the probabilities assigned to the models. If we step back and look at the big picture, asking what the appropriate focus of the model uncertainty question should be in the context of risk and decision analysis, we see that a different probabilistic approach makes more sense, although it raises some implementation questions. Current work that is underway to address these questions looks very promising.

  16. Knowledge typology for imprecise probabilities.

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, G. D. (Gregory D.); Zucker, L. J. (Lauren J.)

    2002-01-01

    When characterizing the reliability of a complex system there are often gaps in the data available for specific subsystems or other factors influencing total system reliability. At Los Alamos National Laboratory we employ ethnographic methods to elicit expert knowledge when traditional data is scarce. Typically, we elicit expert knowledge in probabilistic terms. This paper will explore how we might approach elicitation if methods other than probability (i.e., Dempster-Shafer, or fuzzy sets) prove more useful for quantifying certain types of expert knowledge. Specifically, we will consider if experts have different types of knowledge that may be better characterized in ways other than standard probability theory.

  17. Probability, Statistics, and Stochastic Processes

    CERN Document Server

    Olofsson, Peter

    2011-01-01

    A mathematical and intuitive approach to probability, statistics, and stochastic processes This textbook provides a unique, balanced approach to probability, statistics, and stochastic processes. Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area. This text combines a rigorous, calculus-based development of theory with a more intuitive approach that appeals to readers' sense of reason and logic, an approach developed through the author's many years of classroom experience. The text begins with three chapters that d

  18. Statistical probability tables CALENDF program

    International Nuclear Information System (INIS)

    Ribon, P.

    1989-01-01

    The purpose of the probability tables is to obtain a dense data representation and to calculate integrals by quadratures. They are mainly used in the USA for calculations by Monte Carlo and in the USSR and Europe for self-shielding calculations by the sub-group method. The moment probability tables, in addition to providing a more substantial mathematical basis and calculation methods, are adapted for condensation and mixture calculations, which are the crucial operations for reactor physics specialists. However, their extension is limited by the statistical hypothesis they imply. Efforts are being made to remove this obstacle, at the cost, it must be said, of greater complexity

  19. Probability, statistics, and queueing theory

    CERN Document Server

    Allen, Arnold O

    1990-01-01

    This is a textbook on applied probability and statistics with computer science applications for students at the upper undergraduate level. It may also be used as a self study book for the practicing computer science professional. The successful first edition of this book proved extremely useful to students who need to use probability, statistics and queueing theory to solve problems in other fields, such as engineering, physics, operations research, and management science. The book has also been successfully used for courses in queueing theory for operations research students. This second edit

  20. Probability and Statistics: 5 Questions

    DEFF Research Database (Denmark)

    Probability and Statistics: 5 Questions is a collection of short interviews based on 5 questions presented to some of the most influential and prominent scholars in probability and statistics. We hear their views on the fields, aims, scopes, the future direction of research and how their work fits in these respects. Interviews with Nick Bingham, Luc Bovens, Terrence L. Fine, Haim Gaifman, Donald Gillies, James Hawthorne, Carl Hoefer, James M. Joyce, Joseph B. Kadane, Isaac Levi, D.H. Mellor, Patrick Suppes, Jan von Plato, Carl Wagner, Sandy Zabell...

  1. Dynamic SEP event probability forecasts

    Science.gov (United States)

    Kahler, S. W.; Ling, A.

    2015-10-01

    The forecasting of solar energetic particle (SEP) event probabilities at Earth has been based primarily on the estimates of magnetic free energy in active regions and on the observations of peak fluxes and fluences of large (≥ M2) solar X-ray flares. These forecasts are typically issued for the next 24 h or with no definite expiration time, which can be deficient for time-critical operations when no SEP event appears following a large X-ray flare. It is therefore important to decrease the event probability forecast with time as a SEP event fails to appear. We use the NOAA listing of major (≥10 pfu) SEP events from 1976 to 2014 to plot the delay times from X-ray peaks to SEP threshold onsets as a function of solar source longitude. An algorithm is derived to decrease the SEP event probabilities with time when no event is observed to reach the 10 pfu threshold. In addition, we use known SEP event size distributions to modify probability forecasts when SEP intensity increases occur below the 10 pfu event threshold. An algorithm to provide a dynamic SEP event forecast, Pd, for both situations of SEP intensities following a large flare is derived.
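
    One simple way to realize such a decaying forecast, shown purely as a sketch and not as the authors' algorithm, is a Bayesian update against the delay-time distribution: if an event would arrive with delay T, then after waiting t hours with no onset Pd(t) = P0*S(t) / (P0*S(t) + 1 - P0), where S(t) = P(T > t). The exponential delay with a 12 h mean below is a placeholder for the empirical NOAA delay-time distribution.

      import math

      def dynamic_probability(p0, t_hours, mean_delay_hours=12.0):
          """Decay an initial SEP event probability p0 as time passes with no onset."""
          s = math.exp(-t_hours / mean_delay_hours)   # survival function of the delay
          return p0 * s / (p0 * s + (1.0 - p0))

      for t in (0, 6, 12, 24, 48):
          print(t, round(dynamic_probability(0.5, t), 3))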

  2. Conditional Independence in Applied Probability.

    Science.gov (United States)

    Pfeiffer, Paul E.

    This material assumes the user has the background provided by a good undergraduate course in applied probability. It is felt that introductory courses in calculus, linear algebra, and perhaps some differential equations should provide the requisite experience and proficiency with mathematical concepts, notation, and argument. The document is…

  3. Stretching Probability Explorations with Geoboards

    Science.gov (United States)

    Wheeler, Ann; Champion, Joe

    2016-01-01

    Students are faced with many transitions in their middle school mathematics classes. To build knowledge, skills, and confidence in the key areas of algebra and geometry, students often need to practice using numbers and polygons in a variety of contexts. Teachers also want students to explore ideas from probability and statistics. Teachers know…

  4. GPS: Geometry, Probability, and Statistics

    Science.gov (United States)

    Field, Mike

    2012-01-01

    It might be said that for most occupations there is now less of a need for mathematics than there was say fifty years ago. But, the author argues, geometry, probability, and statistics constitute essential knowledge for everyone. Maybe not the geometry of Euclid, but certainly geometrical ways of thinking that might enable us to describe the world…

  5. Swedish earthquakes and acceleration probabilities

    International Nuclear Information System (INIS)

    Slunga, R.

    1979-03-01

    A method to assign probabilities to ground accelerations for Swedish sites is described. As hardly any nearfield instrumental data is available we are left with the problem of interpreting macroseismic data in terms of acceleration. By theoretical wave propagation computations the relation between seismic strength of the earthquake, focal depth, distance and ground accelerations is calculated. We found that most Swedish earthquake of the area, the 1904 earthquake 100 km south of Oslo, is an exception and probably had a focal depth exceeding 25 km. For the nuclear power plant sites an annual probability of 10⁻⁵ has been proposed as interesting. This probability gives ground accelerations in the range 5-20 % for the sites. This acceleration is for a free bedrock site. For consistency all acceleration results in this study are given for bedrock sites. When applying our model to the 1904 earthquake and assuming the focal zone to be in the lower crust we get the epicentral acceleration of this earthquake to be 5-15 % g. The results above are based on an analysis of macroseismic data as relevant instrumental data is lacking. However, the macroseismic acceleration model deduced in this study gives epicentral ground acceleration of small Swedish earthquakes in agreement with existent distant instrumental data. (author)

  6. DECOFF Probabilities of Failed Operations

    DEFF Research Database (Denmark)

    Gintautas, Tomas

    2015-01-01

    A statistical procedure of estimation of Probabilities of Failed Operations is described and exemplified using ECMWF weather forecasts and SIMO output from Rotor Lift test case models. Also safety factor influence is investigated. DECOFF statistical method is benchmarked against standard Alpha-factor...

  7. Risk estimation using probability machines

    Science.gov (United States)

    2014-01-01

    Background Logistic regression has been the de facto, and often the only, model used in the description and analysis of relationships between a binary outcome and observed features. It is widely used to obtain the conditional probabilities of the outcome given predictors, as well as predictor effect size estimates using conditional odds ratios. Results We show how statistical learning machines for binary outcomes, provably consistent for the nonparametric regression problem, can be used to provide both consistent conditional probability estimation and conditional effect size estimates. Effect size estimates from learning machines leverage our understanding of counterfactual arguments central to the interpretation of such estimates. We show that, if the data generating model is logistic, we can recover accurate probability predictions and effect size estimates with nearly the same efficiency as a correct logistic model, both for main effects and interactions. We also propose a method using learning machines to scan for possible interaction effects quickly and efficiently. Simulations using random forest probability machines are presented. Conclusions The models we propose make no assumptions about the data structure, and capture the patterns in the data by just specifying the predictors involved and not any particular model structure. So they do not run the same risks of model mis-specification and the resultant estimation biases as a logistic model. This methodology, which we call a “risk machine”, will share properties from the statistical machine that it is derived from. PMID:24581306
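
    The counterfactual flavour of the effect-size estimates can be imitated with any probability machine: predict each individual's risk with a binary predictor set to 1 and then to 0, and average the difference. The Python sketch below uses a scikit-learn random forest on simulated data; it stands in for, and is not, the authors' R-based risk machine.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier

      rng = np.random.default_rng(0)
      n = 3000
      x1 = rng.integers(0, 2, n)                  # binary exposure of interest
      x2 = rng.normal(size=n)                     # continuous covariate
      p = 1.0 / (1.0 + np.exp(-(-1.0 + 1.2 * x1 + 0.8 * x2)))
      y = rng.binomial(1, p)
      X = np.column_stack([x1, x2])

      rf = RandomForestClassifier(n_estimators=500, min_samples_leaf=25, random_state=0)
      rf.fit(X, y)

      # Counterfactual contrast: same covariates, exposure forced to 1 versus 0.
      X1, X0 = X.copy(), X.copy()
      X1[:, 0], X0[:, 0] = 1, 0
      risk_diff = rf.predict_proba(X1)[:, 1] - rf.predict_proba(X0)[:, 1]
      print(round(float(risk_diff.mean()), 3))    # average risk difference for the exposure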

  8. Probability and statistics: A reminder

    International Nuclear Information System (INIS)

    Clement, B.

    2013-01-01

    The main purpose of these lectures is to provide the reader with the tools needed for data analysis in the framework of physics experiments. Basic concepts are introduced together with examples of application in experimental physics. The lecture is divided into two parts: probability and statistics. It is built on the introduction from 'data analysis in experimental sciences' given in [1]. (authors)

  9. Nash equilibrium with lower probabilities

    DEFF Research Database (Denmark)

    Groes, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte

    1998-01-01

    We generalize the concept of Nash equilibrium in mixed strategies for strategic form games to allow for ambiguity in the players' expectations. In contrast to other contributions, we model ambiguity by means of so-called lower probability measures or belief functions, which makes it possible...

  10. On probability-possibility transformations

    Science.gov (United States)

    Klir, George J.; Parviz, Behzad

    1992-01-01

    Several probability-possibility transformations are compared in terms of the closeness of preserving second-order properties. The comparison is based on experimental results obtained by computer simulation. Two second-order properties are involved in this study: noninteraction of two distributions and projections of a joint distribution.

  11. The effect of malnutrition on the pharmacokinetics and virologic outcomes of lopinavir, efavirenz and nevirapine in food insecure HIV-infected children in Tororo, Uganda.

    Science.gov (United States)

    Bartelink, Imke H; Savic, Rada M; Dorsey, Grant; Ruel, Theodore; Gingrich, David; Scherpbier, Henriette J; Capparelli, Edmund; Jullien, Vincent; Young, Sera L; Achan, Jane; Plenty, Albert; Charlebois, Edwin; Kamya, Moses; Havlir, Diane; Aweeka, Francesca

    2015-03-01

    Malnutrition may impact the pharmacokinetics (PKs) of antiretroviral medications and virologic responses in HIV-infected children. The authors therefore evaluated the PK of nevirapine (NVP), efavirenz (EFV) and lopinavir (LPV) in associations with nutritional status in a cohort of HIV-infected Ugandan children. Sparse dried blood spot samples from Ugandan children were used to estimate plasma concentrations. Historical PK data from children from 3 resource-rich countries (RRC) were utilized to develop the PK models. Concentrations in 330 dried blood spots from 163 Ugandan children aged 0.7-7 years were analyzed in reference to plasma PK data (1189 samples) from 204 children from RRC aged 0.5-12 years. Among Ugandan children, 48% were malnourished (underweight, thin or stunted). Compared to RRC, Ugandan children exhibited reduced bioavailability of EFV and LPV; 11% (P=0.045) and 18% (P=0.008), respectively. In contrast, NVP bioavailability was 46% higher in Ugandan children. Children receiving LPV, EFV or NVP had comparable risk of virologic failure. Among children on NVP, low height and weight for age Z scores were associated with reduced risk of virologic failure (P=0.034, P=0.068, respectively). Ugandan children demonstrated lower EFV and LPV and higher NVP exposure compared to children in RRC, perhaps reflecting the consequence of malnutrition on bioavailability. In children receiving NVP, the relation between exposure, malnutrition and outcome turned out to be marginally significant. Further investigations are warranted using more intensive PK measurements and adequate adherence assessments, to further assess causes of virologic failure in Ugandan children.

  12. Quantitative analysis of the effect of zidovudine, efavirenz, and ritonavir on insulin aggregation by multivariate curve resolution alternating least squares of infrared spectra

    International Nuclear Information System (INIS)

    Martí-Aluja, Idoia; Ruisánchez, Itziar; Larrechi, M. Soledad

    2013-01-01

    Highlights: ► The structure of insulin can be changed via interaction with antiretroviral drugs. ► The chemical interaction promotes the formation of aggregates. ► This drug effect was evaluated by MCR-ALS coupled to IR spectroscopy. ► Formation of aggregates was favourable if drugs were able to form hydrogen bonds. ► Higher drug concentrations favoured formation of amorphous aggregates. - Abstract: Quantification of the effect of antiretroviral drugs on the insulin aggregation process is an important area of research due to the serious metabolic diseases observed in AIDS patients after prolonged treatment with these drugs. In this work, multivariate curve resolution alternating least squares (MCR-ALS) was applied to infrared monitoring of the insulin aggregation process in the presence of three antiretroviral drugs to quantify their effect. To evidence concentration dependence in this process, mixtures at two different insulin:drug molar ratios were used. The interaction between insulin and each drug was analysed by 1 H NMR spectroscopy. In all cases, the aggregation process was monitored during 45 min by infrared spectroscopy. The aggregates were further characterised by scanning electron microscopy (SEM). MCR-ALS provided the spectral and concentration profiles of the different insulin–drug conformations that are involved in the process. Their feasible band boundaries were calculated using the MCR-BANDS methodology. The kinetic profiles describe the aggregation pathway and the spectral profiles characterise the conformations involved. The retrieved results show that each of the three drugs modifies insulin conformation in a different way, promoting the formation of aggregates. Ritonavir shows the strongest promotion of aggregation, followed by efavirenz and zidovudine. In the studied concentration range, concentration dependence was only observed for zidovudine, with shorter aggregation time obtained as the amount of zidovudine increased. This factor

  13. Role of Rilpivirine and Etravirine in Efavirenz and Nevirapine-Based Regimens Failure in a Resource-Limited Country: A Cross- Sectional Study.

    Directory of Open Access Journals (Sweden)

    Phairote Teeranaipong

    Full Text Available Etravirine (ETR) can be used for patients who have failed an NNRTI-based regimen. In Thailand, ETR is approximately 45 times more expensive than rilpivirine (RPV). However, there are no data on RPV use in NNRTI failure. Therefore, we assessed the susceptibility and mutation patterns of first-line NNRTI failure and the possibility of using RPV compared to ETR in patients who have failed efavirenz (EFV)- and nevirapine (NVP)-based regimens. Clinical samples with confirmed virological failure from EFV- or NVP-based regimens were retrospectively analyzed. Resistance-associated mutations (RAMs) were interpreted by IAS-USA Drug Resistance Mutations. Susceptibility of ETR and RPV was interpreted by DUET, the Monogram scoring system, and the Stanford University HIV Drug Resistance Database. 1,279 and 528 patients failed EFV- and NVP-based regimens, respectively. Y181C was the most common NVP-associated RAM (54.3% vs. 14.7%, p<0.01). K103N was the most common EFV-associated RAM (56.5% vs. 19.1%, p<0.01). The results from all three scoring systems were concordant. 165 (11.1%) and 161 (10.9%) patients who failed an NVP-based regimen were susceptible to ETR and RPV, respectively (p = 0.85). 195 (32.2%) and 191 (31.6%) patients who failed an EFV-based regimen were susceptible to ETR and RPV, respectively (p = 0.79). The susceptibility of ETR and RPV in EFV failure was significantly higher than in NVP failure (p<0.01). The mutation patterns for ETR and RPV were similar, but 32% and 11% of patients who failed EFV- and NVP-based regimens, respectively, were susceptible to RPV. This finding suggests that RPV can be used as an alternative antiretroviral agent in patients who have failed an EFV-based regimen.

  14. Genotypic evaluation of etravirine sensitivity of clinical human immunodeficiency virus type 1 (HIV-1) isolates carrying resistance mutations to nevirapine and efavirenz.

    Science.gov (United States)

    Oumar, A A; Jnaoui, K; Kabamba-Mukadi, B; Yombi, J C; Vandercam, B; Goubau, P; Ruelle, J

    2010-01-01

    Etravirine is a second-generation non-nucleoside reverse transcriptase inhibitor (NNRTI) with a pattern of resistance mutations quite distinct from the current NNRTIs. We collected all routine samples of HIV-1 patients followed in the AIDS reference laboratory of UCLouvain (in 2006 and 2007) carrying resistance-associated mutations to nevirapine (NVP) or efavirenz (EFV). The sensitivity to Etravirine was estimated using three different drug resistance algorithms: ANRS (July 2008), IAS (December 2008) and Stanford (November 2008). We also verified whether the mutations described as resistance mutations are not due to virus polymorphisms by the study of 58 genotypes of NNRTI-naive patients. Sixty-one samples harboured resistance to NVP and EFV: 41/61 had at least one resistance mutation to Etravirine according to ANRS-IAS algorithms; 42/61 samples had at least one resistance mutation to Etravirine according to the Stanford algorithm. 48 and 53 cases were fully sensitive to Etravirine according to ANRS-IAS and Stanford algorithms, respectively. Three cases harboured more than three mutations and presented a pattern of high-degree resistance to Etravirine according to the ANRS-IAS algorithm, while one case harboured more than three mutations and presented high-degree resistance to Etravirine according to the Stanford algorithm. The V106I and V179D mutations were more frequent in the ARV-naive group than in the NNRTI-experienced one. According to the currently available algorithms, Etravirine can still be used in the majority of patients with virus showing resistance to NVP and/or EFV, if a combination of other active drugs is included.

  15. Changes in Liver Steatosis After Switching From Efavirenz to Raltegravir Among Human Immunodeficiency Virus-Infected Patients With Nonalcoholic Fatty Liver Disease.

    Science.gov (United States)

    Macías, Juan; Mancebo, María; Merino, Dolores; Téllez, Francisco; Montes-Ramírez, M Luisa; Pulido, Federico; Rivero-Juárez, Antonio; Raffo, Miguel; Pérez-Pérez, Montserrat; Merchante, Nicolás; Cotarelo, Manuel; Pineda, Juan A

    2017-09-15

    Antiretroviral drugs with a lower potential to induce hepatic steatosis in human immunodeficiency virus (HIV) infection need to be identified. We compared the effect of switching efavirenz (EFV) to raltegravir (RAL) on hepatic steatosis among HIV-infected patients with nonalcoholic fatty liver disease (NAFLD) receiving EFV plus 2 nucleoside analogues. HIV-infected patients on EFV plus tenofovir/emtricitabine or abacavir/lamivudine with NAFLD were randomized 1:1 to switch from EFV to RAL (400 mg twice daily), maintaining nucleoside analogues unchanged, or to continue with EFV plus 2 nucleoside analogues. At baseline, eligible patients should show controlled attenuation parameter (CAP) values ≥238 dB/m. Changes in hepatic steatosis at 48 weeks of follow-up over baseline levels were measured by CAP. Overall, 39 patients were included, and 19 of them were randomized to switch to RAL. At week 48, median CAP for the RAL group was 250 (Q1-Q3, 221-277) dB/m and 286 (Q1-Q3, 269-314) dB/m for the EFV group (P = .035). The median change in CAP values was -20 (Q1-Q3, -67 to 15) dB/m for the RAL arm and 30 (Q1-Q3, -17 to 49) dB/m for the EFV group (P = .011). Patients who switched to RAL showed a greater decline in hepatic steatosis, as measured by CAP, compared with those continuing with EFV. In addition, the proportion of patients without significant hepatic steatosis after 48 weeks was greater for those who switched to RAL. NCT01900015. © The Author 2017. Published by Oxford University Press for the Infectious Diseases Society of America. All rights reserved. For permissions, e-mail: journals.permissions@oup.com.

  16. Effectiveness and Safety of Generic Fixed-Dose Combination of Tenofovir/Emtricitabine/Efavirenz in HIV-1-Infected Patients in Western India.

    Science.gov (United States)

    Pujari, Sanjay; Dravid, Ameet; Gupte, Nikhil; Joshix, Kedar; Bele, Vivek

    2008-08-20

    To assess effectiveness and safety of a generic fixed-dose combination of tenofovir (TDF)/emtricitabine (FTC)/efavirenz (EFV) among HIV-1-infected patients in Western India. Antiretroviral (ARV)-naive and experienced (thymidine analog nucleoside reverse transcriptase inhibitor [tNRTI] replaced by TDF) patients were started on a regimen of 1 TDF/FTC/EFV pill once a day. They were followed clinically on a periodic basis, and viral loads and CD4 counts were measured at 6 and 12 months. Creatinine clearance was calculated at baseline and at 6 months and/or as clinically indicated. Effectiveness was defined as not having to discontinue the regimen due to failure or toxicity. One hundred forty-one patients who started TDF/FTC/EFV before 1 June 2007 were eligible. Of these, 130 (92.2%) and 44 (31.2%) had 6- and 12-month follow-up, respectively. Thirty-five percent of the patients were ARV-naive. Eleven patients discontinued treatment (4 for virologic failure, 1 for grade 3-4 central nervous system disturbances, 4 for grade 3-4 renal toxicity, and 2 for cost). Ninety-six percent of patients were virologically suppressed at 6 months. Frequency of TDF-associated grade 3-4 renal toxicity was 2.8%; however, 3 of these patients had comorbid conditions associated with renal dysfunction. A fixed-dose combination of generic TDF/FTC/EFV is effective in ARV-naive and experienced patients. Although frequency of severe renal toxicity was higher than has been reported in the literature, it was safe in patients with no comorbid renal conditions.

  17. Development and validation of a liquid chromatography-MS/MS method for simultaneous quantification of tenofovir and efavirenz in biological tissues and fluids.

    Science.gov (United States)

    Barreiros, Luisa; Cunha-Reis, Cassilda; Silva, Eduarda M P; Carvalho, Joana R B; das Neves, José; Sarmento, Bruno; Segundo, Marcela A

    2017-03-20

    Millions of people worldwide live with human immunodeficiency virus (HIV) infection, thus justifying the continuous search for new prevention and treatment strategies, including topical microbicide products combining antiretroviral drugs (ARVs) such as tenofovir (TFV) and efavirenz (EFV). Therefore, the aim of this work was to develop and validate a high performance liquid chromatography method coupled to triple quadrupole-tandem mass spectrometry (HPLC-MS/MS) for the quantification of TFV and EFV in biological matrices (mouse vaginal tissue, vaginal lavage and blood plasma). Chromatographic separation was achieved using a reversed phase C18 column (3 μm, 100 × 2.1 mm) at 45°C and elution in gradient mode using a combination of 0.1% (v/v) formic acid in water and 0.1% (v/v) formic acid in acetonitrile at 0.35 mL/min. Total run time was 9 min, with retention times of 2.8 and 4.1 min for TFV and EFV, respectively. The MS was operated in positive ionization mode (ESI+) for TFV and in negative ionization mode (ESI-) for EFV detection. Data were acquired in selected reaction monitoring (SRM) mode and deuterated ARVs were employed as internal standards. Calibration curves were linear for ARV concentrations ranging from 4 to 500 ng/mL, with LOD and LOQ for both analytes ≤0.4 and ≤0.7 ng/mL in sample extracts, respectively. The method was found to be specific, accurate (96.0-106.0% of nominal values) and precise. Recoveries from the biological tissues and fluids were ≥88.4%. Matrix effects were observed for EFV determination in tissue and plasma extracts but compensated by the use of deuterated internal standards. The proposed methodology was successfully applied to a pharmacokinetic study following intravaginal administration of both ARVs. Copyright © 2016 Elsevier B.V. All rights reserved.
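
    As a generic illustration of how LOD and LOQ figures of this kind can be derived from a calibration curve, the sketch below uses the common 3.3·σ/S and 10·σ/S criteria, which are not necessarily the criteria applied by the authors; the concentrations and responses are made up.

      import numpy as np

      conc = np.array([4, 10, 25, 50, 100, 250, 500], dtype=float)          # ng/mL
      resp = np.array([3.1e3, 7.8e3, 1.9e4, 3.9e4, 7.7e4, 1.94e5, 3.88e5])  # peak areas

      slope, intercept = np.polyfit(conc, resp, 1)
      resid_sd = np.std(resp - (slope * conc + intercept), ddof=2)  # residual sigma

      lod = 3.3 * resid_sd / slope
      loq = 10.0 * resid_sd / slope
      print(round(lod, 2), round(loq, 2))                           # ng/mL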

  18. Brief Report: CYP2B6 516G>T Minor Allele Protective of Late Virologic Failure in Efavirenz-Treated HIV-Infected Patients in Botswana.

    Science.gov (United States)

    Vujkovic, Marijana; Bellamy, Scarlett L; Zuppa, Athena F; Gastonguay, Marc; Moorthy, Ganesh S; Ratshaa, Bakgaki R; Han, Xiaoyan; Steenhoff, Andrew P; Mosepele, Mosepele; Strom, Brian L; Aplenc, Richard; Bisson, Gregory P; Gross, Robert

    2017-08-01

    CYP2B6 polymorphisms that affect efavirenz (EFV) concentrations are common, but the effect of this polymorphism on HIV virologic failure in clinical practice settings has not fully been elucidated. Our objective was to investigate the relationship between the CYP2B6 516G>T genotype and late virologic failure in patients treated with EFV in Gaborone, Botswana. We performed a case-control study that included 1338 HIV-infected black Batswana on EFV-based antiretroviral therapy (ART). Patients were approached for enrollment during regular visits at one of the outpatient HIV clinics between July 2013 and April 2014. Cases experienced late HIV failure, defined as plasma HIV RNA >1000 copies/mL after maintaining viral suppression on ART for at least 6 months. Logistic regression was used to determine the adjusted odds of late HIV failure by 516G>T genotype. After adjustment for the confounding variables age and CD4 count, the CYP2B6 516 T-allele was protective against late HIV virologic breakthrough, adjusted OR 0.70; 95% CI: 0.50 to 0.97. The CYP2B6 516 T-allele was protective against late virologic breakthrough in patients with initial (6 month) HIV RNA suppression on EFV-based ART. Future studies are needed to assess long-term viral benefits of identifying and offering EFV-containing ART to black African HIV-infected patients with CYP2B6 T-alleles, especially given the wider availability of a single-pill EFV in this setting.
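
    The adjusted odds ratio reported here is the standard output of a logistic regression on case-control data. The sketch below reproduces only the form of that analysis on simulated data; the coefficients, sample size and statsmodels usage are assumptions of the sketch, not the study's data or code.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(0)
      n = 1500
      t_allele = rng.integers(0, 2, n)                 # carries CYP2B6 516T: yes/no
      age = rng.normal(38, 10, n)
      cd4 = rng.normal(350, 120, n)
      logit = -2.0 - 0.35 * t_allele + 0.01 * (age - 38) - 0.002 * (cd4 - 350)
      failure = rng.binomial(1, 1 / (1 + np.exp(-logit)))

      X = sm.add_constant(np.column_stack([t_allele, age, cd4]))
      fit = sm.Logit(failure, X).fit(disp=0)
      adj_or = np.exp(fit.params[1])                   # adjusted OR for the T allele
      ci = np.exp(fit.conf_int()[1])                   # its 95% confidence interval
      print(round(adj_or, 2), np.round(ci, 2))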

  19. Role of Rilpivirine and Etravirine in Efavirenz and Nevirapine-Based Regimens Failure in a Resource-Limited Country: A Cross- Sectional Study.

    Science.gov (United States)

    Teeranaipong, Phairote; Sirivichayakul, Sunee; Mekprasan, Suwanna; Ohata, Pirapon June; Avihingsanon, Anchalee; Ruxrungtham, Kiat; Putcharoen, Opass

    2016-01-01

    Etravirine(ETR) can be used for patients who have failed NNRTI-based regimen. In Thailand, ETR is approximately 45 times more expensive than rilpivirine(RPV). However, there are no data of RPV use in NNRTI failure. Therefore, we assessed the susceptibility and mutation patterns of first line NNRTI failure and the possibility of using RPV compared to ETV in patients who have failed efavirenz(EFV)- and nevirapine(NVP)-based regimens. Clinical samples with confirmed virological failure from EFV- or NVP-based regimens were retrospectively analyzed. Resistance-associated mutations (RAMs) were interpreted by IAS-USA Drug Resistance Mutations. Susceptibility of ETR and RPV were interpreted by DUET, Monogram scoring system, and Stanford University HIV Drug Resistance Database. 1,279 and 528 patients failed EFV- and NVP-based regimens, respectively. Y181C was the most common NVP-associated RAM (54.3% vs. 14.7%, p<0.01). K103N was the most common EFV-associated RAM (56.5% vs. 19.1%, P<0.01). The results from all three scoring systems were concordant. 165(11.1%) and 161(10.9%) patients who failed NVP-based regimen were susceptible to ETR and RPV, respectively (p = 0.85). 195 (32.2%) and 191 (31.6%) patients who failed EFV-based regimen, were susceptible to ETR and RPV, respectively (p = 0.79). The susceptibility of ETV and RPV in EFV failure was significantly higher than NVP failure (p<0.01). The mutation patterns for ETR and RPV were similar but 32% and 11% of patients who failed EFV and NVP -based regimen, respectivly were susceptible to RPV. This finding suggests that RPV can be used as the alternative antiretroviral agent in patients who have failed EFV-based regimen.

  20. Outcomes for efavirenz versus nevirapine-containing regimens for treatment of HIV-1 infection: a systematic review and meta-analysis.

    Directory of Open Access Journals (Sweden)

    Prinitha Pillay

    Full Text Available There is conflicting evidence and practice regarding the use of the non-nucleoside reverse transcriptase inhibitors (NNRTIs) efavirenz (EFV) and nevirapine (NVP) in first-line antiretroviral therapy (ART). We systematically reviewed virological outcomes in HIV-1 infected, treatment-naive patients on regimens containing EFV versus NVP from randomised trials and observational cohort studies. Data sources include PubMed, Embase, the Cochrane Central Register of Controlled Trials and conference proceedings of the International AIDS Society and the Conference on Retroviruses and Opportunistic Infections, from 1996 to May 2013. Relative risks (RR) and 95% confidence intervals were synthesized using random-effects meta-analysis. Heterogeneity was assessed using the I² statistic, and subgroup analyses were performed to assess the potential influence of study design, duration of follow-up, location, and tuberculosis treatment. Sensitivity analyses explored the potential influence of different dosages of NVP and different viral load thresholds. Of 5011 citations retrieved, 38 reports of studies comprising 114 391 patients were included for review. EFV was significantly less likely than NVP to lead to virologic failure in both trials (RR 0.85 [0.73-0.99], I² = 0%) and observational studies (RR 0.65 [0.59-0.71], I² = 54%). EFV was more likely to achieve virologic success than NVP, though marginally significant, in both randomised controlled trials (RR 1.04 [1.00-1.08], I² = 0%) and observational studies (RR 1.06 [1.00-1.12], I² = 68%). EFV-based first-line ART is significantly less likely to lead to virologic failure compared to NVP-based ART. This finding supports the use of EFV as the preferred NNRTI in first-line treatment regimens for HIV treatment, particularly in resource-limited settings.
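
    For readers unfamiliar with the pooling step, the sketch below shows a DerSimonian-Laird random-effects combination of log relative risks with the I² statistic. The study RRs and confidence intervals are placeholders, not the trials and cohorts summarized in this review.

      import numpy as np

      rr = np.array([0.80, 0.90, 0.70, 0.85])         # hypothetical study RRs
      lo = np.array([0.65, 0.75, 0.55, 0.70])         # hypothetical 95% CI lower bounds
      hi = np.array([0.98, 1.08, 0.89, 1.03])         # hypothetical 95% CI upper bounds

      y = np.log(rr)
      se = (np.log(hi) - np.log(lo)) / (2 * 1.96)     # SE of log RR from the CI width
      w = 1 / se**2                                   # inverse-variance (fixed-effect) weights

      q = np.sum(w * (y - np.sum(w * y) / np.sum(w))**2)   # Cochran's Q
      df = len(y) - 1
      i2 = max(0.0, (q - df) / q) * 100                    # I^2 heterogeneity, in percent
      tau2 = max(0.0, (q - df) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
      w_re = 1 / (se**2 + tau2)                            # random-effects weights

      pooled = np.sum(w_re * y) / np.sum(w_re)
      half = 1.96 * np.sqrt(1 / np.sum(w_re))
      print(np.exp(pooled), np.exp(pooled - half), np.exp(pooled + half), round(i2, 1))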

  1. Large deviations and idempotent probability

    CERN Document Server

    Puhalskii, Anatolii

    2001-01-01

    In the view of many probabilists, author Anatolii Puhalskii's research results stand among the most significant achievements in the modern theory of large deviations. In fact, his work marked a turning point in the depth of our understanding of the connections between the large deviation principle (LDP) and well-known methods for establishing weak convergence results. Large Deviations and Idempotent Probability expounds upon the recent methodology of building large deviation theory along the lines of weak convergence theory. The author develops an idempotent (or maxitive) probability theory, introduces idempotent analogues of martingales (maxingales), Wiener and Poisson processes, and Ito differential equations, and studies their properties. The large deviation principle for stochastic processes is formulated as a certain type of convergence of stochastic processes to idempotent processes. The author calls this large deviation convergence. The approach to establishing large deviation convergence uses novel com...

  2. Probability biases as Bayesian inference

    Directory of Open Access Journals (Sweden)

    Andre; C. R. Martins

    2006-11-01

    Full Text Available In this article, I will show how several observed biases in human probabilistic reasoning can be partially explained as good heuristics for making inferences in an environment where probabilities have uncertainties associated with them. Previous results show that the weight functions and the observed violations of coalescing and stochastic dominance can be understood from a Bayesian point of view. We will review those results and see that Bayesian methods should also be used as part of the explanation behind other known biases. That means that, although the observed errors are still errors, they can be understood as adaptations to the solution of real life problems. Heuristics that allow fast evaluations and mimic a Bayesian inference would be an evolutionary advantage, since they would give us an efficient way of making decisions. In that sense, it should be no surprise that humans reason with probability as it has been observed.

  3. Probability matching and strategy availability.

    Science.gov (United States)

    Koehler, Derek J; James, Greta

    2010-09-01

    Findings from two experiments indicate that probability matching in sequential choice arises from an asymmetry in strategy availability: The matching strategy comes readily to mind, whereas a superior alternative strategy, maximizing, does not. First, compared with the minority who spontaneously engage in maximizing, the majority of participants endorse maximizing as superior to matching in a direct comparison when both strategies are described. Second, when the maximizing strategy is brought to their attention, more participants subsequently engage in maximizing. Third, matchers are more likely than maximizers to base decisions in other tasks on their initial intuitions, suggesting that they are more inclined to use a choice strategy that comes to mind quickly. These results indicate that a substantial subset of probability matchers are victims of "underthinking" rather than "overthinking": They fail to engage in sufficient deliberation to generate a superior alternative to the matching strategy that comes so readily to mind.

  4. Probability as a Physical Motive

    Directory of Open Access Journals (Sweden)

    Peter Martin

    2007-04-01

    Full Text Available Recent theoretical progress in nonequilibrium thermodynamics, linking the physical principle of Maximum Entropy Production (“MEP”) to the information-theoretical “MaxEnt” principle of scientific inference, together with conjectures from theoretical physics that there may be no fundamental causal laws but only probabilities for physical processes, and from evolutionary theory that biological systems expand “the adjacent possible” as rapidly as possible, all lend credence to the proposition that probability should be recognized as a fundamental physical motive. It is further proposed that spatial order and temporal order are two aspects of the same thing, and that this is the essence of the second law of thermodynamics.

  5. Logic, Probability, and Human Reasoning

    Science.gov (United States)

    2015-01-01

    accordingly suggest a way to integrate probability and deduction. The nature of deductive reasoning To be rational is to be able to make deductions...3–6] and they underlie mathematics, science, and technology [7–10]. Plato claimed that emotions upset reasoning. However, individuals in the grip...fundamental to human rationality. So, if counterexamples to its principal predictions occur, the theory will at least explain its own refutation

  6. Probability Measures on Groups IX

    CERN Document Server

    1989-01-01

    The latest in this series of Oberwolfach conferences focussed on the interplay between structural probability theory and various other areas of pure and applied mathematics such as Tauberian theory, infinite-dimensional rotation groups, central limit theorems, harmonizable processes, and spherical data. Thus it was attended by mathematicians whose research interests range from number theory to quantum physics in conjunction with structural properties of probabilistic phenomena. This volume contains 5 survey articles submitted on special invitation and 25 original research papers.

  7. Probability matching and strategy availability

    OpenAIRE

    Koehler, Derek J.; James, Greta

    2010-01-01

    Findings from two experiments indicate that probability matching in sequential choice arises from an asymmetry in strategy availability: The matching strategy comes readily to mind, whereas a superior alternative strategy, maximizing, does not. First, compared with the minority who spontaneously engage in maximizing, the majority of participants endorse maximizing as superior to matching in a direct comparison when both strategies are described. Second, when the maximizing strategy is brought...

  8. Montagem e Imagem como Paradigma

    Directory of Open Access Journals (Sweden)

    Cesar Huapaya

    2016-01-01

    Full Text Available Thinking through montage and image became a revealing method in the practical and theoretical study processes of artists and researchers in the twentieth and twenty-first centuries. This article seeks to articulate three ways of thinking through montage: in the works of Bertolt Brecht, Sergei Eisenstein and Georges Didi-Huberman. The philosopher and art historian Georges Didi-Huberman reopens the debate and the exercise of thinking the anthropology of the image, and montage as metalanguage and a form of knowledge.

  9. La ciudad como ecosistema urbano

    OpenAIRE

    Higueras García, Esther

    2013-01-01

    THE CITY AS AN URBAN ECOSYSTEM .- Ecology and ecosystems .- The urban ecosystem: definition, scope and opportunity .- Urban metabolism .- The symptoms of urban pathology .- The objectives of the new urban ecosystem .- The contributions of eco-neighbourhoods

  10. [Biometric bases: basic concepts of probability calculation].

    Science.gov (United States)

    Dinya, E

    1998-04-26

    The author gives an outline of the basic concepts of probability theory. The bases of event algebra, the definition of probability, the classical probability model and the random variable are presented.

  11. Probability for Weather and Climate

    Science.gov (United States)

    Smith, L. A.

    2013-12-01

    Over the last 60 years, the availability of large-scale electronic computers has stimulated rapid and significant advances both in meteorology and in our understanding of the Earth System as a whole. The speed of these advances was due, in large part, to the sudden ability to explore nonlinear systems of equations. The computer allows the meteorologist to carry a physical argument to its conclusion; the time scales of weather phenomena then allow the refinement of physical theory, numerical approximation or both in light of new observations. Prior to this extension, as Charney noted, the practicing meteorologist could ignore the results of theory with good conscience. Today, neither the practicing meteorologist nor the practicing climatologist can do so, but to what extent, and in what contexts, should they place the insights of theory above quantitative simulation? And in what circumstances can one confidently estimate the probability of events in the world from model-based simulations? Despite solid advances of theory and insight made possible by the computer, the fidelity of our models of climate differs in kind from the fidelity of models of weather. While all prediction is extrapolation in time, weather resembles interpolation in state space, while climate change is fundamentally an extrapolation. The trichotomy of simulation, observation and theory which has proven essential in meteorology will remain incomplete in climate science. Operationally, the roles of probability, indeed the kinds of probability one has access to, are different in operational weather forecasting and climate services. Significant barriers to forming probability forecasts (which can be used rationally as probabilities) are identified. Monte Carlo ensembles can explore sensitivity, diversity, and (sometimes) the likely impact of measurement uncertainty and structural model error. The aims of different ensemble strategies, and fundamental differences in ensemble design to support of

  12. Probability, Statistics, and Stochastic Processes

    CERN Document Server

    Olofsson, Peter

    2012-01-01

    This book provides a unique and balanced approach to probability, statistics, and stochastic processes.   Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area.  The Second Edition features new coverage of analysis of variance (ANOVA), consistency and efficiency of estimators, asymptotic theory for maximum likelihood estimators, empirical distribution function and the Kolmogorov-Smirnov test, general linear models, multiple comparisons, Markov chain Monte Carlo (MCMC), Brownian motion, martingales, and

  13. Probability, statistics, and computational science.

    Science.gov (United States)

    Beerenwinkel, Niko; Siebourg, Juliane

    2012-01-01

    In this chapter, we review basic concepts from probability theory and computational statistics that are fundamental to evolutionary genomics. We provide a very basic introduction to statistical modeling and discuss general principles, including maximum likelihood and Bayesian inference. Markov chains, hidden Markov models, and Bayesian network models are introduced in more detail as they occur frequently and in many variations in genomics applications. In particular, we discuss efficient inference algorithms and methods for learning these models from partially observed data. Several simple examples are given throughout the text, some of which point to models that are discussed in more detail in subsequent chapters.

  14. Sensitivity analysis using probability bounding

    International Nuclear Information System (INIS)

    Ferson, Scott; Troy Tucker, W.

    2006-01-01

    Probability bounds analysis (PBA) provides analysts a convenient means to characterize the neighborhood of possible results that would be obtained from plausible alternative inputs in probabilistic calculations. We show the relationship between PBA and the methods of interval analysis and probabilistic uncertainty analysis from which it is jointly derived, and indicate how the method can be used to assess the quality of probabilistic models such as those developed in Monte Carlo simulations for risk analyses. We also illustrate how a sensitivity analysis can be conducted within a PBA by pinching inputs to precise distributions or real values
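
    A minimal sketch of the pinching idea follows, assuming a toy series-system model with two independent component failure probabilities; the model and the interval endpoints are made up for illustration, not taken from the paper. The input intervals are propagated, then one input is pinched to a point value and the narrowing of the output interval is observed.

```python
# Propagate interval bounds on two independent failure probabilities through
# a simple series-system model, then pinch one input to a point value and
# compare how much the output interval narrows.

def series_failure(p1, p2):
    """P(system fails) = 1 - (1 - p1)(1 - p2) for two independent components."""
    return 1.0 - (1.0 - p1) * (1.0 - p2)

def bounds(p1_lo, p1_hi, p2_lo, p2_hi):
    # series_failure is monotone increasing in both arguments, so the output
    # interval endpoints come from the input interval endpoints.
    return series_failure(p1_lo, p2_lo), series_failure(p1_hi, p2_hi)

wide = bounds(0.01, 0.05, 0.02, 0.10)        # both inputs uncertain
pinched = bounds(0.01, 0.05, 0.06, 0.06)     # second input pinched to 0.06

print("output bounds, both uncertain:", [round(x, 4) for x in wide])
print("output bounds, p2 pinched    :", [round(x, 4) for x in pinched])
print("width reduction:", round((wide[1] - wide[0]) - (pinched[1] - pinched[0]), 4))
```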

  15. LA CUESTIÓN AGRARIA COMO ENFOQUE Y COMO PROBLEMA

    Directory of Open Access Journals (Sweden)

    Carlos Salgado

    2000-01-01

    Full Text Available This essay seeks to draw attention to the problems of the countryside today, placing special emphasis on the disruption between the technological reasons (which are taken to be ideological), the economic reasons and the political reasons that have inspired agrarian development, or rather, on which agricultural growth policies have been based.

  16. Lectures on probability and statistics

    International Nuclear Information System (INIS)

    Yost, G.P.

    1984-09-01

    These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. We begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. We finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another
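
    The forward/inverse distinction drawn in these lectures can be illustrated in a few lines. The sketch below computes an a priori dice probability and then estimates face probabilities from simulated rolls by relative frequency; the number of simulated rolls is an arbitrary illustrative choice.

```python
from fractions import Fraction
from itertools import product
from collections import Counter
import random

# Forward problem: a priori probability of a specified outcome for two fair
# dice, e.g. that the faces sum to 7.
outcomes = list(product(range(1, 7), repeat=2))
p_sum7 = Fraction(sum(1 for a, b in outcomes if a + b == 7), len(outcomes))
print("P(sum = 7) =", p_sum7)          # 1/6

# Inverse (statistical) problem: given observed rolls of a single die,
# estimate the probability of each face by its relative frequency
# (the maximum-likelihood estimate). The simulated data are illustrative.
rng = random.Random(1)
rolls = [rng.randint(1, 6) for _ in range(600)]
counts = Counter(rolls)
estimates = {face: counts[face] / len(rolls) for face in range(1, 7)}
print("estimated face probabilities:", estimates)
```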

  17. Probability theory a comprehensive course

    CERN Document Server

    Klenke, Achim

    2014-01-01

    This second edition of the popular textbook contains a comprehensive course in modern probability theory. Overall, probabilistic concepts play an increasingly important role in mathematics, physics, biology, financial engineering and computer science. They help us in understanding magnetism, amorphous media, genetic diversity and the perils of random developments at financial markets, and they guide us in constructing more efficient algorithms.   To address these concepts, the title covers a wide variety of topics, many of which are not usually found in introductory textbooks, such as:   • limit theorems for sums of random variables • martingales • percolation • Markov chains and electrical networks • construction of stochastic processes • Poisson point process and infinite divisibility • large deviation principles and statistical physics • Brownian motion • stochastic integral and stochastic differential equations. The theory is developed rigorously and in a self-contained way, with the c...

  18. Pilot study of once-daily simplification therapy with abacavir/lamivudine/zidovudine and efavirenz for treatment of HIV-1 infection.

    Science.gov (United States)

    Ruane, Peter; Lang, Joseph; DeJesus, Edwin; Berger, Daniel S; Dretler, Robin; Rodriguez, Allan; Ward, Douglas J; Lim, Michael L; Liao, Qiming; Reddy, Sunila; Clair, Marty St; Vila, Tania; Shaefer, Mark S

    2006-01-01

    The purpose of this pilot study was to explore the efficacy and safety of the abacavir/lamivudine/zidovudine fixed-dose combination tablet administered as two tablets once daily (qd) versus one tablet twice daily (bid) in combination with efavirenz (EFV). This was a prospective, randomized, open-label, multicenter study with a 24-week treatment period in 7 outpatient HIV clinics in the United States. Patients currently receiving an initial regimen of abacavir/lamivudine/zidovudine bid plus EFV qd for at least 6 months with HIV-1 RNA or = 200 cells/mm3 were eligible. Thirty-six patients enrolled, and 35 (97%) completed the study. Participants were randomized to switch to 2 tablets of abacavir/lamivudine/zidovudine qd plus EFV qd (QD arm) or continue current treatment (BID arm) for 24 weeks. Efficacy, safety, and adherence were evaluated. Median baseline CD4+ cell count was 521 cells/mm3. At week 24, HIV-1 RNA or = 0.29 to +0.18, p = 1.000). At week 24, median CD4+ cell count change from baseline was +26 cells/mm3 for the QD arm and -39 cells/mm3 for BID arm. One patient randomized to the QD arm met virologic failure criteria (confirmed HIV-1 RNA >120 copies/mL) at week 20 and viral genotype showed M184V. After failure, this patient revealed he never took EFV throughout the entire study after randomization, effectively receiving only abacavir/lamivudine/zidovudine qd alone. Median adherence was slightly higher in the QD arm, although both arms had broad variability and overlapping interquartile ranges. Adverse events were infrequent and occurred with similar frequency between arms; treatment-related adverse events were abdominal pain, flatulence, nausea, headache, and abnormal dreams (1 patient [3%] for each adverse event). No patients withdrew due to adverse events, and no abacavir hypersensitivity reactions were reported. In this pilot study of patients suppressed on abacavir/lamivudine/zidovudine bid plus EFV, 94% of participants switching to abacavir

  19. Virological failure of staggered and simultaneous treatment interruption in HIV patients who began Efavirenz-based regimens after allergic reactions to nevirapine

    Directory of Open Access Journals (Sweden)

    Siripassorn Krittaecho

    2013-01-01

    Full Text Available Abstract Objective The objective of this work was to study the virological outcomes associated with two different types of treatment interruption strategies in patients with allergic reactions to nevirapine (NVP). We compared the virological outcomes of (1) HIV-1-infected patients who discontinued an initial NVP-based regimen because of cutaneous allergic reactions to NVP; different types of interruption strategies were used, and second-line regimen was based on efavirenz (EFV); and (2) HIV-1-infected patients who began an EFV-based regimen as a first-line therapy (controls). Methods This retrospective cohort included patients who began an EFV-based regimen, between January 2002 and December 2008, as either an initial regimen or as a subsequent regimen after resolving a cutaneous allergic reaction against an initial NVP-based regimen. The study ended in March 2010. The primary outcome was virological failure, which was defined as either (a) two consecutive plasma HIV-1 RNA levels >400 copies/mL or (b) a plasma HIV-1 RNA level >1,000 copies/mL plus any genotypic resistance mutation. Results A total of 559 patients were stratified into three groups: (a) Simultaneous Interruption, in which the subjects simultaneously discontinued all the drugs in an NVP-based regimen following an allergic reaction (n=161); (b) Staggered Interruption, in which the subjects discontinued NVP treatment while continuing nucleoside reverse transcriptase inhibitor (NRTI) backbone therapy for a median of 7 days (n=82); and (c) Control, in which the subjects were naïve to antiretroviral therapy (n=316). The overall median follow-up time was 43 months. Incidence of virological failure in Simultaneous Interruption was 12.9 cases per 1,000 person-years, which trended toward being higher than the incidences in Staggered Interruption (5.4) and Control (6.6). However, differences were not statistically significant. Conclusions Among the patients who had an acute allergic reaction to first

  20. Current Efavirenz (EFV) or ritonavir-boosted lopinavir (LPV/r) use correlates with elevated markers of atherosclerosis in HIV-infected subjects in Addis Ababa, Ethiopia.

    Directory of Open Access Journals (Sweden)

    Rudolph L Gleason

    Full Text Available HIV patients on antiretroviral therapy have shown elevated incidence of dyslipidemia, lipodystrophy, and cardiovascular disease (CVD). Most studies, however, focus on cohorts from developed countries, with less data available for these co-morbidities in Ethiopia and sub-Saharan Africa. Adult HIV-negative (n = 36), treatment-naïve (n = 51), efavirenz (EFV)-treated (n = 91), nevirapine (NVP)-treated (n = 95), or ritonavir-boosted lopinavir (LPV/r)-treated (n = 44) subjects were recruited from Black Lion Hospital in Addis Ababa, Ethiopia. Aortic pressure, augmentation pressure, and pulse wave velocity (PWV) were measured via applanation tonometry, and carotid intima-media thickness (cIMT), carotid arterial stiffness, and brachial artery flow-mediated dilation (FMD) were measured via non-invasive ultrasound. Body mass index, waist-to-hip circumference ratio (WHR), skinfold thickness, and self-reported fat redistribution were used to quantify lipodystrophy. CD4+ cell count, plasma HIV RNA levels, fasting glucose, total-, HDL-, and LDL-cholesterol, triglycerides, hsCRP, sVCAM-1, sICAM-1, leptin and complete blood count were measured. PWV and normalized cIMT were elevated and FMD impaired in EFV- and LPV/r-treated subjects compared to NVP-treated subjects; normalized cIMT was also elevated and FMD impaired in the EFV- and LPV/r-treated subjects compared to treatment-naïve subjects. cIMT was not statistically different across groups. Treated subjects exhibited elevated markers of dyslipidemia, inflammation, and lipodystrophy. PWV was associated with age, current EFV and LPV/r use, heart rate, blood pressure, triglycerides, LDL, and hsCRP; FMD with age, HIV duration, WHR, and glucose; and cIMT with age, current EFV use, skinfold thickness, and blood pressure. Current EFV- or LPV/r-treatment, but not NVP-treatment, correlated with elevated markers of atherosclerosis, which may involve mechanisms distinct from traditional risk factors.

  1. Los ancianos como actores sociales

    Directory of Open Access Journals (Sweden)

    Mª PIA ARENYS

    1996-01-01

    Full Text Available This article gathers the group discussions held over four months in Barcelona in preparation for the "II congreso de la gent gran" (the city's second congress of older people), held in November 1993. The discussions took place at the offices of each district, after presentation of the position paper by a specialist. The group members are engaged and committed older people who belong to the social welfare council of the Barcelona city government. The methodology is qualitative discourse analysis, structured around the following points: under the heading "older people as citizens with rights and obligations", the topics covered are the appraisal of old age, the family, retirement, the implications of older people as citizens with rights and duties, and the social functions of older people. On these topics, the participants expressed their opinions, which were summarized in the final wording of the position paper. The materials analysed here offer a first-hand account of what older people think about "the appraisal of old age".

  2. Excluding joint probabilities from quantum theory

    Science.gov (United States)

    Allahverdyan, Armen E.; Danageozian, Arshag

    2018-03-01

    Quantum theory does not provide a unique definition for the joint probability of two noncommuting observables, which is the next important question after Born's probability for a single observable. Instead, various definitions were suggested, e.g., via quasiprobabilities or via hidden-variable theories. After reviewing open issues of the joint probability, we relate it to quantum imprecise probabilities, which are noncontextual and are consistent with all constraints expected from a quantum probability. We study two noncommuting observables in a two-dimensional Hilbert space and show that there is no precise joint probability that applies for any quantum state and is consistent with imprecise probabilities. This contrasts with theorems by Bell and Kochen-Specker that exclude joint probabilities for more than two noncommuting observables, in Hilbert space with dimension larger than two. If measurement contexts are included into the definition, joint probabilities are not excluded anymore, but they are still constrained by imprecise probabilities.

  3. La proyección como proceso y como mecanismo

    OpenAIRE

    Brusset, Bernard

    2001-01-01

    Projection is at once an elementary mechanism that testifies to the fragility of the defensive organization and a process whose role is fundamental in psychic functioning. In this double aspect, does projection retain specificities so different in neurosis and in psychosis that they should be considered fundamentally different? Or can the differences of context, of level of psychic functioning, of place and of function explain the clinical differences, justifying ...

  4. Probability theory and mathematical statistics for engineers

    CERN Document Server

    Pugachev, V S

    1984-01-01

    Probability Theory and Mathematical Statistics for Engineers focuses on the concepts of probability theory and mathematical statistics for finite-dimensional random variables.The publication first underscores the probabilities of events, random variables, and numerical characteristics of random variables. Discussions focus on canonical expansions of random vectors, second-order moments of random vectors, generalization of the density concept, entropy of a distribution, direct evaluation of probabilities, and conditional probabilities. The text then examines projections of random vector

  5. Introduction to probability theory with contemporary applications

    CERN Document Server

    Helms, Lester L

    2010-01-01

    This introduction to probability theory transforms a highly abstract subject into a series of coherent concepts. Its extensive discussions and clear examples, written in plain language, expose students to the rules and methods of probability. Suitable for an introductory probability course, this volume requires abstract and conceptual thinking skills and a background in calculus.Topics include classical probability, set theory, axioms, probability functions, random and independent random variables, expected values, and covariance and correlations. Additional subjects include stochastic process

  6. Efavirenz or nevirapine in three-drug combination therapy with two nucleoside or nucleotide-reverse transcriptase inhibitors for initial treatment of HIV infection in antiretroviral-naïve individuals.

    Science.gov (United States)

    Mbuagbaw, Lawrence; Mursleen, Sara; Irlam, James H; Spaulding, Alicen B; Rutherford, George W; Siegfried, Nandi

    2016-12-10

    The advent of highly active antiretroviral therapy (ART) has reduced the morbidity and mortality due to HIV infection. The World Health Organization (WHO) ART guidelines focus on three classes of antiretroviral drugs, namely nucleoside or nucleotide reverse transcriptase inhibitors (NRTI), non-nucleoside reverse transcriptase inhibitors (NNRTI) and protease inhibitors. Two of the most common medications given as first-line treatment are the NNRTIs, efavirenz (EFV) and nevirapine (NVP). It is unclear which NNRTI is more efficacious for initial therapy. This systematic review was first published in 2010. To determine which non-nucleoside reverse transcriptase inhibitor, either EFV or NVP, is more effective in suppressing viral load when given in combination with two nucleoside reverse transcriptase inhibitors as part of initial antiretroviral therapy for HIV infection in adults and children. We attempted to identify all relevant studies, regardless of language or publication status, in electronic databases and conference proceedings up to 12 August 2016. We searched MEDLINE, Embase, the Cochrane Central Register of Controlled Trials (CENTRAL), the World Health Organization (WHO) International Clinical Trials Registry Platform (ICTRP) and ClinicalTrials.gov to 12 August 2016. We searched LILACS (Latin American and Caribbean Health Sciences Literature) and the Web of Science from 1996 to 12 August 2016. We checked the National Library of Medicine (NLM) Gateway from 1996 to 2009, as it was no longer available after 2009. We included all randomized controlled trials (RCTs) that compared EFV to NVP in people with HIV without prior exposure to ART, irrespective of the dosage or NRTIs given in combination. The primary outcome of interest was virological success. Other primary outcomes included mortality, clinical progression to AIDS, severe adverse events, and discontinuation of therapy for any reason. Secondary outcomes were change in CD4 count, treatment failure

  7. K-forbidden transition probabilities

    International Nuclear Information System (INIS)

    Saitoh, T.R.; Sletten, G.; Bark, R.A.; Hagemann, G.B.; Herskind, B.; Saitoh-Hashimoto, N.; Tsukuba Univ., Ibaraki

    2000-01-01

    Reduced hindrance factors of K-forbidden transitions are compiled for nuclei with A ≈ 180 where γ-vibrational states are observed. Correlations between these reduced hindrance factors and Coriolis forces, statistical level mixing and γ-softness have been studied. It is demonstrated that the K-forbidden transition probabilities are related to γ-softness. The decay of the high-K bandheads has been studied by means of the two-state mixing, which would be induced by the γ-softness, with the use of a number of K-forbidden transitions compiled in the present work, where high-K bandheads are depopulated by both E2 and ΔI=1 transitions. The validity of the two-state mixing scheme has been examined by using the proposed identity of the B(M1)/B(E2) ratios of transitions depopulating high-K bandheads and levels of low-K bands. A breakdown of the identity might indicate that other levels would mediate transitions between high- and low-K states. (orig.)

  8. Direct probability mapping of contaminants

    International Nuclear Information System (INIS)

    Rautman, C.A.

    1993-01-01

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. Geostatistical simulation provides powerful tools for investigating contaminant levels, and in particular, for identifying and using the spatial interrelationships among a set of isolated sample values. This additional information can be used to assess the likelihood of encountering contamination at unsampled locations and to evaluate the risk associated with decisions to remediate or not to remediate specific regions within a site. Past operation of the DOE Feed Materials Production Center has contaminated a site near Fernald, Ohio, with natural uranium. Soil geochemical data have been collected as part of the Uranium-in-Soils Integrated Demonstration Project. These data have been used to construct a number of stochastic images of potential contamination for parcels approximately the size of a selective remediation unit. Each such image accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely, statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination. Evaluation of the geostatistical simulations can yield maps representing the expected magnitude of the contamination for various regions and other information that may be important in determining a suitable remediation process or in sizing equipment to accomplish the restoration
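
    The post-processing step described above reduces, in its simplest form, to counting how often each grid cell exceeds a threshold across the ensemble of equally likely realizations. The sketch below uses random lognormal fields as stand-ins for geostatistical realizations; the grid size, ensemble size and threshold are illustrative assumptions, not values from the Fernald study.

```python
import numpy as np

# Post-process an ensemble of simulated contamination maps into a map of the
# probability of exceeding a cleanup threshold.

rng = np.random.default_rng(0)
n_real, nx, ny = 200, 20, 20
realizations = rng.lognormal(mean=3.0, sigma=0.5, size=(n_real, nx, ny))

threshold = 30.0                                             # illustrative units
exceedance_prob = (realizations > threshold).mean(axis=0)    # per-cell frequency

print("max exceedance probability :", float(exceedance_prob.max()))
print("cells with P(exceed) > 0.5 :", int((exceedance_prob > 0.5).sum()))
```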

  9. Psychophysics of the probability weighting function

    Science.gov (United States)

    Takahashi, Taiki

    2011-03-01

    A probability weighting function w(p) for an objective probability p in decision under risk plays a pivotal role in Kahneman-Tversky prospect theory. Although recent studies in econophysics and neuroeconomics widely utilized probability weighting functions, psychophysical foundations of the probability weighting functions have been unknown. Notably, a behavioral economist Prelec (1998) [4] axiomatically derived the probability weighting function w(p) = exp(-(-ln p)^α) (0 < α < 1; w(0) = 0, w(1/e) = 1/e, w(1) = 1), which has extensively been studied in behavioral neuroeconomics. The present study utilizes psychophysical theory to derive Prelec's probability weighting function from psychophysical laws of perceived waiting time in probabilistic choices. Also, the relations between the parameters in the probability weighting function and the probability discounting function in behavioral psychology are derived. Future directions in the application of the psychophysical theory of the probability weighting function in econophysics and neuroeconomics are discussed.
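
    For reference, the Prelec form can be evaluated directly. In the sketch below the curvature parameter alpha = 0.65 is an illustrative value, not an estimate from this study.

```python
import math

def prelec_w(p, alpha=0.65):
    """Prelec (1998) one-parameter probability weighting function
    w(p) = exp(-(-ln p)**alpha), defined for 0 <= p <= 1 with 0 < alpha < 1."""
    if p == 0.0:
        return 0.0
    return math.exp(-((-math.log(p)) ** alpha))

# Fixed points w(0) = 0, w(1/e) = 1/e, w(1) = 1, with overweighting of small
# probabilities and underweighting of large ones in between.
for p in (0.01, 0.1, 1 / math.e, 0.5, 0.9, 1.0):
    print(f"p = {p:.3f}  ->  w(p) = {prelec_w(p):.3f}")
```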

  10. THE BLACK HOLE FORMATION PROBABILITY

    Energy Technology Data Exchange (ETDEWEB)

    Clausen, Drew; Piro, Anthony L.; Ott, Christian D., E-mail: dclausen@tapir.caltech.edu [TAPIR, Walter Burke Institute for Theoretical Physics, California Institute of Technology, Mailcode 350-17, Pasadena, CA 91125 (United States)

    2015-02-01

    A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P_BH(M_ZAMS). Although we find that it is difficult to derive a unique P_BH(M_ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P_BH(M_ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P_BH(M_ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment.

  11. THE BLACK HOLE FORMATION PROBABILITY

    International Nuclear Information System (INIS)

    Clausen, Drew; Piro, Anthony L.; Ott, Christian D.

    2015-01-01

    A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P_BH(M_ZAMS). Although we find that it is difficult to derive a unique P_BH(M_ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P_BH(M_ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P_BH(M_ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment

  12. The Black Hole Formation Probability

    Science.gov (United States)

    Clausen, Drew; Piro, Anthony L.; Ott, Christian D.

    2015-02-01

    A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P_BH(M_ZAMS). Although we find that it is difficult to derive a unique P_BH(M_ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P_BH(M_ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P_BH(M_ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment.

  13. El dictado como tarea comunicativa

    Directory of Open Access Journals (Sweden)

    Daniel Cassany

    2004-01-01

    Full Text Available The article explores the didactic uses of dictation (the communicative practice of voicing or reading a written text aloud) for the functional learning of a first or foreign language at the various levels of education. After criticizing the traditional use of this linguistic exercise, we present eleven different ways of running a dictation in class, each with its own contents, objectives and methodology. We also analyse in detail the most traditional technique of teacher-led dictation, in which the teacher dictates a text word by word to the students with an emphasis on spelling, and we offer some guidelines for increasing the communicative component of this approach. The final conclusions propose understanding this technique as a varied, rich and suggestive methodological resource, adapted to each learning situation, and not as a compulsory, fossilized practice.

  14. Foundations of the theory of probability

    CERN Document Server

    Kolmogorov, AN

    2018-01-01

    This famous little book remains a foundational text for the understanding of probability theory, important both to students beginning a serious study of probability and to historians of modern mathematics. 1956 second edition.

  15. The Probability Distribution for a Biased Spinner

    Science.gov (United States)

    Foster, Colin

    2012-01-01

    This article advocates biased spinners as an engaging context for statistics students. Calculating the probability of a biased spinner landing on a particular side makes valuable connections between probability and other areas of mathematics. (Contains 2 figures and 1 table.)
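
    A biased spinner is straightforward to model: each side's probability is its arc length divided by the full circle. The sketch below computes the exact probabilities and checks them by simulation; the arc sizes are made up for illustration.

```python
from fractions import Fraction
import random

# A spinner whose sides subtend unequal arcs lands on each side with
# probability proportional to that side's arc.

arcs = {"A": 150, "B": 120, "C": 60, "D": 30}          # degrees, sum to 360
total = sum(arcs.values())
theoretical = {side: Fraction(a, total) for side, a in arcs.items()}
print("theoretical:", {s: str(p) for s, p in theoretical.items()})

# Frequentist check by simulation.
rng = random.Random(42)
sides, weights = zip(*arcs.items())
spins = rng.choices(sides, weights=weights, k=100_000)
print("simulated  :", {s: round(spins.count(s) / len(spins), 3) for s in sides})
```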

  16. Conditional Probability Modulates Visual Search Efficiency

    Directory of Open Access Journals (Sweden)

    Bryan eCort

    2013-10-01

    Full Text Available We investigated the effects of probability on visual search. Previous work has shown that people can utilize spatial and sequential probability information to improve target detection. We hypothesized that performance improvements from probability information would extend to the efficiency of visual search. Our task was a simple visual search in which the target was always present among a field of distractors, and could take one of two colors. The absolute probability of the target being either color was 0.5; however, the conditional probability – the likelihood of a particular color given a particular combination of two cues – varied from 0.1 to 0.9. We found that participants searched more efficiently for high conditional probability targets and less efficiently for low conditional probability targets, but only when they were explicitly informed of the probability relationship between cues and target color.

  17. Analytic Neutrino Oscillation Probabilities in Matter: Revisited

    Energy Technology Data Exchange (ETDEWEB)

    Parke, Stephen J. [Fermilab]; Denton, Peter B. [Copenhagen U.]; Minakata, Hisakazu [Madrid, IFT]

    2018-01-02

    We summarize our recent paper on neutrino oscillation probabilities in matter, explaining the importance, relevance and need for simple, highly accurate approximations to the neutrino oscillation probabilities in matter.

  18. Void probability scaling in hadron nucleus interactions

    International Nuclear Information System (INIS)

    Ghosh, Dipak; Deb, Argha; Bhattacharyya, Swarnapratim; Ghosh, Jayita; Bandyopadhyay, Prabhat; Das, Rupa; Mukherjee, Sima

    2002-01-01

    Heygi, while investigating the rapidity gap probability (that measures the chance of finding no particle in the pseudo-rapidity interval Δη), found that a scaling behavior in the rapidity gap probability has a close correspondence with the scaling of a void probability in galaxy correlation studies. The main aim in this paper is to study the scaling behavior of the rapidity gap probability

  19. Pre-Service Teachers' Conceptions of Probability

    Science.gov (United States)

    Odafe, Victor U.

    2011-01-01

    Probability knowledge and skills are needed in science and in making daily decisions that are sometimes made under uncertain conditions. Hence, there is the need to ensure that the pre-service teachers of our children are well prepared to teach probability. Pre-service teachers' conceptions of probability are identified, and ways of helping them…

  20. Using Playing Cards to Differentiate Probability Interpretations

    Science.gov (United States)

    López Puga, Jorge

    2014-01-01

    The aprioristic (classical, naïve and symmetric) and frequentist interpretations of probability are commonly known. Bayesian or subjective interpretation of probability is receiving increasing attention. This paper describes an activity to help students differentiate between the three types of probability interpretations.
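
    The three interpretations can be contrasted on a single event. In the sketch below, assuming the standard 52-card deck and made-up observation counts for the Bayesian case, the classical answer comes from symmetry, the frequentist answer from simulated long-run frequency, and the Bayesian answer from updating a prior degree of belief.

```python
from fractions import Fraction
import random

# Contrast the three interpretations for the event "a card drawn from a
# standard 52-card deck is an ace".

# 1. Classical (aprioristic): symmetry over equally likely cards.
classical = Fraction(4, 52)
print("classical  :", classical)                     # 1/13

# 2. Frequentist: long-run relative frequency over repeated draws.
rng = random.Random(7)
deck = ["ace"] * 4 + ["other"] * 48
draws = 100_000
freq = sum(rng.choice(deck) == "ace" for _ in range(draws)) / draws
print("frequentist:", round(freq, 4))

# 3. Bayesian (subjective): a degree of belief updated by evidence, here a
# Beta prior over the unknown proportion of aces updated with observed draws.
a, b = 1, 1                                          # assumed vague prior
aces_seen, others_seen = 8, 96                       # hypothetical observations
posterior_mean = (a + aces_seen) / (a + aces_seen + b + others_seen)
print("bayesian   :", round(posterior_mean, 4))
```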

  1. Como Lo Hago Yo: Myelomeningocele

    Science.gov (United States)

    Lazareff, Jorge

    2014-01-01

    Fortification with folic acid is effective, but awareness among young people is still lacking. The legality of abortion increases the importance of the prenatal consultation. I perform the surgery under the microscope for didactic reasons. Continuous irrigation to reduce tissue temperature. I treat the placode as viable tissue. I do not suture the placode. I do not close muscle. Antibiotics for one week after surgery. Hydrocephalus: shunt in all cases of ventriculomegaly. Tethered cord: untether only once. Chiari II: check the shunt. Include school performance in follow-up; it may indicate shunt obstruction or a tethered cord. PMID:24791217

  2. Dependent Human Error Probability Assessment

    International Nuclear Information System (INIS)

    Simic, Z.; Mikulicic, V.; Vukovic, I.

    2006-01-01

    This paper presents an assessment of the dependence between dynamic operator actions modeled in a Nuclear Power Plant (NPP) PRA and estimates the associated impact on core damage frequency (CDF). This assessment was done to improve the implementation of HEP dependencies inside the existing PRA. All of the dynamic operator actions modeled in the NPP PRA are included in this assessment. Determining the level of HEP dependence and the associated influence on CDF are the major steps of this assessment. A decision on how to apply the results, i.e., should permanent HEP model changes be made, is based on the resulting relative CDF increase. Some CDF increase was selected as a threshold based on the NPP base CDF value and acceptance guidelines from the Regulatory Guide 1.174. HEP dependencies resulting in a CDF increase of > 5E-07 would be considered potential candidates for specific incorporation into the baseline model. The approach used to judge the level of dependence between operator actions is based on dependency level categories and conditional probabilities developed in the Handbook of Human Reliability Analysis with Emphasis on Nuclear Power Plant Applications, NUREG/CR-1278. To simplify the process, NUREG/CR-1278 identifies five levels of dependence: ZD (zero dependence), LD (low dependence), MD (moderate dependence), HD (high dependence), and CD (complete dependence). NUREG/CR-1278 also identifies several qualitative factors that could be involved in determining the level of dependence. Based on the NUREG/CR-1278 information, Time, Function, and Spatial attributes were judged to be the most important considerations when determining the level of dependence between operator actions within an accident sequence. These attributes were used to develop qualitative criteria (rules) that were used to judge the level of dependence (CD, HD, MD, LD, ZD) between the operator actions. After the level of dependence between the various HEPs is judged, quantitative values associated with the
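
    For concreteness, the sketch below evaluates the conditional error probabilities commonly associated with the five NUREG/CR-1278 dependence levels, using the THERP equations as they are usually quoted from the handbook; the basic HEP of 1e-3 is an illustrative value, not a figure from this paper.

```python
# Conditional human error probabilities implied by the five THERP
# dependence levels for a given basic (independent) HEP p.

def conditional_hep(p, level):
    """Conditional probability of failing action B given failure of action A."""
    equations = {
        "ZD": lambda q: q,                  # zero dependence
        "LD": lambda q: (1 + 19 * q) / 20,  # low dependence
        "MD": lambda q: (1 + 6 * q) / 7,    # moderate dependence
        "HD": lambda q: (1 + q) / 2,        # high dependence
        "CD": lambda q: 1.0,                # complete dependence
    }
    return equations[level](p)

basic_hep = 1e-3   # illustrative basic HEP
for level in ("ZD", "LD", "MD", "HD", "CD"):
    print(level, f"{conditional_hep(basic_hep, level):.3e}")
```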

  3. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2014-01-01

    The long-awaited revision of Fundamentals of Applied Probability and Random Processes expands on the central components that made the first edition a classic. The title is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability t

  4. Probability of Failure in Random Vibration

    DEFF Research Database (Denmark)

    Nielsen, Søren R.K.; Sørensen, John Dalsgaard

    1988-01-01

    Close approximations to the first-passage probability of failure in random vibration can be obtained by integral equation methods. A simple relation exists between the first-passage probability density function and the distribution function for the time interval spent below a barrier before out-crossing. An integral equation for the probability density function of the time interval is formulated, and adequate approximations for the kernel are suggested. The kernel approximation results in approximate solutions for the probability density function of the time interval and thus for the first-passage probability

  5. An Objective Theory of Probability (Routledge Revivals)

    CERN Document Server

    Gillies, Donald

    2012-01-01

    This reissue of D. A. Gillies highly influential work, first published in 1973, is a philosophical theory of probability which seeks to develop von Mises' views on the subject. In agreement with von Mises, the author regards probability theory as a mathematical science like mechanics or electrodynamics, and probability as an objective, measurable concept like force, mass or charge. On the other hand, Dr Gillies rejects von Mises' definition of probability in terms of limiting frequency and claims that probability should be taken as a primitive or undefined term in accordance with modern axioma

  6. Paraconsistent Probabilities: Consistency, Contradictions and Bayes’ Theorem

    Directory of Open Access Journals (Sweden)

    Juliana Bueno-Soler

    2016-09-01

    Full Text Available This paper represents the first steps towards constructing a paraconsistent theory of probability based on the Logics of Formal Inconsistency (LFIs. We show that LFIs encode very naturally an extension of the notion of probability able to express sophisticated probabilistic reasoning under contradictions employing appropriate notions of conditional probability and paraconsistent updating, via a version of Bayes’ theorem for conditionalization. We argue that the dissimilarity between the notions of inconsistency and contradiction, one of the pillars of LFIs, plays a central role in our extended notion of probability. Some critical historical and conceptual points about probability theory are also reviewed.
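
    The classical conditionalization that the paraconsistent construction generalizes is shown below; the prior, sensitivity and false-positive rate are illustrative numbers, and the sketch makes no attempt to represent the LFI machinery itself.

```python
from fractions import Fraction

# Classical Bayes' theorem used for conditionalization: update a prior
# P(H) with evidence E via P(H | E) = P(H) P(E | H) / P(E).

def bayes_posterior(prior, p_evidence_given_h, p_evidence_given_not_h):
    numerator = prior * p_evidence_given_h
    evidence = numerator + (1 - prior) * p_evidence_given_not_h
    return numerator / evidence

prior = Fraction(1, 100)                 # P(H), illustrative
sensitivity = Fraction(95, 100)          # P(E | H), illustrative
false_positive = Fraction(5, 100)        # P(E | not H), illustrative

posterior = bayes_posterior(prior, sensitivity, false_positive)
print("P(H | E) =", posterior, "~", float(posterior))   # ~ 0.161
```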

  7. El investigador como educador musical y como divulgador

    Directory of Open Access Journals (Sweden)

    Suárez-Pajares, Javier

    2000-05-01

    Full Text Available This paper sought to raise, in the round table "Research applied to music education", a series of problems that arise not so much for those professionally devoted to music education as for those who, devoted to research and to the study of topics related to the history of music from diverse perspectives, ultimately have to transmit them to a particular audience, whether other colleagues, university students, or the other publics at which the concept of "divulgación" (outreach) alluded to in the title is aimed. Particular emphasis is placed on the problems of university music education for non-specialists and on science communication applied to music, an almost barren field on which there is much to reflect: from the new possibilities for explaining music offered by new media such as the CD-ROM to the traditional forms of engagement with a music-loving public through programme notes and concert reviews.

  8. Probable brote de transmisión oral de enfermedad de Chagas en Turbo, Antioquia

    Directory of Open Access Journals (Sweden)

    Juan Fernando Ríos

    2011-03-01

    Conclusion. A probable acute outbreak of Chagas disease was identified in Antioquia, and oral transmission is proposed as a hypothesis, through the ingestion of T. cruzi in food contaminated with triatomine remains or marsupial excrement.

  9. Probability concepts in quality risk management.

    Science.gov (United States)

    Claycamp, H Gregg

    2012-01-01

    Essentially any concept of risk is built on fundamental concepts of chance, likelihood, or probability. Although "risk" generally describes a probability of loss of something of value, given that a risk-generating event will occur or has occurred, it is ironic that the quality risk management literature and guidelines on quality risk management methodologies and respective tools focus on managing severity but are relatively silent on the in-depth meaning and uses of "probability." Pharmaceutical manufacturers are expanding their use of quality risk management to identify and manage risks to the patient that might occur in phases of the pharmaceutical life cycle from drug development to manufacture, marketing to product discontinuation. The probability concept is typically applied by risk managers as a combination of data-based, frequency-type measures of probability and a subjective "degree of belief" meaning of probability. Probability as a concept that is crucial for understanding and managing risk is discussed through examples, from the most general scenario-defining and ranking tools that use probability implicitly to more specific probabilistic tools in risk management. A rich history of probability in risk management applied to other fields suggests that high-quality risk management decisions benefit from the implementation of more thoughtful probability concepts in both risk modeling and risk management.

  10. Transition probability spaces in loop quantum gravity

    Science.gov (United States)

    Guo, Xiao-Kan

    2018-03-01

    We study the (generalized) transition probability spaces, in the sense of Mielnik and Cantoni, for spacetime quantum states in loop quantum gravity. First, we show that loop quantum gravity admits the structures of transition probability spaces. This is exemplified by first checking such structures in covariant quantum mechanics and then identifying the transition probability spaces in spin foam models via a simplified version of general boundary formulation. The transition probability space thus defined gives a simple way to reconstruct the discrete analog of the Hilbert space of the canonical theory and the relevant quantum logical structures. Second, we show that the transition probability space and in particular the spin foam model are 2-categories. Then we discuss how to realize in spin foam models two proposals by Crane about the mathematical structures of quantum gravity, namely, the quantum topos and causal sites. We conclude that transition probability spaces provide us with an alternative framework to understand various foundational questions of loop quantum gravity.

  11. Towards a Categorical Account of Conditional Probability

    Directory of Open Access Journals (Sweden)

    Robert Furber

    2015-11-01

    Full Text Available This paper presents a categorical account of conditional probability, covering both the classical and the quantum case. Classical conditional probabilities are expressed as a certain "triangle-fill-in" condition, connecting marginal and joint probabilities, in the Kleisli category of the distribution monad. The conditional probabilities are induced by a map together with a predicate (the condition). The latter is a predicate in the logic of effect modules on this Kleisli category. This same approach can be transferred to the category of C*-algebras (with positive unital maps), whose predicate logic is also expressed in terms of effect modules. Conditional probabilities can again be expressed via a triangle-fill-in property. In the literature, there are several proposals for what quantum conditional probability should be, and also there are extra difficulties not present in the classical case. At this stage, we only describe quantum systems with classical parametrization.

  12. UT Biomedical Informatics Lab (BMIL) probability wheel

    Science.gov (United States)

    Huang, Sheng-Cheng; Lee, Sara; Wang, Allen; Cantor, Scott B.; Sun, Clement; Fan, Kaili; Reece, Gregory P.; Kim, Min Soon; Markey, Mia K.

    A probability wheel app is intended to facilitate communication between two people, an "investigator" and a "participant", about uncertainties inherent in decision-making. Traditionally, a probability wheel is a mechanical prop with two colored slices. A user adjusts the sizes of the slices to indicate the relative value of the probabilities assigned to them. A probability wheel can improve the adjustment process and attenuate the effect of anchoring bias when it is used to estimate or communicate probabilities of outcomes. The goal of this work was to develop a mobile application of the probability wheel that is portable, easily available, and more versatile. We provide a motivating example from medical decision-making, but the tool is widely applicable for researchers in the decision sciences.

  13. A probability space for quantum models

    Science.gov (United States)

    Lemmens, L. F.

    2017-06-01

    A probability space contains a set of outcomes, a collection of events formed by subsets of the set of outcomes and probabilities defined for all events. A reformulation in terms of propositions allows to use the maximum entropy method to assign the probabilities taking some constraints into account. The construction of a probability space for quantum models is determined by the choice of propositions, choosing the constraints and making the probability assignment by the maximum entropy method. This approach shows, how typical quantum distributions such as Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein are partly related with well-known classical distributions. The relation between the conditional probability density, given some averages as constraints and the appropriate ensemble is elucidated.
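
    A minimal sketch of the maximum entropy assignment described above, assuming a finite outcome set and a single average-value constraint (both made up for illustration): the maxent solution takes an exponential form, and the Lagrange multiplier is found by bisection.

```python
import math

# Maximum-entropy probability assignment on a finite outcome set subject to
# one average-value constraint. The maxent solution is p_i proportional to
# exp(-lam * x_i); lam is found by bisection so the mean matches the target.

values = [0, 1, 2, 3, 4]      # illustrative outcome values
target_mean = 1.2             # illustrative constraint

def distribution(lam):
    weights = [math.exp(-lam * x) for x in values]
    z = sum(weights)
    return [w / z for w in weights]

def mean(lam):
    return sum(x * p for x, p in zip(values, distribution(lam)))

# mean(lam) decreases monotonically in lam, so bisection works.
lo, hi = -50.0, 50.0
for _ in range(200):
    mid = 0.5 * (lo + hi)
    if mean(mid) > target_mean:
        lo = mid
    else:
        hi = mid

probs = distribution(0.5 * (lo + hi))
print("maxent probabilities:", [round(p, 4) for p in probs])
print("achieved mean       :", round(sum(x * p for x, p in zip(values, probs)), 4))
```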

  14. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2005-01-01

    This book is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability to real-world problems, and introduce the basics of statistics. The book's clear writing style and homework problems make it ideal for the classroom or for self-study. * Good and solid introduction to probability theory and stochastic processes * Logically organized; writing is presented in a clear manner * Choice of topics is comprehensive within the area of probability * Ample homework problems are organized into chapter sections

  15. Striatal activity is modulated by target probability.

    Science.gov (United States)

    Hon, Nicholas

    2017-06-14

    Target probability has well-known neural effects. In the brain, target probability is known to affect frontal activity, with lower probability targets producing more prefrontal activation than those that occur with higher probability. Although the effect of target probability on cortical activity is well specified, its effect on subcortical structures such as the striatum is less well understood. Here, I examined this issue and found that the striatum was highly responsive to target probability. This is consistent with its hypothesized role in the gating of salient information into higher-order task representations. The current data are interpreted in light of that fact that different components of the striatum are sensitive to different types of task-relevant information.

  16. Defining Probability in Sex Offender Risk Assessment.

    Science.gov (United States)

    Elwood, Richard W

    2016-12-01

    There is ongoing debate and confusion over using actuarial scales to predict individuals' risk of sexual recidivism. Much of the debate comes from not distinguishing Frequentist from Bayesian definitions of probability. Much of the confusion comes from applying Frequentist probability to individuals' risk. By definition, only Bayesian probability can be applied to the single case. The Bayesian concept of probability resolves most of the confusion and much of the debate in sex offender risk assessment. Although Bayesian probability is well accepted in risk assessment generally, it has not been widely used to assess the risk of sex offenders. I review the two concepts of probability and show how the Bayesian view alone provides a coherent scheme to conceptualize individuals' risk of sexual recidivism.

  17. Spatial probability aids visual stimulus discrimination

    Directory of Open Access Journals (Sweden)

    Michael Druker

    2010-08-01

    Full Text Available We investigated whether the statistical predictability of a target's location would influence how quickly and accurately it was classified. Recent results have suggested that spatial probability can be a cue for the allocation of attention in visual search. One explanation for probability cuing is spatial repetition priming. In our two experiments we used probability distributions that were continuous across the display rather than relying on a few arbitrary screen locations. This produced fewer spatial repeats and allowed us to dissociate the effect of a high probability location from that of short-term spatial repetition. The task required participants to quickly judge the color of a single dot presented on a computer screen. In Experiment 1, targets were more probable in an off-center hotspot of high probability that gradually declined to a background rate. Targets garnered faster responses if they were near earlier target locations (priming and if they were near the high probability hotspot (probability cuing. In Experiment 2, target locations were chosen on three concentric circles around fixation. One circle contained 80% of targets. The value of this ring distribution is that it allowed for a spatially restricted high probability zone in which sequentially repeated trials were not likely to be physically close. Participant performance was sensitive to the high-probability circle in addition to the expected effects of eccentricity and the distance to recent targets. These two experiments suggest that inhomogeneities in spatial probability can be learned and used by participants on-line and without prompting as an aid for visual stimulus discrimination and that spatial repetition priming is not a sufficient explanation for this effect. Future models of attention should consider explicitly incorporating the probabilities of targets locations and features.

  18. Is probability of frequency too narrow?

    International Nuclear Information System (INIS)

    Martz, H.F.

    1993-01-01

    Modern methods of statistical data analysis, such as empirical and hierarchical Bayesian methods, should find increasing use in future Probabilistic Risk Assessment (PRA) applications. In addition, there will be a more formalized use of expert judgment in future PRAs. These methods require an extension of the probabilistic framework of PRA, in particular, the popular notion of probability of frequency, to consideration of frequency of frequency, frequency of probability, and probability of probability. The genesis, interpretation, and examples of these three extended notions are discussed.

  19. Probability of Grounding and Collision Events

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup

    1996-01-01

    To quantify the risks involved in ship traffic, rational criteria for collision and grounding accidents are developed. This implies that probabilities as well as inherent consequences can be analysed and assessed. The presnt paper outlines a method for evaluation of the probability of ship...

  20. Probability of Grounding and Collision Events

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup

    1996-01-01

    To quantify the risks involved in ship traffic, rational criteria for collision and grounding accidents have to be developed. This implies that probabilities as well as inherent consequences have to be analyzed and assessed.The present notes outline a method for evaluation of the probability...

  1. Introducing Disjoint and Independent Events in Probability.

    Science.gov (United States)

    Kelly, I. W.; Zwiers, F. W.

    Two central concepts in probability theory are those of independence and mutually exclusive events. This document is intended to provide suggestions to teachers that can be used to equip students with an intuitive, comprehensive understanding of these basic concepts in probability. The first section of the paper delineates mutually exclusive and…

  2. Selected papers on probability and statistics

    CERN Document Server

    2009-01-01

    This volume contains translations of papers that originally appeared in the Japanese journal Sūgaku. The papers range over a variety of topics in probability theory, statistics, and applications. This volume is suitable for graduate students and research mathematicians interested in probability and statistics.

  3. Collective probabilities algorithm for surface hopping calculations

    International Nuclear Information System (INIS)

    Bastida, Adolfo; Cruz, Carlos; Zuniga, Jose; Requena, Alberto

    2003-01-01

    General equations are derived that the transition probabilities of hopping algorithms in surface hopping calculations must obey to ensure equality between the average quantum and classical populations. These equations are solved for two particular cases. In the first, it is assumed that the probabilities are the same for all trajectories and that the number of hops is kept to a minimum. These assumptions specify the collective probabilities (CP) algorithm, for which the transition probabilities depend on the average populations over all trajectories. In the second case, the probabilities for each trajectory are assumed to be completely independent of the results from the other trajectories. There is then a unique solution of the general equations ensuring that the transition probabilities are equal to the quantum population of the target state, which is referred to as the independent probabilities (IP) algorithm. The fewest switches (FS) algorithm developed by Tully is accordingly understood as an approximate hopping algorithm which takes elements from the accurate CP and IP solutions. A numerical test of all these hopping algorithms is carried out for a one-dimensional two-state problem with two avoided crossings, which shows the accuracy and computational efficiency of the proposed collective probabilities algorithm, the limitations of the FS algorithm, and the similarity between the results offered by the IP algorithm and those obtained with the Ehrenfest method.
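
    The collective-probabilities idea can be illustrated with a small toy simulation. The following Python sketch (an illustrative toy under stated assumptions, not the authors' implementation) drives an ensemble of classical state labels with a prescribed average quantum population for a two-state system; at each step every trajectory in the donor state hops with the same probability, chosen so that the ensemble fraction tracks the quantum population with as few hops as possible. The Rabi-type population curve and the ensemble size are arbitrary choices for the example.

      import numpy as np

      rng = np.random.default_rng(0)

      n_traj = 5000                       # ensemble of classical trajectories
      t = np.linspace(0.0, 2.0, 200)      # time grid
      p2 = np.sin(0.5 * t) ** 2           # prescribed average quantum population of state 2

      state = np.ones(n_traj, dtype=int)  # all trajectories start in state 1

      for k in range(1, len(t)):
          dp = p2[k] - p2[k - 1]          # change of the target-state population
          if dp > 0:                      # population flows 1 -> 2
              frac1 = np.mean(state == 1)
              hop_prob = min(dp / frac1, 1.0) if frac1 > 0 else 0.0
              hops = (state == 1) & (rng.random(n_traj) < hop_prob)
              state[hops] = 2
          elif dp < 0:                    # population flows 2 -> 1
              frac2 = np.mean(state == 2)
              hop_prob = min(-dp / frac2, 1.0) if frac2 > 0 else 0.0
              hops = (state == 2) & (rng.random(n_traj) < hop_prob)
              state[hops] = 1

      print("quantum population of state 2:", p2[-1])
      print("classical fraction in state 2:", np.mean(state == 2))

    Because every trajectory in the donor state uses the same hopping probability, derived from the ensemble-averaged populations, the number of hops stays close to the minimum needed to keep the classical and quantum populations equal, which is the defining feature the abstract attributes to the CP algorithm.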

  4. Examples of Neutrosophic Probability in Physics

    Directory of Open Access Journals (Sweden)

    Fu Yuhua

    2015-01-01

    This paper re-discusses the problems of the so-called "law of nonconservation of parity" and "accelerating expansion of the universe", and presents the examples of determining the Neutrosophic Probability of the experiment of Chien-Shiung Wu et al. in 1957, and determining the Neutrosophic Probability of accelerating expansion of the partial universe.

  5. Eliciting Subjective Probabilities with Binary Lotteries

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd

    objective probabilities. Drawing a sample from the same subject population, we find evidence that the binary lottery procedure induces linear utility in a subjective probability elicitation task using the Quadratic Scoring Rule. We also show that the binary lottery procedure can induce direct revelation...

  6. Probability Issues in without Replacement Sampling

    Science.gov (United States)

    Joarder, A. H.; Al-Sabah, W. S.

    2007-01-01

    Sampling without replacement is an important aspect in teaching conditional probabilities in elementary statistics courses. Different methods proposed in different texts for calculating probabilities of events in this context are reviewed and their relative merits and limitations in applications are pinpointed. An alternative representation of…

  7. Some open problems in noncommutative probability

    International Nuclear Information System (INIS)

    Kruszynski, P.

    1981-01-01

    A generalization of probability measures to non-Boolean structures is discussed. The starting point of the theory is the Gleason theorem about the form of measures on closed subspaces of a Hilbert space. The problems are formulated in terms of probability on lattices of projections in arbitrary von Neumann algebras. (Auth.)

  8. Probability: A Matter of Life and Death

    Science.gov (United States)

    Hassani, Mehdi; Kippen, Rebecca; Mills, Terence

    2016-01-01

    Life tables are mathematical tables that document probabilities of dying and life expectancies at different ages in a society. Thus, the life table contains some essential features of the health of a population. Probability is often regarded as a difficult branch of mathematics. Life tables provide an interesting approach to introducing concepts…

  9. Teaching Probability: A Socio-Constructivist Perspective

    Science.gov (United States)

    Sharma, Sashi

    2015-01-01

    There is a considerable and rich literature on students' misconceptions in probability. However, less attention has been paid to the development of students' probabilistic thinking in the classroom. This paper offers a sequence, grounded in a socio-constructivist perspective, for teaching probability.

  10. Stimulus Probability Effects in Absolute Identification

    Science.gov (United States)

    Kent, Christopher; Lamberts, Koen

    2016-01-01

    This study investigated the effect of stimulus presentation probability on accuracy and response times in an absolute identification task. Three schedules of presentation were used to investigate the interaction between presentation probability and stimulus position within the set. Data from individual participants indicated strong effects of…

  11. 47 CFR 1.1623 - Probability calculation.

    Science.gov (United States)

    2010-10-01

    ... 47 Telecommunication 1 2010-10-01 2010-10-01 false Probability calculation. 1.1623 Section 1.1623 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Random Selection Procedures for Mass Media Services General Procedures § 1.1623 Probability calculation. (a) All calculations shall be...

  12. Simulations of Probabilities for Quantum Computing

    Science.gov (United States)

    Zak, M.

    1996-01-01

    It has been demonstrated that classical probabilities, and in particular a probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities and their link to quantum computations are discussed.

  13. Against All Odds: When Logic Meets Probability

    NARCIS (Netherlands)

    van Benthem, J.; Katoen, J.-P.; Langerak, R.; Rensink, A.

    2017-01-01

    This paper is a light walk along interfaces between logic and probability, triggered by a chance encounter with Ed Brinksma. It is not a research paper, or a literature survey, but a pointer to issues. I discuss both direct combinations of logic and probability and structured ways in which logic can

  14. An introduction to probability and stochastic processes

    CERN Document Server

    Melsa, James L

    2013-01-01

    Geared toward college seniors and first-year graduate students, this text is designed for a one-semester course in probability and stochastic processes. Topics covered in detail include probability theory, random variables and their functions, stochastic processes, linear system response to stochastic processes, Gaussian and Markov processes, and stochastic differential equations. 1973 edition.

  15. The probability of the false vacuum decay

    International Nuclear Information System (INIS)

    Kiselev, V.; Selivanov, K.

    1983-01-01

    The closed expression for the probability of the false vacuum decay in (1+1) dimensions is given. The probability of false vacuum decay is expressed as the product of an exponential quasiclassical factor and a functional determinant of the given form. The method for calculation of this determinant is developed and a complete answer for (1+1) dimensions is given
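
    For orientation, the standard semiclassical form of such a decay rate (written here in its generic Callan-Coleman form, not the specific (1+1)-dimensional expression derived in the paper) is

      \frac{\Gamma}{V} \;=\; A\,e^{-S_E[\phi_b]/\hbar},
      \qquad
      A \;\propto\; \left|\frac{\det'\!\left[-\partial^2 + U''(\phi_b)\right]}
                              {\det\!\left[-\partial^2 + U''(\phi_{\mathrm{fv}})\right]}\right|^{-1/2},

    where S_E[\phi_b] is the Euclidean action of the bounce, the primed determinant omits zero modes, and \phi_{fv} denotes the false vacuum; the exponential is the quasiclassical factor and the ratio of functional determinants is the prefactor referred to in the abstract.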

  16. Probability elements of the mathematical theory

    CERN Document Server

    Heathcote, C R

    2000-01-01

    Designed for students studying mathematical statistics and probability after completing a course in calculus and real variables, this text deals with basic notions of probability spaces, random variables, distribution functions and generating functions, as well as joint distributions and the convergence properties of sequences of random variables. Includes worked examples and over 250 exercises with solutions.

  17. The transition probabilities of the reciprocity model

    NARCIS (Netherlands)

    Snijders, T.A.B.

    1999-01-01

    The reciprocity model is a continuous-time Markov chain model used for modeling longitudinal network data. A new explicit expression is derived for its transition probability matrix. This expression can be checked relatively easily. Some properties of the transition probabilities are given, as well

  18. Probability numeracy and health insurance purchase

    NARCIS (Netherlands)

    Dillingh, Rik; Kooreman, Peter; Potters, Jan

    2016-01-01

    This paper provides new field evidence on the role of probability numeracy in health insurance purchase. Our regression results, based on rich survey panel data, indicate that the expenditure on two out of three measures of health insurance first rises with probability numeracy and then falls again.

  19. La lengua inglesa como neolengua

    Directory of Open Access Journals (Sweden)

    Fernando BELTRÁN LLAVADOR

    2009-11-01

    Contemporary uses and abuses of the English language are examined in the context of complex issues and globalization trends. The term «newspeak» is defined, similitudes of it are found today in the language of advertising in Spain, its origins are traced back and antecedents are located in the history of English literature, while the presence of new modalities of the Orwellian reductionist language, such as «nukespeak», is illustrated within the language of warfare. The English language is inextricably bound up with economical, technological and cultural factors which affect its very morphology just as much as the pervasive influence of English as a global language affects the structures of feeling, thought and action of citizens all over the world, which poses an obligation on the part of EFL teachers to discern and resist its ill effects while they promote its still highly valuable cultural benefits.

  20. Como responder ao momento presente?

    Directory of Open Access Journals (Sweden)

    Maria Filomena Molder

    2013-12-01

    http://dx.doi.org/10.5007/1984-784X.2013v13n19p13 It was with this question (already an effect of a first meeting between Irene Pimentel and myself) that we decided to challenge colleagues, students and staff of our Faculty, FCSH (Faculdade de Ciências Sociais e Humanas), of other faculties of the Universidade Nova de Lisboa, of other universities, and everyone interested in jointly considering and discussing what was happening in Portugal, described in the announcement of the Jornada of 6 December 2012 as a "process of social, economic and cultural dismantling without precedent (despite so many comparisons based on the premise of 'eternal repetition') and whose consequences keep exceeding the predictions of those responsible for that dismantling". Accepting with full commitment and gratitude the invitation addressed to me by Humberto Brito to write a review of the Jornada for publication in the first issue of Forma de Vida (I salute the journal and its title), I nevertheless decided to set the review aside; in the form of an "Editorial" it will shortly be published on the blog Responder ao Momento Presente, created in the meantime, together with the texts written by our guests, the contributions of people who answered our call, and further contributions that extended beyond the Jornada; to these will be added a video recording, also available on YouTube. Text originally published in Forma de Vida, Lisboa, n.1, Feb. 2013. We thank the author for permitting republication in this issue of the Boletim. [Ed. note]

  1. The enigma of probability and physics

    International Nuclear Information System (INIS)

    Mayants, L.

    1984-01-01

    This volume contains a coherent exposition of the elements of two unique sciences: probabilistics (science of probability) and probabilistic physics (application of probabilistics to physics). Proceeding from a key methodological principle, it starts with the disclosure of the true content of probability and the interrelation between probability theory and experimental statistics. This makes it possible to introduce a proper order in all the sciences dealing with probability and, by conceiving the real content of statistical mechanics and quantum mechanics in particular, to construct both as two interconnected domains of probabilistic physics. Consistent theories of kinetics of physical transformations, decay processes, and intramolecular rearrangements are also outlined. The interrelation between the electromagnetic field, photons, and the theoretically discovered subatomic particle 'emon' is considered. Numerous internal imperfections of conventional probability theory, statistical physics, and quantum physics are exposed and removed - quantum physics no longer needs special interpretation. EPR, Bohm, and Bell paradoxes are easily resolved, among others. (Auth.)

  2. Optimizing Probability of Detection Point Estimate Demonstration

    Science.gov (United States)

    Koshti, Ajay M.

    2017-01-01

    Probability of detection (POD) analysis is used in assessing the reliably detectable flaw size in nondestructive evaluation (NDE). MIL-HDBK-1823 and the associated mh1823 POD software give the most common methods of POD analysis. Real flaws such as cracks and crack-like flaws are desired to be detected using these NDE methods. A reliably detectable crack size is required for safe life analysis of fracture critical parts. The paper provides a discussion on optimizing probability of detection (POD) demonstration experiments using the point estimate method. The POD point estimate method is used by NASA for qualifying special NDE procedures. The point estimate method uses the binomial distribution for the probability density. Normally, a set of 29 flaws of the same size, within some tolerance, is used in the demonstration. The optimization is performed to provide an acceptable value for the probability of passing the demonstration (PPD) and an acceptable value for the probability of false calls (POF) while keeping the flaw sizes in the set as small as possible.
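
    A binomial point-estimate demonstration of this kind is easy to reproduce numerically. The sketch below (an illustration based on the binomial model described above, not the mh1823 software itself) computes the probability of passing a demonstration in which all n flaws, or all but a small allowed number of misses, must be detected, as a function of the true POD; the classic 29-of-29 case is shown as an example.

      from math import comb

      def prob_pass(pod, n=29, max_misses=0):
          """Probability of passing a binomial POD demonstration:
          at most `max_misses` of the n flaws may go undetected."""
          return sum(comb(n, k) * (1 - pod) ** k * pod ** (n - k)
                     for k in range(max_misses + 1))

      # Example: the common 29-of-29 demonstration.
      for pod in (0.90, 0.95, 0.99):
          print(f"true POD = {pod:.2f}  ->  probability of passing = {prob_pass(pod):.3f}")

    With a true POD of 0.90 the chance of detecting all 29 flaws is about 0.047, which is why a successful 29-of-29 demonstration supports a 90% POD claim at roughly 95% confidence; the trade-off described in the abstract is between raising this passing probability (PPD) and keeping the demonstrated flaw size small.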

  3. Alternative probability theories for cognitive psychology.

    Science.gov (United States)

    Narens, Louis

    2014-01-01

    Various proposals for generalizing event spaces for probability functions have been put forth in the mathematical, scientific, and philosophic literatures. In cognitive psychology such generalizations are used for explaining puzzling results in decision theory and for modeling the influence of context effects. This commentary discusses proposals for generalizing probability theory to event spaces that are not necessarily boolean algebras. Two prominent examples are quantum probability theory, which is based on the set of closed subspaces of a Hilbert space, and topological probability theory, which is based on the set of open sets of a topology. Both have been applied to a variety of cognitive situations. This commentary focuses on how event space properties can influence probability concepts and impact cognitive modeling. Copyright © 2013 Cognitive Science Society, Inc.

  4. Evaluation of nuclear power plant component failure probability and core damage probability using simplified PSA model

    International Nuclear Information System (INIS)

    Shimada, Yoshio

    2000-01-01

    It is anticipated that changing the frequency of surveillance tests, preventive maintenance or parts replacement of safety related components may change component failure probabilities and, as a result, the core damage probability. It is also anticipated that the change differs depending on the initiating event frequency and the component type. This study assessed the change of core damage probability using a simplified PSA model capable of calculating core damage probability in a short time, developed by the US NRC to process accident sequence precursors, when the failure probability of various components is varied between 0 and 1 and when Japanese or American initiating event frequency data are used. As a result of the analysis: (1) The frequency of surveillance tests, preventive maintenance or parts replacement of motor driven pumps (high pressure injection pumps, residual heat removal pumps, auxiliary feedwater pumps) should be changed carefully, since the change in core damage probability is large when the base failure probability increases. (2) Core damage probability is insensitive to changes in surveillance test frequency for motor operated valves and the turbine driven auxiliary feedwater pump, since the change in core damage probability is small even when their failure probability changes by about one order of magnitude. (3) The change in core damage probability is small when Japanese failure probability data are applied to the emergency diesel generator, even if the failure probability changes by one order of magnitude from the base value. On the other hand, when American failure probability data are applied, the increase in core damage probability is large when the failure probability increases. Therefore, when Japanese failure probability data are applied, core damage probability is insensitive to changes in surveillance test frequency, etc. (author)

  5. Assessing the clinical probability of pulmonary embolism

    International Nuclear Information System (INIS)

    Miniati, M.; Pistolesi, M.

    2001-01-01

    Clinical assessment is a cornerstone of the recently validated diagnostic strategies for pulmonary embolism (PE). Although the diagnostic yield of individual symptoms, signs, and common laboratory tests is limited, the combination of these variables, either by empirical assessment or by a prediction rule, can be used to express a clinical probability of PE. The latter may serve as pretest probability to predict the probability of PE after further objective testing (posterior or post-test probability). Over the last few years, attempts have been made to develop structured prediction models for PE. In a Canadian multicenter prospective study, the clinical probability of PE was rated as low, intermediate, or high according to a model which included assessment of presenting symptoms and signs, risk factors, and presence or absence of an alternative diagnosis at least as likely as PE. Recently, a simple clinical score was developed to stratify outpatients with suspected PE into groups with low, intermediate, or high clinical probability. Logistic regression was used to predict parameters associated with PE. A score ≤ 4 identified patients with low probability of whom 10% had PE. The prevalence of PE in patients with intermediate (score 5-8) and high probability (score ≥ 9) was 38 and 81%, respectively. As opposed to the Canadian model, this clinical score is standardized. The predictor variables identified in the model, however, were derived from a database of emergency ward patients. This model may, therefore, not be valid in assessing the clinical probability of PE in inpatients. In the PISA-PED study, a clinical diagnostic algorithm was developed which rests on the identification of three relevant clinical symptoms and on their association with electrocardiographic and/or radiographic abnormalities specific for PE. Among patients who, according to the model, had been rated as having a high clinical probability, the prevalence of proven PE was 97%, while it was 3
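
    The stratification reported above can be captured in a few lines. The following sketch is only an illustration of the cut-offs and observed prevalences quoted in the abstract; the items and weights that make up the score itself are not given here, so the function simply maps an already-computed score to its risk category.

      def pe_clinical_probability(score):
          """Map a clinical score to the risk category and the PE prevalence
          reported for that category in the abstract (10%, 38%, 81%)."""
          if score <= 4:
              return "low", 0.10
          elif score <= 8:
              return "intermediate", 0.38
          else:
              return "high", 0.81

      print(pe_clinical_probability(3))   # ('low', 0.1)
      print(pe_clinical_probability(7))   # ('intermediate', 0.38)
      print(pe_clinical_probability(10))  # ('high', 0.81)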

  6. Upgrading Probability via Fractions of Events

    Directory of Open Access Journals (Sweden)

    Frič Roman

    2016-08-01

    The influence of "Grundbegriffe" by A. N. Kolmogorov (published in 1933) on education in the area of probability and its impact on research in stochastics cannot be overestimated. We would like to point out three aspects of the classical probability theory "calling for" an upgrade: (i) classical random events are black-and-white (Boolean); (ii) classical random variables do not model quantum phenomena; (iii) basic maps (probability measures and observables, the dual maps to random variables) have very different "mathematical nature". Accordingly, we propose an upgraded probability theory based on Łukasiewicz operations (multivalued logic) on events, elementary category theory, and covering the classical probability theory as a special case. The upgrade can be compared to replacing calculations with integers by calculations with rational (and real) numbers. Namely, to avoid the three objections, we embed the classical (Boolean) random events (represented by the {0, 1}-valued indicator functions of sets) into upgraded random events (represented by measurable [0, 1]-valued functions), the minimal domain of probability containing "fractions" of classical random events, and we upgrade the notions of probability measure and random variable.
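
    The upgrade can be stated compactly. In the classical case an event A enters only through its indicator, whereas an upgraded ("fractional") event is any measurable function f with values in [0, 1], and its probability is the integral of f. The following lines are a minimal restatement of that idea, not the paper's full categorical construction:

      P(A) = \int_\Omega \mathbf{1}_A\,dP
      \quad\longrightarrow\quad
      P(f) = \int_\Omega f\,dP,
      \qquad f:\Omega\to[0,1]\ \text{measurable},

    so that indicator functions recover the Boolean case, while genuinely [0, 1]-valued f model "fractions" of classical events.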

  7. Failure probability analysis of optical grid

    Science.gov (United States)

    Zhong, Yaoquan; Guo, Wei; Sun, Weiqiang; Jin, Yaohui; Hu, Weisheng

    2008-11-01

    Optical grid, an integrated computing environment based on optical networks, is expected to be an efficient infrastructure to support advanced data-intensive grid applications. In an optical grid, faults of both computational and network resources are inevitable due to the large scale and high complexity of the system. With optical-network-based distributed computing systems extensively applied to data processing, the application failure probability has become an important indicator of application quality and an important aspect that operators consider. This paper presents a task-based method for analyzing the application failure probability in an optical grid. The failure probability of the entire application can then be quantified, and the performance of different backup strategies in reducing the application failure probability can be compared, so that the different requirements of different clients with respect to application failure probability can be satisfied. In an optical grid, when an application expressed as a DAG (directed acyclic graph) is executed under different backup strategies, the application failure probability and the application completion time differ. This paper proposes a new multi-objective differentiated services algorithm (MDSA). The new application scheduling algorithm can guarantee the required failure probability and improve network resource utilization, realizing a compromise between the network operator and the application submitter. Differentiated services can then be achieved in an optical grid.
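
    A task-based failure-probability computation of the kind described can be sketched as follows (a simplified illustration under the assumption that task failures are independent and that a task survives if either its primary or its backup execution survives; the MDSA scheduling algorithm itself is not reproduced here).

      def app_failure_probability(task_fail_probs, backup_fail_probs=None):
          """Failure probability of a DAG application: the application fails
          if any task fails. A task with a backup fails only if both the
          primary and the backup executions fail (independence assumed)."""
          p_success = 1.0
          for i, p in enumerate(task_fail_probs):
              if backup_fail_probs is not None and backup_fail_probs[i] is not None:
                  p_task_fail = p * backup_fail_probs[i]
              else:
                  p_task_fail = p
              p_success *= (1.0 - p_task_fail)
          return 1.0 - p_success

      tasks = [0.01, 0.02, 0.005, 0.01]
      print(app_failure_probability(tasks))                            # no backups
      print(app_failure_probability(tasks, [0.01, 0.02, None, None]))  # back up two tasks

    Comparing the two calls shows how adding backup resources to selected tasks lowers the application failure probability, which is the quantity the differentiated-services scheduler trades off against resource utilization.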

  8. Uncertainty about probability: a decision analysis perspective

    International Nuclear Information System (INIS)

    Howard, R.A.

    1988-01-01

    The issue of how to think about uncertainty about probability is framed and analyzed from the viewpoint of a decision analyst. The failure of nuclear power plants is used as an example. The key idea is to think of probability as describing a state of information on an uncertain event, and to pose the issue of uncertainty in this quantity as uncertainty about a number that would be definitive: it has the property that you would assign it as the probability if you knew it. Logical consistency requires that the probability to assign to a single occurrence in the absence of further information be the mean of the distribution of this definitive number, not the median as is sometimes suggested. Any decision that must be made without the benefit of further information must also be made using the mean of the definitive number's distribution. With this formulation, they find further that the probability of r occurrences in n exchangeable trials will depend on the first n moments of the definitive number's distribution. In making decisions, the expected value of clairvoyance on the occurrence of the event must be at least as great as that on the definitive number. If one of the events in question occurs, then the increase in probability of another such event is readily computed. This means, in terms of coin tossing, that unless one is absolutely sure of the fairness of a coin, seeing a head must increase the probability of heads, in distinction to usual thought. A numerical example for nuclear power shows that the failure of one plant of a group with a low probability of failure can significantly increase the probability that must be assigned to failure of a second plant in the group.
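
    The coin-tossing remark is easy to verify with a concrete prior. The sketch below (a minimal Beta-Binomial illustration of the point, not the paper's own notation) treats the "definitive number" as the unknown long-run frequency p, assigns P(heads) as the mean of its distribution, and shows that observing a head raises the probability assigned to the next head whenever the distribution is not a point mass.

      # Prior uncertainty about the "definitive number" p, modelled as Beta(a, b).
      a, b = 2.0, 2.0

      p_head = a / (a + b)                       # probability of heads = mean of the distribution
      p_head_after_head = (a + 1) / (a + b + 1)  # posterior mean after observing one head

      print(f"P(head)              = {p_head:.3f}")
      print(f"P(head | saw a head) = {p_head_after_head:.3f}")   # strictly larger

    Unless one is absolutely sure of the coin (a degenerate distribution for p), the second number always exceeds the first, which parallels the abstract's point about the failure of one plant raising the probability assigned to the failure of a second plant in the group.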

  9. Probability an introduction with statistical applications

    CERN Document Server

    Kinney, John J

    2014-01-01

    Praise for the First Edition: "This is a well-written and impressively presented introduction to probability and statistics. The text throughout is highly readable, and the author makes liberal use of graphs and diagrams to clarify the theory." - The Statistician. Thoroughly updated, Probability: An Introduction with Statistical Applications, Second Edition features a comprehensive exploration of statistical data analysis as an application of probability. The new edition provides an introduction to statistics with accessible coverage of reliability, acceptance sampling, confidence intervals, h

  10. Dependency models and probability of joint events

    International Nuclear Information System (INIS)

    Oerjasaeter, O.

    1982-08-01

    Probabilistic dependencies between components/systems are discussed with reference to a broad classification of potential failure mechanisms. Further, a generalized time-dependency model, based on conditional probabilities for estimation of the probability of joint events and event sequences is described. The applicability of this model is clarified/demonstrated by various examples. It is concluded that the described model of dependency is a useful tool for solving a variety of practical problems concerning the probability of joint events and event sequences where common cause and time-dependent failure mechanisms are involved. (Auth.)
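
    The role of the conditional probabilities can be seen in a two-component example (a generic illustration; the report's generalized time-dependency model is not reproduced here):

      P(A \cap B) = P(A)\,P(B \mid A),
      \qquad
      \text{independence: } P(B \mid A) = P(B) \;\Rightarrow\; P(A \cap B) = P(A)\,P(B).

    With P(A) = P(B) = 10^{-3}, independence gives a joint probability of 10^{-6}, whereas a common-cause dependency with P(B | A) = 0.1 gives 10^{-4}, two orders of magnitude higher; this is the kind of effect a conditional, time-dependent model is meant to capture.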

  11. Handbook of probability theory and applications

    CERN Document Server

    Rudas, Tamas

    2008-01-01

    ""This is a valuable reference guide for readers interested in gaining a basic understanding of probability theory or its applications in problem solving in the other disciplines.""-CHOICEProviding cutting-edge perspectives and real-world insights into the greater utility of probability and its applications, the Handbook of Probability offers an equal balance of theory and direct applications in a non-technical, yet comprehensive, format. Editor Tamás Rudas and the internationally-known contributors present the material in a manner so that researchers of vari

  12. Probabilities on Streams and Reflexive Games

    Directory of Open Access Journals (Sweden)

    Andrew Schumann

    2014-01-01

    Probability measures on streams (e.g. on hypernumbers and p-adic numbers) have been defined. It was shown that these probabilities can be used for simulations of reflexive games. In particular, it can be proved that Aumann's agreement theorem does not hold for these probabilities. Instead of this theorem, there is a statement that is called the reflexion disagreement theorem. Based on this theorem, probabilistic and knowledge conditions can be defined for reflexive games at various reflexion levels up to the infinite level. (original abstract)

  13. Concept of probability in statistical physics

    CERN Document Server

    Guttmann, Y M

    1999-01-01

    Foundational issues in statistical mechanics and the more general question of how probability is to be understood in the context of physical theories are both areas that have been neglected by philosophers of physics. This book fills an important gap in the literature by providing a most systematic study of how to interpret probabilistic assertions in the context of statistical mechanics. The book explores both subjectivist and objectivist accounts of probability, and takes full measure of work in the foundations of probability theory, in statistical mechanics, and in mathematical theory. It will be of particular interest to philosophers of science, physicists and mathematicians interested in foundational issues, and also to historians of science.

  14. Computation of the Complex Probability Function

    Energy Technology Data Exchange (ETDEWEB)

    Trainer, Amelia Jo [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Ledwith, Patrick John [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-22

    The complex probability function is important in many areas of physics and many techniques have been developed in an attempt to compute it for some z quickly and efficiently. Most prominent are the methods that use Gauss-Hermite quadrature, which uses the roots of the nth degree Hermite polynomial and corresponding weights to approximate the complex probability function. This document serves as an overview and discussion of the use, shortcomings, and potential improvements on the Gauss-Hermite quadrature for the complex probability function.
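
    The Gauss-Hermite approach referred to above can be written in a few lines. The sketch below (an illustration of the standard quadrature formula, assuming Im z > 0; it is not the authors' code) approximates the complex probability (Faddeeva) function w(z) = (i/pi) * integral of e^(-t^2)/(z - t) dt and compares it with SciPy's reference implementation.

      import numpy as np
      from numpy.polynomial.hermite import hermgauss
      from scipy.special import wofz

      def w_gauss_hermite(z, n=32):
          """Gauss-Hermite approximation of the complex probability function,
          valid for Im(z) > 0: w(z) ~ (i/pi) * sum_k w_k / (z - t_k)."""
          t, wgt = hermgauss(n)            # nodes and weights for weight exp(-t^2)
          return 1j / np.pi * np.sum(wgt / (z - t))

      z = 1.5 + 0.5j
      print(w_gauss_hermite(z, n=8))
      print(w_gauss_hermite(z, n=32))
      print(wofz(z))                       # reference value

    For z close to the real axis the plain quadrature converges slowly, which is one of the shortcomings such a report would need to address.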

  15. Pre-aggregation for Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    Motivated by the increasing need to analyze complex uncertain multidimensional data (e.g., in order to optimize and personalize location-based services), this paper proposes novel types of probabilistic OLAP queries that operate on aggregate values that are probability distributions...... and the techniques to process these queries. The paper also presents the methods for computing the probability distributions, which enables pre-aggregation, and for using the pre-aggregated distributions for further aggregation. In order to achieve good time and space efficiency, the methods perform approximate...... multidimensional data analysis that is considered in this paper (i.e., approximate processing of probabilistic OLAP queries over probability distributions)....

  16. Comparing linear probability model coefficients across groups

    DEFF Research Database (Denmark)

    Holm, Anders; Ejrnæs, Mette; Karlson, Kristian Bernt

    2015-01-01

    This article offers a formal identification analysis of the problem in comparing coefficients from linear probability models between groups. We show that differences in coefficients from these models can result not only from genuine differences in effects, but also from differences in one or more of the following three components: outcome truncation, scale parameters and distributional shape of the predictor variable. These results point to limitations in using linear probability model coefficients for group comparisons. We also provide Monte Carlo simulations and real examples to illustrate these limitations, and we suggest a restricted approach to using linear probability model coefficients in group comparisons.

  17. Modeling experiments using quantum and Kolmogorov probability

    International Nuclear Information System (INIS)

    Hess, Karl

    2008-01-01

    Criteria are presented that permit a straightforward partition of experiments into sets that can be modeled using both quantum probability and the classical probability framework of Kolmogorov. These new criteria concentrate on the operational aspects of the experiments and lead beyond the commonly appreciated partition by relating experiments to commuting and non-commuting quantum operators as well as non-entangled and entangled wavefunctions. In other words the space of experiments that can be understood using classical probability is larger than usually assumed. This knowledge provides advantages for areas such as nanoscience and engineering or quantum computation.

  18. The probability-outcome correspondence principle: a dispositional view of the interpretation of probability statements

    NARCIS (Netherlands)

    Keren, G.; Teigen, K.H.

    2001-01-01

    This article presents a framework for lay people's internal representations of probabilities, which supposedly reflect the strength of underlying dispositions, or propensities, associated with the predicted event. From this framework, we derive the probability-outcome correspondence principle, which

  19. Modelling the probability of building fires

    Directory of Open Access Journals (Sweden)

    Vojtěch Barták

    2014-12-01

    Systematic spatial risk analysis plays a crucial role in preventing emergencies. In the Czech Republic, risk mapping is currently based on the risk accumulation principle, area vulnerability, and preparedness levels of Integrated Rescue System components. Expert estimates are used to determine risk levels for individual hazard types, while statistical modelling based on data from actual incidents and their possible causes is not used. Our model study, conducted in cooperation with the Fire Rescue Service of the Czech Republic as a model within the Liberec and Hradec Králové regions, presents an analytical procedure leading to the creation of building fire probability maps based on recent incidents in the studied areas and on building parameters. In order to estimate the probability of building fires, a prediction model based on logistic regression was used. Probability of fire calculated by means of model parameters and attributes of specific buildings can subsequently be visualized in probability maps.
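
    A prediction model of the kind described, logistic regression mapping building attributes to a fire probability, can be sketched as follows. This is a generic illustration on synthetic data; the actual predictors, incident data and fitted coefficients of the study are not reproduced.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(42)

      # Synthetic building attributes: [floor area (100 m^2), age (decades), risky heating flag]
      n = 2000
      X = np.column_stack([
          rng.gamma(2.0, 2.0, n),          # floor area
          rng.uniform(0.0, 10.0, n),       # building age
          rng.integers(0, 2, n),           # risky heating system (0/1)
      ])
      # Synthetic "ground truth": fire risk grows with area, age and risky heating.
      logit = -4.0 + 0.15 * X[:, 0] + 0.1 * X[:, 1] + 0.8 * X[:, 2]
      y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

      model = LogisticRegression().fit(X, y)

      # Predicted fire probability for one hypothetical building, usable for a risk map.
      building = np.array([[6.0, 5.0, 1.0]])
      print(model.predict_proba(building)[0, 1])

    Applying the fitted model to every building in a region and colouring the map by the predicted probability is, in outline, how such probability maps are produced.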

  20. Encounter Probability of Individual Wave Height

    DEFF Research Database (Denmark)

    Liu, Z.; Burcharth, H. F.

    1998-01-01

    wave height corresponding to a certain exceedence probability within a structure lifetime (encounter probability), based on the statistical analysis of long-term extreme significant wave height. Then the design individual wave height is calculated as the expected maximum individual wave height...... associated with the design significant wave height, with the assumption that the individual wave heights follow the Rayleigh distribution. However, the exceedence probability of such a design individual wave height within the structure lifetime is unknown. The paper presents a method for the determination...... of the design individual wave height corresponding to an exceedence probability within the structure lifetime, given the long-term extreme significant wave height. The method can also be applied for estimation of the number of relatively large waves for fatigue analysis of constructions....
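
    The encounter-probability idea can be written down explicitly. The relations below are the standard textbook forms, stated here only for orientation (the paper's own derivation, which works from the long-term extreme significant wave height, is not reproduced): the probability that a value with return period T_R years is exceeded at least once during a structure lifetime of T_L years, and the Rayleigh-based expected maximum of N individual waves in a sea state with significant wave height H_s.

      P_{\mathrm{enc}} = 1 - \left(1 - \frac{1}{T_R}\right)^{T_L},
      \qquad
      \mathbb{E}\!\left[H_{\max}\right] \approx H_s\,\sqrt{\tfrac{1}{2}\ln N}.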

  1. Predicting binary choices from probability phrase meanings.

    Science.gov (United States)

    Wallsten, Thomas S; Jang, Yoonhee

    2008-08-01

    The issues of how individuals decide which of two events is more likely and of how they understand probability phrases both involve judging relative likelihoods. In this study, we investigated whether derived scales representing probability phrase meanings could be used within a choice model to predict independently observed binary choices. If they can, this simultaneously provides support for our model and suggests that the phrase meanings are measured meaningfully. The model assumes that, when deciding which of two events is more likely, judges take a single sample from memory regarding each event and respond accordingly. The model predicts choice probabilities by using the scaled meanings of individually selected probability phrases as proxies for confidence distributions associated with sampling from memory. Predictions are sustained for 34 of 41 participants but, nevertheless, are biased slightly low. Sequential sampling models improve the fit. The results have both theoretical and applied implications.
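
    The single-sample choice rule described above has a simple computational form. The sketch below is a generic illustration (hypothetical distributions standing in for the scaled phrase meanings, not the authors' fitted values): each event contributes one sample from its confidence distribution, and the predicted choice probability is the probability that one sample exceeds the other.

      import numpy as np

      rng = np.random.default_rng(1)

      def predicted_choice_prob(sampler_a, sampler_b, n=200_000):
          """P(choose A over B) under a single-sample rule:
          draw one value per event and pick the larger one (ties split evenly)."""
          a = sampler_a(n)
          b = sampler_b(n)
          return np.mean(a > b) + 0.5 * np.mean(a == b)

      # Hypothetical confidence distributions derived from two probability phrases,
      # e.g. "likely" vs "toss-up", represented here as Beta distributions.
      likely  = lambda n: rng.beta(7, 3, n)
      toss_up = lambda n: rng.beta(5, 5, n)

      print(predicted_choice_prob(likely, toss_up))   # predicted probability of choosing the "likely" event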

  2. Certainties and probabilities of the IPCC

    International Nuclear Information System (INIS)

    2004-01-01

    Based on an analysis of information about climate evolution, simulations of global warming, and the snow coverage monitoring of Meteo-France, the IPCC presented its certainties and probabilities concerning the greenhouse effect. (A.L.B.)

  3. The probability factor in establishing causation

    International Nuclear Information System (INIS)

    Hebert, J.

    1988-01-01

    This paper discusses the possibilities and limitations of methods using the probability factor in establishing the causal link between bodily injury, whether immediate or delayed, and the nuclear incident presumed to have caused it (NEA) [fr]

  4. Bayesian optimization for computationally extensive probability distributions.

    Science.gov (United States)

    Tamura, Ryo; Hukushima, Koji

    2018-01-01

    An efficient method for finding a better maximizer of computationally extensive probability distributions is proposed on the basis of a Bayesian optimization technique. A key idea of the proposed method is to use extreme values of acquisition functions by Gaussian processes for the next training phase, which should be located near a local maximum or a global maximum of the probability distribution. Our Bayesian optimization technique is applied to the posterior distribution in the effective physical model estimation, which is a computationally extensive probability distribution. Even when the number of sampling points on the posterior distributions is fixed to be small, the Bayesian optimization provides a better maximizer of the posterior distributions in comparison to those by the random search method, the steepest descent method, or the Monte Carlo method. Furthermore, the Bayesian optimization improves the results efficiently by combining it with the steepest descent method, and thus it is a powerful tool to search for a better maximizer of computationally extensive probability distributions.
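
    A minimal version of the procedure, a Gaussian-process surrogate whose acquisition function proposes the next evaluation point, can be sketched as follows. This is a generic expected-improvement loop on a toy one-dimensional "posterior", not the authors' implementation or their physical-model estimation problem.

      import numpy as np
      from scipy.stats import norm
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF

      def log_posterior(x):                       # toy target; expensive in real applications
          return -0.5 * (x - 2.0) ** 2 + 0.3 * np.sin(5.0 * x)

      rng = np.random.default_rng(0)
      grid = np.linspace(-5.0, 5.0, 400).reshape(-1, 1)

      X = rng.uniform(-5.0, 5.0, 5).reshape(-1, 1)   # small initial design
      y = log_posterior(X).ravel()

      for _ in range(15):
          gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), normalize_y=True).fit(X, y)
          mu, sigma = gp.predict(grid, return_std=True)
          best = y.max()
          sigma = np.maximum(sigma, 1e-12)
          z = (mu - best) / sigma
          ei = (mu - best) * norm.cdf(z) + sigma * norm.pdf(z)   # expected improvement
          x_next = grid[np.argmax(ei)]                           # extreme value of the acquisition
          X = np.vstack([X, x_next.reshape(1, -1)])
          y = np.append(y, log_posterior(x_next[0]))

      print("best x found:", X[np.argmax(y)][0], "log posterior:", y.max())

    The acquisition maximum concentrates new evaluations near promising regions, which is why such a loop typically needs far fewer target evaluations than random search when each evaluation is costly.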

  5. Characteristic length of the knotting probability revisited

    International Nuclear Information System (INIS)

    Uehara, Erica; Deguchi, Tetsuo

    2015-01-01

    We present a self-avoiding polygon (SAP) model for circular DNA in which the radius of impermeable cylindrical segments corresponds to the screening length of double-stranded DNA surrounded by counter ions. For the model we evaluate the probability for a generated SAP with N segments having a given knot K through simulation. We call it the knotting probability of a knot K with N segments for the SAP model. We show that when N is large the most significant factor in the knotting probability is given by the exponentially decaying part exp(−N/N_K), where the estimates of parameter N_K are consistent with the same value for all the different knots we investigated. We thus call it the characteristic length of the knotting probability. We give formulae expressing the characteristic length as a function of the cylindrical radius r_ex, i.e. the screening length of double-stranded DNA. (paper)

  6. Probability of Survival Decision Aid (PSDA)

    National Research Council Canada - National Science Library

    Xu, Xiaojiang; Amin, Mitesh; Santee, William R

    2008-01-01

    A Probability of Survival Decision Aid (PSDA) is developed to predict survival time for hypothermia and dehydration during prolonged exposure at sea in both air and water for a wide range of environmental conditions...

  7. Probability and statistics with integrated software routines

    CERN Document Server

    Deep, Ronald

    2005-01-01

    Probability & Statistics with Integrated Software Routines is a calculus-based treatment of probability concurrent with and integrated with statistics through interactive, tailored software applications designed to enhance the phenomena of probability and statistics. The software programs make the book unique. The book comes with a CD containing the interactive software leading to the Statistical Genie. The student can issue commands repeatedly while making parameter changes to observe the effects. Computer programming is an excellent skill for problem solvers, involving design, prototyping, data gathering, testing, redesign, validating, etc., all wrapped up in the scientific method. See also: CD to accompany Probability and Stats with Integrated Software Routines (0123694698). Incorporates more than 1,000 engaging problems with answers; includes more than 300 solved examples; uses varied problem solving methods.

  8. Determining probabilities of geologic events and processes

    International Nuclear Information System (INIS)

    Hunter, R.L.; Mann, C.J.; Cranwell, R.M.

    1985-01-01

    The Environmental Protection Agency has recently published a probabilistic standard for releases of high-level radioactive waste from a mined geologic repository. The standard sets limits for contaminant releases with more than one chance in 100 of occurring within 10,000 years, and less strict limits for releases of lower probability. The standard offers no methods for determining probabilities of geologic events and processes, and no consensus exists in the waste-management community on how to do this. Sandia National Laboratories is developing a general method for determining probabilities of a given set of geologic events and processes. In addition, we will develop a repeatable method for dealing with events and processes whose probability cannot be determined. 22 refs., 4 figs

  9. Pre-Aggregation with Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    2006-01-01

    Motivated by the increasing need to analyze complex, uncertain multidimensional data this paper proposes probabilistic OLAP queries that are computed using probability distributions rather than atomic values. The paper describes how to create probability distributions from base data, and how...... the distributions can be subsequently used in pre-aggregation. Since the probability distributions can become large, we show how to achieve good time and space efficiency by approximating the distributions. We present the results of several experiments that demonstrate the effectiveness of our methods. The work...... is motivated with a real-world case study, based on our collaboration with a leading Danish vendor of location-based services. This paper is the first to consider the approximate processing of probabilistic OLAP queries over probability distributions....

  10. Probability of spent fuel transportation accidents

    International Nuclear Information System (INIS)

    McClure, J.D.

    1981-07-01

    The transported volume of spent fuel, incident/accident experience and accident environment probabilities were reviewed in order to provide an estimate of spent fuel accident probabilities. In particular, the accident review assessed the accident experience for large casks of the type that could transport spent (irradiated) nuclear fuel. This review determined that since 1971, the beginning of official US Department of Transportation record keeping for accidents/incidents, there has been one spent fuel transportation accident. This information, coupled with estimated annual shipping volumes for spent fuel, indicated an estimated annual probability of a spent fuel transport accident of 5 x 10^-7 spent fuel accidents per mile. This is consistent with ordinary truck accident rates. A comparison of accident environments and regulatory test environments suggests that the probability of truck accidents exceeding the regulatory test for impact is approximately 10^-9 per mile

  11. Sampling, Probability Models and Statistical Reasoning: Statistical Inference

    Indian Academy of Sciences (India)

    Sampling, Probability Models and Statistical Reasoning: Statistical Inference. Mohan Delampady, V R Padmawar. General Article, Resonance – Journal of Science Education, Volume 1, Issue 5, May 1996, pp 49-58.

  12. Imprecise Probability Methods for Weapons UQ

    Energy Technology Data Exchange (ETDEWEB)

    Picard, Richard Roy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Vander Wiel, Scott Alan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-13

    Building on recent work in uncertainty quantification, we examine the use of imprecise probability methods to better characterize expert knowledge and to improve on misleading aspects of Bayesian analysis with informative prior distributions. Quantitative approaches to incorporate uncertainties in weapons certification are subject to rigorous external peer review, and in this regard, certain imprecise probability methods are well established in the literature and attractive. These methods are illustrated using experimental data from LANL detonator impact testing.

  13. Escape and transmission probabilities in cylindrical geometry

    International Nuclear Information System (INIS)

    Bjerke, M.A.

    1980-01-01

    An improved technique for the generation of escape and transmission probabilities in cylindrical geometry was applied to the existing resonance cross section processing code ROLAIDS. The algorithm of Hwang and Toppel, [ANL-FRA-TM-118] (with modifications) was employed. The probabilities generated were found to be as accurate as those given by the method previously applied in ROLAIDS, while requiring much less computer core storage and CPU time

  14. Probability and statistics for computer science

    CERN Document Server

    Johnson, James L

    2011-01-01

    Comprehensive and thorough development of both probability and statistics for serious computer scientists; goal-oriented: "to present the mathematical analysis underlying probability results." Special emphases on simulation and discrete decision theory. Mathematically-rich, but self-contained text, at a gentle pace. Review of calculus and linear algebra in an appendix. Mathematical interludes (in each chapter) which examine mathematical techniques in the context of probabilistic or statistical importance. Numerous section exercises, summaries, historical notes, and Further Readings for reinforcement.

  15. Collision Probabilities for Finite Cylinders and Cuboids

    Energy Technology Data Exchange (ETDEWEB)

    Carlvik, I

    1967-05-15

    Analytical formulae have been derived for the collision probabilities of homogeneous finite cylinders and cuboids. The formula for the finite cylinder contains double integrals, and the formula for the cuboid only single integrals. Collision probabilities have been calculated by means of the formulae and compared with values obtained by other authors. It was found that the calculations using the analytical formulae are much quicker and give higher accuracy than Monte Carlo calculations.

  16. High throughput nonparametric probability density estimation.

    Science.gov (United States)

    Farmer, Jenny; Jacobs, Donald

    2018-01-01

    In high throughput applications, such as those found in bioinformatics and finance, it is important to determine accurate probability distribution functions despite only minimal information about data characteristics, and without using human subjectivity. Such an automated process for univariate data is implemented to achieve this goal by merging the maximum entropy method with single order statistics and maximum likelihood. The only required properties of the random variables are that they are continuous and that they are, or can be approximated as, independent and identically distributed. A quasi-log-likelihood function based on single order statistics for sampled uniform random data is used to empirically construct a sample size invariant universal scoring function. Then a probability density estimate is determined by iteratively improving trial cumulative distribution functions, where better estimates are quantified by the scoring function that identifies atypical fluctuations. This criterion resists under and over fitting data as an alternative to employing the Bayesian or Akaike information criterion. Multiple estimates for the probability density reflect uncertainties due to statistical fluctuations in random samples. Scaled quantile residual plots are also introduced as an effective diagnostic to visualize the quality of the estimated probability densities. Benchmark tests show that estimates for the probability density function (PDF) converge to the true PDF as sample size increases on particularly difficult test probability densities that include cases with discontinuities, multi-resolution scales, heavy tails, and singularities. These results indicate the method has general applicability for high throughput statistical inference.

  17. La sabiduría como competencia gerencial

    OpenAIRE

    Pinzon-Barrios, Ana-Maria; Cortés Zapata, Gabriela

    2012-01-01

    Although the concept of wisdom has been widely studied by experts in areas such as philosophy, religion and psychology, it still faces limitations regarding its definition and evaluation. For this reason, the present work aims to formulate a definition of the concept of wisdom that allows a proposal for evaluating the concept as a competency in managers. To this end, a qualitative documentary analysis was carried out. In this way, various...

  18. Los ritmos como terapia para la impulsividad

    Directory of Open Access Journals (Sweden)

    Mónica Triviño

    2012-10-01

    Recent research shows that the use of rhythmic patterns facilitates optimally timed responses, so rhythm-based training could be proposed as a novel therapy for problems such as impulsivity. This could benefit patients with prefrontal damage or people with attention deficit hyperactivity disorder (ADHD), who often show impulsive behaviour as well as difficulty estimating the passage of time.

  19. La transparencia como objetivo del desarrollo sostenible

    OpenAIRE

    Juan José Gilli

    2017-01-01

    The purpose of this work is to clarify the meaning of transparency in public management as an ethical requirement concerning the information that public agents must provide within the scope of their functions, recognizing the citizen as the owner of the information they produce and keep. Transparency has particular value as a tool for fighting corruption and, in that way, contributing to the goal of achieving inclusive and effective institutions for...

  20. La universidad como organización

    Directory of Open Access Journals (Sweden)

    Luis Aurelio Ordoñez Burbano

    2001-07-01

    This article deals with the university as an organization and aims to reflect on the mission of the Colombian university within a variety of environments. As a starting point, it offers a clarification of the mission of the modern university and of research, in contrast to the colonial and nineteenth-century university, which was limited to the dissemination of pre-existing knowledge.

  1. Carolee Schneemann. El cine como autobiografía, la artista como actriz, el cuerpo como pincel

    Directory of Open Access Journals (Sweden)

    María Barbaño González-Moreno

    2017-10-01

    This paper analyses the relationship between film and women through the film work of Carolee Schneemann, mainly her autobiographical work Fuses (1964-1966). From it, the role of the artist as producer, director and main protagonist of all her works is considered. We thus reflect on the role of the creator-director as actor, which results in a cinematographic work that is necessarily autobiographical in tone. Adopting the avant-garde view of film as a personal diary, Schneemann explores in her work different aspects of women's identity and sexuality in an artistic, alternative cinema with a feminist political orientation. Treating her films as plastic material, the artist explores in parallel material and physical experimentation through the filmed bodies as well as the materiality of the film itself, excluding any narrative, dramatic or illusory possibility of the spectator's projection into the cinematic space and the private space of the creator.

  2. El videojuego como material educativo: La Odisea

    Directory of Open Access Journals (Sweden)

    Belén Mainer Blanco

    2012-04-01

    The research is based on the educational function that video games can fulfil, a field we consider unexplored for three main reasons: their recent arrival, their educational unpopularity (the rejection of the video game as a learning tool, regarded instead as a distraction), and the incomplete incorporation of the new information and communication technologies (ICT) in the family and educational spheres. In a second part, a practical application was carried out taking the great universal work "The Odyssey" as a reference, with the intention of showing the usefulness of the video game as an educational complement.

  3. El Derecho como argumentación

    Directory of Open Access Journals (Sweden)

    Atienza, Manuel

    1999-11-01

    Full Text Available Not available

    As against conceptions of law as norm, as fact or as value (which characterize, respectively, normativism, legal realism and natural law theory), a fourth approach is proposed here, which consists in viewing law as argumentation (and which takes on special relevance in democratic societies). There is, however, no single way of understanding legal argumentation. Although they are interconnected, three conceptions are distinguished in this paper: the formal, the material and the pragmatic or dialectical; many questions arising in the field of the theory of legal argumentation can be resolved, or clarified, by keeping this threefold perspective in mind.

  4. La sangre como espectáculo.

    Directory of Open Access Journals (Sweden)

    Rubén Darío Buitrón

    2015-01-01

    Full Text Available The author asserts that when information is conceived and treated as a commodity rather than as a social good, greed for profit degrades it into abject products in which blood becomes a spectacle that serves to inflame social morbid curiosity and to increase sales and advertising revenue. He notes that, regrettably, this kind of journalism is a well-sold plague in Ecuador.

  5. Causal inference, probability theory, and graphical insights.

    Science.gov (United States)

    Baker, Stuart G

    2013-11-10

    Causal inference from observational studies is a fundamental topic in biostatistics. The causal graph literature typically views probability theory as insufficient to express causal concepts in observational studies. In contrast, the view here is that probability theory is a desirable and sufficient basis for many topics in causal inference for the following two reasons. First, probability theory is generally more flexible than causal graphs: Besides explaining such causal graph topics as M-bias (adjusting for a collider) and bias amplification and attenuation (when adjusting for instrumental variable), probability theory is also the foundation of the paired availability design for historical controls, which does not fit into a causal graph framework. Second, probability theory is the basis for insightful graphical displays including the BK-Plot for understanding Simpson's paradox with a binary confounder, the BK2-Plot for understanding bias amplification and attenuation in the presence of an unobserved binary confounder, and the PAD-Plot for understanding the principal stratification component of the paired availability design. Published 2013. This article is a US Government work and is in the public domain in the USA.
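
    The BK-Plot itself cannot be reproduced here, but the arithmetic behind Simpson's paradox with a binary confounder is easy to illustrate. Below is a minimal Python sketch; the counts are invented for illustration, not taken from the article.

        # Hypothetical counts: within each stratum of the binary confounder Z the
        # "treat" group has the higher success rate, yet pooling reverses the ordering.
        data = {
            # (group, Z): (successes, trials)
            ("treat", 0): (81, 87),   ("treat", 1): (192, 263),
            ("ctrl",  0): (234, 270), ("ctrl",  1): (55, 80),
        }

        def rate(group, z=None):
            keys = [k for k in data if k[0] == group and (z is None or k[1] == z)]
            s = sum(data[k][0] for k in keys)
            n = sum(data[k][1] for k in keys)
            return s / n

        for z in (0, 1):
            print(f"Z={z}: treat {rate('treat', z):.2f} vs ctrl {rate('ctrl', z):.2f}")
        print(f"pooled: treat {rate('treat'):.2f} vs ctrl {rate('ctrl'):.2f}")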

  6. Las leyendas regionales como intangibles territoriales

    Directory of Open Access Journals (Sweden)

    Eloy Martos Núñez

    2015-01-01

    Full Text Available This article examines the concept of the legend as a territorial intangible at various scales, from the local to the regional or national, and its relationship with the construction of ethnic myths and the emergence of translocal traditions. To this end, it reviews the methodology of chorographic studies and the notions of ecotype and cultural landscape, as well as the ethnography of symbolic territories in the light of concepts such as the classical "temenos" and modern ones such as the "mytho-moteur" (Abadal, 1958). Case studies are presented which show how it is the Imaginary, in interaction with the geohistorical factors of a place, that often delimits and marks out the perimeter of a territory through channels such as legendary storytelling and paraliturgical rites such as processions or pilgrimages. The conclusion is that legends and archetypes of ethnic and genealogical origin rewrite traditions that create identities and can be projected into different spheres of political or recreational life, with equally different profiles.

  7. On the Possibility of Assigning Probabilities to Singular Cases, or: Probability Is Subjective Too!

    Directory of Open Access Journals (Sweden)

    Mark R. Crovelli

    2009-06-01

    Full Text Available Both Ludwig von Mises and Richard von Mises claimed that numerical probability could not be legitimately applied to singular cases. This paper challenges this aspect of the von Mises brothers’ theory of probability. It is argued that their denial that numerical probability could be applied to singular cases was based solely upon Richard von Mises’ exceptionally restrictive definition of probability. This paper challenges Richard von Mises’ definition of probability by arguing that the definition of probability necessarily depends upon whether the world is governed by time-invariant causal laws. It is argued that if the world is governed by time-invariant causal laws, a subjective definition of probability must be adopted. It is further argued that both the nature of human action and the relative frequency method for calculating numerical probabilities presuppose that the world is indeed governed by time-invariant causal laws. It is finally argued that the subjective definition of probability undercuts the von Mises claim that numerical probability cannot legitimately be applied to singular, non-replicable cases.

  8. Uncertainty relation and probability. Numerical illustration

    International Nuclear Information System (INIS)

    Fujikawa, Kazuo; Umetsu, Koichiro

    2011-01-01

    The uncertainty relation and the probability interpretation of quantum mechanics are intrinsically connected, as is evidenced by the evaluation of standard deviations. It is thus natural to ask if one can associate a very small uncertainty product of suitably sampled events with a very small probability. We have shown elsewhere that some examples of the evasion of the uncertainty relation noted in the past are in fact understood in this way. We here numerically illustrate that a very small uncertainty product is realized if one performs a suitable sampling of measured data that occur with a very small probability. We introduce a notion of cyclic measurements. It is also shown that our analysis is consistent with the Landau-Pollak-type uncertainty relation. It is suggested that the present analysis may help reconcile the contradicting views about the 'standard quantum limit' in the detection of gravitational waves. (author)

  9. Scoring Rules for Subjective Probability Distributions

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd

    The theoretical literature has a rich characterization of scoring rules for eliciting the subjective beliefs that an individual has for continuous events, but under the restrictive assumption of risk neutrality. It is well known that risk aversion can dramatically affect the incentives to correctly report the true subjective probability of a binary event, even under Subjective Expected Utility. To address this one can “calibrate” inferences about true subjective probabilities from elicited subjective probabilities over binary events, recognizing the incentives that risk averse agents have to distort reports. We characterize the comparable implications of the general case of a risk averse agent when facing a popular scoring rule over continuous events, and find that these concerns do not apply with anything like the same force. For empirically plausible levels of risk aversion, one can...
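
    A small grid-search sketch (not the calibration procedure of the paper) of the distortion described above for a binary event under a quadratic scoring rule: a risk-neutral agent reports the true belief, while a concave utility (assumed here as u(s) = s^0.3) pulls the optimal report toward 0.5. All numbers are illustrative assumptions.

        import numpy as np

        p_true = 0.7                       # the agent's true subjective probability
        reports = np.linspace(0.01, 0.99, 99)

        def quadratic_score(r, outcome):   # bounded in [0, 1]
            return 1 - (outcome - r) ** 2

        def best_report(utility):
            eu = [p_true * utility(quadratic_score(r, 1))
                  + (1 - p_true) * utility(quadratic_score(r, 0)) for r in reports]
            return reports[int(np.argmax(eu))]

        print("risk neutral:", best_report(lambda s: s))         # 0.70, i.e. truthful
        print("risk averse :", best_report(lambda s: s ** 0.3))  # shifted toward 0.5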

  10. Comparing coefficients of nested nonlinear probability models

    DEFF Research Database (Denmark)

    Kohler, Ulrich; Karlson, Kristian Bernt; Holm, Anders

    2011-01-01

    In a series of recent articles, Karlson, Holm and Breen have developed a method for comparing the estimated coefficients of two nested nonlinear probability models. This article describes this method and the user-written program khb that implements the method. The KHB-method is a general decomposition method that is unaffected by the rescaling or attenuation bias that arises in cross-model comparisons in nonlinear models. It recovers the degree to which a control variable, Z, mediates or explains the relationship between X and a latent outcome variable, Y*, underlying the nonlinear probability...
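
    The khb program itself is Stata software and is not reproduced here; the toy simulation below (made-up data, statsmodels for the logit fits) only demonstrates the rescaling problem the method addresses: adding a covariate Z that is independent of X still changes the logit coefficient of X, so naive cross-model comparison of coefficients is misleading.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(0)
        n = 200_000
        x = rng.normal(size=n)
        z = rng.normal(size=n)                      # independent of x: not a confounder
        y = (1.0 * x + 1.0 * z + rng.logistic(size=n) > 0).astype(int)

        full = sm.Logit(y, sm.add_constant(np.column_stack([x, z]))).fit(disp=0)
        reduced = sm.Logit(y, sm.add_constant(x)).fit(disp=0)
        print("beta_x with z   :", round(full.params[1], 2))     # close to the true 1.0
        print("beta_x without z:", round(reduced.params[1], 2))  # attenuated, about 0.87-0.88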

  11. A basic course in probability theory

    CERN Document Server

    Bhattacharya, Rabi

    2016-01-01

    This text develops the necessary background in probability theory underlying diverse treatments of stochastic processes and their wide-ranging applications. In this second edition, the text has been reorganized for didactic purposes, new exercises have been added and basic theory has been expanded. General Markov dependent sequences and their convergence to equilibrium is the subject of an entirely new chapter. The introduction of conditional expectation and conditional probability very early in the text maintains the pedagogic innovation of the first edition; conditional expectation is illustrated in detail in the context of an expanded treatment of martingales, the Markov property, and the strong Markov property. Weak convergence of probabilities on metric spaces and Brownian motion are two topics to highlight. A selection of large deviation and/or concentration inequalities ranging from those of Chebyshev, Cramer–Chernoff, Bahadur–Rao, to Hoeffding have been added, with illustrative comparisons of thei...

  12. Ignition probabilities for Compact Ignition Tokamak designs

    International Nuclear Information System (INIS)

    Stotler, D.P.; Goldston, R.J.

    1989-09-01

    A global power balance code employing Monte Carlo techniques has been developed to study the ''probability of ignition'' and has been applied to several different configurations of the Compact Ignition Tokamak (CIT). Probability distributions for the critical physics parameters in the code were estimated using existing experimental data. This included a statistical evaluation of the uncertainty in extrapolating the energy confinement time. A substantial probability of ignition is predicted for CIT if peaked density profiles can be achieved or if one of the two higher plasma current configurations is employed. In other cases, values of the energy multiplication factor Q of order 10 are generally obtained. The Ignitor-U and ARIES designs are also examined briefly. Comparisons of our empirically based confinement assumptions with two theory-based transport models yield conflicting results. 41 refs., 11 figs
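
    The CIT power-balance code itself is not available from the abstract; the fragment below is only a schematic of the Monte Carlo idea it describes: draw the uncertain physics parameters from assumed distributions, evaluate a figure of merit, and report the fraction of draws that reach the ignition criterion. The confinement multiplier, peaking factor, scaling and threshold are all placeholders, not CIT physics.

        import numpy as np

        rng = np.random.default_rng(1)
        n_draws = 100_000

        # Placeholder uncertainty model (not the experimental database used in the paper):
        h_factor = rng.lognormal(mean=0.0, sigma=0.25, size=n_draws)  # confinement multiplier
        peaking = rng.uniform(1.0, 2.0, size=n_draws)                 # density peaking factor

        q = 8.0 * h_factor**2 * peaking        # toy stand-in for the energy multiplication factor Q
        p_ignition = np.mean(q >= 20.0)        # "ignition" threshold, also a placeholder
        print(f"probability of ignition (toy model): {p_ignition:.2f}")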

  13. Independent events in elementary probability theory

    Science.gov (United States)

    Csenki, Attila

    2011-07-01

    In Probability and Statistics taught to mathematicians as a first introduction or to a non-mathematical audience, joint independence of events is introduced by requiring that the multiplication rule is satisfied. The following statement is usually tacitly assumed to hold (and, at best, intuitively motivated): if the n events E_1, E_2, …, E_n are jointly independent, then any two events A and B built in finitely many steps from two disjoint subsets of E_1, E_2, …, E_n are also independent. The operations 'union', 'intersection' and 'complementation' are permitted only when forming the events A and B. Here we examine this statement from the point of view of elementary probability theory. The approach described here is accessible also to users of probability theory and is believed to be novel.
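
    A brute-force check of the quoted statement for one small concrete case, assuming three jointly independent events realized as biased coin flips: A is built from {E_1, E_2} and B from {E_3}, and enumeration confirms P(A and B) = P(A)P(B). The probabilities 0.3, 0.6, 0.8 are arbitrary.

        from itertools import product

        p = {1: 0.3, 2: 0.6, 3: 0.8}           # P(E_i) for three jointly independent events
        omega = set(product([0, 1], repeat=3)) # outcomes (w1, w2, w3), wi = 1 iff E_i occurs

        def prob(event):
            total = 0.0
            for w in event:
                pr = 1.0
                for i in (1, 2, 3):
                    pr *= p[i] if w[i - 1] else 1 - p[i]
                total += pr
            return total

        E = {i: {w for w in omega if w[i - 1] == 1} for i in (1, 2, 3)}
        A = E[1] | E[2]                        # built from E_1, E_2 only
        B = omega - E[3]                       # complement of E_3
        print(prob(A & B), prob(A) * prob(B))  # equal: 0.144 and 0.144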

  14. Pointwise probability reinforcements for robust statistical inference.

    Science.gov (United States)

    Frénay, Benoît; Verleysen, Michel

    2014-02-01

    Statistical inference using machine learning techniques may be difficult with small datasets because of abnormally frequent data (AFDs). AFDs are observations that are much more frequent in the training sample than they should be, with respect to their theoretical probability, and include e.g. outliers. Estimates of parameters tend to be biased towards models which support such data. This paper proposes to introduce pointwise probability reinforcements (PPRs): the probability of each observation is reinforced by a PPR and a regularisation allows controlling the amount of reinforcement which compensates for AFDs. The proposed solution is very generic, since it can be used to robustify any statistical inference method which can be formulated as a likelihood maximisation. Experiments show that PPRs can be easily used to tackle regression, classification and projection: models are freed from the influence of outliers. Moreover, outliers can be filtered manually since an abnormality degree is obtained for each observation. Copyright © 2013 Elsevier Ltd. All rights reserved.

  15. Uncertainty the soul of modeling, probability & statistics

    CERN Document Server

    Briggs, William

    2016-01-01

    This book presents a philosophical approach to probability and probabilistic thinking, considering the underpinnings of probabilistic reasoning and modeling, which effectively underlie everything in data science. The ultimate goal is to call into question many standard tenets and lay the philosophical and probabilistic groundwork and infrastructure for statistical modeling. It is the first book devoted to the philosophy of data aimed at working scientists and calls for a new consideration in the practice of probability and statistics to eliminate what has been referred to as the "Cult of Statistical Significance". The book explains the philosophy of these ideas and not the mathematics, though there are a handful of mathematical examples. The topics are logically laid out, starting with basic philosophy as related to probability, statistics, and science, and stepping through the key probabilistic ideas and concepts, and ending with statistical models. Its jargon-free approach asserts that standard methods, suc...

  16. Introduction to probability with statistical applications

    CERN Document Server

    Schay, Géza

    2016-01-01

    Now in its second edition, this textbook serves as an introduction to probability and statistics for non-mathematics majors who do not need the exhaustive detail and mathematical depth provided in more comprehensive treatments of the subject. The presentation covers the mathematical laws of random phenomena, including discrete and continuous random variables, expectation and variance, and common probability distributions such as the binomial, Poisson, and normal distributions. More classical examples such as Montmort's problem, the ballot problem, and Bertrand’s paradox are now included, along with applications such as the Maxwell-Boltzmann and Bose-Einstein distributions in physics. Key features in new edition: * 35 new exercises * Expanded section on the algebra of sets * Expanded chapters on probabilities to include more classical examples * New section on regression * Online instructors' manual containing solutions to all exercises

  17. Python for probability, statistics, and machine learning

    CERN Document Server

    Unpingco, José

    2016-01-01

    This book covers the key ideas that link probability, statistics, and machine learning illustrated using Python modules in these areas. The entire text, including all the figures and numerical results, is reproducible using the Python codes and their associated Jupyter/IPython notebooks, which are provided as supplementary downloads. The author develops key intuitions in machine learning by working meaningful examples using multiple analytical methods and Python codes, thereby connecting theoretical concepts to concrete implementations. Modern Python modules like Pandas, Sympy, and Scikit-learn are applied to simulate and visualize important machine learning concepts like the bias/variance trade-off, cross-validation, and regularization. Many abstract mathematical ideas, such as convergence in probability theory, are developed and illustrated with numerical examples. This book is suitable for anyone with an undergraduate-level exposure to probability, statistics, or machine learning and with rudimentary knowl...

  18. EARLY HISTORY OF GEOMETRIC PROBABILITY AND STEREOLOGY

    Directory of Open Access Journals (Sweden)

    Magdalena Hykšová

    2012-03-01

    Full Text Available The paper provides an account of the history of geometric probability and stereology from the time of Newton to the early 20th century. It depicts development along two parallel ways: on one hand, the theory of geometric probability was formed with minor attention paid to applications other than spatial chance games. On the other hand, practical rules for estimating area or volume fractions and other characteristics, easily deducible from geometric probability theory, were proposed without knowledge of this branch. Special attention is paid to the paper of J.-É. Barbier published in 1860, which contained the fundamental stereological formulas but remained almost unnoticed both by mathematicians and practitioners.
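
    Barbier is best remembered for the needle problem, so as a quick self-contained taste of the geometric probability surveyed here, this is a Monte Carlo version of Buffon's needle with needle length equal to the line spacing; the crossing probability 2/π then yields an estimate of π. The sample size and seed are arbitrary.

        import numpy as np

        rng = np.random.default_rng(42)
        n = 1_000_000
        L = D = 1.0                                  # needle length = line spacing

        y = rng.uniform(0, D / 2, size=n)            # distance of needle centre to nearest line
        theta = rng.uniform(0, np.pi / 2, size=n)    # acute angle between needle and lines
        p_cross = np.mean(y <= (L / 2) * np.sin(theta))

        print("P(cross) ≈", p_cross, "(theory 2/pi ≈ 0.6366)")
        print("pi estimate:", 2 * L / (p_cross * D))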

  19. Probability analysis of nuclear power plant hazards

    International Nuclear Information System (INIS)

    Kovacs, Z.

    1985-01-01

    The probability analysis of risk, used for quantifying the risk of complex technological systems, especially nuclear power plants, is described. Risk is defined as the product of the probability of the occurrence of a dangerous event and the significance of its consequences. The process of the analysis may be divided into the stage of power plant analysis up to the point of release of harmful material into the environment (reliability analysis) and the stage of the analysis of the consequences of this release and the assessment of the risk. The sequence of operations in the individual stages is characterized. The tasks which Czechoslovakia faces in the development of the probability analysis of risk are listed, and a composition is recommended for the work team that is to cope with the task. (J.C.)

  20. Correlations and Non-Linear Probability Models

    DEFF Research Database (Denmark)

    Breen, Richard; Holm, Anders; Karlson, Kristian Bernt

    2014-01-01

    Although the parameters of logit and probit and other non-linear probability models are often explained and interpreted in relation to the regression coefficients of an underlying linear latent variable model, we argue that they may also be usefully interpreted in terms of the correlations between the dependent variable of the latent variable model and its predictor variables. We show how this correlation can be derived from the parameters of non-linear probability models, develop tests for the statistical significance of the derived correlation, and illustrate its usefulness in two applications. Under certain circumstances, which we explain, the derived correlation provides a way of overcoming the problems inherent in cross-sample comparisons of the parameters of non-linear probability models....

  1. Geometric modeling in probability and statistics

    CERN Document Server

    Calin, Ovidiu

    2014-01-01

    This book covers topics of Informational Geometry, a field which deals with the differential geometric study of the manifold probability density functions. This is a field that is increasingly attracting the interest of researchers from many different areas of science, including mathematics, statistics, geometry, computer science, signal processing, physics and neuroscience. It is the authors’ hope that the present book will be a valuable reference for researchers and graduate students in one of the aforementioned fields. This textbook is a unified presentation of differential geometry and probability theory, and constitutes a text for a course directed at graduate or advanced undergraduate students interested in applications of differential geometry in probability and statistics. The book contains over 100 proposed exercises meant to help students deepen their understanding, and it is accompanied by software that is able to provide numerical computations of several information geometric objects. The reader...

  2. Fixation probability on clique-based graphs

    Science.gov (United States)

    Choi, Jeong-Ok; Yu, Unjong

    2018-02-01

    The fixation probability of a mutant in the evolutionary dynamics of Moran process is calculated by the Monte-Carlo method on a few families of clique-based graphs. It is shown that the complete suppression of fixation can be realized with the generalized clique-wheel graph in the limit of small wheel-clique ratio and infinite size. The family of clique-star is an amplifier, and clique-arms graph changes from amplifier to suppressor as the fitness of the mutant increases. We demonstrate that the overall structure of a graph can be more important to determine the fixation probability than the degree or the heat heterogeneity. The dependence of the fixation probability on the position of the first mutant is discussed.
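
    The clique-based graphs of the paper need an explicit graph structure, which is not given in the abstract; the sketch below only reproduces the well-mixed (complete-graph) baseline of the Moran process, where the Monte Carlo estimate can be checked against the known fixation probability (1 - 1/r)/(1 - 1/r^N). Population size, fitness and run count are arbitrary choices.

        import numpy as np

        rng = np.random.default_rng(0)

        def fixation_probability(N=20, r=1.5, runs=20_000):
            """Moran birth-death process on a complete graph, starting from one mutant."""
            fixed = 0
            for _ in range(runs):
                i = 1                                    # current number of mutants
                while 0 < i < N:
                    # birth chosen proportional to fitness, death uniform over all N
                    p_up = (r * i / (r * i + N - i)) * ((N - i) / N)
                    p_down = ((N - i) / (r * i + N - i)) * (i / N)
                    u = rng.random() * (p_up + p_down)   # condition on the state changing
                    i += 1 if u < p_up else -1
                fixed += (i == N)
            return fixed / runs

        N, r = 20, 1.5
        print("simulated:", fixation_probability(N, r))
        print("exact    :", (1 - 1 / r) / (1 - 1 / r**N))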

  3. Duelling idiots and other probability puzzlers

    CERN Document Server

    Nahin, Paul J

    2002-01-01

    What are your chances of dying on your next flight, being called for jury duty, or winning the lottery? We all encounter probability problems in our everyday lives. In this collection of twenty-one puzzles, Paul Nahin challenges us to think creatively about the laws of probability as they apply in playful, sometimes deceptive, ways to a fascinating array of speculative situations. Games of Russian roulette, problems involving the accumulation of insects on flypaper, and strategies for determining the odds of the underdog winning the World Series all reveal intriguing dimensions to the worki

  4. Proposal for Modified Damage Probability Distribution Functions

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup; Hansen, Peter Friis

    1996-01-01

    Immediately following the Estonia disaster, the Nordic countries established a project entitled "Safety of Passenger/RoRo Vessels". As part of this project the present proposal for modified damage stability probability distribution functions has been developed and submitted to the "Sub-committee on st...

  5. Probability densities and Lévy densities

    DEFF Research Database (Denmark)

    Barndorff-Nielsen, Ole Eiler

    For positive Lévy processes (i.e. subordinators) formulae are derived that express the probability density or the distribution function in terms of power series in time t. The applicability of the results to finance and to turbulence is briefly indicated.

  6. Probabilities from entanglement, Born's rule from envariance

    International Nuclear Information System (INIS)

    Zurek, W.

    2005-01-01

    Full text: I shall discuss consequences of envariance (environment-assisted invariance) symmetry exhibited by entangled quantum states. I shall focus on the implications of envariance for the understanding of the origins and nature of ignorance, and, hence, for the origin of probabilities in physics. While the derivation of the Born rule for probabilities (p_k = |ψ_k|^2) is the principal accomplishment of this research, I shall explore the possibility that several other symptoms of the quantum-classical transition that are a consequence of decoherence can be justified directly by envariance -- i.e., without invoking Born's rule. (author)

  7. Risk Probability Estimating Based on Clustering

    DEFF Research Database (Denmark)

    Chen, Yong; Jensen, Christian D.; Gray, Elizabeth

    2003-01-01

    of prior experiences, recommendations from a trusted entity or the reputation of the other entity. In this paper we propose a dynamic mechanism for estimating the risk probability of a certain interaction in a given environment using hybrid neural networks. We argue that traditional risk assessment models...... from the insurance industry do not directly apply to ubiquitous computing environments. Instead, we propose a dynamic mechanism for risk assessment, which is based on pattern matching, classification and prediction procedures. This mechanism uses an estimator of risk probability, which is based...

  8. Fifty challenging problems in probability with solutions

    CERN Document Server

    Mosteller, Frederick

    1987-01-01

    Can you solve the problem of "The Unfair Subway"? Marvin gets off work at random times between 3 and 5 p.m. His mother lives uptown, his girlfriend downtown. He takes the first subway that comes in either direction and eats dinner with the one he is delivered to. His mother complains that he never comes to see her, but he says she has a 50-50 chance. He has had dinner with her twice in the last 20 working days. Explain. Marvin's adventures in probability are one of the fifty intriguing puzzles that illustrate both elementary and advanced aspects of probability, each problem designed to chall
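
    The standard resolution of the subway puzzle is a timetable asymmetry, simulated below under assumed schedules (both lines every 10 minutes, the uptown train arriving one minute after the downtown one): a uniformly random arrival then catches the uptown train first only about 10% of the time, matching the 2-out-of-20 record.

        import numpy as np

        rng = np.random.default_rng(7)
        arrival = rng.uniform(0, 10, size=1_000_000)   # arrival time within a 10-minute cycle

        # downtown trains pass at minute 0 (mod 10), uptown trains at minute 1 (mod 10)
        wait_downtown = (10 - arrival) % 10
        wait_uptown = (1 - arrival) % 10
        sees_mother = wait_uptown < wait_downtown

        print("fraction of dinners uptown (mother):", sees_mother.mean())   # about 0.10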

  9. Path probabilities of continuous time random walks

    International Nuclear Information System (INIS)

    Eule, Stephan; Friedrich, Rudolf

    2014-01-01

    Employing the path integral formulation of a broad class of anomalous diffusion processes, we derive the exact relations for the path probability densities of these processes. In particular, we obtain a closed analytical solution for the path probability distribution of a Continuous Time Random Walk (CTRW) process. This solution is given in terms of its waiting time distribution and short time propagator of the corresponding random walk as a solution of a Dyson equation. Applying our analytical solution we derive generalized Feynman–Kac formulae. (paper)

  10. Probable Gastrointestinal Toxicity of Kombucha Tea

    Science.gov (United States)

    Srinivasan, Radhika; Smolinske, Susan; Greenbaum, David

    1997-01-01

    Kombucha tea is a health beverage made by incubating the Kombucha “mushroom” in tea and sugar. Although therapeutic benefits have been attributed to the drink, neither its beneficial effects nor adverse side effects have been reported widely in the scientific literature. Side effects probably related to consumption of Kombucha tea are reported in four patients. Two presented with symptoms of allergic reaction, the third with jaundice, and the fourth with nausea, vomiting, and head and neck pain. In all four, use of Kombucha tea in proximity to onset of symptoms and symptom resolution on cessation of tea drinking suggest a probable etiologic association. PMID:9346462

  11. Quantum probability and quantum decision-making.

    Science.gov (United States)

    Yukalov, V I; Sornette, D

    2016-01-13

    A rigorous general definition of quantum probability is given, which is valid not only for elementary events but also for composite events, for operationally testable measurements as well as for inconclusive measurements, and also for non-commuting observables in addition to commutative observables. Our proposed definition of quantum probability makes it possible to describe quantum measurements and quantum decision-making on the same common mathematical footing. Conditions are formulated for the case when quantum decision theory reduces to its classical counterpart and for the situation where the use of quantum decision theory is necessary. © 2015 The Author(s).

  12. Lady luck the theory of probability

    CERN Document Server

    Weaver, Warren

    1982-01-01

    ""Should I take my umbrella?"" ""Should I buy insurance?"" ""Which horse should I bet on?"" Every day ― in business, in love affairs, in forecasting the weather or the stock market questions arise which cannot be answered by a simple ""yes"" or ""no."" Many of these questions involve probability. Probabilistic thinking is as crucially important in ordinary affairs as it is in the most abstruse realms of science. This book is the best nontechnical introduction to probability ever written. Its author, the late Dr. Warren Weaver, was a professor of mathematics, active in the Rockefeller and Sloa

  13. Bayesian estimation of core-melt probability

    International Nuclear Information System (INIS)

    Lewis, H.W.

    1984-01-01

    A very simple application of the canonical Bayesian algorithm is made to the problem of estimation of the probability of core melt in a commercial power reactor. An approximation to the results of the Rasmussen study on reactor safety is used as the prior distribution, and the observation that there has been no core melt yet is used as the single experiment. The result is a substantial decrease in the mean probability of core melt--factors of 2 to 4 for reasonable choices of parameters. The purpose is to illustrate the procedure, not to argue for the decrease
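
    A conjugate Gamma-Poisson sketch of the same kind of update: a Gamma prior on the core-melt frequency is combined with the observation of zero events over some accumulated reactor-years, which pulls the posterior mean down. The prior parameters and the operating experience below are invented stand-ins, not the values used in the paper.

        # Gamma(alpha, beta) prior on the core-melt frequency (events per reactor-year),
        # updated with a Poisson observation of zero events in T reactor-years.
        alpha, beta = 0.5, 1_000.0     # hypothetical prior, mean 5e-4 per reactor-year
        T = 2_000.0                    # hypothetical melt-free operating experience

        prior_mean = alpha / beta
        posterior_mean = alpha / (beta + T)    # posterior is Gamma(alpha, beta + T)
        print(f"prior mean      : {prior_mean:.1e} per reactor-year")
        print(f"posterior mean  : {posterior_mean:.1e} per reactor-year")
        print(f"reduction factor: {prior_mean / posterior_mean:.1f}")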

  14. Quantum probabilities as Dempster-Shafer probabilities in the lattice of subspaces

    International Nuclear Information System (INIS)

    Vourdas, A.

    2014-01-01

    The orthocomplemented modular lattice of subspaces L[H(d)] of a quantum system with d-dimensional Hilbert space H(d) is considered. A generalized additivity relation which holds for Kolmogorov probabilities is violated by quantum probabilities in the full lattice L[H(d)] (it is only valid within the Boolean subalgebras of L[H(d)]). This suggests the use of more general (than Kolmogorov) probability theories, and here the Dempster-Shafer probability theory is adopted. An operator D(H_1, H_2), which quantifies deviations from Kolmogorov probability theory, is introduced, and it is shown to be intimately related to the commutator of the projectors P(H_1), P(H_2) to the subspaces H_1, H_2. As an application, it is shown that the proof of the inequalities of Clauser, Horne, Shimony, and Holt for a system of two spin-1/2 particles is valid for Kolmogorov probabilities, but it is not valid for Dempster-Shafer probabilities. The violation of these inequalities in experiments supports the interpretation of quantum probabilities as Dempster-Shafer probabilities
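
    A two-dimensional numerical illustration of the additivity violation referred to above, using two non-orthogonal one-dimensional subspaces of R^2. The operator called D below is formed only by analogy with the abstract's D(H_1, H_2), as the deviation P(H_1 v H_2) + P(H_1 ^ H_2) - P(H_1) - P(H_2); in this two-dimensional example it is non-zero precisely when the projectors fail to commute. The choice of 60 degrees and of the state are arbitrary.

        import numpy as np

        def projector(v):
            v = np.asarray(v, dtype=float)
            v = v / np.linalg.norm(v)
            return np.outer(v, v)

        P1 = projector([1.0, 0.0])                                 # H1 = span{(1, 0)}
        P2 = projector([np.cos(np.pi / 3), np.sin(np.pi / 3)])     # H2 at 60 degrees to H1

        P_join = np.eye(2)           # H1 v H2 is all of R^2 (two distinct lines)
        P_meet = np.zeros((2, 2))    # H1 ^ H2 = {0}

        D = P_join + P_meet - P1 - P2        # deviation from Kolmogorov-style additivity
        commutator = P1 @ P2 - P2 @ P1

        print("D =\n", np.round(D, 3))                  # non-zero
        print("[P1, P2] =\n", np.round(commutator, 3))  # non-zero as well

        # For a state rho, the quantum probabilities Tr(rho P) then violate
        # p(H1 v H2) + p(H1 ^ H2) = p(H1) + p(H2) whenever Tr(rho D) != 0.
        rho = projector([1.0, 1.0])                     # a pure state along (1, 1)
        print("Tr(rho D) =", round(float(np.trace(rho @ D)), 3))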

  15. Tropical Cyclone Wind Probability Forecasting (WINDP).

    Science.gov (United States)

    1981-04-01

    ... small probabilities is limited by the number of significant digits given; therefore it should be regarded as being approximate.

  16. The Probability Heuristics Model of Syllogistic Reasoning.

    Science.gov (United States)

    Chater, Nick; Oaksford, Mike

    1999-01-01

    Proposes a probability heuristic model for syllogistic reasoning and confirms the rationality of this heuristic by an analysis of the probabilistic validity of syllogistic reasoning that treats logical inference as a limiting case of probabilistic inference. Meta-analysis and two experiments involving 40 adult participants and using generalized…

  17. Probability & Perception: The Representativeness Heuristic in Action

    Science.gov (United States)

    Lu, Yun; Vasko, Francis J.; Drummond, Trevor J.; Vasko, Lisa E.

    2014-01-01

    If the prospective students of probability lack a background in mathematical proofs, hands-on classroom activities may work well to help them to learn to analyze problems correctly. For example, students may physically roll a die twice to count and compare the frequency of the sequences. Tools such as graphing calculators or Microsoft Excel®…
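
    The die-rolling activity mentioned above is easy to mirror in code: specific ordered sequences such as (6, 6) and (3, 5) occur with the same long-run frequency of 1/36, even though the mixed sequence tends to feel more representative. The sample size is arbitrary.

        import numpy as np

        rng = np.random.default_rng(3)
        rolls = rng.integers(1, 7, size=(1_000_000, 2))    # many double rolls of a fair die

        freq_66 = np.mean((rolls[:, 0] == 6) & (rolls[:, 1] == 6))
        freq_35 = np.mean((rolls[:, 0] == 3) & (rolls[:, 1] == 5))
        print(freq_66, freq_35, 1 / 36)                    # both close to 1/36 ≈ 0.0278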

  18. Critique of `Elements of Quantum Probability'

    NARCIS (Netherlands)

    Gill, R.D.

    1998-01-01

    We analyse the thesis of Kummerer and Maassen that classical probability is unable to model the stochastic nature of the Aspect experiment, in which violation of Bell's inequality was experimentally demonstrated. According to these authors the experiment shows the need to introduce the extension

  19. Independent Events in Elementary Probability Theory

    Science.gov (United States)

    Csenki, Attila

    2011-01-01

    In Probability and Statistics taught to mathematicians as a first introduction or to a non-mathematical audience, joint independence of events is introduced by requiring that the multiplication rule is satisfied. The following statement is usually tacitly assumed to hold (and, at best, intuitively motivated): If the n events E[subscript 1],…

  20. Probable Unusual Transmission of Zika Virus

    Centers for Disease Control (CDC) Podcasts

    This podcast discusses a study about the probable unusual transmission of Zika Virus Infection from a scientist to his wife, published in the May 2011 issue of Emerging Infectious Diseases. Dr. Brian Foy, Associate Professor at Colorado State University, shares details of this event.

  1. Error probabilities in default Bayesian hypothesis testing

    NARCIS (Netherlands)

    Gu, Xin; Hoijtink, Herbert; Mulder, J.

    2016-01-01

    This paper investigates the classical type I and type II error probabilities of default Bayes factors for a Bayesian t test. Default Bayes factors quantify the relative evidence between the null hypothesis and the unrestricted alternative hypothesis without needing to specify prior distributions for

  2. Spatial Probability Cuing and Right Hemisphere Damage

    Science.gov (United States)

    Shaqiri, Albulena; Anderson, Britt

    2012-01-01

    In this experiment we studied statistical learning, inter-trial priming, and visual attention. We assessed healthy controls and right brain damaged (RBD) patients with and without neglect, on a simple visual discrimination task designed to measure priming effects and probability learning. All participants showed a preserved priming effect for item…

  3. Sampling, Probability Models and Statistical Reasoning -RE ...

    Indian Academy of Sciences (India)

    random sampling allows data to be modelled with the help of probability ... g based on different trials to get an estimate of the experimental error. ... research interests lie in the .... if e is indeed the true value of the proportion of defectives in the.

  4. Virus isolation: Specimen type and probable transmission

    Indian Academy of Sciences (India)

    Virus isolation: Specimen type and probable transmission. Over 500 CHIK virus isolations were made: 4 from male Ae. aegypti (?TOT), 6 from CSF (neurological involvement), and 1 from a 4-day-old child (transplacental transmission).

  5. Estimating the Probability of Negative Events

    Science.gov (United States)

    Harris, Adam J. L.; Corner, Adam; Hahn, Ulrike

    2009-01-01

    How well we are attuned to the statistics of our environment is a fundamental question in understanding human behaviour. It seems particularly important to be able to provide accurate assessments of the probability with which negative events occur so as to guide rational choice of preventative actions. One question that arises here is whether or…

  6. Concurrency meets probability: theory and practice (abstract)

    NARCIS (Netherlands)

    Katoen, Joost P.

    Treating random phenomena in concurrency theory has a long tradition. Petri nets [18, 10] and process algebras [14] have been extended with probabilities. The same applies to behavioural semantics such as strong and weak (bi)simulation [1], and testing pre-orders [5]. Beautiful connections between

  7. Confusion between Odds and Probability, a Pandemic?

    Science.gov (United States)

    Fulton, Lawrence V.; Mendez, Francis A.; Bastian, Nathaniel D.; Musal, R. Muzaffer

    2012-01-01

    This manuscript discusses the common confusion between the terms probability and odds. To emphasize the importance and responsibility of being meticulous in the dissemination of information and knowledge, this manuscript reveals five cases of sources of inaccurate statistical language imbedded in the dissemination of information to the general…

  8. Probability in Action: The Red Traffic Light

    Science.gov (United States)

    Shanks, John A.

    2007-01-01

    Emphasis on problem solving in mathematics has gained considerable attention in recent years. While statistics teaching has always been problem driven, the same cannot be said for the teaching of probability where discrete examples involving coins and playing cards are often the norm. This article describes an application of simple probability…

  9. Probability & Statistics: Modular Learning Exercises. Teacher Edition

    Science.gov (United States)

    Actuarial Foundation, 2012

    2012-01-01

    The purpose of these modules is to provide an introduction to the world of probability and statistics to accelerated mathematics students at the high school level. The modules also introduce students to real world math concepts and problems that property and casualty actuaries come across in their work. They are designed to be used by teachers and…

  10. Probability & Statistics: Modular Learning Exercises. Student Edition

    Science.gov (United States)

    Actuarial Foundation, 2012

    2012-01-01

    The purpose of these modules is to provide an introduction to the world of probability and statistics to accelerated mathematics students at the high school level. The materials are centered on the fictional town of Happy Shores, a coastal community which is at risk for hurricanes. Actuaries at an insurance company figure out the risks and…

  11. Conditional probability on MV-algebras

    Czech Academy of Sciences Publication Activity Database

    Kroupa, Tomáš

    2005-01-01

    Roč. 149, č. 2 (2005), s. 369-381 ISSN 0165-0114 R&D Projects: GA AV ČR IAA2075302 Institutional research plan: CEZ:AV0Z10750506 Keywords : conditional probability * tribe * MV-algebra Subject RIV: BA - General Mathematics Impact factor: 1.039, year: 2005

  12. Investigating Probability with the NBA Draft Lottery.

    Science.gov (United States)

    Quinn, Robert J.

    1997-01-01

    Investigates an interesting application of probability in the world of sports. Considers the role of permutations in the lottery system used by the National Basketball Association (NBA) in the United States to determine the order in which nonplayoff teams select players from the college ranks. Presents a lesson on this topic in which students work…

  13. Probability from a Socio-Cultural Perspective

    Science.gov (United States)

    Sharma, Sashi

    2016-01-01

    There exists considerable and rich literature on students' misconceptions about probability; less attention has been paid to the development of students' probabilistic thinking in the classroom. Grounded in an analysis of the literature, this article offers a lesson sequence for developing students' probabilistic understanding. In particular, a…

  14. Neutrosophic Probability, Set, And Logic (first version)

    OpenAIRE

    Smarandache, Florentin

    2000-01-01

    This project is a part of a National Science Foundation interdisciplinary project proposal. Starting from a new viewpoint in philosophy, the neutrosophy, one extends the classical "probability theory", "fuzzy set" and "fuzzy logic" to "neutrosophic probability", "neutrosophic set" and "neutrosophic logic", respectively. They are useful in artificial intelligence, neural networks, evolutionary programming, neutrosophic dynamic systems, and quantum mechanics.

  15. Pade approximant calculations for neutron escape probability

    International Nuclear Information System (INIS)

    El Wakil, S.A.; Saad, E.A.; Hendi, A.A.

    1984-07-01

    The neutron escape probability from a non-multiplying slab containing internal source is defined in terms of a functional relation for the scattering function for the diffuse reflection problem. The Pade approximant technique is used to get numerical results which compare with exact results. (author)
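
    The scattering-function calculation itself is not given in the abstract; as a generic reminder of the Pade technique it relies on, the snippet below builds the [2/2] Pade approximant of exp(x) from its Taylor coefficients with scipy.interpolate.pade and compares it with the exact values. The function and evaluation points are chosen only for illustration.

        from math import exp, factorial
        from scipy.interpolate import pade

        # Taylor coefficients of exp(x) up to x^4, then the [2/2] Pade approximant.
        an = [1 / factorial(k) for k in range(5)]
        p, q = pade(an, 2)              # numerator and denominator as numpy poly1d objects

        for x in (0.5, 1.0, 2.0):
            print(x, p(x) / q(x), exp(x))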

  16. On a paradox of probability theory

    International Nuclear Information System (INIS)

    Stuart, C.I.J.M.

    1989-01-01

    Costa de Beauregard's proposal concerning physical retrocausality has been shown to fail on two crucial points. However, it is argued that his proposal still merits serious attention. The argument arises from showing that his proposal reveals a paradox involving relations between conditional probabilities, statistical correlations and reciprocal causalities of the type exhibited by cooperative dynamics in physical systems. 4 refs. (Author)

  17. Escape probabilities for fluorescent x-rays

    International Nuclear Information System (INIS)

    Dance, D.R.; Day, G.J.

    1985-01-01

    Computation of the energy absorption efficiency of an x-ray photon detector involves consideration of the histories of the secondary particles produced in any initial or secondary interaction which may occur within the detector. In particular, the K or higher shell fluorescent x-rays which may be emitted following a photoelectric interaction can carry away a large fraction of the energy of the incident photon, especially if this energy is just above an absorption edge. The effects of such photons cannot be ignored and a correction term, depending upon the probability that the fluorescent x-rays will escape from the detector, must be applied to the energy absorption efficiency. For detectors such as x-ray intensifying screens, it has been usual to calculate this probability by numerical integration. In this note analytic expressions are derived for the escape probability of fluorescent photons from planar detectors in terms of exponential integral functions. Rational approximations for these functions are readily available and these analytic expressions therefore facilitate the computation of photon absorption efficiencies. A table is presented which should obviate the need for calculating the escape probability for most cases of interest. (author)
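
    The detector-specific escape-probability formulas are not reproduced in the abstract, but the exponential integral functions E_n(x) they are written in are directly available, for example in SciPy, which is the practical point about rational approximations; the check below compares scipy.special.expn with the defining integral. The sample arguments are arbitrary.

        import numpy as np
        from scipy.special import expn
        from scipy.integrate import quad

        def En_by_quadrature(n, x):
            # E_n(x) = integral from 1 to infinity of exp(-x*t) / t**n dt
            value, _ = quad(lambda t: np.exp(-x * t) / t**n, 1.0, np.inf)
            return value

        for n, x in [(1, 0.5), (2, 1.0), (3, 2.0)]:
            print(n, x, expn(n, x), En_by_quadrature(n, x))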

  18. Sequential Probability Ratio Tests: Conservative and Robust

    NARCIS (Netherlands)

    Kleijnen, J.P.C.; Shi, Wen

    2017-01-01

    In practice, most computers generate simulation outputs sequentially, so it is attractive to analyze these outputs through sequential statistical methods such as sequential probability ratio tests (SPRTs). We investigate several SPRTs for choosing between two hypothesized values for the mean output

  19. Applied probability models with optimization applications

    CERN Document Server

    Ross, Sheldon M

    1992-01-01

    Concise advanced-level introduction to stochastic processes that frequently arise in applied probability. Largely self-contained text covers Poisson process, renewal theory, Markov chains, inventory theory, Brownian motion and continuous time optimization models, much more. Problems and references at chapter ends. ""Excellent introduction."" - Journal of the American Statistical Association. Bibliography. 1970 edition.

  20. Quantum probability and conceptual combination in conjunctions.

    Science.gov (United States)

    Hampton, James A

    2013-06-01

    I consider the general problem of category conjunctions in the light of Pothos & Busemeyer (P&B)'s quantum probability (QP) account of the conjunction fallacy. I argue that their account as presented cannot capture the "guppy effect" - the case in which a class is a better member of a conjunction A^B than it is of either A or B alone.

  1. Monte Carlo methods to calculate impact probabilities

    Science.gov (United States)

    Rickman, H.; Wiśniowski, T.; Wajer, P.; Gabryszewski, R.; Valsecchi, G. B.

    2014-09-01

    Context. Unraveling the events that took place in the solar system during the period known as the late heavy bombardment requires the interpretation of the cratered surfaces of the Moon and terrestrial planets. This, in turn, requires good estimates of the statistical impact probabilities for different source populations of projectiles, a subject that has received relatively little attention, since the works of Öpik (1951, Proc. R. Irish Acad. Sect. A, 54, 165) and Wetherill (1967, J. Geophys. Res., 72, 2429). Aims: We aim to work around the limitations of the Öpik and Wetherill formulae, which are caused by singularities due to zero denominators under special circumstances. Using modern computers, it is possible to make good estimates of impact probabilities by means of Monte Carlo simulations, and in this work, we explore the available options. Methods: We describe three basic methods to derive the average impact probability for a projectile with a given semi-major axis, eccentricity, and inclination with respect to a target planet on an elliptic orbit. One is a numerical averaging of the Wetherill formula; the next is a Monte Carlo super-sizing method using the target's Hill sphere. The third uses extensive minimum orbit intersection distance (MOID) calculations for a Monte Carlo sampling of potentially impacting orbits, along with calculations of the relevant interval for the timing of the encounter allowing collision. Numerical experiments are carried out for an intercomparison of the methods and to scrutinize their behavior near the singularities (zero relative inclination and equal perihelion distances). Results: We find an excellent agreement between all methods in the general case, while there appear large differences in the immediate vicinity of the singularities. With respect to the MOID method, which is the only one that does not involve simplifying assumptions and approximations, the Wetherill averaging impact probability departs by diverging toward

  2. Bounding probabilistic safety assessment probabilities by reality

    International Nuclear Information System (INIS)

    Fragola, J.R.; Shooman, M.L.

    1991-01-01

    The investigation of failure in systems where failure is a rare event makes continual comparison between the developed probabilities and empirical evidence difficult. The comparison of the predictions of rare event risk assessments with historical reality is essential to prevent probabilistic safety assessment (PSA) predictions from drifting into fantasy. One approach to performing such comparisons is to search out and assign probabilities to natural events which, while extremely rare, have a basis in the history of natural phenomena or human activities. For example, the Segovian aqueduct and some of the Roman fortresses in Spain have existed for several millennia and in many cases show no physical signs of earthquake damage. This evidence could be used to bound the probability of earthquakes above a certain magnitude to less than 10^-3 per year. On the other hand, there is evidence that some repetitive actions can be performed with extremely low historical probabilities when operators are properly trained and motivated, and sufficient warning indicators are provided. The point is not that low probability estimates are impossible, but that the analysis assumptions require continual reassessment and the analysis predictions must be bounded by historical reality. This paper reviews the probabilistic predictions of PSA in this light, attempts to develop, in a general way, the limits which can be historically established and the consequent bounds that these limits place upon the predictions, and illustrates the methodology used in computing such limits. Further, the paper discusses the use of empirical evidence and the requirement for disciplined systematic approaches within the bounds of reality and the associated impact on PSA probabilistic estimates
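
    The aqueduct argument is a zero-failure bound; under the simplest Poisson assumption, surviving T years with no damaging event caps the annual rate at about -ln(alpha)/T at confidence 1 - alpha (the classic "rule of three" when alpha = 0.05), which is the kind of arithmetic behind a bound of order 10^-3 per year. The T = 2000 years below is an illustrative figure, not a claim about the actual structures.

        import math

        T = 2000.0             # years of survival with no damaging earthquake (illustrative)
        for alpha in (0.05, 0.10):
            lam_upper = -math.log(alpha) / T    # Poisson zero-event upper bound on the annual rate
            print(f"{100 * (1 - alpha):.0f}% upper bound: {lam_upper:.1e} per year")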

  3. Establishment probability in newly founded populations

    Directory of Open Access Journals (Sweden)

    Gusset Markus

    2012-06-01

    Full Text Available Abstract Background Establishment success in newly founded populations relies on reaching the established phase, which is defined by characteristic fluctuations of the population’s state variables. Stochastic population models can be used to quantify the establishment probability of newly founded populations; however, so far no simple but robust method for doing so existed. To determine a critical initial number of individuals that need to be released to reach the established phase, we used a novel application of the “Wissel plot”, where –ln(1 – P0(t)) is plotted against time t. This plot is based on the equation P0(t) = 1 – c1·e^(–ω1·t), which relates the probability of extinction by time t, P0(t), to two constants: c1 describes the probability of a newly founded population to reach the established phase, whereas ω1 describes the population’s probability of extinction per short time interval once established. Results For illustration, we applied the method to a previously developed stochastic population model of the endangered African wild dog (Lycaon pictus). A newly founded population reaches the established phase if the intercept of the (extrapolated) linear parts of the “Wissel plot” with the y-axis, which is –ln(c1), is negative. For wild dogs in our model, this is the case if a critical initial number of four packs, consisting of eight individuals each, are released. Conclusions The method we present to quantify the establishment probability of newly founded populations is generic and inferences thus are transferable to other systems across the field of conservation biology. In contrast to other methods, our approach disaggregates the components of a population’s viability by distinguishing establishment from persistence.
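
    A minimal numerical check of the plot construction, with arbitrary constants rather than the wild-dog estimates: for P0(t) = 1 - c1*exp(-omega1*t), the ordinate -ln(1 - P0(t)) is linear in t with slope omega1 and intercept -ln(c1), the quantity whose sign the establishment criterion examines.

        import numpy as np

        c1, omega1 = 0.8, 0.05                     # illustrative constants only
        t = np.arange(1, 101, dtype=float)
        P0 = 1 - c1 * np.exp(-omega1 * t)          # probability of extinction by time t

        y = -np.log(1 - P0)                        # the "Wissel plot" ordinate
        slope, intercept = np.polyfit(t, y, 1)     # linear in t by construction

        print("recovered slope    :", round(slope, 4), "(omega1 =", omega1, ")")
        print("recovered intercept:", round(intercept, 4), "(-ln(c1) =", round(-np.log(c1), 4), ")")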

  4. The probability and severity of decompression sickness

    Science.gov (United States)

    Hada, Ethan A.; Vann, Richard D.; Denoble, Petar J.

    2017-01-01

    Decompression sickness (DCS), which is caused by inert gas bubbles in tissues, is an injury of concern for scuba divers, compressed air workers, astronauts, and aviators. Case reports for 3322 air and N2-O2 dives, resulting in 190 DCS events, were retrospectively analyzed and the outcomes were scored as (1) serious neurological, (2) cardiopulmonary, (3) mild neurological, (4) pain, (5) lymphatic or skin, and (6) constitutional or nonspecific manifestations. Following standard U.S. Navy medical definitions, the data were grouped into mild—Type I (manifestations 4–6)–and serious–Type II (manifestations 1–3). Additionally, we considered an alternative grouping of mild–Type A (manifestations 3–6)–and serious–Type B (manifestations 1 and 2). The current U.S. Navy guidance allows for a 2% probability of mild DCS and a 0.1% probability of serious DCS. We developed a hierarchical trinomial (3-state) probabilistic DCS model that simultaneously predicts the probability of mild and serious DCS given a dive exposure. Both the Type I/II and Type A/B discriminations of mild and serious DCS resulted in a highly significant (p probability of ‘mild’ DCS resulted in a longer allowable bottom time for the same 2% limit. However, for the 0.1% serious DCS limit, we found a vastly decreased allowable bottom dive time for all dive depths. If the Type A/B scoring was assigned to outcome severity, the no decompression limits (NDL) for air dives were still controlled by the acceptable serious DCS risk limit rather than the acceptable mild DCS risk limit. However, in this case, longer NDL limits were allowed than with the Type I/II scoring. The trinomial model mild and serious probabilities agree reasonably well with the current air NDL only with the Type A/B scoring and when 0.2% risk of serious DCS is allowed. PMID:28296928

  5. An Alternative Version of Conditional Probabilities and Bayes' Rule: An Application of Probability Logic

    Science.gov (United States)

    Satake, Eiki; Amato, Philip P.

    2008-01-01

    This paper presents an alternative version of formulas of conditional probabilities and Bayes' rule that demonstrate how the truth table of elementary mathematical logic applies to the derivations of the conditional probabilities of various complex, compound statements. This new approach is used to calculate the prior and posterior probabilities…
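
    The truth-table flavour of the approach can be imitated by enumerating a small joint distribution over two binary statements and reading the conditional probabilities and Bayes' rule off it; the joint probabilities below are arbitrary.

        # Joint distribution over a hypothesis H and evidence E (both binary).
        joint = {  # P(H, E)
            (True, True): 0.08, (True, False): 0.02,
            (False, True): 0.09, (False, False): 0.81,
        }

        def P(pred):
            return sum(p for outcome, p in joint.items() if pred(*outcome))

        p_h = P(lambda h, e: h)
        p_e = P(lambda h, e: e)
        p_e_given_h = P(lambda h, e: h and e) / p_h
        p_h_given_e = P(lambda h, e: h and e) / p_e

        # Bayes' rule: P(H|E) = P(E|H) * P(H) / P(E)
        print(p_h_given_e, p_e_given_h * p_h / p_e)    # both equal 0.08 / 0.17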

  6. A Comprehensive Probability Project for the Upper Division One-Semester Probability Course Using Yahtzee

    Science.gov (United States)

    Wilson, Jason; Lawman, Joshua; Murphy, Rachael; Nelson, Marissa

    2011-01-01

    This article describes a probability project used in an upper division, one-semester probability course with third-semester calculus and linear algebra prerequisites. The student learning outcome focused on developing the skills necessary for approaching project-sized math/stat application problems. These skills include appropriately defining…

  7. La realidad como materia novelable: Alejo Carpentier

    Directory of Open Access Journals (Sweden)

    Raquel Arias Careaga

    2007-12-01

    Full Text Available Ever since Benito Pérez Galdós, in 1897, defended as literature's legitimate raw material "life itself, from which the artist draws the fictions that instruct and delight us" (Pérez Galdós, 1990: 159), realism as an instrument for probing and analysing society has not stopped growing and expanding. Critical realism, such as that of Galdós, implies a broadening of the flat concept of "reality" to include "memories, dreams, imagination, madness, symbols", contributing to the formation of a "total realism" (Rodríguez Puértolas, 2000, II: 93), to which is also added the assimilation of essential lessons such as those represented by the narrative of Cervantes: Galdós's fiction is rooted in the best of Cervantes, as can be seen in his peculiar sense of humour and irony, in his perspectivist conception of reality and in so many other things, some of them truly fundamental. Thus the concept of Nature and its dialectical relations with the human being; thus Love as a vital element animating the cosmic order, far removed from vulgar Romantic idealism (Ibid., 93).

  8. VOLCANIC RISK ASSESSMENT - PROBABILITY AND CONSEQUENCES

    International Nuclear Information System (INIS)

    G.A. Valentine; F.V. Perry; S. Dartevelle

    2005-01-01

    Risk is the product of the probability and consequences of an event. Both of these must be based upon sound science that integrates field data, experiments, and modeling, but must also be useful to decision makers who likely do not understand all aspects of the underlying science. We review a decision framework used in many fields such as performance assessment for hazardous and/or radioactive waste disposal sites that can serve to guide the volcanological community towards integrated risk assessment. In this framework the underlying scientific understanding of processes that affect probability and consequences drive the decision-level results, but in turn these results can drive focused research in areas that cause the greatest level of uncertainty at the decision level. We review two examples of the determination of volcanic event probability: (1) probability of a new volcano forming at the proposed Yucca Mountain radioactive waste repository, and (2) probability that a subsurface repository in Japan would be affected by the nearby formation of a new stratovolcano. We also provide examples of work on consequences of explosive eruptions, within the framework mentioned above. These include field-based studies aimed at providing data for ''closure'' of wall rock erosion terms in a conduit flow model, predictions of dynamic pressure and other variables related to damage by pyroclastic flow into underground structures, and vulnerability criteria for structures subjected to conditions of explosive eruption. Process models (e.g., multiphase flow) are important for testing the validity or relative importance of possible scenarios in a volcanic risk assessment. We show how time-dependent multiphase modeling of explosive ''eruption'' of basaltic magma into an open tunnel (drift) at the Yucca Mountain repository provides insight into proposed scenarios that include the development of secondary pathways to the Earth's surface. Addressing volcanic risk within a decision

  9. A Imagem como Ruína

    Directory of Open Access Journals (Sweden)

    Míriam Volpe

    2001-12-01

    Full Text Available The dialogism between Orson Welles's film Citizen Kane and S.T. Coleridge's poem "Kubla Khan" is analysed, bringing out not only the theme they share, the myth of the lost Paradise, but also the similarity in the organization of the discourse through the juxtaposition of images, as fragments of memory to be preserved.

  10. Los rincones como contextos significativos de aprendizaje

    OpenAIRE

    Sánchez Molero, María Luisa

    2016-01-01

    In this work a study is made of learning corners in the Early Childhood Education classroom, describing them as a model for organising classroom space and as a methodological strategy that favours socio-constructivist learning. The study is completed with the description and analysis of the corners of a specific classroom in a school, incorporating several improvement proposals grounded in the preceding theoretical study.

  11. PARP-1 como regulador del ciclo celular

    OpenAIRE

    Iglesias Vázquez, Pablo

    2015-01-01

    In the present study we set out to investigate the biological implications of the PARP-1/E2F-1 interaction in settings in which the transcription factor E2F-1 is of great importance, such as embryonic development and oncogenesis. In this respect, we have shown that both PJ34, an inhibitor of the enzymatic activity of PARP, and gossypol, an inhibitor of protein-protein interactions, are capable of reducing the transcriptional activity of E2F-1 and the proli...

  12. Dibujo infantil como medio de diagnostico

    OpenAIRE

    González Hernando, Elisa

    2015-01-01

    This document aims to demonstrate the importance of children's drawing in the proper integral development of the person. It studies the importance of drawing and its value when used as a diagnostic method for certain aspects that can shape a person's life. In short, what this final degree project develops is the role that drawing plays as a tool for monitoring the development of individuals, focusing ...

  13. La fatiga como estado motivacional subjetivo

    OpenAIRE

    D. Cárdenas; J. Conde-González; J.C. Perales

    2017-01-01

    There is currently no consensus on the factors that determine the onset of fatigue. Some factors derive exclusively from physical effort, others depend on the mental effort that accompanies it, and others on the outcomes of the task being performed. As a consequence, different explanatory models have been developed that seek to bring together the different reasons for its onset. Nevertheless, the current tendency is to understand fatigue as a motivational...

  14. Quitina y carboximetilquitosana como agentes desintegrantes

    OpenAIRE

    Sol A Fernández Monagas; María D Rodríguez Albadalejo; Ofelia Bilbao Revoredo; Olga M Nieto Acosta

    1998-01-01

    A comparative study of chitin and carboxymethylchitosan as disintegrating agents was carried out, and the influence of the method used to manufacture the tablets on the disintegrating activity of both polymers was evaluated. Chitin showed good characteristics as a disintegrating agent regardless of the method used to manufacture the tablets, whereas the disintegrating activity of carboxymethylchitosan was affected by the granu...

  15. Probables casos de alergia endocrínea

    Directory of Open Access Journals (Sweden)

    Miguel Agustín Solari

    1951-01-01

    Full Text Available Purpose. Out of respect for a terminology used by authors of unquestionable standing, we repeat in this work the expression "endocrine allergy", but we place on record that, in our opinion, it is not an adequate term for allergic phenomena in which hormones play the role of allergens. Strictly speaking, "endocrine allergy" refers to allergic phenomena located in the glands of internal secretion; it means that the "shock" organ is some part of the endocrine system. When a hormone acts as an allergen, the expression "hormonal allergy" should be used, and if it is a hormone produced in the very organism it makes ill, it would be appropriate to say "autogenous hormonal allergy". This work is in fact a study of probable cases of autogenous hormonal allergy; it is, moreover, a simple preliminary note, a case series, of a larger work in full progress. Three cases are presented here in which the successful therapeutic outcome calls for serious reflection in trying to understand it; hence this work carries in its title the word "probable", which is prudent and the only one we have allowed ourselves to use.

  16. Joint survival probability via truncated invariant copula

    International Nuclear Information System (INIS)

    Kim, Jeong-Hoon; Ma, Yong-Ki; Park, Chan Yeol

    2016-01-01

    Highlights: • We have studied an issue of dependence structure between default intensities. • We use a multivariate shot noise intensity process, where jumps occur simultaneously and their sizes are correlated. • We obtain the joint survival probability of the integrated intensities by using a copula. • We apply our theoretical result to pricing basket default swap spread. - Abstract: Given an intensity-based credit risk model, this paper studies dependence structure between default intensities. To model this structure, we use a multivariate shot noise intensity process, where jumps occur simultaneously and their sizes are correlated. Through very lengthy algebra, we obtain explicitly the joint survival probability of the integrated intensities by using the truncated invariant Farlie–Gumbel–Morgenstern copula with exponential marginal distributions. We also apply our theoretical result to pricing basket default swap spreads. This result can provide a useful guide for credit risk management.
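
    The record names the copula family but does not reproduce it; for orientation, the standard (untruncated) Farlie-Gumbel-Morgenstern copula applied as a survival copula to exponential marginals has the following shape. The truncated invariant variant actually used in the paper is not given in the abstract and is not reconstructed here.

    $$\bar C(u,v) = u\,v\,\bigl[1 + \theta\,(1-u)(1-v)\bigr], \qquad P(T_1 > t_1,\; T_2 > t_2) = \bar C\!\bigl(e^{-\lambda_1 t_1},\, e^{-\lambda_2 t_2}\bigr), \qquad \theta \in [-1,1],$$

    so the joint survival probability of the two integrated intensities is a mild, single-parameter perturbation of the independent case $\theta = 0$.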

  17. Probabilities, causes and propensities in physics

    CERN Document Server

    Suárez, Mauricio

    2010-01-01

    This volume defends a novel approach to the philosophy of physics: it is the first book devoted to a comparative study of probability, causality, and propensity, and their various interrelations, within the context of contemporary physics - particularly quantum and statistical physics. The philosophical debates and distinctions are firmly grounded upon examples from actual physics, thus exemplifying a robustly empiricist approach. The essays, by both prominent scholars in the field and promising young researchers, constitute a pioneer effort in bringing out the connections between probabilistic, causal and dispositional aspects of the quantum domain. This book will appeal to specialists in philosophy and foundations of physics, philosophy of science in general, metaphysics, ontology of physics theories, and philosophy of probability.

  18. Foundations of quantization for probability distributions

    CERN Document Server

    Graf, Siegfried

    2000-01-01

    Due to the rapidly increasing need for methods of data compression, quantization has become a flourishing field in signal and image processing and information theory. The same techniques are also used in statistics (cluster analysis), pattern recognition, and operations research (optimal location of service centers). The book gives the first mathematically rigorous account of the fundamental theory underlying these applications. The emphasis is on the asymptotics of quantization errors for absolutely continuous and special classes of singular probabilities (surface measures, self-similar measures) presenting some new results for the first time. Written for researchers and graduate students in probability theory the monograph is of potential interest to all people working in the disciplines mentioned above.

  19. Measurement of the resonance escape probability

    International Nuclear Information System (INIS)

    Anthony, J.P.; Bacher, P.; Lheureux, L.; Moreau, J.; Schmitt, A.P.

    1957-01-01

    The average cadmium ratio in natural uranium rods has been measured using equal-diameter natural uranium disks. These values, correlated with independent measurements of the lattice buckling, enabled us to calculate values of the resonance escape probability for the G1 reactor under one or the other of two definitions. Measurements were performed on 26 mm and 32 mm rods, giving the following values for the resonance escape probability p: 0.8976 ± 0.005 and 0.912 ± 0.006 (d. 26 mm), 0.8627 ± 0.009 and 0.884 ± 0.01 (d. 32 mm). The influence of either definition on the lattice parameters is discussed, leading to values of the effective integral. Similar experiments have been performed with thorium rods. (author) [fr]

  20. Approaches to Evaluating Probability of Collision Uncertainty

    Science.gov (United States)

    Hejduk, Matthew D.; Johnson, Lauren C.

    2016-01-01

    While the two-dimensional probability of collision (Pc) calculation has served as the main input to conjunction analysis risk assessment for over a decade, it has done this mostly as a point estimate, with relatively little effort made to produce confidence intervals on the Pc value based on the uncertainties in the inputs. The present effort seeks to try to carry these uncertainties through the calculation in order to generate a probability density of Pc results rather than a single average value. Methods for assessing uncertainty in the primary and secondary objects' physical sizes and state estimate covariances, as well as a resampling approach to reveal the natural variability in the calculation, are presented; and an initial proposal for operationally-useful display and interpretation of these data for a particular conjunction is given.
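
    A minimal sketch of the resampling idea described above. The two-dimensional Pc is approximated here by the Gaussian density at the miss vector times the hard-body area (a common small-HBR shortcut, not necessarily the paper's formulation), and the covariance and size uncertainty distributions are illustrative assumptions:

```python
import numpy as np

def pc_2d(miss_km, cov_km2, hard_body_radius_km):
    """Approximate 2D probability of collision: density of the relative-position
    Gaussian at the miss vector, times the hard-body disc area (small-HBR limit)."""
    det = np.linalg.det(cov_km2)
    inv = np.linalg.inv(cov_km2)
    density = np.exp(-0.5 * miss_km @ inv @ miss_km) / (2.0 * np.pi * np.sqrt(det))
    return density * np.pi * hard_body_radius_km**2

rng = np.random.default_rng(1)

miss = np.array([0.5, 0.2])                # nominal miss vector in the encounter plane, km
cov_nominal = np.array([[0.04, 0.01],
                        [0.01, 0.09]])     # combined position covariance, km^2
hbr = 0.020                                # combined hard-body radius, km

# Resample: scale the covariance and hard-body radius to reflect their own uncertainty.
pc_samples = []
for _ in range(10_000):
    k = rng.lognormal(mean=0.0, sigma=0.3)  # covariance scale factor (assumed distribution)
    r = hbr * rng.uniform(0.8, 1.2)         # hard-body size uncertainty (assumed)
    pc_samples.append(pc_2d(miss, k * cov_nominal, r))

pc_samples = np.array(pc_samples)
print("nominal Pc  :", pc_2d(miss, cov_nominal, hbr))
print("median Pc   :", np.median(pc_samples))
print("95% interval:", np.percentile(pc_samples, [2.5, 97.5]))
```

    The spread of `pc_samples` is the "probability density of Pc results" the abstract argues for, in place of the single point estimate.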

  1. Multiple model cardinalized probability hypothesis density filter

    Science.gov (United States)

    Georgescu, Ramona; Willett, Peter

    2011-09-01

    The Probability Hypothesis Density (PHD) filter propagates the first-moment approximation to the multi-target Bayesian posterior distribution while the Cardinalized PHD (CPHD) filter propagates both the posterior likelihood of (an unlabeled) target state and the posterior probability mass function of the number of targets. Extensions of the PHD filter to the multiple model (MM) framework have been published and were implemented either with a Sequential Monte Carlo or a Gaussian Mixture approach. In this work, we introduce the multiple model version of the more elaborate CPHD filter. We present the derivation of the prediction and update steps of the MMCPHD particularized for the case of two target motion models and proceed to show that in the case of a single model, the new MMCPHD equations reduce to the original CPHD equations.

  2. Conditional probabilities in Ponzano-Regge minisuperspace

    International Nuclear Information System (INIS)

    Petryk, Roman; Schleich, Kristin

    2003-01-01

    We examine the Hartle-Hawking no-boundary initial state for the Ponzano-Regge formulation of gravity in three dimensions. We consider the behavior of conditional probabilities and expectation values for geometrical quantities in this initial state for a simple minisuperspace model consisting of a two-parameter set of anisotropic geometries on a 2-sphere boundary. We find dependence on the cutoff used in the construction of Ponzano-Regge amplitudes for expectation values of edge lengths. However, these expectation values are cutoff independent when computed in certain, but not all, conditional probability distributions. Conditions that yield cutoff independent expectation values are those that constrain the boundary geometry to a finite range of edge lengths. We argue that such conditions have a correspondence to fixing a range of local time, as classically associated with the area of a surface for spatially closed cosmologies. Thus these results may hint at how classical spacetime emerges from quantum amplitudes

  3. Bayesian Prior Probability Distributions for Internal Dosimetry

    Energy Technology Data Exchange (ETDEWEB)

    Miller, G.; Inkret, W.C.; Little, T.T.; Martz, H.F.; Schillaci, M.E

    2001-07-01

    The problem of choosing a prior distribution for the Bayesian interpretation of measurements (specifically internal dosimetry measurements) is considered using a theoretical analysis and by examining historical tritium and plutonium urine bioassay data from Los Alamos. Two models for the prior probability distribution are proposed: (1) the log-normal distribution, when there is some additional information to determine the scale of the true result, and (2) the 'alpha' distribution (a simplified variant of the gamma distribution) when there is not. These models have been incorporated into version 3 of the Bayesian internal dosimetric code in use at Los Alamos (downloadable from our web site). Plutonium internal dosimetry at Los Alamos is now being done using prior probability distribution parameters determined self-consistently from population averages of Los Alamos data. (author)
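
    A minimal sketch of a Bayesian bioassay interpretation with a log-normal prior, evaluated on a grid. The prior parameters, the Gaussian measurement model and all numerical values are illustrative assumptions, not the parameters or algorithm of the Los Alamos code:

```python
import numpy as np

# Grid over the unknown true amount (e.g. activity in a bioassay sample), arbitrary units.
x = np.linspace(1e-3, 50.0, 5000)
dx = x[1] - x[0]

# Log-normal prior on the true result; scale parameters are illustrative assumptions.
mu_ln, sigma_ln = np.log(5.0), 0.8
prior = np.exp(-0.5 * ((np.log(x) - mu_ln) / sigma_ln) ** 2) / (x * sigma_ln * np.sqrt(2 * np.pi))

# Measurement model: observed value = true value + Gaussian noise (assumed sigma).
measured, sigma_meas = 7.0, 2.0
likelihood = np.exp(-0.5 * ((measured - x) / sigma_meas) ** 2)

# Bayes: posterior is proportional to prior * likelihood, normalized numerically on the grid.
posterior = prior * likelihood
posterior /= posterior.sum() * dx

mean = (x * posterior).sum() * dx
cdf = np.cumsum(posterior) * dx
lo, hi = x[np.searchsorted(cdf, 0.025)], x[np.searchsorted(cdf, 0.975)]
print(f"posterior mean ~ {mean:.2f}, 95% interval ~ ({lo:.2f}, {hi:.2f})")
```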

  4. Probability and statistics for particle physics

    CERN Document Server

    Mana, Carlos

    2017-01-01

    This book comprehensively presents the basic concepts of probability and Bayesian inference with sufficient generality to make them applicable to current problems in scientific research. The first chapter provides the fundamentals of probability theory that are essential for the analysis of random phenomena. The second chapter includes a full and pragmatic review of the Bayesian methods that constitute a natural and coherent framework with enough freedom to analyze all the information available from experimental data in a conceptually simple manner. The third chapter presents the basic Monte Carlo techniques used in scientific research, allowing a large variety of problems that are difficult to tackle by other procedures to be handled. The author also introduces a basic algorithm which enables readers to simulate samples from simple distributions, and describes useful cases for researchers in particle physics. The final chapter is devoted to the basic ideas of Information Theory, which are important in the Bayesian me...

  5. Probability Weighting as Evolutionary Second-best

    OpenAIRE

    Herold, Florian; Netzer, Nick

    2011-01-01

    The economic concept of the second-best involves the idea that multiple simultaneous deviations from a hypothetical first-best optimum may be optimal once the first-best itself can no longer be achieved, since one distortion may partially compensate for another. Within an evolutionary framework, we translate this concept to behavior under uncertainty. We argue that the two main components of prospect theory, the value function and the probability weighting function, are complements in the sec...

  6. Bayesian probability theory and inverse problems

    International Nuclear Information System (INIS)

    Kopec, S.

    1994-01-01

    Bayesian probability theory is applied to the approximate solution of inverse problems. In order to solve the moment problem with noisy data, the entropic prior is used. The expressions for the solution and its error bounds are presented. When the noise level tends to zero, the Bayesian solution tends to the classic maximum entropy solution in the L_2 norm. The way of using a spline prior is also shown. (author)

  7. Probability and Statistics in Aerospace Engineering

    Science.gov (United States)

    Rheinfurth, M. H.; Howell, L. W.

    1998-01-01

    This monograph was prepared to give the practicing engineer a clear understanding of probability and statistics with special consideration to problems frequently encountered in aerospace engineering. It is conceived to be both a desktop reference and a refresher for aerospace engineers in government and industry. It could also be used as a supplement to standard texts for in-house training courses on the subject.

  8. Statistical models based on conditional probability distributions

    International Nuclear Information System (INIS)

    Narayanan, R.S.

    1991-10-01

    We present a formulation of statistical mechanics models based on conditional probability distribution rather than a Hamiltonian. We show that it is possible to realize critical phenomena through this procedure. Closely linked with this formulation is a Monte Carlo algorithm, in which a configuration generated is guaranteed to be statistically independent from any other configuration for all values of the parameters, in particular near the critical point. (orig.)

  9. Marrakesh International Conference on Probability and Statistics

    CERN Document Server

    Ouassou, Idir; Rachdi, Mustapha

    2015-01-01

    This volume, which highlights recent advances in statistical methodology and applications, is divided into two main parts. The first part presents theoretical results on estimation techniques in functional statistics, while the second examines three key areas of application: estimation problems in queuing theory, an application in signal processing, and the copula approach to epidemiologic modelling. The book’s peer-reviewed contributions are based on papers originally presented at the Marrakesh International Conference on Probability and Statistics held in December 2013.

  10. Selected papers on analysis, probability, and statistics

    CERN Document Server

    Nomizu, Katsumi

    1994-01-01

    This book presents papers that originally appeared in the Japanese journal Sugaku. The papers fall into the general area of mathematical analysis as it pertains to probability and statistics, dynamical systems, differential equations and analytic function theory. Among the topics discussed are: stochastic differential equations, spectra of the Laplacian and Schrödinger operators, nonlinear partial differential equations which generate dissipative dynamical systems, fractal analysis on self-similar sets and the global structure of analytic functions.

  11. Clan structure analysis and rapidity gap probability

    International Nuclear Information System (INIS)

    Lupia, S.; Giovannini, A.; Ugoccioni, R.

    1995-01-01

    Clan structure analysis in rapidity intervals is generalized from the negative binomial multiplicity distribution to the wide class of compound Poisson distributions. The link of generalized clan structure analysis with correlation functions is also established. These theoretical results are then applied to minimum bias events and reveal interesting new features, which can be inspiring and useful for discussing data on rapidity gap probability at TEVATRON and HERA. (orig.)

  12. Clan structure analysis and rapidity gap probability

    Energy Technology Data Exchange (ETDEWEB)

    Lupia, S. [Turin Univ. (Italy). Ist. di Fisica Teorica]|[Istituto Nazionale di Fisica Nucleare, Turin (Italy); Giovannini, A. [Turin Univ. (Italy). Ist. di Fisica Teorica]|[Istituto Nazionale di Fisica Nucleare, Turin (Italy); Ugoccioni, R. [Turin Univ. (Italy). Ist. di Fisica Teorica]|[Istituto Nazionale di Fisica Nucleare, Turin (Italy)

    1995-03-01

    Clan structure analysis in rapidity intervals is generalized from the negative binomial multiplicity distribution to the wide class of compound Poisson distributions. The link of generalized clan structure analysis with correlation functions is also established. These theoretical results are then applied to minimum bias events and reveal interesting new features, which can be inspiring and useful for discussing data on rapidity gap probability at TEVATRON and HERA. (orig.)

  13. Introduction to tensorial resistivity probability tomography

    OpenAIRE

    Mauriello, Paolo; Patella, Domenico

    2005-01-01

    The probability tomography approach developed for the scalar resistivity method is here extended to the 2D tensorial apparent resistivity acquisition mode. The rotational invariant derived from the trace of the apparent resistivity tensor is considered, since on the datum plane it yields anomalies confined above the buried objects. Firstly, a departure function is introduced as the difference between the tensorial invariant measured over the real structure and that computed for a reference uni...

  14. Interaction probability value calculi for some scintillators

    International Nuclear Information System (INIS)

    Garcia-Torano Martinez, E.; Grau Malonda, A.

    1989-01-01

    Interaction probabilities for 17 gamma-ray energies between 1 and 1000 keV have been computed and tabulated. The tables may be applied to the case of cylindrical vials with a radius of 1.25 cm and volumes of 5, 10 and 15 ml. Toluene, Toluene/Alcohol, Dioxane-Naphthalene, PCS, INSTAGEL and HISAFE II scintillators are considered. Graphical results for 10 ml are also given. (Author) 11 refs

  15. Probability of collective excited state decay

    International Nuclear Information System (INIS)

    Manykin, Eh.A.; Ozhovan, M.I.; Poluehktov, P.P.

    1987-01-01

    Decay mechanisms of the condensed excited state formed from highly excited (Rydberg) atoms are considered, i.e. the stability of the so-called Rydberg substance is analyzed. It is shown that Auger recombination and radiative transitions are the basic processes. The corresponding probabilities are calculated and compared. It is ascertained that the ''Rydberg substance'' possesses a macroscopic lifetime (several seconds) and is, in a sense, metastable

  16. SureTrak Probability of Impact Display

    Science.gov (United States)

    Elliott, John

    2012-01-01

    The SureTrak Probability of Impact Display software was developed for use during rocket launch operations. The software displays probability of impact information for each ship near the hazardous area during the time immediately preceding the launch of an unguided vehicle. Wallops range safety officers need to be sure that the risk to humans is below a certain threshold during each use of the Wallops Flight Facility Launch Range. Under the variable conditions that can exist at launch time, the decision to launch must be made in a timely manner to ensure a successful mission while not exceeding those risk criteria. Range safety officers need a tool that can give them the needed probability of impact information quickly, and in a format that is clearly understandable. This application is meant to fill that need. The software is a reuse of part of software developed for an earlier project: Ship Surveillance Software System (S4). The S4 project was written in C++ using Microsoft Visual Studio 6. The data structures and dialog templates from it were copied into a new application that calls the implementation of the algorithms from S4 and displays the results as needed. In the S4 software, the list of ships in the area was received from one local radar interface and from operators who entered the ship information manually. The SureTrak Probability of Impact Display application receives ship data from two local radars as well as the SureTrak system, eliminating the need for manual data entry.

  17. On the universality of knot probability ratios

    Energy Technology Data Exchange (ETDEWEB)

    Janse van Rensburg, E J [Department of Mathematics and Statistics, York University, Toronto, Ontario M3J 1P3 (Canada); Rechnitzer, A, E-mail: rensburg@yorku.ca, E-mail: andrewr@math.ubc.ca [Department of Mathematics, University of British Columbia, 1984 Mathematics Road, Vancouver, BC V6T 1Z2 (Canada)

    2011-04-22

    Let p_n denote the number of self-avoiding polygons of length n on a regular three-dimensional lattice, and let p_n(K) be the number which have knot type K. The probability that a random polygon of length n has knot type K is p_n(K)/p_n and is known to decay exponentially with length (Sumners and Whittington 1988 J. Phys. A: Math. Gen. 21 1689-94, Pippenger 1989 Discrete Appl. Math. 25 273-8). Little is known rigorously about the asymptotics of p_n(K), but there is substantial numerical evidence. It is believed that the entropic exponent, α, is universal, while the exponential growth rate is independent of the knot type but varies with the lattice. The amplitude, C_K, depends on both the lattice and the knot type. This asymptotic form implies that the relative probability of a random polygon of length n having prime knot type K over prime knot type L is determined, for large n, by the ratio of the corresponding amplitudes. In the thermodynamic limit this probability ratio becomes an amplitude ratio; it should be universal and depend only on the knot types K and L. In this communication we examine the universality of these probability ratios for polygons in the simple cubic, face-centred cubic and body-centred cubic lattices. Our results support the hypothesis that these are universal quantities. For example, we estimate that a long random polygon is approximately 28 times more likely to be a trefoil than a figure-eight, independent of the underlying lattice, giving an estimate of the intrinsic entropy associated with knot types in closed curves. (fast track communication)
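
    The asymptotic form referred to above is not reproduced in the record; a sketch of the form commonly assumed in this literature (the exponent shift by the number of prime components $N_K$ is my assumption here, not a statement from the abstract) is

    $$p_n(K) \simeq C_K\, \mu^n\, n^{\alpha - 3 + N_K},$$

    so that for two prime knots $K$ and $L$ (with $N_K = N_L = 1$) the probability ratio tends to the amplitude ratio quoted in the abstract, $p_n(K)/p_n(L) \to C_K/C_L$ as $n \to \infty$.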

  18. Calculating Cumulative Binomial-Distribution Probabilities

    Science.gov (United States)

    Scheuer, Ernest M.; Bowerman, Paul N.

    1989-01-01

    The cumulative-binomial computer program CUMBIN, one of a set of three programs, calculates cumulative binomial probability distributions for arbitrary inputs. CUMBIN, NEWTONP (NPO-17556), and CROSSER (NPO-17557) can be used independently of one another. They analyze reliabilities and availabilities of k-out-of-n systems and are used by statisticians and users of statistical procedures, test planners, designers, and numerical analysts for calculations of reliability and availability. Program written in C.
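
    As a rough illustration of the quantity CUMBIN tabulates (not the program's actual algorithm, which is not described in the record), the cumulative binomial probability of at least k successes in n trials can be computed directly:

```python
from math import comb

def cumulative_binomial(k: int, n: int, p: float) -> float:
    """P(X >= k) for X ~ Binomial(n, p): e.g. the probability that a
    k-out-of-n system works when each component works with probability p."""
    return sum(comb(n, i) * p**i * (1.0 - p)**(n - i) for i in range(k, n + 1))

# Example: 3-out-of-5 system with component reliability 0.9
print(cumulative_binomial(3, 5, 0.9))  # ~0.99144
```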

  19. PSA, subjective probability and decision making

    International Nuclear Information System (INIS)

    Clarotti, C.A.

    1989-01-01

    PSA is the natural way of making decisions in the face of uncertainty about potentially dangerous plants; subjective probability, subjective utility and Bayesian statistics are the ideal tools for carrying out a PSA. This paper reports that, in order to support this statement, the various stages of the PSA procedure are examined in detail and, step by step, the superiority of Bayesian techniques with respect to sampling-theory machinery is proven

  20. Box-particle probability hypothesis density filtering

    OpenAIRE

    Schikora, M.; Gning, A.; Mihaylova, L.; Cremers, D.; Koch, W.

    2014-01-01

    This paper develops a novel approach for multitarget tracking, called box-particle probability hypothesis density filter (box-PHD filter). The approach is able to track multiple targets and estimates the unknown number of targets. Furthermore, it is capable of dealing with three sources of uncertainty: stochastic, set-theoretic, and data association uncertainty. The box-PHD filter reduces the number of particles significantly, which improves the runtime considerably. The small number of box-p...

  1. Probability and statistics in particle physics

    International Nuclear Information System (INIS)

    Frodesen, A.G.; Skjeggestad, O.

    1979-01-01

    Probability theory is introduced at an elementary level and given a simple and detailed exposition. The material on statistics has been organised with an eye to the experimental physicist's practical needs, which are likely to be statistical methods for estimation or decision-making. The book is intended for graduate students and research workers in experimental high energy and elementary particle physics, and numerous examples from these fields are presented. (JIW)

  2. Probable Unusual Transmission of Zika Virus

    Centers for Disease Control (CDC) Podcasts

    2011-05-23

    This podcast discusses a study about the probable unusual transmission of Zika Virus Infection from a scientist to his wife, published in the May 2011 issue of Emerging Infectious Diseases. Dr. Brian Foy, Associate Professor at Colorado State University, shares details of this event.  Created: 5/23/2011 by National Center for Emerging Zoonotic and Infectious Diseases (NCEZID).   Date Released: 5/25/2011.

  3. Heart sounds analysis using probability assessment

    Czech Academy of Sciences Publication Activity Database

    Plešinger, Filip; Viščor, Ivo; Halámek, Josef; Jurčo, Juraj; Jurák, Pavel

    2017-01-01

    Roč. 38, č. 8 (2017), s. 1685-1700 ISSN 0967-3334 R&D Projects: GA ČR GAP102/12/2034; GA MŠk(CZ) LO1212; GA MŠk ED0017/01/01 Institutional support: RVO:68081731 Keywords : heart sounds * FFT * machine learning * signal averaging * probability assessment Subject RIV: FS - Medical Facilities ; Equipment OBOR OECD: Medical engineering Impact factor: 2.058, year: 2016

  4. Heidegger E A Técnica Moderna Como Perigo E Como Salvação

    Directory of Open Access Journals (Sweden)

    Robson Costa Cordeiro

    2014-04-01

    Full Text Available The contemporary world is marked by an understanding of technology that stands on its own, arising from a meaning that is proper and autonomous to it. This meaning can be characterised as a logos that is constituted from its own nature and manifests itself as technology over the course of history. To understand this meaning and to assume it as a condition of our existence in the world is to understand technology as our heritage and our sending. In this sense, understanding the essence of technology is fundamental for the contemporary world, granted that in this process lie both the danger and the salvation of the species.

  5. Classical probabilities for Majorana and Weyl spinors

    International Nuclear Information System (INIS)

    Wetterich, C.

    2011-01-01

    Highlights: → Map of the classical statistical Ising model to fermionic quantum field theory. → Lattice-regularized real Grassmann functional integral for a single Weyl spinor. → Emerging complex structure characteristic for quantum physics. → A classical statistical ensemble describes a quantum theory. - Abstract: We construct a map between the quantum field theory of free Weyl or Majorana fermions and the probability distribution of a classical statistical ensemble for Ising spins or discrete bits. More precisely, a Grassmann functional integral based on a real Grassmann algebra specifies the time evolution of the real wave function q_τ(t) for the Ising states τ. The time-dependent probability distribution of a generalized Ising model is obtained as p_τ(t) = q_τ²(t). The functional integral employs a lattice regularization for single Weyl or Majorana spinors. We further introduce the complex structure characteristic for quantum mechanics. Probability distributions of the Ising model which correspond to one or many propagating fermions are discussed explicitly. Expectation values of observables can be computed equivalently in the classical statistical Ising model or in the quantum field theory for fermions.

  6. A quantum probability model of causal reasoning

    Directory of Open Access Journals (Sweden)

    Jennifer S Trueblood

    2012-05-01

    Full Text Available People can often outperform statistical methods and machine learning algorithms in situations that involve making inferences about the relationship between causes and effects. While people are remarkably good at causal reasoning in many situations, there are several instances where they deviate from expected responses. This paper examines three situations where judgments related to causal inference problems produce unexpected results and describes a quantum inference model based on the axiomatic principles of quantum probability theory that can explain these effects. Two of the three phenomena arise from the comparison of predictive judgments (i.e., the conditional probability of an effect given a cause) with diagnostic judgments (i.e., the conditional probability of a cause given an effect). The third phenomenon is a new finding examining order effects in predictive causal judgments. The quantum inference model uses the notion of incompatibility among different causes to account for all three phenomena. Psychologically, the model assumes that individuals adopt different points of view when thinking about different causes. The model provides good fits to the data and offers a coherent account for all three causal reasoning effects thus proving to be a viable new candidate for modeling human judgment.

  7. Failure probability analysis on mercury target vessel

    International Nuclear Information System (INIS)

    Ishikura, Syuichi; Futakawa, Masatoshi; Kogawa, Hiroyuki; Sato, Hiroshi; Haga, Katsuhiro; Ikeda, Yujiro

    2005-03-01

    Failure probability analysis was carried out to estimate the lifetime of the mercury target which will be installed into the JSNS (Japan Spallation Neutron Source) in J-PARC (Japan Proton Accelerator Research Complex). The lifetime was estimated taking the loading conditions and materials degradation into account. The loads considered on the target vessel were the static stresses due to thermal expansion and static pre-pressure on the He gas and mercury, and the dynamic stresses due to the thermally shocked pressure waves generated repeatedly at 25 Hz. Materials used in the target vessel will be degraded by fatigue, neutron and proton irradiation, mercury immersion, pitting damage, etc. The imposed stresses were evaluated through static and dynamic structural analyses. The material degradation was deduced from published experimental data. As a result, it was quantitatively confirmed that the failure probability over the lifetime expected in the design is very low, 10^-11 for the safety hull, meaning that it will hardly fail during the design lifetime. On the other hand, the beam window of the mercury vessel, which suffers high-pressure waves, exhibits a failure probability of 12%. It was concluded, therefore, that mercury leaking from the failed area at the beam window is adequately kept in the space between the safety hull and the mercury vessel by using mercury-leakage sensors. (author)

  8. Converting dose distributions into tumour control probability

    International Nuclear Information System (INIS)

    Nahum, A.E.

    1996-01-01

    The endpoints in radiotherapy that are truly of relevance are not dose distributions but the probability of local control, sometimes known as the Tumour Control Probability (TCP) and the Probability of Normal Tissue Complications (NTCP). A model for the estimation of TCP based on simple radiobiological considerations is described. It is shown that incorporation of inter-patient heterogeneity into the radiosensitivity parameter a through s_a can result in a clinically realistic slope for the dose-response curve. The model is applied to inhomogeneous target dose distributions in order to demonstrate the relationship between dose uniformity and s_a. The consequences of varying clonogenic density are also explored. Finally the model is applied to the target-volume DVHs for patients in a clinical trial of conformal pelvic radiotherapy; the effect of dose inhomogeneities on distributions of TCP are shown as well as the potential benefits of customizing the target dose according to normal-tissue DVHs. (author). 37 refs, 9 figs

  9. Converting dose distributions into tumour control probability

    Energy Technology Data Exchange (ETDEWEB)

    Nahum, A E [The Royal Marsden Hospital, London (United Kingdom). Joint Dept. of Physics

    1996-08-01

    The endpoints in radiotherapy that are truly of relevance are not dose distributions but the probability of local control, sometimes known as the Tumour Control Probability (TCP) and the Probability of Normal Tissue Complications (NTCP). A model for the estimation of TCP based on simple radiobiological considerations is described. It is shown that incorporation of inter-patient heterogeneity into the radiosensitivity parameter a through s_a can result in a clinically realistic slope for the dose-response curve. The model is applied to inhomogeneous target dose distributions in order to demonstrate the relationship between dose uniformity and s_a. The consequences of varying clonogenic density are also explored. Finally the model is applied to the target-volume DVHs for patients in a clinical trial of conformal pelvic radiotherapy; the effect of dose inhomogeneities on distributions of TCP are shown as well as the potential benefits of customizing the target dose according to normal-tissue DVHs. (author). 37 refs, 9 figs.
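
    The record does not reproduce the model; a minimal sketch of a Poisson-statistics TCP model of this general type, with inter-patient heterogeneity treated as a Gaussian spread $s_a$ in the radiosensitivity $\alpha$ (the Gaussian-averaging form and the symbols are my assumptions, not quoted from the paper), is

    $$\mathrm{TCP}(\alpha, D) = \exp\!\bigl(-N_0\, e^{-\alpha D}\bigr), \qquad \overline{\mathrm{TCP}}(D) = \frac{1}{\sqrt{2\pi}\, s_a} \int \exp\!\bigl(-N_0\, e^{-\alpha D}\bigr)\, e^{-(\alpha-\bar\alpha)^2 / 2 s_a^2}\, d\alpha,$$

    where $N_0$ is the initial number of clonogens and $D$ the (uniform) dose; averaging over $\alpha$ is what flattens the population dose-response curve towards clinically realistic slopes.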

  10. Consistent probabilities in loop quantum cosmology

    International Nuclear Information System (INIS)

    Craig, David A; Singh, Parampreet

    2013-01-01

    A fundamental issue for any quantum cosmological theory is to specify how probabilities can be assigned to various quantum events or sequences of events such as the occurrence of singularities or bounces. In previous work, we have demonstrated how this issue can be successfully addressed within the consistent histories approach to quantum theory for Wheeler–DeWitt-quantized cosmological models. In this work, we generalize that analysis to the exactly solvable loop quantization of a spatially flat, homogeneous and isotropic cosmology sourced with a massless, minimally coupled scalar field known as sLQC. We provide an explicit, rigorous and complete decoherent-histories formulation for this model and compute the probabilities for the occurrence of a quantum bounce versus a singularity. Using the scalar field as an emergent internal time, we show for generic states that the probability for a singularity to occur in this model is zero, and that of a bounce is unity, complementing earlier studies of the expectation values of the volume and matter density in this theory. We also show from the consistent histories point of view that all states in this model, whether quantum or classical, achieve arbitrarily large volume in the limit of infinite ‘past’ or ‘future’ scalar ‘time’, in the sense that the wave function evaluated at any arbitrary fixed value of the volume vanishes in that limit. Finally, we briefly discuss certain misconceptions concerning the utility of the consistent histories approach in these models. (paper)

  11. Probability and containment of turbine missiles

    International Nuclear Information System (INIS)

    Yeh, G.C.K.

    1976-01-01

    With the trend toward ever larger power generating plants with large high-speed turbines, an important plant design consideration is the potential for and consequences of mechanical failure of turbine rotors. Such rotor failure could result in high-velocity disc fragments (turbine missiles) perforating the turbine casing and jeopardizing vital plant systems. The designer must first estimate the probability of any turbine missile damaging any safety-related plant component for his turbine and his plant arrangement. If the probability is not low enough to be acceptable to the regulatory agency, he must design a shield to contain the postulated turbine missiles. Alternatively, the shield could be designed to retard (to reduce the velocity of) the missiles such that they would not damage any vital plant system. In this paper, some of the presently available references that can be used to evaluate the probability, containment and retardation of turbine missiles are reviewed; various alternative methods are compared; and subjects for future research are recommended. (Auth.)

  12. Pipe failure probability - the Thomas paper revisited

    International Nuclear Information System (INIS)

    Lydell, B.O.Y.

    2000-01-01

    Almost twenty years ago, in Volume 2 of Reliability Engineering (the predecessor of Reliability Engineering and System Safety), a paper by H. M. Thomas of Rolls Royce and Associates Ltd. presented a generalized approach to the estimation of piping and vessel failure probability. The 'Thomas-approach' used insights from actual failure statistics to calculate the probability of leakage and conditional probability of rupture given leakage. It was intended for practitioners without access to data on the service experience with piping and piping system components. This article revisits the Thomas paper by drawing on insights from development of a new database on piping failures in commercial nuclear power plants worldwide (SKI-PIPE). Partially sponsored by the Swedish Nuclear Power Inspectorate (SKI), the R and D leading up to this note was performed during 1994-1999. Motivated by data requirements of reliability analysis and probabilistic safety assessment (PSA), the new database supports statistical analysis of piping failure data. Against the background of this database development program, the article reviews the applicability of the 'Thomas approach' in applied risk and reliability analysis. It addresses the question whether a new and expanded database on the service experience with piping systems would alter the original piping reliability correlation as suggested by H. M. Thomas

  13. Nuclear data uncertainties: I, Basic concepts of probability

    Energy Technology Data Exchange (ETDEWEB)

    Smith, D.L.

    1988-12-01

    Some basic concepts of probability theory are presented from a nuclear-data perspective, in order to provide a foundation for thorough understanding of the role of uncertainties in nuclear data research. Topics included in this report are: events, event spaces, calculus of events, randomness, random variables, random-variable distributions, intuitive and axiomatic probability, calculus of probability, conditional probability and independence, probability distributions, binomial and multinomial probability, Poisson and interval probability, normal probability, the relationships existing between these probability laws, and Bayes' theorem. This treatment emphasizes the practical application of basic mathematical concepts to nuclear data research, and it includes numerous simple examples. 34 refs.

  14. Nuclear data uncertainties: I, Basic concepts of probability

    International Nuclear Information System (INIS)

    Smith, D.L.

    1988-12-01

    Some basic concepts of probability theory are presented from a nuclear-data perspective, in order to provide a foundation for thorough understanding of the role of uncertainties in nuclear data research. Topics included in this report are: events, event spaces, calculus of events, randomness, random variables, random-variable distributions, intuitive and axiomatic probability, calculus of probability, conditional probability and independence, probability distributions, binomial and multinomial probability, Poisson and interval probability, normal probability, the relationships existing between these probability laws, and Bayes' theorem. This treatment emphasizes the practical application of basic mathematical concepts to nuclear data research, and it includes numerous simple examples. 34 refs
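
    Since both records list Bayes' theorem among the topics without stating it, a one-line reminder in standard form (not quoted from the report):

    $$P(A \mid B) = \frac{P(B \mid A)\, P(A)}{P(B)}, \qquad P(B) = \sum_i P(B \mid A_i)\, P(A_i)$$

    for a partition $\{A_i\}$ of the event space.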

  15. Calculating the Probability of Returning a Loan with Binary Probability Models

    Directory of Open Access Journals (Sweden)

    Julian Vasilev

    2014-12-01

    Full Text Available The purpose of this article is to give a new approach to calculating the probability of returning a loan. Many factors affect the value of this probability. In this article, by using statistical and econometric models, some influencing factors are examined. The main approach is concerned with applying probit and logit models in loan management institutions. A new aspect of credit risk analysis is given. Calculating the probability of returning a loan is a difficult task. We assume that specific data fields concerning the contract (month of signing, year of signing, given sum) and data fields concerning the borrower of the loan (month of birth, year of birth (age), gender, region where he/she lives) may be independent variables in a binary logistic model with the dependent variable "the probability of returning a loan". It is proved that the month of signing a contract, the year of signing a contract, and the gender and age of the loan owner do not affect the probability of returning a loan. It is proved that the probability of returning a loan depends on the sum of the contract, the remoteness of the loan owner and the month of birth. The probability of returning a loan increases with the given sum, decreases with the proximity of the customer, increases for people born at the beginning of the year and decreases for people born at the end of the year.
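
    A minimal sketch of the kind of binary logit fit the article describes, run on synthetic data with hypothetical field names (loan_sum, borrower_distance and birth_month are placeholders of mine, not the article's variables or data):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 2000

# Hypothetical predictors standing in for the article's contract/borrower fields.
df = pd.DataFrame({
    "loan_sum": rng.uniform(1_000, 20_000, n),    # given sum of the contract
    "borrower_distance": rng.uniform(0, 300, n),  # remoteness of the borrower, km
    "birth_month": rng.integers(1, 13, n),        # 1 = January ... 12 = December
})

# Synthetic "returned the loan" outcome, loosely mimicking the article's findings:
# higher sum and greater remoteness raise the probability, later birth month lowers it.
logit_true = (-1.0 + 0.00008 * df["loan_sum"]
              + 0.004 * df["borrower_distance"]
              - 0.05 * df["birth_month"])
df["returned"] = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit_true)))

# Fit the binary logit model and inspect the estimated coefficients.
X = sm.add_constant(df[["loan_sum", "borrower_distance", "birth_month"]])
model = sm.Logit(df["returned"], X).fit(disp=False)
print(model.summary())

# Predicted probability of returning a loan for one hypothetical borrower.
new = pd.DataFrame({"const": [1.0], "loan_sum": [8_000],
                    "borrower_distance": [120], "birth_month": [2]})
print(float(model.predict(new)[0]))
```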

  16. La biblioteca como editora de contenidos

    Directory of Open Access Journals (Sweden)

    Alonso-Arévalo, Julio

    2015-12-01

    Full Text Available One of the most innovative features of the 21st-century library has to do with taking an active stance toward the management and generation of content. With the arrival of Web 2.0, libraries not only continue to safeguard and disseminate information, as they have done throughout their history; increasingly they also create new information with the aim of providing the best services to their citizens, through resources and services such as research guides, alert and new-acquisitions bulletins, web resources, information published on their blogs, and content management through repositories and open access journals. A further step in this dynamic concerns the library as publisher and distributor of books, especially at the local level, acting as promoter, trainer, driving force and disseminator of the works of the authors of its community.

  17. A rota como memória

    Directory of Open Access Journals (Sweden)

    Patrick Fraysse

    Full Text Available In the context of the study of heritage from a communicational point of view, this article allowed us to view a communication object par excellence, «the road», as a carrier of information to be deciphered and interpreted, that is, a document, but also as a repository of collective memory, that is to say, a monument. In parallel, the heritagisation of monuments, of architectural ensembles and above all of the itineraries that link them, in other words of the road, as well as their documentarisation (travel accounts, guides, databases), take part in a new institutionalisation of memory that also embraces mythical roads such as the Way of Saint James in France or the famous Route 66 in the United States.

  18. La medicina tradicional como medicina ecocultural

    OpenAIRE

    Aparicio Mena, Alfonso Julio

    2005-01-01

    Traditional therapeutic systems answer to the cultures of the peoples in which they arise. Within them, nature is conceived as intimately linked to tradition. For the members of traditional cultures, health is well-being understood as a balance between the human being, nature, beliefs and society.

  19. Los patrones de asentamiento como recurso patrimonial

    Directory of Open Access Journals (Sweden)

    Antonio Lista Martín

    2012-12-01

    Full Text Available Settlement patterns are the arrangements by which the different built elements, such as houses, roads, fields, public spaces or factories, are distributed, and they are a cultural expression of the society that builds them. The type of elements, their number and their distribution all depend on the technological level, the social structure and the cultural background. It can therefore be said that the patterns reflect the knowledge the society had of its environment, as well as its aspirations, priorities and even its bad practices. This paper proposes that these patterns can serve as a basis for present-day territorial planning, not only by preserving what is truly valuable but even by recreating or reproducing them at other scales whenever they can add value to the territory. The paper poses a question: what should be kept from the traditional patterns and what can, or even should, be changed? To answer it, the first part briefly analyses some typical settlements and the second sets out guidelines for action that have appeared as proposals in territorial planning in recent years.

  20. EL PERIODISMO CIVICO COMO COMUNICACION POLITICA

    Directory of Open Access Journals (Sweden)

    Ana María Miralles C.

    1998-01-01

    Full Text Available The author returns to the practice of civic journalism as the space in which the formation of public opinion takes on the characteristics of a dynamic political project in which the citizen is the fundamental actor.

  1. School and conference on probability theory

    International Nuclear Information System (INIS)

    Lawler, G.F.

    2004-01-01

    This volume includes expanded lecture notes from the School and Conference in Probability Theory held at ICTP in May 2001. Probability theory is a very large area, too large for a single school and conference. The organizers, G. Lawler, C. Newman, and S. Varadhan, chose to focus on a number of active research areas that have their roots in statistical physics. The pervasive theme in these lectures is trying to find the large time or large space behaviour of models defined on discrete lattices. Usually the definition of the model is relatively simple: either assigning a particular weight to each possible configuration (equilibrium statistical mechanics) or specifying the rules under which the system evolves (nonequilibrium statistical mechanics). Interacting particle systems is the area of probability that studies the evolution of particles (either finite or infinite in number) under random motions. The evolution of particles depends on the positions of the other particles; often one assumes that it depends only on the particles that are close to the particular particle. Thomas Liggett's lectures give an introduction to this very large area. Claudio Landim's lectures follow up by discussing hydrodynamic limits of particle systems. The goal of this area is to describe the long time, large system size dynamics in terms of partial differential equations. The area of random media is concerned with the properties of materials or environments that are not homogeneous. Percolation theory studies one of the simplest such models for impurities - taking a lattice and removing some of the vertices or bonds. Luiz Renato G. Fontes and Vladas Sidoravicius give a detailed introduction to this area. Random walk in random environment combines two sources of randomness - a particle performing stochastic motion in which the transition probabilities depend on position and have been chosen from some probability distribution. Alain-Sol Sznitman gives a survey of recent developments in this

  2. Impact of controlling the sum of error probability in the sequential probability ratio test

    Directory of Open Access Journals (Sweden)

    Bijoy Kumarr Pradhan

    2013-05-01

    Full Text Available A generalized modified method is proposed to control the sum of error probabilities in the sequential probability ratio test, minimizing the weighted average of the two average sample numbers under a simple null hypothesis and a simple alternative hypothesis, with the restriction that the sum of error probabilities is a pre-assigned constant, in order to find the optimal sample size; finally, a comparison is made with the optimal sample size found from the fixed-sample-size procedure. The results are applied to cases where the random variate follows a normal law as well as a Bernoulli law.
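
    For orientation, the classical Wald sequential probability ratio test that the article modifies uses the likelihood-ratio boundaries below; the generalized procedure that fixes the sum $\alpha + \beta$ is the article's own contribution and is not reproduced here:

    $$\Lambda_n = \prod_{i=1}^{n} \frac{f(x_i \mid H_1)}{f(x_i \mid H_0)}, \qquad \text{accept } H_0 \text{ if } \Lambda_n \le B \approx \frac{\beta}{1-\alpha}, \qquad \text{accept } H_1 \text{ if } \Lambda_n \ge A \approx \frac{1-\beta}{\alpha},$$

    and continue sampling while $B < \Lambda_n < A$; here $\alpha$ and $\beta$ are the type I and type II error probabilities whose sum the article constrains.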

  3. Probability functions in the context of signed involutive meadows

    NARCIS (Netherlands)

    Bergstra, J.A.; Ponse, A.

    2016-01-01

    The Kolmogorov axioms for probability functions are placed in the context of signed meadows. A completeness theorem is stated and proven for the resulting equational theory of probability calculus. Elementary definitions of probability theory are restated in this framework.

  4. Probability sampling in legal cases: Kansas cellphone users

    Science.gov (United States)

    Kadane, Joseph B.

    2012-10-01

    Probability sampling is a standard statistical technique. This article introduces the basic ideas of probability sampling, and shows in detail how probability sampling was used in a particular legal case.

  5. Probability, conditional probability and complementary cumulative distribution functions in performance assessment for radioactive waste disposal

    International Nuclear Information System (INIS)

    Helton, J.C.

    1996-03-01

    A formal description of the structure of several recent performance assessments (PAs) for the Waste Isolation Pilot Plant (WIPP) is given in terms of the following three components: a probability space (S_st, S_st, p_st) for stochastic uncertainty, a probability space (S_su, S_su, p_su) for subjective uncertainty and a function (i.e., a random variable) defined on the product space associated with (S_st, S_st, p_st) and (S_su, S_su, p_su). The explicit recognition of the existence of these three components allows a careful description of the use of probability, conditional probability and complementary cumulative distribution functions within the WIPP PA. This usage is illustrated in the context of the U.S. Environmental Protection Agency's standard for the geologic disposal of radioactive waste (40 CFR 191, Subpart B). The paradigm described in this presentation can also be used to impose a logically consistent structure on PAs for other complex systems

  6. Probability, conditional probability and complementary cumulative distribution functions in performance assessment for radioactive waste disposal

    International Nuclear Information System (INIS)

    Helton, J.C.

    1996-01-01

    A formal description of the structure of several recent performance assessments (PAs) for the Waste Isolation Pilot Plant (WIPP) is given in terms of the following three components: a probability space (S_st, L_st, P_st) for stochastic uncertainty, a probability space (S_su, L_su, P_su) for subjective uncertainty and a function (i.e., a random variable) defined on the product space associated with (S_st, L_st, P_st) and (S_su, L_su, P_su). The explicit recognition of the existence of these three components allows a careful description of the use of probability, conditional probability and complementary cumulative distribution functions within the WIPP PA. This usage is illustrated in the context of the US Environmental Protection Agency's standard for the geologic disposal of radioactive waste (40 CFR 191, Subpart B). The paradigm described in this presentation can also be used to impose a logically consistent structure on PAs for other complex systems

  7. Market-implied risk-neutral probabilities, actual probabilities, credit risk and news

    Directory of Open Access Journals (Sweden)

    Shashidhar Murthy

    2011-09-01

    Full Text Available Motivated by the credit crisis, this paper investigates links between risk-neutral probabilities of default implied by markets (e.g. from yield spreads) and their actual counterparts (e.g. from ratings). It discusses differences between the two and clarifies underlying economic intuition using simple representations of credit risk pricing. Observed large differences across bonds in the ratio of the two probabilities are shown to imply that apparently safer securities can be more sensitive to news.
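
    A standard back-of-the-envelope link between the two kinds of default probability, given only for orientation (the constant hazard rate and fixed recovery $R$ are simplifying assumptions of this sketch, not the paper's model):

    $$\lambda^{\mathbb{Q}} \approx \frac{s}{1-R},$$

    where $s$ is the bond's yield spread and $R$ the expected recovery; the ratio $\lambda^{\mathbb{Q}}/\lambda^{\mathbb{P}}$ of this market-implied (risk-neutral) intensity to the actual, ratings-based intensity is the quantity whose large cross-sectional variation the abstract highlights.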

  8. Qubit-qutrit separability-probability ratios

    International Nuclear Information System (INIS)

    Slater, Paul B.

    2005-01-01

    Paralleling our recent computationally intensive (quasi-Monte Carlo) work for the case N=4 (e-print quant-ph/0308037), we undertake the task for N=6 of computing to high numerical accuracy the formulas of Sommers and Zyczkowski (e-print quant-ph/0304041) for the (N^2-1)-dimensional volume and (N^2-2)-dimensional hyperarea of the (separable and nonseparable) NxN density matrices, based on the Bures (minimal monotone) metric--and also their analogous formulas (e-print quant-ph/0302197) for the (nonmonotone) flat Hilbert-Schmidt metric. With the same seven 10^9 well-distributed ('low-discrepancy') sample points, we estimate the unknown volumes and hyperareas based on five additional (monotone) metrics of interest, including the Kubo-Mori and Wigner-Yanase. Further, we estimate all of these seven volume and seven hyperarea (unknown) quantities when restricted to the separable density matrices. The ratios of separable volumes (hyperareas) to separable plus nonseparable volumes (hyperareas) yield estimates of the separability probabilities of generically rank-6 (rank-5) density matrices. The (rank-6) separability probabilities obtained based on the 35-dimensional volumes appear to be--independently of the metric (each of the seven inducing Haar measure) employed--twice as large as those (rank-5 ones) based on the 34-dimensional hyperareas. (An additional estimate--33.9982--of the ratio of the rank-6 Hilbert-Schmidt separability probability to the rank-4 one is quite clearly close to integral too.) The doubling relationship also appears to hold for the N=4 case for the Hilbert-Schmidt metric, but not the others. We fit simple exact formulas to our estimates of the Hilbert-Schmidt separable volumes and hyperareas in both the N=4 and N=6 cases

  9. Greek paideia and terms of probability

    Directory of Open Access Journals (Sweden)

    Fernando Leon Parada

    2016-06-01

    Full Text Available This paper addresses three aspects of the conceptual framework for a doctoral dissertation research in progress in the field of Mathematics Education, in particular in the subfield of teaching and learning basic concepts of Probability Theory at the college level. It intends to contrast, sustain and elucidate the central statement that the meanings of some of the basic terms used in Probability Theory were not formally defined by any specific theory but relate to primordial ideas developed in Western culture from Ancient Greek myths. The first aspect deals with the notion of uncertainty, with which Greek thinkers described several archaic gods and goddesses of Destiny, like Parcas and Moiras, often personified in the goddess Tyche (Fortuna for the Romans), as regarded in Werner Jaeger's "Paideia". The second aspect treats the idea of hazard from two different approaches: the first deals with hazard, denoted by Plato with the already demythologized term 'tyche', from the viewpoint of innate knowledge, as Jaeger points out. The second approach deals with hazard from a perspective that could be called "phenomenological", from which Aristotle attempted to articulate uncertainty with a discourse based on the hypothesis of causality. The term 'causal' was opposed both to 'casual' and to 'spontaneous' (as used in the expression "spontaneous generation"), attributing uncertainty to ignorance of the future, thus respecting causal flow. The third aspect treated in the paper refers to some definitions and etymologies of other modern words that have become technical terms in current Probability Theory, confirming the above-mentioned main proposition of this paper.

  10. Probability of Criticality for MOX SNF

    International Nuclear Information System (INIS)

    P. Gottlieb

    1999-01-01

    The purpose of this calculation is to provide a conservative (upper bound) estimate of the probability of criticality for mixed oxide (MOX) spent nuclear fuel (SNF) of the Westinghouse pressurized water reactor (PWR) design that has been proposed for use with the Plutonium Disposition Program (Ref. 1, p. 2). This calculation uses a Monte Carlo technique similar to that used for ordinary commercial SNF (Ref. 2, Sections 2 and 5.2). Several scenarios, covering a range of parameters, are evaluated for criticality. Parameters specifying the loss of fission products and iron oxide from the waste package are particularly important. This calculation is associated with disposal of MOX SNF.
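
    The calculation itself rests on detailed neutronics, which cannot be reproduced here; the sketch below only illustrates the Monte Carlo structure of such an upper-bound estimate: sample degradation parameters, evaluate a surrogate k_eff (a made-up linear placeholder), and report the exceedance fraction together with a one-sided binomial upper bound. All distributions, coefficients and limits are assumptions.

```python
import numpy as np
from scipy.stats import beta

rng = np.random.default_rng(2)

def surrogate_keff(fission_product_loss, iron_oxide_loss, water_fraction):
    """Placeholder surrogate for k_eff as a function of degradation parameters.
    The coefficients are invented; a real calculation would interpolate
    tabulated neutronics results over these parameters."""
    return (0.80
            + 0.10 * fission_product_loss   # losing neutron absorbers raises k_eff
            + 0.05 * iron_oxide_loss
            + 0.04 * water_fraction)

n = 100_000
k_limit = 0.98   # illustrative critical limit

# Assumed degradation-parameter distributions (placeholders)
fp_loss = rng.uniform(0.0, 1.0, n)
fe_loss = rng.beta(2.0, 5.0, n)
water = rng.uniform(0.0, 1.0, n)

exceed = int(np.sum(surrogate_keff(fp_loss, fe_loss, water) > k_limit))
p_hat = exceed / n
# Conservative (upper-bound) flavour: one-sided 95% Clopper-Pearson bound,
# which stays positive even when no exceedance is observed.
p_upper = beta.ppf(0.95, exceed + 1, n - exceed)
print(f"exceedances: {exceed}, point estimate: {p_hat:.2e}, 95% upper bound: {p_upper:.2e}")
```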

  11. Probability Distribution for Flowing Interval Spacing

    International Nuclear Information System (INIS)

    Kuzio, S.

    2001-01-01

    The purpose of this analysis is to develop a probability distribution for flowing interval spacing. A flowing interval is defined as a fractured zone that transmits flow in the Saturated Zone (SZ), as identified through borehole flow meter surveys (Figure 1). This analysis uses the term "flowing interval spacing" as opposed to fracture spacing, which is typically used in the literature. The term fracture spacing was not used in this analysis because the data used identify a zone (or flowing interval) that contains fluid-conducting fractures but do not distinguish how many or which fractures comprise the flowing interval. The flowing interval spacing is measured between the midpoints of each flowing interval. Fracture spacing within the SZ is defined as the spacing between fractures, with no regard to which fractures are carrying flow. The Development Plan associated with this analysis is entitled "Probability Distribution for Flowing Interval Spacing" (CRWMS M and O 2000a). The parameter from this analysis may be used in the TSPA SR/LA Saturated Zone Flow and Transport Work Direction and Planning Documents: (1) "Abstraction of Matrix Diffusion for SZ Flow and Transport Analyses" (CRWMS M and O 1999a) and (2) "Incorporation of Heterogeneity in SZ Flow and Transport Analyses" (CRWMS M and O 1999b). A limitation of this analysis is that the probability distribution of flowing interval spacing may underestimate the effect of incorporating matrix diffusion processes in the SZ transport model because of the possible overestimation of the flowing interval spacing. Larger flowing interval spacing results in a decrease in the matrix diffusion processes. This analysis may overestimate the flowing interval spacing because the number of fractures that contribute to a flowing interval cannot be determined from the data. Because each flowing interval probably has more than one contributing fracture, the true spacing between flow-carrying fractures could be smaller than the flowing interval spacing developed here.
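
    As a hedged illustration of how such a distribution might be constructed (the cited analysis follows its own documented procedure), one can take flowing-interval midpoint depths from a borehole flow-meter survey, form the spacings between midpoints, and fit a parametric distribution by moments. The depths below and the lognormal choice are assumptions made only for the sketch.

```python
import numpy as np

# Hypothetical flowing-interval midpoint depths (metres) from one borehole's
# flow-meter survey; real values would come from the cited data sources.
midpoints = np.array([412.0, 431.5, 468.2, 519.9, 541.3, 602.7, 688.0])

spacings = np.diff(np.sort(midpoints))          # spacing between midpoints

# Fit a lognormal distribution by matching the mean and standard deviation
# of log-spacing (method of moments in log space).
mu, sigma = np.log(spacings).mean(), np.log(spacings).std(ddof=1)
print(f"lognormal fit: mu={mu:.3f}, sigma={sigma:.3f}")

# The fitted distribution could then be sampled in a transport abstraction.
# Note the caveat in the record: because each flowing interval may bundle
# several fractures, this spacing tends to overestimate true fracture spacing.
samples = np.random.default_rng(3).lognormal(mu, sigma, size=10_000)
print(f"median spacing ~ {np.median(samples):.1f} m")
```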

  12. A short walk in quantum probability

    Science.gov (United States)

    Hudson, Robin

    2018-04-01

    This is a personal survey of aspects of quantum probability related to the Heisenberg commutation relation for canonical pairs. Using the failure, in general, of non-negativity of the Wigner distribution for canonical pairs to motivate a more satisfactory quantum notion of joint distribution, we visit a central limit theorem for such pairs and a resulting family of quantum planar Brownian motions which deform the classical planar Brownian motion, together with a corresponding family of quantum stochastic areas. This article is part of the themed issue 'Hilbert's sixth problem'.

  13. Probability and logical structure of statistical theories

    International Nuclear Information System (INIS)

    Hall, M.J.W.

    1988-01-01

    A characterization of statistical theories is given which incorporates both classical and quantum mechanics. It is shown that each statistical theory induces an associated logic and joint probability structure, and simple conditions are given for the structure to be of a classical or quantum type. This provides an alternative to the quantum logic approach to axiomatic quantum mechanics. The Bell inequalities may be derived for those statistical theories that have a classical structure and satisfy a locality condition weaker than factorizability. The relation of these inequalities to the issue of hidden variable theories for quantum mechanics is discussed and clarified.
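
    The Bell-inequality remark can be made concrete with the CHSH form: a statistical theory with the classical structure described here obeys |S| <= 2, while quantum mechanics reaches 2*sqrt(2) on the singlet state. The check below uses standard textbook measurement angles and is not drawn from the paper itself.

```python
import numpy as np

# Pauli matrices
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

def spin(theta):
    """Spin measurement along an axis at angle theta in the x-z plane."""
    return np.cos(theta) * sz + np.sin(theta) * sx

# Singlet state |psi> = (|01> - |10>) / sqrt(2)
psi = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)
rho = np.outer(psi, psi.conj())

def E(a, b):
    """Quantum correlation <A x B> for measurement angles a, b."""
    return np.real(np.trace(rho @ np.kron(spin(a), spin(b))))

a1, a2 = 0.0, np.pi / 2
b1, b2 = np.pi / 4, 3 * np.pi / 4

S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
print(f"CHSH value |S| = {abs(S):.4f}  (local/classical bound: 2, Tsirelson bound: 2*sqrt(2) ~ 2.828)")
```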

  14. Quantum operations, state transformations and probabilities

    International Nuclear Information System (INIS)

    Chefles, Anthony

    2002-01-01

    In quantum operations, probabilities characterize both the degree of the success of a state transformation and, as density operator eigenvalues, the degree of mixedness of the final state. We give a unified treatment of pure→pure state transformations, covering both probabilistic and deterministic cases. We then discuss the role of majorization in describing the dynamics of mixing in quantum operations. The conditions for mixing enhancement for all initial states are derived. We show that mixing is monotonically decreasing for deterministic pure→pure transformations, and discuss the relationship between these transformations and deterministic local operations with classical communication entanglement transformations
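
    A minimal helper for the majorization condition the abstract invokes: an operation does not decrease mixedness for a given input when the input's eigenvalue vector majorizes the output's. The spectra in the example are illustrative, not taken from the paper.

```python
import numpy as np

def majorizes(x, y, tol=1e-12):
    """Return True if x majorizes y (x > y in the majorization order): for
    every k, the k largest entries of x sum to at least those of y, with
    equal totals. If an input spectrum majorizes the output spectrum, the
    operation has made the state at least as mixed."""
    xs = np.sort(np.asarray(x))[::-1]
    ys = np.sort(np.asarray(y))[::-1]
    if abs(xs.sum() - ys.sum()) > tol:
        return False
    return bool(np.all(np.cumsum(xs) >= np.cumsum(ys) - tol))

# Example: a depolarizing-type operation sends a pure qubit state to a mixed
# one; the initial spectrum (1, 0) majorizes the final (0.75, 0.25), so
# mixedness has increased (illustrative numbers).
initial = [1.0, 0.0]
final = [0.75, 0.25]
print(majorizes(initial, final))   # True: the output is more mixed
```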

  15. An introduction to probability and statistical inference

    CERN Document Server

    Roussas, George G

    2003-01-01

    "The text is wonderfully written and has the mostcomprehensive range of exercise problems that I have ever seen." - Tapas K. Das, University of South Florida"The exposition is great; a mixture between conversational tones and formal mathematics; the appropriate combination for a math text at [this] level. In my examination I could find no instance where I could improve the book." - H. Pat Goeters, Auburn, University, Alabama* Contains more than 200 illustrative examples discussed in detail, plus scores of numerical examples and applications* Chapters 1-8 can be used independently for an introductory course in probability* Provides a substantial number of proofs

  16. STRIP: stream learning of influence probabilities

    DEFF Research Database (Denmark)

    Kutzkov, Konstantin

    2013-01-01

    cascades, and developing applications such as viral marketing. Motivated by modern microblogging platforms, such as twitter, in this paper we study the problem of learning influence probabilities in a data-stream scenario, in which the network topology is relatively stable and the challenge of a learning algorithm is to keep up with a continuous stream of tweets using a small amount of time and memory. Our contribution is a number of randomized approximation algorithms, categorized according to the available space (superlinear, linear, and sublinear in the number of nodes n) and according to different models...
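
    For orientation only, a naive exact-counting estimator of influence probabilities over an action stream is sketched below. It is not the STRIP algorithm (whose point is randomized approximation under tight space budgets), and the follower graph, time window and event stream are invented.

```python
from collections import defaultdict

class InfluenceEstimator:
    """Toy streaming estimator of influence probabilities p(u -> v): the
    fraction of u's actions that are repeated, within a time window, by a
    follower v. This exact-counting version uses memory linear in the number
    of edges; the paper's contribution is randomized approximations that
    use less space."""

    def __init__(self, followers, window):
        self.followers = followers              # u -> set of followers v
        self.window = window                    # influence time window
        self.last_action = {}                   # (user, item) -> time
        self.actions = defaultdict(int)         # u -> number of actions
        self.propagations = defaultdict(int)    # (u, v) -> influenced count

    def observe(self, user, item, time):
        """Process one stream event: `user` acts on `item` at `time`."""
        # Credit every followee whose recent action on the same item
        # could have influenced this one.
        for u, vs in self.followers.items():
            if user in vs:
                t0 = self.last_action.get((u, item))
                if t0 is not None and 0 < time - t0 <= self.window:
                    self.propagations[(u, user)] += 1
        self.actions[user] += 1
        self.last_action[(user, item)] = time

    def probability(self, u, v):
        return self.propagations[(u, v)] / self.actions[u] if self.actions[u] else 0.0

# Tiny usage example on a made-up follower graph and stream
est = InfluenceEstimator({"alice": {"bob"}}, window=10)
for user, item, t in [("alice", "x", 1), ("bob", "x", 4), ("alice", "y", 6), ("bob", "y", 20)]:
    est.observe(user, item, t)
print(est.probability("alice", "bob"))   # 0.5: one of alice's two actions propagated
```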

  17. Modulation Based on Probability Density Functions

    Science.gov (United States)

    Williams, Glenn L.

    2009-01-01

    A proposed method of modulating a sinusoidal carrier signal to convey digital information involves the use of histograms representing probability density functions (PDFs) that characterize samples of the signal waveform. The method is based partly on the observation that when a waveform is sampled (whether by analog or digital means) over a time interval at least as long as one half cycle of the waveform, the samples can be sorted by frequency of occurrence, thereby constructing a histogram representing a PDF of the waveform during that time interval.
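
    The idea can be illustrated by histogramming samples of a plain sinusoid: over a full cycle the sample values follow the arcsine density 1/(pi*sqrt(1-x^2)) on (-1, 1). The code below reproduces only this histogram/PDF observation; the suggestion that a modulator would shape the waveform so the received histogram matches one of several reference PDFs is our reading of the abstract, not a specification.

```python
import numpy as np

# Sample one full cycle of a unit sinusoid and histogram the sample values;
# normalised, the histogram approximates the arcsine density of a sine wave.
t = np.linspace(0.0, 2.0 * np.pi, 10_000, endpoint=False)
samples = np.sin(t)

hist, edges = np.histogram(samples, bins=20, range=(-1.0, 1.0), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
expected = 1.0 / (np.pi * np.sqrt(1.0 - centers**2))
for c, h, e in zip(centers[::5], hist[::5], expected[::5]):
    print(f"x~{c:+.2f}  empirical~{h:.2f}  arcsine pdf~{e:.2f}")
```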

  18. PWR reactor pressure vessel failure probabilities

    International Nuclear Information System (INIS)

    Dufresne, J.; Lanore, J.M.; Lucia, A.C.; Elbaz, J.; Brunnhuber, R.

    1980-05-01

    To evaluate the rupture probability of an LWR vessel, a probabilistic method applying fracture mechanics in probabilistic form was proposed previously, but it appears that a more accurate evaluation is possible. Consequently, a joint collaboration agreement signed in 1976 between CEA, EURATOM, JRC Ispra and FRAMATOME set up and started a research program covering three parts: development of a computer code, data acquisition and processing, and a supporting experimental program which aims at clarifying the most important parameters used in the COVASTOL computer code.
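
    The probabilistic fracture mechanics idea behind codes of this kind can be sketched with a toy Monte Carlo: sample crack depth, stress and toughness, apply the linear-elastic criterion K_I = Y * sigma * sqrt(pi * a) > K_IC, and estimate the failure fraction. The distributions and numbers below are invented and deliberately pessimistic so the estimate is visible with 10^6 samples; this is not COVASTOL, and realistic vessel probabilities are far smaller and typically require variance-reduction techniques.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 1_000_000

# Illustrative input distributions (not the COVASTOL data set):
a = rng.exponential(scale=0.01, size=n)             # crack depth [m]
k_ic = rng.normal(loc=150.0, scale=20.0, size=n)    # fracture toughness [MPa*sqrt(m)]
sigma = rng.normal(loc=250.0, scale=25.0, size=n)   # applied stress [MPa]
Y = 1.12                                            # geometry factor for a surface crack

# Linear-elastic fracture mechanics failure criterion: K_I > K_IC
k_i = Y * sigma * np.sqrt(np.pi * a)
p_fail = np.mean(k_i > k_ic)
print(f"estimated rupture probability for these assumed inputs ~ {p_fail:.2e}")
```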

  19. APPROXIMATION OF PROBABILITY DISTRIBUTIONS IN QUEUEING MODELS

    Directory of Open Access Journals (Sweden)

    T. I. Aliev

    2013-03-01

    Full Text Available For probability distributions whose coefficient of variation is not equal to unity, mathematical relations for approximating the distribution from its first two moments are derived using multi-exponential distributions. It is proposed to approximate distributions with a coefficient of variation less than unity by a hypoexponential distribution, which makes it possible to generate random variables whose coefficient of variation takes any value in the range (0, 1), as opposed to the Erlang distribution, which admits only discrete values of the coefficient of variation.
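
    The two-moment matching mentioned here can be shown for the simplest case: a two-phase hypoexponential (sum of two independent exponentials) reproduces a given mean and coefficient of variation for 1/sqrt(2) <= cv < 1, while smaller cv requires more phases (roughly k >= 1/cv^2), which is the multi-exponential generalization the record alludes to. The target mean and cv below are arbitrary.

```python
import numpy as np

def hypoexp2_rates(mean, cv):
    """Rates (lam1, lam2) of a two-phase hypoexponential (sum of two
    independent exponentials) matching a given mean and coefficient of
    variation. A two-phase fit exists only for 1/sqrt(2) <= cv < 1;
    smaller cv needs more phases."""
    if not (1 / np.sqrt(2) <= cv < 1):
        raise ValueError("two phases cover only 1/sqrt(2) <= cv < 1")
    var = (cv * mean) ** 2
    # The phase means x1, x2 satisfy x1 + x2 = mean and x1^2 + x2^2 = var,
    # i.e. they are the roots of x^2 - mean*x + (mean^2 - var)/2 = 0.
    disc = np.sqrt(mean**2 - 2 * (mean**2 - var))
    x1, x2 = (mean + disc) / 2, (mean - disc) / 2
    return 1 / x1, 1 / x2

# Fit a service-time distribution with mean 2.0 and cv 0.8, then verify
# the moments by simulation.
lam1, lam2 = hypoexp2_rates(2.0, 0.8)
rng = np.random.default_rng(5)
x = rng.exponential(1 / lam1, 1_000_000) + rng.exponential(1 / lam2, 1_000_000)
print(f"rates: {lam1:.3f}, {lam2:.3f}")
print(f"simulated mean {x.mean():.3f} (target 2.0), cv {x.std() / x.mean():.3f} (target 0.8)")
```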

  20. Stochastics introduction to probability and statistics

    CERN Document Server

    Georgii, Hans-Otto

    2012-01-01

    This second revised and extended edition presents the fundamental ideas and results of both probability theory and statistics, and comprises the material of a one-year course. It is addressed to students with an interest in the mathematical side of stochastics. Stochastic concepts, models and methods are motivated by examples and developed and analysed systematically. Some measure theory is included, but this is done at an elementary level that is in accordance with the introductory character of the book. A large number of problems offer applications and supplements to the text.