WorldWideScience

Sample records for htlv-i probable factor

  1. Induction of galectin-1 expression by HTLV-I Tax and its impact on HTLV-I infectivity

    Directory of Open Access Journals (Sweden)

    Sato Sachiko

    2008-11-01

Full Text Available. Abstract. Background: Cell-free Human T-cell Leukemia Virus type I (HTLV-I) virions are poorly infectious, and cell-to-cell contact is often required to achieve infection. Other factors might therefore contribute importantly to increasing infection by HTLV-I. Galectin-1 is a galactoside-binding lectin secreted by activated T lymphocytes. Several functions have been attributed to this protein, including its capacity to increase cell-to-cell adhesion. Based on previous studies, we postulated that this protein could also accentuate HTLV-I infection. Results: Herein, we demonstrate that galectin-1 expression and release are higher in HTLV-I-infected T cells than in uninfected T cells. Furthermore, galectin-1 expression was activated in various cell lines expressing the wild-type viral Tax protein, while this induction was minimal upon expression of the NF-κB activation-defective TaxM22. Cotransfection of these Tax expression vectors with galectin-1 promoter-driven luciferase constructs confirmed that Tax upregulated galectin-1 promoter activity. However, an NF-κB-independent mechanism was strongly favoured in this induction of galectin-1 expression, as no activation of the promoter was apparent in Jurkat cells treated with known NF-κB activators. Using HTLV-I envelope-pseudotyped HIV-1 virions, galectin-1 was shown to increase infectivity. In addition, a co-culture assay with HTLV-I-infected cells also indicated an increase in cell fusion upon addition of galectin-1. This effect was not mediated by factors present in the supernatant of the HTLV-I-infected cells. Conclusion: These data suggest that HTLV-I Tax increases galectin-1 expression and that this modulation could play an important role in HTLV-I infection by stabilizing both cell-to-cell and virus-cell interactions.

  2. Comparison of four HTLV-I and HTLV-I + II ELISAs

    NARCIS (Netherlands)

    Vrielink, H.; Reesink, H.; Habibuw, M.; Schuller, M.; van der Meer, C.; Lelie, P.

    1999-01-01

    BACKGROUND: Various countries require blood donor screening using assays applying specific HTLV-I and HTLV-II antigens. We evaluated the sensitivity and specificity of 4 anti-HTLV-I + II ELISAs (Abbott, Murex, Organon Teknika and Ortho). METHODS: Panel A consisted of HTLV-I-positive individuals (n =
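Record 2 evaluates assay sensitivity and specificity. As a reminder of how these two figures of merit are computed from results on confirmed panels, here is a minimal sketch; the counts below are invented for illustration and are not taken from the study:

```python
# Sensitivity/specificity of a screening ELISA against a confirmed reference panel.
# All counts are hypothetical, for illustration only.
def sensitivity(true_pos: int, false_neg: int) -> float:
    """Fraction of confirmed-positive samples the assay detects."""
    return true_pos / (true_pos + false_neg)

def specificity(true_neg: int, false_pos: int) -> float:
    """Fraction of confirmed-negative samples the assay correctly clears."""
    return true_neg / (true_neg + false_pos)

sens = sensitivity(true_pos=99, false_neg=1)    # 99 of 100 positives detected
spec = specificity(true_neg=995, false_pos=5)   # 995 of 1000 negatives cleared
print(f"sensitivity={sens:.1%}, specificity={spec:.1%}")
```

For donor screening, sensitivity is the critical figure (a missed positive enters the blood supply), while specificity governs how many donors are deferred unnecessarily.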

  3. Comparison of HTLV-I Proviral Load in Adult T Cell Leukemia/Lymphoma (ATL), HTLV-I-Associated Myelopathy (HAM-TSP) and Healthy Carriers.

    Science.gov (United States)

    Akbarin, Mohammad Mehdi; Rahimi, Hossein; Hassannia, Tahereh; Shoja Razavi, Ghazaleh; Sabet, Faezeh; Shirdel, Abbas

    2013-03-01

Human T-lymphotropic virus type I (HTLV-I) is a retrovirus that infects about 10-20 million people worldwide. Khorasan province in Iran is an endemic area. The majority of HTLV-I-infected individuals remain healthy carriers, but a small proportion of the infected population develops one of two progressive diseases: HAM/TSP or ATL. The proviral load could serve as a virological marker for disease monitoring; therefore, in the present study HTLV-I proviral load was evaluated in ATL and compared to HAM/TSP and healthy carriers. In this case series study, 47 HTLV-I-infected individuals, including 13 ATL, 23 HAM/TSP and 11 asymptomatic subjects, were studied. Peripheral blood mononuclear cells (PBMCs) were examined for the presence of HTLV-I DNA provirus by PCR using LTR and Tax fragments. In infected subjects, HTLV-I proviral load was then measured using a TaqMan real-time PCR method. The average age of patients was 52±8 years in ATL, 45.52±15.17 in HAM/TSP and 38.65±14.9 in carriers; these differences were not statistically significant. Analysis of the data showed a significant difference in mean WBC among the study groups (ATL vs HAM/TSP and carriers, P=0.0001). Moreover, mean HTLV-I proviral load was 11967.2 ± 5078, 409 ± 71.3 and 373.6 ± 143.3 in ATL, HAM/TSP and healthy carriers, respectively. The highest HTLV-I proviral load was measured in the ATL group, and it had a significant correlation with WBC count (R=0.495, P=0.001). The proviral load variation between study groups was highly significant (ATL vs carriers P=0.0001; ATL vs HAM/TSP P=0.0001; HAM/TSP vs carriers P<0.05). Conclusion: The present study demonstrated that HTLV-I proviral load was higher in the ATL group than in HAM/TSP patients and healthy carriers. HTLV-I proviral load is therefore a prognostic factor for the development of HTLV-I-associated diseases and can be used as a marker for monitoring the efficacy of a therapeutic regimen.
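TaqMan real-time PCR counts proviral copies, which are then normalized per input cell. A sketch of the normalization commonly used in the HTLV-I literature follows; the choice of reference gene (a diploid housekeeping gene such as albumin) and all numbers are illustrative assumptions, not details taken from this study:

```python
# Proviral load normalization commonly used with real-time PCR:
# copies of provirus per 1e4 PBMCs, counting input cells via a diploid
# housekeeping gene (e.g. albumin, two copies per cell).
def proviral_load_per_1e4_cells(htlv_copies: float, albumin_copies: float) -> float:
    cells = albumin_copies / 2.0          # two albumin alleles per diploid cell
    return htlv_copies / cells * 1e4

load = proviral_load_per_1e4_cells(htlv_copies=500.0, albumin_copies=20_000.0)
print(load)  # 500 copies in 10,000 cells -> 500.0 per 1e4 PBMCs
```

Expressing loads per fixed cell number is what makes values comparable across groups, as in the ATL vs HAM/TSP vs carrier comparison above.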

  4. HTLV-I/II and blood donors: determinants associated with seropositivity in a low risk population

    Directory of Open Access Journals (Sweden)

    Bernadette Catalan Soares

    2003-08-01

Full Text Available. OBJECTIVE: Blood donors in Brazil have been routinely screened for HTLV-I/II since 1993. A study was performed to estimate the prevalence of HTLV-I/II infection in a low-risk population and to better understand determinants associated with seropositivity. METHODS: HTLV-I/II seropositive (n=135), indeterminate (n=167) and seronegative blood donors (n=116) were enrolled in an open prevalence prospective cohort study. A cross-sectional epidemiological study of positive, indeterminate and seronegative HTLV-I/II subjects was conducted to assess behavioral and environmental risk factors for seropositivity. HTLV-I/II serological status was confirmed using enzyme-linked immunosorbent assay (EIA) and Western blot (WB). RESULTS: The three groups were not homogeneous. HTLV-I/II seropositivity was associated with past blood transfusion, years of schooling (a marker of socioeconomic status), and use of non-intravenous illegal drugs. CONCLUSIONS: The study results reinforce the importance of continuous monitoring and improvement of the blood donor selection process.

  5. HTLV-I/II prevalence in different geographic locations

    NARCIS (Netherlands)

    Vrielink, Hans; Reesink, Henk W.

    2004-01-01

    Human T-cell lymphotropic virus (HTLV) type I (HTLV-I) is the etiological agent of adult T-cell leukemia and HTLV-I-associated myelopathy/tropical spastic paraparesis (HAM/TSP). HTLV-II is a closely related virus, and this infection is not clearly associated with clinical disease, although

  6. Regulatory elements involved in tax-mediated transactivation of the HTLV-I LTR.

    Science.gov (United States)

    Seeler, J S; Muchardt, C; Podar, M; Gaynor, R B

    1993-10-01

HTLV-I is the etiologic agent of adult T-cell leukemia. In this study, we investigated the regulatory elements and cellular transcription factors which function in modulating HTLV-I gene expression in response to the viral transactivator protein, tax. Transfection experiments in Jurkat cells with a variety of site-directed mutants of the HTLV-I LTR indicated that each of the three motifs A, B, and C within the 21-bp repeats, the binding sites for the Ets family of proteins, and the TATA box all influenced the degree of tax-mediated activation. Tax is also able to activate gene expression from other viral and cellular promoters. Tax activation of the IL-2 receptor and the HIV-1 LTR is mediated through NF-kappa B motifs. Interestingly, sequences in the 21-bp repeat B and C motifs contain significant homology with NF-kappa B regulatory elements. We demonstrated that an NF-kappa B binding protein, PRDII-BF1, but not the rel protein, bound to the B and C motifs in the 21-bp repeat. PRDII-BF1 was also able to stimulate activation of HTLV-I gene expression by tax. The role of the Ets proteins in modulating tax activation was also studied. Ets 1 but not Ets 2 was capable of increasing the degree of tax activation of the HTLV-I LTR. These results suggest that tax activates gene expression by either direct or indirect interaction with several cellular transcription factors that bind to the HTLV-I LTR.

  7. Southernmost carriers of HTLV-I/II in the world.

    Science.gov (United States)

    Cartier, L; Araya, F; Castillo, J L; Zaninovic, V; Hayami, M; Miura, T; Imai, J; Sonoda, S; Shiraki, H; Miyamoto, K

    1993-01-01

To clarify the true distribution of HTLV-I and -II carriers among indigenous people in Central and South America, blood samples collected from indigenous people in isolated regions of southern Chile were examined. Among 199 inhabitants from Chiloe Island and the town of Pitrufquen, three cases (1.5%) showed positive anti-HTLV-I antibodies. Two of the three (an 82-year-old male and a 58-year-old female) reacted to HTLV-II-specific Gag and/or Env proteins but not to HTLV-I-specific ones. The latter case was confirmed as an HTLV-II carrier by a polymerase chain reaction test.

  8. Evaluation of a new HTLV-I/II polymerase chain reaction

    NARCIS (Netherlands)

    Vrielink, H.; Zaaijer, H. L.; Cuypers, H. T.; van der Poel, C. L.; Woerdeman, M.; Lelie, P. N.; Winkel, C.; Reesink, H. W.

    1997-01-01

    AIM: Evaluation of a qualitative HTLV-I/II DNA polymerase chain reaction (PCR) test for the detection of HTLV-I/II DNA (Roche Diagnostic Systems, Branchburg, N.J., USA) in various panels. METHODS: The panels consisted of fresh EDTA blood samples from blood donors who were anti-HTLV-I/II ELISA

  9. Virus-induced dysfunction of CD4+CD25+ T cells in patients with HTLV-I-associated neuroimmunological disease.

    Science.gov (United States)

    Yamano, Yoshihisa; Takenouchi, Norihiro; Li, Hong-Chuan; Tomaru, Utano; Yao, Karen; Grant, Christian W; Maric, Dragan A; Jacobson, Steven

    2005-05-01

CD4(+)CD25(+) Tregs are important in the maintenance of immunological self-tolerance and in the prevention of autoimmune diseases. As the CD4(+)CD25(+) T cell population in patients with human T cell lymphotropic virus type I-associated (HTLV-I-associated) myelopathy/tropical spastic paraparesis (HAM/TSP) has been shown to be a major reservoir for this virus, it was of interest to determine whether the frequency and function of CD4(+)CD25(+) Tregs in HAM/TSP patients might be affected. In these cells, both mRNA and protein expression of the forkhead transcription factor Foxp3, a specific marker of Tregs, were lower than in CD4(+)CD25(+) T cells from healthy individuals. The virus-encoded transactivating HTLV-I tax gene was demonstrated to have a direct inhibitory effect on Foxp3 expression and on the function of CD4(+)CD25(+) T cells. This is, to our knowledge, the first report demonstrating the effect of a specific viral gene product (HTLV-I Tax) on the expression of genes associated with Tregs (in particular, foxp3), resulting in inhibition of Treg function. These results suggest that direct human retroviral infection of CD4(+)CD25(+) T cells may be associated with the pathogenesis of HTLV-I-associated neurologic disease.

  10. Magnetic resonance imaging findings of HTLV-I-associated myelopathy

    Energy Technology Data Exchange (ETDEWEB)

    Furukawa, Yoshitaka; Une, Humiho; Osame, Mitsuhiro

    1989-02-01

Magnetic resonance imaging (MRI) of the brain was evaluated in 12 HAM (HTLV-I-associated myelopathy) patients (4 males and 8 females, mean age 54 yrs) and compared with 36 non-HAM controls (16 males and 20 females, mean age 52 yrs). MRI of the brain was performed using a 0.5 Tesla superconducting unit. Imaging in all patients was done with long spin-echo sequences (TR=2,000 msec, TE=100 msec), and 10-mm contiguous axial slices of the entire brain were obtained in all cases. MRI of the brain was abnormal in 10 of the 12 HAM patients (83%), while 18 of the 36 controls (50%) were abnormal. The abnormalities were high-intensity lesions on SE 2000/100 sequences (T2-weighted images) and consisted of small isolated hemisphere lesions in 9 patients, periventricular changes in 9 patients, bilateral thalamic lesions in 2 patients and pontine lesions in 3 patients. We found that age was a very important factor. In patients aged 59 yrs or below, 6 of 8 HAM patients (75%) had abnormalities in the periventricular area, versus 6 of 23 controls (23%); for isolated hemisphere lesions, 6 of 8 HAM patients (75%) had abnormalities, versus 3 of 23 controls (13%). On the other hand, in patients aged 60 yrs or over, 3 of 4 HAM patients (75%) had periventricular abnormalities, versus 10 of 13 controls (77%); for isolated hemisphere lesions, 3 of 4 HAM patients (75%) had abnormalities, versus 10 of 13 controls (77%). Our data suggest that HAM patients aged 59 years or below show a greater percentage of abnormalities than controls. (author)

  12. Serological Evidence of HTLV-I and HTLV-II Coinfections in HIV-1 Positive Patients in Belém, State of Pará, Brazil

    Directory of Open Access Journals (Sweden)

    Vallinoto ACR

    1998-01-01

Full Text Available. The occurrence of HTLV-I/II and HIV-1 coinfection has been shown to be frequent, probably as a consequence of their similar modes of transmission. This paper presents the prevalence of HTLV coinfection among HIV-1-infected and AIDS patients in Belém, State of Pará, Brazil. A group of 149 patients attending the AIDS Reference Unit of the State Department of Health was tested for the presence of antibodies to HTLV-I/II using an enzyme immunoassay, and positive reactions were confirmed with a Western blot that discriminates between HTLV-I and HTLV-II infections. Four patients (2.7%) were positive for HTLV-I, seven (4.7%) for HTLV-II and one (0.7%) showed an indeterminate pattern of reaction. The present results show for the first time in Belém not only the occurrence of HTLV-II/HIV-1 coinfection but also a higher prevalence of HTLV-II relative to HTLV-I. Furthermore, they also enlarge the geographical limits of the endemic area for HTLV-II in the Amazon region of Brazil.

  13. HTLV-I carrier with unusual brain MR imaging findings

    Energy Technology Data Exchange (ETDEWEB)

    Yata, Shinsaku; Ogawa, Toshihide; Sugihara, Shuji; Matsusue, Eiji; Fujii, Shinya; Kinoshita, Toshibumi [Tottori University, Department of Pathophysiological and Therapeutic Science, Yonago (Japan); Faculty of Medicine, Tottori University, Yonago (Japan)

    2004-09-01

    We describe unusual brain MR imaging findings in a patient who is an HTLV-I carrier without myelopathy. T2-weighted MR images showed hyperintense signal abnormalities in the pyramidal tract, superior and middle cerebellar peduncles, and decussation of the superior cerebellar peduncles, in addition to subcortical white matter involvement. Diffusion-weighted images also showed hyperintense signal abnormalities in the same regions by T2 shine-through effect. (orig.)

  14. HIV-1, HTLV-I and the interleukin-2 receptor: insights into transcriptional control.

    Science.gov (United States)

    Böhnlein, E; Lowenthal, J W; Wano, Y; Franza, B R; Ballard, D W; Greene, W C

    1989-01-01

    In this study, we present direct evidence for the binding of the inducible cellular protein, HIVEN86A, to a 12-bp element present in the IL-2R alpha promoter. This element shares significant sequence similarity with the NF-kappa B binding sites present in the HIV-1 and kappa immunoglobulin enhancers. Transient transfection studies indicate that this kappa B element is both necessary and sufficient to confer tax or mitogen inducibility to a heterologous promoter. As summarized schematically in Fig. 5, the findings suggest that the HIVEN86A protein may play a central role in the activation of cellular genes required for T-cell growth, specifically the IL-2R alpha gene. In addition, the induced HIVEN86A protein also binds to a similar sequence present in the HIV-1 LTR leading to enhanced viral gene expression and ultimately T-cell death. Thus, mitogen activation of the HIV-1 LTR appears to involve the same inducible transcription factor(s) that normally regulates IL-2R alpha gene expression and T-cell growth. These findings further underscore the importance of the state of T-cell activation in the regulation of HIV-1 replication. Our results also demonstrate that HIVEN86A is induced by the tax protein of HTLV-I. Thus, in HTLV-I infected cells, normally the tight control of the transient expression of the IL-2R alpha gene is lost. The constitutive high-level display of IL-2 receptors may play a role in leukemic transformation mediated by HTLV-I (ATL). Apparently by the same mechanism, the tax protein also activates the HIV-1 LTR through the induction of HIVEN86A.(ABSTRACT TRUNCATED AT 250 WORDS)

  15. Dual infections with HIV-1, HIV-2 and HTLV-I are more common in older women than in men in Guinea-Bissau

    DEFF Research Database (Denmark)

    Holmgren, B; da Silva, Z; Larsen, Olav Ditlevsen

    2003-01-01

    OBJECTIVES: To investigate the association between the three human retroviruses, HIV-1, HIV-2 and HTLV-I. DESIGN: Community-based follow-up studies of retrovirus infections in two cohorts. METHODS: A total of 2057 individuals aged 35 years and over were eligible for inclusion. Participants were...... interviewed and had a blood sample drawn. Samples were analysed for HIV-1, HIV-2 and HTLV infections. Uni- and multivariate analyses that included behavioural and socio-economic factors were performed using logistic regression and Poisson regression models. RESULTS: A total of 1686 individuals participated...... with a blood sample in the HIV prevalence analyses and 1581 individuals participated in the HTLV-I prevalence analyses. The overall prevalence was 2.1% for HIV-1, 13.5% for HIV-2 and 7.1% for HTLV-I. Comparing the

  16. HTLV-I associated myelopathy with multiple spotty areas in cerebral white matter and brain stem by MRI

    Energy Technology Data Exchange (ETDEWEB)

    Hara, Yasuo; Takahashi, Mitsuo; Yoshikawa, Hiroo; Yorifuji, Shirou; Tarui, Seiichiro

    1988-01-01

A 48-year-old woman was admitted with complaints of urinary incontinence and gait disturbance, both of which had progressed slowly without any sign of remission. Family history was not contributory. Neurologically, extreme spasticity was recognized in the lower limbs. Babinski sign was positive bilaterally. Flower-like atypical lymphocytes were seen in the blood. Positive anti-HTLV-I antibody was confirmed in serum and spinal fluid by Western blot. She was diagnosed as having HTLV-I-associated myelopathy (HAM). CT revealed calcification in the bilateral globus pallidus, and MRI revealed multiple spotty areas in the cerebral white matter and brain stem, but no spinal cord lesion was detectable. Electrophysiologically, brain stem auditory evoked potentials (BAEP) suggested the presence of bilateral brain stem lesions. Neither median nor posterior tibial nerve somatosensory evoked potentials could be evoked, a finding suggesting the existence of a spinal cord lesion. In this case, the lesions were not confined to the spinal cord but were also observed in the brain stem and cerebral white matter. Such distinct lesions in the cerebral white matter and brain stem have not previously been reported in patients with HAM. It is suggested that HTLV-I is probably associated with the cerebral white matter and brain stem lesions as well.

  17. Frequency of the CCRD32 allele in Brazilians: a study in colorectal cancer and in HTLV-I infection

    Directory of Open Access Journals (Sweden)

    Pereira Rinaldo W.

    2000-01-01

Full Text Available. The identification of a 32-bp deletion in the CC-chemokine receptor-5 gene (the CCR5delta32 allele) that renders homozygous individuals highly resistant to HIV infection has prompted worldwide investigations of the frequency of the CCR5delta32 allele in regional populations. It is important to ascertain whether CCR5delta32 is a factor to be considered in the overall epidemiology of HIV in individual populations. With this in mind we determined the CCR5delta32 allele frequency in a large sample (907 individuals) of the southeastern Brazilian urban population, stratified as follows: 322 healthy unrelated individuals, 354 unselected colorectal cancer patients, and 229 blood donors. The three groups displayed essentially identical allelic frequencies of CCR5delta32, and pairwise comparisons did not show significant differences. Thus, our results can be pooled to provide a reliable estimate of the CCR5delta32 allele frequency in southeastern Brazil of 0.053 ± 0.005. The blood donors comprised 50 HTLV-I serologically negative individuals, 115 non-symptomatic individuals HTLV-I-positive by ELISA but with indeterminate Western blot results, 49 healthy blood donors HTLV-I-positive both by ELISA and Western blot, and 15 patients with clinical spinal cord disease (HAM). A suggestive trend was observed, with the CCR5delta32 frequencies decreasing progressively across these four categories. However, when we applied Fisher's exact test no significant differences emerged. We believe that further studies in larger cohorts should be performed to ascertain whether the CCR5delta32 allele influences the chance of becoming infected or of developing clinical symptoms of HTLV-I infection.
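The pooled estimate of 0.053 ± 0.005 from 907 individuals is consistent with a plain binomial standard error computed over the 2n sampled chromosomes. A minimal sketch (the binomial SE formula is a standard textbook choice here, not necessarily the authors' stated method):

```python
import math

def allele_freq_se(p_hat: float, n_individuals: int) -> float:
    """Standard error of an allele frequency estimated from 2n chromosomes
    (each diploid individual contributes two chromosomes)."""
    chromosomes = 2 * n_individuals
    return math.sqrt(p_hat * (1.0 - p_hat) / chromosomes)

p_hat = 0.053               # pooled CCR5delta32 frequency reported in the abstract
se = allele_freq_se(p_hat, 907)
print(f"{p_hat:.3f} +/- {se:.3f}")   # -> 0.053 +/- 0.005
```

The agreement with the reported ± 0.005 suggests the uncertainty quoted in the abstract is indeed a one-standard-error band.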

  18. Laboratory test differences associated with HTLV-I and HTLV-II infection

    NARCIS (Netherlands)

    Murphy, EL; Glynn, S; Watanabe, K; Fridey, J; Sacher, R; Schreiber, G; Luban, N

    1998-01-01

    Reports of laboratory abnormalities associated with HTLV-I and HTLV-II infection are inconsistent. We assessed complete blood counts and selected serum chemistry measures at enrollment in a cohort of 153 HTLV-I-seropositive, 386 HTLV-II-seropositive, and 795 HTLV-seronegative blood donors. Linear

  19. Defective human T-cell lymphotropic virus type I (HTLV-I) provirus in seronegative tropical spastic paraparesis/HTLV-I-associated myelopathy (TSP/HAM) patients.

    Science.gov (United States)

    Ramirez, E; Fernandez, J; Cartier, L; Villota, C; Rios, M

    2003-02-01

Infection with human T-cell lymphotropic virus type I (HTLV-I) has been associated with the development of tropical spastic paraparesis/HTLV-I-associated myelopathy (TSP/HAM). We studied the presence of HTLV-I provirus in peripheral blood mononuclear cells (PBMC) from 72 Chilean patients with progressive spastic paraparesis by polymerase chain reaction: 32 seropositive and 40 seronegative cases. We amplified different genomic regions of HTLV-I using primers for the 5' LTR, tax, env/tax, pX, pol and env genes. These genes were detected in all seropositive patients. The seronegative patients were negative with the 5' LTR, pol, env, and pX primers. However, amplified product of the tax and env/tax genes was detected in 16 and four seronegative patients, respectively; three of them were positive in both genetic regions. The results of this study show that the complete HTLV-I provirus is found in 100% of seropositive cases. In seronegative cases, which were clinically very similar to the seropositive ones, only the tax gene was found, in 42.5% (17/40) of patients. These results suggest the presence of a defective HTLV-I provirus in some seronegative patients with progressive spastic paraparesis, and suggest a pathogenic role of this truncated provirus in a group of TSP/HAM.

  20. A new sensitive and quantitative HTLV-I-mediated cell fusion assay in T cells

    International Nuclear Information System (INIS)

    Pare, Marie-Eve; Gauthier, Sonia; Landry, Sebastien; Sun Jiangfeng; Legault, Eric; Leclerc, Denis; Tanaka, Yuetsu; Marriott, Susan J.; Tremblay, Michel J.; Barbeau, Benoit

    2005-01-01

Similar to several other viruses, human T cell leukemia virus type I (HTLV-I) induces the formation of multinucleated giant cells (also known as syncytia) when amplified in tissue culture. These syncytia result from the fusion of infected cells with uninfected cells. Due to the intrinsic difficulty of infecting cells with cell-free HTLV-I virions, syncytium formation has become an important tool in the study of HTLV-I infection and transmission. Since most HTLV-I-based cell fusion assays rely on the use of non-T cells, the aim of this study was to optimize a new HTLV-I-induced cell fusion assay in which HTLV-I-infected T cell lines are co-cultured with T cells that have been transfected with an HTLV-I long terminal repeat (LTR) luciferase reporter construct. We demonstrate that co-culture of various HTLV-I-infected T cells with different transfected T cell lines resulted in induction of luciferase activity. Cell-to-cell contact and expression of the viral gp46 envelope protein were crucial for this induction, while other cell surface proteins (including HSC70) did not have a significant effect. This quantitative assay was shown to be very sensitive. In this assay, the cell fusion-mediated activation of NF-κB and the HTLV-I LTR occurred through previously described Tax-dependent signaling pathways. This assay also showed that cell fusion could activate Tax-inducible cellular promoters. These results thus demonstrate that this new quantitative HTLV-I-dependent cell fusion assay is versatile, highly sensitive, and can provide an important tool to investigate cellular promoter activation and intrinsic signaling cascades that modulate cellular gene expression.

  1. Involvement of HTLV-I Tax and CREB in aneuploidy: a bioinformatics approach

    Directory of Open Access Journals (Sweden)

    Pumfery Anne

    2006-07-01

Full Text Available. Abstract. Background: Adult T-cell leukemia (ATL) is a complex and multifaceted disease associated with human T-cell leukemia virus type 1 (HTLV-I) infection. Tax, the viral oncoprotein, is considered a major contributor to cell cycle deregulation in HTLV-I-transformed cells, either by directly disrupting cellular factors (protein-protein interactions) or by altering their transcription profile. Tax transactivates these cellular promoters by interacting with transcription factors such as CREB/ATF, NF-κB, and SRF. Therefore, by examining which factors upregulate a particular set of promoters we may begin to understand how Tax orchestrates leukemia development. Results: We observed that CTLL cells stably expressing wild-type Tax (CTLL/WT) exhibited aneuploidy as compared to a Tax clone deficient for CREB transactivation (CTLL/703). To better understand the contribution of Tax transactivation through the CREB/ATF pathway to the aneuploid phenotype, we performed microarray analysis comparing CTLL/WT to CTLL/703 cells. Promoter analysis of the altered genes revealed that a subset contain CREB/ATF consensus sequences. While these genes had diverse functions, smaller subsets were found to be involved in G2/M phase regulation, in particular kinetochore assembly. Furthermore, we confirmed the presence of CREB, Tax and RNA Polymerase II at the p97Vcp and Sgt1 promoters in vivo through chromatin immunoprecipitation in CTLL/WT cells. Conclusion: These results indicate that the development of aneuploidy in Tax-expressing cells may occur in response to an alteration in the transcription profile, in addition to direct protein interactions.

  2. Modelling the role of Tax expression in HTLV-I persistence in vivo.

    Science.gov (United States)

    Li, Michael Y; Lim, Aaron G

    2011-12-01

    Human T-lymphotropic virus type I (HTLV-I) is a persistent human retrovirus characterized by life-long infection and risk of developing HAM/TSP, a progressive neurological and inflammatory disease, and adult T-cell leukemia (ATL). Chronically infected individuals often harbor high proviral loads despite maintaining a persistently activated immune response. Based on a new hypothesis for the persistence of HTLV-I infection, a three-dimensional compartmental model is constructed that describes the dynamic interactions among latently infected target cells, target-cell activation, and immune responses to HTLV-I, with an emphasis on understanding the role of Tax expression in the persistence of HTLV-I.
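The abstract names the paper's three compartments (latently infected target cells, Tax-expressing cells produced by target-cell activation, and the immune response) but not its equations. The toy sketch below, integrated by forward Euler, only illustrates the general shape such a model takes; every coupling and parameter value is an assumption invented for illustration, not the authors' system:

```python
# Toy 3-compartment HTLV-I persistence model (illustrative only):
#   x = latently infected target cells
#   y = Tax-expressing (activated) infected cells
#   z = HTLV-I-specific immune effectors (e.g. CTLs)
lam, d1 = 10.0, 0.01   # supply and death rate of latently infected cells
a = 0.05               # activation rate (Tax expression)
d2, k = 0.2, 0.02      # death of Tax+ cells; immune killing of Tax+ cells
c, d3 = 0.01, 0.1      # immune stimulation by Tax+ cells; effector decay

def step(x, y, z, dt=0.01):
    """One forward-Euler step of the toy model."""
    dx = lam - (d1 + a) * x
    dy = a * x - d2 * y - k * y * z
    dz = c * y - d3 * z
    return x + dt * dx, y + dt * dy, z + dt * dz

x, y, z = 100.0, 1.0, 1.0
for _ in range(200_000):   # integrate to t = 2000, long enough to settle
    x, y, z = step(x, y, z)
print(f"steady state: x={x:.1f}, y={y:.1f}, z={z:.2f}")
```

In a model of this shape, raising the activation rate `a` (more Tax expression) enlarges the Tax-expressing pool but also drives the immune response that removes it, which is the kind of trade-off underlying the persistence question the abstract describes.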

  3. Nucleotide sequence analysis of HTLV-I isolated from cerebrospinal fluid of a patient with TSP/HAM: comparison to other HTLV-I isolates.

    Science.gov (United States)

    Mukhopadhyaya, R; Sadaie, M R

    1993-02-01

Human T-cell leukemia virus type I (HTLV-I) has been associated with adult T-cell leukemia/lymphoma and the chronic neurologic disorder tropical spastic paraparesis/HTLV-I-associated myelopathy (TSP/HAM). To study the genetic structure of the virus associated with TSP/HAM, we obtained and sequenced a partial genomic clone from an HTLV-I-positive cell line established from the cerebrospinal fluid (CSF) of a Jamaican patient with TSP/HAM. This clone consisted of a 4.3-kb viral sequence containing the 5' long terminal repeat (LTR), gag, and the N-terminal portion of the pol gene, with an overall 1.3% sequence variation, resulting mostly from nucleotide substitutions, as compared to the prototype HTLV-I ATK-1. The gag and pol regions showed only 1.4% and 1.2% nucleotide variation, respectively. However, the U3 region of the LTR showed the highest sequence variation (3.6%), where several changes appear to be common among certain TSP/HAM isolates. Several of these changes reside within the 21-bp boundaries and the Tax-responsive element. It will be important to determine whether the observed changes are sufficient to cause neurologic disorders, as in the murine leukemia virus system, or simply reflect the divergent pool of HTLV-I from different geographic locations. At this time, we cannot rule out the possibility that the observed changes have either direct or indirect significance for HTLV-I pathogenesis in TSP/HAM.

  4. Seroprevalence of HIV, HTLV-I/II and other perinatally-transmitted pathogens in Salvador, Bahia Soroprevalência do HIV, HTLV-I/II e outros patógenos de transmissão perinatal em Salvador, Bahia

    Directory of Open Access Journals (Sweden)

    Jairo Ivo dos Santos

    1995-08-01

Full Text Available. Generation of epidemiological data on perinatally-transmitted infections is a fundamental tool for the formulation of health policies. In Brazil, this information is scarce, particularly in the Northeast, the poorest region of the country. In order to gain some insight into the problem, we studied the seroprevalence of some perinatally-transmitted infections in 1,024 low-income pregnant women in Salvador, Bahia. The prevalences were as follows: HIV-1 (0.10%), HTLV-I/II (0.88%), T.cruzi (2.34%), T.pallidum (3.91%), rubella virus (77.44%), T.gondii IgM (2.87%) and IgG (69.34%), HBsAg (0.6%) and anti-HBs (7.62%). Rubella virus and T.gondii IgG antibodies were present in more than two thirds of the pregnant women, but antibodies against the other pathogens were present at much lower rates. We found that the prevalence of HTLV-I/II was nine times higher than that found for HIV-1. In some cases, such as T.cruzi and hepatitis B infection, there was a decrease in prevalence over the years. On the other hand, there was an increase in the seroprevalence of T.gondii infection. Our data strongly recommend mandatory screening tests for HTLV-I/II, T.gondii (IgM), T.pallidum and rubella virus in the prenatal routine for pregnant women in Salvador. Screening for T.cruzi, hepatitis and HIV-1 is recommended whenever risk factors associated with these infections are suspected. However, in areas with high prevalence of these infections, mandatory screening in prenatal care should be considered.

  5. Quantification of Human T-lymphotropic virus type I (HTLV-I) provirus load in a rural West African population: no enhancement of human immunodeficiency virus type 2 pathogenesis, but HTLV-I provirus load relates to mortality

    DEFF Research Database (Denmark)

    Ariyoshi, K; Berry, N; Cham, F

    2003-01-01

Human T-lymphotropic virus type I (HTLV-I) provirus load was examined in a cohort of a population in Guinea-Bissau among whom human immunodeficiency virus (HIV) type 2 is endemic. The geometric mean HIV-2 RNA load among HTLV-I-coinfected subjects was significantly lower than that in subjects infected with HIV-2 alone (212 vs. 724 copies/mL; P = .02). Adjusted for age, sex, and HIV status, the risk of death increased with HTLV-I provirus load; the mortality hazard ratio was 1.59 for each log10 increase in HTLV-I provirus copies (P = .038). There is no enhancing effect of HTLV-I coinfection on HIV-2 disease, but high HTLV-I provirus loads may contribute to mortality.
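A per-log10 hazard ratio like the 1.59 reported above multiplies on the log scale, so it can be translated into a relative hazard between any two provirus loads. A minimal sketch of that arithmetic (the function name and the example loads are illustrative, not from the study):

```python
import math

def relative_hazard(hr_per_log10: float, load_low: float, load_high: float) -> float:
    """Relative mortality hazard implied by a per-log10 hazard ratio,
    comparing two HTLV-I provirus loads (same copy units for both)."""
    delta_log10 = math.log10(load_high) - math.log10(load_low)
    return hr_per_log10 ** delta_log10

# Reported estimate: HR = 1.59 per log10 increase in provirus copies.
# A subject with a 100-fold (2 log10) higher load has 1.59**2 ~ 2.5x the hazard.
print(relative_hazard(1.59, 10, 1000))
```

The same function recovers the reported ratio directly for a single-log10 difference, e.g. `relative_hazard(1.59, 100, 1000)`.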

  6. Evaluation of a combined lysate/recombinant antigen anti-HTLV-I/II ELISA in high and low endemic areas of HTLV-I/II infection

    NARCIS (Netherlands)

    Vrielink, H.; Sisay, Y.; Reesink, H. W.; Woerdeman, M.; Winkel, C.; de Leeuw, S. J.; Lelie, P. N.; van der Poel, C. L.

    1995-01-01

    The Wellcozyme HTLV-I/II ELISA (Murex Diagnostics) was evaluated in 7800 samples of various serum panels. Repeat activity was found by Wellcozyme in (A) 1/2181 (0.05%) Dutch blood donors, (B) 44/3036 (1.4%) Curaçao (Caribbean area) blood donors, (C) 46/2533 (1.8%) individuals of different Ethiopian

  7. Population-based Seroprevalence of HTLV-I Infection in Golestan Province, South East of Caspian Sea, Iran.

    Science.gov (United States)

    Kalavi, Khodaberdi; Moradi, Abdolvahab; Tabarraei, Alijan

    2013-03-01

Human T-cell lymphotropic virus type 1 (HTLV-I) is an oncornavirus that causes adult T-cell leukemia (ATL) and HTLV-I-associated myelopathy/tropical spastic paraparesis (HAM/TSP). Golestan Province lies to the northwest of Khorasan Province, a known endemic area for HTLV-I in Iran. This study aimed to evaluate the seroprevalence of HTLV-I in Golestan Province. In this cross-sectional descriptive study, conducted in 2007, blood samples were collected from 2,034 healthy people residing in different parts of Golestan Province. Sera were assessed for HTLV-I/II-specific antibodies by ELISA, and reactive samples were confirmed by Western blot. Demographic and serologic data were entered into SPSS version 11.5 and statistical analysis was performed. An overall HTLV-I/II prevalence of 0.7% (15 cases) was observed by ELISA; six of the 15 were confirmed as HTLV-I by Western blot. Regional variation in the prevalence of HTLV-I was observed: 0%, 0%, 0.1%, 1.9%, 0.3%, 0%, and 2.6% tested HTLV-I-positive from west to east across the regions of Golestan Province, respectively. Seropositivity increased with age. No association between HTLV-I infection and sex was detected. The highest HTLV-I seroprevalence was found in the east of the province, neighboring Khorasan, the only confirmed endemic area in Iran; the eastern part of the province therefore appears to be endemic for HTLV-I. Further comprehensive epidemiological and molecular studies are recommended.

  8. Neutralization epitopes on HIV pseudotyped with HTLV-I: Conservation of carbohydrate Epitopes

    DEFF Research Database (Denmark)

    Sørensen, A M; Nielsen, C; Arendrup, M

    1994-01-01

One mechanism for expanding the cellular tropism of human immunodeficiency virus (HIV) in vitro is through formation of phenotypically mixed particles (pseudotypes) with human T lymphotropic virus type I (HTLV-I). In this study we found that pseudotypes allow penetration of HIV particles into CD4-negative cells, previously nonsusceptible to HIV infection. The infection of CD4-negative cells with pseudotypes could be blocked with anti-HTLV-I serum but failed to be significantly inhibited with anti-HIV serum or a V3-neutralizing anti-gp120 monoclonal antibody. This may represent a possibility … by cell-free pseudotypes in CD4-negative cells. We suggest that although viral cofactors might expand the tropism of HIV in vivo, HIV and HTLV-I seem to induce common carbohydrate neutralization epitopes.

  10. Quantification of Human T-lymphotropic virus type I (HTLV-I) provirus load in a rural West African population: no enhancement of human immunodeficiency virus type 2 pathogenesis, but HTLV-I provirus load relates to mortality

    NARCIS (Netherlands)

    Ariyoshi, Koya; Berry, Neil; Cham, Fatim; Jaffar, Shabbar; Schim van der Loeff, Maarten; Jobe, Ousman; N'Gom, Pa Tamba; Larsen, Olav; Andersson, Sören; Aaby, Peter; Whittle, Hilton

    2003-01-01

    Human T-lymphotropic virus type I (HTLV-I) provirus load was examined in a cohort of a population in Guinea-Bissau among whom human immunodeficiency virus (HIV) type 2 is endemic. Geometric mean of HIV-2 RNA load among HTLV-I-coinfected subjects was significantly lower than that in subjects infected

  11. Erectile insufficiency as first symptom of HTLV-I/II associated myelopathy: case report Insuficiência erétil como primeiro sintoma da mielopatia associada ao HTLV I/II: relato de caso

    Directory of Open Access Journals (Sweden)

    JOSÉ TEOTONIO OLIVEIRA

    1998-03-01

A case of HTLV-I/II myelopathy in which the initial complaint was erectile insufficiency (EI) is reported. The only abnormalities found on neurological examination were discrete weakness of the psoas and increased knee-jerk reflexes. Diagnosis was made by demonstrating anti-HTLV-I/II antibodies in serum and cerebrospinal fluid (by ELISA and Western blot), with confirmation by the polymerase chain reaction (PCR). EI can thus be the first symptom of HTLV-I/II infection, and patients with EI of unknown etiology should be tested for HTLV-I/II in endemic areas.

  12. Spread of human T-cell leukemia virus (HTLV-I) in the Dutch homosexual community

    NARCIS (Netherlands)

    Goudsmit, J.; de Wolf, F.; van de Wiel, B.; Smit, L.; Bakker, M.; Albrecht-van Lent, N.; Coutinho, R. A.

    1987-01-01

Sequential sera of 697 homosexual men, participating in a prospective study (1984-1986) of the risk of acquiring human immunodeficiency virus (HIV) or AIDS, were tested for antibodies to human T-cell leukaemia virus (HTLV-I) by particle agglutination and immunoblotting. No intravenous drug users were

  13. A transgenic model of transactivation by the Tax protein of HTLV-I.

    Science.gov (United States)

    Bieberich, C J; King, C M; Tinkle, B T; Jay, G

    1993-09-01

    The human T-lymphotropic virus type I (HTLV-I) Tax protein is a transcriptional regulatory protein that has been suggested to play a causal role in the development of several HTLV-I-associated diseases. Tax regulates expression of its own LTR and of certain cellular promoters perhaps by usurping the function of the host transcriptional machinery. We have established a transgenic mouse model system to define the spectrum of tissues in vivo that are capable of supporting Tax-mediated transcriptional transactivation. Transgenic mice carrying the HTLV-I LTR driving expression of the Escherichia coli beta-galactosidase (beta gal) gene were generated, and this LTR-beta gal gene was transcriptionally inactive in all tissues. When LTR-beta gal mice were mated to transgenic mice carrying the same LTR driving expression of the HTLV-I tax gene, mice that carried both transgenes showed restricted expression of the beta gal reporter gene in several tissues including muscle, bone, salivary glands, skin, and nerve. In addition, a dramatic increase in the number of beta gal-expressing cells was seen in response to wounding. These observations provide direct evidence for viral transactivation in vivo, delimit the tissues capable of supporting that transactivation, and provide a model system to study the mechanism of gene regulation by Tax.

  14. Pulmonary function testing in HTLV-I and HTLV-II infected humans: a cohort study

    Directory of Open Access Journals (Sweden)

    Garratty George

    2003-07-01

Background: HTLV-I infection has been linked to lung pathology, and HTLV-II has been associated with an increased incidence of pneumonia and acute bronchitis. However, it is unknown whether HTLV-I or -II infection alters pulmonary function. Methods: We performed pulmonary function testing on HTLV-I, HTLV-II and HTLV-seronegative subjects from the HTLV Outcomes Study (HOST), including vital capacity (VC), forced expiratory volume in one second (FEV1), and diffusing capacity of the lung for carbon monoxide (DLCO) corrected for hemoglobin and lung volume. Multivariable analysis adjusted for differences in age, gender, race/ethnicity, height and smoking history. Results: Mean (standard deviation) pulmonary function values among the 257 subjects were as follows: FVC = 3.74 (0.89) L, FEV1 = 2.93 (0.67) L, DLCOcorr = 23.82 (5.89) ml/min/mmHg, alveolar ventilation (VA) = 5.25 (1.20) L and DLCOcorr/VA = 4.54 (0.87) ml/min/mmHg/L. There were no differences in FVC, FEV1 and DLCOcorr/VA by HTLV status. For DLCOcorr, HTLV-I and HTLV-II subjects had slightly lower values than seronegatives, but neither difference was statistically significant after adjustment for confounding. Conclusions: There was no difference in measured pulmonary function and diffusing capacity in generally healthy HTLV-I and HTLV-II subjects compared to seronegatives. These results suggest that previously described HTLV-associated abnormalities in bronchoalveolar cells and fluid may not affect pulmonary function.

  15. Reexamination of human T cell lymphotropic virus (HTLV-I/II) prevalence.

    Science.gov (United States)

    Zucker-Franklin, D; Pancake, B A; Marmor, M; Legler, P M

    1997-06-10

    In the United States, blood donors are being screened for infection with human T cell lymphotropic viruses I and II (HTLV-I/II) by serologic means, which detect antibodies to the structural proteins of these viruses. Because patients with mycosis fungoides (MF) usually do not have such antibodies even though their cells harbor HTLV-I Tax and/or pol proviral sequences, it was questioned whether the prevalence of HTLV infection among healthy blood donors may also be underestimated by current means of testing. To examine this possibility, a study on specimens of relatives of mycosis fungoides patients (MFR) was begun. In addition, to collect data more expeditiously, a cohort of former injection drug users (IDUs) was tested by routine serologic methods, as well as by PCR/Southern blot analysis for Tax, pol, and gag proviral sequences and Western blot analysis for antibodies to the Tax gene product. To date, 6/8 MFRs and 42/81 (51.8%) of HIV-negative IDUs proved to be positive for HTLV, whereas routine serology identified none of the MFR and only 18/81 (22.2%) of the IDUs. Among the latter test subjects, the incidence of HTLV-I also proved to be 10 times higher than expected. Therefore, it is likely that among healthy blood donors infection with HTLV-I/II is more prevalent than is currently assumed. Since Tax is the transforming sequence of HTLV-I/II, testing for Tax sequences and antibodies to its gene product may be desirable in blood transfusion and tissue donor facilities.

  17. HTLV-I in the general population of Salvador, Brazil: a city with African ethnic and sociodemographic characteristics.

    Science.gov (United States)

    Dourado, Inês; Alcantara, Luiz C J; Barreto, Maurício L; da Gloria Teixeira, Maria; Galvão-Castro, Bernardo

    2003-12-15

The city of Salvador has the highest prevalence of HTLV-I among blood donors in Brazil. To study the prevalence of HTLV-I in the general population of Salvador, 30 "sentinel surveillance areas" were selected for the investigation of various infectious diseases, and 1,385 individuals within these areas were surveyed according to a simple random sample procedure. ELISA was used to screen plasma samples for antibodies to HTLV-I, and positive samples were tested by a confirmatory assay (Western blotting). The overall prevalence of HTLV-I was 1.76% (23/1385). Infection rates were 1.2% for males and 2.0% for females. Specific prevalence demonstrated an increasing linear trend with age; no one younger than 13 years of age was infected. Multivariate analysis estimated adjusted odds ratios for the association of HTLV-I with age of 9.7 (3.3; 30.4) for females and 12.3 (1.47; 103.1) for males. Lower education and income might be associated with HTLV-I infection in females. Phylogenetic analysis of the long terminal repeat fragments showed that most of the samples belonged to the Latin American cluster of the Transcontinental subgroup (Cosmopolitan subtype). For the entire city of Salvador, it is estimated that approximately 40,000 individuals are infected with HTLV-I. Our results suggest multiple post-Columbian introductions of African HTLV-Ia strains in Salvador.

  18. High incidence of antibodies to HTLV-I tax in blood relatives of adult T cell leukemia patients.

    Science.gov (United States)

    Okayama, A; Chen, Y M; Tachibana, N; Shioiri, S; Lee, T H; Tsuda, K; Essex, M

    1991-01-01

Adult T cell leukemia (ATL) is caused by the human T cell leukemia virus type I (HTLV-I). Although the mechanisms of the leukemogenic process are unknown, the tax gene may have a role in this process. Because clustering occurs with HTLV-I and ATL, members of ATL families were examined for antibodies to the tax protein and compared with matched HTLV-I-positive blood donors. To investigate the antibody response to this protein, a plasmid, pBHX-4, was constructed to express a recombinant tax protein (r-tax). For ATL patients and their HTLV-I antibody-positive blood relatives, the rate of seroreactivity with the r-tax protein was 67.3% (35/52), compared with 51.6% (97/188) for HTLV-I antibody-positive control blood donors (P less than .05). The difference between direct offspring of ATL patients and matched HTLV-I blood donors was even greater (84.2% [16/19] vs. 44.2% [42/95]; P less than .005). Thus, tax antibody positivity in direct offspring of ATL patients may reflect differences in the time or route of HTLV-I infection. Alternatively, it might reflect genetic differences in host susceptibility or virus strain.

  19. Evidence of preferential female prevalence of HTLV-I associated tropical spastic paraparesis in Bahia-Brazil

    Directory of Open Access Journals (Sweden)

    O. A. Moreno-Carvalho

    1992-06-01

In order to evaluate the prevalence of HTLV-I infection and its association with tropical spastic paraparesis (TSP) in Bahia, a Northeastern state of Brazil, CSF and sera from TSP patients and CSF and/or sera from selected groups of individuals were studied. The results indicate a higher prevalence of HTLV-I infection in women than in men with TSP, and among individuals in HIV risk groups. Some alterations in routine CSF analysis can suggest HTLV-I infection in TSP patients.

  20. Seroprevalencia de HTLV-I/II en hombres gays y trabajadoras sexuales de la Isla de Margarita, Venezuela HTLV-I/II seroprevalence among gay men and female sex workers from Margarita Island, Venezuela

    Directory of Open Access Journals (Sweden)

    E. Castro de Batänjer

    1998-08-01

In view of the substantial HIV-1 seroprevalence observed on Margarita Island, we carried out this study to establish the HTLV-I/II seroprevalence in groups epidemiologically important for sexual transmission. The survey included 141 female sex workers and 40 gay men between 1994 and 1997. We found HTLV-I infection in one man. This is the first known report describing the epidemiology of HTLV-I/II infection on Margarita Island.

  1. HTLV-I infection in the South West Indian Ocean islands, particularly in La Réunion and the Seychelles.

    Science.gov (United States)

    Aubry, P; Bovet, P; Vitrac, D; Schooneman, F; Hollanda, J; Malvy, D; Gaüzère, B-A

    2013-10-01

Data on HTLV-I are scarce for the Southwest Indian Ocean islands, except for La Réunion and the Seychelles. Two cases of HTLV-I have been confirmed by Western blot in La Réunion, among blood donors. In the Seychelles (87,400 inhabitants in 2012), where blood donors and some other groups are screened, HTLV-I was confirmed with a line immunoassay in 43 persons, and at least 10-20 patients are known to have tropical spastic paraparesis or adult T-cell lymphoma associated with HTLV-I. In the south-west Indian Ocean, another potentially important issue is co-infection of HTLV-I with the roundworm Strongyloides stercoralis, which is endemic in all countries of the region and can sometimes lead to severe symptomatic infestation.

  2. Atypical presentation of syphilis in an HTLV-I infected patient

    Directory of Open Access Journals (Sweden)

    Carnaúba Jr Dimas

    2003-01-01

We report the case of a 44-year-old female who presented with long-lasting, clinically atypical secondary syphilis ("malignant syphilis") in the right foot, which had started six months before medical evaluation. The patient had received a serological diagnosis of HTLV-I infection and syphilis two years before the onset of the skin lesions, following a blood donation. Because she believed she was allergic to penicillin, she initially received sulfamethoxazole + trimethoprim, without any improvement in the clinical picture. After failure of this first treatment regimen, she was given penicillin, which promoted complete healing of the lesion. We found evidence that infection by HTLV-I is capable of modifying the clinical course of secondary syphilis.

  3. The HTLV-I tax protein transcriptionally modulates OX40 antigen expression.

    Science.gov (United States)

    Pankow, R; Dürkop, H; Latza, U; Krause, H; Kunzendorf, U; Pohl, T; Bulfone-Paus, S

    2000-07-01

    OX40 is a member of the TNF receptor family, expressed on activated T cells. It is the only costimulatory T cell molecule known to be specifically up-regulated in human T cell leukemia virus type-I (HTLV-I)-producing cells. In a T cell line, OX40 surface expression was shown to be induced by HTLV-I Tax alone. To understand molecular mechanisms of OX40 gene regulation and modulation by HTLV-I Tax, we have cloned the human OX40 gene and analyzed its 5'-flanking region. By reporter gene analysis with progressive 5' deletions from nucleotides -1259 to -64, we have defined a 157-bp DNA fragment as a minimal promoter for constitutive expression. In addition, we show that in the OX40+ cell line, Co, Tax is able to further increase OX40 surface expression. Up-regulation of OX40 promoter activity by Tax requires two upstream NF-kappaB sites, which are not active in the constitutive OX40 expression. Their deletion abrogates Tax responsiveness in reporter gene analysis. The site-directed mutagenesis of each NF-kappaB site demonstrates that cooperative NF-kappaB binding is a prerequisite for Tax-directed activity as neither site alone is sufficient for a full Tax responsiveness of the OX40 promoter. Upon Tax expression, both sites bind p65 and c-Rel. These data provide new insight into the direct regulation of OX40 by Tax and add to our understanding of the possible role of the OX40/OX40 ligand system in the proliferation of HTLV-I+ T cells.

  4. Interest of LQAS method in a survey of HTLV-I infection in Benin (West Africa).

    Science.gov (United States)

    Houinato, Dismand; Preux, Pierre-Marie; Charriere, Bénédicte; Massit, Bruno; Avodé, Gilbert; Denis, François; Dumas, Michel; Boutros-Toni, Fernand; Salamon, Roger

    2002-02-01

HTLV-I is heterogeneously distributed in Sub-Saharan Africa. Traditional survey methods such as cluster sampling can provide information for a country or region of interest, but they cannot identify small areas with higher prevalences of infection to guide health policy planning. Such areas can be identified by the Lot Quality Assurance Sampling (LQAS) method, which is commonly used in industry to detect poor performance in assembly lines. The LQAS method was used in Atacora (Northern Benin) between March and May 1998 to identify areas with an HTLV-I seroprevalence higher than 4%. Sixty-five subjects were randomly selected in each of the 36 communes (lots) of this department. Lots were classified as unacceptable when the sample contained at least one positive subject. The LQAS method identified 25 (69.4%) communes with a prevalence higher than 4%. Using stratified sampling theory, the overall HTLV-I seroprevalence was 4.5% (95% CI: 3.6-5.4%). These data show the value of applying the LQAS method under field conditions to detect clusters of infection.
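The commune-level decision rule described above (sample n = 65, flag the lot if at least one subject is positive) can be checked with elementary binomial arithmetic. A minimal sketch, assuming the standard binomial LQAS model; the function name and the worked values are illustrative, not from the paper:

```python
import math

def lqas_accept_prob(n: int, d_max: int, prevalence: float) -> float:
    """Probability that a lot with the given true prevalence is ACCEPTED,
    i.e. the sample of n subjects contains at most d_max positives
    (a binomial lower-tail sum)."""
    return sum(
        math.comb(n, k) * prevalence**k * (1 - prevalence) ** (n - k)
        for k in range(d_max + 1)
    )

# Survey's rule: n = 65 per commune, reject on >= 1 positive (d_max = 0).
# For a commune at the 4% threshold, P(accept) = 0.96**65, i.e. the rule
# misses a truly high-prevalence commune only about 7% of the time.
print(f"P(accept a 4%-prevalence commune) = {lqas_accept_prob(65, 0, 0.04):.3f}")
```

The same function also shows the trade-off of a more lenient rule: raising `d_max` to 1 would roughly triple the chance of accepting a 4%-prevalence commune.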

  5. Detection of the HTLV-I gene on cytologic smear slides.

    Science.gov (United States)

    Kashima, Kenji; Nagahama, Junji; Sato, Keiji; Tanamachi, Hiroyuki; Gamachi, Ayako; Daa, Tsutomu; Nakayama, Iwao; Yokoyama, Shigeo

    2002-01-01

To apply the polymerase chain reaction (PCR) for detection of the HTLV-I gene on cytologic smear slides, samples were taken from seven cases of serum anti-ATL antibody (ATLA)-positive T-cell lymphoma and three cases of ATLA-negative T-cell lymphoma. Six of the seven ATLA-positive cases had been confirmed as ATLL by Southern blotting; for the seventh case a fresh sample for blotting could not be obtained. DNA was extracted from the cytologic smear slides of all 10 cases, which had been stained with Papanicolaou or May-Giemsa stain, digested with proteinase K, and precipitated with phenol and ethanol. The target sequence in the pX region of the HTLV-I gene was amplified by PCR. All seven ATLA-positive cases, including the one not confirmed by Southern blotting, showed the predicted single band, while the three ATLA-negative cases showed no band. If cytologic smear slides are available but a fresh sample is not, PCR can thus provide evidence that the virus is present, since in our study sufficient DNA template was successfully extracted from the stained cytologic smear slides for detection of the virus.

  6. A Case Report of Positive HTLV-I Infection with Bilateral Facial Weakness and Myelitis

    Directory of Open Access Journals (Sweden)

    M. Mazdeh

    2005-04-01

Infection with human T cell lymphotropic virus type I (HTLV-I), a retrovirus, causes multiple neurologic disorders. The spinal cord disease of this type is named TSP (tropical spastic paraparesis) and was drawn to the attention of neurologists 45 years ago. The clinical picture is one of slowly progressive paraparesis with increased tendon reflexes and Babinski signs; a disorder of sphincteric control is usually an early change. Paresthesia, reduced vibratory and position senses, and ataxia have been described. The diagnosis is confirmed by the detection of antibodies to the virus in serum. There are anecdotal reports of improvement with intravenous administration of gammaglobulin. HTLV-I infection has other clinical manifestations, however, and this report presents a rare case with bilateral facial weakness as the primary manifestation. The patient was a 41-year-old woman whose clinical picture was bilateral facial weakness; approximately two months later she was referred to hospital with myelitis. On initial examination and evaluation, the diagnosis was HTLV-I infection, confirmed by the detection of antibodies against the virus in her serum. She died 2.5 months after the first sign, owing to disease severity and bulbar palsy. Possible transmission routes and the risk of encountering the disease outside endemic areas must be considered, and it is recommended to evaluate antibodies in the children of such patients.

  7. Prevalencia de infeccion por HTLV-I/II en donantes de sangre de la provincia de Santa Fe, Argentina Prevalence of HTLV-I/II infection among blood donors in Santa Fe Province, Argentina

    Directory of Open Access Journals (Sweden)

    Roque O. Brun

    2004-04-01

Following the national epidemiological surveillance program established in 1997 by the National AIDS Program, anti-HTLV-I/II antibodies began to be detected among blood donors in Santa Fe Province. On the basis of this initial finding, it was considered of interest to evaluate the HTLV-I/II seroprevalence in this population over the following four years. Thus, from 1997 to 2002, 9,425 samples were studied from 17 of the 19 provincial departments. Of the total, 38 proved reactive by screening techniques, 18 of which were tested by Western blot (WB): 10 were HTLV-I/II seropositive, for a final prevalence of 0.1% (10/9,425), whereas 7 were indeterminate and 1 was negative. Of the seropositive samples, 2 (0.02%) were HTLV, 3 (0.03%) HTLV-I, and 5 (0.05%) HTLV-II. This is the first report of HTLV-I/II infection among blood donors in Santa Fe, at a prevalence higher than those reported for blood donors in non-endemic areas of Argentina. These data support the need for systematic HTLV-I/II screening, under regulatory norms, in the blood banks of this province.

  8. The probability factor in establishing causation

    International Nuclear Information System (INIS)

    Hebert, J.

    1988-01-01

This paper discusses the possibilities and limitations of methods using the probability factor in establishing the causal link between bodily injury, whether immediate or delayed, and the nuclear incident presumed to have caused it.

  9. Mielopatia associada ao HTLV-I / paraparesia espástica tropical: relato dos primeiros casos em Sergipe HTLV-I associated myelopathy, tropical spastic paraparesis: report of the first cases in Sergipe-Brazil

    Directory of Open Access Journals (Sweden)

    HÉLIO ARAUJO OLIVEIRA

    1998-03-01

HTLV-I-associated myelopathy/tropical spastic paraparesis (HAM/TSP) has been described in practically all regions of Brazil. The authors present eight clinically defined cases of HAM/TSP, the first reported in the state of Sergipe (Northeastern Brazil). All were positive for HTLV-I by ELISA, performed twice; in only two cases was confirmation by Western blot possible. According to the clinical/laboratory investigation protocol, all patients presented involvement of the pyramidal tract with minimal sensory loss and sphincter alterations. The authors call attention to the endemicity of HTLV-I in the state, where the prevalence among blood donors is significant (0.43%).

  10. Tuberculous meningoencephalomyelitis and coinfection with HTLV-I + HTLV-II: case report Meningoencefalomielite tuberculosa e coinfecção por HTLV-I + HTLV-II: relato de caso

    Directory of Open Access Journals (Sweden)

    Marcio Menna-Barreto

    2006-03-01

    Full Text Available HTLV-I and HTLV-II are endemic in some areas of Brazil, where an associated disease, HTLV-I-associated myelopathy/tropical spastic paraparesis (HAM/TSP), has been diagnosed in a significant number of infected individuals. Tuberculosis has been demonstrated among these individuals at a higher prevalence than in the general population, suggesting an increased risk for this comorbidity. We report the case of a 44-year-old man coinfected with HTLV-I and HTLV-II who suffered from an insidious meningoencephalomyelitis caused by Mycobacterium tuberculosis. He was successfully treated with steroids and antituberculous drugs, improving clinically, with a negative PCR and normalization of the blood-cerebrospinal fluid barrier.

  11. Development and Evaluation of a Novel ELISA for Detection of Antibodies against HTLV-I Using Chimeric Peptides.

    Science.gov (United States)

    Mosadeghi, Parvin; Heydari-Zarnagh, Hafez

    2018-04-01

    We aimed to develop a peptide-based indirect ELISA to detect antibodies against human T-lymphotropic virus type I (HTLV-I). Two chimeric peptides (CP-1 and CP-2) were designed using linear immunodominant epitopes of the gp46-I and gp21-I proteins, according to sequences from the UniProt database. These peptides were first evaluated by ELISA against infected sera. The more promising peptide, CP-1, was used to develop a peptide ELISA for the detection of HTLV-I-infected sera. The optimal conditions for the CP-1 ELISA were: coating buffer, 100 mM NaHCO3, pH 9.6; coating peptide concentration, 10 µg/mL; blocking buffer, 5% fetal bovine serum (FBS); secondary antibody dilution, 1:2000; and serum dilution, 1:20. Twenty serum samples from HTLV-I-infected patients were evaluated with the developed ELISA. CP-1 showed high antigenicity while lacking any cross-reactivity with normal human sera. Compared with a commercial ELISA, the CP-1 ELISA showed good sensitivity and specificity. With further validation, CP-1, as described in the present study, could be introduced as a reliable and cost-effective candidate for highly specific screening of HTLV-I/-II infections in endemic regions.
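    The sensitivity and specificity mentioned in this abstract are straightforward to compute from a confusion matrix once a reference assay (here, the commercial ELISA) defines true status. A minimal sketch, with hypothetical counts chosen only for illustration (the record does not report the actual confusion matrix):

    ```python
    # Sensitivity/specificity of a screening assay from a 2x2 confusion matrix.
    # The counts below are hypothetical, for illustration only.
    def sens_spec(tp: int, fn: int, tn: int, fp: int) -> tuple:
        """Return (sensitivity, specificity) from confusion-matrix counts."""
        sensitivity = tp / (tp + fn)  # true-positive rate among infected sera
        specificity = tn / (tn + fp)  # true-negative rate among normal sera
        return sensitivity, specificity

    # e.g. 19 of 20 infected sera detected, no false positives among 40 normals
    se, sp = sens_spec(tp=19, fn=1, tn=40, fp=0)
    print(f"sensitivity={se:.2f}, specificity={sp:.2f}")
    ```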

  12. Association of HTLV-I with Arnold Chiari syndrome and syringomyelia

    Directory of Open Access Journals (Sweden)

    Graça Maria de Castro Viana

    Full Text Available HTLV-I is associated with a broad spectrum of manifestations, including tropical spastic paraparesis and adult T-cell leukemia/lymphoma. Arnold Chiari syndrome is a condition characterized by herniation of the cerebellar tonsils through the foramen magnum. This condition should be suspected in all patients with headache and impaired motor coordination. Syringomyelia is a developmental anomaly that leads to the formation of an intramedullary cavity. Its clinical presentation is classically characterized by syringomyelic dissociation of sensation, with suspended distribution in the proximal portion of the trunk and upper limbs and preservation in other regions. We report here a case of association of the three diseases, which is rare in clinical practice, illustrating the difficulty in the diagnosis and therapeutic management of these conditions.

  13. Dual infections with HIV-1, HIV-2 and HTLV-I are more common in older women than in men in Guinea-Bissau

    DEFF Research Database (Denmark)

    Holmgren, B; da Silva, Z; Larsen, Olav Ditlevsen

    2003-01-01

    OBJECTIVES: To investigate the association between the three human retroviruses HIV-1, HIV-2 and HTLV-I. DESIGN: Community-based follow-up studies of retrovirus infections in two cohorts. METHODS: A total of 2057 individuals aged 35 years and over were eligible for inclusion. Participants were … interviewed and had a blood sample drawn. Samples were analysed for HIV-1, HIV-2 and HTLV infections. Uni- and multivariate analyses that included behavioural and socio-economic factors were performed using logistic regression and Poisson regression models. RESULTS: A total of 1686 individuals participated … increased with age for all three retroviruses. Dual infections were more common in women than in men. Assuming independent distribution of the viruses, the observed prevalence of dual infections in women was significantly higher than expected, while the prevalence was not increased in men. The prevalence …
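    The comparison described in this abstract rests on a simple null model: if two infections are distributed independently, the expected dual-infection prevalence is the product of the single-infection prevalences. A minimal sketch of that calculation, with all prevalences and counts hypothetical (the record does not report the underlying figures):

    ```python
    # Expected dual infections under independence: P(A and B) = P(A) * P(B).
    # All numbers below are hypothetical, for illustration only.
    def expected_dual_count(p_a: float, p_b: float, n: int) -> float:
        """Expected number of dual infections among n people if the two
        infections are distributed independently."""
        return p_a * p_b * n

    n_women = 900                   # hypothetical sample size
    p_hiv2, p_htlv1 = 0.08, 0.06    # hypothetical single-virus prevalences
    expected = expected_dual_count(p_hiv2, p_htlv1, n_women)
    observed = 12                   # hypothetical observed dual infections
    print(f"expected {expected:.1f} vs observed {observed}")
    ```

    An observed count well above this expectation, as the study reports for women, suggests the two infections cluster in the same individuals rather than occurring independently.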

  14. HTLV-I and HTLV-II infections in hematologic disorder patients, cancer patients, and healthy individuals from Rio de Janeiro, Brazil.

    Science.gov (United States)

    Farias de Carvalho, S M; Pombo de Oliveira, M S; Thuler, L C; Rios, M; Coelho, R C; Rubim, L C; Silva, E M; Reis, A M; Catovsky, D

    1997-07-01

    To clarify the seroprevalence of human T-cell lymphotropic virus type I (HTLV-I) among hematologic and cancer patients in the State of Rio de Janeiro, Brazil, we investigated sera from 2430 individuals from the following groups: 152 patients with T-cell diseases, 250 with B-cell disorders, 67 with myeloid leukemia, 41 with Hodgkin's disease, 351 with a history of multiple blood transfusions, 235 patients with solid tumors of different types, and 109 family members of HTLV-I-infected patients. Antibodies to HTLV-I were screened by enzyme-linked immunosorbent assay or particle agglutination assays (or both). Repeatedly reactive samples were tested by Western blot and polymerase chain reaction assay to differentiate HTLV-I from HTLV-II. We found an increased seroprevalence rate of HTLV-I among those with lymphoid malignancies, mainly in T-cell diseases (28.9%), and these results were important in characterizing 44 cases of adult T-cell leukemia/lymphoma. We confirmed the presence of HTLV-I and HTLV-II infections in blood donors (0.4% and 0.1%, respectively), in patients exposed to multiple blood transfusions (10.2% and 0.8%, respectively), and in 30 (27.5%) of 109 family members of HTLV-I- or HTLV-II-infected patients. We also confirmed the high rate of occurrence of adult T-cell leukemia/lymphoma among lymphoproliferative disorders in Rio de Janeiro, Brazil.

  15. Epidemiologia, fisiopatogenia e diagnóstico laboratorial da infecção pelo HTLV-I Epidemiology, physiopathogenesis and laboratorial diagnosis of the HTLV-I infection

    Directory of Open Access Journals (Sweden)

    Fred Luciano Neves Santos

    2005-04-01

    Full Text Available HTLV-I was discovered in the early 1980s and associated with adult T-cell leukemia/lymphoma (ATL) and tropical spastic paraparesis (TSP). HTLV belongs to the family Retroviridae and has a single-stranded RNA genome with a genetic structure similar to that of other retroviruses, carrying the gag, pol, env and pX genes, the last of which contains the regulatory genes tax and rex. Tax and Rex are the main regulatory proteins of the viral genome: Tax regulates transcription of the proviral genome indirectly, by interacting with different cellular regulatory proteins, mainly cytokine genes and proto-oncogenes, whereas Rex acts as a post-transcriptional regulator of the HTLV-I genome by controlling the splicing of viral mRNA. The infection is endemic in several regions of the world, such as Japan, several African countries, the Caribbean and South America. In Brazil, Salvador is the city with the highest prevalence, reaching 1.7% of the general population. Most individuals infected with HTLV-I, approximately 95%, remain asymptomatic throughout their lives. Among symptomatic individuals, some develop TSP and others ATL, and the pathogenesis of neither condition is fully understood. Routine diagnosis of HTLV-I infection is based on serological detection of antibodies specific to antigens from different portions of the virus, or on the detection of proviral genomic sequences in peripheral blood mononuclear cells. There is as yet no population-based epidemiological study of HTLV-I infection with adequate methodology that would establish its true prevalence in Brazil.

  16. Adult T-cell leukemia-lymphoma in a patient with HTLV-I/II associated myelopathy Leucemia - linfoma de células T do adulto em um paciente com mielopatia associada a HTLV-I/II

    Directory of Open Access Journals (Sweden)

    Virgínia Freitas

    1997-06-01

    Full Text Available Chronic myelopathy associated with human T-lymphotropic virus type I (HAM) has been described as an endemic disease in several areas of the world, yet there are few reports describing the association between HAM and adult T-cell leukemia/lymphoma. We report the case of a man who, after four years of progressive spastic paraparesis and neurogenic bladder, developed a clinical picture of a lymphoproliferative disorder characterized by dermal and systemic involvement, mimicking mycosis fungoides/Sézary syndrome.

  17. HTLV-I en población de alto riesgo sexual de Pisco, Ica, Perú.

    Directory of Open Access Journals (Sweden)

    Patricia GARRIDO

    1997-07-01

    Full Text Available Objective: 141 persons at high sexual risk in the city of Pisco were studied to detect HTLV-I infection. Material and Methods: 141 persons were interviewed and had blood samples taken, comprising female sex workers (32), homosexual men (54) and bisexual men (55). Results: Three of thirty-two (10.4%) sex workers were positive, as was one of fifty-four (1.9%) homosexual men and none of the 55 bisexual men. There was a high number of sexual partners, as well as a history of sexually transmitted diseases (STDs), in these groups with risk behaviour. Conclusions: HTLV-I infection is frequent in groups at high sexual risk in Pisco, Peru. (Rev Med Hered 1997; 8:104-107).

  18. Establishment of HTLV-I-infected cell lines from peripheral blood mononuclear cells of Brazilian patients Estabelecimento de linhagens celulares infectadas por HTLV-I a partir de células mononucleares periféricas de pacientes brasileiros

    Directory of Open Access Journals (Sweden)

    Carolina V. Pannuti

    2004-08-01

    Full Text Available To investigate epidemiological and pathogenetic features of HTLV-I infection, a cohort of carriers has been followed at the USP Teaching Hospital since 1991. This study describes the establishment of cell lines from peripheral blood mononuclear cells (PBMC) of infected subjects. Ex vivo PBMC were co-cultured with those from a seronegative donor, and morphologic evidence of cell transformation was obtained after 90 days, with detection of multinucleated cells exhibiting cerebriform nuclei. Integration of HTLV-I proviral DNA and expression of viral antigens were demonstrated in culture by PCR and immunofluorescence. The cell lines were maintained for 240 days and gradually weaned from exogenous IL-2. Immunophenotyping of the cell lines by flow cytometry yielded evidence of cell activation. Establishment of HTLV-I-infected cell lines from ex vivo PBMC is feasible and may be useful for studies of lymphocyte phenotypic changes and of the mechanisms of HTLV-induced cell proliferation. Moreover, such lines may be used for diagnostic purposes in immunofluorescence tests.

  19. Intravenous methylprednisolone in HTLV-I associated myelopathy/tropical spastic paraparesis (HAM/TSP) Metilprednisolona endovenosa na mielopatia associada ao HTLV-I/Paraparesia Espástica Tropical (MAH/PET)

    Directory of Open Access Journals (Sweden)

    Abelardo Q-C Araújo

    1993-09-01

    Full Text Available HTLV-I (human T-lymphotropic virus type I)-associated myelopathy/tropical spastic paraparesis (HAM/TSP) is an immune-mediated myelopathy induced by HTLV-I. Some patients, especially those from Japan, seem to have a good response to steroid treatment; however, this has not been found in other regions of the world. High-dose intravenous methylprednisolone has been used with success in patients with relapses of multiple sclerosis (MS), another autoimmune disease of the central nervous system. To test the effectiveness of methylprednisolone in patients with HAM/TSP, we conducted an open trial in 23 patients. We found a very limited benefit of this form of treatment. Only one patient, who had the shortest disease duration in the whole group (five months), showed a sustained benefit. We speculate that patients with a shorter history, with presumably less demyelination and more inflammatory lesions, would show a better response to immunosuppressive treatments.

  20. Proprioceptive neuromuscular facilitation in HTLV-I-associated myelopathy/tropical spastic paraparesis

    Directory of Open Access Journals (Sweden)

    Vera Lúcia Santos de Britto

    2014-01-01

    Full Text Available Introduction: Human T cell lymphotropic virus type I-associated myelopathy/tropical spastic paraparesis (HAM/TSP) can impact the independence and motricity of patients. The aims of this study were to estimate the effects of physiotherapy on the functionality of patients with HAM/TSP during the stable phase of the disease using proprioceptive neuromuscular facilitation (PNF) and to compare two methods of treatment delivery. Methods: Fourteen patients with human T cell lymphotropic virus type I (HTLV-I) were randomly allocated into two groups. In group I (seven patients), PNF was applied by the therapist, facilitating the functional activities of rolling, sitting and standing, walking, and climbing and descending stairs. In group II (seven patients), PNF was self-administered using an elastic tube, and the same activities were facilitated. Sessions were conducted for 1 h twice per week for 12 weeks. Low-back pain, the modified Ashworth scale, the functional independence measure (FIM) and the timed up and go test (TUG) were assessed before and after the interventions. Results: In the within-group evaluation, low-back pain was significantly reduced in both groups, the FIM improved in group II, and the results of the TUG improved in group I. In the inter-group analysis, only muscle tone was lower in group II than in group I. Conclusions: Both PNF protocols were effective in treating patients with HAM/TSP.

  1. Abundant tax protein expression in CD4+ T cells infected with human T-cell lymphotropic virus type I (HTLV-I) is prevented by cytotoxic T lymphocytes.

    Science.gov (United States)

    Hanon, E; Hall, S; Taylor, G P; Saito, M; Davis, R; Tanaka, Y; Usuku, K; Osame, M; Weber, J N; Bangham, C R

    2000-02-15

    The role of the cellular immune response in human T-cell leukemia virus type I (HTLV-I) infection is not fully understood. A persistently activated cytotoxic T lymphocyte (CTL) response to HTLV-I is found in the majority of infected individuals. However, it remains unclear whether this CTL response is protective or causes tissue damage. In addition, several observations paradoxically suggest that HTLV-I is transcriptionally silent in most infected cells and, therefore, not detectable by virus-specific CTLs. With the use of a new flow cytometric procedure, we show here that a high proportion of naturally infected CD4+ peripheral blood mononuclear cells (PBMC) (between 10% and 80%) are capable of expressing Tax, the immunodominant target antigen recognized by virus-specific CTLs. Furthermore, we provide direct evidence that autologous CD8+ T cells rapidly kill CD4+ cells naturally infected with HTLV-I and expressing Tax in vitro by a perforin-dependent mechanism. Consistent with these observations, we observed a significant negative correlation between the frequency of Tax(11-19)-specific CD8+ T cells and the percentage of CD4+ T cells in peripheral blood of patients infected with HTLV-I. Those results are in accordance with the view that virus-specific CTLs participate in a highly efficient immune surveillance mechanism that persistently destroys Tax-expressing HTLV-I-infected CD4+ T cells in vivo. (Blood. 2000;95:1386-1392)

  2. HTLV-I en población de alto riesgo sexual de Pisco, Ica, Perú.

    OpenAIRE

    GARRIDO, Patricia; ANICAMA, Rolando; GOTUZZO, Eduardo; CHAUCA, Gloria; WATTS, Douglas

    2013-01-01

    Objective: 141 persons at high sexual risk in the city of Pisco were studied to detect HTLV-I infection. Material and Methods: 141 persons were interviewed and had blood samples taken, comprising female sex workers (32), homosexual men (54) and bisexual men (55). Results: Three of thirty-two (10.4%) sex workers were positive, as was one of fifty-four (1.9%) homosexual men and none of the 55 bisexual men. There was a high number of sexual partners, ...

  3. Identification of Human T-lymphotropic Virus Type I (HTLV-I) Subtypes Using Restriction Fragment Length Polymorphism in a Cohort of Asymptomatic Carriers and Patients with HTLV-I-associated Myelopathy/Tropical Spastic Paraparesis from São Paulo, Brazil

    Directory of Open Access Journals (Sweden)

    Segurado Aluisio AC

    2002-01-01

    Full Text Available Although human T-lymphotropic virus type I (HTLV-I) exhibits high genetic stability compared to other RNA viruses, particularly human immunodeficiency virus (HIV), genotypic subtypes of this human retrovirus have been characterized in isolates from diverse geographical areas. These subtypes are currently believed not to be associated with different pathogenetic outcomes of infection. The present study aimed at characterizing the genotypic subtypes of viral isolates from 70 HTLV-I-infected individuals from São Paulo, Brazil, including 42 asymptomatic carriers and 28 patients with HTLV-I-associated myelopathy/tropical spastic paraparesis (HAM/TSP), using restriction fragment length polymorphism (RFLP) analysis of long terminal repeat (LTR) HTLV-I proviral DNA sequences. Peripheral blood mononuclear cell lysates were amplified by nested polymerase chain reaction (PCR) and amplicons were submitted to enzymatic digestion using a panel of endonucleases. Among HTLV-I asymptomatic carriers, the viral cosmopolitan subtypes A, B, C and E were identified in 73.8%, 7.1%, 7.1% and 12% of tested samples, respectively, whereas among HAM/TSP patients, the cosmopolitan A (89.3%), cosmopolitan C (7.1%) and cosmopolitan E (3.6%) subtypes were detected. HTLV-I subtypes were not statistically significantly associated with patients' clinical status. We also conclude that RFLP analysis is a suitable tool for descriptive studies of the molecular epidemiology of HTLV-I infections in our setting.

  4. HTLV-I Associated uveitis, myelopathy, rheumatoid arthritis and Sjögren's syndrome Uveite, mielopatia, artrite reumatóide e sindrome de Sjogren associadas ao HTLV-I

    Directory of Open Access Journals (Sweden)

    Sônia Regina A. A. Pinheiro

    1995-12-01

    Full Text Available A 62-year-old white female presented with a 10-year history of slowly progressive spastic paraparesis, pain and dysesthesia in the lower limbs, and sphincter disturbance. A few years after the onset of the neurologic symptoms she developed migratory arthritis with swelling of the knees and pain on palpation of the knees and fingers, together with dry eyes, mouth and skin. Two months before admission she presented with bilateral nongranulomatous anterior uveitis. Examination revealed spastic paraparesis with a bilateral Babinski sign, a decreased sensation level below L3, decreased vibration sense in the lower extremities, and a postural tremor of the upper limbs. Laboratory work-up disclosed positive HTLV-I tests in the blood and cerebrospinal fluid (CSF), and a mild pleocytosis in the CSF with a normal protein content. Nerve conduction velocity studies were normal. The present case shows the association of uveitis, arthritis and Sjögren's syndrome in a patient with tropical spastic paraparesis/human T-cell lymphotropic virus type I (HTLV-I)-associated myelopathy (TSP/HAM), and illustrates the wide spectrum of clinical manifestations that may accompany infection with this virus.

  5. Microtubule proteins and their post-translational forms in the cerebrospinal fluid of patients with paraparesis associated with HTLV-I infection and in SH-SY5Y cells: An in vitro model of HTLV-I-induced disease

    Directory of Open Access Journals (Sweden)

    HORACIO MALDONADO

    2008-01-01

    Full Text Available HTLV-I-associated myelopathy/tropical spastic paraparesis (HAM/TSP) is characterized by axonal degeneration of the corticospinal tracts. The specific requirements for transport of proteins and organelles to the distal part of the long axon are crucial in the corticospinal tracts, and microtubule dysfunction could be involved in this disease, making it an axonal transport disease. We measured tubulin and its post-translationally modified forms (acetylated and tyrosinated) in the CSF of patients and controls, as well as tau and its phosphorylated forms. There were no significant differences in the contents of tubulin and acetyl-tubulin between patients and controls; tyrosyl-tubulin was not detected. In HAM/TSP, tau levels were significantly reduced, while the ratio of pT181 to total tau was higher in patients than in controls, a pattern completely different from that reported in other neurodegenerative diseases. Phosphorylation at T181 was also confirmed by mass spectrometry analysis. Western blotting with monospecific polyclonal antibodies against pS199, pT205, pT231, pS262, pS356, pS396, pS404 and pS422 did not show differences in phosphorylation at these residues between patients and controls. Treating human SH-SY5Y neuroblastoma cells, a well-known in vitro neurite retraction model, with culture supernatant of MT-2 cells (an HTLV-I-infected cell line that secretes the viral Tax protein), we observed neurite retraction and an increase in tau phosphorylation at T181. Disruption of the normal phosphorylation of tau protein at T181 could result in its dysfunction, contributing to axonal damage.

  6. Human T-cell Lymphotropic Virus types I and II (HTLV-I/II) in French Guiana: clinical and molecular epidemiology

    Directory of Open Access Journals (Sweden)

    Kazanji Mirdad

    2003-01-01

    Full Text Available We review here the epidemiological studies performed by our group on human retrovirus HTLV-I and HTLV-II infections and the associated diseases in French Guiana since 1984. French Guiana is an overseas French administrative district located between Brazil and Surinam. Its population is characterized by a large variety of ethnic groups, including several populations of African origin and various populations of Amerindian origin. Several epidemiological studies of large samples of pregnant women and in remote villages showed that HTLV-I is highly endemic in this area but is restricted to groups of African origin, especially the Noir-Marrons. In this endemic population, the results of segregation analysis in a genetic epidemiological study were consistent with the presence of a dominant major gene predisposing to HTLV-I infection, especially in children. In contrast, HTLV-II infection appears to be rare in French Guiana, having been found in only a few individuals of Brazilian origin. From a molecular point of view, the HTLV-I strains present in the Noir-Marrons, Creoles and Amerindians appear to originate from Africa, as they belong to the large cosmopolitan molecular subtype A.

  7. Probability based load factors for design of concrete containment structures

    International Nuclear Information System (INIS)

    Hwang, H.; Kagami, S.; Reich, M.; Ellingwood, B.; Shinozuka, M.

    1985-01-01

    This paper describes a procedure for developing probability-based load combinations for the design of concrete containments. The proposed criteria are in a load and resistance factor design (LRFD) format. The load factors and resistance factors are derived for use in limit states design and are based on a target limit state probability. In this paper, the load factors for accident pressure and safe shutdown earthquake are derived for three target limit state probabilities. Other load factors are recommended on the basis of prior experience with probability-based design criteria for ordinary building construction. 6 refs
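    The load and resistance factor design (LRFD) format referred to in this abstract pairs a factored resistance against a combination of factored load effects. A schematic form, with generic symbols rather than the paper's exact notation:

    ```latex
    % Generic LRFD design check (illustrative notation, not the paper's):
    % factored resistance must not be less than the sum of factored load effects.
    \phi R_n \ge \sum_i \gamma_i Q_i
    % Example combination with dead load D, accident pressure P_a,
    % and safe shutdown earthquake E_{ss}:
    U = \gamma_D D + \gamma_{P_a} P_a + \gamma_{E_{ss}} E_{ss}
    ```

    The paper's contribution is deriving the load factors $\gamma_i$ and resistance factor $\phi$ so that the design check corresponds to a chosen target limit state probability.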

  8. Prevalence of human T cell leukemia virus-I (HTLV-I) antibody among populations living in the Amazon region of Brazil (preliminary report)

    Directory of Open Access Journals (Sweden)

    C. M. Nakauchi

    1990-03-01

    Full Text Available Forty-three (31.4%) out of 137 serum samples obtained from two Indian communities living in the Amazon region were found to be positive for HTLV-I antibody, as tested by enzyme-linked immunosorbent assay (ELISA). Eighty-two sera were collected from Mekranoiti Indians, yielding 39% positivity, whereas 11 (20.0%) of the 55 Tiriyo serum samples had antibody to HTLV-I. In addition, positive results occurred in 10 (23.2%) out of 43 sera obtained from patients living in the Belem area who were suffering from cancers affecting different organs. Five (16.7%) out of 30 ELISA-positive specimens were also shown to be positive by either Western blot analysis (WB) or indirect immunogold electron microscopy (IIG-EM).

  9. El HTLV-I y la PET/HAM un modelo de investigación en virología y biología molecular

    OpenAIRE

    Felipe García Vallejo; Martha C. Domínguez

    2004-01-01

    Infection with human T-lymphotropic virus type 1 (HTLV-1) has now been epidemiologically confirmed as the cause of adult T-cell leukemia/lymphoma (ATLL) and of tropical spastic paraparesis/HTLV-I-associated myelopathy (TSP/HAM) (1). HTLV-I is endemic in several geographical areas of the world and represents a global public health problem. In Colombia, the most affected areas include...

  10. Probability

    CERN Document Server

    Shiryaev, A N

    1996-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, ergodic theory, weak convergence of probability measures, stationary stochastic processes, and the Kalman-Bucy filter. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for self-study. This new edition contains substantial revisions and updated references. The reader will find a deeper study of topics such as the distance between probability measures, metrization of weak convergence, and contiguity of probability measures. Proofs for a number of important results which were merely stated in the first edition have been added. The author has included new material on the probability of large deviations and on the central limit theorem for sums of dependent random variables.

  11. Dynamics of human T-cell lymphotropic virus I (HTLV-I) infection of CD4+ T-cells.

    Science.gov (United States)

    Katri, Patricia; Ruan, Shigui

    2004-11-01

    Stilianakis and Seydel (Bull. Math. Biol., 1999) proposed an ODE model that describes the T-cell dynamics of human T-cell lymphotropic virus I (HTLV-I) infection and the development of adult T-cell leukemia (ATL). Their model consists of four components: uninfected healthy CD4+ T-cells, latently infected CD4+ T-cells, actively infected CD4+ T-cells, and ATL cells. A mathematical analysis that completely determines the global dynamics of this model was carried out by Wang et al. (Math. Biosci., 2002). In this note, we first modify the parameters of the model to distinguish between contact and infectivity rates. We then introduce a discrete time delay into the model to describe the time between the emission of contagious particles by actively infected CD4+ T-cells and the infection of uninfected cells. Using the results of Culshaw and Ruan (Math. Biosci., 2000) on the analysis of time delay with respect to cell-free viral spread of HIV, we study the effect of the time delay on the stability of the endemically infected equilibrium. Numerical simulations are presented to illustrate the results.
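    A four-compartment system of this kind is easy to explore numerically. The sketch below integrates a model in the spirit of the one described above (healthy, latently infected, actively infected, and ATL cells); the equations, parameter values, and initial conditions are illustrative assumptions, not the published Stilianakis-Seydel system, and the discrete delay is omitted:

    ```python
    # Sketch of a four-compartment HTLV-I T-cell model (illustrative only;
    # not the published Stilianakis-Seydel equations or parameters).

    def htlv_rhs(y, lam=10.0, muT=0.01, kappa=1e-4, alpha=0.05,
                 muL=0.01, rho=0.001, muA=0.05, beta=0.02,
                 muM=0.005, Mmax=2200.0):
        T, L, A, M = y
        dT = lam - muT * T - kappa * T * A                    # healthy CD4+ T-cells
        dL = kappa * T * A - (muL + alpha) * L                # latently infected
        dA = alpha * L - (muA + rho) * A                      # actively infected
        dM = rho * A + beta * M * (1.0 - M / Mmax) - muM * M  # ATL (leukemic) cells
        return (dT, dL, dA, dM)

    def rk4_step(y, h):
        """One classical fourth-order Runge-Kutta step for the autonomous system."""
        k1 = htlv_rhs(y)
        k2 = htlv_rhs([a + 0.5 * h * b for a, b in zip(y, k1)])
        k3 = htlv_rhs([a + 0.5 * h * b for a, b in zip(y, k2)])
        k4 = htlv_rhs([a + h * b for a, b in zip(y, k3)])
        return [a + h / 6.0 * (b1 + 2 * b2 + 2 * b3 + b4)
                for a, b1, b2, b3, b4 in zip(y, k1, k2, k3, k4)]

    def simulate(y0=(1000.0, 0.0, 1.0, 0.0), t_end=1000.0, h=0.1):
        """Integrate from y0 to t_end with fixed step h; return the final state."""
        y = list(y0)
        for _ in range(int(t_end / h)):
            y = rk4_step(y, h)
        return y

    T, L, A, M = simulate()
    print(f"final state: T={T:.1f}, L={L:.2f}, A={A:.3f}, M={M:.2f}")
    ```

    Introducing the discrete delay discussed in the note would turn this into a delay differential equation, requiring a history function rather than a single initial state.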

  12. Properties of HTLV-I transformed CD8+ T-cells in response to HIV-1 infection.

    Science.gov (United States)

    Gulzar, N; Shroff, A; Buberoglu, B; Klonowska, D; Kim, J E; Copeland, K F T

    2010-10-25

    HIV-1 infection studies of primary CD8(+) T-cells are hampered by the difficulty of obtaining a significant number of targets for infection and by low levels of productive infection. Further, there exists a paucity of CD8-expressing T-cell lines to address questions pertaining to the study of CD8(+) T-cells in the context of HIV-1 infection. In this study, a set of CD8(+) T-cell clones was generated through HTLV-I transformation in vitro, and the properties of these cells were examined. The clones were susceptible to T-cell tropic strains of the virus and exhibited HIV-1 production 20-fold greater than primary CD4(+) T-cells. Productive infection resulted in a decrease in expression of CD8 and CXCR4 molecules on the surface of the CD8(+) T-cell clones, and antibodies to these molecules abrogated viral binding and replication. These transformed cells provide an important tool for the study of CD8(+) T-cells and may provide important insights into the mechanism(s) behind HIV-1-induced CD8(+) T-cell dysfunction. Copyright © 2010 Elsevier Inc. All rights reserved.

  13. Ubiquitination of HTLV-I Tax in response to DNA damage regulates nuclear complex formation and nuclear export

    Directory of Open Access Journals (Sweden)

    Marriott Susan J

    2007-12-01

    Full Text Available Abstract Background The HTLV-I oncoprotein, Tax, is a pleiotropic protein whose activity is partially regulated by its ability to interact with, and perturb the functions of, numerous cellular proteins. Tax is predominantly a nuclear protein that localizes to nuclear foci known as Tax Speckled Structures (TSS. We recently reported that the localization of Tax and its interactions with cellular proteins are altered in response to various forms of genotoxic and cellular stress. The level of cytoplasmic Tax increases in response to stress and this relocalization depends upon the interaction of Tax with CRM1. Cellular pathways and signals that regulate the subcellular localization of Tax remain to be determined. However, post-translational modifications including sumoylation and ubiquitination are known to influence the subcellular localization of Tax and its interactions with cellular proteins. The sumoylated form of Tax exists predominantly in the nucleus while ubiquitinated Tax exists predominantly in the cytoplasm. Therefore, we hypothesized that post-translational modifications of Tax that occur in response to DNA damage regulate the localization of Tax and its interactions with cellular proteins. Results We found a significant increase in mono-ubiquitination of Tax in response to UV irradiation. Mutation of specific lysine residues (K280 and K284 within Tax inhibited DNA damage-induced ubiquitination. In contrast to wild-type Tax, which undergoes transient nucleocytoplasmic shuttling in response to DNA damage, the K280 and K284 mutants were retained in nuclear foci following UV irradiation and remained co-localized with the cellular TSS protein, SC35. Conclusion This study demonstrates that the localization of Tax, and its interactions with cellular proteins, are dynamic following DNA damage and depend on the post-translational modification status of Tax. Specifically, DNA damage induces the ubiquitination of Tax at K280 and K284.

  14. Combined Cytolytic Effects of a Vaccinia Virus Encoding a Single Chain Trimer of MHC-I with a Tax-Epitope and Tax-Specific CTLs on HTLV-I-Infected Cells in a Rat Model

    Directory of Open Access Journals (Sweden)

    Takashi Ohashi

    2014-01-01

    Full Text Available Adult T cell leukemia (ATL is a malignant lymphoproliferative disease caused by human T cell leukemia virus type I (HTLV-I. To develop an effective therapy against the disease, we have examined the oncolytic ability of an attenuated vaccinia virus (VV, LC16m8Δ (m8Δ, and an HTLV-I Tax-specific cytotoxic T lymphocyte (CTL line, 4O1/C8, against an HTLV-I-infected rat T cell line, FPM1. Our results demonstrated that m8Δ was able to replicate in and lyse tumorigenic FPM1 cells but was unable to injure 4O1/C8 cells, suggesting preferential cytolytic activity toward tumor cells. To further enhance the cytolysis of HTLV-I-infected cells, we modified m8Δ and obtained m8Δ/RT1AlSCTax180L, which can express a single chain trimer (SCT of rat major histocompatibility complex class I with a Tax-epitope. Combined treatment with m8Δ/RT1AlSCTax180L and 4O1/C8 increased the cytolysis of FPM1V.EFGFP/8R cells, a CTL-resistant subclone of FPM1, compared with that using 4O1/C8 and m8Δ presenting an unrelated peptide, suggesting that the activation of 4O1/C8 by m8Δ/RT1AlSCTax180L further enhanced the killing of the tumorigenic HTLV-I-infected cells. Our results indicate that combined therapy of oncolytic VVs with SCTs and HTLV-I-specific CTLs may be effective for eradication of HTLV-I-infected cells, which evade CTL lysis and can potentially develop into ATL.

  15. Thermal disadvantage factor calculation by the multiregion collision probability method

    International Nuclear Information System (INIS)

    Ozgener, B.; Ozgener, H.A.

    2004-01-01

    A multi-region collision probability formulation that is capable of applying the white boundary condition directly is presented and applied to thermal neutron transport problems. The disadvantage factors computed are compared with their counterparts calculated by S_N methods with both direct and indirect application of the white boundary condition. The results of the ABH method and of the collision probability method with indirect application of the white boundary condition are also considered, and comparisons with benchmark Monte Carlo results are carried out. The studies show that the proposed formulation is capable of calculating the thermal disadvantage factor with sufficient accuracy without resorting to the fictitious scattering outer shell approximation associated with the indirect application of the white boundary condition in collision probability solutions.
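
    Whatever transport method produces the region fluxes, the thermal disadvantage factor itself is just the ratio of the volume-averaged moderator flux to the volume-averaged fuel flux. A minimal sketch, with made-up flux and volume numbers purely for illustration:

```python
import numpy as np

def disadvantage_factor(flux_fuel, vol_fuel, flux_mod, vol_mod):
    """Thermal disadvantage factor: volume-averaged moderator flux
    divided by volume-averaged fuel flux."""
    phi_fuel = np.average(flux_fuel, weights=vol_fuel)
    phi_mod = np.average(flux_mod, weights=vol_mod)
    return phi_mod / phi_fuel

# Hypothetical region-wise thermal fluxes and volumes (illustrative only):
zeta = disadvantage_factor(
    flux_fuel=[0.80, 0.85], vol_fuel=[1.0, 1.0],
    flux_mod=[1.00, 1.10, 1.20], vol_mod=[1.0, 1.0, 2.0],
)
print(zeta)  # flux depression in the fuel gives zeta > 1
```

    With a multi-region method, each material would typically be subdivided into several rings, so the averages above are taken over all sub-regions belonging to the fuel or to the moderator.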

  16. Human T-cell Lymphotropic Virus types I and II (HTLV-I/II) in French Guiana: clinical and molecular epidemiology

    Directory of Open Access Journals (Sweden)

    Mirdad Kazanji

    2003-10-01

    Full Text Available We review here the epidemiological studies performed by our group on human retrovirus HTLV-I and HTLV-II infections and the associated diseases in French Guiana since 1984. French Guiana is an overseas French administrative district located between Brazil and Surinam. Its population is characterized by a large variety of ethnic groups, including several populations of African origin and various populations of Amerindian origin. Several epidemiological studies of large samples of pregnant women and in remote villages showed that HTLV-I is highly endemic in this area but is restricted to groups of African origin, especially the Noir-Marrons. In this endemic population, the results of segregation analysis in a genetic epidemiological study were consistent with the presence of a dominant major gene predisposing to HTLV-I infection, especially in children. In contrast, HTLV-II infection appears to be rare in French Guiana, having been found in only a few individuals of Brazilian origin. From a molecular point of view, the HTLV-I strains present in the Noir-Marrons, Creoles and Amerindians appear to originate from Africa, as they belong to the large cosmopolitan molecular subtype A.

  17. Identification of clonally rearranged T-cell receptor beta chain genes in HTLV-I carriers as a potential instrument for early detection of neoplasia

    Directory of Open Access Journals (Sweden)

    M.M. Sales

    2005-05-01

    Full Text Available We analyzed the genetic recombination pattern of the T-cell receptor beta-chain gene (TCR-beta in order to identify clonal expansion of T-lymphocytes in 17 human T-lymphotropic virus type I (HTLV-I-positive healthy carriers, 7 of them with abnormal features in the peripheral blood lymphocytes. Monoclonal or oligoclonal expansion of T-cells was detected in 5 of the 7 HTLV-I-positive patients with abnormal lymphocytes and unconfirmed diagnosis by using PCR amplification of segments of the TCR-beta gene, in a set of reactions that targets 102 different variable (V segments, covering all members of the 24 V families available in the gene bank, including the more recently identified segments of the Vbeta-5 and Vbeta-8 families and the two diversity beta segments. Southern blots, the gold standard method to detect T-lymphocyte clonality, were negative for all of these 7 patients, highlighting the low sensitivity of this method, which requires a large amount of very high quality DNA. To evaluate the performance of PCR in the detection of clonality we also analyzed 18 leukemia patients, all of whom tested positive. Clonal expansion was not detected in any of the negative controls or healthy carriers without abnormal lymphocytes. In conclusion, PCR amplification of segments of rearranged TCR-beta is reliable and highly suitable for the detection of small populations of clonal T-cells in asymptomatic HTLV-I carriers who present abnormal peripheral blood lymphocytes, providing an additional instrument for following up these patients, who are at potentially higher risk of leukemia.

  18. Prevalence of HTLV-I antibody among two distinct ethnic groups inhabiting the Amazon region of Brazil

    Directory of Open Access Journals (Sweden)

    C.M. Nakauchi

    1992-08-01

    Full Text Available HTLV-I seroprevalences of 3.63% (02/55), 12.19% (10/82) and 13.88% (10/72) were demonstrated among Tiryio, Mekranoiti and Xicrin Amazonian Indians, respectively, by the Western blotting enzyme assay (WBEI). By indirect immuno electron microscopy (IIEM), 2 Tiriyo, 9 Mekranoiti and 6 Xicrin Amerindians were reactive. Of 44 serum samples from Japanese immigrants, none reacted by either of the techniques mentioned before. One, 8 and 6 serum samples from Tiryio, Mekranoiti and Xicrin Indians, respectively, were both WBEI and IIEM positive. Our results strongly suggest that HTLV-I and/or an HTLV-I antigenic variant circulate(s) among populations living in the Amazon region of Brazil.

  19. Immunization against HTLV-I with chitosan and tri-methylchitosan nanoparticles loaded with recombinant env23 and env13 antigens of envelope protein gp46.

    Science.gov (United States)

    Amirnasr, Maryam; Fallah Tafti, Tannan; Sankian, Mojtaba; Rezaei, Abdorrahim; Tafaghodi, Mohsen

    2016-08-01

    To prevent the spread of HTLV-I (Human T-lymphotropic virus type 1), a safe and effective vaccine is required. Immune responses against peptide antigens can be potentiated with polymer-based nanoparticles, such as chitosan (CHT) and trimethylchitosan (TMC), acting as delivery system/adjuvant. CHT and TMC nanoparticles loaded with recombinant proteins (env23 & env13) of gp46 were prepared by direct coating of the antigens with the positively charged polymers. The size of the CHT and TMC nanoparticles (NPs) loaded with each antigen was about 400 nm. The physical stability of the NPs was followed for 4 weeks; both formulations remained stable for about 15 days. The immunogenicity of the antigen-loaded NPs was studied after nasal and subcutaneous immunization in mice. Three immunizations (7.5 μg antigen) were performed at 2-week intervals. Two weeks after the last booster dose, serum IgG subtypes were measured. After subcutaneous administration, for both nanoparticulate antigens, serum IgG1 and total IgG levels were higher than with the antigen solution, and the nanoparticles showed good immunoadjuvant potential. The env23 antigen was a better candidate for vaccination against HTLV-I, as it induced higher cellular immune responses compared with env13. Copyright © 2016 Elsevier Ltd. All rights reserved.

  20. Factors Affecting Detection Probability of Acoustic Tags in Coral Reefs

    KAUST Repository

    Bermudez, Edgar F.

    2012-05-01

    Acoustic telemetry is an important tool for studying the movement patterns, behaviour, and site fidelity of marine organisms; however, its application is challenging in coral reef environments, where complex topography and intense environmental noise interfere with acoustic signals and which have received comparatively little study. It is therefore particularly critical in coral reef telemetry studies to first conduct a long-term range test, a tool that provides information on the variability and periodicity of the transmitter detection range and the detection probability. A one-month range test of a coded telemetric system was conducted prior to a large-scale tagging project investigating the movement of approximately 400 fishes from 30 species on offshore coral reefs in the central Red Sea. During this range test we determined the effect of the following factors on transmitter detection efficiency: distance from receiver, time of day, depth, wind, current, moon phase and temperature. The experiment showed that biological noise is likely responsible for a diel pattern of, on average, twice as many detections during the day as during the night. Biological noise appears to be the most important noise source in coral reefs, overwhelming the effect of wind-driven noise, which is important in other studies. Detection probability is also heavily influenced by the location of the acoustic sensor within the reef structure. Understanding the effect of environmental factors on transmitter detection probability allowed us to design a more effective receiver array for the large-scale tagging study.
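
    The distance and diel effects described above are often summarized as a detection-efficiency curve. The sketch below is a hypothetical illustration, not the study's fitted model: it assumes a logistic fall-off with distance (midpoint `d50`, steepness `k`) and a fixed night-time penalty for the diel noise effect, with all numbers invented.

```python
import math

# Hypothetical detection-efficiency model (assumed form and values, not the
# study's fit): logistic decay with distance plus a night-time noise penalty.
def detection_prob(distance_m, day=True, d50=250.0, k=0.02):
    base = 1.0 / (1.0 + math.exp(k * (distance_m - d50)))  # equals 0.5 at d50
    # The abstract reports roughly twice as many detections by day as by night.
    return base if day else 0.5 * base
```

    With these assumed numbers, a receiver 250 m away would log about half of the transmissions by day and a quarter by night, so spacing receivers well inside `d50` keeps the expected detection probability high around the clock.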

  1. Factors influencing reporting and harvest probabilities in North American geese

    Science.gov (United States)

    Zimmerman, G.S.; Moser, T.J.; Kendall, W.L.; Doherty, P.F.; White, Gary C.; Caswell, D.F.

    2009-01-01

    We assessed variation in reporting probabilities of standard bands among species, populations, harvest locations, and size classes of North American geese to enable estimation of unbiased harvest probabilities. We included reward (US$10, $20, $30, $50, or $100) and control ($0) banded geese from 16 recognized goose populations of 4 species: Canada (Branta canadensis), cackling (B. hutchinsii), Ross's (Chen rossii), and snow geese (C. caerulescens). We incorporated spatially explicit direct recoveries and live recaptures into a multinomial model to estimate reporting, harvest, and band-retention probabilities. We compared various models for estimating harvest probabilities at country (United States vs. Canada), flyway (5 administrative regions), and harvest area (i.e., flyways divided into northern and southern sections) scales. Mean reporting probability of standard bands was 0.73 (95% CI 0.69-0.77). Point estimates of reporting probabilities for goose populations or spatial units varied from 0.52 to 0.93, but confidence intervals for individual estimates overlapped and model selection indicated that models with species, population, or spatial effects were less parsimonious than those without these effects. Our estimates were similar to recently reported estimates for mallards (Anas platyrhynchos). We provide current harvest probability estimates for these populations using our direct measures of reporting probability, improving the accuracy of previous estimates obtained from recovery probabilities alone. Goose managers and researchers throughout North America can use our reporting probabilities to correct recovery probabilities estimated from standard banding operations for deriving spatially explicit harvest probabilities.
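
    The correction works because birds carrying a sufficiently large reward band can be assumed to be reported essentially every time, so the ratio of standard-band to high-reward-band direct recovery rates estimates the reporting probability, and dividing the standard-band recovery probability by it yields the harvest probability. A hedged sketch of that arithmetic with invented counts (the actual analysis uses a spatially explicit multinomial model, not this simple ratio):

```python
def reporting_probability(std_recovered, std_banded, reward_recovered, reward_banded):
    # Assumes the high-reward ($100) bands are reported with probability ~1,
    # so the ratio of direct recovery rates estimates the reporting rate.
    return (std_recovered / std_banded) / (reward_recovered / reward_banded)

def harvest_probability(recovery_rate, reporting_prob):
    # Harvest probability = direct recovery probability / reporting probability.
    return recovery_rate / reporting_prob

# Invented example counts, chosen only to illustrate the arithmetic:
rho = reporting_probability(73, 1000, 100, 1000)  # 0.073 / 0.100
h = harvest_probability(73 / 1000, rho)           # recovers the reward-band rate
```

    Note how an uncorrected recovery rate of 0.073 understates the example's true harvest probability of 0.10; that understatement is exactly what the reporting-probability estimates are meant to remove.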

  2. Association between human T-cell lymphotropic virus type I (HTLV-I) infection and mortality in patients hospitalized with tuberculosis

    Directory of Open Access Journals (Sweden)

    Kristien Verdonck Bosteels

    2004-10-01

    Full Text Available Peru has a high prevalence of tuberculosis (TB) and is endemic for human T-cell lymphotropic virus type I (HTLV-I) infection. Objective: To determine the association between HTLV-I infection and mortality among patients hospitalized for TB. Material and methods: Patients consecutively admitted with a diagnosis of TB to the inpatient services of the Departments of Internal Medicine and of Infectious, Tropical and Dermatological Diseases of the Hospital Nacional Cayetano Heredia were interviewed and given a diagnostic test for HTLV-I infection. Their clinical records and the discharge registers were reviewed to establish the outcome of hospitalization. Clinical and epidemiological variables that were associated with in-hospital mortality in the univariate analysis were included in a multiple logistic regression model. Results: 193 patients hospitalized with TB were included; 14 (7.3%) had HTLV-I infection. In the multivariate analysis, HTLV-I infection (adjusted OR 9.4; CI 2.2-40.6), meningeal TB (adjusted OR 3.8; CI 1.3-11.5) and unknown HIV infection status (adjusted OR 0.2; CI 0.04-0.6) were associated with mortality during hospitalization. Conclusion: This study shows that HTLV-I infection is frequent among patients hospitalized with TB and that this infection is independently associated with in-hospital mortality. (Rev Med Hered 2004;15:197-202.)

  3. HTLV-I and TSP/HAM: a research model in virology and molecular biology

    Directory of Open Access Journals (Sweden)

    Felipe García Vallejo

    2004-03-01

    Full Text Available

    Infection by human T-lymphotropic virus type 1 (HTLV-1) has now been epidemiologically confirmed in Adult T-cell Leukemia/Lymphoma (ATLL) and in Tropical Spastic Paraparesis/HTLV-I-Associated Myelopathy (TSP/HAM) (1). HTLV-I is endemic in several geographic areas of the world and represents a global public health problem. In Colombia the most affected areas include several populations of the Pacific coast and the southwest. At the Molecular Biology and Pathogenesis laboratory of the Faculty of Health of the Universidad del Valle, we have posed the following questions, which we have addressed through a series of molecular studies:

    • What was the origin of the virus and how did it spread through South America, and especially Colombia?

    • What are the main molecular mechanisms involved in the progression of TSP/HAM?

    • How are proviruses integrated during the progression of TSP/HAM, and what new viral molecular targets and active compounds could support the design of a new antiretroviral strategy?

    Regarding the first question, our phylogenetic data on the viral genomic regions 3'LTR, Env and Tax showed that the most prevalent subtype in Colombia is the Cosmopolitan subtype, within which African molecular genotypes are the most abundant on the Pacific coast; overall, our results showed that the current genetic diversity of HTLV-I in Colombia is complex and results from several temporally separated introduction events (2-4).

  4. Semantic and associative factors in probability learning with words.

    Science.gov (United States)

    Schipper, L M; Hanson, B L; Taylor, G; Thorpe, J A

    1973-09-01

    Using a probability-learning technique with a single word as the cue and with the probability of a given event following this word fixed at .80, it was found that neither high nor low associates of the original word, nor synonyms or antonyms, showed differential learning curves subsequent to original learning when the probability of the following event was shifted to .20. In a second study, when feedback in the form of knowledge of results was withheld, there was a clear-cut similarity of predictions to the originally trained word and to synonyms of both high and low association value, and a dissimilarity of these words to a set of antonyms of both high and low association value. Two additional studies confirmed the importance of the semantic dimension as compared with association value as traditionally measured.

  6. Study of the peptide length and amino acid specific substitution in the antigenic activity of the chimeric synthetic peptides, containing the p19 core and gp46 envelope proteins of the HTLV-I virus.

    Science.gov (United States)

    Marin, Milenen Hernández; Rodríguez-Tanty, Chryslaine; Higginson-Clarke, David; Bocalandro, Yadaris Márquez; Peña, Lilliam Pozo

    2005-10-28

    Four chimeric synthetic peptides (Q5, Q6, Q7⊗, and Q8⊗), incorporating immunodominant epitopes of the core p19 (105-124 a.a.) and envelope gp46 (175-205 a.a.) proteins of HTLV-I, were obtained. Also, two gp46 monomeric peptides, M4 and M5⊗ (Ser at position 192), were synthesized. The analysis of the influence of the peptide lengths and of the proline-to-serine substitution on the antigenicity of the chimeric and monomeric peptides, with regard to the chimeric peptides Q1, Q2, Q3⊗, and Q4⊗ reported previously for HTLV-I, was carried out. The peptides' antigenicity was evaluated in an ultramicroenzyme-linked immunosorbent assay (UMELISA) using HTLV-I/II sera. The peptides' antigenicity was affected appreciably by the change of peptide length and by amino acid substitutions in the immunodominant sequence of the gp46 peptide.

  7. Deregulation of calcium fluxes in HTLV-I infected CD4-positive T-cells plays a major role in malignant transformation.

    Science.gov (United States)

    Akl, Haidar; Badran, Bassam; El Zein, Nabil; Dobirta, Gratiela; Burny, Arsene; Martiat, Philippe

    2009-01-01

    The CD4+ T-cell malignancy induced by human T-cell leukemia virus type 1 (HTLV-I) infection, termed adult T-cell leukemia/lymphoma (ATLL), is caused by defects in the mechanisms underlying cell proliferation and cell death. In CD4+ T-cells, calcium ions are central to both phenomena. ATLL is associated with marked hypercalcemia in many patients. A defect in the Ca2+ signaling pathway for lymphocyte activation results in impaired NFAT activation and transcription of cytokines, chemokines and many other NFAT target genes whose transcription is essential for a productive immune defense. Fresh ATLL cells lack the TCR/CD3 and CD7 molecules on their surface. Whereas CD7 is a calcium transporter, a reduction in calcium influx in response to T-cell activation has been reported as a functional consequence of the TCR/CD3 expression deficiency. Understanding these changes and identifying the molecular players involved might provide further insights on how to improve ATLL treatment.

  8. Molecular modeling and structural variation of the integrases of two human retroviruses: HTLV-I and HIV-1

    Directory of Open Access Journals (Sweden)

    Felipe García Vallejo

    2009-01-01

    Materials and methods: Both the HTLV-I integrase and the HIV-1 integrase are proteins of 288 amino acid residues. A tertiary-structure similarity was found between the catalytic domains of the HIV-1, ASV and RSV integrases and that of HTLV-I. From 103 complete HIV-1 integrase sequences, a total of 53 substitutions were recorded at 46 codons, located at different positions of the native protein; the most frequent were N27G (32.1%), A265V (30.1%), L101I (31.1%) and T123A (27.0%). None of the most frequently found substitutions produced a change in the native fold of the corresponding region. Conclusion: The three-dimensional structure of the central catalytic domain of the integrase would condition its activity and its interaction with potentially inhibitory molecules. The observed substitutions were neutral and did not alter the native structure. These results confirm that the integrase is a promising new target for the development of more effective antiretroviral therapies in the 21st century.

  9. On the weighting of accident probabilities for evident emotive factors

    International Nuclear Information System (INIS)

    Dukes, J.A.

    1979-01-01

    Three problems in risk management are considered: the additive property of accident risk costs, the special case of the infrequent disaster, and the correct amount to spend on accident prevention. The need for weighting by additional emotive factors is discussed. The factors considered here are: the scale factor, relating to the number of people who are killed as a result of the accident; the age factor, which takes into account the novelty of the situation against the background of common human experience; and the comprehension factor, a weighting associated with the extent to which the 'man in the street' may be expected to understand the mechanism of the accident. A table shows how these factors combine for a set of accident scenarios including radioactive spills and a loss-of-coolant reactor accident. (U.K.)

  10. Seropositivity to human T-cell lymphotropic virus types I and II among donors at the Banco Municipal de Sangre de Caracas, and associated risk factors

    Directory of Open Access Journals (Sweden)

    Graciela León

    2003-03-01

    Full Text Available OBJECTIVES: To determine the proportion of blood discarded because of seropositivity to human T-cell lymphotropic virus (HTLV) types I and II, the prevalence of this infection, and its probable risk factors among donors at the Banco Municipal de Sangre de Caracas (BMSC). METHODS: 23,413 donors seen between July 2000 and April 2001 at the BMSC were screened serologically by enzyme-linked immunosorbent assay (ELISA). Repeatedly reactive (RR) samples were studied by Western blot (WB) as a supplementary test. Donors who were positive or indeterminate by WB were called in for counseling, in order to confirm the result by nucleic acid amplification with the polymerase chain reaction (PCR), collect data on their risk history, and advise them about their status. RESULTS: 0.2% of donations were RR; of these, 52.1% were WB-positive (23 for HTLV-I and 2 for HTLV-II), 4.1% were WB-indeterminate, 29.2% were negative, and 14.6% could not be evaluated. Sixteen donors attended counseling (14 WB-positive for HTLV-I, 1 for HTLV-II, and 1 indeterminate). All were PCR-positive. No significant differences from the control group were found in age, sex, type of donation, number of previous donations, transfusion history, or sexual behavior. Significant differences were observed for a history of non-intravenous drug use (P < 0.05), and highly significant differences (P < 0.001) for a history of prolonged breastfeeding. The mothers of six of the positive donors who reported prolonged breastfeeding were tested and found positive, as was the eldest child of the only positive couple among the 13 couples evaluated. CONCLUSIONS: 0.2% of the blood was discarded for testing positive for HTLV I/II. The prevalence among donors was 0.11%. In 37.5% of the cases it was possible to determine the

  12. Maternal factors and the probability of a planned home birth

    NARCIS (Netherlands)

    Anthony, S.; Buitendijk, S.E.; Offerhaus, P.M.; Dommelen, P. van; Pal-de Bruin, K.M. van der

    2005-01-01

    Objectives: In the Netherlands, approximately one-third of births are planned home births, mostly supervised by a midwife. The relationship between maternal demographic factors and home births supervised by midwives was examined. Design: Cross-sectional study. Setting: Dutch national perinatal

  14. The mediating effect of psychosocial factors on suicidal probability among adolescents.

    Science.gov (United States)

    Hur, Ji-Won; Kim, Won-Joong; Kim, Yong-Ku

    2011-01-01

    Suicidal probability is an overall tendency encompassing negative self-evaluation, hopelessness, suicidal ideation, and hostility. The purpose of this study was to examine the role of psychosocial variables in the suicidal probability of adolescents, especially the role of mediating variables. This study investigated the mediating effects of psychosocial factors such as depression, anxiety, self-esteem, stress, and social support on the suicidal probability among 1,586 adolescents attending middle and high schools in the Kyunggi Province area of South Korea. The relationship between depression/anxiety and suicidal probability was mediated by both social resources and self-esteem. Furthermore, the influence of social resources was mediated by interpersonal and achievement stress as well as self-esteem. This study suggests that suicidal probability in adolescents has various relationships, including mediating relations, with several psychosocial factors. Interventions on suicidal probability in adolescents should therefore focus on social factors as well as clinical symptoms.

  15. An analytical evaluation for spatial-dependent intra-pebble Dancoff factor and escape probability

    International Nuclear Information System (INIS)

    Kim, Songhyun; Kim, Hong-Chul; Kim, Jong Kyung; Kim, Soon Young; Noh, Jae Man

    2009-01-01

    A model for the analytical evaluation of spatially dependent intra-pebble Dancoff factors and escape probabilities is developed in this study. Intra-pebble Dancoff factors and escape probabilities are calculated as functions of the fuel kernel radius, the number of fuel kernels, and the fuel region radius. The method can readily be used to analyze the tendency of the spatially dependent intra-pebble Dancoff factor and the spatially dependent fuel region escape probability for various geometries, because it is faster than the MCNP method while maintaining good accuracy. (author)

  16. The value of CSF analysis for the differential diagnosis of HTLV-I associated myelopathy and multiple sclerosis

    Directory of Open Access Journals (Sweden)

    Marzia Puccioni-Sohler

    1995-12-01

    Cerebrospinal fluid (CSF) and serum of 17 patients with HAM/TSP (HTLV-I associated myelopathy/tropical spastic paraparesis), six with multiple sclerosis, and six with idiopathic epilepsy (non-inflammatory controls) from Brazil were analysed for the presence of intrathecal synthesis of virus-specific antibodies against measles, rubella, varicella zoster virus, and herpes simplex virus by enzyme-linked immunosorbent assay (ELISA). All HAM/TSP and multiple sclerosis cases had an intrathecal immune response (oligoclonal IgG). In HAM/TSP, only 1 of 17 cases showed a polyspecific intrathecal immune response against measles and rubella virus. In multiple sclerosis, specific antibodies against measles and rubella (MRZ response) were observed in all patients but not in the controls with idiopathic epilepsy. The diagnostic and theoretical relevance of mono- and polyspecific immune responses is discussed for these chronic neurological diseases.

  17. Main factors for fatigue failure probability of pipes subjected to fluid thermal fluctuation

    International Nuclear Information System (INIS)

    Machida, Hideo; Suzuki, Masaaki; Kasahara, Naoto

    2015-01-01

    It is very important to grasp failure probability and failure mode appropriately when carrying out risk reduction measures at nuclear power plants. To clarify the important factors for the failure probability and failure mode of pipes subjected to fluid thermal fluctuation, failure probability analyses were performed by changing the values of the stress range, stress ratio, stress components, and threshold of the stress intensity factor range. The important factors for the failure probability are the stress range, the stress ratio (mean stress condition), and the threshold of the stress intensity factor range. The important factor for the failure mode is the circumferential angle range of the fluid thermal fluctuation. When a large fluid thermal fluctuation acts on the entire circumferential surface of the pipe, the probability of pipe breakage increases, calling for measures to prevent such a failure and reduce the risk to the plant. When the circumferential angle subjected to fluid thermal fluctuation is small, the failure mode of the piping is leakage, and corrective maintenance might be applicable from the viewpoint of risk to the plant. (author)

  18. The joint probability distribution of structure factors incorporating anomalous-scattering and isomorphous-replacement data

    International Nuclear Information System (INIS)

    Peschar, R.; Schenk, H.

    1991-01-01

    A method to derive joint probability distributions of structure factors is presented which incorporates anomalous-scattering and isomorphous-replacement data in a unified procedure. The structure factors F(H) and F(−H), whose magnitudes are different due to anomalous scattering, are shown to be isomorphously related. This leads to a definition of isomorphism by means of which isomorphous-replacement and anomalous-scattering data can be handled simultaneously. The definition and calculation of the general term of the joint probability distribution for isomorphous structure factors turns out to be crucial. Its analytical form leads to an algorithm by means of which any particular joint probability distribution of structure factors can be constructed. The calculation of the general term is discussed for the case of four isomorphous structure factors in P1, assuming the atoms to be independently and uniformly distributed. A main result is the construction of the probability distribution of the 64 triplet phase sums present in space group P1 amongst four isomorphous structure factors F(H), four isomorphous F(K) and four isomorphous F(−H−K). The procedure is readily generalized to the case where an arbitrary number of isomorphous structure factors are available for F(H), F(K) and F(−H−K). (orig.)

  19. Summary of intrinsic and extrinsic factors affecting detection probability of marsh birds

    Science.gov (United States)

    Conway, C.J.; Gibbs, J.P.

    2011-01-01

    Many species of marsh birds (rails, bitterns, grebes, etc.) rely exclusively on emergent marsh vegetation for all phases of their life cycle, and many organizations have become concerned about the status and persistence of this group of birds. Yet, marsh birds are notoriously difficult to monitor due to their secretive habits. We synthesized the published and unpublished literature and summarized the factors that influence detection probability of secretive marsh birds in North America. Marsh birds are more likely to respond to conspecific than heterospecific calls, and the seasonal peak in vocalization probability varies among co-existing species. The effectiveness of morning versus evening surveys varies among species and locations. Vocalization probability appears to be positively correlated with density in breeding Virginia Rails (Rallus limicola), Soras (Porzana carolina), and Clapper Rails (Rallus longirostris). Movement of birds toward the broadcast source creates biases when using count data from call-broadcast surveys to estimate population density. Ambient temperature, wind speed, cloud cover, and moon phase affected detection probability in some, but not all, studies. Better estimates of detection probability are needed. We provide recommendations that would help improve future marsh bird survey efforts and a list of 14 priority information and research needs that represent gaps in our current knowledge where future resources are best directed. © Society of Wetland Scientists 2011.

  20. Bayes factor and posterior probability: Complementary statistical evidence to p-value.

    Science.gov (United States)

    Lin, Ruitao; Yin, Guosheng

    2015-09-01

    As a convention, a p-value is often computed in hypothesis testing and compared with the nominal level of 0.05 to determine whether to reject the null hypothesis. Although the smaller the p-value, the more significant the statistical test, it is difficult to perceive the p-value on a probability scale and to quantify it as the strength of the data against the null hypothesis. In contrast, the Bayesian posterior probability of the null hypothesis has an explicit interpretation of how strongly the data support the null. We make a comparison of the p-value and the posterior probability by considering a recent clinical trial. The results show that even when we reject the null hypothesis, there is still a substantial probability (around 20%) that the null is true. Not only should we examine whether the data would have rarely occurred under the null hypothesis, but we also need to know whether the data would be rare under the alternative. As a result, the p-value provides only one side of the information, for which the Bayes factor and posterior probability may offer complementary evidence. Copyright © 2015 Elsevier Inc. All rights reserved.
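
    The conversion from a Bayes factor to a posterior probability of the null that this abstract relies on can be sketched as follows. The Bayes factor value BF10 = 4 and the even prior are illustrative assumptions, not figures from the trial discussed; they are chosen only because they reproduce the "around 20%" posterior mentioned above:

```python
import math

def posterior_null(bf10, prior_null=0.5):
    """Posterior probability of H0 given the Bayes factor BF10
    (evidence for H1 over H0) and a prior probability P(H0)."""
    prior_odds_null = prior_null / (1.0 - prior_null)
    post_odds_null = prior_odds_null / bf10  # BF01 = 1/BF10
    return post_odds_null / (1.0 + post_odds_null)

# A nominally 'significant' result often corresponds to only a modest
# Bayes factor; with BF10 = 4 and even priors, H0 still retains about
# 20% posterior probability.
print(round(posterior_null(4.0), 2))  # 0.2
```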

  1. A study of coarse mesh collision probability correction factors in slab lattices

    International Nuclear Information System (INIS)

    Buckler, A.N.

    1975-07-01

    Calculations of collision probability leakage estimates are performed in one-dimensional slab geometry with one neutron group to gain some insight into methods of correcting for the coarseness of the mesh H. The chief result is that the correction factor, beta, can be written as CD/H, where C → 4 in the diffusion limit. An explicit expression for C is derived in terms of the E3 function for a linear flux variation across the slabs. (author)
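
    The E3 function mentioned above is the third-order exponential integral, E3(x) = ∫₀¹ t·e^(−x/t) dt. A minimal numerical sketch (the quadrature scheme and the values of D and H are illustrative, not taken from the report):

```python
import math

def E3(x, n=20_000):
    """Third-order exponential integral E3(x) = integral over t in (0,1)
    of t*exp(-x/t) dt, evaluated with a simple midpoint rule."""
    h = 1.0 / n
    return h * sum((i + 0.5) * h * math.exp(-x / ((i + 0.5) * h))
                   for i in range(n))

# Sanity check: E_n(0) = 1/(n-1), so E3(0) = 0.5.
print(round(E3(0.0), 6))  # 0.5

# The report's coarse-mesh correction beta = C*D/H with C -> 4 in the
# diffusion limit; D and H below are placeholder values.
C, D, H = 4.0, 1.2, 10.0
print(round(C * D / H, 2))  # 0.48
```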

  2. Prevalence of masturbation and associated factors in a British national probability survey

    OpenAIRE

    Gerressu, Makeda; Mercer, Catherine H.; Graham, Cynthia A.; Wellings, Kaye; Johnson, Anne M.

    2008-01-01

    A stratified probability sample survey of the British general population, aged 16 to 44 years, was conducted from 1999 to 2001 (N = 11,161) using face-to-face interviewing and computer-assisted self-interviewing. We used these data to estimate the population prevalence of masturbation, and to identify sociodemographic, sexual behavioral, and attitudinal factors associated with repo...

  3. Disadvantage factors for square lattice cells using a collision probability method

    International Nuclear Information System (INIS)

    Raghav, H.P.

    1976-01-01

    The flux distribution in an infinite square lattice consisting of cylindrical fuel rods and moderator is calculated by using a collision probability method. Neutrons are assumed to be monoenergetic, and the sources as well as the scattering are assumed to be isotropic. Carlvik's method is used for the calculation of collision probabilities. The important features of the method are that the square boundary is treated exactly and that the contribution of the surrounding cells is calculated explicitly. The method is implemented in the computer code CELLC, which carries out the integration by Simpson's rule. The convergence and accuracy of CELLC are assessed by computing disadvantage factors for the well-known Thie lattices and comparing the results with Monte Carlo and other integral transport theory methods used elsewhere. It is demonstrated that it is not correct to apply the white boundary condition in the Wigner-Seitz cell for low pitch and low cross sections. (orig.)

  4. Disordered eating behaviors among transgender youth: Probability profiles from risk and protective factors.

    Science.gov (United States)

    Watson, Ryan J; Veale, Jaimie F; Saewyc, Elizabeth M

    2017-05-01

    Research has documented high rates of disordered eating for lesbian, gay, and bisexual youth, but prevalence and patterns of disordered eating among transgender youth remain unexplored. This is despite unique challenges faced by this group, including gender-related body image and the use of hormones. We explore the relationship between disordered eating and risk and protective factors for transgender youth. An online survey of 923 transgender youth (aged 14-25) across Canada was conducted, primarily using measures from existing youth health surveys. Analyses were stratified by gender identity and included logistic regressions with probability profiles to illustrate combinations of risk and protective factors for eating disordered behaviors. Enacted stigma (the higher rates of harassment and discrimination sexual minority youth experience) was linked to higher odds of reported past year binge eating and fasting or vomiting to lose weight, while protective factors, including family connectedness, school connectedness, caring friends, and social support, were linked to lower odds of past year disordered eating. Youth with the highest levels of enacted stigma and no protective factors had high probabilities of past year eating disordered behaviors. Our study found a high prevalence of disordered eating. Risk for these behaviors was linked to stigma and violence exposure, but offset by social supports. Health professionals should assess transgender youth for disordered eating behaviors and supportive resources. (Int J Eat Disord 2017; 50:515-522). © 2016 Wiley Periodicals, Inc.
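
    The "probability profiles" mentioned above come from converting logistic-regression coefficients into predicted probabilities for particular combinations of risk and protective factors. A minimal sketch; the intercept and slopes below are hypothetical placeholders, not the study's estimates:

```python
import math

def probability_profile(stigma, n_protective,
                        b0=-2.0, b_stigma=0.8, b_protective=-0.5):
    """Predicted probability of past-year disordered eating from a
    logistic model. All coefficients are hypothetical placeholders
    used only to illustrate how probability profiles are built."""
    logit = b0 + b_stigma * stigma + b_protective * n_protective
    return 1.0 / (1.0 + math.exp(-logit))

# High enacted stigma (score 3), no protective factors:
print(round(probability_profile(3, 0), 2))  # 0.6
# Low stigma, all four protective factors present:
print(round(probability_profile(0, 4), 2))  # 0.02
```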

  5. Probability Model of Allele Frequency of Alzheimer’s Disease Genetic Risk Factor

    Directory of Open Access Journals (Sweden)

    Afshin Fayyaz-Movaghar

    2016-06-01

    Background and Purpose: The identification of genetic risk factors of human diseases is very important. This study was conducted to model the allele frequencies (AFs) of Alzheimer's disease. Materials and Methods: In this study, several candidate probability distributions were fitted to a data set of an Alzheimer's disease genetic risk factor. The unknown parameters of the considered distributions were estimated, and several goodness-of-fit criteria were calculated for comparison. Results: Based on these statistical criteria, the beta distribution gives the best fit to the AFs. However, the estimated values of the parameters of the beta distribution lead us to the standard uniform distribution. Conclusion: The AFs of Alzheimer's disease follow the standard uniform distribution.

  6. Measuring public opinion on alcohol policy: a factor analytic study of a US probability sample.

    Science.gov (United States)

    Latimer, William W; Harwood, Eileen M; Newcomb, Michael D; Wagenaar, Alexander C

    2003-03-01

    Public opinion has been one factor affecting change in policies designed to reduce underage alcohol use. Extant research, however, has been criticized for using single survey items of unknown reliability to define adult attitudes on alcohol policy issues. The present investigation addresses a critical gap in the literature by deriving scales on public attitudes, knowledge, and concerns pertinent to alcohol policies designed to reduce underage drinking using a US probability sample survey of 7021 adults. Five attitudinal scales were derived from exploratory and confirmatory factor analyses addressing policies to: (1) regulate alcohol marketing, (2) regulate alcohol consumption in public places, (3) regulate alcohol distribution, (4) increase alcohol taxes, and (5) regulate youth access. The scales exhibited acceptable psychometric properties and were largely consistent with a rational framework which guided the survey construction.

  7. Development and evaluation of probability density functions for a set of human exposure factors

    Energy Technology Data Exchange (ETDEWEB)

    Maddalena, R.L.; McKone, T.E.; Bodnar, A.; Jacobson, J.

    1999-06-01

    The purpose of this report is to describe efforts carried out during 1998 and 1999 at the Lawrence Berkeley National Laboratory to assist the U.S. EPA in developing and ranking the robustness of a set of default probability distributions for exposure assessment factors. Among the current needs of the exposure-assessment community is the need to provide data for linking exposure, dose, and health information in ways that improve environmental surveillance, improve predictive models, and enhance risk assessment and risk management (NAS, 1994). The U.S. Environmental Protection Agency (EPA) Office of Emergency and Remedial Response (OERR) plays a lead role in developing national guidance and planning future activities that support the EPA Superfund Program. OERR is in the process of updating its 1989 Risk Assessment Guidance for Superfund (RAGS) as part of the EPA Superfund reform activities. Volume III of RAGS, when completed in 1999, will provide guidance for conducting probabilistic risk assessments. This revised document will contain technical information including probability density functions (PDFs) and the methods used to develop and evaluate these PDFs. The PDFs provided in this EPA document are limited to those relating to exposure factors.

  9. NOx emission calculations for bulk carriers by using engine power probabilities as weighting factors.

    Science.gov (United States)

    Cheng, Chih-Wen; Hua, Jian; Hwang, Daw-Shang

    2017-10-01

    An important marine pollution issue identified by the International Maritime Organization (IMO) is NOx emissions; however, the stipulated method for determining the NOx certification value does not reflect the actual high emission factors of slow-speed two-stroke diesel engines over long-term slow steaming. In this study, an accurate method is presented for calculating the NOx emission factors and the total amount of NOx emissions by using the actual power probabilities of the diesel engines in four types of bulk carriers. The proposed method is suitable for all types and purposes of diesel engines, is not restricted to any operating mode, and is highly accurate. Moreover, it is recommended that the IMO-stipulated certification value calculation method be modified accordingly to genuinely reduce the amount of NOx emissions. Achieving this level of reduction will help improve air quality, especially in coastal and port areas, and the health of local residents. As per the IMO, the NOx emission certification value of marine diesel engines with a rated power over 130 kW must be obtained using a specified weighting-factor (WF) based calculation. However, this calculation fails to represent the actual situation. Effective emission reductions of 6.91% (at sea) and 31.9% (in ports) were achieved using a mathematical model of power probability functions. Thus, we strongly recommend amending the certification value of the NOx Technical Code 2008 (NTC 2008) by removing the WF constraints, such that the NOx emissions of diesel engines are lower than the Tier limits at any load level, to obtain genuine NOx emission reductions.
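
    The core of the proposed calculation, weighting per-load NOx emission factors by the engine's observed power probabilities instead of fixed NTC 2008 weighting factors, can be sketched as follows. The load bins, probabilities, and per-bin emission factors are illustrative values, not the study's measurements:

```python
# Weighted NOx emission factor (g/kWh) using observed engine power
# probabilities as weights. All numbers are illustrative placeholders:
# slow-speed two-stroke engines tend to emit more NOx per kWh at the
# low loads typical of long-term slow steaming.
loads      = [0.25, 0.50, 0.75, 1.00]   # fraction of rated power
p_observed = [0.45, 0.35, 0.15, 0.05]   # long-term operating profile
ef_by_load = [17.0, 14.5, 13.0, 12.5]   # emission factor per load bin

weighted_ef = sum(p * ef for p, ef in zip(p_observed, ef_by_load))
print(round(weighted_ef, 2))  # 15.3
```

Replacing `p_observed` with the fixed certification weighting factors would reproduce the NTC 2008-style value instead, which is the comparison the study draws.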

  10. Regulation of IFN regulatory factor 4 expression in human T cell leukemia virus-I-transformed T cells.

    Science.gov (United States)

    Sharma, Sonia; Grandvaux, Nathalie; Mamane, Yael; Genin, Pierre; Azimi, Nazli; Waldmann, Thomas; Hiscott, John

    2002-09-15

    IFN regulatory factor (IRF)-4 is a lymphoid/myeloid-restricted member of the IRF transcription factor family that plays an essential role in the homeostasis and function of mature lymphocytes. IRF-4 expression is tightly regulated in resting primary T cells and is transiently induced at the mRNA and protein levels after activation by Ag-mimetic stimuli such as TCR cross-linking or treatment with phorbol ester and calcium ionophore (PMA/ionomycin). However, IRF-4 is constitutively upregulated in human T cell leukemia virus type I (HTLV-I) infected T cells as a direct gene target for the HTLV-I Tax oncoprotein. In this study we demonstrate that chronic IRF-4 expression in HTLV-I-infected T lymphocytes is associated with a leukemic phenotype, and we examine the mechanisms by which continuous production of IRF-4 is achieved in HTLV-I-transformed T cells. IRF-4 expression in HTLV-1-infected cells is driven through activation of the NF-kappaB and NF-AT pathways, resulting in the binding of p50, p65, and c-Rel to the kappaB1 element and p50, c-Rel, and NF-ATp to the CD28RE element within the -617 to -209 region of the IRF-4 promoter. Furthermore, mutation of either the kappaB1 or CD28RE sites blocks Tax-mediated transactivation of the human IRF-4 promoter in T cells. These experiments constitute the first detailed analysis of human IRF-4 transcriptional regulation within the context of HTLV-I infection and transformation of CD4(+) T lymphocytes.

  11. Analytical models of probability distribution and excess noise factor of solid state photomultiplier signals with crosstalk

    International Nuclear Information System (INIS)

    Vinogradov, S.

    2012-01-01

    Silicon Photomultipliers (SiPM), also called Solid State Photomultipliers (SSPM), are based on Geiger-mode avalanche breakdown that is limited by a strong negative feedback. An SSPM can detect and resolve single photons due to the high gain and ultra-low excess noise of avalanche multiplication in this mode. Crosstalk and afterpulsing processes associated with the high gain introduce specific excess noise and deteriorate the photon number resolution of the SSPM. The probabilistic features of these processes are widely studied because of their significance for SSPM design, characterization, optimization, and application, but the process modeling is mostly based on Monte Carlo simulations and numerical methods. In this study, crosstalk is considered to be a branching Poisson process, and analytical models of the probability distribution and excess noise factor (ENF) of SSPM signals based on the Borel distribution, as an advance on the geometric distribution models, are presented and discussed. The models are found to be in good agreement with the experimental probability distributions for dark counts and few-photon spectra over a wide range of fired pixel numbers, as well as with the observed super-linear behavior of the crosstalk ENF.
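
    The Borel distribution mentioned above has a closed-form pmf, P(N=n) = e^(−μn)(μn)^(n−1)/n! for n ≥ 1, with mean 1/(1−μ). A small numerical sketch, using the standard detector definition ENF = ⟨N²⟩/⟨N⟩² (the crosstalk parameter μ = 0.2 is an illustrative value, not one from the paper):

```python
import math

def borel_pmf(n, mu):
    """Borel distribution P(N=n) = exp(-mu*n)*(mu*n)**(n-1)/n!, n >= 1:
    total number of fired pixels in a branching-Poisson crosstalk chain
    started by one primary avalanche, for branching parameter 0 < mu < 1."""
    return math.exp(-mu * n) * (mu * n) ** (n - 1) / math.factorial(n)

mu = 0.2  # illustrative crosstalk probability
ns = range(1, 120)  # truncation; the tail is negligible for mu = 0.2
pmf = [borel_pmf(n, mu) for n in ns]

mean = sum(n * p for n, p in zip(ns, pmf))
second_moment = sum(n * n * p for n, p in zip(ns, pmf))
enf = second_moment / mean**2  # excess noise factor <N^2>/<N>^2

print(round(mean, 4))  # 1.25, i.e. 1/(1 - mu)
print(round(enf, 4))   # 1.25: numerically, ENF also equals 1/(1 - mu)
```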

  12. Arsenic concentrations, related environmental factors, and the predicted probability of elevated arsenic in groundwater in Pennsylvania

    Science.gov (United States)

    Gross, Eliza L.; Low, Dennis J.

    2013-01-01

    Analytical results for arsenic in water samples from 5,023 wells obtained during 1969–2007 across Pennsylvania were compiled and related to other associated groundwater-quality and environmental factors and used to predict the probability of elevated arsenic concentrations, defined as greater than or equal to 4.0 micrograms per liter (µg/L), in groundwater. Arsenic concentrations of 4.0 µg/L or greater (elevated concentrations) were detected in 18 percent of samples across Pennsylvania; 8 percent of samples had concentrations that equaled or exceeded the U.S. Environmental Protection Agency’s drinking-water maximum contaminant level of 10.0 µg/L. The highest arsenic concentration was 490.0 µg/L.

  13. Risk and protective factors of dissocial behavior in a probability sample.

    Science.gov (United States)

    Moral de la Rubia, José; Ortiz Morales, Humberto

    2012-07-01

    The aims of this study were to identify risk and protective factors for dissocial behavior, bearing in mind that self-reports of dissocial behavior are biased by impression management. A probability sample of adolescents living in two neighborhoods with high rates of gangs and offenses (112 males and 86 females) was collected. The 27-item Dissocial Behavior Scale (ECODI27; Pacheco & Moral, 2010), the Balanced Inventory of Desirable Responding, version 6 (BIDR-6; Paulhus, 1991), the Sensation Seeking Scale, form V (SSS-V; Zuckerman, Eysenck, & Eysenck, 1978), the Parent-Adolescent Communication Scale (PACS; Barnes & Olson, 1982), the 30-item Rathus Assertiveness Schedule (RAS; Rathus, 1973), the Interpersonal Reactivity Index (IRI; Davis, 1983), and a social relationship questionnaire (SRQ) were applied. Binary logistic regression was used for the data analysis. A third of the participants showed dissocial behavior. Belonging to a gang in school (schooled adolescents) or to a gang outside school and work (total sample) and disinhibition were risk factors; being female, perspective taking, and open communication with the father were protective factors. School-leaving was a differential aspect. We stress the need for intervention on these variables.

  14. Prevalence of masturbation and associated factors in a British national probability survey.

    Science.gov (United States)

    Gerressu, Makeda; Mercer, Catherine H; Graham, Cynthia A; Wellings, Kaye; Johnson, Anne M

    2008-04-01

    A stratified probability sample survey of the British general population, aged 16 to 44 years, was conducted from 1999 to 2001 (N = 11,161) using face-to-face interviewing and computer-assisted self-interviewing. We used these data to estimate the population prevalence of masturbation, and to identify sociodemographic, sexual behavioral, and attitudinal factors associated with reporting this behavior. Seventy-three percent of men and 36.8% of women reported masturbating in the 4 weeks prior to interview (95% confidence interval 71.5%-74.4% and 35.4%-38.2%, respectively). A number of sociodemographic and behavioral factors were associated with reporting masturbation. Among both men and women, reporting masturbation increased with higher levels of education and social class and was more common among those reporting sexual function problems. For women, masturbation was more likely among those who reported more frequent vaginal sex in the last four weeks, a greater repertoire of sexual activity (such as reporting oral and anal sex), and more sexual partners in the last year. In contrast, the prevalence of masturbation was lower among men reporting more frequent vaginal sex. Both men and women reporting same-sex partner(s) were significantly more likely to report masturbation. Masturbation is a common sexual practice with significant variations in reporting between men and women.
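
    The confidence intervals quoted above (e.g. 71.5%-74.4% around the 73% of men reporting masturbation) are standard intervals for a proportion. A minimal sketch using the normal-approximation (Wald) formula; the per-sex sample size is not stated in the abstract, so n below is a hypothetical value chosen only to illustrate the calculation:

```python
import math

def wald_ci_95(p, n):
    """Normal-approximation (Wald) 95% confidence interval for a
    proportion p estimated from a sample of size n."""
    half_width = 1.96 * math.sqrt(p * (1 - p) / n)
    return p - half_width, p + half_width

# Hypothetical effective sample size; the survey's per-sex bases
# (and any design-effect adjustment) are not given in the abstract.
low, high = wald_ci_95(0.73, 3600)
print(f"{low:.1%} to {high:.1%}")
```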

  15. A probable risk factor of female breast cancer: study on benign and malignant breast tissue samples.

    Science.gov (United States)

    Rehman, Sohaila; Husnain, Syed M

    2014-01-01

    The study reports enhanced Fe, Cu, and Zn contents in breast tissues as a probable risk factor of breast cancer in females. Forty-one formalin-fixed breast tissues were analyzed using atomic absorption spectrophotometry: 20 malignant, six adjacent to malignant, and 15 benign tissue samples were investigated. The malignant tissue samples were of grade II and of the invasive ductal carcinoma type. The quantitative comparison between the elemental levels measured in the two types of specimens (benign and malignant, removed after surgery) suggests significant elevation of these metals (Fe, Cu, and Zn) in the malignant tissue. The specimens were collected just after mastectomy from women aged 19 to 59 years at hospitals in Islamabad and Rawalpindi, Pakistan. Most of the patients belonged to urban areas of Pakistan. The findings indicate that these elements have a promising role in the initiation and development of carcinoma, as a consistent pattern of elevation for Fe, Cu, and Zn was observed. The results showed excessive accumulation of Fe (229 ± 121 mg/L) in the malignant breast tissue samples, a probable risk factor of breast cancer. In order to validate the method of analysis, the IAEA certified reference material MA-M-2/TM (lyophilized muscle tissue) was analyzed for the metals studied. The determined concentrations were in good agreement with the certified levels. An asymmetric concentration distribution for Fe, Cu, and Zn was observed in both malignant and benign tissue samples.

  16. New experimental data on the influence of extranuclear factors on the probability of radioactive decay

    CERN Document Server

    Bondarevskij, S I; Skorobogatov, G A

    2002-01-01

    New experimental data on the influence of various extranuclear factors on the probability (λ) of radioactive decay are presented. During redox processes in solutions containing ¹³⁹Ce, the relative change in λ measured by the ΔI/I method was [I(Ce(IV)) − I(Ce(III))]/I_mean = +(1.4 ± 0.6) × 10⁻⁴. Using a modification of the method based on displacement of the secular radioactive equilibrium, a growth of λ of the tellurium nuclear isomer by 0.04 ± 0.02% was detected when a MgO(¹²¹ᵐTe) source was cooled to 78 K. New experimental data on the increase in the gamma-radioactivity of a Be(¹²³ᵐTe) sample due to a low-temperature induced reaction, i.e. collective nuclear superluminescence, are also provided.

  17. Operational NDT simulator, towards human factors integration in simulated probability of detection

    Science.gov (United States)

    Rodat, Damien; Guibert, Frank; Dominguez, Nicolas; Calmon, Pierre

    2017-02-01

    In the aeronautic industry, the performance demonstration of Non-Destructive Testing (NDT) procedures relies on Probability Of Detection (POD) analyses. This statistical approach measures the ability of the procedure to detect a flaw with regard to one of its characteristic dimensions. The inspection chain is evaluated as a whole, including equipment configuration and probe efficiency, but also operator manipulations. Traditionally, a POD study requires an expensive campaign during which several operators apply the procedure on a large set of representative samples. Recently, new perspectives for POD estimation have been introduced using NDT simulation to generate data. However, these approaches do not offer straightforward solutions to take the operator into account. The simulation of human factors, including cognitive aspects, often raises questions. To address these difficulties, we propose a concept of operational NDT simulator [1]. This work presents the first steps in the implementation of such a simulator for ultrasound phased array inspection of composite parts containing Flat Bottom Holes (FBHs). The final system will look like classical ultrasound testing equipment with a single exception: the displayed signals will be synthesized. Our hardware (ultrasound acquisition card, 3D position tracker) and software (position analysis, inspection scenario, synchronization, simulations) environments are developed as a bench to test the meta-modeling techniques able to provide fast-simulated realistic ultrasound signals. The results presented here are obtained by on-the-fly merging of real and simulated signals. They confirm the feasibility of our approach: the replacement of real signals by purely simulated ones has gone unnoticed by operators. We believe this simulator is a great prospect for POD evaluation including human factors, and may also find applications for training or procedure set-up.

  18. Path probability distribution of stochastic motion of non dissipative systems: a classical analog of Feynman factor of path integral

    International Nuclear Information System (INIS)

    Lin, T.L.; Wang, R.; Bi, W.P.; El Kaabouchi, A.; Pujos, C.; Calvayrac, F.; Wang, Q.A.

    2013-01-01

    We investigate, by numerical simulation, the path probability of non-dissipative mechanical systems undergoing stochastic motion. The aim is to search for the relationship between this probability and the usual mechanical action. The model of simulation is a one-dimensional particle subject to a conservative force and Gaussian random displacement. The probability that a sample path between two fixed points is taken is computed from the number of particles moving along this path, an output of the simulation, divided by the total number of particles arriving at the final point. It is found that the path probability decays exponentially with increasing action of the sample paths. The decay rate increases with decreasing randomness. This result supports the existence of a classical analog of the Feynman factor in the path integral formulation of quantum mechanics for Hamiltonian systems.
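    The counting procedure described in this abstract can be sketched in a few lines. The toy simulation below (all parameters and names invented, not the authors' code) propagates particles under a harmonic conservative force with Gaussian random displacements, coarse-grains each trajectory onto a grid, and estimates the probability of each path in the bundle sharing the same two endpoints, alongside its discretized Lagrangian action:

```python
import random
from collections import Counter

random.seed(1)
dt, steps, n = 1.0, 4, 50000
k = 0.1        # spring constant of the conservative force f(x) = -k*x
sigma = 1.0    # scale of each Gaussian random displacement

counts = Counter()   # particles per coarse-grained path
actions = {}         # discretized Lagrangian action of the first such path

for _ in range(n):
    x, path, action = 0.0, [0], 0.0
    for _ in range(steps):
        step = random.gauss(-k * x * dt, sigma)          # drift toward 0 plus noise
        v = step / dt
        action += (0.5 * v * v - 0.5 * k * x * x) * dt   # (kinetic - potential)*dt
        x += step
        path.append(round(x))                            # coarse-grain onto a unit grid
    key = tuple(path)
    counts[key] += 1
    actions.setdefault(key, action)

# bundle of paths sharing the same fixed endpoints (start 0, end cell 0)
bundle = {p: c for p, c in counts.items() if p[-1] == 0}
total = sum(bundle.values())
probs = {p: c / total for p, c in bundle.items()}
```

Sorting `probs` by the corresponding entry in `actions` is then enough to eyeball the exponential decay the authors report.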

  19. Factors Influencing the Probability of a Diagnosis of Autism Spectrum Disorder in Girls versus Boys

    Science.gov (United States)

    Duvekot, Jorieke; van der Ende, Jan; Verhulst, Frank C.; Slappendel, Geerte; van Daalen, Emma; Maras, Athanasios; Greaves-Lord, Kirstin

    2017-01-01

    In order to shed more light on why referred girls are less likely to be diagnosed with autism spectrum disorder than boys, this study examined whether behavioral characteristics influence the probability of an autism spectrum disorder diagnosis differently in girls versus boys derived from a multicenter sample of consecutively referred children…

  20. Estimation of Partial Safety Factors and Target Failure Probability Based on Cost Optimization of Rubble Mound Breakwaters

    DEFF Research Database (Denmark)

    Kim, Seung-Woo; Suh, Kyung-Duck; Burcharth, Hans F.

    2010-01-01

    Breakwaters can be designed on the basis of cost optimization because human risk is seldom a consideration. Most breakwaters, however, were constructed without considering cost optimization. In this study, the optimum return period, target failure probability and the partial safety factors...

  1. Integral transport multiregion geometrical shadowing factor for the approximate collision probability matrix calculation of infinite closely packed lattices

    International Nuclear Information System (INIS)

    Jowzani-Moghaddam, A.

    1981-01-01

    An integral transport method of calculating the geometrical shadowing factor in multiregion annular cells for infinite closely packed lattices in cylindrical geometry is developed. This analytical method has been programmed in the TPGS code. The method is based upon consideration of the properties of the integral transport method for a nonuniform body, which together with Bonalumi's approximations allows the determination of the approximate multiregion collision probability matrix for infinite closely packed lattices with sufficient accuracy. The multiregion geometrical shadowing factors have been calculated for variations in fuel pin annular segment rings in a geometry of annular cells. These shadowing factors can then be used in the calculation of neutron transport from one annulus to another in an infinite lattice. The results of this new geometrical shadowing and collision probability matrix are compared with the Dancoff-Ginsburg correction and with the probability matrix using constant shadowing on Yankee fuel elements in an infinite lattice. In these cases the Dancoff-Ginsburg correction factor and the collision probability matrix using constant shadowing differ by at most 6.2% and 6%, respectively

  2. Theoretical determination of gamma spectrometry systems efficiency based on probability functions. Application to self-attenuation correction factors

    Energy Technology Data Exchange (ETDEWEB)

    Barrera, Manuel, E-mail: manuel.barrera@uca.es [Escuela Superior de Ingeniería, University of Cadiz, Avda, Universidad de Cadiz 10, 11519 Puerto Real, Cadiz (Spain); Suarez-Llorens, Alfonso [Facultad de Ciencias, University of Cadiz, Avda, Rep. Saharaui s/n, 11510 Puerto Real, Cadiz (Spain); Casas-Ruiz, Melquiades; Alonso, José J.; Vidal, Juan [CEIMAR, University of Cadiz, Avda, Rep. Saharaui s/n, 11510 Puerto Real, Cádiz (Spain)

    2017-05-11

    A generic theoretical methodology for the calculation of the efficiency of gamma spectrometry systems is introduced in this work. The procedure is valid for any type of source and detector and can be applied to determine the full energy peak and the total efficiency of any source-detector system. The methodology is based on the idea of an underlying probability of detection, which describes the physical model for the detection of the gamma radiation in the particular situation studied. This probability depends explicitly on the direction of the gamma radiation, and this dependence allows the development of more realistic and complex models than the traditional models based on point source integration. The probability function employed in practice must reproduce the relevant characteristics of the detection process occurring in the particular situation studied. Once the probability is defined, the efficiency calculations can in general be performed using numerical methods. A Monte Carlo integration procedure is especially useful for performing the calculations when complex probability functions are used. The methodology can be used for the direct determination of the efficiency and also for the calculation of corrections that require this determination, as is the case for coincidence summing, geometric or self-attenuation corrections. In particular, we have applied the procedure to obtain some of the classical self-attenuation correction factors usually employed to correct for the sample attenuation of cylindrical geometry sources. The methodology clarifies the theoretical basis and the approximations associated with each factor, by making explicit the probability which is generally hidden and implicit in each model. It has been shown that most of these self-attenuation correction factors can be derived from a common underlying probability, whose level of complexity grows as it reproduces the detection process more precisely
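    As a minimal illustration of the direction-dependent detection probability idea, the sketch below (geometry and names are our own, not the paper's) Monte Carlo-samples isotropic emission directions from an on-axis point source, counts those intersecting a circular detector face, and compares against the closed-form solid-angle fraction:

```python
import math
import random

random.seed(0)

def geometric_efficiency_mc(d, R, n=200000):
    """Monte Carlo estimate of the probability that a gamma emitted
    isotropically from a point source on the detector axis, a distance d
    above a circular detector face of radius R, intersects that face."""
    hits = 0
    for _ in range(n):
        ct = random.uniform(-1.0, 1.0)   # isotropic: cos(theta) uniform
        if ct <= 0.0:
            continue                     # emitted away from the detector
        st = math.sqrt(1.0 - ct * ct)
        if d * st / ct <= R:             # radius where the ray meets the plane
            hits += 1
    return hits / n

def geometric_efficiency_exact(d, R):
    """Closed-form solid-angle fraction for the same geometry."""
    return 0.5 * (1.0 - d / math.sqrt(d * d + R * R))

mc = geometric_efficiency_mc(2.0, 3.0)
exact = geometric_efficiency_exact(2.0, 3.0)
```

Replacing the hit test with an energy-dependent attenuation weight along each ray is the natural extension toward the self-attenuation corrections discussed in the abstract.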

  3. Calculation of Fire Severity Factors and Fire Non-Suppression Probabilities For A DOE Facility Fire PRA

    International Nuclear Information System (INIS)

    Elicson, Tom; Harwood, Bentley; Lucek, Heather; Bouchard, Jim

    2011-01-01

    Over a 12 month period, a fire PRA was developed for a DOE facility using the NUREG/CR-6850 EPRI/NRC fire PRA methodology. The fire PRA modeling included calculation of fire severity factors (SFs) and fire non-suppression probabilities (PNS) for each safe shutdown (SSD) component considered in the fire PRA model. The SFs were developed by performing detailed fire modeling through a combination of CFAST fire zone model calculations and Latin Hypercube Sampling (LHS). Component damage times and automatic fire suppression system actuation times calculated in the CFAST LHS analyses were then input to a time-dependent model of fire non-suppression probability. The fire non-suppression probability model is based on the modeling approach outlined in NUREG/CR-6850 and is supplemented with plant specific data. This paper presents the methodology used in the DOE facility fire PRA for modeling fire-induced SSD component failures and includes discussions of modeling techniques for: Development of time-dependent fire heat release rate profiles (required as input to CFAST), Calculation of fire severity factors based on CFAST detailed fire modeling, and Calculation of fire non-suppression probabilities.
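    A highly simplified sketch of the two quantities named above, with every rate and threshold invented for illustration: a severity factor as the fraction of stratified (LHS-style) heat-release-rate draws exceeding a damage threshold, and a non-suppression probability of the exponential form used in NUREG/CR-6850:

```python
import math
import random

random.seed(42)

def non_suppression_prob(t_damage, t_detect, lam=0.1):
    """P_ns = exp(-lam * (t_damage - t_detect)), the exponential
    non-suppression form of NUREG/CR-6850; lam is a per-minute
    suppression rate (value invented here)."""
    return math.exp(-lam * max(t_damage - t_detect, 0.0))

# stratified (Latin-hypercube-style) draws of peak heat release rate, kW;
# the range and damage threshold are invented for illustration
n = 1000
u = [(i + random.random()) / n for i in range(n)]
hrr = [100.0 + 900.0 * ui for ui in u]

damage_threshold = 600.0   # kW needed to damage the target component
severity_factor = sum(1 for q in hrr if q > damage_threshold) / n

# combine: component damage at 20 min, fire detection at 5 min
p_ns = non_suppression_prob(20.0, 5.0)
p_failure = severity_factor * p_ns
```

In the actual PRA the damage and detection times come from the CFAST/LHS fire modeling rather than fixed numbers as here.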

  4. Factors influencing the probability of a diagnosis of autism spectrum disorder in girls versus boys.

    Science.gov (United States)

    Duvekot, Jorieke; van der Ende, Jan; Verhulst, Frank C; Slappendel, Geerte; van Daalen, Emma; Maras, Athanasios; Greaves-Lord, Kirstin

    2017-08-01

    In order to shed more light on why referred girls are less likely to be diagnosed with autism spectrum disorder than boys, this study examined whether behavioral characteristics influence the probability of an autism spectrum disorder diagnosis differently in girls versus boys derived from a multicenter sample of consecutively referred children aged 2.5-10 years. Based on information from the short version of the Developmental, Dimensional and Diagnostic Interview and the Autism Diagnostic Observation Schedule, 130 children (106 boys and 24 girls) received a diagnosis of autism spectrum disorder according to Diagnostic and Statistical Manual of Mental Disorders (4th ed., text rev.) criteria and 101 children (61 boys and 40 girls) did not. Higher overall levels of parent-reported repetitive and restricted behavior symptoms were less predictive of an autism spectrum disorder diagnosis in girls than in boys (odds ratio interaction = 0.41, 95% confidence interval = 0.18-0.92, p = 0.03). In contrast, higher overall levels of parent-reported emotional and behavioral problems increased the probability of an autism spectrum disorder diagnosis more in girls than in boys (odds ratio interaction = 2.44, 95% confidence interval = 1.13-5.29, p = 0.02). No differences were found between girls and boys in the prediction of an autism spectrum disorder diagnosis by overall autistic impairment, sensory symptoms, and cognitive functioning. These findings provide insight into possible explanations for the assumed underidentification of autism spectrum disorder in girls in the clinic.
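    The reported interaction odds ratios and confidence intervals follow the usual Wald construction on the log-odds scale. The sketch below (a back-of-envelope check, not the study's analysis) reconstructs a coefficient and standard error from the published OR = 0.41 (95% CI 0.18-0.92) and converts them back:

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Odds ratio and Wald 95% CI from a logistic-regression coefficient."""
    return math.exp(beta), math.exp(beta - z * se), math.exp(beta + z * se)

# reconstruct (beta, se) from the published interaction OR and its CI;
# the CI half-width on the log scale is z * se
beta = math.log(0.41)
se = (math.log(0.92) - math.log(0.18)) / (2 * 1.96)
or_, lo, hi = odds_ratio_ci(beta, se)
```

The round trip recovering roughly (0.41, 0.18, 0.92) confirms the published interval is consistent with a symmetric Wald CI on the log scale.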

  5. Graft rejection episodes after Descemet stripping with endothelial keratoplasty: part two: the statistical analysis of probability and risk factors.

    Science.gov (United States)

    Price, M O; Jordan, C S; Moore, G; Price, F W

    2009-03-01

    To investigate risk factors and probability of initial immunological graft rejection episodes after Descemet stripping with endothelial keratoplasty (DSEK). Outcomes of 598 DSEK cases from a single tertiary referral centre were reviewed. Risk factors and probability of rejection were assessed by multivariate Cox proportional hazards modelling. Rejection episodes occurred in 54 eyes of 48 patients. Estimated probability of a rejection episode was 7.6% by 1 year and 12% by 2 years after grafting. Relative risk of rejection was five times higher for African-American patients compared with Caucasians (p = 0.0002). Eyes with pre-existing glaucoma (9%) or steroid-responsive ocular hypertension (27%) had twice the relative risk of rejection (p = 0.045) compared with eyes that did not have those problems. Patient age, sex and corneal diagnosis did not significantly influence rejection risk. Risk of rejection was not increased when fellow eyes were grafted within 1 year of the first eye (p = 0.62). Pre-existing glaucoma or steroid-responsive ocular hypertension and race were the two factors that independently influenced relative risk of rejection after DSEK. Rejection risk was not increased if the fellow eye was grafted within the prior year with DSEK.
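    Estimated probabilities of rejection "by 1 year" and "by 2 years" of the kind reported here are typically one minus a survival estimate. A minimal Kaplan-Meier sketch on invented toy data (the study itself used multivariate Cox modelling on 598 eyes):

```python
def kaplan_meier(events):
    """events: list of (time, observed) pairs, observed=1 for a rejection
    episode, 0 for censoring. Returns [(time, cumulative incidence)],
    where incidence = 1 - S(t), i.e. 'probability of rejection by t'."""
    events = sorted(events)
    at_risk, s, curve, i = len(events), 1.0, [], 0
    while i < len(events):
        t = events[i][0]
        d = sum(1 for tt, e in events if tt == t and e == 1)   # events at t
        n_t = sum(1 for tt, _ in events if tt == t)            # all leaving at t
        if d:
            s *= 1 - d / at_risk
            curve.append((t, 1 - s))
        at_risk -= n_t
        i += n_t
    return curve

# toy follow-up data in months (time, event); not the paper's cohort
data = [(3, 1), (6, 0), (9, 1), (12, 0), (15, 1), (24, 0), (24, 0), (30, 0)]
curve = kaplan_meier(data)
```

Censored eyes (event flag 0) leave the risk set without contributing a drop in survival, which is what lets the estimate remain unbiased under right-censoring.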

  6. Unresolved resonance range cross section, probability tables and self shielding factor

    International Nuclear Information System (INIS)

    Sublet, J.Ch.; Blomquist, R.N.; Goluoglu, S.; Mac Farlane, R.E.

    2009-07-01

    The performance and methodology of 4 processing codes have been compared in the unresolved resonance range of a selected set of isotopes. Those isotopes were chosen to encompass most cases encountered in the unresolved energy range of major libraries such as ENDF/B-VII or JEFF-3.1.1. The comparison of code results is accompanied by examinations of data format and formalism and by a fine-interpretation study of the processing codes. After some improvements, the results showed generally good, although not perfect, agreement for infinitely dilute cross-sections. However, much larger differences occur when self-shielded effective cross-sections are compared. The infinitely dilute cross-sections are often plot-checked, but it is the probability-table-derived and self-shielded cross-sections that are used and interpreted in criticality and transport calculations. This suggests that the current evaluation data format and formalism in the unresolved resonance range should be tightened up and ambiguities removed. In addition, production of the self-shielded cross-sections should be converged to a much greater accuracy. (author)
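    The relationship between probability tables and self-shielded cross-sections can be illustrated with a narrow-resonance (Bondarenko-style) average over a toy two-band table; the band values and background cross-sections below are invented:

```python
def self_shielding_factor(bands, sigma0):
    """bands: list of (probability, cross_section) pairs from a probability
    table. Returns the narrow-resonance effective cross section and the
    self-shielding factor f = sigma_eff / sigma_infinitely_dilute."""
    num = sum(p * s / (s + sigma0) for p, s in bands)
    den = sum(p / (s + sigma0) for p, s in bands)
    sigma_eff = num / den
    sigma_inf = sum(p * s for p, s in bands)   # infinitely dilute average
    return sigma_eff, sigma_eff / sigma_inf

# illustrative two-band table: 80% smooth background, 20% resonance peak
bands = [(0.8, 10.0), (0.2, 1000.0)]
eff_dilute, f_dilute = self_shielding_factor(bands, sigma0=1e6)   # dilute limit
eff_shield, f_shield = self_shielding_factor(bands, sigma0=1.0)   # heavily shielded
```

In the dilute limit (large background cross-section σ₀) the factor tends to 1, while strong self-shielding suppresses the contribution of the resonance band, which is exactly why plot checks of dilute values alone miss the differences the abstract describes.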

  7. Estimating factors influencing the detection probability of semiaquatic freshwater snails using quadrat survey methods

    Science.gov (United States)

    Roesler, Elizabeth L.; Grabowski, Timothy B.

    2018-01-01

    Developing effective monitoring methods for elusive, rare, or patchily distributed species requires extra considerations, such as imperfect detection. Although detection is frequently modeled, the opportunity to assess it empirically is rare, particularly for imperiled species. We used Pecos assiminea (Assiminea pecos), an endangered semiaquatic snail, as a case study to test detection and accuracy issues surrounding quadrat searches. Quadrats (9 × 20 cm; n = 12) were placed in suitable Pecos assiminea habitat and randomly assigned a treatment, defined as the number of empty snail shells (0, 3, 6, or 9). Ten observers rotated through each quadrat, conducting 5-min visual searches for shells. The probability of detecting a shell when present was 67.4 ± 3.0%, but it decreased with increasing litter depth and with fewer shells present. The mean (± SE) observer accuracy was 25.5 ± 4.3%. Accuracy was positively correlated with the number of shells in the quadrat and negatively correlated with the number of times a quadrat was searched. The results indicate that quadrat surveys likely underrepresent true abundance but accurately determine presence or absence. Understanding detection and accuracy for elusive, rare, or imperiled species improves density estimates and aids in monitoring and conservation efforts.
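    A per-search detection probability of the kind reported (67.4 ± 3.0%) is a binomial proportion; the sketch below (counts invented to land near that value) computes the point estimate, its standard error, and the cumulative probability of at least one detection over repeated independent searches:

```python
import math

def detection_estimate(detections, trials, z=1.96):
    """Point estimate, standard error and Wald interval for a
    per-search detection probability."""
    p = detections / trials
    se = math.sqrt(p * (1 - p) / trials)
    return p, se, (p - z * se, p + z * se)

def cumulative_detection(p, k):
    """Probability of at least one detection in k independent searches."""
    return 1 - (1 - p) ** k

p, se, ci = detection_estimate(67, 100)   # illustrative counts near 67.4%
p3 = cumulative_detection(p, 3)           # e.g. three repeat searches
```

The cumulative form is why repeat visits make presence/absence calls far more reliable than single-visit abundance counts.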

  8. Risk factors of delay proportional probability in diphtheria-tetanus-pertussis vaccination of Iranian children; Life table approach analysis

    Directory of Open Access Journals (Sweden)

    Mohsen Mokhtari

    2015-01-01

    Full Text Available Despite the success of the Expanded Program on Immunization in increasing vaccination coverage among the children of the world, the timeliness and scheduling of vaccination remain among the challenges in public health. This study aimed to identify factors associated with delayed diphtheria-tetanus-pertussis (DTP) vaccination using a life table approach. A historical cohort study was conducted in the poor areas of five large Iranian cities. In total, 3610 children aged 24-47 months who had a documented vaccination card were enrolled. The time of vaccination for the third dose of DTP vaccine was calculated. Life table survival analysis was used to calculate the proportional probability of vaccination at each time. The Wilcoxon test was used to compare the proportional probability of delayed vaccination across the studied factors. The overall median delay for DTP3 was 38.52 days. The Wilcoxon test showed that city, nationality, education level of parents, birth order and living in rural areas were related to a high probability of delayed DTP3 vaccination (P < 0.05). Being far from the capital and a high concentration of immigrants at the city borders with a low socioeconomic class lead to prolonged delays in DTP vaccination. Special attention to these areas is needed to increase the level of parental knowledge and to facilitate access to health care services.
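    The life-table (actuarial) estimate used in the study can be sketched as follows, with invented interval counts; censored children are conventionally half-weighted in the effective denominator of each interval:

```python
def life_table(intervals):
    """Actuarial (life-table) estimate. intervals: list of
    (n_entering, n_events, n_censored) per time interval. Returns the
    cumulative probability of the event by the end of each interval."""
    s, out = 1.0, []
    for n, d, c in intervals:
        effective = n - c / 2.0          # censored count half-weighted
        q = d / effective if effective > 0 else 0.0
        s *= (1 - q)                     # conditional survival this interval
        out.append(1 - s)                # cumulative event probability
    return out

# toy monthly cohort of DTP3-vaccination times (not the study's data):
# (children entering, vaccinated this interval, lost to follow-up)
intervals = [(1000, 300, 20), (680, 250, 30), (400, 180, 40), (180, 100, 20)]
cum = life_table(intervals)
```

Comparing these curves between subgroups (city, nationality, parental education, …) is what the Wilcoxon test in the study then formalizes.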

  9. Factors associated with high probability of target blood pressure non-achievement in hypertensive patients

    Directory of Open Access Journals (Sweden)

    S. P. Zhemanyuk

    2017-12-01

    Full Text Available One topical issue in modern cardiology is clarifying the factors behind non-achievement of target blood pressure levels, for a better understanding of how cardiovascular complications can be reduced. The aim of the study is to determine the factors of poor blood pressure control using ambulatory blood pressure monitoring parameters and adenosine 5'-diphosphate-induced platelet aggregation parameters in patients with arterial hypertension. Material and methods. The study involved 153 patients with essential hypertension (EH) stage II, II degree. Ambulatory blood pressure monitoring (ABPM) was performed with a bifunctional ABPM device (Incart, S.-P., R.F.) while patients were using at least two first-line antihypertensive drugs at optimal daily doses. Platelet aggregation was measured by light transmittance aggregometry on an optical analyzer (Solar, R.B.) with adenosine 5'-diphosphate (Sigma-Aldrich) at a final concentration of 10.0 × 10⁻⁶ mol/L. The first group comprised inadequately controlled essential hypertensive individuals with high systolic or/and diastolic BP levels according to the ABPM results, and the second comprised patients with adequately controlled EH. The groups of patients were comparable in age (60.39 ± 10.74 years vs. 62.80 ± 9.63; p = 0.181). In the group of EH patients who reached the target level of blood pressure, women predominated (60% vs. 39.81%; p = 0.021). We used binary logistic regression analysis to determine the predictors of poor target blood pressure achievement using ABPM and platelet aggregation parameters. Results. According to the univariate logistic regression analysis, the factors influencing poor achievement of target blood pressure are the average diurnal diastolic blood pressure (DBP) (OR = 44.8); diurnal variability of systolic blood pressure (SBP) (OR = 4.4); square index of hypertension for diurnal periods SBP (OR = 318.9); square index of hypertension for diurnal

  10. Frequency of HIV-1, rubella, syphilis, toxoplasmosis, cytomegalovirus, herpes simplex virus, hepatitis B, hepatitis C, Chagas' disease and HTLV I/II infection in pregnant women of the State of Mato Grosso do Sul

    Directory of Open Access Journals (Sweden)

    Ernesto Antonio Figueiró-Filho

    2007-04-01

    Full Text Available The aim was to estimate the frequency of syphilis, rubella, hepatitis B, hepatitis C, toxoplasmosis, Chagas' disease, HTLV I/II, herpes simplex virus, HIV-1 and cytomegalovirus infection in pregnant women and to evaluate the relationship between age and the frequency of the infections studied. A cross-sectional study of 32,512 pregnant women submitted to prenatal screening from November 2002 to October 2003. The frequencies found among the pregnant women were 0.2% for HIV-1 infection, 0.03% for rubella, 0.8% for syphilis, 0.4% for toxoplasmosis, 0.05% for acute cytomegalovirus infection, 0.02% for herpes simplex virus, 0.3% for hepatitis B (HBsAg), 0.1% for hepatitis C, 0.1% for HTLV I/II and 0.1% for Chagas' disease. There was a statistically significant association between age and prenatal infection by rubella, cytomegalovirus, Chagas' disease and herpes virus. The frequencies of rubella, syphilis, toxoplasmosis, Chagas' disease and cytomegalovirus in the pregnant women were below the values described in the literature.

  11. Environmental and anthropogenic factors affecting the probability of occurrence of Oncomegas wageneri (Cestoda: Trypanorhyncha) in the southern Gulf of Mexico.

    Science.gov (United States)

    Vidal-Martínez, Víctor M; Torres-Irineo, Edgar; Romero, David; Gold-Bouchot, Gerardo; Martínez-Meyer, Enrique; Valdés-Lozano, David; Aguirre-Macedo, M Leopoldina

    2015-11-26

    Understanding the environmental and anthropogenic factors influencing the probability of occurrence of marine parasitic species is fundamental for determining the circumstances under which they can act as bioindicators of environmental impact. The aim of this study was to determine whether physicochemical variables, polyaromatic hydrocarbons or sewage discharge affect the probability of occurrence of the larval cestode Oncomegas wageneri, which infects the shoal flounder, Syacium gunteri, in the southern Gulf of Mexico. The study area included 162 sampling sites in the southern Gulf of Mexico and covered 288,205 km², where benthic sediments, water and shoal flounder individuals were collected. We used boosted generalised additive models (boosted GAMs) and MaxEnt to examine the potential statistical relationships between the environmental variables (nutrients, contaminants and physicochemical variables from the water and sediments) and the probability of occurrence of this parasite. The models were calibrated using all of the sampling sites (full area) with and without parasite occurrences (n = 162) and a polygon area that included sampling sites with a depth of 1500 m or less (n = 134). Oncomegas wageneri occurred at 29/162 sampling sites. The boosted GAM for the full area and the polygon area accurately predicted the probability of occurrence of O. wageneri in the study area. By contrast, poor probabilities of occurrence were obtained with the MaxEnt models for the same areas. The variables with the highest frequencies of appearance in the models (proxies for the explained variability) were the polyaromatic hydrocarbons of high molecular weight (PAHH, 95%), followed by a combination of nutrients, spatial variables and polyaromatic hydrocarbons of low molecular weight (PAHL, 5%). The contribution of the PAHH to the variability was explained by the fact that these compounds, together with N and P, are carried by rivers that

  12. The evolutionary duplication and probable demise of an endodermal GATA factor in Caenorhabditis elegans.

    Science.gov (United States)

    Fukushige, Tetsunari; Goszczynski, Barbara; Tian, Helen; McGhee, James D

    2003-10-01

    We describe the elt-4 gene from the nematode Caenorhabditis elegans. elt-4 is predicted to encode a very small (72 residues, 8.1 kD) GATA-type zinc finger transcription factor. The elt-4 gene is located approximately 5 kb upstream of the C. elegans elt-2 gene, which also encodes a GATA-type transcription factor; the zinc finger DNA-binding domains are highly conserved (24/25 residues) between the two proteins. The elt-2 gene is expressed only in the intestine and is essential for normal intestinal development. This article explores whether elt-4 also has a role in intestinal development. Reporter fusions to the elt-4 promoter or reporter insertions into the elt-4 coding regions show that elt-4 is indeed expressed in the intestine, beginning at the 1.5-fold stage of embryogenesis and continuing into adulthood. elt-4 reporter fusions are also expressed in nine cells of the posterior pharynx. Ectopic expression of elt-4 cDNA within the embryo does not cause detectable ectopic expression of biochemical markers of gut differentiation; furthermore, ectopic elt-4 expression neither inhibits nor enhances the ectopic marker expression caused by ectopic elt-2 expression. A deletion allele of elt-4 was isolated but no obvious phenotype could be detected, either in the gut or elsewhere; brood sizes, hatching efficiencies, and growth rates were indistinguishable from wild type. We found no evidence that elt-4 provided backup functions for elt-2. We used microarray analysis to search for genes that might be differentially expressed between L1 larvae of the elt-4 deletion strain and wild-type worms. Paired hybridizations were repeated seven times, allowing us to conclude, with some confidence, that no candidate target transcript could be identified as significantly up- or downregulated by loss of elt-4 function. 
In vitro binding experiments could not detect specific binding of ELT-4 protein to candidate binding sites (double-stranded oligonucleotides containing single or multiple

  13. Updated greenhouse gas and criteria air pollutant emission factors and their probability distribution functions for electricity generating units

    International Nuclear Information System (INIS)

    Cai, H.; Wang, M.; Elgowainy, A.; Han, J.

    2012-01-01

    Greenhouse gas (CO2, CH4 and N2O, hereinafter GHG) and criteria air pollutant (CO, NOx, VOC, PM10, PM2.5 and SOx, hereinafter CAP) emission factors for various types of power plants burning various fuels with different technologies are important upstream parameters for estimating life-cycle emissions associated with alternative vehicle/fuel systems in the transportation sector, especially electric vehicles. The emission factors are typically expressed in grams of GHG or CAP per kWh of electricity generated by a specific power generation technology. This document describes our approach for updating and expanding GHG and CAP emission factors in the GREET (Greenhouse Gases, Regulated Emissions, and Energy Use in Transportation) model developed at Argonne National Laboratory (see Wang 1999 and the GREET website at http://greet.es.anl.gov/main) for various power generation technologies. These GHG and CAP emissions are used to estimate the impact of electricity use by stationary and transportation applications on their fuel-cycle emissions. The electricity generation mixes and the fuel shares attributable to various combustion technologies at the national, regional and state levels are also updated in this document. The energy conversion efficiencies of electric generating units (EGUs) by fuel type and combustion technology are calculated on the basis of the lower heating values of each fuel, to be consistent with the basis used in GREET for transportation fuels. On the basis of the updated GHG and CAP emission factors and energy efficiencies of EGUs, the probability distribution functions (PDFs), which are functions that describe the relative likelihood for the emission factors and energy efficiencies as random variables to take on a given value by the integral of their own probability distributions, are updated using best-fit statistical curves to characterize the uncertainties associated with GHG and CAP emissions in life-cycle modeling with GREET.
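    Fitting a best-fit statistical curve to strictly positive emission-factor data is often done with a lognormal; a method-of-moments sketch (illustrative only, not GREET's actual fitting code, and the sample statistics are invented):

```python
import math

def lognormal_from_moments(mean, std):
    """Method-of-moments lognormal fit: recover (mu, sigma) of the
    underlying normal from the sample mean and standard deviation.
    A common choice for strictly positive emission-factor data."""
    cv2 = (std / mean) ** 2            # squared coefficient of variation
    sigma2 = math.log(1.0 + cv2)
    mu = math.log(mean) - 0.5 * sigma2
    return mu, math.sqrt(sigma2)

def lognormal_mean(mu, sigma):
    """Mean of a lognormal, for a round-trip sanity check."""
    return math.exp(mu + 0.5 * sigma * sigma)

# hypothetical NOx emission-factor sample statistics, g/kWh
mu, sigma = lognormal_from_moments(mean=2.0, std=1.0)
```

The round trip `lognormal_mean(mu, sigma)` returning the input mean is the standard check that the moment inversion is correct.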

  14. Updated greenhouse gas and criteria air pollutant emission factors and their probability distribution functions for electricity generating units

    Energy Technology Data Exchange (ETDEWEB)

    Cai, H.; Wang, M.; Elgowainy, A.; Han, J. (Energy Systems)

    2012-07-06

    Greenhouse gas (CO2, CH4 and N2O, hereinafter GHG) and criteria air pollutant (CO, NOx, VOC, PM10, PM2.5 and SOx, hereinafter CAP) emission factors for various types of power plants burning various fuels with different technologies are important upstream parameters for estimating life-cycle emissions associated with alternative vehicle/fuel systems in the transportation sector, especially electric vehicles. The emission factors are typically expressed in grams of GHG or CAP per kWh of electricity generated by a specific power generation technology. This document describes our approach for updating and expanding GHG and CAP emission factors in the GREET (Greenhouse Gases, Regulated Emissions, and Energy Use in Transportation) model developed at Argonne National Laboratory (see Wang 1999 and the GREET website at http://greet.es.anl.gov/main) for various power generation technologies. These GHG and CAP emissions are used to estimate the impact of electricity use by stationary and transportation applications on their fuel-cycle emissions. The electricity generation mixes and the fuel shares attributable to various combustion technologies at the national, regional and state levels are also updated in this document. The energy conversion efficiencies of electric generating units (EGUs) by fuel type and combustion technology are calculated on the basis of the lower heating values of each fuel, to be consistent with the basis used in GREET for transportation fuels. On the basis of the updated GHG and CAP emission factors and energy efficiencies of EGUs, the probability distribution functions (PDFs), which are functions that describe the relative likelihood for the emission factors and energy efficiencies as random variables to take on a given value by the integral of their own probability distributions, are updated using best-fit statistical curves to characterize the uncertainties associated with GHG and CAP emissions in life-cycle modeling with GREET.

  15. Identification of an osteoclast transcription factor that binds to the human T cell leukemia virus type I-long terminal repeat enhancer element.

    Science.gov (United States)

    Inoue, D; Santiago, P; Horne, W C; Baron, R

    1997-10-03

    Transgenic mice expressing human T cell leukemia virus type I (HTLV-I)-tax under the control of HTLV-I-long terminal repeat (LTR) promoter develop skeletal abnormalities with high bone turnover and myelofibrosis. In these animals, Tax is highly expressed in bone with a pattern of expression restricted to osteoclasts and spindle-shaped cells within the endosteal myelofibrosis. To test the hypothesis that lineage-specific transcription factors promote transgene expression from the HTLV-I-LTR in osteoclasts, we first examined tax expression in transgenic bone marrow cultures. Expression was dependent on 1alpha,25-dihydroxycholecalciferol and coincided with tartrate-resistant acid phosphatase (TRAP) expression, a marker of osteoclast differentiation. Furthermore, Tax was expressed in vitronectin receptor-positive mononuclear precursors as well as in mature osteoclast-like cells (OCLs). Consistent with our hypothesis, electrophoretic mobility shift assays revealed the presence of an OCL nuclear factor (NFOC-1) that binds to the LTR 21-base pair direct repeat, a region critical for the promoter activity. This binding is further enhanced by Tax. Since NFOC-1 is absent in macrophages and conserved in osteoclasts among species including human, such a factor may play a role in lineage determination and/or in expression of the differentiated osteoclast phenotype.

  16. Spectroscopic factor and proton formation probability for the d3/2 proton emitter 151mLu

    Directory of Open Access Journals (Sweden)

    F. Wang

    2017-07-01

    Full Text Available The quenching of the experimental spectroscopic factor for proton emission from the short-lived d3/2 isomeric state in 151mLu was a long-standing problem. In the present work, proton emission from this isomer has been reinvestigated in an experiment at the Accelerator Laboratory of the University of Jyväskylä. The proton-decay energy and half-life of this isomer were measured to be 1295(5) keV and 15.4(8) μs, respectively, in agreement with another recent study. These new experimental data can resolve the discrepancy in the spectroscopic factor calculated using the spherical WKB approximation. Using the R-matrix approach it is found that the proton formation probability indicates no significant hindrance for the proton decay of 151mLu.

  17. Study on relationship of performance shaping factor in human error probability with prevalent stress of PUSPATI TRIGA reactor operators

    Science.gov (United States)

    Rahim, Ahmad Nabil Bin Ab; Mohamed, Faizal; Farid, Mohd Fairus Abdul; Fazli Zakaria, Mohd; Sangau Ligam, Alfred; Ramli, Nurhayati Binti

    2018-01-01

    Human performance can be affected by prevalent stress, measured here using the Depression, Anxiety and Stress Scale (DASS). The respondents' feedback can be summarized as follows: the main factor causing the highest prevalent stress is working conditions that require operators to handle critical situations and make prompt critical decisions. Examining the relationship between prevalent stress and performance shaping factors, PSFFitness and PSFWork Process showed positive Pearson's correlations, with scores of .763 and .826 at significance levels of p = .028 and p = .012, respectively. These are positive correlations, with good significance values, between prevalent stress and the human performance shaping factors (PSFs) related to fitness and to work processes and procedures. The higher the stress level of the respondents, the higher the scores selected for these PSFs. Higher levels of stress lead to deteriorating physical health and worsened cognitive function; in addition, a lack of understanding of the work procedures can itself be a source of growing stress. Higher values of these factors lead to higher probabilities of human error. Thus, monitoring the level of stress among RTP operators is important to ensure the safety of RTP.
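The Pearson's correlations reported above can be reproduced in form (the study's raw scores are not given, so the stress and PSF ratings below are invented for illustration) with a small stdlib-only sketch:

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical data: DASS stress scores vs. a PSF rating for eight operators.
stress = [10, 14, 18, 22, 25, 30, 33, 38]
psf_fitness = [2, 3, 3, 4, 5, 6, 6, 8]
r = pearson_r(stress, psf_fitness)
print(round(r, 3))  # strong positive correlation, as in the study's direction
```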

  18. A statistical approach to estimating effects of performance shaping factors on human error probabilities of soft controls

    International Nuclear Information System (INIS)

    Kim, Yochan; Park, Jinkyun; Jung, Wondea; Jang, Inseok; Hyun Seong, Poong

    2015-01-01

    Despite recent efforts toward data collection for supporting human reliability analysis, there remains a lack of empirical basis in determining the effects of performance shaping factors (PSFs) on human error probabilities (HEPs). To enhance the empirical basis regarding the effects of the PSFs, a statistical methodology using logistic regression and stepwise variable selection was proposed, and the effects of the PSFs on HEPs related to soft controls were estimated through this methodology. For this estimation, more than 600 human error opportunities related to soft controls in a computerized control room were obtained through laboratory experiments. From the eight PSF surrogates and combinations of these variables, the procedure quality, practice level, and the operation type were identified as significant factors for screen switch and mode conversion errors. The contributions of these significant factors to HEPs were also estimated in terms of a multiplicative form. The usefulness and limitations of the experimental data and the techniques employed are discussed herein, and we believe that the logistic regression and stepwise variable selection methods will provide a way to estimate the effects of PSFs on HEPs in an objective manner. - Highlights: • It is necessary to develop an empirical basis for the effects of the PSFs on the HEPs. • A statistical method using a logistic regression and variable selection was proposed. • The effects of PSFs on the HEPs of soft controls were empirically investigated. • The significant factors were identified and their effects were estimated
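The multiplicative form mentioned above falls naturally out of a logistic model at small probabilities: each PSF coefficient b contributes a factor of roughly exp(b) to the HEP. A minimal sketch with hypothetical coefficients (not the study's estimates):

```python
import math

def hep(log_odds_base, *psf_terms):
    """Human error probability from a logistic model: logit(p) = b0 + sum(bi*xi)."""
    z = log_odds_base + sum(psf_terms)
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical coefficients: poor procedure quality and low practice level
# each add to the log-odds of a soft-control error.
b0 = -5.0          # baseline log-odds (HEP ≈ 0.0067)
b_procedure = 1.2  # effect of poor procedure quality
b_practice = 0.8   # effect of low practice level

p_base = hep(b0)
p_both = hep(b0, b_procedure, b_practice)

# For small probabilities, the PSF effects act almost multiplicatively on the HEP:
# p_both ≈ p_base * exp(b_procedure) * exp(b_practice)
approx = p_base * math.exp(b_procedure) * math.exp(b_practice)
print(round(p_both, 4), round(approx, 4))
```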

  19. Factor contribution to fire occurrence, size, and burn probability in a subtropical coniferous forest in East China.

    Science.gov (United States)

    Ye, Tao; Wang, Yao; Guo, Zhixing; Li, Yijia

    2017-01-01

    The contribution of factors including fuel type, fire-weather conditions, topography and human activity to fire regime attributes (e.g. fire occurrence, size distribution and severity) has been intensively discussed. The relative importance of those factors in explaining the burn probability (BP), which is critical in terms of fire risk management, has been insufficiently addressed. Focusing on a subtropical coniferous forest with strong human disturbance in East China, our main objective was to evaluate and compare the relative importance of fuel composition, topography, and human activity for fire occurrence, size and BP. The local BP distribution was derived with a stochastic fire simulation approach using detailed historical fire data (1990-2010) and forest-resource survey results, based on which our factor contribution analysis was carried out. Our results indicated that fuel composition had the greatest relative importance in explaining fire occurrence and size, but human activity explained most of the variance in BP. This implies that the influence of human activity is amplified through the process of overlapping repeated ignition and spreading events. This result underscores the central role of strong human disturbance in local fire processes. It further confirms the need for a holistic perspective on factor contribution to fire likelihood, rather than focusing on individual fire regime attributes, for the purpose of fire risk management.

  20. Prevalence of risk factors for HIV infection among Mexican migrants and immigrants: probability survey in the north border of Mexico

    Directory of Open Access Journals (Sweden)

    Gudelia Rangel M.

    2006-01-01

    Full Text Available OBJECTIVE: To estimate the prevalence of risk factors for HIV infection among Mexican migrants and immigrants (MMIs) in different geographic contexts, including the sending communities in Mexico, the receiving communities in the United States (US), and the Mexican North border region. MATERIAL AND METHODS: We conducted a probability survey among MMIs traveling through key border crossing sites in the Tijuana (Baja California, Mexico)-San Diego (California, US) border region (N=1 429). RESULTS: The survey revealed substantial rates of reported sexually transmitted infections, needle-sharing and sexual risk practices in all migration contexts. CONCLUSIONS: The estimated levels of HIV risk call for further binational research and preventive interventions in all key geographic contexts of the migration experience to identify and tackle the different personal, environmental, and structural determinants of HIV risk in each of these contexts.

  1. Fuzzy forecasting based on two-factors second-order fuzzy-trend logical relationship groups and the probabilities of trends of fuzzy logical relationships.

    Science.gov (United States)

    Chen, Shyi-Ming; Chen, Shen-Wen

    2015-03-01

    In this paper, we present a new method for fuzzy forecasting based on two-factors second-order fuzzy-trend logical relationship groups and the probabilities of trends of fuzzy-trend logical relationships. Firstly, the proposed method fuzzifies the historical training data of the main factor and the secondary factor into fuzzy sets, respectively, to form two-factors second-order fuzzy logical relationships. Then, it groups the obtained two-factors second-order fuzzy logical relationships into two-factors second-order fuzzy-trend logical relationship groups. Then, it calculates the probability of the "down-trend," the probability of the "equal-trend" and the probability of the "up-trend" of the two-factors second-order fuzzy-trend logical relationships in each two-factors second-order fuzzy-trend logical relationship group, respectively. Finally, it performs the forecasting based on the probabilities of the down-trend, the equal-trend, and the up-trend of the two-factors second-order fuzzy-trend logical relationships in each two-factors second-order fuzzy-trend logical relationship group. We also apply the proposed method to forecast the Taiwan Stock Exchange Capitalization Weighted Stock Index (TAIEX) and the NTD/USD exchange rates. The experimental results show that the proposed method outperforms the existing methods.
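The core bookkeeping of the method — estimating down-, equal- and up-trend probabilities from consecutive observations — can be sketched as follows. The price series is invented, and the full method's fuzzification and two-factor grouping steps are omitted:

```python
def trend_probabilities(series):
    """Probabilities of down-, equal- and up-trends between consecutive observations."""
    trends = []
    for prev, cur in zip(series, series[1:]):
        if cur < prev:
            trends.append("down")
        elif cur > prev:
            trends.append("up")
        else:
            trends.append("equal")
    n = len(trends)
    return {t: trends.count(t) / n for t in ("down", "equal", "up")}

# Hypothetical daily index closes (six transitions: 3 up, 1 equal, 2 down).
closes = [100, 102, 102, 101, 103, 105, 104]
probs = trend_probabilities(closes)
print(probs)  # down: 2/6, equal: 1/6, up: 3/6
```

In the full method these probabilities are computed within each fuzzy-trend logical relationship group rather than over the raw series.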

  2. The Benefits of Including Clinical Factors in Rectal Normal Tissue Complication Probability Modeling After Radiotherapy for Prostate Cancer

    International Nuclear Information System (INIS)

    Defraene, Gilles; Van den Bergh, Laura; Al-Mamgani, Abrahim; Haustermans, Karin; Heemsbergen, Wilma; Van den Heuvel, Frank; Lebesque, Joos V.

    2012-01-01

    Purpose: To study the impact of clinical predisposing factors on rectal normal tissue complication probability modeling using the updated results of the Dutch prostate dose-escalation trial. Methods and Materials: Toxicity data of 512 patients (conformally treated to 68 Gy [n = 284] and 78 Gy [n = 228]) with complete follow-up at 3 years after radiotherapy were studied. Scored end points were rectal bleeding, high stool frequency, and fecal incontinence. Two traditional dose-based models (Lyman-Kutcher-Burman [LKB] and Relative Seriality [RS]) and a logistic model were fitted using a maximum likelihood approach. Furthermore, these model fits were improved by including the most significant clinical factors. The area under the receiver operating characteristic curve (AUC) was used to compare the discriminating ability of all fits. Results: Including clinical factors significantly increased the predictive power of the models for all end points. In the optimal LKB, RS, and logistic models for rectal bleeding and fecal incontinence, the first significant (p = 0.011–0.013) clinical factor was “previous abdominal surgery.” As second significant (p = 0.012–0.016) factor, “cardiac history” was included in all three rectal bleeding fits, whereas including “diabetes” was significant (p = 0.039–0.048) in fecal incontinence modeling but only in the LKB and logistic models. High stool frequency fits only benefitted significantly (p = 0.003–0.006) from the inclusion of the baseline toxicity score. For all models rectal bleeding fits had the highest AUC (0.77) where it was 0.63 and 0.68 for high stool frequency and fecal incontinence, respectively. LKB and logistic model fits resulted in similar values for the volume parameter. The steepness parameter was somewhat higher in the logistic model, also resulting in a slightly lower D50. Anal wall DVHs were used for fecal incontinence, whereas anorectal wall dose best described the other two endpoints. Conclusions
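The LKB model named above has a standard closed form: NTCP = Φ((gEUD − D50)/(m·D50)), with gEUD = (Σ v_i·d_i^(1/n))^n over the DVH bins. A sketch with an invented DVH and illustrative parameter values (not the fitted values from this trial):

```python
import math

def geud(dose_volume, n):
    """Generalized EUD: (sum v_i * d_i**(1/n))**n, with v_i fractional volumes."""
    return sum(v * d ** (1.0 / n) for d, v in dose_volume) ** n

def lkb_ntcp(dose_volume, n, m, d50):
    """Lyman-Kutcher-Burman NTCP: Phi((gEUD - D50) / (m * D50))."""
    t = (geud(dose_volume, n) - d50) / (m * d50)
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))  # standard normal CDF

# Hypothetical rectal-wall DVH as (dose in Gy, fractional volume) bins,
# with illustrative parameters n=0.09, m=0.13, D50=81 Gy (assumed, not the paper's fits).
dvh = [(70.0, 0.2), (60.0, 0.3), (40.0, 0.5)]
print(round(lkb_ntcp(dvh, n=0.09, m=0.13, d50=81.0), 3))
```

A small n makes gEUD track the high-dose region of the DVH (serial behavior), which is why the fitted volume parameter matters for rectal endpoints.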

  3. The Benefits of Including Clinical Factors in Rectal Normal Tissue Complication Probability Modeling After Radiotherapy for Prostate Cancer

    Energy Technology Data Exchange (ETDEWEB)

    Defraene, Gilles, E-mail: gilles.defraene@uzleuven.be [Radiation Oncology Department, University Hospitals Leuven, Leuven (Belgium); Van den Bergh, Laura [Radiation Oncology Department, University Hospitals Leuven, Leuven (Belgium); Al-Mamgani, Abrahim [Department of Radiation Oncology, Erasmus Medical Center - Daniel den Hoed Cancer Center, Rotterdam (Netherlands); Haustermans, Karin [Radiation Oncology Department, University Hospitals Leuven, Leuven (Belgium); Heemsbergen, Wilma [Netherlands Cancer Institute - Antoni van Leeuwenhoek Hospital, Amsterdam (Netherlands); Van den Heuvel, Frank [Radiation Oncology Department, University Hospitals Leuven, Leuven (Belgium); Lebesque, Joos V. [Netherlands Cancer Institute - Antoni van Leeuwenhoek Hospital, Amsterdam (Netherlands)

    2012-03-01

    Purpose: To study the impact of clinical predisposing factors on rectal normal tissue complication probability modeling using the updated results of the Dutch prostate dose-escalation trial. Methods and Materials: Toxicity data of 512 patients (conformally treated to 68 Gy [n = 284] and 78 Gy [n = 228]) with complete follow-up at 3 years after radiotherapy were studied. Scored end points were rectal bleeding, high stool frequency, and fecal incontinence. Two traditional dose-based models (Lyman-Kutcher-Burman [LKB] and Relative Seriality [RS]) and a logistic model were fitted using a maximum likelihood approach. Furthermore, these model fits were improved by including the most significant clinical factors. The area under the receiver operating characteristic curve (AUC) was used to compare the discriminating ability of all fits. Results: Including clinical factors significantly increased the predictive power of the models for all end points. In the optimal LKB, RS, and logistic models for rectal bleeding and fecal incontinence, the first significant (p = 0.011-0.013) clinical factor was 'previous abdominal surgery.' As second significant (p = 0.012-0.016) factor, 'cardiac history' was included in all three rectal bleeding fits, whereas including 'diabetes' was significant (p = 0.039-0.048) in fecal incontinence modeling but only in the LKB and logistic models. High stool frequency fits only benefitted significantly (p = 0.003-0.006) from the inclusion of the baseline toxicity score. For all models rectal bleeding fits had the highest AUC (0.77) where it was 0.63 and 0.68 for high stool frequency and fecal incontinence, respectively. LKB and logistic model fits resulted in similar values for the volume parameter. The steepness parameter was somewhat higher in the logistic model, also resulting in a slightly lower D{sub 50}. Anal wall DVHs were used for fecal incontinence, whereas anorectal wall dose best described the other two endpoints

  4. Ruin probabilities

    DEFF Research Database (Denmark)

    Asmussen, Søren; Albrecher, Hansjörg

    The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle and the connection to other applied probability areas, like queueing theory. In this substantially updated and extended second version, new topics include stochastic control, fluctuation theory for Levy processes, Gerber–Shiu functions and dependence.

  5. Generalized Probability-Probability Plots

    NARCIS (Netherlands)

    Mushkudiani, N.A.; Einmahl, J.H.J.

    2004-01-01

    We introduce generalized Probability-Probability (P-P) plots in order to study the one-sample goodness-of-fit problem and the two-sample problem, for real valued data. These plots, which are constructed by indexing with the class of closed intervals, globally preserve the properties of classical P-P plots.
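A classical (non-generalized) P-P plot of the kind these plots extend pairs the empirical probabilities i/(n+1) with the hypothesized CDF evaluated at the order statistics; a good fit puts the points near the diagonal. A minimal sketch against the standard normal, with an arbitrary sample:

```python
import math

def standard_normal_cdf(x):
    """CDF of the standard normal distribution via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def pp_points(sample, cdf):
    """Classical P-P plot coordinates: (i/(n+1), F(x_(i))) for the ordered sample."""
    xs = sorted(sample)
    n = len(xs)
    return [((i + 1) / (n + 1), cdf(x)) for i, x in enumerate(xs)]

# Small arbitrary sample; for data truly from F, points cluster on the diagonal.
sample = [-1.3, -0.4, 0.1, 0.7, 1.6]
points = pp_points(sample, standard_normal_cdf)
print(points)
```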

  6. Probability-1

    CERN Document Server

    Shiryaev, Albert N

    2016-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.

  7. Ignition Probability

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — USFS, State Forestry, BLM, and DOI fire occurrence point locations from 1987 to 2008 were combined and converted into a fire occurrence probability or density grid...

  8. Prevalence and potential factors associated with probable sleep or awake bruxism and dentin hypersensitivity in undergraduate students

    Directory of Open Access Journals (Sweden)

    Neusa Barros DANTAS-NETA

    Full Text Available OBJECTIVE: To measure the prevalence of probable sleep or awake bruxism and cervical dentin hypersensitivity in undergraduate students and to determine the symptoms associated with these conditions. METHODOLOGY: This was a cross-sectional study. A diagnosis of probable bruxism was reached when students reported clenching or grinding of the teeth during sleep and/or wakefulness, and when they also presented some of the signs and symptoms of bruxism and masseter muscle pain on palpation. Cervical dentin hypersensitivity was diagnosed by testing for sensitivity to pain in the cervical region of the teeth. Pain was triggered either by touch (using a #5 probe) or by an air jet spray. The sample consisted of 306 university students aged between 19 and 35 years old. The data were stored and analysed using SPSS software, version 15.0 for Windows. RESULT: The prevalence of probable bruxism was 34.3%, with no predominance regarding sex. Probable awake bruxism was more prevalent (61.9%), mostly occurring when the individual reported being in a state of mental concentration (63.1%). There was no association between probable sleep or awake bruxism and dentin hypersensitivity (p = 0.195). Individuals with probable sleep bruxism had increased odds of having muscular pain in the face upon waking (OR = 14.14, 95% CI 5.06-39.55), and those with probable awake bruxism had increased odds of having facial muscle fatigue when chewing or talking for a long time (OR = 2.88, 95% CI 1.53-5.43) and muscular pain in the face upon waking (OR = 5.31, 95% CI 1.93-14.62). CONCLUSION: The prevalence of probable bruxism was 34.3% and that of cervical dentin hypersensitivity was 57.8%, with 22.2% of these subjects also showing probable bruxism. Individuals with probable bruxism tended to have higher odds of facial pain when they awakened and when chewing or talking for long periods. There were no associations between probable sleep and awake bruxism and cervical dentin hypersensitivity.
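Odds ratios like those reported above come from a 2×2 table, with a Wald confidence interval computed on the log scale. The counts below are invented for illustration and are not the study's raw data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    exposed: a cases / b non-cases; unexposed: c cases / d non-cases."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: of 105 students with probable sleep bruxism, 40 reported
# facial muscle pain on waking, vs. 10 of 201 students without it.
or_, lo, hi = odds_ratio_ci(40, 65, 10, 191)
print(round(or_, 2), round(lo, 2), round(hi, 2))
```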

  9. Survey Probability and Factors affecting Farmers Participation in Future and Option Markets Case Study: Cotton product in Gonbad kavos city

    Directory of Open Access Journals (Sweden)

    F. sakhi

    2016-03-01

    Full Text Available Introduction: Farmers face a variety of natural and unnatural risks in agricultural activities, and thus their income is unstable. A wide range of risks, such as production risk, price risk, and financial and human risks, influence the income from agricultural products. One of the major risks farmers face is the price volatility of agricultural products. Cotton is one of the agricultural products with high real price volatility. Numerous marketing and risk-management tools are available for agricultural products in the face of price risks; futures and options contracts may be the most important tools available to reduce price volatility in agricultural products. The purpose of the current study was to examine the possibility of farmers' participation in the futures and options markets, presented as a means to reduce cotton price volatility. The dependent variable for this purpose had four categories: participation in neither market, participation in the futures market, participation in the options market, and participation in both futures and options markets. Materials and Methods: Data were gathered through interviews, completing 200 questionnaires of cotton growers selected using simple random sampling. A multinomial logit regression model was used for data analysis. Results and Discussion: To measure content validity in the preliminary study, confirmatory factor analysis was used. For reliability, a pre-test was done with 30 questionnaires, and the Cronbach's alpha coefficient was 0.79. The independence of the dependent variable's categories was confirmed by Hausman test results; the likelihood ratio and Wald tests showed these categories are not combinable. Results indicated that in the period 2014-2015 and the sample under study, 35% of cotton growers were unwilling to participate in the futures and options markets. Farmers' willingness to participate in the futures and options markets was 19% and 21%
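The multinomial logit model used above assigns each farmer a choice probability for every participation category via a softmax over linear utilities. A sketch with invented category names and utility values (the study's coefficients are not reported here):

```python
import math

def mnl_probabilities(utilities):
    """Multinomial-logit choice probabilities: softmax over category utilities."""
    m = max(utilities.values())  # subtract the max for numerical stability
    exps = {k: math.exp(v - m) for k, v in utilities.items()}
    total = sum(exps.values())
    return {k: e / total for k, e in exps.items()}

# Hypothetical linear utilities for one farmer (names and values are illustrative).
utilities = {
    "neither": 0.30,
    "futures_only": -0.25,
    "options_only": -0.15,
    "both": 0.10,
}
probs = mnl_probabilities(utilities)
print({k: round(p, 3) for k, p in probs.items()})
```

In the actual model, each utility would be a linear function of the farmer's covariates, with one category (e.g. non-participation) serving as the reference.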

  10. Quantum Probabilities as Behavioral Probabilities

    Directory of Open Access Journals (Sweden)

    Vyacheslav I. Yukalov

    2017-03-01

    Full Text Available We demonstrate that behavioral probabilities of human decision makers share many common features with quantum probabilities. This does not imply that humans are some quantum objects, but just shows that the mathematics of quantum theory is applicable to the description of human decision making. The applicability of quantum rules for describing decision making is connected with the nontrivial process of making decisions in the case of composite prospects under uncertainty. Such a process involves deliberations of a decision maker when making a choice. In addition to the evaluation of the utilities of considered prospects, real decision makers also appreciate their respective attractiveness. Therefore, human choice is not based solely on the utility of prospects, but includes the necessity of resolving the utility-attraction duality. In order to justify that human consciousness really functions similarly to the rules of quantum theory, we develop an approach defining human behavioral probabilities as the probabilities determined by quantum rules. We show that quantum behavioral probabilities of humans do not merely explain qualitatively how human decisions are made, but they predict quantitative values of the behavioral probabilities. Analyzing a large set of empirical data, we find good quantitative agreement between theoretical predictions and observed experimental data.

  11. Risk Probabilities

    DEFF Research Database (Denmark)

    Rojas-Nandayapa, Leonardo

    Tail probabilities of sums of heavy-tailed random variables are of major importance in various branches of Applied Probability, such as Risk Theory, Queueing Theory and Financial Management, and are subject to intense research nowadays. To understand their relevance one just needs to think ... analytic expression for the distribution function of a sum of random variables. The presence of heavy-tailed random variables complicates the problem even more. The objective of this dissertation is to provide better approximations by means of sharp asymptotic expressions and Monte Carlo estimators ...
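For sums of subexponential variables (Pareto being the standard example), the tail satisfies P(S_n > x) ~ n·P(X_1 > x) for large x, and a crude Monte Carlo estimator can be checked against that asymptotic. A stdlib-only sketch with arbitrary parameters:

```python
import random

def pareto_sample(alpha, rng):
    """Pareto(alpha) variate with tail P(X > x) = x**(-alpha) for x >= 1 (inverse CDF)."""
    u = 1.0 - rng.random()  # u in (0, 1], avoids division by zero
    return u ** (-1.0 / alpha)

def tail_probability(n_terms, threshold, alpha, trials, seed=1):
    """Crude Monte Carlo estimate of P(X1 + ... + Xn > threshold)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        if sum(pareto_sample(alpha, rng) for _ in range(n_terms)) > threshold:
            hits += 1
    return hits / trials

# Subexponential asymptotics: P(S_n > x) ~ n * P(X1 > x) for large x.
alpha, n = 1.5, 5
x = 50.0
estimate = tail_probability(n, x, alpha, trials=100_000)
asymptotic = n * x ** (-alpha)
print(estimate, asymptotic)
```

The crude estimator's relative error blows up as x grows (rare events), which is why the sharp asymptotics and variance-reduced Monte Carlo estimators mentioned in the abstract are needed.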

  12. Factors associated with anti-human leukocyte antigen antibodies in patients supported with continuous-flow devices and effect on probability of transplant and post-transplant outcomes

    DEFF Research Database (Denmark)

    Alba, Ana C; Tinckam, Kathryn; Foroutan, Farid

    2015-01-01

    BACKGROUND: One major disadvantage of ventricular assist device (VAD) therapy is the development of human-leukocyte antigen (HLA) antibodies. We aimed to identify factors associated with HLA antibodies during continuous flow (CF)-VAD support and assess the effect on transplant probability...

  13. Probability tales

    CERN Document Server

    Grinstead, Charles M; Snell, J Laurie

    2011-01-01

    This book explores four real-world topics through the lens of probability theory. It can be used to supplement a standard text in probability or statistics. Most elementary textbooks present the basic theory and then illustrate the ideas with some neatly packaged examples. Here the authors assume that the reader has seen, or is learning, the basic theory from another book and concentrate in some depth on the following topics: streaks, the stock market, lotteries, and fingerprints. This extended format allows the authors to present multiple approaches to problems and to pursue promising side discussions in ways that would not be possible in a book constrained to cover a fixed set of topics. To keep the main narrative accessible, the authors have placed the more technical mathematical details in appendices. The appendices can be understood by someone who has taken one or two semesters of calculus.

  14. Probability theory

    CERN Document Server

    Dorogovtsev, A Ya; Skorokhod, A V; Silvestrov, D S; Skorokhod, A V

    1997-01-01

    This book of problems is intended for students in pure and applied mathematics. There are problems in traditional areas of probability theory and problems in the theory of stochastic processes, which has wide applications in the theory of automatic control, queuing and reliability theories, and in many other modern science and engineering fields. Answers to most of the problems are given, and the book provides hints and solutions for more complicated problems.

  15. Analysis of HIV-1 intersubtype recombination breakpoints suggests region with high pairing probability may be a more fundamental factor than sequence similarity affecting HIV-1 recombination.

    Science.gov (United States)

    Jia, Lei; Li, Lin; Gui, Tao; Liu, Siyang; Li, Hanping; Han, Jingwan; Guo, Wei; Liu, Yongjian; Li, Jingyun

    2016-09-21

    With increasing data on HIV-1, a more relevant molecular model describing mechanism details of HIV-1 genetic recombination usually requires upgrades. Currently an incomplete structural understanding of the copy choice mechanism along with several other issues in the field that lack elucidation led us to perform an analysis of the correlation between breakpoint distributions and (1) the probability of base pairing, and (2) intersubtype genetic similarity to further explore structural mechanisms. Near full length sequences of URFs from Asia, Europe, and Africa (one sequence/patient), and representative sequences of worldwide CRFs were retrieved from the Los Alamos HIV database. Their recombination patterns were analyzed by jpHMM in detail. Then the relationships between breakpoint distributions and (1) the probability of base pairing, and (2) intersubtype genetic similarities were investigated. Pearson correlation test showed that all URF groups and the CRF group exhibit the same breakpoint distribution pattern. Additionally, the Wilcoxon two-sample test indicated a significant and inexplicable limitation of recombination in regions with high pairing probability. These regions have been found to be strongly conserved across distinct biological states (i.e., strong intersubtype similarity), and genetic similarity has been determined to be a very important factor promoting recombination. Thus, the results revealed an unexpected disagreement between intersubtype similarity and breakpoint distribution, which were further confirmed by genetic similarity analysis. Our analysis reveals a critical conflict between results from natural HIV-1 isolates and those from HIV-1-based assay vectors in which genetic similarity has been shown to be a very critical factor promoting recombination. These results indicate the region with high-pairing probabilities may be a more fundamental factor affecting HIV-1 recombination than sequence similarity in natural HIV-1 infections. Our

  16. Factors affecting the probability of first year medical student dropout in the UK: a logistic analysis for the intake cohorts of 1980-92.

    Science.gov (United States)

    Arulampalam, Wiji; Naylor, Robin; Smith, Jeremy

    2004-05-01

    In the context of the 1997 Report of the Medical Workforce Standing Advisory Committee, it is important that we develop an understanding of the factors influencing medical school retention rates. To analyse the determinants of the probability that an individual medical student will drop out of medical school during their first year of study. Binomial and multinomial logistic regression analysis of individual-level administrative data on 51 810 students in 21 medical schools in the UK for the intake cohorts of 1980-92 was performed. The overall average first year dropout rate over the period 1980-92 was calculated to be 3.8%. We found that the probability that a student would drop out of medical school during their first year of study was influenced significantly by both the subjects studied at A-level and by the scores achieved. For example, achieving 1 grade higher in biology, chemistry or physics reduced the dropout probability by 0.38% points, equivalent to a fall of 10%. We also found that males were about 8% more likely to drop out than females. The medical school attended also had a significant effect on the estimated dropout probability. Indicators of both the social class and the previous school background of the student were largely insignificant. Policies aimed at increasing the size of the medical student intake in the UK and of widening access to students from non-traditional backgrounds should be informed by evidence that student dropout probabilities are sensitive to measures of A-level attainment, such as subject studied and scores achieved. If traditional entry requirements or standards are relaxed, then this is likely to have detrimental effects on medical schools' retention rates unless accompanied by appropriate measures such as focussed student support.

  17. A radioisotope dilution assay for unlabelled vitamin B12-intrinsic factor complex employing the binding intrinsic factor antibody: probable evidence for two types of binding antibody

    International Nuclear Information System (INIS)

    Jacob, E.; O'Brien, H.A.W.; Mollin, D.L.

    1977-01-01

A new radioisotope dilution assay for the vitamin B12-intrinsic factor complex is described. The method is based on the use of the binding-type intrinsic factor antibody (the binding reagent), which, when combined with the intrinsic factor-vitamin B12 complex (labelled ligand), is quantitatively adsorbed onto zirconium phosphate gel at pH 6.25. The new assay has been shown to provide a measure of intrinsic factor comparable with other intrinsic factor assays, but it has the important advantage of being able to measure the unlabelled vitamin B12-intrinsic factor complex (unlabelled ligand), and will therefore be valuable in the study of physiological events in the gastrointestinal tract. During the study, evidence was found for at least two types of binding intrinsic factor antibody: one which combines preferentially with the intrinsic factor-vitamin B12 complex, and one which combines equally well with this complex or with free intrinsic factor. (author)

  18. Probability and amounts of yogurt intake are differently affected by sociodemographic, economic, and lifestyle factors in adults and the elderly-results from a population-based study.

    Science.gov (United States)

    Possa, Gabriela; de Castro, Michelle Alessandra; Marchioni, Dirce Maria Lobo; Fisberg, Regina Mara; Fisberg, Mauro

    2015-08-01

    The aim of this population-based cross-sectional health survey (N = 532) was to investigate the factors associated with the probability and amounts of yogurt intake in Brazilian adults and the elderly. A structured questionnaire was used to obtain data on demographics, socioeconomic information, presence of morbidities and lifestyle and anthropometric characteristics. Food intake was evaluated using two nonconsecutive 24-hour dietary recalls and a Food Frequency Questionnaire. Approximately 60% of the subjects were classified as yogurt consumers. In the logistic regression model, yogurt intake was associated with smoking (odds ratio [OR], 1.98), female sex (OR, 2.12), and age 20 to 39 years (OR, 3.11). Per capita family income and being a nonsmoker were factors positively associated with the amount of yogurt consumption (coefficients, 0.61 and 3.73, respectively), whereas the level of education of the head of household was inversely associated (coefficient, 0.61). In this study, probability and amounts of yogurt intake are differently affected by demographic, socioeconomic, and lifestyle factors in adults and the elderly. Copyright © 2015 Elsevier Inc. All rights reserved.

  19. Collision Probability Analysis

    DEFF Research Database (Denmark)

    Hansen, Peter Friis; Pedersen, Preben Terndrup

    1998-01-01

It is the purpose of this report to apply a rational model for prediction of ship-ship collision probabilities as a function of the ship and crew characteristics and the navigational environment for MS Dextra sailing on a route between Cadiz and the Canary Islands. The most important ship and crew characteristics are: ship speed, ship manoeuvrability, the layout of the navigational bridge, the radar system, the number and the training of navigators, the presence of a look-out, etc. The main parameters affecting the navigational environment are ship traffic density, probability distributions of wind speeds … probability, i.e. a study of the navigator's role in resolving critical situations; a causation factor is derived as a second step. The report documents the first step in a probabilistic collision damage analysis. Future work will include calculation of the energy released for crushing of structures, giving …

  20. Nitrogen oxide emission calculation for post-Panamax container ships by using engine operation power probability as weighting factor: A slow-steaming case.

    Science.gov (United States)

    Cheng, Chih-Wen; Hua, Jian; Hwang, Daw-Shang

    2017-12-07

In this study, the nitrogen oxide (NOx) emission factors and total NOx emissions of two groups of post-Panamax container ships operating on a long-term slow-steaming basis along Euro-Asian routes were calculated using both the probability density function of engine power levels and the NOx emission function. The main engines of the five sister ships in Group I satisfied the Tier I emission limit stipulated in MARPOL (International Convention for the Prevention of Pollution from Ships) Annex VI, and those in Group II satisfied the Tier II limit. The calculated NOx emission factors of the Group I and Group II ships were 14.73 and 17.85 g/kWh, respectively. The total NOx emissions of the Group II ships were determined to be 4.4% greater than those of the Group I ships. When the Tier II certification value was used to calculate the average total NOx emissions of the Group II engines, the result was lower than the actual value by 21.9%. Although fuel consumption and carbon dioxide (CO2) emissions were increased by 1.76% because of slow steaming, the NOx emissions were markedly reduced by 17.2%. The proposed method is more effective and accurate than the NOx Technical Code 2008, and it can be more appropriately applied to determine the NOx emissions of the international shipping inventory. Using the operating-power probability density function of diesel engines as the weighting factor, together with the NOx emission function obtained from test-bed measurements, makes the calculated NOx emissions more accurate and practical. The proposed method is suitable for all types and purposes of diesel engines, irrespective of their operating power level. It can be used to effectively determine the NOx emissions of international shipping for inventory applications and should be considered in determining the carbon tax to be imposed in the future.
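The weighting scheme at the heart of the method can be sketched in a few lines: the engine-level emission factor is the operating-power-probability-weighted average of a load-dependent emission function. All numbers below are made-up illustrations, not the ships' data:

```python
# Power-probability-weighted NOx emission factor (toy numbers, not ship data).
power_levels = [0.25, 0.40, 0.55, 0.70, 0.85]   # fraction of MCR (hypothetical)
probability  = [0.10, 0.35, 0.30, 0.20, 0.05]   # long-term operating-power PDF
ef_at_power  = [19.0, 17.5, 16.0, 15.0, 14.5]   # EF(P) from test bed, g/kWh (hypothetical)

assert abs(sum(probability) - 1.0) < 1e-9       # a PDF must sum to one

# Weighted average: EF = sum_i f(P_i) * EF(P_i)
ef_weighted = sum(p * ef for p, ef in zip(probability, ef_at_power))
print(round(ef_weighted, 2))  # -> 16.55 g/kWh
```

Because slow steaming concentrates the operating-power PDF at low loads, where EF(P) is typically higher per kWh, the weighted factor can differ substantially from the certification value measured at the standard test-cycle loads.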

  1. Probability Aggregates in Probability Answer Set Programming

    OpenAIRE

    Saad, Emad

    2013-01-01

Probability answer set programming is a declarative programming framework that has been shown to be effective for representing and reasoning about a variety of probability reasoning tasks. However, the lack of probability aggregates, e.g. expected values, in the language of disjunctive hybrid probability logic programs (DHPP) disallows the natural and concise representation of many interesting problems. In this paper, we extend DHPP to allow arbitrary probability aggregates. We introduce two types of p…

  2. Scaling Qualitative Probability

    OpenAIRE

    Burgin, Mark

    2017-01-01

There are different approaches to qualitative probability, which include subjective probability. We developed a representation of qualitative probability based on relational systems, which allows modeling uncertainty by probability structures and is more coherent than existing approaches. This setting makes it possible to prove that any comparative probability is induced by some probability structure (Theorem 2.1), that classical probability is a probability structure (Theorem 2.2), and that i…

  3. URR [Unresolved Resonance Region] computer code: A code to calculate resonance neutron cross-section probability tables, Bondarenko self-shielding factors, and self-indication ratios for fissile and fertile nuclides

    International Nuclear Information System (INIS)

    Leal, L.C.; de Saussure, G.; Perez, R.B.

    1990-01-01

The URR computer code has been developed to calculate cross-section probability tables, Bondarenko self-shielding factors, and self-indication ratios for fertile and fissile isotopes in the unresolved resonance region. Monte Carlo methods are utilized to select appropriate resonance parameters and to compute the cross sections at the desired reference energy. The neutron cross sections are calculated by the single-level Breit-Wigner formalism with s-, p-, and d-wave contributions. The cross-section probability tables are constructed by sampling the Doppler-broadened cross sections. The various self-shielding factors are computed numerically as Lebesgue integrals over the cross-section probability tables.
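The table-building step can be illustrated with a toy Monte Carlo sketch: lognormal stand-in "cross sections" rather than Breit-Wigner resonances, with equiprobable bands and a simple flux-weighted shielding ratio. The band structure is the point here, not the physics:

```python
# Toy equiprobable cross-section probability table built by Monte Carlo, in the
# spirit of URR (lognormal stand-in values, not Breit-Wigner resonance physics).
import random
import statistics

random.seed(1)
samples = sorted(random.lognormvariate(0.0, 0.5) for _ in range(10_000))

n_bands = 5                                  # equal-probability bands
band = len(samples) // n_bands
table = [statistics.mean(samples[i * band:(i + 1) * band]) for i in range(n_bands)]
print([round(x, 3) for x in table])          # band-average cross sections

# A Bondarenko-type self-shielding factor: flux-weighted effective cross section
# (narrow-resonance weight 1/(sigma + sigma0)) over the infinite-dilution mean.
sigma0 = 10.0                                # dilution cross section, illustrative
weights = [1.0 / (x + sigma0) for x in table]
sigma_eff = sum(x * w for x, w in zip(table, weights)) / sum(weights)
f = sigma_eff / statistics.mean(table)
print(round(f, 4))                           # < 1: resonance self-shielding
```

Each band carries probability 1/n_bands, so integrals over the cross-section distribution reduce to short sums over the table, which is what makes the representation cheap to use downstream.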

  4. Probability inequalities for decomposition integrals

    Czech Academy of Sciences Publication Activity Database

    Agahi, H.; Mesiar, Radko

    2017-01-01

    Roč. 315, č. 1 (2017), s. 240-248 ISSN 0377-0427 Institutional support: RVO:67985556 Keywords : Decomposition integral * Superdecomposition integral * Probability inequalities Subject RIV: BA - General Mathematics OBOR OECD: Statistics and probability Impact factor: 1.357, year: 2016 http://library.utia.cas.cz/separaty/2017/E/mesiar-0470959.pdf

  5. The effect of combining recombinant human tumor necrosis factor-alpha with local radiation on tumor control probability of a human glioblastoma multiforme xenograft in nude mice

    International Nuclear Information System (INIS)

    Huang, Peigen; Allam, Ayman; Perez, Luis A.; Taghian, Alphonse; Freeman, Jill; Suit, Herman D.

    1995-01-01

Purpose: To evaluate the antitumor activity of recombinant human tumor necrosis factor-alpha (rHuTNF-α) on a human glioblastoma multiforme (U87) xenograft in nude mice, and to study the effect of combining rHuTNF-α with local radiation on the tumor control probability of this tumor model. Methods and Materials: The U87 xenograft was transplanted SC into the right hindleg of NCr/Sed nude mice (7-8 weeks old, male). When tumors reached a volume of about 110 mm³, mice were randomly assigned to treatment: rHuTNF-α alone compared with normal saline control; or local radiation plus rHuTNF-α vs. local radiation plus normal saline. Parameters of growth delay, volume doubling time, percentage of necrosis, and cell loss factor were used to assess the antitumor effects of rHuTNF-α on this tumor. The TCD50 (tumor control dose 50%) was used as an endpoint to determine the effect of combining rHuTNF-α with local radiation. Results: Tumor growth in mice treated with a dose of 150 μg/kg body weight rHuTNF-α, IP injection daily for 7 consecutive days, was delayed about 8 days compared to that in controls. Tumors in the treatment group had a significantly longer volume doubling time, and were smaller in volume and more necrotic than matched tumors in the control group. rHuTNF-α also induced a 2.3-fold increase in cell loss factor. The administration of the above-mentioned dose of rHuTNF-α, starting 24 h after single doses of localized irradiation under hypoxic conditions, resulted in a significant reduction in TCD50 from the control value of 60.9 Gy to 50.5 Gy (p < 0.01). Conclusion: rHuTNF-α exhibits an antitumor effect against the U87 xenograft in nude mice, as evidenced by an increased delay in tumor growth as well as cell loss factor. There was also an augmentation of tumor curability when given in combination with radiotherapy, resulting in a significantly lower TCD50 value in the treatment vs. the control groups.

  6. On Probability Leakage

    OpenAIRE

    Briggs, William M.

    2012-01-01

    The probability leakage of model M with respect to evidence E is defined. Probability leakage is a kind of model error. It occurs when M implies that events $y$, which are impossible given E, have positive probability. Leakage does not imply model falsification. Models with probability leakage cannot be calibrated empirically. Regression models, which are ubiquitous in statistical practice, often evince probability leakage.

  8. Probability 1/e

    Science.gov (United States)

    Koo, Reginald; Jones, Martin L.

    2011-01-01

    Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.
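One of the classic examples of this phenomenon is the derangement problem: the probability that a random permutation has no fixed point tends to 1/e as the number of items grows. A quick simulation (our illustration, not necessarily one of the article's three problems) confirms it:

```python
# Estimate the probability that a random permutation of n items has no fixed
# point (a derangement); for modest n this is already very close to 1/e.
import math
import random

random.seed(0)
n, trials = 10, 100_000
hits = 0
for _ in range(trials):
    perm = list(range(n))
    random.shuffle(perm)
    if all(perm[i] != i for i in range(n)):
        hits += 1

print(round(hits / trials, 3))   # close to 1/e
print(round(1 / math.e, 3))      # -> 0.368
```

The exact derangement probability is the truncated series for e⁻¹, which converges so fast that even n = 10 agrees with 1/e to about six decimal places.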

  9. Probability an introduction

    CERN Document Server

    Goldberg, Samuel

    1960-01-01

    Excellent basic text covers set theory, probability theory for finite sample spaces, binomial theorem, probability distributions, means, standard deviations, probability function of binomial distribution, more. Includes 360 problems with answers for half.

  10. Quantum probability measures and tomographic probability densities

    NARCIS (Netherlands)

    Amosov, GG; Man'ko, [No Value

    2004-01-01

Using a simple relation of the Dirac delta-function to the generalized theta-function, the relationship between the tomographic probability approach and the quantum probability measure approach to the description of quantum states is discussed. The quantum state tomogram expressed in terms of the …

  11. Contributions to quantum probability

    International Nuclear Information System (INIS)

    Fritz, Tobias

    2010-01-01

… finite set can occur as the outcome distribution of a quantum-mechanical von Neumann measurement with postselection, given that the scalar product between the initial and the final state is known as well as the success probability of the postselection. An intermediate von Neumann measurement can enhance transition probabilities between states such that the error probability shrinks by a factor of up to 2. Chapter 4: A presentation of the category of stochastic matrices. This chapter gives generators and relations for the strict monoidal category of probabilistic maps on finite cardinals (i.e., stochastic matrices). Chapter 5: Convex Spaces: Definition and Examples. We try to promote convex spaces as an abstract concept of convexity, which was introduced by Stone as "barycentric calculus". A convex space is a set where one can take convex combinations in a consistent way. By identifying the corresponding Lawvere theory as the category from chapter 4 and using the results obtained there, we give a different proof of a result of Swirszcz which shows that convex spaces can be identified with algebras of a finitary version of the Giry monad. After giving an extensive list of examples of convex sets as they appear throughout mathematics and theoretical physics, we note that there also exist convex spaces that cannot be embedded into a vector space: semilattices are a class of examples of purely combinatorial type. In an information-theoretic interpretation, convex subsets of vector spaces are probabilistic, while semilattices are possibilistic. Convex spaces unify these two concepts. (orig.)

  13. Toward a generalized probability theory: conditional probabilities

    International Nuclear Information System (INIS)

    Cassinelli, G.

    1979-01-01

    The main mathematical object of interest in the quantum logic approach to the foundations of quantum mechanics is the orthomodular lattice and a set of probability measures, or states, defined by the lattice. This mathematical structure is studied per se, independently from the intuitive or physical motivation of its definition, as a generalized probability theory. It is thought that the building-up of such a probability theory could eventually throw light on the mathematical structure of Hilbert-space quantum mechanics as a particular concrete model of the generalized theory. (Auth.)

  14. DECOFF Probabilities of Failed Operations

    DEFF Research Database (Denmark)

    Gintautas, Tomas

    2015-01-01

A statistical procedure for the estimation of Probabilities of Failed Operations is described and exemplified using ECMWF weather forecasts and SIMO output from Rotor Lift test case models. The influence of the safety factor is also investigated. The DECOFF statistical method is benchmarked against the standard Alpha-factor…

  15. Prevalence, Incidence, Prognosis, Early Stroke Risk, and Stroke-Related Prognostic Factors of Definite or Probable Transient Ischemic Attacks in China, 2013

    Directory of Open Access Journals (Sweden)

    Bin Jiang

    2017-06-01

The epidemiological characteristics of transient ischemic attacks (TIAs) in China are unclear. In 2013, we conducted a nationally representative, door-to-door epidemiological survey on TIA in China using a complex, multistage, probability sampling design. Results showed that the weighted prevalence of TIA in China was 103.3 [95% confidence interval (CI): 83.9–127.2] per 100,000 in the population, 92.4 (75.0–113.8) per 100,000 among men, and 114.7 (87.2–151.0) per 100,000 among women. The weighted incidence of TIA was 23.9 (17.8–32.0) per 100,000 in the population, 21.3 (14.3–31.5) per 100,000 among men, and 26.6 (17.0–41.7) per 100,000 among women. No difference in average prognosis was found between TIA and stroke in the population. Weighted risk of stroke among TIA patients was 9.7% (6.5–14.3%), 11.1% (7.5–16.1%), and 12.3% (8.4–17.7%) at 2, 30, and 90 days, respectively. The risk of stroke was higher among male patients with a history of TIA than among female patients with a history of TIA (OR: 2.469; 95% CI: 1.172–5.201; P = 0.018), and higher among TIA patients with hypertension than among TIA patients without hypertension (OR: 2.671; 1.547–4.613; P < 0.001). It can be concluded that there are an estimated 1.35 million TIA patients nationwide, with 0.31 million new cases of TIA annually in China. TIA patients were not better managed prior to a stroke event. Early risk of stroke among TIA patients is high. Sex and hypertension may be stroke-associated prognostic factors among TIA patients. TIA clinics and surveillance should be integrated into the national health-care system.

  16. Failure probability under parameter uncertainty.

    Science.gov (United States)

    Gerrard, R; Tsanakas, A

    2011-05-01

In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decisionmaker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications. © 2010 Society for Risk Analysis.
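The central effect is easy to reproduce for a normal (location-scale) loss: setting the control threshold from estimated parameters makes the expected failure frequency exceed the nominal level. The simulation below is our sketch of that point, not the paper's calculation:

```python
# Parameter uncertainty inflates failure probability: with a plug-in threshold
# m_hat + z * s_hat aimed at a nominal 5% level, the realized (expected)
# exceedance frequency is above 5% for small samples. Sketch, not the paper's math.
import random
import statistics
from statistics import NormalDist

random.seed(42)
nominal_p = 0.05
z = NormalDist().inv_cdf(1 - nominal_p)   # nominal quantile factor
n, reps = 20, 20_000                      # small sample -> parameter error

total = 0.0
for _ in range(reps):
    data = [random.gauss(0.0, 1.0) for _ in range(n)]
    m_hat = statistics.fmean(data)
    s_hat = statistics.stdev(data)        # estimated location and scale
    threshold = m_hat + z * s_hat         # plug-in control threshold
    total += 1 - NormalDist().cdf(threshold)  # true exceedance probability

print(round(total / reps, 3))             # noticeably above the nominal 0.05
```

Averaging the true exceedance probability over the sampling distribution of the estimates is exactly the "expected frequency" reading of failure probability used in the abstract; with n = 20 the realized level is roughly 0.06 rather than 0.05.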

  17. URR [Unresolved Resonance Region] computer code: A code to calculate resonance neutron cross-section probability tables, Bondarenko self-shielding factors, and self-indication ratios for fissile and fertile nuclides

    International Nuclear Information System (INIS)

    Leal, L.C.; de Saussure, G.; Perez, R.B.

    1989-01-01

The URR computer code has been developed to calculate cross-section probability tables, Bondarenko self-shielding factors, and self-indication ratios for fertile and fissile isotopes in the unresolved resonance region. Monte Carlo methods are utilized to select appropriate resonance parameters and to compute the cross sections at the desired reference energy. The neutron cross sections are calculated by the single-level Breit-Wigner formalism with s-, p-, and d-wave contributions. The cross-section probability tables are constructed by sampling the Doppler-broadened cross sections. The various self-shielding factors are computed numerically as Lebesgue integrals over the cross-section probability tables. 6 refs

  18. Philosophical theories of probability

    CERN Document Server

    Gillies, Donald

    2000-01-01

    The Twentieth Century has seen a dramatic rise in the use of probability and statistics in almost all fields of research. This has stimulated many new philosophical ideas on probability. Philosophical Theories of Probability is the first book to present a clear, comprehensive and systematic account of these various theories and to explain how they relate to one another. Gillies also offers a distinctive version of the propensity theory of probability, and the intersubjective interpretation, which develops the subjective theory.

  19. Non-Archimedean Probability

    NARCIS (Netherlands)

    Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia

We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned probability zero.

  20. Interpretations of probability

    CERN Document Server

    Khrennikov, Andrei

    2009-01-01

    This is the first fundamental book devoted to non-Kolmogorov probability models. It provides a mathematical theory of negative probabilities, with numerous applications to quantum physics, information theory, complexity, biology and psychology. The book also presents an interesting model of cognitive information reality with flows of information probabilities, describing the process of thinking, social, and psychological phenomena.

  1. Foundations of probability

    International Nuclear Information System (INIS)

    Fraassen, B.C. van

    1979-01-01

The interpretation of probabilities in physical theories is considered, whether quantum or classical. The following points are discussed: 1) the functions P(μ, Q), in terms of which states and propositions can be represented, are classical (Kolmogoroff) probabilities, formally speaking; 2) these probabilities are generally interpreted as themselves conditional, and the conditions are mutually incompatible where the observables are maximal; and 3) testing of the theory typically takes the form of confronting the expectation values of observable Q calculated with probability measures P(μ, Q) for states μ; hence, of comparing the probabilities P(μ, Q)(E) with the frequencies of occurrence of the corresponding events. It seems that even the interpretation of quantum mechanics, in so far as it concerns what the theory says about the empirical (i.e. actual, observable) phenomena, deals with the confrontation of classical probability measures with observable frequencies. This confrontation is studied. (Auth./C.F.)

  2. The quantum probability calculus

    International Nuclear Information System (INIS)

    Jauch, J.M.

    1976-01-01

    The Wigner anomaly (1932) for the joint distribution of noncompatible observables is an indication that the classical probability calculus is not applicable for quantum probabilities. It should, therefore, be replaced by another, more general calculus, which is specifically adapted to quantal systems. In this article this calculus is exhibited and its mathematical axioms and the definitions of the basic concepts such as probability field, random variable, and expectation values are given. (B.R.H)

  3. Choice Probability Generating Functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel L; Bierlaire, Michel

This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications.

  4. Probability of satellite collision

    Science.gov (United States)

    Mccarter, J. W.

    1972-01-01

    A method is presented for computing the probability of a collision between a particular artificial earth satellite and any one of the total population of earth satellites. The collision hazard incurred by the proposed modular Space Station is assessed using the technique presented. The results of a parametric study to determine what type of satellite orbits produce the greatest contribution to the total collision probability are presented. Collision probability for the Space Station is given as a function of Space Station altitude and inclination. Collision probability was also parameterized over miss distance and mission duration.
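A common first-order model for such assessments treats impacts as a Poisson process in an ambient debris flux; the numbers below are hypothetical and this is our illustration of the general approach, not the method of this report:

```python
# First-order collision probability under a Poisson impact model: with debris
# flux F (impacts per m^2 per year), cross-sectional area A, and duration T,
# the probability of at least one collision is P = 1 - exp(-F*A*T).
# All parameter values are hypothetical, not taken from the report.
import math

F = 1e-6    # flux, collisions per m^2 per year (hypothetical)
A = 500.0   # Space Station cross-sectional area, m^2 (hypothetical)
T = 10.0    # mission duration, years

P = 1.0 - math.exp(-F * A * T)
print(f"{P:.5f}")   # approximately F*A*T when the product is small
```

For small F·A·T the exponential is well approximated by P ≈ F·A·T, which is why collision probability in such studies scales roughly linearly with area and mission duration.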

  5. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2013-01-01

This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended…

  6. Handbook of probability

    CERN Document Server

    Florescu, Ionut

    2013-01-01

THE COMPLETE COLLECTION NECESSARY FOR A CONCRETE UNDERSTANDING OF PROBABILITY Written in a clear, accessible, and comprehensive manner, the Handbook of Probability presents the fundamentals of probability with an emphasis on the balance of theory, application, and methodology. Utilizing basic examples throughout, the handbook expertly transitions between concepts and practice to allow readers an inclusive introduction to the field of probability. The book provides a useful format with self-contained chapters, allowing the reader easy and quick reference. Each chapter includes an introduction…

  7. Real analysis and probability

    CERN Document Server

    Ash, Robert B; Lukacs, E

    1972-01-01

Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory. Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var…

  8. Probably not future prediction using probability and statistical inference

    CERN Document Server

    Dworsky, Lawrence N

    2008-01-01

An engaging, entertaining, and informative introduction to probability and prediction in our everyday lives. Although Probably Not deals with probability and statistics, it is not heavily mathematical and is not filled with complex derivations, proofs, and theoretical problem sets. This book unveils the world of statistics through questions such as what is known based upon the information at hand and what can be expected to happen. While learning essential concepts including "the confidence factor" and "random walks," readers will be entertained and intrigued as they move from chapter to chapter. Moreover, the author provides a foundation of basic principles to guide decision making in almost all facets of life including playing games, developing winning business strategies, and managing personal finances. Much of the book is organized around easy-to-follow examples that address common, everyday issues such as: how travel time is affected by congestion, driving speed, and traffic lights; why different gambling …

  9. Knowledge typology for imprecise probabilities.

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, G. D. (Gregory D.); Zucker, L. J. (Lauren J.)

    2002-01-01

    When characterizing the reliability of a complex system there are often gaps in the data available for specific subsystems or other factors influencing total system reliability. At Los Alamos National Laboratory we employ ethnographic methods to elicit expert knowledge when traditional data is scarce. Typically, we elicit expert knowledge in probabilistic terms. This paper will explore how we might approach elicitation if methods other than probability (i.e., Dempster-Shafer, or fuzzy sets) prove more useful for quantifying certain types of expert knowledge. Specifically, we will consider if experts have different types of knowledge that may be better characterized in ways other than standard probability theory.

  10. [Risk factor analysis of the patients with solitary pulmonary nodules and establishment of a prediction model for the probability of malignancy].

    Science.gov (United States)

    Wang, X; Xu, Y H; Du, Z Y; Qian, Y J; Xu, Z H; Chen, R; Shi, M H

    2018-02-23

Objective: This study aims to analyze the relationship among the clinical features, radiologic characteristics and pathological diagnosis in patients with solitary pulmonary nodules (SPN), and to establish a prediction model for the probability of malignancy. Methods: Clinical data of 372 patients with solitary pulmonary nodules who underwent surgical resection with definite postoperative pathological diagnosis were retrospectively analyzed. In these cases, we collected clinical and radiologic features including gender, age, smoking history, history of tumor, family history of cancer, location of the lesion, ground-glass opacity, maximum diameter, calcification, vessel convergence sign, vacuole sign, pleural indentation, spiculation and lobulation. The cases were divided into a modeling group (268 cases) and a validation group (104 cases). A new prediction model was established by logistic regression analysis of the data from the modeling group. The data of the validation group were then used to validate the efficiency of the new model, which was compared with three classical models (Mayo model, VA model and LiYun model). With the calculated probability values for each model in the validation group, SPSS 22.0 was used to draw receiver operating characteristic curves to assess the predictive value of the new model. Results: 112 benign SPNs and 156 malignant SPNs were included in the modeling group. Multivariable logistic regression analysis showed that gender, age, history of tumor, ground-glass opacity, maximum diameter, and spiculation were independent predictors of malignancy in patients with SPN (P < 0.05). The prediction model for the probability of malignancy was: p = e^x/(1 + e^x), where x = -4.8029 - 0.743×gender + 0.057×age + 1.306×history of tumor + 1.305×ground-glass opacity + 0.051×maximum diameter + 1.043×spiculation. When the data of the validation group were applied to the four prediction models, the area under the curve of our mathematical prediction model was 0.742, which is greater
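The reported model is a standard logistic regression, so its scoring formula can be sketched directly. A minimal Python sketch follows; the coefficients are as reported in the abstract, but the 0/1 encodings of the binary predictors (and which sex maps to which value of `gender`) are assumptions, since the abstract does not state them:

```python
import math

def malignancy_probability(gender, age, history_of_tumor,
                           ground_glass_opacity, max_diameter_mm,
                           spiculation):
    """Estimated probability of malignancy for a solitary pulmonary nodule.

    Coefficients as published; binary predictors assumed coded
    1 = present, 0 = absent (the gender coding is likewise an assumption).
    """
    x = (-4.8029
         - 0.743 * gender
         + 0.057 * age
         + 1.306 * history_of_tumor
         + 1.305 * ground_glass_opacity
         + 0.051 * max_diameter_mm
         + 1.043 * spiculation)
    # Logistic link from the abstract: p = e^x / (1 + e^x)
    return math.exp(x) / (1.0 + math.exp(x))

# Illustrative (hypothetical) patient: age 65, GGO present,
# 20 mm nodule with spiculation, no tumor history
p = malignancy_probability(gender=0, age=65, history_of_tumor=0,
                           ground_glass_opacity=1, max_diameter_mm=20,
                           spiculation=1)
print(round(p, 3))
```

Because the link function is monotone in x, each positive coefficient raises the predicted probability, which matches the direction of the reported predictors.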

  11. Bidirectional enhancing activities between human T cell leukemia-lymphoma virus type I and human cytomegalovirus in human term syncytiotrophoblast cells cultured in vitro.

    Science.gov (United States)

    Tóth, F D; Aboagye-Mathiesen, G; Szabó, J; Liu, X; Mosborg-Petersen, P; Kiss, J; Hager, H; Zdravkovic, M; Andirkó, I; Aranyosi, J

    1995-12-01

The syncytiotrophoblast layer of the human placenta has an important role in limiting transplacental viral spread from mother to fetus. Human cytomegalovirus (HCMV) is capable of establishing a latent infection in syncytiotrophoblast cells, with restriction of gene expression to immediate-early and early proteins. We analyzed the extent of replication of human T cell leukemia-lymphoma virus type I (HTLV-I) in human term syncytiotrophoblasts infected with HTLV-I alone or coinfected with HTLV-I and HCMV. Although syncytiotrophoblasts could be infected with cell-free HTLV-I, no viral protein expression was found in the singly infected cells. In contrast, coinfection of the cells with HTLV-I and HCMV resulted in simultaneous replication of both viruses. Bidirectional enhancing activities between HTLV-I and HCMV were mediated primarily by the Tax and immediate-early proteins, respectively. The stimulatory effect of HTLV-I Tax on HCMV replication appeared to be mediated partly by tumor necrosis factor beta and transforming growth factor beta-1. We observed formation of pseudotypes with HTLV-I nucleocapsids within HCMV envelopes, whereas HCMV was not pseudotyped by HTLV-I envelopes in dually infected syncytiotrophoblast cells. Our data suggest that in vivo dual infection of syncytiotrophoblast cells with HTLV-I and HCMV may facilitate the transplacental transmission of both viruses.

  12. Introduction to probability

    CERN Document Server

    Freund, John E

    1993-01-01

    Thorough, lucid coverage of permutations and factorials, probabilities and odds, frequency interpretation, mathematical expectation, decision making, postulates of probability, rule of elimination, binomial distribution, geometric distribution, standard deviation, law of large numbers, and much more. Exercises with some solutions. Summary. Bibliography. Includes 42 black-and-white illustrations. 1973 edition.

  13. Probability, Nondeterminism and Concurrency

    DEFF Research Database (Denmark)

    Varacca, Daniele

    Nondeterminism is modelled in domain theory by the notion of a powerdomain, while probability is modelled by that of the probabilistic powerdomain. Some problems arise when we want to combine them in order to model computation in which both nondeterminism and probability are present. In particula...

  14. Janus-faced probability

    CERN Document Server

    Rocchi, Paolo

    2014-01-01

    The problem of probability interpretation was long overlooked before exploding in the 20th century, when the frequentist and subjectivist schools formalized two conflicting conceptions of probability. Beyond the radical followers of the two schools, a circle of pluralist thinkers tends to reconcile the opposing concepts. The author uses two theorems in order to prove that the various interpretations of probability do not come into opposition and can be used in different contexts. The goal here is to clarify the multifold nature of probability by means of a purely mathematical approach and to show how philosophical arguments can only serve to deepen actual intellectual contrasts. The book can be considered as one of the most important contributions in the analysis of probability interpretation in the last 10-15 years.

  15. Probability and Measure

    CERN Document Server

    Billingsley, Patrick

    2012-01-01

    Praise for the Third Edition "It is, as far as I'm concerned, among the best books in math ever written....if you are a mathematician and want to have the top reference in probability, this is it." (Amazon.com, January 2006) A complete and comprehensive classic in probability and measure theory Probability and Measure, Anniversary Edition by Patrick Billingsley celebrates the achievements and advancements that have made this book a classic in its field for the past 35 years. Now re-issued in a new style and format, but with the reliable content that the third edition was revered for, this

  16. The concept of probability

    International Nuclear Information System (INIS)

    Bitsakis, E.I.; Nicolaides, C.A.

    1989-01-01

The concept of probability is now, and always has been, central to the debate on the interpretation of quantum mechanics. Furthermore, probability permeates all of science, as well as our everyday life. The papers included in this volume, written by leading proponents of the ideas expressed, embrace a broad spectrum of thought and results: mathematical, physical, epistemological, and experimental, both specific and general. The contributions are arranged in parts under the following headings: Following Schroedinger's thoughts; Probability and quantum mechanics; Aspects of the arguments on nonlocality; Bell's theorem and EPR correlations; Real or Gedanken experiments and their interpretation; Questions about irreversibility and stochasticity; and Epistemology, interpretation and culture. (author). refs.; figs.; tabs

  17. Probability for statisticians

    CERN Document Server

    Shorack, Galen R

    2017-01-01

This 2nd edition textbook offers a rigorous introduction to measure theoretic probability with particular attention to topics of interest to mathematical statisticians—a textbook for courses in probability for students in mathematical statistics. It is recommended to anyone interested in the probability underlying modern statistics, providing a solid grounding in the probabilistic tools and techniques necessary to do theoretical research in statistics. For the teaching of probability theory to postgraduate statistics students, this is one of the most attractive books available. Of particular interest is a presentation of the major central limit theorems via Stein's method either prior to or alternative to a characteristic function presentation. Additionally, there is considerable emphasis placed on the quantile function as well as the distribution function. The bootstrap and trimming are both presented. Martingale coverage includes coverage of censored data martingales. The text includes measure theoretic...

  18. Concepts of probability theory

    CERN Document Server

    Pfeiffer, Paul E

    1979-01-01

    Using the Kolmogorov model, this intermediate-level text discusses random variables, probability distributions, mathematical expectation, random processes, more. For advanced undergraduates students of science, engineering, or math. Includes problems with answers and six appendixes. 1965 edition.

  19. Probability and Bayesian statistics

    CERN Document Server

    1987-01-01

This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics" which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985 the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are published especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability across psychological aspects of formulating subjective probability statements, abstract measure theoretical considerations, contributions to theoretical statistics an...

  20. Probability and Statistical Inference

    OpenAIRE

    Prosper, Harrison B.

    2006-01-01

    These lectures introduce key concepts in probability and statistical inference at a level suitable for graduate students in particle physics. Our goal is to paint as vivid a picture as possible of the concepts covered.

  1. Probabilities in physics

    CERN Document Server

    Hartmann, Stephan

    2011-01-01

    Many results of modern physics--those of quantum mechanics, for instance--come in a probabilistic guise. But what do probabilistic statements in physics mean? Are probabilities matters of objective fact and part of the furniture of the world, as objectivists think? Or do they only express ignorance or belief, as Bayesians suggest? And how are probabilistic hypotheses justified and supported by empirical evidence? Finally, what does the probabilistic nature of physics imply for our understanding of the world? This volume is the first to provide a philosophical appraisal of probabilities in all of physics. Its main aim is to make sense of probabilistic statements as they occur in the various physical theories and models and to provide a plausible epistemology and metaphysics of probabilities. The essays collected here consider statistical physics, probabilistic modelling, and quantum mechanics, and critically assess the merits and disadvantages of objectivist and subjectivist views of probabilities in these fie...

  2. Probability an introduction

    CERN Document Server

    Grimmett, Geoffrey

    2014-01-01

    Probability is an area of mathematics of tremendous contemporary importance across all aspects of human endeavour. This book is a compact account of the basic features of probability and random processes at the level of first and second year mathematics undergraduates and Masters' students in cognate fields. It is suitable for a first course in probability, plus a follow-up course in random processes including Markov chains. A special feature is the authors' attention to rigorous mathematics: not everything is rigorous, but the need for rigour is explained at difficult junctures. The text is enriched by simple exercises, together with problems (with very brief hints) many of which are taken from final examinations at Cambridge and Oxford. The first eight chapters form a course in basic probability, being an account of events, random variables, and distributions - discrete and continuous random variables are treated separately - together with simple versions of the law of large numbers and the central limit th...

  3. Probability in physics

    CERN Document Server

    Hemmo, Meir

    2012-01-01

What is the role and meaning of probability in physical theory, in particular in two of the most successful theories of our age, quantum physics and statistical mechanics? Laws once conceived as universal and deterministic, such as Newton's laws of motion, or the second law of thermodynamics, are replaced in these theories by inherently probabilistic laws. This collection of essays by some of the world's foremost experts presents an in-depth analysis of the meaning of probability in contemporary physics. Among the questions addressed are: How are probabilities defined? Are they objective or subjective? What is their explanatory value? What are the differences between quantum and classical probabilities? The result is an informative and thought-provoking book for the scientifically inquisitive.

  4. Probability in quantum mechanics

    Directory of Open Access Journals (Sweden)

    J. G. Gilson

    1982-01-01

By using a fluid theory which is an alternative to quantum theory but from which the latter can be deduced exactly, the long-standing problem of how quantum mechanics is related to stochastic processes is studied. It can be seen how the Schrödinger probability density has a relationship to time spent on small sections of an orbit, just as the probability density has in some classical contexts.

  5. Quantum computing and probability.

    Science.gov (United States)

    Ferry, David K

    2009-11-25

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.

  6. Quantum computing and probability

    International Nuclear Information System (INIS)

    Ferry, David K

    2009-01-01

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction. (viewpoint)

  7. The perception of probability.

    Science.gov (United States)

    Gallistel, C R; Krishan, Monika; Liu, Ye; Miller, Reilly; Latham, Peter E

    2014-01-01

    We present a computational model to explain the results from experiments in which subjects estimate the hidden probability parameter of a stepwise nonstationary Bernoulli process outcome by outcome. The model captures the following results qualitatively and quantitatively, with only 2 free parameters: (a) Subjects do not update their estimate after each outcome; they step from one estimate to another at irregular intervals. (b) The joint distribution of step widths and heights cannot be explained on the assumption that a threshold amount of change must be exceeded in order for them to indicate a change in their perception. (c) The mapping of observed probability to the median perceived probability is the identity function over the full range of probabilities. (d) Precision (how close estimates are to the best possible estimate) is good and constant over the full range. (e) Subjects quickly detect substantial changes in the hidden probability parameter. (f) The perceived probability sometimes changes dramatically from one observation to the next. (g) Subjects sometimes have second thoughts about a previous change perception, after observing further outcomes. (h) The frequency with which they perceive changes moves in the direction of the true frequency over sessions. (Explaining this finding requires 2 additional parametric assumptions.) The model treats the perception of the current probability as a by-product of the construction of a compact encoding of the experienced sequence in terms of its change points. It illustrates the why and the how of intermittent Bayesian belief updating and retrospective revision in simple perception. It suggests a reinterpretation of findings in the recent literature on the neurobiology of decision making. (PsycINFO Database Record (c) 2014 APA, all rights reserved).

  8. Irreversibility and conditional probability

    International Nuclear Information System (INIS)

    Stuart, C.I.J.M.

    1989-01-01

The mathematical entropy - unlike physical entropy - is simply a measure of uniformity for probability distributions in general. So understood, conditional entropies have the same logical structure as conditional probabilities. If, as is sometimes supposed, conditional probabilities are time-reversible, then so are conditional entropies and, paradoxically, both then share this symmetry with physical equations of motion. The paradox is, of course, that probabilities yield a direction to time both in statistical mechanics and quantum mechanics, while the equations of motion do not. The supposed time-reversibility of both conditionals seems also to involve a form of retrocausality that is related to, but possibly not the same as, that described by Costa de Beauregard. The retrocausality is paradoxically at odds with the generally presumed irreversibility of the quantum mechanical measurement process. Further paradox emerges if the supposed time-reversibility of the conditionals is linked with the idea that the thermodynamic entropy is the same thing as 'missing information' since this confounds the thermodynamic and mathematical entropies. However, it is shown that irreversibility is a formal consequence of conditional entropies and, hence, of conditional probabilities also. 8 refs. (Author)

  9. The pleasures of probability

    CERN Document Server

    Isaac, Richard

    1995-01-01

The ideas of probability are all around us. Lotteries, casino gambling, the almost non-stop polling which seems to mold public policy more and more: these are a few of the areas where principles of probability impinge in a direct way on the lives and fortunes of the general public. At a more removed level there is modern science which uses probability and its offshoots like statistics and the theory of random processes to build mathematical descriptions of the real world. In fact, twentieth-century physics, in embracing quantum mechanics, has a world view that is at its core probabilistic in nature, contrary to the deterministic one of classical physics. In addition to all this muscular evidence of the importance of probability ideas it should also be said that probability can be lots of fun. It is a subject where you can start thinking about amusing, interesting, and often difficult problems with very little mathematical background. In this book, I wanted to introduce a reader with at least a fairl...

  10. Experimental Probability in Elementary School

    Science.gov (United States)

    Andrew, Lane

    2009-01-01

    Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.

  11. Improving Ranking Using Quantum Probability

    OpenAIRE

    Melucci, Massimo

    2011-01-01

The paper shows that ranking information units by quantum probability differs from ranking them by classical probability provided the same data are used for parameter estimation. As probability of detection (also known as recall or power) and probability of false alarm (also known as fallout or size) measure the quality of ranking, we point out and show that ranking by quantum probability yields higher probability of detection than ranking by classical probability provided a given probability of ...

  12. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2010-01-01

This paper establishes that every random utility discrete choice model (RUM) has a representation that can be characterized by a choice-probability generating function (CPGF) with specific properties, and that every function with these specific properties is consistent with a RUM. The choice probabilities from the RUM are obtained from the gradient of the CPGF. Mixtures of RUM are characterized by logarithmic mixtures of their associated CPGF. The paper relates CPGF to multivariate extreme value distributions, and reviews and extends methods for constructing generating functions for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended to competing risk survival models.

  13. Probability and stochastic modeling

    CERN Document Server

    Rotar, Vladimir I

    2012-01-01

Basic Notions: Sample Space and Events; Probabilities; Counting Techniques. Independence and Conditional Probability: Independence; Conditioning; The Borel-Cantelli Theorem. Discrete Random Variables: Random Variables and Vectors; Expected Value; Variance and Other Moments; Inequalities for Deviations; Some Basic Distributions. Convergence of Random Variables: The Law of Large Numbers; Conditional Expectation. Generating Functions, Branching Processes, Random Walk Revisited: Branching Processes; Generating Functions; Branching Processes Revisited; More on Random Walk. Markov Chains: Definitions and Examples; Probability Distributions of Markov Chains; The First Step Analysis; Passage Times; Variables Defined on a Markov Chain; Ergodicity and Stationary Distributions; A Classification of States and Ergodicity. Continuous Random Variables: Continuous Distributions; Some Basic Distributions; Continuous Multivariate Distributions; Sums of Independent Random Variables; Conditional Distributions and Expectations; Distributions in the General Case; Simulation. Distribution F...

  14. Estimating Subjective Probabilities

    DEFF Research Database (Denmark)

    Andersen, Steffen; Fountain, John; Harrison, Glenn W.

    2014-01-01

either construct elicitation mechanisms that control for risk aversion, or construct elicitation mechanisms which undertake 'calibrating adjustments' to elicited reports. We illustrate how the joint estimation of risk attitudes and subjective probabilities can provide the calibration adjustments that theory calls for. We illustrate this approach using data from a controlled experiment with real monetary consequences to the subjects. This allows the observer to make inferences about the latent subjective probability, under virtually any well-specified model of choice under subjective risk, while still...

  15. Introduction to imprecise probabilities

    CERN Document Server

    Augustin, Thomas; de Cooman, Gert; Troffaes, Matthias C M

    2014-01-01

    In recent years, the theory has become widely accepted and has been further developed, but a detailed introduction is needed in order to make the material available and accessible to a wide audience. This will be the first book providing such an introduction, covering core theory and recent developments which can be applied to many application areas. All authors of individual chapters are leading researchers on the specific topics, assuring high quality and up-to-date contents. An Introduction to Imprecise Probabilities provides a comprehensive introduction to imprecise probabilities, includin

  16. Classic Problems of Probability

    CERN Document Server

    Gorroochurn, Prakash

    2012-01-01

    "A great book, one that I will certainly add to my personal library."—Paul J. Nahin, Professor Emeritus of Electrical Engineering, University of New Hampshire Classic Problems of Probability presents a lively account of the most intriguing aspects of statistics. The book features a large collection of more than thirty classic probability problems which have been carefully selected for their interesting history, the way they have shaped the field, and their counterintuitive nature. From Cardano's 1564 Games of Chance to Jacob Bernoulli's 1713 Golden Theorem to Parrondo's 1996 Perplexin

  17. Multivariate analysis of factors affecting probability of pregnancy and live birth with in vitro fertilization: an analysis of the Society for Assisted Reproductive Technology Clinic Outcomes Reporting System.

    Science.gov (United States)

    Baker, Valerie L; Luke, Barbara; Brown, Morton B; Alvero, Ruben; Frattarelli, John L; Usadi, Rebecca; Grainger, David A; Armstrong, Alicia Y

    2010-09-01

To evaluate factors predictive of clinical pregnancy and of pregnancy loss from assisted reproductive technology (ART) using data from the Society for Assisted Reproductive Technology database for 2004-2006. Retrospective cohort. Clinic-based data. The study population included 225,889 fresh embryo transfer cycles using autologous oocytes and partner semen. None. Clinical intrauterine gestation (presence of gestational sac) and live birth (≥22 weeks gestation and ≥300 g birth weight). Increasing maternal age was significantly associated with a reduced odds of conception and increased fetal loss until 19 weeks gestation, but not with later pregnancy loss. Intracytoplasmic sperm injection (ICSI), assisted hatching, and increasing number of embryos transferred had significant positive effects on the odds of conception and pregnancy continuation through the first trimester, but did not affect the risk of later loss. Blacks, Asians, and Hispanics had significantly lower odds of clinical pregnancy compared with whites. Also compared with whites, Hispanics and Asians had a significantly greater risk of pregnancy loss in the second and third trimesters, and blacks had a significantly greater risk of pregnancy loss in all trimesters. Certain demographic and ART treatment parameters influenced chance of conception and early pregnancy loss, whereas black race and Hispanic ethnicity were also significantly associated with late pregnancy loss in ART-conceived pregnancies. Copyright (c) 2010 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.

  18. Counterexamples in probability

    CERN Document Server

    Stoyanov, Jordan M

    2013-01-01

While most mathematical examples illustrate the truth of a statement, counterexamples demonstrate a statement's falsity. Enjoyable topics of study, counterexamples are valuable tools for teaching and learning. The definitive book on the subject with regard to probability, this third edition features the author's revisions and corrections plus a substantial new appendix.

  19. Epistemology and Probability

    CERN Document Server

    Plotnitsky, Arkady

    2010-01-01

    Offers an exploration of the relationships between epistemology and probability in the work of Niels Bohr, Werner Heisenberg, and Erwin Schrodinger; in quantum mechanics; and in modern physics. This book considers the implications of these relationships and of quantum theory for our understanding of the nature of thinking and knowledge in general

  20. Transition probabilities for atoms

    International Nuclear Information System (INIS)

    Kim, Y.K.

    1980-01-01

    Current status of advanced theoretical methods for transition probabilities for atoms and ions is discussed. An experiment on the f values of the resonance transitions of the Kr and Xe isoelectronic sequences is suggested as a test for the theoretical methods

  1. The probability of the false vacuum decay

    International Nuclear Information System (INIS)

    Kiselev, V.; Selivanov, K.

    1983-01-01

The closed expression for the probability of the false vacuum decay in (1+1) dimensions is given. The probability of false vacuum decay is expressed as the product of an exponential quasiclassical factor and a functional determinant of the given form. The method for calculation of this determinant is developed and a complete answer for (1+1) dimensions is given

  2. A Comparison of Tropical Storm (TS) and Non-TS Gust Factors for Assessing Peak Wind Probabilities at the Eastern Range

    Science.gov (United States)

    Merceret, Francis J.; Crawford, Winifred C.

    2010-01-01

    Peak wind speed is an important forecast element to ensure the safety of personnel and flight hardware at Kennedy Space Center (KSC) and the Cape Canaveral Air Force Station (CCAFS) in East-Central Florida. The 45th Weather Squadron (45 WS), the organization that issues forecasts for the KSC/CCAFS area, finds that peak winds are more difficult to forecast than mean winds. This difficulty motivated the 45 WS to request two independent studies. The first (Merceret 2009) was the development of a reliable model for gust factors (GF) relating the peak to the mean wind speed in tropical storms (TS). The second (Lambert et al. 2008) was a climatological study of non-TS cool season (October-April) mean and peak wind speeds by the Applied Meteorology Unit (AMU; Bauman et al. 2004) without the use of GF. Both studies presented their statistics as functions of mean wind speed and height. Most of the few comparisons of TS and non-TS GF in the literature suggest that non-TS GF at a given height and mean wind speed are smaller than the corresponding TS GF. The investigation reported here converted the non-TS peak wind statistics calculated by the AMU to the equivalent GF statistics and compared them with the previous TS GF results. The advantage of this effort over all previously reported studies of its kind is that the TS and non-TS data were taken from the same towers in the same locations. This eliminates differing surface attributes, including roughness length and thermal properties, as a major source of variance in the comparison. The goal of this study is two-fold: to determine the relationship between the non-TS and TS GF and their standard deviations (GFSD) and to determine if models similar to those developed for TS data in Merceret (2009) could be developed for the non-TS environment. 
The results are consistent with the literature, but include much more detailed, quantitative information on the nature of the relationship between TS and non-TS GF and GFSD as a function
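
A gust factor in the sense used above is simply the ratio of the peak to the mean wind speed over an averaging period. A minimal sketch of the computation (hypothetical wind samples, not the KSC/CCAFS tower data):

```python
def gust_factor(samples):
    """Gust factor GF: ratio of the peak (maximum) wind speed to the
    mean wind speed over one averaging period."""
    return max(samples) / (sum(samples) / len(samples))

# Hypothetical one-period record of wind speeds (m/s)
winds = [7.2, 8.1, 7.9, 12.4, 6.9, 7.5]
gf = gust_factor(winds)  # peak 12.4 m/s over mean ~8.33 m/s
```

In the studies compared above, GF statistics (and their standard deviations, GFSD) are then binned by mean wind speed and measurement height.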

  3. Negative probability in the framework of combined probability

    OpenAIRE

    Burgin, Mark

    2013-01-01

    Negative probability has found diverse applications in theoretical physics. Thus, construction of sound and rigorous mathematical foundations for negative probability is important for physics. There are different axiomatizations of conventional probability. So, it is natural that negative probability also has different axiomatic frameworks. In the previous publications (Burgin, 2009; 2010), negative probability was mathematically formalized and rigorously interpreted in the context of extende...

  4. Bayesian Probability Theory

    Science.gov (United States)

    von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo

    2014-06-01

    Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.

  5. Waste Package Misload Probability

    International Nuclear Information System (INIS)

    Knudsen, J.K.

    2001-01-01

    The objective of this calculation is to calculate the probability of occurrence of fuel assembly (FA) misloads (i.e., an FA placed in the wrong location) and FA damage during FA movements. The scope of this calculation is provided by the information obtained from the Framatome ANP 2001a report. The first step in this calculation is to categorize each fuel-handling event that occurred at nuclear power plants. The different categories are based on FAs being damaged or misloaded. The next step is to determine the total number of FAs involved in each event. Using this information, a probability of occurrence is calculated for FA misload and FA damage events. This calculation is an expansion of preliminary work performed by Framatome ANP 2001a

  6. Probability theory and applications

    CERN Document Server

    Hsu, Elton P

    1999-01-01

    This volume, with contributions by leading experts in the field, is a collection of lecture notes of the six minicourses given at the IAS/Park City Summer Mathematics Institute. It introduces advanced graduates and researchers in probability theory to several of the currently active research areas in the field. Each course is self-contained with references and contains basic materials and recent results. Topics include interacting particle systems, percolation theory, analysis on path and loop spaces, and mathematical finance. The volume gives a balanced overview of the current status of probability theory. An extensive bibliography for further study and research is included. This unique collection presents several important areas of current research and a valuable survey reflecting the diversity of the field.

  7. Paradoxes in probability theory

    CERN Document Server

    Eckhardt, William

    2013-01-01

    Paradoxes provide a vehicle for exposing misinterpretations and misapplications of accepted principles. This book discusses seven paradoxes surrounding probability theory. Some remain the focus of controversy; others have allegedly been solved, but the accepted solutions are demonstrably incorrect. Each paradox is shown to rest on one or more fallacies. Instead of the esoteric, idiosyncratic, and untested methods that have been brought to bear on these problems, the book invokes uncontroversial probability principles, acceptable both to frequentists and subjectivists. The philosophical disputation inspired by these paradoxes is shown to be misguided and unnecessary; for instance, startling claims concerning human destiny and the nature of reality are directly related to fallacious reasoning in a betting paradox, and a problem analyzed in philosophy journals is resolved by means of a computer program.

  8. Measurement uncertainty and probability

    CERN Document Server

    Willink, Robin

    2013-01-01

    A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.

  9. Model uncertainty and probability

    International Nuclear Information System (INIS)

    Parry, G.W.

    1994-01-01

    This paper discusses the issue of model uncertainty. The use of probability as a measure of an analyst's uncertainty as well as a means of describing random processes has caused some confusion, even though the two uses are representing different types of uncertainty with respect to modeling a system. The importance of maintaining the distinction between the two types is illustrated with a simple example

  10. Retrocausality and conditional probability

    International Nuclear Information System (INIS)

    Stuart, C.I.J.M.

    1989-01-01

    Costa de Beauregard has proposed that physical causality be identified with conditional probability. The proposal is shown to be vulnerable on two accounts. The first, though mathematically trivial, seems to be decisive so far as the current formulation of the proposal is concerned. The second lies in a physical inconsistency which seems to have its source in a Copenhagen-like disavowal of realism in quantum mechanics. 6 refs. (Author)

  11. Probability via expectation

    CERN Document Server

    Whittle, Peter

    1992-01-01

    This book is a complete revision of the earlier work Probability which appeared in 1970. While revised so radically and incorporating so much new material as to amount to a new text, it preserves both the aim and the approach of the original. That aim was stated as the provision of a 'first text in probability, demanding a reasonable but not extensive knowledge of mathematics, and taking the reader to what one might describe as a good intermediate level'. In doing so it attempted to break away from stereotyped applications, and consider applications of a more novel and significant character. The particular novelty of the approach was that expectation was taken as the prime concept, and the concept of expectation axiomatized rather than that of a probability measure. In the preface to the original text of 1970 (reproduced below, together with that to the Russian edition of 1982) I listed what I saw as the advantages of the approach in as unlaboured a fashion as I could. I also took the view that the text...

  12. Probability mapping of contaminants

    Energy Technology Data Exchange (ETDEWEB)

    Rautman, C.A.; Kaplan, P.G. [Sandia National Labs., Albuquerque, NM (United States); McGraw, M.A. [Univ. of California, Berkeley, CA (United States); Istok, J.D. [Oregon State Univ., Corvallis, OR (United States); Sigda, J.M. [New Mexico Inst. of Mining and Technology, Socorro, NM (United States)

    1994-04-01

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. The probability mapping approach illustrated in this paper appears to offer site operators a reasonable, quantitative methodology for many environmental remediation decisions and allows evaluation of the risk associated with those decisions. For example, output from this approach can be used in quantitative, cost-based decision models for evaluating possible site characterization and/or remediation plans, resulting in selection of the risk-adjusted, least-cost alternative. The methodology is completely general, and the techniques are applicable to a wide variety of environmental restoration projects. The probability-mapping approach is illustrated by application to a contaminated site at the former DOE Feed Materials Production Center near Fernald, Ohio. Soil geochemical data, collected as part of the Uranium-in-Soils Integrated Demonstration Project, have been used to construct a number of geostatistical simulations of potential contamination for parcels approximately the size of a selective remediation unit (the 3-m width of a bulldozer blade). Each such simulation accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination (potential clean-up or personnel-hazard thresholds).
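
The post-processing step described above, in which many equally likely simulations are reduced to an exceedance-probability map, can be sketched as follows (hypothetical realizations and threshold, not the Fernald data):

```python
def exceedance_probability(simulations, threshold):
    """For each map cell, the fraction of equally likely simulated
    realizations whose value exceeds the threshold."""
    n_sims = len(simulations)
    n_cells = len(simulations[0])
    return [sum(sim[c] > threshold for sim in simulations) / n_sims
            for c in range(n_cells)]

# Three hypothetical geostatistical realizations of contamination over 4 cells
sims = [
    [5.0, 12.0, 30.0, 2.0],
    [8.0, 15.0, 25.0, 1.0],
    [6.0,  9.0, 40.0, 3.0],
]
# threshold plays the role of a clean-up or personnel-hazard level
prob_map = exceedance_probability(sims, threshold=10.0)
```

Each cell of `prob_map` is then directly interpretable as the probability of exceeding the specified contamination level at that location.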

  13. Probability mapping of contaminants

    International Nuclear Information System (INIS)

    Rautman, C.A.; Kaplan, P.G.; McGraw, M.A.; Istok, J.D.; Sigda, J.M.

    1994-01-01

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. The probability mapping approach illustrated in this paper appears to offer site operators a reasonable, quantitative methodology for many environmental remediation decisions and allows evaluation of the risk associated with those decisions. For example, output from this approach can be used in quantitative, cost-based decision models for evaluating possible site characterization and/or remediation plans, resulting in selection of the risk-adjusted, least-cost alternative. The methodology is completely general, and the techniques are applicable to a wide variety of environmental restoration projects. The probability-mapping approach is illustrated by application to a contaminated site at the former DOE Feed Materials Production Center near Fernald, Ohio. Soil geochemical data, collected as part of the Uranium-in-Soils Integrated Demonstration Project, have been used to construct a number of geostatistical simulations of potential contamination for parcels approximately the size of a selective remediation unit (the 3-m width of a bulldozer blade). Each such simulation accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination (potential clean-up or personnel-hazard thresholds)

  14. Probability of causation approach

    International Nuclear Information System (INIS)

    Jose, D.E.

    1988-01-01

    Probability of causation (PC) is sometimes viewed as a great improvement by those persons who are not happy with the present rulings of courts in radiation cases. The author does not share that hope and expects that PC will not play a significant role in these issues for at least the next decade. If it is ever adopted in a legislative compensation scheme, it will be used in a way that is unlikely to please most scientists. Consequently, PC is a false hope for radiation scientists, and its best contribution may well lie in some of the spin-off effects, such as an influence on medical practice

  15. Generalized Probability Functions

    Directory of Open Access Journals (Sweden)

    Alexandre Souto Martinez

    2009-01-01

    Full Text Available From the integration of nonsymmetrical hyperboles, a one-parameter generalization of the logarithmic function is obtained. Inverting this function, one obtains the generalized exponential function. Motivated by mathematical curiosity, we show that these generalized functions are suitable to generalize some probability density functions (pdfs). A very reliable rank distribution can be conveniently described by the generalized exponential function. Finally, we turn our attention to the generalization of one- and two-tail stretched exponential functions. We obtain, as particular cases, the generalized error function, the Zipf-Mandelbrot pdf, and the generalized Gaussian and Laplace pdfs. Their cumulative functions and moments were also obtained analytically.
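
In one common convention (an assumption here; the paper's exact parameterization may differ), the one-parameter generalized logarithm is ln_q(x) = (x^q − 1)/q, which recovers ln(x) as q → 0, and its inverse is exp_q(x) = (1 + qx)^(1/q):

```python
import math

def gen_log(x, q):
    """One-parameter generalized logarithm (x**q - 1)/q,
    reducing to math.log(x) in the limit q -> 0."""
    if q == 0:
        return math.log(x)
    return (x ** q - 1.0) / q

def gen_exp(x, q):
    """Inverse of gen_log: (1 + q*x)**(1/q), reducing to math.exp(x)."""
    if q == 0:
        return math.exp(x)
    return (1.0 + q * x) ** (1.0 / q)
```

With these two functions one can generalize, e.g., an exponential pdf by replacing exp with exp_q, which is the spirit of the construction described in the abstract.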

  16. Probability in High Dimension

    Science.gov (United States)

    2014-06-30


  17. Probable maximum flood control

    International Nuclear Information System (INIS)

    DeGabriele, C.E.; Wu, C.L.

    1991-11-01

    This study proposes preliminary design concepts to protect the waste-handling facilities and all shaft and ramp entries to the underground from the probable maximum flood (PMF) in the current design configuration for the proposed Nevada Nuclear Waste Storage Investigation (NNWSI) repository. Flood protection provisions were furnished by the United States Bureau of Reclamation (USBR) or developed from USBR data. Proposed flood protection provisions include site grading, drainage channels, and diversion dikes. Figures are provided to show these proposed flood protection provisions at each area investigated. These areas are the central surface facilities (including the waste-handling building and waste treatment building), tuff ramp portal, waste ramp portal, men-and-materials shaft, emplacement exhaust shaft, and exploratory shafts facility

  18. Error probabilities in default Bayesian hypothesis testing

    NARCIS (Netherlands)

    Gu, Xin; Hoijtink, Herbert; Mulder, J.

    2016-01-01

    This paper investigates the classical type I and type II error probabilities of default Bayes factors for a Bayesian t test. Default Bayes factors quantify the relative evidence between the null hypothesis and the unrestricted alternative hypothesis without needing to specify prior distributions for

  19. Probability and rational choice

    Directory of Open Access Journals (Sweden)

    David Botting

    2014-05-01

    Full Text Available http://dx.doi.org/10.5007/1808-1711.2014v18n1p1 In this paper I will discuss the rationality of reasoning about the future. There are two things that we might like to know about the future: which hypotheses are true and what will happen next. To put it in philosophical language, I aim to show that there are methods by which inferring to a generalization (selecting a hypothesis) and inferring to the next instance (singular predictive inference) can be shown to be normative and the method itself shown to be rational, where this is due in part to being based on evidence (although not in the same way) and in part on a prior rational choice. I will also argue that these two inferences have been confused, being distinct not only conceptually (as nobody disputes) but also in their results (the value given to the probability of the hypothesis being not in general that given to the next instance), and that methods that are adequate for one are not by themselves adequate for the other. A number of debates over method founder on this confusion and do not show what the debaters think they show.

  20. COVAL, Compound Probability Distribution for Function of Probability Distribution

    International Nuclear Information System (INIS)

    Astolfi, M.; Elbaz, J.

    1979-01-01

    1 - Nature of the physical problem solved: Computation of the probability distribution of a function of variables, given the probability distribution of the variables themselves. 'COVAL' has been applied to reliability analysis of a structure subject to random loads. 2 - Method of solution: Numerical transformation of probability distributions
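
COVAL's task, computing the distribution of a function of random variables from the distributions of the variables themselves, can be approximated by plain Monte Carlo sampling (a generic sketch of the problem, not COVAL's numerical transformation method):

```python
import random

def function_distribution(f, samplers, n=100_000, seed=42):
    """Empirically approximate the distribution of f(X1, ..., Xk),
    given one sampler per input variable."""
    rng = random.Random(seed)
    return [f(*(s(rng) for s in samplers)) for _ in range(n)]

# Reliability-style example (hypothetical numbers): load L ~ Normal(100, 10),
# strength S ~ Normal(150, 20); the margin M = S - L fails when M < 0.
samples = function_distribution(
    lambda l, s: s - l,
    [lambda r: r.gauss(100, 10), lambda r: r.gauss(150, 20)],
)
p_fail = sum(m < 0 for m in samples) / len(samples)
```

Here M is Normal(50, √500), so the failure probability is about 1.3%; the empirical estimate converges to that value as n grows.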

  1. A Tale of Two Probabilities

    Science.gov (United States)

    Falk, Ruma; Kendig, Keith

    2013-01-01

    Two contestants debate the notorious probability problem of the sex of the second child. The conclusions boil down to explication of the underlying scenarios and assumptions. Basic principles of probability theory are highlighted.

  2. Introduction to probability with R

    CERN Document Server

    Baclawski, Kenneth

    2008-01-01

    FOREWORD; PREFACE; Sets, Events, and Probability; The Algebra of Sets; The Bernoulli Sample Space; The Algebra of Multisets; The Concept of Probability; Properties of Probability Measures; Independent Events; The Bernoulli Process; The R Language; Finite Processes; The Basic Models; Counting Rules; Computing Factorials; The Second Rule of Counting; Computing Probabilities; Discrete Random Variables; The Bernoulli Process: Tossing a Coin; The Bernoulli Process: Random Walk; Independence and Joint Distributions; Expectations; The Inclusion-Exclusion Principle; General Random Variable

  3. A first course in probability

    CERN Document Server

    Ross, Sheldon

    2014-01-01

    A First Course in Probability, Ninth Edition, features clear and intuitive explanations of the mathematics of probability theory, outstanding problem sets, and a variety of diverse examples and applications. This book is ideal for an upper-level undergraduate or graduate level introduction to probability for math, science, engineering and business students. It assumes a background in elementary calculus.

  4. Characteristic length of the knotting probability revisited

    International Nuclear Information System (INIS)

    Uehara, Erica; Deguchi, Tetsuo

    2015-01-01

    We present a self-avoiding polygon (SAP) model for circular DNA in which the radius of impermeable cylindrical segments corresponds to the screening length of double-stranded DNA surrounded by counter ions. For the model we evaluate, through simulation, the probability that a generated SAP with N segments has a given knot K. We call it the knotting probability of a knot K with N segments for the SAP model. We show that when N is large the most significant factor in the knotting probability is the exponentially decaying part exp(−N/N_K), where the estimates of the parameter N_K are consistent with the same value for all the different knots we investigated. We thus call it the characteristic length of the knotting probability. We give formulae expressing the characteristic length as a function of the cylindrical radius r_ex, i.e. the screening length of double-stranded DNA. (paper)
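
Estimating the characteristic length from knotting probabilities amounts to a log-linear fit of the exponentially decaying part. A sketch on synthetic data generated with a known N_K (not the paper's SAP simulation results):

```python
import math

def fit_characteristic_length(Ns, probs):
    """Least-squares slope of log P(N) versus N; for
    P(N) ~ C * exp(-N/N_K) the characteristic length is -1/slope."""
    ys = [math.log(p) for p in probs]
    n = len(Ns)
    xbar = sum(Ns) / n
    ybar = sum(ys) / n
    slope = (sum((x - xbar) * (y - ybar) for x, y in zip(Ns, ys))
             / sum((x - xbar) ** 2 for x in Ns))
    return -1.0 / slope

# Synthetic knotting probabilities with N_K = 300: P(N) = 0.5 * exp(-N/300)
Ns = [100, 200, 400, 800]
probs = [0.5 * math.exp(-N / 300) for N in Ns]
N_K = fit_characteristic_length(Ns, probs)
```

On real simulation data the fit would be restricted to the large-N regime, where the exponential factor dominates.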

  5. Assessing the clinical probability of pulmonary embolism

    International Nuclear Information System (INIS)

    Miniati, M.; Pistolesi, M.

    2001-01-01

    Clinical assessment is a cornerstone of the recently validated diagnostic strategies for pulmonary embolism (PE). Although the diagnostic yield of individual symptoms, signs, and common laboratory tests is limited, the combination of these variables, either by empirical assessment or by a prediction rule, can be used to express a clinical probability of PE. The latter may serve as pretest probability to predict the probability of PE after further objective testing (posterior or post-test probability). Over the last few years, attempts have been made to develop structured prediction models for PE. In a Canadian multicenter prospective study, the clinical probability of PE was rated as low, intermediate, or high according to a model which included assessment of presenting symptoms and signs, risk factors, and presence or absence of an alternative diagnosis at least as likely as PE. Recently, a simple clinical score was developed to stratify outpatients with suspected PE into groups with low, intermediate, or high clinical probability. Logistic regression was used to predict parameters associated with PE. A score ≤ 4 identified patients with low probability, of whom 10% had PE. The prevalence of PE in patients with intermediate (score 5-8) and high probability (score ≥ 9) was 38 and 81%, respectively. As opposed to the Canadian model, this clinical score is standardized. The predictor variables identified in the model, however, were derived from a database of emergency ward patients. This model may, therefore, not be valid in assessing the clinical probability of PE in inpatients. In the PISA-PED study, a clinical diagnostic algorithm was developed which rests on the identification of three relevant clinical symptoms and on their association with electrocardiographic and/or radiographic abnormalities specific for PE. Among patients who, according to the model, had been rated as having a high clinical probability, the prevalence of proven PE was 97%, while it was 3
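
The score-based stratification reported above maps directly to a lookup (thresholds and prevalences are those quoted in the abstract; this is an illustration, not a validated clinical calculator):

```python
def pe_clinical_probability(score):
    """Stratify a clinical score for pulmonary embolism into the three
    classes reported in the abstract (score <= 4: low, 5-8: intermediate,
    >= 9: high), returning the class and the reported PE prevalence."""
    if score <= 4:
        return ("low", 0.10)
    elif score <= 8:
        return ("intermediate", 0.38)
    return ("high", 0.81)
```

The returned prevalence is the pretest probability that would feed into interpretation of subsequent objective testing.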

  6. K-forbidden transition probabilities

    International Nuclear Information System (INIS)

    Saitoh, T.R.; Sletten, G.; Bark, R.A.; Hagemann, G.B.; Herskind, B.; Saitoh-Hashimoto, N.; Tsukuba Univ., Ibaraki

    2000-01-01

    Reduced hindrance factors of K-forbidden transitions are compiled for nuclei with A ≈ 180 where γ-vibrational states are observed. Correlations between these reduced hindrance factors and Coriolis forces, statistical level mixing and γ-softness have been studied. It is demonstrated that the K-forbidden transition probabilities are related to γ-softness. The decay of the high-K bandheads has been studied by means of the two-state mixing, which would be induced by the γ-softness, with the use of a number of K-forbidden transitions compiled in the present work, where high-K bandheads are depopulated by both E2 and ΔI=1 transitions. The validity of the two-state mixing scheme has been examined by using the proposed identity of the B(M1)/B(E2) ratios of transitions depopulating high-K bandheads and levels of low-K bands. A break down of the identity might indicate that other levels would mediate transitions between high- and low-K states. (orig.)

  7. A brief introduction to probability.

    Science.gov (United States)

    Di Paola, Gioacchino; Bertani, Alessandro; De Monte, Lavinia; Tuzzolino, Fabio

    2018-02-01

    The theory of probability has been debated for centuries: back in 1600, French mathematicians used the rules of probability to place and win bets. Subsequently, the knowledge of probability has significantly evolved and is now an essential tool for statistics. In this paper, the basic theoretical principles of probability will be reviewed, with the aim of facilitating the comprehension of statistical inference. After a brief general introduction on probability, we will review the concept of the "probability distribution" that is a function providing the probabilities of occurrence of different possible outcomes of a categorical or continuous variable. Specific attention will be focused on the normal distribution, which is the most relevant distribution applied to statistical analysis.
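
As a concrete instance of a probability distribution, the normal CDF can be evaluated with the standard library alone (a generic illustration, not code from the paper):

```python
import math

def normal_cdf(x, mu=0.0, sigma=1.0):
    """P(X <= x) for X ~ Normal(mu, sigma), via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

# Probability that a standard normal falls within +/- 1.96: about 0.95
central = normal_cdf(1.96) - normal_cdf(-1.96)
```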

  8. Propensity, Probability, and Quantum Theory

    Science.gov (United States)

    Ballentine, Leslie E.

    2016-08-01

    Quantum mechanics and probability theory share one peculiarity. Both have well established mathematical formalisms, yet both are subject to controversy about the meaning and interpretation of their basic concepts. Since probability plays a fundamental role in QM, the conceptual problems of one theory can affect the other. We first classify the interpretations of probability into three major classes: (a) inferential probability, (b) ensemble probability, and (c) propensity. Class (a) is the basis of inductive logic; (b) deals with the frequencies of events in repeatable experiments; (c) describes a form of causality that is weaker than determinism. An important, but neglected, paper by P. Humphreys demonstrated that propensity must differ mathematically, as well as conceptually, from probability, but he did not develop a theory of propensity. Such a theory is developed in this paper. Propensity theory shares many, but not all, of the axioms of probability theory. As a consequence, propensity supports the Law of Large Numbers from probability theory, but does not support Bayes theorem. Although there are particular problems within QM to which any of the classes of probability may be applied, it is argued that the intrinsic quantum probabilities (calculated from a state vector or density matrix) are most naturally interpreted as quantum propensities. This does not alter the familiar statistical interpretation of QM. But the interpretation of quantum states as representing knowledge is untenable. Examples show that a density matrix fails to represent knowledge.

  9. Conditional probability on MV-algebras

    Czech Academy of Sciences Publication Activity Database

    Kroupa, Tomáš

    2005-01-01

    Roč. 149, č. 2 (2005), s. 369-381 ISSN 0165-0114 R&D Projects: GA AV ČR IAA2075302 Institutional research plan: CEZ:AV0Z10750506 Keywords : conditional probability * tribe * MV-algebra Subject RIV: BA - General Mathematics Impact factor: 1.039, year: 2005

  10. Prediction and probability in sciences

    International Nuclear Information System (INIS)

    Klein, E.; Sacquin, Y.

    1998-01-01

    This book reports the seven presentations made at the third meeting 'physics and fundamental questions', whose theme was probability and prediction. The concept of probability, invented to apprehend random phenomena, has become an important branch of mathematics, and its application range spreads from radioactivity to species evolution via cosmology and the management of very weak risks. The notion of probability is the basis of quantum mechanics and is thus bound to the very nature of matter. The 7 topics are: - radioactivity and probability, - statistical and quantum fluctuations, - quantum mechanics as a generalized probability theory, - probability and the irrational efficiency of mathematics, - can we foresee the future of the universe?, - chance, eventuality and necessity in biology, - how to manage weak risks? (A.C.)

  11. Applied probability and stochastic processes

    CERN Document Server

    Sumita, Ushio

    1999-01-01

    Applied Probability and Stochastic Processes is an edited work written in honor of Julien Keilson. This volume has attracted a host of scholars in applied probability, who have made major contributions to the field, and have written survey and state-of-the-art papers on a variety of applied probability topics, including, but not limited to: perturbation method, time reversible Markov chains, Poisson processes, Brownian techniques, Bayesian probability, optimal quality control, Markov decision processes, random matrices, queueing theory and a variety of applications of stochastic processes. The book has a mixture of theoretical, algorithmic, and application chapters providing examples of the cutting-edge work that Professor Keilson has done or influenced over the course of his highly-productive and energetic career in applied probability and stochastic processes. The book will be of interest to academic researchers, students, and industrial practitioners who seek to use the mathematics of applied probability i...

  12. Seroepidemiological Survey of HTLV-I/II in Blood Donors of Mazandaran in 1999.

    OpenAIRE

    N. Tabarestani; R. F. Hosseini; ِA. Ajami

    2000-01-01

    Summary Background and purpose: HTLV-I/II viruses of the Retroviridae family are known to be the causes of various diseases. They are transmitted by blood transfusion, sexual contact, and the breast milk of infected mothers. Since these viral infections are endemic in certain regions, epidemiological studies appear to be necessary in the country. Blood donors from different transfusion centers were investigated in a pilot study. Materials and Methods: In this descriptive study, blood samples of 180...

  13. Quantification of HTLV-I proviral load in experimentally infected rabbits

    Directory of Open Access Journals (Sweden)

    Kindt Thomas J

    2005-05-01

    Full Text Available Abstract Background Levels of proviral load in HTLV-1 infected patients correlate with clinical outcome and are reasonably prognostic. Adaptation of proviral load measurement techniques is examined here for use in an experimental rabbit model of HTLV-1 infection. Initial efforts sought to correlate proviral load with route and dose of inoculation and with clinical outcome in this model. These methods contribute to our continuing goal of using the model to test treatments that alleviate virus infection. Results A real-time PCR assay was used to measure proviral load in blood and tissue samples from a series of rabbits infected using HTLV-1 inocula prepared as either cell-free virus particles, infected cells or blood, or by naked DNA injection. Proviral loads from asymptomatically infected rabbits showed levels corresponding to those reported for human patients with clinically silent HTLV-1 infections. Proviral load was comparably increased in 50% of experimentally infected rabbits that developed either spontaneous benign or malignant tumors while infected. Similarly elevated provirus was found in organs of rabbits with experimentally induced acute leukemia/lymphoma-like disease. Levels of provirus in organs taken at necropsy varied widely suggesting that reservoirs of infections exist in non-lymphoid organs not traditionally thought to be targets for HTLV-1. Conclusion Proviral load measurement is a valuable enhancement to the rabbit model for HTLV-1 infection providing a metric to monitor clinical status of the infected animals as well as a means for the testing of treatment to combat infection. In some cases proviral load in blood did not reflect organ proviral levels, revealing a limitation of this method for monitoring health status of HTLV-1 infected individuals.
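
Real-time PCR proviral load is conventionally derived from a standard curve relating the threshold cycle (Ct) to log10 copy number, normalized to cell equivalents from a reference gene. A sketch with hypothetical curve constants (the study's actual assay parameters are not given here):

```python
def copies_from_ct(ct, intercept=40.0, slope=-3.32):
    """Copy number from a qPCR standard curve Ct = intercept + slope*log10(copies).
    The intercept and slope here are hypothetical; a slope near -3.32
    corresponds to ~100% amplification efficiency."""
    return 10 ** ((ct - intercept) / slope)

def proviral_load_per_100_cells(tax_ct, ref_ct):
    """Provirus copies per 100 cells, normalizing HTLV-1 target copies to
    cell equivalents from a single-copy-per-haploid-genome reference gene
    (two reference copies per diploid cell)."""
    tax_copies = copies_from_ct(tax_ct)
    cells = copies_from_ct(ref_ct) / 2.0
    return 100.0 * tax_copies / cells
```

Expressing load per 100 cells, as in human studies, is what allows the rabbit values to be compared with those reported for asymptomatic human carriers.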

  14. Seroprevalence of HTLV -I/II amongst Blood Donors in Osogbo ...

    African Journals Online (AJOL)

    Background: HTLV type I/II is a blood borne infection that can be transmitted via blood transfusion. Objective: To determine the seroprevalence of human T – lymphotropic virus among blood donors in Osogbo, Nigeria. Methods: Diagnosis of Human T. Lymphotropic virus antigen was carried out on 372 serum samples ...

  15. Poisson Processes in Free Probability

    OpenAIRE

    An, Guimei; Gao, Mingchu

    2015-01-01

    We prove a multidimensional Poisson limit theorem in free probability, and define joint free Poisson distributions in a non-commutative probability space. We define (compound) free Poisson processes explicitly, in analogy with the definitions of (compound) Poisson processes in classical probability. We prove that the sum of finitely many freely independent compound free Poisson processes is a compound free Poisson process. We give a step-by-step procedure for constructing a (compound) free Poisso...

  16. PROBABILITY SURVEYS, CONDITIONAL PROBABILITIES AND ECOLOGICAL RISK ASSESSMENT

    Science.gov (United States)

    We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...

  17. Calculating the Probability of Returning a Loan with Binary Probability Models

    Directory of Open Access Journals (Sweden)

    Julian Vasilev

    2014-12-01

    Full Text Available The purpose of this article is to give a new approach to calculating the probability of returning a loan. Many factors affect the value of this probability. In this article, some of the influencing factors are established using statistical and econometric models. The main approach consists of applying probit and logit models in loan management institutions, giving a new aspect to credit risk analysis. Calculating the probability of returning a loan is a difficult task. We assume that specific data fields concerning the contract (month of signing, year of signing, given sum) and data fields concerning the borrower of the loan (month of birth, year of birth (age), gender, region where he/she lives) may be independent variables in a binary logistic model with the dependent variable “the probability of returning a loan”. It is proved that the month of signing a contract, the year of signing a contract, and the gender and age of the loan owner do not affect the probability of returning a loan. It is proved that the probability of returning a loan depends on the sum of the contract, the remoteness of the loan owner and the month of birth. The probability of returning a loan increases with the given sum, decreases with the proximity of the customer, increases for people born at the beginning of the year and decreases for people born at the end of the year.
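
    The binary-model approach described above can be sketched as follows on synthetic data, with assumed effect signs loosely mirroring the reported findings (repayment more likely for larger sums and remoter customers, less likely for late birth months); none of the numbers come from the paper.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic, illustrative loan data -- not the paper's bank records.
n = 2000
X = np.column_stack([
    rng.uniform(1, 50, n),        # given sum (thousands)
    rng.uniform(0, 300, n),       # remoteness of the loan owner (km)
    rng.integers(1, 13, n),       # month of birth
])
# Assumed "true" effects: repayment odds rise with the sum and remoteness,
# fall for late birth months.
logit = -0.5 + 0.06 * X[:, 0] + 0.002 * X[:, 1] - 0.05 * X[:, 2]
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

model = LogisticRegression(max_iter=1000).fit(X, y)
# Fitted probability of returning a loan for a small vs a large sum,
# other predictors held fixed:
p_small = model.predict_proba([[5.0, 100.0, 6]])[0, 1]
p_large = model.predict_proba([[45.0, 100.0, 6]])[0, 1]
```

    A probit variant would only change the link function; the workflow of fitting and reading off individual repayment probabilities is the same.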

  18. Bayesian estimation of core-melt probability

    International Nuclear Information System (INIS)

    Lewis, H.W.

    1984-01-01

    A very simple application of the canonical Bayesian algorithm is made to the problem of estimating the probability of core melt in a commercial power reactor. An approximation to the results of the Rasmussen study on reactor safety is used as the prior distribution, and the observation that there has been no core melt yet is used as the single experiment. The result is a substantial decrease in the mean probability of core melt, by factors of 2 to 4 for reasonable choices of parameters. The purpose is to illustrate the procedure, not to argue for the decrease.
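
    The conjugate update the abstract alludes to can be written in a few lines. The Gamma prior parameters and the count of accumulated reactor-years below are illustrative stand-ins, not the paper's inputs.

```python
# Gamma-Poisson sketch: a Gamma(a, b) prior on the core-melt rate
# (events per reactor-year) updated with "0 events in T reactor-years".
# All numbers are illustrative, loosely anchored to a ~1e-4/yr prior mean.
a, b = 1.0, 1.0e4             # prior mean a/b = 1e-4 per reactor-year
T = 3.0e3                     # assumed reactor-years observed, zero core melts

prior_mean = a / b
posterior_mean = a / (b + T)  # posterior is Gamma(a, b + T) after 0 events
reduction = prior_mean / posterior_mean  # factor by which the mean drops
```

    With these placeholder numbers the mean rate drops by a factor of (b + T)/b = 1.3; larger assumed operating experience or a more diffuse prior yields the factor-of-2-to-4 range the abstract reports.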

  19. Expected utility with lower probabilities

    DEFF Research Database (Denmark)

    Hendon, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte

    1994-01-01

    An uncertain and not just risky situation may be modeled using so-called belief functions assigning lower probabilities to subsets of outcomes. In this article we extend the von Neumann-Morgenstern expected utility theory from probability measures to belief functions. We use this theory...

  20. Invariant probabilities of transition functions

    CERN Document Server

    Zaharopol, Radu

    2014-01-01

    The structure of the set of all the invariant probabilities and the structure of various types of individual invariant probabilities of a transition function are two topics of significant interest in the theory of transition functions, and are studied in this book. The results obtained are useful in ergodic theory and the theory of dynamical systems, which, in turn, can be applied in various other areas (like number theory). They are illustrated using transition functions defined by flows, semiflows, and one-parameter convolution semigroups of probability measures. In this book, all results on transition probabilities that have been published by the author between 2004 and 2008 are extended to transition functions. The proofs of the results obtained are new. For transition functions that satisfy very general conditions the book describes an ergodic decomposition that provides relevant information on the structure of the corresponding set of invariant probabilities. Ergodic decomposition means a splitting of t...

  1. Introduction to probability with Mathematica

    CERN Document Server

    Hastings, Kevin J

    2009-01-01

    Discrete Probability: The Cast of Characters; Properties of Probability; Simulation; Random Sampling; Conditional Probability; Independence. Discrete Distributions: Discrete Random Variables, Distributions, and Expectations; Bernoulli and Binomial Random Variables; Geometric and Negative Binomial Random Variables; Poisson Distribution; Joint, Marginal, and Conditional Distributions; More on Expectation. Continuous Probability: From the Finite to the (Very) Infinite; Continuous Random Variables and Distributions; Continuous Expectation. Continuous Distributions: The Normal Distribution; Bivariate Normal Distribution; New Random Variables from Old; Order Statistics; Gamma Distributions; Chi-Square, Student's t, and F-Distributions; Transformations of Normal Random Variables. Asymptotic Theory: Strong and Weak Laws of Large Numbers; Central Limit Theorem. Stochastic Processes and Applications: Markov Chains; Poisson Processes; Queues; Brownian Motion; Financial Mathematics. Appendix: Introduction to Mathematica; Glossary of Mathematica Commands for Probability; Short Answers...

  2. Linear positivity and virtual probability

    International Nuclear Information System (INIS)

    Hartle, James B.

    2004-01-01

    We investigate the quantum theory of closed systems based on the linear positivity decoherence condition of Goldstein and Page. The objective of any quantum theory of a closed system, most generally the universe, is the prediction of probabilities for the individual members of sets of alternative coarse-grained histories of the system. Quantum interference between members of a set of alternative histories is an obstacle to assigning probabilities that are consistent with the rules of probability theory. A quantum theory of closed systems therefore requires two elements: (1) a condition specifying which sets of histories may be assigned probabilities and (2) a rule for those probabilities. The linear positivity condition of Goldstein and Page is the weakest of the general conditions proposed so far. Its general properties relating to exact probability sum rules, time neutrality, and conservation laws are explored. Its inconsistency with the usual notion of independent subsystems in quantum mechanics is reviewed. Its relation to the stronger condition of medium decoherence necessary for classicality is discussed. The linear positivity of histories in a number of simple model systems is investigated with the aim of exhibiting linearly positive sets of histories that are not decoherent. The utility of extending the notion of probability to include values outside the range of 0-1 is described. Alternatives with such virtual probabilities cannot be measured or recorded, but can be used in the intermediate steps of calculations of real probabilities. Extended probabilities give a simple and general way of formulating quantum theory. The various decoherence conditions are compared in terms of their utility for characterizing classicality and the role they might play in further generalizations of quantum mechanics

  3. Ignition probabilities for Compact Ignition Tokamak designs

    International Nuclear Information System (INIS)

    Stotler, D.P.; Goldston, R.J.

    1989-09-01

    A global power balance code employing Monte Carlo techniques has been developed to study the "probability of ignition" and has been applied to several different configurations of the Compact Ignition Tokamak (CIT). Probability distributions for the critical physics parameters in the code were estimated using existing experimental data. This included a statistical evaluation of the uncertainty in extrapolating the energy confinement time. A substantial probability of ignition is predicted for CIT if peaked density profiles can be achieved or if one of the two higher plasma current configurations is employed. In other cases, values of the energy multiplication factor Q of order 10 are generally obtained. The Ignitor-U and ARIES designs are also examined briefly. Comparisons of our empirically based confinement assumptions with two theory-based transport models yield conflicting results. 41 refs., 11 figs
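
    A toy version of such a Monte Carlo "probability of ignition" calculation might look like this. The lognormal spread on the confinement time and the cubic Q scaling are invented for illustration and are not the CIT power-balance model.

```python
import random

random.seed(1)

def ignition_probability(n_samples=100_000, q_ignite=50.0):
    """Toy Monte Carlo 'probability of ignition': sample a multiplicative
    uncertainty on the energy confinement time and count the fraction of
    cases whose energy multiplication factor Q clears a threshold.
    The distributions and the Q(tau) scaling are illustrative assumptions."""
    hits = 0
    for _ in range(n_samples):
        tau_factor = random.lognormvariate(0.0, 0.3)  # confinement uncertainty
        q = 10.0 * tau_factor ** 3                    # assumed steep Q scaling
        if q >= q_ignite:
            hits += 1
    return hits / n_samples

p_ign = ignition_probability(20_000)
```

    The point of the construction is the output shape: instead of a single predicted Q, one obtains a probability that the device ignites given the parameter uncertainties.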

  4. Transcriptional activation of immediate-early gene ETR101 by human T-cell leukaemia virus type I Tax

    DEFF Research Database (Denmark)

    Chen, Li; Ma, Shiliang; Li, Bo

    2003-01-01

    Human T-cell leukaemia virus type I (HTLV-I) Tax regulates viral and cellular gene expression through interactions with multiple cellular transcription pathways. This study describes the finding of immediate-early gene ETR101 expression in HTLV-I-infected cells and its regulation by Tax. ETR101...... was persistently expressed in HTLV-I-infected cells but not in HTLV-I uninfected cells. Expression of ETR101 was dependent upon Tax expression in the inducible Tax-expressing cell line JPX-9 and also in Jurkat cells transiently transfected with Tax-expressing vectors. Tax transactivated the ETR101 gene promoter......-DNA complex in HTLV-I-infected cell lines. EMSA with specific antibodies confirmed that the CREB transcription factor was responsible for formation of this specific protein-DNA complex. These results suggested that Tax directly transactivated ETR101 gene expression, mainly through a CRE sequence via the CREB...

  5. Probability Machines: Consistent Probability Estimation Using Nonparametric Learning Machines

    Science.gov (United States)

    Malley, J. D.; Kruppa, J.; Dasgupta, A.; Malley, K. G.; Ziegler, A.

    2011-01-01

    Summary Background Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. Objectives The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Methods Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Results Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Conclusions Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications. PMID:21915433
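
    The paper provides sample code from R packages; a comparable sketch in Python/scikit-learn, on synthetic data where the true conditional probability is known, could look like the following (all settings are illustrative, not the paper's):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(42)

# Synthetic binary-outcome data with a known true conditional probability,
# so the quality of the estimated probabilities can be checked directly.
n = 3000
x = rng.uniform(-3, 3, (n, 2))
true_p = 1 / (1 + np.exp(-x[:, 0]))   # P(y=1 | x) depends on x[:,0] only
y = rng.random(n) < true_p

# A random forest used as a "probability machine": class frequencies in the
# leaves estimate individual probabilities; a minimum leaf size smooths them.
forest = RandomForestClassifier(n_estimators=200, min_samples_leaf=20,
                                random_state=0).fit(x, y)
p_hat = forest.predict_proba(x)[:, 1]
mae = np.abs(p_hat - true_p).mean()   # mean error against the true probability
```

    The same interface (`predict_proba`) covers nearest-neighbor classifiers, matching the paper's point that consistent nonparametric learners double as probability estimators.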

  6. Probable Inference and Quantum Mechanics

    International Nuclear Information System (INIS)

    Grandy, W. T. Jr.

    2009-01-01

    In its current very successful interpretation the quantum theory is fundamentally statistical in nature. Although commonly viewed as a probability amplitude whose (complex) square is a probability, the wavefunction or state vector continues to defy consensus as to its exact meaning, primarily because it is not a physical observable. Rather than approach this problem directly, it is suggested that it is first necessary to clarify the precise role of probability theory in quantum mechanics, either as applied to, or as an intrinsic part of the quantum theory. When all is said and done the unsurprising conclusion is that quantum mechanics does not constitute a logic and probability unto itself, but adheres to the long-established rules of classical probability theory while providing a means within itself for calculating the relevant probabilities. In addition, the wavefunction is seen to be a description of the quantum state assigned by an observer based on definite information, such that the same state must be assigned by any other observer based on the same information, in much the same way that probabilities are assigned.

  7. Probability with applications and R

    CERN Document Server

    Dobrow, Robert P

    2013-01-01

    An introduction to probability at the undergraduate level Chance and randomness are encountered on a daily basis. Authored by a highly qualified professor in the field, Probability: With Applications and R delves into the theories and applications essential to obtaining a thorough understanding of probability. With real-life examples and thoughtful exercises from fields as diverse as biology, computer science, cryptology, ecology, public health, and sports, the book is accessible for a variety of readers. The book's emphasis on simulation through the use of the popular R software language c

  8. A philosophical essay on probabilities

    CERN Document Server

    Laplace, Marquis de

    1996-01-01

    A classic of science, this famous essay by "the Newton of France" introduces lay readers to the concepts and uses of probability theory. It is of especial interest today as an application of mathematical techniques to problems in social and biological sciences. Generally recognized as the founder of the modern phase of probability theory, Laplace here applies the principles and general results of his theory "to the most important questions of life, which are, in effect, for the most part, problems in probability." Thus, without the use of higher mathematics, he demonstrates the application

  9. Logic, probability, and human reasoning.

    Science.gov (United States)

    Johnson-Laird, P N; Khemlani, Sangeet S; Goodwin, Geoffrey P

    2015-04-01

    This review addresses the long-standing puzzle of how logic and probability fit together in human reasoning. Many cognitive scientists argue that conventional logic cannot underlie deductions, because it never requires valid conclusions to be withdrawn - not even if they are false; it treats conditional assertions implausibly; and it yields many vapid, although valid, conclusions. A new paradigm of probability logic allows conclusions to be withdrawn and treats conditionals more plausibly, although it does not address the problem of vapidity. The theory of mental models solves all of these problems. It explains how people reason about probabilities and postulates that the machinery for reasoning is itself probabilistic. Recent investigations accordingly suggest a way to integrate probability and deduction.

  10. Free probability and random matrices

    CERN Document Server

    Mingo, James A

    2017-01-01

    This volume opens the world of free probability to a wide variety of readers. From its roots in the theory of operator algebras, free probability has intertwined with non-crossing partitions, random matrices, applications in wireless communications, representation theory of large groups, quantum groups, the invariant subspace problem, large deviations, subfactors, and beyond. This book puts a special emphasis on the relation of free probability to random matrices, but also touches upon the operator algebraic, combinatorial, and analytic aspects of the theory. The book serves as a combination textbook/research monograph, with self-contained chapters, exercises scattered throughout the text, and coverage of important ongoing progress of the theory. It will appeal to graduate students and all mathematicians interested in random matrices and free probability from the point of view of operator algebras, combinatorics, analytic functions, or applications in engineering and statistical physics.

  11. Introduction to probability and measure

    CERN Document Server

    Parthasarathy, K R

    2005-01-01

    According to a remark attributed to Mark Kac 'Probability Theory is a measure theory with a soul'. This book with its choice of proofs, remarks, examples and exercises has been prepared taking both these aesthetic and practical aspects into account.

  12. Joint probabilities and quantum cognition

    International Nuclear Information System (INIS)

    Acacio de Barros, J.

    2012-01-01

    In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.
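
    Whether a set of pairwise distributions admits a joint distribution can be checked mechanically for small examples. The sketch below does this for three +/-1 random variables via a linear-programming feasibility test over the eight atoms; it illustrates the paper's point but is not the neural-oscillator model itself.

```python
import itertools
import numpy as np
from scipy.optimize import linprog

def joint_exists(e12, e13, e23):
    """Check, by LP feasibility over the 8 atoms of {-1,+1}^3, whether three
    +/-1 random variables can have the given pairwise expectations E[XiXj].
    Feasible <=> a joint probability distribution exists."""
    atoms = list(itertools.product([-1, 1], repeat=3))
    # Equality constraints: total probability 1 and the three correlations.
    A_eq = np.array([[1.0] * 8,
                     [x * y for x, y, z in atoms],
                     [x * z for x, y, z in atoms],
                     [y * z for x, y, z in atoms]])
    b_eq = [1.0, e12, e13, e23]
    res = linprog(c=np.zeros(8), A_eq=A_eq, b_eq=b_eq, bounds=[(0, 1)] * 8)
    return res.success

# Perfectly anticorrelated pairs (e12 = e13 = e23 = -1) are contextual in
# exactly this sense: each pair is realizable, but no joint exists.
```

    Uncorrelated variables (all expectations zero) pass the test trivially; the triple of perfect anticorrelations fails it, which is the kind of no-joint-distribution situation the paper exhibits.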

  13. Joint probabilities and quantum cognition

    Energy Technology Data Exchange (ETDEWEB)

    Acacio de Barros, J. [Liberal Studies, 1600 Holloway Ave., San Francisco State University, San Francisco, CA 94132 (United States)

    2012-12-18

    In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.

  14. Default probabilities and default correlations

    OpenAIRE

    Erlenmaier, Ulrich; Gersbach, Hans

    2001-01-01

    Starting from the Merton framework for firm defaults, we provide the analytics and robustness of the relationship between default probabilities and default correlations. We show that loans with higher default probabilities will not only have higher variances but also higher correlations between loans. As a consequence, portfolio standard deviation can increase substantially when loan default probabilities rise. This result has two important implications. First, relative prices of loans with different default probabili...

  15. The Probabilities of Unique Events

    Science.gov (United States)

    2012-08-30

    Washington, DC, USA. Max Lotstein and Phil Johnson-Laird, Department of Psychology, Princeton University, Princeton, NJ, USA. August 30th, 2012. ...social justice and also participated in antinuclear demonstrations. The participants ranked the probability that Linda is a feminist bank teller as... retorted that such a flagrant violation of the probability calculus was a result of a psychological experiment that obscured the rationality of the

  16. Probability Matching, Fast and Slow

    OpenAIRE

    Koehler, Derek J.; James, Greta

    2014-01-01

    A prominent point of contention among researchers regarding the interpretation of probability-matching behavior is whether it represents a cognitively sophisticated, adaptive response to the inherent uncertainty of the tasks or settings in which it is observed, or whether instead it represents a fundamental shortcoming in the heuristics that support and guide human decision making. Put crudely, researchers disagree on whether probability matching is "smart" or "dumb." Here, we consider eviden...

  17. Heart sounds analysis using probability assessment

    Czech Academy of Sciences Publication Activity Database

    Plešinger, Filip; Viščor, Ivo; Halámek, Josef; Jurčo, Juraj; Jurák, Pavel

    2017-01-01

    Vol. 38, No. 8 (2017), pp. 1685-1700, ISSN 0967-3334. R&D Projects: GA ČR GAP102/12/2034; GA MŠk(CZ) LO1212; GA MŠk ED0017/01/01. Institutional support: RVO:68081731. Keywords: heart sounds * FFT * machine learning * signal averaging * probability assessment. Subject RIV: FS - Medical Facilities; Equipment. OBOR OECD: Medical engineering. Impact factor: 2.058, year: 2016

  18. Normal probability plots with confidence.

    Science.gov (United States)

    Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang

    2015-01-01

    Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means of judging whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal probability plot based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods.
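
    The idea of surrounding each plotted point with an interval can be sketched by simulating order statistics of standard normal samples. Note these are pointwise envelopes only; the paper calibrates genuinely simultaneous 1-α intervals, which must be wider.

```python
import numpy as np

rng = np.random.default_rng(7)

def order_statistic_bands(n, n_sim=5000, alpha=0.05):
    """Simulate order statistics of standard normal samples of size n and
    return pointwise (1 - alpha) envelopes for each plotting position.
    A sketch of the idea only: simultaneous coverage, as in the paper,
    requires widening these pointwise bands."""
    sims = np.sort(rng.standard_normal((n_sim, n)), axis=1)
    lower = np.quantile(sims, alpha / 2, axis=0)
    upper = np.quantile(sims, 1 - alpha / 2, axis=0)
    return lower, upper

lower, upper = order_statistic_bands(20)
```

    Plotting a sample's sorted values against these bands gives an objective check for each point, which is exactly the subjective "close to a straight line" judgment the intervals are meant to replace.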

  19. Probability theory a foundational course

    CERN Document Server

    Pakshirajan, R P

    2013-01-01

    This book shares the dictum of J. L. Doob in treating Probability Theory as a branch of Measure Theory and establishes this relation early. Probability measures in product spaces are introduced right at the start by way of laying the ground work to later claim the existence of stochastic processes with prescribed finite dimensional distributions. Other topics analysed in the book include supports of probability measures, zero-one laws in product measure spaces, Erdos-Kac invariance principle, functional central limit theorem and functional law of the iterated logarithm for independent variables, Skorohod embedding, and the use of analytic functions of a complex variable in the study of geometric ergodicity in Markov chains. This book is offered as a text book for students pursuing graduate programs in Mathematics and or Statistics. The book aims to help the teacher present the theory with ease, and to help the student sustain his interest and joy in learning the subject.

  20. VIBRATION ISOLATION SYSTEM PROBABILITY ANALYSIS

    Directory of Open Access Journals (Sweden)

    Smirnov Vladimir Alexandrovich

    2012-10-01

    Full Text Available The article deals with the probability analysis of a vibration isolation system for high-precision equipment, which is extremely sensitive to low-frequency oscillations, even of submicron amplitude. External sources of low-frequency vibrations may include the natural city background or internal low-frequency sources inside buildings (pedestrian activity, HVAC). Assuming a Gaussian distribution, the author estimates the probability that the relative displacement of the isolated mass remains below the vibration criterion. The problem is solved in the three-dimensional space spanned by the system parameters, including damping and natural frequency. From this probability distribution, the chance of a vibration isolation system exceeding the vibration criterion is evaluated. Optimal system parameters - damping and natural frequency - are derived so that the probability of exceeding vibration criteria VC-E and VC-D is kept below 0.04.
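
    For a zero-mean Gaussian response, the probability of exceeding a displacement criterion reduces to a tail formula. A minimal sketch, with illustrative numbers rather than the paper's VC-D/VC-E values:

```python
import math

def exceedance_probability(sigma, criterion):
    """P(|x| > criterion) for a zero-mean Gaussian displacement with
    standard deviation sigma: 2*(1 - Phi(criterion/sigma)) = erfc(z/sqrt(2))."""
    z = criterion / sigma
    return math.erfc(z / math.sqrt(2))
```

    Tuning damping and natural frequency changes sigma; the design goal in the abstract amounts to pushing this exceedance probability below 0.04, which a criterion at roughly 2.1 standard deviations already achieves.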

  1. Approximation methods in probability theory

    CERN Document Server

    Čekanavičius, Vydas

    2016-01-01

    This book presents a wide range of well-known and less common methods used for estimating the accuracy of probabilistic approximations, including the Esseen type inversion formulas, the Stein method as well as the methods of convolutions and triangle function. Emphasising the correct usage of the methods presented, each step required for the proofs is examined in detail. As a result, this textbook provides valuable tools for proving approximation theorems. While Approximation Methods in Probability Theory will appeal to everyone interested in limit theorems of probability theory, the book is particularly aimed at graduate students who have completed a standard intermediate course in probability theory. Furthermore, experienced researchers wanting to enlarge their toolkit will also find this book useful.

  2. Model uncertainty: Probabilities for models?

    International Nuclear Information System (INIS)

    Winkler, R.L.

    1994-01-01

    Like any other type of uncertainty, model uncertainty should be treated in terms of probabilities. The question is how to do this. The most commonly-used approach has a drawback related to the interpretation of the probabilities assigned to the models. If we step back and look at the big picture, asking what the appropriate focus of the model uncertainty question should be in the context of risk and decision analysis, we see that a different probabilistic approach makes more sense, although it raises some implementation questions. Current work that is underway to address these questions looks very promising

  3. Probability, Statistics, and Stochastic Processes

    CERN Document Server

    Olofsson, Peter

    2011-01-01

    A mathematical and intuitive approach to probability, statistics, and stochastic processes This textbook provides a unique, balanced approach to probability, statistics, and stochastic processes. Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area. This text combines a rigorous, calculus-based development of theory with a more intuitive approach that appeals to readers' sense of reason and logic, an approach developed through the author's many years of classroom experience. The text begins with three chapters that d

  4. Statistical probability tables CALENDF program

    International Nuclear Information System (INIS)

    Ribon, P.

    1989-01-01

    The purpose of the probability tables is twofold: to obtain a dense data representation, and to calculate integrals by quadrature. They are mainly used in the USA for Monte Carlo calculations, and in the USSR and Europe for self-shielding calculations by the subgroup method. The moment probability tables, in addition to providing a more substantial mathematical basis and calculation methods, are adapted for condensation and mixture calculations, which are the crucial operations for reactor physics specialists. However, their extension is limited by the statistical hypothesis they imply. Efforts are being made to remove this obstacle, at the cost, it must be said, of greater complexity
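
    The quadrature idea behind probability tables can be shown in miniature: replace a distribution by a few (probability, value) pairs that reproduce its low-order moments, then evaluate integrals as small weighted sums. The symmetric two-point table below is the simplest such construction and is far cruder than CALENDF's moment tables.

```python
def two_point_table(mean, variance):
    """Smallest moment-preserving probability table: equal weights at
    mean +/- sigma reproduce the mean and variance exactly."""
    sigma = variance ** 0.5
    return [(0.5, mean - sigma), (0.5, mean + sigma)]

def table_integral(table, f):
    """Approximate E[f(X)] by the quadrature sum over the table."""
    return sum(p * f(x) for p, x in table)

table = two_point_table(10.0, 4.0)                         # nodes at 8 and 12
mean_back = table_integral(table, lambda x: x)             # recovers the mean
var_back = table_integral(table, lambda x: (x - 10) ** 2)  # and the variance
```

    Real probability tables carry more points and match more moments, which is what makes them usable for self-shielding and mixture calculations; the dense-representation benefit is already visible here, with a whole distribution compressed to two pairs.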

  5. Probability, statistics, and queueing theory

    CERN Document Server

    Allen, Arnold O

    1990-01-01

    This is a textbook on applied probability and statistics with computer science applications for students at the upper undergraduate level. It may also be used as a self study book for the practicing computer science professional. The successful first edition of this book proved extremely useful to students who need to use probability, statistics and queueing theory to solve problems in other fields, such as engineering, physics, operations research, and management science. The book has also been successfully used for courses in queueing theory for operations research students. This second edit

  6. Probability and Statistics: 5 Questions

    DEFF Research Database (Denmark)

    Probability and Statistics: 5 Questions is a collection of short interviews based on 5 questions presented to some of the most influential and prominent scholars in probability and statistics. We hear their views on the fields, aims, scopes, the future direction of research and how their work fits...... in these respects. Interviews with Nick Bingham, Luc Bovens, Terrence L. Fine, Haim Gaifman, Donald Gillies, James Hawthorne, Carl Hoefer, James M. Joyce, Joseph B. Kadane Isaac Levi, D.H. Mellor, Patrick Suppes, Jan von Plato, Carl Wagner, Sandy Zabell...

  7. Dynamic SEP event probability forecasts

    Science.gov (United States)

    Kahler, S. W.; Ling, A.

    2015-10-01

    The forecasting of solar energetic particle (SEP) event probabilities at Earth has been based primarily on the estimates of magnetic free energy in active regions and on the observations of peak fluxes and fluences of large (≥ M2) solar X-ray flares. These forecasts are typically issued for the next 24 h or with no definite expiration time, which can be deficient for time-critical operations when no SEP event appears following a large X-ray flare. It is therefore important to decrease the event probability forecast with time as a SEP event fails to appear. We use the NOAA listing of major (≥10 pfu) SEP events from 1976 to 2014 to plot the delay times from X-ray peaks to SEP threshold onsets as a function of solar source longitude. An algorithm is derived to decrease the SEP event probabilities with time when no event is observed to reach the 10 pfu threshold. In addition, we use known SEP event size distributions to modify probability forecasts when SEP intensity increases occur below the 10 pfu event threshold. An algorithm to provide a dynamic SEP event forecast, Pd, for both situations of SEP intensities following a large flare is derived.
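
    One way to realize such a decaying forecast is to fold the flare-to-onset delay distribution into a Bayesian update. The exponential delay with an assumed 10 h median below is a stand-in for the longitude-dependent NOAA delay statistics used in the paper.

```python
import math

def dynamic_sep_probability(p0, hours_elapsed, median_delay_h=10.0):
    """Decay an initial SEP event probability p0 as time passes with no
    10 pfu onset observed. An exponential flare-to-onset delay with an
    assumed median is illustrative, not the paper's fitted distribution."""
    rate = math.log(2) / median_delay_h
    survival = math.exp(-rate * hours_elapsed)  # P(onset later than t | event)
    # Bayes: P(event | no onset by t) = p0*S(t) / (p0*S(t) + (1 - p0))
    return p0 * survival / (p0 * survival + (1 - p0))
```

    Starting from a 50% forecast, the probability falls to one third after one median delay and keeps dropping as the SEP event fails to appear, which is the qualitative behavior the dynamic forecast Pd is designed to capture.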

  8. Conditional Independence in Applied Probability.

    Science.gov (United States)

    Pfeiffer, Paul E.

    This material assumes the user has the background provided by a good undergraduate course in applied probability. It is felt that introductory courses in calculus, linear algebra, and perhaps some differential equations should provide the requisite experience and proficiency with mathematical concepts, notation, and argument. The document is…

  9. Stretching Probability Explorations with Geoboards

    Science.gov (United States)

    Wheeler, Ann; Champion, Joe

    2016-01-01

    Students are faced with many transitions in their middle school mathematics classes. To build knowledge, skills, and confidence in the key areas of algebra and geometry, students often need to practice using numbers and polygons in a variety of contexts. Teachers also want students to explore ideas from probability and statistics. Teachers know…

  10. GPS: Geometry, Probability, and Statistics

    Science.gov (United States)

    Field, Mike

    2012-01-01

    It might be said that for most occupations there is now less of a need for mathematics than there was say fifty years ago. But, the author argues, geometry, probability, and statistics constitute essential knowledge for everyone. Maybe not the geometry of Euclid, but certainly geometrical ways of thinking that might enable us to describe the world…

  11. Swedish earthquakes and acceleration probabilities

    International Nuclear Information System (INIS)

    Slunga, R.

    1979-03-01

    A method to assign probabilities to ground accelerations for Swedish sites is described. As hardly any near-field instrumental data are available, we are left with the problem of interpreting macroseismic data in terms of acceleration. By theoretical wave propagation computations, the relations between the seismic strength of the earthquake, focal depth, distance, and ground acceleration are calculated. We found that most Swedish earthquakes are shallow; the largest earthquake of the area, the 1904 earthquake 100 km south of Oslo, is an exception and probably had a focal depth exceeding 25 km. For the nuclear power plant sites, an annual probability of 10^-5 has been proposed as being of interest. This probability gives ground accelerations in the range 5-20 % g for the sites. This acceleration is for a free bedrock site; for consistency, all acceleration results in this study are given for bedrock sites. When applying our model to the 1904 earthquake and assuming the focal zone to be in the lower crust, we obtain an epicentral acceleration for this earthquake of 5-15 % g. The results above are based on an analysis of macroseismic data, as relevant instrumental data are lacking. However, the macroseismic acceleration model deduced in this study gives epicentral ground accelerations of small Swedish earthquakes in agreement with the available distant instrumental data. (author)

  12. Risk estimation using probability machines

    Science.gov (United States)

    2014-01-01

    Background Logistic regression has been the de facto, and often the only, model used in the description and analysis of relationships between a binary outcome and observed features. It is widely used to obtain the conditional probabilities of the outcome given predictors, as well as predictor effect size estimates using conditional odds ratios. Results We show how statistical learning machines for binary outcomes, provably consistent for the nonparametric regression problem, can be used to provide both consistent conditional probability estimation and conditional effect size estimates. Effect size estimates from learning machines leverage our understanding of counterfactual arguments central to the interpretation of such estimates. We show that, if the data generating model is logistic, we can recover accurate probability predictions and effect size estimates with nearly the same efficiency as a correct logistic model, both for main effects and interactions. We also propose a method using learning machines to scan for possible interaction effects quickly and efficiently. Simulations using random forest probability machines are presented. Conclusions The models we propose make no assumptions about the data structure, and capture the patterns in the data by just specifying the predictors involved and not any particular model structure. So they do not run the same risks of model mis-specification and the resultant estimation biases as a logistic model. This methodology, which we call a “risk machine”, will share properties from the statistical machine that it is derived from. PMID:24581306
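
The "probability machine" idea can be sketched with a k-nearest-neighbour estimator standing in for the random forests used in the paper (both are consistent nonparametric learners): read conditional probabilities directly off a flexible learner, and obtain effect sizes from counterfactual differences. The data-generating model and all parameters below are simulated assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated logistic data: P(y=1 | x) = sigmoid(1.5*x1 - 1.0*x2).
n = 4000
X = rng.normal(size=(n, 2))
p_true = 1.0 / (1.0 + np.exp(-(1.5 * X[:, 0] - 1.0 * X[:, 1])))
y = (rng.uniform(size=n) < p_true).astype(float)

def knn_probability(X, y, x0, k=100):
    """Nonparametric conditional probability P(y=1 | x0): the mean outcome
    among the k nearest training points (a minimal 'probability machine')."""
    d = np.linalg.norm(X - x0, axis=1)
    return y[np.argsort(d)[:k]].mean()

# Conditional probability at a query point (true value is sigmoid(1.5) ~ 0.82).
p_hat = knn_probability(X, y, np.array([1.0, 0.0]))
print(p_hat)

# Counterfactual effect size of raising x1 by one unit, averaged over cases.
shift = np.array([1.0, 0.0])
effect = np.mean([knn_probability(X, y, xi + shift) - knn_probability(X, y, xi)
                  for xi in X[:100]])
print(effect)   # positive, since x1 raises the outcome probability
```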

  13. Probability and statistics: A reminder

    International Nuclear Information System (INIS)

    Clement, B.

    2013-01-01

    The main purpose of these lectures is to provide the reader with the tools needed for data analysis in the framework of physics experiments. Basic concepts are introduced together with examples of application in experimental physics. The lecture is divided into two parts: probability and statistics. It is built on the introduction from 'data analysis in experimental sciences' given in [1]. (authors)

  14. Nash equilibrium with lower probabilities

    DEFF Research Database (Denmark)

    Groes, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte

    1998-01-01

    We generalize the concept of Nash equilibrium in mixed strategies for strategic form games to allow for ambiguity in the players' expectations. In contrast to other contributions, we model ambiguity by means of so-called lower probability measures or belief functions, which makes it possible...

  15. On probability-possibility transformations

    Science.gov (United States)

    Klir, George J.; Parviz, Behzad

    1992-01-01

    Several probability-possibility transformations are compared in terms of the closeness of preserving second-order properties. The comparison is based on experimental results obtained by computer simulation. Two second-order properties are involved in this study: noninteraction of two distributions and projections of a joint distribution.

  16. Dependent Human Error Probability Assessment

    International Nuclear Information System (INIS)

    Simic, Z.; Mikulicic, V.; Vukovic, I.

    2006-01-01

    This paper presents an assessment of the dependence between dynamic operator actions modeled in a Nuclear Power Plant (NPP) PRA and an estimate of the associated impact on core damage frequency (CDF). This assessment was done to improve the implementation of HEP dependencies inside the existing PRA. All of the dynamic operator actions modeled in the NPP PRA are included in this assessment. Determining the level of HEP dependence and the associated influence on CDF are the major steps of this assessment. A decision on how to apply the results, i.e., whether permanent HEP model changes should be made, is based on the resulting relative CDF increase. A CDF increase was selected as a threshold based on the NPP base CDF value and the acceptance guidelines from Regulatory Guide 1.174. HEP dependencies resulting in a CDF increase of > 5E-07 would be considered potential candidates for specific incorporation into the baseline model. The approach used to judge the level of dependence between operator actions is based on the dependency level categories and conditional probabilities developed in the Handbook of Human Reliability Analysis with Emphasis on Nuclear Power Plant Applications, NUREG/CR-1278. To simplify the process, NUREG/CR-1278 identifies five levels of dependence: ZD (zero dependence), LD (low dependence), MD (moderate dependence), HD (high dependence), and CD (complete dependence). NUREG/CR-1278 also identifies several qualitative factors that could be involved in determining the level of dependence. Based on the NUREG/CR-1278 information, Time, Function, and Spatial attributes were judged to be the most important considerations when determining the level of dependence between operator actions within an accident sequence. These attributes were used to develop qualitative criteria (rules) that were used to judge the level of dependence (CD, HD, MD, LD, ZD) between the operator actions. After the level of dependence between the various HEPs is judged, quantitative values associated with the

  17. Large deviations and idempotent probability

    CERN Document Server

    Puhalskii, Anatolii

    2001-01-01

    In the view of many probabilists, author Anatolii Puhalskii's research results stand among the most significant achievements in the modern theory of large deviations. In fact, his work marked a turning point in the depth of our understanding of the connections between the large deviation principle (LDP) and well-known methods for establishing weak convergence results. Large Deviations and Idempotent Probability expounds upon the recent methodology of building large deviation theory along the lines of weak convergence theory. The author develops an idempotent (or maxitive) probability theory, introduces idempotent analogues of martingales (maxingales), Wiener and Poisson processes, and Ito differential equations, and studies their properties. The large deviation principle for stochastic processes is formulated as a certain type of convergence of stochastic processes to idempotent processes. The author calls this large deviation convergence. The approach to establishing large deviation convergence uses novel com...

  18. Probability biases as Bayesian inference

    Directory of Open Access Journals (Sweden)

    André C. R. Martins

    2006-11-01

    Full Text Available In this article, I will show how several observed biases in human probabilistic reasoning can be partially explained as good heuristics for making inferences in an environment where probabilities have uncertainties associated to them. Previous results show that the weight functions and the observed violations of coalescing and stochastic dominance can be understood from a Bayesian point of view. We will review those results and see that Bayesian methods should also be used as part of the explanation behind other known biases. That means that, although the observed errors are still errors, they can be understood as adaptations to the solution of real-life problems. Heuristics that allow fast evaluations and mimic a Bayesian inference would be an evolutionary advantage, since they would give us an efficient way of making decisions. In that sense, it should be no surprise that humans reason with probability as it has been observed.

  19. Probability matching and strategy availability.

    Science.gov (United States)

    Koehler, Derek J; James, Greta

    2010-09-01

    Findings from two experiments indicate that probability matching in sequential choice arises from an asymmetry in strategy availability: The matching strategy comes readily to mind, whereas a superior alternative strategy, maximizing, does not. First, compared with the minority who spontaneously engage in maximizing, the majority of participants endorse maximizing as superior to matching in a direct comparison when both strategies are described. Second, when the maximizing strategy is brought to their attention, more participants subsequently engage in maximizing. Third, matchers are more likely than maximizers to base decisions in other tasks on their initial intuitions, suggesting that they are more inclined to use a choice strategy that comes to mind quickly. These results indicate that a substantial subset of probability matchers are victims of "underthinking" rather than "overthinking": They fail to engage in sufficient deliberation to generate a superior alternative to the matching strategy that comes so readily to mind.
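
The gap between the two strategies is easy to make concrete: for a binary outcome occurring with probability p, matching one's guesses to the outcome frequencies yields expected accuracy p² + (1−p)², while always predicting the majority outcome yields max(p, 1−p). A short sketch:

```python
def matching_accuracy(p):
    """Expected accuracy when guesses mirror the outcome frequencies."""
    return p * p + (1 - p) * (1 - p)

def maximizing_accuracy(p):
    """Expected accuracy when always predicting the majority outcome."""
    return max(p, 1 - p)

for p in (0.5, 0.7, 0.9):
    print(p, matching_accuracy(p), maximizing_accuracy(p))
# At p = 0.7: matching gives 0.58, maximizing gives 0.70.
```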

  20. Probability as a Physical Motive

    Directory of Open Access Journals (Sweden)

    Peter Martin

    2007-04-01

    Full Text Available Recent theoretical progress in nonequilibrium thermodynamics, linking the physical principle of Maximum Entropy Production (“MEP”) to the information-theoretical “MaxEnt” principle of scientific inference, together with conjectures from theoretical physics that there may be no fundamental causal laws but only probabilities for physical processes, and from evolutionary theory that biological systems expand “the adjacent possible” as rapidly as possible, all lend credence to the proposition that probability should be recognized as a fundamental physical motive. It is further proposed that spatial order and temporal order are two aspects of the same thing, and that this is the essence of the second law of thermodynamics.

  1. On the shake-off probability for atomic systems

    Energy Technology Data Exchange (ETDEWEB)

    Santos, A.C.F., E-mail: toniufrj@gmail.com [Instituto de Física, Universidade Federal do Rio de Janeiro, P.O. Box 68528, 21941-972 Rio de Janeiro, RJ (Brazil); Almeida, D.P. [Departamento de Física, Universidade Federal de Santa Catarina, 88040-900 Florianópolis (Brazil)

    2016-07-15

    Highlights: • The scope is to find the relationship among SO probabilities, Z and electron density. • A scaling law is suggested, allowing us to find the SO probabilities for atoms. • SO probabilities have been scaled as a function of target Z and polarizability. - Abstract: The main focus of this work has been the relationship between shake-off probabilities, target atomic number, and electron density. By comparing the saturation values of measured double-to-single photoionization ratios from the literature, a simple scaling law has been found, which allows us to predict the shake-off probabilities for several elements up to Z = 54 within a factor of 2. The electron shake-off probabilities accompanying valence shell photoionization have been scaled as a function of the target atomic number, Z, and polarizability, α. This behavior is in qualitative agreement with the experimental results.

  2. Logic, Probability, and Human Reasoning

    Science.gov (United States)

    2015-01-01

    accordingly suggest a way to integrate probability and deduction. The nature of deductive reasoning: to be rational is to be able to make deductions... [3–6], and they underlie mathematics, science, and technology [7–10]. Plato claimed that emotions upset reasoning. However, individuals in the grip... fundamental to human rationality. So, if counterexamples to its principal predictions occur, the theory will at least explain its own refutation

  3. Probability Measures on Groups IX

    CERN Document Server

    1989-01-01

    The latest in this series of Oberwolfach conferences focussed on the interplay between structural probability theory and various other areas of pure and applied mathematics such as Tauberian theory, infinite-dimensional rotation groups, central limit theorems, harmonizable processes, and spherical data. Thus it was attended by mathematicians whose research interests range from number theory to quantum physics in conjunction with structural properties of probabilistic phenomena. This volume contains 5 survey articles submitted on special invitation and 25 original research papers.

  4. Probability matching and strategy availability

    OpenAIRE

    Koehler, Derek J.; James, Greta

    2010-01-01

    Findings from two experiments indicate that probability matching in sequential choice arises from an asymmetry in strategy availability: The matching strategy comes readily to mind, whereas a superior alternative strategy, maximizing, does not. First, compared with the minority who spontaneously engage in maximizing, the majority of participants endorse maximizing as superior to matching in a direct comparison when both strategies are described. Second, when the maximizing strategy is brought...

  5. Trial type probability modulates the cost of antisaccades

    Science.gov (United States)

    Chiau, Hui-Yan; Tseng, Philip; Su, Jia-Han; Tzeng, Ovid J. L.; Hung, Daisy L.; Muggleton, Neil G.

    2011-01-01

    The antisaccade task, where eye movements are made away from a target, has been used to investigate the flexibility of cognitive control of behavior. Antisaccades usually have longer saccade latencies than prosaccades, the so-called antisaccade cost. Recent studies have shown that this antisaccade cost can be modulated by event probability. This may mean that the antisaccade cost can be reduced, or even reversed, if the probability of surrounding events favors the execution of antisaccades. The probabilities of prosaccades and antisaccades were systematically manipulated by changing the proportion of a certain type of trial in an interleaved pro/antisaccades task. We aimed to disentangle the intertwined relationship between trial type probabilities and the antisaccade cost with the ultimate goal of elucidating how probabilities of trial types modulate human flexible behaviors, as well as the characteristics of such modulation effects. To this end, we examined whether implicit trial type probability can influence saccade latencies and also manipulated the difficulty of cue discriminability to see how effects of trial type probability would change when the demand on visual perceptual analysis was high or low. A mixed-effects model was applied to the analysis to dissect the factors contributing to the modulation effects of trial type probabilities. Our results suggest that the trial type probability is one robust determinant of antisaccade cost. These findings highlight the importance of implicit probability in the flexibility of cognitive control of behavior. PMID:21543748

  6. [Biometric bases: basic concepts of probability calculation].

    Science.gov (United States)

    Dinya, E

    1998-04-26

    The author gives an outline of the basic concepts of probability theory. The bases of event algebra, the definition of probability, the classical probability model, and the random variable are presented.

  7. Probability for Weather and Climate

    Science.gov (United States)

    Smith, L. A.

    2013-12-01

    Over the last 60 years, the availability of large-scale electronic computers has stimulated rapid and significant advances both in meteorology and in our understanding of the Earth System as a whole. The speed of these advances was due, in large part, to the sudden ability to explore nonlinear systems of equations. The computer allows the meteorologist to carry a physical argument to its conclusion; the time scales of weather phenomena then allow the refinement of physical theory, numerical approximation or both in light of new observations. Prior to this extension, as Charney noted, the practicing meteorologist could ignore the results of theory with good conscience. Today, neither the practicing meteorologist nor the practicing climatologist can do so, but to what extent, and in what contexts, should they place the insights of theory above quantitative simulation? And in what circumstances can one confidently estimate the probability of events in the world from model-based simulations? Despite solid advances of theory and insight made possible by the computer, the fidelity of our models of climate differs in kind from the fidelity of models of weather. While all prediction is extrapolation in time, weather resembles interpolation in state space, while climate change is fundamentally an extrapolation. The trichotomy of simulation, observation and theory which has proven essential in meteorology will remain incomplete in climate science. Operationally, the roles of probability, indeed the kinds of probability one has access to, are different in operational weather forecasting and climate services. Significant barriers to forming probability forecasts (which can be used rationally as probabilities) are identified. Monte Carlo ensembles can explore sensitivity, diversity, and (sometimes) the likely impact of measurement uncertainty and structural model error. The aims of different ensemble strategies, and fundamental differences in ensemble design to support of

  8. Probability, Statistics, and Stochastic Processes

    CERN Document Server

    Olofsson, Peter

    2012-01-01

    This book provides a unique and balanced approach to probability, statistics, and stochastic processes.   Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area.  The Second Edition features new coverage of analysis of variance (ANOVA), consistency and efficiency of estimators, asymptotic theory for maximum likelihood estimators, empirical distribution function and the Kolmogorov-Smirnov test, general linear models, multiple comparisons, Markov chain Monte Carlo (MCMC), Brownian motion, martingales, and

  9. Probability, statistics, and computational science.

    Science.gov (United States)

    Beerenwinkel, Niko; Siebourg, Juliane

    2012-01-01

    In this chapter, we review basic concepts from probability theory and computational statistics that are fundamental to evolutionary genomics. We provide a very basic introduction to statistical modeling and discuss general principles, including maximum likelihood and Bayesian inference. Markov chains, hidden Markov models, and Bayesian network models are introduced in more detail as they occur frequently and in many variations in genomics applications. In particular, we discuss efficient inference algorithms and methods for learning these models from partially observed data. Several simple examples are given throughout the text, some of which point to models that are discussed in more detail in subsequent chapters.
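
As a minimal illustration of the Markov-chain machinery mentioned above, the sketch below iterates a hypothetical two-state transition matrix to its stationary distribution; genomics models differ mainly in scale, not in mechanics.

```python
import numpy as np

# Hypothetical two-state Markov chain; rows are P(next state | current state).
P = np.array([[0.9, 0.1],
              [0.3, 0.7]])

# The stationary distribution pi satisfies pi P = pi; power iteration finds it.
pi = np.array([0.5, 0.5])
for _ in range(200):
    pi = pi @ P
print(pi)   # converges to [0.75, 0.25]
```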

  10. Sensitivity analysis using probability bounding

    International Nuclear Information System (INIS)

    Ferson, Scott; Troy Tucker, W.

    2006-01-01

    Probability bounds analysis (PBA) provides analysts a convenient means to characterize the neighborhood of possible results that would be obtained from plausible alternative inputs in probabilistic calculations. We show the relationship between PBA and the methods of interval analysis and probabilistic uncertainty analysis from which it is jointly derived, and indicate how the method can be used to assess the quality of probabilistic models such as those developed in Monte Carlo simulations for risk analyses. We also illustrate how a sensitivity analysis can be conducted within a PBA by pinching inputs to precise distributions or real values
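
The flavour of computing with probability bounds can be shown with the classical Fréchet inequalities, the best-possible bounds on a conjunction when the dependence between the events is unknown; "pinching" to an independence assumption collapses the interval to a point. This is a generic illustration, not the PBA machinery of the paper.

```python
def frechet_bounds(p, q):
    """Best-possible bounds on P(A and B) given only P(A) = p and
    P(B) = q, with the dependence between A and B left unspecified."""
    return max(0.0, p + q - 1.0), min(p, q)

lo, hi = frechet_bounds(0.7, 0.6)
print(lo, hi)          # roughly [0.3, 0.6] without any dependence assumption

pinched = 0.7 * 0.6    # pinching to independence gives the point value 0.42
print(lo <= pinched <= hi)   # the pinched value lies inside the bounds
```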

  11. Lectures on probability and statistics

    International Nuclear Information System (INIS)

    Yost, G.P.

    1984-09-01

    These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. We begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. We finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another
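
The dice example lends itself to a few lines of code: the forward (probability) problem counts outcomes a priori, while the inverse (statistics) problem compares how well candidate models explain observed rolls. The loaded-die model below is a made-up alternative for illustration.

```python
from fractions import Fraction
from itertools import product

# Forward problem: probability that two fair dice sum to 7.
outcomes = list(product(range(1, 7), repeat=2))
p7 = Fraction(sum(1 for a, b in outcomes if a + b == 7), len(outcomes))
print(p7)   # 1/6

# Inverse problem: which die better explains the observed rolls?
fair = [Fraction(1, 6)] * 6
loaded = [Fraction(1, 10)] * 5 + [Fraction(1, 2)]   # hypothetical six-heavy die
observed = [6, 6, 3, 6]

def likelihood(model, rolls):
    """Probability of the observed rolls under a given face-probability model."""
    L = Fraction(1)
    for face in rolls:
        L *= model[face - 1]
    return L

print(likelihood(loaded, observed) > likelihood(fair, observed))   # True
```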

  12. Prevalence of risk factors for HIV infection among Mexican migrants and immigrants: probability survey in the north border of Mexico Prevalencia de factores de riesgo para la infección por VIH entre migrantes mexicanos: encuesta probabilística en la frontera norte de México

    Directory of Open Access Journals (Sweden)

    M. Gudelia Rangel

    2006-02-01

    Full Text Available OBJECTIVE: To estimate the prevalence of risk factors for HIV infection among Mexican migrants and immigrants (MMIs in different geographic contexts, including the sending communities in Mexico, the receiving communities in the United States (US, and the Mexican North border region. MATERIAL AND METHODS: We conducted a probability survey among MMIs traveling through key border crossing sites in the Tijuana (Baja California, Mexico-San Diego (California, US border region (N=1 429. RESULTS: The survey revealed substantial rates of reported sexually transmitted infections, needle-sharing and sexual risk practices in all migration contexts. CONCLUSIONS: The estimated levels of HIV risk call for further binational research and preventive interventions in all key geographic contexts of the migration experience to identify and tackle the different personal, environmental, and structural determinants of HIV risk in each of these contexts.OBJETIVO: Estimar la prevalencia de prácticas de riesgo para la infección por VIH en migrantes mexicanos durante su estancia en distintos contextos geográficos, incluyendo sus comunidades de origen en México, las comunidades de destino en Estados Unidos de América (EUA, y la frontera Norte de México. MATERIAL Y MÉTODOS: Encuesta probabilística de migrantes mexicanos que transitan por la región fronteriza Tijuana (Baja California, México-San Diego (California, EUA (N=1 429. RESULTADOS: La encuesta reveló una alta prevalencia de infecciones de transmisión sexual, uso compartido de agujas, y prácticas sexuales de riesgo en todos los contextos geográficos estudiados. CONCLUSIONES: Los niveles de riesgo de infección por VIH estimados para migrantes mexicanos en diferentes contextos geográficos exigen estudios e intervenciones preventivas binacionales que identifiquen y aborden los distintos factores de riesgo personales, ambientales, y estructurales que contribuyen al riesgo de infección por VIH en cada

  13. Probability theory a comprehensive course

    CERN Document Server

    Klenke, Achim

    2014-01-01

    This second edition of the popular textbook contains a comprehensive course in modern probability theory. Overall, probabilistic concepts play an increasingly important role in mathematics, physics, biology, financial engineering and computer science. They help us in understanding magnetism, amorphous media, genetic diversity and the perils of random developments at financial markets, and they guide us in constructing more efficient algorithms.   To address these concepts, the title covers a wide variety of topics, many of which are not usually found in introductory textbooks, such as:   • limit theorems for sums of random variables • martingales • percolation • Markov chains and electrical networks • construction of stochastic processes • Poisson point process and infinite divisibility • large deviation principles and statistical physics • Brownian motion • stochastic integral and stochastic differential equations. The theory is developed rigorously and in a self-contained way, with the c...

  14. Use of probability tables for propagating uncertainties in neutronics

    International Nuclear Information System (INIS)

    Coste-Delclaux, M.; Diop, C.M.; Lahaye, S.

    2017-01-01

    Highlights: • Moment-based probability table formalism is described. • Representation by probability tables of any uncertainty distribution is established. • Multiband equations for two kinds of uncertainty propagation problems are solved. • Numerical examples are provided and validated against Monte Carlo simulations. - Abstract: Probability tables are a generic tool that allows representing any random variable whose probability density function is known. In the field of nuclear reactor physics, this tool is currently used to represent the variation of cross-sections versus energy (neutron transport codes TRIPOLI4®, MCNP, APOLLO2, APOLLO3®, ECCO/ERANOS…). In the present article we show how we can propagate uncertainties, thanks to a probability table representation, through two simple physical problems: an eigenvalue problem (neutron multiplication factor) and a depletion problem.
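
A minimal moment-based probability table, in the spirit of the abstract, represents an uncertain parameter by a few weighted points that reproduce its moments; propagating it through a model then reduces to evaluating the model at the table points. The two-point table and quadratic response below are illustrative, not the TRIPOLI4/APOLLO formalism.

```python
import numpy as np

def two_point_table(mu, sigma):
    """Two equally weighted points mu +/- sigma reproduce the first two
    moments (mean mu, variance sigma**2) of an uncertain parameter."""
    return [(mu - sigma, 0.5), (mu + sigma, 0.5)]

def propagate(f, table):
    """Mean and variance of f(X) evaluated on a probability table."""
    vals = np.array([f(x) for x, _ in table])
    wts = np.array([w for _, w in table])
    mean = float(np.sum(wts * vals))
    var = float(np.sum(wts * (vals - mean) ** 2))
    return mean, var

# Propagate a 2.0 +/- 0.1 parameter through a quadratic response.
mean, var = propagate(lambda x: x ** 2, two_point_table(2.0, 0.1))
print(mean, var)   # mean near 4.01, variance near 0.16
```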

  15. Excluding joint probabilities from quantum theory

    Science.gov (United States)

    Allahverdyan, Armen E.; Danageozian, Arshag

    2018-03-01

    Quantum theory does not provide a unique definition for the joint probability of two noncommuting observables, which is the next important question after Born's probability for a single observable. Instead, various definitions were suggested, e.g., via quasiprobabilities or via hidden-variable theories. After reviewing open issues of the joint probability, we relate it to quantum imprecise probabilities, which are noncontextual and are consistent with all constraints expected from a quantum probability. We study two noncommuting observables in a two-dimensional Hilbert space and show that there is no precise joint probability that applies for any quantum state and is consistent with imprecise probabilities. This contrasts with theorems by Bell and Kochen-Specker that exclude joint probabilities for more than two noncommuting observables, in Hilbert space with dimension larger than two. If measurement contexts are included into the definition, joint probabilities are not excluded anymore, but they are still constrained by imprecise probabilities.

  16. Probability theory and mathematical statistics for engineers

    CERN Document Server

    Pugachev, V S

    1984-01-01

    Probability Theory and Mathematical Statistics for Engineers focuses on the concepts of probability theory and mathematical statistics for finite-dimensional random variables.The publication first underscores the probabilities of events, random variables, and numerical characteristics of random variables. Discussions focus on canonical expansions of random vectors, second-order moments of random vectors, generalization of the density concept, entropy of a distribution, direct evaluation of probabilities, and conditional probabilities. The text then examines projections of random vector

  17. Introduction to probability theory with contemporary applications

    CERN Document Server

    Helms, Lester L

    2010-01-01

    This introduction to probability theory transforms a highly abstract subject into a series of coherent concepts. Its extensive discussions and clear examples, written in plain language, expose students to the rules and methods of probability. Suitable for an introductory probability course, this volume requires abstract and conceptual thinking skills and a background in calculus.Topics include classical probability, set theory, axioms, probability functions, random and independent random variables, expected values, and covariance and correlations. Additional subjects include stochastic process

  18. Direct probability mapping of contaminants

    International Nuclear Information System (INIS)

    Rautman, C.A.

    1993-01-01

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. Geostatistical simulation provides powerful tools for investigating contaminant levels, and in particular, for identifying and using the spatial interrelationships among a set of isolated sample values. This additional information can be used to assess the likelihood of encountering contamination at unsampled locations and to evaluate the risk associated with decisions to remediate or not to remediate specific regions within a site. Past operation of the DOE Feed Materials Production Center has contaminated a site near Fernald, Ohio, with natural uranium. Soil geochemical data have been collected as part of the Uranium-in-Soils Integrated Demonstration Project. These data have been used to construct a number of stochastic images of potential contamination for parcels approximately the size of a selective remediation unit. Each such image accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely, statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination. Evaluation of the geostatistical simulations can yield maps representing the expected magnitude of the contamination for various regions and other information that may be important in determining a suitable remediation process or in sizing equipment to accomplish the restoration

  19. Traffic simulation based ship collision probability modeling

    Energy Technology Data Exchange (ETDEWEB)

    Goerlandt, Floris, E-mail: floris.goerlandt@tkk.f [Aalto University, School of Science and Technology, Department of Applied Mechanics, Marine Technology, P.O. Box 15300, FI-00076 AALTO, Espoo (Finland); Kujala, Pentti [Aalto University, School of Science and Technology, Department of Applied Mechanics, Marine Technology, P.O. Box 15300, FI-00076 AALTO, Espoo (Finland)

    2011-01-15

    Maritime traffic poses various risks in terms of human, environmental and economic loss. In a risk analysis of ship collisions, it is important to get a reasonable estimate for the probability of such accidents and the consequences they lead to. In this paper, a method is proposed to assess the probability of vessels colliding with each other. The method is capable of determining the expected number of accidents, the locations where and the time when they are most likely to occur, while providing input for models concerned with the expected consequences. At the basis of the collision detection algorithm lays an extensive time domain micro-simulation of vessel traffic in the given area. The Monte Carlo simulation technique is applied to obtain a meaningful prediction of the relevant factors of the collision events. Data obtained through the Automatic Identification System is analyzed in detail to obtain realistic input data for the traffic simulation: traffic routes, the number of vessels on each route, the ship departure times, main dimensions and sailing speed. The results obtained by the proposed method for the studied case of the Gulf of Finland are presented, showing reasonable agreement with registered accident and near-miss data.
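
The core of such a traffic-simulation estimate can be caricatured in a few lines: sample ship passage times on two crossing routes and count near-simultaneous occupancies of the crossing. The route counts, the 30 s occupancy window, and the uniform departure times are toy assumptions standing in for the AIS-derived inputs described in the paper.

```python
import random

random.seed(1)

def expected_encounters(n_days=1000, ships_a=20, ships_b=15,
                        window_s=30.0, day_s=86400.0):
    """Monte Carlo estimate of daily close encounters at a waypoint where
    two routes cross: a candidate occurs when ships from the two routes
    occupy the crossing within window_s seconds of each other."""
    total = 0
    for _ in range(n_days):
        t_a = [random.uniform(0.0, day_s) for _ in range(ships_a)]
        t_b = [random.uniform(0.0, day_s) for _ in range(ships_b)]
        total += sum(1 for ta in t_a for tb in t_b
                     if abs(ta - tb) < window_s)
    return total / n_days

est = expected_encounters()
print(est)   # near the analytic expectation 20*15*2*30/86400 ~ 0.21 per day
```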

  20. Psychophysics of the probability weighting function

    Science.gov (United States)

    Takahashi, Taiki

    2011-03-01

    A probability weighting function w(p) for an objective probability p in decision under risk plays a pivotal role in Kahneman-Tversky prospect theory. Although recent studies in econophysics and neuroeconomics have widely utilized probability weighting functions, the psychophysical foundations of these functions have been unknown. Notably, the behavioral economist Prelec (1998) [4] axiomatically derived the probability weighting function w(p)=exp(-(-ln p)^α) (with 0<α<1, w(0)=0, w(1/e)=1/e and w(1)=1), which has been extensively studied in behavioral neuroeconomics. The present study utilizes psychophysical theory to derive Prelec's probability weighting function from psychophysical laws of perceived waiting time in probabilistic choices. Also, the relations between the parameters in the probability weighting function and the probability discounting function in behavioral psychology are derived. Future directions in the application of the psychophysical theory of the probability weighting function in econophysics and neuroeconomics are discussed.
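The Prelec weighting function named in this abstract is easy to check numerically; the sketch below uses an arbitrary α = 0.65 (any 0 < α < 1 works) and verifies its characteristic fixed point w(1/e) = 1/e.

```python
# Prelec (1998) one-parameter probability weighting function:
# w(p) = exp(-(-ln p)**alpha), 0 < alpha < 1,
# with w(0) = 0, w(1/e) = 1/e and w(1) = 1.
import math

def prelec_w(p, alpha=0.65):
    """Prelec probability weighting function (alpha chosen arbitrarily here)."""
    if p == 0.0:
        return 0.0
    return math.exp(-((-math.log(p)) ** alpha))

# The fixed point w(1/e) = 1/e holds for every alpha.
print(prelec_w(1 / math.e))             # ~0.3679 = 1/e
print(prelec_w(0.01), prelec_w(0.99))   # small p overweighted, large p underweighted
```

The last line exhibits the inverse-S shape central to prospect theory: w(0.01) > 0.01 while w(0.99) < 0.99.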

  1. THE BLACK HOLE FORMATION PROBABILITY

    Energy Technology Data Exchange (ETDEWEB)

    Clausen, Drew; Piro, Anthony L.; Ott, Christian D., E-mail: dclausen@tapir.caltech.edu [TAPIR, Walter Burke Institute for Theoretical Physics, California Institute of Technology, Mailcode 350-17, Pasadena, CA 91125 (United States)

    2015-02-01

    A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P_BH(M_ZAMS). Although we find that it is difficult to derive a unique P_BH(M_ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P_BH(M_ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P_BH(M_ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment.

  2. THE BLACK HOLE FORMATION PROBABILITY

    International Nuclear Information System (INIS)

    Clausen, Drew; Piro, Anthony L.; Ott, Christian D.

    2015-01-01

    A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P_BH(M_ZAMS). Although we find that it is difficult to derive a unique P_BH(M_ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P_BH(M_ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P_BH(M_ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment.

  3. The Black Hole Formation Probability

    Science.gov (United States)

    Clausen, Drew; Piro, Anthony L.; Ott, Christian D.

    2015-02-01

    A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P_BH(M_ZAMS). Although we find that it is difficult to derive a unique P_BH(M_ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P_BH(M_ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P_BH(M_ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment.

  4. Human Error Probability Assessment During Maintenance Activities of Marine Systems

    Directory of Open Access Journals (Sweden)

    Rabiul Islam

    2018-03-01

    Full Text Available Background: Maintenance operations on-board ships are highly demanding. Maintenance operations are intensive activities requiring high man–machine interactions in challenging and evolving conditions. The evolving conditions are weather conditions, workplace temperature, ship motion, noise and vibration, and workload and stress. For example, extreme weather conditions affect seafarers' performance, increasing the chances of error, and, consequently, can cause injuries or fatalities to personnel. An effective human error probability model is required to better manage maintenance on-board ships. The developed model would assist in developing and maintaining effective risk management protocols. Thus, the objective of this study is to develop a human error probability model considering various internal and external factors affecting seafarers' performance. Methods: The human error probability model is developed using probability theory applied to a Bayesian network. The model is tested using data collected through a questionnaire survey of >200 experienced seafarers with >5 years of experience. The model developed in this study is used to find the reliability of human performance on particular maintenance activities. Results: The developed methodology is tested on the maintenance of a marine engine's cooling water pump for the engine department and the anchor windlass for the deck department. In the considered case studies, human error probabilities are estimated in various scenarios and the results are compared between the scenarios and the different seafarer categories. The results of the case studies for both departments are also compared. Conclusion: The developed model is effective in assessing human error probabilities. These probabilities would get dynamically updated as and when new information is available on changes in either internal (i.e., training, experience, and fatigue) or external (i.e., environmental and operational) conditions.
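The basic Bayesian-network computation behind such a model — a human error probability (HEP) conditioned on performance-shaping factors and marginalized over their distributions — can be sketched as follows. All probabilities below are invented for illustration and are not the paper's elicited values.

```python
# Minimal sketch of the idea, not the authors' model: an HEP conditioned
# on two performance-shaping factors (fatigue, weather), marginalized
# over their assumed marginal probabilities. All numbers are invented.

p_fatigued = 0.3      # hypothetical P(seafarer fatigued)
p_bad_weather = 0.25  # hypothetical P(adverse weather)

# Hypothetical conditional probability table: P(error | fatigue, weather)
cpt = {
    (True, True): 0.12,
    (True, False): 0.06,
    (False, True): 0.04,
    (False, False): 0.01,
}

# Marginal HEP = sum over factor states of P(error | state) * P(state)
hep = sum(
    cpt[(f, w)]
    * (p_fatigued if f else 1 - p_fatigued)
    * (p_bad_weather if w else 1 - p_bad_weather)
    for f in (True, False)
    for w in (True, False)
)
print("marginal HEP: %.4f" % hep)
```

Updating `p_fatigued` or `p_bad_weather` as new information arrives and recomputing gives the kind of dynamic update the conclusion describes.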

  5. Foundations of the theory of probability

    CERN Document Server

    Kolmogorov, AN

    2018-01-01

    This famous little book remains a foundational text for the understanding of probability theory, important both to students beginning a serious study of probability and to historians of modern mathematics. 1956 second edition.

  6. The Probability Distribution for a Biased Spinner

    Science.gov (United States)

    Foster, Colin

    2012-01-01

    This article advocates biased spinners as an engaging context for statistics students. Calculating the probability of a biased spinner landing on a particular side makes valuable connections between probability and other areas of mathematics. (Contains 2 figures and 1 table.)

  7. Conditional Probability Modulates Visual Search Efficiency

    Directory of Open Access Journals (Sweden)

    Bryan eCort

    2013-10-01

    Full Text Available We investigated the effects of probability on visual search. Previous work has shown that people can utilize spatial and sequential probability information to improve target detection. We hypothesized that performance improvements from probability information would extend to the efficiency of visual search. Our task was a simple visual search in which the target was always present among a field of distractors, and could take one of two colors. The absolute probability of the target being either color was 0.5; however, the conditional probability – the likelihood of a particular color given a particular combination of two cues – varied from 0.1 to 0.9. We found that participants searched more efficiently for high conditional probability targets and less efficiently for low conditional probability targets, but only when they were explicitly informed of the probability relationship between cues and target color.

  8. Analytic Neutrino Oscillation Probabilities in Matter: Revisited

    Energy Technology Data Exchange (ETDEWEB)

    Parke, Stephen J. [Fermilab; Denton, Peter B. [Copenhagen U.; Minakata, Hisakazu [Madrid, IFT

    2018-01-02

    We summarize our recent paper on neutrino oscillation probabilities in matter, explaining the importance, relevance and need for simple, highly accurate approximations to the neutrino oscillation probabilities in matter.

  9. Void probability scaling in hadron nucleus interactions

    International Nuclear Information System (INIS)

    Ghosh, Dipak; Deb, Argha; Bhattacharyya, Swarnapratim; Ghosh, Jayita; Bandyopadhyay, Prabhat; Das, Rupa; Mukherjee, Sima

    2002-01-01

    Hegyi, while investigating the rapidity gap probability (which measures the chance of finding no particle in the pseudo-rapidity interval Δη), found that the scaling behavior of the rapidity gap probability has a close correspondence with the scaling of the void probability in galaxy correlation studies. The main aim of this paper is to study the scaling behavior of the rapidity gap probability.
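As background for the quantity studied here: for uncorrelated (Poisson) particle production, the void/rapidity-gap probability has a simple closed form, and correlations show up as deviations from it. The sketch below states that baseline only; it is not taken from the paper.

```python
# Illustrative Poisson baseline (not from the paper): with uncorrelated
# production, the probability of finding no particle in an interval with
# mean multiplicity <n> is P0 = exp(-<n>). Correlated production deviates
# from this curve, which is what void-probability scaling studies probe.
import math

def void_probability_poisson(mean_n):
    """P(no particle in interval) for a Poisson process with mean <n>."""
    return math.exp(-mean_n)

for mean_n in (0.5, 1.0, 2.0):
    print(mean_n, void_probability_poisson(mean_n))
```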

  10. Pre-Service Teachers' Conceptions of Probability

    Science.gov (United States)

    Odafe, Victor U.

    2011-01-01

    Probability knowledge and skills are needed in science and in making daily decisions that are sometimes made under uncertain conditions. Hence, there is the need to ensure that the pre-service teachers of our children are well prepared to teach probability. Pre-service teachers' conceptions of probability are identified, and ways of helping them…

  11. Using Playing Cards to Differentiate Probability Interpretations

    Science.gov (United States)

    López Puga, Jorge

    2014-01-01

    The aprioristic (classical, naïve and symmetric) and frequentist interpretations of probability are commonly known. Bayesian or subjective interpretation of probability is receiving increasing attention. This paper describes an activity to help students differentiate between the three types of probability interpretations.

  12. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2014-01-01

    The long-awaited revision of Fundamentals of Applied Probability and Random Processes expands on the central components that made the first edition a classic. The title is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability to real-world problems, and introduce the basics of statistics.

  13. Probability of Failure in Random Vibration

    DEFF Research Database (Denmark)

    Nielsen, Søren R.K.; Sørensen, John Dalsgaard

    1988-01-01

    Close approximations to the first-passage probability of failure in random vibration can be obtained by integral equation methods. A simple relation exists between the first-passage probability density function and the distribution function for the time interval spent below a barrier before out-crossing. An integral equation for the probability density function of the time interval is formulated, and adequate approximations for the kernel are suggested. The kernel approximation results in approximate solutions for the probability density function of the time interval and thus for the first-passage probability of failure.
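The quantity being approximated — the probability that a random process exceeds a barrier at least once within a time window — can be estimated by brute-force Monte Carlo for a simple case. The sketch below uses an i.i.d. Gaussian sequence rather than a correlated vibration response, so it is a cross-check of the target quantity, not the paper's integral-equation method.

```python
# Crude Monte Carlo estimate of a first-passage probability (a sketch,
# not the integral-equation method of the paper): probability that an
# i.i.d. standard Gaussian sequence exceeds barrier b at least once in
# n steps. For i.i.d. samples the exact value is 1 - (1 - q)**n with
# q = P(Z > b), so the estimate can be sanity-checked.
import random

random.seed(1)

def first_passage_probability(b=2.5, n=100, trials=5000):
    failures = 0
    for _ in range(trials):
        if any(random.gauss(0.0, 1.0) > b for _ in range(n)):
            failures += 1
    return failures / trials

p = first_passage_probability()
print("estimated first-passage probability:", p)
```

For b = 2.5 and n = 100 the exact i.i.d. value is about 0.46; a correlated process would require simulating the actual response history, which is where the integral-equation approximations earn their keep.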

  14. An Objective Theory of Probability (Routledge Revivals)

    CERN Document Server

    Gillies, Donald

    2012-01-01

    This reissue of D. A. Gillies' highly influential work, first published in 1973, is a philosophical theory of probability which seeks to develop von Mises' views on the subject. In agreement with von Mises, the author regards probability theory as a mathematical science like mechanics or electrodynamics, and probability as an objective, measurable concept like force, mass or charge. On the other hand, Dr Gillies rejects von Mises' definition of probability in terms of limiting frequency and claims that probability should be taken as a primitive or undefined term in accordance with modern axiomatics.

  15. Paraconsistent Probabilities: Consistency, Contradictions and Bayes’ Theorem

    Directory of Open Access Journals (Sweden)

    Juliana Bueno-Soler

    2016-09-01

    Full Text Available This paper represents the first steps towards constructing a paraconsistent theory of probability based on the Logics of Formal Inconsistency (LFIs). We show that LFIs encode very naturally an extension of the notion of probability able to express sophisticated probabilistic reasoning under contradictions employing appropriate notions of conditional probability and paraconsistent updating, via a version of Bayes' theorem for conditionalization. We argue that the dissimilarity between the notions of inconsistency and contradiction, one of the pillars of LFIs, plays a central role in our extended notion of probability. Some critical historical and conceptual points about probability theory are also reviewed.
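For reference, the classical Bayes-theorem conditionalization that the paper extends to the paraconsistent setting looks like this; the prior, sensitivity, and false-positive rate below are illustrative numbers, not from the paper.

```python
# Classical Bayesian conditionalization: the operation the paper
# generalizes to reasoning under contradictions. Numbers are illustrative.
def bayes_posterior(prior, likelihood, likelihood_alt):
    """P(H|E) from P(H), P(E|H) and P(E|not H) via Bayes' theorem."""
    evidence = likelihood * prior + likelihood_alt * (1 - prior)
    return likelihood * prior / evidence

# Hypothetical: prior 0.01, P(E|H) = 0.95, P(E|not H) = 0.05.
post = bayes_posterior(0.01, 0.95, 0.05)
print("posterior: %.4f" % post)
```

In the classical setting the evidence term is a consistent mixture; the paraconsistent version has to make sense of conditioning when the underlying propositions may be contradictory.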

  16. Probability of detection of clinical seizures using heart rate changes.

    Science.gov (United States)

    Osorio, Ivan; Manly, B F J

    2015-08-01

    Heart rate-based seizure detection is a viable complement or alternative to ECoG/EEG. This study investigates the role of various biological factors on the probability of clinical seizure detection using heart rate. Regression models were applied to 266 clinical seizures recorded from 72 subjects to investigate if factors such as age, gender, years with epilepsy, etiology, seizure site origin, seizure class, and data collection centers, among others, shape the probability of EKG-based seizure detection. Clinical seizure detection probability based on heart rate changes is significantly dependent on several of these factors. The probability of detecting clinical seizures (>0.8 in the majority of subjects) using heart rate is highest for complex partial seizures, increases with a patient's years with epilepsy, is lower for females than for males and is unrelated to the side of hemisphere origin. Clinical seizure detection probability using heart rate is multi-factorially dependent and sufficiently high (>0.8) in most cases to be clinically useful. Knowledge of the role that these factors play in shaping said probability will enhance its applicability and usefulness. Heart rate is a reliable and practical signal for extra-cerebral detection of clinical seizures originating from or spreading to central autonomic network structures. Copyright © 2015 British Epilepsy Association. Published by Elsevier Ltd. All rights reserved.

  17. Probability based calibration of pressure coefficients

    DEFF Research Database (Denmark)

    Hansen, Svend Ole; Pedersen, Marie Louise; Sørensen, John Dalsgaard

    2015-01-01

    Normally, a consistent basis for calculating partial factors focuses on a homogeneous reliability index neither depending on which material the structure is constructed of nor on the ratio between the permanent and variable actions acting on the structure. Furthermore, the reliability index should not depend on the type of variable action. A probability-based calibration of pressure coefficients has been carried out using pressure measurements on the standard CAARC building modelled at a scale of 1:383. The extreme pressures measured on the CAARC building model in the wind tunnel have been fitted …; the characteristic shape coefficients are based on mean values as specified in background documents to the Eurocodes. …, the Eurocode partial factor of 1.5 for variable actions agrees well with the inherent uncertainties of wind actions when the pressure coefficients are determined using wind tunnel test results. The increased bias and uncertainty when pressure coefficients are mainly based on structural codes lead to a larger …. The importance of hidden safeties in judging the reliability is discussed for wind actions on low-rise structures.

  18. Expert estimation of human error probabilities in nuclear power plant operations: a review of probability assessment and scaling

    International Nuclear Information System (INIS)

    Stillwell, W.G.; Seaver, D.A.; Schwartz, J.P.

    1982-05-01

    This report reviews probability assessment and psychological scaling techniques that could be used to estimate human error probabilities (HEPs) in nuclear power plant operations. The techniques rely on expert opinion and can be used to estimate HEPs where data do not exist or are inadequate. These techniques have been used in various other contexts and have been shown to produce reasonably accurate probabilities. Some problems do exist, and limitations are discussed. Additional topics covered include methods for combining estimates from multiple experts, the effects of training on probability estimates, and some ideas on structuring the relationship between performance shaping factors and HEPs. Preliminary recommendations are provided along with cautions regarding the costs of implementing the recommendations. Additional research is required before definitive recommendations can be made
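On the topic of combining estimates from multiple experts mentioned in this abstract: one common aggregation rule for quantities spanning orders of magnitude, such as HEPs, is the geometric mean. The sketch below illustrates that rule with invented elicited values; it is an example of the general idea, not the report's specific recommendation.

```python
# One common way to combine multiple experts' human error probability
# (HEP) estimates is the geometric mean, which behaves sensibly for
# quantities spanning orders of magnitude. Estimates below are invented.
import math

def combine_heps(estimates):
    """Geometric mean of expert-supplied HEP estimates."""
    log_sum = sum(math.log(e) for e in estimates)
    return math.exp(log_sum / len(estimates))

experts = [1e-3, 5e-3, 2e-2]  # hypothetical elicited HEPs from three experts
print("combined HEP: %.2e" % combine_heps(experts))
```

Unlike the arithmetic mean, the geometric mean is not dominated by the single largest estimate, which matters when expert HEPs differ by factors of ten or more.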

  19. Low prevalence of antibodies to human T-lymphotropic virus-I/II among blood donors in eastern Saudi Arabia.

    Science.gov (United States)

    Fawaz, Naglaa A; Tamim, Hala; Almawi, Wassim Y

    2005-04-01

    The seroprevalence of human T-lymphotropic virus (HTLV)-I/II was assessed in 13,443 consecutive blood donors in eastern Saudi Arabia between 1998 and 2001. Screening by enzyme-linked immunosorbent assay (ELISA) and confirmation by Western blot resulted in 8 (0.060%) positive cases, of which 5 (0.056%) belonged to Saudi and 3 (0.113%) to non-Saudi donors. The majority of the HTLV-positive donations (6/8) were for patients, and none had a history of known risk factors for HTLV-I/II transmission. Although the very low prevalence of HTLV-I/II among Saudi donors does not support routine screening, screening of donors from other nationalities may be initiated, especially those from HTLV-I/II endemic areas.

  20. Low blood selenium: A probable factor in essential hypertension ...

    African Journals Online (AJOL)

    Blood selenium (BSe) and plasma glutathione peroxidase (plGSH-Px) activity were measured as biochemical markers of selenium status of 103 hypertensive patients (44 males and 59 females) and 88 apparently healthy subjects (40 males and 48 females). The hypertensive patients were classified into three groups based ...

  1. Planetary interchange of bioactive material: probability factors and implications.

    Science.gov (United States)

    Clark, B C

    2001-01-01

    It is now well-accepted that both lunar and martian materials are represented in the meteorite collections. Early suggestions that viable organisms might survive natural transport between planets have not yet been thoroughly examined. The concept of Planetary Interchange of Bioactive Material (PIBM) is potentially relevant to the conditions under which life originated. PIBM has also been invoked to infer that the potential danger to Earth from martian materials is non-existent, an inference that, however, has many pitfalls. Numerous impediments to efficient transfer of viable organisms exist. In this work, the lethality of space radiation during long transients and the biasing of launched objects toward materials unlikely to host abundant organisms are examined and shown to reduce the likelihood of successful transfer by orders of magnitude. It is also shown that martian meteorites studied to date assuredly have been subjected to sterilizing levels of ionizing radiation in space. PIBM considerations apply both to the solar system locale(s) of the origin of life and to the applicability of planetary protection protocols to preserve the biospheres of planetary bodies, including our own.

  2. Emptiness formation probability of XX-chain in diffusion process

    International Nuclear Information System (INIS)

    Ogata, Yoshiko

    2004-01-01

    We study the distribution of the emptiness formation probability of the XX-model in the diffusion process. There exists a Gaussian decay as well as an exponential decay. The Gaussian decay is caused by the existence of a zero point in the Fermi distribution function. The correlation length at each value of the scaling factor varies with the initial condition, monotonically or non-monotonically.

  3. Estimated probability of stroke among medical outpatients in Enugu ...

    African Journals Online (AJOL)

    Risk factors for stroke were evaluated using a series of laboratory tests, medical history and physical examinations. The 10‑year probability of stroke was determined by applying the Framingham stroke risk equation. Statistical analysis was performed with the use of the SPSS 17.0 software package (SPSS Inc., Chicago, IL, ...

  4. Parametric modeling of probability of bank loan default in Kenya ...

    African Journals Online (AJOL)

    This makes the study on probability of a customer defaulting very useful while analyzing the credit risk policies. In this paper, we use a raw data set that contains demographic information about the borrowers. The data sets have been used to identify which risk factors associated with the borrowers contribute towards default.

  5. Probability concepts in quality risk management.

    Science.gov (United States)

    Claycamp, H Gregg

    2012-01-01

    Essentially any concept of risk is built on fundamental concepts of chance, likelihood, or probability. Although risk is generally a probability of loss of something of value, given that a risk-generating event will occur or has occurred, it is ironic that the quality risk management literature and guidelines on quality risk management tools are relatively silent on the meaning and uses of "probability." The probability concept is typically applied by risk managers as a combination of frequency-based calculation and a "degree of belief" meaning of probability. Probability as a concept that is crucial for understanding and managing risk is discussed through examples from the most general, scenario-defining and ranking tools that use probability implicitly to more specific probabilistic tools in risk management. A rich history of probability in risk management applied to other fields suggests that high-quality risk management decisions benefit from the implementation of more thoughtful probability concepts in both risk modeling and risk management. Essentially any concept of risk is built on fundamental concepts of chance, likelihood, or probability. Although "risk" generally describes a probability of loss of something of value, given that a risk-generating event will occur or has occurred, it is ironic that the quality risk management literature and guidelines on quality risk management methodologies and respective tools focus on managing severity but are relatively silent on the in-depth meaning and uses of "probability." Pharmaceutical manufacturers are expanding their use of quality risk management to identify and manage risks to the patient that might occur in phases of the pharmaceutical life cycle from drug development to manufacture, marketing to product discontinuation. A probability concept is typically applied by risk managers as a combination of data-based measures of probability and a subjective "degree of belief" meaning of probability. 

  6. Transition probability spaces in loop quantum gravity

    Science.gov (United States)

    Guo, Xiao-Kan

    2018-03-01

    We study the (generalized) transition probability spaces, in the sense of Mielnik and Cantoni, for spacetime quantum states in loop quantum gravity. First, we show that loop quantum gravity admits the structures of transition probability spaces. This is exemplified by first checking such structures in covariant quantum mechanics and then identifying the transition probability spaces in spin foam models via a simplified version of general boundary formulation. The transition probability space thus defined gives a simple way to reconstruct the discrete analog of the Hilbert space of the canonical theory and the relevant quantum logical structures. Second, we show that the transition probability space and in particular the spin foam model are 2-categories. Then we discuss how to realize in spin foam models two proposals by Crane about the mathematical structures of quantum gravity, namely, the quantum topos and causal sites. We conclude that transition probability spaces provide us with an alternative framework to understand various foundational questions of loop quantum gravity.

  7. Towards a Categorical Account of Conditional Probability

    Directory of Open Access Journals (Sweden)

    Robert Furber

    2015-11-01

    Full Text Available This paper presents a categorical account of conditional probability, covering both the classical and the quantum case. Classical conditional probabilities are expressed as a certain "triangle-fill-in" condition, connecting marginal and joint probabilities, in the Kleisli category of the distribution monad. The conditional probabilities are induced by a map together with a predicate (the condition). The latter is a predicate in the logic of effect modules on this Kleisli category. This same approach can be transferred to the category of C*-algebras (with positive unital maps), whose predicate logic is also expressed in terms of effect modules. Conditional probabilities can again be expressed via a triangle-fill-in property. In the literature, there are several proposals for what quantum conditional probability should be, and also there are extra difficulties not present in the classical case. At this stage, we only describe quantum systems with classical parametrization.

  8. UT Biomedical Informatics Lab (BMIL) probability wheel

    Science.gov (United States)

    Huang, Sheng-Cheng; Lee, Sara; Wang, Allen; Cantor, Scott B.; Sun, Clement; Fan, Kaili; Reece, Gregory P.; Kim, Min Soon; Markey, Mia K.

    A probability wheel app is intended to facilitate communication between two people, an "investigator" and a "participant", about uncertainties inherent in decision-making. Traditionally, a probability wheel is a mechanical prop with two colored slices. A user adjusts the sizes of the slices to indicate the relative value of the probabilities assigned to them. A probability wheel can improve the adjustment process and attenuate the effect of anchoring bias when it is used to estimate or communicate probabilities of outcomes. The goal of this work was to develop a mobile application of the probability wheel that is portable, easily available, and more versatile. We provide a motivating example from medical decision-making, but the tool is widely applicable for researchers in the decision sciences.

  9. A probability space for quantum models

    Science.gov (United States)

    Lemmens, L. F.

    2017-06-01

    A probability space contains a set of outcomes, a collection of events formed by subsets of the set of outcomes and probabilities defined for all events. A reformulation in terms of propositions allows one to use the maximum entropy method to assign the probabilities taking some constraints into account. The construction of a probability space for quantum models is determined by the choice of propositions, choosing the constraints and making the probability assignment by the maximum entropy method. This approach shows how typical quantum distributions such as Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein are partly related to well-known classical distributions. The relation between the conditional probability density, given some averages as constraints, and the appropriate ensemble is elucidated.
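The maximum-entropy assignment with an average-value constraint mentioned in this abstract has a well-known closed form: p_i ∝ exp(-βE_i), with β tuned to meet the constraint. The sketch below solves for β by bisection for a toy three-outcome system; the "energies" and target mean are invented for illustration.

```python
# Sketch of a maximum-entropy probability assignment: given outcomes
# with "energies" E_i and a constraint on the average energy, MaxEnt
# yields p_i ∝ exp(-beta * E_i) (a Boltzmann-type distribution), with
# beta chosen by bisection to satisfy the constraint.
import math

def maxent_probs(energies, target_mean, lo=-50.0, hi=50.0, iters=200):
    def mean_energy(beta):
        weights = [math.exp(-beta * e) for e in energies]
        z = sum(weights)
        return sum(w * e for w, e in zip(weights, energies)) / z

    # mean_energy decreases monotonically in beta, so bisection applies:
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if mean_energy(mid) > target_mean:
            lo = mid  # need larger beta to lower the mean
        else:
            hi = mid
    beta = 0.5 * (lo + hi)
    weights = [math.exp(-beta * e) for e in energies]
    z = sum(weights)
    return [w / z for w in weights]

p = maxent_probs([0.0, 1.0, 2.0], target_mean=0.6)
print(p)
```

Among all distributions with the prescribed mean, this one maximizes the Shannon entropy, which is exactly the assignment rule the abstract invokes.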

  10. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2005-01-01

    This book is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability to real-world problems, and introduce the basics of statistics. The book's clear writing style and homework problems make it ideal for the classroom or for self-study. Highlights: a good and solid introduction to probability theory and stochastic processes; logically organized, clearly presented writing; a comprehensive choice of topics within the area of probability; ample homework problems organized into chapter sections.

  11. Striatal activity is modulated by target probability.

    Science.gov (United States)

    Hon, Nicholas

    2017-06-14

    Target probability has well-known neural effects. In the brain, target probability is known to affect frontal activity, with lower probability targets producing more prefrontal activation than those that occur with higher probability. Although the effect of target probability on cortical activity is well specified, its effect on subcortical structures such as the striatum is less well understood. Here, I examined this issue and found that the striatum was highly responsive to target probability. This is consistent with its hypothesized role in the gating of salient information into higher-order task representations. The current data are interpreted in light of the fact that different components of the striatum are sensitive to different types of task-relevant information.

  12. Defining Probability in Sex Offender Risk Assessment.

    Science.gov (United States)

    Elwood, Richard W

    2016-12-01

    There is ongoing debate and confusion over using actuarial scales to predict individuals' risk of sexual recidivism. Much of the debate comes from not distinguishing Frequentist from Bayesian definitions of probability. Much of the confusion comes from applying Frequentist probability to individuals' risk. By definition, only Bayesian probability can be applied to the single case. The Bayesian concept of probability resolves most of the confusion and much of the debate in sex offender risk assessment. Although Bayesian probability is well accepted in risk assessment generally, it has not been widely used to assess the risk of sex offenders. I review the two concepts of probability and show how the Bayesian view alone provides a coherent scheme to conceptualize individuals' risk of sexual recidivism.
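    The single-case Bayesian view described above can be made concrete with a short sketch (the numbers are hypothetical, not from the article): a base rate serves as the prior for the individual, and an actuarial score enters as a likelihood ratio via Bayes' rule in odds form.

```python
def posterior_odds(prior_p, likelihood_ratio):
    """Bayes' rule in odds form: posterior odds = prior odds * likelihood ratio.
    Returns the posterior probability."""
    prior_odds = prior_p / (1.0 - prior_p)
    post_odds = prior_odds * likelihood_ratio
    return post_odds / (1.0 + post_odds)

# Hypothetical numbers: a 10% base rate of recidivism, and an actuarial
# score whose likelihood ratio for this individual is 3.0.
p = posterior_odds(0.10, 3.0)   # prior odds 1:9, posterior odds 1:3 -> 0.25
```

    The point of the Bayesian reading is that this posterior is a coherent statement about the single case, not a frequency in some reference group.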

  13. Spatial probability aids visual stimulus discrimination

    Directory of Open Access Journals (Sweden)

    Michael Druker

    2010-08-01

    Full Text Available We investigated whether the statistical predictability of a target's location would influence how quickly and accurately it was classified. Recent results have suggested that spatial probability can be a cue for the allocation of attention in visual search. One explanation for probability cuing is spatial repetition priming. In our two experiments we used probability distributions that were continuous across the display rather than relying on a few arbitrary screen locations. This produced fewer spatial repeats and allowed us to dissociate the effect of a high probability location from that of short-term spatial repetition. The task required participants to quickly judge the color of a single dot presented on a computer screen. In Experiment 1, targets were more probable in an off-center hotspot of high probability that gradually declined to a background rate. Targets garnered faster responses if they were near earlier target locations (priming) and if they were near the high probability hotspot (probability cuing). In Experiment 2, target locations were chosen on three concentric circles around fixation. One circle contained 80% of targets. The value of this ring distribution is that it allowed for a spatially restricted high probability zone in which sequentially repeated trials were not likely to be physically close. Participant performance was sensitive to the high-probability circle in addition to the expected effects of eccentricity and the distance to recent targets. These two experiments suggest that inhomogeneities in spatial probability can be learned and used by participants on-line and without prompting as an aid for visual stimulus discrimination, and that spatial repetition priming is not a sufficient explanation for this effect. Future models of attention should consider explicitly incorporating the probabilities of target locations and features.

  14. Development of molecular methods for detection and epidemiological investigation of HIV-1, HIV-2, and HTLV-I/II infections

    NARCIS (Netherlands)

    Meijer A; Borleffs JCC; Roosendaal G; van Loon AM; VIR; AZU; Van Creveld Kliniek Utrecht

    1995-01-01

    The research presented here was initiated to explore the possibilities of molecular methods for the detection and epidemiological investigation of HIV and HTLV infections. We present the results of a literature study and describe the development and partial evaluation

  15. Imaging spinal cord atrophy in progressive myelopathies: HTLV-I-associated neurological disease (HAM/TSP) and multiple sclerosis (MS).

    Science.gov (United States)

    Azodi, Shila; Nair, Govind; Enose-Akahata, Yoshimi; Charlip, Emily; Vellucci, Ashley; Cortese, Irene; Dwyer, Jenifer; Billioux, B Jeanne; Thomas, Chevaz; Ohayon, Joan; Reich, Daniel S; Jacobson, Steven

    2017-11-01

    Previous work measures spinal cord thinning in chronic progressive myelopathies, including human T-lymphotropic virus 1 (HTLV-1)-associated myelopathy/tropical spastic paraparesis (HAM/TSP) and multiple sclerosis (MS). Quantitative measurements of spinal cord atrophy are important in fully characterizing these and other spinal cord diseases. We aimed to investigate patterns of spinal cord atrophy and correlations with clinical markers. Spinal cord cross-sectional area was measured from C1 to T10 in 24 healthy controls [HCs], 17 asymptomatic carriers of HTLV-1 [AC], 47 HAM/TSP, 74 relapsing-remitting MS [RRMS], 17 secondary progressive MS [SPMS], and 40 primary progressive MS [PPMS]. Clinical disability scores, viral markers, and immunological parameters were obtained for patients and correlated with representative spinal cord cross-sectional area regions at the C2 to C3, C4 to C5, and T4 to T9 levels. In 2 HAM/TSP patients, spinal cord cross-sectional area was measured over 3 years. All spinal cord regions are thinner in HAM/TSP (56 mm² [standard deviation, 10], 59 [10], 23 [5]) than in HC (76 [7], 83 [8], 38 [4]) and AC (71 [7], 78 [9], 36 [7]). SPMS (62 [9], 66 [9], 32 [6]) and PPMS (65 [11], 68 [10], 35 [7]) have thinner cervical cords than HC and RRMS (73 [9], 77 [10], 37 [6]). Clinical disability scores (Expanded Disability Status Scale [p = 0.009] and Instituto de Pesquisas de Cananeia [p = 0.03]) and CD8+ T-cell frequency (p = 0.04) correlate with T4 to T9 spinal cord cross-sectional area in HAM/TSP. Higher cerebrospinal fluid HTLV-1 proviral load (p = 0.01) was associated with thinner spinal cord cross-sectional area. Both HAM/TSP patients followed longitudinally showed thoracic thinning followed by cervical thinning. Group average spinal cord cross-sectional area in HAM/TSP and progressive MS show spinal cord atrophy. We further hypothesize that, in HAM/TSP, neuroglial loss from a thoracic inflammatory process results in anterograde and retrograde degeneration of axons, leading to the temporal progression of thoracic to cervical atrophy described here. Ann Neurol 2017;82:719-728. © 2017 American Neurological Association.

  16. Is probability of frequency too narrow?

    International Nuclear Information System (INIS)

    Martz, H.F.

    1993-01-01

    Modern methods of statistical data analysis, such as empirical and hierarchical Bayesian methods, should find increasing use in future Probabilistic Risk Assessment (PRA) applications. In addition, there will be a more formalized use of expert judgment in future PRAs. These methods require an extension of the probabilistic framework of PRA, in particular, the popular notion of probability of frequency, to consideration of frequency of frequency, frequency of probability, and probability of probability. The genesis, interpretation, and examples of these three extended notions are discussed
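    The "probability of probability" notion above can be sketched in miniature (a hedged illustration, not from the paper; the prior and the counts are hypothetical): in a hierarchical Bayesian reading, the failure probability itself carries a distribution, e.g. a Beta distribution updated by observed demands.

```python
# "Probability of frequency": treat the failure probability p itself as
# uncertain, with a Beta(a, b) distribution updated by observed data.
def beta_update(a, b, failures, successes):
    """Conjugate Beta-binomial update."""
    return a + failures, b + successes

def beta_mean(a, b):
    """Posterior mean of p, the value to use for a single future demand."""
    return a / (a + b)

# Hypothetical: a Jeffreys prior Beta(0.5, 0.5), then 2 failures in 100 demands.
a, b = beta_update(0.5, 0.5, 2, 98)
mean_p = beta_mean(a, b)
```

    The Beta posterior is the "probability of probability" object; its mean is the single number one would quote, while its spread carries the epistemic uncertainty the abstract discusses.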

  17. Probability evolution method for exit location distribution

    Science.gov (United States)

    Zhu, Jinjie; Chen, Zhen; Liu, Xianbin

    2018-03-01

    The exit problem in the framework of the large deviation theory has been a hot topic in the past few decades. The most probable escape path in the weak-noise limit has been clarified by the Freidlin-Wentzell action functional. However, noise in real physical systems cannot be arbitrarily small, and noise with finite strength may induce nontrivial phenomena, such as noise-induced shift and noise-induced saddle-point avoidance. Traditional Monte Carlo simulation of noise-induced escape takes an exponentially long time as the noise approaches zero, with the majority of the time wasted on uninteresting wandering around the attractors. In this paper, a new method is proposed to decrease the escape simulation time by an exponentially large factor by introducing a series of interfaces and applying reinjection on them. This method can be used to calculate the exit location distribution. It is verified by examining two classical examples and is compared with theoretical predictions. The results show that the method performs well for weak noise but may induce certain deviations for large noise. Finally, some possible ways to improve our method are discussed.

  18. Probability based load combinations for design of category I structures

    International Nuclear Information System (INIS)

    Reich, M.; Hwang, H.

    1985-01-01

    This paper discusses a reliability analysis method and a procedure for developing the load combination design criteria for category I structures. For safety evaluation of category I concrete structures under various static and dynamic loads, a probability-based reliability analysis method has been developed. This reliability analysis method is also used as a tool for determining the load factors for design of category I structures. In this paper, the load combinations for design of concrete containments, corresponding to a target limit state probability of 1.0 x 10^-6 in 4 years, are described. A comparison of containments designed using the ASME code and the proposed design criteria is also presented

  19. Probability of Grounding and Collision Events

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup

    1996-01-01

    To quantify the risks involved in ship traffic, rational criteria for collision and grounding accidents are developed. This implies that probabilities as well as inherent consequences can be analysed and assessed. The present paper outlines a method for evaluation of the probability of ship

  20. Probability of Grounding and Collision Events

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup

    1996-01-01

    To quantify the risks involved in ship traffic, rational criteria for collision and grounding accidents have to be developed. This implies that probabilities as well as inherent consequences have to be analyzed and assessed.The present notes outline a method for evaluation of the probability...

  1. Introducing Disjoint and Independent Events in Probability.

    Science.gov (United States)

    Kelly, I. W.; Zwiers, F. W.

    Two central concepts in probability theory are those of independence and mutually exclusive events. This document is intended to provide suggestions to teachers that can be used to equip students with an intuitive, comprehensive understanding of these basic concepts in probability. The first section of the paper delineates mutually exclusive and…

  2. Selected papers on probability and statistics

    CERN Document Server

    2009-01-01

    This volume contains translations of papers that originally appeared in the Japanese journal Sūgaku. The papers range over a variety of topics in probability theory, statistics, and applications. This volume is suitable for graduate students and research mathematicians interested in probability and statistics.

  3. Collective probabilities algorithm for surface hopping calculations

    International Nuclear Information System (INIS)

    Bastida, Adolfo; Cruz, Carlos; Zuniga, Jose; Requena, Alberto

    2003-01-01

    General equations that the transition probabilities of hopping algorithms in surface hopping calculations must obey to ensure equality between the average quantum and classical populations are derived. These equations are solved for two particular cases. In the first, it is assumed that the probabilities are the same for all trajectories and that the number of hops is kept to a minimum. These assumptions specify the collective probabilities (CP) algorithm, for which the transition probabilities depend on the average populations over all trajectories. In the second case, the probabilities for each trajectory are supposed to be completely independent of the results from the other trajectories. There is, then, a unique solution of the general equations assuring that the transition probabilities are equal to the quantum population of the target state, which is referred to as the independent probabilities (IP) algorithm. The fewest switches (FS) algorithm developed by Tully is accordingly understood as an approximate hopping algorithm which takes elements from the accurate CP and IP solutions. A numerical test of all these hopping algorithms is carried out for a one-dimensional two-state problem with two avoided crossings, which shows the accuracy and computational efficiency of the collective probabilities algorithm proposed, the limitations of the FS algorithm, and the similarity between the results offered by the IP algorithm and those obtained with the Ehrenfest method.
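    The population-matching idea behind these hopping algorithms can be sketched for a two-state case (a simplified illustration under stated assumptions, not the authors' actual equations): if the quantum population of the target state grows by Δ over a timestep, trajectories on the source state hop with probability Δ divided by the source-state fraction, so that on average the classical populations track the quantum ones with few hops.

```python
def hop_probability(pop_old, pop_new):
    """Minimal-hops transition probability from state 0 to state 1 when the
    quantum population of state 1 grows from pop_old to pop_new over a step.
    Each trajectory currently on state 0 hops with this probability so that,
    on average, classical and quantum populations stay equal."""
    growth = pop_new - pop_old
    if growth <= 0.0:
        return 0.0          # population flows the other way: no 0 -> 1 hops
    return min(1.0, growth / (1.0 - pop_old))

# If state 1's quantum population rises from 0.2 to 0.3, each trajectory
# on state 0 should hop with probability 0.1 / 0.8 = 0.125.
g = hop_probability(0.2, 0.3)
```

    Making this probability depend on trajectory-averaged populations rather than per-trajectory coefficients is the distinction the abstract draws between the CP and IP/FS families.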

  4. Examples of Neutrosophic Probability in Physics

    Directory of Open Access Journals (Sweden)

    Fu Yuhua

    2015-01-01

    Full Text Available This paper re-discusses the problems of the so-called “law of nonconservation of parity” and “accelerating expansion of the universe”, and presents the examples of determining Neutrosophic Probability of the experiment of Chien-Shiung Wu et al in 1957, and determining Neutrosophic Probability of accelerating expansion of the partial universe.

  5. Eliciting Subjective Probabilities with Binary Lotteries

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd

    objective probabilities. Drawing a sample from the same subject population, we find evidence that the binary lottery procedure induces linear utility in a subjective probability elicitation task using the Quadratic Scoring Rule. We also show that the binary lottery procedure can induce direct revelation...

  6. Probability Issues in without Replacement Sampling

    Science.gov (United States)

    Joarder, A. H.; Al-Sabah, W. S.

    2007-01-01

    Sampling without replacement is an important aspect in teaching conditional probabilities in elementary statistics courses. Different methods proposed in different texts for calculating probabilities of events in this context are reviewed and their relative merits and limitations in applications are pinpointed. An alternative representation of…

  7. Some open problems in noncommutative probability

    International Nuclear Information System (INIS)

    Kruszynski, P.

    1981-01-01

    A generalization of probability measures to non-Boolean structures is discussed. The starting point of the theory is the Gleason theorem about the form of measures on closed subspaces of a Hilbert space. The problems are formulated in terms of probability on lattices of projections in arbitrary von Neumann algebras. (Auth.)

  8. Probability: A Matter of Life and Death

    Science.gov (United States)

    Hassani, Mehdi; Kippen, Rebecca; Mills, Terence

    2016-01-01

    Life tables are mathematical tables that document probabilities of dying and life expectancies at different ages in a society. Thus, the life table contains some essential features of the health of a population. Probability is often regarded as a difficult branch of mathematics. Life tables provide an interesting approach to introducing concepts…

  9. Teaching Probability: A Socio-Constructivist Perspective

    Science.gov (United States)

    Sharma, Sashi

    2015-01-01

    There is a considerable and rich literature on students' misconceptions in probability. However, less attention has been paid to the development of students' probabilistic thinking in the classroom. This paper offers a sequence, grounded in socio-constructivist perspective for teaching probability.

  10. Stimulus Probability Effects in Absolute Identification

    Science.gov (United States)

    Kent, Christopher; Lamberts, Koen

    2016-01-01

    This study investigated the effect of stimulus presentation probability on accuracy and response times in an absolute identification task. Three schedules of presentation were used to investigate the interaction between presentation probability and stimulus position within the set. Data from individual participants indicated strong effects of…

  11. 47 CFR 1.1623 - Probability calculation.

    Science.gov (United States)

    2010-10-01

    ... 47 Telecommunication 1 2010-10-01 2010-10-01 false Probability calculation. 1.1623 Section 1.1623 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Random Selection Procedures for Mass Media Services General Procedures § 1.1623 Probability calculation. (a) All calculations shall be...

  12. Simulations of Probabilities for Quantum Computing

    Science.gov (United States)

    Zak, M.

    1996-01-01

    It has been demonstrated that classical probabilities, and in particular, probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities and their link to quantum computations are discussed.

  13. Against All Odds: When Logic Meets Probability

    NARCIS (Netherlands)

    van Benthem, J.; Katoen, J.-P.; Langerak, R.; Rensink, A.

    2017-01-01

    This paper is a light walk along interfaces between logic and probability, triggered by a chance encounter with Ed Brinksma. It is not a research paper, or a literature survey, but a pointer to issues. I discuss both direct combinations of logic and probability and structured ways in which logic can

  14. An introduction to probability and stochastic processes

    CERN Document Server

    Melsa, James L

    2013-01-01

    Geared toward college seniors and first-year graduate students, this text is designed for a one-semester course in probability and stochastic processes. Topics covered in detail include probability theory, random variables and their functions, stochastic processes, linear system response to stochastic processes, Gaussian and Markov processes, and stochastic differential equations. 1973 edition.

  15. Probability elements of the mathematical theory

    CERN Document Server

    Heathcote, C R

    2000-01-01

    Designed for students studying mathematical statistics and probability after completing a course in calculus and real variables, this text deals with basic notions of probability spaces, random variables, distribution functions and generating functions, as well as joint distributions and the convergence properties of sequences of random variables. Includes worked examples and over 250 exercises with solutions.

  16. The transition probabilities of the reciprocity model

    NARCIS (Netherlands)

    Snijders, T.A.B.

    1999-01-01

    The reciprocity model is a continuous-time Markov chain model used for modeling longitudinal network data. A new explicit expression is derived for its transition probability matrix. This expression can be checked relatively easily. Some properties of the transition probabilities are given, as well

  17. Probability numeracy and health insurance purchase

    NARCIS (Netherlands)

    Dillingh, Rik; Kooreman, Peter; Potters, Jan

    2016-01-01

    This paper provides new field evidence on the role of probability numeracy in health insurance purchase. Our regression results, based on rich survey panel data, indicate that the expenditure on two out of three measures of health insurance first rises with probability numeracy and then falls again.

  18. Maximizing probable oil field profit: uncertainties on well spacing

    International Nuclear Information System (INIS)

    MacKay, J.A.; Lerche, I.

    1997-01-01

    The influence of uncertainties in field development costs, well costs, lifting costs, selling price, discount factor, and oil field reserves is evaluated for its impact on assessing probable ranges of uncertainty on present day worth (PDW), oil field lifetime τ_(2/3), optimum number of wells (OWI), and the minimum (n-) and maximum (n+) number of wells to produce a PDW ≥ 0. The relative importance of different factors in contributing to the uncertainties in PDW, τ_(2/3), OWI, n- and n+ is also analyzed. Numerical illustrations indicate how the maximum PDW depends on the ranges of parameter values, drawn from probability distributions using Monte Carlo simulations. In addition, the procedure illustrates the relative importance of contributions of individual factors to the total uncertainty, so that one can assess where to place effort to improve ranges of uncertainty; while the volatility of each estimate allows one to determine when such effort is needed. (author)
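    The Monte Carlo propagation described above can be sketched as follows (a toy cash-flow model; every range and figure below is invented for illustration and the paper's model is richer):

```python
import random

random.seed(1)

def npv(reserves, price, lift_cost, well_cost, n_wells, discount, years=10):
    """Toy present-day-worth: the reserves are produced evenly over `years`
    and each year's cash flow is discounted back to today."""
    annual = reserves / years
    pdw = -well_cost * n_wells                      # up-front drilling cost
    for t in range(1, years + 1):
        cash = annual * (price - lift_cost)
        pdw += cash / (1.0 + discount) ** t
    return pdw

# Propagate uncertainty by sampling each factor from an assumed range.
samples = []
for _ in range(5000):
    samples.append(npv(
        reserves=random.uniform(8e6, 12e6),         # barrels (hypothetical)
        price=random.uniform(15.0, 25.0),           # $/bbl
        lift_cost=random.uniform(4.0, 8.0),         # $/bbl
        well_cost=random.uniform(3e6, 6e6),         # $/well
        n_wells=10,
        discount=random.uniform(0.08, 0.12),
    ))
samples.sort()
# Approximate P10 / P50 / P90 of the PDW distribution
p10, p50, p90 = samples[500], samples[2500], samples[4500]
```

    Repeating the run with one factor pinned to its midpoint shows how much of the P10-P90 spread that factor contributes, which is the sensitivity ranking the abstract refers to.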

  19. The enigma of probability and physics

    International Nuclear Information System (INIS)

    Mayants, L.

    1984-01-01

    This volume contains a coherent exposition of the elements of two unique sciences: probabilistics (science of probability) and probabilistic physics (application of probabilistics to physics). Proceeding from a key methodological principle, it starts with the disclosure of the true content of probability and the interrelation between probability theory and experimental statistics. This makes it possible to introduce a proper order in all the sciences dealing with probability and, by conceiving the real content of statistical mechanics and quantum mechanics in particular, to construct both as two interconnected domains of probabilistic physics. Consistent theories of kinetics of physical transformations, decay processes, and intramolecular rearrangements are also outlined. The interrelation between the electromagnetic field, photons, and the theoretically discovered subatomic particle 'emon' is considered. Numerous internal imperfections of conventional probability theory, statistical physics, and quantum physics are exposed and removed - quantum physics no longer needs special interpretation. EPR, Bohm, and Bell paradoxes are easily resolved, among others. (Auth.)

  20. Optimizing Probability of Detection Point Estimate Demonstration

    Science.gov (United States)

    Koshti, Ajay M.

    2017-01-01

    Probability of detection (POD) analysis is used in assessing the reliably detectable flaw size in nondestructive evaluation (NDE). MIL-HDBK-1823 and the associated mh1823 POD software give the most common methods of POD analysis. Real flaws such as cracks and crack-like flaws must be reliably detected by these NDE methods, and a reliably detectable crack size is required for safe-life analysis of fracture critical parts. The paper provides discussion on optimizing probability of detection (POD) demonstration experiments using the point estimate method, which is used by NASA for qualifying special NDE procedures. The point estimate method uses the binomial distribution for the probability density. Normally, a set of 29 flaws of the same size within some tolerance is used in the demonstration. The optimization is performed to provide an acceptable value for the probability of passing the demonstration (PPD) and an acceptable value for the probability of false (POF) calls, while keeping the flaw sizes in the set as small as possible.
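    The binomial arithmetic behind such a demonstration can be sketched directly (a standard textbook calculation, not the paper's optimization itself; the POD values fed in below are hypothetical):

```python
def prob_pass_demo(true_pod, n_flaws=29, max_misses=0):
    """Probability of passing a binomial POD demonstration: detect at
    least n_flaws - max_misses of n_flaws same-size flaws, assuming
    independent detections each with probability true_pod."""
    from math import comb
    return sum(
        comb(n_flaws, k) * true_pod**k * (1 - true_pod)**(n_flaws - k)
        for k in range(n_flaws - max_misses, n_flaws + 1)
    )

# The classic 29-of-29 scheme: a procedure with POD exactly 0.90 passes with
# probability 0.90**29 ≈ 0.047 < 0.05, which is why 29 successes demonstrate
# 90% POD at 95% confidence.
ppd_marginal = prob_pass_demo(0.90)
ppd_good = prob_pass_demo(0.98)   # a genuinely better procedure passes far more often
```

    Optimizing the experiment then means trading off PPD for a good procedure against the chance that a marginal one slips through, as the abstract describes.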

  1. Alternative probability theories for cognitive psychology.

    Science.gov (United States)

    Narens, Louis

    2014-01-01

    Various proposals for generalizing event spaces for probability functions have been put forth in the mathematical, scientific, and philosophic literatures. In cognitive psychology such generalizations are used for explaining puzzling results in decision theory and for modeling the influence of context effects. This commentary discusses proposals for generalizing probability theory to event spaces that are not necessarily Boolean algebras. Two prominent examples are quantum probability theory, which is based on the set of closed subspaces of a Hilbert space, and topological probability theory, which is based on the set of open sets of a topology. Both have been applied to a variety of cognitive situations. This commentary focuses on how event space properties can influence probability concepts and impact cognitive modeling. Copyright © 2013 Cognitive Science Society, Inc.

  2. Evaluation of nuclear power plant component failure probability and core damage probability using simplified PSA model

    International Nuclear Information System (INIS)

    Shimada, Yoshio

    2000-01-01

    It is anticipated that changes in the frequency of surveillance tests, preventive maintenance, or parts replacement of safety-related components may change component failure probability and, in turn, core damage probability, and that the change differs depending on the initiating event frequency and the component type. This study assessed the change in core damage probability using a simplified PSA model, capable of calculating core damage probability in a short time, which was developed by the US NRC to process accident sequence precursors, when various components' failure probabilities are varied between 0 and 1 and when Japanese or American initiating event frequency data are used. The analysis showed the following. (1) The frequency of surveillance tests, preventive maintenance, or parts replacement of motor-driven pumps (high pressure injection pumps, residual heat removal pumps, auxiliary feedwater pumps) should be changed carefully, since the change in core damage probability is large when the base failure probability increases. (2) Core damage probability is insensitive to changes in surveillance test frequency for motor-operated valves and the turbine-driven auxiliary feedwater pump, since the change in core damage probability is small when their failure probabilities change by about one order of magnitude. (3) The change in core damage probability is small when Japanese failure probability data are applied to the emergency diesel generator, even if the failure probability changes by one order of magnitude from the base value. On the other hand, when American failure probability data are applied, the increase in core damage probability is large when the failure probability increases. Therefore, when Japanese failure probability data are applied, core damage probability is insensitive to changes in surveillance test frequency and the like. (author)
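    The kind of sensitivity study described above can be sketched with a minimal cut-set model (a toy example; the cut sets, frequencies, and probabilities below are invented and much simpler than any real PSA model):

```python
def core_damage_prob(ie_freq, cut_sets, p):
    """Rare-event approximation: sum over minimal cut sets of the initiating
    event frequency times the product of basic-event failure probabilities."""
    total = 0.0
    for cs in cut_sets:
        prod = ie_freq
        for comp in cs:
            prod *= p[comp]
        total += prod
    return total

# Hypothetical two-train model: core damage requires both pumps to fail,
# or the diesel generator alone to fail.
cut_sets = [["pump_a", "pump_b"], ["diesel"]]
p = {"pump_a": 1e-3, "pump_b": 1e-3, "diesel": 1e-4}

base = core_damage_prob(0.1, cut_sets, p)
# Sensitivity: sweep one component's failure probability upward and recompute.
stressed = core_damage_prob(0.1, cut_sets, dict(p, pump_a=0.1))
```

    Sweeping each basic-event probability between 0 and 1, as the study does, ranks which components' test and maintenance intervals deserve the most care.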

  3. On the probability of cure for heavy-ion radiotherapy

    International Nuclear Information System (INIS)

    Hanin, Leonid; Zaider, Marco

    2014-01-01

    The probability of a cure in radiation therapy (RT)—viewed as the probability of eventual extinction of all cancer cells—is unobservable, and the only way to compute it is through modeling the dynamics of the cancer cell population during and post-treatment. The conundrum at the heart of biophysical models aimed at such prospective calculations is the absence of information on the initial size of the subpopulation of clonogenic cancer cells (also called stem-like cancer cells), which largely determines the outcome of RT in both individual and population settings. Other relevant parameters (e.g. potential doubling time, cell loss factor and survival probability as a function of dose) are, at least in principle, amenable to empirical determination. In this article we demonstrate that, for heavy-ion RT, microdosimetric considerations (justifiably ignored in conventional RT) combined with an expression for the clone extinction probability obtained from a mechanistic model of radiation cell survival lead to useful upper bounds on the size of the pre-treatment population of clonogenic cancer cells as well as upper and lower bounds on the cure probability. The main practical impact of these limiting values is the ability to make predictions about the probability of a cure for a given population of patients treated to newer, still unexplored treatment modalities from the empirically determined probability of a cure for the same or similar population resulting from conventional low linear energy transfer (typically photon/electron) RT. We also propose that the current trend to deliver a lower total dose in a smaller number of fractions with larger-than-conventional doses per fraction has physical limits that must be understood before embarking on a particular treatment schedule. (paper)
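    The extinction-probability logic behind such bounds can be sketched with the standard Poisson tumour control model (a textbook simplification, not the paper's mechanistic model; the survival fraction and cure rate below are hypothetical):

```python
import math

def cure_probability(n_clonogens, survival_fraction):
    """Poisson TCP model: the number of surviving clonogens is Poisson with
    mean n_clonogens * survival_fraction, so cure = P(zero survivors)."""
    return math.exp(-n_clonogens * survival_fraction)

def max_clonogens(observed_cure_prob, survival_fraction):
    """Upper bound on the pre-treatment clonogen number consistent with an
    observed population cure rate, inverting the same Poisson model."""
    return -math.log(observed_cure_prob) / survival_fraction

# Hypothetical: a per-cell survival fraction of 1e-8 after a full course,
# and a 60% observed cure rate, bound the clonogen number at ~5e7.
n_max = max_clonogens(0.60, 1e-8)
```

    Bounding the unobservable clonogen number this way is what lets one extrapolate cure probabilities from a conventional modality to a new one, as the abstract argues.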

  4. Cytologic diagnosis: expression of probability by clinical pathologists.

    Science.gov (United States)

    Christopher, Mary M; Hotz, Christine S

    2004-01-01

    Clinical pathologists use descriptive terms or modifiers to express the probability or likelihood of a cytologic diagnosis. Words are imprecise in meaning, however, and may be used and interpreted differently by pathologists and clinicians. The goals of this study were to 1) assess the frequency of use of 18 modifiers, 2) determine the probability of a positive diagnosis implied by the modifiers, 3) identify preferred modifiers for different levels of probability, 4) ascertain the importance of factors that affect expression of diagnostic certainty, and 5) evaluate differences based on gender, employment, and experience. We surveyed 202 clinical pathologists who were board-certified by the American College of Veterinary Pathologists (Clinical Pathology). Surveys were distributed in October 2001 and returned by e-mail, fax, or surface mail over a 2-month period. Results were analyzed by parametric and nonparametric tests. Survey response rate was 47.5% (n = 96) and primarily included clinical pathologists at veterinary schools (n = 58) and diagnostic laboratories (n = 31). Eleven of 18 terms were used "often" or "sometimes" by ≥50% of respondents. Broad variability was found in the probability assigned to each term, especially those with median values of 75 to 90%. Preferred modifiers for 7 numerical probabilities ranging from 0 to 100% included 68 unique terms; however, a set of 10 terms was used by ≥50% of respondents. Cellularity and quality of the sample, experience of the pathologist, and implications of the diagnosis were the most important factors affecting the expression of probability. Because of wide discrepancy in the implied likelihood of a diagnosis using words, defined terminology and controlled vocabulary may be useful in improving communication and the quality of data in cytology reporting.

  5. Upgrading Probability via Fractions of Events

    Directory of Open Access Journals (Sweden)

    Frič Roman

    2016-08-01

    Full Text Available The influence of “Grundbegriffe” by A. N. Kolmogorov (published in 1933) on education in the area of probability and its impact on research in stochastics cannot be overestimated. We would like to point out three aspects of the classical probability theory “calling for” an upgrade: (i) classical random events are black-and-white (Boolean); (ii) classical random variables do not model quantum phenomena; (iii) basic maps (probability measures and observables, the dual maps to random variables) have very different “mathematical nature”. Accordingly, we propose an upgraded probability theory based on Łukasiewicz operations (multivalued logic) on events and elementary category theory, covering the classical probability theory as a special case. The upgrade can be compared to replacing calculations with integers by calculations with rational (and real) numbers. Namely, to avoid the three objections, we embed the classical (Boolean) random events (represented by the {0, 1}-valued indicator functions of sets) into upgraded random events (represented by measurable [0, 1]-valued functions), the minimal domain of probability containing “fractions” of classical random events, and we upgrade the notions of probability measure and random variable.
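    The "fractions of events" idea can be illustrated numerically (a small sketch under the assumption of a finite outcome space, not the paper's categorical construction): the probability of a [0, 1]-valued event is its expectation, which reduces to classical P(A) when the event is a 0/1 indicator.

```python
def prob_of_graded_event(f_values, point_probs):
    """Probability of a [0, 1]-valued event f on a finite outcome space:
    the expectation E[f]. For a 0/1 indicator this is the classical P(A)."""
    return sum(f * p for f, p in zip(f_values, point_probs))

# Fair six-sided die: a classical event {1, 2} versus a graded event that
# counts outcome 2 only "half" and outcome 3 a "quarter" (hypothetical grades).
uniform   = [1 / 6] * 6
indicator = [1, 1, 0, 0, 0, 0]
graded    = [1.0, 0.5, 0.25, 0.0, 0.0, 0.0]

p_classical = prob_of_graded_event(indicator, uniform)   # = 1/3
p_graded = prob_of_graded_event(graded, uniform)
```

    The indicator case shows the embedding of Boolean events as a special case; the graded case is a "fraction" of an event in the abstract's sense.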

  6. Failure probability analysis of optical grid

    Science.gov (United States)

    Zhong, Yaoquan; Guo, Wei; Sun, Weiqiang; Jin, Yaohui; Hu, Weisheng

    2008-11-01

    Optical grid, an integrated computing environment based on optical networks, is expected to be an efficient infrastructure to support advanced data-intensive grid applications. In an optical grid, faults of both computational and network resources are inevitable due to the large scale and high complexity of the system. As optical-network-based distributed computing systems are widely applied to data processing, the application failure probability has become an important indicator of application quality and an important aspect that operators consider. This paper presents a task-based method for analyzing the application failure probability in an optical grid. The failure probability of the entire application can then be quantified, and the performance of different backup strategies in reducing the application failure probability can be compared, so that the different requirements of different clients can be satisfied. When an application based on a DAG (directed acyclic graph) is executed under different backup strategies, the application failure probability and the application completion time differ. This paper proposes a new multi-objective differentiated services algorithm (MDSA). The new application scheduling algorithm can guarantee the required failure probability and improve network resource utilization, realizing a compromise between the network operator and the application submitter. Differentiated services can thus be achieved in an optical grid.
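    The task-based failure calculation can be sketched under simple independence assumptions (an illustration only; the paper's DAG model and MDSA algorithm are more elaborate, and all probabilities below are hypothetical):

```python
def task_failure(p_fail, n_backups=0):
    """A task fails only if the primary and all of its backups fail,
    assuming independent failures with the same probability."""
    return p_fail ** (n_backups + 1)

def application_failure(task_ps, backups):
    """The application fails if any of its tasks fails (independent tasks)."""
    p_ok = 1.0
    for p, b in zip(task_ps, backups):
        p_ok *= 1.0 - task_failure(p, b)
    return 1.0 - p_ok

# Three tasks; giving one backup to the weakest task cuts the overall
# application failure probability.
base = application_failure([0.01, 0.05, 0.02], [0, 0, 0])
better = application_failure([0.01, 0.05, 0.02], [0, 1, 0])
```

    Comparing backup placements this way is the core of differentiating service levels: a client with a stricter failure-probability requirement is assigned more backup resources.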

  7. Uncertainty about probability: a decision analysis perspective

    International Nuclear Information System (INIS)

    Howard, R.A.

    1988-01-01

    The issue of how to think about uncertainty about probability is framed and analyzed from the viewpoint of a decision analyst, using the failure of nuclear power plants as an example. The key idea is to think of probability as describing a state of information on an uncertain event, and to pose the issue of uncertainty in this quantity as uncertainty about a number that would be definitive: it has the property that you would assign it as the probability if you knew it. Logical consistency requires that the probability assigned to a single occurrence, in the absence of further information, be the mean of the distribution of this definitive number, not the median as is sometimes suggested. Any decision that must be made without the benefit of further information must also be made using the mean of the definitive number's distribution. With this formulation, we find further that the probability of r occurrences in n exchangeable trials will depend on the first n moments of the definitive number's distribution. In making decisions, the expected value of clairvoyance on the occurrence of the event must be at least as great as that on the definitive number. If one of the events in question occurs, then the increase in probability of another such event is readily computed. In terms of coin tossing, this means that unless one is absolutely sure of the fairness of a coin, seeing a head must increase the probability of heads, contrary to the usual view. A numerical example for nuclear power shows that the failure of one plant in a group with a low probability of failure can significantly increase the probability that must be assigned to failure of a second plant in the group
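
    Howard's coin-tossing point can be sketched numerically. If uncertainty about the definitive number (the long-run fraction of heads) is represented by a Beta distribution, an illustrative choice rather than one taken from the paper, the probability to assign is its mean, and observing a head raises that mean unless the distribution is already nearly degenerate:

```python
def posterior_mean_after_head(alpha, beta):
    """Mean of a Beta(alpha, beta) distribution over the 'definitive number'
    (the long-run fraction of heads), before and after observing one head."""
    prior_mean = alpha / (alpha + beta)                # probability to assign now
    post_mean = (alpha + 1.0) / (alpha + 1.0 + beta)   # after seeing a head
    return prior_mean, post_mean

# Uncertain about fairness: uniform Beta(1, 1); a head moves 0.5 up to 2/3.
prior, post = posterior_mean_after_head(1.0, 1.0)

# Nearly sure the coin is fair: Beta(10**6, 10**6); the increase is negligible.
prior2, post2 = posterior_mean_after_head(1e6, 1e6)
```

    The second call shows the limiting case the abstract mentions: only absolute certainty about fairness (a degenerate distribution) keeps the probability of heads unchanged.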

  8. Probability an introduction with statistical applications

    CERN Document Server

    Kinney, John J

    2014-01-01

    Praise for the First Edition: “This is a well-written and impressively presented introduction to probability and statistics. The text throughout is highly readable, and the author makes liberal use of graphs and diagrams to clarify the theory.” (The Statistician) Thoroughly updated, Probability: An Introduction with Statistical Applications, Second Edition features a comprehensive exploration of statistical data analysis as an application of probability. The new edition provides an introduction to statistics with accessible coverage of reliability, acceptance sampling, confidence intervals, h

  9. Dependency models and probability of joint events

    International Nuclear Information System (INIS)

    Oerjasaeter, O.

    1982-08-01

    Probabilistic dependencies between components/systems are discussed with reference to a broad classification of potential failure mechanisms. Further, a generalized time-dependency model, based on conditional probabilities for estimation of the probability of joint events and event sequences is described. The applicability of this model is clarified/demonstrated by various examples. It is concluded that the described model of dependency is a useful tool for solving a variety of practical problems concerning the probability of joint events and event sequences where common cause and time-dependent failure mechanisms are involved. (Auth.)

  10. Handbook of probability theory and applications

    CERN Document Server

    Rudas, Tamas

    2008-01-01

    “This is a valuable reference guide for readers interested in gaining a basic understanding of probability theory or its applications in problem solving in the other disciplines.” (CHOICE) Providing cutting-edge perspectives and real-world insights into the greater utility of probability and its applications, the Handbook of Probability offers an equal balance of theory and direct applications in a non-technical, yet comprehensive, format. Editor Tamás Rudas and the internationally-known contributors present the material in a manner so that researchers of vari

  11. Probabilities on Streams and Reflexive Games

    Directory of Open Access Journals (Sweden)

    Andrew Schumann

    2014-01-01

    Full Text Available Probability measures on streams (e.g. on hypernumbers and p-adic numbers) have been defined. It was shown that these probabilities can be used for simulations of reflexive games. In particular, it can be proved that Aumann's agreement theorem does not hold for these probabilities. Instead of this theorem, there is a statement that is called the reflexion disagreement theorem. Based on this theorem, probabilistic and knowledge conditions can be defined for reflexive games at various reflexion levels up to the infinite level. (original abstract)

  12. Concept of probability in statistical physics

    CERN Document Server

    Guttmann, Y M

    1999-01-01

    Foundational issues in statistical mechanics and the more general question of how probability is to be understood in the context of physical theories are both areas that have been neglected by philosophers of physics. This book fills an important gap in the literature by providing a most systematic study of how to interpret probabilistic assertions in the context of statistical mechanics. The book explores both subjectivist and objectivist accounts of probability, and takes full measure of work in the foundations of probability theory, in statistical mechanics, and in mathematical theory. It will be of particular interest to philosophers of science, physicists and mathematicians interested in foundational issues, and also to historians of science.

  13. Computation of the Complex Probability Function

    Energy Technology Data Exchange (ETDEWEB)

    Trainer, Amelia Jo [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Ledwith, Patrick John [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-22

    The complex probability function is important in many areas of physics and many techniques have been developed in an attempt to compute it for some z quickly and efficiently. Most prominent are the methods that use Gauss-Hermite quadrature, which uses the roots of the nth degree Hermite polynomial and corresponding weights to approximate the complex probability function. This document serves as an overview and discussion of the use, shortcomings, and potential improvements on the Gauss-Hermite quadrature for the complex probability function.

  14. Pre-aggregation for Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    Motivated by the increasing need to analyze complex uncertain multidimensional data (e.g., in order to optimize and personalize location-based services), this paper proposes novel types of probabilistic OLAP queries that operate on aggregate values that are probability distributions, and the techniques to process these queries. The paper also presents the methods for computing the probability distributions, which enable pre-aggregation, and for using the pre-aggregated distributions for further aggregation. In order to achieve good time and space efficiency, the methods perform approximate … multidimensional data analysis that is considered in this paper (i.e., approximate processing of probabilistic OLAP queries over probability distributions).

  15. Comparing linear probability model coefficients across groups

    DEFF Research Database (Denmark)

    Holm, Anders; Ejrnæs, Mette; Karlson, Kristian Bernt

    2015-01-01

    This article offers a formal identification analysis of the problem in comparing coefficients from linear probability models between groups. We show that differences in coefficients from these models can result not only from genuine differences in effects, but also from differences in one or more of the following three components: outcome truncation, scale parameters and distributional shape of the predictor variable. These results point to limitations in using linear probability model coefficients for group comparisons. We also provide Monte Carlo simulations and real examples to illustrate these limitations, and we suggest a restricted approach to using linear probability model coefficients in group comparisons.

  16. Modeling experiments using quantum and Kolmogorov probability

    International Nuclear Information System (INIS)

    Hess, Karl

    2008-01-01

    Criteria are presented that permit a straightforward partition of experiments into sets that can be modeled using both quantum probability and the classical probability framework of Kolmogorov. These new criteria concentrate on the operational aspects of the experiments and lead beyond the commonly appreciated partition by relating experiments to commuting and non-commuting quantum operators as well as non-entangled and entangled wavefunctions. In other words the space of experiments that can be understood using classical probability is larger than usually assumed. This knowledge provides advantages for areas such as nanoscience and engineering or quantum computation.

  17. The probability outcome correpondence principle : a dispositional view of the interpretation of probability statements

    NARCIS (Netherlands)

    Keren, G.; Teigen, K.H.

    2001-01-01

    This article presents a framework for lay people's internal representations of probabilities, which supposedly reflect the strength of underlying dispositions, or propensities, associated with the predicted event. From this framework, we derive the probability-outcome correspondence principle, which

  18. Modelling the probability of building fires

    Directory of Open Access Journals (Sweden)

    Vojtěch Barták

    2014-12-01

    Full Text Available Systematic spatial risk analysis plays a crucial role in preventing emergencies. In the Czech Republic, risk mapping is currently based on the risk accumulation principle, area vulnerability, and preparedness levels of Integrated Rescue System components. Expert estimates are used to determine risk levels for individual hazard types, while statistical modelling based on data from actual incidents and their possible causes is not used. Our model study, conducted in cooperation with the Fire Rescue Service of the Czech Republic as a model within the Liberec and Hradec Králové regions, presents an analytical procedure leading to the creation of building fire probability maps based on recent incidents in the studied areas and on building parameters. In order to estimate the probability of building fires, a prediction model based on logistic regression was used. The probability of fire calculated by means of model parameters and attributes of specific buildings can subsequently be visualized in probability maps.
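
    A minimal sketch of the kind of logistic-regression prediction the study describes; the feature set, coefficients, and intercept below are hypothetical placeholders, not the fitted model from the Liberec and Hradec Králové data:

```python
import math

def fire_probability(features, coefficients, intercept):
    """Logistic-regression estimate of building fire probability.
    All parameters here are hypothetical, for illustration only."""
    z = intercept + sum(b * x for b, x in zip(coefficients, features))
    return 1.0 / (1.0 + math.exp(-z))

# hypothetical building attributes: [floor area in 100 m^2, age in decades]
p = fire_probability([3.0, 5.0], coefficients=[0.25, 0.10], intercept=-4.0)
```

    Evaluating such a model over every building in a region, then mapping the resulting probabilities, yields the probability maps the abstract describes.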

  19. Encounter Probability of Individual Wave Height

    DEFF Research Database (Denmark)

    Liu, Z.; Burcharth, H. F.

    1998-01-01

    … wave height corresponding to a certain exceedence probability within a structure lifetime (encounter probability), based on the statistical analysis of long-term extreme significant wave height. Then the design individual wave height is calculated as the expected maximum individual wave height associated with the design significant wave height, with the assumption that the individual wave heights follow the Rayleigh distribution. However, the exceedence probability of such a design individual wave height within the structure lifetime is unknown. The paper presents a method for the determination of the design individual wave height corresponding to an exceedence probability within the structure lifetime, given the long-term extreme significant wave height. The method can also be applied for estimation of the number of relatively large waves for fatigue analysis of constructions.
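
    The Rayleigh-based quantities mentioned in the abstract can be sketched as follows; the significant wave height and wave count are illustrative numbers, not values from the paper:

```python
import math

def rayleigh_exceedance(h, hs):
    """P(individual wave height > h) for significant wave height hs,
    under the Rayleigh assumption."""
    return math.exp(-2.0 * (h / hs) ** 2)

def encounter_probability(h, hs, n_waves):
    """P(h is exceeded at least once among n_waves independent waves)."""
    return 1.0 - (1.0 - rayleigh_exceedance(h, hs)) ** n_waves

def expected_max_height(hs, n_waves):
    """Classical first-order approximation of the expected maximum height."""
    return hs * math.sqrt(math.log(n_waves) / 2.0)

# e.g. hs = 2 m and 1000 waves in a storm (illustrative)
h_max = expected_max_height(2.0, 1000)
p_enc = encounter_probability(h_max, 2.0, 1000)
```

    The result illustrates the paper's point: the "expected maximum" wave is exceeded at least once with probability near 1 - 1/e (about 0.63), not some small design value, which is why the exceedence probability of the design individual wave height needs to be computed explicitly.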

  20. Predicting binary choices from probability phrase meanings.

    Science.gov (United States)

    Wallsten, Thomas S; Jang, Yoonhee

    2008-08-01

    The issues of how individuals decide which of two events is more likely and of how they understand probability phrases both involve judging relative likelihoods. In this study, we investigated whether derived scales representing probability phrase meanings could be used within a choice model to predict independently observed binary choices. If they can, this simultaneously provides support for our model and suggests that the phrase meanings are measured meaningfully. The model assumes that, when deciding which of two events is more likely, judges take a single sample from memory regarding each event and respond accordingly. The model predicts choice probabilities by using the scaled meanings of individually selected probability phrases as proxies for confidence distributions associated with sampling from memory. Predictions are sustained for 34 of 41 participants but, nevertheless, are biased slightly low. Sequential sampling models improve the fit. The results have both theoretical and applied implications.
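
    The single-sample choice model can be sketched with a small Monte Carlo; the confidence values standing in for scaled phrase meanings are invented for illustration, not the study's elicited scales:

```python
import random

def choice_probability(conf_a, conf_b, trials=100_000, seed=7):
    """P(event A is chosen over B) under a single-sample model: one
    confidence value is sampled per event, the larger one wins,
    and ties are split evenly."""
    rng = random.Random(seed)
    wins = ties = 0
    for _ in range(trials):
        a, b = rng.choice(conf_a), rng.choice(conf_b)
        if a > b:
            wins += 1
        elif a == b:
            ties += 1
    return (wins + 0.5 * ties) / trials

# illustrative confidence distributions for two events
p = choice_probability([0.6, 0.7, 0.8], [0.4, 0.5, 0.6])
```

    With these toy distributions the exact choice probability is 8.5/9, and the simulation converges to it; in the paper's application the distributions are instead derived from the scaled meanings of individually selected probability phrases.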

  1. Certainties and probabilities of the IPCC

    International Nuclear Information System (INIS)

    2004-01-01

    Based on an analysis of information about the climate evolution, simulations of a global warming and the snow coverage monitoring of Meteo-France, the IPCC presented its certainties and probabilities concerning the greenhouse effect. (A.L.B.)

  2. Bayesian optimization for computationally extensive probability distributions.

    Science.gov (United States)

    Tamura, Ryo; Hukushima, Koji

    2018-01-01

    An efficient method for finding a better maximizer of computationally extensive probability distributions is proposed on the basis of a Bayesian optimization technique. A key idea of the proposed method is to use extreme values of acquisition functions by Gaussian processes for the next training phase, which should be located near a local maximum or a global maximum of the probability distribution. Our Bayesian optimization technique is applied to the posterior distribution in effective physical model estimation, which is a computationally extensive probability distribution. Even when the number of sampling points on the posterior distribution is fixed to be small, Bayesian optimization provides a better maximizer of the posterior distribution than the random search method, the steepest descent method, or the Monte Carlo method. Furthermore, Bayesian optimization improves the results efficiently when combined with the steepest descent method, and it is thus a powerful tool for searching for a better maximizer of computationally extensive probability distributions.

  3. Probability of Survival Decision Aid (PSDA)

    National Research Council Canada - National Science Library

    Xu, Xiaojiang; Amin, Mitesh; Santee, William R

    2008-01-01

    A Probability of Survival Decision Aid (PSDA) is developed to predict survival time for hypothermia and dehydration during prolonged exposure at sea in both air and water for a wide range of environmental conditions...

  4. Probability and statistics with integrated software routines

    CERN Document Server

    Deep, Ronald

    2005-01-01

    Probability & Statistics with Integrated Software Routines is a calculus-based treatment of probability concurrent with and integrated with statistics through interactive, tailored software applications designed to enhance the phenomena of probability and statistics. The software programs make the book unique.The book comes with a CD containing the interactive software leading to the Statistical Genie. The student can issue commands repeatedly while making parameter changes to observe the effects. Computer programming is an excellent skill for problem solvers, involving design, prototyping, data gathering, testing, redesign, validating, etc, all wrapped up in the scientific method.See also: CD to accompany Probability and Stats with Integrated Software Routines (0123694698)* Incorporates more than 1,000 engaging problems with answers* Includes more than 300 solved examples* Uses varied problem solving methods

  5. Determining probabilities of geologic events and processes

    International Nuclear Information System (INIS)

    Hunter, R.L.; Mann, C.J.; Cranwell, R.M.

    1985-01-01

    The Environmental Protection Agency has recently published a probabilistic standard for releases of high-level radioactive waste from a mined geologic repository. The standard sets limits for contaminant releases with more than one chance in 100 of occurring within 10,000 years, and less strict limits for releases of lower probability. The standard offers no methods for determining probabilities of geologic events and processes, and no consensus exists in the waste-management community on how to do this. Sandia National Laboratories is developing a general method for determining probabilities of a given set of geologic events and processes. In addition, we will develop a repeatable method for dealing with events and processes whose probability cannot be determined. 22 refs., 4 figs

  6. Pre-Aggregation with Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    2006-01-01

    Motivated by the increasing need to analyze complex, uncertain multidimensional data, this paper proposes probabilistic OLAP queries that are computed using probability distributions rather than atomic values. The paper describes how to create probability distributions from base data, and how the distributions can be subsequently used in pre-aggregation. Since the probability distributions can become large, we show how to achieve good time and space efficiency by approximating the distributions. We present the results of several experiments that demonstrate the effectiveness of our methods. The work is motivated with a real-world case study, based on our collaboration with a leading Danish vendor of location-based services. This paper is the first to consider the approximate processing of probabilistic OLAP queries over probability distributions.
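
    Aggregating two distribution-valued cells (here by SUM, via convolution of independent discrete distributions) can be sketched as follows; the paper's approximation machinery for keeping large distributions tractable is not shown:

```python
from collections import defaultdict

def convolve(dist_a, dist_b):
    """SUM-aggregate two independent discrete probability distributions,
    each given as a {value: probability} dict."""
    out = defaultdict(float)
    for va, pa in dist_a.items():
        for vb, pb in dist_b.items():
            out[va + vb] += pa * pb
    return dict(out)

# two uncertain cell values, pre-aggregated into one distribution
cell1 = {10: 0.5, 20: 0.5}
cell2 = {1: 0.2, 2: 0.8}
total = convolve(cell1, cell2)
```

    The result (here {11: 0.1, 12: 0.4, 21: 0.1, 22: 0.4}) can itself be stored as a pre-aggregate and convolved with further cells, which is the sense in which pre-aggregation carries over to distribution-valued data.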

  7. Probability of spent fuel transportation accidents

    International Nuclear Information System (INIS)

    McClure, J.D.

    1981-07-01

    The transported volume of spent fuel, incident/accident experience and accident environment probabilities were reviewed in order to provide an estimate of spent fuel accident probabilities. In particular, the accident review assessed the accident experience for large casks of the type that could transport spent (irradiated) nuclear fuel. This review determined that since 1971, the beginning of official US Department of Transportation record keeping for accidents/incidents, there has been one spent fuel transportation accident. This information, coupled with estimated annual shipping volumes for spent fuel, indicated an estimated annual probability of a spent fuel transport accident of 5 x 10 -7 spent fuel accidents per mile. This is consistent with ordinary truck accident rates. A comparison of accident environments and regulatory test environments suggests that the probability of truck accidents exceeding regulatory test for impact is approximately 10 -9 /mile
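
    The arithmetic behind such an estimate is a simple rate-times-exposure calculation; the annual mileage below is a hypothetical figure, and the Poisson step is an added modeling assumption, not part of the report:

```python
import math

# Estimated accident rate from the review (per shipment-mile).
rate_per_mile = 5e-7

# Hypothetical annual shipping distance (not a figure from the report).
annual_miles = 2.0e5

expected_accidents = rate_per_mile * annual_miles      # 0.1 per year
# Assuming accidents arrive as a Poisson process:
p_at_least_one = 1.0 - math.exp(-expected_accidents)
```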

  8. Sampling, Probability Models and Statistical Reasoning: Statistical Inference

    Indian Academy of Sciences (India)

    Home; Journals; Resonance – Journal of Science Education; Volume 1; Issue 5. Sampling, Probability Models and Statistical Reasoning Statistical Inference. Mohan Delampady V R Padmawar. General Article Volume 1 Issue 5 May 1996 pp 49-58 ...

  9. Imprecise Probability Methods for Weapons UQ

    Energy Technology Data Exchange (ETDEWEB)

    Picard, Richard Roy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Vander Wiel, Scott Alan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-13

    Building on recent work in uncertainty quantification, we examine the use of imprecise probability methods to better characterize expert knowledge and to improve on misleading aspects of Bayesian analysis with informative prior distributions. Quantitative approaches to incorporate uncertainties in weapons certification are subject to rigorous external peer review, and in this regard, certain imprecise probability methods are well established in the literature and attractive. These methods are illustrated using experimental data from LANL detonator impact testing.

  10. Escape and transmission probabilities in cylindrical geometry

    International Nuclear Information System (INIS)

    Bjerke, M.A.

    1980-01-01

    An improved technique for the generation of escape and transmission probabilities in cylindrical geometry was applied to the existing resonance cross section processing code ROLAIDS. The algorithm of Hwang and Toppel, [ANL-FRA-TM-118] (with modifications) was employed. The probabilities generated were found to be as accurate as those given by the method previously applied in ROLAIDS, while requiring much less computer core storage and CPU time

  11. Probability and statistics for computer science

    CERN Document Server

    Johnson, James L

    2011-01-01

    Comprehensive and thorough development of both probability and statistics for serious computer scientists; goal-oriented: “to present the mathematical analysis underlying probability results”. Special emphasis on simulation and discrete decision theory; mathematically rich, but self-contained text, at a gentle pace; review of calculus and linear algebra in an appendix; mathematical interludes (in each chapter) which examine mathematical techniques in the context of probabilistic or statistical importance; numerous section exercises, summaries, historical notes, and Further Readings for reinforcem

  12. Collision Probabilities for Finite Cylinders and Cuboids

    Energy Technology Data Exchange (ETDEWEB)

    Carlvik, I

    1967-05-15

    Analytical formulae have been derived for the collision probabilities of homogeneous finite cylinders and cuboids. The formula for the finite cylinder contains double integrals, and the formula for the cuboid only single integrals. Collision probabilities have been calculated by means of the formulae and compared with values obtained by other authors. It was found that the calculations using the analytical formulae are much quicker and give higher accuracy than Monte Carlo calculations.

  13. High throughput nonparametric probability density estimation.

    Science.gov (United States)

    Farmer, Jenny; Jacobs, Donald

    2018-01-01

    In high throughput applications, such as those found in bioinformatics and finance, it is important to determine accurate probability distribution functions despite only minimal information about data characteristics, and without using human subjectivity. Such an automated process for univariate data is implemented to achieve this goal by merging the maximum entropy method with single order statistics and maximum likelihood. The only required properties of the random variables are that they are continuous and that they are, or can be approximated as, independent and identically distributed. A quasi-log-likelihood function based on single order statistics for sampled uniform random data is used to empirically construct a sample size invariant universal scoring function. Then a probability density estimate is determined by iteratively improving trial cumulative distribution functions, where better estimates are quantified by the scoring function that identifies atypical fluctuations. This criterion resists under and over fitting data as an alternative to employing the Bayesian or Akaike information criterion. Multiple estimates for the probability density reflect uncertainties due to statistical fluctuations in random samples. Scaled quantile residual plots are also introduced as an effective diagnostic to visualize the quality of the estimated probability densities. Benchmark tests show that estimates for the probability density function (PDF) converge to the true PDF as sample size increases on particularly difficult test probability densities that include cases with discontinuities, multi-resolution scales, heavy tails, and singularities. These results indicate the method has general applicability for high throughput statistical inference.

  14. Research advances in probability of causation calculation of radiogenic neoplasms

    International Nuclear Information System (INIS)

    Ning Jing; Yuan Yong; Xie Xiangdong; Yang Guoshan

    2009-01-01

    Probability of causation (PC) was used to facilitate the adjudication of compensation claims for cancers diagnosed following exposure to ionizing radiation. In this article, the excess cancer risk assessment models used for PC calculation are reviewed. Cancer risk transfer models between different populations, dependence of cancer risk on dose and dose rate, modification by epidemiological risk factors and application of PC are also discussed in brief. (authors)

  15. Pemodelan Markov Switching Dengan Time-varying Transition Probability

    OpenAIRE

    Savitri, Anggita Puri; Warsito, Budi; Rahmawati, Rita

    2016-01-01

    The exchange rate is an economic variable that reflects a country's state of economy. It fluctuates over time because of its ability to switch between conditions or regimes, driven by economic and political factors. The changes in the exchange rate are depreciation and appreciation. Therefore, it can be modeled using Markov Switching with Time-Varying Transition Probability, which observes the conditional changes and uses an information variable. From this model, time-varying transition probabili...

  16. Multimode Interference: Identifying Channels and Ridges in Quantum Probability Distributions

    OpenAIRE

    O'Connell, Ross C.; Loinaz, Will

    2004-01-01

    The multimode interference technique is a simple way to study the interference patterns found in many quantum probability distributions. We demonstrate that this analysis not only explains the existence of so-called "quantum carpets," but can explain the spatial distribution of channels and ridges in the carpets. With an understanding of the factors that govern these channels and ridges we have a limited ability to produce a particular pattern of channels and ridges by carefully choosing the ...

  17. What is the probability that radiation caused a particular cancer

    International Nuclear Information System (INIS)

    Voelz, G.L.

    1983-01-01

    Courts, lawyers, health physicists, physicians, and others are searching for a credible answer to the question posed in the title of this paper. The cases in which the question arises frequently stem from an individual who has cancer and who is convinced, or whose next-of-kin are convinced, that a past radiation exposure, usually small, is responsible for causing it. An arithmetic expression of this problem is simple: the probability of causation by the radiation dose in question is equal to the risk of cancer from the radiation dose divided by the risk of cancer from all causes. The application of risk factors to this equation is not so simple. It must involve careful evaluation of the reliability of, and variations in, the risk coefficients for development of cancer due to radiation exposure, other carcinogenic agents, and natural causes for the particular individual. Examination of our knowledge of these various factors indicates that a large range in the answers can result from the variability and imprecision of the data. Nevertheless, the attempt to calculate the probability that radiation caused the cancer is extremely useful for providing a gross perspective on the probability of causation. It will likely rule in or out a significant number of cases despite the limitations in our understanding of the etiology of cancer and of the risks from various factors. For the remaining cases, a thoughtful and educated judgment based on selected data and the circumstances of the case will also be needed before the expert can develop and support an opinion
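
    The arithmetic expression quoted in the abstract can be written directly; the risk values below are illustrative only, since the paper's point is precisely that reliable inputs are hard to obtain:

```python
def probability_of_causation(radiation_risk, total_risk):
    """PC = risk of cancer from the radiation dose, divided by the
    risk of cancer from all causes (which includes the radiation risk)."""
    if not (0.0 <= radiation_risk <= total_risk) or total_risk <= 0.0:
        raise ValueError("need 0 <= radiation_risk <= total_risk, total_risk > 0")
    return radiation_risk / total_risk

# e.g. a lifetime radiation-attributable risk of 1% against a 5% total risk
pc = probability_of_causation(radiation_risk=0.01, total_risk=0.05)
```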

  18. Failure probability estimate of type 304 stainless steel piping

    International Nuclear Information System (INIS)

    Daugherty, W.L.; Awadalla, N.G.; Sindelar, R.L.; Mehta, H.S.; Ranganath, S.

    1989-01-01

    The primary source of in-service degradation of the SRS production reactor process water piping is intergranular stress corrosion cracking (IGSCC). IGSCC has occurred in a limited number of weld heat affected zones, areas known to be susceptible to IGSCC. A model has been developed to combine crack growth rates, crack size distributions, in-service examination reliability estimates and other considerations to estimate the pipe large-break frequency. This frequency estimates the probability that an IGSCC crack will initiate, escape detection by ultrasonic (UT) examination, and grow to instability prior to extending through-wall and being detected by the sensitive leak detection system. These events are combined as the product of four factors: (1) the probability that a given weld heat affected zone contains IGSCC; (2) the conditional probability, given the presence of IGSCC, that the cracking will escape detection during UT examination; (3) the conditional probability, given a crack escapes detection by UT, that it will not grow through-wall and be detected by leakage; (4) the conditional probability, given a crack is not detected by leakage, that it grows to instability prior to the next UT exam. These four factors estimate the occurrence of several conditions that must coexist in order for a crack to lead to a large break of the process water piping. When evaluated for the SRS production reactors, they produce an extremely low break frequency. The objective of this paper is to present the assumptions, methodology, results and conclusions of a probabilistic evaluation for the direct failure of the primary coolant piping resulting from normal operation and seismic loads. This evaluation was performed to support the ongoing PRA effort and to complement deterministic analyses addressing the credibility of a double-ended guillotine break
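
    The four-factor combination described above can be sketched as a simple product of conditional probabilities; the inputs below are hypothetical placeholders, not the SRS estimates:

```python
def large_break_frequency(p_igscc, p_miss_ut, p_no_leak_detect, p_unstable):
    """Product of the four conditional factors: (1) IGSCC present in a weld
    heat affected zone, (2) missed by UT examination, (3) not detected by
    leakage, (4) grows to instability before the next UT exam."""
    return p_igscc * p_miss_ut * p_no_leak_detect * p_unstable

# hypothetical inputs: even individually modest probabilities multiply
# down to a very small break frequency
freq = large_break_frequency(1e-2, 1e-1, 1e-2, 1e-3)
```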

  19. Causal inference, probability theory, and graphical insights.

    Science.gov (United States)

    Baker, Stuart G

    2013-11-10

    Causal inference from observational studies is a fundamental topic in biostatistics. The causal graph literature typically views probability theory as insufficient to express causal concepts in observational studies. In contrast, the view here is that probability theory is a desirable and sufficient basis for many topics in causal inference for the following two reasons. First, probability theory is generally more flexible than causal graphs: Besides explaining such causal graph topics as M-bias (adjusting for a collider) and bias amplification and attenuation (when adjusting for instrumental variable), probability theory is also the foundation of the paired availability design for historical controls, which does not fit into a causal graph framework. Second, probability theory is the basis for insightful graphical displays including the BK-Plot for understanding Simpson's paradox with a binary confounder, the BK2-Plot for understanding bias amplification and attenuation in the presence of an unobserved binary confounder, and the PAD-Plot for understanding the principal stratification component of the paired availability design. Published 2013. This article is a US Government work and is in the public domain in the USA.
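
    Simpson's paradox with a binary confounder, which the BK-Plot is designed to visualize, can be reproduced with a classic set of counts (illustrative numbers in the style of the kidney-stone example, not data from the article):

```python
def rate(successes, total):
    return successes / total

# (successes, n) per confounder stratum
treated = [(81, 87), (192, 263)]
control = [(234, 270), (55, 80)]

# treated wins within every stratum...
within = all(rate(*t) > rate(*c) for t, c in zip(treated, control))

# ...yet loses in the aggregate
t_agg = rate(sum(s for s, n in treated), sum(n for s, n in treated))
c_agg = rate(sum(s for s, n in control), sum(n for s, n in control))
```

    Both comparisons are ordinary conditional probabilities, which is the article's broader point: probability theory alone suffices to state, and to display graphically, what drives the reversal.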

  20. On the Possibility of Assigning Probabilities to Singular Cases, or: Probability Is Subjective Too!

    Directory of Open Access Journals (Sweden)

    Mark R. Crovelli

    2009-06-01

    Full Text Available Both Ludwig von Mises and Richard von Mises claimed that numerical probability could not be legitimately applied to singular cases. This paper challenges this aspect of the von Mises brothers’ theory of probability. It is argued that their denial that numerical probability could be applied to singular cases was based solely upon Richard von Mises’ exceptionally restrictive definition of probability. This paper challenges Richard von Mises’ definition of probability by arguing that the definition of probability necessarily depends upon whether the world is governed by time-invariant causal laws. It is argued that if the world is governed by time-invariant causal laws, a subjective definition of probability must be adopted. It is further argued that both the nature of human action and the relative frequency method for calculating numerical probabilities both presuppose that the world is indeed governed by time-invariant causal laws. It is finally argued that the subjective definition of probability undercuts the von Mises claim that numerical probability cannot legitimately be applied to singular, non-replicable cases.

  1. Uncertainty relation and probability. Numerical illustration

    International Nuclear Information System (INIS)

    Fujikawa, Kazuo; Umetsu, Koichiro

    2011-01-01

    The uncertainty relation and the probability interpretation of quantum mechanics are intrinsically connected, as is evidenced by the evaluation of standard deviations. It is thus natural to ask if one can associate a very small uncertainty product of suitably sampled events with a very small probability. We have shown elsewhere that some examples of the evasion of the uncertainty relation noted in the past are in fact understood in this way. We here numerically illustrate that a very small uncertainty product is realized if one performs a suitable sampling of measured data that occur with a very small probability. We introduce a notion of cyclic measurements. It is also shown that our analysis is consistent with the Landau-Pollak-type uncertainty relation. It is suggested that the present analysis may help reconcile the contradicting views about the 'standard quantum limit' in the detection of gravitational waves. (author)

  2. Scoring Rules for Subjective Probability Distributions

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd

    The theoretical literature has a rich characterization of scoring rules for eliciting the subjective beliefs that an individual has for continuous events, but under the restrictive assumption of risk neutrality. It is well known that risk aversion can dramatically affect the incentives to correctly report the true subjective probability of a binary event, even under Subjective Expected Utility. To address this one can “calibrate” inferences about true subjective probabilities from elicited subjective probabilities over binary events, recognizing the incentives that risk averse agents have to distort reports. We characterize the comparable implications of the general case of a risk averse agent when facing a popular scoring rule over continuous events, and find that these concerns do not apply with anything like the same force. For empirically plausible levels of risk aversion, one can...
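
    The propriety of a scoring rule under risk neutrality can be checked directly. A hypothetical sketch using the quadratic (Brier) rule for a binary event, not the continuous-event rules analysed in the paper: the expected score is maximised by reporting the true belief.

```python
# Quadratic score for outcome y in {0, 1}: 1 - (y - report)^2.
# For a risk-neutral agent this rule is "proper": the report that
# maximises expected score equals the true belief p.

def expected_score(report, belief):
    return belief * (1 - (1 - report) ** 2) + (1 - belief) * (1 - report ** 2)

belief = 0.3
grid = [i / 1000 for i in range(1001)]
best = max(grid, key=lambda r: expected_score(r, belief))
print(best)  # 0.3: truthful reporting is optimal under risk neutrality
```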

  3. Comparing coefficients of nested nonlinear probability models

    DEFF Research Database (Denmark)

    Kohler, Ulrich; Karlson, Kristian Bernt; Holm, Anders

    2011-01-01

    In a series of recent articles, Karlson, Holm and Breen have developed a method for comparing the estimated coefficients of two nested nonlinear probability models. This article describes this method and the user-written program khb that implements the method. The KHB-method is a general decomposition method that is unaffected by the rescaling or attenuation bias that arise in cross-model comparisons in nonlinear models. It recovers the degree to which a control variable, Z, mediates or explains the relationship between X and a latent outcome variable, Y*, underlying the nonlinear probability...

  4. A basic course in probability theory

    CERN Document Server

    Bhattacharya, Rabi

    2016-01-01

    This text develops the necessary background in probability theory underlying diverse treatments of stochastic processes and their wide-ranging applications. In this second edition, the text has been reorganized for didactic purposes, new exercises have been added and basic theory has been expanded. General Markov dependent sequences and their convergence to equilibrium is the subject of an entirely new chapter. The introduction of conditional expectation and conditional probability very early in the text maintains the pedagogic innovation of the first edition; conditional expectation is illustrated in detail in the context of an expanded treatment of martingales, the Markov property, and the strong Markov property. Weak convergence of probabilities on metric spaces and Brownian motion are two topics to highlight. A selection of large deviation and/or concentration inequalities ranging from those of Chebyshev, Cramer–Chernoff, Bahadur–Rao, to Hoeffding have been added, with illustrative comparisons of thei...

  5. Independent events in elementary probability theory

    Science.gov (United States)

    Csenki, Attila

    2011-07-01

    In Probability and Statistics taught to mathematicians as a first introduction or to a non-mathematical audience, joint independence of events is introduced by requiring that the multiplication rule is satisfied. The following statement is usually tacitly assumed to hold (and, at best, intuitively motivated): if the n events E1, E2, …, En are jointly independent then any two events A and B built in finitely many steps from two disjoint subsets of E1, E2, …, En are also independent. The operations 'union', 'intersection' and 'complementation' are permitted only when forming the events A and B. Here we examine this statement from the point of view of elementary probability theory. The approach described here is accessible also to users of probability theory and is believed to be novel.
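
    The quoted statement can be checked by brute force for small n. A sketch for n = 4 jointly independent events with arbitrarily chosen probabilities (the particular events and probabilities are illustrative, not from the paper):

```python
from itertools import product

# Brute-force check for n = 4 jointly independent events: A is built
# from {E1, E2}, B from {E3, E4}, and independence of A and B is
# verified by enumerating the product probability space {0,1}^4.

p = [0.2, 0.5, 0.7, 0.9]          # P(E_i), chosen arbitrarily

def prob(event):
    # event: predicate on an outcome w in {0,1}^4
    total = 0.0
    for w in product((0, 1), repeat=4):
        weight = 1.0
        for wi, pi in zip(w, p):
            weight *= pi if wi else 1 - pi
        if event(w):
            total += weight
    return total

A = lambda w: w[0] and not w[1]   # E1 ∩ complement of E2
B = lambda w: w[2] or w[3]        # E3 ∪ E4

lhs = prob(lambda w: A(w) and B(w))
rhs = prob(A) * prob(B)
print(abs(lhs - rhs) < 1e-12)  # True: A and B are independent
```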

  6. Pointwise probability reinforcements for robust statistical inference.

    Science.gov (United States)

    Frénay, Benoît; Verleysen, Michel

    2014-02-01

    Statistical inference using machine learning techniques may be difficult with small datasets because of abnormally frequent data (AFDs). AFDs are observations that are much more frequent in the training sample than they should be, with respect to their theoretical probability, and include e.g. outliers. Estimates of parameters tend to be biased towards models which support such data. This paper proposes to introduce pointwise probability reinforcements (PPRs): the probability of each observation is reinforced by a PPR, and a regularisation allows controlling the amount of reinforcement which compensates for AFDs. The proposed solution is very generic, since it can be used to robustify any statistical inference method which can be formulated as a likelihood maximisation. Experiments show that PPRs can be easily used to tackle regression, classification and projection: models are freed from the influence of outliers. Moreover, outliers can be filtered manually since an abnormality degree is obtained for each observation. Copyright © 2013 Elsevier Ltd. All rights reserved.

  7. Uncertainty the soul of modeling, probability & statistics

    CERN Document Server

    Briggs, William

    2016-01-01

    This book presents a philosophical approach to probability and probabilistic thinking, considering the underpinnings of probabilistic reasoning and modeling, which effectively underlie everything in data science. The ultimate goal is to call into question many standard tenets and lay the philosophical and probabilistic groundwork and infrastructure for statistical modeling. It is the first book devoted to the philosophy of data aimed at working scientists and calls for a new consideration in the practice of probability and statistics to eliminate what has been referred to as the "Cult of Statistical Significance". The book explains the philosophy of these ideas and not the mathematics, though there are a handful of mathematical examples. The topics are logically laid out, starting with basic philosophy as related to probability, statistics, and science, and stepping through the key probabilistic ideas and concepts, and ending with statistical models. Its jargon-free approach asserts that standard methods, suc...

  8. Introduction to probability with statistical applications

    CERN Document Server

    Schay, Géza

    2016-01-01

    Now in its second edition, this textbook serves as an introduction to probability and statistics for non-mathematics majors who do not need the exhaustive detail and mathematical depth provided in more comprehensive treatments of the subject. The presentation covers the mathematical laws of random phenomena, including discrete and continuous random variables, expectation and variance, and common probability distributions such as the binomial, Poisson, and normal distributions. More classical examples such as Montmort's problem, the ballot problem, and Bertrand’s paradox are now included, along with applications such as the Maxwell-Boltzmann and Bose-Einstein distributions in physics. Key features in new edition: * 35 new exercises * Expanded section on the algebra of sets * Expanded chapters on probabilities to include more classical examples * New section on regression * Online instructors' manual containing solutions to all exercises

  9. Python for probability, statistics, and machine learning

    CERN Document Server

    Unpingco, José

    2016-01-01

    This book covers the key ideas that link probability, statistics, and machine learning illustrated using Python modules in these areas. The entire text, including all the figures and numerical results, is reproducible using the Python codes and their associated Jupyter/IPython notebooks, which are provided as supplementary downloads. The author develops key intuitions in machine learning by working meaningful examples using multiple analytical methods and Python codes, thereby connecting theoretical concepts to concrete implementations. Modern Python modules like Pandas, Sympy, and Scikit-learn are applied to simulate and visualize important machine learning concepts like the bias/variance trade-off, cross-validation, and regularization. Many abstract mathematical ideas, such as convergence in probability theory, are developed and illustrated with numerical examples. This book is suitable for anyone with an undergraduate-level exposure to probability, statistics, or machine learning and with rudimentary knowl...

  10. EARLY HISTORY OF GEOMETRIC PROBABILITY AND STEREOLOGY

    Directory of Open Access Journals (Sweden)

    Magdalena Hykšová

    2012-03-01

    Full Text Available The paper provides an account of the history of geometric probability and stereology from the time of Newton to the early 20th century. It depicts the development of two parallel ways: on one hand, the theory of geometric probability was formed with minor attention paid to other applications than those concerning spatial chance games. On the other hand, practical rules of the estimation of area or volume fraction and other characteristics, easily deducible from geometric probability theory, were proposed without the knowledge of this branch. A special attention is paid to the paper of J.-É. Barbier published in 1860, which contained the fundamental stereological formulas, but remained almost unnoticed both by mathematicians and practicians.
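
    Buffon's needle is the prototypical geometric probability problem of this period, and Barbier's 1860 paper generalised it. A Monte Carlo sketch of the classical crossing probability 2l/(πd) for a needle of length l dropped on lines spaced d apart (l ≤ d):

```python
import math
import random

# Monte Carlo version of Buffon's needle: a needle of length l dropped
# on parallel lines d apart crosses a line with probability 2l/(pi*d).

def buffon(trials, l=1.0, d=2.0, seed=1):
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        x = rng.uniform(0, d / 2)           # centre's distance to nearest line
        theta = rng.uniform(0, math.pi / 2) # needle angle to the lines
        if x <= (l / 2) * math.sin(theta):
            hits += 1
    return hits / trials

est = buffon(200_000)
exact = 2 * 1.0 / (math.pi * 2.0)           # 1/pi, about 0.3183
print(abs(est - exact) < 0.01)
```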

  11. Probability analysis of nuclear power plant hazards

    International Nuclear Information System (INIS)

    Kovacs, Z.

    1985-01-01

    The probability analysis of risk, used for quantifying the risk of complex technological systems, especially nuclear power plants, is described. Risk is defined as the product of the probability of the occurrence of a dangerous event and the significance of its consequences. The process of the analysis may be divided into the stage of power plant analysis up to the point of release of harmful material into the environment (reliability analysis) and the stage of the analysis of the consequences of this release and the assessment of the risk. The sequence of operations is characterized for the individual stages. The tasks which Czechoslovakia faces in the development of the probability analysis of risk are listed, and the composition of the work team for coping with the task is recommended. (J.C.)

  12. Correlations and Non-Linear Probability Models

    DEFF Research Database (Denmark)

    Breen, Richard; Holm, Anders; Karlson, Kristian Bernt

    2014-01-01

    Although the parameters of logit and probit and other non-linear probability models are often explained and interpreted in relation to the regression coefficients of an underlying linear latent variable model, we argue that they may also be usefully interpreted in terms of the correlations between the dependent variable of the latent variable model and its predictor variables. We show how this correlation can be derived from the parameters of non-linear probability models, develop tests for the statistical significance of the derived correlation, and illustrate its usefulness in two applications. Under certain circumstances, which we explain, the derived correlation provides a way of overcoming the problems inherent in cross-sample comparisons of the parameters of non-linear probability models.

  13. Geometric modeling in probability and statistics

    CERN Document Server

    Calin, Ovidiu

    2014-01-01

    This book covers topics of Informational Geometry, a field which deals with the differential geometric study of the manifold probability density functions. This is a field that is increasingly attracting the interest of researchers from many different areas of science, including mathematics, statistics, geometry, computer science, signal processing, physics and neuroscience. It is the authors’ hope that the present book will be a valuable reference for researchers and graduate students in one of the aforementioned fields. This textbook is a unified presentation of differential geometry and probability theory, and constitutes a text for a course directed at graduate or advanced undergraduate students interested in applications of differential geometry in probability and statistics. The book contains over 100 proposed exercises meant to help students deepen their understanding, and it is accompanied by software that is able to provide numerical computations of several information geometric objects. The reader...

  14. Fixation probability on clique-based graphs

    Science.gov (United States)

    Choi, Jeong-Ok; Yu, Unjong

    2018-02-01

    The fixation probability of a mutant in the evolutionary dynamics of the Moran process is calculated by the Monte Carlo method on a few families of clique-based graphs. It is shown that the complete suppression of fixation can be realized with the generalized clique-wheel graph in the limit of small wheel-clique ratio and infinite size. The family of clique-star is an amplifier, and the clique-arms graph changes from amplifier to suppressor as the fitness of the mutant increases. We demonstrate that the overall structure of a graph can be more important in determining the fixation probability than the degree or the heat heterogeneity. The dependence of the fixation probability on the position of the first mutant is discussed.
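
    For the well-mixed (complete-graph) baseline of such studies, a Monte Carlo estimate can be checked against the closed-form Moran fixation probability ρ = (1 - 1/r)/(1 - 1/r^N). A sketch with hypothetical parameters N = 10, r = 1.5, not the clique-based graphs of the paper:

```python
import random

# Monte Carlo estimate of the fixation probability of a single mutant
# with relative fitness r in a Moran birth-death process on a complete
# graph (well-mixed population of size N).

def fixation_prob(N, r, runs, seed=7):
    rng = random.Random(seed)
    fixed = 0
    for _ in range(runs):
        mutants = 1
        while 0 < mutants < N:
            # reproducer chosen proportional to fitness
            total = mutants * r + (N - mutants)
            birth_mutant = rng.random() < mutants * r / total
            # offspring replaces a uniformly chosen individual
            death_mutant = rng.random() < mutants / N
            mutants += birth_mutant - death_mutant
        fixed += mutants == N
    return fixed / runs

N, r = 10, 1.5
est = fixation_prob(N, r, 20_000)
exact = (1 - 1 / r) / (1 - r ** -N)   # about 0.339 for these values
print(abs(est - exact) < 0.02)
```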

  15. Duelling idiots and other probability puzzlers

    CERN Document Server

    Nahin, Paul J

    2002-01-01

    What are your chances of dying on your next flight, being called for jury duty, or winning the lottery? We all encounter probability problems in our everyday lives. In this collection of twenty-one puzzles, Paul Nahin challenges us to think creatively about the laws of probability as they apply in playful, sometimes deceptive, ways to a fascinating array of speculative situations. Games of Russian roulette, problems involving the accumulation of insects on flypaper, and strategies for determining the odds of the underdog winning the World Series all reveal intriguing dimensions to the worki

  16. Proposal for Modified Damage Probability Distribution Functions

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup; Hansen, Peter Friis

    1996-01-01

    Immediately following the Estonia disaster, the Nordic countries established a project entitled "Safety of Passenger/RoRo Vessels". As part of this project the present proposal for modified damage stability probability distribution functions has been developed and submitted to the "Sub-committee on st...

  17. Probability densities and Lévy densities

    DEFF Research Database (Denmark)

    Barndorff-Nielsen, Ole Eiler

    For positive Lévy processes (i.e. subordinators) formulae are derived that express the probability density or the distribution function in terms of power series in time t. The applicability of the results to finance and to turbulence is briefly indicated.

  18. Probabilities from entanglement, Born's rule from envariance

    International Nuclear Information System (INIS)

    Zurek, W.

    2005-01-01

    Full text: I shall discuss consequences of envariance (environment-assisted invariance) symmetry exhibited by entangled quantum states. I shall focus on the implications of envariance for the understanding of the origins and nature of ignorance, and, hence, for the origin of probabilities in physics. While the derivation of Born's rule for probabilities (p_k = |ψ_k|²) is the principal accomplishment of this research, I shall explore the possibility that several other symptoms of the quantum-classical transition that are a consequence of decoherence can be justified directly by envariance, i.e., without invoking Born's rule. (author)

  19. Risk Probability Estimating Based on Clustering

    DEFF Research Database (Denmark)

    Chen, Yong; Jensen, Christian D.; Gray, Elizabeth

    2003-01-01

    ...of prior experiences, recommendations from a trusted entity or the reputation of the other entity. In this paper we propose a dynamic mechanism for estimating the risk probability of a certain interaction in a given environment using hybrid neural networks. We argue that traditional risk assessment models from the insurance industry do not directly apply to ubiquitous computing environments. Instead, we propose a dynamic mechanism for risk assessment, which is based on pattern matching, classification and prediction procedures. This mechanism uses an estimator of risk probability, which is based...

  20. Fifty challenging problems in probability with solutions

    CERN Document Server

    Mosteller, Frederick

    1987-01-01

    Can you solve the problem of "The Unfair Subway"? Marvin gets off work at random times between 3 and 5 p.m. His mother lives uptown, his girlfriend downtown. He takes the first subway that comes in either direction and eats dinner with the one he is delivered to. His mother complains that he never comes to see her, but he says she has a 50-50 chance. He has had dinner with her twice in the last 20 working days. Explain. Marvin's adventures in probability are one of the fifty intriguing puzzles that illustrate both elementary and advanced aspects of probability, each problem designed to chall
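
    The standard resolution of the subway puzzle assumes a fixed headway with an offset: if trains run every 10 minutes in each direction but the uptown train arrives 1 minute after the downtown one, Marvin boards the uptown (mother's) train only when he lands in that 1-minute gap. A simulation sketch of this hypothetical schedule:

```python
import random

# Downtown trains pass at minutes 0, 10, 20, ...; uptown trains at
# 1, 11, 21, ... Arriving at a uniformly random time, Marvin catches
# the uptown train only if he arrives in the 1-minute window before
# it, i.e. with probability 1/10 -- about 2 dinners with mother in 20.

def uptown_fraction(trials, seed=3):
    rng = random.Random(seed)
    uptown = 0
    for _ in range(trials):
        t = rng.uniform(0, 10)   # arrival within one 10-minute cycle
        if t < 1:                # next train is the uptown one at minute 1
            uptown += 1
    return uptown / trials

frac = uptown_fraction(100_000)
print(abs(frac - 0.1) < 0.01)
```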

  1. Path probabilities of continuous time random walks

    International Nuclear Information System (INIS)

    Eule, Stephan; Friedrich, Rudolf

    2014-01-01

    Employing the path integral formulation of a broad class of anomalous diffusion processes, we derive the exact relations for the path probability densities of these processes. In particular, we obtain a closed analytical solution for the path probability distribution of a Continuous Time Random Walk (CTRW) process. This solution is given in terms of its waiting time distribution and short time propagator of the corresponding random walk as a solution of a Dyson equation. Applying our analytical solution we derive generalized Feynman–Kac formulae. (paper)
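
    A CTRW is straightforward to sample even when its path probabilities are hard to write down in closed form. A minimal sketch with Pareto waiting times and symmetric unit jumps (illustrative choices, not the paper's path-integral formulation):

```python
import random

# Minimal continuous-time random walk sampler: heavy-tailed Pareto
# waiting times (tail index alpha) separate unit jumps of random sign.

def ctrw_position(T, alpha=1.5, rng=None):
    rng = rng or random.Random(11)
    t, x = 0.0, 0
    while True:
        wait = rng.paretovariate(alpha)   # waiting time, P(W > s) ~ s**-alpha
        if t + wait > T:
            return x                      # position at observation time T
        t += wait
        x += rng.choice((-1, 1))

positions = [ctrw_position(100.0, rng=random.Random(s)) for s in range(2000)]
mean = sum(positions) / len(positions)
print(abs(mean) < 1.0)  # symmetric jumps: sample mean near zero
```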

  2. Probable Gastrointestinal Toxicity of Kombucha Tea

    Science.gov (United States)

    Srinivasan, Radhika; Smolinske, Susan; Greenbaum, David

    1997-01-01

    Kombucha tea is a health beverage made by incubating the Kombucha “mushroom” in tea and sugar. Although therapeutic benefits have been attributed to the drink, neither its beneficial effects nor adverse side effects have been reported widely in the scientific literature. Side effects probably related to consumption of Kombucha tea are reported in four patients. Two presented with symptoms of allergic reaction, the third with jaundice, and the fourth with nausea, vomiting, and head and neck pain. In all four, use of Kombucha tea in proximity to onset of symptoms and symptom resolution on cessation of tea drinking suggest a probable etiologic association. PMID:9346462

  3. Quantum probability and quantum decision-making.

    Science.gov (United States)

    Yukalov, V I; Sornette, D

    2016-01-13

    A rigorous general definition of quantum probability is given, which is valid not only for elementary events but also for composite events, for operationally testable measurements as well as for inconclusive measurements, and also for non-commuting observables in addition to commutative observables. Our proposed definition of quantum probability makes it possible to describe quantum measurements and quantum decision-making on the same common mathematical footing. Conditions are formulated for the case when quantum decision theory reduces to its classical counterpart and for the situation where the use of quantum decision theory is necessary. © 2015 The Author(s).

  4. Lady luck the theory of probability

    CERN Document Server

    Weaver, Warren

    1982-01-01

    "Should I take my umbrella?" "Should I buy insurance?" "Which horse should I bet on?" Every day, in business, in love affairs, in forecasting the weather or the stock market, questions arise which cannot be answered by a simple "yes" or "no." Many of these questions involve probability. Probabilistic thinking is as crucially important in ordinary affairs as it is in the most abstruse realms of science. This book is the best nontechnical introduction to probability ever written. Its author, the late Dr. Warren Weaver, was a professor of mathematics, active in the Rockefeller and Sloa

  5. Quantum probabilities as Dempster-Shafer probabilities in the lattice of subspaces

    International Nuclear Information System (INIS)

    Vourdas, A.

    2014-01-01

    The orthocomplemented modular lattice of subspaces L[H(d)], of a quantum system with d-dimensional Hilbert space H(d), is considered. A generalized additivity relation which holds for Kolmogorov probabilities is violated by quantum probabilities in the full lattice L[H(d)] (it is only valid within the Boolean subalgebras of L[H(d)]). This suggests the use of more general (than Kolmogorov) probability theories, and here the Dempster-Shafer probability theory is adopted. An operator D(H1, H2), which quantifies deviations from Kolmogorov probability theory, is introduced, and it is shown to be intimately related to the commutator of the projectors P(H1), P(H2) onto the subspaces H1, H2. As an application, it is shown that the proof of the inequalities of Clauser, Horne, Shimony, and Holt for a system of two spin-1/2 particles is valid for Kolmogorov probabilities, but it is not valid for Dempster-Shafer probabilities. The violation of these inequalities in experiments supports the interpretation of quantum probabilities as Dempster-Shafer probabilities.
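
    The additivity failure described here can be made concrete in the smallest nontrivial case, d = 2, with two distinct one-dimensional subspaces (an illustrative calculation, not the paper's Dempster-Shafer construction):

```python
import math

# In a 2-dimensional Hilbert space take H1 = span|0> and H2 = span|+>.
# Their lattice meet is {0} and their join is the whole space, so the
# Kolmogorov-style relation p(join) = p(H1) + p(H2) - p(meet) fails
# for the quantum probabilities of a generic pure state psi.

def proj_prob(state, direction):
    # probability |<direction|state>|^2 for real unit vectors
    inner = sum(s * d for s, d in zip(state, direction))
    return inner ** 2

psi = (math.cos(0.3), math.sin(0.3))        # an arbitrary pure state
h1 = (1.0, 0.0)                             # |0>
h2 = (1 / math.sqrt(2), 1 / math.sqrt(2))   # |+>

p1, p2 = proj_prob(psi, h1), proj_prob(psi, h2)
p_meet = 0.0    # H1 meet H2 = {0}
p_join = 1.0    # H1 join H2 = whole space

print(abs(p_join - (p1 + p2 - p_meet)) > 0.05)  # additivity is violated
```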

  6. Tropical Cyclone Wind Probability Forecasting (WINDP).

    Science.gov (United States)

    1981-04-01

    ...The accuracy of small probabilities (below 10%) is limited by the number of significant digits given; therefore it should be regarded as being...

  7. The Probability Heuristics Model of Syllogistic Reasoning.

    Science.gov (United States)

    Chater, Nick; Oaksford, Mike

    1999-01-01

    Proposes a probability heuristic model for syllogistic reasoning and confirms the rationality of this heuristic by an analysis of the probabilistic validity of syllogistic reasoning that treats logical inference as a limiting case of probabilistic inference. Meta-analysis and two experiments involving 40 adult participants and using generalized…

  8. Probability & Perception: The Representativeness Heuristic in Action

    Science.gov (United States)

    Lu, Yun; Vasko, Francis J.; Drummond, Trevor J.; Vasko, Lisa E.

    2014-01-01

    If the prospective students of probability lack a background in mathematical proofs, hands-on classroom activities may work well to help them to learn to analyze problems correctly. For example, students may physically roll a die twice to count and compare the frequency of the sequences. Tools such as graphing calculators or Microsoft Excel®…
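
    The die-rolling activity can be backed by exact enumeration: every specific two-roll sequence has probability 1/36, however "unrepresentative" it may look.

```python
from itertools import product
from fractions import Fraction

# Enumerate the 36 equally likely outcomes of two die rolls and show
# that a "special-looking" sequence such as (6, 6) is exactly as
# probable as a "typical-looking" one such as (3, 5).

outcomes = list(product(range(1, 7), repeat=2))

def seq_prob(seq):
    return Fraction(outcomes.count(seq), len(outcomes))

print(seq_prob((6, 6)) == seq_prob((3, 5)) == Fraction(1, 36))  # True
```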

  9. Critique of `Elements of Quantum Probability'

    NARCIS (Netherlands)

    Gill, R.D.

    1998-01-01

    We analyse the thesis of Kummerer and Maassen that classical probability is unable to model the stochastic nature of the Aspect experiment, in which violation of Bell's inequality was experimentally demonstrated. According to these authors, the experiment shows the need to introduce the extension...

  10. Independent Events in Elementary Probability Theory

    Science.gov (United States)

    Csenki, Attila

    2011-01-01

    In Probability and Statistics taught to mathematicians as a first introduction or to a non-mathematical audience, joint independence of events is introduced by requiring that the multiplication rule is satisfied. The following statement is usually tacitly assumed to hold (and, at best, intuitively motivated): If the n events E[subscript 1],…

  11. Probable Unusual Transmission of Zika Virus

    Centers for Disease Control (CDC) Podcasts

    This podcast discusses a study about the probable unusual transmission of Zika Virus Infection from a scientist to his wife, published in the May 2011 issue of Emerging Infectious Diseases. Dr. Brian Foy, Associate Professor at Colorado State University, shares details of this event.

  12. Spatial Probability Cuing and Right Hemisphere Damage

    Science.gov (United States)

    Shaqiri, Albulena; Anderson, Britt

    2012-01-01

    In this experiment we studied statistical learning, inter-trial priming, and visual attention. We assessed healthy controls and right brain damaged (RBD) patients with and without neglect, on a simple visual discrimination task designed to measure priming effects and probability learning. All participants showed a preserved priming effect for item…

  13. Sampling, Probability Models and Statistical Reasoning -RE ...

    Indian Academy of Sciences (India)

    random sampling allows data to be modelled with the help of probability ... based on different trials to get an estimate of the experimental error ... if e is indeed the true value of the proportion of defectives in the ...

  14. Virus isolation: Specimen type and probable transmission

    Indian Academy of Sciences (India)

    Virus isolation: Specimen type and probable transmission. Over 500 CHIK virus isolations were made: 4 from male Ae. aegypti (?TOT), 6 from CSF (neurological involvement), 1 from a 4-day-old child (transplacental transmission).

  15. Estimating the Probability of Negative Events

    Science.gov (United States)

    Harris, Adam J. L.; Corner, Adam; Hahn, Ulrike

    2009-01-01

    How well we are attuned to the statistics of our environment is a fundamental question in understanding human behaviour. It seems particularly important to be able to provide accurate assessments of the probability with which negative events occur so as to guide rational choice of preventative actions. One question that arises here is whether or…

  16. Concurrency meets probability: theory and practice (abstract)

    NARCIS (Netherlands)

    Katoen, Joost P.

    Treating random phenomena in concurrency theory has a long tradition. Petri nets [18, 10] and process algebras [14] have been extended with probabilities. The same applies to behavioural semantics such as strong and weak (bi)simulation [1], and testing pre-orders [5]. Beautiful connections between

  17. Confusion between Odds and Probability, a Pandemic?

    Science.gov (United States)

    Fulton, Lawrence V.; Mendez, Francis A.; Bastian, Nathaniel D.; Musal, R. Muzaffer

    2012-01-01

    This manuscript discusses the common confusion between the terms probability and odds. To emphasize the importance and responsibility of being meticulous in the dissemination of information and knowledge, this manuscript reveals five cases of sources of inaccurate statistical language imbedded in the dissemination of information to the general…
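
    The distinction is easy to state in code. Two hypothetical helper functions for the standard conversions odds = p/(1 - p) and p = odds/(1 + odds):

```python
# An event with probability 0.75 has odds 3 (often quoted "3 to 1"),
# not "75% odds" -- the two scales agree only near zero.

def prob_to_odds(p):
    return p / (1 - p)

def odds_to_prob(o):
    return o / (1 + o)

assert prob_to_odds(0.75) == 3.0
assert odds_to_prob(3.0) == 0.75
print("a fair coin has odds", prob_to_odds(0.5), "but probability 0.5")
```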

  18. Probability in Action: The Red Traffic Light

    Science.gov (United States)

    Shanks, John A.

    2007-01-01

    Emphasis on problem solving in mathematics has gained considerable attention in recent years. While statistics teaching has always been problem driven, the same cannot be said for the teaching of probability where discrete examples involving coins and playing cards are often the norm. This article describes an application of simple probability…
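
    The red-light example reduces to a uniform arrival over a signal cycle. A simulation sketch with hypothetical timings (30 s red, 50 s green), checked against the exact stopping probability red/cycle and mean wait red²/(2·cycle):

```python
import random

# A car arriving at a uniformly random instant within the cycle is
# stopped with probability RED/CYCLE; averaging the residual red time
# over all arrivals gives an expected wait of RED**2 / (2 * CYCLE).

RED, GREEN = 30.0, 50.0
CYCLE = RED + GREEN

def simulate(trials, seed=5):
    rng = random.Random(seed)
    stopped, total_wait = 0, 0.0
    for _ in range(trials):
        t = rng.uniform(0, CYCLE)       # red phase occupies [0, RED)
        if t < RED:
            stopped += 1
            total_wait += RED - t       # wait until the light turns green
    return stopped / trials, total_wait / trials

p_stop, mean_wait = simulate(100_000)
print(abs(p_stop - RED / CYCLE) < 0.01)              # exact value 0.375
print(abs(mean_wait - RED**2 / (2 * CYCLE)) < 0.1)   # exact value 5.625 s
```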

  19. Probability & Statistics: Modular Learning Exercises. Teacher Edition

    Science.gov (United States)

    Actuarial Foundation, 2012

    2012-01-01

    The purpose of these modules is to provide an introduction to the world of probability and statistics to accelerated mathematics students at the high school level. The modules also introduce students to real world math concepts and problems that property and casualty actuaries come across in their work. They are designed to be used by teachers and…

  20. Probability & Statistics: Modular Learning Exercises. Student Edition

    Science.gov (United States)

    Actuarial Foundation, 2012

    2012-01-01

    The purpose of these modules is to provide an introduction to the world of probability and statistics to accelerated mathematics students at the high school level. The materials are centered on the fictional town of Happy Shores, a coastal community which is at risk for hurricanes. Actuaries at an insurance company figure out the risks and…

  1. Investigating Probability with the NBA Draft Lottery.

    Science.gov (United States)

    Quinn, Robert J.

    1997-01-01

    Investigates an interesting application of probability in the world of sports. Considers the role of permutations in the lottery system used by the National Basketball Association (NBA) in the United States to determine the order in which nonplayoff teams select players from the college ranks. Presents a lesson on this topic in which students work…

  2. Probability from a Socio-Cultural Perspective

    Science.gov (United States)

    Sharma, Sashi

    2016-01-01

    There exists considerable and rich literature on students' misconceptions about probability; less attention has been paid to the development of students' probabilistic thinking in the classroom. Grounded in an analysis of the literature, this article offers a lesson sequence for developing students' probabilistic understanding. In particular, a…

  3. Neutrosophic Probability, Set, And Logic (first version)

    OpenAIRE

    Smarandache, Florentin

    2000-01-01

    This project is a part of a National Science Foundation interdisciplinary project proposal. Starting from a new viewpoint in philosophy, the neutrosophy, one extends the classical "probability theory", "fuzzy set" and "fuzzy logic" to "neutrosophic probability", "neutrosophic set" and "neutrosophic logic", respectively. They are useful in artificial intelligence, neural networks, evolutionary programming, neutrosophic dynamic systems, and quantum mechanics.

  4. Pade approximant calculations for neutron escape probability

    International Nuclear Information System (INIS)

    El Wakil, S.A.; Saad, E.A.; Hendi, A.A.

    1984-07-01

    The neutron escape probability from a non-multiplying slab containing an internal source is defined in terms of a functional relation for the scattering function of the diffuse reflection problem. The Padé approximant technique is used to obtain numerical results, which are compared with exact results. (author)

  5. On a paradox of probability theory

    International Nuclear Information System (INIS)

    Stuart, C.I.J.M.

    1989-01-01

    Costa de Beauregard's proposal concerning physical retrocausality has been shown to fail on two crucial points. However, it is argued that his proposal still merits serious attention. The argument arises from showing that his proposal reveals a paradox involving relations between conditional probabilities, statistical correlations and reciprocal causalities of the type exhibited by cooperative dynamics in physical systems. 4 refs. (Author)

  6. Escape probabilities for fluorescent x-rays

    International Nuclear Information System (INIS)

    Dance, D.R.; Day, G.J.

    1985-01-01

    Computation of the energy absorption efficiency of an x-ray photon detector involves consideration of the histories of the secondary particles produced in any initial or secondary interaction which may occur within the detector. In particular, the K or higher shell fluorescent x-rays which may be emitted following a photoelectric interaction can carry away a large fraction of the energy of the incident photon, especially if this energy is just above an absorption edge. The effects of such photons cannot be ignored and a correction term, depending upon the probability that the fluorescent x-rays will escape from the detector, must be applied to the energy absorption efficiency. For detectors such as x-ray intensifying screens, it has been usual to calculate this probability by numerical integration. In this note analytic expressions are derived for the escape probability of fluorescent photons from planar detectors in terms of exponential integral functions. Rational approximations for these functions are readily available and these analytic expressions therefore facilitate the computation of photon absorption efficiencies. A table is presented which should obviate the need for calculating the escape probability for most cases of interest. (author)
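The analytic form referred to above can be sketched under simplifying assumptions (isotropic emission at optical depth τ below a single surface of a thick slab; this is an illustration, not the note's full detector geometry): the escape probability through that surface is ½E2(τ), where E2 is the second exponential integral, and this can be checked against direct quadrature over emission angles.

```python
import numpy as np

def escape_prob_angular(tau, n=200_000):
    """Escape probability through the near surface for a photon emitted
    isotropically at optical depth tau in a thick slab, by midpoint-rule
    quadrature over upward direction cosines u = cos(theta):
        P(tau) = 0.5 * integral_0^1 exp(-tau / u) du."""
    u = (np.arange(n) + 0.5) / n
    return 0.5 * float(np.mean(np.exp(-tau / u)))

def escape_prob_E2(tau, t_max=60.0, n=400_000):
    """Same quantity via the analytic form 0.5 * E2(tau), where
    E2(tau) = integral_1^inf exp(-tau * t) / t**2 dt is the second
    exponential integral, evaluated here by trapezoidal quadrature."""
    t = np.linspace(1.0, t_max, n)
    return 0.5 * float(np.trapz(np.exp(-tau * t) / t**2, t))
```

At τ = 0 half the isotropically emitted photons head toward the surface and escape (P = 0.5), and P falls monotonically with depth; rational approximations to E2, as the note says, make such evaluations cheap.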

  7. Sequential Probability Ratio Tests: Conservative and Robust

    NARCIS (Netherlands)

    Kleijnen, J.P.C.; Shi, Wen

    2017-01-01

    In practice, most computers generate simulation outputs sequentially, so it is attractive to analyze these outputs through sequential statistical methods such as sequential probability ratio tests (SPRTs). We investigate several SPRTs for choosing between two hypothesized values for the mean output

  8. Applied probability models with optimization applications

    CERN Document Server

    Ross, Sheldon M

    1992-01-01

    Concise advanced-level introduction to stochastic processes that frequently arise in applied probability. Largely self-contained text covers the Poisson process, renewal theory, Markov chains, inventory theory, Brownian motion and continuous-time optimization models, and much more. Problems and references at chapter ends. "Excellent introduction." - Journal of the American Statistical Association. Bibliography. 1970 edition.

  9. Quantum probability and conceptual combination in conjunctions.

    Science.gov (United States)

    Hampton, James A

    2013-06-01

    I consider the general problem of category conjunctions in the light of Pothos & Busemeyer (P&B)'s quantum probability (QP) account of the conjunction fallacy. I argue that their account as presented cannot capture the "guppy effect" - the case in which a class is a better member of a conjunction A^B than it is of either A or B alone.

  10. Monte Carlo methods to calculate impact probabilities

    Science.gov (United States)

    Rickman, H.; Wiśniowski, T.; Wajer, P.; Gabryszewski, R.; Valsecchi, G. B.

    2014-09-01

    Context. Unraveling the events that took place in the solar system during the period known as the late heavy bombardment requires the interpretation of the cratered surfaces of the Moon and terrestrial planets. This, in turn, requires good estimates of the statistical impact probabilities for different source populations of projectiles, a subject that has received relatively little attention since the works of Öpik (1951, Proc. R. Irish Acad. Sect. A, 54, 165) and Wetherill (1967, J. Geophys. Res., 72, 2429). Aims: We aim to work around the limitations of the Öpik and Wetherill formulae, which are caused by singularities due to zero denominators under special circumstances. Using modern computers, it is possible to make good estimates of impact probabilities by means of Monte Carlo simulations, and in this work, we explore the available options. Methods: We describe three basic methods to derive the average impact probability for a projectile with a given semi-major axis, eccentricity, and inclination with respect to a target planet on an elliptic orbit. One is a numerical averaging of the Wetherill formula; the next is a Monte Carlo super-sizing method using the target's Hill sphere. The third uses extensive minimum orbit intersection distance (MOID) calculations for a Monte Carlo sampling of potentially impacting orbits, along with calculations of the relevant interval for the timing of the encounter allowing collision. Numerical experiments are carried out for an intercomparison of the methods and to scrutinize their behavior near the singularities (zero relative inclination and equal perihelion distances). Results: We find excellent agreement between all methods in the general case, while large differences appear in the immediate vicinity of the singularities. With respect to the MOID method, which is the only one that does not involve simplifying assumptions and approximations, the Wetherill averaging impact probability departs by diverging toward
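None of the paper's three methods is reproduced here, but the core Monte Carlo idea, sampling encounter configurations and checking the estimate against an analytic value, can be shown on a deliberately simple toy (coplanar circular orbits of equal radius with independent random phases; all numbers illustrative): the probability that the angular separation of two uniformly random phases falls below δ is δ/π.

```python
import numpy as np

rng = np.random.default_rng(0)

def mc_encounter_probability(delta, n=1_000_000):
    """Monte Carlo estimate of the fraction of random phase pairs whose
    shortest angular separation is below delta (radians). For two
    independent uniform phases the analytic value is delta / pi
    (for delta <= pi)."""
    phi1 = rng.uniform(0.0, 2 * np.pi, n)
    phi2 = rng.uniform(0.0, 2 * np.pi, n)
    sep = np.abs(phi1 - phi2) % (2 * np.pi)
    sep = np.minimum(sep, 2 * np.pi - sep)   # shortest angular distance
    return float(np.mean(sep < delta))
```

The Monte Carlo estimate for, say, δ = 0.1 should agree with 0.1/π to within sampling noise; in the real problem the "analytic value" role is played by the Öpik/Wetherill formulae, which the paper's MOID-based sampling sidesteps near their singularities.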

  11. Analysis of probability of defects in the disposal canisters

    International Nuclear Information System (INIS)

    Holmberg, J.-E.; Kuusela, P.

    2011-06-01

    This report presents a probability model for the reliability of the spent nuclear waste final disposal canister. Reliability here means that the welding of the canister lid has no defects that are critical from the long-term safety point of view. From this perspective, the reliability of the welding process (no critical defects are produced) and of the non-destructive testing (NDT) process (all critical defects are detected) are equally important. In the probability model, critical weld defects were simplified into a few types, and the possibility of human error in the NDT process was taken into account in a simple manner. At present there is very little representative data for determining the reliability of welding, and the available NDT data is not well suited to the needs of this study; the calculations presented here are therefore based on expert judgement and on several assumptions that have not yet been verified. The Bayesian probability model shows the importance of uncertainty in the estimation of the reliability parameters: its effect is that the probability distribution of the number of defective canisters becomes flat for larger numbers of canisters, compared with the binomial distribution obtained when the parameter values are known. To reduce this uncertainty, more information is needed on the reliability of both the welding and NDT processes. It would also be important to analyse the role of human factors in these processes, since that role is not reflected in the typical test data used to estimate 'normal process variation'. The reported model should be seen as a tool to quantify the roles of different methods and procedures in the weld inspection process. (orig.)
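The flattening effect described in this abstract, where uncertainty in the defect rate broadens the distribution of the number of defective canisters, can be illustrated by comparing a binomial model (defect probability known) with a beta-binomial model (defect probability uncertain but with the same mean). All numbers below are illustrative, not the report's:

```python
from math import lgamma, exp, comb

def binom_pmf(k, n, p):
    """Binomial pmf: number of defective canisters when p is known."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

def betabinom_pmf(k, n, a, b):
    """Beta-binomial pmf: binomial with p ~ Beta(a, b) integrated out,
    modelling uncertainty about the defect probability itself."""
    logB = lambda x, y: lgamma(x) + lgamma(y) - lgamma(x + y)
    return comb(n, k) * exp(logB(k + a, n - k + b) - logB(a, b))

n = 100              # number of canisters (illustrative)
p = 0.05             # known defect probability
a, b = 2.0, 38.0     # Beta(2, 38): same mean 0.05, but uncertain

var_binom = sum(k * k * binom_pmf(k, n, p) for k in range(n + 1)) - (n * p) ** 2
mean_bb = sum(k * betabinom_pmf(k, n, a, b) for k in range(n + 1))
var_bb = sum(k * k * betabinom_pmf(k, n, a, b) for k in range(n + 1)) - mean_bb ** 2
```

With the same expected number of defective canisters, the beta-binomial variance exceeds the binomial one (here by the factor (a + b + n)/(a + b + 1)), which is exactly the "flatter distribution" effect the report attributes to parameter uncertainty.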

  12. Bounding probabilistic safety assessment probabilities by reality

    International Nuclear Information System (INIS)

    Fragola, J.R.; Shooman, M.L.

    1991-01-01

    Investigating failure in systems where failure is a rare event makes continual comparison between the developed probabilities and empirical evidence difficult. Yet comparing the predictions of rare-event risk assessments with historical reality is essential to prevent probabilistic safety assessment (PSA) predictions from drifting into fantasy. One approach to performing such comparisons is to search out and assign probabilities to natural events which, while extremely rare, have a basis in the history of natural phenomena or human activities. For example, the Segovia aqueduct and some of the Roman fortresses in Spain have existed for nearly two millennia and in many cases show no physical signs of earthquake damage. This evidence could be used to bound the probability of earthquakes above a certain magnitude to less than 10⁻³ per year. On the other hand, there is evidence that some repetitive actions can be performed with extremely low historical failure probabilities when operators are properly trained and motivated and sufficient warning indicators are provided. The point is not that low probability estimates are impossible, but that continual reassessment of the analysis assumptions is required and that the analysis predictions should be bounded by historical reality. This paper reviews the probabilistic predictions of PSA in this light; attempts to develop, in a general way, the limits which can be historically established and the consequent bounds that these limits place upon the predictions; and illustrates the methodology used in computing such limits. Further, the paper discusses the use of empirical evidence and the requirement for disciplined, systematic approaches within the bounds of reality, and the associated impact on PSA probabilistic estimates.

  13. Establishment probability in newly founded populations

    Directory of Open Access Journals (Sweden)

    Gusset Markus

    2012-06-01

    Full Text Available Abstract Background Establishment success in newly founded populations relies on reaching the established phase, which is defined by characteristic fluctuations of the population's state variables. Stochastic population models can be used to quantify the establishment probability of newly founded populations; however, so far no simple but robust method for doing so existed. To determine a critical initial number of individuals that need to be released to reach the established phase, we used a novel application of the "Wissel plot", where –ln(1 – P0(t)) is plotted against time t. This plot is based on the equation P0(t) = 1 – c1e^(–ω1t), which relates the probability of extinction by time t, P0(t), to two constants: c1 describes the probability of a newly founded population reaching the established phase, whereas ω1 describes the population's probability of extinction per short time interval once established. Results For illustration, we applied the method to a previously developed stochastic population model of the endangered African wild dog (Lycaon pictus). A newly founded population reaches the established phase if the intercept of the (extrapolated) linear part of the "Wissel plot" with the y-axis, which is –ln(c1), is negative. For wild dogs in our model, this is the case if a critical initial number of four packs, consisting of eight individuals each, is released. Conclusions The method we present to quantify the establishment probability of newly founded populations is generic, and inferences are thus transferable to other systems across the field of conservation biology. In contrast to other methods, our approach disaggregates the components of a population's viability by distinguishing establishment from persistence.
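The "Wissel plot" diagnostic above is linear: –ln(1 – P0(t)) = –ln(c1) + ω1·t, so c1 and ω1 follow from a straight-line fit, and the sign of the intercept –ln(c1) gives the establishment criterion. A minimal sketch on synthetic P0(t) values (parameters illustrative, not the wild-dog model's):

```python
import numpy as np

# Synthetic "true" parameters (illustrative, not the wild-dog model)
c1_true, omega1_true = 0.6, 0.02

t = np.arange(5, 50, dtype=float)               # times in the linear regime
P0 = 1.0 - c1_true * np.exp(-omega1_true * t)   # extinction probability by t

# Wissel plot: y = -ln(1 - P0(t)) = -ln(c1) + omega1 * t is linear in t,
# so a degree-1 polynomial fit recovers both constants.
y = -np.log(1.0 - P0)
omega1_hat, intercept = np.polyfit(t, y, 1)     # [slope, intercept]
c1_hat = np.exp(-intercept)
```

Per the abstract's criterion, the founding proposal would count as reaching the established phase only if the fitted intercept –ln(c1) were negative; with c1 = 0.6 here the intercept is positive.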

  14. The probability and severity of decompression sickness

    Science.gov (United States)

    Hada, Ethan A.; Vann, Richard D.; Denoble, Petar J.

    2017-01-01

    Decompression sickness (DCS), which is caused by inert gas bubbles in tissues, is an injury of concern for scuba divers, compressed air workers, astronauts, and aviators. Case reports for 3322 air and N2-O2 dives, resulting in 190 DCS events, were retrospectively analyzed and the outcomes were scored as (1) serious neurological, (2) cardiopulmonary, (3) mild neurological, (4) pain, (5) lymphatic or skin, and (6) constitutional or nonspecific manifestations. Following standard U.S. Navy medical definitions, the data were grouped into mild (Type I; manifestations 4–6) and serious (Type II; manifestations 1–3). Additionally, we considered an alternative grouping of mild (Type A; manifestations 3–6) and serious (Type B; manifestations 1 and 2). The current U.S. Navy guidance allows for a 2% probability of mild DCS and a 0.1% probability of serious DCS. We developed a hierarchical trinomial (3-state) probabilistic DCS model that simultaneously predicts the probability of mild and serious DCS given a dive exposure. Both the Type I/II and Type A/B discriminations of mild and serious DCS resulted in a highly significant (p probability of 'mild' DCS resulted in a longer allowable bottom time for the same 2% limit. However, for the 0.1% serious DCS limit, we found a vastly decreased allowable bottom dive time for all dive depths. If the Type A/B scoring was assigned to outcome severity, the no-decompression limits (NDL) for air dives were still controlled by the acceptable serious DCS risk limit rather than the acceptable mild DCS risk limit. However, in this case, longer NDL limits were allowed than with the Type I/II scoring. The trinomial model mild and serious probabilities agree reasonably well with the current air NDL only with the Type A/B scoring and when 0.2% risk of serious DCS is allowed. PMID:28296928

  15. Finite-size scaling of survival probability in branching processes.

    Science.gov (United States)

    Garcia-Millan, Rosalba; Font-Clos, Francesc; Corral, Álvaro

    2015-04-01

    Branching processes pervade many models in statistical physics. We investigate the survival probability of a Galton-Watson branching process after a finite number of generations. We derive analytically the existence of finite-size scaling for the survival probability as a function of the control parameter and the maximum number of generations, obtaining the critical exponents as well as the exact scaling function, which is G(y) = 2ye^y/(e^y − 1), with y the rescaled distance to the critical point. Our findings are valid for any branching process of the Galton-Watson type, independently of the distribution of the number of offspring, provided its variance is finite. This proves the universal behavior of the finite-size effects in branching processes, including the universality of the metric factors. The direct relation to mean-field percolation is also discussed.
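The quoted scaling function can be evaluated directly; a small sketch that handles the removable singularity at y = 0, where G tends to 2:

```python
import numpy as np

def G(y):
    """Scaling function G(y) = 2 y e^y / (e^y - 1) from the abstract,
    with the removable singularity at y = 0 replaced by its limit, 2."""
    y = np.asarray(y, dtype=float)
    with np.errstate(divide="ignore", invalid="ignore"):
        out = 2.0 * y * np.exp(y) / np.expm1(y)   # expm1 keeps small-y accuracy
    return np.where(y == 0.0, 2.0, out)
```

For large positive y the function behaves as 2y (the e^y factors cancel), while for y → 0⁻ and y → 0⁺ it approaches 2 smoothly, consistent with a scaling function that interpolates across the critical point.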

  16. Correlations between channel probabilities in collisional dissociation of D3+

    International Nuclear Information System (INIS)

    Abraham, S.; Nir, D.; Rosner, B.

    1984-01-01

    Measurements of the dissociation of D3+ ions at 300–600 keV under single- and multiple-collision conditions in Ar and H2 gas targets have been performed. A complete separation of all dissociation channels was achieved, including the neutral channels, which were resolved using a fine-mesh technique. Data analysis in the multiple-collision regime confirms the validity of the rate equations governing the charge-exchange processes. In the single-collision region the analysis yields constant relations between channel probabilities. Data rearrangement shows probability factorization and suggests that collisional dissociation is a two-stage process: a fast electron exchange followed by rearrangement and branching to the exit channels

  17. Identification of Indicators’ Applicability to Settle Borrowers’ Probability of Default

    Directory of Open Access Journals (Sweden)

    Jurevičienė Daiva

    2016-06-01

    Full Text Available Borrowers' default risk is one of the most relevant types of risk in commercial banking, and its assessment is important for securing business profitability and avoiding large losses during economic turbulence. This leads to the necessity of investigating the assessment of borrowers' default probability and the applicability of factors that would capture the newest trends in borrowers' markets. Leading economic indicators (in addition to financial and other economic indicators) are often suggested in the scientific literature as forward-looking. However, there is still an ongoing discussion on the applicability of financial ratios and economic indicators. As the problem is relevant both theoretically and for practitioners, this article aims to identify the applicability of leading economic indicators for the estimation of default probability. Further, qualitative criteria for factor selection were identified and applied using detailing, grouping, and SWOT analysis methods. Based on an analysis of the current scientific literature, this paper concludes that although leading economic indicators are able to capture forward-looking signals, they should be used with careful analysis of their drawbacks and in combination with financial factors in order to avoid overshooting effects. The limitation of the article is that the analysis of factors is based on theoretical analysis rather than estimation of quantitative criteria. This suggests that each use of leading economic indicators requires an empirical study of the particular set of indicators.

  18. An Alternative Version of Conditional Probabilities and Bayes' Rule: An Application of Probability Logic

    Science.gov (United States)

    Satake, Eiki; Amato, Philip P.

    2008-01-01

    This paper presents an alternative version of formulas of conditional probabilities and Bayes' rule that demonstrate how the truth table of elementary mathematical logic applies to the derivations of the conditional probabilities of various complex, compound statements. This new approach is used to calculate the prior and posterior probabilities…
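The paper's truth-table derivations are not reproduced here, but the prior-to-posterior pattern it builds on is the standard two-hypothesis form of Bayes' rule, sketched below with hypothetical numbers:

```python
def posterior(prior, likelihood, likelihood_complement):
    """Bayes' rule for a binary hypothesis H given evidence E:
    P(H|E) = P(E|H) P(H) / (P(E|H) P(H) + P(E|~H) P(~H))."""
    num = likelihood * prior
    den = num + likelihood_complement * (1.0 - prior)
    return num / den

# Hypothetical diagnostic-test numbers: rare condition (prior 1%),
# sensitive test (95%), 5% false-positive rate.
p = posterior(prior=0.01, likelihood=0.95, likelihood_complement=0.05)
```

Even with a sensitive test, the small prior keeps the posterior modest (about 0.16 here), the kind of prior/posterior contrast such articles use to motivate careful probability instruction.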

  19. A Comprehensive Probability Project for the Upper Division One-Semester Probability Course Using Yahtzee

    Science.gov (United States)

    Wilson, Jason; Lawman, Joshua; Murphy, Rachael; Nelson, Marissa

    2011-01-01

    This article describes a probability project used in an upper division, one-semester probability course with third-semester calculus and linear algebra prerequisites. The student learning outcome focused on developing the skills necessary for approaching project-sized math/stat application problems. These skills include appropriately defining…

  20. VOLCANIC RISK ASSESSMENT - PROBABILITY AND CONSEQUENCES

    International Nuclear Information System (INIS)

    G.A. Valentine; F.V. Perry; S. Dartevelle

    2005-01-01

    Risk is the product of the probability and consequences of an event. Both of these must be based upon sound science that integrates field data, experiments, and modeling, but must also be useful to decision makers who likely do not understand all aspects of the underlying science. We review a decision framework used in many fields such as performance assessment for hazardous and/or radioactive waste disposal sites that can serve to guide the volcanological community towards integrated risk assessment. In this framework the underlying scientific understanding of processes that affect probability and consequences drive the decision-level results, but in turn these results can drive focused research in areas that cause the greatest level of uncertainty at the decision level. We review two examples of the determination of volcanic event probability: (1) probability of a new volcano forming at the proposed Yucca Mountain radioactive waste repository, and (2) probability that a subsurface repository in Japan would be affected by the nearby formation of a new stratovolcano. We also provide examples of work on consequences of explosive eruptions, within the framework mentioned above. These include field-based studies aimed at providing data for "closure" of wall rock erosion terms in a conduit flow model, predictions of dynamic pressure and other variables related to damage by pyroclastic flow into underground structures, and vulnerability criteria for structures subjected to conditions of explosive eruption. Process models (e.g., multiphase flow) are important for testing the validity or relative importance of possible scenarios in a volcanic risk assessment. We show how time-dependent multiphase modeling of explosive "eruption" of basaltic magma into an open tunnel (drift) at the Yucca Mountain repository provides insight into proposed scenarios that include the development of secondary pathways to the Earth's surface. Addressing volcanic risk within a decision

  1. Study on conditional probability of surface rupture: effect of fault dip and width of seismogenic layer

    Science.gov (United States)

    Inoue, N.

    2017-12-01

    The conditional probability of surface rupture is affected by various factors, such as shallow material properties, the earthquake process, ground motions, and so on. Toda (2013) pointed out differences in the conditional probability of surface rupture between strike-slip and reverse faults when the fault dip and the width of the seismogenic layer are considered. This study evaluated the conditional probability of surface rupture using the following procedure. Fault geometry was determined from a randomly generated magnitude based on the method of The Headquarters for Earthquake Research Promotion (2017). If the defined fault plane did not saturate the assumed width of the seismogenic layer, the fault plane depth was assigned randomly within the seismogenic layer. Logistic analysis was performed on two data sets: surface displacement calculated by dislocation methods (Wang et al., 2003) from the defined source fault, and the depth of the top of the defined source fault. The conditional probability estimated from surface displacement was higher for reverse faults than for strike-slip faults, which agrees with previous studies (e.g., Kagawa et al., 2004; Kataoka and Kusakabe, 2005). In contrast, the probability estimated from the depth of the source fault was higher for thrust faults than for strike-slip and reverse faults, a trend similar to the conditional probability from PFDHA results (Youngs et al., 2003; Moss and Ross, 2011). The combined simulated thrust and reverse results also show low probability. The worldwide compilation of reverse-fault data includes earthquakes with low fault dip angles. For Japanese reverse faults, by contrast, with fewer low-dip-angle earthquakes, the conditional probability may be lower and similar to that of strike-slip faults (e.g., Takao et al., 2013).
In the future, numerical simulation considering the failure condition of the surface by the source
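The logistic analysis described in this abstract can be sketched on synthetic data (the single depth covariate, the coefficients, and all numbers below are illustrative, not the study's): rupture is made more likely when the top of the fault is shallow, and a NumPy-only gradient-ascent fit recovers the sign of that relationship.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic data (illustrative, not the study's): depth (km) of the top of
# the source fault, and whether surface rupture occurred. Shallow fault
# tops are made more likely to produce surface rupture.
n = 500
depth = rng.uniform(0.0, 10.0, n)
p_true = 1.0 / (1.0 + np.exp(-(2.0 - 0.8 * depth)))
rupture = rng.uniform(size=n) < p_true

# Logistic regression fitted by plain gradient ascent on the log-likelihood.
X = np.column_stack([np.ones_like(depth), depth])   # intercept + depth
w = np.zeros(2)
for _ in range(10_000):
    p_hat = 1.0 / (1.0 + np.exp(-X @ w))
    w += 0.05 * X.T @ (rupture - p_hat) / n

# A negative depth coefficient (w[1]) means a shallower fault top implies
# a higher conditional probability of surface rupture.
```

The same machinery extends to the study's second data set by swapping the covariate (surface displacement in place of fault-top depth).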

  2. Human T Cell Leukemia Virus Type I Tax-Induced IκB-ζ Modulates Tax-Dependent and Tax-Independent Gene Expression in T Cells

    Directory of Open Access Journals (Sweden)

    Ryuichiro Kimura

    2013-09-01

    Full Text Available Human T cell leukemia virus type I (HTLV-I) is the etiologic agent of adult T cell leukemia (ATL) and various inflammatory disorders including HTLV-I-associated myelopathy/tropical spastic paraparesis. The HTLV-I oncoprotein Tax is known to cause permanent activation of many cellular transcription factors including nuclear factor-κB (NF-κB), cyclic adenosine 3′,5′-monophosphate response element-binding protein, and activator protein 1 (AP-1). Here, we show that the NF-κB-binding cofactor inhibitor of NF-κB-ζ (IκB-ζ) is constitutively expressed in HTLV-I-infected T cell lines and ATL cells, and that Tax transactivates the IκB-ζ gene, mainly through NF-κB. Microarray analysis of IκB-ζ-expressing uninfected T cells demonstrated that IκB-ζ induced the expression of NF-κB- and interferon-regulatory genes such as B cell CLL/lymphoma 3 (Bcl3), guanylate-binding protein 1, and signal transducer and activator of transcription 1. The transcriptional activation domain, nuclear localization signal, and NF-κB-binding domain of IκB-ζ were required for Bcl3 induction, and IκB-ζ synergistically enhanced Tax-induced Bcl3 transactivation in an NF-κB-dependent manner. Interestingly, IκB-ζ inhibited Tax-induced NF-κB and AP-1 activation, and HTLV-I transcription. Furthermore, IκB-ζ interacted with Tax in vitro, and this interaction was also observed in an HTLV-I-transformed T cell line. These results suggest that IκB-ζ modulates Tax-dependent and Tax-independent gene transcription in T cells. The function of IκB-ζ may be of significance in ATL genesis and the pathogenesis of HTLV-I-associated diseases.

  3. Human T Cell Leukemia Virus Type I Tax-Induced IκB-ζ Modulates Tax-Dependent and Tax-Independent Gene Expression in T Cells1

    Science.gov (United States)

    Kimura, Ryuichiro; Senba, Masachika; Cutler, Samuel J; Ralph, Stephen J; Xiao, Gutian; Mori, Naoki

    2013-01-01

    Human T cell leukemia virus type I (HTLV-I) is the etiologic agent of adult T cell leukemia (ATL) and various inflammatory disorders including HTLV-I-associated myelopathy/tropical spastic paraparesis. HTLV-I oncoprotein Tax is known to cause permanent activation of many cellular transcription factors including nuclear factor-κB (NF-κB), cyclic adenosine 3′,5′-monophosphate response element-binding protein, and activator protein 1 (AP-1). Here, we show that NF-κB-binding cofactor inhibitor of NF-κB-ζ (IκB-ζ) is constitutively expressed in HTLV-I-infected T cell lines and ATL cells, and Tax transactivates the IκB-ζ gene, mainly through NF-κB. Microarray analysis of IκB-ζ-expressing uninfected T cells demonstrated that IκB-ζ induced the expression of NF-κB- and interferon-regulatory genes such as B cell CLL/lymphoma 3 (Bcl3), guanylate-binding protein 1, and signal transducer and activator of transcription 1. The transcriptional activation domain, nuclear localization signal, and NF-κB-binding domain of IκB-ζ were required for Bcl3 induction, and IκB-ζ synergistically enhanced Tax-induced Bcl3 transactivation in an NF-κB-dependent manner. Interestingly, IκB-ζ inhibited Tax-induced NF-κB, AP-1 activation, and HTLV-I transcription. Furthermore, IκB-ζ interacted with Tax in vitro and this interaction was also observed in an HTLV-I-transformed T cell line. These results suggest that IκB-ζ modulates Tax-dependent and Tax-independent gene transcription in T cells. The function of IκB-ζ may be of significance in ATL genesis and pathogenesis of HTLV-I-associated diseases. PMID:24027435

  4. Joint survival probability via truncated invariant copula

    International Nuclear Information System (INIS)

    Kim, Jeong-Hoon; Ma, Yong-Ki; Park, Chan Yeol

    2016-01-01

    Highlights: • We study the dependence structure between default intensities. • We use a multivariate shot noise intensity process, where jumps occur simultaneously and their sizes are correlated. • We obtain the joint survival probability of the integrated intensities by using a copula. • We apply our theoretical result to pricing basket default swap spreads. - Abstract: Given an intensity-based credit risk model, this paper studies the dependence structure between default intensities. To model this structure, we use a multivariate shot noise intensity process, where jumps occur simultaneously and their sizes are correlated. Through very lengthy algebra, we obtain explicitly the joint survival probability of the integrated intensities by using the truncated invariant Farlie–Gumbel–Morgenstern copula with exponential marginal distributions. We also apply our theoretical result to pricing basket default swap spreads. This result can provide a useful guide for credit risk management.
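The paper's truncated invariant copula is not reproduced here, but the way a copula couples marginal survival functions can be sketched with the standard (untruncated) Farlie–Gumbel–Morgenstern copula and exponential marginals, for which the joint survival probability has a simple closed form (all parameters illustrative):

```python
import math

def joint_survival_fgm(x, y, lam1, lam2, theta):
    """P(X > x, Y > y) for exponential marginals with rates lam1, lam2,
    coupled by the Farlie-Gumbel-Morgenstern copula
        C(u, v) = u v [1 + theta (1 - u)(1 - v)],  |theta| <= 1.
    The corresponding joint survival function is
        S(x, y) = S1(x) S2(y) [1 + theta F1(x) F2(y)]."""
    s1, s2 = math.exp(-lam1 * x), math.exp(-lam2 * y)
    return s1 * s2 * (1.0 + theta * (1.0 - s1) * (1.0 - s2))
```

At theta = 0 the joint survival factorizes into the product of the marginals (independence), and positive theta raises the joint survival probability, the qualitative dependence effect the paper exploits for basket default swap pricing.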

  5. Probabilities, causes and propensities in physics

    CERN Document Server

    Suárez, Mauricio

    2010-01-01

    This volume defends a novel approach to the philosophy of physics: it is the first book devoted to a comparative study of probability, causality, and propensity, and their various interrelations, within the context of contemporary physics - particularly quantum and statistical physics. The philosophical debates and distinctions are firmly grounded upon examples from actual physics, thus exemplifying a robustly empiricist approach. The essays, by both prominent scholars in the field and promising young researchers, constitute a pioneer effort in bringing out the connections between probabilistic, causal and dispositional aspects of the quantum domain. This book will appeal to specialists in philosophy and foundations of physics, philosophy of science in general, metaphysics, ontology of physics theories, and philosophy of probability.

  6. Foundations of quantization for probability distributions

    CERN Document Server

    Graf, Siegfried

    2000-01-01

    Due to the rapidly increasing need for methods of data compression, quantization has become a flourishing field in signal and image processing and information theory. The same techniques are also used in statistics (cluster analysis), pattern recognition, and operations research (optimal location of service centers). The book gives the first mathematically rigorous account of the fundamental theory underlying these applications. The emphasis is on the asymptotics of quantization errors for absolutely continuous and special classes of singular probabilities (surface measures, self-similar measures) presenting some new results for the first time. Written for researchers and graduate students in probability theory the monograph is of potential interest to all people working in the disciplines mentioned above.

  7. Measurement of the resonance escape probability

    International Nuclear Information System (INIS)

    Anthony, J.P.; Bacher, P.; Lheureux, L.; Moreau, J.; Schmitt, A.P.

    1957-01-01

    The average cadmium ratio in natural uranium rods has been measured using equal-diameter natural uranium disks. These values, correlated with independent measurements of the lattice buckling, enabled us to calculate values of the resonance escape probability for the G1 reactor under each of two definitions. Measurements were performed on 26 mm and 32 mm rods, giving the following values for the resonance escape probability p: 0.8976 ± 0.005 and 0.912 ± 0.006 (d. 26 mm); 0.8627 ± 0.009 and 0.884 ± 0.01 (d. 32 mm). The influence of either definition on the lattice parameters is discussed, leading to values of the effective integral. Similar experiments have been performed with thorium rods. (author) [fr

  8. Approaches to Evaluating Probability of Collision Uncertainty

    Science.gov (United States)

    Hejduk, Matthew D.; Johnson, Lauren C.

    2016-01-01

    While the two-dimensional probability of collision (Pc) calculation has served as the main input to conjunction analysis risk assessment for over a decade, it has done this mostly as a point estimate, with relatively little effort made to produce confidence intervals on the Pc value based on the uncertainties in the inputs. The present effort seeks to carry these uncertainties through the calculation in order to generate a probability density of Pc results rather than a single average value. Methods for assessing uncertainty in the primary and secondary objects' physical sizes and state estimate covariances, as well as a resampling approach to reveal the natural variability in the calculation, are presented; and an initial proposal for operationally-useful display and interpretation of these data for a particular conjunction is given.

  9. Multiple model cardinalized probability hypothesis density filter

    Science.gov (United States)

    Georgescu, Ramona; Willett, Peter

    2011-09-01

    The Probability Hypothesis Density (PHD) filter propagates the first-moment approximation to the multi-target Bayesian posterior distribution while the Cardinalized PHD (CPHD) filter propagates both the posterior likelihood of (an unlabeled) target state and the posterior probability mass function of the number of targets. Extensions of the PHD filter to the multiple model (MM) framework have been published and were implemented either with a Sequential Monte Carlo or a Gaussian Mixture approach. In this work, we introduce the multiple model version of the more elaborate CPHD filter. We present the derivation of the prediction and update steps of the MMCPHD particularized for the case of two target motion models and proceed to show that in the case of a single model, the new MMCPHD equations reduce to the original CPHD equations.

  10. Conditional probabilities in Ponzano-Regge minisuperspace

    International Nuclear Information System (INIS)

    Petryk, Roman; Schleich, Kristin

    2003-01-01

    We examine the Hartle-Hawking no-boundary initial state for the Ponzano-Regge formulation of gravity in three dimensions. We consider the behavior of conditional probabilities and expectation values for geometrical quantities in this initial state for a simple minisuperspace model consisting of a two-parameter set of anisotropic geometries on a 2-sphere boundary. We find dependence on the cutoff used in the construction of Ponzano-Regge amplitudes for expectation values of edge lengths. However, these expectation values are cutoff independent when computed in certain, but not all, conditional probability distributions. Conditions that yield cutoff independent expectation values are those that constrain the boundary geometry to a finite range of edge lengths. We argue that such conditions have a correspondence to fixing a range of local time, as classically associated with the area of a surface for spatially closed cosmologies. Thus these results may hint at how classical spacetime emerges from quantum amplitudes.

  11. Bayesian Prior Probability Distributions for Internal Dosimetry

    Energy Technology Data Exchange (ETDEWEB)

    Miller, G.; Inkret, W.C.; Little, T.T.; Martz, H.F.; Schillaci, M.E.

    2001-07-01

    The problem of choosing a prior distribution for the Bayesian interpretation of measurements (specifically internal dosimetry measurements) is considered using a theoretical analysis and by examining historical tritium and plutonium urine bioassay data from Los Alamos. Two models for the prior probability distribution are proposed: (1) the log-normal distribution, when there is some additional information to determine the scale of the true result, and (2) the 'alpha' distribution (a simplified variant of the gamma distribution) when there is not. These models have been incorporated into version 3 of the Bayesian internal dosimetric code in use at Los Alamos (downloadable from our web site). Plutonium internal dosimetry at Los Alamos is now being done using prior probability distribution parameters determined self-consistently from population averages of Los Alamos data. (author)
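    The role of a log-normal prior in the Bayesian interpretation of a noisy measurement can be illustrated with a one-dimensional grid integration. The measurement-error model, grid range, and all parameter values below are illustrative assumptions, not the Los Alamos code's actual settings.

    ```python
    import math

    def posterior_mean(measurement, meas_sigma, prior_mu, prior_sigma, n=2000):
        """Posterior mean of the true result given a noisy bioassay-style
        measurement, using a log-normal prior on the (positive) true value
        and a Gaussian measurement-error likelihood. Integration is done
        on a simple grid over (0, 10]; constants cancel in the ratio."""
        num = den = 0.0
        for i in range(1, n + 1):
            t = 10.0 * i / n                                   # candidate true value
            prior = math.exp(-0.5 * ((math.log(t) - prior_mu) / prior_sigma) ** 2) \
                    / (t * prior_sigma)                        # log-normal density (unnormalized)
            like = math.exp(-0.5 * ((measurement - t) / meas_sigma) ** 2)
            w = prior * like
            num += t * w
            den += w
        return num / den

    # A measurement of 1.5 with sigma 0.5 is pulled toward the prior
    # median exp(prior_mu) = 1.0 (all values hypothetical).
    est = posterior_mean(measurement=1.5, meas_sigma=0.5, prior_mu=0.0, prior_sigma=0.7)
    ```

    The point of the log-normal choice, as the abstract notes, is that it encodes additional information about the scale of the true result; the posterior mean lands between the raw measurement and the prior median.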

  12. Probability and statistics for particle physics

    CERN Document Server

    Mana, Carlos

    2017-01-01

    This book comprehensively presents the basic concepts of probability and Bayesian inference with sufficient generality to make them applicable to current problems in scientific research. The first chapter provides the fundamentals of probability theory that are essential for the analysis of random phenomena. The second chapter includes a full and pragmatic review of the Bayesian methods that constitute a natural and coherent framework with enough freedom to analyze all the information available from experimental data in a conceptually simple manner. The third chapter presents the basic Monte Carlo techniques used in scientific research, allowing a wide variety of problems that are difficult to tackle by other procedures to be handled. The author also introduces a basic algorithm, which enables readers to simulate samples from simple distributions, and describes useful cases for researchers in particle physics. The final chapter is devoted to the basic ideas of Information Theory, which are important in the Bayesian me...
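    A basic algorithm for simulating samples from a simple distribution, of the kind the abstract mentions, is inverse-transform sampling: map a uniform variate through the inverse CDF. The exponential distribution used here is our illustrative choice, not necessarily the book's example.

    ```python
    import math
    import random

    def sample_exponential(rate):
        """Inverse-transform sampling for an exponential distribution:
        if U ~ Uniform(0,1), then F^-1(U) = -ln(1 - U) / rate is
        exponentially distributed with the given rate."""
        u = random.random()          # u in [0, 1), so 1 - u is never 0
        return -math.log(1.0 - u) / rate

    random.seed(0)
    xs = [sample_exponential(2.0) for _ in range(50000)]
    mean = sum(xs) / len(xs)         # should be close to 1/rate = 0.5
    ```

    The same recipe works for any distribution whose CDF can be inverted in closed form; otherwise one falls back on rejection or Markov chain Monte Carlo methods.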

  13. Probability Weighting as Evolutionary Second-best

    OpenAIRE

    Herold, Florian; Netzer, Nick

    2011-01-01

    The economic concept of the second-best involves the idea that multiple simultaneous deviations from a hypothetical first-best optimum may be optimal once the first-best itself can no longer be achieved, since one distortion may partially compensate for another. Within an evolutionary framework, we translate this concept to behavior under uncertainty. We argue that the two main components of prospect theory, the value function and the probability weighting function, are complements in the sec...

  14. Bayesian probability theory and inverse problems

    International Nuclear Information System (INIS)

    Kopec, S.

    1994-01-01

    Bayesian probability theory is applied to the approximate solution of inverse problems. In order to solve the moment problem with noisy data, an entropic prior is used. Expressions for the solution and its error bounds are presented. When the noise level tends to zero, the Bayesian solution tends to the classical maximum entropy solution in the L² norm. The use of a spline prior is also shown. (author)

  15. Probability and Statistics in Aerospace Engineering

    Science.gov (United States)

    Rheinfurth, M. H.; Howell, L. W.

    1998-01-01

    This monograph was prepared to give the practicing engineer a clear understanding of probability and statistics with special consideration to problems frequently encountered in aerospace engineering. It is conceived to be both a desktop reference and a refresher for aerospace engineers in government and industry. It could also be used as a supplement to standard texts for in-house training courses on the subject.

  16. Statistical models based on conditional probability distributions

    International Nuclear Information System (INIS)

    Narayanan, R.S.

    1991-10-01

    We present a formulation of statistical mechanics models based on conditional probability distributions rather than a Hamiltonian. We show that it is possible to realize critical phenomena through this procedure. Closely linked with this formulation is a Monte Carlo algorithm in which each generated configuration is guaranteed to be statistically independent of any other configuration for all values of the parameters, in particular near the critical point. (orig.)

  17. Marrakesh International Conference on Probability and Statistics

    CERN Document Server

    Ouassou, Idir; Rachdi, Mustapha

    2015-01-01

    This volume, which highlights recent advances in statistical methodology and applications, is divided into two main parts. The first part presents theoretical results on estimation techniques in functional statistics, while the second examines three key areas of application: estimation problems in queuing theory, an application in signal processing, and the copula approach to epidemiologic modelling. The book’s peer-reviewed contributions are based on papers originally presented at the Marrakesh International Conference on Probability and Statistics held in December 2013.

  18. Selected papers on analysis, probability, and statistics

    CERN Document Server

    Nomizu, Katsumi

    1994-01-01

    This book presents papers that originally appeared in the Japanese journal Sugaku. The papers fall into the general area of mathematical analysis as it pertains to probability and statistics, dynamical systems, differential equations and analytic function theory. Among the topics discussed are: stochastic differential equations, spectra of the Laplacian and Schrödinger operators, nonlinear partial differential equations which generate dissipative dynamical systems, fractal analysis on self-similar sets and the global structure of analytic functions.

  19. Clan structure analysis and rapidity gap probability

    International Nuclear Information System (INIS)

    Lupia, S.; Giovannini, A.; Ugoccioni, R.

    1995-01-01

    Clan structure analysis in rapidity intervals is generalized from the negative binomial multiplicity distribution to the wide class of compound Poisson distributions. The link between generalized clan structure analysis and correlation functions is also established. These theoretical results are then applied to minimum bias events and reveal interesting new features, which can be useful in discussing data on rapidity gap probability at TEVATRON and HERA. (orig.)
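    The clan picture for the negative binomial distribution (NBD) can be sketched numerically. The formulas below are the standard NBD clan results (average clans N̄ = k ln(1 + n̄/k), gap probability P(0) = (1 + n̄/k)^(−k)); the parameter values are purely illustrative, and the script only checks the compound Poisson identity P(0) = exp(−N̄) that underlies the generalization discussed in the abstract.

    ```python
    import math

    def nbd_clan_parameters(nbar, k):
        """Clan decomposition of a negative binomial multiplicity
        distribution with mean nbar and shape parameter k:
        returns (average number of clans, average particles per clan)."""
        n_clans = k * math.log(1.0 + nbar / k)
        return n_clans, nbar / n_clans

    def nbd_gap_probability(nbar, k):
        """P(0): the probability of an empty rapidity interval for an NBD."""
        return (1.0 + nbar / k) ** (-k)

    # Illustrative values for one rapidity interval.
    n_clans, n_per_clan = nbd_clan_parameters(nbar=3.0, k=2.0)

    # Compound Poisson consistency check: the gap probability equals the
    # probability of producing zero clans, exp(-n_clans).
    assert abs(math.exp(-n_clans) - nbd_gap_probability(3.0, 2.0)) < 1e-12
    ```

    This exp(−N̄) form of the gap probability holds for any compound Poisson distribution, which is what makes the clan generalization useful for the TEVATRON and HERA rapidity gap data mentioned above.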

  20. Clan structure analysis and rapidity gap probability

    Energy Technology Data Exchange (ETDEWEB)

    Lupia, S. [Turin Univ. (Italy). Ist. di Fisica Teorica]|[Istituto Nazionale di Fisica Nucleare, Turin (Italy); Giovannini, A. [Turin Univ. (Italy). Ist. di Fisica Teorica]|[Istituto Nazionale di Fisica Nucleare, Turin (Italy); Ugoccioni, R. [Turin Univ. (Italy). Ist. di Fisica Teorica]|[Istituto Nazionale di Fisica Nucleare, Turin (Italy)

    1995-03-01

    Clan structure analysis in rapidity intervals is generalized from the negative binomial multiplicity distribution to the wide class of compound Poisson distributions. The link between generalized clan structure analysis and correlation functions is also established. These theoretical results are then applied to minimum bias events and reveal interesting new features, which can be useful in discussing data on rapidity gap probability at TEVATRON and HERA. (orig.)