WorldWideScience

Sample records for incident likelihood prediction

  1. Moral Identity Predicts Doping Likelihood via Moral Disengagement and Anticipated Guilt.

    Science.gov (United States)

    Kavussanu, Maria; Ring, Christopher

    2017-08-01

    In this study, we integrated elements of social cognitive theory of moral thought and action and the social cognitive model of moral identity to better understand doping likelihood in athletes. Participants (N = 398) recruited from a variety of team sports completed measures of moral identity, moral disengagement, anticipated guilt, and doping likelihood. Moral identity predicted doping likelihood indirectly via moral disengagement and anticipated guilt. Anticipated guilt about potential doping mediated the relationship between moral disengagement and doping likelihood. Our findings provide novel evidence that athletes who feel that being a moral person is central to their self-concept are less likely to use banned substances, owing to their lower tendency to morally disengage and the more intense guilt they expect to feel for using banned substances.

  2. Predicting incident size from limited information

    International Nuclear Information System (INIS)

    Englehardt, J.D.

    1995-01-01

    Predicting the size of low-probability, high-consequence natural disasters, industrial accidents, and pollutant releases is often difficult due to limitations in the availability of data on rare events and future circumstances. When incident data are available, they may be difficult to fit with a lognormal distribution. Two Bayesian probability distributions for inferring future incident-size probabilities from limited, indirect, and subjective information are proposed in this paper. The distributions are derived from Pareto distributions that are shown to fit data on different incident types and are justified theoretically. The derived distributions incorporate both inherent variability and uncertainty due to information limitations. Results were analyzed to determine the amount of data needed to predict incident-size probabilities in various situations. Information requirements for incident-size prediction using the methods were low, particularly when the population distribution had a thick tail. Use of the distributions to predict accumulated oil-spill consequences was demonstrated.
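
    The Pareto fit that this record builds on has a convenient closed form: for incident sizes at or above a threshold x_min, the maximum-likelihood estimate of the tail index is the Hill estimator. A minimal stdlib-Python sketch on synthetic data (the threshold, sample size, and true index 1.5 are illustrative, not values from the paper):

```python
import math
import random

def pareto_mle_alpha(sizes, x_min):
    """Hill/Pareto maximum-likelihood estimate of the tail index alpha,
    using only incidents at least as large as the threshold x_min."""
    tail = [x for x in sizes if x >= x_min]
    return len(tail) / sum(math.log(x / x_min) for x in tail)

# Synthetic incident sizes from a Pareto(alpha = 1.5) via inverse-CDF sampling.
random.seed(42)
true_alpha, x_min = 1.5, 1.0
sizes = [x_min * (1.0 - random.random()) ** (-1.0 / true_alpha) for _ in range(5000)]

alpha_hat = pareto_mle_alpha(sizes, x_min)
print(round(alpha_hat, 2))
```

A thick tail (small alpha) is exactly the regime where the paper reports that few observations suffice to pin down incident-size probabilities.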

  3. Age-specific incidence of A/H1N1 2009 influenza infection in England from sequential antibody prevalence data using likelihood-based estimation.

    Directory of Open Access Journals (Sweden)

    Marc Baguelin

    2011-02-01

    Full Text Available Estimating the age-specific incidence of an emerging pathogen is essential for understanding its severity and transmission dynamics. This paper describes a statistical method that uses likelihoods to estimate incidence from sequential serological data. The method requires information on seroconversion intervals and allows integration of information on the temporal distribution of cases from clinical surveillance. Among a family of candidate incidences, a likelihood function is derived by reconstructing the change in seroprevalence from seroconversion following infection and comparing it with the observed sequence of positivity among the samples. This method is applied to derive the cumulative and weekly incidence of A/H1N1 pandemic influenza in England during the second wave using sera taken between September 2009 and February 2010 in four age groups (1-4, 5-14, 15-24, and 25-44 years). The highest cumulative incidence was in 5-14 year olds (59%; 95% credible interval (CI): 52%, 68%), followed by 1-4 year olds (49%; 95% CI: 38%, 61%), rates 20 and 40 times higher, respectively, than estimated from clinical surveillance. The method provides a more accurate and continuous measure of incidence than achieved by comparing prevalence in samples grouped by time period.
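
    The core of a likelihood-based incidence estimate can be sketched with a single cross-sectional sample: if baseline seroprevalence is p0 and cumulative incidence among the susceptible is z, the expected post-wave seroprevalence is p0 + (1 - p0)z, and z is chosen to maximize the binomial likelihood of the observed positives. A minimal sketch with made-up counts (the paper's full method additionally handles sequential samples and seroconversion delays):

```python
import math

def expected_seroprevalence(z, p0):
    # Baseline positives plus the fraction of baseline negatives infected.
    return p0 + (1.0 - p0) * z

def loglik(z, k, n, p0):
    """Binomial log-likelihood of k positives out of n, given incidence z."""
    p = expected_seroprevalence(z, p0)
    return k * math.log(p) + (n - k) * math.log(1.0 - p)

def mle_incidence(k, n, p0):
    # Grid search over candidate incidences, as in the "family of candidates".
    grid = [i / 1000.0 for i in range(1, 999)]
    return max(grid, key=lambda z: loglik(z, k, n, p0))

# Made-up numbers: 20% seropositive at baseline, 670/1000 positive after the wave.
z_hat = mle_incidence(k=670, n=1000, p0=0.20)
print(z_hat)
```

Here the analytic answer is z = (0.67 - 0.20) / 0.80 = 0.5875, so the grid maximizer should land next to it.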

  4. Extended likelihood inference in reliability

    International Nuclear Information System (INIS)

    Martz, H.F. Jr.; Beckman, R.J.; Waller, R.A.

    1978-10-01

    Extended likelihood methods of inference are developed in which subjective information in the form of a prior distribution is combined with sampling results by means of an extended likelihood function. The extended likelihood function is standardized for use in obtaining extended likelihood intervals. Extended likelihood intervals are derived for the mean of a normal distribution with known variance, the failure-rate of an exponential distribution, and the parameter of a binomial distribution. Extended second-order likelihood methods are developed and used to solve several prediction problems associated with the exponential and binomial distributions. In particular, such quantities as the next failure-time, the number of failures in a given time period, and the time required to observe a given number of failures are predicted for the exponential model with a gamma prior distribution on the failure-rate. In addition, six types of life testing experiments are considered. For the binomial model with a beta prior distribution on the probability of nonsurvival, methods are obtained for predicting the number of nonsurvivors in a given sample size and for predicting the required sample size for observing a specified number of nonsurvivors. Examples illustrate each of the methods developed. Finally, comparisons are made with Bayesian intervals in those cases where these are known to exist.

  5. Estimating likelihood of future crashes for crash-prone drivers

    OpenAIRE

    Subasish Das; Xiaoduan Sun; Fan Wang; Charles Leboeuf

    2015-01-01

    At-fault crash-prone drivers are usually considered the high-risk group for possible future incidents or crashes. In Louisiana, 34% of crashes are repeatedly committed by at-fault crash-prone drivers, who represent only 5% of the total licensed drivers in the state. This research conducted an exploratory data analysis based on driver faultiness and proneness. The objective of this study is to develop a crash prediction model to estimate the likelihood of future crashes for the a...

  6. [Application of ARIMA model on prediction of malaria incidence].

    Science.gov (United States)

    Jing, Xia; Hua-Xun, Zhang; Wen, Lin; Su-Jian, Pei; Ling-Cong, Sun; Xiao-Rong, Dong; Mu-Min, Cao; Dong-Ni, Wu; Shunxiang, Cai

    2016-01-29

    To predict the incidence of local malaria in Hubei Province by applying the Autoregressive Integrated Moving Average (ARIMA) model. SPSS 13.0 software was applied to construct the ARIMA model based on the monthly local malaria incidence in Hubei Province from 2004 to 2009. The local malaria incidence data of 2010 were used for model validation and evaluation. The ARIMA (1, 1, 1) (1, 1, 0) 12 model was identified as the optimal model, with an AIC of 76.085 and an SBC of 84.395. All the actual incidence data fell within the 95% CI of the model's predicted values. The prediction effect of the model was acceptable. The ARIMA model could effectively fit and predict the incidence of local malaria in Hubei Province.
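
    The seasonal part of such a model can be illustrated in plain Python: difference the monthly series at lag 12 to remove the annual cycle (the seasonal D = 1 in ARIMA(1,1,1)(1,1,0)12), then estimate an AR(1) coefficient by conditional least squares. A sketch on synthetic data (not the Hubei series; a full ARIMA fit would normally use a statistics package such as SPSS or statsmodels):

```python
import math
import random

def seasonal_difference(series, period=12):
    """Lag-12 differencing: the seasonal D = 1 step of a (P,1,Q)12 model."""
    return [series[t] - series[t - period] for t in range(period, len(series))]

def ar1_fit(series):
    """Conditional least-squares estimate of phi in y_t = phi * y_{t-1} + e_t."""
    num = sum(series[t] * series[t - 1] for t in range(1, len(series)))
    den = sum(y * y for y in series[:-1])
    return num / den

random.seed(7)
phi = 0.6
ar = [0.0]
for _ in range(600):
    ar.append(phi * ar[-1] + random.gauss(0, 1))
# Synthetic monthly "incidence": annual cycle plus autoregressive noise.
y = [20 + 8 * math.sin(2 * math.pi * t / 12) + ar[t] for t in range(len(ar))]

deseasoned = seasonal_difference(y)  # the period-12 cycle cancels exactly
phi_hat = ar1_fit(deseasoned)
print(round(phi_hat, 2))
```

Because the deterministic cycle has period exactly 12, the lag-12 difference removes it completely, leaving the autoregressive component for the AR fit to recover.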

  7. A Predictive Likelihood Approach to Bayesian Averaging

    Directory of Open Access Journals (Sweden)

    Tomáš Jeřábek

    2015-01-01

    Full Text Available Multivariate time series forecasting is applied in a wide range of economic activities related to regional competitiveness and is the basis of almost all macroeconomic analysis. In this paper we combine multivariate density forecasts of GDP growth, inflation and real interest rates from four different models: two types of Bayesian vector autoregression (BVAR) models, a New Keynesian dynamic stochastic general equilibrium (DSGE) model of a small open economy, and a DSGE-VAR model. The performance of the models is assessed using historical data covering the domestic economy and a foreign economy represented by the countries of the Eurozone. Because the forecast accuracy of the models differs, weighting schemes based on the predictive likelihood, the trace of the past MSE matrix, and model ranks are used to combine the models. The equal-weight scheme is used as a simple combination scheme. The results show that optimally combined densities are comparable to the best individual models.
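
    The predictive-likelihood weighting scheme reduces to normalizing each model's out-of-sample predictive likelihood across the candidate set. A small sketch with invented log-likelihood values (the four labels mirror the models above, but the numbers are purely illustrative), using the log-sum-exp trick so the exponentiation stays stable:

```python
import math

def predictive_likelihood_weights(log_liks):
    """Normalize out-of-sample log predictive likelihoods into model weights,
    shifting by the max (log-sum-exp trick) for numerical stability."""
    m = max(log_liks)
    unnorm = [math.exp(l - m) for l in log_liks]
    total = sum(unnorm)
    return [u / total for u in unnorm]

# Hypothetical log predictive likelihoods for BVAR-1, BVAR-2, DSGE, DSGE-VAR.
weights = predictive_likelihood_weights([-102.3, -101.1, -105.8, -100.6])
print([round(w, 3) for w in weights])
```

The model with the highest predictive likelihood gets the largest weight; the equal-weight benchmark corresponds to skipping this step and using 1/4 for each model.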

  8. A Scoring Tool to Identify East African HIV-1 Serodiscordant Partnerships with a High Likelihood of Pregnancy.

    Science.gov (United States)

    Heffron, Renee; Cohen, Craig R; Ngure, Kenneth; Bukusi, Elizabeth; Were, Edwin; Kiarie, James; Mugo, Nelly; Celum, Connie; Baeten, Jared M

    2015-01-01

    HIV-1 prevention programs targeting HIV-1 serodiscordant couples need to identify couples that are likely to become pregnant to facilitate discussions about methods to minimize HIV-1 risk during pregnancy attempts (i.e. safer conception) or effective contraception when pregnancy is unintended. A clinical prediction tool could be used to identify HIV-1 serodiscordant couples with a high likelihood of pregnancy within one year. Using standardized clinical prediction methods, we developed and validated a tool to identify heterosexual East African HIV-1 serodiscordant couples with an increased likelihood of becoming pregnant in the next year. Datasets were from three prospectively followed cohorts, including nearly 7,000 couples from Kenya and Uganda participating in HIV-1 prevention trials and delivery projects. The final score encompassed the age of the woman, the woman's number of living children, partnership duration, having had condomless sex in the past month, and non-use of an effective contraceptive. The area under the curve (AUC) for the ability of the score to correctly predict pregnancy was 0.74 (95% CI 0.72-0.76). Scores ≥ 7 predicted a pregnancy incidence of >17% per year and captured 78% of the pregnancies. Internal and external validation confirmed the predictive ability of the score. A pregnancy likelihood score encompassing basic demographic, clinical and behavioral factors defined African HIV-1 serodiscordant couples with high one-year pregnancy incidence rates. This tool could be used to engage African HIV-1 serodiscordant couples in counseling discussions about fertility intentions in order to offer services for safer conception or contraception that align with their reproductive goals.
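
    A clinical score of this kind is just a sum of points over the listed predictors, and its discrimination (AUC) equals the probability that a couple who conceived outranks one who did not (the Mann-Whitney statistic). The point values below are invented for illustration and are NOT the published tool's weights; only the five predictor types come from the record:

```python
def pregnancy_score(age, living_children, years_together, condomless_sex, contraception):
    """Illustrative point assignments over the five predictors in the record."""
    pts = 0
    pts += 3 if age < 25 else (1 if age < 30 else 0)
    pts += 2 if living_children < 2 else 0
    pts += 1 if years_together < 5 else 0
    pts += 2 if condomless_sex else 0
    pts += 2 if not contraception else 0
    return pts

def auc(pos_scores, neg_scores):
    """P(score of a pregnant couple > score of a non-pregnant couple),
    counting ties as one half (Mann-Whitney)."""
    wins = sum((p > n) + 0.5 * (p == n) for p in pos_scores for n in neg_scores)
    return wins / (len(pos_scores) * len(neg_scores))

pregnant     = [pregnancy_score(23, 1, 2, True, False),
                pregnancy_score(27, 0, 1, True, False)]
not_pregnant = [pregnancy_score(38, 3, 12, False, True),
                pregnancy_score(31, 2, 8, True, True)]
print(auc(pregnant, not_pregnant))
```

On real cohort data the separation is imperfect, which is how an AUC of 0.74 rather than 1.0 arises.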

  9. Supervised maximum-likelihood weighting of composite protein networks for complex prediction

    Directory of Open Access Journals (Sweden)

    Yong Chern Han

    2012-12-01

    Full Text Available Abstract Background Protein complexes participate in many important cellular functions, so finding the set of existent complexes is essential for understanding the organization and regulation of processes in the cell. With the availability of large amounts of high-throughput protein-protein interaction (PPI data, many algorithms have been proposed to discover protein complexes from PPI networks. However, such approaches are hindered by the high rate of noise in high-throughput PPI data, including spurious and missing interactions. Furthermore, many transient interactions are detected between proteins that are not from the same complex, while not all proteins from the same complex may actually interact. As a result, predicted complexes often do not match true complexes well, and many true complexes go undetected. Results We address these challenges by integrating PPI data with other heterogeneous data sources to construct a composite protein network, and using a supervised maximum-likelihood approach to weight each edge based on its posterior probability of belonging to a complex. We then use six different clustering algorithms, and an aggregative clustering strategy, to discover complexes in the weighted network. We test our method on Saccharomyces cerevisiae and Homo sapiens, and show that complex discovery is improved: compared to previously proposed supervised and unsupervised weighting approaches, our method recalls more known complexes, achieves higher precision at all recall levels, and generates novel complexes of greater functional similarity. Furthermore, our maximum-likelihood approach allows learned parameters to be used to visualize and evaluate the evidence of novel predictions, aiding human judgment of their credibility. Conclusions Our approach integrates multiple data sources with supervised learning to create a weighted composite protein network, and uses six clustering algorithms with an aggregative clustering strategy to
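
    The per-edge posterior weighting can be sketched with a naive-Bayes assumption: each data source contributes a likelihood ratio for "co-complex" versus "background", learned from known complexes, and Bayes' rule combines them into the edge weight. The source names and detection rates below are invented for illustration; the actual paper learns its parameters from training complexes:

```python
def edge_posterior(feature_flags, lik_complex, lik_background, prior_complex=0.1):
    """Posterior probability that an edge is co-complex, assuming the data
    sources are conditionally independent (naive Bayes)."""
    pc, pb = prior_complex, 1.0 - prior_complex
    for f, present in feature_flags.items():
        pc *= lik_complex[f] if present else 1.0 - lik_complex[f]
        pb *= lik_background[f] if present else 1.0 - lik_background[f]
    return pc / (pc + pb)

# Hypothetical per-source detection rates for co-complex vs background pairs.
lik_complex    = {"y2h": 0.6, "coexpress": 0.8, "colocalize": 0.7}
lik_background = {"y2h": 0.2, "coexpress": 0.4, "colocalize": 0.3}

w = edge_posterior({"y2h": True, "coexpress": True, "colocalize": False},
                   lik_complex, lik_background)
print(round(w, 3))
```

Weights of this form are what the clustering algorithms then operate on, and inspecting the per-source factors is what lets a human judge the evidence behind a novel prediction.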

  10. A new, accurate predictive model for incident hypertension

    DEFF Research Database (Denmark)

    Völzke, Henry; Fung, Glenn; Ittermann, Till

    2013-01-01

    Data mining represents an alternative approach to identify new predictors of multifactorial diseases. This work aimed at building an accurate predictive model for incident hypertension using data mining procedures.

  11. Logic of likelihood

    International Nuclear Information System (INIS)

    Wall, M.J.W.

    1992-01-01

    The notion of "probability" is generalized to that of "likelihood," and a natural logical structure is shown to exist for any physical theory which predicts likelihoods. Two physically based axioms are given for this logical structure to form an orthomodular poset, with an order-determining set of states. The results strengthen the basis of the quantum logic approach to axiomatic quantum theory. 25 refs

  12. A Scoring Tool to Identify East African HIV-1 Serodiscordant Partnerships with a High Likelihood of Pregnancy.

    Directory of Open Access Journals (Sweden)

    Renee Heffron

    Full Text Available HIV-1 prevention programs targeting HIV-1 serodiscordant couples need to identify couples that are likely to become pregnant to facilitate discussions about methods to minimize HIV-1 risk during pregnancy attempts (i.e. safer conception) or effective contraception when pregnancy is unintended. A clinical prediction tool could be used to identify HIV-1 serodiscordant couples with a high likelihood of pregnancy within one year. Using standardized clinical prediction methods, we developed and validated a tool to identify heterosexual East African HIV-1 serodiscordant couples with an increased likelihood of becoming pregnant in the next year. Datasets were from three prospectively followed cohorts, including nearly 7,000 couples from Kenya and Uganda participating in HIV-1 prevention trials and delivery projects. The final score encompassed the age of the woman, the woman's number of living children, partnership duration, having had condomless sex in the past month, and non-use of an effective contraceptive. The area under the curve (AUC) for the ability of the score to correctly predict pregnancy was 0.74 (95% CI 0.72-0.76). Scores ≥ 7 predicted a pregnancy incidence of >17% per year and captured 78% of the pregnancies. Internal and external validation confirmed the predictive ability of the score. A pregnancy likelihood score encompassing basic demographic, clinical and behavioral factors defined African HIV-1 serodiscordant couples with high one-year pregnancy incidence rates. This tool could be used to engage African HIV-1 serodiscordant couples in counseling discussions about fertility intentions in order to offer services for safer conception or contraception that align with their reproductive goals.

  13. Fatty liver incidence and predictive variables

    International Nuclear Information System (INIS)

    Tsuneto, Akira; Seto, Shinji; Maemura, Koji; Hida, Ayumi; Sera, Nobuko; Imaizumi, Misa; Ichimaru, Shinichiro; Nakashima, Eiji; Akahoshi, Masazumi

    2010-01-01

    Although fatty liver predicts ischemic heart disease, the incidence and predictors of fatty liver need examination. The objective of this study was to determine fatty liver incidence and predictive variables. Using abdominal ultrasonography, we followed biennially through 2007 (mean follow-up, 11.6±4.6 years) 1635 Nagasaki atomic bomb survivors (606 men) without fatty liver at baseline (November 1990 through October 1992). We examined potential predictive variables with the Cox proportional hazard model and longitudinal trends with the Wilcoxon rank-sum test. In all, 323 (124 men) new fatty liver cases were diagnosed. The incidence was 19.9/1000 person-years (22.3 for men, 18.6 for women) and peaked in the sixth decade of life. After controlling for age, sex, and smoking and drinking habits, obesity (relative risk (RR), 2.93; 95% confidence interval (CI), 2.33-3.69, P<0.001), low high-density lipoprotein-cholesterol (RR, 1.87; 95% CI, 1.42-2.47; P<0.001), hypertriglyceridemia (RR, 2.49; 95% CI, 1.96-3.15; P<0.001), glucose intolerance (RR, 1.51; 95% CI, 1.09-2.10; P=0.013) and hypertension (RR, 1.63; 95% CI, 1.30-2.04; P<0.001) were predictive of fatty liver. In multivariate analysis including all variables, obesity (RR, 2.55; 95% CI, 1.93-3.38; P<0.001), hypertriglyceridemia (RR, 1.92; 95% CI, 1.41-2.62; P<0.001) and hypertension (RR, 1.31; 95% CI, 1.01-1.71; P=0.046) remained predictive. In fatty liver cases, body mass index and serum triglycerides, but not systolic or diastolic blood pressure, increased significantly and steadily up to the time of the diagnosis. Obesity, hypertriglyceridemia and, to a lesser extent, hypertension might serve as predictive variables for fatty liver. (author)

  14. Memory Binding Test Predicts Incident Amnestic Mild Cognitive Impairment.

    Science.gov (United States)

    Mowrey, Wenzhu B; Lipton, Richard B; Katz, Mindy J; Ramratan, Wendy S; Loewenstein, David A; Zimmerman, Molly E; Buschke, Herman

    2016-07-14

    The Memory Binding Test (MBT), previously known as Memory Capacity Test, has demonstrated discriminative validity for distinguishing persons with amnestic mild cognitive impairment (aMCI) and dementia from cognitively normal elderly. We aimed to assess the predictive validity of the MBT for incident aMCI. In a longitudinal, community-based study of adults aged 70+, we administered the MBT to 246 cognitively normal elderly adults at baseline and followed them annually. Based on previous work, a subtle reduction in memory binding at baseline was defined by a Total Items in the Paired (TIP) condition score of ≤22 on the MBT. Cox proportional hazards models were used to assess the predictive validity of the MBT for incident aMCI accounting for the effects of covariates. The hazard ratio of incident aMCI was also assessed for different prediction time windows ranging from 4 to 7 years of follow-up, separately. Among 246 controls who were cognitively normal at baseline, 48 developed incident aMCI during follow-up. A baseline MBT reduction was associated with an increased risk for developing incident aMCI (hazard ratio (HR) = 2.44, 95% confidence interval: 1.30-4.56, p = 0.005). When varying the prediction window from 4-7 years, the MBT reduction remained significant for predicting incident aMCI (HR range: 2.33-3.12, p: 0.0007-0.04). Persons with poor performance on the MBT are at significantly greater risk for developing incident aMCI. High hazard ratios up to seven years of follow-up suggest that the MBT is sensitive to early disease.

  15. Bayesian Inference using Neural Net Likelihood Models for Protein Secondary Structure Prediction

    Directory of Open Access Journals (Sweden)

    Seong-Gon Kim

    2011-06-01

    Full Text Available Several techniques such as Neural Networks, Genetic Algorithms, Decision Trees and other statistical or heuristic methods have been used in the past to approach the complex non-linear task of predicting alpha-helices, beta-sheets and turns of a protein's secondary structure. This project introduces a new machine learning method that uses an offline-trained Multilayered Perceptron (MLP) as the likelihood model within a Bayesian Inference framework to predict the secondary structure of proteins. Varying window sizes are used to extract neighboring amino acid information, which is passed back and forth between the Neural Net models and the Bayesian Inference process until the posterior secondary structure probabilities converge.
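
    The Bayesian step itself is a straightforward normalization: the posterior over the three structure classes is the prior times the class-conditional likelihood, renormalized. In the sketch below the likelihood values are stand-ins for what a trained MLP would emit for one residue window; the iteration back and forth between network and posterior is omitted:

```python
def posterior(prior, likelihood):
    """Bayes' rule over secondary-structure classes (helix, sheet, turn)."""
    unnorm = {c: prior[c] * likelihood[c] for c in prior}
    z = sum(unnorm.values())
    return {c: v / z for c, v in unnorm.items()}

prior = {"helix": 1 / 3, "sheet": 1 / 3, "turn": 1 / 3}
# Stand-in for MLP outputs P(window features | class); a real model supplies these.
lik = {"helix": 0.7, "sheet": 0.2, "turn": 0.1}

post = posterior(prior, lik)
print(max(post, key=post.get))
```

In the full method this posterior would feed back into the next pass over neighboring windows until the probabilities stop changing.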

  16. The Role of Mechanical Variance and Spatial Clustering on the Likelihood of Tumor Incidence and Growth

    Science.gov (United States)

    Mirzakhel, Zibah

    When considering factors that contribute to cancer progression, modifications to both the biological and mechanical pathways play significant roles. However, less attention is placed on how the mechanical pathways specifically contribute to cancerous behavior. Experimental studies have found that malignant cells are significantly softer than healthy, normal cells. In a tissue environment where healthy or malignant cells exist, a distribution of cell stiffness values is observed, with the mean values used to differentiate between these two populations. Rather than focus on the mean values, emphasis is placed here on the distribution, in which instances of soft and stiff cells exist within the healthy tissue environment. Since cell deformability is a trait associated with cancer, the question arises as to whether the mechanical variation observed in healthy tissue cell stiffness distributions can influence any instances of tumor growth. To approach this, a 3D discrete model of cells is used, able to monitor and predict the behavior of individual cells while detecting any instances of tumor growth in a healthy tissue. In addition to the mechanical variance, the spatial arrangement of cells is also modeled, as cell interaction could further influence any incidence of tumor-like malignant populations within the tissue. Results have shown that the likelihood of tumor incidence is driven both by increases in the mechanical variation of the distributions and by larger clustering of cells that are mechanically similar, quantified primarily through higher proliferation rates of tumor-like soft cells. This can be observed through prominent negative shifts in the mean of the distribution as it begins to transition and show instances of early-stage tumor growth. The model reveals the impact that both the mechanical variation and the spatial arrangement of cells have on tumor progression, suggesting the use of these parameters as potential novel biomarkers. With a

  17. PREDICTIVE MODELS FOR SUPPORT OF INCIDENT MANAGEMENT PROCESS IN IT SERVICE MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Martin SARNOVSKY

    2018-03-01

    Full Text Available The work presented in this paper is focused on creating predictive models that help in the process of incident resolution and the implementation of IT infrastructure changes, to increase the overall support of IT management. Our main objective was to build the predictive models using machine learning algorithms and the CRISP-DM methodology. We used the incident and related-changes database obtained from the IT environment of the Rabobank Group company, which contained information about the processing of incidents during the incident management process. We decided to investigate the dependencies between the incident observation on a particular infrastructure component and the actual source of the incident, as well as the dependency between the incidents and related changes in the infrastructure. We used Random Forests and Gradient Boosting Machine classifiers in the process of identification of the incident source as well as in the prediction of the possible impact of the observed incident. Both types of models were tested on a testing set and evaluated using defined metrics.
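
    The "defined metrics" for a multi-class incident-source classifier are typically per-class precision, recall, and F1 on the held-out testing set. A stdlib sketch (the class labels and predictions below are invented; they stand in for outputs of a Random Forest or Gradient Boosting Machine):

```python
def precision_recall_f1(y_true, y_pred, positive):
    """Per-class precision, recall, and F1 for label `positive`."""
    tp = sum(t == positive and q == positive for t, q in zip(y_true, y_pred))
    fp = sum(t != positive and q == positive for t, q in zip(y_true, y_pred))
    fn = sum(t == positive and q != positive for t, q in zip(y_true, y_pred))
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

# Hypothetical incident-source labels vs classifier predictions on a test set.
y_true = ["net", "db", "net", "app", "db", "net"]
y_pred = ["net", "db", "app", "app", "net", "net"]
p, r, f = precision_recall_f1(y_true, y_pred, positive="net")
print(round(p, 2), round(r, 2), round(f, 2))
```

Averaging these per-class values (macro or weighted) gives a single score for comparing the two classifier families.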

  18. Maximum Likelihood Method for Predicting Environmental Conditions from Assemblage Composition: The R Package bio.infer

    Directory of Open Access Journals (Sweden)

    Lester L. Yuan

    2007-06-01

    Full Text Available This paper provides a brief introduction to the R package bio.infer, a set of scripts that facilitates the use of maximum likelihood (ML methods for predicting environmental conditions from assemblage composition. Environmental conditions can often be inferred from only biological data, and these inferences are useful when other sources of data are unavailable. ML prediction methods are statistically rigorous and applicable to a broader set of problems than more commonly used weighted averaging techniques. However, ML methods require a substantially greater investment of time to program algorithms and to perform computations. This package is designed to reduce the effort required to apply ML prediction methods.

  19. Incidence and predicting factors of falls of older inpatients

    Directory of Open Access Journals (Sweden)

    Hellen Cristina de Almeida Abreu

    2015-01-01

    Full Text Available OBJECTIVE To estimate the incidence and predicting factors associated with falls among older inpatients. METHODS Prospective cohort study conducted in clinical units of three hospitals in Cuiaba, MT, Midwestern Brazil, from March to August 2013. In this study, 221 inpatients aged 60 or over were followed until hospital discharge, death, or fall. The method of incidence density was used to calculate incidence rates. Bivariate analysis was performed by Chi-square test, and multiple analysis was performed by Cox regression. RESULTS The incidence of falls was 12.6 per 1,000 patients/day. Predicting factors for falls during hospitalization were: low educational level (RR = 2.48; 95%CI 1.17;5.25), polypharmacy (RR = 4.42; 95%CI 1.77;11.05), visual impairment (RR = 2.06; 95%CI 1.01;4.23), gait and balance impairment (RR = 2.95; 95%CI 1.22;7.14), urinary incontinence (RR = 5.67; 95%CI 2.58;12.44) and use of laxatives (RR = 4.21; 95%CI 1.15;15.39) and antipsychotics (RR = 4.10; 95%CI 1.38;12.13). CONCLUSIONS The incidence of falls of older inpatients is high. Predicting factors found for falls were low education level, polypharmacy, visual impairment, gait and balance impairment, urinary incontinence and use of laxatives and antipsychotics. Measures to prevent falls in hospitals are needed to reduce the incidence of this event.
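
    The incidence-density method used here divides the number of events by the accumulated follow-up time rather than by the number of patients, so patients censored early (discharge, death) contribute only the days they were actually at risk. A one-function sketch with made-up totals (not the cohort's actual event count or follow-up time):

```python
def incidence_density(events, person_days, per=1000):
    """Incidence rate per `per` patient-days of follow-up."""
    return events * per / person_days

# Hypothetical cohort: 31 falls over a total of 2,500 patient-days at risk.
rate = incidence_density(events=31, person_days=2500)
print(round(rate, 1))
```

The reported 12.6 falls per 1,000 patients/day is a rate of exactly this form.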

  20. Traffic Incident Clearance Time and Arrival Time Prediction Based on Hazard Models

    Directory of Open Access Journals (Sweden)

    Yang beibei Ji

    2014-01-01

    Full Text Available Accurate prediction of incident duration is not only important information for a Traffic Incident Management System, but also an effective input for travel time prediction. In this paper, hazard-based prediction models are developed for both incident clearance time and arrival time. The data are obtained from the Queensland Department of Transport and Main Roads' STREAMS Incident Management System (SIMS) for one year ending in November 2010. The best-fitting distributions are drawn for both clearance and arrival time for 3 types of incident: crash, stationary vehicle, and hazard. The results show that Gamma, Log-logistic, and Weibull are the best fit for crash, stationary vehicle, and hazard incidents, respectively. Significant impact factors are identified for crash clearance time and arrival time, and their quantitative influences for crash and hazard incidents are presented for both clearance and arrival times. The model accuracy is analyzed at the end.
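
    Choosing a best-fitting duration distribution amounts to maximizing the log-likelihood of each candidate over the observed times. For the Weibull, the scale has a closed-form MLE given the shape, so a simple profile search over the shape suffices. A stdlib sketch on synthetic clearance times (the shape 1.5 and scale 40 minutes are illustrative, not fitted SIMS values; a real comparison would repeat this for Gamma and Log-logistic too):

```python
import math
import random

def weibull_loglik(data, k, lam):
    """Weibull(k, lam) log-likelihood of the observed durations."""
    return sum(math.log(k / lam) + (k - 1.0) * math.log(x / lam) - (x / lam) ** k
               for x in data)

def fit_weibull(data):
    """Profile MLE: for each shape k, the scale MLE is closed-form,
    lam = (mean(x**k)) ** (1/k); scan k on a grid for the best log-likelihood."""
    best = None
    for i in range(1, 121):
        k = i / 20.0  # shape grid 0.05 .. 6.0
        lam = (sum(x ** k for x in data) / len(data)) ** (1.0 / k)
        ll = weibull_loglik(data, k, lam)
        if best is None or ll > best[2]:
            best = (k, lam, ll)
    return best

random.seed(3)
# Synthetic clearance times (minutes) from Weibull(shape=1.5, scale=40).
times = [40.0 * (-math.log(1.0 - random.random())) ** (1.0 / 1.5) for _ in range(1000)]
k_hat, lam_hat, _ = fit_weibull(times)
print(round(k_hat, 2), round(lam_hat, 1))
```

A shape above 1, as recovered here, means the clearance hazard rises with elapsed time, which is the kind of information hazard-based duration models exploit.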

  21. Anterior Segment Imaging Predicts Incident Gonioscopic Angle Closure.

    Science.gov (United States)

    Baskaran, Mani; Iyer, Jayant V; Narayanaswamy, Arun K; He, Yingke; Sakata, Lisandro M; Wu, Renyi; Liu, Dianna; Nongpiur, Monisha E; Friedman, David S; Aung, Tin

    2015-12-01

    To investigate the incidence of gonioscopic angle closure after 4 years in subjects with gonioscopically open angles but varying degrees of angle closure detected on anterior segment optical coherence tomography (AS OCT; Visante; Carl Zeiss Meditec, Dublin, CA) at baseline. Prospective, observational study. Three hundred forty-two subjects, mostly Chinese, 50 years of age or older, were recruited, of whom 65 were controls with open angles on gonioscopy and AS OCT at baseline, and 277 were cases with baseline open angles on gonioscopy but closed angles (1-4 quadrants) on AS OCT scans. All subjects underwent gonioscopy and AS OCT at baseline (horizontal and vertical single scans) and after 4 years. The examiner performing gonioscopy was masked to the baseline and AS OCT data. Angle closure in a quadrant was defined as nonvisibility of the posterior trabecular meshwork by gonioscopy and visible iridotrabecular contact beyond the scleral spur in AS OCT scans. Gonioscopic angle closure in 2 or 3 quadrants after 4 years. There were no statistically significant differences in age, ethnicity, or gender between cases and controls. None of the control subjects demonstrated gonioscopic angle closure after 4 years. Forty-eight of the 277 subjects (17.3%; 95% confidence interval [CI], 12.8-23; P < 0.0001) with at least 1 quadrant of angle closure on AS OCT at baseline demonstrated gonioscopic angle closure in 2 or more quadrants, whereas 28 subjects (10.1%; 95% CI, 6.7-14.6; P < 0.004) demonstrated gonioscopic angle closure in 3 or more quadrants after 4 years. Individuals with more quadrants of angle closure on baseline AS OCT scans had a greater likelihood of gonioscopic angle closure developing after 4 years (P < 0.0001, chi-square test for trend for both definitions of angle closure). Anterior segment OCT imaging at baseline predicts incident gonioscopic angle closure after 4 years among subjects who have gonioscopically open angles and iridotrabecular contact on AS OCT at

  22. A new, accurate predictive model for incident hypertension.

    Science.gov (United States)

    Völzke, Henry; Fung, Glenn; Ittermann, Till; Yu, Shipeng; Baumeister, Sebastian E; Dörr, Marcus; Lieb, Wolfgang; Völker, Uwe; Linneberg, Allan; Jørgensen, Torben; Felix, Stephan B; Rettig, Rainer; Rao, Bharat; Kroemer, Heyo K

    2013-11-01

    Data mining represents an alternative approach to identify new predictors of multifactorial diseases. This work aimed at building an accurate predictive model for incident hypertension using data mining procedures. The primary study population consisted of 1605 normotensive individuals aged 20-79 years with 5-year follow-up from the population-based Study of Health in Pomerania (SHIP). The initial set was randomly split into a training and a testing set. We used a probabilistic graphical model applying a Bayesian network to create a predictive model for incident hypertension and compared the predictive performance with the established Framingham risk score for hypertension. Finally, the model was validated in 2887 participants from INTER99, a Danish community-based intervention study. In the training set of SHIP data, the Bayesian network used a small subset of relevant baseline features including age, mean arterial pressure, rs16998073, serum glucose and urinary albumin concentrations. Furthermore, we detected relevant interactions between age and serum glucose as well as between rs16998073 and urinary albumin concentrations (area under the receiver operating characteristic curve, AUC 0.76). The model was confirmed in the SHIP validation set (AUC 0.78) and externally replicated in INTER99 (AUC 0.77). Compared to the established Framingham risk score for hypertension, the predictive performance of the new model was similar in the SHIP validation set and moderately better in INTER99. Data mining procedures identified a predictive model for incident hypertension, which included innovative and easy-to-measure variables. The findings promise great applicability in screening settings and clinical practice.

  23. Identifying Predictive Factors for Incident Reports in Patients Receiving Radiation Therapy

    Energy Technology Data Exchange (ETDEWEB)

    Elnahal, Shereef M., E-mail: selnaha1@jhmi.edu [Department of Radiation Oncology and Molecular Radiation Sciences, Sidney Kimmel Comprehensive Cancer Center, Johns Hopkins University School of Medicine, Baltimore, Maryland (United States); Blackford, Amanda [Department of Oncology Biostatistics, Sidney Kimmel Comprehensive Cancer Center, Johns Hopkins University School of Medicine, Baltimore, Maryland (United States); Smith, Koren; Souranis, Annette N.; Briner, Valerie; McNutt, Todd R.; DeWeese, Theodore L.; Wright, Jean L.; Terezakis, Stephanie A. [Department of Radiation Oncology and Molecular Radiation Sciences, Sidney Kimmel Comprehensive Cancer Center, Johns Hopkins University School of Medicine, Baltimore, Maryland (United States)

    2016-04-01

    Purpose: To describe radiation therapy cases during which voluntary incident reporting occurred; and identify patient- or treatment-specific factors that place patients at higher risk for incidents. Methods and Materials: We used our institution's incident learning system to build a database of patients with incident reports filed between January 2011 and December 2013. Patient- and treatment-specific data were reviewed for all patients with reported incidents, which were classified by step in the process and root cause. A control group of patients without events was generated for comparison. Summary statistics, likelihood ratios, and mixed-effect logistic regression models were used for group comparisons. Results: The incident and control groups comprised 794 and 499 patients, respectively. Common root causes included documentation errors (26.5%), communication (22.5%), technical treatment planning (37.5%), and technical treatment delivery (13.5%). Incidents were more frequently reported in minors (age <18 years) than in adult patients (37.7% vs 0.4%, P<.001). Patients with head and neck (16% vs 8%, P<.001) and breast (20% vs 15%, P=.03) primaries more frequently had incidents, whereas brain (18% vs 24%, P=.008) primaries were less frequent. Larger tumors (17% vs 10% had T4 lesions, P=.02), and cases on protocol (9% vs 5%, P=.005) or with intensity modulated radiation therapy/image guided intensity modulated radiation therapy (52% vs 43%, P=.001) were more likely to have incidents. Conclusions: We found several treatment- and patient-specific variables associated with incidents. These factors should be considered by treatment teams at the time of peer review to identify patients at higher risk. Larger datasets are required to recommend changes in care process standards, to minimize safety risks.

  4. Identifying Predictive Factors for Incident Reports in Patients Receiving Radiation Therapy

    International Nuclear Information System (INIS)

    Elnahal, Shereef M.; Blackford, Amanda; Smith, Koren; Souranis, Annette N.; Briner, Valerie; McNutt, Todd R.; DeWeese, Theodore L.; Wright, Jean L.; Terezakis, Stephanie A.

    2016-01-01

    Purpose: To describe radiation therapy cases during which voluntary incident reporting occurred; and identify patient- or treatment-specific factors that place patients at higher risk for incidents. Methods and Materials: We used our institution's incident learning system to build a database of patients with incident reports filed between January 2011 and December 2013. Patient- and treatment-specific data were reviewed for all patients with reported incidents, which were classified by step in the process and root cause. A control group of patients without events was generated for comparison. Summary statistics, likelihood ratios, and mixed-effect logistic regression models were used for group comparisons. Results: The incident and control groups comprised 794 and 499 patients, respectively. Common root causes included documentation errors (26.5%), communication (22.5%), technical treatment planning (37.5%), and technical treatment delivery (13.5%). Incidents were more frequently reported in minors (age <18 years) than in adult patients (37.7% vs 0.4%, P<.001). Patients with head and neck (16% vs 8%, P<.001) and breast (20% vs 15%, P=.03) primaries more frequently had incidents, whereas brain (18% vs 24%, P=.008) primaries were less frequent. Larger tumors (17% vs 10% had T4 lesions, P=.02), and cases on protocol (9% vs 5%, P=.005) or with intensity modulated radiation therapy/image guided intensity modulated radiation therapy (52% vs 43%, P=.001) were more likely to have incidents. Conclusions: We found several treatment- and patient-specific variables associated with incidents. These factors should be considered by treatment teams at the time of peer review to identify patients at higher risk. Larger datasets are required to recommend changes in care process standards, to minimize safety risks.

  5. Dynamic prediction of cumulative incidence functions by direct binomial regression.

    Science.gov (United States)

    Grand, Mia K; de Witte, Theo J M; Putter, Hein

    2018-03-25

    In recent years there has been a series of advances in the field of dynamic prediction. Among them is the development of methods for dynamic prediction of the cumulative incidence function in a competing risks setting. These models enable the predictions to be updated as time progresses and more information becomes available; for example, when a patient comes back for a follow-up visit after completing a year of treatment, the risks of death and of adverse events may have changed since treatment initiation. One approach to modelling the cumulative incidence function in competing risks is direct binomial regression, where right censoring of the event times is handled by inverse probability of censoring weights. We extend the approach by combining it with landmarking to enable dynamic prediction of the cumulative incidence function. The proposed models are very flexible, as they allow the covariates to have complex time-varying effects, and we illustrate how to investigate possible time-varying structures using Wald tests. The models are fitted using generalized estimating equations. The method is applied to bone marrow transplant data and the performance is investigated in a simulation study. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
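The inverse-probability-of-censoring-weights idea mentioned above can be sketched in a few lines. This toy version estimates the cumulative incidence of a cause-1 event at a fixed horizon, assuming the censoring survival function G is known (a real analysis would estimate G, e.g. with Kaplan-Meier, and add covariates via binomial regression); all data are invented.

```python
def ipcw_cuminc(times, events, horizon, G):
    """Estimate P(cause-1 event by `horizon`) under right censoring.
    times[i]: observed time; events[i]: 1 = cause-1 event, 2 = competing
    event, 0 = censored. G(t): probability of remaining uncensored past t.
    Each observed cause-1 event is up-weighted by 1/G at its event time
    to compensate for comparable subjects lost to censoring."""
    total = 0.0
    for t, e in zip(times, events):
        if e == 1 and t <= horizon:
            total += 1.0 / G(t)
    return total / len(times)

# Uniform censoring on [0, 10]: G(t) = 1 - t/10 (illustrative assumption).
G = lambda t: 1.0 - t / 10.0
times  = [1.0, 2.0, 3.0, 4.0, 6.0]
events = [1,   0,   1,   2,   1]
print(round(ipcw_cuminc(times, events, 5.0, G), 3))  # 0.508
```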

  6. Prediction of cancer incidence in Tyrol/Austria for year of diagnosis 2020.

    Science.gov (United States)

    Oberaigner, Willi; Geiger-Gritsch, Sabine

    2014-10-01

    Prediction of the number of incident cancer cases is very relevant for health planning purposes and the allocation of resources. The shift towards older age groups in central European populations over the next decades is likely to contribute to an increase in cancer incidence for many cancer sites. In Tyrol, cancer incidence data have been registered at a high level of completeness for more than 20 years. We therefore aimed to compute well-founded predictions of cancer incidence for Tyrol for the year 2020 for all frequent cancer sites and for all cancer sites combined. After defining a prediction base range for every cancer site, we extrapolated the age-specific time trends in the prediction base range following a linear model for increasing and a log-linear model for decreasing time trends. The extrapolated time trends were evaluated for the year 2020 using population figures supplied by Statistics Austria. Compared with the number of annual incident cases for the year 2009 for all cancer sites combined except non-melanoma skin cancer, we predicted an increase of 235 (15 %) and 362 (21 %) for females and males, respectively. For both sexes, more than 90 % of the increase is attributable to the shift toward older age groups in the next decade. The biggest increases in absolute numbers are seen for females in breast cancer (92, 21 %), lung cancer (64, 52 %), colorectal cancer (40, 24 %), melanoma (38, 30 %) and the haematopoietic system (37, 35 %), and for males in prostate cancer (105, 25 %), colorectal cancer (91, 45 %), the haematopoietic system (71, 55 %), bladder cancer (69, 100 %) and melanoma (64, 52 %). The increase in the number of incident cancer cases of 15 % in females and 21 % in males in the next decade is very relevant for planning purposes. However, external factors cause uncertainty in the prediction for some cancer sites (mainly prostate cancer and colorectal cancer) and the prediction intervals are still broad. Therefore
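The extrapolation rule described above (a linear model for increasing trends, a log-linear model for decreasing ones) can be sketched as follows; the incidence rates are made-up numbers, not the Tyrolean data.

```python
import math

def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x with one predictor."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

def extrapolate_rate(years, rates, target_year):
    """Linear model for an increasing trend, log-linear for a decreasing
    one, mirroring the rule described in the abstract."""
    increasing = rates[-1] >= rates[0]
    ys = rates if increasing else [math.log(r) for r in rates]
    a, b = fit_line(years, ys)
    pred = a + b * target_year
    return pred if increasing else math.exp(pred)

# Illustrative incidence rates per 100,000 for one age group (invented).
years = [2005, 2006, 2007, 2008, 2009]
rates = [50.0, 52.0, 55.0, 57.0, 60.0]
print(round(extrapolate_rate(years, rates, 2020), 1))  # 87.3
```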

  7. Predicting hepatitis B monthly incidence rates using weighted Markov chains and time series methods.

    Science.gov (United States)

    Shahdoust, Maryam; Sadeghifar, Majid; Poorolajal, Jalal; Javanrooh, Niloofar; Amini, Payam

    2015-01-01

    Hepatitis B (HB) is a major cause of mortality worldwide. Accurately predicting the trend of the disease can provide an appropriate basis for making health policy for disease prevention. This paper aimed to apply three different methods to predict monthly incidence rates of HB. This historical cohort study was conducted on the HB incidence data of Hamadan Province, in the west of Iran, from 2004 to 2012. The Weighted Markov Chain (WMC) method, based on Markov chain theory, and two time series models, Holt Exponential Smoothing (HES) and SARIMA, were applied to the data. The results of the applied methods were compared using the percentages of correctly predicted incidence rates. The monthly incidence rates were clustered into two clusters serving as the states of the Markov chain. The correctly predicted percentages of the first and second clusters for the WMC, HES and SARIMA methods were (100, 0), (84, 67) and (79, 47), respectively. The overall incidence rate of HBV is estimated to decrease over time. The comparison of the results of the three models indicated that, with respect to the existing seasonality trend and non-stationarity, the HES had the most accurate prediction of the incidence rates.
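The Markov-chain component of the comparison can be illustrated with a minimal two-state chain: estimate transition probabilities from the sequence of clustered monthly states and predict the next month's state from the current one. The weighting scheme of the full WMC method is omitted, and the state sequence is invented.

```python
def transition_matrix(states, n_states=2):
    """Maximum-likelihood transition probabilities from a state sequence."""
    counts = [[0] * n_states for _ in range(n_states)]
    for a, b in zip(states, states[1:]):
        counts[a][b] += 1
    probs = []
    for row in counts:
        total = sum(row)
        probs.append([c / total if total else 0.0 for c in row])
    return probs

def predict_next(states):
    """Most probable next state given the last observed state."""
    row = transition_matrix(states)[states[-1]]
    return row.index(max(row))

# 0 = low-incidence month, 1 = high-incidence month (illustrative sequence).
history = [0, 0, 1, 0, 0, 0, 1, 1, 0, 0]
print(predict_next(history))  # 0
```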

  8. Straight line fitting and predictions: On a marginal likelihood approach to linear regression and errors-in-variables models

    Science.gov (United States)

    Christiansen, Bo

    2015-04-01

    Linear regression methods are without doubt the most used approaches to describe and predict data in the physical sciences. They are often good first-order approximations and they are in general easier to apply and interpret than more advanced methods. However, even the properties of univariate regression can lead to debate over the appropriateness of various models, as witnessed by the recent discussion about climate reconstruction methods. Before linear regression is applied, important choices have to be made regarding the origins of the noise terms and regarding which of the two variables under consideration should be treated as the independent variable. These decisions are often not easy to make but they may have a considerable impact on the results. We seek to give a unified probabilistic - Bayesian with flat priors - treatment of univariate linear regression and prediction by taking, as starting point, the general errors-in-variables model (Christiansen, J. Clim., 27, 2014-2031, 2014). Other versions of linear regression can be obtained as limits of this model. We derive the likelihood of the model parameters and predictands of the general errors-in-variables model by marginalizing over the nuisance parameters. The resulting likelihood is relatively simple and easy to analyze and calculate. The well-known unidentifiability of the errors-in-variables model is manifested as the absence of a well-defined maximum in the likelihood. However, this does not mean that probabilistic inference cannot be made; the marginal likelihoods of model parameters and the predictands have, in general, well-defined maxima. We also include a probabilistic version of classical calibration and show how it is related to the errors-in-variables model. The results are illustrated by an example from the coupling between the lower stratosphere and the troposphere in the Northern Hemisphere winter.
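The marginalization described above can be made concrete with a small sketch. In the simplest Gaussian errors-in-variables model (no intercept, known error variances sx2 and sy2), integrating out the latent true abscissas under a flat prior leaves each data pair contributing a Gaussian term for y - beta*x with variance sy2 + beta**2 * sx2. The data and variances below are invented; this is not the paper's full model.

```python
import math

def log_marginal_likelihood(beta, xs, ys, sx2, sy2):
    """Log marginal likelihood of slope beta in the errors-in-variables
    model x_i = xi_i + e_x, y_i = beta*xi_i + e_y, with the latent xi_i
    integrated out under a flat prior."""
    var = sy2 + beta ** 2 * sx2
    return sum(-0.5 * math.log(2 * math.pi * var)
               - (y - beta * x) ** 2 / (2 * var)
               for x, y in zip(xs, ys))

xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [0.1, 2.1, 3.9, 6.2, 7.9]          # roughly y = 2x plus noise
betas = [b / 100 for b in range(100, 301)]
best = max(betas, key=lambda b: log_marginal_likelihood(b, xs, ys, 0.1, 0.1))
print(best)  # close to 2
```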

  9. Estimation of Model's Marginal likelihood Using Adaptive Sparse Grid Surrogates in Bayesian Model Averaging

    Science.gov (United States)

    Zeng, X.

    2015-12-01

    A large number of model executions are required to obtain alternative conceptual models' predictions and their posterior probabilities in Bayesian model averaging (BMA). The posterior model probability is estimated through a model's marginal likelihood and prior probability. The heavy computational burden hinders the implementation of BMA prediction, especially for elaborate marginal likelihood estimators. To overcome the computational burden of BMA, an adaptive sparse grid (SG) stochastic collocation method is used to build surrogates for alternative conceptual models through a numerical experiment on a synthetic groundwater model. BMA predictions depend on model posterior weights (or marginal likelihoods), and this study also evaluated four marginal likelihood estimators: the arithmetic mean estimator (AME), harmonic mean estimator (HME), stabilized harmonic mean estimator (SHME), and thermodynamic integration estimator (TIE). The results demonstrate that TIE is accurate in estimating conceptual models' marginal likelihoods, and BMA-TIE has better predictive performance than the other BMA predictions. TIE is highly stable in estimating a conceptual model's marginal likelihood: repeated TIE estimates show significantly less variability than those obtained by the other estimators. In addition, the SG surrogates are efficient in facilitating BMA predictions, especially for BMA-TIE. The number of model executions needed for building surrogates is 4.13%, 6.89%, 3.44%, and 0.43% of the model executions required by BMA-AME, BMA-HME, BMA-SHME, and BMA-TIE, respectively.
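Two of the marginal likelihood estimators named above, the AME and the HME, can be sketched on a toy conjugate normal-normal problem (not the groundwater setting; TIE is omitted). The AME averages the likelihood over prior draws; the HME takes the harmonic mean of the likelihood over posterior draws. All numbers are illustrative, and the closed-form marginal is used as a check.

```python
import math, random

random.seed(0)
y = 0.5                            # single observation, y | theta ~ N(theta, 1)
prior_mu, prior_var = 0.0, 0.25    # theta ~ N(0, 0.25)

def likelihood(theta):
    return math.exp(-(y - theta) ** 2 / 2) / math.sqrt(2 * math.pi)

# Exact marginal by conjugacy: y ~ N(prior_mu, prior_var + 1).
mvar = prior_var + 1.0
true_ml = math.exp(-(y - prior_mu) ** 2 / (2 * mvar)) / math.sqrt(2 * math.pi * mvar)

n = 200_000
# AME: average the likelihood over prior draws.
ame = sum(likelihood(random.gauss(prior_mu, math.sqrt(prior_var)))
          for _ in range(n)) / n

# HME: harmonic mean of the likelihood over posterior draws
# (the posterior is N(0.1, 0.2) for these numbers, again by conjugacy).
post_mu, post_var = 0.1, 0.2
hme = n / sum(1.0 / likelihood(random.gauss(post_mu, math.sqrt(post_var)))
              for _ in range(n))

print(round(true_ml, 3), round(ame, 3), round(hme, 3))
```

On well-behaved problems like this one both estimators converge; the HME's notorious instability shows up when the likelihood is much narrower than the prior.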

  10. PREVAIL: Predicting Recovery through Estimation and Visualization of Active and Incident Lesions.

    Science.gov (United States)

    Dworkin, Jordan D; Sweeney, Elizabeth M; Schindler, Matthew K; Chahin, Salim; Reich, Daniel S; Shinohara, Russell T

    2016-01-01

    The goal of this study was to develop a model that integrates imaging and clinical information observed at lesion incidence for predicting the recovery of white matter lesions in multiple sclerosis (MS) patients. Demographic, clinical, and magnetic resonance imaging (MRI) data were obtained from 60 subjects with MS as part of a natural history study at the National Institute of Neurological Disorders and Stroke. A total of 401 lesions met the inclusion criteria and were used in the study. Imaging features were extracted from the intensity-normalized T1-weighted (T1w) and T2-weighted sequences as well as magnetization transfer ratio (MTR) sequence acquired at lesion incidence. T1w and MTR signatures were also extracted from images acquired one-year post-incidence. Imaging features were integrated with clinical and demographic data observed at lesion incidence to create statistical prediction models for long-term damage within the lesion. The performance of the T1w and MTR predictions was assessed in two ways: first, the predictive accuracy was measured quantitatively using leave-one-lesion-out cross-validated (CV) mean-squared predictive error. Then, to assess the prediction performance from the perspective of expert clinicians, three board-certified MS clinicians were asked to individually score how similar the CV model-predicted one-year appearance was to the true one-year appearance for a random sample of 100 lesions. The cross-validated root-mean-square predictive error was 0.95 for normalized T1w and 0.064 for MTR, compared to the estimated measurement errors of 0.48 and 0.078 respectively. The three expert raters agreed that T1w and MTR predictions closely resembled the true one-year follow-up appearance of the lesions in both degree and pattern of recovery within lesions. This study demonstrates that by using only information from a single visit at incidence, we can predict how a new lesion will recover using relatively simple statistical techniques. The
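The leave-one-out cross-validated root-mean-square predictive error used as the quantitative yardstick above can be illustrated with a deliberately trivial predictor, where each held-out value is predicted by the mean of the rest; the values are invented, not the study's MTR data.

```python
import math

def loo_rmse(values):
    """Leave-one-out CV RMSE for a mean-predictor model: predict each
    held-out value by the mean of the remaining values."""
    n = len(values)
    errs = []
    for v in values:
        pred = (sum(values) - v) / (n - 1)
        errs.append((v - pred) ** 2)
    return math.sqrt(sum(errs) / n)

# Illustrative within-lesion intensity values (invented).
lesion_values = [0.30, 0.28, 0.35, 0.33, 0.29]
print(round(loo_rmse(lesion_values), 4))  # 0.0326
```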

  11. Prediction Model for Gastric Cancer Incidence in Korean Population.

    Directory of Open Access Journals (Sweden)

    Bang Wool Eom

    Full Text Available Predicting high-risk groups for gastric cancer and motivating these groups to receive regular checkups is required for the early detection of gastric cancer. The aim of this study was to develop a prediction model for gastric cancer incidence based on a large population-based cohort in Korea. Based on the National Health Insurance Corporation data, we analyzed 10 major risk factors for gastric cancer. The Cox proportional hazards model was used to develop gender-specific prediction models for gastric cancer development, and the performance of the developed model in terms of discrimination and calibration was validated using an independent cohort. Discrimination ability was evaluated using Harrell's C-statistics, and calibration was evaluated using a calibration plot and slope. During a median of 11.4 years of follow-up, 19,465 (1.4%) and 5,579 (0.7%) newly developed gastric cancer cases were observed among 1,372,424 men and 804,077 women, respectively. The prediction models included age, BMI, family history, meal regularity, salt preference, alcohol consumption, smoking and physical activity for men, and age, BMI, family history, salt preference, alcohol consumption, and smoking for women. The prediction model showed good accuracy and predictability in both the development and validation cohorts (C-statistics: 0.764 for men, 0.706 for women). In this study, a prediction model for gastric cancer incidence was developed that displayed good performance.
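Harrell's C-statistic, the discrimination measure cited above, can be computed with a short sketch: among pairs where one subject's event is observed before the other's follow-up time, count how often the model assigns that subject the higher risk score. The data are invented.

```python
def harrell_c(times, events, risks):
    """times: follow-up times; events: 1 if the event was observed,
    0 if censored; risks: model risk scores (higher = earlier predicted
    event). Counts concordant comparable pairs, ties scoring 0.5."""
    concordant, comparable = 0.0, 0
    n = len(times)
    for i in range(n):
        for j in range(n):
            # A pair is comparable if i's observed event precedes j's time.
            if events[i] == 1 and times[i] < times[j]:
                comparable += 1
                if risks[i] > risks[j]:
                    concordant += 1
                elif risks[i] == risks[j]:
                    concordant += 0.5
    return concordant / comparable

times  = [2.0, 4.0, 5.0, 7.0, 9.0]
events = [1,   1,   0,   1,   0]
risks  = [0.9, 0.6, 0.7, 0.4, 0.2]
print(harrell_c(times, events, risks))  # 0.875
```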

  12. Pelvic Incidence: A Predictive Factor for Three-Dimensional Acetabular Orientation—A Preliminary Study

    Directory of Open Access Journals (Sweden)

    Christophe Boulay

    2014-01-01

    Full Text Available Acetabular cup orientation (inclination and anteversion) is a fundamental topic in orthopaedics and depends on pelvis tilt (a positional parameter), emphasising the notion of a safe range of pelvis tilt. The hypothesis was that pelvic incidence (a morphologic parameter) could yield a more accurate and reliable assessment than pelvis tilt. The aim was to derive a predictive equation for the acetabular 3D orientation parameters determined by pelvic incidence, and to include it in the model. The second aim was to consider the asymmetry between the right and left acetabulae. Twelve pelvic anatomic specimens were measured with an electromagnetic Fastrak system (Polhemus Society), providing the 3D positions of anatomical landmarks to allow measurement of acetabular and pelvic parameters. Acetabulum and pelvis data were correlated by a Spearman matrix. A robust linear regression analysis provided prediction of the acetabulum axes. The orientation of each acetabulum could be predicted by the incidence. The incidence is correlated with the morphology of the acetabula. The asymmetry of the acetabular roof was correlated with pelvic incidence. This study allowed analysis of the relationships between acetabular orientation and pelvic incidence. Pelvic incidence (a morphologic parameter) could determine the safe range of pelvis tilt (a positional parameter) for an individual and not a group.

  13. Symptoms of delirium predict incident delirium in older long-term care residents.

    Science.gov (United States)

    Cole, Martin G; McCusker, Jane; Voyer, Philippe; Monette, Johanne; Champoux, Nathalie; Ciampi, Antonio; Vu, Minh; Dyachenko, Alina; Belzile, Eric

    2013-06-01

    Detection of long-term care (LTC) residents at risk of delirium may lead to prevention of this disorder. The primary objective of this study was to determine if the presence of one or more Confusion Assessment Method (CAM) core symptoms of delirium at baseline assessment predicts incident delirium. Secondary objectives were to determine if the number or the type of symptoms predict incident delirium. The study was a secondary analysis of data collected for a prospective study of delirium among older residents of seven LTC facilities in Montreal and Quebec City, Canada. The Mini-Mental State Exam (MMSE), CAM, Delirium Index (DI), Hierarchic Dementia Scale, Barthel Index, and Cornell Scale for Depression were completed at baseline. The MMSE, CAM, and DI were repeated weekly for six months. Multivariate Cox regression models were used to determine if baseline symptoms predict incident delirium. Of 273 residents, 40 (14.7%) developed incident delirium. Mean (SD) time to onset of delirium was 10.8 (7.4) weeks. When one or more CAM core symptoms were present at baseline, the Hazard Ratio (HR) for incident delirium was 3.5 (95% CI = 1.4, 8.9). The HRs for number of symptoms present ranged from 2.9 (95% CI = 1.0, 8.3) for one symptom to 3.8 (95% CI = 1.3, 11.0) for three symptoms. The HR for one type of symptom, fluctuation, was 2.2 (95% CI = 1.2, 4.2). The presence of CAM core symptoms at baseline assessment predicts incident delirium in older LTC residents. These findings have potentially important implications for clinical practice and research in LTC settings.

  14. Predictive Index for the Incidence of Childhood Tuberculosis in South Kalimantan Province

    Directory of Open Access Journals (Sweden)

    Bahrul Ilmi

    2015-08-01

    Full Text Available The research objective was to formulate a predictive index for childhood tuberculosis in South Kalimantan province. The research combined mixed methods following a Sequential Exploratory Design (Sugiono, 2012), with a qualitative approach supporting a quantitative core built on a case-control design. The qualitative sample comprised 16 respondents for interviews and 48 respondents for focus group discussions; the quantitative sample comprised 216 respondents, consisting of 62 cases and 154 controls. Qualitative sampling was purposive, and quantitative sampling used three-stage multi-stage cluster random sampling. The analysis techniques were qualitative description and confirmatory factor analysis, measuring the latent variables by path analysis with the Linear Structural Relationships (LISREL) program. The results showed that the socio-cultural environment had a positive and significant association with the incidence of childhood tuberculosis; that the physical environment of the house was positively and significantly related to the biological environment and to the incidence of childhood tuberculosis; that immunization and child nutrition status were positively and significantly related to the incidence of childhood tuberculosis; and that the biological environment had a positive and significant effect on the incidence of childhood tuberculosis. The formulated predictive index for childhood tuberculosis in South Kalimantan province is: index = 0.19 (physical environment of the house) + 0.44 (biological environment) + 0.53 (socio-cultural environment) + 0.19 (immunization and child nutrition status). All of the R-square values exceeded 0.5, meaning that the predictive index model for childhood tuberculosis met the required goodness of fit. New findings from this dissertation research are: 1. social networks, social support and collective efficacy were associated with the incidence of childhood tuberculosis. 2

  15. Effects of passengers on bus driver celeration behavior and incident prediction.

    Science.gov (United States)

    Af Wåhlberg, A E

    2007-01-01

    Driver celeration (speed change) behavior of bus drivers has previously been found to predict their traffic incident involvement, but it has also been ascertained that the level of celeration is influenced by the number of passengers carried as well as by other traffic density variables. This means that the individual level of celeration is not estimated as well as it could be. Another hypothesized influence of the number of passengers is that of differential quality of measurements, where high passenger-density circumstances are supposed to yield better estimates of the individual driver component of celeration behavior. Comparisons were made between different variants of celeration as a predictor of the traffic incidents of bus drivers. The number of bus passengers was held constant, and cases identified by their number of passengers per kilometer during measurement were excluded (in 12 samples of repeated measurements). After holding passengers constant, the correlations between celeration behavior and incident record increased very slightly. Also, selective prediction of the incident record of those drivers who had had many passengers when measured increased the correlations even more. Traffic density variables like the number of passengers thus have little direct influence on the predictive power of celeration behavior, despite their impact on the absolute celeration level. Selective prediction, on the other hand, increased correlations substantially. This unusual effect was probably due to how the individual propensity for high- or low-celeration driving was affected by the number of stops made and general traffic density; differences between drivers in this respect were probably enhanced by the denser traffic, thus creating a better estimate of the theoretical celeration behavior parameter C. The new concept of selective prediction is discussed in terms of making estimates of the systematic differences in quality of the individual driver data.

  16. Gaussian copula as a likelihood function for environmental models

    Science.gov (United States)

    Wani, O.; Espadas, G.; Cecinati, F.; Rieckermann, J.

    2017-12-01

    Parameter estimation of environmental models always comes with uncertainty. To formally quantify this parametric uncertainty, a likelihood function needs to be formulated, defined as the probability of the observations given fixed values of the parameter set. A likelihood function allows us to infer parameter values from observations using Bayes' theorem. The challenge is to formulate a likelihood function that reliably describes the error-generating processes which lead to the observed monitoring data, such as rainfall and runoff. If the likelihood function is not representative of the error statistics, the parameter inference will give biased parameter values. Several uncertainty estimation methods currently in use employ Gaussian processes as a likelihood function because of their favourable analytical properties. A Box-Cox transformation is suggested to deal with non-symmetric and heteroscedastic errors, e.g. for flow data, which are typically more uncertain in high flows than in periods with low flows. The problem with transformations is that the results are conditional on hyper-parameters, for which it is difficult to formulate the analyst's belief a priori. In an attempt to address this problem, in this research work we suggest learning the nature of the error distribution from the errors made by the model in "past" forecasts. We use a Gaussian copula to generate semiparametric error distributions. 1) We show that this copula can then be used as a likelihood function to infer parameters, breaking away from the practice of using multivariate normal distributions. Based on the results from a didactical example of predicting rainfall runoff, 2) we demonstrate that the copula captures the predictive uncertainty of the model. 3) Finally, we find that the properties of autocorrelation and heteroscedasticity of the errors are captured well by the copula, eliminating the need to use transforms. In summary, our findings suggest that copulas are an
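The copula idea above can be sketched for a single pair of model errors: transform each error to a uniform score u through its marginal CDF, then evaluate a bivariate Gaussian copula density, which carries the dependence (e.g. autocorrelation between consecutive errors) separately from the marginals. This is a minimal illustration, not the authors' full formulation; rho and the inputs are invented.

```python
import math
from statistics import NormalDist

nd = NormalDist()

def gaussian_copula_logdensity(u1, u2, rho):
    """Log density of the bivariate Gaussian copula at (u1, u2) in (0,1)^2,
    i.e. log[ phi2(z1, z2; rho) / (phi(z1) * phi(z2)) ] with z = Phi^{-1}(u)."""
    z1, z2 = nd.inv_cdf(u1), nd.inv_cdf(u2)
    return (-0.5 * math.log(1 - rho ** 2)
            - (rho ** 2 * (z1 ** 2 + z2 ** 2) - 2 * rho * z1 * z2)
              / (2 * (1 - rho ** 2)))

# Independence (rho = 0) gives copula density 1, so log density 0;
# a concordant pair under positive dependence gets log density > 0.
print(gaussian_copula_logdensity(0.3, 0.8, 0.0))
print(gaussian_copula_logdensity(0.9, 0.9, 0.5) > 0)  # True
```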

  17. Development of incident progress prediction technologies for nuclear emergency preparedness. Current status and future subjects

    International Nuclear Information System (INIS)

    Yoshida, Yoshitaka; Yamamoto, Yasunori; Kusunoki, Takayoshi; Kawasaki, Ikuo; Yanagi, Chihiro; Kinoshita, Ikuo; Iwasaki, Yoshito

    2014-01-01

    Nuclear licensees are required by the government's Basic Plan for Disaster Prevention to maintain, during normal operation, a prediction system for use in a nuclear emergency. Incident progress prediction means understanding the present condition of the nuclear power plant appropriately, predicting what kind of situation will occur in the near future if conditions continue on their present course, choosing effective countermeasures against the coming threat, and understanding the time available for intervention. Following the accident of September 30, 1999 at the nuclear fuel fabrication facility in Tokai Village, Ibaraki Prefecture, the Institute of Nuclear Safety System started development of incident progress prediction technologies for nuclear emergency preparedness. We have performed technical applications and made improvements in nuclear emergency exercises, and verified the developed systems against the observed values of the Fukushima Daiichi Nuclear Power Plant accident. As a result, our Incident Progress Prediction System was applied in nuclear emergency exercises, and we accumulated the knowledge and experience with which we improved the system to make predictions more rapidly and more precisely, including, for example, the development of a prediction method for the leak size of reactor coolant. On the other hand, if a rapidly progressing incident occurs, end users need simple and quick predictions about the public's protection and evacuation areas, so we developed the Radioactive Materials Release, Radiation Dose and Radiological Protection Area Prediction System, which recast the inverse problem as a forward problem solution. In view of the water-level-decline incident at the spent fuel storage facility of the Fukushima Daiichi Nuclear Power Plant, the evaluation tool for spent fuel storage facility water level and water temperature was improved. Such incident progress prediction technologies were

  18. Incorporating Nuisance Parameters in Likelihoods for Multisource Spectra

    CERN Document Server

    Conway, J.S.

    2011-01-01

    We describe here the general mathematical approach to constructing likelihoods for fitting observed spectra in one or more dimensions with multiple sources, including the effects of systematic uncertainties represented as nuisance parameters, when the likelihood is to be maximized with respect to these parameters. We consider three types of nuisance parameters: simple multiplicative factors, source spectra "morphing" parameters, and parameters representing statistical uncertainties in the predicted source spectra.
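The multiplicative-factor case described above can be sketched with a one-nuisance binned Poisson likelihood: a normalisation parameter kappa scales the predicted background spectrum and is constrained by a Gaussian penalty, and the negative log likelihood is minimized over kappa on a grid. All spectra, counts and the width sigma_kappa are invented.

```python
import math

def log_poisson(n, mu):
    """Log of the Poisson probability mass function."""
    return n * math.log(mu) - mu - math.lgamma(n + 1)

def nll(signal, background, observed, mu, kappa, sigma_kappa=0.1):
    """Negative log likelihood for signal strength mu, with a multiplicative
    nuisance kappa on the background prediction, constrained by a Gaussian
    penalty of width sigma_kappa around 1."""
    total = 0.5 * (kappa - 1.0) ** 2 / sigma_kappa ** 2
    for s, b, n in zip(signal, background, observed):
        total -= log_poisson(n, mu * s + kappa * b)
    return total

signal     = [5.0, 10.0, 5.0]
background = [20.0, 20.0, 20.0]
observed   = [26, 31, 24]

# Profile the nuisance on a grid for fixed mu = 1 (a real fit would use a
# proper minimizer rather than a grid scan).
profiled = min(nll(signal, background, observed, 1.0, 0.8 + i * 0.001)
               for i in range(401))
print(round(profiled, 3))
```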

  19. Likelihood ratio sequential sampling models of recognition memory.

    Science.gov (United States)

    Osth, Adam F; Dennis, Simon; Heathcote, Andrew

    2017-02-01

    The mirror effect - a phenomenon whereby a manipulation produces opposite effects on hit and false alarm rates - is a benchmark regularity of recognition memory. A likelihood ratio decision process, basing recognition on the relative likelihood that a stimulus is a target or a lure, naturally predicts the mirror effect, and so has been widely adopted in quantitative models of recognition memory. Glanzer, Hilford, and Maloney (2009) demonstrated that likelihood ratio models, assuming Gaussian memory strength, are also capable of explaining regularities observed in receiver operating characteristics (ROCs), such as greater target than lure variance. Despite its central place in theorising about recognition memory, however, this class of models has not been tested using response time (RT) distributions. In this article, we develop a linear approximation to the likelihood ratio transformation, which we show predicts the same regularities as the exact transformation. This development enabled us to develop a tractable model of recognition-memory RT based on the diffusion decision model (DDM), with inputs (drift rates) provided by an approximate likelihood ratio transformation. We compared this "LR-DDM" to a standard DDM where all targets and lures receive their own drift rate parameters. Both were implemented as hierarchical Bayesian models and applied to four datasets. Model selection taking parsimony into account favored the LR-DDM, which requires fewer parameters than the standard DDM but still fits the data well. These results support log-likelihood based models as providing an elegant explanation of the regularities of recognition memory, not only in terms of the choices made but also in terms of the times it takes to make them. Copyright © 2016 Elsevier Inc. All rights reserved.
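The claim that a likelihood ratio decision rule naturally predicts the mirror effect can be checked in a few lines for the equal-variance Gaussian case, where the log-likelihood-ratio criterion of zero sits midway between the lure and target distributions; the parameters are illustrative.

```python
from statistics import NormalDist

def rates(d, sigma=1.0):
    """Hit and false-alarm rates for targets ~ N(d, sigma) and lures
    ~ N(0, sigma) under the likelihood-ratio criterion log LR = 0, which
    for equal variances is the midpoint d/2."""
    nd = NormalDist()
    criterion = d / 2
    hit = 1 - nd.cdf((criterion - d) / sigma)
    false_alarm = 1 - nd.cdf(criterion / sigma)
    return hit, false_alarm

weak_hit, weak_fa = rates(1.0)
strong_hit, strong_fa = rates(2.0)
# Strengthening memory raises hits AND lowers false alarms: the mirror effect.
print(strong_hit > weak_hit and strong_fa < weak_fa)  # True
```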

  20. Financial and health literacy predict incident AD dementia and AD pathology

    Science.gov (United States)

    Yu, Lei; Wilson, Robert S.; Schneider, Julie A.; Bennett, David A.; Boyle, Patricia A.

    2017-01-01

    Background Domain specific literacy is a multidimensional construct that requires multiple resources, including cognitive and non-cognitive factors. Objective We test the hypothesis that domain specific literacy is associated with AD dementia and AD pathology after controlling for cognition. Methods Participants were community-based older persons who completed a baseline literacy assessment, underwent annual clinical evaluations for up to 8 years, and agreed to organ donation after death. Financial and health literacy was measured using 32 questions and cognition was measured using 19 tests. Annual diagnosis of AD dementia followed standard criteria. AD pathology was examined post-mortem by quantifying plaques and tangles. Cox models examined the association of literacy with incident AD dementia. Performance of model prediction for incident AD dementia was assessed using indices for integrated discrimination improvement and continuous net reclassification improvement. Linear regression models examined the independent association of literacy with AD pathology in autopsied participants. Results All 805 participants were free of dementia at baseline and 102 (12.7%) developed AD dementia during the follow-up. Lower literacy was associated with higher risk for incident AD dementia, and the model with the literacy measure had better predictive performance than the one with demographics and cognition only. Lower literacy also was associated with higher burden of AD pathology after controlling for cognition (β = 0.07, p = 0.035). Conclusion Literacy predicts incident AD dementia and AD pathology in community-dwelling older persons, and the association is independent of traditional measures of cognition. PMID:28157101

  1. Financial and Health Literacy Predict Incident Alzheimer's Disease Dementia and Pathology.

    Science.gov (United States)

    Yu, Lei; Wilson, Robert S; Schneider, Julie A; Bennett, David A; Boyle, Patricia A

    2017-01-01

    Domain specific literacy is a multidimensional construct that requires multiple resources, including cognitive and non-cognitive factors. We test the hypothesis that domain specific literacy is associated with Alzheimer's disease (AD) dementia and AD pathology after controlling for cognition. Participants were community-based older persons who completed a baseline literacy assessment, underwent annual clinical evaluations for up to 8 years, and agreed to organ donation after death. Financial and health literacy was measured using 32 questions and cognition was measured using 19 tests. Annual diagnosis of AD dementia followed standard criteria. AD pathology was examined postmortem by quantifying plaques and tangles. Cox models examined the association of literacy with incident AD dementia. Performance of model prediction for incident AD dementia was assessed using indices for integrated discrimination improvement and continuous net reclassification improvement. Linear regression models examined the independent association of literacy with AD pathology in autopsied participants. All 805 participants were free of dementia at baseline and 102 (12.7%) developed AD dementia during the follow-up. Lower literacy was associated with higher risk for incident AD dementia, and the model with the literacy measure had better predictive performance than the one with demographics and cognition only. Lower literacy also was associated with higher burden of AD pathology after controlling for cognition (β = 0.07, p = 0.035). Literacy predicts incident AD dementia and AD pathology in community-dwelling older persons, and the association is independent of traditional measures of cognition.

  2. Incidence, Mortality, and Predictive Factors of Hepatocellular Carcinoma in Primary Biliary Cirrhosis

    Directory of Open Access Journals (Sweden)

    Kenichi Hosonuma

    2013-01-01

    Full Text Available Background. The study aims to analyze in detail the incidence and mortality of hepatocellular carcinoma (HCC) in primary biliary cirrhosis (PBC), using the standardized incidence ratio (SIR) and standardized mortality ratio (SMR), because no large case studies have focused on a detailed statistical analysis of these in Asia. Methods. The study cohorts were consecutively diagnosed at Gunma University and its affiliated hospitals. Age- or sex-specific annual cancer incidence and deaths were obtained from the Japanese Cancer Registry and Death Registry as a reference for the comparison of SIR or SMR of HCC. Moreover, univariate and multivariate analyses were performed to clarify predictive factors for the incidence of HCC. Results. The overall 179 patients were followed up for a median of 97 months. HCC developed in 13 cases. SIR for HCC was 11.6 (95% confidence interval (CI) 6.2–19.8) and SMR for HCC was 11.2 (95% CI 5.4–20.6) in overall patients. Serum albumin level was a predictive factor for the incidence of HCC in overall patients. Conclusions. The incidence and mortality of HCC in PBC patients were significantly higher than those in the Japanese general population. PBC patients with low serum albumin levels are a population at high risk for HCC.
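    The SIR and SMR above are ratios of observed to expected event counts, with confidence intervals obtained by treating the observed count as Poisson. A minimal sketch (using Byar's approximation rather than whatever exact method the authors used; the expected count E = 13/11.6 is back-derived from the record's own numbers):

```python
import math

def sir_with_ci(observed, expected, z=1.96):
    """Standardized incidence (or mortality) ratio O/E with an
    approximate 95% CI via Byar's method for a Poisson count.
    Requires observed > 0."""
    o_lo = observed * (1 - 1 / (9 * observed)
                       - z / (3 * math.sqrt(observed))) ** 3
    o_hi = (observed + 1) * (1 - 1 / (9 * (observed + 1))
                             + z / (3 * math.sqrt(observed + 1))) ** 3
    return observed / expected, o_lo / expected, o_hi / expected

# 13 observed HCC cases against the expected count implied by SIR = 11.6.
sir, lo, hi = sir_with_ci(13, 13 / 11.6)
```

    Rounded to one decimal, Byar's approximation gives 6.2–19.8 here, matching the interval reported in the record.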

  3. Risk factors and likelihood of Campylobacter colonization in broiler flocks

    Directory of Open Access Journals (Sweden)

    SL Kuana

    2007-09-01

    Full Text Available Campylobacter was investigated in cecal droppings, feces, and cloacal swabs of 22 flocks of 3- to 5-week-old broilers. Risk factors and the likelihood of the presence of this agent in these flocks were determined. Management practices, such as cleaning and disinfection, feeding, drinkers, and litter treatments, were assessed. Results were evaluated using the Odds Ratio (OR) test, and their significance was tested by Fisher's test (p<0.05). A Campylobacter prevalence of 81.8% was found in the broiler flocks (18/22), and within positive flocks, it varied between 85 and 100%. Campylobacter incidence among sample types was homogeneous, being 81.8% in cecal droppings, 80.9% in feces, and 80.4% in cloacal swabs (230). Flocks fed by automatic feeding systems presented higher incidence of Campylobacter as compared to those fed by tube feeders. Litter was reused on 63.6% of the farms, and, despite the lack of statistical significance, there was a higher likelihood of Campylobacter incidence when litter was reused. Foot baths were not used in 45.5% of the flocks, and the use of foot baths combined with deficient lime management increased the number of positive flocks, although with no statistical significance. The evaluated parameters were not significantly associated with Campylobacter colonization in the assessed broiler flocks.

  4. Using HPV prevalence to predict cervical cancer incidence.

    Science.gov (United States)

    Sharma, Monisha; Bruni, Laia; Diaz, Mireia; Castellsagué, Xavier; de Sanjosé, Silvia; Bosch, F Xavier; Kim, Jane J

    2013-04-15

    Knowledge of a country's cervical cancer (CC) burden is critical to informing decisions about resource allocation to combat the disease; however, many countries lack cancer registries to provide such data. We developed a prognostic model to estimate CC incidence rates in countries without cancer registries, leveraging information on human papilloma virus (HPV) prevalence, screening, and other country-level factors. We used multivariate linear regression models to identify predictors of CC incidence in 40 countries. We extracted age-specific HPV prevalence (10-year age groups) by country from a meta-analysis in women with normal cytology (N = 40) and matched to most recent CC incidence rates from Cancer Incidence in Five Continents when available (N = 36), or Globocan 2008 (N = 4). We evaluated country-level behavioral, economic, and public health indicators. CC incidence was significantly associated with age-specific HPV prevalence in women aged 35-64 (adjusted R-squared 0.41) ("base model"). Adding geographic region to the base model increased the adjusted R-squared to 0.77, but the further addition of screening was not statistically significant. Similarly, country-level macro-indicators did not improve predictive validity. Age-specific HPV prevalence at older ages was found to be a better predictor of CC incidence than prevalence in women under 35. However, HPV prevalence could not explain the entire CC burden as many factors modify women's risk of progression to cancer. Geographic region seemed to serve as a proxy for these country-level indicators. Our analysis supports the assertion that conducting a population-based HPV survey targeting women over age 35 can be valuable in approximating the CC risk in a given country. Copyright © 2012 UICC.

  5. Sex-dependent independent prediction of incident diabetes by depressive symptoms.

    Science.gov (United States)

    Akbaş-Şimşek, Tuğba; Onat, Altan; Kaya, Adnan; Tusun, Eyyup; Yüksel, Hüsniye; Can, Günay

    2017-12-01

    To study the predictive value of depressive symptoms (DeprS) for type 2 diabetes in a general population of Turkey. Responses to three questions served to assess the sense of depression. Cox regression analyses were used to obtain risk estimates for incident diabetes, after exclusion of prevalent cases of diabetes. Mean follow-up was 5.15 (±1.4) years. Depressive symptoms were present at baseline in 16.2% of the whole study sample, threefold more commonly in women than in men. Reduced physical activity grade was the only significant baseline covariate in men, while younger age and lower blood pressure distinguished women with DeprS from those without. In men, presence of DeprS predicted incident diabetes at a significant 2.58-fold relative risk (95% confidence interval 1.03; 6.44), after adjustment for age, systolic blood pressure, and antidepressant drug usage. When further covariates were added, waist circumference remained the only significant predictor, while DeprS was attenuated to a relative risk of 2.12 (95% confidence interval 0.83; 5.40). DeprS was not associated with diabetes in women, whereas antidepressant drug usage only tended to be positively associated. A gender difference existed in the relationship between DeprS and incident diabetes: DeprS predicted subsequent development of diabetes in men alone, not in women. Copyright © 2016 John Wiley & Sons, Ltd.

  6. Prediction of sound transmission loss through multilayered panels by using Gaussian distribution of directional incident energy

    Science.gov (United States)

    Kang; Ih; Kim; Kim

    2000-03-01

    In this study, a new prediction method is suggested for the sound transmission loss (STL) of multilayered panels of infinite extent. Conventional methods such as the random or field incidence approach often give significant discrepancies in predicting the STL of multilayered panels when compared with experiments. In this paper, appropriate directional distributions of incident energy for predicting the STL of multilayered panels are proposed. In order to find a weighting function to represent the directional distribution of incident energy on the wall of a reverberation chamber, numerical simulations using a ray-tracing technique are carried out. The simulation results reveal that the directional distribution can be approximately expressed by a Gaussian distribution function in terms of the angle of incidence. The Gaussian function is applied to predict the STL of various multilayered panel configurations as well as single panels. Comparisons between measurement and prediction show good agreement, validating the proposed Gaussian function approach.
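    The idea of replacing the uniform (field-incidence) weighting with a Gaussian directional weighting can be sketched for the simplest case, a single limp panel obeying the mass law (the panel mass, frequency, and Gaussian width below are illustrative assumptions, not values from the paper):

```python
import math

def mass_law_tau(theta, f=1000.0, m=10.0, rho_c=415.0):
    """Oblique-incidence transmission coefficient of a limp panel
    (mass law): surface density m [kg/m^2], air impedance rho*c [rayl]."""
    x = math.pi * f * m * math.cos(theta) / rho_c
    return 1.0 / (1.0 + x * x)

def weighted_stl(weight, n=2000, theta_max=math.radians(89.0)):
    """STL from transmission coefficients averaged over incidence angle
    with a directional weighting w(theta) of the incident energy."""
    num = den = 0.0
    d = theta_max / n
    for i in range(n):
        th = (i + 0.5) * d  # midpoint rule
        w = weight(th) * math.sin(th) * math.cos(th)
        num += mass_law_tau(th) * w
        den += w
    return -10.0 * math.log10(num / den)

def uniform(th):
    return 1.0  # classical field-incidence assumption

def gaussian(th):
    return math.exp(-((th / math.radians(40.0)) ** 2))  # illustrative width

stl_uniform = weighted_stl(uniform)
stl_gaussian = weighted_stl(gaussian)
stl_normal = -10.0 * math.log10(mass_law_tau(0.0))  # normal incidence
```

    Because the Gaussian weight de-emphasizes grazing angles, where the panel transmits most, the predicted STL falls between the field-incidence and normal-incidence limits.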

  7. Maximum Likelihood and Restricted Likelihood Solutions in Multiple-Method Studies.

    Science.gov (United States)

    Rukhin, Andrew L

    2011-01-01

    A formulation of the problem of combining data from several sources is discussed in terms of random effects models. The unknown measurement precision is assumed not to be the same for all methods. We investigate maximum likelihood solutions in this model. By representing the likelihood equations as simultaneous polynomial equations, the exact form of the Groebner basis for their stationary points is derived when there are two methods. A parametrization of these solutions which allows their comparison is suggested. A numerical method for solving the likelihood equations is outlined, and an alternative to the maximum likelihood method, the restricted maximum likelihood, is studied. In the situation when the methods' variances are considered known, an upper bound on the between-method variance is obtained. The relationship between likelihood equations and moment-type equations is also discussed.

  8. Regularization parameter selection methods for ill-posed Poisson maximum likelihood estimation

    International Nuclear Information System (INIS)

    Bardsley, Johnathan M; Goldes, John

    2009-01-01

    In image processing applications, image intensity is often measured via the counting of incident photons emitted by the object of interest. In such cases, image data noise is accurately modeled by a Poisson distribution. This motivates the use of Poisson maximum likelihood estimation for image reconstruction. However, when the underlying model equation is ill-posed, regularization is needed. Regularized Poisson likelihood estimation has been studied extensively by the authors, though a problem of high importance remains: the choice of the regularization parameter. We will present three statistically motivated methods for choosing the regularization parameter, and numerical examples will be presented to illustrate their effectiveness
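    For concreteness, the classic EM iteration for Poisson maximum-likelihood image reconstruction is the Richardson-Lucy algorithm; a 1-D sketch is below (the kernel and data are toy values, and stopping after a finite number of iterations is one crude form of regularization, not one of the parameter-choice methods the authors propose):

```python
def convolve(x, k):
    """1-D correlation with a centered kernel, zero-padded edges."""
    h = len(k) // 2
    return [sum(k[j] * x[i + j - h] for j in range(len(k))
                if 0 <= i + j - h < len(x))
            for i in range(len(x))]

def richardson_lucy(data, k, iters):
    """EM iterations for Poisson maximum-likelihood deconvolution."""
    est = [max(d, 1e-12) for d in data]  # positive initial guess
    for _ in range(iters):
        blur = convolve(est, k)
        ratio = [d / max(b, 1e-12) for d, b in zip(data, blur)]
        corr = convolve(ratio, k[::-1])  # adjoint of the blur
        est = [e * c for e, c in zip(est, corr)]
    return est

kernel = [0.25, 0.5, 0.25]            # assumed point-spread function
truth = [0, 0, 10, 0, 0, 0, 5, 0, 0]  # two point sources
data = convolve(truth, kernel)        # noiseless blurred "counts"
restored = richardson_lucy(data, kernel, iters=200)
```

    In practice the iteration count (or an explicit penalty weight) must be chosen carefully; that choice is precisely the regularization-parameter problem the abstract addresses.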

  9. Physical constraints on the likelihood of life on exoplanets

    Science.gov (United States)

    Lingam, Manasvi; Loeb, Abraham

    2018-04-01

    One of the most fundamental questions in exoplanetology is to determine whether a given planet is habitable. We estimate the relative likelihood of a planet's propensity towards habitability by considering key physical characteristics such as the role of temperature on ecological and evolutionary processes, and atmospheric losses via hydrodynamic escape and stellar wind erosion. From our analysis, we demonstrate that Earth-sized exoplanets in the habitable zone around M-dwarfs seemingly display much lower prospects of being habitable relative to Earth, owing to the higher incident ultraviolet fluxes and closer distances to the host star. We illustrate our results by specifically computing the likelihood (of supporting life) for the recently discovered exoplanets, Proxima b and TRAPPIST-1e, which we find to be several orders of magnitude smaller than that of Earth.

  10. Factors predictive for incidence and remission of internet addiction in young adolescents: a prospective study.

    Science.gov (United States)

    Ko, Chih-Hung; Yen, Ju-Yu; Yen, Cheng-Fang; Lin, Huang-Chi; Yang, Ming-Jen

    2007-08-01

    The aim of the study is to determine the incidence and remission rates for Internet addiction and the associated predictive factors in young adolescents over a 1-year follow-up. This was a prospective, population-based investigation. Five hundred seventeen students (267 male and 250 female) were recruited from three junior high schools in southern Taiwan. The factors examined included gender, personality, mental health, self-esteem, family function, life satisfaction, and Internet activities. The results revealed that the 1-year incidence and remission rates for Internet addiction were 7.5% and 49.5%, respectively. High exploratory excitability, low reward dependence, low self-esteem, low family function, and online game playing predicted the emergence of Internet addiction. Further, low hostility and low interpersonal sensitivity predicted remission of Internet addiction. The predictive factors for incidence and remission of Internet addiction identified in this study could inform efforts to prevent Internet addiction and to promote its remission in adolescents.

  11. Meteorology-driven Prediction of RSV/RHV Incidence in Rural Nepal

    Science.gov (United States)

    Scott, Anna; Englund, Janet; Chu, Helen; Tielsch, James; Khatry, Subarna; Leclerq, Steven C; Shrestha, Laxman; Kuypers, Jane; Steinhoff, Mark C; Katz, Joanne

    2017-01-01

    Abstract Background Incidence of respiratory syncytial virus (RSV) and rhinovirus (RHV) varies throughout the year. We aim to quantify the relationship between weather variables (temperature, humidity, precipitation, and aerosol concentration) and disease incidence, in order to determine how outbreaks of RSV and RHV are related to seasonal or sub-seasonal meteorology, and whether these relationships can predict viral outbreaks of RSV and RHV. Methods Health data were collected in a community-based, prospective randomized trial of maternal influenza immunization of pregnant women and their infants conducted in rural Nepal from 2011–2014. Adult illness episodes were defined as fever plus cough, sore throat, runny nose, and/or myalgia, with infant illness defined similarly but without the fever requirement. Cases were identified through longitudinal household-based weekly surveillance. Temperature, humidity, precipitation, and fine particulate matter (PM2.5) data come from the reanalysis products NCEP, ERA-Interim, and MERRA-2, which are produced by assimilating historical in-situ and satellite-based observations into a weather model. Results RSV exhibits a relationship with temperature after removing the seasonal cycle (r = -0.16, N = 208, P = 0.02), and RHV exhibits a similar relationship to daily temperature (r = -0.14, N = 208, P = 0.05). When lagging meteorology by up to 15 weeks, correlations between disease count and weather improve (RSV: r_max = 0.45, P < 0.05; RHV: r_max = 0.15, P = 0.05). We use an SIR model forced by lagged meteorological variables to predict RSV and RHV, suggesting that disease burden can be predicted at lead times of weeks to months. Conclusion Meteorological variables are associated with RSV and RHV incidence in rural Nepal and can be used to drive predictive models with a lead time of several months. Disclosures J. Englund, Gilead: Consultant and Investigator, Research support Chimerix: Investigator, Research support Alios: Investigator
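    A minimal version of the meteorology-forced SIR idea: a discrete weekly SIR model whose transmission rate is modulated by a lagged temperature anomaly (all parameter values, the exponential forcing form, and the synthetic anomaly series are illustrative assumptions, not the fitted model from the study):

```python
import math

def sir_forced(beta0, gamma, anomaly, lag, alpha, s0=0.99, i0=0.01, weeks=52):
    """Weekly SIR dynamics with transmission forced by lagged weather:
    beta_t = beta0 * exp(-alpha * T[t - lag]). New infections use the
    Reed-Frost form s * (1 - exp(-beta * i)) so s can never go negative."""
    s, i = s0, i0
    curve = []
    for t in range(weeks):
        a = anomaly[t - lag] if t >= lag else 0.0
        beta = beta0 * math.exp(-alpha * a)   # colder weeks boost spread
        new_inf = s * (1.0 - math.exp(-beta * i))
        s -= new_inf
        i += new_inf - gamma * i
        curve.append(i)
    return curve

# Synthetic temperature anomaly: negative = colder than seasonal normal.
temps = [math.sin(2.0 * math.pi * t / 52.0) for t in range(52)]
curve = sir_forced(beta0=1.5, gamma=0.7, anomaly=temps, lag=4, alpha=0.5)
```

    Fitting beta0, alpha, and the lag to surveillance counts would give the kind of weeks-to-months-ahead forecast model described in the abstract.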

  12. Estimating likelihood of future crashes for crash-prone drivers

    Directory of Open Access Journals (Sweden)

    Subasish Das

    2015-06-01

    Full Text Available At-fault crash-prone drivers are usually considered the high-risk group for possible future incidents or crashes. In Louisiana, 34% of crashes are repeatedly committed by at-fault crash-prone drivers, who represent only 5% of the total licensed drivers in the state. This research conducted an exploratory data analysis based on driver faultiness and proneness. The objective of this study is to develop a crash prediction model to estimate the likelihood of future crashes for at-fault drivers. The logistic regression method is used, employing eight years of traffic crash data (2004–2011) in Louisiana. Crash predictors such as the driver's crash involvement, crash and road characteristics, human factors, collision type, and environmental factors are considered in the model. The at-fault and not-at-fault status of the crashes is used as the response variable. The developed model identifies a few important variables and correctly classifies at-fault crashes up to 62.40% of the time, with a specificity of 77.25%. This model can identify as many as 62.40% of the crash incidences of at-fault drivers in the upcoming year. Traffic agencies can use the model for monitoring the performance of at-fault crash-prone drivers and making roadway improvements meant to reduce crash proneness. From the findings, it is recommended that crash-prone drivers be targeted regularly for special safety programs through education and regulation.
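    The reported 62.40% correct classification of at-fault crashes and 77.25% specificity are just the true-positive and true-negative rates of the fitted logistic model at a chosen probability cutoff. A sketch of that bookkeeping on made-up scores:

```python
def confusion_rates(y_true, y_prob, threshold=0.5):
    """Sensitivity (true-positive rate) and specificity (true-negative
    rate) of a binary classifier at a given probability threshold."""
    tp = fn = tn = fp = 0
    for y, p in zip(y_true, y_prob):
        pred = 1 if p >= threshold else 0
        if y == 1:
            tp += pred
            fn += 1 - pred
        else:
            tn += 1 - pred
            fp += pred
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity

# Toy predicted probabilities: 1 = at-fault crash in the following year.
y = [1, 1, 1, 1, 0, 0, 0, 0]
p = [0.9, 0.7, 0.6, 0.3, 0.8, 0.4, 0.2, 0.1]
sens, spec = confusion_rates(y, p)  # both 0.75 for these toy values
```

    Moving the threshold trades sensitivity against specificity, which is how a model with modest discrimination can be tuned toward an agency's preferred error balance.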

  13. Applications of Machine learning in Prediction of Breast Cancer Incidence and Mortality

    International Nuclear Information System (INIS)

    Helal, N.; Sarwat, E.

    2012-01-01

    Breast cancer is one of the leading causes of cancer deaths for the female population in both developed and developing countries. In this work we used baseline descriptive data about the incidence (new cancer cases) of in situ breast cancer among Wisconsin females. The documented data were from the most recent 12-year period for which data are available. Wisconsin cancer incidence and mortality (deaths due to cancer) that occurred were also considered in this work. Artificial neural networks (ANN) have been successfully applied to problems in the prediction of the number of new cancer cases and mortality. Using artificial intelligence (AI) in this study, the numbers of new cancer cases and deaths that may occur are predicted.

  14. Incidents Prediction in Road Junctions Using Artificial Neural Networks

    Science.gov (United States)

    Hajji, Tarik; Alami Hassani, Aicha; Ouazzani Jamil, Mohammed

    2018-05-01

    The implementation of an incident detection system (IDS) is an indispensable operation in the analysis of road traffic. However, an IDS can in no case replace the classical monitoring system controlled by the human eye. The aim of this work is to increase the probability of detecting and predicting incidents in camera-monitored areas, given that these areas are monitored by multiple cameras but few supervisors. Our solution is to use Artificial Neural Networks (ANN) to analyze the trajectories of moving objects in captured images. We first propose a model of the trajectories and their characteristics, then we build a learning database of valid and invalid trajectories, and finally we carry out a comparative study to find the artificial neural network architecture that maximizes the recognition rate for valid and invalid trajectories.

  15. Empirical likelihood

    CERN Document Server

    Owen, Art B

    2001-01-01

    Empirical likelihood provides inferences whose validity does not depend on specifying a parametric model for the data. Because it uses a likelihood, the method has certain inherent advantages over resampling methods: it uses the data to determine the shape of the confidence regions, and it makes it easy to combine data from multiple sources. It also facilitates incorporating side information, and it simplifies accounting for censored, truncated, or biased sampling. One of the first books published on the subject, Empirical Likelihood offers an in-depth treatment of this method for constructing confidence regions and testing hypotheses. The author applies empirical likelihood to a range of problems, from those as simple as setting a confidence region for a univariate mean under IID sampling, to problems defined through smooth functions of means, regression models, generalized linear models, estimating equations, or kernel smooths, and to sampling with non-identically distributed data. Abundant figures offer vi...
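    The simplest case the book opens with, a confidence region for a univariate mean, reduces to profiling a Lagrange multiplier. A compact sketch of Owen's construction (bisection on the multiplier; tolerances and data are arbitrary):

```python
import math

def neg2_log_elr(x, mu, tol=1e-12):
    """-2 log empirical likelihood ratio for the mean of x at mu.
    Weights w_i = 1 / (n * (1 + lam * (x_i - mu))); lam solves
    sum((x_i - mu) / (1 + lam * (x_i - mu))) = 0 (Owen's construction)."""
    n = len(x)
    d = [xi - mu for xi in x]
    if min(d) >= 0 or max(d) <= 0:
        return float("inf")  # mu outside the convex hull of the data
    # Keep every weight positive: 1 + lam * d_i > 1/n for all i.
    lo = (1.0 / n - 1.0) / max(d)
    hi = (1.0 / n - 1.0) / min(d)
    def g(lam):  # monotonically decreasing in lam
        return sum(di / (1.0 + lam * di) for di in d)
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if g(mid) > 0:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    return 2.0 * sum(math.log1p(lam * di) for di in d)

x = [1.0, 2.0, 3.0, 4.0]
stat_at_mean = neg2_log_elr(x, 2.5)  # ~0: the sample mean maximizes EL
stat_off_mean = neg2_log_elr(x, 2.0)
```

    The statistic is approximately chi-squared with 1 degree of freedom under the usual conditions, which is what turns this computation into the confidence regions the book describes.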

  16. Approximate Likelihood

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    Most physics results at the LHC end in a likelihood ratio test. This includes discovery and exclusion for searches as well as mass, cross-section, and coupling measurements. The use of Machine Learning (multivariate) algorithms in HEP is mainly restricted to searches, which can be reduced to classification between two fixed distributions: signal vs. background. I will show how we can extend the use of ML classifiers to distributions parameterized by physical quantities like masses and couplings, as well as nuisance parameters associated with systematic uncertainties. This allows one to approximate the likelihood ratio while still using a high-dimensional feature vector for the data. Both the MEM and ABC approaches mentioned above aim to provide inference on model parameters (like cross-sections, masses, couplings, etc.). ABC is fundamentally tied to Bayesian inference and focuses on the "likelihood free" setting where only a simulator is available and one cannot directly compute the likelihood for the dat...

  17. Quantitative measures of meniscus extrusion predict incident radiographic knee osteoarthritis--data from the Osteoarthritis Initiative.

    Science.gov (United States)

    Emmanuel, K; Quinn, E; Niu, J; Guermazi, A; Roemer, F; Wirth, W; Eckstein, F; Felson, D

    2016-02-01

    To test the hypothesis that quantitative measures of meniscus extrusion predict incident radiographic knee osteoarthritis (KOA), prior to the advent of radiographic disease. 206 knees with incident radiographic KOA (Kellgren Lawrence Grade (KLG) 0 or 1 at baseline, developing KLG 2 or greater with a definite osteophyte and joint space narrowing (JSN) grade ≥1 by year 4) were matched to 232 control knees not developing incident KOA. Manual segmentation of the central five slices of the medial and lateral meniscus was performed on coronal 3T DESS MRI and quantitative meniscus position was determined. Cases and controls were compared using conditional logistic regression adjusting for age, sex, BMI, race and clinical site. Sensitivity analyses of early (year [Y] 1/2) and late (Y3/4) incidence were performed. Mean medial extrusion distance was significantly greater for incident compared to non-incident knees (1.56 ± 1.12 mm vs 1.29 ± 0.99 mm; +21%), as was the extruded portion of the medial meniscus (25.8 ± 15.8% vs 22.0 ± 13.5%; +17%). No significant differences were observed for the lateral meniscus in incident medial KOA, or for the tibial plateau coverage between incident and non-incident knees. Restricting the analysis to medial incident KOA at Y1/2, differences were attenuated but reached significance for extrusion distance, whereas no significant differences were observed for incident KOA in Y3/4. Greater medial meniscus extrusion predicts incident radiographic KOA. Early onset KOA showed greater differences in meniscus position between incident and non-incident knees than late onset KOA. Copyright © 2015 Osteoarthritis Research Society International. Published by Elsevier Ltd. All rights reserved.

  18. Adaptive Unscented Kalman Filter using Maximum Likelihood Estimation

    DEFF Research Database (Denmark)

    Mahmoudi, Zeinab; Poulsen, Niels Kjølstad; Madsen, Henrik

    2017-01-01

    The purpose of this study is to develop an adaptive unscented Kalman filter (UKF) by tuning the measurement noise covariance. We use the maximum likelihood estimation (MLE) and the covariance matching (CM) method to estimate the noise covariance. The multi-step prediction errors generated...
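    The covariance-matching idea can be illustrated in the simplest linear setting: a scalar Kalman filter tracking a noisy constant, with the measurement-noise variance R re-estimated online from a window of innovations. This is a toy linear analogue of the paper's UKF scheme, and all tuning values are illustrative:

```python
import random

def adaptive_kalman(measurements, q=1e-4, window=30):
    """Scalar Kalman filter for a random-walk state, with the
    measurement-noise variance R estimated by covariance matching:
    R ~ mean(innovation^2) - H P H' (clipped to stay positive)."""
    x, p, r = measurements[0], 1.0, 1.0
    innovations = []
    estimates = []
    for z in measurements[1:]:
        p += q                      # predict (state is a random walk)
        nu = z - x                  # innovation
        innovations.append(nu)
        recent = innovations[-window:]
        c = sum(v * v for v in recent) / len(recent)
        r = max(c - p, 1e-8)        # covariance-matching update of R
        k = p / (p + r)             # Kalman gain
        x += k * nu
        p *= 1.0 - k
        estimates.append(x)
    return estimates, r

random.seed(0)
true_r = 0.25  # true measurement-noise variance
zs = [1.0 + random.gauss(0.0, true_r ** 0.5) for _ in range(500)]
est, r_hat = adaptive_kalman(zs)
```

    After convergence the innovation variance is approximately H P H' + R, so subtracting the predicted component recovers R; the paper applies the same matching (and an MLE alternative) inside an unscented filter.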

  19. Quantitative measures of meniscus extrusion predict incident radiographic knee osteoarthritis – data from the Osteoarthritis Initiative

    Science.gov (United States)

    Emmanuel, K.; Quinn, E.; Niu, J.; Guermazi, A.; Roemer, F.; Wirth, W.; Eckstein, F.; Felson, D.

    2017-01-01

    SUMMARY Objective To test the hypothesis that quantitative measures of meniscus extrusion predict incident radiographic knee osteoarthritis (KOA), prior to the advent of radiographic disease. Methods 206 knees with incident radiographic KOA (Kellgren Lawrence Grade (KLG) 0 or 1 at baseline, developing KLG 2 or greater with a definite osteophyte and joint space narrowing (JSN) grade ≥1 by year 4) were matched to 232 control knees not developing incident KOA. Manual segmentation of the central five slices of the medial and lateral meniscus was performed on coronal 3T DESS MRI and quantitative meniscus position was determined. Cases and controls were compared using conditional logistic regression adjusting for age, sex, BMI, race and clinical site. Sensitivity analyses of early (year [Y] 1/2) and late (Y3/4) incidence were performed. Results Mean medial extrusion distance was significantly greater for incident compared to non-incident knees (1.56 ± 1.12 mm vs 1.29 ± 0.99 mm; +21%), as was the extruded portion of the medial meniscus (25.8 ± 15.8% vs 22.0 ± 13.5%; +17%). No significant differences were observed for the lateral meniscus in incident medial KOA, or for the tibial plateau coverage between incident and non-incident knees. Restricting the analysis to medial incident KOA at Y1/2, differences were attenuated but reached significance for extrusion distance, whereas no significant differences were observed for incident KOA in Y3/4. Conclusion Greater medial meniscus extrusion predicts incident radiographic KOA. Early onset KOA showed greater differences in meniscus position between incident and non-incident knees than late onset KOA. PMID:26318658

  20. Predicting mortality and incident immobility in older Belgian men by characteristics related to sarcopenia and frailty

    DEFF Research Database (Denmark)

    Kruse, C; Goemaere, S; De Buyser, S

    2018-01-01

    There is an increasing awareness of sarcopenia in older people. We applied machine learning principles to predict mortality and incident immobility in older Belgian men through sarcopenia and frailty characteristics. Mortality could be predicted with good accuracy; serum 25-hydroxyvitamin D and bone mineral density scores were the most important predictors. INTRODUCTION: Machine learning principles were used to predict 5-year mortality and 3-year incident severe immobility in a population of older men by frailty and sarcopenia characteristics. METHODS: Using prospective data from 1997 on 264… …the most important predictors of immobility. Sarcopenia assessed by lean mass estimates was relevant to mortality prediction but not immobility prediction. CONCLUSIONS: Using advanced statistical models and a machine learning approach, 5-year mortality can be predicted with good accuracy using a Bayesian…

  1. PSA predicts development of incident lower urinary tract symptoms: results from the REDUCE study.

    Science.gov (United States)

    Patel, Devin N; Feng, Tom; Simon, Ross M; Howard, Lauren E; Vidal, Adriana C; Moreira, Daniel M; Castro-Santamaria, Ramiro; Roehrborn, Claus; Andriole, Gerald L; Freedland, Stephen J

    2018-05-23

    The relationship between baseline prostate-specific antigen (PSA) and development of lower urinary tract symptoms (LUTS) in asymptomatic and mildly symptomatic men is unclear. We sought to determine if PSA predicts incident LUTS in these men. A post-hoc analysis of the 4-year REDUCE study was performed to assess for incident LUTS in 1534 men with mild to no LUTS at baseline. The primary aim was to determine whether PSA independently predicted incident LUTS after adjusting for the key clinical variables of age, prostate size, and baseline International Prostate Symptom Score (IPSS). Incident LUTS was defined as the first report of medical treatment, surgery, or sustained clinically significant symptoms (two IPSS >14). Cox proportional hazards, cumulative incidence curves, and the log-rank test were used to test our hypothesis. Of the 1534 men with low baseline IPSS, 335 had PSA 2.5-4 ng/mL, 589 had PSA 4.1-6 ng/mL, and 610 had PSA 6-10 ng/mL. During the 4-year study, 196 men progressed to incident LUTS (50.5% medical treatment, 9% surgery, and 40.5% new symptoms). As a continuous variable, higher PSA was associated with increased incident LUTS on univariable (HR 1.09, p = 0.019) and multivariable (HR 1.08, p = 0.040) analysis. Likewise, baseline PSA 6-10 ng/mL was associated with increased incident LUTS vs. PSA 2.5-4 ng/mL in adjusted models (HR 1.68, p = 0.016). This association was also observed in men with PSA 4.1-6 ng/mL vs. PSA 2.5-4 ng/mL (HR 1.60, p = 0.032). Men with mild to no LUTS but increased baseline PSA are at increased risk of developing incident LUTS, presumed due to benign prostatic hyperplasia.

  2. Incidence of atrial fibrillation and its risk prediction model based on a prospective urban Han Chinese cohort.

    Science.gov (United States)

    Ding, L; Li, J; Wang, C; Li, X; Su, Q; Zhang, G; Xue, F

    2017-09-01

Prediction models for atrial fibrillation (AF) have been developed; however, no AF prediction model has been validated in a Chinese population. We therefore aimed to investigate the incidence of AF in an urban Han Chinese health check-up population and to develop AF prediction models using behavioral, anthropometric, biochemical, and electrocardiogram (ECG) markers, as well as visit-to-visit variability (VVV) in blood pressures, all available in the routine health check-up. A total of 33 186 participants aged 45-85 years and free of AF at baseline were included in this cohort and followed for incident AF through annual routine health check-ups. Cox regression models were used to develop the AF prediction models, and 10-fold cross-validation was used to test the discriminatory accuracy of each prediction model. We developed three prediction models to estimate the risk of incident AF: a simple model with age, sex, history of coronary heart disease (CHD), and hypertension as predictors; an ECG model with left high-amplitude waves and premature beats added; and a VVV model with age, sex, history of CHD, and VVV in systolic and diastolic blood pressures as predictors. The calibration of our models ranged from 1.001 to 1.004 (P for Hosmer-Lemeshow test >0.05). The areas under the receiver operating characteristic curve were 78%, 80%, and 82%, respectively, for predicting the risk of AF. In conclusion, we have identified predictors of incident AF and developed prediction models for AF with variables readily available in the routine health check-up.
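Discriminatory accuracy of the kind reported above is typically summarized by a concordance statistic. The sketch below is a generic, hypothetical implementation of Harrell's C-index for time-to-event data, shown only to make the measure concrete; it is not the authors' validation code.

```python
def concordance_index(times, events, risk_scores):
    """Harrell's C: among usable pairs, the fraction where the subject who
    fails earlier was assigned the higher risk score (ties count 1/2)."""
    concordant, usable = 0.0, 0
    n = len(times)
    for i in range(n):
        if not events[i]:
            continue  # subject i must have an observed event
        for j in range(n):
            if times[i] < times[j]:  # j outlived i, so the pair is comparable
                usable += 1
                if risk_scores[i] > risk_scores[j]:
                    concordant += 1.0
                elif risk_scores[i] == risk_scores[j]:
                    concordant += 0.5
    return concordant / usable

# Perfectly ranked toy data: earlier failures have strictly higher scores.
print(concordance_index([1, 2, 3, 4], [1, 1, 1, 1], [0.9, 0.7, 0.5, 0.3]))  # 1.0
```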

  3. Progression of diffuse esophageal spasm to achalasia: incidence and predictive factors.

    Science.gov (United States)

    Fontes, L H S; Herbella, F A M; Rodriguez, T N; Trivino, T; Farah, J F M

    2013-07-01

The progression of certain primary esophageal motor disorders to achalasia has been documented; however, the true incidence of this progression is still elusive. This study aims to evaluate: (i) the incidence of progression of diffuse esophageal spasm to achalasia, and (ii) predictive factors for this progression. Thirty-five patients (mean age 53 years, 80% female) with a manometric picture of diffuse esophageal spasm were followed for at least 1 year. Patients with gastroesophageal reflux disease confirmed by pH monitoring, or with systemic diseases that may affect esophageal motility, were excluded. Esophageal manometry was repeated in all patients. Five (14%) of the patients progressed to achalasia at a mean follow-up of 2.1 (range 1-4) years. Demographic characteristics were not predictive of transition to achalasia, whereas dysphagia as the main symptom (P = 0.005) and a simultaneous-wave amplitude below 50 mmHg (P = 0.003) were statistically significant predictors. In conclusion, the transition of diffuse esophageal spasm to achalasia is infrequent at a 2-year follow-up. Dysphagia and low-amplitude simultaneous waves are predictive factors for this degeneration. © 2012 Copyright the Authors. Journal compilation © 2012, Wiley Periodicals, Inc. and the International Society for Diseases of the Esophagus.

  4. Prostate-specific antigen and long-term prediction of prostate cancer incidence and mortality in the general population

    DEFF Research Database (Denmark)

    Ørsted, David Dynnes; Nordestgaard, Børge G; Jensen, Gorm B

    2012-01-01

It is largely unknown whether prostate-specific antigen (PSA) level at the first date of testing predicts long-term risk of prostate cancer (PCa) incidence and mortality in the general population.

  5. A Game Theoretical Approach to Hacktivism: Is Attack Likelihood a Product of Risks and Payoffs?

    Science.gov (United States)

    Bodford, Jessica E; Kwan, Virginia S Y

    2018-02-01

The current study examines hacktivism (i.e., hacking to convey a moral, ethical, or social justice message) through a general game-theoretic framework, that is, as a product of costs and benefits. Given the inherent risk of carrying out a hacktivist attack (e.g., legal action, imprisonment), it would be rational for the user to weigh these risks against the perceived benefits of carrying out the attack. As such, we examined computer science students' estimations of risks, payoffs, and attack likelihood through a game-theoretic design. Furthermore, this study aims to construct a descriptive profile of potential hacktivists, exploring two predicted covariates of attack decision making, namely, peer prevalence of hacking and sex differences. Contrary to expectations, results suggest that participants' estimations of attack likelihood stemmed solely from expected payoffs rather than from subjective risks. Peer prevalence significantly predicted increased payoffs and attack likelihood, suggesting an underlying descriptive norm in social networks. Notably, we observed no sex differences in the decision to attack, nor in the factors predicting attack likelihood. Implications for policymakers and for the understanding and prevention of hacktivism are discussed, as are the possible ramifications of widely communicated payoffs over potential risks in hacking communities.
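The game-theoretic framing above reduces the attack decision to expected benefits minus expected costs. A bare-bones sketch of that cost-benefit comparison (all quantities and function names are hypothetical; the study itself elicited subjective estimates from participants rather than computing these formulas):

```python
def expected_utility(p_success, payoff, p_caught, penalty):
    """Expected value of attacking: expected payoff minus expected cost."""
    return p_success * payoff - p_caught * penalty

def attack_is_rational(p_success, payoff, p_caught, penalty):
    """A risk-neutral actor attacks only when expected utility is positive."""
    return expected_utility(p_success, payoff, p_caught, penalty) > 0

print(attack_is_rational(0.9, 10.0, 0.5, 100.0))  # False: legal risk dominates
print(attack_is_rational(0.9, 100.0, 0.1, 10.0))  # True: payoff dominates
```

The study's finding that likelihood tracked payoffs alone would correspond, in this toy model, to participants behaving as if the cost term were weighted near zero.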

  6. Patterns and Trends of Liver Cancer Incidence Rates in Eastern and Southeastern Asian Countries (1983-2007) and Predictions to 2030.

    Science.gov (United States)

    Wu, Jie; Yang, Shigui; Xu, Kaijin; Ding, Cheng; Zhou, Yuqing; Fu, Xiaofang; Li, Yiping; Deng, Min; Wang, Chencheng; Liu, Xiaoxiao; Li, Lanjuan

    2018-05-01

    We examined temporal trends in liver cancer incidence rates overall and by histological type from 1983 through 2007. We predict trends in liver cancer incidence rates through 2030 for selected Eastern and Southeastern Asian countries. Data on yearly liver cancer incident cases by age group and sex were drawn from 6 major selected Eastern and Southeastern Asian countries or regions with cancer registries available in the CI5plus database, including China, Japan, Hong Kong Special Administrative Region (SAR), the Philippines, Singapore, and Thailand. We also analyzed data for the United States and Australia for comparative purposes. Age-standardized incidence rates were calculated and plotted from 1983 through 2007. Numbers of new cases and incidence rates were predicted through 2030 by fitting and extrapolating age-period-cohort models. The incidence rates of liver cancer have been decreasing, and decreases will continue in all selected Eastern and Southeastern Asian countries, except for Thailand, whose liver cancer incidence rate will increase due to the increasing incidence rate of intrahepatic cholangiocarcinomas. Even though the incidence rates of liver cancer are predicted to decrease in most Eastern and Southeastern Asian countries, the burden, in terms of new cases, will continue to increase because of population growth and aging. Based on an analysis of data from cancer registries from Asian countries, incidence rates of liver cancer are expected to decrease through 2030 in most Eastern and Southeastern Asian countries. However, in Thailand, the incidence rate of intrahepatic cholangiocarcinomas is predicted to increase, so health education programs are necessary. Copyright © 2018 AGA Institute. Published by Elsevier Inc. All rights reserved.
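The trends above rest on age-standardized incidence rates. Direct standardization is simple: weight each age-specific rate by a fixed standard population. A minimal sketch with hypothetical numbers (the paper itself uses CI5plus registry data and an external standard population):

```python
def age_standardized_rate(cases, person_years, std_pop):
    """Directly standardized rate per 100,000: the standard-population-weighted
    average of the age-specific rates."""
    assert len(cases) == len(person_years) == len(std_pop)
    weighted = sum((c / py) * w for c, py, w in zip(cases, person_years, std_pop))
    return 100_000 * weighted / sum(std_pop)

# Two age groups with age-specific rates of 100 and 200 per 100,000 and
# equal standard-population weights give an ASR of 150 per 100,000.
print(age_standardized_rate([10, 20], [10_000, 10_000], [50, 50]))  # 150.0
```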

  7. Prediction of Safety Incidents

    Data.gov (United States)

    National Aeronautics and Space Administration — Safety incidents, including injuries, property damage and mission failures, cost NASA and contractors thousands of dollars in direct and indirect costs. This project...

  8. The incidence of bacterial endosymbionts in terrestrial arthropods.

    Science.gov (United States)

    Weinert, Lucy A; Araujo-Jnr, Eli V; Ahmed, Muhammad Z; Welch, John J

    2015-05-22

    Intracellular endosymbiotic bacteria are found in many terrestrial arthropods and have a profound influence on host biology. A basic question about these symbionts is why they infect the hosts that they do, but estimating symbiont incidence (the proportion of potential host species that are actually infected) is complicated by dynamic or low prevalence infections. We develop a maximum-likelihood approach to estimating incidence, and testing hypotheses about its variation. We apply our method to a database of screens for bacterial symbionts, containing more than 3600 distinct arthropod species and more than 150 000 individual arthropods. After accounting for sampling bias, we estimate that 52% (CIs: 48-57) of arthropod species are infected with Wolbachia, 24% (CIs: 20-42) with Rickettsia and 13% (CIs: 13-55) with Cardinium. We then show that these differences stem from the significantly reduced incidence of Rickettsia and Cardinium in most hexapod orders, which might be explained by evolutionary differences in the arthropod immune response. Finally, we test the prediction that symbiont incidence should be higher in speciose host clades. But while some groups do show a trend for more infection in species-rich families, the correlations are generally weak and inconsistent. These results argue against a major role for parasitic symbionts in driving arthropod diversification. © 2015 The Author(s) Published by the Royal Society. All rights reserved.
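The incidence estimator described above must separate "species not infected" from "infection missed because within-species prevalence is low". A toy version of that idea, assuming a single known within-species prevalence and maximizing the likelihood on a grid; the function names and this simplification are mine, not the authors' model:

```python
import math

def log_likelihood(incidence, prevalence, screens):
    """screens: list of (n_individuals_tested, any_positive_found).

    An infected species yields at least one positive with probability
    1 - (1 - prevalence)**n; an uninfected species never does.
    """
    ll = 0.0
    for n, positive in screens:
        p_pos = incidence * (1.0 - (1.0 - prevalence) ** n)
        ll += math.log(p_pos) if positive else math.log(1.0 - p_pos)
    return ll

def mle_incidence(prevalence, screens, grid=2000):
    """Grid-search maximum-likelihood estimate of incidence on (0, 1)."""
    return max((log_likelihood(k / grid, prevalence, screens), k / grid)
               for k in range(1, grid))[1]

# Sanity check: with perfect detection (prevalence 1.0), the MLE reduces to
# the observed proportion of positive species.
screens = [(1, True)] * 52 + [(1, False)] * 48
print(mle_incidence(1.0, screens))  # 0.52
```

With prevalence below 1 and small per-species sample sizes, the same likelihood pushes the estimate above the raw positive fraction, which is the sampling-bias correction the abstract alludes to.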

  9. Likelihood estimators for multivariate extremes

    KAUST Repository

Huser, Raphaël; Davison, Anthony C.; Genton, Marc G.

    2015-01-01

    The main approach to inference for multivariate extremes consists in approximating the joint upper tail of the observations by a parametric family arising in the limit for extreme events. The latter may be expressed in terms of componentwise maxima, high threshold exceedances or point processes, yielding different but related asymptotic characterizations and estimators. The present paper clarifies the connections between the main likelihood estimators, and assesses their practical performance. We investigate their ability to estimate the extremal dependence structure and to predict future extremes, using exact calculations and simulation, in the case of the logistic model.

  11. Temporal Trends and Future Prediction of Breast Cancer Incidence Across Age Groups in Trivandrum, South India.

    Science.gov (United States)

    Mathew, Aleyamma; George, Preethi Sara; Arjunan, Asha; Augustine, Paul; Kalavathy, Mc; Padmakumari, G; Mathew, Beela Sarah

    2016-01-01

Increasing breast cancer (BC) incidence rates have been reported from India; the causal factors for this increased incidence are not understood, and diagnosis is mostly at advanced stages. Trivandrum exhibits the highest BC incidence rates in India. This study aimed to estimate trends in incidence by age from 2005-2014, to predict rates through 2020, and to assess the stage at diagnosis of BC in Trivandrum. BC cases were obtained from the Population Based Cancer Registry, Trivandrum. The distribution of stage at diagnosis and incidence rates of BC [age-specific (ASpR), crude (CR) and age-standardized (ASR)] are described, with a joinpoint regression model used to estimate average annual percent changes (AAPC) and a Bayesian model used to estimate predicted rates. BC accounts for 31% (2681/8737) of all female cancers in Trivandrum. Thirty-five percent (944/2681) were aged 60 years or above, and the overall CR is predicted to be 80 (ASR: 57) for 2019-20. BC, mostly diagnosed at advanced stages, is rising rapidly in South India, with large increases likely in the future, particularly among post-menopausal women. This increase might be due to aging and/or changes in lifestyle factors. The reasons for the increased incidence and late-stage diagnosis need to be studied.

  12. Model variations in predicting incidence of Plasmodium falciparum malaria using 1998-2007 morbidity and meteorological data from south Ethiopia.

    Science.gov (United States)

    Loha, Eskindir; Lindtjørn, Bernt

    2010-06-16

Malaria transmission is complex and is believed to be associated with local climate changes. However, simple attempts to extrapolate malaria incidence rates from averaged regional meteorological conditions have proven unsuccessful. Therefore, the objective of this study was to determine whether variations in specific meteorological factors are able to consistently predict P. falciparum malaria incidence at different locations in south Ethiopia. Retrospective data from 42 locations were collected, including P. falciparum malaria incidence for the period 1998-2007 and meteorological variables such as monthly rainfall (all locations), temperature (17 locations), and relative humidity (three locations). Thirty-five data sets qualified for the analysis. The Ljung-Box Q statistic was used for model diagnosis, and R squared or stationary R squared was taken as the goodness-of-fit measure. Time series modelling was carried out using transfer function (TF) models, and univariate auto-regressive integrated moving average (ARIMA) models when there was no significant meteorological predictor. Of the 35 models, five were discarded because of significant Ljung-Box Q statistics. Past P. falciparum malaria incidence alone (17 locations), or coupled with meteorological variables (four locations), was able to predict P. falciparum malaria incidence within statistical significance. All seasonal ARIMA orders were from locations at altitudes above 1742 m. Monthly rainfall, minimum temperature, and maximum temperature were able to predict incidence at four, five, and two locations, respectively. In contrast, relative humidity was not able to predict P. falciparum malaria incidence. The R squared values for the models ranged from 16% to 97%, with the exception of one model which had a negative value. Models with seasonal ARIMA orders were found to perform better. However, the models for predicting P. falciparum malaria incidence varied from location to location, and among lagged effects, data
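Several of the site-specific models above are univariate ARIMA fits. As a minimal illustration of the autoregressive core, the sketch below fits an AR(1) model x[t] = c + phi*x[t-1] + e[t] by conditional least squares and iterates it forward; it is a toy stand-in, not the transfer-function machinery used in the study.

```python
def fit_ar1(series):
    """Conditional least squares for x[t] = c + phi * x[t-1] + e[t]."""
    xs, ys = series[:-1], series[1:]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    phi = (sum((a - mx) * (b - my) for a, b in zip(xs, ys))
           / sum((a - mx) ** 2 for a in xs))
    return my - phi * mx, phi  # (c, phi)

def forecast(last_value, c, phi, steps):
    """Iterate the fitted recursion forward from the last observation."""
    out, x = [], last_value
    for _ in range(steps):
        x = c + phi * x
        out.append(x)
    return out

# Noise-free series generated by x[t] = 2 + 0.5 * x[t-1]; the fit is exact.
series = [0.0]
for _ in range(7):
    series.append(2.0 + 0.5 * series[-1])
c, phi = fit_ar1(series)
print(round(c, 6), round(phi, 6))  # 2.0 0.5
print(forecast(series[-1], c, phi, 3))
```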

  13. Model variations in predicting incidence of Plasmodium falciparum malaria using 1998-2007 morbidity and meteorological data from south Ethiopia

    Directory of Open Access Journals (Sweden)

    Loha Eskindir

    2010-06-01

Full Text Available Abstract Background Malaria transmission is complex and is believed to be associated with local climate changes. However, simple attempts to extrapolate malaria incidence rates from averaged regional meteorological conditions have proven unsuccessful. Therefore, the objective of this study was to determine whether variations in specific meteorological factors are able to consistently predict P. falciparum malaria incidence at different locations in south Ethiopia. Methods Retrospective data from 42 locations were collected, including P. falciparum malaria incidence for the period 1998-2007 and meteorological variables such as monthly rainfall (all locations), temperature (17 locations), and relative humidity (three locations). Thirty-five data sets qualified for the analysis. The Ljung-Box Q statistic was used for model diagnosis, and R squared or stationary R squared was taken as the goodness-of-fit measure. Time series modelling was carried out using transfer function (TF) models, and univariate auto-regressive integrated moving average (ARIMA) models when there was no significant meteorological predictor. Results Of the 35 models, five were discarded because of significant Ljung-Box Q statistics. Past P. falciparum malaria incidence alone (17 locations), or coupled with meteorological variables (four locations), was able to predict P. falciparum malaria incidence within statistical significance. All seasonal ARIMA orders were from locations at altitudes above 1742 m. Monthly rainfall, minimum temperature, and maximum temperature were able to predict incidence at four, five, and two locations, respectively. In contrast, relative humidity was not able to predict P. falciparum malaria incidence. The R squared values for the models ranged from 16% to 97%, with the exception of one model which had a negative value. Models with seasonal ARIMA orders were found to perform better. However, the models for predicting P. falciparum malaria incidence varied from location

  14. Dissociating response conflict and error likelihood in anterior cingulate cortex.

    Science.gov (United States)

    Yeung, Nick; Nieuwenhuis, Sander

    2009-11-18

    Neuroimaging studies consistently report activity in anterior cingulate cortex (ACC) in conditions of high cognitive demand, leading to the view that ACC plays a crucial role in the control of cognitive processes. According to one prominent theory, the sensitivity of ACC to task difficulty reflects its role in monitoring for the occurrence of competition, or "conflict," between responses to signal the need for increased cognitive control. However, a contrasting theory proposes that ACC is the recipient rather than source of monitoring signals, and that ACC activity observed in relation to task demand reflects the role of this region in learning about the likelihood of errors. Response conflict and error likelihood are typically confounded, making the theories difficult to distinguish empirically. The present research therefore used detailed computational simulations to derive contrasting predictions regarding ACC activity and error rate as a function of response speed. The simulations demonstrated a clear dissociation between conflict and error likelihood: fast response trials are associated with low conflict but high error likelihood, whereas slow response trials show the opposite pattern. Using the N2 component as an index of ACC activity, an EEG study demonstrated that when conflict and error likelihood are dissociated in this way, ACC activity tracks conflict and is negatively correlated with error likelihood. These findings support the conflict-monitoring theory and suggest that, in speeded decision tasks, ACC activity reflects current task demands rather than the retrospective coding of past performance.

  15. The phylogenetic likelihood library.

    Science.gov (United States)

    Flouri, T; Izquierdo-Carrasco, F; Darriba, D; Aberer, A J; Nguyen, L-T; Minh, B Q; Von Haeseler, A; Stamatakis, A

    2015-03-01

We introduce the Phylogenetic Likelihood Library (PLL), a highly optimized application programming interface for developing likelihood-based phylogenetic inference and post-analysis software. The PLL implements appropriate data structures and functions that allow users to quickly implement common, error-prone, and labor-intensive tasks, such as likelihood calculations, optimization of model parameters and branch lengths, and tree space exploration. The highly optimized and parallelized implementation of the phylogenetic likelihood function and a thorough documentation provide a framework for the rapid development of scalable parallel phylogenetic software. Using two likelihood-based phylogenetic codes as examples, we show that the PLL improves the sequential performance of current software by a factor of 2-10 while requiring only 1 month of programming time for integration. We show that, when numerical scaling for preventing floating-point underflow is enabled, the double-precision likelihood calculations in the PLL are up to 1.9 times faster than those in BEAGLE. On an empirical DNA dataset with 2000 taxa, the AVX version of the PLL is 4 times faster than BEAGLE (scaling enabled and required). The PLL is available at http://www.libpll.org under the GNU General Public License (GPL). © The Author(s) 2014. Published by Oxford University Press, on behalf of the Society of Systematic Biologists.

  16. Conditional predictive inference for online surveillance of spatial disease incidence

    Science.gov (United States)

    Corberán-Vallet, Ana; Lawson, Andrew B.

    2012-01-01

    This paper deals with the development of statistical methodology for timely detection of incident disease clusters in space and time. The increasing availability of data on both the time and the location of events enables the construction of multivariate surveillance techniques, which may enhance the ability to detect localized clusters of disease relative to the surveillance of the overall count of disease cases across the entire study region. We introduce the surveillance conditional predictive ordinate as a general Bayesian model-based surveillance technique that allows us to detect small areas of increased disease incidence when spatial data are available. To address the problem of multiple comparisons, we incorporate a common probability that each small area signals an alarm when no change in the risk pattern of disease takes place into the analysis. We investigate the performance of the proposed surveillance technique within the framework of Bayesian hierarchical Poisson models using a simulation study. Finally, we present a case study of salmonellosis in South Carolina. PMID:21898522
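The surveillance conditional predictive ordinate asks how surprising each new observation is under the model fitted to earlier data. A conjugate Poisson-Gamma toy version of that idea is sketched below; it is not the authors' spatial model, and the prior parameters and alarm threshold are arbitrary choices for illustration.

```python
from math import exp, lgamma, log

def log_predictive_pmf(y, a, b):
    """Posterior predictive (negative binomial) pmf for a Poisson count whose
    rate has a Gamma(a, b) posterior (shape a, rate b), unit exposure."""
    return (lgamma(a + y) - lgamma(a) - lgamma(y + 1)
            + a * log(b / (b + 1.0)) - y * log(b + 1.0))

def surveillance_tail_prob(history, new_count, a0=0.5, b0=0.001):
    """P(Y >= new_count) under the predictive distribution given past counts."""
    a, b = a0 + sum(history), b0 + len(history)
    return 1.0 - sum(exp(log_predictive_pmf(y, a, b)) for y in range(new_count))

def alarm(history, new_count, threshold=0.05):
    """Signal when the new count sits in the far upper predictive tail."""
    return surveillance_tail_prob(history, new_count) < threshold

history = [2, 3, 2, 3, 2, 3, 2, 3, 2, 3] * 2   # stable baseline, mean 2.5
print(alarm(history, 20))  # True: a large spike is highly surprising
print(alarm(history, 3))   # False: an ordinary count raises no alarm
```

A multi-area version would compute such a predictive check per small area and, as the abstract notes, would then need an explicit correction for multiple comparisons.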

  17. Predicting likelihood of seeking help through the employee assistance program among salaried and union hourly employees.

    Science.gov (United States)

    Delaney, W; Grube, J W; Ames, G M

    1998-03-01

    This research investigated belief, social support and background predictors of employee likelihood to use an Employee Assistance Program (EAP) for a drinking problem. An anonymous cross-sectional survey was administered in the home. Bivariate analyses and simultaneous equations path analysis were used to explore a model of EAP use. Survey and ethnographic research were conducted in a unionized heavy machinery manufacturing plant in the central states of the United States. A random sample of 852 hourly and salaried employees was selected. In addition to background variables, measures included: likelihood of going to an EAP for a drinking problem, belief the EAP can help, social support for the EAP from co-workers/others, belief that EAP use will harm employment, and supervisor encourages the EAP for potential drinking problems. Belief in EAP efficacy directly increased the likelihood of going to an EAP. Greater perceived social support and supervisor encouragement increased the likelihood of going to an EAP both directly and indirectly through perceived EAP efficacy. Black and union hourly employees were more likely to say they would use an EAP. Males and those who reported drinking during working hours were less likely to say they would use an EAP for a drinking problem. EAP beliefs and social support have significant effects on likelihood to go to an EAP for a drinking problem. EAPs may wish to focus their efforts on creating an environment where there is social support from coworkers and encouragement from supervisors for using EAP services. Union networks and team members have an important role to play in addition to conventional supervisor intervention.

  18. Erythema nodosum and the risk of tuberculosis in a high incidence setting

    DEFF Research Database (Denmark)

    Bjorn-Mortensen, Karen; Ladefoged, Karin; Simonsen, Jacob

    2016-01-01

    OBJECTIVE: This study estimates the erythema nodosum (EN) incidence in a tuberculosis (TB) endemic setting and evaluates the likelihood of a subsequent TB diagnosis among individuals with Mycobacterium tuberculosis infection (MTI) with or without EN. DESIGN: We estimated EN incidence rates (IRs...

  19. Emphysema predicts hospitalisation and incident airflow obstruction among older smokers: a prospective cohort study.

    Directory of Open Access Journals (Sweden)

    David A McAllister

Full Text Available Emphysema on CT is common in older smokers. We hypothesised that emphysema on CT predicts acute episodes of care for chronic lower respiratory disease among older smokers. Participants in a lung cancer screening study aged ≥ 60 years were recruited into a prospective cohort study in 2001-02. Two radiologists independently visually assessed the severity of emphysema as absent, mild, moderate or severe. Percent emphysema was defined as the proportion of voxels ≤ -910 Hounsfield Units. Participants completed a median of 5 visits over a median of 6 years of follow-up. The primary outcome was hospitalization, emergency room or urgent office visit for chronic lower respiratory disease. Spirometry was performed following ATS/ERS guidelines. Airflow obstruction was defined as FEV1/FVC ratio <0.70 and FEV1 <80% predicted. Of 521 participants, 4% had moderate or severe emphysema, which was associated with acute episodes of care (rate ratio 1.89; 95% CI: 1.01-3.52) adjusting for age, sex and race/ethnicity, as was percent emphysema, with similar associations for hospitalisation. Emphysema on visual assessment also predicted incident airflow obstruction (HR 5.14; 95% CI 2.19-21.1). Visually assessed emphysema and percent emphysema on CT predicted acute episodes of care for chronic lower respiratory disease, with the former also predicting incident airflow obstruction among older smokers.

  20. Increase in breast cancer incidence among older women in Mumbai: 30-year trends and predictions to 2025.

    Science.gov (United States)

    Dikshit, Rajesh P; Yeole, B B; Nagrani, Rajini; Dhillon, P; Badwe, R; Bray, Freddie

    2012-08-01

Increasing trends in the incidence of breast cancer have been observed in India, including Mumbai. These have likely stemmed from the increasing adoption of lifestyle factors more akin to those commonly observed in westernized countries. Analyses of breast cancer trends and corresponding estimation of the future burden are necessary to better plan rational cancer control programmes within the country. We used data from the population-based Mumbai Cancer Registry to study time trends in breast cancer incidence rates 1976-2005, stratified into younger (25-49) and older (50-74) age groups. Age-period-cohort models were fitted and the net drift used as a measure of the estimated annual percentage change (EAPC). Age-period-cohort models and population projections were used to predict the age-adjusted rates and number of breast cancer cases circa 2025. Breast cancer incidence increased significantly among older women over the three decades (EAPC = 1.6%; 95% CI 1.1-2.0), while a smaller but still significant 1% increase in incidence among younger women was observed (EAPC = 1.0; 95% CI 0.2-1.8). Non-linear period and cohort effects were observed; a trends-based model predicted a close-to-doubling of incident cases by 2025, from 1300 mean cases per annum in 2001-2005 to over 2500 cases in 2021-2025. The incidence of breast cancer has increased in Mumbai during the last two to three decades, with greater increases among older women. The number of breast cancer cases is predicted to double to over 2500 cases, the vast majority affecting older women. Copyright © 2012 Elsevier Ltd. All rights reserved.
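The EAPC reported above comes from the net drift of an age-period-cohort model, but its simplest form is a log-linear regression of rates on calendar year. A self-contained sketch with made-up rates (the 1.6% growth below is chosen only to echo the paper's estimate, not derived from its data):

```python
import math

def eapc(years, rates):
    """Estimated annual percent change: fit log(rate) = a + b*year by
    ordinary least squares, then EAPC = 100 * (exp(b) - 1)."""
    logs = [math.log(r) for r in rates]
    n = len(years)
    my, ml = sum(years) / n, sum(logs) / n
    b = (sum((x - my) * (y - ml) for x, y in zip(years, logs))
         / sum((x - my) ** 2 for x in years))
    return 100.0 * (math.exp(b) - 1.0)

# Hypothetical rates growing by exactly 1.6% per year are recovered exactly.
years = list(range(1976, 2006))
rates = [20.0 * 1.016 ** (y - 1976) for y in years]
print(round(eapc(years, rates), 6))  # 1.6
```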

  1. Usefulness and limitations of dK random graph models to predict interactions and functional homogeneity in biological networks under a pseudo-likelihood parameter estimation approach

    Directory of Open Access Journals (Sweden)

    Luan Yihui

    2009-09-01

    Full Text Available Abstract Background Many aspects of biological functions can be modeled by biological networks, such as protein interaction networks, metabolic networks, and gene coexpression networks. Studying the statistical properties of these networks in turn allows us to infer biological function. Complex statistical network models can potentially more accurately describe the networks, but it is not clear whether such complex models are better suited to find biologically meaningful subnetworks. Results Recent studies have shown that the degree distribution of the nodes is not an adequate statistic in many molecular networks. We sought to extend this statistic with 2nd and 3rd order degree correlations and developed a pseudo-likelihood approach to estimate the parameters. The approach was used to analyze the MIPS and BIOGRID yeast protein interaction networks, and two yeast coexpression networks. We showed that 2nd order degree correlation information gave better predictions of gene interactions in both protein interaction and gene coexpression networks. However, in the biologically important task of predicting functionally homogeneous modules, degree correlation information performs marginally better in the case of the MIPS and BIOGRID protein interaction networks, but worse in the case of gene coexpression networks. Conclusion Our use of dK models showed that incorporation of degree correlations could increase predictive power in some contexts, albeit sometimes marginally, but, in all contexts, the use of third-order degree correlations decreased accuracy. However, it is possible that other parameter estimation methods, such as maximum likelihood, will show the usefulness of incorporating 2nd and 3rd degree correlations in predicting functionally homogeneous modules.

  2. Usefulness and limitations of dK random graph models to predict interactions and functional homogeneity in biological networks under a pseudo-likelihood parameter estimation approach.

    Science.gov (United States)

    Wang, Wenhui; Nunez-Iglesias, Juan; Luan, Yihui; Sun, Fengzhu

    2009-09-03

    Many aspects of biological functions can be modeled by biological networks, such as protein interaction networks, metabolic networks, and gene coexpression networks. Studying the statistical properties of these networks in turn allows us to infer biological function. Complex statistical network models can potentially more accurately describe the networks, but it is not clear whether such complex models are better suited to find biologically meaningful subnetworks. Recent studies have shown that the degree distribution of the nodes is not an adequate statistic in many molecular networks. We sought to extend this statistic with 2nd and 3rd order degree correlations and developed a pseudo-likelihood approach to estimate the parameters. The approach was used to analyze the MIPS and BIOGRID yeast protein interaction networks, and two yeast coexpression networks. We showed that 2nd order degree correlation information gave better predictions of gene interactions in both protein interaction and gene coexpression networks. However, in the biologically important task of predicting functionally homogeneous modules, degree correlation information performs marginally better in the case of the MIPS and BIOGRID protein interaction networks, but worse in the case of gene coexpression networks. Our use of dK models showed that incorporation of degree correlations could increase predictive power in some contexts, albeit sometimes marginally, but, in all contexts, the use of third-order degree correlations decreased accuracy. However, it is possible that other parameter estimation methods, such as maximum likelihood, will show the usefulness of incorporating 2nd and 3rd degree correlations in predicting functionally homogeneous modules.

  3. Anticipating cognitive effort: roles of perceived error-likelihood and time demands.

    Science.gov (United States)

    Dunn, Timothy L; Inzlicht, Michael; Risko, Evan F

    2017-11-13

Why are some actions evaluated as effortful? In the present set of experiments we address this question by examining individuals' perception of effort when faced with a trade-off between two putative cognitive costs: how much time a task takes vs. how error-prone it is. Specifically, we were interested in whether individuals anticipate engaging in a small amount of hard work (i.e., low time requirement, but high error-likelihood) vs. a large amount of easy work (i.e., high time requirement, but low error-likelihood) as being more effortful. In between-subject designs, Experiments 1 through 3 demonstrated that individuals anticipate options that are high in perceived error-likelihood (yet less time consuming) as more effortful than options that are perceived to be more time consuming (yet low in error-likelihood). Further, when asked to evaluate which of the two tasks was (a) more effortful, (b) more error-prone, and (c) more time consuming, effort-based and error-based choices closely tracked one another, but this was not the case for time-based choices. Utilizing a within-subject design, Experiment 4 demonstrated an overall similar pattern of judgments to Experiments 1 through 3. However, judgments of error-likelihood and of time demand similarly predicted effort judgments. Results are discussed within the context of extant accounts of cognitive control, with considerations of how error-likelihood and time demands may independently and conjunctively factor into judgments of cognitive effort.

  4. Risk Prediction Models for Incident Heart Failure: A Systematic Review of Methodology and Model Performance.

    Science.gov (United States)

    Sahle, Berhe W; Owen, Alice J; Chin, Ken Lee; Reid, Christopher M

    2017-09-01

Numerous models predicting the risk of incident heart failure (HF) have been developed; however, evidence of their methodological rigor and reporting remains unclear. This study critically appraises the methods underpinning incident HF risk prediction models. EMBASE and PubMed were searched for articles published between 1990 and June 2016 that reported at least 1 multivariable model for prediction of HF. Model development information, including study design, variable coding, missing data, and predictor selection, was extracted. Nineteen studies reporting 40 risk prediction models were included. Existing models have acceptable discriminative ability (C-statistics > 0.70), although only 6 models were externally validated. Candidate variable selection was based on statistical significance from a univariate screening in 11 models, whereas it was unclear in 12 models. Continuous predictors were retained in 16 models, whereas it was unclear how continuous variables were handled in 16 models. Missing values were excluded in 19 of 23 models that reported missing data, and the number of events per variable fell below the recommended minimum in several models. Only 2 models presented the recommended regression equations. There was significant heterogeneity in the discriminative ability of models with respect to age. This review identified prediction models with sufficient discriminative ability, although few are externally validated. Methods not recommended for the conduct and reporting of risk prediction modeling were frequently used, and the resulting algorithms should be applied with caution. Copyright © 2017 Elsevier Inc. All rights reserved.

  5. Understanding the properties of diagnostic tests - Part 2: Likelihood ratios.

    Science.gov (United States)

    Ranganathan, Priya; Aggarwal, Rakesh

    2018-01-01

    Diagnostic tests are used to identify subjects with and without disease. In a previous article in this series, we examined some attributes of diagnostic tests - sensitivity, specificity, and predictive values. In this second article, we look at likelihood ratios, which are useful for the interpretation of diagnostic test results in everyday clinical practice.
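Likelihood ratios combine sensitivity and specificity into a single factor that updates a pre-test probability via the odds form of Bayes' rule. A minimal sketch of that arithmetic (function names are illustrative):

```python
def likelihood_ratios(sensitivity, specificity):
    """LR+ = sens / (1 - spec); LR- = (1 - sens) / spec."""
    lr_pos = sensitivity / (1 - specificity)
    lr_neg = (1 - sensitivity) / specificity
    return lr_pos, lr_neg

def post_test_probability(pre_test_prob, lr):
    """Convert probability to odds, multiply by the likelihood ratio,
    and convert back to a probability."""
    odds = pre_test_prob / (1 - pre_test_prob)
    post_odds = odds * lr
    return post_odds / (1 + post_odds)

lr_pos, lr_neg = likelihood_ratios(0.90, 0.80)  # LR+ = 4.5, LR- = 0.125
p = post_test_probability(0.20, lr_pos)         # positive result: 0.20 -> ~0.53
```

This is the same calculation a Fagan nomogram performs graphically at the bedside.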

  6. Over Time, Do Anthropometric Measures Still Predict Diabetes Incidence in Chinese Han Nationality Population from Chengdu Community?

    Directory of Open Access Journals (Sweden)

    Kai Liu

    2013-01-01

Full Text Available Objective. To examine whether anthropometric measures could predict diabetes incidence in a Chinese population during a 15-year follow-up. Design and Methods. The data were collected in 1992 and then again in 2007 from the same group of 687 individuals. Waist circumference, body mass index, waist to hip ratio, and waist to height ratio were collected based on a standard protocol. To assess the effects of baseline anthropometric measures on new-onset diabetes, Cox proportional hazards regression models were used to estimate hazard ratios, and the discriminatory power of the anthropometric measures for diabetes was assessed by the area under the receiver operating curve (AROC). Results. Seventy-four individuals were diagnosed with diabetes during the 15-year follow-up period (incidence: 10.8%). These anthropometric measures also predicted future diabetes over the long follow-up. At 7-8 years, the AROCs of the central obesity measures (WC, WHpR, WHtR) were higher than that of the general obesity measure (BMI). However, there were no significant differences among the four anthropometric measurements at 15 years. Conclusions. These anthropometric measures could still predict diabetes over a long follow-up. However, the validity of anthropometric measures to predict incident diabetes may change with time.
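The discriminatory power reported as AROC can be estimated nonparametrically as the Mann-Whitney probability that a randomly chosen case outranks a randomly chosen control. A toy sketch with invented data (not the study's measurements):

```python
def auroc(scores_pos, scores_neg):
    """Area under the ROC curve via the Mann-Whitney U statistic:
    the probability that a case scores higher than a control,
    counting ties as one half."""
    wins = 0.0
    for sp in scores_pos:
        for sn in scores_neg:
            if sp > sn:
                wins += 1.0
            elif sp == sn:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# toy waist-circumference values (cm): incident-diabetes cases vs non-cases
cases = [95, 102, 88, 110]
controls = [80, 85, 90, 78, 83]
auc = auroc(cases, controls)  # 0.95 for this toy data
```

An AROC of 0.5 means no discrimination; 1.0 means the measure separates cases from controls perfectly.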

  7. The gap between fatherhood and couplehood desires among Israeli gay men and estimations of their likelihood.

    Science.gov (United States)

    Shenkman, Geva

    2012-10-01

    This study examined the frequencies of the desires and likelihood estimations of Israeli gay men regarding fatherhood and couplehood, using a sample of 183 gay men aged 19-50. It follows previous research which indicated the existence of a gap in the United States with respect to fatherhood, and called for generalizability examinations in other countries and the exploration of possible explanations. As predicted, a gap was also found in Israel between fatherhood desires and their likelihood estimations, as well as between couplehood desires and their likelihood estimations. In addition, lower estimations of fatherhood likelihood were found to predict depression and to correlate with decreased subjective well-being. Possible psychosocial explanations are offered. Moreover, by mapping attitudes toward fatherhood and couplehood among Israeli gay men, the current study helps to extend our knowledge of several central human development motivations and their correlations with depression and subjective well-being in a less-studied sexual minority in a complex cultural climate. (PsycINFO Database Record (c) 2012 APA, all rights reserved).

  8. Maximum-likelihood methods for array processing based on time-frequency distributions

    Science.gov (United States)

    Zhang, Yimin; Mu, Weifeng; Amin, Moeness G.

    1999-11-01

This paper proposes a novel time-frequency maximum likelihood (t-f ML) method for direction-of-arrival (DOA) estimation for non-stationary signals, and compares this method with conventional maximum likelihood DOA estimation techniques. Time-frequency distributions localize the signal power in the time-frequency domain, and as such enhance the effective SNR, leading to improved DOA estimation. The localization of signals with different t-f signatures permits the division of the time-frequency domain into smaller regions, each containing fewer signals than those incident on the array. The reduction of the number of signals within different time-frequency regions not only reduces the required number of sensors, but also decreases the computational load in multi-dimensional optimizations. Compared to the recently proposed time-frequency MUSIC (t-f MUSIC), the proposed t-f ML method can be applied in coherent environments, without the need to perform any type of preprocessing that is subject to both array geometry and array aperture.

  9. Chronic dry eye in PRK and LASIK: manifestations, incidence and predictive factors

    Science.gov (United States)

    Bower, Kraig S.; Sia, Rose K.; Ryan, Denise S.; Mines, Michael J.; Dartt, Darlene A.

    2017-01-01

Purpose: To evaluate dry eye manifestations following photorefractive keratectomy (PRK) and laser in situ keratomileusis (LASIK) and determine the incidence and predictive factors of chronic dry eye using a set of dry eye criteria. Setting: Walter Reed Army Medical Center, Washington, DC, USA. Methods: This is a prospective non-randomized clinical study of 143 active duty U.S. Army personnel aged 29.9±5.2 years with myopia or myopic astigmatism (manifest spherical equivalent −3.83±1.96 diopters) undergoing either PRK or LASIK. Dry eye evaluation was performed pre- and postoperatively. Main outcome measures included dry eye manifestations, incidence, and predictive factors of chronic dry eye. Results: Schirmer scores, corneal sensitivity, ocular surface staining, surface regularity index (SRI), and responses to a dry eye questionnaire changed significantly over time after PRK. After LASIK, significant changes were observed in tear breakup time, corneal sensitivity, ocular surface staining, and responses to the questionnaire. At twelve months postoperatively, 5.0% of PRK and 0.8% of LASIK participants developed chronic dry eye. Regression analysis showed that a lower preoperative Schirmer score significantly influences development of chronic dry eye after PRK, whereas a lower preoperative Schirmer score or a higher ocular surface staining score significantly influences the occurrence of chronic dry eye after LASIK. Conclusions: Chronic dry eye is uncommon after PRK and LASIK. Ocular surface and tear film characteristics during the preoperative examination may help predict chronic dry eye development in PRK and LASIK. PMID:26796443

  10. Predicting Likelihood of Surgery Prior to First Visit in Patients with Back and Lower Extremity Symptoms: A simple mathematical model based on over 8000 patients.

    Science.gov (United States)

    Boden, Lauren M; Boden, Stephanie A; Premkumar, Ajay; Gottschalk, Michael B; Boden, Scott D

    2018-02-09

Retrospective analysis of prospectively collected data. The aim was to create a data-driven triage system stratifying patients by likelihood of undergoing spinal surgery within one year of presentation. Low back pain (LBP) and radicular lower extremity (LE) symptoms are common musculoskeletal problems. There is currently no standard data-derived triage process, based on information that can be obtained prior to the initial physician-patient encounter, to direct patients to the optimal physician type. We analyzed patient-reported data from 8006 patients with a chief complaint of LBP and/or LE radicular symptoms who presented to surgeons at a large multidisciplinary spine center between September 1, 2005 and June 30, 2016. Univariate and multivariate analysis identified independent risk factors for undergoing spinal surgery within one year of the initial visit. A model incorporating these risk factors was created using a random sample of 80% of the total patients in our cohort and validated on the remaining 20%. The baseline one-year surgery rate within our cohort was 39% for all patients and 42% for patients with LE symptoms. Those identified as high likelihood by the center's existing triage process had a surgery rate of 45%. The new triage scoring system proposed in this study identified a high-likelihood group in which 58% underwent surgery, a 46% higher surgery rate than in non-triaged patients and a 29% improvement over our institution's existing triage system. The data-driven triage model and scoring system derived and validated in this study (the Spine Surgery Likelihood model [SSL-11]) significantly improved on existing processes in predicting the likelihood of undergoing spinal surgery within one year of initial presentation. This triage system will allow centers to more selectively screen for surgical candidates and more effectively direct patients to surgeons or non-operative spine specialists. Level of evidence: 4.
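A triage score of this kind is typically mapped to a probability of surgery through a logistic model and then thresholded into likelihood groups. The sketch below is purely illustrative; the intercept, slope, and cutoff are hypothetical placeholders, not the published SSL-11 weights.

```python
import math

def surgery_probability(score, intercept=-2.0, slope=0.35):
    """Logistic model mapping a triage score to a one-year surgery
    probability. Coefficients are hypothetical, for illustration only."""
    return 1.0 / (1.0 + math.exp(-(intercept + slope * score)))

def triage_group(score, cutoff=0.5):
    """Assign a patient to a likelihood group by thresholding the
    modeled probability (cutoff is likewise a placeholder)."""
    if surgery_probability(score) >= cutoff:
        return "high likelihood"
    return "low likelihood"
```

In practice the cutoff would be tuned on the validation split so that the high-likelihood group's observed surgery rate meets the center's target.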

  11. Wavelength prediction of laser incident on amorphous silicon detector by neural network

    International Nuclear Information System (INIS)

    Esmaeili Sani, V.; Moussavi-Zarandi, A.; Kafaee, M.

    2011-01-01

    In this paper we present a method based on artificial neural networks (ANN) and the use of only one amorphous semiconductor detector to predict the wavelength of incident laser. Amorphous semiconductors and especially amorphous hydrogenated silicon, a-Si:H, are now widely used in many electronic devices, such as solar cells, many types of position sensitive detectors and X-ray imagers for medical applications. In order to study the electrical properties and detection characteristics of thin films of a-Si:H, n-i-p structures have been simulated by SILVACO software. The basic electronic properties of most of the materials used are known, but device modeling depends on a large number of parameters that are not all well known. In addition, the relationship between the shape of the induced anode current and the wavelength of the incident laser leads to complicated calculations. Soft data-based computational methods can model multidimensional non-linear processes and represent the complex input-output relation between the form of the output signal and the wavelength of incident laser.

  12. Wavelength prediction of laser incident on amorphous silicon detector by neural network

    Energy Technology Data Exchange (ETDEWEB)

    Esmaeili Sani, V., E-mail: vaheed_esmaeely80@yahoo.com [Amirkabir University of Technology, Faculty of Physics, P.O. Box 4155-4494, Tehran (Iran, Islamic Republic of); Moussavi-Zarandi, A.; Kafaee, M. [Amirkabir University of Technology, Faculty of Physics, P.O. Box 4155-4494, Tehran (Iran, Islamic Republic of)

    2011-10-21

    In this paper we present a method based on artificial neural networks (ANN) and the use of only one amorphous semiconductor detector to predict the wavelength of incident laser. Amorphous semiconductors and especially amorphous hydrogenated silicon, a-Si:H, are now widely used in many electronic devices, such as solar cells, many types of position sensitive detectors and X-ray imagers for medical applications. In order to study the electrical properties and detection characteristics of thin films of a-Si:H, n-i-p structures have been simulated by SILVACO software. The basic electronic properties of most of the materials used are known, but device modeling depends on a large number of parameters that are not all well known. In addition, the relationship between the shape of the induced anode current and the wavelength of the incident laser leads to complicated calculations. Soft data-based computational methods can model multidimensional non-linear processes and represent the complex input-output relation between the form of the output signal and the wavelength of incident laser.

  13. Self-stigma of seeking treatment and being male predict an increased likelihood of having an undiagnosed eating disorder.

    Science.gov (United States)

    Griffiths, Scott; Mond, Jonathan M; Li, Zhicheng; Gunatilake, Sanduni; Murray, Stuart B; Sheffield, Jeanie; Touyz, Stephen

    2015-09-01

To examine whether self-stigma of seeking psychological help and being male would be associated with an increased likelihood of having an undiagnosed eating disorder. A multi-national sample of 360 individuals with diagnosed eating disorders and 125 individuals with undiagnosed eating disorders was recruited. Logistic regression was used to identify variables affecting the likelihood of having an undiagnosed eating disorder, including sex, self-stigma of seeking psychological help, and perceived stigma of having a mental illness, controlling for a broad range of covariates. Being male and reporting greater self-stigma of seeking psychological help were independently associated with an increased likelihood of being undiagnosed. Further, the association between self-stigma of seeking psychological help and increased likelihood of being undiagnosed was significantly stronger for males than for females. Perceived stigma associated with help-seeking may be a salient barrier to treatment for eating disorders — particularly among male sufferers. © 2015 Wiley Periodicals, Inc.

  14. Screening and Predicting Posttraumatic Stress and Depression in Children Following Single-Incident Trauma

    Science.gov (United States)

    Nixon, Reginald D. V.; Ellis, Alicia A.; Nehmy, Thomas J.; Ball, Shelley-Anne

    2010-01-01

    Three screening methods to predict posttraumatic stress disorder (PTSD) and depression symptoms in children following single-incident trauma were tested. Children and adolescents (N = 90; aged 7-17 years) were assessed within 4 weeks of an injury that led to hospital treatment and followed up 3 and 6 months later. Screening methods were adapted…

  15. Bias correction in the hierarchical likelihood approach to the analysis of multivariate survival data.

    Science.gov (United States)

    Jeon, Jihyoun; Hsu, Li; Gorfine, Malka

    2012-07-01

    Frailty models are useful for measuring unobserved heterogeneity in risk of failures across clusters, providing cluster-specific risk prediction. In a frailty model, the latent frailties shared by members within a cluster are assumed to act multiplicatively on the hazard function. In order to obtain parameter and frailty variate estimates, we consider the hierarchical likelihood (H-likelihood) approach (Ha, Lee and Song, 2001. Hierarchical-likelihood approach for frailty models. Biometrika 88, 233-243) in which the latent frailties are treated as "parameters" and estimated jointly with other parameters of interest. We find that the H-likelihood estimators perform well when the censoring rate is low, however, they are substantially biased when the censoring rate is moderate to high. In this paper, we propose a simple and easy-to-implement bias correction method for the H-likelihood estimators under a shared frailty model. We also extend the method to a multivariate frailty model, which incorporates complex dependence structure within clusters. We conduct an extensive simulation study and show that the proposed approach performs very well for censoring rates as high as 80%. We also illustrate the method with a breast cancer data set. Since the H-likelihood is the same as the penalized likelihood function, the proposed bias correction method is also applicable to the penalized likelihood estimators.

  16. Estimation Methods for Non-Homogeneous Regression - Minimum CRPS vs Maximum Likelihood

    Science.gov (United States)

    Gebetsberger, Manuel; Messner, Jakob W.; Mayr, Georg J.; Zeileis, Achim

    2017-04-01

Non-homogeneous regression models are widely used to statistically post-process numerical weather prediction models. Such regression models correct for errors in mean and variance and are capable of forecasting a full probability distribution. In order to estimate the corresponding regression coefficients, CRPS minimization has been performed in many meteorological post-processing studies over the last decade. In contrast to maximum likelihood estimation, CRPS minimization is claimed to yield more calibrated forecasts. Theoretically, both scoring rules used as an optimization score should be able to locate a similar and unknown optimum. Discrepancies might result from a wrong distributional assumption about the observed quantity. To address this theoretical concept, this study compares maximum likelihood and minimum CRPS estimation for different distributional assumptions. First, a synthetic case study shows that, for an appropriate distributional assumption, both estimation methods yield similar regression coefficients. The log-likelihood estimator is slightly more efficient. A real-world case study for surface temperature forecasts at different sites in Europe confirms these results but shows that surface temperature does not always follow the classical assumption of a Gaussian distribution. KEYWORDS: ensemble post-processing, maximum likelihood estimation, CRPS minimization, probabilistic temperature forecasting, distributional regression models
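For a Gaussian predictive distribution, both optimization scores compared in the study have closed forms, so the two objectives can be evaluated side by side for any (mean, spread) pair. A sketch of the standard formulas (assuming the usual closed-form Gaussian CRPS; function names are illustrative):

```python
import math

def gaussian_crps(mu, sigma, y):
    """Closed-form CRPS of a Gaussian forecast N(mu, sigma^2) for
    observation y: sigma * [z(2*Phi(z) - 1) + 2*phi(z) - 1/sqrt(pi)]."""
    z = (y - mu) / sigma
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2 * math.pi)
    cdf = 0.5 * (1 + math.erf(z / math.sqrt(2)))
    return sigma * (z * (2 * cdf - 1) + 2 * pdf - 1 / math.sqrt(math.pi))

def gaussian_nll(mu, sigma, y):
    """Negative log-likelihood of y under N(mu, sigma^2)."""
    z = (y - mu) / sigma
    return 0.5 * z * z + math.log(sigma) + 0.5 * math.log(2 * math.pi)

# a perfectly centered unit-variance forecast for y = 0
c = gaussian_crps(0.0, 1.0, 0.0)   # ~0.2337
n = gaussian_nll(0.0, 1.0, 0.0)    # ~0.9189
```

Summing either score over a training sample and minimizing with respect to the regression coefficients gives the two estimators the study compares.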

17. ldr: An R Software Package for Likelihood-Based Sufficient Dimension Reduction

    Directory of Open Access Journals (Sweden)

    Kofi Placid Adragni

    2014-11-01

Full Text Available In regression settings, a sufficient dimension reduction (SDR) method seeks the core information in a p-vector predictor that completely captures its relationship with a response. The reduced predictor may reside in a lower dimension d < p, improving ability to visualize data and predict future observations, and mitigating dimensionality issues when carrying out further analysis. We introduce ldr, a new R software package that implements three recently proposed likelihood-based methods for SDR: covariance reduction, likelihood acquired directions, and principal fitted components. All three methods reduce the dimensionality of the data by projection into lower dimensional subspaces. The package also implements a variable screening method built upon principal fitted components which makes use of flexible basis functions to capture the dependencies between the predictors and the response. Examples are given to demonstrate likelihood-based SDR analyses using ldr, including estimation of the dimension of reduction subspaces and selection of basis functions. The ldr package provides a framework that we hope to grow into a comprehensive library of likelihood-based SDR methodologies.

  18. A maximum likelihood framework for protein design

    Directory of Open Access Journals (Sweden)

    Philippe Hervé

    2006-06-01

Full Text Available Abstract. Background: The aim of protein design is to predict amino-acid sequences compatible with a given target structure. Traditionally envisioned as a purely thermodynamic question, this problem can also be understood in a wider context, where additional constraints are captured by learning the sequence patterns displayed by natural proteins of known conformation. In this latter perspective, however, we still need a theoretical formalization of the question, leading to general and efficient learning methods, and allowing for the selection of fast and accurate objective functions quantifying sequence/structure compatibility. Results: We propose a formulation of the protein design problem in terms of model-based statistical inference. Our framework uses the maximum likelihood principle to optimize the unknown parameters of a statistical potential, which we call an inverse potential to contrast with classical potentials used for structure prediction. We propose an implementation based on Markov chain Monte Carlo, in which the likelihood is maximized by gradient descent and is numerically estimated by thermodynamic integration. The fit of the models is evaluated by cross-validation. We apply this to a simple pairwise contact potential, supplemented with a solvent-accessibility term, and show that the resulting models have a better predictive power than currently available pairwise potentials. Furthermore, the model comparison method presented here allows one to measure the relative contribution of each component of the potential, and to choose the optimal number of accessibility classes, which turns out to be much higher than classically considered. Conclusion: Altogether, this reformulation makes it possible to test a wide diversity of models, using different forms of potentials, or accounting for other factors than just the constraint of thermodynamic stability. Ultimately, such model-based statistical analyses may help to understand the forces…

  19. Maximum likelihood estimation of the parameters of nonminimum phase and noncausal ARMA models

    DEFF Research Database (Denmark)

    Rasmussen, Klaus Bolding

    1994-01-01

The well-known prediction-error-based maximum likelihood (PEML) method can only handle minimum phase ARMA models. This paper presents a new method known as the back-filtering-based maximum likelihood (BFML) method, which can handle nonminimum phase and noncausal ARMA models. The BFML method is identical to the PEML method in the case of a minimum phase ARMA model, and it turns out that the BFML method incorporates a noncausal ARMA filter with poles outside the unit circle for estimation of the parameters of a causal, nonminimum phase ARMA model…

  20. Predictions of Quantum Molecular Dynamical Model between incident energy 50 and 1000 MeV/Nucleon

    Directory of Open Access Journals (Sweden)

    Kumar Sanjeev

    2015-01-01

Full Text Available In the present work, the Quantum Molecular Dynamical (QMD) model is summarized as a useful tool for the incident energy range of 50 to 1000 MeV/nucleon in heavy-ion collisions. The model has reproduced the experimental results of various collaborations, such as ALADIN, INDRA, PLASTIC BALL, and FOPI, to a high level of accuracy for phenomena such as multifragmentation, collective flow, and elliptical flow in the above energy range. Further efforts are directed toward predicting the symmetry energy over this wide incident energy range.

  1. Deformation of log-likelihood loss function for multiclass boosting.

    Science.gov (United States)

    Kanamori, Takafumi

    2010-09-01

    The purpose of this paper is to study loss functions in multiclass classification. In classification problems, the decision function is estimated by minimizing an empirical loss function, and then, the output label is predicted by using the estimated decision function. We propose a class of loss functions which is obtained by a deformation of the log-likelihood loss function. There are four main reasons why we focus on the deformed log-likelihood loss function: (1) this is a class of loss functions which has not been deeply investigated so far, (2) in terms of computation, a boosting algorithm with a pseudo-loss is available to minimize the proposed loss function, (3) the proposed loss functions provide a clear correspondence between the decision functions and conditional probabilities of output labels, (4) the proposed loss functions satisfy the statistical consistency of the classification error rate which is a desirable property in classification problems. Based on (3), we show that the deformed log-likelihood loss provides a model of mislabeling which is useful as a statistical model of medical diagnostics. We also propose a robust loss function against outliers in multiclass classification based on our approach. The robust loss function is a natural extension of the existing robust loss function for binary classification. A model of mislabeling and a robust loss function are useful to cope with noisy data. Some numerical studies are presented to show the robustness of the proposed loss function. A mathematical characterization of the deformed log-likelihood loss function is also presented. Copyright 2010 Elsevier Ltd. All rights reserved.
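The undeformed baseline of the loss family studied here is the ordinary multiclass log-likelihood (softmax cross-entropy) loss, in which the decision-function scores map directly to conditional probabilities of output labels. A minimal sketch of that baseline, not of the proposed deformation:

```python
import math

def softmax(scores):
    """Convert decision-function scores to conditional class
    probabilities (shifted by the max score for numerical stability)."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def log_likelihood_loss(scores, label):
    """Negative log-likelihood of the true label under the softmax of
    the decision-function scores (multiclass log-loss)."""
    return -math.log(softmax(scores)[label])

# uniform scores give every class probability 1/3, so loss = log 3
loss = log_likelihood_loss([0.0, 0.0, 0.0], 0)
```

The deformation studied in the paper replaces the log in this construction with a deformed logarithm while preserving the score-to-probability correspondence.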

  2. Transfer Entropy as a Log-Likelihood Ratio

    Science.gov (United States)

    Barnett, Lionel; Bossomaier, Terry

    2012-09-01

    Transfer entropy, an information-theoretic measure of time-directed information transfer between joint processes, has steadily gained popularity in the analysis of complex stochastic dynamics in diverse fields, including the neurosciences, ecology, climatology, and econometrics. We show that for a broad class of predictive models, the log-likelihood ratio test statistic for the null hypothesis of zero transfer entropy is a consistent estimator for the transfer entropy itself. For finite Markov chains, furthermore, no explicit model is required. In the general case, an asymptotic χ2 distribution is established for the transfer entropy estimator. The result generalizes the equivalence in the Gaussian case of transfer entropy and Granger causality, a statistical notion of causal influence based on prediction via vector autoregression, and establishes a fundamental connection between directed information transfer and causality in the Wiener-Granger sense.
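For finite discrete series, first-order transfer entropy can be estimated by plug-in conditional probabilities — the quantity whose log-likelihood-ratio interpretation the paper establishes. An illustrative naive plug-in estimator (no bias correction, binary or small-alphabet series assumed):

```python
from collections import Counter
import math

def transfer_entropy(x, y):
    """Plug-in estimate (in nats) of first-order transfer entropy from x
    to y: sum over observed triples of
    p(y1, y0, x0) * log[ p(y1 | y0, x0) / p(y1 | y0) ]."""
    n = len(y) - 1
    c_xyz = Counter((y[t + 1], y[t], x[t]) for t in range(n))
    c_yx  = Counter((y[t], x[t]) for t in range(n))
    c_yy  = Counter((y[t + 1], y[t]) for t in range(n))
    c_y   = Counter(y[t] for t in range(n))
    te = 0.0
    for (y1, y0, x0), cnt in c_xyz.items():
        p_joint = cnt / n
        p_cond_full = cnt / c_yx[(y0, x0)]
        p_cond_hist = c_yy[(y1, y0)] / c_y[y0]
        te += p_joint * math.log(p_cond_full / p_cond_hist)
    return te

x = [0, 1, 0, 1, 0, 1, 0, 1]
y = [0] + x[:-1]              # y copies x with a one-step lag
te = transfer_entropy(x, y)   # positive: x's past informs y's future
```

Multiplying the plug-in estimate by twice the sample size gives (asymptotically) the χ² log-likelihood-ratio test statistic described in the abstract.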

  3. Updated logistic regression equations for the calculation of post-fire debris-flow likelihood in the western United States

    Science.gov (United States)

    Staley, Dennis M.; Negri, Jacquelyn A.; Kean, Jason W.; Laber, Jayme L.; Tillery, Anne C.; Youberg, Ann M.

    2016-06-30

    Wildfire can significantly alter the hydrologic response of a watershed to the extent that even modest rainstorms can generate dangerous flash floods and debris flows. To reduce public exposure to hazard, the U.S. Geological Survey produces post-fire debris-flow hazard assessments for select fires in the western United States. We use publicly available geospatial data describing basin morphology, burn severity, soil properties, and rainfall characteristics to estimate the statistical likelihood that debris flows will occur in response to a storm of a given rainfall intensity. Using an empirical database and refined geospatial analysis methods, we defined new equations for the prediction of debris-flow likelihood using logistic regression methods. We showed that the new logistic regression model outperformed previous models used to predict debris-flow likelihood.
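Logistic regression coefficients of the kind used in these hazard assessments are obtained by maximizing the Bernoulli log-likelihood of the observed outcomes. A toy sketch with gradient ascent and invented data (not the USGS model's predictors or coefficients):

```python
import math

def fit_logistic(xs, ys, lr=0.005, steps=20000):
    """Fit intercept and slope of a one-predictor logistic model by
    gradient ascent on the log-likelihood (toy illustration)."""
    b0, b1 = 0.0, 0.0
    n = len(xs)
    for _ in range(steps):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
            g0 += (y - p)        # gradient w.r.t. intercept
            g1 += (y - p) * x    # gradient w.r.t. slope
        b0 += lr * g0 / n
        b1 += lr * g1 / n
    return b0, b1

# toy data: debris-flow occurrence (1/0) vs peak rainfall intensity (mm/h)
intensity = [5, 10, 15, 20, 25, 30, 35, 40]
occurred  = [0, 0, 0, 1, 0, 1, 1, 1]
b0, b1 = fit_logistic(intensity, occurred)
# b1 > 0: modeled debris-flow likelihood rises with rainfall intensity
```

The fitted curve then converts any design-storm intensity into a debris-flow probability for the basin.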

  4. MXLKID: a maximum likelihood parameter identifier

    International Nuclear Information System (INIS)

    Gavel, D.T.

    1980-07-01

MXLKID (MaXimum LiKelihood IDentifier) is a computer program designed to identify unknown parameters in a nonlinear dynamic system. Using noisy measurement data from the system, the maximum likelihood identifier computes a likelihood function (LF). Identification of system parameters is accomplished by maximizing the LF with respect to the parameters. The main body of this report briefly summarizes the maximum likelihood technique and gives instructions and examples for running the MXLKID program. MXLKID is implemented in LRLTRAN on the CDC7600 computer at LLNL. A detailed mathematical description of the algorithm is given in the appendices. 24 figures, 6 tables
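The core idea — choosing parameters that maximize a likelihood function computed from noisy measurements — reduces to a closed form in the simplest case of i.i.d. Gaussian noise. This sketch illustrates the principle only, not MXLKID's nonlinear dynamic-system algorithm:

```python
import math

def gaussian_mle(data):
    """Maximum-likelihood estimates of mean and standard deviation for
    i.i.d. Gaussian measurements (closed-form maximizer of the LF)."""
    n = len(data)
    mu = sum(data) / n
    var = sum((x - mu) ** 2 for x in data) / n  # MLE uses 1/n, not 1/(n-1)
    return mu, math.sqrt(var)

def log_likelihood(data, mu, sigma):
    """Gaussian log-likelihood of the data at candidate parameters."""
    n = len(data)
    return (-0.5 * n * math.log(2 * math.pi * sigma ** 2)
            - sum((x - mu) ** 2 for x in data) / (2 * sigma ** 2))
```

For a nonlinear dynamic system no closed form exists, so a program like MXLKID maximizes the LF numerically instead.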

  5. Predicting the Likelihood of Going to Graduate School: The Importance of Locus of Control

    Science.gov (United States)

    Nordstrom, Cynthia R.; Segrist, Dan J.

    2009-01-01

    Although many undergraduates apply to graduate school, only a fraction will be admitted. A question arises as to what factors relate to the likelihood of pursuing graduate studies. The current research examined this question by surveying students in a Careers in Psychology course. We hypothesized that GPA, a more internal locus of control…

  6. Predicting Cumulative Incidence Probability by Direct Binomial Regression

    DEFF Research Database (Denmark)

    Scheike, Thomas H.; Zhang, Mei-Jie

Binomial modelling; cumulative incidence probability; cause-specific hazards; subdistribution hazard.

  7. Likelihood devices in spatial statistics

    NARCIS (Netherlands)

    Zwet, E.W. van

    1999-01-01

One of the main themes of this thesis is the application to spatial data of modern semi- and nonparametric methods. Another, closely related theme is maximum likelihood estimation from spatial data. Maximum likelihood estimation is not common practice in spatial statistics. The method of moments…

  8. Earthquake likelihood model testing

    Science.gov (United States)

    Schorlemmer, D.; Gerstenberger, M.C.; Wiemer, S.; Jackson, D.D.; Rhoades, D.A.

    2007-01-01

INTRODUCTION: The Regional Earthquake Likelihood Models (RELM) project aims to produce and evaluate alternate models of earthquake potential (probability per unit volume, magnitude, and time) for California. Based on differing assumptions, these models are produced to test the validity of their assumptions and to explore which models should be incorporated in seismic hazard and risk evaluation. Tests based on physical and geological criteria are useful but we focus on statistical methods using future earthquake catalog data only. We envision two evaluations: a test of consistency with observed data and a comparison of all pairs of models for relative consistency. Both tests are based on the likelihood method, and both are fully prospective (i.e., the models are not adjusted to fit the test data). To be tested, each model must assign a probability to any possible event within a specified region of space, time, and magnitude. For our tests the models must use a common format: earthquake rates in specified “bins” with location, magnitude, time, and focal mechanism limits. Seismology cannot yet deterministically predict individual earthquakes; however, it should seek the best possible models for forecasting earthquake occurrence. This paper describes the statistical rules of an experiment to examine and test earthquake forecasts. The primary purposes of the tests described below are to evaluate physical models for earthquakes, assure that source models used in seismic hazard and risk studies are consistent with earthquake data, and provide quantitative measures by which models can be assigned weights in a consensus model or be judged as suitable for particular regions. In this paper we develop a statistical method for testing earthquake likelihood models. A companion paper (Schorlemmer and Gerstenberger 2007, this issue) discusses the actual implementation of these tests in the framework of the RELM initiative. Statistical testing of hypotheses is a common task and a…
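Under the binned format described above, a forecast's consistency score is the joint Poisson log-likelihood of the observed counts given the forecast rates. A minimal sketch (toy rates and counts, not RELM data):

```python
import math

def poisson_log_likelihood(forecast_rates, observed_counts):
    """Joint log-likelihood of observed earthquake counts per bin under
    a forecast of Poisson rates per bin:
    sum over bins of [n*log(lam) - lam - log(n!)]."""
    ll = 0.0
    for lam, n in zip(forecast_rates, observed_counts):
        ll += n * math.log(lam) - lam - math.lgamma(n + 1)
    return ll

# two competing forecasts scored against the same observed catalog
obs = [0, 1, 0, 2]
model_a = [0.1, 0.9, 0.2, 1.8]   # concentrates rate where events occurred
model_b = [0.5, 0.5, 0.5, 0.5]   # spreads rate uniformly
# the model with the higher joint log-likelihood is more consistent with obs
```

Comparing such joint log-likelihoods across models, and against simulated catalogs from each model, is the basis of the consistency and pairwise tests the project describes.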

  9. Trait anger but not anxiety predicts incident type 2 diabetes: The Multi-Ethnic Study of Atherosclerosis (MESA).

    Science.gov (United States)

    Abraham, Sherley; Shah, Nina G; Diez Roux, Ana; Hill-Briggs, Felicia; Seeman, Teresa; Szklo, Moyses; Schreiner, Pamela J; Golden, Sherita Hill

    2015-10-01

    Prior studies have shown a bidirectional association between depression and type 2 diabetes mellitus (T2DM); however, the prospective associations of anger and anxiety with T2DM have not been established. We hypothesized that trait anger and anxiety would predict incident T2DM, independently of depressive symptoms. In the Multi-ethnic Study of Atherosclerosis (MESA), we prospectively examined the association of trait anger and trait anxiety (assessed via the Spielberger Trait Anger and Anxiety Scales, respectively) with incident T2DM over 11.4 years in 5598 White, Black, Hispanic, and Chinese participants (53.2% women, mean age 61.6 years) at baseline without prevalent T2DM or cardiovascular disease. We used Cox proportional hazards models to calculate the hazard ratios (HR) of incident T2DM by previously defined anger category (low, moderate, high), and anxiety quartile, as there were no previously defined categories. High total trait anger was associated with incident T2DM (HR 1.50; 95% CI 1.08-2.07) relative to low total trait anger. The association was attenuated following adjustment for waist circumference (HR 1.32; 95% CI 0.94-1.86). Higher anger reaction was also associated with incident T2DM (HR=1.07; 95% CI 1.03-1.11) that remained significant after adjusting for potential confounders/explanatory factors. In contrast, trait anxiety did not predict incident T2DM. High total trait anger and anger reaction are potential modifiable risk factors for T2DM. Further research is needed to explore the mechanisms of the anger-diabetes relationship and to develop preventive interventions. Copyright © 2015 Elsevier Ltd. All rights reserved.

  10. The prediction of the incidence rate of upper limb musculoskeletal disorders, with CTD risk index method on potters of Meybod city

    Directory of Open Access Journals (Sweden)

    Reza Khani Jazani

    2012-02-01

    Full Text Available Background: The objective of this study was to predict the incidence of musculoskeletal disorders in potters of Meybod city using the CTD risk index method. Materials and Methods: This is a descriptive cross-sectional study. The target population was all workers in pottery workshops located in Meybod. Information on musculoskeletal disorders was obtained with the Nordic questionnaire, and the CTD risk index method was used to predict the incidence of musculoskeletal disorders. Results: We observed that 59.3% of the potters had symptoms of musculoskeletal disorders in at least one of their upper extremities. There was also a significant difference in mean CTD risk index between potters with and without symptoms of upper limb musculoskeletal disorders (p=0.038). Conclusion: The CTD risk index method can serve as a suitable method for predicting the incidence of musculoskeletal disorders in potters.

  11. Maximum likelihood versus likelihood-free quantum system identification in the atom maser

    International Nuclear Information System (INIS)

    Catana, Catalin; Kypraios, Theodore; Guţă, Mădălin

    2014-01-01

    We consider the problem of estimating a dynamical parameter of a Markovian quantum open system (the atom maser), by performing continuous time measurements in the system's output (outgoing atoms). Two estimation methods are investigated and compared. Firstly, the maximum likelihood estimator (MLE) takes into account the full measurement data and is asymptotically optimal in terms of its mean square error. Secondly, the ‘likelihood-free’ method of approximate Bayesian computation (ABC) produces an approximation of the posterior distribution for a given set of summary statistics, by sampling trajectories at different parameter values and comparing them with the measurement data via chosen statistics. Building on previous results which showed that atom counts are poor statistics for certain values of the Rabi angle, we apply MLE to the full measurement data and estimate its Fisher information. We then select several correlation statistics such as waiting times, distribution of successive identical detections, and use them as input of the ABC algorithm. The resulting posterior distribution follows closely the data likelihood, showing that the selected statistics capture ‘most’ statistical information about the Rabi angle. (paper)
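
The ABC idea above — draw parameters from a prior, simulate trajectories, and keep draws whose summary statistics resemble the measured ones — can be shown on a toy counting model. This is a plain Poisson source, not the atom maser; all values are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy stand-in for the measurement record: a Poisson counting process
true_rate = 3.0
data = rng.poisson(true_rate, size=200)
s_obs = data.mean()                      # the chosen summary statistic

def abc_posterior(s_obs, n_draws=20000, eps=0.1):
    """ABC rejection: draw rates from a flat prior, simulate records,
    and keep draws whose simulated summary is within eps of the observed one."""
    theta = rng.uniform(0.5, 6.0, n_draws)
    sims = rng.poisson(theta[:, None], size=(n_draws, 200))
    keep = np.abs(sims.mean(axis=1) - s_obs) < eps
    return theta[keep]

post = abc_posterior(s_obs)   # approximate posterior sample for the rate
```

With a near-sufficient summary such as the mean, the accepted sample concentrates near the data likelihood's peak, mirroring the paper's finding that well-chosen statistics capture most of the information.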

  12. Obtaining reliable Likelihood Ratio tests from simulated likelihood functions

    DEFF Research Database (Denmark)

    Andersen, Laura Mørch

    It is standard practice by researchers and the default option in many statistical programs to base test statistics for mixed models on simulations using asymmetric draws (e.g. Halton draws). This paper shows that when the estimated likelihood functions depend on standard deviations of mixed param...

  13. Evaluation of Dynamic Coastal Response to Sea-level Rise Modifies Inundation Likelihood

    Science.gov (United States)

    Lentz, Erika E.; Thieler, E. Robert; Plant, Nathaniel G.; Stippa, Sawyer R.; Horton, Radley M.; Gesch, Dean B.

    2016-01-01

    Sea-level rise (SLR) poses a range of threats to natural and built environments, making assessments of SLR-induced hazards essential for informed decision making. We develop a probabilistic model that evaluates the likelihood that an area will inundate (flood) or dynamically respond (adapt) to SLR. The broad-area applicability of the approach is demonstrated by producing 30 × 30 m resolution predictions for more than 38,000 km² of diverse coastal landscape in the northeastern United States. Probabilistic SLR projections, coastal elevation and vertical land movement are used to estimate likely future inundation levels. Then, conditioned on future inundation levels and the current land-cover type, we evaluate the likelihood of dynamic response versus inundation. We find that nearly 70% of this coastal landscape has some capacity to respond dynamically to SLR, and we show that inundation models over-predict land likely to submerge. This approach is well suited to guiding coastal resource management decisions that weigh future SLR impacts and uncertainty against ecological targets and economic constraints.

  14. Maximum likelihood convolutional decoding (MCD) performance due to system losses

    Science.gov (United States)

    Webster, L.

    1976-01-01

    A model for predicting the computational performance of a maximum likelihood convolutional decoder (MCD) operating in a noisy carrier reference environment is described. This model is used to develop a subroutine that will be utilized by the Telemetry Analysis Program to compute the MCD bit error rate. When this computational model is averaged over noisy reference phase errors using a high-rate interpolation scheme, the results are found to agree quite favorably with experimental measurements.

  15. A likelihood ratio-based method to predict exact pedigrees for complex families from next-generation sequencing data.

    Science.gov (United States)

    Heinrich, Verena; Kamphans, Tom; Mundlos, Stefan; Robinson, Peter N; Krawitz, Peter M

    2017-01-01

    Next generation sequencing technology considerably changed the way we screen for pathogenic mutations in rare Mendelian disorders. However, the identification of the disease-causing mutation amongst thousands of variants of partly unknown relevance is still challenging, and efficient techniques that reduce the genomic search space play a decisive role. Often segregation or linkage analysis is used to prioritize candidates; however, these approaches require correct information about the degree of relationship among the sequenced samples. For quality assurance, an automated control of pedigree structures and sample assignment is therefore highly desirable in order to detect label mix-ups that might otherwise corrupt downstream analysis. We developed an algorithm based on likelihood ratios that discriminates between different classes of relationship for an arbitrary number of genotyped samples. By identifying the most likely class we are able to reconstruct entire pedigrees iteratively, even for highly consanguineous families. We tested our approach on exome data of different sequencing studies and achieved high precision for all pedigree predictions. By analyzing the precision for varying degrees of relatedness or inbreeding we could show that a prediction is robust down to magnitudes of a few hundred loci. A Java standalone application that computes the relationships between multiple samples, as well as an R script that visualizes the pedigree information, is available for download as well as a web service at www.gene-talk.de. Contact: heinrich@molgen.mpg.de. Supplementary information: Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.
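
A generic likelihood-ratio comparison of relationship classes can be sketched with standard IBD-sharing coefficients (k0, k1, k2) at independent biallelic loci. This is an illustrative reconstruction, not the authors' software; the floor inside the log guards genotype configurations that are impossible under a hypothesis (e.g. opposite homozygotes for parent-offspring).

```python
import numpy as np

rng = np.random.default_rng(42)

def geno_prob(g, p):
    """HWE probability of genotype g = number of 'A' alleles (0, 1 or 2)."""
    q = 1.0 - p
    return (q * q, 2.0 * p * q, p * p)[g]

def cond_ibd1(g1, g2, p):
    """P(g2 | g1) when the pair shares exactly one allele IBD."""
    q = 1.0 - p
    table = {2: (0.0, q, p), 1: (q / 2, 0.5, p / 2), 0: (q, p, 0.0)}
    return table[g1][g2]

CLASSES = {"unrelated": (1.0, 0.0, 0.0),
           "parent-offspring": (0.0, 1.0, 0.0),
           "full-sib": (0.25, 0.5, 0.25)}

def pair_loglik(g1s, g2s, ps, k0, k1, k2):
    """Log-likelihood of genotype pairs under IBD-sharing probabilities."""
    ll = 0.0
    for g1, g2, p in zip(g1s, g2s, ps):
        lik = (k0 * geno_prob(g1, p) * geno_prob(g2, p)
               + k1 * geno_prob(g1, p) * cond_ibd1(g1, g2, p)
               + k2 * geno_prob(g1, p) * (1.0 if g1 == g2 else 0.0))
        ll += np.log(max(lik, 1e-300))  # floor: configurations impossible under a class
    return ll

def classify(g1s, g2s, ps):
    """Pick the relationship class with the highest log-likelihood."""
    return max(CLASSES, key=lambda c: pair_loglik(g1s, g2s, ps, *CLASSES[c]))

# Simulate 300 independent SNPs (allele frequency 0.5) for a full-sib pair
L, p = 300, 0.5
mom = (rng.random((L, 2)) < p).astype(int)   # maternal allele pairs
dad = (rng.random((L, 2)) < p).astype(int)
idx = np.arange(L)
sib1 = mom[idx, rng.integers(0, 2, L)] + dad[idx, rng.integers(0, 2, L)]
sib2 = mom[idx, rng.integers(0, 2, L)] + dad[idx, rng.integers(0, 2, L)]
label = classify(sib1, sib2, [p] * L)   # 'full-sib' is overwhelmingly favored here
```

A few hundred loci suffice here, consistent with the robustness magnitude reported in the abstract.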

  16. Worldwide trends in gastric cancer mortality (1980-2011), with predictions to 2015, and incidence by subtype.

    Science.gov (United States)

    Ferro, Ana; Peleteiro, Bárbara; Malvezzi, Matteo; Bosetti, Cristina; Bertuccio, Paola; Levi, Fabio; Negri, Eva; La Vecchia, Carlo; Lunet, Nuno

    2014-05-01

    Gastric cancer incidence and mortality decreased substantially over the last decades in most countries worldwide, with differences in the trends and distribution of the main topographies across regions. To monitor recent mortality trends (1980-2011) and to compute short-term predictions (2015) of gastric cancer mortality in selected countries worldwide, we analysed mortality data provided by the World Health Organization. We also analysed incidence of cardia and non-cardia cancers using data from Cancer Incidence in Five Continents (2003-2007). The joinpoint regression over the most recent calendar periods gave estimated annual percent changes (EAPC) around -3% for the European Union (EU) and major European countries, as well as in Japan and Korea, and around -2% in North America and major Latin American countries. In the United States of America (USA), EU and other major countries worldwide, the EAPC, however, were lower than in previous years. The predictions for 2015 show that a levelling off of rates is expected in the USA and a few other countries. The relative contribution of cardia and non-cardia gastric cancers to the overall number of cases varies widely, with a generally higher proportion of cardia cancers in countries with lower gastric cancer incidence and mortality rates (e.g. the USA, Canada and Denmark). Despite the favourable mortality trends worldwide, in some countries the declines are becoming less marked. There still is the need to control Helicobacter pylori infection and other risk factors, as well as to improve diagnosis and management, to further reduce the burden of gastric cancer. Copyright © 2014 Elsevier Ltd. All rights reserved.
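
Within a single joinpoint segment, the estimated annual percent change reduces to a log-linear fit of rates on calendar year, with EAPC = 100·(e^b − 1) where b is the fitted slope. A minimal sketch with invented rates, not the paper's data:

```python
import numpy as np

def eapc(years, rates):
    """Estimated annual percent change from a log-linear fit ln(rate) = a + b*year."""
    b, a = np.polyfit(years, np.log(rates), 1)
    return 100.0 * (np.exp(b) - 1.0)

# Rates falling by exactly 3% per year give an EAPC of -3
years = np.arange(2000, 2011)
rates = 10.0 * 0.97 ** (years - 2000)
print(round(eapc(years, rates), 6))  # → -3.0
```

Joinpoint regression extends this by letting the slope b change at estimated breakpoints, which is how a decline "becoming less marked" shows up as a smaller |EAPC| in the most recent segment.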

  17. Incidence trends for childhood type 1 diabetes in Europe during 1989-2003 and predicted new cases 2005-20: a multicentre prospective registration study

    DEFF Research Database (Denmark)

    Patterson, Christopher C; Dahlquist, Gisela G; Gyürüs, Eva

    2009-01-01

    BACKGROUND: The incidence of type 1 diabetes in children younger than 15 years is increasing. Prediction of future incidence of this disease will enable adequate fund allocation for delivery of care to be planned. We aimed to establish 15-year incidence trends for childhood type 1 diabetes in Eur...

  18. Optimized Large-scale CMB Likelihood and Quadratic Maximum Likelihood Power Spectrum Estimation

    Science.gov (United States)

    Gjerløw, E.; Colombo, L. P. L.; Eriksen, H. K.; Górski, K. M.; Gruppuso, A.; Jewell, J. B.; Plaszczynski, S.; Wehus, I. K.

    2015-11-01

    We revisit the problem of exact cosmic microwave background (CMB) likelihood and power spectrum estimation with the goal of minimizing computational costs through linear compression. This idea was originally proposed for CMB purposes by Tegmark et al., and here we develop it into a fully functioning computational framework for large-scale polarization analysis, adopting WMAP as a working example. We compare five different linear bases (pixel space, harmonic space, noise covariance eigenvectors, signal-to-noise covariance eigenvectors, and signal-plus-noise covariance eigenvectors) in terms of compression efficiency, and find that the computationally most efficient basis is the signal-to-noise eigenvector basis, which is closely related to the Karhunen-Loeve and Principal Component transforms, in agreement with previous suggestions. For this basis, the information in 6836 unmasked WMAP sky map pixels can be compressed into a smaller set of 3102 modes, with a maximum error increase of any single multipole of 3.8% at ℓ ≤ 32 and a maximum shift in the mean values of a joint distribution of an amplitude-tilt model of 0.006σ. This compression reduces the computational cost of a single likelihood evaluation by a factor of 5, from 38 to 7.5 CPU seconds, and it also results in a more robust likelihood by implicitly regularizing nearly degenerate modes. Finally, we use the same compression framework to formulate a numerically stable and computationally efficient variation of the Quadratic Maximum Likelihood implementation, which requires less than 3 GB of memory and 2 CPU minutes per iteration for ℓ ≤ 32, rendering low-ℓ QML CMB power spectrum analysis fully tractable on a standard laptop.
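
The signal-to-noise eigenvector compression can be sketched on a toy data vector: solve the generalized eigenproblem S v = λ N v, keep the high-λ modes, and project the data onto them. The covariances below are invented stand-ins for the CMB signal and noise matrices, not WMAP quantities.

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(0)
n = 50
x = np.arange(n)
# Invented stand-ins: a smooth "signal" covariance and white "noise"
S = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2 / 5.0 ** 2)
N = 0.1 * np.eye(n)

# Signal-to-noise eigenmodes: solve S v = lam N v (generalized eigenproblem)
lam, V = eigh(S, N)
order = np.argsort(lam)[::-1]          # sort modes by decreasing S/N
lam, V = lam[order], V[:, order]

k = int(np.sum(lam > 1.0))             # keep modes with S/N above unity
B = V[:, :k]                           # compression matrix

d = rng.multivariate_normal(np.zeros(n), S + N)   # a simulated data vector
d_c = B.T @ d                          # compressed data: k numbers instead of n
```

Discarding the low-λ modes is what implicitly regularizes nearly degenerate directions, as the abstract notes for the WMAP case.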

  19. Spontaneous regression of retinopathy of prematurity:incidence and predictive factors

    Directory of Open Access Journals (Sweden)

    Rui-Hong Ju

    2013-08-01

    Full Text Available AIM: To evaluate the incidence of spontaneous regression of changes in the retina and vitreous in active-stage retinopathy of prematurity (ROP) and identify possible factors related to the regression. METHODS: This was a retrospective, hospital-based study. The study consisted of 39 premature infants with mild ROP that showed spontaneous regression (Group A) and 17 with severe ROP who had been treated before naturally involuting (Group B), from August 2008 through May 2011. Data on gender, single or multiple pregnancy, gestational age, birth weight, weight gain from birth to the sixth week of life, use of oxygen in mechanical ventilation, total duration of oxygen inhalation, surfactant given or not, need for and number of blood transfusions, 1-, 5-, and 10-min Apgar scores, presence of bacterial, fungal, or combined infection, hyaline membrane disease (HMD), patent ductus arteriosus (PDA), duration of stay in the neonatal intensive care unit (NICU), and duration of ROP were recorded. RESULTS: The incidence of spontaneous regression of ROP was 86.7% for stage 1, and 57.1% and 5.9% for stage 2 and stage 3, respectively. Regression was detected in 100% of changes in zone Ⅲ, 46.2% in zone Ⅱ, and 0% in zone Ⅰ. The mean duration of ROP in the spontaneous regression group was 5.65±3.14 weeks, lower than that of the treated ROP group (7.34±4.33 weeks), but this difference was not statistically significant (P=0.201). GA, 1-min Apgar score, 5-min Apgar score, duration of NICU stay, postnatal age at initial screening, and oxygen therapy longer than 10 days were significant predictive factors for the spontaneous regression of ROP (P<0.05). Retinal hemorrhage was the only independent predictive factor for the spontaneous regression of ROP (OR 0.030, 95% CI 0.001-0.775, P=0.035). CONCLUSION: This study showed that most stage 1 and 2 ROP, and changes in zone Ⅲ, can spontaneously regress in the end. Retinal hemorrhage is weakly inversely associated with spontaneous regression.

  20. Model variations in predicting incidence of Plasmodium falciparum malaria using 1998-2007 morbidity and meteorological data from south Ethiopia

    OpenAIRE

    Loha, Eskindir; Lindtjørn, Bernt

    2010-01-01

    Abstract Background Malaria transmission is complex and is believed to be associated with local climate changes. However, simple attempts to extrapolate malaria incidence rates from averaged regional meteorological conditions have proven unsuccessful. Therefore, the objective of this study was to determine if variations in specific meteorological factors are able to consistently predict P. falciparum malaria incidence at different locations in south Ethiopia. Methods Retrospective data from 4...

  1. Assessing cutoff values for increased exercise blood pressure to predict incident hypertension in a general population.

    Science.gov (United States)

    Lorbeer, Roberto; Ittermann, Till; Völzke, Henry; Gläser, Sven; Ewert, Ralf; Felix, Stephan B; Dörr, Marcus

    2015-07-01

    Cutoff values for increased exercise blood pressure (BP) are not established in hypertension guidelines. The aim of the study was to assess optimal cutoff values for increased exercise BP to predict incident hypertension. Data of 661 normotensive participants (386 women) aged 25-77 years from the Study of Health in Pomerania (SHIP-1) with a 5-year follow-up were used. Exercise BP was measured at a submaximal level of 100 W and at the maximum level of a symptom-limited cycle ergometry test. Cutoff values for increased exercise BP were defined at the maximum sum of sensitivity and specificity for the prediction of incident hypertension. The area under the receiver-operating characteristic curve (AUC) and net reclassification index (NRI) were calculated to investigate whether increased exercise BP adds predictive value for incident hypertension beyond established cardiovascular risk factors. In men, values of 160 mmHg at the 100 W level (AUC = 0.7837; NRI = 0.534) and 210 mmHg at the maximum level (AUC = 0.7677; NRI = 0.340, P = 0.003) were detected as optimal cutoff values for the definition of increased exercise SBP. A value of 190 mmHg (AUC = 0.8347; NRI = 0.519, P < 0.001) showed relevance for the definition of increased exercise SBP in women at the maximum level. According to our analyses, 190 and 210 mmHg are clinically relevant cutoff values for increased exercise SBP at the maximum exercise level of a cycle ergometry test for women and men, respectively. In addition, for men, our analyses provided a cutoff value of 160 mmHg for increased exercise SBP at the 100 W level.
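
Choosing the cutoff at the maximum sum of sensitivity and specificity is equivalent to maximizing Youden's J statistic over candidate thresholds. A minimal sketch with invented data, not the SHIP-1 measurements:

```python
import numpy as np

def optimal_cutoff(values, outcomes):
    """Cutoff maximizing sensitivity + specificity (Youden's J) for a
    continuous marker and a binary outcome (1 = incident case)."""
    values = np.asarray(values, dtype=float)
    outcomes = np.asarray(outcomes)
    best_cut, best_j = None, -np.inf
    for c in np.unique(values):
        pred = values >= c                       # "increased" at or above cutoff
        sens = pred[outcomes == 1].mean()
        spec = (~pred)[outcomes == 0].mean()
        if sens + spec - 1 > best_j:
            best_cut, best_j = c, sens + spec - 1
    return best_cut, best_j

# Invented toy data: incident cases tend to have higher exercise SBP
sbp = [150, 155, 160, 165, 170, 180, 190, 200]
case = [0, 0, 0, 1, 1, 1, 1, 1]
cut, j = optimal_cutoff(sbp, case)   # cut = 165.0, a perfect separator here
```

In real cohort data J stays well below 1, and the AUC and NRI quoted above quantify how much such a cutoff adds beyond established risk factors.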

  2. Essays on empirical likelihood in economics

    NARCIS (Netherlands)

    Gao, Z.

    2012-01-01

    This thesis intends to exploit the roots of empirical likelihood and its related methods in mathematical programming and computation. The roots will be connected and the connections will induce new solutions for the problems of estimation, computation, and generalization of empirical likelihood.

  3. Simulation-based marginal likelihood for cluster strong lensing cosmology

    Science.gov (United States)

    Killedar, M.; Borgani, S.; Fabjan, D.; Dolag, K.; Granato, G.; Meneghetti, M.; Planelles, S.; Ragone-Figueroa, C.

    2018-01-01

    Comparisons between observed and predicted strong lensing properties of galaxy clusters have been routinely used to claim either tension or consistency with Λ cold dark matter cosmology. However, standard approaches to such cosmological tests are unable to quantify the preference for one cosmology over another. We advocate approximating the relevant Bayes factor using a marginal likelihood that is based on the following summary statistic: the posterior probability distribution function for the parameters of the scaling relation between Einstein radii and cluster mass, α and β. We demonstrate, for the first time, a method of estimating the marginal likelihood using the X-ray selected z > 0.5 Massive Cluster Survey clusters as a case in point and employing both N-body and hydrodynamic simulations of clusters. We investigate the uncertainty in this estimate and consequential ability to compare competing cosmologies, which arises from incomplete descriptions of baryonic processes, discrepancies in cluster selection criteria, redshift distribution and dynamical state. The relation between triaxial cluster masses at various overdensities provides a promising alternative to the strong lensing test.

  4. Composite likelihood estimation of demographic parameters

    Directory of Open Access Journals (Sweden)

    Garrigan Daniel

    2009-11-01

    Full Text Available Abstract Background Most existing likelihood-based methods for fitting historical demographic models to DNA sequence polymorphism data do not scale feasibly up to the level of whole-genome data sets. Computational economies can be achieved by incorporating two forms of pseudo-likelihood: composite and approximate likelihood methods. Composite likelihood enables scaling up to large data sets because it takes the product of marginal likelihoods as an estimator of the likelihood of the complete data set. This approach is especially useful when a large number of genomic regions constitutes the data set. Additionally, approximate likelihood methods can reduce the dimensionality of the data by summarizing the information in the original data by either a sufficient statistic, or a set of statistics. Both composite and approximate likelihood methods hold promise for analyzing large data sets or for use in situations where the underlying demographic model is complex and has many parameters. This paper considers a simple demographic model of allopatric divergence between two populations, in which one of the populations is hypothesized to have experienced a founder event, or population bottleneck. A large resequencing data set from human populations is summarized by the joint frequency spectrum, which is a matrix of the genomic frequency spectrum of derived base frequencies in two populations. A Bayesian Metropolis-coupled Markov chain Monte Carlo (MCMCMC) method for parameter estimation is developed that uses both composite and approximate likelihood methods and is applied to the three different pairwise combinations of the human population resequence data. The accuracy of the method is also tested on data sets sampled from a simulated population model with known parameters. Results The Bayesian MCMCMC method also estimates the ratio of effective population size for the X chromosome versus that of the autosomes. The method is shown to estimate, with reasonable
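
The composite-likelihood idea — take the product of per-region marginal likelihoods as a stand-in for an intractable joint likelihood — can be shown on a toy example. This is an illustrative sketch with an invented binomial marginal model, not the paper's demographic likelihood:

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import binom

# Toy data: derived-allele counts in samples of size 20 from 500 regions
rng = np.random.default_rng(7)
n, true_f = 20, 0.3
counts = rng.binomial(n, true_f, size=500)

def neg_composite_loglik(f):
    """Negative composite log-likelihood: sum of per-region marginal
    binomial log-likelihoods, treating regions as independent."""
    return -binom.logpmf(counts, n, f).sum()

res = minimize_scalar(neg_composite_loglik, bounds=(0.01, 0.99), method="bounded")
# res.x is the maximum composite likelihood estimate of the frequency
```

Ignoring linkage between regions biases the implied information, not the point estimate, which is why composite-likelihood estimates are typically paired with adjusted (e.g. Godambe-type) variance estimates.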

  5. Predicting Cumulative Incidence Probability: Marginal and Cause-Specific Modelling

    DEFF Research Database (Denmark)

    Scheike, Thomas H.; Zhang, Mei-Jie

    2005-01-01

    cumulative incidence probability; cause-specific hazards; subdistribution hazard; binomial modelling
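
The cause-specific-hazards route to a cumulative incidence probability can be illustrated in discrete time, where CIF_k(t) = Σ_{u≤t} S(u−) h_k(u). The constant hazards below are invented for illustration:

```python
import numpy as np

# Two competing causes with constant (invented) cause-specific hazards per interval
h1, h2 = 0.02, 0.01
n_t = 60
surv = np.cumprod(np.full(n_t, 1.0 - h1 - h2))    # overall survival S(t)
surv_prev = np.concatenate(([1.0], surv[:-1]))    # S(t-): survival just before t
cif1 = np.cumsum(surv_prev * h1)   # cumulative incidence of cause 1
cif2 = np.cumsum(surv_prev * h2)   # cumulative incidence of cause 2

# Sanity check: the two CIFs and overall survival partition probability 1
assert np.allclose(cif1 + cif2 + surv, 1.0)
```

The subdistribution-hazard and direct binomial approaches named in the keywords model cif1 itself rather than assembling it from cause-specific pieces.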

  6. Predicted risks of second malignant neoplasm incidence and mortality due to secondary neutrons in a girl and boy receiving proton craniospinal irradiation

    International Nuclear Information System (INIS)

    Taddei, Phillip J; Mirkovic, Dragan; Zhang Rui; Giebeler, Annelise; Harvey, Mark; Newhauser, Wayne D; Mahajan, Anita; Kornguth, David; Woo, Shiao

    2010-01-01

    The purpose of this study was to compare the predicted risks of second malignant neoplasm (SMN) incidence and mortality from secondary neutrons for a 9-year-old girl and a 10-year-old boy who received proton craniospinal irradiation (CSI). SMN incidence and mortality from neutrons were predicted from equivalent doses to radiosensitive organs for cranial, spinal and intracranial boost fields. Therapeutic proton absorbed dose and equivalent dose from neutrons were calculated using Monte Carlo simulations. Risks of SMN incidence and mortality in most organs and tissues were predicted by applying risks models from the National Research Council of the National Academies to the equivalent dose from neutrons; for non-melanoma skin cancer, risk models from the International Commission on Radiological Protection were applied. The lifetime absolute risks of SMN incidence due to neutrons were 14.8% and 8.5%, for the girl and boy, respectively. The risks of a fatal SMN were 5.3% and 3.4% for the girl and boy, respectively. The girl had a greater risk for any SMN except colon and liver cancers, indicating that the girl's higher risks were not attributable solely to greater susceptibility to breast cancer. Lung cancer predominated the risk of SMN mortality for both patients. This study suggests that the risks of SMN incidence and mortality from neutrons may be greater for girls than for boys treated with proton CSI.

  7. Nine-year incident diabetes is predicted by fatty liver indices: the French D.E.S.I.R. study

    Directory of Open Access Journals (Sweden)

    Vol Sylviane

    2010-06-01

    Full Text Available Abstract Background: Fatty liver is known to be linked with insulin resistance, alcohol intake, diabetes and obesity. Biopsy- and even scan-assessed fatty liver are not always feasible in clinical practice. This report evaluates the predictive ability of two recently published markers of fatty liver, the Fatty Liver Index (FLI) and the NAFLD fatty liver score (NAFLD-FLS), for 9-year incident diabetes, in the French general-population cohort Data from an Epidemiological Study on the Insulin Resistance syndrome (D.E.S.I.R.). Methods: At baseline, there were 1861 men and 1950 women, non-diabetic, aged 30 to 65 years. Over the follow-up, 203 incident diabetes cases (140 men, 63 women) were identified by diabetes treatment or fasting plasma glucose ≥ 7.0 mmol/l. The FLI includes BMI, waist circumference, triglycerides and gamma-glutamyl transferase; the NAFLD-FLS includes the metabolic syndrome, diabetes, insulin, alanine aminotransferase, and aspartate aminotransferase. Logistic regression was used to determine the odds ratios for incident diabetes associated with categories of the fatty liver indices. Results: In comparison to those with a FLI … Conclusions: These fatty liver indexes are simple clinical tools for evaluating the extent of liver fat and they are predictive of incident diabetes. Physicians should screen for diabetes in patients with fatty liver.

  8. [Predicting Incidence of Hepatitis E in China using Fuzzy Time Series Based on Fuzzy C-Means Clustering Analysis].

    Science.gov (United States)

    Luo, Yi; Zhang, Tao; Li, Xiao-song

    2016-05-01

    To explore the application of a fuzzy time series model based on fuzzy c-means clustering in forecasting the monthly incidence of Hepatitis E in mainland China. A predictive model (fuzzy time series method based on fuzzy c-means clustering) was developed using Hepatitis E incidence data in mainland China between January 2004 and July 2014. The incidence data from August 2014 to November 2014 were used to test the fitness of the predictive model. The forecasting results were compared with those resulting from traditional fuzzy time series models. The fuzzy time series model based on fuzzy c-means clustering had a mean squared error (MSE) of fitting of 0.0011 and an MSE of forecasting of 6.9775 × 10⁻⁴, compared with 0.0017 and 0.0014 from the traditional forecasting model. The results indicate that the fuzzy time series model based on fuzzy c-means clustering has a better performance in forecasting the incidence of Hepatitis E.
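
In such models, fuzzy c-means clustering partitions the universe of discourse before the fuzzy time series rules are built. The clustering step itself can be sketched as follows; this is a generic 1-D implementation, not the paper's full forecasting model:

```python
import numpy as np

def fuzzy_c_means(x, c=2, m=2.0, iters=100, seed=0):
    """Basic fuzzy c-means on 1-D data: returns cluster centers and the
    membership matrix U (rows sum to 1; fuzzifier m > 1)."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float).reshape(-1, 1)
    U = rng.dirichlet(np.ones(c), size=len(x))        # random initial memberships
    for _ in range(iters):
        W = U ** m
        centers = (W.T @ x) / W.sum(axis=0)[:, None]  # weighted cluster means
        d = np.abs(x - centers.T) + 1e-12             # distances to centers
        inv = d ** (-2.0 / (m - 1.0))
        U = inv / inv.sum(axis=1, keepdims=True)      # standard FCM membership update
    return centers.ravel(), U

# Two well-separated groups: centers should land near 0.1 and 10.1
centers, U = fuzzy_c_means([0.0, 0.1, 0.2, 10.0, 10.1, 10.2], c=2)
```

Each cluster center then anchors a fuzzy set over the incidence range, and historical transitions between fuzzy sets supply the forecasting rules.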

  9. Cancer incidence predictions in the North of Portugal: keeping population-based cancer registration up to date.

    Science.gov (United States)

    Castro, Clara; Antunes, Luís; Lunet, Nuno; Bento, Maria José

    2016-09-01

    Decision making towards cancer prevention and control requires monitoring of trends in cancer incidence and accurate estimation of its burden in different settings. We aimed to estimate the number of incident cases in northern Portugal for 2015 and 2020 (all cancers except nonmelanoma skin and for the 15 most frequent tumours). Cancer cases diagnosed in 1994-2009 were collected by the North Region Cancer Registry of Portugal (RORENO) and corresponding population figures were obtained from Statistics Portugal. JoinPoint regression was used to analyse incidence trends. Population projections until 2020 were derived by RORENO. Predictions were performed using the Poisson regression models proposed by Dyba and Hakulinen. The number of incident cases is expected to increase by 18.7% in 2015 and by 37.6% in 2020, with lower increments among men than among women. For most cancers considered, the number of cases will keep rising up to 2020, although decreasing trends of age-standardized rates are expected for some tumours. Cervix was the only cancer with a decreasing number of incident cases in the entire period. Thyroid and lung cancers were among those with the steepest increases in the number of incident cases expected for 2020, especially among women. In 2020, the top five cancers are expected to account for 82 and 62% of all cases diagnosed in men and women, respectively. This study contributes to a broader understanding of cancer burden in the north of Portugal and provides the basis for keeping population-based incidence estimates up to date.

  10. Likelihood-based methods for evaluating principal surrogacy in augmented vaccine trials.

    Science.gov (United States)

    Liu, Wei; Zhang, Bo; Zhang, Hui; Zhang, Zhiwei

    2017-04-01

    There is growing interest in assessing immune biomarkers, which are quick to measure and potentially predictive of long-term efficacy, as surrogate endpoints in randomized, placebo-controlled vaccine trials. This can be done under a principal stratification approach, with principal strata defined using a subject's potential immune responses to vaccine and placebo (the latter may be assumed to be zero). In this context, principal surrogacy refers to the extent to which vaccine efficacy varies across principal strata. Because a placebo recipient's potential immune response to vaccine is unobserved in a standard vaccine trial, augmented vaccine trials have been proposed to produce the information needed to evaluate principal surrogacy. This article reviews existing methods based on an estimated likelihood and a pseudo-score (PS) and proposes two new methods based on a semiparametric likelihood (SL) and a pseudo-likelihood (PL), for analyzing augmented vaccine trials. Unlike the PS method, the SL method does not require a model for missingness, which can be advantageous when immune response data are missing by happenstance. The SL method is shown to be asymptotically efficient, and it performs similarly to the PS and PL methods in simulation experiments. The PL method appears to have a computational advantage over the PS and SL methods.

  11. Tapered composite likelihood for spatial max-stable models

    KAUST Repository

    Sang, Huiyan

    2014-05-01

    Spatial extreme value analysis is useful to environmental studies, in which extreme value phenomena are of interest and meaningful spatial patterns can be discerned. Max-stable process models are able to describe such phenomena. This class of models is asymptotically justified to characterize the spatial dependence among extremes. However, likelihood inference is challenging for such models because their corresponding joint likelihood is unavailable and only bivariate or trivariate distributions are known. In this paper, we propose a tapered composite likelihood approach by utilizing lower dimensional marginal likelihoods for inference on parameters of various max-stable process models. We consider a weighting strategy based on a "taper range" to exclude distant pairs or triples. The "optimal taper range" is selected to maximize various measures of the Godambe information associated with the tapered composite likelihood function. This method substantially reduces the computational cost and improves the efficiency over equally weighted composite likelihood estimators. We illustrate its utility with simulation experiments and an analysis of rainfall data in Switzerland.

  13. Associations between active shooter incidents and gun ownership and storage among families with young children in the United States.

    Science.gov (United States)

    Morrissey, Taryn W

    2017-07-01

    The presence of firearms and their unsafe storage in the home can increase risk of firearm-related death and injury, but public opinion suggests that firearm ownership is a protective factor against gun violence. This study examined the effects of a recent nearby active shooter incident on gun ownership and storage practices among families with young children. A series of regression models, with data from the nationally representative Early Childhood Longitudinal Study-Birth Cohort merged with the FBI's Active Shooter Incidents data collected in 2003-2006, were used to examine whether household gun ownership and storage practices differed in the months prior to and following an active shooter incident that occurred anywhere in the United States or within the same state. Approximately one-fifth of young children lived in households with one or more guns; of these children, only two-thirds lived in homes that stored all guns in locked cabinets. Results suggest that the experience of a recent active shooter incident was associated with an increased likelihood of storing all guns locked, with the magnitude dependent on the temporal and geographic proximity of the incident. The severity of the incident, defined as the number of fatalities, predicted an increase in storing guns locked. Findings suggest that public shootings change behaviors related to firearm storage among families with young children. Copyright © 2017 Elsevier Inc. All rights reserved.

  14. Prediction of incidence and stability of alcohol use disorders by latent internalizing psychopathology risk profiles in adolescence and young adulthood.

    Science.gov (United States)

    Behrendt, Silke; Bühringer, Gerhard; Höfler, Michael; Lieb, Roselind; Beesdo-Baum, Katja

    2017-10-01

    Comorbid internalizing mental disorders in alcohol use disorders (AUD) can be understood as putative independent risk factors for AUD or as expressions of underlying shared psychopathology vulnerabilities. However, it remains unclear whether: 1) specific latent internalizing psychopathology risk-profiles predict AUD-incidence and 2) specific latent internalizing comorbidity-profiles in AUD predict AUD-stability. To investigate baseline latent internalizing psychopathology risk profiles as predictors of subsequent AUD-incidence and -stability in adolescents and young adults. Data from the prospective-longitudinal EDSP study (baseline age 14-24 years) were used. The study-design included up to three follow-up assessments in up to ten years. DSM-IV mental disorders were assessed with the DIA-X/M-CIDI. To investigate risk-profiles and their associations with AUD-outcomes, latent class analysis with auxiliary outcome variables was applied. AUD-incidence: a 4-class model (N=1683) was identified (classes: normative-male [45.9%], normative-female [44.2%], internalizing [5.3%], nicotine dependence [4.5%]). Compared to the normative-female class, all other classes were associated with a higher risk of subsequent incident alcohol dependence (p<0.05). AUD-stability: a 3-class model (N=1940) was identified with only one class (11.6%) with high probabilities for baseline AUD. This class was further characterized by elevated substance use disorder (SUD) probabilities and predicted any subsequent AUD (OR 8.5, 95% CI 5.4-13.3). An internalizing vulnerability may constitute a pathway to AUD incidence in adolescence and young adulthood. In contrast, no indication for a role of internalizing comorbidity profiles in AUD-stability was found, which may indicate a limited importance of such profiles - in contrast to SUD-related profiles - in AUD stability. Copyright © 2017 Elsevier B.V. All rights reserved.

  15. The paradox of high apolipoprotein A-I levels independently predicting incident type-2 diabetes among Turks.

    Science.gov (United States)

    Onat, A; Hergenç, G; Bulur, S; Uğur, M; Küçükdurmaz, Z; Can, G

    2010-06-25

    Predictive value of apolipoprotein (apo) A-I for incident hypertension, metabolic syndrome (MetS), type 2 diabetes (DM) and coronary heart disease (CHD) needs further exploration. A representative sample of Turkish adults was studied with this purpose prospectively. Sex-specific apoA-I tertiles were examined regarding cardiometabolic risk. A total of 1044 men and 1067 women (aged 49+/-12 years at baseline) were followed up over 7.4 years. High serum apoA-I levels were significantly associated in multivariable analysis with female sex, aging, alcohol intake, (inversely) cigarette smoking and, in women, with systolic blood pressure. Risk of diabetes was predicted in logistic regression in both genders by top versus bottom apoA-I tertile (RR 1.98; [95%CI 1.31; 3.0]), additive to age, body mass index (BMI), C-reactive protein (CRP), HDL-cholesterol and lipid lowering drugs. By adding sex hormone-binding globulin to the model in a subset of the sample, the association between high apoA-I and incident diabetes was attenuated only in women. ApoA-I tertiles tended to be positively associated also with hypertension and CHD only in women but this did not reach significance. High compared with low serum apoA-I levels nearly double the risk for incident diabetes, additively to age, BMI, CRP, HDL-cholesterol among Turks. Systemic inflammation concomitant with prevailing MetS might turn apoA-I into proinflammatory particles. Copyright 2008 Elsevier Ireland Ltd. All rights reserved.

  16. The Laplace Likelihood Ratio Test for Heteroscedasticity

    Directory of Open Access Journals (Sweden)

    J. Martin van Zyl

    2011-01-01

    It is shown that the likelihood ratio test for heteroscedasticity, assuming the Laplace distribution, gives good results for Gaussian and fat-tailed data. The likelihood ratio test, assuming normality, is very sensitive to any deviation from normality, especially when the observations are from a distribution with fat tails. Such a likelihood test can also be used as a robust test for a constant variance in residuals or a time series if the data is partitioned into groups.
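
    A two-group version of the test can be sketched as follows. Under the Laplace model the location MLE is the median and the scale MLE is the mean absolute deviation about the median, so the maximized log-likelihood has a closed form and the LR statistic (compare to a chi-square with 1 df) is a few lines of code. This is a hedged illustration of the idea, not the author's exact formulation.

```python
import numpy as np

def laplace_loglik_at_mle(x):
    """Maximized Laplace log-likelihood: location MLE = median,
    scale MLE = mean absolute deviation about the median."""
    x = np.asarray(x, dtype=float)
    b = np.mean(np.abs(x - np.median(x)))
    return -len(x) * (np.log(2.0 * b) + 1.0)

def laplace_lr_heteroscedasticity(x, y):
    """LR statistic for equal Laplace scale in two groups (each group
    keeps its own location); refer to chi-square with 1 df."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    # null hypothesis: common scale b0, separate locations
    dev = np.concatenate([np.abs(x - np.median(x)), np.abs(y - np.median(y))])
    b0 = dev.mean()
    ll0 = -len(dev) * (np.log(2.0 * b0) + 1.0)
    ll1 = laplace_loglik_at_mle(x) + laplace_loglik_at_mle(y)
    return 2.0 * (ll1 - ll0)
```

    Identical groups give a statistic of zero; rescaling one group inflates it, signalling heteroscedasticity.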

  17. Costs, mortality likelihood and outcomes of hospitalized US children with traumatic brain injuries.

    Science.gov (United States)

    Shi, Junxin; Xiang, Huiyun; Wheeler, Krista; Smith, Gary A; Stallones, Lorann; Groner, Jonathan; Wang, Zengzhen

    2009-07-01

    To examine the hospitalization costs and discharge outcomes of US children with TBI and to evaluate a severity measure, the predictive mortality likelihood level. Data from the 2006 Healthcare Cost and Utilization Project Kids' Inpatient Database (KID) were used to report the national estimates and characteristics of TBI-associated hospitalizations among US children. The percentage of children with TBI caused by motor vehicle crashes (MVC) and falls was calculated according to the predictive mortality likelihood levels (PMLL), death in hospital, and discharge into long-term rehabilitation facilities. Associations with the PMLL, discharge outcomes and average hospital charges were examined. In 2006, there were an estimated 58 900 TBI-associated hospitalizations among US children, accounting for $2.56 billion in hospital charges. MVCs caused 38.9% and falls caused 21.2% of TBI hospitalizations. The PMLL was strongly associated with TBI type, length of hospital stay, hospital charges and discharge disposition. About 4% of children with fall- or MVC-related TBIs died in hospital and 9% were discharged into long-term facilities. The PMLL may provide a useful tool to assess characteristics and treatment outcomes of hospitalized children with TBI, but more research is still needed.

  18. Incident Duration Modeling Using Flexible Parametric Hazard-Based Models

    Directory of Open Access Journals (Sweden)

    Ruimin Li

    2014-01-01

    Assessing and prioritizing the duration time and effects of traffic incidents on major roads present significant challenges for road network managers. This study examines the effect of numerous factors associated with various types of incidents on their duration and proposes an incident duration prediction model. Several parametric accelerated failure time hazard-based models were examined, including Weibull, log-logistic, log-normal, and generalized gamma, as well as all models with gamma heterogeneity and flexible parametric hazard-based models with degrees of freedom ranging from one to ten, by analyzing a traffic incident dataset obtained from the Incident Reporting and Dispatching System in Beijing in 2008. Results show that different factors significantly affect different incident time phases, whose best distributions were diverse. Given the best hazard-based models of each incident time phase, the prediction result can be reasonable for most incidents. The results of this study can aid traffic incident management agencies not only in implementing strategies that would reduce incident duration, and thus reduce congestion, secondary incidents, and the associated human and economic losses, but also in effectively predicting incident duration time.
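
    The study fits accelerated failure time models with covariates and flexible baselines; as a hedged, covariate-free sketch, the simplest member of that family (the Weibull duration model) can be fit by maximum likelihood using the fact that, for a fixed shape k, the scale MLE has the closed form lam(k) = mean(t**k)**(1/k), leaving a one-dimensional search over the shape.

```python
import numpy as np

def weibull_loglik(t, k, lam):
    # Weibull log-density summed over the observed durations t
    return np.sum(np.log(k / lam) + (k - 1) * np.log(t / lam) - (t / lam) ** k)

def fit_weibull(t, k_grid=None):
    """Profile MLE for Weibull durations: closed-form scale given the
    shape, grid search over the shape.  Returns (k_hat, lam_hat, loglik)."""
    if k_grid is None:
        k_grid = np.linspace(0.2, 5.0, 481)
    best = (None, None, -np.inf)
    for k in k_grid:
        lam = np.mean(t ** k) ** (1.0 / k)
        ll = weibull_loglik(t, k, lam)
        if ll > best[2]:
            best = (k, lam, ll)
    return best
```

    Real incident-duration work would add covariates (an AFT regression) and compare families by information criteria, as the paper does.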

  19. Cost-Effectiveness of Coal Workers' Pneumoconiosis Prevention Based on Its Predicted Incidence within the Datong Coal Mine Group in China.

    Science.gov (United States)

    Shen, Fuhai; Liu, Hongbo; Yuan, Juxiang; Han, Bing; Cui, Kai; Ding, Yu; Fan, Xueyun; Cao, Hong; Yao, Sanqiao; Suo, Xia; Sun, Zhiqian; Yun, Xiang; Hua, Zhengbing; Chen, Jie

    2015-01-01

    We aimed to estimate the economic losses currently caused by coal workers' pneumoconiosis (CWP) and, on the basis of these measurements, confirm the economic benefit of preventive measures. Our cohort study included 1,847 patients with CWP and 43,742 coal workers without CWP who were registered in the employment records of the Datong Coal Mine Group. We calculated the cumulative incidence rate of pneumoconiosis using the life-table method. We used the dose-response relationship between cumulative incidence density and cumulative dust exposure to predict the future trend in the incidence of CWP. We calculated the economic loss caused by CWP and the economic effectiveness of CWP prevention using a step-wise model. The cumulative incidence rates of CWP in the tunneling, mining, combining, and helping cohorts were 58.7%, 28.1%, 21.7%, and 4.0%, respectively. The cumulative incidence rates increased gradually with increasing cumulative dust exposure (CDE). We predicted 4,300 new CWP cases, assuming the dust concentrations remained at the levels of 2011. If advanced dustproof equipment was adopted, 537 fewer people would be diagnosed with CWP. In all, losses of 1.207 billion Renminbi (RMB, official currency of China) would be prevented and 4,698.8 healthy life years would be gained. Investments in advanced dustproof equipment would total 843 million RMB, according to our study; the ratio of investment to restored economic losses was 1:1.43. Controlling workplace dust concentrations is critical to reduce the onset of pneumoconiosis and to achieve economic benefits.
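
    The cumulative incidence rates quoted above come from the life-table (actuarial) method. A minimal sketch, using the usual half-withdrawal correction for subjects censored within an interval (the exact correction used in the study is not stated, so this is an assumption):

```python
def lifetable_cumulative_incidence(at_risk, events, withdrawals):
    """Actuarial life-table estimate: per-interval risk
    q_i = d_i / (n_i - w_i / 2)  (half-withdrawal correction),
    cumulative incidence = 1 - prod(1 - q_i)."""
    surviving = 1.0
    for n, d, w in zip(at_risk, events, withdrawals):
        q = d / (n - w / 2.0)
        surviving *= 1.0 - q
    return 1.0 - surviving
```
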

  1. Likelihood inference for unions of interacting discs

    DEFF Research Database (Denmark)

    Møller, Jesper; Helisová, Katarina

    To the best of our knowledge, this is the first paper which discusses likelihood inference for a random set using a germ-grain model, where the individual grains are unobservable, edge effects occur, and other complications appear. We consider the case where the grains form a disc process modelled...... is specified with respect to a given marked Poisson model (i.e. a Boolean model). We show how edge effects and other complications can be handled by considering a certain conditional likelihood. Our methodology is illustrated by analyzing Peter Diggle's heather dataset, where we discuss the results...... of simulation-based maximum likelihood inference and the effect of specifying different reference Poisson models....

  2. Posterior distributions for likelihood ratios in forensic science.

    Science.gov (United States)

    van den Hout, Ardo; Alberink, Ivo

    2016-09-01

    Evaluation of evidence in forensic science is discussed using posterior distributions for likelihood ratios. Instead of eliminating the uncertainty by integrating (Bayes factor) or by conditioning on parameter values, uncertainty in the likelihood ratio is retained by parameter uncertainty derived from posterior distributions. A posterior distribution for a likelihood ratio can be summarised by the median and credible intervals. Using the posterior mean of the distribution is not recommended. An analysis of forensic data for body height estimation is undertaken. The posterior likelihood approach has been criticised both theoretically and with respect to applicability. This paper addresses the latter and illustrates an interesting application area. Copyright © 2016 The Chartered Society of Forensic Sciences. Published by Elsevier Ireland Ltd. All rights reserved.
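
    The mechanics of the approach can be sketched simply: likelihood ratios are evaluated at posterior parameter draws (the draws themselves come from whatever Bayesian model fits the case data), and the resulting distribution is summarized by its median and an equal-tailed credible interval, since the paper advises against the posterior mean. The summary step, under the assumption that the LR draws are already computed, is:

```python
import numpy as np

def posterior_lr_summary(lr_draws, level=0.95):
    """Summarize a posterior distribution of likelihood ratios by its
    median and an equal-tailed credible interval (not the mean, which
    the paper recommends against)."""
    lr = np.asarray(lr_draws, dtype=float)
    a = (1.0 - level) / 2.0
    lo, hi = np.quantile(lr, [a, 1.0 - a])
    return float(np.median(lr)), (float(lo), float(hi))
```
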

  3. Climatic and ecological future of the Amazon: likelihood and causes of change

    OpenAIRE

    B. Cook; N. Zeng; J.-H. Yoon

    2010-01-01

    Some recent climate modeling results suggested a possible dieback of the Amazon rainforest under future climate change, a prediction that raised considerable interest as well as controversy. To determine the likelihood and causes of such changes, we analyzed the output of 15 models from the Intergovernmental Panel on Climate Change Fourth Assessment Report (IPCC/AR4) and the dynamic vegetation model VEGAS driven by these climate outputs. Our results suggest that the core of the Amazon rainforest...

  4. Penalized Maximum Likelihood Estimation for univariate normal mixture distributions

    International Nuclear Information System (INIS)

    Ridolfi, A.; Idier, J.

    2001-01-01

    Due to singularities of the likelihood function, the maximum likelihood approach for the estimation of the parameters of normal mixture models is an acknowledged ill-posed optimization problem. Ill-posedness is solved by penalizing the likelihood function. In the Bayesian framework, it amounts to incorporating an inverted gamma prior in the likelihood function. A penalized version of the EM algorithm is derived, which is still explicit and which intrinsically assures that the estimates are not singular. Numerical evidence of the latter property is put forward with a test
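
    The penalized EM idea can be sketched concretely. Adding an inverted-gamma(a, b) log-prior on each component variance changes only the M-step variance update, which becomes sigma2_k = (S_k + 2b) / (N_k + 2a + 2) and is therefore bounded below by 2b / (n + 2a + 2) > 0, so a component can no longer collapse onto a single point. The hyperparameters and initialization below are illustrative assumptions, not the paper's choices.

```python
import numpy as np

def penalized_em(x, K=2, a=2.0, b=0.1, iters=100):
    """EM for a univariate K-component normal mixture with an
    inverted-gamma(a, b) penalty on each variance; the penalized
    variance update is bounded away from zero, avoiding singularities."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    mu = np.quantile(x, (np.arange(K) + 0.5) / K)  # spread-out init
    s2 = np.full(K, x.var())
    pi = np.full(K, 1.0 / K)
    for _ in range(iters):
        # E-step: responsibilities, computed in log-space for stability
        logp = (np.log(pi) - 0.5 * np.log(2 * np.pi * s2)
                - 0.5 * (x[:, None] - mu) ** 2 / s2)
        logp -= logp.max(axis=1, keepdims=True)
        r = np.exp(logp)
        r /= r.sum(axis=1, keepdims=True)
        # M-step with the penalized variance update
        Nk = r.sum(axis=0)
        pi = Nk / n
        mu = (r * x[:, None]).sum(axis=0) / Nk
        Sk = (r * (x[:, None] - mu) ** 2).sum(axis=0)
        s2 = (Sk + 2 * b) / (Nk + 2 * a + 2)
    return pi, mu, s2
```
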

  5. Maximum-Likelihood Detection Of Noncoherent CPM

    Science.gov (United States)

    Divsalar, Dariush; Simon, Marvin K.

    1993-01-01

    Simplified detectors proposed for use in maximum-likelihood-sequence detection of symbols in alphabet of size M transmitted by uncoded, full-response continuous phase modulation over radio channel with additive white Gaussian noise. Structures of receivers derived from particular interpretation of maximum-likelihood metrics. Receivers include front ends, structures of which depend only on M, analogous to those in receivers of coherent CPM. Parts of receivers following front ends have structures, complexity of which would depend on N.

  6. Likelihood analysis of parity violation in the compound nucleus

    International Nuclear Information System (INIS)

    Bowman, D.; Sharapov, E.

    1993-01-01

    We discuss the determination of the root mean-squared matrix element of the parity-violating interaction between compound-nuclear states using likelihood analysis. We briefly review the relevant features of the statistical model of the compound nucleus and the formalism of likelihood analysis. We then discuss the application of likelihood analysis to data on parity-violating longitudinal asymmetries. The reliability of the extracted value of the matrix element and errors assigned to the matrix element is stressed. We treat the situations where the spins of the p-wave resonances are known and where they are not, using experimental data and Monte Carlo techniques. We conclude that likelihood analysis provides a reliable way to determine M and its confidence interval. We briefly discuss some problems associated with the normalization of the likelihood function

  7. Corporate brand extensions based on the purchase likelihood: governance implications

    Directory of Open Access Journals (Sweden)

    Spyridon Goumas

    2018-03-01

    This paper examines the purchase likelihood of hypothetical service brand extensions from product companies focusing on consumer electronics, based on sector categorization and perceptions of fit between the existing product category and the image of the company. Prior research has recognized that levels of brand knowledge ease the transference of associations and affect to the new products. Similarity to the existing products of the parent company and perceived image also influence the success of brand extensions. However, sector categorization may interfere with this relationship. The purpose of this study is to examine Greek consumers' attitudes towards hypothetical brand extensions, and how these are affected by consumers' existing knowledge about the brand, sector categorization and perceptions of image and category fit of cross-sector extensions. This aim is examined in the context of technological categories, where less-known companies exhibited significance in purchase likelihood, and, contrary to the existing literature, service companies did not perform as positively as expected. Additional insights to the existing literature about sector categorization are provided. The effect of both image and category fit is also examined and predictions regarding the effect of each are made.

  8. Source and Message Factors in Persuasion: A Reply to Stiff's Critique of the Elaboration Likelihood Model.

    Science.gov (United States)

    Petty, Richard E.; And Others

    1987-01-01

    Answers James Stiff's criticism of the Elaboration Likelihood Model (ELM) of persuasion. Corrects certain misperceptions of the ELM and criticizes Stiff's meta-analysis that compares ELM predictions with those derived from Kahneman's elastic capacity model. Argues that Stiff's presentation of the ELM and the conclusions he draws based on the data…

  9. Chronic dry eye in photorefractive keratectomy and laser in situ keratomileusis: Manifestations, incidence, and predictive factors.

    Science.gov (United States)

    Bower, Kraig S; Sia, Rose K; Ryan, Denise S; Mines, Michael J; Dartt, Darlene A

    2015-12-01

    To evaluate dry-eye manifestations after photorefractive keratectomy (PRK) and laser in situ keratomileusis (LASIK) and determine the incidence and predictive factors of chronic dry eye using a set of dry-eye criteria. Walter Reed Army Medical Center, Washington, DC, USA. Prospective, non-randomized clinical study. Dry-eye evaluation was performed before and after surgery. Main outcome measures included dry-eye manifestations, incidence, and predictive factors of chronic dry eye. This study comprised 143 active-duty U.S. Army personnel, ages 29.9 ± 5.2 years, with myopia or myopic astigmatism (manifest spherical equivalent -3.83 ± 1.96 diopters) having PRK or LASIK. Schirmer scores, corneal sensitivity, ocular surface staining, surface regularity index, and responses to dry-eye questionnaire significantly changed over time after PRK. After LASIK, significant changes were observed in tear breakup time, corneal sensitivity, ocular surface staining, and responses to questionnaire. Twelve months postoperatively, 5.0% of PRK and 0.8% of LASIK participants developed chronic dry eye. Regression analysis showed that a lower preoperative Schirmer score significantly influenced the development of chronic dry eye after PRK, whereas a lower preoperative Schirmer score or a higher ocular surface staining score significantly influenced the occurrence of chronic dry eye after LASIK. Chronic dry eye was uncommon after PRK and LASIK. Ocular surface and tear-film characteristics during pre-operative examination might help to predict chronic dry-eye development in PRK and LASIK. The authors have no financial interest in any product, drug, instrument, or equipment discussed in this manuscript. Copyright © 2015 ASCRS and ESCRS. All rights reserved.

  10. Unbinned likelihood analysis of EGRET observations

    International Nuclear Information System (INIS)

    Digel, Seth W.

    2000-01-01

    We present a newly-developed likelihood analysis method for EGRET data that defines the likelihood function without binning the photon data or averaging the instrumental response functions. The standard likelihood analysis applied to EGRET data requires the photons to be binned spatially and in energy, and the point-spread functions to be averaged over energy and inclination angle. The full-width half maximum of the point-spread function increases by about 40% from on-axis to 30° inclination, and depending on the binning in energy can vary by more than that in a single energy bin. The new unbinned method avoids the loss of information that binning and averaging cause and can properly analyze regions where EGRET viewing periods overlap and photons with different inclination angles would otherwise be combined in the same bin. In the poster, we describe the unbinned analysis method and compare its sensitivity with binned analysis for detecting point sources in EGRET data.
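
    The core of an unbinned (extended) Poisson likelihood is simple to state: each photon contributes the log of the model intensity evaluated at its own coordinates, and the total expected count enters once, so no spatial or energy binning is required. A hedged generic sketch (the EGRET implementation additionally folds in per-photon instrument response, which is omitted here):

```python
import numpy as np

def unbinned_loglik(photon_coords, intensity_fn, expected_total):
    """Extended unbinned Poisson log-likelihood:
    log L = -mu_tot + sum_i log nu(x_i),
    where nu is the model intensity at each individual photon."""
    return -expected_total + float(np.sum(np.log(intensity_fn(photon_coords))))
```
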

  11. Efficient Detection of Repeating Sites to Accelerate Phylogenetic Likelihood Calculations.

    Science.gov (United States)

    Kobert, K; Stamatakis, A; Flouri, T

    2017-03-01

    The phylogenetic likelihood function (PLF) is the major computational bottleneck in several applications of evolutionary biology such as phylogenetic inference, species delimitation, model selection, and divergence times estimation. Given the alignment, a tree and the evolutionary model parameters, the likelihood function computes the conditional likelihood vectors for every node of the tree. Vector entries for which all input data are identical result in redundant likelihood operations which, in turn, yield identical conditional values. Such operations can be omitted for improving run-time and, using appropriate data structures, reducing memory usage. We present a fast, novel method for identifying and omitting such redundant operations in phylogenetic likelihood calculations, and assess the performance improvement and memory savings attained by our method. Using empirical and simulated data sets, we show that a prototype implementation of our method yields up to 12-fold speedups and uses up to 78% less memory than one of the fastest and most highly tuned implementations of the PLF currently available. Our method is generic and can seamlessly be integrated into any phylogenetic likelihood implementation. [Algorithms; maximum likelihood; phylogenetic likelihood function; phylogenetics]. © The Author(s) 2016. Published by Oxford University Press, on behalf of the Society of Systematic Biologists.
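
    The paper's method detects repeated conditional-likelihood entries at every node of the tree; the classic special case at the tips is collapsing identical alignment columns (site patterns) and weighting each unique pattern by its multiplicity, so each is computed only once. A hedged sketch of that special case:

```python
from collections import Counter

def compress_sites(alignment):
    """Collapse repeating alignment columns: return the unique site
    patterns and the number of times each occurs, so the (expensive)
    per-site likelihood is evaluated once per pattern."""
    columns = list(zip(*alignment))  # one tuple of states per site
    counts = Counter(columns)
    patterns = list(counts)
    weights = [counts[p] for p in patterns]
    return patterns, weights
```

    The total log-likelihood is then the weighted sum of per-pattern log-likelihoods; the paper extends the same idea below the tips, to subtree-level repeats.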

  12. Maintaining symmetry of simulated likelihood functions

    DEFF Research Database (Denmark)

    Andersen, Laura Mørch

    This paper suggests solutions to two different types of simulation errors related to Quasi-Monte Carlo integration. Likelihood functions which depend on standard deviations of mixed parameters are symmetric in nature. This paper shows that antithetic draws preserve this symmetry and thereby...... improves precision substantially. Another source of error is that models testing away mixing dimensions must replicate the relevant dimensions of the quasi-random draws in the simulation of the restricted likelihood. These simulation errors are ignored in the standard estimation procedures used today...
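
    The symmetry point can be illustrated with a toy model (an assumption for illustration, not the paper's model): the likelihood of y ~ N(sigma*Z, 1) with Z ~ N(0, 1) is an even function of sigma, and pairing every simulation draw z with -z makes the *simulated* likelihood exactly even as well, which plain pseudo-random draws only achieve up to simulation noise.

```python
import numpy as np

def antithetic_draws(n_half, seed=0):
    z = np.random.default_rng(seed).standard_normal(n_half)
    return np.concatenate([z, -z])  # every draw paired with its mirror image

def simulated_lik(sigma, y, draws):
    """Simulated likelihood of y ~ N(sigma*Z, 1), Z ~ N(0, 1),
    integrating over Z by Monte Carlo averaging over the draws."""
    dens = np.exp(-0.5 * (y - sigma * draws) ** 2) / np.sqrt(2.0 * np.pi)
    return float(dens.mean())
```
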

  13. The dorsal medial frontal cortex is sensitive to time on task, not response conflict or error likelihood.

    Science.gov (United States)

    Grinband, Jack; Savitskaya, Judith; Wager, Tor D; Teichert, Tobias; Ferrera, Vincent P; Hirsch, Joy

    2011-07-15

    The dorsal medial frontal cortex (dMFC) is highly active during choice behavior. Though many models have been proposed to explain dMFC function, the conflict monitoring model is the most influential. It posits that dMFC is primarily involved in detecting interference between competing responses thus signaling the need for control. It accurately predicts increased neural activity and response time (RT) for incompatible (high-interference) vs. compatible (low-interference) decisions. However, it has been shown that neural activity can increase with time on task, even when no decisions are made. Thus, the greater dMFC activity on incompatible trials may stem from longer RTs rather than response conflict. This study shows that (1) the conflict monitoring model fails to predict the relationship between error likelihood and RT, and (2) the dMFC activity is not sensitive to congruency, error likelihood, or response conflict, but is monotonically related to time on task. Copyright © 2010 Elsevier Inc. All rights reserved.

  14. Physical and Sexual Violence and Incident Sexually Transmitted Infections

    Science.gov (United States)

    Anand, Mallika; Redding, Colleen A.; Peipert, Jeffrey F.

    2009-01-01

    Objective To investigate whether women aged 13–35 who were victims of interpersonal violence were more likely than nonvictims to experience incident sexually transmitted infections (STIs). Methods We examined 542 women aged 13–35 enrolled in Project PROTECT, a randomized clinical trial that compared two different methods of computer-based intervention to promote the use of dual methods of contraception. Participants completed a baseline questionnaire that included questions about their history of interpersonal violence and were followed for incident STIs over the 2-year study period. We compared the incidence of STIs in women with and without a history of interpersonal violence using bivariate analyses and multiple logistic regression. Results In the bivariate analyses, STI incidence was found to be significantly associated with African American race/ethnicity, a higher number of sexual partners in the past month, and a lower likelihood of avoidance of sexual partners who pressure to have sex without a condom. In both crude and adjusted regression analyses, time to STI incidence was faster among women who reported physical or sexual abuse in the year before study enrollment (HRRadj = 1.68, 95% CI 1.06, 2.65). Conclusions Women with a recent history of abuse are at significantly greater risk of STI incidence than are nonvictims. PMID:19245303

  15. Development of Risk Score for Predicting 3-Year Incidence of Type 2 Diabetes: Japan Epidemiology Collaboration on Occupational Health Study.

    Directory of Open Access Journals (Sweden)

    Akiko Nanri

    Risk models and scores have been developed to predict incidence of type 2 diabetes in Western populations, but their performance may differ when applied to non-Western populations. We developed and validated a risk score for predicting 3-year incidence of type 2 diabetes in a Japanese population. Participants were 37,416 men and women, aged 30 or older, who received periodic health checkup in 2008-2009 in eight companies. Diabetes was defined as fasting plasma glucose (FPG) ≥ 126 mg/dl, random plasma glucose ≥ 200 mg/dl, glycated hemoglobin (HbA1c) ≥ 6.5%, or receiving medical treatment for diabetes. Risk scores on non-invasive and invasive models including FPG and HbA1c were developed using logistic regression in a derivation cohort and validated in the remaining cohort. The area under the curve (AUC) for the non-invasive model including age, sex, body mass index, waist circumference, hypertension, and smoking status was 0.717 (95% CI, 0.703-0.731). In the invasive model in which both FPG and HbA1c were added to the non-invasive model, AUC was increased to 0.893 (95% CI, 0.883-0.902). When the risk scores were applied to the validation cohort, AUCs (95% CI) for the non-invasive and invasive model were 0.734 (0.715-0.753) and 0.882 (0.868-0.895), respectively. Participants with a non-invasive score of ≥ 15 and invasive score of ≥ 19 were projected to have >20% and >50% risk, respectively, of developing type 2 diabetes within 3 years. The simple risk score of the non-invasive model might be useful for predicting incident type 2 diabetes, and its predictive performance may be markedly improved by incorporating FPG and HbA1c.
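
    The AUCs reported above are C-statistics: the probability that a randomly chosen case receives a higher risk score than a randomly chosen non-case, with ties counted as one half. A minimal sketch of that computation (a rank-based equivalent of integrating the ROC curve):

```python
def c_statistic(scores, outcomes):
    """AUC / C-statistic: P(score of a random case > score of a random
    non-case), counting ties as 1/2.  outcomes are 0/1 labels."""
    pos = [s for s, y in zip(scores, outcomes) if y == 1]
    neg = [s for s, y in zip(scores, outcomes) if y == 0]
    wins = sum((p > q) + 0.5 * (p == q) for p in pos for q in neg)
    return wins / (len(pos) * len(neg))
```

    A score of 1.0 means perfect separation of incident cases from non-cases, 0.5 means no discrimination.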

  16. Prediction model for prevalence and incidence of advanced age-related macular degeneration based on genetic, demographic, and environmental variables.

    Science.gov (United States)

    Seddon, Johanna M; Reynolds, Robyn; Maller, Julian; Fagerness, Jesen A; Daly, Mark J; Rosner, Bernard

    2009-05-01

    The joint effects of genetic, ocular, and environmental variables were evaluated and predictive models for prevalence and incidence of AMD were assessed. Participants in the multicenter Age-Related Eye Disease Study (AREDS) were included in a prospective evaluation of 1446 individuals, of which 279 progressed to advanced AMD (geographic atrophy or neovascular disease) and 1167 did not progress during 6.3 years of follow-up. For prevalent AMD, 509 advanced cases were compared with 222 controls. Covariates for the incidence analysis included age, sex, education, smoking, body mass index (BMI), baseline AMD grade, and the AREDS vitamin-mineral treatment assignment. DNA specimens were evaluated for six variants in five genes related to AMD. Unconditional logistic regression analyses were performed for prevalent and incident advanced AMD. An algorithm was developed and receiver operating characteristic curves and C statistics were calculated to assess the predictive ability of risk scores to discriminate progressors from nonprogressors. All genetic polymorphisms were independently related to prevalence of advanced AMD, controlling for genetic factors, smoking, BMI, and AREDS treatment. Multivariate odds ratios (ORs) were 3.5 (95% confidence interval [CI], 1.7-7.1) for CFH Y402H; 3.7 (95% CI, 1.6-8.4) for CFH rs1410996; 25.4 (95% CI, 8.6-75.1) for LOC387715 A69S (ARMS2); 0.3 (95% CI, 0.1-0.7) for C2 E318D; 0.3 (95% CI, 0.1-0.5) for CFB; and 3.6 (95% CI, 1.4-9.4) for C3 R102G, comparing the homozygous risk/protective genotypes to the referent genotypes. For incident AMD, all these variants except CFB were significantly related to progression to advanced AMD, after controlling for baseline AMD grade and other factors, with ORs from 1.8 to 4.0 for presence of two risk alleles and 0.4 for the protective allele. An interaction was seen between CFH402H and treatment, after controlling for all genotypes. Smoking was independently related to AMD, with a multiplicative joint

  17. The fine-tuning cost of the likelihood in SUSY models

    CERN Document Server

    Ghilencea, D M

    2013-01-01

    In SUSY models, the fine tuning of the electroweak (EW) scale with respect to their parameters gamma_i={m_0, m_{1/2}, mu_0, A_0, B_0,...} and the maximal likelihood L to fit the experimental data are usually regarded as two different problems. We show that, if one regards the EW minimum conditions as constraints that fix the EW scale, this commonly held view is not correct and that the likelihood contains all the information about fine-tuning. In this case we show that the corrected likelihood is equal to the ratio L/Delta of the usual likelihood L and the traditional fine tuning measure Delta of the EW scale. A similar result is obtained for the integrated likelihood over the set {gamma_i}, that can be written as a surface integral of the ratio L/Delta, with the surface in gamma_i space determined by the EW minimum constraints. As a result, a large likelihood actually demands a large ratio L/Delta or equivalently, a small chi^2_{new}=chi^2_{old}+2*ln(Delta). This shows the fine-tuning cost to the likelihood ...

  18. Maximum-likelihood estimation of the hyperbolic parameters from grouped observations

    DEFF Research Database (Denmark)

    Jensen, Jens Ledet

    1988-01-01

    a least-squares problem. The second procedure Hypesti first approaches the maximum-likelihood estimate by iterating in the profile-log likelihood function for the scale parameter. Close to the maximum of the likelihood function, the estimation is brought to an end by iteration, using all four parameters...

  19. Algorithms of maximum likelihood data clustering with applications

    Science.gov (United States)

    Giada, Lorenzo; Marsili, Matteo

    2002-12-01

    We address the problem of data clustering by introducing an unsupervised, parameter-free approach based on maximum likelihood principle. Starting from the observation that data sets belonging to the same cluster share a common information, we construct an expression for the likelihood of any possible cluster structure. The likelihood in turn depends only on the Pearson's coefficient of the data. We discuss clustering algorithms that provide a fast and reliable approximation to maximum likelihood configurations. Compared to standard clustering methods, our approach has the advantages that (i) it is parameter free, (ii) the number of clusters need not be fixed in advance and (iii) the interpretation of the results is transparent. In order to test our approach and compare it with standard clustering algorithms, we analyze two very different data sets: time series of financial market returns and gene expression data. We find that different maximization algorithms produce similar cluster structures whereas the outcome of standard algorithms has a much wider variability.

  20. Education and Income Imbalances Among Married Couples in Malawi as Predictors for Likelihood of Physical and Emotional Intimate Partner Violence.

    Science.gov (United States)

    Bonnes, Stephanie

    2016-01-01

    Intimate partner violence is a social and public health problem that is prevalent across the world. In many societies, power differentials in relationships, often supported by social norms that promote gender inequality, lead to incidents of intimate partner violence. Among other factors, both a woman's years of education and educational differences between a woman and her partner have been shown to have an effect on her likelihood of experiencing intimate partner abuse. Using the 2010 Malawian Demographic and Health Survey data to analyze intimate partner violence among 3,893 married Malawian women and their husbands, this article focuses on understanding the effect of educational differences between husband and wife on the likelihood of physical and emotional abuse within a marriage. The results from logistic regression models show that a woman's level of education is a significant predictor of her likelihood of experiencing intimate partner violence by her current husband, but that this effect is contingent on her husband's level of education. This study demonstrates the need to educate men alongside women in Malawi to help decrease women's risk of physical and emotional intimate partner violence.

  1. Analysis of hourly crash likelihood using unbalanced panel data mixed logit model and real-time driving environmental big data.

    Science.gov (United States)

    Chen, Feng; Chen, Suren; Ma, Xiaoxiang

    2018-06-01

    Driving environment, including road surface conditions and traffic states, often changes over time and influences crash probability considerably. Traditional crash frequency models developed at large temporal scales can hardly capture the time-varying characteristics of these factors, which may cause substantial loss of critical driving-environment information in crash prediction. Crash prediction models with refined temporal data (hourly records) are developed to characterize the time-varying nature of these contributing factors. Unbalanced panel data mixed logit models are developed to analyze hourly crash likelihood of highway segments. The refined temporal driving environmental data, including road surface and traffic condition, obtained from the Road Weather Information System (RWIS), are incorporated into the models. Model estimation results indicate that the traffic speed, traffic volume, curvature and chemically wet road surface indicator are better modeled as random parameters. The estimation results of the mixed logit models based on unbalanced panel data show that there are a number of factors related to crash likelihood on I-25. Specifically, weekend indicator, November indicator, low speed limit and long remaining service life of rutting indicator are found to increase crash likelihood, while 5-am indicator and number of merging ramps per lane per mile are found to decrease crash likelihood. The study underscores and confirms the unique and significant impacts on crashes imposed by the real-time weather, road surface, and traffic conditions. With the unbalanced panel data structure, the rich information from real-time driving environmental big data can be well incorporated. Copyright © 2018 National Safety Council and Elsevier Ltd. All rights reserved.

  2. Practical likelihood analysis for spatial generalized linear mixed models

    DEFF Research Database (Denmark)

    Bonat, W. H.; Ribeiro, Paulo Justiniano

    2016-01-01

    We investigate an algorithm for maximum likelihood estimation of spatial generalized linear mixed models based on the Laplace approximation. We compare our algorithm with a set of alternative approaches for two datasets from the literature. The Rhizoctonia root rot and the Rongelap are......, respectively, examples of binomial and count datasets modeled by spatial generalized linear mixed models. Our results show that the Laplace approximation provides similar estimates to Markov Chain Monte Carlo likelihood, Monte Carlo expectation maximization, and modified Laplace approximation. Some advantages...... of Laplace approximation include the computation of the maximized log-likelihood value, which can be used for model selection and tests, and the possibility to obtain realistic confidence intervals for model parameters based on profile likelihoods. The Laplace approximation also avoids the tuning...
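
    The core Laplace idea referenced above (replace an intractable likelihood integral with a Gaussian expansion around its mode) can be illustrated with a one-parameter toy problem; the integrand below is hypothetical and is not the spatial GLMM machinery from the paper:

```python
import math

def h(theta):
    # Hypothetical unnormalized log-likelihood, chosen only so the
    # integral is non-trivial; mode at theta = 0, h''(0) = -1.
    return -math.cosh(theta)

def laplace_approx(h, mode, h2):
    """Laplace approximation to the integral of exp(h) over the real
    line: exp(h(mode)) * sqrt(2*pi / -h''(mode))."""
    return math.exp(h(mode)) * math.sqrt(2 * math.pi / -h2)

def quadrature(h, lo=-10.0, hi=10.0, n=20000):
    """Simple trapezoidal integration of exp(h) for comparison."""
    step = (hi - lo) / n
    total = 0.5 * (math.exp(h(lo)) + math.exp(h(hi)))
    for i in range(1, n):
        total += math.exp(h(lo + i * step))
    return total * step

approx = laplace_approx(h, 0.0, -1.0)   # Gaussian expansion at the mode
exact = quadrature(h)                   # numerical reference
print(round(approx, 4), round(exact, 4))
```

    For this deliberately non-Gaussian integrand the Laplace value is within roughly 10% of the quadrature result; for the near-Gaussian posteriors typical of GLMM random effects the approximation is much tighter.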

  3. Predicting sun protection behaviors using protection motivation variables.

    Science.gov (United States)

    Ch'ng, Joanne W M; Glendon, A Ian

    2014-04-01

    Protection motivation theory components were used to predict sun protection behaviors (SPBs) using four outcome measures: typical reported behaviors, previous reported behaviors, current sunscreen use as determined by interview, and current observed behaviors (clothing worn) to control for common method bias. Sampled from two SE Queensland public beaches during summer, 199 participants aged 18-29 years completed a questionnaire measuring perceived severity, perceived vulnerability, response efficacy, response costs, and protection motivation (PM). Personal perceived risk (similar to threat appraisal) and response likelihood (similar to coping appraisal) were derived from their respective PM components. Protection motivation predicted all four SPB criterion variables. Personal perceived risk and response likelihood predicted protection motivation. Protection motivation completely mediated the effect of response likelihood on all four criterion variables. Alternative models are considered. Strengths and limitations of the study are outlined and suggestions made for future research.

  4. Maximum likelihood estimation for integrated diffusion processes

    DEFF Research Database (Denmark)

    Baltazar-Larios, Fernando; Sørensen, Michael

    We propose a method for obtaining maximum likelihood estimates of parameters in diffusion models when the data is a discrete time sample of the integral of the process, while no direct observations of the process itself are available. The data are, moreover, assumed to be contaminated...... EM-algorithm to obtain maximum likelihood estimates of the parameters in the diffusion model. As part of the algorithm, we use a recent simple method for approximate simulation of diffusion bridges. In simulation studies for the Ornstein-Uhlenbeck process and the CIR process the proposed method works...... by measurement errors. Integrated volatility is an example of this type of observations. Another example is ice-core data on oxygen isotopes used to investigate paleo-temperatures. The data can be viewed as incomplete observations of a model with a tractable likelihood function. Therefore we propose a simulated...

  5. High-order Composite Likelihood Inference for Max-Stable Distributions and Processes

    KAUST Repository

    Castruccio, Stefano; Huser, Raphaël; Genton, Marc G.

    2015-01-01

    In multivariate or spatial extremes, inference for max-stable processes observed at a large collection of locations is a very challenging problem in computational statistics, and current approaches typically rely on less expensive composite likelihoods constructed from small subsets of data. In this work, we explore the limits of modern state-of-the-art computational facilities to perform full likelihood inference and to efficiently evaluate high-order composite likelihoods. With extensive simulations, we assess the loss of information of composite likelihood estimators with respect to a full likelihood approach for some widely-used multivariate or spatial extreme models, we discuss how to choose composite likelihood truncation to improve the efficiency, and we also provide recommendations for practitioners. This article has supplementary material online.

  7. Predicting Likelihood of Having Four or More Positive Nodes in Patients With Sentinel Lymph Node-Positive Breast Cancer: A Nomogram Validation Study

    International Nuclear Information System (INIS)

    Unal, Bulent; Gur, Akif Serhat; Beriwal, Sushil; Tang Gong; Johnson, Ronald; Ahrendt, Gretchen; Bonaventura, Marguerite; Soran, Atilla

    2009-01-01

    Purpose: Katz suggested a nomogram for predicting having four or more positive nodes in sentinel lymph node (SLN)-positive breast cancer patients. The findings from this formula might influence adjuvant radiotherapy decisions. Our goal was to validate the accuracy of the Katz nomogram. Methods and Materials: We reviewed the records of 309 patients with breast cancer who had undergone completion axillary lymph node dissection. The factors associated with the likelihood of having four or more positive axillary nodes were evaluated in patients with one to three positive SLNs. The nomogram developed by Katz was applied to our data set. The area under the curve of the corresponding receiver operating characteristics curve was calculated for the nomogram. Results: Of the 309 patients, 80 (25.9%) had four or more positive axillary lymph nodes. On multivariate analysis, the number of positive SLNs (p < .0001), overall metastasis size (p = .019), primary tumor size (p = .0001), and extracapsular extension (p = .01) were significant factors predicting for four or more positive nodes. For patients with <5% probability, 90.3% had fewer than four positive nodes and 9.7% had four or more positive nodes. The negative predictive value was 91.7%, and sensitivity was 80%. The nomogram was accurate and discriminating (area under the curve, .801). Conclusion: The probability of four or more involved nodes is significantly greater in patients who have an increased number of positive SLNs, increased overall metastasis size, increased tumor size, and extracapsular extension. The Katz nomogram was validated in our patients. This nomogram will be helpful to clinicians making adjuvant treatment recommendations to their patients.
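
    The discrimination measure used above (the area under the ROC curve, i.e. the C statistic) can be computed directly from predicted scores and observed outcomes via concordant pairs; a minimal sketch with invented data, not the patient data from this study:

```python
def auc(scores, labels):
    """C statistic: probability that a randomly chosen positive case
    receives a higher score than a randomly chosen negative case
    (ties count as 1/2), equal to the area under the ROC curve."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Invented nomogram probabilities and outcomes (1 = four or more nodes).
scores = [0.90, 0.80, 0.70, 0.60, 0.55, 0.50, 0.40, 0.30]
labels = [1, 1, 0, 1, 0, 0, 0, 0]
print(round(auc(scores, labels), 3))
```

    A value of 0.5 means the scores discriminate no better than chance; values around 0.8, as reported above, indicate good discrimination.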

  8. Neural Networks Involved in Adolescent Reward Processing: An Activation Likelihood Estimation Meta-Analysis of Functional Neuroimaging Studies

    Science.gov (United States)

    Silverman, Merav H.; Jedd, Kelly; Luciana, Monica

    2015-01-01

    Behavioral responses to, and the neural processing of, rewards change dramatically during adolescence and may contribute to observed increases in risk-taking during this developmental period. Functional MRI (fMRI) studies suggest differences between adolescents and adults in neural activation during reward processing, but findings are contradictory, and effects have been found in non-predicted directions. The current study uses an activation likelihood estimation (ALE) approach for quantitative meta-analysis of functional neuroimaging studies to: 1) confirm the network of brain regions involved in adolescents’ reward processing, 2) identify regions involved in specific stages (anticipation, outcome) and valence (positive, negative) of reward processing, and 3) identify differences in activation likelihood between adolescent and adult reward-related brain activation. Results reveal a subcortical network of brain regions involved in adolescent reward processing similar to that found in adults with major hubs including the ventral and dorsal striatum, insula, and posterior cingulate cortex (PCC). Contrast analyses find that adolescents exhibit greater likelihood of activation in the insula while processing anticipation relative to outcome and greater likelihood of activation in the putamen and amygdala during outcome relative to anticipation. While processing positive compared to negative valence, adolescents show increased likelihood for activation in the posterior cingulate cortex (PCC) and ventral striatum. Contrasting adolescent reward processing with the existing ALE of adult reward processing (Liu et al., 2011) reveals increased likelihood for activation in limbic, frontolimbic, and striatal regions in adolescents compared with adults. Unlike adolescents, adults also activate executive control regions of the frontal and parietal lobes. These findings support hypothesized elevations in motivated activity during adolescence. PMID:26254587

  9. Brief Communication: Likelihood of societal preparedness for global change: trend detection

    Directory of Open Access Journals (Sweden)

    R. M. Vogel

    2013-07-01

    Full Text Available Anthropogenic influences on earth system processes are now pervasive, resulting in trends in river discharge, pollution levels, ocean levels, precipitation, temperature, wind, landslides, bird and plant populations and a myriad of other important natural hazards relating to earth system state variables. Thousands of trend detection studies have been published which report the statistical significance of observed trends. Unfortunately, such studies only concentrate on the null hypothesis of "no trend". Little or no attention is given to the power of such statistical trend tests, which would quantify the likelihood that we might ignore a trend if it really existed. The probability of missing the trend, if it exists, known as the type II error, informs us about the likelihood of whether or not society is prepared to accommodate and respond to such trends. We describe how the power or probability of detecting a trend if it exists, depends critically on our ability to develop improved multivariate deterministic and statistical methods for predicting future trends in earth system processes. Several other research and policy implications for improving our understanding of trend detection and our societal response to those trends are discussed.
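
    The power argument above can be made concrete with a small Monte Carlo sketch: simulate series with a known trend, apply an ordinary least-squares trend test, and count how often the trend is detected. The noise model, series length, and critical value below are illustrative assumptions, not choices made in the paper:

```python
import math
import random

def slope_t_stat(y):
    """OLS t statistic for the slope of y regressed on its time index."""
    n = len(y)
    t = list(range(n))
    tbar, ybar = sum(t) / n, sum(y) / n
    sxx = sum((ti - tbar) ** 2 for ti in t)
    sxy = sum((ti - tbar) * (yi - ybar) for ti, yi in zip(t, y))
    b = sxy / sxx
    a = ybar - b * tbar
    resid = [yi - a - b * ti for ti, yi in zip(t, y)]
    s2 = sum(r * r for r in resid) / (n - 2)
    return b / math.sqrt(s2 / sxx)

def power(slope, n=30, sigma=1.0, trials=2000, crit=2.048, seed=1):
    """Monte Carlo power: fraction of simulated series whose trend is
    detected at the ~5% level (crit is t_{0.975} with n-2=28 df)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        y = [slope * ti + rng.gauss(0.0, sigma) for ti in range(n)]
        if abs(slope_t_stat(y)) > crit:
            hits += 1
    return hits / trials

# No trend: rejection rate ~ the 5% false-alarm level.
# Modest trend: power well below 1, i.e. a sizeable type II error.
print(power(0.0), power(0.05))
```

    The second number is the probability of detecting the trend; one minus it is the type II error the authors argue society should care about.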

  10. Maximum likelihood estimation of semiparametric mixture component models for competing risks data.

    Science.gov (United States)

    Choi, Sangbum; Huang, Xuelin

    2014-09-01

    In the analysis of competing risks data, the cumulative incidence function is a useful quantity to characterize the crude risk of failure from a specific event type. In this article, we consider an efficient semiparametric analysis of mixture component models on cumulative incidence functions. Under the proposed mixture model, latency survival regressions given the event type are performed through a class of semiparametric models that encompasses the proportional hazards model and the proportional odds model, allowing for time-dependent covariates. The marginal proportions of the occurrences of cause-specific events are assessed by a multinomial logistic model. Our mixture modeling approach is advantageous in that it makes a joint estimation of model parameters associated with all competing risks under consideration, satisfying the constraint that the cumulative probability of failing from any cause adds up to one given any covariates. We develop a novel maximum likelihood scheme based on semiparametric regression analysis that facilitates efficient and reliable estimation. Statistical inferences can be conveniently made from the inverse of the observed information matrix. We establish the consistency and asymptotic normality of the proposed estimators. We validate small sample properties with simulations and demonstrate the methodology with a data set from a study of follicular lymphoma. © 2014, The International Biometric Society.

  11. The potential for spills and leaks of hydraulic fracturing related fluids on well sites and from road incidents.

    Science.gov (United States)

    Clancy, Sarah; Worrall, Fred; Davies, Richard; Gluyas, Jon

    2017-04-01

    The potential growth of shale gas developments within Europe has raised concerns about possible spills and leaks from shale gas sites and from liquid transportation via roads and pipelines. Data from a range of sources have been examined to estimate the likelihood of an incident. From the US, the Texas Railroad Commission and the Colorado Oil and Gas Commission have maintained records of the quantity spilt, the reasons for the spill, and the reported impacts. For the UK, the Environment Agency pollution incident database and transport statistics from the UK's Department for Transport have also been analysed and used as an analogy to determine the likelihood of an incident or spill on the road. These data were used as an analogue to predict the number of spills and leaks that might occur at a well site, or in transport operations, under different shale gas development scenarios if fracking were to go forward in the UK. Since 2014 the Colorado Oil and Gas Commission has recorded 3874 spills in the State of Colorado; the majority of these (1941) consisted of produced water, while 835 were oil spills. Of all the spills recorded, 1809 spilt more than 0.79 m3, 1356 of these leaked outside the berm of the well site, and three sites required construction of emergency pits to contain the spillage. During 2015, there were 53054 active wells; the percentage of produced oil spilt was 0.001%, whilst the percentage of produced water spilt was 0.009%. Data from the Texas Railroad Commission show that the number of reported spills over 0.16 m3 in Texas has increased year on year since 2009, with 675 reported in 2009 and 1485 in 2015. The greatest loss each year was of crude oil, with 14176 m3 spilt in 2015, equivalent to 0.0089% of the oil produced. Clean-up operations recover some of the lost fluid; however, much is left unrecovered: annually 60% of the crude oil spilt is recovered, 65% of production fluid is recovered, whereas just 30% of liquid gas is

  12. Phylogenetic analysis using parsimony and likelihood methods.

    Science.gov (United States)

    Yang, Z

    1996-02-01

    The assumptions underlying the maximum-parsimony (MP) method of phylogenetic tree reconstruction were intuitively examined by studying the way the method works. Computer simulations were performed to corroborate the intuitive examination. Parsimony appears to involve very stringent assumptions concerning the process of sequence evolution, such as constancy of substitution rates between nucleotides, constancy of rates across nucleotide sites, and equal branch lengths in the tree. For practical data analysis, the requirement of equal branch lengths means similar substitution rates among lineages (the existence of an approximate molecular clock), relatively long interior branches, and also few species in the data. However, a small amount of evolution is neither a necessary nor a sufficient requirement of the method. The difficulties involved in the application of current statistical estimation theory to tree reconstruction were discussed, and it was suggested that the approach proposed by Felsenstein (1981, J. Mol. Evol. 17: 368-376) for topology estimation, as well as its many variations and extensions, differs fundamentally from the maximum likelihood estimation of a conventional statistical parameter. Evidence was presented showing that the Felsenstein approach does not share the asymptotic efficiency of the maximum likelihood estimator of a statistical parameter. Computer simulations were performed to study the probability that MP recovers the true tree under a hierarchy of models of nucleotide substitution; its performance relative to the likelihood method was especially noted. The results appeared to support the intuitive examination of the assumptions underlying MP. When a simple model of nucleotide substitution was assumed to generate data, the probability that MP recovers the true topology could be as high as, or even higher than, that for the likelihood method. When the assumed model became more complex and realistic, e.g., when substitution rates were

  13. Incident detection and isolation in drilling using analytical redundancy relations

    DEFF Research Database (Denmark)

    Willersrud, Anders; Blanke, Mogens; Imsland, Lars

    2015-01-01

    must be avoided. This paper employs model-based diagnosis using analytical redundancy relations to obtain residuals which are affected differently by the different incidents. Residuals are found to be non-Gaussian - they follow a multivariate t-distribution - hence, a dedicated generalized likelihood...... measurements available. In the latter case, isolation capability is shown to be reduced to group-wise isolation, but the method would still detect all serious events with the prescribed false alarm probability...

  14. Comparison of Prediction-Error-Modelling Criteria

    DEFF Research Database (Denmark)

    Jørgensen, John Bagterp; Jørgensen, Sten Bay

    2007-01-01

    Single and multi-step prediction-error-methods based on the maximum likelihood and least squares criteria are compared. The prediction-error methods studied are based on predictions using the Kalman filter and Kalman predictors for a linear discrete-time stochastic state space model, which is a r...

  15. Quantitative prediction of shrimp disease incidence via the profiles of gut eukaryotic microbiota.

    Science.gov (United States)

    Xiong, Jinbo; Yu, Weina; Dai, Wenfang; Zhang, Jinjie; Qiu, Qiongfen; Ou, Changrong

    2018-04-01

    One common notion is emerging that gut eukaryotes are commensal or beneficial, rather than detrimental. To date, however, surprisingly few studies have been undertaken to discern the factors that govern the assembly of gut eukaryotes, despite growing interest in the dysbiosis of the gut microbiota-disease relationship. Herein, we firstly explored how the gut eukaryotic microbiotas were assembled over shrimp postlarval to adult stages and a disease progression. The gut eukaryotic communities changed markedly as healthy shrimp aged, and converged toward an adult-microbiota configuration. However, the adult-like stability was distorted by disease exacerbation. A null model untangled that the deterministic processes that governed the gut eukaryotic assembly tended to be more important over healthy shrimp development, whereas this trend was inverted as the disease progressed. After ruling out the baseline of gut eukaryotes over shrimp ages, we identified disease-discriminatory taxa (the species level afforded the highest accuracy of prediction) that were characteristic of shrimp health status. The profiles of these taxa contributed an overall 92.4% accuracy in predicting shrimp health status. Notably, this model can accurately diagnose the onset of shrimp disease. Interspecies interaction analysis depicted how the disease-discriminatory taxa interacted with one another in sustaining shrimp health. Taken together, our findings offer novel insights into the underlying ecological processes that govern the assembly of gut eukaryotes over shrimp postlarval to adult stages and a disease progression. Intriguingly, the established model can quantitatively and accurately predict the incidence of shrimp disease.

  16. Evidence Based Medicine; Positive and Negative Likelihood Ratios of Diagnostic Tests

    Directory of Open Access Journals (Sweden)

    Alireza Baratloo

    2015-10-01

    Full Text Available In the previous two parts of the educational manuscript series in Emergency, we explained some screening characteristics of diagnostic tests including accuracy, sensitivity, specificity, and positive and negative predictive values. In the 3rd part we aimed to explain the positive and negative likelihood ratio (LR as one of the most reliable performance measures of a diagnostic test. To better understand this characteristic of a test, it is first necessary to fully understand the concept of sensitivity and specificity. So we strongly advise you to review the 1st part of this series again. In short, a likelihood ratio compares the probability of a given test result among people with the disease to the probability of the same result among people without it. The prevalence of a disease can directly influence screening characteristics of a diagnostic test, especially its positive and negative predictive values. Trying to eliminate this effect, the LR was developed. The pre-test odds of a disease multiplied by the positive or negative LR give the post-test odds, from which the post-test probability follows. Therefore, LR is the most important characteristic of a test to rule out or rule in a diagnosis. A positive likelihood ratio > 1 means a higher probability that the disease is present in a patient with a positive test. The further from 1, either higher or lower, the stronger the evidence to rule in or rule out the disease, respectively. It is obvious that tests with LR close to one are less practical. On the other hand, an LR further from one will have more value for application in medicine. Usually tests with LR < 0.1 or LR > 10 are considered suitable for application in routine practice.
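
    The odds-based update described above is a two-line computation: post-test odds = pre-test odds × LR, then convert back to a probability. A short sketch (the sensitivity, specificity, and pre-test probability are invented for illustration):

```python
def post_test_probability(pre_test_p, lr):
    """Bayes update via odds: post-test odds = pre-test odds * LR."""
    pre_odds = pre_test_p / (1.0 - pre_test_p)
    post_odds = pre_odds * lr
    return post_odds / (1.0 + post_odds)

# A hypothetical test with sensitivity 0.90 and specificity 0.80:
sens, spec = 0.90, 0.80
lr_pos = sens / (1 - spec)          # positive LR = 4.5
lr_neg = (1 - sens) / spec          # negative LR = 0.125
p = 0.30                            # pre-test probability (prevalence)
print(round(post_test_probability(p, lr_pos), 3))  # after a positive result
print(round(post_test_probability(p, lr_neg), 3))  # after a negative result
```

    Here a positive result raises the probability from 0.30 to about 0.66, and a negative result lowers it to about 0.05, illustrating how LRs far from 1 rule a diagnosis in or out.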

  17. Predictive Value of Triglyceride Glucose Index for the Risk of Incident Diabetes: A 4-Year Retrospective Longitudinal Study.

    Science.gov (United States)

    Lee, Da Young; Lee, Eun Seo; Kim, Ji Hyun; Park, Se Eun; Park, Cheol-Young; Oh, Ki-Won; Park, Sung-Woo; Rhee, Eun-Jung; Lee, Won-Young

    The Triglyceride Glucose Index (TyG index) is considered a surrogate marker of insulin resistance. The aim of this study is to investigate whether the TyG index has a predictive role in identifying individuals with a high risk of incident diabetes and to compare it with other indicators of metabolic health. A total of 2900 non-diabetic adults who attended five consecutive annual health check-ups at Kangbuk Samsung Hospital were divided into four subgroups using three methods: (1) baseline TyG index; (2) obesity status (body mass index ≥25 kg/m2) and cutoff value of TyG index; (3) obesity status and metabolic health, defined as having fewer than two of the five components: high blood pressure, high fasting blood glucose, high triglycerides, low high-density lipoprotein cholesterol, and the highest decile of homeostasis model assessment-insulin resistance. The development of diabetes was assessed annually using a self-questionnaire, fasting glucose, and glycated hemoglobin. We compared the risk of incident diabetes using multivariate Cox analysis. During 11623 person-years there were 101 cases of incident diabetes. Subjects with a high TyG index had a high risk of diabetes. For TyG index quartiles, hazard ratios (HRs) of quartiles 3 and 4 were 4.06 (p = 0.033) and 5.65 (p = 0.006) respectively. When the subjects were divided by obesity status and the cutoff value of TyG index of 8.8, the subgroups with TyG index ≥ 8.8 regardless of obesity had a significantly high risk for diabetes (HR 2.40 [p = 0.024] and 2.25 [p = 0.048]). For obesity status and metabolic health, the two metabolically unhealthy subgroups regardless of obesity had a significantly high risk for diabetes (HRs 2.54 [p = 0.024] and 2.73 [p = 0.021]). In conclusion, the TyG index measured at a single time point may be an indicator of the risk for incident diabetes. The predictive value of the TyG index was comparable to that of metabolic health.
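
    For reference, the TyG index is usually computed as ln(fasting triglycerides [mg/dL] × fasting plasma glucose [mg/dL] / 2); a one-function sketch using that standard formulation (the example inputs are invented):

```python
import math

def tyg_index(triglycerides_mg_dl, fasting_glucose_mg_dl):
    """TyG index = ln(fasting triglycerides x fasting glucose / 2),
    both in mg/dL -- the usual formulation in the literature."""
    return math.log(triglycerides_mg_dl * fasting_glucose_mg_dl / 2.0)

# Invented example: TG 150 mg/dL, FPG 100 mg/dL.
print(round(tyg_index(150, 100), 2))
```

    With these inputs the index is about 8.92, just above the 8.8 cutoff used to define the high-risk subgroups in the study.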

  18. The modified signed likelihood statistic and saddlepoint approximations

    DEFF Research Database (Denmark)

    Jensen, Jens Ledet

    1992-01-01

    SUMMARY: For a number of tests in exponential families we show that the use of a normal approximation to the modified signed likelihood ratio statistic r* is equivalent to the use of a saddlepoint approximation. This is also true in a large deviation region where the signed likelihood ratio statistic r is of order √n. © 1992 Biometrika Trust.
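    For reference, the modified signed likelihood root discussed above is conventionally defined as follows (a standard textbook form, not quoted from this paper; u denotes a Wald-type adjustment statistic):

```latex
r(\theta) = \operatorname{sign}(\hat\theta - \theta)\,
            \sqrt{2\{\ell(\hat\theta) - \ell(\theta)\}},
\qquad
r^*(\theta) = r(\theta) + \frac{1}{r(\theta)}\,
              \log\frac{u(\theta)}{r(\theta)},
```

    with Φ(r*) approximating the p-value to relative error O(n^{-3/2}), compared with O(n^{-1/2}) for the unmodified r.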

  19. Undernutrition among adults in India: the significance of individual-level and contextual factors impacting on the likelihood of underweight across sub-populations.

    Science.gov (United States)

    Siddiqui, Md Zakaria; Donato, Ronald

    2017-01-01

    To investigate the extent to which individual-level as well as macro-level contextual factors influence the likelihood of underweight across adult sub-populations in India. Population-based cross-sectional survey data from India's National Family Health Survey, conducted in 2005-06, were disaggregated into eight sub-populations. This multistage, nationally representative household survey covers 99 % of India's population; it included 124 385 females aged 15-49 years and 74 369 males aged 15-54 years. A social gradient in underweight exists in India. Even after allowing for wealth status, differences in the predicted probability of underweight persisted by rurality, age/maturity and gender. Individual-level education lowered the likelihood of underweight for males, but showed no statistical association for females. Paradoxically, rural young (15-24 years) females from more educated villages had a higher likelihood of underweight relative to those in less educated villages; for rural mature (>24 years) females the opposite was the case. Christians had a significantly lower likelihood of underweight relative to other socio-religious groups (OR = 0.53-0.80). Higher state-level inequality increased the likelihood of underweight across most population groups, while neighbourhood inequality exhibited a similar relationship for the rural young population subgroups only. Individual states/neighbourhoods accounted for 5-9 % of the variation in the prediction of underweight. Rural young females represent a particularly vulnerable sub-population. Economic growth alone is unlikely to reduce the burden of malnutrition in India; accordingly, policy makers need to address the broader social determinants that contribute to higher underweight prevalence in specific demographic subgroups.

  20. Planck intermediate results: XVI. Profile likelihoods for cosmological parameters

    DEFF Research Database (Denmark)

    Bartlett, J.G.; Cardoso, J.-F.; Delabrouille, J.

    2014-01-01

    We explore the 2013 Planck likelihood function with a high-precision multi-dimensional minimizer (Minuit). This allows a refinement of the ΛCDM best-fit solution with respect to previously released results, and the construction of frequentist confidence intervals using profile likelihoods. The agr...

  1. Planck 2013 results. XV. CMB power spectra and likelihood

    DEFF Research Database (Denmark)

    Tauber, Jan; Bartlett, J.G.; Bucher, M.

    2014-01-01

    This paper presents the Planck 2013 likelihood, a complete statistical description of the two-point correlation function of the CMB temperature fluctuations that accounts for all known relevant uncertainties, both instrumental and astrophysical in nature. We use this likelihood to derive our best...

  2. The behavior of the likelihood ratio test for testing missingness

    OpenAIRE

    Hens, Niel; Aerts, Marc; Molenberghs, Geert; Thijs, Herbert

    2003-01-01

    To assess the sensitivity of conclusions to model choices in the context of selection models for non-random dropout, one can test the different missingness mechanisms against each other, e.g. by likelihood ratio tests. The finite-sample behavior of the null distribution and the power of the likelihood ratio test are studied under a variety of missingness mechanisms. Keywords: missing data; sensitivity analysis; likelihood ratio test; missing mechanisms.
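    The likelihood ratio test comparing two nested missingness models (e.g. MAR as the reduced model nested in an MNAR selection model) can be sketched as follows; the fitted log-likelihood values are invented for illustration, and the closed-form p-value uses the fact that the chi-square survival function with 1 degree of freedom is erfc(√(x/2)):

```python
import math

def lr_test(loglik_full: float, loglik_reduced: float, df: int = 1) -> tuple[float, float]:
    """Likelihood ratio statistic and its asymptotic chi-square(df) p-value.

    For df == 1 the chi-square survival function has the closed form
    erfc(sqrt(x / 2)); larger df would need a regularized gamma function.
    """
    stat = 2.0 * (loglik_full - loglik_reduced)
    if df != 1:
        raise NotImplementedError("closed form implemented for df = 1 only")
    p_value = math.erfc(math.sqrt(stat / 2.0))
    return stat, p_value

# Hypothetical fitted log-likelihoods: MNAR (full) vs. MAR (reduced) dropout model.
stat, p = lr_test(loglik_full=-520.3, loglik_reduced=-523.1)
print(f"LR = {stat:.2f}, p = {p:.4f}")
```

    Note that, as the abstract cautions, the usual chi-square reference distribution can behave poorly in finite samples for missingness comparisons, which is precisely what the simulation study investigates.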

  3. Ego involvement increases doping likelihood.

    Science.gov (United States)

    Ring, Christopher; Kavussanu, Maria

    2018-08-01

    Achievement goal theory provides a framework for understanding how individuals behave in achievement contexts, such as sport. Evidence concerning the role of motivation in the decision to use banned performance-enhancing substances (i.e., doping) is equivocal: the extant literature shows that dispositional goal orientation has been weakly and inconsistently associated with doping intention and use. It is possible that goal involvement, which describes the situational motivational state, is a stronger determinant of doping intention. Accordingly, the current study used an experimental design to examine the effects of goal involvement, manipulated using direct instructions and reflective writing, on doping likelihood in hypothetical situations in college athletes. The ego-involving goal increased doping likelihood compared to no goal and a task-involving goal. These findings provide the first evidence that ego involvement can sway the decision to dope to improve athletic performance.

  4. Likelihood-ratio-based biometric verification

    NARCIS (Netherlands)

    Bazen, A.M.; Veldhuis, Raymond N.J.

    2002-01-01

    This paper presents results on optimal similarity measures for biometric verification based on fixed-length feature vectors. First, we show that the verification of a single user is equivalent to the detection problem, which implies that for single-user verification the likelihood ratio is optimal.

  5. Likelihood Ratio-Based Biometric Verification

    NARCIS (Netherlands)

    Bazen, A.M.; Veldhuis, Raymond N.J.

    The paper presents results on optimal similarity measures for biometric verification based on fixed-length feature vectors. First, we show that the verification of a single user is equivalent to the detection problem, which implies that, for single-user verification, the likelihood ratio is optimal.

  6. Likelihood functions for the analysis of single-molecule binned photon sequences

    Energy Technology Data Exchange (ETDEWEB)

    Gopich, Irina V., E-mail: irinag@niddk.nih.gov [Laboratory of Chemical Physics, National Institute of Diabetes and Digestive and Kidney Diseases, National Institutes of Health, Bethesda, MD 20892 (United States)

    2012-03-02

    Graphical abstract: folding of a protein with attached fluorescent dyes, the underlying conformational trajectory of interest, and the observed binned photon trajectory. Highlights: (i) a sequence of photon counts can be analyzed using a likelihood function; (ii) the exact likelihood function for a two-state kinetic model is provided; (iii) several approximations are considered for an arbitrary kinetic model; (iv) improved likelihood functions are obtained to treat sequences of FRET efficiencies. Abstract: We consider the analysis of a class of experiments in which the number of photons in consecutive time intervals is recorded. Sequences of photon counts or, alternatively, of FRET efficiencies can be studied using likelihood-based methods. For a kinetic model of the conformational dynamics and state-dependent Poisson photon statistics, we develop the formalism to calculate the exact likelihood that this model describes such sequences of photons or FRET efficiencies. Explicit analytic expressions for the likelihood function for a two-state kinetic model are provided. The important special case when conformational dynamics are so slow that at most a single transition occurs in a time bin is considered. By making a series of approximations, we eventually recover the likelihood function used in hidden Markov models. In this way, not only is insight gained into the range of validity of this procedure, but an improved likelihood function can also be obtained.
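    The slow-dynamics limit the abstract mentions reduces to a hidden-Markov likelihood, which can be evaluated with a standard forward pass. A minimal sketch for a two-state model with state-dependent Poisson photon counts (all rates, transition probabilities, and counts below are illustrative, not from the paper):

```python
import math

def poisson_pmf(count: int, rate: float) -> float:
    """Probability of observing `count` photons in a bin with mean `rate`."""
    return math.exp(-rate) * rate**count / math.factorial(count)

def hmm_likelihood(counts, rates, transition, initial):
    """Forward algorithm: likelihood of a binned photon sequence under a
    two-state Markov chain with Poisson emissions."""
    alpha = [initial[s] * poisson_pmf(counts[0], rates[s]) for s in range(2)]
    for c in counts[1:]:
        alpha = [
            sum(alpha[s_prev] * transition[s_prev][s] for s_prev in range(2))
            * poisson_pmf(c, rates[s])
            for s in range(2)
        ]
    return sum(alpha)

# Two conformational states with mean photon counts 2 and 10 per bin; slow
# interconversion so single-transition-per-bin is a reasonable approximation.
rates = [2.0, 10.0]
transition = [[0.95, 0.05], [0.10, 0.90]]
initial = [0.5, 0.5]
counts = [1, 2, 9, 11, 10, 2]
print(hmm_likelihood(counts, rates, transition, initial))
```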

  7. Planck 2013 results. XV. CMB power spectra and likelihood

    CERN Document Server

    Ade, P.A.R.; Armitage-Caplan, C.; Arnaud, M.; Ashdown, M.; Atrio-Barandela, F.; Aumont, J.; Baccigalupi, C.; Banday, A.J.; Barreiro, R.B.; Bartlett, J.G.; Battaner, E.; Benabed, K.; Benoit, A.; Benoit-Levy, A.; Bernard, J.P.; Bersanelli, M.; Bielewicz, P.; Bobin, J.; Bock, J.J.; Bonaldi, A.; Bonavera, L.; Bond, J.R.; Borrill, J.; Bouchet, F.R.; Boulanger, F.; Bridges, M.; Bucher, M.; Burigana, C.; Butler, R.C.; Calabrese, E.; Cardoso, J.F.; Catalano, A.; Challinor, A.; Chamballu, A.; Chiang, L.Y.; Chiang, H.C.; Christensen, P.R.; Church, S.; Clements, D.L.; Colombi, S.; Colombo, L.P.L.; Combet, C.; Couchot, F.; Coulais, A.; Crill, B.P.; Curto, A.; Cuttaia, F.; Danese, L.; Davies, R.D.; Davis, R.J.; de Bernardis, P.; de Rosa, A.; de Zotti, G.; Delabrouille, J.; Delouis, J.M.; Desert, F.X.; Dickinson, C.; Diego, J.M.; Dole, H.; Donzelli, S.; Dore, O.; Douspis, M.; Dunkley, J.; Dupac, X.; Efstathiou, G.; Elsner, F.; Ensslin, T.A.; Eriksen, H.K.; Finelli, F.; Forni, O.; Frailis, M.; Fraisse, A.A.; Franceschi, E.; Gaier, T.C.; Galeotta, S.; Galli, S.; Ganga, K.; Giard, M.; Giardino, G.; Giraud-Heraud, Y.; Gjerlow, E.; Gonzalez-Nuevo, J.; Gorski, K.M.; Gratton, S.; Gregorio, A.; Gruppuso, A.; Gudmundsson, J.E.; Hansen, F.K.; Hanson, D.; Harrison, D.; Helou, G.; Henrot-Versille, S.; Hernandez-Monteagudo, C.; Herranz, D.; Hildebrandt, S.R.; Hivon, E.; Hobson, M.; Holmes, W.A.; Hornstrup, A.; Hovest, W.; Huffenberger, K.M.; Hurier, G.; Jaffe, T.R.; Jaffe, A.H.; Jewell, J.; Jones, W.C.; Juvela, M.; Keihanen, E.; Keskitalo, R.; Kiiveri, K.; Kisner, T.S.; Kneissl, R.; Knoche, J.; Knox, L.; Kunz, M.; Kurki-Suonio, H.; Lagache, G.; Lahteenmaki, A.; Lamarre, J.M.; Lasenby, A.; Lattanzi, M.; Laureijs, R.J.; Lawrence, C.R.; Le Jeune, M.; Leach, S.; Leahy, J.P.; Leonardi, R.; Leon-Tavares, J.; Lesgourgues, J.; Liguori, M.; Lilje, P.B.; Lindholm, V.; Linden-Vornle, M.; Lopez-Caniego, M.; Lubin, P.M.; Macias-Perez, J.F.; Maffei, B.; Maino, D.; Mandolesi, N.; Marinucci, D.; Maris, 
M.; Marshall, D.J.; Martin, P.G.; Martinez-Gonzalez, E.; Masi, S.; Matarrese, S.; Matthai, F.; Mazzotta, P.; Meinhold, P.R.; Melchiorri, A.; Mendes, L.; Menegoni, E.; Mennella, A.; Migliaccio, M.; Millea, M.; Mitra, S.; Miville-Deschenes, M.A.; Molinari, D.; Moneti, A.; Montier, L.; Morgante, G.; Mortlock, D.; Moss, A.; Munshi, D.; Naselsky, P.; Nati, F.; Natoli, P.; Netterfield, C.B.; Norgaard-Nielsen, H.U.; Noviello, F.; Novikov, D.; Novikov, I.; O'Dwyer, I.J.; Orieux, F.; Osborne, S.; Oxborrow, C.A.; Paci, F.; Pagano, L.; Pajot, F.; Paladini, R.; Paoletti, D.; Partridge, B.; Pasian, F.; Patanchon, G.; Paykari, P.; Perdereau, O.; Perotto, L.; Perrotta, F.; Piacentini, F.; Piat, M.; Pierpaoli, E.; Pietrobon, D.; Plaszczynski, S.; Pointecouteau, E.; Polenta, G.; Ponthieu, N.; Popa, L.; Poutanen, T.; Pratt, G.W.; Prezeau, G.; Prunet, S.; Puget, J.L.; Rachen, J.P.; Rahlin, A.; Rebolo, R.; Reinecke, M.; Remazeilles, M.; Renault, C.; Ricciardi, S.; Riller, T.; Ringeval, C.; Ristorcelli, I.; Rocha, G.; Rosset, C.; Roudier, G.; Rowan-Robinson, M.; Rubino-Martin, J.A.; Rusholme, B.; Sandri, M.; Sanselme, L.; Santos, D.; Savini, G.; Scott, D.; Seiffert, M.D.; Shellard, E.P.S.; Spencer, L.D.; Starck, J.L.; Stolyarov, V.; Stompor, R.; Sudiwala, R.; Sureau, F.; Sutton, D.; Suur-Uski, A.S.; Sygnet, J.F.; Tauber, J.A.; Tavagnacco, D.; Terenzi, L.; Toffolatti, L.; Tomasi, M.; Tristram, M.; Tucci, M.; Tuovinen, J.; Turler, M.; Valenziano, L.; Valiviita, J.; Van Tent, B.; Varis, J.; Vielva, P.; Villa, F.; Vittorio, N.; Wade, L.A.; Wandelt, B.D.; Wehus, I.K.; White, M.; White, S.D.M.; Yvon, D.; Zacchei, A.; Zonca, A.

    2014-01-01

    We present the Planck likelihood, a complete statistical description of the two-point correlation function of the CMB temperature fluctuations. We use this likelihood to derive the Planck CMB power spectrum over three decades in l, covering 2 <= l <= 2500. For l >= 50, we employ a correlated Gaussian likelihood approximation based on angular cross-spectra derived from the 100, 143 and 217 GHz channels. We validate our likelihood through an extensive suite of consistency tests, and assess the impact of residual foreground and instrumental uncertainties on cosmological parameters. We find good internal agreement among the high-l cross-spectra, with residuals of a few μK^2 at l <= 1000. We compare our results with foreground-cleaned CMB maps, and with cross-spectra derived from the 70 GHz Planck map, and find broad agreement in terms of spectrum residuals and cosmological parameters. The best-fit LCDM cosmology is in excellent agreement with preliminary Planck polarisation spectra. The standard LCDM cosmology is well constrained b...

  8. Predictive and prognostic properties of TB-LAM among HIV-positive patients initiating ART in Johannesburg, South Africa.

    Science.gov (United States)

    d'Elia, Alexander; Evans, Denise; McNamara, Lynne; Berhanu, Rebecca; Sanne, Ian; Lönnermark, Elisabet

    2015-01-01

    While the diagnostic properties of the TB LAM urine assay (LAM) have been well described, little is known about its predictive and prognostic properties at ART initiation in a routine clinic setting. We describe the predictive and prognostic properties of LAM in HIV-positive patients initiating ART at an urban hospital in Johannesburg, South Africa. Retrospective study of HIV-positive adults (>18 years) who initiated standard first-line ART between February 2012 and April 2013 and had a LAM test at initiation. In HIV-positive patients with no known TB at ART initiation, we assessed the sensitivity, specificity and positive/negative likelihood ratios of LAM to predict incident TB within 6 months of ART initiation. In addition, in patients with a TB diagnosis who were on TB treatment at ART initiation, we measured the CD4 response at 6 months on ART. Of the 274 patients without TB at ART initiation, 65% were female, with a median CD4 count of 213 cells/mm(3). Among the 14 (5.1%) patients who developed active TB, none were urine LAM +ve at baseline. LAM had poor sensitivity (0.0%, 95% CI 0.0-23.2%) to predict incident TB within 6 months of initiation. We analyzed 22 patients with a confirmed TB diagnosis at initiation separately. Of these, LAM +ve patients (27%) showed lower CD4 gains compared to LAM -ve patients (median increase 103 vs 199 cells/mm(3); p = 0.08). LAM has limited value for accurately predicting incident TB in patients with higher CD4 counts after ART initiation. LAM may help identify TB/HIV co-infected patients at ART initiation who respond more slowly to treatment and require targeted interventions to improve treatment outcomes. Larger studies with longer patient follow-up are needed.
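    The predictive metrics reported above follow directly from a 2x2 table. A minimal sketch: the 0 true positives and 14 false negatives match the abstract (none of the 14 incident TB cases were LAM+ at baseline), while the split of the remaining 260 TB-free patients into false positives and true negatives is invented for illustration:

```python
def diagnostic_properties(tp: int, fp: int, fn: int, tn: int):
    """Sensitivity, specificity and positive/negative likelihood ratios
    from the counts of a 2x2 diagnostic table."""
    sensitivity = tp / (tp + fn) if tp + fn else float("nan")
    specificity = tn / (tn + fp) if tn + fp else float("nan")
    lr_pos = sensitivity / (1 - specificity) if specificity < 1 else float("inf")
    lr_neg = (1 - sensitivity) / specificity if specificity else float("inf")
    return sensitivity, specificity, lr_pos, lr_neg

# tp=0, fn=14 per the abstract; fp/tn split among the 260 TB-free patients is hypothetical.
sens, spec, lrp, lrn = diagnostic_properties(tp=0, fp=10, fn=14, tn=250)
print(sens, spec, lrp, lrn)
```

    With zero true positives the sensitivity is exactly 0, so the positive likelihood ratio collapses to 0 and the negative likelihood ratio is close to 1, reproducing why the assay had no predictive value here.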

  9. Waist circumference cut-off values to predict the incidence of hypertension: an estimation from a Brazilian population-based cohort.

    Science.gov (United States)

    Gus, M; Cichelero, F Tremea; Moreira, C Medaglia; Escobar, G Fortes; Moreira, L Beltrami; Wiehe, M; Fuchs, S Costa; Fuchs, F Danni

    2009-01-01

    Central obesity is a key component of the definition of the metabolic syndrome, but the cut-off values proposed to define abnormal values vary among guidelines and are mostly based on cross-sectional studies. In this study, we identify the best cut-off values for waist circumference (WC) associated with the incidence of hypertension. Participants in this prospectively planned cohort study were 589 individuals who were free of hypertension and selected at random from the community of Porto Alegre, Brazil. Hypertension was defined by a blood pressure measurement ≥140/90 mmHg or the use of blood-pressure-lowering drugs. A logistic regression model established the association between WC and the incidence of hypertension, and a receiver operating characteristic (ROC) curve analysis was used to select the best WC cut-off point to predict the incidence of hypertension. During a mean follow-up of 5.5 ± 0.9 years, 127 subjects developed hypertension. The hazard ratio for the development of hypertension, adjusted for age, baseline systolic blood pressure, alcohol consumption, gender and scholarship, was 1.02 for WC (95% CI 1.00-1.04; P=0.02). The best cut-off WC values to predict hypertension were 87 cm in men and 80 cm in women, with areas under the curve of 0.56 (95% CI 0.47-0.64; P=0.17) and 0.70 (95% CI 0.63-0.77; P<0.001), respectively. WC predicts incident hypertension in individuals living in communities in Brazil, and this risk begins at lower values of WC than those recommended by some guidelines.
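    Selecting an ROC cut-off as in this study is commonly done by maximizing Youden's J (sensitivity + specificity - 1) over candidate thresholds. A bare-bones sketch on invented toy data (the real analysis was covariate-adjusted and sex-stratified):

```python
def best_cutoff(values, outcomes):
    """Return the cutoff maximizing Youden's J = sensitivity + specificity - 1.

    `values` are e.g. waist circumferences; `outcomes` are 1 if the subject
    later developed hypertension, else 0. A subject tests positive when
    value >= cutoff.
    """
    cases = [v for v, y in zip(values, outcomes) if y == 1]
    controls = [v for v, y in zip(values, outcomes) if y == 0]
    best_j, best_c = -1.0, None
    for c in sorted(set(values)):
        sens = sum(v >= c for v in cases) / len(cases)
        spec = sum(v < c for v in controls) / len(controls)
        j = sens + spec - 1
        if j > best_j:
            best_j, best_c = j, c
    return best_c, best_j

# Toy data (invented): controls cluster below 87 cm, cases at or above it.
wc = [70, 75, 80, 82, 85, 86, 87, 90, 95, 100]
htn = [0, 0, 0, 0, 0, 0, 1, 1, 1, 1]
print(best_cutoff(wc, htn))  # -> (87, 1.0) on this perfectly separable toy data
```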

  10. Evidence-Based Occupational Hearing Screening I: Modeling the Effects of Real-World Noise Environments on the Likelihood of Effective Speech Communication.

    Science.gov (United States)

    Soli, Sigfrid D; Giguère, Christian; Laroche, Chantal; Vaillancourt, Véronique; Dreschler, Wouter A; Rhebergen, Koenraad S; Harkins, Kevin; Ruckstuhl, Mark; Ramulu, Pradeep; Meyers, Lawrence S

    The objectives of this study were to (1) identify essential hearing-critical job tasks for public safety and law enforcement personnel; (2) determine the locations and real-world noise environments where these tasks are performed; (3) characterize each noise environment in terms of its impact on the likelihood of effective speech communication, considering the effects of different levels of vocal effort, communication distances, and repetition; and (4) use this characterization to define an objective normative reference for evaluating the ability of individuals to perform essential hearing-critical job tasks in noisy real-world environments. Data from five occupational hearing studies performed over a 17-year period for various public safety agencies were analyzed. In each study, job task analyses by job content experts identified essential hearing-critical tasks and the real-world noise environments where these tasks are performed. These environments were visited, and calibrated recordings of each noise environment were made. The extended speech intelligibility index (ESII) was calculated for each 4-sec interval in each recording. These data, together with the estimated ESII value required for effective speech communication by individuals with normal hearing, allowed the likelihood of effective speech communication in each noise environment to be determined for different levels of vocal effort and communication distances. These likelihoods provide an objective, norm-referenced and standardized means of characterizing the predicted impact of real-world noise on the ability to perform essential hearing-critical tasks. A total of 16 noise environments for law enforcement personnel and eight noise environments for corrections personnel were analyzed. Effective speech communication was essential to the hearing-critical tasks performed in these environments. Average noise levels ranged from approximately 70 to 87 dBA in law enforcement environments and 64 to 80 dBA in

  11. Constraint likelihood analysis for a network of gravitational wave detectors

    International Nuclear Information System (INIS)

    Klimenko, S.; Rakhmanov, M.; Mitselmakher, G.; Mohanty, S.

    2005-01-01

    We propose a coherent method for detection and reconstruction of gravitational wave signals with a network of interferometric detectors. The method is derived by using the likelihood ratio functional for unknown signal waveforms. In the likelihood analysis, the global maximum of the likelihood ratio over the space of waveforms is used as the detection statistic. We identify a problem with this approach: in the case of an aligned pair of detectors, the detection statistic depends on the cross-correlation between the detectors, as expected, but this dependence disappears even for infinitesimally small misalignments. We solve the problem by applying constraints on the likelihood functional and obtain a new class of statistics. The resulting method can be applied to data from a network consisting of any number of detectors with arbitrary detector orientations. The method allows reconstruction of the source coordinates and the waveforms of the two polarization components of a gravitational wave. We study the performance of the method with numerical simulations and find the reconstruction of the source coordinates to be more accurate than in the standard likelihood method.

  12. Incidence, predictive factors, and clinical outcomes of acute kidney injury after gastric surgery for gastric cancer.

    Directory of Open Access Journals (Sweden)

    Chang Seong Kim

    BACKGROUND: Postoperative acute kidney injury (AKI), a serious surgical complication, is common after cardiac surgery; however, reports on AKI after noncardiac surgery are limited. We sought to determine the incidence and predictive factors of AKI after gastric surgery for gastric cancer and its effects on clinical outcomes. METHODS: We conducted a retrospective study of 4718 patients with normal renal function who underwent partial or total gastrectomy for gastric cancer between June 2002 and December 2011. Postoperative AKI was defined by serum creatinine change, as per the Kidney Disease Improving Global Outcomes guideline. RESULTS: Of the 4718 patients, 679 (14.4%) developed AKI. Length of hospital stay, intensive care unit admission rates, and the in-hospital mortality rate (3.5% versus 0.2%) were significantly higher in patients with AKI than in those without. AKI was also associated with a requirement for renal replacement therapy. Multivariate analysis revealed that male gender; hypertension; chronic obstructive pulmonary disease; hypoalbuminemia (<4 g/dl); use of diuretics, vasopressors, and contrast agents; and packed red blood cell transfusion were independent predictors of AKI after gastric surgery. Postoperative AKI and vasopressor use entailed a high risk of 3-month mortality after multiple adjustments. CONCLUSIONS: AKI was common after gastric surgery for gastric cancer and was associated with adverse outcomes. We identified several factors associated with postoperative AKI; recognition of these predictive factors may help reduce the incidence of AKI after gastric surgery. Furthermore, postoperative AKI in patients with gastric cancer is an important risk factor for short-term mortality.

  13. Predicting the hand, foot, and mouth disease incidence using search engine query data and climate variables: an ecological study in Guangdong, China.

    Science.gov (United States)

    Du, Zhicheng; Xu, Lin; Zhang, Wangjian; Zhang, Dingmei; Yu, Shicheng; Hao, Yuantao

    2017-10-06

    Hand, foot, and mouth disease (HFMD) has caused a substantial burden in China, especially in Guangdong Province. Based on the enhanced surveillance system, we aimed to explore whether the addition of temperature and search engine query data improves the risk prediction of HFMD. Ecological study. Information on confirmed HFMD cases, climate parameters and search engine query logs was collected. A total of 1.36 million HFMD cases were identified from the surveillance system during 2011-2014. Analyses were conducted at the aggregate level and no confidential information was involved. A seasonal autoregressive integrated moving average (ARIMA) model with external variables (ARIMAX) was used to predict HFMD incidence from 2011 to 2014, taking into account temperature and search engine query data (Baidu Index, BDI). Statistics of goodness-of-fit and precision of prediction were used to compare models (1) based on surveillance data only, and with the addition of (2) temperature, (3) BDI, and (4) both temperature and BDI. A high correlation between HFMD incidence and BDI was observed (r = 0.794, p < 0.001). The addition of temperature and search engine query data significantly improved the prediction of HFMD. Further studies are warranted to examine whether including search engine query data also improves the prediction of other infectious diseases in other settings. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
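    The ARIMAX idea of augmenting an autoregressive incidence model with external regressors (temperature and BDI) can be illustrated with a bare-bones least-squares fit of an AR(1)-with-exogenous-inputs model. This is a simplification of the seasonal ARIMAX used in the study, and all data below are synthetic, generated from known coefficients so the fit can be checked:

```python
import random

def solve(A, b):
    """Gaussian elimination with partial pivoting for a small dense system."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for k in range(col, n + 1):
                M[r][k] -= f * M[col][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def fit_arx(y, temp, bdi):
    """Least-squares fit of y_t = a*y_{t-1} + b*temp_t + c*bdi_t + d
    via the normal equations (X'X) beta = X'y."""
    X = [[y[t - 1], temp[t], bdi[t], 1.0] for t in range(1, len(y))]
    target = y[1:]
    p = len(X[0])
    XtX = [[sum(row[i] * row[j] for row in X) for j in range(p)] for i in range(p)]
    Xty = [sum(row[i] * yt for row, yt in zip(X, target)) for i in range(p)]
    return solve(XtX, Xty)

# Synthetic weekly incidence driven by known coefficients (noise-free for clarity).
random.seed(0)
temp = [random.uniform(15, 30) for _ in range(120)]
bdi = [random.uniform(0, 10) for _ in range(120)]
y = [5.0]
for t in range(1, 120):
    y.append(0.6 * y[t - 1] + 0.3 * temp[t] + 0.5 * bdi[t] + 1.0)
print(fit_arx(y, temp, bdi))  # recovers ~ [0.6, 0.3, 0.5, 1.0]
```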

  14. The fine-tuning cost of the likelihood in SUSY models

    International Nuclear Information System (INIS)

    Ghilencea, D.M.; Ross, G.G.

    2013-01-01

    In SUSY models, the fine-tuning of the electroweak (EW) scale with respect to the parameters γ_i = {m_0, m_{1/2}, μ_0, A_0, B_0, …} and the maximal likelihood L to fit the experimental data are usually regarded as two different problems. We show that, if one regards the EW minimum conditions as constraints that fix the EW scale, this commonly held view is not correct and the likelihood contains all the information about fine-tuning. In this case we show that the corrected likelihood is equal to the ratio L/Δ of the usual likelihood L and the traditional fine-tuning measure Δ of the EW scale. A similar result is obtained for the likelihood integrated over the set {γ_i}, which can be written as a surface integral of the ratio L/Δ, with the surface in γ_i space determined by the EW minimum constraints. As a result, a large likelihood actually demands a large ratio L/Δ or, equivalently, a small χ²_new = χ²_old + 2 ln Δ. This shows the fine-tuning cost to the likelihood (χ²_new) of the EW scale stability enforced by SUSY, which is ignored in data fits. A good χ²_new/d.o.f. ≈ 1 thus demands that SUSY models have a fine-tuning amount Δ ≪ exp(d.o.f./2), which provides a model-independent criterion for acceptable fine-tuning. If this criterion is not met, one can rule out SUSY models without a further χ²/d.o.f. analysis. Numerical methods to fit the data can easily be adapted to account for this effect.
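    The two formulas quoted in the abstract are simple enough to evaluate directly; a minimal sketch with illustrative numbers (not from any actual fit):

```python
import math

def chi2_corrected(chi2_old: float, delta: float) -> float:
    """chi^2_new = chi^2_old + 2 ln(Delta), the fine-tuning-corrected chi-square."""
    return chi2_old + 2.0 * math.log(delta)

def max_acceptable_delta(dof: int) -> float:
    """The bound Delta << exp(d.o.f. / 2) quoted in the abstract."""
    return math.exp(dof / 2.0)

# Illustrative: a fit with chi^2 = 20, fine-tuning Delta = 100, and 30 d.o.f.
print(chi2_corrected(20.0, 100.0))   # 20 + 2 ln 100 ~ 29.2
print(max_acceptable_delta(30))      # e^15, the scale Delta must stay well below
```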

  15. Maximum likelihood approach for several stochastic volatility models

    International Nuclear Information System (INIS)

    Camprodon, Jordi; Perelló, Josep

    2012-01-01

    Volatility measures the amplitude of price fluctuations. Despite being one of the most important quantities in finance, volatility is not directly observable. Here we apply a maximum likelihood method which assumes that price and volatility follow a two-dimensional diffusion process where volatility is the stochastic diffusion coefficient of the log-price dynamics. We apply this method to the simplest versions of the expOU, OU and Heston stochastic volatility models and study their performance in terms of the log-price probability, the volatility probability, and its Mean First-Passage Time. The approach has some predictive power on the future return amplitude from knowledge of the current volatility alone. The assumed models consider neither long-range volatility autocorrelation nor the asymmetric return-volatility cross-correlation, but the method still yields these two important stylized facts very naturally. We apply the method to different market indices, with good performance in all cases. (paper)

  16. The likelihood principle and its proof – a never-ending story…

    DEFF Research Database (Denmark)

    Jørgensen, Thomas Martini

    2015-01-01

    An ongoing controversy in the philosophy of statistics is the so-called “likelihood principle”, essentially stating that all evidence obtained from an experiment about an unknown quantity θ is contained in the likelihood function of θ. Common classical statistical methodology, such as the use of significance tests and confidence intervals, depends on the experimental procedure and unrealized events and thus violates the likelihood principle. The likelihood principle was identified by that name and proved in a famous paper by Allan Birnbaum in 1962. However, ever since, both the principle itself and its proof have been highly debated. This presentation illustrates the debate over both the principle and its proof, from 1962 up to today, including an often-used experiment that illustrates the controversy between the classical interpretation and evidential confirmation based on the likelihood principle.

  17. DarkBit. A GAMBIT module for computing dark matter observables and likelihoods

    Energy Technology Data Exchange (ETDEWEB)

    Bringmann, Torsten; Dal, Lars A. [University of Oslo, Department of Physics, Oslo (Norway); Conrad, Jan; Edsjoe, Joakim; Farmer, Ben [AlbaNova University Centre, Oskar Klein Centre for Cosmoparticle Physics, Stockholm (Sweden); Stockholm University, Department of Physics, Stockholm (Sweden); Cornell, Jonathan M. [McGill University, Department of Physics, Montreal, QC (Canada); Kahlhoefer, Felix; Wild, Sebastian [DESY, Hamburg (Germany); Kvellestad, Anders; Savage, Christopher [NORDITA, Stockholm (Sweden); Putze, Antje [LAPTh, Universite de Savoie, CNRS, Annecy-le-Vieux (France); Scott, Pat [Blackett Laboratory, Imperial College London, Department of Physics, London (United Kingdom); Weniger, Christoph [University of Amsterdam, GRAPPA, Institute of Physics, Amsterdam (Netherlands); White, Martin [University of Adelaide, Department of Physics, Adelaide, SA (Australia); Australian Research Council Centre of Excellence for Particle Physics at the Tera-scale, Parkville (Australia); Collaboration: The GAMBIT Dark Matter Workgroup

    2017-12-15

    We introduce DarkBit, an advanced software code for computing dark matter constraints on various extensions to the Standard Model of particle physics, comprising both new native code and interfaces to external packages. This release includes a dedicated signal yield calculator for gamma-ray observations, which significantly extends current tools by implementing a cascade-decay Monte Carlo, as well as a dedicated likelihood calculator for current and future experiments (gamLike). This provides a general solution for studying complex particle physics models that predict dark matter annihilation to a multitude of final states. We also supply a direct detection package that models a large range of direct detection experiments (DDCalc), and that provides the corresponding likelihoods for arbitrary combinations of spin-independent and spin-dependent scattering processes. Finally, we provide custom relic density routines along with interfaces to DarkSUSY, micrOMEGAs, and the neutrino telescope likelihood package nulike. DarkBit is written in the framework of the Global And Modular Beyond the Standard Model Inference Tool (GAMBIT), providing seamless integration into a comprehensive statistical fitting framework that allows users to explore new models with both particle and astrophysics constraints, and a consistent treatment of systematic uncertainties. In this paper we describe its main functionality, provide a guide to getting started quickly, and show illustrative examples for results obtained with DarkBit (both as a stand-alone tool and as a GAMBIT module). This includes a quantitative comparison between two of the main dark matter codes (DarkSUSY and micrOMEGAs), and application of DarkBit's advanced direct and indirect detection routines to a simple effective dark matter model. (orig.)

  18. Safety culture and learning from incidents: the role of incident reporting and causal analyses

    International Nuclear Information System (INIS)

    Wilpert, B.

    1994-01-01

    The nuclear industry, more than any other industrial branch, has developed and used predictive risk analysis as a method of feedforward control of safety and reliability. Systematic evaluation of operating experience, statistical documentation of component failures, and systematic documentation and analysis of incidents are important complementary elements of feedback control: we are dealing here with adjustment and learning from experience, in particular from past incidents. Using preliminary findings from ongoing research at the Research Center Systems Safety at the Berlin University of Technology, the contribution discusses preconditions for an effective use of the lessons to be learnt from closely matched incident reporting and in-depth analyses of the causal chains leading to incidents. Such preconditions include standardized documentation, reporting and analysis methods for incidents; structured information flows and feedback loops; abstaining from the search for culprits; mutual trust between employees and management; and the willingness of all concerned to continually evaluate and optimize the established learning system. Thus, incident-related reporting and causal analyses contribute to safety culture, which is seen to emerge from tightly coupled organizational measures and corresponding changes in attitudes and behaviour. (author) 2 figs., 7 refs

  19. A simulation study of likelihood inference procedures in rayleigh distribution with censored data

    International Nuclear Information System (INIS)

    Baklizi, S. A.; Baker, H. M.

    2001-01-01

Inference procedures based on the likelihood function are considered for the one-parameter Rayleigh distribution with type 1 and type 2 censored data. Using simulation techniques, the finite sample performances of the maximum likelihood estimator and the large-sample likelihood interval estimation procedures based on the Wald, the Rao, and the likelihood ratio statistics are investigated. It appears that the maximum likelihood estimator is unbiased. The approximate variance estimates obtained from the asymptotic normal distribution of the maximum likelihood estimator are accurate under type 2 censoring, while they tend to be smaller than the actual variances for type 1 censored data of small size. It also appears that interval estimation based on the Wald and Rao statistics needs a much larger sample size than interval estimation based on the likelihood ratio statistic to attain reasonable accuracy. (authors). 15 refs., 4 tabs
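    As a hedged illustration of the type 2 censored setting described above (not the authors' code): the Rayleigh maximum likelihood estimator has a closed form because X² is exponential with mean 2σ², and a large-sample Wald interval follows from the Fisher information. The data in the usage example are invented.

```python
import math

def rayleigh_mle_type2(x_obs, n):
    """MLE of the Rayleigh scale sigma from type 2 censored data:
    x_obs holds the r smallest failure times out of n units on test.
    Uses the fact that X^2 is exponential with mean 2*sigma^2."""
    r = len(x_obs)
    x_r = max(x_obs)  # largest observed order statistic
    total = sum(x * x for x in x_obs) + (n - r) * x_r * x_r
    return math.sqrt(total / (2.0 * r))

def wald_interval(sigma_hat, r, z=1.96):
    """Large-sample Wald interval for sigma; the Fisher information
    under type 2 censoring with r failures is approximately 4r/sigma^2,
    so the standard error is sigma_hat / (2*sqrt(r))."""
    se = sigma_hat / (2.0 * math.sqrt(r))
    return sigma_hat - z * se, sigma_hat + z * se
```

    For example, observing the three smallest values 1.0, 2.0, 3.0 out of five units gives sigma_hat = sqrt((1 + 4 + 9 + 2*9) / 6) ≈ 2.31.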

  20. Neonatal seizures in a rural Iranian district hospital: etiologies, incidence and predicting factors.

    Science.gov (United States)

    Sadeghian, Afsaneh; Damghanian, Maryam; Shariati, Mohammad

    2012-01-01

The current study determined the overall incidence, common causes, and main predictors of neonatal seizures among neonates admitted to a rural district hospital in Iran. The study was conducted on 699 neonates who were candidates for admission to the NICU. The study population was categorized into a case group, including patients with a final diagnosis of neonatal seizures, and a control group without this diagnosis. Neonatal seizure was reported as the final diagnosis in 25 (3.6%) of neonates. The most frequent discharge diagnosis in the seizure group was neonatal sepsis, and in the non-seizure group respiratory problems. No significant difference was found in early fatality rate between neonates with and without seizures (8.0% vs. 10.1%). Only gestational age <38 weeks had a relationship with the appearance of neonatal seizures. Low gestational age has a crucial role in predicting the appearance of seizures in Iranian neonates.

  1. Factors Associated with Young Adults’ Pregnancy Likelihood

    Science.gov (United States)

    Kitsantas, Panagiota; Lindley, Lisa L.; Wu, Huichuan

    2014-01-01

    OBJECTIVES While progress has been made to reduce adolescent pregnancies in the United States, rates of unplanned pregnancy among young adults (18–29 years) remain high. In this study, we assessed factors associated with perceived likelihood of pregnancy (likelihood of getting pregnant/getting partner pregnant in the next year) among sexually experienced young adults who were not trying to get pregnant and had ever used contraceptives. METHODS We conducted a secondary analysis of 660 young adults, 18–29 years old in the United States, from the cross-sectional National Survey of Reproductive and Contraceptive Knowledge. Logistic regression and classification tree analyses were conducted to generate profiles of young adults most likely to report anticipating a pregnancy in the next year. RESULTS Nearly one-third (32%) of young adults indicated they believed they had at least some likelihood of becoming pregnant in the next year. Young adults who believed that avoiding pregnancy was not very important were most likely to report pregnancy likelihood (odds ratio [OR], 5.21; 95% CI, 2.80–9.69), as were young adults for whom avoiding a pregnancy was important but not satisfied with their current contraceptive method (OR, 3.93; 95% CI, 1.67–9.24), attended religious services frequently (OR, 3.0; 95% CI, 1.52–5.94), were uninsured (OR, 2.63; 95% CI, 1.31–5.26), and were likely to have unprotected sex in the next three months (OR, 1.77; 95% CI, 1.04–3.01). DISCUSSION These results may help guide future research and the development of pregnancy prevention interventions targeting sexually experienced young adults. PMID:25782849

  2. Socio-Economic Predictors and Distribution of Tuberculosis Incidence in Beijing, China: A Study Using a Combination of Spatial Statistics and GIS Technology.

    Science.gov (United States)

    Mahara, Gehendra; Yang, Kun; Chen, Sipeng; Wang, Wei; Guo, Xiuhua

    2018-03-21

Evidence shows that multiple factors, such as socio-economic status and access to health care facilities, affect tuberculosis (TB) incidence. However, there is limited literature available with respect to the correlation between socio-economic/health facility factors and tuberculosis incidence. This study aimed to explore the relationship between TB incidence and socio-economic/health service predictors in the study settings. A retrospective spatial regression analysis was carried out based on new sputum smear-positive pulmonary TB cases in Beijing districts. Global Moran's I analysis was adopted to detect spatial dependency, followed by spatial regression models (spatial lag model and spatial error model), which were applied alongside the ordinary least squares model to examine the correlation between TB incidence and the predictors. A high incidence of TB was seen in densely populated districts in Beijing, e.g., Haidian, Mentougou, and Xicheng. After comparing the R², log-likelihood, and Akaike information criterion (AIC) values among the three models, the spatial error model (R² = 0.413; log-likelihood = -591; AIC = 1199.76) provided the best fit. The study showed that the number of beds in health institutes (p < 0.001) and per capita gross domestic product (GDP) (p = 0.025) had a positive effect on TB incidence, whereas population density (p < 0.001) and migrant population (p < 0.001) had an adverse impact on TB incidence in the study settings. High TB incidence districts were detected in urban and densely populated districts in Beijing. Our findings suggest that socio-economic predictors influence TB incidence. These findings may help to guide TB control programs and promote targeted intervention.
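    The global Moran's I statistic used above to detect spatial dependency can be sketched in a few lines. The ring-of-districts usage example and the binary weight matrix are illustrative assumptions, not the study's data:

```python
def morans_i(values, weights):
    """Global Moran's I for spatial autocorrelation.
    values: one observation per district (e.g. TB incidence);
    weights: n x n spatial weight matrix, weights[i][j] > 0 when
    districts i and j are neighbours (diagonal is zero)."""
    n = len(values)
    mean = sum(values) / n
    dev = [v - mean for v in values]
    w_total = sum(sum(row) for row in weights)
    num = sum(weights[i][j] * dev[i] * dev[j]
              for i in range(n) for j in range(n))
    den = sum(d * d for d in dev)
    return (n / w_total) * (num / den)
```

    Values near +1 indicate clustering (neighbouring districts alike), values near -1 a checkerboard pattern, and values near 0 spatial randomness.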

  3. Bohai and Yellow Sea Oil Spill Prediction System and Its Application to Huangdao ‘11.22’ Oil Spill Incident

    Science.gov (United States)

    Li, Huan; Li, Yan; Li, Cheng; Li, Wenshan; Wang, Guosong; Zhang, Song

    2017-08-01

Marine oil spills have deeply negative effects on both marine ecosystems and human activities. In recent years, due to China's high-speed economic development, the demand for crude oil in China has been increasing year by year, leading to a high risk of marine oil spills. It is therefore necessary to improve both emergency response to marine oil spills in China and oil spill prediction techniques. In this study, based on an oil spill model and a GIS platform, we developed the Bohai and Yellow Sea oil spill prediction system. Combined with high-resolution meteorological and oceanographic forecast results, the system was applied to predict the drift and diffusion of the Huangdao '11.22' oil spill incident. Although the prediction could not be validated against SAR images due to the lack of satellite observations, it still provided effective and usable oil spill behaviour information to the Maritime Safety Administration.

  4. Predicting the hand, foot, and mouth disease incidence using search engine query data and climate variables: an ecological study in Guangdong, China

    Science.gov (United States)

    Du, Zhicheng; Xu, Lin; Zhang, Wangjian; Zhang, Dingmei; Yu, Shicheng; Hao, Yuantao

    2017-01-01

Objectives Hand, foot, and mouth disease (HFMD) has caused a substantial burden in China, especially in Guangdong Province. Based on the enhanced surveillance system, we aimed to explore whether the addition of temperature and search engine query data improves the risk prediction of HFMD. Design Ecological study. Setting and participants Information on the confirmed cases of HFMD, climate parameters and search engine query logs was collected. A total of 1.36 million HFMD cases were identified from the surveillance system during 2011–2014. Analyses were conducted at the aggregate level and no confidential information was involved. Outcome measures A seasonal autoregressive integrated moving average (ARIMA) model with external variables (ARIMAX) was used to predict the HFMD incidence from 2011 to 2014, taking into account temperature and search engine query data (Baidu Index, BDI). Statistics of goodness-of-fit and precision of prediction were used to compare models (1) based on surveillance data only, and with the addition of (2) temperature, (3) BDI, and (4) both temperature and BDI. Results A high correlation between HFMD incidence and BDI (r=0.794, p<0.001) was found. Compared with the model based on surveillance data only, the ARIMAX model including BDI reached the best goodness-of-fit with an Akaike information criterion (AIC) value of −345.332, whereas the model including both BDI and temperature had the most accurate prediction in terms of the mean absolute percentage error (MAPE) of 101.745%. Conclusions An ARIMAX model incorporating search engine query data significantly improved the prediction of HFMD. Further studies are warranted to examine whether including search engine query data also improves the prediction of other infectious diseases in other settings. PMID:28988169
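    A minimal sketch of the idea of adding external regressors (temperature, Baidu Index) to an autoregressive incidence model: ordinary least squares on lagged incidence plus exogenous terms. This pared-down ARX model is a stand-in for the seasonal ARIMAX in the abstract, and all variable names and data are illustrative assumptions.

```python
import numpy as np

def fit_arx(y, exog, lags=1):
    """Least-squares fit of y_t = b0 + sum_k a_k * y_{t-k} + c . exog_t.
    y: incidence series; exog: (T, k) array of external regressors
    (e.g. temperature and a search index), aligned with y."""
    y = np.asarray(y, dtype=float)
    exog = np.asarray(exog, dtype=float)
    rows, targets = [], []
    for t in range(lags, len(y)):
        rows.append(np.concatenate(([1.0], y[t - lags:t][::-1], exog[t])))
        targets.append(y[t])
    beta, *_ = np.linalg.lstsq(np.array(rows), np.array(targets), rcond=None)
    return beta  # [intercept, AR coefficients, exogenous coefficients]

def predict_next(y, exog_next, beta, lags=1):
    """One-step-ahead forecast given the fitted coefficients."""
    x = np.concatenate(([1.0], np.asarray(y[-lags:], dtype=float)[::-1],
                        np.asarray(exog_next, dtype=float)))
    return float(x @ beta)
```

    Model comparison as in the abstract would then proceed by computing AIC and MAPE for the fits with and without each exogenous column.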

  5. Likelihood inference for unions of interacting discs

    DEFF Research Database (Denmark)

    Møller, Jesper; Helisova, K.

    2010-01-01

This is probably the first paper which discusses likelihood inference for a random set using a germ-grain model, where the individual grains are unobservable, edge effects occur and other complications appear. We consider the case where the grains form a disc process modelled by a marked point process, where the germs are the centres and the marks are the associated radii of the discs. We propose to use a recent parametric class of interacting disc process models, where the minimal sufficient statistic depends on various geometric properties of the random set, and the density is specified [...]-based maximum likelihood inference and the effect of specifying different reference Poisson models.

  6. Stuck pipe prediction

    KAUST Repository

    Alzahrani, Majed; Alsolami, Fawaz; Chikalov, Igor; Algharbi, Salem; Aboudi, Faisal; Khudiri, Musab

    2016-01-01

    Disclosed are various embodiments for a prediction application to predict a stuck pipe. A linear regression model is generated from hook load readings at corresponding bit depths. A current hook load reading at a current bit depth is compared with a normal hook load reading from the linear regression model. A current hook load greater than a normal hook load for a given bit depth indicates the likelihood of a stuck pipe.
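    The disclosed idea reduces to fitting a regression of hook load on bit depth and flagging readings above the normal trend. The sketch below is an assumed minimal implementation, not the patented code; the record specifies no tolerance, so `margin` is a hypothetical parameter.

```python
def fit_line(depths, loads):
    """Ordinary least squares fit of hook load against bit depth.
    Returns (intercept, slope) of the normal hook load trend."""
    n = len(depths)
    mx = sum(depths) / n
    my = sum(loads) / n
    sxx = sum((d - mx) ** 2 for d in depths)
    sxy = sum((d - mx) * (l - my) for d, l in zip(depths, loads))
    slope = sxy / sxx
    return my - slope * mx, slope

def stuck_pipe_warning(depths, loads, cur_depth, cur_load, margin=0.0):
    """Flag a possible stuck pipe when the current hook load exceeds the
    normal (regression-predicted) load at the current depth by `margin`
    (margin is an assumption; the record does not give a threshold)."""
    b0, b1 = fit_line(depths, loads)
    normal = b0 + b1 * cur_depth
    return cur_load > normal + margin
```

    With historical readings following a 0.1 kip-per-foot trend, a current load well above the extrapolated trend triggers the warning, while a load on or below it does not.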

  8. Incremental predictive value of sarcopenia for incident fracture in an elderly Chinese cohort: results from the Osteoporotic Fractures in Men (MrOs) Study.

    Science.gov (United States)

    Yu, Ruby; Leung, Jason; Woo, Jean

    2014-08-01

We examined whether sarcopenia is predictive of incident fractures among older men, whether the inclusion of sarcopenia in models adds any incremental value to bone mineral density (BMD), and whether sarcopenia is associated with a higher risk of fractures in elderly men with osteoporosis. A cohort of 2000 community-dwelling men aged ≥65 years was examined, for whom detailed information regarding demographic, socioeconomic, medical history, clinical, and lifestyle factors was documented. Body composition and BMD were measured using dual energy X-ray absorptiometry. Sarcopenia was defined according to the Asian Working Group for Sarcopenia (AWGS) algorithm. Incident fractures were documented during the follow-up period from 2001 to 2013, and related to sarcopenia and its component measures using Cox proportional hazard regressions. The contribution of sarcopenia for predicting fracture risk was evaluated by receiver operating characteristic analysis, net reclassification improvement (NRI), and integrated discrimination improvement (IDI). During an average of 11.3 years of follow-up, 226 (11.3%) men sustained at least 1 incident fracture, making the incidence of fractures 1200.6/100,000 person-years. After multivariate adjustments, sarcopenia was associated with increased fracture risk (hazard ratio [HR], 1.87, 95% confidence interval [CI], 1.26-2.79) independent of BMD and other clinical risk factors. The addition of sarcopenia did not significantly increase the area under the curve or IDI but significantly improved the predictive ability for fracture risk over BMD and other clinical risk factors by 5.12% (P < 0.05). Osteoporosis combined with sarcopenia (sarco-osteoporosis) resulted in a significantly increased risk of fractures (HR, 3.49, 95% CI, 1.76-6.90) compared with those with normal BMD and without sarcopenia. This study confirms that sarcopenia is a predictor of fracture risk in this elderly men cohort, establishes that sarcopenia provides incremental predictive value for fractures over the

  9. Statistical modelling of survival data with random effects h-likelihood approach

    CERN Document Server

    Ha, Il Do; Lee, Youngjo

    2017-01-01

    This book provides a groundbreaking introduction to the likelihood inference for correlated survival data via the hierarchical (or h-) likelihood in order to obtain the (marginal) likelihood and to address the computational difficulties in inferences and extensions. The approach presented in the book overcomes shortcomings in the traditional likelihood-based methods for clustered survival data such as intractable integration. The text includes technical materials such as derivations and proofs in each chapter, as well as recently developed software programs in R (“frailtyHL”), while the real-world data examples together with an R package, “frailtyHL” in CRAN, provide readers with useful hands-on tools. Reviewing new developments since the introduction of the h-likelihood to survival analysis (methods for interval estimation of the individual frailty and for variable selection of the fixed effects in the general class of frailty models) and guiding future directions, the book is of interest to research...

  10. Health risk factor modification predicts incidence of diabetes in an employee population: results of an 8-year longitudinal cohort study.

    Science.gov (United States)

    Rolando, Lori; Byrne, Daniel W; McGown, Paula W; Goetzel, Ron Z; Elasy, Tom A; Yarbrough, Mary I

    2013-04-01

To understand the effect of risk factor modification on Type 2 diabetes incidence in a workforce population. Annual health risk assessment data (N = 3125) from years 1 through 4 were used to predict diabetes development in years 5 through 8. Employees who reduced their body mass index from 30 or more to less than 30 decreased their chances of developing diabetes (odds ratio = 0.22, 95% confidence interval: 0.05 to 0.93), while those who became obese increased their diabetes risk (odds ratio = 8.85, 95% confidence interval: 2.53 to 31.0). Weight reduction sustained over a long period can result in clinically important reductions in diabetes incidence. Workplace health promotion programs may prevent diabetes among workers by encouraging weight loss and adoption of healthy lifestyle habits.
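    The odds ratios quoted above come from logistic regression; for a single binary exposure, the unadjusted odds ratio and its Wald confidence interval can be read off a 2x2 table. This is a simplified sketch, and the counts in the test are invented, not the study's data.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi
```

    A multivariable logistic regression, as used in the study, adjusts these ratios for covariates, but the interpretation of each exponentiated coefficient is the same.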

  11. Predictive Value of Triglyceride Glucose Index for the Risk of Incident Diabetes: A 4-Year Retrospective Longitudinal Study

    OpenAIRE

    Lee, Da Young; Lee, Eun Seo; Kim, Ji Hyun; Park, Se Eun; Park, Cheol-Young; Oh, Ki-Won; Park, Sung-Woo; Rhee, Eun-Jung; Lee, Won-Young

    2016-01-01

The Triglyceride Glucose Index (TyG index) is considered a surrogate marker of insulin resistance. The aim of this study is to investigate whether the TyG index has a predictive role in identifying individuals with a high risk of incident diabetes and to compare it with other indicators of metabolic health. A total of 2900 non-diabetic adults who attended five consecutive annual health check-ups at Kangbuk Samsung Hospital were divided into four subgroups using three methods: (1) baseline TyG ind...

  12. Maximum Likelihood Estimation and Inference With Examples in R, SAS and ADMB

    CERN Document Server

    Millar, Russell B

    2011-01-01

    This book takes a fresh look at the popular and well-established method of maximum likelihood for statistical estimation and inference. It begins with an intuitive introduction to the concepts and background of likelihood, and moves through to the latest developments in maximum likelihood methodology, including general latent variable models and new material for the practical implementation of integrated likelihood using the free ADMB software. Fundamental issues of statistical inference are also examined, with a presentation of some of the philosophical debates underlying the choice of statis

  13. Generalized empirical likelihood methods for analyzing longitudinal data

    KAUST Repository

    Wang, S.

    2010-02-16

    Efficient estimation of parameters is a major objective in analyzing longitudinal data. We propose two generalized empirical likelihood based methods that take into consideration within-subject correlations. A nonparametric version of the Wilks theorem for the limiting distributions of the empirical likelihood ratios is derived. It is shown that one of the proposed methods is locally efficient among a class of within-subject variance-covariance matrices. A simulation study is conducted to investigate the finite sample properties of the proposed methods and compare them with the block empirical likelihood method by You et al. (2006) and the normal approximation with a correctly estimated variance-covariance. The results suggest that the proposed methods are generally more efficient than existing methods which ignore the correlation structure, and better in coverage compared to the normal approximation with correctly specified within-subject correlation. An application illustrating our methods and supporting the simulation study results is also presented.

  14. Patterns of care and persistence after incident elevated blood pressure.

    Science.gov (United States)

    Daley, Matthew F; Sinaiko, Alan R; Reifler, Liza M; Tavel, Heather M; Glanz, Jason M; Margolis, Karen L; Parker, Emily; Trower, Nicole K; Chandra, Malini; Sherwood, Nancy E; Adams, Kenneth; Kharbanda, Elyse O; Greenspan, Louise C; Lo, Joan C; O'Connor, Patrick J; Magid, David J

    2013-08-01

    Screening for hypertension in children occurs during routine care. When blood pressure (BP) is elevated in the hypertensive range, a repeat measurement within 1 to 2 weeks is recommended. The objective was to assess patterns of care after an incident elevated BP, including timing of repeat BP measurement and likelihood of persistently elevated BP. This retrospective study was conducted in 3 health care organizations. All children aged 3 through 17 years with an incident elevated BP at an outpatient visit during 2007 through 2010 were identified. Within this group, we assessed the proportion who had a repeat BP measured within 1 month of their incident elevated BP and the proportion who subsequently met the definition of hypertension. Multivariate analyses were used to identify factors associated with follow-up BP within 1 month of initial elevated BP. Among 72,625 children and adolescents in the population, 6108 (8.4%) had an incident elevated BP during the study period. Among 6108 with an incident elevated BP, 20.9% had a repeat BP measured within 1 month. In multivariate analyses, having a follow-up BP within 1 month was not significantly more likely among individuals with obesity or stage 2 systolic elevation. Among 6108 individuals with an incident elevated BP, 84 (1.4%) had a second and third consecutive elevated BP within 12 months. Whereas >8% of children and adolescents had an incident elevated BP, the great majority of BPs were not repeated within 1 month. However, relatively few individuals subsequently met the definition of hypertension.

  15. On the likelihood function of Gaussian max-stable processes

    KAUST Repository

    Genton, M. G.; Ma, Y.; Sang, H.

    2011-01-01

    We derive a closed form expression for the likelihood function of a Gaussian max-stable process indexed by ℝd at p≤d+1 sites, d≥1. We demonstrate the gain in efficiency in the maximum composite likelihood estimators of the covariance matrix from p=2 to p=3 sites in ℝ2 by means of a Monte Carlo simulation study. © 2011 Biometrika Trust.

  17. Sampling variability in forensic likelihood-ratio computation: A simulation study

    NARCIS (Netherlands)

    Ali, Tauseef; Spreeuwers, Lieuwe Jan; Veldhuis, Raymond N.J.; Meuwly, Didier

    2015-01-01

Recently, in the forensic biometric community, there is a growing interest in computing a metric called "likelihood-ratio" when a pair of biometric specimens is compared using a biometric recognition system. Generally, a biometric recognition system outputs a score and therefore a likelihood-ratio

  18. Maximum likelihood estimation of finite mixture model for economic data

    Science.gov (United States)

    Phoong, Seuk-Yen; Ismail, Mohd Tahir

    2014-06-01

A finite mixture model is a mixture model with finite dimension. These models provide a natural representation of heterogeneity across a finite number of latent classes. Finite mixture models are also known as latent class models or unsupervised learning models. Recently, fitting finite mixture models by maximum likelihood estimation has drawn considerable attention from statisticians, mainly because maximum likelihood estimation is a powerful statistical method that provides consistent estimates as the sample size increases to infinity. Thus, maximum likelihood estimation is used to fit a finite mixture model in the present paper in order to explore the relationship between nonlinear economic data. A two-component normal mixture model is fitted by maximum likelihood estimation in order to investigate the relationship between stock market prices and rubber prices for the sampled countries. The results indicate a negative relationship between rubber prices and stock market prices for Malaysia, Thailand, the Philippines and Indonesia.
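    A two-component normal mixture can be fitted by maximum likelihood with the EM algorithm. The sketch below fixes a shared unit variance for brevity (a full fit, as in the paper, would also estimate the component variances and use real data); the initial values and the data in the test are assumptions.

```python
import math

def em_two_normals(data, mu1, mu2, sigma=1.0, iters=50):
    """EM for a two-component normal mixture with a shared, fixed sigma.
    Returns (weight of component 1, mu1, mu2)."""
    w = 0.5  # initial mixing weight
    for _ in range(iters):
        # E-step: responsibility of component 1 for each point
        resp = []
        for x in data:
            p1 = w * math.exp(-(x - mu1) ** 2 / (2 * sigma ** 2))
            p2 = (1 - w) * math.exp(-(x - mu2) ** 2 / (2 * sigma ** 2))
            resp.append(p1 / (p1 + p2))
        # M-step: update the mixing weight and the component means
        s = sum(resp)
        w = s / len(data)
        mu1 = sum(r * x for r, x in zip(resp, data)) / s
        mu2 = sum((1 - r) * x for r, x in zip(resp, data)) / (len(data) - s)
    return w, mu1, mu2
```

    Each EM iteration cannot decrease the likelihood, so with well-separated clusters the means converge quickly to the cluster centres.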

  19. Competition between learned reward and error outcome predictions in anterior cingulate cortex.

    Science.gov (United States)

    Alexander, William H; Brown, Joshua W

    2010-02-15

    The anterior cingulate cortex (ACC) is implicated in performance monitoring and cognitive control. Non-human primate studies of ACC show prominent reward signals, but these are elusive in human studies, which instead show mainly conflict and error effects. Here we demonstrate distinct appetitive and aversive activity in human ACC. The error likelihood hypothesis suggests that ACC activity increases in proportion to the likelihood of an error, and ACC is also sensitive to the consequence magnitude of the predicted error. Previous work further showed that error likelihood effects reach a ceiling as the potential consequences of an error increase, possibly due to reductions in the average reward. We explored this issue by independently manipulating reward magnitude of task responses and error likelihood while controlling for potential error consequences in an Incentive Change Signal Task. The fMRI results ruled out a modulatory effect of expected reward on error likelihood effects in favor of a competition effect between expected reward and error likelihood. Dynamic causal modeling showed that error likelihood and expected reward signals are intrinsic to the ACC rather than received from elsewhere. These findings agree with interpretations of ACC activity as signaling both perceptions of risk and predicted reward. Copyright 2009 Elsevier Inc. All rights reserved.

  20. Assessing Compatibility of Direct Detection Data: Halo-Independent Global Likelihood Analyses

    CERN Document Server

    Gelmini, Graciela B.

    2016-10-18

We present two different halo-independent methods utilizing a global maximum likelihood that can assess the compatibility of dark matter direct detection data given a particular dark matter model. The global likelihood we use is comprised of at least one extended likelihood and an arbitrary number of Poisson or Gaussian likelihoods. In the first method we find the global best fit halo function and construct a two sided pointwise confidence band, which can then be compared with those derived from the extended likelihood alone to assess the joint compatibility of the data. In the second method we define a "constrained parameter goodness-of-fit" test statistic, whose $p$-value we then use to define a "plausibility region" (e.g. where $p \geq 10\%$). For any halo function not entirely contained within the plausibility region, the level of compatibility of the data is very low (e.g. $p < 10\%$). As an example we apply these methods to CDMS-II-Si and SuperCDMS data, assuming dark matter particles with elastic s...

  1. Numerical Prediction of Green Water Incidents

    DEFF Research Database (Denmark)

    Nielsen, K. B.; Mayer, Stefan

    2004-01-01

Green water loads on moored or sailing ships occur when an incoming wave significantly exceeds the freeboard and water runs onto the deck. In this paper, a Navier-Stokes solver with a free surface capturing scheme (i.e. the VOF model; Hirt and Nichols, 1981) is used to numerically model green water loads on a moored FPSO exposed to head sea waves. Two cases are investigated: first, green water on a fixed vessel has been analysed, where the resulting water height on deck and the impact pressure on a deck-mounted structure have been computed. These results have been compared to experimental data obtained by Greco (2001) and show very favourable agreement. Second, a full green water incident, including vessel motions, has been modelled. In these computations, the vertical motion has been modelled by the use of transfer functions for heave and pitch, but the rotational contribution from the pitch motion has...

  2. A short proof that phylogenetic tree reconstruction by maximum likelihood is hard.

    Science.gov (United States)

    Roch, Sebastien

    2006-01-01

    Maximum likelihood is one of the most widely used techniques to infer evolutionary histories. Although it is thought to be intractable, a proof of its hardness has been lacking. Here, we give a short proof that computing the maximum likelihood tree is NP-hard by exploiting a connection between likelihood and parsimony observed by Tuffley and Steel.

  4. The predictive value of current haemoglobin levels for incident tuberculosis and/or mortality during long-term antiretroviral therapy in South Africa: a cohort study

    NARCIS (Netherlands)

    Kerkhoff, Andrew D.; Wood, Robin; Cobelens, Frank G.; Gupta-Wright, Ankur; Bekker, Linda-Gail; Lawn, Stephen D.

    2015-01-01

    Low haemoglobin concentrations may be predictive of incident tuberculosis (TB) and death in HIV-infected patients receiving antiretroviral therapy (ART), but data are limited and inconsistent. We examined these relationships retrospectively in a long-term South African ART cohort with multiple

  5. Simplified likelihood for the re-interpretation of public CMS results

    CERN Document Server

    The CMS Collaboration

    2017-01-01

    In this note, a procedure for the construction of simplified likelihoods for the re-interpretation of the results of CMS searches for new physics is presented. The procedure relies on the use of a reduced set of information on the background models used in these searches which can readily be provided by the CMS collaboration. A toy example is used to demonstrate the procedure and its accuracy in reproducing the full likelihood for setting limits in models for physics beyond the standard model. Finally, two representative searches from the CMS collaboration are used to demonstrate the validity of the simplified likelihood approach under realistic conditions.
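    The core object in such re-interpretations is a binned Poisson likelihood in the signal strength mu. The sketch below is far cruder than the CMS procedure: it has no nuisance parameters or background correlations, and the chi-square threshold (3.84 for 1 dof; a one-sided 95% CL limit would instead use 2.71) is an assumption for illustration, as are the single-bin test numbers.

```python
import math

def nll(mu, sig, bkg, n_obs):
    """Poisson negative log-likelihood (up to mu-independent constants)
    with expected count mu*s_i + b_i in each bin."""
    total = 0.0
    for s, b, n in zip(sig, bkg, n_obs):
        lam = mu * s + b
        total += lam - n * math.log(lam)
    return total

def upper_limit(sig, bkg, n_obs, q_crit=3.84, mu_max=10.0, step=0.001):
    """Crude scan-based upper limit on mu: the smallest mu past the best
    fit where 2*(nll(mu) - nll(mu_hat)) exceeds q_crit."""
    grid = [i * step for i in range(int(mu_max / step) + 1)]
    vals = [nll(m, sig, bkg, n_obs) for m in grid]
    i_best = vals.index(min(vals))  # grid-level maximum likelihood point
    for m, v in zip(grid[i_best:], vals[i_best:]):
        if 2.0 * (v - vals[i_best]) > q_crit:
            return m
    return None
```

    The simplified likelihood of the note replaces the full background model with a reduced covariance so that a likelihood of this general shape can be rebuilt outside the collaboration.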

  6. Loss of social resources predicts incident posttraumatic stress disorder during ongoing political violence within the Palestinian Authority.

    Science.gov (United States)

    Hall, Brian J; Murray, Sarah M; Galea, Sandro; Canetti, Daphna; Hobfoll, Stevan E

    2015-04-01

Exposure to ongoing political violence and stressful conditions increases the risk of posttraumatic stress disorder (PTSD) in low-resource contexts. However, much of our understanding of the determinants of PTSD in these contexts comes from cross-sectional data. Longitudinal studies that examine factors associated with incident PTSD may be useful for the development of effective prevention interventions and the identification of those who may be most at risk for the disorder. A 3-stage cluster random stratified sampling methodology was used to obtain a representative sample of 1,196 Palestinian adults living in Gaza, the West Bank and East Jerusalem. Face-to-face interviews were conducted at two time points 6 months apart. Logistic regression analyses were conducted on a restricted sample of 643 people who did not have PTSD at baseline and who completed both interviews. The incidence of PTSD was 15.0% over the 6-month period. Results of adjusted logistic regression models demonstrated that talking to friends and family about political circumstances (aOR = 0.78, p = 0.01) was protective, whereas female sex (aOR = 1.76, p = 0.025), threat perception of future violence (aOR = 1.50, p = 0.002), poor general health (aOR = 1.39, p = 0.005), exposure to media (aOR = 1.37, p = 0.002), and loss of social resources (aOR = 1.71, p = 0.006) were predictive of incident cases of PTSD. A high incidence of PTSD was documented during the 6-month follow-up period among Palestinian residents of Gaza, the West Bank, and East Jerusalem. Interventions that promote health and forestall the loss of social resources could potentially reduce the onset of PTSD in communities affected by violence.

  7. Image properties of list mode likelihood reconstruction for a rectangular positron emission mammography with DOI measurements

    International Nuclear Information System (INIS)

    Qi, Jinyi; Klein, Gregory J.; Huesman, Ronald H.

    2000-01-01

    A positron emission mammography scanner is under development at our laboratory. The tomograph has a rectangular geometry consisting of four banks of detector modules. For each detector, the system can measure the depth of interaction information inside the crystal. The rectangular geometry leads to irregular radial and angular sampling and spatially variant sensitivity that are different from conventional PET systems. Therefore, it is important to study the image properties of the reconstructions. We adapted the theoretical analysis that we had developed for conventional PET systems to the list mode likelihood reconstruction for this tomograph. The local impulse response and covariance of the reconstruction can be easily computed using FFT. These theoretical results are also used with computer observer models to compute the signal-to-noise ratio for lesion detection. The analysis reveals the spatially variant resolution and noise properties of the list mode likelihood reconstruction. The theoretical predictions are in good agreement with Monte Carlo results.

  8. Maximum likelihood of phylogenetic networks.

    Science.gov (United States)

    Jin, Guohua; Nakhleh, Luay; Snir, Sagi; Tuller, Tamir

    2006-11-01

    Horizontal gene transfer (HGT) is believed to be ubiquitous among bacteria, and plays a major role in their genome diversification as well as their ability to develop resistance to antibiotics. In light of its evolutionary significance and implications for human health, developing accurate and efficient methods for detecting and reconstructing HGT is imperative. In this article we provide a new HGT-oriented likelihood framework for many problems that involve phylogeny-based HGT detection and reconstruction. Beside the formulation of various likelihood criteria, we show that most of these problems are NP-hard, and offer heuristics for efficient and accurate reconstruction of HGT under these criteria. We implemented our heuristics and used them to analyze biological as well as synthetic data. In both cases, our criteria and heuristics exhibited very good performance with respect to identifying the correct number of HGT events as well as inferring their correct location on the species tree. Implementation of the criteria as well as heuristics and hardness proofs are available from the authors upon request. Hardness proofs can also be downloaded at http://www.cs.tau.ac.il/~tamirtul/MLNET/Supp-ML.pdf

  9. Predictors of incident heart failure in patients after an acute coronary syndrome: The LIPID heart failure risk-prediction model.

    Science.gov (United States)

    Driscoll, Andrea; Barnes, Elizabeth H; Blankenberg, Stefan; Colquhoun, David M; Hunt, David; Nestel, Paul J; Stewart, Ralph A; West, Malcolm J; White, Harvey D; Simes, John; Tonkin, Andrew

    2017-12-01

    Coronary heart disease is a major cause of heart failure. Availability of risk-prediction models that include both clinical parameters and biomarkers is limited. We aimed to develop such a model for prediction of incident heart failure. A multivariable risk-factor model was developed for prediction of first occurrence of heart failure death or hospitalization. A simplified risk score was derived that enabled subjects to be grouped into categories of 5-year risk varying from 20%. Among 7101 patients from the LIPID study (84% male), with median age 61 years (interquartile range 55-67 years), 558 (8%) died or were hospitalized because of heart failure. Older age, history of claudication or diabetes mellitus, body mass index >30 kg/m2, LDL-cholesterol >2.5 mmol/L, heart rate >70 beats/min, white blood cell count, and the nature of the qualifying acute coronary syndrome (myocardial infarction or unstable angina) were associated with an increase in heart failure events. Coronary revascularization was associated with a lower event rate. Incident heart failure increased with higher concentrations of B-type natriuretic peptide >50 ng/L, cystatin C >0.93 nmol/L, D-dimer >273 nmol/L, high-sensitivity C-reactive protein >4.8 nmol/L, and sensitive troponin I >0.018 μg/L. Addition of biomarkers to the clinical risk model improved the model's C statistic from 0.73 to 0.77. The net reclassification improvement incorporating biomarkers into the clinical model using categories of 5-year risk was 23%. Adding a multibiomarker panel to conventional parameters markedly improved discrimination and risk classification for future heart failure events. Copyright © 2017 Elsevier Ireland Ltd. All rights reserved.

  10. Accuracy of maximum likelihood estimates of a two-state model in single-molecule FRET

    Energy Technology Data Exchange (ETDEWEB)

    Gopich, Irina V. [Laboratory of Chemical Physics, National Institute of Diabetes and Digestive and Kidney Diseases, National Institutes of Health, Bethesda, Maryland 20892 (United States)

    2015-01-21

    Photon sequences from single-molecule Förster resonance energy transfer (FRET) experiments can be analyzed using a maximum likelihood method. Parameters of the underlying kinetic model (FRET efficiencies of the states and transition rates between conformational states) are obtained by maximizing the appropriate likelihood function. In addition, the errors (uncertainties) of the extracted parameters can be obtained from the curvature of the likelihood function at the maximum. We study the standard deviations of the parameters of a two-state model obtained from photon sequences with recorded colors and arrival times. The standard deviations can be obtained analytically in a special case when the FRET efficiencies of the states are 0 and 1 and in the limiting cases of fast and slow conformational dynamics. These results are compared with the results of numerical simulations. The accuracy and, therefore, the ability to predict model parameters depend on how fast the transition rates are compared to the photon count rate. In the limit of slow transitions, the key parameters that determine the accuracy are the number of transitions between the states and the number of independent photon sequences. In the fast transition limit, the accuracy is determined by the small fraction of photons that are correlated with their neighbors. The relative standard deviation of the relaxation rate has a “chevron” shape as a function of the transition rate in the log-log scale. The location of the minimum of this function dramatically depends on how well the FRET efficiencies of the states are separated.
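
    The central idea here, obtaining parameter uncertainties from the curvature of the log-likelihood at its maximum, can be illustrated with a much-simplified sketch (not the paper's two-state model): a single FRET efficiency estimated from binomial photon counts, with the standard error taken from a numeric second derivative and checked against the analytic value. The counts are hypothetical.

```python
import math

def log_likelihood(eff, n_acc, n_tot):
    """Binomial log-likelihood of FRET efficiency `eff` given photon counts."""
    return n_acc * math.log(eff) + (n_tot - n_acc) * math.log(1 - eff)

def mle_with_error(n_acc, n_tot, h=1e-5):
    """MLE of the efficiency and its standard error from the curvature
    (negative second derivative) of the log-likelihood at the maximum."""
    e_hat = n_acc / n_tot                       # closed-form maximum
    d2 = (log_likelihood(e_hat + h, n_acc, n_tot)
          - 2 * log_likelihood(e_hat, n_acc, n_tot)
          + log_likelihood(e_hat - h, n_acc, n_tot)) / h ** 2
    return e_hat, math.sqrt(-1.0 / d2)

e_hat, se = mle_with_error(n_acc=300, n_tot=1000)   # 300 acceptor photons of 1000
analytic_se = math.sqrt(e_hat * (1 - e_hat) / 1000)
print(e_hat, se, analytic_se)   # numeric and analytic standard errors agree
```

    The same curvature recipe carries over to the multi-parameter kinetic model, where the matrix of second derivatives (the observed information) is inverted to get the parameter covariances.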

  11. Predictors of Self-Reported Likelihood of Working with Older Adults

    Science.gov (United States)

    Eshbaugh, Elaine M.; Gross, Patricia E.; Satrom, Tatum

    2010-01-01

    This study examined the self-reported likelihood of working with older adults in a future career among 237 college undergraduates at a midsized Midwestern university. Although aging anxiety was not significantly related to likelihood of working with older adults, those students who had a greater level of death anxiety were less likely than other…

  12. Estimation of National Colorectal-Cancer Incidence Using Claims Databases

    International Nuclear Information System (INIS)

    Quantin, C.; Benzenine, E.; Hagi, M.; Auverlot, B.; Cottenet, J.; Binquet, M.; Compain, D.

    2012-01-01

    The aim of the study was to assess the accuracy of the colorectal-cancer incidence estimated from administrative data. Methods. We selected potential incident colorectal-cancer cases in 2004-2005 French administrative data, using two alternative algorithms. The first was based only on diagnostic and procedure codes, whereas the second considered the past history of the patient. Results of both methods were assessed against two corresponding local cancer registries, acting as “gold standards.” We then constructed a multivariable regression model to estimate the corrected total number of incident colorectal-cancer cases from the whole national administrative database. Results. The first algorithm provided an estimated local incidence very close to that given by the regional registries (646 versus 645 incident cases) and had good sensitivity and positive predictive values (about 75% for both). The second algorithm overestimated the incidence by about 50% and had a poor positive predictive value of about 60%. The estimation of national incidence obtained by the first algorithm differed from that observed in 14 registries by only 2.34%. Conclusion. This study shows the usefulness of administrative databases for countries with no national cancer registry and suggests a method for correcting the estimates provided by these data.
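
    The validation described above reduces to computing the sensitivity and positive predictive value of the claims-based case list against the registry gold standard. A minimal sketch with hypothetical patient IDs:

```python
def validation_metrics(algorithm_cases, registry_cases):
    """Sensitivity and positive predictive value of an algorithm's case list
    against a gold-standard registry (both given as sets of patient IDs)."""
    true_pos = len(algorithm_cases & registry_cases)
    sensitivity = true_pos / len(registry_cases)   # fraction of true cases found
    ppv = true_pos / len(algorithm_cases)          # fraction of flags that are true
    return sensitivity, ppv

# toy data: registry holds 8 true incident cases, algorithm flags 8 patients
registry = {1, 2, 3, 4, 5, 6, 7, 8}
flagged = {1, 2, 3, 4, 5, 6, 9, 10}
sens, ppv = validation_metrics(flagged, registry)
print(sens, ppv)  # 0.75 0.75
```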

  13. The Incidence of Hypothyroidism Following the Radioactive Iodine Treatment of Graves’ Disease and the Predictive Factors Influencing its Development

    International Nuclear Information System (INIS)

    Husseni, Maha Abd El-Kareem El-Sayed

    2016-01-01

    The purpose of this study is to evaluate and compare the incidence of hypothyroidism following different fixed radioactive iodine-131 (131I) activities in the treatment of Graves’ disease (GD) and to investigate the predictive factors that may influence its occurrence. This retrospective analysis was performed on 272 patients with GD who were treated with 131I, among whom 125 received 370 MBq and 147 received 555 MBq. The outcome was categorized as hypothyroidism, euthyroidism, and persistent hyperthyroidism. Multiple logistic regression analysis was performed to identify significant risk factors that affect the development of hypothyroidism. The incidence of hypothyroidism following the first low activity was 24.8%, with a high treatment failure rate of 58.4%, compared with 48.3% and 32% following high activity. The overall cumulative incidence of hypothyroidism following repeated activities was 50.7%, out of which 73.9% occurred after the first activity and 20.3% after the second activity. The higher 131I activity (P < 0.001) and average and mild enlargement of the thyroid gland (P = 0.004) were identified as significant independent factors that increase the rate of incidence of hypothyroidism (odds ratios 2.95 and 2.59). No correlation was found between the development of hypothyroidism and factors such as age, gender, presence of exophthalmos, previous antithyroid medications and their durations, and Technetium-99m (Tc-99m) pertechnetate thyroid uptake. In view of the high treatment failure rate after the first low activity and the lower incidence of hypothyroidism after high activity, high activity is recommended for GD patients, reserving the use of 370 MBq for patients with average-sized and mildly enlarged goiters; this increases patient convenience by avoiding multiple activities to achieve cure and long-term follow-up.

  14. Profile-likelihood Confidence Intervals in Item Response Theory Models.

    Science.gov (United States)

    Chalmers, R Philip; Pek, Jolynn; Liu, Yang

    2017-01-01

    Confidence intervals (CIs) are fundamental inferential devices which quantify the sampling variability of parameter estimates. In item response theory, CIs have been primarily obtained from large-sample Wald-type approaches based on standard error estimates, derived from the observed or expected information matrix, after parameters have been estimated via maximum likelihood. An alternative approach to constructing CIs is to quantify sampling variability directly from the likelihood function with a technique known as profile-likelihood confidence intervals (PL CIs). In this article, we introduce PL CIs for item response theory models, compare PL CIs to classical large-sample Wald-type CIs, and demonstrate important distinctions among these CIs. CIs are then constructed for parameters directly estimated in the specified model and for transformed parameters which are often obtained post-estimation. Monte Carlo simulation results suggest that PL CIs perform consistently better than Wald-type CIs for both non-transformed and transformed parameters.
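
    A toy version of a likelihood-based interval (with no nuisance parameters, so the profile reduces to the likelihood itself, unlike the IRT setting): all binomial proportions whose log-likelihood lies within half the chi-square(1) critical value of the maximum. The counts are made up; the asymmetry relative to the Wald interval is the point.

```python
import math

def binom_loglik(p, k, n):
    return k * math.log(p) + (n - k) * math.log(1 - p)

def likelihood_ci(k, n, crit=3.841, grid=10_000):
    """All p satisfying 2 * (logL(p_hat) - logL(p)) <= crit, found by a
    grid scan; crit = 3.841 is the 95% chi-square(1) cutoff."""
    l_max = binom_loglik(k / n, k, n)
    inside = [i / grid for i in range(1, grid)
              if 2 * (l_max - binom_loglik(i / grid, k, n)) <= crit]
    return min(inside), max(inside)

lo, hi = likelihood_ci(k=30, n=100)
se = math.sqrt(0.3 * 0.7 / 100)
wald = (0.3 - 1.96 * se, 0.3 + 1.96 * se)
print((lo, hi), wald)   # likelihood interval is asymmetric about 0.3; Wald is not
```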

  15. Supplementary Material for: High-Order Composite Likelihood Inference for Max-Stable Distributions and Processes

    KAUST Repository

    Castruccio, Stefano; Huser, Raphaël; Genton, Marc G.

    2016-01-01

    In multivariate or spatial extremes, inference for max-stable processes observed at a large collection of points is a very challenging problem and current approaches typically rely on less expensive composite likelihoods constructed from small subsets of data. In this work, we explore the limits of modern state-of-the-art computational facilities to perform full likelihood inference and to efficiently evaluate high-order composite likelihoods. With extensive simulations, we assess the loss of information of composite likelihood estimators with respect to a full likelihood approach for some widely used multivariate or spatial extreme models, we discuss how to choose composite likelihood truncation to improve the efficiency, and we also provide recommendations for practitioners. This article has supplementary material online.

  16. Exclusion probabilities and likelihood ratios with applications to mixtures.

    Science.gov (United States)

    Slooten, Klaas-Jan; Egeland, Thore

    2016-01-01

    The statistical evidence obtained from mixed DNA profiles can be summarised in several ways in forensic casework including the likelihood ratio (LR) and the Random Man Not Excluded (RMNE) probability. The literature has seen a discussion of the advantages and disadvantages of likelihood ratios and exclusion probabilities, and part of our aim is to bring some clarification to this debate. In a previous paper, we proved that there is a general mathematical relationship between these statistics: RMNE can be expressed as a certain average of the LR, implying that the expected value of the LR, when applied to an actual contributor to the mixture, is at least equal to the inverse of the RMNE. While the mentioned paper presented applications for kinship problems, the current paper demonstrates the relevance for mixture cases, and for this purpose, we prove some new general properties. We also demonstrate how to use the distribution of the likelihood ratio for donors of a mixture, to obtain estimates for exceedance probabilities of the LR for non-donors, of which the RMNE is a special case corresponding to LR > 0. In order to derive these results, we need to view the likelihood ratio as a random variable. In this paper, we describe how such a randomization can be achieved. The RMNE is usually invoked only for mixtures without dropout. In mixtures, artefacts like dropout and drop-in are commonly encountered and we address this situation too, illustrating our results with a basic but widely implemented model, a so-called binary model. The precise definitions, modelling and interpretation of the required concepts of dropout and drop-in are not entirely obvious, and we attempt to clarify them here in a general likelihood framework for a binary model.

  17. Incremental Predictive Value of Serum AST-to-ALT Ratio for Incident Metabolic Syndrome: The ARIRANG Study

    Science.gov (United States)

    Ahn, Song Vogue; Baik, Soon Koo; Cho, Youn zoo; Koh, Sang Baek; Huh, Ji Hye; Chang, Yoosoo; Sung, Ki-Chul; Kim, Jang Young

    2016-01-01

    Aims The ratio of aspartate aminotransferase (AST) to alanine aminotransferase (ALT) is of great interest as a possible novel marker of metabolic syndrome. However, longitudinal studies emphasizing the incremental predictive value of the AST-to-ALT ratio in diagnosing individuals at higher risk of developing metabolic syndrome are very scarce. Therefore, our study aimed to evaluate the AST-to-ALT ratio as an incremental predictor of new onset metabolic syndrome in a population-based cohort study. Material and Methods The population-based cohort study included 2276 adults (903 men and 1373 women) aged 40–70 years, who participated from 2005–2008 (baseline) without metabolic syndrome and were followed up from 2008–2011. Metabolic syndrome was defined according to the harmonized definition of metabolic syndrome. Serum concentrations of AST and ALT were determined by enzymatic methods. Results During an average follow-up period of 2.6 years, 395 individuals (17.4%) developed metabolic syndrome. In a multivariable adjusted model, the odds ratio (95% confidence interval) for new onset of metabolic syndrome, comparing the fourth quartile to the first quartile of the AST-to-ALT ratio, was 0.598 (0.422–0.853). The AST-to-ALT ratio also improved the area under the receiver operating characteristic curve (AUC) for predicting new cases of metabolic syndrome (0.715 vs. 0.732, P = 0.004). The net reclassification improvement of prediction models including the AST-to-ALT ratio was 0.23 (95% CI: 0.124–0.337, P < 0.001). Conclusions The AST-to-ALT ratio was inversely associated with new onset metabolic syndrome and had incremental predictive value for incident metabolic syndrome. PMID:27560931
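
    The C statistic reported in this record is the area under the ROC curve, which for a risk score equals the Mann-Whitney probability that a random case outranks a random non-case. A small sketch with invented scores (not the ARIRANG data), comparing a model without and with an extra marker:

```python
def auc(cases, controls):
    """AUC via the Mann-Whitney statistic: the probability that a randomly
    chosen case scores above a randomly chosen non-case (ties count half)."""
    wins = sum((c > d) + 0.5 * (c == d) for c in cases for d in controls)
    return wins / (len(cases) * len(controls))

# invented risk scores, without and with the extra biomarker
base_cases, base_controls = [0.62, 0.55, 0.48, 0.70], [0.40, 0.52, 0.35, 0.58]
full_cases, full_controls = [0.68, 0.60, 0.50, 0.75], [0.38, 0.49, 0.33, 0.55]
print(auc(base_cases, base_controls), auc(full_cases, full_controls))
```

    With real cohorts the comparison of the two AUCs (0.715 vs. 0.732 in the record) is done on the same subjects, so a paired test such as DeLong's is used rather than eyeballing the difference.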

  18. Beyond Sex: Likelihood and Predictors of Effective and Ineffective Intervention in Intimate Partner Violence in Bystanders Perceiving an Emergency.

    Science.gov (United States)

    Chabot, Heather Frasier; Gray, Melissa L; Makande, Tariro B; Hoyt, Robert L

    2016-01-06

    Within the framework of the bystander model of intervention, we examined specific correlates and the likelihood of effective and ineffective intervention strategies of bystanders to an instance of intimate partner violence (IPV) identified as an emergency. We measured psychological variables associated with general prosocial behavior (including sex, instrumentality, expressiveness, empathy, personal distress, dispositional anger, and perceived barriers) as influential predictors in four IPV intervention behaviors (i.e., calling 911, talking to the victim, talking to the perpetrator, and physically interacting with the perpetrator). One hundred seventeen college community members completed preintervention measures, watched a film clip of IPV which they identified as an emergency, reported their likelihood of becoming involved and utilizing intervention behaviors, and identified perceived barriers to intervention. Participants were more likely to indicate using effective over ineffective intervention tactics. Lower perceived barriers to intervention predicted greater intervention likelihood. Hierarchical regression indicated that men and individuals higher in anger and instrumental traits were more likely to report that they would engage in riskier ineffective forms of intervention. Implications regarding bystander training and associations to intervention in related forms of violence including sexual assault are discussed. © The Author(s) 2016.

  19. Real Time Big Data Analytics for Predicting Terrorist Incidents

    Science.gov (United States)

    Toure, Ibrahim

    2017-01-01

    Terrorism is a complex and evolving phenomenon. In the past few decades, we have witnessed an increase in the number of terrorist incidents in the world. The security and stability of many countries is threatened by terrorist groups. Perpetrators now use sophisticated weapons and the attacks are more and more lethal. Currently, terrorist incidents…

  20. Solar cell angle of incidence corrections

    Science.gov (United States)

    Burger, Dale R.; Mueller, Robert L.

    1995-01-01

    Literature on solar array angle of incidence corrections was found to be sparse and contained no tabular data for support. This lack, along with recent data on 27 GaAs/Ge 4 cm by 4 cm cells, initiated the analysis presented in this paper. The literature cites seven possible contributors to angle of incidence effects: cosine, optical front surface, edge, shadowing, UV degradation, particulate soiling, and background color. Only the first three are covered in this paper due to lack of sufficient data. The cosine correction is commonly used but is not sufficient when the incident angle is large. Fresnel reflection calculations require knowledge of the index of refraction of the coverglass front surface. The absolute index of refraction for the coverglass front surface was not known nor was it measured due to lack of funds. However, a value for the index of refraction was obtained by examining how the prediction errors varied with different assumed indices and selecting the best fit to the set of measured values. Corrections using front surface Fresnel reflection along with the cosine correction give very good predictive results when compared to measured data, except there is a definite trend away from predicted values at the larger incident angles. This trend could be related to edge effects and is illustrated by a box plot of the errors and by plotting the deviation of the mean against incidence angle. The trend is for larger deviations at larger incidence angles, and there may be a fourth order effect involved in the trend. A chi-squared test was used to determine if the measurement errors were normally distributed. At 10 degrees the chi-squared test failed, probably due to the very small numbers involved or a bias from the measurement procedure. All other angles showed a good fit to the normal distribution with increasing goodness-of-fit as the angles increased, which reinforces the very small numbers hypothesis. The contributed data only went to 65 degrees.
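
    A hedged sketch of the correction described above: cosine foreshortening multiplied by the unpolarized Fresnel transmittance of the coverglass front surface, normalized to normal incidence. The refractive index n = 1.5 is an assumed value standing in for the paper's fitted index, and edge effects are not modeled.

```python
import math

def fresnel_transmittance(theta_deg, n=1.5):
    """Unpolarized Fresnel transmittance of the air/coverglass front surface."""
    ti = math.radians(theta_deg)
    if ti < 1e-9:                                      # normal-incidence limit
        return 1 - ((n - 1) / (n + 1)) ** 2
    tt = math.asin(math.sin(ti) / n)                   # Snell's law
    rs = (math.sin(ti - tt) / math.sin(ti + tt)) ** 2  # s-polarized reflectance
    rp = (math.tan(ti - tt) / math.tan(ti + tt)) ** 2  # p-polarized reflectance
    return 1 - 0.5 * (rs + rp)

def relative_output(theta_deg, n=1.5):
    """Predicted output relative to normal incidence: cosine foreshortening
    times the change in front-surface transmittance."""
    return (math.cos(math.radians(theta_deg))
            * fresnel_transmittance(theta_deg, n) / fresnel_transmittance(0.0, n))

for angle in (0, 30, 60, 75):
    print(angle, round(relative_output(angle), 4))
```

    At large angles the Fresnel factor pulls the prediction noticeably below the bare cosine, which is the paper's point about the cosine correction alone being insufficient.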

  1. Metabolic syndrome, adherence to the Mediterranean diet and 10-year cardiovascular disease incidence: The ATTICA study.

    Science.gov (United States)

    Kastorini, Christina-Maria; Panagiotakos, Demosthenes B; Chrysohoou, Christina; Georgousopoulou, Ekavi; Pitaraki, Evangelia; Puddu, Paolo Emilio; Tousoulis, Dimitrios; Stefanadis, Christodoulos; Pitsavos, Christos

    2016-03-01

    To better understand the metabolic syndrome (MS) spectrum through principal components analysis and further evaluate the role of the Mediterranean diet on MS presence. During 2001-2002, 1514 men and 1528 women (>18 y) without any clinical evidence of CVD or any other chronic disease, at baseline, living in the greater Athens area, Greece, were enrolled. In 2011-2012, the 10-year follow-up was performed in 2583 participants (15% of the participants were lost to follow-up). Incidence of fatal or non-fatal CVD was defined according to WHO-ICD-10 criteria. MS was defined by the National Cholesterol Education Program Adult Treatment panel III (revised NCEP ATP III) definition. Adherence to the Mediterranean diet was assessed using the MedDietScore (range 0-55). Five principal components were derived, explaining 73.8% of the total variation, characterized by the: a) body weight and lipid profile, b) blood pressure, c) lipid profile, d) glucose profile, e) inflammatory factors. All components were associated with higher likelihood of CVD incidence. After adjusting for various potential confounding factors, adherence to the Mediterranean dietary pattern, for each 10% increase in the MedDietScore, was associated with 15% lower odds of CVD incidence (95% CI: 0.71-1.06). For the participants with low adherence to the Mediterranean diet, all five components were significantly associated with increased likelihood of CVD incidence. However, for those closely following the Mediterranean pattern, positive yet nonsignificant associations were observed. Results of the present work propose a wider MS definition, while highlighting the beneficial role of the Mediterranean dietary pattern. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  2. Sampling of systematic errors to estimate likelihood weights in nuclear data uncertainty propagation

    International Nuclear Information System (INIS)

    Helgesson, P.; Sjöstrand, H.; Koning, A.J.; Rydén, J.; Rochman, D.; Alhassan, E.; Pomp, S.

    2016-01-01

    In methodologies for nuclear data (ND) uncertainty assessment and propagation based on random sampling, likelihood weights can be used to infer experimental information into the distributions for the ND. As the included number of correlated experimental points grows large, the computational time for the matrix inversion involved in obtaining the likelihood can become a practical problem. There are also other problems related to the conventional computation of the likelihood, e.g., the assumption that all experimental uncertainties are Gaussian. In this study, a way to estimate the likelihood which avoids matrix inversion is investigated; instead, the experimental correlations are included by sampling of systematic errors. It is shown that the model underlying the sampling methodology (using univariate normal distributions for random and systematic errors) implies a multivariate Gaussian for the experimental points (i.e., the conventional model). It is also shown that the likelihood estimates obtained through sampling of systematic errors approach the likelihood obtained with matrix inversion as the sample size for the systematic errors grows large. In studied practical cases, it is seen that the estimates for the likelihood weights converge impractically slowly with the sample size, compared to matrix inversion. The computational time is estimated to be greater than for matrix inversion in cases with more experimental points, too. Hence, the sampling of systematic errors has little potential to compete with matrix inversion in cases where the latter is applicable. Nevertheless, the underlying model and the likelihood estimates can be easier to intuitively interpret than the conventional model and the likelihood function involving the inverted covariance matrix. Therefore, this work can both have pedagogical value and be used to help motivating the conventional assumption of a multivariate Gaussian for experimental data. 
The sampling of systematic errors could also
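
    The equivalence described above can be sketched numerically: for two measurements sharing one Gaussian systematic error, the likelihood obtained by averaging the independent-error likelihood over sampled systematic shifts approaches the exact multivariate Gaussian likelihood, whose covariance is the diagonal random part plus a fully correlated systematic part. The residuals and error sizes below are invented.

```python
import math, random

random.seed(1)
sig_r, sig_s = 0.1, 0.2        # independent (random) and shared (systematic) errors
resid = [0.15, 0.25]           # experiment minus model for two data points

def norm_pdf(x, s):
    return math.exp(-0.5 * (x / s) ** 2) / (s * math.sqrt(2 * math.pi))

def exact_likelihood(r):
    """Multivariate normal likelihood with covariance sig_r^2 * I + sig_s^2 * J,
    evaluated by explicit 2x2 inversion (the 'matrix inversion' route)."""
    v, c = sig_r ** 2 + sig_s ** 2, sig_s ** 2      # variance and covariance
    det = v * v - c * c
    q = (v * r[0] ** 2 - 2 * c * r[0] * r[1] + v * r[1] ** 2) / det
    return math.exp(-0.5 * q) / (2 * math.pi * math.sqrt(det))

def sampled_likelihood(r, n_samples=200_000):
    """Monte Carlo route: average over sampled systematic shifts."""
    total = 0.0
    for _ in range(n_samples):
        s = random.gauss(0.0, sig_s)                # one systematic-error draw
        total += norm_pdf(r[0] - s, sig_r) * norm_pdf(r[1] - s, sig_r)
    return total / n_samples

ex, mc = exact_likelihood(resid), sampled_likelihood(resid)
print(ex, mc)   # the two routes agree to within Monte Carlo error
```

    The slow convergence the study reports corresponds to how large `n_samples` must grow before `mc` is as reliable as the closed-form `ex`, especially with many data points.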

  3. Approximate maximum likelihood estimation for population genetic inference.

    Science.gov (United States)

    Bertl, Johanna; Ewing, Gregory; Kosiol, Carolin; Futschik, Andreas

    2017-11-27

    In many population genetic problems, parameter estimation is obstructed by an intractable likelihood function. Therefore, approximate estimation methods have been developed, and with growing computational power, sampling-based methods became popular. However, these methods such as Approximate Bayesian Computation (ABC) can be inefficient in high-dimensional problems. This led to the development of more sophisticated iterative estimation methods like particle filters. Here, we propose an alternative approach that is based on stochastic approximation. By moving along a simulated gradient or ascent direction, the algorithm produces a sequence of estimates that eventually converges to the maximum likelihood estimate, given a set of observed summary statistics. This strategy does not sample much from low-likelihood regions of the parameter space, and is fast, even when many summary statistics are involved. We put considerable efforts into providing tuning guidelines that improve the robustness and lead to good performance on problems with high-dimensional summary statistics and a low signal-to-noise ratio. We then investigate the performance of our resulting approach and study its properties in simulations. Finally, we re-estimate parameters describing the demographic history of Bornean and Sumatran orang-utans.
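
    A minimal sketch of the stochastic-approximation idea (a Kiefer-Wolfowitz-style scheme, not the authors' tuned algorithm): move a parameter along a simulated finite-difference gradient of the squared mismatch between simulated and observed summary statistics, with decaying gains. The model here is a trivial stand-in (a normal mean) for an intractable population-genetic model.

```python
import random

random.seed(7)
observed = 2.0                  # observed summary statistic (a sample mean)

def simulate_summary(theta, n=200):
    """Stand-in for an intractable model: simulate data under theta and
    return the summary statistic (here the mean of a unit-variance normal)."""
    return sum(random.gauss(theta, 1.0) for _ in range(n)) / n

def stochastic_approximation(theta0, steps=400):
    """Follow a simulated finite-difference gradient of the squared
    summary-statistic mismatch, with decaying gain a_k and probe width c_k."""
    theta = theta0
    for k in range(1, steps + 1):
        a, c = 0.5 / k, 0.5 / k ** 0.25
        loss_plus = (simulate_summary(theta + c) - observed) ** 2
        loss_minus = (simulate_summary(theta - c) - observed) ** 2
        theta -= a * (loss_plus - loss_minus) / (2 * c)   # simulated gradient step
    return theta

estimate = stochastic_approximation(theta0=0.0)
print(estimate)   # settles near the observed mean of 2.0
```

    Because each step only simulates near the current parameter value, the algorithm spends little effort in low-likelihood regions, which is the efficiency argument made in the abstract.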

  4. The incidence of urea cycle disorders

    OpenAIRE

    Summar, Marshall L.; Koelker, Stefan; Freedenberg, Debra; Le Mons, Cynthia; Haberle, Johannes; Lee, Hye-Seung; Kirmse, Brian

    2013-01-01

    A key question for urea cycle disorders is their incidence. In the United States, two UCDs, argininosuccinic synthetase and lyase deficiency, are currently detected by newborn screening. We used newborn screening data on over 6 million births and data from the large US and European longitudinal registries to determine how common these conditions are. The incidence for the United States is predicted to be 1 urea cycle disorder patient for every 35,000 births, presenting about 113 new patients per ...

  5. Likelihood ratio decisions in memory: three implied regularities.

    Science.gov (United States)

    Glanzer, Murray; Hilford, Andrew; Maloney, Laurence T

    2009-06-01

    We analyze four general signal detection models for recognition memory that differ in their distributional assumptions. Our analyses show that a basic assumption of signal detection theory, the likelihood ratio decision axis, implies three regularities in recognition memory: (1) the mirror effect, (2) the variance effect, and (3) the z-ROC length effect. For each model, we present the equations that produce the three regularities and show, in computed examples, how they do so. We then show that the regularities appear in data from a range of recognition studies. The analyses and data in our study support the following generalization: Individuals make efficient recognition decisions on the basis of likelihood ratios.

  6. Zero-inflated Poisson model based likelihood ratio test for drug safety signal detection.

    Science.gov (United States)

    Huang, Lan; Zheng, Dan; Zalkikar, Jyoti; Tiwari, Ram

    2017-02-01

    In recent decades, numerous methods have been developed for data mining of large drug safety databases, such as Food and Drug Administration's (FDA's) Adverse Event Reporting System, where data matrices are formed with drugs as columns and adverse events as rows. Often, a large number of cells in these data matrices have zero cell counts. Some of these are "true zeros," indicating that the drug-adverse event pairs cannot occur; they are distinguished from the remaining zero counts, which simply indicate that the drug-adverse event pairs have not occurred yet or have not been reported yet. In this paper, a zero-inflated Poisson model based likelihood ratio test method is proposed to identify drug-adverse event pairs that have disproportionately high reporting rates, which are also called signals. The maximum likelihood estimates of the model parameters of the zero-inflated Poisson model based likelihood ratio test are obtained using the expectation-maximization algorithm. The zero-inflated Poisson model based likelihood ratio test is also modified to handle stratified analyses for binary and categorical covariates (e.g. gender and age) in the data. The proposed zero-inflated Poisson model based likelihood ratio test method is shown to asymptotically control the type I error and false discovery rate, and its finite sample performance for signal detection is evaluated through a simulation study. The simulation results show that the zero-inflated Poisson model based likelihood ratio test method performs similar to the Poisson model based likelihood ratio test method when the estimated percentage of true zeros in the database is small. Both the zero-inflated Poisson model based likelihood ratio test and likelihood ratio test methods are applied to six selected drugs, from the 2006 to 2011 Adverse Event Reporting System database, with varying percentages of observed zero-count cells.
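
    A compact sketch of the zero-inflated Poisson machinery (not the paper's FDA implementation): EM estimates of the structural-zero probability and the Poisson rate, plus a likelihood-ratio comparison against a plain Poisson fit (ignoring the boundary issue in the LRT's null distribution). The counts are invented.

```python
import math

def zip_loglik(data, pi, lam):
    """Log-likelihood of a zero-inflated Poisson (pi = structural-zero prob)."""
    ll = 0.0
    for x in data:
        if x == 0:
            ll += math.log(pi + (1 - pi) * math.exp(-lam))
        else:
            ll += math.log(1 - pi) - lam + x * math.log(lam) - math.lgamma(x + 1)
    return ll

def fit_zip_em(data, iters=200):
    """EM estimates of (pi, lam) for the zero-inflated Poisson."""
    n = len(data)
    pi, lam = 0.5, sum(data) / n
    for _ in range(iters):
        # E-step: posterior probability that an observed zero is structural
        w0 = pi / (pi + (1 - pi) * math.exp(-lam))
        n_structural = w0 * sum(1 for x in data if x == 0)
        # M-step: update mixing weight and Poisson rate
        pi = n_structural / n
        lam = sum(data) / (n - n_structural)
    return pi, lam

# invented counts: an excess of zeros over what a single Poisson would give
data = [0] * 70 + [1] * 25 + [2] * 30 + [3] * 20 + [4] * 5
pi, lam = fit_zip_em(data)
lam0 = sum(data) / len(data)                       # plain Poisson MLE
lrt = 2 * (zip_loglik(data, pi, lam) - zip_loglik(data, 0.0, lam0))
print(round(pi, 3), round(lam, 3), round(lrt, 1))  # lrt is clearly positive
```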

  7. Asymptotic Likelihood Distribution for Correlated & Constrained Systems

    CERN Document Server

    Agarwal, Ujjwal

    2016-01-01

    This report describes my work as a summer student at CERN. It discusses the asymptotic distribution of the likelihood ratio when the total number of parameters is h, of which 2 are constrained and correlated.

  8. Incidence and predictive factors of isolated neonatal penile glanular torsion.

    Science.gov (United States)

    Sarkis, Pierrot E; Sadasivam, Muthurajan

    2007-12-01

    To determine the incidence of isolated neonatal penile glanular torsion, describe its basic characteristics, and explore the relationship between foreskin and glans torsion. A prospective survey was conducted of all male newborns admitted to the nursery after delivery, or neonates less than 3 months old presenting for circumcision. Cases with associated genital malformations were excluded. The incidence of isolated neonatal penile torsion was 27% (95% CI: 22.2%-31.84%), to the left in 99% of cases. In 3.5% of cases, the penis had an angle 20 degrees. Using Spearman's correlation coefficient, deviation of the penile raphe from the midline at the foreskin tip had a better correlation with glans torsion than deviation of the raphe at the coronal sulcus (0.727 vs 0.570; both statistically significant).

  9. Comparison of likelihood testing procedures for parallel systems with covariances

    International Nuclear Information System (INIS)

    Ayman Baklizi; Isa Daud; Noor Akma Ibrahim

    1998-01-01

    In this paper we investigate and compare the behavior of the likelihood ratio, Rao's, and Wald's statistics for testing hypotheses on the parameters of the simple linear regression model based on parallel systems with covariances. These statistics are asymptotically equivalent (Barndorff-Nielsen and Cox, 1994); however, their relative performances in finite samples are not generally known. A Monte Carlo experiment is conducted to simulate the sizes and powers of these statistics for complete samples and in the presence of time censoring. The statistics are compared according to their attainment of the assumed size of the test and their powers at various points in the parameter space. The results show that the likelihood ratio statistic appears to have the best performance in terms of attaining the assumed size of the test. Power comparisons show that the Rao statistic has some advantage over the Wald statistic in almost all of the space of alternatives, while the likelihood ratio statistic occupies either the first or the last position in terms of power. Overall, the likelihood ratio statistic appears to be more appropriate for the model under study, especially for small sample sizes
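
    The size calculation in such a Monte Carlo study can be sketched for a much simpler model than the paper's (i.i.d. exponential data rather than parallel systems with covariances); all settings below are illustrative.

```python
import numpy as np

def lr_test_size(n=30, lam0=1.0, n_rep=4000, seed=4):
    """Monte Carlo size of the likelihood ratio test of H0: lambda = lam0
    for i.i.d. exponential data, against the chi-square(1) 5% cutoff."""
    rng = np.random.default_rng(seed)
    crit = 3.841  # upper 5% point of chi-square with 1 d.o.f.
    rejections = 0
    for _ in range(n_rep):
        x = rng.exponential(1.0 / lam0, size=n)
        lam_hat = 1.0 / x.mean()              # MLE of the rate
        # 2 * [l(lam_hat) - l(lam0)], with l(lam) = n*log(lam) - lam*sum(x)
        lr = 2.0 * ((n * np.log(lam_hat) - lam_hat * x.sum())
                    - (n * np.log(lam0) - lam0 * x.sum()))
        rejections += lr > crit
    return rejections / n_rep

size = lr_test_size()
```

    The empirical rejection rate under H0 should sit near the nominal 5%, which is exactly the "attainment of assumed size" criterion used in the paper's comparisons.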

  10. Finite mixture model: A maximum likelihood estimation approach on time series data

    Science.gov (United States)

    Yen, Phoong Seuk; Ismail, Mohd Tahir; Hamzah, Firdaus Mohamad

    2014-09-01

    Recently, statisticians have emphasized fitting finite mixture models by maximum likelihood estimation because of its asymptotic properties: the estimator is consistent as the sample size increases to infinity, and is therefore asymptotically unbiased. Moreover, the parameter estimates obtained by maximum likelihood have the smallest variance compared with other statistical methods as the sample size increases. Thus, maximum likelihood estimation is adopted in this paper to fit a two-component mixture model in order to explore the relationship between rubber price and exchange rate for Malaysia, Thailand, the Philippines and Indonesia. The results show a negative relationship between rubber price and exchange rate for all selected countries.
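
    A generic maximum likelihood fit of a two-component (Gaussian) mixture via EM might look as follows, run on simulated two-regime data rather than the rubber price and exchange rate series used in the paper; all names are illustrative.

```python
import numpy as np

def em_gauss_mix2(x, n_iter=500):
    """Maximum likelihood fit of a two-component Gaussian mixture via EM."""
    x = np.asarray(x, float)
    w = 0.5                                          # weight of component 1
    mu1, mu2 = np.percentile(x, 25), np.percentile(x, 75)
    s1 = s2 = x.std()
    for _ in range(n_iter):
        # E-step: responsibility of component 1 for each point
        d1 = w * np.exp(-0.5 * ((x - mu1) / s1) ** 2) / s1
        d2 = (1 - w) * np.exp(-0.5 * ((x - mu2) / s2) ** 2) / s2
        r = d1 / (d1 + d2)
        # M-step: responsibility-weighted weight, means and spreads
        w = r.mean()
        mu1 = (r * x).sum() / r.sum()
        mu2 = ((1 - r) * x).sum() / (1 - r).sum()
        s1 = np.sqrt((r * (x - mu1) ** 2).sum() / r.sum())
        s2 = np.sqrt(((1 - r) * (x - mu2) ** 2).sum() / (1 - r).sum())
    return w, (mu1, s1), (mu2, s2)

# Simulated data with two well-separated regimes (illustrative)
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-2.0, 1.0, 3000), rng.normal(3.0, 0.5, 7000)])
w, (m1, sd1), (m2, sd2) = em_gauss_mix2(x)
```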

  11. Expert elicitation on ultrafine particles: likelihood of health effects and causal pathways

    Directory of Open Access Journals (Sweden)

    Brunekreef Bert

    2009-07-01

    Full Text Available Abstract Background Exposure to fine ambient particulate matter (PM has consistently been associated with increased morbidity and mortality. The relationship between exposure to ultrafine particles (UFP and health effects is less firmly established. If UFP cause health effects independently from coarser fractions, this could affect health impact assessment of air pollution, which would possibly lead to alternative policy options to be considered to reduce the disease burden of PM. Therefore, we organized an expert elicitation workshop to assess the evidence for a causal relationship between exposure to UFP and health endpoints. Methods An expert elicitation on the health effects of ambient ultrafine particle exposure was carried out, focusing on: (1) the likelihood of causal relationships with key health endpoints, and (2) the likelihood of potential causal pathways for cardiac events. Based on a systematic peer-nomination procedure, fourteen European experts (epidemiologists, toxicologists and clinicians) were selected, of whom twelve attended. They were provided with a briefing book containing key literature. After a group discussion, individual expert judgments in the form of ratings of the likelihood of causal relationships and pathways were obtained using a confidence scheme adapted from the one used by the Intergovernmental Panel on Climate Change. Results The likelihood of an independent causal relationship between increased short-term UFP exposure and increased all-cause mortality, hospital admissions for cardiovascular and respiratory diseases, aggravation of asthma symptoms and lung function decrements was rated medium to high by most experts. The likelihood for long-term UFP exposure to be causally related to all-cause mortality, cardiovascular and respiratory morbidity and lung cancer was rated slightly lower, mostly medium. The experts rated the likelihood of each of the six identified possible causal pathways separately. Out of these

  12. A note on estimating errors from the likelihood function

    International Nuclear Information System (INIS)

    Barlow, Roger

    2005-01-01

    The points at which the log likelihood falls by 1/2 from its maximum value are often used to give the 'errors' on a result, i.e. the 68% central confidence interval. The validity of this is examined for two simple cases: a lifetime measurement and a Poisson measurement. Results are compared with the exact Neyman construction and with the simple Bartlett approximation. It is shown that the accuracy of the log likelihood method is poor, and the Bartlett construction explains why it is flawed
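
    The Delta(ln L) = -1/2 rule for a single Poisson observation can be reproduced numerically. This sketch implements the approximate method the note examines (not the exact Neyman construction it is compared against), using plain bisection; all names are illustrative.

```python
import math

def _bisect(f, a, b, iters=200):
    """Locate a sign change of f on [a, b] by plain bisection."""
    fa = f(a)
    for _ in range(iters):
        m = 0.5 * (a + b)
        fm = f(m)
        if (fa < 0) == (fm < 0):
            a, fa = m, fm
        else:
            b = m
    return 0.5 * (a + b)

def loglik_interval_poisson(n):
    """68% interval for a Poisson mean from one observed count n,
    using the Delta(ln L) = -1/2 rule."""
    def dll(lam):
        # log-likelihood minus its maximum (at lam = n), plus 1/2
        return (n * math.log(lam) - lam) - (n * math.log(n) - n) + 0.5
    lo = _bisect(dll, 1e-9, float(n))
    hi = _bisect(dll, float(n), 10.0 * n + 10.0)
    return lo, hi

lo, hi = loglik_interval_poisson(9)   # asymmetric interval around n = 9
```

    The interval comes out asymmetric about the maximum, and it is precisely this approximate interval whose coverage the note finds to be poor.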

  13. Within-person Changes in Individual Symptoms of Depression Predict Subsequent Depressive Episodes in Adolescents: A Prospective Study

    Science.gov (United States)

    Kouros, Chrystyna D.; Morris, Matthew C.; Garber, Judy

    2015-01-01

    The current longitudinal study examined which individual symptoms of depression uniquely predicted a subsequent Major Depressive Episode (MDE) in adolescents, and whether these relations differed by sex. Adolescents (N=240) were first interviewed in grade 6 (M=11.86 years old; SD = 0.56; 54% female; 81.5% Caucasian) and then annually through grade 12 regarding their individual symptoms of depression as well as the occurrence of MDEs. Individual symptoms of depression were assessed with the Children’s Depression Rating Scale-Revised (CDRS-R) and depressive episodes were assessed with the Longitudinal Interval Follow-up Evaluation (LIFE). Results showed that within-person changes in sleep problems and low self-esteem/excessive guilt positively predicted an increased likelihood of an MDE for both boys and girls. Significant sex differences also were found. Within-person changes in anhedonia predicted an increased likelihood of a subsequent MDE among boys, whereas irritability predicted a decreased likelihood of a future MDE among boys, and concentration difficulties predicted a decreased likelihood of an MDE in girls. These results identified individual depressive symptoms that predicted subsequent depressive episodes in male and female adolescents, and may be used to guide the early detection, treatment, and prevention of depressive disorders in youth. PMID:26105209

  14. SeqAPASS: Predicting chemical susceptibility to threatened/endangered species

    Science.gov (United States)

    Conservation of a molecular target across species can be used as a line-of-evidence to predict the likelihood of chemical susceptibility. The web-based Sequence Alignment to Predict Across Species Susceptibility (SeqAPASS; https://seqapass.epa.gov/seqapass/) application was devel...

  15. Angiographically Negative Acute Arterial Upper and Lower Gastrointestinal Bleeding: Incidence, Predictive Factors, and Clinical Outcomes

    International Nuclear Information System (INIS)

    Kim, Jin Hyoung; Shin, Ji Hoon; Yoon, Hyun Ki; Chae, Eun Young; Myung, Seung Jae; Ko, Gi Young; Gwon, Dong Il; Sung, Kyu Bo

    2009-01-01

    To evaluate the incidence, predictive factors, and clinical outcomes of angiographically negative acute arterial upper and lower gastrointestinal (GI) bleeding. From 2001 to 2008, 143 consecutive patients who underwent an angiography for acute arterial upper or lower GI bleeding were examined. The angiographies revealed a negative bleeding focus in 75 of 143 (52%) patients. The incidence of an angiographically negative outcome was significantly higher in patients with a stable hemodynamic status (p < 0.001), or in patients with lower GI bleeding (p = 0.032). A follow-up of the 75 patients (range: 0-72 months, mean: 8 ± 14 months) revealed that 60 of the 75 (80%) patients with a negative bleeding focus underwent conservative management only, and acute bleeding was controlled without rebleeding. Three of the 75 (4%) patients underwent exploratory surgery due to prolonged bleeding; however, no bleeding focus was detected. Rebleeding occurred in 12 of 75 (16%) patients. Of these, six patients experienced massive rebleeding and died of disseminated intravascular coagulation within four to nine hours after the rebleeding episode. Four of the 16 patients underwent a repeat angiography and the two remaining patients underwent a surgical intervention to control the bleeding. Angiographically negative results are relatively common in patients with acute GI bleeding, especially in patients with a stable hemodynamic status or lower GI bleeding. Most patients with a negative bleeding focus have experienced spontaneous resolution of their condition

  16. Angiographically Negative Acute Arterial Upper and Lower Gastrointestinal Bleeding: Incidence, Predictive Factors, and Clinical Outcomes

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jin Hyoung; Shin, Ji Hoon; Yoon, Hyun Ki; Chae, Eun Young; Myung, Seung Jae; Ko, Gi Young; Gwon, Dong Il; Sung, Kyu Bo [Asan Medical Center, Seoul (Korea, Republic of)

    2009-08-15

    To evaluate the incidence, predictive factors, and clinical outcomes of angiographically negative acute arterial upper and lower gastrointestinal (GI) bleeding. From 2001 to 2008, 143 consecutive patients who underwent an angiography for acute arterial upper or lower GI bleeding were examined. The angiographies revealed a negative bleeding focus in 75 of 143 (52%) patients. The incidence of an angiographically negative outcome was significantly higher in patients with a stable hemodynamic status (p < 0.001), or in patients with lower GI bleeding (p = 0.032). A follow-up of the 75 patients (range: 0-72 months, mean: 8 ± 14 months) revealed that 60 of the 75 (80%) patients with a negative bleeding focus underwent conservative management only, and acute bleeding was controlled without rebleeding. Three of the 75 (4%) patients underwent exploratory surgery due to prolonged bleeding; however, no bleeding focus was detected. Rebleeding occurred in 12 of 75 (16%) patients. Of these, six patients experienced massive rebleeding and died of disseminated intravascular coagulation within four to nine hours after the rebleeding episode. Four of the 16 patients underwent a repeat angiography and the two remaining patients underwent a surgical intervention to control the bleeding. Angiographically negative results are relatively common in patients with acute GI bleeding, especially in patients with a stable hemodynamic status or lower GI bleeding. Most patients with a negative bleeding focus have experienced spontaneous resolution of their condition.

  17. The Prior Can Often Only Be Understood in the Context of the Likelihood

    Directory of Open Access Journals (Sweden)

    Andrew Gelman

    2017-10-01

    Full Text Available A key sticking point of Bayesian analysis is the choice of prior distribution, and there is a vast literature on potential defaults including uniform priors, Jeffreys’ priors, reference priors, maximum entropy priors, and weakly informative priors. These methods, however, often manifest a key conceptual tension in prior modeling: a model encoding true prior information should be chosen without reference to the model of the measurement process, but almost all common prior modeling techniques are implicitly motivated by a reference likelihood. In this paper we resolve this apparent paradox by placing the choice of prior into the context of the entire Bayesian analysis, from inference to prediction to model evaluation.

  18. A Maximum Likelihood Approach to Determine Sensor Radiometric Response Coefficients for NPP VIIRS Reflective Solar Bands

    Science.gov (United States)

    Lei, Ning; Chiang, Kwo-Fu; Oudrari, Hassan; Xiong, Xiaoxiong

    2011-01-01

    Optical sensors aboard Earth-orbiting satellites, such as the next-generation Visible/Infrared Imager/Radiometer Suite (VIIRS), assume that the sensor's radiometric response in the Reflective Solar Bands (RSB) is described by a quadratic polynomial relating the aperture spectral radiance to the sensor Digital Number (DN) readout. For VIIRS Flight Unit 1, the coefficients are to be determined before launch by an attenuation method, although the linear coefficient will be further determined on-orbit through observing the Solar Diffuser. In determining the quadratic polynomial coefficients by the attenuation method, a Maximum Likelihood approach is applied in carrying out the least-squares procedure. Crucial to the Maximum Likelihood least-squares procedure is the computation of the weight. The weight not only has a contribution from the noise of the sensor's digital count, with an important contribution from digitization error, but is also affected heavily by the mathematical expression used to predict the value of the dependent variable, because both the independent and the dependent variables contain random noise. In addition, model errors have a major impact on the uncertainties of the coefficients. The Maximum Likelihood approach demonstrates the inadequacy of the attenuation method model with a quadratic polynomial for the retrieved spectral radiance. We show that using the inadequate model dramatically increases the uncertainties of the coefficients. We compute the coefficient values and their uncertainties, considering both measurement and model errors.
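
    The weighted least-squares step for a quadratic response can be sketched under the simplifying assumption of independent Gaussian errors with known per-point sigma; the paper's actual weights also fold in digitization and model error, which are not modeled here, and all names are illustrative.

```python
import numpy as np

def weighted_quadratic_fit(dn, radiance, sigma):
    """Weighted least-squares fit of L = c0 + c1*DN + c2*DN^2 with weights
    1/sigma^2, the ML solution under independent Gaussian errors."""
    dn = np.asarray(dn, float)
    A = np.vander(dn, 3, increasing=True)      # columns: 1, DN, DN^2
    w = 1.0 / np.asarray(sigma, float) ** 2
    Aw = A * w[:, None]                        # rows scaled by the weights
    # normal equations of the weighted problem: (A^T W A) c = A^T W y
    return np.linalg.solve(A.T @ Aw, Aw.T @ np.asarray(radiance, float))

# Noise-free synthetic check: the fit should recover the true coefficients.
dn = np.linspace(1.0, 100.0, 40)
radiance = 2.0 + 0.5 * dn + 0.01 * dn ** 2
coeffs = weighted_quadratic_fit(dn, radiance, sigma=np.ones_like(dn))
```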

  19. Empirical Likelihood in Nonignorable Covariate-Missing Data Problems.

    Science.gov (United States)

    Xie, Yanmei; Zhang, Biao

    2017-04-20

    Missing covariate data occurs often in regression analysis, which frequently arises in the health and social sciences as well as in survey sampling. We study methods for the analysis of a nonignorable covariate-missing data problem in an assumed conditional mean function when some covariates are completely observed but other covariates are missing for some subjects. We adopt the semiparametric perspective of Bartlett et al. (Improving upon the efficiency of complete case analysis when covariates are MNAR. Biostatistics 2014;15:719-30) on regression analyses with nonignorable missing covariates, in which they have introduced the use of two working models, the working probability model of missingness and the working conditional score model. In this paper, we study an empirical likelihood approach to nonignorable covariate-missing data problems with the objective of effectively utilizing the two working models in the analysis of covariate-missing data. We propose a unified approach to constructing a system of unbiased estimating equations, where there are more equations than unknown parameters of interest. One useful feature of these unbiased estimating equations is that they naturally incorporate the incomplete data into the data analysis, making it possible to seek efficient estimation of the parameter of interest even when the working regression function is not specified to be the optimal regression function. We apply the general methodology of empirical likelihood to optimally combine these unbiased estimating equations. We propose three maximum empirical likelihood estimators of the underlying regression parameters and compare their efficiencies with other existing competitors. We present a simulation study to compare the finite-sample performance of various methods with respect to bias, efficiency, and robustness to model misspecification. The proposed empirical likelihood method is also illustrated by an analysis of a data set from the US National Health and

  20. METS-IR, a novel score to evaluate insulin sensitivity, is predictive of visceral adiposity and incident type 2 diabetes.

    Science.gov (United States)

    Bello-Chavolla, Omar Yaxmehen; Almeda-Valdes, Paloma; Gomez-Velasco, Donaji; Viveros-Ruiz, Tannia; Cruz-Bautista, Ivette; Romo-Romo, Alonso; Sánchez-Lázaro, Daniel; Meza-Oviedo, Dushan; Vargas-Vázquez, Arsenio; Campos, Olimpia Arellano; Sevilla-González, Magdalena Del Rocío; Martagón, Alexandro J; Hernández, Liliana Muñoz; Mehta, Roopa; Caballeros-Barragán, César Rodolfo; Aguilar-Salinas, Carlos A

    2018-05-01

    We developed a novel non-insulin-based fasting score to evaluate insulin sensitivity, validated against the euglycemic-hyperinsulinemic clamp (EHC). We also evaluated its correlation with ectopic fat accumulation and its capacity to predict incident type 2 diabetes mellitus (T2D). The discovery sample was composed of 125 subjects (57 without and 68 with T2D) who underwent an EHC. We defined METS-IR as (Ln((2*G0)+TG0)*BMI)/(Ln(HDL-c)) (G0: fasting glucose, TG0: fasting triglycerides, BMI: body mass index, HDL-c: high-density lipoprotein cholesterol), and compared its diagnostic performance against the M-value adjusted by fat-free mass (MFFM) obtained by an EHC. METS-IR was validated in a sample with EHC data, a sample with modified frequently sampled intravenous glucose tolerance test (FSIVGTT) data and a large cohort against HOMA-IR. We evaluated the correlation of the score with intrahepatic and intrapancreatic fat measured using magnetic resonance spectroscopy. Subsequently, we evaluated its ability to predict incident T2D cases in a prospective validation cohort of 6144 subjects. METS-IR demonstrated the better correlation with the MFFM (ρ = -0.622, P < ...) ... index obtained from the FSIVGTT (AUC: 0.67, 95% CI: 0.53-0.81). METS-IR significantly correlated with intravisceral, intrahepatic and intrapancreatic fat and fasting insulin levels (P < ...). Subjects with METS-IR > 50.39 had the highest adjusted risk of developing T2D (HR: 3.91, 95% CI: 2.25-6.81). Furthermore, subjects with incident T2D had higher baseline METS-IR compared to healthy controls (50.2 ± 10.2 vs 44.7 ± 9.2, P < 0.001). METS-IR is a novel score to evaluate cardiometabolic risk in healthy and at-risk subjects and a promising tool for screening of insulin sensitivity. © 2018 European Society of Endocrinology.
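
    The score itself is a one-line formula; a sketch of its computation on a hypothetical subject follows (the input values are made up for illustration, not study data).

```python
import math

def mets_ir(glucose_mgdl, tg_mgdl, bmi, hdl_mgdl):
    """METS-IR = Ln(2*G0 + TG0) * BMI / Ln(HDL-c), with fasting glucose,
    triglycerides and HDL-c in mg/dL, as defined in the abstract."""
    return math.log(2 * glucose_mgdl + tg_mgdl) * bmi / math.log(hdl_mgdl)

# Hypothetical subject: illustrative values only
score = mets_ir(glucose_mgdl=90, tg_mgdl=150, bmi=27.0, hdl_mgdl=45)
```

    On these made-up inputs the score falls below the 50.39 cutoff that the study associates with the highest adjusted T2D risk.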

  1. Efficient Bit-to-Symbol Likelihood Mappings

    Science.gov (United States)

    Moision, Bruce E.; Nakashima, Michael A.

    2010-01-01

    This innovation is an efficient algorithm designed to perform bit-to-symbol and symbol-to-bit likelihood mappings that represent a significant portion of the complexity of an error-correction code decoder for high-order constellations. Recent implementation of the algorithm in hardware has yielded an 8-percent reduction in overall area relative to the prior design.
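
    For context on what such a mapping computes, here is a generic (unoptimized) symbol-to-bit LLR mapping via exact log-sum-exp marginalisation; the constellation, labels, and numbers are illustrative, and this is not the reduced-complexity algorithm of the innovation itself.

```python
import numpy as np

def symbol_to_bit_llrs(symbol_loglikes, labels):
    """Map per-symbol log-likelihoods to per-bit LLRs by exact
    log-sum-exp marginalisation (LLR > 0 favours bit = 0)."""
    symbol_loglikes = np.asarray(symbol_loglikes, float)

    def lse(vals):
        v = np.asarray(vals, float)
        m = v.max()
        return m + np.log(np.exp(v - m).sum())

    n_bits = len(labels[0])
    llrs = np.empty(n_bits)
    for k in range(n_bits):
        zero = [ll for ll, lab in zip(symbol_loglikes, labels) if lab[k] == '0']
        one = [ll for ll, lab in zip(symbol_loglikes, labels) if lab[k] == '1']
        llrs[k] = lse(zero) - lse(one)
    return llrs

# Illustrative 4-point constellation with Gray labels; likelihoods favour '00'.
llrs = symbol_to_bit_llrs([-0.1, -2.0, -5.0, -3.0], ['00', '01', '11', '10'])
```

    Both bit LLRs come out positive here because the most likely symbol is labelled '00'; hardware implementations typically replace the log-sum-exp with a max-log approximation to cut complexity.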

  2. LDR: A Package for Likelihood-Based Sufficient Dimension Reduction

    Directory of Open Access Journals (Sweden)

    R. Dennis Cook

    2011-03-01

    Full Text Available We introduce a new Matlab software package that implements several recently proposed likelihood-based methods for sufficient dimension reduction. Current capabilities include estimation of reduced subspaces with a fixed dimension d, as well as estimation of d by use of likelihood-ratio testing, permutation testing and information criteria. The methods are suitable for preprocessing data for both regression and classification. Implementations of related estimators are also available. Although the software is more oriented to command-line operation, a graphical user interface is also provided for prototype computations.

  3. Phenomenological modelling of second cancer incidence for radiation treatment planning

    International Nuclear Information System (INIS)

    Pfaffenberger, Asja; Oelfke, Uwe; Schneider, Uwe; Poppe, Bjoern

    2009-01-01

    It is still an unanswered question whether a relatively low dose of radiation to a large volume or a higher dose to a small volume produces the higher cancer incidence. This is of interest in view of modalities like IMRT or rotation therapy, where high conformity to the target volume is achieved at the cost of a large volume of normal tissue exposed to radiation. Knowledge of the shape of the dose response for radiation-induced cancer is essential to answer the question of what risk of second cancer incidence is implied by which treatment modality. This study therefore models the dose response for radiation-induced second cancer after radiation therapy, the exact mechanisms of which are still unknown. A second cancer risk estimation tool for treatment planning is presented which has the potential to be used for comparison of different treatment modalities, and risk is estimated on a voxel basis for different organs in two case studies. The presented phenomenological model summarises the impact of microscopic biological processes into effective parameters of mutation and cell sterilisation. In contrast to other models, the effective radiosensitivities of mutated and non-mutated cells are allowed to differ. Based on the number of mutated cells present after irradiation, the model is then linked to macroscopic incidence by summarising model parameters and modifying factors into natural cancer incidence and the dose response in the lower-dose region. It was found that all principal dose-response functions discussed in the literature can be derived from the model. However, from this investigation, and due to the scarcity of adequate data, only rather vague statements about the likelihood of dose-response functions can be made, rather than a definite decision for one response. Based on the predicted model parameters, the linear response can probably be rejected using the dynamics described, but both a flattening response and a decrease appear likely, depending strongly on the effective cell

  4. Communicating likelihoods and probabilities in forecasts of volcanic eruptions

    Science.gov (United States)

    Doyle, Emma E. H.; McClure, John; Johnston, David M.; Paton, Douglas

    2014-02-01

    The issuing of forecasts and warnings of natural hazard events, such as volcanic eruptions, earthquake aftershock sequences and extreme weather often involves the use of probabilistic terms, particularly when communicated by scientific advisory groups to key decision-makers, who can differ greatly in relative expertise and function in the decision making process. Recipients may also differ in their perception of relative importance of political and economic influences on interpretation. Consequently, the interpretation of these probabilistic terms can vary greatly due to the framing of the statements, and whether verbal or numerical terms are used. We present a review from the psychology literature on how the framing of information influences communication of these probability terms. It is also unclear as to how people rate their perception of an event's likelihood throughout a time frame when a forecast time window is stated. Previous research has identified that, when presented with a 10-year time window forecast, participants viewed the likelihood of an event occurring ‘today’ as being of less than that in year 10. Here we show that this skew in perception also occurs for short-term time windows (under one week) that are of most relevance for emergency warnings. In addition, unlike the long-time window statements, the use of the phrasing “within the next…” instead of “in the next…” does not mitigate this skew, nor do we observe significant differences between the perceived likelihoods of scientists and non-scientists. This finding suggests that effects occurring due to the shorter time window may be ‘masking’ any differences in perception due to wording or career background observed for long-time window forecasts. These results have implications for scientific advice, warning forecasts, emergency management decision-making, and public information as any skew in perceived event likelihood towards the end of a forecast time window may result in

  5. Modeling gene expression measurement error: a quasi-likelihood approach

    Directory of Open Access Journals (Sweden)

    Strimmer Korbinian

    2003-03-01

    Full Text Available Abstract Background Using suitable error models for gene expression measurements is essential in the statistical analysis of microarray data. However, the true probabilistic model underlying gene expression intensity readings is generally not known. Instead, in currently used approaches some simple parametric model is assumed (usually a transformed normal distribution) or the empirical distribution is estimated. However, both these strategies may not be optimal for gene expression data, as the non-parametric approach ignores known structural information whereas the fully parametric models run the risk of misspecification. A further related problem is the choice of a suitable scale for the model (e.g. observed vs. log-scale). Results Here a simple semi-parametric model for gene expression measurement error is presented. In this approach inference is based on an approximate likelihood function (the extended quasi-likelihood). Only partial knowledge about the unknown true distribution is required to construct this function. In the case of gene expression this information is available in the form of the postulated (e.g. quadratic) variance structure of the data. As the quasi-likelihood behaves (almost) like a proper likelihood, it allows for the estimation of calibration and variance parameters, and it is also straightforward to obtain corresponding approximate confidence intervals. Unlike most other frameworks, it also allows analysis on any preferred scale, i.e. both on the original linear scale as well as on a transformed scale. It can also be employed in regression approaches to model systematic (e.g. array or dye) effects. Conclusions The quasi-likelihood framework provides a simple and versatile approach to analyze gene expression data that does not make any strong distributional assumptions about the underlying error model. For several simulated as well as real data sets it provides a better fit to the data than competing models. In an example it also

  6. Documentation of in-hospital falls on incident reports: qualitative investigation of an imperfect process.

    Science.gov (United States)

    Haines, Terry P; Cornwell, Petrea; Fleming, Jennifer; Varghese, Paul; Gray, Len

    2008-12-11

    Incident reporting is the prevailing approach to gathering data on accidental falls in hospitals for both research and quality assurance purposes, though it is of questionable quality, as staff time pressures, perception of blame and other factors are thought to contribute to under-reporting. This research aimed to identify contextual factors influencing the recording of in-hospital falls on incident reports. A qualitative multi-centre investigation using an open written-response questionnaire was undertaken. Participants were asked to describe any factors that made them feel more or less likely to record a fall on an incident report. 212 hospital staff from 30 wards in 7 hospitals in Queensland, Australia provided a response. A framework approach was employed to identify and understand inter-relationships between emergent categories. Three main categories were developed. The first, determinants of reporting, describes a hierarchical structure of primary (principle of reporting), secondary (patient injury), and tertiary determinants that influenced the likelihood that an in-hospital fall would be recorded on an incident report. The tertiary determinants frequently had an inconsistent effect. The second and third main categories described environmental/cultural facilitators and barriers, respectively, which form a background against which the determinants of reporting exist. A distinctive framework, with clear differences from the recording of other types of adverse events on incident reports, was apparent. Providing information to hospital staff regarding the purpose of incident reporting and the usefulness of incident reporting for preventing future falls may improve incident reporting practices.

  7. Pendeteksian Outlier pada Regresi Nonlinier dengan Metode statistik Likelihood Displacement

    Directory of Open Access Journals (Sweden)

    Siti Tabi'atul Hasanah

    2012-11-01

    Full Text Available An outlier is an observation that is very different (extreme) from the other observations, or data that do not follow the general pattern of the model. Sometimes outliers provide information that cannot be provided by other data, which is why outliers should not simply be eliminated. Outliers can also be influential observations. There are many methods that can be used to detect outliers. Previous studies addressed outlier detection in linear regression; here, outlier detection is developed for nonlinear regression, specifically multiplicative nonlinear regression. Detection uses the likelihood displacement (LD) statistic, a method that detects outliers by removing the suspected outlier data and measuring the resulting change in the likelihood. The parameters are estimated by the maximum likelihood method, yielding the maximum likelihood estimates. Using the LD method, the observations suspected to be outliers are identified. The accuracy of the LD method in detecting outliers is then shown by comparing the MSE of LD with the MSE of the regression in general. The test statistic used is Λ. The initial hypothesis is rejected when the observation is shown to be an outlier.
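
    A minimal sketch of the likelihood displacement idea, LD_i = 2[l(theta_hat) - l(theta_hat_(i))], illustrated for ordinary least squares with normal errors rather than the paper's multiplicative nonlinear model; the planted outlier and all names are illustrative.

```python
import numpy as np

def likelihood_displacement(X, y):
    """LD_i: the drop in the full-data log-likelihood when the MLE is
    recomputed with case i deleted; large values flag outliers."""
    n = len(y)

    def mle(Xs, ys):
        beta, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
        sigma2 = np.mean((ys - Xs @ beta) ** 2)
        return beta, sigma2

    def loglik(beta, sigma2):
        # full-data log-likelihood evaluated at the given parameters
        resid = y - X @ beta
        return -0.5 * n * np.log(2 * np.pi * sigma2) - (resid ** 2).sum() / (2 * sigma2)

    beta_full, s2_full = mle(X, y)
    l_full = loglik(beta_full, s2_full)
    ld = np.empty(n)
    for i in range(n):
        keep = np.arange(n) != i
        b_i, s2_i = mle(X[keep], y[keep])
        ld[i] = 2 * (l_full - loglik(b_i, s2_i))
    return ld

# Simulated regression with one planted outlier at index 10
rng = np.random.default_rng(2)
n = 50
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = 1.0 + 2.0 * X[:, 1] + rng.normal(scale=0.3, size=n)
y[10] += 5.0
ld = likelihood_displacement(X, y)
```

    Since the full-data MLE maximizes the full-data likelihood, every LD_i is nonnegative, and the planted outlier produces by far the largest displacement.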

  8. Susceptibility, likelihood to be diagnosed, worry and fear for contracting Lyme disease.

    Science.gov (United States)

    Fogel, Joshua; Chawla, Gurasees S

    Risk perception and psychological concerns are relevant for understanding how people view Lyme disease. This study investigates the four separate outcomes of susceptibility, likelihood to be diagnosed, worry, and fear for contracting Lyme disease. University students (n=713) were surveyed about demographics, perceived health, Lyme disease knowledge, Lyme disease preventive behaviors, Lyme disease history, and Lyme disease miscellaneous variables. We found that women were associated with increased susceptibility and fear. Asian/Asian-American race/ethnicity was associated with increased worry and fear. Perceived good health was associated with increased likelihood to be diagnosed, worry, and fear. Correct knowledge was associated with increased susceptibility and likelihood to be diagnosed. Those who typically spend a lot of time outdoors were associated with increased susceptibility, likelihood to be diagnosed, worry, and fear. In conclusion, healthcare providers and public health campaigns should address susceptibility, likelihood to be diagnosed, worry, and fear about Lyme disease, and should particularly target women and Asians/Asian-Americans to address any possible misconceptions and/or offer effective coping strategies. Copyright © 2016 King Saud Bin Abdulaziz University for Health Sciences. Published by Elsevier Ltd. All rights reserved.

  9. Soluble CD163 predicts incident chronic lung, kidney and liver disease in HIV infection

    DEFF Research Database (Denmark)

    Kirkegaard-Klitbo, Ditte M; Mejer, Niels; Knudsen, Troels B

    2017-01-01

    OBJECTIVE: To examine if monocyte and macrophage activity may be on the mechanistic pathway to non-AIDS comorbidity by investigating the associations between plasma-soluble CD163 (sCD163) and incident non-AIDS comorbidities in well treated HIV-infected individuals. DESIGN: Prospective single... was examined using multivariable Cox proportional hazards models adjusted for pertinent covariates. RESULTS: In HIV-1-infected individuals (n = 799), the highest quartile of plasma sCD163 was associated with incident chronic lung disease [adjusted hazard ratio (aHR), 3.2; 95% confidence interval (CI): 1.34; 7.46] and incident chronic kidney disease (aHR, 10.94; 95% CI: 2.32; 51.35), when compared with lowest quartiles. Further, (every 1 mg) increase in plasma sCD163 was positively correlated with incident liver disease (aHR, 1.12; 95% CI: 1.05; 1.19). The sCD163 level was not associated with incident cancer...

  10. Massive optimal data compression and density estimation for scalable, likelihood-free inference in cosmology

    Science.gov (United States)

    Alsing, Justin; Wandelt, Benjamin; Feeney, Stephen

    2018-03-01

    Many statistical models in cosmology can be simulated forwards but have intractable likelihood functions. Likelihood-free inference methods allow us to perform Bayesian inference from these models using only forward simulations, free from any likelihood assumptions or approximations. Likelihood-free inference generically involves simulating mock data and comparing to the observed data; this comparison in data-space suffers from the curse of dimensionality and requires compression of the data to a small number of summary statistics to be tractable. In this paper we use massive asymptotically-optimal data compression to reduce the dimensionality of the data-space to just one number per parameter, providing a natural and optimal framework for summary statistic choice for likelihood-free inference. Secondly, we present the first cosmological application of Density Estimation Likelihood-Free Inference (DELFI), which learns a parameterized model for the joint distribution of data and parameters, yielding both the parameter posterior and the model evidence. This approach is conceptually simple, requires less tuning than traditional Approximate Bayesian Computation approaches to likelihood-free inference and can give high-fidelity posteriors from orders of magnitude fewer forward simulations. As an additional bonus, it enables parameter inference and Bayesian model comparison simultaneously. We demonstrate Density Estimation Likelihood-Free Inference with massive data compression on an analysis of the joint light-curve analysis supernova data, as a simple validation case study. We show that high-fidelity posterior inference is possible for full-scale cosmological data analyses with as few as ∼10^4 simulations, with substantial scope for further improvement, demonstrating the scalability of likelihood-free inference to large and complex cosmological datasets.
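
The compression-then-inference idea can be sketched with a toy rejection-ABC example (a simpler stand-in for DELFI, which instead fits a density model; the Gaussian toy model, acceptance window, and all numbers below are illustrative assumptions, not from the paper). Here the sample mean is a sufficient statistic, i.e. an optimal one-number-per-parameter compression:

```python
import numpy as np

rng = np.random.default_rng(1)

# "Observed" data: 100 draws from N(theta_true, 1)
theta_true = 2.0
x_obs = rng.normal(theta_true, 1.0, size=100)
t_obs = x_obs.mean()   # sufficient statistic: optimal one-number compression

# Rejection ABC: simulate forward from the prior, keep parameters whose
# compressed summary lands close to the observed summary.
theta_prior = rng.uniform(-5, 5, size=50_000)
t_sim = rng.normal(theta_prior[:, None], 1.0, (50_000, 100)).mean(axis=1)
accepted = theta_prior[np.abs(t_sim - t_obs) < 0.05]

print(f"posterior mean ~ {accepted.mean():.2f}  (true value {theta_true})")
```

Comparing in the compressed one-dimensional summary space, rather than the 100-dimensional data space, is what keeps the acceptance rate workable; DELFI replaces the hard accept/reject step with a learned conditional density.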

  11. The incidence of urea cycle disorders.

    Science.gov (United States)

    Summar, Marshall L; Koelker, Stefan; Freedenberg, Debra; Le Mons, Cynthia; Haberle, Johannes; Lee, Hye-Seung; Kirmse, Brian

    2013-01-01

    A key question for urea cycle disorders is their incidence. In the United States two UCDs, argininosuccinic synthetase and lyase deficiency, are currently detected by newborn screening. We used newborn screening data on over 6 million births and data from the large US and European longitudinal registries to determine how common these conditions are. The incidence for the United States is predicted to be 1 urea cycle disorder patient for every 35,000 births, presenting about 113 new patients per year across all age groups. © 2013.
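
The headline figures can be sanity-checked with quick arithmetic (assuming roughly 4 million US births per year, a figure not given in the abstract):

```python
# Back-of-envelope check of the reported UCD incidence (assumed birth
# count: ~4 million US births per year, not stated in the abstract).
births_per_year = 4_000_000
incidence = 1 / 35_000                       # 1 UCD patient per 35,000 births

expected_new_patients = births_per_year * incidence
print(round(expected_new_patients))          # ~114, close to the ~113 quoted
```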

  12. Assessing Individual Weather Risk-Taking and Its Role in Modeling Likelihood of Hurricane Evacuation

    Science.gov (United States)

    Stewart, A. E.

    2017-12-01

    This research focuses upon measuring an individual's level of perceived risk of different severe and extreme weather conditions using a new self-report measure, the Weather Risk-Taking Scale (WRTS). For 32 severe and extreme situations in which people could perform an unsafe behavior (e.g., remaining outside with lightning striking close by, driving over roadways covered with water, not evacuating ahead of an approaching hurricane, etc.), people rated: 1. their likelihood of performing the behavior, 2. the perceived risk of performing the behavior, 3. the expected benefits of performing the behavior, and 4. whether the behavior has actually been performed in the past. Initial development research with the measure using 246 undergraduate students examined its psychometric properties and found that it was internally consistent (Cronbach's α ranged from .87 to .93 for the four scales) and that the scales possessed good temporal (test-retest) reliability (r's ranged from .84 to .91). A second regression study involving 86 undergraduate students found that taking weather risks was associated with having taken similar risks in one's past and with the personality trait of sensation-seeking. Being more attentive to the weather and perceiving its risks when it became extreme was associated with lower likelihoods of taking weather risks (overall regression model, adjusted R² = 0.60). A third study involving 334 people examined the contributions of weather risk perceptions and risk-taking in modeling the self-reported likelihood of complying with a recommended evacuation ahead of a hurricane. Here, higher perceptions of hurricane risks and lower perceived benefits of risk-taking, along with fear of severe weather and hurricane personal self-efficacy ratings, were all statistically significant contributors to the likelihood of evacuating ahead of a hurricane. Psychological rootedness and attachment to one's home also tend to predict lack of evacuation. This research highlights the
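
The internal-consistency figures quoted above are Cronbach's α values; a minimal sketch of the computation on synthetic item scores (the data and number of items below are invented, not the WRTS items):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()    # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)      # variance of total scores
    return k / (k - 1) * (1 - item_vars / total_var)

rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 1))                 # shared trait drives items
items = latent + 0.5 * rng.normal(size=(200, 8))   # 8 noisy indicators of it

print(f"alpha = {cronbach_alpha(items):.2f}")      # high, as for the WRTS scales
```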

  13. The Jarvis gas release incident

    International Nuclear Information System (INIS)

    Manocha, J.

    1992-01-01

    On 26 September, 1991, large volumes of natural gas were observed to be leaking from two water wells in the Town of Jarvis. Gas and water were being ejected from a drilled water well, at which a subsequent gas explosion occurred. Measurements of gas concentrations indicated levels far in excess of the lower flammability limit at several locations. Electrical power and natural gas services were cut off, and residents were evacuated. A state of emergency was declared, and gas was found to be flowing from water wells, around building foundations, and through other fractures in the ground. By 27 September the volumes of gas had decreased substantially, and by 30 September all residents had returned to their homes and the state of emergency was cancelled. The emergency response, possible pathways of natural gas into the aquifer, and public relations are discussed. It is felt that the likelihood of a similar incident occurring in the future is high. 11 figs

  14. Nearly Efficient Likelihood Ratio Tests for Seasonal Unit Roots

    DEFF Research Database (Denmark)

    Jansson, Michael; Nielsen, Morten Ørregaard

    In an important generalization of zero frequency autoregressive unit root tests, Hylleberg, Engle, Granger, and Yoo (1990) developed regression-based tests for unit roots at the seasonal frequencies in quarterly time series. We develop likelihood ratio tests for seasonal unit roots and show...... that these tests are "nearly efficient" in the sense of Elliott, Rothenberg, and Stock (1996), i.e. that their local asymptotic power functions are indistinguishable from the Gaussian power envelope. Currently available nearly efficient testing procedures for seasonal unit roots are regression-based and require...... the choice of a GLS detrending parameter, which our likelihood ratio tests do not....

  15. Maximum Likelihood Blind Channel Estimation for Space-Time Coding Systems

    Directory of Open Access Journals (Sweden)

    Hakan A. Çırpan

    2002-05-01

    Sophisticated signal processing techniques have to be developed for capacity enhancement of future wireless communication systems. In recent years, space-time coding has been proposed to provide significant capacity gains over traditional communication systems in fading wireless channels. Space-time codes are obtained by combining channel coding, modulation, transmit diversity, and optional receive diversity in order to provide diversity at the receiver and coding gain without sacrificing the bandwidth. In this paper, we consider the problem of blind estimation of space-time coded signals along with the channel parameters. Both conditional and unconditional maximum likelihood approaches are developed and iterative solutions are proposed. The conditional maximum likelihood algorithm is based on iterative least squares with projection, whereas the unconditional maximum likelihood approach is developed by means of finite-state Markov process modelling. The performance analysis issues of the proposed methods are studied. Finally, some simulation results are presented.

  16. A systematic review of breast cancer incidence risk prediction models with meta-analysis of their performance.

    Science.gov (United States)

    Meads, Catherine; Ahmed, Ikhlaaq; Riley, Richard D

    2012-04-01

    A risk prediction model is a statistical tool for estimating the probability that a currently healthy individual with specific risk factors will develop a condition in the future such as breast cancer. Reliably accurate prediction models can inform future disease burdens, health policies and individual decisions. Breast cancer prediction models containing modifiable risk factors, such as alcohol consumption, BMI or weight, condom use, exogenous hormone use and physical activity, are of particular interest to women who might be considering how to reduce their risk of breast cancer and clinicians developing health policies to reduce population incidence rates. We performed a systematic review to identify and evaluate the performance of prediction models for breast cancer that contain modifiable factors. A protocol was developed and a sensitive search in databases including MEDLINE and EMBASE was conducted in June 2010. Extensive use was made of reference lists. Included were any articles proposing or validating a breast cancer prediction model in a general female population, with no language restrictions. Duplicate data extraction and quality assessment were conducted. Results were summarised qualitatively, and where possible meta-analysis of model performance statistics was undertaken. The systematic review found 17 breast cancer models, each containing a different but often overlapping set of modifiable and other risk factors, combined with an estimated baseline risk that was also often different. Quality of reporting was generally poor, with characteristics of included participants and fitted model results often missing. Only four models received independent validation in external data, most notably the 'Gail 2' model with 12 validations. None of the models demonstrated consistently outstanding ability to accurately discriminate between those who did and those who did not develop breast cancer. For example, random-effects meta-analyses of the performance of the

  17. HLA Match Likelihoods for Hematopoietic Stem-Cell Grafts in the U.S. Registry

    Science.gov (United States)

    Gragert, Loren; Eapen, Mary; Williams, Eric; Freeman, John; Spellman, Stephen; Baitty, Robert; Hartzman, Robert; Rizzo, J. Douglas; Horowitz, Mary; Confer, Dennis; Maiers, Martin

    2018-01-01

    Background Hematopoietic stem-cell transplantation (HSCT) is a potentially lifesaving therapy for several blood cancers and other diseases. For patients without a suitable related HLA-matched donor, unrelated-donor registries of adult volunteers and banked umbilical cord–blood units, such as the Be the Match Registry operated by the National Marrow Donor Program (NMDP), provide potential sources of donors. Our goal in the present study was to measure the likelihood of finding a suitable donor in the U.S. registry. Methods Using human HLA data from the NMDP donor and cord-blood-unit registry, we built population-based genetic models for 21 U.S. racial and ethnic groups to predict the likelihood of identifying a suitable donor (either an adult donor or a cord-blood unit) for patients in each group. The models incorporated the degree of HLA matching, adult-donor availability (i.e., ability to donate), and cord-blood-unit cell dose. Results Our models indicated that most candidates for HSCT will have a suitable (HLA-matched or minimally mismatched) adult donor. However, many patients will not have an optimal adult donor — that is, a donor who is matched at high resolution at HLA-A, HLA-B, HLA-C, and HLA-DRB1. The likelihood of finding an optimal donor varies among racial and ethnic groups, with the highest probability among whites of European descent, at 75%, and the lowest probability among blacks of South or Central American descent, at 16%. Likelihoods for other groups are intermediate. Few patients will have an optimal cord-blood unit — that is, one matched at the antigen level at HLA-A and HLA-B and matched at high resolution at HLA-DRB1. However, cord-blood units mismatched at one or two HLA loci are available for almost all patients younger than 20 years of age and for more than 80% of patients 20 years of age or older, regardless of racial and ethnic background. Conclusions Most patients likely to benefit from HSCT will have a donor. Public investment in
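
The registry-scale intuition behind these likelihoods can be sketched with a toy matching model (the per-donor match probabilities, availability rate, and registry size below are illustrative assumptions, not NMDP estimates): if a random donor matches with probability p and, once matched, can actually donate with probability a, a registry of n donors yields at least one usable donor with probability 1 - (1 - p*a)^n.

```python
# Toy registry-match model (all numbers invented for illustration).
def match_likelihood(p: float, availability: float, n_donors: int) -> float:
    """P(at least one usable donor) for per-donor match probability p,
    availability rate, and registry size n_donors."""
    return 1 - (1 - p * availability) ** n_donors

# Rarer HLA types (smaller p) need far larger registries for the same
# coverage, which mirrors the disparities across groups reported above.
for p in (1e-4, 1e-5, 1e-6):
    print(f"p={p:.0e}: {match_likelihood(p, 0.5, 8_000_000):.3f}")
```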

  18. Incidence of Secondary Cancer Development After High-Dose Intensity-Modulated Radiotherapy and Image-Guided Brachytherapy for the Treatment of Localized Prostate Cancer

    International Nuclear Information System (INIS)

    Zelefsky, Michael J.; Housman, Douglas M.; Pei Xin; Alicikus, Zumre; Magsanoc, Juan Martin; Dauer, Lawrence T.; St Germain, Jean; Yamada, Yoshiya; Kollmeier, Marisa; Cox, Brett; Zhang Zhigang

    2012-01-01

    Purpose: To report the incidence and excess risk of second malignancy (SM) development compared with the general population after external beam radiotherapy (EBRT) and brachytherapy to treat prostate cancer. Methods and Materials: Between 1998 and 2001, 1,310 patients with localized prostate cancer were treated with EBRT (n = 897) or brachytherapy (n = 413). We compared the incidence of SMs in our patients with that of the general population extracted from the National Cancer Institute’s Surveillance, Epidemiology, and End Results data set combined with the 2000 census data. Results: The 10-year likelihood of SM development was 25% after EBRT and 15% after brachytherapy (p = .02). The corresponding 10-year likelihood for in-field SM development in these groups was 4.9% and 1.6% (p = .24). Multivariate analysis showed that EBRT vs. brachytherapy and older age were the only significant predictors for the development of all SMs (p = .037 and p = .030), with a trend for older patients to develop a SM. The increased incidence of SM for EBRT patients was explained by the greater incidence of skin cancer outside the radiation field compared with that after brachytherapy (10.6% and 3.3%, respectively, p = .004). For the EBRT group, the 5- and 10-year mortality rates from out-of-field cancer were 1.96% and 5.1%, respectively; for in-field SM, the corresponding mortality rates were 0.1% and 0.7%. Among the brachytherapy group, the 5- and 10-year mortality rates related to out-of-field SM were 0.8% and 2.7%, respectively. Our observed SM rates after prostate RT were not significantly different from the cancer incidence rates in the general population. Conclusions: Using modern sophisticated treatment techniques, we report low rates of in-field bladder and rectal SM risks after prostate cancer RT. Furthermore, the likelihood of mortality secondary to a SM was unusual. The greater rate of SM observed with EBRT vs. brachytherapy was related to a small, but significantly increased
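
Comparisons of cohort incidence against general-population (e.g. SEER) rates are typically summarized as a standardized incidence ratio (SIR); a minimal sketch with invented counts:

```python
# Standardized incidence ratio (SIR): observed vs. expected cases, where
# the expected count applies general-population rates to the cohort's
# person-years. All counts below are invented for illustration.
observed = 30                  # second malignancies seen in the cohort
person_years = 9_000
population_rate = 3.0e-3       # cases per person-year, matched population

expected = population_rate * person_years      # 27.0
sir = observed / expected
print(f"SIR = {sir:.2f}")                      # ~1.11: 11% excess vs. expected
```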

  19. Generalized empirical likelihood methods for analyzing longitudinal data

    KAUST Repository

    Wang, S.; Qian, L.; Carroll, R. J.

    2010-01-01

    Efficient estimation of parameters is a major objective in analyzing longitudinal data. We propose two generalized empirical likelihood based methods that take into consideration within-subject correlations. A nonparametric version of the Wilks

  20. Parallelization of maximum likelihood fits with OpenMP and CUDA

    CERN Document Server

    Jarp, S; Leduc, J; Nowak, A; Pantaleo, F

    2011-01-01

    Data analyses based on maximum likelihood fits are commonly used in the high energy physics community for fitting statistical models to data samples. This technique requires the numerical minimization of the negative log-likelihood function. MINUIT is the most common package used for this purpose in the high energy physics community. The main algorithm in this package, MIGRAD, searches for the minimum by using the gradient information. The procedure requires several evaluations of the function, depending on the number of free parameters and their initial values. The whole procedure can be very CPU-time consuming in the case of complex functions, with several free parameters, many independent variables and large data samples. Therefore, it becomes particularly important to speed up the evaluation of the negative log-likelihood function. In this paper we present an algorithm and its implementation which benefits from data vectorization and parallelization (based on OpenMP) and which was also ported to Graphics Processi...
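
The bottleneck described, repeated evaluation of the negative log-likelihood over a large sample, is exactly what vectorization targets. A NumPy/SciPy sketch of a vectorized Gaussian NLL minimization (a single-core toy analogue of the paper's OpenMP/CUDA work; the model and data are assumptions, not the paper's benchmark):

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)
data = rng.normal(loc=1.0, scale=2.0, size=100_000)

def nll(params: np.ndarray) -> float:
    """Negative log-likelihood of N(mu, sigma), evaluated over the whole
    sample in one vectorized pass instead of a per-event Python loop."""
    mu, sigma = params
    if sigma <= 0:
        return np.inf
    z = (data - mu) / sigma
    return 0.5 * np.sum(z**2) + data.size * np.log(sigma)

fit = minimize(nll, x0=[0.0, 1.0], method="Nelder-Mead")
mu_hat, sigma_hat = fit.x
print(f"mu={mu_hat:.2f}, sigma={sigma_hat:.2f}")   # close to (1.0, 2.0)
```

As in MIGRAD, the minimizer calls the NLL many times, so the per-call cost of this function dominates the fit time.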

  1. Moment Conditions Selection Based on Adaptive Penalized Empirical Likelihood

    Directory of Open Access Journals (Sweden)

    Yunquan Song

    2014-01-01

    Empirical likelihood is a very popular method that has been widely used in the fields of artificial intelligence (AI) and data mining, as tablets, mobile applications, and social media dominate the technology landscape. This paper proposes an empirical likelihood shrinkage method to efficiently estimate unknown parameters and select correct moment conditions simultaneously, when the model is defined by moment restrictions of which some are possibly misspecified. We show that our method enjoys oracle-like properties; that is, it consistently selects the correct moment conditions and at the same time its estimator is as efficient as the empirical likelihood estimator obtained using all correct moment conditions. Moreover, unlike the GMM, our proposed method allows us to construct confidence regions for the parameters included in the model without estimating the covariances of the estimators. For empirical implementation, we provide some data-driven procedures for selecting the tuning parameter of the penalty function. The simulation results show that the method works remarkably well in terms of correct moment selection and the finite sample properties of the estimators. Also, a real-life example is carried out to illustrate the new methodology.

  2. Analysis of Minute Features in Speckled Imagery with Maximum Likelihood Estimation

    Directory of Open Access Journals (Sweden)

    Alejandro C. Frery

    2004-12-01

    This paper deals with numerical problems arising when performing maximum likelihood parameter estimation in speckled imagery using small samples. The noise that appears in images obtained with coherent illumination, as is the case of sonar, laser, ultrasound-B, and synthetic aperture radar, is called speckle, and it can neither be assumed Gaussian nor additive. The properties of speckle noise are well described by the multiplicative model, a statistical framework from which stem several important distributions. Amongst these distributions, one is regarded as the universal model for speckled data, namely, the 𝒢0 law. This paper deals with amplitude data, so the 𝒢A0 distribution will be used. The literature reports that techniques for obtaining estimates (maximum likelihood, based on moments, and based on order statistics) of the parameters of the 𝒢A0 distribution require samples of hundreds, even thousands, of observations in order to obtain sensible values. This is verified for maximum likelihood estimation, and a proposal based on alternate optimization is made to alleviate this situation. The proposal is assessed with real and simulated data, showing that the convergence problems are no longer present. A Monte Carlo experiment is devised to estimate the quality of maximum likelihood estimators in small samples, and real data is successfully analyzed with the proposed alternated procedure. Stylized empirical influence functions are computed and used to choose a strategy for computing maximum likelihood estimates that is resistant to outliers.
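
The alternate-optimization idea, holding one parameter fixed where a closed form exists and optimizing the other numerically, can be sketched for a gamma model (a deliberately simpler stand-in for the 𝒢A0 law, whose likelihood is more involved; the true parameters and sample size below are invented):

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.special import gammaln

rng = np.random.default_rng(6)
x = rng.gamma(shape=3.0, scale=2.0, size=50)   # deliberately small sample

def nll(a: float, s: float) -> float:
    """Gamma negative log-likelihood with shape a and scale s."""
    return -np.sum((a - 1) * np.log(x) - x / s - gammaln(a) - a * np.log(s))

# Alternate optimization: the scale has a closed form given the shape, so
# each cycle solves only a 1-D problem -- more stable on tiny samples than
# a joint 2-D search.
a = 1.0
for _ in range(50):
    s = x.mean() / a                                   # closed-form scale MLE
    a = minimize_scalar(lambda a_: nll(a_, s),
                        bounds=(1e-3, 50), method="bounded").x

print(f"shape={a:.2f}, scale={s:.2f}")    # near (3, 2), sample permitting
```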

  3. Documentation of in-hospital falls on incident reports: Qualitative investigation of an imperfect process

    Directory of Open Access Journals (Sweden)

    Fleming Jennifer

    2008-12-01

    Background Incident reporting is the prevailing approach to gathering data on accidental falls in hospitals for both research and quality assurance purposes, though it is of questionable quality, as staff time pressures, perception of blame and other factors are thought to contribute to under-reporting. Methods This research aimed to identify contextual factors influencing recording of in-hospital falls on incident reports. A qualitative multi-centre investigation using an open written response questionnaire was undertaken. Participants were asked to describe any factors that made them feel more or less likely to record a fall on an incident report. 212 hospital staff from 30 wards in 7 hospitals in Queensland, Australia provided a response. A framework approach was employed to identify and understand inter-relationships between emergent categories. Results Three main categories were developed. The first, determinants of reporting, describes a hierarchical structure of primary (principle of reporting), secondary (patient injury), and tertiary determinants that influenced the likelihood that an in-hospital fall would be recorded on an incident report. The tertiary determinants frequently had an inconsistent effect. The second and third main categories described environmental/cultural facilitators and barriers, respectively, which form a background upon which the determinants of reporting exist. Conclusion A distinctive framework with clear differences to recording of other types of adverse events on incident reports was apparent. Providing information to hospital staff regarding the purpose of incident reporting and the usefulness of incident reporting for preventing future falls may improve incident reporting practices.

  4. Unbinned likelihood maximisation framework for neutrino clustering in Python

    Energy Technology Data Exchange (ETDEWEB)

    Coenders, Stefan [Technische Universitaet Muenchen, Boltzmannstr. 2, 85748 Garching (Germany)

    2016-07-01

    Although an astrophysical neutrino flux has been detected with IceCube, the sources of astrophysical neutrinos remain hidden up to now. A detection of a neutrino point source would be a smoking gun for hadronic processes and the acceleration of cosmic rays. The search for neutrino sources has many degrees of freedom, for example steady versus transient emission, point-like versus extended sources, et cetera. Here, we introduce a Python framework designed for unbinned likelihood maximisations as used in searches for neutrino point sources by IceCube. By implementing source scenarios in a modular way, likelihood searches of various kinds can be implemented in a user-friendly way, without sacrificing speed or memory efficiency.
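
The unbinned point-source likelihood has a simple skeleton: each event contributes the log of a signal-plus-background mixture, and the fitted signal strength maximizes the sum. A 1-D toy version (the event counts, source position, and point-spread width are invented; the real searches work with sky coordinates and energies):

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(3)

# Toy 1-D "sky": uniform background on [0, 10], signal clustered at x = 4.
background = rng.uniform(0, 10, size=900)
signal = rng.normal(4.0, 0.2, size=100)
events = np.concatenate([background, signal])
N = events.size

def signal_pdf(x):
    # Assumed point-spread function at the tested source position.
    return np.exp(-0.5 * ((x - 4.0) / 0.2) ** 2) / (0.2 * np.sqrt(2 * np.pi))

def neg_log_like(ns):
    # Unbinned likelihood: each event's density is a signal/background mix.
    mix = ns / N * signal_pdf(events) + (1 - ns / N) * (1 / 10)
    return -np.log(mix).sum()

fit = minimize_scalar(neg_log_like, bounds=(0, N), method="bounded")
print(f"fitted n_signal = {fit.x:.0f}")    # near the 100 injected events
```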

  5. The role of surface topography in predicting scattering at grazing incidence from optical surfaces

    International Nuclear Information System (INIS)

    Rehn, V.; Jones, V.O.; Elson, J.M.; Bennett, J.M.

    1980-01-01

    Monochromator design and the design of optical experiments at XUV and X-ray wavelengths are frequently limited by scattering from optical components, yet theoretical treatments are few and untested experimentally. This is partly due to the failure of scattering models used in the visible and near UV when the wavelength becomes comparable to, or smaller than, the topographic features on the surface, and partly it is due to the difficulty in measuring the topography on the required size scale. We briefly review the theoretical problems and prospects for accurately predicting both the magnitude and angular distribution of scattering at grazing incidence from optical surfaces. Experimental methods for determining and representing the surface topography are also reviewed, together with their limitations and ranges of applicability. Finally, the first results of our experiments, conducted recently at the Stanford Synchrotron Radiation Laboratory on the angular distribution of scattering by surfaces of known topography are presented and discussed, along with their potential implications for the theory of scattering, and for XUV and X-ray optical components. (orig.)

  6. Dimension-Independent Likelihood-Informed MCMC

    KAUST Repository

    Cui, Tiangang; Law, Kody; Marzouk, Youssef

    2015-01-01

    Many Bayesian inference problems require exploring the posterior distribution of high-dimensional parameters, which in principle can be described as functions. By exploiting low-dimensional structure in the change from prior to posterior [distributions], we introduce a suite of MCMC samplers that can adapt to the complex structure of the posterior distribution, yet are well-defined on function space. Posterior sampling in nonlinear inverse problems arising from various partial differential equations and also a stochastic differential equation is used to demonstrate the efficiency of these dimension-independent likelihood-informed samplers.
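
A canonical sampler in this function-space family is preconditioned Crank-Nicolson (pCN), whose proposal preserves the Gaussian prior so the acceptance ratio involves only the likelihood; that property is what keeps the acceptance rate from degenerating as the dimension grows. A low-dimensional sketch (the Gaussian prior/likelihood toy problem below is an illustration, not the paper's PDE examples):

```python
import numpy as np

rng = np.random.default_rng(4)
dim, beta, n_steps = 50, 0.2, 20_000
y = 1.0                                           # toy observation of the mean

def log_like(u):
    # Likelihood only: the Gaussian prior is handled by the proposal itself.
    return -0.5 * (u.mean() - y) ** 2 / 0.01

u = np.zeros(dim)
samples = []
for _ in range(n_steps):
    # pCN proposal: prior-preserving AR(1) move, well-defined as dim -> inf.
    v = np.sqrt(1 - beta**2) * u + beta * rng.standard_normal(dim)
    if np.log(rng.uniform()) < log_like(v) - log_like(u):
        u = v
    samples.append(u.mean())

post_mean = np.mean(samples[5000:])
print(f"posterior mean of u_bar ~ {post_mean:.2f}")   # analytic value is 2/3
```

For this conjugate toy problem the posterior mean of the field average is available in closed form (precision 50 from the prior plus 100 from the likelihood gives 100/150 = 2/3), which makes the chain easy to check.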

  7. Dimension-Independent Likelihood-Informed MCMC

    KAUST Repository

    Cui, Tiangang

    2015-01-07

    Many Bayesian inference problems require exploring the posterior distribution of high-dimensional parameters, which in principle can be described as functions. By exploiting low-dimensional structure in the change from prior to posterior [distributions], we introduce a suite of MCMC samplers that can adapt to the complex structure of the posterior distribution, yet are well-defined on function space. Posterior sampling in nonlinear inverse problems arising from various partial differential equations and also a stochastic differential equation is used to demonstrate the efficiency of these dimension-independent likelihood-informed samplers.

  8. Validating prediction scales of type 2 diabetes mellitus in Spain: the SPREDIA-2 population-based prospective cohort study protocol

    Science.gov (United States)

    Salinero-Fort, Miguel Ángel; de Burgos-Lunar, Carmen; Mostaza Prieto, José; Lahoz Rallo, Carlos; Abánades-Herranz, Juan Carlos; Gómez-Campelo, Paloma; Laguna Cuesta, Fernando; Estirado De Cabo, Eva; García Iglesias, Francisca; González Alegre, Teresa; Fernández Puntero, Belén; Montesano Sánchez, Luis; Vicent López, David; Cornejo Del Río, Víctor; Fernández García, Pedro J; Sabín Rodríguez, Concesa; López López, Silvia; Patrón Barandío, Pedro

    2015-01-01

    Introduction The incidence of type 2 diabetes mellitus (T2DM) is increasing worldwide. When diagnosed, many patients already have organ damage or advanced subclinical atherosclerosis. An early diagnosis could allow the implementation of lifestyle changes and treatment options aimed at delaying the progression of the disease and at avoiding cardiovascular complications. Different scores for identifying undiagnosed diabetes have been reported; however, their performance in populations of southern Europe has not been sufficiently evaluated. The main objectives of our study are: to evaluate the screening performance and cut-off points of the main scores that identify the risk of undiagnosed T2DM and prediabetes in a Spanish population, and to develop and validate our own predictive models of undiagnosed T2DM (screening model), and future T2DM (prediction risk model) after 5-year follow-up. As a secondary objective, we will evaluate the atherosclerotic burden of the population with undiagnosed T2DM. Methods and analysis Population-based prospective cohort study with baseline screening, to evaluate the performance of the FINDRISC, DANISH, DESIR, ARIC and QDScore, against the gold-standard tests: fasting plasma glucose, oral glucose tolerance and/or HbA1c. The sample size will include 1352 participants between the ages of 45 and 74 years. Analysis: sensitivity, specificity, positive predictive value, negative predictive value, likelihood ratio positive, likelihood ratio negative and receiver operating characteristic curves and area under the curve. Binary logistic regression for the first 700 individuals (derivation) and last 652 (validation) will be performed. All analyses will be calculated with their 95% CI; statistical significance will be p<0.05. Ethics and dissemination The study protocol has been approved by the Research Ethics Committee of the Carlos III Hospital (Madrid). The score performance and predictive model will be presented in medical conferences, workshops
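
The screening statistics listed in the analysis plan all follow from a single 2x2 table of test results against the gold standard; a minimal sketch (the counts below are invented for illustration, though they sum to the planned n=1352):

```python
# Screening-performance metrics from a 2x2 table (invented counts).
tp, fp, fn, tn = 80, 150, 20, 1102        # total = 1352

sensitivity = tp / (tp + fn)              # 0.80
specificity = tn / (tn + fp)              # ~0.88
ppv = tp / (tp + fp)                      # ~0.35
npv = tn / (tn + fn)                      # ~0.98
lr_pos = sensitivity / (1 - specificity)  # ~6.7
lr_neg = (1 - sensitivity) / specificity  # ~0.23

print(f"Se={sensitivity:.2f} Sp={specificity:.2f} "
      f"PPV={ppv:.2f} NPV={npv:.2f} LR+={lr_pos:.1f} LR-={lr_neg:.2f}")
```

Note how the low PPV despite good sensitivity and specificity reflects the low prevalence of undiagnosed T2DM in the screened population.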

  9. COSMIC MICROWAVE BACKGROUND LIKELIHOOD APPROXIMATION BY A GAUSSIANIZED BLACKWELL-RAO ESTIMATOR

    International Nuclear Information System (INIS)

    Rudjord, Oe.; Groeneboom, N. E.; Eriksen, H. K.; Huey, Greg; Gorski, K. M.; Jewell, J. B.

    2009-01-01

    We introduce a new cosmic microwave background (CMB) temperature likelihood approximation called the Gaussianized Blackwell-Rao estimator. This estimator is derived by transforming the observed marginal power spectrum distributions obtained by the CMB Gibbs sampler into standard univariate Gaussians, and then approximating their joint transformed distribution by a multivariate Gaussian. The method is exact for full-sky coverage and uniform noise and an excellent approximation for sky cuts and scanning patterns relevant for modern satellite experiments such as the Wilkinson Microwave Anisotropy Probe (WMAP) and Planck. The result is a stable, accurate, and computationally very efficient CMB temperature likelihood representation that allows the user to exploit the unique error propagation capabilities of the Gibbs sampler to high l. A single evaluation of this estimator between l = 2 and 200 takes ∼0.2 CPU milliseconds, while for comparison, a single pixel-space likelihood evaluation between l = 2 and 30 for a map with ∼2500 pixels requires ∼20 s. We apply this tool to the five-year WMAP temperature data, and re-estimate the angular temperature power spectrum, C_l, and likelihood, L(C_l), for l ≤ 200, and derive new cosmological parameters for the standard six-parameter ΛCDM model. Our spectrum is in excellent agreement with the official WMAP spectrum, but we find slight differences in the derived cosmological parameters. Most importantly, the spectral index of scalar perturbations is n_s = 0.973 ± 0.014, 1.9σ away from unity and 0.6σ higher than the official WMAP result, n_s = 0.965 ± 0.014. This suggests that an exact likelihood treatment is required to higher l than previously believed, reinforcing and extending our conclusions from the three-year WMAP analysis. In that case, we found that the suboptimal likelihood approximation adopted between l = 12 and 30 by the WMAP team biased n_s low by 0.4σ, while here we find that the same approximation

  10. In 'big bang' major incidents do triage tools accurately predict clinical priority?: a systematic review of the literature.

    Science.gov (United States)

    Kilner, T M; Brace, S J; Cooke, M W; Stallard, N; Bleetman, A; Perkins, G D

    2011-05-01

    The term "big bang" major incident is used to describe sudden, usually traumatic, catastrophic events, involving relatively large numbers of injured individuals, where demands on clinical services rapidly outstrip the available resources. Triage tools support the pre-hospital provider to prioritise which patients to treat and/or transport first based upon clinical need. The aim of this review is to identify existing triage tools and to determine the extent to which their reliability and validity have been assessed. A systematic review of the literature was conducted to identify and evaluate published data validating the efficacy of the triage tools. Studies using data from trauma patients that report on the derivation, validation and/or reliability of the specific pre-hospital triage tools were eligible for inclusion. Purely descriptive studies, reviews, exercises or reports (without supporting data) were excluded. The search yielded 1982 papers. After initial scrutiny of title and abstract, 181 papers were deemed potentially applicable and from these 11 were identified as relevant to this review (in first figure). There were two level of evidence one studies, three level of evidence two studies and six level of evidence three studies. The two level of evidence one studies were prospective validations of Clinical Decision Rules (CDR's) in children in South Africa; all the other studies were retrospective CDR derivation, validation or cohort studies. The quality of the papers was rated as good (n=3), fair (n=7), poor (n=1). There is limited evidence for the validity of existing triage tools in big bang major incidents. Where evidence does exist, it focuses on sensitivity and specificity in relation to prediction of trauma death or severity of injury based on data from single incidents or incidents involving small numbers of patients. The Sacco system is unique in combining survivability modelling with the degree by which the system is overwhelmed in the triage decision system. The

  11. Generalized Empirical Likelihood-Based Focused Information Criterion and Model Averaging

    Directory of Open Access Journals (Sweden)

    Naoya Sueishi

    2013-07-01

    Full Text Available This paper develops model selection and averaging methods for moment restriction models. We first propose a focused information criterion based on the generalized empirical likelihood estimator. We address the issue of selecting an optimal model, rather than a correct model, for estimating a specific parameter of interest. Then, this study investigates a generalized empirical likelihood-based model averaging estimator that minimizes the asymptotic mean squared error. A simulation study suggests that our averaging estimator can be a useful alternative to existing post-selection estimators.

  12. Nearly Efficient Likelihood Ratio Tests of the Unit Root Hypothesis

    DEFF Research Database (Denmark)

    Jansson, Michael; Nielsen, Morten Ørregaard

    Seemingly absent from the arsenal of currently available "nearly efficient" testing procedures for the unit root hypothesis, i.e. tests whose local asymptotic power functions are indistinguishable from the Gaussian power envelope, is a test admitting a (quasi-)likelihood ratio interpretation. We show that the likelihood ratio unit root test derived in a Gaussian AR(1) model with standard normal innovations is nearly efficient in that model. Moreover, these desirable properties carry over to more complicated models allowing for serially correlated and/or non-Gaussian innovations.
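
    The core of such a test can be sketched in a few lines: fit the AR(1) coefficient by conditional maximum likelihood, refit with the coefficient pinned at unity, and compare the two log-likelihoods. The sketch below is a minimal illustration of that idea on simulated data; it is not the authors' nearly efficient statistic, and its critical values are not given here.

```python
import math, random

def lr_unit_root(y):
    """Likelihood ratio statistic for H0: rho = 1 in a Gaussian AR(1)
    y_t = rho * y_{t-1} + eps_t, using the conditional likelihood."""
    n = len(y) - 1
    num = sum(y[t] * y[t - 1] for t in range(1, len(y)))
    den = sum(y[t - 1] ** 2 for t in range(1, len(y)))
    rho_hat = num / den  # unrestricted conditional MLE of rho
    rss1 = sum((y[t] - rho_hat * y[t - 1]) ** 2 for t in range(1, len(y)))
    rss0 = sum((y[t] - y[t - 1]) ** 2 for t in range(1, len(y)))  # rho fixed at 1
    # LR = 2*(loglik_unrestricted - loglik_restricted) = n * log(rss0 / rss1)
    return n * math.log(rss0 / rss1)

random.seed(0)
y = [0.0]
for _ in range(200):  # clearly stationary series, true rho = 0.5
    y.append(0.5 * y[-1] + random.gauss(0, 1))
print(lr_unit_root(y))  # a large value is evidence against a unit root
```

    Because the unrestricted fit minimizes the residual sum of squares, the statistic is always non-negative, and it grows with the evidence against the unit root.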

  13. Mental models accurately predict emotion transitions.

    Science.gov (United States)

    Thornton, Mark A; Tamir, Diana I

    2017-06-06

    Successful social interactions depend on people's ability to predict others' future actions and emotions. People possess many mechanisms for perceiving others' current emotional states, but how might they use this information to predict others' future states? We hypothesized that people might capitalize on an overlooked aspect of affective experience: current emotions predict future emotions. By attending to regularities in emotion transitions, perceivers might develop accurate mental models of others' emotional dynamics. People could then use these mental models of emotion transitions to predict others' future emotions from currently observable emotions. To test this hypothesis, studies 1-3 used data from three extant experience-sampling datasets to establish the actual rates of emotional transitions. We then collected three parallel datasets in which participants rated the transition likelihoods between the same set of emotions. Participants' ratings of emotion transitions predicted others' experienced transitional likelihoods with high accuracy. Study 4 demonstrated that four conceptual dimensions of mental state representation-valence, social impact, rationality, and human mind-inform participants' mental models. Study 5 used 2 million emotion reports on the Experience Project to replicate both of these findings: again people reported accurate models of emotion transitions, and these models were informed by the same four conceptual dimensions. Importantly, neither these conceptual dimensions nor holistic similarity could fully explain participants' accuracy, suggesting that their mental models contain accurate information about emotion dynamics above and beyond what might be predicted by static emotion knowledge alone.

  14. Mental models accurately predict emotion transitions

    Science.gov (United States)

    Thornton, Mark A.; Tamir, Diana I.

    2017-01-01

    Successful social interactions depend on people’s ability to predict others’ future actions and emotions. People possess many mechanisms for perceiving others’ current emotional states, but how might they use this information to predict others’ future states? We hypothesized that people might capitalize on an overlooked aspect of affective experience: current emotions predict future emotions. By attending to regularities in emotion transitions, perceivers might develop accurate mental models of others’ emotional dynamics. People could then use these mental models of emotion transitions to predict others’ future emotions from currently observable emotions. To test this hypothesis, studies 1–3 used data from three extant experience-sampling datasets to establish the actual rates of emotional transitions. We then collected three parallel datasets in which participants rated the transition likelihoods between the same set of emotions. Participants’ ratings of emotion transitions predicted others’ experienced transitional likelihoods with high accuracy. Study 4 demonstrated that four conceptual dimensions of mental state representation—valence, social impact, rationality, and human mind—inform participants’ mental models. Study 5 used 2 million emotion reports on the Experience Project to replicate both of these findings: again people reported accurate models of emotion transitions, and these models were informed by the same four conceptual dimensions. Importantly, neither these conceptual dimensions nor holistic similarity could fully explain participants’ accuracy, suggesting that their mental models contain accurate information about emotion dynamics above and beyond what might be predicted by static emotion knowledge alone. PMID:28533373

  15. Is Primary Prostate Cancer Treatment Influenced by Likelihood of Extraprostatic Disease? A Surveillance, Epidemiology and End Results Patterns of Care Study

    International Nuclear Information System (INIS)

    Holmes, Jordan A.; Wang, Andrew Z.; Hoffman, Karen E.; Hendrix, Laura H.; Rosenman, Julian G.; Carpenter, William R.; Godley, Paul A.; Chen, Ronald C.

    2012-01-01

    Purpose: To examine the patterns of primary treatment in a recent population-based cohort of prostate cancer patients, stratified by the likelihood of extraprostatic cancer as predicted by disease characteristics available at diagnosis. Methods and Materials: A total of 157,371 patients diagnosed from 2004 to 2008 with clinically localized and potentially curable (node-negative, nonmetastatic) prostate cancer, who have complete information on prostate-specific antigen, Gleason score, and clinical stage, were included. Patients with clinical T1/T2 disease were grouped into categories of 50% likelihood of having extraprostatic disease using the Partin nomogram. Clinical T3/T4 patients were examined separately as the highest-risk group. Logistic regression was used to examine the association between patient group and receipt of each primary treatment, adjusting for age, race, year of diagnosis, marital status, Surveillance, Epidemiology and End Results database region, and county-level education. Separate models were constructed for primary surgery, external-beam radiotherapy (RT), and conservative management. Results: On multivariable analysis, increasing likelihood of extraprostatic disease was significantly associated with increasing use of RT and decreased conservative management. Use of surgery also increased. Patients with >50% likelihood of extraprostatic cancer had almost twice the odds of receiving prostatectomy as those with 50% likelihood of extraprostatic cancer (34%) and clinical T3–T4 disease (24%). The proportion of patients who received prostatectomy or conservative management was approximately 50% or slightly higher in all groups. Conclusions: There may be underutilization of RT in older prostate cancer patients and those with likely extraprostatic disease. Because more than half of prostate cancer patients do not consult with a radiation oncologist, a multidisciplinary consultation may affect the treatment decision-making process.

  16. Groundwater uranium and cancer incidence in South Carolina

    Science.gov (United States)

    Wagner, Sara E.; Burch, James B.; Bottai, Matteo; Puett, Robin; Porter, Dwayne; Bolick-Aldrich, Susan; Temples, Tom; Wilkerson, Rebecca C.; Vena, John E.; Hébert, James R.

    2012-01-01

    Objective This ecologic study tested the hypothesis that census tracts with elevated groundwater uranium and more frequent groundwater use have increased cancer incidence. Methods Data sources included: incident total, leukemia, prostate, breast, colorectal, lung, kidney, and bladder cancers (1996–2005, SC Central Cancer Registry); demographic and groundwater use (1990 US Census); and groundwater uranium concentrations (n = 4,600, from existing federal and state databases). Kriging was used to predict average uranium concentrations within tracts. The relationship between uranium and standardized cancer incidence ratios was modeled among tracts with substantial groundwater use via linear or semiparametric regression, with and without stratification by the proportion of African Americans in each area. Results A total of 134,685 cancer cases were evaluated. Tracts with ≥50% groundwater use and uranium concentrations in the upper quartile had increased risks for colorectal, breast, kidney, prostate, and total cancer compared to referent tracts. Some of these relationships were more likely to be observed among tracts populated primarily by African Americans. Conclusion SC regions with elevated groundwater uranium and more groundwater use may have an increased incidence of certain cancers, although additional research is needed since the design precluded adjustment for race or other predictive factors at the individual level. PMID:21080052
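
    The kriging step used to predict tract-level uranium can be illustrated with a minimal ordinary-kriging sketch. The exponential covariance model and its sill and length parameters below are illustrative assumptions; a real analysis would first fit a variogram to the roughly 4,600 uranium measurements.

```python
import numpy as np

def ordinary_kriging(xy, z, xy0, sill=1.0, length=2.0):
    """Ordinary kriging prediction at location xy0 from samples (xy, z),
    using an exponential covariance model (parameters assumed, not fitted)."""
    cov = lambda d: sill * np.exp(-d / length)
    n = len(z)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)
    K = np.ones((n + 1, n + 1))   # augmented system: weights constrained to sum to 1
    K[:n, :n] = cov(d)
    K[n, n] = 0.0
    k = np.append(cov(np.linalg.norm(xy - xy0, axis=1)), 1.0)
    w = np.linalg.solve(K, k)[:n]  # kriging weights
    return float(w @ z)

pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
vals = np.array([2.0, 4.0, 3.0, 5.0])
print(ordinary_kriging(pts, vals, np.array([0.5, 0.5])))
```

    At a sampled location this predictor reproduces the observed value exactly; at the centre of the symmetric square above it returns the mean of the four values.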

  17. The likelihood ratio as a random variable for linked markers in kinship analysis.

    Science.gov (United States)

    Egeland, Thore; Slooten, Klaas

    2016-11-01

    The likelihood ratio is the fundamental quantity that summarizes the evidence in forensic cases. Therefore, it is important to understand the theoretical properties of this statistic. This paper is the last in a series of three, and the first to study linked markers. We show that for all non-inbred pairwise kinship comparisons, the expected likelihood ratio in favor of a type of relatedness depends on the allele frequencies only via the number of alleles, also for linked markers, and also if the true relationship is another one than is tested for by the likelihood ratio. Exact expressions for the expectation and variance are derived for all these cases. Furthermore, we show that the expected likelihood ratio is a non-increasing function if the recombination rate increases between 0 and 0.5 when the actual relationship is the one investigated by the LR. Besides being of theoretical interest, exact expressions such as obtained here can be used for software validation as they allow one to verify correctness up to arbitrary precision. The paper also presents results and advice of practical importance. For example, we argue that the logarithm of the likelihood ratio behaves in a fundamentally different way than the likelihood ratio itself in terms of expectation and variance, in agreement with its interpretation as weight of evidence. Equipped with the results presented and freely available software, one may check calculations and software and also do power calculations.
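
    The contrast the authors draw between the likelihood ratio and its logarithm can be made concrete with a small discrete example: under the defense hypothesis the expected LR is exactly 1, while the expected log-LR equals minus a Kullback-Leibler divergence and is therefore non-positive. The toy evidence distributions below are hypothetical, not forensic genotype probabilities.

```python
import math

# toy discrete evidence space with likelihoods under the two hypotheses
p = {"a": 0.5, "b": 0.3, "c": 0.2}  # P(evidence | Hp)
q = {"a": 0.2, "b": 0.3, "c": 0.5}  # P(evidence | Hd)

lr = {x: p[x] / q[x] for x in p}

# E[LR | Hd] = sum_x q(x) * p(x)/q(x) = sum_x p(x) = 1, exactly
e_lr_under_hd = sum(q[x] * lr[x] for x in p)
# E[log LR | Hd] = -KL(q || p) <= 0: the log behaves as a weight of evidence
e_loglr_under_hd = sum(q[x] * math.log(lr[x]) for x in p)
# E[LR | Hp] = sum_x p(x)^2 / q(x) >= 1
e_lr_under_hp = sum(p[x] * lr[x] for x in p)

print(e_lr_under_hd, e_loglr_under_hd, e_lr_under_hp)
```

    The identity E[LR | Hd] = 1 holds for any pair of distributions, which is one reason the raw LR and its logarithm cannot be summarized by the same moments.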

  18. A Walk on the Wild Side: The Impact of Music on Risk-Taking Likelihood.

    Science.gov (United States)

    Enström, Rickard; Schmaltz, Rodney

    2017-01-01

    From a marketing perspective, there has been substantial interest in the role of risk perception in consumer behavior. Specific 'problem music' like rap and heavy metal has long been associated with delinquent behavior, including violence, drug use, and promiscuous sex. Although individuals' risk preferences have been investigated across a range of decision-making situations, there has been little empirical work demonstrating the direct role music may have on the likelihood of engaging in risky activities. In the exploratory study reported here, we assessed the impact of listening to different styles of music on risk-taking likelihood, measured with a psychometric scale. Risk-taking likelihood was measured across ethical, financial, health and safety, recreational and social domains. By means of a canonical correlation analysis, the multivariate relationship between different music styles and individual risk-taking likelihood across the different domains is discussed. Our results indicate that listening to different types of music does influence risk-taking likelihood, though not in areas of health and safety.

  19. Assessing compatibility of direct detection data: halo-independent global likelihood analyses

    Energy Technology Data Exchange (ETDEWEB)

    Gelmini, Graciela B. [Department of Physics and Astronomy, UCLA,475 Portola Plaza, Los Angeles, CA 90095 (United States); Huh, Ji-Haeng [CERN Theory Division,CH-1211, Geneva 23 (Switzerland); Witte, Samuel J. [Department of Physics and Astronomy, UCLA,475 Portola Plaza, Los Angeles, CA 90095 (United States)

    2016-10-18

    We present two different halo-independent methods to assess the compatibility of several direct dark matter detection data sets for a given dark matter model using a global likelihood consisting of at least one extended likelihood and an arbitrary number of Gaussian or Poisson likelihoods. In the first method we find the global best fit halo function (we prove that it is a unique piecewise constant function with a number of down steps smaller than or equal to a maximum number that we compute) and construct a two-sided pointwise confidence band at any desired confidence level, which can then be compared with those derived from the extended likelihood alone to assess the joint compatibility of the data. In the second method we define a “constrained parameter goodness-of-fit” test statistic, whose p-value we then use to define a “plausibility region” (e.g. where p≥10%). For any halo function not entirely contained within the plausibility region, the level of compatibility of the data is very low (e.g. p<10%). We illustrate these methods by applying them to CDMS-II-Si and SuperCDMS data, assuming dark matter particles with elastic spin-independent isospin-conserving interactions or exothermic spin-independent isospin-violating interactions.
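
    The global likelihood in such an analysis is simply the product of per-experiment likelihoods, maximized jointly over the shared physics parameters. The sketch below combines one Poisson counting term and one Gaussian term for a single hypothetical signal-strength parameter; the experiment entries, exposures and the grid search are illustrative stand-ins, not the CDMS-II-Si/SuperCDMS construction or the halo-function optimization of the paper.

```python
import math

def global_loglike(mu_signal, experiments):
    """Global log-likelihood: sum of per-experiment terms, here one Poisson
    counting term and one Gaussian measurement term (illustrative stand-ins
    for the extended/Poisson/Gaussian likelihoods combined in the paper)."""
    total = 0.0
    for exp in experiments:
        if exp["type"] == "poisson":
            lam = exp["background"] + mu_signal * exp["exposure"]
            n = exp["observed"]
            total += n * math.log(lam) - lam - math.lgamma(n + 1)
        elif exp["type"] == "gaussian":
            pred = mu_signal * exp["exposure"]
            total += -0.5 * ((exp["measured"] - pred) / exp["sigma"]) ** 2
    return total

experiments = [
    {"type": "poisson", "observed": 4, "background": 1.2, "exposure": 2.0},
    {"type": "gaussian", "measured": 1.4, "sigma": 0.5, "exposure": 1.0},
]
# profile over a grid of signal strengths to find the joint best fit
grid = [i / 100 for i in range(1, 300)]
best = max(grid, key=lambda mu: global_loglike(mu, experiments))
print(best)  # joint best-fit signal strength, close to 1.4 for these inputs
```

    In the paper the maximization is also over the (piecewise constant) halo function; this sketch only shows how heterogeneous likelihood types add at the log level.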

  20. Phalangeal bone mineral density predicts incident fractures

    DEFF Research Database (Denmark)

    Friis-Holmberg, Teresa; Brixen, Kim; Rubin, Katrine Hass

    2012-01-01

    This prospective study investigates the use of phalangeal bone mineral density (BMD) in predicting fractures in a cohort (15,542) who underwent a BMD scan. In both women and men, a decrease in BMD was associated with an increased risk of fracture when adjusted for age and prevalent fractures...

  1. Imagination perspective affects ratings of the likelihood of occurrence of autobiographical memories.

    Science.gov (United States)

    Marsh, Benjamin U; Pezdek, Kathy; Lam, Shirley T

    2014-07-01

    Two experiments tested and confirmed the hypothesis that when the phenomenological characteristics of imagined events are more similar to those of related autobiographical memories, the imagined event is more likely to be considered to have occurred. At Time 1 and 2-weeks later, individuals rated the likelihood of occurrence for 20 life events. In Experiment 1, 1-week after Time 1, individuals imagined 3 childhood events from a first-person or third-person perspective. There was a no-imagination control. An increase in likelihood ratings from Time 1 to Time 2 resulted when imagination was from the third-person but not first-person perspective. In Experiment 2, childhood and recent events were imagined from a third- or first-person perspective. A significant interaction resulted. For childhood events, likelihood change scores were greater for third-person than first-person perspective; for recent adult events, likelihood change scores were greater for first-person than third-person perspective, although this latter trend was not significant. Copyright © 2014 Elsevier B.V. All rights reserved.

  2. Use of deterministic sampling for exploring likelihoods in linkage analysis for quantitative traits.

    NARCIS (Netherlands)

    Mackinnon, M.J.; Beek, van der S.; Kinghorn, B.P.

    1996-01-01

    Deterministic sampling was used to numerically evaluate the expected log-likelihood surfaces of QTL-marker linkage models in large pedigrees with simple structures. By calculating the expected values of likelihoods, questions of power of experimental designs, bias in parameter estimates, approximate

  3. Maximum likelihood as a common computational framework in tomotherapy

    International Nuclear Information System (INIS)

    Olivera, G.H.; Shepard, D.M.; Reckwerdt, P.J.; Ruchala, K.; Zachman, J.; Fitchard, E.E.; Mackie, T.R.

    1998-01-01

    Tomotherapy is a dose delivery technique using helical or axial intensity modulated beams. One of the strengths of the tomotherapy concept is that it can incorporate a number of processes into a single piece of equipment. These processes include treatment optimization planning, dose reconstruction and kilovoltage/megavoltage image reconstruction. A common computational technique that could be used for all of these processes would be very appealing. The maximum likelihood estimator, originally developed for emission tomography, can serve as a useful tool in imaging and radiotherapy. We believe that this approach can play an important role in the processes of optimization planning, dose reconstruction and kilovoltage and/or megavoltage image reconstruction. These processes involve computations that require comparable physical methods. They are also based on equivalent assumptions, and they have similar mathematical solutions. As a result, the maximum likelihood approach is able to provide a common framework for all three of these computational problems. We will demonstrate how maximum likelihood methods can be applied to optimization planning, dose reconstruction and megavoltage image reconstruction in tomotherapy. Results for planning optimization, dose reconstruction and megavoltage image reconstruction will be presented. Strengths and weaknesses of the methodology are analysed. Future directions for this work are also suggested. (author)
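
    In emission tomography the maximum likelihood estimator referred to here is usually computed with the multiplicative MLEM update, which preserves positivity and increases the Poisson likelihood at each step. Below is a minimal sketch on a tiny hypothetical 2-detector, 2-voxel system matrix, not the tomotherapy implementation itself.

```python
def mlem(A, y, n_iter=200):
    """MLEM update for a Poisson measurement model y ~ Poisson(A @ x):
    x_j <- x_j * [A^T (y / (A x))]_j / [A^T 1]_j."""
    m, n = len(A), len(A[0])
    x = [1.0] * n  # positive initial estimate
    col_sum = [sum(A[i][j] for i in range(m)) for j in range(n)]
    for _ in range(n_iter):
        proj = [sum(A[i][j] * x[j] for j in range(n)) for i in range(m)]   # forward project
        ratio = [y[i] / proj[i] for i in range(m)]                         # data / model
        back = [sum(A[i][j] * ratio[i] for i in range(m)) for j in range(n)]  # back project
        x = [x[j] * back[j] / col_sum[j] for j in range(n)]
    return x

A = [[0.8, 0.2], [0.3, 0.7]]  # hypothetical 2-detector, 2-voxel system matrix
x_true = [10.0, 5.0]
y = [sum(a * b for a, b in zip(row, x_true)) for row in A]  # noise-free data
print(mlem(A, y))  # converges toward [10, 5]
```

    With noise-free data and an invertible system matrix the iterates converge to the exact intensities; with real Poisson noise one stops early or regularizes.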

  4. Penalized likelihood and multi-objective spatial scans for the detection and inference of irregular clusters

    Directory of Open Access Journals (Sweden)

    Fonseca Carlos M

    2010-10-01

    Full Text Available Abstract Background Irregularly shaped spatial clusters are difficult to delineate. A cluster found by an algorithm often spreads through large portions of the map, impacting its geographical meaning. Penalized likelihood methods for Kulldorff's spatial scan statistics have been used to control the excessive freedom of the shape of clusters. Penalty functions based on cluster geometry and non-connectivity have been proposed recently. Another approach involves the use of a multi-objective algorithm to maximize two objectives: the spatial scan statistics and the geometric penalty function. Results & Discussion We present a novel scan statistic algorithm employing a function based on the graph topology to penalize the presence of under-populated disconnection nodes in candidate clusters, the disconnection nodes cohesion function. A disconnection node is defined as a region within a cluster, such that its removal disconnects the cluster. By applying this function, the most geographically meaningful clusters are sifted through the immense set of possible irregularly shaped candidate cluster solutions. To evaluate the statistical significance of solutions for multi-objective scans, a statistical approach based on the concept of attainment function is used. In this paper we compared different penalized likelihoods employing the geometric and non-connectivity regularity functions and the novel disconnection nodes cohesion function. We also build multi-objective scans using those three functions and compare them with the previous penalized likelihood scans. An application is presented using comprehensive state-wide data for Chagas' disease in puerperal women in Minas Gerais state, Brazil. Conclusions We show that, compared to the other single-objective algorithms, multi-objective scans present better performance, regarding power, sensitivity and positive predicted value. The multi-objective non-connectivity scan is faster and better suited for the
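
    The quantity being penalized is Kulldorff's Poisson log-likelihood ratio for a candidate cluster. A minimal sketch follows, with an illustrative multiplicative geometry penalty; the paper's cohesion function based on disconnection nodes is more elaborate than the simple compactness factor assumed here.

```python
import math

def poisson_llr(c_in, e_in, c_tot, e_tot):
    """Kulldorff's Poisson log-likelihood ratio for a candidate cluster with
    c_in observed / e_in expected cases, out of c_tot observed / e_tot expected."""
    c_out, e_out = c_tot - c_in, e_tot - e_in
    if c_in / e_in <= c_out / e_out:  # only elevated-risk clusters score
        return 0.0
    return c_in * math.log(c_in / e_in) + c_out * math.log(c_out / e_out)

def penalized_llr(c_in, e_in, c_tot, e_tot, compactness, alpha=1.0):
    """Penalized scan statistic: the LLR multiplied by a geometric regularity
    term in [0, 1] (an illustrative assumption, not the paper's exact penalty)."""
    return poisson_llr(c_in, e_in, c_tot, e_tot) * compactness ** alpha

# hypothetical candidate cluster: 40 cases where 20 were expected, 200 overall
print(poisson_llr(40, 20, 200, 200))          # unpenalized scan statistic
print(penalized_llr(40, 20, 200, 200, 0.6))   # same cluster, compactness 0.6
```

    A sprawling, irregular candidate cluster gets a low compactness value, so its penalized score drops even when its raw LLR is high; that is the mechanism the penalty functions exploit.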

  5. Lightning incidents in Mongolia

    Directory of Open Access Journals (Sweden)

    Myagmar Doljinsuren

    2015-11-01

    Full Text Available This is one of the first studies conducted in Mongolia on the distribution of lightning incidents. The study covers a 10-year period from 2004 to 2013. The country records a human death rate of 15.4 deaths per 10 million people per year, which is much higher than that of many countries with a similar isokeraunic level. The reason may be the low-grown vegetation observed in most rural areas of Mongolia, a surface topography typical of steppe climate. We suggest modifications to the Gomes–Kadir equation for such countries, as it predicts a much lower annual death rate for Mongolia. The lightning incidents spread over the period from May to August, with the peak number of incidents occurring in July. The worst-affected region in the country is the central part. Compared with the impacts of other convective disasters such as squalls, thunderstorms and hail, lightning stands second highest in the number of incidents, human deaths and animal deaths. Economic losses due to lightning are only about 1% of the total losses due to the four extreme weather phenomena. However, unless precautionary measures are promoted among the public, this figure may increase significantly with time, as the country is undergoing rapid industrialization at present.

  6. Caching and interpolated likelihoods: accelerating cosmological Monte Carlo Markov chains

    Energy Technology Data Exchange (ETDEWEB)

    Bouland, Adam; Easther, Richard; Rosenfeld, Katherine, E-mail: adam.bouland@aya.yale.edu, E-mail: richard.easther@yale.edu, E-mail: krosenfeld@cfa.harvard.edu [Department of Physics, Yale University, New Haven CT 06520 (United States)

    2011-05-01

    We describe a novel approach to accelerating Monte Carlo Markov Chains. Our focus is cosmological parameter estimation, but the algorithm is applicable to any problem for which the likelihood surface is a smooth function of the free parameters and computationally expensive to evaluate. We generate a high-order interpolating polynomial for the log-likelihood using the first points gathered by the Markov chains as a training set. This polynomial then accurately computes the majority of the likelihoods needed in the latter parts of the chains. We implement a simple version of this algorithm as a patch (InterpMC) to CosmoMC and show that it accelerates parameter estimation by a factor of between two and four for well-converged chains. The current code is primarily intended as a "proof of concept", and we argue that there is considerable room for further performance gains. Unlike other approaches to accelerating parameter fits, we make no use of precomputed training sets or special choices of variables, and InterpMC is almost entirely transparent to the user.
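
    The idea admits a very small illustration: fit a polynomial surrogate to early (parameter, log-likelihood) pairs, then evaluate the surrogate instead of the expensive likelihood. The one-dimensional Gaussian log-likelihood below is a stand-in for a real cosmological likelihood; InterpMC itself works in many dimensions and cross-checks the surrogate against exact evaluations.

```python
import numpy as np

def expensive_loglike(theta):
    # stand-in for a costly likelihood evaluation (smooth in theta)
    return -0.5 * (theta - 1.3) ** 2 / 0.2 ** 2

# training set: the first points gathered by the chain
train_x = np.linspace(0.5, 2.1, 9)
train_y = np.array([expensive_loglike(t) for t in train_x])

# high-order interpolating polynomial for the log-likelihood
coeffs = np.polyfit(train_x, train_y, deg=4)
cheap_loglike = np.poly1d(coeffs)

theta = 1.05  # a later chain point: evaluate the cheap surrogate instead
print(cheap_loglike(theta), expensive_loglike(theta))
```

    Because the stand-in log-likelihood is exactly quadratic, the degree-4 fit reproduces it essentially to machine precision; for a real likelihood the surrogate error must be monitored, which is what InterpMC's validation step does.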

  7. Caching and interpolated likelihoods: accelerating cosmological Monte Carlo Markov chains

    International Nuclear Information System (INIS)

    Bouland, Adam; Easther, Richard; Rosenfeld, Katherine

    2011-01-01

    We describe a novel approach to accelerating Monte Carlo Markov Chains. Our focus is cosmological parameter estimation, but the algorithm is applicable to any problem for which the likelihood surface is a smooth function of the free parameters and computationally expensive to evaluate. We generate a high-order interpolating polynomial for the log-likelihood using the first points gathered by the Markov chains as a training set. This polynomial then accurately computes the majority of the likelihoods needed in the latter parts of the chains. We implement a simple version of this algorithm as a patch (InterpMC) to CosmoMC and show that it accelerates parameter estimation by a factor of between two and four for well-converged chains. The current code is primarily intended as a "proof of concept", and we argue that there is considerable room for further performance gains. Unlike other approaches to accelerating parameter fits, we make no use of precomputed training sets or special choices of variables, and InterpMC is almost entirely transparent to the user

  8. A maximum pseudo-likelihood approach for estimating species trees under the coalescent model

    Directory of Open Access Journals (Sweden)

    Edwards Scott V

    2010-10-01

    Full Text Available Abstract Background Several phylogenetic approaches have been developed to estimate species trees from collections of gene trees. However, maximum likelihood approaches for estimating species trees under the coalescent model are limited. Although the likelihood of a species tree under the multispecies coalescent model has already been derived by Rannala and Yang, it can be shown that the maximum likelihood estimate (MLE) of the species tree (topology, branch lengths, and population sizes) from gene trees under this formula does not exist. In this paper, we develop a pseudo-likelihood function of the species tree to obtain maximum pseudo-likelihood estimates (MPE) of species trees, with branch lengths of the species tree in coalescent units. Results We show that the MPE of the species tree is statistically consistent as the number M of genes goes to infinity. In addition, the probability that the MPE of the species tree matches the true species tree converges to 1 at rate O(M^-1). The simulation results confirm that the maximum pseudo-likelihood approach is statistically consistent even when the species tree is in the anomaly zone. We applied our method, Maximum Pseudo-likelihood for Estimating Species Trees (MP-EST), to a mammal dataset. The four major clades found in the MP-EST tree are consistent with those in the Bayesian concatenation tree. The bootstrap supports for the species tree estimated by the MP-EST method are more reasonable than the posterior probability supports given by the Bayesian concatenation method in reflecting the level of uncertainty in gene trees and controversies over the relationship of four major groups of placental mammals. Conclusions MP-EST can consistently estimate the topology and branch lengths (in coalescent units) of the species tree. Although the pseudo-likelihood is derived from coalescent theory, and assumes no gene flow or horizontal gene transfer (HGT), the MP-EST method is robust to a small amount of HGT in the dataset.

  9. A practical approach to incident prevention and mitigation

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Kevin; Williams, Pat [KBC Advanced Technologies, Surrey (United Kingdom)

    2012-07-01

    Our industry has taken great interest in improving safety in the last few decades, particularly in our process operations. This has resulted in significant improvements in overall industry safety statistics. Despite these efforts, incidents still occur in which people are injured, and these tragic incidents may even be fatal. Organizations have implemented various programs to lessen the chance of these incidents occurring, most commonly of three kinds: check the box, minimize legal liability, and take a practical approach. Most often it is the practical approach that produces the greatest improvement, because it is an approach focused on the employees and the organization's business needs together. The safest plants are, to no surprise, run by reliable individuals. Usually, the causes of incidents stem from a failure to perform and maintain basic procedures. The answers to each plant's safety dilemmas are not found in any one program; instead they lie in understanding the anatomy of what it means to be safe. Only when that is understood can a solution be constructed and catered to the entire physiology of the problem. There are three basic tenets that need to be considered in any safety improvement strategy for it to be effective: capability, awareness and motivation. The third is further comprised of two factors that should not be overlooked: desire and accountability. Process safety is therefore not driven by fancy software or rigidly structured programs. It is apparent that several factors come into play when implementing safer practices. The focus of these practices should be on manufacturing employees. When improvement efforts are focused on activities and behaviors whose implementation is practical in a plant environment and address the three main areas of the anatomy, the likelihood of success increases substantially. (author)

  10. Multi-step polynomial regression method to model and forecast malaria incidence.

    Directory of Open Access Journals (Sweden)

    Chandrajit Chatterjee

    Full Text Available Malaria is one of the most severe problems faced by the world even today. Understanding causative factors such as age, sex, social factors and environmental variability, as well as the underlying transmission dynamics of the disease, is important for epidemiological research on malaria and its eradication. The development of a suitable modeling approach and methodology, based on the available data on disease incidence and other related factors, is therefore of utmost importance. In this study, we developed a simple non-linear regression methodology for modeling and forecasting malaria incidence in Chennai city, India, and predicted future disease incidence with a high confidence level. We considered three types of data to develop the regression methodology: a longer time series of Slide Positivity Rates (SPR) of malaria; a shorter time series (one year of deaths due to Plasmodium vivax); and spatial data (zonal distribution of P. vivax deaths) for the city, along with climatic factors, population and previous incidence of the disease. We performed variable selection by a simple correlation study, identified initial relationships between variables through non-linear curve fitting, and used multi-step methods for introducing variables into the non-linear regression analysis, along with Gauss-Markov models and ANOVA for testing the predictions, validating them and constructing confidence intervals. The results demonstrate the applicability of our method to different types of data and the autoregressive nature of the forecasting, and show high prediction power for both SPR and P. vivax deaths, where the one-lag SPR values play an influential role and prove useful for better prediction. Different climatic factors are identified as playing a crucial role in shaping the disease curve. Further, disease incidence at the zonal level and the effect of causative factors on different zonal clusters indicate the pattern of malaria prevalence in the city.
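
    The one-lag autoregressive flavour of the method can be sketched as a polynomial regression of this month's incidence on last month's, with a residual-based band around the forecast. The SPR series below is hypothetical, and the quadratic degree is an illustrative choice rather than the paper's selected model.

```python
import numpy as np

# hypothetical monthly Slide Positivity Rates (the study uses Chennai data)
spr = np.array([2.1, 2.4, 3.0, 3.8, 4.1, 3.6, 3.0, 2.5, 2.2, 2.6, 3.2, 3.9])

# one-lag autoregressive polynomial: SPR_t ~ poly(SPR_{t-1})
x, y = spr[:-1], spr[1:]
coeffs = np.polyfit(x, y, deg=2)        # quadratic in the lagged value
forecast = np.polyval(coeffs, spr[-1])  # one-step-ahead forecast

residuals = y - np.polyval(coeffs, x)
se = residuals.std(ddof=3)              # residual spread (3 fitted coefficients)
print(f"next-month SPR forecast: {forecast:.2f} +/- {1.96 * se:.2f}")
```

    Feeding each forecast back in as the next lagged value gives multi-step forecasts, which is the autoregressive behaviour the abstract refers to; the interval here is a naive residual band, not the paper's ANOVA-based construction.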

  11. Influence of social relationship domains and their combinations on incident dementia: a prospective cohort study.

    Science.gov (United States)

    Saito, Tami; Murata, Chiyoe; Saito, Masashige; Takeda, Tokunori; Kondo, Katsunori

    2018-01-01

    Social relationships consist of mutually related but distinct dimensions. It remains unclear how these domains independently contribute to incident dementia. This large-scale, prospective cohort study examines associations between the social relationship domains as well as their combinations and incident dementia among community-dwelling older adults. We analysed data from 13 984 community-dwelling adults aged 65+ without long-term care needs living in Aichi prefecture in Japan. Incident dementia was assessed based on the Long-term Care Insurance records, followed for 3436 days from the baseline survey conducted in 2003. Three social relationship domains (social support, social networks and social activities) were further divided into a total of eight subdomains. A social relationship diversity score was calculated using the social relationship domains which were significantly related to incident dementia. A Cox proportional hazards model showed that being married, exchanging support with family members, having contact with friends, participating in community groups and engaging in paid work were related to a lower likelihood of developing incident dementia, controlling for covariates and other social relationship domains. The diversity scores, ranging from 0 to 5, were linearly associated with incident dementia. This study identified social relationship subdomains that were negatively related to incident dementia, suggesting that dementia may potentially be prevented by enhancing these social relationships. Future studies should examine independent pathways between each social relationship domain and incident dementia. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  12. Similarities in the Age-Specific Incidence of Colon and Testicular Cancers.

    Science.gov (United States)

    Soto-Ortiz, Luis; Brody, James P

    2013-01-01

    Colon cancers are thought to be an inevitable result of aging, while testicular cancers are thought to develop in only a small fraction of men, beginning in utero. These models of carcinogenesis are, in part, based upon age-specific incidence data. The specific incidence for colon cancer appears to monotonically increase with age, while that of testicular cancer increases to a maximum value at about 35 years of age, then declines to nearly zero by the age of 80. We hypothesized that the age-specific incidence for these two cancers is similar; the apparent difference is caused by a longer development time for colon cancer and the lack of age-specific incidence data for people over 84 years of age. Here we show that a single distribution can describe the age-specific incidence of both colon carcinoma and testicular cancer. Furthermore, this distribution predicts that the specific incidence of colon cancer should reach a maximum at about age 90 and then decrease. Data on the incidence of colon carcinoma for women aged 85-99, acquired from SEER and the US Census, is consistent with this prediction. We conclude that the age specific data for testicular cancers and colon cancers is similar, suggesting that the underlying process leading to the development of these two forms of cancer may be similar.

  13. Similarities in the Age-Specific Incidence of Colon and Testicular Cancers.

    Directory of Open Access Journals (Sweden)

    Luis Soto-Ortiz

    Colon cancers are thought to be an inevitable result of aging, while testicular cancers are thought to develop in only a small fraction of men, beginning in utero. These models of carcinogenesis are, in part, based upon age-specific incidence data. The specific incidence for colon cancer appears to monotonically increase with age, while that of testicular cancer increases to a maximum value at about 35 years of age, then declines to nearly zero by the age of 80. We hypothesized that the age-specific incidence for these two cancers is similar; the apparent difference is caused by a longer development time for colon cancer and the lack of age-specific incidence data for people over 84 years of age. Here we show that a single distribution can describe the age-specific incidence of both colon carcinoma and testicular cancer. Furthermore, this distribution predicts that the specific incidence of colon cancer should reach a maximum at about age 90 and then decrease. Data on the incidence of colon carcinoma for women aged 85-99, acquired from SEER and the US Census, is consistent with this prediction. We conclude that the age specific data for testicular cancers and colon cancers is similar, suggesting that the underlying process leading to the development of these two forms of cancer may be similar.

  14. Cases in which ancestral maximum likelihood will be confusingly misleading.

    Science.gov (United States)

    Handelman, Tomer; Chor, Benny

    2017-05-07

    Ancestral maximum likelihood (AML) is a phylogenetic tree reconstruction criterion that "lies between" maximum parsimony (MP) and maximum likelihood (ML). ML has long been known to be statistically consistent. On the other hand, Felsenstein (1978) showed that MP is statistically inconsistent, and even positively misleading: There are cases where the parsimony criterion, applied to data generated according to one tree topology, will be optimized on a different tree topology. The question of whether AML is statistically consistent or not has been open for a long time. Mossel et al. (2009) have shown that AML can "shrink" short tree edges, resulting in a star tree with no internal resolution, which yields a better AML score than the original (resolved) model. This result implies that AML is statistically inconsistent, but not that it is positively misleading, because the star tree is compatible with any other topology. We show that AML is confusingly misleading: For some simple, four-taxa (resolved) tree, the ancestral likelihood optimization criterion is maximized on an incorrect (resolved) tree topology, as well as on a star tree (both with specific edge lengths), while the tree with the original, correct topology has strictly lower ancestral likelihood. Interestingly, the two short edges in the incorrect, resolved tree topology are of length zero, and are not adjacent, so this resolved tree is in fact a simple path. While for MP, the underlying phenomenon can be described as long edge attraction, it turns out that here we have long edge repulsion. Copyright © 2017. Published by Elsevier Ltd.

  15. Likelihood inference for a nonstationary fractional autoregressive model

    DEFF Research Database (Denmark)

    Johansen, Søren; Ørregård Nielsen, Morten

    2010-01-01

    This paper discusses model-based inference in an autoregressive model for fractional processes which allows the process to be fractional of order d or d-b. Fractional differencing involves infinitely many past values and because we are interested in nonstationary processes we model the data X_{1},...,X_{T} given the initial values X_{-n}, n=0,1,..., as is usually done. The initial values are not modeled but assumed to be bounded. This represents a considerable generalization relative to all previous work where it is assumed that initial values are zero. For the statistical analysis we assume the conditional Gaussian likelihood and for the probability analysis we also condition on initial values but assume that the errors in the autoregressive model are i.i.d. with suitable moment conditions. We analyze the conditional likelihood and its derivatives as stochastic processes in the parameters, including...

  16. Generalized linear models with random effects unified analysis via H-likelihood

    CERN Document Server

    Lee, Youngjo; Pawitan, Yudi

    2006-01-01

    Since their introduction in 1972, generalized linear models (GLMs) have proven useful in the generalization of classical normal models. Presenting methods for fitting GLMs with random effects to data, Generalized Linear Models with Random Effects: Unified Analysis via H-likelihood explores a wide range of applications, including combining information over trials (meta-analysis), analysis of frailty models for survival data, genetic epidemiology, and analysis of spatial and temporal models with correlated errors.Written by pioneering authorities in the field, this reference provides an introduction to various theories and examines likelihood inference and GLMs. The authors show how to extend the class of GLMs while retaining as much simplicity as possible. By maximizing and deriving other quantities from h-likelihood, they also demonstrate how to use a single algorithm for all members of the class, resulting in a faster algorithm as compared to existing alternatives. Complementing theory with examples, many of...

  17. SU-E-T-511: Inter-Rater Variability in Classification of Incidents in a New Incident Reporting System

    International Nuclear Information System (INIS)

    Pappas, D; Reis, S; Ali, A; Kapur, A

    2015-01-01

    Purpose: To determine how consistent the results of different raters are when reviewing the same cases within the Radiation Oncology Incident Learning System (ROILS). Methods: Three second-year medical physics graduate students filled out incident reports in spreadsheets set up to mimic ROILS. All students studied the same 33 cases and independently entered their assessments, for a total of 99 reviewed cases. The narratives for these cases were obtained from a published International Commission on Radiological Protection (ICRP) report, which included shorter narratives selected from the Radiation Oncology Safety Information System (ROSIS) database. Each category of questions was reviewed to see how consistent the results were by utilizing free-marginal multirater kappa analysis. The percentage of cases where all raters shared full agreement or full disagreement was recorded to show which questions were answered consistently by multiple raters for a given case. The consistency among the raters was analyzed between ICRP and ROSIS cases to see if either group led to more reliable results. Results: The categories where all raters agreed 100 percent in their choices were the event type (93.94 percent of cases, 0.946 kappa) and the likelihood of the event being harmful to the patient (42.42 percent of cases, 0.409 kappa). The categories where all raters disagreed 100 percent in their choices were the dosimetric severity scale (39.39 percent of cases, 0.139 kappa) and the potential future toxicity (48.48 percent of cases, 0.205 kappa). ROSIS had more cases where all raters disagreed than ICRP (23.06 percent of cases compared to 15.58 percent, respectively). Conclusion: Despite reviewing the same cases, the results among the three raters varied widely. ROSIS narratives were shorter than ICRP narratives, which suggests that longer narratives lead to more consistent results. This study shows that the incident reporting system can be optimized to yield more consistent results.

  18. SU-E-T-511: Inter-Rater Variability in Classification of Incidents in a New Incident Reporting System

    Energy Technology Data Exchange (ETDEWEB)

    Pappas, D; Reis, S; Ali, A [Hofstra University, Hempstead, NY (United States); Kapur, A [Long Island Jewish Medical Center, New Hyde Park, NY (United States)

    2015-06-15

    Purpose: To determine how consistent the results of different raters are when reviewing the same cases within the Radiation Oncology Incident Learning System (ROILS). Methods: Three second-year medical physics graduate students filled out incident reports in spreadsheets set up to mimic ROILS. All students studied the same 33 cases and independently entered their assessments, for a total of 99 reviewed cases. The narratives for these cases were obtained from a published International Commission on Radiological Protection (ICRP) report, which included shorter narratives selected from the Radiation Oncology Safety Information System (ROSIS) database. Each category of questions was reviewed to see how consistent the results were by utilizing free-marginal multirater kappa analysis. The percentage of cases where all raters shared full agreement or full disagreement was recorded to show which questions were answered consistently by multiple raters for a given case. The consistency among the raters was analyzed between ICRP and ROSIS cases to see if either group led to more reliable results. Results: The categories where all raters agreed 100 percent in their choices were the event type (93.94 percent of cases, 0.946 kappa) and the likelihood of the event being harmful to the patient (42.42 percent of cases, 0.409 kappa). The categories where all raters disagreed 100 percent in their choices were the dosimetric severity scale (39.39 percent of cases, 0.139 kappa) and the potential future toxicity (48.48 percent of cases, 0.205 kappa). ROSIS had more cases where all raters disagreed than ICRP (23.06 percent of cases compared to 15.58 percent, respectively). Conclusion: Despite reviewing the same cases, the results among the three raters varied widely. ROSIS narratives were shorter than ICRP narratives, which suggests that longer narratives lead to more consistent results. This study shows that the incident reporting system can be optimized to yield more consistent results.
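
    The free-marginal multirater kappa used in this abstract has a simple closed form: observed agreement is averaged over cases, and chance agreement is fixed at 1/k for k answer categories. A minimal sketch:

    ```python
    def free_marginal_kappa(ratings, k):
        """Free-marginal multirater kappa.

        ratings: one list of category labels per case (one label per rater)
        k: number of possible answer categories
        """
        n = len(ratings[0])  # raters per case
        po = 0.0
        for case in ratings:
            counts = {}
            for label in case:
                counts[label] = counts.get(label, 0) + 1
            # pairwise agreement within this case
            po += sum(c * (c - 1) for c in counts.values()) / (n * (n - 1))
        po /= len(ratings)   # observed agreement, averaged over cases
        pe = 1.0 / k         # chance agreement under free marginals
        return (po - pe) / (1 - pe)
    ```

    Full agreement on every case gives kappa = 1; three raters each choosing a different one of three categories gives kappa = -0.5, which is why the dosimetric-severity and future-toxicity kappas in the abstract sit so low.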

  19. Design of simplified maximum-likelihood receivers for multiuser CPM systems.

    Science.gov (United States)

    Bing, Li; Bai, Baoming

    2014-01-01

    A class of simplified maximum-likelihood receivers designed for continuous phase modulation based multiuser systems is proposed. The presented receiver is built upon a front end employing mismatched filters and a maximum-likelihood detector defined in a low-dimensional signal space. The performance of the proposed receivers is analyzed and compared to some existing receivers. Some schemes are designed to implement the proposed receivers and to reveal the roles of different system parameters. Analysis and numerical results show that the proposed receivers can approach the optimum multiuser receivers with significantly (even exponentially in some cases) reduced complexity and marginal performance degradation.
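
    The abstract's receiver is domain-specific, but the core principle it simplifies is generic: under additive white Gaussian noise, maximum-likelihood detection over a finite candidate set reduces to choosing the candidate closest in Euclidean distance to the received vector. A minimal sketch of that principle (not the paper's CPM receiver):

    ```python
    def ml_detect(received, candidates):
        """ML detection under AWGN: pick the candidate signal vector with
        minimum squared Euclidean distance to the received vector."""
        def dist2(a, b):
            return sum((x - y) ** 2 for x, y in zip(a, b))
        return min(candidates, key=lambda c: dist2(received, c))
    ```

    The complexity reduction in the paper comes from shrinking the dimension of the space in which this distance is computed, via mismatched front-end filters.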

  20. Incidence of Induced Abortion and Post-Abortion Care in Tanzania.

    Science.gov (United States)

    Keogh, Sarah C; Kimaro, Godfather; Muganyizi, Projestine; Philbin, Jesse; Kahwa, Amos; Ngadaya, Esther; Bankole, Akinrinola

    2015-01-01

    Tanzania has one of the highest maternal mortality ratios in the world, and unsafe abortion is one of its leading causes. Yet little is known about its incidence. To provide the first ever estimates of the incidence of unsafe abortion in Tanzania, at the national level and for each of the 8 geopolitical zones (7 in Mainland plus Zanzibar). A nationally representative survey of health facilities was conducted to determine the number of induced abortion complications treated in facilities. A survey of experts on abortion was conducted to estimate the likelihood of women experiencing complications and obtaining treatment. These surveys were complemented with population and fertility data to obtain abortion numbers, rates and ratios, using the Abortion Incidence Complications Methodology. In Tanzania, women obtained just over 405,000 induced abortions in 2013, for a national rate of 36 abortions per 1,000 women age 15-49 and a ratio of 21 abortions per 100 live births. For each woman treated in a facility for induced abortion complications, 6 times as many women had an abortion but did not receive care. Abortion rates vary widely by zone, from 10.7 in Zanzibar to 50.7 in the Lake zone. The abortion rate is similar to that of other countries in the region. Variations by zone are explained mainly by differences in fertility and contraceptive prevalence. Measures to reduce the incidence of unsafe abortion and associated maternal mortality include expanding access to post-abortion care and contraceptive services to prevent unintended pregnancies.
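
    The arithmetic behind the Abortion Incidence Complications Methodology can be sketched as follows. This is a simplification (the full method also adjusts for complications of spontaneous abortion), and the function and parameter names are illustrative:

    ```python
    def aicm_estimate(treated, untreated_per_treated, women_15_49, live_births):
        """Sketch of AICM arithmetic: total abortions are the women treated in
        facilities for complications plus the estimated number who aborted but
        received no care (here, untreated_per_treated for each treated case)."""
        total = treated * (1 + untreated_per_treated)
        rate_per_1000 = 1000.0 * total / women_15_49
        ratio_per_100_births = 100.0 * total / live_births
        return total, rate_per_1000, ratio_per_100_births
    ```

    With the abstract's multiplier (six untreated abortions per treated case), 100 treated cases in a zone of 10,000 women of reproductive age and 2,000 live births would imply 700 abortions, a rate of 70 per 1,000 women and a ratio of 35 per 100 births.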

  1. A Walk on the Wild Side: The Impact of Music on Risk-Taking Likelihood

    Science.gov (United States)

    Enström, Rickard; Schmaltz, Rodney

    2017-01-01

    From a marketing perspective, there has been substantial interest in the role of risk-perception on consumer behavior. Specific ‘problem music’ like rap and heavy metal has long been associated with delinquent behavior, including violence, drug use, and promiscuous sex. Although individuals’ risk preferences have been investigated across a range of decision-making situations, there has been little empirical work demonstrating the direct role music may have on the likelihood of engaging in risky activities. In the exploratory study reported here, we assessed the impact of listening to different styles of music while assessing risk-taking likelihood through a psychometric scale. Risk-taking likelihood was measured across ethical, financial, health and safety, recreational and social domains. Through the means of a canonical correlation analysis, the multivariate relationship between different music styles and individual risk-taking likelihood across the different domains is discussed. Our results indicate that listening to different types of music does influence risk-taking likelihood, though not in areas of health and safety. PMID:28539908

  2. A Walk on the Wild Side: The Impact of Music on Risk-Taking Likelihood

    Directory of Open Access Journals (Sweden)

    Rickard Enström

    2017-05-01

    From a marketing perspective, there has been substantial interest in the role of risk-perception on consumer behavior. Specific ‘problem music’ like rap and heavy metal has long been associated with delinquent behavior, including violence, drug use, and promiscuous sex. Although individuals’ risk preferences have been investigated across a range of decision-making situations, there has been little empirical work demonstrating the direct role music may have on the likelihood of engaging in risky activities. In the exploratory study reported here, we assessed the impact of listening to different styles of music while assessing risk-taking likelihood through a psychometric scale. Risk-taking likelihood was measured across ethical, financial, health and safety, recreational and social domains. Through the means of a canonical correlation analysis, the multivariate relationship between different music styles and individual risk-taking likelihood across the different domains is discussed. Our results indicate that listening to different types of music does influence risk-taking likelihood, though not in areas of health and safety.

  3. Current incidence of duplicate publication in otolaryngology.

    Science.gov (United States)

    Cheung, Veronique Wan Fook; Lam, Gilbert O A; Wang, Yun Fan; Chadha, Neil K

    2014-03-01

    Duplicate publication--deemed highly unethical--is the reproduction of substantial content in another article by the same authors. In 1999, Rosenthal et al. identified an 8.5% incidence of duplicate articles in two otolaryngology journals. We explored the current incidence in three otolaryngology journals in North America and Europe. Retrospective literature review. Index articles in 2008 in Archives of Otolaryngology-Head and Neck Surgery, Laryngoscope, and Clinical Otolaryngology were searched using MEDLINE. Potential duplicate publications in 2006 through 2010 were identified using the first, second, and last authors' names. Three authors independently investigated suspected duplicate publications--classifying them by degree of duplication. Of 358 index articles screened, 75 (20.9%) had 119 potential duplicates from 2006 to 2010. Full review of these 119 potential duplicates revealed a total of 40 articles with some form of redundancy (33.6% of the potential duplicates) involving 27 index articles (7.5% of 358 index articles): one (0.8%) "dual" publication (identical or nearly identical data and conclusions to the index article); three (2.5%) "suspected" dual publications (less than 50% new data and same conclusions); and 36 (30.3%) publications with "salami-slicing" (a portion of the index article data repeated). Further analysis compared the likelihood of duplicate publication by study source and subspecialty within otolaryngology. The incidence of duplicate publication has not significantly changed over 10 years. "Salami-slicing" was a concerning practice, with no cross-referencing in 61% of these cases. Detecting and eliminating redundant publications is a laborious task, but it is essential in upholding journal quality and research integrity. © 2013 The American Laryngological, Rhinological and Otological Society, Inc.

  4. Higher levels of albuminuria within the normal range predict incident hypertension.

    Science.gov (United States)

    Forman, John P; Fisher, Naomi D L; Schopick, Emily L; Curhan, Gary C

    2008-10-01

    Higher levels of albumin excretion within the normal range are associated with cardiovascular disease in high-risk individuals. Whether incremental increases in urinary albumin excretion, even within the normal range, are associated with the development of hypertension in low-risk individuals is unknown. This study included 1065 postmenopausal women from the first Nurses' Health Study and 1114 premenopausal women from the second Nurses' Health Study whose albumin/creatinine ratio was within the normal range and who did not have diabetes or hypertension. Among the older women, 271 incident cases of hypertension occurred during 4 yr of follow-up, and among the younger women, 296 incident cases of hypertension occurred during 8 yr of follow-up. Cox proportional hazards regression was used to examine prospectively the association between the albumin/creatinine ratio and incident hypertension after adjustment for age, body mass index, estimated GFR, baseline BP, physical activity, smoking, and family history of hypertension. Participants who had an albumin/creatinine ratio in the highest quartile (4.34 to 24.17 mg/g for older women and 3.68 to 23.84 mg/g for younger women) were more likely to develop hypertension than those who had an albumin/creatinine ratio in the lowest quartile (hazard ratio 1.76 [95% confidence interval 1.21 to 2.56] and hazard ratio 1.35 [95% confidence interval 0.97 to 1.91] for older and younger women, respectively). Higher albumin/creatinine ratios, even within the normal range, are independently associated with increased risk for development of hypertension among women without diabetes. The definition of normal albumin excretion should be reevaluated.
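
    Cox proportional hazards regression, as used here, maximizes a partial likelihood built from the risk set at each event time. A bare-bones sketch for a single covariate (no tied event times, crude grid search; real analyses use Newton-Raphson via packages such as R's survival or Python's lifelines):

    ```python
    import math

    def cox_partial_loglik(beta, times, events, x):
        """Cox partial log-likelihood for one covariate x.
        times: follow-up times; events: 1 if the event occurred, 0 if censored.
        Assumes no tied event times (ties need Breslow or Efron corrections)."""
        ll = 0.0
        for i, t in enumerate(times):
            if not events[i]:
                continue
            # risk set: everyone still under observation at time t
            risk = sum(math.exp(beta * x[j])
                       for j in range(len(times)) if times[j] >= t)
            ll += beta * x[i] - math.log(risk)
        return ll

    def fit_beta(times, events, x):
        """Crude grid search for the maximizing log hazard ratio."""
        return max((b / 100.0 for b in range(-300, 301)),
                   key=lambda b: cox_partial_loglik(b, times, events, x))
    ```

    A positive fitted beta means exp(beta) > 1, i.e. the exposed group fails sooner, which is how hazard ratios like the 1.76 reported above arise.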

  5. Parametric Roll Resonance Detection using Phase Correlation and Log-likelihood Testing Techniques

    DEFF Research Database (Denmark)

    Galeazzi, Roberto; Blanke, Mogens; Poulsen, Niels Kjølstad

    2009-01-01

    Real-time detection of parametric roll is still an open issue that is gathering increasing attention. First generation warning systems, based on guidelines and polar diagrams, showed their potential to face issues like long-term prediction and risk assessment. This paper presents a second generation warning system, the purpose of which is to provide the master with an onboard system able to trigger an alarm when parametric roll is likely to happen within the immediate future. A detection scheme is introduced, which is able to issue a warning within five roll periods after a resonant motion started. After having determined statistical properties of the signals at hand, a detector based on the generalised log-likelihood ratio test (GLRT) is designed to look for variation in signal power. The ability of the detector to trigger alarms when parametric roll is going to onset is evaluated on two...
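
    A GLRT for a change in signal power can be illustrated in its simplest form: testing whether the variance of a zero-mean Gaussian window has grown beyond a nominal level. This is a sketch of the principle, not the paper's detector:

    ```python
    import math

    def glrt_variance(window, sigma0):
        """Generalised log-likelihood ratio for a zero-mean Gaussian window.
        H0: variance equals sigma0**2; H1: variance unknown (replaced by its
        MLE, the mean square of the window). Large values indicate a change."""
        n = len(window)
        s2 = sum(x * x for x in window) / n   # MLE of the variance under H1
        r = s2 / (sigma0 ** 2)
        return 0.5 * n * (r - math.log(r) - 1.0)
    ```

    A detector of this kind slides the window along the roll signal and raises an alarm when the statistic crosses a threshold chosen for an acceptable false-alarm rate.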

  6. Number of Siblings During Childhood and the Likelihood of Divorce in Adulthood.

    Science.gov (United States)

    Bobbitt-Zeher, Donna; Downey, Douglas B; Merry, Joseph

    2016-11-01

    Despite fertility decline across economically developed countries, relatively little is known about the social consequences of children being raised with fewer siblings. Much research suggests that growing up with fewer siblings is probably positive, as children tend to do better in school when sibship size is small. Less scholarship, however, has explored how growing up with few siblings influences children's ability to get along with peers and develop long-term meaningful relationships. If siblings serve as important social practice partners during childhood, individuals with few or no siblings may struggle to develop successful social lives later in adulthood. With data from the General Social Surveys 1972-2012, we explore this possibility by testing whether sibship size during childhood predicts the probability of divorce in adulthood. We find that, among those who ever marry, each additional sibling is associated with a three percent decline in the likelihood of divorce, net of covariates.

  7. Likelihood-based inference for clustered line transect data

    DEFF Research Database (Denmark)

    Waagepetersen, Rasmus Plenge; Schweder, Tore

    The uncertainty in estimation of spatial animal density from line transect surveys depends on the degree of spatial clustering in the animal population. To quantify the clustering we model line transect data as independent thinnings of spatial shot-noise Cox processes. Likelihood-based inference...

  8. Likelihood-based inference for clustered line transect data

    DEFF Research Database (Denmark)

    Waagepetersen, Rasmus; Schweder, Tore

    2006-01-01

    The uncertainty in estimation of spatial animal density from line transect surveys depends on the degree of spatial clustering in the animal population. To quantify the clustering we model line transect data as independent thinnings of spatial shot-noise Cox processes. Likelihood-based inference...

  9. Predicting Risk-Mitigating Behaviors From Indecisiveness and Trait Anxiety

    DEFF Research Database (Denmark)

    Mcneill, Ilona M.; Dunlop, Patrick D.; Skinner, Timothy C.

    2016-01-01

    Past research suggests that indecisiveness and trait anxiety may both decrease the likelihood of performing risk-mitigating preparatory behaviors (e.g., preparing for natural hazards) and suggests two cognitive processes (perceived control and worrying) as potential mediators. However, no single...... control over wildfire-related outcomes. Trait anxiety did not uniquely predict preparedness or perceived control, but it did uniquely predict worry, with higher trait anxiety predicting more worrying. Also, worry trended toward uniquely predicting preparedness, albeit in an unpredicted positive direction...

  10. Fast maximum likelihood estimation of mutation rates using a birth-death process.

    Science.gov (United States)

    Wu, Xiaowei; Zhu, Hongxiao

    2015-02-07

    Since fluctuation analysis was first introduced by Luria and Delbrück in 1943, it has been widely used to make inference about spontaneous mutation rates in cultured cells. Under certain model assumptions, the probability distribution of the number of mutants that appear in a fluctuation experiment can be derived explicitly, which provides the basis of mutation rate estimation. It has been shown that, among various existing estimators, the maximum likelihood estimator usually demonstrates some desirable properties such as consistency and lower mean squared error. However, its application to real experimental data is often hindered by slow computation of the likelihood due to the recursive form of the mutant-count distribution. We propose a fast maximum likelihood estimator of mutation rates, MLE-BD, based on a birth-death process model with non-differential growth assumption. Simulation studies demonstrate that, compared with the conventional maximum likelihood estimator derived from the Luria-Delbrück distribution, MLE-BD achieves substantial improvement on computational speed and is applicable to arbitrarily large numbers of mutants. In addition, it still retains good accuracy on point estimation. Published by Elsevier Ltd.
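
    The recursive mutant-count distribution that makes the conventional MLE slow is the classical Luria-Delbrück distribution, computable with the Ma-Sandri-Sarkar recursion. A sketch of that conventional approach (not the paper's MLE-BD), with a naive grid-search MLE:

    ```python
    import math

    def ld_pmf(m, nmax):
        """Ma-Sandri-Sarkar recursion for the Luria-Delbruck mutant-count
        distribution: p0 = exp(-m), p_n = (m/n) * sum_{k<n} p_k / (n - k + 1),
        where m is the expected number of mutations per culture."""
        p = [math.exp(-m)]
        for n in range(1, nmax + 1):
            p.append((m / n) * sum(p[k] / (n - k + 1) for k in range(n)))
        return p

    def mle_m(counts, grid):
        """Grid-search maximum-likelihood estimate of m from observed mutant
        counts. The O(nmax**2) recursion is what makes this slow for large
        counts, motivating faster estimators such as the paper's MLE-BD."""
        def loglik(m):
            p = ld_pmf(m, max(counts))
            return sum(math.log(p[c]) for c in counts)
        return max(grid, key=loglik)
    ```

    The distribution's heavy tail (p_n falls off roughly like m/n^2) is why the recursion must be carried to the largest observed count, which is exactly the bottleneck the abstract describes.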

  11. Likelihood ratio data to report the validation of a forensic fingerprint evaluation method

    NARCIS (Netherlands)

    Ramos, Daniel; Haraksim, Rudolf; Meuwly, Didier

    2017-01-01

    The data to which the authors refer throughout this article are likelihood ratios (LR) computed from the comparison of 5–12 minutiae fingermarks with fingerprints. These LRs data are used for the validation of a likelihood ratio (LR) method in forensic evidence evaluation. These data present a
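
    A likelihood ratio of the kind validated here compares how well an observed comparison score is explained under the same-source versus different-source hypotheses. A toy sketch with Gaussian score models (the score distributions and parameters are invented for illustration, not taken from the article's data):

    ```python
    import math

    def gaussian_pdf(x, mu, sigma):
        """Density of a normal distribution at x."""
        return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

    def likelihood_ratio(score, same_mu, same_sd, diff_mu, diff_sd):
        """LR = P(score | same source) / P(score | different source).
        LR > 1 supports the same-source hypothesis; LR < 1 the alternative."""
        return gaussian_pdf(score, same_mu, same_sd) / gaussian_pdf(score, diff_mu, diff_sd)
    ```

    Validation, in the sense used by the article, then consists of checking that LRs computed this way are well calibrated on ground-truth same-source and different-source comparisons.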

  12. Causality of relationship between paternal radiation exposure and leukaemia incidence in the children of Sellafield workers

    International Nuclear Information System (INIS)

    Wheldon, T.E.; Mairs, R.J.; Barrett, A.

    1992-01-01

    In this letter the authors comment on K.E. Baverstock's case (1991) against the likelihood of a causal relationship between reported leukaemia incidence and paternal radiation dose in the children of Sellafield workers, and emphasize the desirability of devising experimental tests of the germ-cell line damage hypothesis as well as the evaluation of its plausibility. Particular reference is made to the role played by dose-rates and by the two-hit model for childhood acute leukaemia. (Letter to the Editor)

  13. Maximum likelihood estimation of the attenuated ultrasound pulse

    DEFF Research Database (Denmark)

    Rasmussen, Klaus Bolding

    1994-01-01

    The attenuated ultrasound pulse is divided into two parts: a stationary basic pulse and a nonstationary attenuation pulse. A standard ARMA model is used for the basic pulse, and a nonstandard ARMA model is derived for the attenuation pulse. The maximum likelihood estimator of the attenuated...

  14. Corporate governance effect on financial distress likelihood: Evidence from Spain

    Directory of Open Access Journals (Sweden)

    Montserrat Manzaneque

    2016-01-01

    The paper explores some mechanisms of corporate governance (ownership and board characteristics) in Spanish listed companies and their impact on the likelihood of financial distress. An empirical study was conducted between 2007 and 2012 using a matched-pairs research design with 308 observations, half classified as distressed and half as non-distressed. Based on the previous study by Pindado, Rodrigues, and De la Torre (2008), a broader concept of bankruptcy is used to define business failure. Employing several conditional logistic models, and in line with other previous studies on bankruptcy, the results confirm that in difficult situations prior to bankruptcy, the impact of board ownership and the proportion of independent directors on business failure likelihood is similar to that exerted in more extreme situations. The results go one step further, revealing a negative relationship between board size and the likelihood of financial distress. This result is interpreted as a form of creating diversity and improving access to information and resources, especially in contexts where ownership is highly concentrated and large shareholders have great power to influence the board structure. However, the results confirm that ownership concentration does not have a significant impact on financial distress likelihood in the Spanish context. It is argued that large shareholders are passive as regards enhanced monitoring of management and, alternatively, do not have enough incentives to hold back the financial distress. These findings have important implications in the Spanish context, where several changes in the regulatory listing requirements have been carried out with respect to corporate governance, and where there is no empirical evidence in this regard.

  15. Block Empirical Likelihood for Longitudinal Single-Index Varying-Coefficient Model

    Directory of Open Access Journals (Sweden)

    Yunquan Song

    2013-01-01

    In this paper, we consider a single-index varying-coefficient model with application to longitudinal data. In order to accommodate the within-group correlation, we apply the block empirical likelihood procedure to the longitudinal single-index varying-coefficient model, and prove a nonparametric version of Wilks’ theorem which can be used to construct the block empirical likelihood confidence region with asymptotically correct coverage probability for the parametric component. In comparison with normal approximations, the proposed method does not require a consistent estimator for the asymptotic covariance matrix, making it easier to conduct inference for the model's parametric component. Simulations demonstrate how the proposed method works.

  16. Design of Simplified Maximum-Likelihood Receivers for Multiuser CPM Systems

    Directory of Open Access Journals (Sweden)

    Li Bing

    2014-01-01

    A class of simplified maximum-likelihood receivers designed for continuous phase modulation based multiuser systems is proposed. The presented receiver is built upon a front end employing mismatched filters and a maximum-likelihood detector defined in a low-dimensional signal space. The performance of the proposed receivers is analyzed and compared to some existing receivers. Some schemes are designed to implement the proposed receivers and to reveal the roles of different system parameters. Analysis and numerical results show that the proposed receivers can approach the optimum multiuser receivers with significantly (even exponentially in some cases) reduced complexity and marginal performance degradation.

  17. Attitude towards, and likelihood of, complaining in the banking ...

    African Journals Online (AJOL)

    aims to determine customers' attitudes towards complaining as well as their likelihood of voicing a .... is particularly powerful and impacts greatly on customer satisfaction and retention. ...... 'Cross-national analysis of hotel customers' attitudes ...

  18. Incidence of dental lesions in musk shrews (Suncus murinus) and their association with sex, age, body weight and diet.

    Science.gov (United States)

    Dudley, Emily S; Grunden, Beverly K; Crocker, Conan; Boivin, Gregory P

    2013-10-22

    Both wild and laboratory strains of the musk shrew (Suncus murinus) have a high incidence of periodontitis. The authors completed necropsy examinations in 51 shrews to identify dental lesions including tooth loss, mobility and fractures. Dental lesions were identified in significantly more females than males, and older animals were more likely to have lesions present. Shrews with one or more dental lesions weighed significantly less than those without lesions present. Dietary supplementation with mealworms did not significantly affect the incidence of dental lesions or the body weight of male or female shrews. The authors recommend routine body weight measurement as a simple, noninvasive method of detecting shrews with an increased likelihood of having dental lesions.

  19. Maximal information analysis: I - various Wayne State plots and the most common likelihood principle

    International Nuclear Information System (INIS)

    Bonvicini, G.

    2005-01-01

    Statistical analysis using all moments of the likelihood L(y|α) (y being the data and α being the fit parameters) is presented. The relevant plots for various data-fitting situations are presented. The goodness-of-fit (GOF) parameter (currently the χ²) is redefined as the isoprobability level in a multidimensional space. Many useful properties of statistical analysis are summarized in a new statistical principle which states that the most common likelihood, and not the tallest, is the best possible likelihood, when comparing experiments or hypotheses

  20. Bayesian geostatistical modeling of leishmaniasis incidence in Brazil.

    Directory of Open Access Journals (Sweden)

    Dimitrios-Alexios Karagiannis-Voules

    Full Text Available BACKGROUND: Leishmaniasis is endemic in 98 countries with an estimated 350 million people at risk and approximately 2 million cases annually. Brazil is one of the most severely affected countries. METHODOLOGY: We applied Bayesian geostatistical negative binomial models to analyze reported incidence data of cutaneous and visceral leishmaniasis in Brazil covering a 10-year period (2001-2010). Particular emphasis was placed on spatial and temporal patterns. The models were fitted using integrated nested Laplace approximations to perform fast approximate Bayesian inference. Bayesian variable selection was employed to determine the most important climatic, environmental, and socioeconomic predictors of cutaneous and visceral leishmaniasis. PRINCIPAL FINDINGS: For both types of leishmaniasis, precipitation and socioeconomic proxies were identified as important risk factors. The predicted numbers of cases in 2010 were 30,189 (standard deviation [SD]: 7,676) for cutaneous leishmaniasis and 4,889 (SD: 288) for visceral leishmaniasis. Our risk maps predicted the highest numbers of infected people in the states of Minas Gerais and Pará for visceral and cutaneous leishmaniasis, respectively. CONCLUSIONS/SIGNIFICANCE: Our spatially explicit, high-resolution incidence maps identified priority areas where leishmaniasis control efforts should be targeted with the ultimate goal to reduce disease incidence.

  1. Validation of DNA-based identification software by computation of pedigree likelihood ratios.

    Science.gov (United States)

    Slooten, K

    2011-08-01

    Disaster victim identification (DVI) can be aided by DNA-evidence, by comparing the DNA-profiles of unidentified individuals with those of surviving relatives. The DNA-evidence is used optimally when such a comparison is done by calculating the appropriate likelihood ratios. Though conceptually simple, the calculations can be quite involved, especially with large pedigrees, precise mutation models etc. In this article we describe a series of test cases designed to check if software designed to calculate such likelihood ratios computes them correctly. The cases include both simple and more complicated pedigrees, including inbred ones. We show how to calculate the likelihood ratio numerically and algebraically, including a general mutation model and the possibility of allelic dropout. In Appendix A we show how to derive such algebraic expressions mathematically. We have set up these cases to validate new software, called Bonaparte, which performs pedigree likelihood ratio calculations in a DVI context. Bonaparte has been developed by SNN Nijmegen (The Netherlands) for the Netherlands Forensic Institute (NFI). It is available free of charge for non-commercial purposes (see www.dnadvi.nl for details). Commercial licenses can also be obtained. The software uses Bayesian networks and the junction tree algorithm to perform its calculations. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  2. Greenery in the university environment: Students’ preferences and perceived restoration likelihood

    Science.gov (United States)

    2018-01-01

    A large body of evidence shows that interaction with greenery can be beneficial for human stress reduction, emotional states, and improved cognitive function. It can, therefore, be expected that university students might benefit from greenery in the university environment. Before investing in real-life interventions in a university environment, it is necessary to first explore students’ perceptions of greenery in the university environment. This study examined (1) preference for university indoor and outdoor spaces with and without greenery (2) perceived restoration likelihood of university outdoor spaces with and without greenery and (3) if preference and perceived restoration likelihood ratings were modified by demographic characteristics or connectedness to nature in Dutch university students (N = 722). Digital photographic stimuli represented four university spaces (lecture hall, classroom, study area, university outdoor space). For each of the three indoor spaces there were four or five stimuli conditions: (1) the standard design (2) the standard design with a colorful poster (3) the standard design with a nature poster (4) the standard design with a green wall (5) the standard design with a green wall plus interior plants. The university outdoor space included: (1) the standard design (2) the standard design with seating (3) the standard design with colorful artifacts (4) the standard design with green elements (5) the standard design with extensive greenery. Multi-level analyses showed that students gave higher preference ratings to the indoor spaces with a nature poster, a green wall, or a green wall plus interior plants than to the standard designs and the designs with the colorful posters. Students also rated preference and perceived restoration likelihood of the outdoor spaces that included greenery higher than those without. Preference and perceived restoration likelihood were not modified by demographic characteristics, but students with strong

  3. Greenery in the university environment: Students' preferences and perceived restoration likelihood.

    Directory of Open Access Journals (Sweden)

    Nicole van den Bogerd

    Full Text Available A large body of evidence shows that interaction with greenery can be beneficial for human stress reduction, emotional states, and improved cognitive function. It can, therefore, be expected that university students might benefit from greenery in the university environment. Before investing in real-life interventions in a university environment, it is necessary to first explore students' perceptions of greenery in the university environment. This study examined (1) preference for university indoor and outdoor spaces with and without greenery (2) perceived restoration likelihood of university outdoor spaces with and without greenery and (3) if preference and perceived restoration likelihood ratings were modified by demographic characteristics or connectedness to nature in Dutch university students (N = 722). Digital photographic stimuli represented four university spaces (lecture hall, classroom, study area, university outdoor space). For each of the three indoor spaces there were four or five stimuli conditions: (1) the standard design (2) the standard design with a colorful poster (3) the standard design with a nature poster (4) the standard design with a green wall (5) the standard design with a green wall plus interior plants. The university outdoor space included: (1) the standard design (2) the standard design with seating (3) the standard design with colorful artifacts (4) the standard design with green elements (5) the standard design with extensive greenery. Multi-level analyses showed that students gave higher preference ratings to the indoor spaces with a nature poster, a green wall, or a green wall plus interior plants than to the standard designs and the designs with the colorful posters. Students also rated preference and perceived restoration likelihood of the outdoor spaces that included greenery higher than those without. Preference and perceived restoration likelihood were not modified by demographic characteristics, but students with strong

  4. Greenery in the university environment: Students' preferences and perceived restoration likelihood.

    Science.gov (United States)

    van den Bogerd, Nicole; Dijkstra, S Coosje; Seidell, Jacob C; Maas, Jolanda

    2018-01-01

    A large body of evidence shows that interaction with greenery can be beneficial for human stress reduction, emotional states, and improved cognitive function. It can, therefore, be expected that university students might benefit from greenery in the university environment. Before investing in real-life interventions in a university environment, it is necessary to first explore students' perceptions of greenery in the university environment. This study examined (1) preference for university indoor and outdoor spaces with and without greenery (2) perceived restoration likelihood of university outdoor spaces with and without greenery and (3) if preference and perceived restoration likelihood ratings were modified by demographic characteristics or connectedness to nature in Dutch university students (N = 722). Digital photographic stimuli represented four university spaces (lecture hall, classroom, study area, university outdoor space). For each of the three indoor spaces there were four or five stimuli conditions: (1) the standard design (2) the standard design with a colorful poster (3) the standard design with a nature poster (4) the standard design with a green wall (5) the standard design with a green wall plus interior plants. The university outdoor space included: (1) the standard design (2) the standard design with seating (3) the standard design with colorful artifacts (4) the standard design with green elements (5) the standard design with extensive greenery. Multi-level analyses showed that students gave higher preference ratings to the indoor spaces with a nature poster, a green wall, or a green wall plus interior plants than to the standard designs and the designs with the colorful posters. Students also rated preference and perceived restoration likelihood of the outdoor spaces that included greenery higher than those without. Preference and perceived restoration likelihood were not modified by demographic characteristics, but students with strong

  5. Dietary Sodium Consumption Predicts Future Blood Pressure and Incident Hypertension in the Japanese Normotensive General Population.

    Science.gov (United States)

    Takase, Hiroyuki; Sugiura, Tomonori; Kimura, Genjiro; Ohte, Nobuyuki; Dohi, Yasuaki

    2015-07-29

    Although there is a close relationship between dietary sodium and hypertension, the concept that persons with relatively high dietary sodium are at increased risk of developing hypertension compared with those with relatively low dietary sodium has not been studied intensively in a cohort. We conducted an observational study to investigate whether dietary sodium intake predicts future blood pressure and the onset of hypertension in the general population. Individual sodium intake was estimated by calculating 24-hour urinary sodium excretion from spot urine in 4523 normotensive participants who visited our hospital for a health checkup. After a baseline examination, they were followed for a median of 1143 days, with the end point being development of hypertension. During the follow-up period, hypertension developed in 1027 participants (22.7%). The risk of developing hypertension was higher in those with higher rather than lower sodium intake (hazard ratio 1.25, 95% CI 1.04 to 1.50). In multivariate Cox proportional hazards regression analysis, baseline sodium intake and the yearly change in sodium intake during the follow-up period (as continuous variables) correlated with the incidence of hypertension. Furthermore, both the yearly increase in sodium intake and baseline sodium intake showed significant correlations with the yearly increase in systolic blood pressure in multivariate regression analysis after adjustment for possible risk factors. Both relatively high levels of dietary sodium intake and gradual increases in dietary sodium are associated with future increases in blood pressure and the incidence of hypertension in the Japanese general population. © 2015 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley Blackwell.

  6. Do depression and anxiety reduce the likelihood of remission in rheumatoid arthritis and psoriatic arthritis? Data from the prospective multicentre NOR-DMARD study.

    Science.gov (United States)

    Michelsen, Brigitte; Kristianslund, Eirik Klami; Sexton, Joseph; Hammer, Hilde Berner; Fagerli, Karen Minde; Lie, Elisabeth; Wierød, Ada; Kalstad, Synøve; Rødevand, Erik; Krøll, Frode; Haugeberg, Glenn; Kvien, Tore K

    2017-11-01

    To investigate the predictive value of baseline depression/anxiety on the likelihood of achieving joint remission in rheumatoid arthritis (RA) and psoriatic arthritis (PsA), as well as the associations between baseline depression/anxiety and the components of the remission criteria at follow-up. We included 1326 patients with RA and 728 patients with PsA from the prospective observational NOR-DMARD study starting first-time tumour necrosis factor inhibitors or methotrexate. The predictive value of depression/anxiety on remission was explored in prespecified logistic regression models, and the associations between baseline depression/anxiety and the components of the remission criteria in prespecified multiple linear regression models. Baseline depression/anxiety according to EuroQoL-5D-3L, Short Form-36 (SF-36) Mental Health subscale ≤56 and SF-36 Mental Component Summary ≤38 negatively predicted 28-joint Disease Activity Score remission. Baseline depression/anxiety was associated with increased patient's and evaluator's global assessment, tender joint count and joint pain in RA at follow-up, but not with swollen joint count and acute phase reactants. Depression and anxiety may reduce the likelihood of joint remission based on composite scores in RA and PsA and should be taken into account in individual patients when making a shared decision on a treatment target. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  7. Predicting the onset of psychosis in patients at clinical high risk: practical guide to probabilistic prognostic reasoning.

    Science.gov (United States)

    Fusar-Poli, P; Schultze-Lutter, F

    2016-02-01

    Prediction of psychosis in patients at clinical high risk (CHR) has become a mainstream focus of clinical and research interest worldwide. When using CHR instruments for clinical purposes, the predicted outcome is only a probability; and, consequently, any therapeutic action following the assessment is based on probabilistic prognostic reasoning. Yet, probabilistic reasoning makes considerable demands on the clinicians. We provide here a scholarly practical guide summarising the key concepts to support clinicians with probabilistic prognostic reasoning in the CHR state. We review risk or cumulative incidence of psychosis, person-time rate of psychosis, Kaplan-Meier estimates of psychosis risk, measures of prognostic accuracy, sensitivity and specificity in receiver operating characteristic curves, positive and negative predictive values, Bayes' theorem, likelihood ratios, and the potentials and limits of real-life applications of prognostic probabilistic reasoning in the CHR state. Understanding basic measures used for prognostic probabilistic reasoning is a prerequisite for successfully implementing the early detection and prevention of psychosis in clinical practice. Future refinement of these measures for CHR patients may actually influence risk management, especially as regards initiating or withholding treatment. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
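The measures this record reviews combine through Bayes' theorem in odds form: post-test odds = pre-test odds × likelihood ratio. A minimal sketch with purely hypothetical numbers (a test with sensitivity 0.75 and specificity 0.80, pre-test risk 0.20; none of these figures come from the study):

```python
def likelihood_ratios(sensitivity, specificity):
    """Positive and negative likelihood ratios of a binary test."""
    lr_pos = sensitivity / (1.0 - specificity)
    lr_neg = (1.0 - sensitivity) / specificity
    return lr_pos, lr_neg

def post_test_probability(pre_test_prob, lr):
    """Update a pre-test probability with a likelihood ratio via odds."""
    pre_odds = pre_test_prob / (1.0 - pre_test_prob)
    post_odds = pre_odds * lr
    return post_odds / (1.0 + post_odds)

lr_pos, lr_neg = likelihood_ratios(0.75, 0.80)   # LR+ = 3.75, LR- = 0.3125
print(post_test_probability(0.20, lr_pos))       # ≈ 0.484 after a positive test
print(post_test_probability(0.20, lr_neg))       # ≈ 0.072 after a negative test
```

With these assumed values a positive result raises the risk estimate from 20% to roughly 48%, while a negative result lowers it to about 7%, which is the kind of probabilistic prognostic reasoning the guide describes.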

  8. Predicting Parents' Experiences with Coresident Adult Children.

    Science.gov (United States)

    Aquilino, William S.

    1991-01-01

    Examined likelihood of parent-adult child coresidence and implications of coresidence for quality of life as perceived by parents. Data from 1987-88 National Survey of Families and Households showed that positive home environment was strong selection factor in predicting probability of coresidence. Middle-class parents reported more negative…

  9. Multi-Channel Maximum Likelihood Pitch Estimation

    DEFF Research Database (Denmark)

    Christensen, Mads Græsbøll

    2012-01-01

    In this paper, a method for multi-channel pitch estimation is proposed. The method is a maximum likelihood estimator and is based on a parametric model where the signals in the various channels share the same fundamental frequency but can have different amplitudes, phases, and noise characteristics. This essentially means that the model allows for different conditions in the various channels, like different signal-to-noise ratios, microphone characteristics and reverberation. Moreover, the method does not assume that a certain array structure is used but rather relies on a more general model and is hence…
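As a rough illustration of the likelihood principle behind such estimators: under white Gaussian noise, maximizing the likelihood of a harmonic signal model reduces to maximizing the energy captured by a least-squares fit at each candidate fundamental. The single-channel sketch below is an illustrative stand-in, not the paper's estimator; a multi-channel variant would sum these per-channel cost functions over channels sharing one fundamental frequency.

```python
import numpy as np

def ml_pitch(x, fs, f0_grid, n_harmonics=3):
    """Approximate ML pitch by grid search: for each candidate f0, fit a
    basis of harmonically related sinusoids by least squares and score it
    by the energy the fit explains (single-channel sketch)."""
    t = np.arange(len(x)) / fs
    best_f0, best_cost = None, -np.inf
    for f0 in f0_grid:
        # Cos/sin pair at each harmonic of the candidate fundamental
        Z = np.column_stack([fn(2 * np.pi * f0 * h * t)
                             for h in range(1, n_harmonics + 1)
                             for fn in (np.cos, np.sin)])
        coef, *_ = np.linalg.lstsq(Z, x, rcond=None)
        cost = np.dot(Z @ coef, x)  # energy explained by the harmonic model
        if cost > best_cost:
            best_cost, best_f0 = cost, f0
    return best_f0

# Synthetic 200 Hz tone with two overtones plus mild noise
rng = np.random.default_rng(0)
fs = 8000
t = np.arange(800) / fs
x = (np.cos(2 * np.pi * 200 * t) + 0.5 * np.cos(2 * np.pi * 400 * t)
     + 0.3 * np.cos(2 * np.pi * 600 * t) + 0.05 * rng.standard_normal(800))
print(ml_pitch(x, fs, np.arange(100, 305, 5)))  # → 200
```

Note that the harmonic structure of the model resolves the octave ambiguity here: the candidate at 100 Hz captures only one of the three partials, while 200 Hz captures all of them.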

  10. Incidence of pulmonary embolism and other chest findings in younger patients using multidetector computed tomography

    International Nuclear Information System (INIS)

    Heredia, Vasco; Ramalho, Miguel; Zapparoli, Mauricio; Semelka, Richard C.

    2010-01-01

    Background: Multidetector computed tomography (MDCT) has become the first-line modality for imaging patients with suspected pulmonary embolism (PE). The disadvantages of MDCT, the use of ionizing radiation and iodinated contrast agents, are a reasonable cause of concern, especially in young patients, and therefore it is critical to understand the likelihood of PE in these patients in order to weigh risks against benefits. Purpose: To calculate the incidence of PE and other chest findings on MDCT in a young adult population investigated for PE. Material and Methods: 387 consecutive patients (age 31.5±13.5 years) underwent chest MDCT for clinically suspected PE between January 2004 and August 2006. Incidence of PE and other chest findings were calculated with a confidence interval of 95% using the binomial distribution. Results: PE incidence was 5%; negative PE with other chest findings was 60%. In 89% of the patients with other chest findings, these included findings of the pleura and/or lung parenchyma. The main patterns of disease were lung opacification suggesting pneumonia (41%), atelectasis (12.4%), and nodular/mass findings (17.5%). In 34% of the patients, there was no PE and no other findings present. Conclusion: There is a low incidence of PE in young patients imaged for PE with MDCT.

  11. Self-Reported Mental Health Predicts Acute Respiratory Infection.

    Science.gov (United States)

    Maxwell, Lizzie; Barrett, Bruce; Chase, Joseph; Brown, Roger; Ewers, Tola

    2015-06-01

    Poor mental health conditions, including stress and depression, have been recognized as a risk factor for the development of acute respiratory infection. Very few studies have considered the role of general mental health in acute respiratory infection occurrence. The aim of this analysis is to determine if overall mental health, as assessed by the mental component of the Short Form 12 Health Survey, predicts incidence, duration, or severity of acute respiratory infection. Data utilized for this analysis came from the National Institute of Health-funded Meditation or Exercise for Preventing Acute Respiratory Infection (MEPARI) and MEPARI-2 randomized controlled trials examining the effects of meditation or exercise on acute respiratory infection among adults aged > 30 years in Madison, Wisconsin. A Kendall tau rank correlation compared the Short Form 12 mental component, completed by participants at baseline, with acute respiratory infection incidence, duration, and area-under-the-curve (global) severity, as assessed by the Wisconsin Upper Respiratory Symptom Survey. Participants were recruited from Madison, Wis, using advertisements in local media. Short Form 12 mental health scores significantly predicted incidence (P = 0.037) of acute respiratory infection, but not duration (P = 0.077) or severity (P = 0.073). The Positive and Negative Affect Schedule (PANAS) negative emotion measure significantly predicted global severity (P = 0.036), but not incidence (P = 0.081) or duration (P = 0.125). Mindful Attention Awareness Scale scores significantly predicted incidence of acute respiratory infection (P = 0.040), but not duration (P = 0.053) or severity (P = 0.70). The PHQ-9, PSS-10, and PANAS positive measures did not show significant predictive associations with any of the acute respiratory infection outcomes. Self-reported overall mental health, as measured by the mental component of Short Form 12, predicts acute respiratory infection incidence.

  12. Superfast maximum-likelihood reconstruction for quantum tomography

    Science.gov (United States)

    Shang, Jiangwei; Zhang, Zhengyun; Ng, Hui Khoon

    2017-06-01

    Conventional methods for computing maximum-likelihood estimators (MLE) often converge slowly in practical situations, leading to a search for simplifying methods that rely on additional assumptions for their validity. In this work, we provide a fast and reliable algorithm for maximum-likelihood reconstruction that avoids this slow convergence. Our method utilizes the state-of-the-art convex optimization scheme, an accelerated projected-gradient method, that allows one to accommodate the quantum nature of the problem in a different way than in the standard methods. We demonstrate the power of our approach by comparing its performance with other algorithms for n -qubit state tomography. In particular, an eight-qubit situation that purportedly took weeks of computation time in 2005 can now be completed in under a minute for a single set of data, with far higher accuracy than previously possible. This refutes the common claim that MLE reconstruction is slow and reduces the need for alternative methods that often come with difficult-to-verify assumptions. In fact, recent methods assuming Gaussian statistics or relying on compressed sensing ideas are demonstrably inapplicable for the situation under consideration here. Our algorithm can be applied to general optimization problems over the quantum state space; the philosophy of projected gradients can further be utilized for optimization contexts with general constraints.

  13. Comparisons of likelihood and machine learning methods of individual classification

    Science.gov (United States)

    Guinand, B.; Topchy, A.; Page, K.S.; Burnham-Curtis, M. K.; Punch, W.F.; Scribner, K.T.

    2002-01-01

    Classification methods used in machine learning (e.g., artificial neural networks, decision trees, and k-nearest neighbor clustering) are rarely used with population genetic data. We compare different nonparametric machine learning techniques with parametric likelihood estimations commonly employed in population genetics for purposes of assigning individuals to their population of origin (“assignment tests”). Classifier accuracy was compared across simulated data sets representing different levels of population differentiation (low and high FST), number of loci surveyed (5 and 10), and allelic diversity (average of three or eight alleles per locus). Empirical data for the lake trout (Salvelinus namaycush) exhibiting levels of population differentiation comparable to those used in simulations were examined to further evaluate and compare classification methods. Classification error rates associated with artificial neural networks and likelihood estimators were lower for simulated data sets compared to k-nearest neighbor and decision tree classifiers over the entire range of parameters considered. Artificial neural networks only marginally outperformed the likelihood method for simulated data (0–2.8% lower error rates). The relative performance of each machine learning classifier improved relative to the likelihood estimators for empirical data sets, suggesting an ability to “learn” and utilize properties of empirical genotypic arrays intrinsic to each population. Likelihood-based estimation methods provide a more accessible option for reliable assignment of individuals to the population of origin due to the intricacies in development and evaluation of artificial neural networks. In recent years, characterization of highly polymorphic molecular markers such as mini- and microsatellites and development of novel methods of analysis have enabled researchers to extend investigations of ecological and evolutionary processes below the population level to the level of
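A toy version of the parametric likelihood assignment test compared in this record: each individual is assigned to the population whose allele frequencies maximize the multilocus genotype likelihood under Hardy-Weinberg proportions. The population names, locus structure, and allele frequencies below are hypothetical, chosen only to make the mechanics concrete.

```python
import math

def genotype_loglik(genotype, freqs):
    """Log-likelihood of a multilocus genotype given per-locus allele
    frequencies, assuming Hardy-Weinberg proportions and independent loci."""
    ll = 0.0
    for (a1, a2), locus_freqs in zip(genotype, freqs):
        p, q = locus_freqs[a1], locus_freqs[a2]
        # p^2 for homozygotes, 2pq for heterozygotes
        prob = p * q * (1 if a1 == a2 else 2)
        ll += math.log(prob)
    return ll

def assign(genotype, pop_freqs):
    """Assignment test: pick the population maximizing the likelihood."""
    return max(pop_freqs, key=lambda pop: genotype_loglik(genotype, pop_freqs[pop]))

# Hypothetical allele frequencies at two loci for two candidate populations
pop_freqs = {
    "pop_A": [{"x": 0.8, "y": 0.2}, {"u": 0.7, "v": 0.3}],
    "pop_B": [{"x": 0.1, "y": 0.9}, {"u": 0.2, "v": 0.8}],
}
individual = [("x", "x"), ("u", "v")]  # x/x homozygote, u/v heterozygote
print(assign(individual, pop_freqs))   # pop_A
```

The "x" allele is common in pop_A and rare in pop_B, so an x/x homozygote is far more likely under pop_A, which drives the assignment.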

  14. Maximum Likelihood Approach for RFID Tag Set Cardinality Estimation with Detection Errors

    DEFF Research Database (Denmark)

    Nguyen, Chuyen T.; Hayashi, Kazunori; Kaneko, Megumi

    2013-01-01

    Abstract Estimation schemes of Radio Frequency IDentification (RFID) tag set cardinality are studied in this paper using a Maximum Likelihood (ML) approach. We consider the estimation problem under the model of multiple independent reader sessions with detection errors due to unreliable radio… The performance of the proposed scheme is evaluated under different system parameters and compared with that of the conventional method via computer simulations assuming flat Rayleigh fading environments and a framed-slotted ALOHA based protocol. Keywords: RFID; tag cardinality estimation; maximum likelihood; detection error…
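The flavor of such an ML estimator can be sketched with a deliberately simplified detection model (not the paper's fading-channel model): assume each of N tags is detected independently with a known probability in each reader session, so per-session detection counts are binomial in N, and estimate N by maximizing the joint likelihood over a grid.

```python
import math

def ml_tag_count(counts, p_detect, n_max=2000):
    """Grid-search ML estimate of tag-set cardinality N. Each of the N tags
    is assumed detected independently with probability p_detect in each
    reader session, so each session count is Binomial(N, p_detect)."""
    def loglik(N):
        ll = 0.0
        for n in counts:
            if n > N:
                return float("-inf")
            ll += (math.log(math.comb(N, n))
                   + n * math.log(p_detect)
                   + (N - n) * math.log(1.0 - p_detect))
        return ll
    # N can be no smaller than the largest observed count
    return max(range(max(counts), n_max + 1), key=loglik)

# Three reader sessions, each detecting roughly 90% of the tags
print(ml_tag_count([91, 88, 90], p_detect=0.9))  # → 99, close to mean/p ≈ 99.6
```

The discrete maximizer sits next to the moment estimate mean(counts)/p_detect, and unlike the moment estimate it generalizes directly to session-dependent detection probabilities.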

  15. Cosmic shear measurement with maximum likelihood and maximum a posteriori inference

    Science.gov (United States)

    Hall, Alex; Taylor, Andy

    2017-06-01

    We investigate the problem of noise bias in maximum likelihood and maximum a posteriori estimators for cosmic shear. We derive the leading and next-to-leading order biases and compute them in the context of galaxy ellipticity measurements, extending previous work on maximum likelihood inference for weak lensing. We show that a large part of the bias on these point estimators can be removed using information already contained in the likelihood when a galaxy model is specified, without the need for external calibration. We test these bias-corrected estimators on simulated galaxy images similar to those expected from planned space-based weak lensing surveys, with promising results. We find that the introduction of an intrinsic shape prior can help with mitigation of noise bias, such that the maximum a posteriori estimate can be made less biased than the maximum likelihood estimate. Second-order terms offer a check on the convergence of the estimators, but are largely subdominant. We show how biases propagate to shear estimates, demonstrating in our simple set-up that shear biases can be reduced by orders of magnitude and potentially to within the requirements of planned space-based surveys at mild signal-to-noise ratio. We find that second-order terms can exhibit significant cancellations at low signal-to-noise ratio when Gaussian noise is assumed, which has implications for inferring the performance of shear-measurement algorithms from simplified simulations. We discuss the viability of our point estimators as tools for lensing inference, arguing that they allow for the robust measurement of ellipticity and shear.

  16. Maximum likelihood estimation for Cox's regression model under nested case-control sampling

    DEFF Research Database (Denmark)

    Scheike, Thomas Harder; Juul, Anders

    2004-01-01

    Nested case-control sampling is designed to reduce the costs of large cohort studies. It is important to estimate the parameters of interest as efficiently as possible. We present a new maximum likelihood estimator (MLE) for nested case-control sampling in the context of Cox's proportional hazards model. The MLE is computed by the EM-algorithm, which is easy to implement in the proportional hazards setting. Standard errors are estimated by a numerical profile likelihood approach based on EM aided differentiation. The work was motivated by a nested case-control study that hypothesized that insulin-like growth factor I was associated with ischemic heart disease. The study was based on a population of 3784 Danes and 231 cases of ischemic heart disease where controls were matched on age and gender. We illustrate the use of the MLE for these data and show how the maximum likelihood framework can be used…

  17. The skewed weak lensing likelihood: why biases arise, despite data and theory being sound.

    Science.gov (United States)

    Sellentin, Elena; Heymans, Catherine; Harnois-Déraps, Joachim

    2018-04-01

    We derive the essentials of the skewed weak lensing likelihood via a simple Hierarchical Forward Model. Our likelihood passes four objective and cosmology-independent tests which a standard Gaussian likelihood fails. We demonstrate that sound weak lensing data are naturally biased low, since they are drawn from a skewed distribution. This occurs already in the framework of ΛCDM. Mathematically, the biases arise because noisy two-point functions follow skewed distributions. This form of bias is already known from CMB analyses, where the low multipoles have asymmetric error bars. Weak lensing is more strongly affected by this asymmetry as galaxies form a discrete set of shear tracer particles, in contrast to a smooth shear field. We demonstrate that the biases can be up to 30% of the standard deviation per data point, dependent on the properties of the weak lensing survey and the employed filter function. Our likelihood provides a versatile framework with which to address this bias in future weak lensing analyses.
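The mechanism described here, that noisy quadratic statistics follow skewed distributions whose typical draw lies below the mean, can be demonstrated with a toy band-power estimate built from a handful of Gaussian modes (a generic illustration, not the paper's weak lensing setup):

```python
import numpy as np

rng = np.random.default_rng(1)
n_modes, n_real = 10, 200_000   # few modes per band power, many realizations
true_power = 1.0

# Each band-power estimate is the mean of squares of n_modes Gaussian modes,
# i.e. a scaled chi-squared variate with n_modes degrees of freedom.
modes = rng.normal(0.0, np.sqrt(true_power), size=(n_real, n_modes))
estimates = (modes ** 2).mean(axis=1)

print(estimates.mean())      # ~1.00: the estimator is unbiased on average...
print(np.median(estimates))  # ~0.93: ...yet a typical single draw is low
```

The mean matches the true power, but the skewed (chi-squared) sampling distribution puts the median below it, so sound data "naturally biased low" is exactly what one expects from a typical realization.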

  18. [Application of R-based multiple seasonal ARIMA model, in predicting the incidence of hand, foot and mouth disease in Shaanxi province].

    Science.gov (United States)

    Liu, F; Zhu, N; Qiu, L; Wang, J J; Wang, W H

    2016-08-10

    To apply the auto-regressive integrated moving average product seasonal model to predicting the number of hand, foot and mouth disease cases in Shaanxi province. The trend of hand, foot and mouth disease in Shaanxi province between January 2009 and June 2015 was analyzed and tested using R software. A multiple seasonal ARIMA model was then fitted to the time series to predict the number of hand, foot and mouth disease cases in 2016 and 2017. A seasonal effect was seen in hand, foot and mouth disease in Shaanxi province. A multiple seasonal ARIMA (2,1,0)×(1,1,0)12 model was established, with the equation (1-B)(1-B^12)ln(X_t) = [(1-1.000B)/((1-0.532B-0.363B^2)(1-0.644B^12-0.454B^24))]ε_t. The mean absolute error and the mean relative error were 531.535 and 0.114, respectively, when compared with the simulated numbers of patients from June to December 2015. Results under the prediction of the multiple seasonal ARIMA model showed that the numbers of patients in both 2016 and 2017 were similar to that of 2015 in Shaanxi province. The multiple seasonal ARIMA (2,1,0)×(1,1,0)12 model could be used to successfully predict the incidence of hand, foot and mouth disease in Shaanxi province.
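The differencing part of such a model, the operator (1-B)(1-B^12) applied to log counts, can be illustrated directly. The sketch below (not from the paper; the synthetic series is invented) shows that this operator removes a deterministic exponential trend together with a period-12 seasonal pattern, leaving the AR terms to model what remains.

```python
import math

def seasonal_difference(series, s=12):
    """Apply the (1-B)(1-B^s) operator of an ARIMA(p,1,q)x(P,1,Q)_s model
    to a log-transformed monthly count series."""
    logged = [math.log(x) for x in series]
    d1 = [b - a for a, b in zip(logged, logged[1:])]  # (1-B): first difference
    return [b - a for a, b in zip(d1, d1[s:])]        # (1-B^s): seasonal difference

# A purely multiplicative series: 2% monthly growth plus a spike every January
series = [100 * (1.02 ** t) * (1.5 if t % 12 == 0 else 1.0) for t in range(36)]
diffs = seasonal_difference(series)
print(max(abs(d) for d in diffs))  # ~0: trend and seasonality are removed
```

Because the trend and the seasonal factor are both multiplicative, the log transform turns them into additive components that the two differences cancel exactly (up to floating-point error).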

  19. Full likelihood analysis of genetic risk with variable age at onset disease--combining population-based registry data and demographic information.

    Directory of Open Access Journals (Sweden)

    Janne Pitkäniemi

    Full Text Available BACKGROUND: In genetic studies of rare complex diseases it is common to ascertain familial data from population-based registries through all incident cases diagnosed during a pre-defined enrollment period. Such an ascertainment procedure is typically taken into account in the statistical analysis of the familial data by constructing either a retrospective or prospective likelihood expression, which conditions on the ascertainment event. Both of these approaches lead to a substantial loss of valuable data. METHODOLOGY AND FINDINGS: Here we consider instead the possibilities provided by a Bayesian approach to risk analysis, which incorporates the ascertainment procedure and reference information concerning the genetic composition of the target population into the considered statistical model. Furthermore, the proposed Bayesian hierarchical survival model does not require the considered genotype or haplotype effects to be expressed as functions of corresponding allelic effects. Our modeling strategy is illustrated by a risk analysis of type 1 diabetes mellitus (T1D in the Finnish population, based on the HLA-A, HLA-B and DRB1 human leucocyte antigen (HLA information available for both ascertained sibships and a large number of unrelated individuals from the Finnish bone marrow donor registry. The heterozygous genotype DR3/DR4 at the DRB1 locus was associated with the lowest predictive probability of T1D-free survival to the age of 15, the estimate being 0.936 (95% credible interval: 0.926-0.945 compared to the average population T1D-free survival probability of 0.995. SIGNIFICANCE: The proposed statistical method can be modified to other population-based family data ascertained from a disease registry provided that the ascertainment process is well documented, and that external information concerning the sizes of birth cohorts and a suitable reference sample are available. We confirm the earlier findings from the same data concerning the HLA-DR3

  20. Average Likelihood Methods of Classification of Code Division Multiple Access (CDMA)

    Science.gov (United States)

    2016-05-01

    The received vector is subject to code matrices that follow the structure given by (113): [y_R; y_I] = sqrt(E_s/(2L)) [G_R1, -G_I1; G_I2, G_R2] [Q_R, -Q_I; Q_I, Q_R] [b_R; b_I] + [n_R; n_I] ... [b_+; b_-] + [n_+; n_-] (115). The average likelihood for type 4 CDMA (116) is a special case of type 1 CDMA with twice the code length...

  1. Review of Elaboration Likelihood Model of persuasion

    OpenAIRE

    藤原, 武弘; 神山, 貴弥

    1989-01-01

    This article mainly introduces the Elaboration Likelihood Model (ELM), proposed by Petty & Cacioppo, which is a general theory of attitude change. ELM postulates two routes to persuasion: central and peripheral. Attitude change via the central route is viewed as resulting from a diligent consideration of the issue-relevant information presented. On the other hand, attitude change via the peripheral route is viewed as resulting from peripheral cues in the persuasion context. Secondly, we compare these two...

  2. Baseline and changes in serum uric acid independently predict 11-year incidence of metabolic syndrome among community-dwelling women.

    Science.gov (United States)

    Kawamoto, R; Ninomiya, D; Kasai, Y; Senzaki, K; Kusunoki, T; Ohtsuka, N; Kumagi, T

    2018-02-19

    Metabolic syndrome (MetS) is associated with an increased risk of major cardiovascular events. In women, increased serum uric acid (SUA) levels are associated with MetS and its components. However, whether baseline levels and changes in SUA predict the incidence of MetS and its components remains unclear. The subjects comprised 407 women aged 71 ± 8 years from a rural village. We identified participants who had undergone a similar examination 11 years earlier, and examined the relationship between baseline and changes in SUA and MetS based on the modified criteria of the National Cholesterol Education Program's Adult Treatment Panel (NCEP-ATP) III report. Of these subjects, 83 (20.4%) women at baseline and 190 (46.7%) women at follow-up had MetS. Multiple linear regression analysis was performed to evaluate the contribution of each confounding factor for MetS; both baseline and changes in SUA as well as history of cardiovascular disease, low-density lipoprotein cholesterol, and estimated glomerular filtration rate (eGFR) were independently and significantly associated with the number of MetS components during the 11-year follow-up. The adjusted odds ratios (ORs) (95% confidence interval) for incident MetS across tertiles of baseline SUA and changes in SUA were 1.00, 1.47 (0.82-2.65), and 3.11 (1.66-5.83), and 1.00, 1.88 (1.03-3.40), and 2.49 (1.38-4.47), respectively. In addition, the combined effect between increased baseline and changes in SUA was also a significant and independent determinant for the accumulation of MetS components (F = 20.29, p baseline MetS. These results suggested that combined assessment of baseline and changes in SUA levels provides increased information for incident MetS, independent of other confounding factors, in community-dwelling women.

  3. Validation of software for calculating the likelihood ratio for parentage and kinship.

    Science.gov (United States)

    Drábek, J

    2009-03-01

    Although the likelihood ratio is a well-known statistical technique, commercial off-the-shelf (COTS) software products for its calculation are not sufficiently validated to suit general requirements for the competence of testing and calibration laboratories (EN/ISO/IEC 17025:2005 norm) per se. The software in question can be considered critical as it directly weighs the forensic evidence, allowing judges to decide on guilt or innocence or to identify a person or kin (i.e., in mass fatalities). For these reasons, accredited laboratories shall validate likelihood ratio software in accordance with the above norm. To validate software for calculating the likelihood ratio in parentage/kinship scenarios, I assessed available vendors, chose two programs (Paternity Index and Familias) for testing, and finally validated them using tests derived from elaboration of the available guidelines for the fields of forensics, biomedicine, and software engineering. MS Excel calculations using known likelihood ratio formulas, or peer-reviewed results of difficult paternity cases, were used as a reference. Using seven testing cases, it was found that both programs satisfied the requirements for basic paternity cases. However, only a combination of the two software programs fulfills the criteria needed for our purpose across the whole spectrum of functions under validation, with the exception of providing algebraic formulas in cases of mutation and/or silent alleles.
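    To give a flavor of the reference calculations involved, here is a single-locus paternity index in the simplest trio case, where the obligate paternal allele is unambiguous. This is a hypothetical helper of my own, not code from either validated program:

```python
import math

def paternity_index(paternal_allele_freq: float, af_is_homozygous: bool) -> float:
    """Single-locus paternity index PI = X / Y, where
    X = P(child's paternal allele | alleged father is the father): 1 or 1/2,
    Y = P(child's paternal allele | random man): the population allele frequency."""
    x = 1.0 if af_is_homozygous else 0.5
    return x / paternal_allele_freq

def combined_pi(per_locus_pis) -> float:
    """Combined LR across independent loci is the product of per-locus PIs."""
    return math.prod(per_locus_pis)
```

    For example, a homozygous alleged father and a paternal allele with frequency 0.1 give PI = 10; a heterozygous carrier gives PI = 5. Validation then consists of checking such hand-calculable values against the program output.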

  4. A guideline for the validation of likelihood ratio methods used for forensic evidence evaluation

    NARCIS (Netherlands)

    Meuwly, Didier; Ramos, Daniel; Haraksim, Rudolf

    2017-01-01

    This Guideline proposes a protocol for the validation of forensic evaluation methods at the source level, using the Likelihood Ratio framework as defined within the Bayes’ inference model. In the context of the inference of identity of source, the Likelihood Ratio is used to evaluate the strength of

  5. Likelihood for transcriptions in a genetic regulatory system under asymmetric stable Lévy noise.

    Science.gov (United States)

    Wang, Hui; Cheng, Xiujun; Duan, Jinqiao; Kurths, Jürgen; Li, Xiaofan

    2018-01-01

    This work is devoted to investigating the evolution of concentration in a genetic regulation system, when the synthesis reaction rate is under additive and multiplicative asymmetric stable Lévy fluctuations. By focusing on the impact of skewness (i.e., non-symmetry) in the probability distributions of noise, we find that via examining the mean first exit time (MFET) and the first escape probability (FEP), the asymmetric fluctuations, interacting with nonlinearity in the system, lead to peculiar likelihood for transcription. This includes, in the additive noise case, realizing higher likelihood of transcription for larger positive skewness (i.e., asymmetry) index β, causing a stochastic bifurcation at the non-Gaussianity index value α = 1 (i.e., it is a separating point or line for the likelihood for transcription), and achieving a turning point at the threshold value β≈-0.5 (i.e., beyond which the likelihood for transcription suddenly reversed for α values). The stochastic bifurcation and turning point phenomena do not occur in the symmetric noise case (β = 0). While in the multiplicative noise case, non-Gaussianity index value α = 1 is a separating point or line for both the MFET and the FEP. We also investigate the noise enhanced stability phenomenon. Additionally, we are able to specify the regions in the whole parameter space for the asymmetric noise, in which we attain desired likelihood for transcription. We have conducted a series of numerical experiments in "regulating" the likelihood of gene transcription by tuning asymmetric stable Lévy noise indexes. This work offers insights for possible ways of achieving gene regulation in experimental research.
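    The mean first exit time can be estimated by Monte Carlo simulation of the stochastic dynamics. The sketch below uses a generic double-well drift as a stand-in for the genetic regulatory model, with asymmetric stable increments drawn via `scipy.stats.levy_stable` (assumed available); all parameter values are illustrative, not the paper's:

```python
import numpy as np
from scipy.stats import levy_stable

rng = np.random.default_rng(2)
alpha, beta = 1.5, 0.5        # non-Gaussianity and skewness indexes (illustrative)
dt, eps = 1e-2, 0.2
n_paths, n_steps = 200, 2000

# Euler-Maruyama for dX = (X - X^3) dt + eps dL_t; asymmetric stable
# Levy increments scale like dt**(1/alpha).
incr = eps * levy_stable.rvs(alpha, beta, size=(n_paths, n_steps),
                             scale=dt**(1.0/alpha), random_state=rng)
x = np.full(n_paths, -1.0)             # start in the left well
exit_time = np.full(n_paths, np.nan)
for k in range(n_steps):
    # clip guards against Euler blow-up after extreme heavy-tailed jumps
    x = np.clip(x + (x - x**3)*dt + incr[:, k], -5.0, 5.0)
    newly = np.isnan(exit_time) & (x > 0.0)   # crossed the barrier at x = 0
    exit_time[newly] = (k + 1)*dt
mfet = np.nanmean(exit_time)           # mean first exit time over escaped paths
```

    Re-running such a simulation over a grid of (alpha, beta) values is one way to map out the parameter regions of high or low transcription likelihood that the paper describes.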

  6. Likelihood-based Dynamic Factor Analysis for Measurement and Forecasting

    NARCIS (Netherlands)

    Jungbacker, B.M.J.P.; Koopman, S.J.

    2015-01-01

    We present new results for the likelihood-based analysis of the dynamic factor model. The latent factors are modelled by linear dynamic stochastic processes. The idiosyncratic disturbance series are specified as autoregressive processes with mutually correlated innovations. The new results lead to

  7. Reconceptualizing Social Influence in Counseling: The Elaboration Likelihood Model.

    Science.gov (United States)

    McNeill, Brian W.; Stoltenberg, Cal D.

    1989-01-01

    Presents Elaboration Likelihood Model (ELM) of persuasion (a reconceptualization of the social influence process) as alternative model of attitude change. Contends ELM unifies conflicting social psychology results and can potentially account for inconsistent research findings in counseling psychology. Provides guidelines on integrating…

  8. Incidence of hypocalcemia in patients receiving denosumab for prevention of skeletal-related events in bone metastasis.

    Science.gov (United States)

    Yerram, Prakirthi; Kansagra, Shraddha; Abdelghany, Osama

    2017-04-01

    Background Denosumab therapy is commonly used for the prevention of skeletal-related events in patients with bone metastasis. However, a common side effect of denosumab is hypocalcemia. Objective The aim of the study is to determine the incidence of hypocalcemia in patients receiving denosumab for prevention of skeletal-related events in bone metastasis and to evaluate risk factors for developing hypocalcemia. Methods This was a retrospective medication use evaluation reviewing the incidence of hypocalcemia in patients receiving outpatient denosumab for prevention of skeletal-related events at Yale-New Haven Hospital. Additionally, various risk factors were reviewed to determine their association with developing hypocalcemia. Results As per the Common Terminology Criteria for Adverse Events v4.03, of the 106 patients included in the study population, 37 (35%) developed hypocalcemia within 30 days of denosumab administration: 14 patients (13.2%) developed grade 1, 13 (12.3%) grade 2, 7 (6.6%) grade 3, and 3 (2.8%) grade 4 hypocalcemia. Calcium supplementation did not decrease the risk of developing hypocalcemia. Patients who had one or more episodes of acute kidney insufficiency were at a higher risk of developing hypocalcemia (odds ratio = 7.5 (95% confidence interval = 1.8-36.3), p = 0.001). Conclusion This study found that the overall incidence of hypocalcemia and severe hypocalcemia was higher than reported in clinical trials. Additionally, calcium supplementation did not have an effect on the incidence of hypocalcemia, while patients who experienced acute kidney insufficiency while on denosumab had a higher likelihood of developing hypocalcemia.
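    An odds ratio with a confidence interval of the kind reported here can be computed from a 2×2 table with the standard Woolf logit method (not necessarily the exact method the authors used; the counts below are made up for illustration):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Woolf (logit) 95% CI from a 2x2 table:
    exposed:   a events, b non-events
    unexposed: c events, d non-events"""
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
    lo = math.exp(math.log(or_) - z * se_log_or)
    hi = math.exp(math.log(or_) + z * se_log_or)
    return or_, lo, hi

# Made-up counts: hypocalcemia yes/no among patients with vs. without AKI
or_, lo, hi = odds_ratio_ci(8, 4, 29, 65)
```

    A lower confidence bound above 1 (as in the paper's 1.8-36.3 interval) is what marks the exposure as a statistically significant risk factor.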

  9. Physician Bayesian updating from personal beliefs about the base rate and likelihood ratio.

    Science.gov (United States)

    Rottman, Benjamin Margolin

    2017-02-01

    Whether humans can accurately make decisions in line with Bayes' rule has been one of the most important yet contentious topics in cognitive psychology. Though a number of paradigms have been used for studying Bayesian updating, rarely have subjects been allowed to use their own preexisting beliefs about the prior and the likelihood. A study is reported in which physicians judged the posttest probability of a diagnosis for a patient vignette after receiving a test result, and the physicians' posttest judgments were compared to the normative posttest calculated from their own beliefs in the sensitivity and false positive rate of the test (likelihood ratio) and prior probability of the diagnosis. On the one hand, the posttest judgments were strongly related to the physicians' beliefs about both the prior probability as well as the likelihood ratio, and the priors were used considerably more strongly than in previous research. On the other hand, both the prior and the likelihoods were still not used quite as much as they should have been, and there was evidence of other nonnormative aspects to the updating, such as updating independent of the likelihood beliefs. By focusing on how physicians use their own prior beliefs for Bayesian updating, this study provides insight into how well experts perform probabilistic inference in settings in which they rely upon their own prior beliefs rather than experimenter-provided cues. It suggests that there is reason to be optimistic about experts' abilities, but that there is still considerable need for improvement.
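    The normative benchmark against which the physicians' judgments were compared is simply Bayes' rule in odds form; a minimal sketch (the function and variable names are mine):

```python
def posttest_probability(prior: float, sensitivity: float, false_pos_rate: float) -> float:
    """Normative Bayesian update in odds form for a positive test result:
    posterior odds = prior odds * likelihood ratio,
    where likelihood ratio = sensitivity / false positive rate."""
    likelihood_ratio = sensitivity / false_pos_rate
    odds = prior / (1.0 - prior) * likelihood_ratio
    return odds / (1.0 + odds)

# A 10% prior with sensitivity 0.9 and false-positive rate 0.1 (LR = 9)
# yields a 50% posttest probability.
p = posttest_probability(0.10, 0.90, 0.10)
```

    The study's comparison amounts to evaluating this formula with each physician's own elicited prior and likelihood-ratio beliefs and contrasting the result with their stated posttest judgment.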

  10. Powdered alcohol: Awareness and likelihood of use among a sample of college students.

    Science.gov (United States)

    Vail-Smith, Karen; Chaney, Beth H; Martin, Ryan J; Don Chaney, J

    2016-01-01

    In March 2015, the Alcohol and Tobacco Tax and Trade Bureau approved the sale of Palcohol, the first powdered alcohol product to be marketed and sold in the U.S. Powdered alcohol is freeze-dried, and one individual-serving size packet added to 6 ounces of liquid is equivalent to a standard drink. This study assessed awareness of powdered alcohol and likelihood to use and/or misuse powdered alcohol among college students. Surveys were administered to a convenience sample of 1,841 undergraduate students. Only 16.4% of respondents had heard of powdered alcohol. After being provided a brief description of powdered alcohol, 23% indicated that they would use the product if available, and of those, 62.1% also indicated likelihood of misusing the product (eg, snorting it, mixing it with alcohol). Caucasian students (OR = 1.5) and hazardous drinkers (based on AUDIT-C scores; OR = 4.7) were significantly more likely to indicate likelihood of use. Hazardous drinkers were also six times more likely to indicate likelihood to misuse the product. These findings can inform upstream prevention efforts in states debating bans on powdered alcohol. In states where powdered alcohol will soon be available, alcohol education initiatives should be updated to include information on the potential risks of use and be targeted to those populations most likely to misuse. This is the first peer-reviewed study to assess the awareness of and likelihood to use and/or misuse powdered alcohol, a potentially emerging form of alcohol. © American Academy of Addiction Psychiatry.

  11. Practical Statistics for LHC Physicists: Descriptive Statistics, Probability and Likelihood (1/3)

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    These lectures cover those principles and practices of statistics that are most relevant for work at the LHC. The first lecture discusses the basic ideas of descriptive statistics, probability and likelihood. The second lecture covers the key ideas in the frequentist approach, including confidence limits, profile likelihoods, p-values, and hypothesis testing. The third lecture covers inference in the Bayesian approach. Throughout, real-world examples will be used to illustrate the practical application of the ideas. No previous knowledge is assumed.

  12. A composite likelihood approach for spatially correlated survival data

    Science.gov (United States)

    Paik, Jane; Ying, Zhiliang

    2013-01-01

    The aim of this paper is to provide a composite likelihood approach to handle spatially correlated survival data using pairwise joint distributions. With e-commerce data, a recent question of interest in marketing research has been to describe spatially clustered purchasing behavior and to assess whether geographic distance is the appropriate metric to describe purchasing dependence. We present a model for the dependence structure of time-to-event data subject to spatial dependence to characterize purchasing behavior from the motivating example from e-commerce data. We assume the Farlie-Gumbel-Morgenstern (FGM) distribution and then model the dependence parameter as a function of geographic and demographic pairwise distances. For estimation of the dependence parameters, we present pairwise composite likelihood equations. We prove that the resulting estimators exhibit key properties of consistency and asymptotic normality under certain regularity conditions in the increasing-domain framework of spatial asymptotic theory. PMID:24223450
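    The building block of the pairwise composite likelihood is the bivariate FGM density evaluated over all retained pairs. A toy sketch with uniform margins, using a constant dependence parameter rather than the paper's distance-dependent one:

```python
import numpy as np

def fgm_copula_density(u, v, theta):
    """Farlie-Gumbel-Morgenstern copula density c(u, v), with |theta| <= 1."""
    return 1.0 + theta * (1.0 - 2.0*u) * (1.0 - 2.0*v)

def pairwise_composite_loglik(u, pairs, theta):
    """Composite log-likelihood: sum of bivariate log-densities over pairs (i, j)."""
    return sum(np.log(fgm_copula_density(u[i], u[j], theta)) for i, j in pairs)

# Toy usage: uniform margins for 4 spatial units, all pairs considered
rng = np.random.default_rng(3)
u = rng.uniform(size=4)
pairs = [(i, j) for i in range(4) for j in range(i + 1, 4)]
cl = pairwise_composite_loglik(u, pairs, theta=0.3)
```

    Maximizing this sum over theta (or over the parameters of a distance-based theta function, as in the paper) yields the pairwise composite likelihood estimator; at theta = 0 the density is identically 1 and the composite log-likelihood is exactly zero.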

  14. THESEUS: maximum likelihood superpositioning and analysis of macromolecular structures.

    Science.gov (United States)

    Theobald, Douglas L; Wuttke, Deborah S

    2006-09-01

    THESEUS is a command line program for performing maximum likelihood (ML) superpositions and analysis of macromolecular structures. While conventional superpositioning methods use ordinary least-squares (LS) as the optimization criterion, ML superpositions provide substantially improved accuracy by down-weighting variable structural regions and by correcting for correlations among atoms. ML superpositioning is robust and insensitive to the specific atoms included in the analysis, and thus it does not require subjective pruning of selected variable atomic coordinates. Output includes both likelihood-based and frequentist statistics for accurate evaluation of the adequacy of a superposition and for reliable analysis of structural similarities and differences. THESEUS performs principal components analysis for analyzing the complex correlations found among atoms within a structural ensemble. ANSI C source code and selected binaries for various computing platforms are available under the GNU open source license from http://monkshood.colorado.edu/theseus/ or http://www.theseus3d.org.

  15. Maximum Likelihood Compton Polarimetry with the Compton Spectrometer and Imager

    Energy Technology Data Exchange (ETDEWEB)

    Lowell, A. W.; Boggs, S. E.; Chiu, C. L.; Kierans, C. A.; Sleator, C.; Tomsick, J. A.; Zoglauer, A. C. [Space Sciences Laboratory, University of California, Berkeley (United States); Chang, H.-K.; Tseng, C.-H.; Yang, C.-Y. [Institute of Astronomy, National Tsing Hua University, Taiwan (China); Jean, P.; Ballmoos, P. von [IRAP Toulouse (France); Lin, C.-H. [Institute of Physics, Academia Sinica, Taiwan (China); Amman, M. [Lawrence Berkeley National Laboratory (United States)

    2017-10-20

    Astrophysical polarization measurements in the soft gamma-ray band are becoming more feasible as detectors with high position and energy resolution are deployed. Previous work has shown that the minimum detectable polarization (MDP) of an ideal Compton polarimeter can be improved by ∼21% when an unbinned, maximum likelihood method (MLM) is used instead of the standard approach of fitting a sinusoid to a histogram of azimuthal scattering angles. Here we outline a procedure for implementing this maximum likelihood approach for real, nonideal polarimeters. As an example, we use the recent observation of GRB 160530A with the Compton Spectrometer and Imager. We find that the MDP for this observation is reduced by 20% when the MLM is used instead of the standard method.
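    The core of the unbinned approach is maximizing the likelihood of the azimuthal scattering-angle density p(φ) ∝ 1 + μ cos 2(φ − φ0) for an idealized polarimeter. A grid-search sketch on simulated angles (COSI's real instrument response is far more involved, and all values here are illustrative):

```python
import numpy as np

rng = np.random.default_rng(4)
mu_true, phi0_true = 0.3, 0.5   # modulation amplitude and polarization angle
n = 10_000

# Rejection-sample azimuthal scattering angles from
# p(phi) proportional to 1 + mu * cos(2*(phi - phi0)).
cand = rng.uniform(0.0, 2*np.pi, size=4*n)
accept = (rng.uniform(0.0, 1.0 + mu_true, size=cand.size)
          < 1.0 + mu_true*np.cos(2*(cand - phi0_true)))
phi = cand[accept][:n]

def negloglik(mu, phi0):
    return -np.sum(np.log((1.0 + mu*np.cos(2*(phi - phi0))) / (2*np.pi)))

# Coarse grid search for the maximum likelihood estimates
mus = np.linspace(0.0, 0.9, 46)
phi0s = np.linspace(0.0, np.pi, 46)    # phi0 has a 180-degree degeneracy
nll = np.array([[negloglik(m, p0) for p0 in phi0s] for m in mus])
i, j = np.unravel_index(nll.argmin(), nll.shape)
mu_hat, phi0_hat = mus[i], phi0s[j]
```

    Using every event's angle directly, rather than binning angles into a histogram and fitting a sinusoid, is what buys the quoted improvement in minimum detectable polarization.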

  16. Statistical analysis of COMPTEL maximum likelihood-ratio distributions: evidence for a signal from previously undetected AGN

    International Nuclear Information System (INIS)

    Williams, O. R.; Bennett, K.; Much, R.; Schoenfelder, V.; Blom, J. J.; Ryan, J.

    1997-01-01

    The maximum likelihood-ratio method is frequently used in COMPTEL analysis to determine the significance of a point source at a given location. In this paper we do not consider whether the likelihood-ratio at a particular location indicates a detection, but rather whether distributions of likelihood-ratios derived from many locations depart from that expected for source-free data. We have constructed distributions of likelihood-ratios by reading values from standard COMPTEL maximum likelihood-ratio maps at positions corresponding to the locations of different categories of AGN. Distributions derived from the locations of Seyfert galaxies are indistinguishable, according to a Kolmogorov-Smirnov test, from those obtained from "random" locations, but differ slightly from those obtained from the locations of flat spectrum radio loud quasars, OVVs, and BL Lac objects. This difference is not due to known COMPTEL sources, since regions near these sources are excluded from the analysis. We suggest that it might arise from a number of sources with fluxes below the COMPTEL detection threshold

  17. Predictors of Likelihood of Speaking Up about Safety Concerns in Labour and Delivery

    Science.gov (United States)

    Lyndon, Audrey; Sexton, J. Bryan; Simpson, Kathleen Rice; Rosenstein, Alan; Lee, Kathryn A.; Wachter, Robert M.

    2011-01-01

    Background Despite widespread emphasis on promoting "assertive communication" by caregivers as essential to patient safety improvement efforts, fairly little is known about when and how clinicians speak up to address safety concerns. In this cross-sectional study we use a new measure of speaking up to begin exploring this issue in maternity care. Methods We developed a scenario-based measure of clinicians' assessments of potential harm and their likelihood of speaking up in response to perceived harm. We embedded this scale in a survey with measures of safety climate, teamwork climate, disruptive behaviour, work stress, and the personality traits of bravery and assertiveness. The survey was distributed to all registered nurses and obstetricians practicing in two US Labour & Delivery units. Results The response rate was 54% (125 of 230 potential respondents). Respondents were experienced clinicians (13.7 ± 11 years in specialty). Higher perception of harm, respondent role, specialty experience, and site predicted likelihood of speaking up when controlling for bravery and assertiveness. Physicians rated potential harm in common clinical scenarios lower than nurses did (7.5 vs. 8.4 on a 2-10 scale; p<0.001). Some participants (12%) indicated they were unlikely to speak up despite perceiving high potential for harm in certain situations. Discussion This exploratory study found that nurses and physicians differed in their harm ratings, and that harm rating was a predictor of speaking up. This may partially explain persistent discrepancies between physicians and nurses in teamwork climate scores. Differing assessments of the potential harms inherent in everyday practice may be a target for teamwork intervention in maternity care. PMID:22927492

  18. Composite likelihood and two-stage estimation in family studies

    DEFF Research Database (Denmark)

    Andersen, Elisabeth Anne Wreford

    2004-01-01

    In this paper register based family studies provide the motivation for linking a two-stage estimation procedure in copula models for multivariate failure time data with a composite likelihood approach. The asymptotic properties of the estimators in both parametric and semi-parametric models are d...

  19. Robust Gaussian Process Regression with a Student-t Likelihood

    NARCIS (Netherlands)

    Jylänki, P.P.; Vanhatalo, J.; Vehtari, A.

    2011-01-01

    This paper considers the robust and efficient implementation of Gaussian process regression with a Student-t observation model, which has a non-log-concave likelihood. The challenge with the Student-t model is the analytically intractable inference which is why several approximative methods have

  20. Link Prediction in Evolving Networks Based on Popularity of Nodes.

    Science.gov (United States)

    Wang, Tong; He, Xing-Sheng; Zhou, Ming-Yang; Fu, Zhong-Qian

    2017-08-02

    Link prediction aims to uncover the underlying relationships behind networks, which can be used to predict missing edges or identify spurious ones. The key issue of link prediction is to estimate the likelihood of potential links in networks. Most classical static-structure-based methods ignore the temporal aspects of networks; limited by time-varying features, such approaches perform poorly in evolving networks. In this paper, we propose the hypothesis that the ability of each node to attract links depends not only on its structural importance but also on its current popularity (activeness), since active nodes have a much higher probability of attracting future links. A novel approach named the popularity-based structural perturbation method (PBSPM) and its fast algorithm are then proposed to characterize the likelihood of an edge from both the existing connectivity structure and the current popularity of its two endpoints. Experiments on six evolving networks show that the proposed methods outperform state-of-the-art methods in accuracy and robustness. Besides, visual results and statistical analysis reveal that the proposed methods are inclined to predict future edges between active nodes rather than edges between inactive nodes.
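    The idea of up-weighting candidate links between recently active endpoints can be sketched with a simple common-neighbours score. This is a hypothetical scoring of my own in the spirit of popularity-based prediction, not the exact PBSPM formula:

```python
from collections import defaultdict

def popularity_score(edges_old, edges_recent, u, v):
    """Rank a candidate link (u, v) by common neighbours, up-weighted by how
    many edges each endpoint gained in the recent time window."""
    adj = defaultdict(set)
    recent_deg = defaultdict(int)
    for a, b in edges_old:
        adj[a].add(b); adj[b].add(a)
    for a, b in edges_recent:          # newly observed edges mark active nodes
        adj[a].add(b); adj[b].add(a)
        recent_deg[a] += 1; recent_deg[b] += 1
    common = adj[u] & adj[v]
    activity = 1 + recent_deg[u] + recent_deg[v]
    return len(common) * activity

# Nodes 1 and 2 share two neighbours; node 1 just gained an edge, so the
# active pair (1, 2) outranks the equally connected but inactive pair (3, 4).
old = [(1, 3), (2, 3), (1, 4), (2, 4)]
recent = [(1, 5)]
```

    With purely structural scores the two candidate pairs would tie; the activity factor is what breaks the tie in favour of edges between active nodes, mirroring the paper's empirical finding.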

  1. Explaining and predicting workplace accidents using data-mining techniques

    International Nuclear Information System (INIS)

    Rivas, T.; Paz, M.; Martin, J.E.; Matias, J.M.; Garcia, J.F.; Taboada, J.

    2011-01-01

    Current research into workplace risk is mainly conducted using conventional descriptive statistics, which, however, fail to properly identify cause-effect relationships and are unable to construct models that could predict accidents. The authors of the present study modelled incidents and accidents in two companies in the mining and construction sectors in order to identify the most important causes of accidents and develop predictive models. Data-mining techniques (decision rules, Bayesian networks, support vector machines and classification trees) were used to model accident and incident data compiled from the mining and construction sectors and obtained in interviews conducted soon after an incident/accident occurred. The results were compared with those for a classical statistical technique (logistic regression), revealing the superiority of decision rules, classification trees and Bayesian networks in predicting and identifying the factors underlying accidents/incidents.

  2. Explaining and predicting workplace accidents using data-mining techniques

    Energy Technology Data Exchange (ETDEWEB)

    Rivas, T., E-mail: trivas@uvigo.e [Dpto. Ingenieria de los Recursos Naturales y Medio Ambiente, E.T.S.I. Minas, University of Vigo, Campus Lagoas, 36310 Vigo (Spain); Paz, M., E-mail: mpaz.minas@gmail.co [Dpto. Ingenieria de los Recursos Naturales y Medio Ambiente, E.T.S.I. Minas, University of Vigo, Campus Lagoas, 36310 Vigo (Spain); Martin, J.E., E-mail: jmartin@cippinternacional.co [CIPP International, S.L. Parque Tecnologico de Asturias, Parcela 43, Oficina 11, 33428 Llanera (Spain); Matias, J.M., E-mail: jmmatias@uvigo.e [Dpto. Estadistica e Investigacion Operativa, E.T.S.I. Minas, University of Vigo, Campus Lagoas, 36310 Vigo (Spain); Garcia, J.F., E-mail: jgarcia@cippinternacional.co [CIPP International, S.L. Parque Tecnologico de Asturias, Parcela 43, Oficina 11, 33428 Llanera (Spain); Taboada, J., E-mail: jtaboada@uvigo.e [Dpto. Ingenieria de los Recursos Naturales y Medio Ambiente, E.T.S.I. Minas, University of Vigo, Campus Lagoas, 36310 Vigo (Spain)

    2011-07-15

    Current research into workplace risk is mainly conducted using conventional descriptive statistics, which, however, fail to properly identify cause-effect relationships and are unable to construct models that could predict accidents. The authors of the present study modelled incidents and accidents in two companies in the mining and construction sectors in order to identify the most important causes of accidents and develop predictive models. Data-mining techniques (decision rules, Bayesian networks, support vector machines and classification trees) were used to model accident and incident data compiled from the mining and construction sectors and obtained in interviews conducted soon after an incident/accident occurred. The results were compared with those for a classical statistical technique (logistic regression), revealing the superiority of decision rules, classification trees and Bayesian networks in predicting and identifying the factors underlying accidents/incidents.

  3. A conceptual framework for predicting temperate ecosystem sensitivity to human impacts on fire regimes

    Science.gov (United States)

    D. B. McWethy; P. E. Higuera; C. Whitlock; T. T. Veblen; D. M. J. S. Bowman; G. J. Cary; S. G. Haberle; R. E. Keane; B. D. Maxwell; M. S. McGlone; G. L. W. Perry; J. M. Wilmshurst

    2013-01-01

    The increased incidence of large fires around much of the world in recent decades raises questions about human and non-human drivers of fire and the likelihood of increased fire activity in the future. The purpose of this paper is to outline a conceptual framework for examining where human-set fires and feedbacks are likely to be most pronounced in temperate forests...

  4. MRI findings associated with development of incident knee pain over 48 months: data from the osteoarthritis initiative

    Energy Technology Data Exchange (ETDEWEB)

    Joseph, Gabby B.; Hou, Stephanie W.; Nardo, Lorenzo; Heilmeier, Ursula; Link, Thomas M. [University of California, San Francisco, Department of Radiology and Biomedical Imaging, San Francisco, CA (United States); Nevitt, Michael C.; McCulloch, Charles E. [University of California, San Francisco, Department of Epidemiology and Biostatistics, San Francisco, CA (United States)

    2016-05-15

    The purpose of this nested case-control study was to identify baseline, incident, and progressive MRI findings visible on standard MRI clinical sequences that were associated with development of incident knee pain in subjects at risk for OA over a period of 48 months. We analyzed 60 case knees developing incident pain (WOMAC pain = 0 at baseline and WOMAC pain ≥ 5 at 48 months) and 60 control knees (WOMAC pain = 0 at baseline and WOMAC pain = 0 at 48 months) from the Osteoarthritis Initiative. 3 T knee MRIs were analyzed using a modified WORMS score (cartilage, meniscus, bone marrow) at baseline and after 48 months. Baseline and longitudinal findings were grouped into logistic regression models and compared using likelihood-ratio tests. For each model that was significant, a stepwise elimination was used to isolate significant MRI findings. One baseline MRI finding and three findings that changed from baseline to 48 months were associated with the development of pain: at baseline, the severity of a cartilage lesion in the medial tibia was associated with incident pain (odds ratio (OR) = 3.05; P = 0.030). Longitudinally, an incident effusion (OR = 9.78; P = 0.005), a progressive cartilage lesion of the patella (OR = 4.59; P = 0.009), and an incident medial meniscus tear (OR = 4.91; P = 0.028) were associated with the development of pain. Our results demonstrate that baseline abnormalities of the medial tibia cartilage as well as an incident joint effusion, progressive patella cartilage defects, and an incident medial meniscus tear over 48 months may be associated with incident knee pain. Clinically, this study helps identify MRI findings that are associated with the development of knee pain. (orig.)
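The likelihood-ratio comparison of nested logistic models described above can be sketched as follows. The log-likelihood values are made up for illustration, and only the single-added-parameter (1 degree of freedom) case is handled:

```python
import math

def lr_test_df1(ll_null, ll_full):
    """Likelihood-ratio test for one added parameter (df = 1).

    Compares nested models via the deviance 2*(ll_full - ll_null);
    under H0 this statistic is asymptotically chi-squared with 1 df."""
    stat = 2.0 * (ll_full - ll_null)
    # Survival function of chi-squared(1): P(X > stat) = erfc(sqrt(stat / 2)),
    # since a chi-squared(1) variable is the square of a standard normal.
    p_value = math.erfc(math.sqrt(stat / 2.0)) if stat > 0 else 1.0
    return stat, p_value

# Hypothetical log-likelihoods from a null and an extended logistic model
stat, p = lr_test_df1(ll_null=-80.2, ll_full=-77.4)
print(round(stat, 2), round(p, 4))
```

Here the improvement in fit (deviance 5.6) would be judged significant at the usual 5% level; for more added parameters a general chi-squared survival function is needed.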

  5. MRI findings associated with development of incident knee pain over 48 months: data from the osteoarthritis initiative

    International Nuclear Information System (INIS)

    Joseph, Gabby B.; Hou, Stephanie W.; Nardo, Lorenzo; Heilmeier, Ursula; Link, Thomas M.; Nevitt, Michael C.; McCulloch, Charles E.

    2016-01-01

    The purpose of this nested case-control study was to identify baseline, incident, and progressive MRI findings visible on standard MRI clinical sequences that were associated with development of incident knee pain in subjects at risk for OA over a period of 48 months. We analyzed 60 case knees developing incident pain (WOMAC pain = 0 at baseline and WOMAC pain ≥ 5 at 48 months) and 60 control knees (WOMAC pain = 0 at baseline and WOMAC pain = 0 at 48 months) from the Osteoarthritis Initiative. 3 T knee MRIs were analyzed using a modified WORMS score (cartilage, meniscus, bone marrow) at baseline and after 48 months. Baseline and longitudinal findings were grouped into logistic regression models and compared using likelihood-ratio tests. For each model that was significant, a stepwise elimination was used to isolate significant MRI findings. One baseline MRI finding and three findings that changed from baseline to 48 months were associated with the development of pain: at baseline, the severity of a cartilage lesion in the medial tibia was associated with incident pain - (odds ratio (OR) for incident pain = 3.05; P = 0.030). Longitudinally, an incident effusion (OR = 9.78; P = 0.005), a progressive cartilage lesion of the patella (OR = 4.59; P = 0.009), and an incident medial meniscus tear (OR = 4.91; P = 0.028) were associated with the development of pain. Our results demonstrate that baseline abnormalities of the medial tibia cartilage as well as an incident joint effusion, progressive patella cartilage defects, and an incident medial meniscus tear over 48 months may be associated with incident knee pain. Clinically, this study helps identify MRI findings that are associated with the development of knee pain. (orig.)

  6. LIKELIHOOD ESTIMATION OF PARAMETERS USING SIMULTANEOUSLY MONITORED PROCESSES

    DEFF Research Database (Denmark)

    Friis-Hansen, Peter; Ditlevsen, Ove Dalager

    2004-01-01

    The topic is maximum likelihood inference from several simultaneously monitored response processes of a structure to obtain knowledge about the parameters of other not monitored but important response processes when the structure is subject to some Gaussian load field in space and time. The considered example is a ship sailing with a given speed through a Gaussian wave field.

  7. Secondary Analysis under Cohort Sampling Designs Using Conditional Likelihood

    Directory of Open Access Journals (Sweden)

    Olli Saarela

    2012-01-01

    Full Text Available Under cohort sampling designs, additional covariate data are collected on cases of a specific type and a randomly selected subset of noncases, primarily for the purpose of studying associations with a time-to-event response of interest. With such data available, an interest may arise to reuse them for studying associations between the additional covariate data and a secondary non-time-to-event response variable, usually collected for the whole study cohort at the outset of the study. Following earlier literature, we refer to such a situation as secondary analysis. We outline a general conditional likelihood approach for secondary analysis under cohort sampling designs and discuss the specific situations of case-cohort and nested case-control designs. We also review alternative methods based on full likelihood and inverse probability weighting. We compare the alternative methods for secondary analysis in two simulated settings and apply them in a real-data example.
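A minimal sketch of the inverse-probability-weighting alternative mentioned above, assuming the inclusion probability of every cohort member under the sampling design is known. The values and probabilities below are illustrative only:

```python
def ipw_mean(values, sampled, p_sample):
    """Inverse-probability-weighted (Hajek-style) mean of a secondary outcome.

    values / sampled / p_sample are cohort-length lists; only sampled
    subjects contribute, each weighted by 1 / inclusion probability so
    that undersampled noncases are up-weighted."""
    num = sum(v / p for v, s, p in zip(values, sampled, p_sample) if s)
    den = sum(1.0 / p for s, p in zip(sampled, p_sample) if s)
    return num / den

# Toy cohort: the last two subjects were eligible for subsampling at p = 0.5
values = [1.0, 2.0, 3.0, 4.0]
sampled = [True, True, True, False]
p_sample = [1.0, 1.0, 0.5, 0.5]
print(ipw_mean(values, sampled, p_sample))  # 2.25
```

The estimate differs from the full-cohort mean (2.5) only through sampling noise; with known inclusion probabilities the estimator is design-consistent.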

  8. Likelihood inference for a fractionally cointegrated vector autoregressive model

    DEFF Research Database (Denmark)

    Johansen, Søren; Ørregård Nielsen, Morten

    2012-01-01

    We consider model based inference in a fractionally cointegrated (or cofractional) vector autoregressive model with a restricted constant term, based on the Gaussian likelihood conditional on initial values. The model nests the I(d) VAR model. We give conditions on the parameters such that the process X_{t} is fractional of order d and cofractional of order d-b; that is, there exist vectors ß for which ß'X_{t} is fractional of order d-b, and no other fractionality order is possible. We define the statistical model by 0 < b ≤ d, but conduct inference when the true values satisfy b0 ≠ 1/2 and d0-b0 < 1/2. The likelihood converges to a ... process in the parameters when errors are i.i.d. with suitable moment conditions and initial values are bounded. When the limit is deterministic, this implies uniform convergence in probability of the conditional likelihood function. If the true value b0 > 1/2, we prove that the limit distribution of (ß...
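The fractional orders d and d-b enter through the fractional difference operator (1-L)^d, whose series coefficients follow the simple recursion pi_k = pi_{k-1}(k-1-d)/k. A sketch (not taken from the paper) of computing these weights:

```python
def frac_diff_weights(d, n):
    """First n coefficients of the fractional difference operator (1 - L)^d.

    Uses the binomial recursion pi_0 = 1, pi_k = pi_{k-1} * (k - 1 - d) / k."""
    w = [1.0]
    for k in range(1, n):
        w.append(w[-1] * (k - 1 - d) / k)
    return w

print([round(x, 3) for x in frac_diff_weights(0.4, 5)])
# [1.0, -0.4, -0.12, -0.064, -0.042]
```

For integer d the expansion terminates (d = 1 gives 1 - L exactly); for fractional d the weights decay hyperbolically, which is what produces long memory of order d.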

  9. Medication incidents reported to an online incident reporting system.

    LENUS (Irish Health Repository)

    Alrwisan, Adel

    2011-01-15

    AIMS: Approximately 20% of deaths from adverse events are related to medication incidents, costing the NHS an additional £500 million annually. Less than 5% of adverse events are reported. This study aims to assess the reporting rate of medication incidents in NHS facilities in the north east of Scotland, and to describe the types and outcomes of reported incidents among different services. Furthermore, we wished to quantify the proportion of reported incidents according to the reporters\\' profession. METHODS: A retrospective description was made of medication incidents reported to an online reporting system (DATIX) over a 46-month-period (July 2005 to April 2009). Reports originated from acute and community hospitals, mental health, and primary care facilities. RESULTS: Over the study period there were 2,666 incidents reported with a mean monthly reporting rate of 78.2\\/month (SD±16.9). 6.1% of all incidents resulted in harm, with insulin being the most commonly implicated medication. Nearly three-quarters (74.2%, n=1,978) of total incidents originated from acute hospitals. Administration incidents were implicated in the majority of the reported medication incidents (59%), followed by prescribing (10.8%) and dispensing (9.9%), while the nondescript "other medication incidents" accounted for 20.3% of total incidents. The majority of reports were made by nursing and midwifery staff (80%), with medical and dental professionals reporting the lowest number of incidents (n=56, 2%). CONCLUSIONS: The majority of medication incidents in this study were reported by nursing and midwifery staff, and were due to administration incidents. There is a clear need to elucidate the reasons for the limited contribution of the medical and dental professionals to reporting medication incidents.

  10. Sexually transmitted infection incidence among adolescents in Ireland.

    LENUS (Irish Health Repository)

    Davoren, Martin P

    2014-10-01

    The burden of sexually transmitted infections (STIs) rests with young people, yet in Ireland there has been very little research into this population. The purpose of this study was to determine the incidence rate and establish risk factors that predict STI occurrence among adolescents in Ireland.

  11. Organizational Justice and Men's Likelihood to Sexually Harass: The Moderating Role of Sexism and Personality

    Science.gov (United States)

    Krings, Franciska; Facchin, Stephanie

    2009-01-01

    This study demonstrated relations between men's perceptions of organizational justice and increased sexual harassment proclivities. Respondents reported higher likelihood to sexually harass under conditions of low interactional justice, suggesting that sexual harassment likelihood may increase as a response to perceived injustice. Moreover, the…

  12. Scrub Typhus Incidence Modeling with Meteorological Factors in South Korea.

    Science.gov (United States)

    Kwak, Jaewon; Kim, Soojun; Kim, Gilho; Singh, Vijay P; Hong, Seungjin; Kim, Hung Soo

    2015-06-29

    Since its recurrence in 1986, scrub typhus has occurred annually and is considered one of the most prevalent diseases in Korea. Scrub typhus is a 3rd grade nationally notifiable disease that has greatly increased in Korea since 2000. The objective of this study is to construct a disease incidence model for predicting and quantifying the incidence of scrub typhus. Using data from 2001 to 2010, an Artificial Neural Network (ANN) incidence model was constructed that accounts for the time lags between scrub typhus incidence and minimum temperature, precipitation, and average wind speed, with lags selected through Granger causality and spectral analysis; the model was then tested on data for 2011 to 2012. Results show reliable simulation of scrub typhus incidence with the selected predictors and indicate that the seasonality in meteorological data should be considered.
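Building lagged predictor matrices like the ones feeding such an ANN can be sketched as below; the series values, lag choices, and alignment convention are illustrative assumptions, not the study's actual data:

```python
def lagged_features(series, lags):
    """Align lagged predictor values with a target time index.

    series: name -> list of values; lags: name -> lag in time steps.
    The row for time t holds each predictor observed at t - lag, so rows
    start at the largest lag to keep every entry in range."""
    start = max(lags.values())
    n = min(len(values) for values in series.values())
    rows = [[series[name][t - lag] for name, lag in lags.items()]
            for t in range(start, n)]
    return rows, start

# Toy series; the lag choices stand in for the Granger-causality /
# spectral selection described in the abstract.
temp = [1, 2, 3, 4, 5, 6]
precip = [10, 20, 30, 40, 50, 60]
rows, start = lagged_features({"temp": temp, "precip": precip},
                              {"temp": 2, "precip": 1})
print(start, rows[0])  # 2 [1, 20]
```

Each row can then be fed to any regression or neural-network model against the incidence series shifted to the same start index.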

  13. Scrub Typhus Incidence Modeling with Meteorological Factors in South Korea

    Directory of Open Access Journals (Sweden)

    Jaewon Kwak

    2015-06-01

    Full Text Available Since its recurrence in 1986, scrub typhus has occurred annually and is considered one of the most prevalent diseases in Korea. Scrub typhus is a 3rd grade nationally notifiable disease that has greatly increased in Korea since 2000. The objective of this study is to construct a disease incidence model for predicting and quantifying the incidence of scrub typhus. Using data from 2001 to 2010, an Artificial Neural Network (ANN) incidence model was constructed that accounts for the time lags between scrub typhus incidence and minimum temperature, precipitation, and average wind speed, with lags selected through Granger causality and spectral analysis; the model was then tested on data for 2011 to 2012. Results show reliable simulation of scrub typhus incidence with the selected predictors and indicate that the seasonality in meteorological data should be considered.

  14. Gender differences in the factors predicting initial engagement at cardiac rehabilitation.

    Science.gov (United States)

    Galdas, Paul Michael; Harrison, Alexander Stephen; Doherty, Patrick

    2018-01-01

    To determine whether there are gender differences in the factors that predict attendance at the initial cardiac rehabilitation baseline assessment (CR engagement) after referral. Using data from the National Audit of Cardiac Rehabilitation, we analysed data on 95 638 patients referred to CR following a cardiovascular diagnosis/treatment between 2013 and 2016. Eighteen factors that have been shown in previous research to be important predictors of CR participation were investigated and grouped into four categories: sociodemographic factors, cardiac risk factors, patient medical status and service-level factors. Logistic binary regression models were built for male patients and female patients, assessing the likelihood of CR engagement. Each included predictors such as age, number of comorbidities and social deprivation score. There were no important differences in the factors that predict the likelihood of CR engagement in men and women. Seven factors associated with a reduced probability of CR engagement, and eight factors associated with increased probability, were identified. Fourteen of the 15 factors identified as predicting the likelihood of engagement/non-engagement were the same for both men and women. Increasing age, being South Asian or of non-white ethnicity (other than Black) and being single were all associated with a reduced likelihood of attending an initial CR baseline assessment in both men and women. Male patients with diabetes were 11% less likely to engage with CR; however, there was no significant association in women. Results showed that the overwhelmingly important determinant of CR engagement observed in both men and women was receiving an invitation to attend an assessment session (OR 4.223 men/4.033 women). Discussion of gender differences in predictors of CR uptake should probably be more nuanced and informed by the stage of the patient care pathway.

  15. Menyoal Elaboration Likelihood Model (ELM dan Teori Retorika

    Directory of Open Access Journals (Sweden)

    Yudi Perbawaningsih

    2012-06-01

    Full Text Available Abstract: Persuasion is a communication process to establish or change attitudes, which can be understood through the theory of Rhetoric and the Elaboration Likelihood Model (ELM). This survey-based study elaborates these theories in a public lecture series intended to persuade students in choosing their concentration of study. The results show that, in terms of persuasion effectiveness, it is not quite relevant to separate the message from its source: the quality of the source is determined by the quality of the message, and vice versa. Separating the persuasion process into the two routes described by ELM theory would therefore not be relevant.

  16. Trends in Dementia Incidence in a Birth Cohort Analysis of the Einstein Aging Study.

    Science.gov (United States)

    Derby, Carol A; Katz, Mindy J; Lipton, Richard B; Hall, Charles B

    2017-11-01

    Trends in dementia incidence rates have important implications for planning and prevention. To better understand incidence trends over time requires separation of age and cohort effects, and few prior studies have used this approach. To examine trends in dementia incidence and concomitant trends in cardiovascular comorbidities among individuals aged 70 years or older who were enrolled in the Einstein Aging Study between 1993 and 2015. In this birth cohort analysis of all-cause dementia incidence in persons enrolled in the Einstein Aging Study from October 20, 1993, through November 17, 2015, a systematically recruited, population-based sample of 1348 participants from Bronx County, New York, who were 70 years or older without dementia at enrollment and with at least one annual follow-up was studied. Poisson regression was used to model dementia incidence as a function of age, sex, educational level, race, and birth cohort, with profile likelihood used to identify the timing of significant increases or decreases in incidence. Exposures were birth year and age; the outcome was incident dementia, defined by consensus case conference based on annual, standardized neuropsychological and neurologic examination findings, using criteria from the DSM-IV. Among 1348 individuals (mean [SD] baseline age, 78.5 [5.4] years; 830 [61.6%] female; 915 [67.9%] non-Hispanic white), 150 incident dementia cases developed during 5932 person-years (mean [SD] follow-up, 4.4 [3.4] years). Dementia incidence decreased in successive birth cohorts. Incidence per 100 person-years was 5.09 in birth cohorts before 1920, 3.11 in the 1920 through 1924 birth cohorts, 1.73 in the 1925 through 1929 birth cohorts, and 0.23 in cohorts born after 1929. Change point analyses identified a significant decrease in dementia incidence among those born after July 1929 (95% CI, June 1929 to January 1930). The relative rate for birth cohorts born after July 1929 vs before was 0.13 (95% CI, 0.04-0.41). Prevalence of stroke and myocardial infarction
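The crude rates quoted above can be reproduced directly from the reported counts, since incidence per 100 person-years is simply 100 × cases / person-years:

```python
def incidence_per_100py(cases, person_years):
    """Crude incidence rate per 100 person-years of observation."""
    return 100.0 * cases / person_years

# Overall figures reported in the abstract: 150 cases over 5932 person-years
overall = incidence_per_100py(150, 5932)
print(round(overall, 2))  # 2.53
```

The cohort-specific rates (5.09 down to 0.23) are the same calculation applied within birth-cohort strata; the reported relative rate of 0.13 comes from the Poisson model, which additionally adjusts for age, sex, education, and race.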

  17. INCIDENCE OF CENTRAL DIABETES INSIPIDUS IN CHILDREN PRESENTING WITH POLYDIPSIA AND POLYURIA.

    Science.gov (United States)

    Haddad, Nadine G; Nabhan, Zeina M; Eugster, Erica A

    2016-12-01

    Polydipsia and polyuria are common reasons for referral to the Pediatric Endocrine clinic. In the absence of hyperglycemia, diabetes insipidus (DI) should be considered. The objectives of the study were to determine the prevalence of central DI (CDI) in a group of children presenting for evaluation of polydipsia and polyuria, and to determine if predictive features were present in patients in whom the diagnosis of DI was made. The study was a retrospective chart review of children presenting to the endocrine clinic with complaints of polydipsia and polyuria over a 5-year period. The charts of 41 patients (mean age 4.9 ± 3.7 years, 28 males) were reviewed. CDI was diagnosed in 8 (20%) children based on abnormal water deprivation test (WDT) results. All but one patient had abnormal magnetic resonance imaging (MRI) findings, the most common being pituitary stalk thickening. Children with DI were older (7.86 ± 4.40 vs. 4.18 ± 3.20 years, P = .01) and had a higher propensity for cold beverage intake and unusual water-seeking behaviors compared to those without DI. Baseline WDT also revealed higher serum sodium (Na) and osmolality. The incidence of CDI in children presenting with polydipsia and polyuria is low. Factors associated with higher likelihood of pathology include older age, propensity for cold beverage intake, and higher baseline serum Na and osmolality on a WDT. Abbreviations: BMI = body mass index; CDI = central diabetes insipidus; DI = diabetes insipidus; Na = sodium; WDT = water deprivation test.

  18. Sustained high incidence of injuries from burns in a densely populated urban slum in Kenya: an emerging public health priority.

    Science.gov (United States)

    Wong, Joshua M; Nyachieo, Dhillon O; Benzekri, Noelle A; Cosmas, Leonard; Ondari, Daniel; Yekta, Shahla; Montgomery, Joel M; Williamson, John M; Breiman, Robert F

    2014-09-01

    Ninety-five percent of burn deaths occur in low- and middle-income countries (LMICs); however, longitudinal household-level studies have not been done in urban slum settings, where overcrowding and unsafe cook stoves may increase likelihood of injury. Using a prospective, population-based disease surveillance system in the urban slum of Kibera in Kenya, we examined the incidence of household-level burns of all severities from 2006-2011. Of approximately 28,500 enrolled individuals (6000 households), we identified 3072 burns. The overall incidence was 27.9/1000 person-years of observation, with the highest rates among young children. As the population living in urban slums rapidly increases in many African countries, characterizing and addressing the rising burden of burns is likely to become a public health priority. Copyright © 2014 Elsevier Ltd and ISBI. All rights reserved.

  19. Predicting Porosity and Permeability for the Canyon Formation, SACROC Unit (Kelly-Snyder Field), Using the Geologic Analysis via Maximum Likelihood System

    International Nuclear Information System (INIS)

    Reinaldo Gonzalez; Scott R. Reeves; Eric Eslinger

    2007-01-01

    Accurate, high-resolution, three-dimensional (3D) reservoir characterization can provide substantial benefits for effective oilfield management. By doing so, the predictive reliability of reservoir flow models, which are routinely used as the basis for significant investment decisions designed to recover millions of barrels of oil, can be substantially improved. This is particularly true when Secondary Oil Recovery (SOR) or Enhanced Oil Recovery (EOR) operations are planned. If injectants such as water, hydrocarbon gases, steam, CO2, etc. are to be used, an understanding of fluid migration paths can mean the difference between economic success and failure. SOR/EOR projects will increasingly take place in heterogeneous reservoirs where interwell complexity is high and difficult to understand. Although reasonable reservoir characterization information often exists at the wellbore, the only economical way to sample the interwell region is with seismic methods, which is why today's standard practice for developing a 3D reservoir description resorts to seismic inversion techniques. However, the application of these methods brings other technical drawbacks that can render them inefficient. The industry therefore needs improved reservoir characterization approaches that are quicker, more accurate, and less expensive than today's standard methods. To achieve this objective, the Department of Energy (DOE) has been promoting studies with the goal of evaluating whether robust relationships between data at vastly different scales of measurement could be established using advanced pattern recognition (soft computing) methods. Advanced Resources International (ARI) has performed two of these projects with encouraging results showing the feasibility of establishing critical relationships between data at different measurement scales to create high-resolution reservoir characterization. In this third study performed by ARI and also funded by the DOE, a model

  20. Gaussian likelihood inference on data from trans-Gaussian random fields with Matérn covariance function

    KAUST Repository

    Yan, Yuan; Genton, Marc G.

    2017-01-01

    Gaussian likelihood inference has been studied and used extensively in both statistical theory and applications due to its simplicity. However, in practice, the assumption of Gaussianity is rarely met in the analysis of spatial data. In this paper, we study the effect of non-Gaussianity on Gaussian likelihood inference for the parameters of the Matérn covariance model. By using Monte Carlo simulations, we generate spatial data from a Tukey g-and-h random field, a flexible trans-Gaussian random field, with the Matérn covariance function, where g controls skewness and h controls tail heaviness. We use maximum likelihood based on the multivariate Gaussian distribution to estimate the parameters of the Matérn covariance function. We illustrate the effects of non-Gaussianity of the data on the estimated covariance function by means of functional boxplots. Thanks to our tailored simulation design, a comparison of the maximum likelihood estimator under both the increasing and fixed domain asymptotics for spatial data is performed. We find that the maximum likelihood estimator based on Gaussian likelihood is overall satisfactory and preferable to the non-distribution-based weighted least squares estimator for data from the Tukey g-and-h random field. We also present the result for Gaussian kriging based on Matérn covariance estimates with data from the Tukey g-and-h random field and observe an overall satisfactory performance.
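A Tukey g-and-h field is obtained by transforming a Gaussian field pointwise. A sketch of the usual transform, tau_{g,h}(z) = g^{-1}(e^{gz} - 1) e^{h z^2 / 2}, with the g = 0 limit handled separately:

```python
import math

def tukey_gh(z, g, h):
    """Tukey g-and-h transform of a standard normal draw z.

    g controls skewness and h controls tail heaviness; g = h = 0
    recovers z itself (the Gaussian case)."""
    gz = z if g == 0 else (math.exp(g * z) - 1.0) / g
    return gz * math.exp(h * z * z / 2.0)

print(round(tukey_gh(1.0, 0.5, 0.1), 3))  # 1.364
```

Applying this transform to each margin of a Gaussian random field with Matérn covariance produces the trans-Gaussian data whose effect on Gaussian likelihood inference the paper studies.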

  1. Gaussian likelihood inference on data from trans-Gaussian random fields with Matérn covariance function

    KAUST Repository

    Yan, Yuan

    2017-07-13

    Gaussian likelihood inference has been studied and used extensively in both statistical theory and applications due to its simplicity. However, in practice, the assumption of Gaussianity is rarely met in the analysis of spatial data. In this paper, we study the effect of non-Gaussianity on Gaussian likelihood inference for the parameters of the Matérn covariance model. By using Monte Carlo simulations, we generate spatial data from a Tukey g-and-h random field, a flexible trans-Gaussian random field, with the Matérn covariance function, where g controls skewness and h controls tail heaviness. We use maximum likelihood based on the multivariate Gaussian distribution to estimate the parameters of the Matérn covariance function. We illustrate the effects of non-Gaussianity of the data on the estimated covariance function by means of functional boxplots. Thanks to our tailored simulation design, a comparison of the maximum likelihood estimator under both the increasing and fixed domain asymptotics for spatial data is performed. We find that the maximum likelihood estimator based on Gaussian likelihood is overall satisfactory and preferable to the non-distribution-based weighted least squares estimator for data from the Tukey g-and-h random field. We also present the result for Gaussian kriging based on Matérn covariance estimates with data from the Tukey g-and-h random field and observe an overall satisfactory performance.

  2. Identifying aspects of neighbourhood deprivation associated with increased incidence of schizophrenia.

    Science.gov (United States)

    Bhavsar, Vishal; Boydell, Jane; Murray, Robin; Power, Paddy

    2014-06-01

    Several studies have found an association between area deprivation and incidence of schizophrenia. However, not all studies have concurred and definitions of deprivation have varied between studies. Relative deprivation and inequality seem to be particularly important, but which aspects of deprivation or how this effect might operate is not known. The Lambeth Early Onset case register is a database of all cases of first episode psychosis aged 16 to 35 years from the London Borough of Lambeth, a highly urban area. We identified 405 people with first onset schizophrenia who presented between 2000 and 2007. We calculated the overall incidence of first onset schizophrenia and tested for an association with area-level deprivation, using a multi-domain index of deprivation (IMD 2004). Specific analyses into associations with individual sub-domains of deprivation were then undertaken. Incidence rates, directly standardized for age and gender, were calculated for Lambeth at two geographical levels (small and large neighbourhood level). The Poisson regression model predicting incidence rate ratios for schizophrenia using overall deprivation score was statistically significant at both levels after adjusting for ethnicity, ethnic density, population density and population turnover. The incidence rate ratio for electoral ward deprivation was 1.03 (95% CI=1.004-1.04) and for the super output area deprivation was 1.04 (95% CI=1.02-1.06). The individual domains of crime, employment deprivation and educational deprivation were statistically significant predictors of incidence but, after adjusting for the other domains as well as age, gender, ethnicity and population density, only crime and educational deprivation remained statistically significant. Low income, poor housing and deprived living environment did not predict incidence. In a highly urban area, an association was found between area-level deprivation and incidence of schizophrenia, after controlling for age, gender

  3. L.U.St: a tool for approximated maximum likelihood supertree reconstruction.

    Science.gov (United States)

    Akanni, Wasiu A; Creevey, Christopher J; Wilkinson, Mark; Pisani, Davide

    2014-06-12

    Supertrees combine disparate, partially overlapping trees to generate a synthesis that provides a high level perspective that cannot be attained from the inspection of individual phylogenies. Supertrees can be seen as meta-analytical tools that can be used to make inferences based on results of previous scientific studies. Their meta-analytical application has increased in popularity since it was realised that the power of statistical tests for the study of evolutionary trends critically depends on the use of taxon-dense phylogenies. Further to that, supertrees have found applications in phylogenomics where they are used to combine gene trees and recover species phylogenies based on genome-scale data sets. Here, we present the L.U.St package, a Python tool for approximate maximum likelihood supertree inference, and illustrate its application using a genomic data set for the placental mammals. L.U.St allows the calculation of the approximate likelihood of a supertree given a set of input trees, performs heuristic searches to look for the supertree of highest likelihood, and performs statistical tests of two or more supertrees. To this end, L.U.St implements a winning sites test allowing ranking of a collection of a priori selected hypotheses, given as a collection of input supertree topologies. It also outputs a file of input-tree-wise likelihood scores that can be used as input to CONSEL for calculation of standard tests of two trees (e.g. Kishino-Hasegawa, Shimodaira-Hasegawa and Approximately Unbiased tests). This is the first fully parametric implementation of a supertree method; it has clearly understood properties and provides several advantages over currently available supertree approaches. It is easy to implement and works on any platform that has Python installed. BitBucket page - https://afro-juju@bitbucket.org/afro-juju/l.u.st.git. Davide.Pisani@bristol.ac.uk.

  4. Performance of penalized maximum likelihood in estimation of genetic covariances matrices

    Directory of Open Access Journals (Sweden)

    Meyer Karin

    2011-11-01

    Full Text Available Abstract Background Estimation of genetic covariance matrices for multivariate problems comprising more than a few traits is inherently problematic, since sampling variation increases dramatically with the number of traits. This paper investigates the efficacy of regularized estimation of covariance components in a maximum likelihood framework, imposing a penalty on the likelihood designed to reduce sampling variation. In particular, penalties that "borrow strength" from the phenotypic covariance matrix are considered. Methods An extensive simulation study was carried out to investigate the reduction in average 'loss', i.e. the deviation in estimated matrices from the population values, and the accompanying bias for a range of parameter values and sample sizes. A number of penalties are examined, penalizing either the canonical eigenvalues or the genetic covariance or correlation matrices. In addition, several strategies to determine the amount of penalization to be applied, i.e. to estimate the appropriate tuning factor, are explored. Results It is shown that substantial reductions in loss for estimates of genetic covariance can be achieved for small to moderate sample sizes. While no penalty performed best overall, penalizing the variance among the estimated canonical eigenvalues on the logarithmic scale or shrinking the genetic towards the phenotypic correlation matrix appeared most advantageous. Estimating the tuning factor using cross-validation resulted in a loss reduction 10 to 15% less than that obtained if population values were known. Applying a mild penalty, chosen so that the deviation in likelihood from the maximum was non-significant, performed as well if not better than cross-validation and can be recommended as a pragmatic strategy. Conclusions Penalized maximum likelihood estimation provides the means to 'make the most' of limited and precious data and facilitates more stable estimation for multi-dimensional analyses. 
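The "borrow strength" penalty described above can be illustrated with a toy linear-shrinkage sketch. This is not the authors' REML machinery: the three traits, the correlation values, and the fixed tuning factor psi are all invented, and linear shrinkage of the genetic toward the phenotypic correlation matrix is just one of the penalties the paper considers. In practice psi would be chosen by cross-validation or by bounding the drop in log-likelihood, as the abstract recommends.

```python
import numpy as np

# Noisy estimate of the genetic correlation matrix (small sample)
R_g = np.array([[1.0, 0.8, 0.6],
                [0.8, 1.0, 0.7],
                [0.6, 0.7, 1.0]])
# Better-estimated phenotypic correlation matrix to "borrow strength" from
R_p = np.array([[1.0, 0.5, 0.3],
                [0.5, 1.0, 0.4],
                [0.3, 0.4, 1.0]])

def shrink(R_g, R_p, psi):
    """Linear shrinkage of R_g toward R_p; psi=0 is no penalty, psi=1 full shrinkage."""
    return (1.0 - psi) * R_g + psi * R_p

R_shrunk = shrink(R_g, R_p, 0.3)   # mild penalty; diagonal stays exactly 1
```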

  5. Analyzing multivariate survival data using composite likelihood and flexible parametric modeling of the hazard functions

    DEFF Research Database (Denmark)

    Nielsen, Jan; Parner, Erik

    2010-01-01

    In this paper, we model multivariate time-to-event data by composite likelihood of pairwise frailty likelihoods and marginal hazards using natural cubic splines. Both right- and interval-censored data are considered. The suggested approach is applied on two types of family studies using the gamma...

  6. Major Accidents (Gray Swans) Likelihood Modeling Using Accident Precursors and Approximate Reasoning.

    Science.gov (United States)

    Khakzad, Nima; Khan, Faisal; Amyotte, Paul

    2015-07-01

    Compared to the remarkable progress in risk analysis of normal accidents, the risk analysis of major accidents is not as well established, partly due to the complexity of such accidents and partly due to the low probabilities involved. The issue of low probabilities normally arises from the scarcity of relevant data, since major accidents are few and far between. In this work, knowing that major accidents are frequently preceded by accident precursors, a novel precursor-based methodology has been developed for likelihood modeling of major accidents in critical infrastructures, based on a unique combination of accident precursor data, information theory, and approximate reasoning. For this purpose, we have introduced an innovative application of information analysis to identify the most informative near accident of a major accident. The observed data of the near accident were then used to establish predictive scenarios to foresee the occurrence of the major accident. We verified the methodology using offshore blowouts in the Gulf of Mexico, and then demonstrated its application to dam breaches in the United States. © 2015 Society for Risk Analysis.

  7. MAXIMUM-LIKELIHOOD-ESTIMATION OF THE ENTROPY OF AN ATTRACTOR

    NARCIS (Netherlands)

    SCHOUTEN, JC; TAKENS, F; VANDENBLEEK, CM

    In this paper, a maximum-likelihood estimate of the (Kolmogorov) entropy of an attractor is proposed that can be obtained directly from a time series. Also, the relative standard deviation of the entropy estimate is derived; it is dependent on the entropy and on the number of samples used in the

  8. Determination of point of maximum likelihood in failure domain using genetic algorithms

    International Nuclear Information System (INIS)

    Obadage, A.S.; Harnpornchai, N.

    2006-01-01

    The point of maximum likelihood in a failure domain yields the highest value of the probability density function in the failure domain. The maximum-likelihood point thus represents the worst combination of random variables that contribute in the failure event. In this work Genetic Algorithms (GAs) with an adaptive penalty scheme have been proposed as a tool for the determination of the maximum likelihood point. The utilization of only numerical values in the GAs operation makes the algorithms applicable to cases of non-linear and implicit single and multiple limit state function(s). The algorithmic simplicity readily extends its application to higher dimensional problems. When combined with Monte Carlo Simulation, the proposed methodology will reduce the computational complexity and at the same time will enhance the possibility in rare-event analysis under limited computational resources. Since, there is no approximation done in the procedure, the solution obtained is considered accurate. Consequently, GAs can be used as a tool for increasing the computational efficiency in the element and system reliability analyses
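The idea above can be sketched with a deliberately simple toy problem (not the authors' implementation): a bivariate standard normal density, the invented limit state g(x) = 5 - x1 - x2 (failure where g(x) <= 0), truncation selection, and a geometrically growing penalty weight standing in for the paper's adaptive penalty scheme. For this toy problem the maximum-likelihood point in the failure domain is known to be (2.5, 2.5), on the limit-state boundary.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_pdf(x):
    # log-density of the standard bivariate normal, up to a constant
    return -0.5 * np.sum(x**2, axis=-1)

def g(x):
    # limit state function: failure domain is g(x) <= 0, i.e. x1 + x2 >= 5
    return 5.0 - x[..., 0] - x[..., 1]

def fitness(x, c):
    # penalized fitness: likelihood minus a quadratic infeasibility penalty
    return log_pdf(x) - c * np.maximum(g(x), 0.0)**2

pop = rng.normal(0.0, 3.0, size=(100, 2))   # random initial population
c = 1.0
for _ in range(300):
    order = np.argsort(fitness(pop, c))[::-1]
    parents = pop[order[:50]]                # elitist truncation selection
    children = parents + rng.normal(0.0, 0.3, parents.shape)  # Gaussian mutation
    pop = np.vstack([parents, children])
    c = min(c * 1.05, 1e4)                   # grow the penalty weight adaptively

best = pop[np.argmax(fitness(pop, c))]       # should approach (2.5, 2.5)
```

Because only fitness values are used, nothing in the loop would change for a non-linear or implicit limit state, which is the property the abstract emphasizes.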

  9. Incidence, clinical predictors and outcome of acute renal failure among North Indian trauma patients

    Science.gov (United States)

    Medha; Subramanian, Arulselvi; Pandey, Ravindra Mohan; Sawhney, Chhavi; Upadhayay, Ashish Dutt; Albert, Venencia

    2013-01-01

    Context: There is a need for identifying risk factors that aggravate the development of acute renal failure after trauma and for defining new parameters for better assessment and management. The aim of the study was to determine the incidence of acute renal failure among trauma patients, and its correlation with various laboratory and clinical parameters recorded at the time of admission and with in-hospital mortality. Subjects and Methods: This retrospective cohort study included 208 admitted trauma patients over a period of one year. 135 trauma patients with a serum creatinine level >2.0 mg/dL were enrolled in the acute renal failure group. 73 patients with a normal creatinine level formed the control group. They were further assessed with clinical details and laboratory investigations. Results: The incidence of acute renal failure was 3.1%. There were 118 (87.4%) males and the average length of stay was 9 (1, 83) days. Severity of injury (ISS, GCS) was relatively greater in the renal failure group. Renal failure was transient in 35 (25.9%) patients. They had a higher incidence of bone fracture (54.0%) (P = 0.04). A statistically significant association was observed between head trauma and mortality, 72 (59.0%) (P = 0.001). Septic (24; 59.7%) and hemorrhagic (9; 7.4%) shock affected the renal failure group. Conclusion: Trauma patients with a urea level >50 mg/dL or ISS >24 on the first day of admission had 23 times and 7 times the risk of developing renal failure, respectively. Similarly, patients with hepatic dysfunction and pulmonary dysfunction had 12 times and 6 times the risk. Patients who developed cardiovascular dysfunction, hematological dysfunction and post-trauma renal failure during the hospital stay had a risk of mortality of 29, 7 and 8 times, respectively. The final prognostic score obtained was: 14*hepatic dysfunction + 11*cISS + 18*cUrea + 12*cGlucose + 10*pulmonary dysfunction. Optimal score cut-off for prediction of renal failure was found to be ≥25 with

  10. Incidence, clinical predictors and outcome of acute renal failure among North Indian trauma patients

    Directory of Open Access Journals (Sweden)

    Medha

    2013-01-01

    Full Text Available Context: There is a need for identifying risk factors that aggravate the development of acute renal failure after trauma and for defining new parameters for better assessment and management. The aim of the study was to determine the incidence of acute renal failure among trauma patients, and its correlation with various laboratory and clinical parameters recorded at the time of admission and with in-hospital mortality. Subjects and Methods: This retrospective cohort study included 208 admitted trauma patients over a period of one year. 135 trauma patients with a serum creatinine level >2.0 mg/dL were enrolled in the acute renal failure group. 73 patients with a normal creatinine level formed the control group. They were further assessed with clinical details and laboratory investigations. Results: The incidence of acute renal failure was 3.1%. There were 118 (87.4%) males and the average length of stay was 9 (1, 83) days. Severity of injury (ISS, GCS) was relatively greater in the renal failure group. Renal failure was transient in 35 (25.9%) patients. They had a higher incidence of bone fracture (54.0%) (P = 0.04). A statistically significant association was observed between head trauma and mortality, 72 (59.0%) (P = 0.001). Septic (24; 59.7%) and hemorrhagic (9; 7.4%) shock affected the renal failure group. Conclusion: Trauma patients with a urea level >50 mg/dL or ISS >24 on the first day of admission had 23 times and 7 times the risk of developing renal failure, respectively. Similarly, patients with hepatic dysfunction and pulmonary dysfunction had 12 times and 6 times the risk. Patients who developed cardiovascular dysfunction, hematological dysfunction and post-trauma renal failure during the hospital stay had a risk of mortality of 29, 7 and 8 times, respectively. The final prognostic score obtained was: 14*hepatic dysfunction + 11*cISS + 18*cUrea + 12*cGlucose + 10*pulmonary dysfunction. Optimal score cut-off for prediction of renal failure was

  11. On Bayesian Testing of Additive Conjoint Measurement Axioms Using Synthetic Likelihood.

    Science.gov (United States)

    Karabatsos, George

    2018-06-01

    This article introduces a Bayesian method for testing the axioms of additive conjoint measurement. The method is based on an importance sampling algorithm that performs likelihood-free, approximate Bayesian inference using a synthetic likelihood to overcome the analytical intractability of this testing problem. This new method improves upon previous methods because it provides an omnibus test of the entire hierarchy of cancellation axioms, beyond double cancellation. It does so while accounting for the posterior uncertainty that is inherent in the empirical orderings that are implied by these axioms, together. The new method is illustrated through a test of the cancellation axioms on a classic survey data set, and through the analysis of simulated data.
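The synthetic-likelihood idea itself (though not the article's axiom-testing machinery) can be illustrated with a toy location problem, all settings invented: simulate replicates at each candidate parameter value, fit a Gaussian to the simulated summary statistics, and score the observed summaries under that Gaussian in place of the intractable true likelihood.

```python
import numpy as np

rng = np.random.default_rng(5)

# "Observed" data with unknown location 1.5; summaries are (mean, std)
obs = rng.normal(1.5, 1.0, 100)
s_obs = np.array([obs.mean(), obs.std()])

def synthetic_loglik(theta, n_sim=200):
    """Gaussian synthetic log-likelihood of the observed summaries at theta."""
    sims = rng.normal(theta, 1.0, size=(n_sim, 100))        # simulated replicates
    S = np.column_stack([sims.mean(axis=1), sims.std(axis=1)])
    mu, cov = S.mean(axis=0), np.cov(S, rowvar=False)       # fit a Gaussian to summaries
    d = s_obs - mu
    _, logdet = np.linalg.slogdet(cov)
    return -0.5 * (logdet + d @ np.linalg.solve(cov, d))

grid = np.linspace(0.5, 2.5, 41)
theta_hat = grid[np.argmax([synthetic_loglik(t) for t in grid])]  # near 1.5
```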

  12. Maximum Likelihood Time-of-Arrival Estimation of Optical Pulses via Photon-Counting Photodetectors

    Science.gov (United States)

    Erkmen, Baris I.; Moision, Bruce E.

    2010-01-01

    Many optical imaging, ranging, and communications systems rely on the estimation of the arrival time of an optical pulse. Recently, such systems have been increasingly employing photon-counting photodetector technology, which changes the statistics of the observed photocurrent. This requires time-of-arrival estimators to be developed and their performances characterized. The statistics of the output of an ideal photodetector, which are well modeled as a Poisson point process, were considered. An analytical model was developed for the mean-square error of the maximum likelihood (ML) estimator, demonstrating two phenomena that cause deviations from the minimum achievable error at low signal power. An approximation was derived to the threshold at which the ML estimator essentially fails to provide better than a random guess of the pulse arrival time. Comparing the analytic model performance predictions to those obtained via simulations, it was verified that the model accurately predicts the ML performance over all regimes considered. There is little prior art that attempts to understand the fundamental limitations to time-of-arrival estimation from Poisson statistics. This work establishes both a simple mathematical description of the error behavior, and the associated physical processes that yield this behavior. Previous work on mean-square error characterization for ML estimators has predominantly focused on additive Gaussian noise. This work demonstrates that the discrete nature of the Poisson noise process leads to a distinctly different error behavior.
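A hedged sketch of the estimation problem described above, with invented numbers: photon timestamps follow a Poisson point process whose intensity is a flat background plus a pulse (here Gaussian-shaped) at unknown arrival time tau. Because the integrated intensity over the observation window is essentially independent of tau, the ML estimate reduces to maximizing the summed log-intensity at the observed photon times.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated photon arrival times: a pulse at tau_true plus uniform background
tau_true, width, T = 4.0, 0.1, 10.0
t_sig = rng.normal(tau_true, width, 500)   # photons from the pulse
t_bkg = rng.uniform(0.0, T, 200)           # background photons
times = np.concatenate([t_sig, t_bkg])

def log_lik(tau, b=20.0, s=2000.0):
    """Poisson-process log-likelihood of the timestamps, up to a tau-independent term."""
    lam = b + s * np.exp(-(times - tau)**2 / (2 * width**2))
    return np.sum(np.log(lam))

grid = np.linspace(0.0, T, 2001)           # 5 ms search grid
tau_hat = grid[np.argmax([log_lik(t) for t in grid])]
```

At low signal counts this grid search exhibits exactly the threshold behavior the abstract analyzes: with too few pulse photons the maximizer degenerates toward a random background fluctuation.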

  13. Narrow band interference cancelation in OFDM: A structured maximum likelihood approach

    KAUST Repository

    Sohail, Muhammad Sadiq; Al-Naffouri, Tareq Y.; Al-Ghadhban, Samir N.

    2012-01-01

    This paper presents a maximum likelihood (ML) approach to mitigate the effect of narrow band interference (NBI) in a zero padded orthogonal frequency division multiplexing (ZP-OFDM) system. The NBI is assumed to be time variant and asynchronous

  14. Incidence, presentation and outcome of toxoplasmosis in HIV infected in the combination antiretroviral therapy era

    DEFF Research Database (Denmark)

    Martin-Iguacel, Raquel; Ahlstrom, Magnus Glindvad; Touma, Madeleine

    2017-01-01

    Background: HIV-associated incidence and prognosis of cerebral toxoplasmosis (CTX) is not well established during later years. Methods: From the Danish HIV Cohort Study, we identified 6325 HIV-infected individuals. We assessed incidence, mortality, predictive and prognostic factors of CTX during...

  15. An Efficient UD-Based Algorithm for the Computation of Maximum Likelihood Sensitivity of Continuous-Discrete Systems

    DEFF Research Database (Denmark)

    Boiroux, Dimitri; Juhl, Rune; Madsen, Henrik

    2016-01-01

    This paper addresses maximum likelihood parameter estimation of continuous-time nonlinear systems with discrete-time measurements. We derive an efficient algorithm for the computation of the log-likelihood function and its gradient, which can be used in gradient-based optimization algorithms. ... This algorithm uses UD decomposition of symmetric matrices and the array algorithm for covariance update and gradient computation. We test our algorithm on the Lotka-Volterra equations. Compared to the maximum likelihood estimation based on finite difference gradient computation, we get a significant speedup...

  16. Comparison between artificial neural networks and maximum likelihood classification in digital soil mapping

    Directory of Open Access Journals (Sweden)

    César da Silva Chagas

    2013-04-01

    Full Text Available Soil surveys are the main source of spatial information on soils and have a range of different applications, mainly in agriculture. The continuity of this activity has however been severely compromised, mainly due to a lack of governmental funding. The purpose of this study was to evaluate the feasibility of two different classifiers (artificial neural networks and a maximum likelihood algorithm) in the prediction of soil classes in the northwest of the state of Rio de Janeiro. Terrain attributes such as elevation, slope, aspect, plan curvature and compound topographic index (CTI), and indices of clay minerals, iron oxide and the Normalized Difference Vegetation Index (NDVI), derived from Landsat 7 ETM+ sensor imagery, were used as discriminating variables. The two classifiers were trained and validated for each soil class using 300 and 150 samples respectively, representing the characteristics of these classes in terms of the discriminating variables. According to the statistical tests, the accuracy of the classifier based on artificial neural networks (ANNs) was greater than that of the classic Maximum Likelihood Classifier (MLC). Comparing the results with 126 points of reference showed that the resulting ANN map (73.81%) was superior to the MLC map (57.94%). The main errors when using the two classifiers were caused by: (a) the geological heterogeneity of the area coupled with problems related to the geological map; (b) the depth of lithic contact and/or rock exposure; and (c) problems with the environmental correlation model used due to the polygenetic nature of the soils. This study confirms that the use of terrain attributes together with remote sensing data by an ANN approach can be a tool to facilitate soil mapping in Brazil, primarily due to the availability of low-cost remote sensing data and the ease by which terrain attributes can be obtained.
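The classic Maximum Likelihood Classifier compared above models each class as a multivariate Gaussian over the discriminating variables (elevation, slope, NDVI, etc.) and assigns each pixel to the class with the highest Gaussian log-likelihood. A minimal sketch with synthetic two-band data standing in for the real terrain and imagery stack (the class means, spreads and sample counts are invented):

```python
import numpy as np

rng = np.random.default_rng(2)

def fit_class(samples):
    """Per-class ML estimates: mean vector and covariance matrix."""
    return samples.mean(axis=0), np.cov(samples, rowvar=False)

def log_lik(x, mu, cov):
    """Gaussian log-likelihood of a pixel, up to a class-independent constant."""
    d = np.asarray(x) - mu
    _, logdet = np.linalg.slogdet(cov)
    return -0.5 * (logdet + d @ np.linalg.solve(cov, d))

# Two hypothetical soil classes, 300 training pixels each (2 "bands")
train_a = rng.normal([0.2, 0.5], 0.1, size=(300, 2))
train_b = rng.normal([0.6, 0.3], 0.1, size=(300, 2))
params = [fit_class(train_a), fit_class(train_b)]

def classify(x):
    # assign the pixel to the class with the highest log-likelihood
    return int(np.argmax([log_lik(x, mu, cov) for mu, cov in params]))
```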

  17. DREAM3: network inference using dynamic context likelihood of relatedness and the inferelator.

    Directory of Open Access Journals (Sweden)

    Aviv Madar

    2010-03-01

    Full Text Available Many current works aiming to learn regulatory networks from systems biology data must balance model complexity with respect to data availability and quality. Methods that learn regulatory associations based on unit-less metrics, such as Mutual Information, are attractive in that they scale well and reduce the number of free parameters (model complexity) per interaction to a minimum. In contrast, methods for learning regulatory networks based on explicit dynamical models are more complex and scale less gracefully, but are attractive as they may allow direct prediction of transcriptional dynamics and resolve the directionality of many regulatory interactions. We aim to investigate whether scalable information-based methods (like the Context Likelihood of Relatedness method) and more explicit dynamical models (like Inferelator 1.0) prove synergistic when combined. We test a pipeline where a novel modification of the Context Likelihood of Relatedness (mixed-CLR, modified to use time-series data) is first used to define likely regulatory interactions, and then Inferelator 1.0 is used for final model selection and to build an explicit dynamical model. Our method ranked 2nd out of 22 in the DREAM3 100-gene in silico networks challenge. Mixed-CLR and Inferelator 1.0 are complementary, demonstrating a large performance gain relative to any single tested method, with precision being especially high at low recall values. Partitioning the provided data set into four groups (knock-down, knock-out, time-series, and combined) revealed that using comprehensive knock-out data alone provides optimal performance. Inferelator 1.0 proved particularly powerful at resolving the directionality of regulatory interactions, i.e. "who regulates who" (approximately of identified true positives were correctly resolved). Performance drops for high in-degree genes, i.e. as the number of regulators per target gene increases, but not with out-degree, i.e.
performance is not affected by

  18. Risk and course of motor complications in a population-based incident Parkinson's disease cohort.

    Science.gov (United States)

    Bjornestad, Anders; Forsaa, Elin B; Pedersen, Kenn Freddy; Tysnes, Ole-Bjorn; Larsen, Jan Petter; Alves, Guido

    2016-01-01

    Motor complications may become major challenges in the management of patients with Parkinson's disease. In this study, we sought to determine the incidence, risk factors, evolution, and treatment of motor fluctuations and dyskinesias in a population-representative, incident Parkinson's disease cohort. In this prospective population-based 5-year longitudinal study, we followed 189 incident and initially drug-naïve Parkinson's disease patients biannually for detailed examination of dyskinesias and motor fluctuations as defined by the Unified Parkinson's disease Rating Scale. We performed Kaplan-Meier survival and Cox regression analyses to assess cumulative incidence and risk factors of these motor complications. The 5-year cumulative incidence of motor complications was 52.4%. Motor fluctuations occurred in 42.9% and dyskinesias in 24.3%. Besides higher motor severity predicting both motor fluctuations (p = 0.016) and dyskinesias (p motor fluctuations (p = 0.001), whereas female gender predicted dyskinesias (p = 0.001). Actual levodopa dose at onset of motor fluctuations (p = 0.037) or dyskinesias (p 0.1) independently predicted development of motor complications. Motor fluctuations reversed in 37% and dyskinesias in 49% of patients on oral treatment and remained generally mild in those with persistent complications. No patients received device-aided therapies during the study. More than 50% in the general Parkinson's disease population develop motor complications within 5 years of diagnosis. However, they remain mild in the vast majority and are reversible in a substantial proportion of patients. Copyright © 2015 Elsevier Ltd. All rights reserved.

  19. Combined influence of multiple climatic factors on the incidence of bacterial foodborne diseases.

    Science.gov (United States)

    Park, Myoung Su; Park, Ki Hwan; Bahk, Gyung Jin

    2018-01-01

    Information regarding the relationship between the incidence of foodborne diseases (FBD) and climatic factors is useful in designing preventive strategies for FBD based on anticipated future climate change. To better predict the effect of climate change on foodborne pathogens, the present study investigated the combined influence of multiple climatic factors on bacterial FBD incidence in South Korea. During 2011-2015, the relationships between 8 climatic factors and the incidences of 13 bacterial FBD were determined, based on inpatient stays, on a monthly basis using Pearson correlation analyses, multicollinearity tests, principal component analysis (PCA), and seasonal autoregressive integrated moving average (SARIMA) modeling. Of the 8 climatic variables, the combination of temperature, relative humidity, precipitation, insolation, and cloudiness was significantly associated with salmonellosis (Pclimatic factors. These findings indicate that the relationships between multiple climatic factors and bacterial FBD incidence can be valuable for the development of prediction models for future patterns of disease in response to changes in climate. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. Approximate likelihood approaches for detecting the influence of primordial gravitational waves in cosmic microwave background polarization

    Science.gov (United States)

    Pan, Zhen; Anderes, Ethan; Knox, Lloyd

    2018-05-01

    One of the major targets for next-generation cosmic microwave background (CMB) experiments is the detection of the primordial B-mode signal. Planning is under way for Stage-IV experiments that are projected to have instrumental noise small enough to make lensing and foregrounds the dominant source of uncertainty for estimating the tensor-to-scalar ratio r from polarization maps. This makes delensing a crucial part of future CMB polarization science. In this paper we present a likelihood method for estimating the tensor-to-scalar ratio r from CMB polarization observations, which combines the benefits of a full-scale likelihood approach with the tractability of the quadratic delensing technique. This method is a pixel space, all order likelihood analysis of the quadratic delensed B modes, and it essentially builds upon the quadratic delenser by taking into account all order lensing and pixel space anomalies. Its tractability relies on a crucial factorization of the pixel space covariance matrix of the polarization observations which allows one to compute the full Gaussian approximate likelihood profile, as a function of r , at the same computational cost of a single likelihood evaluation.

  1. Quantifying the Establishment Likelihood of Invasive Alien Species Introductions Through Ports with Application to Honeybees in Australia.

    Science.gov (United States)

    Heersink, Daniel K; Caley, Peter; Paini, Dean R; Barry, Simon C

    2016-05-01

    The cost of an uncontrolled incursion of invasive alien species (IAS) arising from undetected entry through ports can be substantial, and knowledge of port-specific risks is needed to help allocate limited surveillance resources. Quantifying the establishment likelihood of such an incursion requires quantifying the ability of a species to enter, establish, and spread. Estimation of the approach rate of IAS into ports provides a measure of likelihood of entry. Data on the approach rate of IAS are typically sparse, and the combinations of risk factors relating to country of origin and port of arrival diverse. This presents challenges to making formal statistical inference on establishment likelihood. Here we demonstrate how these challenges can be overcome with judicious use of mixed-effects models when estimating the incursion likelihood into Australia of the European (Apis mellifera) and Asian (A. cerana) honeybees, along with the invasive parasites of biosecurity concern they host (e.g., Varroa destructor). Our results demonstrate how skewed the establishment likelihood is, with one-tenth of the ports accounting for 80% or more of the likelihood for both species. These results have been utilized by biosecurity agencies in the allocation of resources to the surveillance of maritime ports. © 2015 Society for Risk Analysis.

  2. A simple route to maximum-likelihood estimates of two-locus recombination fractions under inequality restrictions

    Indian Academy of Sciences (India)

    Home; Journals; Journal of Genetics; Volume 94; Issue 3. A simple route to maximum-likelihood estimates of two-locus recombination fractions under inequality restrictions. Iain L. Macdonald Philasande Nkalashe. Research Note Volume 94 Issue 3 September 2015 pp 479-481 ...

  3. A prediction model for lymph node metastasis in T1 esophageal squamous cell carcinoma.

    Science.gov (United States)

    Wu, Jie; Chen, Qi-Xun; Shen, Di-Jian; Zhao, Qiang

    2018-04-01

    Endoscopic resection is widely used for the treatment of T1 esophageal cancer, but it cannot be used to treat lymph node metastasis (LNM). This study aimed to develop a prediction model for LNM in patients with T1 esophageal squamous cell carcinoma. A prospectively maintained database of all patients who underwent surgery for esophageal cancer between January 2002 and June 2010 was retrospectively reviewed, and patients with T1 squamous cell carcinoma were included in this study. Correlations between LNM and clinicopathological variables were evaluated using univariable and multivariable logistic regression analyses. The penalized maximum likelihood method was used to estimate regression coefficients. A prediction model was developed and internally validated using a bootstrap resampling method. Model performance was evaluated in terms of calibration, discrimination, and clinical usefulness. A total of 240 patients (197 male, 43 female) with a mean age of 57.9 years (standard deviation ± 8.3 years) were included in the analysis. The incidence of LNM was 16.3%. The prediction model consisted of four variables: grade, T1 stage, tumor location and tumor length. The model showed good calibration and good discrimination with a C-index of 0.787 (95% confidence interval [CI], 0.711-0.863). After internal validation, the optimism-corrected C-index was 0.762 (95% CI, 0.686-0.838). Decision curve analysis demonstrated that the prediction model was clinically useful. Our prediction model can facilitate individualized prediction of LNM in patients with T1 esophageal squamous cell carcinoma. This model can aid surgical decision making in patients who have undergone endoscopic resection. Copyright © 2017 The American Association for Thoracic Surgery. Published by Elsevier Inc. All rights reserved.
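The penalized-maximum-likelihood step described above can be sketched as a ridge-penalized logistic regression fitted by Newton-Raphson. Everything below is invented for illustration (predictors, effect sizes, event rate, penalty strength, and the simplification of penalizing the intercept too); the paper's actual model uses grade, T1 stage, tumor location and tumor length.

```python
import numpy as np

rng = np.random.default_rng(6)

# Simulated cohort: 240 patients, 4 standardized predictors, ~15-20% events
n = 240
X = np.column_stack([np.ones(n), rng.normal(size=(n, 4))])
beta_true = np.array([-1.8, 0.9, 0.6, 0.4, 0.0])
p = 1.0 / (1.0 + np.exp(-X @ beta_true))
y = rng.binomial(1, p)

lam = 1.0                      # ridge penalty strength (tuning assumption)
beta = np.zeros(5)
for _ in range(50):            # penalized Newton-Raphson (IRLS)
    mu = 1.0 / (1.0 + np.exp(-X @ beta))
    W = mu * (1.0 - mu)
    H = (X.T * W) @ X + lam * np.eye(5)          # penalized Hessian
    grad = X.T @ (y - mu) - lam * beta           # penalized score
    beta = beta + np.linalg.solve(H, grad)
```

The penalty shrinks the coefficients toward zero, which stabilizes the fit when events are few, exactly the situation the abstract's bootstrap validation is guarding against.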

  4. The asymptotic behaviour of the maximum likelihood function of Kriging approximations using the Gaussian correlation function

    CSIR Research Space (South Africa)

    Kok, S

    2012-07-01

    Full Text Available continuously as the correlation function hyper-parameters approach zero. Since the global minimizer of the maximum likelihood function is an asymptote in this case, it is unclear if maximum likelihood estimation (MLE) remains valid. Numerical ill...

  5. Increased Kawasaki Disease Incidence Associated With Higher Precipitation and Lower Temperatures, Japan, 1991-2004.

    Science.gov (United States)

    Abrams, Joseph Y; Blase, Jennifer L; Belay, Ermias D; Uehara, Ritei; Maddox, Ryan A; Schonberger, Lawrence B; Nakamura, Yosikazu

    2018-06-01

    Kawasaki disease (KD) is an acute febrile vasculitis, which primarily affects children. The etiology of KD is unknown; while certain characteristics of the disease suggest an infectious origin, genetic or environmental factors may also be important. Seasonal patterns of KD incidence are well documented, but it is unclear whether these patterns are caused by changes in climate or by other unknown seasonal effects. The relationship between KD incidence and deviations from expected temperature and precipitation were analyzed using KD incidence data from Japanese nationwide epidemiologic surveys (1991-2004) and climate data from 136 weather stations of the Japan Meteorological Agency. Seven separate Poisson-distributed generalized linear regression models were run to examine the effects of temperature and precipitation on KD incidence in the same month as KD onset and the previous 1, 2, 3, 4, 5 and 6 months, controlling for geography as well as seasonal and long-term trends in KD incidence. KD incidence was negatively associated with temperature in the previous 2, 3, 4 and 5 months and positively associated with precipitation in the previous 1 and 2 months. The model that best predicted variations in KD incidence used climate data from the previous 2 months. An increase in total monthly precipitation by 100 mm was associated with increased KD incidence (rate ratio [RR] 1.012, 95% confidence interval [CI]: 1.005-1.019), and an increase of monthly mean temperature by 1°C was associated with decreased KD incidence (RR 0.984, 95% CI: 0.978-0.990). KD incidence was significantly affected by temperature and precipitation in previous months independent of other unknown seasonal factors. Climate data from the previous 2 months best predicted the variations in KD incidence. Although fairly minor, the effect of temperature and precipitation independent of season may provide additional clues to the etiology of KD.
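Rate ratios like those reported above come from log-linear Poisson regression: counts ~ Poisson(exp(b0 + b1 * covariate)), so exp(b1) is the rate ratio per covariate unit. A self-contained sketch with simulated data (the covariate scale and effect size are invented, and the study's own models additionally controlled for geography and seasonal and long-term trends):

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulated monthly case counts driven by precipitation (in 100-mm units)
n = 600
precip = rng.uniform(0.0, 3.0, n)
X = np.column_stack([np.ones(n), precip])
beta_true = np.array([2.0, 0.3])               # true log-rate-ratio 0.3 per unit
y = rng.poisson(np.exp(X @ beta_true))

# Fit by Newton-Raphson / IRLS, warm-started from a log-linear least squares fit
beta = np.linalg.lstsq(X, np.log(y + 1.0), rcond=None)[0]
for _ in range(25):
    mu = np.exp(X @ beta)
    H = (X.T * mu) @ X                         # Fisher information
    beta = beta + np.linalg.solve(H, X.T @ (y - mu))

rate_ratio = np.exp(beta[1])                   # RR per 100 mm of precipitation
```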

  6. Do Judgments of Learning Predict Automatic Influences of Memory?

    Science.gov (United States)

    Undorf, Monika; Böhm, Simon; Cüpper, Lutz

    2016-01-01

    Current memory theories generally assume that memory performance reflects both recollection and automatic influences of memory. Research on people's predictions about the likelihood of remembering recently studied information on a memory test, that is, on judgments of learning (JOLs), suggests that both magnitude and resolution of JOLs are linked…

  7. Multilevel maximum likelihood estimation with application to covariance matrices

    Czech Academy of Sciences Publication Activity Database

    Turčičová, Marie; Mandel, J.; Eben, Kryštof

    Published online: 23 January 2018. ISSN 0361-0926. R&D Projects: GA ČR GA13-34856S. Institutional support: RVO:67985807. Keywords: Fisher information * High dimension * Hierarchical maximum likelihood * Nested parameter spaces * Spectral diagonal covariance model * Sparse inverse covariance model. Subject RIV: BB - Applied Statistics, Operational Research. Impact factor: 0.311, year: 2016

  8. Cancer incidence in Canada: trends and projections (1983-2032)

    Directory of Open Access Journals (Sweden)

    Lin Xie

    2015-01-01

    Full Text Available In this monograph, we present historical and projected cancer incidence frequencies and rates for Canada, excluding non-melanoma skin cancers (i.e. basal and squamous carcinomas), for 1983 to 2032. The information is intended to help in planning strategy and allocating resources and infrastructure for future cancer control and health care. Projected changes in cancer incidence rates: From 2003-2007 to 2028-2032, the age-standardized incidence rates (ASIRs) for all cancers combined are predicted to decrease in Canadian males by 5%, from 464.8 to 443.2 per 100 000 population, and increase in Canadian females by 4%, from 358.3 to 371.0 per 100 000. The overall decrease in cancer rates in males will be driven by the decrease in lung cancer rates in men aged 65 or older and in prostate cancer rates in men aged 75 or older. The overall increase in cancer rates in females reflects the predicted rise in lung cancer rates in women aged 65 or older. The increase also represents the expected increase in cancers of the uterus, thyroid, breast (in females under 45), leukemia, pancreas, kidney and melanoma. The largest changes in ASIRs projected over the 25-year forecasting horizon are increases in thyroid cancer (55% in males and 65% in females) and liver cancer in males (43%), and decreases in larynx cancer (47% in males and 59% in females), lung cancer in males (34%) and stomach cancer (30% in males and 24% in females). The incidence rate of lung cancer in females is projected to continue to rise by 2% from 2003-2007 to 2008-2012 and then start to decrease over the last 20 projection years, by 18%. Breast cancer incidence is expected to change the least of all cancers in females (an increase of less than 1%). The predicted changes in the rates for colorectal cancer are below the medians for all cancers, with a decrease of 6% for both males and females during the entire projection period. The rates for prostate cancer are projected to be stable, based on an

  9. Approximate maximum parsimony and ancestral maximum likelihood.

    Science.gov (United States)

    Alon, Noga; Chor, Benny; Pardi, Fabio; Rapoport, Anat

    2010-01-01

    We explore the maximum parsimony (MP) and ancestral maximum likelihood (AML) criteria in phylogenetic tree reconstruction. Both problems are NP-hard, so we seek approximate solutions. We formulate the two problems as Steiner tree problems under appropriate distances. The gist of our approach is the succinct characterization of Steiner trees for a small number of leaves for the two distances. This enables the use of known Steiner tree approximation algorithms. The approach leads to a 16/9 approximation ratio for AML and asymptotically to a 1.55 approximation ratio for MP.
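    Scoring a fixed tree under the maximum parsimony criterion (the per-tree subproblem behind the Steiner-tree formulation above) can be done exactly with Fitch's algorithm; the paper's contribution concerns approximating the search over trees, which is the NP-hard part. A minimal sketch, assuming a rooted binary tree given as nested tuples; the taxa and character data are illustrative:

```python
# Fitch's algorithm: minimum number of character-state changes needed on a
# fixed rooted binary tree -- the per-tree subproblem of maximum parsimony.
def fitch_score(tree, leaf_states):
    """tree: nested 2-tuples of leaf-name strings; leaf_states: name -> state."""
    changes = 0

    def visit(node):
        nonlocal changes
        if isinstance(node, str):          # leaf: singleton state set
            return {leaf_states[node]}
        left, right = node
        a, b = visit(left), visit(right)
        if a & b:                          # intersection: no change needed here
            return a & b
        changes += 1                       # disjoint children cost one change
        return a | b

    visit(tree)
    return changes

# Four taxa scored at a single character site (illustrative data).
tree = (("t1", "t2"), ("t3", "t4"))
states = {"t1": "A", "t2": "A", "t3": "G", "t4": "G"}
score = fitch_score(tree, states)          # one change suffices on this tree
```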

  10. Modelling maximum likelihood estimation of availability

    International Nuclear Information System (INIS)

    Waller, R.A.; Tietjen, G.L.; Rock, G.W.

    1975-01-01

    Suppose the performance of a nuclear powered electrical generating power plant is continuously monitored to record the sequence of failure and repairs during sustained operation. The purpose of this study is to assess one method of estimating the performance of the power plant when the measure of performance is availability. That is, we determine the probability that the plant is operational at time t. To study the availability of a power plant, we first assume statistical models for the variables, X and Y, which denote the time-to-failure and the time-to-repair variables, respectively. Once those statistical models are specified, the availability, A(t), can be expressed as a function of some or all of their parameters. Usually those parameters are unknown in practice and so A(t) is unknown. This paper discusses the maximum likelihood estimator of A(t) when the time-to-failure model for X is an exponential density with parameter, lambda, and the time-to-repair model for Y is an exponential density with parameter, theta. Under the assumption of exponential models for X and Y, it follows that the instantaneous availability at time t is A(t)=lambda/(lambda+theta)+theta/(lambda+theta)exp[-[(1/lambda)+(1/theta)]t] with t>0. Also, the steady-state availability is A(infinity)=lambda/(lambda+theta). We use the observations from n failure-repair cycles of the power plant, say X 1 , X 2 , ..., Xsub(n), Y 1 , Y 2 , ..., Ysub(n) to present the maximum likelihood estimators of A(t) and A(infinity). The exact sampling distributions for those estimators and some statistical properties are discussed before a simulation model is used to determine 95% simulation intervals for A(t). The methodology is applied to two examples which approximate the operating history of two nuclear power plants. (author)
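    The estimator described above can be sketched directly: with the exponential models parameterized by their means, the maximum likelihood estimates of lambda and theta are the sample means of the failure and repair times, which plug into the closed-form A(t) quoted in the abstract. A minimal sketch; the cycle data are hypothetical:

```python
import math

def availability_mle(failure_times, repair_times, t):
    """MLE of instantaneous availability A(t) with exponential time-to-failure
    (mean lam) and time-to-repair (mean theta) models, using the closed form
    quoted in the abstract; the MLEs of the means are the sample means."""
    lam = sum(failure_times) / len(failure_times)      # estimated mean TTF
    theta = sum(repair_times) / len(repair_times)      # estimated mean TTR
    steady = lam / (lam + theta)                       # A(infinity)
    a_t = steady + (theta / (lam + theta)) * math.exp(-((1 / lam) + (1 / theta)) * t)
    return a_t, steady

# Hypothetical failure/repair cycles in hours (values are illustrative only).
x = [120.0, 95.0, 150.0, 110.0]    # times to failure, X_1..X_n
y = [4.0, 6.0, 5.0, 5.0]           # times to repair, Y_1..Y_n
a_t, a_inf = availability_mle(x, y, t=24.0)
```

Note that A(0) = 1 and A(t) decays toward the steady-state availability as t grows, matching the two formulas in the abstract.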

  11. Maximum likelihood positioning for gamma-ray imaging detectors with depth of interaction measurement

    International Nuclear Information System (INIS)

    Lerche, Ch.W.; Ros, A.; Monzo, J.M.; Aliaga, R.J.; Ferrando, N.; Martinez, J.D.; Herrero, V.; Esteve, R.; Gadea, R.; Colom, R.J.; Toledo, J.; Mateo, F.; Sebastia, A.; Sanchez, F.; Benlloch, J.M.

    2009-01-01

    The center of gravity algorithm leads to strong artifacts for gamma-ray imaging detectors that are based on monolithic scintillation crystals and position sensitive photo-detectors. This is a consequence of using the centroids as position estimates. The fact that charge division circuits can also be used to compute the standard deviation of the scintillation light distribution opens a way out of this drawback. We studied the feasibility of maximum likelihood estimation for computing the true gamma-ray photo-conversion position from the centroids and the standard deviation of the light distribution. The method was evaluated on a test detector that consists of the position sensitive photomultiplier tube H8500 and a monolithic LSO crystal (42mmx42mmx10mm). Spatial resolution was measured for the centroids and the maximum likelihood estimates. The results suggest that the maximum likelihood positioning is feasible and partially removes the strong artifacts of the center of gravity algorithm.

  12. Maximum likelihood positioning for gamma-ray imaging detectors with depth of interaction measurement

    Energy Technology Data Exchange (ETDEWEB)

    Lerche, Ch.W. [Grupo de Sistemas Digitales, ITACA, Universidad Politecnica de Valencia, 46022 Valencia (Spain)], E-mail: lerche@ific.uv.es; Ros, A. [Grupo de Fisica Medica Nuclear, IFIC, Universidad de Valencia-Consejo Superior de Investigaciones Cientificas, 46980 Paterna (Spain); Monzo, J.M.; Aliaga, R.J.; Ferrando, N.; Martinez, J.D.; Herrero, V.; Esteve, R.; Gadea, R.; Colom, R.J.; Toledo, J.; Mateo, F.; Sebastia, A. [Grupo de Sistemas Digitales, ITACA, Universidad Politecnica de Valencia, 46022 Valencia (Spain); Sanchez, F.; Benlloch, J.M. [Grupo de Fisica Medica Nuclear, IFIC, Universidad de Valencia-Consejo Superior de Investigaciones Cientificas, 46980 Paterna (Spain)

    2009-06-01

    The center of gravity algorithm leads to strong artifacts for gamma-ray imaging detectors that are based on monolithic scintillation crystals and position sensitive photo-detectors. This is a consequence of using the centroids as position estimates. The fact that charge division circuits can also be used to compute the standard deviation of the scintillation light distribution opens a way out of this drawback. We studied the feasibility of maximum likelihood estimation for computing the true gamma-ray photo-conversion position from the centroids and the standard deviation of the light distribution. The method was evaluated on a test detector that consists of the position sensitive photomultiplier tube H8500 and a monolithic LSO crystal (42mmx42mmx10mm). Spatial resolution was measured for the centroids and the maximum likelihood estimates. The results suggest that the maximum likelihood positioning is feasible and partially removes the strong artifacts of the center of gravity algorithm.

  13. An error taxonomy system for analysis of haemodialysis incidents.

    Science.gov (United States)

    Gu, Xiuzhu; Itoh, Kenji; Suzuki, Satoshi

    2014-12-01

    This paper describes the development of a haemodialysis error taxonomy system for analysing incidents and predicting the safety status of a dialysis organisation. The error taxonomy system was developed by adapting an error taxonomy system which assumed no specific specialty to haemodialysis situations. Its application was conducted with 1,909 incident reports collected from two dialysis facilities in Japan. Over 70% of haemodialysis incidents were reported as problems or complications related to dialyser, circuit, medication and setting of dialysis condition. Approximately 70% of errors took place immediately before and after the four hours of haemodialysis therapy. Error types most frequently made in the dialysis unit were omission and qualitative errors. Failures or complications classified to staff human factors, communication, task and organisational factors were found in most dialysis incidents. Device/equipment/materials, medicine and clinical documents were most likely to be involved in errors. Haemodialysis nurses were involved in more incidents related to medicine and documents, whereas dialysis technologists made more errors with device/equipment/materials. This error taxonomy system can not only be used to investigate incidents and adverse events occurring in the dialysis setting, but can also estimate safety-related aspects of an organisation, such as its reporting culture. © 2014 European Dialysis and Transplant Nurses Association/European Renal Care Association.

  14. Lay understanding of forensic statistics: Evaluation of random match probabilities, likelihood ratios, and verbal equivalents.

    Science.gov (United States)

    Thompson, William C; Newman, Eryn J

    2015-08-01

    Forensic scientists have come under increasing pressure to quantify the strength of their evidence, but it is not clear which of several possible formats for presenting quantitative conclusions will be easiest for lay people, such as jurors, to understand. This experiment examined the way that people recruited from Amazon's Mechanical Turk (n = 541) responded to 2 types of forensic evidence--a DNA comparison and a shoeprint comparison--when an expert explained the strength of this evidence 3 different ways: using random match probabilities (RMPs), likelihood ratios (LRs), or verbal equivalents of likelihood ratios (VEs). We found that verdicts were sensitive to the strength of DNA evidence regardless of how the expert explained it, but verdicts were sensitive to the strength of shoeprint evidence only when the expert used RMPs. The weight given to DNA evidence was consistent with the predictions of a Bayesian network model that incorporated the perceived risk of a false match from 3 causes (coincidence, a laboratory error, and a frame-up), but shoeprint evidence was undervalued relative to the same Bayesian model. Fallacious interpretations of the expert's testimony (consistent with the source probability error and the defense attorney's fallacy) were common and were associated with the weight given to the evidence and verdicts. The findings indicate that perceptions of forensic science evidence are shaped by prior beliefs and expectations as well as expert testimony and consequently that the best way to characterize and explain forensic evidence may vary across forensic disciplines. (c) 2015 APA, all rights reserved.
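    The presentation formats studied here are algebraically linked: for a reported match (absent error), the likelihood ratio is the reciprocal of the random match probability, and it updates prior odds multiplicatively. A minimal sketch of that arithmetic, with a simplified error-rate adjustment in the spirit of the false-match sources discussed above; the numbers and the adjustment formula are illustrative, not the paper's model:

```python
def likelihood_ratio_from_rmp(rmp):
    """For a reported match with no error: P(match | same source) = 1 and
    P(match | different source) = rmp, so LR = 1 / rmp."""
    return 1.0 / rmp

def posterior_odds(prior_odds, lr):
    # Bayes' rule in odds form: posterior odds = prior odds * likelihood ratio.
    return prior_odds * lr

def lr_with_error(rmp, error_rate):
    # Simplified adjustment: a false match can also arise from error, so
    # P(match | different source) ~ rmp + error_rate, which caps the LR.
    return 1.0 / (rmp + error_rate)

lr = likelihood_ratio_from_rmp(1e-6)                 # a 1-in-a-million RMP
post = posterior_odds(prior_odds=1.0 / 1000, lr=lr)  # 1000:1 in favour
capped = lr_with_error(1e-6, 1e-3)                   # error rate dominates
```

The last call shows why a tiny RMP does not by itself yield overwhelming evidence: once any plausible error probability is admitted, it bounds the effective likelihood ratio.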

  15. An evaluation of the effect of natural background radiation on cancer incidence

    International Nuclear Information System (INIS)

    Cohen, Jerry J.

    1978-01-01

    Previous studies on the relationship between levels of natural background radiation and cancer incidence indicate no significant correlation. This observation is shown to be consistent with certain predicted effect levels of ionizing radiation on malignancy production (BEIR, ICRP). Other theoretical predictions on the effects of ionizing radiation indicate induction rates to be as high as 8 × 10⁻³ cancers/person-rem. Assuming this factor were correct, then roughly one-half of the cancer incidence in the USA could be attributed to exposure to natural background radiation. By statistically testing various hypothetically assigned cancer induction rates against observed data, it is possible to develop a probabilistic perspective on the cause-effect relationship. Tests have been performed using normalized (by age, death rate, etc.) cancer incidence by state against levels of background radiation. This evaluation allows for the determination of the probability of observing the actual data given that the hypotheses were correct. Graphic relationships between hypothetically assigned radiation induced cancer rates vs. the probability of observing the actual incidence are developed and presented. It is shown that if the cancer induction rate were in excess of ∼10⁻³ cancers/person-rem, it would be highly improbable that there would, in fact, be a lack of correlation between the rates of natural background radiation and cancer incidence. (author)

  16. A Social Psychological Model for Predicting Sexual Harassment.

    Science.gov (United States)

    Pryor, John B.; And Others

    1995-01-01

    Presents a Person X Situation (PXS) model of sexual harassment suggesting that sexually harassing behavior may be predicted from an analysis of social situational and personal factors. Research on sexual harassment proclivities in men is reviewed, and a profile of men who have a high likelihood to sexually harass is discussed. Possible PXS…

  17. Maximum likelihood pixel labeling using a spatially variant finite mixture model

    International Nuclear Information System (INIS)

    Gopal, S.S.; Hebert, T.J.

    1996-01-01

    We propose a spatially-variant mixture model for pixel labeling. Based on this spatially-variant mixture model we derive an expectation maximization algorithm for maximum likelihood estimation of the pixel labels. While most algorithms using mixture models entail the subsequent use of a Bayes classifier for pixel labeling, the proposed algorithm yields maximum likelihood estimates of the labels themselves and results in unambiguous pixel labels. The proposed algorithm is fast, robust, easy to implement, flexible in that it can be applied to any arbitrary image data where the number of classes is known and, most importantly, obviates the need for an explicit labeling rule. The algorithm is evaluated both quantitatively and qualitatively on simulated data and on clinical magnetic resonance images of the human brain
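    The idea can be illustrated with a toy version: a two-class EM loop on a 1-D "image" in which every pixel keeps its own mixing weights, updated from its responsibilities, and the final label is simply the argmax responsibility. This is a simplified sketch in the spirit of the abstract, not the authors' exact algorithm:

```python
import math
import random

def em_pixel_labels(pixels, n_iter=30):
    """Two-class EM with per-pixel mixing weights (a toy spatially-variant
    mixture): each pixel keeps its own class weights, updated from its
    responsibilities, and the final label is the argmax responsibility."""
    n = len(pixels)
    mu = [min(pixels), max(pixels)]              # crude initial class means
    sigma2 = [1.0, 1.0]
    w = [[0.5, 0.5] for _ in range(n)]           # spatially variant weights
    r = []
    for _ in range(n_iter):
        # E-step: per-pixel responsibilities under the current parameters.
        r = []
        for i, x in enumerate(pixels):
            dens = [w[i][k] * math.exp(-(x - mu[k]) ** 2 / (2 * sigma2[k]))
                    / math.sqrt(2 * math.pi * sigma2[k]) for k in range(2)]
            s = sum(dens)
            r.append([d / s for d in dens])
        # M-step: class means/variances, then the per-pixel weights.
        for k in range(2):
            tot = sum(r[i][k] for i in range(n))
            mu[k] = sum(r[i][k] * pixels[i] for i in range(n)) / tot
            sigma2[k] = max(1e-6, sum(r[i][k] * (pixels[i] - mu[k]) ** 2
                                      for i in range(n)) / tot)
        w = r
    return [0 if ri[0] >= ri[1] else 1 for ri in r]

random.seed(0)
pixels = ([random.gauss(0.0, 0.5) for _ in range(50)]
          + [random.gauss(5.0, 0.5) for _ in range(50)])
labels = em_pixel_labels(pixels)
```

Because the labels come straight from the maximum of the responsibilities, no separate Bayes classification step is needed, which is the point the abstract makes.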

  18. The predictive value of current haemoglobin levels for incident tuberculosis and/or mortality during long-term antiretroviral therapy in South Africa: a cohort study.

    Science.gov (United States)

    Kerkhoff, Andrew D; Wood, Robin; Cobelens, Frank G; Gupta-Wright, Ankur; Bekker, Linda-Gail; Lawn, Stephen D

    2015-04-02

    Low haemoglobin concentrations may be predictive of incident tuberculosis (TB) and death in HIV-infected patients receiving antiretroviral therapy (ART), but data are limited and inconsistent. We examined these relationships retrospectively in a long-term South African ART cohort with multiple time-updated haemoglobin measurements. Prospectively collected clinical data on patients receiving ART for up to 8 years in a community-based cohort were analysed. Time-updated haemoglobin concentrations, CD4 counts and HIV viral loads were recorded, and TB diagnoses and deaths from all causes were ascertained. Anaemia severity was classified using World Health Organization criteria. TB incidence and mortality rates were calculated and Poisson regression models were used to identify independent predictors of incident TB and mortality, respectively. During a median follow-up of 5.0 years (IQR, 2.5-5.8) of 1,521 patients, 476 cases of incident TB and 192 deaths occurred during 6,459 person-years (PYs) of follow-up. TB incidence rates were strongly associated with time-updated anaemia severity; those without anaemia had a rate of 4.4 (95%CI, 3.8-5.1) cases/100 PYs compared to 10.0 (95%CI, 8.3-12.1), 26.6 (95%CI, 22.5-31.7) and 87.8 (95%CI, 57.0-138.2) cases/100 PYs in those with mild, moderate and severe anaemia, respectively. Similarly, mortality rates in those with no anaemia or mild, moderate and severe time-updated anaemia were 1.1 (95%CI, 0.8-1.5), 3.5 (95%CI, 2.7-4.8), 11.8 (95%CI, 9.5-14.8) and 28.2 (95%CI, 16.5-51.5) cases/100 PYs, respectively. Moderate and severe anaemia (time-updated) during ART were the strongest independent predictors for incident TB (adjusted IRR = 3.8 [95%CI, 3.0-4.8] and 8.2 [95%CI, 5.3-12.7], respectively) and for mortality (adjusted IRR = 6.0 [95%CI, 3.9-9.2] and adjusted IRR = 8.0 [95%CI, 3.9-16.4], respectively). Increasing severity of anaemia was associated with exceptionally high rates of both incident TB and mortality during
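    The incidence rates quoted above are crude event counts divided by person-time, scaled to 100 person-years; for example, the overall TB rate follows from 476 cases over 6,459 person-years. A one-function sketch of that calculation:

```python
def rate_per_100py(events, person_years):
    """Crude incidence rate expressed per 100 person-years of follow-up."""
    return 100.0 * events / person_years

# Headline numbers from the cohort: 476 incident TB cases and 192 deaths
# over 6,459 person-years of follow-up.
tb_rate = rate_per_100py(476, 6459)       # about 7.4 cases per 100 PY
death_rate = rate_per_100py(192, 6459)    # about 3.0 deaths per 100 PY
```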

  19. Microsatellite Status of Primary Colorectal Cancer Predicts the Incidence of Postoperative Colorectal Neoplasms.

    Science.gov (United States)

    Takiyama, Aki; Tanaka, Toshiaki; Yamamoto, Yoko; Hata, Keisuke; Ishihara, Soichiro; Nozawa, Hiroaki; Kawai, Kazushige; Kiyomatsu, Tomomichi; Nishikawa, Takeshi; Otani, Kensuke; Sasaki, Kazuhito; Watanabe, Toshiaki

    2017-10-01

    Few studies have evaluated the risk of postoperative colorectal neoplasms stratified by the nature of the primary colorectal cancer (CRC). In this study, we assessed this risk on the basis of the microsatellite (MS) status of the primary CRC. We retrospectively reviewed 338 patients with CRC and calculated the risk of neoplasms during postoperative surveillance colonoscopy in association with the MS status of the primary CRC. A propensity score method was applied. We identified a higher incidence of metachronous rectal neoplasms after the resection of MS-stable (MSS) CRC than of MS-instable CRC (adjusted HR 5.74, p=0.04). We also observed a higher incidence of colorectal tubular adenoma in patients with MSS CRC (adjusted hazard ratio 7.09). The MS status of the primary colorectal cancer thus influenced the risk of postoperative colorectal neoplasms. Copyright© 2017, International Institute of Anticancer Research (Dr. George J. Delinasios), All rights reserved.

  20. Computational study of jet interaction flow field with and without incidence

    International Nuclear Information System (INIS)

    Asif, M.; Zahir, S.; Khan, M.A.

    2004-01-01

    The objective was to study the interaction of a side jet with the incoming supersonic and hypersonic flow. Qualitatively, the same Cp trends were obtained as found experimentally. In the aerodynamic coefficients, the side jet interaction produces an additional pitching moment, caused by the high-pressure region upstream of the jet and the low-pressure region downstream of the jet. The jet interaction also raises the lift coefficient. For the incidence case, simulations were performed for hypersonic flow over a biconic body with a supersonic lateral jet at Mach 9.7 and incidence angles from 0° to -12° and 12°. The results obtained were compared with the experimental results and with the CFD code CFL3D. PAK-3D overpredicts the surface pressure compared to the CFL3D and experimental results, whereas the qualitative trends are the same. Finally, the integrated aerodynamic force coefficients were compared with the CFL3D predictions. (author)

  1. Qualitative release assessment to estimate the likelihood of henipavirus entering the United Kingdom.

    Directory of Open Access Journals (Sweden)

    Emma L Snary

    Full Text Available The genus Henipavirus includes Hendra virus (HeV and Nipah virus (NiV, for which fruit bats (particularly those of the genus Pteropus are considered to be the wildlife reservoir. The recognition of henipaviruses occurring across a wider geographic and host range suggests the possibility of the virus entering the United Kingdom (UK. To estimate the likelihood of henipaviruses entering the UK, a qualitative release assessment was undertaken. To facilitate the release assessment, the world was divided into four zones according to location of outbreaks of henipaviruses, isolation of henipaviruses, proximity to other countries where incidents of henipaviruses have occurred and the distribution of Pteropus spp. fruit bats. From this release assessment, the key findings are that the importation of fruit from Zone 1 and 2 and bat bushmeat from Zone 1 each have a Low annual probability of release of henipaviruses into the UK. Similarly, the importation of bat meat from Zone 2, horses and companion animals from Zone 1 and people travelling from Zone 1 and entering the UK was estimated to pose a Very Low probability of release. The annual probability of release for all other release routes was assessed to be Negligible. It is recommended that the release assessment be periodically re-assessed to reflect changes in knowledge and circumstances over time.

  2. Lactate Clearance as a Predictor of Mortality in Severe Sepsis and Septic Shock Patients in the Intensive Care Unit of Dr. Hasan Sadikin General Hospital Bandung

    Directory of Open Access Journals (Sweden)

    Muhammad Budi Kurniawan

    2017-04-01

    Full Text Available Mortality in sepsis and septic shock patients in the Intensive Care Unit (ICU) is conventionally predicted using the Acute Physiology and Chronic Health Evaluation (APACHE) II score, which requires many complex examinations. The purpose of this study was to examine lactate clearance as an alternative mortality predictor. A decreased percentage of lactate clearance reflects poor microcirculatory perfusion, which suggests that lactate clearance can be used to predict mortality in severe sepsis and septic shock patients in the ICU of Dr. Hasan Sadikin General Hospital Bandung. This was a prospective observational cohort study involving 51 patients who met the severe sepsis and septic shock criteria during the period of September to November 2015. Lactate was measured in all patients at the first hour (H0) and at 24 hours (H24), and lactate clearance was calculated using the following formula: (initial lactate − delayed lactate)/initial lactate × 100%. Subjects were divided into two groups according to lactate clearance (cutoff 40%). The Mann–Whitney test was used for numeric data and Fisher's exact test for categorical data. Results showed that lactate clearance had a sensitivity of 100%, specificity of 88.4%, positive predictive value of 89.2%, negative predictive value of 100%, positive likelihood ratio of 86.6%, negative likelihood ratio of 0% and accuracy of 94.11%. Thus, lactate clearance can be used to predict mortality in severe sepsis and septic shock patients.
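    The clearance formula used in the study is a single expression; a minimal sketch with hypothetical H0/H24 lactate values and the study's 40% grouping cutoff:

```python
def lactate_clearance(initial, delayed):
    """Percentage lactate clearance as defined in the study:
    (initial lactate - delayed lactate) / initial lactate * 100%."""
    return (initial - delayed) / initial * 100.0

# Hypothetical H0 and H24 lactate values in mmol/L; 40% is the study cutoff.
clearance = lactate_clearance(4.0, 1.8)   # 55% cleared over 24 hours
group = "high" if clearance > 40.0 else "low"
```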

  3. Bias Correction for the Maximum Likelihood Estimate of Ability. Research Report. ETS RR-05-15

    Science.gov (United States)

    Zhang, Jinming

    2005-01-01

    Lord's bias function and the weighted likelihood estimation method are effective in reducing the bias of the maximum likelihood estimate of an examinee's ability under the assumption that the true item parameters are known. This paper presents simulation studies to determine the effectiveness of these two methods in reducing the bias when the item…

  4. Parameter estimation in astronomy through application of the likelihood ratio. [satellite data analysis techniques

    Science.gov (United States)

    Cash, W.

    1979-01-01

    Many problems in the experimental estimation of parameters for models can be solved through use of the likelihood ratio test. Applications of the likelihood ratio, with particular attention to photon counting experiments, are discussed. The procedures presented solve a greater range of problems than those currently in use, yet are no more difficult to apply. The procedures are proved analytically, and examples from current problems in astronomy are discussed.
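    A likelihood ratio test in this spirit can be sketched for Poisson photon counts: fit a null model (one mean rate) and a nested alternative (separate rates before and after a split point), and compare twice the log-likelihood difference against a chi-square critical value. The counts and the split point below are illustrative:

```python
import math

def poisson_loglike(counts, rate):
    # Log-likelihood of independent Poisson counts at a common mean rate.
    return sum(n * math.log(rate) - rate - math.lgamma(n + 1) for n in counts)

def lr_statistic(counts, split):
    """2 * (logL_alt - logL_null): the null fits one mean rate to all bins,
    the alternative fits separate rates before/after `split`. Under the null
    this statistic is asymptotically chi-square with 1 degree of freedom."""
    null = poisson_loglike(counts, sum(counts) / len(counts))
    a, b = counts[:split], counts[split:]
    alt = poisson_loglike(a, sum(a) / len(a)) + poisson_loglike(b, sum(b) / len(b))
    return 2.0 * (alt - null)

counts = [3, 4, 2, 5, 12, 10, 14, 11]   # photon counts per time bin (made up)
stat = lr_statistic(counts, split=4)
significant = stat > 3.84               # 95% point of chi-square with 1 dof
```

For equal rates in both halves the statistic is zero, and it grows with the evidence for a rate change, which is what makes the ratio usable for parameter estimation via confidence regions.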

  5. GENERALIZATION OF RAYLEIGH MAXIMUM LIKELIHOOD DESPECKLING FILTER USING QUADRILATERAL KERNELS

    Directory of Open Access Journals (Sweden)

    S. Sridevi

    2013-02-01

    Full Text Available Speckle noise is the most prevalent noise in clinical ultrasound images. It appears as light and dark spots and obscures pixel intensities. In fetal ultrasound images, edges and local fine details are especially important for obstetricians and gynaecologists carrying out prenatal diagnosis of congenital heart disease. A robust despeckling filter is therefore needed to suppress speckle noise proficiently while preserving these features. The proposed filter generalizes the Rayleigh maximum likelihood filter, exploiting statistical tools as tuning parameters and using different shapes of quadrilateral kernels to estimate the noise-free pixel from the neighbourhood. The performance of various filters, namely the Median, Kuwahara, Frost, homogeneous mask and Rayleigh maximum likelihood filters, is compared with the proposed filter in terms of PSNR and image profile. Comparatively, the proposed filter surpasses the conventional filters.

  6. Physical activity and incidence of sarcopenia: the population-based AGES—Reykjavik Study

    Science.gov (United States)

    Mijnarends, Donja M.; Koster, Annemarie; Schols, Jos M. G. A.; Meijers, Judith M. M.; Halfens, Ruud J. G.; Gudnason, Vilmundur; Eiriksdottir, Gudny; Siggeirsdottir, Kristin; Sigurdsson, Sigurdur; Jónsson, Pálmi V.; Meirelles, Osorio; Harris, Tamara

    2016-01-01

    Background: the prevalence of sarcopenia increases with age. Physical activity might slow the rate of muscle loss and therewith the incidence of sarcopenia. Objective: to examine the association of physical activity with incident sarcopenia over a 5-year period. Design: data from the population-based Age, Gene/Environment, Susceptibility–Reykjavik Study were used. Setting: people residing in the Reykjavik area at the start of the study. Subjects: the study included people aged 66–93 years (n = 2309). Methods: the amount of moderate–vigorous physical activity (MVPA) was assessed by a self-reported questionnaire. Sarcopenia was identified using the European Working Group on Sarcopenia in Older People algorithm, including muscle mass (computed tomography imaging), grip strength (computerised dynamometer) and gait speed (6 m). Results: mean age of the participants was 74.9 ± 4.7 years. The prevalence of sarcopenia was 7.3% at baseline and 16.8% at follow-up. The incidence proportion of sarcopenia over 5 years was 14.8% in the least-active individuals and 9.0% in the most-active individuals. Compared with the least-active participants, those reporting a moderate–high amount of MVPA had a significantly lower likelihood of incident sarcopenia (OR = 0.64, 95% CI 0.45–0.91). Participants with a high amount of MVPA had higher baseline levels of muscle mass, strength and walking speed, but baseline MVPA was not associated with the rate of muscle loss. Conclusion: a higher amount of MVPA seems to contribute to counteracting the development of sarcopenia. To delay the onset of sarcopenia and its potential adverse outcomes, attention should be paid to increasing physical activity levels in older adults. PMID:27189729

  7. Uncertainty about the true source. A note on the likelihood ratio at the activity level.

    Science.gov (United States)

    Taroni, Franco; Biedermann, Alex; Bozza, Silvia; Comte, Jennifer; Garbolino, Paolo

    2012-07-10

    This paper focuses on likelihood ratio based evaluations of fibre evidence in cases in which there is uncertainty about whether or not the reference item available for analysis - that is, an item typically taken from the suspect or seized at his home - is the item actually worn at the time of the offence. A likelihood ratio approach is proposed that, for situations in which certain categorical assumptions can be made about additionally introduced parameters, converges to formulae described in the existing literature. The properties of the proposed likelihood ratio approach are analysed through sensitivity analyses and discussed with respect to possible argumentative implications that arise in practice. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  8. Using Implicit and Explicit Measures to Predict Nonsuicidal Self-Injury Among Adolescent Inpatients.

    Science.gov (United States)

    Cha, Christine B; Augenstein, Tara M; Frost, Katherine H; Gallagher, Katie; D'Angelo, Eugene J; Nock, Matthew K

    2016-01-01

    To examine the use of implicit and explicit measures to predict adolescent nonsuicidal self-injury (NSSI) before, during, and after inpatient hospitalization. Participants were 123 adolescent psychiatric inpatients who completed measures at hospital admission and discharge. The implicit measure (Self-Injury Implicit Association Test [SI-IAT]) and one of the explicit measures pertained to the NSSI method of cutting. Patients were interviewed at multiple time points at which they reported whether they had engaged in NSSI before their hospital stay, during their hospital stay, and within 3 months after discharge. At baseline, SI-IAT scores differentiated past-year self-injurers and noninjurers (t(121) = 4.02, p < .001, d = 0.73). These SI-IAT effects were stronger among patients who engaged in cutting (versus noncutting NSSI methods). Controlling for NSSI history and prospective risk factors, SI-IAT scores predicted patients' subsequent cutting behavior during their hospital stay (odds ratio (OR) = 8.19, CI = 1.56-42.98, p < .05). Patients' explicit self-report uniquely predicted hospital-based and postdischarge cutting, even after controlling for SI-IAT scores (ORs = 1.82-2.34, CIs = 1.25-3.87, p values <.01). Exploratory analyses revealed that in specific cases in which patients explicitly reported low likelihood of NSSI, SI-IAT scores still predicted hospital-based cutting. The SI-IAT is an implicit measure that is outcome-specific, a short-term predictor above and beyond NSSI history, and potentially helpful in cases in which patients at risk for NSSI explicitly report that they would not do so in the future. Ultimately, both implicit and explicit measures can help to predict future incidents of cutting among adolescent inpatients. Copyright © 2016 American Academy of Child and Adolescent Psychiatry. Published by Elsevier Inc. All rights reserved.

  9. Multiple Improvements of Multiple Imputation Likelihood Ratio Tests

    OpenAIRE

    Chan, Kin Wai; Meng, Xiao-Li

    2017-01-01

    Multiple imputation (MI) inference handles missing data by first properly imputing the missing values $m$ times, and then combining the $m$ analysis results from applying a complete-data procedure to each of the completed datasets. However, the existing method for combining likelihood ratio tests has multiple defects: (i) the combined test statistic can be negative in practice when the reference null distribution is a standard $F$ distribution; (ii) it is not invariant to re-parametrization; ...

  10. Evaluating Fast Maximum Likelihood-Based Phylogenetic Programs Using Empirical Phylogenomic Data Sets

    Science.gov (United States)

    Zhou, Xiaofan; Shen, Xing-Xing; Hittinger, Chris Todd

    2018-01-01

    Abstract The sizes of the data matrices assembled to resolve branches of the tree of life have increased dramatically, motivating the development of programs for fast, yet accurate, inference. For example, several different fast programs have been developed in the very popular maximum likelihood framework, including RAxML/ExaML, PhyML, IQ-TREE, and FastTree. Although these programs are widely used, a systematic evaluation and comparison of their performance using empirical genome-scale data matrices has so far been lacking. To address this question, we evaluated these four programs on 19 empirical phylogenomic data sets with hundreds to thousands of genes and up to 200 taxa with respect to likelihood maximization, tree topology, and computational speed. For single-gene tree inference, we found that the more exhaustive and slower strategies (ten searches per alignment) outperformed faster strategies (one tree search per alignment) using RAxML, PhyML, or IQ-TREE. Interestingly, single-gene trees inferred by the three programs yielded comparable coalescent-based species tree estimations. For concatenation-based species tree inference, IQ-TREE consistently achieved the best-observed likelihoods for all data sets, and RAxML/ExaML was a close second. In contrast, PhyML often failed to complete concatenation-based analyses, whereas FastTree was the fastest but generated lower likelihood values and more dissimilar tree topologies in both types of analyses. Finally, data matrix properties, such as the number of taxa and the strength of phylogenetic signal, sometimes substantially influenced the programs’ relative performance. Our results provide real-world gene and species tree phylogenetic inference benchmarks to inform the design and execution of large-scale phylogenomic data analyses. PMID:29177474

  11. Applying exclusion likelihoods from LHC searches to extended Higgs sectors

    International Nuclear Information System (INIS)

    Bechtle, Philip; Heinemeyer, Sven; Staal, Oscar; Stefaniak, Tim; Weiglein, Georg

    2015-01-01

    LHC searches for non-standard Higgs bosons decaying into tau lepton pairs constitute a sensitive experimental probe for physics beyond the Standard Model (BSM), such as supersymmetry (SUSY). Recently, the limits obtained from these searches have been presented by the CMS collaboration in a nearly model-independent fashion - as a narrow resonance model - based on the full 8 TeV dataset. In addition to publishing a 95 % C.L. exclusion limit, the full likelihood information for the narrow-resonance model has been released. This provides valuable information that can be incorporated into global BSM fits. We present a simple algorithm that maps an arbitrary model with multiple neutral Higgs bosons onto the narrow resonance model and derives the corresponding value for the exclusion likelihood from the CMS search. This procedure has been implemented into the public computer code HiggsBounds (version 4.2.0 and higher). We validate our implementation by cross-checking against the official CMS exclusion contours in three Higgs benchmark scenarios in the Minimal Supersymmetric Standard Model (MSSM), and find very good agreement. Going beyond validation, we discuss the combined constraints of the ττ search and the rate measurements of the SM-like Higgs at 125 GeV in a recently proposed MSSM benchmark scenario, where the lightest Higgs boson obtains SM-like couplings independently of the decoupling of the heavier Higgs states. Technical details for how to access the likelihood information within HiggsBounds are given in the appendix. The program is available at http://higgsbounds.hepforge.org. (orig.)

  12. Existence and uniqueness of the maximum likelihood estimator for models with a Kronecker product covariance structure

    NARCIS (Netherlands)

    Ros, B.P.; Bijma, F.; de Munck, J.C.; de Gunst, M.C.M.

    2016-01-01

    This paper deals with multivariate Gaussian models for which the covariance matrix is a Kronecker product of two matrices. We consider maximum likelihood estimation of the model parameters, in particular of the covariance matrix. There is no explicit expression for the maximum likelihood estimator

  13. Computation of the Likelihood in Biallelic Diffusion Models Using Orthogonal Polynomials

    Directory of Open Access Journals (Sweden)

    Claus Vogl

    2014-11-01

In population genetics, parameters describing forces such as mutation, migration and drift are generally inferred from molecular data. Lately, approximate methods based on simulations and summary statistics have been widely applied for such inference, even though these methods waste information. In contrast, probabilistic methods of inference can be shown to be optimal, if their assumptions are met. In genomic regions where recombination rates are high relative to mutation rates, polymorphic nucleotide sites can be assumed to evolve independently from each other. The distribution of allele frequencies at a large number of such sites has been called the “allele-frequency spectrum” or “site-frequency spectrum” (SFS). Conditional on the allelic proportions, the likelihoods of such data can be modeled as binomial. A simple model representing the evolution of allelic proportions is the biallelic mutation-drift or mutation-directional selection-drift diffusion model. With series of orthogonal polynomials, specifically Jacobi and Gegenbauer polynomials, or the related spheroidal wave function, the diffusion equations can be solved efficiently. In the neutral case, the product of the binomial likelihoods with the sum of such polynomials leads to finite series of polynomials, i.e., relatively simple equations, from which the exact likelihoods can be calculated. In this article, the use of orthogonal polynomials for inferring population genetic parameters is investigated.

  14. Elevated HbA1c and Fasting Plasma Glucose in Predicting Diabetes Incidence Among Older Adults

    Science.gov (United States)

    Lipska, Kasia J.; Inzucchi, Silvio E.; Van Ness, Peter H.; Gill, Thomas M.; Kanaya, Alka; Strotmeyer, Elsa S.; Koster, Annemarie; Johnson, Karen C.; Goodpaster, Bret H.; Harris, Tamara; De Rekeneire, Nathalie

    2013-01-01

OBJECTIVE To determine which measures—impaired fasting glucose (IFG), elevated HbA1c, or both—best predict incident diabetes in older adults. RESEARCH DESIGN AND METHODS From the Health, Aging, and Body Composition study, we selected individuals without diabetes, and we defined IFG (100–125 mg/dL) and elevated HbA1c (5.7–6.4%) per American Diabetes Association guidelines. Incident diabetes was based on self-report, use of antihyperglycemic medicines, or HbA1c ≥6.5% during 7 years of follow-up. Logistic regression analyses were adjusted for age, sex, race, site, BMI, smoking, blood pressure, and physical activity. Discrimination and calibration were assessed for models with IFG and with both IFG and elevated HbA1c. RESULTS Among 1,690 adults (mean age 76.5, 46% men, 32% black), 183 (10.8%) developed diabetes over 7 years. Adjusted odds ratios of diabetes were 6.2 (95% CI 4.4–8.8) in those with IFG (versus those with normal fasting plasma glucose [FPG]) and correspondingly elevated in those with elevated HbA1c (versus those with normal HbA1c). When IFG and elevated HbA1c were considered together, odds ratios were 3.5 (1.9–6.3) in those with IFG only, 8.0 (4.8–13.2) in those with elevated HbA1c only, and 26.2 (16.3–42.1) in those with both IFG and elevated HbA1c (versus those with normal FPG and HbA1c). Addition of elevated HbA1c to the model with IFG resulted in improved discrimination and calibration. CONCLUSIONS Older adults with both IFG and elevated HbA1c have a substantially increased odds of developing diabetes over 7 years. Combined screening with FPG and HbA1c may identify older adults at very high risk for diabetes. PMID:24135387
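The adjusted estimates above come from logistic regression, but the basic calculation behind an odds ratio and its Wald confidence interval from a 2×2 table can be sketched as follows. The counts here are hypothetical, chosen purely for illustration; they are not taken from the study.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log odds ratio
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: 40 of 200 exposed subjects versus
# 20 of 600 unexposed subjects developed the outcome.
or_, lo, hi = odds_ratio_ci(40, 160, 20, 580)
print(round(or_, 2), round(lo, 2), round(hi, 2))  # → 7.25 4.12 12.75
```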

  15. Incidence, Prognostic Impact, and Predictive Factors of Readmission for Heart Failure After Transcatheter Aortic Valve Replacement.

    Science.gov (United States)

    Durand, Eric; Doutriaux, Maxime; Bettinger, Nicolas; Tron, Christophe; Fauvel, Charles; Bauer, Fabrice; Dacher, Jean-Nicolas; Bouhzam, Najime; Litzler, Pierre-Yves; Cribier, Alain; Eltchaninoff, Hélène

    2017-12-11

The aim of this study was to assess the incidence, prognostic impact, and predictive factors of readmission for congestive heart failure (CHF) in patients with severe aortic stenosis treated by transcatheter aortic valve replacement (TAVR). TAVR is indicated in patients with severe symptomatic aortic stenosis in whom surgery is considered high risk or is contraindicated. Readmission for CHF after TAVR remains a challenge, and data on prognostic and predictive factors are lacking. All patients who underwent TAVR from January 2010 to December 2014 were included. Follow-up was achieved for at least 1 year and included clinical and echocardiographic data. Readmission for CHF was analyzed retrospectively. This study included 546 patients, 534 (97.8%) of whom received balloon-expandable valves; the transfemoral approach was used in 87.8% of cases. After 1 year, 285 patients (52.2%) had been readmitted at least once, 132 (24.1%) for CHF. Patients readmitted for CHF had an increased risk for death (p < 0.0001) and cardiac death (p < 0.0001) compared with those not readmitted for CHF. On multivariate analysis, aortic mean gradient (hazard ratio [HR]: 0.88; 95% confidence interval [CI]: 0.79 to 0.99; p = 0.03), post-procedural blood transfusion (HR: 2.27; 95% CI: 1.13 to 5.56; p = 0.009), severe post-procedural pulmonary hypertension (HR: 1.04; 95% CI: 1.00 to 1.07; p < 0.0001), and left atrial diameter (HR: 1.47; 95% CI: 1.08 to 2.01; p = 0.02) were independently associated with CHF readmission at 1 year. Readmission for CHF after TAVR was frequent and was strongly associated with 1-year mortality. Low gradient, persistent pulmonary hypertension, left atrial dilatation, and transfusions were predictive of readmission for CHF. Copyright © 2017 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.

  16. Derivation of LDA log likelihood ratio one-to-one classifier

    NARCIS (Netherlands)

    Spreeuwers, Lieuwe Jan

    2014-01-01

    The common expression for the Likelihood Ratio classifier using LDA assumes that the reference class mean is available. In biometrics, this is often not the case and only a single sample of the reference class is available. In this paper expressions are derived for biometric comparison between

  17. Counseling Pretreatment and the Elaboration Likelihood Model of Attitude Change.

    Science.gov (United States)

    Heesacker, Martin

    1986-01-01

Results of the application of the Elaboration Likelihood Model (ELM) to a counseling context revealed that more favorable attitudes toward counseling occurred as subjects' ego involvement increased and as intervention quality improved. Counselor credibility affected the degree to which subjects' attitudes reflected argument quality differences.

  18. Syphilis Predicts HIV Incidence Among Men and Transgender Women Who Have Sex With Men in a Preexposure Prophylaxis Trial

    Science.gov (United States)

    Solomon, Marc M.; Mayer, Kenneth H.; Glidden, David V.; Liu, Albert Y.; McMahan, Vanessa M.; Guanira, Juan V.; Chariyalertsak, Suwat; Fernandez, Telmo; Grant, Robert M.; Bekker, Linda-Gail; Buchbinder, Susan; Casapia, Martin; Chariyalertsak, Suwat; Guanira, Juan; Kallas, Esper; Lama, Javier; Mayer, Kenneth; Montoya, Orlando; Schechter, Mauro; Veloso, Valdiléa

    2014-01-01

    Background. Syphilis infection may potentiate transmission of human immunodeficiency virus (HIV). We sought to determine the extent to which HIV acquisition was associated with syphilis infection within an HIV preexposure prophylaxis (PrEP) trial and whether emtricitabine/tenofovir (FTC/TDF) modified that association. Methods. The Preexposure Prophylaxis Initiative (iPrEx) study randomly assigned 2499 HIV-seronegative men and transgender women who have sex with men (MSM) to receive oral daily FTC/TDF or placebo. Syphilis prevalence at screening and incidence during follow-up were measured. Hazard ratios for the effect of incident syphilis on HIV acquisition were calculated. The effect of FTC/TDF on incident syphilis and HIV acquisition was assessed. Results. Of 2499 individuals, 360 (14.4%) had a positive rapid plasma reagin test at screening; 333 (92.5%) had a positive confirmatory test, which did not differ between the arms (FTC/TDF vs placebo, P = .81). The overall syphilis incidence during the trial was 7.3 cases per 100 person-years. There was no difference in syphilis incidence between the study arms (7.8 cases per 100 person-years for FTC/TDF vs 6.8 cases per 100 person-years for placebo, P = .304). HIV incidence varied by incident syphilis (2.8 cases per 100 person-years for no syphilis vs 8.0 cases per 100 person-years for incident syphilis), reflecting a hazard ratio of 2.6 (95% confidence interval, 1.6–4.4; P < .001). There was no evidence for interaction between randomization to the FTC/TDF arm and incident syphilis on HIV incidence. Conclusions. In HIV-seronegative MSM, syphilis infection was associated with HIV acquisition in this PrEP trial; a syphilis diagnosis should prompt providers to offer PrEP unless otherwise contraindicated. PMID:24928295

  19. Efficient estimators for likelihood ratio sensitivity indices of complex stochastic dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Arampatzis, Georgios; Katsoulakis, Markos A.; Rey-Bellet, Luc [Department of Mathematics and Statistics, University of Massachusetts, Amherst, Massachusetts 01003 (United States)

    2016-03-14

    We demonstrate that centered likelihood ratio estimators for the sensitivity indices of complex stochastic dynamics are highly efficient with low, constant in time variance and consequently they are suitable for sensitivity analysis in long-time and steady-state regimes. These estimators rely on a new covariance formulation of the likelihood ratio that includes as a submatrix a Fisher information matrix for stochastic dynamics and can also be used for fast screening of insensitive parameters and parameter combinations. The proposed methods are applicable to broad classes of stochastic dynamics such as chemical reaction networks, Langevin-type equations and stochastic models in finance, including systems with a high dimensional parameter space and/or disparate decorrelation times between different observables. Furthermore, they are simple to implement as a standard observable in any existing simulation algorithm without additional modifications.

  20. Likelihood updating of random process load and resistance parameters by monitoring

    DEFF Research Database (Denmark)

    Friis-Hansen, Peter; Ditlevsen, Ove Dalager

    2003-01-01

Spectral parameters for a stationary Gaussian process are most often estimated by Fourier transformation of a realization followed by some smoothing procedure. This smoothing is often a weighted least square fitting of some prespecified parametric form of the spectrum. In this paper it is shown that maximum likelihood estimation is a rational alternative to an arbitrary weighting for least square fitting. The derived likelihood function gets singularities if the spectrum is prescribed with zero values at some frequencies. This is often the case for models of technically relevant processes. The likelihood function, even though it is of complicated mathematical form, allows an approximate Bayesian updating and control of the time development of the parameters. Some of these parameters can be structural parameters that by too much change reveal progressing damage or other malfunctioning. Thus current process ...

  1. Menyoal Elaboration Likelihood Model (ELM) dan Teori Retorika

    OpenAIRE

    Yudi Perbawaningsih

    2012-01-01

Abstract: Persuasion is a communication process to establish or change attitudes, which can be understood through the theory of Rhetoric and the theory of the Elaboration Likelihood Model (ELM). This study elaborates these theories in a Public Lecture series aimed at persuading students to choose their concentration of study. The result shows that, in terms of persuasion effectiveness, it is not quite relevant to separate the message from its source. The quality of the source is determined by the quality of the ...

  3. Democracy, Autocracy and the Likelihood of International Conflict

    OpenAIRE

    Tangerås, Thomas

    2008-01-01

    This is a game-theoretic analysis of the link between regime type and international conflict. The democratic electorate can credibly punish the leader for bad conflict outcomes, whereas the autocratic selectorate cannot. For the fear of being thrown out of office, democratic leaders are (i) more selective about the wars they initiate and (ii) on average win more of the wars they start. Foreign policy behaviour is found to display strategic complementarities. The likelihood of interstate war, ...

  4. Maximum-likelihood fitting of data dominated by Poisson statistical uncertainties

    International Nuclear Information System (INIS)

    Stoneking, M.R.; Den Hartog, D.J.

    1996-06-01

The fitting of data by χ²-minimization is valid only when the uncertainties in the data are normally distributed. When analyzing spectroscopic or particle counting data at very low signal level (e.g., a Thomson scattering diagnostic), the uncertainties are distributed with a Poisson distribution. The authors have developed a maximum-likelihood method for fitting data that correctly treats the Poisson statistical character of the uncertainties. This method maximizes the total probability that the observed data are drawn from the assumed fit function, using the Poisson probability function to determine the probability for each data point. The algorithm also returns uncertainty estimates for the fit parameters. They compare this method with a χ²-minimization routine applied to both simulated and real data. Differences in the returned fits are greater at low signal level (less than ∼20 counts per measurement). The maximum-likelihood method is found to be more accurate and robust, returning a narrower distribution of values for the fit parameters with fewer outliers.
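As a minimal sketch of the idea (not the authors' implementation), the Poisson negative log-likelihood can be minimized directly. For the simplest possible model, a single constant rate, the maximum-likelihood estimate coincides with the sample mean, which even a crude grid search recovers:

```python
import math

def poisson_nll(rate, counts):
    """Negative log-likelihood of observed counts under a single Poisson rate.
    The log(k!) term is kept so the values are true negative log-likelihoods."""
    return sum(rate - k * math.log(rate) + math.lgamma(k + 1) for k in counts)

counts = [0, 1, 3, 2, 0, 1, 4, 1]  # hypothetical low-signal data

# Crude 1-D grid search for the maximum-likelihood rate.
rates = [0.01 * i for i in range(1, 1001)]
mle = min(rates, key=lambda r: poisson_nll(r, counts))
print(round(mle, 2))  # → 1.5 (the sample mean)
```

For a real fit function with several parameters one would hand the same negative log-likelihood to a numerical optimizer instead of a grid.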

  5. Adverse incidents, patient flow and nursing variables on acute psychiatric wards: the Tompkins Acute Ward study

    NARCIS (Netherlands)

    Bowers, L.; Simpson, A.; Warren, J.; Allan, T.; Nijman, H.L.I.

    2007-01-01

    Background: Adverse incidents (violence, self-harm and absconding) can cause significant harm to patients and staff, are difficult to predict, and are driving an increase in security measures and defensive practice. Aims: To explore the relationship between adverse incidents on acute psychiatric

  6. Incidence of refeeding syndrome in internal medicine patients.

    Science.gov (United States)

    Kraaijenbrink, B V C; Lambers, W M; Mathus-Vliegen, E M H; Siegert, C E H

    2016-03-01

Refeeding syndrome is a potentially fatal shift of fluids and electrolytes that may occur after reintroducing nutrition in a malnourished patient. Its incidence in internal medicine patients is not known. We aimed at determining the incidence in a heterogeneous group of patients acutely admitted to a department of internal medicine. All patients acutely admitted to the department of internal medicine of a teaching community hospital in Amsterdam, the Netherlands, between 22 February 2011 and 29 April 2011, were included. We applied the National Institute for Health and Care Excellence (NICE) criteria for determining people at risk of refeeding syndrome and took hypophosphataemia as the main indicator for the presence of this syndrome. Of 178 patients included in the study, 97 (54%) were considered to be at risk of developing refeeding syndrome and 14 patients actually developed the syndrome (14% of patients at risk and 8% of the study population). Patients with a malignancy or previous malignancy were at increased risk of developing refeeding syndrome. The Short Nutritional Assessment Questionnaire score had positive and negative predictive values of 13% and 95%, respectively. The incidence of refeeding syndrome was relatively high in patients acutely admitted to the department of internal medicine. Oncology patients are at increased risk of developing refeeding syndrome. When taking the occurrence of hypophosphataemia as a hallmark, no other single clinical or composite parameter could be identified that accurately predicts the development of refeeding syndrome.
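The reported predictive values are simple ratios from a 2×2 screening table. A sketch, using hypothetical counts chosen only so that they reproduce the quoted 13%/95% figures (they are not the study's raw data):

```python
def ppv_npv(tp, fp, fn, tn):
    """Positive and negative predictive value from screening counts:
    tp/fp = screen-positive with/without the condition,
    fn/tn = screen-negative with/without the condition."""
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    return ppv, npv

# Hypothetical: 13 of 100 screen-positive patients truly developed the
# syndrome, and 74 of 78 screen-negative patients did not.
ppv, npv = ppv_npv(13, 87, 4, 74)
print(round(ppv, 2), round(npv, 2))  # → 0.13 0.95
```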

  7. A multifactorial likelihood model for MMR gene variant classification incorporating probabilities based on sequence bioinformatics and tumor characteristics: a report from the Colon Cancer Family Registry.

    Science.gov (United States)

Thompson, Bryony A; Goldgar, David E; Paterson, Carol; Clendenning, Mark; Walters, Rhiannon; Arnold, Sven; Parsons, Michael T; Walsh, Michael D; Gallinger, Steven; Haile, Robert W; Hopper, John L; Jenkins, Mark A; Lemarchand, Loic; Lindor, Noralane M; Newcomb, Polly A; Thibodeau, Stephen N; Young, Joanne P; Buchanan, Daniel D; Tavtigian, Sean V; Spurdle, Amanda B

    2013-01-01

    Mismatch repair (MMR) gene sequence variants of uncertain clinical significance are often identified in suspected Lynch syndrome families, and this constitutes a challenge for both researchers and clinicians. Multifactorial likelihood model approaches provide a quantitative measure of MMR variant pathogenicity, but first require input of likelihood ratios (LRs) for different MMR variation-associated characteristics from appropriate, well-characterized reference datasets. Microsatellite instability (MSI) and somatic BRAF tumor data for unselected colorectal cancer probands of known pathogenic variant status were used to derive LRs for tumor characteristics using the Colon Cancer Family Registry (CFR) resource. These tumor LRs were combined with variant segregation within families, and estimates of prior probability of pathogenicity based on sequence conservation and position, to analyze 44 unclassified variants identified initially in Australasian Colon CFR families. In addition, in vitro splicing analyses were conducted on the subset of variants based on bioinformatic splicing predictions. The LR in favor of pathogenicity was estimated to be ~12-fold for a colorectal tumor with a BRAF mutation-negative MSI-H phenotype. For 31 of the 44 variants, the posterior probabilities of pathogenicity were such that altered clinical management would be indicated. Our findings provide a working multifactorial likelihood model for classification that carefully considers mode of ascertainment for gene testing. © 2012 Wiley Periodicals, Inc.
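The multifactorial combination of a prior probability of pathogenicity with independent likelihood ratios is an application of Bayes' rule on the odds scale. A sketch with hypothetical inputs: the prior of 0.1 and the segregation LR of 2.0 are illustrative only; the ~12-fold tumor LR is the figure quoted in the abstract.

```python
def posterior_probability(prior, likelihood_ratios):
    """Combine a prior probability of pathogenicity with independent
    likelihood ratios (multifactorial likelihood model, Bayes' rule)."""
    odds = prior / (1 - prior)          # prior probability -> prior odds
    for lr in likelihood_ratios:
        odds *= lr                      # multiply in each independent LR
    return odds / (1 + odds)            # posterior odds -> probability

# Hypothetical: prior 0.1 from sequence conservation/position,
# tumor LR ~12 (BRAF-negative MSI-H tumor), segregation LR 2.0.
p = posterior_probability(0.1, [12.0, 2.0])
print(round(p, 3))  # → 0.727
```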

  8. Modeling the relationship between precipitation and malaria incidence in children from a holoendemic area in Ghana.

    Science.gov (United States)

    Krefis, Anne Caroline; Schwarz, Norbert Georg; Krüger, Andreas; Fobil, Julius; Nkrumah, Bernard; Acquah, Samuel; Loag, Wibke; Sarpong, Nimako; Adu-Sarkodie, Yaw; Ranft, Ulrich; May, Jürgen

    2011-02-01

    Climatic factors influence the incidence of vector-borne diseases such as malaria. They modify the abundance of mosquito populations, the length of the extrinsic parasite cycle in the mosquito, the malarial dynamics, and the emergence of epidemics in areas of low endemicity. The objective of this study was to investigate temporal associations between weekly malaria incidence in 1,993 children < 15 years of age and weekly rainfall. A time series analysis was conducted by using cross-correlation function and autoregressive modeling. The regression model showed that the level of rainfall predicted the malaria incidence after a time lag of 9 weeks (mean = 60 days) and after a time lag between one and two weeks. The analyses provide evidence that high-resolution precipitation data can directly predict malaria incidence in a highly endemic area. Such models might enable the development of early warning systems and support intervention measures.
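The lag identification described above rests on cross-correlating the two weekly series over a range of lags and picking the lag with the strongest correlation. A toy sketch on synthetic data with a built-in 3-week lag (illustrative only, not the study's 9-week result):

```python
def pearson(x, y):
    """Pearson correlation of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return sxy / (sx * sy)

def best_lag(rain, cases, max_lag):
    """Lag (in weeks) at which rainfall correlates most with incidence."""
    return max(range(max_lag + 1),
               key=lambda L: pearson(rain[:len(rain) - L], cases[L:]))

# Synthetic weekly series: incidence follows rainfall after 3 weeks.
rain = [1, 5, 2, 8, 3, 9, 1, 7, 2, 6, 4, 8]
cases = [0, 0, 0] + [2 * r for r in rain[:-3]]
print(best_lag(rain, cases, 5))  # → 3
```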

  9. Ciprofloxacin Resistance and Gonorrhea Incidence Rates in 17 Cities, United States, 1991–2006

    Science.gov (United States)

    Kirkcaldy, Robert D.; Gift, Thomas L.; Owusu-Edusei, Kwame; Weinstock, Hillard S.

    2014-01-01

    Antimicrobial drug resistance can hinder gonorrhea prevention and control efforts. In this study, we analyzed historical ciprofloxacin resistance data and gonorrhea incidence data to examine the possible effect of antimicrobial drug resistance on gonorrhea incidence at the population level. We analyzed data from the Gonococcal Isolate Surveillance Project and city-level gonorrhea incidence rates from surveillance data for 17 cities during 1991–2006. We found a strong positive association between ciprofloxacin resistance and gonorrhea incidence rates at the city level during this period. Their association was consistent with predictions of mathematical models in which resistance to treatment can increase gonorrhea incidence rates through factors such as increased duration of infection. These findings highlight the possibility of future increases in gonorrhea incidence caused by emerging cephalosporin resistance. PMID:24655615

  10. Rodenticide incidents of exposure and adverse effects on non-raptor birds

    Science.gov (United States)

    Vyas, Nimish B.

    2017-01-01

Interest in the adverse effects of rodenticides on birds has focused primarily on raptors. However, non-raptor birds are also poisoned (rodenticide exposure resulting in adverse effects including mortality) by rodenticides through consumption of the rodenticide bait and contaminated prey. A literature search for rodenticide incidents (evidence of exposure to a rodenticide, adverse effects, or exposure to placebo baits) involving non-raptor birds returned 641 records spanning the years 1931 to 2016. The incidents included 17 orders, 58 families, and 190 non-raptor bird species. Nineteen anticoagulant and non-anticoagulant rodenticide active ingredients were associated with the incidents. The number of incidents and species detected were compared by surveillance method. An incident was considered to have been reported through passive surveillance if it was voluntarily reported to the authorities, whereas the report of an incident found through field work conducted with the objective of documenting adverse effects on birds was attributed to active surveillance. More incidents were reported from passive surveillance than from active surveillance, but a significantly greater number of species were detected in proportion to the number of incidents found through active surveillance than through passive surveillance (z = 7.61, p < 0.001). Greater attention to non-raptor bird poisonings from rodenticides may increase incident reporting and can strengthen the predictions of harm characterized by risk assessments.

  11. Robust Likelihoods for Inflationary Gravitational Waves from Maps of Cosmic Microwave Background Polarization

    Science.gov (United States)

    Switzer, Eric Ryan; Watts, Duncan J.

    2016-01-01

    The B-mode polarization of the cosmic microwave background provides a unique window into tensor perturbations from inflationary gravitational waves. Survey effects complicate the estimation and description of the power spectrum on the largest angular scales. The pixel-space likelihood yields parameter distributions without the power spectrum as an intermediate step, but it does not have the large suite of tests available to power spectral methods. Searches for primordial B-modes must rigorously reject and rule out contamination. Many forms of contamination vary or are uncorrelated across epochs, frequencies, surveys, or other data treatment subsets. The cross power and the power spectrum of the difference of subset maps provide approaches to reject and isolate excess variance. We develop an analogous joint pixel-space likelihood. Contamination not modeled in the likelihood produces parameter-dependent bias and complicates the interpretation of the difference map. We describe a null test that consistently weights the difference map. Excess variance should either be explicitly modeled in the covariance or be removed through reprocessing the data.

  12. Entrepreneurial behavior : New perspectives gained through the critical incident technique

    NARCIS (Netherlands)

    Nandram, S.S.; Samsom, K.J.

    2007-01-01

    Responding to criticism of the trait approach in studying entrepreneurship, a process and context oriented methodology was applied using the Critical Incident Technique (CIT) in predicting success and failure. The actions of entrepreneurs were subsequently translated into (1) dynamic traits with a

  13. Remission and incidence of obstructive sleep apnea from middle childhood to late adolescence.

    Science.gov (United States)

    Spilsbury, James C; Storfer-Isser, Amy; Rosen, Carol L; Redline, Susan

    2015-01-01

To study the incidence, remission, and prediction of obstructive sleep apnea (OSA) from middle childhood to late adolescence. Longitudinal analysis. The Cleveland Children's Sleep and Health Study, an ethnically mixed, urban, community-based cohort, followed for 8 y. There were 490 participants with overnight polysomnography data available at ages 8-11 and 16-19 y. Baseline participant characteristics and health history were ascertained from parent report and US census data. OSA was defined as an obstructive apnea-hypopnea index ≥ 5 or an obstructive apnea index ≥ 1. OSA prevalence was approximately 4% at each examination, but OSA largely did not persist from middle childhood to late adolescence. Habitual snoring and obesity predicted OSA in cross-sectional analyses at each time point. Residence in a disadvantaged neighborhood, African-American race, and premature birth also predicted OSA in middle childhood, whereas male sex, high body mass index, and history of tonsillectomy or adenoidectomy were risk factors among adolescents. Obesity, but not habitual snoring, in middle childhood predicted adolescent OSA. Because OSA in middle childhood usually remitted by adolescence and most adolescent cases were incident cases, criteria other than concern alone over OSA persistence or incidence should be used when making treatment decisions for pediatric OSA. Moreover, OSA's distinct risk factors at each time point underscore the need for alternative risk-factor assessments across pediatric ages. The greater importance of middle childhood obesity compared to snoring in predicting adolescent OSA provides support for screening, preventing, and treating obesity in childhood. © 2014 Associated Professional Sleep Societies, LLC.

  14. A simplification of the likelihood ratio test statistic for testing ...

    African Journals Online (AJOL)

The traditional likelihood ratio test statistic for testing hypotheses about goodness of fit of multinomial probabilities in one-, two- and multi-dimensional contingency tables was simplified. Advantageously, using the simplified version of the statistic to test the null hypothesis is easier and faster because calculating the expected ...
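The traditional statistic being simplified is usually written G² = 2 Σᵢ Oᵢ ln(Oᵢ/Eᵢ), where Oᵢ and Eᵢ are observed and expected cell counts. A minimal sketch of the unsimplified computation for a one-dimensional table under a null of equal cell probabilities:

```python
import math

def g_squared(observed, expected):
    """Likelihood ratio statistic G^2 = 2 * sum O_i * ln(O_i / E_i) for
    multinomial goodness of fit (cells with O_i = 0 contribute zero)."""
    return 2 * sum(o * math.log(o / e)
                   for o, e in zip(observed, expected) if o > 0)

observed = [30, 20, 50]
n = sum(observed)
expected = [n / 3] * 3  # null hypothesis: equal cell probabilities
g2 = g_squared(observed, expected)
print(round(g2, 3))  # → 13.792
```

Under the null, G² is asymptotically chi-squared with (cells − 1) degrees of freedom, here 2.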

  15. Likelihood-Based Inference of B Cell Clonal Families.

    Directory of Open Access Journals (Sweden)

    Duncan K Ralph

    2016-10-01

The human immune system depends on a highly diverse collection of antibody-making B cells. B cell receptor sequence diversity is generated by a random recombination process called "rearrangement" forming progenitor B cells, then a Darwinian process of lineage diversification and selection called "affinity maturation." The resulting receptors can be sequenced in high throughput for research and diagnostics. Such a collection of sequences contains a mixture of various lineages, each of which may be quite numerous, or may consist of only a single member. As a step to understanding the process and result of this diversification, one may wish to reconstruct lineage membership, i.e. to cluster sampled sequences according to which came from the same rearrangement events. We call this clustering problem "clonal family inference." In this paper we describe and validate a likelihood-based framework for clonal family inference based on a multi-hidden Markov Model (multi-HMM) framework for B cell receptor sequences. We describe an agglomerative algorithm to find a maximum likelihood clustering, two approximate algorithms with various trade-offs of speed versus accuracy, and a third, fast algorithm for finding specific lineages. We show that under simulation these algorithms greatly improve upon existing clonal family inference methods, and that they also give significantly different clusters than previous methods when applied to two real data sets.

  16. Prediction of "BRCAness" in breast cancer by array comparative genomic hybridization

    NARCIS (Netherlands)

    Joosse, Simon Andreas

    2012-01-01

    Predicting the likelihood that an individual is a BRCA mutation carrier is the first step to genetic counseling, followed by germ-line mutation testing in many family cancer clinics. Individuals who have been diagnosed as BRCA mutation-positive are offered special medical care; however, clinical

  17. Predicting an optimal outcome after radical prostatectomy: the trifecta nomogram.

    Science.gov (United States)

    Eastham, James A; Scardino, Peter T; Kattan, Michael W

    2008-06-01

The optimal outcome after radical prostatectomy for clinically localized prostate cancer is freedom from biochemical recurrence along with the recovery of continence and erectile function, a so-called trifecta. We evaluated our series of open radical prostatectomy cases to determine the likelihood of this outcome and develop a nomogram predicting the trifecta. We reviewed the records of patients undergoing open radical prostatectomy for clinical stage T1c-T3a prostate cancer at our center during 2000 to 2006. Men were excluded if they received preoperative hormonal therapy, chemotherapy or radiation therapy, if pretreatment prostate specific antigen was more than 50 ng/ml, or if they were impotent or incontinent before radical prostatectomy. A total of 1,577 men were included in the study. Freedom from biochemical recurrence was defined as post-radical prostatectomy prostate specific antigen less than 0.2 ng/ml. Continence was defined as not having to wear any protective pads. Potency was defined as erection adequate for intercourse upon most attempts with or without a phosphodiesterase-5 inhibitor. Mean patient age was 58 years and mean pretreatment prostate specific antigen was 6.4 ng/ml. A trifecta outcome (cancer-free status with recovery of continence and potency) was achieved in 62% of patients. In a nomogram developed to predict the likelihood of the trifecta, baseline prostate specific antigen was the major predictive factor. Area under the ROC curve for the nomogram was 0.773 and calibration appeared excellent. A trifecta (optimal) outcome can be achieved in most men undergoing radical prostatectomy. The nomogram permits patients to estimate preoperatively their likelihood of an optimal outcome after radical prostatectomy.

  18. A biclustering algorithm for binary matrices based on penalized Bernoulli likelihood

    KAUST Repository

    Lee, Seokho; Huang, Jianhua Z.

    2013-01-01

    We propose a new biclustering method for binary data matrices using the maximum penalized Bernoulli likelihood estimation. Our method applies a multi-layer model defined on the logits of the success probabilities, where each layer represents a

  19. Coalescent-based species tree inference from gene tree topologies under incomplete lineage sorting by maximum likelihood.

    Science.gov (United States)

    Wu, Yufeng

    2012-03-01

    Incomplete lineage sorting can cause incongruence between the phylogenetic history of genes (the gene tree) and that of the species (the species tree), which can complicate the inference of phylogenies. In this article, I present a new coalescent-based algorithm for species tree inference with maximum likelihood. I first describe an improved method for computing the probability of a gene tree topology given a species tree, which is much faster than an existing algorithm by Degnan and Salter (2005). Based on this method, I develop a practical algorithm that takes a set of gene tree topologies and infers species trees with maximum likelihood. This algorithm searches for the best species tree by starting from initial species trees and performing heuristic search to obtain better trees with higher likelihood. This algorithm, called STELLS (which stands for Species Tree InfErence with Likelihood for Lineage Sorting), has been implemented in a program that is downloadable from the author's web page. The simulation results show that the STELLS algorithm is more accurate than an existing maximum likelihood method for many datasets, especially when there is noise in gene trees. I also show that the STELLS algorithm is efficient and can be applied to real biological datasets. © 2011 The Author. Evolution© 2011 The Society for the Study of Evolution.

  20. Modified Moment, Maximum Likelihood and Percentile Estimators for the Parameters of the Power Function Distribution

    Directory of Open Access Journals (Sweden)

    Azam Zaka

    2014-10-01

    Full Text Available This paper is concerned with modifications of the maximum likelihood, moments and percentile estimators of the two-parameter power function distribution. The sampling behavior of the estimators is examined by Monte Carlo simulation. For some combinations of parameter values, some of the modified estimators appear better than the traditional maximum likelihood, moments and percentile estimators with respect to bias, mean square error and total deviation.
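
    The abstract does not reproduce the estimator formulas. For the unmodified maximum likelihood estimators of the two-parameter power function distribution with density f(x) = (a/b)(x/b)^(a-1) on 0 < x < b, the scale estimate is the sample maximum and the shape estimate then has a closed form. A minimal sketch on simulated data (illustrative only, not the paper's code):

```python
import math, random

def power_mle(xs):
    """Unmodified MLEs (a_hat, b_hat) for the power function distribution
    f(x) = (a/b) * (x/b)**(a - 1), 0 < x < b."""
    n = len(xs)
    b_hat = max(xs)                                # MLE of the scale parameter
    a_hat = n / sum(math.log(b_hat / x) for x in xs)
    return a_hat, b_hat

random.seed(42)
a_true, b_true = 2.5, 4.0
# Inverse-CDF sampling: F(x) = (x/b)**a  =>  x = b * u**(1/a)
sample = [b_true * random.random() ** (1.0 / a_true) for _ in range(5000)]
a_hat, b_hat = power_mle(sample)
```

    Because every observation is strictly below b, the sample maximum slightly underestimates the scale, which is one motivation for the modified estimators the paper studies.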

  1. Predictive power of the DASA-IV: Variations in rating method and timescales.

    Science.gov (United States)

    Nqwaku, Mphindisi; Draycott, Simon; Aldridge-Waddon, Luke; Bush, Emma-Louise; Tsirimokou, Alexandra; Jones, Dominic; Puzzo, Ignazio

    2018-05-10

    This project evaluated the predictive validity of the Dynamic Appraisal of Situational Aggression - Inpatient Version (DASA-IV) in a high-secure psychiatric hospital in the UK over 24 hours and over a single nursing shift. DASA-IV scores from three sequential nursing shifts over a 24-hour period were compared with the mean (average of three scores across the 24-hour period) and peak (highest of the three scores across the 24-hour period) scores across these shifts. In addition, scores from a single nursing shift were used to predict aggressive incidents over each of the following three shifts. The DASA-IV was completed by nursing staff during handover meetings, rating 43 male psychiatric inpatients over a period of 6 months. Data were compared to incident reports recorded over the same period. Receiver operating characteristic (ROC) curves and generalized estimating equations assessed the predictive ability of various DASA-IV scores over 24-hour and single-shift timescales. Scores from the DASA-IV based on a single shift had moderate predictive ability for aggressive incidents occurring the next calendar day, whereas scores based on all three shifts had excellent predictive ability. DASA-IV scores from a single shift showed moderate predictive ability for each of the following three shifts. The DASA-IV has excellent predictive ability for aggressive incidents within a secure setting when data are summarized over a 24-hour period, as opposed to when a single rating is taken. In addition, it has moderate value for predicting incidents over even shorter timescales. © 2018 Australian College of Mental Health Nurses Inc.

  2. Incidence and time trends of Herpes zoster in rheumatoid arthritis: a population-based cohort study

    Science.gov (United States)

    Veetil, Bharath Manu Akkara; Myasoedova, Elena; Matteson, Eric L.; Gabriel, Sherine E.; Green, Abigail B.; Crowson, Cynthia S.

    2012-01-01

    Objective To determine the incidence, time trends, risk factors and severity of herpes zoster (HZ) in a population-based incidence cohort of patients with rheumatoid arthritis (RA) compared to a group of individuals without RA from the same population. Methods All residents of Olmsted County, MN who first fulfilled 1987 American College of Rheumatology criteria for RA between 1/1/1980 and 12/31/2007 and a cohort of similar residents without RA were assembled and followed by retrospective chart review until death, migration, or 12/31/2008. Results There was no difference in the presence of HZ prior to RA incidence/index date between the cohorts (p=0.85). During follow-up 84 patients with RA (rate: 12.1 per 1000 person-years) and 44 subjects without RA (rate: 5.4 per 1000 person-years) developed HZ. Patients with RA were more likely to develop HZ than those without RA (hazard ratio: 2.4; 95% confidence interval: 1.7, 3.5). Patients diagnosed with RA in 1995–2007 had a higher likelihood of developing HZ than those diagnosed in 1980–1994. Erosive disease, previous joint surgery, use of hydroxychloroquine and corticosteroids were significantly associated with the development of HZ in RA, while the use of methotrexate or biologic agents was not. Complications of HZ occurred at a similar rate in both cohorts. Conclusion The incidence of HZ is increased in RA and has risen in recent years. The increasing incidence of HZ in more recent years is also noted in the general population. RA disease severity is associated with development of HZ. PMID:23281295

  3. MEM spectral analysis for predicting influenza epidemics in Japan.

    Science.gov (United States)

    Sumi, Ayako; Kamo, Ken-ichi

    2012-03-01

    The prediction of influenza epidemics has long been the focus of attention in epidemiology and mathematical biology. In this study, we tested whether time series analysis was useful for predicting the incidence of influenza in Japan. The method of time series analysis we used consists of spectral analysis based on the maximum entropy method (MEM) in the frequency domain and the nonlinear least squares method in the time domain. Using this time series analysis, we analyzed the incidence data of influenza in Japan from January 1948 to December 1998; these data are unique in that they covered the periods of pandemics in Japan in 1957, 1968, and 1977. On the basis of the MEM spectral analysis, we identified the periodic modes explaining the underlying variations of the incidence data. The optimum least squares fitting (LSF) curve calculated with the periodic modes reproduced the underlying variation of the incidence data. An extension of the LSF curve could be used to predict the incidence of influenza quantitatively. Our study suggested that MEM spectral analysis would allow us to model temporal variations of influenza epidemics with multiple periodic modes much more effectively than by using the method of conventional time series analysis, which has been used previously to investigate the behavior of temporal variations in influenza data.

  4. Failed refutations: further comments on parsimony and likelihood methods and their relationship to Popper's degree of corroboration.

    Science.gov (United States)

    de Queiroz, Kevin; Poe, Steven

    2003-06-01

    Kluge's (2001, Syst. Biol. 50:322-330) continued arguments that phylogenetic methods based on the statistical principle of likelihood are incompatible with the philosophy of science described by Karl Popper are based on false premises related to Kluge's misrepresentations of Popper's philosophy. Contrary to Kluge's conjectures, likelihood methods are not inherently verificationist; they do not treat every instance of a hypothesis as confirmation of that hypothesis. The historical nature of phylogeny does not preclude phylogenetic hypotheses from being evaluated using the probability of evidence. The low absolute probabilities of hypotheses are irrelevant to the correct interpretation of Popper's concept termed degree of corroboration, which is defined entirely in terms of relative probabilities. Popper did not advocate minimizing background knowledge; in any case, the background knowledge of both parsimony and likelihood methods consists of the general assumption of descent with modification and additional assumptions that are deterministic, concerning which tree is considered most highly corroborated. Although parsimony methods do not assume (in the sense of entailing) that homoplasy is rare, they do assume (in the sense of requiring to obtain a correct phylogenetic inference) certain things about patterns of homoplasy. Both parsimony and likelihood methods assume (in the sense of implying by the manner in which they operate) various things about evolutionary processes, although violation of those assumptions does not always cause the methods to yield incorrect phylogenetic inferences. Test severity is increased by sampling additional relevant characters rather than by character reanalysis, although either interpretation is compatible with the use of phylogenetic likelihood methods. Neither parsimony nor likelihood methods assess test severity (critical evidence) when used to identify a most highly corroborated tree(s) based on a single method or model and a

  5. Statistical Bias in Maximum Likelihood Estimators of Item Parameters.

    Science.gov (United States)

    1982-04-01

    The scanned abstract of this technical report is largely illegible; the legible fragments indicate that the report examines bias in the maximum likelihood estimators of item parameters.

  6. Performances of the likelihood-ratio classifier based on different data modelings

    NARCIS (Netherlands)

    Chen, C.; Veldhuis, Raymond N.J.

    2008-01-01

    The classical likelihood ratio classifier easily collapses in many biometric applications especially with independent training-test subjects. The reason lies in the inaccurate estimation of the underlying user-specific feature density. Firstly, the feature density estimation suffers from

  7. Task-based detectability in CT image reconstruction by filtered backprojection and penalized likelihood estimation

    Energy Technology Data Exchange (ETDEWEB)

    Gang, Grace J. [Institute of Biomaterials and Biomedical Engineering, University of Toronto, Toronto, Ontario M5G 2M9, Canada and Department of Biomedical Engineering, Johns Hopkins University, Baltimore Maryland 21205 (Canada); Stayman, J. Webster; Zbijewski, Wojciech [Department of Biomedical Engineering, Johns Hopkins University, Baltimore Maryland 21205 (United States); Siewerdsen, Jeffrey H., E-mail: jeff.siewerdsen@jhu.edu [Institute of Biomaterials and Biomedical Engineering, University of Toronto, Toronto, Ontario M5G 2M9, Canada and Department of Biomedical Engineering, Johns Hopkins University, Baltimore, Maryland 21205 (United States)

    2014-08-15

    Purpose: Nonstationarity is an important aspect of imaging performance in CT and cone-beam CT (CBCT), especially for systems employing iterative reconstruction. This work presents a theoretical framework for both filtered-backprojection (FBP) and penalized-likelihood (PL) reconstruction that includes explicit descriptions of nonstationary noise, spatial resolution, and task-based detectability index. Potential utility of the model was demonstrated in the optimal selection of regularization parameters in PL reconstruction. Methods: Analytical models for local modulation transfer function (MTF) and noise-power spectrum (NPS) were investigated for both FBP and PL reconstruction, including explicit dependence on the object and spatial location. For FBP, a cascaded systems analysis framework was adapted to account for nonstationarity by separately calculating fluence and system gains for each ray passing through any given voxel. For PL, the point-spread function and covariance were derived using the implicit function theorem and first-order Taylor expansion according to Fessler [“Mean and variance of implicitly defined biased estimators (such as penalized maximum likelihood): Applications to tomography,” IEEE Trans. Image Process. 5(3), 493–506 (1996)]. Detectability index was calculated for a variety of simple tasks. The model for PL was used in selecting the regularization strength parameter to optimize task-based performance, with both a constant and a spatially varying regularization map. Results: Theoretical models of FBP and PL were validated in 2D simulated fan-beam data and found to yield accurate predictions of local MTF and NPS as a function of the object and the spatial location. The NPS for both FBP and PL exhibit similar anisotropic nature depending on the pathlength (and therefore, the object and spatial location within the object) traversed by each ray, with the PL NPS experiencing greater smoothing along directions with higher noise. The MTF of FBP

  8. The incidence of associated abnormalities in patients with sacrococcygeal teratoma

    NARCIS (Netherlands)

    Kremer, Marijke E. B.; Althof, Jessica F.; Derikx, Joep P. M.; van Baren, Robertine; Heij, Hugo A.; Wijnen, Marc H. W. A.; Wijnen, René M. H.; van der Zee, David C.; van Heurn, L. W. Ernest

    2018-01-01

    Gross genetic causes for SCT are unknown; however, it might be associated with other abnormalities. We assessed the incidence of associated abnormalities in a large national cohort of neonates with SCT and aimed to identify predictive risk factors. The medical records were reviewed of 235

  9. Predicting the incidence of hand, foot and mouth disease in Sichuan province, China using the ARIMA model.

    Science.gov (United States)

    Liu, L; Luan, R S; Yin, F; Zhu, X P; Lü, Q

    2016-01-01

    Hand, foot and mouth disease (HFMD) is an infectious disease caused by enteroviruses that occurs mainly in young children. We applied an autoregressive integrated moving average (ARIMA) model to forecast HFMD incidence in Sichuan province, China. HFMD infection data from January 2010 to June 2014 were used to fit the ARIMA model. The coefficient of determination (R²), normalized Bayesian Information Criterion (BIC) and mean absolute percentage error (MAPE) were used to evaluate the goodness of fit of the constructed models. The fitted ARIMA model was applied to forecast the incidence of HFMD from April to June 2014. The goodness-of-fit test selected the optimum general multiplicative seasonal ARIMA (1,0,1) × (0,1,0)12 model (R² = 0·692, MAPE = 15·982, BIC = 5·265), which also showed non-significant autocorrelations in the residuals of the model (P = 0·893). The forecast incidence values of the ARIMA (1,0,1) × (0,1,0)12 model from July to December 2014 were 4103-9987, which are approximate forecasts. The ARIMA model can be applied to forecast the HFMD incidence trend and provide support for HFMD prevention and control. Further observations should be added to the time series continually, and the model parameters adjusted, because HFMD incidence will not be absolutely stationary in the future.
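
    In practice a seasonal ARIMA would be fitted with a statistics package; as a pure-Python illustration of the same fit-then-forecast workflow, the sketch below fits the simplest member of the family, an AR(1) model (i.e. ARIMA(1,0,0)), by conditional least squares on simulated data. All names and parameter values here are hypothetical, not the study's:

```python
import random

def fit_ar1(y):
    """Conditional least-squares fit of y[t] = c + phi * y[t-1] + e[t],
    the ARIMA(1,0,0) special case."""
    x, z = y[:-1], y[1:]
    n = len(x)
    mx, mz = sum(x) / n, sum(z) / n
    phi = (sum((a - mx) * (b - mz) for a, b in zip(x, z))
           / sum((a - mx) ** 2 for a in x))
    c = mz - phi * mx
    return c, phi

def forecast(c, phi, last, steps):
    """Iterated one-step forecasts from the last observed value."""
    out = []
    for _ in range(steps):
        last = c + phi * last
        out.append(last)
    return out

random.seed(0)
y = [0.0]
for _ in range(2000):                 # simulate y[t] = 0.5 + 0.7*y[t-1] + e[t]
    y.append(0.5 + 0.7 * y[-1] + random.gauss(0, 1))
c, phi = fit_ar1(y)
preds = forecast(c, phi, y[-1], 12)   # forecasts decay toward the mean c/(1-phi)
```

    A full seasonal model adds differencing and seasonal terms on top of this autoregressive core, but the estimate-then-extrapolate structure is the same.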

  10. Is Global Warming likely to cause an increased incidence of Malaria?

    Science.gov (United States)

    Nabi, SA; Qader, SS

    2009-01-01

    The rise in the average temperature of earth has been described as global warming which is mainly attributed to the increasing phenomenon of the greenhouse effect. It is believed that global warming can have several harmful effects on human health, both directly and indirectly. Since malaria is greatly influenced by climatic conditions because of its direct relationship with the mosquito population, it is widely assumed that its incidence is likely to increase in a future warmer world. This review article discusses the two contradictory views regarding the association of global warming with an increased incidence of malaria. On one hand, there are many who believe that there is a strong association between the recent increase in malaria incidence and global warming. They predict that as global warming continues, malaria is set to spread in locations where previously it was limited, due to cooler climate. On the other hand, several theories have been put forward which are quite contrary to this prediction. There are multiple other factors which are accountable for the recent upsurge of malaria: for example drug resistance, mosquito control programs, public health facilities, and living standards. PMID:21483497

  11. Subclinical carotid atherosclerosis and triglycerides predict the incidence of chronic kidney disease in the Japanese general population: results from the Kyushu and Okinawa Population Study (KOPS).

    Science.gov (United States)

    Shimizu, Motohiro; Furusyo, Norihiro; Mitsumoto, Fujiko; Takayama, Koji; Ura, Kazuya; Hiramine, Satoshi; Ikezaki, Hiroaki; Ihara, Takeshi; Mukae, Haru; Ogawa, Eiichi; Toyoda, Kazuhiro; Kainuma, Mosaburo; Murata, Masayuki; Hayashi, Jun

    2015-02-01

    To examine whether or not subclinical atherosclerosis independently predicts the incidence of chronic kidney disease (CKD) in the Japanese general population. This study is part of the Kyushu and Okinawa Population Study (KOPS), a survey of vascular events associated with lifestyle-related diseases. Participants who attended both baseline (2004-2007) and follow-up (2009-2012) examinations were eligible. The common carotid intima-media thickness (IMT) was assessed for each participant at baseline. The end point was the incidence of CKD, defined by a reduced estimated glomerular filtration rate (eGFR). Participants who developed CKD had higher baseline triglycerides (1.6 ± 0.8 vs. 1.3 ± 0.7 mmol/L), and both higher carotid IMT and higher triglycerides (OR 1.35, 95% CI 1.06-1.73, P = 0.015) at baseline were independent predictors for the development of CKD. Higher carotid IMT and hypertriglyceridemia were independently associated with the development of CKD in the population studied. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  12. PERBANDINGAN ESTIMASI KEMAMPUAN LATEN ANTARA METODE MAKSIMUM LIKELIHOOD DAN METODE BAYES

    Directory of Open Access Journals (Sweden)

    Heri Retnawati

    2015-10-01

    Full Text Available This study aimed to compare the accuracy of latent ability (latent trait) estimation in the logistic model between the joint maximum likelihood (ML) method and the Bayes method. The study used the Monte Carlo simulation method, with data modeled on students' responses to the national junior secondary school mathematics examination; the simulation variables were test length and number of examinees. Data were generated using SAS/IML with 40 replications, and each dataset was estimated by ML and Bayes. The estimates were then compared with the true abilities by computing the mean square error (MSE) and the correlation between the true latent abilities and the estimates; the method with the smaller MSE is regarded as the better estimator. The results show that for latent ability estimation with 15, 20, 25, and 30 items and 500 or 1,000 examinees the MSE was not yet stable, but with 1,500 examinees the ML and Bayes methods achieved almost equally accurate ability estimates. With 15 and 20 items and 500, 1,000, or 1,500 examinees the MSE was not yet stable, whereas estimation with 25 or 30 items, whether with 500, 1,000, or 1,500 examinees, gave more accurate results with the ML method. Keywords: ability estimation, maximum likelihood method, Bayes method

  13. How to Improve the Likelihood of CDM Approval?

    DEFF Research Database (Denmark)

    Brandt, Urs Steiner; Svendsen, Gert Tinggaard

    2014-01-01

    How can the likelihood of Clean Development Mechanism (CDM) approval be improved in the face of institutional shortcomings? To answer this question, we focus on three institutional shortcomings concerning afforestation/reforestation (A/R): income sharing, risk sharing and corruption prevention. Furthermore, three main stakeholders are identified, namely investors, governments and agents, in a principal-agent model regarding monitoring and enforcement capacity. Developing regions such as West Africa have, despite huge potential, not yet been integrated into A/R CDM projects. Remote sensing, however...

  14. Dynamic prediction of patient outcomes during ongoing cardiopulmonary resuscitation.

    Science.gov (United States)

    Kim, Joonghee; Kim, Kyuseok; Callaway, Clifton W; Doh, Kibbeum; Choi, Jungho; Park, Jongdae; Jo, You Hwan; Lee, Jae Hyuk

    2017-02-01

    The probability of the return of spontaneous circulation (ROSC) and subsequent favourable outcomes changes dynamically during advanced cardiac life support (ACLS). We sought to model these changes using time-to-event analysis in out-of-hospital cardiac arrest (OHCA) patients. Adult (≥18 years old), non-traumatic OHCA patients without prehospital ROSC were included. Utstein variables and initial arterial blood gas measurements were used as predictors. The incidence rate of ROSC during the first 30min of ACLS in the emergency department (ED) was modelled using spline-based parametric survival analysis. Conditional probabilities of subsequent outcomes after ROSC (1-week and 1-month survival and 6-month neurologic recovery) were modelled using multivariable logistic regression. The ROSC and conditional probability models were then combined to estimate the likelihood of achieving ROSC and subsequent outcomes by providing k additional minutes of effort. A total of 727 patients were analyzed. The incidence rate of ROSC increased rapidly until the 10th minute of ED ACLS, and it subsequently decreased. The conditional probabilities of subsequent outcomes after ROSC were also dependent on the duration of resuscitation with odds ratios for 1-week and 1-month survival and neurologic recovery of 0.93 (95% CI: 0.90-0.96, p<0.001), 0.93 (0.88-0.97, p=0.001) and 0.93 (0.87-0.99, p=0.031) per 1-min increase, respectively. Calibration testing of the combined models showed good correlation between mean predicted probability and actual prevalence. The probability of ROSC and favourable subsequent outcomes changed according to a multiphasic pattern over the first 30min of ACLS, and modelling of the dynamic changes was feasible. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  15. Perceived Sexual Control, Sex-Related Alcohol Expectancies and Behavior Predict Substance-Related Sexual Revictimization

    Science.gov (United States)

    Walsh, Kate; Messman-Moore, Terri; Zerubavel, Noga; Chandley, Rachel B.; DeNardi, Kathleen A.; Walker, Dave P.

    2013-01-01

    Objectives Although numerous studies have documented linkages between childhood sexual abuse (CSA) and later sexual revictimization, mechanisms underlying revictimization, particularly assaults occurring in the context of substance use, are not well-understood. Consistent with Traumagenic Dynamics theory, the present study tested a path model positing that lowered perceptions of sexual control resulting from CSA may be associated with increased sex-related alcohol expectancies and heightened likelihood of risky sexual behavior, which in turn, may predict adult substance-related rape. Methods Participants were 546 female college students who completed anonymous surveys regarding CSA and adult rape, perceptions of sexual control, sex-related alcohol expectancies, and likelihood of engaging in risky sexual behavior. Results The data fit the hypothesized model well and all hypothesized path coefficients were significant and in the expected directions. As expected, sex-related alcohol expectancies and likelihood of risky sexual behavior only predicted substance-related rape, not forcible rape. Conclusions Findings suggested that low perceived sexual control stemming from CSA is associated with increased sex-related alcohol expectancies and a higher likelihood of engaging in sexual behavior in the context of alcohol use. In turn these proximal risk factors heighten vulnerability to substance-related rape. Programs which aim to reduce risk for substance-related rape could be improved by addressing expectancies and motivations for risky sexual behavior in the context of substance use. Implications and future directions are discussed. PMID:23312991

  16. Maximum likelihood estimation and EM algorithm of Copas-like selection model for publication bias correction.

    Science.gov (United States)

    Ning, Jing; Chen, Yong; Piao, Jin

    2017-07-01

    Publication bias occurs when the published research results are systematically unrepresentative of the population of studies that have been conducted, and is a potential threat to meaningful meta-analysis. The Copas selection model provides a flexible framework for correcting estimates and offers considerable insight into the publication bias. However, maximizing the observed likelihood under the Copas selection model is challenging because the observed data contain very little information on the latent variable. In this article, we study a Copas-like selection model and propose an expectation-maximization (EM) algorithm for estimation based on the full likelihood. Empirical simulation studies show that the EM algorithm and its associated inferential procedure perform well and avoid the non-convergence problem encountered when maximizing the observed likelihood. © The Author 2017. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
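
    The Copas-like selection model itself is too involved for a short example, but the abstract's central idea, iterating an E-step and an M-step to maximize a full likelihood when the observed data carry little information about a latent quantity, can be sketched on a toy problem: estimating the mean of a unit-variance normal sample whose values above a threshold are right-censored. This is an illustrative sketch, not the authors' algorithm:

```python
import math, random

def em_censored_mean(obs, n_cens, c, iters=200):
    """EM for the mean of N(mu, 1) when all values above the threshold c are
    right-censored.  E-step: impute each censored value by E[X | X > c];
    M-step: average observed and imputed values (the complete-data MLE)."""
    phi = lambda z: math.exp(-z * z / 2) / math.sqrt(2 * math.pi)
    Phi = lambda z: 0.5 * (1 + math.erf(z / math.sqrt(2)))
    n = len(obs) + n_cens
    mu = sum(obs) / len(obs)                             # biased-low starting value
    for _ in range(iters):
        z = c - mu
        imputed = mu + phi(z) / max(1 - Phi(z), 1e-12)   # E[X | X > c] under current mu
        mu = (sum(obs) + n_cens * imputed) / n
    return mu

random.seed(1)
xs = [random.gauss(2.0, 1.0) for _ in range(20000)]
c = 2.5
obs = [x for x in xs if x <= c]
n_cens = len(xs) - len(obs)
mu_hat = em_censored_mean(obs, n_cens, c)
```

    The fixed point of this iteration satisfies the observed-data score equation, so the EM estimate coincides with the MLE while avoiding direct maximization of the awkward observed likelihood, which is the appeal the abstract describes.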

  17. Likelihood Inference of Nonlinear Models Based on a Class of Flexible Skewed Distributions

    Directory of Open Access Journals (Sweden)

    Xuedong Chen

    2014-01-01

    Full Text Available This paper deals with the issue of likelihood inference for nonlinear models with a flexible skew-t-normal (FSTN) distribution, which is proposed within a general framework of flexible skew-symmetric (FSS) distributions by combining them with the skew-t-normal (STN) distribution. In comparison with common skewed distributions such as the skew normal (SN) and skew-t (ST), as well as scale mixtures of skew normal (SMSN), the FSTN distribution can accommodate more flexibility and robustness in the presence of skewed, heavy-tailed, and especially multimodal outcomes. However, for this distribution, the usual approach of maximum likelihood estimation based on the EM algorithm becomes unavailable, and an alternative is to return to the original Newton-Raphson type method. In order to improve the estimation, as well as the construction of confidence intervals and hypothesis tests for the parameters of interest, a modified Newton-Raphson iterative algorithm based on the profile likelihood is presented in this paper for nonlinear regression models with the FSTN distribution, and the corresponding confidence intervals and hypothesis tests are also developed. Furthermore, a real example and a simulation are conducted to demonstrate the usefulness and the superiority of our approach.

  18. Information-theoretic model selection for optimal prediction of stochastic dynamical systems from data

    Science.gov (United States)

    Darmon, David

    2018-03-01

    In the absence of mechanistic or phenomenological models of real-world systems, data-driven models become necessary. The discovery of various embedding theorems in the 1980s and 1990s motivated a powerful set of tools for analyzing deterministic dynamical systems via delay-coordinate embeddings of observations of their component states. However, in many branches of science, the condition of operational determinism is not satisfied, and stochastic models must be brought to bear. For such stochastic models, the tool set developed for delay-coordinate embedding is no longer appropriate, and a new toolkit must be developed. We present an information-theoretic criterion, the negative log-predictive likelihood, for selecting the embedding dimension for a predictively optimal data-driven model of a stochastic dynamical system. We develop a nonparametric estimator for the negative log-predictive likelihood and compare its performance to a recently proposed criterion based on active information storage. Finally, we show how the output of the model selection procedure can be used to compare candidate predictors for a stochastic system to an information-theoretic lower bound.
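
    As a simplified stand-in for the paper's nonparametric estimator, the sketch below applies the same selection principle, choosing the model order that minimizes the average negative log-predictive likelihood on held-out data, to autoregressive order selection with Gaussian predictive densities. All model choices here are illustrative assumptions, not the paper's method:

```python
import math, random

def fit_ar(y, p):
    """Least-squares fit of a zero-mean AR(p) model (p = 1 or 2 here);
    returns the coefficients and the residual variance."""
    rows = [[y[t - j - 1] for j in range(p)] for t in range(p, len(y))]
    targ = y[p:]
    if p == 1:
        s11 = sum(r[0] ** 2 for r in rows)
        a = [sum(r[0] * t for r, t in zip(rows, targ)) / s11]
    else:  # solve the 2x2 normal equations directly
        s11 = sum(r[0] ** 2 for r in rows)
        s22 = sum(r[1] ** 2 for r in rows)
        s12 = sum(r[0] * r[1] for r in rows)
        b1 = sum(r[0] * t for r, t in zip(rows, targ))
        b2 = sum(r[1] * t for r, t in zip(rows, targ))
        det = s11 * s22 - s12 * s12
        a = [(s22 * b1 - s12 * b2) / det, (s11 * b2 - s12 * b1) / det]
    resid = [t - sum(cj * xj for cj, xj in zip(a, r)) for r, t in zip(rows, targ)]
    return a, sum(e * e for e in resid) / len(resid)

def neg_log_pred_lik(y, a, var):
    """Average negative log predictive likelihood under Gaussian errors."""
    p, nll = len(a), 0.0
    for t in range(p, len(y)):
        pred = sum(a[j] * y[t - j - 1] for j in range(p))
        nll += 0.5 * math.log(2 * math.pi * var) + (y[t] - pred) ** 2 / (2 * var)
    return nll / (len(y) - p)

random.seed(7)
y = [0.0, 0.0]
for _ in range(4000):                 # the true process is AR(2)
    y.append(0.5 * y[-1] - 0.4 * y[-2] + random.gauss(0, 1))
train, held_out = y[:3000], y[3000:]
scores = {p: neg_log_pred_lik(held_out, *fit_ar(train, p)) for p in (1, 2)}
best_p = min(scores, key=scores.get)
```

    The embedding-dimension selection in the paper follows the same logic with the Gaussian predictor replaced by a nonparametric one and the lag order replaced by the delay-embedding dimension.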

  19. An iterative procedure for obtaining maximum-likelihood estimates of the parameters for a mixture of normal distributions

    Science.gov (United States)

    Peters, B. C., Jr.; Walker, H. F.

    1978-01-01

    This paper addresses the problem of obtaining numerically maximum-likelihood estimates of the parameters for a mixture of normal distributions. In recent literature, a certain successive-approximations procedure, based on the likelihood equations, was shown empirically to be effective in numerically approximating such maximum-likelihood estimates; however, the reliability of this procedure was not established theoretically. Here, we introduce a general iterative procedure, of the generalized steepest-ascent (deflected-gradient) type, which is just the procedure known in the literature when the step-size is taken to be 1. We show that, with probability 1 as the sample size grows large, this procedure converges locally to the strongly consistent maximum-likelihood estimate whenever the step-size lies between 0 and 2. We also show that the step-size which yields optimal local convergence rates for large samples is determined in a sense by the 'separation' of the component normal densities and is bounded below by a number between 1 and 2.
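
    A toy sketch of the procedure described above: for a 50/50 mixture of two unit-variance normal components, the EM update of the component means is relaxed by a step-size omega, with omega = 1 recovering the standard successive-approximations (EM) update and values in (0, 2) admissible per the abstract. This is illustrative code, not the paper's:

```python
import math, random

def em_two_means(xs, omega=1.0, iters=100):
    """Successive-approximations estimation of the component means of a
    50/50 mixture of N(mu1, 1) and N(mu2, 1).  omega is the step-size:
    omega = 1 is the standard EM update; 0 < omega < 2 is admissible."""
    pdf = lambda x, m: math.exp(-(x - m) ** 2 / 2)
    m1, m2 = min(xs), max(xs)                      # crude spread-out start
    for _ in range(iters):
        r = [pdf(x, m1) / (pdf(x, m1) + pdf(x, m2)) for x in xs]   # E-step
        em1 = sum(ri * x for ri, x in zip(r, xs)) / sum(r)         # M-step means
        em2 = sum((1 - ri) * x for ri, x in zip(r, xs)) / (len(xs) - sum(r))
        m1 += omega * (em1 - m1)                   # deflected-gradient step
        m2 += omega * (em2 - m2)
    return sorted((m1, m2))

random.seed(3)
xs = [random.gauss(-2.0 if random.random() < 0.5 else 2.0, 1.0)
      for _ in range(4000)]
m1, m2 = em_two_means(xs, omega=1.0)
```

    The paper's result is that, locally and for large samples, any step-size strictly between 0 and 2 converges, and the optimal rate depends on how well separated the component densities are.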

  20. Multimodal Personal Verification Using Likelihood Ratio for the Match Score Fusion

    Directory of Open Access Journals (Sweden)

    Long Binh Tran

    2017-01-01

    Full Text Available In this paper, the authors present a novel personal verification system based on the likelihood ratio test for the fusion of match scores from multiple biometric matchers (face, fingerprint, hand shape, and palm print). In the proposed system, multimodal features are extracted by Zernike Moments (ZM). After matching, the match scores from the multiple biometric matchers are fused based on the likelihood ratio test. A finite Gaussian mixture model (GMM) is used for estimating the genuine and impostor densities of the match scores for personal verification. Our approach is also compared with well-known alternatives such as the support vector machine and the sum rule with min-max normalization. The experimental results confirm that the proposed system achieves excellent verification performance, with higher accuracy than these alternatives, and can thus be used in applications requiring personal verification.
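
    A minimal sketch of likelihood-ratio score fusion in the spirit described above, with one simplification: each class-conditional score density is modelled by a single Gaussian per matcher rather than the paper's finite Gaussian mixture, and the fused statistic is the sum of per-matcher log-likelihood ratios. All scores and parameters are synthetic:

```python
import math, random

def gauss_pdf(x, mu, sd):
    return math.exp(-((x - mu) / sd) ** 2 / 2) / (sd * math.sqrt(2 * math.pi))

def fit_gauss(xs):
    """Moment fit of a single Gaussian to one matcher's scores for one class."""
    mu = sum(xs) / len(xs)
    sd = math.sqrt(sum((x - mu) ** 2 for x in xs) / len(xs))
    return mu, sd

def fused_llr(scores, gen_params, imp_params):
    """Sum of per-matcher log-likelihood ratios; accept the claim if > 0."""
    return sum(math.log(gauss_pdf(s, *g) / gauss_pdf(s, *i))
               for s, g, i in zip(scores, gen_params, imp_params))

# Synthetic training scores for two matchers (e.g. face and fingerprint)
random.seed(5)
gen_train = [[random.gauss(0.8, 0.10), random.gauss(0.7, 0.15)] for _ in range(500)]
imp_train = [[random.gauss(0.3, 0.10), random.gauss(0.4, 0.15)] for _ in range(500)]
gen_params = [fit_gauss([row[j] for row in gen_train]) for j in range(2)]
imp_params = [fit_gauss([row[j] for row in imp_train]) for j in range(2)]
```

    Replacing the single Gaussians with mixture densities, as in the paper, lets the same decision rule handle multimodal or skewed score distributions.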