WorldWideScience

Sample records for incident likelihood prediction

  1. A Predictive Likelihood Approach to Bayesian Averaging

    Directory of Open Access Journals (Sweden)

    Tomáš Jeřábek

    2015-01-01

    Multivariate time series forecasting is applied in a wide range of economic activities related to regional competitiveness and is the basis of almost all macroeconomic analysis. In this paper we combine multivariate density forecasts of GDP growth, inflation and real interest rates from four models: two types of Bayesian vector autoregression (BVAR) models, a New Keynesian dynamic stochastic general equilibrium (DSGE) model of a small open economy, and a DSGE-VAR model. The performance of the models is assessed using historical data covering the domestic economy and the foreign economy, which is represented by the countries of the Eurozone. Because the forecast accuracy of the models differs, weighting schemes based on the predictive likelihood, the trace of the past MSE matrix, and model ranks are used to combine them; the equal-weight scheme serves as a simple benchmark. The results show that the optimally combined densities are comparable to the best individual models.
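
    A minimal sketch of the predictive-likelihood weighting scheme this record describes, assuming the combination weights are proportional to the exponentiated log predictive scores accumulated over a hold-out window (model names and score values below are illustrative, not the paper's):

```python
import numpy as np

# Log predictive likelihoods of four competing models, accumulated over a
# hold-out window (illustrative numbers, not the paper's results).
log_pred_lik = {"BVAR-1": -152.3, "BVAR-2": -150.8,
                "DSGE": -155.1, "DSGE-VAR": -151.6}

def predictive_likelihood_weights(scores):
    """Convert accumulated log predictive likelihoods to combination weights
    via a numerically stable softmax."""
    s = np.array(list(scores.values()), dtype=float)
    s -= s.max()                      # stabilize the exponentials
    w = np.exp(s) / np.exp(s).sum()
    return dict(zip(scores.keys(), w))

weights = predictive_likelihood_weights(log_pred_lik)
# The combined density forecast is the weight-averaged mixture of the
# individual model densities; equal weights recover the simple scheme.
print(weights)
```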

  2. Prediction of Safety Incidents

    Data.gov (United States)

    National Aeronautics and Space Administration — Safety incidents, including injuries, property damage and mission failures, cost NASA and contractors thousands of dollars in direct and indirect costs. This project...

  3. The Role of Mechanical Variance and Spatial Clustering on the Likelihood of Tumor Incidence and Growth

    Science.gov (United States)

    Mirzakhel, Zibah

    When considering factors that contribute to cancer progression, modifications to both the biological and mechanical pathways play significant roles. However, less attention is placed on how the mechanical pathways specifically contribute to cancerous behavior. Experimental studies have found that malignant cells are significantly softer than healthy, normal cells. In a tissue environment where healthy or malignant cells exist, a distribution of cell stiffness values is observed, with the mean values used to differentiate between these two populations. Rather than focus on the mean values, emphasis is placed here on the distribution, where instances of soft and stiff cells exist in the healthy tissue environment. Since cell deformability is a trait associated with cancer, the question arises as to whether the mechanical variation observed in healthy tissue cell stiffness distributions can influence instances of tumor growth. To approach this, a 3D discrete model of cells is used, able to monitor and predict the behavior of individual cells while determining any instances of tumor growth in a healthy tissue. In addition to the mechanical variance, the spatial arrangement of cells is also modeled, as cell interaction could further influence the incidence of tumor-like malignant populations within the tissue. Results show that the likelihood of tumor incidence is driven both by increases in the mechanical variation of the distributions and by larger clusters of mechanically similar cells, quantified primarily through higher proliferation rates of tumor-like soft cells. This can be observed through prominent negative shifts in the mean of the distribution as it begins to transition and show instances of early-stage tumor growth. The model reveals the impact that both the mechanical variation and the spatial arrangement of cells have on tumor progression, suggesting the use of these parameters as potential novel biomarkers. With a

  4. Predicting incident size from limited information

    International Nuclear Information System (INIS)

    Englehardt, J.D.

    1995-01-01

    Predicting the size of low-probability, high-consequence natural disasters, industrial accidents, and pollutant releases is often difficult due to limitations in the availability of data on rare events and future circumstances. When incident data are available, they may be difficult to fit with a lognormal distribution. Two Bayesian probability distributions for inferring future incident-size probabilities from limited, indirect, and subjective information are proposed in this paper. The distributions are derived from Pareto distributions that are shown to fit data on different incident types and are justified theoretically. The derived distributions incorporate both inherent variability and uncertainty due to information limitations. Results were analyzed to determine the amount of data needed to predict incident-size probabilities in various situations. Information requirements for incident-size prediction using the methods were low, particularly when the population distribution had a thick tail. Use of the distributions to predict accumulated oil-spill consequences was demonstrated
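
    A brief sketch of the Pareto machinery underlying this record, assuming a classical maximum-likelihood (Hill) fit rather than the paper's Bayesian derivation; the data and threshold are synthetic:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic incident sizes (e.g., spill volumes); heavy-tailed for illustration.
sizes = (rng.pareto(a=1.5, size=200) + 1.0) * 10.0   # x_min = 10

def pareto_mle_alpha(x, x_min):
    """Maximum-likelihood tail index of a Pareto distribution (Hill estimator)."""
    x = np.asarray(x)
    x = x[x >= x_min]
    return len(x) / np.log(x / x_min).sum()

alpha = pareto_mle_alpha(sizes, x_min=10.0)
# Exceedance probability P(X > x) = (x / x_min)^(-alpha) for x >= x_min;
# a thick tail (small alpha) makes large incidents comparatively likely.
p_big = (1000.0 / 10.0) ** (-alpha)
print(f"alpha = {alpha:.2f}, P(size > 1000) = {p_big:.4f}")
```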

  5. Moral Identity Predicts Doping Likelihood via Moral Disengagement and Anticipated Guilt.

    Science.gov (United States)

    Kavussanu, Maria; Ring, Christopher

    2017-08-01

    In this study, we integrated elements of the social cognitive theory of moral thought and action and the social cognitive model of moral identity to better understand doping likelihood in athletes. Participants (N = 398) recruited from a variety of team sports completed measures of moral identity, moral disengagement, anticipated guilt, and doping likelihood. Moral identity predicted doping likelihood indirectly via moral disengagement and anticipated guilt. Anticipated guilt about potential doping mediated the relationship between moral disengagement and doping likelihood. Our findings provide novel evidence that athletes who feel that being a moral person is central to their self-concept are less likely to use banned substances, owing to their lower tendency to morally disengage and the more intense guilt they expect to experience for using banned substances.

  6. Maximum Likelihood Method for Predicting Environmental Conditions from Assemblage Composition: The R Package bio.infer

    Directory of Open Access Journals (Sweden)

    Lester L. Yuan

    2007-06-01

    This paper provides a brief introduction to the R package bio.infer, a set of scripts that facilitates the use of maximum likelihood (ML) methods for predicting environmental conditions from assemblage composition. Environmental conditions can often be inferred from biological data alone, and these inferences are useful when other sources of data are unavailable. ML prediction methods are statistically rigorous and applicable to a broader set of problems than the more commonly used weighted-averaging techniques. However, ML methods require a substantially greater investment of time to program algorithms and to perform computations. This package is designed to reduce the effort required to apply ML prediction methods.
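
    bio.infer itself is an R package; the following Python sketch only illustrates the underlying ML idea, assuming hypothetical logistic taxon-response curves rather than the package's actual calibration data:

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Hypothetical taxon response curves: probability of occurrence as a logistic
# function of an environmental gradient x (parameters are made up).
taxa = [(-2.0, 1.5), (0.5, -1.0), (1.0, 2.0)]   # (intercept b0, slope b1)

def occ_prob(x, b0, b1):
    return 1.0 / (1.0 + np.exp(-(b0 + b1 * x)))

def neg_log_lik(x, presence):
    """Bernoulli log-likelihood of the observed assemblage at condition x."""
    ll = 0.0
    for (b0, b1), y in zip(taxa, presence):
        p = occ_prob(x, b0, b1)
        ll += y * np.log(p) + (1 - y) * np.log(1 - p)
    return -ll

observed = [1, 0, 1]                 # which taxa were found at the site
res = minimize_scalar(neg_log_lik, args=(observed,),
                      bounds=(-5, 5), method="bounded")
print(f"ML estimate of environmental condition: {res.x:.2f}")
```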

  7. Fatty liver incidence and predictive variables

    International Nuclear Information System (INIS)

    Tsuneto, Akira; Seto, Shinji; Maemura, Koji; Hida, Ayumi; Sera, Nobuko; Imaizumi, Misa; Ichimaru, Shinichiro; Nakashima, Eiji; Akahoshi, Masazumi

    2010-01-01

    Although fatty liver predicts ischemic heart disease, the incidence and predictors of fatty liver need examination. The objective of this study was to determine fatty liver incidence and predictive variables. Using abdominal ultrasonography, we followed 1635 Nagasaki atomic bomb survivors (606 men) without fatty liver at baseline (November 1990 through October 1992) biennially through 2007 (mean follow-up, 11.6±4.6 years). We examined potential predictive variables with the Cox proportional hazards model and longitudinal trends with the Wilcoxon rank-sum test. In all, 323 (124 men) new fatty liver cases were diagnosed. The incidence was 19.9/1000 person-years (22.3 for men, 18.6 for women) and peaked in the sixth decade of life. After controlling for age, sex, and smoking and drinking habits, obesity (relative risk (RR), 2.93; 95% confidence interval (CI), 2.33-3.69; P<0.001), low high-density lipoprotein cholesterol (RR, 1.87; 95% CI, 1.42-2.47; P<0.001), hypertriglyceridemia (RR, 2.49; 95% CI, 1.96-3.15; P<0.001), glucose intolerance (RR, 1.51; 95% CI, 1.09-2.10; P=0.013) and hypertension (RR, 1.63; 95% CI, 1.30-2.04; P<0.001) were predictive of fatty liver. In multivariate analysis including all variables, obesity (RR, 2.55; 95% CI, 1.93-3.38; P<0.001), hypertriglyceridemia (RR, 1.92; 95% CI, 1.41-2.62; P<0.001) and hypertension (RR, 1.31; 95% CI, 1.01-1.71; P=0.046) remained predictive. In fatty liver cases, body mass index and serum triglycerides, but not systolic or diastolic blood pressure, increased significantly and steadily up to the time of diagnosis. Obesity, hypertriglyceridemia and, to a lesser extent, hypertension might serve as predictive variables for fatty liver. (author)
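
    A minimal sketch of the kind of Cox proportional hazards fit reported above, using the lifelines library on synthetic data (column names and effect sizes are illustrative, not the study's):

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 500
# Synthetic cohort: binary baseline risk factors (names are illustrative).
obesity = rng.integers(0, 2, n)
high_tg = rng.integers(0, 2, n)          # hypertriglyceridemia
hypertension = rng.integers(0, 2, n)
# Event times from an exponential model whose rate rises with each factor.
rate = 0.02 * np.exp(0.9 * obesity + 0.6 * high_tg + 0.3 * hypertension)
time = rng.exponential(1.0 / rate)
observed = (time < 18).astype(int)       # administrative censoring at 18 years
time = np.minimum(time, 18)

df = pd.DataFrame({"years": time, "event": observed, "obesity": obesity,
                   "high_tg": high_tg, "hypertension": hypertension})
cph = CoxPHFitter()
cph.fit(df, duration_col="years", event_col="event")
cph.print_summary()   # exp(coef) plays the role of the RRs quoted above
```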

  8. Supervised maximum-likelihood weighting of composite protein networks for complex prediction

    Directory of Open Access Journals (Sweden)

    Yong Chern Han

    2012-12-01

    Background: Protein complexes participate in many important cellular functions, so finding the set of existent complexes is essential for understanding the organization and regulation of processes in the cell. With the availability of large amounts of high-throughput protein-protein interaction (PPI) data, many algorithms have been proposed to discover protein complexes from PPI networks. However, such approaches are hindered by the high rate of noise in high-throughput PPI data, including spurious and missing interactions. Furthermore, many transient interactions are detected between proteins that are not from the same complex, while not all proteins from the same complex may actually interact. As a result, predicted complexes often do not match true complexes well, and many true complexes go undetected. Results: We address these challenges by integrating PPI data with other heterogeneous data sources to construct a composite protein network, and using a supervised maximum-likelihood approach to weight each edge based on its posterior probability of belonging to a complex. We then use six different clustering algorithms, and an aggregative clustering strategy, to discover complexes in the weighted network. We test our method on Saccharomyces cerevisiae and Homo sapiens, and show that complex discovery is improved: compared to previously proposed supervised and unsupervised weighting approaches, our method recalls more known complexes, achieves higher precision at all recall levels, and generates novel complexes of greater functional similarity. Furthermore, our maximum-likelihood approach allows learned parameters to be used to visualize and evaluate the evidence of novel predictions, aiding human judgment of their credibility. Conclusions: Our approach integrates multiple data sources with supervised learning to create a weighted composite protein network, and uses six clustering algorithms with an aggregative clustering strategy to

  9. Bayesian Inference using Neural Net Likelihood Models for Protein Secondary Structure Prediction

    Directory of Open Access Journals (Sweden)

    Seong-Gon Kim

    2011-06-01

    Several techniques such as neural networks, genetic algorithms, decision trees and other statistical or heuristic methods have been used in the past to approach the complex non-linear task of predicting the alpha-helices, beta-sheets and turns of a protein's secondary structure. This project introduces a new machine learning method that uses offline-trained multilayer perceptrons (MLPs) as the likelihood models within a Bayesian inference framework to predict the secondary structure of proteins. Varying window sizes are used to extract neighboring amino acid information, which is passed back and forth between the neural net models and the Bayesian inference process until the posterior secondary structure probability converges.

  10. A new, accurate predictive model for incident hypertension

    DEFF Research Database (Denmark)

    Völzke, Henry; Fung, Glenn; Ittermann, Till

    2013-01-01

    Data mining represents an alternative approach to identify new predictors of multifactorial diseases. This work aimed at building an accurate predictive model for incident hypertension using data mining procedures.

  11. Age-specific incidence of A/H1N1 2009 influenza infection in England from sequential antibody prevalence data using likelihood-based estimation.

    Directory of Open Access Journals (Sweden)

    Marc Baguelin

    2011-02-01

    Estimating the age-specific incidence of an emerging pathogen is essential for understanding its severity and transmission dynamics. This paper describes a statistical method that uses likelihoods to estimate incidence from sequential serological data. The method requires information on seroconversion intervals and allows integration of information on the temporal distribution of cases from clinical surveillance. Among a family of candidate incidences, a likelihood function is derived by reconstructing the change in seroprevalence from seroconversion following infection and comparing it with the observed sequence of positivity among the samples. This method is applied to derive the cumulative and weekly incidence of A/H1N1 pandemic influenza in England during the second wave using sera taken between September 2009 and February 2010 in four age groups (1-4, 5-14, 15-24, 25-44 years). The highest cumulative incidence was in 5-14 year olds (59%; 95% credible interval (CI): 52%, 68%), followed by 1-4 year olds (49%; 95% CI: 38%, 61%), rates 20 and 40 times higher, respectively, than estimated from clinical surveillance. The method provides a more accurate and continuous measure of incidence than achieved by comparing prevalence in samples grouped by time period.
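
    A toy sketch of the likelihood idea in this record: the expected seroprevalence at each survey wave is reconstructed from a candidate cumulative incidence and compared with the observed positives via a binomial likelihood. All numbers and the seroconversion adjustment below are illustrative:

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import binom

# Illustrative inputs (not the study's data): for each serological survey
# wave, the number tested, the number seropositive, and the fraction of the
# epidemic's cases occurring long enough before sampling to have
# seroconverted (taken from the clinical surveillance curve).
n   = np.array([300, 300, 300])
k   = np.array([45,  90, 150])
c_t = np.array([0.2, 0.6, 1.0])
p0  = 0.10                        # baseline seroprevalence before the wave

def neg_log_lik(z):
    """Binomial likelihood of the observed positives given cumulative incidence z."""
    p = p0 + (1 - p0) * z * c_t   # expected seroprevalence at each wave
    return -binom.logpmf(k, n, p).sum()

res = minimize_scalar(neg_log_lik, bounds=(1e-6, 0.999), method="bounded")
print(f"ML cumulative incidence: {res.x:.2%}")
```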

  12. [Application of ARIMA model on prediction of malaria incidence].

    Science.gov (United States)

    Jing, Xia; Hua-Xun, Zhang; Wen, Lin; Su-Jian, Pei; Ling-Cong, Sun; Xiao-Rong, Dong; Mu-Min, Cao; Dong-Ni, Wu; Shunxiang, Cai

    2016-01-29

    The objective was to predict the incidence of local malaria in Hubei Province by applying an autoregressive integrated moving average (ARIMA) model. SPSS 13.0 software was used to construct the ARIMA model based on the monthly local malaria incidence in Hubei Province from 2004 to 2009. The local malaria incidence data of 2010 were used for model validation and evaluation. An ARIMA(1,1,1)(1,1,0)12 model proved relatively optimal, with an AIC of 76.085 and an SBC of 84.395. All of the actual incidence data fell within the 95% CI of the model's predicted values, so the prediction performance of the model was acceptable. The ARIMA model could effectively fit and predict the incidence of local malaria in Hubei Province.
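
    A minimal sketch of fitting the reported ARIMA(1,1,1)(1,1,0)12 specification with statsmodels (the study used SPSS; the monthly series below is synthetic):

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

# Illustrative monthly incidence series (the study used 2004-2009 data).
rng = np.random.default_rng(2)
idx = pd.date_range("2004-01", periods=72, freq="MS")
y = pd.Series(5 + 3 * np.sin(2 * np.pi * idx.month / 12)
              + rng.normal(0, 0.5, 72), index=idx)

# ARIMA(1,1,1)(1,1,0)12, the specification reported above.
model = SARIMAX(y, order=(1, 1, 1), seasonal_order=(1, 1, 0, 12))
fit = model.fit(disp=False)
print(fit.aic)                         # compare candidate orders by AIC/SBC
forecast = fit.get_forecast(steps=12)  # 2010-style validation horizon
print(forecast.predicted_mean.head())
print(forecast.conf_int(alpha=0.05).head())  # 95% CI used for model checking
```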

  13. Predicting the Likelihood of Going to Graduate School: The Importance of Locus of Control

    Science.gov (United States)

    Nordstrom, Cynthia R.; Segrist, Dan J.

    2009-01-01

    Although many undergraduates apply to graduate school, only a fraction will be admitted. A question arises as to what factors relate to the likelihood of pursuing graduate studies. The current research examined this question by surveying students in a Careers in Psychology course. We hypothesized that GPA, a more internal locus of control…

  14. Predicting likelihood of seeking help through the employee assistance program among salaried and union hourly employees.

    Science.gov (United States)

    Delaney, W; Grube, J W; Ames, G M

    1998-03-01

    This research investigated belief, social support, and background predictors of employees' likelihood of using an Employee Assistance Program (EAP) for a drinking problem. An anonymous cross-sectional survey was administered in the home. Bivariate analyses and simultaneous-equations path analysis were used to explore a model of EAP use. Survey and ethnographic research were conducted in a unionized heavy machinery manufacturing plant in the central states of the United States. A random sample of 852 hourly and salaried employees was selected. In addition to background variables, measures included: likelihood of going to an EAP for a drinking problem, belief that the EAP can help, social support for the EAP from co-workers and others, belief that EAP use will harm employment, and supervisor encouragement of the EAP for potential drinking problems. Belief in EAP efficacy directly increased the likelihood of going to an EAP. Greater perceived social support and supervisor encouragement increased the likelihood of going to an EAP both directly and indirectly through perceived EAP efficacy. Black and union hourly employees were more likely to say they would use an EAP. Males and those who reported drinking during working hours were less likely to say they would use an EAP for a drinking problem. EAP beliefs and social support have significant effects on the likelihood of going to an EAP for a drinking problem. EAPs may wish to focus their efforts on creating an environment where there is social support from co-workers and encouragement from supervisors for using EAP services. Union networks and team members have an important role to play in addition to conventional supervisor intervention.

  15. Anterior Segment Imaging Predicts Incident Gonioscopic Angle Closure.

    Science.gov (United States)

    Baskaran, Mani; Iyer, Jayant V; Narayanaswamy, Arun K; He, Yingke; Sakata, Lisandro M; Wu, Renyi; Liu, Dianna; Nongpiur, Monisha E; Friedman, David S; Aung, Tin

    2015-12-01

    To investigate the incidence of gonioscopic angle closure after 4 years in subjects with gonioscopically open angles but varying degrees of angle closure detected on anterior segment optical coherence tomography (AS OCT; Visante; Carl Zeiss Meditec, Dublin, CA) at baseline. Prospective, observational study. Three hundred forty-two subjects, mostly Chinese, 50 years of age or older, were recruited, of whom 65 were controls with open angles on gonioscopy and AS OCT at baseline, and 277 were cases with baseline open angles on gonioscopy but closed angles (1-4 quadrants) on AS OCT scans. All subjects underwent gonioscopy and AS OCT at baseline (horizontal and vertical single scans) and after 4 years. The examiner performing gonioscopy was masked to the baseline and AS OCT data. Angle closure in a quadrant was defined as nonvisibility of the posterior trabecular meshwork by gonioscopy and visible iridotrabecular contact beyond the scleral spur in AS OCT scans. Gonioscopic angle closure in 2 or 3 quadrants after 4 years. There were no statistically significant differences in age, ethnicity, or gender between cases and controls. None of the control subjects demonstrated gonioscopic angle closure after 4 years. Forty-eight of the 277 subjects (17.3%; 95% confidence interval [CI], 12.8-23; P < 0.0001) with at least 1 quadrant of angle closure on AS OCT at baseline demonstrated gonioscopic angle closure in 2 or more quadrants, whereas 28 subjects (10.1%; 95% CI, 6.7-14.6; P < 0.004) demonstrated gonioscopic angle closure in 3 or more quadrants after 4 years. Individuals with more quadrants of angle closure on baseline AS OCT scans had a greater likelihood of gonioscopic angle closure developing after 4 years (P < 0.0001, chi-square test for trend for both definitions of angle closure). Anterior segment OCT imaging at baseline predicts incident gonioscopic angle closure after 4 years among subjects who have gonioscopically open angles and iridotrabecular contact on AS OCT at

  16. Numerical Prediction of Green Water Incidents

    DEFF Research Database (Denmark)

    Nielsen, K. B.; Mayer, Stefan

    2004-01-01

    Green water loads on moored or sailing ships occur when an incoming wave significantly exceeds the freeboard and water runs onto the deck. In this paper, a Navier-Stokes solver with a free-surface capturing scheme (i.e. the VOF model; Hirt and Nichols, 1981) is used to numerically model green water loads on a moored FPSO exposed to head sea waves. Two cases are investigated: first, green water on a fixed vessel has been analysed, where the resulting water height on deck and the impact pressure on a deck-mounted structure have been computed. These results have been compared to experimental data obtained by Greco (2001) and show very favourable agreement. Second, a full green water incident, including vessel motions, has been modelled. In these computations, the vertical motion has been modelled by the use of transfer functions for heave and pitch, but the rotational contribution from the pitch motion has

  17. Phalangeal bone mineral density predicts incident fractures

    DEFF Research Database (Denmark)

    Friis-Holmberg, Teresa; Brixen, Kim; Rubin, Katrine Hass

    2012-01-01

    This prospective study investigates the use of phalangeal bone mineral density (BMD) in predicting fractures in a cohort of 15,542 men and women who underwent a BMD scan. In both women and men, a decrease in BMD was associated with an increased risk of fracture when adjusted for age and prevalent fractures

  18. Straight line fitting and predictions: On a marginal likelihood approach to linear regression and errors-in-variables models

    Science.gov (United States)

    Christiansen, Bo

    2015-04-01

    Linear regression methods are without doubt the most used approaches to describe and predict data in the physical sciences. They are often good first-order approximations and they are in general easier to apply and interpret than more advanced methods. However, even the properties of univariate regression can lead to debate over the appropriateness of various models, as witnessed by the recent discussion about climate reconstruction methods. Before linear regression is applied, important choices have to be made regarding the origins of the noise terms and regarding which of the two variables under consideration should be treated as the independent variable. These decisions are often not easy to make but they may have a considerable impact on the results. We seek to give a unified probabilistic (Bayesian with flat priors) treatment of univariate linear regression and prediction by taking, as starting point, the general errors-in-variables model (Christiansen, J. Clim., 27, 2014-2031, 2014). Other versions of linear regression can be obtained as limits of this model. We derive the likelihood of the model parameters and predictands of the general errors-in-variables model by marginalizing over the nuisance parameters. The resulting likelihood is relatively simple and easy to analyze and calculate. The well-known unidentifiability of the errors-in-variables model is manifested as the absence of a well-defined maximum in the likelihood. However, this does not mean that probabilistic inference cannot be made; the marginal likelihoods of model parameters and the predictands have, in general, well-defined maxima. We also include a probabilistic version of classical calibration and show how it is related to the errors-in-variables model. The results are illustrated by an example from the coupling between the lower stratosphere and the troposphere in the Northern Hemisphere winter.
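
    A compact statement of the errors-in-variables model and its marginal likelihood, consistent with the description above; the notation and the Gaussian prior on the latent predictor are illustrative:

```latex
% Errors-in-variables model (notation illustrative):
%   x_i = X_i + \delta_i,   y_i = \alpha + \beta X_i + \epsilon_i,
% with \delta_i \sim N(0, \sigma_\delta^2), \epsilon_i \sim N(0, \sigma_\epsilon^2),
% and latent truth X_i \sim N(\mu, \sigma_X^2).
% Marginalizing over the nuisance X_i gives a bivariate normal likelihood:
\begin{equation}
p(x_i, y_i \mid \theta) = \mathcal{N}\!\left(
  \begin{pmatrix} \mu \\ \alpha + \beta\mu \end{pmatrix},
  \begin{pmatrix}
    \sigma_X^2 + \sigma_\delta^2 & \beta\sigma_X^2 \\
    \beta\sigma_X^2 & \beta^2\sigma_X^2 + \sigma_\epsilon^2
  \end{pmatrix}\right).
\end{equation}
% Ordinary y-on-x regression is recovered in the limit \sigma_\delta^2 \to 0.
```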

  19. Memory Binding Test Predicts Incident Amnestic Mild Cognitive Impairment.

    Science.gov (United States)

    Mowrey, Wenzhu B; Lipton, Richard B; Katz, Mindy J; Ramratan, Wendy S; Loewenstein, David A; Zimmerman, Molly E; Buschke, Herman

    2016-07-14

    The Memory Binding Test (MBT), previously known as the Memory Capacity Test, has demonstrated discriminative validity for distinguishing persons with amnestic mild cognitive impairment (aMCI) and dementia from cognitively normal elderly. We aimed to assess the predictive validity of the MBT for incident aMCI. In a longitudinal, community-based study of adults aged 70+, we administered the MBT to 246 cognitively normal elderly adults at baseline and followed them annually. Based on previous work, a subtle reduction in memory binding at baseline was defined by a Total Items in the Paired (TIP) condition score of ≤22 on the MBT. Cox proportional hazards models were used to assess the predictive validity of the MBT for incident aMCI, accounting for the effects of covariates. The hazard ratio of incident aMCI was also assessed separately for prediction windows ranging from 4 to 7 years of follow-up. Among the 246 participants who were cognitively normal at baseline, 48 developed incident aMCI during follow-up. A baseline MBT reduction was associated with an increased risk of developing incident aMCI (hazard ratio (HR) = 2.44, 95% confidence interval: 1.30-4.56, p = 0.005). When varying the prediction window from 4 to 7 years, the MBT reduction remained significant for predicting incident aMCI (HR range: 2.33-3.12, p: 0.0007-0.04). Persons with poor performance on the MBT are at significantly greater risk of developing incident aMCI. High hazard ratios up to seven years of follow-up suggest that the MBT is sensitive to early disease.

  20. Incidence and predicting factors of falls of older inpatients

    Directory of Open Access Journals (Sweden)

    Hellen Cristina de Almeida Abreu

    2015-01-01

    OBJECTIVE: To estimate the incidence of falls among older inpatients and the predictive factors associated with them. METHODS: Prospective cohort study conducted in clinical units of three hospitals in Cuiaba, MT, Midwestern Brazil, from March to August 2013. In this study, 221 inpatients aged 60 or over were followed until hospital discharge, death, or a fall. The incidence-density method was used to calculate incidence rates. Bivariate analysis was performed by the chi-square test, and multiple analysis was performed by Cox regression. RESULTS: The incidence of falls was 12.6 per 1,000 patient-days. Predictive factors for falls during hospitalization were: low educational level (RR = 2.48; 95%CI 1.17;5.25), polypharmacy (RR = 4.42; 95%CI 1.77;11.05), visual impairment (RR = 2.06; 95%CI 1.01;4.23), gait and balance impairment (RR = 2.95; 95%CI 1.22;7.14), urinary incontinence (RR = 5.67; 95%CI 2.58;12.44), use of laxatives (RR = 4.21; 95%CI 1.15;15.39) and use of antipsychotics (RR = 4.10; 95%CI 1.38;12.13). CONCLUSIONS: The incidence of falls among older inpatients is high. The predictive factors found were low educational level, polypharmacy, visual impairment, gait and balance impairment, urinary incontinence, and use of laxatives and antipsychotics. Measures to prevent falls in hospitals are needed to reduce the incidence of this event.

  1. Prediction Model for Gastric Cancer Incidence in Korean Population.

    Directory of Open Access Journals (Sweden)

    Bang Wool Eom

    Predicting high-risk groups for gastric cancer and motivating these groups to receive regular checkups is required for the early detection of gastric cancer. The aim of this study was to develop a prediction model for gastric cancer incidence based on a large population-based cohort in Korea. Based on National Health Insurance Corporation data, we analyzed 10 major risk factors for gastric cancer. The Cox proportional hazards model was used to develop gender-specific prediction models for gastric cancer development, and the performance of the developed model in terms of discrimination and calibration was validated using an independent cohort. Discrimination ability was evaluated using Harrell's C-statistic, and calibration was evaluated using a calibration plot and slope. During a median of 11.4 years of follow-up, 19,465 (1.4%) and 5,579 (0.7%) newly developed gastric cancer cases were observed among 1,372,424 men and 804,077 women, respectively. The prediction models included age, BMI, family history, meal regularity, salt preference, alcohol consumption, smoking and physical activity for men, and age, BMI, family history, salt preference, alcohol consumption, and smoking for women. The prediction model showed good accuracy and predictability in both the development and validation cohorts (C-statistics: 0.764 for men, 0.706 for women). In this study, a prediction model for gastric cancer incidence was developed that displayed good performance.
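
    Harrell's C-statistic, the discrimination measure quoted above, can be computed from follow-up times, risk scores, and event indicators; a small sketch with lifelines on made-up values:

```python
from lifelines.utils import concordance_index

# Hypothetical follow-up times, risk scores, and event indicators; Harrell's
# C-statistic is the discrimination measure quoted above (0.764 / 0.706).
follow_up_years = [11.4, 9.2, 12.0, 3.1, 8.8, 10.5]
risk_score      = [0.8,  0.3, 0.1,  0.9, 0.5, 0.2]   # higher = riskier
event           = [1,    0,   0,    1,   1,   0]

# concordance_index expects higher predicted values to mean *longer*
# survival, so pass the negated risk score.
c = concordance_index(follow_up_years, [-r for r in risk_score], event)
print(f"Harrell's C: {c:.3f}")
```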

  2. Dynamic prediction of cumulative incidence functions by direct binomial regression.

    Science.gov (United States)

    Grand, Mia K; de Witte, Theo J M; Putter, Hein

    2018-03-25

    In recent years there have been a series of advances in the field of dynamic prediction. Among these is the development of methods for dynamic prediction of the cumulative incidence function in a competing-risks setting. These models enable predictions to be updated as time progresses and more information becomes available; for example, when a patient comes back for a follow-up visit after completing a year of treatment, the risks of death and adverse events may have changed since treatment initiation. One approach to modelling the cumulative incidence function in competing risks is direct binomial regression, where right censoring of the event times is handled by inverse probability of censoring weights. We extend the approach by combining it with landmarking to enable dynamic prediction of the cumulative incidence function. The proposed models are very flexible, as they allow the covariates to have complex time-varying effects, and we illustrate how to investigate possible time-varying structures using Wald tests. The models are fitted using generalized estimating equations. The method is applied to bone marrow transplant data and the performance is investigated in a simulation study. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. Empirical likelihood

    CERN Document Server

    Owen, Art B

    2001-01-01

    Empirical likelihood provides inferences whose validity does not depend on specifying a parametric model for the data. Because it uses a likelihood, the method has certain inherent advantages over resampling methods: it uses the data to determine the shape of the confidence regions, and it makes it easy to combine data from multiple sources. It also facilitates incorporating side information, and it simplifies accounting for censored, truncated, or biased sampling. One of the first books published on the subject, Empirical Likelihood offers an in-depth treatment of this method for constructing confidence regions and testing hypotheses. The author applies empirical likelihood to a range of problems, from those as simple as setting a confidence region for a univariate mean under IID sampling, to problems defined through smooth functions of means, regression models, generalized linear models, estimating equations, or kernel smooths, and to sampling with non-identically distributed data. Abundant figures offer vi...
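
    For reference, the empirical likelihood ratio for a univariate mean, the simplest case mentioned above (standard textbook form; notation mine):

```latex
% Empirical likelihood ratio for a univariate mean \mu (IID sample X_1..X_n):
\begin{equation}
R(\mu) = \max\Big\{ \prod_{i=1}^{n} n w_i \;:\;
   w_i \ge 0,\; \sum_{i=1}^{n} w_i = 1,\; \sum_{i=1}^{n} w_i X_i = \mu \Big\}.
\end{equation}
% Wilks-type calibration: -2 \log R(\mu_0) converges to \chi^2_1 under the
% true mean \mu_0, giving the data-shaped confidence regions described above:
%   \{ \mu : -2 \log R(\mu) \le \chi^2_{1,1-\alpha} \}.
```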

  4. Approximate Likelihood

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    Most physics results at the LHC end in a likelihood ratio test. This includes discovery and exclusion for searches as well as mass, cross-section, and coupling measurements. The use of Machine Learning (multivariate) algorithms in HEP is mainly restricted to searches, which can be reduced to classification between two fixed distributions: signal vs. background. I will show how we can extend the use of ML classifiers to distributions parameterized by physical quantities like masses and couplings as well as nuisance parameters associated with systematic uncertainties. This allows one to approximate the likelihood ratio while still using a high-dimensional feature vector for the data. Both the MEM and ABC approaches mentioned above aim to provide inference on model parameters (like cross-sections, masses, couplings, etc.). ABC is fundamentally tied to Bayesian inference and focuses on the “likelihood free” setting where only a simulator is available and one cannot directly compute the likelihood for the dat...

  5. A new, accurate predictive model for incident hypertension.

    Science.gov (United States)

    Völzke, Henry; Fung, Glenn; Ittermann, Till; Yu, Shipeng; Baumeister, Sebastian E; Dörr, Marcus; Lieb, Wolfgang; Völker, Uwe; Linneberg, Allan; Jørgensen, Torben; Felix, Stephan B; Rettig, Rainer; Rao, Bharat; Kroemer, Heyo K

    2013-11-01

    Data mining represents an alternative approach to identify new predictors of multifactorial diseases. This work aimed at building an accurate predictive model for incident hypertension using data mining procedures. The primary study population consisted of 1605 normotensive individuals aged 20-79 years with 5-year follow-up from the population-based Study of Health in Pomerania (SHIP). The initial set was randomly split into a training and a testing set. We used a probabilistic graphical model applying a Bayesian network to create a predictive model for incident hypertension and compared the predictive performance with the established Framingham risk score for hypertension. Finally, the model was validated in 2887 participants from INTER99, a Danish community-based intervention study. In the training set of SHIP data, the Bayesian network used a small subset of relevant baseline features including age, mean arterial pressure, rs16998073, serum glucose and urinary albumin concentrations. Furthermore, we detected relevant interactions between age and serum glucose as well as between rs16998073 and urinary albumin concentrations (area under the receiver operating characteristic curve [AUC] 0.76). The model was confirmed in the SHIP validation set (AUC 0.78) and externally replicated in INTER99 (AUC 0.77). Compared to the established Framingham risk score for hypertension, the predictive performance of the new model was similar in the SHIP validation set and moderately better in INTER99. Data mining procedures identified a predictive model for incident hypertension, which included innovative and easy-to-measure variables. The findings promise great applicability in screening settings and clinical practice.
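
    The paper's model is a Bayesian network learned from SHIP data, which is not reproduced here; as a loose stand-in, the sketch below trains a generic probabilistic classifier (naive Bayes) on synthetic versions of the named features and evaluates it by AUC:

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(3)
n = 1605
# Synthetic stand-ins for the reported features: age, mean arterial pressure,
# a SNP dosage (rs16998073), serum glucose, urinary albumin.
X = np.column_stack([
    rng.uniform(20, 79, n),            # age
    rng.normal(90, 10, n),             # mean arterial pressure
    rng.integers(0, 3, n),             # risk-allele count
    rng.normal(5.3, 0.8, n),           # serum glucose
    rng.lognormal(2.0, 0.7, n),        # urinary albumin
])
# Synthetic outcome: incident hypertension driven by a few of the features.
logit = -14 + 0.04 * X[:, 0] + 0.08 * X[:, 1] + 0.2 * X[:, 2]
y = rng.random(n) < 1 / (1 + np.exp(-logit))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = GaussianNB().fit(X_tr, y_tr)
auc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
print(f"AUC: {auc:.2f}")   # the paper reports AUCs of 0.76-0.78 for its model
```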

  6. Using HPV prevalence to predict cervical cancer incidence.

    Science.gov (United States)

    Sharma, Monisha; Bruni, Laia; Diaz, Mireia; Castellsagué, Xavier; de Sanjosé, Silvia; Bosch, F Xavier; Kim, Jane J

    2013-04-15

    Knowledge of a country's cervical cancer (CC) burden is critical to informing decisions about resource allocation to combat the disease; however, many countries lack cancer registries to provide such data. We developed a prognostic model to estimate CC incidence rates in countries without cancer registries, leveraging information on human papillomavirus (HPV) prevalence, screening, and other country-level factors. We used multivariate linear regression models to identify predictors of CC incidence in 40 countries. We extracted age-specific HPV prevalence (10-year age groups) by country from a meta-analysis in women with normal cytology (N = 40) and matched it to the most recent CC incidence rates from Cancer Incidence in Five Continents when available (N = 36), or Globocan 2008 (N = 4). We evaluated country-level behavioral, economic, and public health indicators. CC incidence was significantly associated with age-specific HPV prevalence in women aged 35-64 (adjusted R-squared 0.41) ("base model"). Adding geographic region to the base model increased the adjusted R-squared to 0.77, but the further addition of screening was not statistically significant. Similarly, country-level macro-indicators did not improve predictive validity. Age-specific HPV prevalence at older ages was found to be a better predictor of CC incidence than prevalence in women under 35. However, HPV prevalence could not explain the entire CC burden, as many factors modify women's risk of progression to cancer. Geographic region seemed to serve as a proxy for these country-level indicators. Our analysis supports the assertion that conducting a population-based HPV survey targeting women over age 35 can be valuable in approximating the CC risk in a given country. Copyright © 2012 UICC.

  7. A likelihood ratio-based method to predict exact pedigrees for complex families from next-generation sequencing data.

    Science.gov (United States)

    Heinrich, Verena; Kamphans, Tom; Mundlos, Stefan; Robinson, Peter N; Krawitz, Peter M

    2017-01-01

    Next-generation sequencing technology has considerably changed the way we screen for pathogenic mutations in rare Mendelian disorders. However, identifying the disease-causing mutation among thousands of variants of partly unknown relevance is still challenging, and efficient techniques that reduce the genomic search space play a decisive role. Segregation or linkage analysis is often used to prioritize candidates; however, these approaches require correct information about the degree of relationship among the sequenced samples. For quality assurance, an automated control of pedigree structures and sample assignment is therefore highly desirable in order to detect label mix-ups that might otherwise corrupt downstream analysis. We developed an algorithm based on likelihood ratios that discriminates between different classes of relationship for an arbitrary number of genotyped samples. By identifying the most likely class we are able to reconstruct entire pedigrees iteratively, even for highly consanguineous families. We tested our approach on exome data from different sequencing studies and achieved high precision for all pedigree predictions. By analyzing the precision for varying degrees of relatedness or inbreeding, we could show that a prediction is robust down to magnitudes of a few hundred loci. A Java standalone application that computes the relationships between multiple samples, as well as an R script that visualizes the pedigree information, is available for download and as a web service at www.gene-talk.de. Contact: heinrich@molgen.mpg.de. Supplementary information: Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.
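
    A highly simplified sketch of a likelihood-ratio relatedness test: it contrasts only two relationship classes (unrelated vs duplicate/monozygotic), whereas the paper's algorithm covers arbitrary classes via IBD-sharing probabilities. Allele frequencies and genotypes are made up:

```python
import numpy as np

def genotype_prob(g, p):
    """Hardy-Weinberg probability of genotype g in {0,1,2} risk-allele copies."""
    return {0: (1 - p) ** 2, 1: 2 * p * (1 - p), 2: p ** 2}[g]

def log_lr_duplicate_vs_unrelated(g1, g2, freqs):
    """Sum of per-locus log10 likelihood ratios at independent biallelic loci."""
    log_lr = 0.0
    for a, b, p in zip(g1, g2, freqs):
        lik_unrelated = genotype_prob(a, p) * genotype_prob(b, p)
        lik_duplicate = genotype_prob(a, p) if a == b else 0.0
        if lik_duplicate == 0.0:
            return -np.inf            # a single mismatch excludes duplication
        log_lr += np.log10(lik_duplicate / lik_unrelated)
    return log_lr

freqs = [0.3, 0.5, 0.2, 0.4]
print(log_lr_duplicate_vs_unrelated([0, 1, 2, 1], [0, 1, 2, 1], freqs))
print(log_lr_duplicate_vs_unrelated([0, 1, 2, 1], [2, 1, 0, 1], freqs))
```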

  8. Incidents Prediction in Road Junctions Using Artificial Neural Networks

    Science.gov (United States)

    Hajji, Tarik; Alami Hassani, Aicha; Ouazzani Jamil, Mohammed

    2018-05-01

    The implementation of an incident detection system (IDS) is an indispensable operation in the analysis of road traffic. However, an IDS can in no case replace the classical monitoring system controlled by the human eye. The aim of this work is to increase the probability of detecting and predicting incidents in camera-monitored areas, given that these areas are watched by multiple cameras but few supervisors. Our solution is to use artificial neural networks (ANNs) to analyze the trajectories of moving objects in captured images. We first propose a model of the trajectories and their characteristics, then we develop a training database of valid and invalid trajectories, and finally we carry out a comparative study to find the artificial neural network architecture that maximizes the recognition rate of valid and invalid trajectories.
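
    A small sketch of the trajectory-classification idea, assuming trajectories are resampled to fixed-length feature vectors and a scikit-learn MLP stands in for the paper's ANN; the loop over layer sizes mirrors the architecture search described above:

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)
# Each trajectory is resampled to 10 (x, y) points and flattened to a
# 20-dimensional feature vector; labels mark valid (0) vs invalid (1) paths.
# (This feature design is illustrative, not the paper's exact encoding.)
n = 400
t = np.linspace(0, 1, 10)
valid = np.stack([np.column_stack([t, t + rng.normal(0, 0.02, 10)]).ravel()
                  for _ in range(n // 2)])
invalid = np.stack([np.column_stack([t, rng.normal(0.5, 0.3, 10)]).ravel()
                    for _ in range(n // 2)])
X = np.vstack([valid, invalid])
y = np.array([0] * (n // 2) + [1] * (n // 2))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
# Compare hidden-layer sizes, mirroring the comparative study above.
for hidden in [(8,), (16,), (16, 8)]:
    clf = MLPClassifier(hidden_layer_sizes=hidden, max_iter=2000,
                        random_state=0).fit(X_tr, y_tr)
    print(hidden, f"accuracy: {clf.score(X_te, y_te):.2f}")
```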

  9. Logic of likelihood

    International Nuclear Information System (INIS)

    Wall, M.J.W.

    1992-01-01

    The notion of "probability" is generalized to that of "likelihood," and a natural logical structure is shown to exist for any physical theory which predicts likelihoods. Two physically based axioms are given for this logical structure to form an orthomodular poset, with an order-determining set of states. The results strengthen the basis of the quantum logic approach to axiomatic quantum theory. 25 refs

  10. Predicting Cumulative Incidence Probability: Marginal and Cause-Specific Modelling

    DEFF Research Database (Denmark)

    Scheike, Thomas H.; Zhang, Mei-Jie

    2005-01-01

    Cumulative incidence probability; cause-specific hazards; subdistribution hazard; binomial modelling.

  11. Predicting Cumulative Incidence Probability by Direct Binomial Regression

    DEFF Research Database (Denmark)

    Scheike, Thomas H.; Zhang, Mei-Jie

    Binomial modelling; cumulative incidence probability; cause-specific hazards; subdistribution hazard.

  12. Self-stigma of seeking treatment and being male predict an increased likelihood of having an undiagnosed eating disorder.

    Science.gov (United States)

    Griffiths, Scott; Mond, Jonathan M; Li, Zhicheng; Gunatilake, Sanduni; Murray, Stuart B; Sheffield, Jeanie; Touyz, Stephen

    2015-09-01

    To examine whether self-stigma of seeking psychological help and being male are associated with an increased likelihood of having an undiagnosed eating disorder. A multi-national sample of 360 individuals with diagnosed eating disorders and 125 individuals with undiagnosed eating disorders was recruited. Logistic regression was used to identify variables affecting the likelihood of having an undiagnosed eating disorder, including sex, self-stigma of seeking psychological help, and perceived stigma of having a mental illness, controlling for a broad range of covariates. Being male and reporting greater self-stigma of seeking psychological help were independently associated with an increased likelihood of being undiagnosed. Further, the association between self-stigma of seeking psychological help and increased likelihood of being undiagnosed was significantly stronger for males than for females. Perceived stigma associated with help-seeking may be a salient barrier to treatment for eating disorders, particularly among male sufferers. © 2015 Wiley Periodicals, Inc.
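
    A sketch of the kind of logistic regression with a sex-by-stigma interaction that underlies the moderation result above, fitted on synthetic data (variable names and coefficients are illustrative):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n = 485
# Synthetic stand-ins for the study variables (names illustrative).
male   = rng.integers(0, 2, n)
stigma = rng.normal(0, 1, n)                 # self-stigma score (z-scored)
logit  = -1.2 + 0.5 * male + 0.4 * stigma + 0.4 * male * stigma
undiag = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)
df = pd.DataFrame({"undiagnosed": undiag, "male": male, "stigma": stigma})

# 'male * stigma' expands to both main effects plus their interaction,
# mirroring the moderation result reported above.
fit = smf.logit("undiagnosed ~ male * stigma", data=df).fit(disp=False)
print(fit.summary())
```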

  13. Conditional predictive inference for online surveillance of spatial disease incidence

    Science.gov (United States)

    Corberán-Vallet, Ana; Lawson, Andrew B.

    2012-01-01

    This paper deals with the development of statistical methodology for timely detection of incident disease clusters in space and time. The increasing availability of data on both the time and the location of events enables the construction of multivariate surveillance techniques, which may enhance the ability to detect localized clusters of disease relative to the surveillance of the overall count of disease cases across the entire study region. We introduce the surveillance conditional predictive ordinate as a general Bayesian model-based surveillance technique that allows us to detect small areas of increased disease incidence when spatial data are available. To address the problem of multiple comparisons, we incorporate a common probability that each small area signals an alarm when no change in the risk pattern of disease takes place into the analysis. We investigate the performance of the proposed surveillance technique within the framework of Bayesian hierarchical Poisson models using a simulation study. Finally, we present a case study of salmonellosis in South Carolina. PMID:21898522

  14. Predicting Likelihood of Surgery Prior to First Visit in Patients with Back and Lower Extremity Symptoms: A simple mathematical model based on over 8000 patients.

    Science.gov (United States)

    Boden, Lauren M; Boden, Stephanie A; Premkumar, Ajay; Gottschalk, Michael B; Boden, Scott D

    2018-02-09

    Retrospective analysis of prospectively collected data. The objective was to create a data-driven triage system stratifying patients by their likelihood of undergoing spinal surgery within one year of presentation. Low back pain (LBP) and radicular lower extremity (LE) symptoms are common musculoskeletal problems, and there is currently no standard data-derived triage process, based on information obtainable prior to the initial physician-patient encounter, to direct patients to the optimal physician type. We analyzed patient-reported data from 8006 patients with a chief complaint of LBP and/or LE radicular symptoms who presented to surgeons at a large multidisciplinary spine center between September 1, 2005 and June 30, 2016. Univariate and multivariate analyses identified independent risk factors for undergoing spinal surgery within one year of the initial visit. A model incorporating these risk factors was created using a random sample of 80% of the total patients in our cohort and validated on the remaining 20%. The baseline one-year surgery rate within our cohort was 39% for all patients and 42% for patients with LE symptoms. Those identified as high likelihood by the center's existing triage process had a surgery rate of 45%. The new triage scoring system proposed in this study identified a high-likelihood group in which 58% underwent surgery, a 46% higher surgery rate than in non-triaged patients and a 29% improvement over our institution's existing triage system. The data-driven triage model and scoring system derived and validated in this study (Spine Surgery Likelihood model [SSL-11]) significantly improved on existing processes in predicting the likelihood of undergoing spinal surgery within one year of initial presentation. This triage system will allow centers to more selectively screen for surgical candidates and more effectively direct patients to surgeons or non-operative spine specialists. Level of evidence: 4.

  15. Traffic Incident Clearance Time and Arrival Time Prediction Based on Hazard Models

    Directory of Open Access Journals (Sweden)

    Yang beibei Ji

    2014-01-01

    Accurate prediction of incident duration is not only important information for a Traffic Incident Management System, but also an effective input for travel time prediction. In this paper, hazard-based prediction models are developed for both incident clearance time and arrival time. The data were obtained from the Queensland Department of Transport and Main Roads' STREAMS Incident Management System (SIMS) for one year ending in November 2010. The best-fitting distributions are determined for both clearance and arrival time for three types of incident: crash, stationary vehicle, and hazard. The results show that the Gamma, log-logistic, and Weibull distributions are the best fits for crash, stationary vehicle, and hazard incidents, respectively. Significant factors are identified for crash clearance time and arrival time, quantitative influences are presented for crash and hazard incidents for both clearance and arrival, and model accuracy is analyzed at the end.
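
    A minimal sketch of comparing candidate duration distributions by maximum likelihood and AIC, as in the distribution selection described above; scipy's fisk is its name for the log-logistic, and the durations are synthetic:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
# Illustrative clearance times in minutes for one incident type.
durations = rng.gamma(shape=2.0, scale=15.0, size=300)

candidates = {"gamma": stats.gamma,
              "log-logistic": stats.fisk,     # scipy's log-logistic
              "weibull": stats.weibull_min}

for name, dist in candidates.items():
    params = dist.fit(durations, floc=0)      # fix location at zero
    loglik = dist.logpdf(durations, *params).sum()
    k = len(params) - 1                       # free parameters (loc fixed)
    print(f"{name:13s} AIC = {2 * k - 2 * loglik:.1f}")
```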

  16. Real Time Big Data Analytics for Predicting Terrorist Incidents

    Science.gov (United States)

    Toure, Ibrahim

    2017-01-01

    Terrorism is a complex and evolving phenomenon. In the past few decades, we have witnessed an increase in the number of terrorist incidents in the world. The security and stability of many countries is threatened by terrorist groups. Perpetrators now use sophisticated weapons and the attacks are more and more lethal. Currently, terrorist incidents…

  17. Incidence and predictive factors of isolated neonatal penile glanular torsion.

    Science.gov (United States)

    Sarkis, Pierrot E; Sadasivam, Muthurajan

    2007-12-01

    To determine the incidence of isolated neonatal penile glanular torsion, describe its basic characteristics, and explore the relationship between foreskin and glans torsion. A prospective survey was conducted of all male newborns admitted to the nursery after delivery, or neonates less than 3 months old presenting for circumcision. Cases with associated genital malformations were excluded. The incidence of isolated neonatal penile torsion was 27% (95% CI: 22.2%-31.84%), to the left in 99% of cases. In 3.5% of cases, the penis had a torsion angle greater than 20 degrees. Using Spearman's correlation coefficient, deviation of the penile raphe from the midline at the foreskin tip had a better correlation with glans torsion than deviation of the raphe at the coronal sulcus (0.727 vs 0.570; both significant at p < […]) […] scope of the study.

  18. Prediction of sound transmission loss through multilayered panels by using Gaussian distribution of directional incident energy

    Science.gov (United States)

    Kang; Ih; Kim; Kim

    2000-03-01

    In this study, a new prediction method is suggested for the sound transmission loss (STL) of multilayered panels of infinite extent. Conventional methods such as the random or field incidence approaches often give significant discrepancies in predicting the STL of multilayered panels when compared with experiments. In this paper, appropriate directional distributions of incident energy for predicting the STL of multilayered panels are proposed. In order to find a weighting function to represent the directional distribution of incident energy on the wall of a reverberation chamber, numerical simulations using a ray-tracing technique are carried out. Simulation results reveal that the directional distribution can be approximately expressed by a Gaussian distribution function in terms of the angle of incidence. The Gaussian function is applied to predict the STL of various multilayered panel configurations as well as single panels. Comparisons between measurement and prediction show good agreement, validating the proposed Gaussian function approach.
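
    A compact form of the Gaussian directional-weighting idea, assuming the weighting enters the standard diffuse-field average of the angle-dependent transmission coefficient (notation and the exact form of G are illustrative):

```latex
% Diffuse-field transmission coefficient with a Gaussian directional
% weighting G(\theta) replacing the uniform (random-incidence) distribution;
% \theta is the angle of incidence and \tau(\theta) the angle-dependent
% transmission coefficient of the (multilayered) panel.
\begin{equation}
\bar{\tau} =
\frac{\int_0^{\pi/2} \tau(\theta)\, G(\theta) \sin\theta \cos\theta \, d\theta}
     {\int_0^{\pi/2} G(\theta) \sin\theta \cos\theta \, d\theta},
\qquad
G(\theta) = e^{-\theta^2 / (2\sigma^2)},
\qquad
\mathrm{STL} = -10 \log_{10} \bar{\tau}.
\end{equation}
% Setting G \equiv 1 recovers the classical random-incidence (field) result.
```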

  19. Identifying Predictive Factors for Incident Reports in Patients Receiving Radiation Therapy

    Energy Technology Data Exchange (ETDEWEB)

    Elnahal, Shereef M., E-mail: selnaha1@jhmi.edu [Department of Radiation Oncology and Molecular Radiation Sciences, Sidney Kimmel Comprehensive Cancer Center, Johns Hopkins University School of Medicine, Baltimore, Maryland (United States); Blackford, Amanda [Department of Oncology Biostatistics, Sidney Kimmel Comprehensive Cancer Center, Johns Hopkins University School of Medicine, Baltimore, Maryland (United States); Smith, Koren; Souranis, Annette N.; Briner, Valerie; McNutt, Todd R.; DeWeese, Theodore L.; Wright, Jean L.; Terezakis, Stephanie A. [Department of Radiation Oncology and Molecular Radiation Sciences, Sidney Kimmel Comprehensive Cancer Center, Johns Hopkins University School of Medicine, Baltimore, Maryland (United States)

    2016-04-01

    Purpose: To describe radiation therapy cases during which voluntary incident reporting occurred, and to identify patient- or treatment-specific factors that place patients at higher risk for incidents. Methods and Materials: We used our institution's incident learning system to build a database of patients with incident reports filed between January 2011 and December 2013. Patient- and treatment-specific data were reviewed for all patients with reported incidents, which were classified by step in the process and root cause. A control group of patients without events was generated for comparison. Summary statistics, likelihood ratios, and mixed-effect logistic regression models were used for group comparisons. Results: The incident and control groups comprised 794 and 499 patients, respectively. Common root causes included documentation errors (26.5%), communication (22.5%), technical treatment planning (37.5%), and technical treatment delivery (13.5%). Incidents were more frequently reported in minors (age <18 years) than in adult patients (37.7% vs 0.4%, P<.001). Patients with head and neck (16% vs 8%, P<.001) and breast (20% vs 15%, P=.03) primaries more frequently had incidents, whereas brain (18% vs 24%, P=.008) primaries were less frequent. Patients with larger tumors (17% vs 10% had T4 lesions, P=.02), cases on protocol (9% vs 5%, P=.005), and cases with intensity modulated radiation therapy/image guided intensity modulated radiation therapy (52% vs 43%, P=.001) were more likely to have incidents. Conclusions: We found several treatment- and patient-specific variables associated with incidents. These factors should be considered by treatment teams at the time of peer review to identify patients at higher risk. Larger datasets are required to recommend changes in care process standards to minimize safety risks.

  20. Identifying Predictive Factors for Incident Reports in Patients Receiving Radiation Therapy

    International Nuclear Information System (INIS)

    Elnahal, Shereef M.; Blackford, Amanda; Smith, Koren; Souranis, Annette N.; Briner, Valerie; McNutt, Todd R.; DeWeese, Theodore L.; Wright, Jean L.; Terezakis, Stephanie A.

    2016-01-01

    Purpose: To describe radiation therapy cases during which voluntary incident reporting occurred, and to identify patient- or treatment-specific factors that place patients at higher risk for incidents. Methods and Materials: We used our institution's incident learning system to build a database of patients with incident reports filed between January 2011 and December 2013. Patient- and treatment-specific data were reviewed for all patients with reported incidents, which were classified by step in the process and root cause. A control group of patients without events was generated for comparison. Summary statistics, likelihood ratios, and mixed-effect logistic regression models were used for group comparisons. Results: The incident and control groups comprised 794 and 499 patients, respectively. Common root causes included documentation errors (26.5%), communication (22.5%), technical treatment planning (37.5%), and technical treatment delivery (13.5%). Incidents were more frequently reported in minors (age <18 years) than in adult patients (37.7% vs 0.4%, P<.001). Patients with head and neck (16% vs 8%, P<.001) and breast (20% vs 15%, P=.03) primaries more frequently had incidents, whereas brain (18% vs 24%, P=.008) primaries were less frequent. Patients with larger tumors (17% vs 10% had T4 lesions, P=.02), cases on protocol (9% vs 5%, P=.005), and cases with intensity modulated radiation therapy/image guided intensity modulated radiation therapy (52% vs 43%, P=.001) were more likely to have incidents. Conclusions: We found several treatment- and patient-specific variables associated with incidents. These factors should be considered by treatment teams at the time of peer review to identify patients at higher risk. Larger datasets are required to recommend changes in care process standards to minimize safety risks.

  1. A formal likelihood function for parameter and predictive inference of hydrologic models with correlated, heteroscedastic, and non-Gaussian errors

    NARCIS (Netherlands)

    Schoups, G.; Vrugt, J.A.

    2010-01-01

    Estimation of parameter and predictive uncertainty of hydrologic models has traditionally relied on several simplifying assumptions. Residual errors are often assumed to be independent and to be adequately described by a Gaussian probability distribution with a mean of zero and a constant variance.

  2. Usefulness and limitations of dK random graph models to predict interactions and functional homogeneity in biological networks under a pseudo-likelihood parameter estimation approach

    Directory of Open Access Journals (Sweden)

    Luan Yihui

    2009-09-01

    Background: Many aspects of biological functions can be modeled by biological networks, such as protein interaction networks, metabolic networks, and gene coexpression networks. Studying the statistical properties of these networks in turn allows us to infer biological function. Complex statistical network models can potentially more accurately describe the networks, but it is not clear whether such complex models are better suited to find biologically meaningful subnetworks. Results: Recent studies have shown that the degree distribution of the nodes is not an adequate statistic in many molecular networks. We sought to extend this statistic with 2nd- and 3rd-order degree correlations and developed a pseudo-likelihood approach to estimate the parameters. The approach was used to analyze the MIPS and BIOGRID yeast protein interaction networks, and two yeast coexpression networks. We showed that 2nd-order degree correlation information gave better predictions of gene interactions in both protein interaction and gene coexpression networks. However, in the biologically important task of predicting functionally homogeneous modules, degree correlation information performs marginally better in the case of the MIPS and BIOGRID protein interaction networks, but worse in the case of gene coexpression networks. Conclusion: Our use of dK models showed that incorporation of degree correlations could increase predictive power in some contexts, albeit sometimes marginally, but, in all contexts, the use of third-order degree correlations decreased accuracy. However, it is possible that other parameter estimation methods, such as maximum likelihood, will show the usefulness of incorporating 2nd and 3rd degree correlations in predicting functionally homogeneous modules.

  3. Usefulness and limitations of dK random graph models to predict interactions and functional homogeneity in biological networks under a pseudo-likelihood parameter estimation approach.

    Science.gov (United States)

    Wang, Wenhui; Nunez-Iglesias, Juan; Luan, Yihui; Sun, Fengzhu

    2009-09-03

    Many aspects of biological functions can be modeled by biological networks, such as protein interaction networks, metabolic networks, and gene coexpression networks. Studying the statistical properties of these networks in turn allows us to infer biological function. Complex statistical network models can potentially more accurately describe the networks, but it is not clear whether such complex models are better suited to find biologically meaningful subnetworks. Recent studies have shown that the degree distribution of the nodes is not an adequate statistic in many molecular networks. We sought to extend this statistic with 2nd and 3rd order degree correlations and developed a pseudo-likelihood approach to estimate the parameters. The approach was used to analyze the MIPS and BIOGRID yeast protein interaction networks, and two yeast coexpression networks. We showed that 2nd order degree correlation information gave better predictions of gene interactions in both protein interaction and gene coexpression networks. However, in the biologically important task of predicting functionally homogeneous modules, degree correlation information performs marginally better in the case of the MIPS and BIOGRID protein interaction networks, but worse in the case of gene coexpression networks. Our use of dK models showed that incorporation of degree correlations could increase predictive power in some contexts, albeit sometimes marginally, but, in all contexts, the use of third-order degree correlations decreased accuracy. However, it is possible that other parameter estimation methods, such as maximum likelihood, will show the usefulness of incorporating 2nd and 3rd degree correlations in predicting functionally homogeneous modules.
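    The degree-correlation statistics the two records above build on can be illustrated with standard graph measures. The sketch below uses networkx and a synthetic network as a stand-in for the MIPS/BIOGRID data; it computes the degree distribution (the 1st-order statistic) and two common summaries of 2nd-order degree correlation:

    ```python
    # Degree statistics beyond the 1st-order degree distribution, on a
    # synthetic network standing in for the MIPS/BIOGRID data.
    import networkx as nx

    G = nx.barabasi_albert_graph(1000, 3, seed=0)  # placeholder graph

    degree_hist = nx.degree_histogram(G)                     # 1st-order statistic
    assortativity = nx.degree_assortativity_coefficient(G)   # 2nd-order summary
    knn = nx.average_degree_connectivity(G)                  # mean neighbor degree by k

    print(f"degree assortativity = {assortativity:.3f}")
    ```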

  4. PREDICTIVE MODELS FOR SUPPORT OF INCIDENT MANAGEMENT PROCESS IN IT SERVICE MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Martin SARNOVSKY

    2018-03-01

    Full Text Available ABSTRACT The work presented in this paper is focused on creating predictive models that help in the process of incident resolution and the implementation of IT infrastructure changes, to increase the overall support of IT management. Our main objective was to build the predictive models using machine learning algorithms and the CRISP-DM methodology. We used the incident and related-changes database obtained from the IT environment of the Rabobank Group, which contained information about the processing of incidents during the incident management process. We decided to investigate the dependencies between the observation of an incident on a particular infrastructure component and the actual source of the incident, as well as the dependency between incidents and related changes in the infrastructure. We used Random Forest and Gradient Boosting Machine classifiers to identify the incident source as well as to predict the possible impact of an observed incident. Both types of models were tested on a testing set and evaluated using defined metrics.
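    A hedged sketch of the modeling step described above, with scikit-learn's Random Forest and Gradient Boosting classifiers standing in for the paper's models; the synthetic features replace the non-public Rabobank incident data:

    ```python
    # Random Forest and Gradient Boosting classifiers as stand-ins for the
    # paper's models; synthetic features replace the Rabobank incident data.
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import train_test_split

    X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

    for clf in (RandomForestClassifier(n_estimators=200, random_state=0),
                GradientBoostingClassifier(random_state=0)):
        clf.fit(X_tr, y_tr)
        print(type(clf).__name__, accuracy_score(y_te, clf.predict(X_te)))
    ```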

  5. Performance of Lynch syndrome predictive models in quantifying the likelihood of germline mutations in patients with abnormal MLH1 immunoexpression.

    Science.gov (United States)

    Cabreira, Verónica; Pinto, Carla; Pinheiro, Manuela; Lopes, Paula; Peixoto, Ana; Santos, Catarina; Veiga, Isabel; Rocha, Patrícia; Pinto, Pedro; Henrique, Rui; Teixeira, Manuel R

    2017-01-01

    Lynch syndrome (LS) accounts for up to 4% of all colorectal cancers (CRC). Detection of a pathogenic germline mutation in one of the mismatch repair genes is the definitive criterion for LS diagnosis, but it is time-consuming and expensive. Immunohistochemistry is the most sensitive prescreening test and its predictive value is very high for loss of expression of MSH2, MSH6, and (isolated) PMS2, but not for MLH1. We evaluated whether LS predictive models have a role in improving the molecular testing algorithm in this specific setting by studying 38 individuals referred for molecular testing and subsequently shown to have loss of MLH1 immunoexpression in their tumors. For each proband we calculated a risk score, representing the probability that the patient with CRC carries a pathogenic MLH1 germline mutation, using the PREMM1,2,6 and MMRpro predictive models. Of the 38 individuals, 18.4% had a pathogenic MLH1 germline mutation. MMRpro performed better for the purpose of this study, presenting an AUC of 0.83 (95% CI 0.67-0.9; P < 0.001) compared with an AUC of 0.68 (95% CI 0.51-0.82, P = 0.09) for PREMM1,2,6. Considering a threshold of 5%, MMRpro would eliminate unnecessary germline mutation analysis in a significant proportion of cases while keeping very high sensitivity. We conclude that MMRpro is useful for correctly predicting who should be screened for a germline MLH1 gene mutation and propose an algorithm to improve the cost-effectiveness of LS diagnosis.
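    The evaluation reported here, an AUC plus model behavior at a 5% referral threshold, can be sketched as follows; the risk scores and carrier labels are invented placeholders, not study data:

    ```python
    # AUC plus sensitivity and NPV at a 5% referral threshold.
    # Risk scores and carrier labels are invented placeholders.
    import numpy as np
    from sklearn.metrics import roc_auc_score

    carrier = np.array([1, 1, 0, 0, 0, 1, 0, 0, 0, 0])  # pathogenic mutation found
    risk = np.array([0.42, 0.18, 0.03, 0.07, 0.02, 0.31, 0.04, 0.01, 0.12, 0.02])

    auc = roc_auc_score(carrier, risk)
    refer = risk >= 0.05                      # only these go on to germline testing
    sensitivity = carrier[refer].sum() / carrier.sum()
    npv = (carrier[~refer] == 0).mean()       # non-referred who are true negatives
    print(f"AUC = {auc:.2f}, sensitivity = {sensitivity:.2f}, NPV = {npv:.2f}")
    ```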

  6. Quantitative measures of meniscus extrusion predict incident radiographic knee osteoarthritis – data from the Osteoarthritis Initiative

    Science.gov (United States)

    Emmanuel, K.; Quinn, E.; Niu, J.; Guermazi, A.; Roemer, F.; Wirth, W.; Eckstein, F.; Felson, D.

    2017-01-01

    SUMMARY Objective To test the hypothesis that quantitative measures of meniscus extrusion predict incident radiographic knee osteoarthritis (KOA), prior to the advent of radiographic disease. Methods 206 knees with incident radiographic KOA (Kellgren Lawrence Grade (KLG) 0 or 1 at baseline, developing KLG 2 or greater with a definite osteophyte and joint space narrowing (JSN) grade ≥1 by year 4) were matched to 232 control knees not developing incident KOA. Manual segmentation of the central five slices of the medial and lateral meniscus was performed on coronal 3T DESS MRI and quantitative meniscus position was determined. Cases and controls were compared using conditional logistic regression adjusting for age, sex, BMI, race and clinical site. Sensitivity analyses of early (year [Y] 1/2) and late (Y3/4) incidence were performed. Results Mean medial extrusion distance was significantly greater for incident compared to non-incident knees (1.56 ± 1.12 mm vs 1.29 ± 0.99 mm; +21%), as was the percent extrusion area of the medial meniscus (25.8 ± 15.8% vs 22.0 ± 13.5%; +17%). No significant differences were observed for the lateral meniscus in incident medial KOA, or for the tibial plateau coverage between incident and non-incident knees. Restricting the analysis to medial incident KOA at Y1/2, differences were attenuated but reached significance for extrusion distance, whereas no significant differences were observed for incident KOA in Y3/4. Conclusion Greater medial meniscus extrusion predicts incident radiographic KOA. Early onset KOA showed greater differences for meniscus position between incident and non-incident knees than late onset KOA. PMID:26318658

  7. Quantitative measures of meniscus extrusion predict incident radiographic knee osteoarthritis--data from the Osteoarthritis Initiative.

    Science.gov (United States)

    Emmanuel, K; Quinn, E; Niu, J; Guermazi, A; Roemer, F; Wirth, W; Eckstein, F; Felson, D

    2016-02-01

    To test the hypothesis that quantitative measures of meniscus extrusion predict incident radiographic knee osteoarthritis (KOA), prior to the advent of radiographic disease. 206 knees with incident radiographic KOA (Kellgren Lawrence Grade (KLG) 0 or 1 at baseline, developing KLG 2 or greater with a definite osteophyte and joint space narrowing (JSN) grade ≥1 by year 4) were matched to 232 control knees not developing incident KOA. Manual segmentation of the central five slices of the medial and lateral meniscus was performed on coronal 3T DESS MRI and quantitative meniscus position was determined. Cases and controls were compared using conditional logistic regression adjusting for age, sex, BMI, race and clinical site. Sensitivity analyses of early (year [Y] 1/2) and late (Y3/4) incidence were performed. Mean medial extrusion distance was significantly greater for incident compared to non-incident knees (1.56 ± 1.12 mm vs 1.29 ± 0.99 mm; +21%), as was the percent extrusion area of the medial meniscus (25.8 ± 15.8% vs 22.0 ± 13.5%; +17%). No significant differences were observed for the lateral meniscus in incident medial KOA, or for the tibial plateau coverage between incident and non-incident knees. Restricting the analysis to medial incident KOA at Y1/2, differences were attenuated but reached significance for extrusion distance, whereas no significant differences were observed for incident KOA in Y3/4. Greater medial meniscus extrusion predicts incident radiographic KOA. Early onset KOA showed greater differences for meniscus position between incident and non-incident knees than late onset KOA. Copyright © 2015 Osteoarthritis Research Society International. Published by Elsevier Ltd. All rights reserved.
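    A sketch of the matched case-control analysis named in these two records, conditional logistic regression within matched strata; the data are simulated around the reported means, the effect size is an assumption, and statsmodels' ConditionalLogit (available in recent versions) is assumed:

    ```python
    # Conditional logistic regression within matched strata on simulated data;
    # ConditionalLogit ships with recent statsmodels versions.
    import numpy as np
    from statsmodels.discrete.conditional_models import ConditionalLogit

    rng = np.random.default_rng(0)
    n_pairs = 200
    strata = np.repeat(np.arange(n_pairs), 2)       # one case and one control each
    y = np.tile([1, 0], n_pairs)                    # incident KOA indicator
    extrusion = rng.normal(1.29, 1.0, 2 * n_pairs) + 0.27 * y  # mm, cases shifted

    result = ConditionalLogit(y, extrusion[:, None], groups=strata).fit()
    print(np.exp(result.params))  # odds ratio per mm of medial extrusion
    ```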

  8. Extended likelihood inference in reliability

    International Nuclear Information System (INIS)

    Martz, H.F. Jr.; Beckman, R.J.; Waller, R.A.

    1978-10-01

    Extended likelihood methods of inference are developed in which subjective information in the form of a prior distribution is combined with sampling results by means of an extended likelihood function. The extended likelihood function is standardized for use in obtaining extended likelihood intervals. Extended likelihood intervals are derived for the mean of a normal distribution with known variance, the failure-rate of an exponential distribution, and the parameter of a binomial distribution. Extended second-order likelihood methods are developed and used to solve several prediction problems associated with the exponential and binomial distributions. In particular, such quantities as the next failure-time, the number of failures in a given time period, and the time required to observe a given number of failures are predicted for the exponential model with a gamma prior distribution on the failure-rate. In addition, six types of life testing experiments are considered. For the binomial model with a beta prior distribution on the probability of nonsurvival, methods are obtained for predicting the number of nonsurvivors in a given sample size and for predicting the required sample size for observing a specified number of nonsurvivors. Examples illustrate each of the methods developed. Finally, comparisons are made with Bayesian intervals in those cases where these are known to exist
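    One of the binomial prediction problems described above has a closed-form answer: with a beta prior on the probability of nonsurvival, the predictive distribution of the number of nonsurvivors in a future sample is beta-binomial. A sketch with illustrative prior and data values, not those of the report:

    ```python
    # Beta-binomial predictive distribution for the number of nonsurvivors
    # in a future sample; prior and data values are illustrative.
    from scipy.stats import betabinom

    a, b = 2.0, 8.0        # beta prior on the probability of nonsurvival
    x, n = 3, 20           # observed nonsurvivors out of n units tested
    m = 10                 # size of the future sample to be predicted

    pred = betabinom(m, a + x, b + n - x)  # posterior is Beta(a + x, b + n - x)
    print(pred.mean(), pred.interval(0.90))
    ```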

  9. Predicting hepatitis B monthly incidence rates using weighted Markov chains and time series methods.

    Science.gov (United States)

    Shahdoust, Maryam; Sadeghifar, Majid; Poorolajal, Jalal; Javanrooh, Niloofar; Amini, Payam

    2015-01-01

    Hepatitis B (HB) is a major cause of mortality worldwide. Accurately predicting the trend of the disease can inform health policy for disease prevention. This paper aimed to apply three different methods to predict monthly incidence rates of HB. This historical cohort study was conducted on the HB incidence data of Hamadan Province, in the west of Iran, from 2004 to 2012. The Weighted Markov Chain (WMC) method, based on Markov chain theory, and two time series models, Holt Exponential Smoothing (HES) and SARIMA, were applied to the data. The applied methods were compared on the percentage of correctly predicted incidence rates. The monthly incidence rates were clustered into two clusters serving as states of the Markov chain. The correctly predicted percentages for the first and second clusters were (100, 0) for WMC, (84, 67) for HES and (79, 47) for SARIMA. The overall incidence rate of HBV is estimated to decrease over time. The comparison of the three models indicated that, given the seasonality and non-stationarity of the series, HES gave the most accurate predictions of the incidence rates.
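    The two time-series baselines named in this record can be sketched with statsmodels; the monthly series below is synthetic and the model orders are placeholders, not the orders fitted in the paper:

    ```python
    # Holt exponential smoothing and SARIMA on a synthetic monthly series;
    # the (1,0,1)x(1,0,1,12) orders are placeholders, not the fitted orders.
    import numpy as np
    from statsmodels.tsa.holtwinters import ExponentialSmoothing
    from statsmodels.tsa.statespace.sarimax import SARIMAX

    rng = np.random.default_rng(0)
    t = np.arange(108)  # months, 2004-2012
    incidence = 5 - 0.01 * t + 0.5 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 0.3, t.size)

    hes = ExponentialSmoothing(incidence, trend="add", seasonal="add",
                               seasonal_periods=12).fit()
    sarima = SARIMAX(incidence, order=(1, 0, 1), seasonal_order=(1, 0, 1, 12)).fit(disp=False)

    print(hes.forecast(12))     # next 12 months, HES
    print(sarima.forecast(12))  # next 12 months, SARIMA
    ```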

  10. Prediction of unsteady airfoil flows at large angles of incidence

    Science.gov (United States)

    Cebeci, Tuncer; Jang, H. M.; Chen, H. H.

    1992-01-01

    The effect of the unsteady motion of an airfoil on its stall behavior is of considerable interest to many practical applications including the blades of helicopter rotors and of axial compressors and turbines. Experiments with oscillating airfoils, for example, have shown that the flow can remain attached for angles of attack greater than those which would cause stall to occur in a stationary system. This result appears to stem from the formation of a vortex close to the surface of the airfoil which continues to provide lift. It is also evident that the onset of dynamic stall depends strongly on the airfoil section, and as a result, great care is required in the development of a calculation method which will accurately predict this behavior.

  11. Pelvic Incidence: A Predictive Factor for Three-Dimensional Acetabular Orientation—A Preliminary Study

    Directory of Open Access Journals (Sweden)

    Christophe Boulay

    2014-01-01

    Full Text Available Acetabular cup orientation (inclination and anteversion) is a fundamental topic in orthopaedics and depends on pelvis tilt (a positional parameter), emphasising the notion of a safe range of pelvis tilt. The hypothesis was that pelvic incidence (a morphologic parameter) could yield a more accurate and reliable assessment than pelvis tilt. The aim was to derive a predictive equation for the 3D acetabular orientation parameters, with pelvic incidence included in the model. The second aim was to consider the asymmetry between the right and left acetabulae. Twelve pelvic anatomic specimens were measured with an electromagnetic Fastrak system (Polhemus Society) providing the 3D positions of anatomical landmarks to allow measurement of acetabular and pelvic parameters. Acetabulum and pelvis data were correlated by a Spearman matrix. A robust linear regression analysis provided prediction of the acetabulum axes. The orientation of each acetabulum could be predicted from pelvic incidence. Pelvic incidence is correlated with the morphology of the acetabula. The asymmetry of the acetabular roof was correlated with pelvic incidence. This study allowed analysis of the relationships between acetabular orientation and pelvic incidence. Pelvic incidence (a morphologic parameter) could determine the safe range of pelvis tilt (a positional parameter) for an individual rather than a group.
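    The final step described above, a robust linear regression predicting an acetabular orientation angle from pelvic incidence, might look like the following sketch; the twelve synthetic specimens and the coefficient values are assumptions, not the cadaver data:

    ```python
    # Robust (Huber) linear regression predicting an orientation angle from
    # pelvic incidence; twelve synthetic specimens replace the cadaver data.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    pelvic_incidence = rng.normal(53, 10, 12)                       # degrees
    anteversion = 5 + 0.25 * pelvic_incidence + rng.normal(0, 2, 12)

    X = sm.add_constant(pelvic_incidence)
    fit = sm.RLM(anteversion, X, M=sm.robust.norms.HuberT()).fit()
    print(fit.params)  # intercept and slope of the predictive equation
    ```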

  12. Factors predictive for incidence and remission of internet addiction in young adolescents: a prospective study.

    Science.gov (United States)

    Ko, Chih-Hung; Yen, Ju-Yu; Yen, Cheng-Fang; Lin, Huang-Chi; Yang, Ming-Jen

    2007-08-01

    The aim of the study is to determine the incidence and remission rates for Internet addiction and the associated predictive factors in young adolescents over a 1-year follow-up. This was a prospective, population-based investigation. Five hundred seventeen students (267 male and 250 female) were recruited from three junior high schools in southern Taiwan. The factors examined included gender, personality, mental health, self-esteem, family function, life satisfaction, and Internet activities. The results revealed that the 1-year incidence and remission rates for Internet addiction were 7.5% and 49.5%, respectively. High exploratory excitability, low reward dependence, low self-esteem, low family function, and online game playing predicted the emergence of Internet addiction. Further, low hostility and low interpersonal sensitivity predicted remission of Internet addiction. The predictive factors for incidence and remission of Internet addiction identified in this study could inform the prevention and promotion of remission of Internet addiction in adolescents.

  13. Patterns, incidence and predictive factors for pain after interventional radiology

    International Nuclear Information System (INIS)

    England, A.; Tam, C.L.; Thacker, D.E.; Walker, A.L.; Parkinson, A.S.; DeMello, W.; Bradley, A.J.; Tuck, J.S.; Laasch, H.-U.; Butterfield, J.S.; Ashleigh, R.J.; England, R.E.; Martin, D.F.

    2005-01-01

    AIM: To evaluate prospectively the pattern, severity and predictive factors of pain after interventional radiological procedures. MATERIALS AND METHODS: All patients undergoing non-arterial radiological interventional procedures were assessed using a visual-analogue scale (VAS) for pain before and at regular intervals for 24 h after their procedure. RESULTS: One hundred and fifty patients (87 men, mean age 62 years, range 18-92 years) were entered into the study. Significant increases in VAS score occurred 8 h after percutaneous biliary procedures (+47.7 mm, SD 14.9 mm; p=0.001), 6 h after central venous access and gastrostomy insertion (+23.7 mm, SD 19.5 mm; p=0.001 and +28.4 mm, SD 9.7 mm; p=0.007, respectively) and 4 h after oesophageal stenting (+27.8 mm, SD 20.2 mm, p=0.001). Non-significant increases in VAS pain score were observed after duodenal and colonic stenting (duodenal: +5.13 mm, SD 7.47 mm; p=0.055, colonic: +23.3 mm, SD 13.10 mm, p=0.250) at a mean of 5 h (range 4-6 h). Patients reported a significant reduction in pain score for nephrostomy insertion (-28.4 mm, SD 7.11 mm, p=0.001). Post-procedural analgesia was required in 99 patients (69.2%), 40 (28.0%) requiring opiates. Maximum post-procedural VAS pain score was significantly higher in patients who had no pre-procedural analgesia (p=0.003). CONCLUSION: Post-procedural pain is common and the pattern and severity of pain between procedures is variable. Pain control after interventional procedures is often inadequate, and improvements in pain management are required

  14. Predicting mortality and incident immobility in older Belgian men by characteristics related to sarcopenia and frailty

    DEFF Research Database (Denmark)

    Kruse, C; Goemaere, S; De Buyser, S

    2018-01-01

    There is an increasing awareness of sarcopenia in older people. We applied machine learning principles to predict mortality and incident immobility in older Belgian men through sarcopenia and frailty characteristics. Mortality could be predicted with good accuracy. Serum 25-hydroxyvitamin D... and bone mineral density scores were the most important predictors. INTRODUCTION: Machine learning principles were used to predict 5-year mortality and 3-year incident severe immobility in a population of older men by frailty and sarcopenia characteristics. METHODS: Using prospective data from 1997 on 264... ...the most important predictors of immobility. Sarcopenia assessed by lean mass estimates was relevant to mortality prediction but not immobility prediction. CONCLUSIONS: Using advanced statistical models and a machine learning approach, 5-year mortality can be predicted with good accuracy using a Bayesian...

  15. Development of incident progress prediction technologies for nuclear emergency preparedness. Current status and future subjects

    International Nuclear Information System (INIS)

    Yoshida, Yoshitaka; Yamamoto, Yasunori; Kusunoki, Takayoshi; Kawasaki, Ikuo; Yanagi, Chihiro; Kinoshita, Ikuo; Iwasaki, Yoshito

    2014-01-01

    Under the government's Basic Plan for Disaster Prevention, nuclear licensees are required to maintain, during normal operation, a prediction system for use in a nuclear emergency. Predicting incident progress means understanding the present condition of the nuclear power plant appropriately, anticipating what situation will occur in the near future if conditions continue to worsen, choosing effective countermeasures against the coming threat, and estimating the time available for intervention. Following the accident of September 30, 1999 at the nuclear fuel fabrication facility in Tokai Village, Ibaraki Prefecture, the Institute of Nuclear Safety System started development of incident progress prediction technologies for nuclear emergency preparedness. We have applied these technologies and made improvements in nuclear emergency exercises, and verified the developed systems against the observed values of the Fukushima Daiichi Nuclear Power Plant accident. As a result, our Incident Progress Prediction System was applied to nuclear emergency exercises, and we accumulated knowledge and experience with which we improved the system to make predictions more rapidly and more precisely, including, for example, the development of a method for predicting the leak size of reactor coolant. On the other hand, if a rapidly progressing incident occurs, end users need simple and quick predictions of public protection and evacuation areas, so we developed the Radioactive Materials Release, Radiation Dose and Radiological Protection Area Prediction System, which recasts the inverse problem as a forward problem solution. In view of the water-level-decline incident at the spent fuel storage facility of the Fukushima Daiichi Nuclear Power Plant, the spent fuel storage facility water level and water temperature evaluation tool were improved. Such incident progress prediction technologies were

  16. Predicting Likelihood of Having Four or More Positive Nodes in Patient With Sentinel Lymph Node-Positive Breast Cancer: A Nomogram Validation Study

    International Nuclear Information System (INIS)

    Unal, Bulent; Gur, Akif Serhat; Beriwal, Sushil; Tang Gong; Johnson, Ronald; Ahrendt, Gretchen; Bonaventura, Marguerite; Soran, Atilla

    2009-01-01

    Purpose: Katz suggested a nomogram for predicting having four or more positive nodes in sentinel lymph node (SLN)-positive breast cancer patients. The findings from this formula might influence adjuvant radiotherapy decisions. Our goal was to validate the accuracy of the Katz nomogram. Methods and Materials: We reviewed the records of 309 patients with breast cancer who had undergone completion axillary lymph node dissection. The factors associated with the likelihood of having four or more positive axillary nodes were evaluated in patients with one to three positive SLNs. The nomogram developed by Katz was applied to our data set. The area under the curve of the corresponding receiver operating characteristics curve was calculated for the nomogram. Results: Of the 309 patients, 80 (25.9%) had four or more positive axillary lymph nodes. On multivariate analysis, the number of positive SLNs (p < .0001), overall metastasis size (p = .019), primary tumor size (p = .0001), and extracapsular extension (p = .01) were significant factors predicting for four or more positive nodes. For patients with <5% probability, 90.3% had fewer than four positive nodes and 9.7% had four or more positive nodes. The negative predictive value was 91.7%, and sensitivity was 80%. The nomogram was accurate and discriminating (area under the curve, .801). Conclusion: The probability of four or more involved nodes is significantly greater in patients who have an increased number of positive SLNs, increased overall metastasis size, increased tumor size, and extracapsular extension. The Katz nomogram was validated in our patients. This nomogram will be helpful to clinicians making adjuvant treatment recommendations to their patients.

  17. Incidence, Mortality, and Predictive Factors of Hepatocellular Carcinoma in Primary Biliary Cirrhosis

    Directory of Open Access Journals (Sweden)

    Kenichi Hosonuma

    2013-01-01

    Full Text Available Background. The study aims to analyze in detail the incidence and mortality of hepatocellular carcinoma (HCC) in primary biliary cirrhosis (PBC), using the standardized incidence ratio (SIR) and standardized mortality ratio (SMR), because no large case studies have focused on their detailed statistical analysis in Asia. Methods. The study cohorts were consecutively diagnosed at Gunma University and its affiliated hospitals. Age- or sex-specific annual cancer incidence and deaths were obtained from the Japanese Cancer Registry and Death Registry as references for the comparison of the SIR or SMR of HCC. Moreover, univariate and multivariate analyses were performed to clarify predictive factors for the incidence of HCC. Results. The overall 179 patients were followed up for a median of 97 months. HCC developed in 13 cases. The SIR for HCC was 11.6 (95% confidence interval (CI) 6.2–19.8) and the SMR for HCC was 11.2 (95% CI 5.4–20.6) in the overall patients. Serum albumin levels were a predictive factor for the incidence of HCC in the overall patients. Conclusions. The incidence and mortality of HCC in PBC patients were significantly higher than those in the Japanese general population. PBC patients with low serum albumin levels were a population at high risk for HCC.
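    The SIR in this record is simply observed over expected cases, with an exact Poisson confidence interval; in the sketch below the expected count is back-calculated so that the output reproduces the reported SIR of 11.6 (95% CI 6.2–19.8):

    ```python
    # SIR = observed / expected, with an exact Poisson 95% CI; the expected
    # count is back-calculated so the output matches the reported figures.
    from scipy.stats import chi2

    observed = 13      # HCC cases in the PBC cohort
    expected = 1.12    # cases expected from registry age/sex-specific rates

    sir = observed / expected
    lo = chi2.ppf(0.025, 2 * observed) / (2 * expected)
    hi = chi2.ppf(0.975, 2 * (observed + 1)) / (2 * expected)
    print(f"SIR = {sir:.1f} (95% CI {lo:.1f}-{hi:.1f})")  # 11.6 (6.2-19.8)
    ```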

  18. Temporal Trends and Future Prediction of Breast Cancer Incidence Across Age Groups in Trivandrum, South India.

    Science.gov (United States)

    Mathew, Aleyamma; George, Preethi Sara; Arjunan, Asha; Augustine, Paul; Kalavathy, Mc; Padmakumari, G; Mathew, Beela Sarah

    2016-01-01

    Increasing breast cancer (BC) incidence rates have been reported from India; causal factors for this increased incidence are not understood and diagnosis is mostly at advanced stages. Trivandrum exhibits the highest BC incidence rates in India. This study aimed to estimate trends in incidence by age from 2005-2014, to predict rates through 2020 and to assess the stage at diagnosis of BC in Trivandrum. BC cases were obtained from the Population Based Cancer Registry, Trivandrum. Distribution of stage at diagnosis and incidence rates of BC [age-specific (ASpR), crude (CR) and age-standardized (ASR)] are described and employed with a joinpoint regression model to estimate average annual percent changes (AAPC) and a Bayesian model to estimate predictive rates. BC accounts for 31% (2681/8737) of all female cancers in Trivandrum. Thirty-five percent (944/2681) are 60 years and the overall CR is 80 (ASR: 57) for 2019-20. BC, mostly diagnosed in advanced stages, is rising rapidly in South India with large increases likely in the future, particularly among post-menopausal women. This increase might be due to aging and/or changes in lifestyle factors. Reasons for the increased incidence and late-stage diagnosis need to be studied.
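    A single-segment version of the trend estimate behind the AAPC reported above (a joinpoint model fits several such segments): regress log incidence on calendar year, then convert the slope. The rates below are invented for illustration:

    ```python
    # Annual percent change from a log-linear fit of rate on year; a joinpoint
    # model fits several such segments. Rates are invented for illustration.
    import numpy as np

    years = np.arange(2005, 2015)
    asr = np.array([38.1, 39.5, 41.2, 42.0, 44.3, 45.1, 47.6, 48.2, 50.9, 52.4])

    slope, _ = np.polyfit(years, np.log(asr), 1)
    apc = 100 * (np.exp(slope) - 1)
    print(f"annual percent change = {apc:.1f}%")
    ```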

  19. Screening and Predicting Posttraumatic Stress and Depression in Children Following Single-Incident Trauma

    Science.gov (United States)

    Nixon, Reginald D. V.; Ellis, Alicia A.; Nehmy, Thomas J.; Ball, Shelley-Anne

    2010-01-01

    Three screening methods to predict posttraumatic stress disorder (PTSD) and depression symptoms in children following single-incident trauma were tested. Children and adolescents (N = 90; aged 7-17 years) were assessed within 4 weeks of an injury that led to hospital treatment and followed up 3 and 6 months later. Screening methods were adapted…

  20. Symptoms of delirium predict incident delirium in older long-term care residents.

    Science.gov (United States)

    Cole, Martin G; McCusker, Jane; Voyer, Philippe; Monette, Johanne; Champoux, Nathalie; Ciampi, Antonio; Vu, Minh; Dyachenko, Alina; Belzile, Eric

    2013-06-01

    Detection of long-term care (LTC) residents at risk of delirium may lead to prevention of this disorder. The primary objective of this study was to determine if the presence of one or more Confusion Assessment Method (CAM) core symptoms of delirium at baseline assessment predicts incident delirium. Secondary objectives were to determine if the number or the type of symptoms predict incident delirium. The study was a secondary analysis of data collected for a prospective study of delirium among older residents of seven LTC facilities in Montreal and Quebec City, Canada. The Mini-Mental State Exam (MMSE), CAM, Delirium Index (DI), Hierarchic Dementia Scale, Barthel Index, and Cornell Scale for Depression were completed at baseline. The MMSE, CAM, and DI were repeated weekly for six months. Multivariate Cox regression models were used to determine if baseline symptoms predict incident delirium. Of 273 residents, 40 (14.7%) developed incident delirium. Mean (SD) time to onset of delirium was 10.8 (7.4) weeks. When one or more CAM core symptoms were present at baseline, the Hazard Ratio (HR) for incident delirium was 3.5 (95% CI = 1.4, 8.9). The HRs for number of symptoms present ranged from 2.9 (95% CI = 1.0, 8.3) for one symptom to 3.8 (95% CI = 1.3, 11.0) for three symptoms. The HR for one type of symptom, fluctuation, was 2.2 (95% CI = 1.2, 4.2). The presence of CAM core symptoms at baseline assessment predicts incident delirium in older LTC residents. These findings have potentially important implications for clinical practice and research in LTC settings.
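    The main analysis above, a Cox model for time to incident delirium with baseline CAM core symptoms as the predictor, can be sketched with the lifelines library on simulated data; the data frame and the effect size below are assumptions, not the study data:

    ```python
    # Cox model for time to incident delirium on simulated data; lifelines is
    # one library exposing this model, and the effect size is an assumption.
    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter

    rng = np.random.default_rng(0)
    n = 273
    symptoms = rng.integers(0, 2, n)                        # any CAM core symptom
    weeks = rng.exponential(60, n) / (1 + 2.5 * symptoms)   # faster onset if present
    df = pd.DataFrame({"weeks": np.minimum(weeks, 26),
                       "delirium": (weeks < 26).astype(int),
                       "cam_symptoms": symptoms})

    cph = CoxPHFitter()
    cph.fit(df, duration_col="weeks", event_col="delirium")
    cph.print_summary()  # hazard ratio to compare with the reported 3.5
    ```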

  1. Prediction of cancer incidence in Tyrol/Austria for year of diagnosis 2020.

    Science.gov (United States)

    Oberaigner, Willi; Geiger-Gritsch, Sabine

    2014-10-01

    Prediction of the number of incident cancer cases is very relevant for health planning purposes and allocation of resources. The shift towards older age groups in central European populations in the coming decades is likely to contribute to an increase in cancer incidence for many cancer sites. In Tyrol, cancer incidence data have been registered at a high level of completeness for more than 20 years. We therefore aimed to compute well-founded predictions of cancer incidence for Tyrol for the year 2020 for all frequent cancer sites and for all cancer sites combined. After defining a prediction base range for every cancer site, we extrapolated the age-specific time trends in the prediction base range, following a linear model for increasing and a log-linear model for decreasing time trends. The extrapolated time trends were evaluated for the year 2020 using population figures supplied by Statistics Austria. Compared with the number of annual incident cases for the year 2009 for all cancer sites combined except non-melanoma skin cancer, we predicted an increase of 235 (15%) for females and 362 (21%) for males. For both sexes, more than 90% of the increase is attributable to the shift toward older age groups in the next decade. The biggest increases in absolute numbers are seen for females in breast cancer (92, 21%), lung cancer (64, 52%), colorectal cancer (40, 24%), melanoma (38, 30%) and the haematopoietic system (37, 35%), and for males in prostate cancer (105, 25%), colorectal cancer (91, 45%), the haematopoietic system (71, 55%), bladder cancer (69, 100%) and melanoma (64, 52%). The increase in the number of incident cancer cases of 15% in females and 21% in males in the next decade is very relevant for planning purposes. However, external factors cause uncertainty in the prediction for some cancer sites (mainly prostate cancer and colorectal cancer) and the prediction intervals are still broad. Therefore
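    The extrapolation rule stated in this record, linear for increasing and log-linear for decreasing age-specific trends, evaluated at the target year, can be written directly; the rates below are illustrative, not registry data:

    ```python
    # Linear extrapolation for increasing trends, log-linear for decreasing,
    # evaluated at the target year; rates are illustrative.
    import numpy as np

    years = np.arange(2000, 2010)

    def predict_rate(rates, target_year=2020):
        increasing = rates[-1] >= rates[0]
        y = rates if increasing else np.log(rates)
        slope, intercept = np.polyfit(years, y, 1)
        value = intercept + slope * target_year
        return value if increasing else np.exp(value)

    print(predict_rate(np.linspace(50, 65, 10)))  # increasing site -> linear
    print(predict_rate(np.linspace(20, 14, 10)))  # decreasing site -> log-linear
    ```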

  2. PREVAIL: Predicting Recovery through Estimation and Visualization of Active and Incident Lesions.

    Science.gov (United States)

    Dworkin, Jordan D; Sweeney, Elizabeth M; Schindler, Matthew K; Chahin, Salim; Reich, Daniel S; Shinohara, Russell T

    2016-01-01

    The goal of this study was to develop a model that integrates imaging and clinical information observed at lesion incidence for predicting the recovery of white matter lesions in multiple sclerosis (MS) patients. Demographic, clinical, and magnetic resonance imaging (MRI) data were obtained from 60 subjects with MS as part of a natural history study at the National Institute of Neurological Disorders and Stroke. A total of 401 lesions met the inclusion criteria and were used in the study. Imaging features were extracted from the intensity-normalized T1-weighted (T1w) and T2-weighted sequences as well as magnetization transfer ratio (MTR) sequence acquired at lesion incidence. T1w and MTR signatures were also extracted from images acquired one-year post-incidence. Imaging features were integrated with clinical and demographic data observed at lesion incidence to create statistical prediction models for long-term damage within the lesion. The performance of the T1w and MTR predictions was assessed in two ways: first, the predictive accuracy was measured quantitatively using leave-one-lesion-out cross-validated (CV) mean-squared predictive error. Then, to assess the prediction performance from the perspective of expert clinicians, three board-certified MS clinicians were asked to individually score how similar the CV model-predicted one-year appearance was to the true one-year appearance for a random sample of 100 lesions. The cross-validated root-mean-square predictive error was 0.95 for normalized T1w and 0.064 for MTR, compared to the estimated measurement errors of 0.48 and 0.078 respectively. The three expert raters agreed that T1w and MTR predictions closely resembled the true one-year follow-up appearance of the lesions in both degree and pattern of recovery within lesions. This study demonstrates that by using only information from a single visit at incidence, we can predict how a new lesion will recover using relatively simple statistical techniques. The

  3. Financial and health literacy predict incident AD dementia and AD pathology

    Science.gov (United States)

    Yu, Lei; Wilson, Robert S.; Schneider, Julie A.; Bennett, David A.; Boyle, Patricia A.

    2017-01-01

    Background Domain specific literacy is a multidimensional construct that requires multiple resources including cognitive and non-cognitive factors. Objective We test the hypothesis that domain specific literacy is associated with AD dementia and AD pathology after controlling for cognition. Methods Participants were community-based older persons who completed a baseline literacy assessment, underwent annual clinical evaluations for up to 8 years and agreed to organ donation after death. Financial and health literacy was measured using 32 questions and cognition was measured using 19 tests. Annual diagnosis of AD dementia followed standard criteria. AD pathology was examined post-mortem by quantifying plaques and tangles. Cox models examined the association of literacy with incident AD dementia. Performance of model prediction for incident AD dementia was assessed using indices for integrated discrimination improvement and continuous net reclassification improvement. Linear regression models examined the independent association of literacy with AD pathology in autopsied participants. Results All 805 participants were free of dementia at baseline and 102 (12.7%) developed AD dementia during the follow-up. Lower literacy was associated with higher risk for incident AD dementia, and the model with the literacy measure had better predictive performance than the one with demographics and cognition only. Lower literacy also was associated with higher burden of AD pathology after controlling for cognition (β=0.07, p=0.035). Conclusion Literacy predicts incident AD dementia and AD pathology in community-dwelling older persons, and the association is independent of traditional measures of cognition. PMID:28157101

  4. Financial and Health Literacy Predict Incident Alzheimer's Disease Dementia and Pathology.

    Science.gov (United States)

    Yu, Lei; Wilson, Robert S; Schneider, Julie A; Bennett, David A; Boyle, Patricia A

    2017-01-01

    Domain specific literacy is a multidimensional construct that requires multiple resources including cognitive and non-cognitive factors. We test the hypothesis that domain specific literacy is associated with Alzheimer's disease (AD) dementia and AD pathology after controlling for cognition. Participants were community-based older persons who completed a baseline literacy assessment, underwent annual clinical evaluations for up to 8 years, and agreed to organ donation after death. Financial and health literacy was measured using 32 questions and cognition was measured using 19 tests. Annual diagnosis of AD dementia followed standard criteria. AD pathology was examined postmortem by quantifying plaques and tangles. Cox models examined the association of literacy with incident AD dementia. Performance of model prediction for incident AD dementia was assessed using indices for integrated discrimination improvement and continuous net reclassification improvement. Linear regression models examined the independent association of literacy with AD pathology in autopsied participants. All 805 participants were free of dementia at baseline and 102 (12.7%) developed AD dementia during the follow-up. Lower literacy was associated with higher risk for incident AD dementia, and the model with the literacy measure had better predictive performance than the one with demographics and cognition only. Lower literacy also was associated with higher burden of AD pathology after controlling for cognition (β = 0.07, p = 0.035). Literacy predicts incident AD dementia and AD pathology in community-dwelling older persons, and the association is independent of traditional measures of cognition.
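    The integrated discrimination improvement used in these two records to compare models with and without the literacy measure reduces to a difference of discrimination slopes; a sketch with invented risk estimates:

    ```python
    # Integrated discrimination improvement: the gain in discrimination slope
    # (mean predicted risk in cases minus non-cases); risks are invented.
    import numpy as np

    def idi(p_new, p_old, y):
        y = np.asarray(y, dtype=bool)
        slope = lambda p: p[y].mean() - p[~y].mean()
        return slope(p_new) - slope(p_old)

    y = np.array([1, 1, 0, 0, 0, 0, 1, 0])  # incident AD dementia
    p_old = np.array([0.30, 0.25, 0.20, 0.15, 0.22, 0.18, 0.28, 0.21])
    p_new = np.array([0.45, 0.33, 0.15, 0.12, 0.20, 0.14, 0.40, 0.18])
    print(f"IDI = {idi(p_new, p_old, y):.3f}")  # positive favors the new model
    ```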

  5. Predictions of Quantum Molecular Dynamical Model between incident energy 50 and 1000 MeV/Nucleon

    Directory of Open Access Journals (Sweden)

    Kumar Sanjeev

    2015-01-01

    Full Text Available In the present work, the Quantum Molecular Dynamical (QMD) model is summarized as a useful tool for the incident energy range of 50 to 1000 MeV/nucleon in heavy-ion collisions. The model has reproduced the experimental results of various collaborations such as ALADIN, INDRA, PLASTIC BALL and FOPI up to a high level of accuracy for phenomena like multifragmentation, collective flow and elliptical flow in the above energy range. Efforts are now directed toward predicting the symmetry energy over this wide incident energy range.

  6. Predictive Index The Incidence Of Tuberculosis Children In South Kalimantan Province

    Directory of Open Access Journals (Sweden)

    Bahrul Ilmi

    2015-08-01

    Full Text Available The research objective was to formulate a predictive index for childhood tuberculosis in South Kalimantan province. The research used mixed methods, combining a Sequential Exploratory Design in which a qualitative approach supports a quantitative core (Sugiono 2012) with a case-control design. The qualitative sample comprised 16 respondents for interviews and 48 respondents for focus group discussions (FGD). The quantitative sample comprised 216 subjects: 62 cases and 154 controls. Qualitative sampling was purposive; quantitative sampling used three-stage multi-stage cluster random sampling. The analysis techniques were qualitative description and confirmatory factor analysis, measuring the latent variables through path analysis with the Linear Structural Relationships (LISREL) program. The results showed that the socio-cultural environment had a positive and significant effect on the incidence of childhood tuberculosis; the physical home environment had a positive and significant effect on the biological environment and on the incidence of childhood tuberculosis; immunization and child nutritional status had a positive and significant effect on the incidence of childhood tuberculosis; and the biological environment had a positive and significant effect on the incidence of childhood tuberculosis. The formulated predictive index of childhood tuberculosis in South Kalimantan province is: index = 0.19 (physical home environment) + 0.44 (biological environment) + 0.53 (socio-cultural environment) + 0.19 (immunization and child nutritional status). All R-square values exceeded 0.5, meaning the predictive index model for childhood TB met the required goodness of fit. New findings of this research are: 1. The variables social networks, social support and collective efficacy were associated with the incidence of childhood tuberculosis. 2
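    Applying the index is then a single weighted sum; note that the pairing of weights to factors below is reconstructed from the garbled source text and should be treated as approximate:

    ```python
    # The reported index as a weighted sum of factor scores; the pairing of
    # weights to factors is reconstructed from the source and approximate.
    def tb_child_index(home_physical, biological, socio_cultural, immun_nutrition):
        return (0.19 * home_physical + 0.44 * biological
                + 0.53 * socio_cultural + 0.19 * immun_nutrition)

    print(tb_child_index(1.0, 0.5, 0.8, 0.2))  # illustrative factor scores
    ```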

  7. Earthquake likelihood model testing

    Science.gov (United States)

    Schorlemmer, D.; Gerstenberger, M.C.; Wiemer, S.; Jackson, D.D.; Rhoades, D.A.

    2007-01-01

    INTRODUCTION: The Regional Earthquake Likelihood Models (RELM) project aims to produce and evaluate alternate models of earthquake potential (probability per unit volume, magnitude, and time) for California. Based on differing assumptions, these models are produced to test the validity of their assumptions and to explore which models should be incorporated in seismic hazard and risk evaluation. Tests based on physical and geological criteria are useful but we focus on statistical methods using future earthquake catalog data only. We envision two evaluations: a test of consistency with observed data and a comparison of all pairs of models for relative consistency. Both tests are based on the likelihood method, and both are fully prospective (i.e., the models are not adjusted to fit the test data). To be tested, each model must assign a probability to any possible event within a specified region of space, time, and magnitude. For our tests the models must use a common format: earthquake rates in specified “bins” with location, magnitude, time, and focal mechanism limits. Seismology cannot yet deterministically predict individual earthquakes; however, it should seek the best possible models for forecasting earthquake occurrence. This paper describes the statistical rules of an experiment to examine and test earthquake forecasts. The primary purposes of the tests described below are to evaluate physical models for earthquakes, assure that source models used in seismic hazard and risk studies are consistent with earthquake data, and provide quantitative measures by which models can be assigned weights in a consensus model or be judged as suitable for particular regions. In this paper we develop a statistical method for testing earthquake likelihood models. A companion paper (Schorlemmer and Gerstenberger 2007, this issue) discusses the actual implementation of these tests in the framework of the RELM initiative. Statistical testing of hypotheses is a common task and a
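    The likelihood scoring underlying these tests treats each forecast as a set of expected rates over space-magnitude bins and scores the observed catalog by its joint Poisson log-likelihood; the numbers in the sketch below are a toy example:

    ```python
    # Joint Poisson log-likelihood of an observed catalog under a forecast's
    # expected rates per space-magnitude bin; numbers are a toy example.
    import numpy as np
    from scipy.stats import poisson

    forecast_rates = np.array([0.2, 1.5, 0.05, 0.8])  # expected events per bin
    observed_counts = np.array([0, 2, 0, 1])          # catalog counts, same bins

    log_like = poisson.logpmf(observed_counts, forecast_rates).sum()
    print(f"joint log-likelihood = {log_like:.3f}")   # higher = better forecast
    ```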

  8. Effects of passengers on bus driver celeration behavior and incident prediction.

    Science.gov (United States)

    Af Wåhlberg, A E

    2007-01-01

    Driver celeration (speed change) behavior of bus drivers has previously been found to predict their traffic incident involvement, but it has also been ascertained that the level of celeration is influenced by the number of passengers carried as well as other traffic density variables. This means that the individual level of celeration is not as well estimated as it could be. Another hypothesized influence of the number of passengers is that of differential quality of measurements, where high passenger density circumstances are supposed to yield better estimates of the individual driver component of celeration behavior. Comparisons were made between different variants of celeration as a predictor of bus drivers' traffic incidents. The number of bus passengers was held constant, and cases identified by their number of passengers per kilometer during measurement were excluded (in 12 samples of repeated measurements). After holding passengers constant, the correlations between celeration behavior and incident record increased very slightly. Also, selective prediction of the incident record of those drivers who had had many passengers when measured increased the correlations even more. Traffic density variables such as the number of passengers thus have little direct influence on the predictive power of celeration behavior, despite their impact on the absolute celeration level. Selective prediction, on the other hand, increased correlations substantially. This unusual effect was probably due to how the individual propensity for high or low celeration driving was affected by the number of stops made and general traffic density; differences between drivers in this respect were probably enhanced by the denser traffic, thus creating a better estimate of the theoretical celeration behavior parameter C. The new concept of selective prediction was discussed in terms of estimating the systematic differences in quality of the individual driver data.

  9. Risk Prediction Models for Incident Heart Failure: A Systematic Review of Methodology and Model Performance.

    Science.gov (United States)

    Sahle, Berhe W; Owen, Alice J; Chin, Ken Lee; Reid, Christopher M

    2017-09-01

    Numerous models predicting the risk of incident heart failure (HF) have been developed; however, evidence of their methodological rigor and reporting remains unclear. This study critically appraises the methods underpinning incident HF risk prediction models. EMBASE and PubMed were searched for articles published between 1990 and June 2016 that reported at least 1 multivariable model for prediction of HF. Model development information, including study design, variable coding, missing data, and predictor selection, was extracted. Nineteen studies reporting 40 risk prediction models were included. Existing models have acceptable discriminative ability (C-statistics > 0.70), although only 6 models were externally validated. Candidate variable selection was based on statistical significance from a univariate screening in 11 models, whereas it was unclear in 12 models. Continuous predictors were retained in 16 models, whereas it was unclear how continuous variables were handled in 16 models. Missing values were excluded in 19 of 23 models that reported missing data, and the number of events per variable was below the recommended minimum in several models. Only 2 models presented the recommended regression equations. There was significant heterogeneity in the discriminative ability of models with respect to age. There are numerous incident HF risk prediction models with sufficient discriminative ability, although few are externally validated. Methods not recommended for the conduct and reporting of risk prediction modeling were frequently used, and the resulting algorithms should be applied with caution. Copyright © 2017 Elsevier Inc. All rights reserved.

  10. PSA predicts development of incident lower urinary tract symptoms: results from the REDUCE study.

    Science.gov (United States)

    Patel, Devin N; Feng, Tom; Simon, Ross M; Howard, Lauren E; Vidal, Adriana C; Moreira, Daniel M; Castro-Santamaria, Ramiro; Roehrborn, Claus; Andriole, Gerald L; Freedland, Stephen J

    2018-05-23

    The relationship between baseline prostate-specific antigen (PSA) and development of lower urinary tract symptoms (LUTS) in asymptomatic and mildly symptomatic men is unclear. We sought to determine if PSA predicts incident LUTS in these men. A post-hoc analysis of the 4-year REDUCE study was performed to assess for incident LUTS in 1534 men with mild to no LUTS at baseline. The primary aim was to determine whether PSA independently predicted incident LUTS after adjusting for the key clinical variables of age, prostate size, and baseline International Prostate Symptom Score (IPSS). Incident LUTS was defined as the first report of medical treatment, surgery, or sustained clinically significant symptoms (two IPSS >14). Cox proportional hazards, cumulative incidence curves, and the log-rank test were used to test our hypothesis. A total of 1534 men with mild to no LUTS at baseline were included: 335 with PSA 2.5-4 ng/mL, 589 with PSA 4.1-6 ng/mL, and 610 with PSA 6-10 ng/mL. During the 4-year study, 196 men progressed to incident LUTS (50.5% medical treatment, 9% surgery, and 40.5% new symptoms). As a continuous variable, higher PSA was associated with increased incident LUTS on univariable (HR 1.09, p = 0.019) and multivariable (HR 1.08, p = 0.040) analysis. Likewise, baseline PSA 6-10 ng/mL was associated with increased incident LUTS vs. PSA 2.5-4 ng/mL in adjusted models (HR 1.68, p = 0.016). This association was also observed in men with PSA 4.1-6 ng/mL vs. PSA 2.5-4 ng/mL (HR 1.60, p = 0.032). Men with mild to no LUTS but increased baseline PSA are at increased risk of developing incident LUTS presumed due to benign prostatic hyperplasia.

  11. Applications of Machine learning in Prediction of Breast Cancer Incidence and Mortality

    International Nuclear Information System (INIS)

    Helal, N.; Sarwat, E.

    2012-01-01

    Breast cancer is one of the leading causes of cancer deaths for the female population in both developed and developing countries. In this work we used the baseline descriptive data about the incidence (new cancer cases) of in situ breast cancer among Wisconsin females. The documented data were from the most recent 12-year period for which data are available. Wisconsin cancer incidence and mortality (deaths due to cancer) were also considered in this work. Artificial neural networks (ANNs) have been successfully applied to the prediction of the number of new cancer cases and deaths. In this study, artificial intelligence (AI) is used to predict the numbers of new cancer cases and deaths that may occur.
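    A hedged sketch of the approach, with scikit-learn's MLPRegressor standing in for the paper's ANN; the training series below is synthetic, not the Wisconsin registry data:

    ```python
    # A small neural-network regressor forecasting annual case counts;
    # the training series is synthetic, not the Wisconsin registry data.
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)
    years = np.arange(1995, 2007, dtype=float)
    cases = 900 + 12 * (years - 1995) + rng.normal(0, 15, years.size)

    model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
    model.fit((years[:, None] - 1995) / 10, cases)  # rescaled input for stability
    print(model.predict([[(2010 - 1995) / 10]]))    # predicted new cases in 2010
    ```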

  12. Sex-dependent independent prediction of incident diabetes by depressive symptoms.

    Science.gov (United States)

    Akbaş-Şimşek, Tuğba; Onat, Altan; Kaya, Adnan; Tusun, Eyyup; Yüksel, Hüsniye; Can, Günay

    2017-12-01

    To study the predictive value of depressive symptoms (DeprS) for type 2 diabetes in a general population of Turkey. Responses to three questions served to assess the sense of depression. Cox regression analyses were used to estimate risks for incident diabetes, after exclusion of prevalent cases of diabetes. Mean follow-up was 5.15 (±1.4) years. Depressive symptoms were present at baseline in 16.2% of the whole study sample, three times as often in women as in men. Reduced physical activity grade was the only significant baseline covariate in men, while younger age and lower blood pressure distinguished women with DeprS from those without. In men, presence of DeprS predicted incident diabetes at a significant 2.58-fold relative risk (95% confidence interval 1.03; 6.44), after adjustment for age, systolic blood pressure, and antidepressant drug usage. When further covariates were added, waist circumference remained the only significant predictor, while DeprS was attenuated to a relative risk of 2.12 (95% confidence interval 0.83; 5.40). DeprS was not associated with diabetes in women, whereas antidepressant drug usage only tended to be positively associated. A gender difference existed in the relationship between DeprS and incident diabetes: DeprS predicted subsequent development of diabetes in men alone, not in women. Copyright © 2016 John Wiley & Sons, Ltd.

  13. Emphysema predicts hospitalisation and incident airflow obstruction among older smokers: a prospective cohort study.

    Directory of Open Access Journals (Sweden)

    David A McAllister

    Full Text Available Emphysema on CT is common in older smokers. We hypothesised that emphysema on CT predicts acute episodes of care for chronic lower respiratory disease among older smokers. Participants in a lung cancer screening study aged ≥60 years were recruited into a prospective cohort study in 2001-02. Two radiologists independently visually assessed the severity of emphysema as absent, mild, moderate or severe. Percent emphysema was defined as the proportion of voxels ≤ -910 Hounsfield Units. Participants completed a median of 5 visits over a median of 6 years of follow-up. The primary outcome was hospitalization, emergency room or urgent office visit for chronic lower respiratory disease. Spirometry was performed following ATS/ERS guidelines. Airflow obstruction was defined as FEV1/FVC ratio <0.70 and FEV1<80% predicted. Of 521 participants, 4% had moderate or severe emphysema, which was associated with acute episodes of care (rate ratio 1.89; 95% CI: 1.01-3.52, adjusting for age, sex and race/ethnicity), as was percent emphysema, with similar associations for hospitalisation. Emphysema on visual assessment also predicted incident airflow obstruction (HR 5.14; 95% CI 2.19-21.1). Visually assessed emphysema and percent emphysema on CT predicted acute episodes of care for chronic lower respiratory disease, with the former predicting incident airflow obstruction among older smokers.
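    The percent emphysema measure defined above is a one-line computation once the lung voxels are segmented; the HU values in this sketch are simulated, not CT data:

    ```python
    # Percent emphysema: share of lung voxels at or below -910 HU.
    # The array simulates HU values of already-segmented lung voxels.
    import numpy as np

    rng = np.random.default_rng(0)
    lung_hu = rng.normal(-860, 50, 1_000_000)  # simulated lung voxel HU values

    percent_emphysema = 100 * np.mean(lung_hu <= -910)
    print(f"percent emphysema = {percent_emphysema:.1f}%")
    ```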

  14. Predicting Medical Students’ Current Attitudes Toward Psychiatry, Interest in Psychiatry, and Estimated Likelihood of Working in Psychiatry: A Cross-Sectional Study in Four European Countries

    Directory of Open Access Journals (Sweden)

    Ingeborg Warnke

    2018-03-01

    Full Text Available Psychiatry as a medical discipline is becoming increasingly important due to the high and increasing worldwide burden associated with mental disorders. Surprisingly, however, there is a lack of young academics choosing psychiatry as a career. Previous evidence on medical students’ perspectives is abundant but has methodological shortcomings. Therefore, by attempting to avoid previous shortcomings, we aimed to contribute to a better understanding of the predictors of the following three outcome variables: current medical students’ attitudes toward psychiatry, interest in psychiatry, and estimated likelihood of working in psychiatry. The sample consisted of N = 1,356 medical students at 45 medical schools in Germany and Austria as well as regions of Switzerland and Hungary with a German language curriculum. We used snowball sampling via Facebook with a link to an online questionnaire as the recruitment procedure. Snowball sampling is based on referrals made among people. This questionnaire included a German version of the Attitudes Toward Psychiatry Scale (ATP-30-G) and further variables related to outcomes and potential predictors in terms of sociodemography (e.g., gender) or medical training (e.g., curriculum-related experience with psychiatry). Data were analyzed by linear mixed models and further regression models. On average, students had a positive attitude to and high general interest in, but low professional preference for, psychiatry. A neutral attitude to psychiatry was partly related to the discipline itself, psychiatrists, or psychiatric patients. Female gender and previous experience with psychiatry, particularly curriculum-related and personal experience, were important predictors of all outcomes. Students in the first years of medical training were more interested in pursuing psychiatry as a career. Furthermore, the country of the medical school was related to the outcomes. However, statistical models explained only a small

  15. Predicting Medical Students’ Current Attitudes Toward Psychiatry, Interest in Psychiatry, and Estimated Likelihood of Working in Psychiatry: A Cross-Sectional Study in Four European Countries

    Science.gov (United States)

    Warnke, Ingeborg; Gamma, Alex; Buadze, Maria; Schleifer, Roman; Canela, Carlos; Strebel, Bernd; Tényi, Tamás; Rössler, Wulf; Rüsch, Nicolas; Liebrenz, Michael

    2018-01-01

    Psychiatry as a medical discipline is becoming increasingly important due to the high and increasing worldwide burden associated with mental disorders. Surprisingly, however, there is a lack of young academics choosing psychiatry as a career. Previous evidence on medical students’ perspectives is abundant but has methodological shortcomings. Therefore, by attempting to avoid previous shortcomings, we aimed to contribute to a better understanding of the predictors of the following three outcome variables: current medical students’ attitudes toward psychiatry, interest in psychiatry, and estimated likelihood of working in psychiatry. The sample consisted of N = 1,356 medical students at 45 medical schools in Germany and Austria as well as regions of Switzerland and Hungary with a German language curriculum. We used snowball sampling via Facebook with a link to an online questionnaire as recruitment procedure. Snowball sampling is based on referrals made among people. This questionnaire included a German version of the Attitudes Toward Psychiatry Scale (ATP-30-G) and further variables related to outcomes and potential predictors in terms of sociodemography (e.g., gender) or medical training (e.g., curriculum-related experience with psychiatry). Data were analyzed by linear mixed models and further regression models. On average, students had a positive attitude to and high general interest in, but low professional preference for, psychiatry. A neutral attitude to psychiatry was partly related to the discipline itself, psychiatrists, or psychiatric patients. Female gender and previous experience with psychiatry, particularly curriculum-related and personal experience, were important predictors of all outcomes. Students in the first years of medical training were more interested in pursuing psychiatry as a career. Furthermore, the country of the medical school was related to the outcomes. However, statistical models explained only a small proportion of variance

  17. Predicting Porosity and Permeability for the Canyon Formation, SACROC Unit (Kelly-Snyder Field), Using the Geologic Analysis via Maximum Likelihood System

    International Nuclear Information System (INIS)

    Reinaldo Gonzalez; Scott R. Reeves; Eric Eslinger

    2007-01-01

    Accurate, high-resolution, three-dimensional (3D) reservoir characterization can provide substantial benefits for effective oilfield management. By doing so, the predictive reliability of reservoir flow models, which are routinely used as the basis for significant investment decisions designed to recover millions of barrels of oil, can be substantially improved. This is particularly true when Secondary Oil Recovery (SOR) or Enhanced Oil Recovery (EOR) operations are planned. If injectants such as water, hydrocarbon gases, steam, CO2, etc. are to be used, an understanding of fluid migration paths can mean the difference between economic success and failure. SOR/EOR projects will increasingly take place in heterogeneous reservoirs where interwell complexity is high and difficult to understand. Although reasonable reservoir characterization information often exists at the wellbore, the only economical way to sample the interwell region is with seismic methods, so today's standard practice for developing a 3D reservoir description is to resort to seismic inversion techniques. However, the application of these methods brings other technical drawbacks that can render them inefficient. The industry therefore needs improved reservoir characterization approaches that are quicker, more accurate, and less expensive than today's standard methods. To achieve this objective, the Department of Energy (DOE) has been promoting some studies with the goal of evaluating whether robust relationships between data at vastly different scales of measurement could be established using advanced pattern recognition (soft computing) methods. Advanced Resources International (ARI) has performed two of these projects with encouraging results showing the feasibility of establishing critical relationships between data at different measurement scales to create high-resolution reservoir characterization. In this third study performed by ARI and also funded by the DOE, a model

  18. The phylogenetic likelihood library.

    Science.gov (United States)

    Flouri, T; Izquierdo-Carrasco, F; Darriba, D; Aberer, A J; Nguyen, L-T; Minh, B Q; Von Haeseler, A; Stamatakis, A

    2015-03-01

    We introduce the Phylogenetic Likelihood Library (PLL), a highly optimized application programming interface for developing likelihood-based phylogenetic inference and postanalysis software. The PLL implements appropriate data structures and functions that allow users to quickly implement common, error-prone, and labor-intensive tasks, such as likelihood calculations, model parameter as well as branch length optimization, and tree space exploration. The highly optimized and parallelized implementation of the phylogenetic likelihood function and a thorough documentation provide a framework for rapid development of scalable parallel phylogenetic software. By example of two likelihood-based phylogenetic codes we show that the PLL improves the sequential performance of current software by a factor of 2-10 while requiring only 1 month of programming time for integration. We show that, when numerical scaling for preventing floating point underflow is enabled, the double precision likelihood calculations in the PLL are up to 1.9 times faster than those in BEAGLE. On an empirical DNA dataset with 2000 taxa the AVX version of PLL is 4 times faster than BEAGLE (scaling enabled and required). The PLL is available at http://www.libpll.org under the GNU General Public License (GPL). © The Author(s) 2014. Published by Oxford University Press, on behalf of the Society of Systematic Biologists.
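
    For readers unfamiliar with the computation such libraries accelerate, the sketch below evaluates a single-site likelihood on a small fixed tree under the Jukes-Cantor (JC69) model using Felsenstein's pruning algorithm. It is a toy illustration of the underlying math only, not the PLL API.

```python
# Toy single-site phylogenetic likelihood under JC69 via Felsenstein pruning.
# This illustrates the computation that libraries like the PLL optimize;
# it is not the PLL API. Trees are nested ((child, branch_length), ...) pairs.
import math

BASES = "ACGT"

def jc69(i, j, t):
    """JC69 transition probability between states i and j over branch length t."""
    e = math.exp(-4.0 * t / 3.0)
    return 0.25 + 0.75 * e if i == j else 0.25 - 0.25 * e

def conditional(node):
    """P(tip data below node | node state), for each of the four states."""
    if isinstance(node, str):                 # leaf: an observed base
        return [1.0 if b == node else 0.0 for b in BASES]
    (left, t_l), (right, t_r) = node          # internal node with two children
    cl, cr = conditional(left), conditional(right)
    return [
        sum(jc69(i, j, t_l) * cl[j] for j in range(4))
        * sum(jc69(i, j, t_r) * cr[j] for j in range(4))
        for i in range(4)
    ]

# Tree ((A:0.1, C:0.2):0.05, G:0.3); observed site pattern A, C, G at the tips.
root = (((("A", 0.1), ("C", 0.2)), 0.05), ("G", 0.3))
site_l = sum(0.25 * p for p in conditional(root))  # uniform stationary frequencies
print("log-likelihood of this site:", math.log(site_l))
```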

  19. Chronic dry eye in PRK and LASIK: manifestations, incidence and predictive factors

    Science.gov (United States)

    Bower, Kraig S.; Sia, Rose K.; Ryan, Denise S.; Mines, Michael J.; Dartt, Darlene A.

    2017-01-01

    Purpose To evaluate dry eye manifestations following photorefractive keratectomy (PRK) and laser in situ keratomileusis (LASIK) and determine the incidence and predictive factors of chronic dry eye using a set of dry eye criteria. Setting Walter Reed Army Medical Center, Washington, DC, USA. Methods This is a prospective non-randomized clinical study of 143 active duty U.S. Army personnel aged 29.9±5.2 years with myopia or myopic astigmatism (manifest spherical equivalent −3.83±1.96 diopters) undergoing either PRK or LASIK. Dry eye evaluation was performed pre- and postoperatively. Main outcome measures included dry eye manifestations, incidence, and predictive factors of chronic dry eye. Results Schirmer scores, corneal sensitivity, ocular surface staining, surface regularity index (SRI), and responses to dry eye questionnaire significantly changed over time after PRK. After LASIK, significant changes were observed in tear breakup time, corneal sensitivity, ocular surface staining, and responses to questionnaire. At twelve months postoperatively, 5.0% of PRK and 0.8% of LASIK participants developed chronic dry eye. Regression analysis showed preoperatively lower Schirmer score will significantly influence development of chronic dry eye after PRK whereas preoperatively lower Schirmer score or higher ocular surface staining score will significantly influence the occurrence of chronic dry eye after LASIK. Conclusions Chronic dry eye is uncommon after PRK and LASIK. Ocular surface and tear film characteristics during preoperative examination may help predict chronic dry eye development in PRK and LASIK. PMID:26796443

  20. Progression of diffuse esophageal spasm to achalasia: incidence and predictive factors.

    Science.gov (United States)

    Fontes, L H S; Herbella, F A M; Rodriguez, T N; Trivino, T; Farah, J F M

    2013-07-01

    The progression of certain primary esophageal motor disorders to achalasia has been documented; however, the true incidence of this transition is still elusive. This study aims to evaluate: (i) the incidence of the progression of diffuse esophageal spasm to achalasia, and (ii) predictive factors for this progression. Thirty-five patients (mean age 53 years, 80% females) with a manometric picture of diffuse esophageal spasm were followed for at least 1 year. Patients with gastroesophageal reflux disease confirmed by pH monitoring or systemic diseases that may affect esophageal motility were excluded. Esophageal manometry was repeated in all patients. Five (14%) of the patients progressed to achalasia at a mean follow-up of 2.1 (range 1-4) years. Demographic characteristics were not predictive of transition to achalasia, while dysphagia (P = 0.005) as the main symptom and simultaneous waves with an amplitude of less than 50 mmHg (P = 0.003) were statistically significant. In conclusion, the transition of diffuse esophageal spasm to achalasia is not frequent at a 2-year follow-up. Dysphagia and simultaneous waves with low amplitude are predictive factors for this progression. © 2012 Copyright the Authors. Journal compilation © 2012, Wiley Periodicals, Inc. and the International Society for Diseases of the Esophagus.

  1. Assessing cutoff values for increased exercise blood pressure to predict incident hypertension in a general population.

    Science.gov (United States)

    Lorbeer, Roberto; Ittermann, Till; Völzke, Henry; Gläser, Sven; Ewert, Ralf; Felix, Stephan B; Dörr, Marcus

    2015-07-01

    Cutoff values for increased exercise blood pressure (BP) are not established in hypertension guidelines. The aim of the study was to assess optimal cutoff values for increased exercise BP to predict incident hypertension. Data of 661 normotensive participants (386 women) aged 25-77 years from the Study of Health in Pomerania (SHIP-1) with a 5-year follow-up were used. Exercise BP was measured at a submaximal level of 100 W and at the maximum level of a symptom-limited cycle ergometry test. Cutoff values for increased exercise BP were defined at the maximum sum of sensitivity and specificity for the prediction of incident hypertension. The area under the receiver-operating characteristic curve (AUC) and net reclassification index (NRI) were calculated to investigate whether increased exercise BP adds predictive value for incident hypertension beyond established cardiovascular risk factors. In men, values of 160 mmHg at the 100 W level (AUC = 0.7837; NRI = 0.534) and 210 mmHg at the maximum level (AUC = 0.7677; NRI = 0.340, P = 0.003) were detected as optimal cutoff values for the definition of increased exercise SBP. A value of 190 mmHg (AUC = 0.8347; NRI = 0.519, P < 0.001) showed relevance for the definition of increased exercise SBP in women at the maximum level. According to our analyses, 190 and 210 mmHg are clinically relevant cutoff values for increased exercise SBP at the maximum exercise level of the cycle ergometry test for women and men, respectively. In addition, for men, our analyses provided a cutoff value of 160 mmHg for increased exercise SBP at the 100 W level.
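
    The cutoff rule used here — the value maximizing the sum of sensitivity and specificity — is Youden's J statistic. A minimal sketch on synthetic data (the numbers are illustrative, not the SHIP-1 data):

```python
# Hedged sketch: pick the exercise-BP cutoff maximizing sensitivity + specificity
# (Youden's J), as in the study's cutoff definition. Data below are synthetic.
import numpy as np

rng = np.random.default_rng(0)
bp = np.r_[rng.normal(165, 15, 300), rng.normal(185, 15, 100)]  # exercise SBP, mmHg
incident_htn = np.r_[np.zeros(300), np.ones(100)].astype(bool)  # outcome at follow-up

def youden_cutoff(x, y):
    best_cut, best_j = None, -1.0
    for c in np.unique(x):
        pred = x >= c
        sens = (pred & y).sum() / y.sum()
        spec = (~pred & ~y).sum() / (~y).sum()
        j = sens + spec - 1.0
        if j > best_j:
            best_cut, best_j = c, j
    return best_cut, best_j

cut, j = youden_cutoff(bp, incident_htn)
print(f"optimal cutoff ~ {cut:.0f} mmHg (Youden J = {j:.2f})")
```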

  2. Estimating likelihood of future crashes for crash-prone drivers

    OpenAIRE

    Subasish Das; Xiaoduan Sun; Fan Wang; Charles Leboeuf

    2015-01-01

    At-fault crash-prone drivers are usually considered as the high risk group for possible future incidents or crashes. In Louisiana, 34% of crashes are repeatedly committed by the at-fault crash-prone drivers who represent only 5% of the total licensed drivers in the state. This research has conducted an exploratory data analysis based on the driver faultiness and proneness. The objective of this study is to develop a crash prediction model to estimate the likelihood of future crashes for the a...

  3. Wavelength prediction of laser incident on amorphous silicon detector by neural network

    International Nuclear Information System (INIS)

    Esmaeili Sani, V.; Moussavi-Zarandi, A.; Kafaee, M.

    2011-01-01

    In this paper we present a method based on artificial neural networks (ANN) and the use of only one amorphous semiconductor detector to predict the wavelength of incident laser. Amorphous semiconductors and especially amorphous hydrogenated silicon, a-Si:H, are now widely used in many electronic devices, such as solar cells, many types of position sensitive detectors and X-ray imagers for medical applications. In order to study the electrical properties and detection characteristics of thin films of a-Si:H, n-i-p structures have been simulated by SILVACO software. The basic electronic properties of most of the materials used are known, but device modeling depends on a large number of parameters that are not all well known. In addition, the relationship between the shape of the induced anode current and the wavelength of the incident laser leads to complicated calculations. Soft data-based computational methods can model multidimensional non-linear processes and represent the complex input-output relation between the form of the output signal and the wavelength of incident laser.

  4. Wavelength prediction of laser incident on amorphous silicon detector by neural network

    Energy Technology Data Exchange (ETDEWEB)

    Esmaeili Sani, V., E-mail: vaheed_esmaeely80@yahoo.com [Amirkabir University of Technology, Faculty of Physics, P.O. Box 4155-4494, Tehran (Iran, Islamic Republic of); Moussavi-Zarandi, A.; Kafaee, M. [Amirkabir University of Technology, Faculty of Physics, P.O. Box 4155-4494, Tehran (Iran, Islamic Republic of)

    2011-10-21

    In this paper we present a method based on artificial neural networks (ANN) and the use of only one amorphous semiconductor detector to predict the wavelength of incident laser. Amorphous semiconductors and especially amorphous hydrogenated silicon, a-Si:H, are now widely used in many electronic devices, such as solar cells, many types of position sensitive detectors and X-ray imagers for medical applications. In order to study the electrical properties and detection characteristics of thin films of a-Si:H, n-i-p structures have been simulated by SILVACO software. The basic electronic properties of most of the materials used are known, but device modeling depends on a large number of parameters that are not all well known. In addition, the relationship between the shape of the induced anode current and the wavelength of the incident laser leads to complicated calculations. Soft data-based computational methods can model multidimensional non-linear processes and represent the complex input-output relation between the form of the output signal and the wavelength of incident laser.
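
    As a rough sketch of the approach in the two records above (not the authors' network architecture, and with synthetic stand-ins for the SILVACO-simulated detector signals), a small multilayer perceptron can regress wavelength on features of the induced anode current:

```python
# Hedged sketch: regress laser wavelength on features of the induced anode
# current with a small MLP. The features and targets are synthetic stand-ins
# for the SILVACO-simulated a-Si:H detector signals used in the paper.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 8))              # e.g., pulse amplitude, rise time, ...
wavelength = 550 + 40 * X[:, 0] - 25 * X[:, 1] + rng.normal(0, 5, 500)  # nm

X_tr, X_te, y_tr, y_te = train_test_split(X, wavelength, random_state=0)
net = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0)
net.fit(X_tr, y_tr)
print("held-out R^2:", round(net.score(X_te, y_te), 3))
```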

  5. Prostate-specific antigen and long-term prediction of prostate cancer incidence and mortality in the general population

    DEFF Research Database (Denmark)

    Ørsted, David Dynnes; Nordestgaard, Børge G; Jensen, Gorm B

    2012-01-01

    It is largely unknown whether prostate-specific antigen (PSA) level at first date of testing predicts long-term risk of prostate cancer (PCa) incidence and mortality in the general population.

  6. Meterology-driven Prediction of RSV/RHV Incidence in Rural Nepal

    Science.gov (United States)

    Scott, Anna; Englund, Janet; Chu, Helen; Tielsch, James; Tielsch, James; Khatry, Subarna; Leclerq, Steven C; Shrestha, Laxman; Kuypers, Jane; Steinhoff, Mark C; Katz, Joanne

    2017-01-01

    Background Incidence of respiratory syncytial virus (RSV) and rhinovirus (RHV) varies throughout the year. We aim to quantify the relationship between weather variables (temperature, humidity, precipitation, and aerosol concentration) and disease incidence in order to quantify how outbreaks of RSV and RHV are related to seasonal or sub-seasonal meteorology, and whether these relationships can predict viral outbreaks of RSV and RHV. Methods Health data were collected in a community-based, prospective randomized trial of maternal influenza immunization of pregnant women and their infants conducted in rural Nepal from 2011-2014. Adult illness episodes were defined as fever plus cough, sore throat, runny nose, and/or myalgia, with infant illness defined similarly but without the fever requirement. Cases were identified through longitudinal household-based weekly surveillance. Temperature, humidity, precipitation, and fine particulate matter (PM 2.5) data come from the reanalysis data products NCEP, ERA-Interim, and MERRA-2, which are produced by assimilating historical in-situ and satellite-based observations into a weather model. Results RSV exhibits a relationship with temperature after removing the seasonal cycle (r = -0.16, N = 208, P = 0.02), and RHV exhibits a strong relationship to daily temperature (r = -0.14, N = 208, P = 0.05). When lagging meteorology by up to 15 weeks, correlations between disease count and weather improve (RSV: r_max = 0.45, P < 0.05; RHV: r_max = 0.15, P = 0.05). We use an SIR model forced by lagged meteorological variables to predict RSV and RHV, suggesting that disease burden can be predicted at lead times of weeks to months. Conclusion Meteorological variables are associated with RSV and RHV incidence in rural Nepal and can be used to drive predictive models with a lead time of several months.
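
    A minimal sketch of the forced-SIR idea in this record — an SIR model whose transmission rate is modulated by a lagged temperature series. The parameter values, the lag, and the forcing function are invented for illustration, not the study's fitted model.

```python
# Hedged sketch: SIR dynamics whose transmission rate is forced by a lagged
# temperature anomaly, illustrating the paper's meteorology-driven SIR idea.
# The forcing shape, lag, and parameter values are invented for illustration.
import numpy as np
from scipy.integrate import solve_ivp

weeks = np.arange(104.0)
temp_anom = 5 * np.sin(2 * np.pi * weeks / 52)          # synthetic anomaly, deg C

def beta(t, lag=8.0):
    anom = np.interp(t - lag, weeks, temp_anom)         # lagged forcing
    return 0.6 * np.exp(-0.05 * anom)                   # cooler -> more transmission

def sir(t, y, gamma=0.35):
    s, i, _ = y
    new_inf = beta(t) * s * i
    return [-new_inf, new_inf - gamma * i, gamma * i]

sol = solve_ivp(sir, (0, 103), [0.99, 0.01, 0.0], t_eval=weeks)
print("peak infectious fraction: %.3f at week %d"
      % (sol.y[1].max(), weeks[sol.y[1].argmax()]))
```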

  7. Chronic dry eye in photorefractive keratectomy and laser in situ keratomileusis: Manifestations, incidence, and predictive factors.

    Science.gov (United States)

    Bower, Kraig S; Sia, Rose K; Ryan, Denise S; Mines, Michael J; Dartt, Darlene A

    2015-12-01

    To evaluate dry-eye manifestations after photorefractive keratectomy (PRK) and laser in situ keratomileusis (LASIK) and determine the incidence and predictive factors of chronic dry eye using a set of dry-eye criteria. Walter Reed Army Medical Center, Washington, DC, USA. Prospective, non-randomized clinical study. Dry-eye evaluation was performed before and after surgery. Main outcome measures included dry-eye manifestations, incidence, and predictive factors of chronic dry eye. This study comprised 143 active-duty U.S. Army personnel, ages 29.9 ± 5.2 years, with myopia or myopic astigmatism (manifest spherical equivalent -3.83 ± 1.96 diopters) having PRK or LASIK. Schirmer scores, corneal sensitivity, ocular surface staining, surface regularity index, and responses to dry-eye questionnaire significantly changed over time after PRK. After LASIK, significant changes were observed in tear breakup time, corneal sensitivity, ocular surface staining, and responses to questionnaire. Twelve months postoperatively, 5.0% of PRK and 0.8% of LASIK participants developed chronic dry eye. Regression analysis showed that pre-operatively lower Schirmer score will significantly influence development of chronic dry eye after PRK, whereas preoperatively, lower Schirmer score or higher ocular surface staining score will significantly influence the occurrence of chronic dry eye after LASIK. Chronic dry eye was uncommon after PRK and LASIK. Ocular surface and tear-film characteristics during pre-operative examination might help to predict chronic dry-eye development in PRK and LASIK. The authors have no financial interest in any product, drug, instrument, or equipment discussed in this manuscript. Copyright © 2015 ASCRS and ESCRS. All rights reserved.

  8. Likelihood estimators for multivariate extremes

    KAUST Repository

    Huser, Raphaël; Davison, Anthony C.; Genton, Marc G.

    2015-01-01

    The main approach to inference for multivariate extremes consists in approximating the joint upper tail of the observations by a parametric family arising in the limit for extreme events. The latter may be expressed in terms of componentwise maxima, high threshold exceedances or point processes, yielding different but related asymptotic characterizations and estimators. The present paper clarifies the connections between the main likelihood estimators, and assesses their practical performance. We investigate their ability to estimate the extremal dependence structure and to predict future extremes, using exact calculations and simulation, in the case of the logistic model.

  9. Likelihood estimators for multivariate extremes

    KAUST Repository

    Huser, Raphaël

    2015-11-17

    The main approach to inference for multivariate extremes consists in approximating the joint upper tail of the observations by a parametric family arising in the limit for extreme events. The latter may be expressed in terms of componentwise maxima, high threshold exceedances or point processes, yielding different but related asymptotic characterizations and estimators. The present paper clarifies the connections between the main likelihood estimators, and assesses their practical performance. We investigate their ability to estimate the extremal dependence structure and to predict future extremes, using exact calculations and simulation, in the case of the logistic model.
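
    For reference, the bivariate logistic max-stable model investigated in the two records above has a simple closed form. With unit Fréchet margins and dependence parameter α:

```latex
% Bivariate logistic max-stable model, unit Frechet margins:
G(z_1, z_2) = \exp\!\left\{ -\bigl( z_1^{-1/\alpha} + z_2^{-1/\alpha} \bigr)^{\alpha} \right\},
\qquad z_1, z_2 > 0, \quad 0 < \alpha \le 1.
% \alpha = 1 corresponds to independence; \alpha \to 0 to complete dependence.
```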

  10. Quantitative prediction of shrimp disease incidence via the profiles of gut eukaryotic microbiota.

    Science.gov (United States)

    Xiong, Jinbo; Yu, Weina; Dai, Wenfang; Zhang, Jinjie; Qiu, Qiongfen; Ou, Changrong

    2018-04-01

    One common notion is emerging that gut eukaryotes are commensal or beneficial, rather than detrimental. To date, however, surprisingly few studies have been undertaken to discern the factors that govern the assembly of gut eukaryotes, despite growing interest in the dysbiosis of the gut microbiota-disease relationship. Herein, we first explored how the gut eukaryotic microbiotas were assembled over shrimp postlarval to adult stages and a disease progression. The gut eukaryotic communities changed markedly as healthy shrimp aged, and converged toward an adult-microbiota configuration. However, the adult-like stability was distorted by disease exacerbation. A null model analysis revealed that the deterministic processes that governed the gut eukaryotic assembly tended to be more important over healthy shrimp development, whereas this trend was inverted as the disease progressed. After ruling out the baseline changes in gut eukaryotes over shrimp ages, we identified disease-discriminatory taxa (the species level afforded the highest accuracy of prediction) characteristic of shrimp health status. The profiles of these taxa contributed an overall 92.4% accuracy in predicting shrimp health status. Notably, this model can accurately diagnose the onset of shrimp disease. Interspecies interaction analysis depicted how the disease-discriminatory taxa interacted with one another in sustaining shrimp health. Taken together, our findings offer novel insights into the underlying ecological processes that govern the assembly of gut eukaryotes over shrimp postlarval to adult stages and a disease progression. Intriguingly, the established model can quantitatively and accurately predict the incidence of shrimp disease.

  11. Angiographically Negative Acute Arterial Upper and Lower Gastrointestinal Bleeding: Incidence, Predictive Factors, and Clinical Outcomes

    International Nuclear Information System (INIS)

    Kim, Jin Hyoung; Shin, Ji Hoon; Yoon, Hyun Ki; Chae, Eun Young; Myung, Seung Jae; Ko, Gi Young; Gwon, Dong Il; Sung, Kyu Bo

    2009-01-01

    To evaluate the incidence, predictive factors, and clinical outcomes of angiographically negative acute arterial upper and lower gastrointestinal (GI) bleeding. From 2001 to 2008, 143 consecutive patients who underwent an angiography for acute arterial upper or lower GI bleeding were examined. The angiographies revealed a negative bleeding focus in 75 of 143 (52%) patients. The incidence of an angiographically negative outcome was significantly higher in patients with a stable hemodynamic status (p < 0.001), or in patients with lower GI bleeding (p = 0.032). A follow-up of the 75 patients (range: 0-72 months, mean: 8 ± 14 months) revealed that 60 of the 75 (80%) patients with a negative bleeding focus underwent conservative management only, and acute bleeding was controlled without rebleeding. Three of the 75 (4%) patients underwent exploratory surgery due to prolonged bleeding; however, no bleeding focus was detected. Rebleeding occurred in 12 of 75 (16%) patients. Of these, six patients experienced massive rebleeding and died of disseminated intravascular coagulation within four to nine hours after the rebleeding episode. Four of the 16 patients underwent a repeat angiography and the two remaining patients underwent a surgical intervention to control the bleeding. Angiographically negative results are relatively common in patients with acute GI bleeding, especially in patients with a stable hemodynamic status or lower GI bleeding. Most patients with a negative bleeding focus have experienced spontaneous resolution of their condition

  12. Angiographically Negative Acute Arterial Upper and Lower Gastrointestinal Bleeding: Incidence, Predictive Factors, and Clinical Outcomes

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jin Hyoung; Shin, Ji Hoon; Yoon, Hyun Ki; Chae, Eun Young; Myung, Seung Jae; Ko, Gi Young; Gwon, Dong Il; Sung, Kyu Bo [Asan Medical Center, Seoul (Korea, Republic of)

    2009-08-15

    To evaluate the incidence, predictive factors, and clinical outcomes of angiographically negative acute arterial upper and lower gastrointestinal (GI) bleeding. From 2001 to 2008, 143 consecutive patients who underwent an angiography for acute arterial upper or lower GI bleeding were examined. The angiographies revealed a negative bleeding focus in 75 of 143 (52%) patients. The incidence of an angiographically negative outcome was significantly higher in patients with a stable hemodynamic status (p < 0.001), or in patients with lower GI bleeding (p = 0.032). A follow-up of the 75 patients (range: 0-72 months, mean: 8 ± 14 months) revealed that 60 of the 75 (80%) patients with a negative bleeding focus underwent conservative management only, and acute bleeding was controlled without rebleeding. Three of the 75 (4%) patients underwent exploratory surgery due to prolonged bleeding; however, no bleeding focus was detected. Rebleeding occurred in 12 of 75 (16%) patients. Of these, six patients experienced massive rebleeding and died of disseminated intravascular coagulation within four to nine hours after the rebleeding episode. Four of the 16 patients underwent a repeat angiography and the two remaining patients underwent a surgical intervention to control the bleeding. Angiographically negative results are relatively common in patients with acute GI bleeding, especially in patients with a stable hemodynamic status or lower GI bleeding. Most patients with a negative bleeding focus have experienced spontaneous resolution of their condition.

  13. Neonatal seizures in a rural Iranian district hospital: etiologies, incidence and predicting factors.

    Science.gov (United States)

    Sadeghian, Afsaneh; Damghanian, Maryam; Shariati, Mohammad

    2012-01-01

    The current study determined the overall incidence, common causes, and main predictors of this final diagnosis among neonates admitted to a rural district hospital in Iran. This study was conducted on 699 neonates who were candidates for admission to the NICU. The study population was categorized into a case group, including patients with a final diagnosis of neonatal seizures, and a control group without this diagnosis. Neonatal seizure was reported as the final diagnosis in 25 (3.6%) of the neonates. The most frequent discharge diagnosis in the seizure group was neonatal sepsis and in the non-seizure group was respiratory problems. No significant difference was found in the early fatality rate between neonates with and without seizures (8.0% vs. 10.1%). Only gestational age <38 weeks had a relationship with the appearance of neonatal seizures. Low gestational age has a crucial role in predicting the appearance of seizures in Iranian neonates.

  14. Incidence, predictive factors, and clinical outcomes of acute kidney injury after gastric surgery for gastric cancer.

    Directory of Open Access Journals (Sweden)

    Chang Seong Kim

    Full Text Available BACKGROUND: Postoperative acute kidney injury (AKI), a serious surgical complication, is common after cardiac surgery; however, reports on AKI after noncardiac surgery are limited. We sought to determine the incidence and predictive factors of AKI after gastric surgery for gastric cancer and its effects on the clinical outcomes. METHODS: We conducted a retrospective study of 4718 patients with normal renal function who underwent partial or total gastrectomy for gastric cancer between June 2002 and December 2011. Postoperative AKI was defined by serum creatinine change, as per the Kidney Disease Improving Global Outcomes guideline. RESULTS: Of the 4718 patients, 679 (14.4%) developed AKI. Length of hospital stay, intensive care unit admission rates, and in-hospital mortality rate (3.5% versus 0.2%) were significantly higher in patients with AKI than in those without. AKI was also associated with requirement of renal replacement therapy. Multivariate analysis revealed that male gender; hypertension; chronic obstructive pulmonary disease; hypoalbuminemia (<4 g/dl); use of diuretics, vasopressors, and contrast agents; and packed red blood cell transfusion were independent predictors for AKI after gastric surgery. Postoperative AKI and vasopressor use entailed a high risk of 3-month mortality after multiple adjustments. CONCLUSIONS: AKI was common after gastric surgery for gastric cancer and associated with adverse outcomes. We identified several factors associated with postoperative AKI; recognition of these predictive factors may help reduce the incidence of AKI after gastric surgery. Furthermore, postoperative AKI in patients with gastric cancer is an important risk factor for short-term mortality.
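
    The KDIGO serum-creatinine criterion referenced above defines AKI as a rise of at least 0.3 mg/dl within 48 hours, or to at least 1.5 times baseline presumed to have occurred within the prior 7 days. A minimal sketch of that definition follows; it is a simplified reading of the guideline (urine-output criteria and AKI staging omitted), not the study's code.

```python
# Hedged sketch of the KDIGO serum-creatinine criterion for AKI:
# rise >= 0.3 mg/dl within 48 h, or >= 1.5 x baseline within 7 days.
# Urine-output criteria and AKI staging are omitted for brevity.
from datetime import datetime, timedelta

def meets_kdigo_creatinine(measurements, baseline):
    """measurements: list of (timestamp, creatinine mg/dl), time-sorted."""
    for i, (t_i, cr_i) in enumerate(measurements):
        # absolute rise vs any value in the preceding 48 hours
        for t_j, cr_j in measurements[:i]:
            if t_i - t_j <= timedelta(hours=48) and cr_i - cr_j >= 0.3:
                return True
        # relative rise vs baseline, presumed within the prior 7 days
        if cr_i >= 1.5 * baseline:
            return True
    return False

obs = [(datetime(2024, 1, 1, 8), 0.9), (datetime(2024, 1, 2, 8), 1.3)]
print(meets_kdigo_creatinine(obs, baseline=0.9))  # True: +0.4 mg/dl in 24 h
```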

  15. Model variations in predicting incidence of Plasmodium falciparum malaria using 1998-2007 morbidity and meteorological data from south Ethiopia

    OpenAIRE

    Loha, Eskindir; Lindtjørn, Bernt

    2010-01-01

    Abstract Background Malaria transmission is complex and is believed to be associated with local climate changes. However, simple attempts to extrapolate malaria incidence rates from averaged regional meteorological conditions have proven unsuccessful. Therefore, the objective of this study was to determine if variations in specific meteorological factors are able to consistently predict P. falciparum malaria incidence at different locations in south Ethiopia. Methods Retrospective data from 4...

  16. Dietary Sodium Consumption Predicts Future Blood Pressure and Incident Hypertension in the Japanese Normotensive General Population.

    Science.gov (United States)

    Takase, Hiroyuki; Sugiura, Tomonori; Kimura, Genjiro; Ohte, Nobuyuki; Dohi, Yasuaki

    2015-07-29

    Although there is a close relationship between dietary sodium and hypertension, the concept that persons with relatively high dietary sodium are at increased risk of developing hypertension compared with those with relatively low dietary sodium has not been studied intensively in a cohort. We conducted an observational study to investigate whether dietary sodium intake predicts future blood pressure and the onset of hypertension in the general population. Individual sodium intake was estimated by calculating 24-hour urinary sodium excretion from spot urine in 4523 normotensive participants who visited our hospital for a health checkup. After a baseline examination, they were followed for a median of 1143 days, with the end point being development of hypertension. During the follow-up period, hypertension developed in 1027 participants (22.7%). The risk of developing hypertension was higher in those with higher rather than lower sodium intake (hazard ratio 1.25, 95% CI 1.04 to 1.50). In multivariate Cox proportional hazards regression analysis, baseline sodium intake and the yearly change in sodium intake during the follow-up period (as continuous variables) correlated with the incidence of hypertension. Furthermore, both the yearly increase in sodium intake and baseline sodium intake showed significant correlations with the yearly increase in systolic blood pressure in multivariate regression analysis after adjustment for possible risk factors. Both relatively high levels of dietary sodium intake and gradual increases in dietary sodium are associated with future increases in blood pressure and the incidence of hypertension in the Japanese general population. © 2015 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley Blackwell.
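
    A minimal sketch of the multivariate Cox proportional hazards analysis used in such cohorts, via the lifelines package; the data and column names below are synthetic placeholders, not the study's cohort.

```python
# Hedged sketch: Cox proportional hazards for incident hypertension with
# estimated sodium intake as a covariate. Data and column names are synthetic
# placeholders, not the study's cohort.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(6)
n = 1000
sodium = rng.normal(4.0, 1.0, n)                  # g/day, estimated from spot urine
age = rng.uniform(30, 70, n)
hazard = 0.0004 * np.exp(0.25 * (sodium - 4.0) + 0.03 * (age - 50))
t_event = rng.exponential(1.0 / hazard)           # days until hypertension onset
followup = np.minimum(t_event, 1143.0)            # censor at the median follow-up
df = pd.DataFrame({
    "followup_days": followup,
    "incident_htn": (t_event <= 1143.0).astype(int),
    "sodium_g_day": sodium,
    "age": age,
})

cph = CoxPHFitter()
cph.fit(df, duration_col="followup_days", event_col="incident_htn")
cph.print_summary()  # hazard ratio per g/day of sodium = exp(coef)
```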

  17. Spontaneous regression of retinopathy of prematurity:incidence and predictive factors

    Directory of Open Access Journals (Sweden)

    Rui-Hong Ju

    2013-08-01

    Full Text Available AIM: To evaluate the incidence of spontaneous regression of changes in the retina and vitreous in the active stage of retinopathy of prematurity (ROP) and identify the possible related factors during the regression. METHODS: This was a retrospective, hospital-based study. The study consisted of 39 premature infants with mild ROP that showed spontaneous regression (Group A) and 17 with severe ROP who had been treated before naturally involuting (Group B), from August 2008 through May 2011. Data on gender, single or multiple pregnancy, gestational age, birth weight, weight gain from birth to the sixth week of life, use of oxygen in mechanical ventilation, total duration of oxygen inhalation, surfactant given or not, need for and times of blood transfusion, 1, 5, 10-min Apgar score, presence of bacterial or fungal or combined infection, hyaline membrane disease (HMD), patent ductus arteriosus (PDA), duration of stay in the neonatal intensive care unit (NICU), and duration of ROP were recorded. RESULTS: The incidence of spontaneous regression of ROP was 86.7% for stage 1, and 57.1% and 5.9% for stages 2 and 3, respectively. Regression was detected in 100% of changes in zone Ⅲ, 46.2% in zone Ⅱ, and 0% in zone Ⅰ. The mean duration of ROP in the spontaneous regression group was 5.65±3.14 weeks, lower than that of the treated ROP group (7.34±4.33 weeks), but this difference was not statistically significant (P=0.201). GA, 1-min Apgar score, 5-min Apgar score, duration of NICU stay, postnatal age at initial screening, and oxygen therapy longer than 10 days were significant predictive factors for the spontaneous regression of ROP (P<0.05). Retinal hemorrhage was the only independent predictive factor for the spontaneous regression of ROP (OR 0.030, 95% CI 0.001-0.775, P=0.035). CONCLUSION: This study showed most stage 1 and 2 ROP and changes in zone Ⅲ spontaneously regress in the end. Retinal hemorrhage is weakly inversely associated with spontaneous regression.

  18. Model variations in predicting incidence of Plasmodium falciparum malaria using 1998-2007 morbidity and meteorological data from south Ethiopia.

    Science.gov (United States)

    Loha, Eskindir; Lindtjørn, Bernt

    2010-06-16

    Malaria transmission is complex and is believed to be associated with local climate changes. However, simple attempts to extrapolate malaria incidence rates from averaged regional meteorological conditions have proven unsuccessful. Therefore, the objective of this study was to determine if variations in specific meteorological factors are able to consistently predict P. falciparum malaria incidence at different locations in south Ethiopia. Retrospective data from 42 locations were collected including P. falciparum malaria incidence for the period of 1998-2007 and meteorological variables such as monthly rainfall (all locations), temperature (17 locations), and relative humidity (three locations). Thirty-five data sets qualified for the analysis. Ljung-Box Q statistics was used for model diagnosis, and R squared or stationary R squared was taken as the goodness-of-fit measure. Time series modelling was carried out using Transfer Function (TF) models and univariate auto-regressive integrated moving average (ARIMA) models when there was no significant predictor meteorological variable. Of 35 models, five were discarded because of the significant value of the Ljung-Box Q statistics. Past P. falciparum malaria incidence alone (17 locations) or coupled with meteorological variables (four locations) was able to predict P. falciparum malaria incidence within statistical significance. All seasonal ARIMA orders were from locations at altitudes above 1742 m. Monthly rainfall, minimum and maximum temperature were able to predict incidence at four, five and two locations, respectively. In contrast, relative humidity was not able to predict P. falciparum malaria incidence. The R squared values for the models ranged from 16% to 97%, with the exception of one model which had a negative value. Models with seasonal ARIMA orders were found to perform better. However, the models for predicting P. falciparum malaria incidence varied from location to location.

  19. Model variations in predicting incidence of Plasmodium falciparum malaria using 1998-2007 morbidity and meteorological data from south Ethiopia

    Directory of Open Access Journals (Sweden)

    Loha Eskindir

    2010-06-01

    Full Text Available Background Malaria transmission is complex and is believed to be associated with local climate changes. However, simple attempts to extrapolate malaria incidence rates from averaged regional meteorological conditions have proven unsuccessful. Therefore, the objective of this study was to determine if variations in specific meteorological factors are able to consistently predict P. falciparum malaria incidence at different locations in south Ethiopia. Methods Retrospective data from 42 locations were collected including P. falciparum malaria incidence for the period of 1998-2007 and meteorological variables such as monthly rainfall (all locations), temperature (17 locations), and relative humidity (three locations). Thirty-five data sets qualified for the analysis. Ljung-Box Q statistics was used for model diagnosis, and R squared or stationary R squared was taken as the goodness-of-fit measure. Time series modelling was carried out using Transfer Function (TF) models and univariate auto-regressive integrated moving average (ARIMA) models when there was no significant predictor meteorological variable. Results Of 35 models, five were discarded because of the significant value of the Ljung-Box Q statistics. Past P. falciparum malaria incidence alone (17 locations) or coupled with meteorological variables (four locations) was able to predict P. falciparum malaria incidence within statistical significance. All seasonal ARIMA orders were from locations at altitudes above 1742 m. Monthly rainfall, minimum and maximum temperature were able to predict incidence at four, five and two locations, respectively. In contrast, relative humidity was not able to predict P. falciparum malaria incidence. The R squared values for the models ranged from 16% to 97%, with the exception of one model which had a negative value. Models with seasonal ARIMA orders were found to perform better. However, the models for predicting P. falciparum malaria incidence varied from location to location.
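
    A rough sketch of the modelling strategy in the two records above — a seasonal ARIMA with a lagged meteorological regressor standing in for the paper's transfer-function models — using statsmodels. The orders, the two-month lag, and the synthetic series are illustrative choices, not the fitted models.

```python
# Hedged sketch: monthly incidence as seasonal ARIMA with lagged rainfall as an
# exogenous regressor, a stand-in for the paper's transfer-function models.
# The orders, the 2-month lag, and the synthetic series are illustrative.
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(2)
months = np.arange(120)                                   # ten years, monthly
rain = 50 + 40 * np.sin(2 * np.pi * months / 12) + rng.normal(0, 5, 120)
cases = 100 + 0.8 * np.roll(rain, 2) + rng.normal(0, 10, 120)  # 2-month lag

rain_lag2 = pd.Series(rain).shift(2).bfill().to_numpy().reshape(-1, 1)
model = ARIMA(cases, exog=rain_lag2, order=(1, 0, 1),
              seasonal_order=(1, 0, 0, 12))
print(model.fit().summary().tables[1])
```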

  20. The role of surface topography in predicting scattering at grazing incidence from optical surfaces

    International Nuclear Information System (INIS)

    Rehn, V.; Jones, V.O.; Elson, J.M.; Bennett, J.M.

    1980-01-01

    Monochromator design and the design of optical experiments at XUV and X-ray wavelengths are frequently limited by scattering from optical components, yet theoretical treatments are few and untested experimentally. This is partly due to the failure of scattering models used in the visible and near UV when the wavelength becomes comparable to, or smaller than, the topographic features on the surface, and partly it is due to the difficulty in measuring the topography on the required size scale. We briefly review the theoretical problems and prospects for accurately predicting both the magnitude and angular distribution of scattering at grazing incidence from optical surfaces. Experimental methods for determining and representing the surface topography are also reviewed, together with their limitations and ranges of applicability. Finally, the first results of our experiments, conducted recently at the Stanford Synchrotron Radiation Laboratory on the angular distribution of scattering by surfaces of known topography are presented and discussed, along with their potential implications for the theory of scattering, and for XUV and X-ray optical components. (orig.)

  1. Incidence of atrial fibrillation and its risk prediction model based on a prospective urban Han Chinese cohort.

    Science.gov (United States)

    Ding, L; Li, J; Wang, C; Li, X; Su, Q; Zhang, G; Xue, F

    2017-09-01

    Prediction models of atrial fibrillation (AF) have been developed; however, no AF prediction model has been validated in a Chinese population. Therefore, we aimed to investigate the incidence of AF in an urban Han Chinese health check-up population, as well as to develop AF prediction models using behavioral, anthropometric, biochemical, and electrocardiogram (ECG) markers, as well as visit-to-visit variability (VVV) in blood pressures, available in the routine health check-up. A total of 33 186 participants aged 45-85 years and free of AF at baseline were included in this cohort and followed up for incident AF with an annual routine health check-up. Cox regression models were used to develop the AF prediction model, and 10-fold cross-validation was used to test the discriminatory accuracy of the prediction model. We developed three prediction models: with age, sex, history of coronary heart disease (CHD), and hypertension as predictors for the simple model; with left high-amplitude waves and premature beats added for the ECG model; and with age, sex, history of CHD, and VVV in systolic and diastolic blood pressures as predictors for the VVV model, to estimate risk of incident AF. The calibration of our models ranged from 1.001 to 1.004 (P for Hosmer-Lemeshow test >0.05). The areas under the receiver operating characteristic curve were 78%, 80% and 82%, respectively, for predicting risk of AF. In conclusion, we have identified predictors of incident AF and developed prediction models for AF with variables readily available in the routine health check-up.
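
    A minimal sketch of the 10-fold cross-validation step, with logistic regression standing in for the paper's Cox models and synthetic data in place of the cohort; the five features merely mimic the kinds of predictors named above.

```python
# Hedged sketch: 10-fold cross-validated discrimination (AUC) for incident AF.
# Logistic regression stands in for the paper's Cox models; the five synthetic
# features mimic age, sex, CHD history, hypertension, and BP variability.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold, cross_val_score

rng = np.random.default_rng(3)
X = rng.normal(size=(2000, 5))
logit = X @ np.array([0.8, 0.3, 0.5, 0.6, 0.7]) - 2.5
y = rng.random(2000) < 1.0 / (1.0 + np.exp(-logit))       # synthetic incident AF

cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
aucs = cross_val_score(LogisticRegression(max_iter=1000), X, y,
                       scoring="roc_auc", cv=cv)
print("10-fold AUC: %.2f +/- %.2f" % (aucs.mean(), aucs.std()))
```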

  2. [Predicting Incidence of Hepatitis E in China Using Fuzzy Time Series Based on Fuzzy C-Means Clustering Analysis].

    Science.gov (United States)

    Luo, Yi; Zhang, Tao; Li, Xiao-song

    2016-05-01

    To explore the application of a fuzzy time series model based on fuzzy c-means clustering in forecasting the monthly incidence of Hepatitis E in mainland China. A predictive model (fuzzy time series method based on fuzzy c-means clustering) was developed using Hepatitis E incidence data in mainland China between January 2004 and July 2014. The incidence data from August 2014 to November 2014 were used to test the fitness of the predictive model. The forecasting results were compared with those resulting from traditional fuzzy time series models. The fuzzy time series model based on fuzzy c-means clustering had a 0.0011 mean squared error (MSE) of fitting and a 6.9775 × 10⁻⁴ MSE of forecasting, compared with 0.0017 and 0.0014 from the traditional forecasting model. The results indicate that the fuzzy time series model based on fuzzy c-means clustering has a better performance in forecasting the incidence of Hepatitis E.
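
    A compact sketch of the idea: derive fuzzy sets from fuzzy c-means cluster centers, then forecast with first-order fuzzy relationships. The series, the number of clusters, and the defuzzification rule below are simplified illustrations, not the paper's exact method.

```python
# Hedged sketch: a first-order fuzzy time series forecaster whose fuzzy sets
# come from fuzzy c-means (FCM) cluster centers, in the spirit of the paper.
# The series is synthetic; c, m, and the forecasting rule are simplified.
import numpy as np

def fcm_centers(x, c=5, m=2.0, iters=100, seed=0):
    """1-D fuzzy c-means: returns sorted cluster centers."""
    rng = np.random.default_rng(seed)
    u = rng.random((c, x.size))
    u /= u.sum(axis=0)                        # memberships sum to 1 per point
    for _ in range(iters):
        um = u ** m
        centers = um @ x / um.sum(axis=1)
        d = np.abs(x[None, :] - centers[:, None]) + 1e-12
        u = d ** (-2.0 / (m - 1.0))
        u /= u.sum(axis=0)
    return np.sort(centers)

def forecast(series, centers):
    """Predict x[t+1] as the mean value historically following each fuzzy state."""
    labels = np.argmin(np.abs(series[:, None] - centers[None, :]), axis=1)
    preds = np.full(series.size - 1, series.mean())
    for s in range(centers.size):
        idx = np.where(labels[:-1] == s)[0]
        if idx.size:
            preds[labels[:-1] == s] = series[idx + 1].mean()
    return preds

x = 10 + 4 * np.sin(np.arange(120) / 6) + np.random.default_rng(4).normal(0, 0.3, 120)
pred = forecast(x, fcm_centers(x))
print("MSE of fitting:", np.mean((x[1:] - pred) ** 2))
```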

  3. Development of a multivariate model to predict the likelihood of carcinoma in patients with indeterminate peripheral lung nodules after a nondiagnostic bronchoscopic evaluation.

    Science.gov (United States)

    Voss, Jesse S; Iqbal, Seher; Jenkins, Sarah M; Henry, Michael R; Clayton, Amy C; Jett, James R; Kipp, Benjamin R; Halling, Kevin C; Maldonado, Fabien

    2014-01-01

    Studies have shown that fluorescence in situ hybridization (FISH) testing increases lung cancer detection on cytology specimens in peripheral nodules. The goal of this study was to determine whether a predictive model using clinical features and routine cytology with FISH results could predict lung malignancy after a nondiagnostic bronchoscopic evaluation. Patients with an indeterminate peripheral lung nodule that had a nondiagnostic bronchoscopic evaluation were included in this study (N = 220). FISH was performed on residual bronchial brushing cytology specimens diagnosed as negative (n = 195), atypical (n = 16), or suspicious (n = 9). FISH results included hypertetrasomy (n = 30) and negative (n = 190). Primary study end points included lung cancer status along with time to diagnosis of lung cancer or date of last clinical follow-up. Hazard ratios (HRs) were calculated using Cox proportional hazards regression model analyses, and P values < .05 were considered statistically significant. The mean age of the 220 patients was 66.7 years (range, 35-91), and most (58%) were men. Most patients (79%) were current or former smokers with a mean pack year history of 43.2 years (median, 40; range, 1-200). After multivariate analysis, hypertetrasomy FISH (HR = 2.96, P < .001), pack years (HR = 1.03 per pack year up to 50, P = .001), age (HR = 1.04 per year, P = .02), atypical or suspicious cytology (HR = 2.02, P = .04), and nodule spiculation (HR = 2.36, P = .003) were independent predictors of malignancy over time and were used to create a prediction model (C-statistic = 0.78). These results suggest that this multivariate model including test results and clinical features may be useful following a nondiagnostic bronchoscopic examination. © 2013.
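
    The reported C-statistic (here 0.78) is Harrell's concordance index: the fraction of comparable patient pairs in which the model assigns the higher risk to the patient diagnosed earlier. A minimal sketch on toy data:

```python
# Hedged sketch: Harrell's concordance index (C-statistic) for right-censored
# survival data, the summary used for the nodule-malignancy model above.
import numpy as np

def harrell_c(time, event, risk):
    """Fraction of comparable pairs where higher predicted risk -> earlier event."""
    conc = ties = comp = 0
    n = len(time)
    for i in range(n):
        if not event[i]:
            continue                      # each pair is anchored by an event
        for j in range(n):
            if time[j] > time[i]:         # j outlived i: pair is comparable
                comp += 1
                if risk[i] > risk[j]:
                    conc += 1
                elif risk[i] == risk[j]:
                    ties += 1
    return (conc + 0.5 * ties) / comp

t = np.array([5, 8, 12, 20, 25])          # months to diagnosis or last follow-up
e = np.array([1, 1, 0, 1, 0])             # 1 = cancer diagnosed, 0 = censored
r = np.array([0.9, 0.7, 0.3, 0.5, 0.1])   # model-predicted risk
print(harrell_c(t, e, r))
```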

  4. Ventricular arrhythmias and sudden cardiac arrest in Takotsubo cardiomyopathy: Incidence, predictive factors, and clinical implications.

    Science.gov (United States)

    Jesel, Laurence; Berthon, Charlotte; Messas, Nathan; Lim, Han S; Girardey, Mélanie; Marzak, Halim; Marchandot, Benjamin; Trinh, Annie; Ohlmann, Patrick; Morel, Olivier

    2018-04-06

    Takotsubo cardiomyopathy (TTC) is a stress-related transient cardiomyopathy. Life-threatening arrhythmias (LTA) can occur and worsen prognosis. The purpose of this study was to assess the incidence and outcome of LTA in TTC, as well as its predictive factors and clinical implications. We studied 214 consecutive cases of TTC over 8 years. The study cohort was divided into 2 groups: those with LTA (LTA group) and those without (non-LTA group). LTA was defined as ventricular tachycardia, ventricular fibrillation, or cardiac arrest. LTA occurred in 10.7% of patients, mainly in the first 24 hours of hospitalization: ventricular tachycardia (n = 2), ventricular fibrillation (n = 11), cardiac arrest (n = 10: 5 asystole, 3 complete heart block, and 2 sinoatrial block). LTA were associated with lower left ventricular ejection fraction (LVEF) and a high rate of conduction disturbances. In-hospital mortality was significantly higher in the LTA group (39.1% vs 8.9%), and lower LVEF and QRS duration >105 ms were independent predictors of LTA. In cases where a device was implanted, conduction disturbances persisted after the index event despite complete recovery of LVEF. There was no ventricular arrhythmia recurrence during follow-up. LTA occur early in patients presenting with TTC and are associated with significantly worse short- and long-term prognosis. Left ventricular impairment and QRS duration >105 ms are independent predictors of LTA. Ventricular arrhythmias occurred in the acute phase without further recurrence recorded in hospital survivors, whereas severe conduction disorders persisted during long-term follow-up. These findings may have implications on the choice of device therapy for this specific patient subgroup. Copyright © 2018 Heart Rhythm Society. Published by Elsevier Inc. All rights reserved.

  5. Increase in breast cancer incidence among older women in Mumbai: 30-year trends and predictions to 2025.

    Science.gov (United States)

    Dikshit, Rajesh P; Yeole, B B; Nagrani, Rajini; Dhillon, P; Badwe, R; Bray, Freddie

    2012-08-01

    Increasing trends in the incidence of breast cancer have been observed in India, including Mumbai. These have likely stemmed from an increasing adoption of lifestyle factors more akin to those commonly observed in westernized countries. Analyses of breast cancer trends and corresponding estimation of the future burden are necessary to better plan rational cancer control programmes within the country. We used data from the population-based Mumbai Cancer Registry to study time trends in breast cancer incidence rates 1976-2005, stratified by younger (25-49) and older (50-74) age groups. Age-period-cohort models were fitted and the net drift used as a measure of the estimated annual percentage change (EAPC). Age-period-cohort models and population projections were used to predict the age-adjusted rates and number of breast cancer cases circa 2025. Breast cancer incidence increased significantly among older women over three decades (EAPC = 1.6%; 95% CI 1.1-2.0), while a lesser but significant 1% increase in incidence among younger women was observed (EAPC = 1.0; 95% CI 0.2-1.8). Non-linear period and cohort effects were observed; a trends-based model predicted a close-to-doubling of incident cases by 2025, from a mean of 1300 cases per annum in 2001-2005 to over 2500 cases in 2021-2025. The incidence of breast cancer has increased in Mumbai during the last two to three decades, with increases greater among older women. The number of breast cancer cases is predicted to double to over 2500 cases, the vast majority affecting older women. Copyright © 2012 Elsevier Ltd. All rights reserved.
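
    The EAPC reported from the net drift is conventionally recovered from a log-linear Poisson model of rates on calendar time, EAPC = 100·(exp(β) − 1). A numeric sketch with synthetic counts rigged to grow roughly 1.6% per year:

```python
# Hedged sketch: estimated annual percentage change (EAPC) from a log-linear
# Poisson model of incident cases on calendar year, with person-years offset.
# Counts below are synthetic, rigged to grow ~1.6%/year as in the abstract.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
years = np.arange(1976, 2006)
pyears = np.full(years.size, 1_000_000.0)            # person-years at risk
true_rate = 40e-5 * 1.016 ** (years - 1976)          # ~1.6% annual increase
cases = rng.poisson(true_rate * pyears)

X = sm.add_constant(years - years[0])
fit = sm.GLM(cases, X, family=sm.families.Poisson(),
             offset=np.log(pyears)).fit()
beta = fit.params[1]                                 # slope per calendar year
print(f"EAPC = {100 * (np.exp(beta) - 1):.2f}%")
```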

  6. Incidence trends for childhood type 1 diabetes in Europe during 1989-2003 and predicted new cases 2005-20: a multicentre prospective registration study

    DEFF Research Database (Denmark)

    Patterson, Christopher C; Dahlquist, Gisela G; Gyürüs, Eva

    2009-01-01

    BACKGROUND: The incidence of type 1 diabetes in children younger than 15 years is increasing. Prediction of future incidence of this disease will enable adequate fund allocation for delivery of care to be planned. We aimed to establish 15-year incidence trends for childhood type 1 diabetes in Europe.

  7. Over Time, Do Anthropometric Measures Still Predict Diabetes Incidence in Chinese Han Nationality Population from Chengdu Community?

    Directory of Open Access Journals (Sweden)

    Kai Liu

    2013-01-01

    Full Text Available Objective. To examine whether anthropometric measures could predict diabetes incidence in a Chinese population during a 15-year follow-up. Design and Methods. The data were collected in 1992 and then again in 2007 from the same group of 687 individuals. Waist circumference, body mass index, waist to hip ratio, and waist to height ratio were collected based on a standard protocol. To assess the effects of baseline anthropometric measures on the new onset of diabetes, Cox's proportional hazards regression models were used to estimate their hazard ratios, and the discriminatory power of anthropometric measures for diabetes was assessed by the area under the receiver operating curve (AROC). Results. Seventy-four individuals were diagnosed with diabetes during the 15-year follow-up period (incidence: 10.8%). These anthropometric measures also predicted future diabetes during the long follow-up. At 7-8 years, the AROC of central obesity measures (WC, WHpR, WHtR) were higher than that of the general obesity measure (BMI). But there were no significant differences among the four anthropometric measurements at 15 years. Conclusions. These anthropometric measures could still predict diabetes over a long follow-up. However, the validity of anthropometric measures to predict incident diabetes may change with time.

  8. Trait anger but not anxiety predicts incident type 2 diabetes: The Multi-Ethnic Study of Atherosclerosis (MESA).

    Science.gov (United States)

    Abraham, Sherley; Shah, Nina G; Diez Roux, Ana; Hill-Briggs, Felicia; Seeman, Teresa; Szklo, Moyses; Schreiner, Pamela J; Golden, Sherita Hill

    2015-10-01

    Prior studies have shown a bidirectional association between depression and type 2 diabetes mellitus (T2DM); however, the prospective associations of anger and anxiety with T2DM have not been established. We hypothesized that trait anger and anxiety would predict incident T2DM, independently of depressive symptoms. In the Multi-ethnic Study of Atherosclerosis (MESA), we prospectively examined the association of trait anger and trait anxiety (assessed via the Spielberger Trait Anger and Anxiety Scales, respectively) with incident T2DM over 11.4 years in 5598 White, Black, Hispanic, and Chinese participants (53.2% women, mean age 61.6 years) at baseline without prevalent T2DM or cardiovascular disease. We used Cox proportional hazards models to calculate the hazard ratios (HR) of incident T2DM by previously defined anger category (low, moderate, high), and anxiety quartile, as there were no previously defined categories. High total trait anger was associated with incident T2DM (HR 1.50; 95% CI 1.08-2.07) relative to low total trait anger. The association was attenuated following adjustment for waist circumference (HR 1.32; 95% CI 0.94-1.86). Higher anger reaction was also associated with incident T2DM (HR=1.07; 95% CI 1.03-1.11) that remained significant after adjusting for potential confounders/explanatory factors. In contrast, trait anxiety did not predict incident T2DM. High total trait anger and anger reaction are potential modifiable risk factors for T2DM. Further research is needed to explore the mechanisms of the anger-diabetes relationship and to develop preventive interventions. Copyright © 2015 Elsevier Ltd. All rights reserved.

  9. Prediction of incidence and stability of alcohol use disorders by latent internalizing psychopathology risk profiles in adolescence and young adulthood.

    Science.gov (United States)

    Behrendt, Silke; Bühringer, Gerhard; Höfler, Michael; Lieb, Roselind; Beesdo-Baum, Katja

    2017-10-01

    Comorbid internalizing mental disorders in alcohol use disorders (AUD) can be understood as putative independent risk factors for AUD or as expressions of underlying shared psychopathology vulnerabilities. However, it remains unclear whether: 1) specific latent internalizing psychopathology risk-profiles predict AUD-incidence and 2) specific latent internalizing comorbidity-profiles in AUD predict AUD-stability. To investigate baseline latent internalizing psychopathology risk profiles as predictors of subsequent AUD-incidence and -stability in adolescents and young adults. Data from the prospective-longitudinal EDSP study (baseline age 14-24 years) were used. The study-design included up to three follow-up assessments in up to ten years. DSM-IV mental disorders were assessed with the DIA-X/M-CIDI. To investigate risk-profiles and their associations with AUD-outcomes, latent class analysis with auxiliary outcome variables was applied. AUD-incidence: a 4-class model (N=1683) was identified (classes: normative-male [45.9%], normative-female [44.2%], internalizing [5.3%], nicotine dependence [4.5%]). Compared to the normative-female class, all other classes were associated with a higher risk of subsequent incident alcohol dependence (p<0.05). AUD-stability: a 3-class model (N=1940) was identified with only one class (11.6%) with high probabilities for baseline AUD. This class was further characterized by elevated substance use disorder (SUD) probabilities and predicted any subsequent AUD (OR 8.5, 95% CI 5.4-13.3). An internalizing vulnerability may constitute a pathway to AUD incidence in adolescence and young adulthood. In contrast, no indication for a role of internalizing comorbidity profiles in AUD-stability was found, which may indicate a limited importance of such profiles - in contrast to SUD-related profiles - in AUD stability. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. Nine-year incident diabetes is predicted by fatty liver indices: the French D.E.S.I.R. study

    Directory of Open Access Journals (Sweden)

    Vol Sylviane

    2010-06-01

    Full Text Available Abstract Background Fatty liver is known to be linked with insulin resistance, alcohol intake, diabetes and obesity. Biopsy and even scan-assessed fatty liver are not always feasible in clinical practice. This report evaluates the predictive ability of two recently published markers of fatty liver: the Fatty Liver Index (FLI) and the NAFLD fatty liver score (NAFLD-FLS), for 9-year incident diabetes, in the French general-population cohort: Data from an Epidemiological Study on the Insulin Resistance syndrome (D.E.S.I.R.) Methods At baseline, there were 1861 men and 1950 women, non-diabetic, aged 30 to 65 years. Over the follow-up, 203 incident diabetes cases (140 men, 63 women) were identified by diabetes-treatment or fasting plasma glucose ≥ 7.0 mmol/l. The FLI includes: BMI, waist circumference, triglycerides and gamma glutamyl transferase, and the NAFLD-FLS: the metabolic syndrome, diabetes, insulin, alanine aminotransferase, and aspartate aminotransferase. Logistic regression was used to determine the odds ratios for incident diabetes associated with categories of the fatty liver indices. Results In comparison to those with a FLI in the lowest category, the odds of incident diabetes rose across higher index categories. Conclusions These fatty liver indexes are simple clinical tools for evaluating the extent of liver fat and they are predictive of incident diabetes. Physicians should screen for diabetes in patients with fatty liver.
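
    For context, the FLI referred to above is usually computed with the Bedogni et al. (2006) formulation; a small helper under that assumption (coefficients should be confirmed against the original publication) is sketched below.

```python
import math

def fatty_liver_index(tg_mg_dl, bmi, ggt_u_l, waist_cm):
    """Fatty Liver Index per Bedogni et al. (2006), as commonly cited;
    verify the coefficients against the original paper before use."""
    y = (0.953 * math.log(tg_mg_dl) + 0.139 * bmi
         + 0.718 * math.log(ggt_u_l) + 0.053 * waist_cm - 15.745)
    return 100 * math.exp(y) / (1 + math.exp(y))  # bounded between 0 and 100

# Illustrative values only
print(fatty_liver_index(tg_mg_dl=150, bmi=28, ggt_u_l=40, waist_cm=95))
```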

  11. Cancer incidence predictions in the North of Portugal: keeping population-based cancer registration up to date.

    Science.gov (United States)

    Castro, Clara; Antunes, Luís; Lunet, Nuno; Bento, Maria José

    2016-09-01

    Decision making towards cancer prevention and control requires monitoring of trends in cancer incidence and accurate estimation of its burden in different settings. We aimed to estimate the number of incident cases in northern Portugal for 2015 and 2020 (all cancers except nonmelanoma skin and for the 15 most frequent tumours). Cancer cases diagnosed in 1994-2009 were collected by the North Region Cancer Registry of Portugal (RORENO) and corresponding population figures were obtained from Statistics Portugal. JoinPoint regression was used to analyse incidence trends. Population projections until 2020 were derived by RORENO. Predictions were performed using the Poisson regression models proposed by Dyba and Hakulinen. The number of incident cases is expected to increase by 18.7% in 2015 and by 37.6% in 2020, with lower increments among men than among women. For most cancers considered, the number of cases will keep rising up to 2020, although decreasing trends of age-standardized rates are expected for some tumours. Cervix was the only cancer with a decreasing number of incident cases in the entire period. Thyroid and lung cancers were among those with the steepest increases in the number of incident cases expected for 2020, especially among women. In 2020, the top five cancers are expected to account for 82 and 62% of all cases diagnosed in men and women, respectively. This study contributes to a broader understanding of cancer burden in the north of Portugal and provides the basis for keeping population-based incidence estimates up to date.
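
    A toy version of the projection step described above (a Poisson regression on registry counts with a population offset, extrapolated to future years) can be sketched as follows; the synthetic counts and the simple log-linear drift are illustrative, not the exact Dyba-Hakulinen specification.

```python
# Illustrative Poisson projection of registry counts with a population offset.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
years = np.arange(1994, 2010)
pop = 1.8e6 + 5e3 * (years - 1994)              # person-years (synthetic)
true_rate = 2.0e-3 * np.exp(0.02 * (years - 1994))
cases = rng.poisson(true_rate * pop)            # incident cases (synthetic)

X = sm.add_constant(years - 1994)               # log-linear drift in the rate
model = sm.GLM(cases, X, family=sm.families.Poisson(),
               offset=np.log(pop)).fit()

# Project to 2015 and 2020 using assumed future person-years at risk.
future_years = np.array([2015, 2020])
future_pop = 1.8e6 + 5e3 * (future_years - 1994)
Xf = sm.add_constant(future_years - 1994)
pred_cases = model.predict(Xf, offset=np.log(future_pop))
print(dict(zip(future_years, pred_cases.round())))
```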

  12. Soluble CD163 predicts incident chronic lung, kidney and liver disease in HIV infection

    DEFF Research Database (Denmark)

    Kirkegaard-Klitbo, Ditte M; Mejer, Niels; Knudsen, Troels B

    2017-01-01

    OBJECTIVE: To examine if monocyte and macrophage activity may be on the mechanistic pathway to non-AIDS comorbidity by investigating the associations between plasma-soluble CD163 (sCD163) and incident non-AIDS comorbidities in well treated HIV-infected individuals. DESIGN: Prospective single...... was examined using multivariable Cox proportional hazards models adjusted for pertinent covariates. RESULTS: In HIV-1-infected individuals (n = 799), the highest quartile of plasma sCD163 was associated with incident chronic lung disease [adjusted hazard ratio (aHR), 3.2; 95% confidence interval (CI): 1.34; 7.46] and incident chronic kidney disease (aHR, 10.94; 95% CI: 2.32; 51.35), when compared with the lowest quartiles. Further, every 1 mg increase in plasma sCD163 was associated with incident liver disease (aHR, 1.12; 95% CI: 1.05; 1.19). The sCD163 level was not associated with incident cancer...

  13. Likelihood devices in spatial statistics

    NARCIS (Netherlands)

    Zwet, E.W. van

    1999-01-01

    One of the main themes of this thesis is the application to spatial data of modern semi- and nonparametric methods. Another, closely related theme is maximum likelihood estimation from spatial data. Maximum likelihood estimation is not common practice in spatial statistics. The method of moments
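
    As a generic illustration of maximum likelihood estimation in a spatial setting (unrelated to the thesis's actual estimators), the intensity of a homogeneous Poisson point process can be estimated numerically and checked against its closed-form MLE N/|A|:

```python
# Teaching sketch: numerical MLE for a homogeneous Poisson process intensity.
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(7)
area = 1.0
n_points = rng.poisson(250 * area)   # simulate N ~ Poisson(lambda * |A|)

def neg_log_lik(lam):
    # log L(lambda) = N * log(lambda) - lambda * |A|   (up to constants)
    return -(n_points * np.log(lam) - lam * area)

res = minimize_scalar(neg_log_lik, bounds=(1e-6, 1e4), method="bounded")
print(res.x, n_points / area)   # numerical MLE matches the closed form N/|A|
```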

  14. Development of Risk Score for Predicting 3-Year Incidence of Type 2 Diabetes: Japan Epidemiology Collaboration on Occupational Health Study.

    Directory of Open Access Journals (Sweden)

    Akiko Nanri

    Full Text Available Risk models and scores have been developed to predict incidence of type 2 diabetes in Western populations, but their performance may differ when applied to non-Western populations. We developed and validated a risk score for predicting 3-year incidence of type 2 diabetes in a Japanese population. Participants were 37,416 men and women, aged 30 or older, who received periodic health checkups in 2008-2009 in eight companies. Diabetes was defined as fasting plasma glucose (FPG) ≥ 126 mg/dl, random plasma glucose ≥ 200 mg/dl, glycated hemoglobin (HbA1c) ≥ 6.5%, or receiving medical treatment for diabetes. Risk scores on non-invasive and invasive models including FPG and HbA1c were developed using logistic regression in a derivation cohort and validated in the remaining cohort. The area under the curve (AUC) for the non-invasive model including age, sex, body mass index, waist circumference, hypertension, and smoking status was 0.717 (95% CI, 0.703-0.731). In the invasive model, in which both FPG and HbA1c were added to the non-invasive model, the AUC increased to 0.893 (95% CI, 0.883-0.902). When the risk scores were applied to the validation cohort, the AUCs (95% CI) for the non-invasive and invasive models were 0.734 (0.715-0.753) and 0.882 (0.868-0.895), respectively. Participants with a non-invasive score of ≥ 15 and an invasive score of ≥ 19 were projected to have >20% and >50% risk, respectively, of developing type 2 diabetes within 3 years. The simple risk score of the non-invasive model might be useful for predicting incident type 2 diabetes, and its predictive performance may be markedly improved by incorporating FPG and HbA1c.
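
    One common recipe for this kind of score development is to fit a logistic model on a derivation cohort, rescale and round the coefficients into integer points, and check discrimination by AUC on a validation split. The sketch below uses synthetic data and hypothetical predictors, not the study's cohort or variables.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5000
X = np.column_stack([
    rng.normal(45, 10, n),      # age (illustrative)
    rng.normal(23, 3, n),       # BMI (illustrative)
    rng.integers(0, 2, n),      # current smoker (illustrative)
])
logit = -9 + 0.06 * X[:, 0] + 0.15 * X[:, 1] + 0.4 * X[:, 2]
y = rng.random(n) < 1 / (1 + np.exp(-logit))    # synthetic 3-year outcome

X_dev, X_val, y_dev, y_val = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_dev, y_dev)

# Points = coefficients scaled so the smallest effect is ~1 point, then rounded.
points = np.round(model.coef_[0] / np.abs(model.coef_[0]).min()).astype(int)
score_val = X_val @ points
print("AUC (model):", roc_auc_score(y_val, model.predict_proba(X_val)[:, 1]))
print("AUC (integer score):", roc_auc_score(y_val, score_val))
```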

  15. Microsatellite Status of Primary Colorectal Cancer Predicts the Incidence of Postoperative Colorectal Neoplasms.

    Science.gov (United States)

    Takiyama, Aki; Tanaka, Toshiaki; Yamamoto, Yoko; Hata, Keisuke; Ishihara, Soichiro; Nozawa, Hiroaki; Kawai, Kazushige; Kiyomatsu, Tomomichi; Nishikawa, Takeshi; Otani, Kensuke; Sasaki, Kazuhito; Watanabe, Toshiaki

    2017-10-01

    Few studies have evaluated the risk of postoperative colorectal neoplasms stratified by the nature of the primary colorectal cancer (CRC). In this study, we assessed this risk on the basis of the microsatellite (MS) status of the primary CRC. We retrospectively reviewed 338 patients with CRC and calculated the risk of neoplasms during postoperative surveillance colonoscopy in association with the MS status of the primary CRC. A propensity score method was applied. We identified a higher incidence of metachronous rectal neoplasms after the resection of MS-stable CRC than MS-instable CRC (adjusted HR 5.74, p=0.04). We also observed a higher incidence of colorectal tubular adenoma in patients with MSS CRC (adjusted hazard ratio 7.09). The microsatellite status of the primary colorectal cancer influenced the risk of postoperative colorectal neoplasms. Copyright© 2017, International Institute of Anticancer Research (Dr. George J. Delinasios), All rights reserved.

  16. Liver function tests and risk prediction of incident type 2 diabetes : evaluation in two independent cohorts

    NARCIS (Netherlands)

    Abbasi, Ali; Bakker, Stephan J. L.; Corpeleijn, Eva; van der A, Daphne L.; Gansevoort, Ron T.; Gans, Rijk O. B.; Peelen, Linda M.; van der Schouw, Yvonne T.; Stolk, Ronald P.; Navis, Gerjan; Spijkerman, Annemieke M. W.; Beulens, Joline W. J.

    2012-01-01

    Background: Liver function tests might predict the risk of type 2 diabetes. An independent study evaluating the utility of these markers compared with an existing prediction model is still lacking. Methods and Findings: We performed a case-cohort study, including a random subcohort (6.5%) from 38,379

  17. Predictive modeling capabilities from incident powder and laser to mechanical properties for laser directed energy deposition

    Science.gov (United States)

    Shin, Yung C.; Bailey, Neil; Katinas, Christopher; Tan, Wenda

    2018-01-01

    This paper presents an overview of vertically integrated comprehensive predictive modeling capabilities for directed energy deposition processes, which have been developed at Purdue University. The overall predictive models consist of several vertically integrated modules, including a powder flow model, a molten pool model, a microstructure prediction model and a residual stress model, which can be used for predicting the mechanical properties of parts additively manufactured by directed energy deposition with blown powder, as well as by other additive manufacturing processes. The critical governing equations of each model and how the various modules are connected are illustrated. Various illustrative results, along with corresponding experimental validation results, are presented to demonstrate the capabilities and fidelity of the models. The good correlations with experimental results show that the integrated models can be used to design metal additive manufacturing processes and predict the resultant microstructure and mechanical properties.

  19. Health risk factor modification predicts incidence of diabetes in an employee population: results of an 8-year longitudinal cohort study.

    Science.gov (United States)

    Rolando, Lori; Byrne, Daniel W; McGown, Paula W; Goetzel, Ron Z; Elasy, Tom A; Yarbrough, Mary I

    2013-04-01

    To understand risk factor modification effect on Type 2 diabetes incidence in a workforce population. Annual health risk assessment data (N = 3125) in years 1 through 4 were used to predict diabetes development in years 5 through 8. Employees who reduced their body mass index from 30 or more to less than 30 decreased their chances of developing diabetes (odds ratio = 0.22, 95% confidence interval: 0.05 to 0.93), while those who became obese increased their diabetes risk (odds ratio = 8.85, 95% confidence interval: 2.53 to 31.0). Weight reduction observed over a long period can result in clinically important reductions in diabetes incidence. Workplace health promotion programs may prevent diabetes among workers by encouraging weight loss and adoption of healthy lifestyle habits.

  20. Worldwide trends in gastric cancer mortality (1980-2011), with predictions to 2015, and incidence by subtype.

    Science.gov (United States)

    Ferro, Ana; Peleteiro, Bárbara; Malvezzi, Matteo; Bosetti, Cristina; Bertuccio, Paola; Levi, Fabio; Negri, Eva; La Vecchia, Carlo; Lunet, Nuno

    2014-05-01

    Gastric cancer incidence and mortality decreased substantially over the last decades in most countries worldwide, with differences in the trends and distribution of the main topographies across regions. To monitor recent mortality trends (1980-2011) and to compute short-term predictions (2015) of gastric cancer mortality in selected countries worldwide, we analysed mortality data provided by the World Health Organization. We also analysed incidence of cardia and non-cardia cancers using data from Cancer Incidence in Five Continents (2003-2007). The joinpoint regression over the most recent calendar periods gave estimated annual percent changes (EAPC) around -3% for the European Union (EU) and major European countries, as well as in Japan and Korea, and around -2% in North America and major Latin American countries. In the United States of America (USA), EU and other major countries worldwide, the EAPC, however, were lower than in previous years. The predictions for 2015 show that a levelling off of rates is expected in the USA and a few other countries. The relative contribution of cardia and non-cardia gastric cancers to the overall number of cases varies widely, with a generally higher proportion of cardia cancers in countries with lower gastric cancer incidence and mortality rates (e.g. the USA, Canada and Denmark). Despite the favourable mortality trends worldwide, in some countries the declines are becoming less marked. There still is the need to control Helicobacter pylori infection and other risk factors, as well as to improve diagnosis and management, to further reduce the burden of gastric cancer. Copyright © 2014 Elsevier Ltd. All rights reserved.

  1. The Incidence of Hypothyroidism Following the Radioactive Iodine Treatment of Graves’ Disease and the Predictive Factors Influencing its Development

    International Nuclear Information System (INIS)

    Husseni, Maha Abd El-Kareem El-Sayed

    2016-01-01

    The purpose of this study is to evaluate and compare the incidence of hypothyroidism following different fixed radioactive iodine-131 (131I) activities in the treatment of Graves’ disease (GD) and to investigate the predictive factors that may influence its occurrence. This retrospective analysis was performed on 272 patients with GD who were treated with 131I, among whom 125 received 370 MBq and 147 received 555 MBq. The outcome was categorized as hypothyroidism, euthyroidism, and persistent hyperthyroidism. Multiple logistic regression analysis was performed to identify significant risk factors that affect the development of hypothyroidism. The incidence of hypothyroidism following the first low activity was 24.8% with a high treatment failure rate of 58.4%, compared with 48.3% and 32% following high activity. The overall cumulative incidence of hypothyroidism following repeated activities was 50.7%, of which 73.9% occurred after the first activity and 20.3% after the second activity. The higher 131I activity (P < 0.001) and average and mild enlargement of the thyroid gland (P = 0.004) were identified as significant independent factors that increase the incidence of hypothyroidism (odds ratios were 2.95 and 2.59). No correlation was found between the development of hypothyroidism and factors such as age, gender, presence of exophthalmos, previous antithyroid medications and their durations, and Technetium-99m (Tc-99m) pertechnetate thyroid uptake. In view of the high treatment failure rate after the first low activity and the lower incidence of hypothyroidism after high activity, high activity is recommended for GD patients, reserving the use of 370 MBq for patients with average-sized and mildly enlarged goiters; this increases patient convenience by avoiding multiple activities to achieve cure and long-term follow-up.

  2. Prediction model for prevalence and incidence of advanced age-related macular degeneration based on genetic, demographic, and environmental variables.

    Science.gov (United States)

    Seddon, Johanna M; Reynolds, Robyn; Maller, Julian; Fagerness, Jesen A; Daly, Mark J; Rosner, Bernard

    2009-05-01

    The joint effects of genetic, ocular, and environmental variables were evaluated and predictive models for prevalence and incidence of AMD were assessed. Participants in the multicenter Age-Related Eye Disease Study (AREDS) were included in a prospective evaluation of 1446 individuals, of which 279 progressed to advanced AMD (geographic atrophy or neovascular disease) and 1167 did not progress during 6.3 years of follow-up. For prevalent AMD, 509 advanced cases were compared with 222 controls. Covariates for the incidence analysis included age, sex, education, smoking, body mass index (BMI), baseline AMD grade, and the AREDS vitamin-mineral treatment assignment. DNA specimens were evaluated for six variants in five genes related to AMD. Unconditional logistic regression analyses were performed for prevalent and incident advanced AMD. An algorithm was developed and receiver operating characteristic curves and C statistics were calculated to assess the predictive ability of risk scores to discriminate progressors from nonprogressors. All genetic polymorphisms were independently related to prevalence of advanced AMD, controlling for genetic factors, smoking, BMI, and AREDS treatment. Multivariate odds ratios (ORs) were 3.5 (95% confidence interval [CI], 1.7-7.1) for CFH Y402H; 3.7 (95% CI, 1.6-8.4) for CFH rs1410996; 25.4 (95% CI, 8.6-75.1) for LOC387715 A69S (ARMS2); 0.3 (95% CI, 0.1-0.7) for C2 E318D; 0.3 (95% CI, 0.1-0.5) for CFB; and 3.6 (95% CI, 1.4-9.4) for C3 R102G, comparing the homozygous risk/protective genotypes to the referent genotypes. For incident AMD, all these variants except CFB were significantly related to progression to advanced AMD, after controlling for baseline AMD grade and other factors, with ORs from 1.8 to 4.0 for presence of two risk alleles and 0.4 for the protective allele. An interaction was seen between CFH Y402H and treatment, after controlling for all genotypes. Smoking was independently related to AMD, with a multiplicative joint

  3. Predictive value of noninvasive measures of atherosclerosis for incident myocardial infarction - The Rotterdam study

    NARCIS (Netherlands)

    van der Meer, IM; Bots, ML; Hofman, A; del Sol, AI; van der Kuip, DAM; Witteman, JCM

    2004-01-01

    Background - Several noninvasive methods are available to investigate the severity of extracoronary atherosclerotic disease. No population- based study has yet examined whether differences exist between these measures with regard to their predictive value for myocardial infarction (MI) or whether a

  4. Higher levels of albuminuria within the normal range predict incident hypertension.

    Science.gov (United States)

    Forman, John P; Fisher, Naomi D L; Schopick, Emily L; Curhan, Gary C

    2008-10-01

    Higher levels of albumin excretion within the normal range are associated with cardiovascular disease in high-risk individuals. Whether incremental increases in urinary albumin excretion, even within the normal range, are associated with the development of hypertension in low-risk individuals is unknown. This study included 1065 postmenopausal women from the first Nurses' Health Study and 1114 premenopausal women from the second Nurses' Health Study who had an albumin/creatinine ratio within the normal range and who did not have diabetes or hypertension. Among the older women, 271 incident cases of hypertension occurred during 4 yr of follow-up, and among the younger women, 296 incident cases of hypertension occurred during 8 yr of follow-up. Cox proportional hazards regression was used to examine prospectively the association between the albumin/creatinine ratio and incident hypertension after adjustment for age, body mass index, estimated GFR, baseline BP, physical activity, smoking, and family history of hypertension. Participants who had an albumin/creatinine ratio in the highest quartile (4.34 to 24.17 mg/g for older women and 3.68 to 23.84 mg/g for younger women) were more likely to develop hypertension than those who had an albumin/creatinine ratio in the lowest quartile (hazard ratio 1.76 [95% confidence interval 1.21 to 2.56] and hazard ratio 1.35 [95% confidence interval 0.97 to 1.91] for older and younger women, respectively). Higher albumin/creatinine ratios, even within the normal range, are independently associated with increased risk for development of hypertension among women without diabetes. The definition of normal albumin excretion should be reevaluated.

  5. High C-Reactive Protein Predicts Delirium Incidence, Duration, and Feature Severity After Major Noncardiac Surgery.

    Science.gov (United States)

    Vasunilashorn, Sarinnapha M; Dillon, Simon T; Inouye, Sharon K; Ngo, Long H; Fong, Tamara G; Jones, Richard N; Travison, Thomas G; Schmitt, Eva M; Alsop, David C; Freedman, Steven D; Arnold, Steven E; Metzger, Eran D; Libermann, Towia A; Marcantonio, Edward R

    2017-08-01

    To examine associations between the inflammatory marker C-reactive protein (CRP) measured preoperatively and on postoperative day 2 (POD2) and delirium incidence, duration, and feature severity. Prospective cohort study. Two academic medical centers. Adults aged 70 and older undergoing major noncardiac surgery (N = 560). Plasma CRP was measured using enzyme-linked immunosorbent assay. Delirium was assessed from Confusion Assessment Method (CAM) interviews and chart review. Delirium duration was measured according to number of hospital days with delirium. Delirium feature severity was defined as the sum of CAM-Severity (CAM-S) scores on all postoperative hospital days. Generalized linear models were used to examine independent associations between CRP (preoperatively and POD2 separately) and delirium incidence, duration, and feature severity; prolonged hospital length of stay (LOS, >5 days); and discharge disposition. Postoperative delirium occurred in 24% of participants, 12% had 2 or more delirium days, and the mean ± standard deviation sum CAM-S was 9.3 ± 11.4. After adjusting for age, sex, surgery type, anesthesia route, medical comorbidities, and postoperative infectious complications, participants with preoperative CRP of 3 mg/L or greater had a risk of delirium that was 1.5 times as great (95% confidence interval (CI) = 1.1-2.1) as that of those with CRP less than 3 mg/L, had 0.4 more delirium days, and experienced more severe delirium (sum CAM-S 3.6 points higher). Participants in the highest quartile of POD2 CRP had a greater risk of delirium (95% CI = 1.0-2.4) than those in the lowest quartile (≤127.53 mg/L), had 0.2 more delirium days, and experienced more severe delirium (sum CAM-S 4.5 points higher). Both preoperative and POD2 CRP were thus associated with delirium incidence, duration, and feature severity. CRP may be useful to identify individuals who are at risk of developing delirium. © 2017, Copyright the Authors Journal compilation © 2017, The American Geriatrics Society.

  6. Early Prediction of Sepsis Incidence in Critically Ill Patients Using Specific Genetic Polymorphisms.

    Science.gov (United States)

    David, Vlad Laurentiu; Ercisli, Muhammed Furkan; Rogobete, Alexandru Florin; Boia, Eugen S; Horhat, Razvan; Nitu, Razvan; Diaconu, Mircea M; Pirtea, Laurentiu; Ciuca, Ioana; Horhat, Delia; Horhat, Florin George; Licker, Monica; Popovici, Sonia Elena; Tanasescu, Sonia; Tataru, Calin

    2017-06-01

    Several diagnostic methods have been used to evaluate and monitor the pro-inflammatory status, as well as the incidence of sepsis, in critically ill patients. One such recent method is based on investigating genetic polymorphisms and determining the molecular and genetic links between them and sepsis-associated pathophysiologies. Identification of genetic polymorphisms in critical patients with sepsis could become a revolutionary method for evaluating and monitoring these patients. Similarly, the complications, as well as the high costs, associated with the management of patients with sepsis can be significantly reduced by early initiation of intensive care.

  7. Predictive Value of Triglyceride Glucose Index for the Risk of Incident Diabetes: A 4-Year Retrospective Longitudinal Study

    OpenAIRE

    Lee, Da Young; Lee, Eun Seo; Kim, Ji Hyun; Park, Se Eun; Park, Cheol-Young; Oh, Ki-Won; Park, Sung-Woo; Rhee, Eun-Jung; Lee, Won-Young

    2016-01-01

    The Triglyceride Glucose Index (TyG index) is considered a surrogate marker of insulin resistance. The aim of this study is to investigate whether the TyG index has a predictive role in identifying individuals with a high risk of incident diabetes and to compare it with other indicators of metabolic health. A total 2900 non-diabetic adults who attended five consecutive annual health check-ups at Kangbuk Samsung Hospital was divided into four subgroups using three methods: (1) baseline TyG ind...

  8. Recent development of risk-prediction models for incident hypertension: An updated systematic review.

    Directory of Open Access Journals (Sweden)

    Dongdong Sun

    Full Text Available Hypertension is a leading global health threat and a major cardiovascular disease. Since clinical interventions are effective in delaying the disease progression from prehypertension to hypertension, diagnostic prediction models to identify patient populations at high risk for hypertension are imperative. Both PubMed and Embase databases were searched for eligible reports of either prediction models or risk scores of hypertension. The study data were collected, including risk factors, statistical methods, characteristics of study design and participants, performance measurement, etc. From the searched literature, 26 studies reporting 48 prediction models were selected. Among them, 20 reports studied models established using traditional risk factors, such as body mass index (BMI), age, smoking, blood pressure (BP) level, parental history of hypertension, and biochemical factors, whereas 6 reports used genetic risk score (GRS) as the prediction factor. AUC ranged from 0.64 to 0.97, and the C-statistic ranged from 60% to 90%. The traditional models are still the predominant risk prediction models for hypertension, but recently, more models have begun to incorporate genetic factors as part of their predictors. However, these genetic predictors need to be well selected. The currently reported models have acceptable to good discrimination and calibration ability, but whether the models can be applied in clinical practice still needs more validation and adjustment.

  9. Incidence of tuberculosis and the predictive value of ELISPOT and Mantoux tests in Gambian case contacts.

    Directory of Open Access Journals (Sweden)

    Philip C Hill

    2008-01-01

    Full Text Available Studies of tuberculosis (TB) case contacts are increasingly being utilised for understanding the relationship between M. tuberculosis and the human host and for assessing new interventions and diagnostic tests. We aimed to identify the incidence rate of new TB cases among TB contacts and to relate this to their initial Mantoux and ELISPOT test results. After initial Mantoux and ELISPOT tests and exclusion of co-prevalent TB cases, we followed 2348 household contacts of sputum smear positive TB cases. We visited them at 3 months, 6 months, 12 months, 18 months and 24 months, and investigated those with symptoms consistent with TB. Those who were diagnosed separately at a government clinic had a chest x-ray. Twenty six contacts were diagnosed with definite TB over 4312 person years of follow-up (incidence rate 603/100,000 person years; 95% confidence interval, 370-830). Nine index and secondary case pairs had cultured isolates available for genotyping. Of these, 6 pairs were concordant and 3 were discordant. 2.5% of non-progressors were HIV positive compared to 12% of progressors (HR 6.2; 95% CI 1.7-22.5; p = 0.010). 25 secondary cases had initial Mantoux results, of whom 14 (56%) were positive; 21 had initial ELISPOT results, of whom 11 (52%) were positive; 15 (71%) of 21 tested were positive by one or the other test. Of the 6 contacts who had concordant isolates with their respective index case, 4 (67%) were Mantoux positive at recruitment and 3 (50%) were ELISPOT positive; 5 (83%) were positive by one or other of the two tests. ELISPOT positive contacts, and those with discordant results, had a similar rate of progression to those who were Mantoux positive. Those negative on either or both tests had the lowest rate of progression. The incidence rate of TB disease in Gambian TB case contacts, after screening for co-prevalent cases, was 603/100,000 person years. The initial ELISPOT and Mantoux tests were each positive in only just over half of cases, but 71% were positive by one or the other test.

  10. Alcohol-use disorder severity predicts first-incidence of depressive disorders

    NARCIS (Netherlands)

    Boschloo, L.; van den Brink, W.; Penninx, B.W.J.H.; Wall, M.M.; Hasin, D.S.

    2012-01-01

    Background Previous studies suggest that alcohol-use disorder severity, defined by the number of criteria met, provides a more informative phenotype than dichotomized DSM-IV diagnostic measures of alcohol use disorders. Therefore, this study examined whether alcohol-use disorder severity predicted

  12. Loss of social resources predicts incident posttraumatic stress disorder during ongoing political violence within the Palestinian Authority.

    Science.gov (United States)

    Hall, Brian J; Murray, Sarah M; Galea, Sandro; Canetti, Daphna; Hobfoll, Stevan E

    2015-04-01

    Exposure to ongoing political violence and stressful conditions increases the risk of posttraumatic stress disorder (PTSD) in low-resource contexts. However, much of our understanding of the determinants of PTSD in these contexts comes from cross-sectional data. Longitudinal studies that examine factors associated with incident PTSD may be useful to the development of effective prevention interventions and the identification of those who may be most at-risk for the disorder. A 3-stage cluster random stratified sampling methodology was used to obtain a representative sample of 1,196 Palestinian adults living in Gaza, the West Bank and East Jerusalem. Face-to-face interviews were conducted at two time points 6 months apart. Logistic regression analyses were conducted on a restricted sample of 643 people who did not have PTSD at baseline and who completed both interviews. The incidence of PTSD was 15.0% over a 6-month period. Results of adjusted logistic regression models demonstrated that talking to friends and family about political circumstances (aOR = 0.78, p = 0.01) was protective, and female sex (aOR = 1.76, p = 0.025), threat perception of future violence (aOR = 1.50, p = 0.002), poor general health (aOR = 1.39, p = 0.005), exposure to media (aOR = 1.37, p = 0.002), and loss of social resources (aOR = 1.71, p = 0.006) were predictive of incident cases of PTSD. A high incidence of PTSD was documented during a 6-month follow-up period among Palestinian residents of Gaza, the West Bank, and East Jerusalem. Interventions that promote health, increase social resources, and forestall their loss could potentially reduce the onset of PTSD in communities affected by violence.
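
    Adjusted odds ratios like those reported above are typically the exponentiated coefficients of a multivariable logistic regression. A minimal sketch with statsmodels, on simulated placeholder data (the variable names are illustrative, not the study's instruments), is shown below.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 643
df = pd.DataFrame({
    "female": rng.integers(0, 2, n),
    "resource_loss": rng.normal(0, 1, n),
    "talk_support": rng.integers(0, 2, n),
})
lin = -2.0 + 0.55 * df["female"] + 0.5 * df["resource_loss"] - 0.25 * df["talk_support"]
df["incident_ptsd"] = (rng.random(n) < 1 / (1 + np.exp(-lin))).astype(int)

fit = smf.logit("incident_ptsd ~ female + resource_loss + talk_support", df).fit()
summary = pd.concat([np.exp(fit.params).rename("aOR"),
                     np.exp(fit.conf_int())], axis=1)
print(summary)   # aORs > 1 indicate risk factors, < 1 protective factors
```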

  13. Patterns and Trends of Liver Cancer Incidence Rates in Eastern and Southeastern Asian Countries (1983-2007) and Predictions to 2030.

    Science.gov (United States)

    Wu, Jie; Yang, Shigui; Xu, Kaijin; Ding, Cheng; Zhou, Yuqing; Fu, Xiaofang; Li, Yiping; Deng, Min; Wang, Chencheng; Liu, Xiaoxiao; Li, Lanjuan

    2018-05-01

    We examined temporal trends in liver cancer incidence rates overall and by histological type from 1983 through 2007. We predict trends in liver cancer incidence rates through 2030 for selected Eastern and Southeastern Asian countries. Data on yearly liver cancer incident cases by age group and sex were drawn from 6 major selected Eastern and Southeastern Asian countries or regions with cancer registries available in the CI5plus database, including China, Japan, Hong Kong Special Administrative Region (SAR), the Philippines, Singapore, and Thailand. We also analyzed data for the United States and Australia for comparative purposes. Age-standardized incidence rates were calculated and plotted from 1983 through 2007. Numbers of new cases and incidence rates were predicted through 2030 by fitting and extrapolating age-period-cohort models. The incidence rates of liver cancer have been decreasing, and decreases will continue in all selected Eastern and Southeastern Asian countries, except for Thailand, whose liver cancer incidence rate will increase due to the increasing incidence rate of intrahepatic cholangiocarcinomas. Even though the incidence rates of liver cancer are predicted to decrease in most Eastern and Southeastern Asian countries, the burden, in terms of new cases, will continue to increase because of population growth and aging. Based on an analysis of data from cancer registries from Asian countries, incidence rates of liver cancer are expected to decrease through 2030 in most Eastern and Southeastern Asian countries. However, in Thailand, the incidence rate of intrahepatic cholangiocarcinomas is predicted to increase, so health education programs are necessary. Copyright © 2018 AGA Institute. Published by Elsevier Inc. All rights reserved.

  14. The paradox of high apolipoprotein A-I levels independently predicting incident type-2 diabetes among Turks.

    Science.gov (United States)

    Onat, A; Hergenç, G; Bulur, S; Uğur, M; Küçükdurmaz, Z; Can, G

    2010-06-25

    Predictive value of apolipoprotein (apo) A-I for incident hypertension, metabolic syndrome (MetS), type 2 diabetes (DM) and coronary heart disease (CHD) needs further exploration. A representative sample of Turkish adults was studied prospectively with this purpose. Sex-specific apoA-I tertiles were examined regarding cardiometabolic risk. A total of 1044 men and 1067 women (aged 49 ± 12 years at baseline) were followed up over 7.4 years. High serum apoA-I levels were significantly associated in multivariable analysis with female sex, aging, alcohol intake, (inversely) cigarette smoking and, in women, with systolic blood pressure. Risk of diabetes was predicted in logistic regression in both genders by the top versus bottom apoA-I tertile (RR 1.98; 95% CI 1.31-3.0), additive to age, body mass index (BMI), C-reactive protein (CRP), HDL-cholesterol and lipid-lowering drugs. By adding sex hormone-binding globulin to the model in a subset of the sample, the association between high apoA-I and incident diabetes was attenuated only in women. ApoA-I tertiles tended to be positively associated also with hypertension and CHD only in women, but this did not reach significance. High compared with low serum apoA-I levels nearly double the risk for incident diabetes, additively to age, BMI, CRP and HDL-cholesterol, among Turks. Systemic inflammation concomitant with prevailing MetS might turn apoA-I into proinflammatory particles. Copyright 2008 Elsevier Ireland Ltd. All rights reserved.

  15. Obtaining reliable Likelihood Ratio tests from simulated likelihood functions

    DEFF Research Database (Denmark)

    Andersen, Laura Mørch

    It is standard practice by researchers and the default option in many statistical programs to base test statistics for mixed models on simulations using asymmetric draws (e.g. Halton draws). This paper shows that when the estimated likelihood functions depend on standard deviations of mixed param...

  16. Does caries risk assessment predict the incidence of caries for special needs patients requiring general anesthesia?

    Science.gov (United States)

    Chang, Juhea; Kim, Hae-Young

    2014-11-01

    The aim of this study was to correlate the caries-related variables of special needs patients with the incidence of new caries. Data on socio-demographic information and dental and general health status were obtained from 110 patients treated under general anesthesia because of their insufficient co-operation. The Cariogram program was used for risk assessment, and other caries-related variables were also analyzed. Within a defined follow-up period (16.3 ± 9.5 months), 64 patients received dental examinations to assess newly developed caries. At baseline, the mean (SD) values of the DMFT (decayed, missing and filled teeth) and DT (decayed teeth) for the total patients were 9.2 (6.5) and 5.8 (5.3), respectively. During the follow-up period, new caries occurred in 48.4% of the patients and the mean (SD) value of the increased DMFT (iDMFT) was 2.1 (4.2). The patients with a higher increment of caries (iDMFT ≥3) showed significantly different caries risk profiles compared to the other patients (iDMFT <3). Past caries experience and inadequate oral hygiene maintenance were largely related to caries development in special needs patients.

  17. Predictors of incident heart failure in patients after an acute coronary syndrome: The LIPID heart failure risk-prediction model.

    Science.gov (United States)

    Driscoll, Andrea; Barnes, Elizabeth H; Blankenberg, Stefan; Colquhoun, David M; Hunt, David; Nestel, Paul J; Stewart, Ralph A; West, Malcolm J; White, Harvey D; Simes, John; Tonkin, Andrew

    2017-12-01

    Coronary heart disease is a major cause of heart failure. Availability of risk-prediction models that include both clinical parameters and biomarkers is limited. We aimed to develop such a model for prediction of incident heart failure. A multivariable risk-factor model was developed for prediction of first occurrence of heart failure death or hospitalization. A simplified risk score was derived that enabled subjects to be grouped into categories of 5-year risk, ranging up to >20%. Among 7101 patients from the LIPID study (84% male), with median age 61 years (interquartile range 55-67 years), 558 (8%) died or were hospitalized because of heart failure. Older age, history of claudication or diabetes mellitus, body mass index >30 kg/m2, LDL-cholesterol >2.5 mmol/L, heart rate >70 beats/min, white blood cell count, and the nature of the qualifying acute coronary syndrome (myocardial infarction or unstable angina) were associated with an increase in heart failure events. Coronary revascularization was associated with a lower event rate. Incident heart failure increased with higher concentrations of B-type natriuretic peptide >50 ng/L, cystatin C >0.93 nmol/L, D-dimer >273 nmol/L, high-sensitivity C-reactive protein >4.8 nmol/L, and sensitive troponin I >0.018 μg/L. Addition of biomarkers to the clinical risk model improved the model's C statistic from 0.73 to 0.77. The net reclassification improvement incorporating biomarkers into the clinical model using categories of 5-year risk was 23%. Adding a multibiomarker panel to conventional parameters markedly improved discrimination and risk classification for future heart failure events. Copyright © 2017 Elsevier Ireland Ltd. All rights reserved.
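
    The incremental-value comparison described above (clinical model versus clinical-plus-biomarker model) amounts to fitting two nested survival models and comparing their concordance. A hedged sketch with simulated stand-in data and the lifelines library (none of the numbers are LIPID's):

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(3)
n = 3000
df = pd.DataFrame({
    "age": rng.normal(61, 7, n),
    "diabetes": rng.integers(0, 2, n),
    "bnp": rng.lognormal(3.5, 0.6, n),   # B-type natriuretic peptide, ng/L (synthetic)
})
risk = 0.03 * (df["age"] - 61) + 0.5 * df["diabetes"] + 0.6 * np.log(df["bnp"] / 50)
t = rng.exponential(1 / (0.02 * np.exp(risk)))
df["time"] = np.minimum(t, 6.0)               # censor at ~6 years
df["hf_event"] = (t <= 6.0).astype(int)

clinical = CoxPHFitter().fit(df[["time", "hf_event", "age", "diabetes"]],
                             "time", "hf_event")
full = CoxPHFitter().fit(df, "time", "hf_event")   # adds the biomarker
print(clinical.concordance_index_, full.concordance_index_)  # expect a gain
```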

  18. Predictive Value of Triglyceride Glucose Index for the Risk of Incident Diabetes: A 4-Year Retrospective Longitudinal Study.

    Science.gov (United States)

    Lee, Da Young; Lee, Eun Seo; Kim, Ji Hyun; Park, Se Eun; Park, Cheol-Young; Oh, Ki-Won; Park, Sung-Woo; Rhee, Eun-Jung; Lee, Won-Young

    The Triglyceride Glucose Index (TyG index) is considered a surrogate marker of insulin resistance. The aim of this study is to investigate whether the TyG index has a predictive role in identifying individuals with a high risk of incident diabetes and to compare it with other indicators of metabolic health. A total of 2900 non-diabetic adults who attended five consecutive annual health check-ups at Kangbuk Samsung Hospital were divided into four subgroups using three methods: (1) baseline TyG index; (2) obesity status (body mass index ≥25 kg/m2) and cutoff value of TyG index; (3) obesity status and metabolic health, defined as having fewer than two of the five components of high blood pressure, fasting blood glucose, triglyceride, low high-density lipoprotein cholesterol, and highest decile of homeostasis model assessment-insulin resistance. The development of diabetes was assessed annually using a self-administered questionnaire, fasting glucose, and glycated hemoglobin. We compared the risk of incident diabetes using multivariate Cox analysis. During 11,623 person-years there were 101 cases of incident diabetes. Subjects with a high TyG index had a high risk of diabetes. For TyG index quartiles, hazard ratios (HRs) of quartiles 3 and 4 were 4.06 (p = 0.033) and 5.65 (p = 0.006), respectively. When the subjects were divided by obesity status and a TyG index cutoff value of 8.8, the subgroups with TyG index ≥ 8.8 regardless of obesity had a significantly high risk for diabetes (HR 2.40 [p = 0.024] and 2.25 [p = 0.048]). For obesity status and metabolic health, the two metabolically unhealthy subgroups regardless of obesity had a significantly high risk for diabetes (HRs 2.54 [p = 0.024] and 2.73 [p = 0.021]). In conclusion, the TyG index measured at a single time point may be an indicator of the risk for incident diabetes. The predictive value of the TyG index was comparable to that of metabolic health.

  19. Validation of a multi-marker model for the prediction of incident type 2 diabetes mellitus

    DEFF Research Database (Denmark)

    Lyssenko, Valeriya; Jørgensen, Torben; Gerwien, Robert W

    2012-01-01

    Purpose: To assess performance of a biomarker-based score that predicts the five-year risk of diabetes (Diabetes Risk Score, DRS) in an independent cohort that included 15-year follow-up. Method: DRS was developed on the Inter99 cohort, and validated on the Botnia cohort. Performance...... was benchmarked against other risk-assessment tools comparing calibration, time to event analysis, and net reclassification. Results: The area under the receiver-operating characteristic curve (AUC) was 0.84 for the Inter99 cohort and 0.78 for the Botnia cohort. In the Botnia cohort, DRS provided better...... discrimination than fasting plasma glucose (FPG), homeostasis model assessment of insulin resistance, oral glucose tolerance test or risk scores derived from Framingham or San Antonio Study cohorts. Overall reclassification with DRS was significantly better than using FPG and glucose tolerance status (p

  20. A systematic review of breast cancer incidence risk prediction models with meta-analysis of their performance.

    Science.gov (United States)

    Meads, Catherine; Ahmed, Ikhlaaq; Riley, Richard D

    2012-04-01

    A risk prediction model is a statistical tool for estimating the probability that a currently healthy individual with specific risk factors will develop a condition in the future such as breast cancer. Reliably accurate prediction models can inform future disease burdens, health policies and individual decisions. Breast cancer prediction models containing modifiable risk factors, such as alcohol consumption, BMI or weight, condom use, exogenous hormone use and physical activity, are of particular interest to women who might be considering how to reduce their risk of breast cancer and clinicians developing health policies to reduce population incidence rates. We performed a systematic review to identify and evaluate the performance of prediction models for breast cancer that contain modifiable factors. A protocol was developed and a sensitive search in databases including MEDLINE and EMBASE was conducted in June 2010. Extensive use was made of reference lists. Included were any articles proposing or validating a breast cancer prediction model in a general female population, with no language restrictions. Duplicate data extraction and quality assessment were conducted. Results were summarised qualitatively, and where possible meta-analysis of model performance statistics was undertaken. The systematic review found 17 breast cancer models, each containing a different but often overlapping set of modifiable and other risk factors, combined with an estimated baseline risk that was also often different. Quality of reporting was generally poor, with characteristics of included participants and fitted model results often missing. Only four models received independent validation in external data, most notably the 'Gail 2' model with 12 validations. None of the models demonstrated consistently outstanding ability to accurately discriminate between those who did and those who did not develop breast cancer. For example, random-effects meta-analyses of the performance of the

  1. Rib fractures predict incident limb fractures: results from the European prospective osteoporosis study.

    Science.gov (United States)

    Ismail, A A; Silman, A J; Reeve, J; Kaptoge, S; O'Neill, T W

    2006-01-01

    Population studies suggest that rib fractures are associated with a reduction in bone mass. While much is known about the predictive risk of hip, spine and distal forearm fracture on the risk of future fracture, little is known about the impact of rib fracture. The aim of this study was to determine whether a recalled history of rib fracture was associated with an increased risk of future limb fracture. Men and women aged 50 years and over were recruited from population registers in 31 European centres for participation in a screening survey of osteoporosis (European Prospective Osteoporosis Study). Subjects were invited to complete an interviewer-administered questionnaire that included questions about previous fractures including rib fracture, the age of their first fracture and also the level of trauma. Lateral spine radiographs were performed and the presence of vertebral deformity was determined morphometrically. Following the baseline survey, subjects were followed prospectively by annual postal questionnaire to determine the occurrence of clinical fractures. The subjects included 6,344 men, with a mean age of 64.2 years, and 6,788 women, with a mean age of 63.6 years, who were followed for a median of 3 years (range 0.4-5.9 years), of whom 135 men (2.3%) and 101 women (1.6%) reported a previous low trauma rib fracture. In total, 138 men and 391 women sustained a limb fracture during follow-up. In women, after age adjustment, those with a recalled history of low trauma rib fracture had an increased risk of sustaining 'any' limb fracture [relative hazard (RH)=2.3; 95% CI 1.3, 4.0]. When stratified by fracture type the predictive risk was more marked for hip (RH=7.7; 95% CI 2.3, 25.9) and humerus fracture (RH=4.5; 95% CI 1.4, 14.6) than other sites (RH=1.6; 95% CI 0.6, 4.3). Additional adjustment for prevalent vertebral deformity and previous (non-rib) low trauma fractures at other sites slightly reduced the strength of the association between rib fracture and

  2. Individual reactions to stress predict performance during a critical aviation incident.

    Science.gov (United States)

    Vine, Samuel J; Uiga, Liis; Lavric, Aureliu; Moore, Lee J; Tsaneva-Atanasova, Krasimira; Wilson, Mark R

    2015-01-01

    Understanding the influence of stress on human performance is of theoretical and practical importance. An individual's reaction to stress predicts their subsequent performance; with a "challenge" response to stress leading to better performance than a "threat" response. However, this contention has not been tested in truly stressful environments with highly skilled individuals. Furthermore, the effect of challenge and threat responses on attentional control during visuomotor tasks is poorly understood. Thus, this study aimed to examine individual reactions to stress and their influence on attentional control, among a cohort of commercial pilots performing a stressful flight assessment. Sixteen pilots performed an "engine failure on take-off" scenario, in a high-fidelity flight simulator. Reactions to stress were indexed via self-report; performance was assessed subjectively (flight instructor assessment) and objectively (simulator metrics); gaze behavior data were captured using a mobile eye tracker, and measures of attentional control were subsequently calculated (search rate, stimulus driven attention, and entropy). Hierarchical regression analyses revealed that a threat response was associated with poorer performance and disrupted attentional control. The findings add to previous research showing that individual reactions to stress influence performance and shed light on the processes through which stress influences performance.

  3. Poor Positive Predictive Value of Lyme Disease Serologic Testing in an Area of Low Disease Incidence.

    Science.gov (United States)

    Lantos, Paul M; Branda, John A; Boggan, Joel C; Chudgar, Saumil M; Wilson, Elizabeth A; Ruffin, Felicia; Fowler, Vance; Auwaerter, Paul G; Nigrovic, Lise E

    2015-11-01

    Lyme disease is diagnosed by 2-tiered serologic testing in patients with a compatible clinical illness, but the significance of positive test results in low-prevalence regions has not been investigated. We reviewed the medical records of patients who tested positive for Lyme disease with standardized 2-tiered serologic testing between 2005 and 2010 at a single hospital system in a region with little endemic Lyme disease. Based on clinical findings, we calculated the positive predictive value of Lyme disease serology. Next, we reviewed the outcome of serologic testing in patients with select clinical syndromes compatible with disseminated Lyme disease (arthritis, cranial neuropathy, or meningitis). During the 6-year study period 4723 patients were tested for Lyme disease, but only 76 (1.6%) had positive results by established laboratory criteria. Among 70 seropositive patients whose medical records were available for review, 12 (17%; 95% confidence interval, 9%-28%) were found to have Lyme disease (6 with documented travel to endemic regions). During the same time period, 297 patients with a clinical illness compatible with disseminated Lyme disease underwent 2-tiered serologic testing. Six of them (2%; 95% confidence interval, 0.7%-4.3%) were seropositive, 3 with documented travel and 1 who had an alternative diagnosis that explained the clinical findings. In this low-prevalence cohort, fewer than 20% of positive Lyme disease tests are obtained from patients with clinically likely Lyme disease. Positive Lyme disease test results may have little diagnostic value in this setting. © The Author 2015. Published by Oxford University Press on behalf of the Infectious Diseases Society of America. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
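
    The low-prevalence effect this record documents follows directly from Bayes' rule: with illustrative (assumed, not measured) sensitivity and specificity, positive predictive value collapses as prevalence falls.

```python
# Positive predictive value from Bayes' rule; the sensitivity/specificity
# values are illustrative assumptions, not measured properties of
# 2-tiered Lyme serology.
def ppv(sensitivity, specificity, prevalence):
    tp = sensitivity * prevalence                 # true-positive mass
    fp = (1 - specificity) * (1 - prevalence)     # false-positive mass
    return tp / (tp + fp)

for prev in (0.30, 0.01):
    print(f"prevalence {prev:.0%}: PPV = {ppv(0.90, 0.98, prev):.2f}")
# prevalence 30%: PPV = 0.95 ; prevalence 1%: PPV = 0.31
```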

  4. Predictive factors for and incidence of hospital readmissions of patients with acute and chronic pancreatitis.

    Science.gov (United States)

    Suchsland, Till; Aghdassi, Ali; Kühn, Kristina; Simon, Peter; Lerch, Markus M; Mayerle, Julia; Flessa, Steffen

    2015-01-01

    Acute and chronic pancreatitis are common gastroenterological disorders that have a fairly unpredictable long-term course often associated with unplanned hospital readmissions. Little is known about the factors that increase or decrease the risk for a hospital readmission. The aim of this study was to identify positive and negative predictive factors for hospital readmissions of patients with acute and chronic pancreatitis after in-hospital treatment. In a retrospective analysis data from the hospital information and reimbursement data system (HIS) were evaluated for 606 hospital stays for either acute or chronic pancreatitis between 2006 and 2011. Additional clinical data were obtained from a questionnaire covering quality of life and socio-economic status. A total of 973 patient variables were assessed by bivariate and multivariate analysis. Between 2006 and 2011, 373 patients were admitted for acute or chronic pancreatitis; 107 patients of them were readmitted and 266 had only one hospitalization. Predictors for readmission were concomitant liver disease, presence of a pseudocyst or a suspected tumor of the pancreas as well as alcohol, tobacco or substance abuse or coexisting mental disorders. Patients who had undergone a CT-scan were more susceptible to readmission. Lower readmissions rates were found in patients with diabetes mellitus or gallstone disease as co-morbidity. While factors like age and severity of the initial disease cannot be influenced to reduce the readmission rate for pancreatitis, variables like alcohol, tobacco and drug abuse can be addressed in outpatient programs to reduce disease recurrence and readmission rates for pancreatitis. Copyright © 2015 IAP and EPC. Published by Elsevier B.V. All rights reserved.

  5. In 'big bang' major incidents do triage tools accurately predict clinical priority?: a systematic review of the literature.

    Science.gov (United States)

    Kilner, T M; Brace, S J; Cooke, M W; Stallard, N; Bleetman, A; Perkins, G D

    2011-05-01

    The term "big bang" major incidents is used to describe sudden, usually traumatic,catastrophic events, involving relatively large numbers of injured individuals, where demands on clinical services rapidly outstrip the available resources. Triage tools support the pre-hospital provider to prioritise which patients to treat and/or transport first based upon clinical need. The aim of this review is to identify existing triage tools and to determine the extent to which their reliability and validity have been assessed. A systematic review of the literature was conducted to identify and evaluate published data validating the efficacy of the triage tools. Studies using data from trauma patients that report on the derivation, validation and/or reliability of the specific pre-hospital triage tools were eligible for inclusion.Purely descriptive studies, reviews, exercises or reports (without supporting data) were excluded. The search yielded 1982 papers. After initial scrutiny of title and abstract, 181 papers were deemed potentially applicable and from these 11 were identified as relevant to this review (in first figure). There were two level of evidence one studies, three level of evidence two studies and six level of evidence three studies. The two level of evidence one studies were prospective validations of Clinical Decision Rules (CDR's) in children in South Africa, all the other studies were retrospective CDR derivation, validation or cohort studies. The quality of the papers was rated as good (n=3), fair (n=7), poor (n=1). There is limited evidence for the validity of existing triage tools in big bang major incidents.Where evidence does exist it focuses on sensitivity and specificity in relation to prediction of trauma death or severity of injury based on data from single or small number patient incidents. The Sacco system is unique in combining survivability modelling with the degree by which the system is overwhelmed in the triage decision system. The

  6. Incremental Predictive Value of Serum AST-to-ALT Ratio for Incident Metabolic Syndrome: The ARIRANG Study

    Science.gov (United States)

    Ahn, Song Vogue; Baik, Soon Koo; Cho, Youn zoo; Koh, Sang Baek; Huh, Ji Hye; Chang, Yoosoo; Sung, Ki-Chul; Kim, Jang Young

    2016-01-01

    Aims The ratio of aspartate aminotransferase (AST) to alanine aminotransferase (ALT) is of great interest as a possible novel marker of metabolic syndrome. However, longitudinal studies emphasizing the incremental predictive value of the AST-to-ALT ratio in diagnosing individuals at higher risk of developing metabolic syndrome are very scarce. Therefore, our study aimed to evaluate the AST-to-ALT ratio as an incremental predictor of new onset metabolic syndrome in a population-based cohort study. Material and Methods The population-based cohort study included 2276 adults (903 men and 1373 women) aged 40–70 years, who participated from 2005–2008 (baseline) without metabolic syndrome and were followed up from 2008–2011. Metabolic syndrome was defined according to the harmonized definition of metabolic syndrome. Serum concentrations of AST and ALT were determined by enzymatic methods. Results During an average follow-up period of 2.6 years, 395 individuals (17.4%) developed metabolic syndrome. In a multivariable adjusted model, the odds ratio (95% confidence interval) for new onset of metabolic syndrome, comparing the fourth quartile to the first quartile of the AST-to-ALT ratio, was 0.598 (0.422–0.853). The AST-to-ALT ratio also improved the area under the receiver operating characteristic curve (AUC) for predicting new cases of metabolic syndrome (0.715 vs. 0.732, P = 0.004). The net reclassification improvement of prediction models including the AST-to-ALT ratio was 0.23 (95% CI: 0.124–0.337). The AST-to-ALT ratio was thus inversely associated with incident metabolic syndrome and had incremental predictive value for incident metabolic syndrome. PMID:27560931
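
    The net reclassification improvement quoted above can be computed by cross-classifying predicted risk categories from the two models among events and non-events separately. Below is a minimal sketch with hypothetical risk thresholds and simulated data, not the study's models or cut points.

```python
import numpy as np

def categorical_nri(risk_old, risk_new, events, cuts=(0.1, 0.2)):
    """Categorical NRI: reward events moving up a risk category and
    non-events moving down; thresholds here are arbitrary examples."""
    cat_old = np.digitize(risk_old, cuts)
    cat_new = np.digitize(risk_new, cuts)
    up, down = cat_new > cat_old, cat_new < cat_old
    e = events.astype(bool)
    nri_events = up[e].mean() - down[e].mean()
    nri_nonevents = down[~e].mean() - up[~e].mean()
    return nri_events + nri_nonevents

rng = np.random.default_rng(5)
events = rng.integers(0, 2, 1000)
risk_old = np.clip(0.15 + 0.05 * events + rng.normal(0, 0.08, 1000), 0, 1)
risk_new = np.clip(risk_old + 0.04 * (2 * events - 1), 0, 1)  # sharper model
print(categorical_nri(risk_old, risk_new, events))
```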

  7. METS-IR, a novel score to evaluate insulin sensitivity, is predictive of visceral adiposity and incident type 2 diabetes.

    Science.gov (United States)

    Bello-Chavolla, Omar Yaxmehen; Almeda-Valdes, Paloma; Gomez-Velasco, Donaji; Viveros-Ruiz, Tannia; Cruz-Bautista, Ivette; Romo-Romo, Alonso; Sánchez-Lázaro, Daniel; Meza-Oviedo, Dushan; Vargas-Vázquez, Arsenio; Campos, Olimpia Arellano; Sevilla-González, Magdalena Del Rocío; Martagón, Alexandro J; Hernández, Liliana Muñoz; Mehta, Roopa; Caballeros-Barragán, César Rodolfo; Aguilar-Salinas, Carlos A

    2018-05-01

    We developed a novel non-insulin-based fasting score to evaluate insulin sensitivity, validated against the euglycemic-hyperinsulinemic clamp (EHC). We also evaluated its correlation with ectopic fat accumulation and its capacity to predict incident type 2 diabetes mellitus (T2D). The discovery sample was composed of 125 subjects (57 without and 68 with T2D) who underwent an EHC. We defined METS-IR as (Ln(2*G0 + TG0) * BMI)/(Ln(HDL-c)) (G0: fasting glucose, TG0: fasting triglycerides, BMI: body mass index, HDL-c: high-density lipoprotein cholesterol), and compared its diagnostic performance against the M-value adjusted by fat-free mass (MFFM) obtained by an EHC. METS-IR was validated in a sample with EHC data, a sample with modified frequently sampled intravenous glucose tolerance test (FSIVGTT) data and a large cohort against HOMA-IR. We evaluated the correlation of the score with intrahepatic and intrapancreatic fat measured using magnetic resonance spectroscopy. Subsequently, we evaluated its ability to predict incident T2D cases in a prospective validation cohort of 6144 subjects. METS-IR demonstrated a better correlation with the MFFM (ρ = -0.622, P < 0.001) than the insulin sensitivity index obtained from the FSIVGTT (AUC: 0.67, 95% CI: 0.53-0.81). METS-IR significantly correlated with intravisceral, intrahepatic and intrapancreatic fat and fasting insulin levels (P < 0.001). Subjects with METS-IR in the highest quartile (>50.39) had the highest adjusted risk of developing T2D (HR: 3.91, 95% CI: 2.25-6.81). Furthermore, subjects with incident T2D had higher baseline METS-IR compared to healthy controls (50.2 ± 10.2 vs 44.7 ± 9.2, P < 0.001). METS-IR is a novel score to evaluate cardiometabolic risk in healthy and at-risk subjects and a promising tool for screening of insulin sensitivity. © 2018 European Society of Endocrinology.
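
    Taking the definition above at face value, METS-IR is straightforward to compute from the four fasting measurements; a minimal sketch (units as in the study: glucose, triglycerides and HDL-c in mg/dL, BMI in kg/m2):

        import math

        def mets_ir(g0, tg0, bmi, hdl):
            """METS-IR = Ln(2*G0 + TG0) * BMI / Ln(HDL-c), per the definition above."""
            return math.log(2 * g0 + tg0) * bmi / math.log(hdl)

        # e.g. fasting glucose 95, triglycerides 140, BMI 27, HDL-c 45 -> ~41.1,
        # which falls below the >50.39 high-risk cut-off reported above
        print(round(mets_ir(95, 140, 27, 45), 1))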

  8. Incorporating Nuisance Parameters in Likelihoods for Multisource Spectra

    CERN Document Server

    Conway, J.S.

    2011-01-01

    We describe here the general mathematical approach to constructing likelihoods for fitting observed spectra in one or more dimensions with multiple sources, including the effects of systematic uncertainties represented as nuisance parameters, when the likelihood is to be maximized with respect to these parameters. We consider three types of nuisance parameters: simple multiplicative factors, source spectra "morphing" parameters, and parameters representing statistical uncertainties in the predicted source spectra.
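
    As a toy illustration of the construction described (not the paper's code), the sketch below maximizes a binned Poisson likelihood over a signal strength and one multiplicative nuisance parameter constrained by a unit-Gaussian penalty; all numbers are made up:

        # Binned Poisson likelihood with a multiplicative nuisance parameter theta
        # (scales the background; constrained by a unit-Gaussian penalty term).
        import numpy as np
        from scipy.optimize import minimize
        from scipy.stats import norm, poisson

        signal = np.array([1.0, 3.0, 2.0])       # predicted signal per bin (made up)
        background = np.array([10.0, 8.0, 6.0])  # predicted background per bin
        observed = np.array([12, 13, 7])         # observed counts

        def nll(params):
            mu, theta = params                   # signal strength, nuisance parameter
            expected = mu * signal + background * (1.0 + 0.1 * theta)  # 10% bkg uncertainty
            return -(poisson.logpmf(observed, expected).sum() + norm.logpdf(theta))

        fit = minimize(nll, x0=[1.0, 0.0], method="L-BFGS-B",
                       bounds=[(0.0, 10.0), (-5.0, 5.0)])
        print("best-fit (mu, theta):", fit.x)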

  9. Ego involvement increases doping likelihood.

    Science.gov (United States)

    Ring, Christopher; Kavussanu, Maria

    2018-08-01

    Achievement goal theory provides a framework to help understand how individuals behave in achievement contexts, such as sport. Evidence concerning the role of motivation in the decision to use banned performance enhancing substances (i.e., doping) is equivocal on this issue. The extant literature shows that dispositional goal orientation has been weakly and inconsistently associated with doping intention and use. It is possible that goal involvement, which describes the situational motivational state, is a stronger determinant of doping intention. Accordingly, the current study used an experimental design to examine the effects of goal involvement, manipulated using direct instructions and reflective writing, on doping likelihood in hypothetical situations in college athletes. The ego-involving goal increased doping likelihood compared to no goal and a task-involving goal. The present findings provide the first evidence that ego involvement can sway the decision to use doping to improve athletic performance.

  10. Timing and Magnitude of Initial Change in Disease Activity Score 28 Predicts the Likelihood of Achieving Low Disease Activity at 1 Year in Rheumatoid Arthritis Patients Treated with Certolizumab Pegol: A Post-hoc Analysis of the RAPID 1 Trial

    NARCIS (Netherlands)

    van der Heijde, Désirée; Keystone, Edward C.; Curtis, Jeffrey R.; Landewé, Robert B.; Schiff, Michael H.; Khanna, Dinesh; Kvien, Tore K.; Ionescu, Lucian; Gervitz, Leon M.; Davies, Owen R.; Luijtens, Kristel; Furst, Daniel E.

    2012-01-01

    Objective. To determine the relationship between timing and magnitude of Disease Activity Score [DAS28(ESR)] nonresponse (DAS28 improvement thresholds not reached) during the first 12 weeks of treatment with certolizumab pegol (CZP) plus methotrexate, and the likelihood of achieving low disease

  11. The prediction of the incidence rate of upper limb musculoskeletal disorders, with CTD risk index method on potters of Meybod city

    Directory of Open Access Journals (Sweden)

    Reza Khani Jazani

    2012-02-01

    Full Text Available Background: The objective of this study was to predict the incidence of musculoskeletal disorders in potters of Meybod city using the CTD risk index method. Materials and Methods: This is a descriptive cross-sectional study. The target population was all workers in pottery workshops located in Meybod. Information on musculoskeletal disorders was obtained with the Nordic questionnaire, and the CTD risk index method was used to predict the incidence of musculoskeletal disorders. Results: We observed that 59.3% of the potters had symptoms of musculoskeletal disorders in at least one of their upper extremities. There was also a significant difference in mean CTD risk index between potters with and without symptoms of upper limb musculoskeletal disorders (p=0.038). Conclusion: The CTD risk index method can serve as a suitable method for predicting the incidence of musculoskeletal disorders in potters.

  12. Baseline and changes in serum uric acid independently predict 11-year incidence of metabolic syndrome among community-dwelling women.

    Science.gov (United States)

    Kawamoto, R; Ninomiya, D; Kasai, Y; Senzaki, K; Kusunoki, T; Ohtsuka, N; Kumagi, T

    2018-02-19

    Metabolic syndrome (MetS) is associated with an increased risk of major cardiovascular events. In women, increased serum uric acid (SUA) levels are associated with MetS and its components. However, whether baseline and changes in SUA predict incidence of MetS and its components remains unclear. The subjects comprised 407 women aged 71 ± 8 years from a rural village. We identified participants who had undergone a similar examination 11 years earlier, and examined the relationship between baseline and changes in SUA and MetS based on the modified criteria of the National Cholesterol Education Program's Adult Treatment Panel (NCEP-ATP) III report. Of these subjects, 83 (20.4%) women at baseline and 190 (46.7%) women at follow-up had MetS. Multiple linear regression analysis was performed to evaluate the contribution of each confounding factor for MetS; both baseline and changes in SUA, as well as history of cardiovascular disease, low-density lipoprotein cholesterol, and estimated glomerular filtration rate (eGFR), were independently and significantly associated with the number of MetS components during the 11-year follow-up. The adjusted odds ratios (ORs) (95% confidence interval) for incident MetS across tertiles of baseline SUA and changes in SUA were 1.00, 1.47 (0.82-2.65), and 3.11 (1.66-5.83), and 1.00, 1.88 (1.03-3.40), and 2.49 (1.38-4.47), respectively. In addition, the combined effect of increased baseline and changes in SUA was also a significant and independent determinant of the accumulation of MetS components (F = 20.29, p < 0.001), independent of baseline MetS. These results suggest that combined assessment of baseline and changes in SUA levels provides additional information for predicting incident MetS, independent of other confounding factors, in community-dwelling women.

  13. Estimating likelihood of future crashes for crash-prone drivers

    Directory of Open Access Journals (Sweden)

    Subasish Das

    2015-06-01

    Full Text Available At-fault crash-prone drivers are usually considered the high-risk group for possible future incidents or crashes. In Louisiana, 34% of crashes are repeatedly committed by at-fault crash-prone drivers, who represent only 5% of the total licensed drivers in the state. This research conducted an exploratory data analysis based on driver faultiness and proneness. The objective of this study is to develop a crash prediction model to estimate the likelihood of future crashes for at-fault drivers. The logistic regression method is used, employing eight years of traffic crash data (2004–2011) in Louisiana. Crash predictors such as the driver's crash involvement, crash and road characteristics, human factors, collision type, and environmental factors are considered in the model. The at-fault and not-at-fault status of the crashes is used as the response variable. The developed model identified a few important variables and correctly classifies at-fault crashes at rates of up to 62.40%, with a specificity of 77.25%. The model can identify as many as 62.40% of the crash incidences of at-fault drivers in the upcoming year. Traffic agencies can use the model for monitoring the performance of at-fault crash-prone drivers and making roadway improvements meant to reduce crash proneness. From the findings, it is recommended that crash-prone drivers be targeted regularly for special safety programs through education and regulations.
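
    The modelling step here is a standard binary logistic regression; a minimal sketch of the fit and the classification rates quoted above, using a purely hypothetical feature matrix:

        # Logistic regression on synthetic stand-ins for crash/driver/road features.
        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import confusion_matrix

        rng = np.random.default_rng(1)
        X = rng.normal(size=(1000, 5))
        true_beta = np.array([0.8, -0.5, 0.3, 0.0, 0.2])
        y = rng.binomial(1, 1 / (1 + np.exp(-X @ true_beta)))   # at-fault indicator

        model = LogisticRegression().fit(X, y)
        tn, fp, fn, tp = confusion_matrix(y, model.predict(X)).ravel()
        print(f"sensitivity={tp / (tp + fn):.3f}, specificity={tn / (tn + fp):.3f}")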

  14. Maximum likelihood of phylogenetic networks.

    Science.gov (United States)

    Jin, Guohua; Nakhleh, Luay; Snir, Sagi; Tuller, Tamir

    2006-11-01

    Horizontal gene transfer (HGT) is believed to be ubiquitous among bacteria and plays a major role in their genome diversification as well as their ability to develop resistance to antibiotics. In light of its evolutionary significance and implications for human health, developing accurate and efficient methods for detecting and reconstructing HGT is imperative. In this article we provide a new HGT-oriented likelihood framework for many problems that involve phylogeny-based HGT detection and reconstruction. Besides the formulation of various likelihood criteria, we show that most of these problems are NP-hard, and offer heuristics for efficient and accurate reconstruction of HGT under these criteria. We implemented our heuristics and used them to analyze biological as well as synthetic data. In both cases, our criteria and heuristics exhibited very good performance with respect to identifying the correct number of HGT events as well as inferring their correct location on the species tree. Implementations of the criteria and heuristics, as well as hardness proofs, are available from the authors upon request. Hardness proofs can also be downloaded at http://www.cs.tau.ac.il/~tamirtul/MLNET/Supp-ML.pdf

  15. A maximum likelihood framework for protein design

    Directory of Open Access Journals (Sweden)

    Philippe Hervé

    2006-06-01

    Full Text Available Background: The aim of protein design is to predict amino-acid sequences compatible with a given target structure. Traditionally envisioned as a purely thermodynamic question, this problem can also be understood in a wider context, where additional constraints are captured by learning the sequence patterns displayed by natural proteins of known conformation. In this latter perspective, however, we still need a theoretical formalization of the question, leading to general and efficient learning methods, and allowing for the selection of fast and accurate objective functions quantifying sequence/structure compatibility. Results: We propose a formulation of the protein design problem in terms of model-based statistical inference. Our framework uses the maximum likelihood principle to optimize the unknown parameters of a statistical potential, which we call an inverse potential to contrast with classical potentials used for structure prediction. We propose an implementation based on Markov chain Monte Carlo, in which the likelihood is maximized by gradient descent and is numerically estimated by thermodynamic integration. The fit of the models is evaluated by cross-validation. We apply this to a simple pairwise contact potential, supplemented with a solvent-accessibility term, and show that the resulting models have a better predictive power than currently available pairwise potentials. Furthermore, the model comparison method presented here allows one to measure the relative contribution of each component of the potential, and to choose the optimal number of accessibility classes, which turns out to be much higher than classically considered. Conclusion: Altogether, this reformulation makes it possible to test a wide diversity of models, using different forms of potentials, or accounting for other factors than just the constraint of thermodynamic stability. Ultimately, such model-based statistical analyses may help to understand the forces

  16. Predicted risks of second malignant neoplasm incidence and mortality due to secondary neutrons in a girl and boy receiving proton craniospinal irradiation

    International Nuclear Information System (INIS)

    Taddei, Phillip J; Mirkovic, Dragan; Zhang Rui; Giebeler, Annelise; Harvey, Mark; Newhauser, Wayne D; Mahajan, Anita; Kornguth, David; Woo, Shiao

    2010-01-01

    The purpose of this study was to compare the predicted risks of second malignant neoplasm (SMN) incidence and mortality from secondary neutrons for a 9-year-old girl and a 10-year-old boy who received proton craniospinal irradiation (CSI). SMN incidence and mortality from neutrons were predicted from equivalent doses to radiosensitive organs for cranial, spinal and intracranial boost fields. Therapeutic proton absorbed dose and equivalent dose from neutrons were calculated using Monte Carlo simulations. Risks of SMN incidence and mortality in most organs and tissues were predicted by applying risk models from the National Research Council of the National Academies to the equivalent dose from neutrons; for non-melanoma skin cancer, risk models from the International Commission on Radiological Protection were applied. The lifetime absolute risks of SMN incidence due to neutrons were 14.8% and 8.5% for the girl and boy, respectively. The risks of a fatal SMN were 5.3% and 3.4% for the girl and boy, respectively. The girl had a greater risk for any SMN except colon and liver cancers, indicating that the girl's higher risks were not attributable solely to greater susceptibility to breast cancer. Lung cancer was the predominant contributor to the risk of SMN mortality for both patients. This study suggests that the risks of SMN incidence and mortality from neutrons may be greater for girls than for boys treated with proton CSI.

  17. Incidence, Prognostic Impact, and Predictive Factors of Readmission for Heart Failure After Transcatheter Aortic Valve Replacement.

    Science.gov (United States)

    Durand, Eric; Doutriaux, Maxime; Bettinger, Nicolas; Tron, Christophe; Fauvel, Charles; Bauer, Fabrice; Dacher, Jean-Nicolas; Bouhzam, Najime; Litzler, Pierre-Yves; Cribier, Alain; Eltchaninoff, Hélène

    2017-12-11

    The aim of this study was to assess the incidence, prognostic impact, and predictive factors of readmission for congestive heart failure (CHF) in patients with severe aortic stenosis treated by transcatheter aortic valve replacement (TAVR). TAVR is indicated in patients with severe symptomatic aortic stenosis in whom surgery is considered high risk or is contraindicated. Readmission for CHF after TAVR remains a challenge, and data on prognostic and predictive factors are lacking. All patients who underwent TAVR from January 2010 to December 2014 were included. Follow-up was achieved for at least 1 year and included clinical and echocardiographic data. Readmission for CHF was analyzed retrospectively. This study included 546 patients, 534 (97.8%) of whom received balloon-expandable valves, implanted preferentially via the transfemoral approach (87.8% of cases). After 1 year, 285 patients (52.2%) had been readmitted at least once, 132 (24.1%) of them for CHF. Patients readmitted for CHF had an increased risk for death (p < 0.0001) and cardiac death (p < 0.0001) compared with those not readmitted for CHF. On multivariate analysis, aortic mean gradient (hazard ratio [HR]: 0.88; 95% confidence interval [CI]: 0.79 to 0.99; p = 0.03), post-procedural blood transfusion (HR: 2.27; 95% CI: 1.13 to 5.56; p = 0.009), severe post-procedural pulmonary hypertension (HR: 1.04; 95% CI: 1.00 to 1.07; p < 0.0001), and left atrial diameter (HR: 1.47; 95% CI: 1.08 to 2.01; p = 0.02) were independently associated with CHF readmission at 1 year. Readmission for CHF after TAVR was frequent and strongly associated with 1-year mortality. Low gradient, persistent pulmonary hypertension, left atrial dilatation, and transfusions were predictive of readmission for CHF. Copyright © 2017 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.

  18. A Scoring Tool to Identify East African HIV-1 Serodiscordant Partnerships with a High Likelihood of Pregnancy.

    Directory of Open Access Journals (Sweden)

    Renee Heffron

    Full Text Available HIV-1 prevention programs targeting HIV-1 serodiscordant couples need to identify couples that are likely to become pregnant to facilitate discussions about methods to minimize HIV-1 risk during pregnancy attempts (i.e. safer conception) or effective contraception when pregnancy is unintended. A clinical prediction tool could be used to identify HIV-1 serodiscordant couples with a high likelihood of pregnancy within one year. Using standardized clinical prediction methods, we developed and validated a tool to identify heterosexual East African HIV-1 serodiscordant couples with an increased likelihood of becoming pregnant in the next year. Datasets were from three prospectively followed cohorts, including nearly 7,000 couples from Kenya and Uganda participating in HIV-1 prevention trials and delivery projects. The final score encompassed the age of the woman, the woman's number of children living, partnership duration, having had condomless sex in the past month, and non-use of an effective contraceptive. The area under the curve (AUC) for the probability of the score to correctly predict pregnancy was 0.74 (95% CI 0.72-0.76). Scores ≥ 7 predicted a pregnancy incidence of >17% per year and captured 78% of the pregnancies. Internal and external validation confirmed the predictive ability of the score. A pregnancy likelihood score encompassing basic demographic, clinical and behavioral factors defined African HIV-1 serodiscordant couples with high one-year pregnancy incidence rates. This tool could be used to engage African HIV-1 serodiscordant couples in counseling discussions about fertility intentions in order to offer services for safer conception or contraception that align with their reproductive goals.

  19. A Scoring Tool to Identify East African HIV-1 Serodiscordant Partnerships with a High Likelihood of Pregnancy.

    Science.gov (United States)

    Heffron, Renee; Cohen, Craig R; Ngure, Kenneth; Bukusi, Elizabeth; Were, Edwin; Kiarie, James; Mugo, Nelly; Celum, Connie; Baeten, Jared M

    2015-01-01

    HIV-1 prevention programs targeting HIV-1 serodiscordant couples need to identify couples that are likely to become pregnant to facilitate discussions about methods to minimize HIV-1 risk during pregnancy attempts (i.e. safer conception) or effective contraception when pregnancy is unintended. A clinical prediction tool could be used to identify HIV-1 serodiscordant couples with a high likelihood of pregnancy within one year. Using standardized clinical prediction methods, we developed and validated a tool to identify heterosexual East African HIV-1 serodiscordant couples with an increased likelihood of becoming pregnant in the next year. Datasets were from three prospectively followed cohorts, including nearly 7,000 couples from Kenya and Uganda participating in HIV-1 prevention trials and delivery projects. The final score encompassed the age of the woman, woman's number of children living, partnership duration, having had condomless sex in the past month, and non-use of an effective contraceptive. The area under the curve (AUC) for the probability of the score to correctly predict pregnancy was 0.74 (95% CI 0.72-0.76). Scores ≥ 7 predicted a pregnancy incidence of >17% per year and captured 78% of the pregnancies. Internal and external validation confirmed the predictive ability of the score. A pregnancy likelihood score encompassing basic demographic, clinical and behavioral factors defined African HIV-1 serodiscordant couples with high one-year pregnancy incidence rates. This tool could be used to engage African HIV-1 serodiscordant couples in counseling discussions about fertility intentions in order to offer services for safer conception or contraception that align with their reproductive goals.
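
    The published score assigns points to the five predictors listed above; the exact point values are given in the original paper, so the weights in this sketch are placeholders only, chosen to show how such a score would be applied in practice:

        def pregnancy_likelihood_score(age, n_children, partnership_years,
                                       condomless_sex_past_month, effective_contraception):
            """Toy version of the couple-level score; point values are PLACEHOLDERS,
            not the published weights. Scores >= 7 flagged high likelihood per the paper."""
            score = 0
            score += 3 if age < 25 else (1 if age < 35 else 0)
            score += 2 if n_children == 0 else 0
            score += 2 if partnership_years < 2 else 0
            score += 2 if condomless_sex_past_month else 0
            score += 3 if not effective_contraception else 0
            return score

        print(pregnancy_likelihood_score(23, 0, 1, True, False))  # -> 12 (high likelihood)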

  20. The predictive value of current haemoglobin levels for incident tuberculosis and/or mortality during long-term antiretroviral therapy in South Africa: a cohort study

    NARCIS (Netherlands)

    Kerkhoff, Andrew D.; Wood, Robin; Cobelens, Frank G.; Gupta-Wright, Ankur; Bekker, Linda-Gail; Lawn, Stephen D.

    2015-01-01

    Low haemoglobin concentrations may be predictive of incident tuberculosis (TB) and death in HIV-infected patients receiving antiretroviral therapy (ART), but data are limited and inconsistent. We examined these relationships retrospectively in a long-term South African ART cohort with multiple

  1. Elevated HbA1c and Fasting Plasma Glucose in Predicting Diabetes Incidence Among Older Adults

    Science.gov (United States)

    Lipska, Kasia J.; Inzucchi, Silvio E.; Van Ness, Peter H.; Gill, Thomas M.; Kanaya, Alka; Strotmeyer, Elsa S.; Koster, Annemarie; Johnson, Karen C.; Goodpaster, Bret H.; Harris, Tamara; De Rekeneire, Nathalie

    2013-01-01

    OBJECTIVE To determine which measures—impaired fasting glucose (IFG), elevated HbA1c, or both—best predict incident diabetes in older adults. RESEARCH DESIGN AND METHODS From the Health, Aging, and Body Composition study, we selected individuals without diabetes, and we defined IFG (100–125 mg/dL) and elevated HbA1c (5.7–6.4%) per American Diabetes Association guidelines. Incident diabetes was based on self-report, use of antihyperglycemic medicines, or HbA1c ≥6.5% during 7 years of follow-up. Logistic regression analyses were adjusted for age, sex, race, site, BMI, smoking, blood pressure, and physical activity. Discrimination and calibration were assessed for models with IFG and with both IFG and elevated HbA1c. RESULTS Among 1,690 adults (mean age 76.5, 46% men, 32% black), 183 (10.8%) developed diabetes over 7 years. Adjusted odds ratios of diabetes were 6.2 (95% CI 4.4–8.8) in those with IFG (versus those with fasting plasma glucose [FPG] <100 mg/dL) and were also substantially raised in those with elevated HbA1c (versus those with HbA1c <5.7%). When IFG and elevated HbA1c were considered together, odds ratios were 3.5 (1.9–6.3) in those with IFG only, 8.0 (4.8–13.2) in those with elevated HbA1c only, and 26.2 (16.3–42.1) in those with both IFG and elevated HbA1c (versus those with normal FPG and HbA1c). Addition of elevated HbA1c to the model with IFG resulted in improved discrimination and calibration. CONCLUSIONS Older adults with both IFG and elevated HbA1c have a substantially increased odds of developing diabetes over 7 years. Combined screening with FPG and HbA1c may identify older adults at very high risk for diabetes. PMID:24135387

  2. Dissociating response conflict and error likelihood in anterior cingulate cortex.

    Science.gov (United States)

    Yeung, Nick; Nieuwenhuis, Sander

    2009-11-18

    Neuroimaging studies consistently report activity in anterior cingulate cortex (ACC) in conditions of high cognitive demand, leading to the view that ACC plays a crucial role in the control of cognitive processes. According to one prominent theory, the sensitivity of ACC to task difficulty reflects its role in monitoring for the occurrence of competition, or "conflict," between responses to signal the need for increased cognitive control. However, a contrasting theory proposes that ACC is the recipient rather than source of monitoring signals, and that ACC activity observed in relation to task demand reflects the role of this region in learning about the likelihood of errors. Response conflict and error likelihood are typically confounded, making the theories difficult to distinguish empirically. The present research therefore used detailed computational simulations to derive contrasting predictions regarding ACC activity and error rate as a function of response speed. The simulations demonstrated a clear dissociation between conflict and error likelihood: fast response trials are associated with low conflict but high error likelihood, whereas slow response trials show the opposite pattern. Using the N2 component as an index of ACC activity, an EEG study demonstrated that when conflict and error likelihood are dissociated in this way, ACC activity tracks conflict and is negatively correlated with error likelihood. These findings support the conflict-monitoring theory and suggest that, in speeded decision tasks, ACC activity reflects current task demands rather than the retrospective coding of past performance.

  3. Predictions of lung cancer based on county averages for indoor radon versus the historic incidence of regional lung cancer

    International Nuclear Information System (INIS)

    Mose, D.G.; Chrosniak, C.E.; Mushrush, G.W.

    1992-01-01

    After a decade of effort to determine the health risk associated with indoor radon, the efforts of the US Environmental Protection Agency have prevailed in the US, and 4 pCi/L is commonly used as an Action Level. Proposals by other groups supporting lower or higher Action Levels have failed, largely due to the paucity of information supporting any particular level of indoor radon. The authors' studies have compared indoor radon for zip-code- and county-size areas with parameters such as geology, precipitation and home construction. Their attempts to verify the relative levels of lung cancer using US-EPA estimates of radon-vs-cancer risk have not been supportive of the EPA risk estimates. In general, when they compare the number of lung cancer cases in particular geological or geographical areas with the indoor radon levels in those areas, they find the EPA-predicted number of lung cancer cases to exceed the total number of lung cancer cases from all causes. Comparisons show a correlation between the incidence of lung cancer and indoor radon, but the level of risk is about 1/10 of that proposed by the US-EPA. Evidently the assumptions used in their studies are flawed. Even though they find lower risk estimates using many counties in several states, fundamental flaws must be present in this type of investigation. Care must be taken in presenting health risks to the general population in cases, such as indoor radon, where field data do not support risk estimates obtained by other means.

  4. Syphilis Predicts HIV Incidence Among Men and Transgender Women Who Have Sex With Men in a Preexposure Prophylaxis Trial

    Science.gov (United States)

    Solomon, Marc M.; Mayer, Kenneth H.; Glidden, David V.; Liu, Albert Y.; McMahan, Vanessa M.; Guanira, Juan V.; Chariyalertsak, Suwat; Fernandez, Telmo; Grant, Robert M.; Bekker, Linda-Gail; Buchbinder, Susan; Casapia, Martin; Chariyalertsak, Suwat; Guanira, Juan; Kallas, Esper; Lama, Javier; Mayer, Kenneth; Montoya, Orlando; Schechter, Mauro; Veloso, Valdiléa

    2014-01-01

    Background. Syphilis infection may potentiate transmission of human immunodeficiency virus (HIV). We sought to determine the extent to which HIV acquisition was associated with syphilis infection within an HIV preexposure prophylaxis (PrEP) trial and whether emtricitabine/tenofovir (FTC/TDF) modified that association. Methods. The Preexposure Prophylaxis Initiative (iPrEx) study randomly assigned 2499 HIV-seronegative men and transgender women who have sex with men (MSM) to receive oral daily FTC/TDF or placebo. Syphilis prevalence at screening and incidence during follow-up were measured. Hazard ratios for the effect of incident syphilis on HIV acquisition were calculated. The effect of FTC/TDF on incident syphilis and HIV acquisition was assessed. Results. Of 2499 individuals, 360 (14.4%) had a positive rapid plasma reagin test at screening; 333 (92.5%) had a positive confirmatory test, which did not differ between the arms (FTC/TDF vs placebo, P = .81). The overall syphilis incidence during the trial was 7.3 cases per 100 person-years. There was no difference in syphilis incidence between the study arms (7.8 cases per 100 person-years for FTC/TDF vs 6.8 cases per 100 person-years for placebo, P = .304). HIV incidence varied by incident syphilis (2.8 cases per 100 person-years for no syphilis vs 8.0 cases per 100 person-years for incident syphilis), reflecting a hazard ratio of 2.6 (95% confidence interval, 1.6–4.4; P < .001). There was no evidence for interaction between randomization to the FTC/TDF arm and incident syphilis on HIV incidence. Conclusions. In HIV-seronegative MSM, syphilis infection was associated with HIV acquisition in this PrEP trial; a syphilis diagnosis should prompt providers to offer PrEP unless otherwise contraindicated. PMID:24928295

  5. Understanding the properties of diagnostic tests - Part 2: Likelihood ratios.

    Science.gov (United States)

    Ranganathan, Priya; Aggarwal, Rakesh

    2018-01-01

    Diagnostic tests are used to identify subjects with and without disease. In a previous article in this series, we examined some attributes of diagnostic tests - sensitivity, specificity, and predictive values. In this second article, we look at likelihood ratios, which are useful for the interpretation of diagnostic test results in everyday clinical practice.
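
    Both likelihood ratios follow directly from sensitivity and specificity and convert pre-test odds into post-test odds; a minimal worked example in Python (illustrative numbers only):

        def likelihood_ratios(sensitivity, specificity):
            lr_pos = sensitivity / (1 - specificity)   # LR+: odds multiplier for a positive result
            lr_neg = (1 - sensitivity) / specificity   # LR-: odds multiplier for a negative result
            return lr_pos, lr_neg

        lr_pos, lr_neg = likelihood_ratios(0.90, 0.80)
        pretest_prob = 0.10
        pretest_odds = pretest_prob / (1 - pretest_prob)
        posttest_odds = pretest_odds * lr_pos            # after a positive test result
        print(f"LR+={lr_pos:.1f}, LR-={lr_neg:.2f}, "
              f"post-test probability={posttest_odds / (1 + posttest_odds):.1%}")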

  6. Adaptive Unscented Kalman Filter using Maximum Likelihood Estimation

    DEFF Research Database (Denmark)

    Mahmoudi, Zeinab; Poulsen, Niels Kjølstad; Madsen, Henrik

    2017-01-01

    The purpose of this study is to develop an adaptive unscented Kalman filter (UKF) by tuning the measurement noise covariance. We use the maximum likelihood estimation (MLE) and the covariance matching (CM) method to estimate the noise covariance. The multi-step prediction errors generated...

  7. Weekly variations in feelings of trust predict incident STI within a prospective cohort of adolescent women from a US city.

    Science.gov (United States)

    Matson, Pamela A; Fortenberry, J Dennis; Chung, Shang-En; Gaydos, Charlotte A; Ellen, Jonathan M

    2018-03-24

    Feelings of intimacy, perceptions of partner concurrency (PPC) and perceptions of risk for an STD (PRSTD) are meaningful and dynamic attributes of adolescent sexual relationships. Our objective was to examine whether variations in these STI-associated feelings and perceptions predicted incident Chlamydia trachomatis and/or Neisseria gonorrhoeae infection within a prospective cohort of urban adolescent women. A cohort of clinic-recruited women aged 16-19 completed daily surveys on feelings and risk perceptions about each current sex partner on a smartphone continuously for up to 18 months. Urine was tested for C. trachomatis and N. gonorrhoeae every 3 months. Daily responses were averaged across the week. As overall means for trust, closeness and commitment were high, data were coded to indicate any decrease in feelings from the previous week. PRSTD and PPC were reverse coded to indicate any increase from the previous week. An index was created to examine the cumulative effect of variation in these feelings and perceptions. Generalised linear models were used to account for correlation among repeated measures within relationships. For each week in which there was a decrease in trust, there was a 45% increase in the risk of being infected with an STI at follow-up (relative risk (RR) 1.45, 95% CI 1.18 to 1.78, P=0.004). Neither a decrease in closeness or commitment, nor an increase in PRSTD or PPC, was associated with an STI outcome. Cumulatively, the index measure indicated that a change in an additional feeling or perception over the week increased the odds of an STI by 14% (RR 1.14, 95% CI 1.02 to 1.29, P=0.026). A decrease in feelings of trust towards a main partner may be a more sensitive indicator of STI risk than PRSTD, PPC or commitment. The next generation of behavioural interventions for youth will need strategies to address feelings of intimacy within adolescent romantic relationships. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  8. Prediction of cervical cancer incidence in England, UK, up to 2040, under four scenarios: a modelling study.

    Science.gov (United States)

    Castanon, Alejandra; Landy, Rebecca; Pesola, Francesca; Windridge, Peter; Sasieni, Peter

    2018-01-01

    In the next 25 years, the epidemiology of cervical cancer in England, UK, will change: human papillomavirus (HPV) screening will be the primary test for cervical cancer. Additionally, the proportion of women screened regularly is decreasing and women who received the HPV vaccine are due to attend screening for the first time. Therefore, we aimed to estimate how vaccination against HPV, changes to the screening test, and falling screening coverage will affect cervical cancer incidence in England up to 2040. We did a data modelling study that combined results from population modelling of incidence trends, observable data from the individual level with use of a generalised linear model, and microsimulation of unobservable disease states. We estimated age-specific absolute risks of cervical cancer in the absence of screening (derived from individual level data). We used an age period cohort model to estimate birth cohort effects. We multiplied the absolute risks by the age cohort effects to provide absolute risks of cervical cancer for unscreened women in different birth cohorts. We obtained relative risks (RRs) of cervical cancer by screening history (never screened, regularly screened, or lapsed attender) using data from a population-based case-control study for unvaccinated women, and using a microsimulation model for vaccinated women. RRs of primary HPV screening were relative to cytology. We used the proportions of women in each 5-year age group (25-29 years to 75-79 years) and 5-year period (2016-20 to 2036-40) with each combination of screening and vaccination history as weights to estimate the population incidence. The primary outcome was the number of cases and rates per 100 000 women under four scenarios: no changes to current screening coverage or vaccine uptake and HPV primary testing from 2019 (status quo), changing the year in which HPV primary testing is introduced, introduction of the nine-valent vaccine, and changes to cervical screening coverage.
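
    The weighting step described above amounts to mixing group-specific risks by their population proportions; a toy sketch with made-up proportions and relative risks (not the study's parameters):

        # Population incidence as a coverage-weighted mix of screening-history groups.
        baseline_rate = 20.0                        # cases per 100,000 unscreened women (made up)
        groups = {                                  # (proportion of women, RR vs. unscreened)
            "never screened":     (0.15, 1.00),
            "regularly screened": (0.70, 0.30),
            "lapsed attender":    (0.15, 0.60),
        }
        rate = baseline_rate * sum(p * rr for p, rr in groups.values())
        print(f"predicted incidence: {rate:.1f} per 100,000")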

  9. The Laplace Likelihood Ratio Test for Heteroscedasticity

    Directory of Open Access Journals (Sweden)

    J. Martin van Zyl

    2011-01-01

    Full Text Available It is shown that the likelihood ratio test for heteroscedasticity, assuming the Laplace distribution, gives good results for Gaussian and fat-tailed data. The likelihood ratio test, assuming normality, is very sensitive to any deviation from normality, especially when the observations are from a distribution with fat tails. Such a likelihood test can also be used as a robust test for a constant variance in residuals or a time series if the data is partitioned into groups.
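
    Under a Laplace error model the maximum-likelihood scale in each group is the mean absolute deviation from the group median, so a two-group likelihood-ratio test for equal scale reduces to a few lines; the sketch below is illustrative and is not the paper's implementation:

        # Likelihood-ratio test for equal Laplace scale in two groups (sketch).
        import numpy as np
        from scipy.stats import chi2

        def laplace_loglik(x, b):
            # location profiled out at the sample median
            return -len(x) * np.log(2 * b) - np.abs(x - np.median(x)).sum() / b

        def laplace_lr_test(x1, x2):
            b1 = np.abs(x1 - np.median(x1)).mean()       # per-group MLE scales
            b2 = np.abs(x2 - np.median(x2)).mean()
            b0 = (len(x1) * b1 + len(x2) * b2) / (len(x1) + len(x2))  # pooled scale under H0
            lr = 2 * (laplace_loglik(x1, b1) + laplace_loglik(x2, b2)
                      - laplace_loglik(x1, b0) - laplace_loglik(x2, b0))
            return lr, chi2.sf(lr, df=1)

        rng = np.random.default_rng(2)
        print(laplace_lr_test(rng.laplace(0, 1, 200), rng.laplace(0, 2, 200)))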

  10. Predicting the incidence of hand, foot and mouth disease in Sichuan province, China using the ARIMA model.

    Science.gov (United States)

    Liu, L; Luan, R S; Yin, F; Zhu, X P; Lü, Q

    2016-01-01

    Hand, foot and mouth disease (HFMD) is an infectious disease caused by enteroviruses that usually occurs in children aged under 5 years. We applied an autoregressive integrated moving average (ARIMA) model to forecast HFMD incidence in Sichuan province, China. HFMD infection data from January 2010 to June 2014 were used to fit the ARIMA model. The coefficient of determination (R 2), normalized Bayesian Information Criterion (BIC) and mean absolute percentage error (MAPE) were used to evaluate the goodness-of-fit of the constructed models. The fitted ARIMA model was applied to forecast the incidence of HFMD from April to June 2014. The goodness-of-fit test generated the optimum general multiplicative seasonal ARIMA (1,0,1) × (0,1,0)12 model (R 2 = 0·692, MAPE = 15·982, BIC = 5·265), which also showed non-significant autocorrelations in the residuals of the model (P = 0·893). The forecast incidence values of the ARIMA (1,0,1) × (0,1,0)12 model from July to December 2014 were 4103-9987, which were proximate forecasts. The ARIMA model can be applied to forecast the HFMD incidence trend and provide support for HFMD prevention and control. Further observations should be added continually to the time series, and the parameters of the models should be adjusted, because HFMD incidence will not be absolutely stationary in the future.
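
    A seasonal model of the order reported above can be fitted with standard time-series tooling; the sketch below uses statsmodels on a synthetic monthly series standing in for the HFMD counts:

        # Fit a seasonal ARIMA(1,0,1)x(0,1,0,12) and forecast six months ahead (sketch).
        import numpy as np
        import pandas as pd
        from statsmodels.tsa.statespace.sarimax import SARIMAX

        rng = np.random.default_rng(3)
        idx = pd.date_range("2010-01", periods=54, freq="MS")   # Jan 2010 - Jun 2014
        season = 2000 * np.sin(2 * np.pi * idx.month / 12)
        cases = pd.Series(6000 + season + rng.normal(0, 500, len(idx)), index=idx)

        model = SARIMAX(cases, order=(1, 0, 1), seasonal_order=(0, 1, 0, 12)).fit(disp=False)
        print(model.forecast(steps=6).round())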

  11. MXLKID: a maximum likelihood parameter identifier

    International Nuclear Information System (INIS)

    Gavel, D.T.

    1980-07-01

    MXLKID (MaXimum LiKelihood IDentifier) is a computer program designed to identify unknown parameters in a nonlinear dynamic system. Using noisy measurement data from the system, the maximum likelihood identifier computes a likelihood function (LF). Identification of system parameters is accomplished by maximizing the LF with respect to the parameters. The main body of this report briefly summarizes the maximum likelihood technique and gives instructions and examples for running the MXLKID program. MXLKID is implemented in LRLTRAN on the CDC7600 computer at LLNL. A detailed mathematical description of the algorithm is given in the appendices. 24 figures, 6 tables
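
    The core operation, maximizing a likelihood function over unknown parameters of a dynamic system given noisy measurements, can be shown in miniature; this generic Python sketch (not the LRLTRAN program) identifies a decay rate from noisy observations:

        # Toy maximum-likelihood identification of a decay-rate parameter k
        # from noisy observations of x(t) = exp(-k * t).
        import numpy as np
        from scipy.optimize import minimize_scalar

        rng = np.random.default_rng(4)
        t = np.linspace(0, 5, 50)
        k_true, sigma = 0.7, 0.05
        y = np.exp(-k_true * t) + rng.normal(0, sigma, t.size)

        def neg_log_likelihood(k):
            residuals = y - np.exp(-k * t)
            return 0.5 * np.sum(residuals**2) / sigma**2   # Gaussian NLL up to a constant

        k_hat = minimize_scalar(neg_log_likelihood, bounds=(0.01, 5.0), method="bounded").x
        print(f"k_true={k_true}, k_hat={k_hat:.3f}")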

  12. Likelihood ratio sequential sampling models of recognition memory.

    Science.gov (United States)

    Osth, Adam F; Dennis, Simon; Heathcote, Andrew

    2017-02-01

    The mirror effect - a phenomenon whereby a manipulation produces opposite effects on hit and false alarm rates - is a benchmark regularity of recognition memory. A likelihood ratio decision process, basing recognition on the relative likelihood that a stimulus is a target or a lure, naturally predicts the mirror effect, and so has been widely adopted in quantitative models of recognition memory. Glanzer, Hilford, and Maloney (2009) demonstrated that likelihood ratio models, assuming Gaussian memory strength, are also capable of explaining regularities observed in receiver-operating characteristics (ROCs), such as greater target than lure variance. Despite its central place in theorising about recognition memory, however, this class of models has not been tested using response time (RT) distributions. In this article, we develop a linear approximation to the likelihood ratio transformation, which we show predicts the same regularities as the exact transformation. This development enabled us to develop a tractable model of recognition-memory RT based on the diffusion decision model (DDM), with inputs (drift rates) provided by an approximate likelihood ratio transformation. We compared this "LR-DDM" to a standard DDM where all targets and lures receive their own drift rate parameters. Both were implemented as hierarchical Bayesian models and applied to four datasets. Model selection taking into account parsimony favored the LR-DDM, which requires fewer parameters than the standard DDM but still fits the data well. These results support log-likelihood-based models as providing an elegant explanation of the regularities of recognition memory, not only in terms of choices made but also in terms of the times it takes to make them. Copyright © 2016 Elsevier Inc. All rights reserved.
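
    One way to see what a linear approximation to the log-likelihood-ratio transformation looks like is a numerical first-order version under assumed unequal-variance Gaussian strength distributions (the paper develops its approximation analytically; all parameter values here are illustrative):

        # Exact log-likelihood ratio (target vs. lure) and a linear approximation.
        import numpy as np
        from scipy.stats import norm

        mu_l, sd_l = 0.0, 1.0     # lure strength distribution (illustrative)
        mu_t, sd_t = 1.0, 1.25    # target distribution, greater variance as in ROC data

        def log_lr(x):
            return norm.logpdf(x, mu_t, sd_t) - norm.logpdf(x, mu_l, sd_l)

        x0 = (mu_l + mu_t) / 2                    # expansion point between the means
        eps = 1e-5
        slope = (log_lr(x0 + eps) - log_lr(x0 - eps)) / (2 * eps)
        linear = lambda x: log_lr(x0) + slope * (x - x0)

        for x in [-1.0, 0.5, 2.0]:
            print(f"x={x:+.1f}  exact={log_lr(x):+.3f}  linear={linear(x):+.3f}")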

  13. Predicting the hand, foot, and mouth disease incidence using search engine query data and climate variables: an ecological study in Guangdong, China.

    Science.gov (United States)

    Du, Zhicheng; Xu, Lin; Zhang, Wangjian; Zhang, Dingmei; Yu, Shicheng; Hao, Yuantao

    2017-10-06

    Hand, foot, and mouth disease (HFMD) has caused a substantial burden in China, especially in Guangdong Province. Based on the enhanced surveillance system, we aimed to explore whether the addition of temperature and search engine query data improves the risk prediction of HFMD. Ecological study. Information on the confirmed cases of HFMD, climate parameters and search engine query logs was collected. A total of 1.36 million HFMD cases were identified from the surveillance system during 2011-2014. Analyses were conducted at the aggregate level and no confidential information was involved. A seasonal autoregressive integrated moving average (ARIMA) model with external variables (ARIMAX) was used to predict the HFMD incidence from 2011 to 2014, taking into account temperature and search engine query data (Baidu Index, BDI). Statistics of goodness-of-fit and precision of prediction were used to compare models (1) based on surveillance data only, and with the addition of (2) temperature, (3) BDI, and (4) both temperature and BDI. A high correlation between HFMD incidence and BDI (r = 0.794, p < 0.001) was observed, and an ARIMAX model incorporating search engine query data significantly improved the prediction of HFMD. Further studies are warranted to examine whether including search engine query data also improves the prediction of other infectious diseases in other settings. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  14. Bohai and Yellow Sea Oil Spill Prediction System and Its Application to Huangdao ‘11.22’ Oil Spill Incident

    Science.gov (United States)

    Li, Huan; Li, Yan; Li, Cheng; Li, Wenshan; Wang, Guosong; Zhang, Song

    2017-08-01

    Marine oil spills have deep negative effects on both marine ecosystems and human activities. In recent years, owing to China's rapid economic development, the demand for crude oil in China has increased year by year, leading to a high risk of marine oil spills. It is therefore necessary to strengthen marine oil spill emergency response in China and to improve oil spill prediction techniques. In this study, based on an oil spill model and a GIS platform, we developed the Bohai and Yellow Sea oil spill prediction system. Combined with high-resolution meteorological and oceanographic forecast results, the system was applied to predict the drift and diffusion of the Huangdao '11.22' oil spill incident. Although the prediction could not be validated against SAR images due to the lack of satellite observations, it still provided effective and referable information on oil spill behavior to the Maritime Safety Administration.

  15. Cost-Effectiveness of Coal Workers' Pneumoconiosis Prevention Based on Its Predicted Incidence within the Datong Coal Mine Group in China.

    Science.gov (United States)

    Shen, Fuhai; Liu, Hongbo; Yuan, Juxiang; Han, Bing; Cui, Kai; Ding, Yu; Fan, Xueyun; Cao, Hong; Yao, Sanqiao; Suo, Xia; Sun, Zhiqian; Yun, Xiang; Hua, Zhengbing; Chen, Jie

    2015-01-01

    We aimed to estimate the economic losses currently caused by coal workers' pneumoconiosis (CWP) and, on the basis of these measurements, confirm the economic benefit of preventive measures. Our cohort study included 1,847 patients with CWP and 43,742 coal workers without CWP who were registered in the employment records of the Datong Coal Mine Group. We calculated the cumulative incidence rate of pneumoconiosis using the life-table method. We used the dose-response relationship between cumulative incidence density and cumulative dust exposure to predict the future trend in the incidence of CWP. We calculated the economic loss caused by CWP and the economic effectiveness of CWP prevention using a step-wise model. The cumulative incidence rates of CWP in the tunneling, mining, combining, and helping cohorts were 58.7%, 28.1%, 21.7%, and 4.0%, respectively. The cumulative incidence rates increased gradually with increasing cumulative dust exposure (CDE). We predicted 4,300 new CWP cases, assuming the dust concentrations remained at the levels of 2011. If advanced dustproof equipment were adopted, 537 fewer people would be diagnosed with CWP. In all, losses of 1.207 billion Renminbi (RMB, official currency of China) would be prevented and 4,698.8 healthy life years would be gained. Investments in advanced dustproof equipment would total 843 million RMB, according to our study; the ratio of investment to restored economic losses was 1:1.43. Controlling workplace dust concentrations is critical to reduce the onset of pneumoconiosis and to achieve economic benefits.

  16. Cost-Effectiveness of Coal Workers' Pneumoconiosis Prevention Based on Its Predicted Incidence within the Datong Coal Mine Group in China

    Science.gov (United States)

    Yuan, Juxiang; Han, Bing; Cui, Kai; Ding, Yu; Fan, Xueyun; Cao, Hong; Yao, Sanqiao; Suo, Xia; Sun, Zhiqian; Yun, Xiang; Hua, Zhengbing; Chen, Jie

    2015-01-01

    We aimed to estimate the economic losses currently caused by coal workers’ pneumoconiosis (CWP) and, on the basis of these measurements, confirm the economic benefit of preventive measures. Our cohort study included 1,847 patients with CWP and 43,742 coal workers without CWP who were registered in the employment records of the Datong Coal Mine Group. We calculated the cumulative incidence rate of pneumoconiosis using the life-table method. We used the dose-response relationship between cumulative incidence density and cumulative dust exposure to predict the future trend in the incidence of CWP. We calculated the economic loss caused by CWP and the economic effectiveness of CWP prevention using a step-wise model. The cumulative incidence rates of CWP in the tunneling, mining, combining, and helping cohorts were 58.7%, 28.1%, 21.7%, and 4.0%, respectively. The cumulative incidence rates increased gradually with increasing cumulative dust exposure (CDE). We predicted 4,300 new CWP cases, assuming the dust concentrations remained at the levels of 2011. If advanced dustproof equipment were adopted, 537 fewer people would be diagnosed with CWP. In all, losses of 1.207 billion Renminbi (RMB, official currency of China) would be prevented and 4,698.8 healthy life years would be gained. Investments in advanced dustproof equipment would total 843 million RMB, according to our study; the ratio of investment to restored economic losses was 1:1.43. Controlling workplace dust concentrations is critical to reduce the onset of pneumoconiosis and to achieve economic benefits. PMID:26098706

  17. Physical constraints on the likelihood of life on exoplanets

    Science.gov (United States)

    Lingam, Manasvi; Loeb, Abraham

    2018-04-01

    One of the most fundamental questions in exoplanetology is to determine whether a given planet is habitable. We estimate the relative likelihood of a planet's propensity towards habitability by considering key physical characteristics such as the role of temperature on ecological and evolutionary processes, and atmospheric losses via hydrodynamic escape and stellar wind erosion. From our analysis, we demonstrate that Earth-sized exoplanets in the habitable zone around M-dwarfs seemingly display much lower prospects of being habitable relative to Earth, owing to the higher incident ultraviolet fluxes and closer distances to the host star. We illustrate our results by specifically computing the likelihood (of supporting life) for the recently discovered exoplanets, Proxima b and TRAPPIST-1e, which we find to be several orders of magnitude smaller than that of Earth.

  18. Which aspects of safety culture predict incident reporting behavior in neonatal intensive care units? A multilevel analysis

    NARCIS (Netherlands)

    Snijders, Cathelijne; Kollen, Boudewijn J.; van Lingen, Richard A.; Fetter, Willem P. F.; Molendijk, Harry; Kok, J. H.; te Pas, E.; Pas, H.; van der Starre, C.; Bloemendaal, E.; Lopes Cardozo, R. H.; Molenaar, A. M.; Giezen, A.; van Lingen, R. A.; Maat, H. E.; Molendijk, A.; Snijders, C.; Lavrijssen, S.; Mulder, A. L. M.; de Kleine, M. J. K.; Koolen, A. M. P.; Schellekens, M.; Verlaan, W.; Vrancken, S.; Fetter, W. P. F.; Schotman, L.; van der Zwaan, A.; van der Tuijn, Y.; Tibboel, D.; van der Schaaf, T. W.; Klip, H.; Kollen, B. J.

    2009-01-01

    OBJECTIVES: Safety culture assessments are increasingly used to evaluate patient-safety programs. However, it is not clear which aspects of safety culture are most relevant in understanding incident reporting behavior, and ultimately improving patient safety. The objective of this study was to

  19. Men's and Women's Health Beliefs Differentially Predict Coronary Heart Disease Incidence in a Population-Based Sample

    Science.gov (United States)

    Korin, Maya Rom; Chaplin, William F.; Shaffer, Jonathan A.; Butler, Mark J.; Ojie, Mary-Jane; Davidson, Karina W.

    2013-01-01

    Objective: To examine gender differences in the association between beliefs in heart disease preventability and 10-year incidence of coronary heart disease (CHD) in a population-based sample. Methods: A total of 2,688 noninstitutionalized Nova Scotians without prior CHD enrolled in the Nova Scotia Health Study (NSHS95) and were followed for 10…

  20. Essays on empirical likelihood in economics

    NARCIS (Netherlands)

    Gao, Z.

    2012-01-01

    This thesis intends to exploit the roots of empirical likelihood and its related methods in mathematical programming and computation. The roots will be connected and the connections will induce new solutions for the problems of estimation, computation, and generalization of empirical likelihood.

  1. Risk factors and likelihood of Campylobacter colonization in broiler flocks

    Directory of Open Access Journals (Sweden)

    SL Kuana

    2007-09-01

    Full Text Available Campylobacter was investigated in cecal droppings, feces, and cloacal swabs of 22 flocks of 3- to 5-week-old broilers. Risk factors and the likelihood of the presence of this agent in these flocks were determined. Management practices, such as cleaning and disinfection, feeding, drinkers, and litter treatments, were assessed. Results were evaluated using the odds ratio (OR) test, and their significance was tested by Fisher's test (p<0.05). A Campylobacter prevalence of 81.8% was found in the broiler flocks (18/22), and within positive flocks it varied between 85 and 100%. Campylobacter incidence was homogeneous among sample types, being 81.8% in cecal droppings, 80.9% in feces, and 80.4% in cloacal swabs (230). Flocks fed by automatic feeding systems presented a higher incidence of Campylobacter as compared to those fed by tube feeders. Litter was reused on 63.6% of the farms and, despite the lack of statistical significance, there was a higher likelihood of Campylobacter incidence when litter was reused. Foot baths were not used in 45.5% of the flocks, and the use of foot baths combined with deficient lime management increased the number of positive flocks, although without statistical significance. The evaluated parameters were not significantly associated with Campylobacter colonization in the assessed broiler flocks.
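
    The odds-ratio-and-Fisher's-test evaluation used here is a standard 2x2 contingency analysis; a minimal sketch with made-up counts:

        # Odds ratio and Fisher's exact test for a 2x2 exposure table (made-up counts).
        from scipy.stats import fisher_exact

        #              colonized  not colonized
        table = [[12, 2],    # automatic feeding system
                 [6, 8]]     # tube feeders
        odds_ratio, p_value = fisher_exact(table)
        print(f"OR={odds_ratio:.2f}, p={p_value:.3f}")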

  2. Fear of falling predicts incidence of functional disability two years later: A perspective from an international cohort study.

    Science.gov (United States)

    Auais, Mohammad; French, Simon; Alvarado, Beatriz; Pirkle, Catherine; Belanger, Emmanuelle; Guralnik, Jack

    2017-12-06

    To study the extent to which fear of falling (FOF) is associated with the onset of functional disability over a 2-year period in older adults, using self-reported and performance-based measures. In 2012, 1,601 participants (aged 65-74) were recruited from four sites: Kingston and Saint-Hyacinthe, Canada; Manizales, Colombia; and Natal, Brazil. They were re-assessed in 2014. We quantified FOF using the Falls Efficacy Scale-International (FES-I; range: 16-64). Functional disability measures were 1) self-reported incident mobility disability, defined as difficulty climbing a flight of stairs or walking 400 meters, and 2) incident poor physical performance, defined as a score <9 on the Short Physical Performance Battery. In the Poisson regression analysis, we included only those participants without functional disability at baseline to calculate incident risk ratios in 2014. 1,355 participants completed the 2014 assessment, of whom 917 and 1,078 had no mobility disability and no poor physical performance at baseline, respectively. In 2014, 131 (14.3%) and 166 (15.4%) participants reported incident mobility disability and poor physical performance, respectively. After adjusting for age, sex, socioeconomic, and health covariates, a one-point increase in FES-I at baseline was associated with a 4% increase in the risk of reporting incident mobility disability (95% CI: 1.02-1.05) and a 3% increase in the risk of developing poor physical performance at follow-up in the overall sample (95% CI: 1.01-1.05). FOF is associated with a higher risk of incident mobility disability and poor physical performance in a cohort of older adults. It is increasingly important to study FOF's effect on functional disability and to take necessary measures to prevent the transition to end-stage disability. © The Author(s) 2017. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  3. Gaussian copula as a likelihood function for environmental models

    Science.gov (United States)

    Wani, O.; Espadas, G.; Cecinati, F.; Rieckermann, J.

    2017-12-01

    Parameter estimation of environmental models always comes with uncertainty. To formally quantify this parametric uncertainty, a likelihood function needs to be formulated, which is defined as the probability of observations given fixed values of the parameter set. A likelihood function allows us to infer parameter values from observations using Bayes' theorem. The challenge is to formulate a likelihood function that reliably describes the error-generating processes which lead to the observed monitoring data, such as rainfall and runoff. If the likelihood function is not representative of the error statistics, the parameter inference will give biased parameter values. Several uncertainty estimation methods that are currently in use employ Gaussian processes as a likelihood function, because of their favourable analytical properties. The Box-Cox transformation is suggested to deal with non-symmetric and heteroscedastic errors, e.g. for flow data, which are typically more uncertain in high flows than in periods with low flows. The problem with transformations is that the results are conditional on hyper-parameters, for which it is difficult to formulate the analyst's belief a priori. In an attempt to address this problem, in this research work we suggest learning the nature of the error distribution from the errors made by the model in "past" forecasts. We use a Gaussian copula to generate semiparametric error distributions. 1) We show that this copula can then be used as a likelihood function to infer parameters, breaking away from the practice of using multivariate normal distributions. 2) Based on the results from a didactical example of predicting rainfall runoff, we demonstrate that the copula captures the predictive uncertainty of the model. 3) Finally, we find that the properties of autocorrelation and heteroscedasticity of errors are captured well by the copula, eliminating the need to use transforms. In summary, our findings suggest that copulas are an
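
    The mechanics of turning past model errors into a Gaussian-copula likelihood can be sketched as follows: map each error to a uniform via its empirical CDF, then to a normal score, and evaluate a multivariate normal density with the errors' rank correlation. This is illustrative only; the hyper-parameters and semiparametric margins of the paper are simplified away:

        # Sketch: evaluate a Gaussian-copula log-density for a vector of model errors.
        import numpy as np
        from scipy.stats import multivariate_normal, norm, rankdata

        past_errors = np.random.default_rng(5).normal(size=(500, 3))  # stand-in error history

        def to_normal_scores(errors_matrix):
            # empirical-CDF transform per dimension, then Gaussian quantile
            u = rankdata(errors_matrix, axis=0) / (errors_matrix.shape[0] + 1)
            return norm.ppf(u)

        z = to_normal_scores(past_errors)
        corr = np.corrcoef(z, rowvar=False)        # copula correlation from history

        def copula_loglik(z_new):
            # density of the Gaussian copula at one transformed error vector
            return (multivariate_normal.logpdf(z_new, mean=np.zeros(3), cov=corr)
                    - norm.logpdf(z_new).sum())    # divide out the independent margins

        print(copula_loglik(z[0]))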

  4. Incidence and predictive factors of Internet addiction among Chinese secondary school students in Hong Kong: a longitudinal study.

    Science.gov (United States)

    Lau, Joseph T F; Gross, Danielle L; Wu, Anise M S; Cheng, Kit-Man; Lau, Mason M C

    2017-06-01

    Internet use has global influences on all aspects of life and has become a growing concern. Cross-sectional studies on Internet addiction (IA) have been reported but causality is often unclear. More longitudinal studies are warranted. We investigated incidence and predictors of IA conversion among secondary school students. A 12-month longitudinal study was conducted among Hong Kong Chinese Secondary 1-4 students (N = 8286). Using the 26-item Chen Internet Addiction Scale (CIAS; cut-off >63), non-IA cases were identified at baseline. Conversion to IA during the follow-up period was detected, with incidence and predictors derived using multi-level models. Prevalence of IA was 16.0% at baseline and incidence of IA was 11.81 per 100 person-years (13.74 for males and 9.78 for females). Risk background factors were male sex, higher school forms, and living with only one parent, while protective background factors were having a mother/father with university education. Adjusted for all background factors, higher baseline CIAS score (ORa = 1.07), longer hours spent online for entertainment and social communication (ORa = 1.92 and 1.63 respectively), and Health Belief Model (HBM) constructs (except perceived severity of IA and perceived self-efficacy to reduce use) were significant predictors of conversion to IA (ORa = 1.07-1.45). Prevalence and incidence of IA conversion were high and need attention. Interventions should take into account risk predictors identified, such as those of the HBM, and time management skills should be enhanced. Screening is warranted to identify those at high risk (e.g. high CIAS score) and provide them with primary and secondary interventions.

  5. Prevalence, Course, Incidence, and 1-Year Prediction of Deliberate Self-Harm and Suicide Attempts in Early Norwegian School Adolescents

    Science.gov (United States)

    Larsson, Bo; Sund, Anne Mari

    2008-01-01

    In this survey of early Norwegian school adolescents, the prevalence, course, and incidence of self-harm behavior with or without suicide intent were examined, in addition to predictors of self-harm for a 1-year follow-up period. Lifetime prevalence rates of self-harm without suicide intent and suicide attempts were 2.9% and 3.0%, respectively,…

  6. Predicting the hand, foot, and mouth disease incidence using search engine query data and climate variables: an ecological study in Guangdong, China

    Science.gov (United States)

    Du, Zhicheng; Xu, Lin; Zhang, Wangjian; Zhang, Dingmei; Yu, Shicheng; Hao, Yuantao

    2017-01-01

    Objectives Hand, foot, and mouth disease (HFMD) has caused a substantial burden in China, especially in Guangdong Province. Based on the enhanced surveillance system, we aimed to explore whether the addition of temperature and search engine query data improves the risk prediction of HFMD. Design Ecological study. Setting and participants Information on the confirmed cases of HFMD, climate parameters and search engine query logs was collected. A total of 1.36 million HFMD cases were identified from the surveillance system during 2011–2014. Analyses were conducted at aggregate level and no confidential information was involved. Outcome measures A seasonal autoregressive integrated moving average (ARIMA) model with external variables (ARIMAX) was used to predict the HFMD incidence from 2011 to 2014, taking into account temperature and search engine query data (Baidu Index, BDI). Statistics of goodness-of-fit and precision of prediction were used to compare models (1) based on surveillance data only, and with the addition of (2) temperature, (3) BDI, and (4) both temperature and BDI. Results A high correlation between HFMD incidence and BDI (r=0.794) was observed. Compared with the model based on surveillance data only, the ARIMAX model including BDI reached the best goodness-of-fit with an Akaike information criterion (AIC) value of −345.332, whereas the model including both BDI and temperature had the most accurate prediction in terms of the mean absolute percentage error (MAPE) of 101.745%. Conclusions An ARIMAX model incorporating search engine query data significantly improved the prediction of HFMD. Further studies are warranted to examine whether including search engine query data also improves the prediction of other infectious diseases in other settings. PMID:28988169
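
    A hedged sketch of the modelling step, using the SARIMAX class from statsmodels with exogenous regressors standing in for the ARIMAX model; the simulated series, the (1,0,1)×(1,0,0,52) orders and all variable names are made up for illustration and are not those fitted in the study.

```python
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX

# Hypothetical weekly series: HFMD cases driven by temperature and a
# search-index covariate (stand-in for the Baidu Index).
rng = np.random.default_rng(1)
n = 208
temp = 20 + 8 * np.sin(2 * np.pi * np.arange(n) / 52) + rng.normal(0, 1, n)
bdi = 100 + 30 * np.sin(2 * np.pi * (np.arange(n) - 2) / 52) + rng.normal(0, 5, n)
cases = 50 + 2.0 * temp + 0.5 * bdi + rng.normal(0, 10, n)

exog = np.column_stack([temp, bdi])
model = SARIMAX(cases, exog=exog,
                order=(1, 0, 1),                # illustrative orders only
                seasonal_order=(1, 0, 0, 52))   # yearly cycle in weekly data
fit = model.fit(disp=False)

# Goodness of fit (AIC) and in-sample precision (MAPE) over the last 26
# weeks -- the two kinds of criteria used to compare candidate models.
pred = fit.get_prediction(start=n - 26).predicted_mean
mape = np.mean(np.abs((cases[n - 26:] - pred) / cases[n - 26:])) * 100
print(fit.aic, mape)
```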

  7. Estimation of Model's Marginal likelihood Using Adaptive Sparse Grid Surrogates in Bayesian Model Averaging

    Science.gov (United States)

    Zeng, X.

    2015-12-01

    A large number of model executions are required to obtain alternative conceptual models' predictions and their posterior probabilities in Bayesian model averaging (BMA). The posterior model probability is estimated through a model's marginal likelihood and prior probability. The heavy computational burden hinders the implementation of BMA prediction, especially for elaborate marginal likelihood estimators. To overcome the computational burden of BMA, an adaptive sparse grid (SG) stochastic collocation method is used to build surrogates for alternative conceptual models through a numerical experiment with a synthetic groundwater model. BMA predictions depend on model posterior weights (or marginal likelihoods), and this study also evaluated four marginal likelihood estimators: the arithmetic mean estimator (AME), harmonic mean estimator (HME), stabilized harmonic mean estimator (SHME), and thermodynamic integration estimator (TIE). The results demonstrate that TIE is accurate in estimating conceptual models' marginal likelihoods, and the BMA-TIE prediction performs better than the other BMA predictions. TIE is also highly stable: repeated estimates of a conceptual model's marginal likelihood obtained with TIE show significantly less variability than those produced by the other estimators. In addition, the SG surrogates efficiently facilitate BMA predictions, especially for BMA-TIE. The number of model executions needed for building surrogates is 4.13%, 6.89%, 3.44%, and 0.43% of the model executions required by BMA-AME, BMA-HME, BMA-SHME, and BMA-TIE, respectively.
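
    The estimators named above can be illustrated on a conjugate toy model whose marginal likelihood is known in closed form; the sketch below compares the arithmetic mean estimator (AME, likelihood averaged over prior draws) with the harmonic mean estimator (HME, harmonic mean over posterior draws). The model and sample sizes are assumptions for illustration only, not the groundwater experiment.

```python
import numpy as np
from scipy import stats

# Conjugate toy model: theta ~ N(0, 1), y_i | theta ~ N(theta, 1).
rng = np.random.default_rng(2)
y = rng.normal(0.5, 1.0, size=20)
n = len(y)

def loglik(theta):
    # Vectorised log-likelihood over an array of theta draws.
    return np.sum(stats.norm.logpdf(y[:, None], loc=theta, scale=1.0), axis=0)

# Exact log marginal likelihood: marginally, y ~ N(0, I + 11').
cov = np.eye(n) + np.ones((n, n))
exact = stats.multivariate_normal.logpdf(y, mean=np.zeros(n), cov=cov)

# AME: average the likelihood over draws from the prior.
prior = rng.standard_normal(100_000)
ame = np.log(np.mean(np.exp(loglik(prior))))

# HME: harmonic mean of the likelihood over (here, exact) posterior draws.
post_var = 1.0 / (n + 1)
post = rng.normal(y.sum() * post_var, np.sqrt(post_var), 100_000)
hme = -np.log(np.mean(np.exp(-loglik(post))))

print(exact, ame, hme)   # the HME is typically the least stable of the three
```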

  8. Maximum likelihood convolutional decoding (MCD) performance due to system losses

    Science.gov (United States)

    Webster, L.

    1976-01-01

    A model for predicting the computational performance of a maximum likelihood convolutional decoder (MCD) operating in a noisy carrier reference environment is described. This model is used to develop a subroutine that will be utilized by the Telemetry Analysis Program to compute the MCD bit error rate. When this computational model is averaged over noisy reference phase errors using a high-rate interpolation scheme, the results are found to agree quite favorably with experimental measurements.

  9. The predictive value of current haemoglobin levels for incident tuberculosis and/or mortality during long-term antiretroviral therapy in South Africa: a cohort study.

    Science.gov (United States)

    Kerkhoff, Andrew D; Wood, Robin; Cobelens, Frank G; Gupta-Wright, Ankur; Bekker, Linda-Gail; Lawn, Stephen D

    2015-04-02

    Low haemoglobin concentrations may be predictive of incident tuberculosis (TB) and death in HIV-infected patients receiving antiretroviral therapy (ART), but data are limited and inconsistent. We examined these relationships retrospectively in a long-term South African ART cohort with multiple time-updated haemoglobin measurements. Prospectively collected clinical data on patients receiving ART for up to 8 years in a community-based cohort were analysed. Time-updated haemoglobin concentrations, CD4 counts and HIV viral loads were recorded, and TB diagnoses and deaths from all causes were ascertained. Anaemia severity was classified using World Health Organization criteria. TB incidence and mortality rates were calculated and Poisson regression models were used to identify independent predictors of incident TB and mortality, respectively. During a median follow-up of 5.0 years (IQR, 2.5-5.8) of 1,521 patients, 476 cases of incident TB and 192 deaths occurred during 6,459 person-years (PYs) of follow-up. TB incidence rates were strongly associated with time-updated anaemia severity; those without anaemia had a rate of 4.4 (95%CI, 3.8-5.1) cases/100 PYs compared to 10.0 (95%CI, 8.3-12.1), 26.6 (95%CI, 22.5-31.7) and 87.8 (95%CI, 57.0-138.2) cases/100 PYs in those with mild, moderate and severe anaemia, respectively. Similarly, mortality rates in those with no anaemia or mild, moderate and severe time-updated anaemia were 1.1 (95%CI, 0.8-1.5), 3.5 (95%CI, 2.7-4.8), 11.8 (95%CI, 9.5-14.8) and 28.2 (95%CI, 16.5-51.5) cases/100 PYs, respectively. Moderate and severe anaemia (time-updated) during ART were the strongest independent predictors for incident TB (adjusted IRR = 3.8 [95%CI, 3.0-4.8] and 8.2 [95%CI, 5.3-12.7], respectively) and for mortality (adjusted IRR = 6.0 [95%CI, 3.9-9.2] and adjusted IRR = 8.0 [95%CI, 3.9-16.4], respectively). Increasing severity of anaemia was associated with exceptionally high rates of both incident TB and mortality during…
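
    The rate comparison described above is the kind of output a Poisson regression with a person-years offset produces; below is a minimal sketch with hypothetical counts loosely echoing the quoted rates, not the study's data.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical person-time table: anaemia severity (0=none .. 3=severe),
# incident TB cases and person-years, loosely echoing the rates quoted above.
severity = np.array([0, 1, 2, 3])
cases    = np.array([150, 90, 180, 56])
pyears   = np.array([3400.0, 900.0, 680.0, 64.0])

X = sm.add_constant(severity.astype(float))
model = sm.GLM(cases, X, family=sm.families.Poisson(),
               offset=np.log(pyears))          # log person-years as offset
fit = model.fit()

# exp(coefficient) is the incidence rate ratio per severity step.
print(np.exp(fit.params))
```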

  10. Incremental predictive value of sarcopenia for incident fracture in an elderly Chinese cohort: results from the Osteoporotic Fractures in Men (MrOs) Study.

    Science.gov (United States)

    Yu, Ruby; Leung, Jason; Woo, Jean

    2014-08-01

    We examined whether sarcopenia is predictive of incident fractures among older men, whether the inclusion of sarcopenia in models adds any incremental value to bone mineral density (BMD), and whether sarcopenia is associated with a higher risk of fractures in elderly with osteoporosis. A cohort of 2000 community-dwelling men aged ≥65 years were examined, for whom detailed information regarding demographic, socioeconomic, medical history, clinical, and lifestyle factors was documented. Body composition and BMD were measured using dual energy X-ray absorptiometry. Sarcopenia was defined according to the Asian Working Group for Sarcopenia (AWGS) algorithm. Incident fractures were documented during the follow-up period from 2001 to 2013, and related to sarcopenia and its component measures using Cox proportional hazard regressions. The contribution of sarcopenia for predicting fracture risk was evaluated by receiver operating characteristic analysis, net reclassification improvement (NRI), and integrated discrimination improvement (IDI). During an average of 11.3 years of follow-up, 226 (11.3%) men sustained at least 1 incident fracture, making the incidence of fractures 1200.6/100,000 person-years. After multivariate adjustments, sarcopenia was associated with increased fracture risk (hazard ratio [HR], 1.87, 95% confidence interval [CI], 1.26-2.79) independent of BMD and other clinical risk factors. The addition of sarcopenia did not significantly increase area under curve or IDI but significantly improved the predictive ability for fracture risk over BMD and other clinical risk factors by 5.12% (P …). Osteoporosis combined with sarcopenia (sarco-osteoporosis) resulted in a significantly increased risk of fractures (HR, 3.49, 95% CI, 1.76-6.90) compared with those with normal BMD and without sarcopenia. This study confirms that sarcopenia is a predictor of fracture risk in this elderly men cohort, establishes that sarcopenia provides incremental predictive value for fractures over the…
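
    A minimal sketch of the Cox proportional hazards step using the lifelines package; the simulated cohort, effect sizes and column names are hypothetical stand-ins, not the MrOs data.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical cohort: follow-up time (years), fracture indicator,
# sarcopenia flag and a BMD T-score stand-in.
rng = np.random.default_rng(3)
n = 2000
sarcopenia = rng.binomial(1, 0.1, n)
bmd_t = rng.normal(-1.0, 1.0, n)
hazard = 0.01 * np.exp(0.6 * sarcopenia - 0.3 * bmd_t)   # toy effects
time = rng.exponential(1.0 / hazard)
event = time < 11.3                       # censor at ~11.3 years of follow-up
df = pd.DataFrame({"time": np.minimum(time, 11.3),
                   "fracture": event.astype(int),
                   "sarcopenia": sarcopenia, "bmd_t": bmd_t})

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="fracture")
print(cph.hazard_ratios_)                 # HR for sarcopenia adjusted for BMD
```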

  11. Asymptotic Likelihood Distribution for Correlated & Constrained Systems

    CERN Document Server

    Agarwal, Ujjwal

    2016-01-01

    This report describes my work as a summer student at CERN. It discusses the asymptotic distribution of the likelihood ratio for a total of h parameters, 2 of which are constrained and correlated.

  12. Maximum-Likelihood Detection Of Noncoherent CPM

    Science.gov (United States)

    Divsalar, Dariush; Simon, Marvin K.

    1993-01-01

    Simplified detectors are proposed for use in maximum-likelihood sequence detection of symbols in an alphabet of size M transmitted by uncoded, full-response continuous phase modulation (CPM) over a radio channel with additive white Gaussian noise. The structures of the receivers are derived from a particular interpretation of the maximum-likelihood metrics. The receivers include front ends, the structures of which depend only on M, analogous to those in receivers of coherent CPM. The parts of the receivers following the front ends have structures whose complexity depends on N.

  13. Waist circumference cut-off values to predict the incidence of hypertension: an estimation from a Brazilian population-based cohort.

    Science.gov (United States)

    Gus, M; Cichelero, F Tremea; Moreira, C Medaglia; Escobar, G Fortes; Moreira, L Beltrami; Wiehe, M; Fuchs, S Costa; Fuchs, F Danni

    2009-01-01

    Central obesity is a key component in the definition of the metabolic syndrome, but the cut-off values proposed to define abnormal values vary among different guidelines and are mostly based on cross-sectional studies. In this study, we identify the best cut-off values for waist circumference (WC) associated with the incidence of hypertension. Participants for this prospectively planned cohort study were 589 individuals who were free of hypertension and selected at random from the community of Porto Alegre, Brazil. Hypertension was defined by a blood pressure measurement ≥ 140/90 mmHg or the use of blood pressure lowering drugs. A logistic regression model established the association between WC and the incidence of hypertension. A receiver operating characteristics (ROC) curve analysis was used to select the best WC cut-off point to predict the incidence of hypertension. During a mean follow-up of 5.5 ± 0.9 years, 127 subjects developed hypertension. The hazard ratio for the development of hypertension, adjusted for age, baseline systolic blood pressure, alcohol consumption, gender and schooling, was 1.02 (95% CI; 1.00-1.04; P=0.02) for WC. The best cut-off WC values to predict hypertension were 87 cm in men and 80 cm in women, with areas under the curve of 0.56 (95% CI; 0.47-0.64; P=0.17) and 0.70 (95% CI; 0.63-0.77), respectively. WC predicts the incidence of hypertension in individuals living in communities in Brazil, and this risk begins at lower values of WC than those recommended by some guidelines.
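
    A common way to pick such a cut-off from a ROC curve is the Youden index (maximising sensitivity + specificity − 1); the sketch below assumes toy data and scikit-learn, and is not the paper's exact procedure.

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

# Hypothetical data: waist circumference (cm) and incident hypertension.
rng = np.random.default_rng(4)
n = 589
wc = rng.normal(85, 10, n)
p = 1 / (1 + np.exp(-(-8.0 + 0.08 * wc)))   # toy logistic risk in WC
hyp = rng.binomial(1, p)

fpr, tpr, thresholds = roc_curve(hyp, wc)
youden = tpr - fpr                          # sensitivity + specificity - 1
best_cut = thresholds[np.argmax(youden)]
print(best_cut, roc_auc_score(hyp, wc))     # cut-off (cm) and AUC
```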

  14. Maximum Likelihood and Restricted Likelihood Solutions in Multiple-Method Studies.

    Science.gov (United States)

    Rukhin, Andrew L

    2011-01-01

    A formulation of the problem of combining data from several sources is discussed in terms of random effects models. The unknown measurement precision is assumed not to be the same for all methods. We investigate maximum likelihood solutions in this model. By representing the likelihood equations as simultaneous polynomial equations, the exact form of the Groebner basis for their stationary points is derived when there are two methods. A parametrization of these solutions which allows their comparison is suggested. A numerical method for solving likelihood equations is outlined, and an alternative to the maximum likelihood method, the restricted maximum likelihood, is studied. In the situation when the methods' variances are considered known, an upper bound on the between-method variance is obtained. The relationship between likelihood equations and moment-type equations is also discussed.

  15. Operationalizing the Diagnostic Criteria for Mild Cognitive Impairment: The Salience of Objective Measures in Predicting Incident Dementia.

    Science.gov (United States)

    Brodaty, Henry; Aerts, Liesbeth; Crawford, John D; Heffernan, Megan; Kochan, Nicole A; Reppermund, Simone; Kang, Kristan; Maston, Kate; Draper, Brian; Trollor, Julian N; Sachdev, Perminder S

    2017-05-01

    Mild cognitive impairment (MCI) is considered an intermediate stage between normal aging and dementia. It is diagnosed in the presence of subjective cognitive decline and objective cognitive impairment without significant functional impairment, although there are no standard operationalizations for each of these criteria. The objective of this study is to determine which operationalization of the MCI criteria is most accurate at predicting dementia. Six-year longitudinal study, part of the Sydney Memory and Ageing Study. Community-based. 873 community-dwelling dementia-free adults between 70 and 90 years of age. Persons from a non-English speaking background were excluded. Seven different operationalizations for subjective cognitive decline and eight measures of objective cognitive impairment (resulting in 56 different MCI operational algorithms) were applied. The accuracy of each algorithm to predict progression to dementia over 6 years was examined for 618 individuals. Baseline MCI prevalence varied between 0.4% and 30.2% and dementia conversion between 15.9% and 61.9% across different algorithms. The predictive accuracy for progression to dementia was poor. The highest accuracy was achieved based on objective cognitive impairment alone. Inclusion of subjective cognitive decline or mild functional impairment did not improve dementia prediction accuracy. Not MCI, but objective cognitive impairment alone, is the best predictor for progression to dementia in a community sample. Nevertheless, clinical assessment procedures need to be refined to improve the identification of pre-dementia individuals. Copyright © 2016 American Association for Geriatric Psychiatry. Published by Elsevier Inc. All rights reserved.

  16. [Application of R-based multiple seasonal ARIMA model, in predicting the incidence of hand, foot and mouth disease in Shaanxi province].

    Science.gov (United States)

    Liu, F; Zhu, N; Qiu, L; Wang, J J; Wang, W H

    2016-08-10

    To apply the auto-regressive integrated moving average product seasonal model (multiple seasonal ARIMA) in predicting the number of hand, foot and mouth disease cases in Shaanxi province. The trend of hand, foot and mouth disease in Shaanxi province was analyzed and tested using R software, for the period between January 2009 and June 2015. A multiple seasonal ARIMA model was then fitted to the time series to predict the number of hand, foot and mouth disease cases in 2016 and 2017. A seasonal effect was seen in hand, foot and mouth disease in Shaanxi province. A multiple seasonal ARIMA (2,1,0)×(1,1,0)12 model was established, with the equation (1 − B)(1 − B^12) ln(X_t) = [(1 − 1.000B) / ((1 − 0.532B − 0.363B^2)(1 − 0.644B^12 − 0.454B^24))] ε_t. The mean absolute error and mean relative error were 531.535 and 0.114, respectively, when compared with the simulated number of patients from June to December 2015. Results under the prediction of the multiple seasonal ARIMA model showed that the numbers of patients in both 2016 and 2017 were similar to that of 2015 in Shaanxi province. The multiple seasonal ARIMA (2,1,0)×(1,1,0)12 model could be used to successfully predict the incidence of hand, foot and mouth disease in Shaanxi province.

  17. Likelihood inference for unions of interacting discs

    DEFF Research Database (Denmark)

    Møller, Jesper; Helisová, Katarina

    To the best of our knowledge, this is the first paper which discusses likelihood inference for a random set using a germ-grain model, where the individual grains are unobservable, edge effects occur, and other complications appear. We consider the case where the grains form a disc process modelled...... is specified with respect to a given marked Poisson model (i.e. a Boolean model). We show how edge effects and other complications can be handled by considering a certain conditional likelihood. Our methodology is illustrated by analyzing Peter Diggle's heather dataset, where we discuss the results...... of simulation-based maximum likelihood inference and the effect of specifying different reference Poisson models....

  18. Maximum likelihood estimation for integrated diffusion processes

    DEFF Research Database (Denmark)

    Baltazar-Larios, Fernando; Sørensen, Michael

    We propose a method for obtaining maximum likelihood estimates of parameters in diffusion models when the data is a discrete time sample of the integral of the process, while no direct observations of the process itself are available. The data are, moreover, assumed to be contaminated...... EM-algorithm to obtain maximum likelihood estimates of the parameters in the diffusion model. As part of the algorithm, we use a recent simple method for approximate simulation of diffusion bridges. In simulation studies for the Ornstein-Uhlenbeck process and the CIR process the proposed method works...... by measurement errors. Integrated volatility is an example of this type of observations. Another example is ice-core data on oxygen isotopes used to investigate paleo-temperatures. The data can be viewed as incomplete observations of a model with a tractable likelihood function. Therefore we propose a simulated...

  19. Ageing, exposure to pollution, and interactions between climate change and local seasons as oxidant conditions predicting incident hematologic malignancy at KINSHASA University clinics, Democratic Republic of CONGO (DRC).

    Science.gov (United States)

    Nkanga, Mireille Solange Nganga; Longo-Mbenza, Benjamin; Adeniyi, Oladele Vincent; Ngwidiwo, Jacques Bikaula; Katawandja, Antoine Lufimbo; Kazadi, Paul Roger Beia; Nzonzila, Alain Nganga

    2017-08-23

    The global burden of hematologic malignancy (HM) is rapidly rising, with aging, exposure to polluted environments, and global and local climate variability all being well-established conditions of oxidative stress. However, there is currently no information on the extent and predictors of HM at Kinshasa University Clinics (KUC), DR Congo (DRC). This study evaluated the impact of bio-clinical factors, exposure to polluted environments, and interactions between global climate changes (El Niño and La Niña) and local climate (dry and rainy seasons) on the incidence of HM. This hospital-based prospective cohort study was conducted at Kinshasa University Clinics in DR Congo. A total of 105 black African adult patients with anaemia between 2009 and 2016 were included. HM was confirmed by morphological typing according to the French-American-British (FAB) Classification System. Gender, age, exposure to traffic pollution and garages/stations, global climate variability (El Niño and La Niña), and local climate (dry and rainy seasons) were potential independent variables to predict incident HM using Cox regression analysis and Kaplan-Meier curves. Out of the total 105 patients, 63 experienced incident HM, an incidence rate of 60%. After adjusting for gender, HIV/AIDS, and other bio-clinical factors, the most significant independent predictors of incident HM were age ≥ 55 years (HR = 2.4; 95% CI 1.4-4.3; P = 0.003), exposure to pollution and garages/stations (HR = 4.9; 95% CI 2-12.1), the combination of the local dry season with La Niña, and the combination of the local dry season with El Niño. These findings highlight the importance of aging, pollution, the dry season, El Niño and La Niña as related to global warming as determinants of hematologic malignancies among African patients from Kinshasa, DR Congo. Cancer registries in DRC and other African countries will provide a more robust database for future research on…

  20. Maintaining symmetry of simulated likelihood functions

    DEFF Research Database (Denmark)

    Andersen, Laura Mørch

    This paper suggests solutions to two different types of simulation errors related to Quasi-Monte Carlo integration. Likelihood functions which depend on standard deviations of mixed parameters are symmetric in nature. This paper shows that antithetic draws preserve this symmetry and thereby...... improves precision substantially. Another source of error is that models testing away mixing dimensions must replicate the relevant dimensions of the quasi-random draws in the simulation of the restricted likelihood. These simulation errors are ignored in the standard estimation procedures used today...
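
    The symmetry argument can be seen numerically: with antithetic draws (each draw paired with its mirror image), a simulated likelihood evaluated at +sigma and −sigma is identical by construction, whereas plain draws break the symmetry. The toy likelihood below is an assumption for illustration, not the paper's model.

```python
import numpy as np
from scipy import stats

# Simulated likelihood of a mixed parameter b ~ N(mu, sigma^2):
# L(mu, sigma) = E_z[ f(y | mu + sigma * z) ], approximated over draws of z.
def sim_lik(mu, sigma, z, y=1.3):
    return np.mean(stats.norm.pdf(y, loc=mu + sigma * z, scale=1.0))

rng = np.random.default_rng(5)
z = rng.standard_normal(250)

plain = [sim_lik(0.0, s, z) for s in (0.5, -0.5)]
z_anti = np.concatenate([z, -z])            # antithetic: add mirrored draws
anti = [sim_lik(0.0, s, z_anti) for s in (0.5, -0.5)]

print(plain)   # differs for +sigma and -sigma: spurious asymmetry
print(anti)    # identical by construction: symmetry in sigma is preserved
```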

  1. Likelihood inference for unions of interacting discs

    DEFF Research Database (Denmark)

    Møller, Jesper; Helisova, K.

    2010-01-01

    This is probably the first paper which discusses likelihood inference for a random set using a germ-grain model, where the individual grains are unobservable, edge effects occur and other complications appear. We consider the case where the grains form a disc process modelled by a marked point...... process, where the germs are the centres and the marks are the associated radii of the discs. We propose to use a recent parametric class of interacting disc process models, where the minimal sufficient statistic depends on various geometric properties of the random set, and the density is specified......-based maximum likelihood inference and the effect of specifying different reference Poisson models....

  2. Composite likelihood estimation of demographic parameters

    Directory of Open Access Journals (Sweden)

    Garrigan Daniel

    2009-11-01

    Abstract Background Most existing likelihood-based methods for fitting historical demographic models to DNA sequence polymorphism data do not scale feasibly up to the level of whole-genome data sets. Computational economies can be achieved by incorporating two forms of pseudo-likelihood: composite and approximate likelihood methods. Composite likelihood enables scaling up to large data sets because it takes the product of marginal likelihoods as an estimator of the likelihood of the complete data set. This approach is especially useful when a large number of genomic regions constitutes the data set. Additionally, approximate likelihood methods can reduce the dimensionality of the data by summarizing the information in the original data by either a sufficient statistic or a set of statistics. Both composite and approximate likelihood methods hold promise for analyzing large data sets or for use in situations where the underlying demographic model is complex and has many parameters. This paper considers a simple demographic model of allopatric divergence between two populations, in which one of the populations is hypothesized to have experienced a founder event, or population bottleneck. A large resequencing data set from human populations is summarized by the joint frequency spectrum, which is a matrix of the genomic frequency spectrum of derived base frequencies in two populations. A Bayesian Metropolis-coupled Markov chain Monte Carlo (MCMCMC) method for parameter estimation is developed that uses both composite and approximate likelihood methods and is applied to the three different pairwise combinations of the human population resequencing data. The accuracy of the method is also tested on data sets sampled from a simulated population model with known parameters. Results The Bayesian MCMCMC method also estimates the ratio of effective population size for the X chromosome versus that of the autosomes. The method is shown to estimate, with reasonable…
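
    A minimal sketch of the composite-likelihood idea under toy assumptions: independent "regions" each contribute a marginal (here Poisson) log-likelihood, and their sum is maximised as if it were a full likelihood. The data and one-parameter model are illustrative, not the paper's demographic model.

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.special import gammaln

# Hypothetical per-region data: counts of derived alleles and region lengths.
counts = np.array([3, 7, 2, 5, 4, 6, 1, 8, 3, 5])
lengths = np.array([1.0, 2.0, 0.5, 1.5, 1.0, 2.0, 0.5, 2.5, 1.0, 1.5])

def neg_composite_loglik(theta):
    # Sum of marginal Poisson log-likelihoods, one term per region:
    # the composite likelihood is the product of these marginals.
    lam = theta * lengths
    return -np.sum(counts * np.log(lam) - lam - gammaln(counts + 1))

res = minimize_scalar(neg_composite_loglik, bounds=(1e-6, 50), method="bounded")
print(res.x)   # composite-ML estimate of theta
```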

  3. Subclinical carotid atherosclerosis and triglycerides predict the incidence of chronic kidney disease in the Japanese general population: results from the Kyushu and Okinawa Population Study (KOPS).

    Science.gov (United States)

    Shimizu, Motohiro; Furusyo, Norihiro; Mitsumoto, Fujiko; Takayama, Koji; Ura, Kazuya; Hiramine, Satoshi; Ikezaki, Hiroaki; Ihara, Takeshi; Mukae, Haru; Ogawa, Eiichi; Toyoda, Kazuhiro; Kainuma, Mosaburo; Murata, Masayuki; Hayashi, Jun

    2015-02-01

    To examine whether or not subclinical atherosclerosis independently predicts the incidence of chronic kidney disease (CKD) in the Japanese general population. This study is part of the Kyushu and Okinawa Population Study (KOPS), a survey of vascular events associated with lifestyle-related diseases. Participants who attended both baseline (2004-2007) and follow-up (2009-2012) examinations were eligible. The common carotid intima-media thickness (IMT) was assessed for each participant at baseline. The end point was the incidence of CKD, defined as a reduced estimated glomerular filtration rate (eGFR <60 mL/min/1.73 m(2)). Participants who developed CKD had higher baseline triglycerides (1.6 ± 0.8 vs. 1.3 ± 0.7 mmol/L, P …). Higher carotid IMT and higher triglycerides (OR 1.35, 95% CI 1.06-1.73, P = 0.015) at baseline were independent predictors for the development of CKD. Higher carotid IMT and hypertriglyceridemia were independently associated with the development of CKD in the population studied. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  4. Predicting Teacher Likelihood to Use School Gardens: A Case Study

    Science.gov (United States)

    Kincy, Natalie; Fuhrman, Nicholas E.; Navarro, Maria; Knauft, David

    2016-01-01

    A quantitative survey, built around the theory of planned behavior, was used to investigate elementary teachers' attitudes, school norms, perceived behavioral control, and intent in both current and ideal teaching situations toward using gardens in their curriculum. With positive school norms and teachers who garden in their personal time, 77% of…

  5. Efficient Bit-to-Symbol Likelihood Mappings

    Science.gov (United States)

    Moision, Bruce E.; Nakashima, Michael A.

    2010-01-01

    This innovation is an efficient algorithm designed to perform bit-to-symbol and symbol-to-bit likelihood mappings that represent a significant portion of the complexity of an error-correction code decoder for high-order constellations. Recent implementation of the algorithm in hardware has yielded an 8-percent reduction in overall area relative to the prior design.
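
    The abstract does not spell out the algorithm itself, so the sketch below shows only the generic max-log symbol-to-bit LLR mapping that such decoders compute, for a Gray-mapped 4-PAM toy constellation; it is not the patented innovation.

```python
import numpy as np

# Max-log bit LLRs for a Gray-mapped 4-PAM constellation (2 bits/symbol).
symbols = np.array([-3.0, -1.0, 1.0, 3.0])
labels = np.array([[0, 0], [0, 1], [1, 1], [1, 0]])   # Gray mapping

def bit_llrs(r, n0=1.0):
    """LLR(b) = log p(b=0|r) - log p(b=1|r), max-log approximation."""
    d2 = (r - symbols) ** 2                 # squared distance to each symbol
    llrs = []
    for b in range(labels.shape[1]):
        d0 = d2[labels[:, b] == 0].min()    # best symbol carrying bit = 0
        d1 = d2[labels[:, b] == 1].min()    # best symbol carrying bit = 1
        llrs.append((d1 - d0) / n0)
    return np.array(llrs)

print(bit_llrs(0.8))   # positive entries favour bit=0, negative favour bit=1
```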

  6. Likelihood-ratio-based biometric verification

    NARCIS (Netherlands)

    Bazen, A.M.; Veldhuis, Raymond N.J.

    2002-01-01

    This paper presents results on optimal similarity measures for biometric verification based on fixed-length feature vectors. First, we show that the verification of a single user is equivalent to the detection problem, which implies that for single-user verification the likelihood ratio is optimal.

  7. Likelihood Ratio-Based Biometric Verification

    NARCIS (Netherlands)

    Bazen, A.M.; Veldhuis, Raymond N.J.

    The paper presents results on optimal similarity measures for biometric verification based on fixed-length feature vectors. First, we show that the verification of a single user is equivalent to the detection problem, which implies that, for single-user verification, the likelihood ratio is optimal.
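
    A minimal numerical sketch of the likelihood-ratio decision for fixed-length feature vectors described in the two records above, assuming Gaussian within-user and between-user models; the dimensions, variances and enrolled template are hypothetical.

```python
import numpy as np
from scipy import stats

# Likelihood-ratio verification: accept an identity claim if
# p(x | claimed user) / p(x | background population) exceeds a threshold.
rng = np.random.default_rng(6)
d = 8
user_mean = rng.normal(0, 1, d)             # enrolled template (hypothetical)
within, between = 0.5, 1.0                  # within/between-user std devs

def log_lr(x):
    num = stats.multivariate_normal.logpdf(x, user_mean, within**2 * np.eye(d))
    den = stats.multivariate_normal.logpdf(x, np.zeros(d),
                                           (within**2 + between**2) * np.eye(d))
    return num - den

genuine = user_mean + rng.normal(0, within, d)
impostor = rng.normal(0, np.sqrt(within**2 + between**2), d)
print(log_lr(genuine), log_lr(impostor))    # genuine should score higher
```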

  8. An optimized Nash nonlinear grey Bernoulli model based on particle swarm optimization and its application in prediction for the incidence of Hepatitis B in Xinjiang, China.

    Science.gov (United States)

    Zhang, Liping; Zheng, Yanling; Wang, Kai; Zhang, Xueliang; Zheng, Yujian

    2014-06-01

    In this paper, by using a particle swarm optimization algorithm to solve the optimal parameter estimation problem, an improved Nash nonlinear grey Bernoulli model termed PSO-NNGBM(1,1) is proposed. To test the forecasting performance, the optimized model is applied for forecasting the incidence of hepatitis B in Xinjiang, China. Four models, traditional GM(1,1), grey Verhulst model (GVM), original nonlinear grey Bernoulli model (NGBM(1,1)) and Holt-Winters exponential smoothing method, are also established for comparison with the proposed model under the criteria of mean absolute percentage error and root mean square percent error. The prediction results show that the optimized NNGBM(1,1) model is more accurate and performs better than the traditional GM(1,1), GVM, NGBM(1,1) and Holt-Winters exponential smoothing method. Copyright © 2014. Published by Elsevier Ltd.

  9. To Study the Incidence, Predictive Factors and Clinical Outcome of Spontaneous Bacterial Peritonitis in Patients of Cirrhosis with Ascites.

    Science.gov (United States)

    Paul, Kavita; Kaur, Jasmine; Kazal, Harbans Lal

    2015-07-01

    To study the prevalence and predictive factors of spontaneous bacterial peritonitis (SBP) in patients of cirrhosis with ascites, and to study the clinical characteristics and prognosis of patients with SBP. The present study was conducted on 122 cases admitted to the Department of Medicine, through emergency, at Guru Gobind Singh Medical College and Hospital, Faridkot, Punjab, India. Cases of cirrhosis (irrespective of aetiology) with ascites between the ages of 18-75 years were included in this study. Ascitic fluid of every patient was aspirated under aseptic measures, before initiation of antibiotic therapy, and was sent for biochemical analysis, culture and cytological analysis. The mean age of the patients enrolled was 50.30 ± 10.98 years; 85% were male and 15% were female. Alcohol (73.8%) was the leading cause of cirrhosis followed by HCV (37.7%) and HBV (4.9%). Of the 122 patients studied, 27 (20.4%) patients were diagnosed as having SBP or its variants. Monomicrobial Bacterascites (BA) was present in 5 patients and Culture Negative Neutrocytic Ascites (CNNA) was present in 22 patients. Escherichia coli was the most commonly isolated organism, followed by Klebsiella. The various factors that predispose to the development of SBP include a low ascitic fluid protein concentration, a high level of serum bilirubin, deranged serum creatinine, a high Child-Pugh score and a high MELD score. Ascitic fluid analysis remains the single most important test for identifying and assessing the course of SBP. Inoculation of 10-20 ml of ascitic fluid into a culture bottle at the patient's bedside will yield better results. Early diagnosis and treatment will reduce the mortality rate in these patients.

  10. Deformation of log-likelihood loss function for multiclass boosting.

    Science.gov (United States)

    Kanamori, Takafumi

    2010-09-01

    The purpose of this paper is to study loss functions in multiclass classification. In classification problems, the decision function is estimated by minimizing an empirical loss function, and then, the output label is predicted by using the estimated decision function. We propose a class of loss functions which is obtained by a deformation of the log-likelihood loss function. There are four main reasons why we focus on the deformed log-likelihood loss function: (1) this is a class of loss functions which has not been deeply investigated so far, (2) in terms of computation, a boosting algorithm with a pseudo-loss is available to minimize the proposed loss function, (3) the proposed loss functions provide a clear correspondence between the decision functions and conditional probabilities of output labels, (4) the proposed loss functions satisfy the statistical consistency of the classification error rate which is a desirable property in classification problems. Based on (3), we show that the deformed log-likelihood loss provides a model of mislabeling which is useful as a statistical model of medical diagnostics. We also propose a robust loss function against outliers in multiclass classification based on our approach. The robust loss function is a natural extension of the existing robust loss function for binary classification. A model of mislabeling and a robust loss function are useful to cope with noisy data. Some numerical studies are presented to show the robustness of the proposed loss function. A mathematical characterization of the deformed log-likelihood loss function is also presented. Copyright 2010 Elsevier Ltd. All rights reserved.

  11. Bias correction in the hierarchical likelihood approach to the analysis of multivariate survival data.

    Science.gov (United States)

    Jeon, Jihyoun; Hsu, Li; Gorfine, Malka

    2012-07-01

    Frailty models are useful for measuring unobserved heterogeneity in risk of failures across clusters, providing cluster-specific risk prediction. In a frailty model, the latent frailties shared by members within a cluster are assumed to act multiplicatively on the hazard function. In order to obtain parameter and frailty variate estimates, we consider the hierarchical likelihood (H-likelihood) approach (Ha, Lee and Song, 2001. Hierarchical-likelihood approach for frailty models. Biometrika 88, 233-243) in which the latent frailties are treated as "parameters" and estimated jointly with other parameters of interest. We find that the H-likelihood estimators perform well when the censoring rate is low, however, they are substantially biased when the censoring rate is moderate to high. In this paper, we propose a simple and easy-to-implement bias correction method for the H-likelihood estimators under a shared frailty model. We also extend the method to a multivariate frailty model, which incorporates complex dependence structure within clusters. We conduct an extensive simulation study and show that the proposed approach performs very well for censoring rates as high as 80%. We also illustrate the method with a breast cancer data set. Since the H-likelihood is the same as the penalized likelihood function, the proposed bias correction method is also applicable to the penalized likelihood estimators.

  12. Phylogenetic analysis using parsimony and likelihood methods.

    Science.gov (United States)

    Yang, Z

    1996-02-01

    The assumptions underlying the maximum-parsimony (MP) method of phylogenetic tree reconstruction were intuitively examined by studying the way the method works. Computer simulations were performed to corroborate the intuitive examination. Parsimony appears to involve very stringent assumptions concerning the process of sequence evolution, such as constancy of substitution rates between nucleotides, constancy of rates across nucleotide sites, and equal branch lengths in the tree. For practical data analysis, the requirement of equal branch lengths means similar substitution rates among lineages (the existence of an approximate molecular clock), relatively long interior branches, and also few species in the data. However, a small amount of evolution is neither a necessary nor a sufficient requirement of the method. The difficulties involved in the application of current statistical estimation theory to tree reconstruction were discussed, and it was suggested that the approach proposed by Felsenstein (1981, J. Mol. Evol. 17: 368-376) for topology estimation, as well as its many variations and extensions, differs fundamentally from the maximum likelihood estimation of a conventional statistical parameter. Evidence was presented showing that the Felsenstein approach does not share the asymptotic efficiency of the maximum likelihood estimator of a statistical parameter. Computer simulations were performed to study the probability that MP recovers the true tree under a hierarchy of models of nucleotide substitution; its performance relative to the likelihood method was especially noted. The results appeared to support the intuitive examination of the assumptions underlying MP. When a simple model of nucleotide substitution was assumed to generate data, the probability that MP recovers the true topology could be as high as, or even higher than, that for the likelihood method. When the assumed model became more complex and realistic, e.g., when substitution rates were…

  13. Adiposity to muscle ratio predicts incident physical limitation in a cohort of 3,153 older adults--an alternative measurement of sarcopenia and sarcopenic obesity.

    Science.gov (United States)

    Auyeung, Tung Wai; Lee, Jenny Shun Wah; Leung, Jason; Kwok, Timothy; Woo, Jean

    2013-08-01

    Conventionally, sarcopenia is defined by muscle mass and physical performance. We hypothesized that the disability caused by sarcopenia and sarcopenic obesity was related to the amount of adiposity or body weight bearing on a unit of muscle mass, or the adiposity to muscle ratio. We therefore examined whether this ratio could predict physical limitation by secondary analysis of the data in our previous study. We recruited 3,153 community-dwelling adults aged >65 years and their body composition was measured by dual-energy X-ray absorptiometry. Assessment of physical limitation was undertaken 4 years later. The relationship between baseline adiposity to muscle ratio and incident physical limitation was examined by logistic regression. In men, the adiposity to muscle ratios, namely total body fat to lower-limb muscle mass, total body fat to fat-free mass (FFM), and body weight to FFM, were predictive of physical limitation before and after adjustment for the covariates: age, Mini-mental Status Examination score, Geriatric Depression Scale score >8, and the diagnosis of chronic obstructive pulmonary disease, diabetes mellitus, hypertension, heart disease, and stroke (all p values …). In women, the same ratios predicted physical limitation 4 years later both before and after adjustment for the same set of covariates (all p values …). The ratio predicted physical limitation in older women across the entire range of the total body fat to lower-limb muscle mass ratio, and in older men when this ratio was equal to or greater than 0.75.

  14. Factors Associated with Young Adults’ Pregnancy Likelihood

    Science.gov (United States)

    Kitsantas, Panagiota; Lindley, Lisa L.; Wu, Huichuan

    2014-01-01

    OBJECTIVES While progress has been made to reduce adolescent pregnancies in the United States, rates of unplanned pregnancy among young adults (18–29 years) remain high. In this study, we assessed factors associated with perceived likelihood of pregnancy (likelihood of getting pregnant/getting partner pregnant in the next year) among sexually experienced young adults who were not trying to get pregnant and had ever used contraceptives. METHODS We conducted a secondary analysis of 660 young adults, 18–29 years old in the United States, from the cross-sectional National Survey of Reproductive and Contraceptive Knowledge. Logistic regression and classification tree analyses were conducted to generate profiles of young adults most likely to report anticipating a pregnancy in the next year. RESULTS Nearly one-third (32%) of young adults indicated they believed they had at least some likelihood of becoming pregnant in the next year. Young adults who believed that avoiding pregnancy was not very important were most likely to report pregnancy likelihood (odds ratio [OR], 5.21; 95% CI, 2.80–9.69), as were young adults for whom avoiding a pregnancy was important but not satisfied with their current contraceptive method (OR, 3.93; 95% CI, 1.67–9.24), attended religious services frequently (OR, 3.0; 95% CI, 1.52–5.94), were uninsured (OR, 2.63; 95% CI, 1.31–5.26), and were likely to have unprotected sex in the next three months (OR, 1.77; 95% CI, 1.04–3.01). DISCUSSION These results may help guide future research and the development of pregnancy prevention interventions targeting sexually experienced young adults. PMID:25782849

  15. Review of Elaboration Likelihood Model of persuasion

    OpenAIRE

    藤原, 武弘; 神山, 貴弥

    1989-01-01

    This article mainly introduces the Elaboration Likelihood Model (ELM), proposed by Petty & Cacioppo, that is, a general attitude change theory. ELM postulates two routes to persuasion: the central and the peripheral route. Attitude change by the central route is viewed as resulting from a diligent consideration of the issue-relevant information presented. On the other hand, attitude change by the peripheral route is viewed as resulting from peripheral cues in the persuasion context. Secondly we compare these tw...

  16. Unbinned likelihood analysis of EGRET observations

    International Nuclear Information System (INIS)

    Digel, Seth W.

    2000-01-01

    We present a newly-developed likelihood analysis method for EGRET data that defines the likelihood function without binning the photon data or averaging the instrumental response functions. The standard likelihood analysis applied to EGRET data requires the photons to be binned spatially and in energy, and the point-spread functions to be averaged over energy and inclination angle. The full-width half maximum of the point-spread function increases by about 40% from on-axis to 30° inclination, and depending on the binning in energy can vary by more than that in a single energy bin. The new unbinned method avoids the loss of information that binning and averaging cause and can properly analyze regions where EGRET viewing periods overlap and photons with different inclination angles would otherwise be combined in the same bin. In the poster, we describe the unbinned analysis method and compare its sensitivity with binned analysis for detecting point sources in EGRET data
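
    The unbinned approach evaluates the likelihood photon by photon, log L = Σ_i log λ(x_i) − ∫ λ(x) dx, rather than over binned counts. Below is a 1-D toy version with a Gaussian stand-in for the point-spread function; the region, PSF and counts are assumptions, not EGRET specifics.

```python
import numpy as np
from scipy.optimize import minimize

# Unbinned extended likelihood for a point source of strength s on a flat
# background b over [0, region]: lam(x) = b + s * psf(x).
center, width, region = 5.0, 0.3, 10.0

def psf(x):
    # Gaussian stand-in for the instrument point-spread function.
    return np.exp(-0.5 * ((x - center) / width) ** 2) / (width * np.sqrt(2 * np.pi))

rng = np.random.default_rng(7)
photons = np.concatenate([rng.uniform(0, region, 200),     # background
                          rng.normal(center, width, 50)])  # source

def neg_loglik(p):
    b, s = p
    lam = b + s * psf(photons)
    # Integral of lam over the region: b*region + s (the PSF integrates to ~1).
    return -(np.sum(np.log(lam)) - (b * region + s))

res = minimize(neg_loglik, x0=[10.0, 10.0], bounds=[(1e-6, None)] * 2)
print(res.x)   # fitted background density and source counts (~20, ~50)
```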

  17. Utility of the triglyceride level for predicting incident diabetes mellitus according to the fasting status and body mass index category: the Ibaraki Prefectural Health Study.

    Science.gov (United States)

    Fujihara, Kazuya; Sugawara, Ayumi; Heianza, Yoriko; Sairenchi, Toshimi; Irie, Fujiko; Iso, Hiroyasu; Doi, Mikio; Shimano, Hitoshi; Watanabe, Hiroshi; Sone, Hirohito; Ota, Hitoshi

    2014-01-01

    The levels of lipids, especially triglycerides (TG), and obesity are associated with diabetes mellitus (DM). Although typically measured in fasting individuals, non-fasting lipid measurements play an important role in predicting future DM. This study compared the predictive efficacy of lipid variables according to the fasting status and body mass index (BMI) category. Data were collected for 39,196 nondiabetic men and 87,980 nondiabetic women 40-79 years of age who underwent health checkups in Ibaraki Prefecture, Japan in 1993 and were followed through 2007. The hazard ratios (HRs) for DM in relation to sex, the fasting status and BMI were estimated using a Cox proportional hazards model. A total of 8,867 participants, 4,012 men and 4,855 women, developed DM during a mean follow-up of 5.5 years. TG was found to be an independent predictor of incident DM in both fasting and non-fasting men and non-fasting women. The multivariable-adjusted HR for DM according to the TG quartile (Q) 4 vs. Q1 was 1.18 (95% confidence interval (CI): 1.05, 1.34) in the non-fasting men with a normal BMI (18.5-24.9). This trend was also observed in the non-fasting women with a normal BMI. That is, the multivariable-adjusted HRs for DM for TG Q2, Q3 and Q4 compared with Q1 were 1.07 (95% CI: 0.94, 1.23), 1.17 (95%CI: 1.03, 1.34) and 1.48 (95%CI: 1.30, 1.69), respectively. The fasting and non-fasting TG levels in men and non-fasting TG levels in women are predictive of future DM among those with a normal BMI. Clinicians must pay attention to those individuals at high risk for DM.

  18. Prediction of incidence and bio-psycho-socio-cultural risk factors of post-partum depression immediately after birth in an Iranian population.

    Science.gov (United States)

    Abdollahi, Fatemeh; Zarghami, Mehran; Sazlina, Shariff-Ghazali; Zain, Azhar Md; Mohammad, Asghari Jafarabadi; Lye, Munn-Sann

    2016-10-01

    Post-partum depression (PPD) is the most prevalent mental problem associated with childbirth. The purpose of the present study was to determine, for the first time, the incidence of early PPD and possible relevant risk factors among women attending primary health centers in Mazandaran province, Iran. A longitudinal cohort study was conducted among 2279 eligible women during weeks 32-42 of pregnancy to determine bio-psycho-socio-cultural risk factors of depression at 2 weeks post-partum using the Iranian version of the Edinburgh Postnatal Depression Scale (EPDS). Univariate and hierarchical multiple logistic regression models were used for data analysis. Among 1,739 mothers whose EPDS scores were ≤ 12 during weeks 32-42 of gestation and at the follow-up study, the cumulative incidence rate of depression was 6.9% (120/1,739) at 2 weeks post-partum. In the multivariate model, the factor that predicted depression symptomatology at 2 weeks post-partum was having psychiatric distress in pregnancy based on the General Health Questionnaire (GHQ) (OR = 1.06, 95% CI: 1.04-1.09, p = 0.001). The risk of PPD was also lower in those with sufficient parenting skills (OR = 0.78, 95% CI: 0.69-0.88, p = 0.001), increased marital satisfaction (OR = 0.94, 95% CI: 0.9-0.99, p = 0.03), increased frequency of practicing rituals (OR = 0.94, 95% CI: 0.89-0.99, p = 0.004) and in those whose husbands had better education (OR = 0.93, 95% CI: 0.88-0.99, p = 0.04). The findings indicated that a combination of demographic, sociological, psychological and cultural risk factors can make mothers vulnerable to PPD.

  19. Transfer Entropy as a Log-Likelihood Ratio

    Science.gov (United States)

    Barnett, Lionel; Bossomaier, Terry

    2012-09-01

    Transfer entropy, an information-theoretic measure of time-directed information transfer between joint processes, has steadily gained popularity in the analysis of complex stochastic dynamics in diverse fields, including the neurosciences, ecology, climatology, and econometrics. We show that for a broad class of predictive models, the log-likelihood ratio test statistic for the null hypothesis of zero transfer entropy is a consistent estimator for the transfer entropy itself. For finite Markov chains, furthermore, no explicit model is required. In the general case, an asymptotic χ2 distribution is established for the transfer entropy estimator. The result generalizes the equivalence in the Gaussian case of transfer entropy and Granger causality, a statistical notion of causal influence based on prediction via vector autoregression, and establishes a fundamental connection between directed information transfer and causality in the Wiener-Granger sense.
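
    In the Gaussian/VAR case mentioned above, the transfer entropy can be computed directly from the residual variances of the restricted and full autoregressions; the sketch below does this for a toy coupled pair of series (the coupling strength and lag structure are assumptions for illustration).

```python
import numpy as np

# Gaussian transfer entropy from Y to X:
# TE = 0.5 * log( var(X_t | past X) / var(X_t | past X, past Y) ),
# i.e. the log-likelihood-ratio statistic divided by 2n in this case.
rng = np.random.default_rng(8)
n = 5000
y = rng.standard_normal(n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = 0.5 * x[t - 1] + 0.4 * y[t - 1] + rng.standard_normal()

def resid_var(target, regressors):
    beta, *_ = np.linalg.lstsq(regressors, target, rcond=None)
    r = target - regressors @ beta
    return np.mean(r**2)

xt, xp, yp = x[1:], x[:-1, None], y[:-1, None]
v_reduced = resid_var(xt, xp)                   # past of X only
v_full = resid_var(xt, np.hstack([xp, yp]))     # past of X and of Y
te = 0.5 * np.log(v_reduced / v_full)
print(te)   # > 0: Y carries time-directed information about X
```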

  20. Regularization parameter selection methods for ill-posed Poisson maximum likelihood estimation

    International Nuclear Information System (INIS)

    Bardsley, Johnathan M; Goldes, John

    2009-01-01

    In image processing applications, image intensity is often measured via the counting of incident photons emitted by the object of interest. In such cases, image data noise is accurately modeled by a Poisson distribution. This motivates the use of Poisson maximum likelihood estimation for image reconstruction. However, when the underlying model equation is ill-posed, regularization is needed. Regularized Poisson likelihood estimation has been studied extensively by the authors, though a problem of high importance remains: the choice of the regularization parameter. We will present three statistically motivated methods for choosing the regularization parameter, and numerical examples will be presented to illustrate their effectiveness
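
    A minimal sketch of regularized Poisson maximum likelihood for a toy 1-D deblurring problem, using Tikhonov regularization and a hand-picked regularization parameter alpha, which is precisely the quantity whose selection the paper addresses. The blur model and sizes are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

# Minimise the Poisson negative log-likelihood plus a quadratic penalty:
#   sum(A u - y * log(A u)) + alpha/2 * ||u||^2,  subject to u >= 0.
rng = np.random.default_rng(9)
m = 40
true = np.zeros(m); true[15:25] = 20.0
A = np.array([[np.exp(-0.5 * ((i - j) / 2.0) ** 2) for j in range(m)]
              for i in range(m)])
A /= A.sum(axis=1, keepdims=True)            # normalised Gaussian blur
y = rng.poisson(A @ true)                    # photon-count data

def objective(u, alpha):
    Au = A @ u + 1e-12                       # guard against log(0)
    return np.sum(Au - y * np.log(Au)) + 0.5 * alpha * np.sum(u**2)

alpha = 0.05                                 # hand-picked; choosing it well
res = minimize(objective, np.ones(m),        # is the open problem discussed
               args=(alpha,), bounds=[(0, None)] * m)
print(np.round(res.x[12:28], 1))             # reconstruction near the bump
```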

  1. Dimension-Independent Likelihood-Informed MCMC

    KAUST Repository

    Cui, Tiangang; Law, Kody; Marzouk, Youssef

    2015-01-01

    Many Bayesian inference problems require exploring the posterior distribution of high-dimensional parameters, which in principle can be described as functions. By exploiting low-dimensional structure in the change from prior to posterior [distributions], we introduce a suite of MCMC samplers that can adapt to the complex structure of the posterior distribution, yet are well-defined on function space. Posterior sampling in nonlinear inverse problems arising from various partial differential equations and also a stochastic differential equation are used to demonstrate the efficiency of these dimension-independent likelihood-informed samplers.

  2. Multi-Channel Maximum Likelihood Pitch Estimation

    DEFF Research Database (Denmark)

    Christensen, Mads Græsbøll

    2012-01-01

    In this paper, a method for multi-channel pitch estimation is proposed. The method is a maximum likelihood estimator and is based on a parametric model where the signals in the various channels share the same fundamental frequency but can have different amplitudes, phases, and noise characteristics....... This essentially means that the model allows for different conditions in the various channels, like different signal-to-noise ratios, microphone characteristics and reverberation. Moreover, the method does not assume that a certain array structure is used but rather relies on a more general model and is hence...

  3. Dimension-Independent Likelihood-Informed MCMC

    KAUST Repository

    Cui, Tiangang

    2015-01-07

    Many Bayesian inference problems require exploring the posterior distribution of high-dimensional parameters, which in principle can be described as functions. By exploiting low-dimensional structure in the change from prior to posterior [distributions], we introduce a suite of MCMC samplers that can adapt to the complex structure of the posterior distribution, yet are well-defined on function space. Posterior sampling in nonlinear inverse problems arising from various partial differential equations and also a stochastic differential equation are used to demonstrate the efficiency of these dimension-independent likelihood-informed samplers.

  4. Approximate maximum parsimony and ancestral maximum likelihood.

    Science.gov (United States)

    Alon, Noga; Chor, Benny; Pardi, Fabio; Rapoport, Anat

    2010-01-01

    We explore the maximum parsimony (MP) and ancestral maximum likelihood (AML) criteria in phylogenetic tree reconstruction. Both problems are NP-hard, so we seek approximate solutions. We formulate the two problems as Steiner tree problems under appropriate distances. The gist of our approach is the succinct characterization of Steiner trees for a small number of leaves for the two distances. This enables the use of known Steiner tree approximation algorithms. The approach leads to a 16/9 approximation ratio for AML and asymptotically to a 1.55 approximation ratio for MP.

  5. Assessment of the fatty liver index as an indicator of hepatic steatosis for predicting incident diabetes independently of insulin resistance in a Korean population.

    Science.gov (United States)

    Jung, C H; Lee, W J; Hwang, J Y; Yu, J H; Shin, M S; Lee, M J; Jang, J E; Leem, J; Park, J-Y; Kim, H-K

    2013-04-01

    Fatty liver disease, especially non-alcoholic fatty liver disease, is considered to be the hepatic manifestation of the metabolic syndrome, both closely associated with insulin resistance. Furthermore, fatty liver disease assessed by ultrasonography is known to be a predictor of the development of Type 2 diabetes mellitus. However, it remains unclear whether fatty liver disease plays a role in the pathogenesis of Type 2 diabetes independently of insulin resistance. In this study, we investigated whether fatty liver disease assessed by the fatty liver index can predict the development of Type 2 diabetes independently of systemic insulin resistance. We examined the clinical and laboratory data of 7860 subjects without diabetes who underwent general routine health evaluations at the Asan Medical Center in 2007 and had returned for follow-up examinations in 2011. The fatty liver index was calculated using an equation that considers serum triglyceride levels, γ-glutamyltransferase, waist circumference and BMI. During the 4-year period, 457 incident diabetes cases (5.8%) were identified. The odds ratios for the development of Type 2 diabetes were significantly higher in the group with a fatty liver index ≥ 60 (fatty liver index-positive) than in the group with a fatty liver index < 60 (fatty liver index-negative). The fatty liver index as an indicator of hepatic steatosis is valuable in identifying subjects at high risk for Type 2 diabetes. In addition, fatty liver disease itself contributes to the development of Type 2 diabetes independently of systemic insulin resistance. © 2012 The Authors. Diabetic Medicine © 2012 Diabetes UK.
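
    For reference, the fatty liver index is usually computed with the formula published by Bedogni et al. (2006); the abstract above does not restate it, so the coefficients below are quoted from that source rather than from this study.

```python
import math

def fatty_liver_index(tg_mg_dl, bmi, ggt_u_l, waist_cm):
    """Fatty liver index (Bedogni et al., 2006): a 0-100 score, with >= 60
    taken as fatty liver index-positive in the study above.
    TG in mg/dL, GGT in U/L, waist circumference in cm."""
    z = (0.953 * math.log(tg_mg_dl) + 0.139 * bmi
         + 0.718 * math.log(ggt_u_l) + 0.053 * waist_cm - 15.745)
    return 100 * math.exp(z) / (1 + math.exp(z))

# Hypothetical subject: moderately raised TG, GGT and waist circumference.
print(fatty_liver_index(tg_mg_dl=180, bmi=28.5, ggt_u_l=60, waist_cm=95))
```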

  6. Wheelchair incidents

    NARCIS (Netherlands)

    Drongelen AW van; Roszek B; Hilbers-Modderman ESM; Kallewaard M; Wassenaar C; LGM

    2002-01-01

    This RIVM study was performed to gain insight into wheelchair-related incidents with powered and manual wheelchairs reported to the USA FDA, the British MDA and the Dutch Center for Quality and Usability Research of Technical Aids (KBOH). The data in the databases do not indicate that incidents with…

  7. Simulation-based marginal likelihood for cluster strong lensing cosmology

    Science.gov (United States)

    Killedar, M.; Borgani, S.; Fabjan, D.; Dolag, K.; Granato, G.; Meneghetti, M.; Planelles, S.; Ragone-Figueroa, C.

    2018-01-01

    Comparisons between observed and predicted strong lensing properties of galaxy clusters have been routinely used to claim either tension or consistency with Λ cold dark matter cosmology. However, standard approaches to such cosmological tests are unable to quantify the preference for one cosmology over another. We advocate approximating the relevant Bayes factor using a marginal likelihood that is based on the following summary statistic: the posterior probability distribution function for the parameters of the scaling relation between Einstein radii and cluster mass, α and β. We demonstrate, for the first time, a method of estimating the marginal likelihood using the X-ray selected z > 0.5 Massive Cluster Survey clusters as a case in point and employing both N-body and hydrodynamic simulations of clusters. We investigate the uncertainty in this estimate and consequential ability to compare competing cosmologies, which arises from incomplete descriptions of baryonic processes, discrepancies in cluster selection criteria, redshift distribution and dynamical state. The relation between triaxial cluster masses at various overdensities provides a promising alternative to the strong lensing test.

  8. Corporate brand extensions based on the purchase likelihood: governance implications

    Directory of Open Access Journals (Sweden)

    Spyridon Goumas

    2018-03-01

    This paper examines the purchase likelihood of hypothetical service brand extensions from product companies focusing on consumer electronics, based on sector categorization and perceptions of fit between the existing product category and the image of the company. Prior research has recognized that the level of brand knowledge eases the transference of associations and affect to the new products. Similarity to the existing products of the parent company and perceived image also influence the success of brand extensions. However, sector categorization may interfere with this relationship. The purpose of this study is to examine Greek consumers' attitudes towards hypothetical brand extensions, and how these are affected by consumers' existing knowledge about the brand, sector categorization and perceptions of image and category fit of cross-sector extensions. This aim is examined in the context of technological categories, where less-known companies exhibited significant purchase likelihood and, in contradiction to the existing literature, service companies did not perform as positively as expected. Additional insights into the existing literature about sector categorization are provided. The effect of both image and category fit is also examined and predictions regarding the effect of each are made.

  9. Maximum Likelihood Reconstruction for Magnetic Resonance Fingerprinting.

    Science.gov (United States)

    Zhao, Bo; Setsompop, Kawin; Ye, Huihui; Cauley, Stephen F; Wald, Lawrence L

    2016-08-01

    This paper introduces a statistical estimation framework for magnetic resonance (MR) fingerprinting, a recently proposed quantitative imaging paradigm. Within this framework, we present a maximum likelihood (ML) formalism to estimate multiple MR tissue parameter maps directly from highly undersampled, noisy k-space data. A novel algorithm, based on variable splitting, the alternating direction method of multipliers, and the variable projection method, is developed to solve the resulting optimization problem. Representative results from both simulations and in vivo experiments demonstrate that the proposed approach yields significantly improved accuracy in parameter estimation, compared to the conventional MR fingerprinting reconstruction. Moreover, the proposed framework provides new theoretical insights into the conventional approach. We show analytically that the conventional approach is an approximation to the ML reconstruction; more precisely, it is exactly equivalent to the first iteration of the proposed algorithm for the ML reconstruction, provided that a gridding reconstruction is used as an initialization.

  10. Subtracting and Fitting Histograms using Profile Likelihood

    CERN Document Server

    D'Almeida, F M L

    2008-01-01

    It is known that many interesting signals expected at the LHC are of unknown shape and strongly contaminated by background events. These signals will be difficult to detect during the first years of LHC operation due to the initial low luminosity. In this work, one presents a method of subtracting histograms based on the profile likelihood function when the background is previously estimated by Monte Carlo events and one has low statistics. Estimators for the signal in each bin of the histogram difference are calculated, as well as limits for the signals at 68.3% confidence level, in a low-statistics case with an exponential background and a Gaussian signal. The method can also be used to fit histograms when the signal shape is known. Our results show a good performance and avoid the problem of negative values when subtracting histograms.
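
    As an illustration of the per-bin idea, the sketch below profiles out a background nuisance parameter constrained by Monte Carlo counts and scans the signal to obtain a 68.3% confidence interval. The Poisson-times-Poisson model, the scale factor tau and all numbers are simplified assumptions of ours, not the paper's exact construction.

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import poisson

def profile_nll(s, n_obs, m_mc, tau):
    """Negative log-likelihood for the signal s in one bin, profiled over
    the background nuisance b, under the simplified model
        n_obs ~ Poisson(s + b)     # observed counts in the bin
        m_mc  ~ Poisson(tau * b)   # Monte Carlo background counts
    where tau is the MC-to-data luminosity ratio."""
    def nll(b):
        return -(poisson.logpmf(n_obs, s + b) + poisson.logpmf(m_mc, tau * b))
    return minimize_scalar(nll, bounds=(1e-9, n_obs + m_mc + 10.0),
                           method="bounded").fun

n_obs, m_mc, tau = 25, 60, 4.0     # hypothetical bin contents
grid = np.linspace(0.0, 30.0, 301)
nll_vals = np.array([profile_nll(s, n_obs, m_mc, tau) for s in grid])
s_hat = grid[np.argmin(nll_vals)]  # never negative: the scan starts at zero
inside = grid[2.0 * (nll_vals - nll_vals.min()) <= 1.0]
print(f"s_hat = {s_hat:.2f}, 68.3% CL interval = [{inside.min():.2f}, {inside.max():.2f}]")
```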

  11. Modelling maximum likelihood estimation of availability

    International Nuclear Information System (INIS)

    Waller, R.A.; Tietjen, G.L.; Rock, G.W.

    1975-01-01

    Suppose the performance of a nuclear powered electrical generating power plant is continuously monitored to record the sequence of failure and repairs during sustained operation. The purpose of this study is to assess one method of estimating the performance of the power plant when the measure of performance is availability. That is, we determine the probability that the plant is operational at time t. To study the availability of a power plant, we first assume statistical models for the variables, X and Y, which denote the time-to-failure and the time-to-repair variables, respectively. Once those statistical models are specified, the availability, A(t), can be expressed as a function of some or all of their parameters. Usually those parameters are unknown in practice and so A(t) is unknown. This paper discusses the maximum likelihood estimator of A(t) when the time-to-failure model for X is an exponential density with parameter, lambda, and the time-to-repair model for Y is an exponential density with parameter, theta. Under the assumption of exponential models for X and Y, it follows that the instantaneous availability at time t is A(t) = lambda/(lambda+theta) + theta/(lambda+theta) exp[-((1/lambda)+(1/theta))t] with t > 0. Also, the steady-state availability is A(infinity) = lambda/(lambda+theta). We use the observations from n failure-repair cycles of the power plant, say X_1, X_2, ..., X_n, Y_1, Y_2, ..., Y_n, to present the maximum likelihood estimators of A(t) and A(infinity). The exact sampling distributions for those estimators and some statistical properties are discussed before a simulation model is used to determine 95% simulation intervals for A(t). The methodology is applied to two examples which approximate the operating history of two nuclear power plants. (author)
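
    Under this model the ML estimates of the exponential means are just the sample means, so the plug-in estimator of A(t) is a one-liner; a minimal sketch with synthetic failure/repair data (hypothetical time units):

```python
import numpy as np

def availability_mle(x_fail, y_repair, t):
    """Plug-in ML estimate of the instantaneous availability A(t) for the
    exponential failure/repair model above, with lambda and theta the mean
    time-to-failure and mean time-to-repair."""
    lam = np.mean(x_fail)      # MLE of an exponential mean is the sample mean
    the = np.mean(y_repair)
    steady = lam / (lam + the)                       # A(infinity)
    return steady + (the / (lam + the)) * np.exp(-(1.0 / lam + 1.0 / the) * t)

rng = np.random.default_rng(0)
x = rng.exponential(scale=100.0, size=50)   # 50 failure-repair cycles: uptimes
y = rng.exponential(scale=8.0, size=50)     # corresponding repair times
print(availability_mle(x, y, t=24.0))       # A(infinity) would be about 100/108
```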

  12. Testicular cancer incidence to rise by 25% by 2025 in Europe? Model-based predictions in 40 countries using population-based registry data.

    Science.gov (United States)

    Le Cornet, Charlotte; Lortet-Tieulent, Joannie; Forman, David; Béranger, Rémi; Flechon, Aude; Fervers, Béatrice; Schüz, Joachim; Bray, Freddie

    2014-03-01

    Testicular cancer mainly affects White Caucasian populations, accounts for 1% of all male cancers, and is frequently the most common malignancy among young adult men. In light of the escalating rates of testicular cancer incidence in Europe, and in support of future planning to ensure optimal care of patients with what can be a curable disease, we predict the future burden in 40 European countries around 2025. Current observed trends were extrapolated with the NORDPRED model to estimate the future burden of testicular cancer in the context of changes in risk versus changes in demographics. Despite substantial heterogeneity in the rates, the vast majority of European countries will see an increasing burden over the next two decades. We estimate there will be 23,000 new cases of testicular cancer annually in Europe by 2025, a rise of 24% from 2005. Some of the most rapid increases in testicular cancer are observed in Croatia, Slovenia, Italy and Spain, and a transition is underway, whereby recent attenuations and declines in rates in certain high-risk countries in Northern Europe contrast with the increasing trends and escalating burden in Southern Europe. According to our estimates for 2025, around one in 100 men will be diagnosed with the disease annually in the highest risk countries of Europe (Croatia, Slovenia and Norway). Elucidating the key determinants of testicular cancer and the equitable provision of optimal care for patients across Europe are priorities given the steady rise in the number of patients by 2025, and an absence of primary prevention opportunities.

  13. Using Multivariate Regression Model with Least Absolute Shrinkage and Selection Operator (LASSO) to Predict the Incidence of Xerostomia after Intensity-Modulated Radiotherapy for Head and Neck Cancer

    Science.gov (United States)

    Ting, Hui-Min; Chang, Liyun; Huang, Yu-Jie; Wu, Jia-Ming; Wang, Hung-Yu; Horng, Mong-Fong; Chang, Chun-Ming; Lan, Jen-Hong; Huang, Ya-Yu; Fang, Fu-Min; Leung, Stephen Wan

    2014-01-01

    Purpose: The aim of this study was to develop a multivariate logistic regression model with least absolute shrinkage and selection operator (LASSO) to make valid predictions about the incidence of moderate-to-severe patient-rated xerostomia among head and neck cancer (HNC) patients treated with IMRT. Methods and Materials: Quality of life questionnaire datasets from 206 patients with HNC were analyzed. The European Organization for Research and Treatment of Cancer QLQ-H&N35 and QLQ-C30 questionnaires were used as the endpoint evaluation. The primary endpoint (grade 3+ xerostomia) was defined as moderate-to-severe xerostomia at 3 (XER3m) and 12 months (XER12m) after the completion of IMRT. Normal tissue complication probability (NTCP) models were developed. The optimal and suboptimal numbers of prognostic factors for a multivariate logistic regression model were determined using the LASSO with bootstrapping technique. Statistical analysis was performed using the scaled Brier score, Nagelkerke R2, chi-squared test, Omnibus, Hosmer-Lemeshow test, and the AUC. Results: Eight prognostic factors were selected by LASSO for the 3-month time point: Dmean-c, Dmean-i, age, financial status, T stage, AJCC stage, smoking, and education. Nine prognostic factors were selected for the 12-month time point: Dmean-i, education, Dmean-c, smoking, T stage, baseline xerostomia, alcohol abuse, family history, and node classification. In the selection of the suboptimal number of prognostic factors by LASSO, three suboptimal prognostic factors were fine-tuned by Hosmer-Lemeshow test and AUC, i.e., Dmean-c, Dmean-i, and age for the 3-month time point. Five suboptimal prognostic factors were also selected for the 12-month time point, i.e., Dmean-i, education, Dmean-c, smoking, and T stage. The overall performance for both time points of the NTCP model in terms of scaled Brier score, Omnibus, and Nagelkerke R2 was satisfactory and corresponded well with the expected values.
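
    A minimal sketch of the LASSO-style selection step using scikit-learn's L1-penalized logistic regression; the design matrix, outcome and tuning here are synthetic stand-ins, and the paper's bootstrapping and goodness-of-fit analysis are omitted.

```python
import numpy as np
from sklearn.linear_model import LogisticRegressionCV
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Hypothetical design matrix: 8 candidate prognostic factors for 206 patients
# (e.g. mean parotid doses, age, stage, ...); outcome driven by two of them.
rng = np.random.default_rng(1)
X = rng.normal(size=(206, 8))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=206) > 0).astype(int)

# The L1 penalty shrinks coefficients of weak predictors exactly to zero,
# which is the variable-selection behaviour described in the abstract.
model = make_pipeline(
    StandardScaler(),
    LogisticRegressionCV(penalty="l1", solver="liblinear", Cs=20, cv=5),
)
model.fit(X, y)
print(model.named_steps["logisticregressioncv"].coef_)  # zeros mark dropped factors
```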

  14. Anticipating cognitive effort: roles of perceived error-likelihood and time demands.

    Science.gov (United States)

    Dunn, Timothy L; Inzlicht, Michael; Risko, Evan F

    2017-11-13

    Why are some actions evaluated as effortful? In the present set of experiments we address this question by examining individuals' perception of effort when faced with a trade-off between two putative cognitive costs: how much time a task takes vs. how error-prone it is. Specifically, we were interested in whether individuals anticipate engaging in a small amount of hard work (i.e., low time requirement, but high error-likelihood) vs. a large amount of easy work (i.e., high time requirement, but low error-likelihood) as being more effortful. In between-subject designs, Experiments 1 through 3 demonstrated that individuals anticipate options that are high in perceived error-likelihood (yet less time consuming) as more effortful than options that are perceived to be more time consuming (yet low in error-likelihood). Further, when asked to evaluate which of the two tasks was (a) more effortful, (b) more error-prone, and (c) more time consuming, effort-based and error-based choices closely tracked one another, but this was not the case for time-based choices. Utilizing a within-subject design, Experiment 4 demonstrated an overall pattern of judgments similar to that of Experiments 1 through 3. However, both judgments of error-likelihood and time demand similarly predicted effort judgments. Results are discussed within the context of extant accounts of cognitive control, with considerations of how error-likelihood and time demands may independently and conjunctively factor into judgments of cognitive effort.

  15. Likelihood analysis of the minimal AMSB model

    Energy Technology Data Exchange (ETDEWEB)

    Bagnaschi, E.; Weiglein, G. [DESY, Hamburg (Germany); Borsato, M.; Chobanova, V.; Lucio, M.; Santos, D.M. [Universidade de Santiago de Compostela, Santiago de Compostela (Spain); Sakurai, K. [Institute for Particle Physics Phenomenology, University of Durham, Science Laboratories, Department of Physics, Durham (United Kingdom); University of Warsaw, Faculty of Physics, Institute of Theoretical Physics, Warsaw (Poland); Buchmueller, O.; Citron, M.; Costa, J.C.; Richards, A. [Imperial College, High Energy Physics Group, Blackett Laboratory, London (United Kingdom); Cavanaugh, R. [Fermi National Accelerator Laboratory, Batavia, IL (United States); University of Illinois at Chicago, Physics Department, Chicago, IL (United States); De Roeck, A. [Experimental Physics Department, CERN, Geneva (Switzerland); Antwerp University, Wilrijk (Belgium); Dolan, M.J. [School of Physics, University of Melbourne, ARC Centre of Excellence for Particle Physics at the Terascale, Melbourne (Australia); Ellis, J.R. [King's College London, Theoretical Particle Physics and Cosmology Group, Department of Physics, London (United Kingdom); CERN, Theoretical Physics Department, Geneva (Switzerland); Flaecher, H. [University of Bristol, H.H. Wills Physics Laboratory, Bristol (United Kingdom); Heinemeyer, S. [Campus of International Excellence UAM+CSIC, Madrid (Spain); Instituto de Fisica Teorica UAM-CSIC, Madrid (Spain); Instituto de Fisica de Cantabria (CSIC-UC), Cantabria (Spain); Isidori, G. [Physik-Institut, Universitaet Zuerich, Zurich (Switzerland); Luo, F. [Kavli IPMU (WPI), UTIAS, The University of Tokyo, Kashiwa, Chiba (Japan); Olive, K.A. [School of Physics and Astronomy, University of Minnesota, William I. Fine Theoretical Physics Institute, Minneapolis, MN (United States)

    2017-04-15

    We perform a likelihood analysis of the minimal anomaly-mediated supersymmetry-breaking (mAMSB) model using constraints from cosmology and accelerator experiments. We find that either a wino-like or a Higgsino-like neutralino LSP, χ{sup 0}{sub 1}, may provide the cold dark matter (DM), both with similar likelihoods. The upper limit on the DM density from Planck and other experiments enforces m{sub χ{sup 0}{sub 1}} <or similar 3 TeV after the inclusion of Sommerfeld enhancement in its annihilations. If most of the cold DM density is provided by the χ{sup 0}{sub 1}, the measurement of the Higgs mass favours a limited range of tan β ∼ 5 (and also for tan β ∼ 45 if μ > 0) but the scalar mass m{sub 0} is poorly constrained. In the wino-LSP case, m{sub 3/2} is constrained to about 900 TeV and m{sub χ{sup 0}{sub 1}} to 2.9 ± 0.1 TeV, whereas in the Higgsino-LSP case m{sub 3/2} has just a lower limit >or similar 650 TeV (>or similar 480 TeV) and m{sub χ{sup 0}{sub 1}} is constrained to 1.12 (1.13) ± 0.02 TeV in the μ > 0 (μ < 0) scenario. In neither case can the anomalous magnetic moment of the muon, (g-2){sub μ}, be improved significantly relative to its Standard Model (SM) value, nor do flavour measurements constrain the model significantly, and there are poor prospects for discovering supersymmetric particles at the LHC, though there are some prospects for direct DM detection. On the other hand, if the χ{sup 0}{sub 1} contributes only a fraction of the cold DM density, future LHC E{sub T}-based searches for gluinos, squarks and heavier chargino and neutralino states as well as disappearing track searches in the wino-like LSP region will be relevant, and interference effects enable BR(B{sub s,d} → μ{sup +}μ{sup -}) to agree with the data better than in the SM in the case of wino-like DM with μ > 0. (orig.)

  16. Source and Message Factors in Persuasion: A Reply to Stiff's Critique of the Elaboration Likelihood Model.

    Science.gov (United States)

    Petty, Richard E.; And Others

    1987-01-01

    Answers James Stiff's criticism of the Elaboration Likelihood Model (ELM) of persuasion. Corrects certain misperceptions of the ELM and criticizes Stiff's meta-analysis that compares ELM predictions with those derived from Kahneman's elastic capacity model. Argues that Stiff's presentation of the ELM and the conclusions he draws based on the data…

  17. Dimension-independent likelihood-informed MCMC

    KAUST Repository

    Cui, Tiangang

    2015-10-08

    Many Bayesian inference problems require exploring the posterior distribution of high-dimensional parameters that represent the discretization of an underlying function. This work introduces a family of Markov chain Monte Carlo (MCMC) samplers that can adapt to the particular structure of a posterior distribution over functions. Two distinct lines of research intersect in the methods developed here. First, we introduce a general class of operator-weighted proposal distributions that are well defined on function space, such that the performance of the resulting MCMC samplers is independent of the discretization of the function. Second, by exploiting local Hessian information and any associated low-dimensional structure in the change from prior to posterior distributions, we develop an inhomogeneous discretization scheme for the Langevin stochastic differential equation that yields operator-weighted proposals adapted to the non-Gaussian structure of the posterior. The resulting dimension-independent and likelihood-informed (DILI) MCMC samplers may be useful for a large class of high-dimensional problems where the target probability measure has a density with respect to a Gaussian reference measure. Two nonlinear inverse problems are used to demonstrate the efficiency of these DILI samplers: an elliptic PDE coefficient inverse problem and path reconstruction in a conditioned diffusion.

  18. Likelihood Analysis of Supersymmetric SU(5) GUTs

    CERN Document Server

    Bagnaschi, E.

    2017-01-01

    We perform a likelihood analysis of the constraints from accelerator experiments and astrophysical observations on supersymmetric (SUSY) models with SU(5) boundary conditions on soft SUSY-breaking parameters at the GUT scale. The parameter space of the models studied has 7 parameters: a universal gaugino mass $m_{1/2}$, distinct masses for the scalar partners of matter fermions in five- and ten-dimensional representations of SU(5), $m_5$ and $m_{10}$, and for the $\\mathbf{5}$ and $\\mathbf{\\bar 5}$ Higgs representations $m_{H_u}$ and $m_{H_d}$, a universal trilinear soft SUSY-breaking parameter $A_0$, and the ratio of Higgs vevs $\\tan \\beta$. In addition to previous constraints from direct sparticle searches, low-energy and flavour observables, we incorporate constraints based on preliminary results from 13 TeV LHC searches for jets + MET events and long-lived particles, as well as the latest PandaX-II and LUX searches for direct Dark Matter detection. In addition to previously-identified mechanisms for bringi...

  19. Reducing the likelihood of long tennis matches.

    Science.gov (United States)

    Barnett, Tristan; Brown, Alan; Pollard, Graham

    2006-01-01

    Long matches can cause problems for tournaments. For example, the starting times of subsequent matches can be substantially delayed causing inconvenience to players, spectators, officials and television scheduling. They can even be seen as unfair in the tournament setting when the winner of a very long match, who may have negative aftereffects from such a match, plays the winner of an average or shorter length match in the next round. Long matches can also lead to injuries to the participating players. One factor that can lead to long matches is the use of the advantage set as the fifth set, as in the Australian Open, the French Open and Wimbledon. Another factor is long rallies and a greater than average number of points per game. This tends to occur more frequently on the slower surfaces such as at the French Open. The mathematical method of generating functions is used to show that the likelihood of long matches can be substantially reduced by using the tiebreak game in the fifth set, or more effectively by using a new type of game, the 50-40 game, throughout the match. Key points: the cumulant generating function has nice properties for calculating the parameters of distributions in a tennis match; a final tiebreaker set, as currently used in the US Open, reduces the length of matches; and a new 50-40 game reduces the length of matches whilst maintaining comparable probabilities for the better player to win the match.
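
    Generating functions aside, the win probability of any no-deuce scoring format follows from a two-line recursion. In the sketch below, reading the 50-40 game as "server must win four points before the receiver wins three" is our interpretation of the related literature, not something stated in the abstract.

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def win_prob(a, b, p):
    """P(server wins) when the server still needs a points, the receiver
    needs b, and the server wins each point independently with probability
    p (no deuce: the first player to reach the target takes the game)."""
    if a == 0:
        return 1.0
    if b == 0:
        return 0.0
    return p * win_prob(a - 1, b, p) + (1 - p) * win_prob(a, b - 1, p)

p = 0.62                  # hypothetical point-winning probability on serve
print(win_prob(4, 3, p))  # assumed 50-40 rule: server needs 4 points, receiver 3
```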

  1. Maximum likelihood window for time delay estimation

    International Nuclear Information System (INIS)

    Lee, Young Sup; Yoon, Dong Jin; Kim, Chi Yup

    2004-01-01

    Time delay estimation for the detection of leak location in underground pipelines is critically important. Because the exact leak location depends upon the precision of the time delay between sensor signals due to leak noise and the speed of elastic waves, the research on the estimation of time delay has been one of the key issues in leak locating with the time arrival difference method. In this study, an optimal Maximum Likelihood window is considered to obtain a better estimation of the time delay. Experiments have proved that this method can provide much clearer and more precise peaks in the cross-correlation functions of leak signals. The leak location error has been less than 1% of the distance between sensors; for example, the error was not greater than 3 m for 300 m long underground pipelines. Apart from the experiment, an intensive theoretical analysis in terms of signal processing has been described. The improved leak locating with the suggested method is due to the windowing effect in the frequency domain, which offers a weighting of significant frequencies.
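
    A minimal sketch of the time-arrival-difference idea underlying the paper: estimate the delay between two sensor signals from the peak of their cross-correlation. The ML frequency-domain window that the study contributes is omitted, and the data are synthetic.

```python
import numpy as np

def estimate_delay(x, y, fs):
    """Delay of y relative to x, in seconds, from the cross-correlation peak."""
    xc = np.correlate(y - y.mean(), x - x.mean(), mode="full")
    lag = np.argmax(xc) - (len(x) - 1)   # lag in samples; positive => y lags x
    return lag / fs

fs, true_delay = 1000.0, 0.050           # 1 kHz sampling, 50 ms true delay
rng = np.random.default_rng(2)
source = rng.normal(size=4000)           # broadband leak-like source signal
x = source + 0.1 * rng.normal(size=4000)
y = np.roll(source, int(true_delay * fs)) + 0.1 * rng.normal(size=4000)
d = estimate_delay(x, y, fs)
print(f"estimated delay: {d * 1000:.1f} ms")  # leak offset from midpoint = c*d/2
```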

  2. Incidence of second cervical vertebral fractures far surpassed the rate predicted by the changing age distribution and growth among elderly persons in the United States (2005-2008).

    Science.gov (United States)

    Zusman, Natalie L; Ching, Alexander C; Hart, Robert A; Yoo, Jung U

    2013-04-20

    Nationwide epidemiological cohort study. To characterize the incidence of second cervical vertebral (C2) fractures by age and geographical region among the elderly Medicare population and to elucidate if the rate changed in the years 2005 to 2008. Recent publications hypothesized that the rate of cervical vertebral fractures may be increasing. To date, there are no published nationwide reports describing the incidence and demographics of these injuries in the elderly US population. Incidence of C2 fracture in the years 2005 to 2008 was determined by querying a commercially available database (PearlDiver Technologies, Inc., Warsaw, IN) using International Classification of Diseases code 805.02. Rates were calculated using the PearlDiver reported person-counts as the numerator and the Center for Medicare and Medicare Services midyear population file as the denominator, and reported per 10,000 person-years (10,000 p-y). The age and geographical distributions of fractures were examined. Variability in rates was analyzed using the mean, standard deviation, 95% confidence intervals, χ² tests, and Pearson correlation coefficients. Although the elderly population increased by 6% between 2005 and 2008, the annual incidence of C2 fracture rose by 21%, from 1.58 to 1.91 per 10,000 p-y, trending upward in a straight-line function (r = 0.999, P = 0.0006). The incidence of fracture varied between age groups; however, an increase was observed in all age groups. Persons aged 65 to 74 years (the youngest age group) experienced the lowest incidence (0.63 in 2005 to 0.71 in 2008), and the rate of increase was the smallest among the age groups examined (13%). Persons aged 85 and older demonstrated the highest incidence (4.36-5.67) and the greatest increase (30%). From 2005 to 2008, the overall incidence of C2 fracture rose at a rate that was 3.5 times faster than the elderly population growth.

  3. Incidents analysis

    International Nuclear Information System (INIS)

    Francois, P.

    1996-01-01

    We undertook a study programme at the end of 1991. To start with, we performed some exploratory studies aimed at learning some preliminary lessons on this type of analysis: assessment of the interest of probabilistic incident analysis; possibility of using PSA scenarios; skills and resources required. At the same time, EPN created a working group whose assignment was to define a new approach for analysis of incidents on NPPs. This working group gave thought to both aspects of Operating Feedback that EPN wished to improve: analysis of significant incidents; analysis of potential consequences. We took part in the work of this group, and for the second aspect, we proposed a method based on an adaptation of the event-tree method in order to establish a link between existing PSA models and actual incidents. Since PSA provides an exhaustive database of accident scenarios applicable to the two most common types of units in France, it is obviously of interest for this sort of analysis. With this method we performed some incident analyses, and at the same time explored some methods employed abroad, particularly ASP (Accident Sequence Precursor, a method used by the NRC). Early in 1994 EDF began a systematic analysis programme. The first, transient phase will set up methods and an organizational structure. 7 figs

  5. The incidence of bacterial endosymbionts in terrestrial arthropods.

    Science.gov (United States)

    Weinert, Lucy A; Araujo-Jnr, Eli V; Ahmed, Muhammad Z; Welch, John J

    2015-05-22

    Intracellular endosymbiotic bacteria are found in many terrestrial arthropods and have a profound influence on host biology. A basic question about these symbionts is why they infect the hosts that they do, but estimating symbiont incidence (the proportion of potential host species that are actually infected) is complicated by dynamic or low prevalence infections. We develop a maximum-likelihood approach to estimating incidence, and testing hypotheses about its variation. We apply our method to a database of screens for bacterial symbionts, containing more than 3600 distinct arthropod species and more than 150 000 individual arthropods. After accounting for sampling bias, we estimate that 52% (CIs: 48-57) of arthropod species are infected with Wolbachia, 24% (CIs: 20-42) with Rickettsia and 13% (CIs: 13-55) with Cardinium. We then show that these differences stem from the significantly reduced incidence of Rickettsia and Cardinium in most hexapod orders, which might be explained by evolutionary differences in the arthropod immune response. Finally, we test the prediction that symbiont incidence should be higher in speciose host clades. But while some groups do show a trend for more infection in species-rich families, the correlations are generally weak and inconsistent. These results argue against a major role for parasitic symbionts in driving arthropod diversification.
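
    A deliberately simplified two-parameter version of the likelihood idea: the incidence pi and a single shared within-species prevalence are estimated jointly, so that uninfected species and infected-but-undetected species are distinguished probabilistically. The paper's model is richer, and the screen counts below are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import binom

def neg_log_lik(params, k, n):
    """A species is infected with probability pi (the incidence); within an
    infected species each screened individual is positive with probability
    prev.  Uninfected species always give k = 0, but infected species can
    give k = 0 by chance, which is what biases naive incidence estimates."""
    pi, prev = params
    lik = pi * binom.pmf(k, n, prev) + (1.0 - pi) * (k == 0)
    return -np.sum(np.log(lik + 1e-300))

k = np.array([0, 0, 3, 0, 10, 1, 0, 0, 5, 0])       # positives per species
n = np.array([10, 5, 12, 8, 10, 20, 4, 6, 9, 15])   # individuals screened
res = minimize(neg_log_lik, x0=[0.5, 0.5], args=(k, n),
               bounds=[(1e-6, 1 - 1e-6)] * 2)
print("MLE of (incidence, prevalence):", res.x)
```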

  6. Optimized Large-scale CMB Likelihood and Quadratic Maximum Likelihood Power Spectrum Estimation

    Science.gov (United States)

    Gjerløw, E.; Colombo, L. P. L.; Eriksen, H. K.; Górski, K. M.; Gruppuso, A.; Jewell, J. B.; Plaszczynski, S.; Wehus, I. K.

    2015-11-01

    We revisit the problem of exact cosmic microwave background (CMB) likelihood and power spectrum estimation with the goal of minimizing computational costs through linear compression. This idea was originally proposed for CMB purposes by Tegmark et al., and here we develop it into a fully functioning computational framework for large-scale polarization analysis, adopting WMAP as a working example. We compare five different linear bases (pixel space, harmonic space, noise covariance eigenvectors, signal-to-noise covariance eigenvectors, and signal-plus-noise covariance eigenvectors) in terms of compression efficiency, and find that the computationally most efficient basis is the signal-to-noise eigenvector basis, which is closely related to the Karhunen-Loeve and Principal Component transforms, in agreement with previous suggestions. For this basis, the information in 6836 unmasked WMAP sky map pixels can be compressed into a smaller set of 3102 modes, with a maximum error increase of any single multipole of 3.8% at ℓ ≤ 32 and a maximum shift in the mean values of a joint distribution of an amplitude-tilt model of 0.006σ. This compression reduces the computational cost of a single likelihood evaluation by a factor of 5, from 38 to 7.5 CPU seconds, and it also results in a more robust likelihood by implicitly regularizing nearly degenerate modes. Finally, we use the same compression framework to formulate a numerically stable and computationally efficient variation of the Quadratic Maximum Likelihood implementation, which requires less than 3 GB of memory and 2 CPU minutes per iteration for ℓ ≤ 32, rendering low-ℓ QML CMB power spectrum analysis fully tractable on a standard laptop.

  7. Maximum likelihood versus likelihood-free quantum system identification in the atom maser

    International Nuclear Information System (INIS)

    Catana, Catalin; Kypraios, Theodore; Guţă, Mădălin

    2014-01-01

    We consider the problem of estimating a dynamical parameter of a Markovian quantum open system (the atom maser), by performing continuous time measurements in the system's output (outgoing atoms). Two estimation methods are investigated and compared. Firstly, the maximum likelihood estimator (MLE) takes into account the full measurement data and is asymptotically optimal in terms of its mean square error. Secondly, the ‘likelihood-free’ method of approximate Bayesian computation (ABC) produces an approximation of the posterior distribution for a given set of summary statistics, by sampling trajectories at different parameter values and comparing them with the measurement data via chosen statistics. Building on previous results which showed that atom counts are poor statistics for certain values of the Rabi angle, we apply MLE to the full measurement data and estimate its Fisher information. We then select several correlation statistics such as waiting times, distribution of successive identical detections, and use them as input of the ABC algorithm. The resulting posterior distribution follows closely the data likelihood, showing that the selected statistics capture ‘most’ statistical information about the Rabi angle. (paper)
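
    A generic ABC rejection sketch of the second method. The toy exponential waiting-time model merely stands in for the atom maser, whose simulator and summary statistics are far more involved.

```python
import numpy as np

def abc_rejection(observed, simulate, summarize, prior_draw,
                  n_draws=20000, quantile=0.01):
    """ABC rejection: draw parameters from the prior, simulate data, and
    keep the draws whose summary statistics are closest to the data's."""
    s_obs = summarize(observed)
    thetas = np.array([prior_draw() for _ in range(n_draws)])
    dists = np.array([np.linalg.norm(summarize(simulate(t)) - s_obs)
                      for t in thetas])
    return thetas[dists <= np.quantile(dists, quantile)]

rng = np.random.default_rng(3)
data = rng.exponential(1 / 2.5, size=200)          # "observed" waiting times
simulate = lambda rate: rng.exponential(1 / rate, size=200)
summarize = lambda x: np.array([np.mean(x), np.std(x)])
post = abc_rejection(data, simulate, summarize, lambda: rng.uniform(0.1, 10.0))
print("approximate posterior mean of the rate:", post.mean())
```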

  8. Maximum likelihood estimation of the parameters of nonminimum phase and noncausal ARMA models

    DEFF Research Database (Denmark)

    Rasmussen, Klaus Bolding

    1994-01-01

    The well-known prediction-error-based maximum likelihood (PEML) method can only handle minimum phase ARMA models. This paper presents a new method known as the back-filtering-based maximum likelihood (BFML) method, which can handle nonminimum phase and noncausal ARMA models. The BFML method is identical to the PEML method in the case of a minimum phase ARMA model, and it turns out that the BFML method incorporates a noncausal ARMA filter with poles outside the unit circle for estimation of the parameters of a causal, nonminimum phase ARMA model.

  9. Likelihood analysis of supersymmetric SU(5) GUTs

    Energy Technology Data Exchange (ETDEWEB)

    Bagnaschi, E.; Weiglein, G. [DESY, Hamburg (Germany); Costa, J.C.; Buchmueller, O.; Citron, M.; Richards, A.; De Vries, K.J. [Imperial College, High Energy Physics Group, Blackett Laboratory, London (United Kingdom); Sakurai, K. [University of Durham, Science Laboratories, Department of Physics, Institute for Particle Physics Phenomenology, Durham (United Kingdom); University of Warsaw, Faculty of Physics, Institute of Theoretical Physics, Warsaw (Poland); Borsato, M.; Chobanova, V.; Lucio, M.; Martinez Santos, D. [Universidade de Santiago de Compostela, Santiago de Compostela (Spain); Cavanaugh, R. [Fermi National Accelerator Laboratory, Batavia, IL (United States); University of Illinois at Chicago, Physics Department, Chicago, IL (United States); Roeck, A. de [CERN, Experimental Physics Department, Geneva (Switzerland); Antwerp University, Wilrijk (Belgium); Dolan, M.J. [University of Melbourne, ARC Centre of Excellence for Particle Physics at the Terascale, School of Physics, Parkville (Australia); Ellis, J.R. [King's College London, Theoretical Particle Physics and Cosmology Group, Department of Physics, London (United Kingdom); Theoretical Physics Department, CERN, Geneva 23 (Switzerland); Flaecher, H. [University of Bristol, H.H. Wills Physics Laboratory, Bristol (United Kingdom); Heinemeyer, S. [Campus of International Excellence UAM+CSIC, Cantoblanco, Madrid (Spain); Instituto de Fisica Teorica UAM-CSIC, Madrid (Spain); Instituto de Fisica de Cantabria (CSIC-UC), Santander (Spain); Isidori, G. [Universitaet Zuerich, Physik-Institut, Zurich (Switzerland); Olive, K.A. [University of Minnesota, William I. Fine Theoretical Physics Institute, School of Physics and Astronomy, Minneapolis, MN (United States)

    2017-02-15

    We perform a likelihood analysis of the constraints from accelerator experiments and astrophysical observations on supersymmetric (SUSY) models with SU(5) boundary conditions on soft SUSY-breaking parameters at the GUT scale. The parameter space of the models studied has seven parameters: a universal gaugino mass m{sub 1/2}, distinct masses for the scalar partners of matter fermions in five- and ten-dimensional representations of SU(5), m{sub 5} and m{sub 10}, and for the 5 and anti 5 Higgs representations m{sub H{sub u}} and m{sub H{sub d}}, a universal trilinear soft SUSY-breaking parameter A{sub 0}, and the ratio of Higgs vevs tan β. In addition to previous constraints from direct sparticle searches, low-energy and flavour observables, we incorporate constraints based on preliminary results from 13 TeV LHC searches for jets + E{sub T} events and long-lived particles, as well as the latest PandaX-II and LUX searches for direct Dark Matter detection. In addition to previously identified mechanisms for bringing the supersymmetric relic density into the range allowed by cosmology, we identify a novel u{sub R}/c{sub R} - χ{sup 0}{sub 1} coannihilation mechanism that appears in the supersymmetric SU(5) GUT model and discuss the role of ν{sub τ} coannihilation. We find complementarity between the prospects for direct Dark Matter detection and SUSY searches at the LHC. (orig.)

  11. A Game Theoretical Approach to Hacktivism: Is Attack Likelihood a Product of Risks and Payoffs?

    Science.gov (United States)

    Bodford, Jessica E; Kwan, Virginia S Y

    2018-02-01

    The current study examines hacktivism (i.e., hacking to convey a moral, ethical, or social justice message) through a general game theoretic framework-that is, as a product of costs and benefits. Given the inherent risk of carrying out a hacktivist attack (e.g., legal action, imprisonment), it would be rational for the user to weigh these risks against perceived benefits of carrying out the attack. As such, we examined computer science students' estimations of risks, payoffs, and attack likelihood through a game theoretic design. Furthermore, this study aims at constructing a descriptive profile of potential hacktivists, exploring two predicted covariates of attack decision making, namely, peer prevalence of hacking and sex differences. Contrary to expectations, results suggest that participants' estimations of attack likelihood stemmed solely from expected payoffs, rather than subjective risks. Peer prevalence significantly predicted increased payoffs and attack likelihood, suggesting an underlying descriptive norm in social networks. Notably, we observed no sex differences in the decision to attack, nor in the factors predicting attack likelihood. Implications for policymakers and the understanding and prevention of hacktivism are discussed, as are the possible ramifications of widely communicated payoffs over potential risks in hacking communities.

  12. Does Peak Urine Flow Rate Predict the Development of Incident Lower Urinary Tract Symptoms in Men with Mild to No Current Symptoms? Results from REDUCE.

    Science.gov (United States)

    Simon, Ross M; Howard, Lauren E; Moreira, Daniel M; Roehrborn, Claus; Vidal, Adriana; Castro-Santamaria, Ramiro; Freedland, Stephen J

    2017-09-01

    We determined whether decreased peak urine flow is associated with future incident lower urinary tract symptoms in men with mild to no lower urinary tract symptoms. Our population consisted of 3,140 men from the REDUCE (Reduction by Dutasteride of Prostate Cancer Events) trial with mild to no lower urinary tract symptoms, defined as I-PSS (International Prostate Symptom Score) less than 8. REDUCE was a randomized trial of dutasteride vs placebo for prostate cancer prevention in men with elevated prostate specific antigen and negative biopsy. I-PSS measures were obtained every 6 months throughout the 4-year study. The association between peak urine flow rate and progression to incident lower urinary tract symptoms, defined as the first of medical treatment, surgery or sustained and clinically significant lower urinary tract symptoms, was tested by multivariable Cox models, adjusting for various baseline characteristics and treatment arm. On multivariable analysis as a continuous variable, decreased peak urine flow rate was significantly associated with an increased risk of incident lower urinary tract symptoms (p = 0.002). Results were similar in the dutasteride and placebo arms. On univariable analysis when peak flow was categorized as 15 or greater, 10 to 14.9 and less than 10 ml per second, flow rates of 10 to 14.9 and less than 10 ml per second were associated with a significantly increased risk of incident lower urinary tract symptoms (HR 1.39, p = 0.011 and HR 1.67, respectively). In men with mild to no lower urinary tract symptoms, a decreased peak urine flow rate is independently associated with incident lower urinary tract symptoms. If confirmed, these men should be followed more closely for incident lower urinary tract symptoms.

  13. The behavior of the likelihood ratio test for testing missingness

    OpenAIRE

    Hens, Niel; Aerts, Marc; Molenberghs, Geert; Thijs, Herbert

    2003-01-01

    To assess the sensitivity of conclusions to model choices in the context of selection models for non-random dropout, one can oppose the different missingness mechanisms to each other, e.g., by likelihood ratio tests. The finite-sample behavior of the null distribution and the power of the likelihood ratio test are studied under a variety of missingness mechanisms. Keywords: missing data; sensitivity analysis; likelihood ratio test; missing mechanisms.
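
    One concrete way to oppose mechanisms is to test whether dropout depends on the previously observed outcome. The sketch below is our own minimal construction: an MCAR-style null (intercept-only dropout model) against a MAR-style alternative, compared by a likelihood ratio test whose finite-sample chi-square approximation is exactly what the paper scrutinizes.

```python
import numpy as np
import statsmodels.api as sm
from scipy.stats import chi2

rng = np.random.default_rng(6)
y_prev = rng.normal(size=500)                 # last observed outcome
# Simulate MAR dropout: probability depends on y_prev through a logistic link
drop = rng.binomial(1, 1 / (1 + np.exp(1.0 - 0.8 * y_prev)))

ll0 = sm.Logit(drop, np.ones((500, 1))).fit(disp=0).llf        # MCAR-style null
ll1 = sm.Logit(drop, sm.add_constant(y_prev)).fit(disp=0).llf  # MAR alternative
lrt = 2.0 * (ll1 - ll0)
print("LRT statistic:", lrt, "p-value:", chi2.sf(lrt, df=1))
```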

  14. Copy number variation in glutathione-S-transferase T1 and M1 predicts incidence and 5-year survival from prostate and bladder cancer, and incidence of corpus uteri cancer in the general population

    DEFF Research Database (Denmark)

    Nørskov, M S; Frikke-Schmidt, R; Bojesen, S E

    2011-01-01

    Glutathione-S-transferase T1 (GSTT1) and GSTM1 detoxify carcinogens and thus potentially contribute to inter-individual susceptibility to cancer. We determined the ability of GST copy number variation (CNV) to predict the risk of cancer in the general population. Exact copy numbers of GSTT1 and G...

  15. Penalized Maximum Likelihood Estimation for univariate normal mixture distributions

    International Nuclear Information System (INIS)

    Ridolfi, A.; Idier, J.

    2001-01-01

    Due to singularities of the likelihood function, the maximum likelihood approach for the estimation of the parameters of normal mixture models is an acknowledged ill-posed optimization problem. Ill-posedness is solved by penalizing the likelihood function. In the Bayesian framework, this amounts to incorporating an inverted-gamma prior in the likelihood function. A penalized version of the EM algorithm is derived, which is still explicit and which intrinsically ensures that the estimates are not singular. Numerical evidence of the latter property is put forward with a test.
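
    A sketch of the resulting update: with an inverted-gamma penalty IG(alpha, beta) on each component variance, the M-step variance gains the terms 2*beta in the numerator and 2*(alpha + 1) in the denominator, so it stays strictly positive. The hyperparameter values below are illustrative, not taken from the paper.

```python
import numpy as np
from scipy.stats import norm

def penalized_em(x, k=2, alpha=2.0, beta=0.5, n_iter=200, seed=0):
    """EM for a univariate Gaussian mixture with an inverted-gamma penalty
    on the variances; the likelihood can no longer degenerate by shrinking
    a component onto a single data point."""
    rng = np.random.default_rng(seed)
    w = np.full(k, 1.0 / k)
    mu = rng.choice(x, k, replace=False)
    var = np.full(k, np.var(x))
    for _ in range(n_iter):
        dens = w * norm.pdf(x[:, None], mu, np.sqrt(var))   # E-step
        r = dens / dens.sum(axis=1, keepdims=True)
        nk = r.sum(axis=0)
        w = nk / len(x)                                     # penalized M-step
        mu = (r * x[:, None]).sum(axis=0) / nk
        s = (r * (x[:, None] - mu) ** 2).sum(axis=0)
        var = (s + 2.0 * beta) / (nk + 2.0 * (alpha + 1.0))
    return w, mu, var

rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-2, 1, 300), rng.normal(3, 0.5, 200)])
print(penalized_em(x))
```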

  16. The gap between fatherhood and couplehood desires among Israeli gay men and estimations of their likelihood.

    Science.gov (United States)

    Shenkman, Geva

    2012-10-01

    This study examined the frequencies of the desires and likelihood estimations of Israeli gay men regarding fatherhood and couplehood, using a sample of 183 gay men aged 19-50. It follows previous research which indicated the existence of a gap in the United States with respect to fatherhood, and called for generalizability examinations in other countries and the exploration of possible explanations. As predicted, a gap was also found in Israel between fatherhood desires and their likelihood estimations, as well as between couplehood desires and their likelihood estimations. In addition, lower estimations of fatherhood likelihood were found to predict depression and to correlate with decreased subjective well-being. Possible psychosocial explanations are offered. Moreover, by mapping attitudes toward fatherhood and couplehood among Israeli gay men, the current study helps to extend our knowledge of several central human development motivations and their correlations with depression and subjective well-being in a less-studied sexual minority in a complex cultural climate.

  17. Parental family variables and likelihood of divorce.

    Science.gov (United States)

    Skalkidou, A

    2000-01-01

    It has long been established that divorced men and women have substantially higher standardized general mortality than same gender persons. Because the incidence of divorce is increasing in many countries, determinants of divorce rates assume great importance as indirect risk factors for several diseases and conditions that adversely affect health. We have undertaken a study in Athens, Greece, to evaluate whether sibship size, birth order, and the gender composition of spousal sibships are related to the probability of divorce. 358 high school students, aged between 15 and 17 years, satisfactorily completed anonymous questionnaires, indicating whether their natural parents have been separated or divorced, their parents' educational achievement, birth order and sibship size by gender. The study was analyzed as a twin case-control investigation, treating those divorced or separated as cases and those who were not divorced or separated as controls. A man who grew up as an only child was almost three times as likely to divorce compared to a man with siblings, and this association was highly significant (p approximately 0.004). There was no such evidence with respect to women. After controlling for sibship size, earlier born men--but not women--appeared to be at higher risk for divorce compared to those later born. There was no evidence that the gender structure of the sibship substantially affects the risk for divorce. Even though divorce is not an organic disease, it indirectly affects health as well as social well-being. The findings of this study need to be replicated, but, if confirmed, they could contribute to our understanding of the roots of some instances of marital dysfunction.

  18. Efficient Detection of Repeating Sites to Accelerate Phylogenetic Likelihood Calculations.

    Science.gov (United States)

    Kobert, K; Stamatakis, A; Flouri, T

    2017-03-01

    The phylogenetic likelihood function (PLF) is the major computational bottleneck in several applications of evolutionary biology such as phylogenetic inference, species delimitation, model selection, and divergence times estimation. Given the alignment, a tree and the evolutionary model parameters, the likelihood function computes the conditional likelihood vectors for every node of the tree. Vector entries for which all input data are identical result in redundant likelihood operations which, in turn, yield identical conditional values. Such operations can be omitted for improving run-time and, using appropriate data structures, reducing memory usage. We present a fast, novel method for identifying and omitting such redundant operations in phylogenetic likelihood calculations, and assess the performance improvement and memory savings attained by our method. Using empirical and simulated data sets, we show that a prototype implementation of our method yields up to 12-fold speedups and uses up to 78% less memory than one of the fastest and most highly tuned implementations of the PLF currently available. Our method is generic and can seamlessly be integrated into any phylogenetic likelihood implementation. [Algorithms; maximum likelihood; phylogenetic likelihood function; phylogenetics].
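
    A toy version of the repeat-detection idea: at every node, give each alignment site a class identifier built from its children's identifiers; sites sharing a class would yield identical conditional likelihood vectors, so only one representative needs to be computed. The paper's actual algorithm and data structures are considerably more refined.

```python
def site_classes(node, sequences):
    """Map each site to a repeat class at `node`, where `node` is a tip
    name (str) or a (left, right) tuple and `sequences` maps tip names to
    aligned strings of equal length."""
    if isinstance(node, str):
        return list(sequences[node])        # tip: class = observed state
    left, right = (site_classes(child, sequences) for child in node)
    classes, ids = {}, []
    for pair in zip(left, right):           # same child classes => same class
        ids.append(classes.setdefault(pair, len(classes)))
    return ids

seqs = {"A": "ACAAC", "B": "ACAAT", "C": "CCAAC"}
tree = (("A", "B"), "C")                    # the tree ((A,B),C) as nested tuples
print(site_classes(tree, seqs))             # sites 2 and 3 repeat at the root
```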

  19. Planck intermediate results: XVI. Profile likelihoods for cosmological parameters

    DEFF Research Database (Denmark)

    Bartlett, J.G.; Cardoso, J.-F.; Delabrouille, J.

    2014-01-01

    We explore the 2013 Planck likelihood function with a high-precision multi-dimensional minimizer (Minuit). This allows a refinement of the ΛCDM best-fit solution with respect to previously-released results, and the construction of frequentist confidence intervals using profile likelihoods. The agr...

  20. Planck 2013 results. XV. CMB power spectra and likelihood

    DEFF Research Database (Denmark)

    Tauber, Jan; Bartlett, J.G.; Bucher, M.

    2014-01-01

    This paper presents the Planck 2013 likelihood, a complete statistical description of the two-point correlation function of the CMB temperature fluctuations that accounts for all known relevant uncertainties, both instrumental and astrophysical in nature. We use this likelihood to derive our best...

  1. The modified signed likelihood statistic and saddlepoint approximations

    DEFF Research Database (Denmark)

    Jensen, Jens Ledet

    1992-01-01

    SUMMARY: For a number of tests in exponential families we show that the use of a normal approximation to the modified signed likelihood ratio statistic r* is equivalent to the use of a saddlepoint approximation. This is also true in a large deviation region where the signed likelihood ratio statistic r is of order √n.

  2. Likelihood analysis of parity violation in the compound nucleus

    International Nuclear Information System (INIS)

    Bowman, D.; Sharapov, E.

    1993-01-01

    We discuss the determination of the root-mean-squared matrix element of the parity-violating interaction between compound-nuclear states using likelihood analysis. We briefly review the relevant features of the statistical model of the compound nucleus and the formalism of likelihood analysis. We then discuss the application of likelihood analysis to data on parity-violating longitudinal asymmetries. The reliability of the extracted value of the matrix element and the errors assigned to it is stressed. We treat the situations where the spins of the p-wave resonances are known and where they are not known, using experimental data and Monte Carlo techniques. We conclude that likelihood analysis provides a reliable way to determine M and its confidence interval. We briefly discuss some problems associated with the normalization of the likelihood function.
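
    A simplified stand-in for the likelihood construction (our own reduction, not the paper's full formalism): each measured longitudinal asymmetry is modelled as a zero-mean Gaussian whose variance combines the rms matrix element M, a known per-resonance sensitivity factor, and the measurement error; M is then estimated by maximum likelihood.

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import norm

a   = np.array([2.1, -0.4, 5.3, -1.2, 0.8])   # hypothetical asymmetries
s   = np.array([1.0,  0.5, 2.0,  0.8, 1.2])   # hypothetical sensitivity factors
err = np.array([0.9,  0.7, 1.1,  0.6, 0.8])   # per-resonance measurement errors

def nll(M):
    """Negative log-likelihood under a_i ~ N(0, (M * s_i)^2 + err_i^2)."""
    return -np.sum(norm.logpdf(a, loc=0.0,
                               scale=np.sqrt((M * s) ** 2 + err ** 2)))

res = minimize_scalar(nll, bounds=(1e-6, 50.0), method="bounded")
print("ML estimate of the rms matrix element M:", res.x)
```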

  3. Israeli hospital preparedness for terrorism-related multiple casualty incidents: can the surge capacity and injury severity distribution be better predicted?

    Science.gov (United States)

    Kosashvili, Yona; Aharonson-Daniel, L; Daniel, Limor A; Peleg, Kobi; Horowitz, Ariel; Laor, Danny; Blumenfeld, Amir

    2009-07-01

    The incidence of large-scale urban attacks on civilian populations has significantly increased across the globe over the past decade. These incidents often result in hospital multiple casualty incidents (HMCIs), which are very challenging for hospital teams. Fifteen years ago the Emergency and Disaster Medicine Division in the Israeli Ministry of Health defined a key of 20 percent of each hospital's bed capacity as its readiness for multiple casualties, half of whom are expected to require immediate medical treatment. This study was performed to evaluate the efficacy of the current readiness guidelines based on the epidemiology of encountered HMCIs. A retrospective study was performed of HMCIs recorded by the Israeli Defense Force (IDF) Home Front Command and in the Israeli National Trauma Registry (ITR) between November 2000 and June 2003. An HMCI is defined by the Emergency and Disaster Medicine Division in the Israeli Ministry of Health as ≥10 casualties, or ≥4 casualties suffering from injuries with an ISS ≥16, arriving at a single hospital. The study includes a total of 32 attacks, resulting in 62 HMCIs and 1292 casualties. The mean number of casualties arriving at a single hospital was 20.8 ± 13.3 (range 4-56, median 16.5). In 95% of the HMCIs the casualty load was below the defined readiness threshold, suggesting that a revised readiness concept may improve the utilisation of national emergency health resources both in the preparation phase and in real time.

  4. Maximum likelihood approach for several stochastic volatility models

    International Nuclear Information System (INIS)

    Camprodon, Jordi; Perelló, Josep

    2012-01-01

    Volatility measures the amplitude of price fluctuations. Despite it being one of the most important quantities in finance, volatility is not directly observable. Here we apply a maximum likelihood method which assumes that price and volatility follow a two-dimensional diffusion process where volatility is the stochastic diffusion coefficient of the log-price dynamics. We apply this method to the simplest versions of the expOU, the OU and the Heston stochastic volatility models and we study their performance in terms of the log-price probability, the volatility probability, and its Mean First-Passage Time. The approach has some predictive power for the amplitude of future returns, knowing only the current volatility. The assumed models do not consider long-range volatility autocorrelation and the asymmetric return-volatility cross-correlation, but the method still yields very naturally these two important stylized facts. We apply the method to different market indices, with good performance in all cases. (paper)

  5. Incidence, Predictive Factors, and Prognosis of Chondrosarcoma in Patients with Ollier Disease and Maffucci Syndrome : An International Multicenter Study of 161 Patients

    NARCIS (Netherlands)

    Verdegaal, Suzan H. M.; Bovee, Judith V. M. G.; Pansuriya, Twinkal C.; Grimer, Robert J.; Ozger, Harzem; Jutte, Paul C.; San Julian, Mikel; Biau, David J.; van der Geest, Ingrid C. M.; Leithner, Andreas; Streitbuerger, Arne; Klenke, Frank M.; Gouin, Francois G.; Campanacci, Domenico A.; Marec-Berard, Perrine; Hogendoorn, Pancras C. W.; Brand, Ronald; Taminiau, Antonie H. M.

    2011-01-01

    Background. Enchondromatosis is characterized by the presence of multiple benign cartilage lesions in bone. While Ollier disease is typified by multiple enchondromas, in Maffucci syndrome these are associated with hemangiomas. Studies evaluating the predictive value of clinical symptoms for

  6. Climatic and ecological future of the Amazon: likelihood and causes of change

    OpenAIRE

    B. Cook; N. Zeng; J.-H. Yoon

    2010-01-01

    Some recent climate modeling results suggested a possible dieback of the Amazon rainforest under future climate change, a prediction that raised considerable interest as well as controversy. To determine the likelihood and causes of such changes, we analyzed the output of 15 models from the Intergovernmental Panel on Climate Change Fourth Assessment Report (IPCC/AR4) and a dynamic vegetation model VEGAS driven by these climate outputs. Our results suggest that the core of the Amazon rainforest...

  7. Lightning incidents in Mongolia

    Directory of Open Access Journals (Sweden)

    Myagmar Doljinsuren

    2015-11-01

    This is one of the first studies on the distribution of lightning incidents conducted in Mongolia. The study covers the 10-year period from 2004 to 2013. The country records a human death rate of 15.4 deaths per 10 million people per year, which is much higher than that of many countries with a similar isokeraunic level. The reason may be the low-grown vegetation observed in most rural areas of Mongolia, a surface topography typical of steppe climate. We suggest modifications to the Gomes–Kadir equation for such countries, as it predicts a much lower annual death rate for Mongolia. The lightning incidents spread over the period from May to August, with the peak number of incidents occurring in July. The worst lightning-affected region in the country is the central part. Compared with the impacts of other convective disasters such as squalls, thunderstorms and hail, lightning stands second highest in the number of incidents, human deaths and animal deaths. Economic losses due to lightning are only about 1% of the total losses due to the four extreme weather phenomena. However, unless precautionary measures are promoted among the public, this figure may significantly increase with time as the country is undergoing rapid industrialization at present.

  8. ldr: An R Software Package for Likelihood-Based Su?cient Dimension Reduction

    Directory of Open Access Journals (Sweden)

    Kofi Placid Adragni

    2014-11-01

    Full Text Available In regression settings, a sufficient dimension reduction (SDR) method seeks the core information in a p-vector predictor that completely captures its relationship with a response. The reduced predictor may reside in a lower dimension d < p, improving ability to visualize data and predict future observations, and mitigating dimensionality issues when carrying out further analysis. We introduce ldr, a new R software package that implements three recently proposed likelihood-based methods for SDR: covariance reduction, likelihood acquired directions, and principal fitted components. All three methods reduce the dimensionality of the data by projection into lower dimensional subspaces. The package also implements a variable screening method built upon principal fitted components which makes use of flexible basis functions to capture the dependencies between the predictors and the response. Examples are given to demonstrate likelihood-based SDR analyses using ldr, including estimation of the dimension of reduction subspaces and selection of basis functions. The ldr package provides a framework that we hope to grow into a comprehensive library of likelihood-based SDR methodologies.
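
    ldr itself is an R package; purely as a hedged illustration of one of the three methods it implements, the following Python sketch carries out a toy version of principal fitted components in the isotropic-error case, where the estimated reduction subspace is spanned by the leading eigenvectors of the covariance of the predictors fitted on a basis in the response. The polynomial basis and all dimensions are arbitrary choices for the example, not the package's defaults.

```python
import numpy as np

def pfc_isotropic(X, y, d, degree=3):
    """Toy principal fitted components (isotropic-error case): regress the
    centred predictors on a polynomial basis f(y); the reduction subspace is
    spanned by the top-d eigenvectors of the covariance of the fitted values."""
    n, p = X.shape
    Xc = X - X.mean(axis=0)
    F = np.column_stack([y**k for k in range(1, degree + 1)])
    F = F - F.mean(axis=0)                      # centred basis in the response
    B, *_ = np.linalg.lstsq(F, Xc, rcond=None)  # multivariate regression
    Xhat = F @ B
    sigma_fit = Xhat.T @ Xhat / n               # covariance of the fits
    vals, vecs = np.linalg.eigh(sigma_fit)
    Gamma = vecs[:, ::-1][:, :d]                # leading eigenvectors
    return Xc @ Gamma, Gamma                    # reduced predictors, basis

# Synthetic example: a 10-dimensional predictor whose dependence on y
# is concentrated in a single direction.
rng = np.random.default_rng(1)
y = rng.normal(size=500)
X = np.outer(y, rng.normal(size=10)) + rng.normal(size=(500, 10))
R, Gamma = pfc_isotropic(X, y, d=1)
```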

  9. Estimation Methods for Non-Homogeneous Regression - Minimum CRPS vs Maximum Likelihood

    Science.gov (United States)

    Gebetsberger, Manuel; Messner, Jakob W.; Mayr, Georg J.; Zeileis, Achim

    2017-04-01

    Non-homogeneous regression models are widely used to statistically post-process numerical weather prediction models. Such regression models correct for errors in mean and variance and are capable of forecasting a full probability distribution. To estimate the corresponding regression coefficients, CRPS minimization has been performed in many meteorological post-processing studies over the last decade. In contrast to maximum likelihood estimation, CRPS minimization is claimed to yield more calibrated forecasts. Theoretically, both scoring rules, used as optimization criteria, should be able to locate the same unknown optimum. Discrepancies might result from a wrong distributional assumption about the observed quantity. To address this theoretical concept, this study compares maximum likelihood and minimum CRPS estimation for different distributional assumptions. First, a synthetic case study shows that, for an appropriate distributional assumption, both estimation methods yield similar regression coefficients. The log-likelihood estimator is slightly more efficient. A real-world case study for surface temperature forecasts at different sites in Europe confirms these results but shows that surface temperature does not always follow the classical assumption of a Gaussian distribution. KEYWORDS: ensemble post-processing, maximum likelihood estimation, CRPS minimization, probabilistic temperature forecasting, distributional regression models
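
    As a minimal sketch of the comparison described above, the snippet below fits the same non-homogeneous Gaussian regression (mean linear in an ensemble-mean "forecast", constant spread) by maximum likelihood and by minimum CRPS, using the standard closed-form CRPS of a Gaussian predictive distribution; under a correct distributional assumption the two fits should roughly agree. The model structure and synthetic data are assumptions for illustration, not the study's setup.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def gaussian_crps(y, mu, sigma):
    """Closed-form CRPS of a N(mu, sigma^2) forecast for observation y."""
    z = (y - mu) / sigma
    return sigma * (z * (2 * norm.cdf(z) - 1) + 2 * norm.pdf(z)
                    - 1 / np.sqrt(np.pi))

def fit(y, m, objective):
    """Non-homogeneous regression: mu = a + b*m, sigma = exp(c)."""
    def loss(par):
        a, b, c = par
        mu, sigma = a + b * m, np.exp(c)
        if objective == "ml":
            return -np.sum(norm.logpdf(y, mu, sigma))
        return np.sum(gaussian_crps(y, mu, sigma))
    return minimize(loss, x0=[0.0, 1.0, 0.0], method="Nelder-Mead").x

# Synthetic, correctly specified Gaussian data: both fits should agree.
rng = np.random.default_rng(2)
m = rng.normal(size=400)                       # ensemble-mean "forecast"
y = 0.5 + 0.9 * m + rng.normal(scale=0.8, size=400)
print("ML  :", fit(y, m, "ml"))
print("CRPS:", fit(y, m, "crps"))
```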

  10. Posterior distributions for likelihood ratios in forensic science.

    Science.gov (United States)

    van den Hout, Ardo; Alberink, Ivo

    2016-09-01

    Evaluation of evidence in forensic science is discussed using posterior distributions for likelihood ratios. Instead of eliminating the uncertainty by integrating (Bayes factor) or by conditioning on parameter values, uncertainty in the likelihood ratio is retained by parameter uncertainty derived from posterior distributions. A posterior distribution for a likelihood ratio can be summarised by the median and credible intervals. Using the posterior mean of the distribution is not recommended. An analysis of forensic data for body height estimation is undertaken. The posterior likelihood approach has been criticised both theoretically and with respect to applicability. This paper addresses the latter and illustrates an interesting application area. Copyright © 2016 The Chartered Society of Forensic Sciences. Published by Elsevier Ireland Ltd. All rights reserved.
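
    A toy sketch of the idea, not the paper's forensic analysis: rather than plugging a point estimate of a nuisance parameter into the likelihood ratio, draws from its posterior are pushed through the ratio, and the resulting distribution is summarised by the median and a credible interval, as the abstract recommends. All distributions and numbers below are hypothetical.

```python
import numpy as np
from scipy.stats import norm, invgamma

# Toy evidence problem: likelihood ratio of a same-source model
# N(mu_s, sigma) against a population model N(mu_p, tau) for a
# questioned measurement x.  The nuisance parameter sigma^2 is
# uncertain, so posterior draws of it are pushed through the ratio.
rng = np.random.default_rng(3)
x, mu_s, mu_p, tau = 182.0, 181.0, 175.0, 7.0

# Hypothetical conjugate (inverse-gamma) posterior for sigma^2.
sigma2 = invgamma.rvs(a=10, scale=9 * 1.5**2, size=10_000, random_state=rng)
lr = norm.pdf(x, mu_s, np.sqrt(sigma2)) / norm.pdf(x, mu_p, tau)

print("median LR:", np.median(lr))
print("95% credible interval:", np.percentile(lr, [2.5, 97.5]))
```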

  11. Practical likelihood analysis for spatial generalized linear mixed models

    DEFF Research Database (Denmark)

    Bonat, W. H.; Ribeiro, Paulo Justiniano

    2016-01-01

    We investigate an algorithm for maximum likelihood estimation of spatial generalized linear mixed models based on the Laplace approximation. We compare our algorithm with a set of alternative approaches for two datasets from the literature. The Rhizoctonia root rot and the Rongelap data are, respectively, examples of binomial and count datasets modeled by spatial generalized linear mixed models. Our results show that the Laplace approximation provides similar estimates to Markov chain Monte Carlo likelihood, Monte Carlo expectation maximization, and modified Laplace approximation. Some advantages of the Laplace approximation include the computation of the maximized log-likelihood value, which can be used for model selection and tests, and the possibility to obtain realistic confidence intervals for model parameters based on profile likelihoods. The Laplace approximation also avoids the tuning...

  12. Algorithms of maximum likelihood data clustering with applications

    Science.gov (United States)

    Giada, Lorenzo; Marsili, Matteo

    2002-12-01

    We address the problem of data clustering by introducing an unsupervised, parameter-free approach based on the maximum likelihood principle. Starting from the observation that data sets belonging to the same cluster share common information, we construct an expression for the likelihood of any possible cluster structure. The likelihood in turn depends only on the Pearson correlation coefficients of the data. We discuss clustering algorithms that provide a fast and reliable approximation to maximum likelihood configurations. Compared to standard clustering methods, our approach has the advantages that (i) it is parameter free, (ii) the number of clusters need not be fixed in advance and (iii) the interpretation of the results is transparent. In order to test our approach and compare it with standard clustering algorithms, we analyze two very different data sets: time series of financial market returns and gene expression data. We find that different maximization algorithms produce similar cluster structures whereas the outcome of standard algorithms has a much wider variability.
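
    A hedged sketch of this approach: the cluster log-likelihood below is a from-memory transcription of the Giada-Marsili expression (depending only on cluster sizes and the summed internal correlations), combined with a naive greedy merging search. Treat both the formula and the search as illustrative rather than authoritative.

```python
import numpy as np

def gm_log_likelihood(C, labels):
    """Cluster log-likelihood computed from the correlation matrix C
    (a from-memory transcription of the Giada-Marsili expression)."""
    total = 0.0
    for s in np.unique(labels):
        idx = np.flatnonzero(labels == s)
        ns = len(idx)
        if ns < 2:
            continue                      # singletons contribute zero
        cs = C[np.ix_(idx, idx)].sum()    # summed internal correlations
        if cs <= 0 or cs >= ns**2:
            return -np.inf                # outside the formula's domain
        total += 0.5 * (np.log(ns / cs)
                        + (ns - 1) * np.log((ns**2 - ns) / (ns**2 - cs)))
    return total

def greedy_merge(C):
    """Merge the pair of clusters that most increases the likelihood,
    until no merge improves it (naive search, fine for small n)."""
    labels = np.arange(C.shape[0])
    best, improved = gm_log_likelihood(C, labels), True
    while improved:
        improved = False
        ids = np.unique(labels)
        for i in range(len(ids)):
            for j in range(i + 1, len(ids)):
                trial = np.where(labels == ids[j], ids[i], labels)
                lt = gm_log_likelihood(C, trial)
                if lt > best:
                    best, labels, improved = lt, trial, True
    return labels, best

# Two planted blocks of five correlated series each.
rng = np.random.default_rng(4)
f = rng.normal(size=(200, 2))
X = np.repeat(f, 5, axis=1) + 0.8 * rng.normal(size=(200, 10))
labels, ll = greedy_merge(np.corrcoef(X, rowvar=False))
print(labels, ll)
```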

  13. Generalized empirical likelihood methods for analyzing longitudinal data

    KAUST Repository

    Wang, S.; Qian, L.; Carroll, R. J.

    2010-01-01

    Efficient estimation of parameters is a major objective in analyzing longitudinal data. We propose two generalized empirical likelihood based methods that take into consideration within-subject correlations. A nonparametric version of the Wilks theorem...

  14. Maximum likelihood estimation of finite mixture model for economic data

    Science.gov (United States)

    Phoong, Seuk-Yen; Ismail, Mohd Tahir

    2014-06-01

    A finite mixture model is a mixture model with a finite number of components. These models provide a natural representation of heterogeneity across a finite number of latent classes, and are also known as latent class models or unsupervised learning models. Recently, maximum likelihood estimation of finite mixture models has drawn considerable attention from statisticians, mainly because maximum likelihood estimation is a powerful statistical method that provides consistent estimates as the sample size increases to infinity. Thus, maximum likelihood estimation is used to fit a finite mixture model in the present paper in order to explore the relationship between nonlinear economic data. A two-component normal mixture model is fitted by maximum likelihood estimation in order to investigate the relationship between stock market prices and rubber prices for the sampled countries. The results show a negative relationship between rubber price and stock market price for Malaysia, Thailand, the Philippines and Indonesia.
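
    The maximum likelihood fit of a two-component normal mixture is typically computed with the EM algorithm; the following self-contained sketch shows the standard E- and M-steps on synthetic "returns" data (all numbers invented, not the paper's dataset).

```python
import numpy as np
from scipy.stats import norm

def em_two_normal(x, iters=200):
    """Standard EM iteration for the ML fit of a two-component normal mixture."""
    w, mu1, mu2 = 0.5, x.min(), x.max()
    s1 = s2 = x.std()
    for _ in range(iters):
        # E-step: posterior probability that each point came from component 1.
        p1 = w * norm.pdf(x, mu1, s1)
        p2 = (1 - w) * norm.pdf(x, mu2, s2)
        r = p1 / (p1 + p2)
        # M-step: weighted maximum likelihood updates.
        w = r.mean()
        mu1, mu2 = np.average(x, weights=r), np.average(x, weights=1 - r)
        s1 = np.sqrt(np.average((x - mu1) ** 2, weights=r))
        s2 = np.sqrt(np.average((x - mu2) ** 2, weights=1 - r))
    return w, (mu1, s1), (mu2, s2)

# Invented two-regime "returns": a volatile component and a calm one.
rng = np.random.default_rng(5)
x = np.concatenate([rng.normal(-0.02, 0.010, 300),
                    rng.normal(0.01, 0.005, 700)])
print(em_two_normal(x))
```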

  15. Attitude towards, and likelihood of, complaining in the banking ...

    African Journals Online (AJOL)

    aims to determine customers' attitudes towards complaining as well as their likelihood of voicing a ... is particularly powerful and impacts greatly on customer satisfaction and retention. ... 'Cross-national analysis of hotel customers' attitudes ...

  16. Narrow band interference cancelation in OFDM: A structured maximum likelihood approach

    KAUST Repository

    Sohail, Muhammad Sadiq; Al-Naffouri, Tareq Y.; Al-Ghadhban, Samir N.

    2012-01-01

    This paper presents a maximum likelihood (ML) approach to mitigate the effect of narrow band interference (NBI) in a zero padded orthogonal frequency division multiplexing (ZP-OFDM) system. The NBI is assumed to be time variant and asynchronous

  17. On the likelihood function of Gaussian max-stable processes

    KAUST Repository

    Genton, M. G.; Ma, Y.; Sang, H.

    2011-01-01

    We derive a closed form expression for the likelihood function of a Gaussian max-stable process indexed by ℝ^d at p ≤ d+1 sites, d ≥ 1. We demonstrate the gain in efficiency in the maximum composite likelihood estimators of the covariance matrix from p=2 to p=3 sites in ℝ^2 by means of a Monte Carlo simulation study. © 2011 Biometrika Trust.

  18. On the likelihood function of Gaussian max-stable processes

    KAUST Repository

    Genton, M. G.

    2011-05-24

    We derive a closed form expression for the likelihood function of a Gaussian max-stable process indexed by ℝ^d at p ≤ d+1 sites, d ≥ 1. We demonstrate the gain in efficiency in the maximum composite likelihood estimators of the covariance matrix from p=2 to p=3 sites in ℝ^2 by means of a Monte Carlo simulation study. © 2011 Biometrika Trust.

  19. Tapered composite likelihood for spatial max-stable models

    KAUST Repository

    Sang, Huiyan

    2014-05-01

    Spatial extreme value analysis is useful to environmental studies, in which extreme value phenomena are of interest and meaningful spatial patterns can be discerned. Max-stable process models are able to describe such phenomena. This class of models is asymptotically justified to characterize the spatial dependence among extremes. However, likelihood inference is challenging for such models because their corresponding joint likelihood is unavailable and only bivariate or trivariate distributions are known. In this paper, we propose a tapered composite likelihood approach by utilizing lower dimensional marginal likelihoods for inference on parameters of various max-stable process models. We consider a weighting strategy based on a "taper range" to exclude distant pairs or triples. The "optimal taper range" is selected to maximize various measures of the Godambe information associated with the tapered composite likelihood function. This method substantially reduces the computational cost and improves the efficiency over equally weighted composite likelihood estimators. We illustrate its utility with simulation experiments and an analysis of rainfall data in Switzerland.
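
    A skeleton of the tapered pairwise idea follows, with two loudly flagged simplifications: a bivariate Gaussian with exponential correlation stands in for the model-specific max-stable pair densities, and the taper is a hard distance cutoff rather than a range optimized against the Godambe information as in the paper.

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.stats import multivariate_normal

def tapered_pairwise_nll(theta, X, coords, taper_range):
    """Tapered pairwise negative log-likelihood skeleton: pairs of sites
    farther apart than taper_range get weight zero and are skipped.
    A bivariate Gaussian with exponential correlation stands in for the
    model-specific max-stable pair density."""
    sill, scale = theta
    if sill <= 0 or scale <= 0:
        return np.inf
    D = squareform(pdist(coords))
    n = coords.shape[0]
    nll = 0.0
    for i in range(n):
        for j in range(i + 1, n):
            if D[i, j] > taper_range:     # taper weight w_ij = 0
                continue
            rho = np.exp(-D[i, j] / scale)
            cov = sill * np.array([[1.0, rho], [rho, 1.0]])
            nll -= multivariate_normal.logpdf(X[:, [i, j]], cov=cov).sum()
    return nll

# 50 replicates observed at 15 random sites; only nearby pairs contribute.
rng = np.random.default_rng(10)
coords = rng.uniform(0, 10, size=(15, 2))
X = rng.normal(size=(50, 15))
print(tapered_pairwise_nll((1.0, 2.0), X, coords, taper_range=3.0))
```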

  20. Tapered composite likelihood for spatial max-stable models

    KAUST Repository

    Sang, Huiyan; Genton, Marc G.

    2014-01-01

    Spatial extreme value analysis is useful to environmental studies, in which extreme value phenomena are of interest and meaningful spatial patterns can be discerned. Max-stable process models are able to describe such phenomena. This class of models is asymptotically justified to characterize the spatial dependence among extremes. However, likelihood inference is challenging for such models because their corresponding joint likelihood is unavailable and only bivariate or trivariate distributions are known. In this paper, we propose a tapered composite likelihood approach by utilizing lower dimensional marginal likelihoods for inference on parameters of various max-stable process models. We consider a weighting strategy based on a "taper range" to exclude distant pairs or triples. The "optimal taper range" is selected to maximize various measures of the Godambe information associated with the tapered composite likelihood function. This method substantially reduces the computational cost and improves the efficiency over equally weighted composite likelihood estimators. We illustrate its utility with simulation experiments and an analysis of rainfall data in Switzerland.

  1. An evaluation of the use of remotely sensed parameters for prediction of incidence and risk associated with Vibrio parahaemolyticus in Gulf Coast oysters (Crassostrea virginica).

    Science.gov (United States)

    Phillips, A M B; Depaola, A; Bowers, J; Ladner, S; Grimes, D J

    2007-04-01

    The U.S. Food and Drug Administration recently published a Vibrio parahaemolyticus risk assessment for consumption of raw oysters that predicts V. parahaemolyticus densities at harvest based on water temperature. We retrospectively compared archived remotely sensed measurements (sea surface temperature, chlorophyll, and turbidity) with previously published data from an environmental study of V. parahaemolyticus in Alabama oysters to assess the utility of the former data for predicting V. parahaemolyticus densities in oysters. Remotely sensed sea surface temperature correlated well with previous in situ measurements (R² = 0.86) of bottom water temperature, supporting the notion that remotely sensed sea surface temperature data are a sufficiently accurate substitute for direct measurement. Turbidity and chlorophyll levels were not determined in the previous study, but in comparison with the V. parahaemolyticus data, remotely sensed values for these parameters may explain some of the variation in V. parahaemolyticus levels. More accurate determination of these effects and the temporal and spatial variability of these parameters may further improve the accuracy of prediction models. To illustrate the utility of remotely sensed data as a basis for risk management, predictions based on the U.S. Food and Drug Administration V. parahaemolyticus risk assessment model were integrated with remotely sensed sea surface temperature data to display graphically variations in V. parahaemolyticus density in oysters associated with spatial variations in water temperature. We believe images such as these could be posted in near real time, and that the availability of such information in a user-friendly format could be the basis for timely and informed risk management decisions.

  2. Prediction of SO₂ pollution incidents near a power station using partially linear models and an historical matrix of predictor-response vectors

    Energy Technology Data Exchange (ETDEWEB)

    Prada-Sanchez, J.M.; Febrero-Bande, M.; Gonzalez-Manteiga, W. [Universidad de Santiago de Compostela, Dept. de Estadistica e Investigacion Operativa, Santiago de Compostela (Spain); Costos-Yanez, T. [Universidad de Vigo, Dept. de Estadistica e Investigacion Operativa, Orense (Spain); Bermudez-Cela, J.L.; Lucas-Dominguez, T. [Laboratorio, Central Termica de As Pontes, La Coruna (Spain)

    2000-07-01

    Atmospheric SO₂ concentrations at sampling stations near the fossil fuel fired power station at As Pontes (La Coruna, Spain) were predicted using a model for the corresponding time series consisting of a self-explicative term and a linear combination of exogenous variables. In a supplementary simulation study, models of this kind behaved better than the corresponding pure self-explicative or pure linear regression models. (Author)

  3. Prediction of SO₂ pollution incidents near a power station using partially linear models and an historical matrix of predictor-response vectors

    International Nuclear Information System (INIS)

    Prada-Sanchez, J.M.; Febrero-Bande, M.; Gonzalez-Manteiga, W.; Costos-Yanez, T.; Bermudez-Cela, J.L.; Lucas-Dominguez, T.

    2000-01-01

    Atmospheric SO₂ concentrations at sampling stations near the fossil fuel fired power station at As Pontes (La Coruna, Spain) were predicted using a model for the corresponding time series consisting of a self-explicative term and a linear combination of exogenous variables. In a supplementary simulation study, models of this kind behaved better than the corresponding pure self-explicative or pure linear regression models. (Author)

  4. Physical frailty predicts incident depressive symptoms in elderly people: prospective findings from the Obu Study of Health Promotion for the Elderly.

    Science.gov (United States)

    Makizako, Hyuma; Shimada, Hiroyuki; Doi, Takehiko; Yoshida, Daisuke; Anan, Yuya; Tsutsumimoto, Kota; Uemura, Kazuki; Liu-Ambrose, Teresa; Park, Hyuntae; Lee, Sanyoon; Suzuki, Takao

    2015-03-01

    The purpose of this study was to determine whether frailty is an important and independent predictor of incident depressive symptoms in elderly people without depressive symptoms at baseline. Fifteen-month prospective study. General community in Japan. A total of 3025 community-dwelling elderly people aged 65 years or over without depressive symptoms at baseline. The self-rated 15-item Geriatric Depression Scale was used to assess symptoms of depression, with a score of 6 or more, at baseline and 15-month follow-up. Participants underwent a structured interview designed to obtain demographic factors and frailty status, and completed cognitive testing with the Mini-Mental State Examination and physical performance testing with the Short Physical Performance Battery as potential predictors. At a 15-month follow-up survey, 226 participants (7.5%) reported the development of depressive symptoms. We found that frailty (adjusted odds ratio 1.86, 95% confidence interval 1.30-2.66) and poor self-rated general health predicted incident depressive symptoms independently of baseline Mini-Mental State Examination, Short Physical Performance Battery, and Geriatric Depression Scale scores. Our findings suggested that frailty and poor self-rated general health were independent predictors of depressive symptoms in community-dwelling elderly people. Copyright © 2015 AMDA – The Society for Post-Acute and Long-Term Care Medicine. Published by Elsevier Inc. All rights reserved.

  5. Updated logistic regression equations for the calculation of post-fire debris-flow likelihood in the western United States

    Science.gov (United States)

    Staley, Dennis M.; Negri, Jacquelyn A.; Kean, Jason W.; Laber, Jayme L.; Tillery, Anne C.; Youberg, Ann M.

    2016-06-30

    Wildfire can significantly alter the hydrologic response of a watershed to the extent that even modest rainstorms can generate dangerous flash floods and debris flows. To reduce public exposure to hazard, the U.S. Geological Survey produces post-fire debris-flow hazard assessments for select fires in the western United States. We use publicly available geospatial data describing basin morphology, burn severity, soil properties, and rainfall characteristics to estimate the statistical likelihood that debris flows will occur in response to a storm of a given rainfall intensity. Using an empirical database and refined geospatial analysis methods, we defined new equations for the prediction of debris-flow likelihood using logistic regression methods. We showed that the new logistic regression model outperformed previous models used to predict debris-flow likelihood.
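
    The underlying model is ordinary logistic regression fitted by maximum likelihood; a minimal sketch follows, in which the predictors are hypothetical stand-ins for the basin morphology, burn severity, soil, and rainfall variables named above (the published equations and coefficients are not reproduced here).

```python
import numpy as np
from scipy.optimize import minimize

def neg_loglik(beta, X, y):
    """Negative Bernoulli log-likelihood of p = 1 / (1 + exp(-X @ beta))."""
    eta = X @ beta
    return np.sum(np.logaddexp(0.0, eta)) - y @ eta   # stable log(1 + e^eta)

# Hypothetical basin-storm records: intercept, a burn-severity proxy,
# a soil-erodibility proxy, and 15-minute rainfall intensity (mm/h).
rng = np.random.default_rng(6)
n = 500
X = np.column_stack([np.ones(n), rng.uniform(0, 1, n),
                     rng.uniform(0, 1, n), rng.gamma(2.0, 10.0, n)])
eta_true = -4.0 + 1.5 * X[:, 1] + 1.0 * X[:, 2] + 0.08 * X[:, 3]
y = (rng.uniform(size=n) < 1 / (1 + np.exp(-eta_true))).astype(float)

fit = minimize(neg_loglik, np.zeros(4), args=(X, y), method="BFGS")
p_hat = 1 / (1 + np.exp(-(X @ fit.x)))    # fitted debris-flow likelihoods
print(fit.x)
```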

  6. Image properties of list mode likelihood reconstruction for a rectangular positron emission mammography with DOI measurements

    International Nuclear Information System (INIS)

    Qi, Jinyi; Klein, Gregory J.; Huesman, Ronald H.

    2000-01-01

    A positron emission mammography scanner is under development at our laboratory. The tomograph has a rectangular geometry consisting of four banks of detector modules. For each detector, the system can measure the depth of interaction inside the crystal. The rectangular geometry leads to irregular radial and angular sampling and spatially variant sensitivity that differ from conventional PET systems. It is therefore important to study the image properties of the reconstructions. We adapted the theoretical analysis that we had developed for conventional PET systems to the list mode likelihood reconstruction for this tomograph. The local impulse response and covariance of the reconstruction can be easily computed using FFT. These theoretical results are also used with computer observer models to compute the signal-to-noise ratio for lesion detection. The analysis reveals the spatially variant resolution and noise properties of the list mode likelihood reconstruction. The theoretical predictions are in good agreement with Monte Carlo results.

  7. Constraint likelihood analysis for a network of gravitational wave detectors

    International Nuclear Information System (INIS)

    Klimenko, S.; Rakhmanov, M.; Mitselmakher, G.; Mohanty, S.

    2005-01-01

    We propose a coherent method for detection and reconstruction of gravitational wave signals with a network of interferometric detectors. The method is derived by using the likelihood ratio functional for unknown signal waveforms. In the likelihood analysis, the global maximum of the likelihood ratio over the space of waveforms is used as the detection statistic. We identify a problem with this approach: in the case of an aligned pair of detectors, the detection statistic depends on the cross correlation between the detectors as expected, but this dependence disappears even for infinitesimally small misalignments. We solve the problem by applying constraints on the likelihood functional and obtain a new class of statistics. The resulting method can be applied to data from a network consisting of any number of detectors with arbitrary orientations, and allows reconstruction of the source coordinates and of the waveforms of the two polarization components of a gravitational wave. We study the performance of the method with numerical simulations and find the reconstruction of the source coordinates to be more accurate than in the standard likelihood method.

  8. A 2-stage ovarian cancer screening strategy using the Risk of Ovarian Cancer Algorithm (ROCA) identifies early-stage incident cancers and demonstrates high positive predictive value.

    Science.gov (United States)

    Lu, Karen H; Skates, Steven; Hernandez, Mary A; Bedi, Deepak; Bevers, Therese; Leeds, Leroy; Moore, Richard; Granai, Cornelius; Harris, Steven; Newland, William; Adeyinka, Olasunkanmi; Geffen, Jeremy; Deavers, Michael T; Sun, Charlotte C; Horick, Nora; Fritsche, Herbert; Bast, Robert C

    2013-10-01

    A 2-stage ovarian cancer screening strategy was evaluated that incorporates change of carbohydrate antigen 125 (CA125) levels over time and age to estimate risk of ovarian cancer. Women with high-risk scores were referred for transvaginal ultrasound (TVS). A single-arm, prospective study of postmenopausal women was conducted. Participants underwent an annual CA125 blood test. Based on the Risk of Ovarian Cancer Algorithm (ROCA) result, women were triaged to next annual CA125 test (low risk), repeat CA125 test in 3 months (intermediate risk), or TVS and referral to a gynecologic oncologist (high risk). A total of 4051 women participated over 11 years. The average annual rate of referral to a CA125 test in 3 months was 5.8%, and the average annual referral rate to TVS and review by a gynecologic oncologist was 0.9%. Ten women underwent surgery on the basis of TVS, with 4 invasive ovarian cancers (1 with stage IA disease, 2 with stage IC disease, and 1 with stage IIB disease), 2 ovarian tumors of low malignant potential (both stage IA), 1 endometrial cancer (stage I), and 3 benign ovarian tumors, providing a positive predictive value of 40% (95% confidence interval = 12.2%, 73.8%) for detecting invasive ovarian cancer. The specificity was 99.9% (95% confidence interval = 99.7%, 100%). All 4 women with invasive ovarian cancer were enrolled in the study for at least 3 years with low-risk annual CA125 test values prior to rising CA125 levels. ROCA followed by TVS demonstrated excellent specificity and positive predictive value in a population of US women at average risk for ovarian cancer. Copyright © 2013 American Cancer Society.

  9. Profile-likelihood Confidence Intervals in Item Response Theory Models.

    Science.gov (United States)

    Chalmers, R Philip; Pek, Jolynn; Liu, Yang

    2017-01-01

    Confidence intervals (CIs) are fundamental inferential devices which quantify the sampling variability of parameter estimates. In item response theory, CIs have been primarily obtained from large-sample Wald-type approaches based on standard error estimates, derived from the observed or expected information matrix, after parameters have been estimated via maximum likelihood. An alternative approach to constructing CIs is to quantify sampling variability directly from the likelihood function with a technique known as profile-likelihood confidence intervals (PL CIs). In this article, we introduce PL CIs for item response theory models, compare PL CIs to classical large-sample Wald-type CIs, and demonstrate important distinctions among these CIs. CIs are then constructed for parameters directly estimated in the specified model and for transformed parameters which are often obtained post-estimation. Monte Carlo simulation results suggest that PL CIs perform consistently better than Wald-type CIs for both non-transformed and transformed parameters.
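
    A minimal sketch of the profiling mechanics on a deliberately simple non-IRT example: the mean of a normal sample with the variance profiled out, and the 95% interval defined by the chi-square(1) cutoff on twice the drop in the profile log-likelihood.

```python
import numpy as np
from scipy.optimize import brentq
from scipy.stats import chi2

def profile_loglik(mu, x):
    """Profile log-likelihood of a normal mean: the variance is replaced by
    its conditional MLE sigma^2(mu) = mean((x - mu)^2) (constants dropped)."""
    return -0.5 * len(x) * np.log(np.mean((x - mu) ** 2))

rng = np.random.default_rng(7)
x = rng.normal(10.0, 2.0, size=40)
mu_hat = x.mean()

# 95% PL CI: all mu where twice the log-likelihood drop stays below
# the chi-square(1) critical value.
cut = profile_loglik(mu_hat, x) - 0.5 * chi2.ppf(0.95, df=1)
g = lambda mu: profile_loglik(mu, x) - cut
lo = brentq(g, mu_hat - 10 * x.std(), mu_hat)
hi = brentq(g, mu_hat, mu_hat + 10 * x.std())
print(f"95% PL CI for mu: ({lo:.2f}, {hi:.2f})")
```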

  10. Generalized empirical likelihood methods for analyzing longitudinal data

    KAUST Repository

    Wang, S.

    2010-02-16

    Efficient estimation of parameters is a major objective in analyzing longitudinal data. We propose two generalized empirical likelihood based methods that take into consideration within-subject correlations. A nonparametric version of the Wilks theorem for the limiting distributions of the empirical likelihood ratios is derived. It is shown that one of the proposed methods is locally efficient among a class of within-subject variance-covariance matrices. A simulation study is conducted to investigate the finite sample properties of the proposed methods and compare them with the block empirical likelihood method by You et al. (2006) and the normal approximation with a correctly estimated variance-covariance. The results suggest that the proposed methods are generally more efficient than existing methods which ignore the correlation structure, and better in coverage compared to the normal approximation with correctly specified within-subject correlation. An application illustrating our methods and supporting the simulation study results is also presented.

  11. Associations between active shooter incidents and gun ownership and storage among families with young children in the United States.

    Science.gov (United States)

    Morrissey, Taryn W

    2017-07-01

    The presence of firearms and their unsafe storage in the home can increase risk of firearm-related death and injury, but public opinion suggests that firearm ownership is a protective factor against gun violence. This study examined the effects of a recent nearby active shooter incident on gun ownership and storage practices among families with young children. A series of regression models, with data from the nationally representative Early Childhood Longitudinal Study-Birth Cohort merged with the FBI's Active Shooter Incidents data collected in 2003-2006, were used to examine whether household gun ownership and storage practices differed in the months prior to and following an active shooter incident that occurred anywhere in the United States or within the same state. Approximately one-fifth of young children lived in households with one or more guns; of these children, only two-thirds lived in homes that stored all guns in locked cabinets. Results suggest that the experience of a recent active shooter incident was associated with an increased likelihood of storing all guns locked, with the magnitude dependent on the temporal and geographic proximity of the incident. The severity of the incident, defined as the number of fatalities, predicted an increase in storing guns locked. Findings suggest that public shootings change behaviors related to firearm storage among families with young children. Copyright © 2017 Elsevier Inc. All rights reserved.

  12. Unbinned likelihood maximisation framework for neutrino clustering in Python

    Energy Technology Data Exchange (ETDEWEB)

    Coenders, Stefan [Technische Universitaet Muenchen, Boltzmannstr. 2, 85748 Garching (Germany)

    2016-07-01

    Although an astrophysical neutrino flux has been detected with IceCube, the sources of astrophysical neutrinos remain hidden up to now. A detection of a neutrino point source would be a smoking gun for hadronic processes and the acceleration of cosmic rays. The search for neutrino sources has many degrees of freedom, for example steady versus transient or point-like versus extended sources. Here, we introduce a Python framework designed for unbinned likelihood maximisations as used in searches for neutrino point sources by IceCube. By implementing source scenarios in a modular way, likelihood searches of various kinds can be implemented in a user-friendly way, without sacrificing speed and memory management.
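
    Not the IceCube framework itself, but a one-dimensional toy with the same likelihood structure: each event is modelled as a mixture of a signal PDF concentrated at a hypothetical source position and a flat background, and the number of signal events is fitted by unbinned likelihood maximisation.

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import norm

def neg_unbinned_loglik(ns, events, n_total, signal_pdf, bkg_pdf):
    """Each event is a mixture of a signal PDF and a flat background;
    ns is the fitted number of signal events."""
    f = ns / n_total
    return -np.sum(np.log(f * signal_pdf(events) + (1 - f) * bkg_pdf(events)))

# Toy one-dimensional "sky": uniform background on [0, 360) degrees plus
# a small Gaussian excess at a hypothetical source position.
rng = np.random.default_rng(8)
src, width = 120.0, 2.0
events = np.concatenate([rng.uniform(0, 360, 980),
                         rng.normal(src, width, 20)])
n = len(events)

res = minimize_scalar(
    neg_unbinned_loglik, bounds=(0.0, 100.0), method="bounded",
    args=(events, n, lambda x: norm.pdf(x, src, width),
          lambda x: np.full_like(x, 1 / 360)))
# 2 * (logL(ns_hat) - logL(0)) is the usual discovery test statistic.
print("fitted number of signal events:", res.x)
```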

  13. Nearly Efficient Likelihood Ratio Tests of the Unit Root Hypothesis

    DEFF Research Database (Denmark)

    Jansson, Michael; Nielsen, Morten Ørregaard

    Seemingly absent from the arsenal of currently available "nearly efficient" testing procedures for the unit root hypothesis, i.e. tests whose local asymptotic power functions are indistinguishable from the Gaussian power envelope, is a test admitting a (quasi-)likelihood ratio interpretation. We show that the likelihood ratio unit root test derived in a Gaussian AR(1) model with standard normal innovations is nearly efficient in that model. Moreover, these desirable properties carry over to more complicated models allowing for serially correlated and/or non-Gaussian innovations.

  14. A note on estimating errors from the likelihood function

    International Nuclear Information System (INIS)

    Barlow, Roger

    2005-01-01

    The points at which the log likelihood falls by 1/2 from its maximum value are often used to give the 'errors' on a result, i.e. the 68% central confidence interval. The validity of this is examined for two simple cases: a lifetime measurement and a Poisson measurement. Results are compared with the exact Neyman construction and with the simple Bartlett approximation. It is shown that the accuracy of the log likelihood method is poor, and the Bartlett construction explains why it is flawed.
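
    The rule under discussion, stated explicitly (it is exact only for a Gaussian likelihood, which is precisely the approximation whose accuracy the note questions):

```latex
\ln L(\hat{\theta}) - \ln L(\theta_{\pm}) = \tfrac{1}{2}
\qquad\Longrightarrow\qquad
[\theta_{-},\,\theta_{+}] \approx \hat{\theta} \pm \sigma_{\hat{\theta}}
\quad \text{(68\% central interval in the Gaussian limit)}
```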

  15. Nearly Efficient Likelihood Ratio Tests for Seasonal Unit Roots

    DEFF Research Database (Denmark)

    Jansson, Michael; Nielsen, Morten Ørregaard

    In an important generalization of zero frequency autoregressive unit root tests, Hylleberg, Engle, Granger, and Yoo (1990) developed regression-based tests for unit roots at the seasonal frequencies in quarterly time series. We develop likelihood ratio tests for seasonal unit roots and show that these tests are "nearly efficient" in the sense of Elliott, Rothenberg, and Stock (1996), i.e. that their local asymptotic power functions are indistinguishable from the Gaussian power envelope. Currently available nearly efficient testing procedures for seasonal unit roots are regression-based and require the choice of a GLS detrending parameter, which our likelihood ratio tests do not.

  16. LDR: A Package for Likelihood-Based Sufficient Dimension Reduction

    Directory of Open Access Journals (Sweden)

    R. Dennis Cook

    2011-03-01

    Full Text Available We introduce a new MATLAB software package that implements several recently proposed likelihood-based methods for sufficient dimension reduction. Current capabilities include estimation of reduced subspaces with a fixed dimension d, as well as estimation of d by use of likelihood-ratio testing, permutation testing and information criteria. The methods are suitable for preprocessing data for both regression and classification. Implementations of related estimators are also available. Although the software is more oriented to command-line operation, a graphical user interface is also provided for prototype computations.

  17. Likelihood ratio decisions in memory: three implied regularities.

    Science.gov (United States)

    Glanzer, Murray; Hilford, Andrew; Maloney, Laurence T

    2009-06-01

    We analyze four general signal detection models for recognition memory that differ in their distributional assumptions. Our analyses show that a basic assumption of signal detection theory, the likelihood ratio decision axis, implies three regularities in recognition memory: (1) the mirror effect, (2) the variance effect, and (3) the z-ROC length effect. For each model, we present the equations that produce the three regularities and show, in computed examples, how they do so. We then show that the regularities appear in data from a range of recognition studies. The analyses and data in our study support the following generalization: Individuals make efficient recognition decisions on the basis of likelihood ratios.

  18. Goiania incident case study

    International Nuclear Information System (INIS)

    Petterson, J.S.

    1988-06-01

    The reasons for wanting to document this case study and present the findings are simple. According to USDOE technical risk assessments (and our own initial work on the Hanford socioeconomic study), the likelihood of a major accident involving exposure to radioactive materials in the process of site characterization, construction, operation, and closure of a high-level waste repository is extremely remote. Most would agree, however, that there is a relatively high probability that a minor accident involving radiological contamination will occur sometime during the lifetime of the repository -- for example, during transport, at an MRS site or at the permanent site itself during repacking and deposition. Thus, one of the major concerns of the Yucca Mountain Socioeconomic Study is the potential impact of a relatively minor radiation-related accident. A large number of potential accident scenarios have been under consideration (such as a transportation or other surface accident which results in a significant decline in tourism, the number of conventions, or the selection of Nevada as a retirement residence). The results of the work in Goiania make it clear, however, that such a significant shift in established social patterns and trends is not likely to occur as a direct outcome of a single nuclear-related accident (even, perhaps, a relatively major one), but rather as a result of the enduring social interpretations of such an accident -- that is, as a result of the process of understanding, communicating, and socially sustaining a particular set of associations with respect to the initial incident.

  19. Deoxyribonucleic acid telomere length shortening can predict the incidence of non-alcoholic fatty liver disease in patients with type 2 diabetes mellitus.

    Science.gov (United States)

    Ping, Fan; Li, Zeng-Yi; Lv, Ke; Zhou, Mei-Cen; Dong, Ya-Xiu; Sun, Qi; Li, Yu-Xiu

    2017-03-01

    To investigate the effect of telomere shortening and other predictive factors on the incidence of non-alcoholic fatty liver disease (NAFLD) in type 2 diabetes mellitus patients in a 6-year prospective cohort study. A total of 70 type 2 diabetes mellitus patients (mean age 57.8 ± 6.7 years) without NAFLD were included in the study, and 64 of them were successfully followed up 6 years later, excluding four cases with significant alcohol consumption. NAFLD was diagnosed by the hepatorenal ratio obtained by a quantitative ultrasound method using NIH image analysis software. The 39 individuals who developed NAFLD were allocated to group A, and the 21 individuals who did not develop NAFLD were allocated to group B. Fluorescent real-time quantitative polymerase chain reaction was used to measure telomere length. There was no significant difference between the two groups in baseline telomere length; however, at the end of the 6th year, telomere length had become shorter in group A compared with group B. There were significant differences between these two groups in baseline body mass index, waistline, systolic blood pressure, glycated hemoglobin and fasting C-peptide level. In addition, the estimated indices of baseline insulin resistance were higher in group A. Fasting insulin level, body mass index, systolic blood pressure at baseline and the shortening of telomere length were independent risk factors of NAFLD in type 2 diabetes mellitus patients. Telomere length became shorter in type 2 diabetes mellitus patients who developed NAFLD over the course of 6 years, and patients who developed NAFLD already showed more serious insulin resistance at baseline than those who did not. © 2016 The Authors. Journal of Diabetes Investigation published by Asian Association for the Study of Diabetes (AASD) and John Wiley & Sons Australia, Ltd.

  20. Incidence and Predictive Factors of Pain Flare After Spine Stereotactic Body Radiation Therapy: Secondary Analysis of Phase 1/2 Trials

    International Nuclear Information System (INIS)

    Pan, Hubert Y.; Allen, Pamela K.; Wang, Xin S.; Chang, Eric L.; Rhines, Laurence D.; Tatsui, Claudio E.; Amini, Behrang; Wang, Xin A.; Tannir, Nizar M.; Brown, Paul D.; Ghia, Amol J.

    2014-01-01

    Purpose/Objective(s): To perform a secondary analysis of institutional prospective spine stereotactic body radiation therapy (SBRT) trials to investigate posttreatment acute pain flare. Methods and Materials: Medical records for enrolled patients were reviewed. Study protocol included baseline and follow-up surveys with pain assessment by Brief Pain Inventory and documentation of pain medications. Patients were considered evaluable for pain flare if clinical note or follow-up survey was completed within 2 weeks of SBRT. Pain flare was defined as a clinical note indicating increased pain at the treated site or survey showing a 2-point increase in worst pain score, a 25% increase in analgesic intake, or the initiation of steroids. Binary logistic regression was used to determine predictive factors for pain flare occurrence. Results: Of the 210 enrolled patients, 195 (93%) were evaluable for pain flare, including 172 (88%) clinically, 135 (69%) by survey, and 112 (57%) by both methods. Of evaluable patients, 61 (31%) had undergone prior surgery, 57 (29%) had received prior radiation, and 34 (17%) took steroids during treatment, mostly for prior conditions. Pain flare was observed in 44 patients (23%). Median time to pain flare was 5 days (range, 0-20 days) after the start of treatment. On multivariate analysis, the only independent factor associated with pain flare was the number of treatment fractions (odds ratio = 0.66, P=.004). Age, sex, performance status, spine location, number of treated vertebrae, prior radiation, prior surgery, primary tumor histology, baseline pain score, and steroid use were not significant. Conclusions: Acute pain flare after spine SBRT is a relatively common event, for which patients should be counseled. Additional study is needed to determine whether prophylactic or symptomatic intervention is preferred.

  1. Incidence and Predictive Factors of Pain Flare After Spine Stereotactic Body Radiation Therapy: Secondary Analysis of Phase 1/2 Trials

    Energy Technology Data Exchange (ETDEWEB)

    Pan, Hubert Y.; Allen, Pamela K. [Department of Radiation Oncology, University of Texas MD Anderson Cancer, Houston, Texas (United States); Wang, Xin S. [Department of Symptom Research, University of Texas MD Anderson Cancer, Houston, Texas (United States); Chang, Eric L. [Department of Radiation Oncology, University of Texas MD Anderson Cancer, Houston, Texas (United States); Department of Radiation Oncology, USC Norris Cancer Center, Los Angeles, California (United States); Rhines, Laurence D.; Tatsui, Claudio E. [Department of Neurosurgery, University of Texas MD Anderson Cancer, Houston, Texas (United States); Amini, Behrang [Department of Diagnostic Radiology, University of Texas MD Anderson Cancer, Houston, Texas (United States); Wang, Xin A. [Department of Radiation Physics, University of Texas MD Anderson Cancer, Houston, Texas (United States); Tannir, Nizar M. [Department of Genitourinary Medical Oncology, University of Texas MD Anderson Cancer, Houston, Texas (United States); Brown, Paul D. [Department of Radiation Oncology, University of Texas MD Anderson Cancer, Houston, Texas (United States); Ghia, Amol J., E-mail: AJGhia@mdanderson.org [Department of Radiation Oncology, University of Texas MD Anderson Cancer, Houston, Texas (United States)

    2014-11-15

    Purpose/Objective(s): To perform a secondary analysis of institutional prospective spine stereotactic body radiation therapy (SBRT) trials to investigate posttreatment acute pain flare. Methods and Materials: Medical records for enrolled patients were reviewed. Study protocol included baseline and follow-up surveys with pain assessment by Brief Pain Inventory and documentation of pain medications. Patients were considered evaluable for pain flare if clinical note or follow-up survey was completed within 2 weeks of SBRT. Pain flare was defined as a clinical note indicating increased pain at the treated site or survey showing a 2-point increase in worst pain score, a 25% increase in analgesic intake, or the initiation of steroids. Binary logistic regression was used to determine predictive factors for pain flare occurrence. Results: Of the 210 enrolled patients, 195 (93%) were evaluable for pain flare, including 172 (88%) clinically, 135 (69%) by survey, and 112 (57%) by both methods. Of evaluable patients, 61 (31%) had undergone prior surgery, 57 (29%) had received prior radiation, and 34 (17%) took steroids during treatment, mostly for prior conditions. Pain flare was observed in 44 patients (23%). Median time to pain flare was 5 days (range, 0-20 days) after the start of treatment. On multivariate analysis, the only independent factor associated with pain flare was the number of treatment fractions (odds ratio = 0.66, P=.004). Age, sex, performance status, spine location, number of treated vertebrae, prior radiation, prior surgery, primary tumor histology, baseline pain score, and steroid use were not significant. Conclusions: Acute pain flare after spine SBRT is a relatively common event, for which patients should be counseled. Additional study is needed to determine whether prophylactic or symptomatic intervention is preferred.

  2. Upgrading and downgrading of prostate cancer from biopsy to radical prostatectomy: incidence and predictive factors using the modified Gleason grading system and factoring in tertiary grades.

    Science.gov (United States)

    Epstein, Jonathan I; Feng, Zhaoyong; Trock, Bruce J; Pierorazio, Phillip M

    2012-05-01

    Prior studies assessing the correlation of Gleason score (GS) at needle biopsy and corresponding radical prostatectomy (RP) predated the use of the modified Gleason scoring system and did not factor in tertiary grade patterns. To assess the relation of biopsy and RP grade in the largest study to date. A total of 7643 totally embedded RP and corresponding needle biopsies (2004-2010) were analyzed according to the updated Gleason system. All patients underwent prostate biopsy prior to RP. The relation of upgrading or downgrading to patient and cancer characteristics was compared using the chi-square test, Student t test, and multivariable logistic regression. A total of 36.3% of cases were upgraded from a needle biopsy GS 5-6 to a higher grade at RP (11.2% with GS 6 plus tertiary). Half of the cases had matching GS 3+4=7 at biopsy and RP, with an approximately equal number of cases downgraded and upgraded at RP. With biopsy GS 4+3=7, RP GS was almost equally 3+4=7 and 4+3=7. Biopsy GS 8 led to an almost equal distribution between RP GS 4+3=7, 8, and 9-10. A total of 58% of the cases had matching GS 9-10 at biopsy and RP. In multivariable analysis, increasing age was associated with upgrading; even after factoring in multiple variables, including the number of positive cores and the maximum percentage of cancer per core, the concordance indexes were not sufficiently high to justify the use of nomograms for predicting upgrading and downgrading for the individual patient. Almost 20% of RP cases have tertiary patterns. A needle biopsy can sample a tertiary higher Gleason pattern in the RP, which is then not recorded in the standard GS reporting, resulting in an apparent overgrading on the needle biopsy. Copyright © 2012 European Association of Urology. Published by Elsevier B.V. All rights reserved.

  3. Comparison of likelihood testing procedures for parallel systems with covariances

    International Nuclear Information System (INIS)

    Ayman Baklizi; Isa Daud; Noor Akma Ibrahim

    1998-01-01

    In this paper we investigate and compare the behavior of the likelihood ratio, Rao, and Wald statistics for testing hypotheses on the parameters of the simple linear regression model based on parallel systems with covariances. These statistics are asymptotically equivalent (Barndorff-Nielsen and Cox, 1994); however, their relative performances in finite samples are not generally known. A Monte Carlo experiment is conducted to simulate the sizes and powers of these statistics for complete samples and in the presence of time censoring. Comparisons of the statistics are made according to the attainment of the assumed size of the test and their powers at various points in the parameter space. The results show that the likelihood ratio statistic appears to have the best performance in terms of attaining the assumed size of the test. Power comparisons show that the Rao statistic has some advantage over the Wald statistic in almost all of the space of alternatives, while the likelihood ratio statistic occupies either the first or the last position in terms of power. Overall, the likelihood ratio statistic appears to be the most appropriate for the model under study, especially for small sample sizes.
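
    For a concrete feel of the three statistics, the sketch below computes them in the simplest case with closed forms, a test on the rate of an exponential sample; this generic example is an assumption for illustration and is unrelated to the paper's parallel-systems model.

```python
import numpy as np
from scipy.stats import chi2

def exp_test_stats(x, lam0):
    """Likelihood ratio, Rao score, and Wald statistics for H0: rate = lam0
    in an exponential sample; all three are asymptotically chi-square(1)."""
    n, xbar = len(x), x.mean()
    lam_hat = 1.0 / xbar                                  # MLE of the rate
    lr = 2 * n * (np.log(lam_hat / lam0) + lam0 * xbar - 1.0)
    rao = n * (1.0 - lam0 * xbar) ** 2                    # U(lam0)^2 / I(lam0)
    wald = n * (lam_hat - lam0) ** 2 / lam_hat**2         # (MLE-lam0)^2 I(MLE)
    return lr, rao, wald

rng = np.random.default_rng(9)
x = rng.exponential(scale=1 / 1.2, size=30)   # true rate 1.2
lr, rao, wald = exp_test_stats(x, lam0=1.0)
crit = chi2.ppf(0.95, df=1)
print(lr, rao, wald, [s > crit for s in (lr, rao, wald)])
```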

  4. Maximum likelihood estimation of the attenuated ultrasound pulse

    DEFF Research Database (Denmark)

    Rasmussen, Klaus Bolding

    1994-01-01

    The attenuated ultrasound pulse is divided into two parts: a stationary basic pulse and a nonstationary attenuation pulse. A standard ARMA model is used for the basic pulse, and a nonstandard ARMA model is derived for the attenuation pulse. The maximum likelihood estimator of the attenuated...

  5. Planck 2013 results. XV. CMB power spectra and likelihood

    CERN Document Server

    Ade, P.A.R.; Armitage-Caplan, C.; Arnaud, M.; Ashdown, M.; Atrio-Barandela, F.; Aumont, J.; Baccigalupi, C.; Banday, A.J.; Barreiro, R.B.; Bartlett, J.G.; Battaner, E.; Benabed, K.; Benoit, A.; Benoit-Levy, A.; Bernard, J.P.; Bersanelli, M.; Bielewicz, P.; Bobin, J.; Bock, J.J.; Bonaldi, A.; Bonavera, L.; Bond, J.R.; Borrill, J.; Bouchet, F.R.; Boulanger, F.; Bridges, M.; Bucher, M.; Burigana, C.; Butler, R.C.; Calabrese, E.; Cardoso, J.F.; Catalano, A.; Challinor, A.; Chamballu, A.; Chiang, L.Y.; Chiang, H.C.; Christensen, P.R.; Church, S.; Clements, D.L.; Colombi, S.; Colombo, L.P.L.; Combet, C.; Couchot, F.; Coulais, A.; Crill, B.P.; Curto, A.; Cuttaia, F.; Danese, L.; Davies, R.D.; Davis, R.J.; de Bernardis, P.; de Rosa, A.; de Zotti, G.; Delabrouille, J.; Delouis, J.M.; Desert, F.X.; Dickinson, C.; Diego, J.M.; Dole, H.; Donzelli, S.; Dore, O.; Douspis, M.; Dunkley, J.; Dupac, X.; Efstathiou, G.; Elsner, F.; Ensslin, T.A.; Eriksen, H.K.; Finelli, F.; Forni, O.; Frailis, M.; Fraisse, A.A.; Franceschi, E.; Gaier, T.C.; Galeotta, S.; Galli, S.; Ganga, K.; Giard, M.; Giardino, G.; Giraud-Heraud, Y.; Gjerlow, E.; Gonzalez-Nuevo, J.; Gorski, K.M.; Gratton, S.; Gregorio, A.; Gruppuso, A.; Gudmundsson, J.E.; Hansen, F.K.; Hanson, D.; Harrison, D.; Helou, G.; Henrot-Versille, S.; Hernandez-Monteagudo, C.; Herranz, D.; Hildebrandt, S.R.; Hivon, E.; Hobson, M.; Holmes, W.A.; Hornstrup, A.; Hovest, W.; Huffenberger, K.M.; Hurier, G.; Jaffe, T.R.; Jaffe, A.H.; Jewell, J.; Jones, W.C.; Juvela, M.; Keihanen, E.; Keskitalo, R.; Kiiveri, K.; Kisner, T.S.; Kneissl, R.; Knoche, J.; Knox, L.; Kunz, M.; Kurki-Suonio, H.; Lagache, G.; Lahteenmaki, A.; Lamarre, J.M.; Lasenby, A.; Lattanzi, M.; Laureijs, R.J.; Lawrence, C.R.; Le Jeune, M.; Leach, S.; Leahy, J.P.; Leonardi, R.; Leon-Tavares, J.; Lesgourgues, J.; Liguori, M.; Lilje, P.B.; Lindholm, V.; Linden-Vornle, M.; Lopez-Caniego, M.; Lubin, P.M.; Macias-Perez, J.F.; Maffei, B.; Maino, D.; Mandolesi, N.; Marinucci, D.; Maris, M.; Marshall, D.J.; Martin, P.G.; Martinez-Gonzalez, E.; Masi, S.; Matarrese, S.; Matthai, F.; Mazzotta, P.; Meinhold, P.R.; Melchiorri, A.; Mendes, L.; Menegoni, E.; Mennella, A.; Migliaccio, M.; Millea, M.; Mitra, S.; Miville-Deschenes, M.A.; Molinari, D.; Moneti, A.; Montier, L.; Morgante, G.; Mortlock, D.; Moss, A.; Munshi, D.; Naselsky, P.; Nati, F.; Natoli, P.; Netterfield, C.B.; Norgaard-Nielsen, H.U.; Noviello, F.; Novikov, D.; Novikov, I.; O'Dwyer, I.J.; Orieux, F.; Osborne, S.; Oxborrow, C.A.; Paci, F.; Pagano, L.; Pajot, F.; Paladini, R.; Paoletti, D.; Partridge, B.; Pasian, F.; Patanchon, G.; Paykari, P.; Perdereau, O.; Perotto, L.; Perrotta, F.; Piacentini, F.; Piat, M.; Pierpaoli, E.; Pietrobon, D.; Plaszczynski, S.; Pointecouteau, E.; Polenta, G.; Ponthieu, N.; Popa, L.; Poutanen, T.; Pratt, G.W.; Prezeau, G.; Prunet, S.; Puget, J.L.; Rachen, J.P.; Rahlin, A.; Rebolo, R.; Reinecke, M.; Remazeilles, M.; Renault, C.; Ricciardi, S.; Riller, T.; Ringeval, C.; Ristorcelli, I.; Rocha, G.; Rosset, C.; Roudier, G.; Rowan-Robinson, M.; Rubino-Martin, J.A.; Rusholme, B.; Sandri, M.; Sanselme, L.; Santos, D.; Savini, G.; Scott, D.; Seiffert, M.D.; Shellard, E.P.S.; Spencer, L.D.; Starck, J.L.; Stolyarov, V.; Stompor, R.; Sudiwala, R.; Sureau, F.; Sutton, D.; Suur-Uski, A.S.; Sygnet, J.F.; Tauber, J.A.; Tavagnacco, D.; Terenzi, L.; Toffolatti, L.; Tomasi, M.; Tristram, M.; Tucci, M.; Tuovinen, J.; Turler, M.; Valenziano, L.; Valiviita, J.; Van Tent, B.; Varis, J.; Vielva, P.; Villa, F.; Vittorio, N.; Wade, L.A.; Wandelt, B.D.; 
Wehus, I.K.; White, M.; White, S.D.M.; Yvon, D.; Zacchei, A.; Zonca, A.

    2014-01-01

    We present the Planck likelihood, a complete statistical description of the two-point correlation function of the CMB temperature fluctuations. We use this likelihood to derive the Planck CMB power spectrum over three decades in l, covering 2 <= l <= 2500. At l >= 50, we employ a correlated Gaussian likelihood approximation based on angular cross-spectra derived from the 100, 143 and 217 GHz channels. We validate our likelihood through an extensive suite of consistency tests, and assess the impact of residual foreground and instrumental uncertainties on cosmological parameters. We find good internal agreement among the high-l cross-spectra with residuals of a few uK^2 at l <= 1000. We compare our results with foreground-cleaned CMB maps, and with cross-spectra derived from the 70 GHz Planck map, and find broad agreement in terms of spectrum residuals and cosmological parameters. The best-fit LCDM cosmology is in excellent agreement with preliminary Planck polarisation spectra. The standard LCDM cosmology is well constrained b...

  6. Robust Gaussian Process Regression with a Student-t Likelihood

    NARCIS (Netherlands)

    Jylänki, P.P.; Vanhatalo, J.; Vehtari, A.

    2011-01-01

    This paper considers the robust and efficient implementation of Gaussian process regression with a Student-t observation model, which has a non-log-concave likelihood. The challenge with the Student-t model is the analytically intractable inference which is why several approximative methods have

  7. MAXIMUM-LIKELIHOOD-ESTIMATION OF THE ENTROPY OF AN ATTRACTOR

    NARCIS (Netherlands)

    SCHOUTEN, JC; TAKENS, F; VANDENBLEEK, CM

    In this paper, a maximum-likelihood estimate of the (Kolmogorov) entropy of an attractor is proposed that can be obtained directly from a time series. Also, the relative standard deviation of the entropy estimate is derived; it is dependent on the entropy and on the number of samples used in the

  8. A simplification of the likelihood ratio test statistic for testing ...

    African Journals Online (AJOL)

    The traditional likelihood ratio test statistic for testing hypothesis about goodness of fit of multinomial probabilities in one, two and multi – dimensional contingency table was simplified. Advantageously, using the simplified version of the statistic to test the null hypothesis is easier and faster because calculating the expected ...

  9. LIKELIHOOD ESTIMATION OF PARAMETERS USING SIMULTANEOUSLY MONITORED PROCESSES

    DEFF Research Database (Denmark)

    Friis-Hansen, Peter; Ditlevsen, Ove Dalager

    2004-01-01

    The topic is maximum likelihood inference from several simultaneously monitored response processes of a structure, with the aim of obtaining knowledge about the parameters of other important but unmonitored response processes when the structure is subject to some Gaussian load field in space and time. The considered example is a ship sailing with a given speed through a Gaussian wave field.

  10. Likelihood-based inference for clustered line transect data

    DEFF Research Database (Denmark)

    Waagepetersen, Rasmus; Schweder, Tore

    2006-01-01

    The uncertainty in estimation of spatial animal density from line transect surveys depends on the degree of spatial clustering in the animal population. To quantify the clustering we model line transect data as independent thinnings of spatial shot-noise Cox processes. Likelihood-based inference...

  11. Likelihood-based Dynamic Factor Analysis for Measurement and Forecasting

    NARCIS (Netherlands)

    Jungbacker, B.M.J.P.; Koopman, S.J.

    2015-01-01

    We present new results for the likelihood-based analysis of the dynamic factor model. The latent factors are modelled by linear dynamic stochastic processes. The idiosyncratic disturbance series are specified as autoregressive processes with mutually correlated innovations. The new results lead to

  12. Likelihood-based inference for clustered line transect data

    DEFF Research Database (Denmark)

    Waagepetersen, Rasmus Plenge; Schweder, Tore

    The uncertainty in estimation of spatial animal density from line transect surveys depends on the degree of spatial clustering in the animal population. To quantify the clustering we model line transect data as independent thinnings of spatial shot-noise Cox processes. Likelihood-based inference...

  13. Composite likelihood and two-stage estimation in family studies

    DEFF Research Database (Denmark)

    Andersen, Elisabeth Anne Wreford

    2004-01-01

    In this paper register based family studies provide the motivation for linking a two-stage estimation procedure in copula models for multivariate failure time data with a composite likelihood approach. The asymptotic properties of the estimators in both parametric and semi-parametric models are derived...

  14. Reconceptualizing Social Influence in Counseling: The Elaboration Likelihood Model.

    Science.gov (United States)

    McNeill, Brian W.; Stoltenberg, Cal D.

    1989-01-01

    Presents Elaboration Likelihood Model (ELM) of persuasion (a reconceptualization of the social influence process) as alternative model of attitude change. Contends ELM unifies conflicting social psychology results and can potentially account for inconsistent research findings in counseling psychology. Provides guidelines on integrating…

  15. Counseling Pretreatment and the Elaboration Likelihood Model of Attitude Change.

    Science.gov (United States)

    Heesacker, Martin

    1986-01-01

    Results of the application of the Elaboration Likelihood Model (ELM) to a counseling context revealed that more favorable attitudes toward counseling occurred as subjects' ego involvement increased and as intervention quality improved. Counselor credibility affected the degree to which subjects' attitudes reflected argument quality differences.…

  16. Cases in which ancestral maximum likelihood will be confusingly misleading.

    Science.gov (United States)

    Handelman, Tomer; Chor, Benny

    2017-05-07

    Ancestral maximum likelihood (AML) is a phylogenetic tree reconstruction criterion that "lies between" maximum parsimony (MP) and maximum likelihood (ML). ML has long been known to be statistically consistent. On the other hand, Felsenstein (1978) showed that MP is statistically inconsistent, and even positively misleading: there are cases where the parsimony criterion, applied to data generated according to one tree topology, will be optimized on a different tree topology. The question of whether AML is statistically consistent has been open for a long time. Mossel et al. (2009) showed that AML can "shrink" short tree edges, resulting in a star tree with no internal resolution, which yields a better AML score than the original (resolved) model. This result implies that AML is statistically inconsistent, but not that it is positively misleading, because the star tree is compatible with any other topology. We show that AML is confusingly misleading: for some simple four-taxon (resolved) tree, the ancestral likelihood optimization criterion is maximized on an incorrect (resolved) tree topology, as well as on a star tree (both with specific edge lengths), while the tree with the original, correct topology has strictly lower ancestral likelihood. Interestingly, the two short edges in the incorrect, resolved tree topology are of length zero, and are not adjacent, so this resolved tree is in fact a simple path. While for MP the underlying phenomenon can be described as long edge attraction, it turns out that here we have long edge repulsion. Copyright © 2017. Published by Elsevier Ltd.

  17. Multilevel maximum likelihood estimation with application to covariance matrices

    Czech Academy of Sciences Publication Activity Database

    Turčičová, Marie; Mandel, J.; Eben, Kryštof

Published online: 23 January 2018. ISSN 0361-0926. R&D Projects: GA ČR GA13-34856S. Institutional support: RVO:67985807. Keywords: Fisher information * High dimension * Hierarchical maximum likelihood * Nested parameter spaces * Spectral diagonal covariance model * Sparse inverse covariance model. Subject RIV: BB - Applied Statistics, Operational Research. Impact factor: 0.311, year: 2016

  18. Outlier Detection in Nonlinear Regression with the Likelihood Displacement Statistical Method

    Directory of Open Access Journals (Sweden)

    Siti Tabi'atul Hasanah

    2012-11-01

Full Text Available An outlier is an observation that differs greatly (is extreme) from the other observations, or that does not follow the general pattern of the model. Outliers sometimes carry information that no other data point provides, which is why they should not simply be eliminated; an outlier can also be an influential observation. Many methods can be used to detect outliers. Previous studies addressed outlier detection in linear regression; here, detection is developed for nonlinear regression, specifically multiplicative nonlinear regression. Detection uses the likelihood displacement (LD) statistic, a method that flags outliers by removing the suspected observations and measuring the resulting change in the likelihood. The parameters are estimated by the maximum likelihood method, yielding the maximum likelihood estimates. Applying the LD method identifies the observations suspected of being outliers. The accuracy of the LD method in detecting outliers is then shown by comparing the MSE obtained with LD against the MSE from the ordinary regression. The test statistic used is Λ; when the initial hypothesis is rejected, the observation is declared an outlier.

  19. Likelihood-based methods for evaluating principal surrogacy in augmented vaccine trials.

    Science.gov (United States)

    Liu, Wei; Zhang, Bo; Zhang, Hui; Zhang, Zhiwei

    2017-04-01

    There is growing interest in assessing immune biomarkers, which are quick to measure and potentially predictive of long-term efficacy, as surrogate endpoints in randomized, placebo-controlled vaccine trials. This can be done under a principal stratification approach, with principal strata defined using a subject's potential immune responses to vaccine and placebo (the latter may be assumed to be zero). In this context, principal surrogacy refers to the extent to which vaccine efficacy varies across principal strata. Because a placebo recipient's potential immune response to vaccine is unobserved in a standard vaccine trial, augmented vaccine trials have been proposed to produce the information needed to evaluate principal surrogacy. This article reviews existing methods based on an estimated likelihood and a pseudo-score (PS) and proposes two new methods based on a semiparametric likelihood (SL) and a pseudo-likelihood (PL), for analyzing augmented vaccine trials. Unlike the PS method, the SL method does not require a model for missingness, which can be advantageous when immune response data are missing by happenstance. The SL method is shown to be asymptotically efficient, and it performs similarly to the PS and PL methods in simulation experiments. The PL method appears to have a computational advantage over the PS and SL methods.

  20. A Maximum Likelihood Approach to Determine Sensor Radiometric Response Coefficients for NPP VIIRS Reflective Solar Bands

    Science.gov (United States)

    Lei, Ning; Chiang, Kwo-Fu; Oudrari, Hassan; Xiong, Xiaoxiong

    2011-01-01

Optical sensors aboard Earth-orbiting satellites, such as the next-generation Visible/Infrared Imager/Radiometer Suite (VIIRS), assume that the sensor's radiometric response in the Reflective Solar Bands (RSB) is described by a quadratic polynomial relating the aperture spectral radiance to the sensor Digital Number (DN) readout. For VIIRS Flight Unit 1, the coefficients are to be determined before launch by an attenuation method, although the linear coefficient will be further determined on-orbit by observing the Solar Diffuser. In determining the quadratic polynomial coefficients by the attenuation method, a Maximum Likelihood approach is applied in carrying out the least-squares procedure. Crucial to the Maximum Likelihood least-squares procedure is the computation of the weight. The weight not only has a contribution from the noise of the sensor's digital count, with an important contribution from digitization error, but is also affected heavily by the mathematical expression used to predict the value of the dependent variable, because both the independent and the dependent variables contain random noise. In addition, model errors have a major impact on the uncertainties of the coefficients. The Maximum Likelihood approach demonstrates the inadequacy of the attenuation method model with a quadratic polynomial for the retrieved spectral radiance. We show that using the inadequate model dramatically increases the uncertainties of the coefficients. We compute the coefficient values and their uncertainties, considering both measurement and model errors.
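
    As a rough sketch of the effective-variance weighting described above (not the authors' code; the function name and iteration scheme are illustrative assumptions), one can iterate a weighted quadratic fit in which the radiance noise is propagated through the local slope of the current polynomial:

    ```python
    import numpy as np

    def fit_quadratic_response(radiance, dn, sigma_dn, sigma_radiance, n_iter=10):
        """Iteratively reweighted quadratic fit dn ~ c0 + c1*L + c2*L**2.
        The weight combines detector-count noise with the radiance noise
        propagated through the current slope estimate (effective variance)."""
        coeffs = np.polyfit(radiance, dn, 2)      # [c2, c1, c0], unweighted start
        for _ in range(n_iter):
            c2, c1, _ = coeffs
            slope = c1 + 2.0 * c2 * radiance      # d(DN)/dL at each point
            sigma_eff = np.sqrt(sigma_dn**2 + (slope * sigma_radiance)**2)
            coeffs = np.polyfit(radiance, dn, 2, w=1.0 / sigma_eff)
        return coeffs
    ```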

  1. The Jarvis gas release incident

    International Nuclear Information System (INIS)

    Manocha, J.

    1992-01-01

    On 26 September, 1991, large volumes of natural gas were observed to be leaking from two water wells in the Town of Jarvis. Gas and water were being ejected from a drilled water well, at which a subsequent gas explosion occurred. Measurements of gas concentrations indicated levels far in excess of the lower flammability limit at several locations. Electrical power and natural gas services were cut off, and residents were evacuated. A state of emergency was declared, and gas was found to be flowing from water wells, around building foundations, and through other fractures in the ground. By 27 September the volumes of gas had reduced substantially, and by 30 September all residents had returned to their homes and the state of emergency was cancelled. The emergency response, possible pathways of natural gas into the aquifer, and public relations are discussed. It is felt that the likelihood of a similar incident occurring in the future is high. 11 figs

  2. Modeling gene expression measurement error: a quasi-likelihood approach

    Directory of Open Access Journals (Sweden)

    Strimmer Korbinian

    2003-03-01

Full Text Available Abstract Background Using suitable error models for gene expression measurements is essential in the statistical analysis of microarray data. However, the true probabilistic model underlying gene expression intensity readings is generally not known. Instead, in currently used approaches some simple parametric model is assumed (usually a transformed normal distribution) or the empirical distribution is estimated. However, both these strategies may not be optimal for gene expression data, as the non-parametric approach ignores known structural information whereas the fully parametric models run the risk of misspecification. A further related problem is the choice of a suitable scale for the model (e.g. observed vs. log-scale). Results Here a simple semi-parametric model for gene expression measurement error is presented. In this approach inference is based on an approximate likelihood function (the extended quasi-likelihood). Only partial knowledge about the unknown true distribution is required to construct this function. In the case of gene expression this information is available in the form of the postulated (e.g. quadratic) variance structure of the data. As the quasi-likelihood behaves (almost) like a proper likelihood, it allows for the estimation of calibration and variance parameters, and it is also straightforward to obtain corresponding approximate confidence intervals. Unlike most other frameworks, it also allows analysis on any preferred scale, i.e. both on the original linear scale as well as on a transformed scale. It can also be employed in regression approaches to model systematic (e.g. array or dye) effects. Conclusions The quasi-likelihood framework provides a simple and versatile approach to analyze gene expression data that does not make any strong distributional assumptions about the underlying error model. For several simulated as well as real data sets it provides a better fit to the data than competing models. In an example it also
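
    A minimal sketch of the idea of estimating variance parameters from a postulated quadratic variance structure, assuming replicated intensities and using a normal pseudo-likelihood as a stand-in for the extended quasi-likelihood (all names and the variance form Var(y) = a + b*mu**2 are illustrative assumptions):

    ```python
    import numpy as np
    from scipy.optimize import minimize

    def fit_variance_function(y_reps):
        """Estimate (a, b) in a postulated quadratic variance structure
        Var(y) = a + b*mu**2 from replicated intensities (rows = genes,
        columns = replicates)."""
        y = np.asarray(y_reps, dtype=float)
        mu = y.mean(axis=1, keepdims=True)

        def nll(log_params):
            a, b = np.exp(log_params)             # keep both positive
            v = a + b * mu**2
            return 0.5 * np.sum(np.log(2.0 * np.pi * v) + (y - mu)**2 / v)

        res = minimize(nll, x0=[0.0, -2.0], method="Nelder-Mead")
        return np.exp(res.x)
    ```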

  3. Exclusion probabilities and likelihood ratios with applications to mixtures.

    Science.gov (United States)

    Slooten, Klaas-Jan; Egeland, Thore

    2016-01-01

The statistical evidence obtained from mixed DNA profiles can be summarised in several ways in forensic casework including the likelihood ratio (LR) and the Random Man Not Excluded (RMNE) probability. The literature has seen a discussion of the advantages and disadvantages of likelihood ratios and exclusion probabilities, and part of our aim is to bring some clarification to this debate. In a previous paper, we proved that there is a general mathematical relationship between these statistics: RMNE can be expressed as a certain average of the LR, implying that the expected value of the LR, when applied to an actual contributor to the mixture, is at least equal to the inverse of the RMNE. While the mentioned paper presented applications for kinship problems, the current paper demonstrates the relevance for mixture cases, and for this purpose, we prove some new general properties. We also demonstrate how to use the distribution of the likelihood ratio for donors of a mixture, to obtain estimates for exceedance probabilities of the LR for non-donors, of which the RMNE is a special case corresponding to LR > 0. In order to derive these results, we need to view the likelihood ratio as a random variable. In this paper, we describe how such a randomization can be achieved. The RMNE is usually invoked only for mixtures without dropout. In mixtures, artefacts like dropout and drop-in are commonly encountered and we address this situation too, illustrating our results with a basic but widely implemented model, a so-called binary model. The precise definitions, modelling and interpretation of the required concepts of dropout and drop-in are not entirely obvious, and we attempt to clarify them here in a general likelihood framework for a binary model.

  4. The Prior Can Often Only Be Understood in the Context of the Likelihood

    Directory of Open Access Journals (Sweden)

    Andrew Gelman

    2017-10-01

    Full Text Available A key sticking point of Bayesian analysis is the choice of prior distribution, and there is a vast literature on potential defaults including uniform priors, Jeffreys’ priors, reference priors, maximum entropy priors, and weakly informative priors. These methods, however, often manifest a key conceptual tension in prior modeling: a model encoding true prior information should be chosen without reference to the model of the measurement process, but almost all common prior modeling techniques are implicitly motivated by a reference likelihood. In this paper we resolve this apparent paradox by placing the choice of prior into the context of the entire Bayesian analysis, from inference to prediction to model evaluation.

  5. Maximum-likelihood methods for array processing based on time-frequency distributions

    Science.gov (United States)

    Zhang, Yimin; Mu, Weifeng; Amin, Moeness G.

    1999-11-01

This paper proposes a novel time-frequency maximum likelihood (t-f ML) method for direction-of-arrival (DOA) estimation for non-stationary signals, and compares this method with conventional maximum likelihood DOA estimation techniques. Time-frequency distributions localize the signal power in the time-frequency domain, and as such enhance the effective SNR, leading to improved DOA estimation. The localization of signals with different t-f signatures permits the division of the time-frequency domain into smaller regions, each containing fewer signals than those incident on the array. The reduction of the number of signals within different time-frequency regions not only reduces the required number of sensors, but also decreases the computational load in multi-dimensional optimizations. Compared to the recently proposed time-frequency MUSIC (t-f MUSIC), the proposed t-f ML method can be applied in coherent environments, without the need to perform any type of preprocessing that is subject to both array geometry and array aperture.

  6. A composite likelihood approach for spatially correlated survival data

    Science.gov (United States)

    Paik, Jane; Ying, Zhiliang

    2013-01-01

    The aim of this paper is to provide a composite likelihood approach to handle spatially correlated survival data using pairwise joint distributions. With e-commerce data, a recent question of interest in marketing research has been to describe spatially clustered purchasing behavior and to assess whether geographic distance is the appropriate metric to describe purchasing dependence. We present a model for the dependence structure of time-to-event data subject to spatial dependence to characterize purchasing behavior from the motivating example from e-commerce data. We assume the Farlie-Gumbel-Morgenstern (FGM) distribution and then model the dependence parameter as a function of geographic and demographic pairwise distances. For estimation of the dependence parameters, we present pairwise composite likelihood equations. We prove that the resulting estimators exhibit key properties of consistency and asymptotic normality under certain regularity conditions in the increasing-domain framework of spatial asymptotic theory. PMID:24223450

  8. Secondary Analysis under Cohort Sampling Designs Using Conditional Likelihood

    Directory of Open Access Journals (Sweden)

    Olli Saarela

    2012-01-01

    Full Text Available Under cohort sampling designs, additional covariate data are collected on cases of a specific type and a randomly selected subset of noncases, primarily for the purpose of studying associations with a time-to-event response of interest. With such data available, an interest may arise to reuse them for studying associations between the additional covariate data and a secondary non-time-to-event response variable, usually collected for the whole study cohort at the outset of the study. Following earlier literature, we refer to such a situation as secondary analysis. We outline a general conditional likelihood approach for secondary analysis under cohort sampling designs and discuss the specific situations of case-cohort and nested case-control designs. We also review alternative methods based on full likelihood and inverse probability weighting. We compare the alternative methods for secondary analysis in two simulated settings and apply them in a real-data example.

  9. GENERALIZATION OF RAYLEIGH MAXIMUM LIKELIHOOD DESPECKLING FILTER USING QUADRILATERAL KERNELS

    Directory of Open Access Journals (Sweden)

    S. Sridevi

    2013-02-01

Full Text Available Speckle noise is the most prevalent noise in clinical ultrasound images. It appears as light and dark spots and renders pixel intensities murky. In fetal ultrasound images, edges and local fine details are what obstetricians and gynecologists rely on to carry out prenatal diagnosis of congenital heart disease. A robust despeckling filter therefore has to be contrived that proficiently suppresses speckle noise while preserving these features. The proposed filter is a generalization of the Rayleigh maximum likelihood filter: it exploits statistical tools as tuning parameters and uses differently shaped quadrilateral kernels to estimate the noise-free pixel from the neighborhood. The performance of several filters, namely the Median, Kuwahara, Frost, Homogeneous mask and Rayleigh maximum likelihood filters, is compared with the proposed filter in terms of PSNR and image profile. The proposed filter surpasses the conventional filters.
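
    For illustration, a minimal sliding-window Rayleigh maximum likelihood despeckler with a plain square window (the paper's quadrilateral kernels and statistical tuning parameters are not reproduced here; names are hypothetical):

    ```python
    import numpy as np
    from scipy.ndimage import uniform_filter

    def rayleigh_ml_despeckle(img, size=5):
        """Per-pixel Rayleigh ML estimate from a local window: for Rayleigh
        noise the MLE of the scale is sigma_hat = sqrt(mean(x**2) / 2), and
        the noise-free intensity is taken as the Rayleigh mean,
        sigma_hat * sqrt(pi / 2)."""
        mean_sq = uniform_filter(np.asarray(img, dtype=float) ** 2, size=size)
        sigma_hat = np.sqrt(mean_sq / 2.0)
        return sigma_hat * np.sqrt(np.pi / 2.0)
    ```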

  10. Likelihood inference for a nonstationary fractional autoregressive model

    DEFF Research Database (Denmark)

    Johansen, Søren; Ørregård Nielsen, Morten

    2010-01-01

This paper discusses model-based inference in an autoregressive model for fractional processes which allows the process to be fractional of order d or d-b. Fractional differencing involves infinitely many past values and because we are interested in nonstationary processes we model the data X_1,...,X_T given the initial values X_{-n}, n=0,1,..., as is usually done. The initial values are not modeled but assumed to be bounded. This represents a considerable generalization relative to all previous work where it is assumed that initial values are zero. For the statistical analysis we assume the conditional Gaussian likelihood and for the probability analysis we also condition on initial values but assume that the errors in the autoregressive model are i.i.d. with suitable moment conditions. We analyze the conditional likelihood and its derivatives as stochastic processes in the parameters, including...

  11. Maximum Likelihood Compton Polarimetry with the Compton Spectrometer and Imager

    Energy Technology Data Exchange (ETDEWEB)

    Lowell, A. W.; Boggs, S. E; Chiu, C. L.; Kierans, C. A.; Sleator, C.; Tomsick, J. A.; Zoglauer, A. C. [Space Sciences Laboratory, University of California, Berkeley (United States); Chang, H.-K.; Tseng, C.-H.; Yang, C.-Y. [Institute of Astronomy, National Tsing Hua University, Taiwan (China); Jean, P.; Ballmoos, P. von [IRAP Toulouse (France); Lin, C.-H. [Institute of Physics, Academia Sinica, Taiwan (China); Amman, M. [Lawrence Berkeley National Laboratory (United States)

    2017-10-20

    Astrophysical polarization measurements in the soft gamma-ray band are becoming more feasible as detectors with high position and energy resolution are deployed. Previous work has shown that the minimum detectable polarization (MDP) of an ideal Compton polarimeter can be improved by ∼21% when an unbinned, maximum likelihood method (MLM) is used instead of the standard approach of fitting a sinusoid to a histogram of azimuthal scattering angles. Here we outline a procedure for implementing this maximum likelihood approach for real, nonideal polarimeters. As an example, we use the recent observation of GRB 160530A with the Compton Spectrometer and Imager. We find that the MDP for this observation is reduced by 20% when the MLM is used instead of the standard method.
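
    A minimal sketch of the unbinned maximum likelihood fit to the azimuthal scattering-angle distribution (illustrative only, not the COSI pipeline; the modulation factor mu for a 100% polarized beam would come from instrument simulations):

    ```python
    import numpy as np
    from scipy.optimize import minimize

    def fit_polarization(phi, mu):
        """Unbinned ML fit of pdf(phi) = (1/2pi)*(1 + mu*p*cos(2*(phi - phi0)))
        over the measured azimuthal scattering angles phi (radians)."""
        phi = np.asarray(phi, dtype=float)

        def nll(params):
            p, phi0 = params
            f = (1.0 + mu * p * np.cos(2.0 * (phi - phi0))) / (2.0 * np.pi)
            return -np.sum(np.log(np.clip(f, 1e-300, None)))

        res = minimize(nll, x0=[0.1, 0.0],
                       bounds=[(0.0, 1.0), (-np.pi / 2.0, np.pi / 2.0)])
        return res.x  # (polarization fraction, polarization angle)
    ```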

  12. THESEUS: maximum likelihood superpositioning and analysis of macromolecular structures.

    Science.gov (United States)

    Theobald, Douglas L; Wuttke, Deborah S

    2006-09-01

    THESEUS is a command line program for performing maximum likelihood (ML) superpositions and analysis of macromolecular structures. While conventional superpositioning methods use ordinary least-squares (LS) as the optimization criterion, ML superpositions provide substantially improved accuracy by down-weighting variable structural regions and by correcting for correlations among atoms. ML superpositioning is robust and insensitive to the specific atoms included in the analysis, and thus it does not require subjective pruning of selected variable atomic coordinates. Output includes both likelihood-based and frequentist statistics for accurate evaluation of the adequacy of a superposition and for reliable analysis of structural similarities and differences. THESEUS performs principal components analysis for analyzing the complex correlations found among atoms within a structural ensemble. ANSI C source code and selected binaries for various computing platforms are available under the GNU open source license from http://monkshood.colorado.edu/theseus/ or http://www.theseus3d.org.

  13. Bayesian interpretation of Generalized empirical likelihood by maximum entropy

    OpenAIRE

    Rochet , Paul

    2011-01-01

    We study a parametric estimation problem related to moment condition models. As an alternative to the generalized empirical likelihood (GEL) and the generalized method of moments (GMM), a Bayesian approach to the problem can be adopted, extending the MEM procedure to parametric moment conditions. We show in particular that a large number of GEL estimators can be interpreted as a maximum entropy solution. Moreover, we provide a more general field of applications by proving the method to be rob...

  14. Questioning the Elaboration Likelihood Model (ELM) and Rhetorical Theory

    OpenAIRE

    Yudi Perbawaningsih

    2012-01-01

Abstract: Persuasion is a communication process to establish or change attitudes, which can be understood through the theory of Rhetoric and the Elaboration Likelihood Model (ELM). This study elaborates these theories in a Public Lecture series intended to persuade students in choosing their concentration of study. The result shows that in terms of persuasion effectiveness it is not quite relevant to separate the message and its source. The quality of the source is determined by the quality of ...

  15. Corporate governance effect on financial distress likelihood: Evidence from Spain

    Directory of Open Access Journals (Sweden)

    Montserrat Manzaneque

    2016-01-01

Full Text Available The paper explores some mechanisms of corporate governance (ownership and board characteristics) in Spanish listed companies and their impact on the likelihood of financial distress. An empirical study was conducted between 2007 and 2012 using a matched-pairs research design with 308 observations, half of them classified as distressed and half as non-distressed. Based on the previous study by Pindado, Rodrigues, and De la Torre (2008), a broader concept of bankruptcy is used to define business failure. Employing several conditional logistic models, and in line with previous studies on bankruptcy, the results confirm that in difficult situations prior to bankruptcy, the impact of board ownership and the proportion of independent directors on the likelihood of business failure is similar to that exerted in more extreme situations. The results go one step further and show a negative relationship between board size and the likelihood of financial distress. This result is interpreted as a form of creating diversity and improving access to information and resources, especially in contexts where ownership is highly concentrated and large shareholders have great power to influence the board structure. However, the results confirm that ownership concentration does not have a significant impact on the likelihood of financial distress in the Spanish context. It is argued that large shareholders are passive as regards enhanced monitoring of management and, alternatively, do not have enough incentives to hold back financial distress. These findings have important implications in the Spanish context, where several changes in the regulatory listing requirements have been carried out with respect to corporate governance, and where there is no empirical evidence in this respect.

  16. Maximum Likelihood, Consistency and Data Envelopment Analysis: A Statistical Foundation

    OpenAIRE

    Rajiv D. Banker

    1993-01-01

    This paper provides a formal statistical basis for the efficiency evaluation techniques of data envelopment analysis (DEA). DEA estimators of the best practice monotone increasing and concave production function are shown to be also maximum likelihood estimators if the deviation of actual output from the efficient output is regarded as a stochastic variable with a monotone decreasing probability density function. While the best practice frontier estimator is biased below the theoretical front...

  17. Multiple Improvements of Multiple Imputation Likelihood Ratio Tests

    OpenAIRE

    Chan, Kin Wai; Meng, Xiao-Li

    2017-01-01

    Multiple imputation (MI) inference handles missing data by first properly imputing the missing values $m$ times, and then combining the $m$ analysis results from applying a complete-data procedure to each of the completed datasets. However, the existing method for combining likelihood ratio tests has multiple defects: (i) the combined test statistic can be negative in practice when the reference null distribution is a standard $F$ distribution; (ii) it is not invariant to re-parametrization; ...

  18. Questioning the Elaboration Likelihood Model (ELM) and Rhetorical Theory

    OpenAIRE

    Perbawaningsih, Yudi

    2012-01-01

Persuasion is a communication process to establish or change attitudes, which can be understood through the theory of Rhetoric and the Elaboration Likelihood Model (ELM). This study elaborates these theories in a Public Lecture series intended to persuade students in choosing their concentration of study. The result shows that in terms of persuasion effectiveness it is not quite relevant to separate the message and its source. The quality of the source is determined by the quality of the mess...

  19. The Use of the Elaboration Likelihood Model in Analyzing Information Technology Acceptance

    OpenAIRE

    vitrian, vitrian2

    2010-01-01

This article discusses some technology acceptance models in an organization. Thorough analysis of how a technology becomes acceptable helps managers plan the implementation of new technology and make sure that the new technology can enhance the organization's performance. The Elaboration Likelihood Model (ELM) is one that sheds light on some behavioral factors in the acceptance of information technology. The basic tenet of the ELM states that human behavior in principle can be influenced through central r...

  20. Statistical Bias in Maximum Likelihood Estimators of Item Parameters.

    Science.gov (United States)

    1982-04-01

The abstract of this scanned report is largely illegible; the recoverable fragments concern the bias in maximum likelihood estimators of item parameters.

  1. Empirical Likelihood in Nonignorable Covariate-Missing Data Problems.

    Science.gov (United States)

    Xie, Yanmei; Zhang, Biao

    2017-04-20

    Missing covariate data occurs often in regression analysis, which frequently arises in the health and social sciences as well as in survey sampling. We study methods for the analysis of a nonignorable covariate-missing data problem in an assumed conditional mean function when some covariates are completely observed but other covariates are missing for some subjects. We adopt the semiparametric perspective of Bartlett et al. (Improving upon the efficiency of complete case analysis when covariates are MNAR. Biostatistics 2014;15:719-30) on regression analyses with nonignorable missing covariates, in which they have introduced the use of two working models, the working probability model of missingness and the working conditional score model. In this paper, we study an empirical likelihood approach to nonignorable covariate-missing data problems with the objective of effectively utilizing the two working models in the analysis of covariate-missing data. We propose a unified approach to constructing a system of unbiased estimating equations, where there are more equations than unknown parameters of interest. One useful feature of these unbiased estimating equations is that they naturally incorporate the incomplete data into the data analysis, making it possible to seek efficient estimation of the parameter of interest even when the working regression function is not specified to be the optimal regression function. We apply the general methodology of empirical likelihood to optimally combine these unbiased estimating equations. We propose three maximum empirical likelihood estimators of the underlying regression parameters and compare their efficiencies with other existing competitors. We present a simulation study to compare the finite-sample performance of various methods with respect to bias, efficiency, and robustness to model misspecification. The proposed empirical likelihood method is also illustrated by an analysis of a data set from the US National Health and
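
    The building block of such an approach is the empirical likelihood for a single unbiased estimating equation. A minimal sketch for the simplest case, g(x, theta) = x - mu (Owen's empirical log-likelihood ratio for a mean; not the authors' implementation):

    ```python
    import numpy as np
    from scipy.optimize import brentq

    def el_logratio_mean(x, mu):
        """Maximize prod(n*p_i) subject to sum p_i*(x_i - mu) = 0, which gives
        p_i = 1 / (n*(1 + lam*(x_i - mu))) with lam solving
        sum (x_i - mu) / (1 + lam*(x_i - mu)) = 0.
        Returns the log EL ratio; -2 times it is asymptotically chi-squared(1)."""
        z = np.asarray(x, dtype=float) - mu
        if z.min() >= 0.0 or z.max() <= 0.0:
            return -np.inf                      # mu outside the convex hull
        def g(lam):
            return np.sum(z / (1.0 + lam * z))
        lo = (-1.0 + 1e-10) / z.max()           # keep all 1 + lam*z_i > 0
        hi = (-1.0 + 1e-10) / z.min()
        lam = brentq(g, lo, hi)
        return -np.sum(np.log1p(lam * z))
    ```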

  2. Democracy, Autocracy and the Likelihood of International Conflict

    OpenAIRE

    Tangerås, Thomas

    2008-01-01

    This is a game-theoretic analysis of the link between regime type and international conflict. The democratic electorate can credibly punish the leader for bad conflict outcomes, whereas the autocratic selectorate cannot. For the fear of being thrown out of office, democratic leaders are (i) more selective about the wars they initiate and (ii) on average win more of the wars they start. Foreign policy behaviour is found to display strategic complementarities. The likelihood of interstate war, ...

  3. Moment Conditions Selection Based on Adaptive Penalized Empirical Likelihood

    Directory of Open Access Journals (Sweden)

    Yunquan Song

    2014-01-01

Full Text Available Empirical likelihood is a very popular method that has been widely used in the fields of artificial intelligence (AI) and data mining, as tablets, mobile applications and social media dominate the technology landscape. This paper proposes an empirical likelihood shrinkage method to efficiently estimate unknown parameters and select correct moment conditions simultaneously, when the model is defined by moment restrictions of which some are possibly misspecified. We show that our method enjoys oracle-like properties; that is, it consistently selects the correct moment conditions and at the same time its estimator is as efficient as the empirical likelihood estimator obtained using all correct moment conditions. Moreover, unlike the GMM, our proposed method allows us to construct confidence regions for the parameters included in the model without estimating the covariances of the estimators. For empirical implementation, we provide some data-driven procedures for selecting the tuning parameter of the penalty function. The simulation results show that the method works remarkably well in terms of correct moment selection and the finite sample properties of the estimators. Also, a real-life example is carried out to illustrate the new methodology.

  4. Approximate maximum likelihood estimation for population genetic inference.

    Science.gov (United States)

    Bertl, Johanna; Ewing, Gregory; Kosiol, Carolin; Futschik, Andreas

    2017-11-27

    In many population genetic problems, parameter estimation is obstructed by an intractable likelihood function. Therefore, approximate estimation methods have been developed, and with growing computational power, sampling-based methods became popular. However, these methods such as Approximate Bayesian Computation (ABC) can be inefficient in high-dimensional problems. This led to the development of more sophisticated iterative estimation methods like particle filters. Here, we propose an alternative approach that is based on stochastic approximation. By moving along a simulated gradient or ascent direction, the algorithm produces a sequence of estimates that eventually converges to the maximum likelihood estimate, given a set of observed summary statistics. This strategy does not sample much from low-likelihood regions of the parameter space, and is fast, even when many summary statistics are involved. We put considerable efforts into providing tuning guidelines that improve the robustness and lead to good performance on problems with high-dimensional summary statistics and a low signal-to-noise ratio. We then investigate the performance of our resulting approach and study its properties in simulations. Finally, we re-estimate parameters describing the demographic history of Bornean and Sumatran orang-utans.
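
    A heavily simplified sketch of the stochastic-approximation idea for a scalar parameter, using a crude squared-distance score between simulated and observed summaries as a stand-in for the paper's simulated likelihood gradient (the simulator interface and gain sequences are assumptions):

    ```python
    import numpy as np

    def sa_mle(simulate, s_obs, theta0, n_iter=200, a=0.1, c=0.1, n_sim=100, seed=None):
        """Kiefer-Wolfowitz stochastic approximation: climb a finite-difference
        estimate of the gradient of a synthetic score until the simulated
        summary statistics match the observed ones, s_obs."""
        rng = np.random.default_rng(seed)
        theta = float(theta0)
        for k in range(1, n_iter + 1):
            ck = c / k**0.25
            def score(t):
                s = np.mean([simulate(t, rng) for _ in range(n_sim)], axis=0)
                return -np.sum((np.asarray(s) - np.asarray(s_obs))**2)
            grad = (score(theta + ck) - score(theta - ck)) / (2.0 * ck)
            theta += (a / k) * grad                 # decreasing gain sequence
        return theta
    ```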

  5. Caching and interpolated likelihoods: accelerating cosmological Monte Carlo Markov chains

    Energy Technology Data Exchange (ETDEWEB)

    Bouland, Adam; Easther, Richard; Rosenfeld, Katherine, E-mail: adam.bouland@aya.yale.edu, E-mail: richard.easther@yale.edu, E-mail: krosenfeld@cfa.harvard.edu [Department of Physics, Yale University, New Haven CT 06520 (United States)

    2011-05-01

We describe a novel approach to accelerating Monte Carlo Markov Chains. Our focus is cosmological parameter estimation, but the algorithm is applicable to any problem for which the likelihood surface is a smooth function of the free parameters and computationally expensive to evaluate. We generate a high-order interpolating polynomial for the log-likelihood using the first points gathered by the Markov chains as a training set. This polynomial then accurately computes the majority of the likelihoods needed in the latter parts of the chains. We implement a simple version of this algorithm as a patch (InterpMC) to CosmoMC and show that it accelerates parameter estimation by a factor of between two and four for well-converged chains. The current code is primarily intended as a "proof of concept", and we argue that there is considerable room for further performance gains. Unlike other approaches to accelerating parameter fits, we make no use of precomputed training sets or special choices of variables, and InterpMC is almost entirely transparent to the user.
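
    The core idea can be sketched in a few lines for a one-dimensional parameter (hypothetical helper names; this is not the InterpMC code itself):

    ```python
    import numpy as np

    def build_loglike_surrogate(theta_train, logL_train, degree=8):
        """Fit a high-order polynomial to log-likelihood values cached from
        the early part of the chain (1-D parameter for clarity)."""
        return np.poly1d(np.polyfit(theta_train, logL_train, degree))

    def log_like(theta, surrogate, expensive_loglike, lo, hi):
        """Use the cheap polynomial inside the trained range, falling back to
        the full likelihood outside it (a crude stand-in for the real code's
        safeguards against extrapolation)."""
        if lo <= theta <= hi:
            return surrogate(theta)
        return expensive_loglike(theta)
    ```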

  6. Caching and interpolated likelihoods: accelerating cosmological Monte Carlo Markov chains

    International Nuclear Information System (INIS)

    Bouland, Adam; Easther, Richard; Rosenfeld, Katherine

    2011-01-01

We describe a novel approach to accelerating Monte Carlo Markov Chains. Our focus is cosmological parameter estimation, but the algorithm is applicable to any problem for which the likelihood surface is a smooth function of the free parameters and computationally expensive to evaluate. We generate a high-order interpolating polynomial for the log-likelihood using the first points gathered by the Markov chains as a training set. This polynomial then accurately computes the majority of the likelihoods needed in the latter parts of the chains. We implement a simple version of this algorithm as a patch (InterpMC) to CosmoMC and show that it accelerates parameter estimation by a factor of between two and four for well-converged chains. The current code is primarily intended as a "proof of concept", and we argue that there is considerable room for further performance gains. Unlike other approaches to accelerating parameter fits, we make no use of precomputed training sets or special choices of variables, and InterpMC is almost entirely transparent to the user.

  7. Maximum likelihood as a common computational framework in tomotherapy

    International Nuclear Information System (INIS)

    Olivera, G.H.; Shepard, D.M.; Reckwerdt, P.J.; Ruchala, K.; Zachman, J.; Fitchard, E.E.; Mackie, T.R.

    1998-01-01

    Tomotherapy is a dose delivery technique using helical or axial intensity modulated beams. One of the strengths of the tomotherapy concept is that it can incorporate a number of processes into a single piece of equipment. These processes include treatment optimization planning, dose reconstruction and kilovoltage/megavoltage image reconstruction. A common computational technique that could be used for all of these processes would be very appealing. The maximum likelihood estimator, originally developed for emission tomography, can serve as a useful tool in imaging and radiotherapy. We believe that this approach can play an important role in the processes of optimization planning, dose reconstruction and kilovoltage and/or megavoltage image reconstruction. These processes involve computations that require comparable physical methods. They are also based on equivalent assumptions, and they have similar mathematical solutions. As a result, the maximum likelihood approach is able to provide a common framework for all three of these computational problems. We will demonstrate how maximum likelihood methods can be applied to optimization planning, dose reconstruction and megavoltage image reconstruction in tomotherapy. Results for planning optimization, dose reconstruction and megavoltage image reconstruction will be presented. Strengths and weaknesses of the methodology are analysed. Future directions for this work are also suggested. (author)

  8. Communicating likelihoods and probabilities in forecasts of volcanic eruptions

    Science.gov (United States)

    Doyle, Emma E. H.; McClure, John; Johnston, David M.; Paton, Douglas

    2014-02-01

The issuing of forecasts and warnings of natural hazard events, such as volcanic eruptions, earthquake aftershock sequences and extreme weather often involves the use of probabilistic terms, particularly when communicated by scientific advisory groups to key decision-makers, who can differ greatly in relative expertise and function in the decision making process. Recipients may also differ in their perception of the relative importance of political and economic influences on interpretation. Consequently, the interpretation of these probabilistic terms can vary greatly due to the framing of the statements, and whether verbal or numerical terms are used. We present a review from the psychology literature on how the framing of information influences communication of these probability terms. It is also unclear how people rate their perception of an event's likelihood throughout a time frame when a forecast time window is stated. Previous research has identified that, when presented with a 10-year time window forecast, participants viewed the likelihood of an event occurring ‘today’ as being less than that in year 10. Here we show that this skew in perception also occurs for short-term time windows (under one week) that are of most relevance for emergency warnings. In addition, unlike the long-time window statements, the use of the phrasing “within the next…” instead of “in the next…” does not mitigate this skew, nor do we observe significant differences between the perceived likelihoods of scientists and non-scientists. This finding suggests that effects occurring due to the shorter time window may be ‘masking’ any differences in perception due to wording or career background observed for long-time window forecasts. These results have implications for scientific advice, warning forecasts, emergency management decision-making, and public information as any skew in perceived event likelihood towards the end of a forecast time window may result in

  9. Comparisons of likelihood and machine learning methods of individual classification

    Science.gov (United States)

    Guinand, B.; Topchy, A.; Page, K.S.; Burnham-Curtis, M. K.; Punch, W.F.; Scribner, K.T.

    2002-01-01

Classification methods used in machine learning (e.g., artificial neural networks, decision trees, and k-nearest neighbor clustering) are rarely used with population genetic data. We compare different nonparametric machine learning techniques with parametric likelihood estimations commonly employed in population genetics for purposes of assigning individuals to their population of origin (“assignment tests”). Classifier accuracy was compared across simulated data sets representing different levels of population differentiation (low and high FST), number of loci surveyed (5 and 10), and allelic diversity (average of three or eight alleles per locus). Empirical data for the lake trout (Salvelinus namaycush) exhibiting levels of population differentiation comparable to those used in simulations were examined to further evaluate and compare classification methods. Classification error rates associated with artificial neural networks and likelihood estimators were lower for simulated data sets compared to k-nearest neighbor and decision tree classifiers over the entire range of parameters considered. Artificial neural networks only marginally outperformed the likelihood method for simulated data (0–2.8% lower error rates). The relative performance of each machine learning classifier improved relative to likelihood estimators for empirical data sets, suggesting an ability to “learn” and utilize properties of empirical genotypic arrays intrinsic to each population. Likelihood-based estimation methods provide a more accessible option for reliable assignment of individuals to the population of origin due to the intricacies in development and evaluation of artificial neural networks. In recent years, characterization of highly polymorphic molecular markers such as mini- and microsatellites and development of novel methods of analysis have enabled researchers to extend investigations of ecological and evolutionary processes below the population level to the level of
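
    A minimal sketch of the parametric likelihood assignment test under Hardy-Weinberg proportions (the data layout and function names are hypothetical):

    ```python
    import numpy as np

    def assign_individual(genotype, pop_allele_freqs):
        """genotype: list of (allele_a, allele_b) tuples, one per locus;
        pop_allele_freqs: {pop: [ {allele: freq} per locus ]}.
        Returns the population maximizing the genotype log-likelihood."""
        log_likes = {}
        for pop, loci in pop_allele_freqs.items():
            ll = 0.0
            for (a, b), freqs in zip(genotype, loci):
                if a == b:
                    prob = freqs.get(a, 0.0) ** 2          # homozygote: p^2
                else:
                    prob = 2.0 * freqs.get(a, 0.0) * freqs.get(b, 0.0)  # 2pq
                ll += np.log(max(prob, 1e-12))             # guard unseen alleles
            log_likes[pop] = ll
        best = max(log_likes, key=log_likes.get)
        return best, log_likes
    ```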

  10. Evaluation of Dynamic Coastal Response to Sea-level Rise Modifies Inundation Likelihood

    Science.gov (United States)

    Lentz, Erika E.; Thieler, E. Robert; Plant, Nathaniel G.; Stippa, Sawyer R.; Horton, Radley M.; Gesch, Dean B.

    2016-01-01

Sea-level rise (SLR) poses a range of threats to natural and built environments, making assessments of SLR-induced hazards essential for informed decision making. We develop a probabilistic model that evaluates the likelihood that an area will inundate (flood) or dynamically respond (adapt) to SLR. The broad-area applicability of the approach is demonstrated by producing 30 × 30 m resolution predictions for more than 38,000 sq km of diverse coastal landscape in the northeastern United States. Probabilistic SLR projections, coastal elevation and vertical land movement are used to estimate likely future inundation levels. Then, conditioned on future inundation levels and the current land-cover type, we evaluate the likelihood of dynamic response versus inundation. We find that nearly 70% of this coastal landscape has some capacity to respond dynamically to SLR, and we show that inundation models over-predict land likely to submerge. This approach is well suited to guiding coastal resource management decisions that weigh future SLR impacts and uncertainty against ecological targets and economic constraints.

  11. Costs, mortality likelihood and outcomes of hospitalized US children with traumatic brain injuries.

    Science.gov (United States)

    Shi, Junxin; Xiang, Huiyun; Wheeler, Krista; Smith, Gary A; Stallones, Lorann; Groner, Jonathan; Wang, Zengzhen

    2009-07-01

To examine the hospitalization costs and discharge outcomes of US children with TBI and to evaluate a severity measure, the predictive mortality likelihood level. Data from the 2006 Healthcare Cost and Utilization Project Kids' Inpatient Database (KID) were used to report the national estimates and characteristics of TBI-associated hospitalizations among US children. The percentage of children with TBI caused by motor vehicle crashes (MVC) and falls was calculated according to the predictive mortality likelihood levels (PMLL), death in hospital, and discharge into long-term rehabilitation facilities. Associations between the PMLL, discharge outcomes and average hospital charges were examined. In 2006, there were an estimated 58 900 TBI-associated hospitalizations among US children, accounting for $2.56 billion in hospital charges. MVCs caused 38.9% and falls caused 21.2% of TBI hospitalizations. The PMLL was strongly associated with TBI type, length of hospital stay, hospital charges and discharge disposition. About 4% of children with fall- or MVC-related TBIs died in hospital and 9% were discharged into long-term facilities. The PMLL may provide a useful tool to assess characteristics and treatment outcomes of hospitalized TBI children, but more research is still needed.

  12. Phenomenological modelling of second cancer incidence for radiation treatment planning

    International Nuclear Information System (INIS)

    Pfaffenberger, Asja; Oelfke, Uwe; Schneider, Uwe; Poppe, Bjoern

    2009-01-01

It is still an unanswered question whether a relatively low dose of radiation to a large volume or a higher dose to a small volume produces the higher cancer incidence. This is of interest in view of modalities like IMRT or rotation therapy where high conformity to the target volume is achieved at the cost of a large volume of normal tissue exposed to radiation. Knowledge of the shape of the dose response for radiation-induced cancer is essential to answer the question of what risk of second cancer incidence is implied by which treatment modality. This study therefore models the dose response for radiation-induced second cancer after radiation therapy, whose exact mechanisms are still unknown. A second cancer risk estimation tool for treatment planning is presented which has the potential to be used for comparison of different treatment modalities, and risk is estimated on a voxel basis for different organs in two case studies. The presented phenomenological model summarises the impact of microscopic biological processes into effective parameters of mutation and cell sterilisation. In contrast to other models, the effective radiosensitivities of mutated and non-mutated cells are allowed to differ. Based on the number of mutated cells present after irradiation, the model is then linked to macroscopic incidence by summarising model parameters and modifying factors into natural cancer incidence and the dose response in the lower-dose region. It was found that all principal dose-response functions discussed in the literature can be derived from the model. However, given the investigation and the scarcity of adequate data, only rather vague statements about the likelihood of the dose-response functions can be made, rather than a definite decision in favour of one response. Based on the predicted model parameters, the linear response can probably be rejected using the dynamics described, but both a flattening response and a decrease appear likely, depending strongly on the effective cell

  13. Neural Networks Involved in Adolescent Reward Processing: An Activation Likelihood Estimation Meta-Analysis of Functional Neuroimaging Studies

    Science.gov (United States)

    Silverman, Merav H.; Jedd, Kelly; Luciana, Monica

    2015-01-01

    Behavioral responses to, and the neural processing of, rewards change dramatically during adolescence and may contribute to observed increases in risk-taking during this developmental period. Functional MRI (fMRI) studies suggest differences between adolescents and adults in neural activation during reward processing, but findings are contradictory, and effects have been found in non-predicted directions. The current study uses an activation likelihood estimation (ALE) approach for quantitative meta-analysis of functional neuroimaging studies to: 1) confirm the network of brain regions involved in adolescents’ reward processing, 2) identify regions involved in specific stages (anticipation, outcome) and valence (positive, negative) of reward processing, and 3) identify differences in activation likelihood between adolescent and adult reward-related brain activation. Results reveal a subcortical network of brain regions involved in adolescent reward processing similar to that found in adults with major hubs including the ventral and dorsal striatum, insula, and posterior cingulate cortex (PCC). Contrast analyses find that adolescents exhibit greater likelihood of activation in the insula while processing anticipation relative to outcome and greater likelihood of activation in the putamen and amygdala during outcome relative to anticipation. While processing positive compared to negative valence, adolescents show increased likelihood for activation in the posterior cingulate cortex (PCC) and ventral striatum. Contrasting adolescent reward processing with the existing ALE of adult reward processing (Liu et al., 2011) reveals increased likelihood for activation in limbic, frontolimbic, and striatal regions in adolescents compared with adults. Unlike adolescents, adults also activate executive control regions of the frontal and parietal lobes. These findings support hypothesized elevations in motivated activity during adolescence. PMID:26254587

  14. Maximum likelihood estimation of semiparametric mixture component models for competing risks data.

    Science.gov (United States)

    Choi, Sangbum; Huang, Xuelin

    2014-09-01

    In the analysis of competing risks data, the cumulative incidence function is a useful quantity to characterize the crude risk of failure from a specific event type. In this article, we consider an efficient semiparametric analysis of mixture component models on cumulative incidence functions. Under the proposed mixture model, latency survival regressions given the event type are performed through a class of semiparametric models that encompasses the proportional hazards model and the proportional odds model, allowing for time-dependent covariates. The marginal proportions of the occurrences of cause-specific events are assessed by a multinomial logistic model. Our mixture modeling approach is advantageous in that it makes a joint estimation of model parameters associated with all competing risks under consideration, satisfying the constraint that the cumulative probability of failing from any cause adds up to one given any covariates. We develop a novel maximum likelihood scheme based on semiparametric regression analysis that facilitates efficient and reliable estimation. Statistical inferences can be conveniently made from the inverse of the observed information matrix. We establish the consistency and asymptotic normality of the proposed estimators. We validate small sample properties with simulations and demonstrate the methodology with a data set from a study of follicular lymphoma. © 2014, The International Biometric Society.

  15. Erythema nodosum and the risk of tuberculosis in a high incidence setting

    DEFF Research Database (Denmark)

    Bjorn-Mortensen, Karen; Ladefoged, Karin; Simonsen, Jacob

    2016-01-01

    OBJECTIVE: This study estimates the erythema nodosum (EN) incidence in a tuberculosis (TB) endemic setting and evaluates the likelihood of a subsequent TB diagnosis among individuals with Mycobacterium tuberculosis infection (MTI) with or without EN. DESIGN: We estimated EN incidence rates (IRs...

  16. Applying exclusion likelihoods from LHC searches to extended Higgs sectors

    International Nuclear Information System (INIS)

    Bechtle, Philip; Heinemeyer, Sven; Staal, Oscar; Stefaniak, Tim; Weiglein, Georg

    2015-01-01

LHC searches for non-standard Higgs bosons decaying into tau lepton pairs constitute a sensitive experimental probe for physics beyond the Standard Model (BSM), such as supersymmetry (SUSY). Recently, the limits obtained from these searches have been presented by the CMS collaboration in a nearly model-independent fashion - as a narrow resonance model - based on the full 8 TeV dataset. In addition to publishing a 95% C.L. exclusion limit, the full likelihood information for the narrow-resonance model has been released. This provides valuable information that can be incorporated into global BSM fits. We present a simple algorithm that maps an arbitrary model with multiple neutral Higgs bosons onto the narrow resonance model and derives the corresponding value for the exclusion likelihood from the CMS search. This procedure has been implemented into the public computer code HiggsBounds (version 4.2.0 and higher). We validate our implementation by cross-checking against the official CMS exclusion contours in three Higgs benchmark scenarios in the Minimal Supersymmetric Standard Model (MSSM), and find very good agreement. Going beyond validation, we discuss the combined constraints of the ττ search and the rate measurements of the SM-like Higgs at 125 GeV in a recently proposed MSSM benchmark scenario, where the lightest Higgs boson obtains SM-like couplings independently of the decoupling of the heavier Higgs states. Technical details for how to access the likelihood information within HiggsBounds are given in the appendix. The program is available at http://higgsbounds.hepforge.org. (orig.)

  17. Australian food life style segments and elaboration likelihood differences

    DEFF Research Database (Denmark)

    Brunsø, Karen; Reid, Mike

As the global food marketing environment becomes more competitive, the international and comparative perspective of consumers' attitudes and behaviours becomes more important for both practitioners and academics. This research employs the Food-Related Life Style (FRL) instrument in Australia in order to 1) determine Australian Life Style Segments and compare these with their European counterparts, and to 2) explore differences in elaboration likelihood among the Australian segments, e.g. consumers' interest and motivation to perceive product-related communication. The results provide new...

  18. Maximum-likelihood method for numerical inversion of Mellin transform

    International Nuclear Information System (INIS)

    Iqbal, M.

    1997-01-01

    A method is described for inverting the Mellin transform which uses an expansion in Laguerre polynomials and converts the Mellin transform to Laplace transform, then the maximum-likelihood regularization method is used to recover the original function of the Mellin transform. The performance of the method is illustrated by the inversion of the test functions available in the literature (J. Inst. Math. Appl., 20 (1977) 73; Math. Comput., 53 (1989) 589). Effectiveness of the method is shown by results obtained through demonstration by means of tables and diagrams

  19. How to Improve the Likelihood of CDM Approval?

    DEFF Research Database (Denmark)

    Brandt, Urs Steiner; Svendsen, Gert Tinggaard

    2014-01-01

How can the likelihood of Clean Development Mechanism (CDM) approval be improved in the face of institutional shortcomings? To answer this question, we focus on the three institutional shortcomings of income sharing, risk sharing and corruption prevention concerning afforestation/reforestation (A/R). Furthermore, three main stakeholders are identified, namely investors, governments and agents, in a principal-agent model regarding monitoring and enforcement capacity. Developing regions such as West Africa have, despite huge potential, not been integrated in A/R CDM projects yet. Remote sensing, however...

  20. Maximum Likelihood and Bayes Estimation in Randomly Censored Geometric Distribution

    Directory of Open Access Journals (Sweden)

    Hare Krishna

    2017-01-01

Full Text Available In this article, we study the geometric distribution under randomly censored data. Maximum likelihood estimators and confidence intervals based on the Fisher information matrix are derived for the unknown parameters with randomly censored data. Bayes estimators are also developed using beta priors under generalized entropy and LINEX loss functions. Also, Bayesian credible and highest posterior density (HPD) credible intervals are obtained for the parameters. Expected time on test and reliability characteristics are also analyzed in this article. To compare the various estimates developed in the article, a Monte Carlo simulation study is carried out. Finally, for illustration purposes, a randomly censored real data set is discussed.
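
    For the randomly censored geometric model, the maximum likelihood estimator has a closed form. A short sketch under the convention P(X = x) = p(1 - p)^(x-1), x = 1, 2, ... (illustrative, not the authors' code):

    ```python
    import numpy as np

    def geometric_mle_censored(x, delta):
        """x[i]: observed time; delta[i]: 1 if failure observed, 0 if censored.
        log L = sum_i [ delta_i*(log p + (x_i - 1)*log(1 - p))
                        + (1 - delta_i)*x_i*log(1 - p) ]
        is maximized at p_hat = sum(delta) / sum(x)."""
        x = np.asarray(x, dtype=float)
        delta = np.asarray(delta, dtype=float)
        p_hat = delta.sum() / x.sum()
        # observed Fisher information at p_hat, for a Wald-type 95% interval
        info = delta.sum() / p_hat**2 + (x.sum() - delta.sum()) / (1.0 - p_hat)**2
        se = 1.0 / np.sqrt(info)
        return p_hat, (p_hat - 1.96 * se, p_hat + 1.96 * se)
    ```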

  1. Elemental composition of cosmic rays using a maximum likelihood method

    International Nuclear Information System (INIS)

    Ruddick, K.

    1996-01-01

    We present a progress report on our attempts to determine the composition of cosmic rays in the knee region of the energy spectrum. We have used three different devices to measure properties of the extensive air showers produced by primary cosmic rays: the Soudan 2 underground detector measures the muon flux deep underground, a proportional tube array samples shower density at the surface of the earth, and a Cherenkov array observes light produced high in the atmosphere. We have begun maximum likelihood fits to these measurements with the hope of determining the nuclear mass number A on an event-by-event basis. (orig.)

  2. Likelihood-Based Inference in Nonlinear Error-Correction Models

    DEFF Research Database (Denmark)

    Kristensen, Dennis; Rahbæk, Anders

    We consider a class of vector nonlinear error correction models where the transfer function (or loadings) of the stationary relationships is nonlinear. This includes in particular the smooth transition models. A general representation theorem is given which establishes the dynamic properties...... and a linear trend in general. Gaussian likelihood-based estimators are considered for the long-run cointegration parameters and the short-run parameters. Asymptotic theory is provided for these, and it is discussed to what extent asymptotic normality and mixed normality can be found. A simulation study...

  3. Process criticality accident likelihoods, consequences and emergency planning

    International Nuclear Information System (INIS)

    McLaughlin, T.P.

    1992-01-01

    Evaluation of criticality accident risks in the processing of significant quantities of fissile materials is both complex and subjective, largely due to the lack of accident statistics. Thus, complying with national and international standards and regulations, which require an evaluation of the net benefit of a criticality accident alarm system, is also subjective. A review of guidance found in the literature on potential accident magnitudes is presented for different material forms and arrangements. Reasoned arguments are also presented concerning accident prevention and accident likelihoods for these material forms and arrangements. (Author)

  4. Likelihood Estimation of Gamma Ray Bursts Duration Distribution

    OpenAIRE

    Horvath, Istvan

    2005-01-01

    Two classes of Gamma Ray Bursts have been identified so far, characterized by T90 durations shorter and longer than approximately 2 seconds. It was shown that the BATSE 3B data allow a good fit with three Gaussian distributions in log T90. Another paper in the same volume of ApJ suggested that a third class of GRBs may exist. Using the full BATSE catalog, here we present the maximum likelihood estimation, which gives a 0.5% probability of there being only two subclasses. The MC simulation co...

  5. Process criticality accident likelihoods, consequences, and emergency planning

    Energy Technology Data Exchange (ETDEWEB)

    McLaughlin, T.P.

    1991-01-01

    Evaluation of criticality accident risks in the processing of significant quantities of fissile materials is both complex and subjective, largely due to the lack of accident statistics. Thus, complying with standards such as ISO 7753, which mandates that the need for an alarm system be evaluated, is also subjective. A review of guidance found in the literature on potential accident magnitudes is presented for different material forms and arrangements. Reasoned arguments are also presented concerning accident prevention and accident likelihoods for these material forms and arrangements. 13 refs., 1 fig., 1 tab.

  6. Improved Likelihood Function in Particle-based IR Eye Tracking

    DEFF Research Database (Denmark)

    Satria, R.; Sorensen, J.; Hammoud, R.

    2005-01-01

    In this paper we propose a log likelihood-ratio function of foreground and background models used in a particle filter to track the eye region in dark-bright pupil image sequences. This model fuses information from both dark and bright pupil images and their difference image into one model. Our...... enhanced tracker overcomes the issue of prior selection of static thresholds during the detection of feature observations in the bright-dark difference images. The auto-initialization process is performed using a cascaded classifier trained with AdaBoost and adapted to IR eye images. Experiments show good...

  7. Similar tests and the standardized log likelihood ratio statistic

    DEFF Research Database (Denmark)

    Jensen, Jens Ledet

    1986-01-01

    When testing an affine hypothesis in an exponential family the 'ideal' procedure is to calculate the exact similar test, or an approximation to this, based on the conditional distribution given the minimal sufficient statistic under the null hypothesis. In contrast to this there is a 'primitive' approach in which the marginal distribution of a test statistic is considered and any nuisance parameter appearing in the test statistic is replaced by an estimate. We show here that when using standardized likelihood ratio statistics the 'primitive' procedure is in fact an 'ideal' procedure to order O(n^{-3/2}...

  8. Maximum Likelihood Joint Tracking and Association in Strong Clutter

    Directory of Open Access Journals (Sweden)

    Leonid I. Perlovsky

    2013-01-01

    We have developed a maximum likelihood formulation for a joint detection, tracking, and association problem. An efficient non-combinatorial algorithm for this problem is developed for radar data in the case of strong clutter. By using an iterative procedure of the dynamic logic process “from vague-to-crisp” explained in the paper, the new tracker overcomes the combinatorial complexity of tracking in highly cluttered scenarios and results in an orders-of-magnitude improvement in signal-to-clutter ratio.

  9. Accuracy of maximum likelihood estimates of a two-state model in single-molecule FRET

    Energy Technology Data Exchange (ETDEWEB)

    Gopich, Irina V. [Laboratory of Chemical Physics, National Institute of Diabetes and Digestive and Kidney Diseases, National Institutes of Health, Bethesda, Maryland 20892 (United States)

    2015-01-21

    Photon sequences from single-molecule Förster resonance energy transfer (FRET) experiments can be analyzed using a maximum likelihood method. Parameters of the underlying kinetic model (FRET efficiencies of the states and transition rates between conformational states) are obtained by maximizing the appropriate likelihood function. In addition, the errors (uncertainties) of the extracted parameters can be obtained from the curvature of the likelihood function at the maximum. We study the standard deviations of the parameters of a two-state model obtained from photon sequences with recorded colors and arrival times. The standard deviations can be obtained analytically in a special case when the FRET efficiencies of the states are 0 and 1 and in the limiting cases of fast and slow conformational dynamics. These results are compared with the results of numerical simulations. The accuracy and, therefore, the ability to predict model parameters depend on how fast the transition rates are compared to the photon count rate. In the limit of slow transitions, the key parameters that determine the accuracy are the number of transitions between the states and the number of independent photon sequences. In the fast transition limit, the accuracy is determined by the small fraction of photons that are correlated with their neighbors. The relative standard deviation of the relaxation rate has a “chevron” shape as a function of the transition rate in the log-log scale. The location of the minimum of this function dramatically depends on how well the FRET efficiencies of the states are separated.

  10. Evidence Based Medicine; Positive and Negative Likelihood Ratios of Diagnostic Tests

    Directory of Open Access Journals (Sweden)

    Alireza Baratloo

    2015-10-01

    In the previous two parts of this educational manuscript series in Emergency, we explained some screening characteristics of diagnostic tests, including accuracy, sensitivity, specificity, and positive and negative predictive values. In the third part we aimed to explain the positive and negative likelihood ratio (LR) as one of the most reliable performance measures of a diagnostic test. To better understand this characteristic of a test, it is first necessary to fully understand the concepts of sensitivity and specificity, so we strongly advise you to review the first part of this series again. In short, likelihood ratios are about the percentage of people with and without a disease who have the same test result. The prevalence of a disease can directly influence the screening characteristics of a diagnostic test, especially its predictive values; the LR was developed to eliminate this effect. The pre-test odds of a disease multiplied by the positive or negative LR give the post-test odds, and hence the post-test probability. Therefore, the LR is the most important characteristic of a test for ruling a diagnosis in or out. A positive likelihood ratio > 1 means a higher probability of the disease being present in a patient with a positive test. The further from 1, either higher or lower, the stronger the evidence to rule the disease in or out, respectively. Tests with an LR close to 1 are less practical, while an LR further from 1 has more value for application in medicine. Usually, tests with LR < 0.1 or LR > 10 are considered suitable for use in routine practice.
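
    The arithmetic in the passage above is compact enough to state exactly. A short illustration (the numbers are invented for the example):

```python
# Illustrative numbers only. Positive/negative likelihood ratios from
# sensitivity and specificity, and the pre-test -> post-test update.

def likelihood_ratios(sensitivity, specificity):
    lr_positive = sensitivity / (1 - specificity)
    lr_negative = (1 - sensitivity) / specificity
    return lr_positive, lr_negative

def post_test_probability(pre_test_prob, lr):
    odds = pre_test_prob / (1 - pre_test_prob)  # probability -> odds
    post_odds = odds * lr                       # Bayes' rule in odds form
    return post_odds / (1 + post_odds)          # odds -> probability

lr_pos, lr_neg = likelihood_ratios(0.90, 0.80)  # 4.5 and 0.125
print(post_test_probability(0.30, lr_pos))      # ~0.66 if the test is positive
print(post_test_probability(0.30, lr_neg))      # ~0.05 if the test is negative
```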

  11. DarkBit. A GAMBIT module for computing dark matter observables and likelihoods

    Energy Technology Data Exchange (ETDEWEB)

    Bringmann, Torsten; Dal, Lars A. [University of Oslo, Department of Physics, Oslo (Norway); Conrad, Jan; Edsjoe, Joakim; Farmer, Ben [AlbaNova University Centre, Oskar Klein Centre for Cosmoparticle Physics, Stockholm (Sweden); Stockholm University, Department of Physics, Stockholm (Sweden); Cornell, Jonathan M. [McGill University, Department of Physics, Montreal, QC (Canada); Kahlhoefer, Felix; Wild, Sebastian [DESY, Hamburg (Germany); Kvellestad, Anders; Savage, Christopher [NORDITA, Stockholm (Sweden); Putze, Antje [LAPTh, Universite de Savoie, CNRS, Annecy-le-Vieux (France); Scott, Pat [Blackett Laboratory, Imperial College London, Department of Physics, London (United Kingdom); Weniger, Christoph [University of Amsterdam, GRAPPA, Institute of Physics, Amsterdam (Netherlands); White, Martin [University of Adelaide, Department of Physics, Adelaide, SA (Australia); Australian Research Council Centre of Excellence for Particle Physics at the Tera-scale, Parkville (Australia); Collaboration: The GAMBIT Dark Matter Workgroup

    2017-12-15

    We introduce DarkBit, an advanced software code for computing dark matter constraints on various extensions to the Standard Model of particle physics, comprising both new native code and interfaces to external packages. This release includes a dedicated signal yield calculator for gamma-ray observations, which significantly extends current tools by implementing a cascade-decay Monte Carlo, as well as a dedicated likelihood calculator for current and future experiments (gamLike). This provides a general solution for studying complex particle physics models that predict dark matter annihilation to a multitude of final states. We also supply a direct detection package that models a large range of direct detection experiments (DDCalc), and that provides the corresponding likelihoods for arbitrary combinations of spin-independent and spin-dependent scattering processes. Finally, we provide custom relic density routines along with interfaces to DarkSUSY, micrOMEGAs, and the neutrino telescope likelihood package nulike. DarkBit is written in the framework of the Global And Modular Beyond the Standard Model Inference Tool (GAMBIT), providing seamless integration into a comprehensive statistical fitting framework that allows users to explore new models with both particle and astrophysics constraints, and a consistent treatment of systematic uncertainties. In this paper we describe its main functionality, provide a guide to getting started quickly, and show illustrative examples for results obtained with DarkBit (both as a stand-alone tool and as a GAMBIT module). This includes a quantitative comparison between two of the main dark matter codes (DarkSUSY and micrOMEGAs), and application of DarkBit's advanced direct and indirect detection routines to a simple effective dark matter model. (orig.)

  12. Penalized likelihood and multi-objective spatial scans for the detection and inference of irregular clusters

    Directory of Open Access Journals (Sweden)

    Fonseca Carlos M

    2010-10-01

    Background: Irregularly shaped spatial clusters are difficult to delineate. A cluster found by an algorithm often spreads through large portions of the map, impacting its geographical meaning. Penalized likelihood methods for Kulldorff's spatial scan statistic have been used to control the excessive freedom of the shape of clusters. Penalty functions based on cluster geometry and non-connectivity have been proposed recently. Another approach involves the use of a multi-objective algorithm to maximize two objectives: the spatial scan statistic and the geometric penalty function. Results & Discussion: We present a novel scan statistic algorithm employing a function based on the graph topology to penalize the presence of under-populated disconnection nodes in candidate clusters, the disconnection nodes cohesion function. A disconnection node is defined as a region within a cluster whose removal disconnects the cluster. By applying this function, the most geographically meaningful clusters are sifted through the immense set of possible irregularly shaped candidate cluster solutions. To evaluate the statistical significance of solutions for multi-objective scans, a statistical approach based on the concept of the attainment function is used. In this paper we compare different penalized likelihoods employing the geometric and non-connectivity regularity functions and the novel disconnection nodes cohesion function. We also build multi-objective scans using those three functions and compare them with the previous penalized likelihood scans. An application is presented using comprehensive state-wide data for Chagas' disease in puerperal women in Minas Gerais state, Brazil. Conclusions: We show that, compared to the other single-objective algorithms, multi-objective scans present better performance regarding power, sensitivity, and positive predicted value. The multi-objective non-connectivity scan is faster and better suited for the...
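
    For reference, the quantity all of these penalized and multi-objective variants build on is Kulldorff's Poisson log-likelihood ratio for a candidate zone. A minimal sketch (notation ours); the penalized scans above multiply or trade this LLR off against a shape-regularity term:

```python
import numpy as np

# Minimal sketch (notation ours) of the Poisson log-likelihood ratio at the
# core of Kulldorff's spatial scan statistic: c observed cases inside a
# candidate zone, e expected cases under the null, C total cases on the map.

def kulldorff_llr(c, e, C):
    if c <= e:  # only elevated-risk zones are candidate clusters
        return 0.0
    return c * np.log(c / e) + (C - c) * np.log((C - c) / (C - e))

print(kulldorff_llr(c=30, e=12.5, C=400))  # LLR for one candidate zone
```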

  13. Likelihood Approximation With Parallel Hierarchical Matrices For Large Spatial Datasets

    KAUST Repository

    Litvinenko, Alexander

    2017-11-01

    The main goal of this article is to introduce the parallel hierarchical matrix library HLIBpro to the statistical community. We describe the HLIBCov package, which is an extension of the HLIBpro library for approximating large covariance matrices and maximizing likelihood functions. We show that an approximate Cholesky factorization of a dense matrix of size $2M \times 2M$ can be computed on a modern multi-core desktop in a few minutes. Further, HLIBCov is used for estimating the unknown parameters, such as the covariance length, variance, and smoothness parameter of a Matérn covariance function, by maximizing the joint Gaussian log-likelihood function. The computational bottleneck here is the expensive linear algebra arithmetic due to large and dense covariance matrices. Therefore covariance matrices are approximated in the hierarchical ($\mathcal{H}$-) matrix format with computational cost $\mathcal{O}(k^2 n \log^2 n / p)$ and storage $\mathcal{O}(k n \log n)$, where the rank $k$ is a small integer (typically $k < 25$), $p$ is the number of cores, and $n$ is the number of locations on a fairly general mesh. We demonstrate a synthetic example where the true parameter values are known. For reproducibility we provide the C++ code, the documentation, and the synthetic data.
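
    The objective being maximized can be written out explicitly; in standard notation (ours, consistent with the description above), for an observation vector $z$ at $n$ locations and a Matérn covariance matrix $C(\theta)$ parameterized by $\theta$ = (variance, smoothness, covariance length), the joint Gaussian log-likelihood is

$$
\mathcal{L}(\theta) = -\frac{n}{2}\log(2\pi) - \frac{1}{2}\log\det C(\theta) - \frac{1}{2}\, z^{\top} C(\theta)^{-1} z,
$$

    where the log-determinant and the quadratic form are obtained from the $\mathcal{H}$-matrix Cholesky factor, which is what removes the cubic cost of the dense computation.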

  15. Superfast maximum-likelihood reconstruction for quantum tomography

    Science.gov (United States)

    Shang, Jiangwei; Zhang, Zhengyun; Ng, Hui Khoon

    2017-06-01

    Conventional methods for computing maximum-likelihood estimators (MLEs) often converge slowly in practical situations, leading to a search for simplifying methods that rely on additional assumptions for their validity. In this work, we provide a fast and reliable algorithm for maximum-likelihood reconstruction that avoids this slow convergence. Our method utilizes a state-of-the-art convex optimization scheme, an accelerated projected-gradient method, that allows one to accommodate the quantum nature of the problem in a different way than in the standard methods. We demonstrate the power of our approach by comparing its performance with other algorithms for n-qubit state tomography. In particular, an eight-qubit situation that purportedly took weeks of computation time in 2005 can now be completed in under a minute for a single set of data, with far higher accuracy than previously possible. This refutes the common claim that MLE reconstruction is slow and reduces the need for alternative methods that often come with difficult-to-verify assumptions. In fact, recent methods assuming Gaussian statistics or relying on compressed sensing ideas are demonstrably inapplicable for the situation under consideration here. Our algorithm can be applied to general optimization problems over the quantum state space; the philosophy of projected gradients can further be utilized for optimization contexts with general constraints.
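
    A key primitive in any projected-gradient scheme over the quantum state space is the projection of a Hermitian iterate onto the set of density matrices (positive semidefinite, unit trace). The sketch below (helper names ours, not code from the paper) implements this Frobenius-norm projection by projecting the eigenvalues onto the probability simplex:

```python
import numpy as np

def project_to_density_matrix(H):
    """Frobenius-norm projection of a Hermitian matrix onto the set
    {rho : rho >= 0, tr(rho) = 1}, i.e. simplex projection of eigenvalues."""
    w, V = np.linalg.eigh(H)
    u = np.sort(w)[::-1]                     # eigenvalues, descending
    css = np.cumsum(u)
    idx = np.arange(1, len(u) + 1)
    k = np.nonzero(u + (1.0 - css) / idx > 0)[0][-1]
    tau = (1.0 - css[k]) / (k + 1)           # shift that enforces tr = 1
    w_proj = np.maximum(w + tau, 0.0)        # clip negative eigenvalues
    return (V * w_proj) @ V.conj().T         # V diag(w_proj) V^dagger

rho = project_to_density_matrix(np.array([[0.9, 0.3], [0.3, 0.4]]))
print(np.trace(rho), np.linalg.eigvalsh(rho))  # trace 1, eigenvalues >= 0
```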

  16. Likelihood inference for a fractionally cointegrated vector autoregressive model

    DEFF Research Database (Denmark)

    Johansen, Søren; Ørregård Nielsen, Morten

    2012-01-01

    We consider model-based inference in a fractionally cointegrated (or cofractional) vector autoregressive model with a restricted constant term, based on the Gaussian likelihood conditional on initial values. The model nests the I(d) VAR model. We give conditions on the parameters such that the process X_{t} is fractional of order d and cofractional of order d-b; that is, there exist vectors β for which β'X_{t} is fractional of order d-b, and no other fractionality order is possible. We define the statistical model by 0 < b ≤ d, but conduct inference when the true values satisfy b0 ≥ 1/2 and d0-b0...... process in the parameters when errors are i.i.d. with suitable moment conditions and initial values are bounded. When the limit is deterministic this implies uniform convergence in probability of the conditional likelihood function. If the true value b0 > 1/2, we prove that the limit distribution of (β...

  17. Likelihood-Based Inference of B Cell Clonal Families.

    Directory of Open Access Journals (Sweden)

    Duncan K Ralph

    2016-10-01

    The human immune system depends on a highly diverse collection of antibody-making B cells. B cell receptor sequence diversity is generated by a random recombination process called "rearrangement", which forms progenitor B cells, followed by a Darwinian process of lineage diversification and selection called "affinity maturation." The resulting receptors can be sequenced in high throughput for research and diagnostics. Such a collection of sequences contains a mixture of various lineages, each of which may be quite numerous or may consist of only a single member. As a step toward understanding the process and result of this diversification, one may wish to reconstruct lineage membership, i.e., to cluster sampled sequences according to which came from the same rearrangement events. We call this clustering problem "clonal family inference." In this paper we describe and validate a likelihood-based framework for clonal family inference based on a multi-hidden Markov Model (multi-HMM) framework for B cell receptor sequences. We describe an agglomerative algorithm to find a maximum likelihood clustering, two approximate algorithms with various trade-offs of speed versus accuracy, and a third, fast algorithm for finding specific lineages. We show that under simulation these algorithms greatly improve upon existing clonal family inference methods, and that they also give significantly different clusters than previous methods when applied to two real data sets.

  18. Questioning the Elaboration Likelihood Model (ELM) and Rhetorical Theory

    Directory of Open Access Journals (Sweden)

    Yudi Perbawaningsih

    2012-06-01

    Persuasion is a communication process used to establish or change attitudes, and it can be understood through rhetorical theory and the Elaboration Likelihood Model (ELM). This study elaborates these theories in a public lecture series intended to persuade students in choosing their concentration of study, based on how they process the information presented. Using a survey method, the study finds that, in terms of persuasion effectiveness, it is not quite relevant to separate the message from its source: the quality of the source is determined by the quality of the message, and vice versa. Separating the persuasion process into two routes, as described in ELM theory, would therefore not be relevant.

  19. Gauging the likelihood of stable cavitation from ultrasound contrast agents.

    Science.gov (United States)

    Bader, Kenneth B; Holland, Christy K

    2013-01-07

    The mechanical index (MI) was formulated to gauge the likelihood of adverse bioeffects from inertial cavitation. However, the MI formulation did not consider bubble activity from stable cavitation. This type of bubble activity can be readily nucleated from ultrasound contrast agents (UCAs) and has the potential to promote beneficial bioeffects. Here, the presence of stable cavitation is determined numerically by tracking the onset of subharmonic oscillations within a population of bubbles for frequencies up to 7 MHz and peak rarefactional pressures up to 3 MPa. In addition, the acoustic pressure rupture threshold of a UCA population was determined using the Marmottant model. The threshold for subharmonic emissions of optimally sized bubbles was found to be lower than the inertial cavitation threshold for all frequencies studied. The rupture thresholds of optimally sized UCAs were found to be lower than the threshold for subharmonic emissions for either single-cycle or steady-state acoustic excitations. Because the thresholds of both subharmonic emissions and UCA rupture are linearly dependent on frequency, an index of the form I_CAV = P_r/f (where P_r is the peak rarefactional pressure in MPa and f is the frequency in MHz) was derived to gauge the likelihood of subharmonic emissions due to stable cavitation activity nucleated from UCAs.
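
    For comparison, with the same conventions as above (P_r, the peak rarefactional pressure in MPa; f, the frequency in MHz), the standard mechanical index and the proposed stable-cavitation index are

$$
\mathrm{MI} = \frac{P_r}{\sqrt{f}}, \qquad I_{\mathrm{CAV}} = \frac{P_r}{f},
$$

    so the two indices differ only in how strongly they discount higher frequencies.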

  20. Safe semi-supervised learning based on weighted likelihood.

    Science.gov (United States)

    Kawakita, Masanori; Takeuchi, Jun'ichi

    2014-05-01

    We are interested in developing a safe semi-supervised learning that works in any situation. Semi-supervised learning postulates that n′ unlabeled data are available in addition to n labeled data. However, almost all of the previous semi-supervised methods require additional assumptions (not only unlabeled data) to make improvements on supervised learning. If such assumptions are not met, then the methods possibly perform worse than supervised learning. Sokolovska, Cappé, and Yvon (2008) proposed a semi-supervised method based on a weighted likelihood approach. They proved that this method asymptotically never performs worse than supervised learning (i.e., it is safe) without any assumption. Their method is attractive because it is easy to implement and is potentially general. Moreover, it is deeply related to a certain statistical paradox. However, the method of Sokolovska et al. (2008) assumes a very limited situation, i.e., classification, discrete covariates, n′ → ∞, and a maximum likelihood estimator. In this paper, we extend their method by modifying the weight. We prove that our proposal is safe in a significantly wide range of situations as long as n ≤ n′. Further, we give a geometrical interpretation of the proof of safety through the relationship with the above-mentioned statistical paradox. Finally, we show that the above proposal is asymptotically safe even when n′...

  1. Incident Information Management Tool

    CERN Document Server

    Pejovic, Vladimir

    2015-01-01

    Flaws of current incident information management at CMS and CERN are discussed. A new data model for a future incident database is proposed and briefly described. A recently developed draft version of a GIS-based tool for incident tracking is presented.

  2. Use of an artificial neural network to predict the incidence of malaria in the city of Cantá, state of Roraima

    Directory of Open Access Journals (Sweden)

    Guilherme Bernardino da Cunha

    2010-10-01

    INTRODUCTION: Malaria is endemic in the Brazilian Amazon region, with different risks for each area. The city of Cantá, state of Roraima, presented one of the largest annual parasite indices in Brazil for the entire study period, with a value always greater than 50. The present study aimed to use an artificial neural network to predict the incidence of malaria in this city, in order to help health coordinators with planning and resource management. METHODS: Data were collected from the Ministry of Health SIVEP-Malária site for 2003 to 2009. An artificial neural network was structured with three neurons in the input layer, two hidden layers, and an output layer with one neuron. The activation function was the sigmoid. Training used the backpropagation method with a learning rate of 0.05 and momentum of 0.01. The stopping criterion was reaching 20,000 cycles or a goal of 0.001. Data from 2003 to 2008 were used for training and validation, and the results were compared with those of a logistic regression model. RESULTS: For all predicted periods, the artificial neural network obtained a smaller mean squared error and absolute error than the regression model for 2009. CONCLUSIONS: The artificial neural network proved adequate for a malaria prediction system in the studied city, determining predictive values with small absolute errors when compared with the logistic regression model and the actual values.
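
    The network described above maps directly onto off-the-shelf tools. The sketch below (not the authors' code; the hidden-layer sizes and the lagged-count input encoding are our assumptions) reproduces the stated configuration - sigmoid activations, SGD backpropagation with learning rate 0.05 and momentum 0.01, and a 20,000-cycle cap - with scikit-learn's MLPRegressor:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Hypothetical stand-in data: three lagged monthly case counts as inputs,
# next month's incidence as the target (the paper's 3-input setup).
rng = np.random.default_rng(0)
X = rng.poisson(50, size=(72, 3)).astype(float)   # 72 months of history
y = 0.5 * X.mean(axis=1) + rng.normal(0, 2, 72)   # invented target values

model = MLPRegressor(
    hidden_layer_sizes=(5, 5),   # two hidden layers (sizes assumed)
    activation="logistic",       # sigmoid activation, as in the paper
    solver="sgd",                # backpropagation by gradient descent
    learning_rate_init=0.05,     # the paper's learning rate
    momentum=0.01,               # the paper's momentum
    max_iter=20000,              # stop after 20,000 cycles at most
    tol=1e-3,                    # loose stand-in for the 0.001 error goal
)
model.fit(X, y)
print(model.predict(X[:3]))
```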

  3. Brief Communication: Likelihood of societal preparedness for global change: trend detection

    Directory of Open Access Journals (Sweden)

    R. M. Vogel

    2013-07-01

    Anthropogenic influences on earth system processes are now pervasive, resulting in trends in river discharge, pollution levels, ocean levels, precipitation, temperature, wind, landslides, bird and plant populations, and a myriad of other important natural hazards relating to earth system state variables. Thousands of trend detection studies have been published which report the statistical significance of observed trends. Unfortunately, such studies only concentrate on the null hypothesis of "no trend". Little or no attention is given to the power of such statistical trend tests, which would quantify the likelihood that we might ignore a trend if it really existed. The probability of missing the trend, if it exists, known as the type II error, informs us about whether society is prepared to accommodate and respond to such trends. We describe how the power, or probability of detecting a trend if it exists, depends critically on our ability to develop improved multivariate deterministic and statistical methods for predicting future trends in earth system processes. Several other research and policy implications for improving our understanding of trend detection and our societal response to those trends are discussed.
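
    The point about power can be made concrete with a small Monte Carlo experiment: fix a true trend, simulate noisy records, and count how often a standard regression trend test rejects "no trend". All numbers below are illustrative assumptions, not values from the paper:

```python
import numpy as np
from scipy import stats

def trend_test_power(slope, sigma, n_years, alpha=0.05, n_sim=5000, seed=0):
    """Monte Carlo power of a linear-regression trend test: how often H0
    ("no trend") is rejected when a trend of the given size truly exists."""
    rng = np.random.default_rng(seed)
    t = np.arange(n_years)
    rejections = 0
    for _ in range(n_sim):
        y = slope * t + rng.normal(0.0, sigma, n_years)
        if stats.linregress(t, y).pvalue < alpha:
            rejections += 1
    return rejections / n_sim

# e.g. a 2 mm/yr trend buried in 20 mm interannual noise, 30-year record:
power = trend_test_power(slope=2.0, sigma=20.0, n_years=30)
print(power, 1.0 - power)  # power and the corresponding type II error
```

    One minus the returned value is the type II error discussed above: the chance of missing a trend of that size in a record of that length.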

  4. Lay understanding of forensic statistics: Evaluation of random match probabilities, likelihood ratios, and verbal equivalents.

    Science.gov (United States)

    Thompson, William C; Newman, Eryn J

    2015-08-01

    Forensic scientists have come under increasing pressure to quantify the strength of their evidence, but it is not clear which of several possible formats for presenting quantitative conclusions will be easiest for lay people, such as jurors, to understand. This experiment examined the way that people recruited from Amazon's Mechanical Turk (n = 541) responded to 2 types of forensic evidence--a DNA comparison and a shoeprint comparison--when an expert explained the strength of this evidence 3 different ways: using random match probabilities (RMPs), likelihood ratios (LRs), or verbal equivalents of likelihood ratios (VEs). We found that verdicts were sensitive to the strength of DNA evidence regardless of how the expert explained it, but verdicts were sensitive to the strength of shoeprint evidence only when the expert used RMPs. The weight given to DNA evidence was consistent with the predictions of a Bayesian network model that incorporated the perceived risk of a false match from 3 causes (coincidence, a laboratory error, and a frame-up), but shoeprint evidence was undervalued relative to the same Bayesian model. Fallacious interpretations of the expert's testimony (consistent with the source probability error and the defense attorney's fallacy) were common and were associated with the weight given to the evidence and verdicts. The findings indicate that perceptions of forensic science evidence are shaped by prior beliefs and expectations as well as expert testimony and consequently that the best way to characterize and explain forensic evidence may vary across forensic disciplines.

  5. Education and Income Imbalances Among Married Couples in Malawi as Predictors for Likelihood of Physical and Emotional Intimate Partner Violence.

    Science.gov (United States)

    Bonnes, Stephanie

    2016-01-01

    Intimate partner violence is a social and public health problem that is prevalent across the world. In many societies, power differentials in relationships, often supported by social norms that promote gender inequality, lead to incidents of intimate partner violence. Among other factors, both a woman's years of education and educational differences between a woman and her partner have been shown to have an effect on her likelihood of experiencing intimate partner abuse. Using the 2010 Malawian Demographic and Health Survey data to analyze intimate partner violence among 3,893 married Malawian women and their husbands, this article focuses on understanding the effect of educational differences between husband and wife on the likelihood of physical and emotional abuse within a marriage. The results from logistic regression models show that a woman's level of education is a significant predictor of her likelihood of experiencing intimate partner violence by her current husband, but that this effect is contingent on her husband's level of education. This study demonstrates the need to educate men alongside women in Malawi to help decrease women's risk of physical and emotional intimate partner violence.

  6. HLA Match Likelihoods for Hematopoietic Stem-Cell Grafts in the U.S. Registry

    Science.gov (United States)

    Gragert, Loren; Eapen, Mary; Williams, Eric; Freeman, John; Spellman, Stephen; Baitty, Robert; Hartzman, Robert; Rizzo, J. Douglas; Horowitz, Mary; Confer, Dennis; Maiers, Martin

    2018-01-01

    Background Hematopoietic stem-cell transplantation (HSCT) is a potentially lifesaving therapy for several blood cancers and other diseases. For patients without a suitable related HLA-matched donor, unrelated-donor registries of adult volunteers and banked umbilical cord–blood units, such as the Be the Match Registry operated by the National Marrow Donor Program (NMDP), provide potential sources of donors. Our goal in the present study was to measure the likelihood of finding a suitable donor in the U.S. registry. Methods Using human HLA data from the NMDP donor and cord-blood-unit registry, we built population-based genetic models for 21 U.S. racial and ethnic groups to predict the likelihood of identifying a suitable donor (either an adult donor or a cord-blood unit) for patients in each group. The models incorporated the degree of HLA matching, adult-donor availability (i.e., ability to donate), and cord-blood-unit cell dose. Results Our models indicated that most candidates for HSCT will have a suitable (HLA-matched or minimally mismatched) adult donor. However, many patients will not have an optimal adult donor — that is, a donor who is matched at high resolution at HLA-A, HLA-B, HLA-C, and HLA-DRB1. The likelihood of finding an optimal donor varies among racial and ethnic groups, with the highest probability among whites of European descent, at 75%, and the lowest probability among blacks of South or Central American descent, at 16%. Likelihoods for other groups are intermediate. Few patients will have an optimal cord-blood unit — that is, one matched at the antigen level at HLA-A and HLA-B and matched at high resolution at HLA-DRB1. However, cord-blood units mismatched at one or two HLA loci are available for almost all patients younger than 20 years of age and for more than 80% of patients 20 years of age or older, regardless of racial and ethnic background. Conclusions Most patients likely to benefit from HSCT will have a donor. Public investment in

  7. Assessing Individual Weather Risk-Taking and Its Role in Modeling Likelihood of Hurricane Evacuation

    Science.gov (United States)

    Stewart, A. E.

    2017-12-01

    This research focuses upon measuring an individual's level of perceived risk of different severe and extreme weather conditions using a new self-report measure, the Weather Risk-Taking Scale (WRTS). For 32 severe and extreme situations in which people could perform an unsafe behavior (e.g., remaining outside with lightning striking close by, driving over roadways covered with water, not evacuating ahead of an approaching hurricane, etc.), people rated: 1. their likelihood of performing the behavior, 2. the perceived risk of performing the behavior, 3. the expected benefits of performing the behavior, and 4. whether the behavior has actually been performed in the past. Initial development research with the measure using 246 undergraduate students examined its psychometric properties and found that it was internally consistent (Cronbach's α ranged from .87 to .93 for the four scales) and that the scales possessed good temporal (test-retest) reliability (r's ranged from .84 to .91). A second regression study involving 86 undergraduate students found that taking weather risks was associated with having taken similar risks in one's past and with the personality trait of sensation-seeking. Being more attentive to the weather and perceiving its risks when it became extreme was associated with lower likelihoods of taking weather risks (overall regression model, R2adj = 0.60). A third study involving 334 people examined the contributions of weather risk perceptions and risk-taking in modeling the self-reported likelihood of complying with a recommended evacuation ahead of a hurricane. Here, higher perceptions of hurricane risks and lower perceived benefits of risk-taking, along with fear of severe weather and hurricane personal self-efficacy ratings, were all statistically significant contributors to the likelihood of evacuating ahead of a hurricane. Psychological rootedness and attachment to one's home also tend to predict lack of evacuation. This research highlights the...

  8. Maximum likelihood positioning algorithm for high-resolution PET scanners

    International Nuclear Information System (INIS)

    Gross-Weege, Nicolas; Schug, David; Hallen, Patrick; Schulz, Volkmar

    2016-01-01

    Purpose: In high-resolution positron emission tomography (PET), lightsharing elements are incorporated into typical detector stacks to read out scintillator arrays in which one scintillator element (crystal) is smaller than the size of the readout channel. In order to identify the hit crystal by means of the measured light distribution, a positioning algorithm is required. One commonly applied positioning algorithm uses the center of gravity (COG) of the measured light distribution. The COG algorithm is limited in spatial resolution by noise and intercrystal Compton scatter. The purpose of this work is to develop a positioning algorithm which overcomes this limitation. Methods: The authors present a maximum likelihood (ML) algorithm which compares a set of expected light distributions given by probability density functions (PDFs) with the measured light distribution. Instead of modeling the PDFs by using an analytical model, the PDFs of the proposed ML algorithm are generated assuming a single-gamma-interaction model from measured data. The algorithm was evaluated with a hot-rod phantom measurement acquired with the preclinical HYPERION II D PET scanner. In order to assess the performance with respect to sensitivity, energy resolution, and image quality, the ML algorithm was compared to a COG algorithm which calculates the COG from a restricted set of channels. The authors studied the energy resolution of the ML and the COG algorithm regarding incomplete light distributions (missing channel information caused by detector dead time). Furthermore, the authors investigated the effects of using a filter based on the likelihood values on sensitivity, energy resolution, and image quality. Results: A sensitivity gain of up to 19% was demonstrated in comparison to the COG algorithm for the selected operation parameters. Energy resolution and image quality were on a similar level for both algorithms. Additionally, the authors demonstrated that the performance of the ML
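
    A minimal sketch of the decision rule described above (names and the per-channel Gaussian summaries are our simplifications; the paper builds its PDFs empirically from measured data under a single-gamma-interaction assumption):

```python
import numpy as np

# pdf_mu, pdf_sigma: arrays of shape (n_crystals, n_channels) summarizing
# calibration light distributions per crystal; Gaussian per-channel PDFs
# stand in here for the paper's empirically generated PDFs.

def ml_position(light, pdf_mu, pdf_sigma, ll_threshold=None):
    """Return the most likely hit crystal for a measured light distribution,
    or -1 if the best log-likelihood falls below the filter threshold."""
    ll = -0.5 * np.sum(((light - pdf_mu) / pdf_sigma) ** 2
                       + np.log(2.0 * np.pi * pdf_sigma ** 2), axis=1)
    best = int(np.argmax(ll))
    if ll_threshold is not None and ll[best] < ll_threshold:
        return -1  # reject ambiguous event (likelihood-based filtering)
    return best
```

    The final thresholding step corresponds to the likelihood-value filter whose effect on sensitivity, energy resolution, and image quality the authors investigate.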

  9. The asymptotic behaviour of the maximum likelihood function of Kriging approximations using the Gaussian correlation function

    CSIR Research Space (South Africa)

    Kok, S

    2012-07-01

    ......continuously as the correlation function hyper-parameters approach zero. Since the global minimizer of the maximum likelihood function is an asymptote in this case, it is unclear if maximum likelihood estimation (MLE) remains valid. Numerical ill...

  10. A Non-standard Empirical Likelihood for Time Series

    DEFF Research Database (Denmark)

    Nordman, Daniel J.; Bunzel, Helle; Lahiri, Soumendra N.

    Standard blockwise empirical likelihood (BEL) for stationary, weakly dependent time series requires specifying a fixed block length as a tuning parameter for setting confidence regions. This aspect can be difficult and impacts coverage accuracy. As an alternative, this paper proposes a new version...... of BEL based on a simple, though non-standard, data-blocking rule which uses a data block of every possible length. Consequently, the method involves no block selection and is also anticipated to exhibit better coverage performance. Its non-standard blocking scheme, however, induces non-standard asymptotics and requires a significantly different development compared to standard BEL. We establish the large-sample distribution of log-ratio statistics from the new BEL method for calibrating confidence regions for mean or smooth function parameters of time series. This limit law is not the usual chi...

  11. Neutron spectra unfolding with maximum entropy and maximum likelihood

    International Nuclear Information System (INIS)

    Itoh, Shikoh; Tsunoda, Toshiharu

    1989-01-01

    A new unfolding theory has been established on the basis of the maximum entropy principle and the maximum likelihood method. This theory correctly embodies the Poisson statistics of neutron detection, and always yields a positive solution over the whole energy range. Moreover, the theory unifies the overdetermined and underdetermined problems. For the latter, the ambiguity in assigning a prior probability, i.e. the initial guess in the Bayesian sense, is eliminated by virtue of the principle. An approximate expression of the covariance matrix for the resultant spectra is also presented. An efficient algorithm to solve the nonlinear system which appears in the present study has been established. Results of computer simulation showed the effectiveness of the present theory. (author)
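
    One common way to combine the two principles (a sketch in our notation; the paper's exact functional may differ) is to maximize a Poisson log-likelihood of the measured counts plus an entropy term on the spectrum:

$$
\Phi(\boldsymbol{\phi}) = \sum_{i}\left[c_i \ln \mu_i - \mu_i\right] - \alpha \sum_{j} \phi_j \ln\frac{\phi_j}{\phi_j^{0}},
\qquad \mu_i = \sum_j R_{ij}\,\phi_j,
$$

    where c_i are the measured counts, R_{ij} is the detector response, φ_j is the unfolded flux in energy bin j, φ_j^0 is the prior guess, and α weights the entropic prior. Because the entropy term is defined only for positive φ_j, the solution stays positive over the whole energy range, matching the behavior claimed above.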

  12. Narrow band interference cancelation in OFDM: A structured maximum likelihood approach

    KAUST Repository

    Sohail, Muhammad Sadiq

    2012-06-01

    This paper presents a maximum likelihood (ML) approach to mitigate the effect of narrow band interference (NBI) in a zero-padded orthogonal frequency division multiplexing (ZP-OFDM) system. The NBI is assumed to be time-variant and asynchronous with the frequency grid of the ZP-OFDM system. The proposed structure-based technique uses the fact that the NBI signal is sparse compared to the ZP-OFDM signal in the frequency domain. The structure is also useful in reducing the computational complexity of the proposed method. The paper also presents a data-aided approach for improved NBI estimation. The suitability of the proposed method is demonstrated through simulations. © 2012 IEEE.

  13. Calibration of two complex ecosystem models with different likelihood functions

    Science.gov (United States)

    Hidy, Dóra; Haszpra, László; Pintér, Krisztina; Nagy, Zoltán; Barcza, Zoltán

    2014-05-01

    The biosphere is a sensitive carbon reservoir. Terrestrial ecosystems were approximately carbon neutral during the past centuries, but they became net carbon sinks due to climate-change-induced environmental change and the associated CO2 fertilization effect of the atmosphere. Model studies and measurements indicate that the biospheric carbon sink can saturate in the future due to ongoing climate change, which can act as a positive feedback. Robustness of carbon cycle models is a key issue when trying to choose the appropriate model for decision support. The input parameters of process-based models are decisive for the model output. At the same time, there are several input parameters for which accurate values are hard to obtain directly from experiments, or for which no local measurements are available. Due to the uncertainty associated with the unknown model parameters, significant bias can arise if the model is used to simulate the carbon and nitrogen cycle components of different ecosystems. In order to improve model performance, the unknown model parameters have to be estimated. We developed a multi-objective, two-step calibration method based on a Bayesian approach in order to estimate the unknown parameters of the PaSim and Biome-BGC models. Biome-BGC and PaSim are widely used biogeochemical models that simulate the storage and flux of water, carbon, and nitrogen between the ecosystem and the atmosphere, and within the components of terrestrial ecosystems (in this research a further developed version of Biome-BGC is used, referred to as BBGC MuSo). Both models were calibrated regardless of the simulated processes and the type of model parameters. The calibration procedure is based on the comparison of measured data with simulated results via a likelihood function (the degree of goodness-of-fit between simulated and measured data). In our research, different likelihood function formulations were used in order to examine the effect of the different model...

  14. Preliminary attempt on maximum likelihood tomosynthesis reconstruction of DEI data

    International Nuclear Information System (INIS)

    Wang Zhentian; Huang Zhifeng; Zhang Li; Kang Kejun; Chen Zhiqiang; Zhu Peiping

    2009-01-01

    Tomosynthesis is a three-dimensional reconstruction method that can remove the effect of superimposition with limited-angle projections. It is especially promising in mammography, where radiation dose is a concern. In this paper, we propose a maximum likelihood tomosynthesis reconstruction algorithm (ML-TS) for the apparent absorption data of diffraction enhanced imaging (DEI). The motivation of this contribution is to develop a tomosynthesis algorithm for low-dose or noisy circumstances and bring DEI closer to clinical application. The theoretical statistical models of DEI data are analyzed and the proposed algorithm is validated with experimental data from the Beijing Synchrotron Radiation Facility (BSRF). The results of ML-TS have better contrast compared with the well-known 'shift-and-add' algorithm and the FBP algorithm. (authors)

  15. H.264 SVC Complexity Reduction Based on Likelihood Mode Decision

    Directory of Open Access Journals (Sweden)

    L. Balaji

    2015-01-01

    H.264 Advanced Video Coding (AVC) was extended to Scalable Video Coding (SVC). SVC runs on different electronic devices such as personal computers, HDTV, SDTV, IPTV, and full-HDTV, for which users demand various scalings of the same content: resolution, frame rate, quality, heterogeneous networks, bandwidth, and so forth. Scaling consumes more encoding time and computational complexity during mode selection. In this paper, to reduce encoding time and computational complexity, a fast mode decision algorithm based on likelihood mode decision (LMD) is proposed. LMD is evaluated in both temporal and spatial scaling. From the results, we conclude that LMD performs well when compared to previous fast mode decision algorithms. The comparison parameters are time, PSNR, and bit rate. LMD achieves a time saving of 66.65% with a 0.05% loss in PSNR and a 0.17% increase in bit rate compared with the full-search method.

  17. Likelihood Approximation With Hierarchical Matrices For Large Spatial Datasets

    KAUST Repository

    Litvinenko, Alexander

    2017-09-03

    We use available measurements to estimate the unknown parameters (variance, smoothness parameter, and covariance length) of a covariance function by maximizing the joint Gaussian log-likelihood function. To overcome cubic complexity in the linear algebra, we approximate the discretized covariance function in the hierarchical (H-) matrix format. The H-matrix format has a log-linear computational cost and storage O(kn log n), where the rank k is a small integer and n is the number of locations. The H-matrix technique allows us to work with general covariance matrices in an efficient way, since H-matrices can approximate inhomogeneous covariance functions, with a fairly general mesh that is not necessarily axes-parallel, and neither the covariance matrix itself nor its inverse have to be sparse. We demonstrate our method with Monte Carlo simulations and an application to soil moisture data. The C, C++ codes and data are freely available.

  18. Music genre classification via likelihood fusion from multiple feature models

    Science.gov (United States)

    Shiu, Yu; Kuo, C.-C. J.

    2005-01-01

    Music genre provides an efficient way to index songs in a music database, and can be used as an effective means to retrieve music of a similar type, i.e. content-based music retrieval. A new two-stage scheme for music genre classification is proposed in this work. At the first stage, we examine a couple of different features, construct their corresponding parametric models (e.g. GMM and HMM) and compute their likelihood functions to yield soft classification results. In particular, the timbre, rhythm and temporal variation features are considered. Then, at the second stage, these soft classification results are integrated to produce a hard decision for final music genre classification. Experimental results are given to demonstrate the performance of the proposed scheme.
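
    A minimal sketch of the two-stage scheme (library choice and names are ours; the paper uses both GMMs and HMMs, but only GMMs are shown here): stage one fits one model per genre and per feature type, stage two sums the per-feature log-likelihoods and takes the argmax as the hard decision:

```python
from sklearn.mixture import GaussianMixture

# train[genre][feature] -> array (n_frames, dim) of training frames.
def fit_models(train, n_components=8):
    return {genre: {feat: GaussianMixture(n_components).fit(X)
                    for feat, X in feats.items()}
            for genre, feats in train.items()}

# song[feature] -> array (n_frames, dim); score() is the per-sample
# average log-likelihood, so summing over features fuses the models.
def classify(models, song):
    scores = {genre: sum(gmms[feat].score(X) for feat, X in song.items())
              for genre, gmms in models.items()}
    return max(scores, key=scores.get)  # hard decision at stage two
```

    Summing log-likelihoods treats the per-feature models as independent evidence sources, which is the role the paper's fusion stage plays.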

  19. Marginal Maximum Likelihood Estimation of Item Response Models in R

    Directory of Open Access Journals (Sweden)

    Matthew S. Johnson

    2007-02-01

    Item response theory (IRT) models are a class of statistical models used by researchers to describe the response behaviors of individuals to a set of categorically scored items. The most common IRT models can be classified as generalized linear fixed- and/or mixed-effect models. Although IRT models appear most often in the psychological testing literature, researchers in other fields have successfully utilized IRT-like models in a wide variety of applications. This paper discusses the three major methods of estimation in IRT and develops R functions utilizing the built-in capabilities of the R environment to find the marginal maximum likelihood estimates of the generalized partial credit model. The currently available R package ltm is also discussed.

  20. Maximum likelihood estimation of phase-type distributions

    DEFF Research Database (Denmark)

    Esparza, Luz Judith R

    This work is concerned with the statistical inference of phase-type distributions and the analysis of distributions with rational Laplace transform, known as matrix-exponential distributions. The thesis is focused on the estimation of the maximum likelihood parameters of phase-type distributions for both univariate and multivariate cases. Methods like the EM algorithm and Markov chain Monte Carlo are applied for this purpose. Furthermore, this thesis provides explicit formulae for computing the Fisher information matrix for discrete and continuous phase-type distributions, which is needed to find confidence regions for their estimated parameters. Finally, a new general class of distributions, called bilateral matrix-exponential distributions, is defined. These distributions have the entire real line as domain and can be used, for instance, for modelling. In addition, this class of distributions...

  1. The elaboration likelihood model and communication about food risks.

    Science.gov (United States)

    Frewer, L J; Howard, C; Hedderley, D; Shepherd, R

    1997-12-01

    Factors such as hazard type and source credibility have been identified as important in the establishment of effective strategies for risk communication. The elaboration likelihood model was adapted to investigate the potential impact of hazard type, information source, and persuasive content of information on individual engagement in elaborative, or thoughtful, cognitions about risk messages. One hundred sixty respondents were allocated to one of eight experimental groups, and the effects of source credibility, persuasive content of information, and hazard type were systematically varied. The impact of the different factors on beliefs about the information and on elaborative processing was examined. Low credibility was particularly important in reducing risk perceptions, although persuasive content and hazard type were also influential in determining whether elaborative processing occurred.

  2. Maximum Likelihood Blood Velocity Estimator Incorporating Properties of Flow Physics

    DEFF Research Database (Denmark)

    Schlaikjer, Malene; Jensen, Jørgen Arendt

    2004-01-01

    ......)-data under investigation. The flow physics properties are exploited in the second term, as the range of velocity values investigated in the cross-correlation analysis is compared to the velocity estimates in the temporal and spatial neighborhood of the signal segment under investigation. The new estimator...... has been compared to the cross-correlation (CC) estimator and the previously developed maximum likelihood estimator (MLE). The results show that the CMLE can handle a larger velocity search range and is capable of estimating even low velocity levels from tissue motion. The CC and the MLE produce...... for the CC and the MLE. When the velocity search range is set to twice the limit of the CC and the MLE, the number of incorrect velocity estimates is 0, 19.1, and 7.2% for the CMLE, CC, and MLE, respectively. The ability to handle a larger search range and to estimate low velocity levels was confirmed...

  3. Accelerated maximum likelihood parameter estimation for stochastic biochemical systems

    Directory of Open Access Journals (Sweden)

    Daigle Bernie J

    2012-05-01

    Background: A prerequisite for the mechanistic simulation of a biochemical system is detailed knowledge of its kinetic parameters. Despite recent experimental advances, the estimation of unknown parameter values from observed data is still a bottleneck for obtaining accurate simulation results. Many methods exist for parameter estimation in deterministic biochemical systems; methods for discrete stochastic systems are less well developed. Given the probabilistic nature of stochastic biochemical models, a natural approach is to choose parameter values that maximize the probability of the observed data with respect to the unknown parameters, a.k.a. the maximum likelihood parameter estimates (MLEs). MLE computation for all but the simplest models requires the simulation of many system trajectories that are consistent with experimental data. For models with unknown parameters, this presents a computational challenge, as the generation of consistent trajectories can be an extremely rare occurrence. Results: We have developed Monte Carlo Expectation-Maximization with Modified Cross-Entropy Method (MCEM2): an accelerated method for calculating MLEs that combines advances in rare event simulation with a computationally efficient version of the Monte Carlo expectation-maximization (MCEM) algorithm. Our method requires no prior knowledge regarding parameter values, and it automatically provides a multivariate parameter uncertainty estimate. We applied the method to five stochastic systems of increasing complexity, progressing from an analytically tractable pure-birth model to a computationally demanding model of yeast polarization. Our results demonstrate that MCEM2 substantially accelerates MLE computation on all tested models when compared to a stand-alone version of MCEM. Additionally, we show how our method identifies parameter values for certain classes of models more accurately than two recently proposed computationally efficient methods...

  4. CONSTRUCTING A FLEXIBLE LIKELIHOOD FUNCTION FOR SPECTROSCOPIC INFERENCE

    International Nuclear Information System (INIS)

    Czekala, Ian; Andrews, Sean M.; Mandel, Kaisey S.; Green, Gregory M.; Hogg, David W.

    2015-01-01

    We present a modular, extensible likelihood framework for spectroscopic inference based on synthetic model spectra. The subtraction of an imperfect model from a continuously sampled spectrum introduces covariance between adjacent datapoints (pixels) into the residual spectrum. For the high signal-to-noise data with large spectral range that is commonly employed in stellar astrophysics, that covariant structure can lead to dramatically underestimated parameter uncertainties (and, in some cases, biases). We construct a likelihood function that accounts for the structure of the covariance matrix, utilizing the machinery of Gaussian process kernels. This framework specifically addresses the common problem of mismatches in model spectral line strengths (with respect to data) due to intrinsic model imperfections (e.g., in the atomic/molecular databases or opacity prescriptions) by developing a novel local covariance kernel formalism that identifies and self-consistently downweights pathological spectral line “outliers.” By fitting many spectra in a hierarchical manner, these local kernels provide a mechanism to learn about and build data-driven corrections to synthetic spectral libraries. An open-source software implementation of this approach is available at http://iancze.github.io/Starfish, including a sophisticated probabilistic scheme for spectral interpolation when using model libraries that are sparsely sampled in the stellar parameters. We demonstrate some salient features of the framework by fitting the high-resolution V-band spectrum of WASP-14, an F5 dwarf with a transiting exoplanet, and the moderate-resolution K-band spectrum of Gliese 51, an M5 field dwarf

  5. Likelihood of illegal alcohol sales at professional sport stadiums.

    Science.gov (United States)

    Toomey, Traci L; Erickson, Darin J; Lenk, Kathleen M; Kilian, Gunna R

    2008-11-01

    Several studies have assessed the propensity for illegal alcohol sales at licensed alcohol establishments and community festivals, but no previous studies examined the propensity for these sales at professional sport stadiums. In this study, we assessed the likelihood of alcohol sales to both underage youth and obviously intoxicated patrons at professional sports stadiums across the United States, and assessed the factors related to likelihood of both types of alcohol sales. We conducted pseudo-underage (i.e., persons age 21 or older who appear under 21) and pseudo-intoxicated (i.e., persons feigning intoxication) alcohol purchase attempts at stadiums that house professional hockey, basketball, baseball, and football teams. We conducted the purchase attempts at 16 sport stadiums located in 5 states. We measured 2 outcome variables: pseudo-underage sale (yes, no) and pseudo-intoxicated sale (yes, no), and 3 types of independent variables: (1) seller characteristics, (2) purchase attempt characteristics, and (3) event characteristics. Following univariate and bivariate analyses, we fitted a separate series of logistic generalized mixed regression models for each outcome variable. The overall sales rates to the pseudo-underage and pseudo-intoxicated buyers were 18% and 74%, respectively. In the multivariate logistic analyses, we found that the odds of a sale to a pseudo-underage buyer in the stands were 2.9 times as large as the odds of a sale at the concession booths (30% vs. 13%; p = 0.01). The odds of a sale to an obviously intoxicated buyer in the stands were 2.9 times as large as the odds of a sale at the concession booths (89% vs. 73%; p = 0.02). Similar to studies assessing illegal alcohol sales at licensed alcohol establishments and community festivals, findings from this study show the need for interventions specifically focused on illegal alcohol sales at professional sporting events.

  6. Targeted maximum likelihood estimation for a binary treatment: A tutorial.

    Science.gov (United States)

    Luque-Fernandez, Miguel Angel; Schomaker, Michael; Rachet, Bernard; Schnitzer, Mireille E

    2018-04-23

    When estimating the average effect of a binary treatment (or exposure) on an outcome, methods that incorporate propensity scores, the G-formula, or targeted maximum likelihood estimation (TMLE) are preferred over naïve regression approaches, which are biased under misspecification of a parametric outcome model. In contrast, propensity score methods require the correct specification of an exposure model. Double-robust methods only require correct specification of either the outcome or the exposure model. Targeted maximum likelihood estimation is a semiparametric double-robust method that improves the chances of correct model specification by allowing for flexible estimation using (nonparametric) machine-learning methods. It therefore requires weaker assumptions than its competitors. We provide a step-by-step guided implementation of TMLE and illustrate it in a realistic scenario based on cancer epidemiology where assumptions about correct model specification and positivity (i.e., when a study participant had zero probability of receiving the treatment) are nearly violated. This article provides a concise and reproducible educational introduction to TMLE for a binary outcome and exposure. The reader should gain sufficient understanding of TMLE from this introductory tutorial to be able to apply the method in practice. Extensive R-code is provided in easy-to-read boxes throughout the article for replicability. Stata users will find a testing implementation of TMLE and additional material in the Appendix S1 and at the following GitHub repository: https://github.com/migariane/SIM-TMLE-tutorial. © 2018 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd.
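
    The targeting step that distinguishes TMLE from a plain outcome regression is compact enough to sketch. The following is a minimal illustration in Python on simulated data, not the tutorial's R/Stata implementation; the data-generating process and the use of plain logistic regressions (rather than the flexible machine-learning fits the method allows) are simplifying assumptions.

      # Minimal TMLE sketch for the ATE of a binary treatment A on a binary
      # outcome Y given confounders W (illustrative only; the authors' own
      # R code is at their GitHub repository).
      import numpy as np
      from scipy.special import logit, expit
      from sklearn.linear_model import LogisticRegression
      import statsmodels.api as sm

      rng = np.random.default_rng(0)
      n = 5000
      W = rng.normal(size=(n, 2))
      A = rng.binomial(1, expit(0.4 * W[:, 0] - 0.3 * W[:, 1]))
      Y = rng.binomial(1, expit(-1.0 + A + 0.5 * W[:, 0]))

      X = np.column_stack([A, W])
      # Step 1: initial outcome regression Q(A, W).
      Q_fit = LogisticRegression().fit(X, Y)
      Q_A = Q_fit.predict_proba(X)[:, 1]
      Q_1 = Q_fit.predict_proba(np.column_stack([np.ones(n), W]))[:, 1]
      Q_0 = Q_fit.predict_proba(np.column_stack([np.zeros(n), W]))[:, 1]

      # Step 2: propensity score g(W) and the "clever covariate".
      g = LogisticRegression().fit(W, A).predict_proba(W)[:, 1]
      H = A / g - (1 - A) / (1 - g)

      # Step 3: fluctuation, a logistic regression of Y on H with offset logit(Q).
      eps = sm.GLM(Y, H.reshape(-1, 1), family=sm.families.Binomial(),
                   offset=logit(Q_A)).fit().params[0]

      # Step 4: targeted update of the counterfactual predictions, then the ATE.
      Q_1_star = expit(logit(Q_1) + eps / g)
      Q_0_star = expit(logit(Q_0) - eps / (1 - g))
      print(f"TMLE ATE estimate: {np.mean(Q_1_star - Q_0_star):.3f}")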

  7. Number of Siblings During Childhood and the Likelihood of Divorce in Adulthood.

    Science.gov (United States)

    Bobbitt-Zeher, Donna; Downey, Douglas B; Merry, Joseph

    2016-11-01

    Despite fertility decline across economically developed countries, relatively little is known about the social consequences of children being raised with fewer siblings. Much research suggests that growing up with fewer siblings is probably positive, as children tend to do better in school when sibship size is small. Less scholarship, however, has explored how growing up with few siblings influences children's ability to get along with peers and develop long-term meaningful relationships. If siblings serve as important social practice partners during childhood, individuals with few or no siblings may struggle to develop successful social lives later in adulthood. With data from the General Social Surveys 1972-2012, we explore this possibility by testing whether sibship size during childhood predicts the probability of divorce in adulthood. We find that, among those who ever marry, each additional sibling is associated with a three percent decline in the likelihood of divorce, net of covariates.

  8. Parametric Roll Resonance Detection using Phase Correlation and Log-likelihood Testing Techniques

    DEFF Research Database (Denmark)

    Galeazzi, Roberto; Blanke, Mogens; Poulsen, Niels Kjølstad

    2009-01-01

    generation warning system the purpose of which is to provide the master with an onboard system able to trigger an alarm when parametric roll is likely to happen within the immediate future. A detection scheme is introduced, which is able to issue a warning within five roll periods after a resonant motion......Real-time detection of parametric roll is still an open issue that is gathering an increasing attention. A first generation warning systems, based on guidelines and polar diagrams, showed their potential to face issues like long-term prediction and risk assessment. This paper presents a second...... started. After having determined statistical properties of the signals at hand, a detector based on the generalised log-likelihood ratio test (GLRT) is designed to look for variation in signal power. The ability of the detector to trigger alarms when parametric roll is going to onset is evaluated on two...
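
    The core of the detector described above, a generalized log-likelihood ratio test for a rise in signal power, can be sketched briefly. The following is an illustrative Python toy, not the authors' detector: the zero-mean Gaussian roll-motion model, window size and alarm threshold are assumptions made for the example.

      # Hedged sketch of a GLRT for a jump in signal power; window sizes
      # and the alarm threshold are illustrative, not from the paper.
      import numpy as np

      def glrt_power_change(x, sigma0_sq):
          """Log generalized likelihood ratio for H1: var > sigma0_sq
          against H0: var = sigma0_sq, for a zero-mean Gaussian window x."""
          n = len(x)
          sigma1_sq = max(np.mean(x ** 2), sigma0_sq)  # one-sided MLE
          return 0.5 * n * (np.log(sigma0_sq / sigma1_sq)
                            + np.mean(x ** 2) * (1.0 / sigma0_sq - 1.0 / sigma1_sq))

      rng = np.random.default_rng(1)
      quiet = rng.normal(0, 1.0, 200)      # ordinary roll motion
      resonant = rng.normal(0, 3.0, 200)   # power rises at resonance onset
      signal = np.concatenate([quiet, resonant])

      window, threshold = 50, 10.0
      for start in range(0, len(signal) - window, window):
          g = glrt_power_change(signal[start:start + window], sigma0_sq=1.0)
          if g > threshold:
              print(f"alarm at sample {start} (GLRT = {g:.1f})")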

  9. Maximum-likelihood estimation of the hyperbolic parameters from grouped observations

    DEFF Research Database (Denmark)

    Jensen, Jens Ledet

    1988-01-01

    a least-squares problem. The second procedure Hypesti first approaches the maximum-likelihood estimate by iterating in the profile-log likelihood function for the scale parameter. Close to the maximum of the likelihood function, the estimation is brought to an end by iteration, using all four parameters...

  10. A short proof that phylogenetic tree reconstruction by maximum likelihood is hard.

    Science.gov (United States)

    Roch, Sebastien

    2006-01-01

    Maximum likelihood is one of the most widely used techniques to infer evolutionary histories. Although it is thought to be intractable, a proof of its hardness has been lacking. Here, we give a short proof that computing the maximum likelihood tree is NP-hard by exploiting a connection between likelihood and parsimony observed by Tuffley and Steel.

  11. A Short Proof that Phylogenetic Tree Reconstruction by Maximum Likelihood is Hard

    OpenAIRE

    Roch, S.

    2005-01-01

    Maximum likelihood is one of the most widely used techniques to infer evolutionary histories. Although it is thought to be intractable, a proof of its hardness has been lacking. Here, we give a short proof that computing the maximum likelihood tree is NP-hard by exploiting a connection between likelihood and parsimony observed by Tuffley and Steel.

  12. Can surgical oncologists reliably predict the likelihood for non-SLN metastases in breast cancer patients?

    NARCIS (Netherlands)

    Smidt, M.L.; Strobbe, L.J.; Groenewoud, J.M.M.; Wilt, G.J. van der; Zee, K.J. van; Wobbes, Th.

    2007-01-01

    BACKGROUND: In approximately 40% of the breast cancer patients with sentinel lymph node (SLN) metastases, additional nodal metastases are detected in the completion axillary lymph node dissection (cALND). The MSKCC nomogram can help to quantify a patient's individual risk for non-SLN metastases with

  13. The incidence of urea cycle disorders

    OpenAIRE

    Summar, Marshall L.; Koelker, Stefan; Freedenberg, Debra; Le Mons, Cynthia; Haberle, Johannes; Lee, Hye-Seung; Kirmse, Brian

    2013-01-01

    A key question for urea cycle disorders is their incidence. In the United States, two UCDs, argininosuccinate synthetase and lyase deficiency, are currently detected by newborn screening. We used newborn screening data on over 6 million births and data from the large US and European longitudinal registries to determine how common these conditions are. The incidence for the United States is predicted to be 1 urea cycle disorder patient for every 35,000 births, presenting about 113 new patients per ...

  14. Analysis of hourly crash likelihood using unbalanced panel data mixed logit model and real-time driving environmental big data.

    Science.gov (United States)

    Chen, Feng; Chen, Suren; Ma, Xiaoxiang

    2018-06-01

    Driving environment, including road surface conditions and traffic states, often changes over time and influences crash probability considerably. Traditional crash frequency models, developed on large temporal scales, struggle to capture the time-varying characteristics of these factors, which may cause substantial loss of critical driving-environment information for crash prediction. Crash prediction models with refined temporal data (hourly records) are developed to characterize the time-varying nature of these contributing factors. Unbalanced panel data mixed logit models are developed to analyze hourly crash likelihood of highway segments. The refined temporal driving environmental data, including road surface and traffic condition, obtained from the Road Weather Information System (RWIS), are incorporated into the models. Model estimation results indicate that the traffic speed, traffic volume, curvature and chemically wet road surface indicator are better modeled as random parameters. The estimation results of the mixed logit models based on unbalanced panel data show that there are a number of factors related to crash likelihood on I-25. Specifically, weekend indicator, November indicator, low speed limit and long remaining service life of rutting indicator are found to increase crash likelihood, while 5-am indicator and number of merging ramps per lane per mile are found to decrease crash likelihood. The study underscores and confirms the unique and significant impacts on crash imposed by the real-time weather, road surface, and traffic conditions. With the unbalanced panel data structure, the rich information from real-time driving environmental big data can be well incorporated. Copyright © 2018 National Safety Council and Elsevier Ltd. All rights reserved.

  15. Planck 2013 results. XV. CMB power spectra and likelihood

    Science.gov (United States)

    Planck Collaboration; Ade, P. A. R.; Aghanim, N.; Armitage-Caplan, C.; Arnaud, M.; Ashdown, M.; Atrio-Barandela, F.; Aumont, J.; Baccigalupi, C.; Banday, A. J.; Barreiro, R. B.; Bartlett, J. G.; Battaner, E.; Benabed, K.; Benoît, A.; Benoit-Lévy, A.; Bernard, J.-P.; Bersanelli, M.; Bielewicz, P.; Bobin, J.; Bock, J. J.; Bonaldi, A.; Bonavera, L.; Bond, J. R.; Borrill, J.; Bouchet, F. R.; Boulanger, F.; Bridges, M.; Bucher, M.; Burigana, C.; Butler, R. C.; Calabrese, E.; Cardoso, J.-F.; Catalano, A.; Challinor, A.; Chamballu, A.; Chiang, H. C.; Chiang, L.-Y.; Christensen, P. R.; Church, S.; Clements, D. L.; Colombi, S.; Colombo, L. P. L.; Combet, C.; Couchot, F.; Coulais, A.; Crill, B. P.; Curto, A.; Cuttaia, F.; Danese, L.; Davies, R. D.; Davis, R. J.; de Bernardis, P.; de Rosa, A.; de Zotti, G.; Delabrouille, J.; Delouis, J.-M.; Désert, F.-X.; Dickinson, C.; Diego, J. M.; Dole, H.; Donzelli, S.; Doré, O.; Douspis, M.; Dunkley, J.; Dupac, X.; Efstathiou, G.; Elsner, F.; Enßlin, T. A.; Eriksen, H. K.; Finelli, F.; Forni, O.; Frailis, M.; Fraisse, A. A.; Franceschi, E.; Gaier, T. C.; Galeotta, S.; Galli, S.; Ganga, K.; Giard, M.; Giardino, G.; Giraud-Héraud, Y.; Gjerløw, E.; González-Nuevo, J.; Górski, K. M.; Gratton, S.; Gregorio, A.; Gruppuso, A.; Gudmundsson, J. E.; Hansen, F. K.; Hanson, D.; Harrison, D.; Helou, G.; Henrot-Versillé, S.; Hernández-Monteagudo, C.; Herranz, D.; Hildebrandt, S. R.; Hivon, E.; Hobson, M.; Holmes, W. A.; Hornstrup, A.; Hovest, W.; Huffenberger, K. M.; Hurier, G.; Jaffe, A. H.; Jaffe, T. R.; Jewell, J.; Jones, W. C.; Juvela, M.; Keihänen, E.; Keskitalo, R.; Kiiveri, K.; Kisner, T. S.; Kneissl, R.; Knoche, J.; Knox, L.; Kunz, M.; Kurki-Suonio, H.; Lagache, G.; Lähteenmäki, A.; Lamarre, J.-M.; Lasenby, A.; Lattanzi, M.; Laureijs, R. J.; Lawrence, C. R.; Le Jeune, M.; Leach, S.; Leahy, J. P.; Leonardi, R.; León-Tavares, J.; Lesgourgues, J.; Liguori, M.; Lilje, P. B.; Linden-Vørnle, M.; Lindholm, V.; López-Caniego, M.; Lubin, P. M.; Macías-Pérez, J. F.; Maffei, B.; Maino, D.; Mandolesi, N.; Marinucci, D.; Maris, M.; Marshall, D. J.; Martin, P. G.; Martínez-González, E.; Masi, S.; Massardi, M.; Matarrese, S.; Matthai, F.; Mazzotta, P.; Meinhold, P. R.; Melchiorri, A.; Mendes, L.; Menegoni, E.; Mennella, A.; Migliaccio, M.; Millea, M.; Mitra, S.; Miville-Deschênes, M.-A.; Molinari, D.; Moneti, A.; Montier, L.; Morgante, G.; Mortlock, D.; Moss, A.; Munshi, D.; Murphy, J. A.; Naselsky, P.; Nati, F.; Natoli, P.; Netterfield, C. B.; Nørgaard-Nielsen, H. U.; Noviello, F.; Novikov, D.; Novikov, I.; O'Dwyer, I. J.; Orieux, F.; Osborne, S.; Oxborrow, C. A.; Paci, F.; Pagano, L.; Pajot, F.; Paladini, R.; Paoletti, D.; Partridge, B.; Pasian, F.; Patanchon, G.; Paykari, P.; Perdereau, O.; Perotto, L.; Perrotta, F.; Piacentini, F.; Piat, M.; Pierpaoli, E.; Pietrobon, D.; Plaszczynski, S.; Pointecouteau, E.; Polenta, G.; Ponthieu, N.; Popa, L.; Poutanen, T.; Pratt, G. W.; Prézeau, G.; Prunet, S.; Puget, J.-L.; Rachen, J. P.; Rahlin, A.; Rebolo, R.; Reinecke, M.; Remazeilles, M.; Renault, C.; Ricciardi, S.; Riller, T.; Ringeval, C.; Ristorcelli, I.; Rocha, G.; Rosset, C.; Roudier, G.; Rowan-Robinson, M.; Rubiño-Martín, J. A.; Rusholme, B.; Sandri, M.; Sanselme, L.; Santos, D.; Savini, G.; Scott, D.; Seiffert, M. D.; Shellard, E. P. S.; Spencer, L. D.; Starck, J.-L.; Stolyarov, V.; Stompor, R.; Sudiwala, R.; Sureau, F.; Sutton, D.; Suur-Uski, A.-S.; Sygnet, J.-F.; Tauber, J. 
A.; Tavagnacco, D.; Terenzi, L.; Toffolatti, L.; Tomasi, M.; Tristram, M.; Tucci, M.; Tuovinen, J.; Türler, M.; Valenziano, L.; Valiviita, J.; Van Tent, B.; Varis, J.; Vielva, P.; Villa, F.; Vittorio, N.; Wade, L. A.; Wandelt, B. D.; Wehus, I. K.; White, M.; White, S. D. M.; Yvon, D.; Zacchei, A.; Zonca, A.

    2014-11-01

    This paper presents the Planck 2013 likelihood, a complete statistical description of the two-point correlation function of the CMB temperature fluctuations that accounts for all known relevant uncertainties, both instrumental and astrophysical in nature. We use this likelihood to derive our best estimate of the CMB angular power spectrum from Planck over three decades in multipole moment, ℓ, covering 2 ≤ ℓ ≤ 2500. The main source of uncertainty at ℓ ≲ 1500 is cosmic variance. Uncertainties in small-scale foreground modelling and instrumental noise dominate the error budget at higher ℓs. At low ℓ, the likelihood exploits all Planck frequency channels, separating the cosmological CMB signal from diffuse Galactic foregrounds through a physically motivated Bayesian component separation technique; at higher multipoles it uses cross-spectra between frequency channels and assesses the impact of residual foreground and instrumental uncertainties on the final cosmological parameters. We find good internal agreement among the high-ℓ cross-spectra with residuals below a few μK² at ℓ ≲ 1000, in agreement with estimated calibration uncertainties. We compare our results with foreground-cleaned CMB maps derived from all Planck frequencies, as well as with cross-spectra derived from the 70 GHz Planck map, and find broad agreement in terms of spectrum residuals and cosmological parameters. We further show that the best-fit ΛCDM cosmology is in excellent agreement with preliminary Planck EE and TE polarisation spectra. We find that the standard ΛCDM cosmology is well constrained by Planck from the measurements at ℓ ≲ 1500. One specific example is the spectral index of scalar perturbations, for which we report a 5.4σ deviation from scale invariance, ns = 1. Increasing the multipole range beyond ℓ ≃ 1500 does not increase our accuracy for the ΛCDM parameters, but instead allows us to study extensions beyond the standard model. We find no indication of significant departures from the ΛCDM framework. Finally, we report a tension between the Planck best-fit ΛCDM model and the low-ℓ spectrum in the form of a power deficit of 5-10% at ℓ ≲ 40, with a statistical significance of 2.5-3σ. Without a theoretically motivated model for

  16. Likelihood ratio model for classification of forensic evidence

    Energy Technology Data Exchange (ETDEWEB)

    Zadora, G., E-mail: gzadora@ies.krakow.pl [Institute of Forensic Research, Westerplatte 9, 31-033 Krakow (Poland); Neocleous, T., E-mail: tereza@stats.gla.ac.uk [University of Glasgow, Department of Statistics, 15 University Gardens, Glasgow G12 8QW (United Kingdom)

    2009-05-29

    One of the problems of analysis of forensic evidence such as glass fragments, is the determination of their use-type category, e.g. does a glass fragment originate from an unknown window or container? Very small glass fragments arise during various accidents and criminal offences, and could be carried on the clothes, shoes and hair of participants. It is therefore necessary to obtain information on their physicochemical composition in order to solve the classification problem. Scanning Electron Microscopy coupled with an Energy Dispersive X-ray Spectrometer and the Glass Refractive Index Measurement method are routinely used in many forensic institutes for the investigation of glass. A natural form of glass evidence evaluation for forensic purposes is the likelihood ratio, LR = p(E|H1)/p(E|H2). The main aim of this paper was to study the performance of LR models for glass object classification which considered one or two sources of data variability, i.e. between-glass-object variability and(or) within-glass-object variability. Within the proposed model a multivariate kernel density approach was adopted for modelling the between-object distribution and a multivariate normal distribution was adopted for modelling within-object distributions. Moreover, a graphical method of estimating the dependence structure was employed to reduce the highly multivariate problem to several lower-dimensional problems. The performed analysis showed that the best likelihood model was the one which allows to include information about between and within-object variability, and with variables derived from elemental compositions measured by SEM-EDX, and refractive values determined before (RI_b) and after (RI_a) the annealing process, in the form of dRI = log10|RI_a - RI_b|. This model gave better results than the model with only between-object variability considered. In addition, when dRI and variables derived from elemental compositions were used, this

  17. Likelihood ratio model for classification of forensic evidence

    International Nuclear Information System (INIS)

    Zadora, G.; Neocleous, T.

    2009-01-01

    One of the problems of analysis of forensic evidence such as glass fragments, is the determination of their use-type category, e.g. does a glass fragment originate from an unknown window or container? Very small glass fragments arise during various accidents and criminal offences, and could be carried on the clothes, shoes and hair of participants. It is therefore necessary to obtain information on their physicochemical composition in order to solve the classification problem. Scanning Electron Microscopy coupled with an Energy Dispersive X-ray Spectrometer and the Glass Refractive Index Measurement method are routinely used in many forensic institutes for the investigation of glass. A natural form of glass evidence evaluation for forensic purposes is the likelihood ratio, LR = p(E|H1)/p(E|H2). The main aim of this paper was to study the performance of LR models for glass object classification which considered one or two sources of data variability, i.e. between-glass-object variability and(or) within-glass-object variability. Within the proposed model a multivariate kernel density approach was adopted for modelling the between-object distribution and a multivariate normal distribution was adopted for modelling within-object distributions. Moreover, a graphical method of estimating the dependence structure was employed to reduce the highly multivariate problem to several lower-dimensional problems. The performed analysis showed that the best likelihood model was the one which allows to include information about between and within-object variability, and with variables derived from elemental compositions measured by SEM-EDX, and refractive values determined before (RI_b) and after (RI_a) the annealing process, in the form of dRI = log10|RI_a - RI_b|. This model gave better results than the model with only between-object variability considered. In addition, when dRI and variables derived from elemental compositions were used, this model outperformed two other
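
    The evidence-evaluation idea common to both versions of this record, an LR with a numerator density under one use-type hypothesis and a denominator under the other, can be illustrated compactly. This Python sketch uses one-dimensional kernel density estimates on synthetic dRI values and deliberately omits the paper's two-level (between/within-object) structure.

      # Illustrative numerator/denominator construction for a forensic
      # likelihood ratio LR = p(E|H1)/p(E|H2); all data below are synthetic.
      import numpy as np
      from scipy.stats import gaussian_kde

      rng = np.random.default_rng(2)
      # Hypothetical dRI = log10|RI_a - RI_b| measurements for two use-types.
      windows = rng.normal(-4.5, 0.3, 200)     # H1: fragment is window glass
      containers = rng.normal(-3.6, 0.4, 200)  # H2: fragment is container glass

      f_window = gaussian_kde(windows)
      f_container = gaussian_kde(containers)

      evidence = -4.4  # dRI measured on the recovered fragment
      lr = f_window(evidence)[0] / f_container(evidence)[0]
      print(f"LR = {lr:.1f}  (> 1 supports H1, window glass)")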

  18. A simulation study of likelihood inference procedures in rayleigh distribution with censored data

    International Nuclear Information System (INIS)

    Baklizi, S. A.; Baker, H. M.

    2001-01-01

    Inference procedures based on the likelihood function are considered for the one parameter Rayleigh distribution with type 1 and type 2 censored data. Using simulation techniques, the finite sample performances of the maximum likelihood estimator and the large sample likelihood interval estimation procedures based on the Wald, the Rao, and the likelihood ratio statistics are investigated. It appears that the maximum likelihood estimator is unbiased. The approximate variance estimates obtained from the asymptotic normal distribution of the maximum likelihood estimator are accurate under type 2 censored data while they tend to be smaller than the actual variances when considering type 1 censored data of small size. It appears also that interval estimation based on the Wald and Rao statistics needs a much larger sample size than interval estimation based on the likelihood ratio statistic to attain reasonable accuracy. (authors). 15 refs., 4 tabs
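
    For the type 2 censored case discussed above, the Rayleigh maximum likelihood estimate has a closed form obtained by setting the derivative of the censored log-likelihood to zero. The following Python sketch assumes the r smallest of n lifetimes are observed; the data are simulated and the parametrization (s = sigma^2) is chosen for convenience.

      # Sketch of the Rayleigh MLE under type 2 censoring: observe the r
      # smallest of n lifetimes, censor the rest at the largest observation.
      import numpy as np

      def rayleigh_mle_type2(x_obs, n):
          """x_obs: sorted r observed values out of n items on test.
          Rayleigh pdf f(x) = (x / s) * exp(-x^2 / (2 s)), with s = sigma^2.
          Setting the score to zero gives the closed form below."""
          r = len(x_obs)
          return (np.sum(x_obs ** 2) + (n - r) * x_obs[-1] ** 2) / (2 * r)

      rng = np.random.default_rng(3)
      n, r = 100, 60
      sample = np.sort(rng.rayleigh(scale=2.0, size=n))[:r]  # true sigma^2 = 4
      print(f"sigma^2 estimate: {rayleigh_mle_type2(sample, n):.2f}")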

  19. Incident Duration Modeling Using Flexible Parametric Hazard-Based Models

    Directory of Open Access Journals (Sweden)

    Ruimin Li

    2014-01-01

    Full Text Available Assessing and prioritizing the duration time and effects of traffic incidents on major roads present significant challenges for road network managers. This study examines the effect of numerous factors associated with various types of incidents on their duration and proposes an incident duration prediction model. Several parametric accelerated failure time hazard-based models were examined, including Weibull, log-logistic, log-normal, and generalized gamma, as well as all models with gamma heterogeneity and flexible parametric hazard-based models with degrees of freedom ranging from one to ten, by analyzing a traffic incident dataset obtained from the Incident Reporting and Dispatching System in Beijing in 2008. Results show that different factors significantly affect different incident time phases, and the best-fitting distributions differ across phases. Given the best hazard-based models of each incident time phase, the prediction result can be reasonable for most incidents. The results of this study can aid traffic incident management agencies not only in implementing strategies that would reduce incident duration, and thus reduce congestion, secondary incidents, and the associated human and economic losses, but also in effectively predicting incident duration time.
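
    As a concrete illustration of the accelerated failure time family used in the study, the sketch below fits a Weibull AFT model with the lifelines Python package on synthetic incident durations. The covariates, column names and data-generating process are invented for the example and do not come from the Beijing dataset.

      # Minimal Weibull AFT fit on synthetic incident-duration data.
      import numpy as np
      import pandas as pd
      from lifelines import WeibullAFTFitter

      rng = np.random.default_rng(4)
      n = 500
      df = pd.DataFrame({
          "n_vehicles": rng.integers(1, 5, n),   # hypothetical covariates
          "peak_hour": rng.integers(0, 2, n),
      })
      # Durations accelerated by the covariates; all incidents fully observed.
      scale = np.exp(3.0 + 0.2 * df["n_vehicles"] + 0.3 * df["peak_hour"])
      df["duration"] = rng.weibull(1.5, n) * scale
      df["observed"] = 1

      aft = WeibullAFTFitter()
      aft.fit(df, duration_col="duration", event_col="observed")
      aft.print_summary()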

  20. Qualitative release assessment to estimate the likelihood of henipavirus entering the United Kingdom.

    Directory of Open Access Journals (Sweden)

    Emma L Snary

    Full Text Available The genus Henipavirus includes Hendra virus (HeV) and Nipah virus (NiV), for which fruit bats (particularly those of the genus Pteropus) are considered to be the wildlife reservoir. The recognition of henipaviruses occurring across a wider geographic and host range suggests the possibility of the virus entering the United Kingdom (UK). To estimate the likelihood of henipaviruses entering the UK, a qualitative release assessment was undertaken. To facilitate the release assessment, the world was divided into four zones according to location of outbreaks of henipaviruses, isolation of henipaviruses, proximity to other countries where incidents of henipaviruses have occurred and the distribution of Pteropus spp. fruit bats. From this release assessment, the key findings are that the importation of fruit from Zones 1 and 2 and bat bushmeat from Zone 1 each have a Low annual probability of release of henipaviruses into the UK. Similarly, the importation of bat meat from Zone 2, horses and companion animals from Zone 1 and people travelling from Zone 1 and entering the UK were estimated to pose a Very Low probability of release. The annual probability of release for all other release routes was assessed to be Negligible. It is recommended that the release assessment be periodically re-assessed to reflect changes in knowledge and circumstances over time.

  1. Efficient algorithms for maximum likelihood decoding in the surface code

    Science.gov (United States)

    Bravyi, Sergey; Suchara, Martin; Vargo, Alexander

    2014-09-01

    We describe two implementations of the optimal error correction algorithm known as the maximum likelihood decoder (MLD) for the two-dimensional surface code with a noiseless syndrome extraction. First, we show how to implement MLD exactly in time O(n²), where n is the number of code qubits. Our implementation uses a reduction from MLD to simulation of matchgate quantum circuits. This reduction, however, requires a special noise model with independent bit-flip and phase-flip errors. Secondly, we show how to implement MLD approximately for more general noise models using matrix product states (MPS). Our implementation has running time O(nχ³), where χ is a parameter that controls the approximation precision. The key step of our algorithm, borrowed from the density matrix renormalization-group method, is a subroutine for contracting a tensor network on the two-dimensional grid. The subroutine uses MPS with a bond dimension χ to approximate the sequence of tensors arising in the course of contraction. We benchmark the MPS-based decoder against the standard minimum weight matching decoder, observing a significant reduction of the logical error probability for χ ≥ 4.

  2. Maximum likelihood sequence estimation for optical complex direct modulation.

    Science.gov (United States)

    Che, Di; Yuan, Feng; Shieh, William

    2017-04-17

    Semiconductor lasers are versatile optical transmitters in nature. Through the direct modulation (DM), the intensity modulation is realized by the linear mapping between the injection current and the light power, while various angle modulations are enabled by the frequency chirp. Limited by the direct detection, DM lasers used to be exploited only as 1-D (intensity or angle) transmitters by suppressing or simply ignoring the other modulation. Nevertheless, through the digital coherent detection, simultaneous intensity and angle modulations (namely, 2-D complex DM, CDM) can be realized by a single laser diode. The crucial technique of CDM is the joint demodulation of intensity and differential phase with the maximum likelihood sequence estimation (MLSE), supported by a closed-form discrete signal approximation of frequency chirp to characterize the MLSE transition probability. This paper proposes a statistical method for the transition probability to significantly enhance the accuracy of the chirp model. Using the statistical estimation, we demonstrate the first single-channel 100-Gb/s PAM-4 transmission over 1600-km fiber with only 10G-class DM lasers.
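
    The sequence-estimation principle behind the demodulator, picking the symbol sequence that maximizes the likelihood of the whole observed record rather than deciding symbol by symbol, can be shown on a toy channel. The Python sketch below uses a generic one-symbol-memory channel with Gaussian noise as a stand-in for the chirp-coupled transition model of the paper; it brute-forces the search, which a Viterbi recursion would perform efficiently.

      # Generic MLSE sketch: choose the symbol sequence maximizing the
      # likelihood of noisy observations from a channel with one-symbol memory.
      import numpy as np
      from itertools import product

      symbols = np.array([0.0, 1.0])          # binary intensity levels
      h = np.array([1.0, 0.4])                # channel: y_k = s_k + 0.4 * s_{k-1}
      rng = np.random.default_rng(6)
      true_seq = rng.integers(0, 2, 8)
      y = h[0] * symbols[true_seq] + h[1] * np.r_[0.0, symbols[true_seq[:-1]]]
      y = y + rng.normal(0, 0.2, len(y))

      # Brute-force MLSE (fine for 8 symbols; a Viterbi recursion gives the
      # same answer in O(n * |states|^2) by exploiting the short memory).
      best, best_metric = None, np.inf
      for cand in product(range(2), repeat=len(y)):
          s = symbols[list(cand)]
          pred = h[0] * s + h[1] * np.r_[0.0, s[:-1]]
          metric = np.sum((y - pred) ** 2)    # Gaussian noise => min distance
          if metric < best_metric:
              best, best_metric = np.array(cand), metric

      print("symbol errors:", int(np.sum(best != true_seq)))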

  3. Maximum likelihood estimation for cytogenetic dose-response curves

    International Nuclear Information System (INIS)

    Frome, E.L.; DuFrain, R.J.

    1983-10-01

    In vitro dose-response curves are used to describe the relation between the yield of dicentric chromosome aberrations and radiation dose for human lymphocytes. The dicentric yields follow the Poisson distribution, and the expected yield depends on both the magnitude and the temporal distribution of the dose for low LET radiation. A general dose-response model that describes this relation has been obtained by Kellerer and Rossi using the theory of dual radiation action. The yield of elementary lesions is κ[γd + g(t, τ)d²], where t is the time and d is dose. The coefficient of the d² term is determined by the recovery function and the temporal mode of irradiation. Two special cases of practical interest are split-dose and continuous exposure experiments, and the resulting models are intrinsically nonlinear in the parameters. A general purpose maximum likelihood estimation procedure is described and illustrated with numerical examples from both experimental designs. Poisson regression analysis is used for estimation, hypothesis testing, and regression diagnostics. Results are discussed in the context of exposure assessment procedures for both acute and chronic human radiation exposure
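
    The model described, a Poisson yield whose mean is linear-quadratic in dose, can be fitted by direct maximization of the log-likelihood. The short Python sketch below does this on synthetic dicentric counts; the doses, cell numbers and coefficient values are invented, and a generic optimizer stands in for the authors' Poisson regression software.

      # Sketch: maximum likelihood fit of E[Y] = alpha*d + beta*d^2 per cell,
      # by direct maximization of the Poisson log-likelihood (synthetic data).
      import numpy as np
      from scipy.optimize import minimize

      doses = np.array([0.5, 1.0, 2.0, 3.0, 4.0])    # Gy
      cells = np.array([1000, 1000, 500, 500, 250])  # cells scored per dose
      rng = np.random.default_rng(5)
      true_yield = 0.03 * doses + 0.06 * doses ** 2
      counts = rng.poisson(cells * true_yield)       # dicentrics per dose point

      def neg_log_lik(params):
          alpha, beta = params
          mu = cells * (alpha * doses + beta * doses ** 2)
          return -np.sum(counts * np.log(mu) - mu)   # Poisson kernel

      fit = minimize(neg_log_lik, x0=[0.01, 0.01], method="L-BFGS-B",
                     bounds=[(1e-6, None), (1e-6, None)])
      print("alpha, beta =", fit.x)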

  4. Ringing Artefact Reduction By An Efficient Likelihood Improvement Method

    Science.gov (United States)

    Fuderer, Miha

    1989-10-01

    In MR imaging, the extent of the acquired spatial frequencies of the object is necessarily finite. The resulting image shows artefacts caused by "truncation" of its Fourier components. These are known as Gibbs artefacts or ringing artefacts. These artefacts are particularly visible when the time-saving reduced acquisition method is used, say, when scanning only the lowest 70% of the 256 data lines. Filtering the data results in loss of resolution. A method is described that estimates the high frequency data from the low-frequency data lines, with the likelihood of the image as criterion. It is a computationally very efficient method, since it requires practically only two extra Fourier transforms in addition to the normal reconstruction. The results of this method on MR images of human subjects are promising. Evaluations on a 70% acquisition image show about 20% decrease of the error energy after processing. "Error energy" is defined as the total power of the difference to a 256-data-lines reference image. The elimination of ringing artefacts then appears almost complete.

  5. Scale invariant for one-sided multivariate likelihood ratio tests

    Directory of Open Access Journals (Sweden)

    Samruam Chongcharoen

    2010-07-01

    Full Text Available Suppose X1, X2, ..., Xn is a random sample from an Np(μ, V) distribution. Consider H0: μ1 = μ2 = ... = μp = 0 and H1: μi ≥ 0 for i = 1, 2, ..., p; let H1 − H0 denote the hypothesis that H1 holds but H0 does not, and let ~H0 denote the hypothesis that H0 does not hold. Because the likelihood ratio test (LRT) of H0 versus H1 − H0 is complicated, several ad hoc tests have been proposed. Tang, Gnecco and Geller (1989) proposed an approximate LRT, Follmann (1996) suggested rejecting H0 if the usual test of H0 versus ~H0 rejects H0 with significance level 2α and a weighted sum of the sample means is positive, and Chongcharoen, Singh and Wright (2002) modified Follmann's test to include information about the correlation structure in the sum of the sample means. Chongcharoen and Wright (2007, 2006) give versions of the Tang-Gnecco-Geller tests and Follmann-type tests, respectively, with invariance properties. Motivated by the LRT's desirable scale-invariance property, we investigate its power by using Monte Carlo techniques and compare it with the tests we recommend in Chongcharoen and Wright (2007, 2006).

  6. Maximum-likelihood estimation of recent shared ancestry (ERSA).

    Science.gov (United States)

    Huff, Chad D; Witherspoon, David J; Simonson, Tatum S; Xing, Jinchuan; Watkins, W Scott; Zhang, Yuhua; Tuohy, Therese M; Neklason, Deborah W; Burt, Randall W; Guthery, Stephen L; Woodward, Scott R; Jorde, Lynn B

    2011-05-01

    Accurate estimation of recent shared ancestry is important for genetics, evolution, medicine, conservation biology, and forensics. Established methods estimate kinship accurately for first-degree through third-degree relatives. We demonstrate that chromosomal segments shared by two individuals due to identity by descent (IBD) provide much additional information about shared ancestry. We developed a maximum-likelihood method for the estimation of recent shared ancestry (ERSA) from the number and lengths of IBD segments derived from high-density SNP or whole-genome sequence data. We used ERSA to estimate relationships from SNP genotypes in 169 individuals from three large, well-defined human pedigrees. ERSA is accurate to within one degree of relationship for 97% of first-degree through fifth-degree relatives and 80% of sixth-degree and seventh-degree relatives. We demonstrate that ERSA's statistical power approaches the maximum theoretical limit imposed by the fact that distant relatives frequently share no DNA through a common ancestor. ERSA greatly expands the range of relationships that can be estimated from genetic data and is implemented in a freely available software package.

  7. Quantifying uncertainty, variability and likelihood for ordinary differential equation models

    LENUS (Irish Health Repository)

    Weisse, Andrea Y

    2010-10-28

    Abstract Background In many applications, ordinary differential equation (ODE) models are subject to uncertainty or variability in initial conditions and parameters. Both, uncertainty and variability can be quantified in terms of a probability density function on the state and parameter space. Results The partial differential equation that describes the evolution of this probability density function has a form that is particularly amenable to application of the well-known method of characteristics. The value of the density at some point in time is directly accessible by the solution of the original ODE extended by a single extra dimension (for the value of the density). This leads to simple methods for studying uncertainty, variability and likelihood, with significant advantages over more traditional Monte Carlo and related approaches especially when studying regions with low probability. Conclusions While such approaches based on the method of characteristics are common practice in other disciplines, their advantages for the study of biological systems have so far remained unrecognized. Several examples illustrate performance and accuracy of the approach and its limitations.
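
    The trick the abstract describes, extending the ODE with one extra dimension that transports the (log-)density value along each characteristic, is easy to demonstrate. A minimal Python sketch for a scalar logistic ODE follows; the vector field and initial condition are arbitrary examples, not from the paper.

      # Method-of-characteristics sketch: for dx/dt = f(x), the density along
      # a trajectory obeys d(log p)/dt = -df/dx, so one extra state suffices.
      import numpy as np
      from scipy.integrate import solve_ivp

      def f(x):
          return x * (1.0 - x)        # logistic growth (example vector field)

      def dfdx(x):
          return 1.0 - 2.0 * x

      def augmented(t, y):
          x, logp = y
          return [f(x), -dfdx(x)]     # continuity equation along the characteristic

      x0, logp0 = 0.1, 0.0            # logp0 = log of initial density at x0
      sol = solve_ivp(augmented, (0.0, 5.0), [x0, logp0])
      x_T, logp_T = sol.y[:, -1]
      print(f"x(5) = {x_T:.3f}, density there = {np.exp(logp_T):.3f} * p0(x0)")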

  8. Affective mapping: An activation likelihood estimation (ALE) meta-analysis.

    Science.gov (United States)

    Kirby, Lauren A J; Robinson, Jennifer L

    2017-11-01

    Functional neuroimaging has the spatial resolution to explain the neural basis of emotions. Activation likelihood estimation (ALE), as opposed to traditional qualitative meta-analysis, quantifies convergence of activation across studies within affective categories. Others have used ALE to investigate a broad range of emotions, but without the convenience of the BrainMap database. We used the BrainMap database and analysis resources to run separate meta-analyses on coordinates reported for anger, anxiety, disgust, fear, happiness, humor, and sadness. Resultant ALE maps were compared to determine areas of convergence between emotions, as well as to identify affect-specific networks. Five out of the seven emotions demonstrated consistent activation within the amygdala, whereas all emotions consistently activated the right inferior frontal gyrus, which has been implicated as an integration hub for affective and cognitive processes. These data provide the framework for models of affect-specific networks, as well as emotional processing hubs, which can be used for future studies of functional or effective connectivity. Copyright © 2015 Elsevier Inc. All rights reserved.

  9. Dark matter CMB constraints and likelihoods for poor particle physicists

    Energy Technology Data Exchange (ETDEWEB)

    Cline, James M.; Scott, Pat, E-mail: jcline@physics.mcgill.ca, E-mail: patscott@physics.mcgill.ca [Department of Physics, McGill University, 3600 rue University, Montréal, QC, H3A 2T8 (Canada)

    2013-03-01

    The cosmic microwave background provides constraints on the annihilation and decay of light dark matter at redshifts between 100 and 1000, the strength of which depends upon the fraction of energy ending up in the form of electrons and photons. The resulting constraints are usually presented for a limited selection of annihilation and decay channels. Here we provide constraints on the annihilation cross section and decay rate, at discrete values of the dark matter mass m_χ, for all the annihilation and decay channels whose secondary spectra have been computed using PYTHIA in arXiv:1012.4515 (''PPPC 4 DM ID: a poor particle physicist cookbook for dark matter indirect detection''), namely e, μ, τ, V → e, V → μ, V → τ, u, d, s, c, b, t, γ, g, W, Z and h. By interpolating in mass, these can be used to find the CMB constraints and likelihood functions from WMAP7 and Planck for a wide range of dark matter models, including those with annihilation or decay into a linear combination of different channels.

  10. Dark matter CMB constraints and likelihoods for poor particle physicists

    International Nuclear Information System (INIS)

    Cline, James M.; Scott, Pat

    2013-01-01

    The cosmic microwave background provides constraints on the annihilation and decay of light dark matter at redshifts between 100 and 1000, the strength of which depends upon the fraction of energy ending up in the form of electrons and photons. The resulting constraints are usually presented for a limited selection of annihilation and decay channels. Here we provide constraints on the annihilation cross section and decay rate, at discrete values of the dark matter mass m_χ, for all the annihilation and decay channels whose secondary spectra have been computed using PYTHIA in arXiv:1012.4515 (''PPPC 4 DM ID: a poor particle physicist cookbook for dark matter indirect detection''), namely e, μ, τ, V → e, V → μ, V → τ, u, d, s, c, b, t, γ, g, W, Z and h. By interpolating in mass, these can be used to find the CMB constraints and likelihood functions from WMAP7 and Planck for a wide range of dark matter models, including those with annihilation or decay into a linear combination of different channels

  11. Maximum likelihood estimation for cytogenetic dose-response curves

    International Nuclear Information System (INIS)

    Frome, E.L.; DuFrain, R.J.

    1986-01-01

    In vitro dose-response curves are used to describe the relation between chromosome aberrations and radiation dose for human lymphocytes. The lymphocytes are exposed to low-LET radiation, and the resulting dicentric chromosome aberrations follow the Poisson distribution. The expected yield depends on both the magnitude and the temporal distribution of the dose. A general dose-response model that describes this relation has been presented by Kellerer and Rossi (1972, Current Topics on Radiation Research Quarterly 8, 85-158; 1978, Radiation Research 75, 471-488) using the theory of dual radiation action. Two special cases of practical interest are split-dose and continuous exposure experiments, and the resulting dose-time-response models are intrinsically nonlinear in the parameters. A general-purpose maximum likelihood estimation procedure is described, and estimation for the nonlinear models is illustrated with numerical examples from both experimental designs. Poisson regression analysis is used for estimation, hypothesis testing, and regression diagnostics. Results are discussed in the context of exposure assessment procedures for both acute and chronic human radiation exposure

  12. Physical activity may decrease the likelihood of children developing constipation.

    Science.gov (United States)

    Seidenfaden, Sandra; Ormarsson, Orri Thor; Lund, Sigrun H; Bjornsson, Einar S

    2018-01-01

    Childhood constipation is common. We evaluated children diagnosed with constipation, who were referred to an Icelandic paediatric emergency department, and determined the effect of lifestyle factors on its aetiology. The parents of children who were diagnosed with constipation and participated in a phase IIB clinical trial on laxative suppositories answered an online questionnaire about their children's lifestyle and constipation in March-April 2013. The parents of nonconstipated children that visited the paediatric department of Landspitali University Hospital or an Icelandic outpatient clinic answered the same questionnaire. We analysed responses regarding 190 children aged one year to 18 years: 60 with constipation and 130 without. We found that 40% of the constipated children had recurrent symptoms, 27% had to seek medical attention more than once and 33% received medication per rectum. The 47 of 130 control group subjects aged 10-18 were much more likely to exercise more than three times a week (72%) and for more than an hour (62%) than the 26 of 60 constipated children of the same age (42% and 35%, respectively). Constipation risk factors varied with age and many children diagnosed with constipation had recurrent symptoms. Physical activity may affect the likelihood of developing constipation in older children. ©2017 Foundation Acta Paediatrica. Published by John Wiley & Sons Ltd.

  13. Maximum likelihood pedigree reconstruction using integer linear programming.

    Science.gov (United States)

    Cussens, James; Bartlett, Mark; Jones, Elinor M; Sheehan, Nuala A

    2013-01-01

    Large population biobanks of unrelated individuals have been highly successful in detecting common genetic variants affecting diseases of public health concern. However, they lack the statistical power to detect more modest gene-gene and gene-environment interaction effects or the effects of rare variants for which related individuals are ideally required. In reality, most large population studies will undoubtedly contain sets of undeclared relatives, or pedigrees. Although a crude measure of relatedness might sometimes suffice, having a good estimate of the true pedigree would be much more informative if this could be obtained efficiently. Relatives are more likely to share longer haplotypes around disease susceptibility loci and are hence biologically more informative for rare variants than unrelated cases and controls. Distant relatives are arguably more useful for detecting variants with small effects because they are less likely to share masking environmental effects. Moreover, the identification of relatives enables appropriate adjustments of statistical analyses that typically assume unrelatedness. We propose to exploit an integer linear programming optimisation approach to pedigree learning, which is adapted to find valid pedigrees by imposing appropriate constraints. Our method is not restricted to small pedigrees and is guaranteed to return a maximum likelihood pedigree. With additional constraints, we can also search for multiple high-probability pedigrees and thus account for the inherent uncertainty in any particular pedigree reconstruction. The true pedigree is found very quickly by comparison with other methods when all individuals are observed. Extensions to more complex problems seem feasible. © 2012 Wiley Periodicals, Inc.

  14. Constructing diagnostic likelihood: clinical decisions using subjective versus statistical probability.

    Science.gov (United States)

    Kinnear, John; Jackson, Ruth

    2017-07-01

    Although physicians are highly trained in the application of evidence-based medicine, and are assumed to make rational decisions, there is evidence that their decision making is prone to biases. One of the biases that has been shown to affect accuracy of judgements is that of representativeness and base-rate neglect, where the saliency of a person's features leads to overestimation of their likelihood of belonging to a group. This results in the substitution of 'subjective' probability for statistical probability. This study examines clinicians' propensity to make estimations of subjective probability when presented with clinical information that is considered typical of a medical condition. The strength of the representativeness bias is tested by presenting choices in textual and graphic form. Understanding of statistical probability is also tested by omitting all clinical information. For the questions that included clinical information, 46.7% and 45.5% of clinicians made judgements of statistical probability, respectively. Where the question omitted clinical information, 79.9% of clinicians made a judgement consistent with statistical probability. There was a statistically significant difference in responses to the questions with and without representativeness information (χ² (1, n=254) = 54.45, p < 0.001), with clinicians more often substituting subjective for statistical probability when salient clinical information was present. One of the causes for this representativeness bias may be the way clinical medicine is taught, where stereotypic presentations are emphasised in diagnostic decision making. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
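
    A worked example makes the statistical side of the contrast concrete: by Bayes' theorem, the probability of a condition given a "typical" presentation depends strongly on the base rate, which representativeness judgements ignore. The numbers below are invented purely for illustration.

      # Worked base-rate example of the bias described above.
      def posterior(prevalence, sensitivity, false_positive_rate):
          """P(condition | typical features) by Bayes' theorem."""
          p_features = (sensitivity * prevalence
                        + false_positive_rate * (1.0 - prevalence))
          return sensitivity * prevalence / p_features

      # Same salient presentation, two different base rates:
      print(posterior(prevalence=0.10, sensitivity=0.9, false_positive_rate=0.2))
      print(posterior(prevalence=0.001, sensitivity=0.9, false_positive_rate=0.2))
      # ~0.33 vs ~0.004: the statistical probability collapses with the base
      # rate even though the "representativeness" of the features is unchanged.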

  15. Race of source effects in the elaboration likelihood model.

    Science.gov (United States)

    White, P H; Harkins, S G

    1994-11-01

    In a series of experiments, we investigated the effect of race of source on persuasive communications in the Elaboration Likelihood Model (R.E. Petty & J.T. Cacioppo, 1981, 1986). In Experiment 1, we found no evidence that White participants responded to a Black source as a simple negative cue. Experiment 2 suggested the possibility that exposure to a Black source led to low-involvement message processing. In Experiments 3 and 4, a distraction paradigm was used to test this possibility, and it was found that participants under low involvement were highly motivated to process a message presented by a Black source. In Experiment 5, we found that attitudes toward the source's ethnic group, rather than violations of expectancies, accounted for this processing effect. Taken together, the results of these experiments are consistent with S.L. Gaertner and J.F. Dovidio's (1986) theory of aversive racism, which suggests that Whites, because of a combination of egalitarian values and underlying negative racial attitudes, are very concerned about not appearing unfavorable toward Blacks, leading them to be highly motivated to process messages presented by a source from this group.

  16. Maximum likelihood estimation for cytogenetic dose-response curves

    Energy Technology Data Exchange (ETDEWEB)

    Frome, E.L.; DuFrain, R.J.

    1983-10-01

    In vitro dose-response curves are used to describe the relation between the yield of dicentric chromosome aberrations and radiation dose for human lymphocytes. The dicentric yields follow the Poisson distribution, and the expected yield depends on both the magnitude and the temporal distribution of the dose for low LET radiation. A general dose-response model that describes this relation has been obtained by Kellerer and Rossi using the theory of dual radiation action. The yield of elementary lesions is κ(γd + g(t, τ)d²), where t is the time and d is dose. The coefficient of the d² term is determined by the recovery function and the temporal mode of irradiation. Two special cases of practical interest are split-dose and continuous exposure experiments, and the resulting models are intrinsically nonlinear in the parameters. A general purpose maximum likelihood estimation procedure is described and illustrated with numerical examples from both experimental designs. Poisson regression analysis is used for estimation, hypothesis testing, and regression diagnostics. Results are discussed in the context of exposure assessment procedures for both acute and chronic human radiation exposure.

  17. Cancer incidence among waiters

    DEFF Research Database (Denmark)

    Reijula, Jere; Kjaerheim, Kristina; Lynge, Elsebeth

    2015-01-01

    AIMS: To study cancer risk patterns among waiters in the Nordic countries. METHODS: We identified a cohort of 16,134 male and 81,838 female waiters from Denmark, Finland, Iceland, Norway and Sweden. During the follow-up period from 1961 to 2005, we found that 19,388 incident cancer cases were...... diagnosed. Standardised incidence ratio (SIR) was defined as the observed number of cancer cases divided by the expected number, based on national age, time period and gender-specific cancer incidence rates in the general population. RESULTS: The SIR of all cancers in waiters, in the five countries combined...... Increased incidence in some cancer sites can likely be explained by higher alcohol consumption, the prevalence of smoking and occupational exposure to tobacco smoke. Hopefully, the incidence of cancer among waiters will decrease in the future, due to the banning of tobacco smoking in restaurants and bars in the Nordic...
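
    The SIR defined in the methods is simply observed over expected cases; a quick Python sketch with an exact Poisson confidence interval follows. The counts are made up, and the chi-square-based interval is a standard textbook construction rather than anything taken from the paper.

      # SIR = observed / expected, with an exact Poisson 95% CI.
      from scipy.stats import chi2

      observed = 120
      expected = 100.0  # from national age-, period- and sex-specific rates
      sir = observed / expected
      lo = chi2.ppf(0.025, 2 * observed) / (2 * expected)
      hi = chi2.ppf(0.975, 2 * (observed + 1)) / (2 * expected)
      print(f"SIR = {sir:.2f} (95% CI {lo:.2f}-{hi:.2f})")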

  18. Incident detection and isolation in drilling using analytical redundancy relations

    DEFF Research Database (Denmark)

    Willersrud, Anders; Blanke, Mogens; Imsland, Lars

    2015-01-01

    must be avoided. This paper employs model-based diagnosis using analytical redundancy relations to obtain residuals which are affected differently by the different incidents. Residuals are found to be non-Gaussian - they follow a multivariate t-distribution - hence, a dedicated generalized likelihood...... measurements available. In the latter case, isolation capability is shown to be reduced to group-wise isolation, but the method would still detect all serious events with the prescribed false alarm probability...

  19. Radiological incidents in radiotherapy

    International Nuclear Information System (INIS)

    Hobzova, L.; Novotny, J.

    2008-01-01

    In many countries a reporting system of radiological incidents to a national regulatory body exists, and providers of radiotherapy treatment are obliged to report all major and/or, in some countries, all incidents occurring in the institution. The State Office for Nuclear Safety (SONS) has provided systematic guidance for radiotherapy departments since 1997 by requiring the inclusion of radiation safety problems in the Quality Assurance manual, which is the basic document for obtaining a SONS license for handling sources of ionizing radiation. For that purpose SONS also issued the recommendation 'Introduction of QA system for important sources in radiotherapy-radiological incidents', in which the radiological incidents are defined and basic guidance for their classification (category A, B, C, D), investigation and reporting is given. At regular intervals, SONS, in co-operation with radiotherapy centers, surveys all radiological incidents occurring in institutions and presents the information obtained in synoptic communications (2003 Motolske dny, 2005 Novy Jicin). This presentation is another summary report of radiological incidents that occurred in our radiotherapy institutions during the last 3 years. Emphasis is given not only to the survey and statistics, but also to analysis of the causes of the radiological incidents and to their detection and prevention. Analyses of incidents in radiotherapy have led to a much broader understanding of incident causation. Information about an error should be shared with all radiotherapy centers as early as possible, during or after investigation. Learning from incidents, errors and near misses should be part of the improvement of the QA system in institutions. Generally, it is recommended that all radiotherapy facilities should participate in the reporting, analyzing and learning system to facilitate the dissemination of knowledge throughout the whole country to prevent errors in radiotherapy.(authors)

  20. Stuck pipe prediction

    KAUST Repository

    Alzahrani, Majed

    2016-03-10

    Disclosed are various embodiments for a prediction application to predict a stuck pipe. A linear regression model is generated from hook load readings at corresponding bit depths. A current hook load reading at a current bit depth is compared with a normal hook load reading from the linear regression model. A current hook load greater than a normal hook load for a given bit depth indicates the likelihood of a stuck pipe.

  1. Stuck pipe prediction

    KAUST Repository

    Alzahrani, Majed; Alsolami, Fawaz; Chikalov, Igor; Algharbi, Salem; Aboudi, Faisal; Khudiri, Musab

    2016-01-01

    Disclosed are various embodiments for a prediction application to predict a stuck pipe. A linear regression model is generated from hook load readings at corresponding bit depths. A current hook load reading at a current bit depth is compared with a normal hook load reading from the linear regression model. A current hook load greater than a normal hook load for a given bit depth indicates the likelihood of a stuck pipe.
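
    The disclosed scheme reduces to an ordinary least-squares trend plus a residual test, which the Python sketch below illustrates on synthetic hook load data. The three-sigma deviation threshold and all numbers are assumptions for the example, not taken from the patent.

      # Sketch of the disclosed idea: regress hook load on bit depth, then
      # flag readings that sit well above the fitted "normal" line.
      import numpy as np

      rng = np.random.default_rng(7)
      depth = np.linspace(1000.0, 3000.0, 200)                  # bit depth, ft
      hookload = 50.0 + 0.02 * depth + rng.normal(0, 2.0, 200)  # klbf
      hookload[-5:] += 15.0                    # extra drag from a sticking pipe

      slope, intercept = np.polyfit(depth[:150], hookload[:150], 1)
      normal = slope * depth + intercept
      resid_sd = np.std(hookload[:150] - normal[:150])

      for d, h, m in zip(depth, hookload, normal):
          if h > m + 3.0 * resid_sd:           # abnormal overpull
              print(f"possible stuck pipe at depth {d:.0f} ft "
                    f"(hook load {h:.1f} vs normal {m:.1f})")
              break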

  2. DREAM3: network inference using dynamic context likelihood of relatedness and the inferelator.

    Directory of Open Access Journals (Sweden)

    Aviv Madar

    2010-03-01

    Full Text Available Many current works aiming to learn regulatory networks from systems biology data must balance model complexity with respect to data availability and quality. Methods that learn regulatory associations based on unit-less metrics, such as Mutual Information, are attractive in that they scale well and reduce the number of free parameters (model complexity) per interaction to a minimum. In contrast, methods for learning regulatory networks based on explicit dynamical models are more complex and scale less gracefully, but are attractive as they may allow direct prediction of transcriptional dynamics and resolve the directionality of many regulatory interactions. We aim to investigate whether scalable information based methods (like the Context Likelihood of Relatedness method) and more explicit dynamical models (like Inferelator 1.0) prove synergistic when combined. We test a pipeline where a novel modification of the Context Likelihood of Relatedness (mixed-CLR), modified to use time-series data, is first used to define likely regulatory interactions and then Inferelator 1.0 is used for final model selection and to build an explicit dynamical model. Our method ranked 2nd out of 22 in the DREAM3 100-gene in silico networks challenge. Mixed-CLR and Inferelator 1.0 are complementary, demonstrating a large performance gain relative to any single tested method, with precision being especially high at low recall values. Partitioning the provided data set into four groups (knock-down, knock-out, time-series, and combined) revealed that using comprehensive knock-out data alone provides optimal performance. Inferelator 1.0 proved particularly powerful at resolving the directionality of regulatory interactions, i.e. "who regulates who" (approximately of identified true positives were correctly resolved). Performance drops for high in-degree genes, i.e. as the number of regulators per target gene increases, but not with out-degree, i.e. performance is not affected by
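
    The background-correction idea behind CLR, scoring each pairwise mutual information against the distribution of scores involving the same two genes, can be sketched compactly. The Python toy below uses a plug-in histogram estimate of mutual information on a tiny synthetic expression matrix; it shows the scoring scheme only, not the mixed-CLR time-series modification or the Inferelator model-selection stage.

      # CLR-style background-corrected relevance scores (toy example).
      import numpy as np

      def hist_mi(x, y, bins=8):
          """Plug-in mutual information from a 2-D histogram."""
          pxy, _, _ = np.histogram2d(x, y, bins=bins)
          pxy /= pxy.sum()
          px, py = pxy.sum(axis=1), pxy.sum(axis=0)
          nz = pxy > 0
          return np.sum(pxy[nz] * np.log(pxy[nz] / np.outer(px, py)[nz]))

      rng = np.random.default_rng(8)
      genes = rng.normal(size=(5, 300))                       # 5 genes, 300 samples
      genes[1] = 0.8 * genes[0] + 0.2 * rng.normal(size=300)  # gene 1 driven by gene 0

      n = len(genes)
      mi = np.array([[hist_mi(genes[i], genes[j]) for j in range(n)]
                     for i in range(n)])
      np.fill_diagonal(mi, 0.0)

      # z-score each MI value against its row/column backgrounds, clip at 0,
      # and combine: clr_ij = sqrt(z_i^2 + z_j^2).
      z = (mi - mi.mean(axis=1, keepdims=True)) / mi.std(axis=1, keepdims=True)
      clr = np.sqrt(np.clip(z, 0, None) ** 2 + np.clip(z.T, 0, None) ** 2)
      print("strongest putative interaction:",
            np.unravel_index(clr.argmax(), clr.shape))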

  3. Predictors of Likelihood of Speaking Up about Safety Concerns in Labour and Delivery

    Science.gov (United States)

    Lyndon, Audrey; Sexton, J. Bryan; Simpson, Kathleen Rice; Rosenstein, Alan; Lee, Kathryn A.; Wachter, Robert M.

    2011-01-01

    Background Despite widespread emphasis on promoting “assertive communication” by caregivers as essential to patient safety improvement efforts, fairly little is known about when and how clinicians speak up to address safety concerns. In this cross-sectional study we use a new measure of speaking up to begin exploring this issue in maternity care. Methods We developed a scenario-based measure of clinicians’ assessments of potential harm and likelihood of speaking up in response to perceived harm. We embedded this scale in a survey with measures of safety climate, teamwork climate, disruptive behaviour, work stress, and personality traits of bravery and assertiveness. The survey was distributed to all registered nurses and obstetricians practicing in two US Labour & Delivery units. Results The response rate was 54% (125 of 230 potential respondents). Respondents were experienced clinicians (13.7 ± 11 years in specialty). Higher perception of harm, respondent role, specialty experience, and site predicted likelihood of speaking up when controlling for bravery and assertiveness. Physicians rated potential harm in common clinical scenarios lower than nurses did (7.5 vs. 8.4 on a 2–10 scale; p<0.001). Some participants (12%) indicated they were unlikely to speak up despite perceiving high potential for harm in certain situations. Discussion This exploratory study found nurses and physicians differed in their harm ratings, and harm rating was a predictor of speaking up. This may partially explain persistent discrepancies between physicians and nurses in teamwork climate scores. Differing assessments of potential harms inherent in everyday practice may be a target for teamwork intervention in maternity care. PMID:22927492

  4. Comparison between artificial neural networks and maximum likelihood classification in digital soil mapping

    Directory of Open Access Journals (Sweden)

    César da Silva Chagas

    2013-04-01

    Full Text Available Soil surveys are the main source of spatial information on soils and have a range of different applications, mainly in agriculture. The continuity of this activity has however been severely compromised, mainly due to a lack of governmental funding. The purpose of this study was to evaluate the feasibility of two different classifiers (artificial neural networks and a maximum likelihood algorithm) in the prediction of soil classes in the northwest of the state of Rio de Janeiro. Terrain attributes such as elevation, slope, aspect, plan curvature and compound topographic index (CTI), and indices of clay minerals, iron oxide and Normalized Difference Vegetation Index (NDVI), derived from Landsat 7 ETM+ sensor imagery, were used as discriminating variables. The two classifiers were trained and validated for each soil class using 300 and 150 samples respectively, representing the characteristics of these classes in terms of the discriminating variables. According to the statistical tests, the accuracy of the classifier based on artificial neural networks (ANNs) was greater than that of the classic Maximum Likelihood Classifier (MLC). Comparing the results with 126 points of reference showed that the resulting ANN map (73.81 %) was superior to the MLC map (57.94 %). The main errors when using the two classifiers were caused by: (a) the geological heterogeneity of the area coupled with problems related to the geological map; (b) the depth of lithic contact and/or rock exposure; and (c) problems with the environmental correlation model used due to the polygenetic nature of the soils. This study confirms that the use of terrain attributes together with remote sensing data by an ANN approach can be a tool to facilitate soil mapping in Brazil, primarily due to the availability of low-cost remote sensing data and the ease with which terrain attributes can be obtained.
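
    A minimal sketch of the maximum likelihood classification step on synthetic two-band data; the class means, covariances, and equal priors are illustrative assumptions, not values from the study.

```python
import numpy as np
from scipy.stats import multivariate_normal

# Two soil classes in a toy 2-D feature space standing in for the study's
# discriminating variables (elevation, slope, NDVI, ...).
rng = np.random.default_rng(2)
class_samples = [rng.multivariate_normal([0.0, 0.0], np.eye(2), 300),
                 rng.multivariate_normal([2.0, 1.0], np.eye(2), 300)]

# Maximum likelihood classification: fit a Gaussian per class from training
# pixels and assign each new pixel to the class with the highest density
# (equal priors assumed).
dists = [multivariate_normal(mean=s.mean(axis=0), cov=np.cov(s.T))
         for s in class_samples]

def mlc_predict(pixel):
    return int(np.argmax([d.logpdf(pixel) for d in dists]))

print(mlc_predict([0.1, -0.2]))  # -> 0
print(mlc_predict([2.2, 1.1]))   # -> 1
```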

  5. Smoking increases the likelihood of Helicobacter pylori treatment failure.

    Science.gov (United States)

    Itskoviz, David; Boltin, Doron; Leibovitzh, Haim; Tsadok Perets, Tsachi; Comaneshter, Doron; Cohen, Arnon; Niv, Yaron; Levi, Zohar

    2017-07-01

    Data regarding the impact of smoking on the success of Helicobacter pylori (H. pylori) eradication are conflicting, partly because sociodemographic status is associated with both smoking and H. pylori treatment success. We aimed to assess the effect of smoking on H. pylori eradication rates after controlling for sociodemographic confounders. Included were subjects aged 15 years or older, with a first-time positive ¹³C-urea breath test (¹³C-UBT) between 2007 and 2014, who underwent a second ¹³C-UBT after receiving clarithromycin-based triple therapy. Data regarding age, gender, socioeconomic status (SES), smoking (current smokers or "never smoked"), and drug use were extracted from the Clalit health maintenance organization database. Out of 120,914 subjects with a positive first-time ¹³C-UBT, 50,836 (42.0%) underwent a second ¹³C-UBT test. After excluding former smokers, 48,130 subjects remained eligible for analysis. The mean age was 44.3 ± 18.2 years, 69.2% were females, 87.8% were Jewish and 12.2% Arabs, and 25.5% were current smokers. The overall eradication failure rate was 33.3%: 34.8% in current smokers and 32.8% in subjects who never smoked. In a multivariate analysis, eradication failure was positively associated with current smoking (Odds Ratio {OR} 1.15, 95% CI 1.10-1.20). Smoking was found to significantly increase the likelihood of unsuccessful first-line treatment for H. pylori infection. Copyright © 2017 Editrice Gastroenterologica Italiana S.r.l. Published by Elsevier Ltd. All rights reserved.

  6. Obstetric History and Likelihood of Preterm Birth of Twins.

    Science.gov (United States)

    Easter, Sarah Rae; Little, Sarah E; Robinson, Julian N; Mendez-Figueroa, Hector; Chauhan, Suneet P

    2018-01-05

    The objective of this study was to investigate the relationship between preterm birth in a prior pregnancy and preterm birth in a twin pregnancy. We performed a secondary analysis of a randomized controlled trial evaluating 17-α-hydroxyprogesterone caproate in twins. Women were classified as nulliparous, multiparous with a prior term birth, or multiparous with a prior preterm birth. We used logistic regression to examine the odds of spontaneous preterm birth of twins before 35 weeks according to past obstetric history. Of the 653 women analyzed, 294 were nulliparas, 310 had a prior term birth, and 49 had a prior preterm birth. Prior preterm birth increased the likelihood of spontaneous delivery before 35 weeks (adjusted odds ratio [aOR]: 2.44, 95% confidence interval [CI]: 1.28-4.66), whereas prior term delivery decreased these odds (aOR: 0.55, 95% CI: 0.38-0.78) in the current twin pregnancy compared with the nulliparous reference group. This translated into lower odds of composite neonatal morbidity (aOR: 0.38, 95% CI: 0.27-0.53) for women with a prior term delivery. For women carrying twins, a history of preterm birth increases the odds of spontaneous preterm birth, whereas a prior term birth decreases the odds of spontaneous preterm birth and neonatal morbidity for the current twin pregnancy. These results offer risk stratification and reassurance for clinicians.
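
    A hedged sketch of the kind of logistic-regression analysis described, on synthetic data; the true coefficients are chosen only to echo the reported aORs (0.55 and 2.44), and the sample is simulated rather than the trial data.

```python
import numpy as np
import statsmodels.api as sm

# Synthetic analogue of the analysis: outcome = spontaneous preterm birth
# <35 weeks; two indicator exposures vs. a nulliparous reference group.
rng = np.random.default_rng(3)
n = 650
prior_term = rng.binomial(1, 0.47, n)
prior_preterm = (1 - prior_term) * rng.binomial(1, 0.14, n)  # mutually exclusive
logit = -0.5 + np.log(0.55) * prior_term + np.log(2.44) * prior_preterm
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

X = sm.add_constant(np.column_stack([prior_term, prior_preterm]))
fit = sm.Logit(y, X).fit(disp=0)
print(np.exp(fit.params[1:]))  # odds ratios relative to the reference group
```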

  7. Beyond Sex: Likelihood and Predictors of Effective and Ineffective Intervention in Intimate Partner Violence in Bystanders Perceiving an Emergency.

    Science.gov (United States)

    Chabot, Heather Frasier; Gray, Melissa L; Makande, Tariro B; Hoyt, Robert L

    2016-01-06

    Within the framework of the bystander model of intervention, we examined specific correlates and the likelihood of effective and ineffective intervention strategies of bystanders to an instance of intimate partner violence (IPV) identified as an emergency. We measured psychological variables associated with general prosocial behavior (including sex, instrumentality, expressiveness, empathy, personal distress, dispositional anger, and perceived barriers) as influential predictors in four IPV intervention behaviors (i.e., calling 911, talking to the victim, talking to the perpetrator, and physically interacting with the perpetrator). One hundred seventeen college community members completed preintervention measures, watched a film clip of IPV which they identified as an emergency, reported their likelihood of becoming involved and utilizing intervention behaviors, and identified perceived barriers to intervention. Participants were more likely to indicate using effective over ineffective intervention tactics. Lower perceived barriers to intervention predicted greater intervention likelihood. Hierarchical regression indicated that men and individuals higher in anger and instrumental traits were more likely to report that they would engage in riskier ineffective forms of intervention. Implications regarding bystander training and associations to intervention in related forms of violence including sexual assault are discussed. © The Author(s) 2016.

  8. Maximum Likelihood Time-of-Arrival Estimation of Optical Pulses via Photon-Counting Photodetectors

    Science.gov (United States)

    Erkmen, Baris I.; Moision, Bruce E.

    2010-01-01

    Many optical imaging, ranging, and communications systems rely on the estimation of the arrival time of an optical pulse. Recently, such systems have been increasingly employing photon-counting photodetector technology, which changes the statistics of the observed photocurrent. This requires time-of-arrival estimators to be developed and their performances characterized. The statistics of the output of an ideal photodetector, which are well modeled as a Poisson point process, were considered. An analytical model was developed for the mean-square error of the maximum likelihood (ML) estimator, demonstrating two phenomena that cause deviations from the minimum achievable error at low signal power. An approximation was derived for the threshold at which the ML estimator essentially fails to provide better than a random guess of the pulse arrival time. Comparison of the analytic model predictions with those obtained via simulations verified that the model accurately predicts the ML performance over all regimes considered. There is little prior art that attempts to understand the fundamental limitations to time-of-arrival estimation from Poisson statistics. This work establishes both a simple mathematical description of the error behavior, and the associated physical processes that yield this behavior. Previous work on mean-square error characterization for ML estimators has predominantly focused on additive Gaussian noise. This work demonstrates that the discrete nature of the Poisson noise process leads to a distinctly different error behavior.
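
    A minimal sketch of maximum likelihood time-of-arrival estimation under the Poisson point-process model, with an illustrative Gaussian pulse shape and parameter values assumed for the demonstration.

```python
import numpy as np

# Photon arrivals as an inhomogeneous Poisson process: flat background plus
# a Gaussian pulse delayed by tau (all values here are illustrative).
b, s, sigma, tau_true, T = 5.0, 400.0, 0.05, 0.37, 1.0

def rate(t, tau):
    return b + s * np.exp(-0.5 * ((t - tau) / sigma) ** 2)

# Simulate arrival times by thinning a homogeneous process of rate b + s.
rng = np.random.default_rng(4)
cand = rng.uniform(0.0, T, rng.poisson((b + s) * T))
arrivals = cand[rng.uniform(0.0, b + s, cand.size) < rate(cand, tau_true)]

# Maximum likelihood by grid search: with the pulse well inside the window,
# the integral of the rate over [0, T] is essentially constant in tau, so
# maximizing the log-likelihood reduces to maximizing the sum of log-rates.
taus = np.linspace(0.1, 0.9, 801)
loglik = [np.log(rate(arrivals, tau)).sum() for tau in taus]
print(taus[int(np.argmax(loglik))])  # close to tau_true = 0.37
```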

  9. High-order Composite Likelihood Inference for Max-Stable Distributions and Processes

    KAUST Repository

    Castruccio, Stefano; Huser, Raphaël; Genton, Marc G.

    2015-01-01

    In multivariate or spatial extremes, inference for max-stable processes observed at a large collection of locations is a very challenging problem in computational statistics, and current approaches typically rely on less expensive composite likelihoods constructed from small subsets of data. In this work, we explore the limits of modern state-of-the-art computational facilities to perform full likelihood inference and to efficiently evaluate high-order composite likelihoods. With extensive simulations, we assess the loss of information of composite likelihood estimators with respect to a full likelihood approach for some widely-used multivariate or spatial extreme models, we discuss how to choose composite likelihood truncation to improve the efficiency, and we also provide recommendations for practitioners. This article has supplementary material online.
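
    A sketch of the pairwise composite-likelihood mechanics on a Gaussian stand-in model, since bivariate max-stable densities are far more involved; the exponential covariance model and all parameter values are assumptions for illustration.

```python
import numpy as np
from itertools import combinations
from scipy.optimize import minimize_scalar
from scipy.stats import multivariate_normal

# Composite likelihood: sum bivariate log-likelihoods over pairs of sites
# instead of evaluating one intractable joint density over all sites.
rng = np.random.default_rng(5)
n_sites, n_rep, rho_true = 12, 300, 0.5
sites = rng.uniform(0.0, 1.0, (n_sites, 2))
d = np.linalg.norm(sites[:, None, :] - sites[None, :, :], axis=-1)
data = rng.multivariate_normal(np.zeros(n_sites), np.exp(-d / rho_true), n_rep)

def neg_pairwise_loglik(rho):
    total = 0.0
    for i, j in combinations(range(n_sites), 2):  # all order-2 subsets
        pair_cov = np.exp(-d[np.ix_([i, j], [i, j])] / rho)
        total += multivariate_normal(cov=pair_cov).logpdf(data[:, [i, j]]).sum()
    return -total

res = minimize_scalar(neg_pairwise_loglik, bounds=(0.1, 2.0), method="bounded")
print(res.x)  # pairwise composite-likelihood estimate of the range, near 0.5
```

    Higher-order composite likelihoods, the subject of this record, replace the order-2 subsets with triples or larger blocks, trading computation for statistical efficiency.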

  10. High-order Composite Likelihood Inference for Max-Stable Distributions and Processes

    KAUST Repository

    Castruccio, Stefano

    2015-09-29

    In multivariate or spatial extremes, inference for max-stable processes observed at a large collection of locations is a very challenging problem in computational statistics, and current approaches typically rely on less expensive composite likelihoods constructed from small subsets of data. In this work, we explore the limits of modern state-of-the-art computational facilities to perform full likelihood inference and to efficiently evaluate high-order composite likelihoods. With extensive simulations, we assess the loss of information of composite likelihood estimators with respect to a full likelihood approach for some widely-used multivariate or spatial extreme models, we discuss how to choose composite likelihood truncation to improve the efficiency, and we also provide recommendations for practitioners. This article has supplementary material online.

  11. Supplementary Material for: High-Order Composite Likelihood Inference for Max-Stable Distributions and Processes

    KAUST Repository

    Castruccio, Stefano; Huser, Raphaël; Genton, Marc G.

    2016-01-01

    In multivariate or spatial extremes, inference for max-stable processes observed at a large collection of points is a very challenging problem and current approaches typically rely on less expensive composite likelihoods constructed from small subsets of data. In this work, we explore the limits of modern state-of-the-art computational facilities to perform full likelihood inference and to efficiently evaluate high-order composite likelihoods. With extensive simulations, we assess the loss of information of composite likelihood estimators with respect to a full likelihood approach for some widely used multivariate or spatial extreme models, we discuss how to choose composite likelihood truncation to improve the efficiency, and we also provide recommendations for practitioners. This article has supplementary material online.

  12. Police Incident Blotter (Archive)

    Data.gov (United States)

    Allegheny County / City of Pittsburgh / Western PA Regional Data Center — The Police Blotter Archive contains crime incident data after it has been validated and processed to meet Uniform Crime Reporting (UCR) standards, published on a...

  13. 2011 Japanese Nuclear Incident

    Science.gov (United States)

    EPA’s RadNet system monitored the environmental radiation levels in the United States and parts of the Pacific following the Japanese Nuclear Incident. Learn about EPA’s response and view historical laboratory data and news releases.

  14. Marine Animal Incident Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Large whale stranding, death, ship strike and entanglement incidents are all recorded to monitor the health of each population and track anthropogenic factors that...

  15. Acute incidents during anaesthesia

    African Journals Online (AJOL)

    management of acute incidents and the prevention of ... High or total (complete) spinal blocks in obstetric .... Pain and opioid analgesics lead to delayed ... Step up postoperative care and use ... recognise suprasternal and supraclavicular.

  16. Maximum Likelihood Estimation and Inference With Examples in R, SAS and ADMB

    CERN Document Server

    Millar, Russell B

    2011-01-01

    This book takes a fresh look at the popular and well-established method of maximum likelihood for statistical estimation and inference. It begins with an intuitive introduction to the concepts and background of likelihood, and moves through to the latest developments in maximum likelihood methodology, including general latent variable models and new material for the practical implementation of integrated likelihood using the free ADMB software. Fundamental issues of statistical inference are also examined, with a presentation of some of the philosophical debates underlying the choice of statis
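
    The book's examples are in R, SAS and ADMB; the same core recipe is sketched here in Python under an assumed gamma model: write down the negative log-likelihood and hand it to a numerical optimizer.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import gamma

# Simulated data from a gamma distribution with known parameters.
rng = np.random.default_rng(6)
x = gamma(a=3.0, scale=2.0).rvs(size=500, random_state=rng)

def nll(params):
    shape, scale = params
    if shape <= 0 or scale <= 0:
        return np.inf  # keep the optimizer inside the parameter space
    return -gamma(a=shape, scale=scale).logpdf(x).sum()

fit = minimize(nll, x0=[1.0, 1.0], method="Nelder-Mead")
print(fit.x)  # maximum likelihood estimates, close to (3.0, 2.0)
```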

  17. Task-based detectability in CT image reconstruction by filtered backprojection and penalized likelihood estimation

    Energy Technology Data Exchange (ETDEWEB)

    Gang, Grace J. [Institute of Biomaterials and Biomedical Engineering, University of Toronto, Toronto, Ontario M5G 2M9, Canada and Department of Biomedical Engineering, Johns Hopkins University, Baltimore Maryland 21205 (Canada); Stayman, J. Webster; Zbijewski, Wojciech [Department of Biomedical Engineering, Johns Hopkins University, Baltimore Maryland 21205 (United States); Siewerdsen, Jeffrey H., E-mail: jeff.siewerdsen@jhu.edu [Institute of Biomaterials and Biomedical Engineering, University of Toronto, Toronto, Ontario M5G 2M9, Canada and Department of Biomedical Engineering, Johns Hopkins University, Baltimore, Maryland 21205 (United States)

    2014-08-15

    Purpose: Nonstationarity is an important aspect of imaging performance in CT and cone-beam CT (CBCT), especially for systems employing iterative reconstruction. This work presents a theoretical framework for both filtered-backprojection (FBP) and penalized-likelihood (PL) reconstruction that includes explicit descriptions of nonstationary noise, spatial resolution, and task-based detectability index. Potential utility of the model was demonstrated in the optimal selection of regularization parameters in PL reconstruction. Methods: Analytical models for local modulation transfer function (MTF) and noise-power spectrum (NPS) were investigated for both FBP and PL reconstruction, including explicit dependence on the object and spatial location. For FBP, a cascaded systems analysis framework was adapted to account for nonstationarity by separately calculating fluence and system gains for each ray passing through any given voxel. For PL, the point-spread function and covariance were derived using the implicit function theorem and first-order Taylor expansion according to Fessler [“Mean and variance of implicitly defined biased estimators (such as penalized maximum likelihood): Applications to tomography,” IEEE Trans. Image Process. 5(3), 493–506 (1996)]. Detectability index was calculated for a variety of simple tasks. The model for PL was used in selecting the regularization strength parameter to optimize task-based performance, with both a constant and a spatially varying regularization map. Results: Theoretical models of FBP and PL were validated in 2D simulated fan-beam data and found to yield accurate predictions of local MTF and NPS as a function of the object and the spatial location. The NPS for both FBP and PL exhibit similar anisotropic nature depending on the pathlength (and therefore, the object and spatial location within the object) traversed by each ray, with the PL NPS experiencing greater smoothing along directions with higher noise. The MTF of FBP

  18. Hazmat Yearly Incident Summary Reports

    Data.gov (United States)

    Department of Transportation — Series of Incident data and summary statistics reports produced which provide statistical information on incidents by type, year, geographical location, and others....

  19. Radiation incidents in dentistry

    International Nuclear Information System (INIS)

    Lovelock, D.J.

    1996-01-01

    Most dental practitioners act as their own radiographer and radiologist, unlike their medical colleagues. Virtually all dental surgeons have a dental X-ray machine for intraoral radiography available to them and 40% of dental practices have equipment for dental panoramic tomography. Because of the low energy of X-ray equipment used in dentistry, radiation incidents tend to be less serious than those associated with other aspects of patient care. Details of 47 known incidents are given. The advent of the 1985 and 1988 Ionising Radiation Regulations has made dental surgeons more aware of the hazards of radiation. These regulations, and general health and safety legislation, have led to a few dental surgeons facing legal action. Because of the publicity associated with these court cases, it is expected that there will be a decrease in radiation incidents arising from the practice of dentistry. (author)

  20. Bias Correction for the Maximum Likelihood Estimate of Ability. Research Report. ETS RR-05-15

    Science.gov (United States)

    Zhang, Jinming

    2005-01-01

    Lord's bias function and the weighted likelihood estimation method are effective in reducing the bias of the maximum likelihood estimate of an examinee's ability under the assumption that the true item parameters are known. This paper presents simulation studies to determine the effectiveness of these two methods in reducing the bias when the item…

  1. Analyzing multivariate survival data using composite likelihood and flexible parametric modeling of the hazard functions

    DEFF Research Database (Denmark)

    Nielsen, Jan; Parner, Erik

    2010-01-01

    In this paper, we model multivariate time-to-event data by composite likelihood of pairwise frailty likelihoods and marginal hazards using natural cubic splines. Both right- and interval-censored data are considered. The suggested approach is applied to two types of family studies using the gamma

  2. Existence and uniqueness of the maximum likelihood estimator for models with a Kronecker product covariance structure

    NARCIS (Netherlands)

    Ros, B.P.; Bijma, F.; de Munck, J.C.; de Gunst, M.C.M.

    2016-01-01

    This paper deals with multivariate Gaussian models for which the covariance matrix is a Kronecker product of two matrices. We consider maximum likelihood estimation of the model parameters, in particular of the covariance matrix. There is no explicit expression for the maximum likelihood estimator

  3. Likelihood functions for the analysis of single-molecule binned photon sequences

    Energy Technology Data Exchange (ETDEWEB)

    Gopich, Irina V., E-mail: irinag@niddk.nih.gov [Laboratory of Chemical Physics, National Institute of Diabetes and Digestive and Kidney Diseases, National Institutes of Health, Bethesda, MD 20892 (United States)

    2012-03-02

    Graphical abstract: Folding of a protein with attached fluorescent dyes, the underlying conformational trajectory of interest, and the observed binned photon trajectory. Highlights: (i) A sequence of photon counts can be analyzed using a likelihood function. (ii) The exact likelihood function for a two-state kinetic model is provided. (iii) Several approximations are considered for an arbitrary kinetic model. (iv) Improved likelihood functions are obtained to treat sequences of FRET efficiencies. Abstract: We consider the analysis of a class of experiments in which the number of photons in consecutive time intervals is recorded. Sequences of photon counts or, alternatively, of FRET efficiencies can be studied using likelihood-based methods. For a kinetic model of the conformational dynamics and state-dependent Poisson photon statistics, the formalism to calculate the exact likelihood that this model describes such sequences of photons or FRET efficiencies is developed. Explicit analytic expressions for the likelihood function for a two-state kinetic model are provided. The important special case when conformational dynamics are so slow that at most a single transition occurs in a time bin is considered. By making a series of approximations, we eventually recover the likelihood function used in hidden Markov models. In this way, not only is insight gained into the range of validity of this procedure, but also an improved likelihood function can be obtained.
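
    A minimal sketch of the slow-dynamics limit discussed in the record, where the likelihood reduces to that of a hidden Markov model with Poisson emissions; the rates and transition probabilities are illustrative assumptions.

```python
import numpy as np
from scipy.stats import poisson

# Two-state kinetic scheme with state-dependent Poisson counts per bin,
# evaluated by the scaled forward algorithm.
rates = np.array([2.0, 10.0])        # mean photons per bin in each state
P = np.array([[0.95, 0.05],          # interbin state transition probabilities
              [0.10, 0.90]])
pi = np.array([0.5, 0.5])            # initial state distribution

def loglik(counts):
    alpha = pi * poisson.pmf(counts[0], rates)
    ll = np.log(alpha.sum())
    alpha /= alpha.sum()
    for c in counts[1:]:
        alpha = (alpha @ P) * poisson.pmf(c, rates)
        ll += np.log(alpha.sum())
        alpha /= alpha.sum()         # rescale to avoid numerical underflow
    return ll

print(loglik(np.array([2, 1, 3, 9, 12, 11, 2, 1])))
```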

  4. Use of deterministic sampling for exploring likelihoods in linkage analysis for quantitative traits.

    NARCIS (Netherlands)

    Mackinnon, M.J.; Beek, van der S.; Kinghorn, B.P.

    1996-01-01

    Deterministic sampling was used to numerically evaluate the expected log-likelihood surfaces of QTL-marker linkage models in large pedigrees with simple structures. By calculating the expected values of likelihoods, questions of power of experimental designs, bias in parameter estimates, approximate

  5. Likelihood ratio data to report the validation of a forensic fingerprint evaluation method

    NARCIS (Netherlands)

    Ramos, Daniel; Haraksim, Rudolf; Meuwly, Didier

    2017-01-01

    Data to which the authors refer throughout this article are likelihood ratios (LR) computed from the comparison of 5–12 minutiae fingermarks with fingerprints. These LR data are used for the validation of a likelihood ratio (LR) method in forensic evidence evaluation. These data present a

  6. A guideline for the validation of likelihood ratio methods used for forensic evidence evaluation

    NARCIS (Netherlands)

    Meuwly, Didier; Ramos, Daniel; Haraksim, Rudolf

    2017-01-01

    This Guideline proposes a protocol for the validation of forensic evaluation methods at the source level, using the Likelihood Ratio framework as defined within the Bayes’ inference model. In the context of the inference of identity of source, the Likelihood Ratio is used to evaluate the strength of

  7. Predictors of Self-Reported Likelihood of Working with Older Adults

    Science.gov (United States)

    Eshbaugh, Elaine M.; Gross, Patricia E.; Satrom, Tatum

    2010-01-01

    This study examined the self-reported likelihood of working with older adults in a future career among 237 college undergraduates at a midsized Midwestern university. Although aging anxiety was not significantly related to likelihood of working with older adults, those students who had a greater level of death anxiety were less likely than other…

  8. Organizational Justice and Men's Likelihood to Sexually Harass: The Moderating Role of Sexism and Personality

    Science.gov (United States)

    Krings, Franciska; Facchin, Stephanie

    2009-01-01

    This study demonstrated relations between men's perceptions of organizational justice and increased sexual harassment proclivities. Respondents reported higher likelihood to sexually harass under conditions of low interactional justice, suggesting that sexual harassment likelihood may increase as a response to perceived injustice. Moreover, the…

  9. Sampling variability in forensic likelihood-ratio computation: A simulation study

    NARCIS (Netherlands)

    Ali, Tauseef; Spreeuwers, Lieuwe Jan; Veldhuis, Raymond N.J.; Meuwly, Didier

    2015-01-01

    Recently, in the forensic biometric community, there is a growing interest in computing a metric called “likelihood-ratio” when a pair of biometric specimens is compared using a biometric recognition system. Generally, a biometric recognition system outputs a score and therefore a likelihood-ratio

  10. The dorsal medial frontal cortex is sensitive to time on task, not response conflict or error likelihood.

    Science.gov (United States)

    Grinband, Jack; Savitskaya, Judith; Wager, Tor D; Teichert, Tobias; Ferrera, Vincent P; Hirsch, Joy

    2011-07-15

    The dorsal medial frontal cortex (dMFC) is highly active during choice behavior. Though many models have been proposed to explain dMFC function, the conflict monitoring model is the most influential. It posits that dMFC is primarily involved in detecting interference between competing responses thus signaling the need for control. It accurately predicts increased neural activity and response time (RT) for incompatible (high-interference) vs. compatible (low-interference) decisions. However, it has been shown that neural activity can increase with time on task, even when no decisions are made. Thus, the greater dMFC activity on incompatible trials may stem from longer RTs rather than response conflict. This study shows that (1) the conflict monitoring model fails to predict the relationship between error likelihood and RT, and (2) the dMFC activity is not sensitive to congruency, error likelihood, or response conflict, but is monotonically related to time on task. Copyright © 2010 Elsevier Inc. All rights reserved.

  11. Statistical modelling of survival data with random effects h-likelihood approach

    CERN Document Server

    Ha, Il Do; Lee, Youngjo

    2017-01-01

    This book provides a groundbreaking introduction to the likelihood inference for correlated survival data via the hierarchical (or h-) likelihood in order to obtain the (marginal) likelihood and to address the computational difficulties in inferences and extensions. The approach presented in the book overcomes shortcomings in the traditional likelihood-based methods for clustered survival data such as intractable integration. The text includes technical materials such as derivations and proofs in each chapter, as well as recently developed software programs in R (“frailtyHL”), while the real-world data examples together with an R package, “frailtyHL” in CRAN, provide readers with useful hands-on tools. Reviewing new developments since the introduction of the h-likelihood to survival analysis (methods for interval estimation of the individual frailty and for variable selection of the fixed effects in the general class of frailty models) and guiding future directions, the book is of interest to research...

  12. The likelihood principle and its proof – a never-ending story…

    DEFF Research Database (Denmark)

    Jørgensen, Thomas Martini

    2015-01-01

    An ongoing controversy in the philosophy of statistics is the so-called “likelihood principle”, essentially stating that all evidence obtained from an experiment about an unknown quantity θ is contained in the likelihood function of θ. Common classical statistical methodology, such as the use of significance tests and confidence intervals, depends on the experimental procedure and unrealized events and thus violates the likelihood principle. The likelihood principle was identified by that name and proved in a famous paper by Allan Birnbaum in 1962. However, ever since, both the principle itself and its proof have been highly debated. This presentation will illustrate the debate over both the principle and its proof, from 1962 up to today. An often-used experiment to illustrate the controversy between the classical interpretation and evidential confirmation based on the likelihood principle

  13. Major Accidents (Gray Swans) Likelihood Modeling Using Accident Precursors and Approximate Reasoning.

    Science.gov (United States)

    Khakzad, Nima; Khan, Faisal; Amyotte, Paul

    2015-07-01

    Compared to the remarkable progress in risk analysis of normal accidents, the risk analysis of major accidents has not been so well-established, partly due to the complexity of such accidents and partly due to the low probabilities involved. The issue of low probabilities normally arises from the scarcity of relevant data on major accidents, since such accidents are few and far between. In this work, knowing that major accidents are frequently preceded by accident precursors, a novel precursor-based methodology has been developed for likelihood modeling of major accidents in critical infrastructures based on a unique combination of accident precursor data, information theory, and approximate reasoning. For this purpose, we have introduced an innovative application of information analysis to identify the most informative near accident of a major accident. The observed data of the near accident were then used to establish predictive scenarios to foresee the occurrence of the major accident. We verified the methodology using offshore blowouts in the Gulf of Mexico, and then demonstrated its application to dam breaches in the United States. © 2015 Society for Risk Analysis.

  14. Theoretical Study of Penalized-Likelihood Image Reconstruction for Region of Interest Quantification

    International Nuclear Information System (INIS)

    Qi, Jinyi; Huesman, Ronald H.

    2006-01-01

    Region of interest (ROI) quantification is an important task in emission tomography (e.g., positron emission tomography and single photon emission computed tomography). It is essential for exploring clinical factors such as tumor activity, growth rate, and the efficacy of therapeutic interventions. Statistical image reconstruction methods based on the penalized maximum-likelihood (PML) or maximum a posteriori principle have been developed for emission tomography to deal with the low signal-to-noise ratio of the emission data. Similar to the filter cut-off frequency in the filtered backprojection method, the regularization parameter in PML reconstruction controls the resolution and noise tradeoff and, hence, affects ROI quantification. In this paper, we theoretically analyze the performance of ROI quantification in PML reconstructions. Building on previous work, we derive simplified theoretical expressions for the bias, variance, and ensemble mean-squared-error (EMSE) of the estimated total activity in an ROI that is surrounded by a uniform background. When the mean and covariance matrix of the activity inside the ROI are known, the theoretical expressions are readily computable and allow for fast evaluation of image quality for ROI quantification with different regularization parameters. The optimum regularization parameter can then be selected to minimize the EMSE. Computer simulations are conducted for small ROIs with variable uniform uptake. The results show that the theoretical predictions match the Monte Carlo results reasonably well

  15. Empirical Correction to the Likelihood Ratio Statistic for Structural Equation Modeling with Many Variables.

    Science.gov (United States)

    Yuan, Ke-Hai; Tian, Yubin; Yanagihara, Hirokazu

    2015-06-01

    Survey data typically contain many variables. Structural equation modeling (SEM) is commonly used in analyzing such data. The most widely used statistic for evaluating the adequacy of a SEM model is T_ML, a slight modification to the likelihood ratio statistic. Under the normality assumption, T_ML approximately follows a chi-square distribution when the number of observations (N) is large and the number of items or variables (p) is small. However, in practice, p can be rather large while N is always limited due to not having enough participants. Even with a relatively large N, empirical results show that T_ML rejects the correct model too often when p is not too small. Various corrections to T_ML have been proposed, but they are mostly heuristic. Following the principle of the Bartlett correction, this paper proposes an empirical approach to correct T_ML so that the mean of the resulting statistic approximately equals the degrees of freedom of the nominal chi-square distribution. Results show that empirically corrected statistics follow the nominal chi-square distribution much more closely than previously proposed corrections to T_ML, and they control type I errors reasonably well whenever N ≥ max(50, 2p). The formulations of the empirically corrected statistics are further used to predict type I errors of T_ML as reported in the literature, and they perform well.
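
    A toy sketch of the Bartlett-style calibration idea, assuming one can simulate the statistic under the null; the inflated chi-square stands in for T_ML, whose actual computation requires fitting the SEM model.

```python
import numpy as np

# Calibrate a scaling factor so the simulated null mean of the statistic
# matches the reference degrees of freedom. `simulate_stat` is a
# hypothetical stand-in for refitting the model to null-generated data.
def empirical_correction(simulate_stat, df, n_rep=2000, seed=0):
    rng = np.random.default_rng(seed)
    t_null = np.array([simulate_stat(rng) for _ in range(n_rep)])
    c = t_null.mean() / df           # estimated inflation factor
    return lambda t_obs: t_obs / c   # corrected statistic ~ chi-square(df)

# Toy demonstration: an inflated chi-square plays the role of T_ML.
df = 20
correct = empirical_correction(lambda r: 1.3 * r.chisquare(df), df)
print(correct(1.3 * df))  # pulled back toward df = 20
```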

  16. Comparison of Prediction-Error-Modelling Criteria

    DEFF Research Database (Denmark)

    Jørgensen, John Bagterp; Jørgensen, Sten Bay

    2007-01-01

    Single and multi-step prediction-error-methods based on the maximum likelihood and least squares criteria are compared. The prediction-error methods studied are based on predictions using the Kalman filter and Kalman predictors for a linear discrete-time stochastic state space model, which is a r...

  17. Sampling of systematic errors to estimate likelihood weights in nuclear data uncertainty propagation

    International Nuclear Information System (INIS)

    Helgesson, P.; Sjöstrand, H.; Koning, A.J.; Rydén, J.; Rochman, D.; Alhassan, E.; Pomp, S.

    2016-01-01

    In methodologies for nuclear data (ND) uncertainty assessment and propagation based on random sampling, likelihood weights can be used to infer experimental information into the distributions for the ND. As the included number of correlated experimental points grows large, the computational time for the matrix inversion involved in obtaining the likelihood can become a practical problem. There are also other problems related to the conventional computation of the likelihood, e.g., the assumption that all experimental uncertainties are Gaussian. In this study, a way to estimate the likelihood which avoids matrix inversion is investigated; instead, the experimental correlations are included by sampling of systematic errors. It is shown that the model underlying the sampling methodology (using univariate normal distributions for random and systematic errors) implies a multivariate Gaussian for the experimental points (i.e., the conventional model). It is also shown that the likelihood estimates obtained through sampling of systematic errors approach the likelihood obtained with matrix inversion as the sample size for the systematic errors grows large. In studied practical cases, it is seen that the estimates for the likelihood weights converge impractically slowly with the sample size, compared to matrix inversion. The computational time is estimated to be greater than for matrix inversion in cases with more experimental points, too. Hence, the sampling of systematic errors has little potential to compete with matrix inversion in cases where the latter is applicable. Nevertheless, the underlying model and the likelihood estimates can be easier to intuitively interpret than the conventional model and the likelihood function involving the inverted covariance matrix. Therefore, this work can both have pedagogical value and be used to help motivating the conventional assumption of a multivariate Gaussian for experimental data. The sampling of systematic errors could also
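
    A minimal sketch of the two routes compared in the record, for three data points with one fully correlated systematic error; all numbers are illustrative, not nuclear data.

```python
import numpy as np
from scipy.stats import multivariate_normal, norm

# Three experimental points with independent random errors plus one fully
# correlated systematic error.
model = np.array([1.0, 2.0, 3.0])
data = np.array([1.2, 2.3, 2.9])
sig_rand, sig_sys = 0.2, 0.3

# Conventional likelihood: multivariate Gaussian, matrix inversion inside.
cov = np.diag(np.full(3, sig_rand**2)) + sig_sys**2 * np.ones((3, 3))
L_exact = multivariate_normal(mean=model, cov=cov).pdf(data)

# Sampling route: draw the systematic shift, treat points as independent
# given the shift, and average the resulting likelihoods over the draws.
rng = np.random.default_rng(8)
shifts = rng.normal(0.0, sig_sys, size=200_000)
L_given_shift = norm.pdf(data, loc=model + shifts[:, None], scale=sig_rand)
L_sampled = L_given_shift.prod(axis=1).mean()
print(L_exact, L_sampled)  # the estimates agree as the sample size grows
```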

  18. Incident users of antipsychotics

    DEFF Research Database (Denmark)

    Baandrup, Lone; Kruse, Marie

    2016-01-01

    PURPOSE: In Denmark, as well as in many other countries, consumption of antipsychotics is on the rise, partly due to increasing off-label use. The aim of this study was to analyze and quantify the extent of off-label use and polypharmacy in incident users of antipsychotic medication, and to examine...

  19. The incidence of urea cycle disorders.

    Science.gov (United States)

    Summar, Marshall L; Koelker, Stefan; Freedenberg, Debra; Le Mons, Cynthia; Haberle, Johannes; Lee, Hye-Seung; Kirmse, Brian

    2013-01-01

    A key question for urea cycle disorders is their incidence. In the United States two UCDs, argininosuccinic synthetase and lyase deficiency, are currently detected by newborn screening. We used newborn screening data on over 6 million births and data from the large US and European longitudinal registries to determine how common these conditions are. The incidence for the United States is predicted to be 1 urea cycle disorder patient for every 35,000 births, presenting about 113 new patients per year across all age groups. © 2013.

  20. Profile of preoperative fecal organic acids closely predicts the incidence of postoperative infectious complications after major hepatectomy with extrahepatic bile duct resection: Importance of fecal acetic acid plus butyric acid minus lactic acid gap.

    Science.gov (United States)

    Yokoyama, Yukihiro; Mizuno, Takashi; Sugawara, Gen; Asahara, Takashi; Nomoto, Koji; Igami, Tsuyoshi; Ebata, Tomoki; Nagino, Masato

    2017-10-01

    To investigate the association between preoperative fecal organic acid concentrations and the incidence of postoperative infectious complications in patients undergoing major hepatectomy with extrahepatic bile duct resection for biliary malignancies. Fecal samples of 44 patients were collected before hepatectomy with bile duct resection for biliary malignancies. The concentrations of fecal organic acids, including acetic acid, butyric acid, and lactic acid, and of representative fecal bacteria were measured. The perioperative clinical characteristics and the concentrations of fecal organic acids were compared between patients with and without postoperative infectious complications. Among the 44 patients, 13 (30%) developed postoperative infectious complications. Patient age and intraoperative bleeding were significantly greater in patients with postoperative infectious complications than in those without. The concentrations of fecal acetic acid and butyric acid were significantly lower, whereas the concentration of fecal lactic acid tended to be higher, in the patients with postoperative infectious complications. The calculated gap, fecal acetic acid plus butyric acid minus lactic acid, was smaller in the patients with postoperative infectious complications (median 43.5 vs 76.1 μmol/g of feces, P = .011). Multivariate analysis revealed that a low acetic acid plus butyric acid minus lactic acid gap, reflecting a fecal organic acid profile with low acetic acid, low butyric acid, and high lactic acid, had a clinically important impact on the incidence of postoperative infectious complications in patients undergoing major hepatectomy with extrahepatic bile duct resection. Copyright © 2017. Published by Elsevier Inc.
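
    The gap itself is simple arithmetic; a trivial sketch, with hypothetical concentrations chosen only to reproduce the reported complication-group median.

```python
def organic_acid_gap(acetic, butyric, lactic):
    """Acetic plus butyric minus lactic acid gap, in umol/g of feces."""
    return acetic + butyric - lactic

# Hypothetical concentrations reproducing the reported median of 43.5
# umol/g in the complication group; the paper reports group medians,
# not a decision threshold.
print(organic_acid_gap(acetic=40.0, butyric=15.0, lactic=11.5))  # -> 43.5
```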

  1. Incidents with hazardous radiation sources

    International Nuclear Information System (INIS)

    Schoenhacker, Stefan

    2016-01-01

    Incidents with hazardous radiation sources can occur in any country, even those without nuclear facilities. Preparedness for such incidents is supposed to fulfill globally agreed minimum standards. Incidents are categorized as: incidents involving licensed handling of radiation sources, for example for material testing; transport accidents involving hazardous radiation sources; incidents with radionuclide batteries; incidents with satellites containing radioactive inventory; and incidents involving unlicensed handling of illegally acquired hazardous radiation sources. Emergency planning in Austria differentiates according to the consequences: incidents with release of radioactive materials resulting in restricted contamination, incidents with release of radioactive materials resulting in local contamination, and incidents with the hazard of enhanced exposure due to the radiation source.

  2. Full likelihood analysis of genetic risk with variable age at onset disease--combining population-based registry data and demographic information.

    Directory of Open Access Journals (Sweden)

    Janne Pitkäniemi

    Full Text Available BACKGROUND: In genetic studies of rare complex diseases it is common to ascertain familial data from population-based registries through all incident cases diagnosed during a pre-defined enrollment period. Such an ascertainment procedure is typically taken into account in the statistical analysis of the familial data by constructing either a retrospective or prospective likelihood expression, which conditions on the ascertainment event. Both of these approaches lead to a substantial loss of valuable data. METHODOLOGY AND FINDINGS: Here we consider instead the possibilities provided by a Bayesian approach to risk analysis, which also incorporates the ascertainment procedure and reference information concerning the genetic composition of the target population into the considered statistical model. Furthermore, the proposed Bayesian hierarchical survival model does not require that the considered genotype or haplotype effects be expressed as functions of corresponding allelic effects. Our modeling strategy is illustrated by a risk analysis of type 1 diabetes mellitus (T1D) in the Finnish population, based on the HLA-A, HLA-B and DRB1 human leucocyte antigen (HLA) information available for both ascertained sibships and a large number of unrelated individuals from the Finnish bone marrow donor registry. The heterozygous genotype DR3/DR4 at the DRB1 locus was associated with the lowest predictive probability of T1D-free survival to the age of 15, the estimate being 0.936 (0.926-0.945, 95% credible interval), compared to the average population T1D-free survival probability of 0.995. SIGNIFICANCE: The proposed statistical method can be modified for other population-based family data ascertained from a disease registry, provided that the ascertainment process is well documented and that external information concerning the sizes of birth cohorts and a suitable reference sample are available. We confirm the earlier findings from the same data concerning the HLA-DR3

  3. The fine-tuning cost of the likelihood in SUSY models

    CERN Document Server

    Ghilencea, D M

    2013-01-01

    In SUSY models, the fine tuning of the electroweak (EW) scale with respect to their parameters gamma_i={m_0, m_{1/2}, mu_0, A_0, B_0,...} and the maximal likelihood L to fit the experimental data are usually regarded as two different problems. We show that, if one regards the EW minimum conditions as constraints that fix the EW scale, this commonly held view is not correct and that the likelihood contains all the information about fine-tuning. In this case we show that the corrected likelihood is equal to the ratio L/Delta of the usual likelihood L and the traditional fine tuning measure Delta of the EW scale. A similar result is obtained for the integrated likelihood over the set {gamma_i}, that can be written as a surface integral of the ratio L/Delta, with the surface in gamma_i space determined by the EW minimum constraints. As a result, a large likelihood actually demands a large ratio L/Delta or equivalently, a small chi^2_{new}=chi^2_{old}+2*ln(Delta). This shows the fine-tuning cost to the likelihood ...

  4. Physical and Sexual Violence and Incident Sexually Transmitted Infections

    Science.gov (United States)

    Anand, Mallika; Redding, Colleen A.; Peipert, Jeffrey F.

    2009-01-01

    Objective To investigate whether women aged 13–35 who were victims of interpersonal violence were more likely than nonvictims to experience incident sexually transmitted infections (STIs). Methods We examined 542 women aged 13–35 enrolled in Project PROTECT, a randomized clinical trial that compared two different methods of computer-based intervention to promote the use of dual methods of contraception. Participants completed a baseline questionnaire that included questions about their history of interpersonal violence and were followed for incident STIs over the 2-year study period. We compared the incidence of STIs in women with and without a history of interpersonal violence using bivariate analyses and multiple logistic regression. Results In the bivariate analyses, STI incidence was found to be significantly associated with African American race/ethnicity, a higher number of sexual partners in the past month, and a lower likelihood of avoidance of sexual partners who pressure them to have sex without a condom. In both crude and adjusted regression analyses, time to incident STI was shorter among women who reported physical or sexual abuse in the year before study enrollment (HRRadj = 1.68, 95% CI 1.06, 2.65). Conclusions Women with a recent history of abuse are at significantly higher risk of incident STI than nonvictims. PMID:19245303

  5. Zero-inflated Poisson model based likelihood ratio test for drug safety signal detection.

    Science.gov (United States)

    Huang, Lan; Zheng, Dan; Zalkikar, Jyoti; Tiwari, Ram

    2017-02-01

    In recent decades, numerous methods have been developed for data mining of large drug safety databases, such as the Food and Drug Administration's (FDA's) Adverse Event Reporting System, where data matrices are formed with drugs as columns and adverse events as rows. Often, a large number of cells in these data matrices have zero cell counts, and some of them are "true zeros" indicating that the drug-adverse event pairs cannot occur; these zero counts are distinguished from the other zero counts that are modeled zero counts and simply indicate that the drug-adverse event pairs have not occurred yet or have not been reported yet. In this paper, a zero-inflated Poisson model based likelihood ratio test method is proposed to identify drug-adverse event pairs that have disproportionately high reporting rates, which are also called signals. The maximum likelihood estimates of the model parameters of the zero-inflated Poisson model based likelihood ratio test are obtained using the expectation-maximization algorithm. The zero-inflated Poisson model based likelihood ratio test is also modified to handle the stratified analyses for binary and categorical covariates (e.g. gender and age) in the data. The proposed zero-inflated Poisson model based likelihood ratio test method is shown to asymptotically control the type I error and false discovery rate, and its finite sample performance for signal detection is evaluated through a simulation study. The simulation results show that the zero-inflated Poisson model based likelihood ratio test method performs similarly to the Poisson model based likelihood ratio test method when the estimated percentage of true zeros in the database is small. Both the zero-inflated Poisson model based likelihood ratio test and likelihood ratio test methods are applied to six selected drugs from the 2006 to 2011 Adverse Event Reporting System database, with varying percentages of observed zero-count cells.
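
    A sketch of the core likelihood-ratio comparison, fitting a plain Poisson and a zero-inflated Poisson to simulated counts; it omits the stratification and the signal-detection thresholds developed in the paper.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import poisson

# Simulated counts: 40% structural zeros mixed with Poisson(3) counts.
rng = np.random.default_rng(9)
counts = np.where(rng.uniform(size=500) < 0.4, 0, rng.poisson(3.0, 500))

def zip_nll(params):
    pi, lam = params
    if not (0 < pi < 1) or lam <= 0:
        return np.inf
    ll_zero = np.log(pi + (1 - pi) * np.exp(-lam)) * (counts == 0).sum()
    pos = counts[counts > 0]
    ll_pos = (np.log(1 - pi) + poisson.logpmf(pos, lam)).sum()
    return -(ll_zero + ll_pos)

lam_pois = counts.mean()                      # Poisson MLE in closed form
ll_pois = poisson.logpmf(counts, lam_pois).sum()
fit = minimize(zip_nll, x0=[0.3, 2.0], method="Nelder-Mead")
lrt = 2 * (-fit.fun - ll_pois)
print(fit.x, lrt)  # (pi, lambda) estimates and the likelihood-ratio statistic
```

    Note that under the null the ZIP mixing proportion sits on the boundary of the parameter space, so the reference distribution of the statistic is not the usual chi-square; the paper's method is built to handle this.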

  6. Massive optimal data compression and density estimation for scalable, likelihood-free inference in cosmology

    Science.gov (United States)

    Alsing, Justin; Wandelt, Benjamin; Feeney, Stephen

    2018-03-01

    Many statistical models in cosmology can be simulated forwards but have intractable likelihood functions. Likelihood-free inference methods allow us to perform Bayesian inference from these models using only forward simulations, free from any likelihood assumptions or approximations. Likelihood-free inference generically involves simulating mock data and comparing to the observed data; this comparison in data-space suffers from the curse of dimensionality and requires compression of the data to a small number of summary statistics to be tractable. In this paper we use massive asymptotically-optimal data compression to reduce the dimensionality of the data-space to just one number per parameter, providing a natural and optimal framework for summary statistic choice for likelihood-free inference. Secondly, we present the first cosmological application of Density Estimation Likelihood-Free Inference (DELFI), which learns a parameterized model for joint distribution of data and parameters, yielding both the parameter posterior and the model evidence. This approach is conceptually simple, requires less tuning than traditional Approximate Bayesian Computation approaches to likelihood-free inference and can give high-fidelity posteriors from orders of magnitude fewer forward simulations. As an additional bonus, it enables parameter inference and Bayesian model comparison simultaneously. We demonstrate Density Estimation Likelihood-Free Inference with massive data compression on an analysis of the joint light-curve analysis supernova data, as a simple validation case study. We show that high-fidelity posterior inference is possible for full-scale cosmological data analyses with as few as ~10^4 simulations, with substantial scope for further improvement, demonstrating the scalability of likelihood-free inference to large and complex cosmological datasets.

  7. Undernutrition among adults in India: the significance of individual-level and contextual factors impacting on the likelihood of underweight across sub-populations.

    Science.gov (United States)

    Siddiqui, Md Zakaria; Donato, Ronald

    2017-01-01

    To investigate the extent to which individual-level as well as macro-level contextual factors influence the likelihood of underweight across adult sub-populations in India. Population-based cross-sectional survey, India's National Family Health Survey, conducted in 2005-06. We disaggregated into eight sub-populations. Multistage nationally representative household survey covering 99 % of India's population. The survey covered 124 385 females aged 15-49 years and 74 369 males aged 15-54 years. A social gradient in underweight exists in India. Even after allowing for wealth status, differences in the predicted probability of underweight persisted based upon rurality, age/maturity and gender. We found individual-level education lowered the likelihood of underweight for males, but no statistical association for females. Paradoxically, rural young (15-24 years) females from more educated villages had a higher likelihood of underweight relative to those in less educated villages; but for rural mature (>24 years) females the opposite was the case. Christians had a significantly lower likelihood of underweight relative to other socio-religious groups (OR=0·53-0·80). Higher state-level inequality increased the likelihood of underweight across most population groups, while neighbourhood inequality exhibited a similar relationship for the rural young population subgroups only. Individual states/neighbourhoods accounted for 5-9 % of the variation in the prediction of underweight. We found that rural young females represent a particularly highly vulnerable sub-population. Economic growth alone is unlikely to reduce the burden of malnutrition in India; accordingly, policy makers need to address the broader social determinants that contribute to higher underweight prevalence in specific demographic subgroups.

  8. Maximal information analysis: I - various Wayne State plots and the most common likelihood principle

    International Nuclear Information System (INIS)

    Bonvicini, G.

    2005-01-01

    Statistical analysis using all moments of the likelihood L(y|α) (y being the data and α being the fit parameters) is presented. The relevant plots for various data fitting situations are presented. The goodness of fit (GOF) parameter (currently the χ²) is redefined as the isoprobability level in a multidimensional space. Many useful properties of statistical analysis are summarized in a new statistical principle which states that the most common likelihood, and not the tallest, is the best possible likelihood, when comparing experiments or hypotheses

  9. Simplified likelihood for the re-interpretation of public CMS results

    CERN Document Server

    The CMS Collaboration

    2017-01-01

    In this note, a procedure for the construction of simplified likelihoods for the re-interpretation of the results of CMS searches for new physics is presented. The procedure relies on the use of a reduced set of information on the background models used in these searches which can readily be provided by the CMS collaboration. A toy example is used to demonstrate the procedure and its accuracy in reproducing the full likelihood for setting limits in models for physics beyond the standard model. Finally, two representative searches from the CMS collaboration are used to demonstrate the validity of the simplified likelihood approach under realistic conditions.

  10. Maximum Credible Incidents

    CERN Document Server

    Strait, J

    2009-01-01

    Following the incident in sector 34, considerable effort has been made to improve the systems for detecting similar faults and to improve the safety systems to limit the damage if a similar incident should occur. Nevertheless, even after the consolidation and repairs are completed, other faults may still occur in the superconducting magnet systems, which could result in damage to the LHC. Such faults include both direct failures of a particular component or system, or an incorrect response to a “normal” upset condition, for example a quench. I will review a range of faults which could be reasonably expected to occur in the superconducting magnet systems, and which could result in substantial damage and down-time to the LHC. I will evaluate the probability and the consequences of such faults, and suggest what mitigations, if any, are possible to protect against each.

  11. Contaminated Mexican steel incident

    International Nuclear Information System (INIS)

    1985-01-01

    This report documents the circumstances contributing to the inadvertent melting of cobalt 60 (Co-60) contaminated scrap metal in two Mexican steel foundries and the subsequent distribution of contaminated steel products into the United States. The report addresses mainly those actions taken by US Federal and state agencies to protect the US population from radiation risks associated with the incident. Mexico had much more serious radiation exposure and contamination problems to manage. The United States Government maintained a standing offer to provide technical and medical assistance to the Mexican Government. The report covers the tracing of the source to its origin, response actions to recover radioactive steel in the United States, and return of the contaminated materials to Mexico. The incident resulted in significant radiation exposures within Mexico, but no known significant exposure within the United States. Response to the incident required the combined efforts of the Nuclear Regulatory Commission (NRC), Department of Energy, Department of Transportation, Department of State, and US Customs Service (Department of Treasury) personnel at the Federal level and representatives of all 50 State Radiation Control Programs and, in some instances, local and county government personnel. The response also required a diplomatic interface with the Mexican Government and cooperation of numerous commercial establishments and members of the general public. The report describes the factual information associated with the event and may serve as information for subsequent recommendations and actions by the NRC. 8 figures

  12. Current incidence of duplicate publication in otolaryngology.

    Science.gov (United States)

    Cheung, Veronique Wan Fook; Lam, Gilbert O A; Wang, Yun Fan; Chadha, Neil K

    2014-03-01

    Duplicate publication, deemed highly unethical, is the reproduction of substantial content in another article by the same authors. In 1999, Rosenthal et al. identified an 8.5% incidence of duplicate articles in two otolaryngology journals. We explored the current incidence in three otolaryngology journals in North America and Europe. Retrospective literature review. Index articles published in 2008 in Archives of Otolaryngology-Head and Neck Surgery, Laryngoscope, and Clinical Otolaryngology were searched using MEDLINE. Potential duplicate publications from 2006 through 2010 were identified using the first, second, and last authors' names. Three authors independently investigated suspected duplicate publications, classifying them by degree of duplication. Of 358 index articles screened, 75 (20.9%) had 119 potential duplicates from 2006 to 2010. Full review of these 119 potential duplicates revealed 40 articles with some form of redundancy (33.6% of the potential duplicates), involving 27 index articles (7.5% of the 358 index articles): one (0.8%) "dual" publication (identical or nearly identical data and conclusions to the index article), three (2.5%) "suspected" dual publications (less than 50% new data and the same conclusions), and 36 (30.3%) publications with "salami-slicing" (a portion of the index article data repeated). Further analysis compared the likelihood of duplicate publication by study source and subspecialty within otolaryngology. The incidence of duplicate publication has not changed significantly over 10 years. "Salami-slicing" was a concerning practice, with no cross-referencing in 61% of these cases. Detecting and eliminating redundant publications is a laborious task, but it is essential to upholding journal quality and research integrity. © 2013 The American Laryngological, Rhinological and Otological Society, Inc.

  13. Debris Likelihood, based on GhostNet, NASA Aqua MODIS, and GOES Imager, EXPERIMENTAL

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Debris Likelihood Index (Estimated) is calculated from GhostNet, NASA Aqua MODIS Chl a and NOAA GOES Imager SST data. THIS IS AN EXPERIMENTAL PRODUCT: intended...

  14. A biclustering algorithm for binary matrices based on penalized Bernoulli likelihood

    KAUST Repository

    Lee, Seokho; Huang, Jianhua Z.

    2013-01-01

    We propose a new biclustering method for binary data matrices using the maximum penalized Bernoulli likelihood estimation. Our method applies a multi-layer model defined on the logits of the success probabilities, where each layer represents a
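
    The abstract is truncated, but the model class it names can still be illustrated: a Bernoulli likelihood defined on logits, plus a penalty that induces sparse bicluster membership. The rank-1, single-layer sketch below simplifies the paper's multi-layer estimator, and the L1 penalty form is an assumption:

    ```python
    import numpy as np

    def penalized_bernoulli_nll(X, u, v, lam):
        """Negative penalized Bernoulli log-likelihood for one rank-1 'layer'
        on the logits: logit P(X_ij = 1) = u_i * v_j. A sketch of the model
        class only; the paper's multi-layer estimator is not reproduced."""
        logits = np.outer(u, v)
        p = 1.0 / (1.0 + np.exp(-logits))
        ll = np.sum(X * np.log(p) + (1 - X) * np.log(1 - p))
        penalty = lam * (np.abs(u).sum() + np.abs(v).sum())  # sparsity -> membership
        return -ll + penalty

    rng = np.random.default_rng(1)
    X = (rng.random((6, 8)) < 0.3).astype(float)   # toy binary data matrix
    print(penalized_bernoulli_nll(X, rng.normal(size=6), rng.normal(size=8), lam=0.1))
    ```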

  15. Performances of the likelihood-ratio classifier based on different data modelings

    NARCIS (Netherlands)

    Chen, C.; Veldhuis, Raymond N.J.

    2008-01-01

    The classical likelihood-ratio classifier easily collapses in many biometric applications, especially with independent training and test subjects. The reason lies in inaccurate estimation of the underlying user-specific feature density. Firstly, the feature density estimation suffers from
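
    For orientation, a minimal sketch of the likelihood-ratio classifier itself, assuming Gaussian user and background densities in one dimension; the paper's point is precisely that such density estimates are hard to get right in practice:

    ```python
    import numpy as np
    from scipy.stats import norm

    # Hypothetical 1-D biometric feature: user-specific and background densities
    # estimated from training data (fixed by hand here for illustration).
    user_mu, user_sigma = 0.8, 0.3
    bg_mu, bg_sigma = 0.0, 1.0

    def llr(x):
        """Log-likelihood-ratio score: positive means the user model explains x
        better than the background model (the threshold is set by target error rates)."""
        return norm.logpdf(x, user_mu, user_sigma) - norm.logpdf(x, bg_mu, bg_sigma)

    for x in (0.7, -0.5):
        print(f"x = {x:+.1f}  LLR = {llr(x):+.2f}  ->", "accept" if llr(x) > 0 else "reject")
    ```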

  16. Finite mixture model: A maximum likelihood estimation approach on time series data

    Science.gov (United States)

    Yen, Phoong Seuk; Ismail, Mohd Tahir; Hamzah, Firdaus Mohamad

    2014-09-01

    Recently, statisticians have emphasized fitting finite mixture models by maximum likelihood estimation because of its asymptotic properties: the estimator is consistent as the sample size increases to infinity, it is asymptotically unbiased, and its parameter estimates attain the smallest variance among competing statistical methods as the sample size grows. Maximum likelihood estimation is therefore adopted in this paper to fit a two-component mixture model in order to explore the relationship between rubber prices and exchange rates for Malaysia, Thailand, the Philippines, and Indonesia. The results show a negative relationship between rubber price and exchange rate for all selected countries.
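
    A two-component mixture fitted by maximum likelihood can be sketched with scikit-learn's EM-based GaussianMixture. The series below is synthetic, standing in for the rubber-price/exchange-rate data, which this record does not include:

    ```python
    import numpy as np
    from sklearn.mixture import GaussianMixture

    # Synthetic returns drawn from two regimes, as a stand-in for the real series.
    rng = np.random.default_rng(2)
    returns = np.concatenate([rng.normal(-0.01, 0.02, 300),   # "calm" regime
                              rng.normal(0.00, 0.08, 100)])   # "volatile" regime

    gmm = GaussianMixture(n_components=2, random_state=0).fit(returns.reshape(-1, 1))
    print("weights:", gmm.weights_.round(3))
    print("means:  ", gmm.means_.ravel().round(4))
    print("log-likelihood per sample:", gmm.score(returns.reshape(-1, 1)).round(3))
    ```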

  17. Parameter estimation in astronomy through application of the likelihood ratio. [satellite data analysis techniques

    Science.gov (United States)

    Cash, W.

    1979-01-01

    Many problems in the experimental estimation of parameters for models can be solved through use of the likelihood ratio test. Applications of the likelihood ratio, with particular attention to photon counting experiments, are discussed. The procedures presented solve a greater range of problems than those currently in use, yet are no more difficult to apply. The procedures are proved analytically, and examples from current problems in astronomy are discussed.
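
    Cash's approach replaces χ² with a Poisson likelihood-ratio statistic, making it usable in the low-count regime of photon counting. A sketch of the resulting C statistic and a ΔC = 1 confidence interval for a one-parameter model; the bin counts and constant-rate model are illustrative:

    ```python
    import numpy as np

    def cash(n, m):
        """Cash's C statistic for Poisson counts n with model expectations m:
        C = 2 * sum(m - n * ln m), constant terms dropped. Delta-C behaves
        like a chi-square for parameter estimation (Cash 1979)."""
        return 2.0 * np.sum(m - n * np.log(m))

    rng = np.random.default_rng(3)
    n = rng.poisson(5.0, size=50)                 # photon counts in 50 bins
    rates = np.linspace(3.0, 8.0, 501)            # scan a constant-rate model
    C = np.array([cash(n, np.full(50, r)) for r in rates])
    best = rates[np.argmin(C)]
    # 1-sigma interval: Delta-C = 1 for one interesting parameter
    inside = rates[C <= C.min() + 1.0]
    print(f"rate = {best:.3f}  (1-sigma: {inside.min():.3f} .. {inside.max():.3f})")
    ```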

  18. Maximum Likelihood Approach for RFID Tag Set Cardinality Estimation with Detection Errors

    DEFF Research Database (Denmark)

    Nguyen, Chuyen T.; Hayashi, Kazunori; Kaneko, Megumi

    2013-01-01

    Estimation schemes for Radio Frequency IDentification (RFID) tag set cardinality are studied in this paper using a Maximum Likelihood (ML) approach. We consider the estimation problem under the model of multiple independent reader sessions with detection errors due to unreliable radio... The performance is evaluated under different system parameters and compared with that of the conventional method via computer simulations assuming flat Rayleigh fading environments and a framed-slotted ALOHA based protocol. Keywords: RFID; tag cardinality estimation; maximum likelihood; detection error
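
    Under the simplest reading of the model, each tag is detected in a session with a known probability, and the ML cardinality estimate follows from the binomial likelihood. The sketch below adopts that simplification and omits the paper's false-alarm and ALOHA-protocol details:

    ```python
    import numpy as np

    def ml_tag_estimate(distinct_detected, sessions, p_detect):
        """ML estimate of tag-set cardinality N when each tag is detected in a
        reader session independently with probability p_detect (a simplified
        stand-in for the paper's model)."""
        p_any = 1.0 - (1.0 - p_detect) ** sessions   # P(tag seen at least once)
        return distinct_detected / p_any

    rng = np.random.default_rng(4)
    N_true, K, p = 500, 3, 0.6
    seen = (rng.random((N_true, K)) < p).any(axis=1).sum()   # distinct tags observed
    print("detected:", seen, " ML estimate:", round(ml_tag_estimate(seen, K, p), 1))
    ```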

  19. Modified Moment, Maximum Likelihood and Percentile Estimators for the Parameters of the Power Function Distribution

    Directory of Open Access Journals (Sweden)

    Azam Zaka

    2014-10-01

    This paper is concerned with modifications of the maximum likelihood, moment, and percentile estimators of the two-parameter power function distribution. The sampling behavior of the estimators is assessed by Monte Carlo simulation. For some combinations of parameter values, some of the modified estimators outperform the traditional maximum likelihood, moment, and percentile estimators with respect to bias, mean square error, and total deviation.
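
    For the unmodified maximum likelihood estimators of the power function distribution f(x) = αx^(α−1)/β^α on (0, β], the closed forms are β̂ = max xᵢ and α̂ = n / Σ ln(β̂/xᵢ). A quick simulation check of those baselines; the paper's modified estimators are not reproduced here:

    ```python
    import numpy as np

    def power_function_mle(x):
        """MLEs for the two-parameter power function distribution
        f(x) = a * x**(a-1) / b**a on (0, b]:
        b_hat = max(x),  a_hat = n / sum(ln(b_hat / x))."""
        b_hat = x.max()
        a_hat = len(x) / np.log(b_hat / x).sum()
        return a_hat, b_hat

    rng = np.random.default_rng(5)
    a_true, b_true = 2.5, 4.0
    x = b_true * rng.random(2000) ** (1.0 / a_true)   # inverse-CDF sampling
    print(power_function_mle(x))                       # close to (2.5, 4.0)
    ```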

  20. Practical Statistics for LHC Physicists: Descriptive Statistics, Probability and Likelihood (1/3)

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    These lectures cover those principles and practices of statistics that are most relevant for work at the LHC. The first lecture discusses the basic ideas of descriptive statistics, probability and likelihood. The second lecture covers the key ideas in the frequentist approach, including confidence limits, profile likelihoods, p-values, and hypothesis testing. The third lecture covers inference in the Bayesian approach. Throughout, real-world examples will be used to illustrate the practical application of the ideas. No previous knowledge is assumed.