Augmenting Data with Published Results in Bayesian Linear Regression
de Leeuw, Christiaan; Klugkist, Irene
2012-01-01
In most research, linear regression analyses are performed without taking into account published results (i.e., reported summary statistics) of similar previous studies. Although the prior density in Bayesian linear regression could accommodate such prior knowledge, formal models for doing so are absent from the literature. The goal of this…
Statistical power analyses using G*Power 3.1: tests for correlation and regression analyses.
Faul, Franz; Erdfelder, Edgar; Buchner, Axel; Lang, Albert-Georg
2009-11-01
G*Power is a free power analysis program for a variety of statistical tests. We present extensions and improvements of the version introduced by Faul, Erdfelder, Lang, and Buchner (2007) in the domain of correlation and regression analyses. In the new version, we have added procedures to analyze the power of tests based on (1) single-sample tetrachoric correlations, (2) comparisons of dependent correlations, (3) bivariate linear regression, (4) multiple linear regression based on the random predictor model, (5) logistic regression, and (6) Poisson regression. We describe these new features and provide a brief introduction to their scope and handling.
Multicollinearity in Regression Analyses Conducted in Epidemiologic Studies.
Vatcheva, Kristina P; Lee, MinJae; McCormick, Joseph B; Rahbar, Mohammad H
2016-04-01
The adverse impact of ignoring multicollinearity on findings and data interpretation in regression analysis is very well documented in the statistical literature. The failure to identify and report multicollinearity could result in misleading interpretations of the results. A review of epidemiological literature in PubMed from January 2004 to December 2013, illustrated the need for a greater attention to identifying and minimizing the effect of multicollinearity in analysis of data from epidemiologic studies. We used simulated datasets and real life data from the Cameron County Hispanic Cohort to demonstrate the adverse effects of multicollinearity in the regression analysis and encourage researchers to consider the diagnostic for multicollinearity as one of the steps in regression analysis.
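The multicollinearity diagnostic the authors encourage is commonly operationalized through variance inflation factors (VIFs). As an illustration (not code from the paper), a minimal numpy sketch of the VIF computation on simulated data with two nearly collinear predictors:

```python
import numpy as np

def vif(X):
    """Variance inflation factor for each column of design matrix X.

    VIF_j = 1 / (1 - R_j^2), where R_j^2 comes from regressing column j
    on the remaining columns (with an intercept).
    """
    X = np.asarray(X, dtype=float)
    n, p = X.shape
    out = []
    for j in range(p):
        y = X[:, j]
        others = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
        beta, *_ = np.linalg.lstsq(others, y, rcond=None)
        resid = y - others @ beta
        r2 = 1 - resid.var() / y.var()
        out.append(1.0 / (1.0 - r2))
    return np.array(out)

# Two nearly collinear predictors inflate each other's VIF.
rng = np.random.default_rng(0)
x1 = rng.normal(size=200)
x2 = x1 + rng.normal(scale=0.1, size=200)   # almost a copy of x1
x3 = rng.normal(size=200)                   # independent predictor
v = vif(np.column_stack([x1, x2, x3]))
```

A common rule of thumb flags VIFs above 5-10 as evidence of problematic collinearity; here the first two columns far exceed that threshold while the independent one stays near 1.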
Applications of MIDAS regression in analysing trends in water quality
Penev, Spiridon; Leonte, Daniela; Lazarov, Zdravetz; Mann, Rob A.
2014-04-01
We discuss novel statistical methods in analysing trends in water quality. Such analysis uses complex data sets of different classes of variables, including water quality, hydrological and meteorological. We analyse the effect of rainfall and flow on trends in water quality utilising a flexible model called Mixed Data Sampling (MIDAS). This model arises because of the mixed frequency in the data collection. Typically, water quality variables are sampled fortnightly, whereas the rain data is sampled daily. The advantage of using MIDAS regression is in the flexible and parsimonious modelling of the influence of the rain and flow on trends in water quality variables. We discuss the model and its implementation on a data set from the Shoalhaven Supply System and Catchments in the state of New South Wales, Australia. Information criteria indicate that MIDAS modelling improves upon simplistic approaches that do not utilise the mixed data sampling nature of the data.
Repeated Results Analysis for Middleware Regression Benchmarking
Czech Academy of Sciences Publication Activity Database
Bulej, Lubomír; Kalibera, T.; Tůma, P.
2005-01-01
Vol. 60 (2005), pp. 345-358. ISSN 0166-5316. R&D Projects: GA ČR GA102/03/0672. Institutional research plan: CEZ:AV0Z10300504. Keywords: middleware benchmarking; regression benchmarking; regression testing. Subject RIV: JD - Computer Applications, Robotics. Impact factor: 0.756 (2005).
Statistical and regression analyses of detected extrasolar systems
Czech Academy of Sciences Publication Activity Database
Pintr, Pavel; Peřinová, V.; Lukš, A.; Pathak, A.
2013-01-01
Vol. 75, No. 1 (2013), pp. 37-45. ISSN 0032-0633. Institutional support: RVO:61389021. Keywords: exoplanets; Kepler candidates; regression analysis. Subject RIV: BN - Astronomy, Celestial Mechanics, Astrophysics. Impact factor: 1.630 (2013). http://www.sciencedirect.com/science/article/pii/S0032063312003066
Analysing inequalities in Germany: a structured additive distributional regression approach
Silbersdorff, Alexander
2017-01-01
This book seeks new perspectives on the growing inequalities that our societies face, putting forward Structured Additive Distributional Regression as a means of statistical analysis that circumvents the common problem of analytical reduction to simple point estimators. This new approach allows the discrepancy between individuals' realities and the abstract representation of those realities that arises when using the arithmetic mean alone to be explicitly taken into consideration. In turn, the method is applied to the question of economic inequality in Germany.
USE OF THE SIMPLE LINEAR REGRESSION MODEL IN MACRO-ECONOMICAL ANALYSES
Directory of Open Access Journals (Sweden)
Constantin ANGHELACHE
2011-10-01
The article presents the fundamental aspects of linear regression as a toolbox that can be used in macroeconomic analyses. The article describes the estimation of the parameters, the statistical tests used, and homoscedasticity and heteroscedasticity. The use of econometric instruments in macroeconomics is an important factor that guarantees the quality of the models, analyses, results and any interpretations that can be drawn at this level.
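As a hedged illustration of the parameter estimation such an article covers, the closed-form least-squares estimates for simple linear regression can be sketched in a few lines (the data here are simulated, not macroeconomic series from the article):

```python
import numpy as np

# Simulated data with known parameters: intercept 2.0, slope 0.5
rng = np.random.default_rng(1)
x = rng.uniform(0, 10, size=100)
y = 2.0 + 0.5 * x + rng.normal(scale=0.3, size=100)

# Closed-form OLS estimates: b1 = cov(x, y) / var(x), b0 = ybar - b1 * xbar
b1 = np.cov(x, y, bias=True)[0, 1] / np.var(x)
b0 = y.mean() - b1 * x.mean()
```

With moderate noise and 100 observations the estimates land close to the true values, which is the behaviour the article's statistical tests are designed to quantify.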
Mapping the results of local statistics: Using geographically weighted regression
Directory of Open Access Journals (Sweden)
Stephen A. Matthews
2012-03-01
BACKGROUND The application of geographically weighted regression (GWR) - a local spatial statistical technique used to test for spatial nonstationarity - has grown rapidly in the social, health, and demographic sciences. GWR is a useful exploratory analytical tool that generates a set of location-specific parameter estimates which can be mapped and analysed to provide information on spatial nonstationarity in the relationships between predictors and the outcome variable. OBJECTIVE A major challenge to users of GWR methods is how best to present and synthesize the large number of mappable results, specifically the local parameter estimates and local t-values, generated from local GWR models. We offer an elegant solution. METHODS This paper introduces a mapping technique to simultaneously display local parameter estimates and local t-values on one map, based on the use of data selection and transparency techniques. We integrate GWR software and a GIS software package (ArcGIS) and adapt earlier work in cartography on bivariate mapping. We compare traditional mapping strategies (i.e., side-by-side comparison and isoline overlay maps) with our method using an illustration focusing on US county infant mortality data. CONCLUSIONS The resultant map design is more elegant than methods used to date. This type of map presentation can facilitate the exploration and interpretation of nonstationarity, focusing map reader attention on the areas of primary interest.
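The local fits underlying GWR amount to kernel-weighted least squares at each location. The following numpy sketch is illustrative only (not the authors' code; the Gaussian kernel and bandwidth choice are our assumptions) and shows how local slope estimates differ across space when the true coefficient is nonstationary:

```python
import numpy as np

def gwr_local_fit(coords, X, y, target, bandwidth):
    """Fit one local weighted regression of a GWR at `target` location.

    Observations are weighted by a Gaussian kernel of distance to the
    target; returns the local [intercept, slope] estimates.
    """
    d = np.linalg.norm(coords - target, axis=1)
    w = np.exp(-0.5 * (d / bandwidth) ** 2)
    Xd = np.column_stack([np.ones(len(X)), X])
    W = np.diag(w)
    return np.linalg.solve(Xd.T @ W @ Xd, Xd.T @ W @ y)

# Synthetic surface: the slope on x drifts from west to east, so local
# estimates should differ between the two sides of the study area.
rng = np.random.default_rng(2)
coords = rng.uniform(0, 1, size=(400, 2))
x = rng.normal(size=400)
slope = 1.0 + 2.0 * coords[:, 0]          # nonstationary coefficient
y = slope * x + rng.normal(scale=0.1, size=400)

west = gwr_local_fit(coords, x[:, None], y, np.array([0.1, 0.5]), 0.15)
east = gwr_local_fit(coords, x[:, None], y, np.array([0.9, 0.5]), 0.15)
```

Mapping such location-specific estimates (and their t-values) over all fit points is exactly the presentation problem the paper addresses.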
Tripepi, Giovanni; Jager, Kitty J.; Stel, Vianda S.; Dekker, Friedo W.; Zoccali, Carmine
2011-01-01
Because of some limitations of stratification methods, epidemiologists frequently use multiple linear and logistic regression analyses to address specific epidemiological questions. If the dependent variable is a continuous one (for example, systolic pressure and serum creatinine), the researcher
Linear regression metamodeling as a tool to summarize and present simulation model results.
Jalal, Hawre; Dowd, Bryan; Sainfort, François; Kuntz, Karen M
2013-10-01
Modelers lack a tool to systematically and clearly present complex model results, including those from sensitivity analyses. The objective was to propose linear regression metamodeling as a tool to increase transparency of decision analytic models and better communicate their results. We used a simplified cancer cure model to demonstrate our approach. The model computed the lifetime cost and benefit of 3 treatment options for cancer patients. We simulated 10,000 cohorts in a probabilistic sensitivity analysis (PSA) and regressed the model outcomes on the standardized input parameter values in a set of regression analyses. We used the regression coefficients to describe measures of sensitivity analyses, including threshold and parameter sensitivity analyses. We also compared the results of the PSA to deterministic full-factorial and one-factor-at-a-time designs. The regression intercept represented the estimated base-case outcome, and the other coefficients described the relative parameter uncertainty in the model. We defined simple relationships that compute the average and incremental net benefit of each intervention. Metamodeling produced outputs similar to traditional deterministic 1-way or 2-way sensitivity analyses but was more reliable since it used all parameter values. Linear regression metamodeling is a simple, yet powerful, tool that can assist modelers in communicating model characteristics and sensitivity analyses.
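The core metamodeling step - regressing PSA outcomes on standardized input parameters so that the intercept recovers the base case and the coefficients rank parameter influence - can be sketched as follows. This is an illustrative toy model: the cure-probability and cost distributions are invented, not the authors' cancer model:

```python
import numpy as np

# Hypothetical probabilistic sensitivity analysis (PSA) draws
rng = np.random.default_rng(3)
n = 10_000
p_cure = rng.beta(20, 30, n)                  # uncertain cure probability
cost = rng.normal(50_000, 5_000, n)           # uncertain treatment cost
net_benefit = 100_000 * p_cure - cost + rng.normal(0, 1_000, n)

# Standardize the inputs, then regress the outcome on them
Z = np.column_stack([(p_cure - p_cure.mean()) / p_cure.std(),
                     (cost - cost.mean()) / cost.std()])
Xd = np.column_stack([np.ones(n), Z])
coef, *_ = np.linalg.lstsq(Xd, net_benefit, rcond=None)
# coef[0] estimates the base-case outcome; |coef[1:]| rank parameter
# influence on a common (per standard deviation) scale.
```

Because the predictors are centered, the intercept equals the mean simulated outcome, and each remaining coefficient is the change in net benefit per standard deviation of that parameter, which is what makes the coefficients comparable measures of sensitivity.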
Directory of Open Access Journals (Sweden)
Esther Leushuis
2016-12-01
Background: Standardization of the semen analysis may improve reproducibility. We assessed variability between laboratories in semen analyses and evaluated whether a transformation using Z scores and regression statistics was able to reduce this variability. Materials and Methods: We performed a retrospective cohort study. We calculated between-laboratory coefficients of variation (CVB) for sperm concentration and for morphology. Subsequently, we standardized the semen analysis results by calculating laboratory-specific Z scores, and by using regression. We used analysis of variance for four semen parameters to assess systematic differences between laboratories before and after the transformations, both in the circulation samples and in the samples obtained in the prospective cohort study in the Netherlands between January 2002 and February 2004. Results: The mean CVB was 7% for sperm concentration (range 3 to 13%) and 32% for sperm morphology (range 18 to 51%). The differences between the laboratories were statistically significant for all semen parameters (all P<0.001). Standardization using Z scores did not reduce the differences in semen analysis results between the laboratories (all P<0.001). Conclusion: There exists large between-laboratory variability for sperm morphology and small, but statistically significant, between-laboratory variation for sperm concentration. Standardization using Z scores does not eliminate between-laboratory variability.
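The two summary statistics at the heart of this analysis - the between-laboratory coefficient of variation and laboratory-specific Z scores - are short computations. The numbers below are invented for illustration, not the study's data:

```python
import numpy as np

def between_lab_cv(lab_means):
    """Between-laboratory CV: SD of the laboratory means over their grand mean."""
    m = np.asarray(lab_means, dtype=float)
    return m.std(ddof=1) / m.mean()

def z_scores(values, lab_mean, lab_sd):
    """Standardize one laboratory's results against its own mean and SD."""
    return (np.asarray(values, dtype=float) - lab_mean) / lab_sd

# Illustrative sperm-concentration means (10^6/mL) from five labs
means = [58, 60, 62, 55, 65]
cv = between_lab_cv(means)
z = z_scores([40, 60, 80], lab_mean=60, lab_sd=20)
```

Z-scoring removes each laboratory's own location and scale, which is why residual between-laboratory differences after the transformation (as the study found) point to something beyond simple calibration offsets.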
International Nuclear Information System (INIS)
Slutskaya, N.G.; Mosseh, I.B.
2006-01-01
Data about genetic mutations under radiation and chemical treatment for different types of cells have been analyzed with correlation and regression analyses. Linear correlation between different genetic effects in sex cells and somatic cells have found. The results may be extrapolated on sex cells of human and mammals. (authors)
The number of subjects per variable required in linear regression analyses.
Austin, Peter C; Steyerberg, Ewout W
2015-06-01
To determine the number of independent variables that can be included in a linear regression model. We used a series of Monte Carlo simulations to examine the impact of the number of subjects per variable (SPV) on the accuracy of estimated regression coefficients and standard errors, on the empirical coverage of estimated confidence intervals, and on the accuracy of the estimated R² of the fitted model. A minimum of approximately two SPV tended to result in estimation of regression coefficients with relative bias of less than 10%. Furthermore, with this minimum number of SPV, the standard errors of the regression coefficients were accurately estimated and estimated confidence intervals had approximately the advertised coverage rates. A much higher number of SPV was necessary to minimize bias in estimating the model R², although adjusted R² estimates behaved well. The bias in estimating the model R² statistic was inversely proportional to the magnitude of the proportion of variation explained by the population regression model. Linear regression models require only two SPV for adequate estimation of regression coefficients, standard errors, and confidence intervals.
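The simulation design can be sketched as: generate data at a fixed subjects-per-variable ratio, fit OLS repeatedly, and measure the relative bias of a coefficient. This is an illustrative reconstruction under our own assumptions (Gaussian predictors, one nonzero coefficient), not the authors' code:

```python
import numpy as np

def spv_bias(n_subjects, n_vars, n_sims=2000, seed=0):
    """Monte Carlo sketch: relative bias of one OLS coefficient at a
    given subjects-per-variable (SPV) ratio."""
    rng = np.random.default_rng(seed)
    true_beta = 0.5
    est = []
    for _ in range(n_sims):
        X = rng.normal(size=(n_subjects, n_vars))
        y = true_beta * X[:, 0] + rng.normal(size=n_subjects)
        Xd = np.column_stack([np.ones(n_subjects), X])
        b, *_ = np.linalg.lstsq(Xd, y, rcond=None)
        est.append(b[1])
    return (np.mean(est) - true_beta) / true_beta

# Even at roughly 2 subjects per variable, OLS coefficients are
# nearly unbiased, consistent with the paper's headline finding.
bias = spv_bias(n_subjects=20, n_vars=10)
```

Coefficient unbiasedness at low SPV is expected from OLS theory; the paper's more delicate findings concern standard errors, interval coverage, and the strong small-sample bias of unadjusted R².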
Sampson, Maureen L; Gounden, Verena; van Deventer, Hendrik E; Remaley, Alan T
2016-02-01
The main drawback of the periodic analysis of quality control (QC) material is that test performance is not monitored in the time periods between QC analyses, potentially leading to the reporting of faulty test results. The objective of this study was to develop a patient-based QC procedure for the more timely detection of test errors. Results from a Chem-14 panel measured on the Beckman LX20 analyzer were used to develop the model. Each test result was predicted from the other 13 members of the panel by multiple regression, which resulted in correlation coefficients between the predicted and measured result of >0.7 for 8 of the 14 tests. A logistic regression model, which utilized the measured test result, the predicted test result, the day of the week and the time of day, was then developed for predicting test errors. The output of the logistic regression was tallied by a daily CUSUM approach and used to predict test errors, with a fixed specificity of 90%. The mean average run length (ARL) before error detection by CUSUM-Logistic Regression (CSLR) was 20, with a mean sensitivity of 97%, which was considerably shorter than the mean ARL of 53 (sensitivity 87.5%) for a simple prediction model that only used the measured result for error detection. A CUSUM-Logistic Regression analysis of patient laboratory data can be an effective approach for the rapid and sensitive detection of clinical laboratory errors.
Two SPSS programs for interpreting multiple regression results.
Lorenzo-Seva, Urbano; Ferrando, Pere J; Chico, Eliseo
2010-02-01
When multiple regression is used in explanation-oriented designs, it is very important to determine both the usefulness of the predictor variables and their relative importance. Standardized regression coefficients are routinely provided by commercial programs. However, they generally function rather poorly as indicators of relative importance, especially in the presence of substantially correlated predictors. We provide two user-friendly SPSS programs that implement currently recommended techniques and recent developments for assessing the relevance of the predictors. The programs also allow the user to take into account the effects of measurement error. The first program, MIMR-Corr.sps, uses a correlation matrix as input, whereas the second program, MIMR-Raw.sps, uses the raw data and computes bootstrap confidence intervals of different statistics. The SPSS syntax, a short manual, and data files related to this article are available as supplemental materials from http://brm.psychonomic-journals.org/content/supplemental.
Directory of Open Access Journals (Sweden)
Giuliano de Oliveira Freitas
2013-10-01
PURPOSE: To determine linear regression models between Alpins descriptive indices and Thibos astigmatic power vectors (APV), assessing the validity and strength of such correlations. METHODS: This case series prospectively assessed 62 eyes of 31 consecutive cataract patients with preoperative corneal astigmatism between 0.75 and 2.50 diopters in both eyes. Patients were randomly assorted between two phacoemulsification groups: one assigned to receive an AcrySof® Toric intraocular lens (IOL) in both eyes and another assigned to have an AcrySof Natural IOL associated with limbal relaxing incisions, also in both eyes. All patients were reevaluated postoperatively at 6 months, when refractive astigmatism analysis was performed using both the Alpins and Thibos methods. The ratio between Thibos postoperative APV and preoperative APV (APVratio) and its linear regression to the Alpins percentage of success of astigmatic surgery, percentage of astigmatism corrected and percentage of astigmatism reduction at the intended axis were assessed. RESULTS: A significant negative correlation between the ratio of post- and preoperative Thibos APV (APVratio) and the Alpins percentage of success (%Success) was found (Spearman's ρ = -0.93); the linear regression is given by the following equation: %Success = (−APVratio + 1.00) × 100. CONCLUSION: The linear regression we found between the APVratio and %Success permits a validated mathematical inference concerning the overall success of astigmatic surgery.
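The reported relation is straightforward to apply directly. A minimal sketch (the function name is ours; the formula is the one reported in the abstract):

```python
def percent_success(apv_pre, apv_post):
    """Alpins %Success from Thibos astigmatic power vectors, using the
    reported linear relation %Success = (1.00 - APVratio) * 100."""
    apv_ratio = apv_post / apv_pre
    return (1.0 - apv_ratio) * 100.0

# Full correction (postoperative APV of 0) gives 100% success;
# no change in APV gives 0%.
full = percent_success(apv_pre=1.5, apv_post=0.0)
none = percent_success(apv_pre=1.5, apv_post=1.5)
```

The two boundary cases make the relation intuitive: %Success measures how much of the preoperative astigmatic power vector the surgery eliminated.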
Karami, K; Zerehdaran, S; Barzanooni, B; Lotfi, E
2017-12-01
1. The aim of the present study was to estimate genetic parameters for average egg weight (EW) and egg number (EN) at different ages in Japanese quail using multi-trait random regression (MTRR) models. 2. A total of 8534 records from 900 quail, hatched between 2014 and 2015, were used in the study. Average weekly egg weights and egg numbers were measured from second until sixth week of egg production. 3. Nine random regression models were compared to identify the best order of the Legendre polynomials (LP). The most optimal model was identified by the Bayesian Information Criterion. A model with second order of LP for fixed effects, second order of LP for additive genetic effects and third order of LP for permanent environmental effects (MTRR23) was found to be the best. 4. According to the MTRR23 model, direct heritability for EW increased from 0.26 in the second week to 0.53 in the sixth week of egg production, whereas the ratio of permanent environment to phenotypic variance decreased from 0.48 to 0.1. Direct heritability for EN was low, whereas the ratio of permanent environment to phenotypic variance decreased from 0.57 to 0.15 during the production period. 5. For each trait, estimated genetic correlations among weeks of egg production were high (from 0.85 to 0.98). Genetic correlations between EW and EN were low and negative for the first two weeks, but they were low and positive for the rest of the egg production period. 6. In conclusion, random regression models can be used effectively for analysing egg production traits in Japanese quail. Response to selection for increased egg weight would be higher at older ages because of its higher heritability and such a breeding program would have no negative genetic impact on egg production.
Analyses of Developmental Rate Isomorphy in Ectotherms: Introducing the Dirichlet Regression.
Directory of Open Access Journals (Sweden)
David S Boukal
Temperature drives development in insects and other ectotherms because their metabolic rate and growth depend directly on thermal conditions. However, relative durations of successive ontogenetic stages often remain nearly constant across a substantial range of temperatures. This pattern, termed 'developmental rate isomorphy' (DRI) in insects, appears to be widespread, and reported departures from DRI are generally very small. We show that these conclusions may be due to the caveats hidden in the statistical methods currently used to study DRI. Because the DRI concept is inherently based on proportional data, we propose that Dirichlet regression applied to individual-level data is an appropriate statistical method to critically assess DRI. As a case study we analyze data on five aquatic and four terrestrial insect species. We find that results obtained by Dirichlet regression are consistent with DRI violation in at least eight of the studied species, although standard analysis detects significant departure from DRI in only four of them. Moreover, the departures from DRI detected by Dirichlet regression are consistently much larger than previously reported. The proposed framework can also be used to infer whether observed departures from DRI reflect life history adaptations to size- or stage-dependent effects of varying temperature. Our results indicate that the concept of DRI in insects and other ectotherms should be critically re-evaluated and put in a wider context, including the concept of 'equiproportional development' developed for copepods.
International Nuclear Information System (INIS)
Bhowmik, K.R.; Islam, S.
2016-01-01
Logistic regression (LR) analysis is the most common statistical methodology used to find the determinants of childhood mortality. However, the significant predictors cannot be ranked according to their influence on the response variable. Multiple classification (MC) analysis can be applied to identify the significant predictors with a priority index, which helps to rank the predictors. The main objective of the study is to find the socio-demographic determinants of childhood mortality at the neonatal, post-neonatal, and post-infant periods by fitting an LR model, as well as to rank those through MC analysis. The study is conducted using the data of the Bangladesh Demographic and Health Survey 2007, where birth and death information of children was collected from their mothers. Three dichotomous response variables are constructed from children's age at death to fit the LR and MC models. Socio-economic and demographic variables significantly associated with the response variables separately are considered in the LR and MC analyses. Both the LR and MC models identified the same significant predictors for specific childhood mortality. For both neonatal and child mortality, biological factors of children, regional settings, and parents' socio-economic status are found to be the 1st, 2nd, and 3rd significant groups of predictors, respectively. Mother's education and household environment are detected as major significant predictors of post-neonatal mortality. This study shows that MC analysis, with or without LR analysis, can be applied to detect determinants with rank, which helps policy makers take initiatives on a priority basis. (author)
Analyses of non-fatal accidents in an opencast mine by logistic regression model - a case study.
Onder, Seyhan; Mutlu, Mert
2017-09-01
Accidents cause major damage for both workers and enterprises in the mining industry. To reduce the number of occupational accidents, these incidents should be properly registered and carefully analysed. This study examines the Aegean Lignite Enterprise (ELI) of Turkish Coal Enterprises (TKI) in Soma between 2006 and 2011, where opencast coal mine occupational accident records were used for statistical analyses. A total of 231 occupational accidents were analysed for this study. The accident records were categorized into seven groups: area, reason, occupation, part of body, age, shift hour and lost days. The SPSS package program was used in this study for logistic regression analyses, which predicted the probability of accidents resulting in greater or less than 3 lost workdays for non-fatal injuries. Social facilities (area of surface installations), workshops and opencast mining areas are the areas with the highest probability of accidents with greater than 3 lost workdays for non-fatal injuries, while the reasons with the highest probability for these types of accidents are transporting and manual handling. Additionally, the model was tested on reported accidents that occurred in 2012 at the ELI in Soma and correctly estimated the probability of accidents with lost workdays in 70% of cases.
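A logistic model of this kind (dichotomized lost workdays regressed on accident characteristics) can be sketched with a small Newton-Raphson fit. The data below are simulated, and coding "manual handling" as a single binary predictor is our illustrative assumption, not the study's actual variable coding:

```python
import numpy as np

def logistic_fit(X, y, n_iter=50):
    """Logistic regression by Newton-Raphson (IRLS); returns coefficients."""
    Xd = np.column_stack([np.ones(len(X)), X])
    beta = np.zeros(Xd.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-Xd @ beta))
        W = p * (1 - p)                       # IRLS weights
        grad = Xd.T @ (y - p)
        H = Xd.T @ (Xd * W[:, None])          # observed information
        beta = beta + np.linalg.solve(H, grad)
    return beta

# Hypothetical coding: predictor = manual handling involved (0/1);
# outcome = accident caused more than 3 lost workdays.
rng = np.random.default_rng(4)
manual = rng.integers(0, 2, size=500)
logit = -1.0 + 1.5 * manual                   # true log odds ratio 1.5
y = rng.random(500) < 1 / (1 + np.exp(-logit))

beta = logistic_fit(manual[:, None], y.astype(float))
odds_ratio = np.exp(beta[1])
```

Exponentiating the fitted coefficient gives the odds ratio for severe (more than 3 lost workdays) accidents associated with the predictor, which is the quantity such accident studies report.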
Kromhout, D.
2009-01-01
Within-person variability in measured values of multiple risk factors can bias their associations with disease. The multivariate regression calibration (RC) approach can correct for such measurement error and has been applied to studies in which true values or independent repeat measurements of the
Li, Spencer D.
2011-01-01
Mediation analysis in child and adolescent development research is possible using large secondary data sets. This article provides an overview of two statistical methods commonly used to test mediated effects in secondary analysis: multiple regression and structural equation modeling (SEM). Two empirical studies are presented to illustrate the…
Wu, Dane W.
2002-01-01
The year 2000 US presidential election between Al Gore and George Bush has been the most intriguing and controversial one in American history. The state of Florida was the trigger for the controversy, mainly, due to the use of the misleading "butterfly ballot". Using prediction (or confidence) intervals for least squares regression lines…
Check-all-that-apply data analysed by Partial Least Squares regression
DEFF Research Database (Denmark)
Rinnan, Åsmund; Giacalone, Davide; Frøst, Michael Bom
2015-01-01
are analysed by multivariate techniques. CATA data can be analysed both by setting the CATA as the X and the Y. The former is the PLS-Discriminant Analysis (PLS-DA) version, while the latter is the ANOVA-PLS (A-PLS) version. We investigated the difference between these two approaches, concluding...
DEFF Research Database (Denmark)
Scott, Neil W; Fayers, Peter M; Aaronson, Neil K
2010-01-01
Differential item functioning (DIF) methods can be used to determine whether different subgroups respond differently to particular items within a health-related quality of life (HRQoL) subscale, after allowing for overall subgroup differences in that scale. This article reviews issues that arise when testing for DIF in HRQoL instruments. We focus on logistic regression methods, which are often used because of their efficiency, simplicity and ease of application.
DEFF Research Database (Denmark)
Tybjærg-Hansen, Anne
2009-01-01
Within-person variability in measured values of multiple risk factors can bias their associations with disease. The multivariate regression calibration (RC) approach can correct for such measurement error and has been applied to studies in which true values or independent repeat measurements of the risk factors are observed on a subsample. We extend the multivariate RC techniques to a meta-analysis framework where multiple studies provide independent repeat measurements and information on disease outcome. We consider the cases where some or all studies have repeat measurements, and compare study-specific, averaged and empirical Bayes estimates of RC parameters. Additionally, we allow for binary covariates (e.g. smoking status) and for uncertainty and time trends in the measurement error corrections. Our methods are illustrated using a subset of individual participant data from prospective long-term studies...
Huang, Banglian; Yang, Yiming; Luo, Tingting; Wu, S.; Du, Xuezhu; Cai, Detian; van Loo, E.N.; Huang, Bangquan
2013-01-01
In the present study, correlation, regression and path analyses were carried out to determine correlations among the agronomic traits and their contributions to seed yield per plant in Crambe abyssinica. Partial correlation analysis indicated that plant height (X1) was significantly correlated with branching height and the number of first branches (P <0.01); branching height (X2) was significantly correlated with pod number of primary inflorescence (P <0.01) and number of secondary branch...
Nuclear power plants: Results of recent safety analyses
International Nuclear Information System (INIS)
Steinmetz, E.
1987-01-01
The contributions deal with the problems posed by low radiation doses, with the information currently available from analyses of the Chernobyl reactor accident, and with risk assessments in connection with nuclear power plant accidents. Other points of interest include the latest results on fission product release from the reactor core or reactor building, advanced atmospheric dispersion models for incident and accident analyses, reliability studies on safety systems, and assessment of fire hazard in nuclear installations. The various contributions are found as separate entries in the database. (DG)
Valle, Denis; Lima, Joanna M Tucker; Millar, Justin; Amratia, Punam; Haque, Ubydul
2015-11-04
Logistic regression is a statistical model widely used in cross-sectional and cohort studies to identify and quantify the effects of potential disease risk factors. However, the impact of imperfect tests on adjusted odds ratios (and thus on the identification of risk factors) is under-appreciated. The purpose of this article is to draw attention to the problem associated with modelling imperfect diagnostic tests, and propose simple Bayesian models to adequately address this issue. A systematic literature review was conducted to determine the proportion of malaria studies that appropriately accounted for false-negatives/false-positives in a logistic regression setting. Inference from the standard logistic regression was also compared with that from three proposed Bayesian models using simulations and malaria data from the western Brazilian Amazon. A systematic literature review suggests that malaria epidemiologists are largely unaware of the problem of using logistic regression to model imperfect diagnostic test results. Simulation results reveal that statistical inference can be substantially improved when using the proposed Bayesian models versus the standard logistic regression. Finally, analysis of original malaria data with one of the proposed Bayesian models reveals that microscopy sensitivity is strongly influenced by how long people have lived in the study region, and an important risk factor (i.e., participation in forest extractivism) is identified that would have been missed by standard logistic regression. Given the numerous diagnostic methods employed by malaria researchers and the ubiquitous use of logistic regression to model the results of these diagnostic tests, this paper provides critical guidelines to improve data analysis practice in the presence of misclassification error. Easy-to-use code that can be readily adapted to WinBUGS is provided, enabling straightforward implementation of the proposed Bayesian models.
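The core issue - that apparent prevalence mixes true positives with false positives - has a simple deterministic counterpart in the Rogan-Gladen correction. This is not the authors' Bayesian model, but it makes explicit the adjustment their models perform probabilistically:

```python
import numpy as np

def rogan_gladen(apparent_prev, sensitivity, specificity):
    """Correct an apparent prevalence for test error (Rogan-Gladen).

    Inverts P(test+) = Se*p + (1 - Sp)*(1 - p) for the true prevalence p,
    clipping to [0, 1].
    """
    p = (apparent_prev + specificity - 1) / (sensitivity + specificity - 1)
    return float(np.clip(p, 0.0, 1.0))

# A test with Se = 0.90, Sp = 0.95 applied where true prevalence is 0.20
# reports positives at rate 0.9*0.2 + 0.05*0.8 = 0.22.
true_p = rogan_gladen(apparent_prev=0.22, sensitivity=0.90, specificity=0.95)
```

A standard logistic regression fit to the raw test results models the 0.22, not the 0.20; the Bayesian models the authors propose embed the same sensitivity/specificity relationship inside the regression likelihood rather than applying it after the fact.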
Directory of Open Access Journals (Sweden)
Kevin D. Cashman
2017-05-01
Dietary Reference Values (DRVs) for vitamin D have a key role in the prevention of vitamin D deficiency. However, despite adopting similar risk assessment protocols, estimates from authoritative agencies over the last 6 years have been diverse. This may have arisen from diverse approaches to data analysis. Modelling strategies for pooling of individual subject data from cognate vitamin D randomized controlled trials (RCTs) are likely to provide the most appropriate DRV estimates. Thus, the objective of the present work was to undertake the first-ever individual participant data (IPD)-level meta-regression, which is increasingly recognized as best practice, from seven winter-based RCTs (with 882 participants ranging in age from 4 to 90 years) of the vitamin D intake–serum 25-hydroxyvitamin D (25(OH)D) dose-response. Our IPD-derived estimates of vitamin D intakes required to maintain 97.5% of 25(OH)D concentrations >25, 30, and 50 nmol/L across the population are 10, 13, and 26 µg/day, respectively. In contrast, standard meta-regression analyses with aggregate data (as used by several agencies in recent years) from the same RCTs estimated that a vitamin D intake requirement of 14 µg/day would maintain 97.5% of 25(OH)D >50 nmol/L. These first IPD-derived estimates offer improved dietary recommendations for vitamin D because the underpinning modeling captures the between-person variability in response of serum 25(OH)D to vitamin D intake.
Misyura, Maksym; Sukhai, Mahadeo A; Kulasignam, Vathany; Zhang, Tong; Kamel-Reid, Suzanne; Stockley, Tracy L
2018-02-01
A standard approach in test evaluation is to compare results of the assay in validation to results from previously validated methods. For quantitative molecular diagnostic assays, comparison of test values is often performed using simple linear regression and the coefficient of determination (R²), using R² as the primary metric of assay agreement. However, the use of R² alone does not adequately quantify the constant or proportional errors required for optimal test evaluation. More extensive statistical approaches, such as Bland-Altman and expanded interpretation of linear regression methods, can be used to more thoroughly compare data from quantitative molecular assays. We present the application of Bland-Altman and linear regression statistical methods to evaluate quantitative outputs from next-generation sequencing (NGS) assays. NGS-derived data sets from assay validation experiments were used to demonstrate the utility of the statistical methods. Both Bland-Altman and linear regression were able to detect the presence and magnitude of constant and proportional error in quantitative values of NGS data. Deming linear regression was used in the context of assay comparison studies, while simple linear regression was used to analyse serial dilution data. The Bland-Altman statistical approach was also adapted to quantify assay accuracy, including constant and proportional errors, and precision where theoretical and empirical values were known. The complementary application of the statistical methods described in this manuscript enables more extensive evaluation of performance characteristics of quantitative molecular assays, prior to implementation in the clinical molecular laboratory. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
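The two complementary checks the abstract describes, Bland-Altman limits of agreement and Deming regression, can be sketched as follows (a simplified NumPy illustration, not the validated laboratory implementation; `lam` is the assumed ratio of measurement-error variances):

```python
import numpy as np

def bland_altman(x, y):
    """Bland-Altman bias and 95% limits of agreement for paired
    measurements x (reference method) and y (test method)."""
    diff = np.asarray(y, float) - np.asarray(x, float)
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

def deming(x, y, lam=1.0):
    """Deming regression slope and intercept; lam is the ratio of the
    two methods' error variances (1.0 gives orthogonal regression)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    sxx = np.var(x, ddof=1)
    syy = np.var(y, ddof=1)
    sxy = np.cov(x, y, ddof=1)[0, 1]
    slope = (syy - lam * sxx
             + np.sqrt((syy - lam * sxx) ** 2 + 4 * lam * sxy ** 2)) / (2 * sxy)
    return slope, y.mean() - slope * x.mean()

# A pure constant error shows up as Bland-Altman bias with tight limits;
# a pure proportional error shows up as a Deming slope away from 1.
bias, lo, hi = bland_altman([1.0, 2.0, 3.0], [2.0, 3.0, 4.0])
slope, intercept = deming([1.0, 2.0, 3.0, 4.0], [2.0, 4.0, 6.0, 8.0])
```

Unlike simple linear regression, Deming regression allows error in both methods, which is why it is preferred for method-comparison studies.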
A Fuzzy Logic Based Method for Analysing Test Results
Directory of Open Access Journals (Sweden)
Le Xuan Vinh
2017-11-01
Full Text Available Network operators must perform many tasks to ensure smooth operation of the network, such as planning, monitoring, etc. Among those tasks, regular testing of network performance, network errors and troubleshooting is very important. Meaningful test results will allow the operators to evaluate network performance, identify any shortcomings and better plan for network upgrades. Due to the diverse and mainly unquantifiable nature of network testing results, there is a need to develop a method for systematically and rigorously analysing these results. In this paper, we present STAM (System Test-result Analysis Method), which employs a bottom-up hierarchical processing approach using fuzzy logic. STAM is capable of combining all test results into a quantitative description of the network performance in terms of network stability, the significance of various network errors, and the performance of each functional block within the network. The validity of this method has been successfully demonstrated in assisting the testing of a VoIP system at the Research Institute of Post and Telecoms in Vietnam. The paper is organized as follows. The first section gives an overview of fuzzy logic theory, the concepts of which will be used in the development of STAM. The next section describes STAM. The last section, demonstrating STAM's capability, presents a success story in which STAM was successfully applied.
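A bottom-up fuzzy aggregation of test scores, in the spirit of STAM, might look like the toy sketch below. The triangular membership grades ("poor"/"fair"/"good") and the averaging step are illustrative assumptions, not the published method:

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with feet at a and c, peak at b."""
    return np.maximum(0.0, np.minimum((x - a) / (b - a), (c - x) / (c - b)))

def fuzzify(score):
    """Map a raw test score in [0, 100] to memberships in three
    linguistic grades (membership shapes are assumed for illustration)."""
    return {"poor": tri(score, -1, 0, 50),
            "fair": tri(score, 0, 50, 100),
            "good": tri(score, 50, 100, 101)}

def aggregate(scores):
    """Bottom-up step: average the memberships of several low-level
    test results into one grade for the block above them."""
    grades = [fuzzify(s) for s in scores]
    return {k: float(np.mean([g[k] for g in grades])) for k in grades[0]}

# Three test results for one functional block, combined into one grade.
block = aggregate([90.0, 80.0, 70.0])
```

Repeating the aggregation level by level yields a single quantitative description of overall network performance from many heterogeneous test results.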
Cashman, Kevin D.; Ritz, Christian; Kiely, Mairead
2017-01-01
Dietary Reference Values (DRVs) for vitamin D have a key role in the prevention of vitamin D deficiency. However, despite adopting similar risk assessment protocols, estimates from authoritative agencies over the last 6 years have been diverse. This may have arisen from diverse approaches to data analysis. Modelling strategies for pooling of individual subject data from cognate vitamin D randomized controlled trials (RCTs) are likely to provide the most appropriate DRV estimates. Thus, the objective of the present work was to undertake the first-ever individual participant data (IPD)-level meta-regression, which is increasingly recognized as best practice, from seven winter-based RCTs (with 882 participants ranging in age from 4 to 90 years) of the vitamin D intake–serum 25-hydroxyvitamin D (25(OH)D) dose-response. Our IPD-derived estimates of vitamin D intakes required to maintain 97.5% of 25(OH)D concentrations >25, 30, and 50 nmol/L across the population are 10, 13, and 26 µg/day, respectively. In contrast, standard meta-regression analyses with aggregate data (as used by several agencies in recent years) from the same RCTs estimated that a vitamin D intake requirement of 14 µg/day would maintain 97.5% of 25(OH)D >50 nmol/L. These first IPD-derived estimates offer improved dietary recommendations for vitamin D because the underpinning modeling captures the between-person variability in response of serum 25(OH)D to vitamin D intake. PMID:28481259
Dyer, Betsey D.; Kahn, Michael J.; LeBlanc, Mark D.
2008-01-01
Classification and regression tree (CART) analysis was applied to genome-wide tetranucleotide frequencies (genomic signatures) of 195 archaea and bacteria. Although genomic signatures have typically been used to classify evolutionary divergence, in this study, convergent evolution was the focus. Temperature optima for most of the organisms examined could be distinguished by CART analyses of tetranucleotide frequencies. This suggests that pervasive (nonlinear) qualities of genomes may reflect certain environmental conditions (such as temperature) in which those genomes evolved. The predominant use of GAGA and AGGA as the discriminating tetramers in CART models suggests that purine-loading and codon biases of thermophiles may explain some of the results. PMID:19054742
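The core CART step, choosing the single feature/threshold split that most reduces Gini impurity, can be illustrated on a toy "tetramer frequency" matrix (a from-scratch sketch, not the software used in the study; the data are made up):

```python
import numpy as np

def gini(labels):
    """Gini impurity of a label array."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

def best_split(X, y):
    """Exhaustive search for the (feature, threshold) split that
    minimises weighted Gini impurity: the basic CART step."""
    n, d = X.shape
    best = (None, None, gini(y))
    for j in range(d):
        for t in np.unique(X[:, j]):
            left, right = y[X[:, j] <= t], y[X[:, j] > t]
            if len(left) == 0 or len(right) == 0:
                continue
            score = (len(left) * gini(left) + len(right) * gini(right)) / n
            if score < best[2]:
                best = (j, t, score)
    return best

# Toy frequency matrix: feature 1 perfectly separates the two classes
# (e.g. thermophile vs mesophile in the study's setting).
X = np.array([[0.1, 0.9], [0.2, 0.8], [0.1, 0.2], [0.3, 0.1]])
y = np.array([1, 1, 0, 0])
feature, threshold, impurity = best_split(X, y)
```

A full CART recurses on the two resulting partitions; the discriminating tetramers reported in the study correspond to the features chosen at the top splits.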
Directory of Open Access Journals (Sweden)
Luise A Seeker
Full Text Available Telomeres cap the ends of linear chromosomes and shorten with age in many organisms. In humans short telomeres have been linked to morbidity and mortality. With the accumulation of longitudinal datasets the focus shifts from investigating telomere length (TL) to exploring TL change within individuals over time. Some studies indicate that the speed of telomere attrition is predictive of future disease. The objectives of the present study were to (1) characterize the change in bovine relative leukocyte TL (RLTL) across the lifetime in Holstein Friesian dairy cattle, (2) estimate genetic parameters of RLTL over time and (3) investigate the association of differences in individual RLTL profiles with productive lifespan. RLTL measurements were analysed using Legendre polynomials in a random regression model to describe TL profiles and genetic variance over age. The analyses were based on 1,328 repeated RLTL measurements of 308 female Holstein Friesian dairy cattle. A quadratic Legendre polynomial was fitted to the fixed effect of age in months and to the random effect of the animal identity. Changes in RLTL, heritability and within-trait genetic correlation along the age trajectory were calculated and illustrated. At a population level, the relationship between RLTL and age was described by a positive quadratic function. Individuals varied significantly regarding the direction and amount of RLTL change over life. The heritability of RLTL ranged from 0.36 to 0.47 (SE = 0.05-0.08) and remained statistically unchanged over time. The genetic correlation of RLTL at birth with measurements later in life decreased with the time interval between samplings from near unity to 0.69, indicating that TL later in life might be regulated by different genes than TL early in life. Even though animals differed in their RLTL profiles significantly, those differences were not correlated with productive lifespan (p = 0.954).
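The fixed-effect part of such a random regression model, a quadratic Legendre polynomial over a rescaled age axis, can be sketched with NumPy (toy data; the actual analysis also fits random animal effects and estimates variance components):

```python
import numpy as np
from numpy.polynomial import legendre

def legendre_basis(age, order=2):
    """Legendre basis columns P0..P_order over ages rescaled to [-1, 1],
    the standard parameterisation for random-regression trajectories."""
    a = np.asarray(age, float)
    x = 2.0 * (a - a.min()) / (a.max() - a.min()) - 1.0
    return np.column_stack([legendre.legval(x, np.eye(order + 1)[k])
                            for k in range(order + 1)])

# Fit a population-level quadratic trajectory by least squares.
age = np.arange(0, 61, 6)              # age in months, 11 records
tl = 1.0 + 0.002 * (age - 30) ** 2     # toy telomere-length curve
Z = legendre_basis(age)
coef, *_ = np.linalg.lstsq(Z, tl, rcond=None)
fitted = Z @ coef
```

In the full mixed model, each animal additionally gets its own random coefficients on the same basis, which is what lets heritability and genetic correlations vary along the age trajectory.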
Analyses results of the EHF FW Panel with welded fingers
International Nuclear Information System (INIS)
Sviridenko, M.N.; Leshukov, A.Yu.; Razmerov, A.V.; Tomilov, S.N.; Danilov, I.V.; Strebkov, Yu.S.; Mazul, I.V.; Labusov, A.; Gervash, A.A.; Belov, A.V.; Semichev, D.
2014-01-01
Highlights: • The design of the FW panel with welded fingers has been developed. • The FW panel with welded fingers has been analyzed. • The pressure drop in the FW panel coolant path does not exceed the allowable value. • The mass flow rate distribution between finger pairs is at an acceptable level. • Temperatures in FW components do not exceed allowable values. - Abstract: According to the Procurement Arrangement (PA), the Russian Federation will procure 40% of the enhanced heat flux first wall (FW) panels. The signing of the PA is scheduled for November 2013. In the framework of PA preparation, the RF specialists performed EHF FW design optimization in order to ensure that the EHF FW panel can operate under ITER conditions. This article contains the design description of EHF FW 14 developed by RF, and the following analyses have been performed: • hydraulic analysis; • transient thermal analysis; • structural analysis. Analysis results show that the new design of the FW panel with two straight welds for finger fixation on the FW beam, developed by RF specialists, can be used as a reference design for the ITER blanket EHF FW panel loaded by a 5 MW/m² peak heat flux
Laszlo, Sarah; Federmeier, Kara D.
2010-01-01
Linking print with meaning tends to be divided into subprocesses, such as recognition of an input's lexical entry and subsequent access of semantics. However, recent results suggest that the set of semantic features activated by an input is broader than implied by a view wherein access serially follows recognition. EEG was collected from participants who viewed items varying in number and frequency of both orthographic neighbors and lexical associates. Regression analysis of single item ERPs replicated past findings, showing that N400 amplitudes are greater for items with more neighbors, and further revealed that N400 amplitudes increase for items with more lexical associates and with higher frequency neighbors or associates. Together, the data suggest that in the N400 time window semantic features of items broadly related to inputs are active, consistent with models in which semantic access takes place in parallel with stimulus recognition. PMID:20624252
Posterior consistency for Bayesian inverse problems through stability and regression results
International Nuclear Information System (INIS)
Vollmer, Sebastian J
2013-01-01
In the Bayesian approach, the a priori knowledge about the input of a mathematical model is described via a probability measure. The joint distribution of the unknown input and the data is then conditioned, using Bayes’ formula, giving rise to the posterior distribution on the unknown input. In this setting we prove posterior consistency for nonlinear inverse problems: a sequence of data is considered, with diminishing fluctuations around a single truth and it is then of interest to show that the resulting sequence of posterior measures arising from this sequence of data concentrates around the truth used to generate the data. Posterior consistency justifies the use of the Bayesian approach very much in the same way as error bounds and convergence results for regularization techniques do. As a guiding example, we consider the inverse problem of reconstructing the diffusion coefficient from noisy observations of the solution to an elliptic PDE in divergence form. This problem is approached by splitting the forward operator into the underlying continuum model and a simpler observation operator based on the output of the model. In general, these splittings allow us to conclude posterior consistency provided a deterministic stability result for the underlying inverse problem and a posterior consistency result for the Bayesian regression problem with the push-forward prior. Moreover, we prove posterior consistency for the Bayesian regression problem based on the regularity, the tail behaviour and the small ball probabilities of the prior. (paper)
Directory of Open Access Journals (Sweden)
Željko V. Račić
2010-12-01
Full Text Available This paper aims to present the specifics of the application of the multiple linear regression model. The economic (financial) crisis is analyzed in terms of gross domestic product as a function of the foreign trade balance (on one hand) and credit cards, i.e. indebtedness of the population on this basis (on the other hand), in the USA (from 1999 to 2008). We used the extended application model which shows how the analyst should run the whole development process of a regression model. This process began with simple statistical features and the application of regression procedures, and ended with residual analysis, intended for the study of compatibility of the data and model settings. This paper also analyzes the values of some standard statistics used in the selection of an appropriate regression model. Testing of the model is carried out with the use of the PASW Statistics 17 program.
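The development process described, fit the model, then inspect residuals and summary statistics, can be sketched in NumPy (a generic illustration with simulated data, not the paper's GDP series):

```python
import numpy as np

def ols(X, y):
    """Ordinary least squares with intercept; returns coefficients,
    residuals and R-squared for the residual-analysis step."""
    A = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ beta
    r2 = 1.0 - resid @ resid / np.sum((y - y.mean()) ** 2)
    return beta, resid, r2

# Simulated response driven by two predictors plus a little noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 2))
y = 3.0 + 1.5 * X[:, 0] - 0.5 * X[:, 1] + 0.1 * rng.normal(size=40)
beta, resid, r2 = ols(X, y)
```

The residual step matters because a high R² alone says nothing about whether the residuals are patternless; plotting `resid` against fitted values is the usual compatibility check the paper refers to.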
Directory of Open Access Journals (Sweden)
Svetlana O. Musienko
2017-03-01
Full Text Available Objective: to develop an economic-mathematical model of the dependence of revenue on other balance sheet items, taking into account the sectoral affiliation of the companies. Methods: using comparative analysis, the article studies the existing approaches to the construction of company management models. Applying regression analysis and the least squares method, which is widely used for financial management of enterprises in Russia and abroad, the author builds a model of the dependence of revenue on other balance sheet items, taking into account the sectoral affiliation of the companies, which can be used in the financial analysis and prediction of small enterprises' performance. Results: the article states the need to identify factors affecting the financial management efficiency. The author analyzed scientific research and revealed the lack of comprehensive studies on the methodology for assessing small enterprises' management, while the methods used for large companies are not always suitable for the task. The systematized approaches of various authors to the formation of regression models describe the influence of certain factors on the company activity. It is revealed that the resulting indicators in the studies were revenue, profit, or the company's relative profitability. The main drawback of most models is the mathematical, not economic, approach to the definition of the dependent and independent variables. Basing on the analysis, it was determined that the most correct is the model of dependence between revenues and total assets of the company using the decimal logarithm. The model was built using data on the activities of the 507 small businesses operating in three spheres of economic activity. Using the presented model, it was proved that there is direct dependence between the sales proceeds and the main items of the asset balance, as well as differences in the degree of this effect depending on the economic activity of small
Directory of Open Access Journals (Sweden)
Abdelfattah M. Selim
2018-03-01
Full Text Available Aim: The present cross-sectional study was conducted to determine the seroprevalence and potential risk factors associated with Bovine viral diarrhea virus (BVDV) disease in cattle and buffaloes in Egypt, to model the potential risk factors associated with the disease using logistic regression (LR) models, and to fit the best predictive model for the current data. Materials and Methods: A total of 740 blood samples were collected between November 2012 and March 2013 from animals aged between 6 months and 3 years. The potential risk factors studied were species, age, sex, and herd location. All serum samples were examined with an indirect ELISA test for antibody detection. Data were analyzed with different statistical approaches such as the Chi-square test, odds ratios (OR), univariable, and multivariable LR models. Results: Results revealed a non-significant association between being seropositive with BVDV and all risk factors, except for species of animal. Seroprevalence percentages were 40% and 23% for cattle and buffaloes, respectively. ORs for all categories were close to one, with the highest OR for cattle relative to buffaloes, which was 2.237. Likelihood ratio tests showed a significant drop of the -2LL from univariable LR to multivariable LR models. Conclusion: There was evidence of high seroprevalence of BVDV among cattle as compared with buffaloes, with the possibility of infection in different age groups of animals. In addition, the multivariable LR model was proved to provide more information for association and prediction purposes relative to univariable LR models and Chi-square tests if we have more than one predictor.
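The univariable association measure used here, the odds ratio with a Woolf-type 95% confidence interval, is a one-liner from a 2×2 table. The counts below are illustrative, chosen to mimic the reported 40% vs 23% seroprevalence, not the study's actual table:

```python
import numpy as np

def odds_ratio(exposed_pos, exposed_neg, unexposed_pos, unexposed_neg):
    """Odds ratio and Woolf 95% CI from a 2x2 table of counts."""
    a, b, c, d = map(float, (exposed_pos, exposed_neg,
                             unexposed_pos, unexposed_neg))
    or_ = (a * d) / (b * c)
    se = np.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(OR)
    lo = np.exp(np.log(or_) - 1.96 * se)
    hi = np.exp(np.log(or_) + 1.96 * se)
    return or_, lo, hi

# Hypothetical counts: 120/300 seropositive cattle, 69/300 buffaloes.
or_, lo, hi = odds_ratio(120, 180, 69, 231)
```

A multivariable logistic regression generalises this: each exponentiated coefficient is an odds ratio adjusted for the other predictors, which is why the likelihood ratio tests favoured the multivariable model.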
Cooperative biogas plants. Economic results and analyses. Status report 1998
International Nuclear Information System (INIS)
Hjort-Gregersen, K.
1998-11-01
The years 1995-1998 have been characterised by stabilisation of the operation and economy of the Danish co-operative biogas plants. Most of the plants have obtained increasingly better economic results, although the increase has been less significant than during earlier periods. There are several reasons for the increase. Most of the plants have been able to increase their sales income because of larger amounts of biomass available, resulting in an increased biogas production. Furthermore, it has been possible to maintain the income level for biomass receipt. Several plants have established gas collection in storage tanks, which has resulted in an increased gas yield. The operational stability, related to both technique and processes, has improved. The operational costs have been stabilised and are under control at most of the plants. The improved economic results mean that most of the plants now have a satisfactory operation and economy. However, it must be stressed that some of the oldest plants have not been able to settle the investment debt under normal conditions. Also some, even rather new, plants are still in a difficult economic situation. Most of the plants established in the 1990s have had a good start both operationally and economically. Thus, the economic risk of establishing a plant has been reduced compared to earlier years. Generally, the prerequisites for establishing a biogas plant are favourable economic conditions and quality assurance of the project. (LN)
Hutton, Eileen K; Simioni, Julia C; Thabane, Lehana
2017-08-01
Among women with a fetus with a non-cephalic presentation, external cephalic version (ECV) has been shown to reduce the rate of breech presentation at birth and cesarean birth. Compared with ECV at term, beginning ECV prior to 37 weeks' gestation decreases the number of infants in a non-cephalic presentation at birth. The purpose of this secondary analysis was to investigate factors associated with a successful ECV procedure and to present this in a clinically useful format. Data were collected as part of the Early ECV Pilot and Early ECV2 Trials, which randomized 1776 women with a fetus in breech presentation to either early ECV (34-36 weeks' gestation) or delayed ECV (at or after 37 weeks). The outcome of interest was successful ECV, defined as the fetus being in a cephalic presentation immediately following the procedure, as well as at the time of birth. The importance of several factors in predicting successful ECV was investigated using two statistical methods: logistic regression and classification and regression tree (CART) analyses. Among nulliparas, non-engagement of the presenting part and an easily palpable fetal head were independently associated with success. Among multiparas, non-engagement of the presenting part, gestation less than 37 weeks and an easily palpable fetal head were found to be independent predictors of success. These findings were consistent with results of the CART analyses. Regardless of parity, descent of the presenting part was the most discriminating factor in predicting successful ECV and cephalic presentation at birth. © 2017 Nordic Federation of Societies of Obstetrics and Gynecology.
Energy Technology Data Exchange (ETDEWEB)
Reddy, T.A. (Energy Systems Lab., Texas A and M Univ., College Station, TX (United States)); Claridge, D.E. (Energy Systems Lab., Texas A and M Univ., College Station, TX (United States))
1994-01-01
Multiple regression modeling of monitored building energy use data is often faulted as a means of predicting energy use on the grounds that multicollinearity between the regressor variables can lead both to improper interpretation of the relative importance of the various physical regressor parameters and to a model with unstable regressor coefficients. Principal component analysis (PCA) has the potential to overcome such drawbacks. While a few case studies have already attempted to apply this technique to building energy data, the objectives of this study were to make a broader evaluation of PCA and multiple regression analysis (MRA) and to establish guidelines under which one approach is preferable to the other. Four geographic locations in the US with different climatic conditions were selected and synthetic data sequences representative of daily energy use in large institutional buildings were generated in each location using a linear model with outdoor temperature, outdoor specific humidity and solar radiation as the three regression variables. MRA and PCA approaches were then applied to these data sets and their relative performances were compared. Conditions under which PCA seems to perform better than MRA were identified and preliminary recommendations on the use of either modeling approach were formulated. (orig.)
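PCA sidesteps collinearity by regressing on orthogonal component scores rather than the raw regressors. A minimal principal-component-regression sketch with two strongly collinear "weather" predictors (an illustration of the general technique, not the study's synthetic data generator):

```python
import numpy as np

def pcr(X, y, k):
    """Principal-component regression: standardize the predictors,
    project onto the top-k principal components, run OLS on the scores."""
    Xs = (X - X.mean(0)) / X.std(0)
    # PCA via SVD of the standardized predictor matrix
    U, s, Vt = np.linalg.svd(Xs, full_matrices=False)
    scores = Xs @ Vt[:k].T
    A = np.column_stack([np.ones(len(y)), scores])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    return beta, scores

# Two nearly identical regressors: one component carries the signal.
rng = np.random.default_rng(1)
t = rng.normal(size=100)
X = np.column_stack([t, t + 0.05 * rng.normal(size=100)])
y = 2.0 * t + 0.1 * rng.normal(size=100)
beta, scores = pcr(X, y, k=1)
pred = np.column_stack([np.ones(100), scores]) @ beta
```

With raw OLS the two collinear coefficients would be individually unstable; the single component score is stable, at the cost of losing the physical interpretation of each regressor, which is exactly the trade-off the study evaluates.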
Results of Analyses of the Next Generation Solvent for Parsons
International Nuclear Information System (INIS)
Peters, T.; Washington, A.; Fink, S.
2012-01-01
Savannah River National Laboratory (SRNL) prepared a nominal 150 gallon batch of Next Generation Solvent (NGS) for Parsons. This material was then analyzed and tested for cesium mass transfer efficiency. The bulk of the results indicate that the solvent is qualified as acceptable for use in the upcoming pilot-scale testing at Parsons Technology Center. This report describes the analysis and testing of a batch of Next Generation Solvent (NGS) prepared in support of pilot-scale testing in the Parsons Technology Center. A total of ∼150 gallons of NGS solvent was prepared in late November of 2011. Details for the work are contained in a controlled laboratory notebook. Analysis of the Parsons NGS solvent indicates that the material is acceptable for use. SRNL is continuing to improve the analytical method for the guanidine.
Shabani, Farzin; Kumar, Lalit; Solhjouy-fard, Samaneh
2017-08-01
The aim of this study was to comparatively investigate and evaluate the capabilities of correlative and mechanistic modeling processes, applied to the projection of future distributions of date palm in novel environments, and to establish a method of minimizing uncertainty in the projections of differing techniques. Geographically, the study covers the Middle Eastern countries. We compared the mechanistic model CLIMEX (CL) with the correlative models MaxEnt (MX), Boosted Regression Trees (BRT), and Random Forests (RF) to project current and future distributions of date palm (Phoenix dactylifera L.). The Global Climate Model (GCM), the CSIRO-Mk3.0 (CS) using the A2 emissions scenario, was selected for making projections. Both indigenous and alien distribution data of the species were utilized in the modeling process. The common areas predicted by MX, BRT, RF, and CL from the CS GCM were extracted and compared to ascertain the projection uncertainty levels of each individual technique. The common areas identified by all four modeling techniques were used to produce a map indicating suitable and unsuitable areas for date palm cultivation for Middle Eastern countries, for the present and the year 2100. The four different modeling approaches predict fairly different distributions. Projections from CL were more conservative than from MX. The BRT and RF were the most conservative methods in terms of projections for the current time. The combination of the final CL and MX projections for the present and 2100 provides higher certainty concerning those areas that will become highly suitable for future date palm cultivation. According to the four models, cold, hot, and wet stress, with differences on a regional basis, appear to be the major restrictions on future date palm distribution. The results demonstrate variances in the projections resulting from the different techniques. The assessment and interpretation of model projections requires reservations
Greene, LaVana; Elzey, Brianda; Franklin, Mariah; Fakayode, Sayo O
2017-03-05
The negative health impact of polycyclic aromatic hydrocarbons (PAHs) and differences in pharmacological activity of enantiomers of chiral molecules in humans highlight the need for analysis of PAHs and their chiral analogue molecules in humans. Herein, the first use of cyclodextrin guest-host inclusion complexation, fluorescence spectrophotometry, and a chemometric approach to PAH (anthracene) and chiral-PAH analogue derivative (1-(9-anthryl)-2,2,2-trifluoroethanol (TFE)) analyses is reported. The binding constants (Kb), stoichiometry (n), and thermodynamic properties (Gibbs free energy (ΔG), enthalpy (ΔH), and entropy (ΔS)) of anthracene and enantiomers of TFE-methyl-β-cyclodextrin (Me-β-CD) guest-host complexes were also determined. Chemometric partial-least-squares (PLS) regression analysis of emission spectra data of Me-β-CD guest-host inclusion complexes was used for the determination of anthracene and TFE enantiomer concentrations in Me-β-CD guest-host inclusion complex samples. The values of the calculated Kb and negative ΔG suggest the thermodynamic favorability of anthracene-Me-β-CD and enantiomeric TFE-Me-β-CD inclusion complexation reactions. However, anthracene-Me-β-CD and enantiomeric TFE-Me-β-CD inclusion complexations showed notable differences in binding affinity behaviors and thermodynamic properties. The PLS regression analysis resulted in squared correlation coefficients of 0.997530 or better and a low LOD of 3.81×10⁻⁷ M for anthracene and 3.48×10⁻⁸ M for TFE enantiomers at physiological conditions. Most importantly, PLS regression accurately determined the anthracene and TFE enantiomer concentrations with an average low error of 2.31% for anthracene, 4.44% for R-TFE and 3.60% for S-TFE. The results of the study are highly significant because of its high sensitivity and accuracy for analysis of PAH and chiral PAH analogue derivatives without the need of an expensive chiral column, enantiomeric resolution, or use of a polarized
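The PLS calibration step can be sketched with a bare-bones single-response (PLS1) algorithm (a from-scratch NumPy illustration on toy data, not the chemometric software used in the study):

```python
import numpy as np

def pls1(X, y, ncomp):
    """Minimal PLS1 regression (NIPALS-style): returns coefficients B
    and intercept b0 so that yhat = X @ B + b0."""
    Xm, ym = X.mean(0), y.mean()
    Xk, yc = X - Xm, y - ym
    W, P, q = [], [], []
    for _ in range(ncomp):
        w = Xk.T @ yc                 # weight vector (covariance direction)
        w = w / np.linalg.norm(w)
        t = Xk @ w                    # score vector
        tt = t @ t
        p = Xk.T @ t / tt             # loading vector
        q.append((yc @ t) / tt)
        W.append(w)
        P.append(p)
        Xk = Xk - np.outer(t, p)      # deflate X
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    B = W @ np.linalg.solve(P.T @ W, q)
    return B, ym - Xm @ B

# Toy calibration: two 'spectral' channels predicting a concentration.
rng = np.random.default_rng(2)
X = rng.normal(size=(30, 2))
y = X @ np.array([1.0, -2.0])
B, b0 = pls1(X, y, ncomp=2)
yhat = X @ B + b0
```

In practice X would hold many correlated emission intensities per sample, and ncomp would be chosen by cross-validation; PLS handles the collinearity that would break ordinary least squares there.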
Botha, J.; De Ridder, J.H.; Potgieter, J.C.; Steyn, H.S.; Malan, L.
2013-01-01
A recently proposed model for waist circumference cut points (RPWC), driven by increased blood pressure, was demonstrated in an African population. We therefore aimed to validate the RPWC by comparing the RPWC and the Joint Statement Consensus (JSC) models via Logistic Regression (LR) and Neural Networks (NN) analyses. Urban African gender groups (N=171) were stratified according to the JSC and RPWC cut point models. Ultrasound carotid intima media thickness (CIMT), blood pressure (BP) and fa...
Johansen, Mette; Bahrt, Henriette; Altman, Roy D; Bartels, Else M; Juhl, Carsten B; Bliddal, Henning; Lund, Hans; Christensen, Robin
2016-08-01
The aim was to identify factors explaining inconsistent observations concerning the efficacy of intra-articular hyaluronic acid compared to intra-articular sham/control, or non-intervention control, in patients with symptomatic osteoarthritis, based on randomized clinical trials (RCTs). A systematic review and meta-regression analyses of available randomized trials were conducted. The outcome, pain, was assessed according to a pre-specified hierarchy of potentially available outcomes. Hedges's standardized mean difference [SMD (95% CI)] served as effect size. Restricted Maximum Likelihood (REML) mixed-effects models were used to combine study results, and heterogeneity was calculated and interpreted as Tau-squared and I-squared, respectively. Overall, 99 studies (14,804 patients) met the inclusion criteria: of these, only 71 studies (72%), including 85 comparisons (11,216 patients), had adequate data available for inclusion in the primary meta-analysis. Overall, compared with placebo, intra-articular hyaluronic acid reduced pain with an effect size of -0.39 [-0.47 to -0.31; P < 0.001] in favour of hyaluronic acid. Based on available trial data, intra-articular hyaluronic acid showed a better effect than intra-articular saline on pain reduction in osteoarthritis. Publication bias and the risk of selective outcome reporting suggest only a small clinical effect compared to saline. Copyright © 2016 Elsevier Inc. All rights reserved.
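The heterogeneity statistics reported (Tau-squared, I-squared) derive from Cochran's Q. A DerSimonian-Laird sketch of their computation (illustrative; the paper itself used REML mixed-effects models, and the effect sizes below are made up):

```python
import numpy as np

def heterogeneity(effects, variances):
    """Cochran's Q, DerSimonian-Laird tau-squared and I-squared (%)
    for study effect sizes and their within-study variances."""
    w = 1.0 / np.asarray(variances, float)
    theta = np.sum(w * effects) / np.sum(w)   # fixed-effect pooled estimate
    Q = np.sum(w * (effects - theta) ** 2)
    k = len(effects)
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (Q - (k - 1)) / c)
    i2 = max(0.0, (Q - (k - 1)) / Q) * 100.0 if Q > 0 else 0.0
    return Q, tau2, i2

# Identical study estimates imply no between-study heterogeneity.
Q0, tau0, i0 = heterogeneity(np.array([-0.4, -0.4, -0.4]),
                             np.array([0.01, 0.01, 0.01]))
```

High I-squared values, as in meta-analyses of this kind, indicate that most observed variation in effect sizes reflects real between-study differences rather than sampling error, motivating the meta-regression on trial characteristics.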
Samdal, Gro Beate; Eide, Geir Egil; Barth, Tom; Williams, Geoffrey; Meland, Eivind
2017-03-28
This systematic review aims to explain the heterogeneity in results of interventions to promote physical activity and healthy eating for overweight and obese adults, by exploring the differential effects of behaviour change techniques (BCTs) and other intervention characteristics. The inclusion criteria specified RCTs with ≥ 12 weeks' duration, from January 2007 to October 2014, for adults (mean age ≥ 40 years, mean BMI ≥ 30). Primary outcomes were measures of healthy diet or physical activity. Two reviewers rated study quality, coded the BCTs, and collected outcome results at short (≤6 months) and long term (≥12 months). Meta-analyses and meta-regressions were used to estimate effect sizes (ES), heterogeneity indices (I²) and regression coefficients. We included 48 studies containing a total of 82 outcome reports. The 32 long term reports had an overall ES = 0.24 with 95% confidence interval (CI): 0.15 to 0.33 and I² = 59.4%. The 50 short term reports had an ES = 0.37 with 95% CI: 0.26 to 0.48, and I² = 71.3%. The number of BCTs unique to the intervention group, and the BCTs goal setting and self-monitoring of behaviour predicted the effect at short and long term. The total number of BCTs in both intervention arms and using the BCTs goal setting of outcome, feedback on outcome of behaviour, implementing graded tasks, and adding objects to the environment, e.g. using a step counter, significantly predicted the effect at long term. Setting a goal for change; and the presence of reporting bias independently explained 58.8% of inter-study variation at short term. Autonomy supportive and person-centred methods as in Motivational Interviewing, the BCTs goal setting of behaviour, and receiving feedback on the outcome of behaviour, explained all of the between study variations in effects at long term. There are similarities, but also differences in effective BCTs promoting change in healthy eating and physical activity and
Bowden, Jack; Del Greco M, Fabiola; Minelli, Cosetta; Davey Smith, George; Sheehan, Nuala A; Thompson, John R
2016-12-01
MR-Egger regression has recently been proposed as a method for Mendelian randomization (MR) analyses incorporating summary data estimates of causal effect from multiple individual variants, which is robust to invalid instruments. It can be used to test for directional pleiotropy and provides an estimate of the causal effect adjusted for its presence. MR-Egger regression provides a useful additional sensitivity analysis to the standard inverse variance weighted (IVW) approach that assumes all variants are valid instruments. Both methods use weights that consider the single nucleotide polymorphism (SNP)-exposure associations to be known, rather than estimated. We call this the 'NO Measurement Error' (NOME) assumption. Causal effect estimates from the IVW approach exhibit weak instrument bias whenever the genetic variants utilized violate the NOME assumption, which can be reliably measured using the F-statistic. The effect of NOME violation on MR-Egger regression has yet to be studied. An adaptation of the I² statistic from the field of meta-analysis is proposed to quantify the strength of NOME violation for MR-Egger. It lies between 0 and 1, and indicates the expected relative bias (or dilution) of the MR-Egger causal estimate in the two-sample MR context. We call it I²GX. The method of simulation extrapolation is also explored to counteract the dilution. Their joint utility is evaluated using simulated data and applied to a real MR example. In simulated two-sample MR analyses we show that, when a causal effect exists, the MR-Egger estimate of causal effect is biased towards the null when NOME is violated, and the stronger the violation (as indicated by lower values of I²GX), the stronger the dilution. When additionally all genetic variants are valid instruments, the type I error rate of the MR-Egger test for pleiotropy is inflated and the causal effect underestimated. Simulation extrapolation is shown to substantially mitigate these adverse effects. We
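The MR-Egger estimate itself is a weighted regression of variant-outcome associations on variant-exposure associations, with an unconstrained intercept. A minimal sketch on toy summary statistics, with the NOME assumption taken to hold exactly (the variant effects below are made up):

```python
import numpy as np

def mr_egger(bx, by, se_by):
    """MR-Egger: inverse-variance weighted regression of variant-outcome
    effects (by) on variant-exposure effects (bx). The slope estimates
    the causal effect; the intercept tests directional pleiotropy."""
    w = 1.0 / np.asarray(se_by, float) ** 2
    A = np.column_stack([np.ones(len(bx)), bx])
    WA = A * w[:, None]
    coef = np.linalg.solve(A.T @ WA, A.T @ (w * by))
    return coef[0], coef[1]   # intercept, slope

# Four valid instruments with a true causal effect of 0.5 and no
# pleiotropy: the slope recovers 0.5 and the intercept is zero.
bx = np.array([0.1, 0.2, 0.3, 0.4])
by = 0.5 * bx
intercept, slope = mr_egger(bx, by, np.full(4, 0.05))
```

When the bx values are themselves noisy estimates (NOME violated), this slope is diluted toward zero, which is the bias the proposed I²-type statistic quantifies.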
Alexeeff, Stacey E; Schwartz, Joel; Kloog, Itai; Chudnovsky, Alexandra; Koutrakis, Petros; Coull, Brent A
2015-01-01
Many epidemiological studies use predicted air pollution exposures as surrogates for true air pollution levels. These predicted exposures contain exposure measurement error, yet simulation studies have typically found negligible bias in resulting health effect estimates. However, previous studies typically assumed a statistical spatial model for air pollution exposure, which may be oversimplified. We address this shortcoming by assuming a realistic, complex exposure surface derived from fine-scale (1 km × 1 km) remote-sensing satellite data. Using simulation, we evaluate the accuracy of epidemiological health effect estimates in linear and logistic regression when using spatial air pollution predictions from kriging and land use regression models. We examined chronic (long-term) and acute (short-term) exposure to air pollution. Results varied substantially across different scenarios. Exposure models with low out-of-sample R² yielded severe biases in the health effect estimates of some models, ranging from 60% upward bias to 70% downward bias. One land use regression exposure model with out-of-sample R² above 0.9 yielded upward biases up to 13% for acute health effect estimates. Almost all models drastically underestimated the SEs. Land use regression models performed better in chronic effect simulations. These results can help researchers when interpreting health effect estimates in these types of studies.
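The exposure-measurement-error mechanism at issue can be illustrated with the textbook classical-error case; this is a generic sketch, not the paper's satellite-based simulation, and all values are invented:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20000

# Hypothetical setup: true exposure x, predicted exposure w carrying
# classical measurement error, health outcome y (all illustrative).
x = rng.normal(0.0, 1.0, n)
w = x + rng.normal(0.0, 1.0, n)      # exposure-model prediction error
y = 0.5 * x + rng.normal(0.0, 1.0, n)

# Slope from regressing y on the error-prone surrogate w:
slope = np.cov(w, y)[0, 1] / np.var(w)

# Classical error attenuates the estimate by the reliability ratio
# var(x) / (var(x) + var(error)) = 0.5 here, so slope is near 0.25
# rather than the true 0.5.
print(round(float(slope), 3))
```

The paper's point is that with realistic exposure surfaces the bias need not be this simple attenuation; the sketch only shows why error in predicted exposures matters at all.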
An illustration of harmonic regression based on the results of the fast Fourier transformation
Directory of Open Access Journals (Sweden)
Bertfai Imre
2002-01-01
The well-known methodology of Fourier analysis was pushed into the background in the second half of the century, parallel to the development of the time-domain approach to the analysis of mainly economic time series. In the author's view, however, it possesses some hidden analytical advantages that deserve to be reintroduced to the analyst's toolbox. This paper, through several case studies, reports research results for a computer algorithm providing a harmonic model for time series. The starting point of the method is a harmonic analysis (Fourier analysis or Lomb periodogram). The results are optimized in several ways, yielding a model that is easy to handle and able to forecast the underlying data, and that is largely free of the limitations characteristic of those methods. The calculated results are also easy to interpret and to use for further decisions, though the author intends to enhance the procedure in several ways. The method appears very effective and useful in modeling time series consisting of periodic terms; an additional advantage is the easy interpretation of the obtained parameters.
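A harmonic-regression procedure of the kind described, an FFT periodogram to locate dominant frequencies followed by a least-squares refit of sine/cosine terms, can be sketched as follows; the series and its frequencies are invented, and this is a generic sketch rather than the author's algorithm:

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.arange(240)

# Illustrative series: two periodic terms plus noise (values assumed).
y = 3.0 * np.sin(2 * np.pi * t / 12) + 1.5 * np.cos(2 * np.pi * t / 30) \
    + rng.normal(0.0, 0.5, t.size)

# Step 1: FFT periodogram to locate the dominant frequencies.
spec = np.abs(np.fft.rfft(y - y.mean())) ** 2
freqs = np.fft.rfftfreq(t.size)
top = freqs[np.argsort(spec)[-2:]]           # two strongest frequencies

# Step 2: refit a harmonic model y ~ sum_k (a_k cos + b_k sin) at those
# frequencies by ordinary least squares.
X = np.column_stack([np.ones_like(t, dtype=float)] +
                    [f(2 * np.pi * fr * t) for fr in top
                     for f in (np.cos, np.sin)])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
fitted = X @ coef
print(round(float(np.corrcoef(fitted, y)[0, 1]), 2))
```

Forecasting then amounts to evaluating the fitted harmonics at future t, which is where such models are easy to handle.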
Energy Technology Data Exchange (ETDEWEB)
Bargalló, Enric, E-mail: enric.bargallo@esss.se [Fusion Energy Engineering Laboratory (FEEL), Technical University of Catalonia (UPC) Barcelona-Tech, Barcelona (Spain); Arroyo, Jose Manuel [Laboratorio Nacional de Fusión por Confinamiento Magnético – CIEMAT, Madrid (Spain); Abal, Javier; Dies, Javier; De Blas, Alfredo; Tapia, Carlos [Fusion Energy Engineering Laboratory (FEEL), Technical University of Catalonia (UPC) Barcelona-Tech, Barcelona (Spain); Moya, Joaquin; Ibarra, Angel [Laboratorio Nacional de Fusión por Confinamiento Magnético – CIEMAT, Madrid (Spain)
2015-10-15
Highlights: • RAMI methodology used for IFMIF accelerator facility is presented. • Availability analyses and results are shown. • Main accelerator design changes are proposed. • Consequences and conclusions of the RAMI analyses are described. - Abstract: This paper presents a summary of the RAMI (Reliability Availability Maintainability Inspectability) analyses done for the IFMIF (International Fusion Materials Irradiation Facility) Accelerator facility in the Engineering Design Phase. The methodology followed, the analyses performed, the results obtained and the conclusions drawn are described. Moreover, the consequences of the incorporation of the RAMI studies in the IFMIF design are presented and the main outcomes of these analyses are shown.
Pan, Xin; Cao, Chen; Yang, Yingbao; Li, Xiaolong; Shan, Liangliang; Zhu, Xi
2018-04-01
The land surface temperature (LST) derived from thermal infrared satellite images is a meaningful variable in many remote sensing applications. At present, however, the spatial resolution of satellite thermal infrared sensors is too coarse to meet many of these needs. In this study, an LST image was downscaled using a random forest model between LST and multiple predictors in an arid region with an oasis-desert ecotone. The proposed downscaling approach was evaluated using LST derived from the MODIS LST product for Zhangye City in the Heihe Basin. The primary result was that the distribution of downscaled LST matched that of the oasis and desert ecosystems. Sensitivity analysis showed that the factors most sensitive for LST downscaling were the modified normalized difference water index (MNDWI)/normalized multi-band drought index (NMDI), soil adjusted vegetation index (SAVI)/shortwave infrared reflectance (SWIR)/normalized difference vegetation index (NDVI), normalized difference building index (NDBI)/SAVI, and SWIR/NDBI/MNDWI/NDWI for the water, vegetation, building and desert regions, with LST variation (at most) of 0.20/-0.22 K, 0.92/0.62/0.46 K, 0.28/-0.29 K and 3.87/-1.53/-0.64/-0.25 K under +/-0.02 predictor perturbations, respectively.
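The downscaling logic (fit a predictor-to-LST relation at coarse resolution, apply it at fine resolution, then redistribute the coarse residuals) can be sketched in one dimension. Ordinary least squares stands in for the paper's random forest here, and all values are invented:

```python
import numpy as np

rng = np.random.default_rng(3)

# Illustrative 1-D "scene": a fine-scale NDVI-like predictor and a true
# fine-scale LST linked to it (all values assumed).
ndvi_fine = rng.uniform(0.0, 0.8, 1024)
lst_fine_true = 310.0 - 12.0 * ndvi_fine + rng.normal(0.0, 0.3, 1024)

# Coarse pixels = block means of 16 fine pixels (what the sensor sees).
ndvi_coarse = ndvi_fine.reshape(64, 16).mean(axis=1)
lst_coarse = lst_fine_true.reshape(64, 16).mean(axis=1)

# Fit predictor -> LST at coarse scale (OLS standing in for the paper's
# random forest), then apply the fit to the fine-scale predictor.
A = np.column_stack([np.ones_like(ndvi_coarse), ndvi_coarse])
coef, *_ = np.linalg.lstsq(A, lst_coarse, rcond=None)
lst_downscaled = coef[0] + coef[1] * ndvi_fine

# Residual correction: add back each coarse pixel's model residual so
# block means of the downscaled field match the coarse observation.
resid = lst_coarse - (coef[0] + coef[1] * ndvi_coarse)
lst_downscaled += np.repeat(resid, 16)
print(round(float(np.mean(np.abs(lst_downscaled - lst_fine_true))), 2))
```

The residual step preserves the coarse-scale energy balance: averaging the downscaled field back to coarse pixels reproduces the original LST exactly.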
Xie, Heping; Wang, Fuxing; Hao, Yanbin; Chen, Jiaxue; An, Jing; Wang, Yuxin; Liu, Huashan
2017-01-01
Cueing facilitates retention and transfer of multimedia learning. From the perspective of cognitive load theory (CLT), cueing has a positive effect on learning outcomes because of the reduction in total cognitive load and avoidance of cognitive overload. However, this has not been systematically evaluated. Moreover, what remains ambiguous is the direct relationship between the cue-related cognitive load and learning outcomes. A meta-analysis and two subsequent meta-regression analyses were conducted to explore these issues. Subjective total cognitive load (SCL) and scores on a retention test and transfer test were selected as dependent variables. Through a systematic literature search, 32 eligible articles encompassing 3,597 participants were included in the SCL-related meta-analysis. Among them, 25 articles containing 2,910 participants were included in the retention-related meta-analysis and the following retention-related meta-regression, while there were 29 articles containing 3,204 participants included in the transfer-related meta-analysis and the transfer-related meta-regression. The meta-analysis revealed a statistically significant cueing effect on subjective ratings of cognitive load (d = -0.11, 95% CI = [-0.19, -0.02], p < 0.05), retention performance (d = 0.27, 95% CI = [0.08, 0.46], p < 0.01), and transfer performance (d = 0.34, 95% CI = [0.12, 0.56], p < 0.01). The subsequent meta-regression analyses showed that dSCL for cueing significantly predicted dretention for cueing (β = -0.70, 95% CI = [-1.02, -0.38], p < 0.001), as well as dtransfer for cueing (β = -0.60, 95% CI = [-0.92, -0.28], p < 0.001). Thus in line with CLT, adding cues in multimedia materials can indeed reduce SCL and promote learning outcomes, and the more SCL is reduced by cues, the better retention and transfer of multimedia learning.
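A meta-regression of the kind reported above (effect sizes regressed on a moderator with inverse-variance weights) reduces to weighted least squares. The sketch below uses invented study-level data, not the meta-analysis's, and a fixed-effect weighting for brevity:

```python
import numpy as np

rng = np.random.default_rng(4)
k = 30

# Hypothetical per-study effects: cueing's effect on cognitive load
# (d_scl) and on retention (d_ret), with sampling variances (assumed).
d_scl = rng.normal(-0.11, 0.15, k)
v = rng.uniform(0.01, 0.05, k)
d_ret = 0.20 - 0.7 * d_scl + rng.normal(0.0, np.sqrt(v))

# Fixed-effect meta-regression: weighted least squares with inverse-
# variance weights; the slope says how load reduction predicts
# retention gain across studies.
w = 1.0 / v
X = np.column_stack([np.ones(k), d_scl])
WX = X * w[:, None]
beta = np.linalg.solve(X.T @ WX, WX.T @ d_ret)
print(round(float(beta[1]), 2))
```

A negative slope, as in the paper's β = -0.70, means studies where cues reduced subjective load more also showed larger retention benefits.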
Simons, Monique; de Vet, Emely; Chinapaw, Mai Jm; de Boer, Michiel; Seidell, Jacob C; Brug, Johannes
2014-04-04
Playing video games contributes substantially to sedentary behavior in youth. A new generation of video games-active games-seems to be a promising alternative to sedentary games to promote physical activity and reduce sedentary behavior. At this time, little is known about correlates of active and non-active gaming among adolescents. The objective of this study was to examine potential personal, social, and game-related correlates of both active and non-active gaming in adolescents. A survey assessing game behavior and potential personal, social, and game-related correlates was conducted among adolescents (12-16 years, N=353) recruited via schools. Multivariable, multilevel logistic regression analyses, adjusted for demographics (age, sex and educational level of adolescents), were conducted to examine personal, social, and game-related correlates of active gaming ≥1 hour per week (h/wk) and non-active gaming >7 h/wk. Active gaming ≥1 h/wk was significantly associated with a more positive attitude toward active gaming (OR 5.3, CI 2.4-11.8; Pgames (OR 0.30, CI 0.1-0.6; P=.002), a higher score on habit strength regarding gaming (OR 1.9, CI 1.2-3.2; P=.008) and having brothers/sisters (OR 6.7, CI 2.6-17.1; Pgame engagement (OR 0.95, CI 0.91-0.997; P=.04). Non-active gaming >7 h/wk was significantly associated with a more positive attitude toward non-active gaming (OR 2.6, CI 1.1-6.3; P=.035), a stronger habit regarding gaming (OR 3.0, CI 1.7-5.3; P7 h/wk. Active gaming is most strongly (negatively) associated with attitude with respect to non-active games, followed by observed active game behavior of brothers and sisters and attitude with respect to active gaming (positive associations). On the other hand, non-active gaming is most strongly associated with observed non-active game behavior of friends, habit strength regarding gaming and attitude toward non-active gaming (positive associations). Habit strength was a correlate of both active and non-active gaming
Programs in Fortran language for reporting the results of the analyses by ICP emission spectroscopy
International Nuclear Information System (INIS)
Roca, M.
1985-01-01
Three programs, written in FORTRAN IV language, for reporting the results of the analyses by ICP emission spectroscopy from data stored in files on floppy disks have been developed. They are intended, respectively, for the analyses of: 1) waters, 2) granites and slates, and 3) different kinds of geological materials. (Author) 8 refs
Peluso, Marco E M; Munnia, Armelle; Ceppi, Marcello
2014-11-05
Exposure to bisphenol-A, a weak estrogenic chemical widely used for the production of plastic containers, can affect rodent behaviour. We therefore examined the relationships between bisphenol-A and anxiety-like behaviour, spatial skills, and aggressiveness in 12 toxicity studies of rodent offspring from females orally exposed to bisphenol-A while pregnant and/or lactating, by median and linear spline analyses. Meta-regression analysis was then applied to quantify the behavioural changes. U-shaped, inverted U-shaped and J-shaped dose-response curves were found to describe the relationships between bisphenol-A and the behavioural outcomes. The occurrence of anxiogenic-like effects and spatial skill changes displayed U-shaped and inverted U-shaped curves, respectively, providing examples of effects observed at low doses. Conversely, a J-shaped dose-response relationship was observed for aggressiveness. When the proportion of rodents expressing certain traits, or the time they took to manifest an attitude, was analysed, the meta-regression indicated a borderline significant increment of anxiogenic-like effects at low doses regardless of sex (β=-0.8%, 95% C.I. -1.7/0.1, P=0.076, at ≤120 μg bisphenol-A), whereas only bisphenol-A males exhibited a significant inhibition of spatial skills (β=0.7%, 95% C.I. 0.2/1.2, P=0.004, at ≤100 μg/day). A significant increment of aggressiveness was observed in both sexes (β=67.9, C.I. 3.4/172.5, P=0.038, at >4.0 μg). Bisphenol-A treatments significantly abrogated spatial learning and ability in males (Pbisphenol-A, e.g. ≤120 μg/day, were associated with behavioural aberrations in offspring. Copyright © 2014. Published by Elsevier Ireland Ltd.
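The linear-spline approach to non-monotone dose-response can be illustrated with a one-knot ("broken stick") spline fitted to invented U-shaped data; the knot location, doses, and effect sizes are all assumptions, not the study's:

```python
import numpy as np

rng = np.random.default_rng(9)

# Illustrative U-shaped dose-response: the outcome falls then rises
# around a turning point near dose 100 (all values invented).
dose = rng.uniform(0.0, 200.0, 150)
y = 0.0004 * (dose - 100.0) ** 2 + rng.normal(0.0, 0.5, 150)

# Linear spline ("broken stick") with a knot at 100: fitting a basis
# of [1, dose, max(dose - 100, 0)] gives one slope below the knot and
# another above it, making a non-monotone relation explicit.
X = np.column_stack([np.ones(dose.size), dose,
                     np.maximum(dose - 100.0, 0.0)])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
slope_low = float(b[1])
slope_high = float(b[1] + b[2])
print(round(slope_low, 3), round(slope_high, 3))
```

Opposite signs for the two slopes (negative below the knot, positive above) are the spline-regression signature of a U-shaped curve; an inverted U reverses both signs.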
Fossati, Andrea; Widiger, Thomas A; Borroni, Serena; Maffei, Cesare; Somma, Antonella
2017-06-01
To extend the evidence on the reliability and construct validity of the Five-Factor Model Rating Form (FFMRF) in its self-report version, two independent samples of Italian participants, which were composed of 510 adolescent high school students and 457 community-dwelling adults, respectively, were administered the FFMRF in its Italian translation. Adolescent participants were also administered the Italian translation of the Borderline Personality Features Scale for Children-11 (BPFSC-11), whereas adult participants were administered the Italian translation of the Triarchic Psychopathy Measure (TriPM). Cronbach α values were consistent with previous findings; in both samples, average interitem r values indicated acceptable internal consistency for all FFMRF scales. A multidimensional graded item response theory model indicated that the majority of FFMRF items had adequate discrimination parameters; information indices supported the reliability of the FFMRF scales. Both categorical (i.e., item-level) and scale-level regression analyses suggested that the FFMRF scores may predict a nonnegligible amount of variance in the BPFSC-11 total score in adolescent participants, and in the TriPM scale scores in adult participants.
Riley, Richard D.
2017-01-01
An important question for clinicians appraising a meta‐analysis is: are the findings likely to be valid in their own practice—does the reported effect accurately represent the effect that would occur in their own clinical population? To this end we advance the concept of statistical validity—where the parameter being estimated equals the corresponding parameter for a new independent study. Using a simple (‘leave‐one‐out’) cross‐validation technique, we demonstrate how we may test meta‐analysis estimates for statistical validity using a new validation statistic, Vn, and derive its distribution. We compare this with the usual approach of investigating heterogeneity in meta‐analyses and demonstrate the link between statistical validity and homogeneity. Using a simulation study, the properties of Vn and the Q statistic are compared for univariate random effects meta‐analysis and a tailored meta‐regression model, where information from the setting (included as model covariates) is used to calibrate the summary estimate to the setting of application. Their properties are found to be similar when there are 50 studies or more, but for fewer studies Vn has greater power but a higher type 1 error rate than Q. The power and type 1 error rate of Vn are also shown to depend on the within‐study variance, between‐study variance, study sample size, and the number of studies in the meta‐analysis. Finally, we apply Vn to two published meta‐analyses and conclude that it usefully augments standard methods when deciding upon the likely validity of summary meta‐analysis estimates in clinical practice. © 2017 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd. PMID:28620945
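The leave-one-out idea behind the validation statistic can be sketched generically. The statistic below is a plain sum of squared standardized leave-one-out discrepancies, a chi-square-like stand-in for, and not necessarily identical to, the paper's Vn; all data are invented:

```python
import numpy as np

rng = np.random.default_rng(5)
k = 20

# Hypothetical homogeneous meta-analysis data (illustrative values).
theta = rng.normal(0.3, 0.05, k)      # study estimates
se = np.full(k, 0.05)                 # their standard errors

def fe_summary(est, s):
    """Inverse-variance weighted (fixed-effect) summary and its SE."""
    w = 1.0 / s**2
    return np.sum(w * est) / np.sum(w), np.sqrt(1.0 / np.sum(w))

# Leave-one-out check: does the summary from the other k-1 studies
# "predict" the left-out study within its sampling error?
z = np.empty(k)
for i in range(k):
    keep = np.arange(k) != i
    mu_i, se_i = fe_summary(theta[keep], se[keep])
    z[i] = (theta[i] - mu_i) / np.hypot(se[i], se_i)

# Under homogeneity the z_i are roughly N(0,1), so their squared sum
# behaves like a chi-square statistic; large values signal that the
# summary would not be valid for a new independent study.
V = float(np.sum(z**2))
print(round(V, 1))
```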
Khankari, Nikhil K.; Shu, Xiao Ou; Wen, Wanqing; Kraft, Peter; Lindström, Sara; Peters, Ulrike; Schildkraut, Joellen; Schumacher, Fredrick; Bofetta, Paolo; Risch, Angela; Bickeböller, Heike; Amos, Christopher I.; Easton, Douglas; Eeles, Rosalind A.; Gruber, Stephen B.; Haiman, Christopher A.; Hunter, David J.; Chanock, Stephen J.; Pierce, Brandon L.; Zheng, Wei; Blalock, Kendra; Campbell, Peter T.; Casey, Graham; Conti, David V.; Edlund, Christopher K.; Figueiredo, Jane; James Gauderman, W.; Gong, Jian; Green, Roger C.; Harju, John F.; Harrison, Tabitha A.; Jacobs, Eric J.; Jenkins, Mark A.; Jiao, Shuo; Li, Li; Lin, Yi; Manion, Frank J.; Moreno, Victor; Mukherjee, Bhramar; Raskin, Leon; Schumacher, Fredrick R.; Seminara, Daniela; Severi, Gianluca; Stenzel, Stephanie L.; Thomas, Duncan C.; Hopper, John L.; Southey, Melissa C.; Makalic, Enes; Schmidt, Daniel F.; Fletcher, Olivia; Peto, Julian; Gibson, Lorna; dos Santos Silva, Isabel; Ahsan, Habib; Whittemore, Alice; Waisfisz, Quinten; Meijers-Heijboer, Hanne; Adank, Muriel; van der Luijt, Rob B.; Uitterlinden, Andre G.; Hofman, Albert; Meindl, Alfons; Schmutzler, Rita K.; Müller-Myhsok, Bertram; Lichtner, Peter; Nevanlinna, Heli; Muranen, Taru A.; Aittomäki, Kristiina; Blomqvist, Carl; Chang-Claude, Jenny; Hein, Rebecca; Dahmen, Norbert; Beckman, Lars; Crisponi, Laura; Hall, Per; Czene, Kamila; Irwanto, Astrid; Liu, Jianjun; Easton, Douglas F.; Turnbull, Clare; Rahman, Nazneen; Eeles, Rosalind; Kote-Jarai, Zsofia; Muir, Kenneth; Giles, Graham; Neal, David; Donovan, Jenny L.; Hamdy, Freddie C.; Wiklund, Fredrik; Gronberg, Henrik; Haiman, Christopher; Schumacher, Fred; Travis, Ruth; Riboli, Elio; Hunter, David; Gapstur, Susan; Berndt, Sonja; Chanock, Stephen; Han, Younghun; Su, Li; Wei, Yongyue; Hung, Rayjean J.; Brhane, Yonathan; McLaughlin, John; Brennan, Paul; McKay, James D.; Rosenberger, Albert; Houlston, Richard S.; Caporaso, Neil; Teresa Landi, Maria; Heinrich, Joachim; Wu, Xifeng; Ye, Yuanqing; Christiani, 
David C.
2016-01-01
Background: Observational studies examining associations between adult height and risk of colorectal, prostate, and lung cancers have generated mixed results. We conducted meta-analyses using data from prospective cohort studies and further carried out Mendelian randomization analyses.
Energy Technology Data Exchange (ETDEWEB)
Xu, Bin [School of Statistics, Jiangxi University of Finance and Economics, Nanchang, Jiangxi 330013 (China); Research Center of Applied Statistics, Jiangxi University of Finance and Economics, Nanchang, Jiangxi 330013 (China); Lin, Boqiang, E-mail: bqlin@xmu.edu.cn [Collaborative Innovation Center for Energy Economics and Energy Policy, China Institute for Studies in Energy Policy, Xiamen University, Xiamen, Fujian 361005 (China)
2017-03-15
China is currently the world's largest carbon dioxide (CO₂) emitter. Moreover, total energy consumption and CO₂ emissions in China will continue to increase with the rapid growth of industrialization and urbanization, so vigorously developing the high-tech industry becomes an inevitable choice for reducing CO₂ emissions now and in the future. However, ignoring the existing nonlinear links between economic variables, most scholars use traditional linear models to explore the impact of the high-tech industry on CO₂ emissions from an aggregate perspective, and few studies have focused on nonlinear relationships and regional differences in China. Based on panel data for 1998-2014, this study uses a nonparametric additive regression model to explore the nonlinear effect of the high-tech industry from a regional perspective. The estimated results show that the residual sums of squares (SSR) of the nonparametric additive regression model in the eastern, central and western regions are 0.693, 0.054 and 0.085 respectively, much smaller than those of the traditional linear regression model (3.158, 4.227 and 7.196). This verifies that the nonparametric additive regression model fits the data better. Specifically, the high-tech industry produces an inverted "U-shaped" nonlinear impact on CO₂ emissions in the eastern region, but a positive "U-shaped" nonlinear effect in the central and western regions. The nonlinear impact of the high-tech industry on CO₂ emissions in the three regions should therefore be given adequate attention when developing effective abatement policies. - Highlights: • The nonlinear effect of the high-tech industry on CO₂ emissions was investigated. • The high-tech industry yields an inverted "U-shaped" effect in the eastern region. • The high-tech industry has a positive "U-shaped" nonlinear effect in other regions.
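The SSR comparison between a linear fit and a nonparametric fit can be reproduced in miniature. The binned-means smoother below is a deliberately crude stand-in for the paper's nonparametric additive regression model, and the data are invented:

```python
import numpy as np

rng = np.random.default_rng(8)
n = 400

# Illustrative inverted-U relation between a driver x and emissions y.
x = rng.uniform(0.0, 1.0, n)
y = 4.0 * x * (1.0 - x) + rng.normal(0.0, 0.1, n)

# Linear fit and its residual sum of squares (SSR).
A = np.column_stack([np.ones(n), x])
b, *_ = np.linalg.lstsq(A, y, rcond=None)
ssr_lin = float(np.sum((y - A @ b) ** 2))

# Crude nonparametric stand-in: piecewise-constant fit from binned
# means (a very simple smoother, not the paper's estimator).
bins = np.clip((x * 10).astype(int), 0, 9)
fit = np.array([y[bins == j].mean() for j in range(10)])[bins]
ssr_np = float(np.sum((y - fit) ** 2))
print(ssr_np < ssr_lin)
```

When the true relation is U-shaped or inverted-U, the linear model's SSR stays large no matter how it is fitted, which is exactly the pattern the paper reports for its regional SSR comparison.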
Sobrinho, L G; Almeida-Costa, J M
1992-01-01
Pathological hyperprolactinaemia (PH) is significantly associated with: (1) paternal deprivation during childhood, (2) depression, (3) non-specific symptoms including obesity and weight gain. The clinical onset of the symptoms often follows pregnancy or a loss. Prolactin is an insulin antagonist which does not promote weight gain. Hyperprolactinaemia and increased metabolic efficiency are parts of a system of interdependent behavioural and metabolic mechanisms necessary for the care of the young. We call this system, which is available as a whole package, maternal subroutine (MS). An important number of cases of PH are due to activation of the MS that is not induced by pregnancy. The same occurs in surrogate maternity and in some animal models. Most women with PH developed a malignant symbiotic relationship with their mothers in the setting of absence, alcoholism or devaluation of the father. These women may regress to early developmental stages to the point that they identify themselves both with their lactating mother and with the nursing infant as has been found in psychoanalysed patients and in the paradigmatic condition of pseudopregnancy. Such regression can be associated with activation of the MS. Prolactinomas represent the extreme of the spectrum of PH and may result from somatic mutations occurring in hyperstimulated lactotrophs.
Results of radiotherapy in craniopharyngiomas analysed by the linear quadratic model
Energy Technology Data Exchange (ETDEWEB)
Guerkaynak, M. [Dept. of Radiation Oncology, Hacettepe Univ., Ankara (Turkey); Oezyar, E. [Dept. of Radiation Oncology, Hacettepe Univ., Ankara (Turkey); Zorlu, F. [Dept. of Radiation Oncology, Hacettepe Univ., Ankara (Turkey); Akyol, F.H. [Dept. of Radiation Oncology, Hacettepe Univ., Ankara (Turkey); Lale Atahan, I. [Dept. of Radiation Oncology, Hacettepe Univ., Ankara (Turkey)
1994-12-31
In 23 craniopharyngioma patients treated by limited surgery and external radiotherapy, the results concerning local control were analysed with the linear-quadratic formula. A biologically effective dose (BED) of 55 Gy, calculated with a time factor and an α/β value of 10 Gy, seemed adequate for local control. (orig.).
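For reference, the standard linear-quadratic biologically effective dose with an overall-time (repopulation) correction, in textbook notation that may differ from the authors' exact parameterization, is

```latex
\mathrm{BED} = n d \left( 1 + \frac{d}{\alpha/\beta} \right) - \frac{\gamma}{\alpha}\,\left( T - T_k \right)
```

where n is the number of fractions, d the dose per fraction, α/β the fractionation sensitivity (10 Gy above), T the overall treatment time, T_k the kick-off time for repopulation, and γ/α the daily dose equivalent of repopulation.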
Spady, Richard; Stouli, Sami
2012-01-01
We propose dual regression as an alternative to the quantile regression process for the global estimation of conditional distribution functions under minimal assumptions. Dual regression provides all the interpretational power of the quantile regression process while avoiding the need for repairing the intersecting conditional quantile surfaces that quantile regression often produces in practice. Our approach introduces a mathematical programming characterization of conditional distribution functions.
Probabilistic approach in treatment of deterministic analyses results of severe accidents
International Nuclear Information System (INIS)
Krajnc, B.; Mavko, B.
1996-01-01
Severe accident sequences resulting in loss of the core's geometric integrity have been found to have a small probability of occurrence. Because of their potential consequences for public health and safety, an evaluation of the core degradation progression and the resulting effects on the containment is necessary to determine the probability of a significant release of radioactive materials. This requires assessment of many interrelated phenomena including: steel and zircaloy oxidation, steam spikes, in-vessel debris cooling, potential vessel failure mechanisms, release of core material to the containment, containment pressurization from steam generation or from generation of non-condensable gases or hydrogen burn, and ultimately coolability of degraded core material. To assess the answers from the containment event trees, in the sense of whether a certain phenomenological event would happen or not, plant-specific deterministic analyses should be performed. Because there is large uncertainty in the prediction of severe accident phenomena in Level 2 analyses (containment event trees), a combination of probabilistic and deterministic approaches should be used. In fact, the results of the deterministic analyses of severe accidents are treated in a probabilistic manner owing to the large uncertainty of the results, a consequence of the lack of detailed knowledge. This paper discusses the approach used in many IPEs, which ensures that the probability assigned to a certain question in the event tree represents the probability that the event will or will not happen, and that this probability also includes its uncertainty, which is mainly a result of lack of knowledge. (author)
Preliminary Results of Ancillary Safety Analyses Supporting TREAT LEU Conversion Activities
Energy Technology Data Exchange (ETDEWEB)
Brunett, A. J. [Argonne National Lab. (ANL), Argonne, IL (United States); Fei, T. [Argonne National Lab. (ANL), Argonne, IL (United States); Strons, P. S. [Argonne National Lab. (ANL), Argonne, IL (United States); Papadias, D. D. [Argonne National Lab. (ANL), Argonne, IL (United States); Hoffman, E. A. [Argonne National Lab. (ANL), Argonne, IL (United States); Kontogeorgakos, D. C. [Argonne National Lab. (ANL), Argonne, IL (United States); Connaway, H. M. [Argonne National Lab. (ANL), Argonne, IL (United States); Wright, A. E. [Argonne National Lab. (ANL), Argonne, IL (United States)
2015-10-01
Report (FSAR) [3]. Depending on the availability of historical data derived from HEU TREAT operation, results calculated for the LEU core are compared to measurements obtained from HEU TREAT operation. While all analyses in this report are largely considered complete and have been reviewed for technical content, it is important to note that all topics will be revisited once the LEU design approaches its final stages of maturity. For most safety significant issues, it is expected that the analyses presented here will be bounding, but additional calculations will be performed as necessary to support safety analyses and safety documentation. It should also be noted that these analyses were completed as the LEU design evolved, and therefore utilized different LEU reference designs. Preliminary shielding, neutronic, and thermal hydraulic analyses have been completed and have generally demonstrated that the various LEU core designs will satisfy existing safety limits and standards also satisfied by the existing HEU core. These analyses include the assessment of the dose rate in the hodoscope room, near a loaded fuel transfer cask, above the fuel storage area, and near the HEPA filters. The potential change in the concentration of tramp uranium and change in neutron flux reaching instrumentation has also been assessed. Safety-significant thermal hydraulic items addressed in this report include thermally-induced mechanical distortion of the grid plate, and heating in the radial reflector.
Wang, Hui; Sui, Weiguo; Xue, Wen; Wu, Junyong; Chen, Jiejing; Dai, Yong
2014-09-01
Immunoglobulin A nephropathy (IgAN) is a complex trait regulated by the interaction among multiple physiologic regulatory systems and probably involving numerous genes, which leads to inconsistent findings in genetic studies. One possibility of failure to replicate some single-locus results is that the underlying genetics of IgAN nephropathy is based on multiple genes with minor effects. To learn the association between 23 single nucleotide polymorphisms (SNPs) in 14 genes predisposing to chronic glomerular diseases and IgAN in Han males, the 23 SNPs genotypes of 21 Han males were detected and analyzed with a BaiO gene chip, and their associations were analyzed with univariate analysis and multiple linear regression analysis. Analysis showed that CTLA4 rs231726 and CR2 rs1048971 revealed a significant association with IgAN. These findings support the multi-gene nature of the etiology of IgAN and propose a potential gene-gene interactive model for future studies.
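The multiple-linear-regression step of such an association analysis can be sketched with ordinary least squares on invented genotype data; the effect sizes, sample size, and SNP count below are assumptions, not the study's:

```python
import numpy as np

rng = np.random.default_rng(6)
n = 500

# Hypothetical genotypes (0/1/2 minor-allele counts) for two SNPs and
# a quantitative trait influenced by both (all values illustrative).
g = rng.integers(0, 3, size=(n, 2)).astype(float)
y = 0.4 * g[:, 0] + 0.3 * g[:, 1] + rng.normal(0.0, 1.0, n)

# Multiple linear regression: both SNPs fitted jointly, so each
# coefficient is adjusted for the other locus. This is how multi-SNP
# models can recover minor effects that single-locus tests miss.
X = np.column_stack([np.ones(n), g])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.round(beta[1:], 1))
```

Extending the design matrix to 23 SNPs (as in the study) changes nothing structurally; each coefficient remains a per-allele effect adjusted for the other loci.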
Results and analyses of the smoke from compression ignition CI engines (diesel)
International Nuclear Information System (INIS)
Dimitrovski, Mile; Mitreski, Stevo M.
1998-01-01
The aim of this examination is to improve knowledge of the current ecological situation resulting from gas and particle emissions from diesel engines. The mechanism of the production of combustion products, including particles, in the combustion chamber has been studied, as well as options for reducing the emissions. Measurements of smoke emission from vehicles in Skopje have been made, and the results are given together with their analyses. The conclusion of the analyses is that it is necessary to improve the conditions for normal use of diesel engine vehicles, as well as to introduce new vehicles that cause less damage to the environment (Author)
International Nuclear Information System (INIS)
Hoertner, H.
1977-01-01
For the investigation of the risk of nuclear power plants loss-of-coolant accidents and transients have to be analyzed. The different functions of the engineered safety features installed to cope with transients are explained. The event tree analysis is carried out for the important transient 'loss of normal onsite power'. Preliminary results of the reliability analyses performed for quantitative evaluation of this event tree are shown. (orig.) [de
Li, L.; Yang, C.
2017-12-01
Climate extremes often manifest as rare events in terms of surface air temperature and precipitation with an annual recurrence period. In order to represent the manifold characteristics of climate extremes for monitoring and analysis, the Expert Team on Climate Change Detection and Indices (ETCCDI) has worked out a set of 27 core indices based on daily temperature and precipitation data, describing extreme weather and climate events on an annual basis. The CLIMDEX project (http://www.climdex.org) has produced public domain datasets of such indices for data from a variety of sources, including output from global climate models (GCMs) participating in the Coupled Model Intercomparison Project Phase 5 (CMIP5). Among the 27 ETCCDI indices, there are six percentile-based temperature extremes indices that fall into two groups: exceedance rates (ER) (TN10p, TN90p, TX10p and TX90p) and durations (CSDI and WSDI). Percentiles must be estimated prior to the calculation of the indices, and may be biased to a greater or lesser degree by the adopted algorithm. Such biases will in turn be propagated to the final values of the indices. CLIMDEX used an empirical quantile estimator combined with a bootstrap resampling procedure to reduce the inhomogeneity in the annual series of the ER indices. However, some problems remain in the CLIMDEX datasets, namely the overestimated climate variability due to unaccounted autocorrelation in the daily temperature data, seasonally varying biases, and inconsistency between the algorithms applied to the ER indices and to the duration indices. We now present new results for the six indices from a semiparametric quantile regression approach applied to the CMIP5 model output. By using the base-period data as a whole and taking seasonality and autocorrelation into account, this approach successfully addresses the aforementioned issues and produces consistent results. The new datasets cover the historical and three projected (RCP2.6, RCP4.5 and RCP
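As a rough illustration of how an exceedance-rate (ER) index such as TX90p is built, the sketch below (Python, with synthetic data and a simple empirical quantile estimator, not the CLIMDEX bootstrap procedure) estimates a base-period 90th percentile and the resulting annual exceedance rate:

```python
import random

def empirical_quantile(sample, q):
    """Empirical quantile via linear interpolation between order statistics."""
    s = sorted(sample)
    pos = q * (len(s) - 1)
    lo = int(pos)
    hi = min(lo + 1, len(s) - 1)
    return s[lo] + (pos - lo) * (s[hi] - s[lo])

def exceedance_rate(days, threshold):
    """Percentage of days whose value exceeds the threshold (a TX90p-style ER index)."""
    return 100.0 * sum(1 for t in days if t > threshold) / len(days)

random.seed(42)
# Hypothetical base-period (30 years) and target-year daily temperatures
base = [random.gauss(15.0, 5.0) for _ in range(30 * 365)]
year = [random.gauss(16.0, 5.0) for _ in range(365)]  # a slightly warmer year

p90 = empirical_quantile(base, 0.90)
er = exceedance_rate(year, p90)
print(round(p90, 2), round(er, 1))  # ER index for the warmer year, in percent
```

In the real indices the percentile is estimated per calendar day from a window of base-period data, which is where the algorithmic biases discussed in the abstract enter.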
Salas, M.M.; Nascimento, G.G.; Vargas-Ferreira, F.; Tarquinio, S.B.; Huysmans, M.C.D.N.J.M.; Demarco, F.F.
2015-01-01
OBJECTIVE: The aim of the present study was to assess the influence of diet on the presence of tooth erosion in children and adolescents by meta-analysis and meta-regression. DATA: Two reviewers independently performed the selection process, and the quality of the studies was assessed. SOURCES: Studies
International Nuclear Information System (INIS)
Lehmann, M.; Pecka, M.; Rocek, J.; Zalesky, K.
1993-12-01
Detailed results are given of neutron physics analyses performed to assess the efficiency and acceptability of modifications of the WWER-440 core protection and control system; the modifications have been proposed with a view to increasing the proportion of mechanical control in the compensation of reactivity effects during reactor unit operation in the variable load mode. The calculations were carried out using the modular MOBY-DICK macrocode system together with the SMV42G36 library of two-group parametrized diffusion constants, containing corrections which allow new-design WWER-440 fuel assemblies to be discriminated. (J.B). 37 tabs., 18 figs., 5 refs
Duda, David P.; Minnis, Patrick
2009-01-01
Straightforward application of the Schmidt-Appleman contrail formation criteria to diagnose persistent contrail occurrence from numerical weather prediction data is hindered by significant bias errors in the upper tropospheric humidity. Logistic models of contrail occurrence have been proposed to overcome this problem, but basic questions remain about how random measurement error may affect their accuracy. A set of 5000 synthetic contrail observations is created to study the effects of random error in these probabilistic models. The simulated observations are based on distributions of temperature, humidity, and vertical velocity derived from Advanced Regional Prediction System (ARPS) weather analyses. The logistic models created from the simulated observations were evaluated using two common statistical measures of model accuracy, the percent correct (PC) and the Hanssen-Kuipers discriminant (HKD). To convert the probabilistic results of the logistic models into a dichotomous yes/no choice suitable for the statistical measures, two critical probability thresholds are considered. The HKD scores are higher when the climatological frequency of contrail occurrence is used as the critical threshold, while the PC scores are higher when the critical probability threshold is 0.5. For both thresholds, typical random errors in temperature, relative humidity, and vertical velocity are found to be small enough to allow for accurate logistic models of contrail occurrence. The accuracy of the models developed from synthetic data is over 85 percent for both the prediction of contrail occurrence and non-occurrence, although in practice, larger errors would be anticipated.
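The two accuracy measures named above have simple closed forms over a 2x2 contingency table. The sketch below (hypothetical verification counts, not the paper's data) computes the percent correct, PC = (hits + correct negatives) / total, and the Hanssen-Kuipers discriminant, HKD = hit rate minus false-alarm rate:

```python
def contingency_scores(hits, false_alarms, misses, correct_negatives):
    """Percent correct (PC) and Hanssen-Kuipers discriminant (HKD) for a
    dichotomous yes/no forecast summarized in a 2x2 contingency table."""
    a, b, c, d = hits, false_alarms, misses, correct_negatives
    n = a + b + c + d
    pc = (a + d) / n                 # fraction of all forecasts that were correct
    hkd = a / (a + c) - b / (b + d)  # hit rate minus false-alarm rate
    return pc, hkd

# Hypothetical counts for a contrail yes/no forecast
pc, hkd = contingency_scores(hits=120, false_alarms=30, misses=40, correct_negatives=810)
print(round(pc, 3), round(hkd, 3))  # → 0.93 0.714
```

Note that PC rewards correct "no" forecasts of a rare event, while HKD does not, which is why the two measures favor different probability thresholds in the study.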
Differentiating regressed melanoma from regressed lichenoid keratosis.
Chan, Aegean H; Shulman, Kenneth J; Lee, Bonnie A
2017-04-01
Distinguishing regressed lichen planus-like keratosis (LPLK) from regressed melanoma can be difficult on histopathologic examination, potentially resulting in mismanagement of patients. We aimed to identify histopathologic features by which regressed melanoma can be differentiated from regressed LPLK. Twenty actively inflamed LPLK, 12 LPLK with regression and 15 melanomas with regression were compared and evaluated by hematoxylin and eosin staining as well as Melan-A, microphthalmia transcription factor (MiTF) and cytokeratin (AE1/AE3) immunostaining. (1) A total of 40% of regressed melanomas showed complete or near complete loss of melanocytes within the epidermis with Melan-A and MiTF immunostaining, while 8% of regressed LPLK exhibited this finding. (2) Necrotic keratinocytes were seen in the epidermis in 33% of regressed melanomas as opposed to all of the regressed LPLK. (3) A dense infiltrate of melanophages in the papillary dermis was seen in 40% of regressed melanomas, a feature not seen in regressed LPLK. In summary, our findings suggest that a complete or near complete loss of melanocytes within the epidermis strongly favors a regressed melanoma over a regressed LPLK. In addition, necrotic epidermal keratinocytes and the presence of a dense band-like distribution of dermal melanophages can be helpful in differentiating these lesions. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Sjølie, A K; Klein, R; Porta, M; Orchard, T; Fuller, J; Parving, H H; Bilous, R; Aldington, S; Chaturvedi, N
2011-03-01
To study the association between baseline retinal microaneurysm score and progression and regression of diabetic retinopathy, and the response to treatment with candesartan, in people with diabetes. This was a multicenter randomized clinical trial. The progression analysis included 893 patients with Type 1 diabetes and 526 patients with Type 2 diabetes with retinal microaneurysms only at baseline. For regression, 438 patients with Type 1 and 216 with Type 2 diabetes qualified. Microaneurysms were scored from yearly retinal photographs according to the Early Treatment Diabetic Retinopathy Study (ETDRS) protocol. Retinopathy progression and regression were defined as a change of two or more steps on the ETDRS scale from baseline. Patients were normoalbuminuric and either normotensive (Type 1 and Type 2 diabetes) or treated hypertensive (Type 2 diabetes). They were randomized to treatment with candesartan 32 mg daily or placebo and followed for 4.6 years. A higher microaneurysm score at baseline predicted an increased risk of retinopathy progression (HR per microaneurysm score 1.08 in Type 1 diabetes; HR 1.07, P = 0.0174, in Type 2 diabetes) and a reduced likelihood of regression (HR 0.79 in Type 1 diabetes; HR 0.85, P = 0.0009, in Type 2 diabetes), all adjusted for baseline variables and treatment. Candesartan reduced the risk of microaneurysm score progression. Microaneurysm counts are important prognostic indicators for worsening of retinopathy, thus microaneurysms are not benign. Treatment with renin-angiotensin system inhibitors is effective in the early stages and may improve mild diabetic retinopathy. Microaneurysm scores may be useful surrogate endpoints in clinical trials. © 2011 The Authors. Diabetic Medicine © 2011 Diabetes UK.
Results of initial analyses of the salt (macro) batch 11 Tank 21H qualification samples
Energy Technology Data Exchange (ETDEWEB)
Peters, T. B. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)
2017-10-23
Savannah River National Laboratory (SRNL) analyzed samples from Tank 21H in support of qualification of Interim Salt Disposition Project (ISDP) Salt (Macro) Batch 11 for processing through the Actinide Removal Process (ARP) and the Modular Caustic-Side Solvent Extraction Unit (MCU). This document reports the initial results of the analyses of samples of Tank 21H. Analysis of the Tank 21H Salt (Macro) Batch 11 composite sample indicates that the material does not display any unusual characteristics or observations, such as floating solids, the presence of large amounts of solids, or unusual colors. Further sample results will be reported in a future document. This memo satisfies part of Deliverable 3 of the Technical Task Request (TTR).
Zhang, Hongyang; Welch, William J.; Zamar, Ruben H.
2017-01-01
Tomal et al. (2015) introduced the notion of "phalanxes" in the context of rare-class detection in two-class classification problems. A phalanx is a subset of features that work well for classification tasks. In this paper, we propose a different class of phalanxes for application in regression settings. We define a "Regression Phalanx" - a subset of features that work well together for prediction. We propose a novel algorithm which automatically chooses Regression Phalanxes from high-dimensi...
Davidson, John B.; Murphy, Patrick C.; Lallman, Frederick J.; Hoffler, Keith D.; Bacon, Barton J.
1998-01-01
This report contains a description of a lateral-directional control law designed for the NASA High-Alpha Research Vehicle (HARV). The HARV is an F/A-18 aircraft modified to include a research flight computer, spin chute, and thrust-vectoring in the pitch and yaw axes. Two separate design tools, CRAFT and Pseudo Controls, were integrated to synthesize the lateral-directional control law. This report contains a description of the lateral-directional control law, analyses, and nonlinear simulation (batch and piloted) results. Linear analysis results include closed-loop eigenvalues, stability margins, robustness to changes in various plant parameters, and servo-elastic frequency responses. Step time responses from nonlinear batch simulation are presented and compared to design guidelines. Piloted simulation task scenarios, task guidelines, and pilot subjective ratings for the various maneuvers are discussed. Linear analysis shows that the control law meets the stability margin guidelines and is robust to stability and control parameter changes. Nonlinear batch simulation analysis shows the control law exhibits good performance and meets most of the design guidelines over the entire range of angle-of-attack. This control law (designated NASA-1A) was flight tested during the summer of 1994 at NASA Dryden Flight Research Center.
Summary of results from the IPIRG-2 round-robin analyses
International Nuclear Information System (INIS)
Rahman, S.; Olson, R.; Rosenfield, A.; Wilkowski, G.
1996-02-01
This report presents a summary of the results from three one-day international round-robin workshops which were organized by Battelle in conjunction with the Second International Piping Integrity Research Group (IPIRG-2) Program. The objective of these workshops was to develop a consensus in handling difficult analytical problems in leak-before-break and pipe flaw evaluations. The workshops, which were held August 5, 1993, March 4, 1994, and October 21, 1994 at Columbus, Ohio, involved various technical presentations on the related research efforts by the IPIRG-2 member organizations and solutions to several round-robin problems. Following review by the IPIRG-2 members, four sets of round-robin problems were developed. They involved: (1) evaluations of fracture properties and pipe loads, (2) crack-opening and leak-rate evaluations, (3) dynamic analysis of cracked pipes, and (4) evaluations of elbows. A total of 18 organizations from the United States, Japan, Korea, and Europe solved these round-robin problems. The analysis techniques employed by the participants included both finite element and engineering methods. Based on the results from these analyses, several important observations were made concerning the predictive capability of the current fracture-mechanics and thermal-hydraulics models for their applications in nuclear piping and piping welds
"What If" Analyses: Ways to Interpret Statistical Significance Test Results Using EXCEL or "R"
Ozturk, Elif
2012-01-01
The present paper reviews two motivations for conducting "what if" analyses using Excel and "R" to understand statistical significance tests in the context of sample size. "What if" analyses can be used to teach students what statistical significance tests really do, and in applied research either prospectively to estimate what sample size…
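A minimal "what if" analysis of this kind can be scripted directly. The sketch below (in Python rather than Excel or "R", and using a normal z-approximation in place of a t-test, so small-sample p-values are slightly optimistic) shows how the two-sided p-value for a fixed standardized effect size d changes with sample size alone:

```python
import math

def two_sided_p_from_z(z):
    """Two-sided p-value for a standard-normal test statistic."""
    return math.erfc(abs(z) / math.sqrt(2))

def what_if_p(effect_size_d, n):
    """'What if' p-value had the same standardized effect size d been
    observed with sample size n (one-sample z-test approximation)."""
    z = effect_size_d * math.sqrt(n)
    return two_sided_p_from_z(z)

# The same small effect (d = 0.2) moves from 'non-significant' to
# 'significant' purely as a function of sample size:
for n in (25, 100, 400):
    print(n, round(what_if_p(0.2, n), 4))
```

Running this makes the pedagogical point concrete: statistical significance reflects both effect size and sample size, so a fixed effect can be "significant" or not depending solely on n.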
International Nuclear Information System (INIS)
Kang, Kyoung Ho; Cho, Young Ro; Koo, Kil Mo; Park, Rae Joon; Kim, Jong Hwan; Kim, Jong Tae; Ha, Kwang Sun; Kim, Sang Baik; Kim, Hee Dong
2001-03-01
LAVA (Lower-plenum Arrested Vessel Attack) tests have been performed to gather evidence of gap formation between the debris and the lower head vessel and to evaluate the effect of the gap formation on in-vessel cooling. Through a total of 12 tests, analyses of the melt relocation process, gap formation, and the thermal and mechanical behavior of the vessel were performed. The thermal behavior of the lower head vessel was affected by the formation of fragmented particles and a melt pool during the melt relocation process, depending on the mass and composition of the melt and the subcooling and depth of the water. During the melt relocation process, 10.0 to 20.0% of the melt mass was fragmented and 15.5 to 47.5% of the thermal energy of the melt was transferred to the water. The experimental results demonstrate the non-adherence of the debris to the lower head vessel and the consequent gap formation between the debris and the lower head vessel in cases where an internal pressure load acted across the vessel together with the thermal load induced by the thermite melt. The thermal behavior of the lower head vessel during the cooldown period was mainly affected by the heat removal characteristics through this gap, which were determined by the possibility of water ingression into the gap, depending on the melt composition of the corium simulant. The enhanced cooling capacity through the gap was distinct in the Al2O3 melt tests. It could be inferred from the analyses of the heat removal capacity through the gap that the lower head vessel could effectively cool down via heat removal in the gap, governed by counter-current flow limits (CCFL), even if a 2 mm thick gap should form in the 30 kg Al2O3 melt tests; this was also confirmed through the variations of the conduction heat flux in the vessel and the rapid cooldown of the vessel outer surface in the Al2O3 melt tests. In the case of the large melt mass of 70 kg Al2O3 melt, however, the infinite possibility of heat removal through the
Directory of Open Access Journals (Sweden)
Susanne Unverzagt
Full Text Available This study is an in-depth analysis to explain statistical heterogeneity in a systematic review of implementation strategies to improve guideline adherence of primary care physicians in the treatment of patients with cardiovascular diseases. The systematic review included randomized controlled trials from a systematic search in MEDLINE, EMBASE, CENTRAL, conference proceedings and registers of ongoing studies. Implementation strategies were shown to be effective, with substantial heterogeneity of treatment effects across all investigated strategies. The primary aim of this study was to explain the different effects of eligible trials and to identify methodological and clinical effect modifiers. Random-effects meta-regression models were used to simultaneously assess the influence of multimodal implementation strategies and effect modifiers on physician adherence. Effect modifiers included the staff responsible for implementation, level of prevention, definition of the primary outcome, unit of randomization, duration of follow-up and risk of bias. Six clinical and methodological factors were investigated as potential effect modifiers of the efficacy of different implementation strategies on guideline adherence in primary care practices, on the basis of information from 75 eligible trials. Five effect modifiers were able to explain a substantial amount of the statistical heterogeneity. Physician adherence was improved by 62% (95% confidence interval [95% CI] 29 to 104%) or 29% (95% CI 5 to 60%) in trials where other non-medical professionals or nurses, respectively, were included in the implementation process. Improvement of physician adherence was more successful in primary and secondary prevention of cardiovascular diseases, by around 30% (30%, 95% CI -2 to 71%, and 31%, 95% CI 9 to 57%, respectively), compared to tertiary prevention. This study aimed to identify effect modifiers of implementation strategies on physician adherence. Especially the cooperation of different health
Duda, David P.; Minnis, Patrick
2009-01-01
Previous studies have shown that probabilistic forecasting may be a useful method for predicting persistent contrail formation. A probabilistic forecast to accurately predict contrail formation over the contiguous United States (CONUS) is created by using meteorological data based on hourly meteorological analyses from the Advanced Regional Prediction System (ARPS) and from the Rapid Update Cycle (RUC) as well as GOES water vapor channel measurements, combined with surface and satellite observations of contrails. Two groups of logistic models were created. The first group of models (SURFACE models) is based on surface-based contrail observations supplemented with satellite observations of contrail occurrence. The second group of models (OUTBREAK models) is derived from a selected subgroup of satellite-based observations of widespread persistent contrails. The mean accuracies for both the SURFACE and OUTBREAK models typically exceeded 75 percent when based on the RUC or ARPS analysis data, but decreased when the logistic models were derived from ARPS forecast data.
Review of Statistical Analyses Resulting from Performance of HLDWD-DWPF-005
International Nuclear Information System (INIS)
Beck, R.S.
1997-01-01
The Engineering Department at the Defense Waste Processing Facility (DWPF) has reviewed two reports from the Statistical Consulting Section (SCS) involving the statistical analysis of test results for the analysis of small sample inserts (references 1 and 2). The test results cover two proposed analytical methods, a room temperature hydrofluoric acid preparation (Cold Chem) and a sodium peroxide/sodium hydroxide fusion modified for insert samples (Modified Fusion). The reports support implementation of the proposed small sample containers and analytical methods at DWPF. Hydragard sampler valve performance was typical of previous results (reference 3). Using an element from each major feed stream, lithium from the frit and iron from the sludge, the sampler was determined to deliver a uniform mixture in either sample container. The lithium to iron ratios were equivalent for the standard 15 ml vial and the 3 ml insert. The proposed methods provide analyses equivalent to the current methods. The biases associated with the proposed methods on a vitrified basis are less than 5% for major elements. The sum of oxides for the proposed method compares favorably with the sum of oxides for the conventional methods. However, the average sum of oxides for the Cold Chem method was 94.3%, which is below the minimum required recovery of 95%. Both proposed methods, Cold Chem and Modified Fusion, will initially be required to provide an accurate analysis which will routinely meet the 95% and 105% average sum of oxides limits for the Product Composition Control System (PCCS). Issues to be resolved during phased implementation are as follows: (1) determine the calcine/vitrification factor for radioactive feed; (2) evaluate the covariance matrix change against process operating ranges to determine optimum sample size; (3) evaluate sources for the low sum of oxides; and (4) improve remote operability of production versions of equipment and instruments for installation in 221-S. The specifics of
Crown, William H
2014-02-01
This paper examines the use of propensity score matching in economic analyses of observational data. Several excellent papers have previously reviewed practical aspects of propensity score estimation and other aspects of the propensity score literature. The purpose of this paper is to compare the conceptual foundation of propensity score models with alternative estimators of treatment effects. References are provided to empirical comparisons among methods that have appeared in the literature. These comparisons are available for a subset of the methods considered in this paper. However, in some cases, no pairwise comparisons of particular methods are yet available, and there are no examples of comparisons across all of the methods surveyed here. Irrespective of the availability of empirical comparisons, the goal of this paper is to provide some intuition about the relative merits of alternative estimators in health economic evaluations where nonlinearity, sample size, availability of pre/post data, heterogeneity, and missing variables can have important implications for choice of methodology. Also considered is the potential combination of propensity score matching with alternative methods such as differences-in-differences and decomposition methods that have not yet appeared in the empirical literature.
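As a toy illustration of the propensity score matching idea discussed above (entirely synthetic data, a hand-rolled one-covariate logistic fit, and 1:1 nearest-neighbor matching with replacement, rather than any specific package or the paper's own examples), the sketch below shows matching reducing confounding bias relative to a naive comparison of group means:

```python
import math
import random

def fit_logistic(x, t, lr=0.1, iters=2000):
    """One-covariate logistic regression (treatment ~ x) fit by gradient ascent."""
    b0 = b1 = 0.0
    for _ in range(iters):
        g0 = g1 = 0.0
        for xi, ti in zip(x, t):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * xi)))
            g0 += ti - p
            g1 += (ti - p) * xi
        b0 += lr * g0 / len(x)
        b1 += lr * g1 / len(x)
    return b0, b1

random.seed(1)
# Simulated observational data: x drives both treatment assignment and outcome
x = [random.gauss(0, 1) for _ in range(600)]
t = [1 if random.random() < 1 / (1 + math.exp(-xi)) else 0 for xi in x]
y = [2.0 * ti + 1.5 * xi + random.gauss(0, 1) for xi, ti in zip(x, t)]  # true effect = 2

b0, b1 = fit_logistic(x, t)
ps = [1.0 / (1.0 + math.exp(-(b0 + b1 * xi))) for xi in x]

# 1:1 nearest-neighbor matching on the propensity score, with replacement
controls = [i for i, ti in enumerate(t) if ti == 0]
att_diffs = []
for i, ti in enumerate(t):
    if ti == 1:
        j = min(controls, key=lambda c: abs(ps[c] - ps[i]))
        att_diffs.append(y[i] - y[j])
att = sum(att_diffs) / len(att_diffs)

naive = (sum(yi for yi, ti in zip(y, t) if ti) / sum(t)
         - sum(yi for yi, ti in zip(y, t) if not ti) / (len(t) - sum(t)))
print(round(naive, 2), round(att, 2))  # naive estimate is confounded upward; ATT is closer to 2
```

The naive difference in means absorbs the confounding through x, while matching on the estimated propensity score recovers an estimate near the true treatment effect; the alternative estimators surveyed in the paper (e.g., differences-in-differences) address different failure modes.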
A Simulation Investigation of Principal Component Regression.
Allen, David E.
Regression analysis is one of the more common analytic tools used by researchers. However, multicollinearity between the predictor variables can cause problems in using the results of regression analyses. Problems associated with multicollinearity include entanglement of relative influences of variables due to reduced precision of estimation,…
Stokman, M A; Spijkervet, F K L; Boezen, H M; Schouten, J.P.; Roodenburg, J L N; de Vries, E. G. E.
The aim of these meta-analyses was to evaluate the effectiveness of interventions for the prevention of oral mucositis in cancer patients treated with head and neck radiotherapy and/or chemotherapy, with a focus on randomized clinical trials. A literature search was performed for reports of
Influence of assessment setting on the results of functional analyses of problem behavior
Lang, R.B.; Sigafoos, J.; Lancioni, G.E.; Didden, H.C.M.; Rispoli, M.
2010-01-01
Analogue functional analyses are widely used to identify the operant function of problem behavior in individuals with developmental disabilities. Because problem behavior often occurs across multiple settings (e.g., homes, schools, outpatient clinics), it is important to determine whether the
Compilation and analyses of results from cross-hole tracer tests with conservative tracers
Energy Technology Data Exchange (ETDEWEB)
Hjerne, Calle; Nordqvist, Rune; Harrstroem, Johan (Geosigma AB (Sweden))
2010-09-15
Radionuclide transport in hydrogeological formations is one of the key factors for the safety analysis of a future repository of nuclear waste. Tracer tests have therefore been an important field method within the SKB investigation programmes at several sites since the late 1970's. This report presents a compilation and analyses of results from cross-hole tracer tests with conservative tracers performed within various SKB investigations. The objectives of the study are to facilitate, improve and reduce uncertainties in predictive tracer modelling and to provide supporting information for SKB's safety assessment of a final repository of nuclear waste. More specifically, the focus of the report is the relationship between the tracer mean residence time and fracture hydraulic parameters, i.e. the relationship between mass balance aperture and fracture transmissivity, hydraulic diffusivity and apparent storativity. For 74 different combinations of pumping and injection section at six different test sites (Studsvik, Stripa, Finnsjoen, Aespoe, Forsmark, Laxemar), estimates of mass balance aperture from cross-hole tracer tests as well as transmissivity were extracted from reports or in the SKB database Sicada. For 28 of these combinations of pumping and injection section, estimates of hydraulic diffusivity and apparent storativity from hydraulic interference tests were also found. An empirical relationship between mass balance aperture and transmissivity was estimated, although some uncertainties for individual data exist. The empirical relationship between mass balance aperture and transmissivity presented in this study deviates considerably from other previously suggested relationships, such as the cubic law and transport aperture as suggested by /Dershowitz and Klise 2002/, /Dershowitz et al. 2002/ and /Dershowitz et al. 2003/, which also is discussed in this report. No clear and direct empirical relationship between mass balance aperture and hydraulic
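For reference, the cubic law mentioned above relates fracture transmissivity to a hydraulic aperture in closed form, b = (12 mu T / (rho g))^(1/3). The sketch below (illustrative water properties assumed; this is the generic cubic law, not SKB's mass-balance-aperture relationship) inverts it for a given transmissivity:

```python
def cubic_law_aperture(T, mu=1.0e-3, rho=998.0, g=9.81):
    """Hydraulic (cubic-law) aperture b [m] from fracture transmissivity T [m^2/s],
    assuming water viscosity mu [Pa*s] and density rho [kg/m^3] near 20 deg C."""
    return (12.0 * mu * T / (rho * g)) ** (1.0 / 3.0)

# e.g. a fracture with transmissivity T = 1e-6 m^2/s
b = cubic_law_aperture(1.0e-6)
print(f"{b * 1e6:.0f} micrometres")  # → 107 micrometres
```

The report's empirical finding is precisely that mass balance apertures inferred from tracer tests deviate considerably from this cubic-law value.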
Stricker, L. T.
1975-01-01
The LOVES computer program was employed to analyze the geosynchronous portion of NASA's 1973 automated satellite mission model from 1980 to 1990. The objectives of the analyses were: (1) to demonstrate the capability of the LOVES code to provide the depth and accuracy of data required to support the analyses; and (2) to trade off the concept of space servicing automated satellites composed of replaceable modules against the concept of replacing expendable satellites upon failure. The computer code proved to be an invaluable tool in analyzing the logistic requirements of the various test cases required in the tradeoff. The results indicate that the concept of space servicing offers the potential for substantial savings in the cost of operating automated satellite systems.
Directory of Open Access Journals (Sweden)
Bruce B.W. Phiri
2016-06-01
Full Text Available Schistosomiasis and soil-transmitted helminth (STH) infections constitute a major public health problem in many parts of sub-Saharan Africa. In areas where the prevalence of geo-helminths and schistosomes is high, co-infection with multiple parasite species is common, resulting in a disproportionately elevated burden compared with single infections. Determining the risk factors of co-infection intensity is important for the better design of targeted interventions. In this paper, we examined risk factors of hookworm and S. haematobium co-infection intensity in Chikwawa district, southern Malawi in 2005, using bivariate count models. Results show that hookworm and S. haematobium infections were highly localised, with a small proportion of individuals harbouring more parasites, especially among school-aged children. The risk of co-intensity with both hookworm and S. haematobium was high for all ages, although this diminished with increasing age, and increased with fishing (hookworm: coef. = 12.29, 95% CI = 11.50–13.09; S. haematobium: coef. = 0.040, 95% CI = 0.0037–3.832). Both infections were abundant in those with primary education (hookworm: coef. = 0.072, 95% CI = 0.056–0.401; S. haematobium: coef. = 0.286, 95% CI = 0.034–0.538). However, a much lower risk was observed for those who were farmers (hookworm: coef. = −0.349, 95% CI = −0.547 to −0.150; S. haematobium: coef. = −0.239, 95% CI = −0.406 to −0.072). In conclusion, our findings suggest that efforts to control helminth infection should be co-integrated and health promotion campaigns should be aimed at school-going children and adults who are in constant contact with water.
Regression analysis with categorized regression calibrated exposure: some interesting findings
Directory of Open Access Journals (Sweden)
Hjartåker Anette
2006-07-01
Full Text Available Abstract Background Regression calibration as a method for handling measurement error is becoming increasingly well known and used in epidemiologic research. However, the standard version of the method is not appropriate for exposure analyzed on a categorical (e.g., quintile) scale, an approach commonly used in epidemiologic studies. A tempting solution could then be to use the predicted continuous exposure obtained through the regression calibration method and treat it as an approximation to the true exposure, that is, include the categorized calibrated exposure in the main regression analysis. Methods We use semi-analytical calculations and simulations to evaluate the performance of the proposed approach compared to the naive approach of not correcting for measurement error, in situations where analyses are performed on the quintile scale and when incorporating the original scale into the categorical variables, respectively. We also present analyses of real data, containing measures of folate intake and depression, from the Norwegian Women and Cancer study (NOWAC). Results In cases where extra information is available through replicated measurements and not validation data, regression calibration does not maintain important qualities of the true exposure distribution, and thus estimates of variance and percentiles can be severely biased. We show that the outlined approach maintains much, in some cases all, of the misclassification found in the observed exposure. For that reason, regression analysis with the corrected variable included on a categorical scale is still biased. In some cases the corrected estimates are analytically equal to those obtained by the naive approach. Regression calibration is, however, vastly superior to the naive method when applying the medians of each category in the analysis. Conclusion Regression calibration in its most well-known form is not appropriate for measurement error correction when the exposure is analyzed on a
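The abstract's central point, that categorizing a regression-calibrated exposure preserves the misclassification of the observed exposure, follows because calibration is a monotone (linear) shrinkage of the observed value, so its quintiles coincide exactly with the quintiles of the observed measurement. A sketch (synthetic data with classical measurement error; for simplicity the reliability ratio is taken from the simulation truth, where in practice it would be estimated from replicate measurements):

```python
import random
import statistics

random.seed(7)
n = 5000
x = [random.gauss(0, 1) for _ in range(n)]    # true exposure
w = [xi + random.gauss(0, 1) for xi in x]     # observed exposure with classical error

# Regression calibration: E[X|W] = mu + lambda * (W - mu), where
# lambda = var(X) / var(W) is the reliability ratio (~0.5 here)
lam = statistics.variance(x) / statistics.variance(w)
mu = statistics.mean(w)
x_cal = [mu + lam * (wi - mu) for wi in w]

def quintile(values):
    """Assign each value to a quintile (0..4) of its own distribution."""
    s = sorted(values)
    cuts = [s[int(k * len(s) / 5)] for k in range(1, 5)]
    return [sum(v >= c for c in cuts) for v in values]

q_true, q_obs, q_cal = quintile(x), quintile(w), quintile(x_cal)
agree_obs = sum(a == b for a, b in zip(q_true, q_obs)) / n
agree_cal = sum(a == b for a, b in zip(q_true, q_cal)) / n
print(round(agree_obs, 3), round(agree_cal, 3))  # identical: calibration does not repair the ranking
```

Because the calibrated exposure is a monotone transform of the observed one, the quintile assignments (and hence the misclassification against the true quintiles) are exactly the same for both, which is the phenomenon the paper analyzes.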
Better Autologistic Regression
Directory of Open Access Journals (Sweden)
Mark A. Wolters
2017-11-01
Full Text Available Autologistic regression is an important probability model for dichotomous random variables observed along with covariate information. It has been used in various fields for analyzing binary data possessing spatial or network structure. The model can be viewed as an extension of the autologistic model (also known as the Ising model, quadratic exponential binary distribution, or Boltzmann machine) to include covariates. It can also be viewed as an extension of logistic regression to handle responses that are not independent. Not all authors use exactly the same form of the autologistic regression model. Variations of the model differ in two respects. First, the variable coding (the two numbers used to represent the two possible states of the variables) might differ. Common coding choices are (zero, one) and (minus one, plus one). Second, the model might appear in either of two algebraic forms: a standard form, or a recently proposed centered form. Little attention has been paid to the effect of these differences, and the literature shows ambiguity about their importance. It is shown here that changes to either coding or centering in fact produce distinct, non-nested probability models. Theoretical results, numerical studies, and analysis of an ecological data set all show that the differences among the models can be large and practically significant. Understanding the nature of the differences and making appropriate modeling choices can lead to significantly improved autologistic regression analyses. The results strongly suggest that the standard model with plus/minus coding, which we call the symmetric autologistic model, is the most natural choice among the autologistic variants.
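To see concretely that the coding choice changes the model, the sketch below (an illustrative standard-form conditional probability, not code from the paper) evaluates P(z_i = high | neighbors) under the (0, 1) and (-1, +1) codings with the same nominal parameters and the same neighborhood configuration, and obtains different probabilities:

```python
import math

def cond_prob_high(unary, lam, neighbor_vals, coding):
    """P(z_i = high | neighbors) in a standard-form autologistic model under a
    given variable coding; `coding` is the (low, high) pair, e.g. (0, 1) or (-1, 1)."""
    lo, hi = coding
    s = unary + lam * sum(neighbor_vals)  # unary covariate term plus pairwise term
    return math.exp(hi * s) / (math.exp(lo * s) + math.exp(hi * s))

# Same nominal parameters, same neighborhood state (two neighbors high, two low),
# expressed in each coding:
p01 = cond_prob_high(unary=0.5, lam=1.0, neighbor_vals=[1, 1, 0, 0], coding=(0, 1))
pm1 = cond_prob_high(unary=0.5, lam=1.0, neighbor_vals=[1, 1, -1, -1], coding=(-1, 1))
print(round(p01, 3), round(pm1, 3))  # → 0.924 0.731
```

The two codings assign different conditional probabilities to the same physical configuration, which is one face of the paper's result that the coding variants are distinct, non-nested models.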
Khankari, Nikhil K.; Shu, Xiao-Ou; Wen, Wanqing; Kraft, Peter; Lindström, Sara; Peters, Ulrike; Schildkraut, Joellen; Schumacher, Fredrick; Bofetta, Paolo; Risch, Angela; Bickeböller, Heike; Amos, Christopher I.; Easton, Douglas; Gruber, Stephen B.; Haiman, Christopher A.; Hunter, David J.; Chanock, Stephen J.; Pierce, Brandon L.; Zheng, Wei
2016-01-01
Background Observational studies examining associations between adult height and risk of colorectal, prostate, and lung cancers have generated mixed results. We conducted meta-analyses using data from prospective cohort studies and further carried out Mendelian randomization analyses, using height-associated genetic variants identified in a genome-wide association study (GWAS), to evaluate the association of adult height with these cancers. Methods and Findings A systematic review of prospective studies was conducted using the PubMed, Embase, and Web of Science databases. Using meta-analyses, results obtained from 62 studies were summarized for the association of a 10-cm increase in height with cancer risk. Mendelian randomization analyses were conducted using summary statistics obtained for 423 genetic variants identified from a recent GWAS of adult height and from a cancer genetics consortium study of multiple cancers that included 47,800 cases and 81,353 controls. For a 10-cm increase in height, the summary relative risks derived from the meta-analyses of prospective studies were 1.12 (95% CI 1.10, 1.15), 1.07 (95% CI 1.05, 1.10), and 1.06 (95% CI 1.02, 1.11) for colorectal, prostate, and lung cancers, respectively. Mendelian randomization analyses showed increased risks of colorectal (odds ratio [OR] = 1.58, 95% CI 1.14, 2.18) and lung cancer (OR = 1.10, 95% CI 1.00, 1.22) associated with each 10-cm increase in genetically predicted height. No association was observed for prostate cancer (OR = 1.03, 95% CI 0.92, 1.15). Our meta-analysis was limited to published studies. The sample size for the Mendelian randomization analysis of colorectal cancer was relatively small, thus affecting the precision of the point estimate. Conclusions Our study provides evidence for a potential causal association of adult height with the risk of colorectal and lung cancers and suggests that certain genetic factors and biological pathways affecting adult height may also affect the
[Inguinal hernia repair: results of randomized clinical trials and meta-analyses].
Slim, K; Vons, C
2008-01-01
This evidence-based review of the literature aims to answer two questions regarding inguinal hernia repair: 1. Should a prosthetic patch be used routinely? 2. Which approach is better - laparoscopic or open surgery? After a comprehensive search of electronic databases we retained only meta-analyses (n=14) and/or randomised clinical trials (n=4). Review of this literature suggests with a good level of evidence that prosthetic hernia repair is the gold standard; the laparoscopic approach has very few proven benefits and may involve more serious complications when performed outside expert centers. The role of laparoscopy for the repair of bilateral or recurrent hernias needs better evaluation.
Regression analysis by example
Chatterjee, Samprit
2012-01-01
Praise for the Fourth Edition: "This book is . . . an excellent source of examples for regression analysis. It has been and still is readily readable and understandable." -Journal of the American Statistical Association Regression analysis is a conceptually simple method for investigating relationships among variables. Carrying out a successful application of regression analysis, however, requires a balance of theoretical results, empirical rules, and subjective judgment. Regression Analysis by Example, Fifth Edition has been expanded
Lasaponara, Rosa; Masini, Nicola
2014-05-01
within Basilicata and Puglia Region, southern Patagonia and Payunia-Campo Volcanicos Liancanelo e PayunMatru respectively, in Italy and Argentina. We focused our attention on diverse surfaces and soil types in different periods of the year in order to assess the capabilities of both optical and radar data to detect archaeological marks in different ecosystems and seasons. We investigated not only crop culture during the "favourable vegetative period" to enhance the presence of subsurface remains but also the "spectral response" of spontaneous, sparse herbaceous covers during periods considered and expected to be less favourable (as for example summer and winter) for this type of investigation. The main interesting result was the capability of radar (COSMO-SkyMed) and multispectral optical satellite data (Pleiades, QuickBird, GeoEye) to highlight the presence of structures below the surface even (i) during periods of the year generally considered not suitable for crop mark investigations and even (ii) in areas covered only by sparse, spontaneous herbaceous plants, in several test sites investigated in both the Argentine and Italian areas of interest. Preliminary results from both the Italian and Argentine sites pointed out that Earth Observation (EO) technology can be successfully used for extracting useful information on traces of past human activities still fossilized in the modern landscape in different ecosystems and seasons. Moreover, the multitemporal analyses of satellite data can be fruitfully applied to: (i) improve knowledge, (ii) support monitoring of natural and cultural sites, (iii) assess natural and man-made risks including emerging threats to the heritage sites. References Lasaponara R, N Masini 2009 Full-waveform Airborne Laser Scanning for the detection of medieval archaeological microtopographic relief Journal of Cultural Heritage 10, e78-e82 Ciminale M, D Gallo, R Lasaponara, N Masini 2009 A multiscale approach for reconstructing archaeological
Matson, Johnny L.; Kozlowski, Alison M.
2010-01-01
Autistic regression is one of the many mysteries in the developmental course of autism and pervasive developmental disorders not otherwise specified (PDD-NOS). Various definitions of this phenomenon have been used, further clouding the study of the topic. Despite this problem, some efforts at establishing prevalence have been made. The purpose of…
Thiese, Matthew S; Hegmann, Kurt T; Kapellusch, Jay; Merryweather, Andrew; Bao, Stephen; Silverstein, Barbara; Tang, Ruoliang; Garg, Arun
2016-06-01
The goal is to assess the relationships between psychosocial factors and both medial and lateral epicondylitis after adjustment for personal and job physical exposures. One thousand eight hundred twenty-four participants were included in pooled analyses. Ten psychosocial factors were assessed. One hundred twenty-one (6.6%) and 34 (1.9%) participants had lateral and medial epicondylitis, respectively. Nine of the psychosocial factors assessed had significant trends or associations with lateral epicondylitis, the largest of which was between physical exhaustion after work and lateral epicondylitis, with an odds ratio of 7.04 (95% confidence interval = 2.02 to 24.51). Eight psychosocial factors had significant trends or relationships with medial epicondylitis, the largest being with mental exhaustion after work, with an odds ratio of 6.51 (95% confidence interval = 1.57 to 27.04). The breadth and strength of these associations after adjustment for confounding factors demonstrate meaningful relationships that need to be further investigated in prospective analyses.
Freund, Rudolf J; Sa, Ping
2006-01-01
The book provides complete coverage of the classical methods of statistical analysis. It is designed to give students an understanding of the purpose of statistical analyses, to allow the student to determine, at least to some degree, the correct type of statistical analyses to be performed in a given situation, and have some appreciation of what constitutes good experimental design
Directory of Open Access Journals (Sweden)
Jason W. Osborne
2012-06-01
Full Text Available Logistic regression is slowly gaining acceptance in the social sciences, and fills an important niche in the researcher's toolkit: being able to predict important outcomes that are not continuous in nature. While OLS regression is a valuable tool, it cannot routinely be used to predict outcomes that are binary or categorical in nature. These outcomes represent important social science lines of research: retention in, or dropout from school, using illicit drugs, underage alcohol consumption, antisocial behavior, purchasing decisions, voting patterns, risky behavior, and so on. The goal of this paper is to briefly lead the reader through the surprisingly simple mathematics that underpins logistic regression: probabilities, odds, odds ratios, and logits. Anyone with spreadsheet software or a scientific calculator can follow along, and in turn, this knowledge can be used to make much more interesting, clear, and accurate presentations of results (especially to non-technical audiences). In particular, I will share an example of an interaction in logistic regression, how it was originally graphed, and how the graph was made substantially more user-friendly by converting the original metric (logits) to a more readily interpretable metric (probability) through three simple steps.
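The logit-to-probability arithmetic the abstract walks through (probabilities, odds, logits) can be sketched in a few lines. The coefficients and variable names below are purely illustrative assumptions, not values from the paper:

```python
import math

def logit_to_probability(logit):
    """Convert a logit (log-odds) to a probability via the inverse logit."""
    odds = math.exp(logit)          # odds = e^logit
    return odds / (1.0 + odds)      # probability = odds / (1 + odds)

# Hypothetical fitted model with an interaction:
# logit(outcome) = b0 + b1*risk + b2*support + b3*(risk*support)
b0, b1, b2, b3 = -2.0, 0.8, -0.5, 0.6   # made-up coefficients for illustration

for risk in (0, 1):
    for support in (0, 1):
        logit = b0 + b1 * risk + b2 * support + b3 * risk * support
        p = logit_to_probability(logit)
        print(f"risk={risk} support={support}  logit={logit:+.2f}  p={p:.3f}")
```

Exponentiating a single coefficient (e.g. `math.exp(b1)`) gives the odds ratio for a one-unit change in that predictor; plotting `p` rather than `logit` across groups is exactly the reader-friendly conversion the paper advocates.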
Essential results of analyses accompanying the leak rate experiments E22 at HDR
International Nuclear Information System (INIS)
Grebner, H.; Hoefler, A.; Hunger, H.
1994-01-01
Under the E22 test group of phase III of the HDR safety programme, experiments were performed on the crack opening and leak rate behaviour of pipe components of smaller nominal bores. The experiments were complemented by computations, in particular verifications, to qualify the computation models as one of the main aims of the HDR safety programme. Most of the analyses to determine crack openings were performed by means of the finite-element method, including elastic-plastic materials behaviour and, complementarily, assessing engineering methods. The leak rate was calculated by means of separate 2-phase computation models. Altogether, it may be concluded from the structural and fracture mechanical experiments with pipes, elbows and branch pieces, that crack openings and incipient cracks at loading with internal pressure or bending moment can be described with good accuracy by means of the finite-element programme ADINA and the developed FE-models. (orig.) [de
The number and choice of muscles impact the results of muscle synergy analyses
Directory of Open Access Journals (Sweden)
Katherine Muterspaugh Steele
2013-08-01
Full Text Available One theory for how humans control movement is that muscles are activated in weighted groups or synergies. Studies have shown that electromyography (EMG) from a variety of tasks can be described by a low-dimensional space thought to reflect synergies. These studies use algorithms, such as nonnegative matrix factorization, to identify synergies from EMG. Due to experimental constraints, EMG can rarely be taken from all muscles involved in a task. However, it is unclear if the choice of muscles included in the analysis impacts estimated synergies. The aim of our study was to evaluate the impact of the number and choice of muscles on synergy analyses. We used a musculoskeletal model to calculate muscle activations required to perform an isometric upper-extremity task. Synergies calculated from the activations from the musculoskeletal model were similar to a prior experimental study. To evaluate the impact of the number of muscles included in the analysis, we randomly selected subsets of between 5 and 29 muscles and compared the similarity of the synergies calculated from each subset to a master set of synergies calculated from all muscles. We determined that the structure of synergies is dependent upon the number and choice of muscles included in the analysis. When five muscles were included in the analysis, the similarity of the synergies to the master set was only 0.57 ± 0.54; however, the similarity improved to over 0.8 with more than ten muscles. We identified two methods, selecting dominant muscles from the master set or selecting muscles with the largest maximum isometric force, which significantly improved similarity to the master set and can help guide future experimental design. Analyses that included a small subset of muscles also over-estimated the variance accounted for (VAF) by the synergies compared to an analysis with all muscles. Thus, researchers should use caution using VAF to evaluate synergies when EMG is measured from a small
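The synergy extraction and VAF computation described above can be sketched with a basic nonnegative matrix factorization (multiplicative updates). The simulated "EMG" matrix, dimensions, and iteration counts below are assumptions for illustration, not the study's musculoskeletal-model data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate "EMG": 3 underlying synergies mixed across 20 muscles, 200 time samples.
true_W = rng.random((20, 3))        # muscle weightings (muscles x synergies)
true_H = rng.random((3, 200))       # synergy activations (synergies x time)
emg = true_W @ true_H               # nonnegative data matrix (muscles x time)

def nmf_vaf(V, k, iters=500, seed=0):
    """Factor V ~= W @ H with Lee-Seung multiplicative updates and return
    the variance accounted for (VAF) by the rank-k reconstruction."""
    r = np.random.default_rng(seed)
    W = r.random((V.shape[0], k)) + 1e-6
    H = r.random((k, V.shape[1])) + 1e-6
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + 1e-12)
        W *= (V @ H.T) / (W @ H @ H.T + 1e-12)
    resid = V - W @ H
    return 1.0 - np.sum(resid**2) / np.sum(V**2)

# VAF depends on how many muscles (rows) enter the analysis, echoing the study:
vaf_all = nmf_vaf(emg, 3)       # all 20 muscles
vaf_sub = nmf_vaf(emg[:5], 3)   # only 5 muscles included
print(f"VAF (20 muscles): {vaf_all:.3f}; VAF (5-muscle subset): {vaf_sub:.3f}")
```

Comparing VAF curves across muscle subsets in this fashion is one way to reproduce the paper's caution that small subsets can inflate apparent VAF.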
Dark Energy Survey Year 1 Results: Multi-Probe Methodology and Simulated Likelihood Analyses
Energy Technology Data Exchange (ETDEWEB)
Krause, E.; et al.
2017-06-28
We present the methodology for and detail the implementation of the Dark Energy Survey (DES) 3x2pt DES Year 1 (Y1) analysis, which combines configuration-space two-point statistics from three different cosmological probes: cosmic shear, galaxy-galaxy lensing, and galaxy clustering, using data from the first year of DES observations. We have developed two independent modeling pipelines and describe the code validation process. We derive expressions for analytical real-space multi-probe covariances, and describe their validation with numerical simulations. We stress-test the inference pipelines in simulated likelihood analyses that vary 6-7 cosmology parameters plus 20 nuisance parameters and precisely resemble the analysis to be presented in the DES 3x2pt analysis paper, using a variety of simulated input data vectors with varying assumptions. We find that any disagreement between pipelines leads to changes in assigned likelihood $\\Delta \\chi^2 \\le 0.045$ with respect to the statistical error of the DES Y1 data vector. We also find that angular binning and survey mask do not impact our analytic covariance at a significant level. We determine lower bounds on scales used for analysis of galaxy clustering (8 Mpc$~h^{-1}$) and galaxy-galaxy lensing (12 Mpc$~h^{-1}$) such that the impact of modeling uncertainties in the non-linear regime is well below statistical errors, and show that our analysis choices are robust against a variety of systematics. These tests demonstrate that we have a robust analysis pipeline that yields unbiased cosmological parameter inferences for the flagship 3x2pt DES Y1 analysis. We emphasize that the level of independent code development and subsequent code comparison as demonstrated in this paper is necessary to produce credible constraints from increasingly complex multi-probe analyses of current data.
Olive, David J
2017-01-01
This text covers both multiple linear regression and some experimental design models. The text uses the response plot to visualize the model and to detect outliers, does not assume that the error distribution has a known parametric distribution, develops prediction intervals that work when the error distribution is unknown, suggests bootstrap hypothesis tests that may be useful for inference after variable selection, and develops prediction regions and large sample theory for the multivariate linear regression model that has m response variables. A relationship between multivariate prediction regions and confidence regions provides a simple way to bootstrap confidence regions. These confidence regions often provide a practical method for testing hypotheses. There is also a chapter on generalized linear models and generalized additive models. There are many R functions to produce response and residual plots, to simulate prediction intervals and hypothesis tests, to detect outliers, and to choose response trans...
Lim, H. S.; Verzwyvelt, S. A.
1989-01-01
KOH concentration effects on cycle life of a Ni/H2 cell have been studied by carrying out a cycle life test of ten Ni/H2 boiler plate cells which contain electrolytes of various KOH concentrations. Failure analyses of these cells were carried out after completion of the life test which accumulated up to 40,000 cycles at an 80 percent depth of discharge over a period of 3.7 years. These failure analyses included studies on changes of electrical characteristics of test cells and component analyses after disassembly of the cell. The component analyses included visual inspections, dimensional changes, capacity measurements of nickel electrodes, scanning electron microscopy, BET surface area measurements, and chemical analyses. Results have indicated that failure mode and change in the nickel electrode varied as the concentration was varied, especially, when the concentration was changed from 31 percent or higher to 26 percent or lower.
Oles, Sylwia K; Fukui, Sadaaki; Rand, Kevin L; Salyers, Michelle P
2015-08-30
Hope (goal-directed thinking) and patient activation (knowledge and skills to manage one's illness) are both important in managing chronic conditions like schizophrenia. The relationship between hope and patient activation has not been clearly defined. However, hope may be viewed as a foundational, motivating factor that can lead to greater involvement in care and feelings of efficacy. The purpose of the present study was to understand the prospective relationship between hope and patient activation in a sample of adults with schizophrenia (N=118). This study was a secondary data analysis from a study on Illness Management and Recovery (IMR) - a curriculum-based approach to schizophrenia self-management. Data were collected at baseline (prior to any intervention), and at 9 and 18-month follow-up. As predicted, hope and patient activation were significantly related with each other, showing large positive concurrent correlations. Demographics and background characteristics were not significantly related to patient activation or hope. Longitudinal analyses found no specific directional effect, yet suggested that hope and patient activation mutually influence each other over time. Our findings add flexibility in designing recovery-based interventions - fostering hope may not be a pre-requisite for activating consumers to be more involved in their own care. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Koopman, S.J.; Lit, R.
2015-01-01
Summary: We develop a statistical model for the analysis and forecasting of football match results which assumes a bivariate Poisson distribution with intensity coefficients that change stochastically over time. The dynamic model is a novelty in the statistical time series analysis of match results
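A static simplification of the match-result model described above is the classic bivariate Poisson construction with a shared component (the paper's intensities additionally evolve stochastically over time; the intensity values below are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(42)

def bivariate_poisson(lam1, lam2, lam3, size):
    """Draw (home, away) goal counts from a bivariate Poisson:
    X = A + C, Y = B + C with A~Pois(lam1), B~Pois(lam2), shared C~Pois(lam3)."""
    a = rng.poisson(lam1, size)
    b = rng.poisson(lam2, size)
    c = rng.poisson(lam3, size)   # shared component induces positive correlation
    return a + c, b + c

home, away = bivariate_poisson(1.4, 1.1, 0.2, 100_000)
print("mean home goals:", home.mean())          # ~ lam1 + lam3 = 1.6
print("mean away goals:", away.mean())          # ~ lam2 + lam3 = 1.3
print("covariance:", np.cov(home, away)[0, 1])  # ~ lam3 = 0.2
```

The shared-component parameter lam3 is what lets the model capture the positive dependence between the two teams' scores that an independent-Poisson model misses.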
Bhamidipati, Ravi Kanth; Syed, Muzeeb; Mullangi, Ramesh; Srinivas, Nuggehally
2018-02-01
1. Dalbavancin, a lipoglycopeptide, is approved for treating gram-positive bacterial infections. The area under the plasma concentration versus time curve (AUCinf) of dalbavancin is a key parameter, and the AUCinf/MIC ratio is a critical pharmacodynamic marker. 2. Using the end-of-intravenous-infusion concentration (i.e. Cmax), the Cmax versus AUCinf relationship for dalbavancin was established by regression analyses (i.e. linear, log-log, log-linear and power models) using 21 pairs of subject data. 3. Predictions of AUCinf were performed using published Cmax data by application of the regression equations. The quotient of observed/predicted values rendered the fold difference. The mean absolute error (MAE), root mean square error (RMSE) and correlation coefficient (r) were used in the assessment. 4. MAE and RMSE values for the various models were comparable. Cmax versus AUCinf exhibited excellent correlation (r > 0.9488). The internal data evaluation showed narrow confinement (0.84-1.14-fold difference); the models predicted AUCinf with a RMSE of 3.02-27.46%, with the fold difference largely contained within 0.64-1.48. 5. Regardless of the regression model, a single time point strategy using Cmax (i.e. end of the 30-min infusion) is amenable as a prospective tool for predicting AUCinf of dalbavancin in patients.
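The single-time-point idea (fit several regression models of AUCinf on Cmax, then judge them by RMSE and fold difference) can be sketched as follows. The (Cmax, AUCinf) pairs are invented for illustration and are not the paper's 21-subject dataset:

```python
import numpy as np

# Hypothetical (Cmax, AUCinf) pairs; units and values are illustrative only.
cmax = np.array([150., 200., 250., 300., 350., 400.])
auc  = np.array([5000., 6800., 8300., 10100., 11700., 13400.])

# Linear model: AUC = a + b*Cmax (ordinary least squares)
b_lin, a_lin = np.polyfit(cmax, auc, 1)

# Power model: AUC = a * Cmax^b, fit as a log-log linear regression
b_pow, log_a = np.polyfit(np.log(cmax), np.log(auc), 1)
a_pow = np.exp(log_a)

def rmse(pred, obs):
    return float(np.sqrt(np.mean((pred - obs) ** 2)))

pred_lin = a_lin + b_lin * cmax
pred_pow = a_pow * cmax ** b_pow
print(f"linear RMSE: {rmse(pred_lin, auc):.1f}, power RMSE: {rmse(pred_pow, auc):.1f}")

# Fold difference = observed / predicted, the quotient used in the assessment
fold = auc / pred_lin
print("fold difference range:", float(fold.min()), "-", float(fold.max()))
```

Swapping in log-linear or other model forms is a one-line change to the fitting step; comparing their RMSE and fold-difference spread mirrors the paper's model assessment.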
Lammers, Jeroen; Goossens, Ferry; Conrod, Patricia; Engels, Rutger; Wiers, Reinout W; Kleinjan, Marloes
2017-08-01
To explore whether specific groups of adolescents (i.e., scoring high on personality risk traits, having a lower education level, or being male) benefit more from the Preventure intervention with regard to curbing their drinking behaviour. A clustered randomized controlled trial, with participants randomly assigned to a 2-session coping skills intervention or a control no-intervention condition. Fifteen secondary schools throughout The Netherlands; 7 schools in the intervention and 8 schools in the control condition. 699 adolescents aged 13-15; 343 allocated to the intervention and 356 to the control condition; with drinking experience and elevated scores in either negative thinking, anxiety sensitivity, impulsivity or sensation seeking. Differential effectiveness of the Preventure program was examined for the personality traits group, education level and gender on past-month binge drinking (main outcome), binge frequency, alcohol use, alcohol frequency and problem drinking, at 12 months post-intervention. Preventure is a selective school-based alcohol prevention programme targeting personality risk factors. The comparator was a no-intervention control. Intervention effects were moderated by the personality traits group and by education level. More specifically, significant intervention effects were found on reducing alcohol use within the anxiety sensitivity group (OR=2.14, CI=1.40, 3.29) and reducing binge drinking (OR=1.76, CI=1.38, 2.24) and binge drinking frequency (β=0.24, p=0.04) within the sensation seeking group at 12 months post-intervention. Also, lower educated young adolescents reduced binge drinking (OR=1.47, CI=1.14, 1.88), binge drinking frequency (β=0.25, p=0.04), alcohol use (OR=1.32, CI=1.06, 1.65) and alcohol use frequency (β=0.47, p=0.01), but not those in the higher education group. Post hoc latent-growth analyses revealed significant effects on the development of binge drinking (β=-0.19, p=0.02) and binge drinking frequency (β=-0.10, p=0
Energy Technology Data Exchange (ETDEWEB)
Falcnik, M; Brumovsky, M; Pav, T [Czech Nuclear Society, Prague (Czech Republic)
1994-12-31
In Czech and Slovak republics, six units of WWER 440/C type reactors are monitored by surveillance specimens programmes; the specimens are determined for static tensile testing, impact notch toughness testing and fracture toughness evaluation. Results of mechanical properties of these specimens after irradiation in intervals between 1 and 5 years of operation, are summarized and discussed with respect to the effect of individual heats and welded joints, radiation embrittlement, and annealing recovery. (authors). 3 refs., 11 figs., 2 tabs.
Soldati, A. L.; Beierlein, L.; Jacob, D. E.
2009-04-01
Freshwater mussels of the genus Diplodon (Bivalvia, Hyriidae) are the most abundant bivalve (today and in the past) in freshwater bodies at both sides of the South-Andean Cordillera. There are about 25 different Diplodon taxa in Argentina and Chile that could be assigned almost completely to the species Diplodon chilensis (Gray, 1828) and two subspecies: D. ch. chilensis and D. ch. patagonicus; this latter species is found in Argentina between Mendoza (32° 52' S; 68° 51' W) and Chubut (45° 51' S; 67° 28' W), including the lakes and rivers of the target area, the Nahuel Huapi National Park (Castellanos, 1960). Despite their wide geographic distribution, Diplodon species have only rarely been used as climate archives in the southern hemisphere. Kaandorp et al. (2005) demonstrated for Diplodon longulus (Conrad 1874) collected from the Peruvian Amazonas that oxygen isotopic patterns in the shells could be used in order to reconstruct the precipitation regime and dry/wet seasonality of the monsoonal system in Amazonia. Although this study demonstrated the potential of Diplodon in climatological and ecological reconstructions in the southern hemisphere, as of yet, no systematic study of Diplodon as a multi-proxy archive has been undertaken for the Patagonian region. In this work we present sclerochronological analyses supported by δ18Oshell in recent mussels of Diplodon chilensis patagonicus (D'Orbigny, 1835) collected at Laguna El Trébol (42°S-71°W, Patagonia Argentina), one of the best studied water bodies in the region for paleoclimate analysis. Water temperature was measured every six hours for one year using a temperature sensor (Starmon mini®) placed at 5 m depth in the lake, close to a mussel bank. Additionally, δ18Owater was measured monthly for the same time range. δ18Oshell values obtained by micro-milling at high spatial resolution in the growth increments of three Diplodon shells were compared to these records, and to air temperature and
Directory of Open Access Journals (Sweden)
Pinto João
2011-08-01
Full Text Available Abstract Background Anopheles gambiae M and S molecular forms, the major malaria vectors in the Afro-tropical region, are undergoing a process of ecological diversification and adaptive lineage splitting, which is affecting malaria transmission and vector control strategies in West Africa. These two incipient species are defined on the basis of single nucleotide differences in the IGS and ITS regions of multicopy rDNA located on the X-chromosome. A number of PCR and PCR-RFLP approaches based on form-specific SNPs in the IGS region are used for M and S identification. Moreover, a PCR method to detect the M-specific insertion of a short interspersed transposable element (SINE200) has recently been introduced as an alternative identification approach. However, a large-scale comparative analysis of four widely used PCR or PCR-RFLP genotyping methods for M and S identification was never carried out to evaluate whether they could be used interchangeably, as commonly assumed. Results The genotyping of more than 400 A. gambiae specimens from nine African countries, and the sequencing of the IGS-amplicon of 115 of them, highlighted discrepancies among results obtained by the different approaches due to different kinds of biases, which may result in an overestimation of M/S putative hybrids, as follows: (i) incorrect match of M and S specific primers used in the allele specific-PCR approach; (ii) presence of polymorphisms in the recognition sequence of restriction enzymes used in the PCR-RFLP approaches; (iii) incomplete cleavage during the restriction reactions; (iv) presence of different copy numbers of M and S-specific IGS-arrays in single individuals in areas of secondary contact between the two forms. Conclusions The results reveal that the PCR and PCR-RFLP approaches most commonly utilized to identify A. gambiae M and S forms are not fully interchangeable as usually assumed, and highlight limits of the actual definition of the two molecular forms, which might
Stacey, Peter; Butler, Owen
2008-06-01
This paper emphasizes the need for occupational hygiene professionals to require evidence of the quality of welding fume data from analytical laboratories. The measurement of metals in welding fume using atomic spectrometric techniques is a complex analysis often requiring specialist digestion procedures. The results from a trial programme testing the proficiency of laboratories in the Workplace Analysis Scheme for Proficiency (WASP) to measure potentially harmful metals in several different types of welding fume showed that most laboratories underestimated the mass of analyte on the filters. The average recovery was 70-80% of the target value, and more than 20% of reported recoveries for some of the more difficult welding fume trial filter samples fell well below the target value. Consistent rather than erratic error predominated, suggesting that the main analytical factor contributing to the differences between the target values and results was the effectiveness of the sample preparation procedures used by participating laboratories. It is concluded that, with practice and regular participation in WASP, performance can improve over time.
Water Use in Parabolic Trough Power Plants: Summary Results from WorleyParsons' Analyses
Energy Technology Data Exchange (ETDEWEB)
Turchi, C. S.; Wagner, M. J.; Kutscher, C. F.
2010-12-01
The National Renewable Energy Laboratory (NREL) contracted with WorleyParsons Group, Inc. to examine the effect of switching from evaporative cooling to alternative cooling systems on a nominal 100-MW parabolic trough concentrating solar power (CSP) plant. WorleyParsons analyzed 13 different cases spanning three different geographic locations (Daggett, California; Las Vegas, Nevada; and Alamosa, Colorado) to assess the performance, cost, and water use impacts of switching from wet to dry or hybrid cooling systems. NREL developed matching cases in its Solar Advisor Model (SAM) for each scenario to allow for hourly modeling and provide a comparison to the WorleyParsons results. Our findings indicate that switching from 100% wet to 100% dry cooling will result in levelized cost of electricity (LCOE) increases of approximately 3% to 8% for parabolic trough plants throughout most of the southwestern United States. In cooler, high-altitude areas like Colorado's San Luis Valley, WorleyParsons estimated the increase at only 2.5%, while SAM predicted a 4.4% difference. In all cases, the transition to dry cooling will reduce water consumption by over 90%. Utility time-of-delivery (TOD) schedules had similar impacts for wet- and dry-cooled plants, suggesting that TOD schedules have a relatively minor effect on the dry-cooling penalty.
Evaluating the Accuracy of Results for Teacher Implemented Trial-Based Functional Analyses.
Rispoli, Mandy; Ninci, Jennifer; Burke, Mack D; Zaini, Samar; Hatton, Heather; Sanchez, Lisa
2015-09-01
Trial-based functional analysis (TBFA) allows for the systematic and experimental assessment of challenging behavior in applied settings. The purposes of this study were to evaluate a professional development package focused on training three Head Start teachers to conduct TBFAs with fidelity during ongoing classroom routines. To assess the accuracy of the TBFA results, the effects of a function-based intervention derived from the TBFA were compared with the effects of a non-function-based intervention. Data were collected on child challenging behavior and appropriate communication. An A-B-A-C-D design was utilized in which A represented baseline, and B and C consisted of either function-based or non-function-based interventions counterbalanced across participants, and D represented teacher implementation of the most effective intervention. Results showed that the function-based intervention produced greater decreases in challenging behavior and greater increases in appropriate communication than the non-function-based intervention for all three children. © The Author(s) 2015.
Statistical analyses of fracture toughness results for two irradiated high-copper welds
International Nuclear Information System (INIS)
Nanstad, R.K.; McCabe, D.E.; Haggag, F.M.; Bowman, K.O.; Downing, D.J.
1990-01-01
The objectives of the Heavy-Section Steel Irradiation Program Fifth Irradiation Series were to determine the effects of neutron irradiation on the transition temperature shift and the shape of the K_Ic curve described in Sect. 6 of the ASME Boiler and Pressure Vessel Code. Two submerged-arc welds with copper contents of 0.23 and 0.31% were commercially fabricated in 215-mm-thick plates. Charpy V-notch (CVN) impact, tensile, drop-weight, and compact specimens up to 203.2 mm thick [1T, 2T, 4T, 6T, and 8T C(T)] were tested to provide a large data base for unirradiated material. Similar specimens with compacts up to 4T were irradiated at about 288 degrees C to a mean fluence of about 1.5 × 10^19 neutrons/cm^2 (>1 MeV) in the Oak Ridge Research Reactor. Both linear-elastic and elastic-plastic fracture mechanics methods were used to analyze all cleavage fracture results and local cleavage instabilities (pop-ins). Evaluation of the results showed that the cleavage fracture toughness values determined at initial pop-ins fall within the same scatter band as the values from failed specimens; thus, they were included in the data base for analysis (all data are designated K_Jc)
DEFF Research Database (Denmark)
Brok, J.; Thorlund, K.; Gluud, C.
2008-01-01
OBJECTIVES: To evaluate meta-analyses with trial sequential analysis (TSA). TSA adjusts for random error risk and provides the required number of participants (information size) in a meta-analysis. Meta-analyses not reaching information size are analyzed with trial sequential monitoring boundaries analogous to interim monitoring boundaries in a single trial. STUDY DESIGN AND SETTING: We applied TSA on meta-analyses performed in Cochrane Neonatal reviews. We calculated information sizes and monitoring boundaries with three different anticipated intervention effects of 30% relative risk reduction (TSA…) … in 80% (insufficient information size). TSA(15%) and TSA(LBHIS) found that 95% and 91% had absence of evidence. The remaining nonsignificant meta-analyses had evidence of lack of effect. CONCLUSION: TSA reveals insufficient information size and potentially false positive results in many meta-analyses.
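The required information size the abstract refers to can be approximated for a dichotomous outcome with a standard sample-size formula. The sketch below is a generic illustration, not the exact heterogeneity-adjusted formula used by the TSA software, and the control event rate of 10% is an assumed example value:

```python
from statistics import NormalDist

def information_size(p_control, rrr, alpha=0.05, power=0.80):
    """Approximate total participants required in a meta-analysis of a
    dichotomous outcome (generic two-arm formula; the TSA software
    additionally adjusts for heterogeneity)."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)   # two-sided alpha
    z_b = NormalDist().inv_cdf(power)
    p_exp = p_control * (1 - rrr)               # anticipated experimental event rate
    p_bar = (p_control + p_exp) / 2             # pooled event proportion
    n_per_arm = (z_a + z_b) ** 2 * 2 * p_bar * (1 - p_bar) / (p_control - p_exp) ** 2
    return 2 * n_per_arm                        # total across both arms

# 30% relative risk reduction from an assumed 10% control event rate
print(round(information_size(0.10, 0.30)))
```

A meta-analysis whose accrued participants fall short of this number would, under this scheme, be judged against trial sequential monitoring boundaries rather than conventional significance thresholds.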
The Evaluation of Real Time Milk Analyse Result Reliability in the Czech Republic
Directory of Open Access Journals (Sweden)
Oto Hanuš
2016-01-01
The good reliability of regular analyses of milk composition could improve the health monitoring of dairy cows and herd management. The aim of this study was to analyse the measurement abilities and properties of the RT (real-time) system (AfiLab = AfiMilk NIR measurement unit (near-infrared spectroscopy) and electrical conductivity (C) of milk by conductometry, with AfiFarm calibration and interpretation software) for the analysis of individual milk samples (IMSs). There were 2 × 30 IMSs in the experiment. The reference values (RVs) of milk components and properties (fat (F), proteins (P), lactose (L), C and the somatic cell count (SCC)) were determined by conventional direct and indirect methods: conductometry (C); infrared spectroscopy, (1) with filter technology and (2) with Fourier transformation (F, P, L); and somatic cell counting, (1) fluoro-opto-electronic counting in the film on a rotating disc and (2) flow cytometry (SCC). As expected, the AfiLab (alternative) method showed less close relationships to the RVs than the relationships among the reference methods. However, these relationships (r) were mostly significant: F from .597 to .738 (P ≤ 0.01 and ≤ 0.001); P from .284 to .787 (P > 0.05 and P ≤ 0.001); C .773 (P ≤ 0.001). Correlations (r) were not significant (P > 0.05) for L (from −.013 to .194) and SCC (from −.148 to −.133). Variability of the RVs explained the following percentages of variability in AfiLab results: F up to 54.4 %; P up to 61.9 %; L only 3.8 %; C up to 59.7 %. The explanatory power (reliability) of AfiLab results for an animal increases with the regularity of measurement (the principle of real-time application). Correlation values r (x minus 1.64 × sd, a one-sided 95 % confidence limit) can be used for an alternative method in assessing the calibration quality. These limits are F 0.564, P 0.784 and C 0.715 and can be essential in the further implementation of this advanced technology of dairy herd management.
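The one-sided 95 % limit quoted in the abstract is simply the mean of the correlation coefficients minus 1.64 standard deviations. A minimal sketch of that calculation, using invented replicate correlations rather than the study's raw data:

```python
from statistics import mean, stdev

def lower_limit_95(correlations):
    """One-sided 95 % lower confidence limit on a set of correlation
    coefficients: mean(r) - 1.64 * sd(r)."""
    return mean(correlations) - 1.64 * stdev(correlations)

# hypothetical replicate correlations for fat (illustrative values only)
fat_r = [0.597, 0.738, 0.700]
print(round(lower_limit_95(fat_r), 3))
```

An alternative instrument whose observed correlation against the reference method falls below such a limit would fail the calibration-quality check.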
Methods and results for stress analyses on 14-ton, thin-wall depleted UF6 cylinders
International Nuclear Information System (INIS)
Kirkpatrick, J.R.; Chung, C.K.; Frazier, J.L.; Kelley, D.K.
1996-10-01
Uranium enrichment operations at the three US gaseous diffusion plants produce depleted uranium hexafluoride (DUF6) as a residual product. At the present time, the inventory of DUF6 in this country is more than half a million tons. The inventory of DUF6 is contained in metal storage cylinders, most of which are located at the gaseous diffusion plants. The principal objective of the project is to ensure the integrity of the cylinders to prevent causing an environmental hazard by releasing the contents of the cylinders into the atmosphere. Another objective is to maintain the cylinders in such a manner that the DUF6 may eventually be converted to a less hazardous material for final disposition. An important task in the DUF6 cylinders management project is determining how much corrosion of the walls can be tolerated before the cylinders are in danger of being damaged during routine handling and shipping operations. Another task is determining how to handle cylinders that have already been damaged in a manner that will minimize the chance that a breach will occur or that the size of an existing breach will be significantly increased. A number of finite element stress analysis (FESA) calculations have been done to analyze the stresses for three conditions: (1) while the cylinder is being lifted, (2) when a cylinder is resting on two cylinders under it in the customary two-tier stacking array, and (3) when a cylinder is resting on its chocks on the ground. Various documents describe some of the results and discuss some of the methods whereby they have been obtained. The objective of the present report is to document as many of the FESA cases done at Oak Ridge for 14-ton thin-wall cylinders as possible, giving results and a description of the calculations in some detail.
Excavation damage and disturbance in crystalline rock - results from experiments and analyses
Energy Technology Data Exchange (ETDEWEB)
Baeckblom, Goeran (Conrox AB, Stockholm (Sweden))
2008-11-15
SKB plans to submit the application to site and construct the final repository for spent nuclear fuel in 2010. One important basis for the application is the results of the safety assessments, for which one particular dataset is the axial hydraulic properties along the underground openings, used to calculate the transport resistance for radionuclide transport in the event that the canister is impaired. SKB initiated a project (Zuse) to be run over the period 2007-2009 to: - establish the current knowledge base on excavation damage and disturbance, with particular focus on the axial hydraulic properties along the underground openings; - provide a basis for the requirements and compliance criteria for the excavation damaged and disturbed zone; - devise methods and instruments to infer or measure the excavation damage and disturbance at different times during the repository construction and operation before closure; - propose demonstration tests in which the methods are used in situ to qualify appropriate data for use in the safety reports. This report presents the results of the first stage of the Zuse project. Previous major experiments and studies in Canada, Finland, Japan, Sweden and Switzerland on spalling, excavation damage and disturbance were compiled and evaluated to provide the SR-Site report with a defendable database on the properties of the excavation damage and disturbance. In preparation for the SR-Site report, a number of sensitivity studies were conducted in which reasonable ranges of values for spalling and damage were selected in combination with an impaired backfill. The report describes the construction of the repository in eleven steps, and for each of these steps the potential evolution of THMCB (Thermal, Hydraulic, Mechanical and Chemical/Biological) processes is reviewed. In this work it was found that descriptions of the chemical and microbiological evolution connected with excavation damage and disturbance were lacking. The preliminary …
Biosphere analyses for the safety assessment SR-Site - synthesis and summary of results
International Nuclear Information System (INIS)
Saetre, Peter
2010-12-01
This report summarises nearly 20 biosphere reports and gives a synthesis of the work performed within the SR-Site Biosphere project, i.e. the biosphere part of SR-Site. SR-Site Biosphere provides the main project with dose conversion factors (LDFs), given a unit release rate, for calculation of human doses under different release scenarios, and assesses whether a potential release from the repository would have detrimental effects on the environment. The intention of this report is to give sufficient detail for an overview of the methods, results and major conclusions, with references to the biosphere reports where methods, data and results are presented and discussed in detail. The philosophy of the biosphere assessment was to make the estimations of radiological risk for humans and the environment as realistic as possible, based on knowledge of present-day conditions at Forsmark and the past and expected future development of the site. This was achieved by using the best available knowledge, understanding and data from extensive site investigations at two sites. When sufficient information was not available, uncertainties were handled cautiously. A systematic identification and evaluation of features and processes that affect transport and accumulation of radionuclides at the site was conducted, and the results were summarised in an interaction matrix. Data and understanding from the site investigation were an integral part of this work, and the interaction matrix underpinned the development of the radionuclide model used in the biosphere assessment. Understanding of the marine, lake and river and terrestrial ecosystems at the site was summarised in a conceptual model, and relevant features and processes have been characterised to capture site-specific parameter values. Detailed investigations of the structure and history of the regolith at the site and simulations of regolith dynamics were used to describe the present-day state at Forsmark and the expected development of …
Excavation damage and disturbance in crystalline rock - results from experiments and analyses
International Nuclear Information System (INIS)
Baeckblom, Goeran
2008-11-01
SKB plans to submit the application to site and construct the final repository for spent nuclear fuel in 2010. One important basis for the application is the results of the safety assessments, for which one particular dataset is the axial hydraulic properties along the underground openings, used to calculate the transport resistance for radionuclide transport in the event that the canister is impaired. SKB initiated a project (Zuse) to be run over the period 2007-2009 to: - establish the current knowledge base on excavation damage and disturbance, with particular focus on the axial hydraulic properties along the underground openings; - provide a basis for the requirements and compliance criteria for the excavation damaged and disturbed zone; - devise methods and instruments to infer or measure the excavation damage and disturbance at different times during the repository construction and operation before closure; - propose demonstration tests in which the methods are used in situ to qualify appropriate data for use in the safety reports. This report presents the results of the first stage of the Zuse project. Previous major experiments and studies in Canada, Finland, Japan, Sweden and Switzerland on spalling, excavation damage and disturbance were compiled and evaluated to provide the SR-Site report with a defendable database on the properties of the excavation damage and disturbance. In preparation for the SR-Site report, a number of sensitivity studies were conducted in which reasonable ranges of values for spalling and damage were selected in combination with an impaired backfill. The report describes the construction of the repository in eleven steps, and for each of these steps the potential evolution of THMCB (Thermal, Hydraulic, Mechanical and Chemical/Biological) processes is reviewed. In this work it was found that descriptions of the chemical and microbiological evolution connected with excavation damage and disturbance were lacking. The preliminary …
Results of the analyses of the intercomparison samples of natural uranium dioxide SR-1
International Nuclear Information System (INIS)
Aigner, H.; Kuhn, E.; Deron, S.
1980-08-01
Samples of a homogeneous powder of natural uranium dioxide, SR-1, were distributed to 37 laboratories in November 1977 for intercomparison of the precision and accuracy of wet chemical assays. 17 laboratories reported 18 sets of results (one laboratory applied two techniques). The analytical methods applied were: titration (11), coulometry (2), precipitation-gravimetry (1), fluorimetry (2), X-ray fluorescence (1) and neutron activation (1). Analysis of variance yielded, for each combination of laboratory and technique, estimates of the measurement errors, the dissolution or treatment errors, and the fluctuation of the measurements between sample bottles. Time effects have also been tested. The measurement errors vary between 0.01% and 6.4%. Eleven laboratories agree within 0.25% with the reference value. No mean obtained by wet chemical methods is biased by more than 0.4%. The biases of the other methods (fluorimetry, X-ray fluorescence and neutron activation) vary between 0.5% and 4.3%. The biases of 9 laboratories or techniques are greater than expected from their random errors. The mean bias of the fourteen wet chemical methods is equal to 0.08% U with a standard deviation of ±0.18% U
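The per-laboratory biases quoted above are each laboratory's mean assay expressed relative to the reference value. A minimal sketch of that calculation, with invented assay figures rather than the SR-1 data:

```python
def relative_bias(lab_mean, reference):
    """Laboratory bias as a percentage of the reference value."""
    return 100.0 * (lab_mean - reference) / reference

# hypothetical uranium assays (% U) against an assumed reference of 88.15 % U
for lab, mean_u in [("A", 88.22), ("B", 87.80)]:
    print(lab, round(relative_bias(mean_u, 88.15), 2))
```

Comparing each bias against the laboratory's own random error (as the intercomparison did) then shows whether the deviation is larger than chance alone would explain.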
Essential results of analyses accompanying the leak rate experiments E22 at HDR
International Nuclear Information System (INIS)
Grebner, H.; Hoefler, A.; Hunger, H.
1997-01-01
During phase III of the HDR Safety Programme (HDR: decommissioned overheated steam reactor in Karlstein, Germany), experiments were performed in test group E22 on small-bore austenitic straight piping and on pipe elbows and branches containing through-wall cracks. The main aim was the determination of crack opening and leak rate behaviour for the cracked components under almost operational pressure and temperature loading conditions, especially including transient bending moments. In addition to machined slits, naturally grown fatigue cracks were also considered to cover the leakage behaviour. The experiments were accompanied by calculations, mainly performed by GRS. The paper describes the most important aspects and the essential results of the calculations and analysis. The main outcome was that the crack opening and initiation of crack growth can be described with the finite element techniques applied with sufficient accuracy. However, the qualification of the leak rate models could not be completed successfully, and therefore more sophisticated experiments of this kind are needed. (orig.)
Biosphere analyses for the safety assessment SR-Site - synthesis and summary of results
Energy Technology Data Exchange (ETDEWEB)
Saetre, Peter [comp.
2010-12-15
This report summarises nearly 20 biosphere reports and gives a synthesis of the work performed within the SR-Site Biosphere project, i.e. the biosphere part of SR-Site. SR-Site Biosphere provides the main project with dose conversion factors (LDFs), given a unit release rate, for calculation of human doses under different release scenarios, and assesses if a potential release from the repository would have detrimental effects on the environment. The intention of this report is to give sufficient details for an overview of methods, results and major conclusions, with references to the biosphere reports where methods, data and results are presented and discussed in detail. The philosophy of the biosphere assessment was to make estimations of the radiological risk for humans and the environment as realistic as possible, based on the knowledge of present-day conditions at Forsmark and the past and expected future development of the site. This was achieved by using the best available knowledge, understanding and data from extensive site investigations from two sites. When sufficient information was not available, uncertainties were handled cautiously. A systematic identification and evaluation of features and processes that affect transport and accumulation of radionuclides at the site was conducted, and the results were summarised in an interaction matrix. Data and understanding from the site investigation was an integral part of this work, the interaction matrix underpinned the development of the radionuclide model used in the biosphere assessment. Understanding of the marine, lake and river and terrestrial ecosystems at the site was summarized in a conceptual model, and relevant features and process have been characterized to capture site specific parameter values. Detailed investigations of the structure and history of the regolith at the site and simulations of regolith dynamics were used to describe the present day state at Forsmark and the expected development of
Bartels, Ronald H M A; Donk, Roland D; Verhagen, Wim I M; Hosman, Allard J F; Verbeek, André L M
2017-11-01
The results of meta-analyses are frequently reported, but understanding and interpreting them is difficult for both clinicians and patients. Statistical significance is presented without reference to values that imply clinical relevance. This study aimed to use the minimal clinically important difference (MCID) to rate the clinical relevance of a meta-analysis. This study is a review of meta-analyses relating to a specific topic: the clinical results of cervical arthroplasty. The outcome measure used in the study was the MCID. We performed an extensive literature search of a series of meta-analyses evaluating a similar subject as an example. We searched PubMed and Embase through August 9, 2016, and found articles concerning meta-analyses of the clinical outcome of cervical arthroplasty compared with that of anterior cervical discectomy with fusion in cases of cervical degenerative disease. We evaluated the analyses for statistical significance and their relation to the MCID. The MCID was defined based on results in similar patient groups and a similar disease entity reported in the literature. We identified 21 meta-analyses, only one of which referred to the MCID. However, the researchers used an inappropriate measurement scale and, therefore, an incorrect MCID. The majority of the conclusions were based on statistical results without mentioning clinical relevance. The majority of the articles we reviewed drew conclusions based on statistical differences instead of clinical relevance. We recommend introducing the concept of the MCID when reporting the results of a meta-analysis, as well as mentioning the explicit scale of the analyzed measurement. Copyright © 2017 Elsevier Inc. All rights reserved.
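The distinction the abstract draws, statistical significance versus clinical relevance, can be operationalised by checking a pooled effect and its confidence interval against the MCID. The classification rules and all numbers below are a hypothetical sketch, not the authors' method:

```python
def rate_relevance(effect, ci_low, ci_high, mcid):
    """Classify a pooled mean difference against an MCID (illustrative rules).

    'clinically relevant'     : the whole CI clears the MCID on one side
    'possibly relevant'       : the point estimate clears the MCID, the CI does not
    'not clinically relevant' : the point estimate is below the MCID
    """
    if ci_low * ci_high > 0 and abs(ci_low) >= mcid and abs(ci_high) >= mcid:
        return "clinically relevant"
    if abs(effect) >= mcid:
        return "possibly relevant"
    return "not clinically relevant"

# statistically significant (CI excludes 0) yet below an assumed MCID of 10 points
print(rate_relevance(effect=4.2, ci_low=1.1, ci_high=7.3, mcid=10))
```

A result like this one illustrates the paper's point: p < 0.05 alone does not establish a difference patients would notice.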
Understanding logistic regression analysis
Sperandei, Sandro
2014-01-01
Logistic regression is used to obtain the odds ratio in the presence of more than one explanatory variable. The procedure is quite similar to multiple linear regression, with the exception that the response variable is binomial. The result is the impact of each variable on the odds ratio of the observed event of interest. The main advantage is to avoid confounding effects by analyzing the association of all variables together. In this article, we explain the logistic regression procedure using examples to make it as simple as possible. After definition of the technique, the basic interpretation of the results is highlighted and then some special issues are discussed.
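The core of the procedure the abstract describes, fitting a logistic model and reading the exponentiated exposure coefficient as an odds ratio, can be shown in a dependency-free teaching sketch. The data are synthetic, and a real analysis would use a statistics package rather than hand-rolled gradient ascent:

```python
import math

def fit_logistic(xs, ys, lr=1.0, steps=2000):
    """Single-predictor logistic regression fitted by gradient ascent on
    the log-likelihood (a teaching sketch, not production code)."""
    b0 = b1 = 0.0
    n = len(xs)
    for _ in range(steps):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
            g0 += y - p          # gradient w.r.t. intercept
            g1 += (y - p) * x    # gradient w.r.t. slope
        b0 += lr * g0 / n
        b1 += lr * g1 / n
    return b0, b1

# Synthetic 2x2 data: exposed (x=1): 30 events / 70 non-events;
# unexposed (x=0): 10 events / 90 non-events.
xs = [1] * 100 + [0] * 100
ys = [1] * 30 + [0] * 70 + [1] * 10 + [0] * 90
b0, b1 = fit_logistic(xs, ys)
odds_ratio = math.exp(b1)   # converges to the 2x2-table OR, (30/70)/(10/90)
print(round(odds_ratio, 2))
```

With a single binary predictor the fitted odds ratio reproduces the contingency-table odds ratio exactly; the advantage of the model form is that further covariates can be added to adjust for confounding.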
International Nuclear Information System (INIS)
Buck, John W.; McDonald, John P.; Taira, Randal Y.
2002-01-01
To support cleanup and closure of these tanks, modeling is performed to understand and predict potential impacts to human health and the environment. Pacific Northwest National Laboratory developed a screening tool for the United States Department of Energy, Office of River Protection that estimates the long-term human health risk, from a strategic planning perspective, posed by potential tank releases to the environment. This tool is being conditioned to more detailed model analyses to ensure consistency between studies and to provide scientific defensibility. Once the conditioning is complete, the system will be used to screen alternative cleanup and closure strategies. The integration of screening and detailed models provides consistent analyses, efficiencies in resources, and positive feedback between the various modeling groups. This approach of conditioning a screening methodology to more detailed analyses provides decision-makers with timely and defensible information and increases confidence in the results on the part of clients, regulators, and stakeholders
Lifescience Database Archive (English)
Results of de-novo and Motif activity analyses - FANTOM5 | LSDB Archive. Data file name: Motifs (JASPAR). File URL: ftp://ftp.biosciencedbc.jp/archive/fantom5/datafiles/phase1.3…
2017-09-01
in the hybrid scheme. They conclude that in the Lorenz model they investigated, the hybrid scheme cannot result in errors that are simultaneously ...centered on the analysis time. Note that the spatial and temporal refinement of the analyses are taking place simultaneously (i.e., the first analysis...of a strong capping inversion and then a deep elevated mixed layer. At 1800 UTC (Fig. 5b), daytime heating along with the formation of a convective
McCormick, Patrick W.; Lewis, Gary D.; Dujovny, Manuel; Ausman, James I.; Stewart, Mick; Widman, Ronald A.
1992-05-01
Near-infrared light generated by specialized instrumentation was passed through artificially oxygenated human blood during simultaneous sampling by a co-oximeter. Characteristic absorption spectra were analyzed to calculate the ratio of oxygenated to reduced hemoglobin. A positive linear regression fit between diffuse transmission oximetry and measured blood oxygenation was found over the range 23% to 99% (r² = .98, p …). A … signal was observed in the patient over time. The procedure was able to be performed clinically without difficulty; rSO2 values recorded continuously demonstrate the usefulness of the technique. Using the same instrumentation, arterial input and cerebral response functions, generated by IV tracer bolus, were deconvoluted to measure mean cerebral transit time. Data collected over time provided a sensitive index of changes in cerebral blood flow as a result of therapeutic maneuvers.
Osborne, Jason W.
2012-01-01
Logistic regression is slowly gaining acceptance in the social sciences, and fills an important niche in the researcher's toolkit: being able to predict important outcomes that are not continuous in nature. While OLS regression is a valuable tool, it cannot routinely be used to predict outcomes that are binary or categorical in nature. These…
Catalá-López, Ferrán; Ridao, Manuel; Alonso-Arroyo, Adolfo; García-Altés, Anna; Cameron, Chris; González-Bermejo, Diana; Aleixandre-Benavent, Rafael; Bernal-Delgado, Enrique; Peiró, Salvador; Tabarés-Seisdedos, Rafael; Hutton, Brian
2016-01-07
Cost-effectiveness analysis has been recognized as an important tool to determine the efficiency of healthcare interventions and services. There is a need for evaluating the reporting of methods and results of cost-effectiveness analyses and establishing their validity. We describe and examine reporting characteristics of methods and results of cost-effectiveness analyses conducted in Spain over more than two decades. A methodological systematic review was conducted with the information obtained through an updated literature review in PubMed and complementary databases (e.g. Scopus, ISI Web of Science, National Health Service Economic Evaluation Database (NHS EED) and Health Technology Assessment (HTA) databases from the Centre for Reviews and Dissemination (CRD), Índice Médico Español (IME), Índice Bibliográfico Español en Ciencias de la Salud (IBECS)). We identified cost-effectiveness analyses conducted in Spain that used quality-adjusted life years (QALYs) as outcome measures (period 1989-December 2014). Two reviewers independently extracted the data from each paper. The data were analysed descriptively. In total, 223 studies were included. Very few studies (10; 4.5 %) reported working from a protocol. Most studies (200; 89.7 %) were simulation models and included a median of 1000 patients. Only 105 (47.1 %) studies presented an adequate description of the characteristics of the target population. Most study interventions were categorized as therapeutic (189; 84.8 %) and nearly half (111; 49.8 %) considered an active alternative as the comparator. Effectiveness data were derived from a single study in 87 (39.0 %) reports, and only a few (40; 17.9 %) used evidence synthesis-based estimates. Few studies (42; 18.8 %) reported a full description of the methods for QALY calculation. The majority of the studies (147; 65.9 %) reported that the study intervention produced "more costs and more QALYs" than the comparator. Most studies (200; 89.7 %) reported favourable …
International Nuclear Information System (INIS)
Mazzinghi, A.
2014-01-01
Beato Angelico is one of the most important Italian painters of the Renaissance period; in particular, he was a master of the so-called 'buon fresco' technique for mural paintings. A wide diagnostic campaign with X-Ray Fluorescence (XRF) analyses has been carried out on three masterworks painted by Beato Angelico in the San Marco monastery in Florence: the 'Crocifissione con Santi', the 'Annunciazione' and the 'Madonna delle Ombre'. The latter is painted by mixing fresco and secco techniques, which makes it of particular interest for the study of two different painting techniques of the same artist. The aim of the study was therefore the characterization of the painting palette, and thereby the painting techniques, used by Beato Angelico. Moreover, the conservators were interested in the study of degradation processes and old restoration treatments. Our analyses have been carried out by means of the XRF spectrometer developed at the LABEC laboratory of the Istituto Nazionale di Fisica Nucleare in Florence (Italy). XRF is indeed especially suited for this kind of study, allowing multi-elemental, non-destructive, non-invasive analyses in a short time with portable instruments. In this paper the first results concerning the XRF analysis are presented.
Panel data specifications in nonparametric kernel regression
DEFF Research Database (Denmark)
Czekaj, Tomasz Gerard; Henningsen, Arne
… parametric panel data estimators to analyse the production technology of Polish crop farms. The results of our nonparametric kernel regressions generally differ from the estimates of the parametric models, but they only slightly depend on the choice of the kernel functions. Based on economic reasoning, we …
Gulácsi, László; Rencz, Fanni; Péntek, Márta; Brodszky, Valentin; Lopert, Ruth; Hevér, Noémi V; Baji, Petra
2014-05-01
Several Central and Eastern European (CEE) countries require cost-utility analyses (CUAs) to support reimbursement formulary listing. However, CUAs informed by local evidence are often unavailable, and the cost-effectiveness of several currently reimbursed biologicals is unclear. The objective was to estimate the cost-effectiveness, as multiples of per capita GDP per quality-adjusted life year (QALY), of four biologicals (infliximab, etanercept, adalimumab, golimumab) currently reimbursed in six CEE countries in six inflammatory rheumatoid and bowel disease conditions. A systematic literature review of published cost-utility analyses in the selected conditions was conducted, using the United Kingdom (UK) as the reference country and with study selection criteria set to optimize the transfer of results to the CEE countries. Prices in each CEE country were pro-rated against UK prices using purchasing power parity (PPP)-adjusted per capita GDP, and local per capita GDP/QALY ratios were estimated. List prices in the CEE countries were 144-333% higher than pro-rata prices. Out of 85 CUAs identified by previous systematic literature reviews, 15 were selected as a convenience sample for estimating the cost-effectiveness of biologicals in the CEE countries in terms of per capita GDP/QALY. Per capita GDP/QALY values varied from 0.42 to 6.4 across countries and conditions (Bulgaria: 0.97-6.38; Czech Republic: 0.42-2.76; Hungary: 0.54-3.54; Poland: 0.59-3.90; Romania: 0.77-5.07; Slovakia: 0.55-3.61). While results must be interpreted with caution, calculating pro rata (cost-effective) prices and per capita GDP/QALY ratios based on CUAs can aid reimbursement decision-making in the absence of analyses using local data.
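The two calculations the abstract describes, pro-rating a UK price by PPP-adjusted per capita GDP and expressing an incremental cost-effectiveness ratio in per capita GDP multiples, are simple arithmetic. All figures below are invented for illustration, not the study's data:

```python
def pro_rata_price(uk_price, gdp_pc_uk, gdp_pc_local):
    """UK price scaled by the ratio of PPP-adjusted per capita GDP."""
    return uk_price * gdp_pc_local / gdp_pc_uk

def gdp_multiple_per_qaly(incr_cost, incr_qaly, gdp_pc_local):
    """Cost-effectiveness expressed as multiples of per capita GDP per QALY."""
    return (incr_cost / incr_qaly) / gdp_pc_local

# hypothetical values: annual UK price 10,000; PPP GDP per capita
# UK 35,000 vs a CEE country 20,000; incremental cost 8,000 per 0.25 QALY
price = pro_rata_price(10_000, 35_000, 20_000)
mult = gdp_multiple_per_qaly(8_000, 0.25, 20_000)
print(round(price), round(mult, 2))
```

A list price well above the pro-rata value, as the paper found (144-333% higher), pushes the local GDP/QALY multiple up correspondingly.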
KOH concentration effect on the cycle life of nickel-hydrogen cells. 4: Results of failure analyses
Lim, H. S.; Verzwyvelt, S. A.
1989-01-01
Effects of KOH concentration on the failure modes and mechanisms of nickel-hydrogen cells were studied using long-cycled boiler plate cells containing electrolytes of various KOH concentrations ranging from 21 to 36 percent. Lives of these cells were up to 40,000 cycles in an accelerated low earth orbit (LEO) cycle regime at 80 percent depth of discharge. Interim life test results were reported earlier in J. Power Sources, 22, 213-220, 1988. The results of the final life test, end-of-life cell performance, and teardown analyses are discussed. These teardown analyses included visual observations, measurements of nickel electrode capacity in an electrolyte-flooded cell, dimensional changes of cell components, SEM studies of cell cross sections, BET surface area and pore volume distribution in cycled nickel electrodes, and chemical analyses. The cycle life of a nickel-hydrogen cell improved tremendously as the KOH concentration was decreased from 36 to 31 percent and from 31 to 26 percent, while the effect of a further concentration decrease was complicated, as described in our earlier report. The failure mode of the high-concentration (31 to 36 percent) cells was gradual capacity decrease, while that of the low-concentration (21 to 26 percent) cells was mainly formation of a soft short. Long-cycled (25,000 to 40,000 cycles) nickel electrodes were expanded by more than 50 percent of the initial value, but no correlation was found between this expansion and measured capacity. All electrodes cycled in low-concentration (21 to 26 percent) cells had higher capacity than those cycled in high-concentration (31 to 36 percent) cells.
Numerical analyses of an ex-core fuel incident: Results of the OECD-IAEA Paks Fuel Project
Energy Technology Data Exchange (ETDEWEB)
Hozer, Z., E-mail: hozer@aeki.kfki.h [Hungarian Academy of Sciences KFKI Atomic Energy Research Institute, H-1525 Budapest, P.O. Box 49 (Hungary); Aszodi, A. [BME NTI Budapest (Hungary); Barnak, M. [IVS, Trnava (Slovakia); Boros, I. [BME NTI Budapest (Hungary); Fogel, M. [VUJE, Trnava (Slovakia); Guillard, V. [IRSN, Cadarache (France); Gyori, Cs. [ITU, EU, Karlsruhe (Germany); Hegyi, G. [Hungarian Academy of Sciences KFKI Atomic Energy Research Institute, H-1525 Budapest, P.O. Box 49 (Hungary); Horvath, G.L. [VEIKI, Budapest (Hungary); Nagy, I. [Hungarian Academy of Sciences KFKI Atomic Energy Research Institute, H-1525 Budapest, P.O. Box 49 (Hungary); Junninen, P. [VTT, Espoo (Finland); Kobzar, V. [KI, Moscow (Russian Federation); Legradi, G. [BME NTI Budapest (Hungary); Molnar, A. [Hungarian Academy of Sciences KFKI Atomic Energy Research Institute, H-1525 Budapest, P.O. Box 49 (Hungary); Pietarinen, K. [VTT, Espoo (Finland); Perneczky, L. [Hungarian Academy of Sciences KFKI Atomic Energy Research Institute, H-1525 Budapest, P.O. Box 49 (Hungary); Makihara, Y. [ATMEA, Paris (France); Matejovic, P. [IVS, Trnava (Slovakia); Perez-Fero, E.; Slonszki, E. [Hungarian Academy of Sciences KFKI Atomic Energy Research Institute, H-1525 Budapest, P.O. Box 49 (Hungary)
2010-03-15
The OECD-IAEA Paks Fuel Project was developed to support the understanding of fuel behaviour in accident conditions on the basis of analyses of the Paks-2 incident. Numerical simulation of the most relevant aspects of the event and comparison of the calculation results with the available data from the incident were carried out between 2006 and 2007. A database was compiled to provide input for the code calculations. The activities covered the following three areas: (a) Thermal hydraulic calculations described the cooling conditions possibly established during the incident. (b) Simulation of fuel behaviour described the oxidation and degradation mechanisms of the fuel assemblies. (c) The release of fission products from the failed fuel rods was estimated and compared to available measured data. The applied codes captured the most important events of the Paks-2 incident and the calculated results improved the understanding of the causes and mechanisms of fuel failure. The numerical analyses showed that the by-pass flow leading to insufficient cooling amounted to 75-90% of the inlet flow rate, the maximum temperature in the tank was between 1200 and 1400 deg. C, the degree of zirconium oxidation reached 4-12% and the mass of produced hydrogen was between 3 and 13 kg.
Yamashita, Takashi; Kart, Cary S; Noe, Douglas A
2012-12-01
Type 2 diabetes is known to contribute to health disparities in the U.S. and failure to adhere to recommended self-care behaviors is a contributing factor. Intervention programs face difficulties as a result of patient diversity and limited resources. With data from the 2005 Behavioral Risk Factor Surveillance System, this study employs a logistic regression tree algorithm to identify characteristics of sub-populations with type 2 diabetes according to their reported frequency of adherence to four recommended diabetes self-care behaviors including blood glucose monitoring, foot examination, eye examination and HbA1c testing. Using Andersen's health behavior model, need factors appear to dominate the definition of which sub-groups were at greatest risk for low as well as high adherence. Findings demonstrate the utility of easily interpreted tree diagrams to design specific culturally appropriate intervention programs targeting sub-populations of diabetes patients who need to improve their self-care behaviors. Limitations and contributions of the study are discussed.
Grafmeyer, D; Bondon, M; Manchon, M; Levillain, P
1995-01-01
The director of a laboratory has to be sure of giving out reliable results for routine tests on automatic analysers regardless of the clinical context. However, samples may show hyperbilirubinaemia in some circumstances, turbidity caused by parenteral nutrition in others, and haemolysis when sampling is difficult. For this reason, the Commission for Instrumentation of the Société Française de Biologie Clinique (SFBC) (president Alain Feuillu) decided to look into "visible" interferences--bilirubin, haemolysis and turbidity--and their effect on 20 major tests: 13 substrates/chemistries (albumin, calcium, cholesterol, creatinine, glucose, iron, magnesium, phosphorus, total bilirubin, total proteins, triacylglycerols, uric acid and urea) and 7 enzymatic activities (alkaline phosphatase, alanine aminotransferase, alpha-amylase, aspartate aminotransferase, creatine kinase, gamma-glutamyl transferase and lactate dehydrogenase), measured on 15 automatic analysers representative of those found on the French market (Astra 8, AU 510, AU 5010, AU 5000, Chem 1, CX 7, Dax 72, Dimension, Ektachem, Hitachi 717, Hitachi 737, Hitachi 747, Monarch, Open 30, Paramax, Wako 30 R), and to see how much these interferences affect the accuracy of results under routine conditions in the laboratory. The study was carried out following the SFBC protocol for the validation of techniques, using plasma pools spiked with bilirubin, ditauro-bilirubin, haemoglobin (from haemolysate) and Intralipid (turbidity). Overall, the following results were obtained: haemolysis affects tests the most often (34.5% of cases); total bilirubin interferes in 21.7% of cases; direct bilirubin and turbidity seem to interfere less, at around 17%. The different tests are not affected to the same extent: enzyme activities are hardly affected at all; on the other hand, certain major tests are extremely sensitive, increasingly so as we go through the following: creatinine (interference of bilirubin), triacylglycerols (interference of bilirubin and
Mulroy, Sara J; Winstein, Carolee J; Kulig, Kornelia; Beneck, George J; Fowler, Eileen G; DeMuth, Sharon K; Sullivan, Katherine J; Brown, David A; Lane, Christianne J
2011-12-01
Each of the 4 randomized clinical trials (RCTs) hosted by the Physical Therapy Clinical Research Network (PTClinResNet) targeted a different disability group (low back disorder in the Muscle-Specific Strength Training Effectiveness After Lumbar Microdiskectomy [MUSSEL] trial, chronic spinal cord injury in the Strengthening and Optimal Movements for Painful Shoulders in Chronic Spinal Cord Injury [STOMPS] trial, adult stroke in the Strength Training Effectiveness Post-Stroke [STEPS] trial, and pediatric cerebral palsy in the Pediatric Endurance and Limb Strengthening [PEDALS] trial for children with spastic diplegic cerebral palsy) and tested the effectiveness of a muscle-specific or functional activity-based intervention on primary outcomes that captured pain (STOMPS, MUSSEL) or locomotor function (STEPS, PEDALS). The focus of these secondary analyses was to determine causal relationships among outcomes across levels of the International Classification of Functioning, Disability and Health (ICF) framework for the 4 RCTs. With the database from PTClinResNet, we used 2 separate secondary statistical approaches-mediation analysis for the MUSSEL and STOMPS trials and regression analysis for the STEPS and PEDALS trials-to test relationships among muscle performance, primary outcomes (pain related and locomotor related), activity and participation measures, and overall quality of life. Predictive models were stronger for the 2 studies with pain-related primary outcomes. Change in muscle performance mediated or predicted reductions in pain for the MUSSEL and STOMPS trials and, to some extent, walking speed for the STEPS trial. Changes in primary outcome variables were significantly related to changes in activity and participation variables for all 4 trials. Improvement in activity and participation outcomes mediated or predicted increases in overall quality of life for the 3 trials with adult populations. Variables included in the statistical models were limited to those
Modified Regression Correlation Coefficient for Poisson Regression Model
Kaengthong, Nattacha; Domthong, Uthumporn
2017-09-01
This study gives attention to indicators in predictive power of the Generalized Linear Model (GLM) which are widely used; however, often having some restrictions. We are interested in regression correlation coefficient for a Poisson regression model. This is a measure of predictive power, and defined by the relationship between the dependent variable (Y) and the expected value of the dependent variable given the independent variables [E(Y|X)] for the Poisson regression model. The dependent variable is distributed as Poisson. The purpose of this research was modifying regression correlation coefficient for Poisson regression model. We also compare the proposed modified regression correlation coefficient with the traditional regression correlation coefficient in the case of two or more independent variables, and having multicollinearity in independent variables. The result shows that the proposed regression correlation coefficient is better than the traditional regression correlation coefficient based on Bias and the Root Mean Square Error (RMSE).
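As defined above, the regression correlation coefficient is the correlation between the dependent variable Y and the fitted E(Y|X). A minimal, self-contained sketch of that quantity (simulated data and a plain IRLS Poisson fit, not the authors' modified estimator; all names and numbers are illustrative):

```python
import numpy as np

def fit_poisson_irls(X, y, n_iter=25):
    """Fit a log-link Poisson regression by IRLS (illustrative, no safeguards)."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        mu = np.exp(X @ beta)               # fitted means E(Y|X)
        z = X @ beta + (y - mu) / mu        # working response
        beta = np.linalg.solve(X.T @ (mu[:, None] * X), X.T @ (mu * z))
    return beta

rng = np.random.default_rng(0)
n = 2000
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = rng.poisson(np.exp(0.5 + 0.8 * X[:, 1])).astype(float)

beta = fit_poisson_irls(X, y)
mu_hat = np.exp(X @ beta)
rho = np.corrcoef(y, mu_hat)[0, 1]          # regression correlation coefficient
print(beta, rho)
```

With a single strong predictor, rho lands well inside (0, 1); the multicollinearity comparison in the abstract would repeat this with correlated columns in X.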
Directory of Open Access Journals (Sweden)
Leo Oey
2013-01-01
A data-assimilated Taiwan Ocean Prediction (ATOP) system is being developed at the National Central University, Taiwan. The model simulates sea-surface height, three-dimensional currents, temperature and salinity, and turbulent mixing. The model has options for tracer and particle-tracking algorithms, as well as for wave-induced Stokes drift and wave-enhanced mixing and bottom drag. Two different forecast domains have been tested: a large-grid domain that encompasses the entire North Pacific Ocean at 0.1° × 0.1° horizontal resolution and 41 vertical sigma levels, and a smaller western North Pacific domain which at present also has the same horizontal resolution. In both domains, 25-year spin-up runs from 1988 - 2011 were first conducted, forced by six-hourly Cross-Calibrated Multi-Platform (CCMP) and NCEP reanalysis Global Forecast System (GFS) winds. The results are then used as initial conditions to conduct ocean analyses from January 2012 through February 2012, when updated hindcasts and real-time forecasts begin using the GFS winds. This paper describes the ATOP system and compares the forecast results against satellite altimetry data for assessing model skills. The model results are also shown to compare well with observations of (i) the Kuroshio intrusion in the northern South China Sea and (ii) the subtropical countercurrent. A review of, and comparison with, other models in the literature of (i) are also given.
Chong, A; Nazarian, N; Chandrananth, J; Tacey, M; Shepherd, D; Tran, P
2015-02-01
This study sought to determine the medium-term patient-reported and radiographic outcomes in patients undergoing surgery for hallux valgus. A total of 118 patients (162 feet) underwent surgery for hallux valgus between January 2008 and June 2009. The Manchester-Oxford Foot Questionnaire (MOXFQ), a validated tool for the assessment of outcome after surgery for hallux valgus, was used and patient satisfaction was sought. The medical records and radiographs were reviewed retrospectively. At a mean of 5.2 years (4.7 to 6.0) post-operatively, the median combined MOXFQ score was 7.8 (IQR:0 to 32.8). The median domain scores for pain, walking/standing, and social interaction were 10 (IQR: 0 to 45), 0 (IQR: 0 to 32.1) and 6.3 (IQR: 0 to 25) respectively. A total of 119 procedures (73.9%, in 90 patients) were reported as satisfactory but only 53 feet (32.7%, in 43 patients) were completely asymptomatic. The mean (SD) correction of hallux valgus, intermetatarsal, and distal metatarsal articular angles was 18.5° (8.8°), 5.7° (3.3°), and 16.6° (8.8°), respectively. Multivariable regression analysis identified that an American Association of Anesthesiologists grade of >1 (Incident Rate Ratio (IRR) = 1.67, p-value = 0.011) and recurrent deformity (IRR = 1.77, p-value = 0.003) were associated with significantly worse MOXFQ scores. No correlation was found between the severity of deformity, the type, or degree of surgical correction and the outcome. When using a validated outcome score for the assessment of outcome after surgery for hallux valgus, the long-term results are worse than expected when compared with the short- and mid-term outcomes, with 25.9% of patients dissatisfied at a mean follow-up of 5.2 years. ©2015 The British Editorial Society of Bone & Joint Surgery.
and Multinomial Logistic Regression
African Journals Online (AJOL)
This work presented the results of an experimental comparison of two models: Multinomial Logistic Regression (MLR) and Artificial Neural Network (ANN) for classifying students based on their academic performance. The predictive accuracy for each model was measured by their average Classification Correct Rate (CCR).
Pedrini, D. T.; Pedrini, Bonnie C.
Regression, another mechanism studied by Sigmund Freud, has been the subject of much research, e.g., hypnotic regression, frustration regression, schizophrenic regression, and infra-human-animal regression (often directly related to fixation). Many investigators worked with hypnotic age regression, which has a long history, going back to Russian reflexologists.…
Directory of Open Access Journals (Sweden)
Emily Grundy
2015-01-01
Background: Previous research shows associations between fertility histories and later-life health. The childless, those with large families, and those with a young age at entry to parenthood generally have higher mortality and worse health than parents of two or three children. These associations are hypothesised to reflect a range of biosocial influences, but underlying mechanisms are poorly understood. Objective: To identify pathways from fertility histories to later-life health by examining mediation through health-related behaviours, social support and strain, and wealth; additionally, to examine mediation through allostatic load - an indicator of multisystem physical dysregulation, hypothesised to be an outcome of chronic stress. Methods: Associations between fertility histories, mediators, and outcomes were analysed using path models. Data were drawn from the English Longitudinal Study of Ageing. Outcomes studied were a measure of allostatic load based on 9 biomarkers and self-reported long-term illness which limited activities. Results: Early parenthood (Conclusions: In England early parenthood and larger family size are associated with less wealth and poorer health behaviours, and this accounts for much of the association with health. At least part of this operates through stress-related physiological dysfunction (allostatic load).
Motzer, Robert J; Ravaud, Alain; Patard, Jean-Jacques; Pandha, Hardev S; George, Daniel J; Patel, Anup; Chang, Yen-Hwa; Escudier, Bernard; Donskov, Frede; Magheli, Ahmed; Carteni, Giacomo; Laguerre, Brigitte; Tomczak, Piotr; Breza, Jan; Gerletti, Paola; Lechuga, Mariajose; Lin, Xun; Casey, Michelle; Serfass, Lucile; Pantuck, Allan J; Staehler, Michael
2018-01-01
Adjuvant sunitinib significantly improved disease-free survival (DFS) versus placebo in patients with locoregional renal cell carcinoma (RCC) at high risk of recurrence after nephrectomy (hazard ratio [HR] 0.76, 95% confidence interval [CI] 0.59-0.98; p=0.03). To report the relationship between baseline factors and DFS, pattern of recurrence, and updated overall survival (OS). Data for 615 patients randomized to sunitinib (n=309) or placebo (n=306) in the S-TRAC trial. Subgroup DFS analyses by baseline risk factors were conducted using a Cox proportional hazards model. Baseline risk factors included: modified University of California Los Angeles integrated staging system criteria, age, gender, Eastern Cooperative Oncology Group performance status (ECOG PS), weight, neutrophil-to-lymphocyte ratio (NLR), and Fuhrman grade. Of 615 patients, 97 and 122 in the sunitinib and placebo arms developed metastatic disease, with the most common sites of distant recurrence being lung (40 and 49), lymph node (21 and 26), and liver (11 and 14), respectively. A benefit of adjuvant sunitinib over placebo was observed across subgroups, including: higher risk (T3, no or undetermined nodal involvement, Fuhrman grade ≥2, ECOG PS ≥1, T4 and/or nodal involvement; hazard ratio [HR] 0.74, 95% confidence interval [CI] 0.55-0.99; p=0.04), NLR ≤3 (HR 0.72, 95% CI 0.54-0.95; p=0.02), and Fuhrman grade 3/4 (HR 0.73, 95% CI 0.55-0.98; p=0.04). All subgroup analyses were exploratory, and no adjustments for multiplicity were made. Median OS was not reached in either arm (HR 0.92, 95% CI 0.66-1.28; p=0.6); 67 and 74 patients died in the sunitinib and placebo arms, respectively. A benefit of adjuvant sunitinib over placebo was observed across subgroups. The results are consistent with the primary analysis, which showed a benefit for adjuvant sunitinib in patients at high risk of recurrent RCC after nephrectomy. Most subgroups of patients at high risk of recurrent renal cell carcinoma after
Directory of Open Access Journals (Sweden)
Man Luo
Ischemic stroke (IS) is a multifactorial disorder caused by both genetic and environmental factors. The combined effects of multiple susceptibility genes might result in a higher risk for IS than a single gene. Therefore, we investigated whether interactions among multiple susceptibility genes were associated with an increased risk of IS by evaluating gene polymorphisms identified in previous meta-analyses, including methylenetetrahydrofolate reductase (MTHFR) C677T, beta fibrinogen (FGB, β-FG) A455G and T148C, apolipoprotein E (APOE) ε2-4, angiotensin-converting enzyme (ACE) insertion/deletion (I/D), and endothelial nitric oxide synthase (eNOS) G894T. In order to examine these interactions, 712 patients with IS and 774 controls in a Chinese Han population were genotyped using the SNaPshot method, and multifactor dimensionality reduction analysis was used to detect potential interactions among the candidate genes. This study found that ACE I/D and β-FG T148C were significant synergistic contributors to IS. In particular, the ACE DD + β-FG 148CC, ACE DD + β-FG 148CT, and ACE ID + β-FG 148CC genotype combinations resulted in a higher risk of IS. After adjusting for potential confounding IS risk factors (age, gender, family history of IS, hypertension history and history of diabetes mellitus) using a logistic analysis, a significant correlation between the genotype combinations and IS persisted (overall stroke: adjusted odds ratio [OR] = 1.57, 95% confidence interval [CI]: 1.22-2.02, P < 0.001; large artery atherosclerosis subtype: adjusted OR = 1.50, 95% CI: 1.08-2.07, P = 0.016; small-artery occlusion subtype: adjusted OR = 2.04, 95% CI: 1.43-2.91, P < 0.001). The results of this study indicate that the ACE I/D and β-FG T148C combination may result in significantly higher risk of IS in this Chinese population.
Directory of Open Access Journals (Sweden)
Arendt Maryse
2008-01-01
This article addresses the problem of how to ensure consistency in messages communicating public health recommendations on environmental health and on child health. The World Health Organization states that the protection, promotion and support of breastfeeding rank among the most effective interventions to improve child survival. International public health policy recommends exclusive breastfeeding for six months, followed by continued breastfeeding with the addition of safe and adequate complementary foods for two years and beyond. Biomonitoring of breastmilk is used as an indicator of environmental pollution accumulating in humans. This article will therefore present the biomonitoring results of concentrations of residues in breastmilk in a wider context. These results are the mirror that reflects the chemical substances accumulated in the bodies of both men and women in the course of a lifetime. The accumulated substances in our bodies may have an effect on male or female reproductive cells; they are present in the womb, directly affecting the environment of the fragile developing foetus; and they are present in breastmilk. Evidence of man-made chemical residues in breastmilk can provide a shock tactic to push for stronger laws to protect the environment. However, messages about chemicals detected in breastmilk can become dramatized by the media and cause a backlash against breastfeeding, thus contradicting the public health messages issued by the World Health Organization. Analyses of breastmilk show the presence of important nutritional components and live protective factors active in building up the immune system, in gastrointestinal maturation, in immune defence, and in providing antiviral, antiparasitic and antibacterial activity. Through cohort studies, researchers in environmental health have concluded that long-term breastfeeding counterbalances the effect of prenatal exposure to chemicals causing delay in mental and
The LIVE program - Results of test L1 and joint analyses on transient molten pool thermal hydraulics
Energy Technology Data Exchange (ETDEWEB)
Buck, M.; Buerger, M. [Univ Stuttgart, Inst Kernenerget and Energiesyst, D-70569 Stuttgart (Germany); Miassoedov, A.; Gaus-Liu, X.; Palagin, A. [IRSN Forschungszentrum Karlsruhe GmbH, D-76021 Karlsruhe, (Germany); Godin-Jacqmin, L. [CEA Cadarache, DEN STRI LMA, F-13115 St Paul Les Durance (France); Tran, C. T.; Ma, W. M. [KTH, AlbaNova Univ Ctr, S-10691 Stockholm (Sweden); Chudanov, V. [Nucl Safety Inst, Moscow 113191 (Russian Federation)
2010-07-01
of the LIVE activities also provide data for a better understanding of in-core corium pool behaviour. The experimental results are being used for the development and validation of mechanistic models for the description of molten pool behaviour. In the present paper, a range of different models is used for post-test calculations and comparative analyses. This includes simplified, but fast-running models implemented in the severe accident codes ASTEC and ATHLET-CD. Further, a computational tool developed at KTH (PECM model implemented in Fluent) is applied. These calculations are complemented by analyses with the CFD code CONV (thermal hydraulics of heterogeneous, viscous and heat-generating melts), which was developed at IBRAE (Nuclear Safety Institute of the Russian Academy of Sciences) within the RASPLAV project and was further improved within the ISTC 2936 Project. (authors)
International Nuclear Information System (INIS)
Samojlov, O.B.; Kajdalov, V.B.; Falkov, A.A.; Bolnov, V.A.; Morozkin, O.N.; Molchanov, V.L.; Ugryumov, A.V.
2010-01-01
TVSA is a fuel assembly with a rigid skeleton formed by 6 angle pieces and spacer grids (SG); it is successfully operated at 17 VVER-1000 power units, at Kalinin NPP as well as at Ukrainian and Bulgarian NPPs. Based on a contract for fuel supply to the Temelin NPP, the TVSA-T fuel assembly was developed, building on proven solutions confirmed by operation of TVSA modifications during 4-6 years and by the results of post-irradiation examination. The TVSA-T design includes combined spacer grids (SG+MG) and a fuel column elongated by 150 mm. A set of analyses and experiments was performed to validate the design, including thermal hydraulic tests, validation of the critical heat flux correlation for TVSA-T, and integrated mechanical, vibration and lifetime tests. A licence to use the fuel has been granted by the Czech State Office for Nuclear Safety. The TVSA-T core is currently in operation at the Temelin-1 reactor unit. The presentation is concluded as follows: the TVSA-T fuel assembly for Temelin has been validated. The TVSA-T design is based on approved technical decisions and meets the current requirements for lifetime, operational maneuverability and safety. The results of post-irradiation examination of TVSA-T operated at the Kalinin-1 unit for 4 years confirm the assembly operability, skeleton stiffness, geometric stability and normal fuel rod cladding condition. The properties of the TVSA fuel with MG allow the core power to be increased up to 3300 MW to match the envisaged future VVER (MIR-1200) design, providing an allowable fuel rod power of FΔh = 1.63 (to implement effective fuel cycles). (P.A.)
Tzeng, I-Shiang; Liu, Su-Hsun; Chen, Kuan-Fu; Wu, Chin-Chieh; Chen, Jih-Chang
2016-10-01
To reduce patient boarding time at the emergency department (ED) and to improve the overall quality of the emergent care system in Taiwan, the Minister of Health and Welfare of Taiwan (MOHW) piloted the Grading Responsible Hospitals for Acute Care (GRHAC) audit program in 2007-2009. The aim of the study was to evaluate the impact of the GRHAC audit program on the identification and management of acute myocardial infarction (AMI)-associated ED visits by describing and comparing the incidence of AMI-associated ED visits before (2003-2007), during (2007-2009), and after (2009-2012) the initial audit program implementation. Using aggregated data from the MOHW of Taiwan, we estimated the annual incidence of AMI-associated ED visits by Poisson regression models. We used segmented regression techniques to evaluate differences in the annual rates and in the year-to-year changes in AMI-associated ED visits between 2003 and 2012. Medical comorbidities such as diabetes mellitus, hyperlipidemia, and hypertensive disease were considered as potential confounders. Overall, the number of AMI-associated patient visits increased from 8130 visits in 2003 to 12,695 visits in 2012 (P-value for trend capacity for timely and correctly diagnosing and managing patients presenting with AMI-associated symptoms or signs at the ED.
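The segmented regression idea above (a level and slope change in a Poisson trend at a cut-point) can be sketched as follows. Only the 2003 and 2012 visit counts come from the abstract; the intermediate counts and the 2009 break year are illustrative assumptions:

```python
import numpy as np

def fit_poisson_irls(X, y, n_iter=50):
    """Log-link Poisson fit by IRLS (illustrative)."""
    beta = np.zeros(X.shape[1])
    beta[0] = np.log(y.mean())              # stabilising start value
    for _ in range(n_iter):
        mu = np.exp(X @ beta)
        z = X @ beta + (y - mu) / mu
        beta = np.linalg.solve(X.T @ (mu[:, None] * X), X.T @ (mu * z))
    return beta

years = np.arange(2003, 2013)
t = (years - 2003).astype(float)
post = (years >= 2009).astype(float)        # assumed post-audit period
# Segmented design: intercept, baseline trend, level change, slope change after 2009
X = np.column_stack([np.ones_like(t), t, post, (t - 6.0) * post])

# Illustrative counts; only the 2003 and 2012 endpoints come from the abstract
counts = np.array([8130, 8520, 8930, 9360, 9810, 10280, 10780,
                   11300, 11980, 12695], dtype=float)

beta = fit_poisson_irls(X, counts)
print(beta)   # beta[1]: pre-break log-rate trend; beta[2:]: level and slope changes
```

The test of interest in such an analysis is whether the level-change and slope-change coefficients differ from zero, which would need standard errors from the Fisher information (omitted here).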
International Nuclear Information System (INIS)
Kroeger, W.; Mertens, J.
1985-01-01
As regards system-inherent risks, HTGR type reactors are evaluated with reference to the established light-water-moderated reactor types. Probabilistic HTGR risk analyses have shown modern HTGR systems to possess a balanced safety concept with a risk remaining distinctly below legally accepted values. Conversely, the development and optimization of the safety concepts have been (and are being) essentially co-determined by the probabilistic analyses, as it is technically sensible and economically necessary to render the specific safety-related HTGR properties eligible for licensing. (orig./HP)
Bayesian ARTMAP for regression.
Sasu, L M; Andonie, R
2013-10-01
Bayesian ARTMAP (BA) is a recently introduced neural architecture which uses a combination of Fuzzy ARTMAP competitive learning and Bayesian learning. Training is generally performed online, in a single epoch. During training, BA creates input data clusters as Gaussian categories, and also infers the conditional probabilities between input patterns and categories, and between categories and classes. During prediction, BA uses Bayesian posterior probability estimation. So far, BA has been used only for classification. The goal of this paper is to analyze the efficiency of BA for regression problems. Our contributions are: (i) we generalize the BA algorithm using the clustering functionality of both ART modules, and name it BA for Regression (BAR); (ii) we prove that BAR is a universal approximator with the best approximation property; in other words, BAR approximates arbitrarily well any continuous function (universal approximation) and, for every given continuous function, there is one in the set of BAR approximators situated at minimum distance (best approximation); (iii) we experimentally compare the online-trained BAR with several neural models on the following standard regression benchmarks: CPU Computer Hardware, Boston Housing, Wisconsin Breast Cancer, and Communities and Crime. Our results show that BAR is an appropriate tool for regression tasks, both for theoretical and practical reasons. Copyright © 2013 Elsevier Ltd. All rights reserved.
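Prediction over Gaussian categories can be illustrated concretely. The sketch below is not the BAR algorithm itself: a fixed, offline binning stands in for BA's online single-epoch category creation, and only the posterior-weighted prediction step is shown:

```python
import numpy as np

rng = np.random.default_rng(1)
x_train = rng.uniform(0, 2 * np.pi, 400)
y_train = np.sin(x_train) + rng.normal(0, 0.1, 400)

# Fixed binning stands in for BA's online category creation (illustrative only)
n_cat = 8
edges = np.linspace(0, 2 * np.pi, n_cat + 1)
idx = np.clip(np.digitize(x_train, edges) - 1, 0, n_cat - 1)
mus = np.array([x_train[idx == j].mean() for j in range(n_cat)])    # category centres
sigmas = np.array([x_train[idx == j].std() for j in range(n_cat)])  # category widths
priors = np.array([(idx == j).mean() for j in range(n_cat)])        # category priors
outs = np.array([y_train[idx == j].mean() for j in range(n_cat)])   # category outputs

def predict(x):
    """Posterior over Gaussian categories, then posterior-weighted category outputs."""
    lik = np.exp(-0.5 * ((x - mus) / sigmas) ** 2) / sigmas
    post = priors * lik
    return (post / post.sum()) @ outs

print(predict(np.pi / 2))   # a smoothed estimate near sin(pi/2)
```

The regression output is a posterior-weighted average of per-category output values, which is the sense in which BAR replaces BA's class-posterior decision with a continuous prediction.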
Aarnisalo, Kaarina; Vihavainen, Elina; Rantala, Leila; Maijala, Riitta; Suihko, Maija-Liisa; Hielm, Sebastian; Tuominen, Pirkko; Ranta, Jukka; Raaska, Laura
2008-02-10
Microbial risk assessment provides a means of estimating consumer risks associated with food products. The methods can also be applied at the plant level. In this study, results of microbiological analyses were used to develop a robust single-plant-level risk assessment. Furthermore, the prevalence and numbers of Listeria monocytogenes in marinated broiler legs in Finland were estimated. These estimates were based on information on the prevalence, numbers and genotypes of L. monocytogenes in 186 marinated broiler legs from 41 retail stores. The products were from three main Finnish producers, which produce 90% of all marinated broiler legs sold in Finland. The prevalence and numbers of L. monocytogenes were estimated by Monte Carlo simulation using WinBUGS, but the model is applicable to any software featuring standard probability distributions. The estimated mean annual number of L. monocytogenes-positive broiler legs sold in Finland was 7.2 x 10^6, with a 95% credible interval (CI) of 6.7 x 10^6 - 7.7 x 10^6. That would be 34% +/- 1% of the marinated broiler legs sold in Finland. The mean number of L. monocytogenes in marinated broiler legs estimated at the sell-by date was 2 CFU/g, with a 95% CI of 0-14 CFU/g. Producer-specific L. monocytogenes strains were recovered from the products throughout the year, which emphasizes the importance of characterizing the isolates and identifying strains that may cause problems as part of risk assessment studies. As the levels of L. monocytogenes were low, the risk of acquiring listeriosis from these products proved to be insignificant. Consequently there was no need for a thorough national-level risk assessment. However, an approach using worst-case and average point estimates was applied to produce an example of single-producer-level risk assessment based on limited data. This assessment also indicated that the risk from these products was low. The risk-based approach presented in this work can provide estimation of public health risk
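The prevalence estimate can be imitated with a simple Beta-Binomial Monte Carlo. The abstract's WinBUGS model is richer; here the positive count k, the sample size n, and the annual sales figure N are illustrative assumptions chosen to land near the reported ~34% prevalence:

```python
import numpy as np

rng = np.random.default_rng(0)
k, n = 63, 186          # assumed positives among the 186 sampled legs (~34%)
N = 21_000_000          # assumed annual number of marinated broiler legs sold

# Posterior for prevalence under a uniform Beta(1, 1) prior, scaled to annual sales
prev = rng.beta(k + 1, n - k + 1, size=100_000)
annual_pos = prev * N
ci = np.percentile(annual_pos, [2.5, 97.5])
print(annual_pos.mean(), ci)
```

The same simulation machinery extends to numbers of bacteria per leg by drawing counts conditional on a leg being positive, which is the structure the abstract describes.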
Energy Technology Data Exchange (ETDEWEB)
Lydmark, Sara; Hallbeck, Lotta (Microbial Analytics Sweden AB (Sweden))
2011-04-15
The MINICAN project is located at a depth of 450 m in the Aespoe Hard Rock Laboratory and was initiated to study how corrosion of the cast iron insert inside a perforated copper canister would evolve with time. Miniature canisters with different perforations, with and without bentonite buffer in steel cages, were installed and monitored. Samples for microbiology and gas composition, together with samples for groundwater chemistry, were analysed on three occasions, in 2007, 2008 and 2010. The results show how the microbial populations outside the canisters have evolved from a mixture of microorganisms able to grow on organic material, like heterotrophic organisms, and acetogens that grow on hydrogen gas and carbon dioxide in 2007, to populations with a large proportion of sulphate-reducing bacteria in 2010. The highest number of sulphate-reducing bacteria was found in MINICAN experiment A02C (canister with one hole at the top of the copper canister) in 2010, with 2.4 x 10^4 mL^-1, followed by 8 x 10^3 mL^-1 in A03 (hole in the bottom of the canister) and 7 x 10^3 mL^-1 in A06 (two holes at the top and no bentonite). The numbers of culturable heterotrophic bacteria were between 200 and 530 mL^-1 in the experiments with bentonite in 2007 but below detection in all experiments in 2010. The same trend was shown for acetogens. Measurable amounts of hydrogen gas were found in all experiments at all sampling occasions. There was no general trend for the amounts of hydrogen, but there was an increase in three of the experiments and in the groundwater outside MINICAN. It was found that the water chemistry differed between the A06 and A02-A04 experiments, with higher sulphate and chloride concentrations in A06 compared to the others. By plotting the concentrations of chloride and sulphate against time, a decrease in sulphate concentration was found in all canister experiments. The chloride concentrations were stable during the same period. On the other hand, an increase in sulphate
Logistic regression applied to natural hazards: rare event logistic regression with replications
Directory of Open Access Journals (Sweden)
M. Guns
2012-06-01
Full Text Available Statistical analysis of natural hazards needs particular attention, as most of these phenomena are rare events. This study shows that ordinary rare event logistic regression, as it is now commonly used in geomorphologic studies, does not always lead to a robust detection of controlling factors, as the results can be strongly sample-dependent. In this paper, we introduce concepts from Monte Carlo simulation into rare event logistic regression. This technique, termed rare event logistic regression with replications, combines the strengths of probabilistic and statistical methods and overcomes some of the limitations of previous developments through robust variable selection. The technique was developed here for the analysis of landslide controlling factors, but the concept is widely applicable to statistical analyses of natural hazards.
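The replication idea described in the abstract — refitting a rare event logistic regression on many resampled control sets and keeping only the predictors that are selected consistently — can be sketched as follows. This is a minimal illustration on synthetic data, not the authors' actual procedure: the effect sizes, the selection threshold of 0.5, and the replicate count are arbitrary assumptions.

```python
import math
import random

def fit_logistic(X, y, lr=0.5, iters=500):
    """Plain gradient-ascent fit of a logistic regression (no regularization)."""
    w = [0.0] * len(X[0])
    for _ in range(iters):
        g = [0.0] * len(w)
        for xi, yi in zip(X, y):
            p = 1.0 / (1.0 + math.exp(-sum(a * b for a, b in zip(w, xi))))
            for j, xij in enumerate(xi):
                g[j] += (yi - p) * xij
        w = [wj + lr * gj / len(y) for wj, gj in zip(w, g)]
    return w

random.seed(0)
# Synthetic rare-event data: 15 events among 200 controls (rows are [1, x1, x2]).
# x1 genuinely separates events from controls; x2 is pure noise.
events = [[1.0, random.gauss(2.0, 1.0), random.gauss(0.0, 1.0)] for _ in range(15)]
controls = [[1.0, random.gauss(0.0, 1.0), random.gauss(0.0, 1.0)] for _ in range(200)]

def selection_frequency(replications=30, threshold=0.5):
    """Refit on many control subsamples; count how often each slope clears the threshold."""
    hits = [0, 0]
    for _ in range(replications):
        sample = events + [random.choice(controls) for _ in range(45)]
        y = [1] * len(events) + [0] * 45
        w = fit_logistic(sample, y)
        for j in (1, 2):
            if abs(w[j]) > threshold:
                hits[j - 1] += 1
    return [h / replications for h in hits]

freq_x1, freq_x2 = selection_frequency()
```

A sample-dependent predictor is flagged only in some replicates, while a robust controlling factor is flagged in nearly all of them, which is the stability the paper exploits.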
Summary of Results from Analyses of Deposits of the Deep-Ocean Impact of the Eltanin Asteroid
Kyte, Frank T.; Kuhn, Gerhard; Gersonde, Rainer
2005-01-01
Deposits of the late Pliocene (2.5 Ma) Eltanin impact are unique in the known geological record. The only known example of a km-sized asteroid impacting a deep-ocean (5 km) basin, it is the most meteorite-rich locality known. The impact was discovered as an Ir anomaly in sediments from three cores collected in 1965 by the USNS Eltanin. These cores contained mm-sized shock-melted asteroid materials and unmelted meteorite fragments. Mineral chemistry of the meteorite fragments, and siderophile concentrations in melt rocks, indicate that the parent asteroid was a low-metal (4%) mesosiderite. A geological exploration of the impact in 1995 by Polarstern expedition ANT-XIV4, near the Freeden Seamounts (57.3S, 90.5W), successfully collected three cores with impact deposits. Analyses showed that sediments as old as Eocene were eroded by the impact disturbance and redeposited in three distinct units. The lowermost is a chaotic assemblage of sediment fragments up to 50 cm in size. Above this is a laminated sand-rich unit deposited as a turbulent flow, which is overlain by a finer-grained deposit of silts and clays that settled from a cloud of sediment suspended in the water column. Meteoritic ejecta particles were concentrated near the base of the uppermost unit, where coarse ejecta caught up with the disturbed sediment. Here we will present results from a new suite of cores collected on Polarstern expedition ANT-XVIIU5a. In 2001, the Polarstern returned to the impact area and explored a region of 80,000 sq-km, collecting at least 16 sediment cores with meteoritic ejecta. The known strewn field extends over a region 660 by 200 km. The meteoritic ejecta is most concentrated in cores on the Freeden Seamounts and in the basins to the north, where the amount of meteoritic material deposited on the ocean floor was as much as 3 g/sq-cm. These concentrations drop off to the north and the east to levels as low as approximately 0.1 g/sq-cm. We were unable to sample the
DEFF Research Database (Denmark)
Johansen, Søren
2008-01-01
The reduced rank regression model is a multivariate regression model with a coefficient matrix with reduced rank. The reduced rank regression algorithm is an estimation procedure, which estimates the reduced rank regression model. It is related to canonical correlations and involves calculating...
Hegazy, Maha A.; Lotfy, Hayam M.; Mowaka, Shereen; Mohamed, Ekram Hany
2016-07-01
Wavelets have been adapted for a vast number of signal-processing applications due to the amount of information that can be extracted from a signal. In this work, a comparative study was conducted on the efficiency of the continuous wavelet transform (CWT) as a signal-processing tool in univariate regression and as a pre-processing tool in multivariate analysis using partial least squares (CWT-PLS). These were applied to complex spectral signals of ternary and quaternary mixtures. The CWT-PLS method succeeded in the simultaneous determination of a quaternary mixture of drotaverine (DRO), caffeine (CAF), paracetamol (PAR) and p-aminophenol (PAP, the major impurity of paracetamol). The univariate CWT, in contrast, failed to simultaneously determine the quaternary mixture components and was able to determine only PAR and PAP, and the ternary mixtures of DRO, CAF, and PAR and of CAF, PAR, and PAP. During the CWT calculations, different wavelet families were tested. The univariate CWT method was validated according to the ICH guidelines. For the development of the CWT-PLS model, a calibration set was prepared by means of an orthogonal experimental design, and the absorption spectra were recorded and processed by CWT. The CWT-PLS model was constructed by regression between the wavelet coefficients and the concentration matrices, and validation was performed by both cross-validation and external validation sets. Both methods were successfully applied to the determination of the studied drugs in pharmaceutical formulations.
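As a rough illustration of the univariate CWT idea — correlating a signal with shifted, scaled copies of a mother wavelet — here is a minimal discretized transform using the Mexican-hat wavelet. The wavelet choice, the scales, and the synthetic signal are illustrative assumptions, not the spectral data of the study.

```python
import math

def mexican_hat(t):
    """Mexican-hat (Ricker) mother wavelet."""
    return (1.0 - t * t) * math.exp(-t * t / 2.0)

def cwt_coefficient(signal, scale, shift):
    """One coefficient of a discretized continuous wavelet transform."""
    return sum(x * mexican_hat((n - shift) / scale)
               for n, x in enumerate(signal)) / math.sqrt(scale)

# A test signal that is itself a Mexican hat at scale 8 centred at sample 64:
signal = [mexican_hat((n - 64) / 8.0) for n in range(128)]
c_matched = cwt_coefficient(signal, scale=8, shift=64)
c_mismatched = cwt_coefficient(signal, scale=32, shift=64)
```

The coefficient is largest where scale and position match a feature in the signal, which is what makes the CWT useful for isolating overlapping spectral bands before univariate calibration or PLS modeling.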
Yi, Jun; Yang, Wenhong; Sun, Wen-Hua; Nomura, Kotohiro; Hada, Masahiko
2017-11-30
The NMR chemical shifts of vanadium (51V) in (imido)vanadium(V) dichloride complexes with imidazolin-2-iminato and imidazolidin-2-iminato ligands were calculated by the density functional theory (DFT) method with GIAO. The calculated 51V NMR chemical shifts were analyzed by the multiple linear regression (MLR) analysis (MLRA) method with a series of calculated molecular properties. Some of the calculated NMR chemical shifts were incorrect when the optimized molecular geometries of the X-ray structures were used. After the global minimum geometries of all of the molecules were determined, the trend of the observed chemical shifts was well reproduced by the present DFT method. The MLRA method was used to investigate the correlation between the 51V NMR chemical shift and the natural charge, band energy gap, and Wiberg bond index of the V═N bond. The 51V NMR chemical shifts obtained with the present MLR model reproduced the observed values well, with a correlation coefficient of 0.97.
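The MLRA step — regressing the observed shift on several computed molecular properties and judging the fit by its correlation coefficient — can be sketched like this. The descriptor values and coefficients below are made-up numbers chosen only so the algebra is exact; they are not the paper's data.

```python
import math

def solve(A, b):
    """Gauss-Jordan solve of A x = b for small dense systems."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(n):
            if r != c and M[r][c] != 0.0:
                f = M[r][c] / M[c][c]
                M[r] = [u - f * v for u, v in zip(M[r], M[c])]
    return [M[i][n] / M[i][i] for i in range(n)]

def mlr(rows, y):
    """Ordinary least squares via the normal equations X'X b = X'y."""
    X = [[1.0] + list(r) for r in rows]
    k = len(X[0])
    XtX = [[sum(r[i] * r[j] for r in X) for j in range(k)] for i in range(k)]
    Xty = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(k)]
    return solve(XtX, Xty)

def pearson_r(a, b):
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = math.sqrt(sum((x - ma) ** 2 for x in a) * sum((y - mb) ** 2 for y in b))
    return num / den

# Hypothetical descriptors: (natural charge, band gap, Wiberg bond index).
descriptors = [(0.1, 1.0, 2.0), (0.2, 2.0, 2.1), (0.3, 1.0, 1.9),
               (0.4, 3.0, 2.2), (0.5, 2.0, 2.0)]
shifts = [100 + 50 * q + 10 * g - 30 * w for q, g, w in descriptors]
beta = mlr(descriptors, shifts)
predicted = [beta[0] + beta[1] * q + beta[2] * g + beta[3] * w
             for q, g, w in descriptors]
```

The correlation coefficient between observed and model-predicted shifts is the figure of merit the abstract reports (0.97 for the real data).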
DEFF Research Database (Denmark)
Motzer, Robert J; Ravaud, Alain; Patard, Jean-Jacques
2018-01-01
BACKGROUND: Adjuvant sunitinib significantly improved disease-free survival (DFS) versus placebo in patients with locoregional renal cell carcinoma (RCC) at high risk of recurrence after nephrectomy (hazard ratio [HR] 0.76, 95% confidence interval [CI] 0.59-0.98; p=0.03). OBJECTIVE: To report...... sunitinib over placebo was observed across subgroups, including: higher risk (T3, no or undetermined nodal involvement, Fuhrman grade ≥2, ECOG PS ≥1, T4 and/or nodal involvement; hazard ratio [HR] 0.74, 95% confidence interval [CI] 0.55-0.99; p=0.04), NLR ≤3 (HR 0.72, 95% CI 0.54-0.95; p=0.02), and Fuhrman...... grade 3/4 (HR 0.73, 95% CI 0.55-0.98; p=0.04). All subgroup analyses were exploratory, and no adjustments for multiplicity were made. Median OS was not reached in either arm (HR 0.92, 95% CI 0.66-1.28; p=0.6); 67 and 74 patients died in the sunitinib and placebo arms, respectively. CONCLUSIONS...
Regression to Causality : Regression-style presentation influences causal attribution
DEFF Research Database (Denmark)
Bordacconi, Mats Joe; Larsen, Martin Vinæs
2014-01-01
of equivalent results presented as either regression models or as a test of two sample means. Our experiment shows that the subjects who were presented with results as estimates from a regression model were more inclined to interpret these results causally. Our experiment implies that scholars using regression...... models – one of the primary vehicles for analyzing statistical results in political science – encourage causal interpretation. Specifically, we demonstrate that presenting observational results in a regression model, rather than as a simple comparison of means, makes causal interpretation of the results...... more likely. Our experiment drew on a sample of 235 university students from three different social science degree programs (political science, sociology and economics), all of whom had received substantial training in statistics. The subjects were asked to compare and evaluate the validity...
Fleischmann, Robert; Tränkner, Steffi; Bathe-Peters, Rouven; Rönnefarth, Maria; Schmidt, Sein; Schreiber, Stephan J; Brandt, Stephan A
2018-03-01
The lack of objective disease markers is a major cause of misdiagnosis and nonstandardized approaches in delirium. Recent studies conducted in well-selected patients and confined study environments suggest that quantitative electroencephalography (qEEG) can provide such markers. We hypothesize that qEEG helps remedy diagnostic uncertainty not only in well-defined study cohorts but also in a heterogeneous hospital population. In this retrospective case-control study, EEG power spectra of delirious patients and age-/gender-matched controls (n = 31 and n = 345, respectively) were fitted in a linear model to test their performance as binary classifiers. We subsequently evaluated the diagnostic performance of the best classifiers in control samples with normal EEGs (n = 534) and real-world samples including pathologic findings (n = 4294). Test reliability was estimated through split-half analyses. We found that the combination of spectral power at F3-P4 at 2 Hz (area under the curve [AUC] = .994) and C3-O1 at 19 Hz (AUC = .993) provided a sensitivity of 100% and a specificity of 99% to identify delirious patients among normal controls. These classifiers also yielded a false positive rate as low as 5% and increased the pretest probability of being delirious by 57% in an unselected real-world sample. Split-half reliabilities were .98 and .99, respectively. This retrospective study yielded preliminary evidence that qEEG provides excellent diagnostic performance to identify delirious patients even outside confined study environments. It furthermore revealed reduced beta power as a novel specific finding in delirium and that a normal EEG excludes delirium. Prospective studies including parameters of pretest probability and delirium severity are required to elaborate on these promising findings.
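The area under the ROC curve, used above to rank the spectral classifiers, has a simple rank interpretation: the probability that a randomly chosen case scores higher than a randomly chosen control. A minimal sketch; the score values below are invented, not the study's qEEG data.

```python
def roc_auc(case_scores, control_scores):
    """AUC as the fraction of case/control pairs ordered correctly (ties count half)."""
    wins = sum((c > k) + 0.5 * (c == k)
               for c in case_scores for k in control_scores)
    return wins / (len(case_scores) * len(control_scores))

# Hypothetical low-frequency spectral power for delirious patients vs controls:
auc = roc_auc([9.1, 8.7, 7.9, 8.2], [3.1, 4.0, 3.6, 8.0])
```

An AUC near 1 (the abstract reports .994 and .993 for its two classifiers) means the two score distributions barely overlap, which is why a single cutoff can yield both high sensitivity and high specificity.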
Lamparter, J; Dick, H B; Krummenauer, Frank
2005-09-12
Laser in situ keratomileusis (LASIK) means a patient investment of 2426 Euro per eye, which usually cannot be funded by European health care insurers. In the context of recent resource allocation discussions, however, the cost effectiveness of LASIK could become an important indicator for allocation decisions. Therefore an evidence-based estimation of its incremental cost effectiveness was intended. Three independent meta-analyses were implemented to estimate the refractive gain (dpt) due to conventional LASIK procedures as well as its predictability (%) (fraction of eyes achieving a postoperative refraction within a maximum deviation of +/- 0.5 dpt from the target refraction). Study reports from 1995-2004 (English or German language) were screened for appropriate key words. Meta effects in refractive gain and predictability were estimated by means and standard deviations of the reported effect measures. Cost data were estimated by German DRG rates and individual clinical pathway calculations; cost effectiveness was then computed in terms of the incremental cost effectiveness ratio (ICER) for both clinical benefit endpoints. A sensitivity analysis comprised cost variations of +/- 10% and utility variations alongside the meta effects' 95% confidence intervals. Total direct costs from the patients' perspective were estimated at 2426 Euro per eye, associated with a refractive meta benefit of 5.93 dpt (95% meta confidence interval 5.32-6.54 dpt) and a meta predictability of 67% (43%-91%). In terms of incremental costs, unilateral LASIK implied a patient investment of 409 Euro (sensitivity range 351-473 Euro) per gained refractive unit or 36 Euro (27-56 Euro) per gained percentage point in predictability. When LASIK-associated complication patterns were considered, the total direct costs amounted to 3075 Euro, resulting in incremental costs of 519 Euro/dpt (sensitivity range 445-600 Euro/dpt) or 46 Euro/% (34-72 Euro/%). Most frequently
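The ICER arithmetic behind those figures is just the incremental cost divided by the incremental effect. A sketch using the abstract's totals (2426 Euro, 5.93 dpt refractive gain, 67% predictability) against a no-treatment comparator with zero cost and zero effect — the comparator is our simplifying assumption:

```python
def icer(cost_new, cost_old, effect_new, effect_old):
    """Incremental cost-effectiveness ratio: extra cost per extra unit of effect."""
    return (cost_new - cost_old) / (effect_new - effect_old)

cost_per_diopter = icer(2426, 0, 5.93, 0)  # ~409 Euro per gained refractive unit
cost_per_point = icer(2426, 0, 67, 0)      # ~36 Euro per percentage point predictability
```

The same two-argument differences are what make the sensitivity analysis straightforward: varying costs by +/- 10% or effects across their confidence intervals just moves the numerator or denominator.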
Mohammad, Fahim; Theisen-Toupal, Jesse C; Arnaout, Ramy
2014-01-01
Laboratory testing is the single highest-volume medical activity, making it useful to ask how well one can anticipate whether a given test result will be high, low, or within the reference interval ("normal"). We analyzed 10 years of electronic health records--a total of 69.4 million blood tests--to see how well standard rule-mining techniques can anticipate test results based on patient age and gender, recent diagnoses, and recent laboratory test results. We evaluated rules according to their positive and negative predictive value (PPV and NPV) and area under the receiver-operator characteristic curve (ROC AUCs). Using a stringent cutoff of PPV and/or NPV≥0.95, standard techniques yield few rules for sendout tests but several for in-house tests, mostly for repeat laboratory tests that are part of the complete blood count and basic metabolic panel. Most rules were clinically and pathophysiologically plausible, and several seemed clinically useful for informing pre-test probability of a given result. But overall, rules were unlikely to be able to function as a general substitute for actually ordering a test. Improving laboratory utilization will likely require different input data and/or alternative methods.
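The screening rule used in the abstract — keep a candidate rule only if its PPV and/or NPV clears the stringent 0.95 cutoff — reduces to simple confusion-matrix arithmetic. A sketch with invented counts:

```python
def ppv_npv(tp, fp, tn, fn):
    """Positive and negative predictive value from confusion-matrix counts."""
    ppv = tp / (tp + fp) if (tp + fp) else None
    npv = tn / (tn + fn) if (tn + fn) else None
    return ppv, npv

def keep_rule(tp, fp, tn, fn, cutoff=0.95):
    """Stringent cutoff from the abstract: PPV and/or NPV >= 0.95."""
    ppv, npv = ppv_npv(tp, fp, tn, fn)
    return any(v is not None and v >= cutoff for v in (ppv, npv))
```

Because PPV and NPV depend on prevalence, a rule that passes on repeat in-house tests (where abnormal results recur) can fail on sendout tests, which matches the pattern the study reports.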
Mason, Cindi; Twomey, Janet; Wright, David; Whitman, Lawrence
2018-01-01
As the need for engineers continues to increase, a growing focus has been placed on recruiting students into the field of engineering and retaining the students who select engineering as their field of study. As a result of this concentration on student retention, numerous studies have been conducted to identify, understand, and confirm…
Directory of Open Access Journals (Sweden)
Jens Baumert
Full Text Available Plasma fibrinogen is an acute phase protein that plays an important role in the blood coagulation cascade and has strong associations with smoking, alcohol consumption and body mass index (BMI). Genome-wide association studies (GWAS) have identified a variety of gene regions associated with elevated plasma fibrinogen concentrations. However, little is yet known about how associations between environmental factors and fibrinogen might be modified by genetic variation. We therefore conducted large-scale meta-analyses of genome-wide interaction studies to identify possible interactions of genetic variants and smoking status, alcohol consumption or BMI on fibrinogen concentration. The present study included 80,607 subjects of European ancestry from 22 studies. Genome-wide interaction analyses were performed separately in each study for about 2.6 million single nucleotide polymorphisms (SNPs) across the 22 autosomal chromosomes. For each SNP and risk factor, we performed a linear regression under an additive genetic model including an interaction term between SNP and risk factor. Interaction estimates were meta-analysed using a fixed-effects model. No genome-wide significant interaction with smoking status, alcohol consumption or BMI was observed in the meta-analyses. The most suggestive interaction was found for smoking and rs10519203, located in the LOC123688 region on chromosome 15, with a p value of 6.2 × 10^-8. This large genome-wide interaction study including 80,607 participants found no strong evidence of interaction between genetic variants and smoking status, alcohol consumption or BMI on fibrinogen concentrations. Further studies are needed to yield deeper insight into the interplay between environmental factors and gene variants in the regulation of fibrinogen concentrations.
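Each per-SNP scan described above boils down to fitting a linear model with a gene-environment interaction term, y = b0 + b1·SNP + b2·E + b3·SNP×E, and testing b3. A minimal normal-equations sketch on made-up allele counts and smoking status (not the consortium data):

```python
def solve(A, b):
    """Gauss-Jordan solve of A x = b for small dense systems."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(n):
            if r != c and M[r][c] != 0.0:
                f = M[r][c] / M[c][c]
                M[r] = [u - f * v for u, v in zip(M[r], M[c])]
    return [M[i][n] / M[i][i] for i in range(n)]

def interaction_fit(snp, env, y):
    """OLS fit of y ~ 1 + snp + env + snp*env; returns [b0, b1, b2, b3]."""
    X = [[1.0, s, e, s * e] for s, e in zip(snp, env)]
    XtX = [[sum(r[i] * r[j] for r in X) for j in range(4)] for i in range(4)]
    Xty = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(4)]
    return solve(XtX, Xty)

snp = [0, 1, 2, 0, 1, 2]          # minor-allele counts (additive coding)
env = [0, 0, 0, 1, 1, 1]          # smoking status
y = [1 + 0.5 * s + 2.0 * e + 0.8 * s * e for s, e in zip(snp, env)]
b = interaction_fit(snp, env, y)  # b[3] is the SNP-by-environment effect
```

In the actual meta-analysis, this fit is repeated per SNP in each cohort and the b3 estimates are pooled with fixed-effects weights before applying the genome-wide significance threshold.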
Interpreting Multiple Linear Regression: A Guidebook of Variable Importance
Nathans, Laura L.; Oswald, Frederick L.; Nimon, Kim
2012-01-01
Multiple regression (MR) analyses are commonly employed in social science fields. It is also common for the interpretation of results to reflect an overreliance on beta weights, often resulting in very limited interpretations of variable importance. It appears that few researchers employ other methods to obtain a fuller understanding of what…
Retro-regression--another important multivariate regression improvement.
Randić, M
2001-01-01
We review the serious problem associated with instabilities of the coefficients of regression equations, referred to as the MRA (multivariate regression analysis) "nightmare of the first kind". This is manifested when, in a stepwise regression, a descriptor is included in or excluded from the regression. The consequence is an unpredictable change in the coefficients of the descriptors that remain in the regression equation. We follow with consideration of an even more serious problem, referred to as the MRA "nightmare of the second kind", which arises when optimal descriptors are selected from a large pool of descriptors. This process typically causes, at different steps of the stepwise regression, the replacement of several previously used descriptors by new ones. We describe a procedure that resolves these difficulties. The approach is illustrated on the boiling points of nonanes, which are considered (1) by using an ordered connectivity basis; (2) by using an ordering resulting from application of the greedy algorithm; and (3) by using an ordering derived from an exhaustive search for optimal descriptors. A novel variant of multiple regression analysis, called retro-regression (RR), is outlined, showing how it resolves the ambiguities associated with both "nightmares" of the first and the second kind of MRA.
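The "nightmare of the first kind" — coefficients changing unpredictably when a correlated descriptor enters the equation — is easy to reproduce with two nearly collinear descriptors. A toy sketch on invented data, not the nonane descriptor sets of the paper:

```python
def slope(x, y):
    """Least-squares slope of a simple one-descriptor regression."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    return (sum((a - mx) * (b - my) for a, b in zip(x, y))
            / sum((a - mx) ** 2 for a in x))

def joint_slopes(x1, x2, y):
    """OLS slopes of y ~ 1 + x1 + x2, via the centred 2x2 normal equations."""
    n = len(y)
    c1 = [v - sum(x1) / n for v in x1]
    c2 = [v - sum(x2) / n for v in x2]
    cy = [v - sum(y) / n for v in y]
    s11 = sum(a * a for a in c1)
    s22 = sum(a * a for a in c2)
    s12 = sum(a * b for a, b in zip(c1, c2))
    s1y = sum(a * b for a, b in zip(c1, cy))
    s2y = sum(a * b for a, b in zip(c2, cy))
    det = s11 * s22 - s12 * s12
    return ((s1y * s22 - s2y * s12) / det, (s2y * s11 - s1y * s12) / det)

x1 = [1.0, 2.0, 3.0, 4.0, 5.0]
x2 = [1.1, 1.9, 3.2, 3.8, 5.0]          # nearly collinear with x1
y = [a + b for a, b in zip(x1, x2)]     # true model: y = x1 + x2 exactly
b_alone = slope(x1, y)                  # absorbs x2's share: close to 2
b1_joint, b2_joint = joint_slopes(x1, x2, y)  # each drops back to 1
```

The coefficient of x1 roughly halves the moment x2 enters, even though the underlying relationship never changed; orthogonalized descriptor orderings of the kind the paper advocates remove exactly this ambiguity.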
Forbes, Valery E; Aufderheide, John; Warbritton, Ryan; van der Hoeven, Nelly; Caspers, Norbert
2007-03-01
This study presents results of the effects of bisphenol A (BPA) on adult egg production, egg hatchability, egg development rates and juvenile growth rates in the freshwater gastropod, Marisa cornuarietis. We observed no adult mortality, substantial inter-snail variability in reproductive output, and no effects of BPA on reproduction during 12 weeks of exposure to 0, 0.1, 1.0, 16, 160 or 640 microg/L BPA. We observed no effects of BPA on egg hatchability or timing of egg hatching. Juveniles showed good growth in the control and all treatments, and there were no significant effects of BPA on this endpoint. Our results do not support previous claims of enhanced reproduction in Marisa cornuarietis in response to exposure to BPA. Statistical power analysis indicated high levels of inter-snail variability in the measured endpoints and highlighted the need for sufficient replication when testing treatment effects on reproduction in M. cornuarietis with adequate power.
Directory of Open Access Journals (Sweden)
Ana-Maria Budai
2013-05-01
Full Text Available This paper presents the results of a study conducted to establish the influence of the number of finite elements on the computed load of a structure. Specifically, the study is a linear static analysis of a link gear control mechanism of a Kaplan turbine. The entire analysis was carried out for normal operating conditions, with the final goal of determining the service life of the mentioned mechanism.
1988-03-01
De Angelis, Gessica
2014-01-01
The present study adopts a multilingual approach to analysing the standardized test results of primary school immigrant children living in the bi-/multilingual context of South Tyrol, Italy. The standardized test results are from the Invalsi test administered across Italy in 2009/2010. In South Tyrol, several languages are spoken on a daily basis…
International Nuclear Information System (INIS)
Chirico, Robert D.; Kazakov, Andrei F.
2015-01-01
Highlights: • Heat capacities were measured for the temperature range (5 to 520) K. • The enthalpy of combustion was measured and the enthalpy of formation was derived. • Thermodynamic-consistency analysis resolved inconsistencies in literature enthalpies of sublimation. • An inconsistency in literature enthalpies of combustion was resolved. • Application of computational chemistry in consistency analysis was demonstrated successfully. - Abstract: Heat capacities and phase-transition properties for xanthone (IUPAC name 9H-xanthen-9-one and Chemical Abstracts registry number [90-47-1]) are reported for the temperature range 5 < T/K < 524. Statistical calculations were performed and thermodynamic properties for the ideal gas were derived based on molecular geometry optimization and vibrational frequencies calculated at the B3LYP/6-31+G(d,p) level of theory. These results are combined with sublimation pressures from the literature to allow critical evaluation of inconsistent enthalpies of sublimation for xanthone, also reported in the literature. Literature values for the enthalpy of combustion of xanthone are re-assessed, a revision is recommended for one result, and a new value for the enthalpy of formation of the ideal gas is derived. Comparisons with thermophysical properties reported in the literature are made for all other reported and derived properties, where possible
System to monitor data analyses and results of physics data validation between pulses at DIII-D
International Nuclear Information System (INIS)
Flanagan, S.; Schachter, J.M.; Schissel, D.P.
2004-01-01
A data analysis monitoring (DAM) system has been developed to monitor between pulse physics analysis at the DIII-D National Fusion Facility (http://nssrv1.gat.com:8000/dam). The system allows for rapid detection of discrepancies in diagnostic measurements or the results from physics analysis codes. This enables problems to be detected and possibly fixed between pulses as opposed to after the experimental run has concluded, thus increasing the efficiency of experimental time. An example of a consistency check is comparing the experimentally measured neutron rate and the expected neutron emission, RDD0D. A significant difference between these two values could indicate a problem with one or more diagnostics, or the presence of unanticipated phenomena in the plasma. This system also tracks the progress of MDSplus dispatched data analysis software and the loading of analyzed data into MDSplus. DAM uses a Java Servlet to receive messages, C Language Integrated Production System to implement expert system logic, and displays its results to multiple web clients via Hypertext Markup Language. If an error is detected by DAM, users can view more detailed information so that steps can be taken to eliminate the error for the next pulse
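The consistency check described — flagging a pulse when the measured neutron rate strays too far from the expected RDD0D value — is essentially a relative-deviation test. A minimal sketch; the 20% tolerance and the rate values are arbitrary assumptions, not DIII-D's actual thresholds or data:

```python
def consistent(measured, expected, tolerance=0.2):
    """True if the measured value is within a relative tolerance of the expected one."""
    if expected == 0:
        return measured == 0
    return abs(measured - expected) / abs(expected) <= tolerance

# Hypothetical neutron rates (arbitrary units) for two pulses:
ok = consistent(1.1e14, 1.0e14)         # 10% off: within tolerance
alarm = not consistent(2.0e14, 1.0e14)  # 100% off: flag for investigation
```

Running such checks between pulses, rather than after the run, is what lets a diagnostic fault be found while there is still experimental time to fix it.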
International Nuclear Information System (INIS)
FLANAGAN, A; SCHACHTER, J.M; SCHISSEL, D.P
2003-01-01
A Data Analysis Monitoring (DAM) system has been developed to monitor between pulse physics analysis at the DIII-D National Fusion Facility (http://nssrv1.gat.com:8000/dam). The system allows for rapid detection of discrepancies in diagnostic measurements or the results from physics analysis codes. This enables problems to be detected and possibly fixed between pulses as opposed to after the experimental run has concluded thus increasing the efficiency of experimental time. An example of a consistency check is comparing the experimentally measured neutron rate and the expected neutron emission, RDD0D. A significant difference between these two values could indicate a problem with one or more diagnostics, or the presence of unanticipated phenomena in the plasma. This new system also tracks the progress of MDSplus dispatched data analysis software and the loading of analyzed data into MDSplus. DAM uses a Java Servlet to receive messages, CLIPS to implement expert system logic, and displays its results to multiple web clients via HTML. If an error is detected by DAM, users can view more detailed information so that steps can be taken to eliminate the error for the next pulse
Directory of Open Access Journals (Sweden)
K. A. Trukhanov
2014-01-01
Full Text Available State-of-the-art engineering now enables people who have lost a lower limb to continue their previous life despite the loss. International companies working in this area strive to minimize the problems that amputation causes for human movement, and research to create an optimal design of the artificial knee joint is under way. The task of this work was to define analytical relationships for the changing kinematic parameters of human walking on a flat surface, such as the knee joint angle and knee torque (moment), to determine the reduced load on the knee actuator (A), and to compare the obtained results with experimental data. As the actuator in the created design, the article proposes a controlled shock absorber based on a hydraulic cylinder. The knee unit is a two-link kinematic mechanism: one link performs rotational motion, and the other performs rotational-translational motion to drive the rotation of the first. When studying the dynamics of the hydraulic actuator, the piston position x (or ρ) is chosen as the generalized coordinate, while in the study of link movements the angle β is preferable. Experimental data for a human with a body weight of 57.6 kg walking on a flat surface, used to estimate the knee joint angle, speed, acceleration, torque, and power, are taken from the published works of foreign authors. A trigonometric approximation was used for fitting the experimental data. The resulting dependence of the reduced load on the actuator rod is necessary to perform the synthesis of the actuator. The criterion for linear mechanisms mentioned in D.N. Popov's work is advisable as a possible optimization criterion for the actuator. The results obtained are as follows: 1. The kinematics of the linkage mechanism is described using relationships between its geometrical parameters, namely the cylinder piston stroke x (or ρ) and the link angle β. 2. The obtained polynomials of kinematic relationships allow a synthesis of
International Nuclear Information System (INIS)
Zeleznik, Nadja; Kralj, Metka; Lokner, Vladimir; Levanat, Ivica; Rapic, Andrea; Mele, Irena
2010-01-01
The preparation of the new revision of the Decommissioning and Spent Fuel (SF) and Low and Intermediate Level Waste (LILW) Disposal Program for NPP Krsko (the Program) started in September 2008, after acceptance of the Terms of Reference for the work by the Intergovernmental Committee responsible for implementation of the agreement between the governments of Slovenia and Croatia on the status and other legal issues related to the investment, exploitation, and decommissioning of the Nuclear Power Plant Krsko. The responsible organizations, APO and ARAO, together with NEK, prepared new technical and financial data and the relevant inputs for the new revision, in which several scenarios based on the accepted boundary conditions were investigated. The strategy of immediate dismantling was analyzed for the planned and extended NPP lifetimes, together with the linked radioactive waste and spent fuel management, to calculate the yearly annuity to be paid by the owners into the decommissioning funds in Slovenia and Croatia. The new Program incorporated, among others, new data on the LILW repository, including the costs for siting, construction, and operation of silos at the Vrbina location in Krsko municipality; the site-specific Preliminary Decommissioning Plan for NPP Krsko, which included, besides dismantling and decontamination approaches, site-specific activated and contaminated radioactive waste; and results from the reference scenario for spent fuel disposal, though at a very early stage. Important inputs for the calculations also included the new amounts of compensation to the local communities for different nuclear facilities, taken from the supplemented Slovenian regulation, and updated fiscal parameters (inflation, interest, discount factors) used in the financial model, based on current developments in the economic environment. From the obtained data, the nominal and discounted costs for the whole nuclear program related to NPP Krsko, which is jointly owned by Slovenia and Croatia, have
DEFF Research Database (Denmark)
Fitzenberger, Bernd; Wilke, Ralf Andreas
2015-01-01
if the mean regression model does not. We provide a short informal introduction into the principle of quantile regression which includes an illustrative application from empirical labor market research. This is followed by briefly sketching the underlying statistical model for linear quantile regression based......Quantile regression is emerging as a popular statistical approach, which complements the estimation of conditional mean models. While the latter only focuses on one aspect of the conditional distribution of the dependent variable, the mean, quantile regression provides more detailed insights...... by modeling conditional quantiles. Quantile regression can therefore detect whether the partial effect of a regressor on the conditional quantiles is the same for all quantiles or differs across quantiles. Quantile regression can provide evidence for a statistical relationship between two variables even...
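Quantile regression replaces the squared-error loss of mean regression with the asymmetric "pinball" (check) loss; minimizing it over a constant recovers the sample quantile. A minimal location-only sketch with invented data (a full quantile regression would optimize the same loss over coefficients):

```python
def pinball_loss(values, q, tau):
    """Check loss: overshoot weighted by tau, undershoot by 1 - tau."""
    return sum((tau if v >= q else tau - 1.0) * (v - q) for v in values)

def empirical_quantile(values, tau):
    """A data point minimizing the pinball loss is a tau-quantile of the sample."""
    return min(values, key=lambda q: pinball_loss(values, q, tau))

data = [1.0, 3.0, 5.0, 7.0, 100.0]
median = empirical_quantile(data, 0.5)  # robust to the 100.0 outlier
```

Because the tau = 0.5 loss weights over- and undershoot equally, the minimizer is the median; other values of tau trace out the rest of the conditional distribution, which is what lets quantile regression detect effects at the tails that the mean model misses.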
DEFF Research Database (Denmark)
Hansen, Henrik; Tarp, Finn
2001-01-01
This paper examines the relationship between foreign aid and growth in real GDP per capita as it emerges from simple augmentations of popular cross-country growth specifications. It is shown that aid in all likelihood increases the growth rate, and this result is not conditional on 'good' policy. … investment. We conclude by stressing the need for more theoretical work before this kind of cross-country regression is used for policy purposes.
Bottari, C.; Albano, M.; Capizzi, P.; D'Alessandro, A.; Doumaz, F.; Martorana, R.; Moro, M.; Saroli, M.
2018-01-01
Seismotectonic activity and slope instability are a permanent threat in the archaeological site of Abakainon and in the nearby village of Tripi in NE Sicily. In recent times, signs of an ancient earthquake have been identified in the necropolis of Abakainon, the dating of which was ascertained to the first-century AD earthquake. The site is located on a slope of the Peloritani Mts. along the Tindari Fault Line and contains evidence of an earthquake-induced landslide, including fallen columns and blocks, horizontal shift, and counter-slope tilting of the tomb basements. In this paper, we used an integrated geomorphological and geophysical analysis to constrain the landslide. The research was directed at the acquisition of deep geological data for the reconstruction of the slope process and the thickness of the mobilized materials. The applied geophysical techniques included seismic refraction tomography and electrical resistivity tomography. The surveys were performed to delineate the sliding surface and to assess approximately the thickness of the mobilized materials. The geophysical and geomorphological data confirmed the presence of different overlapping landslides in the studied area. Moreover, a numerical simulation of the slope under seismic loads supports the hypothesis of a mobilization of the landslide mass in case of strong earthquakes (PGA > 0.3 g). However, the numerical results highlight that the main cause of destruction for the Abakainon necropolis is the amplification of the seismic waves, occasionally accompanied by surficial sliding.
Monitoring result analyses of high slope of five-step ship lock in the Three Gorges Project
Directory of Open Access Journals (Sweden)
Qixiang Fan
2015-04-01
The construction of the double-lane five-step ship lock of the Three Gorges Project (TGP) commenced in 1994, the excavation of the ship lock was completed by the end of 1999, and the ship lock was put into operation in June 2003. The side slopes of the ship lock are characterized by great height (170 m), steepness (upright slope 70 m in height), and great length (over 7000 m in total). The rock masses surrounding the ship lock slopes have a high potential to deform, and the magnitude of this deformation must be restricted. Monitoring results show that the deformation of the five-step ship lock high slopes of the TGP occurred primarily during the excavation period and tended to be stable and convergent during the operation period, remaining within allowable ranges. At present, the slopes and lock chambers are stable, and the ship lock works well under normal operation conditions, enabling the social and economic benefits of the TGP.
Analysing the economy-wide effects of the energy tax: results for Australia from the ORANI-E model
Energy Technology Data Exchange (ETDEWEB)
McDougall, R.A.; Dixon, P.B. [Monash Univ., Clayton, VIC (Australia); Australian Bureau of Agricultural and Resource Economics (ABARE), Canberra, ACT (Australia)
1996-12-31
Since the mid-1980s, economists have devoted considerable effort to greenhouse issues. Among the questions to which they have sought answers are: what would be the effects on economic growth and employment of adopting different approaches to restricting greenhouse gas emissions, and what are the distributional effects of restricting greenhouse gas emissions, i.e., how would income and economic activity be re-allocated between countries, between industries, and between income classes? One approach to reducing greenhouse gas emissions is to impose taxes on the use of fossil fuels. Such a policy might, however, cause short-run economic disruption. This issue is investigated for Australia using a general equilibrium model, ORANI-E. The short-run effects of an energy tax are shown to depend on what is done with the tax revenue, how the labour market reacts, and on the substitution possibilities between energy, capital and labour. Overall, the results indicate that energy taxes need not be damaging to the macro-economy. (author). 5 tabs., 2 figs., refs.
Gay, Charles W; Robinson, Michael E; Lai, Song; O'Shea, Andrew; Craggs, Jason G; Price, Donald D; Staud, Roland
2016-02-01
Although altered resting-state functional connectivity (FC) is a characteristic of many chronic pain conditions, it has not yet been evaluated in patients with chronic fatigue. Our objective was to investigate the association between fatigue and altered resting-state FC in myalgic encephalomyelitis/chronic fatigue syndrome (ME/CFS). Thirty-six female subjects, 19 ME/CFS and 17 healthy controls, completed a fatigue inventory before undergoing functional magnetic resonance imaging. Two methods, (1) data driven and (2) model based, were used to estimate and compare the intraregional FC between both groups during the resting state (RS). The first approach using independent component analysis was applied to investigate five RS networks: the default mode network, salience network (SN), left frontoparietal networks (LFPN) and right frontoparietal networks, and the sensory motor network (SMN). The second approach used a priori selected seed regions demonstrating abnormal regional cerebral blood flow (rCBF) in ME/CFS patients at rest. In ME/CFS patients, Method-1 identified decreased intrinsic connectivity among regions within the LFPN. Furthermore, the FC of the left anterior midcingulate with the SMN and the connectivity of the left posterior cingulate cortex with the SN were significantly decreased. For Method-2, five distinct clusters within the right parahippocampus and occipital lobes, demonstrating significant rCBF reductions in ME/CFS patients, were used as seeds. The parahippocampal seed and three occipital lobe seeds showed altered FC with other brain regions. The degree of abnormal connectivity correlated with the level of self-reported fatigue. Our results confirm altered RS FC in patients with ME/CFS, which was significantly correlated with the severity of their chronic fatigue.
International Nuclear Information System (INIS)
Uemura, George; Matos, Ludmila Vieira da Silva; Silva, Maria Aparecida da; Ferreira, Alexandre Santos Martorano; Menezes, Maria Angela de Barros Correia
2009-01-01
Natural arsenic contamination is a cause for concern in many countries of the world, including Argentina, Bangladesh, Chile, China, India, Mexico, Thailand and the United States of America, and also in Brazil, especially in the Iron Quadrangle area, where mining activities have contributed to aggravating the natural contamination. Brassicaceae is a plant family with edible species (arugula, cabbage, cauliflower, cress, kale, mustard, radish), ornamental ones (alyssum, field pennycress, ornamental cabbages and kales), and some species known as accumulators of metals and metalloids (Indian mustard, field pennycress), such as chromium, nickel, and arsenic. The present work aimed at studying other taxa of the Brassicaceae family to verify their capability of absorbing arsenic, under controlled conditions, for possible utilisation in remediation activities. The analytical method chosen was neutron activation analysis, k0 method, a routine technique at CDTN and also very appropriate for arsenic studies. To avoid possible interference from solid substrates, like sand or vermiculite, attempts were made to keep the specimens in 1/4 Murashige and Skoog basal salt solution (M and S). Growth was stunted, and plants withered and perished, showing that modifications to the M and S solution had to be made. The addition of nickel and silicon allowed normal growth of the plant specimens for periods longer than usually achieved (more than two months), yielding samples large enough for further studies with other techniques, like ICP-MS, and other targets, like speciation studies. The results of arsenic absorption are presented here, and the need for nickel and silicon in the composition of the M and S solution is discussed. (author)
Directory of Open Access Journals (Sweden)
Putthasri Weerasak
2010-06-01
Background: Since 2003, the Asia-Pacific region, particularly Southeast Asia, has received substantial attention because of the anticipation that it could be the epicentre of the next pandemic. There has been active investment, but an earlier review of pandemic preparedness plans in the region revealed that the translation of these strategic plans into operational plans was still lacking in some countries, particularly those with low resources. The objective of this study is to understand the pandemic preparedness programmes, the health systems context, and the challenges and constraints specific to six Asian countries, namely Cambodia, Indonesia, Lao PDR, Taiwan, Thailand, and Viet Nam, in the prepandemic phase before the start of H1N1/2009. Methods: The study relied on the Systemic Rapid Assessment (SYSRA) toolkit, which evaluates priority disease programmes by taking into account the programmes, the general health system, and the wider socio-cultural and political context. The components under review were: external context; stewardship and organisational arrangements; financing, resource generation and allocation; healthcare provision; and information systems. Qualitative and quantitative data were collected in the second half of 2008 based on a review of published data and interviews with key informants, exploring past and current patterns of health programme and pandemic response. Results: The study shows that health systems in the six countries varied with regard to the epidemiological context, health care financing, and health service provision patterns. For pandemic preparation, all six countries have developed national governance on pandemic preparedness as well as national pandemic influenza preparedness plans and Avian and Human Influenza (AHI) response plans. However, the governance arrangements and the nature of the plans differed. In the five developing countries, the focus was on surveillance and rapid containment of poultry-related transmission
Tumor regression patterns in retinoblastoma
International Nuclear Information System (INIS)
Zafar, S.N.; Siddique, S.N.; Zaheer, N.
2016-01-01
To observe the types of tumor regression after treatment, and to identify the common pattern of regression in our patients. Study Design: Descriptive study. Place and Duration of Study: Department of Pediatric Ophthalmology and Strabismus, Al-Shifa Trust Eye Hospital, Rawalpindi, Pakistan, from October 2011 to October 2014. Methodology: Children with unilateral and bilateral retinoblastoma were included in the study. Patients were referred to the Pakistan Institute of Medical Sciences, Islamabad, for chemotherapy. After every cycle of chemotherapy, a dilated fundus examination under anesthesia was performed to record the response to treatment. Regression patterns were recorded on RetCam II. Results: Seventy-four tumors were included in the study. Of the 74 tumors, 3 were ICRB group A tumors, 43 were ICRB group B tumors, 14 belonged to ICRB group C, and the remaining 14 were ICRB group D tumors. Type IV regression was seen in 39.1% (n=29) of tumors, type II in 29.7% (n=22), type III in 25.6% (n=19), and type I in 5.4% (n=4). All group A tumors (100%) showed type IV regression. Seventeen (39.5%) group B tumors showed type IV regression. In group C, 5 tumors (35.7%) showed type II regression and 5 tumors (35.7%) showed type IV regression. In group D, 6 tumors (42.9%) regressed to type II non-calcified remnants. Conclusion: The response and success of the focal and systemic treatment, as judged by the appearance of different patterns of tumor regression, varies with the ICRB grouping of the tumor. (author)
Introduction to regression graphics
Cook, R Dennis
2009-01-01
Covers the use of dynamic and interactive computer graphics in linear regression analysis, focusing on analytical graphics. Features new techniques such as plot rotation. The authors have composed their own regression code, written in the Xlisp-Stat language and called R-code, which is a nearly complete system for linear regression analysis and can be utilized as the main computer program in a linear regression course. The accompanying disks, for both Macintosh and Windows computers, contain the R-code and Xlisp-Stat. An Instructor's Manual presenting detailed solutions to all the problems in the book is available
Alternative Methods of Regression
Birkes, David
2011-01-01
Of related interest: Nonlinear Regression Analysis and Its Applications, Douglas M. Bates and Donald G. Watts. "…an extraordinary presentation of concepts and methods concerning the use and analysis of nonlinear regression models…highly recommend[ed]…for anyone needing to use and/or understand issues concerning the analysis of nonlinear regression models." --Technometrics. This book provides a balance between theory and practice supported by extensive displays of instructive geometrical constructs. Numerous in-depth case studies illustrate the use of nonlinear regression analysis--with all data s
Coral, Josep; Lleixà, Teresa; Ventura, Carles
2018-01-01
The member states of the European Union have funded many initiatives supporting the teaching and learning of foreign languages. Content and language integrated learning is one of the experimental language programmes that have been introduced in Catalonia, in the north-east of Spain. The aims of this study are to analyse the results achieved by…
Imbert-Bismut, F; Messous, D; Raoult, A; Poynard, T; Bertrand, J J; Marie, P A; Louis, V; Audy, C; Thouy, J M; Hainque, B; Piton, A
2005-01-01
The follow-up of patients with chronic liver diseases and the data from multicentric clinical studies are affected by the variability of assay results for the same parameter between different laboratories. Today, a main objective in clinical chemistry throughout the world is to harmonise assay results between laboratories after confirming their traceability in relation to defined reference systems. In this context, the purpose of our study was to verify the homogeneity of haptoglobin, apolipoprotein A1, total bilirubin, GGT activity, and ALAT activity results, which are combined in the Fibrotest and Actitest, between the Dimension analysers RXL, ARX and X-PAND (Dade Behring). Moreover, we verified the transferability of Fibrotest and Actitest results between the RXL and either the BN2 (haptoglobin and apolipoprotein A1) or the Modular DP (total bilirubin, GGT and ALAT activity concentrations). Serum samples from 150 hospitalised patients were analysed on the different analysers. Specific protein assays were calibrated using solutions standardised against reference material on the Dimension and BN2 analysers. Total bilirubin assays were performed by a diazo reaction on the Dimension and Modular DP analysers. The GGT and ALAT activity measurements on the Dimension analysers were performed in accordance with the reference methods defined by the International Federation of Clinical Chemistry and Laboratory Medicine (IFCC). On the Modular, enzyme activity measurements were performed according to the Szasz method (L-gamma-glutamyl-4-nitroanilide as substrate), modified by Persijn and van der Slik (L-gamma-glutamyl-3-carboxy-4-nitroanilide as substrate) for GGT, and according to the IFCC specifications for ALAT. The methods of enzymatic activity measurement were calibrated on the Modular only. Liver fibrosis and necroinflammatory activity indices were determined using calculation algorithms, after having adjusted each component's result of Fibrotest and
The Use of Nonparametric Kernel Regression Methods in Econometric Production Analysis
DEFF Research Database (Denmark)
Czekaj, Tomasz Gerard
This PhD thesis addresses one of the fundamental problems in applied econometric analysis, namely the econometric estimation of regression functions. The conventional approach to regression analysis is the parametric approach, which requires the researcher to specify the form of the regression… …and nonparametric estimations of production functions in order to evaluate the optimal firm size. The second paper discusses the use of parametric and nonparametric regression methods to estimate panel data regression models. The third paper analyses production risk, price uncertainty, and farmers' risk preferences within a nonparametric panel data regression framework. The fourth paper analyses the technical efficiency of dairy farms with environmental output using nonparametric kernel regression in a semiparametric stochastic frontier analysis. The results provided in this PhD thesis show that nonparametric…
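As a minimal illustration of the nonparametric approach the thesis builds on (a generic textbook estimator, not the author's code), the Nadaraya-Watson kernel regression estimator avoids specifying the form of the regression function entirely:

```python
import numpy as np

def nadaraya_watson(x_train, y_train, x_eval, bandwidth):
    """Nonparametric estimate of E[y | x] at the points x_eval, using
    a Gaussian kernel (the Nadaraya-Watson estimator)."""
    # Pairwise scaled distances between evaluation and training points.
    u = (x_eval[:, None] - x_train[None, :]) / bandwidth
    w = np.exp(-0.5 * u**2)            # kernel weights
    return (w @ y_train) / w.sum(axis=1)

rng = np.random.default_rng(0)
x = rng.uniform(-3, 3, 1000)
y = np.sin(x) + 0.2 * rng.normal(size=x.size)  # unknown "true" function

grid = np.linspace(-2, 2, 9)
fit = nadaraya_watson(x, y, grid, bandwidth=0.3)
print(np.max(np.abs(fit - np.sin(grid))))  # small: the fit tracks sin(x)
```

The bandwidth plays the role that functional-form assumptions play in parametric regression: too small and the fit chases noise, too large and it oversmooths.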
Linear regression and the normality assumption.
Schmidt, Amand F; Finan, Chris
2017-12-16
Researchers often perform arbitrary outcome transformations to fulfill the normality assumption of a linear regression model. This commentary explains and illustrates that in large data settings, such transformations are often unnecessary and, worse, may bias model estimates. Linear regression assumptions are illustrated using simulated data and an empirical example on the relation between time since type 2 diabetes diagnosis and glycated hemoglobin levels. Simulation results were evaluated on coverage, i.e., the proportion of times the 95% confidence interval included the true slope coefficient. Although outcome transformations bias point estimates, violations of the normality assumption in linear regression analyses do not. The normality assumption is necessary to unbiasedly estimate standard errors, and hence confidence intervals and P-values. However, in large sample sizes (e.g., where the number of observations per variable is >10), violations of this normality assumption often do not noticeably impact results. Contrary to this, assumptions on the parametric model, absence of extreme observations, homoscedasticity, and independence of the errors remain influential even in large sample size settings. Given that modern healthcare research typically includes thousands of subjects, focusing on the normality assumption is often unnecessary, does not guarantee valid results, and, worse, may bias estimates due to the practice of outcome transformations. Copyright © 2017 Elsevier Inc. All rights reserved.
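The coverage argument can be reproduced in a few lines. The simulation below is a sketch in the spirit of the commentary (not the authors' code): it draws strongly skewed exponential errors and checks how often the usual normal-theory 95% interval for the OLS slope covers the true value.

```python
import numpy as np

rng = np.random.default_rng(7)

def ols_slope_ci(x, y):
    """OLS slope with a 95% CI from the usual normal-theory standard error."""
    n = len(x)
    xc = x - x.mean()
    slope = xc @ y / (xc @ xc)
    intercept = y.mean() - slope * x.mean()
    resid = y - intercept - slope * x
    se = np.sqrt(resid @ resid / (n - 2) / (xc @ xc))
    return slope, slope - 1.96 * se, slope + 1.96 * se

true_slope, n, reps, covered = 2.0, 500, 2000, 0
for _ in range(reps):
    x = rng.uniform(0, 1, n)
    # Exponential errors, centered: strongly skewed, clearly non-normal.
    y = 1.0 + true_slope * x + (rng.exponential(1.0, n) - 1.0)
    _, lo, hi = ols_slope_ci(x, y)
    covered += lo <= true_slope <= hi
print(covered / reps)  # close to the nominal 0.95
```

Despite the clear normality violation, coverage stays near 95% because the slope estimator is asymptotically normal in samples of this size, which is the commentary's central point.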
Abstract Expression Grammar Symbolic Regression
Korns, Michael F.
This chapter examines the use of Abstract Expression Grammars to perform the entire Symbolic Regression process without the use of Genetic Programming per se. The techniques explored produce a symbolic regression engine which has absolutely no bloat, allows total user control of the search space and output formulas, and is faster and more accurate than the engines produced in our previous papers using Genetic Programming. The genome is an all-vector structure with four chromosomes plus additional epigenetic and constraint vectors, allowing total user control of the search space and the final output formulas. A combination of specialized compiler techniques, genetic algorithms, particle swarm, age-layered populations, plus discrete and continuous differential evolution is used to produce an improved symbolic regression system. Nine base test cases from the literature are used to test the improvement in speed and accuracy. The improved results indicate that these techniques move us a big step closer toward future industrial-strength symbolic regression systems.
Directory of Open Access Journals (Sweden)
F.M.O. Borges
2003-12-01
One experiment was run with broiler chickens to obtain prediction equations for metabolizable energy (ME) based on feedstuff chemical analyses, and to determine the ME of wheat grain and its by-products using four different methodologies. Seven wheat grain by-products were used in five treatments: wheat grain, wheat germ, white wheat flour, dark wheat flour, wheat bran for human use, wheat bran for animal use, and rough wheat bran. Based on chemical analyses of crude fiber (CF), ether extract (EE), crude protein (CP), ash (AS) and starch (ST) of the feeds, and the determined values of apparent metabolizable energy (MEA), true metabolizable energy (MEV), apparent energy corrected by nitrogen balance (MEAn) and true energy corrected by nitrogen balance (MEVn) in the five treatments, prediction equations were obtained using the stepwise procedure. CF showed the best relationship with metabolizable energy values; however, this variable alone was not enough for a good estimate of the energy values (R² below 0.80). When EE and CP were included in the equations, R² increased to 0.90 or higher in most estimates. When the equations were calculated with all treatments, the equations for MEA were less precise and R² decreased. When ME data from the traditional or force-feeding methods were used separately, the precision of the equations increased (R² higher than 0.85). For MEV and MEVn values, the best multiple linear equations included CF, EE and CP (R² > 0.90), independently of using all experimental data or separating by methodology. The estimates of MEVn values showed high precision, and the linear coefficients (a) of the equations were similar across treatments and methodologies, which explains the small influence of the different methodologies on this parameter. NDF was not a better predictor of ME than CF.
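The stepwise procedure used to build such prediction equations can be sketched as a greedy forward selection on R². The toy example below uses hypothetical, synthetic data: the variable names CF, EE and CP only mirror the abstract, and the coefficients and stopping threshold are invented for illustration.

```python
import numpy as np

def r_squared(X, y):
    """R^2 of an ordinary least squares fit with an intercept."""
    Xd = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    resid = y - Xd @ beta
    return 1.0 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))

def forward_select(X, y, names, r2_gain=0.01):
    """Greedy forward selection: repeatedly add the predictor that raises
    R^2 the most; stop when the best gain falls below r2_gain."""
    chosen, best_r2 = [], 0.0
    while len(chosen) < X.shape[1]:
        gains = []
        for j in range(X.shape[1]):
            if j in chosen:
                gains.append(-np.inf)
            else:
                gains.append(r_squared(X[:, chosen + [j]], y) - best_r2)
        j_best = int(np.argmax(gains))
        if gains[j_best] < r2_gain:
            break
        chosen.append(j_best)
        best_r2 += gains[j_best]
    return [names[j] for j in chosen], best_r2

# Hypothetical data: "ME" depends on CF and EE; CP is irrelevant noise.
rng = np.random.default_rng(1)
n = 200
CF, EE, CP = rng.normal(size=(3, n))
ME = 3.0 - 1.5 * CF + 0.8 * EE + 0.1 * rng.normal(size=n)

selected, r2 = forward_select(np.column_stack([CF, EE, CP]), ME,
                              ["CF", "EE", "CP"])
print(selected, round(r2, 3))  # CF enters first, then EE; CP is dropped
```

Real stepwise implementations use F-tests or information criteria rather than a raw R² threshold, but the greedy add-one-at-a-time structure is the same.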
Directory of Open Access Journals (Sweden)
Matthias Schmid
Regression analysis with a bounded outcome is a common problem in applied statistics. Typical examples include regression models for percentage outcomes and the analysis of ratings that are measured on a bounded scale. In this paper, we consider beta regression, which is a generalization of logit models to situations where the response is continuous on the interval (0,1). Consequently, beta regression is a convenient tool for analyzing percentage responses. The classical approach to fitting a beta regression model is to use maximum likelihood estimation with subsequent AIC-based variable selection. As an alternative to this established, yet unstable, approach, we propose a new estimation technique called boosted beta regression. With boosted beta regression, estimation and variable selection can be carried out simultaneously in a highly efficient way. Additionally, both the mean and the variance of a percentage response can be modeled using flexible nonlinear covariate effects. As a consequence, the new method accounts for common problems such as overdispersion and non-binomial variance structures.
Weisberg, Sanford
2013-01-01
Praise for the Third Edition: "...this is an excellent book which could easily be used as a course text..." --International Statistical Institute. The Fourth Edition of Applied Linear Regression provides a thorough update of the basic theory and methodology of linear regression modeling. Demonstrating the practical applications of linear regression analysis techniques, the Fourth Edition uses interesting, real-world exercises and examples. Stressing central concepts such as model building, understanding parameters, assessing fit and reliability, and drawing conclusions, the new edition illus
Hosmer, David W; Sturdivant, Rodney X
2013-01-01
A new edition of the definitive guide to logistic regression modeling for health science and other applications This thoroughly expanded Third Edition provides an easily accessible introduction to the logistic regression (LR) model and highlights the power of this model by examining the relationship between a dichotomous outcome and a set of covariables. Applied Logistic Regression, Third Edition emphasizes applications in the health sciences and handpicks topics that best suit the use of modern statistical software. The book provides readers with state-of-
International Nuclear Information System (INIS)
Mattie, Patrick D.; McNeish, Jerry A.; Sevougian, S. David; Andrews, Robert W.
2001-01-01
Total System Performance Assessment (TSPA) is used as a key decision-making tool for the potential geologic repository of high level radioactive waste at Yucca Mountain, Nevada USA. Because of the complexity and uncertainty involved in a post-closure performance assessment, an important goal is to produce a transparent document describing the assumptions, the intermediate steps, the results, and the conclusions of the analyses. An important objective for a TSPA analysis is to illustrate confidence in performance projections of the potential repository given a complex system of interconnected process models, data, and abstractions. The methods and techniques used for the recent TSPA analyses demonstrate an effective process to portray complex models and results with transparency and credibility
Multilingual speaker age recognition: regression analyses on the Lwazi corpus
CSIR Research Space (South Africa)
Feld, M
2009-12-01
Multilinguality represents an area of significant opportunity for automatic speech-processing systems: whereas multilingual societies are commonplace, the majority of speech-processing systems are developed with a single language in mind. As a step...
Söhner, Felicitas; Fangerau, Heiner; Becker, Thomas
2018-05-01
This paper examines the influence of sociology as a discipline on the Psychiatrie-Enquete by analysing interviews with expert witnesses (psychiatrists, psychologists, sociologists, etc.) of the Enquete process and by analysing pertinent documents. Twenty-four interviews were conducted and analysed using qualitative secondary analysis. Sociological texts and research results influenced the professional development of psychiatrists at the time. Cross-talk between psychiatry and sociology developed through seminal sociological analyses of psychiatric institutions and the interest taken in medical institutions in a number of sociological texts. Interdisciplinary joint studies of sociologists and psychiatrists affected the research interests and professional behaviour of the psychiatrists involved in the process leading to the Psychiatrie-Enquete. The tenacity of psychiatrists' systems of opinion was dissolved by impulses from the sociological thought community. The forms of contact between the psychiatric and the sociological thought collectives which we could reconstruct are an example of the evolution of knowledge and practice through transdisciplinary communication. © Georg Thieme Verlag KG Stuttgart · New York.
Economic Analyses of Ware Yam Production in Orlu Agricultural ...
African Journals Online (AJOL)
Economic Analyses of Ware Yam Production in Orlu Agricultural Zone of Imo State. ... International Journal of Agriculture and Rural Development ... statistics, gross margin analysis, marginal analysis and multiple regression analysis. Results ...
Understanding poisson regression.
Hayat, Matthew J; Higgins, Melinda
2014-04-01
Nurse investigators often collect study data in the form of counts. Traditional methods of data analysis have historically approached analysis of count data either as if the count data were continuous and normally distributed or with dichotomization of the counts into the categories of occurred or did not occur. These outdated methods for analyzing count data have been replaced with more appropriate statistical methods that make use of the Poisson probability distribution, which is useful for analyzing count data. The purpose of this article is to provide an overview of the Poisson distribution and its use in Poisson regression. Assumption violations for the standard Poisson regression model are addressed with alternative approaches, including addition of an overdispersion parameter or negative binomial regression. An illustrative example is presented with an application from the ENSPIRE study, and regression modeling of comorbidity data is included for illustrative purposes. Copyright 2014, SLACK Incorporated.
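As a sketch of what fitting such a model involves (standard GLM machinery, not code from the article), a Poisson regression with a log link can be fit by iteratively reweighted least squares in plain NumPy:

```python
import numpy as np

def poisson_irls(X, y, n_iter=25):
    """Poisson regression with a log link, fit by iteratively reweighted
    least squares (Newton-Raphson on the log-likelihood)."""
    Xd = np.column_stack([np.ones(len(y)), X])  # add intercept column
    beta = np.zeros(Xd.shape[1])
    for _ in range(n_iter):
        mu = np.exp(Xd @ beta)           # fitted means
        W = mu                           # Poisson working weights (Var = mu)
        z = Xd @ beta + (y - mu) / mu    # working response
        WX = Xd * W[:, None]
        beta = np.linalg.solve(Xd.T @ WX, WX.T @ z)  # weighted LS step
    return beta

# Synthetic count data with known coefficients (0.5, 1.2).
rng = np.random.default_rng(3)
x = rng.uniform(-1, 1, 2000)
y = rng.poisson(np.exp(0.5 + 1.2 * x))

beta = poisson_irls(x, y)
print(beta)  # close to the true values (0.5, 1.2)
```

An overdispersion parameter or a negative binomial model, as mentioned above, changes the variance assumption (and hence the standard errors) but leaves this mean-model fitting loop essentially unchanged.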
Energy Technology Data Exchange (ETDEWEB)
Dittmar, S.; Neubrech, G.E.; Wernicke, R. [TUeV Nord SysTec GmbH und Co.KG (Germany); Rieck, D. [IGN Ingenieurgesellschaft Nord mbH und Co.KG (Germany)
2008-07-01
For the fracture mechanical assessment of postulated or detected crack-like defects in welds of piping systems, it is necessary to know the stresses in the un-cracked component normal to the crack plane. Results of piping stress analyses may be used if these are evaluated for the locations of the welds in the piping system. Using stress enhancement factors (stress indices, stress factors), the needed stress components are calculated from the component-specific sectional loads (forces and moments). For this procedure, the tabulated stress enhancement factors given in the standards (ASME Code, German KTA regulations) for the determination and limitation of the effective stresses are not always and immediately adequate for the calculation of the stress component normal to the crack plane. The contribution shows fundamental possibilities and validity limits for the adoption of the results of piping system analyses for the fracture mechanical evaluation of axial and circumferential defects in welded joints, with special emphasis on typical piping system components (straight pipe, elbow, pipe fitting, T-joint). The lecture is supposed to contribute to the standardization of a code-compliant and task-related use of piping system analysis results for fracture mechanical failure assessment.
Directory of Open Access Journals (Sweden)
Mok Tik
2014-06-01
This study formulates regression of vector data that will enable statistical analysis of various geodetic phenomena such as polar motion, ocean currents, typhoon/hurricane tracking, crustal deformations, and precursory earthquake signals. The observed vector variable of an event (the dependent vector variable) is expressed as a function of a number of hypothesized phenomena realized also as vector variables (independent vector variables) and/or scalar variables that are likely to impact the dependent vector variable. The proposed representation has the unique property of solving the coefficients of independent vector variables (explanatory variables) also as vectors; hence it supersedes multivariate multiple regression models, in which the unknown coefficients are scalar quantities. For the solution, complex numbers are used to represent vector information, and the method of least squares is deployed to estimate the vector model parameters after transforming the complex vector regression model into a real vector regression model through isomorphism. Various operational statistics for testing the predictive significance of the estimated vector parameter coefficients are also derived. A simple numerical example demonstrates the use of the proposed vector regression analysis in modeling typhoon paths.
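The complex-number formulation described above can be sketched in a few lines. The following is a hypothetical illustration on synthetic data (all coefficient values are made up), exploiting the fact that ordinary least squares extends directly to complex-valued designs, so each complex coefficient rotates and scales its explanatory vector:

```python
import numpy as np

# 2-D vectors (e.g., east/north displacement) encoded as complex numbers
# x + iy; the unknown coefficients are themselves complex, so each one can
# rotate and scale its explanatory vector. Synthetic, illustrative data.
rng = np.random.default_rng(0)
n = 200
X = rng.normal(size=(n, 2)) + 1j * rng.normal(size=(n, 2))  # explanatory vectors
beta_true = np.array([2 - 1j, 0.5 + 3j])                    # true vector coefficients
noise = 0.01 * (rng.normal(size=n) + 1j * rng.normal(size=n))
y = X @ beta_true + noise

# Ordinary least squares on complex data; lstsq handles complex dtypes.
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.round(beta_hat, 2))
```

The recovered coefficients are complex, i.e., full vector quantities, which is the property the abstract contrasts with scalar-coefficient multivariate regression.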
Schmucker, Christine M; Blümle, Anette; Schell, Lisa K; Schwarzer, Guido; Oeller, Patrick; Cabrera, Laura; von Elm, Erik; Briel, Matthias; Meerpohl, Joerg J
2017-01-01
A meta-analysis as part of a systematic review aims to provide a thorough, comprehensive and unbiased statistical summary of data from the literature. However, relevant study results could be missing from a meta-analysis because of selective publication and inadequate dissemination. If missing outcome data differ systematically from published ones, a meta-analysis will be biased with an inaccurate assessment of the intervention effect. As part of the EU-funded OPEN project (www.open-project.eu) we conducted a systematic review that assessed whether the inclusion of data that were not published at all and/or published only in the grey literature influences pooled effect estimates in meta-analyses and leads to different interpretations. Systematic review of published literature (methodological research projects). Four bibliographic databases were searched up to February 2016 without restriction of publication year or language. Methodological research projects were considered eligible for inclusion if they reviewed a cohort of meta-analyses which (i) compared pooled effect estimates of meta-analyses of health care interventions according to publication status of data or (ii) examined whether the inclusion of unpublished or grey literature data impacts the result of a meta-analysis. Seven methodological research projects including 187 meta-analyses comparing pooled treatment effect estimates according to different publication status were identified. Two research projects found that published data showed larger pooled treatment effects in favour of the intervention than unpublished or grey literature data (ratio of ORs 1.15, 95% CI 1.04-1.28, and 1.34, 95% CI 1.09-1.66). In the remaining research projects pooled effect estimates and/or overall findings were not significantly changed by the inclusion of unpublished and/or grey literature data. The precision of the pooled estimates increased, with narrower 95% confidence intervals. Although we may anticipate that
Multicollinearity and Regression Analysis
Daoud, Jamal I.
2017-12-01
In regression analysis it is expected that the response correlates with the predictor(s), but correlation among the predictors themselves is undesirable. The number of predictors included in the regression model depends on many factors, such as historical data and experience; in the end, the selection of the most important predictors is somewhat subjective, left to the researcher. Multicollinearity is the phenomenon in which two or more predictors are correlated; when this happens, the standard errors of the coefficients increase [8]. Increased standard errors mean that the coefficients for some or all independent variables may be found not to be significantly different from zero. In other words, by overinflating the standard errors, multicollinearity makes some variables statistically insignificant when they should be significant. In this paper we focus on multicollinearity, its causes, and its consequences for the reliability of the regression model.
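The standard diagnostic mentioned in connection with this problem is the variance inflation factor, VIF_j = 1 / (1 - R_j^2), where R_j^2 comes from regressing predictor j on the remaining predictors. A minimal sketch on synthetic data (the rule-of-thumb cutoff of 10 and the variable names are illustrative, not from the abstract):

```python
import numpy as np

# Variance inflation factors as a multicollinearity diagnostic.
# Values above ~10 are commonly read as serious multicollinearity.
def vif(X):
    X = np.asarray(X, dtype=float)
    out = []
    for j in range(X.shape[1]):
        y = X[:, j]
        others = np.delete(X, j, axis=1)
        A = np.column_stack([np.ones(len(y)), others])  # intercept + other predictors
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        resid = y - A @ coef
        r2 = 1 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))
        out.append(1.0 / (1.0 - r2))
    return np.array(out)

rng = np.random.default_rng(1)
x1 = rng.normal(size=500)
x2 = x1 + 0.05 * rng.normal(size=500)   # nearly collinear with x1
x3 = rng.normal(size=500)               # independent of the others
v = vif(np.column_stack([x1, x2, x3]))
print(np.round(v, 1))  # large VIFs for x1, x2; near 1 for x3
```

The inflated standard errors the abstract describes scale with the square root of the VIF, which is why the collinear pair becomes hard to distinguish from zero individually.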
DEFF Research Database (Denmark)
Bache, Stefan Holst
A new and alternative quantile regression estimator is developed and it is shown that the estimator is root n-consistent and asymptotically normal. The estimator is based on a minimax ‘deviance function’ and has asymptotically equivalent properties to the usual quantile regression estimator. It is......, however, a different and therefore new estimator. It allows for both linear- and nonlinear model specifications. A simple algorithm for computing the estimates is proposed. It seems to work quite well in practice but whether it has theoretical justification is still an open question....
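For context, the usual quantile regression estimator that the abstract's new minimax-deviance estimator is compared against minimizes the "pinball" check loss. A hedged sketch by subgradient descent with a decaying step (data, step sizes, and iteration counts are synthetic and illustrative; production code would use a linear-programming solver):

```python
import numpy as np

# Standard linear quantile regression: minimize the check (pinball) loss
# sum_i rho_tau(y_i - a_i^T w), where rho_tau(r) = r * (tau - 1{r < 0}).
def fit_quantile(X, y, tau=0.5, lr=0.1, steps=20000):
    A = np.column_stack([np.ones(len(y)), X])  # intercept + predictors
    w = np.zeros(A.shape[1])
    for t in range(steps):
        r = y - A @ w
        # Subgradient of the pinball loss, averaged over observations.
        grad = -A.T @ np.where(r > 0, tau, tau - 1) / len(y)
        w -= lr / np.sqrt(t + 1) * grad        # decaying step for convergence
    return w

rng = np.random.default_rng(8)
x = rng.uniform(0, 2, size=500)
y = 1.0 + 2.0 * x + rng.laplace(0, 0.3, size=500)  # median of the noise is 0
w = fit_quantile(x[:, None], y, tau=0.5)
print(np.round(w, 1))  # roughly (intercept, slope) near (1, 2)
```

With tau = 0.5 this is median regression; other tau values trace out conditional quantiles.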
DEFF Research Database (Denmark)
Ozenne, Brice; Sørensen, Anne Lyngholm; Scheike, Thomas
2017-01-01
In the presence of competing risks a prediction of the time-dynamic absolute risk of an event can be based on cause-specific Cox regression models for the event and the competing risks (Benichou and Gail, 1990). We present computationally fast and memory optimized C++ functions with an R interface...... for predicting the covariate specific absolute risks, their confidence intervals, and their confidence bands based on right censored time to event data. We provide explicit formulas for our implementation of the estimator of the (stratified) baseline hazard function in the presence of tied event times. As a by...... functionals. The software presented here is implemented in the riskRegression package....
Regression Analysis by Example. 5th Edition
Chatterjee, Samprit; Hadi, Ali S.
2012-01-01
Regression analysis is a conceptually simple method for investigating relationships among variables. Carrying out a successful application of regression analysis, however, requires a balance of theoretical results, empirical rules, and subjective judgment. "Regression Analysis by Example, Fifth Edition" has been expanded and thoroughly…
Prediction, Regression and Critical Realism
DEFF Research Database (Denmark)
Næss, Petter
2004-01-01
This paper considers the possibility of prediction in land use planning, and the use of statistical research methods in analyses of relationships between urban form and travel behaviour. Influential writers within the tradition of critical realism reject the possibility of predicting social...... phenomena. This position is fundamentally problematic to public planning. Without at least some ability to predict the likely consequences of different proposals, the justification for public sector intervention into market mechanisms will be frail. Statistical methods like regression analyses are commonly...... seen as necessary in order to identify aggregate level effects of policy measures, but are questioned by many advocates of critical realist ontology. Using research into the relationship between urban structure and travel as an example, the paper discusses relevant research methods and the kinds...
Sutter, B.; McAdam, A. C.; Rampe, E. B.; Thompson, L. M.; Ming, D. W.; Mahaffy, P. R.; Navarro-Gonzalez, R.; Stern, J. C.; Eigenbrode, J. L.; Archer, P. D.
2017-01-01
The Sample Analysis at Mars (SAM) instrument aboard the Mars Science Laboratory rover has analyzed 13 samples from Gale Crater. All SAM-evolved gas analyses have yielded a multitude of volatiles (e.g., H2O, SO2, H2S, CO2, CO, NO, O2, HCl) [1- 6]. The objectives of this work are to 1) Characterize recent evolved SO2, CO2, O2, and NO gas traces of the Murray formation mudstone, 2) Constrain sediment mineralogy/composition based on SAM evolved gas analysis (SAM-EGA), and 3) Discuss the implications of these results relative to understanding the geological history of Gale Crater.
Sutter, B.; McAdam, A. C.; Rampe, E. B.; Ming, D. W.; Mahaffy, P. R.; Navarro-Gonzalez, R.; Stern, J. C.; Eigenbrode, J. L.; Archer, P. D.
2016-01-01
The Sample Analysis at Mars (SAM) instrument aboard the Mars Science Laboratory rover has analyzed 10 samples from Gale Crater. All SAM evolved gas analyses have yielded a multitude of volatiles (e.g., H2O, SO2, H2S, CO2, CO, NO, O2, HCl). The objectives of this work are to 1) Characterize the evolved H2O, SO2, CO2, and O2 gas traces of sediments analyzed by SAM through sol 1178, 2) Constrain sediment mineralogy/composition based on SAM evolved gas analysis (SAM-EGA), and 3) Discuss the implications of these results relative to understanding the geochemical history of Gale Crater.
Use of probabilistic weights to enhance linear regression myoelectric control.
Smith, Lauren H; Kuiken, Todd A; Hargrove, Levi J
2015-12-01
Clinically available prostheses for transradial amputees do not allow simultaneous myoelectric control of degrees of freedom (DOFs). Linear regression methods can provide simultaneous myoelectric control, but frequently also result in difficulty with isolating individual DOFs when desired. This study evaluated the potential of using probabilistic estimates of categories of gross prosthesis movement, which are commonly used in classification-based myoelectric control, to enhance linear regression myoelectric control. Gaussian models were fit to electromyogram (EMG) feature distributions for three movement classes at each DOF (no movement, or movement in either direction) and used to weight the output of linear regression models by the probability that the user intended the movement. Eight able-bodied and two transradial amputee subjects worked in a virtual Fitts' law task to evaluate differences in controllability between linear regression and probability-weighted regression for an intramuscular EMG-based three-DOF wrist and hand system. Real-time and offline analyses in able-bodied subjects demonstrated that probability weighting improved performance during single-DOF tasks compared with linear regression control. Use of probability weights can improve the ability to isolate individual DOFs during linear regression myoelectric control, while maintaining the ability to simultaneously control multiple DOFs.
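The weighting scheme described above can be illustrated with a toy one-dimensional sketch: Gaussian class models for "no movement" versus "movement" yield a posterior probability of intended movement, which scales the regression output. All signals, means, and variances below are synthetic assumptions, not values from the study:

```python
import numpy as np

# Hypothetical Gaussian class models for one EMG feature, "fitted offline".
mu = {"rest": 0.1, "move": 1.0}
sd = {"rest": 0.05, "move": 0.3}

def gauss(x, m, s):
    # Normal density, used as the class likelihood.
    return np.exp(-0.5 * ((x - m) / s) ** 2) / (s * np.sqrt(2 * np.pi))

def weighted_output(feature, regression_out, prior_move=0.5):
    # Posterior probability that movement was intended (Bayes' rule),
    # then scale the linear regression output by that probability.
    pm = gauss(feature, mu["move"], sd["move"]) * prior_move
    pr = gauss(feature, mu["rest"], sd["rest"]) * (1 - prior_move)
    p_move = pm / (pm + pr)
    return p_move * regression_out

low = weighted_output(0.1, 0.8)   # feature near rest: output suppressed
high = weighted_output(1.0, 0.8)  # clear movement: output passed through
print(low, high)
```

Suppressing the regression output when the classifier-style model says "no movement" is exactly the mechanism that helps isolate individual DOFs.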
Logistic regression for dichotomized counts.
Preisser, John S; Das, Kalyan; Benecha, Habtamu; Stamm, John W
2016-12-01
Sometimes there is interest in a dichotomized outcome indicating whether a count variable is positive or zero. Under this scenario, the application of ordinary logistic regression may result in efficiency loss, which is quantifiable under an assumed model for the counts. In such situations, a shared-parameter hurdle model is investigated for more efficient estimation of regression parameters relating to overall effects of covariates on the dichotomous outcome, while handling count data with many zeroes. One model part provides a logistic regression containing marginal log odds ratio effects of primary interest, while an ancillary model part describes the mean count of a Poisson or negative binomial process in terms of nuisance regression parameters. Asymptotic efficiency of the logistic model parameter estimators of the two-part models is evaluated with respect to ordinary logistic regression. Simulations are used to assess the properties of the models with respect to power and Type I error, the latter investigated under both misspecified and correctly specified models. The methods are applied to data from a randomized clinical trial of three toothpaste formulations to prevent incident dental caries in a large population of Scottish schoolchildren. © The Author(s) 2014.
Multiple linear regression analysis
Edwards, T. R.
1980-01-01
Program rapidly selects best-suited set of coefficients. User supplies only vectors of independent and dependent data and specifies confidence level required. Program uses stepwise statistical procedure for relating minimal set of variables to set of observations; final regression contains only most statistically significant coefficients. Program is written in FORTRAN IV for batch execution and has been implemented on NOVA 1200.
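The stepwise idea can be sketched as greedy forward selection: repeatedly add the predictor that most improves the fit and stop when the improvement becomes negligible. This is an illustration in the same spirit, not the FORTRAN program's actual procedure; the adjusted-R^2 stopping rule and tolerance are assumptions:

```python
import numpy as np

def adj_r2(X, y):
    # Adjusted R^2 of an intercept + X least-squares fit.
    A = np.column_stack([np.ones(len(y)), X])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ coef
    r2 = 1 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))
    n, p = A.shape
    return 1 - (1 - r2) * (n - 1) / (n - p)

def forward_select(X, y, tol=1e-3):
    chosen, best = [], -np.inf
    while True:
        # Score each candidate predictor added to the current set.
        scores = {j: adj_r2(X[:, chosen + [j]], y)
                  for j in range(X.shape[1]) if j not in chosen}
        if not scores:
            break
        j, s = max(scores.items(), key=lambda kv: kv[1])
        if s - best < tol:      # no meaningful improvement: stop
            break
        chosen.append(j)
        best = s
    return chosen

rng = np.random.default_rng(3)
X = rng.normal(size=(300, 5))
y = 2 * X[:, 1] - 3 * X[:, 4] + 0.1 * rng.normal(size=300)
sel = sorted(forward_select(X, y))
print(sel)  # picks out the informative columns
```

The final model keeps only predictors whose inclusion measurably improves the fit, mirroring "final regression contains only most statistically significant coefficients".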
Bayesian logistic regression analysis
Van Erp, H.R.N.; Van Gelder, P.H.A.J.M.
2012-01-01
In this paper we present a Bayesian logistic regression analysis. It is found that if one wishes to derive the posterior distribution of the probability of some event, then, together with the traditional Bayes Theorem and the integrating out of nuissance parameters, the Jacobian transformation is an
Seber, George A F
2012-01-01
Concise, mathematically clear, and comprehensive treatment of the subject.* Expanded coverage of diagnostics and methods of model fitting.* Requires no specialized knowledge beyond a good grasp of matrix algebra and some acquaintance with straight-line regression and simple analysis of variance models.* More than 200 problems throughout the book plus outline solutions for the exercises.* This revision has been extensively class-tested.
Ritz, Christian; Parmigiani, Giovanni
2009-01-01
R is a rapidly evolving lingua franca of graphical display and statistical analysis of experiments from the applied sciences. This book provides a coherent treatment of nonlinear regression with R by means of examples from a diversity of applied sciences such as biology, chemistry, engineering, medicine and toxicology.
Bounded Gaussian process regression
DEFF Research Database (Denmark)
Jensen, Bjørn Sand; Nielsen, Jens Brehm; Larsen, Jan
2013-01-01
We extend the Gaussian process (GP) framework for bounded regression by introducing two bounded likelihood functions that model the noise on the dependent variable explicitly. This is fundamentally different from the implicit noise assumption in the previously suggested warped GP framework. We...... with the proposed explicit noise-model extension....
Mechanisms of neuroblastoma regression
Brodeur, Garrett M.; Bagatell, Rochelle
2014-01-01
Recent genomic and biological studies of neuroblastoma have shed light on the dramatic heterogeneity in the clinical behaviour of this disease, which spans from spontaneous regression or differentiation in some patients, to relentless disease progression in others, despite intensive multimodality therapy. This evidence also suggests several possible mechanisms to explain the phenomena of spontaneous regression in neuroblastomas, including neurotrophin deprivation, humoral or cellular immunity, loss of telomerase activity and alterations in epigenetic regulation. A better understanding of the mechanisms of spontaneous regression might help to identify optimal therapeutic approaches for patients with these tumours. Currently, the most druggable mechanism is the delayed activation of developmentally programmed cell death regulated by the tropomyosin receptor kinase A pathway. Indeed, targeted therapy aimed at inhibiting neurotrophin receptors might be used in lieu of conventional chemotherapy or radiation in infants with biologically favourable tumours that require treatment. Alternative approaches consist of breaking immune tolerance to tumour antigens or activating neurotrophin receptor pathways to induce neuronal differentiation. These approaches are likely to be most effective against biologically favourable tumours, but they might also provide insights into treatment of biologically unfavourable tumours. We describe the different mechanisms of spontaneous neuroblastoma regression and the consequent therapeutic approaches. PMID:25331179
International Nuclear Information System (INIS)
Halvorsen, Bente; Larsen, Bodil M.; Nesbakken, Runa
2001-01-01
The literature on energy demand shows that there are systematic differences in income and price elasticities between analyses based on macro data and micro data. Even if one estimates models with the same explanatory variables, the results may differ with respect to estimated price and income sensitivity. These differences may be caused by problems involved in transferring micro properties to macro properties, or the estimated macro relationships may have failed to take adequate account of the fact that households behave differently in their energy demand. Political goals are often directed towards the entire household sector. Partial equilibrium models do not capture important equilibrium effects and feedback through the energy markets and the economy in general. Thus, it is very interesting, politically and scientifically, to carry out macroeconomic model analyses of different political measures that affect energy consumption. The results of behavioural analyses, in which one investigates the heterogeneity of energy demand, must be based on information about individual households. When demand is studied based on micro data, it is difficult to aggregate its properties to a total demand function for the entire household sector if different household sectors have different behaviour. Such heterogeneity of behaviour may for instance arise when households in different regions have different heating equipment because of regional differences in the price of electricity. The question of aggregation arises immediately when one wants to draw conclusions about the household sector based on information about individual households, whether the discussion concerns the whole population or a selection of households. Thus, aggregation is a topic of interest in a wide range of problems
Energy Technology Data Exchange (ETDEWEB)
2009-11-15
The objective with this report is to: - provide design premises from a long term safety aspect of a KBS-3V repository for spent nuclear fuel, to form the basis for the development of the reference design of the repository. The design premises are used as input to the documents, called production reports, that present the reference design to be analysed in the long term safety assessment SR-Site. It is the aim that the production reports should verify that the chosen design complies with the design premises given in this report, whereas this report takes the burden of justifying why these design premises are relevant. The more specific aims and objectives with the production reports are provided in these reports. The following approach is used: - The reference design analysed in SR-Can is a starting point for setting safety related design premises for the next design step. - A few design basis cases, in accordance with the definition used in the regulation SSMFS 2008:211 and mainly related to the canister, can be derived from the results of the SR-Can assessment. From these it is possible to formulate some specific design premises for the canister. - The design basis cases involve several assumptions on the state of other barriers. These implied conditions are thus set as design premises for these barriers. - Even if there are few load cases on individual barriers that can be directly derived from the analyses, SR-Can provides substantial feedback on most aspects of the analysed reference design. This feedback is also formulated as design premises. - An important part of SR-Can Main report is the formulation and assessment of safety function indicator criteria. These criteria are a basis for formulating design premises, but they are not the same as the design premises discussed in the present report. Whereas the former should be upheld throughout the assessment period, the latter refer to the initial state and must be defined such that they give a margin for
Ridge Regression Signal Processing
Kuhl, Mark R.
1990-01-01
The introduction of the Global Positioning System (GPS) into the National Airspace System (NAS) necessitates the development of Receiver Autonomous Integrity Monitoring (RAIM) techniques. In order to guarantee a certain level of integrity, a thorough understanding of modern estimation techniques applied to navigational problems is required. The extended Kalman filter (EKF) is derived and analyzed under poor geometry conditions. It was found that the performance of the EKF is difficult to predict, since the EKF is designed for a Gaussian environment. A novel approach is implemented which incorporates ridge regression to explain the behavior of an EKF in the presence of dynamics under poor geometry conditions. The basic principles of ridge regression theory are presented, followed by the derivation of a linearized recursive ridge estimator. Computer simulations are performed to confirm the underlying theory and to provide a comparative analysis of the EKF and the recursive ridge estimator.
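The core of ridge regression referenced above is adding a penalty k to the normal equations, which stabilizes the solution when the design is ill-conditioned (the "poor geometry" case). A minimal closed-form sketch on synthetic data (the penalty value and the near-singular design are illustrative assumptions, not from the GPS application):

```python
import numpy as np

def ridge(X, y, k):
    # Ridge estimator: (X^T X + k I)^{-1} X^T y.
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)

rng = np.random.default_rng(4)
x1 = rng.normal(size=100)
# Two almost identical columns: X^T X is nearly singular.
X = np.column_stack([x1, x1 + 1e-6 * rng.normal(size=100)])
y = X @ np.array([1.0, 1.0]) + 0.01 * rng.normal(size=100)

ols = np.linalg.lstsq(X, y, rcond=None)[0]  # can be wildly unstable here
rdg = ridge(X, y, k=1.0)
print(np.round(rdg, 2))  # shrunken, stable estimates near (1, 1)
```

The bias introduced by k buys a large reduction in variance, which is the trade-off a recursive ridge estimator exploits under poor geometry.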
Subset selection in regression
Miller, Alan
2002-01-01
Originally published in 1990, the first edition of Subset Selection in Regression filled a significant gap in the literature, and its critical and popular success has continued for more than a decade. Thoroughly revised to reflect progress in theory, methods, and computing power, the second edition promises to continue that tradition. The author has thoroughly updated each chapter, incorporated new material on recent developments, and included more examples and references. New in the Second Edition:A separate chapter on Bayesian methodsComplete revision of the chapter on estimationA major example from the field of near infrared spectroscopyMore emphasis on cross-validationGreater focus on bootstrappingStochastic algorithms for finding good subsets from large numbers of predictors when an exhaustive search is not feasible Software available on the Internet for implementing many of the algorithms presentedMore examplesSubset Selection in Regression, Second Edition remains dedicated to the techniques for fitting...
Regression in organizational leadership.
Kernberg, O F
1979-02-01
The choice of good leaders is a major task for all organizations. Information regarding the prospective administrator's personality should complement questions regarding his previous experience, his general conceptual skills, his technical knowledge, and the specific skills in the area for which he is being selected. The growing psychoanalytic knowledge about the crucial importance of internal, in contrast to external, object relations, and about the mutual relationships of regression in individuals and in groups, constitutes an important practical tool for the selection of leaders.
Classification and regression trees
Breiman, Leo; Olshen, Richard A; Stone, Charles J
1984-01-01
The methodology used to construct tree structured rules is the focus of this monograph. Unlike many other statistical procedures, which moved from pencil and paper to calculators, this text's use of trees was unthinkable before computers. Both the practical and theoretical sides have been developed in the authors' study of tree methods. Classification and Regression Trees reflects these two sides, covering the use of trees as a data analysis method, and in a more mathematical framework, proving some of their fundamental properties.
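The basic move in tree construction for regression can be shown with a one-level toy example: choose the split on a single feature that minimizes the summed squared error of the two resulting leaf means. Real CART grows such splits recursively and then prunes; everything below is a synthetic illustration, not the book's algorithm in full:

```python
import numpy as np

def best_split(x, y):
    # Scan all candidate thresholds between consecutive sorted x values
    # and keep the one with the lowest total within-leaf squared error.
    order = np.argsort(x)
    xs, ys = x[order], y[order]
    best_sse, best_thresh = np.inf, None
    for i in range(1, len(xs)):
        left, right = ys[:i], ys[i:]
        sse = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if sse < best_sse:
            best_sse, best_thresh = sse, (xs[i - 1] + xs[i]) / 2
    return best_thresh

rng = np.random.default_rng(5)
x = rng.uniform(0, 1, size=400)
y = np.where(x < 0.3, 0.0, 5.0) + 0.1 * rng.normal(size=400)
thresh = best_split(x, y)
print(round(thresh, 2))  # recovers a threshold near the true jump at 0.3
```

Each leaf predicts its mean, so the tree is a piecewise-constant regression function; classification trees replace squared error with an impurity measure such as the Gini index.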
Chanat, Jeffrey G.; Moyer, Douglas L.; Blomquist, Joel D.; Hyer, Kenneth E.; Langland, Michael J.
2016-01-13
In the Chesapeake Bay watershed, estimated fluxes of nutrients and sediment from the bay’s nontidal tributaries into the estuary are the foundation of decision making to meet reductions prescribed by the Chesapeake Bay Total Maximum Daily Load (TMDL) and are often the basis for refining scientific understanding of the watershed-scale processes that influence the delivery of these constituents to the bay. Two regression-based flux and trend estimation models, ESTIMATOR and Weighted Regressions on Time, Discharge, and Season (WRTDS), were compared using data from 80 watersheds in the Chesapeake Bay Nontidal Water-Quality Monitoring Network (CBNTN). The watersheds range in size from 62 to 70,189 square kilometers and record lengths range from 6 to 28 years. ESTIMATOR is a constant-parameter model that estimates trends only in concentration; WRTDS uses variable parameters estimated with weighted regression, and estimates trends in both concentration and flux. WRTDS had greater explanatory power than ESTIMATOR, with the greatest degree of improvement evident for records longer than 25 years (30 stations; improvement in median model R2= 0.06 for total nitrogen, 0.08 for total phosphorus, and 0.05 for sediment) and the least degree of improvement for records of less than 10 years, for which the two models performed nearly equally. Flux bias statistics were comparable or lower (more favorable) for WRTDS for any record length; for 30 stations with records longer than 25 years, the greatest degree of improvement was evident for sediment (decrease of 0.17 in median statistic) and total phosphorus (decrease of 0.05). The overall between-station pattern in concentration trend direction and magnitude for all constituents was roughly similar for both models. A detailed case study revealed that trends in concentration estimated by WRTDS can operationally be viewed as a less-constrained equivalent to trends in concentration estimated by ESTIMATOR. Estimates of annual mean flow
KELEŞ, Taliha; ALTUN, Murat
2016-01-01
Regression analysis is a statistical technique for investigating and modeling the relationship between variables. The purpose of this study was the presentation of the equation for orthogonal regression (OR) and the comparison of classical linear regression (CLR) and OR techniques with respect to the sum of squared perpendicular distances. For that purpose, the analyses were illustrated with an example. It was found that the sum of squared perpendicular distances of OR is smaller. Thus, it wa...
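For a single predictor, the orthogonal (total least squares) fit that minimizes perpendicular distances has a closed form: the line's direction is the leading principal component of the centred data, which an SVD gives directly. A hedged sketch on synthetic data with noise in both variables (the true line and noise levels are made-up illustrations):

```python
import numpy as np

def orthogonal_fit(x, y):
    # Centre the data; the leading right singular vector of the centred
    # (x, y) cloud is the direction minimizing perpendicular distances.
    X = np.column_stack([x - x.mean(), y - y.mean()])
    _, _, vt = np.linalg.svd(X, full_matrices=False)
    dx, dy = vt[0]
    slope = dy / dx
    intercept = y.mean() - slope * x.mean()
    return slope, intercept

rng = np.random.default_rng(6)
x_true = rng.uniform(0, 10, 200)
y_true = 2.0 * x_true + 1.0
x_obs = x_true + 0.2 * rng.normal(size=200)  # noise in BOTH variables
y_obs = y_true + 0.2 * rng.normal(size=200)
slope, intercept = orthogonal_fit(x_obs, y_obs)
print(round(slope, 2), round(intercept, 2))
```

Unlike CLR, which minimizes vertical distances and attenuates the slope when x is noisy, this fit treats errors in both variables symmetrically, which is why its sum of squared perpendicular distances is smaller.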
Hilbe, Joseph M
2009-01-01
This book really does cover everything you ever wanted to know about logistic regression … with updates available on the author's website. Hilbe, a former national athletics champion, philosopher, and expert in astronomy, is a master at explaining statistical concepts and methods. Readers familiar with his other expository work will know what to expect-great clarity.The book provides considerable detail about all facets of logistic regression. No step of an argument is omitted so that the book will meet the needs of the reader who likes to see everything spelt out, while a person familiar with some of the topics has the option to skip "obvious" sections. The material has been thoroughly road-tested through classroom and web-based teaching. … The focus is on helping the reader to learn and understand logistic regression. The audience is not just students meeting the topic for the first time, but also experienced users. I believe the book really does meet the author's goal … .-Annette J. Dobson, Biometric...
Covariate Imbalance and Adjustment for Logistic Regression Analysis of Clinical Trial Data
Ciolino, Jody D.; Martin, Reneé H.; Zhao, Wenle; Jauch, Edward C.; Hill, Michael D.; Palesch, Yuko Y.
2014-01-01
In logistic regression analysis for binary clinical trial data, adjusted treatment effect estimates are often not equivalent to unadjusted estimates in the presence of influential covariates. This paper uses simulation to quantify the benefit of covariate adjustment in logistic regression. However, International Conference on Harmonization guidelines suggest that covariate adjustment be pre-specified. Unplanned adjusted analyses should be considered secondary. Results suggest that if adjustment is not possible or unplanned in a logistic setting, balance in continuous covariates can alleviate some (but never all) of the shortcomings of unadjusted analyses. The case of log binomial regression is also explored. PMID:24138438
Krivolutsky, Alexei A.; Nazarova, Margarita; Knyazeva, Galina
Solar activity influences the atmospheric photochemical system via its changeable electromagnetic flux with an eleven-year period, and also through energetic particles during solar proton events (SPEs). Energetic particles penetrate mostly into polar regions and induce additional production of NOx and HOx chemical compounds, which can destroy ozone in photochemical catalytic cycles. Solar irradiance variations cause in-phase variability of ozone in accordance with photochemical theory. However, the real ozone response caused by these two factors, which have different physical natures, is not so clear on a long-term time scale. In order to understand the situation, a multiple linear regression statistical method was used. Three data series covering the period 1958-2006 have been used in this analysis: yearly averaged total ozone at different latitudes (World Ozone Data Centre, Canada, WMO); yearly averaged proton fluxes with E ≥ 10 MeV (IMP, GOES, METEOR satellites); and yearly averaged numbers of solar spots (Solar Data). Before the analysis, data sets of ozone deviations from the mean values for the whole period (1958-2006) at each latitudinal belt were prepared. The results of the multiple regression analysis (two factors) revealed rather complicated time-dependent behavior of the ozone response, with clear negative peaks for the years of strong SPEs. The magnitudes of such peaks on an annual mean basis are not greater than 10 DU. An unusual effect, a positive response of ozone to solar proton activity near both poles, was discovered by the statistical analysis. The possible photochemical nature of the found effect is discussed. This work was supported by the Russian Science Foundation for Basic Research (grant 09-05-009949) and by contract 1-6-08 under the Russian Sub-Program "Research and Investigation of Antarctica".
Kjellenberg, Lars; Granstedt, Artur
2015-09-15
The aim of this paper was to present results from two long term field experiments comparing potato samples from conventional farming systems with samples from biodynamic farming systems. The principal component analyses (PCA), consistently exhibited differences between potato samples from the two farming systems. According to the PCA, potato samples treated with inorganic fertilizers exhibited a variation positively related to amounts of crude protein, yield, cooking or tissue discoloration and extract decomposition. Potato samples treated according to biodynamic principles, with composted cow manure, were more positively related to traits such as Quality- and EAA-indices, dry matter content, taste quality, relative proportion of pure protein and biocrystallization value. Distinctions between years, crop rotation and cultivars used were sometimes more significant than differences between manuring systems. Grown after barley the potato crop exhibited better quality traits compared to when grown after ley in both the conventional and the biodynamic farming system.
Steganalysis using logistic regression
Lubenko, Ivans; Ker, Andrew D.
2011-02-01
We advocate Logistic Regression (LR) as an alternative to the Support Vector Machine (SVM) classifiers commonly used in steganalysis. LR offers more information than traditional SVM methods - it estimates class probabilities as well as providing a simple classification - and can be adapted more easily and efficiently for multiclass problems. Like SVM, LR can be kernelised for nonlinear classification, and it shows comparable classification accuracy to SVM methods. This work is a case study, comparing the accuracy and speed of SVM and LR classifiers in the detection of LSB Matching and other related spatial-domain image steganography, through the state-of-the-art 686-dimensional SPAM feature set, on three image sets.
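The point that LR supplies class probabilities, not just labels, can be sketched with a hand-rolled logistic regression on synthetic two-class "features". The SPAM features and image data are not reproduced here; the dimensions and class shift below are illustrative assumptions.

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, steps=2000):
    """Logistic regression by plain gradient descent on the log-loss."""
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-X @ w))
        w -= lr * X.T @ (p - y) / len(y)
    return w

rng = np.random.default_rng(1)
cover = rng.normal(0.0, 1.0, (200, 4))   # stand-in "cover" features
stego = rng.normal(0.8, 1.0, (200, 4))   # stand-in "stego" features, shifted
X = np.hstack([np.ones((400, 1)), np.vstack([cover, stego])])  # add intercept
y = np.r_[np.zeros(200), np.ones(200)]

w = fit_logistic(X, y)
probs = 1.0 / (1.0 + np.exp(-X @ w))  # class probabilities, not just labels
labels = (probs > 0.5).astype(int)    # a hard classification when needed
```

The probabilities carry more information than the hard labels: they quantify how confidently each image is flagged.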
SEPARATION PHENOMENA IN LOGISTIC REGRESSION
Directory of Open Access Journals (Sweden)
Ikaro Daniel de Carvalho Barreto
2014-03-01
Full Text Available This paper applies concepts from maximum likelihood estimation of the binomial logistic regression model to the separation phenomenon. Separation generates bias in the estimation, leads to different interpretations of the estimates under the different statistical tests (Wald, Likelihood Ratio and Score), and yields different estimates under the different iterative methods (Newton-Raphson and Fisher Scoring). The paper also presents an example that demonstrates the direct implications for the validation of the model and of variables, and for the estimates of odds ratios and confidence intervals generated from the Wald statistic. Furthermore, we briefly present the Firth correction to circumvent the separation phenomenon.
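A minimal sketch of the separation phenomenon: on perfectly separated data the unpenalized maximum likelihood estimate diverges (the slope grows without bound with more iterations), while a penalty keeps it finite. An L2 penalty is used below as a simple stand-in for the Firth correction (Firth penalizes by the Jeffreys prior, not L2).

```python
import numpy as np

def fit_logistic(X, y, lam=0.0, lr=0.5, steps=5000):
    """Gradient ascent on the (optionally L2-penalized) log-likelihood."""
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-X @ w))
        w += lr * (X.T @ (y - p) / len(y) - lam * w)
    return w

# Perfect separation: x < 0 always gives y = 0, x > 0 always gives y = 1.
x = np.array([-3.0, -2.0, -1.0, 1.0, 2.0, 3.0])
X = np.column_stack([np.ones(6), x])
y = np.array([0.0, 0.0, 0.0, 1.0, 1.0, 1.0])

w_mle = fit_logistic(X, y)            # slope keeps growing: no finite MLE
w_pen = fit_logistic(X, y, lam=0.1)   # penalty yields a finite estimate
```

Running the unpenalized fit longer only inflates the slope further, which is why Wald-based confidence intervals become meaningless under separation.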
DEFF Research Database (Denmark)
Ozenne, Brice; Sørensen, Anne Lyngholm; Scheike, Thomas
2017-01-01
In the presence of competing risks a prediction of the time-dynamic absolute risk of an event can be based on cause-specific Cox regression models for the event and the competing risks (Benichou and Gail, 1990). We present computationally fast and memory optimized C++ functions with an R interface......-product we obtain fast access to the baseline hazards (compared to survival::basehaz()) and predictions of survival probabilities, their confidence intervals and confidence bands. Confidence intervals and confidence bands are based on point-wise asymptotic expansions of the corresponding statistical...
Adaptive metric kernel regression
DEFF Research Database (Denmark)
Goutte, Cyril; Larsen, Jan
2000-01-01
Kernel smoothing is a widely used non-parametric pattern recognition technique. By nature, it suffers from the curse of dimensionality and is usually difficult to apply to high input dimensions. In this contribution, we propose an algorithm that adapts the input metric used in multivariate regression by minimising a cross-validation estimate of the generalisation error. This allows one to automatically adjust the importance of different dimensions. The improvement in terms of modelling performance is illustrated on a variable selection task where the adaptive metric kernel clearly outperforms...
Adaptive Metric Kernel Regression
DEFF Research Database (Denmark)
Goutte, Cyril; Larsen, Jan
1998-01-01
Kernel smoothing is a widely used nonparametric pattern recognition technique. By nature, it suffers from the curse of dimensionality and is usually difficult to apply to high input dimensions. In this paper, we propose an algorithm that adapts the input metric used in multivariate regression by minimising a cross-validation estimate of the generalisation error. This allows one to automatically adjust the importance of different dimensions. The improvement in terms of modelling performance is illustrated on a variable selection task where the adaptive metric kernel clearly outperforms the standard...
Tok, Ezgi; Kurt, Halil; Tunga Akarsubasi, A.
2016-04-01
The microbial diversity of cave sediments obtained from three different caves, named Insuyu, Balatini and Altınbeşik, located in southern Turkey, has been investigated using molecular methods for biomineralization. A total of 22 samples were taken, in duplicate, from the critical zones of the caves where water activity is observed all year round. Microbial communities were monitored by 16S rRNA gene-based PCR-DGGE (Polymerase Chain Reaction - Denaturing Gradient Gel Electrophoresis) methodology. DNA was extracted from the samples with the PowerSoil® DNA Isolation Kit (MO BIO Laboratories Inc., CA), with modifications to the manufacturer's protocol. The synthetic DNA molecule poly-dIdC was used to increase the yield of PCR amplification by blocking the reaction between CaCO3 and DNA molecules. Thereafter, samples were amplified using both archaeal and bacterial universal primers (ref). Subsequently, the archaeal and bacterial diversities in the cave sediments were compared with respect to their similarities using DGGE. DGGE patterns were analysed with BioNumerics software 5.1. Similarity matrices and dendrograms of the DGGE profiles were generated based on the Dice correlation coefficient (band-based) and the unweighted pair-group method with arithmetic mean (UPGMA). The structural diversity of the microbial community was examined by the Shannon index of general diversity (H). Simultaneously, geochemical analyses of the sediment samples were performed within the scope of this study. Total organic carbon (TOC), X-ray diffraction (XRD) and X-ray fluorescence (XRF) analyses of the sediments were also performed. More extensive results will be obtained in the next stages of the study, which is currently ongoing.
Energy Technology Data Exchange (ETDEWEB)
Buhmann, Dieter; Becker, Dirk-Alexander; Laggiard, Eduardo; Ruebel, Andre; Spiessl, Sabine; Wolf, Jens
2016-07-15
In the frame of the project ISIBEL, deterministic analyses of the radiological consequences of several possible developments of the final repository were performed (VSG: preliminary safety analysis of the Gorleben site). The report describes the probabilistic evaluation of the VSG scenarios using uncertainty and sensitivity analyses. It was shown that probabilistic analyses are important for evaluating the influence of uncertainties. The transfer of the selected scenarios into computational cases and the modelling parameters used are discussed.
International Nuclear Information System (INIS)
Morland, E.; Sherry, A.H.
1993-01-01
A series of six large-scale experiments has been carried out at AEA Technology using the Spinning Cylinder test facility. Results from two of those experiments (SC-I and SC-II) have been provided to Project FALSIRE and are reviewed in this paper. The Spinning Cylinder tests were carried out using hollow cylinders of 1.4 m outer diameter, 0.2 m wall thickness and 1.3 m length, containing full-length axial defects and fabricated from a modified A508 Class 3 steel. The first Spinning Cylinder test (SC-I) was an investigation of stable ductile growth induced via mechanical (primary) loading under conditions of contained yielding. Mechanical loading was provided in the hoop direction by rotating the cylinder about its major axis within an enclosed oven. The second test (SC-II) investigated stable ductile growth under severe thermal shock (secondary) loading, again under conditions of contained yielding. In this case thermal shock was produced by spraying cold water on the inside surface of the heated cylinder whilst it was rotating. For each experiment, results are presented in terms of a number of variables, e.g. crack growth, temperature, stress, strain and applied K and J. In addition, an overview of the analyses of the FALSIRE Phase-1 report is also presented with respect to tests SC-I and SC-II. 4 refs., 14 figs., 13 tabs
Damm, Gabriele; Macha, Thorsten; Petermann, Franz; Voss, Wolfgang; Sens, Brigitte
2015-01-01
Based on perinatal and neonatal quality assurance programmes, a follow-up project for the high-risk group of extremely preterm infants, unparalleled in Germany, was initiated in the federal state of Lower Saxony in 2004. Here we describe the new approach of examining a comparison group of term infants, which, for the first time, allows a valid interpretation of the collection of area-wide long-term outcome data on preterm children. The prospective long-term outcome project investigates the medical care situation for children born at less than 28 weeks of gestation up to school age. Based on the information obtained about the children's development the quality of health care will be optimised. A standardised examining concept with established development tests at defined follow-up intervals (at the age of 6 months, 2, 5 and 10 years) is used. At the age of five years 75 % of the examined premature children exhibited impairments. In order to better assess remarkable results, a comparison group of term infants (n=305) selected by a matched-pairs method was examined at the age of five using an analogous concept in kindergartens in Lower Saxony. The results were compared with the first two age cohorts of the follow-up-project (n=226) and quality analyses performed. As expected, significant differences have been found in the children's motor, cognitive and linguistic development between the preterm and term infants examined. This fact draws attention to the importance of early support for the majority of extremely premature infants. Feedback on the results given to the medical staff involved allows for the implementation of best practices and quality improvements. Identifying potential for improvement in everyday health care will help to develop specific optimisation measures. Copyright © 2015. Published by Elsevier GmbH.
Directory of Open Access Journals (Sweden)
Salabura Piotr
2017-01-01
Full Text Available The HADES experiment at GSI is the only high-precision experiment probing nuclear matter in the beam energy range of a few AGeV. Pion, proton and ion beams are used to study rare dielectron and strangeness probes in order to diagnose the properties of strongly interacting matter in this energy regime. Selected results from p + A and A + A collisions are presented and discussed.
Use of probabilistic weights to enhance linear regression myoelectric control
Smith, Lauren H.; Kuiken, Todd A.; Hargrove, Levi J.
2015-12-01
Objective. Clinically available prostheses for transradial amputees do not allow simultaneous myoelectric control of degrees of freedom (DOFs). Linear regression methods can provide simultaneous myoelectric control, but frequently also result in difficulty with isolating individual DOFs when desired. This study evaluated the potential of using probabilistic estimates of categories of gross prosthesis movement, which are commonly used in classification-based myoelectric control, to enhance linear regression myoelectric control. Approach. Gaussian models were fit to electromyogram (EMG) feature distributions for three movement classes at each DOF (no movement, or movement in either direction) and used to weight the output of linear regression models by the probability that the user intended the movement. Eight able-bodied and two transradial amputee subjects worked in a virtual Fitts’ law task to evaluate differences in controllability between linear regression and probability-weighted regression for an intramuscular EMG-based three-DOF wrist and hand system. Main results. Real-time and offline analyses in able-bodied subjects demonstrated that probability weighting improved performance during single-DOF tasks (p < 0.05) by preventing extraneous movement at additional DOFs. Similar results were seen in experiments with two transradial amputees. Though goodness-of-fit evaluations suggested that the EMG feature distributions showed some deviations from the Gaussian, equal-covariance assumptions used in this experiment, the assumptions were sufficiently met to provide improved performance compared to linear regression control. Significance. Use of probability weights can improve the ability to isolate individual DOFs during linear regression myoelectric control, while maintaining the ability to simultaneously control multiple DOFs.
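The weighting scheme described above can be sketched as follows: Gaussian class models give a posterior probability of intended movement, which scales the linear-regression output toward zero when the user likely intends no movement. The feature values, class means and the regression line below are illustrative assumptions, not the study's EMG data.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic 1-D "EMG feature" samples for two classes at one DOF.
rest = rng.normal(0.0, 1.0, 500)   # no-movement class
move = rng.normal(3.0, 1.0, 500)   # movement class

# Gaussian class models with a shared (equal) covariance, as in the abstract.
mu_rest, mu_move = rest.mean(), move.mean()
sigma = np.r_[rest - mu_rest, move - mu_move].std()

def p_move(x):
    """Posterior probability of the movement class (equal priors assumed)."""
    like_move = np.exp(-(x - mu_move) ** 2 / (2 * sigma ** 2))
    like_rest = np.exp(-(x - mu_rest) ** 2 / (2 * sigma ** 2))
    return like_move / (like_move + like_rest)

def regression_output(x):
    """Illustrative stand-in for the linear-regression velocity command."""
    return 0.4 * x + 0.1

# Probability weighting suppresses extraneous output at rest but passes
# the command through when movement is likely intended.
weighted_rest = p_move(0.0) * regression_output(0.0)
weighted_move = p_move(3.0) * regression_output(3.0)
```

At rest the weighted command is driven close to zero, which is the mechanism the study credits for preventing extraneous movement at unintended DOFs.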
Meurier, C E
2000-07-01
Human errors are common in clinical practice, but they are under-reported. As a result, very little is known of the types, antecedents and consequences of errors in nursing practice. This limits the potential to learn from errors and to make improvement in the quality and safety of nursing care. The aim of this study was to use an Organizational Accident Model to analyse critical incidents of errors in nursing. Twenty registered nurses were invited to produce a critical incident report of an error (which had led to an adverse event or potentially could have led to an adverse event) they had made in their professional practice and to write down their responses to the error using a structured format. Using Reason's Organizational Accident Model, supplemental information was then collected from five of the participants by means of an individual in-depth interview to explore further issues relating to the incidents they had reported. The detailed analysis of one of the incidents is discussed in this paper, demonstrating the effectiveness of this approach in providing insight into the chain of events which may lead to an adverse event. The case study approach using critical incidents of clinical errors was shown to provide relevant information regarding the interaction of organizational factors, local circumstances and active failures (errors) in producing an adverse or potentially adverse event. It is suggested that more use should be made of this approach to understand how errors are made in practice and to take appropriate preventative measures.
Polylinear regression analysis in radiochemistry
International Nuclear Information System (INIS)
Kopyrin, A.A.; Terent'eva, T.N.; Khramov, N.N.
1995-01-01
A number of radiochemical problems have been formulated in the framework of polylinear regression analysis, which permits the use of conventional mathematical methods for their solution. The authors have considered features of the use of polylinear regression analysis for estimating the contributions of various sources to atmospheric pollution, for studying irradiated nuclear fuel, for estimating concentrations from spectral data, for measuring neutron fields of a nuclear reactor, for estimating crystal lattice parameters from X-ray diffraction patterns, for interpreting data of X-ray fluorescence analysis, for estimating complex formation constants, and for analyzing results of radiometric measurements. The problem of estimating the target parameters can be ill-posed for certain properties of the system under study. The authors showed the possibility of regularization by adding a fictitious set of data "obtained" from the orthogonal design. To estimate only a part of the parameters under consideration, the authors used incomplete rank models. In this case, it is necessary to take into account the possibility of confounding estimates. An algorithm for evaluating the degree of confounding is presented, which is realized using standard software for regression analysis.
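The regularization idea mentioned above, adding a fictitious set of "observations", can be sketched in the linear case: appending sqrt(lambda)·I rows to the design matrix and zeros to the response makes ordinary least squares solve a ridge problem, which stabilizes the estimates when predictors are nearly collinear. The data here are synthetic.

```python
import numpy as np

rng = np.random.default_rng(3)

# Ill-conditioned design: two nearly collinear predictors.
x1 = rng.normal(size=50)
x2 = x1 + 1e-4 * rng.normal(size=50)
X = np.column_stack([x1, x2])
y = x1 + rng.normal(scale=0.1, size=50)

# Plain least squares: the near-collinearity makes coefficients unstable.
b_plain, *_ = np.linalg.lstsq(X, y, rcond=None)

# "Fictitious" orthogonal observations with zero response: appending
# sqrt(lam) * I rows makes lstsq solve min ||y - Xb||^2 + lam * ||b||^2.
lam = 1.0
X_aug = np.vstack([X, np.sqrt(lam) * np.eye(2)])
y_aug = np.r_[y, 0.0, 0.0]
b_reg, *_ = np.linalg.lstsq(X_aug, y_aug, rcond=None)
```

The augmented fit spreads the effect across the collinear pair (each coefficient near 0.5, summing to about 1), instead of the large cancelling coefficients plain least squares can return.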
Measurement Error in Education and Growth Regressions
Portela, M.; Teulings, C.N.; Alessie, R.
The perpetual inventory method used for the construction of education data per country leads to systematic measurement error. This paper analyses the effect of this measurement error on GDP regressions. There is a systematic difference in the education level between census data and observations
Luo, Chongliang; Liu, Jin; Dey, Dipak K; Chen, Kun
2016-07-01
In many fields, multi-view datasets, measuring multiple distinct but interrelated sets of characteristics on the same set of subjects, together with data on certain outcomes or phenotypes, are routinely collected. The objective in such a problem is often two-fold: both to explore the association structures of multiple sets of measurements and to develop a parsimonious model for predicting the future outcomes. We study a unified canonical variate regression framework to tackle the two problems simultaneously. The proposed criterion integrates multiple canonical correlation analysis with predictive modeling, balancing between the association strength of the canonical variates and their joint predictive power on the outcomes. Moreover, the proposed criterion seeks multiple sets of canonical variates simultaneously to enable the examination of their joint effects on the outcomes, and is able to handle multivariate and non-Gaussian outcomes. An efficient algorithm based on variable splitting and Lagrangian multipliers is proposed. Simulation studies show the superior performance of the proposed approach. We demonstrate the effectiveness of the proposed approach in an [Formula: see text] intercross mice study and an alcohol dependence study. © The Author 2016. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
International Nuclear Information System (INIS)
Massara, Simone
2013-01-01
Since the very first hours after the accident at Fukushima-Daiichi, numerical simulations by means of severe accident codes have been carried out, aiming at highlighting the key physical phenomena allowing a correct understanding of the sequence of events, and - on a long enough timeline - improving models and methods, in order to reduce the discrepancy between calculated and measured data. A last long-term objective is to support the future decommissioning phase. The presentation summarises some of the available elements on the role of the fuel/cladding-water interaction, which became available only through modelling because of the absence of measured data directly related to the cladding-steam interaction. This presentation also aims at drawing some conclusions on the status of the modelling capabilities of current tools, particularly for the purpose of the foreseen application to ATF fuels: - analyses with MELCOR, MAAP, THALES2 and RELAP5 are presented; - input data are taken from BWR Mark-I Fukushima-Daiichi Units 1, 2 and 3, completed with operational data published by TEPCO. In the case of missing or incomplete data or hypotheses, these are adjusted to reduce the calculation/measurement discrepancy. The behaviour of the accident is well understood on a qualitative level (major trends on RPV pressure and water level, dry-wet and PCV pressure are well represented), allowing a certain level of confidence in the results of the analysis of the zirconium-steam reaction - which is accessible only through numerical simulations. These show an extremely fast sequence of events (here for Unit 1): - the top of fuel is uncovered in 3 hours (after the tsunami); - the steam line breaks at 6.5 hours. Vessel dries at 10 hours, with a heat-up rate in a first moment driven by the decay heat only (∼7 K/min) and afterwards by the chemical heat from Zr-oxidation (over 30 K/min), associated with massive hydrogen production. It appears that the level of uncertainty increases with
Suppression Situations in Multiple Linear Regression
Shieh, Gwowen
2006-01-01
This article proposes alternative expressions for the two most prevailing definitions of suppression without resorting to the standardized regression modeling. The formulation provides a simple basis for the examination of their relationship. For the two-predictor regression, the author demonstrates that the previous results in the literature are…
DEFF Research Database (Denmark)
Kudahl, Anne Braad; Karydi, Emmanouela; Vaarst, Mette
2015-01-01
chicory and salad burnet. In both fields, birdsfoot trefoil, yellow sweet clover, Sainfoin and starflower never established, although the original seed mixture had a quite high content of their seeds. The plant coverage analyses of the fields, which were one to six years old, generally, dandelions...
International Nuclear Information System (INIS)
Janssen, I.; Stebbings, J.H.
1990-01-01
In environmental epidemiology, trace and toxic substance concentrations frequently have very highly skewed distributions ranging over one or more orders of magnitude, and prediction by conventional regression is often poor. Classification and Regression Tree Analysis (CART) is an alternative in such contexts. To compare the techniques, two Pennsylvania data sets and three independent variables are used: house radon progeny (RnD) and gamma levels as predicted by construction characteristics in 1330 houses; and ∼200 house radon (Rn) measurements as predicted by topographic parameters. CART may identify structural variables of interest not identified by conventional regression, and vice versa, but in general the regression models are similar. CART has major advantages in dealing with other common characteristics of environmental data sets, such as missing values, continuous variables requiring transformations, and large sets of potential independent variables. CART is most useful in the identification and screening of independent variables, greatly reducing the need for cross-tabulations and nested breakdown analyses. There is no need to discard cases with missing values for the independent variables because surrogate variables are intrinsic to CART. The tree-structured approach is also independent of the scale on which the independent variables are measured, so that transformations are unnecessary. CART identifies important interactions as well as main effects. The major advantages of CART appear to be in exploring data. Once the important variables are identified, conventional regressions seem to lead to similar results that are more interpretable by most audiences. 12 refs., 8 figs., 10 tabs
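The core of CART for a continuous response is an exhaustive search for the split that most reduces the residual sum of squares. A minimal sketch on synthetic data with a threshold effect (loosely in the spirit of radon versus a construction characteristic; the variables are invented):

```python
import numpy as np

def best_split(x, y):
    """Core of CART regression: search for the split threshold that
    most reduces the residual sum of squares across the two halves."""
    order = np.argsort(x)
    xs, ys = x[order], y[order]
    best_thr, best_sse = None, np.inf
    for i in range(1, len(xs)):
        left, right = ys[:i], ys[i:]
        sse = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if sse < best_sse:
            best_thr, best_sse = (xs[i - 1] + xs[i]) / 2, sse
    return best_thr, best_sse

rng = np.random.default_rng(4)
x = rng.uniform(0, 10, 200)
# Step function: a threshold effect conventional regression handles poorly.
y = np.where(x < 4.0, 1.0, 5.0) + rng.normal(0, 0.3, 200)

threshold, sse = best_split(x, y)
```

A full tree recurses this search on each half; note no transformation of x is needed, since only the ordering of x matters.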
Polynomial regression analysis and significance test of the regression function
International Nuclear Information System (INIS)
Gao Zhengming; Zhao Juan; He Shengping
2012-01-01
In order to analyze the decay heating power of a certain radioactive isotope per kilogram with the polynomial regression method, the paper first demonstrates the broad usage of the polynomial function and deduces its parameters with the ordinary least squares estimate. Then a significance test method for the polynomial regression function is derived, considering the similarity between the polynomial regression model and the multivariable linear regression model. Finally, polynomial regression analysis and a significance test of the polynomial function are applied to the decay heating power of the isotope per kilogram, in accordance with the authors' real work. (authors)
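A minimal sketch of the procedure described: fit a polynomial by ordinary least squares and test the overall significance of the regression with the usual F statistic, F = (SSR/p) / (SSE/(n - p - 1)). The "decay heat" data below are synthetic stand-ins, not the isotope data of the paper.

```python
import numpy as np

rng = np.random.default_rng(5)
t = np.linspace(0, 10, 40)
# Synthetic decay-heat-like data with a quadratic trend plus noise.
power = 5.0 - 0.8 * t + 0.05 * t ** 2 + rng.normal(0, 0.1, 40)

deg = 2
coeffs = np.polyfit(t, power, deg)   # ordinary least squares polynomial fit
fitted = np.polyval(coeffs, t)

# Overall significance of the regression (F-test against the mean-only model).
n = len(t)
sse = ((power - fitted) ** 2).sum()        # residual sum of squares
ssr = ((fitted - power.mean()) ** 2).sum() # regression sum of squares
F = (ssr / deg) / (sse / (n - deg - 1))
```

A large F relative to the F(deg, n - deg - 1) critical value rejects the hypothesis that all polynomial coefficients (beyond the constant) are zero.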
Recursive Algorithm For Linear Regression
Varanasi, S. V.
1988-01-01
Order of model determined easily. Linear-regression algorithm includes recursive equations for coefficients of model of increased order. Algorithm eliminates duplicative calculations, facilitates search for minimum order of linear-regression model fitting set of data satisfactorily.
πN → πN and KN → KN low energy data and partial wave analyses: recent results and new directions
International Nuclear Information System (INIS)
Kelly, R.L.
1975-07-01
This review deals with πN → πN and KN → KN physics below about 3 GeV/c. An attempt is made to convey the state of the art and to point out what appear to be promising directions for future research. The situation as of about one year ago is summarized in the 1974 Review of Particle Properties and in the London conference talks, so more recent developments are considered here. A comprehensive survey of πN → πN data between the Δ region and 3 GeV/c is given. Problems associated with spin-rotation experiments are discussed, together with the current πN → πN partial wave analyses and the I = 1 and I = 0 KN → KN analyses.
Combining Alphas via Bounded Regression
Directory of Open Access Journals (Sweden)
Zura Kakushadze
2015-11-01
Full Text Available We give an explicit algorithm and source code for combining alpha streams via bounded regression. In practical applications, typically, there is insufficient history to compute a sample covariance matrix (SCM) for a large number of alphas. To compute alpha allocation weights, one then resorts to (weighted) regression over SCM principal components. Regression often produces alpha weights with insufficient diversification and/or a skewed distribution against, e.g., turnover. This can be rectified by imposing bounds on alpha weights within the regression procedure. Bounded regression can also be applied to stock and other asset portfolio construction. We discuss illustrative examples.
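Imposing bounds on the weights within the regression, as described above, can be sketched with a bounded least-squares solver. The "alpha streams" below are synthetic columns, and the cap of 0.4 is an arbitrary diversification bound, not a value from the paper.

```python
import numpy as np
from scipy.optimize import lsq_linear

rng = np.random.default_rng(6)

# Columns play the role of alpha streams; y is a synthetic target.
A = rng.normal(size=(300, 5))
true_w = np.r_[0.5, 0.3, 0.2, 0.0, 0.0]
y = A @ true_w + rng.normal(scale=0.1, size=300)

# Bounded regression: weights constrained to [0, 0.4] to force
# diversification; the unconstrained weight of 0.5 gets clipped.
res = lsq_linear(A, y, bounds=(0.0, 0.4))
w = res.x
```

The bound is active for the dominant stream, so its weight sits at the cap and the excess is redistributed, which is the diversification effect the abstract describes.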
Regression in autistic spectrum disorders.
Stefanatos, Gerry A
2008-12-01
A significant proportion of children diagnosed with Autistic Spectrum Disorder experience a developmental regression characterized by a loss of previously acquired skills. This may involve a loss of speech or social responsiveness, but often entails both. This paper critically reviews the phenomenon of regression in autistic spectrum disorders, highlighting the characteristics of regression, age of onset, temporal course, and long-term outcome. Important considerations for diagnosis are discussed and multiple etiological factors currently hypothesized to underlie the phenomenon are reviewed. It is argued that regressive autistic spectrum disorders can be conceptualized on a spectrum with other regressive disorders that may share common pathophysiological features. The implications of this viewpoint are discussed.
Linear regression in astronomy. I
Isobe, Takashi; Feigelson, Eric D.; Akritas, Michael G.; Babu, Gutti Jogesh
1990-01-01
Five methods for obtaining linear regression fits to bivariate data with unknown or insignificant measurement errors are discussed: ordinary least-squares (OLS) regression of Y on X, OLS regression of X on Y, the bisector of the two OLS lines, orthogonal regression, and 'reduced major-axis' regression. These methods have been used by various researchers in observational astronomy, most importantly in cosmic distance scale applications. Formulas for calculating the slope and intercept coefficients and their uncertainties are given for all the methods, including a new general form of the OLS variance estimates. The accuracy of the formulas was confirmed using numerical simulations. The applicability of the procedures is discussed with respect to their mathematical properties, the nature of the astronomical data under consideration, and the scientific purpose of the regression. It is found that, for problems needing symmetrical treatment of the variables, the OLS bisector performs significantly better than orthogonal or reduced major-axis regression.
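The first three of the five methods can be sketched directly: the OLS(Y|X) slope, the OLS(X|Y) slope expressed in the same frame, and the slope of the line bisecting the two OLS lines (the bisector formula follows the form given by Isobe et al.). The data are synthetic bivariate points with scatter.

```python
import numpy as np

rng = np.random.default_rng(7)
x = rng.normal(size=100)
y = x + rng.normal(scale=0.5, size=100)   # synthetic bivariate data

# OLS(Y|X) slope and OLS(X|Y) slope, both expressed in the y-vs-x frame.
cxy = np.cov(x, y, ddof=1)[0, 1]
b_yx = cxy / np.var(x, ddof=1)            # regression of Y on X
b_xy = np.var(y, ddof=1) / cxy            # regression of X on Y, inverted

# Slope of the bisector of the two OLS lines.
b_bis = (b_yx * b_xy - 1.0
         + np.sqrt((1.0 + b_yx ** 2) * (1.0 + b_xy ** 2))) / (b_yx + b_xy)
```

With any scatter present, OLS(Y|X) underestimates and OLS(X|Y) overestimates the symmetric slope, and the bisector lies strictly between them, which is why it is preferred when the variables deserve symmetrical treatment.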
Energy Technology Data Exchange (ETDEWEB)
Roca, M
1985-07-01
Three programs, written in FORTRAN IV language, for reporting the results of the analyses by ICP emission spectroscopy from data stored in files on floppy disks have been developed. They are intended, respectively, for the analyses of: 1) waters, 2) granites and slates, and 3) different kinds of geological materials. (Author) 8 refs.
Regression of environmental noise in LIGO data
International Nuclear Information System (INIS)
Tiwari, V; Klimenko, S; Mitselmakher, G; Necula, V; Drago, M; Prodi, G; Frolov, V; Yakushin, I; Re, V; Salemi, F; Vedovato, G
2015-01-01
We address the problem of noise regression in the output of gravitational-wave (GW) interferometers, using data from the physical environmental monitors (PEM). The objective of the regression analysis is to predict environmental noise in the GW channel from the PEM measurements. One of the most promising regression methods is based on the construction of Wiener–Kolmogorov (WK) filters. Using this method, the seismic noise cancellation from the LIGO GW channel has already been performed. In the presented approach the WK method has been extended, incorporating banks of Wiener filters in the time–frequency domain, multi-channel analysis and regulation schemes, which greatly enhance the versatility of the regression analysis. Also we present the first results on regression of the bi-coherent noise in the LIGO data. (paper)
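A time-domain sketch of the regression idea: predict the environmental noise in a target channel from a lagged witness (PEM) channel by least squares, i.e. a finite-impulse-response Wiener filter, and subtract the prediction. The coupling filter, noise levels and channel contents are invented for illustration; real LIGO regression uses the time-frequency Wiener-Kolmogorov machinery described above.

```python
import numpy as np

rng = np.random.default_rng(8)
n = 5000
witness = rng.normal(size=n)   # PEM channel, e.g. a seismometer

# Target channel: a white "signal" floor plus environmental noise that
# is a filtered (coupled) version of the witness channel.
h_true = np.r_[0.6, 0.3, 0.1]
coupled = np.convolve(witness, h_true)[:n]
target = coupled + 0.2 * rng.normal(size=n)

# Lagged design matrix; least squares gives the FIR Wiener filter.
taps = 5
X = np.column_stack([np.r_[np.zeros(k), witness[:n - k]] for k in range(taps)])
h, *_ = np.linalg.lstsq(X, target, rcond=None)

# Regress the predicted environmental noise out of the target channel.
cleaned = target - X @ h
```

The residual variance drops to roughly the uncoupled floor, and the estimated taps recover the coupling filter, which is the sense in which the PEM channels "predict" the environmental noise.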
Advanced statistics: linear regression, part I: simple linear regression.
Marill, Keith A
2004-01-01
Simple linear regression is a mathematical technique used to model the relationship between a single independent predictor variable and a single dependent outcome variable. In this, the first of a two-part series exploring concepts in linear regression analysis, the four fundamental assumptions and the mechanics of simple linear regression are reviewed. The most common technique used to derive the regression line, the method of least squares, is described. The reader will be acquainted with other important concepts in simple linear regression, including: variable transformations, dummy variables, relationship to inference testing, and leverage. Simplified clinical examples with small datasets and graphic models are used to illustrate the points. This will provide a foundation for the second article in this series: a discussion of multiple linear regression, in which there are multiple predictor variables.
Linear regression in astronomy. II
Feigelson, Eric D.; Babu, Gutti J.
1992-01-01
A wide variety of least-squares linear regression procedures used in observational astronomy, particularly investigations of the cosmic distance scale, are presented and discussed. The classes of linear models considered are (1) unweighted regression lines, with bootstrap and jackknife resampling; (2) regression solutions when measurement error, in one or both variables, dominates the scatter; (3) methods to apply a calibration line to new data; (4) truncated regression models, which apply to flux-limited data sets; and (5) censored regression models, which apply when nondetections are present. For the calibration problem we develop two new procedures: a formula for the intercept offset between two parallel data sets, which propagates slope errors from one regression to the other; and a generalization of the Working-Hotelling confidence bands to nonstandard least-squares lines. They can provide improved error analysis for Faber-Jackson, Tully-Fisher, and similar cosmic distance scale relations.
Time-adaptive quantile regression
DEFF Research Database (Denmark)
Møller, Jan Kloppenborg; Nielsen, Henrik Aalborg; Madsen, Henrik
2008-01-01
An algorithm for time-adaptive quantile regression is presented. The algorithm is based on the simplex algorithm, and the linear optimization formulation of the quantile regression problem is given. The observations have been split to allow a direct use of the simplex algorithm. The simplex method and an updating procedure are combined into a new algorithm for time-adaptive quantile regression, which generates new solutions on the basis of the old solution, leading to savings in computation time. The suggested algorithm is tested against a static quantile regression model on a data set with wind power production, where the models combine splines and quantile regression. The comparison indicates superior performance for the time-adaptive quantile regression in all the performance parameters considered.
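The linear-optimization formulation mentioned above can be sketched directly: quantile regression for quantile tau is the linear program min tau*1'u + (1-tau)*1'v subject to Xb + u - v = y, u, v >= 0, solvable with an off-the-shelf LP solver. The simplex-with-updating machinery of the paper is not reproduced here; the data are synthetic.

```python
import numpy as np
from scipy.optimize import linprog

def quantile_reg(X, y, tau):
    """Quantile regression as a linear program: residuals are split into
    positive part u and negative part v, weighted tau and (1 - tau)."""
    n, p = X.shape
    c = np.r_[np.zeros(p), tau * np.ones(n), (1 - tau) * np.ones(n)]
    A_eq = np.hstack([X, np.eye(n), -np.eye(n)])  # Xb + u - v = y
    bounds = [(None, None)] * p + [(0, None)] * (2 * n)
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=bounds, method="highs")
    return res.x[:p]

rng = np.random.default_rng(9)
x = rng.uniform(0, 1, 200)
X = np.column_stack([np.ones(200), x])
y = 1.0 + 2.0 * x + rng.normal(scale=0.5, size=200)

b_med = quantile_reg(X, y, 0.5)   # median regression line
b_hi = quantile_reg(X, y, 0.9)    # upper-quantile line, shifted upward
```

The time-adaptive algorithm of the paper exploits the fact that, as new observations arrive, the optimal basis changes little, so the old solution warm-starts the new one.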
Quantile regression theory and applications
Davino, Cristina; Vistocco, Domenico
2013-01-01
A guide to the implementation and interpretation of quantile regression models. This book explores the theory and numerous applications of quantile regression, offering empirical data analysis as well as the software tools to implement the methods. The main focus of this book is to provide the reader with a comprehensive description of the main issues concerning quantile regression; these include basic modeling, geometrical interpretation, estimation and inference for quantile regression, as well as issues of model validity and diagnostic tools. Each methodological aspect is explored and…
Variable and subset selection in PLS regression
DEFF Research Database (Denmark)
Høskuldsson, Agnar
2001-01-01
The purpose of this paper is to present some useful methods for introductory analysis of variables and subsets in relation to PLS regression. We present here methods that are efficient in finding the appropriate variables or subsets to use in the PLS regression. The general conclusion is that variable selection is important for successful analysis of chemometric data. An important aspect of the results presented is that lack of variable selection can spoil the PLS regression, and that cross-validation measures using a test set can show larger variation, when we use different subsets of X, than…
Time-trend of melanoma screening practice by primary care physicians: A meta-regression analysis
Valachis, Antonis; Mauri, Davide; Karampoiki, Vassiliki; Polyzos, Nikolaos P; Cortinovis, Ivan; Koukourakis, Georgios; Zacharias, Georgios; Xilomenos, Apostolos; Tsappi, Maria; Casazza, Giovanni
2009-01-01
Objective: To assess whether the proportion of primary care physicians implementing full body skin examination (FBSE) to screen for melanoma changed over time. Methods: Meta-regression analyses of available data. Data sources: MEDLINE, ISI, Cochrane Central Register of Controlled Trials. Results: Fifteen studies surveying 10,336 physicians were included in the analyses. Overall, 15%–82% of them reported performing FBSE to screen for melanoma. The proportion of physicians using FBSE screening ten…
Sieverink, Floor; Kelders, Saskia M; Braakman-Jansen, Louise M A; van Gemert-Pijnen, Julia E W C
2014-03-01
The electronic personal health record (PHR) is a promising technology for improving the quality of chronic disease management. Until now, evaluations of such systems have provided little insight into why a particular outcome occurred. The aim of this study is to gain insight into the navigation process (what functionalities are used, and in what sequence) of e-Vita, a PHR for patients with type 2 diabetes mellitus (T2DM), in order to increase the efficiency of the system and improve long-term adherence. Log data from the first visits in the first 6 weeks after the release of a renewed version of e-Vita were analyzed to identify the usage patterns that emerge when users explore a new application. After receiving the invitation, 28% of all registered users visited e-Vita. In total, 70 unique usage patterns could be identified. When users visited the education service first, 93% of all users ended their session. Most users visited either 1 or 5 or more services during their first session, but the distribution of the routes was diffuse. In conclusion, log file analyses can provide valuable prompts for improving the system design of a PHR. In this way, the match between the system and its users, and thus long-term adherence, has the potential to increase. © 2014 Diabetes Technology Society.
Interpret with caution: multicollinearity in multiple regression of cognitive data.
Morrison, Catriona M
2003-08-01
Shibihara and Kondo in 2002 reported a reanalysis of the 1997 Kanji picture-naming data of Yamazaki, Ellis, Morrison, and Lambon-Ralph in which independent variables were highly correlated. Their addition of the variable visual familiarity altered the previously reported pattern of results, indicating that visual familiarity, but not age of acquisition, was important in predicting Kanji naming speed. The present paper argues that caution should be taken when drawing conclusions from multiple regression analyses in which the independent variables are so highly correlated, as such multicollinearity can lead to unreliable output.
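Multicollinearity of the kind cautioned against here is commonly screened with the variance inflation factor (VIF), which the abstract does not name but which is the standard first diagnostic. A sketch for the two-predictor case, where the VIF of each predictor reduces to a function of the correlation between the predictors (toy data, not from the cited study):

```python
def correlation(xs, ys):
    """Pearson correlation between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / (sxx * syy) ** 0.5

def vif_two_predictors(x1, x2):
    """VIF of each predictor in the two-predictor case: 1 / (1 - r^2),
    where r is the correlation between the predictors. Perfectly
    collinear predictors would make this blow up toward infinity."""
    r = correlation(x1, x2)
    return 1.0 / (1.0 - r * r)

x1 = [1.0, 2.0, 3.0, 4.0, 5.0]
x2 = [1.1, 2.1, 2.9, 4.2, 4.9]   # nearly collinear with x1
print(vif_two_predictors(x1, x2) > 10)  # -> True; a VIF above ~10 is a common warning sign
```

With more predictors the same idea applies, except r² becomes the R² from regressing each predictor on all the others.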
Regression analysis of growth responses to water depth in three wetland plant species
DEFF Research Database (Denmark)
Sorrell, Brian K; Tanner, Chris C; Brix, Hans
2012-01-01
…depths from 0–0.5 m. Morphological and growth responses to depth were followed for 54 days before harvest, and then analysed by repeated measures analysis of covariance, and non-linear and quantile regression analysis (QRA), to compare flooding tolerances. Principal results: Growth responses to depth…
Two Paradoxes in Linear Regression Analysis
FENG, Ge; PENG, Jing; TU, Dongke; ZHENG, Julia Z.; FENG, Changyong
2016-01-01
Summary: Regression is one of the favorite tools in applied statistics. However, misuse and misinterpretation of results from regression analysis are common in biomedical research. In this paper we use statistical theory and simulation studies to clarify some paradoxes around this popular statistical method. In particular, we show that a widely used model selection procedure employed in many publications in top medical journals is wrong. Formal procedures based on solid statistical theory should be used in model selection. PMID:28638214
Caudal regression syndrome : a case report
International Nuclear Information System (INIS)
Lee, Eun Joo; Kim, Hi Hye; Kim, Hyung Sik; Park, So Young; Han, Hye Young; Lee, Kwang Hun
1998-01-01
Caudal regression syndrome is a rare congenital anomaly which results from a developmental failure of the caudal mesoderm during the fetal period. We present a case of caudal regression syndrome comprising a spectrum of anomalies including sirenomelia; dysplasia of the lower lumbar vertebrae, sacrum, coccyx and pelvic bones; genitourinary and anorectal anomalies; and dysplasia of the lung, as seen on infantography and MR imaging.
Caudal regression syndrome : a case report
Energy Technology Data Exchange (ETDEWEB)
Lee, Eun Joo; Kim, Hi Hye; Kim, Hyung Sik; Park, So Young; Han, Hye Young; Lee, Kwang Hun [Chungang Gil Hospital, Incheon (Korea, Republic of)
1998-07-01
Caudal regression syndrome is a rare congenital anomaly which results from a developmental failure of the caudal mesoderm during the fetal period. We present a case of caudal regression syndrome comprising a spectrum of anomalies including sirenomelia; dysplasia of the lower lumbar vertebrae, sacrum, coccyx and pelvic bones; genitourinary and anorectal anomalies; and dysplasia of the lung, as seen on infantography and MR imaging.
Panel Smooth Transition Regression Models
DEFF Research Database (Denmark)
González, Andrés; Terasvirta, Timo; Dijk, Dick van
We introduce the panel smooth transition regression model. This new model is intended for characterizing heterogeneous panels, allowing the regression coefficients to vary both across individuals and over time. Specifically, heterogeneity is allowed for by assuming that these coefficients are bou...
Testing discontinuities in nonparametric regression
Dai, Wenlin
2017-01-19
In nonparametric regression, it is often necessary to detect whether there are jump discontinuities in the mean function. In this paper, we revisit the difference-based method in [13] (H.-G. Müller and U. Stadtmüller, Discontinuous versus smooth regression, Ann. Stat. 27 (1999), pp. 299–337. doi: 10.1214/aos/1018031100)…
Testing discontinuities in nonparametric regression
Dai, Wenlin; Zhou, Yuejin; Tong, Tiejun
2017-01-01
In nonparametric regression, it is often necessary to detect whether there are jump discontinuities in the mean function. In this paper, we revisit the difference-based method in [13] (H.-G. Müller and U. Stadtmüller, Discontinuous versus smooth regression, Ann. Stat. 27 (1999), pp. 299–337. doi: 10.1214/aos/1018031100)…
Logistic Regression: Concept and Application
Cokluk, Omay
2010-01-01
The main focus of logistic regression analysis is classification of individuals in different groups. The aim of the present study is to explain basic concepts and processes of binary logistic regression analysis intended to determine the combination of independent variables which best explain the membership in certain groups called dichotomous…
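The classification idea this entry describes can be sketched with a minimal logistic regression fit by gradient descent on the log-loss. This is an illustrative toy with invented data, not the study's procedure:

```python
import math

def fit_logistic(xs, ys, lr=0.5, steps=10000):
    """Gradient descent on the mean logistic log-loss, single predictor."""
    b0, b1 = 0.0, 0.0
    n = len(xs)
    for _ in range(steps):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))  # fitted probability
            g0 += (p - y) / n
            g1 += (p - y) * x / n
        b0 -= lr * g0
        b1 -= lr * g1
    return b0, b1

def predict_group(b0, b1, x):
    """Classify into group 1 when the fitted log-odds are positive."""
    return 1 if b0 + b1 * x > 0 else 0

xs = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
ys = [0, 0, 0, 1, 1, 1]           # dichotomous group membership
b0, b1 = fit_logistic(xs, ys)
print(predict_group(b0, b1, 0.5), predict_group(b0, b1, 4.5))
```

The fitted log-odds boundary lands near the midpoint of the two groups, so points on either side are assigned to the opposite groups.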
Fungible weights in logistic regression.
Jones, Jeff A; Waller, Niels G
2016-06-01
In this article we develop methods for assessing parameter sensitivity in logistic regression models. To set the stage for this work, we first review Waller's (2008) equations for computing fungible weights in linear regression. Next, we describe 2 methods for computing fungible weights in logistic regression. To demonstrate the utility of these methods, we compute fungible logistic regression weights using data from the Centers for Disease Control and Prevention's (2010) Youth Risk Behavior Surveillance Survey, and we illustrate how these alternate weights can be used to evaluate parameter sensitivity. To make our work accessible to the research community, we provide R code (R Core Team, 2015) that will generate both kinds of fungible logistic regression weights. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
Iturrino, G. J.; Pirmez, C.; Moore, J. C.; Reichow, M. K.; Dugan, B. E.; Sawyer, D. E.; Flemings, P. B.; Shipboard Scientific Party, I.
2005-12-01
IODP Expedition 308 drilled transects along the Brazos-Trinity IV and Ursa Basins in the western and eastern Gulf of Mexico, respectively, to examine how sedimentation, overpressure, fluid flow, and deformation are coupled in passive margin settings. A total of eight holes were logged using either logging-while-drilling (LWD) or wireline techniques to evaluate the controls on slope stability, understand the timing of sedimentation and slumping, establish the petrophysical properties of shallow sediments, and provide a better understanding of turbidite systems. Overall, the log responses vary for the different lithostratigraphic units and associated regional seismic reflectors. The data acquired also make bed-to-bed correlation between sites possible, which is valuable for the study of sandy turbidites and of regional deformation. The thick sedimentary successions drilled at these basins record the evolution of channel-levee systems composed of low-relief channels that were incapable of confining the turbidity currents, causing an overspill of sand and silt. In addition, mass transport deposits at shallow depths, and transitions between interbedded silt, sand, and mud units, are common features identified in many of the downhole logging data. In the Ursa Basin sediments, resistivity-at-the-bit images show significant deformation of the overlying hemipelagic drape and distal turbidites that were drilled in these areas. Numerous dipping beds throughout these intervals, with dips ranging from 5 to 55 degrees, confirm core observations. Steeply deformed beds, with dips as high as 65 degrees, and folded and faulted beds suggest downslope remobilization as mass-transport deposits. Resistivity images also show evidence of these mass-transport deposits where steep dips and folds suggest the presence of overturned beds within a series of cyclic intervals that we interpret as a succession of sand-silt-mud laminae. Preliminary structural analyses suggest that…
International Nuclear Information System (INIS)
West, Tristram O; Le Page, Yannick; Wolf, Julie; Thomson, Allison M; Huang, Maoyi
2014-01-01
Projections of land cover change generated from integrated assessment models (IAM) and other economic-based models can be applied for analyses of environmental impacts at sub-regional and landscape scales. For those IAM and economic models that project land cover change at the continental or regional scale, these projections must be downscaled and spatially distributed prior to use in climate or ecosystem models. Downscaling efforts to date have been conducted at the national extent with relatively high spatial resolution (30 m) and at the global extent with relatively coarse spatial resolution (0.5°). We revised existing methods to downscale global land cover change projections for the US to 0.05° resolution using MODIS land cover data as the initial proxy for land class distribution. Land cover change realizations generated here represent a reference scenario and two emissions mitigation pathways (MPs) generated by the global change assessment model (GCAM). Future gridded land cover realizations are constructed for each MODIS plant functional type (PFT) from 2005 to 2095, commensurate with the community land model PFT land classes, and archived for public use. The GCAM land cover realizations provide spatially explicit estimates of potential shifts in croplands, grasslands, shrublands, and forest lands. Downscaling of the MPs indicate a net replacement of grassland by cropland in the western US and by forest in the eastern US. An evaluation of the downscaling method indicates that it is able to reproduce recent changes in cropland and grassland distributions in respective areas in the US, suggesting it could provide relevant insights into the potential impacts of socio-economic and environmental drivers on future changes in land cover. (letters)
Kim, Soo Young; Choi, Yoon Young; An, Ji Yeong; Shin, Hyun Beak; Jo, Ara; Choi, Hyeji; Seo, Sang Hyuk; Bang, Hui-Jae; Cheong, Jae-Ho; Hyung, Woo Jin; Noh, Sung Hoon
2015-08-15
We previously reported that the prognosis of microsatellite instability high (MSI-H) gastric cancer is similar to that of MSI-low/microsatellite stable (MSI-L/MSS) gastric cancer. The reason for this seemed to be related to the effects of chemotherapy. To verify this hypothesis, we expanded the study population and reanalyzed the prognosis of MSI-H gastric cancer. Data from 1,276 patients with Stage II and III gastric cancer who underwent gastrectomy with curative intent between January 2005 and June 2010 were reviewed. The prognosis of MSI-H tumors in comparison with MSI-L/MSS tumors was analyzed, according to the administration of chemotherapy and other clinicopathologic features. A total of 361 (28.3%) patients did not receive chemotherapy (MSI-H = 47 and MSI-L/MSS = 314), whereas 915 (71.7%) patients did receive chemotherapy (MSI-H = 58 and MSI-L/MSS = 857). The hazard ratio of MSI-H versus MSI-L/MSS was 0.49 (95% confidence interval: 0.26-0.94, p = 0.031) when chemotherapy was not received and 1.16 (95% confidence interval: 0.78-1.71, p = 0.466) when chemotherapy was received. In subgroup analyses, the prognosis of MSI-H was better in Stage III, women, with lymph node metastasis, and undifferentiated histology subgroups when chemotherapy was not received. However, in patients treated with chemotherapy, prognosis was worse for MSI-H tumors in Stage III, undifferentiated histology, and diffuse type subgroups of gastric cancer. In conclusion, MSI-H tumors were associated with a good prognosis in Stage II and III gastric cancer when patients were treated by surgery alone, and the benefits of MSI-H status were attenuated by chemotherapy. © 2015 UICC.
International Nuclear Information System (INIS)
Leng Ling; Zhang Tianyi; Kleinman, Lawrence; Zhu Wei
2007-01-01
Regression analysis, especially the ordinary least squares method which assumes that errors are confined to the dependent variable, has seen a fair share of its applications in aerosol science. The ordinary least squares approach, however, could be problematic due to the fact that atmospheric data often does not lend itself to calling one variable independent and the other dependent. Errors often exist for both measurements. In this work, we examine two regression approaches available to accommodate this situation. They are orthogonal regression and geometric mean regression. Comparisons are made theoretically as well as numerically through an aerosol study examining whether the ratio of organic aerosol to CO would change with age
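The alternatives this entry compares for errors-in-both-variables data have closed-form slopes: geometric mean regression takes the geometric mean of the two OLS slopes, and orthogonal regression minimizes perpendicular distances. A small sketch comparing them on illustrative data (not from the aerosol study):

```python
import math

def slopes(xs, ys):
    """Slopes from four line-fitting conventions for data with scatter
    in both variables (standard textbook formulas)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b_yx = sxy / sxx                      # OLS, y regressed on x
    b_xy_inv = syy / sxy                  # inverse of OLS, x regressed on y
    sign = 1.0 if sxy >= 0 else -1.0
    b_gmr = sign * math.sqrt(syy / sxx)   # geometric mean regression
    b_orth = (syy - sxx + math.sqrt((syy - sxx) ** 2 + 4 * sxy ** 2)) / (2 * sxy)  # orthogonal
    return b_yx, b_xy_inv, b_gmr, b_orth

xs = [1, 2, 3, 4, 5]
ys = [1.2, 1.9, 3.2, 3.8, 5.1]
b_yx, b_xy_inv, b_gmr, b_orth = slopes(xs, ys)
# Both symmetric slopes lie between the two OLS-based slopes:
print(b_yx <= b_gmr <= b_xy_inv and b_yx <= b_orth <= b_xy_inv)  # -> True
```

OLS of y on x systematically flattens the slope when x carries measurement error, which is why the symmetric estimators matter for questions like the organic aerosol/CO ratio.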
Predicting Word Reading Ability: A Quantile Regression Study
McIlraith, Autumn L.
2018-01-01
Predictors of early word reading are well established. However, it is unclear if these predictors hold for readers across a range of word reading abilities. This study used quantile regression to investigate predictive relationships at different points in the distribution of word reading. Quantile regression analyses used preschool and…
Tools to support interpreting multiple regression in the face of multicollinearity.
Kraha, Amanda; Turner, Heather; Nimon, Kim; Zientek, Linda Reichwein; Henson, Robin K
2012-01-01
While multicollinearity may increase the difficulty of interpreting multiple regression (MR) results, it should not cause undue problems for the knowledgeable researcher. In the current paper, we argue that rather than using one technique to investigate regression results, researchers should consider multiple indices to understand the contributions that predictors make not only to a regression model, but to each other as well. Some of the techniques to interpret MR effects include, but are not limited to, correlation coefficients, beta weights, structure coefficients, all possible subsets regression, commonality coefficients, dominance weights, and relative importance weights. This article will review a set of techniques to interpret MR effects, identify the elements of the data on which the methods focus, and identify statistical software to support such analyses.
Advanced statistics: linear regression, part II: multiple linear regression.
Marill, Keith A
2004-01-01
The applications of simple linear regression in medical research are limited, because in most situations, there are multiple relevant predictor variables. Univariate statistical techniques such as simple linear regression use a single predictor variable, and they often may be mathematically correct but clinically misleading. Multiple linear regression is a mathematical technique used to model the relationship between multiple independent predictor variables and a single dependent outcome variable. It is used in medical research to model observational data, as well as in diagnostic and therapeutic studies in which the outcome is dependent on more than one factor. Although the technique generally is limited to data that can be expressed with a linear function, it benefits from a well-developed mathematical framework that yields unique solutions and exact confidence intervals for regression coefficients. Building on Part I of this series, this article acquaints the reader with some of the important concepts in multiple regression analysis. These include multicollinearity, interaction effects, and an expansion of the discussion of inference testing, leverage, and variable transformations to multivariate models. Examples from the first article in this series are expanded on using a primarily graphic, rather than mathematical, approach. The importance of the relationships among the predictor variables and the dependence of the multivariate model coefficients on the choice of these variables are stressed. Finally, concepts in regression model building are discussed.
Logic regression and its extensions.
Schwender, Holger; Ruczinski, Ingo
2010-01-01
Logic regression is an adaptive classification and regression procedure, initially developed to reveal interacting single nucleotide polymorphisms (SNPs) in genetic association studies. In general, this approach can be used in any setting with binary predictors, when the interaction of these covariates is of primary interest. Logic regression searches for Boolean (logic) combinations of binary variables that best explain the variability in the outcome variable, and thus, reveals variables and interactions that are associated with the response and/or have predictive capabilities. The logic expressions are embedded in a generalized linear regression framework, and thus, logic regression can handle a variety of outcome types, such as binary responses in case-control studies, numeric responses, and time-to-event data. In this chapter, we provide an introduction to the logic regression methodology, list some applications in public health and medicine, and summarize some of the direct extensions and modifications of logic regression that have been proposed in the literature. Copyright © 2010 Elsevier Inc. All rights reserved.
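The Boolean search that logic regression performs can be illustrated in miniature. The real method embeds logic trees in a generalized linear model and searches by simulated annealing; this exhaustive two-variable sketch with invented data is only an illustration of the idea:

```python
from itertools import combinations, product

def best_logic_rule(X, y):
    """Exhaustively search two-variable AND/OR rules over binary
    predictors for the one with fewest misclassifications of y."""
    n_vars = len(X[0])
    best = None
    for i, j in combinations(range(n_vars), 2):
        for op_name, op in (("AND", lambda a, b: a & b),
                            ("OR", lambda a, b: a | b)):
            preds = [op(row[i], row[j]) for row in X]
            misclassified = sum(p != t for p, t in zip(preds, y))
            if best is None or misclassified < best[0]:
                best = (misclassified, f"x{i} {op_name} x{j}")
    return best

# Outcome is x0 AND x2 with no noise, so the search recovers it exactly.
X = [list(bits) for bits in product([0, 1], repeat=3)]
y = [row[0] & row[2] for row in X]
print(best_logic_rule(X, y))  # -> (0, 'x0 AND x2')
```

With p predictors the space of logic trees explodes combinatorially, which is why the published method uses stochastic search rather than enumeration.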
Energy Technology Data Exchange (ETDEWEB)
Smith, Maurice
2011-06-15
The oil industry is starting to apply new technologies such as horizontal drilling and multistage fracking to light oil production. Most producers are copying what is done by their counterparts. Experts say that another approach should be taken, because one can get quicker results from a technical analysis using an analytical model than by drilling a lot of wells. In general, producers are also eager to put too many fracs into the ground to inflate initial production rates, but this does not increase the cumulative recovery, so they are spending more money to end up with the same result. The oil industry still has work to do to find a way to optimize production, reservoir management, and costs.
International Nuclear Information System (INIS)
Pillon, J.; Roptin, D.; Cailler, M.
1976-01-01
Spurious effects of a four grid retarding field analyzer were studied for low energy secondary electron measurements. Their behavior was investigated and two peaks in the energy spectrum were interpreted as resulting from tertiary electrons from the grids. It was shown that the true secondary electron peak has to be separated from these spurious peaks. The spectrum and the yields sigma and eta obtained for a Cu(111) crystal after a surface cleanness control by Auger spectroscopy are given
International Nuclear Information System (INIS)
Watanabe, Norio
2004-01-01
The United States Nuclear Regulatory Commission (U.S. NRC) has been carrying out the Accident Sequence Precursor (ASP) Program to identify and categorize precursors to potential severe core damage accident sequences using the probabilistic safety assessment (PSA) technique. The ASP Program has identified many risk-significant events as precursors that occurred at U.S. nuclear power plants. Although the results from the ASP Program include valuable information that could be useful for obtaining and characterizing risk-significant insights and for monitoring risk trends in the nuclear power industry, there have been only a few attempts to determine and develop the trends using the ASP results. The present study examines and discusses quantitative risk trends at the industry level, using two indicators, namely the occurrence frequency of precursors and the annual core damage probability, derived from the results of the ASP analysis. It is shown that the core damage risk at U.S. nuclear power plants has been lowered and the likelihood of risk-significant events has been decreasing remarkably. The present study also demonstrates that the two risk indicators used here can provide quantitative information useful for examining and monitoring risk trends and risk characteristics in the nuclear power industry. (author)
Sando, Steven K.; McCarthy, Peter M.; Dutton, DeAnn M.
2016-04-05
Chapter C of this Scientific Investigations Report documents results from a study by the U.S. Geological Survey, in cooperation with the Montana Department of Transportation and the Montana Department of Natural Resources, to provide an update of statewide peak-flow frequency analyses and results for Montana. The purpose of this report chapter is to present peak-flow frequency analyses and results for 725 streamflow-gaging stations in or near Montana based on data through water year 2011. The 725 streamflow-gaging stations included in this study represent nearly all streamflow-gaging stations in Montana (plus some from adjacent states or Canadian Provinces) that have at least 10 years of peak-flow records through water year 2011. For 29 of the 725 streamflow-gaging stations, peak-flow frequency analyses and results are reported for both unregulated and regulated conditions. Thus, peak-flow frequency analyses and results are reported for a total of 754 analyses. Estimates of peak-flow magnitudes for 66.7-, 50-, 42.9-, 20-, 10-, 4-, 2-, 1-, 0.5-, and 0.2-percent annual exceedance probabilities are reported. These annual exceedance probabilities correspond to 1.5-, 2-, 2.33-, 5-, 10-, 25-, 50-, 100-, 200-, and 500-year recurrence intervals.
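The annual exceedance probabilities (AEPs) and recurrence intervals listed in this entry are related by T = 1/p; a quick consistency check of the stated pairings:

```python
def recurrence_interval(aep_percent):
    """Recurrence interval in years for an annual exceedance probability
    given in percent: T = 1 / p = 100 / p_percent."""
    return 100.0 / aep_percent

pairs = [(66.7, 1.5), (50, 2), (42.9, 2.33), (20, 5), (10, 10),
         (4, 25), (2, 50), (1, 100), (0.5, 200), (0.2, 500)]
for aep, t in pairs:
    assert abs(recurrence_interval(aep) - t) / t < 0.01  # agree to within 1%
print("all AEP/recurrence-interval pairs are consistent")
```

The rounded AEPs (66.7% and 42.9%) are simply 100/1.5 and 100/2.33 to three significant figures.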
Zosseder, K.; Post, J.; Steinmetz, T.; Wegscheider, S.; Strunz, G.
2009-04-01
Indonesia is located at one of the most active geological subduction zones in the world. Following the most recent seaquakes and their subsequent tsunamis in December 2004 and July 2006, tsunamis are likely to occur in the near future as well, owing to increased tectonic tension leading to abrupt vertical seafloor alterations after a century of relative tectonic silence. To face this devastating threat, tsunami hazard maps are very important as a basis for evacuation planning and mitigation strategies. Tsunami hazard assessment is mostly based on numerical modelling, because the model results normally offer the most precise database for a hazard analysis, as they include spatially distributed data and their influence on the hydraulic dynamics. Generally, a model result gives a probability for the intensity distribution of a tsunami at the coast (or run-up) and the spatial distribution of the maximum inundation area, depending on the location and magnitude of the tsunami source used. The boundary condition of the source used for the model is usually chosen by a worst-case approach: the location and magnitude that are likely to occur and are assumed to generate the worst impact are used to predict the impact at a specific area. But for a tsunami hazard assessment covering a large coastal area, as demanded in the GITEWS (German Indonesian Tsunami Early Warning System) project, in which the present work is embedded, this approach is not practicable, because many tsunami sources can cause an impact at the coast and must all be considered. Thus a multi-scenario tsunami model approach is developed to provide a reliable hazard assessment covering large areas. For the Indonesian Early Warning System, many tsunami scenarios were modelled by the Alfred Wegener Institute (AWI) for different probable tsunami sources and magnitudes along the Sunda Trench. Every modelled scenario delivers the spatial distribution of…
Directory of Open Access Journals (Sweden)
Erica H Craig
Full Text Available Understanding the genetics of a population is a critical component of developing conservation strategies. We used archived tissue samples from golden eagles (Aquila chrysaetos canadensis) in three geographic regions of western North America to conduct a preliminary study of the genetics of the North American subspecies, and to provide data for United States Fish and Wildlife Service (USFWS) decision-making for golden eagle management. We used a combination of mitochondrial DNA (mtDNA) D-loop sequences and 16 nuclear DNA (nDNA) microsatellite loci to investigate the extent of gene flow among our sampling areas in Idaho, California and Alaska and to determine if we could distinguish birds from the different geographic regions based on their genetic profiles. Our results indicate high genetic diversity, low genetic structure and high connectivity. Nuclear DNA Fst values between Idaho and California were low but significantly different from zero (0.026). Bayesian clustering methods indicated a single population, and we were unable to distinguish summer breeding residents from different regions. Results of the mtDNA AMOVA showed that most of the haplotype variation (97%) was within the geographic populations while 3% variation was partitioned among them. One haplotype was common to all three areas. One region-specific haplotype was detected in California and one in Idaho, but additional sampling is required to determine if these haplotypes are unique to those geographic areas or a sampling artifact. We discuss potential sources of the high gene flow for this species including natal and breeding dispersal, floaters, and changes in migratory behavior as a result of environmental factors such as climate change and habitat alteration. Our preliminary findings can help inform the USFWS in development of golden eagle management strategies and provide a basis for additional research into the complex dynamics of the North American subspecies.
Craig, Erica H; Adams, Jennifer R; Waits, Lisette P; Fuller, Mark R; Whittington, Diana M
2016-01-01
Understanding the genetics of a population is a critical component of developing conservation strategies. We used archived tissue samples from golden eagles (Aquila chrysaetos canadensis) in three geographic regions of western North America to conduct a preliminary study of the genetics of the North American subspecies, and to provide data for United States Fish and Wildlife Service (USFWS) decision-making for golden eagle management. We used a combination of mitochondrial DNA (mtDNA) D-loop sequences and 16 nuclear DNA (nDNA) microsatellite loci to investigate the extent of gene flow among our sampling areas in Idaho, California and Alaska and to determine if we could distinguish birds from the different geographic regions based on their genetic profiles. Our results indicate high genetic diversity, low genetic structure and high connectivity. Nuclear DNA Fst values between Idaho and California were low but significantly different from zero (0.026). Bayesian clustering methods indicated a single population, and we were unable to distinguish summer breeding residents from different regions. Results of the mtDNA AMOVA showed that most of the haplotype variation (97%) was within the geographic populations while 3% variation was partitioned among them. One haplotype was common to all three areas. One region-specific haplotype was detected in California and one in Idaho, but additional sampling is required to determine if these haplotypes are unique to those geographic areas or a sampling artifact. We discuss potential sources of the high gene flow for this species including natal and breeding dispersal, floaters, and changes in migratory behavior as a result of environmental factors such as climate change and habitat alteration. Our preliminary findings can help inform the USFWS in development of golden eagle management strategies and provide a basis for additional research into the complex dynamics of the North American subspecies.
Arnold, Steven M.; Arya, Vinod K.; Melis, Matthew E.
1990-01-01
High residual stresses within intermetallic and metal matrix composite systems can develop upon cooling from the processing temperature to room temperature due to the coefficient of thermal expansion (CTE) mismatch between the fiber and matrix. As a result, within certain composite systems, radial, circumferential, and/or longitudinal cracks have been observed to form at the fiber-matrix interface. The compliant layer concept (insertion of a compensating interface material between the fiber and matrix) was proposed to reduce or eliminate the residual stress buildup during cooling and thus minimize cracking. The viability of the proposed compliant layer concept is investigated both elastically and elastoplastically. A detailed parametric study was conducted using a unit cell model consisting of three concentric cylinders to determine the required character (i.e., thickness and material properties) of the compliant layer as well as its applicability. The unknown compliant layer mechanical properties were expressed as ratios of the corresponding temperature dependent Ti-24Al-11Nb (a/o) matrix properties. The fiber properties taken were those corresponding to SCS-6 (SiC). Results indicate that the compliant layer can be used to reduce, if not eliminate, radial and circumferential residual stresses within the fiber and matrix and therefore also reduce or eliminate the radial cracking. However, with this decrease in in-plane stresses, one obtains an increase in longitudinal stress, thus potentially initiating longitudinal cracking. Guidelines are given for the selection of a specific compliant material, given a perfectly bonded system.
International Nuclear Information System (INIS)
Chu, Tsong-Lun; Musicki, Z.; Kohut, P.
1995-01-01
During 1989, the Nuclear Regulatory Commission (NRC) initiated an extensive program to carefully examine the potential risks during low power and shutdown operations. Two plants, Surry (pressurized water reactor) and Grand Gulf (boiling water reactor), were selected as the plants to be studied by Brookhaven National Laboratory (BNL) and Sandia National Laboratories (SNL). The objectives of the program are to assess the risks of severe accidents initiated during plant operational states (POSs) other than full power operation and to compare the estimated core damage frequencies (CDFs), important accident sequences and other qualitative and quantitative results with those accidents initiated during full power operation as assessed in NUREG-1150. The scope of the program includes that of a Level 3 PRA for internal events and a Level 1 PRA for seismically induced and internal fire and flood induced core damage sequences. This paper summarizes the results and highlights of the internal fire and flood analysis documented in Volumes 3 and 4 of NUREG/CR-6144 performed for the Surry plant during mid-loop operation
Quantile Regression With Measurement Error
Wei, Ying; Carroll, Raymond J.
2009-01-01
...The finite sample performance of the proposed method is investigated in a simulation study, and compared to the standard regression calibration approach. Finally, we apply our methodology to part of the National Collaborative Perinatal Project growth data, a...
From Rasch scores to regression
DEFF Research Database (Denmark)
Christensen, Karl Bang
2006-01-01
Rasch models provide a framework for measurement and modelling latent variables. Having measured a latent variable in a population a comparison of groups will often be of interest. For this purpose the use of observed raw scores will often be inadequate because these lack interval scale properties... This paper compares two approaches to group comparison: linear regression models using estimated person locations as outcome variables and latent regression models based on the distribution of the score...
Testing Heteroscedasticity in Robust Regression
Czech Academy of Sciences Publication Activity Database
Kalina, Jan
2011-01-01
Roč. 1, č. 4 (2011), s. 25-28 ISSN 2045-3345 Grant - others:GA ČR(CZ) GA402/09/0557 Institutional research plan: CEZ:AV0Z10300504 Keywords: robust regression * heteroscedasticity * regression quantiles * diagnostics Subject RIV: BB - Applied Statistics, Operational Research http://www.researchjournals.co.uk/documents/Vol4/06%20Kalina.pdf
Regression methods for medical research
Tai, Bee Choo
2013-01-01
Regression Methods for Medical Research provides medical researchers with the skills they need to critically read and interpret research using more advanced statistical methods. The statistical requirements of interpreting and publishing in medical journals, together with rapid changes in science and technology, increasingly demand an understanding of more complex and sophisticated analytic procedures. The text explains the application of statistical models to a wide variety of practical medical investigative studies and clinical trials. Regression methods are used to appropriately answer the
Forecasting with Dynamic Regression Models
Pankratz, Alan
2012-01-01
One of the most widely used tools in statistical forecasting, the single equation regression model, is examined here. A companion to the author's earlier work, Forecasting with Univariate Box-Jenkins Models: Concepts and Cases, the present text pulls together recent time series ideas and gives special attention to possible intertemporal patterns, distributed lag responses of output to input series, and the autocorrelation patterns of regression disturbances. It also includes six case studies.
Ji, Baochao; Xu, Enjie; Cao, Li; Yang, Desheng; Xu, Boyong; Guo, Wentao; Aili, Rehei
2015-02-01
To analyze the results of pathogenic bacteria culture in chronic periprosthetic joint infection after total knee arthroplasty (TKA) and total hip arthroplasty (THA). The medical data of 23 patients with chronic periprosthetic joint infection after TKA or THA from September 2010 to March 2014 were reviewed. Fifteen cases of TKA and 8 cases of THA were included in this study. There were 12 male and 11 female patients with a mean age of 62 years (range, 32 to 79 years); 9 patients had a sinus tract. All patients discontinued antibiotic therapy for a minimum of 2 weeks before arthrocentesis; pathogenic bacteria culture and antimicrobial susceptibility testing were performed on synovial fluid taken preoperatively and intraoperatively during revision. Routine pathogenic bacteria culture and pathological biopsy were also performed on tissues taken intraoperatively during revision. For culture-negative specimens, the incubation period was prolonged by 2 weeks. The overall culture-positive rate of all 23 patients at 1 week before revision was 30.4% (7/23), and rose to 39.1% (9/23) when incubation of the culture-negative samples was prolonged by 2 weeks. The overall culture-positive rate at 1 week intraoperatively at revision was 60.9% (14/23), and rose to 82.6% (19/23) with prolonged incubation. The preoperative culture results of 7 cases (30.4%) conformed to the intraoperative results. The culture-positive rate of pathogenic bacteria culture can be increased evidently by discontinuing antimicrobial therapy for a minimum of 2 weeks prior to definitive diagnosis.
Discriminative Elastic-Net Regularized Linear Regression.
Zhang, Zheng; Lai, Zhihui; Xu, Yong; Shao, Ling; Wu, Jian; Xie, Guo-Sen
2017-03-01
In this paper, we aim at learning compact and discriminative linear regression models. Linear regression has been widely used in different problems. However, most of the existing linear regression methods exploit the conventional zero-one matrix as the regression targets, which greatly narrows the flexibility of the regression model. Another major limitation of these methods is that the learned projection matrix fails to precisely project the image features to the target space due to their weak discriminative capability. To this end, we present an elastic-net regularized linear regression (ENLR) framework, and develop two robust linear regression models which possess the following special characteristics. First, our methods exploit two particular strategies to enlarge the margins of different classes by relaxing the strict binary targets into a more feasible variable matrix. Second, a robust elastic-net regularization of singular values is introduced to enhance the compactness and effectiveness of the learned projection matrix. Third, the resulting optimization problem of ENLR has a closed-form solution in each iteration, which can be solved efficiently. Finally, rather than directly exploiting the projection matrix for recognition, our methods employ the transformed features as the new discriminative representations for the final image classification. Compared with the traditional linear regression model and some of its variants, our method is much more accurate in image classification. Extensive experiments conducted on publicly available data sets demonstrate that the proposed framework can outperform the state-of-the-art methods. The MATLAB code for our methods is available at http://www.yongxu.org/lunwen.html.
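The elastic-net penalty at the core of such models can be sketched with plain coordinate descent. This is a minimal, generic elastic-net solver, not the authors' ENLR method; the data and hyperparameters are illustrative:

```python
def soft_threshold(rho, lam):
    # Soft-thresholding operator implementing the L1 part of the penalty
    if rho > lam:
        return rho - lam
    if rho < -lam:
        return rho + lam
    return 0.0

def elastic_net(X, y, lam=0.1, alpha=0.5, iters=200):
    """Coordinate descent for (1/2n)||y - Xb||^2 + lam*(alpha*||b||_1 + (1-alpha)/2*||b||^2)."""
    n, p = len(X), len(X[0])
    b = [0.0] * p
    for _ in range(iters):
        for j in range(p):
            # Correlation of feature j with the partial residual (prediction excluding j)
            rho = sum(X[i][j] * (y[i] - sum(X[i][k] * b[k] for k in range(p) if k != j))
                      for i in range(n)) / n
            z = sum(X[i][j] ** 2 for i in range(n)) / n
            b[j] = soft_threshold(rho, lam * alpha) / (z + lam * (1 - alpha))
    return b

# Toy data: y depends on the first feature only; the second is noise
X = [[1.0, 0.1], [2.0, -0.2], [3.0, 0.05], [4.0, -0.1], [5.0, 0.15]]
y = [2.0, 4.1, 5.9, 8.2, 9.9]
b = elastic_net(X, y)
print(b)  # the L1 term drives the noise coefficient to exactly zero
```

The L2 part of the penalty shrinks the informative coefficient slightly below its least-squares value, while the L1 part zeroes out the uninformative one.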
Energy Technology Data Exchange (ETDEWEB)
Winkler, C.; Dornfeld, S.; Friedrich, S.; Baumann, M. [Technische Univ. Dresden (Germany). Klinik und Poliklinik fuer Strahlentherapie und Radioonkologie; Schwarz, R. [Universitaetskrankenhaus Hamburg-Eppendorf (Germany). Abt. fuer Strahlentherapie
1998-12-01
Aim: Retrospective assessment of the efficacy of radiotherapy for meningeomas with high risk for local recurrence. Patients and methods: Records of 67 patients with meningeomas treated from 1974 to 1995 at 2 centres were analyzed. Follow-up time ranged from 0.8 to 213 months (median: 61 months). Radiation therapy was given either after local failure or after biopsy or subtotal resection. The ratio between malignant (n=20) and benign (n=47) meningeoma was 1:2.4. Median age of the patients was 55 years (7 to 77 years). Radiation treatment was given at 1.5 to 2 Gy per fraction to 36 to 79.5 Gy. Survival rates were calculated by the Kaplan-Meier method. Statistical comparisons were performed with the log-rank test and the Cox proportional hazards model. The Bonferroni method was used to correct for multiple comparisons. Results: Five- and 10-year disease-free survival rates were 82%{+-}5% (standard error) and 70%{+-}9%. Local control rates at 5 and 10 years were 78%{+-}5% and 68%{+-}9%. In uni- and multivariate analysis, histology, sex, total dose and centre showed no significant influence on the results. Patient age was significant for local control (univariate p=0.02; multivariate p=0.03) and disease-free survival (univariate/multivariate p=0.04). The postoperative tumor burden had a significant influence on disease-free survival (multivariate p=0.04). After Bonferroni correction no significant influence was observed. We did not observe late side effects, especially brain necrosis. Conclusions: Despite the negative selection of our patients we observed high survival and local control rates after radiation therapy. This underscores the role of radiation therapy in the treatment of meningeomas with high risk of local failure. (orig.)
Lin, Yingzhi; Deng, Xiangzheng; Li, Xing; Ma, Enjun
2014-12-01
Spatially explicit simulation of land use change is the basis for estimating the effects of land use and cover change on energy fluxes, ecology and the environment. At the pixel level, logistic regression is one of the most common approaches used in spatially explicit land use allocation models to determine the relationship between land use and its causal factors in driving land use change, and thereby to evaluate land use suitability. However, these models have a drawback in that they do not determine/allocate land use based on the direct relationship between land use change and its driving factors. Consequently, a multinomial logistic regression method was introduced to address this flaw, and thereby, judge the suitability of a type of land use in any given pixel in a case study area of the Jiangxi Province, China. A comparison of the two regression methods indicated that the proportion of correctly allocated pixels using multinomial logistic regression was 92.98%, which was 8.47% higher than that obtained using logistic regression. Paired t-test results also showed that pixels were more clearly distinguished by multinomial logistic regression than by logistic regression. In conclusion, multinomial logistic regression is a more efficient and accurate method for the spatial allocation of land use changes. The application of this method in future land use change studies may improve the accuracy of predicting the effects of land use and cover change on energy fluxes, ecology, and environment.
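The multinomial (softmax) logistic model described above can be sketched with batch gradient descent on synthetic "pixel" data. This is a generic illustration, not the study's model; the driving factors, class centers, and learning settings are invented:

```python
import math, random

def softmax(z):
    m = max(z)
    exps = [math.exp(v - m) for v in z]
    s = sum(exps)
    return [e / s for e in exps]

def train_multinomial(X, labels, n_classes, lr=0.5, epochs=500):
    """Batch gradient descent for multinomial (softmax) logistic regression."""
    p = len(X[0])
    # One weight vector (plus bias) per class
    W = [[0.0] * (p + 1) for _ in range(n_classes)]
    for _ in range(epochs):
        grads = [[0.0] * (p + 1) for _ in range(n_classes)]
        for x, y in zip(X, labels):
            xb = x + [1.0]                      # append bias term
            probs = softmax([sum(w_i * v for w_i, v in zip(W[c], xb))
                             for c in range(n_classes)])
            for c in range(n_classes):
                err = probs[c] - (1.0 if c == y else 0.0)
                for j in range(p + 1):
                    grads[c][j] += err * xb[j]
        for c in range(n_classes):
            for j in range(p + 1):
                W[c][j] -= lr * grads[c][j] / len(X)
    return W

def predict(W, x):
    xb = x + [1.0]
    scores = [sum(w_i * v for w_i, v in zip(w, xb)) for w in W]
    return scores.index(max(scores))

# Toy "pixels": two driving factors (e.g. slope, distance to road), three land-use classes
random.seed(0)
X, labels = [], []
centers = [(0.0, 0.0), (3.0, 0.0), (1.5, 3.0)]
for cls, (cx, cy) in enumerate(centers):
    for _ in range(30):
        X.append([cx + random.gauss(0, 0.5), cy + random.gauss(0, 0.5)])
        labels.append(cls)

W = train_multinomial(X, labels, n_classes=3)
acc = sum(predict(W, x) == y for x, y in zip(X, labels)) / len(X)
print(round(acc, 2))
```

Unlike fitting separate binary logistic models, the softmax formulation assigns each pixel a probability over all classes at once, which is the property the abstract credits for the improved allocation accuracy.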
Vaeth, Michael; Skovlund, Eva
2004-06-15
For a given regression problem it is possible to identify a suitably defined equivalent two-sample problem such that the power or sample size obtained for the two-sample problem also applies to the regression problem. For a standard linear regression model the equivalent two-sample problem is easily identified, but for generalized linear models and for Cox regression models the situation is more complicated. An approximately equivalent two-sample problem may, however, also be identified here. In particular, we show that for logistic regression and Cox regression models the equivalent two-sample problem is obtained by selecting two equally sized samples for which the parameters differ by a value equal to the slope times twice the standard deviation of the independent variable and further requiring that the overall expected number of events is unchanged. In a simulation study we examine the validity of this approach to power calculations in logistic regression and Cox regression models. Several different covariate distributions are considered for selected values of the overall response probability and a range of alternatives. For the Cox regression model we consider both constant and non-constant hazard rates. The results show that in general the approach is remarkably accurate even in relatively small samples. Some discrepancies are, however, found in small samples with few events and a highly skewed covariate distribution. Comparison with results based on alternative methods for logistic regression models with a single continuous covariate indicates that the proposed method is at least as good as its competitors. The method is easy to implement and therefore provides a simple way to extend the range of problems that can be covered by the usual formulas for power and sample size determination. Copyright 2004 John Wiley & Sons, Ltd.
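The rule above for linear regression, group means differing by the slope times twice the SD of the covariate, gives a quick sample-size recipe. A rough illustration using the standard two-sample normal-approximation formula; all numeric values are invented:

```python
import math
from statistics import NormalDist

def two_sample_n(delta, sigma, alpha=0.05, power=0.8):
    """Per-group sample size for a two-sample z-test detecting mean difference delta."""
    z = NormalDist()
    za = z.inv_cdf(1 - alpha / 2)
    zb = z.inv_cdf(power)
    return 2 * (sigma * (za + zb) / delta) ** 2

# Equivalent two-sample problem for a linear regression slope (per the rule above):
# group means differ by slope * 2 * SD(x)
slope, sd_x, sigma = 0.3, 1.5, 2.0      # illustrative values
delta = slope * 2 * sd_x                # = 0.9
n_per_group = two_sample_n(delta, sigma)
print(math.ceil(n_per_group))           # 78
```

For logistic and Cox models the same idea applies on the linear-predictor scale, with the additional constraint on the expected number of events noted in the abstract.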
Regression Models and Fuzzy Logic Prediction of TBM Penetration Rate
Directory of Open Access Journals (Sweden)
Minh, Vu Trieu; Katushin, Dmitri; Antonov, Maksim; Veinthal, Renno
2017-03-01
This paper presents statistical analyses of rock engineering properties and the measured penetration rate of tunnel boring machine (TBM) based on the data of an actual project. The aim of this study is to analyze the influence of rock engineering properties including uniaxial compressive strength (UCS), Brazilian tensile strength (BTS), rock brittleness index (BI), the distance between planes of weakness (DPW), and the alpha angle (Alpha) between the tunnel axis and the planes of weakness on the TBM rate of penetration (ROP). Four (4) statistical regression models (two linear and two nonlinear) are built to predict the ROP of TBM. Finally a fuzzy logic model is developed as an alternative method and compared to the four statistical regression models. Results show that the fuzzy logic model provides better estimations and can be applied to predict the TBM performance. The R-squared value (R2) of the fuzzy logic model scores the highest value of 0.714 over the second runner-up of 0.667 from the multiple variables nonlinear regression model.
Internal flooding analyses results of Slovak NPPs
International Nuclear Information System (INIS)
Sopira, Vladimir
2000-01-01
The assessment of the flood risk was the objective of the internal flooding analysis for NPPs Bohunice V1, V2 and Mochovce. All important flooding sources were identified. The rooms containing safety-important components were analyzed from the point of view of: integrity of flood boundaries; capability for drainage; flood signalisation; flood localization and liquidation; vulnerability of safety system components. The redundancies of safety systems are located mostly separately, and no flood can endanger more than a single train. It can be concluded that NPPs with WWER-440 are very safe against the flooding initiating event
Producing The New Regressive Left
DEFF Research Database (Denmark)
Crone, Christine
members, this thesis investigates a growing political trend and ideological discourse in the Arab world that I have called The New Regressive Left. On the premise that a media outlet can function as a forum for ideology production, the thesis argues that an analysis of this material can help to trace...... the contexture of The New Regressive Left. If the first part of the thesis lays out the theoretical approach and draws the contextual framework, through an exploration of the surrounding Arab media- and ideoscapes, the second part is an analytical investigation of the discourse that permeates the programmes aired...... becomes clear from the analytical chapters is the emergence of the new cross-ideological alliance of The New Regressive Left. This emerging coalition between Shia Muslims, religious minorities, parts of the Arab Left, secular cultural producers, and the remnants of the political, strategic resistance
Heeg, G. P.; Jansonius, N. M.
Purpose To investigate whether frequency-doubling perimetry (FDT) and nerve fibre analyser (GDx) test results are able to predict glaucomatous visual field loss in glaucoma suspect patients. Methods A large cohort of glaucoma suspect patients (patients with ocular hypertension or a positive family
1977-04-01
extreme is the viewpoint that such analyses are modern types of witchcraft, or numerology, practiced by a priestly caste. Results and conclusions... numerology. Although such a simple buying strategy may be adequate for products which are used or consumed at the time of purchase or soon thereafter
International Nuclear Information System (INIS)
Dworak, B.; Gajek, Sz.
1980-01-01
The results of examinations of sintered iron and blast-furnace slag obtained by energy-dispersive and wavelength-dispersive X-ray fluorescence analyses are compared. They show that the two methods are comparable for elements such as Ca and Fe, whereas for Mn (in sinter) wavelength-dispersive X-ray fluorescence analysis is less precise. (author)
A Matlab program for stepwise regression
Directory of Open Access Journals (Sweden)
Yanhong Qi
2016-03-01
The stepwise linear regression is a multi-variable regression method for identifying statistically significant variables in the linear regression equation. In the present study, we present a Matlab program for stepwise regression.
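The referenced program is in Matlab; as a rough cross-check of the idea, here is a simplified greedy forward-selection sketch in Python that repeatedly adds the predictor most correlated with the current residual (a stagewise variant with an ad hoc stopping threshold, not the full refitting stepwise procedure with F-tests):

```python
def simple_fit(x, y):
    """Least-squares slope and intercept of y on a single predictor x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    slope = sxy / sxx
    return slope, my - slope * mx

def corr(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

def forward_select(X_cols, y, threshold=0.3):
    """Greedily add the predictor most correlated with the current residual."""
    residual = list(y)
    selected = []
    remaining = list(range(len(X_cols)))
    while remaining:
        j_best = max(remaining, key=lambda j: abs(corr(X_cols[j], residual)))
        if abs(corr(X_cols[j_best], residual)) < threshold:
            break                        # no remaining predictor is worth adding
        slope, icept = simple_fit(X_cols[j_best], residual)
        residual = [r - (slope * v + icept)
                    for r, v in zip(residual, X_cols[j_best])]
        selected.append(j_best)
        remaining.remove(j_best)
    return selected

# x0 drives y, x1 is pure noise; the procedure should pick x0 and stop
x0 = [1, 2, 3, 4, 5, 6, 7, 8]
x1 = [0.3, -0.1, 0.2, -0.4, 0.1, -0.2, 0.3, -0.3]
y = [2.1, 3.9, 6.2, 8.1, 9.8, 12.2, 14.0, 16.1]
print(forward_select([x0, x1], y))  # selects only the informative predictor
```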
Correlation and simple linear regression.
Zou, Kelly H; Tuncali, Kemal; Silverman, Stuart G
2003-06-01
In this tutorial article, the concepts of correlation and regression are reviewed and demonstrated. The authors review and compare two correlation coefficients, the Pearson correlation coefficient and the Spearman rho, for measuring linear and nonlinear relationships between two continuous variables. In the case of measuring the linear relationship between a predictor and an outcome variable, simple linear regression analysis is conducted. These statistical concepts are illustrated by using a data set from published literature to assess a computed tomography-guided interventional technique. These statistical methods are important for exploring the relationships between variables and can be applied to many radiologic studies.
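The two coefficients compared in the tutorial can be computed directly; a minimal sketch, with Spearman's rho computed as Pearson's r on ranks and assuming no ties:

```python
def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

def spearman(x, y):
    # Spearman's rho is Pearson's r computed on ranks (no ties here)
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0] * len(v)
        for rank, i in enumerate(order):
            r[i] = rank + 1
        return r
    return pearson(ranks(x), ranks(y))

# A monotonic but nonlinear relationship: y = x**3
x = [1, 2, 3, 4, 5, 6]
y = [v ** 3 for v in x]
print(round(pearson(x, y), 3), round(spearman(x, y), 3))
```

On this monotonic but nonlinear data, Spearman's rho is exactly 1 while Pearson's r falls short of 1, which is the distinction the article draws between the two measures.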
Regression filter for signal resolution
International Nuclear Information System (INIS)
Matthes, W.
1975-01-01
The problem considered is that of resolving a measured pulse height spectrum of a material mixture, e.g. gamma ray spectrum, Raman spectrum, into a weighted sum of the spectra of the individual constituents. The model on which the analytical formulation is based is described. The problem reduces to that of a multiple linear regression. A stepwise linear regression procedure was constructed. The efficiency of this method was then tested by transforming the procedure into a computer programme which was used to unfold test spectra obtained by mixing some spectra, from a library of arbitrarily chosen spectra, and adding a noise component. (U.K.)
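The unfolding step reduces to ordinary least squares on the component spectra. A minimal two-component sketch solving the normal equations in closed form; the spectra and mixing weights are invented for illustration:

```python
def unfold(mixture, components):
    """Least-squares weights w minimizing ||mixture - sum_k w_k * components[k]||^2
    for two components, via the 2x2 normal equations."""
    a, b = components
    aa = sum(v * v for v in a)
    bb = sum(v * v for v in b)
    ab = sum(u * v for u, v in zip(a, b))
    am = sum(u * v for u, v in zip(a, mixture))
    bm = sum(u * v for u, v in zip(b, mixture))
    det = aa * bb - ab * ab
    return ((am * bb - bm * ab) / det, (bm * aa - am * ab) / det)

# Library spectra of two pure constituents (channel counts) and a noisy 0.7/0.3 mixture
s1 = [10, 40, 80, 40, 10, 2, 1, 0]
s2 = [0, 2, 10, 30, 70, 30, 8, 1]
mixed = [0.7 * u + 0.3 * v + e
         for u, v, e in zip(s1, s2, [0.4, -0.3, 0.2, 0.5, -0.4, 0.1, -0.2, 0.3])]
w1, w2 = unfold(mixed, (s1, s2))
print(round(w1, 2), round(w2, 2))
```

Despite the added noise, the recovered weights land close to the true 0.7/0.3 mixture; the stepwise element of the paper's procedure would additionally decide which library spectra to include.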
Nonparametric Mixture of Regression Models.
Huang, Mian; Li, Runze; Wang, Shaoli
2013-07-01
Motivated by an analysis of US house price index data, we propose nonparametric finite mixture of regression models. We study the identifiability issue of the proposed models, and develop an estimation procedure by employing kernel regression. We further systematically study the sampling properties of the proposed estimators, and establish their asymptotic normality. A modified EM algorithm is proposed to carry out the estimation procedure. We show that our algorithm preserves the ascent property of the EM algorithm in an asymptotic sense. Monte Carlo simulations are conducted to examine the finite sample performance of the proposed estimation procedure. An empirical analysis of the US house price index data is illustrated for the proposed methodology.
Regression Benchmarking: An Approach to Quality Assurance in Performance
Bulej, Lubomír
2005-01-01
The paper presents a short summary of our work in the area of regression benchmarking and its application to software development. Specifically, we explain the concept of regression benchmarking, the requirements for employing regression testing in a software project, and methods used for analyzing the vast amounts of data resulting from repeated benchmarking. We present the application of regression benchmarking on a real software project and conclude with a glimpse at the challenges for the fu...
Tuttle, Michele L.W.; Fahy, Juli; Grauch, Richard I.; Ball, Bridget A.; Chong, Geneva W.; Elliott, John G.; Kosovich, John J.; Livo, Keith E.; Stillings, Lisa L.
2007-01-01
Results of chemical and some isotopic analyses of soil, shale, and water extracts collected from the surface, trenches, and pits in the Mancos Shale are presented in this report. Most data are for sites on the Gunnison Gorge National Conservation Area (GGNCA) in southwestern Colorado. For comparison, data from a few sites from the Mancos landscape near Hanksville, Utah, are included. Twelve trenches were dug on the GGNCA from which 258 samples for whole-rock (total) analyses and 187 samples for saturation paste extracts were collected. Sixteen of the extract samples were duplicated and subjected to a 1:5 water extraction for comparison. A regional soil survey across the Mancos landscape on the GGNCA generated 253 samples for whole-rock analyses and saturation paste extractions. Seventeen gypsum samples were collected on the GGNCA for sulfur and oxygen isotopic analysis. Sixteen samples were collected from shallow pits in the Mancos Shale near Hanksville, Utah.
Influence diagnostics in meta-regression model.
Shi, Lei; Zuo, ShanShan; Yu, Dalei; Zhou, Xiaohua
2017-09-01
This paper studies influence diagnostics in the meta-regression model, including case deletion diagnostics and local influence analysis. We derive the subset deletion formulae for the estimation of the regression coefficient and heterogeneity variance and obtain the corresponding influence measures. The DerSimonian and Laird estimation and maximum likelihood estimation methods in meta-regression are considered, respectively, to derive the results. Internal and external residual and leverage measures are defined. Local influence analyses based on the case-weights perturbation scheme, response perturbation scheme, covariate perturbation scheme, and within-variance perturbation scheme are explored. We introduce a method of simultaneously perturbing responses, covariates, and within-variance to obtain the local influence measure, which has the advantage of being able to compare the influence magnitude of influential studies across different perturbations. An example is used to illustrate the proposed methodology. Copyright © 2017 John Wiley & Sons, Ltd.
Principal component regression for crop yield estimation
Suryanarayana, T M V
2016-01-01
This book highlights the estimation of crop yield in Central Gujarat, especially with regard to the development of Multiple Regression Models and Principal Component Regression (PCR) models using climatological parameters as independent variables and crop yield as a dependent variable. It subsequently compares the multiple linear regression (MLR) and PCR results, and discusses the significance of PCR for crop yield estimation. In this context, the book also covers Principal Component Analysis (PCA), a statistical procedure used to reduce a number of correlated variables into a smaller number of uncorrelated variables called principal components (PC). This book will be helpful to students and researchers starting their work on climate and agriculture, mainly focussing on estimation models. The flow of chapters takes readers along a smooth path to understanding climate and weather and the impact of climate change, and gradually proceeds towards downscaling techniques and then finally towards development of ...
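The PCR idea, regressing on leading principal components instead of the raw collinear predictors, can be sketched for two predictors, where the leading eigenvector of the 2x2 covariance matrix has a closed form. The "climate"/"yield" values below are invented for illustration, not from the book:

```python
import math

def pcr_2d(x1, x2, y):
    """Principal component regression with two predictors:
    regress y on the first principal component of (x1, x2)."""
    n = len(y)
    m1, m2 = sum(x1) / n, sum(x2) / n
    c1 = [v - m1 for v in x1]
    c2 = [v - m2 for v in x2]
    # 2x2 covariance matrix entries
    s11 = sum(v * v for v in c1) / n
    s22 = sum(v * v for v in c2) / n
    s12 = sum(u * v for u, v in zip(c1, c2)) / n
    # Leading eigenvalue/eigenvector of [[s11, s12], [s12, s22]] (closed form for 2x2)
    lam = (s11 + s22) / 2 + math.sqrt(((s11 - s22) / 2) ** 2 + s12 ** 2)
    v = (s12, lam - s11)
    norm = math.hypot(*v)
    v = (v[0] / norm, v[1] / norm)
    # Scores on PC1, then simple regression of y on the scores
    t = [v[0] * a + v[1] * b for a, b in zip(c1, c2)]
    my = sum(y) / n
    slope = sum(ti * (yi - my) for ti, yi in zip(t, y)) / sum(ti * ti for ti in t)
    return slope, v

# Two highly collinear "climate" predictors driving a "yield" response
x1 = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
x2 = [1.1, 1.9, 3.2, 3.9, 5.1, 6.0]   # nearly equal to x1
y  = [3.0, 5.1, 7.0, 8.9, 11.1, 13.0]
slope, direction = pcr_2d(x1, x2, y)
print(round(slope, 2))
```

Because the two predictors are nearly identical, MLR coefficients would be unstable, while the single PC1 score absorbs their shared variation into one well-conditioned regressor.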
Cactus: An Introduction to Regression
Hyde, Hartley
2008-01-01
When the author first used "VisiCalc," the author thought it a very useful tool when he had the formulas. But how could he design a spreadsheet if there was no known formula for the quantities he was trying to predict? A few months later, the author relates, he learned to use multiple linear regression software and suddenly it all clicked into…
Regression Models for Repairable Systems
Czech Academy of Sciences Publication Activity Database
Novák, Petr
2015-01-01
Roč. 17, č. 4 (2015), s. 963-972 ISSN 1387-5841 Institutional support: RVO:67985556 Keywords : Reliability analysis * Repair models * Regression Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.782, year: 2015 http://library.utia.cas.cz/separaty/2015/SI/novak-0450902.pdf
Survival analysis II: Cox regression
Stel, Vianda S.; Dekker, Friedo W.; Tripepi, Giovanni; Zoccali, Carmine; Jager, Kitty J.
2011-01-01
In contrast to the Kaplan-Meier method, Cox proportional hazards regression can provide an effect estimate by quantifying the difference in survival between patient groups and can adjust for confounding effects of other variables. The purpose of this article is to explain the basic concepts of the
Kernel regression with functional response
Ferraty, Frédéric; Laksaci, Ali; Tadj, Amel; Vieu, Philippe
2011-01-01
We consider kernel regression estimate when both the response variable and the explanatory one are functional. The rates of uniform almost complete convergence are stated as function of the small ball probability of the predictor and as function of the entropy of the set on which uniformity is obtained.
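For intuition, here is the scalar-response special case of the kernel regression estimate (Nadaraya-Watson with a Gaussian kernel); the functional-response setting of the paper replaces the scalar responses with curves, but the kernel weighting is the same. Bandwidth and data are illustrative:

```python
import math

def nadaraya_watson(x_train, y_train, x0, h=0.5):
    """Gaussian-kernel Nadaraya-Watson estimate of E[Y | X = x0]."""
    weights = [math.exp(-((x - x0) / h) ** 2 / 2) for x in x_train]
    s = sum(weights)
    return sum(w * y for w, y in zip(weights, y_train)) / s

# Noisy sine curve; the estimate at x0 = 1.5 should land near sin(1.5)
xs = [i / 10 for i in range(0, 31)]
ys = [math.sin(x) + 0.05 * ((-1) ** i) for i, x in enumerate(xs)]
est = nadaraya_watson(xs, ys, 1.5, h=0.3)
print(round(est, 3))
```

The bandwidth h controls the bias-variance trade-off that the convergence rates in the paper quantify via the small ball probability of the predictor.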
Chapuis, Aude G; Roberts, Ilana M; Thompson, John A; Margolin, Kim A; Bhatia, Shailender; Lee, Sylvia M; Sloan, Heather L; Lai, Ivy P; Farrar, Erik A; Wagener, Felecia; Shibuya, Kendall C; Cao, Jianhong; Wolchok, Jedd D; Greenberg, Philip D; Yee, Cassian
2016-11-01
Purpose Peripheral blood-derived antigen-specific cytotoxic T cells (CTLs) provide a readily available source of effector cells that can be administered with minimal toxicity in an outpatient setting. In metastatic melanoma, this approach results in measurable albeit modest clinical responses in patients resistant to conventional therapy. We reasoned that concurrent cytotoxic T-cell lymphocyte antigen-4 (CTLA-4) checkpoint blockade might enhance the antitumor activity of adoptively transferred CTLs. Patients and Methods Autologous MART1-specific CTLs were generated by priming with peptide-pulsed dendritic cells in the presence of interleukin-21 and enriched by peptide-major histocompatibility complex multimer-guided cell sorting. This expeditiously yielded polyclonal CTL lines uniformly expressing markers associated with an enhanced survival potential. In this first-in-human strategy, 10 patients with stage IV melanoma received the MART1-specific CTLs followed by a standard course of anti-CTLA-4 (ipilimumab). Results The toxicity profile of the combined treatment was comparable to that of ipilimumab monotherapy. Evaluation of best responses at 12 weeks yielded two continuous complete remissions, one partial response (PR) using RECIST criteria (two PRs using immune-related response criteria), and three instances of stable disease. Infused CTLs persisted with frequencies up to 2.9% of CD8 + T cells for as long as the patients were monitored (up to 40 weeks). In patients who experienced complete remissions, PRs, or stable disease, the persisting CTLs acquired phenotypic and functional characteristics of long-lived memory cells. Moreover, these patients also developed responses to nontargeted tumor antigens (epitope spreading). Conclusion We demonstrate that combining antigen-specific CTLs with CTLA-4 blockade is safe and produces durable clinical responses, likely reflecting both enhanced activity of transferred cells and improved recruitment of new responses
Poisson Mixture Regression Models for Heart Disease Prediction.
Mufudza, Chipo; Erol, Hamza
2016-01-01
Early heart disease control can be achieved by high disease prediction and diagnosis efficiency. This paper focuses on the use of model-based clustering techniques to predict and diagnose heart disease via Poisson mixture regression models. Analysis and application of Poisson mixture regression models are addressed here under two different classes: standard and concomitant variable mixture regression models. Results show that a two-component concomitant variable Poisson mixture regression model predicts heart disease better than both the standard Poisson mixture regression model and the ordinary general linear Poisson regression model, owing to its lower Bayesian Information Criterion value. Furthermore, a zero-inflated Poisson mixture regression model turned out to be the best model overall for heart disease prediction, as it both clusters individuals into high- or low-risk categories and predicts the rate of heart disease component-wise given the available clusters. It is deduced that heart disease prediction can be done effectively by identifying the major risks component-wise using Poisson mixture regression models.
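The ordinary Poisson regression that these mixture models extend can be sketched in a few lines. The following numpy sketch fits a Poisson GLM with log link by iteratively reweighted least squares; the simulated data and coefficients are invented for illustration and are not the heart-disease data used by the authors.

```python
import numpy as np

def poisson_irls(X, y, n_iter=25):
    """Fit a Poisson GLM with log link by iteratively reweighted least squares."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        mu = np.exp(X @ beta)            # current mean under the log link
        W = mu                           # Poisson variance equals the mean
        z = X @ beta + (y - mu) / mu     # working response
        XtW = X.T * W                    # weighted LS update: (X'WX) beta = X'Wz
        beta = np.linalg.solve(XtW @ X, XtW @ z)
    return beta

rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, 500)
X = np.column_stack([np.ones_like(x), x])
y = rng.poisson(np.exp(0.5 + 1.2 * x))   # true intercept 0.5, true slope 1.2
beta = poisson_irls(X, y)
```

A mixture variant would run an EM loop around several such component fits; the single-component fit above is the building block.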
International Nuclear Information System (INIS)
Jafri, Y.Z.; Kamal, L.
2007-01-01
Various statistical techniques were applied to five years of data (1998-2002) on average humidity, rainfall, and maximum and minimum temperatures. Relationships for regression analysis time series (RATS) were developed to determine the overall trend of these climate parameters, on the basis of which forecast models can be corrected and modified. We computed the coefficient of determination as a measure of goodness of fit for our polynomial regression analysis time series (PRATS). Correlations for multiple linear regression (MLR) and multiple linear regression analysis time series (MLRATS) were also developed to decipher the interdependence of weather parameters. Spearman's rank correlation and the Goldfeld-Quandt test were used to check the uniformity or non-uniformity of variances in our fit to polynomial regression (PR). The Breusch-Pagan test was applied to MLR and MLRATS, respectively, which yielded homoscedasticity. We also employed Bartlett's test for homogeneity of variances on five years of rainfall and humidity data, which showed that the variances in the rainfall data were not homogeneous while those in the humidity data were. Our results on regression and regression analysis time series show the best fit for prediction modeling on climatic data of Quetta, Pakistan. (author)
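The coefficient of determination used as the goodness-of-fit measure for a polynomial trend fit can be computed directly. A minimal numpy sketch, with a synthetic monthly series standing in for the Quetta data (the series and coefficients below are invented):

```python
import numpy as np

def r_squared(y, y_hat):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    ss_res = np.sum((y - y_hat) ** 2)
    ss_tot = np.sum((y - np.mean(y)) ** 2)
    return 1.0 - ss_res / ss_tot

t = np.arange(60, dtype=float)          # e.g. a monthly index over five years
y = 20.0 + 0.3 * t + np.sin(t)          # synthetic "climate" series: trend + fluctuation
coef = np.polyfit(t, y, deg=2)          # quadratic trend, as in a polynomial RATS fit
trend = np.polyval(coef, t)
r2 = r_squared(y, trend)                # close to 1 when the trend dominates
```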
Logistic regression applied to natural hazards: rare event logistic regression with replications
Guns, M.; Vanacker, Veerle
2012-01-01
Statistical analysis of natural hazards needs particular attention, as most of these phenomena are rare events. This study shows that the ordinary rare event logistic regression, as it is now commonly used in geomorphologic studies, does not always lead to a robust detection of controlling factors, as the results can be strongly sample-dependent. In this paper, we introduce some concepts of Monte Carlo simulations in rare event logistic regression. This technique, so-called rare event logisti...
Quantile Regression With Measurement Error
Wei, Ying
2009-08-27
Regression quantiles can be substantially biased when the covariates are measured with error. In this paper we propose a new method that produces consistent linear quantile estimation in the presence of covariate measurement error. The method corrects the measurement error induced bias by constructing joint estimating equations that simultaneously hold for all the quantile levels. An iterative EM-type estimation algorithm to obtain the solutions to such joint estimation equations is provided. The finite sample performance of the proposed method is investigated in a simulation study, and compared to the standard regression calibration approach. Finally, we apply our methodology to part of the National Collaborative Perinatal Project growth data, a longitudinal study with an unusual measurement error structure. © 2009 American Statistical Association.
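Quantile regression rests on the check (pinball) loss: minimising it over a constant recovers a sample quantile, which is the error-free special case underlying the estimating equations above. A small deterministic illustration (a sketch of the loss, not the authors' measurement-error correction):

```python
import numpy as np

def pinball_loss(y, c, tau):
    """Quantile-regression check loss: rho_tau(u) = u*(tau - 1[u<0])."""
    u = y - c
    return np.sum(np.where(u >= 0, tau * u, (tau - 1.0) * u))

y = np.array([1., 2., 3., 4., 5., 6., 7., 8., 9.])
candidates = y  # for an intercept-only model the minimiser is a sample quantile

best_median = candidates[np.argmin([pinball_loss(y, c, 0.5) for c in candidates])]
best_q90 = candidates[np.argmin([pinball_loss(y, c, 0.9) for c in candidates])]
```

With tau = 0.5 the minimiser is the median; with tau = 0.9 the loss tilts so that the minimiser moves to the upper tail.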
Multivariate and semiparametric kernel regression
Härdle, Wolfgang; Müller, Marlene
1997-01-01
The paper gives an introduction to theory and application of multivariate and semiparametric kernel smoothing. Multivariate nonparametric density estimation is an often used pilot tool for examining the structure of data. Regression smoothing helps in investigating the association between covariates and responses. We concentrate on kernel smoothing using local polynomial fitting which includes the Nadaraya-Watson estimator. Some theory on the asymptotic behavior and bandwidth selection is pro...
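The Nadaraya-Watson estimator mentioned above is a ratio of kernel-weighted sums. A compact numpy sketch with a Gaussian kernel and toy data, showing the two bandwidth extremes that bandwidth selection must trade off:

```python
import numpy as np

def nadaraya_watson(x_train, y_train, x_eval, h):
    """Kernel regression estimate: weighted average of y with Gaussian weights."""
    d = (x_eval[:, None] - x_train[None, :]) / h
    K = np.exp(-0.5 * d ** 2)
    return (K @ y_train) / K.sum(axis=1)   # normalise weights per query point

x = np.linspace(0.0, 1.0, 11)
y = x ** 2
# Tiny bandwidth: the estimate at a training point reproduces its own response.
m_small = nadaraya_watson(x, y, np.array([0.5]), h=0.001)
# Huge bandwidth: the estimate collapses towards the overall mean of y.
m_large = nadaraya_watson(x, y, np.array([0.5]), h=1000.0)
```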
Regression algorithm for emotion detection
Berthelon , Franck; Sander , Peter
2013-01-01
International audience; We present here two components of a computational system for emotion detection. PEMs (Personalized Emotion Maps) store links between bodily expressions and emotion values, and are individually calibrated to capture each person's emotion profile. They are an implementation based on aspects of Scherer's theoretical complex system model of emotion~\\cite{scherer00, scherer09}. We also present a regression algorithm that determines a person's emotional feeling from sensor m...
Directional quantile regression in R
Czech Academy of Sciences Publication Activity Database
Boček, Pavel; Šiman, Miroslav
2017-01-01
Roč. 53, č. 3 (2017), s. 480-492 ISSN 0023-5954 R&D Projects: GA ČR GA14-07234S Institutional support: RVO:67985556 Keywords : multivariate quantile * regression quantile * halfspace depth * depth contour Subject RIV: BD - Theory of Information OBOR OECD: Applied mathematics Impact factor: 0.379, year: 2016 http://library.utia.cas.cz/separaty/2017/SI/bocek-0476587.pdf
International Nuclear Information System (INIS)
Dang Yaping; Hu Guoying; Meng Xianwen
1994-01-01
There are many opinions on the causes of hypothyroidism after 131 I treatment of hyperthyroidism, but few scientific analyses and reports on the question. Unconditional logistic regression solved this problem successfully; it has high scientific value and confidence in risk factor analysis. Data from 748 follow-up patients were analysed by unconditional logistic regression. The results showed that the half-life and the 131 I dose were the main causes of the incidence of hypothyroidism. The degree of confidence is 92.4%
Gaussian Process Regression Model in Spatial Logistic Regression
Sofro, A.; Oktaviarina, A.
2018-01-01
Spatial analysis has developed very quickly in the last decade. One of the favorite approaches is based on the neighbourhood of the region. Unfortunately, there are some limitations such as difficulty in prediction. Therefore, we offer Gaussian process regression (GPR) to accommodate the issue. In this paper, we will focus on spatial modeling with GPR for binomial data with logit link function. The performance of the model will be investigated. We will discuss the inference of how to estimate the parameters and hyper-parameters and to predict as well. Furthermore, simulation studies will be explained in the last section.
Multiple predictor smoothing methods for sensitivity analysis: Example results
International Nuclear Information System (INIS)
Storlie, Curtis B.; Helton, Jon C.
2008-01-01
The use of multiple predictor smoothing methods in sampling-based sensitivity analyses of complex models is investigated. Specifically, sensitivity analysis procedures based on smoothing methods employing the stepwise application of the following nonparametric regression techniques are described in the first part of this presentation: (i) locally weighted regression (LOESS), (ii) additive models, (iii) projection pursuit regression, and (iv) recursive partitioning regression. In this, the second and concluding part of the presentation, the indicated procedures are illustrated with both simple test problems and results from a performance assessment for a radioactive waste disposal facility (i.e., the Waste Isolation Pilot Plant). As shown by the example illustrations, the use of smoothing procedures based on nonparametric regression techniques can yield more informative sensitivity analysis results than can be obtained with more traditional sensitivity analysis procedures based on linear regression, rank regression or quadratic regression when nonlinear relationships between model inputs and model predictions are present
The study of logistic regression of risk factor on the death cause of uranium miners
International Nuclear Information System (INIS)
Wen Jinai; Yuan Liyun; Jiang Ruyi
1999-01-01
Logistic regression models have been widely used in the field of medicine. Computer software for this model is popular, but it is worth discussing how to use the model correctly. Using SPSS (Statistical Package for the Social Sciences) software, the unconditional logistic regression method was adopted to carry out multi-factor analyses of the causes of total death, cancer death and lung cancer death of uranium miners. The data are from the radioepidemiological database of one uranium mine. The results show that attained age is a risk factor in the logistic regression analyses of total death, cancer death and lung cancer death. In the logistic regression analysis of cancer death, there is a negative correlation between age at exposure and cancer death, which shows that the younger the age at exposure, the greater the risk of cancer death. In the logistic regression analysis of lung cancer death, there is a positive correlation between cumulated exposure and lung cancer death, which shows that cumulated exposure is the most important risk factor for lung cancer death in uranium miners. That the lung cancer death rate is higher in uranium miners has been documented by many foreign reports
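Unconditional logistic regression of the kind used in such risk-factor analyses can be fitted by Newton-Raphson. The sketch below uses simulated exposure data; the variable names and coefficients are invented for illustration and are not the miners' data.

```python
import numpy as np

def logistic_newton(X, y, n_iter=15):
    """Unconditional logistic regression fitted by Newton-Raphson (IRLS)."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-(X @ beta)))
        W = p * (1.0 - p)
        # Newton step: beta += (X'WX)^{-1} X'(y - p)
        beta = beta + np.linalg.solve((X.T * W) @ X, X.T @ (y - p))
    return beta

rng = np.random.default_rng(1)
exposure = rng.uniform(0.0, 4.0, 400)      # hypothetical cumulated exposure (arbitrary units)
p_true = 1.0 / (1.0 + np.exp(-(-2.0 + 1.0 * exposure)))
y = rng.binomial(1, p_true)                # 0/1 outcome, e.g. death from the studied cause
X = np.column_stack([np.ones_like(exposure), exposure])

beta = logistic_newton(X, y)
odds_ratio = np.exp(beta[1])   # multiplicative change in the odds per unit exposure
```

A positive slope (odds ratio above 1) is how a "positive correlation between cumulated exposure and lung cancer death" would appear in such a fit.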
Few crystal balls are crystal clear : eyeballing regression
International Nuclear Information System (INIS)
Wittebrood, R.T.
1998-01-01
The theory of regression and statistical analysis as it applies to reservoir analysis was discussed. It was argued that regression lines are not always the final truth. It was suggested that regression lines and eyeballed lines are often equally accurate. The many conditions that must be fulfilled to calculate a proper regression were discussed. Mentioned among these conditions were the distribution of the data, hidden variables, knowledge of how the data was obtained, the need for causal correlation of the variables, and knowledge of the manner in which the regression results are going to be used. 1 tab., 13 figs
Piecewise linear regression splines with hyperbolic covariates
International Nuclear Information System (INIS)
Cologne, John B.; Sposto, Richard
1992-09-01
Consider the problem of fitting a curve to data that exhibit a multiphase linear response with smooth transitions between phases. We propose substituting hyperbolas as covariates in piecewise linear regression splines to obtain curves that are smoothly joined. The method provides an intuitive and easy way to extend the two-phase linear hyperbolic response model of Griffiths and Miller and Watts and Bacon to accommodate more than two linear segments. The resulting regression spline with hyperbolic covariates may be fit by nonlinear regression methods to estimate the degree of curvature between adjoining linear segments. The added complexity of fitting nonlinear, as opposed to linear, regression models is not great. The extra effort is particularly worthwhile when investigators are unwilling to assume that the slope of the response changes abruptly at the join points. We can also estimate the join points (the values of the abscissas where the linear segments would intersect if extrapolated) if their number and approximate locations may be presumed known. An example using data on changing age at menarche in a cohort of Japanese women illustrates the use of the method for exploratory data analysis. (author)
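The hyperbolic covariate idea can be made concrete: h(x) = ((x - c) + sqrt((x - c)^2 + gamma^2)) / 2 tends to the hinge max(0, x - c) as gamma shrinks, with maximum deviation gamma/2 exactly at the join point, so the fitted gamma measures the curvature of the transition. A quick numerical check (a sketch of the idea, not the authors' exact parameterisation):

```python
import numpy as np

def hyperbolic_hinge(x, c, gamma):
    """Smooth version of max(0, x - c): one branch of a hyperbola with curvature gamma."""
    return 0.5 * ((x - c) + np.sqrt((x - c) ** 2 + gamma ** 2))

x = np.linspace(-2.0, 2.0, 401)
smooth = hyperbolic_hinge(x, 0.0, 0.01)   # small gamma: nearly a sharp two-phase kink
kink = np.maximum(0.0, x)
max_gap = np.max(np.abs(smooth - kink))   # bounded by gamma / 2 = 0.005
```

A two-phase regression spline is then y = b0 + b1*x + b2*hyperbolic_hinge(x, c, gamma), fit by nonlinear least squares over (b, c, gamma).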
Targeting: Logistic Regression, Special Cases and Extensions
Directory of Open Access Journals (Sweden)
Helmut Schaeben
2014-12-01
Full Text Available Logistic regression is a classical linear model for logit-transformed conditional probabilities of a binary target variable. It recovers the true conditional probabilities if the joint distribution of predictors and the target is of log-linear form. Weights-of-evidence is an ordinary logistic regression with parameters equal to the differences of the weights of evidence if all predictor variables are discrete and conditionally independent given the target variable. The hypothesis of conditional independence can be tested in terms of log-linear models. If the assumption of conditional independence is violated, the application of weights-of-evidence corrupts not only the predicted conditional probabilities but also their rank transform. Logistic regression models including interaction terms can account for the lack of conditional independence; appropriate interaction terms compensate exactly for violations of conditional independence. Multilayer artificial neural nets may be seen as nested regression-like models, with some sigmoidal activation function; most often, the logistic function is used as the activation function. If the net topology, i.e., its control, is sufficiently versatile to mimic interaction terms, artificial neural nets are able to account for violations of conditional independence and yield very similar results. Weights-of-evidence cannot reasonably include interaction terms; subsequent modifications of the weights, as often suggested, cannot emulate the effect of interaction terms.
Tutorial on Using Regression Models with Count Outcomes Using R
Directory of Open Access Journals (Sweden)
A. Alexander Beaujean
2016-02-01
Full Text Available Education researchers often study count variables, such as times a student reached a goal, discipline referrals, and absences. Most researchers who study these variables use typical regression methods (i.e., ordinary least squares), either with or without transforming the count variables. In either case, using typical regression for count data can produce parameter estimates that are biased, thus diminishing any inferences made from such data. As count-variable regression models are seldom taught in training programs, we present a tutorial to help educational researchers use such methods in their own research. We demonstrate analyzing and interpreting count data using Poisson, negative binomial, zero-inflated Poisson, and zero-inflated negative binomial regression models. The count regression methods are introduced through an example using the number of times students skipped class. The data for this example are freely available and the R syntax used to run the example analyses is included in the Appendix.
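The zero-inflated Poisson model covered by the tutorial mixes a point mass at zero with an ordinary Poisson, which is why it fits data with "too many" zeros. Its probability mass function is easy to verify directly; the tutorial itself uses R, and this is a Python transcription of the pmf with invented parameter values:

```python
from math import exp, factorial

def zip_pmf(k, lam, pi):
    """Zero-inflated Poisson: extra mass pi at zero, otherwise Poisson(lam)."""
    poisson = exp(-lam) * lam ** k / factorial(k)
    return pi * (k == 0) + (1.0 - pi) * poisson

# With pi = 0.3 and lam = 2, zeros are far more likely than Poisson(2) alone predicts.
p0_zip = zip_pmf(0, 2.0, 0.3)
p0_poisson = zip_pmf(0, 2.0, 0.0)   # pi = 0 reduces to the ordinary Poisson
total = sum(zip_pmf(k, 2.0, 0.3) for k in range(50))   # pmf sums to 1
```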
Ebrahimzadeh, Farzad; Hajizadeh, Ebrahim; Vahabi, Nasim; Almasian, Mohammad; Bakhteyar, Katayoon
2015-01-01
Unwanted pregnancy not intended by at least one of the parents has undesirable consequences for the family and the society. In the present study, three classification models were used and compared to predict unwanted pregnancies in an urban population. In this cross-sectional study, 887 pregnant mothers referring to health centers in Khorramabad, Iran, in 2012 were selected by the stratified and cluster sampling; relevant variables were measured and for prediction of unwanted pregnancy, logistic regression, discriminant analysis, and probit regression models and SPSS software version 21 were used. To compare these models, indicators such as sensitivity, specificity, the area under the ROC curve, and the percentage of correct predictions were used. The prevalence of unwanted pregnancies was 25.3%. The logistic and probit regression models indicated that parity and pregnancy spacing, contraceptive methods, household income and number of living male children were related to unwanted pregnancy. The performance of the models based on the area under the ROC curve was 0.735, 0.733, and 0.680 for logistic regression, probit regression, and linear discriminant analysis, respectively. Given the relatively high prevalence of unwanted pregnancies in Khorramabad, it seems necessary to revise family planning programs. Despite the similar accuracy of the models, if the researcher is interested in the interpretability of the results, the use of the logistic regression model is recommended.
Murphy, Kevin; Birn, Rasmus M; Handwerker, Daniel A; Jones, Tyler B; Bandettini, Peter A
2009-02-01
Low-frequency fluctuations in fMRI signal have been used to map several consistent resting state networks in the brain. Using the posterior cingulate cortex as a seed region, functional connectivity analyses have found not only positive correlations in the default mode network but negative correlations in another resting state network related to attentional processes. The interpretation is that the human brain is intrinsically organized into dynamic, anti-correlated functional networks. Global variations of the BOLD signal are often considered nuisance effects and are commonly removed using a general linear model (GLM) technique. This global signal regression method has been shown to introduce negative activation measures in standard fMRI analyses. The topic of this paper is whether such a correction technique could be the cause of anti-correlated resting state networks in functional connectivity analyses. Here we show that, after global signal regression, correlation values to a seed voxel must sum to a negative value. Simulations also show that small phase differences between regions can lead to spurious negative correlation values. A combination breath holding and visual task demonstrates that the relative phase of global and local signals can affect connectivity measures and that, experimentally, global signal regression leads to bell-shaped correlation value distributions, centred on zero. Finally, analyses of negatively correlated networks in resting state data show that global signal regression is most likely the cause of anti-correlations. These results call into question the interpretation of negatively correlated regions in the brain when using global signal regression as an initial processing step.
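The paper's core algebraic point, that after global signal regression the covariance (and hence correlation) values to a seed must sum to a non-positive value, so negative values are forced to appear, can be reproduced in a few lines. A simulation sketch with random data standing in for the fMRI time series (not the authors' data or pipeline):

```python
import numpy as np

rng = np.random.default_rng(2)
T, V = 200, 50                                   # time points, voxels
data = rng.standard_normal((T, V)) + 0.5 * rng.standard_normal((T, 1))  # shared global component
data -= data.mean(axis=0)                        # demean each voxel time series

g = data.mean(axis=1, keepdims=True)             # global signal regressor
beta = (g.T @ data) / (g.T @ g)                  # per-voxel regression of data on g
resid = data - g @ beta                          # global signal regressed out

# After GSR the residuals sum to zero across voxels at every time point, so the
# covariance of any seed residual with all voxels sums to zero; excluding the seed
# itself, the remaining covariances must sum to -var(seed), i.e. be negative overall.
row_sums = resid.sum(axis=1)
seed = resid[:, 0]
cov_with_all = (seed @ resid) / (T - 1)
```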
Regression: The Apple Does Not Fall Far From the Tree.
Vetter, Thomas R; Schober, Patrick
2018-05-15
Researchers and clinicians are frequently interested in either: (1) assessing whether there is a relationship or association between 2 or more variables and quantifying this association; or (2) determining whether 1 or more variables can predict another variable. The strength of such an association is mainly described by the correlation. However, regression analysis and regression models can be used not only to identify whether there is a significant relationship or association between variables but also to generate estimations of such a predictive relationship between variables. This basic statistical tutorial discusses the fundamental concepts and techniques related to the most common types of regression analysis and modeling, including simple linear regression, multiple regression, logistic regression, ordinal regression, and Poisson regression, as well as the common yet often underrecognized phenomenon of regression toward the mean. The various types of regression analysis are powerful statistical techniques, which when appropriately applied, can allow for the valid interpretation of complex, multifactorial data. Regression analysis and models can assess whether there is a relationship or association between 2 or more observed variables and estimate the strength of this association, as well as determine whether 1 or more variables can predict another variable. Regression is thus being applied more commonly in anesthesia, perioperative, critical care, and pain research. However, it is crucial to note that regression can identify plausible risk factors; it does not prove causation (a definitive cause and effect relationship). The results of a regression analysis instead identify independent (predictor) variable(s) associated with the dependent (outcome) variable. As with other statistical methods, applying regression requires that certain assumptions be met, which can be tested with specific diagnostics.
Spontaneous regression of pulmonary bullae
International Nuclear Information System (INIS)
Satoh, H.; Ishikawa, H.; Ohtsuka, M.; Sekizawa, K.
2002-01-01
The natural history of pulmonary bullae is often characterized by gradual, progressive enlargement. Spontaneous regression of bullae is, however, very rare. We report a case in which complete resolution of pulmonary bullae in the left upper lung occurred spontaneously. The management of pulmonary bullae is occasionally made difficult because of gradual progressive enlargement associated with abnormal pulmonary function. Some patients have multiple bulla in both lungs and/or have a history of pulmonary emphysema. Others have a giant bulla without emphysematous change in the lungs. Our present case had treated lung cancer with no evidence of local recurrence. He had no emphysematous change in lung function test and had no complaints, although the high resolution CT scan shows evidence of underlying minimal changes of emphysema. Ortin and Gurney presented three cases of spontaneous reduction in size of bulla. Interestingly, one of them had a marked decrease in the size of a bulla in association with thickening of the wall of the bulla, which was observed in our patient. This case we describe is of interest, not only because of the rarity with which regression of pulmonary bulla has been reported in the literature, but also because of the spontaneous improvements in the radiological picture in the absence of overt infection or tumor. Copyright (2002) Blackwell Science Pty Ltd
Quantum algorithm for linear regression
Wang, Guoming
2017-07-01
We present a quantum algorithm for fitting a linear regression model to a given data set using the least-squares approach. Differently from previous algorithms which yield a quantum state encoding the optimal parameters, our algorithm outputs these numbers in the classical form. So by running it once, one completely determines the fitted model and then can use it to make predictions on new data at little cost. Moreover, our algorithm works in the standard oracle model, and can handle data sets with nonsparse design matrices. It runs in time poly( log2(N ) ,d ,κ ,1 /ɛ ) , where N is the size of the data set, d is the number of adjustable parameters, κ is the condition number of the design matrix, and ɛ is the desired precision in the output. We also show that the polynomial dependence on d and κ is necessary. Thus, our algorithm cannot be significantly improved. Furthermore, we also give a quantum algorithm that estimates the quality of the least-squares fit (without computing its parameters explicitly). This algorithm runs faster than the one for finding this fit, and can be used to check whether the given data set qualifies for linear regression in the first place.
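The classical least-squares baseline that the quantum algorithm targets, together with the condition number κ of the design matrix that enters its running time, can be computed directly. A toy design matrix chosen for illustration:

```python
import numpy as np

# Minimise ||X beta - y||^2; here y is exactly linear in x, so the fit is exact.
X = np.array([[1., 0.],
              [1., 1.],
              [1., 2.],
              [1., 3.]])
y = np.array([1., 3., 5., 7.])                 # intercept 1, slope 2

beta, res, rank, sing = np.linalg.lstsq(X, y, rcond=None)
kappa = sing[0] / sing[-1]                     # condition number of the design matrix
```

The quantum algorithm's poly(log N, d, κ, 1/ε) running time is attractive precisely when N is huge and κ is modest, as in this well-conditioned example.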
Interpretation of commonly used statistical regression models.
Kasza, Jessica; Wolfe, Rory
2014-01-01
A review of some regression models commonly used in respiratory health applications is provided in this article. Simple linear regression, multiple linear regression, logistic regression and ordinal logistic regression are considered. The focus of this article is on the interpretation of the regression coefficients of each model, which are illustrated through the application of these models to a respiratory health research study. © 2013 The Authors. Respirology © 2013 Asian Pacific Society of Respirology.
Keith, Timothy Z
2014-01-01
Multiple Regression and Beyond offers a conceptually oriented introduction to multiple regression (MR) analysis and structural equation modeling (SEM), along with analyses that flow naturally from those methods. By focusing on the concepts and purposes of MR and related methods, rather than the derivation and calculation of formulae, this book introduces material to students more clearly, and in a less threatening way. In addition to illuminating content necessary for coursework, the accessibility of this approach means students are more likely to be able to conduct research using MR or SEM--and more likely to use the methods wisely. Covers both MR and SEM, while explaining their relevance to one another Also includes path analysis, confirmatory factor analysis, and latent growth modeling Figures and tables throughout provide examples and illustrate key concepts and techniques For additional resources, please visit: http://tzkeith.com/.
Ridge regression estimator: combining unbiased and ordinary ridge regression methods of estimation
Directory of Open Access Journals (Sweden)
Sharad Damodar Gore
2009-10-01
Full Text Available Statistical literature has several methods for coping with multicollinearity. This paper introduces a new shrinkage estimator, called modified unbiased ridge (MUR. This estimator is obtained from unbiased ridge regression (URR in the same way that ordinary ridge regression (ORR is obtained from ordinary least squares (OLS. Properties of MUR are derived. Results on its matrix mean squared error (MMSE are obtained. MUR is compared with ORR and URR in terms of MMSE. These results are illustrated with an example based on data generated by Hoerl and Kennard (1975.
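The ordinary ridge regression (ORR) that MUR builds on can be sketched directly: with near-collinear predictors, the ridge solution has a smaller norm than OLS while still fitting the signal. A numpy illustration with invented data; MUR itself is not implemented here.

```python
import numpy as np

def ridge(X, y, lam):
    """Ordinary ridge regression: beta = (X'X + lam I)^{-1} X'y."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

rng = np.random.default_rng(3)
x1 = rng.standard_normal(100)
x2 = x1 + 0.01 * rng.standard_normal(100)     # nearly collinear with x1
X = np.column_stack([x1, x2])
y = x1 + x2 + 0.1 * rng.standard_normal(100)  # true combined slope is 2

beta_ols = ridge(X, y, 0.0)    # OLS as the lam = 0 special case: unstable coefficients
beta_ridge = ridge(X, y, 1.0)  # shrunken, stable coefficients
```

OLS splits the combined slope between the two collinear columns almost arbitrarily; ridge shrinks toward a stable split while preserving their sum.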
DEFF Research Database (Denmark)
Nielsen, Peter Carøe; Hansen, Hans Nørgaard; Olsen, Flemming Ove
2007-01-01
The quantitative and qualitative description of laser beam characteristics is important for process implementation and optimisation. In particular, a need for quantitative characterisation of beam diameter was identified when using fibre lasers for micro manufacturing. Here the beam diameter limits the obtainable features in direct laser machining as well as heat affected zones in welding processes. This paper describes the development of a measuring unit capable of analysing beam shape and diameter of lasers to be used in manufacturing processes. The analyser is based on the principle of a rotating mechanical wire being swept through the laser beam at varying Z-heights. The reflected signal is analysed and the resulting beam profile determined. The development comprised the design of a flexible fixture capable of providing both rotation and Z-axis movement, control software including data capture...
On Weighted Support Vector Regression
DEFF Research Database (Denmark)
Han, Xixuan; Clemmensen, Line Katrine Harder
2014-01-01
We propose a new type of weighted support vector regression (SVR), motivated by modeling local dependencies in time and space in prediction of house prices. The classic weights of the weighted SVR are added to the slack variables in the objective function (OF‐weights). This procedure directly shrinks the coefficient of each observation in the estimated functions; thus, it is widely used for minimizing influence of outliers. We propose to additionally add weights to the slack variables in the constraints (CF‐weights) and call the combination of weights the doubly weighted SVR. We illustrate the differences and similarities of the two types of weights by demonstrating the connection between the Least Absolute Shrinkage and Selection Operator (LASSO) and the SVR. We show that an SVR problem can be transformed to a LASSO problem plus a linear constraint and a box constraint. We demonstrate...
Modeling oil production based on symbolic regression
International Nuclear Information System (INIS)
Yang, Guangfei; Li, Xianneng; Wang, Jianliang; Lian, Lian; Ma, Tieju
2015-01-01
Numerous models have been proposed to forecast the future trends of oil production and almost all of them are based on some predefined assumptions with various uncertainties. In this study, we propose a novel data-driven approach that uses symbolic regression to model oil production. We validate our approach on both synthetic and real data, and the results prove that symbolic regression could effectively identify the true models beneath the oil production data and also make reliable predictions. Symbolic regression indicates that world oil production will peak in 2021, which broadly agrees with other techniques used by researchers. Our results also show that the rate of decline after the peak is almost half the rate of increase before the peak, and it takes nearly 12 years to drop 4% from the peak. These predictions are more optimistic than those in several other reports, and the smoother decline will provide the world, especially the developing countries, with more time to orchestrate mitigation plans. -- Highlights: •A data-driven approach has been shown to be effective at modeling the oil production. •The Hubbert model could be discovered automatically from data. •The peak of world oil production is predicted to appear in 2021. •The decline rate after peak is half of the increase rate before peak. •Oil production projected to decline 4% post-peak
Image superresolution using support vector regression.
Ni, Karl S; Nguyen, Truong Q
2007-06-01
A thorough investigation of the application of support vector regression (SVR) to the superresolution problem is conducted through various frameworks. Prior to the study, the SVR problem is enhanced by finding the optimal kernel. This is done by formulating the kernel learning problem in SVR form as a convex optimization problem, specifically a semi-definite programming (SDP) problem. An additional constraint is added to reduce the SDP to a quadratically constrained quadratic programming (QCQP) problem. After this optimization, investigation of the relevancy of SVR to superresolution proceeds with the possibility of using a single and general support vector regression for all image content, and the results are impressive for small training sets. This idea is improved upon by observing structural properties in the discrete cosine transform (DCT) domain to aid in learning the regression. Further improvement involves a combination of classification and SVR-based techniques, extending works in resolution synthesis. This method, termed kernel resolution synthesis, uses specific regressors for isolated image content to describe the domain through a partitioned look of the vector space, thereby yielding good results.
Robust Regression and its Application in Financial Data Analysis
Mansoor Momeni; Mahmoud Dehghan Nayeri; Ali Faal Ghayoumi; Hoda Ghorbani
2010-01-01
This research aims to describe the application of robust regression and its advantages over the least squares regression method in analyzing financial data. To do this, the relationship between earnings per share, book value of equity per share and share price (the price model), and between earnings per share, the annual change in earnings per share and stock returns (the return model), is discussed using both robust and least squares regressions, and finally the outcomes are compared. Comparing the results from th...
Directory of Open Access Journals (Sweden)
Hong-Juan Li
2013-04-01
Full Text Available Electric load forecasting is an important issue for a power utility, associated with the management of daily operations such as energy transfer scheduling, unit commitment, and load dispatch. Inspired by strong non-linear learning capability of support vector regression (SVR, this paper presents a SVR model hybridized with the empirical mode decomposition (EMD method and auto regression (AR for electric load forecasting. The electric load data of the New South Wales (Australia market are employed for comparing the forecasting performances of different forecasting models. The results confirm the validity of the idea that the proposed model can simultaneously provide forecasting with good accuracy and interpretability.
Using the Ridge Regression Procedures to Estimate the Multiple Linear Regression Coefficients
Gorgees, HazimMansoor; Mahdi, FatimahAssim
2018-05-01
This article compares the performance of different types of ordinary ridge regression estimators proposed for estimating the regression parameters when near-exact linear relationships among the explanatory variables are present. For this situation we employ data obtained from the Tagi gas filling company during the period 2008-2010. The main result is that the method based on the condition number performs better than the other methods, since it has a smaller mean square error (MSE).
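The ordinary ridge estimator being compared can be sketched in a few lines. The collinear data and the fixed ridge parameter k below are illustrative only; the article selects k from quantities such as the condition number:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100

# Two nearly collinear explanatory variables (a near-exact linear relationship).
x1 = rng.normal(0, 1, n)
x2 = x1 + rng.normal(0, 0.01, n)          # x2 is almost identical to x1
X = np.column_stack([x1, x2])
beta_true = np.array([1.0, 1.0])
y = X @ beta_true + rng.normal(0, 1, n)

def ridge(X, y, k):
    """Ordinary ridge estimator: (X'X + kI)^{-1} X'y."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)

b_ols = ridge(X, y, 0.0)      # k = 0 recovers ordinary least squares
b_ridge = ridge(X, y, 1.0)    # k fixed here purely for illustration

# Under severe collinearity the OLS coefficients are wildly unstable,
# while the ridge coefficients stay near the true values.
print("OLS:  ", b_ols)
print("ridge:", b_ridge)
```

Comparing the two estimates against `beta_true` reproduces, in miniature, the MSE comparison the article performs across estimators.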
Bayesian Inference of a Multivariate Regression Model
Directory of Open Access Journals (Sweden)
Marick S. Sinay
2014-01-01
We explore Bayesian inference of a multivariate linear regression model with a flexible prior for the covariance structure. The commonly adopted Bayesian setup involves the conjugate prior: a multivariate normal distribution for the regression coefficients and an inverse Wishart specification for the covariance matrix. Here we depart from this approach and propose a novel Bayesian estimator for the covariance. A multivariate normal prior for the unique elements of the matrix logarithm of the covariance matrix is considered. Such a structure allows for a richer class of prior distributions for the covariance, with respect to the strength of beliefs in prior location hyperparameters, as well as the added ability to model potential correlation among the elements of the covariance structure. The posterior moments of all relevant parameters of interest are calculated numerically via a Markov chain Monte Carlo procedure. A Metropolis-Hastings-within-Gibbs algorithm is used to construct a proposal density that closely matches the shape of the target posterior distribution. As an application of the proposed technique, we investigate a multiple regression based upon the 1980 High School and Beyond Survey.
Face Alignment via Regressing Local Binary Features.
Ren, Shaoqing; Cao, Xudong; Wei, Yichen; Sun, Jian
2016-03-01
This paper presents a highly efficient and accurate regression approach for face alignment. Our approach has two novel components: 1) a set of local binary features and 2) a locality principle for learning those features. The locality principle guides us to learn a set of highly discriminative local binary features for each facial landmark independently. The obtained local binary features are used to jointly learn a linear regression for the final output. This approach achieves state-of-the-art results when tested on the most challenging benchmarks to date. Furthermore, because extracting and regressing local binary features is computationally very cheap, our system is much faster than previous methods. It achieves over 3000 frames per second (FPS) on a desktop or 300 FPS on a mobile phone for locating a few dozen landmarks. We also study a key issue that is important but has received little attention in previous research: the face detector used to initialize alignment. We investigate several face detectors and quantitatively evaluate how they affect alignment accuracy. We find that an alignment-friendly detector can further greatly boost the accuracy of our alignment method, reducing the error by up to 16% in relative terms. To facilitate practical usage of face detection/alignment methods, we also propose a convenient metric to measure how well suited a detector is for alignment initialization.
Geographically weighted regression model on poverty indicator
Slamet, I.; Nugroho, N. F. T. A.; Muslich
2017-12-01
In this research, we applied geographically weighted regression (GWR) to analyze poverty in Central Java, using a Gaussian kernel as the weighting function. GWR uses the diagonal matrix obtained by evaluating the Gaussian kernel as the weight matrix in the regression model; the kernel weights handle spatial effects in the data, so that a separate model can be obtained for each location. The purpose of this paper is to model poverty percentage data in Central Java province using GWR with a Gaussian kernel weighting function, and to determine the influential factors in each regency/city of the province. Based on the research, we obtained a geographically weighted regression model with a Gaussian kernel weighting function for poverty percentage data in Central Java province. We found that the percentage of the population working as farmers, the population growth rate, the percentage of households with regular sanitation, and the number of BPJS beneficiaries are the variables that affect the percentage of poverty in Central Java province. The coefficient of determination R2 is 68.64%. The districts fall into two categories, each influenced by a different set of significant factors.
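The estimation step described (a weighted least squares fit at each location, with the diagonal weight matrix built from a Gaussian kernel of inter-location distances) can be sketched as follows. Coordinates, covariates, and the bandwidth are synthetic stand-ins, not the Central Java data:

```python
import numpy as np

rng = np.random.default_rng(2)
m = 50                                    # number of locations (regencies/cities)
coords = rng.uniform(0, 10, (m, 2))       # synthetic map coordinates
x = rng.normal(0, 1, m)

# Spatially varying slope: the effect of x drifts with location.
beta_loc = 1.0 + 0.2 * coords[:, 0]
y = beta_loc * x + rng.normal(0, 0.1, m)
X = np.column_stack([np.ones(m), x])

def gwr_fit(X, y, coords, i, bandwidth):
    """Weighted least squares at location i with Gaussian kernel weights."""
    d = np.linalg.norm(coords - coords[i], axis=1)
    w = np.exp(-0.5 * (d / bandwidth) ** 2)   # Gaussian kernel
    W = np.diag(w)                            # the diagonal weight matrix
    return np.linalg.solve(X.T @ W @ X, X.T @ W @ y)

# One intercept and slope per location, as in GWR.
betas = np.array([gwr_fit(X, y, coords, i, bandwidth=2.0) for i in range(m)])
print(betas[:3])
```

The per-location slope estimates track the true spatially varying coefficient, which is exactly the behaviour GWR is designed to expose.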
General regression and representation model for classification.
Directory of Open Access Journals (Sweden)
Jianjun Qian
Recently, regularized coding-based classification methods (e.g., SRC and CRC) have shown great potential for pattern classification. However, most existing coding methods assume that the representation residuals are uncorrelated. In real-world applications, this assumption does not hold. In this paper, we take account of the correlations of the representation residuals and develop a general regression and representation model (GRR) for classification. GRR not only has the advantages of CRC, but also makes full use of prior information (e.g., the correlations between representation residuals and representation coefficients) and specific information (a weight matrix of image pixels) to enhance classification performance. GRR uses generalized Tikhonov regularization and K nearest neighbors to learn the prior information from the training data. Meanwhile, the specific information is obtained by using an iterative algorithm to update the feature (or image pixel) weights of the test sample. With the proposed model as a platform, we design two classifiers: the basic general regression and representation classifier (B-GRR) and the robust general regression and representation classifier (R-GRR). The experimental results demonstrate the performance advantages of the proposed methods over state-of-the-art algorithms.
Credit Scoring Problem Based on Regression Analysis
Khassawneh, Bashar Suhil Jad Allah
2014-01-01
ABSTRACT: This thesis provides an explanatory introduction to the regression models of data mining and contains basic definitions of key terms in the linear, multiple and logistic regression models. The aim of this study is to illustrate fitting models for the credit scoring problem using simple linear, multiple linear and logistic regression models, and to analyze the resulting model functions with statistical tools. Keywords: Data mining, linear regression, logistic regression....
Variable selection and model choice in geoadditive regression models.
Kneib, Thomas; Hothorn, Torsten; Tutz, Gerhard
2009-06-01
Model choice and variable selection are issues of major concern in practical regression analyses, arising in many biometric applications such as habitat suitability analyses, where the aim is to identify the influence of potentially many environmental conditions on certain species. We describe regression models for breeding bird communities that facilitate both model choice and variable selection, by a boosting algorithm that works within a class of geoadditive regression models comprising spatial effects, nonparametric effects of continuous covariates, interaction surfaces, and varying coefficients. The major modeling components are penalized splines and their bivariate tensor product extensions. All smooth model terms are represented as the sum of a parametric component and a smooth component with one degree of freedom to obtain a fair comparison between the model terms. A generic representation of the geoadditive model allows us to devise a general boosting algorithm that automatically performs model choice and variable selection.
An Original Stepwise Multilevel Logistic Regression Analysis of Discriminatory Accuracy
DEFF Research Database (Denmark)
Merlo, Juan; Wagner, Philippe; Ghith, Nermin
2016-01-01
BACKGROUND AND AIM: Many multilevel logistic regression analyses of "neighbourhood and health" focus on interpreting measures of associations (e.g., odds ratio, OR). In contrast, multilevel analysis of variance is rarely considered. We propose an original stepwise analytical approach that disting...
Estimating the causes of traffic accidents using logistic regression and discriminant analysis.
Karacasu, Murat; Ergül, Barış; Altin Yavuz, Arzu
2014-01-01
Factors that affect traffic accidents have been analysed in various ways. In this study, we use the methods of logistic regression and discriminant analysis to determine the damages due to injury and non-injury accidents in the Eskisehir Province. Data were obtained from the accident reports of the General Directorate of Security in Eskisehir; 2552 traffic accidents between January and December 2009 were investigated regarding whether they resulted in injury. According to the results, the effects of traffic accidents were reflected in the variables. These results provide a wealth of information that may aid future measures toward the prevention of undesired results.
Regularized Label Relaxation Linear Regression.
Fang, Xiaozhao; Xu, Yong; Li, Xuelong; Lai, Zhihui; Wong, Wai Keung; Fang, Bingwu
2018-04-01
Linear regression (LR) and some of its variants have been widely used for classification problems. Most of these methods assume that during the learning phase, the training samples can be exactly transformed into a strict binary label matrix, which has too little freedom to fit the labels adequately. To address this problem, in this paper we propose a novel regularized label relaxation LR method, which has the following notable characteristics. First, the proposed method relaxes the strict binary label matrix into a slack variable matrix by introducing a nonnegative label relaxation matrix into LR, which provides more freedom to fit the labels and simultaneously enlarges the margins between different classes as much as possible. Second, the proposed method constructs a class compactness graph based on manifold learning and uses it as a regularization term to avoid overfitting. The class compactness graph ensures that samples sharing the same labels are kept close after transformation. Two different algorithms, based on two different norm loss functions, are devised. Both have compact closed-form solutions in each iteration, so they are easily implemented. Extensive experiments show that these two algorithms outperform state-of-the-art algorithms in terms of classification accuracy and running time.
Smith, Paul F; Ganesh, Siva; Liu, Ping
2013-10-30
Regression is a common statistical tool for prediction in neuroscience. However, linear regression is by far the most common form of regression used, with regression trees receiving comparatively little attention. In this study, the results of conventional multiple linear regression (MLR) were compared with those of random forest regression (RFR) in predicting the concentrations of 9 neurochemicals in the vestibular nucleus complex and cerebellum that are part of the l-arginine biochemical pathway (agmatine, putrescine, spermidine, spermine, l-arginine, l-ornithine, l-citrulline, glutamate and γ-aminobutyric acid (GABA)). The R² values for the MLRs were higher than the proportion-of-variance-explained values for the RFRs: 6/9 were ≥ 0.70, compared to 4/9 for the RFRs. Even the variables with the lowest R² values in the MLRs, e.g. ornithine (0.50) and glutamate (0.61), had much lower proportion-of-variance-explained values in the RFRs (0.27 and 0.49, respectively). The RSE values for the MLRs were lower than those for the RFRs in all but two cases. For this data set, MLR appeared superior to RFR in terms of both explanatory value and error. This result suggests that MLR may have advantages over RFR for prediction in neuroscience with this kind of data, although RFR can still have good predictive value in some cases.
Estimating the exceedance probability of rain rate by logistic regression
Chiu, Long S.; Kedem, Benjamin
1990-01-01
Recent studies have shown that the fraction of an area with rain intensity above a fixed threshold is highly correlated with the area-averaged rain rate. To estimate the fractional rainy area, a logistic regression model, which estimates the conditional probability that rain rate over an area exceeds a fixed threshold given the values of related covariates, is developed. The problem of dependency in the data in the estimation procedure is bypassed by the method of partial likelihood. Analyses of simulated scanning multichannel microwave radiometer and observed electrically scanning microwave radiometer data during the Global Atlantic Tropical Experiment period show that the use of logistic regression in pixel classification is superior to multiple regression in predicting whether rain rate at each pixel exceeds a given threshold, even in the presence of noisy data. The potential of the logistic regression technique in satellite rain rate estimation is discussed.
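The core of the approach, modelling the conditional probability of exceeding a fixed rain-rate threshold with a logistic regression, can be sketched on synthetic data. The covariate, threshold, and sample size below are illustrative, not the radiometer data of the study:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
n = 500

# Synthetic covariate: area-averaged rain rate (mm/h).  The binary response
# is whether the rain rate at a pixel exceeds a fixed threshold.
area_avg = rng.gamma(2.0, 2.0, n)
p_true = 1 / (1 + np.exp(-(area_avg - 4.0)))     # true exceedance probability
exceeds = (rng.uniform(size=n) < p_true).astype(int)

# Logistic regression of the exceedance indicator on the covariate.
model = LogisticRegression()
model.fit(area_avg.reshape(-1, 1), exceeds)

# Estimated conditional exceedance probabilities at two covariate values.
prob = model.predict_proba(np.array([[2.0], [6.0]]))[:, 1]
print(prob)
```

The fitted probabilities increase with the area-averaged rate, mirroring the correlation between fractional rainy area and area-averaged rain rate that motivates the method.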
Meaney, Christopher; Moineddin, Rahim
2014-01-24
In biomedical research, response variables are often encountered which have bounded support on the open unit interval (0,1). Traditionally, researchers have attempted to estimate covariate effects on these types of response data using linear regression. Alternative modelling strategies may include: beta regression, variable-dispersion beta regression, and fractional logit regression models. This study employs a Monte Carlo simulation design to compare the statistical properties of the linear regression model to those of the more novel beta regression, variable-dispersion beta regression, and fractional logit regression models. In the Monte Carlo experiment we assume a simple two-sample design. We assume observations are realizations of independent draws from their respective probability models. The randomly simulated draws from the various probability models are chosen to emulate average proportion/percentage/rate differences of pre-specified magnitudes. Following simulation of the experimental data we estimate average proportion/percentage/rate differences. We compare the estimators in terms of bias, variance, type-1 error and power. Estimates of Monte Carlo error associated with these quantities are provided. If response data are beta distributed with constant dispersion parameters across the two samples, then all models are unbiased and have reasonable type-1 error rates and power profiles. If the response data in the two samples have different dispersion parameters, then the simple beta regression model is biased. When the sample size is small (N0 = N1 = 25) linear regression has superior type-1 error rates compared to the other models. Small-sample type-1 error rates can be improved in beta regression models using bias correction/reduction methods. In the power experiments, variable-dispersion beta regression and fractional logit regression models have slightly elevated power compared to linear regression models. Similar results were observed if the
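A minimal version of the two-sample setup can be sketched as follows: beta-distributed responses with equal dispersion in both groups, with the mean difference estimated by linear regression on a group indicator. The means, dispersion, and sample sizes are illustrative, not the paper's full factorial design:

```python
import numpy as np

rng = np.random.default_rng(4)
n0 = n1 = 25                               # the paper's small-sample case

# Beta responses on (0,1): mean mu and dispersion phi give shape (mu*phi, (1-mu)*phi).
# Both groups share phi = 10; group means are 0.4 and 0.6.
y0 = rng.beta(0.4 * 10, 0.6 * 10, n0)
y1 = rng.beta(0.6 * 10, 0.4 * 10, n1)
y = np.concatenate([y0, y1])
g = np.concatenate([np.zeros(n0), np.ones(n1)])   # group indicator

# OLS of y on [1, g]: the slope equals the difference in sample means.
X = np.column_stack([np.ones(n0 + n1), g])
beta = np.linalg.lstsq(X, y, rcond=None)[0]
print("estimated difference:", beta[1])
print("direct difference:   ", y1.mean() - y0.mean())
```

Repeating this draw-and-estimate loop many times, and swapping the OLS step for beta or fractional logit fits, is how the study's bias, type-1 error and power comparisons are assembled.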
Sulistianingsih, E.; Kiftiah, M.; Rosadi, D.; Wahyuni, H.
2017-04-01
Gross Domestic Product (GDP) is an indicator of economic growth in a region. GDP data form a panel, consisting of cross-sectional and time series components, and panel regression is a tool for analysing such data. There are three models in panel regression, namely the Common Effect Model (CEM), the Fixed Effect Model (FEM) and the Random Effect Model (REM); the model is chosen based on the results of the Chow test, Hausman test and Lagrange multiplier test. This research uses panel regression to analyse the influence of palm oil production, exports, and government consumption on the GDP of five districts in West Kalimantan, namely Sanggau, Sintang, Sambas, Ketapang and Bengkayang. Based on the analyses, it is concluded that REM, whose adjusted coefficient of determination is 0.823, is the best model in this case. Moreover, only exports and government consumption influence the GDP of the districts.
Independent contrasts and PGLS regression estimators are equivalent.
Blomberg, Simon P; Lefevre, James G; Wells, Jessie A; Waterhouse, Mary
2012-05-01
We prove that the slope parameter of the ordinary least squares regression of phylogenetically independent contrasts (PICs) conducted through the origin is identical to the slope parameter of the method of generalized least squares (GLS) regression under a Brownian motion model of evolution. This equivalence has several implications: 1. Understanding the structure of the linear model for GLS regression provides insight into when and why phylogeny is important in comparative studies. 2. The limitations of the PIC regression analysis are the same as the limitations of the GLS model. In particular, phylogenetic covariance applies only to the response variable in the regression and the explanatory variable should be regarded as fixed. Calculation of PICs for explanatory variables should be treated as a mathematical idiosyncrasy of the PIC regression algorithm. 3. Since the GLS estimator is the best linear unbiased estimator (BLUE), the slope parameter estimated using PICs is also BLUE. 4. If the slope is estimated using different branch lengths for the explanatory and response variables in the PIC algorithm, the estimator is no longer the BLUE, so this is not recommended. Finally, we discuss whether or not and how to accommodate phylogenetic covariance in regression analyses, particularly in relation to the problem of phylogenetic uncertainty. This discussion is from both frequentist and Bayesian perspectives.
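The stated equivalence is easy to check numerically on a small example. The three-taxon tree ((A:1,B:1):1,C:2) and the trait values below are invented purely for illustration:

```python
import numpy as np

# Tree ((A:1,B:1):1,C:2) under Brownian motion: the expected trait
# covariance between tips equals their shared branch length from the root.
V = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 0.0],
              [0.0, 0.0, 2.0]])
x = np.array([1.0, 2.0, 4.0])
y = np.array([2.0, 3.0, 7.0])

# GLS slope with intercept: (X'V^{-1}X)^{-1} X'V^{-1}y.
X = np.column_stack([np.ones(3), x])
Vi = np.linalg.inv(V)
slope_gls = np.linalg.solve(X.T @ Vi @ X, X.T @ Vi @ y)[1]

# Phylogenetically independent contrasts (Felsenstein 1985), each
# standardized by its expected standard deviation.
u1x = (x[0] - x[1]) / np.sqrt(1 + 1)        # contrast at the (A,B) node
u1y = (y[0] - y[1]) / np.sqrt(1 + 1)
xab, yab = (x[0] + x[1]) / 2, (y[0] + y[1]) / 2
# The (A,B) ancestor gets a lengthened branch: 1 + (1*1)/(1+1) = 1.5.
u2x = (xab - x[2]) / np.sqrt(1.5 + 2)       # contrast at the root
u2y = (yab - y[2]) / np.sqrt(1.5 + 2)

# OLS through the origin on the contrasts.
slope_pic = (u1x * u1y + u2x * u2y) / (u1x**2 + u2x**2)

print(slope_gls, slope_pic)   # both equal 1.625
```

Both routes give exactly the same slope (1.625 for these values), which is the paper's central result.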
Levine, Matthew E; Albers, David J; Hripcsak, George
2016-01-01
Time series analysis methods have been shown to reveal clinical and biological associations in data collected in the electronic health record. We wish to develop reliable high-throughput methods for identifying adverse drug effects that are easy to implement and produce readily interpretable results. To move toward this goal, we used univariate and multivariate lagged regression models to investigate associations between twenty pairs of drug orders and laboratory measurements. Multivariate lagged regression models exhibited higher sensitivity and specificity than univariate lagged regression in the 20 examples, and incorporating autoregressive terms for labs and drugs produced more robust signals in cases of known associations among the 20 example pairings. Moreover, including inpatient admission terms in the model attenuated the signals for some cases of unlikely associations, demonstrating how multivariate lagged regression models' explicit handling of context-based variables can provide a simple way to probe for health-care processes that confound analyses of EHR data.
Polygenic scores via penalized regression on summary statistics.
Mak, Timothy Shin Heng; Porsch, Robert Milan; Choi, Shing Wan; Zhou, Xueya; Sham, Pak Chung
2017-09-01
Polygenic scores (PGS) summarize the genetic contribution of a person's genotype to a disease or phenotype. They can be used to group participants into different risk categories for diseases, and are also used as covariates in epidemiological analyses. A number of possible ways of calculating PGS have been proposed, and recently there has been much interest in methods that incorporate information available in published summary statistics. As there is no inherent information on linkage disequilibrium (LD) in summary statistics, a pertinent question is how we can use LD information available elsewhere to supplement such analyses. To answer this question, we propose a method for constructing PGS using summary statistics and a reference panel in a penalized regression framework, which we call lassosum. We also propose a general method for choosing the value of the tuning parameter in the absence of validation data. In our simulations, we showed that pseudovalidation often resulted in prediction accuracy that is comparable to using a dataset with validation phenotype and was clearly superior to the conservative option of setting the tuning parameter of lassosum to its lowest value. We also showed that lassosum achieved better prediction accuracy than simple clumping and P-value thresholding in almost all scenarios. It was also substantially faster and more accurate than the recently proposed LDpred.
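The core computation behind such a method, a coordinate-descent lasso driven only by summary-level SNP-phenotype correlations and an LD matrix from a reference panel, can be sketched as follows. This is a generic illustration of the idea, not the published lassosum implementation, and all inputs are synthetic:

```python
import numpy as np

def soft(z, l):
    """Soft-thresholding operator for the lasso."""
    return np.sign(z) * np.maximum(np.abs(z) - l, 0.0)

def summary_lasso(r, R, lam, n_iter=100):
    """Lasso via coordinate descent using only summary-level inputs:
    r -- SNP-phenotype correlations (from published summary statistics)
    R -- SNP-SNP correlation (LD) matrix from a reference panel
    """
    beta = np.zeros_like(r)
    for _ in range(n_iter):
        for j in range(len(r)):
            # residual correlation for SNP j, excluding its own contribution
            rj = r[j] - R[j] @ beta + R[j, j] * beta[j]
            beta[j] = soft(rj, lam) / R[j, j]
    return beta

# Tiny illustration: 3 SNPs, two in strong LD, only the first truly associated.
R = np.array([[1.0, 0.9, 0.0],
              [0.9, 1.0, 0.0],
              [0.0, 0.0, 1.0]])
beta_true = np.array([0.5, 0.0, 0.0])
r = R @ beta_true                     # implied marginal correlations
beta = summary_lasso(r, R, lam=0.05)
print(beta)                           # SNP 2's LD-driven signal is zeroed out
```

The penalty correctly zeroes out the second SNP, whose marginal correlation is due to LD alone, while individual-level genotype data are never touched.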
Testing the equality of nonparametric regression curves based on ...
African Journals Online (AJOL)
Abstract. In this work we propose a new methodology for the comparison of two regression functions f1 and f2 in the case of homoscedastic error structure and a fixed design. Our approach is based on the empirical Fourier coefficients of the regression functions f1 and f2 respectively. As our main results we obtain the ...
A Methodology for Generating Placement Rules that Utilizes Logistic Regression
Wurtz, Keith
2008-01-01
The purpose of this article is to provide the necessary tools for institutional researchers to conduct a logistic regression analysis and interpret the results. Aspects of the logistic regression procedure that are necessary to evaluate models are presented and discussed with an emphasis on cutoff values and choosing the appropriate number of…
Bayesian regression of piecewise homogeneous Poisson processes
Directory of Open Access Journals (Sweden)
Diego Sevilla
2015-12-01
In this paper, a Bayesian method for piecewise regression is adapted to handle counting-process data distributed as Poisson. A numerical code in Mathematica is developed and tested by analyzing simulated data. The resulting method is valuable for detecting breakpoints in the count rate of time series for Poisson processes. Received: 2 November 2015; Accepted: 27 November 2015; Edited by: R. Dickman; Reviewed by: M. Hutter, Australian National University, Canberra, Australia. DOI: http://dx.doi.org/10.4279/PIP.070018 Cite as: D J R Sevilla, Papers in Physics 7, 070018 (2015).
SPE dose prediction using locally weighted regression
International Nuclear Information System (INIS)
Hines, J. W.; Townsend, L. W.; Nichols, T. F.
2005-01-01
When astronauts are outside earth's protective magnetosphere, they are subject to large radiation doses resulting from solar particle events (SPEs). The total dose received from a major SPE in deep space could cause severe radiation poisoning. The dose is usually received over a 20-40 h time interval but the event's effects may be mitigated with an early warning system. This paper presents a method to predict the total dose early in the event. It uses a locally weighted regression model, which is easier to train and provides predictions as accurate as neural network models previously used. (authors)
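A locally weighted regression predictor of the kind described can be sketched in a few lines: fit a weighted straight line around each query point and evaluate it there. The dose-time series, kernel, and bandwidth below are invented stand-ins for the SPE data:

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic dose history: cumulative dose rising over a 40 h event
# (an illustrative stand-in for solar particle event dose data).
t_train = np.linspace(0, 40, 80)                      # hours into the event
dose_train = 100 / (1 + np.exp(-(t_train - 15) / 4)) + rng.normal(0, 2, 80)

def loess_predict(t0, t, y, bandwidth=3.0):
    """Locally weighted linear regression: fit a weighted straight line
    around the query point t0 and return its value at t0."""
    w = np.exp(-0.5 * ((t - t0) / bandwidth) ** 2)    # Gaussian kernel weights
    X = np.column_stack([np.ones_like(t), t - t0])
    W = np.diag(w)
    coef = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
    return coef[0]                                    # local intercept = fit at t0

pred = loess_predict(20.0, t_train, dose_train)
print(pred)   # smoothed dose estimate at t = 20 h
```

Because the model is just a weighted least squares solve per query, "training" amounts to storing the data, which is the ease-of-training advantage noted in the abstract.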
Principal component regression analysis with SPSS.
Liu, R X; Kuang, J; Gong, Q; Hou, X L
2003-06-01
The paper introduces the indices for multicollinearity diagnostics, the basic principle of principal component regression, and the method for determining the 'best' equation. A worked example describes how to carry out principal component regression analysis with SPSS 10.0, including all calculation steps of the principal component regression and all operations of the linear regression, factor analysis, descriptives, compute variable and bivariate correlations procedures in SPSS 10.0. Principal component regression analysis can be used to overcome the disturbance of multicollinearity, yielding a simpler, faster and accurate statistical analysis.
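The same principal component regression steps can be carried out outside SPSS. A minimal NumPy sketch on synthetic collinear data (standardize, extract components, regress on the leading scores, map back to predictor coefficients); all data and the choice of one component are illustrative:

```python
import numpy as np

rng = np.random.default_rng(6)
n = 200

# Three strongly collinear predictors: the situation PCR is meant to handle.
z = rng.normal(0, 1, n)
X = np.column_stack([z + rng.normal(0, 0.1, n) for _ in range(3)])
y = X.sum(axis=1) + rng.normal(0, 1, n)

# 1. Standardize, then extract principal components of the predictors.
Xs = (X - X.mean(0)) / X.std(0)
eigval, eigvec = np.linalg.eigh(np.cov(Xs, rowvar=False))
order = np.argsort(eigval)[::-1]          # sort components by variance
eigvec = eigvec[:, order]

# 2. Regress y on the leading component score(s) instead of the raw predictors.
k = 1                                     # components retained
T = Xs @ eigvec[:, :k]                    # principal component scores
A = np.column_stack([np.ones(n), T])
gamma = np.linalg.lstsq(A, y, rcond=None)[0]

# 3. Map back to coefficients on the standardized predictors.
beta_std = eigvec[:, :k] @ gamma[1:]
print(beta_std)
```

Dropping the low-variance components removes exactly the directions in which multicollinearity makes ordinary least squares unstable.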
Competing Risks Quantile Regression at Work
DEFF Research Database (Denmark)
Dlugosz, Stephan; Lo, Simon M. S.; Wilke, Ralf
2017-01-01
large-scale maternity duration data with multiple competing risks derived from German linked social security records to analyse how public policies are related to the length of economic inactivity of young mothers after giving birth. Our results show that the model delivers detailed insights...
Comparing parametric and nonparametric regression methods for panel data
DEFF Research Database (Denmark)
Czekaj, Tomasz Gerard; Henningsen, Arne
We investigate and compare the suitability of parametric and non-parametric stochastic regression methods for analysing production technologies and the optimal firm size. Our theoretical analysis shows that the most commonly used functional forms in empirical production analysis, Cobb-Douglas and Translog, are unsuitable for analysing the optimal firm size. We show that the Translog functional form implies an implausible linear relationship between the (logarithmic) firm size and the elasticity of scale, where the slope is artificially related to the substitutability between the inputs. … The practical applicability of the parametric and non-parametric regression methods is scrutinised and compared by an empirical example: we analyse the production technology and investigate the optimal size of Polish crop farms based on a firm-level balanced panel data set. A nonparametric specification test …
A flexible fuzzy regression algorithm for forecasting oil consumption estimation
International Nuclear Information System (INIS)
Azadeh, A.; Khakestani, M.; Saberi, M.
2009-01-01
Oil consumption plays a vital role in the socio-economic development of most countries. This study presents a flexible fuzzy regression algorithm for forecasting oil consumption based on standard economic indicators: annual population, cost of crude oil imports, gross domestic production (GDP) and annual oil production in the last period. The proposed algorithm uses analysis of variance (ANOVA) to select either fuzzy regression or conventional regression for future demand estimation. The significance of the proposed algorithm is threefold. First, it is flexible and identifies the best model based on the results of ANOVA and the mean absolute percentage error (MAPE), whereas previous studies chose the best-fitted fuzzy regression model based on MAPE or other relative error results. Second, the proposed approach may identify conventional regression as the best model for future oil consumption forecasting because of its dynamic structure, whereas previous studies assumed that fuzzy regression always provides the best solutions and estimates. Third, it utilizes the most standard independent variables for the regression models. To show the applicability and superiority of the proposed flexible fuzzy regression algorithm, data on oil consumption in Canada, the United States, Japan and Australia from 1990 to 2005 are used. The results show that the flexible algorithm provides an accurate solution for the oil consumption estimation problem and may be used by policy makers to foresee the behavior of oil consumption in various regions.
Energy Technology Data Exchange (ETDEWEB)
Alfke, H.; Alfke, B.; Froelich, J.J.; Klose, K.J.; Wagner, H.J. [Klinik fuer Strahlendiagnostik Philipps Univ. Marburg (Germany)
2003-08-01
Purpose: To analyze outcome and predictive factors for patient survival and patency rates of unresectable malignant biliary obstruction treated with percutaneous transhepatic insertion of metal stents. Materials and Methods: This is a retrospective analysis of 130 patients treated in one interventional radiology center, with data collected from patient records and by telephone interviews. The procedure-related data had been prospectively documented in a computer database. Kaplan-Meier analysis was performed for univariate and multivariate comparison of survival and patency rates, with the log-rank test used for different tumor types. Predictive factors for survival and 30-day mortality were analyzed by stepwise logistic regression. Results: Underlying causes of malignant biliary obstruction were cholangiocarcinoma in 50, pancreatic carcinoma in 29, liver metastases in 27, gallbladder carcinoma in 20, and other tumors in 4 patients. The technical success rate was 99%, the complication rate 27% and the 30-day mortality 11%. Primary patency rates (406 days with a median of 207 days) did not differ significantly for different tumor types. The survival rates were significantly (p = 0.03 by log-rank test) better for patients with cholangiocarcinoma than for patients with pancreatic carcinoma and liver metastases. Multiple regression analysis revealed no predictive factor for patient survival and 30-day mortality. Conclusion: Percutaneous transhepatic insertion of metal biliary endoprostheses offers good initial and long-term relief of jaundice caused by malignant biliary obstruction. Although survival rates for patients with cholangiocarcinoma are better than for other causes of malignant biliary obstruction, a clear predictive factor is lacking for patients undergoing palliative biliary stent insertion. (orig.)
Regression away from the mean: Theory and examples.
Schwarz, Wolf; Reike, Dennis
2018-02-01
Using a standard repeated measures model with arbitrary true score distribution and normal error variables, we present fundamental closed-form results which explicitly indicate the conditions under which regression effects towards the mean (RTM) and away from the mean are expected. Specifically, we show that for skewed and bimodal distributions many or even most cases will show a regression effect that is in expectation away from the mean, or that is not just towards but actually beyond the mean. We illustrate our results in quantitative detail with typical examples from experimental and biometric applications, which exhibit a clear regression-away-from-the-mean ('egression from the mean') signature. We aim not to repeal cautionary advice against potential RTM effects, but to present a balanced view of regression effects, based on a clear identification of the conditions governing the form that regression effects take in repeated measures designs.
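A closed-form special case makes the "away from the mean" effect concrete: with a symmetric two-point (bimodal) true-score distribution and standard normal error, the expected second score given the first is the posterior mean of the true score. The specific numbers below are illustrative, not taken from the paper:

```python
import numpy as np

# True score theta is +2 or -2 with equal probability (a bimodal distribution);
# each measurement is theta plus standard normal error.  Bayes' rule gives
#   E[T2 | T1 = t1] = E[theta | t1] = mu * tanh(mu * t1)   for mu = 2,
# since the posterior odds of theta = +mu vs -mu are exp(2 * mu * t1).
def expected_second(t1, mu=2.0):
    return mu * np.tanh(mu * t1)

t1 = 1.0                      # first score, above the grand mean of 0
e_t2 = expected_second(t1)
print(e_t2)                   # ≈ 1.93: expected to move AWAY from the mean
```

Because 2·tanh(2) ≈ 1.93 exceeds the observed 1.0, the expected second score lies farther from the grand mean than the first, the "egression" signature the paper formalizes for bimodal distributions.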
Unbalanced Regressions and the Predictive Equation
DEFF Research Database (Denmark)
Osterrieder, Daniela; Ventosa-Santaulària, Daniel; Vera-Valdés, J. Eduardo
Predictive return regressions with persistent regressors are typically plagued by (asymptotically) biased/inconsistent estimates of the slope, non-standard or potentially even spurious statistical inference, and regression unbalancedness. We alleviate the problem of unbalancedness in the theoreti…
Semiparametric regression during 2003–2007
Ruppert, David; Wand, M.P.; Carroll, Raymond J.
2009-01-01
Semiparametric regression is a fusion between parametric regression and nonparametric regression that integrates low-rank penalized splines, mixed model and hierarchical Bayesian methodology – thus allowing more streamlined handling of longitudinal and spatial correlation. We review progress in the field over the five-year period between 2003 and 2007. We find semiparametric regression to be a vibrant field with substantial involvement and activity, continual enhancement and widespread application.
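As a rough illustration of the low-rank penalized-spline machinery the review surveys, here is a minimal numpy sketch. The truncated-line basis, knot count, and smoothing parameter below are illustrative assumptions, not the authors' recommendations:

```python
import numpy as np

# Low-rank penalized spline fit of a single smooth term: a linear part
# plus a truncated-line basis at interior knots, with a ridge penalty on
# the knot coefficients (the mixed-model representation treats those
# coefficients as random effects).
rng = np.random.default_rng(6)
n = 300
x = np.sort(rng.uniform(0, 1, n))
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.3, n)

knots = np.linspace(0, 1, 22)[1:-1]             # 20 interior knots
Z = np.maximum(x[:, None] - knots[None, :], 0)  # truncated lines (x - k)+
C = np.column_stack([np.ones(n), x, Z])
lam = 1.0                                       # smoothing parameter
P = np.diag([0.0, 0.0] + [lam] * len(knots))    # penalize only knot part
beta = np.linalg.solve(C.T @ C + P, C.T @ y)
fit = C @ beta
rmse = np.sqrt(np.mean((fit - np.sin(2 * np.pi * x)) ** 2))
print(f"RMSE against the true curve: {rmse:.3f}")
```

In practice the smoothing parameter is chosen automatically, e.g. by restricted maximum likelihood in the mixed-model formulation, which is exactly the streamlining the abstract refers to.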
Gaussian process regression analysis for functional data
Shi, Jian Qing
2011-01-01
Gaussian Process Regression Analysis for Functional Data presents nonparametric statistical methods for functional regression analysis, specifically the methods based on a Gaussian process prior in a functional space. The authors focus on problems involving functional response variables and mixed covariates of functional and scalar variables. Covering the basics of Gaussian process regression, the first several chapters discuss functional data analysis, theoretical aspects based on the asymptotic properties of Gaussian process regression models, and new methodological developments for high-dimensional…
Significance testing in ridge regression for genetic data
Directory of Open Access Journals (Sweden)
De Iorio Maria
2011-09-01
Background Technological developments have increased the feasibility of large scale genetic association studies. Densely typed genetic markers are obtained using SNP arrays, next-generation sequencing technologies and imputation. However, SNPs typed using these methods can be highly correlated due to linkage disequilibrium among them, and standard multiple regression techniques fail with these data sets due to their high dimensionality and correlation structure. There has been increasing interest in using penalised regression in the analysis of high dimensional data. Ridge regression is one such penalised regression technique which does not perform variable selection, instead estimating a regression coefficient for each predictor variable. It is therefore desirable to obtain an estimate of the significance of each ridge regression coefficient. Results We develop and evaluate a test of significance for ridge regression coefficients. Using simulation studies, we demonstrate that the performance of the test is comparable to that of a permutation test, with the advantage of a much-reduced computational cost. We introduce the p-value trace, a plot of the negative logarithm of the p-values of ridge regression coefficients with increasing shrinkage parameter, which enables the visualisation of the change in p-value of the regression coefficients with increasing penalisation. We apply the proposed method to a lung cancer case-control data set from EPIC, the European Prospective Investigation into Cancer and Nutrition. Conclusions The proposed test is a useful alternative to a permutation test for the estimation of the significance of ridge regression coefficients, at a much-reduced computational cost. The p-value trace is an informative graphical tool for evaluating the results of a test of significance of ridge regression coefficients as the shrinkage parameter increases, and the proposed test makes its production computationally feasible.
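To make the idea concrete, here is a small numpy sketch of a Wald-type significance test for ridge coefficients. It is a generic version of the approach (closed-form approximate p-values as a cheap alternative to permutation); the exact statistic, degrees-of-freedom correction, and shrinkage handling in the article may differ:

```python
import numpy as np
from math import erf, sqrt

def ridge_pvalues(X, y, k):
    """Approximate Wald-type p-values for ridge coefficients (a sketch)."""
    n, p = X.shape
    A = X.T @ X + k * np.eye(p)
    Ainv = np.linalg.inv(A)
    beta = Ainv @ X.T @ y
    H = X @ Ainv @ X.T                        # ridge hat matrix
    resid = y - X @ beta
    dof = n - np.trace(2 * H - H @ H)         # effective residual dof
    sigma2 = resid @ resid / dof
    cov = sigma2 * Ainv @ (X.T @ X) @ Ainv    # sandwich covariance of beta
    z = beta / np.sqrt(np.diag(cov))
    pvals = np.array([2 * (1 - 0.5 * (1 + erf(abs(zi) / sqrt(2))))
                      for zi in z])
    return beta, pvals

# toy data: three correlated predictors, only the first has a real effect
rng = np.random.default_rng(1)
n = 200
base = rng.normal(size=n)
X = np.column_stack([base + rng.normal(0, 0.5, n) for _ in range(3)])
y = 2.0 * X[:, 0] + rng.normal(size=n)
beta, pvals = ridge_pvalues(X, y, k=5.0)
print(np.round(pvals, 4))
```

Because ridge shrinks and redistributes coefficients among correlated predictors, these p-values are approximate; the article's p-value trace examines how they change as the shrinkage parameter grows.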
Standards for Standardized Logistic Regression Coefficients
Menard, Scott
2011-01-01
Standardized coefficients in logistic regression analysis have the same utility as standardized coefficients in linear regression analysis. Although there has been no consensus on the best way to construct standardized logistic regression coefficients, there is now sufficient evidence to suggest a single best approach to the construction of a…
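One simple construction (standardizing each slope by its predictor's sample standard deviation) can be sketched as follows. Note this is just one of the candidate approaches such comparisons consider, not necessarily the article's recommended one, and the logistic fit below is a generic Newton-Raphson implementation:

```python
import numpy as np

def logistic_fit(X, y, iters=30):
    """Plain logistic regression via Newton-Raphson (column 0 = intercept)."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-(X @ beta)))
        W = p * (1 - p)
        beta += np.linalg.solve((X.T * W) @ X, X.T @ (y - p))
    return beta

rng = np.random.default_rng(2)
n = 5000
x1 = rng.normal(0, 1, n)            # predictor on a unit scale
x2 = rng.normal(0, 10, n)           # same true effect per SD, wider scale
eta = -0.5 + 0.8 * x1 + 0.08 * x2
y = rng.binomial(1, 1 / (1 + np.exp(-eta)))

X = np.column_stack([np.ones(n), x1, x2])
beta = logistic_fit(X, y)
# x-standardization: multiply each slope by its predictor's sample SD,
# putting all slopes on a per-standard-deviation scale
b_std = beta[1:] * np.array([x1.std(), x2.std()])
print(np.round(b_std, 2))
```

In this simulation the two predictors have the same effect per standard deviation, so the standardized slopes should agree (up to sampling error) even though the raw slopes differ by a factor of ten. Fully standardized variants additionally divide by an estimate of the standard deviation of the latent outcome, which is where the approaches diverge.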
A Seemingly Unrelated Poisson Regression Model
King, Gary
1989-01-01
This article introduces a new estimator for the analysis of two contemporaneously correlated endogenous event count variables. This seemingly unrelated Poisson regression model (SUPREME) estimator combines the efficiencies created by single equation Poisson regression model estimators and insights from "seemingly unrelated" linear regression models.
Directory of Open Access Journals (Sweden)
Qiutong Jin
2016-06-01
Estimating the spatial distribution of precipitation is an important and challenging task in hydrology, climatology, ecology, and environmental science. In order to generate a highly accurate distribution map of average annual precipitation for the Loess Plateau in China, multiple linear regression Kriging (MLRK) and geographically weighted regression Kriging (GWRK) methods were employed using precipitation data from the period 1980–2010 from 435 meteorological stations. The predictors in regression Kriging were selected by stepwise regression analysis from many auxiliary environmental factors, such as elevation (DEM), normalized difference vegetation index (NDVI), solar radiation, slope, and aspect. All predictor distribution maps had a 500 m spatial resolution. Validation precipitation data from 130 hydrometeorological stations were used to assess the prediction accuracies of the MLRK and GWRK approaches. Results showed that both prediction maps with a 500 m spatial resolution interpolated by MLRK and GWRK had a high accuracy and captured detailed spatial distribution data; however, MLRK produced a lower prediction error and a higher variance explanation than GWRK, although the differences were small, in contrast to conclusions from similar studies.
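The two-step logic of regression kriging (trend by regression, residuals by kriging) can be sketched in a few lines of numpy. Everything below is synthetic and simplified: one auxiliary predictor instead of the study's DEM/NDVI/radiation set, simple rather than ordinary kriging, and a covariance model with assumed parameters instead of one fitted to an empirical variogram:

```python
import numpy as np

rng = np.random.default_rng(3)

# toy data: a precipitation-like target driven by elevation plus a
# smooth spatial signal (all scales and parameters are illustrative)
n = 200
coords = rng.uniform(0, 100, size=(n, 2))    # station locations (km)
elev = rng.uniform(0, 2000, size=n)          # auxiliary predictor
spatial = 5 * np.sin(coords[:, 0] / 15)      # smooth spatial component
y = 600 - 0.1 * elev + spatial + rng.normal(0, 1, n)

# step 1: the deterministic trend, by linear regression on the predictor
A = np.column_stack([np.ones(n), elev])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
resid = y - A @ coef

# step 2: simple kriging of the residuals with an assumed exponential
# covariance (sill and range would normally come from a fitted variogram)
def cov(d, sill=12.0, range_km=20.0):
    return sill * np.exp(-d / range_km)

D = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)
K = cov(D) + 1e-4 * np.eye(n)                # tiny nugget for stability
w = np.linalg.solve(K, resid)

def predict(pt, pt_elev):
    """Regression-kriging prediction = regression trend + kriged residual."""
    d = np.linalg.norm(coords - pt, axis=1)
    return coef[0] + coef[1] * pt_elev + cov(d) @ w

print(round(predict(np.array([50.0, 50.0]), 1000.0), 1))
```

The final map in the study is exactly this sum of the two component predictions, evaluated on a 500 m grid of the auxiliary layers.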
International Nuclear Information System (INIS)
Geiser, Achim
2015-12-01
A variety of possible future analyses of HERA data in the context of the HERA data preservation programme is collected, motivated, and commented. The focus is placed on possible future analyses of the existing ep collider data and their physics scope. Comparisons to the original scope of the HERA programme are made, and cross references to topics also covered by other participants of the workshop are given. This includes topics on QCD, proton structure, diffraction, jets, hadronic final states, heavy flavours, electroweak physics, and the application of related theory and phenomenology topics like NNLO QCD calculations, low-x related models, nonperturbative QCD aspects, and electroweak radiative corrections. Synergies with other collider programmes are also addressed. In summary, the range of physics topics which can still be uniquely covered using the existing data is very broad and of considerable physics interest, often matching the interest of results from colliders currently in operation. Due to well-established data and MC sets, calibrations, and analysis procedures the manpower and expertise needed for a particular analysis is often very much smaller than that needed for an ongoing experiment. Since centrally funded manpower to carry out such analyses is not available any longer, this contribution not only targets experienced self-funded experimentalists, but also theorists and master-level students who might wish to carry out such an analysis.
Cao, Qingqing; Wu, Zhenqiang; Sun, Ying; Wang, Tiezhu; Han, Tengwei; Gu, Chaomei; Sun, Yehuan
2011-11-01
To explore the application of negative binomial regression and modified Poisson regression analysis in analyzing the influential factors for injury frequency and the risk factors leading to increased injury frequency. 2917 primary and secondary school students were selected from Hefei by cluster random sampling and surveyed by questionnaire. The count data on injury events were used to fit modified Poisson regression and negative binomial regression models, in order to explore the risk factors associated with increased unintentional injury frequency among juvenile students and to probe the efficiency of the two models in studying the influential factors for injury frequency. The Poisson model showed over-dispersion, and the modified Poisson regression and negative binomial regression models fitted better. Both showed that male gender, younger age, father working outside of the hometown, a guardian education level above junior high school, and smoking might be associated with higher injury frequencies. For clustered frequency data on injury events, both modified Poisson regression analysis and negative binomial regression analysis can be used. However, based on our data, the modified Poisson regression fitted better, and this model could give a more accurate interpretation of the relevant factors affecting the frequency of injury.
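A quick way to see why a plain Poisson model fails on such data is to fit one and compute the Pearson dispersion statistic. The sketch below uses simulated counts with a gamma frailty to induce over-dispersion; all numbers are illustrative and this is not the survey data:

```python
import numpy as np

def poisson_fit(X, y, iters=50):
    """Poisson GLM with log link, fitted by Newton-Raphson."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        mu = np.exp(X @ beta)
        beta += np.linalg.solve((X.T * mu) @ X, X.T @ (y - mu))
    return beta

# simulate over-dispersed counts: Poisson rates mixed over a gamma frailty
rng = np.random.default_rng(5)
n = 2000
x = rng.normal(size=n)
X = np.column_stack([np.ones(n), x])
frailty = rng.gamma(shape=2.0, scale=0.5, size=n)   # mean 1, extra variance
y = rng.poisson(np.exp(0.5 + 0.3 * x) * frailty)

beta = poisson_fit(X, y)
mu = np.exp(X @ beta)
dispersion = np.sum((y - mu) ** 2 / mu) / (n - X.shape[1])
print(f"Pearson dispersion: {dispersion:.2f}  (>1 suggests over-dispersion)")
```

A dispersion statistic well above 1 is the signal that a negative binomial model, or a modified Poisson model with a robust variance estimator, is the safer choice than the plain Poisson fit.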
Moderation analysis using a two-level regression model.
Yuan, Ke-Hai; Cheng, Ying; Maxwell, Scott
2014-10-01
Moderation analysis is widely used in social and behavioral research. The most commonly used model for moderation analysis is moderated multiple regression (MMR) in which the explanatory variables of the regression model include product terms, and the model is typically estimated by least squares (LS). This paper argues for a two-level regression model in which the regression coefficients of a criterion variable on predictors are further regressed on moderator variables. An algorithm for estimating the parameters of the two-level model by normal-distribution-based maximum likelihood (NML) is developed. Formulas for the standard errors (SEs) of the parameter estimates are provided and studied. Results indicate that, when heteroscedasticity exists, NML with the two-level model gives more efficient and more accurate parameter estimates than the LS analysis of the MMR model. When error variances are homoscedastic, NML with the two-level model leads to essentially the same results as LS with the MMR model. Most importantly, the two-level regression model permits estimating the percentage of variance of each regression coefficient that is due to moderator variables. When applied to data from General Social Surveys 1991, NML with the two-level model identified a significant moderation effect of race on the regression of job prestige on years of education while LS with the MMR model did not. An R package is also developed and documented to facilitate the application of the two-level model.
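For reference, the baseline MMR model that the two-level approach is compared against is just least squares with a product term. A minimal simulated example (variable names and effect sizes are our own):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 1000
x = rng.normal(size=n)                  # predictor
z = rng.normal(size=n)                  # moderator
# moderation: the slope of y on x depends on z (true slope = 0.5 + 0.4 z)
y = 1.0 + (0.5 + 0.4 * z) * x + 0.3 * z + rng.normal(size=n)

# MMR: regress y on x, z, and the product term x*z by least squares
X = np.column_stack([np.ones(n), x, z, x * z])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print("intercept, x, z, x*z:", np.round(beta, 2))
```

The coefficient on x*z estimates how the slope of y on x changes per unit of the moderator; the two-level model instead regresses that slope on z directly and, under heteroscedasticity, estimates it more efficiently.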
The purpose of this report is to provide a reference manual that could be used by investigators for making informed use of logistic regression using two methods (standard logistic regression and MARS). The details for analyses of relationships between a dependent binary response ...
Regression with Sparse Approximations of Data
DEFF Research Database (Denmark)
Noorzad, Pardis; Sturm, Bob L.
2012-01-01
We propose sparse approximation weighted regression (SPARROW), a method for local estimation of the regression function that uses sparse approximation with a dictionary of measurements. SPARROW estimates the regression function at a point with a linear combination of a few regressands selected by a sparse approximation of the point in terms of the regressors. We show SPARROW can be considered a variant of \(k\)-nearest neighbors regression (\(k\)-NNR), and more generally, local polynomial kernel regression. Unlike \(k\)-NNR, however, SPARROW can adapt the number of regressors to use based…
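Since SPARROW is presented as a variant of \(k\)-NN regression, the baseline it generalizes is worth stating in code. A minimal sketch (the sparse-approximation selection step itself is not implemented here):

```python
import numpy as np

def knnr(X, y, x0, k=5):
    """Plain k-nearest-neighbours regression: average the regressands of
    the k training points closest to x0. SPARROW replaces this fixed-size
    neighbourhood with the regressors selected by a sparse approximation
    of x0, so the number of neighbours adapts to the query point."""
    idx = np.argsort(np.linalg.norm(X - x0, axis=1))[:k]
    return y[idx].mean()

# toy 1-D example: recover sin(x) from noiseless samples on a grid
X = np.linspace(0.0, 2 * np.pi, 500).reshape(-1, 1)
y = np.sin(X[:, 0])
print(round(knnr(X, y, np.array([0.5]), k=5), 3))
```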
Spontaneous regression of a congenital melanocytic nevus
Directory of Open Access Journals (Sweden)
Amiya Kumar Nath
2011-01-01
Congenital melanocytic nevus (CMN) may rarely regress, which may also be associated with a halo or vitiligo. We describe a 10-year-old girl who presented with CMN on the left leg since birth, which recently started to regress spontaneously with associated depigmentation in the lesion and at a distant site. Dermoscopy performed at different sites of the regressing lesion demonstrated loss of epidermal pigments first, followed by loss of dermal pigments. Histopathology and Masson-Fontana stain demonstrated lymphocytic infiltration and loss of pigment production in the regressing area. Immunohistochemistry staining (S100 and HMB-45), however, showed that nevus cells were present in the regressing areas.
Mapping geogenic radon potential by regression kriging
Energy Technology Data Exchange (ETDEWEB)
Pásztor, László [Institute for Soil Sciences and Agricultural Chemistry, Centre for Agricultural Research, Hungarian Academy of Sciences, Department of Environmental Informatics, Herman Ottó út 15, 1022 Budapest (Hungary); Szabó, Katalin Zsuzsanna, E-mail: sz_k_zs@yahoo.de [Department of Chemistry, Institute of Environmental Science, Szent István University, Páter Károly u. 1, Gödöllő 2100 (Hungary); Szatmári, Gábor; Laborczi, Annamária [Institute for Soil Sciences and Agricultural Chemistry, Centre for Agricultural Research, Hungarian Academy of Sciences, Department of Environmental Informatics, Herman Ottó út 15, 1022 Budapest (Hungary); Horváth, Ákos [Department of Atomic Physics, Eötvös University, Pázmány Péter sétány 1/A, 1117 Budapest (Hungary)
2016-02-15
Radon (²²²Rn) gas is produced in the radioactive decay chain of uranium (²³⁸U), which is an element that is naturally present in soils. Radon is transported mainly by diffusion and convection mechanisms through the soil, depending mainly on the physical and meteorological parameters of the soil, and can enter and accumulate in buildings. Health risks originating from indoor radon concentration can be attributed to natural factors and are characterized by the geogenic radon potential (GRP). Identification of areas with high health risks requires spatial modeling, that is, mapping of radon risk. In addition to geology and meteorology, physical soil properties play a significant role in the determination of GRP. In order to compile a reliable GRP map for a model area in Central Hungary, spatial auxiliary information representing GRP-forming environmental factors was taken into account to support the spatial inference of the locally measured GRP values. Since the number of measured sites was limited, efficient spatial prediction methodologies were sought to construct a reliable map for a larger area. Regression kriging (RK) was applied for the interpolation using spatially exhaustive auxiliary data on soil, geology, topography, land use and climate. RK divides the spatial inference into two parts. Firstly, the deterministic component of the target variable is determined by a regression model. The residuals of the multiple linear regression analysis represent the spatially varying but dependent stochastic component, which are interpolated by kriging. The final map is the sum of the two component predictions. Overall accuracy of the map was tested by Leave-One-Out Cross-Validation. Furthermore, the spatial reliability of the resultant map is also estimated by calculating the 90% prediction interval of the local prediction values. The applicability of the applied method as well as that of the map is discussed briefly. - Highlights: • A new method
Regression calibration with more surrogates than mismeasured variables
Kipnis, Victor; Midthune, Douglas; Freedman, Laurence S.; Carroll, Raymond J.
2012-01-01
In a recent paper (Weller EA, Milton DK, Eisen EA, Spiegelman D. Regression calibration for logistic regression with multiple surrogates for one exposure. Journal of Statistical Planning and Inference 2007; 137: 449-461), the authors discussed fitting logistic regression models when a scalar main explanatory variable is measured with error by several surrogates, that is, a situation with more surrogates than variables measured with error. They compared two methods of adjusting for measurement error using a regression calibration approximate model as if it were exact. One is the standard regression calibration approach consisting of substituting an estimated conditional expectation of the true covariate given observed data in the logistic regression. The other is a novel two-stage approach when the logistic regression is fitted to multiple surrogates, and then a linear combination of estimated slopes is formed as the estimate of interest. Applying estimated asymptotic variances for both methods in a single data set with some sensitivity analysis, the authors asserted superiority of their two-stage approach. We investigate this claim in some detail. A troubling aspect of the proposed two-stage method is that, unlike standard regression calibration and a natural form of maximum likelihood, the resulting estimates are not invariant to reparameterization of nuisance parameters in the model. We show, however, that, under the regression calibration approximation, the two-stage method is asymptotically equivalent to a maximum likelihood formulation, and is therefore in theory superior to standard regression calibration. However, our extensive finite-sample simulations in the practically important parameter space where the regression calibration model provides a good approximation failed to uncover such superiority of the two-stage method. We also discuss extensions to different data structures.
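The standard regression calibration approach (the first of the two methods compared) is easy to sketch. The example below simulates one mismeasured exposure with two surrogates and assumes a validation subsample where the true exposure is observed; the calibration design, sample sizes, and effect sizes are all illustrative:

```python
import numpy as np

def logistic_fit(X, y, iters=30):
    """Logistic regression via Newton-Raphson (column 0 = intercept)."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-(X @ beta)))
        beta += np.linalg.solve((X.T * (p * (1 - p))) @ X, X.T @ (y - p))
    return beta

rng = np.random.default_rng(7)
n = 5000
x = rng.normal(size=n)                   # true exposure (mostly unobserved)
w1 = x + rng.normal(0, 0.5, size=n)      # surrogate 1
w2 = x + rng.normal(0, 0.8, size=n)      # surrogate 2
ycase = rng.binomial(1, 1 / (1 + np.exp(-(-0.5 + 1.0 * x))))

# stage 1 (calibration): estimate E[x | w1, w2] by linear regression on a
# validation subsample where the true exposure is assumed observed
val = slice(0, 1000)
W = np.column_stack([np.ones(n), w1, w2])
gamma, *_ = np.linalg.lstsq(W[val], x[val], rcond=None)
xhat = W @ gamma

# stage 2: substitute the calibrated exposure into the logistic regression
beta = logistic_fit(np.column_stack([np.ones(n), xhat]), ycase)
print(np.round(beta, 2))

# for contrast, the naive fit on a single surrogate is attenuated
naive = logistic_fit(np.column_stack([np.ones(n), w1]), ycase)
print(np.round(naive, 2))
```

The two-stage alternative discussed in the paper instead fits the logistic regression to the surrogates directly and then forms a linear combination of the estimated slopes; the authors show it is asymptotically equivalent to a maximum likelihood formulation under the regression calibration approximation.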
BANK FAILURE PREDICTION WITH LOGISTIC REGRESSION
Directory of Open Access Journals (Sweden)
Taha Zaghdoudi
2013-04-01
In recent years, the economic and financial world has been shaken by a wave of financial crises that resulted in fairly huge bank losses. Several authors have focused on the study of these crises in order to develop early warning models, and our work takes its inspiration from the same path. Indeed, we have tried to develop a predictive model of Tunisian bank failures using the binary logistic regression method. The specificity of our prediction model is that it takes into account microeconomic indicators of bank failures. The results obtained using our provisional model show that a bank's ability to repay its debt, the coefficient of banking operations, bank profitability per employee, and the financial leverage ratio have a negative impact on the probability of failure.
Regression testing in the TOTEM DCS
International Nuclear Information System (INIS)
Rodríguez, F Lucas; Atanassov, I; Burkimsher, P; Frost, O; Taskinen, J; Tulimaki, V
2012-01-01
The Detector Control System of the TOTEM experiment at the LHC is built with the industrial product WinCC OA (PVSS). The TOTEM system is generated automatically through scripts using as input the detector Product Breakdown Structure (PBS), its pinout connectivity, archiving and alarm metainformation, and some other heuristics based on the naming conventions. When those initial parameters and automation code are modified to include new features, the resulting PVSS system can also introduce side-effects. On a daily basis, a custom-developed regression testing tool takes the most recent code from a Subversion (SVN) repository and builds a new control system from scratch. This system is exported in plain text format using the PVSS export tool and compared with a system previously validated by a human. A report is sent to the developers with any differences highlighted, in readiness for validation and acceptance as a new stable version. This regression approach is not dependent on any development framework or methodology. The process has run satisfactorily for several months, proving to be a very valuable tool before deploying new versions in the production systems.
Hartling, Lisa; Featherstone, Robin; Nuspl, Megan; Shave, Kassi; Dryden, Donna M; Vandermeer, Ben
2017-04-19
Systematic reviews (SRs) are an important source of information about healthcare interventions. A key component of a well-conducted SR is a comprehensive literature search. There is limited evidence on the contribution of non-English reports, unpublished studies, and dissertations and their impact on results of meta-analyses. Our sample included SRs from three Cochrane Review Groups: Acute Respiratory Infections (ARI), Infectious Diseases (ID), Developmental Psychosocial and Learning Problems (DPLP) (n = 129). Outcomes included: 1) proportion of reviews that searched for and included each study type; 2) proportion of relevant studies represented by each study type; and 3) impact on results and conclusions of the primary meta-analysis for each study type. Most SRs searched for non-English studies; however, these were included in only 12% of reviews and represented less than 5% of included studies. There was a change in results in only four reviews (total sample = 129); in two cases the change did not have an impact on the statistical or clinical significance of results. Most SRs searched for unpublished studies but the majority did not include these (only 6%) and they represented 2% of included studies. In most cases the impact of including unpublished studies was small; a substantial impact was observed in one case that relied solely on unpublished data. Few reviews in ARI (9%) and ID (3%) searched for dissertations compared to 65% in DPLP. Overall, dissertations were included in only nine SRs and represented less than 2% of included studies. In the majority of cases the change in results was negligible or small; in the case where a large change was noted, the estimate was more conservative without dissertations. The majority of SRs searched for non-English and unpublished studies; however, these represented a small proportion of included studies and rarely impacted the results and conclusions of the review. Inclusion of these study types may have an impact
Parameters Estimation of Geographically Weighted Ordinal Logistic Regression (GWOLR) Model
Zuhdi, Shaifudin; Retno Sari Saputro, Dewi; Widyaningsih, Purnami
2017-06-01
A regression model represents the relationship between independent variables and a dependent variable. In logistic regression the dependent variable is categorical, and the model is used to calculate the odds. When the levels of the dependent variable are ordered, the logistic regression model is ordinal. The GWOLR model is an ordinal logistic regression model influenced by the geographical location of the observation site. Parameter estimation in the model is needed to determine population values based on a sample. The purpose of this research is the parameter estimation of the GWOLR model using R software. Parameter estimation uses data on the number of dengue fever patients in Semarang City; the observation units are 144 villages in Semarang City. The results of the research give a local GWOLR model for each village and the probability of each dengue fever patient-count category.
Short-term load forecasting with increment regression tree
Energy Technology Data Exchange (ETDEWEB)
Yang, Jingfei; Stenzel, Juergen [Darmstadt University of Techonology, Darmstadt 64283 (Germany)
2006-06-15
This paper presents a new regression tree method for short-term load forecasting. Both increment and non-increment trees are built from the historical data to provide the data space partition and input variable selection. A support vector machine is applied to the samples in the regression tree nodes for further fine regression. Results from different tree nodes are integrated through a weighted average method to obtain the comprehensive forecasting result. The effectiveness of the proposed method is demonstrated through its application to an actual system. (author)
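The partitioning step at the heart of such a method can be sketched with a minimal greedy regression tree. This is a generic illustration only: the paper's method additionally builds increment trees, fits support vector machines within the nodes, and blends the node results, none of which is reproduced here:

```python
import numpy as np

def build_tree(x, y, depth=3, min_leaf=10):
    """Minimal univariate regression tree grown by greedy SSE reduction."""
    if depth == 0 or len(y) < 2 * min_leaf:
        return ("leaf", y.mean())
    order = np.argsort(x)
    xs, ys = x[order], y[order]
    best = None
    for i in range(min_leaf, len(y) - min_leaf):
        sse = ys[:i].var() * i + ys[i:].var() * (len(y) - i)
        if best is None or sse < best[0]:
            best = (sse, xs[i])
    thr = best[1]
    left = x < thr
    return ("split", thr,
            build_tree(x[left], y[left], depth - 1, min_leaf),
            build_tree(x[~left], y[~left], depth - 1, min_leaf))

def predict(tree, x0):
    while tree[0] == "split":
        tree = tree[2] if x0 < tree[1] else tree[3]
    return tree[1]

# toy "load curve": demand as a step function of the hour of the day
rng = np.random.default_rng(8)
hour = rng.uniform(0, 24, 500)
load = 50 + 30 * (hour > 7) + 20 * (hour > 18) + rng.normal(0, 2, 500)
tree = build_tree(hour, load)
print(round(predict(tree, 12.0), 1))   # midday load, near 80
```

Each leaf holds the mean load of its region of the input space; the paper refines exactly these node samples with an SVM instead of a constant, and weights the node results when combining them.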
Sampayo-Cordero, Miguel; Miguel-Huguet, Bernat; Pardo-Mateos, Almudena; Moltó-Abad, Marc; Muñoz-Delgado, Cecilia; Pérez-López, Jordi
2018-02-01
Case reports might have a prominent role in the rare diseases field, due to the small number of patients affected by any one such disease. A previous systematic review regarding the efficacy of laronidase therapy in patients with mucopolysaccharidosis type I (MPS-I) who initiated enzyme replacement therapy (ERT) in adult age has been published. The review included a meta-analysis of 19 clinical studies and the description of eleven case reports. It was of interest to perform a meta-analysis of those case reports to explore the role of such meta-analyses as a tool for evidence-based medicine in rare diseases. The study included all case reports with a standard treatment regimen. The primary analysis was the percentage of case reports showing an improvement in a specific outcome. Only when that percentage was statistically higher than 5% was the improvement confirmed as such. The outcomes that fulfilled this criterion were ranked and compared to the GRADE criteria obtained by those same outcomes in the previous meta-analysis of clinical studies. Three outcomes showed a significant improvement: urine glycosaminoglycans, liver volume, and the 6-minute walking test. Positive and negative predictive values, sensitivity and specificity for the results of the meta-analysis of case reports as compared to that of clinical studies were 100%, 88.9%, 75% and 100%, respectively. Accordingly, absolute (Rho=0.82, 95%CI: 0.47 to 0.95) and relative agreement (Kappa=0.79, 95%CI: 0.593 to 0.99) between the number of case reports with improvement in a specific outcome and the GRADE evidence score for that outcome were good. Sensitivity analysis showed that agreement between the meta-analysis of case reports and that of the clinical studies was good only when using a strong confirmatory strategy for outcome improvement in case reports. We found an agreement between the results of meta-analyses from case reports and from clinical studies in the efficacy of laronidase therapy in
Regression analysis of nuclear plant capacity factors
International Nuclear Information System (INIS)
Stocks, K.J.; Faulkner, J.I.
1980-07-01
Operating data on all commercial nuclear power plants of the PWR, HWR, BWR and GCR types in the Western World are analysed statistically to determine whether the explanatory variables size, year of operation, vintage and reactor supplier are significant in accounting for the variation in capacity factor. The results are compared with a number of previous studies which analysed only United States reactors. The possibility of specification errors affecting the results is also examined. Although, in general, the variables considered are statistically significant, they explain only a small portion of the variation in the capacity factor. The equations thus obtained should certainly not be used to predict the lifetime performance of future large reactors
Poisson Mixture Regression Models for Heart Disease Prediction
Erol, Hamza
2016-01-01
Early heart disease control can be achieved by high disease prediction and diagnosis efficiency. This paper focuses on the use of model-based clustering techniques to predict and diagnose heart disease via Poisson mixture regression models. Analysis and application of Poisson mixture regression models is addressed under two different classes: standard and concomitant variable mixture regression models. Results show that a two-component concomitant variable Poisson mixture regression model predicts heart disease better than both the standard Poisson mixture regression model and the ordinary general linear Poisson regression model, due to its low Bayesian Information Criterion value. Furthermore, a zero-inflated Poisson mixture regression model turned out to be the best model overall for heart disease prediction, as it both clusters individuals into high- or low-risk categories and predicts the heart disease rate component-wise given the available clusters. It is deduced that heart disease prediction can be done effectively by identifying the major risks component-wise using a Poisson mixture regression model. PMID:27999611
Analyses of developmental rate isomorphy in ectotherms: Introducing the dirichlet regression
Czech Academy of Sciences Publication Activity Database
Boukal S., David; Ditrich, Tomáš; Kutcherov, D.; Sroka, Pavel; Dudová, Pavla; Papáček, M.
2015-01-01
Vol. 10, No. 6 (2015), e0129341 E-ISSN 1932-6203 R&D Projects: GA ČR GAP505/10/0096 Grant - others: European Fund (CZ) PERG04-GA-2008-239543; GA JU (CZ) 145/2013/P Institutional support: RVO:60077344 Keywords: ectotherms Subject RIV: ED - Physiology Impact factor: 3.057, year: 2015 http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0129341
The benefits of using quantile regression for analysing the effect of weeds on organic winter wheat
Casagrande, M.; Makowski, D.; Jeuffroy, M.H.; Valantin-Morison, M.; David, C.
2010-01-01
In organic farming, weeds are one of the threats that limit crop yield. An early prediction of weed effect on yield loss and the size of late weed populations could help farmers and advisors to improve weed management. Numerous studies predicting the effect of weeds on yield have already been…
Gilstrap, Donald L.
2013-01-01
In addition to qualitative methods presented in chaos and complexity theories in educational research, this article addresses quantitative methods that may show potential for future research studies. Although much in the social and behavioral sciences literature has focused on computer simulations, this article explores current chaos and…
DEFF Research Database (Denmark)
Scott, Neil W.; Fayers, Peter M.; Aaronson, Neil K.
2010-01-01
Differential item functioning (DIF) methods can be used to determine whether different subgroups respond differently to particular items within a health-related quality of life (HRQoL) subscale, after allowing for overall subgroup differences in that scale. This article reviews issues that arise...
Energy Technology Data Exchange (ETDEWEB)
Tchamen, G.W.; Gaucher, J. [Hydro-Quebec Production, Montreal, PQ (Canada). Direction Barrage et Environnement, Unite Barrages et Hydraulique
2010-08-15
Owners and operators of high capacity dams in Quebec have a legal obligation to conduct dam break analysis for each of their dams in order to ensure public safety. This paper described traditional hydraulic methodologies and models used to perform dam break analyses. In particular, it examined the influence of the reservoir drawdown submodel on the numerical results of a dam break analysis. Numerical techniques from the field of fluid mechanics and aerodynamics have provided the basis for developing effective hydrodynamic codes that reduce the level of uncertainties associated with dam-break analysis. A static representation that considers the storage curve was compared with a dynamic representation based on Saint-Venant equations and the real bathymetry of the reservoir. The comparison was based on breach of reservoir, maximum water level, flooded area, and wave arrival time in the valley downstream. The study showed that the greatest difference in attained water level was in the vicinity of the dam, and the difference decreased as the distance from the reservoir increased. The analysis showed that the static representation overestimated the maximum depth and inundated area by as much as 20 percent. This overestimation can be reduced by 30 to 40 percent by using dynamic representation. A dynamic model based on a synthetic trapezoidal reconstruction of the storage curve was used, given the lack of bathymetric data for the reservoir. It was concluded that this model can significantly reduce the uncertainty associated with the static model. 7 refs., 9 tabs., 7 figs.
Intermediate and advanced topics in multilevel logistic regression analysis.
Austin, Peter C; Merlo, Juan
2017-09-10
Multilevel data occur frequently in health services, population and public health, and epidemiologic research. In such research, binary outcomes are common. Multilevel logistic regression models allow one to account for the clustering of subjects within clusters of higher-level units when estimating the effect of subject and cluster characteristics on subject outcomes. A search of the PubMed database demonstrated that the use of multilevel or hierarchical regression models is increasing rapidly. However, our impression is that many analysts simply use multilevel regression models to account for the nuisance of within-cluster homogeneity that is induced by clustering. In this article, we describe a suite of analyses that can complement the fitting of multilevel logistic regression models. These ancillary analyses permit analysts to estimate the marginal or population-average effect of covariates measured at the subject and cluster level, in contrast to the within-cluster or cluster-specific effects arising from the original multilevel logistic regression model. We describe the interval odds ratio and the proportion of opposed odds ratios, which are summary measures of effect for cluster-level covariates. We describe the variance partition coefficient and the median odds ratio, which are measures of components of variance and heterogeneity in outcomes. These measures allow one to quantify the magnitude of the general contextual effect. We describe an R² measure that allows analysts to quantify the proportion of variation explained by different multilevel logistic regression models. We illustrate the application and interpretation of these measures by analyzing mortality in patients hospitalized with a diagnosis of acute myocardial infarction. © 2017 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd.
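Two of the heterogeneity measures named above, the variance partition coefficient (VPC) and the median odds ratio (MOR), have simple closed forms given the estimated between-cluster variance on the logit scale. A sketch using the standard textbook formulas (not code from the article):

```python
from math import exp, sqrt, pi
from statistics import NormalDist

def vpc(sigma2_u):
    # Variance partition coefficient on the latent (logit) scale:
    # sigma2_u / (sigma2_u + pi^2 / 3), where pi^2/3 is the logistic variance
    return sigma2_u / (sigma2_u + pi ** 2 / 3)

def median_odds_ratio(sigma2_u):
    # MOR = exp( sqrt(2 * sigma2_u) * Phi^{-1}(0.75) )
    return exp(sqrt(2 * sigma2_u) * NormalDist().inv_cdf(0.75))
```

For example, a between-cluster variance of 1.0 on the logit scale yields an MOR of about 2.6: comparing two identical subjects from two randomly chosen clusters, the median of the higher-to-lower odds ratio is roughly 2.6.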
Adjusting for Confounding in Early Postlaunch Settings: Going Beyond Logistic Regression Models.
Schmidt, Amand F; Klungel, Olaf H; Groenwold, Rolf H H
2016-01-01
Postlaunch data on medical treatments can be analyzed to explore adverse events or relative effectiveness in real-life settings. These analyses are often complicated by the number of potential confounders and the possibility of model misspecification. We conducted a simulation study to compare the performance of logistic regression, propensity score, disease risk score, and stabilized inverse probability weighting methods to adjust for confounding. Model misspecification was induced in the independent derivation dataset. We evaluated performance using relative bias and confidence interval coverage of the true effect, among other metrics. At low events per coefficient (1.0 and 0.5), the logistic regression estimates had a large relative bias (greater than -100%). Bias of the disease risk score estimates was at most 13.48% and 18.83%. For the propensity score model, this was 8.74% and >100%, respectively. At events per coefficient of 1.0 and 0.5, inverse probability weighting frequently failed or reduced to a crude regression, resulting in biases of -8.49% and 24.55%. Coverage of logistic regression estimates became less than the nominal level at events per coefficient ≤5. For the disease risk score, inverse probability weighting, and propensity score, coverage became less than nominal at events per coefficient ≤2.5, ≤1.0, and ≤1.0, respectively. Bias of misspecified disease risk score models was 16.55%. In settings with low events/exposed subjects per coefficient, disease risk score methods can be useful alternatives to logistic regression models, especially when propensity score models cannot be used. Despite the better performance of disease risk score methods than logistic regression and propensity score models in small events per coefficient settings, bias and coverage still deviated from nominal.
Conjoined legs: Sirenomelia or caudal regression syndrome?
Directory of Open Access Journals (Sweden)
Sakti Prasad Das
2013-01-01
Presence of a single umbilical persistent vitelline artery distinguishes sirenomelia from caudal regression syndrome. We report the case of a 12-year-old boy who had bilateral umbilical arteries and presented with fusion of both legs in the lower one-third of the leg. Both feet were rudimentary. The right foot had a valgus rocker-bottom deformity. All toes were present but rudimentary. The left foot showed absence of all toes. Physical examination showed left tibia vara. Chest evaluation in sitting revealed pigeon chest and an elevated right shoulder. Posterior examination of the trunk showed thoracic scoliosis with convexity to the right. The patient was operated on, and at 1-year follow-up the boy had two separate legs with good aesthetic and functional results.
Conjoined legs: Sirenomelia or caudal regression syndrome?
Das, Sakti Prasad; Ojha, Niranjan; Ganesh, G Shankar; Mohanty, Ram Narayan
2013-07-01
Presence of a single umbilical persistent vitelline artery distinguishes sirenomelia from caudal regression syndrome. We report the case of a 12-year-old boy who had bilateral umbilical arteries and presented with fusion of both legs in the lower one-third of the leg. Both feet were rudimentary. The right foot had a valgus rocker-bottom deformity. All toes were present but rudimentary. The left foot showed absence of all toes. Physical examination showed left tibia vara. Chest evaluation in sitting revealed pigeon chest and an elevated right shoulder. Posterior examination of the trunk showed thoracic scoliosis with convexity to the right. The patient was operated on, and at 1-year follow-up the boy had two separate legs with good aesthetic and functional results.
Logistic regression against a divergent Bayesian network
Directory of Open Access Journals (Sweden)
Noel Antonio Sánchez Trujillo
2015-01-01
This article is a discussion of two statistical tools used for prediction and causality assessment: logistic regression and Bayesian networks. Using data from a simulated example from a study assessing factors that might predict pulmonary emphysema (where fingertip pigmentation and smoking are considered), we posed the following questions. Is pigmentation a confounding, causal or predictive factor? Is there perhaps another factor, like smoking, that confounds? Is there a synergy between pigmentation and smoking? The results, in terms of prediction, are similar with the two techniques; regarding causation, differences arise. We conclude that, in decision-making, combining a statistical tool, used with common sense, with previous evidence that took years or even centuries to develop is better than the automatic and exclusive use of statistical resources.
Entrepreneurial intention modeling using hierarchical multiple regression
Directory of Open Access Journals (Sweden)
Marina Jeger
2014-12-01
The goal of this study is to identify the contribution of effectuation dimensions to the predictive power of the entrepreneurial intention model over and above that which can be accounted for by other predictors selected and confirmed in previous studies. As is often the case in social and behavioral studies, some variables are likely to be highly correlated with each other. Therefore, the relative amount of variance in the criterion variable explained by each of the predictors depends on several factors such as the order of variable entry and sample specifics. The results show the modest predictive power of two dimensions of effectuation prior to the introduction of the theory of planned behavior elements. The article highlights the main advantages of applying hierarchical regression in social sciences as well as in the specific context of entrepreneurial intention formation, and addresses some of the potential pitfalls that this type of analysis entails.
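The blockwise logic of hierarchical regression, entering control variables first, then the predictors of interest, and reading off the change in R², can be sketched as follows (a hypothetical simulation, not the study's data; all variable names are illustrative):

```python
import numpy as np

def r_squared(X, y):
    # OLS fit with intercept; returns the coefficient of determination
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))

rng = np.random.default_rng(0)
n = 500
controls = rng.standard_normal((n, 2))           # block 1: background predictors
effectuation = rng.standard_normal((n, 2))       # block 2: predictors of interest
y = (controls @ np.array([0.5, 0.3])
     + effectuation @ np.array([0.4, 0.0])
     + rng.standard_normal(n))

r2_base = r_squared(controls, y)
r2_full = r_squared(np.column_stack([controls, effectuation]), y)
delta_r2 = r2_full - r2_base                     # unique contribution of block 2
```

As the abstract notes, with correlated predictors this ΔR² depends on the order of entry, which is exactly why the entry order must be justified theoretically.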
Gaussian process regression for geometry optimization
Denzel, Alexander; Kästner, Johannes
2018-03-01
We implemented a geometry optimizer based on Gaussian process regression (GPR) to find minimum structures on potential energy surfaces. We tested both a twice-differentiable form of the Matérn kernel and the squared exponential kernel. The Matérn kernel performs much better. We give a detailed description of the optimization procedures. These include overshooting the step resulting from GPR in order to obtain a higher degree of interpolation versus extrapolation. In a benchmark against the Limited-memory Broyden-Fletcher-Goldfarb-Shanno optimizer of the DL-FIND library on 26 test systems, we found the new optimizer to generally reduce the number of required optimization steps.
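A stripped-down illustration of the idea of GPR as a surrogate for a potential energy surface: fit a GP with a squared exponential kernel to a few energy evaluations of a toy 1-D potential and take the minimizer of the posterior mean. This is a sketch only, far simpler than the paper's optimizer (which uses gradients, a Matérn kernel, and overshooting); the toy potential and all parameters are hypothetical.

```python
import numpy as np

def sq_exp_kernel(a, b, ell=0.6):
    # Squared exponential (RBF) covariance between two 1-D point sets
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / ell) ** 2)

def gpr_mean(x_train, y_train, x_query, noise=1e-6):
    # GP posterior mean with a constant (sample-mean) prior mean
    mu = y_train.mean()
    K = sq_exp_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    alpha = np.linalg.solve(K, y_train - mu)
    return mu + sq_exp_kernel(x_query, x_train) @ alpha

energy = lambda x: (x - 1.3) ** 2                # toy 1-D potential, minimum at x = 1.3
x_train = np.linspace(-1.0, 3.0, 9)              # a few "energy evaluations"
y_train = energy(x_train)
grid = np.linspace(-1.0, 3.0, 401)
x_min = grid[np.argmin(gpr_mean(x_train, y_train, grid))]
```

A real optimizer would iterate: evaluate the energy at the surrogate minimum, add the point to the training set, and refit.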
Statistical learning from a regression perspective
Berk, Richard A
2016-01-01
This textbook considers statistical learning applications when interest centers on the conditional distribution of the response variable, given a set of predictors, and when it is important to characterize how the predictors are related to the response. As a first approximation, this can be seen as an extension of nonparametric regression. This fully revised new edition includes important developments over the past 8 years. Consistent with modern data analytics, it emphasizes that a proper statistical learning data analysis derives from sound data collection, intelligent data management, appropriate statistical procedures, and an accessible interpretation of results. A continued emphasis on the implications for practice runs through the text. Among the statistical learning procedures examined are bagging, random forests, boosting, support vector machines and neural networks. Response variables may be quantitative or categorical. As in the first edition, a unifying theme is supervised learning that can be trea...
Applied regression analysis a research tool
Pantula, Sastry; Dickey, David
1998-01-01
Least squares estimation, when used appropriately, is a powerful research tool. A deeper understanding of the regression concepts is essential for achieving optimal benefits from a least squares analysis. This book builds on the fundamentals of statistical methods and provides appropriate concepts that will allow a scientist to use least squares as an effective research tool. Applied Regression Analysis is aimed at the scientist who wishes to gain a working knowledge of regression analysis. The basic purpose of this book is to develop an understanding of least squares and related statistical methods without becoming excessively mathematical. It is the outgrowth of more than 30 years of consulting experience with scientists and many years of teaching an applied regression course to graduate students. Applied Regression Analysis serves as an excellent text for a service course on regression for non-statisticians and as a reference for researchers. It also provides a bridge between a two-semester introduction to...
A note on the use of multiple linear regression in molecular ecology.
Frasier, Timothy R
2016-03-01
Multiple linear regression analyses (also often referred to as generalized linear models--GLMs, or generalized linear mixed models--GLMMs) are widely used in the analysis of data in molecular ecology, often to assess the relative effects of genetic characteristics on individual fitness or traits, or how environmental characteristics influence patterns of genetic differentiation. However, the coefficients resulting from multiple regression analyses are sometimes misinterpreted, which can lead to incorrect interpretations and conclusions within individual studies, and can propagate to more widespread errors in the general understanding of a topic. The primary issue revolves around the interpretation of coefficients for independent variables when interaction terms are also included in the analyses. In this scenario, the coefficients associated with each independent variable are often interpreted as the independent effect of each predictor variable on the predicted variable. However, this interpretation is incorrect. The correct interpretation is that these coefficients represent the effect of each predictor variable on the predicted variable when all other predictor variables are zero. This difference may sound subtle, but the ramifications cannot be overstated. Here, my goals are to raise awareness of this issue, to demonstrate and emphasize the problems that can result and to provide alternative approaches for obtaining the desired information. © 2015 John Wiley & Sons Ltd.
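The coefficient-interpretation point can be checked numerically: with an interaction term present, the coefficient on x1 is its slope when x2 = 0, and centering the predictors moves that reference point to the mean of x2 (a constructed, noise-free example, not data from the article):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 400
x1 = rng.normal(10.0, 2.0, n)                    # note: means far from zero
x2 = rng.normal(5.0, 1.0, n)
y = 2.0 * x1 + 1.0 * x2 + 0.5 * x1 * x2          # exact, noise-free response

def ols(X, y):
    # least-squares coefficients with an intercept prepended
    X1 = np.column_stack([np.ones(len(y)), X])
    return np.linalg.lstsq(X1, y, rcond=None)[0]

# Raw predictors: the x1 coefficient is the slope of x1 *when x2 = 0*
b_raw = ols(np.column_stack([x1, x2, x1 * x2]), y)
# Centered predictors: the x1 coefficient is the slope of x1 at the mean of x2
c1, c2 = x1 - x1.mean(), x2 - x2.mean()
b_cen = ols(np.column_stack([c1, c2, c1 * c2]), y)
```

Here `b_raw[1]` recovers 2.0 (the slope at x2 = 0), while `b_cen[1]` recovers 2.0 + 0.5 × mean(x2): the same fitted surface, but a different, often more meaningful, reference point.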
Regression models of reactor diagnostic signals
International Nuclear Information System (INIS)
Vavrin, J.
1989-01-01
The application of an autoregression model, the simplest regression model of diagnostic signals, is described for the experimental analysis of diagnostic systems and for in-service monitoring of normal and anomalous conditions and their diagnostics. The diagnostic method using a regression-type diagnostic data base and regression spectral diagnostics is described, and applied to neutron noise signals from anomalous modes in the experimental fuel assembly of a reactor. (author)
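As a minimal sketch of the autoregression idea (illustrative only, not the cited diagnostic system), the coefficient of an AR(1) signal model can be estimated by the Yule-Walker relation, phi = autocovariance(lag 1) / autocovariance(lag 0):

```python
import numpy as np

def fit_ar1(x):
    # Yule-Walker estimate of the lag-1 autoregressive coefficient
    xc = x - x.mean()
    return (xc[1:] @ xc[:-1]) / (xc @ xc)

rng = np.random.default_rng(0)
n, phi = 5000, 0.8
x = np.zeros(n)
for t in range(1, n):                            # simulate x_t = phi * x_{t-1} + e_t
    x[t] = phi * x[t - 1] + rng.standard_normal()
phi_hat = fit_ar1(x)
```

In a diagnostic setting, a persistent shift of the fitted coefficients (or of the implied AR spectrum) away from their normal-condition baseline would be the anomaly indicator.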
Uncertainty Analyses and Strategy
International Nuclear Information System (INIS)
Kevin Coppersmith
2001-01-01
The DOE identified a variety of uncertainties, arising from different sources, during its assessment of the performance of a potential geologic repository at the Yucca Mountain site. In general, the number and detail of process models developed for the Yucca Mountain site, and the complex coupling among those models, make the direct incorporation of all uncertainties difficult. The DOE has addressed these issues in a number of ways using an approach to uncertainties that is focused on producing a defensible evaluation of the performance of a potential repository. The treatment of uncertainties oriented toward defensible assessments has led to analyses and models with so-called "conservative" assumptions and parameter bounds, where conservative implies lower performance than might be demonstrated with a more realistic representation. The varying maturity of the analyses and models, and uneven level of data availability, result in total system level analyses with a mix of realistic and conservative estimates (for both probabilistic representations and single values). That is, some inputs have realistically represented uncertainties, and others are conservatively estimated or bounded. However, this approach is consistent with the "reasonable assurance" approach to compliance demonstration, which was called for in the U.S. Nuclear Regulatory Commission's (NRC) proposed 10 CFR Part 63 regulation (64 FR 8640 [DIRS 101680]). A risk analysis that includes conservatism in the inputs will result in conservative risk estimates. Therefore, the approach taken for the Total System Performance Assessment for the Site Recommendation (TSPA-SR) provides a reasonable representation of processes and conservatism for purposes of site recommendation. However, mixing unknown degrees of conservatism in models and parameter representations reduces the transparency of the analysis and makes the development of coherent and consistent probability statements about projected repository
Bulcock, J. W.
The problem of model estimation when the data are collinear was examined. Though ridge regression (RR) outperforms ordinary least squares (OLS) regression in the presence of acute multicollinearity, it is not a problem-free technique for reducing the variance of the estimates. It is a stochastic procedure when it should be nonstochastic and it…
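The variance-reduction claim above can be demonstrated directly: under near-collinear predictors, the ridge estimator (X'X + alpha*I)^(-1) X'y has far smaller sampling variance than OLS, at the cost of bias (a hypothetical simulation; the penalty alpha = 5.0 is an arbitrary illustrative choice):

```python
import numpy as np

def coef_ols(X, y):
    # ordinary least squares coefficients (no intercept; predictors centered)
    return np.linalg.lstsq(X, y, rcond=None)[0]

def coef_ridge(X, y, alpha):
    # closed-form ridge estimator: (X'X + alpha*I)^(-1) X'y
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(p), X.T @ y)

rng = np.random.default_rng(0)
n, beta = 100, np.array([1.0, 1.0])
x1 = rng.standard_normal(n)
x2 = x1 + 0.05 * rng.standard_normal(n)          # nearly collinear with x1
X = np.column_stack([x1, x2])

ols_draws, ridge_draws = [], []
for _ in range(500):                             # resample the noise only
    y = X @ beta + rng.standard_normal(n)
    ols_draws.append(coef_ols(X, y)[0])
    ridge_draws.append(coef_ridge(X, y, alpha=5.0)[0])
var_ols, var_ridge = np.var(ols_draws), np.var(ridge_draws)
```

The stochastic element the abstract criticizes enters when alpha is itself chosen from the same data (e.g. by cross-validation) rather than fixed in advance.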
Multivariate Regression Analysis and Slaughter Livestock,
AGRICULTURE, *ECONOMICS), (*MEAT, PRODUCTION), MULTIVARIATE ANALYSIS, REGRESSION ANALYSIS, ANIMALS, WEIGHT, COSTS, PREDICTIONS, STABILITY, MATHEMATICAL MODELS, STORAGE, BEEF, PORK, FOOD, STATISTICAL DATA, ACCURACY
[From clinical judgment to linear regression model].
Palacios-Cruz, Lino; Pérez, Marcela; Rivas-Ruiz, Rodolfo; Talavera, Juan O
2013-01-01
When we think about mathematical models, such as the linear regression model, we think that these terms are only used by those engaged in research, a notion that is far from the truth. Legendre described the first mathematical model in 1805, and Galton introduced the formal term in 1886. Linear regression is one of the most commonly used regression models in clinical practice. It is useful to predict or show the relationship between two or more variables as long as the dependent variable is quantitative and has a normal distribution. Stated another way, the regression is used to predict a measure based on the knowledge of at least one other variable. Linear regression has as its first objective to determine the slope or inclination of the regression line: Y = a + bx, where "a" is the intercept or regression constant and is equivalent to the value of "Y" when "X" equals 0, and "b" (also called the slope) indicates the increase or decrease that occurs when the variable "x" increases or decreases by one unit. In the regression line, "b" is called the regression coefficient. The coefficient of determination (R²) indicates the importance of the independent variables in the outcome.
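The quantities defined in this abstract, the intercept a, the slope b, and the coefficient of determination R², follow directly from the least-squares formulas; a minimal implementation:

```python
def simple_linear_regression(x, y):
    # slope b and intercept a of the least-squares line Y = a + bx
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    b = sxy / sxx                                # regression coefficient
    a = my - b * mx                              # intercept: value of Y at x = 0
    # R^2: share of the variation in Y accounted for by x
    ss_res = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return a, b, 1 - ss_res / ss_tot

# Toy data lying exactly on y = 1 + 2x, so a = 1, b = 2, R^2 = 1
a, b, r2 = simple_linear_regression([1, 2, 3, 4], [3, 5, 7, 9])
```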
Suzuki, Taku; Iwamoto, Takuji; Shizu, Kanae; Suzuki, Katsuji; Yamada, Harumoto; Sato, Kazuki
2017-05-01
This retrospective study was designed to investigate prognostic factors for postoperative outcomes for cubital tunnel syndrome (CubTS) using multiple logistic regression analysis with a large number of patients. Eighty-three patients with CubTS who underwent surgeries were enrolled. The following potential prognostic factors for disease severity were selected according to previous reports: sex, age, type of surgery, disease duration, body mass index, cervical lesion, presence of diabetes mellitus, Workers' Compensation status, preoperative severity, and preoperative electrodiagnostic testing. Postoperative severity of disease was assessed 2 years after surgery by Messina's criteria, which is an outcome measure specifically for CubTS. Bivariate analysis was performed to select candidate prognostic factors for multiple linear regression analyses. Multiple logistic regression analysis was conducted to identify the association between postoperative severity and selected prognostic factors. Both bivariate and multiple linear regression analysis revealed only preoperative severity as an independent risk factor for poor prognosis, while other factors did not show any significant association. Although conflicting results exist regarding prognosis of CubTS, this study supports evidence from previous studies and concludes that early surgical intervention portends the most favorable prognosis. Copyright © 2017 The Japanese Orthopaedic Association. Published by Elsevier B.V. All rights reserved.
Buchner, Florian; Wasem, Jürgen; Schillo, Sonja
2017-01-01
Risk equalization formulas have been refined since their introduction about two decades ago. Because of the complexity and the abundance of possible interactions between the variables used, hardly any interactions are considered. A regression tree is used to systematically search for interactions, a methodologically new approach in risk equalization. Analyses are based on a data set of nearly 2.9 million individuals from a major German social health insurer. A two-step approach is applied: In the first step a regression tree is built on the basis of the learning data set. Terminal nodes characterized by more than one morbidity-group split represent interaction effects of different morbidity groups. In the second step the 'traditional' weighted least squares regression equation is expanded by adding interaction terms for all interactions detected by the tree, and regression coefficients are recalculated. The resulting risk adjustment formula shows an improvement in the adjusted R² from 25.43% to 25.81% on the evaluation data set. Predictive ratios are calculated for subgroups affected by the interactions. The R² improvement detected is only marginal. According to the sample-level performance measures used, omitting a considerable number of morbidity interactions causes no relevant loss in accuracy. Copyright © 2015 John Wiley & Sons, Ltd.
Diagnostic Algorithm to Reflect Regressive Changes of Human Papilloma Virus in Tissue Biopsies
Lhee, Min Jin; Cha, Youn Jin; Bae, Jong Man; Kim, Young Tae
2014-01-01
Purpose: Landmark indicators have yet to be developed to detect the regression of cervical intraepithelial neoplasia (CIN). We propose that quantitative viral load and indicative histological criteria can be used to differentiate between atypical squamous cells of undetermined significance (ASCUS) and a CIN of grade 1. Materials and Methods: We collected 115 tissue biopsies from women who tested positive for the human papilloma virus (HPV). Nine morphological parameters including nuclear size, perinuclear halo, hyperchromasia, typical koilocyte (TK), abortive koilocyte (AK), bi-/multi-nucleation, keratohyaline granules, inflammation, and dyskeratosis were examined for each case. Correlation analyses, cumulative logistic regression, and binary logistic regression were used to determine optimal cut-off values of HPV copy numbers. The parameters TK, perinuclear halo, multi-nucleation, and nuclear size were significantly correlated quantitatively with HPV copy number. Results: An HPV loading number of 58.9 and an AK number of 20 were optimal to discriminate between negative and subtle findings in biopsies. An HPV loading number of 271.49 and an AK of 20 were optimal for discriminating between equivocal changes and obvious koilocytosis. Conclusion: We propose that a squamous epithelial lesion with AK of >20 and a quantitative HPV copy number between 58.9 and 271.49 represents a new spectrum of subtle pathological findings, characterized by AK in ASCUS. This can be described as a distinct entity and called "regressing koilocytosis". PMID:24532500
Evaluation of Linear Regression Simultaneous Myoelectric Control Using Intramuscular EMG.
Smith, Lauren H; Kuiken, Todd A; Hargrove, Levi J
2016-04-01
The objective of this study was to evaluate the ability of linear regression models to decode patterns of muscle coactivation from intramuscular electromyogram (EMG) and provide simultaneous myoelectric control of a virtual 3-DOF wrist/hand system. Performance was compared to the simultaneous control of conventional myoelectric prosthesis methods using intramuscular EMG (parallel dual-site control)-an approach that requires users to independently modulate individual muscles in the residual limb, which can be challenging for amputees. Linear regression control was evaluated in eight able-bodied subjects during a virtual Fitts' law task and was compared to performance of eight subjects using parallel dual-site control. An offline analysis also evaluated how different types of training data affected prediction accuracy of linear regression control. The two control systems demonstrated similar overall performance; however, the linear regression method demonstrated improved performance for targets requiring use of all three DOFs, whereas parallel dual-site control demonstrated improved performance for targets that required use of only one DOF. Subjects using linear regression control could more easily activate multiple DOFs simultaneously, but often experienced unintended movements when trying to isolate individual DOFs. Offline analyses also suggested that the method used to train linear regression systems may influence controllability. Linear regression myoelectric control using intramuscular EMG provided an alternative to parallel dual-site control for 3-DOF simultaneous control at the wrist and hand. The two methods demonstrated different strengths in controllability, highlighting the tradeoff between providing simultaneous control and the ability to isolate individual DOFs when desired.
Mo, Yangzhi; Li, Jun; Jiang, Bin; Su, Tao; Geng, Xiaofei; Liu, Junwen; Jiang, Haoyu; Shen, Chengde; Ding, Ping; Zhong, Guangcai; Cheng, Zhineng; Liao, Yuhong; Tian, Chongguo; Chen, Yingjun; Zhang, Gan
2018-08-01
Humic-like substances (HULIS) are a class of high molecular weight, light-absorbing compounds that are highly related to brown carbon (BrC). In this study, the sources and compositions of HULIS isolated from fine particles collected in Beijing, China during the 2014 Asia-Pacific Economic Cooperation (APEC) summit were characterized based on carbon isotope (¹³C and ¹⁴C) and Fourier-transform ion cyclotron resonance mass spectrometry (FT-ICR MS) analyses, respectively. HULIS were the main light-absorbing components of water-soluble organic carbon (WSOC), accounting for 80.2 ± 6.1% of the WSOC absorption capacity at 365 nm. The carbon isotope data showed that HULIS had a lower non-fossil contribution (53 ± 4%) and were less enriched with ¹³C (-24.2 ± 0.6‰) relative to non-HULIS (62 ± 8% and -20.8 ± 0.3‰, respectively). The higher relative intensity fraction of sulfur-containing compounds in HULIS before and after APEC was attributed to higher sulfur dioxide levels emitted from fossil fuel combustion, whereas the higher fraction of nitrogen-containing compounds during APEC may have been due to the relatively greater contribution of non-fossil compounds or the influence of nitrate radical chemistry. The results of investigating the relationships among the sources, elemental compositions, and optical properties of HULIS demonstrated that the light absorption of HULIS appeared to increase with increasing unsaturation degree, but decrease with increasing oxidation level. The unsaturation of HULIS was affected by both sources and aging level. Copyright © 2018 Elsevier Ltd. All rights reserved.
Ajani, J A; Buyse, M; Lichinitser, M; Gorbunova, V; Bodoky, G; Douillard, J Y; Cascinu, S; Heinemann, V; Zaucha, R; Carrato, A; Ferry, D; Moiseyenko, V
2013-11-01
The aim of developing oral fluorouracil (5-FU) is to provide a more convenient administration route with similar efficacy and the best achievable tolerance. S-1, a novel oral fluoropyrimidine, was specifically designed to overcome the limitations of intravenous fluoropyrimidine therapies. A multicentre, randomised phase 3 trial was undertaken to compare S-1/cisplatin (CS) with infusional 5-FU/cisplatin (CF) in 1053 patients with untreated, advanced gastric/gastroesophageal adenocarcinoma. This report discusses post-hoc noninferiority overall survival (OS) and safety analyses. Results (1029 treated; CS = 521/CF = 508) revealed OS in CS (8.6 months) was statistically noninferior to CF (7.9 months) [hazard ratio (HR) = 0.92 (two-sided 95% confidence interval (CI), 0.80-1.05)] for any margin equal to or greater than 1.05. Statistically significant safety advantages for the CS arm were observed [G3/4 neutropenia (CS, 18.6%; CF, 40.0%), febrile neutropenia (CS, 1.7%; CF, 6.9%), G3/4 stomatitis (CS, 1.3%; CF, 13.6%), diarrhoea (all grades: CS, 29.2%; CF, 38.4%) and renal adverse events (all grades: CS, 18.8%; CF, 33.5%)]. Hand-foot syndrome, infrequently reported, was mainly grade 1/2 in both arms. Treatment-related deaths were significantly lower in the CS arm than the CF arm (2.5% and 4.9%, respectively). CS has a favourable safety profile and provides a new treatment option for patients with advanced gastric carcinoma. Copyright © 2013 Elsevier Ltd. All rights reserved.
Masselot, Pierre; Chebana, Fateh; Bélanger, Diane; St-Hilaire, André; Abdous, Belkacem; Gosselin, Pierre; Ouarda, Taha B. M. J.
2018-01-01
In a number of environmental studies, relationships between natural processes are often assessed through regression analyses, using time series data. Such data are often multi-scale and non-stationary, leading to a poor accuracy of the resulting regression models and therefore to results with moderate reliability. To deal with this issue, the present paper introduces the EMD-regression methodology, which consists in applying the empirical mode decomposition (EMD) algorithm to the data series and then using the resulting components in regression models. The proposed methodology presents a number of advantages. First, it accounts for the non-stationarity of the data series. Second, this approach acts as a scan for the relationship between a response variable and the predictors at different time scales, providing new insights about this relationship. To illustrate the proposed methodology, it is applied to study the relationship between weather and cardiovascular mortality in Montreal, Canada. The results provide new knowledge concerning the studied relationship. For instance, they show that humidity can cause excess mortality at the monthly time scale, which is a scale not visible in classical models. A comparison is also conducted with state-of-the-art methods, namely generalized additive models and distributed lag models, both widely used in weather-related health studies. The comparison shows that EMD-regression achieves better prediction performance and provides more details than classical models concerning the relationship.
Regression modeling methods, theory, and computation with SAS
Panik, Michael
2009-01-01
Regression Modeling: Methods, Theory, and Computation with SAS provides an introduction to a diverse assortment of regression techniques using SAS to solve a wide variety of regression problems. The author fully documents the SAS programs and thoroughly explains the output produced by the programs.The text presents the popular ordinary least squares (OLS) approach before introducing many alternative regression methods. It covers nonparametric regression, logistic regression (including Poisson regression), Bayesian regression, robust regression, fuzzy regression, random coefficients regression,
Examination of influential observations in penalized spline regression
Türkan, Semra
2013-10-01
In parametric or nonparametric regression models, the results of regression analysis are affected by anomalous observations in the data set. Thus, detection of these observations is one of the major steps in regression analysis. Such observations are detected by well-known influence measures, of which Pena's statistic is one. In this study, Pena's approach is formulated for penalized spline regression in terms of ordinary residuals and leverages. Real and artificial data are used to illustrate the effectiveness of Pena's statistic relative to Cook's distance in detecting influential observations. The results of the study clearly reveal that the proposed measure is superior to Cook's distance in detecting these observations in large data sets.
Network class superposition analyses.
Directory of Open Access Journals (Sweden)
Carl A B Pearson
Full Text Available Networks are often used to understand a whole system by modeling the interactions among its pieces. Examples include biomolecules in a cell interacting to provide some primary function, or species in an environment forming a stable community. However, these interactions are often unknown; instead, the pieces' dynamic states are known, and network structure must be inferred. Because observed function may be explained by many different networks (e.g., ≈ 10^30 for the yeast cell cycle process), considering dynamics beyond this primary function means picking a single network or suitable sample: measuring over all networks exhibiting the primary function is computationally infeasible. We circumvent that obstacle by calculating the network class ensemble. We represent the ensemble by a stochastic matrix T, which is a transition-by-transition superposition of the system dynamics for each member of the class. We present concrete results for T derived from Boolean time series dynamics on networks obeying the Strong Inhibition rule, by applying T to several traditional questions about network dynamics. We show that the distribution of the number of point attractors can be accurately estimated with T. We show how to generate Derrida plots based on T. We show that T-based Shannon entropy outperforms other methods at selecting experiments to further narrow the network structure. We also outline an experimental test of predictions based on T. We motivate all of these results in terms of a popular molecular biology Boolean network model for the yeast cell cycle, but the methods and analyses we introduce are general. We conclude with open questions for T, for example, application to other models, computational considerations when scaling up to larger systems, and other potential analyses.
Mixed kernel function support vector regression for global sensitivity analysis
Cheng, Kai; Lu, Zhenzhou; Wei, Yuhao; Shi, Yan; Zhou, Yicheng
2017-11-01
Global sensitivity analysis (GSA) plays an important role in exploring the respective effects of input variables on an assigned output response. Among the many sensitivity analysis techniques in the literature, the Sobol indices have attracted much attention since they can provide accurate information for most models. In this paper, a mixed kernel function (MKF) based support vector regression (SVR) model is employed to evaluate the Sobol indices at low computational cost. By the proposed derivation, the estimation of the Sobol indices can be obtained by post-processing the coefficients of the SVR meta-model. The MKF is constituted by the orthogonal polynomials kernel function and Gaussian radial basis kernel function, thus the MKF possesses both the global characteristic advantage of the polynomials kernel function and the local characteristic advantage of the Gaussian radial basis kernel function. The proposed approach is suitable for high-dimensional and non-linear problems. Performance of the proposed approach is validated by various analytical functions and compared with the popular polynomial chaos expansion (PCE). Results demonstrate that the proposed approach is an efficient method for global sensitivity analysis.
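The Sobol indices that the SVR meta-model is used to estimate can also be approximated directly by Monte Carlo. The sketch below uses the standard pick-freeze estimator on an invented linear test function, not the paper's SVR approach; the sample size and coefficients are arbitrary.

```python
import random
import statistics

def f(x1, x2):
    # toy test function; analytically, S1 = 16/17 ≈ 0.94 and S2 = 1/17 ≈ 0.06
    return 4.0 * x1 + 1.0 * x2

random.seed(1)
N = 20000
A = [(random.random(), random.random()) for _ in range(N)]
B = [(random.random(), random.random()) for _ in range(N)]

fA = [f(*a) for a in A]
mean = statistics.fmean(fA)
var = statistics.pvariance(fA)

def first_order(i):
    # "pick-freeze": keep input i from sample A, redraw the other input from B
    fAB = [f(a[0], b[1]) if i == 0 else f(b[0], a[1]) for a, b in zip(A, B)]
    cov = statistics.fmean(ya * yb for ya, yb in zip(fA, fAB)) - mean * statistics.fmean(fAB)
    return cov / var

print("S1 ~", round(first_order(0), 2))
print("S2 ~", round(first_order(1), 2))
```

The point of meta-model approaches such as the paper's SVR (or PCE) is precisely to avoid the large number of model evaluations this brute-force estimator needs.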
Gregory, T.; Sewando, P.
2013-01-01
Adoption of technology is an important factor in economic development. The thrust of this study was to establish factors affecting adoption of QPM technology in Northern zone of Tanzania. Primary data was collected from a random sample of 120 smallholder maize farmers in four villages. Data collected were analysed using descriptive and quantitative methods. Logit model was used to determine factors that influence adoption of QPM technology. The regression results indicated that education of t...
Collaborative regression-based anatomical landmark detection
International Nuclear Information System (INIS)
Gao, Yaozong; Shen, Dinggang
2015-01-01
Anatomical landmark detection plays an important role in medical image analysis, e.g. for registration, segmentation and quantitative analysis. Among the various existing methods for landmark detection, regression-based methods have recently attracted much attention due to their robustness and efficiency. In these methods, landmarks are localised through voting from all image voxels, which is completely different from the classification-based methods that use voxel-wise classification to detect landmarks. Despite their robustness, the accuracy of regression-based landmark detection methods is often limited due to (1) the inclusion of uninformative image voxels in the voting procedure, and (2) the lack of effective ways to incorporate inter-landmark spatial dependency into the detection step. In this paper, we propose a collaborative landmark detection framework to address these limitations. The concept of collaboration is reflected in two aspects. (1) Multi-resolution collaboration. A multi-resolution strategy is proposed to hierarchically localise landmarks by gradually excluding uninformative votes from faraway voxels. Moreover, for informative voxels near the landmark, a spherical sampling strategy is also designed at the training stage to improve their prediction accuracy. (2) Inter-landmark collaboration. A confidence-based landmark detection strategy is proposed to improve the detection accuracy of ‘difficult-to-detect’ landmarks by using spatial guidance from ‘easy-to-detect’ landmarks. To evaluate our method, we conducted experiments extensively on three datasets for detecting prostate landmarks and head and neck landmarks in computed tomography images, and also dental landmarks in cone beam computed tomography images. The results show the effectiveness of our collaborative landmark detection framework in improving landmark detection accuracy, compared to other state-of-the-art methods. (paper)
Impact of multicollinearity on small sample hydrologic regression models
Kroll, Charles N.; Song, Peter
2013-06-01
Often hydrologic regression models are developed with ordinary least squares (OLS) procedures. The use of OLS with highly correlated explanatory variables produces multicollinearity, which creates highly sensitive parameter estimators with inflated variances and improper model selection. It is not clear how to best address multicollinearity in hydrologic regression models. Here a Monte Carlo simulation is developed to compare four techniques to address multicollinearity: OLS, OLS with variance inflation factor screening (VIF), principal component regression (PCR), and partial least squares regression (PLS). The performance of these four techniques was observed for varying sample sizes, correlation coefficients between the explanatory variables, and model error variances consistent with hydrologic regional regression models. The negative effects of multicollinearity are magnified at smaller sample sizes, higher correlations between the variables, and larger model error variances (smaller R2). The Monte Carlo simulation indicates that if the true model is known, multicollinearity is present, and the estimation and statistical testing of regression parameters are of interest, then PCR or PLS should be employed. If the model is unknown, or if the interest is solely on model predictions, is it recommended that OLS be employed since using more complicated techniques did not produce any improvement in model performance. A leave-one-out cross-validation case study was also performed using low-streamflow data sets from the eastern United States. Results indicate that OLS with stepwise selection generally produces models across study regions with varying levels of multicollinearity that are as good as biased regression techniques such as PCR and PLS.
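The variance inflation described above is easy to demonstrate numerically. A minimal sketch, assuming two predictors that share a common factor; in the two-predictor case the variance inflation factor reduces to 1/(1 − r²):

```python
import math
import random

random.seed(2)
n = 500
z = [random.gauss(0, 1) for _ in range(n)]
x1 = [v + 0.2 * random.gauss(0, 1) for v in z]   # two predictors sharing a factor
x2 = [v + 0.2 * random.gauss(0, 1) for v in z]

# sample correlation between the two predictors
m1, m2 = sum(x1) / n, sum(x2) / n
cov = sum((a - m1) * (b - m2) for a, b in zip(x1, x2)) / n
sd1 = math.sqrt(sum((a - m1) ** 2 for a in x1) / n)
sd2 = math.sqrt(sum((b - m2) ** 2 for b in x2) / n)
r = cov / (sd1 * sd2)

vif = 1.0 / (1.0 - r * r)   # with two predictors, R_j^2 is simply r^2
print(f"r = {r:.2f}, VIF = {vif:.1f}")
```

A common screening rule flags VIF values above 10, the kind of threshold used in the VIF-screening variant compared in the paper.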
Simulation Experiments in Practice: Statistical Design and Regression Analysis
Kleijnen, J.P.C.
2007-01-01
In practice, simulation analysts often change only one factor at a time, and use graphical analysis of the resulting Input/Output (I/O) data. The goal of this article is to change these traditional, naïve methods of design and analysis, because statistical theory proves that more information is obtained when applying Design Of Experiments (DOE) and linear regression analysis. Unfortunately, classic DOE and regression analysis assume a single simulation response that is normally and independen...
DEFF Research Database (Denmark)
Sharifzadeh, Sara; Skytte, Jacob Lercke; Nielsen, Otto Højager Attermann
2012-01-01
Statistical solutions find widespread use in food and medicine quality control. We investigate the effect of different regression and sparse regression methods for a viscosity estimation problem using the spectro-temporal features from new Sub-Surface Laser Scattering (SLS) vision system. From...... with sparse LAR, lasso and Elastic Net (EN) sparse regression methods. Due to the inconsistent measurement condition, Locally Weighted Scatter plot Smoothing (Loess) has been employed to alleviate the undesired variation in the estimated viscosity. The experimental results of applying different methods show......
RAWS II: A MULTIPLE REGRESSION ANALYSIS PROGRAM,
This memorandum gives instructions for the use and operation of a revised version of RAWS, a multiple regression analysis program. The program...of preprocessed data, the directed retention of variables, listing of the matrix of the normal equations and its inverse, and the bypassing of the regression analysis to provide the input variable statistics only. (Author)
Hierarchical regression analysis in structural Equation Modeling
de Jong, P.F.
1999-01-01
In a hierarchical or fixed-order regression analysis, the independent variables are entered into the regression equation in a prespecified order. Such an analysis is often performed when the extra amount of variance accounted for in a dependent variable by a specific independent variable is the main
Categorical regression dose-response modeling
The goal of this training is to provide participants with training on the use of the U.S. EPA’s Categorical Regression soft¬ware (CatReg) and its application to risk assessment. Categorical regression fits mathematical models to toxicity data that have been assigned ord...
Variable importance in latent variable regression models
Kvalheim, O.M.; Arneberg, R.; Bleie, O.; Rajalahti, T.; Smilde, A.K.; Westerhuis, J.A.
2014-01-01
The quality and practical usefulness of a regression model are a function of both interpretability and prediction performance. This work presents some new graphical tools for improved interpretation of latent variable regression models that can also assist in improved algorithms for variable
Stepwise versus Hierarchical Regression: Pros and Cons
Lewis, Mitzi
2007-01-01
Multiple regression is commonly used in social and behavioral data analysis. In multiple regression contexts, researchers are very often interested in determining the "best" predictors in the analysis. This focus may stem from a need to identify those predictors that are supportive of theory. Alternatively, the researcher may simply be interested…
Gibrat’s law and quantile regressions
DEFF Research Database (Denmark)
Distante, Roberta; Petrella, Ivan; Santoro, Emiliano
2017-01-01
The nexus between firm growth, size and age in U.S. manufacturing is examined through the lens of quantile regression models. This methodology allows us to overcome serious shortcomings entailed by linear regression models employed by much of the existing literature, unveiling a number of important...
Regression Analysis and the Sociological Imagination
De Maio, Fernando
2014-01-01
Regression analysis is an important aspect of most introductory statistics courses in sociology but is often presented in contexts divorced from the central concerns that bring students into the discipline. Consequently, we present five lesson ideas that emerge from a regression analysis of income inequality and mortality in the USA and Canada.
Principles of Quantile Regression and an Application
Chen, Fang; Chalhoub-Deville, Micheline
2014-01-01
Newer statistical procedures are typically introduced to help address the limitations of those already in practice or to deal with emerging research needs. Quantile regression (QR) is introduced in this paper as a relatively new methodology, which is intended to overcome some of the limitations of least squares mean regression (LMR). QR is more…
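The idea behind QR can be illustrated through its loss function: minimizing the "pinball" (check) loss over a constant recovers the sample quantile, just as minimizing squared error recovers the mean. A small sketch with invented data and a crude grid search (a real QR fit would optimize over regression coefficients, not a constant):

```python
import random

def pinball(tau, y, q):
    # check loss: tau * (y - q) above q, (1 - tau) * (q - y) below
    return sum(tau * (yi - q) if yi >= q else (1.0 - tau) * (q - yi) for yi in y)

random.seed(3)
y = [random.gauss(0, 1) for _ in range(2001)]
tau = 0.9

# grid-search the constant minimizing the check loss
grid = [i / 100.0 for i in range(-300, 301)]
q_hat = min(grid, key=lambda q: pinball(tau, y, q))

empirical = sorted(y)[int(tau * len(y))]   # simple empirical 90th percentile
print(round(q_hat, 2), round(empirical, 2))
```

Swapping `tau` changes which conditional quantile is targeted, which is what lets QR describe the whole conditional distribution rather than only its mean, as in LMR.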
On Regression Representations of Stochastic Processes
Rüschendorf, L.; de Valk
We construct a.s. nonlinear regression representations of general stochastic processes (X_n), n ∈ N. As a consequence we obtain in particular special regression representations of Markov chains and of certain m-dependent sequences. For m-dependent sequences we obtain a constructive
Pathological assessment of liver fibrosis regression
Directory of Open Access Journals (Sweden)
WANG Bingqiong
2017-03-01
Full Text Available Hepatic fibrosis is the common pathological outcome of chronic hepatic diseases. An accurate assessment of fibrosis degree provides an important reference for a definite diagnosis of diseases, treatment decision-making, treatment outcome monitoring, and prognostic evaluation. At present, many clinical studies have proven that regression of hepatic fibrosis and early-stage liver cirrhosis can be achieved by effective treatment, and a correct evaluation of fibrosis regression has become a hot topic in clinical research. Liver biopsy has long been regarded as the gold standard for the assessment of hepatic fibrosis, and thus it plays an important role in the evaluation of fibrosis regression. This article reviews the clinical application of current pathological staging systems in the evaluation of fibrosis regression from the perspectives of semi-quantitative scoring system, quantitative approach, and qualitative approach, in order to propose a better pathological evaluation system for the assessment of fibrosis regression.
Should metacognition be measured by logistic regression?
Rausch, Manuel; Zehetleitner, Michael
2017-03-01
Are logistic regression slopes suitable to quantify metacognitive sensitivity, i.e. the efficiency with which subjective reports differentiate between correct and incorrect task responses? We analytically show that logistic regression slopes are independent from rating criteria in one specific model of metacognition, which assumes (i) that rating decisions are based on sensory evidence generated independently of the sensory evidence used for primary task responses and (ii) that the distributions of evidence are logistic. Given a hierarchical model of metacognition, logistic regression slopes depend on rating criteria. According to all considered models, regression slopes depend on the primary task criterion. A reanalysis of previous data revealed that massive numbers of trials are required to distinguish between hierarchical and independent models with tolerable accuracy. It is argued that researchers who wish to use logistic regression as measure of metacognitive sensitivity need to control the primary task criterion and rating criteria. Copyright © 2017 Elsevier Inc. All rights reserved.
DEFF Research Database (Denmark)
Larsen, Klaus; Merlo, Juan
2005-01-01
The logistic regression model is frequently used in epidemiologic studies, yielding odds ratio or relative risk interpretations. Inspired by the theory of linear normal models, the logistic regression model has been extended to allow for correlated responses by introducing random effects. However......, the model does not inherit the interpretational features of the normal model. In this paper, the authors argue that the existing measures are unsatisfactory (and some of them are even improper) when quantifying results from multilevel logistic regression analyses. The authors suggest a measure...... of heterogeneity, the median odds ratio, that quantifies cluster heterogeneity and facilitates a direct comparison between covariate effects and the magnitude of heterogeneity in terms of well-known odds ratios. Quantifying cluster-level covariates in a meaningful way is a challenge in multilevel logistic...
Li, Jiangtong; Luo, Yongdao; Dai, Honglin
2018-01-01
Water is the source of life and the essential foundation of all life. With the development of industrialization, water pollution has become more and more frequent, directly affecting human survival and development. Water quality detection is one of the necessary measures to protect water resources. Ultraviolet (UV) spectral analysis is an important research method in the field of water quality detection, in which the partial least squares regression (PLSR) method is becoming the predominant technology; however, in some special cases, PLSR analysis produces considerable errors. To solve this problem, the traditional principal component regression (PCR) method was improved in this paper by using the principle of PLSR. The experimental results show that for some special experimental data sets, the improved PCR method performs better than PLSR. PCR and PLSR are the focus of this paper. Firstly, principal component analysis (PCA) is performed with MATLAB to reduce the dimensionality of the spectral data; on the basis of a large number of experiments, the optimized principal component, which carries most of the original data information, is extracted using the principle of PLSR. Secondly, linear regression analysis of the principal components is carried out with the Statistical Package for the Social Sciences (SPSS), from which the coefficients and relations of the principal components can be obtained. Finally, the same water spectral data set is calculated by both PLSR and improved PCR and the two results are analysed and compared: improved PCR and PLSR are similar for most data, but improved PCR is better than PLSR for data near the detection limit. Both PLSR and improved PCR can be used in UV spectral analysis of water, but for data near the detection limit, improved PCR gives better results than PLSR.
Krautwurst, Sven; Gerilowski, Konstantin; Kolyer, Richard; Jonsson, Haflidi; Krings, Thomas; Horstjann, Markus; Leifer, Ira; Vigil, Sam; Buchwitz, Michael; Schüttemeyer, Dirk; Fladeland, Matthew M.; Burrows, John P.; Bovensmann, Heinrich
2015-04-01
the German Research Center for Geoscience (GFZ) in Potsdam. The in-situ measurements were obtained by a greenhouse gas (GHG) in-situ analyser operated by NASA's Ames Research Center (ARC). Both instruments were installed aboard a DHC-6 Twin Otter aircraft operated by the Center for Interdisciplinary Remotely-Piloted Aircraft Studies (CIRPAS). Initial results - including estimated fugitive emission rates - will be presented for the landfill Olinda Alpha in Brea, Orange County, Los Angeles Basin, California, which was overflown on four different days during the COMEX field campaign in late summer 2014.
National Research Council Canada - National Science Library
Pfleiderer, Elaine M; Scroggins, Cheryl L; Manning, Carol A
2009-01-01
Two separate logistic regression analyses were conducted for low- and high-altitude sectors to determine whether a set of dynamic sector characteristics variables could reliably discriminate between operational error (OE...
Research and analysis of physical health using multiple regression analysis
Directory of Open Access Journals (Sweden)
T. S. Kyi
2014-01-01
Full Text Available This paper presents research aimed at creating a mathematical model of the "healthy person" using the method of regression analysis. The factors are the physical parameters of the person (such as heart rate, lung capacity, blood pressure, breath holding, weight-height coefficient, flexibility of the spine, muscles of the shoulder girdle, abdominal muscles, squatting, etc.), and the response variable is an indicator of physical working capacity. After performing multiple regression analysis, useful multiple regression models were obtained that can predict the physical performance of boys aged fourteen to seventeen. This paper presents the development of the regression model for sixteen-year-old boys and analyses the results.
Radiation regression patterns after cobalt plaque insertion for retinoblastoma
International Nuclear Information System (INIS)
Buys, R.J.; Abramson, D.H.; Ellsworth, R.M.; Haik, B.
1983-01-01
An analysis of 31 eyes of 30 patients who had been treated with cobalt plaques for retinoblastoma disclosed that a type I radiation regression pattern developed in 15 patients; type II, in one patient, and type III, in five patients. Nine patients had a regression pattern characterized by complete destruction of the tumor, the surrounding choroid, and all of the vessels in the area into which the plaque was inserted. This resulting white scar, corresponding to the sclerae only, was classified as a type IV radiation regression pattern. There was no evidence of tumor recurrence in patients with type IV regression patterns, with an average follow-up of 6.5 years, after receiving cobalt plaque therapy. Twenty-nine of these 30 patients had been unsuccessfully treated with at least one other modality (ie, light coagulation, cryotherapy, external beam radiation, or chemotherapy)
Easy methods for extracting individual regression slopes: Comparing SPSS, R, and Excel
Directory of Open Access Journals (Sweden)
Roland Pfister
2013-10-01
Full Text Available Three different methods for extracting coefficients of linear regression analyses are presented. The focus is on automatic and easy-to-use approaches for common statistical packages: SPSS, R, and MS Excel / LibreOffice Calc. Hands-on examples are included for each analysis, followed by a brief description of how a subsequent regression coefficient analysis is performed.
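Individual slopes are equally straightforward to extract in Python (not one of the packages covered by the article); the data and subject labels below are invented for illustration:

```python
import random
from collections import defaultdict

def slope(xs, ys):
    # closed-form simple-regression slope
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

random.seed(4)
# long-format data: (subject, x, y), with a different true slope per subject
true_slopes = {"s1": 1.0, "s2": 2.0, "s3": -0.5}
rows = [(s, float(x), true_slopes[s] * x + random.gauss(0, 0.1))
        for s in true_slopes for x in range(10)]

# group observations by subject, then fit one regression per subject
by_subject = defaultdict(lambda: ([], []))
for s, x, y in rows:
    by_subject[s][0].append(x)
    by_subject[s][1].append(y)

slopes = {s: slope(xs, ys) for s, (xs, ys) in by_subject.items()}
print({s: round(b, 2) for s, b in slopes.items()})
```

The extracted per-subject slopes can then be fed into a second-level analysis (e.g. a t-test on the slopes), which is the "subsequent regression coefficient analysis" the article describes.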
Duman, T. Y.; Can, T.; Gokceoglu, C.; Nefeslioglu, H. A.; Sonmez, H.
2006-11-01
As a result of industrialization, throughout the world, cities have been growing rapidly for the last century. One typical example of these growing cities is Istanbul, the population of which is over 10 million. Due to rapid urbanization, new areas suitable for settlement and engineering structures are necessary. The Cekmece area located west of the Istanbul metropolitan area is studied, because the landslide activity is extensive in this area. The purpose of this study is to develop a model that can be used to characterize landslide susceptibility in map form using logistic regression analysis of an extensive landslide database. A database of landslide activity was constructed using both aerial-photography and field studies. About 19.2% of the selected study area is covered by deep-seated landslides. The landslides that occur in the area are primarily located in sandstones with interbedded permeable and impermeable layers such as claystone, siltstone and mudstone. About 31.95% of the total landslide area is located in this unit. To apply logistic regression analyses, a data matrix including 37 variables was constructed. The variables used in the forwards stepwise analyses are different measures of slope, aspect, elevation, stream power index (SPI), plan curvature, profile curvature, geology, geomorphology and relative permeability of lithological units. A total of 25 variables were identified as exerting strong influence on landslide occurrence and were included in the logistic regression equation. Wald statistics values indicate that lithology, SPI and slope are more important than the other parameters in the equation. Beta coefficients of the 25 variables included in the logistic regression equation provide a model for landslide susceptibility in the Cekmece area. This model is used to generate a landslide susceptibility map that correctly classified 83.8% of the landslide-prone areas.
Regression modeling of ground-water flow
Cooley, R.L.; Naff, R.L.
1985-01-01
Nonlinear multiple regression methods are developed to model and analyze groundwater flow systems. Complete descriptions of regression methodology as applied to groundwater flow models allow scientists and engineers engaged in flow modeling to apply the methods to a wide range of problems. Organization of the text proceeds from an introduction that discusses the general topic of groundwater flow modeling, to a review of basic statistics necessary to properly apply regression techniques, and then to the main topic: exposition and use of linear and nonlinear regression to model groundwater flow. Statistical procedures are given to analyze and use the regression models. A number of exercises and answers are included to exercise the student on nearly all the methods that are presented for modeling and statistical analysis. Three computer programs implement the more complex methods. These three are a general two-dimensional, steady-state regression model for flow in an anisotropic, heterogeneous porous medium, a program to calculate a measure of model nonlinearity with respect to the regression parameters, and a program to analyze model errors in computed dependent variables such as hydraulic head. (USGS)
Logistic Regression in the Identification of Hazards in Construction
Drozd, Wojciech
2017-10-01
The construction site and its elements create circumstances that are conducive to the formation of risks to safety during the execution of works. Analysis indicates the critical importance of these factors in the set of characteristics that describe the causes of accidents in the construction industry. This article attempts to analyse the characteristics related to the construction site in order to indicate their importance in defining the circumstances of accidents at work. The study includes sites inspected in 2014-2016 by the employees of the District Labour Inspectorate in Krakow (Poland). The analysed set of detailed (disaggregated) data includes both quantitative and qualitative characteristics. The substantive task focused on classification modelling for the identification of hazards in construction and on identifying those of the analysed characteristics that are important in an accident. In terms of methodology, the source data were analysed using statistical classifiers in the form of logistic regression.
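A classifier of the kind used in the study can be sketched as a logistic regression fitted by plain gradient ascent. The site characteristic, coefficients, and sample size below are invented for illustration and are not the study's data:

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

random.seed(5)
n = 1000
# one binary site characteristic (e.g. a hazardous condition present or not)
X = [[1.0, float(random.random() < 0.5)] for _ in range(n)]   # [intercept, feature]
true_w = [-1.0, 1.5]
y = [1.0 if random.random() < sigmoid(true_w[0] * xi[0] + true_w[1] * xi[1]) else 0.0
     for xi in X]

# batch gradient ascent on the log-likelihood
w = [0.0, 0.0]
lr = 0.5
for _ in range(600):
    grad = [0.0, 0.0]
    for xi, yi in zip(X, y):
        err = yi - sigmoid(w[0] * xi[0] + w[1] * xi[1])
        grad[0] += err * xi[0]
        grad[1] += err * xi[1]
    w = [wj + lr * gj / n for wj, gj in zip(w, grad)]

print("intercept ~", round(w[0], 2), " coefficient ~", round(w[1], 2))
print("odds ratio for the feature ~", round(math.exp(w[1]), 2))
```

The exponentiated coefficient is the odds ratio of an accident given the characteristic, which is how the importance of individual site characteristics is typically read off such a model.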
Applied Regression Modeling A Business Approach
Pardoe, Iain
2012-01-01
An applied and concise treatment of statistical regression techniques for business students and professionals who have little or no background in calculusRegression analysis is an invaluable statistical methodology in business settings and is vital to model the relationship between a response variable and one or more predictor variables, as well as the prediction of a response value given values of the predictors. In view of the inherent uncertainty of business processes, such as the volatility of consumer spending and the presence of market uncertainty, business professionals use regression a
Energy Technology Data Exchange (ETDEWEB)
Schuchardt, Lukas D.
2012-11-01
With the introduction of regulation by incentives in 2009, German grid operators were faced with a new challenge, i.e. economic efficiency instead of technical excellence became the determining factor on which decisions for actions were based. To these new requirements, grid operators reacted by introducing a central regulation management department. The book presents an empirical investigation of the tasks, organisational structure and goals of the new department and analyses the institutional changes in the structure of grid operating utilities.
Analyzing hospitalization data: potential limitations of Poisson regression.
Weaver, Colin G; Ravani, Pietro; Oliver, Matthew J; Austin, Peter C; Quinn, Robert R
2015-08-01
Poisson regression is commonly used to analyze hospitalization data when outcomes are expressed as counts (e.g. number of days in hospital). However, data often violate the assumptions on which Poisson regression is based. More appropriate extensions of this model, while available, are rarely used. We compared hospitalization data between 206 patients treated with hemodialysis (HD) and 107 treated with peritoneal dialysis (PD) using Poisson regression and compared results from standard Poisson regression with those obtained using three other approaches for modeling count data: negative binomial (NB) regression, zero-inflated Poisson (ZIP) regression and zero-inflated negative binomial (ZINB) regression. We examined the appropriateness of each model and compared the results obtained with each approach. During a mean 1.9 years of follow-up, 183 of 313 patients (58%) were never hospitalized (indicating an excess of 'zeros'). The data also displayed overdispersion (variance greater than mean), violating another assumption of the Poisson model. Using four criteria, we determined that the NB and ZINB models performed best. According to these two models, patients treated with HD experienced similar hospitalization rates as those receiving PD {NB rate ratio (RR): 1.04 [bootstrapped 95% confidence interval (CI): 0.49-2.20]; ZINB summary RR: 1.21 (bootstrapped 95% CI 0.60-2.46)}. Poisson and ZIP models fit the data poorly and had much larger point estimates than the NB and ZINB models [Poisson RR: 1.93 (bootstrapped 95% CI 0.88-4.23); ZIP summary RR: 1.84 (bootstrapped 95% CI 0.88-3.84)]. We found substantially different results when modeling hospitalization data, depending on the approach used. Our results argue strongly for a sound model selection process and improved reporting around statistical methods used for modeling count data. © The Author 2015. Published by Oxford University Press on behalf of ERA-EDTA. All rights reserved.
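The overdispersion and excess-zero diagnostics discussed above amount to comparing the sample variance and zero fraction with what a Poisson model implies. A small simulation sketch with invented zero-inflation and rate parameters (not the study's data):

```python
import math
import random

random.seed(6)

def poisson(lam):
    # Knuth's inversion method for a Poisson draw
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

def zip_draw(p_zero=0.58, lam=4.0):
    # zero-inflated counts: structural zeros mixed with Poisson counts
    return 0 if random.random() < p_zero else poisson(lam)

counts = [zip_draw() for _ in range(5000)]
mean = sum(counts) / len(counts)
var = sum((c - mean) ** 2 for c in counts) / (len(counts) - 1)
pct_zero = sum(c == 0 for c in counts) / len(counts)
print(f"mean = {mean:.2f}, variance = {var:.2f}, zeros = {pct_zero:.0%}")
# a variance far above the mean, and far more zeros than exp(-mean) predicts,
# both violate the equidispersion assumption of plain Poisson regression
```

When these checks fail, the NB, ZIP, or ZINB extensions compared in the paper relax exactly these assumptions.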
Vectors, a tool in statistical regression theory
Corsten, L.C.A.
1958-01-01
Using linear algebra, this thesis developed linear regression analysis, including analysis of variance, covariance analysis, special experimental designs, linear and fertility adjustments, and the analysis of experiments at different places and times. The determination of the orthogonal projection, yielding
Genetics Home Reference: caudal regression syndrome
Dynamic travel time estimation using regression trees.
2008-10-01
This report presents a methodology for travel time estimation by using regression trees. The dissemination of travel time information has become crucial for effective traffic management, especially under congested road conditions. In the absence of c...
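The core mechanic of a regression tree can be shown in a few lines: choose the split threshold that minimizes the summed squared error when each side is predicted by its leaf mean. A hedged sketch; the traffic-flow numbers and the `best_split` helper are invented for the example:

```python
def best_split(x, y):
    """Find the single threshold on x minimizing total squared error when
    each side is predicted by its mean -- the step a regression tree
    repeats recursively on each resulting subset."""
    def sse(vals):
        if not vals:
            return 0.0
        m = sum(vals) / len(vals)
        return sum((v - m) ** 2 for v in vals)

    best_t, best_err = None, float("inf")
    for t in sorted(set(x))[:-1]:                  # candidate thresholds
        left = [yi for xi, yi in zip(x, y) if xi <= t]
        right = [yi for xi, yi in zip(x, y) if xi > t]
        err = sse(left) + sse(right)
        if err < best_err:
            best_t, best_err = t, err
    return best_t, best_err

# Toy data: travel time is short in free flow, long once congestion sets in
flow = [10, 20, 30, 40, 50, 60, 70, 80]
time = [5, 5, 6, 6, 14, 15, 15, 16]
threshold, err = best_split(flow, time)
print(threshold)   # splits between the free-flow and congested regimes
```

On this toy data the split lands exactly at the boundary between the two travel-time regimes, which is why trees suit piecewise conditions like congestion.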
Fuzzy multiple linear regression: A computational approach
Juang, C. H.; Huang, X. H.; Fleming, J. W.
1992-01-01
This paper presents a new computational approach for performing fuzzy regression. In contrast to Bardossy's approach, the new approach, while dealing with fuzzy variables, closely follows the conventional regression technique. In this approach, treatment of fuzzy input is more 'computational' than 'symbolic.' The following sections first outline the formulation of the new approach, then deal with the implementation and computational scheme, and this is followed by examples to illustrate the new procedure.
Computing multiple-output regression quantile regions
Czech Academy of Sciences Publication Activity Database
Paindaveine, D.; Šiman, Miroslav
2012-01-01
Roč. 56, č. 4 (2012), s. 840-853 ISSN 0167-9473 R&D Projects: GA MŠk(CZ) 1M06047 Institutional research plan: CEZ:AV0Z10750506 Keywords : halfspace depth * multiple-output regression * parametric linear programming * quantile regression Subject RIV: BA - General Mathematics Impact factor: 1.304, year: 2012 http://library.utia.cas.cz/separaty/2012/SI/siman-0376413.pdf
There is No Quantum Regression Theorem
International Nuclear Information System (INIS)
Ford, G.W.; O'Connell, R.F.
1996-01-01
The Onsager regression hypothesis states that the regression of fluctuations is governed by macroscopic equations describing the approach to equilibrium. It is here asserted that this hypothesis fails in the quantum case. This is shown first by explicit calculation for the example of quantum Brownian motion of an oscillator and then in general from the fluctuation-dissipation theorem. It is asserted that the correct generalization of the Onsager hypothesis is the fluctuation-dissipation theorem. © 1996 The American Physical Society.
Spontaneous regression of metastatic Merkel cell carcinoma.
LENUS (Irish Health Repository)
Hassan, S J
2010-01-01
Merkel cell carcinoma is a rare aggressive neuroendocrine carcinoma of the skin predominantly affecting elderly Caucasians. It has a high rate of local recurrence and regional lymph node metastases. It is associated with a poor prognosis. Complete spontaneous regression of Merkel cell carcinoma has been reported but is a poorly understood phenomenon. Here we present a case of complete spontaneous regression of metastatic Merkel cell carcinoma demonstrating a markedly different pattern of events from those previously published.
Forecasting exchange rates: a robust regression approach
Preminger, Arie; Franck, Raphael
2005-01-01
The least squares estimation method, as well as other ordinary estimation methods for regression models, can be severely affected by a small number of outliers, resulting in poor out-of-sample forecasts. This paper suggests a robust regression approach, based on the S-estimation method, to construct forecasting models that are less sensitive to data contamination by outliers. A robust linear autoregressive (RAR) model and a robust neural network (RNN) model are estimated to study the predictabil...
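The paper's S-estimator is involved to implement; as a hedged stand-in, the sketch below uses the simpler Huber M-estimator fitted by iteratively reweighted least squares to show the same qualitative point: one gross outlier inflates the OLS slope, but the robust fit stays near the true line. All data and helper names are illustrative.

```python
def wls_line(x, y, w):
    """Weighted least-squares fit of y = a + b*x; returns (a, b)."""
    sw = sum(w)
    mx = sum(wi * xi for wi, xi in zip(w, x)) / sw
    my = sum(wi * yi for wi, yi in zip(w, y)) / sw
    sxy = sum(wi * (xi - mx) * (yi - my) for wi, xi, yi in zip(w, x, y))
    sxx = sum(wi * (xi - mx) ** 2 for wi, xi in zip(w, x))
    b = sxy / sxx
    return my - b * mx, b

def huber_line(x, y, k=1.345, iters=50):
    """Robust line fit: iteratively reweighted least squares with Huber
    weights. Residuals beyond k robust-scale units are downweighted, so a
    few outliers barely move the fit (an M-estimator, simpler than the
    S-estimator used in the paper)."""
    w = [1.0] * len(x)
    a = b = 0.0
    for _ in range(iters):
        a, b = wls_line(x, y, w)
        r = [yi - (a + b * xi) for xi, yi in zip(x, y)]
        mad = sorted(abs(ri) for ri in r)[len(r) // 2]
        s = mad / 0.6745 or 1.0                    # robust scale estimate
        w = [1.0 if abs(ri) <= k * s else k * s / abs(ri) for ri in r]
    return a, b

# True relation y = 2x, with the last observation grossly contaminated
x = [1, 2, 3, 4, 5, 6, 7, 8]
y = [2, 4, 6, 8, 10, 12, 14, 100]
a_ols, b_ols = wls_line(x, y, [1.0] * len(x))
a_rob, b_rob = huber_line(x, y)
print(round(b_ols, 2), round(b_rob, 2))  # OLS slope inflated; robust near 2
```

The same resistance to contamination is what the robust forecasting models above aim for out-of-sample.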
Marginal longitudinal semiparametric regression via penalized splines
Al Kadiri, M.; Carroll, R.J.; Wand, M.P.
2010-01-01
We study the marginal longitudinal nonparametric regression problem and some of its semiparametric extensions. We point out that, while several elaborate proposals for efficient estimation have been made, a relatively simple and straightforward one, based on penalized splines, has not. After describing our approach, we then explain how Gibbs sampling and the BUGS software can be used to achieve quick and effective implementation. Illustrations are provided for nonparametric regression and additive models.
Regression tools for CO2 inversions: application of a shrinkage estimator to process attribution
International Nuclear Information System (INIS)
Shaby, Benjamin A.; Field, Christopher B.
2006-01-01
In this study we perform an atmospheric inversion based on a shrinkage estimator. This method is used to estimate surface fluxes of CO2, first partitioned according to constituent geographic regions, and then according to constituent processes that are responsible for the total flux. Our approach differs from previous approaches in two important ways. The first is that the technique of linear Bayesian inversion is recast as a regression problem. Seen as such, standard regression tools are employed to analyse and reduce errors in the resultant estimates. A shrinkage estimator, which combines standard ridge regression with the linear 'Bayesian inversion' model, is introduced. This method introduces additional bias into the model with the aim of reducing variance such that errors are decreased overall. Compared with standard linear Bayesian inversion, the ridge technique seems to reduce both flux estimation errors and prediction errors. The second divergence from previous studies is that instead of dividing the world into geographically distinct regions and estimating the CO2 flux in each region, the flux space is divided conceptually into processes that contribute to the total global flux. Formulating the problem in this manner adds to the interpretability of the resultant estimates and attempts to shed light on the problem of attributing sources and sinks to their underlying mechanisms.
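The bias-variance trade-off behind ridge shrinkage is easiest to see in the single-predictor case on centered data, where the estimate is beta = sum(x*y) / (sum(x^2) + lambda): lambda = 0 recovers OLS, and larger lambda shrinks the slope toward zero. A minimal sketch with made-up data (not the flux inversion itself):

```python
def ridge_slope(x, y, lam):
    """Ridge estimate for a centered single-predictor model:
    beta = sum(xc*yc) / (sum(xc^2) + lam). Increasing lam adds bias but
    lowers variance -- the trade the shrinkage inversion exploits."""
    mx = sum(x) / len(x)
    my = sum(y) / len(y)
    xc = [xi - mx for xi in x]
    yc = [yi - my for yi in y]
    return sum(a * b for a, b in zip(xc, yc)) / (sum(a * a for a in xc) + lam)

x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [1.2, 1.9, 3.2, 3.8, 5.1]
for lam in (0.0, 1.0, 10.0):
    print(lam, round(ridge_slope(x, y, lam), 3))  # slope shrinks as lam grows
```

In the inversion setting the same penalty damps poorly constrained directions of the flux estimate, which is where most of the error variance lives.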
bayesQR: A Bayesian Approach to Quantile Regression
Directory of Open Access Journals (Sweden)
Dries F. Benoit
2017-01-01
After its introduction by Koenker and Bassett (1978), quantile regression has become an important and popular tool to investigate the conditional response distribution in regression. The R package bayesQR contains a number of routines to estimate quantile regression parameters using a Bayesian approach based on the asymmetric Laplace distribution. The package contains functions for the typical quantile regression with a continuous dependent variable, but also supports quantile regression for binary dependent variables. For both types of dependent variables, an approach to variable selection using the adaptive lasso is provided. For the binary quantile regression model, the package also contains a routine that calculates the fitted probabilities for each vector of predictors. In addition, functions for summarizing the results, creating trace plots and posterior histograms, and drawing quantile plots are included. This paper starts with a brief overview of the theoretical background of the models used in the bayesQR package. The main part of this paper discusses the computational problems that arise in the implementation of the procedure and illustrates the usefulness of the package through selected examples.
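Quantile regression minimizes the Koenker-Bassett check (pinball) loss; as a sanity check, minimizing that loss over a constant predictor recovers the corresponding sample quantile. A small illustrative sketch (not bayesQR code; the helpers are invented for the example):

```python
def check_loss(u, tau):
    """Koenker-Bassett check (pinball) loss: tau*u for u >= 0,
    (tau - 1)*u for u < 0."""
    return u * (tau - (1.0 if u < 0 else 0.0))

def sample_quantile(y, tau):
    """The constant q minimizing the summed check loss over a sample is the
    tau-th sample quantile -- the objective quantile regression minimizes,
    with q replaced by a linear predictor x'beta."""
    return min(y, key=lambda q: sum(check_loss(yi - q, tau) for yi in y))

y = [1, 2, 3, 4, 5, 6, 7, 8, 9]
print(sample_quantile(y, 0.5), sample_quantile(y, 0.9))
# tau = 0.5 recovers the sample median
```

Bayesian quantile regression keeps this objective implicitly: the asymmetric Laplace likelihood's negative log-density is, up to constants, exactly this check loss.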