Statistical power analyses using G*Power 3.1: tests for correlation and regression analyses.
Faul, Franz; Erdfelder, Edgar; Buchner, Axel; Lang, Albert-Georg
2009-11-01
G*Power is a free power analysis program for a variety of statistical tests. We present extensions and improvements of the version introduced by Faul, Erdfelder, Lang, and Buchner (2007) in the domain of correlation and regression analyses. In the new version, we have added procedures to analyze the power of tests based on (1) single-sample tetrachoric correlations, (2) comparisons of dependent correlations, (3) bivariate linear regression, (4) multiple linear regression based on the random predictor model, (5) logistic regression, and (6) Poisson regression. We describe these new features and provide a brief introduction to their scope and handling.
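The power computations this abstract describes can be illustrated for the simplest case it covers: a two-sided test of a single correlation. The sketch below uses the Fisher z approximation rather than G*Power's exact routines, and the bisection-based inverse normal CDF is just a stdlib-only convenience; none of this is G*Power's actual code.

```python
import math

def norm_cdf(x):
    # Standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def norm_ppf(p, lo=-10.0, hi=10.0):
    # Inverse normal CDF by bisection (sufficient for this sketch)
    for _ in range(100):
        mid = (lo + hi) / 2.0
        if norm_cdf(mid) < p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

def power_correlation(rho, n, alpha=0.05):
    """Approximate power of a two-sided test of H0: rho = 0 at
    sample size n, using the Fisher z transformation."""
    z_crit = norm_ppf(1.0 - alpha / 2.0)
    ncp = math.atanh(rho) * math.sqrt(n - 3)
    return norm_cdf(abs(ncp) - z_crit)

print(power_correlation(0.3, 84))   # close to the conventional 0.80
```

The approximation agrees well with exact routines for moderate n; for small samples or tetrachoric correlations, dedicated software such as G*Power is the appropriate tool.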
Applications of MIDAS regression in analysing trends in water quality
Penev, Spiridon; Leonte, Daniela; Lazarov, Zdravetz; Mann, Rob A.
2014-04-01
We discuss novel statistical methods for analysing trends in water quality. Such analysis uses complex data sets of different classes of variables, including water quality, hydrological and meteorological variables. We analyse the effect of rainfall and flow on trends in water quality utilising a flexible model called Mixed Data Sampling (MIDAS). This model arises because of the mixed frequency in the data collection. Typically, water quality variables are sampled fortnightly, whereas the rain data are sampled daily. The advantage of using MIDAS regression is in the flexible and parsimonious modelling of the influence of rain and flow on trends in water quality variables. We discuss the model and its implementation on a data set from the Shoalhaven Supply System and Catchments in the state of New South Wales, Australia. Information criteria indicate that MIDAS modelling improves upon simplistic approaches that do not utilise the mixed data sampling nature of the data.
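The core MIDAS idea, parsimoniously weighting a high-frequency regressor into each low-frequency observation, can be sketched as follows. The exponential Almon lag polynomial and the 14-day sampling interval mirror the fortnightly/daily setup described above, but all data and parameter values here are invented for illustration; this is not the authors' model.

```python
import numpy as np

def almon_weights(n_lags, theta1, theta2):
    # Exponential Almon lag: two parameters generate all daily weights
    k = np.arange(n_lags)
    w = np.exp(theta1 * k + theta2 * k ** 2)
    return w / w.sum()

def midas_aggregate(daily, n_lags, theta1, theta2):
    """Weighted sum of the last n_lags daily values before each
    low-frequency time point (here: every 14th day)."""
    w = almon_weights(n_lags, theta1, theta2)
    points = range(n_lags, len(daily) + 1, 14)
    return np.array([w @ daily[t - n_lags:t][::-1] for t in points])

rng = np.random.default_rng(6)
rain = rng.gamma(2.0, 1.0, size=140)          # 140 days of mock rainfall
x = midas_aggregate(rain, n_lags=14, theta1=-0.3, theta2=0.0)
print(x.shape)   # one constructed regressor value per fortnight
```

The aggregated series `x` would then enter an ordinary regression for the fortnightly water quality variable, so only two weighting parameters are estimated regardless of the number of daily lags.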
Analysing Access Control Specifications
DEFF Research Database (Denmark)
Probst, Christian W.; Hansen, René Rydhof
2009-01-01
When prosecuting crimes, the main question to answer is often who had a motive and the possibility to commit the crime. When investigating cyber crimes, the question of possibility is often hard to answer, as in a networked system almost any location can be accessed from almost anywhere. The most common tool to answer this question, analysis of log files, faces the problem that the amount of logged data may be overwhelming. This problem gets even worse in the case of insider attacks, where the attacker's actions usually will be logged as permissible, standard actions—if they are logged at all. Recent events have revealed intimate knowledge of surveillance and control systems on the side of the attacker, making it often impossible to deduce the identity of an inside attacker from logged data. In this work we present an approach that analyses the access control configuration to identify the set…
Multicollinearity in Regression Analyses Conducted in Epidemiologic Studies.
Vatcheva, Kristina P; Lee, MinJae; McCormick, Joseph B; Rahbar, Mohammad H
2016-04-01
The adverse impact of ignoring multicollinearity on findings and data interpretation in regression analysis is well documented in the statistical literature. The failure to identify and report multicollinearity could result in misleading interpretations of the results. A review of epidemiological literature in PubMed from January 2004 to December 2013 illustrated the need for greater attention to identifying and minimizing the effect of multicollinearity in analyses of data from epidemiologic studies. We used simulated datasets and real-life data from the Cameron County Hispanic Cohort to demonstrate the adverse effects of multicollinearity in regression analysis and encourage researchers to consider diagnostics for multicollinearity as one of the steps in regression analysis.
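A standard diagnostic for the problem this abstract describes is the variance inflation factor (VIF). The abstract does not specify the authors' procedures, so the following is a generic sketch: VIFs computed from auxiliary regressions with plain NumPy, demonstrated on simulated predictors.

```python
import numpy as np

def vif(X):
    """Variance inflation factor for each column of the predictor
    matrix X (n samples x p predictors), computed by regressing each
    predictor on the remaining ones. VIF > 10 is a common (though
    debated) rule of thumb for problematic collinearity."""
    X = np.asarray(X, dtype=float)
    n, p = X.shape
    out = []
    for j in range(p):
        y = X[:, j]
        others = np.delete(X, j, axis=1)
        A = np.column_stack([np.ones(n), others])  # add intercept
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)
        resid = y - A @ beta
        r2 = 1.0 - (resid @ resid) / ((y - y.mean()) ** 2).sum()
        out.append(1.0 / (1.0 - r2))
    return np.array(out)

# Two highly correlated predictors inflate each other's VIF:
rng = np.random.default_rng(0)
x1 = rng.normal(size=200)
x2 = x1 + rng.normal(scale=0.1, size=200)   # near-duplicate of x1
x3 = rng.normal(size=200)                   # independent predictor
print(vif(np.column_stack([x1, x2, x3])))
```

Here the first two VIFs are large while the third stays near 1, which is the pattern that should prompt the remedial steps (dropping or combining predictors, centering, etc.) discussed in the multicollinearity literature.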
Statistical and regression analyses of detected extrasolar systems
Czech Academy of Sciences Publication Activity Database
Pintr, Pavel; Peřinová, V.; Lukš, A.; Pathak, A.
2013-01-01
Roč. 75, č. 1 (2013), s. 37-45 ISSN 0032-0633 Institutional support: RVO:61389021 Keywords : Exoplanets * Kepler candidates * Regression analysis Subject RIV: BN - Astronomy, Celestial Mechanics, Astrophysics Impact factor: 1.630, year: 2013 http://www.sciencedirect.com/science/article/pii/S0032063312003066
Analysing inequalities in Germany a structured additive distributional regression approach
Silbersdorff, Alexander
2017-01-01
This book seeks new perspectives on the growing inequalities that our societies face, putting forward Structured Additive Distributional Regression as a means of statistical analysis that circumvents the common problem of analytical reduction to simple point estimators. This new approach allows the observed discrepancy between the individuals' realities and the abstract representation of those realities to be explicitly taken into consideration, rather than relying on the arithmetic mean alone. In turn, the method is applied to the question of economic inequality in Germany.
Use of probabilistic weights to enhance linear regression myoelectric control.
Smith, Lauren H; Kuiken, Todd A; Hargrove, Levi J
2015-12-01
Clinically available prostheses for transradial amputees do not allow simultaneous myoelectric control of degrees of freedom (DOFs). Linear regression methods can provide simultaneous myoelectric control, but frequently also result in difficulty with isolating individual DOFs when desired. This study evaluated the potential of using probabilistic estimates of categories of gross prosthesis movement, which are commonly used in classification-based myoelectric control, to enhance linear regression myoelectric control. Gaussian models were fit to electromyogram (EMG) feature distributions for three movement classes at each DOF (no movement, or movement in either direction) and used to weight the output of linear regression models by the probability that the user intended the movement. Eight able-bodied and two transradial amputee subjects worked in a virtual Fitts' law task to evaluate differences in controllability between linear regression and probability-weighted regression for an intramuscular EMG-based three-DOF wrist and hand system. Real-time and offline analyses in able-bodied subjects demonstrated that probability weighting improved performance during single-DOF tasks (p < 0.05) by preventing extraneous movement at additional DOFs. Similar results were seen in experiments with two transradial amputees. Though goodness-of-fit evaluations suggested that the EMG feature distributions showed some deviations from the Gaussian, equal-covariance assumptions used in this experiment, the assumptions were sufficiently met to provide improved performance compared to linear regression control. Use of probability weights can improve the ability to isolate individual DOFs during linear regression myoelectric control, while maintaining the ability to simultaneously control multiple DOFs.
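The probability-weighting scheme can be sketched in a deliberately simplified one-dimensional form. The study used multivariate EMG features and equal-covariance Gaussians; the version below assumes scalar features, unit variances, and equal class priors purely for illustration, and is not the authors' implementation.

```python
import numpy as np

def gaussian_pdf(x, mu, sigma):
    # Univariate normal density
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def weighted_output(feature, reg_output, class_mus, sigma=1.0):
    """Scale a DOF's linear-regression output by the posterior
    probability that the user intended any movement at that DOF.
    class 0 = "no movement"; classes 1, 2 = the two directions."""
    likes = np.array([gaussian_pdf(feature, mu, sigma) for mu in class_mus])
    post = likes / likes.sum()              # equal priors assumed
    p_move = 1.0 - post[0]                  # probability of intended movement
    return p_move * reg_output

# Hypothetical class means for one DOF (values invented):
mus = [0.0, 3.0, -3.0]
print(weighted_output(0.1, 0.8, mus))   # feature near "no movement": attenuated
print(weighted_output(2.9, 0.8, mus))   # feature near a movement class: kept
```

The weighting suppresses regression output when the evidence favors "no movement," which is the mechanism the abstract credits with preventing extraneous movement at unintended DOFs.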
Controlling attribute effect in linear regression
Calders, Toon; Karim, Asim A.; Kamiran, Faisal; Ali, Wasif Mohammad; Zhang, Xiangliang
2013-01-01
In data mining we often have to learn from biased data, because, for instance, data comes from different batches or there was a gender or racial bias in the collection of social data. In some applications it may be necessary to explicitly control this bias in the models we learn from the data. This paper is the first to study learning linear regression models under constraints that control the biasing effect of a given attribute such as gender or batch number. We show how propensity modeling can be used for factoring out the part of the bias that can be justified by externally provided explanatory attributes. Then we analytically derive linear models that minimize squared error while controlling the bias by imposing constraints on the mean outcome or residuals of the models. Experiments with discrimination-aware crime prediction and batch effect normalization tasks show that the proposed techniques are successful in controlling attribute effects in linear regression models. © 2013 IEEE.
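One way to realize "constraints on the mean outcome" is equality-constrained least squares solved through the KKT system. The sketch below, on synthetic data, forces equal mean predictions across the two levels of a sensitive attribute; it is a generic illustration of the constraint idea, not the paper's exact estimator (the propensity-modeling step is omitted).

```python
import numpy as np

def constrained_lstsq(X, y, s):
    """Least squares with the linear constraint that mean predictions
    are equal across the two groups defined by binary attribute s,
    solved via the KKT (bordered normal equations) system."""
    a = X[s == 1].mean(axis=0) - X[s == 0].mean(axis=0)  # constraint row
    p = X.shape[1]
    K = np.zeros((p + 1, p + 1))
    K[:p, :p] = 2 * X.T @ X
    K[:p, p] = a
    K[p, :p] = a
    rhs = np.concatenate([2 * X.T @ y, [0.0]])
    sol = np.linalg.solve(K, rhs)
    return sol[:p]   # drop the Lagrange multiplier

rng = np.random.default_rng(1)
n = 500
s = (rng.random(n) < 0.5).astype(int)
x = rng.normal(size=n) + 1.5 * s            # feature correlated with s
X = np.column_stack([np.ones(n), x])
y = 2.0 + 1.0 * x + 0.8 * s + rng.normal(size=n)
b = constrained_lstsq(X, y, s)
pred = X @ b
print(pred[s == 1].mean() - pred[s == 0].mean())  # ~ 0 by construction
```

An unconstrained fit on the same data would show a clear gap in mean predictions between the groups; the constraint removes exactly that attribute effect at some cost in squared error.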
Evaluation of Linear Regression Simultaneous Myoelectric Control Using Intramuscular EMG.
Smith, Lauren H; Kuiken, Todd A; Hargrove, Levi J
2016-04-01
The objective of this study was to evaluate the ability of linear regression models to decode patterns of muscle coactivation from intramuscular electromyogram (EMG) and provide simultaneous myoelectric control of a virtual 3-DOF wrist/hand system. Performance was compared to the simultaneous control of conventional myoelectric prosthesis methods using intramuscular EMG (parallel dual-site control), an approach that requires users to independently modulate individual muscles in the residual limb, which can be challenging for amputees. Linear regression control was evaluated in eight able-bodied subjects during a virtual Fitts' law task and was compared to performance of eight subjects using parallel dual-site control. An offline analysis also evaluated how different types of training data affected prediction accuracy of linear regression control. The two control systems demonstrated similar overall performance; however, the linear regression method demonstrated improved performance for targets requiring use of all three DOFs, whereas parallel dual-site control demonstrated improved performance for targets that required use of only one DOF. Subjects using linear regression control could more easily activate multiple DOFs simultaneously, but often experienced unintended movements when trying to isolate individual DOFs. Offline analyses also suggested that the method used to train linear regression systems may influence controllability. Linear regression myoelectric control using intramuscular EMG provided an alternative to parallel dual-site control for 3-DOF simultaneous control at the wrist and hand. The two methods demonstrated different strengths in controllability, highlighting the tradeoff between providing simultaneous control and the ability to isolate individual DOFs when desired.
Tripepi, Giovanni; Jager, Kitty J.; Stel, Vianda S.; Dekker, Friedo W.; Zoccali, Carmine
2011-01-01
Because of some limitations of stratification methods, epidemiologists frequently use multiple linear and logistic regression analyses to address specific epidemiological questions. If the dependent variable is a continuous one (for example, systolic pressure and serum creatinine), the researcher
USE OF THE SIMPLE LINEAR REGRESSION MODEL IN MACRO-ECONOMICAL ANALYSES
Directory of Open Access Journals (Sweden)
Constantin ANGHELACHE
2011-10-01
The article presents the fundamental aspects of linear regression as a toolbox which can be used in macroeconomic analyses. The article describes the estimation of the parameters, the statistical tests used, and homoscedasticity and heteroscedasticity. The use of econometric instruments in macroeconomics is an important factor that guarantees the quality of the models, analyses, results and possible interpretations that can be drawn at this level.
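The estimation step described in this abstract can be written out directly for the simple model y = a + bx. This is the textbook closed-form OLS fit with the classical standard error of the slope under homoscedastic errors; the data are simulated for illustration.

```python
import numpy as np

def simple_ols(x, y):
    """Closed-form simple linear regression: intercept, slope, and the
    classical (homoscedastic) standard error of the slope."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = len(x)
    b = np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)
    a = y.mean() - b * x.mean()
    resid = y - (a + b * x)
    s2 = resid @ resid / (n - 2)            # error variance estimate
    se_b = np.sqrt(s2 / ((x - x.mean()) ** 2).sum())
    return a, b, se_b

rng = np.random.default_rng(2)
x = rng.uniform(0, 10, 200)
y = 1.0 + 0.5 * x + rng.normal(scale=0.3, size=200)
a, b, se_b = simple_ols(x, y)
print(round(a, 2), round(b, 2))   # estimates near the true (1.0, 0.5)
```

The usual t statistic for the slope is `b / se_b`, which connects this sketch to the significance tests the article discusses; heteroscedasticity would require robust standard errors instead.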
Karami, K; Zerehdaran, S; Barzanooni, B; Lotfi, E
2017-12-01
1. The aim of the present study was to estimate genetic parameters for average egg weight (EW) and egg number (EN) at different ages in Japanese quail using multi-trait random regression (MTRR) models. 2. A total of 8534 records from 900 quail, hatched between 2014 and 2015, were used in the study. Average weekly egg weights and egg numbers were measured from the second until the sixth week of egg production. 3. Nine random regression models were compared to identify the best order of the Legendre polynomials (LP). The optimal model was identified by the Bayesian Information Criterion. A model with second order of LP for fixed effects, second order of LP for additive genetic effects and third order of LP for permanent environmental effects (MTRR23) was found to be the best. 4. According to the MTRR23 model, direct heritability for EW increased from 0.26 in the second week to 0.53 in the sixth week of egg production, whereas the ratio of permanent environment to phenotypic variance decreased from 0.48 to 0.1. Direct heritability for EN was low, whereas the ratio of permanent environment to phenotypic variance decreased from 0.57 to 0.15 during the production period. 5. For each trait, estimated genetic correlations among weeks of egg production were high (from 0.85 to 0.98). Genetic correlations between EW and EN were low and negative for the first two weeks, but low and positive for the rest of the egg production period. 6. In conclusion, random regression models can be used effectively for analysing egg production traits in Japanese quail. Response to selection for increased egg weight would be higher at older ages because of its higher heritability, and such a breeding programme would have no negative genetic impact on egg production.
Directory of Open Access Journals (Sweden)
Esther Leushuis
2016-12-01
Background: Standardization of the semen analysis may improve reproducibility. We assessed variability between laboratories in semen analyses and evaluated whether a transformation using Z scores and regression statistics was able to reduce this variability. Materials and Methods: We performed a retrospective cohort study. We calculated between-laboratory coefficients of variation (CVB) for sperm concentration and for morphology. Subsequently, we standardized the semen analysis results by calculating laboratory-specific Z scores, and by using regression. We used analysis of variance for four semen parameters to assess systematic differences between laboratories before and after the transformations, both in the circulation samples and in the samples obtained in the prospective cohort study in the Netherlands between January 2002 and February 2004. Results: The mean CVB was 7% for sperm concentration (range 3 to 13%) and 32% for sperm morphology (range 18 to 51%). The differences between the laboratories were statistically significant for all semen parameters (all P<0.001). Standardization using Z scores did not reduce the differences in semen analysis results between the laboratories (all P<0.001). Conclusion: There exists large between-laboratory variability for sperm morphology and small, but statistically significant, between-laboratory variation for sperm concentration. Standardization using Z scores does not eliminate between-laboratory variability.
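The two quantities in this abstract, the between-laboratory coefficient of variation and laboratory-specific Z scores, can be sketched as follows. The sample matrix is invented for illustration and the formulas are standard definitions, not the study's analysis code.

```python
import numpy as np

def between_lab_cv(values_by_lab):
    """Between-laboratory CV (%): variability of the laboratory means
    relative to their grand mean. Rows = samples, columns = labs."""
    lab_means = values_by_lab.mean(axis=0)
    return lab_means.std(ddof=1) / lab_means.mean() * 100.0

def z_scores(values_by_lab):
    """Standardize each laboratory's results against its own mean/SD."""
    mu = values_by_lab.mean(axis=0)
    sd = values_by_lab.std(axis=0, ddof=1)
    return (values_by_lab - mu) / sd

# Mock sperm concentrations (10^6/ml): 3 samples measured by 3 labs
conc = np.array([[60.0, 66.0, 57.0],
                 [40.0, 45.0, 38.0],
                 [80.0, 88.0, 76.0]])
print(round(between_lab_cv(conc), 1))
z = z_scores(conc)
print(np.allclose(z.mean(axis=0), 0.0))
```

As the study found, such standardization centers each laboratory at zero but cannot remove sample-by-laboratory interaction, which is why between-laboratory differences can persist after the transformation.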
Fundamental data analyses for measurement control
International Nuclear Information System (INIS)
Campbell, K.; Barlich, G.L.; Fazal, B.; Strittmatter, R.B.
1987-02-01
A set of measurement control data analyses was selected for use by analysts responsible for maintaining measurement quality of nuclear materials accounting instrumentation. The analyses consist of control charts for bias and precision and statistical tests used as analytic supplements to the control charts. They provide the desired detection sensitivity and yet can be interpreted locally, quickly, and easily. The control charts provide for visual inspection of data and enable an alert reviewer to spot problems possibly before statistical tests detect them. The statistical tests are useful for automating the detection of departures from the controlled state or from the underlying assumptions (such as normality). 8 refs., 3 figs., 5 tabs
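A minimal version of the bias control chart described here: historical in-control measurements define the center line and 3-sigma action limits, and new points falling outside them are flagged. The limits, multiplier, and data below are illustrative assumptions, not the report's specifications.

```python
import numpy as np

def control_limits(history, k=3.0):
    # Center the chart on the in-control mean with k-sigma limits
    mu, sd = np.mean(history), np.std(history, ddof=1)
    return mu - k * sd, mu + k * sd

def out_of_control(history, new_points, k=3.0):
    """Return the new measurements that fall outside the action limits."""
    lo, hi = control_limits(history, k)
    return [x for x in new_points if x < lo or x > hi]

rng = np.random.default_rng(3)
history = rng.normal(0.0, 0.01, size=50)     # in-control bias data (mock)
print(out_of_control(history, [0.002, -0.005, 0.08]))
```

Supplementary statistical tests (runs tests, normality checks) would catch subtler departures, such as drifts that stay inside the limits, which is the division of labor the abstract describes.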
International Nuclear Information System (INIS)
Bhowmik, K.R.; Islam, S.
2016-01-01
Logistic regression (LR) analysis is the most common statistical methodology to find out the determinants of childhood mortality. However, the significant predictors cannot be ranked according to their influence on the response variable. Multiple classification (MC) analysis can be applied to identify the significant predictors with a priority index which helps to rank the predictors. The main objective of the study is to find the socio-demographic determinants of childhood mortality at neonatal, post-neonatal, and post-infant period by fitting the LR model as well as to rank those through MC analysis. The study is conducted using the data of Bangladesh Demographic and Health Survey 2007, where birth and death information of children were collected from their mothers. Three dichotomous response variables are constructed from children's age at death to fit the LR and MC models. Socio-economic and demographic variables significantly associated with the response variables separately are considered in LR and MC analyses. Both the LR and MC models identified the same significant predictors for specific childhood mortality. For both neonatal and child mortality, biological factors of children, regional settings, and parents' socio-economic status are found as the 1st, 2nd, and 3rd significant groups of predictors respectively. Mother's education and household environment are detected as major significant predictors of post-neonatal mortality. This study shows that MC analysis with or without LR analysis can be applied to detect determinants with rank, which helps policy makers take initiatives on a priority basis. (author)
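The LR model referred to above can be fit with a few lines of Newton-Raphson (iteratively reweighted least squares). The sketch below uses simulated data with a single predictor rather than the survey's actual covariates; it illustrates the fitting machinery only.

```python
import numpy as np

def fit_logistic(X, y, iters=25):
    """Maximum-likelihood logistic regression via Newton-Raphson.
    X includes an intercept column; y is 0/1."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        W = p * (1.0 - p)                   # observation weights
        H = X.T @ (X * W[:, None])          # observed information
        beta += np.linalg.solve(H, X.T @ (y - p))
    return beta

rng = np.random.default_rng(4)
n = 2000
x = rng.normal(size=n)
logit = -0.5 + 1.2 * x                      # assumed true coefficients
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(float)
X = np.column_stack([np.ones(n), x])
print(np.round(fit_logistic(X, y), 1))      # recovers roughly (-0.5, 1.2)
```

Exponentiating the fitted slope gives the odds ratio, the quantity usually reported when such models are applied to mortality determinants.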
The number of subjects per variable required in linear regression analyses.
Austin, Peter C; Steyerberg, Ewout W
2015-06-01
To determine the number of independent variables that can be included in a linear regression model. We used a series of Monte Carlo simulations to examine the impact of the number of subjects per variable (SPV) on the accuracy of estimated regression coefficients and standard errors, on the empirical coverage of estimated confidence intervals, and on the accuracy of the estimated R² of the fitted model. A minimum of approximately two SPV tended to result in estimation of regression coefficients with relative bias of less than 10%. Furthermore, with this minimum number of SPV, the standard errors of the regression coefficients were accurately estimated and estimated confidence intervals had approximately the advertised coverage rates. A much higher number of SPV were necessary to minimize bias in estimating the model R², although adjusted R² estimates behaved well. The bias in estimating the model R² statistic was inversely proportional to the magnitude of the proportion of variation explained by the population regression model. Linear regression models require only two SPV for adequate estimation of regression coefficients, standard errors, and confidence intervals. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
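The Monte Carlo design can be reproduced in miniature: fix two subjects per variable and check the relative bias of an estimated regression coefficient across replications. The dimensions and replication count below are scaled-down assumptions, not the paper's settings.

```python
import numpy as np

# Simulate many small regressions at SPV = 2 and measure the relative
# bias of the first coefficient (true value 1), echoing the paper's
# finding that coefficient estimates are nearly unbiased even here.
rng = np.random.default_rng(5)
p = 10                      # predictors
n = 2 * p                   # two subjects per variable
true_beta = np.ones(p)
estimates = []
for _ in range(2000):
    X = rng.normal(size=(n, p))
    y = X @ true_beta + rng.normal(size=n)
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    estimates.append(b[0])
rel_bias = (np.mean(estimates) - 1.0) / 1.0 * 100.0
print(round(rel_bias, 1))   # small relative bias even at SPV = 2
```

Repeating the exercise for the model R² (rather than the coefficients) would show the much larger bias the paper reports, which only a far higher SPV, or the adjusted R², removes.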
Directory of Open Access Journals (Sweden)
Kevin D. Cashman
2017-05-01
Dietary Reference Values (DRVs) for vitamin D have a key role in the prevention of vitamin D deficiency. However, despite adopting similar risk assessment protocols, estimates from authoritative agencies over the last 6 years have been diverse. This may have arisen from diverse approaches to data analysis. Modelling strategies for pooling of individual subject data from cognate vitamin D randomized controlled trials (RCTs) are likely to provide the most appropriate DRV estimates. Thus, the objective of the present work was to undertake the first-ever individual participant data (IPD)-level meta-regression, which is increasingly recognized as best practice, from seven winter-based RCTs (with 882 participants ranging in age from 4 to 90 years) of the vitamin D intake–serum 25-hydroxyvitamin D (25(OH)D) dose-response. Our IPD-derived estimates of vitamin D intakes required to maintain 97.5% of 25(OH)D concentrations >25, 30, and 50 nmol/L across the population are 10, 13, and 26 µg/day, respectively. In contrast, standard meta-regression analyses with aggregate data (as used by several agencies in recent years) from the same RCTs estimated that a vitamin D intake requirement of 14 µg/day would maintain 97.5% of 25(OH)D >50 nmol/L. These first IPD-derived estimates offer improved dietary recommendations for vitamin D because the underpinning modeling captures the between-person variability in response of serum 25(OH)D to vitamin D intake.
Kromhout, D.
2009-01-01
Within-person variability in measured values of multiple risk factors can bias their associations with disease. The multivariate regression calibration (RC) approach can correct for such measurement error and has been applied to studies in which true values or independent repeat measurements of the risk factors are observed on a subsample.
Li, Spencer D.
2011-01-01
Mediation analysis in child and adolescent development research is possible using large secondary data sets. This article provides an overview of two statistical methods commonly used to test mediated effects in secondary analysis: multiple regression and structural equation modeling (SEM). Two empirical studies are presented to illustrate the…
Wu, Dane W.
2002-01-01
The year 2000 US presidential election between Al Gore and George Bush has been the most intriguing and controversial one in American history. The state of Florida was the trigger for the controversy, mainly, due to the use of the misleading "butterfly ballot". Using prediction (or confidence) intervals for least squares regression lines…
Directory of Open Access Journals (Sweden)
Giuliano de Oliveira Freitas
2013-10-01
PURPOSE: To determine linear regression models between Alpins descriptive indices and Thibos astigmatic power vectors (APV), assessing the validity and strength of such correlations. METHODS: This case series prospectively assessed 62 eyes of 31 consecutive cataract patients with preoperative corneal astigmatism between 0.75 and 2.50 diopters in both eyes. Patients were randomly assorted among two phacoemulsification groups: one assigned to receive an AcrySof® Toric intraocular lens (IOL) in both eyes and another assigned to have an AcrySof Natural IOL associated with limbal relaxing incisions, also in both eyes. All patients were reevaluated postoperatively at 6 months, when refractive astigmatism analysis was performed using both the Alpins and Thibos methods. The ratio between Thibos postoperative APV and preoperative APV (APVratio) and its linear regression to the Alpins percentage of success of astigmatic surgery, percentage of astigmatism corrected and percentage of astigmatism reduction at the intended axis were assessed. RESULTS: A significant negative correlation between the ratio of post- and preoperative Thibos APV (APVratio) and the Alpins percentage of success (%Success) was found (Spearman's ρ = -0.93); the linear regression is given by the following equation: %Success = (-APVratio + 1.00) × 100. CONCLUSION: The linear regression we found between APVratio and %Success permits a validated mathematical inference concerning the overall success of astigmatic surgery.
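The reported regression equation, %Success = (-APVratio + 1.00) × 100, can be transcribed directly; the helper name below is ours, not the paper's.

```python
def percent_success(apv_pre, apv_post):
    """Alpins %Success inferred from Thibos astigmatic power vectors via
    the reported regression: %Success = (1 - APVratio) * 100, where
    APVratio = postoperative APV / preoperative APV."""
    return (1.0 - apv_post / apv_pre) * 100.0

# A residual APV of 0.5 D from a preoperative 2.0 D maps to 75% success:
print(percent_success(2.0, 0.5))
```

An APVratio of 0 (no residual astigmatic vector) gives 100% success and a ratio of 1 (unchanged) gives 0%, consistent with the near-perfect negative correlation the study reports.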
Check-all-that-apply data analysed by Partial Least Squares regression
DEFF Research Database (Denmark)
Rinnan, Åsmund; Giacalone, Davide; Frøst, Michael Bom
2015-01-01
are analysed by multivariate techniques. CATA data can be analysed both by setting the CATA as the X and the Y. The former is the PLS-Discriminant Analysis (PLS-DA) version, while the latter is the ANOVA-PLS (A-PLS) version. We investigated the difference between these two approaches, concluding...
DEFF Research Database (Denmark)
Scott, Neil W; Fayers, Peter M; Aaronson, Neil K
2010-01-01
Differential item functioning (DIF) methods can be used to determine whether different subgroups respond differently to particular items within a health-related quality of life (HRQoL) subscale, after allowing for overall subgroup differences in that scale. This article reviews issues that arise when testing for DIF in HRQoL instruments. We focus on logistic regression methods, which are often used because of their efficiency, simplicity and ease of application.
Analyses of Developmental Rate Isomorphy in Ectotherms: Introducing the Dirichlet Regression.
Directory of Open Access Journals (Sweden)
David S Boukal
Temperature drives development in insects and other ectotherms because their metabolic rate and growth depend directly on thermal conditions. However, relative durations of successive ontogenetic stages often remain nearly constant across a substantial range of temperatures. This pattern, termed 'developmental rate isomorphy' (DRI) in insects, appears to be widespread, and reported departures from DRI are generally very small. We show that these conclusions may be due to the caveats hidden in the statistical methods currently used to study DRI. Because the DRI concept is inherently based on proportional data, we propose that Dirichlet regression applied to individual-level data is an appropriate statistical method to critically assess DRI. As a case study we analyze data on five aquatic and four terrestrial insect species. We find that results obtained by Dirichlet regression are consistent with DRI violation in at least eight of the studied species, although standard analysis detects significant departure from DRI in only four of them. Moreover, the departures from DRI detected by Dirichlet regression are consistently much larger than previously reported. The proposed framework can also be used to infer whether observed departures from DRI reflect life history adaptations to size- or stage-dependent effects of varying temperature. Our results indicate that the concept of DRI in insects and other ectotherms should be critically re-evaluated and put in a wider context, including the concept of 'equiproportional development' developed for copepods.
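The building block of Dirichlet regression, the Dirichlet log-density on a vector of proportions, can be evaluated with the standard library alone. In a full regression each concentration parameter would be a (typically log-linked) function of temperature; here the alpha vectors and the observed stage proportions are invented, so this shows only the likelihood machinery, not the authors' analysis.

```python
import math

def dirichlet_logpdf(props, alpha):
    """Log-density of the Dirichlet distribution at a proportions
    vector `props` (summing to 1) with concentration vector `alpha`."""
    assert abs(sum(props) - 1.0) < 1e-9
    log_norm = math.lgamma(sum(alpha)) - sum(math.lgamma(a) for a in alpha)
    return log_norm + sum((a - 1.0) * math.log(p) for a, p in zip(alpha, props))

# Mock proportions of development spent in three larval stages:
obs = [0.2, 0.3, 0.5]
# An alpha whose mean matches obs fits better than a mismatched one:
print(dirichlet_logpdf(obs, [2.0, 3.0, 5.0]) > dirichlet_logpdf(obs, [5.0, 3.0, 2.0]))
```

Maximizing this log-likelihood over temperature-dependent alphas, with individual-level stage durations converted to proportions, is what allows the approach to detect the small departures from isomorphy discussed above.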
International Nuclear Information System (INIS)
Slutskaya, N.G.; Mosseh, I.B.
2006-01-01
Data on genetic mutations under radiation and chemical treatment for different types of cells have been analyzed with correlation and regression analyses. Linear correlations between different genetic effects in sex cells and somatic cells were found. The results may be extrapolated to the sex cells of humans and mammals. (authors)
Cashman, Kevin D.; Ritz, Christian; Kiely, Mairead
2017-01-01
Dietary Reference Values (DRVs) for vitamin D have a key role in the prevention of vitamin D deficiency. However, despite adopting similar risk assessment protocols, estimates from authoritative agencies over the last 6 years have been diverse. This may have arisen from diverse approaches to data analysis. Modelling strategies for pooling of individual subject data from cognate vitamin D randomized controlled trials (RCTs) are likely to provide the most appropriate DRV estimates. Thus, the objective of the present work was to undertake the first-ever individual participant data (IPD)-level meta-regression, which is increasingly recognized as best practice, from seven winter-based RCTs (with 882 participants ranging in age from 4 to 90 years) of the vitamin D intake–serum 25-hydroxyvitamin D (25(OH)D) dose-response. Our IPD-derived estimates of vitamin D intakes required to maintain 97.5% of 25(OH)D concentrations >25, 30, and 50 nmol/L across the population are 10, 13, and 26 µg/day, respectively. In contrast, standard meta-regression analyses with aggregate data (as used by several agencies in recent years) from the same RCTs estimated that a vitamin D intake requirement of 14 µg/day would maintain 97.5% of 25(OH)D >50 nmol/L. These first IPD-derived estimates offer improved dietary recommendations for vitamin D because the underpinning modeling captures the between-person variability in response of serum 25(OH)D to vitamin D intake. PMID:28481259
DEFF Research Database (Denmark)
Tybjærg-Hansen, Anne
2009-01-01
Within-person variability in measured values of multiple risk factors can bias their associations with disease. The multivariate regression calibration (RC) approach can correct for such measurement error and has been applied to studies in which true values or independent repeat measurements of the risk factors are observed on a subsample. We extend the multivariate RC techniques to a meta-analysis framework where multiple studies provide independent repeat measurements and information on disease outcome. We consider the cases where some or all studies have repeat measurements, and compare study-specific, averaged and empirical Bayes estimates of RC parameters. Additionally, we allow for binary covariates (e.g. smoking status) and for uncertainty and time trends in the measurement error corrections. Our methods are illustrated using a subset of individual participant data from prospective long-term studies…
Huang, Banglian; Yang, Yiming; Luo, Tingting; Wu, S.; Du, Xuezhu; Cai, Detian; van Loo, E.N.; Huang, Bangquan
2013-01-01
In the present study, correlation, regression and path analyses were carried out to determine correlations among the agronomic traits and their contributions to seed yield per plant in Crambe abyssinica. Partial correlation analysis indicated that plant height (X1) was significantly correlated with branching height and the number of first branches (P < 0.01); branching height (X2) was significantly correlated with pod number of primary inflorescence (P < 0.01) and number of secondary branch...
Analyses of non-fatal accidents in an opencast mine by logistic regression model - a case study.
Onder, Seyhan; Mutlu, Mert
2017-09-01
Accidents cause major damage to both workers and enterprises in the mining industry. To reduce the number of occupational accidents, these incidents should be properly registered and carefully analysed. This study examines the Aegean Lignite Enterprise (ELI) of Turkish Coal Enterprises (TKI) in Soma between 2006 and 2011, using the opencast coal mine's occupational accident records for statistical analyses. A total of 231 occupational accidents were analysed for this study. The accident records were categorized into seven groups: area, reason, occupation, part of body, age, shift hour and lost days. The SPSS package program was used in this study for logistic regression analyses, which predicted the probability of accidents resulting in greater or less than 3 lost workdays for non-fatal injuries. Social facilities (an area of surface installations), workshops and opencast mining areas are the areas with the highest probability of accidents with greater than 3 lost workdays for non-fatal injuries, while the reasons with the highest probability for these types of accidents are transporting and manual handling. Additionally, the model was tested on accidents reported in 2012 for the ELI in Soma and correctly estimated the probability of accidents with lost workdays in 70% of cases.
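The kind of binary-outcome logistic model described above can be sketched from first principles. The two predictors and the accident-generating process below are invented for illustration; they are not taken from the ELI data, and a production analysis would use a fitted package rather than hand-rolled gradient descent.

```python
import math
import random

def fit_logistic(X, y, lr=0.1, epochs=2000):
    """Fit P(y=1|x) = sigmoid(w0 + w.x) by batch gradient descent."""
    n, d = len(X), len(X[0])
    w = [0.0] * (d + 1)  # intercept first
    for _ in range(epochs):
        grad = [0.0] * (d + 1)
        for xi, yi in zip(X, y):
            z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
            p = 1.0 / (1.0 + math.exp(-z))
            err = p - yi
            grad[0] += err
            for j, xj in enumerate(xi):
                grad[j + 1] += err * xj
        w = [wj - lr * g / n for wj, g in zip(w, grad)]
    return w

def predict_proba(w, xi):
    z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
    return 1.0 / (1.0 + math.exp(-z))

random.seed(0)
# hypothetical predictors: x1 = manual handling (0/1), x2 = scaled shift hour
X = [[random.randint(0, 1), random.random()] for _ in range(200)]
# manual handling strongly raises the chance of >3 lost workdays
y = [1 if (x1 and random.random() < 0.8) or random.random() < 0.2 else 0 for x1, _ in X]
w = fit_logistic(X, y)
print("odds ratio for manual handling ~", round(math.exp(w[1]), 2))
```

Exponentiating a fitted coefficient gives the odds ratio for that risk factor, which is how the study's per-category probabilities are usually reported.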
Fundamental data analyses for measurement control
International Nuclear Information System (INIS)
Campbell, K.; Barlich, G.L.; Fazal, B.; Strittmatter, R.B.
1989-01-01
An important aspect of a complete measurement control program for special nuclear materials is the analysis of data from periodic control measurements of known standards. This chapter covers the following topics: basic algorithms including an introduction and terminology, the standard case (known mean and standard deviation), Shewhart control charts, and sequential test for bias; modifications for nonstandard cases including modification for changing (decaying) standard value, modifications for deteriorating measurement precision, and modifications when repeated measurements are made; maintenance information including estimation of historical standard deviation (standard case), estimation of historical standard deviation (changing with time), normality and outliers, and other tests of randomness
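For the standard case, a Shewhart chart reduces to a center line at the historical mean with control limits at plus or minus three standard deviations; new control measurements outside those limits are flagged. The measurement values below are made up for illustration.

```python
def shewhart_limits(values):
    """Center line and 3-sigma control limits from historical standard data."""
    n = len(values)
    mean = sum(values) / n
    sd = (sum((v - mean) ** 2 for v in values) / (n - 1)) ** 0.5
    return mean - 3 * sd, mean, mean + 3 * sd

def out_of_control(values, lcl, ucl):
    """Indices of control measurements falling outside the limits."""
    return [i for i, v in enumerate(values) if v < lcl or v > ucl]

history = [10.1, 9.9, 10.0, 10.2, 9.8, 10.0, 10.1, 9.9]  # hypothetical standard
lcl, cl, ucl = shewhart_limits(history)
new = [10.0, 10.1, 11.5, 9.9]
print(out_of_control(new, lcl, ucl))  # [2] -- third point exceeds the upper limit
```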
Directory of Open Access Journals (Sweden)
Luise A Seeker
Telomeres cap the ends of linear chromosomes and shorten with age in many organisms. In humans, short telomeres have been linked to morbidity and mortality. With the accumulation of longitudinal datasets, the focus shifts from investigating telomere length (TL) to exploring TL change within individuals over time. Some studies indicate that the speed of telomere attrition is predictive of future disease. The objectives of the present study were to (1) characterize the change in bovine relative leukocyte TL (RLTL) across the lifetime in Holstein Friesian dairy cattle, (2) estimate genetic parameters of RLTL over time and (3) investigate the association of differences in individual RLTL profiles with productive lifespan. RLTL measurements were analysed using Legendre polynomials in a random regression model to describe TL profiles and genetic variance over age. The analyses were based on 1,328 repeated RLTL measurements of 308 female Holstein Friesian dairy cattle. A quadratic Legendre polynomial was fitted to the fixed effect of age in months and to the random effect of the animal identity. Changes in RLTL, heritability and within-trait genetic correlation along the age trajectory were calculated and illustrated. At a population level, the relationship between RLTL and age was described by a positive quadratic function. Individuals varied significantly regarding the direction and amount of RLTL change over life. The heritability of RLTL ranged from 0.36 to 0.47 (SE = 0.05-0.08) and remained statistically unchanged over time. The genetic correlation of RLTL at birth with measurements later in life decreased with the time interval between samplings from near unity to 0.69, indicating that TL later in life might be regulated by different genes than TL early in life. Even though animals differed in their RLTL profiles significantly, those differences were not correlated with productive lifespan (p = 0.954).
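The fixed-effect part of such a model, a quadratic Legendre polynomial in age, can be sketched with ordinary least squares on the first three Legendre basis functions after rescaling age to [-1, 1]. The TL values below are hypothetical, and the random-effect (per-animal) machinery of the actual model is omitted.

```python
def legendre_basis(x):
    """First three Legendre polynomials P0, P1, P2 for x in [-1, 1]."""
    return [1.0, x, 0.5 * (3 * x * x - 1)]

def solve(A, b):
    """Gaussian elimination with partial pivoting for a small linear system."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def fit_quadratic_legendre(ages, tl):
    lo, hi = min(ages), max(ages)
    xs = [2 * (a - lo) / (hi - lo) - 1 for a in ages]  # rescale age to [-1, 1]
    B = [legendre_basis(x) for x in xs]
    # normal equations (B'B) beta = B'y
    A = [[sum(row[r] * row[c] for row in B) for c in range(3)] for r in range(3)]
    b = [sum(B[i][r] * tl[i] for i in range(len(B))) for r in range(3)]
    return solve(A, b)

ages = [0, 12, 24, 36, 48, 60]
tl = [1.00, 0.95, 0.93, 0.94, 0.97, 1.02]  # hypothetical relative TL values
beta = fit_quadratic_legendre(ages, tl)
print(beta[2] > 0)  # positive quadratic coefficient, as the study reports
```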
Directory of Open Access Journals (Sweden)
Luciane Flores Jacobi
2002-01-01
The main purpose of this research is to apply the regression control chart as a statistical control tool to monitor productive processes in which a state variable of interest can be expressed as a function of a control variable. Several studies exist on controlling variables in productive processes, but most address the control of each variable separately and therefore cannot be used for a comparative study. This research therefore presents an efficient technique for the simultaneous control of correlated variables.
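A minimal regression control chart in this spirit: fit the state-versus-control line on in-control history, then flag new observations whose residuals exceed a multiple of the residual standard error. All numbers below are illustrative.

```python
def linreg(xs, ys):
    """Ordinary least squares for y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

def regression_chart(xs_hist, ys_hist, new_points, k=3.0):
    """Fit the line on in-control history, then flag new (x, y) points whose
    residual exceeds k residual standard errors."""
    a, b = linreg(xs_hist, ys_hist)
    resid = [y - (a + b * x) for x, y in zip(xs_hist, ys_hist)]
    se = (sum(r * r for r in resid) / (len(resid) - 2)) ** 0.5
    return [i for i, (x, y) in enumerate(new_points) if abs(y - (a + b * x)) > k * se]

xs_hist = [1, 2, 3, 4, 5, 6]
ys_hist = [2.1, 3.9, 6.0, 8.1, 10.0, 12.2]  # state variable ~ 2 * control variable
flags = regression_chart(xs_hist, ys_hist, [(7, 14.1), (8, 17.5)])
print(flags)  # [1] -- the second new point is far off the fitted line
```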
Directory of Open Access Journals (Sweden)
Land Walker H
2011-01-01
Abstract Background When investigating covariate interactions and group associations with standard regression analyses, the relationship between the response variable and exposure may be difficult to characterize. When the relationship is nonlinear, linear modeling techniques do not capture the nonlinear information content. Statistical learning (SL) techniques with kernels are capable of addressing nonlinear problems without making parametric assumptions. However, these techniques do not produce findings relevant for epidemiologic interpretations. A simulated case-control study was used to contrast the information embedding characteristics and separation boundaries produced by a specific SL technique with logistic regression (LR) modeling representing a parametric approach. The SL technique was comprised of a kernel mapping in combination with a perceptron neural network. Because the LR model has an important epidemiologic interpretation, the SL method was modified to produce the analogous interpretation and generate odds ratios for comparison. Results The SL approach is capable of generating odds ratios for main effects and risk factor interactions that better capture nonlinear relationships between exposure variables and outcome in comparison with LR. Conclusions The integration of SL methods in epidemiology may improve both the understanding and interpretation of complex exposure/disease relationships.
Directory of Open Access Journals (Sweden)
Željko V. Račić
2010-12-01
This paper aims to present the specifics of the application of the multiple linear regression model. The economic (financial) crisis is analyzed in terms of gross domestic product as a function of the foreign trade balance on the one hand and credit-card indebtedness of the population on the other, in the USA from 1999 to 2008. We used the extended application model, which shows how the analyst should run the whole development process of a regression model. This process began with simple statistical features and the application of regression procedures, and ended with residual analysis intended to study the compatibility of the data and the model settings. The paper also analyzes the values of some standard statistics used in the selection of an appropriate regression model. Testing of the model was carried out using the PASW Statistics 17 program.
International Nuclear Information System (INIS)
Hulsteijn, Leonie T. van; Corssmit, Eleonora P.M.; Coremans, Ida E.M.; Smit, Johannes W.A.; Jansen, Jeroen C.; Dekkers, Olaf M.
2013-01-01
The primary treatment goal of radiotherapy for paragangliomas of the head and neck region (HNPGLs) is local control of the tumor, i.e. stabilization of tumor volume. Interestingly, regression of tumor volume has also been reported. Up to the present, no meta-analysis has been performed giving an overview of regression rates after radiotherapy in HNPGLs. The main objective was to perform a systematic review and meta-analysis to assess regression of tumor volume in HNPGL patients after radiotherapy. A second outcome was local tumor control. The study design is a systematic review and meta-analysis. PubMed, EMBASE, Web of Science, COCHRANE and Academic Search Premier and references of key articles were searched in March 2012 to identify potentially relevant studies. Considering the indolent course of HNPGLs, only studies with ≥12 months follow-up were eligible. Main outcomes were the pooled proportions of regression and local control after radiotherapy as initial, combined (i.e. directly post-operatively or post-embolization) or salvage treatment (i.e. after initial treatment has failed) for HNPGLs. A meta-analysis was performed with an exact likelihood approach using a logistic regression with a random effect at the study level. Pooled proportions with 95% confidence intervals (CI) were reported. Fifteen studies were included, concerning a total of 283 jugulotympanic HNPGLs in 276 patients. Pooled regression proportions for initial, combined and salvage treatment were respectively 21%, 33% and 52% in radiosurgery studies and 4%, 0% and 64% in external beam radiotherapy studies. Pooled local control proportions for radiotherapy as initial, combined and salvage treatment ranged from 79% to 100%. Radiotherapy for jugulotympanic paragangliomas results in excellent local tumor control and therefore is a valuable treatment for these types of tumors. The effects of radiotherapy on regression of tumor volume remain ambiguous, although the data suggest that regression can
Johansen, Mette; Bahrt, Henriette; Altman, Roy D; Bartels, Else M; Juhl, Carsten B; Bliddal, Henning; Lund, Hans; Christensen, Robin
2016-08-01
The aim was to identify factors explaining inconsistent observations concerning the efficacy of intra-articular hyaluronic acid compared to intra-articular sham/control, or non-intervention control, in patients with symptomatic osteoarthritis, based on randomized clinical trials (RCTs). A systematic review and meta-regression analyses of available randomized trials were conducted. The outcome, pain, was assessed according to a pre-specified hierarchy of potentially available outcomes. Hedges's standardized mean difference [SMD (95% CI)] served as effect size. Restricted Maximum Likelihood (REML) mixed-effects models were used to combine study results, and heterogeneity was calculated and interpreted as Tau-squared and I-squared, respectively. Overall, 99 studies (14,804 patients) met the inclusion criteria: Of these, only 71 studies (72%), including 85 comparisons (11,216 patients), had adequate data available for inclusion in the primary meta-analysis. Overall, compared with placebo, intra-articular hyaluronic acid reduced pain with an effect size of -0.39 [-0.47 to -0.31; P hyaluronic acid. Based on available trial data, intra-articular hyaluronic acid showed a better effect than intra-articular saline on pain reduction in osteoarthritis. Publication bias and the risk of selective outcome reporting suggest only a small clinical effect compared to saline. Copyright © 2016 Elsevier Inc. All rights reserved.
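The study pools SMDs with REML mixed-effects models; a simpler random-effects estimator, DerSimonian-Laird, conveys the same ingredients (pooled effect, Tau-squared, I-squared) in a few lines. The effect sizes and variances below are hypothetical, not the trial data.

```python
def dersimonian_laird(effects, variances):
    """Random-effects pooled estimate with DerSimonian-Laird tau^2 and I^2."""
    w = [1.0 / v for v in variances]
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    Q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
    k = len(effects)
    C = sum(w) - sum(wi * wi for wi in w) / sum(w)
    tau2 = max(0.0, (Q - (k - 1)) / C)          # between-study variance
    i2 = max(0.0, (Q - (k - 1)) / Q) if Q > 0 else 0.0
    wr = [1.0 / (v + tau2) for v in variances]  # random-effects weights
    pooled = sum(wi * e for wi, e in zip(wr, effects)) / sum(wr)
    return pooled, tau2, i2

# hypothetical SMDs (pain reduction vs placebo) and their variances
effects = [-0.7, -0.15, -0.5, -0.2, -0.6]
variances = [0.02, 0.03, 0.025, 0.04, 0.015]
pooled, tau2, i2 = dersimonian_laird(effects, variances)
print(round(pooled, 2), round(i2, 2))
```

REML would estimate tau-squared iteratively instead of by the moment formula above, but the weighting structure of the pooled estimate is the same.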
Energy Technology Data Exchange (ETDEWEB)
Reddy, T.A. (Energy Systems Lab., Texas A and M Univ., College Station, TX (United States)); Claridge, D.E. (Energy Systems Lab., Texas A and M Univ., College Station, TX (United States))
1994-01-01
Multiple regression modeling of monitored building energy use data is often faulted as a means of predicting energy use on the grounds that multicollinearity between the regressor variables can lead both to improper interpretation of the relative importance of the various physical regressor parameters and to a model with unstable regressor coefficients. Principal component analysis (PCA) has the potential to overcome such drawbacks. While a few case studies have already attempted to apply this technique to building energy data, the objectives of this study were to make a broader evaluation of PCA and multiple regression analysis (MRA) and to establish guidelines under which one approach is preferable to the other. Four geographic locations in the US with different climatic conditions were selected, and synthetic data sequences representative of daily energy use in large institutional buildings were generated in each location using a linear model with outdoor temperature, outdoor specific humidity and solar radiation as the three regression variables. The MRA and PCA approaches were then applied to these data sets and their relative performances were compared. Conditions under which PCA seems to perform better than MRA were identified and preliminary recommendations on the use of either modeling approach were formulated. (orig.)
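The multicollinearity that motivates PCA here can be quantified with a variance inflation factor; for a pair of regressors it is simply 1/(1 - r^2). The temperature and humidity values below are invented to show two regressors that move together, as they typically do in weather data.

```python
def pearson_r(xs, ys):
    """Sample Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (sx * sy)

def vif(xs, ys):
    """Variance inflation factor of one regressor given one other;
    values above roughly 5-10 signal problematic multicollinearity."""
    r = pearson_r(xs, ys)
    return 1.0 / (1.0 - r * r)

# hypothetical daily outdoor temperature (C) and specific humidity (g/kg)
temp = [20, 25, 28, 30, 33, 35, 22, 27]
hum = [8.0, 9.5, 10.8, 11.5, 12.9, 13.6, 8.6, 10.3]
print(vif(temp, hum) > 10)  # strongly collinear regressors
```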
JT-60 configuration parameters for feedback control determined by regression analysis
Energy Technology Data Exchange (ETDEWEB)
Matsukawa, Makoto; Hosogane, Nobuyuki; Ninomiya, Hiromasa (Japan Atomic Energy Research Inst., Naka, Ibaraki (Japan). Naka Fusion Research Establishment)
1991-12-01
The stepwise regression procedure was applied to obtain measurement formulas for equilibrium parameters used in the feedback control of JT-60. This procedure automatically selects variables necessary for the measurements, and selects a set of variables which are not likely to be picked up by physical considerations. Regression equations with stable and small multicollinearity were obtained and it was experimentally confirmed that the measurement formulas obtained through this procedure were accurate enough to be applicable to the feedback control of plasma configurations in JT-60. (author).
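A simplified, stagewise cousin of the stepwise procedure can be sketched as greedy selection on residuals. The "signals" below are hypothetical stand-ins for diagnostic measurements, and real stepwise regression additionally drops variables and applies F-tests at each step.

```python
def fit_simple(x, y):
    """OLS fit of y = a + b*x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b

def forward_stagewise(candidates, y, steps=2):
    """Greedy selection: at each step, regress the current residuals on every
    unused candidate and keep the one that cuts the residual sum of squares
    the most."""
    resid, order = y[:], []
    for _ in range(steps):
        best, best_sse, best_fit = None, None, None
        for name, x in candidates.items():
            if name in order:
                continue
            a, b = fit_simple(x, resid)
            s = sum((r - (a + b * xi)) ** 2 for xi, r in zip(x, resid))
            if best_sse is None or s < best_sse:
                best, best_sse, best_fit = name, s, (a, b)
        order.append(best)
        a, b = best_fit
        resid = [r - (a + b * xi) for xi, r in zip(candidates[best], resid)]
    return order

# hypothetical signals: x1 drives the target, x2 is nearly constant noise
x1 = [0.1, 0.4, 0.5, 0.9, 1.2, 1.5, 1.9, 2.2]
x2 = [1.0, 0.9, 1.1, 1.0, 0.9, 1.1, 1.0, 0.9]
y = [0.25, 0.82, 1.05, 1.81, 2.40, 3.05, 3.78, 4.43]  # ~ 2 * x1
selected = forward_stagewise({"x1": x1, "x2": x2}, y)
print(selected)  # 'x1' is selected first
```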
Directory of Open Access Journals (Sweden)
Waters Lauren
2012-08-01
Full Text Available Abstract Background Unanticipated control group improvements have been observed in intervention trials targeting various health behaviours. This phenomenon has not been studied in the context of behavioural weight loss intervention trials. The purpose of this study is to conduct a systematic review and meta-regression of behavioural weight loss interventions to quantify control group weight change, and relate the size of this effect to specific trial and sample characteristics. Methods Database searches identified reports of intervention trials meeting the inclusion criteria. Data on control group weight change and possible explanatory factors were abstracted and analysed descriptively and quantitatively. Results 85 trials were reviewed and 72 were included in the meta-regression. While there was no change in control group weight, control groups receiving usual care lost 1 kg more than control groups that received no intervention, beyond measurement. Conclusions There are several possible explanations why control group changes occur in intervention trials targeting other behaviours, but not for weight loss. Control group participation may prevent weight gain, although more research is needed to confirm this hypothesis.
Microcomputer-controlled thermoluminescent analyser IJS MR-200
International Nuclear Information System (INIS)
Mihelic, M.; Miklavzic, U.; Rupnik, Z.; Satalic, P.; Spreizer, F.; Zerovnik, I.
1985-01-01
Performances and concept of the multipurpose, microcomputer-controlled thermoluminescent analyser, designed for use in laboratory work with TL dosemeters as well as for routine dose readings in the range from ecological to accident doses, are described. The main features of the analyser are: time-linear sampling, digitalisation, storing, and subsequent displaying on the monitor time scale of the glow and temperature curves of the TL material; digital stabilization, control and diagnostics of the analog unit; ability to store 7 different 8-parametric heating programs; ability to store 15 evaluation programs defined by 2 or 4 parameters and 3 different algorithms (altogether 5 types of evaluations). The analyser has several features intended for routine work: 9 function keys and possibilities of file forming on cassette or display disc, of dose calculation and averaging, of printing reports with names, and the possibility of additional programming in Basic. (author)
Laszlo, Sarah; Federmeier, Kara D.
2010-01-01
Linking print with meaning tends to be divided into subprocesses, such as recognition of an input's lexical entry and subsequent access of semantics. However, recent results suggest that the set of semantic features activated by an input is broader than implied by a view wherein access serially follows recognition. EEG was collected from participants who viewed items varying in number and frequency of both orthographic neighbors and lexical associates. Regression analysis of single item ERPs replicated past findings, showing that N400 amplitudes are greater for items with more neighbors, and further revealed that N400 amplitudes increase for items with more lexical associates and with higher frequency neighbors or associates. Together, the data suggest that in the N400 time window semantic features of items broadly related to inputs are active, consistent with models in which semantic access takes place in parallel with stimulus recognition. PMID:20624252
Robust estimation for homoscedastic regression in the secondary analysis of case-control data
Wei, Jiawei
2012-12-04
Primary analysis of case-control studies focuses on the relationship between disease D and a set of covariates of interest (Y, X). A secondary application of the case-control study, which is often invoked in modern genetic epidemiologic association studies, is to investigate the interrelationship between the covariates themselves. The task is complicated owing to the case-control sampling, where the regression of Y on X is different from what it is in the population. Previous work has assumed a parametric distribution for Y given X and derived semiparametric efficient estimation and inference without any distributional assumptions about X. We take up the issue of estimation of a regression function when Y given X follows a homoscedastic regression model, but otherwise the distribution of Y is unspecified. The semiparametric efficient approaches can be used to construct semiparametric efficient estimates, but they suffer from a lack of robustness to the assumed model for Y given X. We take an entirely different approach. We show how to estimate the regression parameters consistently even if the assumed model for Y given X is incorrect, and thus the estimates are model robust. For this we make the assumption that the disease rate is known or well estimated. The assumption can be dropped when the disease is rare, which is typically so for most case-control studies, and the estimation algorithm simplifies. Simulations and empirical examples are used to illustrate the approach.
Support vector regression model based predictive control of water level of U-tube steam generators
Energy Technology Data Exchange (ETDEWEB)
Kavaklioglu, Kadir, E-mail: kadir.kavaklioglu@pau.edu.tr
2014-10-15
Highlights: • Water level of U-tube steam generators was controlled in a model predictive fashion. • Models for steam generator water level were built using support vector regression. • Cost function minimization for future optimal controls was performed by using the steepest descent method. • The results indicated the feasibility of the proposed method. - Abstract: A predictive control algorithm using support vector regression based models was proposed for controlling the water level of U-tube steam generators of pressurized water reactors. Steam generator data were obtained using a transfer function model of U-tube steam generators. Support vector regression based models were built using a time series type model structure for five different operating powers. Feedwater flow controls were calculated by minimizing a cost function that includes the level error, the feedwater change and the mismatch between feedwater and steam flow rates. The proposed algorithm was applied to a scenario consisting of a level setpoint change and a steam flow disturbance. The results showed that steam generator level can be controlled effectively at all powers by the proposed method.
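The cost-minimization step can be sketched with a deliberately simple surrogate in place of the SVR model: a one-step linear level model with an assumed gain. All constants, penalty weights and operating values below are invented for illustration.

```python
def predict_level(level, feed, steam, gain=0.05):
    """One-step surrogate model: level rises with the feed/steam imbalance."""
    return level + gain * (feed - steam)

def cost(u, level, steam, setpoint, u_prev, lam=0.1, mu=0.05):
    """Penalize level error, feedwater change and feed/steam mismatch."""
    e = predict_level(level, u, steam) - setpoint
    return e ** 2 + lam * (u - u_prev) ** 2 + mu * (u - steam) ** 2

def optimal_feed(level, steam, setpoint, u_prev, step=0.5, iters=300, eps=1e-4):
    """Minimize the cost over the next feedwater command by steepest descent
    with a central-difference numerical gradient."""
    u = u_prev
    for _ in range(iters):
        g = (cost(u + eps, level, steam, setpoint, u_prev)
             - cost(u - eps, level, steam, setpoint, u_prev)) / (2 * eps)
        u -= step * g
    return u

u_prev, steam, level, setpoint = 100.0, 100.0, 12.0, 12.5
u = optimal_feed(level, steam, setpoint, u_prev)
print(u > u_prev)  # raises feedwater to lift the level toward the setpoint
```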
Directory of Open Access Journals (Sweden)
Abdelfattah M. Selim
2018-03-01
Aim: The present cross-sectional study was conducted to determine the seroprevalence and potential risk factors associated with Bovine viral diarrhea virus (BVDV) disease in cattle and buffaloes in Egypt, to model the potential risk factors associated with the disease using logistic regression (LR) models, and to fit the best predictive model for the current data. Materials and Methods: A total of 740 blood samples were collected between November 2012 and March 2013 from animals aged between 6 months and 3 years. The potential risk factors studied were species, age, sex, and herd location. All serum samples were examined with an indirect ELISA test for antibody detection. Data were analyzed with different statistical approaches such as Chi-square test, odds ratios (OR), univariable, and multivariable LR models. Results: Results revealed a non-significant association between being seropositive with BVDV and all risk factors, except for species of animal. Seroprevalence percentages were 40% and 23% for cattle and buffaloes, respectively. ORs for all categories were close to one, with the highest OR for cattle relative to buffaloes, which was 2.237. Likelihood ratio tests showed a significant drop of the -2LL from univariable LR to multivariable LR models. Conclusion: There was evidence of high seroprevalence of BVDV among cattle as compared with buffaloes, with the possibility of infection in different age groups of animals. In addition, the multivariable LR model was proved to provide more information for association and prediction purposes relative to univariable LR models and Chi-square tests when more than one predictor is available.
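The species effect can be reproduced approximately from a 2x2 table. The counts below are hypothetical, chosen so that 40% of 370 cattle and about 23% of 370 buffaloes are seropositive, which yields an odds ratio close to the reported 2.237.

```python
import math

def odds_ratio(a, b, c, d):
    """Odds ratio and Woolf 95% CI from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    est = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(est) - 1.96 * se)
    hi = math.exp(math.log(est) + 1.96 * se)
    return est, lo, hi

# hypothetical counts: cattle 148 seropositive / 222 seronegative,
# buffaloes 85 seropositive / 285 seronegative
est, lo, hi = odds_ratio(148, 222, 85, 285)
print(round(est, 3))  # close to the reported 2.237
```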
Botha, J.; De Ridder, J.H.; Potgieter, J.C.; Steyn, H.S.; Malan, L.
2013-01-01
A recently proposed model for waist circumference cut points (RPWC), driven by increased blood pressure, was demonstrated in an African population. We therefore aimed to validate the RPWC by comparing the RPWC and the Joint Statement Consensus (JSC) models via Logistic Regression (LR) and Neural Networks (NN) analyses. Urban African gender groups (N=171) were stratified according to the JSC and RPWC cut point models. Ultrasound carotid intima media thickness (CIMT), blood pressure (BP) and fa...
Misyura, Maksym; Sukhai, Mahadeo A; Kulasignam, Vathany; Zhang, Tong; Kamel-Reid, Suzanne; Stockley, Tracy L
2018-02-01
A standard approach in test evaluation is to compare results of the assay in validation to results from previously validated methods. For quantitative molecular diagnostic assays, comparison of test values is often performed using simple linear regression and the coefficient of determination (R²), using R² as the primary metric of assay agreement. However, the use of R² alone does not adequately quantify constant or proportional errors required for optimal test evaluation. More extensive statistical approaches, such as Bland-Altman and expanded interpretation of linear regression methods, can be used to more thoroughly compare data from quantitative molecular assays. We present the application of Bland-Altman and linear regression statistical methods to evaluate quantitative outputs from next-generation sequencing (NGS) assays. NGS-derived data sets from assay validation experiments were used to demonstrate the utility of the statistical methods. Both Bland-Altman and linear regression were able to detect the presence and magnitude of constant and proportional error in quantitative values of NGS data. Deming linear regression was used in the context of assay comparison studies, while simple linear regression was used to analyse serial dilution data. The Bland-Altman statistical approach was also adapted to quantify assay accuracy, including constant and proportional errors, and precision where theoretical and empirical values were known. The complementary application of the statistical methods described in this manuscript enables more extensive evaluation of performance characteristics of quantitative molecular assays, prior to implementation in the clinical molecular laboratory. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
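Why R² alone is not enough: a constant bias between two assays can leave R² near 1 yet shows up immediately in Bland-Altman statistics. A minimal sketch with invented paired measurements:

```python
def bland_altman(x, y):
    """Bias (mean difference) and 95% limits of agreement between two assays."""
    diffs = [a - b for a, b in zip(x, y)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = (sum((d - bias) ** 2 for d in diffs) / (n - 1)) ** 0.5
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# hypothetical paired values: the new assay reads slightly high everywhere,
# so correlation (and R-squared) with the reference stays near 1
new_assay = [10.2, 20.1, 30.3, 40.2, 50.1]
reference = [10.0, 20.0, 30.0, 40.0, 50.0]
bias, loa_lo, loa_hi = bland_altman(new_assay, reference)
print(round(bias, 2))  # constant positive bias that R-squared alone would hide
```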
A Note on Penalized Regression Spline Estimation in the Secondary Analysis of Case-Control Data
Gazioglu, Suzan; Wei, Jiawei; Jennings, Elizabeth M.; Carroll, Raymond J.
2013-01-01
Primary analysis of case-control studies focuses on the relationship between disease (D) and a set of covariates of interest (Y, X). A secondary application of the case-control study, often invoked in modern genetic epidemiologic association studies, is to investigate the interrelationship between the covariates themselves. The task is complicated due to the case-control sampling, and to avoid the biased sampling that arises from the design, it is typical to use the control data only. In this paper, we develop penalized regression spline methodology that uses all the data, and improves precision of estimation compared to using only the controls. A simulation study and an empirical example are used to illustrate the methodology.
Doping control analyses in horseracing: a clinician's guide.
Wong, Jenny K Y; Wan, Terence S M
2014-04-01
Doping in sports is highly detrimental, not only to the athletes involved but to the sport itself as well as to the confidence of the spectators and other participants. To protect the integrity of any sport, there must be in place an effective doping control program. In human sports, a 'top-down' and generally unified approach is taken where the rules and regulations against doping for the majority of elite sport events held in any country are governed by the World Anti-Doping Agency (WADA). However, in horseracing, there is no single organisation regulating this form of equestrian sport; instead, the rules and regulations are provided by individual racing authorities and so huge variations exist in the doping control programs currently in force around the world. This review summarises the current status of doping control analyses in horseracing, from sample collection, to the analyses of the samples, and to the need for harmonisation as well as exploring some of the difficulties currently faced by racing authorities, racing chemists and regulatory veterinarians worldwide. Copyright © 2014 Elsevier Ltd. All rights reserved.
Systematic realisation of control flow analyses for CML
DEFF Research Database (Denmark)
Gasser, K.L.S.; Nielson, Flemming; Nielson, Hanne Riis
1997-01-01
We present a methodology for the systematic realisation of control flow analyses and illustrate it for Concurrent ML. We start with an abstract specification of the analysis that is next proved semantically sound with respect to a traditional small-step operational semantics; this result holds......) to be defined in a syntax-directed manner, and (iii) to generate a set of constraints that subsequently can be solved by standard techniques. We prove equivalence results between the different versions of the analysis; in particular it follows that the least solution to the constraints generated...
Dyer, Betsey D.; Kahn, Michael J.; LeBlanc, Mark D.
2008-01-01
Classification and regression tree (CART) analysis was applied to genome-wide tetranucleotide frequencies (genomic signatures) of 195 archaea and bacteria. Although genomic signatures have typically been used to classify evolutionary divergence, in this study, convergent evolution was the focus. Temperature optima for most of the organisms examined could be distinguished by CART analyses of tetranucleotide frequencies. This suggests that pervasive (nonlinear) qualities of genomes may reflect certain environmental conditions (such as temperature) in which those genomes evolved. The predominant use of GAGA and AGGA as the discriminating tetramers in CART models suggests that purine-loading and codon biases of thermophiles may explain some of the results. PMID:19054742
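CART builds its trees from a sequence of binary splits, each chosen to minimize node impurity (here, Gini impurity). The sketch below is a minimal single-split (decision-stump) version of that idea in plain NumPy; the GAGA tetramer frequencies and temperature labels are entirely simulated and only illustrate the splitting criterion, not the authors' genome analysis.

```python
import numpy as np

def best_split(x, y):
    """Find the threshold on one feature minimizing weighted Gini impurity,
    as CART does at each tree node (binary labels 0/1 assumed)."""
    order = np.argsort(x)
    xs, ys = x[order], y[order]
    best = (np.inf, None)
    for i in range(1, len(xs)):
        if xs[i] == xs[i - 1]:
            continue
        thr = (xs[i] + xs[i - 1]) / 2
        left, right = ys[xs <= thr], ys[xs > thr]
        gini = sum(len(g) / len(ys) * (1 - sum(np.mean(g == c) ** 2 for c in (0, 1)))
                   for g in (left, right))
        if gini < best[0]:
            best = (gini, thr)
    return best

# Hypothetical GAGA tetramer frequencies: mesophiles (0) vs thermophiles (1)
rng = np.random.default_rng(0)
freq = np.concatenate([rng.normal(0.002, 0.0005, 50),   # mesophiles
                       rng.normal(0.005, 0.0005, 50)])  # thermophiles
label = np.concatenate([np.zeros(50), np.ones(50)]).astype(int)
gini, threshold = best_split(freq, label)
accuracy = ((freq > threshold).astype(int) == label).mean()
```

A full CART model simply applies this search recursively to each resulting child node.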
Risk factors of regression and undercorrection in photorefractive keratectomy: a case-control study
Directory of Open Access Journals (Sweden)
Seyed-Farzad Mohammadi
2015-10-01
AIM: To determine risk factors of regression and undercorrection following photorefractive keratectomy (PRK) in myopia or myopic astigmatism. METHODS: A case-control study was designed in which eyes with an indication for re-treatment (RT) were defined as cases; primary criteria for RT indication, as assessed at least 9mo postoperatively, included an uncorrected distance visual acuity (UDVA) of 20/30 or worse and a stable refraction for more than 3mo. Additional considerations included optical quality symptoms and significant higher order aberrations (HOAs). Controls were chosen from the same cohort of operated eyes which had complete post-operative follow-up data beyond 9mo and did not need RT. The cohort included patients who had undergone PRK by the Tissue-Saving (TS) ablation profile of the Technolas 217z100 excimer laser (Bausch & Lomb, Rochester, NY, USA). Mitomycin C had been used in all of the primary procedures. RESULTS: We had 70 case eyes and 158 control eyes, and they were comparable in terms of age, sex and follow-up time (P values: 0.58, 1.00 and 0.89, respectively). Pre-operative spherical equivalent of more than -5.00 diopter (D), intended optical zone (OZ) diameter of less than 6.00 mm and ocular fixation instability during laser ablation were associated with RT indications (all P values <0.001). These factors maintained their significance in the multiple logistic regression model with odds ratios of 6.12, 6.71 and 7.89, respectively. CONCLUSION: Higher refractive correction (>-5.00 D), smaller OZ (<6.00 mm) and unstable fixation during laser ablation of PRK for myopia and myopic astigmatism were found to be strong predictors of undercorrection and regression.
Bowden, Jack; Del Greco M, Fabiola; Minelli, Cosetta; Davey Smith, George; Sheehan, Nuala A; Thompson, John R
2016-12-01
MR-Egger regression has recently been proposed as a method for Mendelian randomization (MR) analyses incorporating summary data estimates of causal effect from multiple individual variants, which is robust to invalid instruments. It can be used to test for directional pleiotropy and provides an estimate of the causal effect adjusted for its presence. MR-Egger regression provides a useful additional sensitivity analysis to the standard inverse variance weighted (IVW) approach that assumes all variants are valid instruments. Both methods use weights that consider the single nucleotide polymorphism (SNP)-exposure associations to be known, rather than estimated. We call this the 'NO Measurement Error' (NOME) assumption. Causal effect estimates from the IVW approach exhibit weak instrument bias whenever the genetic variants utilized violate the NOME assumption, which can be reliably measured using the F-statistic. The effect of NOME violation on MR-Egger regression has yet to be studied. An adaptation of the I² statistic from the field of meta-analysis is proposed to quantify the strength of NOME violation for MR-Egger. It lies between 0 and 1, and indicates the expected relative bias (or dilution) of the MR-Egger causal estimate in the two-sample MR context. We call it I²GX. The method of simulation extrapolation is also explored to counteract the dilution. Their joint utility is evaluated using simulated data and applied to a real MR example. In simulated two-sample MR analyses we show that, when a causal effect exists, the MR-Egger estimate of causal effect is biased towards the null when NOME is violated, and the stronger the violation (as indicated by lower values of I²GX), the stronger the dilution. When additionally all genetic variants are valid instruments, the type I error rate of the MR-Egger test for pleiotropy is inflated and the causal effect underestimated. Simulation extrapolation is shown to substantially mitigate these adverse effects.
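A minimal sketch of the two quantities discussed above, on simulated summary statistics: a weighted MR-Egger regression of SNP-outcome effects on SNP-exposure effects, and the I²GX dilution statistic, computed here as Cochran's Q over the SNP-exposure estimates. All numbers are invented for illustration; this is not the authors' software.

```python
import numpy as np

def mr_egger(beta_x, beta_y, se_y):
    """Weighted MR-Egger regression: beta_y ~ intercept + slope * beta_x with
    weights 1/se_y^2. Slope = causal estimate; intercept tests pleiotropy."""
    w = 1.0 / se_y ** 2
    X = np.column_stack([np.ones_like(beta_x), beta_x])
    WX = X * w[:, None]
    return np.linalg.solve(X.T @ WX, WX.T @ beta_y)  # [intercept, slope]

def i2_gx(beta_x, se_x):
    """I^2_GX: expected relative dilution of the MR-Egger slope when the
    NO-Measurement-Error (NOME) assumption is violated (0 = full dilution)."""
    w = 1.0 / se_x ** 2
    mu = np.sum(w * beta_x) / np.sum(w)
    q = np.sum(w * (beta_x - mu) ** 2)        # Cochran's Q of SNP-exposure betas
    return max(0.0, (q - (len(beta_x) - 1)) / q)

rng = np.random.default_rng(1)
k = 30
bx = rng.uniform(0.05, 0.3, k)                # SNP-exposure effects
by = 0.5 * bx + rng.normal(0, 0.01, k)        # true causal slope 0.5, no pleiotropy
intercept, slope = mr_egger(bx, by, np.full(k, 0.01))
dilution = i2_gx(bx, np.full(k, 0.005))       # near 1 => NOME approximately holds
```

With precisely estimated SNP-exposure effects, I²GX is close to 1 and the slope is essentially undiluted.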
Project W-320 SAR and process control thermal analyses
International Nuclear Information System (INIS)
Sathyanarayana, K.
1997-01-01
This report summarizes the results of thermal hydraulic computer modeling supporting Project W-320 for process control and SAR documentation. Parametric analyses were performed for the maximum steady state waste temperature. The parameters included heat load distribution, tank heat load, fluffing factor and thermal conductivity. Uncertainties in the fluffing factor and heat load distribution had the largest effect on maximum waste temperature. Safety analyses were performed for off-normal events including loss of ventilation, loss of evaporation and loss of secondary chiller. The loss of both the primary and secondary ventilation was found to be the most limiting event, with saturation temperature in the bottom waste being reached in just over 30 days. An evaluation was performed for the potential lowering of the supernatant level in tank 241-AY-102. The evaluation included a loss of ventilation and steam bump analysis. The reduced supernatant level decreased the time to reach saturation temperature in the waste for the loss of ventilation by about one week. However, the consequences of a steam bump were dramatically reduced.
Directory of Open Access Journals (Sweden)
Chan-Yun Yang
2013-01-01
Austempered ductile iron has emerged as a notable material in several engineering fields, including marine applications. The initial austenite carbon content after the austenitization transformation but before the austempering process that generates the bainite matrix proved critical in controlling the resulting microstructure and thus the mechanical properties. In this paper, support vector regression is employed in order to establish a relationship between the initial carbon concentration in the austenite and the austenitization temperature and alloy contents, thereby exercising improved control over the mechanical properties of the austempered ductile irons. Particularly, the paper emphasizes a methodology tailored to deal with a limited amount of available data with an intrinsically contracted and skewed distribution. The collected information from a variety of data sources presents another challenge of highly uncertain variance. The authors present a hybrid model consisting of a histogram equalizer procedure and a support-vector-machine (SVM) based regression procedure to gain a more robust relationship able to respond to these challenges. The results show greatly improved accuracy of the proposed model in comparison to two former established methodologies. The sum squared error of the present model is less than one fifth of that of the two previous models.
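The hybrid model's first stage, histogram equalization, can be sketched as a rank transform that spreads a contracted, skewed sample evenly over [0, 1] before the SVM regression stage (omitted here). This is a generic illustration on simulated data, assuming continuous (tie-free) measurements, and is not the authors' implementation.

```python
import numpy as np

def histogram_equalize(x):
    """Map a skewed sample to an approximately uniform [0, 1] scale by
    replacing each value with its empirical quantile (rank-based equalization)."""
    ranks = np.argsort(np.argsort(x))          # 0..n-1, preserving order of x
    return (ranks + 0.5) / len(x)

rng = np.random.default_rng(2)
skewed = rng.exponential(scale=2.0, size=1000)  # contracted, skewed measurements
equalized = histogram_equalize(skewed)          # uniform-ish, order-preserving
```

The transform is monotone, so the regression stage sees the same ordering of samples but on a scale where the skew no longer dominates the fit.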
Correcting for cryptic relatedness by a regression-based genomic control method
Directory of Open Access Journals (Sweden)
Yang Yaning
2009-12-01
Background: The genomic control (GC) method is a useful tool to correct for cryptic relatedness in population-based association studies. It was originally proposed for correcting the variance inflation of Cochran-Armitage's additive trend test by using information from unlinked null markers, and was later generalized to be applicable to other tests, with the additional requirement that the null markers are matched with the candidate marker in allele frequencies. However, matching allele frequencies limits the number of available null markers and thus limits the applicability of the GC method. On the other hand, errors in genotype/allele frequencies may cause further bias and variance inflation and thereby aggravate the effect of GC correction. Results: In this paper, we propose a regression-based GC method using null markers that are not necessarily matched in allele frequencies with the candidate marker. Variation in allele frequencies of the null markers is adjusted by a regression method. Conclusion: The proposed method can be readily applied to the Cochran-Armitage trend tests other than the additive trend test, the Pearson chi-square test and other robust efficiency tests. Simulation results show that the proposed method is effective in controlling type I error in the presence of population substructure.
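For context, the classic (non-regression) genomic control correction that this paper generalizes can be sketched as follows: estimate the inflation factor λ from the median of the null-marker chi-square statistics and deflate the candidate statistic by it. The regression adjustment for allele frequencies is omitted, and all data are simulated.

```python
import numpy as np

def genomic_control(chi2_candidate, chi2_null):
    """Classic genomic control: lambda = median of null-marker chi-square
    statistics divided by the chi-square(1) median (~0.4549); the candidate
    test statistic is then deflated by lambda (floored at 1)."""
    lam = max(1.0, np.median(chi2_null) / 0.4549364)
    return chi2_candidate / lam, lam

rng = np.random.default_rng(3)
inflation = 1.4                                  # cryptic relatedness inflates variance
null_stats = inflation * rng.chisquare(1, size=5000)
corrected, lam = genomic_control(10.0, null_stats)
```

With 5000 null markers the median-based λ recovers the simulated inflation of 1.4 quite closely.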
Xie, Heping; Wang, Fuxing; Hao, Yanbin; Chen, Jiaxue; An, Jing; Wang, Yuxin; Liu, Huashan
2017-01-01
Cueing facilitates retention and transfer of multimedia learning. From the perspective of cognitive load theory (CLT), cueing has a positive effect on learning outcomes because of the reduction in total cognitive load and avoidance of cognitive overload. However, this has not been systematically evaluated. Moreover, what remains ambiguous is the direct relationship between cue-related cognitive load and learning outcomes. A meta-analysis and two subsequent meta-regression analyses were conducted to explore these issues. Subjective total cognitive load (SCL) and scores on a retention test and transfer test were selected as dependent variables. Through a systematic literature search, 32 eligible articles encompassing 3,597 participants were included in the SCL-related meta-analysis. Among them, 25 articles containing 2,910 participants were included in the retention-related meta-analysis and the following retention-related meta-regression, while 29 articles containing 3,204 participants were included in the transfer-related meta-analysis and the transfer-related meta-regression. The meta-analysis revealed a statistically significant cueing effect on subjective ratings of cognitive load (d = -0.11, 95% CI = [-0.19, -0.02], p < 0.05), retention performance (d = 0.27, 95% CI = [0.08, 0.46], p < 0.01), and transfer performance (d = 0.34, 95% CI = [0.12, 0.56], p < 0.01). The subsequent meta-regression analyses showed that d(SCL) for cueing significantly predicted d(retention) for cueing (β = -0.70, 95% CI = [-1.02, -0.38], p < 0.001), as well as d(transfer) for cueing (β = -0.60, 95% CI = [-0.92, -0.28], p < 0.001). Thus, in line with CLT, adding cues in multimedia materials can indeed reduce SCL and promote learning outcomes, and the more SCL is reduced by cues, the better the retention and transfer of multimedia learning.
Hutton, Eileen K; Simioni, Julia C; Thabane, Lehana
2017-08-01
Among women with a fetus with a non-cephalic presentation, external cephalic version (ECV) has been shown to reduce the rate of breech presentation at birth and cesarean birth. Compared with ECV at term, beginning ECV prior to 37 weeks' gestation decreases the number of infants in a non-cephalic presentation at birth. The purpose of this secondary analysis was to investigate factors associated with a successful ECV procedure and to present this in a clinically useful format. Data were collected as part of the Early ECV Pilot and Early ECV2 Trials, which randomized 1776 women with a fetus in breech presentation to either early ECV (34-36 weeks' gestation) or delayed ECV (at or after 37 weeks). The outcome of interest was successful ECV, defined as the fetus being in a cephalic presentation immediately following the procedure, as well as at the time of birth. The importance of several factors in predicting successful ECV was investigated using two statistical methods: logistic regression and classification and regression tree (CART) analyses. Among nulliparas, non-engagement of the presenting part and an easily palpable fetal head were independently associated with success. Among multiparas, non-engagement of the presenting part, gestation less than 37 weeks and an easily palpable fetal head were found to be independent predictors of success. These findings were consistent with results of the CART analyses. Regardless of parity, descent of the presenting part was the most discriminating factor in predicting successful ECV and cephalic presentation at birth. © 2017 Nordic Federation of Societies of Obstetrics and Gynecology.
Directory of Open Access Journals (Sweden)
Mirjam J Knol
BACKGROUND: In randomized controlled trials (RCTs), the odds ratio (OR) can substantially overestimate the risk ratio (RR) if the incidence of the outcome is over 10%. This study determined the frequency of use of ORs, the frequency of overestimation of the OR as compared with its accompanying RR in published RCTs, and we assessed how often regression models that calculate RRs were used. METHODS: We included 288 RCTs published in 2008 in five major general medical journals (Annals of Internal Medicine, British Medical Journal, Journal of the American Medical Association, Lancet, New England Journal of Medicine). If an OR was reported, we calculated the corresponding RR and the percentage of overestimation. RESULTS: Of 193 RCTs with a dichotomous primary outcome, 24 (12.4%) presented a crude and/or adjusted OR for the primary outcome. In five RCTs (2.6%), the OR differed more than 100% from its accompanying RR on the log scale. Forty-one of all included RCTs (n = 288; 14.2%) presented ORs for other outcomes, or for subgroup analyses. Nineteen of these RCTs (6.6%) had at least one OR that deviated more than 100% from its accompanying RR on the log scale. Of 53 RCTs that adjusted for baseline variables, 15 used logistic regression. Alternative methods to estimate RRs were only used in four RCTs. CONCLUSION: ORs and logistic regression are often used in RCTs and in many articles the OR did not approximate the RR. Although the authors did not explicitly misinterpret these ORs as RRs, misinterpretation by readers can seriously affect treatment decisions and policy making.
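The OR-to-RR relationship behind this comparison is often written via the Zhang-Yu approximation, RR = OR / (1 - p0 + p0·OR), where p0 is the control-group risk; it shows directly why the two measures diverge once the outcome is common. A small sketch with invented numbers (not taken from the reviewed trials):

```python
def or_to_rr(odds_ratio, p0):
    """Convert an odds ratio to a risk ratio given the control-group risk p0
    (Zhang-Yu approximation): RR = OR / (1 - p0 + p0 * OR)."""
    return odds_ratio / (1 - p0 + p0 * odds_ratio)

# With a common outcome (30% control-group risk), an OR of 2.5 overstates the RR:
rr = or_to_rr(2.5, 0.30)                      # ~1.72
overestimation = (2.5 - rr) / rr * 100        # percent by which the OR exceeds the RR
```

When p0 approaches 0 (a rare outcome) the denominator approaches 1 and OR ≈ RR, which is why the distortion only matters above roughly 10% incidence.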
The need to control for regression to the mean in social psychology studies
Directory of Open Access Journals (Sweden)
Rongjun Yu
2015-01-01
It is common in repeated measurements for extreme values at the first measurement to approach the mean at the subsequent measurement, a phenomenon called regression to the mean (RTM). If RTM is not fully controlled, it will lead to erroneous conclusions. The wide use of repeated measurements in social psychology creates a risk that an RTM effect will influence results. However, insufficient attention is paid to RTM in most social psychological research. Notable cases include studies on the phenomena of social conformity and unrealistic optimism. In Study 1, 13 university students rated and re-rated the facial attractiveness of a series of female faces as a test of the social conformity effect. In Study 2, 15 university students estimated and re-estimated their risk of experiencing a series of adverse life events as a test of the unrealistic optimism effect. Although these studies used methodologies similar to those used in earlier research, the social conformity and unrealistic optimism effects were no longer evident after controlling for RTM. Based on these findings, we suggest several ways to control for the RTM effect in social psychology studies.
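The RTM effect is a direct consequence of imperfect test-retest correlation: for a group selected on extreme first scores, the expected second score shrinks toward the mean by a factor of roughly r, with no intervention at all. A simulation sketch with arbitrary parameters (standardized scores, r = 0.6):

```python
import numpy as np

# Regression to the mean: select "high scorers" at time 1 and observe that,
# purely by construction, their time-2 mean is closer to the population mean.
rng = np.random.default_rng(4)
n, r = 200_000, 0.6
true = rng.normal(0, np.sqrt(r), n)            # shared component => corr(t1, t2) = r
t1 = true + rng.normal(0, np.sqrt(1 - r), n)   # both measurements have variance 1
t2 = true + rng.normal(0, np.sqrt(1 - r), n)
extreme = t1 > np.quantile(t1, 0.9)            # top decile at the first measurement
m1, m2 = t1[extreme].mean(), t2[extreme].mean()
shrinkage = m2 / m1                            # theory: E[t2 | t1] = r * t1
```

Any pre-post design that selects participants on extreme baseline scores must separate this purely statistical shrinkage from a genuine effect, e.g. via a control group retested under identical conditions.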
Greene, LaVana; Elzey, Brianda; Franklin, Mariah; Fakayode, Sayo O
2017-03-05
The negative health impact of polycyclic aromatic hydrocarbons (PAHs) and differences in the pharmacological activity of enantiomers of chiral molecules in humans highlight the need for analysis of PAHs and their chiral analogue molecules in humans. Herein, the first use of cyclodextrin guest-host inclusion complexation, fluorescence spectrophotometry, and a chemometric approach for analyses of a PAH (anthracene) and chiral-PAH analogue derivatives (1-(9-anthryl)-2,2,2-trifluoroethanol (TFE)) is reported. The binding constants (K_b), stoichiometry (n), and thermodynamic properties (Gibbs free energy (ΔG), enthalpy (ΔH), and entropy (ΔS)) of anthracene and enantiomers of TFE-methyl-β-cyclodextrin (Me-β-CD) guest-host complexes were also determined. Chemometric partial-least-squares (PLS) regression analysis of emission spectra data of Me-β-CD guest-host inclusion complexes was used for the determination of anthracene and TFE enantiomer concentrations in Me-β-CD guest-host inclusion complex samples. The values of the calculated K_b and negative ΔG suggest the thermodynamic favorability of anthracene-Me-β-CD and enantiomeric TFE-Me-β-CD inclusion complexation reactions. However, anthracene-Me-β-CD and enantiomeric TFE-Me-β-CD inclusion complexations showed notable differences in binding affinity behaviors and thermodynamic properties. The PLS regression analysis resulted in squared correlation coefficients of 0.997530 or better and low LODs of 3.81×10^-7 M for anthracene and 3.48×10^-8 M for TFE enantiomers at physiological conditions. Most importantly, PLS regression accurately determined the anthracene and TFE enantiomer concentrations with an average low error of 2.31% for anthracene, 4.44% for R-TFE and 3.60% for S-TFE. The results of the study are highly significant because of its high sensitivity and accuracy for analysis of PAH and chiral PAH analogue derivatives without the need of an expensive chiral column, enantiomeric resolution, or use of a polarized
Peluso, Marco E M; Munnia, Armelle; Ceppi, Marcello
2014-11-05
Exposures to bisphenol-A, a weak estrogenic chemical largely used for the production of plastic containers, can affect rodent behaviour. Thus, we examined the relationships between bisphenol-A and anxiety-like behaviour, spatial skills, and aggressiveness in 12 toxicity studies of rodent offspring from females orally exposed to bisphenol-A, while pregnant and/or lactating, by median and linear splines analyses. Subsequently, meta-regression analysis was applied to quantify the behavioural changes. U-shaped, inverted U-shaped and J-shaped dose-response curves were found to describe the relationships between bisphenol-A and the behavioural outcomes. The occurrence of anxiogenic-like effects and spatial skill changes displayed U-shaped and inverted U-shaped curves, respectively, providing examples of effects that are observed at low doses. Conversely, a J-shaped dose-response relationship was observed for aggressiveness. When the proportion of rodents expressing certain traits or the time that they employed to manifest an attitude was analysed, the meta-regression indicated that a borderline significant increment of anxiogenic-like effects was present at low doses regardless of sex (β = -0.8%, 95% C.I. -1.7/0.1, P = 0.076) at ≤120 μg bisphenol-A. In contrast, only bisphenol-A males exhibited a significant inhibition of spatial skills (β = 0.7%, 95% C.I. 0.2/1.2, P = 0.004) at ≤100 μg/day. A significant increment of aggressiveness was observed in both sexes (β = 67.9, C.I. 3.4/172.5, P = 0.038) at >4.0 μg. Bisphenol-A treatments significantly abrogated spatial learning and ability in males; low doses of bisphenol-A, e.g. ≤120 μg/day, were associated with behavioural aberrations in offspring. Copyright © 2014. Published by Elsevier Ireland Ltd.
TEMPERATURE DISTRIBUTION MONITORING AND ANALYSES AT DIFFERENT HEATING CONTROL PRINCIPLES
DEFF Research Database (Denmark)
Simone, Angela; Rode, Carsten; Olesen, Bjarne W.
2010-01-01
under different control strategies of the heating system (Pseudo Random Binary Sequence signal controlling all the heaters (PRBS) or thermostatic control of the heaters (THERM)). A comparison of the measured temperatures within the room, for the five series of experiments, shows a better correlation...
Oldenburg, Catherine E; Venkatesh Prajna, N; Krishnan, Tiruvengada; Rajaraman, Revathi; Srinivasan, Muthiah; Ray, Kathryn J; O'Brien, Kieran S; Glymour, M Maria; Porco, Travis C; Acharya, Nisha R; Rose-Nussbaumer, Jennifer; Lietman, Thomas M
2018-08-01
We compare results from regression discontinuity (RD) analysis to primary results of a randomized controlled trial (RCT) utilizing data from two contemporaneous RCTs for treatment of fungal corneal ulcers. Patients were enrolled in the Mycotic Ulcer Treatment Trials I and II (MUTT I & MUTT II) based on baseline visual acuity: patients with acuity ≤ 20/400 (logMAR 1.3) enrolled in MUTT I, and >20/400 in MUTT II. MUTT I investigated the effect of topical natamycin versus voriconazole on best spectacle-corrected visual acuity. MUTT II investigated the effect of topical voriconazole plus placebo versus topical voriconazole plus oral voriconazole. We compared the RD estimate (natamycin arm of MUTT I [N = 162] versus placebo arm of MUTT II [N = 54]) to the RCT estimate from MUTT I (topical natamycin [N = 162] versus topical voriconazole [N = 161]). In the RD analysis, patients receiving natamycin had a mean improvement of 4 lines of visual acuity at 3 months (logMAR -0.39, 95% CI: -0.61, -0.17) compared to topical voriconazole plus placebo, versus 2 lines in the RCT (logMAR -0.18, 95% CI: -0.30, -0.05) compared to topical voriconazole. The RD and RCT estimates were similar, although the RD design overestimated effects compared to the RCT.
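A minimal form of the RD estimator being compared above: fit separate linear trends on each side of the enrollment cutoff and take the difference of their predictions at the threshold. The data below are simulated with a known jump and bear no relation to the MUTT results.

```python
import numpy as np

def rd_estimate(x, y, cutoff):
    """Regression discontinuity: fit a linear trend on each side of the cutoff
    and return the difference of their predictions at the threshold."""
    below, above = x < cutoff, x >= cutoff
    fit_below = np.polyfit(x[below], y[below], 1)
    fit_above = np.polyfit(x[above], y[above], 1)
    return np.polyval(fit_above, cutoff) - np.polyval(fit_below, cutoff)

# Simulated acuity-style running variable with a known treatment jump of -0.4
rng = np.random.default_rng(5)
x = rng.uniform(0.5, 2.0, 600)                       # baseline logMAR acuity
jump = -0.4
y = 0.3 * x + jump * (x >= 1.3) + rng.normal(0, 0.05, 600)
effect = rd_estimate(x, y, 1.3)                      # recovers ~ -0.4
```

Practical RD analyses typically restrict the fit to a bandwidth around the cutoff and use local-linear weighting; the global linear fit here is the simplest special case.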
Fossati, Andrea; Widiger, Thomas A; Borroni, Serena; Maffei, Cesare; Somma, Antonella
2017-06-01
To extend the evidence on the reliability and construct validity of the Five-Factor Model Rating Form (FFMRF) in its self-report version, two independent samples of Italian participants, which were composed of 510 adolescent high school students and 457 community-dwelling adults, respectively, were administered the FFMRF in its Italian translation. Adolescent participants were also administered the Italian translation of the Borderline Personality Features Scale for Children-11 (BPFSC-11), whereas adult participants were administered the Italian translation of the Triarchic Psychopathy Measure (TriPM). Cronbach α values were consistent with previous findings; in both samples, average interitem r values indicated acceptable internal consistency for all FFMRF scales. A multidimensional graded item response theory model indicated that the majority of FFMRF items had adequate discrimination parameters; information indices supported the reliability of the FFMRF scales. Both categorical (i.e., item-level) and scale-level regression analyses suggested that the FFMRF scores may predict a nonnegligible amount of variance in the BPFSC-11 total score in adolescent participants, and in the TriPM scale scores in adult participants.
Choi, Seung Hoan; Labadorf, Adam T; Myers, Richard H; Lunetta, Kathryn L; Dupuis, Josée; DeStefano, Anita L
2017-02-06
Next generation sequencing provides a count of RNA molecules in the form of short reads, yielding discrete, often highly non-normally distributed gene expression measurements. Although Negative Binomial (NB) regression has been generally accepted in the analysis of RNA sequencing (RNA-Seq) data, its appropriateness has not been exhaustively evaluated. We explore logistic regression as an alternative method for RNA-Seq studies designed to compare cases and controls, where disease status is modeled as a function of RNA-Seq reads using simulated and Huntington disease data. We evaluate the effect of adjusting for covariates that have an unknown relationship with gene expression. Finally, we incorporate the data adaptive method in order to compare false positive rates. When the sample size is small or the expression levels of a gene are highly dispersed, the NB regression shows inflated Type-I error rates but the Classical logistic and Bayes logistic (BL) regressions are conservative. Firth's logistic (FL) regression performs well or is slightly conservative. Large sample size and low dispersion generally make Type-I error rates of all methods close to nominal alpha levels of 0.05 and 0.01. However, Type-I error rates are controlled after applying the data adaptive method. The NB, BL, and FL regressions gain increased power with large sample size, large log2 fold-change, and low dispersion. The FL regression has comparable power to NB regression. We conclude that implementing the data adaptive method appropriately controls Type-I error rates in RNA-Seq analysis. Firth's logistic regression provides a concise statistical inference process and reduces spurious associations from inaccurately estimated dispersion parameters in the negative binomial framework.
Samdal, Gro Beate; Eide, Geir Egil; Barth, Tom; Williams, Geoffrey; Meland, Eivind
2017-03-28
This systematic review aims to explain the heterogeneity in results of interventions to promote physical activity and healthy eating for overweight and obese adults, by exploring the differential effects of behaviour change techniques (BCTs) and other intervention characteristics. The inclusion criteria specified RCTs with ≥ 12 weeks' duration, from January 2007 to October 2014, for adults (mean age ≥ 40 years, mean BMI ≥ 30). Primary outcomes were measures of healthy diet or physical activity. Two reviewers rated study quality, coded the BCTs, and collected outcome results at short (≤6 months) and long term (≥12 months). Meta-analyses and meta-regressions were used to estimate effect sizes (ES), heterogeneity indices (I²) and regression coefficients. We included 48 studies containing a total of 82 outcome reports. The 32 long term reports had an overall ES = 0.24 with 95% confidence interval (CI): 0.15 to 0.33 and I² = 59.4%. The 50 short term reports had an ES = 0.37 with 95% CI: 0.26 to 0.48, and I² = 71.3%. The number of BCTs unique to the intervention group, and the BCTs goal setting and self-monitoring of behaviour, predicted the effect at short and long term. The total number of BCTs in both intervention arms and using the BCTs goal setting of outcome, feedback on outcome of behaviour, implementing graded tasks, and adding objects to the environment, e.g. using a step counter, significantly predicted the effect at long term. Setting a goal for change and the presence of reporting bias independently explained 58.8% of inter-study variation at short term. Autonomy supportive and person-centred methods as in Motivational Interviewing, the BCTs goal setting of behaviour, and receiving feedback on the outcome of behaviour, explained all of the between study variations in effects at long term. There are similarities, but also differences in effective BCTs promoting change in healthy eating and physical activity and
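The pooled effect sizes and I² indices of the kind reported above come from standard random-effects meta-analysis machinery. A DerSimonian-Laird sketch on simulated study-level effects follows; every number here is invented and unrelated to the review's data.

```python
import numpy as np

def dersimonian_laird(d, se):
    """Random-effects meta-analysis (DerSimonian-Laird): returns the pooled
    effect size and the I^2 heterogeneity index (in percent)."""
    w = 1.0 / se ** 2
    d_fixed = np.sum(w * d) / np.sum(w)            # fixed-effect pooled estimate
    q = np.sum(w * (d - d_fixed) ** 2)             # Cochran's Q
    df = len(d) - 1
    i2 = max(0.0, (q - df) / q) * 100              # heterogeneity index
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)                  # between-study variance
    w_star = 1.0 / (se ** 2 + tau2)
    return np.sum(w_star * d) / np.sum(w_star), i2

rng = np.random.default_rng(6)
k = 32
true_es = rng.normal(0.24, 0.15, k)                # heterogeneous study effects
se = rng.uniform(0.08, 0.2, k)                     # study standard errors
d = true_es + rng.normal(0, 1, k) * se             # observed effect sizes
pooled, i2 = dersimonian_laird(d, se)
```

Meta-regression then extends this by regressing the study effects on moderators (e.g. number of BCTs) with the same inverse-variance weighting.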
International intercalibration as a method for control of radiochemical analyses
International Nuclear Information System (INIS)
Angelova, A.; Totseva, R.; Karaivanova, R.; Dandulova, Z.; Botsova, L.
1994-01-01
The participation of the Radioecology Section at the National Centre for Radiology and Radiation Protection (NCRRP) in the International Interlaboratory Comparison of radiochemical analyses organized by WHO is reported. The method of evaluating the accuracy of results from intercalibrations concerning radionuclide determination of environmental samples is outlined. The data from analysis of cesium-137, strontium-90 and radium-226 in milk, sediments, soil and seaweed made by 21 laboratories are presented. They show good accuracy of the results from NCRRP. 1 tab., 2 figs., 6 refs
Keogh, Ruth H; Mangtani, Punam; Rodrigues, Laura; Nguipdop Djomo, Patrick
2016-01-05
Traditional analyses of standard case-control studies using logistic regression do not allow estimation of time-varying associations between exposures and the outcome. We present two approaches which allow this. The motivation is a study of vaccine efficacy as a function of time since vaccination. Our first approach is to estimate time-varying exposure-outcome associations by fitting a series of logistic regressions within successive time periods, reusing controls across periods. Our second approach treats the case-control sample as a case-cohort study, with the controls forming the subcohort. In the case-cohort analysis, controls contribute information at all times they are at risk. Extensions allow left truncation, frequency matching and, using the case-cohort analysis, time-varying exposures. Simulations are used to investigate the methods. The simulation results show that both methods give correct estimates of time-varying effects of exposures using standard case-control data. Using the logistic approach there are efficiency gains by reusing controls over time and care should be taken over the definition of controls within time periods. However, using the case-cohort analysis there is no ambiguity over the definition of controls. The performance of the two analyses is very similar when controls are used most efficiently under the logistic approach. Using our methods, case-control studies can be used to estimate time-varying exposure-outcome associations where they may not previously have been considered. The case-cohort analysis has several advantages, including that it allows estimation of time-varying associations as a continuous function of time, while the logistic regression approach is restricted to assuming a step function form for the time-varying association.
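The first approach, fitting a separate logistic regression within each time period while reusing the same controls across periods, can be sketched as follows. The data generation and effect sizes are invented (a vaccine-style exposure whose protective log odds ratio wanes from -1.0 to -0.3), and the logistic fit is a plain Newton-Raphson (IRLS) implementation rather than any particular package.

```python
import numpy as np

def logistic_fit(X, y, iters=25):
    """Logistic regression by Newton-Raphson (iteratively reweighted least squares)."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        W = p * (1 - p)
        beta += np.linalg.solve(X.T @ (X * W[:, None]), X.T @ (y - p))
    return beta

rng = np.random.default_rng(7)
controls = rng.binomial(1, 0.5, 2000)              # exposure among shared controls
log_or = {"period1": -1.0, "period2": -0.3}        # time-varying exposure effect
estimates = {}
for period, b in log_or.items():
    p_case = 1.0 / (1.0 + np.exp(-b))              # exposure probability in cases
    cases = rng.binomial(1, p_case, 500)           # cases arising in this period
    exposure = np.concatenate([controls, cases])   # controls reused every period
    status = np.concatenate([np.zeros(2000), np.ones(500)])
    X = np.column_stack([np.ones_like(exposure, dtype=float), exposure])
    estimates[period] = logistic_fit(X, status)[1]  # per-period log odds ratio
```

This yields the step-function form of the time-varying association that the abstract describes; the case-cohort analysis instead lets the association vary continuously in time.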
A kernel regression approach to gene-gene interaction detection for case-control studies.
Larson, Nicholas B; Schaid, Daniel J
2013-11-01
Gene-gene interactions are increasingly being addressed as a potentially important contributor to the variability of complex traits. Consequently, attentions have moved beyond single locus analysis of association to more complex genetic models. Although several single-marker approaches toward interaction analysis have been developed, such methods suffer from very high testing dimensionality and do not take advantage of existing information, notably the definition of genes as functional units. Here, we propose a comprehensive family of gene-level score tests for identifying genetic elements of disease risk, in particular pairwise gene-gene interactions. Using kernel machine methods, we devise score-based variance component tests under a generalized linear mixed model framework. We conducted simulations based upon coalescent genetic models to evaluate the performance of our approach under a variety of disease models. These simulations indicate that our methods are generally higher powered than alternative gene-level approaches and at worst competitive with exhaustive SNP-level (where SNP is single-nucleotide polymorphism) analyses. Furthermore, we observe that simulated epistatic effects resulted in significant marginal testing results for the involved genes regardless of whether or not true main effects were present. We detail the benefits of our methods and discuss potential genome-wide analysis strategies for gene-gene interaction analysis in a case-control study design. © 2013 WILEY PERIODICALS, INC.
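A stripped-down version of the gene-level score test idea: build an interaction kernel as the element-wise product of two linear gene kernels, form the variance-component score statistic Q = r'Kr from null-model residuals, and calibrate it by permutation (the paper works under a generalized linear mixed model with analytic null distributions instead). Genotypes, effect sizes and the intercept-only null model below are all simulated assumptions.

```python
import numpy as np

def kernel_score_stat(G1, G2, y):
    """Variance-component score statistic for a gene-gene interaction effect:
    Q = r' K r, where r are null-model residuals and K is the element-wise
    (Schur) product of two linear gene kernels."""
    K = (G1 @ G1.T) * (G2 @ G2.T)          # interaction kernel (PSD by Schur product)
    r = y - y.mean()                       # residuals from an intercept-only null
    return r @ K @ r

rng = np.random.default_rng(8)
n = 300
G1 = rng.binomial(2, 0.3, (n, 5)).astype(float)   # SNP dosages, gene 1
G2 = rng.binomial(2, 0.3, (n, 4)).astype(float)   # SNP dosages, gene 2
risk = 0.4 * G1[:, 0] * G2[:, 0]                  # one epistatic SNP pair
y = (risk + rng.normal(0, 1, n) > 0.8).astype(float)
q_obs = kernel_score_stat(G1, G2, y)

# Permutation reference: shuffling phenotypes breaks any genotype-phenotype link
perm = np.array([kernel_score_stat(G1, G2, rng.permutation(y)) for _ in range(200)])
p_value = (1 + np.sum(perm >= q_obs)) / (1 + len(perm))
```

Because the test aggregates all SNP pairs between the two genes into one statistic, it avoids the pairwise-SNP multiple-testing burden the abstract describes.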
Metrological aspects to quality control for natural gas analyses
Energy Technology Data Exchange (ETDEWEB)
Ribeiro, Claudia Cipriano; Borges, Cleber Nogueira; Cunha, Valnei S. [Instituto Nacional de Metrologia, Normalizacao e Qualidade Industrial (INMETRO), Rio de Janeiro, RJ (Brazil); Augusto, Cristiane R. [Universidade Federal do Rio de Janeiro (UFRJ), RJ (Brazil); Augusto, Marco Ignazio [Companhia Estadual de Gas do Rio de Janeiro (CEG), RJ (Brazil)
2008-07-01
Product quality and services are fundamental topics in globalized commercial relationships, including measurements in natural gas. With the inclusion of natural gas among Brazilian energy resources, considerable industry investment was necessary, especially in quality control of the commercialized gas. The Brazilian regulatory agency, ANP (Agencia Nacional de Petroleo, Gas Natural e Biocombustiveis), created Resolution ANP no. 16. This Resolution defines the specification of natural gas, whether of national or international origin, for commercialization in Brazil, and lists tolerance concentrations for some components. Among these components are inert compounds such as CO{sub 2} and N{sub 2}. Their presence reduces the calorific power and the methane concentration of the gas, besides increasing its resistance to detonation in vehicular applications. Control charts can be used to verify whether or not a process is under statistical control. A process can be considered under statistical control if the measured values lie between previously established lower and upper limits. Control charts can address several characteristics of each subgroup: means, standard deviations, ranges, or proportions of defects. A chart is drawn for a specific characteristic in order to detect deviations in the process under specific environmental conditions. CEG (Companhia de Distribuicao de Gas do Rio de Janeiro) and DQUIM (Chemical Metrology Division) have a technical cooperation agreement for research and development on natural gas composition. Given the importance of natural gas in the nation's development, as well as questions concerning custody transfer, the objective of this work is to demonstrate the quality control of natural gas composition analyses between the CEG laboratory and the DQUIM laboratory, aiming to increase the quality of the
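The control-chart logic sketched in this abstract can be illustrated with a toy individuals chart. The reference CO2 fractions and the new batch below are invented; the limits are the conventional mean plus or minus three standard deviations of a reference series.

```python
import statistics

# Hypothetical reference series: CO2 mole fraction (%) in natural gas samples.
reference = [1.02, 0.98, 1.05, 1.01, 0.99, 1.03, 0.97, 1.00, 1.04, 0.96]
mean = statistics.mean(reference)
sd = statistics.stdev(reference)
lcl, ucl = mean - 3 * sd, mean + 3 * sd   # lower/upper control limits

def out_of_control(measurements):
    """Return the measurements falling outside the 3-sigma control limits."""
    return [x for x in measurements if not lcl <= x <= ucl]

new_batch = [1.01, 0.99, 1.25, 1.02]      # 1.25 is a deliberate outlier
print(out_of_control(new_batch))  # -> [1.25]
```

A point outside the limits signals that the measurement process (here, one of the two laboratories) may no longer be under statistical control and should be investigated.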
Spady, Richard; Stouli, Sami
2012-01-01
We propose dual regression as an alternative to the quantile regression process for the global estimation of conditional distribution functions under minimal assumptions. Dual regression provides all the interpretational power of the quantile regression process while avoiding the need for repairing the intersecting conditional quantile surfaces that quantile regression often produces in practice. Our approach introduces a mathematical programming characterization of conditional distribution functions...
Simons, Monique; de Vet, Emely; Chinapaw, Mai Jm; de Boer, Michiel; Seidell, Jacob C; Brug, Johannes
2014-04-04
Playing video games contributes substantially to sedentary behavior in youth. A new generation of video games-active games-seems to be a promising alternative to sedentary games to promote physical activity and reduce sedentary behavior. At this time, little is known about correlates of active and non-active gaming among adolescents. The objective of this study was to examine potential personal, social, and game-related correlates of both active and non-active gaming in adolescents. A survey assessing game behavior and potential personal, social, and game-related correlates was conducted among adolescents (12-16 years, N=353) recruited via schools. Multivariable, multilevel logistic regression analyses, adjusted for demographics (age, sex and educational level of adolescents), were conducted to examine personal, social, and game-related correlates of active gaming ≥1 hour per week (h/wk) and non-active gaming >7 h/wk. Active gaming ≥1 h/wk was significantly associated with a more positive attitude toward active gaming (OR 5.3, CI 2.4-11.8; Pgames (OR 0.30, CI 0.1-0.6; P=.002), a higher score on habit strength regarding gaming (OR 1.9, CI 1.2-3.2; P=.008) and having brothers/sisters (OR 6.7, CI 2.6-17.1; Pgame engagement (OR 0.95, CI 0.91-0.997; P=.04). Non-active gaming >7 h/wk was significantly associated with a more positive attitude toward non-active gaming (OR 2.6, CI 1.1-6.3; P=.035), a stronger habit regarding gaming (OR 3.0, CI 1.7-5.3; P7 h/wk. Active gaming is most strongly (negatively) associated with attitude with respect to non-active games, followed by observed active game behavior of brothers and sisters and attitude with respect to active gaming (positive associations). On the other hand, non-active gaming is most strongly associated with observed non-active game behavior of friends, habit strength regarding gaming and attitude toward non-active gaming (positive associations). Habit strength was a correlate of both active and non-active gaming
de Vet, Emely; Chinapaw, Mai JM; de Boer, Michiel; Seidell, Jacob C; Brug, Johannes
2014-01-01
Background Playing video games contributes substantially to sedentary behavior in youth. A new generation of video games—active games—seems to be a promising alternative to sedentary games to promote physical activity and reduce sedentary behavior. At this time, little is known about correlates of active and non-active gaming among adolescents. Objective The objective of this study was to examine potential personal, social, and game-related correlates of both active and non-active gaming in adolescents. Methods A survey assessing game behavior and potential personal, social, and game-related correlates was conducted among adolescents (12-16 years, N=353) recruited via schools. Multivariable, multilevel logistic regression analyses, adjusted for demographics (age, sex and educational level of adolescents), were conducted to examine personal, social, and game-related correlates of active gaming ≥1 hour per week (h/wk) and non-active gaming >7 h/wk. Results Active gaming ≥1 h/wk was significantly associated with a more positive attitude toward active gaming (OR 5.3, CI 2.4-11.8; Pgames (OR 0.30, CI 0.1-0.6; P=.002), a higher score on habit strength regarding gaming (OR 1.9, CI 1.2-3.2; P=.008) and having brothers/sisters (OR 6.7, CI 2.6-17.1; Pgame engagement (OR 0.95, CI 0.91-0.997; P=.04). Non-active gaming >7 h/wk was significantly associated with a more positive attitude toward non-active gaming (OR 2.6, CI 1.1-6.3; P=.035), a stronger habit regarding gaming (OR 3.0, CI 1.7-5.3; P7 h/wk. Active gaming is most strongly (negatively) associated with attitude with respect to non-active games, followed by observed active game behavior of brothers and sisters and attitude with respect to active gaming (positive associations). On the other hand, non-active gaming is most strongly associated with observed non-active game behavior of friends, habit strength regarding gaming and attitude toward non-active gaming (positive associations). Habit strength was a
MRI quality control tools for procedures and analyses
International Nuclear Information System (INIS)
Di Nallo, A.M.; Ortenzia, O.; Benassi, M.; D'Arienzo, M.; Coniglio, D.
2006-01-01
Quality control in MRI includes acceptance tests on the installation of a new scanner and tests representative of the system's performance during clinical practice. The first tests are time consuming and are carried out to evaluate the agreement of the system with the prescribed procurement specifications. The second tests, which identify equipment malfunctions requiring maintenance, are not time consuming and are suited to a busy clinical scanner. The paper evaluates the feasibility of the AAPM protocols and proposes procedures and practical tools to achieve this purpose. The MRI images, captured from the scanner and transferred in DICOM (Digital Imaging and Communications in Medicine) format over a local network, are analyzed by computerized worksheets and commercial software
Quality control of analyses of mercury in hair
International Nuclear Information System (INIS)
Lind, B.; Friberg, L.; Bigras, L.; Kirkbride, J.; Kennedy, P.; Kjellstroem, T.
1988-01-01
A quality control programme for mercury determinations in hair was developed within a study of 'Mental effects of prenatal methylmercury exposure in New Zealand children'. Hair was obtained from seven females with a mercury concentration of about 0.5-4 μg Hg/g. The hair was cut into 1-5 cm pieces and pulverized by liquid nitrogen grinding using a ring mill. In order to obtain a series of QC samples with varying Hg concentrations, different amounts of powder from all the samples and a reference sample of pulverized hair (11.2 μg Hg/g) were mixed. The mercury concentrations in the original samples and the mixtures were determined by radiochemical neutron activation analysis (RNAA). In total four laboratories participated in the interlaboratory comparison. All laboratories used the cold vapor AAS technique and Hg monitor model 1235, LDC for determinations after wet digestion of the samples. (orig./RB)
Using perceptual control theory to analyse technology integration in teaching
Directory of Open Access Journals (Sweden)
D W Govender
2013-07-01
Full Text Available Contrary to the more traditional scenario of instructor-focused presentation, contemporary education allows individuals to embrace modern technological advances such as computers to concur with, conceptualize and substantiate matters presented before them. The transition from instructor-focused to student-centred presentation is prone to dissension and strife, motivating educators to assess elements of learner-centred teaching in conjunction with traditional teaching mechanisms and with how individuals perceive and comprehend information (Andersson, 2008; Kiboss, 2010; United Nations Educational, Scientific and Cultural Organization (UNESCO), 2004). Computers can assist when used in the traditional teacher-student interface, but consideration must be given to variations in teaching method and to the students embracing these learning applications. If learner-centred teaching is to become accepted, certain elements need to be introduced: revision of educators' learning and teaching applications, time to facilitate knowledge and use of applicable contemporary technologies, and methods compatible with various technologies (Kiboss, 2010). Change is often not easy: while acknowledging the need to alter and revise the methods they were taught to instil, educators may fail to embrace the incorporation of technology into their teaching platform. Why are educators, who are quite knowledgeable and competent in computer applications and their merits, failing to embrace the benefits of technology in the classroom? A critical assessment of this mandates a transdisciplinary disposition in order to come to an amenable resolution. Perception, inhibition, ignorance and goals are just some of the reasons why educators are reluctant to incorporate technology despite their proficiency. Perceptual control theory (PCT) is applied to assess these reasons as a means towards achieving change and assessing how to move forward. Issues associated with educators' short- and long-term goals as
Directory of Open Access Journals (Sweden)
Susanne Unverzagt
Full Text Available This study is an in-depth analysis to explain statistical heterogeneity in a systematic review of implementation strategies to improve guideline adherence of primary care physicians in the treatment of patients with cardiovascular diseases. The systematic review included randomized controlled trials from a systematic search in MEDLINE, EMBASE, CENTRAL, conference proceedings and registers of ongoing studies. Implementation strategies were shown to be effective, with substantial heterogeneity of treatment effects across all investigated strategies. The primary aim of this study was to explain the different effects of eligible trials and to identify methodological and clinical effect modifiers. Random effects meta-regression models were used to simultaneously assess the influence of multimodal implementation strategies and effect modifiers on physician adherence. Effect modifiers included the staff responsible for implementation, level of prevention and definition of the primary outcome, unit of randomization, duration of follow-up and risk of bias. Six clinical and methodological factors were investigated as potential effect modifiers of the efficacy of different implementation strategies on guideline adherence in primary care practices on the basis of information from 75 eligible trials. Five effect modifiers were able to explain a substantial amount of statistical heterogeneity. Physician adherence was improved by 62% (95% confidence interval (CI) 29 to 104%) or 29% (95% CI 5 to 60%) in trials where other non-medical professionals or nurses, respectively, were included in the implementation process. Improvement of physician adherence was more successful in primary and secondary prevention of cardiovascular diseases, by around 30% (95% CI -2 to 71%) and 31% (95% CI 9 to 57%), respectively, compared to tertiary prevention. This study aimed to identify effect modifiers of implementation strategies on physician adherence. Especially the cooperation of different health
Directory of Open Access Journals (Sweden)
Varaksin Anatoly
2014-03-01
Full Text Available The methods of analysis of research data that include concomitant variables (confounders, associated with both the response and the factor under study) are considered. There are two usual ways to take such variables into account: the first at the stage of planning the experiment, and the second when analyzing the collected data. Despite the equal effectiveness of these approaches, there exist strong reasons to restrict the use of regression methods, such as ANCOVA, for accounting for confounders. The authors consider standardization by stratification to be a reliable method to account for the effect of confounding factors, as opposed to the widely implemented application of logistic regression and covariance analysis. A program automating the standardization procedure is proposed; it is available at the site of the Institute of Industrial Ecology.
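A toy example of the standardization by stratification advocated above (all numbers invented): two groups share identical stratum-specific event rates but different age mixes, so the crude rates differ while the directly standardized rates agree, exposing the confounding.

```python
# Hypothetical stratified data: event rate and group size per age stratum
# for two groups with different age mixes (all numbers invented).
rates = {
    ("exposed", "young"): 0.10, ("exposed", "old"): 0.30,
    ("unexposed", "young"): 0.10, ("unexposed", "old"): 0.30,
}
sizes = {
    ("exposed", "young"): 200, ("exposed", "old"): 800,
    ("unexposed", "young"): 800, ("unexposed", "old"): 200,
}
standard = {"young": 0.5, "old": 0.5}   # shared standard-population weights

def crude_rate(group):
    n = sum(sizes[group, s] for s in standard)
    return sum(rates[group, s] * sizes[group, s] for s in standard) / n

def standardized_rate(group):
    """Direct standardization: weight stratum rates by a standard population."""
    return sum(rates[group, s] * standard[s] for s in standard)

print(crude_rate("exposed"), crude_rate("unexposed"))   # 0.26 vs 0.14: confounded
print(standardized_rate("exposed"), standardized_rate("unexposed"))  # both 0.20
```

After standardization the spurious difference disappears, which is the effect an ANCOVA or regression adjustment would attempt to model parametrically.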
CADDIS Volume 4. Data Analysis: Advanced Analyses - Controlling for Natural Variability
Methods for controlling natural variability, predicting environmental conditions from biological observations method, biological trait data, species sensitivity distributions, propensity scores, Advanced Analyses of Data Analysis references.
Padula, William V; Mishra, Manish K; Weaver, Christopher D; Yilmaz, Taygan; Splaine, Mark E
2012-06-01
To demonstrate complementary results of regression and statistical process control (SPC) chart analyses for hospital-acquired pressure ulcers (HAPUs), and identify possible links between changes and opportunities for improvement between hospital microsystems and macrosystems. Ordinary least squares and panel data regression of retrospective hospital billing data, and SPC charts of prospective patient records for a US tertiary-care facility (2004-2007). A prospective cohort of hospital inpatients at risk for HAPUs was the study population. There were 337 HAPU incidences hospital wide among 43 844 inpatients. A probit regression model predicted the correlation of age, gender and length of stay on HAPU incidence (pseudo R(2)=0.096). Panel data analysis determined that for each additional day in the hospital, there was a 0.28% increase in the likelihood of HAPU incidence. A p-chart of HAPU incidence showed a mean incidence rate of 1.17% remaining in statistical control. A t-chart showed the average time between events for the last 25 HAPUs was 13.25 days. There was one 57-day period between two incidences during the observation period. A p-chart addressing Braden scale assessments showed that 40.5% of all patients were risk stratified for HAPUs upon admission. SPC charts complement standard regression analysis. SPC amplifies patient outcomes at the microsystem level and is useful for guiding quality improvement. Macrosystems should monitor effective quality improvement initiatives in microsystems and aid the spread of successful initiatives to other microsystems, followed by system-wide analysis with regression. Although HAPU incidence in this study is below the national mean, there is still room to improve HAPU incidence in this hospital setting since 0% incidence is theoretically achievable. Further assessment of pressure ulcer incidence could illustrate improvement in the quality of care and prevent HAPUs.
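The p-chart limits used in such an SPC analysis follow from the binomial standard deviation. The mean incidence of 1.17% is taken from the abstract; the subgroup size of 400 below is a hypothetical illustration, not a figure from the study.

```python
import math

def p_chart_limits(p_bar, n):
    """3-sigma control limits for a proportion (p) chart with subgroup size n."""
    sigma = math.sqrt(p_bar * (1 - p_bar) / n)
    return max(0.0, p_bar - 3 * sigma), p_bar + 3 * sigma

p_bar = 0.0117                            # mean HAPU incidence from the study
lcl, ucl = p_chart_limits(p_bar, n=400)   # hypothetical monthly subgroup size
print(lcl, ucl)
```

A monthly proportion above the upper limit would signal special-cause variation worth investigating at the microsystem level; for a rare event like this, the lower limit clips to zero.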
Alexeeff, Stacey E; Schwartz, Joel; Kloog, Itai; Chudnovsky, Alexandra; Koutrakis, Petros; Coull, Brent A
2015-01-01
Many epidemiological studies use predicted air pollution exposures as surrogates for true air pollution levels. These predicted exposures contain exposure measurement error, yet simulation studies have typically found negligible bias in resulting health effect estimates. However, previous studies typically assumed a statistical spatial model for air pollution exposure, which may be oversimplified. We address this shortcoming by assuming a realistic, complex exposure surface derived from fine-scale (1 km × 1 km) remote-sensing satellite data. Using simulation, we evaluate the accuracy of epidemiological health effect estimates in linear and logistic regression when using spatial air pollution predictions from kriging and land use regression models. We examined chronic (long-term) and acute (short-term) exposure to air pollution. Results varied substantially across different scenarios. Exposure models with low out-of-sample R(2) yielded severe biases in the health effect estimates of some models, ranging from 60% upward bias to 70% downward bias. One land use regression exposure model with >0.9 out-of-sample R(2) yielded upward biases up to 13% for acute health effect estimates. Almost all models drastically underestimated the SEs. Land use regression models performed better in chronic effect simulations. These results can help researchers when interpreting health effect estimates in these types of studies.
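The classical-measurement-error attenuation underlying these biases can be reproduced in a few lines. This is a generic simulation, not the authors' satellite-derived exposure surface; all values are invented.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 50_000

# True exposure and outcome with unit effect (beta = 1); all values simulated.
true_x = rng.normal(0, 1, n)
y = 1.0 * true_x + rng.normal(0, 1, n)

# Classical measurement error: predicted exposure = truth + independent noise.
predicted_x = true_x + rng.normal(0, 1, n)

# OLS slope of y on the error-prone exposure.
slope = np.cov(predicted_x, y)[0, 1] / np.var(predicted_x, ddof=1)
print(slope)  # attenuated toward 0.5 = var(x) / (var(x) + var(error))
```

The attenuation factor is the out-of-sample R(2) of the exposure model in this simple setting, which is why the abstract's low-R(2) exposure models produced the most severely biased health effect estimates.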
Reprocessing of spent nuclear fuel, Annex 3: Chemical and radiometric control analyses
International Nuclear Information System (INIS)
Gal, I.
1964-01-01
Simple, fast and reliable control analyses are obligatory during reprocessing. The analyses performed covered the content of uranium in aqueous and organic solutions, the content of plutonium in aqueous and organic solutions, and the free acid in both solutions. In addition, analyses were done to determine the density of the aqueous and organic solutions as well as the content of TBP in kerosene
Maximum likelihood estimation for Cox's regression model under nested case-control sampling
DEFF Research Database (Denmark)
Scheike, Thomas; Juul, Anders
2004-01-01
Nested case-control sampling is designed to reduce the costs of large cohort studies. It is important to estimate the parameters of interest as efficiently as possible. We present a new maximum likelihood estimator (MLE) for nested case-control sampling in the context of Cox's proportional hazards model.
Theobald, Roddy; Freeman, Scott
2014-01-01
Although researchers in undergraduate science, technology, engineering, and mathematics education are currently using several methods to analyze learning gains from pre- and posttest data, the most commonly used approaches have significant shortcomings. Chief among these is the inability to distinguish whether differences in learning gains are due to the effect of an instructional intervention or to differences in student characteristics when students cannot be assigned to control and treatment groups at random. Using pre- and posttest scores from an introductory biology course, we illustrate how the methods currently in wide use can lead to erroneous conclusions, and how multiple linear regression offers an effective framework for distinguishing the impact of an instructional intervention from the impact of student characteristics on test score gains. In general, we recommend that researchers always use student-level regression models that control for possible differences in student ability and preparation to estimate the effect of any nonrandomized instructional intervention on student performance.
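A minimal sketch of the recommended student-level regression, with invented data in which stronger students opt into the intervention, so a naive posttest comparison confounds the treatment with ability while regression on the pretest recovers the effect.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500

# Invented data: stronger students are more likely to take the intervention.
ability = rng.normal(0, 1, n)
treated = (ability + rng.normal(0, 1, n) > 0).astype(float)
pre = 60 + 8 * ability                     # pretest captures preparation
post = 65 + 8 * ability + 5 * treated + rng.normal(0, 4, n)  # true effect = 5

# Naive posttest comparison confounds treatment with student ability.
naive = post[treated == 1].mean() - post[treated == 0].mean()

# Student-level regression: post ~ intercept + pretest + treatment.
X = np.column_stack([np.ones(n), pre, treated])
coef, *_ = np.linalg.lstsq(X, post, rcond=None)
print(naive, coef[2])  # naive is inflated; coef[2] is near the true effect 5
```

Controlling for the pretest is what separates the instructional effect from the selection effect when random assignment is impossible.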
Maximum likelihood estimation for Cox's regression model under nested case-control sampling
DEFF Research Database (Denmark)
Scheike, Thomas Harder; Juul, Anders
2004-01-01
Nested case-control sampling is designed to reduce the costs of large cohort studies. It is important to estimate the parameters of interest as efficiently as possible. We present a new maximum likelihood estimator (MLE) for nested case-control sampling in the context of Cox's proportional hazards model. The MLE is computed by the EM-algorithm, which is easy to implement in the proportional hazards setting. Standard errors are estimated by a numerical profile likelihood approach based on EM aided differentiation. The work was motivated by a nested case-control study that hypothesized that insulin-like growth factor I was associated with ischemic heart disease. The study was based on a population of 3784 Danes and 231 cases of ischemic heart disease where controls were matched on age and gender. We illustrate the use of the MLE for these data and show how the maximum likelihood framework can be used...
Directory of Open Access Journals (Sweden)
Xian-Xia Zhang
2013-01-01
Full Text Available This paper presents a reference function based 3D FLC design methodology using support vector regression (SVR learning. The concept of reference function is introduced to 3D FLC for the generation of 3D membership functions (MF, which enhance the capability of the 3D FLC to cope with more kinds of MFs. The nonlinear mathematical expression of the reference function based 3D FLC is derived, and spatial fuzzy basis functions are defined. Via relating spatial fuzzy basis functions of a 3D FLC to kernel functions of an SVR, an equivalence relationship between a 3D FLC and an SVR is established. Therefore, a 3D FLC can be constructed using the learned results of an SVR. Furthermore, the universal approximation capability of the proposed 3D fuzzy system is proven in terms of the finite covering theorem. Finally, the proposed method is applied to a catalytic packed-bed reactor and simulation results have verified its effectiveness.
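The kernel-expansion structure behind the SVR/fuzzy-basis equivalence can be illustrated with a dependency-free kernel ridge regression (standing in for SVR here; both predictors are weighted sums of Gaussian kernels, the same functional form as the spatial fuzzy-basis expansion). All data are synthetic and the target function is an arbitrary stand-in for a plant response.

```python
import numpy as np

rng = np.random.default_rng(3)

def rbf_kernel(a, b, gamma=50.0):
    """Gaussian kernel; plays the role of a fuzzy basis function."""
    return np.exp(-gamma * (a[:, None] - b[None, :]) ** 2)

# Nonlinear target on [0, 1] with a little observation noise (invented).
x_train = np.linspace(0, 1, 40)
y_train = np.sin(2 * np.pi * x_train) + 0.05 * rng.normal(size=40)

# Kernel ridge fit: the predictor is a weighted sum of kernels centred at
# the training inputs, mirroring the fuzzy-basis expansion of the 3D FLC.
K = rbf_kernel(x_train, x_train)
alpha = np.linalg.solve(K + 1e-3 * np.eye(40), y_train)

def predict(x_new):
    return rbf_kernel(x_new, x_train) @ alpha

x_test = np.linspace(0, 1, 100)
err = np.max(np.abs(predict(x_test) - np.sin(2 * np.pi * x_test)))
print(err)  # small approximation error over the whole interval
```

In the paper's construction the learned kernel weights of an SVR are carried over to the 3D FLC; here the closed-form ridge weights play that role to keep the sketch self-contained.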
Directory of Open Access Journals (Sweden)
Claudia Ledesma
2013-08-01
Full Text Available Water quality is traditionally monitored and evaluated based upon field data collected at limited locations. The storage capacity of reservoirs is reduced by deposits of suspended matter. The major factors affecting surface water quality are suspended sediments, chlorophyll and nutrients. Modeling and monitoring the biogeochemical status of reservoirs can be done with data from remote sensors. Since the improvement of sensors' spatial and spectral resolutions, satellites have been used to monitor the interior areas of bodies of water. Water quality parameters, such as chlorophyll-a concentration and Secchi disk depth, were found to have a high correlation with transformed spectral variables derived from bands 1, 2, 3 and 4 of the Landsat 5 TM satellite. We created models of estimated responses with regard to values of chlorophyll-a. To do so, we used single and multiple linear regression models, whose parameters are associated with the reflectance data of bands 2 and 4 of the sub-image of the satellite, as well as the chlorophyll-a data obtained at 25 selected stations. According to the physico-chemical analyses performed, the characteristics of the water in the Rio Tercero reservoir correspond to somewhat hard freshwater with calcium bicarbonate. The water was classified as usable as a source for plant treatment, excellent for irrigation because of its low salinity and low residual sodium carbonate content, but unsuitable for animal consumption because of its low salt content.
Li, Ming; Peng, Qiang; Xiao, Man
2016-01-01
Fortnightly investigations at 12 sampling sites in Meiliang Bay and Gonghu Bay of Lake Taihu (China) were carried out from June to early November 2010. The relationship between abiotic factors and cell density of different Microcystis species was analyzed using the interval maxima regression (IMR) to determine the optimum temperature and nutrient concentrations for growth of different Microcystis species. Our results showed that cell density of all the Microcystis species increased along with the increase of water temperature, but Microcystis aeruginosa adapted to a wide range of temperatures. The optimum total dissolved nitrogen concentrations for M. aeruginosa, Microcystis wesenbergii, Microcystis ichthyoblabe, and unidentified Microcystis were 3.7, 2.0, 2.4, and 1.9 mg L(-1), respectively. The optimum total dissolved phosphorus concentrations for different species were M. wesenbergii (0.27 mg L(-1)) > M. aeruginosa (0.1 mg L(-1)) > M. ichthyoblabe (0.06 mg L(-1)) ≈ unidentified Microcystis, and the iron (Fe(3+)) concentrations were M. wesenbergii (0.73 mg L(-1)) > M. aeruginosa (0.42 mg L(-1)) > M. ichthyoblabe (0.35 mg L(-1)) > unidentified Microcystis (0.09 mg L(-1)). The above results suggest that if phosphorus concentration was reduced to 0.06 mg L(-1) or/and iron concentration was reduced to 0.35 mg L(-1) in Lake Taihu, the large colonial M. wesenbergii and M. aeruginosa would be replaced by small colonial M. ichthyoblabe and unidentified Microcystis. Thereafter, the intensity and frequency of the occurrence of Microcystis blooms would be reduced by changing Microcystis species composition.
Summary of dynamic analyses of the advanced neutron source reactor inner control rods
International Nuclear Information System (INIS)
Hendrich, W.R.
1995-08-01
A summary of the structural dynamic analyses that were instrumental in providing design guidance to the Advanced Neutron Source (ANS) inner control element system is presented in this report. The structural analyses and the functional constraints that required certain performance parameters were combined to shape and guide the design effort toward a prediction of successful and reliable control and scram operation by these inner control rods
Sørensen, By Ole H
2016-10-01
Organizational-level occupational health interventions have great potential to improve employees' health and well-being. However, they often compare unfavourably to individual-level interventions. This calls for improving methods for designing, implementing and evaluating organizational interventions. This paper presents and discusses the regression discontinuity design because, like the randomized controlled trial, it is a strong summative experimental design, but it typically fits organizational-level interventions better. The paper explores advantages and disadvantages of a regression discontinuity design with an embedded randomized controlled trial. It provides an example from an intervention study focusing on reducing sickness absence in 196 preschools. The paper demonstrates that such a design fits the organizational context, because it allows management to focus on organizations or workgroups with the most salient problems. In addition, organizations may accept an embedded randomized design because the organizations or groups with the most salient needs receive obligatory treatment as part of the regression discontinuity design. Copyright © 2016 John Wiley & Sons, Ltd.
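A schematic regression discontinuity estimate (simulated data; only the count of 196 preschools is taken from the abstract): units above a baseline-absence cutoff receive the intervention, and the treatment effect is the jump between the two fitted lines at the cutoff.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 196  # mirrors the number of preschools; everything else is invented

# Running variable: baseline sickness absence. Units at or above the cutoff
# get the intervention, which lowers follow-up absence by 2 (true effect).
baseline = rng.uniform(0, 20, n)
cutoff = 10.0
treated = (baseline >= cutoff).astype(float)
follow_up = 5 + 0.4 * baseline - 2.0 * treated + rng.normal(0, 1, n)

# Fit a line on each side of the cutoff; the effect is the jump at the cutoff.
left = baseline < cutoff
b_left = np.polyfit(baseline[left], follow_up[left], 1)
b_right = np.polyfit(baseline[~left], follow_up[~left], 1)
effect = np.polyval(b_right, cutoff) - np.polyval(b_left, cutoff)
print(effect)  # close to the true discontinuity of -2
```

Assigning treatment by the most salient need, as the design allows, is exactly what makes the cutoff rule acceptable to management while preserving a valid effect estimate at the threshold.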
Directory of Open Access Journals (Sweden)
Naradasu Kumar Ravi
2013-01-01
Full Text Available Diesel engine designers are constantly on the lookout for performance enhancement through efficient control of operating parameters. In this paper, the concept of an intelligent engine control system is proposed that seeks to ensure optimized performance under varying operating conditions. The concept is based on arriving at the optimum engine operating parameters to ensure the desired output in terms of efficiency. In addition, a Support Vector Machine-based prediction model has been developed to predict the engine performance under varying operating conditions. Experiments were carried out at varying loads, compression ratios and amounts of exhaust gas recirculation using a variable compression ratio diesel engine for data acquisition. It was observed that the SVM model was able to predict the engine performance accurately.
Directory of Open Access Journals (Sweden)
Liyang Wang
2016-01-01
Full Text Available Time-varying external disturbances cause instability of humanoid robots or even tip robots over. In this work, a trapezoidal fuzzy least squares support vector regression- (TF-LSSVR- based control system is proposed to learn the external disturbances and increase the zero-moment-point (ZMP stability margin of humanoid robots. First, the humanoid states and the corresponding control torques of the joints for training the controller are collected by implementing simulation experiments. Secondly, a TF-LSSVR with a time-related trapezoidal fuzzy membership function (TFMF is proposed to train the controller using the simulated data. Thirdly, the parameters of the proposed TF-LSSVR are updated using a cubature Kalman filter (CKF. Simulation results are provided. The proposed method is shown to be effective in learning and adapting occasional external disturbances and ensuring the stability margin of the robot.
Pi, Fengmei; Binzel, Daniel W.; Lee, Tae Jin; Li, Zhefeng; Sun, Meiyan; Rychahou, Piotr; Li, Hui; Haque, Farzin; Wang, Shaoying; Croce, Carlo M.; Guo, Bin; Evers, B. Mark; Guo, Peixuan
2018-01-01
Nanotechnology offers many benefits, and here we report an advantage of applying RNA nanotechnology for directional control. The orientation of arrow-shaped RNA was altered to control ligand display on extracellular vesicle membranes for specific cell targeting, or to regulate intracellular trafficking of small interfering RNA (siRNA) or microRNA (miRNA). Placing membrane-anchoring cholesterol at the tail of the arrow results in display of RNA aptamer or folate on the outer surface of the extracellular vesicle. In contrast, placing the cholesterol at the arrowhead results in partial loading of RNA nanoparticles into the extracellular vesicles. Taking advantage of the RNA ligand for specific targeting and extracellular vesicles for efficient membrane fusion, the resulting ligand-displaying extracellular vesicles were capable of specific delivery of siRNA to cells, and efficiently blocked tumour growth in three cancer models. Extracellular vesicles displaying an aptamer that binds to prostate-specific membrane antigen, and loaded with survivin siRNA, inhibited prostate cancer xenograft. The same extracellular vesicle instead displaying epidermal growth-factor receptor aptamer inhibited orthotopic breast cancer models. Likewise, survivin siRNA-loaded and folate-displaying extracellular vesicles inhibited patient-derived colorectal cancer xenograft.
Regression of stroke-like lesions in MELAS-syndrome after seizure control.
Finsterer, Josef; Barton, Peter
2010-12-01
There are some indications that seizure activity promotes the development of stroke-like episodes, or vice versa, in patients with mitochondrial encephalopathy, lactic acidosis and stroke-like episodes (MELAS) syndrome or other syndromic mitochondrial disorders. A 41-year-old Caucasian female with MELAS syndrome, presenting with short stature, microcytic anaemia, increased blood-sedimentation rate, myopathy, hyper-gammaglobulinaemia, an iron-metabolism defect, migraine-like headaches, and stroke-like episodes, developed complex partial and generalised seizures at age 32 years. Valproic acid was ineffective but after switching to lamotrigine and lorazepam, she became seizure-free for five years and stroke-like episodes did not recur. Cerebral MRI initially showed enhanced gyral thickening and a non-enhanced T2-hyperintensity over the left parieto-temporo-occipital white matter and cortex and enhanced caudate heads. After two years without seizures, the non-enhanced hyperintense parieto-temporo-occipital lesion had disappeared, being attributed to consequent seizure control. The caudate heads, however, remained hyperintense throughout the observational period. This case indicates that adequate seizure control in a patient with MELAS syndrome may prevent the recurrence of stroke-like episodes and may result in the disappearance of stroke-like lesions on MRI.
Zhang, Hongyang; Welch, William J.; Zamar, Ruben H.
2017-01-01
Tomal et al. (2015) introduced the notion of "phalanxes" in the context of rare-class detection in two-class classification problems. A phalanx is a subset of features that work well for classification tasks. In this paper, we propose a different class of phalanxes for application in regression settings. We define a "Regression Phalanx" - a subset of features that work well together for prediction. We propose a novel algorithm which automatically chooses Regression Phalanxes from high-dimensi...
Tseng, Wo-Jan; Hung, Li-Wei; Shieh, Jiann-Shing; Abbod, Maysam F; Lin, Jinn
2013-07-15
Osteoporotic hip fractures, with significant morbidity and excess mortality among the elderly, have imposed huge health and economic burdens on societies worldwide. In this age- and sex-matched case-control study, we examined the risk factors of hip fractures and assessed fracture risk by conditional logistic regression (CLR) and an ensemble artificial neural network (ANN), and compared the performance of these two classifiers. The study population consisted of 217 pairs (149 women and 68 men) of fractures and controls aged over 60 years. All participants were interviewed with the same standardized questionnaire covering 66 risk factors in 12 categories. Univariate CLR analysis was initially conducted to examine the unadjusted odds ratio of each potential risk factor, and the significant risk factors were then tested by multivariate analyses. For fracture risk assessment, the participants were randomly divided into modeling and testing datasets for 10-fold cross-validation analyses. The predictive models built by CLR and ANN on the modeling datasets were applied to the testing datasets to study generalization. The performances, including discrimination and calibration, were compared with non-parametric Wilcoxon tests. In univariate CLR analyses, 16 variables reached significance, and six of them remained significant in multivariate analyses: low T score, low BMI, low MMSE score, milk intake, walking difficulty, and a significant fall at home. For discrimination, ANN outperformed CLR in both the 16- and 6-variable analyses in the modeling and testing datasets (p < 0.05). The identified risk factors of hip fracture are more personal than environmental. With adequate model construction, ANN may outperform CLR in both discrimination and calibration. ANN seems not to have been developed to its full potential, and efforts should be made to improve its performance.
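Discrimination comparisons of the kind described above are commonly summarized by the area under the ROC curve, which for a finite sample equals the Mann-Whitney probability that a randomly chosen case outranks a randomly chosen control. A minimal sketch (the predicted risks below are invented for illustration, not the study's data):

```python
def auc(case_scores, control_scores):
    """AUC as the Mann-Whitney probability that a randomly chosen case
    receives a higher predicted risk than a randomly chosen control."""
    wins = ties = 0
    for c in case_scores:
        for k in control_scores:
            if c > k:
                wins += 1
            elif c == k:
                ties += 1
    return (wins + 0.5 * ties) / (len(case_scores) * len(control_scores))

# Hypothetical predicted risks from two classifiers on the same matched pairs
clr_auc = auc([0.8, 0.7, 0.6, 0.4], [0.5, 0.3, 0.2, 0.1])
ann_auc = auc([0.9, 0.8, 0.7, 0.5], [0.4, 0.3, 0.2, 0.1])
```

Comparing such AUC values across repeated cross-validation folds is what a non-parametric Wilcoxon test would then operate on.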
Wang, Hui; Sui, Weiguo; Xue, Wen; Wu, Junyong; Chen, Jiejing; Dai, Yong
2014-09-01
Immunoglobulin A nephropathy (IgAN) is a complex trait regulated by interactions among multiple physiological regulatory systems and probably involving numerous genes, which leads to inconsistent findings in genetic studies. One possible reason for the failure to replicate some single-locus results is that the underlying genetics of IgAN is based on multiple genes with minor effects. To investigate the association between IgAN in Han males and 23 single nucleotide polymorphisms (SNPs) in 14 genes predisposing to chronic glomerular diseases, the 23 SNP genotypes of 21 Han males were detected and analyzed with a BaiO gene chip, and their associations were assessed with univariate analysis and multiple linear regression analysis. The analysis showed that CTLA4 rs231726 and CR2 rs1048971 revealed a significant association with IgAN. These findings support the multi-gene nature of the etiology of IgAN and propose a potential gene-gene interactive model for future studies.
Duda, David P.; Minnis, Patrick
2009-01-01
Previous studies have shown that probabilistic forecasting may be a useful method for predicting persistent contrail formation. A probabilistic forecast to accurately predict contrail formation over the contiguous United States (CONUS) is created by using meteorological data based on hourly meteorological analyses from the Advanced Regional Prediction System (ARPS) and from the Rapid Update Cycle (RUC) as well as GOES water vapor channel measurements, combined with surface and satellite observations of contrails. Two groups of logistic models were created. The first group of models (SURFACE models) is based on surface-based contrail observations supplemented with satellite observations of contrail occurrence. The second group of models (OUTBREAK models) is derived from a selected subgroup of satellite-based observations of widespread persistent contrails. The mean accuracies for both the SURFACE and OUTBREAK models typically exceeded 75 percent when based on the RUC or ARPS analysis data, but decreased when the logistic models were derived from ARPS forecast data.
Learn about EPA’s use of the Integrated Planning Model (IPM) to develop estimates of SO2 and NOx emission control costs, projections of future emissions, and projections of the capacity of future control retrofits, assuming controls on EGUs.
Crown, William H
2014-02-01
This paper examines the use of propensity score matching in economic analyses of observational data. Several excellent papers have previously reviewed practical aspects of propensity score estimation and other aspects of the propensity score literature. The purpose of this paper is to compare the conceptual foundation of propensity score models with alternative estimators of treatment effects. References are provided to empirical comparisons among methods that have appeared in the literature. These comparisons are available for a subset of the methods considered in this paper. However, in some cases, no pairwise comparisons of particular methods are yet available, and there are no examples of comparisons across all of the methods surveyed here. Irrespective of the availability of empirical comparisons, the goal of this paper is to provide some intuition about the relative merits of alternative estimators in health economic evaluations where nonlinearity, sample size, availability of pre/post data, heterogeneity, and missing variables can have important implications for choice of methodology. Also considered is the potential combination of propensity score matching with alternative methods such as differences-in-differences and decomposition methods that have not yet appeared in the empirical literature.
Duda, David P.; Minnis, Patrick
2009-01-01
Straightforward application of the Schmidt-Appleman contrail formation criteria to diagnose persistent contrail occurrence from numerical weather prediction data is hindered by significant bias errors in the upper tropospheric humidity. Logistic models of contrail occurrence have been proposed to overcome this problem, but basic questions remain about how random measurement error may affect their accuracy. A set of 5000 synthetic contrail observations is created to study the effects of random error in these probabilistic models. The simulated observations are based on distributions of temperature, humidity, and vertical velocity derived from Advanced Regional Prediction System (ARPS) weather analyses. The logistic models created from the simulated observations were evaluated using two common statistical measures of model accuracy, the percent correct (PC) and the Hanssen-Kuipers discriminant (HKD). To convert the probabilistic results of the logistic models into a dichotomous yes/no choice suitable for the statistical measures, two critical probability thresholds are considered. The HKD scores are higher when the climatological frequency of contrail occurrence is used as the critical threshold, while the PC scores are higher when the critical probability threshold is 0.5. For both thresholds, typical random errors in temperature, relative humidity, and vertical velocity are found to be small enough to allow for accurate logistic models of contrail occurrence. The accuracy of the models developed from synthetic data is over 85 percent for both the prediction of contrail occurrence and non-occurrence, although in practice, larger errors would be anticipated.
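The two accuracy measures named above can be computed directly from a 2x2 contingency table of forecasts against observations; the Hanssen-Kuipers discriminant is the hit rate minus the false-alarm rate. A small illustration (the counts are made up, not from the study):

```python
def percent_correct(hits, false_alarms, misses, correct_negatives):
    """PC: fraction of all forecasts that were correct."""
    total = hits + false_alarms + misses + correct_negatives
    return (hits + correct_negatives) / total

def hanssen_kuipers(hits, false_alarms, misses, correct_negatives):
    """HKD: probability of detection minus probability of false detection."""
    pod = hits / (hits + misses)
    pofd = false_alarms / (false_alarms + correct_negatives)
    return pod - pofd

pc = percent_correct(40, 10, 10, 40)    # 0.8
hkd = hanssen_kuipers(40, 10, 10, 40)   # 0.8 - 0.2 = 0.6
```

Choosing the critical probability threshold (0.5 versus the climatological frequency) changes which cells of this table the logistic model's probabilistic output falls into, which is why the two scores favor different thresholds.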
Directory of Open Access Journals (Sweden)
Orsolya Selymes, PhD Candidate
2011-06-01
The Theory of Social Control (TSC) is grounded in satisfaction and happiness research. The study investigated the reasons behind relatively low levels of civil and personal satisfaction, subjective social well-being and experienced happiness in the post-communist Hungarian social context. The basic social process uncovered in the research is self-situating, a continuous assessment of social control that occurs on three psychological dimensions: activity, fairness and connectedness, operated via social flow. The culturally salient outcome of self-situating in Hungary is self-victimizing, meaning a subjective loss of control on all three dimensions. Some of the most important emotional-motivational consequences of self-victimizing are inhibition, regression and isolation, which contribute to various socio-cultural phenomena such as distrust, bystander strategies, pessimism or anomie across a number of social situations. Based on the emerging theory, the concept of subjective social control is introduced, and an expanded three-dimensional model of civil satisfaction, comfort and contribution, along with psychological and cultural implications, is discussed. Key words: social control, self-situating, self-victimizing, activity, fairness, connectedness, inhibition, fury, isolation
Kasenda, Benjamin; Schandelmaier, Stefan; Sun, Xin; von Elm, Erik; You, John; Blümle, Anette; Tomonaga, Yuki; Saccilotto, Ramon; Amstutz, Alain; Bengough, Theresa; Meerpohl, Joerg J; Stegert, Mihaela; Olu, Kelechi K; Tikkinen, Kari A O; Neumann, Ignacio; Carrasco-Labra, Alonso; Faulhaber, Markus; Mulla, Sohail M; Mertz, Dominik; Akl, Elie A; Bassler, Dirk; Busse, Jason W; Ferreira-González, Ignacio; Lamontagne, Francois; Nordmann, Alain; Gloy, Viktoria; Raatz, Heike; Moja, Lorenzo; Rosenthal, Rachel; Ebrahim, Shanil; Vandvik, Per O; Johnston, Bradley C; Walter, Martin A; Burnand, Bernard; Schwenkglenks, Matthias; Hemkens, Lars G; Bucher, Heiner C; Guyatt, Gordon H; Briel, Matthias
2014-07-16
To investigate the planning of subgroup analyses in protocols of randomised controlled trials and the agreement with corresponding full journal publications. Cohort of protocols of randomised controlled trials and subsequent full journal publications. Six research ethics committees in Switzerland, Germany, and Canada. 894 protocols of randomised controlled trials involving patients approved by participating research ethics committees between 2000 and 2003 and 515 subsequent full journal publications. Of 894 protocols of randomised controlled trials, 252 (28.2%) included one or more planned subgroup analyses. Of those, 17 (6.7%) provided a clear hypothesis for at least one subgroup analysis, 10 (4.0%) anticipated the direction of a subgroup effect, and 87 (34.5%) planned a statistical test for interaction. Industry sponsored trials more often planned subgroup analyses compared with investigator sponsored trials (195/551 (35.4%) v 57/343 (16.6%), P<0.001). Of 515 identified journal publications, 246 (47.8%) reported at least one subgroup analysis. In 81 (32.9%) of the 246 publications reporting subgroup analyses, authors stated that subgroup analyses were prespecified, but this was not supported by 28 (34.6%) of the corresponding protocols. In 86 publications, authors claimed a subgroup effect, but only 36 (41.9%) of the corresponding protocols reported a planned subgroup analysis. Subgroup analyses are insufficiently described in the protocols of randomised controlled trials submitted to research ethics committees, and investigators rarely specify the anticipated direction of subgroup effects. More than one third of statements in publications of randomised controlled trials about subgroup prespecification had no documentation in the corresponding protocols. Definitive judgments regarding the credibility of claimed subgroup effects are not possible without access to protocols and analysis plans of randomised controlled trials. © The DISCO study group 2014.
Directory of Open Access Journals (Sweden)
Bruce B.W. Phiri
2016-06-01
Schistosomiasis and soil-transmitted helminth (STH) infections constitute a major public health problem in many parts of sub-Saharan Africa. In areas where the prevalence of geo-helminths and schistosomes is high, co-infection with multiple parasite species is common, resulting in a disproportionately elevated burden compared with single infections. Determining risk factors of co-infection intensity is important for the better design of targeted interventions. In this paper, we examined risk factors of hookworm and S. haematobium co-infection intensity in Chikwawa district, southern Malawi in 2005, using bivariate count models. Results show that hookworm and S. haematobium infections were highly localised, with a small proportion of individuals harbouring most parasites, especially among school-aged children. The risk of co-intensity with both hookworm and S. haematobium was high for all ages, although it diminished with increasing age and increased with fishing (hookworm: coefficient = 12.29, 95% CI = 11.50–13.09; S. haematobium: coefficient = 0.040, 95% CI = 0.0037–3.832). Both infections were abundant in those with primary education (hookworm: coef. = 0.072, 95% CI = 0.056–0.401; S. haematobium: coef. = 0.286, 95% CI = 0.034–0.538). However, a much lower risk was observed for those who were farmers (hookworm: coef. = −0.349, 95% CI = −0.547 to −0.150; S. haematobium: coef. = −0.239, 95% CI = −0.406 to −0.072). In conclusion, our findings suggest that efforts to control helminth infections should be co-integrated, and health promotion campaigns should be aimed at school-going children and adults who are in constant contact with water.
Directory of Open Access Journals (Sweden)
Weize Li
2018-01-01
Research on stealthiness has become an important topic in the field of data integrity (DI) attacks. To construct stealthy DI attacks, a common assumption in most related studies is that attackers have prior model knowledge of the physical systems. In this paper, this assumption is relaxed and a covert agent is proposed based on least squares support vector regression (LSSVR). By estimating a plant model from control and sensory data, the LSSVR-based covert agent can closely imitate the behavior of the physical plant. The covert agent is then used to construct a covert loop, which can keep the controller’s input and output both stealthy over a finite time window. Experiments have been carried out to show the effectiveness of the proposed method.
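For readers unfamiliar with LSSVR: it replaces the SVM's inequality constraints with equality constraints, so training reduces to solving one linear system rather than a quadratic program. A bare-bones RBF-kernel sketch follows; the toy plant data, `gamma`, and `sigma` are illustrative choices, not values from the paper:

```python
import numpy as np

def lssvr_fit(X, y, gamma=100.0, sigma=1.0):
    """Train LSSVR by solving the standard (n+1)x(n+1) linear system."""
    n = len(X)
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=2)
    K = np.exp(-d2 / (2.0 * sigma ** 2))          # RBF kernel matrix
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0                                # bias constraint row
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma             # ridge term 1/gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[1:], sol[0], X, sigma              # alpha, bias, inputs, width

def lssvr_predict(model, Xnew):
    alpha, b, Xtr, sigma = model
    d2 = np.sum((Xnew[:, None, :] - Xtr[None, :, :]) ** 2, axis=2)
    return np.exp(-d2 / (2.0 * sigma ** 2)) @ alpha + b

# Identify a toy "plant" response from observed input/output data
X = np.linspace(0.0, 3.0, 20).reshape(-1, 1)
y = np.sin(X).ravel()
model = lssvr_fit(X, y)
yhat = lssvr_predict(model, X)
```

A covert agent in the paper's sense would train such a model on recorded control and sensor signals and then answer the controller with the model's predictions instead of the real plant's outputs.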
Chen, Jieyu; Xiang, Hongjie; Jiang, Pingping; Yu, Lin; Jing, Yuan; Li, Fei; Wu, Shengwei; Fu, Xiuqiong; Liu, Yanyan; Kwan, Hiuyee; Luo, Ren; Zhao, Xiaoshan; Sun, Xiaomin
2017-02-28
Suboptimal health status (SHS) is the intermediate health state between health and disease; it is medically undiagnosed and is also termed functional somatic syndrome. Although its clinical manifestations are complicated and various, SHS has not reached disease status. Unhealthy lifestyle is associated with many chronic diseases and mortality. Given the impact of lifestyle on health, it is intriguing to determine the association between unhealthy lifestyle and SHS risk. We conducted a nested case-control study among healthy Chinese college students from March 2012 to September 2013, nested in a prospective cohort of 5676 students. We performed 1:1 incidence density sampling with controls matched on birth year, sex, grade, specialty and individual character. SHS was evaluated using the medical examination report and the Sub-health Measurement Scale V1.0 (SHMS V1.0). Exposure was defined as an unhealthy lifestyle per the frequency of six behavioral dimensions from the Health-promoting Lifestyle Profile (HPLP-II). We matched 543 cases of SHS (42.66%) in a cohort of 1273 students during the 1.5 years mean follow-up time with controls. Lifestyle differed significantly between cases and controls (t = 9.79), and unhealthy lifestyle behavior with respect to the behavioral dimensions significantly affected SHS likelihood. Further analyses revealed a marked increase (an average of 14.73 points) in lifestyle level among those whose SHS regressed to health after 1.5 years, with respect to the HPLP-II behavioral dimensions as well as the total score (t = -15.34). SHS is associated with unhealthy lifestyles, and the mitigation of modifiable lifestyle risk factors may lead to SHS regression. Increased efforts to modify unhealthy lifestyles are necessary to prevent SHS.
Methodology for Analysing Controllability and Observability of Bladed Disc Coupled Vibrations
DEFF Research Database (Denmark)
Christensen, Rene Hardam; Santos, Ilmar
2004-01-01
to place sensors and actuators so that all vibration levels can be monitored and controlled. Due to the special dynamic characteristics of rotating coupled bladed discs, where disc lateral motion is coupled to blade flexible motion, such analyses become quite complicated. The dynamics is described...... by a time-variant mathematical model, which presents parametric vibration modes and centrifugal stiffening effects resulting in increasing blade natural frequencies. In this framework the objective and contribution of this paper is to present a methodology for analysing the modal controllability...
Matson, Johnny L.; Kozlowski, Alison M.
2010-01-01
Autistic regression is one of the many mysteries in the developmental course of autism and pervasive developmental disorders not otherwise specified (PDD-NOS). Various definitions of this phenomenon have been used, further clouding the study of the topic. Despite this problem, some efforts at establishing prevalence have been made. The purpose of…
Freund, Rudolf J; Sa, Ping
2006-01-01
The book provides complete coverage of the classical methods of statistical analysis. It is designed to give students an understanding of the purpose of statistical analyses, to allow them to determine, at least to some degree, the correct type of statistical analysis to be performed in a given situation, and to give them some appreciation of what constitutes good experimental design.
The development of the microcomputer controlling system for micro uranium on-line analyser
Ye Guo Qiang
2002-01-01
The author presents the microcomputer controlling system for a micro uranium on-line analyser, running under the Chinese-language Windows 3.2 system. The user program is written in Visual Basic 4.0; the hardware interface is controlled through a Windows Dynamic Link Library (DLL) programmed in Borland C++ 4.5; and data processing uses an Access 2.0 database.
Energy Technology Data Exchange (ETDEWEB)
Mihelic, M; Miklavzic, U; Rupnik, Z; Satalic, P; Spreizer, F; Zerovnik, I [Institut Jozef Stefan, Ljubljana (Yugoslavia)
1985-07-01
Performances and concept of a multipurpose, microcomputer-controlled thermoluminescent analyser, designed for laboratory work with TL dosemeters as well as for routine dose readings in the range from ecological to accident doses, are described. The main features of the analyser are: time-linear sampling, digitalisation, storing, and subsequent displaying on the monitor time scale of the glow and temperature curves of the TL material; digital stabilisation, control and diagnostics of the analog unit; the ability to store 7 different 8-parameter heating programs; and the ability to store 15 evaluation programs defined by 2 or 4 parameters and 3 different algorithms (altogether 5 types of evaluations). The analyser has several features intended for routine work: 9 function keys and the possibilities of forming files on cassette or disc, of dose calculation and averaging, of printing reports with names, and of additional programming in Basic. (author)
Directory of Open Access Journals (Sweden)
Nufang Fang
2015-07-01
Multivariate statistics are commonly used to identify the factors that control the dynamics of runoff or sediment yields during hydrological processes. However, one issue with the use of conventional statistical methods to address relationships between variables and runoff or sediment yield is multicollinearity. The main objectives of this study were to develop a method for effectively identifying runoff and sediment control factors during hydrological processes and to apply that method to a case study. The method combines a clustering approach with partial least squares regression (PLSR) models. The case study was conducted in a mountainous watershed in the Three Gorges Area. A total of 29 flood events in three hydrological years in areas with different land uses were obtained. In total, fourteen related variables were separated from hydrographs using the classical hydrograph separation method. The twenty-nine rainfall events were classified into two rainfall regimes (heavy Rainfall Regime I and moderate Rainfall Regime II) based on rainfall characteristics and K-means clustering. Four separate PLSR models were constructed to identify the main variables that control runoff and sediment yield for the two rainfall regimes. For Rainfall Regime I, the dominant first-order factors affecting the changes in sediment yield were the four rainfall-related variables, flood peak discharge, maximum flood suspended sediment concentration, runoff, and the percentages of forest and farmland. For Rainfall Regime II, antecedent-condition-related variables had more effect on both runoff and sediment yield than in Rainfall Regime I. The results suggest that the different control factors of the two rainfall regimes are determined by the rainfall characteristics and thus by different runoff mechanisms.
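The regime-classification step described above (K-means on rainfall characteristics) can be illustrated in one dimension: split events into moderate and heavy regimes by alternating assignment and centroid updates. The rainfall values below are invented for illustration:

```python
def two_means(values, iters=20):
    """1-D 2-means: centroids start at the extremes, then alternate
    assignment to the nearest centroid with centroid recomputation."""
    c_lo, c_hi = min(values), max(values)
    for _ in range(iters):
        low = [v for v in values if abs(v - c_lo) <= abs(v - c_hi)]
        high = [v for v in values if abs(v - c_lo) > abs(v - c_hi)]
        c_lo, c_hi = sum(low) / len(low), sum(high) / len(high)
    return low, high

# Hypothetical event rainfall depths (mm): two well-separated regimes
moderate, heavy = two_means([3.0, 1.0, 2.0, 60.0, 50.0, 70.0])
```

In the study proper, clustering is done on several rainfall characteristics at once and a separate PLSR model is then fit within each regime; the PLSR step is what handles the multicollinearity among the fourteen hydrograph variables.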
Quality control and conduct of genome-wide association meta-analyses
DEFF Research Database (Denmark)
Winkler, Thomas W; Day, Felix R; Croteau-Chonka, Damien C
2014-01-01
Rigorous organization and quality control (QC) are necessary to facilitate successful genome-wide association meta-analyses (GWAMAs) of statistics aggregated across multiple genome-wide association studies. This protocol provides guidelines for (i) organizational aspects of GWAMAs, and for (ii) QC...
Olive, David J
2017-01-01
This text covers both multiple linear regression and some experimental design models. The text uses the response plot to visualize the model and to detect outliers, does not assume that the error distribution has a known parametric distribution, develops prediction intervals that work when the error distribution is unknown, suggests bootstrap hypothesis tests that may be useful for inference after variable selection, and develops prediction regions and large sample theory for the multivariate linear regression model that has m response variables. A relationship between multivariate prediction regions and confidence regions provides a simple way to bootstrap confidence regions. These confidence regions often provide a practical method for testing hypotheses. There is also a chapter on generalized linear models and generalized additive models. There are many R functions to produce response and residual plots, to simulate prediction intervals and hypothesis tests, to detect outliers, and to choose response trans...
Kim, Bo Wook; Cho, Hanbyoul; Kim, Hyunki; Nam, Eun Ji; Kim, Sang Wun; Kim, Sunghoon; Kim, Young Tae; Kim, Jae-Hoon
2012-01-01
The aim of this study was the early prediction of postmolar gestational trophoblastic neoplasia (GTN) after evacuation of a high-risk mole, by comparison of human chorionic gonadotrophin (hCG) regression rates. Fifty patients with an initially high-risk mole that regressed spontaneously after molar evacuation were selected from January 1, 1996 to May 31, 2010 (spontaneous regression group). Fifty patients with an initially high-risk mole that progressed to postmolar GTN after molar evacuation were selected (postmolar GTN group). hCG regression rates, represented as hCG/initial hCG, were compared between the two groups. The sensitivity and specificity of these rates for prediction of postmolar GTN were assessed using receiver operating characteristic curves. Multivariate analyses of associations between risk factors and postmolar GTN progression were performed. The mean regression rate of hCG was compared between the two groups. hCG regression rates represented as hCG/initial hCG (%) were 0.36% in the spontaneous regression group and 1.45% in the postmolar GTN group in the second week (p=0.003). Prediction of postmolar GTN by hCG regression rate revealed a sensitivity of 48.0% and specificity of 89.5% with a cut-off value of 0.716% and an area under the curve (AUC) of 0.759 in the second week. In multivariate analysis, the hCG regression rate was a significant risk factor for postmolar GTN. Crown Copyright © 2011. Published by Elsevier Ireland Ltd. All rights reserved.
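The cut-off evaluation above is a standard ROC operating-point computation: at a given threshold on the hCG regression rate, sensitivity is the fraction of postmolar-GTN cases at or above the cut-off, and specificity is the fraction of spontaneous regressions below it. A sketch with invented rates (only the 0.716% cut-off is taken from the abstract):

```python
def sens_spec(gtn_rates, sr_rates, cutoff):
    """Sensitivity/specificity of flagging 'rate >= cutoff' as postmolar GTN.

    gtn_rates: hCG/initial-hCG (%) for patients who progressed to GTN.
    sr_rates:  the same quantity for spontaneous regressions.
    """
    sens = sum(r >= cutoff for r in gtn_rates) / len(gtn_rates)
    spec = sum(r < cutoff for r in sr_rates) / len(sr_rates)
    return sens, spec

# Hypothetical second-week regression rates (%) for a few patients per group
sens, spec = sens_spec([1.5, 0.9, 0.5, 0.4], [0.3, 0.2, 0.8, 0.1], 0.716)
```

Sweeping the cut-off over all observed rates and plotting sensitivity against (1 - specificity) traces the ROC curve whose area is the reported AUC.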
Bhamidipati, Ravi Kanth; Syed, Muzeeb; Mullangi, Ramesh; Srinivas, Nuggehally
2018-02-01
1. Dalbavancin, a lipoglycopeptide, is approved for treating gram-positive bacterial infections. The area under the plasma concentration versus time curve (AUC_inf) of dalbavancin is a key parameter, and the AUC_inf/MIC ratio is a critical pharmacodynamic marker. 2. Using the end-of-intravenous-infusion concentration (i.e. C_max), the C_max versus AUC_inf relationship for dalbavancin was established by regression analyses (i.e. linear, log-log, log-linear and power models) using 21 pairs of subject data. 3. Predictions of AUC_inf were performed by applying the regression equations to published C_max data. The quotient of observed/predicted values rendered the fold difference. The mean absolute error (MAE), root mean square error (RMSE) and correlation coefficient (r) were used in the assessment. 4. MAE and RMSE values for the various models were comparable. C_max versus AUC_inf exhibited excellent correlation (r > 0.9488). The internal data evaluation showed narrow confinement (0.84-1.14-fold difference), and the models predicted AUC_inf with a RMSE of 3.02-27.46%, with the fold difference largely contained within 0.64-1.48. 5. Regardless of the regression model, a single time point strategy of using C_max (i.e. end of the 30-min infusion) is amenable as a prospective tool for predicting the AUC_inf of dalbavancin in patients.
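A single-point C_max-to-AUC prediction of the kind described can be sketched with ordinary least squares: the power model AUC = a * C_max^b is fit as a straight line in log-log space, and fold differences are the observed/predicted quotients. The data below are synthetic, not the study's 21 subject pairs:

```python
import math

def ols(x, y):
    """Least-squares intercept and slope for y = a + b*x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b

def fit_power(cmax, auc):
    """Fit AUC = a * Cmax**b via OLS on the log-log scale."""
    la, b = ols([math.log(c) for c in cmax], [math.log(v) for v in auc])
    return math.exp(la), b

# Synthetic Cmax/AUC pairs generated from an exact power law
cmax = [10.0, 20.0, 40.0, 80.0]
auc_obs = [2.0 * c ** 1.1 for c in cmax]
a, b = fit_power(cmax, auc_obs)
fold = [o / (a * c ** b) for c, o in zip(cmax, auc_obs)]  # observed/predicted
```

With real data the fold differences scatter around 1, and RMSE/MAE across models decide which regression form to carry forward.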
The Use of Statistical Process Control Tools for Analysing Financial Statements
Directory of Open Access Journals (Sweden)
Niezgoda Janusz
2017-06-01
This article presents a proposed application of one type of modified Shewhart control chart to the monitoring of changes in the aggregated level of financial ratios. The x̅ control chart is used as the basis of the analysis; the examined sample statistic in this chart is the arithmetic mean. The author proposes to substitute it with a synthetic measure determined from the selected ratios. As the ratios mentioned above are expressed in different units and are of different character, the author applies standardisation. The results of selected comparative analyses are presented for both bankrupts and non-bankrupts. They indicate the possibility of using control charts as an auxiliary tool in financial analyses.
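A Shewhart-style chart for such a synthetic measure reduces to a center line at the mean with limits three standard deviations away; points outside the limits are signalled. A minimal individuals-chart sketch (the series below is illustrative, not the article's financial data):

```python
def control_limits(series):
    """Center line and 3-sigma Shewhart limits for an individuals chart."""
    n = len(series)
    mean = sum(series) / n
    sd = (sum((v - mean) ** 2 for v in series) / (n - 1)) ** 0.5
    return mean - 3.0 * sd, mean, mean + 3.0 * sd

def out_of_control(series):
    """Indices of points falling outside the control limits."""
    lcl, _, ucl = control_limits(series)
    return [i for i, v in enumerate(series) if v < lcl or v > ucl]

# Synthetic aggregated financial-ratio measure: stable, then one shock
measure = [0.0] * 20 + [10.0]
signals = out_of_control(measure)
```

In the article's setting each point would be the standardised synthetic measure for one reporting period, so a signal flags a period whose aggregated ratio level departs from the historical pattern.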
Stubbs, Brendon; Vancampfort, Davy; Rosenbaum, Simon; Ward, Philip B; Richards, Justin; Soundy, Andrew; Veronese, Nicola; Solmi, Marco; Schuch, Felipe B
2016-01-15
Exercise has established efficacy in improving depressive symptoms. Dropouts from randomized controlled trials (RCTs) pose a threat to the validity of this evidence base, with dropout rates varying across studies. We conducted a systematic review and meta-analysis to investigate the prevalence and predictors of dropout rates among adults with depression participating in exercise RCTs. Three authors identified RCTs from a recent Cochrane review and conducted updated searches of major electronic databases from 01/2013 to 08/2015. We included RCTs of exercise interventions in people with depression (including major depressive disorder (MDD) and depressive symptoms) that reported dropout rates. A random effects meta-analysis and meta-regression were conducted. Overall, 40 RCTs were included reporting dropout rates across 52 exercise interventions including 1720 people with depression (mean age 49.1 years (range=19-76 years), 72% female (range=0-100)). The trim and fill adjusted prevalence of dropout across all studies was 18.1% (95%CI=15.0-21.8%) and 17.2% (95%CI=13.5-21.7%, N=31) in MDD only. In MDD participants, higher baseline depressive symptoms (β=0.0409, 95%CI=0.0009-0.0809, P=0.04) predicted greater dropout, whilst supervised interventions delivered by physiotherapists (β=-1.2029, 95%CI=-2.0967 to -0.3091, p=0.008) and exercise physiologists (β=-1.3396, 95%CI=-2.4478 to -0.2313, p=0.01) predicted lower dropout. A comparative meta-analysis (N=29) established that dropout was lower in exercise than in control conditions (OR=0.642, 95%CI=0.43-0.95, p=0.02). Exercise is well tolerated by people with depression, and dropout in RCTs is lower than in control conditions. Thus, exercise is a feasible treatment, in particular when delivered by healthcare professionals with specific training in exercise prescription. Copyright © 2015 Elsevier B.V. All rights reserved.
Alagoz, Baris Baykant; Deniz, Furkan Nur; Keles, Cemal; Tan, Nusret
2015-03-01
This study investigates the disturbance rejection capacity of closed loop control systems by means of the reference to disturbance ratio (RDR). The RDR analysis calculates the ratio of reference signal energy to disturbance signal energy at the system output and provides a quantitative evaluation of the disturbance rejection performance of control systems on the basis of communication channel limitations. Essentially, RDR provides a straightforward analytical method for the comparison and improvement of the implicit disturbance rejection capacity of closed loop control systems. Theoretical analyses demonstrate that the RDR of negative feedback closed loop control systems is determined by the energy spectral density of the controller transfer function. On this basis, the authors derive design criteria for specifying the disturbance rejection performance of PID and fractional order PID (FOPID) controller structures. RDR spectra are calculated to investigate the frequency dependence of disturbance rejection capacity, and spectral RDR analyses are carried out for PID and FOPID controllers. For the validation of the theoretical results, simulation examples are presented. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.
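Since the abstract states that RDR is determined by the energy spectral density of the controller transfer function, an RDR spectrum can be sketched directly from the controller's frequency response. The unity-feedback form RDR(w) = |C(jw)|^2 used below is an assumption for illustration, and the PID gains are arbitrary:

```python
def pid_rdr(kp, ki, kd, omegas):
    """RDR(w) = |C(jw)|^2 for C(s) = kp + ki/s + kd*s (assumed unity-feedback form).

    C(jw) = kp + j*(kd*w - ki/w), so |C(jw)|^2 = kp^2 + (kd*w - ki/w)^2.
    """
    return [kp ** 2 + (kd * w - ki / w) ** 2 for w in omegas]

# Pure proportional control gives a flat RDR spectrum ...
p_only = pid_rdr(2.0, 0.0, 0.0, [0.1, 1.0, 10.0])
# ... while integral action boosts low-frequency disturbance rejection
pi = pid_rdr(1.0, 1.0, 0.0, [0.1, 1.0])
```

The qualitative picture matches the abstract's claim: tuning the controller gains reshapes the energy spectral density, and with it the frequency-dependent disturbance rejection capacity.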
A Two-Stage Penalized Logistic Regression Approach to Case-Control Genome-Wide Association Studies
Directory of Open Access Journals (Sweden)
Jingyuan Zhao
2012-01-01
We propose a two-stage penalized logistic regression approach to case-control genome-wide association studies. This approach consists of a screening stage and a selection stage. In the screening stage, main-effect and interaction-effect features are screened by using L1-penalized logistic likelihoods. In the selection stage, the retained features are ranked by the logistic likelihood with the smoothly clipped absolute deviation (SCAD) penalty (Fan and Li, 2001) and Jeffreys's prior penalty (Firth, 1993), a sequence of nested candidate models is formed, and the models are assessed by a family of extended Bayesian information criteria (J. Chen and Z. Chen, 2008). The proposed approach is applied to the analysis of the prostate cancer data of the Cancer Genetic Markers of Susceptibility (CGEMS) project of the National Cancer Institute, USA. Simulation studies are carried out to compare the approach with the pair-wise multiple testing approach (Marchini et al., 2005) and the LASSO-patternsearch algorithm (Shi et al., 2007).
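In the selection stage, the nested models are scored with the extended BIC, which adds a model-space penalty 2*gamma*log C(p, k) to the usual BIC to account for the huge number of candidate feature subsets in a GWAS. A direct transcription of that criterion (gamma = 0.5 is one conventional choice, not necessarily the paper's):

```python
import math

def ebic(loglik, k, n, p, gamma=0.5):
    """Extended BIC for a model using k of p candidate features on n samples.

    loglik: maximized log-likelihood of the fitted logistic model.
    Standard BIC is the gamma = 0 special case.
    """
    return (-2.0 * loglik
            + k * math.log(n)                      # usual BIC complexity term
            + 2.0 * gamma * math.log(math.comb(p, k)))  # model-space penalty
```

With p in the hundreds of thousands of SNP main effects and interactions, the log C(p, k) term grows quickly in k, which is what keeps the selected model small even when many screened features fit the cases and controls well.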
Optimal design and experimental analyses of a new micro-vibration control payload-platform
Sun, Xiaoqing; Yang, Bintang; Zhao, Long; Sun, Xiaofen
2016-07-01
This paper presents a new payload-platform, for precision devices, which possesses the capability of isolating the complex space micro-vibration in low frequency range below 5 Hz. The novel payload-platform equipped with smart material actuators is investigated and designed through optimization strategy based on the minimum energy loss rate, for the aim of achieving high drive efficiency and reducing the effect of the magnetic circuit nonlinearity. Then, the dynamic model of the driving element is established by using the Lagrange method and the performance of the designed payload-platform is further discussed through the combination of the controlled auto regressive moving average (CARMA) model with modified generalized prediction control (MGPC) algorithm. Finally, an experimental prototype is developed and tested. The experimental results demonstrate that the payload-platform has an impressive potential of micro-vibration isolation.
Barellini, A; Bogi, L; Licitra, G; Silvi, A M; Zari, A
2009-12-01
Air traffic control (ATC) primary radars are 'classical' radars that use echoes of radiofrequency (RF) pulses from aircraft to determine their position. High-power RF pulses radiated from radar antennas may produce high electromagnetic field levels in the surrounding area. Measurements of the electromagnetic fields produced by RF-pulsed radar by means of a swept-tuned spectrum analyser are investigated here. Measurements have been carried out both in the laboratory and in situ on signals generated by an ATC primary radar.
International Nuclear Information System (INIS)
Lehmann, M.; Pecka, M.; Rocek, J.; Zalesky, K.
1993-12-01
Detailed results are given of neutron physics analyses performed to assess the efficiency and acceptability of modifications of the WWER-440 core protection and control system; the modifications have been proposed with a view to increasing the proportion of mechanical control in the compensation of reactivity effects during reactor unit operation in the variable load mode. The calculations were carried out using the modular MOBY-DICK macrocode system together with the SMV42G36 library of two-group parametrized diffusion constants, containing corrections which allow new-design WWER-440 fuel assemblies to be discriminated. (J.B). 37 tabs., 18 figs., 5 refs
Parametric analyses on dynamic stall control of rotor airfoil via synthetic jet
Directory of Open Access Journals (Sweden)
Qijun ZHAO
2017-12-01
The effects of synthetic jet control on unsteady dynamic stall over a rotor airfoil are investigated numerically. A moving-embedded grid method and an Unsteady Reynolds-Averaged Navier-Stokes (URANS) solver coupled with the k-ω Shear Stress Transport (SST) turbulence model are established for predicting the complex flowfields of an oscillatory airfoil under jet control. Additionally, a velocity boundary condition modeled by a sinusoidal function has been developed to fulfill the perturbation effect of the periodic jet. The validity of the present CFD method is evaluated by comparing the calculated results of a baseline dynamic stall case for the rotor airfoil and a jet control case for the VR-7B airfoil with experimental data. Then, parametric analyses are conducted for an OA212 rotor airfoil to investigate the effects of jet control parameters (jet location, dimensionless frequency, momentum coefficient, jet angle, jet type, and dual-jet) on the dynamic stall characteristics of the rotor airfoil. The calculated results demonstrate that the efficiency of jet control can be improved with a specific momentum coefficient and jet angle when the jet is located near the separation point of the rotor airfoil. Furthermore, the dual-jet improves control of dynamic stall more obviously than the single jet, and the influence of the dual-jet's angles and momentum coefficients on the control effect is similar to that of the single jet. Finally, the unsteady aerodynamic characteristics of a rotor with a synthetic jet located on the upper surface of the rotor blade in forward flight are calculated; the aerodynamic characteristics of the rotor are improved compared with the baseline, indicating that the synthetic jet is capable of improving the aerodynamic characteristics of the rotor. Keywords: Airfoil, Dynamic stall characteristics, Flow control, Moving-embedded grid methodology, Navier-Stokes equations, Parametric
Davidson, John B.; Murphy, Patrick C.; Lallman, Frederick J.; Hoffler, Keith D.; Bacon, Barton J.
1998-01-01
This report contains a description of a lateral-directional control law designed for the NASA High-Alpha Research Vehicle (HARV). The HARV is a F/A-18 aircraft modified to include a research flight computer, spin chute, and thrust-vectoring in the pitch and yaw axes. Two separate design tools, CRAFT and Pseudo Controls, were integrated to synthesize the lateral-directional control law. This report contains a description of the lateral-directional control law, analyses, and nonlinear simulation (batch and piloted) results. Linear analysis results include closed-loop eigenvalues, stability margins, robustness to changes in various plant parameters, and servo-elastic frequency responses. Step time responses from nonlinear batch simulation are presented and compared to design guidelines. Piloted simulation task scenarios, task guidelines, and pilot subjective ratings for the various maneuvers are discussed. Linear analysis shows that the control law meets the stability margin guidelines and is robust to stability and control parameter changes. Nonlinear batch simulation analysis shows the control law exhibits good performance and meets most of the design guidelines over the entire range of angle-of-attack. This control law (designated NASA-1A) was flight tested during the Summer of 1994 at NASA Dryden Flight Research Center.
Kim, Kwang S; Max, Ludo
2014-01-01
To estimate the contributions of feedforward vs. feedback control systems in speech articulation, we analyzed the correspondence between initial and final kinematics in unperturbed tongue and jaw movements for consonant-vowel (CV) and vowel-consonant (VC) syllables. If movement extents and endpoints are highly predictable from early kinematic information, then the movements were most likely completed without substantial online corrections (feedforward control); if the correspondence between early kinematics and final amplitude or position is low, online adjustments may have altered the planned trajectory (feedback control) (Messier and Kalaska, 1999). Five adult speakers produced CV and VC syllables with high, mid, or low vowels while movements of the tongue and jaw were tracked electromagnetically. The correspondence between the kinematic parameters peak acceleration or peak velocity and movement extent as well as between the articulators' spatial coordinates at those kinematic landmarks and movement endpoint was examined both for movements across different target distances (i.e., across vowel height) and within target distances (i.e., within vowel height). Taken together, results suggest that jaw and tongue movements for these CV and VC syllables are mostly under feedforward control but with feedback-based contributions. One type of feedback-driven compensatory adjustment appears to regulate movement duration based on variation in peak acceleration. Results from a statistical model based on multiple regression are presented to illustrate how the relative strength of these feedback contributions can be estimated.
Plant dynamics analyses of fast reactor concept: RAPID-A without any control rod
International Nuclear Information System (INIS)
Kambe, Mitsuru
1996-01-01
Plant dynamics analyses of the fast reactor concept RAPID-A, which has no control rods, have been demonstrated for reactor startup and for a sudden change of the primary flow rate. The RAPID-A concept involves a Lithium Expansion Module (LEM) for inherent reactivity feedback, a Lithium Injection Module (LIM) for inherent ultimate shutdown, and a Lithium Release Module (LRM) for automated reactor startup. LEM consists of Quick-LEM and Slow-LEM. Slow-LEM provides moderate reactivity addition as temperature decreases, while Quick-LEM assures quick negative reactivity feedback as temperature increases. Plant dynamics analyses revealed that reactor power is nearly proportional to the primary flow rate even if the flow rate increases suddenly. Fully automated reactor startup from the subcritical condition has been attempted by inserting reactivity at a constant rate with the LRM. The allowable rate of reactivity addition has been obtained with respect to the Quick-LEM reactivity worth. (author)
A robust internal control for high-precision DNA methylation analyses by droplet digital PCR.
Pharo, Heidi D; Andresen, Kim; Berg, Kaja C G; Lothe, Ragnhild A; Jeanmougin, Marine; Lind, Guro E
2018-01-01
Droplet digital PCR (ddPCR) allows absolute quantification of nucleic acids and has potential for improved non-invasive detection of DNA methylation. For increased precision of the methylation analysis, we aimed to develop a robust internal control for use in methylation-specific ddPCR. Two control design approaches were tested: (a) targeting a genomic region shared across members of a gene family and (b) combining multiple assays targeting different pericentromeric loci on different chromosomes. Through analyses of 34 colorectal cancer cell lines, the performance of the control assay candidates was optimized and evaluated, both individually and in various combinations, using the QX200™ droplet digital PCR platform (Bio-Rad). The best-performing control was tested in combination with assays targeting methylated CDO1, SEPT9, and VIM. A 4Plex panel consisting of EPHA3, KBTBD4, PLEKHF1, and SYT10 was identified as the best-performing control. The use of the 4Plex for normalization reduced the variability in methylation values, corrected for differences in template amount, and diminished the effect of chromosomal aberrations. Positive Droplet Calling (PoDCall), an R-based algorithm for standardized threshold determination, was developed, ensuring consistency of the ddPCR results. Implementation of a robust internal control, i.e., the 4Plex, and an algorithm for automated threshold determination, PoDCall, in methylation-specific ddPCR increases the precision of DNA methylation analysis.
Parametric analyses for synthetic jet control on separation and stall over rotor airfoil
Directory of Open Access Journals (Sweden)
Zhao Guoqing
2014-10-01
Numerical simulations are performed to investigate the effects of synthetic jet control on separation and stall over rotor airfoils. The preconditioned and unsteady Reynolds-averaged Navier–Stokes equations coupled with a k–ω shear stress transport turbulence model are employed to accomplish the flowfield simulation of rotor airfoils under jet control. Additionally, a velocity boundary condition modeled by a sinusoidal function is developed to fulfill the perturbation effect of periodic jets. The validity of the present CFD procedure is evaluated by the simulated results of an isolated synthetic jet and the jet control case for airfoil NACA0015. Then, parametric analyses are conducted specifically for an OA213 rotor airfoil to investigate the effects of jet parameters (forcing frequency, jet location and momentum coefficient, jet direction, and distribution of jet arrays) on the control effect of the aerodynamic characteristics of a rotor airfoil. Preliminary results indicate that the efficiency of jet control can be improved with specific frequencies (the best lift-drag ratio at F+ = 2.0) and jet angles (40° or 75°) when the jets are located near the separation point of the rotor airfoil. Furthermore, as a result of a suitable combination of jet arrays, the lift coefficient of the airfoil can be improved by nearly 100%, and the corresponding drag coefficient decreased by 26.5% in comparison with the single point control case.
International Nuclear Information System (INIS)
Barellini, A.; Bogi, L.; Licitra, G.; Silvi, A. M.; Zari, A.
2009-01-01
Air traffic control (ATC) primary radars are 'classical' radars that use echoes of radiofrequency (RF) pulses from aircraft to determine their position. High-power RF pulses radiated from radar antennas may produce high electromagnetic field levels in the surrounding area. Measurement of electromagnetic fields produced by RF-pulsed radar by means of a swept-tuned spectrum analyser are investigated here. Measurements have been carried out both in the laboratory and in situ on signals generated by an ATC primary radar. (authors)
Numerical simulations and analyses of temperature control loop heat pipe for space CCD camera
Meng, Qingliang; Yang, Tao; Li, Chunlin
2016-10-01
As one of the key units of a space CCD camera, the temperature range and stability of the CCD components affect the camera's imaging indexes, so reasonable thermal design and robust thermal control devices are needed. One kind of temperature control loop heat pipe (TCLHP) is designed, which fully meets the thermal control requirements of the CCD components. In order to study the dynamic behavior of heat and mass transfer of the TCLHP, particularly in the orbital flight case, a transient numerical model is developed by using well-established empirical correlations for flow models within three-dimensional thermal modeling. The temperature control principle and the details of the mathematical model are presented. The model is used to study the operating state and the flow and heat characteristics based upon analyses of the variations of temperature, pressure, and quality under different operating modes and external heat flux variations. The results indicate that the TCLHP can satisfy the thermal control requirements of the CCD components well, and always ensures good temperature stability and uniformity. Comparison between flight data and simulated results shows that the model is accurate to within 1°C. The model can thus be used for predicting and understanding the transient performance of the TCLHP.
ACCIDENT ANALYSES & CONTROL OPTIONS IN SUPPORT OF THE SLUDGE WATER SYSTEM SAFETY ANALYSIS
Energy Technology Data Exchange (ETDEWEB)
WILLIAMS, J.C.
2003-11-15
This report documents the accident analyses and nuclear safety control options for use in Revision 7 of HNF-SD-WM-SAR-062, "K Basins Safety Analysis Report", and Revision 4 of HNF-SD-SNF-TSR-001, "Technical Safety Requirements - 100 KE and 100 KW Fuel Storage Basins". These documents will define the authorization basis for Sludge Water System (SWS) operations. This report follows the guidance of DOE-STD-3009-94, "Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports", for calculating onsite and offsite consequences. The accident analysis summary is shown in Table ES-1 below. While this document describes and discusses potential control options to either mitigate or prevent the accidents discussed herein, it should be made clear that the final control selection for any accident is determined and presented in HNF-SD-WM-SAR-062.
Analysing and controlling the tax evasion dynamics via majority-vote model
Energy Technology Data Exchange (ETDEWEB)
Lima, F W S, E-mail: fwslima@gmail.co, E-mail: wel@ufpi.edu.b [Departamento de Fisica, Universidade Federal do PiauI, 64049-550, Teresina - PI (Brazil)
2010-09-01
Within the context of agent-based Monte-Carlo simulations, we study the well-known majority-vote model (MVM) with noise applied to tax evasion on simple square lattices, Voronoi-Delaunay random lattices, Barabási-Albert networks, and Erdős-Rényi random graphs. In order to analyse and control the fluctuations of tax evasion in the economics model proposed by Zaklan, the MVM is applied in the neighborhood of the critical noise q_c to evolve the Zaklan model. The Zaklan model had previously been studied using the equilibrium Ising model. Here we show that the Zaklan model is robust, because it can be studied using the equilibrium dynamics of the Ising model as well as the nonequilibrium MVM, on the various topologies cited above, giving the same behavior regardless of the dynamics or topology used.
Analysing and controlling the tax evasion dynamics via majority-vote model
International Nuclear Information System (INIS)
Lima, F W S
2010-01-01
Within the context of agent-based Monte-Carlo simulations, we study the well-known majority-vote model (MVM) with noise applied to tax evasion on simple square lattices, Voronoi-Delaunay random lattices, Barabási-Albert networks, and Erdős-Rényi random graphs. In order to analyse and control the fluctuations of tax evasion in the economics model proposed by Zaklan, the MVM is applied in the neighborhood of the critical noise q_c to evolve the Zaklan model. The Zaklan model had previously been studied using the equilibrium Ising model. Here we show that the Zaklan model is robust, because it can be studied using the equilibrium dynamics of the Ising model as well as the nonequilibrium MVM, on the various topologies cited above, giving the same behavior regardless of the dynamics or topology used.
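A minimal sketch of the majority-vote model with noise on a periodic square lattice: each site adopts the sign of its neighbourhood majority with probability 1-q and the opposite sign with probability q. The lattice size, noise level, and sweep count below are illustrative assumptions, and the Zaklan tax-evasion layer on top of the MVM is omitted:

```python
import numpy as np

def mvm_sweep(spins, q, rng):
    """One sweep (L*L single-site updates) of the majority-vote model
    with noise q on a periodic square lattice of +/-1 spins."""
    L = spins.shape[0]
    for _ in range(L * L):
        i, j = rng.integers(L), rng.integers(L)
        s = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
             + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
        if s != 0:
            maj = 1 if s > 0 else -1
            # follow the majority with prob. 1-q, oppose it with prob. q
            spins[i, j] = maj if rng.random() > q else -maj
        else:
            spins[i, j] = rng.choice([-1, 1])   # tie: random sign
    return spins

rng = np.random.default_rng(1)
L = 32
spins = np.ones((L, L), dtype=int)       # start fully ordered
for _ in range(50):
    mvm_sweep(spins, q=0.02, rng=rng)    # q well below the critical noise
m = abs(spins.mean())                    # magnetisation stays high
```

Running near the critical noise q_c instead (about 0.075 on the square lattice) would produce the large fluctuations that the Zaklan construction exploits.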
Directory of Open Access Journals (Sweden)
Jieyu Chen
2017-02-01
Further analyses revealed a marked increase (an average of 14.73 points) in lifestyle level among those whose SHS regressed to health after 1.5 years, with respect to the HPLP-II behavioral dimensions as well as the total score (t = -15.34, p < 0.001). Conclusions: SHS is highly attributable to unhealthy lifestyles, and the mitigation of modifiable lifestyle risk factors may lead to SHS regression. Increased efforts to modify unhealthy lifestyles are necessary to prevent SHS.
Modelling and Analysing Access Control Policies in XACML 3.0
DEFF Research Database (Denmark)
Ramli, Carroline Dewi Puspa Kencana
(cf. GM03, Mos05, Ris13) and manual analysis of the overall effect and consequences of a large XACML policy set is a very daunting and time-consuming task. In this thesis we address the problem of understanding the semantics of the access control policy language XACML, in particular XACML version 3.0. ... The main focus of this thesis is modelling and analysing access control policies in XACML 3.0. There are two main contributions in this thesis. First, we study and formalise XACML 3.0, in particular the Policy Decision Point (PDP). The concrete syntax of XACML is based on the XML format, while its standard ... semantics is described normatively using natural language. The use of English text in standardisation leads to the risk of misinterpretation and ambiguity. In order to avoid this drawback, we define an abstract syntax of XACML 3.0 and a formal XACML semantics. Second, we propose a logic-based XACML analysis ...
Logistic regression applied to natural hazards: rare event logistic regression with replications
Directory of Open Access Journals (Sweden)
M. Guns
2012-06-01
Statistical analysis of natural hazards needs particular attention, as most of these phenomena are rare events. This study shows that ordinary rare event logistic regression, as it is now commonly used in geomorphologic studies, does not always lead to a robust detection of controlling factors, as the results can be strongly sample-dependent. In this paper, we introduce some concepts of Monte Carlo simulations into rare event logistic regression. This technique, so-called rare event logistic regression with replications, combines the strengths of probabilistic and statistical methods, and allows some of the limitations of previous developments to be overcome through robust variable selection. The technique was developed here for the analysis of landslide controlling factors, but the concept is widely applicable to statistical analyses of natural hazards.
Logistic regression applied to natural hazards: rare event logistic regression with replications
Guns, M.; Vanacker, V.
2012-06-01
Statistical analysis of natural hazards needs particular attention, as most of these phenomena are rare events. This study shows that ordinary rare event logistic regression, as it is now commonly used in geomorphologic studies, does not always lead to a robust detection of controlling factors, as the results can be strongly sample-dependent. In this paper, we introduce some concepts of Monte Carlo simulations into rare event logistic regression. This technique, so-called rare event logistic regression with replications, combines the strengths of probabilistic and statistical methods, and allows some of the limitations of previous developments to be overcome through robust variable selection. The technique was developed here for the analysis of landslide controlling factors, but the concept is widely applicable to statistical analyses of natural hazards.
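The replication idea can be sketched as follows: keep all of the rare events, repeatedly resample an equal-sized set of non-events, refit the logistic model, and record how often each predictor is selected. The plain gradient-descent fit, the selection threshold, and the synthetic data are illustrative assumptions, not the authors' exact procedure:

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, n_iter=500):
    """Plain logistic regression via gradient descent (one replication's
    model fit; no penalty, no intercept, for brevity)."""
    w = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = 1 / (1 + np.exp(-(X @ w)))
        w -= lr * X.T @ (p - y) / len(y)
    return w

def replicated_selection(X, y, n_rep=50, thresh=0.5, seed=0):
    """Keep all rare events, resample an equal number of non-events per
    replication, and return each predictor's selection frequency."""
    rng = np.random.default_rng(seed)
    events, nonevents = np.flatnonzero(y == 1), np.flatnonzero(y == 0)
    counts = np.zeros(X.shape[1])
    for _ in range(n_rep):
        sample = rng.choice(nonevents, size=len(events), replace=False)
        idx = np.concatenate([events, sample])
        counts += np.abs(fit_logistic(X[idx], y[idx])) > thresh
    return counts / n_rep

# Synthetic rare-event data: only predictor 0 controls the outcome.
rng = np.random.default_rng(2)
X = rng.normal(size=(1000, 5))
y = (rng.random(1000) < 1 / (1 + np.exp(-(3 * X[:, 0] - 6)))).astype(float)
freq = replicated_selection(X, y)   # selection frequency per predictor
```

A predictor with a selection frequency near 1 across replications is a robust controlling factor; one selected only in a few replications reflects sample dependence of the single-fit approach.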
Yuan, Qi-ling; Wang, Peng; Liu, Liang; Sun, Fu; Cai, Yong-song; Wu, Wen-tao; Ye, Mao-lin; Ma, Jiang-tao; Xu, Bang-bang; Zhang, Yin-gang
2016-01-01
The aims of this systematic review were to study the analgesic effect of real acupuncture and to explore whether sham acupuncture (SA) type is related to the estimated effect of real acupuncture for musculoskeletal pain. Five databases were searched. The outcome was pain or disability immediately (≤1 week) following an intervention. Standardized mean differences (SMDs) with 95% confidence intervals were calculated. Meta-regression was used to explore possible sources of heterogeneity. Sixty-three studies (6382 individuals) were included. Eight condition types were included. The pooled effect size was moderate for pain relief (59 trials, 4980 individuals, SMD −0.61, 95% CI −0.76 to −0.47; P acupuncture has a moderate effect (approximate 12-point reduction on the 100-mm visual analogue scale) on musculoskeletal pain. SA type did not appear to be related to the estimated effect of real acupuncture. PMID:27471137
Mulroy, Sara J; Winstein, Carolee J; Kulig, Kornelia; Beneck, George J; Fowler, Eileen G; DeMuth, Sharon K; Sullivan, Katherine J; Brown, David A; Lane, Christianne J
2011-12-01
Each of the 4 randomized clinical trials (RCTs) hosted by the Physical Therapy Clinical Research Network (PTClinResNet) targeted a different disability group (low back disorder in the Muscle-Specific Strength Training Effectiveness After Lumbar Microdiskectomy [MUSSEL] trial, chronic spinal cord injury in the Strengthening and Optimal Movements for Painful Shoulders in Chronic Spinal Cord Injury [STOMPS] trial, adult stroke in the Strength Training Effectiveness Post-Stroke [STEPS] trial, and pediatric cerebral palsy in the Pediatric Endurance and Limb Strengthening [PEDALS] trial for children with spastic diplegic cerebral palsy) and tested the effectiveness of a muscle-specific or functional activity-based intervention on primary outcomes that captured pain (STOMPS, MUSSEL) or locomotor function (STEPS, PEDALS). The focus of these secondary analyses was to determine causal relationships among outcomes across levels of the International Classification of Functioning, Disability and Health (ICF) framework for the 4 RCTs. With the database from PTClinResNet, we used 2 separate secondary statistical approaches-mediation analysis for the MUSSEL and STOMPS trials and regression analysis for the STEPS and PEDALS trials-to test relationships among muscle performance, primary outcomes (pain related and locomotor related), activity and participation measures, and overall quality of life. Predictive models were stronger for the 2 studies with pain-related primary outcomes. Change in muscle performance mediated or predicted reductions in pain for the MUSSEL and STOMPS trials and, to some extent, walking speed for the STEPS trial. Changes in primary outcome variables were significantly related to changes in activity and participation variables for all 4 trials. Improvement in activity and participation outcomes mediated or predicted increases in overall quality of life for the 3 trials with adult populations. Variables included in the statistical models were limited to those
Magee, Laura A; von Dadelszen, Peter; Singer, Joel; Lee, Terry; Rey, Evelyne; Ross, Susan; Asztalos, Elizabeth; Murphy, Kellie E; Menzies, Jennifer; Sanchez, Johanna; Gafni, Amiram; Gruslin, Andrée; Helewa, Michael; Hutton, Eileen; Lee, Shoo K; Logan, Alexander G; Ganzevoort, Wessel; Welch, Ross; Thornton, Jim G; Moutquin, Jean Marie
2016-07-01
For women with chronic or gestational hypertension in CHIPS (Control of Hypertension In Pregnancy Study, NCT01192412), we aimed to examine whether clinical predictors collected at randomization could predict adverse outcomes. This was a planned, secondary analysis of data from the 987 women in the CHIPS Trial. Logistic regression was used to examine the impact of 19 candidate predictors on the probability of adverse perinatal (pregnancy loss or high level neonatal care for >48 h, or birthweight hypertension, preeclampsia, or delivery at blood pressure within 1 week before randomization. Continuous variables were represented continuously or dichotomized based on the smaller p-value in univariate analyses. An area-under-the-receiver-operating-curve (AUC ROC) of ≥0.70 was taken to reflect a potentially useful model. Point estimates for AUC ROC were hypertension (0.70, 95% CI 0.67-0.74) and delivery at hypertension develop an elevated blood pressure in pregnancy, or formerly normotensive women develop new gestational hypertension, maternal and current pregnancy clinical characteristics cannot predict adverse outcomes in the index pregnancy. © 2016 The Authors. Acta Obstetricia et Gynecologica Scandinavica published by John Wiley & Sons Ltd on behalf of Nordic Federation of Societies of Obstetrics and Gynecology (NFOG).
Kim, Seong-Gil; Kim, Wan-Soo
2018-05-15
BACKGROUND The purpose of this study was to investigate the effect of ankle ROM and lower-extremity muscle strength on static balance control ability in young adults. MATERIAL AND METHODS This study was conducted with 65 young adults; 10 dropped out during the measurement, so 55 young adults (male: 19, female: 36) completed the study. Postural sway (length and velocity) was measured with eyes open and closed, and ankle ROM (AROM and PROM of dorsiflexion and plantarflexion) and lower-extremity muscle strength (flexors and extensors of the hip, knee, and ankle joints) were measured. The Pearson correlation coefficient was used to examine the correlation between variables and static balance ability. Simple linear regression analysis and multiple linear regression analysis were used to examine the effect of variables on static balance ability. RESULTS In the correlation analysis, plantarflexion ROM (AROM and PROM) and lower-extremity muscle strength (except the hip extensor) were significantly correlated with postural sway. In simple linear regression analysis, all variables that passed the correlation analysis procedure had a significant influence. In multiple linear regression analysis, plantar flexion PROM with eyes open significantly influenced sway length (B=0.681) and sway velocity (B=0.011). CONCLUSIONS Lower-extremity muscle strength and ankle plantarflexion ROM influenced static balance control ability, with ankle plantarflexion PROM showing the greatest influence. Therefore, both contractile structures and non-contractile structures should be of interest when considering static balance control ability improvement.
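As an illustration of the multiple-linear-regression step used in studies like this one, a sketch with synthetic data; all numbers, variable names, and coefficients below are invented, and only the modelling pattern matches the study:

```python
import numpy as np

# Hypothetical example: predict postural sway length from ankle
# plantarflexion PROM and a lower-extremity strength measure.
rng = np.random.default_rng(3)
n = 55
prom = rng.uniform(30, 60, n)          # plantarflexion PROM, degrees (invented)
strength = rng.uniform(10, 30, n)      # muscle strength, kg (invented)
sway = 0.7 * prom - 0.2 * strength + rng.normal(0, 2.0, n)

# Ordinary least squares with an intercept column.
X = np.column_stack([np.ones(n), prom, strength])
coef, *_ = np.linalg.lstsq(X, sway, rcond=None)
# coef[1] and coef[2] estimate the per-unit influence of each predictor,
# analogous to the B values reported in the abstract.
```

With several correlated predictors, the multiple-regression coefficients (rather than the simple-regression ones) indicate each variable's influence holding the others fixed, which is why the study reports both analyses.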
Control designs and stability analyses for Helly’s car-following model
Rosas-Jaimes, Oscar A.; Quezada-Téllez, Luis A.; Fernández-Anaya, Guillermo
Car-following is an approach to understanding traffic behavior that is restricted to pairs of cars, identifying a "leader" moving in front of a "follower", which is assumed not to overtake the leader. From the first attempts to formulate the way in which individual cars in a road affect one another through these models, linear differential equations were suggested by authors such as Pipes and Helly. These expressions represent such phenomena quite well, even though they have been superseded by more recent and accurate models. In this paper, however, we show that those early formulations have some properties that are not fully reported, presenting the different ways in which they can be expressed and analyzing their stability behaviors. Pipes' model can be extended to what is known as Helly's model, which is viewed as a more precise model for emulating this microscopic approach to traffic. Once some convenient forms of expression are established, two control designs are suggested herein. These regulation schemes are complemented with their respective stability analyses, which reflect some important properties with implications for real driving. Notably, these linear designs are easy to understand and to implement, including the important features related to safety and comfort.
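A minimal sketch of Helly's linear car-following model under forward Euler integration, with the reaction delay omitted; the parameter values (c1, c2, alpha, beta) and initial conditions are illustrative assumptions, not values from the paper:

```python
def simulate_helly(t_end=60.0, dt=0.01, c1=0.5, c2=0.2,
                   alpha=5.0, beta=1.0, v_lead=20.0):
    """Helly's model: follower acceleration responds to the relative
    speed and to the deviation of the gap from a desired distance
    D = alpha + beta * v_follower. Leader cruises at constant speed."""
    n = int(t_end / dt)
    x_l, x_f = 50.0, 0.0        # leader starts 50 m ahead
    v_f = 10.0                  # follower starts slower than the leader
    for _ in range(n):
        gap = x_l - x_f
        desired = alpha + beta * v_f
        a_f = c1 * (v_lead - v_f) + c2 * (gap - desired)
        x_l += v_lead * dt      # advance positions with current speeds
        x_f += v_f * dt
        v_f += a_f * dt         # then update follower speed
    return v_f, x_l - x_f

v_f, gap = simulate_helly()
```

For these gains the linearized dynamics (s² + (c1 + c2·beta)s + c2 = 0) have roots with negative real part, so the follower settles to the leader's speed (20 m/s) and to the desired gap alpha + beta·20 = 25 m, which is the stable behavior the paper's control designs aim to guarantee.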
International Nuclear Information System (INIS)
Openshaw, F.L.; Chan, T.W.
1980-07-01
This paper presents a description of the analyses of the control/protective system preliminary designs for the gas turbine high-temperature gas-cooled reactor (GT-HTGR) power plant. The control system is designed to regulate reactor power, control electric load and turbine speed, control the temperature of the helium delivered to the turbines, and control thermal transients experienced by the system components. In addition, it provides the required control programming for startup, shutdown, load ramp, and other expected operations. The control system also handles conditions imposed on the system during upset and emergency conditions such as loop trip, reactor trip, or electrical load rejection
International Nuclear Information System (INIS)
Jennings, C.D.; Mount, M.E.
1983-08-01
More than 16,000 radiochemical analyses were performed on about 5400 samples of soils, vegetation, animals, fish, invertebrates, and water to establish the amounts of 90Sr, 137Cs, 241Am, and plutonium isotopes in the Northern Marshall Islands. Three laboratories were contracted by Lawrence Livermore National Laboratory to perform the radiochemical analyses: Environmental Analysis Laboratory (EAL), Richmond, California; Eberline Instrument Corporation (EIC), Albuquerque, New Mexico; and the Laboratory of Radiation Ecology (LRE), University of Washington, Seattle, Washington. The analytical precision and accuracy were monitored by regularly including duplicate samples and natural-matrix standards in each group of about 100 samples analyzed. Based on the duplicates and standards, over 83% of the radiochemical analyses in this survey were acceptable: 97% of the analyses by EAL, 45% of the analyses by EIC, and 98% of the analyses by LRE.
Differentiating regressed melanoma from regressed lichenoid keratosis.
Chan, Aegean H; Shulman, Kenneth J; Lee, Bonnie A
2017-04-01
Distinguishing regressed lichen planus-like keratosis (LPLK) from regressed melanoma can be difficult on histopathologic examination, potentially resulting in mismanagement of patients. We aimed to identify histopathologic features by which regressed melanoma can be differentiated from regressed LPLK. Twenty actively inflamed LPLK, 12 LPLK with regression and 15 melanomas with regression were compared and evaluated by hematoxylin and eosin staining as well as Melan-A, microphthalmia transcription factor (MiTF) and cytokeratin (AE1/AE3) immunostaining. (1) A total of 40% of regressed melanomas showed complete or near complete loss of melanocytes within the epidermis with Melan-A and MiTF immunostaining, while 8% of regressed LPLK exhibited this finding. (2) Necrotic keratinocytes were seen in the epidermis in 33% of regressed melanomas as opposed to all of the regressed LPLK. (3) A dense infiltrate of melanophages in the papillary dermis was seen in 40% of regressed melanomas, a feature not seen in regressed LPLK. In summary, our findings suggest that a complete or near complete loss of melanocytes within the epidermis strongly favors a regressed melanoma over a regressed LPLK. In addition, necrotic epidermal keratinocytes and the presence of a dense band-like distribution of dermal melanophages can be helpful in differentiating these lesions. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Energy Technology Data Exchange (ETDEWEB)
HIGGINS,J.C.; OHARA,J.M.; ALMEIDA,P.
2002-09-19
The José Cabrera nuclear power plant is a one-loop Westinghouse pressurized water reactor. In the control room, the displays and controls used by operators for the emergency operating procedures are distributed on front and back panels. This configuration contributed to risk in the probabilistic safety assessment where important operator actions are required. This study was undertaken to evaluate the impact of the design on crew performance and plant safety and to develop design improvements. Five potential effects were identified. Then NUREG-0711 [1] programmatic human factors analyses were conducted to systematically evaluate the control room layout to determine if there was evidence of the potential effects. These analyses included operating experience review, PSA review, task analyses, and walkthrough simulations. Based on the results of these analyses, a variety of control room modifications were identified. From the alternatives, a selection was made that provided a reasonable balance between performance, risk, and economics, and modifications were made to the plant.
Reid, M. E.; Iverson, R. M.; Brien, D. L.; Iverson, N. R.; Lahusen, R. G.; Logan, M.
2004-12-01
Most studies of landslide initiation employ limit-equilibrium analyses of slope stability. Owing to a lack of detailed data, however, few studies have tested limit-equilibrium predictions against physical measurements of slope failure. We have conducted a series of field-scale, highly controlled landslide initiation experiments at the USGS debris-flow flume in Oregon; these experiments provide exceptional data to test limit-equilibrium methods. In each of seven experiments, we attempted to induce failure in a 0.65-m-thick, 2-m-wide, 6-m³ prism of loamy sand placed behind a retaining wall in the 31° sloping flume. We systematically investigated triggering of sliding by groundwater injection, by prolonged moderate-intensity sprinkling, and by bursts of high-intensity sprinkling. We also used vibratory compaction to control soil porosity and thereby investigate differences in failure behavior of dense and loose soils. About 50 sensors were monitored at 20 Hz during the experiments, including nests of tiltmeters buried at 7 cm spacing to define subsurface failure geometry, and nests of tensiometers and pore-pressure sensors to define evolving pore-pressure fields. In addition, we performed ancillary laboratory tests to measure soil porosity, shear strength, hydraulic conductivity, and compressibility. In loose soils (porosity of 0.52 to 0.55), abrupt failure typically occurred along the flume bed after substantial soil deformation. In denser soils (porosity of 0.41 to 0.44), gradual failure occurred within the soil prism. All failure surfaces had a maximum length to depth ratio of about 7. In even denser soil (porosity of 0.39), we could not induce failure by sprinkling. The internal friction angle of the soils varied from 28° to 40° with decreasing porosity. We analyzed stability at failure, given the observed pore-pressure conditions just prior to large movement, using a 1-D infinite-slope method and a more complete 2-D Janbu method. Each method provides a static
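The 1-D infinite-slope method mentioned in the abstract can be sketched with the standard factor-of-safety formula; the parameter values below are illustrative (only the 31° slope angle and 0.65 m depth come from the experiments), not the authors' measured soil properties.

```python
import math

def infinite_slope_fs(c, phi_deg, gamma, z, theta_deg, u=0.0):
    """Static factor of safety for an infinite slope.

    c         -- effective cohesion (kPa)
    phi_deg   -- effective friction angle (degrees)
    gamma     -- soil unit weight (kN/m^3)
    z         -- vertical depth to the failure surface (m)
    theta_deg -- slope angle (degrees)
    u         -- pore-water pressure on the failure plane (kPa)
    """
    phi = math.radians(phi_deg)
    theta = math.radians(theta_deg)
    resisting = c + (gamma * z * math.cos(theta) ** 2 - u) * math.tan(phi)
    driving = gamma * z * math.sin(theta) * math.cos(theta)
    return resisting / driving

# Dry, cohesionless soil with phi equal to the slope angle sits exactly at
# limiting equilibrium (FS = 1); rising pore pressure pushes FS below 1.
print(infinite_slope_fs(c=0.0, phi_deg=31.0, gamma=18.0, z=0.65, theta_deg=31.0))
```

This is why the experiments trigger failure by groundwater injection and sprinkling: the pore-pressure term `u` erodes the frictional resistance without changing the driving stress.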
International Nuclear Information System (INIS)
Subba Rao, R.V.
2016-01-01
CORAL (COmpact facility for Reprocessing of Advanced fuels in Lead cell) is an experimental facility for demonstrating the reprocessing of irradiated fast reactor fuels discharged from the Fast Breeder Test Reactor (FBTR). The objective of the reprocessing plant is to achieve nuclear-grade plutonium and uranium oxides with minimum process waste volumes. The process flow sheet for the reprocessing of spent fast reactor fuel consists of transport of spent fuel, chopping, dissolution, feed conditioning, a solvent extraction cycle, a partitioning cycle, and re-conversion of plutonium nitrate and uranium nitrate to the respective oxides. The efficiency and performance of the plant in achieving the desired objective depend on the analyses of various species in the different steps adopted during reprocessing of fuels. The analytical requirements in the plant can be broadly classified as: 1. process control analyses (analyses which affect the performance of the plant - PCA); 2. plant control analyses (analyses which indicate the efficiency of the plant - PLCA); 3. nuclear material accounting samples (analyses which have a bearing on nuclear material accounting in the plant - NUMAC); and 4. quality control analyses (quality of the input bulk chemicals as well as products - QCA). The analytical methods selected are based on the duration of analyses and the precision and accuracy required for each type of analytical requirement classified earlier. The process and plant control analyses require lower precision and accuracy as compared to NUMAC analyses, which require very high precision and accuracy. The time taken for analyses should be as low as possible for process and plant control analyses as compared to NUMAC analyses. The analytical methods required for determining U and Pu in process and plant samples from FRFR will be different from those for samples from TRFR (Thermal Reactor Fuel Reprocessing) due to the higher Pu to U ratio in FRFR as compared to TRFR, and they should be such that they can be easily
Pedrini, D. T.; Pedrini, Bonnie C.
Regression, another mechanism studied by Sigmund Freud, has had much research, e.g., hypnotic regression, frustration regression, schizophrenic regression, and infra-human-animal regression (often directly related to fixation). Many investigators worked with hypnotic age regression, which has a long history, going back to Russian reflexologists.…
Directory of Open Access Journals (Sweden)
Suely Godoy Agostinho Gimeno
1995-08-01
Full Text Available Data of a case-control study of esophageal cancer were used as an example of the use of multivariate analysis with stratification and logistic regression. Eighty-five cases and 292 controls were classified according to sex, age, and smoking and drinking habits. The point estimates of the odds ratios were similar, and the two techniques were considered complementary.
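The complementarity of stratification and regression mentioned above can be made concrete with a small numeric sketch: a Mantel-Haenszel stratified odds ratio versus the crude (pooled) odds ratio. The counts are invented to show confounding and are not the study's data.

```python
import numpy as np

# Each stratum is a 2x2 table: [[exposed cases, exposed controls],
#                               [unexposed cases, unexposed controls]]
strata = [np.array([[90, 10], [50, 50]]),   # stratum 1: OR = 9
          np.array([[10, 50], [2, 90]])]    # stratum 2: OR = 9

def odds_ratio(t):
    return (t[0, 0] * t[1, 1]) / (t[0, 1] * t[1, 0])

def mantel_haenszel_or(tables):
    """Stratum-adjusted odds ratio (Mantel-Haenszel estimator)."""
    num = sum(t[0, 0] * t[1, 1] / t.sum() for t in tables)
    den = sum(t[0, 1] * t[1, 0] / t.sum() for t in tables)
    return num / den

crude = odds_ratio(sum(strata))          # collapses over the stratifier
adjusted = mantel_haenszel_or(strata)
print(crude, adjusted)                   # crude ~4.49 is confounded; adjusted is 9.0
```

A logistic regression with the stratifier as a covariate would target the same adjusted association, which is why the two approaches are treated as complementary checks on each other.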
Kim, Seong-Gil
2018-01-01
Background The purpose of this study was to investigate the effect of ankle ROM and lower-extremity muscle strength on static balance control ability in young adults. Material/Methods This study was conducted with 65 young adults, but 10 dropped out during the measurement, so 55 young adults (male: 19, female: 36) completed the study. Postural sway (length and velocity) was measured with eyes open and closed, and ankle ROM (AROM and PROM of dorsiflexion and plantarflexion) and lower-extremity muscle strength (flexors and extensors of the hip, knee, and ankle joints) were measured. The Pearson correlation coefficient was used to examine the correlation between variables and static balance ability. Simple linear regression analysis and multiple linear regression analysis were used to examine the effect of variables on static balance ability. Results In correlation analysis, plantarflexion ROM (AROM and PROM) and lower-extremity muscle strength (except the hip extensor) were significantly correlated with postural sway. In regression analysis, plantarflexion PROM with eyes open significantly influenced sway length (B=0.681) and sway velocity (B=0.011). Conclusions Lower-extremity muscle strength and ankle plantarflexion ROM influenced static balance control ability, with ankle plantarflexion PROM showing the greatest influence. Therefore, both contractile structures and non-contractile structures should be of interest when considering static balance control ability improvement. PMID:29760375
Better Autologistic Regression
Directory of Open Access Journals (Sweden)
Mark A. Wolters
2017-11-01
Full Text Available Autologistic regression is an important probability model for dichotomous random variables observed along with covariate information. It has been used in various fields for analyzing binary data possessing spatial or network structure. The model can be viewed as an extension of the autologistic model (also known as the Ising model, quadratic exponential binary distribution, or Boltzmann machine) to include covariates. It can also be viewed as an extension of logistic regression to handle responses that are not independent. Not all authors use exactly the same form of the autologistic regression model. Variations of the model differ in two respects. First, the variable coding (the two numbers used to represent the two possible states of the variables) might differ; common coding choices are (zero, one) and (minus one, plus one). Second, the model might appear in either of two algebraic forms: a standard form, or a recently proposed centered form. Little attention has been paid to the effect of these differences, and the literature shows ambiguity about their importance. It is shown here that changes to either coding or centering in fact produce distinct, non-nested probability models. Theoretical results, numerical studies, and analysis of an ecological data set all show that the differences among the models can be large and practically significant. Understanding the nature of the differences and making appropriate modeling choices can lead to significantly improved autologistic regression analyses. The results strongly suggest that the standard model with plus/minus coding, which we call the symmetric autologistic model, is the most natural choice among the autologistic variants.
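The coding issue can be seen directly by brute force: on a tiny graph, plugging the same numerical parameter values into the (0, 1)-coded and the (-1, +1)-coded autologistic model yields different distributions. The graph and parameter values below are illustrative, not taken from the paper.

```python
import itertools
import numpy as np

edges = [(0, 1), (1, 2), (2, 3)]   # a 4-node chain graph
alpha, eta = 0.4, 0.6              # unary and pairwise parameters (illustrative)

def marginal_high(coding):
    """P(node 0 is in the 'high' state), by enumerating all 2^4 configurations
    of the autologistic model under the given two-value coding."""
    lo, hi = coding
    weights, high0 = [], []
    for states in itertools.product(coding, repeat=4):
        energy = alpha * sum(states) + eta * sum(states[i] * states[j]
                                                 for i, j in edges)
        weights.append(np.exp(energy))
        high0.append(states[0] == hi)
    w = np.array(weights)
    return w[np.array(high0)].sum() / w.sum()

p01 = marginal_high((0, 1))
ppm = marginal_high((-1, 1))
print(p01, ppm)   # the two codings give different marginal probabilities
```

Since the two marginals differ for the same parameter values, the codings cannot be treated as interchangeable parameterizations, which is the practical point the abstract makes.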
Rhythm vs. rate control of atrial fibrillation meta-analysed by number needed to treat
Kumana, Cyrus R; Cheung, Bernard M Y; Cheung, Giselle T Y; Ovedal, Tori; Pederson, Bjorn; Lauder, Ian J
2005-01-01
Background: Whenever feasible, rhythm control of atrial fibrillation (AF) was generally preferred over rate control, in the belief that it offered better symptomatic relief and quality of life, and eliminated the need for anticoagulation. However, recent trials appear to challenge these assumptions. Aims: To explore the desirability of rhythm vs. rate control of AF by systematic review of pertinent, published, randomized controlled trials (RCTs) and a meta-analysis by number needed to treat (...
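The number-needed-to-treat metric used for this meta-analysis is the reciprocal of the absolute risk reduction, conventionally rounded up; a minimal sketch with hypothetical event rates (not the trial results) follows.

```python
import math

def nnt(control_event_rate, treatment_event_rate):
    """Number needed to treat: 1 / |absolute risk reduction|, rounded up
    to the next whole patient; infinite when the rates are equal."""
    arr = control_event_rate - treatment_event_rate
    if arr == 0:
        return math.inf
    return math.ceil(1.0 / abs(arr))

# e.g. an endpoint occurring in 20% of one arm vs 12% of the other
print(nnt(0.20, 0.12))   # -> 13
```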
The control of a free-piston engine generator. Part 1: Fundamental analyses
Energy Technology Data Exchange (ETDEWEB)
Mikalsen, R.; Roskilly, A.P. [Sir Joseph Swan Institute for Energy Research, Newcastle University, Newcastle upon Tyne, NE1 7RU, England (United Kingdom)
2010-04-15
Free-piston engines are under investigation by a number of research groups due to potential fuel efficiency and exhaust emissions advantages over conventional technology. The main challenge with such engines is the control of the piston motion, and this has not yet been fully resolved for all types of free-piston engines. This paper discusses the basic features of a single piston free-piston engine generator under development at Newcastle University and investigates engine control issues using a full-cycle simulation model. Control variables and disturbances are identified, and a control strategy is proposed. It is found that the control of the free-piston engine is a challenge, but that the proposed control strategy is feasible. Engine speed control does, however, represent a challenge in the current design. (author)
Directory of Open Access Journals (Sweden)
Peter A Coventry
Full Text Available Collaborative care is a complex intervention based on chronic disease management models and is effective in the management of depression. However, there is still uncertainty about which components of collaborative care are effective. We used meta-regression to identify factors in collaborative care associated with improvement in patient outcomes (depressive symptoms) and the process of care (use of anti-depressant medication). Systematic review with meta-regression. The Cochrane Collaboration Depression, Anxiety and Neurosis Group trials registers were searched from inception to 9th February 2012. An update was run in the CENTRAL trials database on 29th December 2013. Inclusion criteria were: randomised controlled trials of collaborative care for adults ≥18 years with a primary diagnosis of depression or mixed anxiety and depressive disorder. Random effects meta-regression was used to estimate regression coefficients with 95% confidence intervals (CIs) between study-level covariates and depressive symptoms, and relative risk (95% CI) for anti-depressant use. The association between anti-depressant use and improvement in depression was also explored. Seventy-four trials were identified (85 comparisons, across 21,345 participants). Collaborative care that included psychological interventions predicted improvement in depression (β coefficient -0.11, 95% CI -0.20 to -0.01, p = 0.03). Systematic identification of patients (relative risk 1.43, 95% CI 1.12 to 1.81, p = 0.004) and the presence of a chronic physical condition (relative risk 1.32, 95% CI 1.05 to 1.65, p = 0.02) predicted use of anti-depressant medication. Trials of collaborative care that included psychological treatment, with or without anti-depressant medication, appeared to improve depression more than those without psychological treatment. Trials that used systematic methods to identify patients with depression and also trials that included patients with a chronic physical
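A random-effects meta-regression of the kind described can be sketched with a method-of-moments (DerSimonian-Laird-style) between-study variance estimate; the effect sizes, variances, and moderator below are invented to mimic the setup (a binary "psychological component" moderator), not the review's data.

```python
import numpy as np

def meta_regression(y, v, x):
    """Random-effects meta-regression with a method-of-moments tau^2.
    y: study effect sizes, v: within-study variances, x: study-level
    moderator. Returns (intercept, slope, tau2)."""
    X = np.column_stack([np.ones_like(x), x])
    W = np.diag(1.0 / v)                       # fixed-effect weights
    XtWX_inv = np.linalg.inv(X.T @ W @ X)
    beta_fe = XtWX_inv @ X.T @ W @ y
    resid = y - X @ beta_fe
    Q = float(resid @ W @ resid)               # residual heterogeneity
    k, p = X.shape
    trace_term = np.trace(W) - np.trace(XtWX_inv @ X.T @ W @ W @ X)
    tau2 = max(0.0, (Q - (k - p)) / trace_term)
    W_re = np.diag(1.0 / (v + tau2))           # random-effects weights
    beta = np.linalg.inv(X.T @ W_re @ X) @ X.T @ W_re @ y
    return beta[0], beta[1], tau2

# Hypothetical trials: effects (SMDs) are more negative when the moderator
# (e.g. psychological component present = 1) is set.
y = np.array([-0.10, -0.25, -0.05, -0.30, -0.22, -0.02])
v = np.array([0.010, 0.020, 0.015, 0.012, 0.018, 0.011])
x = np.array([0.0, 1.0, 0.0, 1.0, 1.0, 0.0])
b0, b1, tau2 = meta_regression(y, v, x)
print(b0, b1, tau2)   # negative slope: moderator predicts larger improvement
```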
Liefooghe, Baptist; De Houwer, Jan
2016-02-01
Cognitive control is an important mental ability that is examined using a multitude of cognitive control tasks and effects. The present paper presents the first steps in the elaboration of a functional approach, which aims to uncover the communalities and differences between different cognitive control tasks and their effects. Based on the idea that responses in cognitive control tasks qualify as operant behaviour, we propose to reinterpret cognitive control tasks in terms of operant contingencies and cognitive control effects as instances of moderated stimulus control. We illustrate how our approach can be used to uncover communalities between topographically different cognitive control tasks and can lead to novel questions about the processes underlying cognitive control. © 2015 International Union of Psychological Science.
Ostroff, Aaron J.; Hoffler, Keith D.; Proffitt, Melissa S.; Brown, Philip W.; Phillips, Michael R.; Rivers, Robert A.; Messina, Michael D.; Carzoo, Susan W.; Bacon, Barton J.; Foster, John F.
1994-01-01
This paper describes the design, analysis, and nonlinear simulation results (batch and piloted) for a longitudinal controller which is scheduled to be flight-tested on the High-Alpha Research Vehicle (HARV). The HARV is an F-18 airplane modified for and equipped with multi-axis thrust vectoring. The paper includes a description of the facilities, a detailed review of the feedback controller design, linear analysis results of the feedback controller, a description of the feed-forward controller design, nonlinear batch simulation results, and piloted simulation results. Batch simulation results include maximum pitch stick agility responses, angle-of-attack (alpha) captures, and alpha regulation for full lateral stick rolls at several alphas. Piloted simulation results include task descriptions for several types of maneuvers, task guidelines, the corresponding Cooper-Harper ratings from three test pilots, and some pilot comments. The ratings show that desirable criteria are achieved for almost all of the piloted simulation tasks.
Elliott, Stephen J.; Cheer, Jordan; Bhan, Lam; Shi, Chuang; Gan, Woon-Seng
2018-04-01
The active control of an incident sound field with an array of secondary sources is a fundamental problem in active noise control. In this paper the optimal performance of an infinite array of secondary sources in controlling a plane incident sound wave is first considered in free space. An analytic solution for normal incidence plane waves is presented, indicating a clear cut-off frequency for good performance, when the separation distance between the uniformly spaced sources is equal to a wavelength. The extent of the near field pressure close to the source array is also quantified, since this determines the positions of the error microphones in a practical arrangement. The theory is also extended to oblique incident waves. This result is then compared with numerical simulations of controlling the sound power radiated through an open aperture in a rigid wall, subject to an incident plane wave, using an array of secondary sources in the aperture. In this case the diffraction through the aperture becomes important when its size is comparable with the acoustic wavelength, in which case only a few sources are necessary for good control. When the size of the aperture is large compared with the wavelength, diffraction is less important but more secondary sources are needed for good control, and the results become similar to those for the free-field problem with an infinite source array.
DEFF Research Database (Denmark)
Johansen, Søren
2008-01-01
The reduced rank regression model is a multivariate regression model with a coefficient matrix with reduced rank. The reduced rank regression algorithm is an estimation procedure, which estimates the reduced rank regression model. It is related to canonical correlations and involves calculating...
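The estimation procedure described above can be sketched in a few lines: for an identity error covariance, the rank-constrained least-squares solution is the OLS coefficient matrix projected onto the leading singular directions of the fitted values. Data below are synthetic; this is a sketch of the standard construction, not the authors' algorithm.

```python
import numpy as np

def reduced_rank_regression(X, Y, rank):
    """Minimise ||Y - X B|| subject to rank(B) <= rank, via the SVD
    of the OLS fitted values (identity error-covariance weighting)."""
    B_ols, *_ = np.linalg.lstsq(X, Y, rcond=None)
    _, _, Vt = np.linalg.svd(X @ B_ols, full_matrices=False)
    P = Vt[:rank].T @ Vt[:rank]      # projector onto leading response directions
    return B_ols @ P

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 5))
B_true = rng.standard_normal((5, 1)) @ rng.standard_normal((1, 4))  # rank 1
Y = X @ B_true + 0.01 * rng.standard_normal((100, 4))
B1 = reduced_rank_regression(X, Y, rank=1)
print(np.linalg.matrix_rank(B1))   # 1
```

The connection to canonical correlations noted in the abstract enters through the singular directions: they span the subspace of the responses best explained by the predictors.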
International Nuclear Information System (INIS)
El-Madbouly, E.I.; Shaat, M.K.; Shokr, A.M.; Elrefaei, G.H.
2009-01-01
The functions of the Instrumentation and Control (I and C) system in research reactors, the changes in its design according to the advances in the technology, and the internationally established safety requirements on the design and operational performance of this system are reviewed. The main features of the communication networks commonly used in the Supervision and Control systems (SCS) are presented. A methodology for the performance analysis of the communication networks of computer-based distributed SCS is developed and presented along with discussions. Application of this methodology to a modern SCS of a typical research reactor is illustrated. (orig.)
DEFF Research Database (Denmark)
Sjolie, A.K.; Klein, R.; Porta, M.
2008-01-01
-group, placebo-controlled trial in 309 centres worldwide. We recruited normoalbuminuric, normotensive, or treated hypertensive people with type 2 diabetes with mild to moderately severe retinopathy and assigned them to candesartan 16 mg once a day or placebo. After a month, the dose was doubled to 32 mg once per...... day. Investigators and patients were unaware of the treatment allocation status. Progression of retinopathy was the primary endpoint, and regression was a secondary endpoint. Analysis was by intention to treat. The trial is registered with ClinicalTrials.gov, number NCT00252694. FINDINGS: 1905...... or changes in blood pressure during the trial. An overall change towards less severe retinopathy by the end of the trial was observed in the candesartan group (odds 1.17, 95% CI 1.05-1.30, p=0.003). Adverse events did not differ between the treatment groups. INTERPRETATION: Treatment with candesartan in type...
DEFF Research Database (Denmark)
Henneberg, Morten; Jørgensen, Bent; Eriksen, René Lynge
2016-01-01
In this paper, we present an oil condition and wear debris evaluation method for ship thruster gears using T2 statistics to form control charts from a multi-sensor platform. The proposed method takes into account the different ambient conditions by multiple linear regression on the mean value...... only quasi-stationary data are included in phase I of the T2 statistics. Data from two thruster gears onboard two different ships are presented and analyzed, and the selection of the phase I data size is discussed. A graphic overview for quick localization of T2 signaling is also demonstrated using...... spider plots. Finally, progression and trending of the T2 statistics are investigated using orthogonal polynomials for a fix-sized data window....
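The T² statistics underlying such control charts can be sketched as follows; the sensor data are simulated, and the phase-I computation shown is the textbook Hotelling construction rather than the authors' full multi-sensor pipeline.

```python
import numpy as np

def hotelling_t2(X):
    """Phase-I Hotelling T^2 statistic for each multivariate observation:
    T2_i = (x_i - xbar)' S^{-1} (x_i - xbar)."""
    xbar = X.mean(axis=0)
    S = np.cov(X, rowvar=False)        # sample covariance (m-1 divisor)
    S_inv = np.linalg.inv(S)
    D = X - xbar
    return np.einsum('ij,jk,ik->i', D, S_inv, D)

rng = np.random.default_rng(42)
m, p = 50, 3                           # 50 snapshots from 3 sensor channels
X = rng.standard_normal((m, p))
t2 = hotelling_t2(X)
# identity check: phase-I T^2 values always sum to p * (m - 1)
print(t2.sum(), p * (m - 1))
```

Observations whose T² exceeds the phase-I control limit would be excluded before trending, which parallels the abstract's restriction to quasi-stationary data in phase I.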
Regression analysis with categorized regression calibrated exposure: some interesting findings
Directory of Open Access Journals (Sweden)
Hjartåker Anette
2006-07-01
Full Text Available Abstract Background Regression calibration as a method for handling measurement error is becoming increasingly well-known and used in epidemiologic research. However, the standard version of the method is not appropriate for exposure analyzed on a categorical (e.g., quintile) scale, an approach commonly used in epidemiologic studies. A tempting solution could then be to use the predicted continuous exposure obtained through the regression calibration method and treat it as an approximation to the true exposure, that is, include the categorized calibrated exposure in the main regression analysis. Methods We use semi-analytical calculations and simulations to evaluate the performance of the proposed approach compared to the naive approach of not correcting for measurement error, in situations where analyses are performed on quintile scale and when incorporating the original scale into the categorical variables, respectively. We also present analyses of real data, containing measures of folate intake and depression, from the Norwegian Women and Cancer study (NOWAC). Results In cases where extra information is available through replicated measurements and not validation data, regression calibration does not maintain important qualities of the true exposure distribution, thus estimates of variance and percentiles can be severely biased. We show that the outlined approach maintains much, in some cases all, of the misclassification found in the observed exposure. For that reason, regression analysis with the corrected variable included on a categorical scale is still biased. In some cases the corrected estimates are analytically equal to those obtained by the naive approach. Regression calibration is however vastly superior to the naive method when applying the medians of each category in the analysis. Conclusion Regression calibration in its most well-known form is not appropriate for measurement error correction when the exposure is analyzed on a
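A minimal simulation illustrates the abstract's central point: linear regression calibration shrinks the exposure, so the calibrated variable does not reproduce the true exposure's spread, and categorizing it leaves substantial misclassification. All distributions and parameters below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100_000
x = rng.standard_normal(n)              # true exposure
w = x + 0.8 * rng.standard_normal(n)    # observed exposure with measurement error

# linear regression calibration: E[X | W] shrinks W toward its mean
lam = np.var(x) / np.var(w)
x_cal = lam * w

def quintile(v):
    """Quintile category (0..4) of each value within its own distribution."""
    return np.searchsorted(np.quantile(v, [0.2, 0.4, 0.6, 0.8]), v)

# the calibrated exposure has a compressed spread ...
print(np.var(x_cal), np.var(x))
# ... and its quintiles still misclassify many subjects vs. the true quintiles
misclassified = np.mean(quintile(x_cal) != quintile(x))
print(misclassified)
```

Because calibration is a monotone transform of `w`, categorizing `x_cal` reproduces exactly the quintiles of the error-prone measurement, which is why the bias persists on the categorical scale.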
Leucht, Stefan; Leucht, Claudia; Huhn, Maximilian; Chaimani, Anna; Mavridis, Dimitris; Helfer, Bartosz; Samara, Myrto; Rabaioli, Matteo; Bächer, Susanne; Cipriani, Andrea; Geddes, John R; Salanti, Georgia; Davis, John M
2017-10-01
Antipsychotic drug efficacy may have decreased over recent decades. The authors present a meta-analysis of all placebo-controlled trials in patients with acute exacerbations of schizophrenia, and they investigate which trial characteristics have changed over the years and which are moderators of drug-placebo efficacy differences. The search included multiple electronic databases. The outcomes were overall efficacy (primary outcome); responder and dropout rates; positive, negative, and depressive symptoms; quality of life; functioning; and major side effects. Potential moderators of efficacy were analyzed by meta-regression. The analysis included 167 double-blind randomized controlled trials with 28,102 mainly chronic participants. The standardized mean difference (SMD) for overall efficacy was 0.47 (95% credible interval 0.42, 0.51), but accounting for small-trial effects and publication bias reduced the SMD to 0.38. At least a "minimal" response occurred in 51% of the antipsychotic group versus 30% in the placebo group, and 23% versus 14% had a "good" response. Positive symptoms (SMD 0.45) improved more than negative symptoms (SMD 0.35) and depression (SMD 0.27). Quality of life (SMD 0.35) and functioning (SMD 0.34) improved even in the short term. Antipsychotics differed substantially in side effects. Of the response predictors analyzed, 16 trial characteristics changed over the decades. However, in a multivariable meta-regression, only industry sponsorship and increasing placebo response were significant moderators of effect sizes. Drug response remained stable over time. Approximately twice as many patients improved with antipsychotics as with placebo, but only a minority experienced a good response. Effect sizes were reduced by industry sponsorship and increasing placebo response, not decreasing drug response. Drug development may benefit from smaller samples but better-selected patients.
Hegazy, Maha A.; Lotfy, Hayam M.; Mowaka, Shereen; Mohamed, Ekram Hany
2016-07-01
Wavelets have been adapted for a vast number of signal-processing applications due to the amount of information that can be extracted from a signal. In this work, a comparative study was conducted on the efficiency of the continuous wavelet transform (CWT) as a signal-processing tool in univariate regression and as a pre-processing tool in multivariate analysis using partial least squares (CWT-PLS). These were applied to complex spectral signals of ternary and quaternary mixtures. The CWT-PLS method succeeded in the simultaneous determination of a quaternary mixture of drotaverine (DRO), caffeine (CAF), paracetamol (PAR) and p-aminophenol (PAP, the major impurity of paracetamol). In contrast, the univariate CWT failed to simultaneously determine the quaternary mixture components; it was able to determine only PAR and PAP, and the ternary mixtures of DRO, CAF, and PAR and of CAF, PAR, and PAP. During the calculations of CWT, different wavelet families were tested. The univariate CWT method was validated according to the ICH guidelines. For the development of the CWT-PLS model, a calibration set was prepared by means of an orthogonal experimental design, and the absorption spectra were recorded and processed by CWT. The CWT-PLS model was constructed by regression between the wavelet coefficients and concentration matrices, and validation was performed by both cross-validation and external validation sets. Both methods were successfully applied for the determination of the studied drugs in pharmaceutical formulations.
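The CWT pre-processing step can be sketched with a Ricker ("Mexican hat") wavelet applied to a toy spectrum; the wavelet choice, widths, and synthetic bands below are assumptions for illustration, not the wavelet family or spectra used in the study.

```python
import numpy as np

def ricker(points, a):
    """Ricker ('Mexican hat') wavelet of width parameter a."""
    t = np.arange(points) - (points - 1) / 2.0
    A = 2.0 / (np.sqrt(3.0 * a) * np.pi ** 0.25)
    return A * (1 - (t / a) ** 2) * np.exp(-t ** 2 / (2 * a ** 2))

def cwt(signal, widths):
    """Continuous wavelet transform: one row of coefficients per width."""
    out = np.empty((len(widths), len(signal)))
    for i, a in enumerate(widths):
        n = min(10 * int(a), len(signal))
        out[i] = np.convolve(signal, ricker(n, a), mode='same')
    return out

# a toy 'absorption spectrum': two overlapping Gaussian bands
grid = np.linspace(0, 1, 200)
spectrum = (np.exp(-((grid - 0.40) / 0.05) ** 2)
            + 0.5 * np.exp(-((grid - 0.55) / 0.08) ** 2))
coeffs = cwt(spectrum, widths=[2, 4, 8, 16])
print(coeffs.shape)   # (4, 200)
```

In a CWT-PLS workflow, rows of `coeffs` (or a selected scale) replace the raw spectra as the predictor block for the PLS regression against the concentration matrix.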
Analysing the control software of the Compact Muon Solenoid Experiment at the Large Hadron Collider
Hwong, Y.L.; Kusters, V.J.J.; Willemse, T.A.C.; Arbab, F.; Sirjani, M.
2012-01-01
The control software of the CERN Compact Muon Solenoid experiment contains over 30,000 finite state machines. These state machines are organised hierarchically: commands are sent down the hierarchy and state changes are sent upwards. The sheer size of the system makes it virtually impossible to
Backman, Erik
2011-01-01
Research indicates that outdoor teaching practices within a physical education (PE) context are controlled by several factors with the potential to weaken or strengthen PE teachers' communication of pedagogic messages. Drawing on 12 qualitative interviews with PE teachers in compulsory schools in Sweden, the findings in this study suggest that…
Social relations model analyses of perceived self-control and trust in families
Büyükcan Tetik, A.; Finkenauer, C.; Siersema, M.; Vander Heyden, K.; Krabbendam, L.
2015-01-01
How do people know which family member is trustworthy? In this study, the authors tested the hypothesis that people use their perception of a family member's self-control as an indicator of his or her trustworthiness. Eighty-four Dutch families consisting of 2 parents and 2 children completed
Analysing the Effectiveness of Wearable Wireless Sensors in Controlling Crowd Disasters
Teo, Y.H.A.; Viswanathan, V.; Lees, M.; Cai, W.
2014-01-01
The Love Parade disaster in Duisburg, Germany led to several deaths and injuries. Disasters like this occur due to the existence of high densities in a limited area. We propose a wearable electronic device that helps reduce such disasters by directing people and thus controlling the density of the
Evaluation of strength-controlling defects in paper by stress concentration analyses
John M. Considine; David W. Vahey; James W. Evans; Kevin T. Turner; Robert E. Rowlands
2011-01-01
Cellulosic webs, such as paper materials, are composed of an interwoven, bonded network of cellulose fibers. Strength-controlling parameters in these webs are influenced by constituent fibers and method of processing and manufacture. Instead of estimating the effect on tensile strength of each processing/manufacturing variable, this study modifies and compares the...
Analysing potato late blight control as a social-ecological system using fuzzy cognitive mapping
Pacilly, Francine C.A.; Groot, Jeroen C.J.; Hofstede, Gert Jan; Schaap, Ben F.; Lammerts van Bueren, Edith
2016-01-01
Potato late blight, caused by Phytophthora infestans, is one of the main diseases in potato production, causing major losses in yield. Applying environmentally harmful fungicides is the prevailing and classical method for controlling late blight, thus contaminating food and water. There is
Zeng, Yi; Qi, Shulan; Meng, Xing; Chen, Yinyin
2018-03-12
By analysing defects of control design in randomized controlled trials (RCTs) of acupuncture for simple obesity that used acupuncture as the control, we present the essential factors that should be taken into account when designing the control arm of a clinical trial, in order to further improve clinical research. Taking RCTs of acupuncture for simple obesity as an example, we searched for RCTs of acupuncture for simple obesity with an acupuncture control. According to the characteristics of acupuncture therapy, we sorted and analysed the control interventions with respect to acupoint selection, needle penetration, depth of insertion, etc., then counted the number of factors differing between the two groups and assessed their rationality. Of the 15 RCTs meeting the inclusion criteria, 7 were published in English and 8 in Chinese. In 6 trials (40%), the number of differing factors between the two groups was greater than 1 (4 published in English abroad, 2 in Chinese), while in 9 trials (60%) only 1 factor differed (3 published in English, 6 in Chinese). The control design of acupuncture in some clinical RCTs is unreasonable, as it does not consider the number of differing factors between the two groups.
A survey on the VXIbus and validity analyses for instrumentation and control in NPPs
International Nuclear Information System (INIS)
Kwon, Kee Choon; Park, Won Man
1997-06-01
This document presents the technical status of the VXIbus system and its interface. VMEbus, while developed as a backplane for Motorola processors, can be used for data acquisition, control, and other instrumentation applications. The VXIbus and its associated standard for form, fit, and electrical interface have simplified the process of putting together automated instrumentation systems. The VXIplug&play Systems Alliance was founded in 1993; the alliance's charter is to improve the effectiveness of VXI-based solutions by increasing ease-of-use and improving the interoperability of mainframes, computers, instrumentation, and software through open, multivendor standards and practices. This technical report surveys how instrumentation and control in NPPs can apply VXI-based instruments, which are studied with respect to expandability, interoperability, maintainability, and other features. (author). 10 refs., 4 tabs., 25 figs
Yi, Jun; Yang, Wenhong; Sun, Wen-Hua; Nomura, Kotohiro; Hada, Masahiko
2017-11-30
The NMR chemical shifts of vanadium (51V) in (imido)vanadium(V) dichloride complexes with imidazolin-2-iminato and imidazolidin-2-iminato ligands were calculated by the density functional theory (DFT) method with GIAO. The calculated 51V NMR chemical shifts were analyzed by the multiple linear regression (MLR) analysis (MLRA) method with a series of calculated molecular properties. Some of the calculated NMR chemical shifts were incorrect when the optimized molecular geometries of the X-ray structures were used. After the global minimum geometries of all of the molecules were determined, the trend of the observed chemical shifts was well reproduced by the present DFT method. The MLRA method was performed to investigate the correlation between the 51V NMR chemical shift and the natural charge, band energy gap, and Wiberg bond index of the V═N bond. The 51V NMR chemical shifts obtained with the present MLR model were well reproduced with a correlation coefficient of 0.97.
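The MLR step (regressing chemical shifts on a few computed molecular properties) can be sketched with ordinary least squares; every number below is invented for illustration, not the paper's descriptor or shift values.

```python
import numpy as np

# Hypothetical descriptors per complex: natural charge on V, band energy
# gap (eV), and Wiberg bond index of the V=N bond.
descriptors = np.array([
    [1.10, 3.9, 1.85],
    [1.05, 4.1, 1.92],
    [1.20, 3.6, 1.70],
    [1.15, 3.8, 1.78],
    [1.00, 4.3, 2.00],
    [1.12, 3.7, 1.80],
])
shifts = np.array([-520.0, -545.0, -470.0, -495.0, -570.0, -505.0])  # ppm, illustrative

# least-squares fit with intercept, plus the R^2 used to judge the model
X = np.column_stack([np.ones(len(shifts)), descriptors])
coef, *_ = np.linalg.lstsq(X, shifts, rcond=None)
pred = X @ coef
ss_res = np.sum((shifts - pred) ** 2)
ss_tot = np.sum((shifts - shifts.mean()) ** 2)
r2 = 1 - ss_res / ss_tot
print(coef, r2)
```

The paper's correlation coefficient of 0.97 corresponds to the square root of such an R² for its own descriptor set.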
Choice of Appropriate Control Values for Effective Analyses of Damage Detection
Directory of Open Access Journals (Sweden)
Venglár Michal
2017-03-01
The article is devoted to a suitable choice of input parameters for the efficient running of a self-developed code used for damage detection. The code, prepared in Office Excel VBA, uses a non-destructive vibration-based method. The primary aim of the code is to determine the change in bending stiffness by using the FE model updating method, and the aim of the paper is to determine the effect of the input data on the bending stiffness calculations. The code was applied to a numerical model of a steel bar, a simply supported beam with a span of 3.5 m. The time of the calculations and the precision of the identification were investigated. The time consumption depends on the input values, the desired limit of the accepted error, and the length of the step in every iteration. Data from an experimental model, made of wooden and plaster boards, were also analysed. The calculations were done with suitable input data from a parametric study.
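The updating loop at the heart of such a code can be caricatured with a one-parameter example: adjust the bending stiffness EI of a simply supported beam until the model's first natural frequency matches a measured value. All numbers below are assumptions for illustration, not the article's data.

```python
import math

# Toy stand-in for FE model updating (the real code updates a full FE model
# in Excel VBA): tune EI until the model's first natural frequency matches
# a "measured" one. All numbers are made up.
L_span = 3.5          # m, span of the simply supported beam
mu = 30.0             # kg/m, mass per unit length (assumed)
f_measured = 12.0     # Hz, "measured" first bending frequency (assumed)

def first_frequency(EI):
    # First bending frequency of a simply supported beam
    return (math.pi / (2.0 * L_span ** 2)) * math.sqrt(EI / mu)

EI = 1.0e6            # N*m^2, initial guess
tol = 1e-6            # accepted relative error, as in the parametric study
for step in range(100):
    f_model = first_frequency(EI)
    err = (f_measured - f_model) / f_measured
    if abs(err) < tol:
        break
    # frequency scales with sqrt(EI), so scale EI by the squared ratio
    EI *= (f_measured / f_model) ** 2

print(round(EI), step)
```

In the article's setting the unknown is a stiffness distribution along the beam rather than a single scalar, so the update step and the error limit interact with computation time, which is exactly the trade-off the parametric study examines.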
Glavind, J; Henriksen, T B; Kindberg, S F; Uldbjerg, N
2014-11-01
To evaluate women's preferences for the timing of elective cesarean section (ECS) scheduled prior to versus after 39 completed weeks. Secondary analyses from a randomized controlled open-label trial were conducted at seven Danish tertiary hospitals from March 2009 to June 2011, with inclusion of singleton pregnant women with a healthy fetus. The women were allocated by a computerized telephone system to ECS scheduled at 38(+3) or 39(+3) weeks of gestation. Dissatisfaction with the timing of ECS and the preferred timing of the procedure in a proposed future ECS delivery were evaluated. Data analyses were done by intention-to-treat, using logistic regression. A total of 1196 women (94%) completed an online questionnaire at follow-up eight weeks postpartum. In the 38-weeks group, 61 of 601 women (10%) were dissatisfied with the timing of their ECS, whereas in the 39-weeks group 157 of 595 (26%) were dissatisfied (adjOR 3.18, 95% CI 2.30; 4.40). The proportion of women who preferred the same timing in a future ECS was 272 (45%) in the 38-weeks group compared to 232 (39%) in the 39-weeks group (adjOR 0.75, 95% CI 0.60; 0.95). The women in this trial preferred ECS scheduled prior to 39 weeks of gestation.
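As a rough check on the reported effect size, the unadjusted odds ratio can be computed directly from the counts given above; it lands close to the adjusted value of 3.18.

```python
# Unadjusted odds ratio from the reported counts: 61/601 dissatisfied in the
# 38-weeks group vs 157/595 in the 39-weeks group.
a, n1 = 157, 595   # 39-weeks group: dissatisfied, total
b, n2 = 61, 601    # 38-weeks group: dissatisfied, total

odds_39 = a / (n1 - a)
odds_38 = b / (n2 - b)
or_unadjusted = odds_39 / odds_38
print(round(or_unadjusted, 2))  # -> 3.17
```

The small difference from 3.18 reflects the covariate adjustment in the trial's logistic regression model.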
Fungible weights in logistic regression.
Jones, Jeff A; Waller, Niels G
2016-06-01
In this article we develop methods for assessing parameter sensitivity in logistic regression models. To set the stage for this work, we first review Waller's (2008) equations for computing fungible weights in linear regression. Next, we describe 2 methods for computing fungible weights in logistic regression. To demonstrate the utility of these methods, we compute fungible logistic regression weights using data from the Centers for Disease Control and Prevention's (2010) Youth Risk Behavior Surveillance Survey, and we illustrate how these alternate weights can be used to evaluate parameter sensitivity. To make our work accessible to the research community, we provide R code (R Core Team, 2015) that will generate both kinds of fungible logistic regression weights. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
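The idea behind fungible weights can be illustrated numerically: after fitting a logistic model by maximum likelihood, one looks for alternate weight vectors whose log-likelihood is only slightly below the maximum. The sketch below uses synthetic data and a crude random search; it is not Waller's closed-form construction or the article's R code, and the tolerance of one log-likelihood unit is an arbitrary choice.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data (not the YRBS data used in the article)
n = 500
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
true_b = np.array([-0.3, 1.0, -0.7])
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-X @ true_b))).astype(float)

def loglik(b):
    eta = X @ b
    return float(y @ eta - np.sum(np.log1p(np.exp(eta))))

# Maximum-likelihood fit via Newton-Raphson
b = np.zeros(3)
for _ in range(25):
    p = 1.0 / (1.0 + np.exp(-X @ b))
    grad = X.T @ (y - p)
    hess = X.T @ (X * (p * (1.0 - p))[:, None])
    b = b + np.linalg.solve(hess, grad)

# "Fungible" weights: alternate vectors whose log-likelihood is only slightly
# below the maximum (here, within 1 unit), found by crude random search
L_max = loglik(b)
fungible = [b + d for d in rng.normal(scale=0.05, size=(2000, 3))
            if loglik(b + d) > L_max - 1.0]
print(len(fungible) > 0)
```

The many surviving vectors show how flat the likelihood surface is near the optimum, which is precisely the parameter-sensitivity question the fungible-weights framework formalizes.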
Grape juice quality control by means of {sup 1}H NMR spectroscopy and chemometric analyses
Energy Technology Data Exchange (ETDEWEB)
Grandizoli, Caroline Werner Pereira da Silva; Campos, Francinete Ramos; Simonelli, Fabio; Barison, Andersson [Universidade Federal do Paraná (UFPR), Curitiba (Brazil). Departamento de Química
2014-07-01
This work shows the application of {sup 1}H NMR spectroscopy and chemometrics to the quality control of grape juice. A wide range of quality assurance parameters were assessed by single {sup 1}H NMR experiments acquired directly from the juice. The investigation revealed that the conditions and time of storage should be revised and indicated on all labels. The sterilization process of homemade grape juices was efficient, making it possible to store them for long periods without additives. Furthermore, chemometric analysis classified the best commercial grape juices as similar to homemade grape juices, indicating that this approach can be used to verify authenticity and detect adulteration. (author)
Hernández, Domingo; Ruiz-Esteban, Pedro; Gaitán, Daniel; Burgos, Dolores; Mazuecos, Auxiliadora; Collantes, Rocío; Briceño, Eva; Palma, Eulalia; Cabello, Mercedes; González-Molina, Miguel; De Mora, Manuel
2014-04-23
Left ventricular hypertrophy (LVH) is common in kidney transplant (KT) recipients. LVH is associated with a worse outcome, though m-TOR therapy may help to reverse this complication. We therefore conducted a longitudinal study to assess morphological and functional echocardiographic changes after conversion from calcineurin inhibitor (CNI) to m-TOR inhibitor drugs in non-diabetic KT patients who had previously received RAS blockers during the follow-up. We undertook a 1-year nonrandomized controlled study in 30 non-diabetic KT patients who were converted from CNI to m-TOR therapy. A control group received immunosuppressive therapy based on CNIs. Two echocardiograms were done during the follow-up. Nineteen patients were switched to sirolimus (SRL) and 11 to everolimus (EVL). The m-TOR group showed a significant reduction in left ventricular mass index (LVMi) after 1 year (from 62 ± 22 to 55 ± 20 g/m2.7; P=0.003, paired t-test). A higher proportion of patients showing LVMi reduction was observed in the m-TOR group (53.3 versus 29.3%, P=0.048) at the study end. In addition, only 56% of the m-TOR patients had LVH at the study end, compared to 77% of the control group (P=0.047). A significant change from baseline in deceleration time in early diastole was observed in the m-TOR group compared with the control group (P=0.019). Switching from CNI to m-TOR therapy in non-diabetic KT patients may regress LVH, independently of blood pressure changes and follow-up time. This suggests a direct non-hemodynamic effect of m-TOR drugs on cardiac mass.
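The reported within-group change was tested with a paired t-test. A minimal sketch with hypothetical paired LVMi data (loosely mimicking the reported 62 ± 22 to 55 ± 20 change, but not the study's data) looks like this:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical paired LVMi values (g/m^2.7) for 30 patients; the assumed
# ~7-unit average reduction echoes the reported 62 -> 55 change.
baseline = rng.normal(62, 22, size=30)
one_year = baseline - rng.normal(7, 3, size=30)

t_stat, p_value = stats.ttest_rel(baseline, one_year)
print(round(t_stat, 2), p_value < 0.01)
```

The paired test is appropriate here because the two echocardiograms come from the same patients, so between-patient variability cancels out of the difference scores.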
Abstract Expression Grammar Symbolic Regression
Korns, Michael F.
This chapter examines the use of Abstract Expression Grammars to perform the entire Symbolic Regression process without the use of Genetic Programming per se. The techniques explored produce a symbolic regression engine that has absolutely no bloat, allows total user control of the search space and output formulas, and is faster and more accurate than the engines produced in our previous papers using Genetic Programming. The genome is an all-vector structure with four chromosomes plus additional epigenetic and constraint vectors, allowing total user control of the search space and the final output formulas. A combination of specialized compiler techniques, genetic algorithms, particle swarm, age-layered populations, plus discrete and continuous differential evolution is used to produce an improved symbolic regression system. Nine base test cases from the literature are used to test the improvement in speed and accuracy. The improved results indicate that these techniques move us a significant step closer to future industrial-strength symbolic regression systems.
Control Rod Withdrawal Events Analyses for the Prototype Gen-IV SFR
Energy Technology Data Exchange (ETDEWEB)
Choi, Chiwoong; Ha, Kwiseo; Jeong, Taekyeong; Jeong, Jaeho; Chang, Wonpyo; Lee, Seungwon; An, Sangjun; Lee, Kwilim [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)
2016-10-15
To confirm the limiting condition, based on the maximum allowable reactivity insertion of 0.3 $, three cases at the end of cycle (EOC) are selected. In addition, assuming failure of the CRSS due to an earthquake, an additional case is defined at the beginning of cycle (BOC). When a control rod withdrawal (CRW) occurs, the reactor can be protected by the plant protection system (PPS). In this study, the PPS mechanism is studied sequentially for all initiating events. For design basis accidents (DBAs), the reactor can be scrammed by the reactor protection system (RPS). The first and second RPS signals are checked during the transients. If the RPS fails, the so-called anticipated transient without scram (ATWS), the reactor is protected by the diverse protection system (DPS). In this study, in order to analyze the various initiating events related to control rod withdrawal, four kinds of operating conditions are defined. The TOP events are analyzed using MARS-LMR. The influence of the various plant protection systems, such as the RPS and DPS, is investigated.
Isotopic footprint: do forensic analyses improve forest control?
Directory of Open Access Journals (Sweden)
Ulrich Melessa
2013-10-01
In the Ecuadorian market, a high percentage of timber from tropical forests is of illegal origin. Illegal acts and infringements along the production chain are more frequent when the species concerned is valuable, such as mahogany (Swietenia macrophylla) and cedar (Cedrela odorata). In this regard, one of the most frequently falsified data is the geographical origin of the wood. To date there is no forensic scientific method for objectively and independently verifying the geographic origin stated in the documentation of traded timber. The analysis of isotope composition, known as an isotope fingerprint, has a clear spatial pattern and is feasible for this purpose. Samples of mahogany and cedar from Ecuador were contributed to build a geo-referenced database and to improve the method, making it more operational for control and surveillance programs. This article explains the problems related to the subject, the method, and its potential use.
Jaremka, Lisa M.; Derry, Heather M.; Bornstein, Robert; Prakash, Ruchika Shaurya; Peng, Juan; Belury, Martha A.; Andridge, Rebecca R.; Malarkey, William B.; Kiecolt-Glaser, Janice K.
2014-01-01
Objective Loneliness enhances risk for episodic memory declines over time. Omega-3 supplementation can improve cognitive function for people experiencing mild cognitive difficulties. Accordingly, we explored whether omega-3 supplementation would attenuate loneliness-related episodic memory problems. Methods Participants (N=138) from a parent randomized controlled trial (RCT) were randomized to the placebo, 1.25 grams/day of omega-3, or 2.50 grams/day of omega-3 conditions for a 4-month period. They completed a baseline loneliness questionnaire and a battery of cognitive tests both at baseline and at the end of the RCT. Results Controlling for baseline verbal episodic memory scores, lonelier people within the placebo condition had poorer verbal episodic memory post-supplementation, as measured by immediate (b = −0.28, t(117) = −2.62, p = .010) and long-delay (b = −.06, t(116) = −2.07, p = .040) free recall, than their less lonely counterparts. This effect was not observed in the 1.25 grams/day and 2.50 grams/day supplementation groups, all p values > .10. The plasma omega-6:omega-3 ratio data mirrored these results. There were no loneliness-related effects of omega-3 supplementation on short-delay recall or the other cognitive tests, all p values > .32. Conclusion These results suggest that omega-3 supplementation attenuates loneliness-related verbal episodic memory declines over time and support the utility of exploring novel interventions for treating episodic memory problems among lonely people. ClinicalTrials.gov identifier: NCT00385723 PMID:25264972
Directory of Open Access Journals (Sweden)
Barker Bridget M
2012-02-01
Background Aspergillus fumigatus is a mold responsible for the majority of cases of aspergillosis in humans. To survive in the human body, A. fumigatus must adapt to microenvironments that are often characterized by low nutrient and oxygen availability. Recent research suggests that the ability of A. fumigatus and other pathogenic fungi to adapt to hypoxia contributes to their virulence. However, the molecular mechanisms of A. fumigatus hypoxia adaptation are poorly understood. Thus, to better understand how A. fumigatus adapts to the hypoxic microenvironments found in vivo during human fungal pathogenesis, the dynamic changes of the fungal transcriptome and proteome in hypoxia were investigated over a period of 24 hours utilizing an oxygen-controlled fermenter system. Results Significant increases in transcripts associated with iron and sterol metabolism, the cell wall, the GABA shunt, and transcriptional regulators were observed in response to hypoxia. A concomitant reduction in transcripts was observed for ribosome and terpenoid backbone biosynthesis, the TCA cycle, amino acid metabolism, and RNA degradation. Analysis of changes in transcription factor mRNA abundance showed that hypoxia induces significant positive and negative changes that may be important for regulating the hypoxia response in this pathogenic mold. Growth in hypoxia resulted in changes in the protein levels of several glycolytic enzymes, but these changes were not always reflected by the corresponding transcriptional profiling data. However, a good correlation overall (R2 = 0.2) was observed between the transcriptomic and proteomic datasets for all time points. The lack of correlation between some transcript levels and their subsequent protein levels suggests another regulatory layer of the hypoxia response in A. fumigatus. Conclusions Taken together, our data suggest a robust cellular response that is likely regulated both at the transcriptional and post-transcriptional level in response to hypoxia by the human pathogenic mold A. fumigatus. As with other pathogenic fungi, the induction of glycolysis and transcriptional down-regulation of the TCA cycle and oxidative phosphorylation appear to be major components of the hypoxia response.
Risk Analyses of Charging Pump Control Improvements for Alternative RCP Seal Cooling
Energy Technology Data Exchange (ETDEWEB)
Lee, Eun-Chan [Korea Hydro and Nuclear Power Co. Ltd. Daejeon (Korea, Republic of)
2015-10-15
There are two events that significantly affect the plant risk during a total loss of component cooling water (TLOCCW) event. One is an event in which the seal assembly of a reactor coolant pump (RCP) fails due to heating stress from the loss of cooling water; the other is an event in which the operators fail to conduct alternative cooling for the RCP seal during the accident. KHNP reviewed the replacement of the RCP seal with a qualified shutdown seal in order to remove the risk due to RCP seal failure during a TLOCCW. As an optional measure, a design improvement in the alternative cooling method for the RCP seal is being considered. This analysis presents the alternative RCP seal cooling improvement and its safety effect. K2 is a nuclear power plant with a Westinghouse design, and it has a relatively high CDF during TLOCCW events because it has a different CCW system design and difficulty in preparing alternative cooling water sources. This analysis confirmed that an operator action providing cold water to the RWST as RCP seal injection water during a TLOCCW event is very important in K2. The control circuit improvement plan for the auxiliary charging pump was established in order to reduce the failure probability of this operator action. This analysis modeled the improvement as a fault tree and evaluated the resulting CDF change. The results demonstrated that the RCP seal injection failure probability was reduced by 89% and the CDF decreased by 28%.
Regression analysis by example
Chatterjee, Samprit
2012-01-01
Praise for the Fourth Edition: "This book is . . . an excellent source of examples for regression analysis. It has been and still is readily readable and understandable." -Journal of the American Statistical Association. Regression analysis is a conceptually simple method for investigating relationships among variables. Carrying out a successful application of regression analysis, however, requires a balance of theoretical results, empirical rules, and subjective judgment. Regression Analysis by Example, Fifth Edition has been expanded
Hagger, M.S.; Hardcastle, S.J.; Chater, A.; Mallett, C.; Pal, S.; Chatzisarantis, N.L.D.
2014-01-01
Self-determination theory has been applied to the prediction of a number of health-related behaviors with self-determined or autonomous forms of motivation generally more effective in predicting health behavior than non-self-determined or controlled forms. Research has been confined to examining the motivational predictors in single health behaviors rather than comparing effects across multiple behaviors. The present study addressed this gap in the literature by testing the relative contribution of autonomous and controlling motivation to the prediction of a large number of health-related behaviors, and examining individual differences in self-determined motivation as a moderator of the effects of autonomous and controlling motivation on health behavior. Participants were undergraduate students (N = 140) who completed measures of autonomous and controlled motivational regulations and behavioral intention for 20 health-related behaviors at an initial occasion with follow-up behavioral measures taken four weeks later. Path analysis was used to test a process model for each behavior in which motivational regulations predicted behavior mediated by intentions. Some minor idiosyncratic findings aside, between-participants analyses revealed significant effects for autonomous motivational regulations on intentions and behavior across the 20 behaviors. Effects for controlled motivation on intentions and behavior were relatively modest by comparison. Intentions mediated the effect of autonomous motivation on behavior. Within-participants analyses were used to segregate the sample into individuals who based their intentions on autonomous motivation (autonomy-oriented) and controlled motivation (control-oriented). Replicating the between-participants path analyses for the process model in the autonomy- and control-oriented samples did not alter the relative effects of the motivational orientations on intention and behavior. Results provide evidence for consistent effects
Goldie, Sue J; Daniels, Norman
2011-09-21
Disease simulation models of the health and economic consequences of different prevention and treatment strategies can guide policy decisions about cancer control. However, models that also consider health disparities can identify strategies that improve both population health and its equitable distribution. We devised a typology of cancer disparities that considers types of inequalities among black, white, and Hispanic populations across different cancers and characteristics important for near-term policy discussions. We illustrated the typology in the specific example of cervical cancer using an existing disease simulation model calibrated to clinical, epidemiological, and cost data for the United States. We calculated average reduction in cancer incidence overall and for black, white, and Hispanic women under five different prevention strategies (Strategies A1, A2, A3, B, and C) and estimated average costs and life expectancy per woman, and the cost-effectiveness ratio for each strategy. Strategies that may provide greater aggregate health benefit than existing options may also exacerbate disparities. Combining human papillomavirus vaccination (Strategy A2) with current cervical cancer screening patterns (Strategy A1) resulted in an average reduction of 69% in cancer incidence overall but a 71.6% reduction for white women, 68.3% for black women, and 63.9% for Hispanic women. Other strategies targeting risk-based screening to racial and ethnic minorities reduced disparities among racial subgroups and resulted in more equitable distribution of benefits among subgroups (reduction in cervical cancer incidence, white vs. Hispanic women, 69.7% vs. 70.1%). Strategies that employ targeted risk-based screening and new screening algorithms, with or without vaccination (Strategies B and C), provide excellent value. The most effective strategy (Strategy C) had a cost-effectiveness ratio of $28,200 per year of life saved when compared with the same strategy without
DEFF Research Database (Denmark)
Fitzenberger, Bernd; Wilke, Ralf Andreas
2015-01-01
Quantile regression is emerging as a popular statistical approach, which complements the estimation of conditional mean models. While the latter focuses on only one aspect of the conditional distribution of the dependent variable, the mean, quantile regression provides more detailed insights by modeling conditional quantiles. Quantile regression can therefore detect whether the partial effect of a regressor on the conditional quantiles is the same for all quantiles or differs across quantiles. Quantile regression can provide evidence for a statistical relationship between two variables even if the mean regression model does not. We provide a short informal introduction into the principle of quantile regression, which includes an illustrative application from empirical labor market research. This is followed by briefly sketching the underlying statistical model for linear quantile regression.
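The contrast between conditional-mean and conditional-quantile estimation can be sketched with synthetic heteroscedastic data. The pinball-loss minimization below uses scipy's general-purpose Nelder-Mead optimizer for simplicity, not the specialized linear-programming algorithms used in practice, and the data are invented.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)

# Synthetic data with heteroscedastic noise: the spread of y grows with x,
# so upper and lower conditional quantiles have different slopes
n = 400
x = rng.uniform(0, 10, n)
y = 2.0 + 0.5 * x + (0.2 + 0.3 * x) * rng.normal(size=n)
X = np.column_stack([np.ones(n), x])

def pinball(beta, q):
    """Check (pinball) loss minimized by the q-th conditional quantile."""
    r = y - X @ beta
    return np.mean(np.where(r >= 0, q * r, (q - 1.0) * r))

def fit_quantile(q):
    res = minimize(pinball, x0=np.zeros(2), args=(q,), method="Nelder-Mead",
                   options={"xatol": 1e-8, "fatol": 1e-8, "maxiter": 2000})
    return res.x

b10, b50, b90 = (fit_quantile(q) for q in (0.1, 0.5, 0.9))
# A mean regression would report a single slope (~0.5); quantile regression
# reveals that the slope differs across the conditional distribution
print(round(b10[1], 2), round(b50[1], 2), round(b90[1], 2))
```

The three fitted slopes differ systematically, which is exactly the kind of effect heterogeneity across quantiles that a conditional-mean model cannot detect.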
Logic regression and its extensions.
Schwender, Holger; Ruczinski, Ingo
2010-01-01
Logic regression is an adaptive classification and regression procedure, initially developed to reveal interacting single nucleotide polymorphisms (SNPs) in genetic association studies. In general, this approach can be used in any setting with binary predictors, when the interaction of these covariates is of primary interest. Logic regression searches for Boolean (logic) combinations of binary variables that best explain the variability in the outcome variable, and thus, reveals variables and interactions that are associated with the response and/or have predictive capabilities. The logic expressions are embedded in a generalized linear regression framework, and thus, logic regression can handle a variety of outcome types, such as binary responses in case-control studies, numeric responses, and time-to-event data. In this chapter, we provide an introduction to the logic regression methodology, list some applications in public health and medicine, and summarize some of the direct extensions and modifications of logic regression that have been proposed in the literature. Copyright © 2010 Elsevier Inc. All rights reserved.
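A toy version of the search that logic regression performs can be sketched as follows. The data are synthetic SNP-style binary predictors, and exhaustively scoring two-variable AND/OR expressions is a deliberately simplified stand-in for the simulated-annealing search over full logic trees used by the actual method.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(3)

# Synthetic SNP-style binary predictors; the outcome depends on X1 AND X3
n, p = 600, 5
X = rng.integers(0, 2, size=(n, p)).astype(bool)
signal = X[:, 0] & X[:, 2]
y = np.where(rng.random(n) < 0.1, ~signal, signal)  # 10% label noise

# Tiny stand-in for logic regression's search: score every AND / OR
# combination of two predictors by agreement with the outcome
best = None
for i, j in combinations(range(p), 2):
    for name, expr in (("AND", X[:, i] & X[:, j]), ("OR", X[:, i] | X[:, j])):
        score = np.mean(expr == y)
        if best is None or score > best[0]:
            best = (score, f"X{i+1} {name} X{j+1}")

print(best)
```

The search recovers the planted Boolean interaction despite the label noise; the real method embeds such logic expressions in a generalized linear model so that the same idea extends to numeric and time-to-event outcomes.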
Beerens, Moniek W; Ten Cate, Jacob M; Buijs, Mark J; van der Veen, Monique H
2017-11-17
Casein-phosphopeptide amorphous calcium fluoride phosphate (CPP-ACFP) can remineralize subsurface lesions. It is the active ingredient of MI-Paste-Plus® (MPP). The long-term remineralization efficacy is unknown. To evaluate the long-term effect of MPP versus a placebo paste on remineralization of enamel after fixed orthodontic treatment over a 12-month period. This trial was designed as a prospective, double-blinded, placebo-controlled RCT. Patients with subsurface lesions scheduled for removal of the appliance were included. They applied either MPP or the control paste once a day at bedtime for 12 months, complementary to normal oral hygiene. The primary outcomes were changes in enamel lesions: fluorescence loss and lesion area determined by quantitative light-induced fluorescence (QLF). Secondary outcomes were microbial composition, by conventional plating; acidogenicity of plaque, by capillary ion analysis (CIA); and lesion changes scored visually on clinical photographs. Participants [age = 15.5 years (SD = 1.6)] were randomly assigned to either the MPP or the control group, as determined by a computer randomization scheme created and locked before the start of the study. Participants received neutral-coloured concealed toothpaste tubes marked A or B. The patients and the observers were blinded with respect to the content of tube A or B. A total of 51 patients were analysed: MPP (n = 25) versus control group (n = 26); data loss (n = 14). There was no significant difference between the groups over time for any of the outcome measures. There was a significant improvement in enamel lesions (fluorescence loss) over time in both groups, but MPP use after orthodontic fixed appliance treatment did not improve these lesions during the 1 year following debonding. This trial is registered at the medical ethical committee of the VU Medical Centre in Amsterdam (NL.199226.029.07). © The Author 2017. Published by Oxford University Press on behalf of the European Orthodontic Society
Understanding logistic regression analysis
Sperandei, Sandro
2014-01-01
Logistic regression is used to obtain odds ratios in the presence of more than one explanatory variable. The procedure is quite similar to multiple linear regression, with the exception that the response variable is binomial. The result is the impact of each variable on the odds ratio of the observed event of interest. The main advantage is to avoid confounding effects by analyzing the association of all variables together. In this article, we explain the logistic regression procedure using examples...
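The procedure and its odds-ratio interpretation can be sketched with a plain Newton-Raphson maximum-likelihood fit; the covariates (age, smoking status) and effect sizes below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic data: binary outcome depends on a continuous and a binary covariate
n = 1000
age = rng.normal(40, 10, n)
smoker = rng.integers(0, 2, n).astype(float)
X = np.column_stack([np.ones(n), (age - 40) / 10, smoker])
true_b = np.array([-1.0, 0.4, 0.8])
y = (rng.random(n) < 1 / (1 + np.exp(-X @ true_b))).astype(float)

# Newton-Raphson maximum-likelihood fit
b = np.zeros(3)
for _ in range(25):
    p = 1 / (1 + np.exp(-X @ b))
    grad = X.T @ (y - p)
    hess = X.T @ (X * (p * (1 - p))[:, None])
    b = b + np.linalg.solve(hess, grad)

# Exponentiated coefficients are adjusted odds ratios:
# per 10 years of age, and smoker vs non-smoker
odds_ratios = np.exp(b[1:])
print(np.round(odds_ratios, 2))
```

Because both covariates enter the same model, each odds ratio is adjusted for the other variable, which is the confounding-control advantage the abstract describes.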
Introduction to regression graphics
Cook, R Dennis
2009-01-01
Covers the use of dynamic and interactive computer graphics in linear regression analysis, focusing on analytical graphics. Features new techniques like plot rotation. The authors have composed their own regression code, written in the Xlisp-Stat language and called R-code, which is a nearly complete system for linear regression analysis and can be utilized as the main computer program in a linear regression course. The accompanying disks, for both Macintosh and Windows computers, contain the R-code and Xlisp-Stat. An Instructor's Manual presenting detailed solutions to all the problems in the book is available.
Alternative Methods of Regression
Birkes, David
2011-01-01
Of related interest: Nonlinear Regression Analysis and Its Applications, Douglas M. Bates and Donald G. Watts. "...an extraordinary presentation of concepts and methods concerning the use and analysis of nonlinear regression models...highly recommend[ed]...for anyone needing to use and/or understand issues concerning the analysis of nonlinear regression models." --Technometrics. This book provides a balance between theory and practice supported by extensive displays of instructive geometrical constructs. Numerous in-depth case studies illustrate the use of nonlinear regression analysis--with all data sets
Yamagishi, Kazumasa; Sato, Shinichi; Kitamura, Akihiko; Kiyama, Masahiko; Okada, Takeo; Tanigawa, Takeshi; Ohira, Tetsuya; Imano, Hironori; Kondo, Masahide; Okubo, Ichiro; Ishikawa, Yoshinori; Shimamoto, Takashi; Iso, Hiroyasu
2012-09-01
The nation-wide, community-based intensive hypertension detection and control program, as well as universal health insurance coverage, may well be contributing factors helping Japan rank near the top among countries with the longest life expectancy. We sought to examine the cost-effectiveness of such a community-based intervention program, as no evidence has been available on this issue. The hypertension detection and control program was initiated in 1963 in full-intervention and minimal-intervention communities in Akita, Japan. We performed comparative cost-effectiveness and budget-impact analyses for the period 1964-1987 of the costs of public health services and treatment of patients with hypertension and stroke on the one hand, and the incidence of stroke on the other, in the full-intervention and minimal-intervention communities. The program provided in the full-intervention community was found to be cost saving 13 years after its start, in addition to being effective: the prevalence and incidence of stroke were consistently lower in the full-intervention community than in the minimal-intervention community throughout the same period. The incremental cost was minus 28,358 yen per capita over 24 years. The community-based intensive hypertension detection and control program was found to be both effective and cost saving. The national government's policy to support this program may have contributed in part to the substantial decline in stroke incidence and mortality, which was largely responsible for the increase in Japanese life expectancy.
Directory of Open Access Journals (Sweden)
F.M.O. Borges
2003-12-01
One experiment was run with broiler chickens to obtain prediction equations for metabolizable energy (ME) based on feedstuff chemical analyses, and the ME of wheat grain and its by-products was determined using four different methodologies. Seven wheat grain by-products were used in five treatments: wheat grain, wheat germ, white wheat flour, dark wheat flour, wheat bran for human use, wheat bran for animal use, and rough wheat bran. Based on chemical analyses of crude fiber (CF), ether extract (EE), crude protein (CP), ash (AS), and starch (ST) of the feeds and the determined values of apparent energy (MEA), true energy (MEV), apparent energy corrected by nitrogen balance (MEAn), and true energy corrected by nitrogen balance (MEVn) in the five treatments, prediction equations were obtained using the stepwise procedure. CF showed the best relationship with metabolizable energy values; however, this variable alone was not enough for a good estimate of the energy values (R² below 0.80). When EE and CP were included in the equations, R² increased to 0.90 or higher in most estimates. When the equations were calculated with all treatments, the equations for MEA were less precise and R² decreased. When ME data from the traditional or force-feeding methods were used separately, the precision of the equations increased (R² higher than 0.85). For MEV and MEVn values, the best multiple linear equations included CF, EE, and CP (R² > 0.90), independently of whether all experimental data were used or separated by methodology. The estimates of MEVn values showed high precision, and the linear coefficients (a) of the equations were similar for all treatments and methodologies, which explains the small influence of the different methodologies on this parameter. NDF was not a better predictor of ME than CF.
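The kind of prediction-equation comparison reported above can be sketched with ordinary least squares: a CF-only equation versus a CF + EE + CP equation. The composition data and coefficients below are invented so that the sketch echoes the abstract's pattern (CF alone insufficient, the full equation reaching a high R²); they are not the experiment's data.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical feed-composition data (% of dry matter) and metabolizable
# energy values (kcal/kg); invented for illustration, not the trial's data
n = 40
CF = rng.uniform(2, 12, n)    # crude fiber
EE = rng.uniform(1, 5, n)     # ether extract
CP = rng.uniform(10, 20, n)   # crude protein
ME = 3600 - 60 * CF + 90 * EE + 25 * CP + rng.normal(0, 60, n)

def r2(predictors):
    """R-squared of an ordinary least-squares fit of ME on the predictors."""
    A = np.column_stack([np.ones(n)] + predictors)
    coef, *_ = np.linalg.lstsq(A, ME, rcond=None)
    resid = ME - A @ coef
    return 1.0 - resid @ resid / np.sum((ME - ME.mean()) ** 2)

# CF alone vs the full CF + EE + CP equation, echoing the abstract's finding
print(round(r2([CF]), 2), round(r2([CF, EE, CP]), 2))
```

A stepwise procedure automates this comparison, adding or dropping predictors one at a time according to their contribution to the fit.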
A Simulation Investigation of Principal Component Regression.
Allen, David E.
Regression analysis is one of the more common analytic tools used by researchers. However, multicollinearity between the predictor variables can cause problems in using the results of regression analyses. Problems associated with multicollinearity include entanglement of relative influences of variables due to reduced precision of estimation,…
Directory of Open Access Journals (Sweden)
Matthias Schmid
Full Text Available Regression analysis with a bounded outcome is a common problem in applied statistics. Typical examples include regression models for percentage outcomes and the analysis of ratings that are measured on a bounded scale. In this paper, we consider beta regression, which is a generalization of logit models to situations where the response is continuous on the interval (0,1). Consequently, beta regression is a convenient tool for analyzing percentage responses. The classical approach to fitting a beta regression model is to use maximum likelihood estimation with subsequent AIC-based variable selection. As an alternative to this established - yet unstable - approach, we propose a new estimation technique called boosted beta regression. With boosted beta regression, estimation and variable selection can be carried out simultaneously in a highly efficient way. Additionally, both the mean and the variance of a percentage response can be modeled using flexible nonlinear covariate effects. As a consequence, the new method accounts for common problems such as overdispersion and non-binomial variance structures.
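The classical maximum-likelihood baseline that the boosting approach is compared against can be sketched in a few lines: a logit link for the mean and the usual mean-precision parameterization, y ~ Beta(μφ, (1-μ)φ). This is an illustrative sketch of plain beta regression, not of the boosted estimator; the clipping of log φ is a stability assumption of this sketch.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit
from scipy.stats import beta as beta_dist

def beta_reg_nll(params, Xd, y):
    """Negative log-likelihood of beta regression in the mean-precision
    parameterization: logit(mu) = Xd @ b, precision phi = exp(log_phi)."""
    b, log_phi = params[:-1], params[-1]
    mu = expit(Xd @ b)
    phi = np.exp(np.clip(log_phi, -10.0, 10.0))  # clip to keep the optimizer stable
    return -beta_dist.logpdf(y, mu * phi, (1.0 - mu) * phi).sum()

def fit_beta_regression(X, y):
    """Maximum-likelihood fit; returns (coefficients incl. intercept, phi)."""
    Xd = np.column_stack([np.ones(len(y)), X])
    res = minimize(beta_reg_nll, np.zeros(Xd.shape[1] + 1), args=(Xd, y),
                   method="BFGS")
    return res.x[:-1], np.exp(res.x[-1])
```

Boosting replaces this single joint optimization with iterative componentwise updates, which is what makes simultaneous variable selection possible.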
Energy Technology Data Exchange (ETDEWEB)
NONE
1997-03-01
This Manual represents Revision 5 of the user documentation for the modular code system referred to as SCALE. The history of the SCALE code system dates back to 1969 when the current Computational Physics and Engineering Division at Oak Ridge National Laboratory (ORNL) began providing the transportation package certification staff at the U. S. Atomic Energy Commission with computational support in the use of the new KENO code for performing criticality safety assessments with the statistical Monte Carlo method. From 1969 to 1976 the certification staff relied on the ORNL staff to assist them in the correct use of codes and data for criticality, shielding, and heat transfer analyses of transportation packages. However, the certification staff learned that, with only occasional use of the codes, it was difficult to become proficient in performing the calculations often needed for an independent safety review. Thus, shortly after the move of the certification staff to the U.S. Nuclear Regulatory Commission (NRC), the NRC staff proposed the development of an easy-to-use analysis system that provided the technical capabilities of the individual modules with which they were familiar. With this proposal, the concept of the Standardized Computer Analyses for Licensing Evaluation (SCALE) code system was born. This volume is part of the manual related to the control modules for the newest updated version of this computational package.
International Nuclear Information System (INIS)
Alahmer, A.; Omar, M.A.; Mayyas, A.; Dongri, Shan
2011-01-01
This manuscript discusses the effect of manipulating the relative humidity (RH) of the in-cabin environment on thermal comfort and the occupants' thermal sensation. The study uses thermodynamic and psychrometric analyses to incorporate the effect of changing RH, along with the dry bulb temperature, on human comfort. Specifically, the study computes the effect of changing the relative humidity on the amount of heat rejected from the passenger compartment and on the occupants' comfort zone. A practical system implementation is also discussed in terms of an evaporative cooler design. The results show that changing the RH along with the dry bulb temperature inside vehicular cabins can improve the air conditioning efficiency by reducing the heat removed, while improving human comfort sensation as measured by the Predicted Mean Vote (PMV) and Predicted Percentage of Dissatisfied (PPD) indices. - Highlights: → Investigates the effect of controlling the RH and dry bulb temperature on in-cabin thermal comfort and sensation. → Conducts the thermodynamic and psychrometric analyses for changing the RH and temperature for in-cabin air conditioning. → Discusses a possible system implementation through an evaporative cooler design.
Understanding logistic regression analysis.
Sperandei, Sandro
2014-01-01
Logistic regression is used to obtain odds ratios in the presence of more than one explanatory variable. The procedure is quite similar to multiple linear regression, with the exception that the response variable is binomial. The result is the impact of each variable on the odds ratio of the observed event of interest. The main advantage is to avoid confounding effects by analyzing the association of all variables together. In this article, we explain the logistic regression procedure using examples to make it as simple as possible. After defining the technique, the basic interpretation of the results is highlighted, and then some special issues are discussed.
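The mechanics described above can be illustrated with a small self-contained Newton-Raphson fit, from which each variable's odds ratio is obtained by exponentiating its coefficient. A generic sketch, not the article's worked example.

```python
import numpy as np

def logistic_fit(X, y, n_iter=25):
    """Fit a logistic regression by Newton-Raphson; returns coefficients
    (intercept first). The odds ratio of predictor j is exp(beta[j])."""
    Xd = np.column_stack([np.ones(len(y)), X])
    beta = np.zeros(Xd.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-Xd @ beta))          # fitted probabilities
        grad = Xd.T @ (y - p)                          # score vector
        hess = Xd.T @ (Xd * (p * (1 - p))[:, None])    # Fisher information
        beta += np.linalg.solve(hess, grad)
    return beta
```

Because all predictors enter the model jointly, the resulting odds ratios are adjusted for one another, which is the confounding-control property the abstract emphasizes.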
Weisberg, Sanford
2013-01-01
Praise for the Third Edition: "...this is an excellent book which could easily be used as a course text..." - International Statistical Institute. The Fourth Edition of Applied Linear Regression provides a thorough update of the basic theory and methodology of linear regression modeling. Demonstrating the practical applications of linear regression analysis techniques, the Fourth Edition uses interesting, real-world exercises and examples. Stressing central concepts such as model building, understanding parameters, assessing fit and reliability, and drawing conclusions, the new edition illus
Hosmer, David W; Sturdivant, Rodney X
2013-01-01
A new edition of the definitive guide to logistic regression modeling for health science and other applications This thoroughly expanded Third Edition provides an easily accessible introduction to the logistic regression (LR) model and highlights the power of this model by examining the relationship between a dichotomous outcome and a set of covariables. Applied Logistic Regression, Third Edition emphasizes applications in the health sciences and handpicks topics that best suit the use of modern statistical software. The book provides readers with state-of-
International Nuclear Information System (INIS)
Huang, Shao Hui; O'Sullivan, Brian; Xu, Wei; Zhao, Helen; Chen, Duo-duo; Ringash, Jolie; Hope, Andrew; Razak, Albiruni; Gilbert, Ralph; Irish, Jonathan; Kim, John; Dawson, Laura A.; Bayley, Andrew; Cho, B.C. John; Goldstein, David; Gullane, Patrick; Yu, Eugene; Perez-Ordonez, Bayardo; Weinreb, Ilan; Waldron, John
2013-01-01
Purpose: To compare the temporal lymph node (LN) regression and regional control (RC) after primary chemoradiation therapy/radiation therapy in human papillomavirus-related [HPV(+)] versus human papillomavirus-unrelated [HPV(−)] head-and-neck cancer (HNC). Methods and Materials: All cases of N2-N3 HNC treated with radiation therapy/chemoradiation therapy between 2003 and 2009 were reviewed. Human papillomavirus status was ascertained by p16 staining on all available oropharyngeal cancers. Larynx/hypopharynx cancers were considered HPV(−). Initial radiologic complete nodal response (CR) (≤1.0 cm 8-12 weeks after treatment), ultimate LN resolution, and RC were compared between HPV(+) and HPV(−) HNC. Multivariate analysis identified outcome predictors. Results: A total of 257 HPV(+) and 236 HPV(−) HNCs were identified. The initial LN size was larger (mean, 2.9 cm vs 2.5 cm; P<.01) with a higher proportion of cystic LNs (38% vs 6%, P<.01) in HPV(+) versus HPV(−) HNC. CR was achieved in 125 HPV(+) HNCs (49%) and 129 HPV(−) HNCs (55%) (P=.18). The mean posttreatment largest LN was 36% of the original size in the HPV(+) group and 41% in the HPV(−) group (P<.01). The actuarial LN resolution was similar in the HPV(+) and HPV(−) groups at 12 weeks (42% and 43%, respectively), but it was higher in the HPV(+) group than in the HPV(−) group at 36 weeks (90% vs 77%, P<.01). The median follow-up period was 3.6 years. The 3-year RC rate was higher in the HPV(−) CR cases versus non-CR cases (92% vs 63%, P<.01) but was not different in the HPV(+) CR cases versus non-CR cases (98% vs 92%, P=.14). On multivariate analysis, HPV(+) status predicted ultimate LN resolution (odds ratio, 1.4 [95% confidence interval, 1.1-1.7]; P<.01) and RC (hazard ratio, 0.3 [95% confidence interval 0.2-0.6]; P<.01). Conclusions: HPV(+) LNs involute more quickly than HPV(−) LNs but undergo a more prolonged process to eventual CR beyond the time of initial assessment at 8 to 12 weeks.
Energy Technology Data Exchange (ETDEWEB)
Huang, Shao Hui; O' Sullivan, Brian [Department of Radiation Oncology, Princess Margaret Cancer Centre/University of Toronto, Toronto, Ontario (Canada); Xu, Wei; Zhao, Helen [Department of Biostatistics, Princess Margaret Cancer Centre/University of Toronto, Toronto, Ontario (Canada); Chen, Duo-duo [Radiation Medicine Program, Princess Margaret Cancer Centre, Toronto, Ontario (Canada); Ringash, Jolie; Hope, Andrew [Department of Radiation Oncology, Princess Margaret Cancer Centre/University of Toronto, Toronto, Ontario (Canada); Razak, Albiruni [Division of Medical Oncology, Princess Margaret Cancer Centre, Toronto, Ontario (Canada); Gilbert, Ralph; Irish, Jonathan [Department of Otolaryngology-Head and Neck Surgery/Surgical Oncology, Princess Margaret Cancer Centre/University of Toronto, Toronto, Ontario (Canada); Kim, John; Dawson, Laura A.; Bayley, Andrew; Cho, B.C. John [Department of Radiation Oncology, Princess Margaret Cancer Centre/University of Toronto, Toronto, Ontario (Canada); Goldstein, David; Gullane, Patrick [Department of Otolaryngology-Head and Neck Surgery/Surgical Oncology, Princess Margaret Cancer Centre/University of Toronto, Toronto, Ontario (Canada); Yu, Eugene [Department of Radiology, Princess Margaret Cancer Centre, Toronto, Ontario (Canada); Perez-Ordonez, Bayardo; Weinreb, Ilan [Department of Pathology, Princess Margaret Cancer Centre, Toronto, Ontario (Canada); Waldron, John, E-mail: John.Waldron@rmp.uhn.on.ca [Department of Radiation Oncology, Princess Margaret Cancer Centre/University of Toronto, Toronto, Ontario (Canada)
2013-12-01
DEFF Research Database (Denmark)
Brink, Carsten; Bernchou, Uffe; Bertelsen, Anders
2014-01-01
was estimated on the basis of the first one third and two thirds of the scans. The concordance between estimated and actual relative volume at the end of radiation therapy was quantified by Pearson's correlation coefficient. On the basis of the estimated relative volume, the patients were stratified into 2...... for other clinical characteristics. RESULTS: Automatic measurement of the tumor regression from standard CBCT images was feasible. Pearson's correlation coefficient between manual and automatic measurement was 0.86 in a sample of 9 patients. Most patients experienced tumor volume regression, and this could...
Verster, Joris C; Veldhuijzen, Dieuwke S; Patat, Alain; Olivier, Berend; Volkerts, Edmund R
2006-01-01
Many people who use hypnotics are outpatients and are likely to drive a car the day after drug intake. The purpose of these meta-analyses was to determine whether or not this is safe. Placebo-controlled, randomized, double-blind trials were selected if using the on-the-road driving test to determine driving ability the day following one or two nights of treatment administration. Primary outcome measure of the driving test was the Standard Deviation of Lateral Position (SDLP); i.e., the weaving of the car. Fixed effects model meta-analyses were performed. Effect size (ES) was computed using mean standardized (weighted) difference scores between treatment and corresponding placebo SDLP values. Ten studies, published from 1984 to 2002 (207 subjects), were included in the meta-analyses. The morning following bedtime administration, i.e. 10-11 hours after dosing, significant driving impairment was found for the recommended dose of various benzodiazepine hypnotics (ES=0.42; 95% Confidence Interval (CI)=0.14 to 0.71). Twice the recommended dose impaired driving both in the morning (ES=0.68; CI=0.39 to 0.97) and afternoon, i.e. 16-17 hours after dosing (ES=0.57; CI=0.26 to 0.88). Zopiclone 7.5 mg also impaired driving in the morning (ES=0.89; CI=0.54 to 1.23). Zaleplon (10 and 20 mg) and zolpidem (10 mg) did not affect driving performance the morning after dosing. Following middle-of-the-night administration, significantly impaired driving performance was found for zopiclone 7.5 mg (ES=1.51, CI=0.85 to 2.17), zolpidem 10 mg (ES=0.66, CI=0.13 to 1.19) and zolpidem 20 mg (ES=1.16, CI=0.60 to 1.72). Zaleplon (10 and 20 mg) did not affect driving performance. The analyses show that driving a car the morning following nocturnal treatment with benzodiazepines and zopiclone is unsafe, whereas the recommended dose of zolpidem (10 mg) and zaleplon (10 mg) do not affect driving ability.
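The fixed-effects model used in these meta-analyses pools per-study effect sizes by inverse-variance weighting. A minimal sketch of that computation follows; the numbers in the usage test are illustrative, not the study's data.

```python
import numpy as np

def fixed_effect_pooled(effect_sizes, std_errors):
    """Inverse-variance (fixed-effect) pooling of per-study effect sizes.
    Returns the pooled effect size and its 95% confidence interval."""
    es = np.asarray(effect_sizes, dtype=float)
    w = 1.0 / np.asarray(std_errors, dtype=float) ** 2   # weight = 1 / variance
    pooled = (w * es).sum() / w.sum()
    se = np.sqrt(1.0 / w.sum())
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)
```

Precisely estimated studies (small standard errors) dominate the pooled estimate, which is why the confidence intervals reported above narrow as studies are combined.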
He, Yihan; Liu, Yihong; May, Brian H; Zhang, Anthony Lin; Zhang, Haibo; Lu, ChuanJian; Yang, Lihong; Guo, Xinfeng; Xue, Charlie Changli
2017-01-01
Introduction The National Comprehensive Cancer Network guidelines for adult cancer pain indicate that acupuncture and related therapies may be valuable additions to pharmacological interventions for pain management. Of the systematic reviews related to this topic, some concluded that acupuncture was promising for alleviating cancer pain, while others argued that the evidence was insufficient to support its effectiveness. Methods and analysis This review will consist of three components: (1) synthesis of findings from existing systematic reviews; (2) updated meta-analyses of randomised clinical trials and (3) analyses of results of other types of clinical studies. We will search six English and four Chinese biomedical databases, dissertations and grey literature to identify systematic reviews and primary clinical studies. Two reviewers will screen results of the literature searches independently to identify included reviews and studies. Data from included articles will be abstracted for assessment, analysis and summary. Two assessors will appraise the quality of systematic reviews using Assessment of Multiple Systematic Reviews; assess the randomised controlled trials using the Cochrane Collaboration’s risk of bias tool and other types of studies according to the Newcastle-Ottawa Scale. We will use ‘summary of evidence’ tables to present evidence from existing systematic reviews and meta-analyses. Using the primary clinical studies, we will conduct meta-analysis for each outcome, by grouping studies based on the type of acupuncture, the comparator and the specific type of pain. Sensitivity analyses are planned according to clinical factors, acupuncture method, methodological characteristics and presence of statistical heterogeneity as applicable. For the non-randomised studies, we will tabulate the characteristics, outcome measures and the reported results of each study. Consistencies and inconsistencies in evidence will be investigated and discussed. Finally
Multilingual speaker age recognition: regression analyses on the Lwazi corpus
CSIR Research Space (South Africa)
Feld, M
2009-12-01
Full Text Available Multilinguality represents an area of significant opportunities for automatic speech-processing systems: whereas multilingual societies are commonplace, the majority of speech-processing systems are developed with a single language in mind. As a step...
Fleischmann, Robert; Tränkner, Steffi; Bathe-Peters, Rouven; Rönnefarth, Maria; Schmidt, Sein; Schreiber, Stephan J; Brandt, Stephan A
2018-03-01
The lack of objective disease markers is a major cause of misdiagnosis and nonstandardized approaches in delirium. Recent studies conducted in well-selected patients and confined study environments suggest that quantitative electroencephalography (qEEG) can provide such markers. We hypothesize that qEEG helps remedy diagnostic uncertainty not only in well-defined study cohorts but also in a heterogeneous hospital population. In this retrospective case-control study, EEG power spectra of delirious patients and age-/gender-matched controls (n = 31 and n = 345, respectively) were fitted in a linear model to test their performance as binary classifiers. We subsequently evaluated the diagnostic performance of the best classifiers in control samples with normal EEGs (n = 534) and real-world samples including pathologic findings (n = 4294). Test reliability was estimated through split-half analyses. We found that the combination of spectral power at F3-P4 at 2 Hz (area under the curve [AUC] = .994) and C3-O1 at 19 Hz (AUC = .993) provided a sensitivity of 100% and a specificity of 99% to identify delirious patients among normal controls. These classifiers also yielded a false positive rate as low as 5% and increased the pretest probability of being delirious by 57% in an unselected real-world sample. Split-half reliabilities were .98 and .99, respectively. This retrospective study yielded preliminary evidence that qEEG provides excellent diagnostic performance to identify delirious patients even outside confined study environments. It furthermore revealed reduced beta power as a novel specific finding in delirium and that a normal EEG excludes delirium. Prospective studies including parameters of pretest probability and delirium severity are required to elaborate on these promising findings.
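The AUC values reported for the spectral-power classifiers can be computed directly from raw scores via the rank-sum (Mann-Whitney) identity. This is a generic sketch of that computation, unrelated to the study's actual qEEG features.

```python
import numpy as np
from scipy.stats import rankdata

def auc_from_scores(pos_scores, neg_scores):
    """Area under the ROC curve via the Mann-Whitney rank-sum identity:
    AUC = P(score of a random positive > score of a random negative)."""
    ranks = rankdata(np.concatenate([pos_scores, neg_scores]))  # ties get average ranks
    n_pos, n_neg = len(pos_scores), len(neg_scores)
    rank_sum = ranks[:n_pos].sum()
    return (rank_sum - n_pos * (n_pos + 1) / 2.0) / (n_pos * n_neg)
```

An AUC near .99, as reported here, means an almost perfect ordering of delirious over control scores.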
Understanding Poisson regression.
Hayat, Matthew J; Higgins, Melinda
2014-04-01
Nurse investigators often collect study data in the form of counts. Traditional methods of data analysis have historically approached analysis of count data either as if the count data were continuous and normally distributed or with dichotomization of the counts into the categories of occurred or did not occur. These outdated methods for analyzing count data have been replaced with more appropriate statistical methods that make use of the Poisson probability distribution, which is useful for analyzing count data. The purpose of this article is to provide an overview of the Poisson distribution and its use in Poisson regression. Assumption violations for the standard Poisson regression model are addressed with alternative approaches, including addition of an overdispersion parameter or negative binomial regression. An illustrative example is presented with an application from the ENSPIRE study, and regression modeling of comorbidity data is included for illustrative purposes. Copyright 2014, SLACK Incorporated.
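A bare-bones Poisson regression with a Pearson-based overdispersion check (values well above 1 suggest moving to an overdispersion parameter or negative binomial regression, as the article discusses) might look like the sketch below. This is illustrative, not the ENSPIRE analysis.

```python
import numpy as np

def poisson_fit(X, y, n_iter=50):
    """Poisson regression (log link) by Newton-Raphson, plus a Pearson-based
    dispersion estimate; dispersion >> 1 signals overdispersion."""
    Xd = np.column_stack([np.ones(len(y)), X])
    beta = np.zeros(Xd.shape[1])
    for _ in range(n_iter):
        mu = np.exp(Xd @ beta)                                   # fitted means
        beta += np.linalg.solve(Xd.T @ (Xd * mu[:, None]),       # Fisher information
                                Xd.T @ (y - mu))                 # score vector
    mu = np.exp(Xd @ beta)
    dispersion = ((y - mu) ** 2 / mu).sum() / (len(y) - Xd.shape[1])
    return beta, dispersion
```

Exponentiating a fitted coefficient gives the multiplicative change in the expected count per unit change in that predictor.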
International Nuclear Information System (INIS)
Mathew, P.J.
1994-01-01
Some recent developments in industrial nuclear gauging in Australia are briefly reviewed. Quality control, process control and automation in the mineral and coal industries are based on measurements of the composition and flows of critical process streams. Australia's vast mineral wealth and its importance to the national economy have resulted in CSIRO (Commonwealth Scientific and Industrial Research Organisation) successfully developing and commercializing a variety of nucleonic gauges to meet the needs of the mineral and coal industries. These include gauges for on-line determination of the ash content of coal on conveyor belts, the ash content of solids or the weight fraction of coal in slurries, on-stream determination of iron, alumina and manganese in iron ore, bulk analysis of raw feed limestone in the cement industry, and gauges for the measurement of level, moisture, and interfaces. A variety of gauges based on natural radioactivity have also been developed. Instruments based on natural gamma radiation are relatively inexpensive and free of artificial radiation sources. An on-stream analyser based on natural gamma-ray activity has been developed for monitoring the soil content of sugar cane. Significant benefits accrued to industry in using nucleonic gauges are briefly discussed. (author). 18 refs., 8 figs
Assouline, Benjamin; Tramèr, Martin R; Kreienbühl, Lukas; Elia, Nadia
2016-12-01
Ketamine is often added to opioids in patient-controlled analgesia devices. We tested whether in surgical patients, ketamine added to an opioid patient-controlled analgesia decreased pain intensity by ≥25%, cumulative opioid consumption by ≥30%, the risk of postoperative nausea and vomiting by ≥30%, the risk of respiratory adverse effects by ≥50%, and increased the risk of hallucination not more than 2-fold. In addition, we searched for evidence of dose-responsiveness. Nineteen randomized trials (1349 adults, 104 children) testing different ketamine regimens added to various opioids were identified through searches in databases and bibliographies (to 04.2016). In 9 trials (595 patients), pain intensity at rest at 24 hours was decreased by 32% with ketamine (weighted mean difference -1.1 cm on the 0-10 cm visual analog scale [98% CI, -1.8 to -0.39], P ketamine (weighted mean difference -12.9 mg [-22.4 to -3.35], P = 0.002). In 7 trials (435 patients), the incidence of postoperative nausea and vomiting was decreased by 44% with ketamine (risk ratio 0.56 [0.40 to 0.78], P ketamine on pain intensity, cumulative morphine consumption, and postoperative nausea and vomiting and its inability to double the risk of hallucination. The available data did not allow us to make a conclusion on respiratory adverse events or to establish dose-responsiveness.
Directory of Open Access Journals (Sweden)
Mok Tik
2014-06-01
Full Text Available This study formulates regression of vector data that will enable statistical analysis of various geodetic phenomena such as polar motion, ocean currents, typhoon/hurricane tracking, crustal deformations, and precursory earthquake signals. The observed vector variable of an event (the dependent vector variable) is expressed as a function of a number of hypothesized phenomena realized also as vector variables (independent vector variables) and/or scalar variables that are likely to impact the dependent vector variable. The proposed representation has the unique property of solving the coefficients of the independent vector variables (explanatory variables) also as vectors, hence it supersedes multivariate multiple regression models, in which the unknown coefficients are scalar quantities. For the solution, complex numbers are used to represent vector information, and the method of least squares is deployed to estimate the vector model parameters after transforming the complex vector regression model into a real vector regression model through isomorphism. Various operational statistics for testing the predictive significance of the estimated vector parameter coefficients are also derived. A simple numerical example demonstrates the use of the proposed vector regression analysis in modeling typhoon paths.
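The complex-number device can be sketched directly with NumPy, whose least-squares solver accepts complex matrices, so no explicit isomorphism to a real model is needed for a point estimate. An illustrative sketch, not the paper's implementation.

```python
import numpy as np

def complex_vector_regression(Z, w):
    """Least-squares estimate of complex coefficients c in w ≈ Z @ c.
    Each 2-D vector (u, v) is encoded as the complex number u + 1j*v, so a
    fitted coefficient is itself a vector quantity: its modulus scales and
    its argument rotates the corresponding explanatory vector."""
    c, *_ = np.linalg.lstsq(Z, w, rcond=None)
    return c
```

For example, observed current vectors could be regressed on wind vectors by encoding both series as complex numbers and reading the fitted coefficient as a scaling-plus-rotation.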
Multicollinearity and Regression Analysis
Daoud, Jamal I.
2017-12-01
In regression analysis one expects correlation between the response and the predictor(s), but correlation among the predictors themselves is undesirable. The number of predictors included in the regression model depends on many factors, among them historical data and experience. In the end, the selection of the most important predictors is a judgment left to the researcher. Multicollinearity is a phenomenon in which two or more predictors are correlated; when this happens, the standard errors of the coefficients increase [8]. Increased standard errors mean that the coefficients for some or all independent variables may not be found to be significantly different from zero. In other words, by overinflating the standard errors, multicollinearity makes some variables statistically insignificant when they should be significant. In this paper we focus on multicollinearity, its causes, and its consequences for the reliability of the regression model.
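The standard-error inflation described above is commonly quantified with the variance inflation factor (VIF); a widespread rule of thumb flags predictors with VIF above 10. A minimal sketch (the rule-of-thumb threshold is a convention, not from this paper):

```python
import numpy as np

def vif(X):
    """Variance inflation factor of each predictor column (no intercept column):
    VIF_j = 1 / (1 - R2_j), where R2_j regresses column j on all the others."""
    X = np.asarray(X, dtype=float)
    factors = []
    for j in range(X.shape[1]):
        y = X[:, j]
        A = np.column_stack([np.ones(len(y)), np.delete(X, j, axis=1)])
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)
        resid = y - A @ beta
        r2 = 1.0 - (resid @ resid) / ((y - y.mean()) ** 2).sum()
        factors.append(1.0 / (1.0 - r2))
    return np.array(factors)
```

A VIF of 100 means the variance of that coefficient is 100 times larger than it would be with orthogonal predictors, which is exactly how significant effects get masked.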
Walter, Frank G; Stolz, Uwe; Shirazi, Farshad; McNally, Jude
2010-01-01
The only U.S. Food and Drug Administration-approved coral snake antivenom was officially discontinued in 2007, causing ever-diminishing supplies. This study describes the severity of U.S. coral snakebites during the last 25 years to determine trends in annual rates of these bites' medical outcomes. This study retrospectively analyzed all human coral snakebites voluntarily reported by the public and/or health care professionals to poison centers that were subsequently published in the Annual Reports of the American Association of Poison Control Centers (AAPCC) from 1983 through 2007. Annual rates of medical outcomes from coral snakebites were calculated by dividing the annual number of people bitten by coral snakes who developed fatal, major, moderate, minor, or no effect outcomes by the total annual number of people bitten by coral snakes. Negative binomial regression was used to examine trends in annual rates. From 1983 through 2007, the incidence rate of coral snakebites producing no effects significantly decreased by 4.7% per year [incidence rate ratio (IRR) = 0.953; 95% confidence interval (CI) = 0.920-0.987]. From 1985 through 2007, the incidence rates of minor and major outcomes did not significantly change; however, moderate outcomes significantly increased by 3.4% per year (IRR = 1.034; 95% CI = 1.004-1.064). No fatalities were reported from 1983 through 2007. Annual rates of coral snakebites producing no effects significantly decreased and those producing moderate outcomes significantly increased in our analyses of data from the last 25 years of published AAPCC Annual Reports. This study has important limitations that must be considered when interpreting these conclusions.
DEFF Research Database (Denmark)
Bache, Stefan Holst
A new and alternative quantile regression estimator is developed, and it is shown that the estimator is root-n-consistent and asymptotically normal. The estimator is based on a minimax ‘deviance function’ and has asymptotically equivalent properties to the usual quantile regression estimator. It is, however, a different and therefore new estimator. It allows for both linear and nonlinear model specifications. A simple algorithm for computing the estimates is proposed. It seems to work quite well in practice, but whether it has theoretical justification is still an open question.
DEFF Research Database (Denmark)
Ozenne, Brice; Sørensen, Anne Lyngholm; Scheike, Thomas
2017-01-01
In the presence of competing risks, a prediction of the time-dynamic absolute risk of an event can be based on cause-specific Cox regression models for the event and the competing risks (Benichou and Gail, 1990). We present computationally fast and memory-optimized C++ functions with an R interface for predicting the covariate-specific absolute risks, their confidence intervals, and their confidence bands based on right-censored time-to-event data. We provide explicit formulas for our implementation of the estimator of the (stratified) baseline hazard function in the presence of tied event times. As a by-product ...... functionals. The software presented here is implemented in the riskRegression package.
Yeh, Mei-Ling; Ko, Shu-Hua; Wang, Mei-Hua; Chi, Ching-Chi; Chung, Yu-Chu
2017-12-01
There has been a large body of evidence on the pharmacological treatments for psoriasis, but whether nonpharmacological interventions are effective in managing psoriasis remains largely unclear. This systematic review conducted pairwise and network meta-analyses to determine the effects of acupuncture-related techniques of acupoint stimulation for the treatment of psoriasis and to determine the order of effectiveness of these remedies. This study searched the following databases from inception to March 15, 2016: Medline, PubMed, Cochrane Central Register of Controlled Trials, EBSCO (including Academic Search Premier, American Doctoral Dissertations, and CINAHL), Airiti Library, and China National Knowledge Infrastructure. Randomized controlled trials (RCTs) on the effects of acupuncture-related techniques of acupoint stimulation as an intervention for psoriasis were independently reviewed by two researchers. A total of 13 RCTs with 1,060 participants were included. The methodological quality of the included studies was not rigorous. Acupoint stimulation, compared with nonacupoint stimulation, showed a significant treatment effect for psoriasis. However, the most common adverse events were thirst and dry mouth. Subgroup analysis further confirmed that the short-term treatment effect was superior to the long-term effect in treating psoriasis. Network meta-analysis identified that acupressure or acupoint catgut embedding, compared with medication, had a significant effect in improving psoriasis, with acupressure being the most effective treatment. Acupuncture-related techniques could be considered as an alternative or adjuvant therapy for psoriasis in the short term, especially acupressure and acupoint catgut embedding. This study recommends further well-designed, methodologically rigorous, and more head-to-head randomized trials to explore the effects of acupuncture-related techniques for treating psoriasis.
Prediction, Regression and Critical Realism
DEFF Research Database (Denmark)
Næss, Petter
2004-01-01
This paper considers the possibility of prediction in land use planning, and the use of statistical research methods in analyses of relationships between urban form and travel behaviour. Influential writers within the tradition of critical realism reject the possibility of predicting social phenomena. This position is fundamentally problematic for public planning. Without at least some ability to predict the likely consequences of different proposals, the justification for public sector intervention into market mechanisms will be frail. Statistical methods like regression analyses are commonly seen as necessary in order to identify aggregate-level effects of policy measures, but are questioned by many advocates of critical realist ontology. Using research into the relationship between urban structure and travel as an example, the paper discusses relevant research methods and the kinds...
Energy Technology Data Exchange (ETDEWEB)
Landers, N.F.; Petrie, L.M.; Knight, J.R. [Oak Ridge National Lab., TN (United States)] [and others
1995-04-01
SCALE--a modular code system for Standardized Computer Analyses for Licensing Evaluation--has been developed by Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission. The SCALE system utilizes well-established computer codes and methods within standard analysis sequences that (1) allow an input format designed for the occasional user and/or novice, (2) automate the data processing and coupling between modules, and (3) provide accurate and reliable results. System development has been directed at problem-dependent cross-section processing and analysis of criticality safety, shielding, heat transfer, and depletion/decay problems. Since the initial release of SCALE in 1980, the code system has been heavily used for evaluation of nuclear fuel facility and package designs. This revision documents Version 4.2 of the system. This manual is divided into three volumes: Volume 1--for the control module documentation, Volume 2--for the functional module documentation, and Volume 3--for the documentation of the data libraries and subroutine libraries.
Multiple linear regression analysis
Edwards, T. R.
1980-01-01
Program rapidly selects best-suited set of coefficients. User supplies only vectors of independent and dependent data and specifies confidence level required. Program uses stepwise statistical procedure for relating minimal set of variables to set of observations; final regression contains only most statistically significant coefficients. Program is written in FORTRAN IV for batch execution and has been implemented on NOVA 1200.
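The stepwise idea the program implements can be sketched as a forward-selection loop. This is an illustrative Python sketch, not the original FORTRAN IV code; the relative-improvement stopping rule stands in for the program's confidence-level test and is an assumption.

```python
import numpy as np

def forward_select(X, y, min_gain=0.10):
    """Greedy forward selection: repeatedly add the predictor that most
    reduces the residual sum of squares of a least-squares fit; stop when
    the best candidate improves the RSS by less than min_gain (relative)."""
    n, p = X.shape

    def rss(cols):
        A = np.column_stack([np.ones(n)] + [X[:, c] for c in cols])
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)
        r = y - A @ beta
        return float(r @ r)

    selected, remaining = [], list(range(p))
    current = rss(selected)
    while remaining:
        best_rss, best_c = min((rss(selected + [c]), c) for c in remaining)
        if best_rss > (1.0 - min_gain) * current:
            break
        selected.append(best_c)
        remaining.remove(best_c)
        current = best_rss
    return selected

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = 3.0 * X[:, 1] - 2.0 * X[:, 3] + rng.normal(scale=0.5, size=200)
print(sorted(forward_select(X, y)))  # selects the informative columns
```

The final model contains only the predictors whose inclusion cleared the improvement threshold, mirroring the "only most statistically significant coefficients" behaviour described above.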
Bayesian logistic regression analysis
Van Erp, H.R.N.; Van Gelder, P.H.A.J.M.
2012-01-01
In this paper we present a Bayesian logistic regression analysis. It is found that if one wishes to derive the posterior distribution of the probability of some event, then, together with the traditional Bayes Theorem and the integrating out of nuisance parameters, the Jacobian transformation is an
Seber, George A F
2012-01-01
Concise, mathematically clear, and comprehensive treatment of the subject.
* Expanded coverage of diagnostics and methods of model fitting.
* Requires no specialized knowledge beyond a good grasp of matrix algebra and some acquaintance with straight-line regression and simple analysis of variance models.
* More than 200 problems throughout the book plus outline solutions for the exercises.
* This revision has been extensively class-tested.
Ritz, Christian; Parmigiani, Giovanni
2009-01-01
R is a rapidly evolving lingua franca of graphical display and statistical analysis of experiments from the applied sciences. This book provides a coherent treatment of nonlinear regression with R by means of examples from a diversity of applied sciences such as biology, chemistry, engineering, medicine and toxicology.
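The book's examples use R; as a language-neutral illustration of what a nonlinear least-squares routine does internally, here is a small Gauss-Newton sketch for the toy model y = a*exp(b*x). The log-linear initialization and all data are illustrative assumptions, not material from the book.

```python
import math, random

def fit_exp(xs, ys, iters=20):
    """Fit y = a*exp(b*x) by Gauss-Newton, initialized from a log-linear
    least-squares fit (valid here because all ys are positive)."""
    n = len(xs)
    ly = [math.log(y) for y in ys]
    mx, ml = sum(xs) / n, sum(ly) / n
    b = sum((x - mx) * (l - ml) for x, l in zip(xs, ly)) / sum((x - mx) ** 2 for x in xs)
    a = math.exp(ml - b * mx)
    for _ in range(iters):
        # Jacobian of the model w.r.t. (a, b) and current residuals.
        J = [(math.exp(b * x), a * x * math.exp(b * x)) for x in xs]
        r = [y - a * math.exp(b * x) for x, y in zip(xs, ys)]
        s11 = sum(j1 * j1 for j1, _ in J)
        s12 = sum(j1 * j2 for j1, j2 in J)
        s22 = sum(j2 * j2 for _, j2 in J)
        g1 = sum(j1 * ri for (j1, _), ri in zip(J, r))
        g2 = sum(j2 * ri for (_, j2), ri in zip(J, r))
        det = s11 * s22 - s12 * s12
        a, b = a + (s22 * g1 - s12 * g2) / det, b + (s11 * g2 - s12 * g1) / det
    return a, b

random.seed(5)
xs = [i / 10 for i in range(30)]
ys = [2.0 * math.exp(-1.0 * x) * (1 + 0.02 * random.gauss(0, 1)) for x in xs]
a, b = fit_exp(xs, ys)  # close to the true (2.0, -1.0)
```

Each iteration solves the 2x2 normal equations of the linearized model, which is exactly the step an `nls`-style routine automates.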
Bayesian ARTMAP for regression.
Sasu, L M; Andonie, R
2013-10-01
Bayesian ARTMAP (BA) is a recently introduced neural architecture which uses a combination of Fuzzy ARTMAP competitive learning and Bayesian learning. Training is generally performed online, in a single epoch. During training, BA creates input data clusters as Gaussian categories, and also infers the conditional probabilities between input patterns and categories, and between categories and classes. During prediction, BA uses Bayesian posterior probability estimation. So far, BA was used only for classification. The goal of this paper is to analyze the efficiency of BA for regression problems. Our contributions are: (i) we generalize the BA algorithm using the clustering functionality of both ART modules, and name it BA for Regression (BAR); (ii) we prove that BAR is a universal approximator with the best approximation property. In other words, BAR approximates arbitrarily well any continuous function (universal approximation) and, for every given continuous function, there is one in the set of BAR approximators situated at minimum distance (best approximation); (iii) we experimentally compare the online trained BAR with several neural models, on the following standard regression benchmarks: CPU Computer Hardware, Boston Housing, Wisconsin Breast Cancer, and Communities and Crime. Our results show that BAR is an appropriate tool for regression tasks, both for theoretical and practical reasons.
Bounded Gaussian process regression
DEFF Research Database (Denmark)
Jensen, Bjørn Sand; Nielsen, Jens Brehm; Larsen, Jan
2013-01-01
We extend the Gaussian process (GP) framework for bounded regression by introducing two bounded likelihood functions that model the noise on the dependent variable explicitly. This is fundamentally different from the implicit noise assumption in the previously suggested warped GP framework. We...... with the proposed explicit noise-model extension....
and Multinomial Logistic Regression
African Journals Online (AJOL)
This work presented the results of an experimental comparison of two models: Multinomial Logistic Regression (MLR) and Artificial Neural Network (ANN) for classifying students based on their academic performance. The predictive accuracy for each model was measured by their average Classification Correct Rate (CCR).
Mechanisms of neuroblastoma regression
Brodeur, Garrett M.; Bagatell, Rochelle
2014-01-01
Recent genomic and biological studies of neuroblastoma have shed light on the dramatic heterogeneity in the clinical behaviour of this disease, which spans from spontaneous regression or differentiation in some patients, to relentless disease progression in others, despite intensive multimodality therapy. This evidence also suggests several possible mechanisms to explain the phenomena of spontaneous regression in neuroblastomas, including neurotrophin deprivation, humoral or cellular immunity, loss of telomerase activity and alterations in epigenetic regulation. A better understanding of the mechanisms of spontaneous regression might help to identify optimal therapeutic approaches for patients with these tumours. Currently, the most druggable mechanism is the delayed activation of developmentally programmed cell death regulated by the tropomyosin receptor kinase A pathway. Indeed, targeted therapy aimed at inhibiting neurotrophin receptors might be used in lieu of conventional chemotherapy or radiation in infants with biologically favourable tumours that require treatment. Alternative approaches consist of breaking immune tolerance to tumour antigens or activating neurotrophin receptor pathways to induce neuronal differentiation. These approaches are likely to be most effective against biologically favourable tumours, but they might also provide insights into treatment of biologically unfavourable tumours. We describe the different mechanisms of spontaneous neuroblastoma regression and the consequent therapeutic approaches. PMID:25331179
International Nuclear Information System (INIS)
Darby, S.; Hill, D.; Doll, R.; Auvinen, A.; Barros Dios, J.M.; Ruano Ravina, A.; Baysson, H.; Tirmarche, M.; Bochicchio, F.; Deo, H.; Falk, R.; Forastiere, F.; Hakama, M.; Heid, I.; Schaffrath Rosario, A.; Wichmann, H.E.; Kreienbrock, L.; Kreuzer, M.; Lagarde, F.; Pershagen, G.; Mäkeläinen, I.; Ruosteenoja, E.; Muirhead, C.; Oberaigner, W.; Tomášek, L.; Whitley, E.
2007-01-01
Objective: To determine the risk of lung cancer associated with exposure at home to the radioactive disintegration products of naturally occurring radon gas. Design: Collaborative analysis of individual data from 13 case-control studies of residential radon and lung cancer. Setting: Nine European countries. Subjects: 7148 cases of lung cancer and 14 208 controls. Main outcome measures: Relative risks of lung cancer and radon gas concentrations in homes inhabited during the previous 5-34 years measured in becquerels (radon disintegrations per second) per cubic metre (Bq/m3) of household air. Results: The mean measured radon concentration in homes of people in the control group was 97 Bq/m3, with 11% measuring > 200 and 4% measuring > 400 Bq/m3. For cases of lung cancer the mean concentration was 104 Bq/m3. The risk of lung cancer increased by 8.4% (95% confidence interval 3.0% to 15.8%) per 100 Bq/m3 increase in measured radon (P=0.0007). This corresponds to an increase of 16% (5% to 31%) per 100 Bq/m3 increase in usual radon, that is, after correction for the dilution caused by random uncertainties in measuring radon concentrations. The dose-response relation seemed to be linear with no threshold and remained significant (P = 0.04) in analyses limited to individuals from homes with measured radon < 200 Bq/m3. The proportionate excess risk did not differ significantly with study, age, sex, or smoking. In the absence of other causes of death, the absolute risks of lung cancer by age 75 years at usual radon concentrations of 0, 100, and 400 Bq/m3 would be about 0.4%, 0.5%, and 0.7%, respectively, for lifelong non-smokers, and about 25 times greater (10%, 12%, and 16%) for cigarette smokers. Conclusions: Collectively, though not separately, these studies show appreciable hazards from residential radon, particularly for smokers and recent ex-smokers, and indicate that it is responsible for about 2% of all deaths from cancer in Europe. (author)
2004-01-01
Bobita, and Capulin Canyon drainages, and from Questa Ranger Station, and surface-water analyses from Straight Creek and the Red River (fig. 1). The...Straight Creek, Hansen, Hottentot, La Bobita, Capulin Canyon, and Questa Ranger Station, and surface water analyses from Straight Creek and the Red
Gallart, F.; Prat, N.; García-Roger, E. M.; Latron, J.; Rieradevall, M.; Llorens, P.; Barberá, G. G.; Brito, D.; de Girolamo, A. M.; Lo Porto, A.; Neves, R.; Nikolaidis, N. P.; Perrin, J. L.; Querner, E. P.; Quiñonero, J. M.; Tournoud, M. G.; Tzoraki, O.; Froebrich, J.
2011-10-01
Temporary streams are those water courses that undergo the recurrent cessation of flow or the complete drying of their channel. The biological communities in temporary stream reaches are strongly dependent on the temporal changes of the aquatic habitats determined by the hydrological conditions. The use of the structural and functional characteristics of the aquatic fauna to assess the ecological quality of a temporary stream reach cannot therefore be made without taking into account the controls imposed by the hydrological regime. This paper develops some methods for analysing temporary streams' aquatic regimes, based on the definition of six aquatic states that summarize the sets of mesohabitats occurring on a given reach at a particular moment, depending on the hydrological conditions: flood, riffles, connected, pools, dry and arid. We used the water discharge records from gauging stations or simulations using rainfall-runoff models to infer the temporal patterns of occurrence of these states using the developed aquatic states frequency graph. The visual analysis of this graph is complemented by the development of two metrics based on the permanence of flow and the seasonal predictability of zero flow periods. Finally, a classification of the aquatic regimes of temporary streams in terms of their influence over the development of aquatic life is put forward, defining Permanent, Temporary-pools, Temporary-dry and Episodic regime types. All these methods were tested with data from eight temporary streams around the Mediterranean from the MIRAGE project, and their application was a precondition to assess the ecological quality of these streams using the current methods prescribed in the European Water Framework Directive for macroinvertebrate communities.
Ridge Regression Signal Processing
Kuhl, Mark R.
1990-01-01
The introduction of the Global Positioning System (GPS) into the National Airspace System (NAS) necessitates the development of Receiver Autonomous Integrity Monitoring (RAIM) techniques. In order to guarantee a certain level of integrity, a thorough understanding of modern estimation techniques applied to navigational problems is required. The extended Kalman filter (EKF) is derived and analyzed under poor geometry conditions. It was found that the performance of the EKF is difficult to predict, since the EKF is designed for a Gaussian environment. A novel approach is implemented which incorporates ridge regression to explain the behavior of an EKF in the presence of dynamics under poor geometry conditions. The basic principles of ridge regression theory are presented, followed by the derivation of a linearized recursive ridge estimator. Computer simulations are performed to confirm the underlying theory and to provide a comparative analysis of the EKF and the recursive ridge estimator.
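The ridge estimator at the heart of the recursive scheme has a simple batch closed form, (X'X + kI)^-1 X'y. A minimal sketch on toy near-collinear data (standing in for the poor-geometry conditions; this is not the GPS/EKF implementation):

```python
import numpy as np

def ridge(X, y, lam):
    """Closed-form ridge estimate: solves (X'X + lam*I) b = X'y."""
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

rng = np.random.default_rng(1)
# Two nearly collinear regressors mimic an ill-conditioned design matrix.
x1 = rng.normal(size=100)
X = np.column_stack([x1, x1 + 1e-3 * rng.normal(size=100)])
y = X @ np.array([1.0, 1.0]) + 0.1 * rng.normal(size=100)

b_ols = ridge(X, y, 0.0)  # plain least squares: unstable coefficients
b_reg = ridge(X, y, 1.0)  # ridge: shrunken, stable coefficients
```

Because the norm of the ridge solution is non-increasing in the penalty, the regularized coefficients are never larger in norm than the least-squares ones, which is the stabilizing effect exploited under poor geometry.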
Subset selection in regression
Miller, Alan
2002-01-01
Originally published in 1990, the first edition of Subset Selection in Regression filled a significant gap in the literature, and its critical and popular success has continued for more than a decade. Thoroughly revised to reflect progress in theory, methods, and computing power, the second edition promises to continue that tradition. The author has thoroughly updated each chapter, incorporated new material on recent developments, and included more examples and references. New in the Second Edition:
* A separate chapter on Bayesian methods
* Complete revision of the chapter on estimation
* A major example from the field of near infrared spectroscopy
* More emphasis on cross-validation
* Greater focus on bootstrapping
* Stochastic algorithms for finding good subsets from large numbers of predictors when an exhaustive search is not feasible
* Software available on the Internet for implementing many of the algorithms presented
* More examples
Subset Selection in Regression, Second Edition remains dedicated to the techniques for fitting...
Regression in organizational leadership.
Kernberg, O F
1979-02-01
The choice of good leaders is a major task for all organizations. Information regarding the prospective administrator's personality should complement questions regarding his previous experience, his general conceptual skills, his technical knowledge, and the specific skills in the area for which he is being selected. The growing psychoanalytic knowledge about the crucial importance of internal, in contrast to external, object relations, and about the mutual relationships of regression in individuals and in groups, constitutes an important practical tool for the selection of leaders.
Classification and regression trees
Breiman, Leo; Olshen, Richard A; Stone, Charles J
1984-01-01
The methodology used to construct tree structured rules is the focus of this monograph. Unlike many other statistical procedures, which moved from pencil and paper to calculators, this text's use of trees was unthinkable before computers. Both the practical and theoretical sides have been developed in the authors' study of tree methods. Classification and Regression Trees reflects these two sides, covering the use of trees as a data analysis method, and in a more mathematical framework, proving some of their fundamental properties.
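The elementary step of tree construction described above, choosing the split threshold that most reduces within-node squared error, can be sketched for a single feature (illustrative only; not the authors' software):

```python
def best_split(xs, ys):
    """Exhaustively choose the threshold on one feature that minimizes the
    total squared error around the two resulting leaf means (one tree step)."""
    def sse(vals):
        if not vals:
            return 0.0
        m = sum(vals) / len(vals)
        return sum((v - m) ** 2 for v in vals)

    best = (float("inf"), None)
    for t in sorted(set(xs))[:-1]:  # candidate thresholds between points
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        best = min(best, (sse(left) + sse(right), t))
    return best[1]

# A step function with a jump after x = 4: the split should land there.
xs = list(range(10))
ys = [0.0] * 5 + [10.0] * 5
print(best_split(xs, ys))  # -> 4
```

A full CART tree applies this search recursively to each resulting node (and over all features), then prunes; the sketch shows only the core split criterion.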
KELEŞ, Taliha; ALTUN, Murat
2016-01-01
Regression analysis is a statistical technique for investigating and modeling the relationship between variables. The purpose of this study was the trivial presentation of the equation for orthogonal regression (OR) and the comparison of classical linear regression (CLR) and OR techniques with respect to the sum of squared perpendicular distances. For that purpose, the analyses were shown by an example. It was found that the sum of squared perpendicular distances of OR is smaller. Thus, it wa...
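The study's comparison can be reproduced on toy data: the closed-form 2-D orthogonal (total least squares) slope below minimizes the sum of squared perpendicular distances, which classical least squares does not. The formula assumes a nonzero cross-covariance; the data are illustrative.

```python
import math, random

def perp_sse(xs, ys, a, b):
    """Sum of squared perpendicular distances from points to y = a + b*x."""
    return sum((y - a - b * x) ** 2 for x, y in zip(xs, ys)) / (1 + b * b)

def fit_lines(xs, ys):
    """Return (intercept, slope) for classical LR and orthogonal regression."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b_cls = sxy / sxx  # classical least-squares slope (vertical distances)
    # Orthogonal-regression slope: standard 2-D total-least-squares solution.
    b_or = (syy - sxx + math.hypot(syy - sxx, 2 * sxy)) / (2 * sxy)
    return (my - b_cls * mx, b_cls), (my - b_or * mx, b_or)

random.seed(0)
xs = [random.gauss(0, 1) for _ in range(50)]
ys = [2 * x + random.gauss(0, 1) for x in xs]
(a_c, b_c), (a_o, b_o) = fit_lines(xs, ys)
```

As the abstract reports, the orthogonal fit always attains a sum of squared perpendicular distances no larger than the classical fit's.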
Hilbe, Joseph M
2009-01-01
This book really does cover everything you ever wanted to know about logistic regression … with updates available on the author's website. Hilbe, a former national athletics champion, philosopher, and expert in astronomy, is a master at explaining statistical concepts and methods. Readers familiar with his other expository work will know what to expect: great clarity. The book provides considerable detail about all facets of logistic regression. No step of an argument is omitted so that the book will meet the needs of the reader who likes to see everything spelt out, while a person familiar with some of the topics has the option to skip "obvious" sections. The material has been thoroughly road-tested through classroom and web-based teaching. … The focus is on helping the reader to learn and understand logistic regression. The audience is not just students meeting the topic for the first time, but also experienced users. I believe the book really does meet the author's goal … -Annette J. Dobson, Biometric...
Aarnisalo, Kaarina; Vihavainen, Elina; Rantala, Leila; Maijala, Riitta; Suihko, Maija-Liisa; Hielm, Sebastian; Tuominen, Pirkko; Ranta, Jukka; Raaska, Laura
2008-02-10
Microbial risk assessment provides a means of estimating consumer risks associated with food products. The methods can also be applied at the plant level. In this study results of microbiological analyses were used to develop a robust single plant level risk assessment. Furthermore, the prevalence and numbers of Listeria monocytogenes in marinated broiler legs in Finland were estimated. These estimates were based on information on the prevalence, numbers and genotypes of L. monocytogenes in 186 marinated broiler legs from 41 retail stores. The products were from three main Finnish producers, which produce 90% of all marinated broiler legs sold in Finland. The prevalence and numbers of L. monocytogenes were estimated by Monte Carlo simulation using WinBUGS, but the model is applicable to any software featuring standard probability distributions. The estimated mean annual number of L. monocytogenes-positive broiler legs sold in Finland was 7.2×10^6, with a 95% credible interval (CI) of 6.7×10^6 to 7.7×10^6. That would be 34% ± 1% of the marinated broiler legs sold in Finland. The mean number of L. monocytogenes in marinated broiler legs estimated at the sell-by-date was 2 CFU/g, with a 95% CI of 0-14 CFU/g. Producer-specific L. monocytogenes strains were recovered from the products throughout the year, which emphasizes the importance of characterizing the isolates and identifying strains that may cause problems as part of risk assessment studies. As the levels of L. monocytogenes were low, the risk of acquiring listeriosis from these products proved to be insignificant. Consequently there was no need for a thorough national level risk assessment. However, an approach using worst-case and average point estimates was applied to produce an example of single producer level risk assessment based on limited data. This assessment also indicated that the risk from these products was low. The risk-based approach presented in this work can provide estimation of public health risk
Steganalysis using logistic regression
Lubenko, Ivans; Ker, Andrew D.
2011-02-01
We advocate Logistic Regression (LR) as an alternative to the Support Vector Machine (SVM) classifiers commonly used in steganalysis. LR offers more information than traditional SVM methods - it estimates class probabilities as well as providing a simple classification - and can be adapted more easily and efficiently for multiclass problems. Like SVM, LR can be kernelised for nonlinear classification, and it shows comparable classification accuracy to SVM methods. This work is a case study, comparing accuracy and speed of SVM and LR classifiers in detection of LSB Matching and other related spatial-domain image steganography, through the state-of-the-art 686-dimensional SPAM feature set, in three image sets.
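A minimal logistic-regression classifier showing the class-probability output the paper highlights as an advantage over SVM scores. The two-dimensional Gaussian toy features stand in for the 686-dimensional SPAM set, and the plain gradient-ascent fit is an assumption, not the authors' implementation.

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, iters=2000):
    """Maximum-likelihood logistic regression via batch gradient ascent."""
    Xb = np.column_stack([np.ones(len(X)), X])  # prepend intercept
    w = np.zeros(Xb.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))
        w += lr * Xb.T @ (y - p) / len(y)
    return w

def predict_proba(w, X):
    """Class probabilities, the extra information LR provides over SVM."""
    Xb = np.column_stack([np.ones(len(X)), X])
    return 1.0 / (1.0 + np.exp(-Xb @ w))

rng = np.random.default_rng(2)
# Toy stand-ins for "cover" (class 0) and "stego" (class 1) feature vectors.
X = np.vstack([rng.normal(0.0, 1, (100, 2)), rng.normal(1.5, 1, (100, 2))])
y = np.concatenate([np.zeros(100), np.ones(100)])
w = fit_logistic(X, y)
acc = np.mean((predict_proba(w, X) > 0.5) == y)
```

Thresholding the probabilities at 0.5 recovers the simple classification; keeping them gives calibrated confidence per image, which is what makes multiclass extensions straightforward.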
SEPARATION PHENOMENA LOGISTIC REGRESSION
Directory of Open Access Journals (Sweden)
Ikaro Daniel de Carvalho Barreto
2014-03-01
This paper applies concepts from maximum likelihood estimation of the binomial logistic regression model to the separation phenomenon, which generates bias in the estimation, leads to different interpretations of the estimates under the different statistical tests (Wald, likelihood ratio, and score), and yields different estimates under the different iterative methods (Newton-Raphson and Fisher scoring). It also presents an example that demonstrates the direct implications for the validation of the model and of its variables, and for the estimates of odds ratios and confidence intervals generated from the Wald statistics. Furthermore, we briefly present the Firth correction to circumvent the separation phenomenon.
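The separation phenomenon itself is easy to demonstrate: with completely separated data the logistic likelihood has no finite maximizer, so iterative estimates grow without bound as iterations accumulate. A slope-only illustrative sketch (toy data; not the paper's example):

```python
import math

def fit_1d_logistic(xs, ys, steps, lr=0.5):
    """Gradient ascent on the logistic log-likelihood, slope only."""
    b = 0.0
    for _ in range(steps):
        g = sum((y - 1.0 / (1.0 + math.exp(-b * x))) * x
                for x, y in zip(xs, ys))
        b += lr * g
    return b

# Complete separation: every y=0 case has x < 0, every y=1 case has x > 0.
xs = [-2.0, -1.0, -0.5, 0.5, 1.0, 2.0]
ys = [0, 0, 0, 1, 1, 1]
b_short = fit_1d_logistic(xs, ys, 50)
b_long = fit_1d_logistic(xs, ys, 500)   # keeps growing: no finite MLE
```

Running more iterations always produces a larger slope estimate, which is the bias/divergence the paper analyses and the Firth correction is designed to tame.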
DEFF Research Database (Denmark)
Ozenne, Brice; Sørensen, Anne Lyngholm; Scheike, Thomas
2017-01-01
In the presence of competing risks a prediction of the time-dynamic absolute risk of an event can be based on cause-specific Cox regression models for the event and the competing risks (Benichou and Gail, 1990). We present computationally fast and memory optimized C++ functions with an R interface......-product we obtain fast access to the baseline hazards (compared to survival::basehaz()) and predictions of survival probabilities, their confidence intervals and confidence bands. Confidence intervals and confidence bands are based on point-wise asymptotic expansions of the corresponding statistical...
Adaptive metric kernel regression
DEFF Research Database (Denmark)
Goutte, Cyril; Larsen, Jan
2000-01-01
Kernel smoothing is a widely used non-parametric pattern recognition technique. By nature, it suffers from the curse of dimensionality and is usually difficult to apply to high input dimensions. In this contribution, we propose an algorithm that adapts the input metric used in multivariate...... regression by minimising a cross-validation estimate of the generalisation error. This allows to automatically adjust the importance of different dimensions. The improvement in terms of modelling performance is illustrated on a variable selection task where the adaptive metric kernel clearly outperforms...
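The effect exploited above can be illustrated with a Nadaraya-Watson estimator under a diagonal input metric: suppressing an irrelevant input dimension lowers the leave-one-out error. The fixed scale factors below stand in for the learned metric; in the paper the scaling is found by minimizing a cross-validation estimate of the generalisation error.

```python
import math, random

def nw_loo_mse(X, y, scales, h=0.2):
    """Leave-one-out MSE of a Nadaraya-Watson estimator whose distances
    use per-dimension scale factors (a diagonal input metric)."""
    n = len(X)
    err = 0.0
    for i in range(n):
        num = den = 0.0
        for j in range(n):
            if i == j:
                continue
            d2 = sum((s * (a - b)) ** 2 for s, a, b in zip(scales, X[i], X[j]))
            w = math.exp(-d2 / (2 * h * h))
            num += w * y[j]
            den += w
        err += (y[i] - num / den) ** 2
    return err / n

random.seed(3)
# Dimension 1 carries the signal; dimension 2 is irrelevant noise input.
X = [[random.uniform(-1, 1), random.uniform(-3, 3)] for _ in range(80)]
y = [math.sin(3 * a) + 0.05 * random.gauss(0, 1) for a, _ in X]
mse_iso = nw_loo_mse(X, y, [1.0, 1.0])    # isotropic metric
mse_adapt = nw_loo_mse(X, y, [1.0, 0.0])  # irrelevant dimension suppressed
```

Downweighting the irrelevant dimension concentrates the kernel weights on genuinely similar inputs, which is exactly the variable-selection behaviour the adaptive metric achieves automatically.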
Adaptive Metric Kernel Regression
DEFF Research Database (Denmark)
Goutte, Cyril; Larsen, Jan
1998-01-01
Kernel smoothing is a widely used nonparametric pattern recognition technique. By nature, it suffers from the curse of dimensionality and is usually difficult to apply to high input dimensions. In this paper, we propose an algorithm that adapts the input metric used in multivariate regression...... by minimising a cross-validation estimate of the generalisation error. This allows one to automatically adjust the importance of different dimensions. The improvement in terms of modelling performance is illustrated on a variable selection task where the adaptive metric kernel clearly outperforms the standard...
Sahu, Kiran Chandra
2016-01-01
Active structural acoustic control (ASAC) is a form of active noise control which focuses on controlling structural vibrations in a manner that minimizes acoustic radiation from a structure. The greatest difficulty ASAC faces is finding an "optimal" error quantity that can be easily implemented in a control algorithm. The volume velocity control (VVC) metric generally used in ASAC typically requires either a large number of sensors distributed across the entire structure, ...
DEFF Research Database (Denmark)
Hansen, Henrik; Tarp, Finn
2001-01-01
This paper examines the relationship between foreign aid and growth in real GDP per capita as it emerges from simple augmentations of popular cross country growth specifications. It is shown that aid in all likelihood increases the growth rate, and this result is not conditional on ‘good’ policy....... investment. We conclude by stressing the need for more theoretical work before this kind of cross-country regressions are used for policy purposes.
2013-10-24
... pharmaceutical industry and health care organizations, and others from the general public, about the use of meta-analyses of randomized trials as a tool for safety assessment in the regulation of pharmaceutical products... PDUFA Goals Letter, titled ``Enhancing Regulatory Science and Expediting Drug Development,'' includes an...
International Nuclear Information System (INIS)
Retevoi, Carmen Maria; Stefan, Iuliana; Balteanu, Ovidiu; Stefan, Liviu
2005-01-01
The dynamic informational system with datalogging and supervisory control module includes a motion control module and is a new concept used in a tritium removal installation with isotopic exchange and cryogenic distillation. The control system includes an event-driven engine that maintains a real-time database, logs historical data, processes alarm information, and communicates with I/O devices. Also, it displays the operator interfaces and performs tasks that are defined for advanced control algorithms, supervisory control, analysis, and display with data transfer from the data acquisition room to the control room. By using the parameters, we compute the deuterium and tritium concentration, respectively, of the liquid at the inlet of the isotopic exchange column and, consequently, we can compute, at the outlet of the column, the tritium concentration in the water vapors. (authors)
Time-trend of melanoma screening practice by primary care physicians: A meta-regression analysis
Valachis, Antonis; Mauri, Davide; Karampoiki, Vassiliki; Polyzos, Nikolaos P; Cortinovis, Ivan; Koukourakis, Georgios; Zacharias, Georgios; Xilomenos, Apostolos; Tsappi, Maria; Casazza, Giovanni
2009-01-01
Objective To assess whether the proportion of primary care physicians implementing full body skin examination (FBSE) to screen for melanoma changed over time. Methods Meta-regression analyses of available data. Data Sources: MEDLINE, ISI, Cochrane Central Register of Controlled Trials. Results Fifteen studies surveying 10,336 physicians were included in the analyses. Overall, 15% to 82% of them reported to perform FBSE to screen for melanoma. The proportion of physicians using FBSE screening ten...
Modified Regression Correlation Coefficient for Poisson Regression Model
Kaengthong, Nattacha; Domthong, Uthumporn
2017-09-01
This study focuses on indicators of the predictive power of the Generalized Linear Model (GLM), which are widely used but often subject to restrictions. We are interested in the regression correlation coefficient for a Poisson regression model. This measure of predictive power is defined by the relationship between the dependent variable (Y) and the expected value of the dependent variable given the independent variables [E(Y|X)] for the Poisson regression model, where the dependent variable is distributed as Poisson. The purpose of this research was to modify the regression correlation coefficient for the Poisson regression model. We also compare the proposed modified regression correlation coefficient with the traditional regression correlation coefficient in the case of two or more independent variables and in the presence of multicollinearity among the independent variables. The results show that the proposed regression correlation coefficient is better than the traditional one in terms of bias and root mean square error (RMSE).
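The quantity studied here, the correlation between Y and the fitted E(Y|X), can be sketched as follows. The gradient-ascent fit and the toy data are illustrative assumptions, not the authors' estimation method or their modified coefficient.

```python
import math, random

def poisson_draw(lam):
    """Knuth's method for sampling a Poisson variate."""
    L, k, p = math.exp(-lam), 0, 1.0
    while p > L:
        k += 1
        p *= random.random()
    return k - 1

def fit_poisson(X, y, lr=0.01, iters=2000):
    """Poisson regression with log link, fitted by gradient ascent on the
    log-likelihood; models E(Y|x) as exp(x'b)."""
    n, p = len(y), len(X[0])
    b = [0.0] * p
    for _ in range(iters):
        mu = [math.exp(sum(bj * xj for bj, xj in zip(b, xi))) for xi in X]
        for k in range(p):
            g = sum((y[i] - mu[i]) * X[i][k] for i in range(n))
            b[k] += lr * g / n
    return b

def corr(u, v):
    """Plain Pearson correlation."""
    n = len(u)
    mu_, mv = sum(u) / n, sum(v) / n
    su = math.sqrt(sum((a - mu_) ** 2 for a in u))
    sv = math.sqrt(sum((a - mv) ** 2 for a in v))
    return sum((a - mu_) * (c - mv) for a, c in zip(u, v)) / (su * sv)

random.seed(4)
X = [[1.0, random.uniform(-1, 1)] for _ in range(150)]  # intercept + covariate
y = [poisson_draw(math.exp(0.5 + 1.0 * x[1])) for x in X]
b = fit_poisson(X, y)
fitted = [math.exp(b[0] + b[1] * x[1]) for x in X]
r = corr(y, fitted)  # the regression correlation coefficient idea: corr(Y, E(Y|X))
```

Because Poisson noise adds variance beyond what E(Y|X) explains, r sits well below 1 even when the model is correctly specified, which is the behaviour the modified coefficient is designed to measure more faithfully.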
Flumignan, Danilo Luiz; Boralle, Nivaldo; Oliveira, José Eduardo de
2010-06-30
In this work, the combination of carbon nuclear magnetic resonance ((13)C NMR) fingerprinting with pattern-recognition analyses provides an original and alternative approach to screening commercial gasoline quality. Soft Independent Modelling of Class Analogy (SIMCA) was performed on spectroscopic fingerprints to classify representative commercial gasoline samples, which were selected by Hierarchical Cluster Analysis (HCA) over several months at retail gas stations, into previously quality-defined classes. Following the optimized (13)C NMR-SIMCA algorithm, sensitivity values were obtained in the training set (99.0%), with leave-one-out cross-validation, and the external prediction set (92.0%). Governmental laboratories could employ this method as a rapid screening analysis to discourage adulteration practices.
Magak, Philip; Chang-Cojulun, Alicia; Kadzo, Hilda; Ireri, Edmund; Muchiri, Eric; Kitron, Uriel; King, Charles H.
2015-01-01
Previous population-based studies have examined treatment impact on Schistosoma-associated urinary tract disease among children, but much less is known about longer-term treatment benefits for affected adult populations in areas where risk of recurrent infection is high. In communities in Msambweni, along the Kenya coast, we identified, using a portable ultrasound, 77 adults (aged 17–85) with moderate-to-severe obstructive uropathy or bladder disease due to Schistosoma haematobium. Treatment response was assessed by repeat ultrasound 1–2 years after praziquantel (PZQ) therapy and compared with interval changes among age- and sex-matched infected/treated control subjects who did not have urinary tract abnormalities at the time of initial examination. Of the 77 affected adults, 62 (81%) had improvement in bladder and/or kidney scores after treatment, 14 (18%) had no change, and one (1.3%) had progression of disease. Of the 77 controls, 75 (97%) remained disease free by ultrasound, while two (3%) had apparent progression with abnormal findings on follow-up examination. We conclude that PZQ therapy for S. haematobium is effective in significantly reducing urinary tract morbidity from urogenital schistosomiasis among adult age groups, and affected adults stand to benefit from inclusion in mass treatment campaigns. PMID:26013375
Directory of Open Access Journals (Sweden)
Dmitri Papatsenko
2015-08-01
Analyses of gene expression in single mouse embryonic stem cells (mESCs) cultured in serum and LIF revealed the presence of two distinct cell subpopulations with individual gene expression signatures. Comparisons with published data revealed that cells in the first subpopulation are phenotypically similar to cells isolated from the inner cell mass (ICM). In contrast, cells in the second subpopulation appear to be more mature. Pluripotency Gene Regulatory Network (PGRN) reconstruction based on single-cell data and published data suggested antagonistic roles for Oct4 and Nanog in the maintenance of pluripotency states. Integrated analyses of published genomic binding (ChIP) data strongly supported this observation. Certain target genes alternatively regulated by OCT4 and NANOG, such as Sall4 and Zscan10, feed back into the top hierarchical regulator Oct4. Analyses of such incoherent feedforward loops with feedback (iFFL-FB) suggest a dynamic model for the maintenance of mESC pluripotency and self-renewal.
Measurement error in education and growth regressions
Portela, Miguel; Teulings, Coen; Alessie, R.
2004-01-01
The perpetual inventory method used for the construction of education data per country leads to systematic measurement error. This paper analyses the effect of this measurement error on GDP regressions. There is a systematic difference in the education level between census data and observations
Panel data specifications in nonparametric kernel regression
DEFF Research Database (Denmark)
Czekaj, Tomasz Gerard; Henningsen, Arne
parametric panel data estimators to analyse the production technology of Polish crop farms. The results of our nonparametric kernel regressions generally differ from the estimates of the parametric models but they only slightly depend on the choice of the kernel functions. Based on economic reasoning, we...
Luo, Chongliang; Liu, Jin; Dey, Dipak K; Chen, Kun
2016-07-01
In many fields, multi-view datasets, measuring multiple distinct but interrelated sets of characteristics on the same set of subjects, together with data on certain outcomes or phenotypes, are routinely collected. The objective in such a problem is often two-fold: both to explore the association structures of multiple sets of measurements and to develop a parsimonious model for predicting the future outcomes. We study a unified canonical variate regression framework to tackle the two problems simultaneously. The proposed criterion integrates multiple canonical correlation analysis with predictive modeling, balancing between the association strength of the canonical variates and their joint predictive power on the outcomes. Moreover, the proposed criterion seeks multiple sets of canonical variates simultaneously to enable the examination of their joint effects on the outcomes, and is able to handle multivariate and non-Gaussian outcomes. An efficient algorithm based on variable splitting and Lagrangian multipliers is proposed. Simulation studies show the superior performance of the proposed approach. We demonstrate the effectiveness of the proposed approach in an [Formula: see text] intercross mice study and an alcohol dependence study. © The Author 2016. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Should metacognition be measured by logistic regression?
Rausch, Manuel; Zehetleitner, Michael
2017-03-01
Are logistic regression slopes suitable to quantify metacognitive sensitivity, i.e. the efficiency with which subjective reports differentiate between correct and incorrect task responses? We analytically show that logistic regression slopes are independent of rating criteria in one specific model of metacognition, which assumes (i) that rating decisions are based on sensory evidence generated independently of the sensory evidence used for primary task responses and (ii) that the distributions of evidence are logistic. Given a hierarchical model of metacognition, logistic regression slopes depend on rating criteria. According to all considered models, regression slopes depend on the primary task criterion. A reanalysis of previous data revealed that massive numbers of trials are required to distinguish between hierarchical and independent models with tolerable accuracy. It is argued that researchers who wish to use logistic regression as a measure of metacognitive sensitivity need to control the primary task criterion and the rating criteria. Copyright © 2017 Elsevier Inc. All rights reserved.
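As a minimal, hypothetical sketch of the quantity under scrutiny (simulated ratings and correctness; the generative parameters are invented and are not the authors' models), a logistic regression slope of correctness on confidence can be fitted by maximum likelihood:

```python
import numpy as np
from scipy.optimize import minimize

# Simulated observer (hypothetical parameters): confidence ratings are a noisy
# readout of the same evidence that drives response correctness.
rng = np.random.default_rng(5)
n = 5000
evidence = rng.normal(0.8, 1.0, n)
correct = (evidence + rng.normal(0.0, 0.5, n) > 0).astype(float)
confidence = np.clip(np.round(evidence + rng.normal(0.0, 0.8, n)), 1, 4)

def logistic_nll(params, x, y):
    """Negative log-likelihood of a two-parameter logistic regression."""
    b0, b1 = params
    z = b0 + b1 * x
    # log P(y) = y*z - log(1 + e^z); summed NLL uses logaddexp for stability
    return float(np.sum(np.logaddexp(0.0, z) - y * z))

res = minimize(logistic_nll, x0=[0.0, 0.0], args=(confidence, correct))
intercept, slope = res.x
print(slope)  # positive: higher ratings accompany more correct responses
```

The paper's point is that this slope also moves with the primary task criterion, so a positive fit alone does not license comparisons across conditions.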
DEFF Research Database (Denmark)
Andreasen, Camilla Helene; Mogensen, Mette Sloth; Borch-Johnsen, Knut
2009-01-01
The aim of this study was to investigate the impact of these three variants on obesity, through analyses of obesity-related quantitative traits, and case-control studies in large study samples of Danes. METHODS: The FDFT1 rs7001819, CTNNBL1 rs6013029 and rs6020846 were genotyped, using TaqMan allelic discrimination, in a combined study sample comprising 18,014 participants ascertained from the population ... and a previous study. The most significantly associating variants within CTNNBL1, including rs6013029 and rs6020846, were additionally confirmed to associate with morbid obesity in a French Caucasian case-control sample. FDFT1 rs7001819 showed no association with obesity, neither when analysing quantitative traits nor when performing case-control studies of obesity.
Polynomial regression analysis and significance test of the regression function
International Nuclear Information System (INIS)
Gao Zhengming; Zhao Juan; He Shengping
2012-01-01
In order to analyze the decay heating power of a certain radioactive isotope per kilogram with the polynomial regression method, the paper first demonstrates the broad usage of polynomial functions and derives their parameters by ordinary least squares estimation. A significance test method for the polynomial regression function is then derived, exploiting the similarity between the polynomial regression model and the multivariable linear regression model. Finally, polynomial regression analysis and a significance test of the polynomial function are applied to the decay heating power of the isotope per kilogram in accordance with the authors' real work. (authors)
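As a hedged illustration of the procedure described (synthetic decay-heat-style data, not the authors' isotope measurements), a polynomial fit by ordinary least squares followed by an F-test of the regression's significance might look like:

```python
import numpy as np
from scipy import stats

# Synthetic decay-heat-like data: power (W/kg, illustrative values only)
rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 50)
y = 5.0 - 0.8 * t + 0.03 * t**2 + rng.normal(0.0, 0.05, t.size)

def fit_poly_ols(t, y, degree):
    """OLS fit of a polynomial of the given degree.
    Returns coefficients (ascending powers) and the residual sum of squares."""
    X = np.vander(t, degree + 1, increasing=True)  # design matrix [1, t, t^2, ...]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = float(np.sum((y - X @ coef) ** 2))
    return coef, rss

# Significance test of the regression: F-test of the degree-2 model against the
# intercept-only model, exactly as in multivariable linear regression.
coef, rss2 = fit_poly_ols(t, y, 2)
_, rss0 = fit_poly_ols(t, y, 0)
p = 2                      # regression parameters beyond the intercept
n = t.size
F = ((rss0 - rss2) / p) / (rss2 / (n - p - 1))
p_value = stats.f.sf(F, p, n - p - 1)
print(coef, F, p_value)
```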
Hwong, Y.L.; Keiren, J.J.A.; Kusters, V.J.J.; Leemans, S.J.J.; Willemse, T.A.C.
2013-01-01
The control software of the CERN Compact Muon Solenoid experiment contains over 27 500 finite state machines. These state machines are organised hierarchically: commands are sent down the hierarchy and state changes are sent upwards. The sheer size of the system makes it virtually impossible to
Hacker, Thomas; Stone, Paul; MacBeth, Angus
2016-01-15
Acceptance and Commitment Therapy (ACT) has accrued a substantial evidence base. Recent systematic and meta-analytic reviews suggest that ACT is effective compared to control conditions. However, these reviews appraise the efficacy of ACT across a broad range of presenting problems, rather than addressing specific common mental health difficulties. Focussing on depression and anxiety, we performed a meta-analysis of trials of ACT. We incorporated sequential meta-analysis (SMA) techniques to critically appraise the sufficiency of the existing evidence base. Findings suggest that ACT demonstrates at least moderate group and pre-post effects for symptom reductions for both anxiety and depression. However, using SMA, the findings are more qualified. There is currently insufficient evidence to confidently conclude that ACT for anxiety is efficacious when compared to active control conditions or as a primary treatment for anxiety. Similarly, using SMA, there is currently insufficient evidence to suggest a moderate efficacy of ACT for depression compared to active control conditions. To stimulate further research, we offer specific estimates of the additional numbers of participants required to reach sufficiency, to help inform future studies. We also discuss appropriate strategies for future research into ACT for anxiety, given that the current evidence suggests no differential efficacy of ACT in the treatment of anxiety compared to active control conditions. Copyright © 2015 Elsevier B.V. All rights reserved.
Directory of Open Access Journals (Sweden)
Shigeru Matsumoto, MD
2013-09-01
Conclusions: These findings suggest that controlled-release bFGF using a gelatin sheet is effective for promoting wound healing. Such a therapeutic strategy was considered to offer several clinical advantages, including rapid healing and a reduction of dressing changes with less patient discomfort.
van Roekel, Eeske; Heininga, Vera E; Vrijen, Charlotte; Snippe, Evelien; Oldehinkel, Albertine J
2018-04-19
Anhedonia reflects a dysfunction in the reward system, which can be manifested in an inability to enjoy pleasurable situations (i.e., lack of positive emotions), but also by a lack of motivation to engage in pleasurable activities (i.e., lack of motivation). Little is known about the interrelations between positive emotions and motivation in daily life, and whether these associations are altered in anhedonic individuals. In the present study, we used a network approach to explore the reciprocal, lagged associations between positive emotions and motivation in anhedonic individuals (N = 66) and controls (N = 68). Participants (aged between 18 and 24 years) filled out momentary assessments of affect 3 times per day for 30 consecutive days. Our results showed that (a) anhedonic individuals and controls had similar moment-to-moment transfer of positive emotions; (b) in the anhedonic network feeling cheerful was the node with the highest outstrength, both within this group and compared with the control group; (c) feeling relaxed had the highest outstrength in the control network, and (d) anhedonic individuals had stronger pathways from positive emotions to motivation than controls. Taken together, our findings suggest that low levels of positive emotions lead to decreased motivation in the anhedonic group, which could instigate a negative spiral of low pleasure and low motivation. On a more positive note, we showed that cheerfulness had the highest outstrength in the network of anhedonic participants. Hence, interventions may focus on increasing cheerfulness in anhedonic individuals, as this will likely have the greatest impact on other positive emotions and motivations. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
Recursive Algorithm For Linear Regression
Varanasi, S. V.
1988-01-01
Order of model determined easily. Linear-regression algorithm includes recursive equations for coefficients of model of increased order. Algorithm eliminates duplicative calculations, facilitates search for minimum order of linear-regression model that fits set of data satisfactorily.
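The record describes the recursion only at the level of coefficient updates. A hedged, non-recursive sketch of the same idea (find the minimum polynomial order whose fit is satisfactory; plain refits stand in for the paper's recursive coefficient updates):

```python
import numpy as np

def minimum_order_fit(x, y, tol=1e-3, max_order=10):
    """Return the lowest polynomial order whose OLS fit leaves a root-mean-square
    residual below tol. (Refits each order from scratch; the paper's algorithm
    instead updates the coefficients recursively to avoid duplicate work.)"""
    for order in range(max_order + 1):
        X = np.vander(x, order + 1, increasing=True)
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        rms = float(np.sqrt(np.mean((y - X @ coef) ** 2)))
        if rms < tol:
            return order, coef
    raise ValueError("no order up to max_order fits satisfactorily")

x = np.linspace(-1.0, 1.0, 30)
y = 1.0 + 2.0 * x - 3.0 * x**3          # exactly cubic, no noise
order, coef = minimum_order_fit(x, y)
print(order)   # the search should stop at order 3
```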
Kamphuis, P J G H; Verhey, F R J; Olde Rikkert, M G M; Twisk, J W R; Swinkels, S H N; Scheltens, P
2011-08-01
To investigate the extent to which baseline cognitive impairment and intake adherence affected the 13-item Alzheimer's Disease Assessment Scale - cognitive subscale (ADAS-cog) intervention response of a medical food in Alzheimer's Disease (AD) patients. DESIGN/SETTING/PARTICIPANTS/INTERVENTION/MEASUREMENTS: This analysis was performed on data from a proof-of-concept study, consisting of a 12-week, double-blind, randomized, controlled, multicenter trial, followed by a similarly designed 12-week extension study. Patients with mild AD (Mini-Mental State Examination [MMSE] score of 20-26) were randomized to receive active or control product as a 125 ml daily drink. One of the co-primary outcome measures was the 13-item ADAS-cog. In this analysis, the study population was divided into two subgroups: patients with 'low' baseline ADAS-cog scores (...) Souvenaid intervention on ADAS-cog outcome. A higher intake of active study product was also associated with greater cognitive benefit. These findings highlight the potential benefits of Souvenaid in AD patients and warrant confirmation in larger, controlled studies.
International Nuclear Information System (INIS)
Kambe, Mitsuru; Tsunoda, Hirokazu; Mishima, Kaichiro; Iwamura, Takamichi
2002-01-01
The 200 kWe uranium-nitride fueled, lithium cooled fast reactor concept 'RAPID-L', designed to achieve highly automated reactor operation, has been demonstrated. RAPID-L is designed for a lunar base power system. It is one of the variants of the RAPID (Refueling by All Pins Integrated Design) fast reactor concept, which enables quick and simplified refueling. The essential feature of the RAPID concept is that the reactor core consists of an integrated fuel assembly instead of conventional fuel subassemblies. In this small reactor core, 2700 fuel pins are integrated altogether and encased in a fuel cartridge. Refueling is conducted by replacing the fuel cartridge. The reactor can be operated without refueling for up to 10 years. Unique approaches to reactivity control systems design have been attempted in the RAPID-L concept. The reactor has no control rods, but incorporates the following innovative reactivity control systems: Lithium Expansion Modules (LEM) for inherent reactivity feedback, Lithium Injection Modules (LIM) for inherent ultimate shutdown, and Lithium Release Modules (LRM) for automated reactor startup. All these systems adopt lithium-6 as a liquid poison instead of B4C rods. In combination with LEMs, LIMs and LRMs, RAPID-L can be operated without an operator. This is the first reactor concept ever established in the world with this capability. This reactor concept is also applicable to terrestrial fast reactors. In this paper, the RAPID-L reactor concept and its transient characteristics are presented. (authors)
Paatero, Ilkka; Casals, Eudald; Niemi, Rasmus; Özliseli, Ezgi; Rosenholm, Jessica M; Sahlgren, Cecilia
2017-08-21
Mesoporous silica nanoparticles (MSNs) are extensively explored as drug delivery systems, but in-depth understanding of design-toxicity relationships is still scarce. We used zebrafish (Danio rerio) embryos to study toxicity profiles of differently surface-functionalized MSNs. Embryos with the chorion membrane intact, or dechorionated embryos, were incubated or microinjected with amino (NH2-MSNs), polyethyleneimine (PEI-MSNs), succinic acid (SUCC-MSNs) or polyethyleneglycol (PEG-MSNs) functionalized MSNs. Toxicity was assessed by viability and cardiovascular function. NH2-MSNs, SUCC-MSNs and PEG-MSNs were well tolerated, whereas 50 µg/ml PEI-MSNs induced 100% lethality by 48 hours post fertilization (hpf). Dechorionated embryos were more sensitive, and 10 µg/ml PEI-MSNs reduced viability to 5% at 96 hpf. Sensitivity to PEG- and SUCC-, but not NH2-MSNs, was also enhanced. Typically, cardiovascular toxicity was evident prior to lethality. Confocal microscopy revealed that PEI-MSNs penetrated into the embryos, whereas PEG-, NH2- and SUCC-MSNs remained aggregated on the skin surface. Direct exposure of inner organs by microinjecting NH2-MSNs and PEI-MSNs demonstrated that the particles displayed similar toxicity, indicating that functionalization affects the toxicity profile by influencing penetrance through biological barriers. The data emphasize the need for careful analyses of toxicity mechanisms in relevant models and constitute an important step towards the development of safer and sustainable nanotherapies.
Combining Alphas via Bounded Regression
Directory of Open Access Journals (Sweden)
Zura Kakushadze
2015-11-01
We give an explicit algorithm and source code for combining alpha streams via bounded regression. In practical applications, typically, there is insufficient history to compute a sample covariance matrix (SCM) for a large number of alphas. To compute alpha allocation weights, one then resorts to (weighted) regression over SCM principal components. Regression often produces alpha weights with insufficient diversification and/or a skewed distribution against, e.g., turnover. This can be rectified by imposing bounds on the alpha weights within the regression procedure. Bounded regression can also be applied to stock and other asset portfolio construction. We discuss illustrative examples.
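As a hedged sketch of the bounding idea (toy simulated alpha returns, not the paper's algorithm or source code), SciPy's bounded least squares can impose the weight bounds directly in the regression:

```python
import numpy as np
from scipy.optimize import lsq_linear

# Toy setup: T return observations of N alpha streams (hypothetical data; the
# paper works with SCM principal components of real alpha histories).
rng = np.random.default_rng(1)
T, N = 250, 5
alphas = rng.normal(size=(T, N))                  # alpha stream returns
target = alphas @ np.array([0.5, 0.3, 0.1, 0.05, 0.05]) + rng.normal(0.0, 0.1, T)

# Unbounded regression can produce skewed or negative weights; bounding each
# weight to [0, 0.4] forces diversification, the point of bounded regression.
res = lsq_linear(alphas, target, bounds=(0.0, 0.4))
weights = res.x
print(weights)
```

Here the true leading weight (0.5) exceeds the upper bound, so the bounded solution pins it at the bound and spreads the remainder across the other alphas.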
Regression in autistic spectrum disorders.
Stefanatos, Gerry A
2008-12-01
A significant proportion of children diagnosed with Autistic Spectrum Disorder experience a developmental regression characterized by a loss of previously-acquired skills. This may involve a loss of speech or social responsivity, but often entails both. This paper critically reviews the phenomena of regression in autistic spectrum disorders, highlighting the characteristics of regression, age of onset, temporal course, and long-term outcome. Important considerations for diagnosis are discussed and multiple etiological factors currently hypothesized to underlie the phenomenon are reviewed. It is argued that regressive autistic spectrum disorders can be conceptualized on a spectrum with other regressive disorders that may share common pathophysiological features. The implications of this viewpoint are discussed.
Directory of Open Access Journals (Sweden)
Joseph Donroe
2008-09-01
Child pedestrian road traffic injuries (RTIs) are an important cause of death and disability in poorer nations, however RTI prevention strategies in those countries largely draw upon studies conducted in wealthier countries. This research investigated personal and environmental risk factors for child pedestrian RTIs relevant to an urban, developing world setting. This is a case control study of personal and environmental risk factors for child pedestrian RTIs in San Juan de Miraflores, Lima, Perú. The analysis of personal risk factors included 100 cases of serious pedestrian RTIs and 200 age and gender matched controls. Demographic, socioeconomic, and injury data were collected. The environmental risk factor study evaluated vehicle and pedestrian movement and infrastructure at the sites in which 40 of the above case RTIs occurred and 80 control sites. After adjustment, factors associated with increased risk of child pedestrian RTIs included high vehicle volume (OR 7.88, 95%CI 1.97-31.52), absent lane demarcations (OR 6.59, 95%CI 1.65-26.26), high vehicle speed (OR 5.35, 95%CI 1.55-18.54), high street vendor density (OR 1.25, 95%CI 1.01-1.55), and more children living in the home (OR 1.25, 95%CI 1.00-1.56). Protective factors included more hours/day spent in school (OR 0.52, 95%CI 0.33-0.82) and years of family residence in the same home (OR 0.97, 95%CI 0.95-0.99). Reducing traffic volumes and speeds, limiting the number of street vendors on a given stretch of road, and improving lane demarcation should be evaluated as components of child pedestrian RTI interventions in poorer countries.
Mettot, Clément; Sipp, Denis; Bézard, Hervé
2014-04-01
This article presents a quasi-laminar stability approach to identify in high-Reynolds number flows the dominant low-frequencies and to design passive control means to shift these frequencies. The approach is based on a global linear stability analysis of mean-flows, which correspond to the time-average of the unsteady flows. Contrary to the previous work by Meliga et al. ["Sensitivity of 2-D turbulent flow past a D-shaped cylinder using global stability," Phys. Fluids 24, 061701 (2012)], we use the linearized Navier-Stokes equations based solely on the molecular viscosity (leaving aside any turbulence model and any eddy viscosity) to extract the least stable direct and adjoint global modes of the flow. Then, we compute the frequency sensitivity maps of these modes, so as to predict beforehand where a small control cylinder optimally shifts the frequency of the flow. In the case of the D-shaped cylinder studied by Parezanović and Cadot [J. Fluid Mech. 693, 115 (2012)], we show that the present approach well captures the frequency of the flow and recovers accurately the frequency control maps obtained experimentally. The results are close to those already obtained by Meliga et al., who used a more complex approach in which turbulence models played a central role. The present approach is simpler and may be applied to a broader range of flows since it is tractable as soon as mean-flows — which can be obtained either numerically from simulations (Direct Numerical Simulation (DNS), Large Eddy Simulation (LES), unsteady Reynolds-Averaged-Navier-Stokes (RANS), steady RANS) or from experimental measurements (Particle Image Velocimetry - PIV) — are available. We also discuss how the influence of the control cylinder on the mean-flow may be more accurately predicted by determining an eddy-viscosity from numerical simulations or experimental measurements. From a technical point of view, we finally show how an existing compressible numerical simulation code may be used in
jQC-PET, an ImageJ macro to analyse the quality control of a PET/CT
International Nuclear Information System (INIS)
Cortes-Rodicio, J.; Sanchez-Merino, G.; Garcia-Fidalgo, A.
2015-01-01
An ImageJ macro has been developed to facilitate the analysis of three PET/CT quality control procedures included in the documents from the National Electrical Manufacturers Association (NU2-2007) and the International Atomic Energy Agency (Pub-1393): image quality, uniformity and spatial resolution. In these procedures, the generation of the regions of interest and their analysis are automated. The results obtained with the software have been compared with those of the commercial software and with the literature. The use of jQC-PET allows a standardized analysis that is independent of the commercial software. (Author)
International Nuclear Information System (INIS)
O'HARA, J.M.; BROWN, W.
2004-01-01
Many nuclear power plants are modernizing with digital instrumentation and control systems and computer-based human-system interfaces (HSIs). The purpose of this paper is to summarize the human factors engineering (HFE) activities that can help to ensure that the design meets personnel needs. HFE activities should be integrated into the design process as a regular part of the engineering effort of a plant modification. The HFE activities will help ensure that human performance issues are addressed, that new technology supports task performance, and that the HSIs are designed in a manner that is compatible with human physiological, cognitive and social characteristics.
Directory of Open Access Journals (Sweden)
S. Ye
2012-11-01
The flow duration curve (FDC) is a classical method used to graphically represent the relationship between the frequency and magnitude of streamflow. In this sense it represents a compact signature of temporal runoff variability that can also be used to diagnose catchment rainfall-runoff responses, including similarity and differences between catchments. This paper is aimed at extracting regional patterns of the FDCs from observed daily flow data and elucidating the physical controls underlying these patterns, as a way to aid towards their regionalization and predictions in ungauged basins. The FDCs of total runoff (TFDC), using multi-decadal streamflow records for 197 catchments across the continental United States, are separated into the FDCs of two runoff components, i.e., fast flow (FFDC) and slow flow (SFDC). In order to compactly display these regional patterns, the 3-parameter mixed gamma distribution is employed to characterize the shapes of the normalized FDCs (i.e., TFDC, FFDC and SFDC) over the entire data record. This is repeated to also characterize the between-year variability of "annual" FDCs for 8 representative catchments chosen across a climate gradient. Results show that the mixed gamma distribution can adequately capture the shapes of the FDCs and their variation between catchments and also between years. Comparison between the between-catchment and between-year variability of the FDCs revealed significant space-time symmetry. Possible relationships between the parameters of the fitted mixed gamma distribution and catchment climatic and physiographic characteristics are explored in order to decipher and point to the underlying physical controls. The baseflow index (a surrogate for the collective impact of geology, soils, topography and vegetation, as well as climate) is found to be the dominant control on the shapes of the normalized TFDC and SFDC, whereas the product of maximum daily precipitation and the fraction of non-rainy days
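A minimal sketch of the empirical FDC construction that underlies such analyses (synthetic lognormal daily flows stand in for a gauge record; the mixed gamma fitting step is omitted):

```python
import numpy as np

# Empirical flow duration curve: exceedance probability vs. magnitude for a
# daily streamflow series.
rng = np.random.default_rng(4)
flows = rng.lognormal(mean=1.0, sigma=1.2, size=3650)   # ~10 years of daily flow

def flow_duration_curve(q):
    """Sorted flows paired with exceedance probabilities using the Weibull
    plotting position i / (n + 1)."""
    q_sorted = np.sort(q)[::-1]                          # descending magnitude
    exceedance = np.arange(1, q.size + 1) / (q.size + 1.0)
    return exceedance, q_sorted

p, q = flow_duration_curve(flows)
q50 = q[np.searchsorted(p, 0.5)]   # flow exceeded 50% of the time (median flow)
print(q50)
```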
DEFF Research Database (Denmark)
Jørgensen, Marie B; Faber, Anne; Jespersen, Tobias
2012-01-01
This study evaluates the implementation of physical coordination training (PCT) and cognitive behavioural training (CBTr) interventions in a randomised controlled trial at nine cleaners' workplaces. Female cleaners (n = 294) were randomised into a PCT, a CBTr or a reference (REF) group. Both 12 ... intervention effects, more research on implementation is needed. Trial registration: ISRCTN96241850. Practitioner summary: Both physical coordination training and cognitive behavioural training are potentially effective workplace interventions among low-educated job groups with high physical work demands.
The natural oscillation of two types of ENSO events based on analyses of CMIP5 model control runs
Xu, Kang; Su, Jingzhi; Zhu, Congwen
2014-07-01
The eastern- and central-Pacific El Niño-Southern Oscillation (EP- and CP-ENSO) have been found to be dominant in the tropical Pacific Ocean, and are characterized by interannual and decadal oscillation, respectively. In the present study, we defined the EP- and CP-ENSO modes by singular value decomposition (SVD) between SST and sea level pressure (SLP) anomalous fields. We evaluated the natural features of these two types of ENSO modes as simulated by the pre-industrial control runs of 20 models involved in phase five of the Coupled Model Intercomparison Project (CMIP5). The results suggested that all the models show good skill in simulating the SST and SLP anomaly dipolar structures for the EP-ENSO mode, but only 12 exhibit good performance in simulating the tripolar CP-ENSO modes. Wavelet analysis suggested that the ensemble principal components in these 12 models exhibit an interannual and multi-decadal oscillation related to the EP- and CP-ENSO, respectively. Since there are no changes in external forcing in the pre-industrial control runs, such a result implies that the decadal oscillation of CP-ENSO is possibly a result of natural climate variability rather than external forcing.
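The SVD step that defines such coupled modes can be sketched generically (toy anomaly matrices sharing one "ENSO-like" time series; this is plain maximum covariance analysis, not the authors' CMIP5 processing):

```python
import numpy as np

# Two anomaly fields (time x gridpoints) driven by one shared time series.
rng = np.random.default_rng(6)
t = rng.normal(size=200)                        # shared "ENSO-like" index
sst = np.outer(t, rng.normal(size=30)) + rng.normal(0.0, 0.5, (200, 30))
slp = np.outer(t, rng.normal(size=20)) + rng.normal(0.0, 0.5, (200, 20))
sst -= sst.mean(axis=0)
slp -= slp.mean(axis=0)

# SVD of the cross-covariance matrix yields coupled spatial patterns
# (u columns for SST, vt rows for SLP) ordered by squared covariance.
cov = sst.T @ slp / (sst.shape[0] - 1)
u, s, vt = np.linalg.svd(cov, full_matrices=False)
frac = s[0] ** 2 / np.sum(s ** 2)   # squared covariance fraction of mode 1
print(frac)
```

Because the toy fields share a single driver, the leading mode should capture nearly all of the squared covariance.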
Pearce, J.M.
2006-01-01
Insertions and deletions (indels) result in sequences of various lengths when homologous gene regions are compared among individuals or species. Although indels are typically phylogenetically informative, occurrence and incorporation of these characters as gaps in intraspecific population genetic data sets are rarely discussed. Moreover, the impact of gaps on estimates of fixation indices, such as FST, has not been reviewed. Here, I summarize the occurrence and population genetic signal of indels among 60 published studies that involved alignments of multiple sequences from the mitochondrial DNA (mtDNA) control region of vertebrate taxa. Among 30 studies observing indels, an average of 12% of both variable and parsimony-informative sites were composed of these sites. There was no consistent trend between levels of population differentiation and the number of gap characters in a data block. Across all studies, the average influence on estimates of ΦST was small, explaining only an additional 1.8% of among-population variance (range 0.0-8.0%). Studies most likely to observe an increase in ΦST with the inclusion of gap characters were those with ... control region DNA appears small, dependent upon the total number of variable sites in the data block, and related to species-specific characteristics and the spatial distribution of mtDNA lineages that contain indels. © 2006 Blackwell Publishing Ltd.
Linear regression in astronomy. I
Isobe, Takashi; Feigelson, Eric D.; Akritas, Michael G.; Babu, Gutti Jogesh
1990-01-01
Five methods for obtaining linear regression fits to bivariate data with unknown or insignificant measurement errors are discussed: ordinary least-squares (OLS) regression of Y on X, OLS regression of X on Y, the bisector of the two OLS lines, orthogonal regression, and 'reduced major-axis' regression. These methods have been used by various researchers in observational astronomy, most importantly in cosmic distance scale applications. Formulas for calculating the slope and intercept coefficients and their uncertainties are given for all the methods, including a new general form of the OLS variance estimates. The accuracy of the formulas was confirmed using numerical simulations. The applicability of the procedures is discussed with respect to their mathematical properties, the nature of the astronomical data under consideration, and the scientific purpose of the regression. It is found that, for problems needing symmetrical treatment of the variables, the OLS bisector performs significantly better than orthogonal or reduced major-axis regression.
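A sketch of the slope estimates for the five lines (the closed forms given by Isobe et al.; intercepts and variance estimates omitted), on synthetic data with scatter in Y only:

```python
import numpy as np

def regression_slopes(x, y):
    """Slopes of the five lines discussed by Isobe et al. (1990):
    OLS(Y|X), OLS(X|Y), their bisector, orthogonal regression,
    and reduced major-axis regression."""
    xm, ym = x - x.mean(), y - y.mean()
    sxx, syy, sxy = np.sum(xm * xm), np.sum(ym * ym), np.sum(xm * ym)
    b1 = sxy / sxx                      # OLS(Y|X)
    b2 = syy / sxy                      # OLS(X|Y)
    bis = (b1 * b2 - 1.0 + np.sqrt((1 + b1**2) * (1 + b2**2))) / (b1 + b2)
    orth = 0.5 * ((b2 - 1.0 / b1)
                  + np.sign(sxy) * np.sqrt(4.0 + (b2 - 1.0 / b1) ** 2))
    rma = np.sign(sxy) * np.sqrt(b1 * b2)
    return b1, b2, bis, orth, rma

rng = np.random.default_rng(2)
x = rng.normal(size=500)
y = 2.0 * x + rng.normal(0.0, 0.5, 500)   # scatter in y only
b1, b2, bis, orth, rma = regression_slopes(x, y)
print(b1, b2, bis, orth, rma)
```

With scatter in Y only, OLS(X|Y) is steeper than OLS(Y|X), and the symmetric estimators fall between the two, which is why the choice matters for distance-scale fits.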
Li, X.; Zhang, Y.; Zheng, B.; Zhang, Q.; He, K.
2013-12-01
Anthropogenic emissions have been controlled in recent years in China to mitigate fine particulate matter (PM2.5) pollution. Recent studies show that sulfur dioxide (SO2)-only control cannot reduce total PM2.5 levels efficiently. Other species such as nitrogen oxide, ammonia, black carbon, and organic carbon may be equally important during particular seasons. Furthermore, each species is emitted from several anthropogenic sectors (e.g., industry, power plant, transportation, residential and agriculture). On the other hand, the contribution of one emission sector to PM2.5 represents the contributions of all species in that sector. In this work, two model-based methods are used to identify the most influential emission sectors and areas to PM2.5. The first method is the source apportionment (SA) based on the Particulate Source Apportionment Technology (PSAT) available in the Comprehensive Air Quality Model with extensions (CAMx) driven by meteorological predictions of the Weather Research and Forecast (WRF) model. The second method is the source sensitivity (SS) based on an adjoint integration technique (AIT) available in the GEOS-Chem model. The SA method attributes simulated PM2.5 concentrations to each emission group, while the SS method calculates their sensitivity to each emission group, accounting for the non-linear relationship between PM2.5 and its precursors. Despite their differences, the complementary nature of the two methods enables a complete analysis of source-receptor relationships to support emission control policies. Our objectives are to quantify the contributions of each emission group/area to PM2.5 in the receptor areas and to intercompare results from the two methods to gain a comprehensive understanding of the role of emission sources in PM2.5 formation. The results will be compared in terms of the magnitudes and rankings of SS or SA of emitted species and emission groups/areas. GEOS-Chem with AIT is applied over East Asia at a horizontal grid
Advanced statistics: linear regression, part I: simple linear regression.
Marill, Keith A
2004-01-01
Simple linear regression is a mathematical technique used to model the relationship between a single independent predictor variable and a single dependent outcome variable. In this, the first of a two-part series exploring concepts in linear regression analysis, the four fundamental assumptions and the mechanics of simple linear regression are reviewed. The most common technique used to derive the regression line, the method of least squares, is described. The reader will be acquainted with other important concepts in simple linear regression, including: variable transformations, dummy variables, relationship to inference testing, and leverage. Simplified clinical examples with small datasets and graphic models are used to illustrate the points. This will provide a foundation for the second article in this series: a discussion of multiple linear regression, in which there are multiple predictor variables.
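The least-squares mechanics reviewed here can be shown in a few lines (illustrative numbers, not a clinical dataset):

```python
# Least-squares slope and intercept for a single predictor:
# slope = Sxy / Sxx, intercept = y_mean - slope * x_mean.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 8.0, 9.9]

n = len(xs)
x_mean = sum(xs) / n
y_mean = sum(ys) / n
sxx = sum((x - x_mean) ** 2 for x in xs)                       # Σ(x - x̄)²
sxy = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ys)) # Σ(x - x̄)(y - ȳ)
slope = sxy / sxx
intercept = y_mean - slope * x_mean
print(slope, intercept)
```

For these numbers the line is y ≈ 0.11 + 1.97 x; the fitted line always passes through the point of means (x̄, ȳ).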
Selle, V; Schalkwijk, S; Vázquez, G H; Baldessarini, R J
2014-03-01
Optimal treatments for bipolar depression, and the relative value of specific drugs for that purpose, remain uncertain, including agents other than antidepressants. We searched for reports of placebo-controlled, monotherapy trials of mood-stabilizing anticonvulsants, second-generation antipsychotics, or lithium for acute major depressive episodes in patients diagnosed with type I or II bipolar disorder and applied random-effects meta-analysis to evaluate their efficacy, comparing outcomes based on standardized mean drug-placebo differences (SMD) in improvement, relative response rates (RR), and number-needed-to-treat (NNT). We identified 24 trials of 10 treatments (lasting 7.5 weeks, with ≥ 50 collaborating sites/trial) that met eligibility criteria: lamotrigine (5 trials), quetiapine (5), valproate (4), 2 each for aripiprazole, olanzapine, ziprasidone, and 1 each for carbamazepine, lithium, lurasidone, and olanzapine-fluoxetine. Overall, pooled drug-over-placebo responder-rate superiority (RR) was moderate (29% [CI: 19-40%]), and NNT was 8.2 (CI: 6.4-11). By SMD, apparent efficacy ranked: olanzapine + fluoxetine ≥ valproate > quetiapine > lurasidone > olanzapine, aripiprazole, and carbamazepine; ziprasidone was ineffective, and lithium remains inadequately studied. Notably, drugs were superior to placebo in only 11/24 trials (5/5 with quetiapine, 2/4 with valproate), and only lamotrigine, quetiapine and valproate had > 2 trials. Treatment-associated mania-like reactions were uncommon (drugs: 3.7%; placebo: 4.7%). Controlled trials of non-antidepressant treatments for bipolar depression remain scarce, but findings with olanzapine-fluoxetine, lurasidone, quetiapine, and perhaps carbamazepine and valproate were encouraging; lithium requires adequate testing. © Georg Thieme Verlag KG Stuttgart · New York.
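The responder-rate arithmetic behind RR and NNT can be illustrated with hypothetical counts (invented numbers, not the trial data pooled above):

```python
# Relative response-rate superiority and number-needed-to-treat from
# responder counts in two arms (hypothetical counts).
drug_resp, drug_n = 120, 300        # responders / randomized, drug arm
plac_resp, plac_n = 90, 300         # responders / randomized, placebo arm

p_drug = drug_resp / drug_n         # response rate, drug
p_plac = plac_resp / plac_n         # response rate, placebo
rr_superiority = (p_drug - p_plac) / p_plac   # drug-over-placebo superiority
nnt = 1.0 / (p_drug - p_plac)       # patients treated per extra responder
print(rr_superiority, nnt)
```

With these counts the superiority is about 33% and the NNT is 10, on the same scale as the pooled 29% and 8.2 reported in the abstract.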
Marsh, Fiona; Kremer, Christian; Duffy, Sean
2004-03-01
To examine the cost implications of outpatient versus daycase hysteroscopy to the National Health Service, the patient and their employer. Randomised controlled trial. The gynaecology clinic of a large teaching hospital. Ninety-seven women with abnormal uterine bleeding requiring investigation. Women were randomly allocated to either outpatient or daycase hysteroscopy. They were asked to complete diaries recording expenses and time off work. The National Health Service costs were calculated for a standard outpatient and daycase hysteroscopy. Costs to the National Health Service, costs to the employer, loss of income, childcare costs and travel expenses. The outpatient group required significantly less time off work than the daycase group (0.8 days vs 3.3 days). Performing a daycase hysteroscopy cost the National Health Service approximately £53.88 more per patient than performing an outpatient hysteroscopy. Purchasing the hysteroscopes necessary to perform outpatient hysteroscopy is a more expensive outlay than that required for daycase hysteroscopy. However, the other savings are such that only 38 patients need to undergo outpatient hysteroscopy (even with a 4% failure rate) rather than daycase hysteroscopy in order to recoup the extra money required to set up an outpatient hysteroscopy service. Outpatient hysteroscopy offers many benefits over its traditional counterpart, including faster recovery, less time away from work and home, and cost savings to the woman, her employer and the National Health Service. Resources need to be made available to develop this service rapidly across the UK in order to better serve both patient and taxpayer.
Linear regression in astronomy. II
Feigelson, Eric D.; Babu, Gutti J.
1992-01-01
A wide variety of least-squares linear regression procedures used in observational astronomy, particularly investigations of the cosmic distance scale, are presented and discussed. The classes of linear models considered are (1) unweighted regression lines, with bootstrap and jackknife resampling; (2) regression solutions when measurement error, in one or both variables, dominates the scatter; (3) methods to apply a calibration line to new data; (4) truncated regression models, which apply to flux-limited data sets; and (5) censored regression models, which apply when nondetections are present. For the calibration problem we develop two new procedures: a formula for the intercept offset between two parallel data sets, which propagates slope errors from one regression to the other; and a generalization of the Working-Hotelling confidence bands to nonstandard least-squares lines. They can provide improved error analysis for Faber-Jackson, Tully-Fisher, and similar cosmic distance scale relations.
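The bootstrap resampling mentioned for unweighted regression lines can be sketched as follows. The data here are synthetic, not an astronomical dataset: (x, y) pairs are resampled with replacement and the slope is refitted each time, so the spread of refitted slopes estimates the slope's standard error without distributional assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic linear relation with intrinsic scatter (illustrative only).
n = 50
x = rng.uniform(0.0, 10.0, n)
y = 2.0 * x + 1.0 + rng.normal(0.0, 1.0, n)

def ols_slope(x, y):
    """Unweighted least-squares slope."""
    return np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)

# Bootstrap: resample (x, y) pairs with replacement and refit the slope.
slopes = []
for _ in range(2000):
    idx = rng.integers(0, n, n)
    slopes.append(ols_slope(x[idx], y[idx]))
slopes = np.array(slopes)

# Mean of resampled slopes and bootstrap standard error of the slope.
print(round(float(slopes.mean()), 2), round(float(slopes.std(ddof=1)), 3))
```

The jackknife variant differs only in the resampling scheme (leave one point out at a time instead of drawing with replacement).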
Time-adaptive quantile regression
DEFF Research Database (Denmark)
Møller, Jan Kloppenborg; Nielsen, Henrik Aalborg; Madsen, Henrik
2008-01-01
An algorithm for time-adaptive quantile regression is presented. The algorithm is based on the simplex algorithm, and the linear optimization formulation of the quantile regression problem is given. The observations have been split to allow a direct use of the simplex algorithm. The simplex method and an updating procedure are combined into a new algorithm for time-adaptive quantile regression, which generates new solutions on the basis of the old solution, leading to savings in computation time. The suggested algorithm is tested against a static quantile regression model on a data set with wind power production, where the models combine splines and quantile regression. The comparison indicates superior performance for the time-adaptive quantile regression in all the performance parameters considered.
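Whatever the solver, quantile regression minimizes the asymmetric check (pinball) loss. The sketch below uses simulated data rather than the wind power set, and a grid search rather than the paper's simplex-based algorithm, to confirm numerically that for an intercept-only model the minimizer is the empirical quantile.

```python
import numpy as np

def pinball_loss(q, y, tau):
    """Check (pinball) loss minimized by the tau-th quantile."""
    u = y - q
    return float(np.sum(np.where(u >= 0, tau * u, (tau - 1.0) * u)))

rng = np.random.default_rng(1)
y = rng.normal(0.0, 1.0, 1000)
tau = 0.9

# For an intercept-only model, the minimizer of the pinball loss is the
# empirical tau-quantile; confirm by evaluating the loss on a fine grid.
grid = np.linspace(-3.0, 3.0, 601)
losses = [pinball_loss(q, y, tau) for q in grid]
best = float(grid[int(np.argmin(losses))])

print(round(best, 2), round(float(np.quantile(y, tau)), 2))  # two nearly equal values
```

With covariates, the same loss is minimized over regression coefficients, which is exactly the linear program the simplex formulation solves.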
Time series regression model for infectious disease and weather.
Imai, Chisato; Armstrong, Ben; Chalabi, Zaid; Mangtani, Punam; Hashizume, Masahiro
2015-10-01
Time series regression has been developed and long used to evaluate the short-term associations of air pollution and weather with mortality or morbidity of non-infectious diseases. The application of the regression approaches from this tradition to infectious diseases, however, is less well explored and raises some new issues. We discuss and present potential solutions for five issues often arising in such analyses: changes in immune population, strong autocorrelations, a wide range of plausible lag structures and association patterns, seasonality adjustments, and large overdispersion. The potential approaches are illustrated with datasets of cholera cases and rainfall from Bangladesh and influenza and temperature in Tokyo. Though this article focuses on the application of the traditional time series regression to infectious diseases and weather factors, we also briefly introduce alternative approaches, including mathematical modeling, wavelet analysis, and autoregressive integrated moving average (ARIMA) models. Modifications proposed to standard time series regression practice include using sums of past cases as proxies for the immune population, and using the logarithm of lagged disease counts to control autocorrelation due to true contagion, both of which are motivated from "susceptible-infectious-recovered" (SIR) models. The complexity of lag structures and association patterns can often be informed by biological mechanisms and explored by using distributed lag non-linear models. For overdispersed models, alternative distribution models such as quasi-Poisson and negative binomial should be considered. Time series regression can be used to investigate dependence of infectious diseases on weather, but may need modifying to allow for features specific to this context. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
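The proposed use of the logarithm of lagged disease counts as a covariate can be sketched in a Poisson log-linear model. The data below are simulated (not the cholera or influenza series), and the model is fitted with a hand-rolled iteratively reweighted least squares (IRLS) routine standing in for standard GLM software.

```python
import numpy as np

def poisson_irls(X, y, n_iter=25):
    """Poisson log-linear regression by iteratively reweighted least
    squares (a minimal sketch of what GLM software does internally)."""
    beta = np.zeros(X.shape[1])
    beta[0] = np.log(y.mean())           # sensible starting intercept
    for _ in range(n_iter):
        eta = X @ beta
        mu = np.exp(eta)
        z = eta + (y - mu) / mu          # working response
        XtW = X.T * mu                   # X' diag(mu): working weights
        beta = np.linalg.solve(XtW @ X, XtW @ z)
    return beta

rng = np.random.default_rng(2)

# Simulated weekly counts driven by temperature plus "true contagion":
# the expected count depends on the log of the previous week's count.
n = 400
temp = rng.normal(20.0, 5.0, n)
y = np.empty(n)
y[0] = 10
for t in range(1, n):
    lam = np.exp(0.5 + 0.05 * temp[t] + 0.4 * np.log(y[t - 1] + 1))
    y[t] = rng.poisson(lam)

# Covariates: intercept, temperature, and log of the lagged count.
X = np.column_stack([np.ones(n - 1), temp[1:], np.log(y[:-1] + 1)])
beta = poisson_irls(X, y[1:])
print(np.round(beta, 2))   # estimates near the true (0.5, 0.05, 0.4)
```

For overdispersed series, the abstract's advice applies: swap the Poisson family for quasi-Poisson or negative binomial in a proper GLM package.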
Retro-regression--another important multivariate regression improvement.
Randić, M
2001-01-01
We review the serious problem associated with instabilities of the coefficients of regression equations, referred to as the MRA (multivariate regression analysis) "nightmare of the first kind". This is manifested when a descriptor is included in or excluded from a stepwise regression: the consequence is an unpredictable change in the coefficients of the descriptors that remain in the regression equation. We then consider an even more serious problem, referred to as the MRA "nightmare of the second kind", which arises when optimal descriptors are selected from a large pool of descriptors. At different steps of the stepwise regression, this process typically causes several previously used descriptors to be replaced by new ones. We describe a procedure that resolves these difficulties. The approach is illustrated on boiling points of nonanes, which are considered (1) by using an ordered connectivity basis; (2) by using an ordering resulting from application of a greedy algorithm; and (3) by using an ordering derived from an exhaustive search for optimal descriptors. A novel variant of multiple regression analysis, called retro-regression (RR), is outlined, showing how it resolves the ambiguities associated with both the first and the second kind of MRA "nightmare".
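The "nightmare of the first kind" is easy to reproduce: with two strongly correlated descriptors, the coefficient of the retained descriptor changes drastically when the other enters the regression. The descriptors below are simulated, not the nonane connectivity indices.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200

# Two strongly correlated "descriptors" and a property built from both.
d1 = rng.normal(0.0, 1.0, n)
d2 = 0.9 * d1 + 0.1 * rng.normal(0.0, 1.0, n)    # nearly collinear with d1
y = d1 + d2 + rng.normal(0.0, 0.5, n)

def ols(X, y):
    return np.linalg.lstsq(X, y, rcond=None)[0]

one = np.ones(n)
coef_single = float(ols(np.column_stack([one, d1]), y)[1])
coef_both = float(ols(np.column_stack([one, d1, d2]), y)[1])

# With d2 excluded, d1 absorbs its effect (coefficient near 1.9); once d2
# enters, d1's coefficient drops toward 1 -- an unpredictable change of
# exactly the kind described above.
print(round(coef_single, 2), round(coef_both, 2))
```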
Energy Technology Data Exchange (ETDEWEB)
Soltys, S. [Stanford Univ. (United States)
2015-06-15
Stereotactic Body Radiation Therapy (SBRT) was introduced clinically more than twenty years ago, and many subsequent publications have reported safety and efficacy data. The AAPM Working Group on Biological Effects of Hypofractionated Radiotherapy/SBRT (WGSBRT) extracted published treatment outcomes data from extensive literature searches to summarize and construct tumor control probability (TCP) and normal tissue complication probability (NTCP) models for six anatomical regions: Cranial, Head and Neck, Thoracic, Abdominal, Pelvic, and Spinal. In this session, we present the WGSBRT’s work for cranial sites, and recurrent head and neck cancer. From literature-based data and associated models, guidelines to aid with safe and effective hypofractionated radiotherapy treatment are being determined. Further, the ability of existing and proposed radiobiological models to fit these data is considered as to the ability to distinguish between the linear-quadratic and alternative radiobiological models such as secondary cell death from vascular damage, immunogenic, or bystander effects. Where appropriate, specific model parameters are estimated. As described in “The lessons of QUANTEC,” (1), lack of adequate reporting standards continues to limit the amount of useful quantitative information that can be extracted from peer-reviewed publications. Recommendations regarding reporting standards are considered, to enable such reviews to achieve more complete characterization of clinical outcomes. 1 Jackson A, Marks LB, Bentzen SM, Eisbruch A, Yorke ED, Ten Haken RK, Constine LS, Deasy JO. The lessons of QUANTEC: recommendations for reporting and gathering data on dose-volume dependencies of treatment outcome. Int J Radiat Oncol Biol Phys. 2010 Mar 1;76(3 Suppl):S155–60. Learning Objectives: Describe the techniques, types of cancer and dose schedules used in treating recurrent H&N cancers with SBRT List the radiobiological models that compete with the linear-quadratic model
Buddhachat, Kittisak; Osathanunkul, Maslin; Madesis, Panagiotis; Chomdej, Siriwadee; Ongchai, Siriwan
2015-11-15
efficient molecular tool for correct species identification. This molecular tool provides a noteworthy benefit for quality control of medicinal plants and industry plants for pharmacological prospects. Copyright © 2015 Elsevier B.V. All rights reserved.
Martins, António A.; Cabral, João; Cunha, Pedro P.; Stokes, Martin; Borges, José; Caldeira, Bento; Martins, A. Cardoso
2017-01-01
This study examines the long profiles of tributaries of the Tagus and Zêzere rivers in Portugal (West Iberia) in order to provide new insights into patterns, timing, and controls on drainage development during the Quaternary incision stage. The studied streams are incised into a relict culminant fluvial surface, abandoned at the beginning of the incision stage. The streams flow through a landscape with bedrock variations in lithology (mainly granites and metasediments) and faulted blocks with distinct uplift rates. The long profiles of the analyzed streams record an older transitory knickpoint/knickzone separating (1) an upstream relict graded profile, with lower steepness and higher concavity, that reflects a long period of quasi-equilibrium conditions reached after the beginning of the incision stage, and (2) a downstream rejuvenated long profile, with steeper gradient and lower concavity, particularly for the final reach, which is often convex. The rejuvenated reaches testify to the upstream propagation of several incision waves, interpreted as the response of each stream to increasing crustal uplift and prolonged periods of base-level lowering by the trunk drainages, coeval with low sea level conditions. The morphological configurations of the long profiles enabled spatial and relative temporal patterns of incisions to be quantified. The incision values of streams flowing on the Portuguese Central Range (PCR; ca. 380-150 m) are variable but generally higher than the incision values of streams flowing on the adjacent South Portugal Planation Surface (SPPS; ca. 220-110 m), corroborating differential uplift of the PCR relative to the SPPS. Owing to the fact that the relict graded profiles can be correlated with the Tagus River T1 terrace (1.1-0.9 My) present in the study area, incision rates can be estimated (1) for the streams located in the PCR, 0.38-0.15 m/ky and (2) for the streams flowing on the SPPS, 0.22-0.12 m/ky. The differential uplift inferred in the
Quantile regression theory and applications
Davino, Cristina; Vistocco, Domenico
2013-01-01
A guide to the implementation and interpretation of quantile regression models. This book explores the theory and numerous applications of quantile regression, offering empirical data analysis as well as the software tools to implement the methods. The main focus of this book is to provide the reader with a comprehensive description of the main issues concerning quantile regression; these include basic modeling, geometrical interpretation, estimation and inference for quantile regression, as well as issues on the validity of the model and diagnostic tools. Each methodological aspect is explored and
Directory of Open Access Journals (Sweden)
Andreas Schmitt
To appraise the Diabetes Self-Management Questionnaire (DSMQ)'s measurement of diabetes self-management as a statistical predictor of glycaemic control relative to the widely used SDSCA. 248 patients with type 1 diabetes and 182 patients with type 2 diabetes were cross-sectionally assessed using the two self-report measures of diabetes self-management, DSMQ and SDSCA; the scales were used as competing predictors of HbA1c. We developed a structural equation model of self-management as measured by the DSMQ and analysed the amount of variation explained in HbA1c; an analogous model was developed for the SDSCA. The structural equation models of self-management and glycaemic control showed very good fit to the data. The DSMQ's measurement of self-management showed associations with HbA1c of -0.53 for type 1 and -0.46 for type 2 diabetes (both P < 0.001), explaining 21% and 28% of the variation in glycaemic control, respectively. The SDSCA's measurement showed associations with HbA1c of -0.14 (P = 0.030) for type 1 and -0.31 (P = 0.003) for type 2 diabetes, explaining 2% and 10% of glycaemic variation. Predictive power for glycaemic control was significantly higher for the DSMQ (P < 0.001). This study supports the DSMQ as the preferred tool when analysing self-reported behavioural problems related to reduced glycaemic control. The scale may be useful for clinical assessments of patients with suboptimal diabetes outcomes or research on factors affecting associations between self-management behaviours and glycaemic control.
Magee, Laura A.; von Dadelszen, Peter; Singer, Joel; Lee, Terry; Rey, Evelyne; Ross, Susan; Asztalos, Elizabeth; Murphy, Kellie E.; Menzies, Jennifer; Sanchez, Johanna; Gafni, Amiram; Gruslin, Andrée; Helewa, Michael; Hutton, Eileen; Lee, Shoo K.; Logan, Alexander G.; Ganzevoort, Wessel; Welch, Ross; Thornton, Jim G.; Moutquin, Jean Marie
2016-01-01
Introduction. For women with chronic or gestational hypertension in CHIPS (Control of Hypertension In Pregnancy Study, NCT01192412), we aimed to examine whether clinical predictors collected at randomization could predict adverse outcomes. Material and methods. This was a planned, secondary analysis
International Nuclear Information System (INIS)
Bennett, J.T.; Crowder, C.A.; Connolly, M.J.
1994-01-01
Gas samples from drums of radioactive waste at the Department of Energy (DOE) Idaho National Engineering Laboratory are being characterized for 29 volatile organic compounds to determine the feasibility of storing the waste in DOE's Waste Isolation Pilot Plant (WIPP) in Carlsbad, New Mexico. Quality requirements for the gas chromatography (GC) and GC/mass spectrometry chemical methods used to analyze the waste are specified in the Quality Assurance Program Plan for the WIPP Experimental Waste Characterization Program. Quality requirements consist of both objective criteria (data quality objectives, DQOs) and statistical criteria (process control). The DQOs apply to routine sample analyses, while the statistical criteria serve to determine and monitor the precision and accuracy (P&A) of the analysis methods and are also used to assign upper confidence limits to measurement results close to action levels. After over two years and more than 1000 sample analyses, there are two general conclusions concerning the two approaches to quality control: (1) objective criteria (e.g., ±25% precision, ±30% accuracy) based on customer needs and the usually prescribed criteria for similar EPA-approved methods are consistently attained during routine analyses; (2) statistical criteria based on short-term method performance are almost an order of magnitude more stringent than the objective criteria and are difficult to satisfy following the same routine laboratory procedures that satisfy the objective criteria. A more cost-effective and representative approach to establishing statistical method performance criteria would be either to use a moving average of P&A from control samples over a period of several months, or to determine within-sample variation by one-way analysis of variance of several months' replicate sample analysis results, or both. Confidence intervals for results near action levels could also be determined by replicate analysis of the sample in
Panel Smooth Transition Regression Models
DEFF Research Database (Denmark)
González, Andrés; Terasvirta, Timo; Dijk, Dick van
We introduce the panel smooth transition regression model. This new model is intended for characterizing heterogeneous panels, allowing the regression coefficients to vary both across individuals and over time. Specifically, heterogeneity is allowed for by assuming that these coefficients are bou...
Testing discontinuities in nonparametric regression
Dai, Wenlin
2017-01-19
In nonparametric regression, it is often needed to detect whether there are jump discontinuities in the mean function. In this paper, we revisit the difference-based method in [13 H.-G. Müller and U. Stadtmüller, Discontinuous versus smooth regression, Ann. Stat. 27 (1999), pp. 299–337. doi: 10.1214/aos/1018031100
Testing discontinuities in nonparametric regression
Dai, Wenlin; Zhou, Yuejin; Tong, Tiejun
2017-01-01
In nonparametric regression, it is often needed to detect whether there are jump discontinuities in the mean function. In this paper, we revisit the difference-based method in [13 H.-G. Müller and U. Stadtmüller, Discontinuous versus smooth regression, Ann. Stat. 27 (1999), pp. 299–337. doi: 10.1214/aos/1018031100
Logistic Regression: Concept and Application
Cokluk, Omay
2010-01-01
The main focus of logistic regression analysis is classification of individuals in different groups. The aim of the present study is to explain basic concepts and processes of binary logistic regression analysis intended to determine the combination of independent variables which best explain the membership in certain groups called dichotomous…
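A minimal sketch of binary logistic regression as a classifier, on simulated data and with plain gradient ascent on the log-likelihood (a real analysis would use dedicated statistical software):

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, n_iter=5000):
    """Binary logistic regression via gradient ascent on the log-likelihood."""
    w = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-(X @ w)))    # predicted group probabilities
        w += lr * X.T @ (y - p) / len(y)      # log-likelihood gradient step
    return w

rng = np.random.default_rng(4)
n = 500

# Simulated dichotomous group membership: true log-odds = -0.5 + 2*x.
x = rng.normal(0.0, 1.0, n)
p_true = 1.0 / (1.0 + np.exp(-(-0.5 + 2.0 * x)))
y = (rng.uniform(size=n) < p_true).astype(float)

X = np.column_stack([np.ones(n), x])
w = fit_logistic(X, y)

# Classify individuals by thresholding the fitted probability at 0.5.
pred = (1.0 / (1.0 + np.exp(-(X @ w))) > 0.5).astype(float)
print(np.round(w, 2), round(float(np.mean(pred == y)), 2))
```

The fitted coefficients are log odds ratios: each unit increase in x multiplies the odds of membership in the target group by roughly exp(w[1]).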
International Nuclear Information System (INIS)
Leng Ling; Zhang Tianyi; Kleinman, Lawrence; Zhu Wei
2007-01-01
Regression analysis, especially the ordinary least squares method, which assumes that errors are confined to the dependent variable, has seen a fair share of applications in aerosol science. The ordinary least squares approach, however, can be problematic because atmospheric data often do not lend themselves to calling one variable independent and the other dependent; errors often exist in both measurements. In this work, we examine two regression approaches available to accommodate this situation: orthogonal regression and geometric mean regression. Comparisons are made theoretically as well as numerically through an aerosol study examining whether the ratio of organic aerosol to CO changes with age.
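Both symmetric alternatives have closed-form slopes. The sketch below compares them with OLS on simulated data in which both variables carry measurement error; the variable names and error levels are illustrative assumptions, not the aerosol data.

```python
import numpy as np

def slopes(x, y):
    """OLS, geometric-mean, and orthogonal regression slopes of y on x."""
    sxx = np.sum((x - x.mean()) ** 2)
    syy = np.sum((y - y.mean()) ** 2)
    sxy = np.sum((x - x.mean()) * (y - y.mean()))
    ols = sxy / sxx
    gmr = np.sign(sxy) * np.sqrt(syy / sxx)        # geometric mean regression
    orth = (syy - sxx + np.sqrt((syy - sxx) ** 2 + 4 * sxy ** 2)) / (2 * sxy)
    return float(ols), float(gmr), float(orth)

rng = np.random.default_rng(5)
n = 500
t = rng.normal(0.0, 1.0, n)            # latent "true" variable
x = t + rng.normal(0.0, 0.5, n)        # both measurements carry error
y = 2.0 * t + rng.normal(0.0, 0.5, n)  # true slope is 2

ols, gmr, orth = slopes(x, y)
# OLS is attenuated toward zero by the error in x; the symmetric
# estimators sit closer to the underlying slope of 2.
print(round(ols, 2), round(gmr, 2), round(orth, 2))
```

Orthogonal regression as written assumes equal error variances in x and y; with unequal but known error variances the appropriate generalization is Deming regression.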
Tumor regression patterns in retinoblastoma
International Nuclear Information System (INIS)
Zafar, S.N.; Siddique, S.N.; Zaheer, N.
2016-01-01
To observe the types of tumor regression after treatment, and identify the common pattern of regression in our patients. Study Design: Descriptive study. Place and Duration of Study: Department of Pediatric Ophthalmology and Strabismus, Al-Shifa Trust Eye Hospital, Rawalpindi, Pakistan, from October 2011 to October 2014. Methodology: Children with unilateral and bilateral retinoblastoma were included in the study. Patients were referred to Pakistan Institute of Medical Sciences, Islamabad, for chemotherapy. After every cycle of chemotherapy, dilated fundus examination under anesthesia was performed to record the response to treatment. Regression patterns were recorded on RetCam II. Results: Seventy-four tumors were included in the study. Out of 74 tumors, 3 were ICRB group A tumors, 43 were ICRB group B tumors, 14 tumors belonged to ICRB group C, and the remaining 14 were ICRB group D tumors. Type IV regression was seen in 39.1% (n=29) of tumors, type II in 29.7% (n=22), type III in 25.6% (n=19), and type I in 5.4% (n=4). All group A tumors (100%) showed type IV regression. Seventeen (39.5%) group B tumors showed type IV regression. In group C, 5 tumors (35.7%) showed type II regression and 5 tumors (35.7%) showed type IV regression. In group D, 6 tumors (42.9%) regressed to type II non-calcified remnants. Conclusion: The response and success of the focal and systemic treatment, as judged by the appearance of different patterns of tumor regression, varies with the ICRB grouping of the tumor. (author)
Regression to Causality : Regression-style presentation influences causal attribution
DEFF Research Database (Denmark)
Bordacconi, Mats Joe; Larsen, Martin Vinæs
2014-01-01
Our experiment implies that scholars using regression models – one of the primary vehicles for analyzing statistical results in political science – encourage causal interpretation. Specifically, we demonstrate that presenting observational results in a regression model, rather than as a simple comparison of means, makes causal interpretation of the results more likely. Our experiment drew on a sample of 235 university students from three different social science degree programs (political science, sociology and economics), all of whom had received substantial training in statistics. The subjects were asked to compare and evaluate the validity of equivalent results presented as either regression models or as a test of two sample means. Our experiment shows that the subjects who were presented with results as estimates from a regression model were more inclined to interpret these results causally.
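The equivalence the subjects evaluated is exact: regressing the outcome on a 0/1 group dummy yields a coefficient identical to the difference of the two sample means. A sketch with simulated data:

```python
import numpy as np

rng = np.random.default_rng(6)

# Two "samples": an outcome measured in a control and a treatment group.
control = rng.normal(10.0, 2.0, 40)
treat = rng.normal(12.0, 2.0, 40)

# Comparison of means:
diff = float(treat.mean() - control.mean())

# The same comparison as a regression on a 0/1 group dummy:
y = np.concatenate([control, treat])
g = np.concatenate([np.zeros(40), np.ones(40)])
X = np.column_stack([np.ones(80), g])
beta = np.linalg.lstsq(X, y, rcond=None)[0]

# The dummy coefficient equals the difference in means exactly.
print(round(diff, 3), round(float(beta[1]), 3))
```

The information content is identical; only the presentation differs, which is precisely the manipulation in the experiment.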
Directory of Open Access Journals (Sweden)
Claudio Napolis Costa
2005-10-01
Individual records comprising 8,183 test-day milk yields (PLC) from 1,273 first lactations of Gir cows, from herds supervised by ABCZ during 1994/2000, were used to estimate variance components and genetic parameters for PLC using REML. A repeatability model was compared with random regression models fitted with the logarithmic function of Ali & Schaeffer (1987), the exponential function of Wilmink (1987), and orthogonal Legendre polynomials (LP) of order 3 to 5, under the assumptions of homogeneity and heterogeneity of residual variance (VR), defined respectively by one (ME=1) or four classes of temporary measurement error (ME=4) across the lactation period. The 305-day cumulative milk yield (P305) was also fitted with an animal model, which gave a heritability estimate of 0.22. The heritability and repeatability estimates for PLC were 0.27 and 0.76, respectively. Heritability estimates with the Ali & Schaeffer function (FAS) and the Wilmink function (FW) reached 0.59 and 0.74, respectively, at the beginning of lactation and decreased to values near 0.20 at the end of the period. Except for the fifth-order LP with ME=1, heritability estimates decreased from 0.70 at the beginning to 0.30 at the end of lactation. Lower VR estimates were obtained for FAS than for FW under both the homogeneity and heterogeneity of variance assumptions. At all stages of lactation, VR estimates decreased with increasing LP order and depended on the assumption about ME. Estimates of genetic and permanent environmental variances showed no trend with increasing LP order, and no significant differences were observed in these estimates under the assumption of heterogeneity of VR across lactation. Higher genetic correlations between test-day yields were obtained with FW, which also showed higher
van Tricht, Mirjam J; Bour, Lo J; Koelman, Johannes H T M; Derks, Eske M; Braff, David L; de Wilde, Odette M; Boerée, Thijs; Linszen, Don H; de Haan, Lieuwe; Nieman, Dorien H
2015-04-01
We aimed to determine profiles of information processing deficits in the pathway to first psychosis. Sixty-one subjects at ultrahigh risk (UHR) for psychosis were assessed, of whom 18 converted to a first episode of psychosis (FEP) within the follow-up period. Additionally, 47 FEP and 30 control subjects were included. Using 10 neurophysiological parameters associated with information processing, latent class analyses yielded three classes at baseline. Class membership was related to group status. Within the UHR sample, two classes were found. Transition to psychosis was nominally associated with class membership. Neurophysiological profiles were unstable over time, but associations between specific neurophysiological components at baseline and follow-up were found. We conclude that certain constellations of neurophysiological variables aid in the differentiation between controls and patients in the prodrome and after first psychosis. Copyright © 2014 Society for Psychophysiological Research.
Augmenting Data with Published Results in Bayesian Linear Regression
de Leeuw, Christiaan; Klugkist, Irene
2012-01-01
In most research, linear regression analyses are performed without taking into account published results (i.e., reported summary statistics) of similar previous studies. Although the prior density in Bayesian linear regression could accommodate such prior knowledge, formal models for doing so are absent from the literature. The goal of this…
Predicting Word Reading Ability: A Quantile Regression Study
McIlraith, Autumn L.
2018-01-01
Predictors of early word reading are well established. However, it is unclear if these predictors hold for readers across a range of word reading abilities. This study used quantile regression to investigate predictive relationships at different points in the distribution of word reading. Quantile regression analyses used preschool and…
Advanced statistics: linear regression, part II: multiple linear regression.
Marill, Keith A
2004-01-01
The applications of simple linear regression in medical research are limited, because in most situations, there are multiple relevant predictor variables. Univariate statistical techniques such as simple linear regression use a single predictor variable, and they often may be mathematically correct but clinically misleading. Multiple linear regression is a mathematical technique used to model the relationship between multiple independent predictor variables and a single dependent outcome variable. It is used in medical research to model observational data, as well as in diagnostic and therapeutic studies in which the outcome is dependent on more than one factor. Although the technique generally is limited to data that can be expressed with a linear function, it benefits from a well-developed mathematical framework that yields unique solutions and exact confidence intervals for regression coefficients. Building on Part I of this series, this article acquaints the reader with some of the important concepts in multiple regression analysis. These include multicollinearity, interaction effects, and an expansion of the discussion of inference testing, leverage, and variable transformations to multivariate models. Examples from the first article in this series are expanded on using a primarily graphic, rather than mathematical, approach. The importance of the relationships among the predictor variables and the dependence of the multivariate model coefficients on the choice of these variables are stressed. Finally, concepts in regression model building are discussed.
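Multicollinearity, one of the concepts introduced above, is commonly quantified with variance inflation factors (VIFs). This sketch uses simulated predictors, one pair deliberately correlated, plus an interaction term; the coefficients and names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 300

# Three predictors, two of them correlated, plus an interaction term.
x1 = rng.normal(size=n)
x2 = 0.8 * x1 + 0.6 * rng.normal(size=n)        # correlated with x1
x3 = rng.normal(size=n)
y = 1.0 + 2.0 * x1 - 1.0 * x2 + 0.5 * x1 * x3 + rng.normal(0.0, 1.0, n)

X = np.column_stack([np.ones(n), x1, x2, x3, x1 * x3])
beta = np.linalg.lstsq(X, y, rcond=None)[0]

def vif(X, j):
    """Variance inflation factor: regress column j on the other columns."""
    others = np.delete(X, j, axis=1)
    fitted = others @ np.linalg.lstsq(others, X[:, j], rcond=None)[0]
    r2 = 1 - np.sum((X[:, j] - fitted) ** 2) / np.sum((X[:, j] - X[:, j].mean()) ** 2)
    return float(1.0 / (1.0 - r2))

print(np.round(beta, 2))                          # near the true (1, 2, -1, 0, 0.5)
print(round(vif(X, 1), 1), round(vif(X, 3), 1))   # collinear vs independent predictor
```

The inflated VIF for x1 reflects the dependence of each coefficient on which other predictors are in the model, the point stressed in the article.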
Directory of Open Access Journals (Sweden)
Asres Berhan
The development of tipranavir and darunavir, second-generation non-peptidic HIV protease inhibitors with markedly improved resistance profiles, has opened a new perspective on the treatment of antiretroviral therapy (ART)-experienced HIV patients with poor viral load control. The aim of this study was to determine the virologic response in ART-experienced patients to tipranavir-ritonavir and darunavir-ritonavir based regimens. A computer-based literature search was conducted in the databases of HINARI (Health InterNetwork Access to Research Initiative), Medline and the Cochrane library. Meta-analysis was performed by including randomized controlled studies that were conducted in ART-experienced patients with plasma viral load above 1,000 copies HIV RNA/ml. The odds ratios and 95% confidence intervals (CI) for viral loads of <50 copies and <400 copies HIV RNA/ml at the end of the intervention were determined by the random effects model. Meta-regression, sensitivity analysis and funnel plots were done. The number of HIV-1 patients who were on either a tipranavir-ritonavir or darunavir-ritonavir based regimen and achieved a viral load of less than 50 copies HIV RNA/ml was significantly higher (overall OR = 3.4; 95% CI, 2.61-4.52) than the number of HIV-1 patients who were on investigator-selected boosted comparator HIV-1 protease inhibitors (CPIs-ritonavir). Similarly, the number of patients with a viral load of less than 400 copies HIV RNA/ml was significantly higher in either the tipranavir-ritonavir or darunavir-ritonavir based regimen treated group (overall OR = 3.0; 95% CI, 2.15-4.11). Meta-regression showed that the viral load reduction was independent of baseline viral load, baseline CD4 count and duration of the tipranavir-ritonavir or darunavir-ritonavir based regimen. Tipranavir and darunavir based regimens were more effective in patients who were ART-experienced and had poor viral load control. Further studies are required to determine their consistent
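The random-effects pooling used for the odds ratios can be sketched with the DerSimonian-Laird estimator. The per-trial odds ratios and variances below are invented for illustration, not the values from this meta-analysis.

```python
import numpy as np

def dl_pool(log_or, var):
    """DerSimonian-Laird random-effects pooling of log odds ratios."""
    w = 1.0 / var
    fixed = np.sum(w * log_or) / np.sum(w)
    q = np.sum(w * (log_or - fixed) ** 2)          # Cochran's Q
    k = len(log_or)
    tau2 = max(0.0, (q - (k - 1)) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))
    w_star = 1.0 / (var + tau2)                    # weights with heterogeneity
    pooled = np.sum(w_star * log_or) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    return float(pooled), float(se)

# Hypothetical per-trial odds ratios and variances of their logs.
or_trials = np.array([2.8, 3.5, 4.1, 2.9, 3.6])
var_trials = np.array([0.09, 0.12, 0.15, 0.08, 0.10])

pooled, se = dl_pool(np.log(or_trials), var_trials)
lo, hi = float(np.exp(pooled - 1.96 * se)), float(np.exp(pooled + 1.96 * se))
print(round(float(np.exp(pooled)), 2), round(lo, 2), round(hi, 2))
```

When the between-trial heterogeneity estimate tau² is zero, the random-effects result collapses to the fixed-effect (inverse-variance) pooled estimate.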
Parsa, Mohammad; Maghsoudi, Abbas
2018-04-01
The Behabad district, located in the central Iranian microcontinent, contains numerous epigenetic stratabound carbonate-hosted Zn-Pb ore bodies. The mineralization formed as fault, fracture and karst fillings in the Permian-Triassic formations, especially in Middle Triassic dolostones, and comprises mainly non-sulfide zinc ores. These are all interpreted as Mississippi Valley-type (MVT) base metal deposits. From an economic geology point of view, it is imperative to recognize the processes that have plausibly controlled the emplacement of MVT Zn-Pb mineralization in the Behabad district. To address this issue, analyses of the spatial distribution of mineral deposits, comprising Fry and fractal techniques, and analysis of the spatial association of mineral deposits with geological features, using distance distribution analysis, were applied to assess the regional-scale processes that could have operated in the distribution of MVT Zn-Pb deposits in the district. The results of these analytical techniques show that the main trends of the occurrences are NW-SE and NE-SW, parallel or subparallel to the major northwest- and northeast-trending faults, supporting the idea that these particular faults could have acted as the main conduits for transport of mineral-bearing fluids. The results also suggest that Permian-Triassic brittle carbonate sedimentary rocks have served as the lithological controls on MVT mineralization in the Behabad district, as they are spatially and temporally associated with mineralization.
Logistic regression applied to natural hazards: rare event logistic regression with replications
Guns, M.; Vanacker, Veerle
2012-01-01
Statistical analysis of natural hazards needs particular attention, as most of these phenomena are rare events. This study shows that the ordinary rare event logistic regression, as it is now commonly used in geomorphologic studies, does not always lead to a robust detection of controlling factors, as the results can be strongly sample-dependent. In this paper, we introduce some concepts of Monte Carlo simulations in rare event logistic regression. This technique, so-called rare event logisti...
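The replication idea can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the data are simulated, and the equal-size resampling of the non-event class is an assumption chosen for the example.

```python
# Sketch of "rare event logistic regression with replications": keep all rare
# events, repeatedly subsample the abundant non-event class, refit a logistic
# model each time, and inspect the spread of the fitted coefficients.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Simulated rare-event data: 2000 stable cells, 60 event (e.g. landslide) cells.
n0, n1 = 2000, 60
x0 = rng.normal(0.0, 1.0, n0)          # hypothetical covariate, non-events
x1 = rng.normal(1.5, 1.0, n1)          # events occur at higher covariate values

def fit_logistic(x, y):
    """Maximum-likelihood logistic fit (intercept + one covariate)."""
    X = np.column_stack([np.ones_like(x), x])
    def nll(beta):
        z = X @ beta
        return np.sum(np.log1p(np.exp(z)) - y * z)
    return minimize(nll, np.zeros(2), method="BFGS").x

# Replications: all events plus an equal-sized random sample of non-events.
slopes = []
for _ in range(200):
    idx = rng.choice(n0, size=n1, replace=False)
    x = np.concatenate([x0[idx], x1])
    y = np.concatenate([np.zeros(n1), np.ones(n1)])
    slopes.append(fit_logistic(x, y)[1])

slopes = np.array(slopes)
print(f"slope: mean={slopes.mean():.2f}, sd={slopes.std():.2f}")
```

The spread of `slopes` across replications is exactly the sample-dependence the abstract warns about: a single subsample can give a misleading coefficient.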
Quantile Regression With Measurement Error
Wei, Ying; Carroll, Raymond J.
2009-01-01
. The finite sample performance of the proposed method is investigated in a simulation study, and compared to the standard regression calibration approach. Finally, we apply our methodology to part of the National Collaborative Perinatal Project growth data, a
From Rasch scores to regression
DEFF Research Database (Denmark)
Christensen, Karl Bang
2006-01-01
Rasch models provide a framework for measurement and modelling of latent variables. Having measured a latent variable in a population, a comparison of groups will often be of interest. For this purpose the use of observed raw scores will often be inadequate because these lack interval scale properties.... This paper compares two approaches to group comparison: linear regression models using estimated person locations as outcome variables, and latent regression models based on the distribution of the score....
Testing Heteroscedasticity in Robust Regression
Czech Academy of Sciences Publication Activity Database
Kalina, Jan
2011-01-01
Roč. 1, č. 4 (2011), s. 25-28 ISSN 2045-3345 Grant - others:GA ČR(CZ) GA402/09/0557 Institutional research plan: CEZ:AV0Z10300504 Keywords: robust regression * heteroscedasticity * regression quantiles * diagnostics Subject RIV: BB - Applied Statistics, Operational Research http://www.researchjournals.co.uk/documents/Vol4/06%20Kalina.pdf
Regression methods for medical research
Tai, Bee Choo
2013-01-01
Regression Methods for Medical Research provides medical researchers with the skills they need to critically read and interpret research using more advanced statistical methods. The statistical requirements of interpreting and publishing in medical journals, together with rapid changes in science and technology, increasingly demands an understanding of more complex and sophisticated analytic procedures.The text explains the application of statistical models to a wide variety of practical medical investigative studies and clinical trials. Regression methods are used to appropriately answer the
Forecasting with Dynamic Regression Models
Pankratz, Alan
2012-01-01
One of the most widely used tools in statistical forecasting, single-equation regression models, is examined here. A companion to the author's earlier work, Forecasting with Univariate Box-Jenkins Models: Concepts and Cases, the present text pulls together recent time series ideas and gives special attention to possible intertemporal patterns, distributed lag responses of output to input series, and the autocorrelation patterns of regression disturbances. It also includes six case studies.
Poisson Mixture Regression Models for Heart Disease Prediction.
Mufudza, Chipo; Erol, Hamza
2016-01-01
Early heart disease control can be achieved by high disease prediction and diagnosis efficiency. This paper focuses on the use of model-based clustering techniques to predict and diagnose heart disease via Poisson mixture regression models. Analysis and application of Poisson mixture regression models are here addressed under two different classes: standard and concomitant variable mixture regression models. Results show that a two-component concomitant variable Poisson mixture regression model predicts heart disease better than both the standard Poisson mixture regression model and the ordinary general linear Poisson regression model due to its low Bayesian Information Criterion value. Furthermore, a zero-inflated Poisson mixture regression model turned out to be the best model for heart disease prediction over all models, as it both clusters individuals into high- or low-risk categories and predicts the rate of heart disease componentwise given the available clusters. It is deduced that heart disease prediction can be effectively done by identifying the major risks componentwise using Poisson mixture regression models.
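As a stripped-down illustration of the mixture idea (without covariates, so this is a plain two-component Poisson mixture rather than the paper's full mixture regression), an EM fit might look like the following; the rates, sample sizes and starting values are invented for the example.

```python
# Minimal EM algorithm for a two-component Poisson mixture: alternate between
# computing each observation's responsibility (E-step) and re-estimating the
# mixing weight and component rates (M-step).
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(1)

# Simulated "low risk" (rate 2) and "high risk" (rate 9) subpopulations.
y = np.concatenate([rng.poisson(2.0, 600), rng.poisson(9.0, 400)])

pi, lam = 0.5, np.array([1.0, 5.0])          # initial guesses
for _ in range(200):
    # E-step: posterior probability that each count came from component 2.
    p1 = (1 - pi) * poisson.pmf(y, lam[0])
    p2 = pi * poisson.pmf(y, lam[1])
    r = p2 / (p1 + p2)
    # M-step: responsibility-weighted updates of the weight and the rates.
    pi = r.mean()
    lam = np.array([np.sum((1 - r) * y) / np.sum(1 - r),
                    np.sum(r * y) / np.sum(r)])

print(round(pi, 2), np.round(lam, 2))   # mixing weight and rates near truth
```

A concomitant-variable version would additionally let `pi` and `lam` depend on covariates through link functions, which is what distinguishes the models compared in the abstract.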
Directory of Open Access Journals (Sweden)
Lee Myoungsook
2011-01-01
Full Text Available Abstract Background Probucol, a cholesterol-lowering agent that paradoxically also lowers high-density lipoprotein cholesterol, has been shown to prevent progression of atherosclerosis. The antiplatelet agent cilostazol, which has diverse antiatherogenic properties, has also been shown to reduce restenosis in previous clinical trials. Recent experimental studies have suggested potential synergy between probucol and cilostazol in preventing atherosclerosis, possibly by suppressing inflammatory reactions and promoting cholesterol efflux. Methods/design The Synergistic Effect of combination therapy with Cilostazol and probUcol on plaque stabilization and lesion REgression (SECURE) study is designed as a double-blind, randomised, controlled, multicenter clinical trial to investigate the effect of cilostazol and probucol combination therapy on plaque volume and composition in comparison with cilostazol monotherapy using intravascular ultrasound and Virtual Histology. The primary end point is the change in the plaque volume of index intermediate lesions between baseline and 9-month follow-up. Secondary endpoints include change in plaque composition, neointimal growth after implantation of stents at percutaneous coronary intervention target lesions, and serum levels of lipid components and biomarkers related to atherosclerosis and inflammation. A total of 118 patients will be included in the study. Discussion The SECURE study will deliver important information on the effects of combination therapy on lipid composition and biomarkers related to atherosclerosis, thereby providing insight into the mechanisms underlying the prevention of atherosclerosis progression by cilostazol and probucol. Trial registration number ClinicalTrials (NCT): NCT01031667
Logistic regression for dichotomized counts.
Preisser, John S; Das, Kalyan; Benecha, Habtamu; Stamm, John W
2016-12-01
Sometimes there is interest in a dichotomized outcome indicating whether a count variable is positive or zero. Under this scenario, the application of ordinary logistic regression may result in efficiency loss, which is quantifiable under an assumed model for the counts. In such situations, a shared-parameter hurdle model is investigated for more efficient estimation of regression parameters relating to overall effects of covariates on the dichotomous outcome, while handling count data with many zeroes. One model part provides a logistic regression containing marginal log odds ratio effects of primary interest, while an ancillary model part describes the mean count of a Poisson or negative binomial process in terms of nuisance regression parameters. Asymptotic efficiency of the logistic model parameter estimators of the two-part models is evaluated with respect to ordinary logistic regression. Simulations are used to assess the properties of the models with respect to power and Type I error, the latter investigated under both misspecified and correctly specified models. The methods are applied to data from a randomized clinical trial of three toothpaste formulations to prevent incident dental caries in a large population of Scottish schoolchildren. © The Author(s) 2014.
Producing The New Regressive Left
DEFF Research Database (Denmark)
Crone, Christine
members, this thesis investigates a growing political trend and ideological discourse in the Arab world that I have called The New Regressive Left. On the premise that a media outlet can function as a forum for ideology production, the thesis argues that an analysis of this material can help to trace...... the contexture of The New Regressive Left. If the first part of the thesis lays out the theoretical approach and draws the contextual framework, through an exploration of the surrounding Arab media- and ideoscapes, the second part is an analytical investigation of the discourse that permeates the programmes aired...... becomes clear from the analytical chapters is the emergence of the new cross-ideological alliance of The New Regressive Left. This emerging coalition between Shia Muslims, religious minorities, parts of the Arab Left, secular cultural producers, and the remnants of the political, strategic resistance...
A Matlab program for stepwise regression
Directory of Open Access Journals (Sweden)
Yanhong Qi
2016-03-01
Full Text Available Stepwise linear regression is a multi-variable regression method for identifying statistically significant variables in a linear regression equation. In the present study, we present a Matlab program for stepwise regression.
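A forward stepwise selection loop in the same spirit can be sketched in Python; the partial F-test entry rule, the 0.05 threshold, and the simulated data are illustrative choices, not the published Matlab program.

```python
# Forward stepwise selection: at each step, add the candidate variable with
# the smallest partial F-test p-value, stopping when none passes the threshold.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n, p = 200, 6
X = rng.normal(size=(n, p))
y = 2.0 * X[:, 0] - 1.5 * X[:, 3] + rng.normal(size=n)   # only x0, x3 matter

def rss(Xs, y):
    """Residual sum of squares of an OLS fit."""
    beta, *_ = np.linalg.lstsq(Xs, y, rcond=None)
    return np.sum((y - Xs @ beta) ** 2)

selected, alpha_in = [], 0.05
while True:
    base = np.column_stack([np.ones(n)] + [X[:, j] for j in selected])
    rss0, best = rss(base, y), None
    for j in set(range(p)) - set(selected):
        rss1 = rss(np.column_stack([base, X[:, j]]), y)
        df2 = n - base.shape[1] - 1
        F = (rss0 - rss1) / (rss1 / df2)          # partial F, 1 numerator df
        pval = stats.f.sf(F, 1, df2)
        if pval < alpha_in and (best is None or pval < best[1]):
            best = (j, pval)
    if best is None:
        break
    selected.append(best[0])

print("selected:", sorted(selected))   # the two informative variables
```

Real stepwise implementations usually also include a removal step (backward elimination of variables whose p-value rises after later additions), omitted here for brevity.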
Correlation and simple linear regression.
Zou, Kelly H; Tuncali, Kemal; Silverman, Stuart G
2003-06-01
In this tutorial article, the concepts of correlation and regression are reviewed and demonstrated. The authors review and compare two correlation coefficients, the Pearson correlation coefficient and the Spearman rho, for measuring linear and nonlinear relationships between two continuous variables. In the case of measuring the linear relationship between a predictor and an outcome variable, simple linear regression analysis is conducted. These statistical concepts are illustrated by using a data set from published literature to assess a computed tomography-guided interventional technique. These statistical methods are important for exploring the relationships between variables and can be applied to many radiologic studies.
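The two correlation coefficients and a simple linear fit can be computed directly with SciPy. The simulated monotone, strongly nonlinear data below are an illustrative stand-in for the article's CT data set.

```python
# Pearson vs. Spearman on a monotone but nonlinear relation, plus a simple
# linear regression fit.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
x = rng.uniform(0, 3, 100)
y = np.exp(2 * x) + rng.normal(0, 0.5, 100)   # monotone, strongly nonlinear

r_p, _ = stats.pearsonr(x, y)     # linear association
r_s, _ = stats.spearmanr(x, y)    # rank-based, monotone association
# Spearman's rho is higher here because the relation is monotone but curved.
print(f"Pearson r = {r_p:.2f}, Spearman rho = {r_s:.2f}")

fit = stats.linregress(x, y)      # simple linear regression
print(f"slope = {fit.slope:.2f}, intercept = {fit.intercept:.2f}")
```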
Regression filter for signal resolution
International Nuclear Information System (INIS)
Matthes, W.
1975-01-01
The problem considered is that of resolving a measured pulse height spectrum of a material mixture, e.g. a gamma ray spectrum or Raman spectrum, into a weighted sum of the spectra of the individual constituents. The model on which the analytical formulation is based is described. The problem reduces to that of a multiple linear regression. A stepwise linear regression procedure was constructed. The efficiency of this method was then tested by implementing the procedure in a computer programme which was used to unfold test spectra obtained by mixing some spectra, from a library of arbitrarily chosen spectra, and adding a noise component. (U.K.)
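The unfolding problem can be illustrated with synthetic spectra. Here non-negative least squares stands in for the paper's stepwise procedure (component weights cannot be negative in a physical mixture); the Gaussian peak shapes and mixing weights are invented for the example.

```python
# Resolve a noisy measured spectrum into a weighted sum of library spectra.
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(4)
channels = np.arange(100)

def peak(center, width):
    """Hypothetical constituent spectrum: a single Gaussian peak."""
    return np.exp(-0.5 * ((channels - center) / width) ** 2)

# Library of three constituent spectra (one per column).
library = np.column_stack([peak(20, 5), peak(50, 8), peak(75, 4)])

true_w = np.array([3.0, 1.5, 0.0])                        # third absent
measured = library @ true_w + rng.normal(0, 0.02, 100)    # add noise

w, resid = nnls(library, measured)      # non-negative least squares unfold
print("estimated weights:", np.round(w, 2))   # close to [3.0, 1.5, 0.0]
```

Like the stepwise procedure, this correctly assigns a (near-)zero weight to the constituent that is absent from the mixture.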
Nonparametric Mixture of Regression Models.
Huang, Mian; Li, Runze; Wang, Shaoli
2013-07-01
Motivated by an analysis of US house price index data, we propose nonparametric finite mixture of regression models. We study the identifiability issue of the proposed models, and develop an estimation procedure by employing kernel regression. We further systematically study the sampling properties of the proposed estimators, and establish their asymptotic normality. A modified EM algorithm is proposed to carry out the estimation procedure. We show that our algorithm preserves the ascent property of the EM algorithm in an asymptotic sense. Monte Carlo simulations are conducted to examine the finite sample performance of the proposed estimation procedure. An empirical analysis of the US house price index data is illustrated for the proposed methodology.
Directory of Open Access Journals (Sweden)
Lidy M Pelsser
Full Text Available Attention-deficit/hyperactivity disorder (ADHD) is a debilitating mental health problem hampering the child's development. The underlying causes include both genetic and environmental factors and may differ between individuals. The efficacy of diet treatments in ADHD was recently evaluated in three reviews, reporting divergent and confusing conclusions based on heterogeneous studies and subjects. To address this inconsistency we conducted a systematic review of meta-analyses of double-blind placebo-controlled trials evaluating the effect of diet interventions (elimination and supplementation) on ADHD. Our literature search resulted in 14 meta-analyses, six of which were confined to double-blind placebo-controlled trials applying homogeneous diet interventions, i.e. artificial food color (AFC) elimination, a few-foods diet (FFD) and poly-unsaturated fatty acid (PUFA) supplementation. Effect sizes (ES) and confidence intervals (CI) of study outcomes were depicted in a forest plot. I2 was calculated to assess heterogeneity if necessary and additional random effects subgroup meta-regression was conducted if substantial heterogeneity was present. The AFC ESs were 0.44 (95% CI: 0.16-0.72, I2 = 11%) and 0.21 (95% CI: -0.02-0.43, I2 = 68%) [parent ratings], 0.08 (95% CI: -0.07-0.24, I2 = 0%) [teacher ratings] and 0.11 (95% CI: -0.13-0.34, I2 = 12%) [observer ratings]. The FFD ESs were 0.80 (95% CI: 0.41-1.19, I2 = 61%) [parent ratings] and 0.51 (95% CI: -0.02-1.04, I2 = 72%) [other ratings], while the PUFA ESs were 0.17 (95% CI: -0.03-0.38, I2 = 38%) [parent ratings], -0.05 (95% CI: -0.27-0.18, I2 = 0%) [teacher ratings] and 0.16 (95% CI: 0.01-0.31, I2 = 0%) [parent and teacher ratings]. Three meta-analyses (two FFD and one AFC) resulted in high I2 without presenting subgroup results. The FFD meta-analyses provided sufficient data to perform subgroup analyses on intervention type, resulting in a decrease of heterogeneity to 0% (diet design) and 37.8% (challenge design)
Farooq, Sabiha; Mazhar, Wardah; Siddiqui, Amna Jabbar; Ansari, Saqib Hussain; Musharraf, Syed Ghulam
2018-01-31
β-Thalassemia is one of the most common inherited disorders and is widely distributed throughout the world. Owing to severe deficiencies in red blood cell production, blood transfusion is required to correct anemia for normal growth and development, but causes additional complications owing to iron overload. The aim of this study is to quantify the biometal dysregulations in β-thalassemia patients as compared with healthy controls. A total of 17 elements were analyzed in serum samples of β-thalassemia patients and healthy controls using ICP-MS followed by chemometric analyses. Out of these analyzed elements, 14 showed a significant difference between healthy and disease groups at p 3. A PLS-DA model revealed an excellent separation with 89.8% sensitivity and 97.2% specificity, and the overall accuracy of the model was 92.2%. This metallomic study revealed that there is a major difference in the metallomic profiling of β-thalassemia patients, specifically in Co, Mn, Ni, V and Ba, whereas the fold changes in Co, Mn, V and Ba were found to be greater than that in Fe, providing evidence that, in addition to Fe, other metals are also altered significantly and therefore chelation therapy for other metals may also be needed in β-thalassemia patients. Copyright © 2018 John Wiley & Sons, Ltd.
Martin, José Luis R; Pérez, Víctor; Sacristán, Montse; Alvarez, Enric
2005-12-01
Systematic reviews in mental health have become useful tools for health professionals in view of the massive amount and heterogeneous nature of biomedical information available today. In order to determine the risk of bias in the studies evaluated, and to avoid bias in generalizing conclusions from the reviews, it is therefore important to use a very strict methodology in systematic reviews. One bias which may affect the generalization of results is publication bias, which is determined by the nature and direction of the study results. To control or minimize this type of bias, the authors of systematic reviews undertake comprehensive searches of medical databases and expand on the findings, often undertaking searches of grey literature (material which is not formally published). This paper attempts to show the consequences (and risk) of generalizing the implications of grey literature in the control of publication bias, as was proposed in a recent systematic work. By repeating the analyses for the same outcome from three different systematic reviews that included both published and grey literature, our results showed that confusion between grey literature and publication bias may affect the results of a concrete meta-analysis.
Cactus: An Introduction to Regression
Hyde, Hartley
2008-01-01
When the author first used "VisiCalc," he thought it a very useful tool when he had the formulas. But how could he design a spreadsheet if there was no known formula for the quantities he was trying to predict? A few months later, the author relates, he learned to use multiple linear regression software and suddenly it all clicked into…
Regression Models for Repairable Systems
Czech Academy of Sciences Publication Activity Database
Novák, Petr
2015-01-01
Roč. 17, č. 4 (2015), s. 963-972 ISSN 1387-5841 Institutional support: RVO:67985556 Keywords : Reliability analysis * Repair models * Regression Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.782, year: 2015 http://library.utia.cas.cz/separaty/2015/SI/novak-0450902.pdf
Survival analysis II: Cox regression
Stel, Vianda S.; Dekker, Friedo W.; Tripepi, Giovanni; Zoccali, Carmine; Jager, Kitty J.
2011-01-01
In contrast to the Kaplan-Meier method, Cox proportional hazards regression can provide an effect estimate by quantifying the difference in survival between patient groups and can adjust for confounding effects of other variables. The purpose of this article is to explain the basic concepts of the
Kernel regression with functional response
Ferraty, Frédéric; Laksaci, Ali; Tadj, Amel; Vieu, Philippe
2011-01-01
We consider kernel regression estimate when both the response variable and the explanatory one are functional. The rates of uniform almost complete convergence are stated as function of the small ball probability of the predictor and as function of the entropy of the set on which uniformity is obtained.
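For a scalar (rather than functional) response, the kernel regression estimate reduces to the familiar Nadaraya-Watson weighted average, sketched here with an assumed Gaussian kernel and a hand-picked bandwidth; the paper's functional-response setting generalises the same weighting idea.

```python
# Nadaraya-Watson kernel regression: the fitted value at x0 is a kernel-
# weighted average of the observed responses near x0.
import numpy as np

rng = np.random.default_rng(5)
x = rng.uniform(0, 2 * np.pi, 300)
y = np.sin(x) + rng.normal(0, 0.2, 300)    # noisy observations of sin(x)

def nw_estimate(x0, x, y, h=0.3):
    """Gaussian-kernel weighted average of responses around x0."""
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)
    return np.sum(w * y) / np.sum(w)

grid = np.linspace(0.5, 5.5, 11)           # interior points, away from edges
fitted = np.array([nw_estimate(g, x, y) for g in grid])
err = np.max(np.abs(fitted - np.sin(grid)))
print(f"max absolute error on grid: {err:.3f}")
```

The bandwidth `h` plays the role of the "small ball" radius in the paper's rate results: smaller `h` lowers bias but raises variance as fewer observations fall in each ball.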
Targeting: Logistic Regression, Special Cases and Extensions
Directory of Open Access Journals (Sweden)
Helmut Schaeben
2014-12-01
Full Text Available Logistic regression is a classical linear model for logit-transformed conditional probabilities of a binary target variable. It recovers the true conditional probabilities if the joint distribution of predictors and the target is of log-linear form. Weights-of-evidence is an ordinary logistic regression with parameters equal to the differences of the weights of evidence if all predictor variables are discrete and conditionally independent given the target variable. The hypothesis of conditional independence can be tested in terms of log-linear models. If the assumption of conditional independence is violated, the application of weights-of-evidence corrupts not only the predicted conditional probabilities but also their rank transform. Logistic regression models including interaction terms can account for the lack of conditional independence; appropriate interaction terms compensate exactly for violations of conditional independence. Multilayer artificial neural nets may be seen as nested regression-like models, with some sigmoidal activation function. Most often, the logistic function is used as the activation function. If the net topology, i.e., its control, is sufficiently versatile to mimic interaction terms, artificial neural nets are able to account for violations of conditional independence and yield very similar results. Weights-of-evidence cannot reasonably include interaction terms; subsequent modifications of the weights, as often suggested, cannot emulate the effect of interaction terms.
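The weights-of-evidence/logistic-regression correspondence can be checked numerically in the simplest case of one binary predictor, where the fitted logistic slope should equal the contrast W+ - W- (the log odds ratio of the 2x2 table). The simulation below is an illustration we constructed, not an example from the paper.

```python
# For a single binary evidence layer, the maximum-likelihood logistic slope
# coincides with the weights-of-evidence contrast W+ - W-.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(6)
n = 5000
b = rng.binomial(1, 0.3, n)                   # binary evidence layer
p = 1 / (1 + np.exp(-(-2.0 + 1.2 * b)))       # true logistic model
t = rng.binomial(1, p)                        # binary target (e.g. deposit)

# Weights of evidence: W+ = log P(b=1|t=1)/P(b=1|t=0), W- analogously.
w_plus = np.log(b[t == 1].mean() / b[t == 0].mean())
w_minus = np.log((1 - b[t == 1].mean()) / (1 - b[t == 0].mean()))

# Logistic regression fitted by maximum likelihood.
X = np.column_stack([np.ones(n), b])
def nll(beta):
    z = X @ beta
    return np.sum(np.log1p(np.exp(z)) - t * z)
slope = minimize(nll, np.zeros(2), method="BFGS").x[1]

print(w_plus - w_minus, slope)   # agree up to optimiser tolerance
```

With several conditionally dependent layers this equality breaks down, which is exactly the failure mode the abstract attributes to weights-of-evidence.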
Linear regression and the normality assumption.
Schmidt, Amand F; Finan, Chris
2017-12-16
Researchers often perform arbitrary outcome transformations to fulfill the normality assumption of a linear regression model. This commentary explains and illustrates that in large data settings, such transformations are often unnecessary, and worse, may bias model estimates. Linear regression assumptions are illustrated using simulated data and an empirical example on the relation between time since type 2 diabetes diagnosis and glycated hemoglobin levels. Simulation results were evaluated on coverage, i.e., the number of times the 95% confidence interval included the true slope coefficient. Although outcome transformations bias point estimates, violations of the normality assumption in linear regression analyses do not. The normality assumption is necessary to unbiasedly estimate standard errors, and hence confidence intervals and P-values. However, in large sample sizes (e.g., where the number of observations per variable is >10) violations of this normality assumption often do not noticeably impact results. In contrast, assumptions on the parametric model, absence of extreme observations, homoscedasticity, and independence of the errors remain influential even in large sample size settings. Given that modern healthcare research typically includes thousands of subjects, focusing on the normality assumption is often unnecessary, does not guarantee valid results, and worse, may bias estimates due to the practice of outcome transformations. Copyright © 2017 Elsevier Inc. All rights reserved.
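A coverage simulation in the spirit of the commentary can be run in a few lines. The parameters here (mean-zero exponential errors, n = 200, 1000 replicates) are our own illustrative choices, not the article's simulation design.

```python
# Despite clearly non-normal (skewed) errors, the 95% CI for the OLS slope
# covers the true value close to the nominal rate at a moderate sample size.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
n, reps, true_slope = 200, 1000, 1.0
covered = 0
for _ in range(reps):
    x = rng.normal(size=n)
    eps = rng.exponential(1.0, n) - 1.0        # skewed, mean-zero errors
    y = true_slope * x + eps
    fit = stats.linregress(x, y)
    half = stats.t.ppf(0.975, n - 2) * fit.stderr
    covered += (fit.slope - half) <= true_slope <= (fit.slope + half)

coverage = covered / reps
print(f"empirical coverage: {coverage:.3f}")   # close to the nominal 0.95
```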
Kamphuis, P J G H; Verhey, F R J; Olde Rikkert, M G M; Twisk, J W R; Swinkels, S H N; Scheltens, P
2011-08-01
To investigate the effect of a medical food (Souvenaid) on body mass index (BMI) and functional abilities in patients with mild Alzheimer's disease (AD). DESIGN/SETTING/PARTICIPANTS/INTERVENTION/MEASUREMENTS: These analyses were performed on data from a 12-week, double-blind, randomized, controlled, multicenter, proof-of-concept study with a similarly designed and exploratory 12-week extension period. Patients with mild AD (Mini-Mental State Examination score of 20-26) were randomized to receive either the active product or an iso-caloric control product. While primary outcomes included measures of cognition, the 23-item Alzheimer's Disease Cooperative Study-Activities of Daily Living (ADCS-ADL) scale was included as a secondary outcome. Both ADCS-ADL and BMI were assessed at baseline and Weeks 6, 12 and 24. Data were analyzed using a repeated-measures mixed model. Overall, data suggested an increased BMI in the active versus the control group at Week 24 (ITT: p = 0.07; PP: p = 0.03), but no treatment effect on ADCS-ADL was observed. However, baseline BMI was found to be a significant treatment effect modifier (ITT: p = 0.04; PP: p = 0.05), and an increase in ADCS-ADL was observed at Week 12 in patients with a 'low' baseline BMI (ITT: p = 0.02; PP: p = 0.04). These data indicate that baseline BMI significantly impacts the effect of Souvenaid on functional abilities. In addition, there was a suggestion that Souvenaid increased BMI.
Directory of Open Access Journals (Sweden)
Kaatje Bollaerts
Full Text Available An increase in narcolepsy cases was observed in Finland and Sweden towards the end of the 2009 H1N1 influenza pandemic. Preliminary observational studies suggested a temporal link with the pandemic influenza vaccine Pandemrix™, leading to a number of additional studies across Europe. Given the public health urgency, these studies used readily available retrospective data from various sources. The potential for bias in such settings was generally acknowledged. Although generally advocated by key opinion leaders and international health authorities, no systematic quantitative assessment of the potential joint impact of biases was undertaken in any of these studies. We applied bias-level multiple-bias analyses to two of the published narcolepsy studies: a pediatric cohort study from Finland and a case-control study from France. In particular, we developed Monte Carlo simulation models to evaluate a potential cascade of biases, including confounding by age, by indication and by natural H1N1 infection, selection bias, and disease and exposure misclassification. All bias parameters were evidence-based to the extent possible. Given the assumptions used for confounding, selection bias and misclassification, the Finnish rate ratio of 13.78 (95% CI: 5.72-28.11) reduced to a median value of 6.06 (2.5th-97.5th percentile: 2.49-15.1) and the French odds ratio of 5.43 (95% CI: 2.6-10.08) to 1.85 (2.5th-97.5th percentile: 0.85-4.08). We illustrate multiple-bias analyses using two studies on the Pandemrix™-narcolepsy association and advocate their use to better understand the robustness of study findings. Based on our multiple-bias models, the observed Pandemrix™-narcolepsy association consistently persists in the Finnish study. For the French study, the results of our multiple-bias models were inconclusive.
Energy Technology Data Exchange (ETDEWEB)
Kumaraswamy, Raji; Ebert, Sara; Fedorak, Phillip M.; Foght, Julia M. [Alberta Univ., Edmonton, AB (Canada). Biological Sciences; Gray, Murray R. [Alberta Univ., Edmonton, AB (Canada). Chemical and Materials Engineering
2011-03-15
Nitrate injection into oil fields is an alternative to biocide addition for controlling sulfide production ('souring') caused by sulfate-reducing bacteria (SRB). This study examined the suitability of several cultivation-dependent and cultivation-independent methods to assess potential microbial activities (sulfidogenesis and nitrate reduction) and the impact of nitrate amendment on oil field microbiota. Microcosms containing produced waters from two Western Canadian oil fields exhibited sulfidogenesis that was inhibited by nitrate amendment. Most probable number (MPN) and fluorescent in situ hybridization (FISH) analyses of uncultivated produced waters showed low cell numbers (≤10³ MPN/ml) dominated by SRB (>95% relative abundance). MPN analysis also detected nitrate-reducing sulfide-oxidizing bacteria (NRSOB) and heterotrophic nitrate-reducing bacteria (HNRB) at numbers too low to be detected by FISH or denaturing gradient gel electrophoresis (DGGE). In microcosms containing produced water fortified with sulfate, near-stoichiometric concentrations of sulfide were produced. FISH analyses of the microcosms after 55 days of incubation revealed that Gammaproteobacteria increased from undetectable levels to 5-20% abundance, resulting in a decreased proportion of Deltaproteobacteria (50-60% abundance). DGGE analysis confirmed the presence of Delta- and Gammaproteobacteria and also detected Bacteroidetes. When sulfate-fortified produced waters were amended with nitrate, sulfidogenesis was inhibited and Deltaproteobacteria decreased to levels undetectable by FISH, with a concomitant increase in Gammaproteobacteria from below detection to 50-60% abundance. DGGE analysis of these microcosms yielded sequences of Gamma- and Epsilonproteobacteria related to presumptive HNRB and NRSOB (Halomonas, Marinobacterium, Marinobacter, Pseudomonas and Arcobacter), thus supporting chemical data indicating that nitrate-reducing bacteria out-compete SRB when nitrate is
Hoos, Anne B.; McMahon, Gerard
2009-01-01
Understanding how nitrogen transport across the landscape varies with landscape characteristics is important for developing sound nitrogen management policies. We used a spatially referenced regression analysis (SPARROW) to examine landscape characteristics influencing delivery of nitrogen from sources in a watershed to stream channels. Modelled landscape delivery ratio varies widely (by a factor of 4) among watersheds in the southeastern United States—higher in the western part (Tennessee, Alabama, and Mississippi) than in the eastern part, and the average value for the region is lower compared to other parts of the nation. When we model landscape delivery ratio as a continuous function of local-scale landscape characteristics, we estimate a spatial pattern that varies as a function of soil and climate characteristics but exhibits spatial structure in residuals (observed load minus predicted load). The spatial pattern of modelled landscape delivery ratio and the spatial pattern of residuals coincide spatially with Level III ecoregions and also with hydrologic landscape regions. Subsequent incorporation into the model of these frameworks as regional scale variables improves estimation of landscape delivery ratio, evidenced by reduced spatial bias in residuals, and suggests that cross-scale processes affect nitrogen attenuation on the landscape. The model-fitted coefficient values are logically consistent with the hypothesis that broad-scale classifications of hydrologic response help to explain differential rates of nitrogen attenuation, controlling for local-scale landscape characteristics. Negative model coefficients for hydrologic landscape regions where the primary flow path is shallow ground water suggest that a lower fraction of nitrogen mass will be delivered to streams; this relation is reversed for regions where the primary flow path is overland flow.
DEFF Research Database (Denmark)
Nielsen, Peter Carøe; Hansen, Hans Nørgaard; Olsen, Flemming Ove
2007-01-01
The quantitative and qualitative description of laser beam characteristics is important for process implementation and optimisation. In particular, a need for quantitative characterisation of beam diameter was identified when using fibre lasers for micro manufacturing. Here the beam diameter limits the obtainable features in direct laser machining as well as heat affected zones in welding processes. This paper describes the development of a measuring unit capable of analysing beam shape and diameter of lasers to be used in manufacturing processes. The analyser is based on the principle of a rotating mechanical wire being swept through the laser beam at varying Z-heights. The reflected signal is analysed and the resulting beam profile determined. The development comprised the design of a flexible fixture capable of providing both rotation and Z-axis movement, control software including data capture...
Quantile Regression With Measurement Error
Wei, Ying
2009-08-27
Regression quantiles can be substantially biased when the covariates are measured with error. In this paper we propose a new method that produces consistent linear quantile estimation in the presence of covariate measurement error. The method corrects the measurement error induced bias by constructing joint estimating equations that simultaneously hold for all the quantile levels. An iterative EM-type estimation algorithm to obtain the solutions to such joint estimation equations is provided. The finite sample performance of the proposed method is investigated in a simulation study, and compared to the standard regression calibration approach. Finally, we apply our methodology to part of the National Collaborative Perinatal Project growth data, a longitudinal study with an unusual measurement error structure. © 2009 American Statistical Association.
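Setting the measurement-error correction aside, the underlying quantile regression step can be sketched by minimising the pinball ("check") loss directly. The data-generating model, the tau = 0.5 choice, and the use of a derivative-free optimiser are illustrative assumptions, not the paper's estimating-equation method.

```python
# Median regression (tau = 0.5) by direct minimisation of the pinball loss,
# robust to the heavy-tailed errors simulated below.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(8)
n = 500
x = rng.normal(size=n)
y = 1.0 + 2.0 * x + rng.standard_t(3, n)      # heavy-tailed errors

def pinball(beta, tau=0.5):
    """Check loss: tau * r for r >= 0, (tau - 1) * r for r < 0."""
    r = y - (beta[0] + beta[1] * x)
    return np.sum(np.maximum(tau * r, (tau - 1) * r))

# Nelder-Mead avoids needing a gradient of the nonsmooth loss.
beta = minimize(pinball, np.zeros(2), method="Nelder-Mead").x
print(np.round(beta, 2))    # near the true intercept and slope (1, 2)
```

The measurement-error problem the paper addresses arises when `x` itself is observed with noise; naively running this fit on noisy covariates attenuates the slope, which is what the joint estimating equations correct.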
Marques, Elsa M R; Jones, Hayley E; Elvers, Karen T; Pyke, Mark; Blom, Ashley W; Beswick, Andrew D
2014-07-05
Surgical pain is managed with multi-modal anaesthesia in total hip replacement (THR) and total knee replacement (TKR). It is unclear whether including local anaesthetic infiltration before wound closure provides additional pain control. We performed a systematic review of randomised controlled trials of local anaesthetic infiltration in patients receiving THR or TKR. We searched MEDLINE, Embase and Cochrane CENTRAL to December 2012. Two reviewers screened abstracts, extracted data, and contacted authors for unpublished outcomes and data. Outcomes collected were post-operative pain at rest and during activity after 24 and 48 hours, opioid requirement, mobilisation, hospital stay and complications. When feasible, we estimated pooled treatment effects using random effects meta-analyses. In 13 studies including 909 patients undergoing THR, patients receiving local anaesthetic infiltration experienced a greater reduction in pain at 24 hours at rest by standardised mean difference (SMD) -0.61 (95% CI -1.05, -0.16; p = 0.008) and by SMD -0.43 (95% CI -0.78, -0.09; p = 0.014) at 48 hours during activity. In TKR, diverse multi-modal regimens were reported. In 23 studies including 1439 patients undergoing TKR, local anaesthetic infiltration reduced pain on average by SMD -0.40 (95% CI -0.58, -0.22; p < 0.001) at 24 hours at rest and by SMD -0.27 (95% CI -0.50, -0.05; p = 0.018) at 48 hours during activity, compared with patients receiving no infiltration or placebo. There was evidence of a larger reduction in studies delivering additional local anaesthetic after wound closure. There was no evidence of pain control additional to that provided by femoral nerve block. Patients receiving local anaesthetic infiltration spent on average an estimated 0.83 (95% CI 0.12, 1.54; p = 0.022) and 0.87 (95% CI 0.11, 1.62; p = 0.025) fewer days in hospital after THR and TKR respectively, had reduced opioid consumption, earlier mobilisation, and a lower incidence of vomiting. Few studies reported long-term outcomes. Local...
Multivariate and semiparametric kernel regression
Härdle, Wolfgang; Müller, Marlene
1997-01-01
The paper gives an introduction to theory and application of multivariate and semiparametric kernel smoothing. Multivariate nonparametric density estimation is an often used pilot tool for examining the structure of data. Regression smoothing helps in investigating the association between covariates and responses. We concentrate on kernel smoothing using local polynomial fitting which includes the Nadaraya-Watson estimator. Some theory on the asymptotic behavior and bandwidth selection is pro...
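A minimal Nadaraya-Watson estimator (the local-constant special case of the local polynomial fitting discussed above) can be written in a few lines; the Gaussian kernel and bandwidth value here are illustrative choices, not the paper's.

```python
import numpy as np

def nadaraya_watson(x_train, y_train, x_eval, bandwidth):
    """Kernel regression: m(x) = sum_i K_h(x - x_i) y_i / sum_i K_h(x - x_i)."""
    d = (x_eval[:, None] - x_train[None, :]) / bandwidth
    k = np.exp(-0.5 * d**2)                 # Gaussian kernel
    return (k @ y_train) / k.sum(axis=1)

rng = np.random.default_rng(1)
x = rng.uniform(0.0, 2.0 * np.pi, 400)
y = np.sin(x) + rng.normal(0.0, 0.2, 400)
grid = np.linspace(0.5, 5.5, 50)
m_hat = nadaraya_watson(x, y, grid, bandwidth=0.3)
```

On this simulated data the estimate tracks sin(x) closely in the interior of the design; bandwidth selection, as the abstract notes, governs the bias-variance trade-off.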
Regression algorithm for emotion detection
Berthelon, Franck; Sander, Peter
2013-01-01
We present here two components of a computational system for emotion detection. PEMs (Personalized Emotion Maps) store links between bodily expressions and emotion values, and are individually calibrated to capture each person's emotion profile. They are an implementation based on aspects of Scherer's theoretical complex-system model of emotion (Scherer, 2000, 2009). We also present a regression algorithm that determines a person's emotional feeling from sensor m...
Directional quantile regression in R
Czech Academy of Sciences Publication Activity Database
Boček, Pavel; Šiman, Miroslav
2017-01-01
Roč. 53, č. 3 (2017), s. 480-492 ISSN 0023-5954 R&D Projects: GA ČR GA14-07234S Institutional support: RVO:67985556 Keywords : multivariate quantile * regression quantile * halfspace depth * depth contour Subject RIV: BD - Theory of Information OBOR OECD: Applied mathematics Impact factor: 0.379, year: 2016 http://library.utia.cas.cz/separaty/2017/SI/bocek-0476587.pdf
Polylinear regression analysis in radiochemistry
International Nuclear Information System (INIS)
Kopyrin, A.A.; Terent'eva, T.N.; Khramov, N.N.
1995-01-01
A number of radiochemical problems have been formulated in the framework of polylinear regression analysis, which permits the use of conventional mathematical methods for their solution. The authors have considered features of the use of polylinear regression analysis for estimating the contributions of various sources to atmospheric pollution, for studying irradiated nuclear fuel, for estimating concentrations from spectral data, for measuring neutron fields of a nuclear reactor, for estimating crystal lattice parameters from X-ray diffraction patterns, for interpreting data of X-ray fluorescence analysis, for estimating complex formation constants, and for analyzing results of radiometric measurements. The problem of estimating the target parameters can be ill-posed for certain properties of the system under study. The authors showed the possibility of regularization by adding a fictitious set of data "obtained" from an orthogonal design. To estimate only a part of the parameters under consideration, the authors used incomplete-rank models. In this case, it is necessary to take into account the possibility of confounding estimates. An algorithm for evaluating the degree of confounding is presented, which is realized using standard software for regression analysis.
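The regularization-by-fictitious-data idea can be made concrete: appending the rows sqrt(lambda)·I with zero responses to an ill-conditioned design and running plain least squares is algebraically identical to ridge regression. This is a generic sketch of that equivalence, not the authors' specific radiochemical procedure.

```python
import numpy as np

def ridge_by_augmentation(X, y, lam):
    """Solve min ||y - Xb||^2 + lam ||b||^2 by appending a 'fictitious'
    orthogonal design sqrt(lam) * I with zero responses, then running
    ordinary least squares on the augmented system."""
    p = X.shape[1]
    X_aug = np.vstack([X, np.sqrt(lam) * np.eye(p)])
    y_aug = np.concatenate([y, np.zeros(p)])
    return np.linalg.lstsq(X_aug, y_aug, rcond=None)[0]

rng = np.random.default_rng(2)
X = rng.normal(size=(100, 3))
X[:, 2] = X[:, 1] + 1e-6 * rng.normal(size=100)   # near-collinear: ill-posed
y = X @ np.array([1.0, 2.0, -1.0]) + rng.normal(0.0, 0.1, 100)

b_aug = ridge_by_augmentation(X, y, lam=1.0)
b_closed = np.linalg.solve(X.T @ X + 1.0 * np.eye(3), X.T @ y)  # ridge formula
```

The two solutions agree to machine precision, confirming that the augmented least-squares problem is the ridge problem in disguise.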
International Nuclear Information System (INIS)
2014-06-01
The IAEA supports Member State activities in advanced fast reactor technology development by providing a major fulcrum for information exchange and collaborative research programmes. The IAEA’s activities in this field are mainly carried out within the framework of the Technical Working Group on Fast Reactors (TWG-FR), which assists in the implementation of corresponding IAEA activities and ensures that all technical activities are in line with the expressed needs of Member States. In the broad range of activities, the IAEA proposes and establishes coordinated research projects (CRPs) aimed at improving Member States’ capabilities in fast reactor design and analysis. An important opportunity to conduct collaborative research activities was provided by the experimental campaign run by the French Alternative Energies and Atomic Energy Commission (CEA, Commissariat à l’énergie atomique et aux énergies alternatives) at the PHÉNIX, a prototype sodium cooled fast reactor. Before the definitive shutdown in 2009, end-of-life tests were conducted to gather additional experience on the operation of sodium cooled reactors. Thanks to the CEA opening the experiments to international cooperation, the IAEA decided in 2007 to launch the CRP entitled Control Rod Withdrawal and Sodium Natural Circulation Tests Performed during the PHÉNIX End-of-Life Experiments. The CRP, together with institutes from seven States, contributed to improving capabilities in sodium cooled fast reactor simulation through code verification and validation, with particular emphasis on temperature and power distribution calculations and the analysis of sodium natural circulation phenomena. The objective of this publication is to document the results and main achievements of the benchmark analyses on the control rod withdrawal test performed within the framework of the PHÉNIX end-of-life experimental campaign
Gaussian Process Regression Model in Spatial Logistic Regression
Sofro, A.; Oktaviarina, A.
2018-01-01
Spatial analysis has developed very quickly in the last decade. One of the most popular approaches is based on the neighbourhood structure of the regions; unfortunately, it has some limitations, such as difficulty in prediction. We therefore offer Gaussian process regression (GPR) to address this issue. In this paper, we focus on spatial modelling with GPR for binomial data with a logit link function. The performance of the model will be investigated. We will discuss inference: how to estimate the parameters and hyper-parameters, and how to predict. Furthermore, simulation studies will be explained in the last section.
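The GPR building block underneath such spatial models is standard. The sketch below implements plain GP regression with an RBF kernel in one dimension (the Gaussian-likelihood case, not the binomial/logit model of the paper, which requires approximate inference); the kernel hyper-parameters are fixed illustrative values rather than estimated.

```python
import numpy as np

def rbf(a, b, length=1.0, sigma=1.0):
    """Squared-exponential (RBF) kernel matrix between point sets a and b."""
    d2 = (a[:, None] - b[None, :])**2
    return sigma**2 * np.exp(-0.5 * d2 / length**2)

def gpr_predict(x_train, y_train, x_test, noise=1e-2):
    """Posterior mean and variance of a zero-mean GP with RBF kernel."""
    K = rbf(x_train, x_train) + noise * np.eye(len(x_train))
    K_s = rbf(x_test, x_train)
    mean = K_s @ np.linalg.solve(K, y_train)
    v = np.linalg.solve(K, K_s.T)
    var = np.diag(rbf(x_test, x_test)) - np.sum(K_s * v.T, axis=1)
    return mean, var

x = np.linspace(0.0, 5.0, 30)
y = np.sin(x)
xt = np.array([1.0, 2.5, 4.0])
mean, var = gpr_predict(x, y, xt)
```

Unlike neighbourhood-based models, this formulation predicts directly at arbitrary unobserved locations, which is the advantage the abstract highlights.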
Demarre, Liesbet; Verhaeghe, Sofie; Van Hecke, Ann; Clays, Els; Grypdonck, Maria; Beeckman, Dimitri
2015-02-01
To identify predictive factors associated with the development of pressure ulcers in patients at risk who receive standardized preventive care. Numerous studies have examined factors that predict risk for pressure ulcer development, but only a few have identified risk factors associated with pressure ulcer development in hospitalized patients receiving standardized preventive care. Secondary analyses of data collected in a multicentre randomized controlled trial. The sample consisted of 610 consecutive patients at risk of pressure ulcer development (Braden score < 17). Pressure ulcers in category II-IV were significantly associated with non-blanchable erythema, urogenital disorders and higher body temperature. Predictive factors significantly associated with superficial pressure ulcers were admission to an internal medicine ward, incontinence-associated dermatitis, non-blanchable erythema and a lower Braden score. Superficial sacral pressure ulcers were significantly associated with incontinence-associated dermatitis. Despite the standardized preventive measures they received, hospitalized patients with non-blanchable erythema, urogenital disorders and a higher body temperature were at increased risk for developing pressure ulcers. Improved identification of at-risk patients can be achieved by taking into account specific predictive factors. Even if preventive measures are in place, continuous assessment and tailoring of interventions is necessary in all patients at risk. Daily skin observation can be used to continuously monitor the effectiveness of the intervention. © 2014 John Wiley & Sons Ltd.
Tutorial on Using Regression Models with Count Outcomes Using R
Directory of Open Access Journals (Sweden)
A. Alexander Beaujean
2016-02-01
Education researchers often study count variables, such as times a student reached a goal, discipline referrals, and absences. Most researchers who study these variables use typical regression methods (i.e., ordinary least-squares), either with or without transforming the count variables. In either case, using typical regression for count data can produce parameter estimates that are biased, thus diminishing any inferences made from such data. As count-variable regression models are seldom taught in training programs, we present a tutorial to help educational researchers use such methods in their own research. We demonstrate analyzing and interpreting count data using Poisson, negative binomial, zero-inflated Poisson, and zero-inflated negative binomial regression models. The count regression methods are introduced through an example using the number of times students skipped class. The data for this example are freely available and the R syntax used to run the example analyses is included in the Appendix.
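The simplest of the models in that tutorial, Poisson regression, fits log E[y] = Xb by maximizing the Poisson log-likelihood. A minimal Newton/IRLS fit on simulated counts (Python rather than the tutorial's R; the data and coefficients are made up) looks like this:

```python
import numpy as np

def poisson_regression(X, y, n_iter=25):
    """Fit log E[y] = X b by Newton's method (IRLS) on the Poisson log-likelihood."""
    b = np.zeros(X.shape[1])
    for _ in range(n_iter):
        mu = np.exp(X @ b)          # fitted means under current coefficients
        # Newton step: b += (X' diag(mu) X)^{-1} X'(y - mu)
        b = b + np.linalg.solve(X.T @ (mu[:, None] * X), X.T @ (y - mu))
    return b

rng = np.random.default_rng(3)
n = 5000
X = np.column_stack([np.ones(n), rng.normal(size=n)])  # intercept + covariate
true_b = np.array([0.2, 0.3])
y = rng.poisson(np.exp(X @ true_b))                    # simulated counts
b_hat = poisson_regression(X, y)
```

Each coefficient is interpreted on the log scale: exp(b) is the multiplicative change in the expected count per unit increase in the covariate.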
Directory of Open Access Journals (Sweden)
Natalia Castaño-Rodríguez
BACKGROUND: Currently, it is well established that cancer arises in chronically inflamed tissue. A number of NOD-like receptors (NLRs) form inflammasomes, intracellular multiprotein complexes critical for generating mature pro-inflammatory cytokines (IL-1β and IL-18). As chronic inflammation of the gastric mucosa is a consequence of Helicobacter pylori infection, we investigated the role of genetic polymorphisms and expression of genes involved in the NLR signalling pathway in H. pylori infection and related gastric cancer (GC). MATERIALS AND METHODS: Fifty-one genetic polymorphisms were genotyped in 310 ethnic Chinese (87 non-cardia GC cases and 223 controls with functional dyspepsia). In addition, gene expression of 84 molecules involved in the NLR signalling pathway was assessed in THP-1 cells challenged with two H. pylori strains, GC026 (GC) and 26695 (gastritis). RESULTS: CARD8-rs11672725, NLRP3-rs10754558, NLRP3-rs4612666, NLRP12-rs199475867 and NLRX1-rs10790286 showed significant associations with GC. On multivariate analysis, CARD8-rs11672725 remained a risk factor (OR: 4.80, 95% CI: 1.39-16.58). Further, NLRP12-rs2866112 increased the risk of H. pylori infection (OR: 2.13, 95% CI: 1.22-3.71). Statistical analyses assessing the joint effect of H. pylori infection and the selected polymorphisms revealed strong associations with GC (CARD8, NLRP3, CASP1 and NLRP12 polymorphisms). In gene expression analyses, five genes encoding NLRs were significantly regulated in H. pylori-challenged cells (NLRC4, NLRC5, NLRP9, NLRP12 and NLRX1). Interestingly, persistent up-regulation of NFKB1 with simultaneous down-regulation of NLRP12 and NLRX1 was observed in H. pylori GC026-challenged cells. Further, NF-κB target genes encoding pro-inflammatory cytokines, chemokines and molecules involved in carcinogenesis were markedly up-regulated in H. pylori GC026-challenged cells. CONCLUSIONS: Novel associations between polymorphisms in the NLR signalling pathway (CARD8
Williams, Randolph; Goodwin, Laurel; Sharp, Warren; Mozley, Peter
2017-04-01
U-Th dates on calcite precipitated in coseismic extension fractures in the Loma Blanca normal fault zone, Rio Grande rift, NM, USA, constrain earthquake recurrence intervals from 150-565 ka. This is the longest direct record of seismicity documented for a fault in any tectonic environment. Combined U-Th and stable isotope analyses of these calcite veins define 13 distinct earthquake events. These data show that for more than 400 ka the Loma Blanca fault produced earthquakes with a mean recurrence interval of 40 ± 7 ka. The coefficient of variation for these events is 0.40, indicating strongly periodic seismicity consistent with a time-dependent model of earthquake recurrence. Stochastic statistical analyses further validate the inference that earthquake behavior on the Loma Blanca was time-dependent. The time-dependent nature of these earthquakes suggests that the seismic cycle was fundamentally controlled by a stress renewal process. However, this periodic cycle was punctuated by an episode of clustered seismicity at 430 ka. Recurrence intervals within the earthquake cluster were as low as 5-11 ka. Breccia veins formed during this episode exhibit carbon isotope signatures consistent with having formed through pronounced degassing of a CO2 charged brine during post-failure, fault-localized fluid migration. The 40 ka periodicity of the long-term earthquake record of the Loma Blanca fault is similar in magnitude to recurrence intervals documented through paleoseismic studies of other normal faults in the Rio Grande rift and Basin and Range Province. We propose that it represents a background rate of failure in intraplate extension. The short-term, clustered seismicity that occurred on the fault records an interruption of the stress renewal process, likely by elevated fluid pressure in deeper structural levels of the fault, consistent with fault-valve behavior. The relationship between recurrence interval and inferred fluid degassing suggests that pore fluid pressure
Sampson, Maureen L; Gounden, Verena; van Deventer, Hendrik E; Remaley, Alan T
2016-02-01
The main drawback of the periodic analysis of quality control (QC) material is that test performance is not monitored in time periods between QC analyses, potentially leading to the reporting of faulty test results. The objective of this study was to develop a patient based QC procedure for the more timely detection of test errors. Results from a Chem-14 panel measured on the Beckman LX20 analyzer were used to develop the model. Each test result was predicted from the other 13 members of the panel by multiple regression, which resulted in correlation coefficients between the predicted and measured result of >0.7 for 8 of the 14 tests. A logistic regression model, which utilized the measured test result, the predicted test result, the day of the week and time of day, was then developed for predicting test errors. The output of the logistic regression was tallied by a daily CUSUM approach and used to predict test errors, with a fixed specificity of 90%. The mean average run length (ARL) before error detection by CUSUM-Logistic Regression (CSLR) was 20 with a mean sensitivity of 97%, which was considerably shorter than the mean ARL of 53 (sensitivity 87.5%) for a simple prediction model that only used the measured result for error detection. A CUSUM-Logistic Regression analysis of patient laboratory data can be an effective approach for the rapid and sensitive detection of clinical laboratory errors. Published by Elsevier Inc.
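The tallying step of such a scheme is an ordinary one-sided CUSUM. The sketch below shows that accumulator alone on simulated error scores (the logistic-regression front end, the Chem-14 panel, and all numeric settings here are stand-ins, not the authors' tuned procedure): in-control scores keep the statistic near zero, and a sustained upward shift trips the alarm within a few observations.

```python
import numpy as np

def cusum_detect(scores, drift=0.5, threshold=8.0):
    """One-sided CUSUM: accumulate evidence that scores have shifted upward.
    Returns the index at which the alarm fires, or -1 if it never does."""
    s = 0.0
    for i, z in enumerate(scores):
        s = max(0.0, s + z - drift)   # drift absorbs in-control noise
        if s > threshold:
            return i
    return -1

rng = np.random.default_rng(4)
# 200 in-control error scores, then a sustained shift (e.g. a calibration fault)
scores = np.concatenate([rng.normal(0.0, 1.0, 200), rng.normal(2.0, 1.0, 50)])
alarm = cusum_detect(scores)
```

The short detection delay after the changepoint is the "average run length" notion the abstract reports.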
Spontaneous regression of pulmonary bullae
International Nuclear Information System (INIS)
Satoh, H.; Ishikawa, H.; Ohtsuka, M.; Sekizawa, K.
2002-01-01
The natural history of pulmonary bullae is often characterized by gradual, progressive enlargement. Spontaneous regression of bullae is, however, very rare. We report a case in which complete resolution of pulmonary bullae in the left upper lung occurred spontaneously. The management of pulmonary bullae is occasionally made difficult because of gradual progressive enlargement associated with abnormal pulmonary function. Some patients have multiple bullae in both lungs and/or a history of pulmonary emphysema; others have a giant bulla without emphysematous change in the lungs. Our patient had treated lung cancer with no evidence of local recurrence. He had no emphysematous change on lung function testing and no complaints, although high-resolution CT showed minimal underlying emphysematous changes. Ortin and Gurney presented three cases of spontaneous reduction in the size of bullae; interestingly, one of them showed a marked decrease in the size of a bulla in association with thickening of the bulla wall, which was also observed in our patient. The case we describe is of interest not only because of the rarity with which regression of pulmonary bullae has been reported in the literature, but also because of the spontaneous improvement in the radiological picture in the absence of overt infection or tumor. Copyright (2002) Blackwell Science Pty Ltd
Quantum algorithm for linear regression
Wang, Guoming
2017-07-01
We present a quantum algorithm for fitting a linear regression model to a given data set using the least-squares approach. Unlike previous algorithms, which yield a quantum state encoding the optimal parameters, our algorithm outputs these numbers in the classical form. So by running it once, one completely determines the fitted model and then can use it to make predictions on new data at little cost. Moreover, our algorithm works in the standard oracle model, and can handle data sets with nonsparse design matrices. It runs in time poly(log₂(N), d, κ, 1/ε), where N is the size of the data set, d is the number of adjustable parameters, κ is the condition number of the design matrix, and ε is the desired precision in the output. We also show that the polynomial dependence on d and κ is necessary. Thus, our algorithm cannot be significantly improved. Furthermore, we also give a quantum algorithm that estimates the quality of the least-squares fit (without computing its parameters explicitly). This algorithm runs faster than the one for finding this fit, and can be used to check whether the given data set qualifies for linear regression in the first place.
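For reference, the classical problem the algorithm targets is ordinary least squares, whose cost and accuracy are governed by the same quantities N, d and κ. The sketch below (classical, simulated data) computes the fit and the condition number κ from the design matrix's singular values:

```python
import numpy as np

rng = np.random.default_rng(5)
N, d = 1000, 4                       # data set size and number of parameters
X = rng.normal(size=(N, d))          # dense (nonsparse) design matrix
beta_true = np.array([1.0, -2.0, 0.5, 3.0])
y = X @ beta_true + rng.normal(0.0, 0.1, N)

# Classical least-squares fit; kappa = sigma_max / sigma_min is the
# condition number that enters the quantum algorithm's running time.
beta_hat, residuals, rank, sv = np.linalg.lstsq(X, y, rcond=None)
kappa = sv[0] / sv[-1]
```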
Interpretation of commonly used statistical regression models.
Kasza, Jessica; Wolfe, Rory
2014-01-01
A review of some regression models commonly used in respiratory health applications is provided in this article. Simple linear regression, multiple linear regression, logistic regression and ordinal logistic regression are considered. The focus of this article is on the interpretation of the regression coefficients of each model, which are illustrated through the application of these models to a respiratory health research study. © 2013 The Authors. Respirology © 2013 Asian Pacific Society of Respirology.
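The central interpretive contrast in that review (additive effects on the mean versus multiplicative effects on the odds) reduces to simple arithmetic. The coefficient values below are hypothetical, chosen only to illustrate the two readings:

```python
import math

# Linear regression: a coefficient of 1.2 means a one-unit increase in the
# covariate is associated with a 1.2-unit increase in the mean response.
beta_linear = 1.2
effect_of_two_units = 2 * beta_linear       # additive: 2.4-unit increase

# Logistic regression: a coefficient is a log odds ratio, so exp(beta)
# is the multiplicative change in the odds per one-unit increase.
beta_logistic = 0.693
odds_ratio = math.exp(beta_logistic)        # ~2: the odds roughly double
```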
Keith, Timothy Z
2014-01-01
Multiple Regression and Beyond offers a conceptually oriented introduction to multiple regression (MR) analysis and structural equation modeling (SEM), along with analyses that flow naturally from those methods. By focusing on the concepts and purposes of MR and related methods, rather than the derivation and calculation of formulae, this book introduces material to students more clearly, and in a less threatening way. In addition to illuminating content necessary for coursework, the accessibility of this approach means students are more likely to be able to conduct research using MR or SEM--and more likely to use the methods wisely. Covers both MR and SEM, while explaining their relevance to one another Also includes path analysis, confirmatory factor analysis, and latent growth modeling Figures and tables throughout provide examples and illustrate key concepts and techniques For additional resources, please visit: http://tzkeith.com/.
Skoffer, Birgit; Dalgas, Ulrik; Maribo, Thomas; Søballe, Kjeld; Mechlenburg, Inger
2017-11-09
Preoperative progressive resistance training (PRT) is controversial in patients scheduled for total knee arthroplasty (TKA), because of the concern that it may exacerbate knee joint pain and effusion. To examine whether preoperative PRT initiated 5 weeks prior to TKA would exacerbate pain and knee effusion, and would allow a progressively increased training load throughout the training period that would subsequently increase muscle strength. Secondary analyses from a randomized controlled trial. University Hospital and a Regional Hospital. A total of 30 patients who were scheduled for TKA due to osteoarthritis and assigned as the intervention group. Patients underwent unilateral PRT (3 sessions per week). Exercise loading was 12 repetitions maximum (RM) with progression toward 8 RM. The training program consisted of 6 exercises performed unilaterally. Before and after each training session, knee joint pain was rated on an 11-point scale, effusion was assessed by measuring the knee joint circumference, and training load was recorded. The first and last training sessions were initiated by 1 RM testing of unilateral leg press, unilateral knee extension, and unilateral knee flexion. The median pain change score from before to after each training session was 0 at all training sessions. The average increase in knee joint effusion across the 12 training sessions was a mean 0.16 cm ± 0.23 cm. No consistent increase in knee joint effusion after training sessions during the training period was found (P = .21). Training load generally increased, and maximal muscle strength improved as follows: unilateral leg press: 18% ± 30% (P = .03); unilateral knee extension: 81% ± 156% (P < .001); unilateral knee flexion: 53% ± 57% (P < .001). Preoperative PRT did not exacerbate knee joint pain and effusion, despite a substantial progression in loading and increased muscle strength. Concerns for side effects such as pain and effusion after PRT seem unfounded. To be determined. Copyright © 2017. Published by Elsevier Inc.
Alqaydi, Ahlam R; Kanavakis, Georgios; Naser-Ud-Din, Shazia; Athanasiou, Athanasios E
2017-12-08
This study was conducted to explore authorship characteristics and publication trends of all orthodontic randomized controlled trials (RCTs), systematic reviews (SRs), and meta-analyses (MAs) published in non-orthodontic journals with impact factor (IF). Appropriate research strategies were developed to search for all articles published until December 2015, without restrictions regarding language or publication status. The initial search generated 4524 results, but after application of the inclusion criteria, the final number of articles was reduced to 274 (SRs: 152; MAs: 36; and RCTs: 86). Various authorship characteristics were recorded for each article. Frequency distributions for all parameters were explored with Pearson chi-square for independence at the 0.05 level of significance. More than half of the included publications were SRs (55.5 per cent), followed by RCTs (31.4 per cent) and MAs (13.1 per cent); one hundred seventy-eight (65 per cent) appeared in dental journals and 96 (35 per cent) were published in non-dental journals. The last decade was significantly more productive than the period before 2006, with 236 (86.1 per cent) articles published between 2006 and 2015. European countries produced 51.5 per cent of the total number of publications, followed by Asia (18.6 per cent) and North America (USA and Canada; 16.8 per cent). Studies published in journals without IF were not included. Level-1 evidence orthodontic literature published in non-orthodontic journals has significantly increased during 2006-15. This indicates a larger interest of other specialty journals in orthodontic related studies and a trend for orthodontic authors to publish their work in journals with impact in broader fields of dentistry and medicine. © The Author(s) 2017. Published by Oxford University Press on behalf of the European Orthodontic Society. All rights reserved. For permissions, please email: journals.permissions@oup.com
Directory of Open Access Journals (Sweden)
Ulrika Gillespie
Appropriateness of prescribing can be assessed by various measures and screening instruments. The aims of this study were to investigate the effects of pharmacists' interventions on appropriateness of prescribing in elderly patients, and to explore the relationship between these results and hospital care utilization during a 12-month follow-up period. The study population from a previous randomized controlled study, in which the effects of a comprehensive pharmacist intervention on re-hospitalization were investigated, was used. The criteria from the instruments MAI, STOPP and START were applied retrospectively to the 368 study patients (intervention group (I), n = 182; control group (C), n = 186). The assessments were done on admission and at discharge to detect differences over time and between the groups. Hospital care consumption was recorded and the association between scores for appropriateness and hospitalization was analysed. The number of Potentially Inappropriate Medicines (PIMs) per patient as identified by STOPP was reduced for I but not for C (1.42 to 0.93 vs. 1.46 to 1.66, respectively; p < 0.01). The number of Potential Prescription Omissions (PPOs) per patient as identified by START was reduced for I but not for C (0.36 to 0.09 vs. 0.42 to 0.45, respectively; p < 0.001). The summated score for MAI was reduced for I but not for C (8.5 to 5.0 vs. 8.7 to 10.0, respectively; p < 0.001). There was a positive association between scores for MAI and STOPP and drug-related readmissions (RR 8-9% and 30-34%, respectively). No association was detected between the scores of the tools and total re-visits to hospital. The interventions significantly improved the appropriateness of prescribing for patients in the intervention group, as evaluated by the instruments MAI, STOPP and START. High scores in MAI and STOPP were associated with a higher number of drug-related readmissions.
On Weighted Support Vector Regression
DEFF Research Database (Denmark)
Han, Xixuan; Clemmensen, Line Katrine Harder
2014-01-01
We propose a new type of weighted support vector regression (SVR), motivated by modeling local dependencies in time and space in prediction of house prices. The classic weights of the weighted SVR are added to the slack variables in the objective function (OF-weights). This procedure directly shrinks the coefficient of each observation in the estimated functions; thus, it is widely used for minimizing influence of outliers. We propose to additionally add weights to the slack variables in the constraints (CF-weights) and call the combination of weights the doubly weighted SVR. We illustrate the differences and similarities of the two types of weights by demonstrating the connection between the Least Absolute Shrinkage and Selection Operator (LASSO) and the SVR. We show that an SVR problem can be transformed to a LASSO problem plus a linear constraint and a box constraint. We demonstrate...
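The LASSO side of the SVR-to-LASSO transformation mentioned above is standard and easy to sketch via coordinate descent. This is a generic LASSO solver on simulated data, not the doubly weighted SVR itself; the penalty value and the sparse coefficient vector are illustrative.

```python
import numpy as np

def soft_threshold(z, t):
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    """Coordinate descent for min_b 0.5 ||y - Xb||^2 + lam ||b||_1."""
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X**2).sum(axis=0)
    for _ in range(n_iter):
        for j in range(p):
            r_j = y - X @ b + X[:, j] * b[j]   # partial residual excluding j
            b[j] = soft_threshold(X[:, j] @ r_j, lam) / col_sq[j]
    return b

rng = np.random.default_rng(6)
X = rng.normal(size=(200, 5))
y = X @ np.array([2.0, 0.0, 0.0, -1.5, 0.0]) + rng.normal(0.0, 0.1, 200)
b_lasso = lasso_cd(X, y, lam=20.0)
```

The soft-thresholding step both shrinks coefficients and sets irrelevant ones exactly to zero, the selection behaviour the LASSO shares with the box-constrained SVR dual.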
Regression testing in the TOTEM DCS
International Nuclear Information System (INIS)
Rodríguez, F Lucas; Atanassov, I; Burkimsher, P; Frost, O; Taskinen, J; Tulimaki, V
2012-01-01
The Detector Control System of the TOTEM experiment at the LHC is built with the industrial product WinCC OA (PVSS). The TOTEM system is generated automatically through scripts, using as input the detector Product Breakdown Structure (PBS), its pinout connectivity, archiving and alarm meta-information, and some other heuristics based on the naming conventions. When those initial parameters and the automation code are modified to include new features, the resulting PVSS system can also introduce side-effects. On a daily basis, a custom-developed regression testing tool takes the most recent code from a Subversion (SVN) repository and builds a new control system from scratch. This system is exported in plain-text format using the PVSS export tool and compared with a system previously validated by a human. A report with any differences highlighted is sent to the developers, in readiness for validation and acceptance as a new stable version. This regression approach does not depend on any development framework or methodology. The process has run satisfactorily for several months, proving to be a very valuable tool before deploying new versions to the production systems.
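The core export-and-diff idea can be sketched in a few lines. The two "exports" below are hypothetical stand-ins (the datapoint names and attributes are invented, not real PVSS export syntax): a freshly generated export is compared line by line against a human-validated baseline, and only the differences are surfaced for review.

```python
import difflib

# Hypothetical plain-text exports of a generated control system.
baseline = [
    "datapoint TOTEM/T1/HV/channel001 alarm=ON archive=ON",
    "datapoint TOTEM/T1/HV/channel002 alarm=ON archive=ON",
]
candidate = [
    "datapoint TOTEM/T1/HV/channel001 alarm=ON archive=ON",
    "datapoint TOTEM/T1/HV/channel002 alarm=OFF archive=ON",  # side-effect
]

# Keep only added/removed lines; drop the unified-diff file headers.
diff = [line
        for line in difflib.unified_diff(baseline, candidate, lineterm="")
        if line.startswith(("+", "-"))
        and not line.startswith(("+++", "---"))]
```

An empty `diff` would mean the regenerated system matches the validated baseline; any entries are candidates for the highlighted report sent to developers.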
Gallart, F.; Prat, N.; García-Roger, E. M.; Latron, J.; Rieradevall, M.; Llorens, P.; Barberá, G. G.; Brito, D.; De Girolamo, A. M.; Lo Porto, A.; Buffagni, A.; Erba, S.; Neves, R.; Nikolaidis, N. P.; Perrin, J. L.; Querner, E. P.; Quiñonero, J. M.; Tournoud, M. G.; Tzoraki, O.; Skoulikidis, N.; Gómez, R.; Sánchez-Montoya, M. M.; Froebrich, J.
2012-09-01
Temporary streams are those water courses that undergo the recurrent cessation of flow or the complete drying of their channel. The structure and composition of biological communities in temporary stream reaches are strongly dependent on the temporal changes of the aquatic habitats determined by the hydrological conditions. Therefore, the structural and functional characteristics of aquatic fauna to assess the ecological quality of a temporary stream reach cannot be used without taking into account the controls imposed by the hydrological regime. This paper develops methods for analysing temporary streams' aquatic regimes, based on the definition of six aquatic states that summarize the transient sets of mesohabitats occurring on a given reach at a particular moment, depending on the hydrological conditions: Hyperrheic, Eurheic, Oligorheic, Arheic, Hyporheic and Edaphic. When the hydrological conditions lead to a change in the aquatic state, the structure and composition of the aquatic community changes according to the new set of available habitats. We used the water discharge records from gauging stations or simulations with rainfall-runoff models to infer the temporal patterns of occurrence of these states in the Aquatic States Frequency Graph we developed. The visual analysis of this graph is complemented by the development of two metrics which describe the permanence of flow and the seasonal predictability of zero flow periods. Finally, a classification of temporary streams in four aquatic regimes in terms of their influence over the development of aquatic life is updated from the existing classifications, with stream aquatic regimes defined as Permanent, Temporary-pools, Temporary-dry and Episodic. While aquatic regimes describe the long-term overall variability of the hydrological conditions of the river section and have been used for many years by hydrologists and ecologists, aquatic states describe the availability of mesohabitats in given periods that
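One of the two metrics described, the permanence of flow, can be sketched as the fraction of the discharge record with flowing water. This is a simplified stand-in for the paper's metric (whose exact definition and time base may differ), applied to a hypothetical daily discharge series:

```python
import numpy as np

def flow_permanence(discharge):
    """Fraction of the record with flowing water (discharge > 0)."""
    discharge = np.asarray(discharge, dtype=float)
    return float(np.mean(discharge > 0.0))

# Hypothetical one-year daily record: ~9 months of flow, ~3 months dry,
# loosely resembling a Temporary-pools or Temporary-dry regime.
q = np.concatenate([np.full(270, 0.5), np.zeros(95)])
mf = flow_permanence(q)
```

A value near 1 would indicate a Permanent regime, intermediate values the Temporary regimes, and values near 0 an Episodic one.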
Energy Technology Data Exchange (ETDEWEB)
Brunet, M
1994-03-01
The study covered in this paper is designed to review the current situation in terms of functional analysis, in order to find a functional analysis method for mechanical parts able to serve as a substrate for expressing operating safety constraints, time-related performance or any other tag of function description. This paper comprises three parts: The first is devoted to general notions of the formats used by the various functional analyses. It attempts to explain the three types of format: behavioural, structural and functional. It tackles the notions of trees and bottom-up and top-down approaches. It proposes examining the link between the expected functions of the systems and the hardware supporting these functions. It attempts to make a distinction between "operators" and "operands" enabling the notion of object to be linked to that of the three types of format seen above. It ends with a reminder of the distinction between semi-formal and formal. The second part analyses the current situation of functional analysis of the mechanical and control-instrumentation parts of power production plants, through a bibliographical search. The results of this second part are however disappointing. The purpose of the third part of the study is a prototype format built up from the considerations of the first two parts. This format meets our requirements better than those of the bibliographical analysis, but it could doubtless be improved: application of this format to RCV highlights its advantages, but also underlines the improvements needed. Given the deadlines of the Mexsyco project, the decision was taken to suspend development of this format for the time being and use a method currently being produced and based on use of the current functional breakdown (basic plant systems) and of a modular tree-structure representation of the control-instrumentation of a basic plant system. (author). 12 refs., 4 annexes.
Credit Scoring Problem Based on Regression Analysis
Khassawneh, Bashar Suhil Jad Allah
2014-01-01
ABSTRACT: This thesis provides an explanatory introduction to the regression models of data mining and contains basic definitions of key terms in the linear, multiple and logistic regression models. The aim of this study is to illustrate fitting models for the credit scoring problem using simple linear, multiple linear and logistic regression models, and to analyze the fitted model functions with statistical tools. Keywords: Data mining, linear regression, logistic regression....
Variable selection and model choice in geoadditive regression models.
Kneib, Thomas; Hothorn, Torsten; Tutz, Gerhard
2009-06-01
Model choice and variable selection are issues of major concern in practical regression analyses, arising in many biometric applications such as habitat suitability analyses, where the aim is to identify the influence of potentially many environmental conditions on certain species. We describe regression models for breeding bird communities that facilitate both model choice and variable selection, by a boosting algorithm that works within a class of geoadditive regression models comprising spatial effects, nonparametric effects of continuous covariates, interaction surfaces, and varying coefficients. The major modeling components are penalized splines and their bivariate tensor product extensions. All smooth model terms are represented as the sum of a parametric component and a smooth component with one degree of freedom to obtain a fair comparison between the model terms. A generic representation of the geoadditive model allows us to devise a general boosting algorithm that automatically performs model choice and variable selection.
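As a minimal sketch of the componentwise boosting idea that drives this kind of automatic model choice and variable selection, the snippet below uses plain linear base-learners in place of the paper's penalized splines, spatial terms, and tensor products; all data and names are illustrative:

```python
import numpy as np

def componentwise_l2_boost(X, y, steps=200, nu=0.1):
    """Componentwise L2-boosting: each step fits every single-covariate
    least-squares base-learner to the current residuals and takes a small
    step (nu) along the best one. Never-selected covariates keep a zero
    coefficient, so the algorithm also performs variable selection."""
    n, p = X.shape
    beta = np.zeros(p)
    intercept = y.mean()
    resid = y - intercept
    for _ in range(steps):
        best_j, best_rss, best_b = 0, np.inf, 0.0
        for j in range(p):
            xj = X[:, j]
            b = (xj @ resid) / (xj @ xj)          # LS fit to residuals
            rss = np.sum((resid - b * xj) ** 2)
            if rss < best_rss:
                best_j, best_rss, best_b = j, rss, b
        beta[best_j] += nu * best_b               # update only the best component
        resid -= nu * best_b * X[:, best_j]
    return intercept, beta

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))
y = 2.0 * X[:, 0] - 1.0 * X[:, 2] + 0.1 * rng.standard_normal(200)
icpt, beta = componentwise_l2_boost(X, y)
```

Only the two informative covariates accumulate sizable coefficients; the rest stay near zero, which is the variable-selection by-product the abstract describes.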
An Original Stepwise Multilevel Logistic Regression Analysis of Discriminatory Accuracy
DEFF Research Database (Denmark)
Merlo, Juan; Wagner, Philippe; Ghith, Nermin
2016-01-01
BACKGROUND AND AIM: Many multilevel logistic regression analyses of "neighbourhood and health" focus on interpreting measures of associations (e.g., odds ratio, OR). In contrast, multilevel analysis of variance is rarely considered. We propose an original stepwise analytical approach that disting...
Interpreting Multiple Linear Regression: A Guidebook of Variable Importance
Nathans, Laura L.; Oswald, Frederick L.; Nimon, Kim
2012-01-01
Multiple regression (MR) analyses are commonly employed in social science fields. Interpretation of results, however, typically reflects an overreliance on beta weights, often resulting in very limited interpretations of variable importance. It appears that few researchers employ other methods to obtain a fuller understanding of what…
Regularized Label Relaxation Linear Regression.
Fang, Xiaozhao; Xu, Yong; Li, Xuelong; Lai, Zhihui; Wong, Wai Keung; Fang, Bingwu
2018-04-01
Linear regression (LR) and some of its variants have been widely used for classification problems. Most of these methods assume that during the learning phase, the training samples can be exactly transformed into a strict binary label matrix, which has too little freedom to fit the labels adequately. To address this problem, in this paper, we propose a novel regularized label relaxation LR method, which has the following notable characteristics. First, the proposed method relaxes the strict binary label matrix into a slack variable matrix by introducing a nonnegative label relaxation matrix into LR, which provides more freedom to fit the labels and simultaneously enlarges the margins between different classes as much as possible. Second, the proposed method constructs the class compactness graph based on manifold learning and uses it as the regularization item to avoid the problem of overfitting. The class compactness graph is used to ensure that the samples sharing the same labels can be kept close after they are transformed. Two different algorithms, based respectively on two different norm loss functions, are devised. These two algorithms have compact closed-form solutions in each iteration so that they are easily implemented. Extensive experiments show that these two algorithms outperform the state-of-the-art algorithms in terms of the classification accuracy and running time.
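A rough sketch of the label-relaxation step in Python. The class compactness graph regularizer is omitted, the updates are simplified, and the data are synthetic, so this illustrates the idea rather than the paper's algorithm:

```python
import numpy as np

def label_relaxation_lr(X, Y, lam=0.1, iters=10):
    """Sketch of label-relaxation linear regression (graph regularizer
    omitted). Y is a one-hot label matrix; B gives the direction each
    target entry may drift, and the nonnegative matrix M relaxes the
    strict 0/1 targets so that class margins can grow."""
    d = X.shape[1]
    B = np.where(Y > 0, 1.0, -1.0)
    M = np.zeros_like(Y)
    A = X.T @ X + lam * np.eye(d)
    W = np.zeros((d, Y.shape[1]))
    for _ in range(iters):
        T = Y + B * M                          # relaxed target matrix
        W = np.linalg.solve(A, X.T @ T)        # ridge-regularized LS update
        M = np.maximum(0.0, B * (X @ W - Y))   # closed-form nonnegative update
    return W

rng = np.random.default_rng(1)
X0 = rng.standard_normal((50, 3)) + np.array([2.0, 0.0, 0.0])
X1 = rng.standard_normal((50, 3)) - np.array([2.0, 0.0, 0.0])
X = np.vstack([X0, X1])
Y = np.zeros((100, 2))
Y[:50, 0] = 1.0
Y[50:, 1] = 1.0
W = label_relaxation_lr(X, Y)
pred = np.argmax(X @ W, axis=1)   # classify by the largest relaxed score
```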
Estimating the exceedance probability of rain rate by logistic regression
Chiu, Long S.; Kedem, Benjamin
1990-01-01
Recent studies have shown that the fraction of an area with rain intensity above a fixed threshold is highly correlated with the area-averaged rain rate. To estimate the fractional rainy area, a logistic regression model, which estimates the conditional probability that rain rate over an area exceeds a fixed threshold given the values of related covariates, is developed. The problem of dependency in the data in the estimation procedure is bypassed by the method of partial likelihood. Analyses of simulated scanning multichannel microwave radiometer and observed electrically scanning microwave radiometer data during the Global Atlantic Tropical Experiment period show that the use of logistic regression in pixel classification is superior to multiple regression in predicting whether rain rate at each pixel exceeds a given threshold, even in the presence of noisy data. The potential of the logistic regression technique in satellite rain rate estimation is discussed.
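A small sketch of this kind of exceedance-probability model, with a Newton-Raphson logistic fit on synthetic covariates standing in for the radiometer data (all names and values are illustrative):

```python
import numpy as np

def fit_logistic(X, y, iters=25):
    """Newton-Raphson (IRLS) fit of a logistic regression for the
    conditional probability P(rain rate > threshold | covariates)."""
    Xd = np.column_stack([np.ones(len(X)), X])   # add an intercept column
    beta = np.zeros(Xd.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-Xd @ beta))
        W = p * (1.0 - p)                        # IRLS weights
        H = Xd.T @ (W[:, None] * Xd) + 1e-8 * np.eye(Xd.shape[1])
        beta += np.linalg.solve(H, Xd.T @ (y - p))
    return beta

def exceedance_prob(beta, x):
    """Estimated probability that rain rate exceeds the threshold at x."""
    return 1.0 / (1.0 + np.exp(-(beta[0] + x @ beta[1:])))

rng = np.random.default_rng(2)
x = rng.standard_normal((500, 2))            # stand-ins for radiometer covariates
true = np.array([-0.5, 1.5, -1.0])           # illustrative true coefficients
p_true = 1.0 / (1.0 + np.exp(-(true[0] + x @ true[1:])))
y = (rng.uniform(size=500) < p_true).astype(float)
beta = fit_logistic(x, y)
```

Averaging the fitted exceedance probabilities over pixels would then give an estimate of the fractional rainy area described in the abstract.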
CSIR Research Space (South Africa)
Burger, CR
2011-11-01
Full Text Available Current certification criteria for safety-critical systems exclude non-deterministic control systems. This paper investigates the feasibility of using human-like monitoring strategies to achieve safe non-deterministic control using multiple...
Independent contrasts and PGLS regression estimators are equivalent.
Blomberg, Simon P; Lefevre, James G; Wells, Jessie A; Waterhouse, Mary
2012-05-01
We prove that the slope parameter of the ordinary least squares regression of phylogenetically independent contrasts (PICs) conducted through the origin is identical to the slope parameter of the method of generalized least squares (GLSs) regression under a Brownian motion model of evolution. This equivalence has several implications: 1. Understanding the structure of the linear model for GLS regression provides insight into when and why phylogeny is important in comparative studies. 2. The limitations of the PIC regression analysis are the same as the limitations of the GLS model. In particular, phylogenetic covariance applies only to the response variable in the regression and the explanatory variable should be regarded as fixed. Calculation of PICs for explanatory variables should be treated as a mathematical idiosyncrasy of the PIC regression algorithm. 3. Since the GLS estimator is the best linear unbiased estimator (BLUE), the slope parameter estimated using PICs is also BLUE. 4. If the slope is estimated using different branch lengths for the explanatory and response variables in the PIC algorithm, the estimator is no longer the BLUE, so this is not recommended. Finally, we discuss whether or not and how to accommodate phylogenetic covariance in regression analyses, particularly in relation to the problem of phylogenetic uncertainty. This discussion is from both frequentist and Bayesian perspectives.
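The GLS estimator at the heart of this equivalence can be written down in a few lines; the covariance matrix below is an illustrative stand-in for a two-clade phylogeny, not data from the paper:

```python
import numpy as np

def gls_fit(x, y, V):
    """GLS estimate of (intercept, slope) for y = a + b*x + e with
    cov(e) proportional to V (e.g. shared branch lengths under Brownian
    motion): beta = (X' V^-1 X)^-1 X' V^-1 y, the BLUE when V is correct."""
    X = np.column_stack([np.ones_like(x), x])
    Vi = np.linalg.inv(V)
    return np.linalg.solve(X.T @ Vi @ X, X.T @ Vi @ y)

# illustrative covariance: two clades of two closely related species each
V = np.array([[1.0, 0.8, 0.0, 0.0],
              [0.8, 1.0, 0.0, 0.0],
              [0.0, 0.0, 1.0, 0.8],
              [0.0, 0.0, 0.8, 1.0]])
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([0.1, 1.1, 2.0, 3.2])   # roughly unit slope
a, b = gls_fit(x, y, V)
```

The paper's result is that the slope produced this way equals the slope of an OLS regression through the origin on the independent contrasts computed from the same tree.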
Linear and logistic regression analysis
Tripepi, G.; Jager, K. J.; Dekker, F. W.; Zoccali, C.
2008-01-01
In previous articles of this series, we focused on relative risks and odds ratios as measures of effect to assess the relationship between exposure to risk factors and clinical outcomes and on control for confounding. In randomized clinical trials, the random allocation of patients is hoped to
Kamphuis, P.J.G.H.; Verhey, F.R.; Olde Rikkert, M.G.; Twisk, J.W.; Swinkels, S.H.; Scheltens, P.
2011-01-01
Objectives: To investigate the effect of a medical food (Souvenaid) on body mass index (BMI) and functional abilities in patients with mild Alzheimer's disease (AD). Design/setting/participants/intervention /measurements: These analyses were performed on data from a 12-week, double-blind,
Kamphuis, P.J.; Verhey, F.R.J.; Olde Rikkert, M.G.M.; Twisk, J.W.R.; Swinkels, S.H.N.; Scheltens, P.
2011-01-01
OBJECTIVES: To investigate the effect of a medical food (Souvenaid) on body mass index (BMI) and functional abilities in patients with mild Alzheimer's disease (AD). DESIGN/SETTING/PARTICIPANTS/INTERVENTION /MEASUREMENTS: These analyses were performed on data from a 12-week, double-blind,
Principal component regression analysis with SPSS.
Liu, R X; Kuang, J; Gong, Q; Hou, X L
2003-06-01
The paper introduces the indices used in multicollinearity diagnosis, the basic principle of principal component regression, and the method for determining the 'best' equation. The paper uses an example to describe how to carry out principal component regression analysis with SPSS 10.0, covering all calculation steps of the principal component regression and all operations of the linear regression, factor analysis, descriptives, compute variable and bivariate correlations procedures in SPSS 10.0. Principal component regression analysis can be used to overcome the disturbance of multicollinearity, yielding a simpler, faster and more accurate statistical analysis.
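The same computation can be sketched outside SPSS; a minimal numpy version of principal component regression on an illustrative collinear data set:

```python
import numpy as np

def pcr(X, y, k):
    """Principal component regression: standardize X, keep the top-k
    principal components via SVD, regress y on the (orthogonal) component
    scores, and map the coefficients back to the standardized variables.
    Dropping the small-variance components tames multicollinearity."""
    mu, sd = X.mean(axis=0), X.std(axis=0)
    Z = (X - mu) / sd
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    scores = U[:, :k] * s[:k]                        # principal component scores
    gamma = scores.T @ (y - y.mean()) / s[:k] ** 2   # LS on orthogonal scores
    beta_std = Vt[:k].T @ gamma                      # back to standardized X
    return y.mean(), beta_std, mu, sd

rng = np.random.default_rng(3)
t = rng.standard_normal(300)
X = np.column_stack([t + 0.01 * rng.standard_normal(300),  # two nearly
                     t + 0.01 * rng.standard_normal(300),  # collinear columns
                     rng.standard_normal(300)])
y = t + 2.0 * X[:, 2] + 0.05 * rng.standard_normal(300)
icpt, beta_std, mu, sd = pcr(X, y, k=2)
pred = icpt + ((X - mu) / sd) @ beta_std
```

With two nearly identical predictors, ordinary least squares is unstable, but the two retained components carry essentially all the signal.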
Significance testing in ridge regression for genetic data
Directory of Open Access Journals (Sweden)
De Iorio Maria
2011-09-01
Full Text Available Abstract Background Technological developments have increased the feasibility of large scale genetic association studies. Densely typed genetic markers are obtained using SNP arrays, next-generation sequencing technologies and imputation. However, SNPs typed using these methods can be highly correlated due to linkage disequilibrium among them, and standard multiple regression techniques fail with these data sets due to their high dimensionality and correlation structure. There has been increasing interest in using penalised regression in the analysis of high dimensional data. Ridge regression is one such penalised regression technique which does not perform variable selection, instead estimating a regression coefficient for each predictor variable. It is therefore desirable to obtain an estimate of the significance of each ridge regression coefficient. Results We develop and evaluate a test of significance for ridge regression coefficients. Using simulation studies, we demonstrate that the performance of the test is comparable to that of a permutation test, with the advantage of a much-reduced computational cost. We introduce the p-value trace, a plot of the negative logarithm of the p-values of ridge regression coefficients with increasing shrinkage parameter, which enables the visualisation of the change in p-value of the regression coefficients with increasing penalisation. We apply the proposed method to a lung cancer case-control data set from EPIC, the European Prospective Investigation into Cancer and Nutrition. Conclusions The proposed test is a useful alternative to a permutation test for the estimation of the significance of ridge regression coefficients, at a much-reduced computational cost. The p-value trace is an informative graphical tool for evaluating the results of a test of significance of ridge regression coefficients as the shrinkage parameter increases, and the proposed test makes its production computationally feasible.
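A Wald-style sketch of the idea in Python. The paper's actual test statistic and its calibration are more careful, so treat this as an illustration of a ridge p-value trace rather than the proposed test:

```python
import math
import numpy as np

def ridge_pvalues(X, y, lam):
    """Wald-style significance sketch for ridge coefficients:
    beta = (X'X + lam I)^-1 X'y, with approximate covariance
    sigma^2 (X'X + lam I)^-1 X'X (X'X + lam I)^-1, so z = beta / se."""
    n, p = X.shape
    A = np.linalg.inv(X.T @ X + lam * np.eye(p))
    beta = A @ X.T @ y
    H = X @ A @ X.T                          # ridge hat matrix
    df = np.trace(H)                         # effective degrees of freedom
    resid = y - H @ y
    sigma2 = resid @ resid / (n - df)
    se = np.sqrt(np.diag(sigma2 * A @ X.T @ X @ A))
    z = beta / se
    pvals = np.array([math.erfc(abs(zi) / math.sqrt(2.0)) for zi in z])
    return beta, pvals

rng = np.random.default_rng(4)
X = rng.standard_normal((200, 10))
beta_true = np.zeros(10)
beta_true[0] = 1.5                           # one true signal among ten markers
y = X @ beta_true + rng.standard_normal(200)
# p-value trace: significance of the signal coefficient as shrinkage grows
trace = {lam: ridge_pvalues(X, y, lam)[1][0] for lam in (0.1, 1.0, 10.0)}
beta, pvals = ridge_pvalues(X, y, 1.0)
```

Plotting -log(p) for each coefficient against the shrinkage parameter reproduces the p-value trace described in the abstract.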
Comparing parametric and nonparametric regression methods for panel data
DEFF Research Database (Denmark)
Czekaj, Tomasz Gerard; Henningsen, Arne
We investigate and compare the suitability of parametric and non-parametric stochastic regression methods for analysing production technologies and the optimal firm size. Our theoretical analysis shows that the most commonly used functional forms in empirical production analysis, Cobb-Douglas and Translog, are unsuitable for analysing the optimal firm size. We show that the Translog functional form implies an implausible linear relationship between the (logarithmic) firm size and the elasticity of scale, where the slope is artificially related to the substitutability between the inputs. ... The practical applicability of the parametric and non-parametric regression methods is scrutinised and compared by an empirical example: we analyse the production technology and investigate the optimal size of Polish crop farms based on a firm-level balanced panel data set. A nonparametric specification test...
Poisson Mixture Regression Models for Heart Disease Prediction
Erol, Hamza
2016-01-01
Early heart disease control can be achieved by high disease prediction and diagnosis efficiency. This paper focuses on the use of model-based clustering techniques to predict and diagnose heart disease via Poisson mixture regression models. Analysis and application of Poisson mixture regression models is addressed here under two different classes: standard and concomitant variable mixture regression models. Results show that a two-component concomitant variable Poisson mixture regression model predicts heart disease better than both the standard Poisson mixture regression model and the ordinary general linear Poisson regression model, owing to its low Bayesian Information Criterion value. Furthermore, a zero-inflated Poisson mixture regression model turned out to be the best model overall for heart disease prediction, as it both clusters individuals into high- or low-risk categories and predicts the rate of heart disease componentwise given the available clusters. It is deduced that heart disease prediction can be done effectively by identifying the major risks componentwise using Poisson mixture regression models. PMID:27999611
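As a building block for such mixtures, the ordinary log-linear Poisson GLM can be fit with a few Newton steps; the single-covariate data below are synthetic and illustrative:

```python
import numpy as np

def poisson_regression(X, y, iters=30):
    """Newton-Raphson fit of an ordinary log-linear Poisson GLM,
    log E[y|x] = b0 + x'b -- the single-component building block
    that the mixture models combine."""
    Xd = np.column_stack([np.ones(len(X)), X])   # add an intercept column
    beta = np.zeros(Xd.shape[1])
    for _ in range(iters):
        mu = np.exp(Xd @ beta)                   # current fitted rates
        H = Xd.T @ (mu[:, None] * Xd)            # observed information
        beta += np.linalg.solve(H, Xd.T @ (y - mu))
    return beta

rng = np.random.default_rng(5)
x = rng.uniform(-1.0, 1.0, size=(400, 1))        # an illustrative risk covariate
mu = np.exp(0.5 + 1.2 * x[:, 0])
y = rng.poisson(mu).astype(float)
beta = poisson_regression(x, y)
```

A mixture version would run an EM loop around this fit, reweighting observations by their component responsibilities in each M-step.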
Buteau, Stephane; Goldberg, Mark S; Burnett, Richard T; Gasparrini, Antonio; Valois, Marie-France; Brophy, James M; Crouse, Dan L; Hatzopoulou, Marianne
2018-04-01
Persons with congestive heart failure may be at higher risk of the acute effects related to daily fluctuations in ambient air pollution. To address some of the limitations of previous studies using grouped analysis, we developed a cohort study of persons with congestive heart failure to estimate whether daily non-accidental mortality was associated with spatially-resolved, daily exposures to ambient nitrogen dioxide (NO2) and ozone (O3), and whether these associations were modified according to a series of indicators potentially reflecting complications or worsening of health. We constructed the cohort from the linkage of administrative health databases. Daily exposure was assigned from different methods we developed previously to predict spatially-resolved, time-dependent concentrations of ambient NO2 (all year) and O3 (warm season) at participants' residences. We performed two distinct types of analyses: a case-crossover that contrasts the same person at different times, and a nested case-control that contrasts different persons at similar times. We modelled the effects of air pollution and weather (case-crossover only) on mortality using distributed lag nonlinear models over lags 0 to 3 days. We developed from administrative health data a series of indicators that may reflect the underlying construct of "declining health", and used interactions between these indicators and the cross-basis function for the air pollutant to assess potential effect modification. The magnitude of the cumulative as well as the lag-specific estimates of association differed in many instances according to the metric of exposure. Using the back-extrapolation method, which is our preferred exposure model, we found for the case-crossover design a cumulative mean percentage change (MPC) in daily mortality per interquartile increment in NO2 (8.8 ppb) of 3.0% (95% CI: -0.9, 6.9%) and for O3 (16.5 ppb) of 3.5% (95% CI: -4.5, 12.1). For O3 there was strong confounding by weather
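The distributed-lag design over lags 0 to 3 days can be sketched in its simplest, unconstrained linear form (the study itself uses nonlinear cross-basis functions; all data here are synthetic):

```python
import numpy as np

def lag_matrix(x, max_lag=3):
    """Distributed-lag design matrix: column l holds the exposure l days
    earlier; the first max_lag rows are dropped (their lags are undefined)."""
    n = len(x)
    return np.column_stack([x[max_lag - l : n - l] for l in range(max_lag + 1)])

rng = np.random.default_rng(6)
x = rng.standard_normal(1000)                 # daily exposure, standardized
true_lag = np.array([0.5, 0.3, 0.15, 0.05])   # illustrative effects at lags 0-3
L = lag_matrix(x, 3)
y = L @ true_lag + 0.1 * rng.standard_normal(len(L))
design = np.column_stack([np.ones(len(L)), L])
beta, *_ = np.linalg.lstsq(design, y, rcond=None)
cumulative = beta[1:].sum()                   # cumulative effect over lags 0-3
```

Summing the lag coefficients gives the cumulative association over the lag window, the quantity reported as the cumulative MPC in the abstract.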
Learning Inverse Rig Mappings by Nonlinear Regression.
Holden, Daniel; Saito, Jun; Komura, Taku
2017-03-01
We present a framework to design inverse rig-functions: functions that map low level representations of a character's pose, such as joint positions or surface geometry, to the representation used by animators, called the animation rig. Animators design scenes using an animation rig, a framework widely adopted in animation production which allows animators to design character poses and geometry via intuitive parameters and interfaces. Yet most state-of-the-art computer animation techniques control characters through raw, low level representations such as joint angles, joint positions, or vertex coordinates. This difference often stops the adoption of state-of-the-art techniques in animation production. Our framework solves this issue by learning a mapping between the low level representations of the pose and the animation rig. We use nonlinear regression techniques, learning from example animation sequences designed by the animators. When new motions are provided in the skeleton space, the learned mapping is used to estimate the rig controls that reproduce such a motion. We introduce two nonlinear functions for producing such a mapping: Gaussian process regression and feedforward neural networks. The appropriate solution depends on the nature of the rig and the amount of data available for training. We show our framework applied to various examples including articulated biped characters, quadruped characters, facial animation rigs, and deformable characters. With our system, animators have the freedom to apply any motion synthesis algorithm to arbitrary rigging and animation pipelines for immediate editing. This greatly improves the productivity of 3D animation, while retaining the flexibility and creativity of artistic input.
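The first of the two mappings, Gaussian process regression, reduces to a closed-form posterior mean; a toy one-dimensional sketch standing in for pose-to-rig training pairs:

```python
import numpy as np

def gp_posterior_mean(X, y, Xstar, ell=1.0, sf=1.0, noise=1e-2):
    """Gaussian process regression posterior mean with an RBF kernel:
    m(x*) = k(x*, X) [K + noise*I]^-1 y."""
    def k(A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
        return sf ** 2 * np.exp(-0.5 * d2 / ell ** 2)
    K = k(X, X) + noise * np.eye(len(X))       # kernel matrix with jitter
    return k(Xstar, X) @ np.linalg.solve(K, y)

# toy stand-in for pose-to-rig examples: recover a smooth control curve
X = np.linspace(-3.0, 3.0, 40)[:, None]
y = np.sin(X[:, 0])
Xstar = np.array([[0.0], [1.5]])
m = gp_posterior_mean(X, y, Xstar)
```

In the paper's setting, X would hold low-level pose features and y a rig control channel, with one such regression (or a multi-output variant) per control.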
Unbalanced Regressions and the Predictive Equation
DEFF Research Database (Denmark)
Osterrieder, Daniela; Ventosa-Santaulària, Daniel; Vera-Valdés, J. Eduardo
Predictive return regressions with persistent regressors are typically plagued by (asymptotically) biased/inconsistent estimates of the slope, non-standard or potentially even spurious statistical inference, and regression unbalancedness. We alleviate the problem of unbalancedness in the theoreti...
Semiparametric regression during 2003–2007
Ruppert, David; Wand, M.P.; Carroll, Raymond J.
2009-01-01
Semiparametric regression is a fusion between parametric regression and nonparametric regression that integrates low-rank penalized splines, mixed model and hierarchical Bayesian methodology – thus allowing more streamlined handling of longitudinal and spatial correlation. We review progress in the field over the five-year period between 2003 and 2007. We find semiparametric regression to be a vibrant field with substantial involvement and activity, continual enhancement and widespread application.
Gaussian process regression analysis for functional data
Shi, Jian Qing
2011-01-01
Gaussian Process Regression Analysis for Functional Data presents nonparametric statistical methods for functional regression analysis, specifically the methods based on a Gaussian process prior in a functional space. The authors focus on problems involving functional response variables and mixed covariates of functional and scalar variables.Covering the basics of Gaussian process regression, the first several chapters discuss functional data analysis, theoretical aspects based on the asymptotic properties of Gaussian process regression models, and new methodological developments for high dime
Regression Analysis by Example. 5th Edition
Chatterjee, Samprit; Hadi, Ali S.
2012-01-01
Regression analysis is a conceptually simple method for investigating relationships among variables. Carrying out a successful application of regression analysis, however, requires a balance of theoretical results, empirical rules, and subjective judgment. "Regression Analysis by Example, Fifth Edition" has been expanded and thoroughly…
Standards for Standardized Logistic Regression Coefficients
Menard, Scott
2011-01-01
Standardized coefficients in logistic regression analysis have the same utility as standardized coefficients in linear regression analysis. Although there has been no consensus on the best way to construct standardized logistic regression coefficients, there is now sufficient evidence to suggest a single best approach to the construction of a…
A Seemingly Unrelated Poisson Regression Model
King, Gary
1989-01-01
This article introduces a new estimator for the analysis of two contemporaneously correlated endogenous event count variables. This seemingly unrelated Poisson regression model (SUPREME) estimator combines the efficiencies created by single equation Poisson regression model estimators and insights from "seemingly unrelated" linear regression models.
DEFF Research Database (Denmark)
Sharifzadeh, Sara; Skytte, Jacob Lercke; Nielsen, Otto Højager Attermann
2012-01-01
Statistical solutions find widespread use in food and medicine quality control. We investigate the effect of different regression and sparse regression methods for a viscosity estimation problem using the spectro-temporal features from a new Sub-Surface Laser Scattering (SLS) vision system. From ... with sparse LAR, lasso and Elastic Net (EN) sparse regression methods. Due to the inconsistent measurement conditions, Locally Weighted Scatterplot Smoothing (Loess) has been employed to alleviate the undesired variation in the estimated viscosity. The experimental results of applying different methods show...
International Nuclear Information System (INIS)
Takeda, Tatsuoki
1985-01-01
In this article analyses of the MHD stabilities which govern the global behavior of a fusion plasma are described from the viewpoint of numerical computation. First, we describe the high-accuracy calculation of the MHD equilibrium and then the analysis of the linear MHD instability. The former is the basis of the stability analysis and the latter is closely related to the limiting beta value, which is a very important theoretical issue of tokamak research. To attain a stable tokamak plasma with good confinement properties it is necessary to control or suppress disruptive instabilities. We next describe the nonlinear MHD instabilities which relate to the disruption phenomena. Lastly, we describe vectorization of the MHD codes. The above MHD codes for fusion plasma analyses are relatively simple though very time-consuming, and the parts of the codes which need a lot of CPU time are concentrated in a small portion of the codes; moreover, the codes are usually used by the developers of the codes themselves, which makes it comparatively easy to attain a high performance ratio on the vector processor. (author)
Norrie, John; Davidson, Kate; Tata, Philip; Gumley, Andrew
2013-09-01
We investigated the treatment effects reported from a high-quality randomized controlled trial of cognitive behavioural therapy (CBT) for 106 people with borderline personality disorder attending community-based clinics in the UK National Health Service - the BOSCOT trial. Specifically, we examined whether the amount of therapy and therapist competence had an impact on our primary outcome, the number of suicidal acts, using instrumental variables regression modelling. Randomized controlled trial. Participants from across three sites (London, Glasgow, and Ayrshire/Arran) were randomized equally to CBT for personality disorders (CBTpd) plus Treatment as Usual or to Treatment as Usual. Treatment as Usual varied between sites and individuals, but was consistent with routine treatment in the UK National Health Service at the time. CBTpd comprised an average 16 sessions (range 0-35) over 12 months. We used instrumental variable regression modelling to estimate the impact of quantity and quality of therapy received (recording activities and behaviours that took place after randomization) on number of suicidal acts and inpatient psychiatric hospitalization. A total of 101 participants provided full outcome data at 2 years post randomization. The previously reported intention-to-treat (ITT) results showed on average a reduction of 0.91 (95% confidence interval 0.15-1.67) suicidal acts over 2 years for those randomized to CBT. By incorporating the influence of quantity of therapy and therapist competence, we show that this estimate of the effect of CBTpd could be approximately two to three times greater for those receiving the right amount of therapy from a competent therapist. Trials should routinely control for and collect data on both quantity of therapy and therapist competence, which can be used, via instrumental variable regression modelling, to estimate treatment effects for optimal delivery of therapy. Such estimates complement rather than replace the ITT results
Energy Technology Data Exchange (ETDEWEB)
Sugino, Kazuteru [Power Reactor and Nuclear Fuel Development Corp., Oarai, Ibaraki (Japan). Oarai Engineering Center; Iwai, Takehiko
1998-07-01
A standard data base for FBR core nuclear design is under development in order to improve the accuracy of FBR design calculation. As a part of the development, we investigated an improved treatment of double-heterogeneity and a method to calculate homogenized control rod cross sections in a commercial reactor geometry, to improve the analytical accuracy of commercial FBR core characteristics. As an improvement in the treatment of double-heterogeneity, we derived a new method (the direct method) and compared both this and conventional methods with continuous energy Monte-Carlo calculations. In addition, we investigated the applicability of the reaction rate ratio preservation method as an advanced method to calculate homogenized control rod cross sections. The present studies gave the following information: (1) An improved treatment of double-heterogeneity: for criticality the conventional method showed good agreement with the Monte-Carlo result within one sigma standard deviation; the direct method was consistent with the conventional one. Preliminary evaluation of effects on core characteristics other than criticality showed that the effect of sodium void reactivity (coolant reactivity) due to the double-heterogeneity was large. (2) An advanced method to calculate homogenized control rod cross sections: for control rod worths the reaction rate ratio preservation method agreed with those produced by calculations with the control rod heterogeneity included in the core geometry; in Monju control rod worth analysis, the present method overestimated control rod worths by 1 to 2% compared with the conventional method, but these differences were caused by the more accurate model in the present method and it is considered that this method is more reliable than the conventional one. These two methods investigated in this study can be directly applied to core characteristics other than criticality or control rod worth. Thus it is concluded that these methods will
The purpose of this report is to provide a reference manual that could be used by investigators for making informed use of logistic regression using two methods (standard logistic regression and MARS). The details for analyses of relationships between a dependent binary response ...
Bao, Zhenzhou; Li, Dongping; Zhang, Wei; Wang, Yanhui
2015-01-01
School climate is the quality and character of school life and reflects the norms, goals, values, interpersonal relationships, teaching and learning practices, and the organizational structure of a school. There is substantial literature documenting the negative association between positive school climate and adolescent delinquency, but little is known about the moderating and mediating mechanisms underlying this relationship. The aim of this study was to examine whether the direct and indirect pathways between school climate and adolescent delinquency would be moderated by effortful control. A sample of 2,758 Chinese adolescents (M age = 13.53 years, SD = 1.06) from 10 middle schools completed anonymous questionnaires regarding school climate, effortful control, deviant peer affiliation, and delinquency. After gender, age, geographical area, and socioeconomic status were included as covariates, the results revealed that school climate was significantly associated with adolescent delinquent behavior. This direct association was moderated by effortful control, such that the negative relationship between positive school climate and delinquency was only significant among adolescents low in effortful control. Moreover, the indirect association between school climate and delinquency via deviant peer affiliation was also moderated by effortful control. Specifically, the moderating effect of effortful control was not only manifested in the relationship between school climate and deviant peer affiliation, but also in the relationship between deviant peer affiliation and delinquency. These findings contribute to understanding the mechanisms through which positive school climate might reduce delinquent behavior and have important implications for prevention efforts aimed at diminishing adolescent delinquency.
Regression with Sparse Approximations of Data
DEFF Research Database (Denmark)
Noorzad, Pardis; Sturm, Bob L.
2012-01-01
We propose sparse approximation weighted regression (SPARROW), a method for local estimation of the regression function that uses sparse approximation with a dictionary of measurements. SPARROW estimates the regression function at a point with a linear combination of a few regressands selected by a sparse approximation of the point in terms of the regressors. We show SPARROW can be considered a variant of k-nearest neighbors regression (k-NNR), and more generally, local polynomial kernel regression. Unlike k-NNR, however, SPARROW can adapt the number of regressors to use based...
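A simplified sketch of the approach, using orthogonal matching pursuit for the sparse-approximation step. The published method's weighting and normalization details differ; the linear response below just makes the prediction exactly checkable:

```python
import numpy as np

def omp(D, x, k):
    """Orthogonal matching pursuit: greedily select up to k columns of D,
    re-solving least squares on the selected set each round."""
    resid, idx, coef = x.astype(float).copy(), [], np.array([])
    for _ in range(k):
        j = int(np.argmax(np.abs(D.T @ resid)))
        if j not in idx:
            idx.append(j)
        coef, *_ = np.linalg.lstsq(D[:, idx], x, rcond=None)
        resid = x - D[:, idx] @ coef
    return idx, coef

def sparrow_predict(X, y, xq, k):
    """SPARROW-style prediction: sparse-code the query point xq in the
    dictionary of training inputs (the columns of X.T), then combine the
    corresponding regressands with the sparse weights."""
    idx, w = omp(X.T, xq, k)
    return float(np.dot(w, y[idx]))

rng = np.random.default_rng(7)
X = rng.standard_normal((100, 2))
y = 3.0 * X[:, 0] - 2.0 * X[:, 1]          # linear response for an exact check
pred = sparrow_predict(X, y, np.array([1.0, 2.0]), k=2)
```

With a linear response and an exact two-atom decomposition of the query, the weighted combination of regressands reproduces the true value, which is what makes this a sparse cousin of k-NN regression.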
Spontaneous regression of a congenital melanocytic nevus
Directory of Open Access Journals (Sweden)
Amiya Kumar Nath
2011-01-01
Congenital melanocytic nevus (CMN) may rarely regress, which may also be associated with a halo or vitiligo. We describe a 10-year-old girl who presented with a CMN on the left leg, present since birth, which recently started to regress spontaneously with associated depigmentation in the lesion and at a distant site. Dermoscopy performed at different sites of the regressing lesion demonstrated loss of epidermal pigment first, followed by loss of dermal pigment. Histopathology and Masson-Fontana staining demonstrated lymphocytic infiltration and loss of pigment production in the regressing area. Immunohistochemistry staining (S100 and HMB-45), however, showed that nevus cells were present in the regressing areas.
The Use of Nonparametric Kernel Regression Methods in Econometric Production Analysis
DEFF Research Database (Denmark)
Czekaj, Tomasz Gerard
This PhD thesis addresses one of the fundamental problems in applied econometric analysis, namely the econometric estimation of regression functions. The conventional approach to regression analysis is the parametric approach, which requires the researcher to specify the form of the regression… and nonparametric estimations of production functions in order to evaluate the optimal firm size. The second paper discusses the use of parametric and nonparametric regression methods to estimate panel data regression models. The third paper analyses production risk, price uncertainty, and farmers' risk preferences within a nonparametric panel data regression framework. The fourth paper analyses the technical efficiency of dairy farms with environmental output using nonparametric kernel regression in a semiparametric stochastic frontier analysis. The results provided in this PhD thesis show that nonparametric…
Aziz, Kamran M A
2013-09-01
Ramadan fasting is an obligatory duty for Muslims. Unique physiologic and metabolic changes occur during fasting, which require adjustments of diabetes medications. Although challenging, successful fasting can be accomplished if extensive pre-Ramadan education is provided to patients. The current research was conducted to study effective Ramadan fasting with different OHAs/insulins, without significant risk of hypoglycemia, in terms of HbA1c reductions after Ramadan. An ANOVA model was used to assess HbA1c levels among different education statuses. Serum creatinine was used to measure renal function. Pre-Ramadan diabetes education, with alteration of therapy and dosage adjustments for OHAs/insulin, was carried out. Regression models for HbA1c before Ramadan with FBS before sunset were also synthesized as a tool for preventing hypoglycemia and achieving successful Ramadan fasting in the future. Out of 1046 patients, 998 fasted successfully without any episodes of hypoglycemia; 48 patients (4.58%) experienced hypoglycemia. The χ² test for CRD/CKD with hypoglycemia was also significant (p-value … Ramadan diabetes management. Some relevant patents are also outlined in this paper.
Time-trend of melanoma screening practice by primary care physicians: a meta-regression analysis.
Valachis, Antonis; Mauri, Davide; Karampoiki, Vassiliki; Polyzos, Nikolaos P; Cortinovis, Ivan; Koukourakis, Georgios; Zacharias, Georgios; Xilomenos, Apostolos; Tsappi, Maria; Casazza, Giovanni
2009-01-01
To assess whether the proportion of primary care physicians implementing full body skin examination (FBSE) to screen for melanoma changed over time. Meta-regression analyses of available data. MEDLINE, ISI, Cochrane Central Register of Controlled Trials. Fifteen studies surveying 10,336 physicians were included in the analyses. Overall, 15%-82% of them reported performing FBSE to screen for melanoma. The proportion of physicians using FBSE screening tended to decrease by 1.72% per year (P=0.086). Corresponding annual changes in European, North American, and Australian settings were -0.68% (P=0.494), -2.02% (P=0.044), and +2.59% (P=0.010), respectively. Changes were not influenced by national guidelines. Considering the increasing incidence of melanoma and other skin malignancies, as well as their potential consequences, the FBSE implementation time-trend we retrieved should be considered a worrisome phenomenon.
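In its simplest form, a meta-regression of a study-level proportion on calendar year reduces to weighted least squares across studies. A minimal sketch with invented survey data (a real meta-regression would weight by inverse variance and usually add a random effect):

```python
import numpy as np

def wls_slope(years, props, weights):
    """Weighted least-squares slope of a study-level proportion on survey year,
    weighting each study by its sample size (a crude meta-regression)."""
    X = np.column_stack([np.ones(len(years)), years.astype(float)])
    W = np.diag(weights.astype(float))
    beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ props)
    return beta[1]  # estimated annual change in the proportion

# hypothetical surveys: year, proportion of physicians performing FBSE, physicians surveyed
years = np.array([1990, 1995, 2000, 2005])
props = np.array([0.60, 0.55, 0.50, 0.45])
n = np.array([200, 300, 250, 400])
print(wls_slope(years, props, n))  # about -0.01: one percentage point less per year
```

The sign and size of this slope are what the study reports as the annual time-trend; the numbers above are made up for illustration.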
International Nuclear Information System (INIS)
1997-03-01
SCALE, a modular code system for Standardized Computer Analyses for Licensing Evaluation, has been developed by Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission. The SCALE system utilizes well-established computer codes and methods within standard analysis sequences that (1) allow an input format designed for the occasional user and/or novice, (2) automate the data processing and coupling between modules, and (3) provide accurate and reliable results. System development has been directed at problem-dependent cross-section processing and analysis of criticality safety, shielding, heat transfer, and depletion/decay problems. Since the initial release of SCALE in 1980, the code system has been heavily used for evaluation of nuclear fuel facility and package designs. This revision documents Version 4.3 of the system.
Analyses of developmental rate isomorphy in ectotherms: Introducing the Dirichlet regression
Czech Academy of Sciences Publication Activity Database
Boukal S., David; Ditrich, Tomáš; Kutcherov, D.; Sroka, Pavel; Dudová, Pavla; Papáček, M.
2015-01-01
Roč. 10, č. 6 (2015), e0129341 E-ISSN 1932-6203 R&D Projects: GA ČR GAP505/10/0096 Grant - others:European Fund(CZ) PERG04-GA-2008-239543; GA JU(CZ) 145/2013/P Institutional support: RVO:60077344 Keywords : ectotherms Subject RIV: ED - Physiology Impact factor: 3.057, year: 2015 http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0129341
The benefits of using quantile regression for analysing the effect of weeds on organic winter wheat
Casagrande, M.; Makowski, D.; Jeuffroy, M.H.; Valantin-Morison, M.; David, C.
2010-01-01
In organic farming, weeds are one of the threats that limit crop yield. An early prediction of weed effect on yield loss and the size of late weed populations could help farmers and advisors to improve weed management. Numerous studies predicting the effect of weeds on yield have already been
Gilstrap, Donald L.
2013-01-01
In addition to qualitative methods presented in chaos and complexity theories in educational research, this article addresses quantitative methods that may show potential for future research studies. Although much in the social and behavioral sciences literature has focused on computer simulations, this article explores current chaos and…
DEFF Research Database (Denmark)
Scott, Neil W.; Fayers, Peter M.; Aaronson, Neil K.
2010-01-01
Differential item functioning (DIF) methods can be used to determine whether different subgroups respond differently to particular items within a health-related quality of life (HRQoL) subscale, after allowing for overall subgroup differences in that scale. This article reviews issues that arise...
Espeland, Mark A; Carmichael, Owen; Hayden, Kathleen; Neiberg, Rebecca H; Newman, Anne B; Keller, Jeffery N; Wadden, Thomas A; Rapp, Stephen R; Hill, James O; Horton, Edward S; Johnson, Karen C; Wagenknecht, Lynne; Wing, Rena R
2018-03-14
Diabetes adversely impacts cognition. Lifestyle change can improve diabetes control and potentially improve cognition. We examined whether weight loss through reduced caloric intake and increased physical activity was associated with slower cognitive aging in older adults with type 2 diabetes mellitus. The Look AHEAD randomized controlled clinical trial delivered 10 years of intensive lifestyle intervention (ILI) that yielded long-term weight losses. During 5 years spanning the end of intervention and postintervention follow-up, repeated cognitive assessments were obtained in 1,091 individuals who had been assigned to ILI or a control condition of diabetes support and education (DSE). We compared the means and slopes of scores on cognitive testing over these repeated assessments. Compared with DSE, assignment to ILI was associated with a -0.082 SD deficit in mean global cognitive function across repeated assessments (p = .010). However, overweight (body mass index [BMI] memory. The behavioral weight loss intervention was associated with small relative deficits in cognitive function among individuals who were obese and marginally greater cognitive decline overall compared to control. ClinicalTrials.gov Identifier: NCT00017953.
Energy Technology Data Exchange (ETDEWEB)
Boecher Poulsen, K.; Christensen, Torkild; Mortensen, Jan; Runge Kristoffersen, J.; Kristmundsson, E. [Tech-wise A/S (Denmark); Moekbak, T.; Mortensen, Hans Peter [Elsam A/S (Denmark)
2002-12-15
This report focuses on the model-building part of the PSO project 'Control Conditions Bio-boiler'. In the project, which treats dynamic simulation of biomass-fired power plants via the MMS software, a number of straw/wood-chip modules have been built, together with a combustion module which, when connected to the existing MMS modules, can be used for dynamic simulation. Furthermore, a number of continuous and discrete components have been built which, together with the remaining MMS modules, create a complete model of the active part of the control structure under the most common operating conditions. By means of the developed straw, combustion and control modules, a model of Enstedvaerket's bio-boiler is built (without the wood-chip super-heater), including practically all auxiliary control mechanisms. The report discusses the primary problems encountered during development of the model, typically numerical ones. First, it is discussed how a steady-state condition can be reached from a non-physical condition at simulation start-up. After reaching a steady-state condition at full load, the boiler load is reduced by adjusting the desired live-steam flow setpoint value and the load gradient (as it would be done from the control room). While reducing to part load, one straw line is closed down, as it would be done in real life. From part load, the boiler load is in the same way increased back to full load. The simulation results are then compared to measurements made on the same boiler during load reduction and load increase. (BA)
Cancer burden trends in Umbria region using a joinpoint regression
Directory of Open Access Journals (Sweden)
Giuseppe Michele Masanotti
2015-09-01
INTRODUCTION. The analysis of epidemiological data on cancer is an important tool to control and evaluate the outcomes of primary and secondary prevention, the effectiveness of health care and, in general, all cancer control activities. MATERIALS AND METHODS. The aim of this paper is to analyze cancer mortality in the Umbria region from 1978 to 2009 and incidence from 1994 to 2008. Sex- and site-specific trends for standardized rates were analyzed by joinpoint regression, using the Surveillance, Epidemiology, and End Results (SEER) software. RESULTS. Applying the joinpoint analyses by sex and cancer site, to incidence spanning from 1994 to 2008 and mortality from 1978 to 2009 for all sites, both in males and females, a significant joinpoint for mortality was found; moreover, the trend shape was similar and the joinpoint years were very close. In males the standardized rate significantly increased up to 1989 by 1.23% per year and significantly decreased thereafter by -1.31%; among females the mortality rate increased on average by 0.78% (not significant) per year until 1988 and afterward significantly decreased by -0.92% per year. Incidence rates showed different trends between the sexes: in males the rate was practically constant over the period studied (a non-significant increase of 0.14% per year, estimated annual percent change, EAPC), while in females it significantly increased by 1.49% per year up to 2001 and afterward slowly decreased (-0.71%, n.s.). CONCLUSIONS. For all sites combined, mortality trends have decreased since the late '80s, both in males and females; this behaviour is in line with national and European Union data. This work shows that, even compared to health systems that invest more resources, the Umbria public health system achieved good health outcomes.
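Joinpoint regression fits connected line segments and estimates the year where the trend changes. A stripped-down sketch with a single joinpoint, fit by grid search over candidate break years on invented rates (the SEER Joinpoint software additionally selects the number of joinpoints via permutation tests):

```python
import numpy as np

def one_joinpoint_fit(t, y):
    """Fit two connected line segments over a grid of candidate break years and
    return the breakpoint minimizing the residual sum of squares."""
    best_j, best_rss = None, np.inf
    for j in t[1:-1]:  # interior candidate joinpoints
        # continuous piecewise-linear basis: 1, t, and the hinge max(t - j, 0)
        X = np.column_stack([np.ones_like(t), t, np.clip(t - j, 0.0, None)])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        rss = float(np.sum((y - X @ beta) ** 2))
        if rss < best_rss:
            best_j, best_rss = j, rss
    return best_j

# invented standardized rates: rising until 1989, falling thereafter
t = np.arange(1980, 2000, dtype=float)
y = np.where(t <= 1989, t - 1980, 9.0 - 1.3 * (t - 1989))
print(one_joinpoint_fit(t, y))  # 1989.0
```

The segment slopes recovered alongside the breakpoint correspond to the annual percent changes reported before and after the joinpoint year.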
Intermediate and advanced topics in multilevel logistic regression analysis.
Austin, Peter C; Merlo, Juan
2017-09-10
Multilevel data occur frequently in health services, population and public health, and epidemiologic research. In such research, binary outcomes are common. Multilevel logistic regression models allow one to account for the clustering of subjects within higher-level units when estimating the effect of subject and cluster characteristics on subject outcomes. A search of the PubMed database demonstrated that the use of multilevel or hierarchical regression models is increasing rapidly. However, our impression is that many analysts simply use multilevel regression models to account for the nuisance of within-cluster homogeneity that is induced by clustering. In this article, we describe a suite of analyses that can complement the fitting of multilevel logistic regression models. These ancillary analyses permit analysts to estimate the marginal or population-average effect of covariates measured at the subject and cluster level, in contrast to the within-cluster or cluster-specific effects arising from the original multilevel logistic regression model. We describe the interval odds ratio and the proportion of opposed odds ratios, which are summary measures of effect for cluster-level covariates. We describe the variance partition coefficient and the median odds ratio, which are measures of components of variance and heterogeneity in outcomes. These measures allow one to quantify the magnitude of the general contextual effect. We describe an R² measure that allows analysts to quantify the proportion of variation explained by different multilevel logistic regression models. We illustrate the application and interpretation of these measures by analyzing mortality in patients hospitalized with a diagnosis of acute myocardial infarction. © 2017 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd.
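Two of the measures described, the variance partition coefficient and the median odds ratio, are simple closed-form functions of the cluster-level intercept variance in a random-intercept logistic model. A sketch with a hypothetical between-hospital variance:

```python
import math

def vpc(sigma2):
    """Variance partition coefficient of a random-intercept logistic model:
    cluster-level share of latent-scale variance (level-1 residual variance = pi^2/3)."""
    return sigma2 / (sigma2 + math.pi ** 2 / 3)

def median_odds_ratio(sigma2):
    """Median odds ratio between a randomly chosen higher- and lower-risk cluster
    for two individuals with identical covariates; 0.6745 ~ 75th normal percentile."""
    return math.exp(math.sqrt(2 * sigma2) * 0.6745)

sigma2 = 0.5  # hypothetical between-hospital intercept variance
print(round(vpc(sigma2), 3))                # 0.132
print(round(median_odds_ratio(sigma2), 2))  # 1.96
```

So with a cluster variance of 0.5, about 13% of latent variation sits between clusters, and the median hospital-to-hospital odds ratio for identical patients is roughly 2, quantifying the general contextual effect.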
Energy Technology Data Exchange (ETDEWEB)
Dolwick, P.D. [Lake Michigan Air Directors Consortium, Des Plaines, IL (United States); Kaleel, R.J. [Illinois Environmental Protection Agency, Springfield, IL (United States); Majewski, M.A. [Wisconsin Dept. of Natural Resources, Madison, WI (United States)
1994-12-31
The four states that border Lake Michigan are cooperatively applying a state-of-the-art nested photochemical grid model to assess the effects of potential emission control strategies on reducing elevated tropospheric ozone concentrations in the region to levels below the national ambient air quality standard. In order to provide an extensive database to support the application of the photochemical model, a substantial data collection effort known as the Lake Michigan Ozone Study (LMOS) was completed during the summer of 1991. The Lake Michigan Ozone Control Program (LMOP) was established by the States of Illinois, Wisconsin, Michigan, and Indiana to carry out the application of the modeling system developed from the LMOS, in terms of developing the attainment demonstrations required from this area by the Clean Air Act Amendments of 1990.
Sammler, Svenja; Ketmaier, Valerio; Havenstein, Katja; Krause, Ulrike; Curio, Eberhard; Tiedemann, Ralph
2012-01-01
Background: The Visayan Tarictic Hornbill (Penelopides panini) and the Walden's Hornbill (Aceros waldeni) are two threatened hornbill species endemic to the western islands of the Visayas, which constitute the central island group of the Philippine archipelago, between Luzon and Mindanao. In order to evaluate their genetic diversity and to support efforts towards their conservation, we analyzed genetic variation in ~600 base pairs (bp) of the mitochondrial control region I and at 12...
Applied regression analysis: a research tool
Pantula, Sastry; Dickey, David
1998-01-01
Least squares estimation, when used appropriately, is a powerful research tool. A deeper understanding of the regression concepts is essential for achieving optimal benefits from a least squares analysis. This book builds on the fundamentals of statistical methods and provides appropriate concepts that will allow a scientist to use least squares as an effective research tool. Applied Regression Analysis is aimed at the scientist who wishes to gain a working knowledge of regression analysis. The basic purpose of this book is to develop an understanding of least squares and related statistical methods without becoming excessively mathematical. It is the outgrowth of more than 30 years of consulting experience with scientists and many years of teaching an applied regression course to graduate students. Applied Regression Analysis serves as an excellent text for a service course on regression for non-statisticians and as a reference for researchers. It also provides a bridge between a two-semester introduction to...
Directory of Open Access Journals (Sweden)
Jobaida Akther
2016-01-01
Glutathione S-transferases (GSTs) belong to a group of multigene detoxification enzymes, which defend cells against oxidative stress. Tannery workers are at risk of oxidative damage that is usually detoxified by GSTs. This study investigated the genotypic frequencies of GST Mu1 (GSTM1) and GST Theta1 (GSTT1) in Bangladeshi tannery workers and healthy controls followed by their status of oxidative stress and total GST activity. Of the 188 individuals, 50.0% had both GSTM1 and GSTT1 (+/+), 12.2% had GSTM1 (+/−), 31.4% had GSTT1 (−/+) alleles, and 6.4% had null genotypes (−/−) with respect to both GSTM1 and GSTT1 alleles. Among 109 healthy controls, 54.1% were double positive, 9.2% had GSTM1 allele, 32.1% had GSTT1 allele, and 4.6% had null genotypes. Out of 79 tannery workers, 44.3% were +/+, 16.8% were +/−, 30.5% were −/+, and 8.4% were −/−. Though the polymorphic genotypes or allelic variants of GSTM1 and GSTT1 were distributed among the study subjects with different frequencies, the differences between the study groups were not statistically significant. GST activity did not vary significantly between the two groups and also among different genotypes while level of lipid peroxidation was significantly higher in tannery workers compared to controls irrespective of their GST genotypes.
Akther, Jobaida; Ebihara, Akio; Nakagawa, Tsutomu; Islam, Laila N.; Suzuki, Fumiaki; Hosen, Md. Ismail; Hossain, Mahmud; Nabi, A. H. M. Nurun
2016-01-01
Glutathione S-transferases (GSTs) belong to a group of multigene detoxification enzymes, which defend cells against oxidative stress. Tannery workers are at risk of oxidative damage that is usually detoxified by GSTs. This study investigated the genotypic frequencies of GST Mu1 (GSTM1) and GST Theta1 (GSTT1) in Bangladeshi tannery workers and healthy controls followed by their status of oxidative stress and total GST activity. Of the 188 individuals, 50.0% had both GSTM1 and GSTT1 (+/+), 12.2% had GSTM1 (+/−), 31.4% had GSTT1 (−/+) alleles, and 6.4% had null genotypes (−/−) with respect to both GSTM1 and GSTT1 alleles. Among 109 healthy controls, 54.1% were double positive, 9.2% had GSTM1 allele, 32.1% had GSTT1 allele, and 4.6% had null genotypes. Out of 79 tannery workers, 44.3% were +/+, 16.8% were +/−, 30.5% were −/+, and 8.4% were −/−. Though the polymorphic genotypes or allelic variants of GSTM1 and GSTT1 were distributed among the study subjects with different frequencies, the differences between the study groups were not statistically significant. GST activity did not vary significantly between the two groups and also among different genotypes while level of lipid peroxidation was significantly higher in tannery workers compared to controls irrespective of their GST genotypes. PMID:27294127
Regression models of reactor diagnostic signals
International Nuclear Information System (INIS)
Vavrin, J.
1989-01-01
The application of an autoregressive model, the simplest regression model of diagnostic signals, is described for the experimental analysis of diagnostic systems and for in-service monitoring of normal and anomalous conditions and their diagnostics. A diagnostic method using a regression-type diagnostic database and regression spectral diagnostics is described, and applied to the diagnosis of neutron noise signals from anomalous modes in the experimental fuel assembly of a reactor. (author)
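The abstract does not detail the fitting procedure; as a generic illustration, an autoregressive model can be fitted to a signal by least squares (invented noiseless data, not reactor measurements):

```python
import numpy as np

def fit_ar(signal, order=1):
    """Least-squares fit of an AR(p) model x[t] = a1*x[t-1] + ... + ap*x[t-p] + e[t];
    returns the estimated coefficients a1..ap."""
    p, n = order, len(signal)
    X = np.column_stack([signal[p - k : n - k] for k in range(1, p + 1)])
    coef, *_ = np.linalg.lstsq(X, signal[p:], rcond=None)
    return coef

# invented noiseless AR(1) signal with coefficient 0.8
x = np.empty(50)
x[0] = 1.0
for t in range(1, 50):
    x[t] = 0.8 * x[t - 1]
print(fit_ar(x, order=1))  # recovers [0.8]
```

Anomalies can then be flagged when the fitted coefficients, or the spectrum they imply, drift away from the values estimated under normal operating conditions.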
Bulcock, J. W.
The problem of model estimation when the data are collinear was examined. Though ridge regression (RR) outperforms ordinary least squares (OLS) regression in the presence of acute multicollinearity, it is not a problem-free technique for reducing the variance of the estimates. It is a stochastic procedure when it should be nonstochastic and it…
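The OLS/ridge contrast can be made concrete with the closed-form ridge estimator; a small invented example (lam = 0 recovers OLS):

```python
import numpy as np

def ridge(X, y, lam):
    """Closed-form ridge estimator (X'X + lam*I)^(-1) X'y; lam = 0 gives OLS,
    lam > 0 shrinks coefficients and stabilizes collinear designs."""
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

X = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
y = np.array([1.0, 1.0, 2.0])
print(ridge(X, y, lam=0.0))  # OLS solution [1. 1.]
print(ridge(X, y, lam=1.0))  # shrunk toward zero (about [0.75 0.75])
```

The "stochastic" criticism in the abstract concerns how lam is chosen in practice: it is usually estimated from the same data, which makes the resulting estimator data-dependent rather than fixed in advance.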
Multivariate Regression Analysis and Slaughter Livestock
(*AGRICULTURE, *ECONOMICS), (*MEAT, PRODUCTION), MULTIVARIATE ANALYSIS, REGRESSION ANALYSIS, ANIMALS, WEIGHT, COSTS, PREDICTIONS, STABILITY, MATHEMATICAL MODELS, STORAGE, BEEF, PORK, FOOD, STATISTICAL DATA, ACCURACY
[From clinical judgment to linear regression model].
Palacios-Cruz, Lino; Pérez, Marcela; Rivas-Ruiz, Rodolfo; Talavera, Juan O
2013-01-01
When we think about mathematical models, such as the linear regression model, we think that these terms are only used by those engaged in research, a notion that is far from the truth. Legendre described the first mathematical model in 1805, and Galton introduced the formal term in 1886. Linear regression is one of the most commonly used regression models in clinical practice. It is useful to predict or show the relationship between two or more variables as long as the dependent variable is quantitative and has a normal distribution. Stated another way, regression is used to predict a measure based on the knowledge of at least one other variable. Linear regression has as its first objective to determine the slope or inclination of the regression line: Y = a + bx, where "a" is the intercept or regression constant, equivalent to the value of "Y" when "X" equals 0, and "b" (also called the slope) indicates the increase or decrease that occurs when the variable "x" increases or decreases by one unit. In the regression line, "b" is called the regression coefficient. The coefficient of determination (R²) indicates the importance of the independent variables in the outcome.
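The slope, intercept and coefficient of determination described above can be computed directly from their definitions (the data below are invented for illustration):

```python
def linreg(xs, ys):
    """Ordinary least squares for Y = a + b*X: returns intercept a, slope b,
    and the coefficient of determination R^2."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (yv - my) for x, yv in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    b = sxy / sxx                      # slope: change in Y per unit change in X
    a = my - b * mx                    # intercept: Y when X = 0
    ss_res = sum((yv - (a + b * x)) ** 2 for x, yv in zip(xs, ys))
    ss_tot = sum((yv - my) ** 2 for yv in ys)
    return a, b, 1.0 - ss_res / ss_tot

# invented data: each unit of X adds roughly 2 to Y
a, b, r2 = linreg([1, 2, 3, 4], [2.1, 3.9, 6.0, 8.0])
print(b)   # about 1.98
print(r2)  # close to 1: X explains nearly all variation in Y
```

Here "b" is the regression coefficient of the text, and R² close to 1 means the independent variable accounts for almost all of the variation in the outcome.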
Directory of Open Access Journals (Sweden)
Lucila Coelho Pamplona
2004-12-01
Royal jelly (RJ) is used as a revitalizing tonic. To avoid rejection of its acid taste, it is added to honey. There are regulations for honey and for royal jelly separately, but not for the mixture. The objective of this work is therefore to verify whether the same methods used for quality control of pure honey can be used for honey mixed with royal jelly, and to verify the presence of RJ through 10-HDA determination. The methods used were: moisture, reducing sugars, apparent sucrose, ash, hydroxymethylfurfural, insoluble solids, diastase activity, acidity and 10-HDA. Samples were prepared by adding 0-100% of RJ to honey. The results showed that the ash method was the only one suitable for all the samples. The acidity analysis (direct titration) was suitable for 0-30% RJ samples; the reducing-sugar analysis was suitable for 0-20% RJ samples. Concerning moisture analysis, the refractometric method is suitable for 0-10% RJ and the infrared method is suggested for samples with more than 10% RJ. The methods for diastase activity, HMF, apparent sucrose and insoluble solids were inadequate for all samples with RJ. The presence of RJ in the samples was confirmed by the 10-HDA analyses.
Mainou, Maria; Madenidou, Anastasia-Vasiliki; Liakos, Aris; Paschos, Paschalis; Karagiannis, Thomas; Bekiari, Eleni; Vlachaki, Efthymia; Wang, Zhen; Murad, Mohammad Hassan; Kumar, Shaji; Tsapas, Apostolos
2017-06-01
We performed a systematic review and meta-regression analysis of randomized control trials to investigate the association between response to initial treatment and survival outcomes in patients with newly diagnosed multiple myeloma (MM). Response outcomes included complete response (CR) and the combined outcome of CR or very good partial response (VGPR), while survival outcomes were overall survival (OS) and progression-free survival (PFS). We used random-effect meta-regression models and conducted sensitivity analyses based on definition of CR and study quality. Seventy-two trials were included in the systematic review, 63 of which contributed data in meta-regression analyses. There was no association between OS and CR in patients without autologous stem cell transplant (ASCT) (regression coefficient: .02, 95% confidence interval [CI] -0.06, 0.10), in patients undergoing ASCT (-.11, 95% CI -0.44, 0.22) and in trials comparing ASCT with non-ASCT patients (.04, 95% CI -0.29, 0.38). Similarly, OS did not correlate with the combined metric of CR or VGPR, and no association was evident between response outcomes and PFS. Sensitivity analyses yielded similar results. This meta-regression analysis suggests that there is no association between conventional response outcomes and survival in patients with newly diagnosed MM. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Abe, Masatoshi; Nishigori, Chikako; Torii, Hideshi; Ihn, Hironobu; Ito, Kei; Nagaoka, Makoto; Isogawa, Naoki; Kawaguchi, Isao; Tomochika, Yukiko; Kobayashi, Mihoko; Tallman, Anna M; Papp, Kim A
2017-11-01
Tofacitinib is an oral Janus kinase inhibitor. These post-hoc analyses assessed tofacitinib efficacy and safety in Japanese patients with psoriasis enrolled in a 52-week global phase 3 study. Patients received tofacitinib 5 mg, tofacitinib 10 mg or placebo twice daily (b.i.d.); placebo-treated patients advanced to tofacitinib at week 16. Primary efficacy end-points were the proportions of patients with 75% or more reduction from baseline Psoriasis Area and Severity Index (PASI-75) and Physician's Global Assessment (PGA) of "clear" or "almost clear" (PGA response) at week 16. Other end-points included: Itch Severity Item (ISI), Dermatology Life Quality Index (DLQI) score and Nail Psoriasis Severity Index (NAPSI). Adverse events (AEs) were recorded throughout the study. Overall, 58 Japanese patients were included in this analysis (tofacitinib 5 mg b.i.d., n = 22; 10 mg b.i.d., n = 24; placebo, n = 12); 29 completed the study. At week 16, significantly more patients receiving tofacitinib 5 and 10 mg b.i.d. versus placebo achieved PASI-75 (50% and 75% vs 0%, P tofacitinib doses. Over 52 weeks, similar rates of AEs were reported across treatment groups; one serious AE occurred with tofacitinib 10 mg b.i.d. Herpes zoster occurred in three patients receiving tofacitinib 10 mg b.i.d. No deaths, serious infections, malignancies or gastrointestinal perforations were reported. Results were generally consistent with global analysis, suggesting sustained efficacy and a manageable safety profile, with increased herpes zoster incidence, of tofacitinib in Japanese patients with psoriasis. © 2017 The Authors. The Journal of Dermatology published by John Wiley & Sons Australia, Ltd on behalf of Japanese Dermatological Association.
Regression modeling methods, theory, and computation with SAS
Panik, Michael
2009-01-01
Regression Modeling: Methods, Theory, and Computation with SAS provides an introduction to a diverse assortment of regression techniques using SAS to solve a wide variety of regression problems. The author fully documents the SAS programs and thoroughly explains the output produced by the programs.The text presents the popular ordinary least squares (OLS) approach before introducing many alternative regression methods. It covers nonparametric regression, logistic regression (including Poisson regression), Bayesian regression, robust regression, fuzzy regression, random coefficients regression,
International Nuclear Information System (INIS)
Moraru, Carmen Maria; Stefan, Iuliana; Balteanu, Ovidiu; Bucur, Ciprian; Stefan, Liviu; Bornea, Anisia; Stefanescu, Ioan
2008-01-01
The system responds to the monitoring requirements of technological processes specific to nuclear installations that process radioactive substances, where technological failure has severe consequences, as is the case with a tritium processing nuclear plant. The large amount of data that needs to be processed, stored and accessed for real-time simulation and optimization demands a virtual technological platform where the data acquisition, control and analysis systems of the technological process can be integrated with an advanced technological monitoring system. Thus, the integrated computing and monitoring systems needed to supervise the technological process will be implemented, followed by optimization of the system through new, high-performing methods appropriate to the technological processes within tritium removal processing nuclear plants. (authors)
DNBR Prediction Using a Support Vector Regression
International Nuclear Information System (INIS)
Yang, Heon Young; Na, Man Gyun
2008-01-01
PWRs (pressurized water reactors) generally operate in the nucleate boiling state. However, the conversion of nucleate boiling into film boiling, with its conspicuously reduced heat transfer, induces a boiling crisis that may ultimately cause the fuel cladding to melt. This type of boiling crisis is called Departure from Nucleate Boiling (DNB). Because predicting the minimum DNBR in a reactor core is very important for preventing a boiling crisis such as clad melting, a lot of research has been conducted to predict DNBR values. The objective of this research is to predict the minimum DNBR by applying support vector regression (SVR) to the measured signals of a reactor coolant system (RCS). SVR has been extensively and successfully applied to nonlinear function approximation problems such as this one, where the DNBR is a function of various input variables such as reactor power, reactor pressure, core mass flowrate, control rod positions and so on. The minimum DNBR in a reactor core is predicted using these operating-condition data as inputs to the SVR. The minimum DNBR values predicted by the SVR confirm its correctness when compared with COLSS values.
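SVR training itself requires a quadratic-programming solver; as a rough stand-in, here is the closely related RBF kernel ridge regression, which performs the same kind of kernel-based nonlinear function approximation (toy 1-D data, not reactor signals; all names are invented):

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """Gaussian (RBF) kernel matrix between the rows of A and the rows of B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

def kr_fit(X, y, gamma=1.0, lam=1e-3):
    """Kernel ridge regression: solve (K + lam*I) alpha = y for dual weights."""
    K = rbf_kernel(X, X, gamma)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def kr_predict(X_train, alpha, X_new, gamma=1.0):
    return rbf_kernel(X_new, X_train, gamma) @ alpha

# toy 1-D regression target: values of sin at a few training points
X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.sin(X).ravel()
alpha = kr_fit(X, y)
print(kr_predict(X, alpha, np.array([[1.5]])))  # near sin(1.5), i.e. about 1.0
```

In the application described, the training rows would instead hold operating-condition signals (power, pressure, flowrate, rod positions) and the target the minimum DNBR.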
Duan, Bao-Zhong; Wang, Ya-Ping; Fang, Hai-Lan; Xiong, Chao; Li, Xi-Wen; Wang, Ping; Chen, Shi-Lin
2018-01-01
Rhizoma Paridis (Chonglou) is a commonly used and precious traditional Chinese medicine. Paris polyphylla Smith var. yunnanensis (Franch.) Hand.-Mazz. and Paris polyphylla Smith var. chinensis (Franch.) Hara are the two main sources of Chonglou under the monograph of Rhizoma Paridis in the Chinese Pharmacopoeia. In the local marketplace, however, this medicine is prone to being accidentally contaminated, deliberately substituted or admixed with other species that are similar to Rhizoma Paridis in shape and color. Consequently, these adulterations might compromise quality control and raise considerable health concerns for consumers. This study aims to develop a rapid and sensitive method for accurate identification of Rhizoma Paridis and its common adulterants. DNA barcoding coupled with high resolution melting (HRM) analysis was applied to distinguish Rhizoma Paridis from its adulterants. The internal transcribed spacer 2 (ITS2) barcode was selected for HRM analysis to produce standard melting profiles of the selected species. DNA of the tested herbal medicines was isolated, and their melting profiles were generated and compared with the standard melting profile of P. polyphylla var. chinensis. The results indicate that the ITS2 molecular region coupled with HRM analysis can effectively differentiate nine herbal species, including the two authentic origins of Chonglou and their seven common adulterants. Ten herbal medicines labeled "Chonglou" obtained from a local market were collected and identified with our methods, and their sequence information was analyzed to validate the accuracy of the HRM analysis. DNA barcoding coupled with HRM analysis is an accurate, reliable, rapid, cost-effective and robust tool, which could contribute to the quality control of Rhizoma Paridis in the supply chain of the natural health product (NHP) industry.
International Nuclear Information System (INIS)
Lee, Lucille N.; Barnswell, Carlton; Torre, Taryn; Fearn, Paul; Kattan, Michael; Potters, Louis
2002-01-01
Purpose: To compare PSA relapse-free survival (PSA-RFS) between African-American (AA) and white American (WA) males treated with permanent prostate brachytherapy (PPB) for clinically localized prostate cancer. Methods and materials: One thousand eighty-one consecutive patients, including 246 African-Americans, underwent PPB with 103Pd or 125I, alone or with external beam radiation therapy between September 1992 and September 1999. Computer-generated matching was performed to create two identical cohorts of WA and AA males, based on the use of neoadjuvant androgen ablation (NAAD), pretreatment PSA, and Gleason score. Presenting characteristics were used to define risk groups, as follows: Low risk had PSA ≤10 and Gleason score ≤6, intermediate risk had PSA >10 or Gleason score ≥7, and high risk had PSA >10 and Gleason score ≥7. PSA-RFS was calculated using the Kattan modification of the ASTRO definition, and the log-rank test was used to compare Kaplan-Meier PSA-RFS curves. Univariate and multivariate analyses were performed to determine predictors of PSA-RFS. Results: Overall, univariate analysis revealed that AA males at presentation had lower disease stage (p=0.01), had lower Gleason scores (p=0.017), were younger (p=0.001), and were more likely to receive NAAD (p=0.001) than their WA counterparts. There were no differences in pretreatment PSA, isotope selection, use of external beam radiation therapy, median follow-up, or risk group classification between AA and WA males. Pretreatment PSA and Gleason score were significant predictors of PSA-RFS in multivariate analysis, and race was not significant. There was no significant difference between the 5-year PSA-RFS for AA males (84.0%) and the matched cohort of WA males (81.2%) (p=0.384). Race was not a predictor of 5-year PSA-RFS among patients treated with or without NAAD and within low-, intermediate-, and high-risk groups. Conclusion: Race is not an independent predictor of 5-year PSA-RFS in patients
International Nuclear Information System (INIS)
Janssen, I.; Stebbings, J.H.
1990-01-01
In environmental epidemiology, trace and toxic substance concentrations frequently have very highly skewed distributions ranging over one or more orders of magnitude, and prediction by conventional regression is often poor. Classification and Regression Tree Analysis (CART) is an alternative in such contexts. To compare the techniques, two Pennsylvania data sets and three independent variables are used: house radon progeny (RnD) and gamma levels as predicted by construction characteristics in 1330 houses; and ∼200 house radon (Rn) measurements as predicted by topographic parameters. CART may identify structural variables of interest not identified by conventional regression, and vice versa, but in general the regression models are similar. CART has major advantages in dealing with other common characteristics of environmental data sets, such as missing values, continuous variables requiring transformations, and large sets of potential independent variables. CART is most useful in the identification and screening of independent variables, greatly reducing the need for cross-tabulations and nested breakdown analyses. There is no need to discard cases with missing values for the independent variables because surrogate variables are intrinsic to CART. The tree-structured approach is also independent of the scale on which the independent variables are measured, so that transformations are unnecessary. CART identifies important interactions as well as main effects. The major advantages of CART appear to be in exploring data. Once the important variables are identified, conventional regression seems to lead to similar results that are more interpretable by most audiences. 12 refs., 8 figs., 10 tabs
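A minimal CART-style regression tree, grown by greedy variance reduction as the abstract describes, can be sketched as follows. The data are synthetic (a step-like, skewed response of the kind the abstract motivates), and no claim is made about the CART software actually used in the study; surrogate-variable handling of missing data is omitted for brevity.

```python
import numpy as np

def build_tree(X, y, depth=0, max_depth=3, min_leaf=5):
    """Grow a tiny CART-style regression tree by greedy variance reduction."""
    if depth == max_depth or len(y) < 2 * min_leaf:
        return float(y.mean())                     # leaf: predict the mean
    best = None
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j])[1:]:
            left = X[:, j] < t
            if left.sum() < min_leaf or (~left).sum() < min_leaf:
                continue
            # split cost = summed squared error in the two children
            cost = ((y[left] - y[left].mean()) ** 2).sum() + \
                   ((y[~left] - y[~left].mean()) ** 2).sum()
            if best is None or cost < best[0]:
                best = (cost, j, t, left)
    if best is None:
        return float(y.mean())
    _, j, t, left = best
    return (j, t,
            build_tree(X[left], y[left], depth + 1, max_depth, min_leaf),
            build_tree(X[~left], y[~left], depth + 1, max_depth, min_leaf))

def predict(node, x):
    while isinstance(node, tuple):                 # descend until a leaf value
        j, t, lo, hi = node
        node = lo if x[j] < t else hi
    return node

rng = np.random.default_rng(1)
X = rng.uniform(size=(300, 2))
# Step-like response: the kind of structure trees capture and OLS misses
y = np.where(X[:, 0] > 0.5, 10.0, 1.0) + rng.normal(0, 0.1, 300)
tree = build_tree(X, y)
pred = np.array([predict(tree, x) for x in X])
r2 = 1 - ((y - pred) ** 2).sum() / ((y - y.mean()) ** 2).sum()
print(f"tree R^2: {r2:.3f}")
```

Note that the tree never transforms the predictors: as the abstract says, the splits depend only on the ordering of each variable, so the fit is scale-invariant.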
RAWS II: A Multiple Regression Analysis Program
This memorandum gives instructions for the use and operation of a revised version of RAWS, a multiple regression analysis program. The program...of preprocessed data, the directed retention of variables, the listing of the matrix of the normal equations and its inverse, and the bypassing of the regression analysis to provide the input variable statistics only. (Author)
Hierarchical Regression Analysis in Structural Equation Modeling
de Jong, P.F.
1999-01-01
In a hierarchical or fixed-order regression analysis, the independent variables are entered into the regression equation in a prespecified order. Such an analysis is often performed when the extra amount of variance accounted for in a dependent variable by a specific independent variable is the main
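The fixed-order entry described here amounts to comparing R-squared before and after a prespecified predictor enters the equation. A small sketch with synthetic data; the variable roles (a control entered first, a focal predictor entered second) are illustrative assumptions only:

```python
import numpy as np

def r_squared(X, y):
    # OLS fit with intercept; return R^2
    Z = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
    resid = y - Z @ beta
    return 1 - (resid ** 2).sum() / ((y - y.mean()) ** 2).sum()

rng = np.random.default_rng(2)
n = 500
x1 = rng.normal(size=n)                 # entered first (e.g. a control variable)
x2 = 0.6 * x1 + rng.normal(size=n)      # entered second; partly redundant with x1
y = 1.0 + 0.8 * x1 + 0.5 * x2 + rng.normal(size=n)

r2_step1 = r_squared(x1[:, None], y)
r2_step2 = r_squared(np.column_stack([x1, x2]), y)
delta_r2 = r2_step2 - r2_step1          # extra variance credited to x2
print(f"step 1 R^2 = {r2_step1:.3f}, step 2 R^2 = {r2_step2:.3f}, "
      f"delta R^2 = {delta_r2:.3f}")
```

Because x2 is correlated with x1, its delta R-squared is smaller than its zero-order contribution would suggest; this is exactly why the entry order matters in a hierarchical analysis.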
Categorical regression dose-response modeling
The goal of this training is to provide participants with instruction in the use of the U.S. EPA's Categorical Regression software (CatReg) and its application to risk assessment. Categorical regression fits mathematical models to toxicity data that have been assigned ord...
Variable importance in latent variable regression models
Kvalheim, O.M.; Arneberg, R.; Bleie, O.; Rajalahti, T.; Smilde, A.K.; Westerhuis, J.A.
2014-01-01
The quality and practical usefulness of a regression model are a function of both interpretability and prediction performance. This work presents some new graphical tools for improved interpretation of latent variable regression models that can also assist in improved algorithms for variable
Stepwise versus Hierarchical Regression: Pros and Cons
Lewis, Mitzi
2007-01-01
Multiple regression is commonly used in social and behavioral data analysis. In multiple regression contexts, researchers are very often interested in determining the "best" predictors in the analysis. This focus may stem from a need to identify those predictors that are supportive of theory. Alternatively, the researcher may simply be interested…
Suppression Situations in Multiple Linear Regression
Shieh, Gwowen
2006-01-01
This article proposes alternative expressions for the two most prevailing definitions of suppression without resorting to the standardized regression modeling. The formulation provides a simple basis for the examination of their relationship. For the two-predictor regression, the author demonstrates that the previous results in the literature are…
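A two-predictor suppression situation of the kind analyzed above can be demonstrated numerically: below, x2 is (nearly) uncorrelated with the criterion y, yet adding it to the regression raises R-squared well beyond the sum of the squared zero-order correlations. The construction is a standard textbook example, not the author's formulation:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 20000
t = rng.normal(size=n)          # criterion-relevant part of x1
s = rng.normal(size=n)          # criterion-irrelevant part of x1
x1 = t + s
x2 = s                          # classical suppressor: correlates with x1, not with y
y = t + rng.normal(size=n)

def r2(X, y):
    # OLS R^2 with intercept
    Z = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
    r = y - Z @ beta
    return 1 - (r ** 2).sum() / ((y - y.mean()) ** 2).sum()

r_y1 = np.corrcoef(y, x1)[0, 1]
r_y2 = np.corrcoef(y, x2)[0, 1]
r2_both = r2(np.column_stack([x1, x2]), y)
print(f"r(y,x1)^2 = {r_y1**2:.3f}, r(y,x2)^2 = {r_y2**2:.3f}, "
      f"R^2(x1,x2) = {r2_both:.3f}")
```

Here x2 "suppresses" the criterion-irrelevant variance in x1: jointly the two predictors recover t = x1 - x2 exactly, doubling the explained variance relative to x1 alone.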
Gibrat’s law and quantile regressions
DEFF Research Database (Denmark)
Distante, Roberta; Petrella, Ivan; Santoro, Emiliano
2017-01-01
The nexus between firm growth, size and age in U.S. manufacturing is examined through the lens of quantile regression models. This methodology allows us to overcome serious shortcomings entailed by linear regression models employed by much of the existing literature, unveiling a number of important...
Regression Analysis and the Sociological Imagination
De Maio, Fernando
2014-01-01
Regression analysis is an important aspect of most introductory statistics courses in sociology but is often presented in contexts divorced from the central concerns that bring students into the discipline. Consequently, we present five lesson ideas that emerge from a regression analysis of income inequality and mortality in the USA and Canada.
Repeated Results Analysis for Middleware Regression Benchmarking
Czech Academy of Sciences Publication Activity Database
Bulej, Lubomír; Kalibera, T.; Tůma, P.
2005-01-01
Roč. 60, - (2005), s. 345-358 ISSN 0166-5316 R&D Projects: GA ČR GA102/03/0672 Institutional research plan: CEZ:AV0Z10300504 Keywords : middleware benchmarking * regression benchmarking * regression testing Subject RIV: JD - Computer Applications, Robotics Impact factor: 0.756, year: 2005
Principles of Quantile Regression and an Application
Chen, Fang; Chalhoub-Deville, Micheline
2014-01-01
Newer statistical procedures are typically introduced to help address the limitations of those already in practice or to deal with emerging research needs. Quantile regression (QR) is introduced in this paper as a relatively new methodology, which is intended to overcome some of the limitations of least squares mean regression (LMR). QR is more…
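QR's defining idea, minimizing the asymmetric pinball (check) loss rather than squared error, can be sketched directly. The fit below uses plain subgradient descent for transparency (production QR solvers use linear programming or interior-point methods) on synthetic heteroscedastic data, so the upper-quantile slope should exceed the lower-quantile slope, something LMR's single mean line cannot show:

```python
import numpy as np

def quantile_fit(x, y, tau, iters=5000, lr=0.05):
    """Linear quantile regression via subgradient descent on the pinball loss."""
    a, b = 0.0, 0.0
    for _ in range(iters):
        r = y - (a + b * x)
        g = np.where(r > 0, -tau, 1.0 - tau)   # subgradient of rho_tau at each point
        a -= lr * g.mean()
        b -= lr * (g * x).mean()
    return a, b

rng = np.random.default_rng(4)
n = 2000
x = rng.uniform(0, 2, n)
y = 1.0 + 2.0 * x + (0.5 + x) * rng.normal(size=n)   # noise spread grows with x

_, b90 = quantile_fit(x, y, 0.90)   # upper conditional quantile
_, b10 = quantile_fit(x, y, 0.10)   # lower conditional quantile
print(f"slope at tau=0.90: {b90:.2f}; at tau=0.10: {b10:.2f}")
```

The spread between the two fitted slopes is the heteroscedasticity signal that QR is designed to expose.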
ON REGRESSION REPRESENTATIONS OF STOCHASTIC-PROCESSES
RUSCHENDORF, L; DEVALK
We construct a.s. nonlinear regression representations of general stochastic processes (X(n))n is-an-element-of N. As a consequence we obtain in particular special regression representations of Markov chains and of certain m-dependent sequences. For m-dependent sequences we obtain a constructive
Regression of environmental noise in LIGO data
International Nuclear Information System (INIS)
Tiwari, V; Klimenko, S; Mitselmakher, G; Necula, V; Drago, M; Prodi, G; Frolov, V; Yakushin, I; Re, V; Salemi, F; Vedovato, G
2015-01-01
We address the problem of noise regression in the output of gravitational-wave (GW) interferometers, using data from the physical environmental monitors (PEM). The objective of the regression analysis is to predict environmental noise in the GW channel from the PEM measurements. One of the most promising regression methods is based on the construction of Wiener–Kolmogorov (WK) filters. Using this method, the seismic noise cancellation from the LIGO GW channel has already been performed. In the presented approach the WK method has been extended, incorporating banks of Wiener filters in the time–frequency domain, multi-channel analysis and regulation schemes, which greatly enhance the versatility of the regression analysis. Also we present the first results on regression of the bi-coherent noise in the LIGO data. (paper)
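The time-domain version of such a Wiener filter is an FIR least-squares regression of the target channel on lagged copies of a witness channel. The sketch below uses a single synthetic witness channel and an invented coupling filter; the actual pipeline described above is multi-channel and time-frequency resolved, which this toy omits:

```python
import numpy as np

rng = np.random.default_rng(5)
n, taps = 20000, 8

witness = rng.normal(size=n)                     # PEM channel (e.g. a seismometer)
h_true = np.array([0.5, -0.3, 0.2, 0.1])         # unknown coupling into the GW channel
coupled = np.convolve(witness, h_true)[:n]
gw = coupled + 0.3 * rng.normal(size=n)          # "GW channel" = coupled noise + rest

# Lagged design matrix; the least-squares FIR fit solves the same normal
# equations as the time-domain Wiener-Kolmogorov filter
X = np.column_stack([np.roll(witness, k) for k in range(taps)])[taps:]
y = gw[taps:]
h_est, *_ = np.linalg.lstsq(X, y, rcond=None)

resid = y - X @ h_est
ratio = resid.var() / y.var()                    # power left after noise regression
print(f"residual/original power: {ratio:.3f}")
```

The estimated taps recover the coupling filter, and the residual power approaches the floor set by the uncorrelated part of the channel, which is the goal of the regression.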
Pathological assessment of liver fibrosis regression
Directory of Open Access Journals (Sweden)
WANG Bingqiong
2017-03-01
Hepatic fibrosis is the common pathological outcome of chronic hepatic diseases. An accurate assessment of fibrosis degree provides an important reference for a definite diagnosis of diseases, treatment decision-making, treatment outcome monitoring, and prognostic evaluation. At present, many clinical studies have proven that regression of hepatic fibrosis and early-stage liver cirrhosis can be achieved by effective treatment, and a correct evaluation of fibrosis regression has become a hot topic in clinical research. Liver biopsy has long been regarded as the gold standard for the assessment of hepatic fibrosis, and thus it plays an important role in the evaluation of fibrosis regression. This article reviews the clinical application of current pathological staging systems in the evaluation of fibrosis regression from the perspectives of semi-quantitative scoring system, quantitative approach, and qualitative approach, in order to propose a better pathological evaluation system for the assessment of fibrosis regression.
Mecocci, Patrizia; Bladström, Anna; Stender, Karina
2009-05-01
The post-hoc analyses reported here evaluate the specific effects of memantine treatment on ADAS-cog single items or SIB subscales for patients with moderate to severe AD. Data from six multicentre, randomised, placebo-controlled, parallel-group, double-blind, 6-month studies were used as the basis for these post-hoc analyses. All patients with a Mini-Mental State Examination (MMSE) score of less than 20 were included. Analyses of patients with moderate AD (MMSE: 10-19), evaluated with the Alzheimer's Disease Assessment Scale (ADAS-cog), and analyses of patients with moderate to severe AD (MMSE: 3-14), evaluated using the Severe Impairment Battery (SIB), were performed separately. The mean change from baseline showed a significant benefit of memantine treatment on both the ADAS-cog and the SIB. ADAS-cog single-item analyses showed significant benefits of memantine treatment, compared to placebo, for mean change from baseline for commands (p < 0.001), ideational praxis (p < 0.05), orientation (p < 0.01), comprehension (p < 0.05), and remembering test instructions (p < 0.05) for observed cases (OC). The SIB subscale analyses showed significant benefits of memantine, compared to placebo, for mean change from baseline for language (p < 0.05), memory (p < 0.05), orientation (p < 0.01), praxis (p < 0.001), and visuospatial ability (p < 0.01) for OC. Memantine shows significant benefits on overall cognitive abilities as well as on specific key cognitive domains for patients with moderate to severe AD. (c) 2009 John Wiley & Sons, Ltd.
Dai, Yu; Zeng, Tianshu; Xiao, Fei; Chen, Lulu; Kong, Wen
2017-01-01
We conducted a case/control study to assess the impact of SNP rs3087243 and rs231775 within the CTLA4 gene on the susceptibility to Graves' disease (GD) in a Chinese Han dataset (271 cases and 298 controls). The frequency of the G allele for rs3087243 and rs231775 was observed to be significantly higher in subjects with GD than in control subjects (p = 0.005 and p < 0.001, respectively). After logistic regression analysis, a significant association was detected between SNP rs3087243 and GD in the additive and recessive models. Similarly, association for the SNP rs231775 could also be detected in the additive model, dominant model, and recessive model. A meta-analysis, including 27 published datasets along with the current dataset, was performed to further confirm the association. Consistent with our case/control results, rs3087243 and rs231775 showed a significant association with GD in all genetic models. Of note, ethnic stratification revealed that these two SNPs were associated with susceptibility to GD in populations of both Asian and European descent. In conclusion, our data support that the rs3087243 and rs231775 polymorphisms within the CTLA4 gene confer genetic susceptibility to GD. PMID:29299173
Directory of Open Access Journals (Sweden)
Schmitz Heinz
2010-11-01
Background: When comparing active treatments, a non-inferiority (or one-sided equivalence) study design is often used. This design requires the definition of a non-inferiority margin, the threshold value of clinical relevance. In recent studies, a non-inferiority margin of 15 mm has been used for the change in endometriosis-associated pelvic pain (EAPP) on a visual analog scale (VAS). However, this value was derived from other chronic painful conditions and its validation in EAPP was lacking. Methods: Data were analyzed from two placebo-controlled studies of active treatments in endometriosis, including 281 patients with laparoscopically-confirmed endometriosis and moderate-to-severe EAPP. Patients recorded EAPP on a VAS at baseline and the end of treatment. Patients also assessed their satisfaction with treatment on a modified Clinical Global Impression scale. Changes in VAS score were compared with patients' self-assessments to derive an empirically validated non-inferiority margin. This anchor-based value was compared to a non-inferiority margin derived using the conventional half-standard-deviation rule for minimal clinically important differences (MCID) in patient-reported outcomes. Results: Anchor-based and distribution-based MCIDs were −7.8 mm and −8.6 mm, respectively. Conclusions: An empirically validated non-inferiority margin of 10 mm for EAPP measured on a VAS is appropriate to compare treatments in endometriosis.
Papandonatos, George D; Pan, Qing; Pajewski, Nicholas M; Delahanty, Linda M; Peter, Inga; Erar, Bahar; Ahmad, Shafqat; Harden, Maegan; Chen, Ling; Fontanillas, Pierre; Wagenknecht, Lynne E; Kahn, Steven E; Wing, Rena R; Jablonski, Kathleen A; Huggins, Gordon S; Knowler, William C; Florez, Jose C; McCaffery, Jeanne M; Franks, Paul W
2015-12-01
Clinically relevant weight loss is achievable through lifestyle modification, but unintentional weight regain is common. We investigated whether recently discovered genetic variants affect weight loss and/or weight regain during behavioral intervention. Participants at high-risk of type 2 diabetes (Diabetes Prevention Program [DPP]; N = 917/907 intervention/comparison) or with type 2 diabetes (Look AHEAD [Action for Health in Diabetes]; N = 2,014/1,892 intervention/comparison) were from two parallel arm (lifestyle vs. comparison) randomized controlled trials. The associations of 91 established obesity-predisposing loci with weight loss across 4 years and with weight regain across years 2-4 after a minimum of 3% weight loss were tested. Each copy of the minor G allele of MTIF3 rs1885988 was consistently associated with greater weight loss following lifestyle intervention over 4 years across the DPP and Look AHEAD. No such effect was observed across comparison arms, leading to a nominally significant single nucleotide polymorphism × treatment interaction (P = 4.3 × 10⁻³). However, this effect was not significant at a study-wise significance level (Bonferroni threshold P lifestyle. © 2015 by the American Diabetes Association. Readers may use this article as long as the work is properly cited, the use is educational and not for profit, and the work is not altered.
Oude Voshaar, R.C.; Gorgels, W.J.M.J.; Mol, A.J.J.; Balkom, A.J.L.M. van; Mulder, J.; Lisdonk, E.H. van de; Breteler, M.H.M.; Zitman, F.G.
2006-01-01
OBJECTIVE: To identify predictors of resumed benzodiazepine use after participation in a benzodiazepine discontinuation trial. METHOD: We performed multiple Cox regression analyses to predict the long-term outcome of a 3-condition, randomized, controlled benzodiazepine discontinuation trial in
International Nuclear Information System (INIS)
Kostov, Marin
2000-01-01
In the last two decades there has been an increasing number of probabilistic seismic risk assessments performed. The basic ideas of the procedure for performing a Probabilistic Safety Analysis (PSA) of critical structures (NUREG/CR-2300, 1983) could be used also for normal industrial and residential buildings, dams, or other structures. The general formulation of the risk assessment procedure applied in this investigation is presented in Franzini, et al., 1984. The probability of failure of a structure for an expected lifetime (for example, 50 years) can be obtained from the annual frequency of failure, β_E, determined by the relation β_E = ∫ [dβ(x)/dx] P(f|x) dx. Here β(x) is the annual frequency of exceedance of load level x (for example, the variable x may be peak ground acceleration), and P(f|x) is the conditional probability of structure failure at a given seismic load level x. The problem leads to the assessment of the seismic hazard β(x) and the fragility P(f|x). The seismic hazard curves are obtained by probabilistic seismic hazard analysis. The fragility curves are obtained after the response of the structure is defined as probabilistic and its capacity and the associated uncertainties are assessed. Finally, the fragility curves are combined with the seismic loading to estimate the frequency of failure for each critical scenario. The frequency of failure due to a seismic event is represented by the scenario with the highest frequency. The tools usually applied for probabilistic safety analyses of critical structures could relatively easily be adopted to ordinary structures. The key problems are the seismic hazard definitions and the fragility analyses. The fragility could be derived either based on scaling procedures or on the base of generation. Both approaches have been presented in the paper. After the seismic risk (in terms of failure probability) is assessed there are several approaches for risk reduction. Generally the methods could be classified in two groups. The
National Research Council Canada - National Science Library
Pfleiderer, Elaine M; Scroggins, Cheryl L; Manning, Carol A
2009-01-01
Two separate logistic regression analyses were conducted for low- and high-altitude sectors to determine whether a set of dynamic sector characteristics variables could reliably discriminate between operational error (OE...
Quasi-experimental evidence on tobacco tax regressivity.
Koch, Steven F
2018-01-01
Tobacco taxes are known to reduce tobacco consumption and to be regressive, such that tobacco control policy may have the perverse effect of further harming the poor. However, if tobacco consumption falls faster amongst the poor than the rich, tobacco control policy can actually be progressive. We take advantage of persistent and committed tobacco control activities in South Africa to examine the household tobacco expenditure burden. For the analysis, we make use of two South African Income and Expenditure Surveys (2005/06 and 2010/11) that span a series of such tax increases and have been matched across the years, yielding 7806 matched pairs of tobacco consuming households and 4909 matched pairs of cigarette consuming households. By matching households across the surveys, we are able to examine both the regressivity of the household tobacco burden, and any change in that regressivity, and since tobacco taxes have been a consistent component of tobacco prices, our results also relate to the regressivity of tobacco taxes. Like previous research into cigarette and tobacco expenditures, we find that the tobacco burden is regressive; thus, so are tobacco taxes. However, we find that over the five-year period considered, the tobacco burden has decreased, and, most importantly, falls less heavily on the poor. Thus, the tobacco burden and the tobacco tax is less regressive in 2010/11 than in 2005/06. Thus, increased tobacco taxes can, in at least some circumstances, reduce the financial burden that tobacco places on households. Copyright © 2017 Elsevier Ltd. All rights reserved.
Sparse Regression by Projection and Sparse Discriminant Analysis
Qi, Xin
2015-04-03
© 2015, © American Statistical Association, Institute of Mathematical Statistics, and Interface Foundation of North America. Recent years have seen active developments of various penalized regression methods, such as LASSO and elastic net, to analyze high-dimensional data. In these approaches, the direction and length of the regression coefficients are determined simultaneously. Due to the introduction of penalties, the length of the estimates can be far from being optimal for accurate predictions. We introduce a new framework, regression by projection, and its sparse version to analyze high-dimensional data. The unique nature of this framework is that the directions of the regression coefficients are inferred first, and the lengths and the tuning parameters are determined by a cross-validation procedure to achieve the largest prediction accuracy. We provide a theoretical result for simultaneous model selection consistency and parameter estimation consistency of our method in high dimension. This new framework is then generalized such that it can be applied to principal components analysis, partial least squares, and canonical correlation analysis. We also adapt this framework for discriminant analysis. Compared with the existing methods, where there is relatively little control of the dependency among the sparse components, our method can control the relationships among the components. We present efficient algorithms and related theory for solving the sparse regression by projection problem. Based on extensive simulations and real data analysis, we demonstrate that our method achieves good predictive performance and variable selection in the regression setting, and the ability to control relationships between the sparse components leads to more accurate classification. In supplementary materials available online, the details of the algorithms and theoretical proofs, and R codes for all simulation studies are provided.
Sammler, Svenja; Ketmaier, Valerio; Havenstein, Katja; Krause, Ulrike; Curio, Eberhard; Tiedemann, Ralph
2012-10-12
The Visayan Tarictic Hornbill (Penelopides panini) and the Walden's Hornbill (Aceros waldeni) are two threatened hornbill species endemic to the western islands of the Visayas that constitute - between Luzon and Mindanao - the central island group of the Philippine archipelago. In order to evaluate their genetic diversity and to support efforts towards their conservation, we analyzed genetic variation in ~ 600 base pairs (bp) of the mitochondrial control region I and at 12-19 nuclear microsatellite loci. The sampling covered extant populations, still occurring only on two islands (P. panini: Panay and Negros, A. waldeni: only Panay), and it was augmented with museum specimens of extinct populations from neighboring islands. For comparison, their less endangered (= more abundant) sister taxa, the Luzon Tarictic Hornbill (P. manillae) from the Luzon and Polillo Islands and the Writhed Hornbill (A. leucocephalus) from Mindanao Island, were also included in the study. We reconstructed the population history of the two Penelopides species and assessed the genetic population structure of the remaining wild populations in all four species. Mitochondrial and nuclear data concordantly show a clear genetic separation according to the island of origin in both Penelopides species, but also unravel sporadic over-water movements between islands. We found evidence that deforestation in the last century influenced these migratory events. Both classes of markers and the comparison to museum specimens reveal a genetic diversity loss in both Visayan hornbill species, P. panini and A. waldeni, as compared to their more abundant relatives. This might have been caused by local extinction of genetically differentiated populations together with the dramatic decline in the abundance of the extant populations. We demonstrated a loss in genetic diversity of P. panini and A. waldeni as compared to their sister taxa P. manillae and A. leucocephalus. Because of the low potential for gene flow
Directory of Open Access Journals (Sweden)
Mukesh Gautam
Reptiles are a phylogenetically important group of organisms, as mammals evolved from them. The wall lizard testis exhibits clearly distinct morphology during the various phases of a reproductive cycle, making it an interesting model for studying the regulation of spermatogenesis. Studies on reptile spermatogenesis are scarce; hence this study will prove to be an important resource. Histological analyses show complete regression of the seminiferous tubules during the regressed phase, with retracted Sertoli cells and spermatogonia. In the recrudescent phase, the regressed testis regains cellular activity, showing the presence of normal Sertoli cells and developing germ cells. In the active phase, the testis reaches its maximum size, with enlarged seminiferous tubules and the presence of sperm in the seminiferous lumen. Total RNA extracted from whole testis of the regressed, recrudescent, and active phases of the wall lizard was hybridized on a Mouse Whole Genome 8×60K format gene chip. Microarray data from the regressed phase were deemed the control group. Microarray data were validated by assessing the expression of some selected genes using quantitative real-time PCR. The genes prominently expressed in recrudescent- and active-phase testis are associated with cytoskeleton organization (GO: 0005856), cell growth (GO: 0045927), GTPase regulator activity (GO: 0030695), transcription (GO: 0006352), apoptosis (GO: 0006915), and many other biological processes. The genes showing higher expression in the regressed phase belonged to functional categories such as negative regulation of macromolecule metabolic process (GO: 0010605), negative regulation of gene expression (GO: 0010629), and maintenance of stem cell niche (GO: 0045165). This is the first exploratory study profiling the transcriptome of three drastically different conditions of any reptilian testis. The genes expressed in the testis during the regressed, recrudescent, and active phases of the reproductive cycle are in concordance with testis morphology during these phases. This study will pave
Easy methods for extracting individual regression slopes: Comparing SPSS, R, and Excel
Directory of Open Access Journals (Sweden)
Roland Pfister
2013-10-01
Three different methods for extracting the coefficients of linear regression analyses are presented. The focus is on automatic and easy-to-use approaches for common statistical packages: SPSS, R, and MS Excel/LibreOffice Calc. Hands-on examples are included for each analysis, followed by a brief description of how a subsequent regression coefficient analysis is performed.
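In the same spirit, individual slopes can be extracted with a short script in Python (not one of the three packages the paper covers); the repeated-measures data here are simulated, and the long-format layout is an assumption:

```python
import numpy as np

rng = np.random.default_rng(6)

# Simulated repeated-measures data: 10 participants, 20 trials each,
# each participant with their own true slope scattered around 0.5
true_slopes = rng.normal(0.5, 0.2, size=10)
records = []                                  # (participant, x, y) in long format
for pid, b in enumerate(true_slopes):
    x = np.arange(20.0)
    y = 1.0 + b * x + rng.normal(0, 0.5, 20)
    records += [(pid, xi, yi) for xi, yi in zip(x, y)]
data = np.array(records)

# One regression per participant; np.polyfit(deg=1) returns (slope, intercept)
slopes = {}
for pid in np.unique(data[:, 0]):
    sub = data[data[:, 0] == pid]
    slope, _intercept = np.polyfit(sub[:, 1], sub[:, 2], 1)
    slopes[int(pid)] = slope

mean_slope = np.mean(list(slopes.values()))
print(f"mean individual slope: {mean_slope:.2f}")
```

The extracted per-participant slopes can then feed a second-level analysis (e.g. a one-sample test of the mean slope), which is the "subsequent regression coefficient analysis" the abstract refers to.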
Regression modeling of ground-water flow
Cooley, R.L.; Naff, R.L.
1985-01-01
Nonlinear multiple regression methods are developed to model and analyze groundwater flow systems. Complete descriptions of regression methodology as applied to groundwater flow models allow scientists and engineers engaged in flow modeling to apply the methods to a wide range of problems. Organization of the text proceeds from an introduction that discusses the general topic of groundwater flow modeling, to a review of basic statistics necessary to properly apply regression techniques, and then to the main topic: exposition and use of linear and nonlinear regression to model groundwater flow. Statistical procedures are given to analyze and use the regression models. A number of exercises and answers are included to exercise the student on nearly all the methods that are presented for modeling and statistical analysis. Three computer programs implement the more complex methods. These three are a general two-dimensional, steady-state regression model for flow in an anisotropic, heterogeneous porous medium, a program to calculate a measure of model nonlinearity with respect to the regression parameters, and a program to analyze model errors in computed dependent variables such as hydraulic head. (USGS)
Geographically weighted regression and multicollinearity: dispelling the myth
Fotheringham, A. Stewart; Oshan, Taylor M.
2016-10-01
Geographically weighted regression (GWR) extends the familiar regression framework by estimating a set of parameters for any number of locations within a study area, rather than producing a single parameter estimate for each relationship specified in the model. Recent literature has suggested that GWR is highly susceptible to the effects of multicollinearity between explanatory variables and has proposed a series of local measures of multicollinearity as an indicator of potential problems. In this paper, we employ a controlled simulation to demonstrate that GWR is in fact very robust to the effects of multicollinearity. Consequently, the contention that GWR is highly susceptible to multicollinearity issues needs rethinking.
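A bare-bones GWR fit, one distance-weighted least-squares regression per location, can be sketched as below. The Gaussian kernel and fixed bandwidth are common choices but are assumptions here, as is the synthetic west-to-east drift in the slope:

```python
import numpy as np

def gwr_coefficients(coords, X, y, bandwidth):
    """One weighted least-squares fit per location: the core of basic GWR."""
    Z = np.column_stack([np.ones(len(y)), X])
    betas = np.empty((len(y), Z.shape[1]))
    for i, c in enumerate(coords):
        d2 = ((coords - c) ** 2).sum(axis=1)
        w = np.exp(-d2 / (2.0 * bandwidth ** 2))   # Gaussian distance-decay kernel
        sw = np.sqrt(w)
        beta, *_ = np.linalg.lstsq(Z * sw[:, None], y * sw, rcond=None)
        betas[i] = beta
    return betas

rng = np.random.default_rng(7)
n = 400
coords = rng.uniform(0, 1, size=(n, 2))
x = rng.normal(size=n)
local_slope = 1.0 + coords[:, 0]          # slope drifts smoothly west -> east
y = 2.0 + local_slope * x + 0.1 * rng.normal(size=n)

betas = gwr_coefficients(coords, x[:, None], y, bandwidth=0.15)
west = betas[coords[:, 0] < 0.2, 1].mean()
east = betas[coords[:, 0] > 0.8, 1].mean()
print(f"mean local slope, west: {west:.2f}; east: {east:.2f}")
```

The map of local coefficients, rather than one global estimate, is the output GWR adds over ordinary regression, and it is these local estimates whose robustness to multicollinearity the paper examines.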
Logistic Regression in the Identification of Hazards in Construction
Drozd, Wojciech
2017-10-01
The construction site and its elements create circumstances that are conducive to the formation of risks to safety during the execution of works. Analysis indicates the critical importance of these factors in the set of characteristics that describe the causes of accidents in the construction industry. This article attempts to analyse the characteristics related to the construction site in order to indicate their importance in defining the circumstances of accidents at work. The study includes sites inspected in 2014-2016 by the employees of the District Labour Inspectorate in Krakow (Poland). The analysed set of detailed (disaggregated) data includes both quantitative and qualitative characteristics. The substantive task focused on classification modelling for the identification of hazards in construction and on identifying which of the analysed characteristics are important in an accident. Methodologically, the data were analysed with statistical classifiers in the form of logistic regression.
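As a hedged, self-contained sketch of the kind of analysis the abstract describes, the snippet below fits a logistic-regression classifier to synthetic accident data; the site characteristics (scaffold height, crew size, training flag) and effect sizes are invented for illustration, not taken from the study:

```python
# Hedged sketch: logistic regression as a statistical classifier for
# accident-hazard data. All variables and effect sizes are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 500
# Hypothetical site characteristics.
height = rng.uniform(2, 30, n)        # scaffold height (m)
crew = rng.integers(2, 15, n)         # crew size
trained = rng.integers(0, 2, n)       # safety training completed (0/1)

# Synthetic accident indicator: risk rises with height, falls with training.
logit = -3.0 + 0.12 * height - 1.0 * trained
p = 1 / (1 + np.exp(-logit))
accident = rng.binomial(1, p)

X = np.column_stack([height, crew, trained])
model = LogisticRegression(max_iter=1000).fit(X, accident)
coef = model.coef_[0]   # sign and size indicate each characteristic's role
```

The fitted coefficients then play the role of the study's hazard indicators: a positive coefficient marks a characteristic that raises the odds of an accident, a negative one a protective factor.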
Variable and subset selection in PLS regression
DEFF Research Database (Denmark)
Høskuldsson, Agnar
2001-01-01
The purpose of this paper is to present some useful methods for introductory analysis of variables and subsets in relation to PLS regression. We present here methods that are efficient in finding the appropriate variables or subset to use in the PLS regression. The general conclusion is that variable selection is important for successful analysis of chemometric data. An important aspect of the results presented is that lack of variable selection can spoil the PLS regression, and that cross-validation measures using a test set can show larger variation, when we use different subsets of X, than …
Applied Regression Modeling A Business Approach
Pardoe, Iain
2012-01-01
An applied and concise treatment of statistical regression techniques for business students and professionals who have little or no background in calculus. Regression analysis is an invaluable statistical methodology in business settings and is vital to model the relationship between a response variable and one or more predictor variables, as well as the prediction of a response value given values of the predictors. In view of the inherent uncertainty of business processes, such as the volatility of consumer spending and the presence of market uncertainty, business professionals use regression a
Linear regression metamodeling as a tool to summarize and present simulation model results.
Jalal, Hawre; Dowd, Bryan; Sainfort, François; Kuntz, Karen M
2013-10-01
Modelers lack a tool to systematically and clearly present complex model results, including those from sensitivity analyses. The objective was to propose linear regression metamodeling as a tool to increase transparency of decision analytic models and better communicate their results. We used a simplified cancer cure model to demonstrate our approach. The model computed the lifetime cost and benefit of 3 treatment options for cancer patients. We simulated 10,000 cohorts in a probabilistic sensitivity analysis (PSA) and regressed the model outcomes on the standardized input parameter values in a set of regression analyses. We used the regression coefficients to describe measures of sensitivity analyses, including threshold and parameter sensitivity analyses. We also compared the results of the PSA to deterministic full-factorial and one-factor-at-a-time designs. The regression intercept represented the estimated base-case outcome, and the other coefficients described the relative parameter uncertainty in the model. We defined simple relationships that compute the average and incremental net benefit of each intervention. Metamodeling produced outputs similar to traditional deterministic 1-way or 2-way sensitivity analyses but was more reliable since it used all parameter values. Linear regression metamodeling is a simple, yet powerful, tool that can assist modelers in communicating model characteristics and sensitivity analyses.
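The core metamodeling step described above can be sketched in a few lines: regress a model's probabilistic-sensitivity-analysis (PSA) outcomes on standardized inputs, so that the intercept estimates the base-case outcome and the slopes rank parameter influence. The toy decision model below is an invented stand-in, not the paper's cancer cure model:

```python
# Hedged sketch of linear regression metamodeling on a toy decision model.
import numpy as np

rng = np.random.default_rng(2)
n_sim = 10_000
# Two hypothetical input parameters drawn for a PSA.
cost_per_cycle = rng.normal(1000.0, 100.0, n_sim)
cure_prob = rng.normal(0.30, 0.03, n_sim)
# Toy outcome: net benefit falls with cost and rises with cure probability.
outcome = 5000.0 - 2.0 * cost_per_cycle + 20_000.0 * cure_prob \
          + rng.normal(0, 50, n_sim)

# Standardize inputs so slopes are comparable across parameters.
Z = np.column_stack([
    (cost_per_cycle - cost_per_cycle.mean()) / cost_per_cycle.std(),
    (cure_prob - cure_prob.mean()) / cure_prob.std(),
])
design = np.column_stack([np.ones(n_sim), Z])
beta, *_ = np.linalg.lstsq(design, outcome, rcond=None)
# beta[0]: estimated base-case outcome; |beta[1]|, |beta[2]|: influence ranking.
```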
FBH1 Catalyzes Regression of Stalled Replication Forks
Directory of Open Access Journals (Sweden)
Kasper Fugger
2015-03-01
Full Text Available DNA replication fork perturbation is a major challenge to the maintenance of genome integrity. It has been suggested that processing of stalled forks might involve fork regression, in which the fork reverses and the two nascent DNA strands anneal. Here, we show that FBH1 catalyzes regression of a model replication fork in vitro and promotes fork regression in vivo in response to replication perturbation. Cells respond to fork stalling by activating checkpoint responses requiring signaling through stress-activated protein kinases. Importantly, we show that FBH1, through its helicase activity, is required for early phosphorylation of ATM substrates such as CHK2 and CtIP as well as hyperphosphorylation of RPA. These phosphorylations occur prior to apparent DNA double-strand break formation. Furthermore, FBH1-dependent signaling promotes checkpoint control and preserves genome integrity. We propose a model whereby FBH1 promotes early checkpoint signaling by remodeling of stalled DNA replication forks.
Multiple regression for physiological data analysis: the problem of multicollinearity.
Slinker, B K; Glantz, S A
1985-07-01
Multiple linear regression, in which several predictor variables are related to a response variable, is a powerful statistical tool for gaining quantitative insight into complex in vivo physiological systems. For these insights to be correct, all predictor variables must be uncorrelated. However, in many physiological experiments the predictor variables cannot be precisely controlled and thus change in parallel (i.e., they are highly correlated). There is a redundancy of information about the response, a situation called multicollinearity, that leads to numerical problems in estimating the parameters in regression equations; the parameters are often of incorrect magnitude or sign or have large standard errors. Although multicollinearity can be avoided with good experimental design, not all interesting physiological questions can be studied without encountering multicollinearity. In these cases various ad hoc procedures have been proposed to mitigate multicollinearity. Although many of these procedures are controversial, they can be helpful in applying multiple linear regression to some physiological problems.
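A standard numeric check for the multicollinearity problem described above is the variance inflation factor, VIF_j = 1/(1 - R_j^2), where R_j^2 comes from regressing predictor j on the remaining predictors. A self-contained sketch on synthetic data, with one near-collinear pair constructed deliberately:

```python
# Hedged sketch: variance inflation factors for multicollinearity diagnosis.
import numpy as np

rng = np.random.default_rng(3)
n = 200
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.05, size=n)   # nearly a copy of x1
x3 = rng.normal(size=n)                     # independent predictor
X = np.column_stack([x1, x2, x3])

def vif(X):
    """VIF for each column of X (intercept included in each auxiliary fit)."""
    n, p = X.shape
    out = []
    for j in range(p):
        y = X[:, j]
        others = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
        beta, *_ = np.linalg.lstsq(others, y, rcond=None)
        resid = y - others @ beta
        r2 = 1.0 - resid.var() / y.var()
        out.append(1.0 / (1.0 - r2))
    return np.array(out)

vifs = vif(X)   # large for the collinear pair, near 1 for the independent one
```

A common rule of thumb treats VIF above roughly 5-10 as a warning sign that coefficient estimates may have unstable magnitudes or signs.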
Vectors, a tool in statistical regression theory
Corsten, L.C.A.
1958-01-01
Using linear algebra this thesis developed linear regression analysis including analysis of variance, covariance analysis, special experimental designs, linear and fertility adjustments, analysis of experiments at different places and times. The determination of the orthogonal projection, yielding
Genetics Home Reference: caudal regression syndrome
Dynamic travel time estimation using regression trees.
2008-10-01
This report presents a methodology for travel time estimation by using regression trees. The dissemination of travel time information has become crucial for effective traffic management, especially under congested road conditions. In the absence of c...
Sirenomelia and severe caudal regression syndrome.
Seidahmed, Mohammed Z; Abdelbasit, Omer B; Alhussein, Khalid A; Miqdad, Abeer M; Khalil, Mohammed I; Salih, Mustafa A
2014-12-01
To describe cases of sirenomelia and severe caudal regression syndrome (CRS), to report the prevalence of sirenomelia, and to compare our findings with the literature. Retrospective data were retrieved from the medical records of infants with the diagnosis of sirenomelia and CRS and their mothers from 1989 to 2010 (22 years) at the Security Forces Hospital, Riyadh, Saudi Arabia. A perinatologist, neonatologist, pediatric neurologist, and radiologist ascertained the diagnoses. The cases were identified as part of a study of neural tube defects during that period. A literature search was conducted using MEDLINE. During the 22-year study period, the total number of deliveries was 124,933, out of whom 4 patients with sirenomelia and 2 patients with severe forms of CRS were identified. All the patients with sirenomelia had a single umbilical artery, and none were the infant of a diabetic mother. One patient was a twin, and another was one of triplets. The 2 patients with CRS were sisters; their mother suffered from type II diabetes mellitus and morbid obesity and was on insulin, and neither of them had a single umbilical artery. Other anomalies associated with sirenomelia included an absent radius, thumb, and index finger in one patient, Potter's syndrome, abnormal ribs, microphthalmia, congenital heart disease, hypoplastic lungs, and diaphragmatic hernia. The prevalence of sirenomelia (3.2 per 100,000) is high compared with the international prevalence of one per 100,000. Both cases of CRS were infants of a type II diabetic mother with poor control, supporting the strong correlation of CRS with maternal diabetes.
Directory of Open Access Journals (Sweden)
Anke Hüls
2017-05-01
Full Text Available Antimicrobial resistance in livestock is a matter of general concern. To develop hygiene measures and methods for resistance prevention and control, epidemiological studies on a population level are needed to detect factors associated with antimicrobial resistance in livestock holdings. In general, regression models are used to describe these relationships between environmental factors and resistance outcome. Besides the study design, the correlation structures of the different outcomes of antibiotic resistance and structural zero measurements on the resistance outcome as well as on the exposure side are challenges for the epidemiological model building process. The use of appropriate regression models that acknowledge these complexities is essential to assure valid epidemiological interpretations. The aims of this paper are (i) to explain the model building process comparing several competing models for count data (negative binomial model, quasi-Poisson model, zero-inflated model, and hurdle model) and (ii) to compare these models using data from a cross-sectional study on antibiotic resistance in animal husbandry. These goals are essential to evaluate which model is most suitable to identify potential prevention measures. The dataset used as an example in our analyses was generated initially to study the prevalence and associated factors for the appearance of cefotaxime-resistant Escherichia coli in 48 German fattening pig farms. For each farm, the outcome was the count of samples with resistant bacteria. There was almost no overdispersion and only moderate evidence of excess zeros in the data. Our analyses show that it is essential to evaluate regression models in studies analyzing the relationship between environmental factors and antibiotic resistances in livestock. After model comparison based on evaluation of model predictions, Akaike information criterion, and Pearson residuals, here the hurdle model was judged to be the most appropriate
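The overdispersion check that motivates the paper's model comparison can be sketched as follows; the farm covariate and simulated counts are hypothetical, and the Pearson dispersion statistic stands in for the fuller comparison of negative binomial, quasi-Poisson, zero-inflated, and hurdle models:

```python
# Hedged sketch: fit a Poisson regression to simulated farm-level counts of
# resistant samples and check the Pearson dispersion before trusting it.
import numpy as np
from sklearn.linear_model import PoissonRegressor

rng = np.random.default_rng(4)
n = 300
hygiene_score = rng.normal(size=n)          # hypothetical farm covariate
mu = np.exp(0.5 - 0.8 * hygiene_score)      # true Poisson mean
counts = rng.poisson(mu)

X = hygiene_score.reshape(-1, 1)
fit = PoissonRegressor(alpha=0.0).fit(X, counts)
mu_hat = fit.predict(X)

# Pearson dispersion: ~1 if the Poisson variance assumption is adequate;
# substantially >1 signals overdispersion, pointing towards negative
# binomial, quasi-Poisson, zero-inflated, or hurdle alternatives.
dispersion = np.sum((counts - mu_hat) ** 2 / mu_hat) / (n - 2)
```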
Two Paradoxes in Linear Regression Analysis
FENG, Ge; PENG, Jing; TU, Dongke; ZHENG, Julia Z.; FENG, Changyong
2016-01-01
Summary Regression is one of the favorite tools in applied statistics. However, misuse and misinterpretation of results from regression analysis are common in biomedical research. In this paper we use statistical theory and simulation studies to clarify some paradoxes around this popular statistical method. In particular, we show that a widely used model selection procedure employed in many publications in top medical journals is wrong. Formal procedures based on solid statistical theory should be used in model selection. PMID:28638214
Discriminative Elastic-Net Regularized Linear Regression.
Zhang, Zheng; Lai, Zhihui; Xu, Yong; Shao, Ling; Wu, Jian; Xie, Guo-Sen
2017-03-01
In this paper, we aim at learning compact and discriminative linear regression models. Linear regression has been widely used in different problems. However, most of the existing linear regression methods exploit the conventional zero-one matrix as the regression targets, which greatly narrows the flexibility of the regression model. Another major limitation of these methods is that the learned projection matrix fails to precisely project the image features to the target space due to their weak discriminative capability. To this end, we present an elastic-net regularized linear regression (ENLR) framework, and develop two robust linear regression models which possess the following special characteristics. First, our methods exploit two particular strategies to enlarge the margins of different classes by relaxing the strict binary targets into a more feasible variable matrix. Second, a robust elastic-net regularization of singular values is introduced to enhance the compactness and effectiveness of the learned projection matrix. Third, the resulting optimization problem of ENLR has a closed-form solution in each iteration, which can be solved efficiently. Finally, rather than directly exploiting the projection matrix for recognition, our methods employ the transformed features as the new discriminative representations to make the final image classification. Compared with the traditional linear regression model and some of its variants, our method is much more accurate in image classification. Extensive experiments conducted on publicly available data sets well demonstrate that the proposed framework can outperform the state-of-the-art methods. The MATLAB code of our methods is available at http://www.yongxu.org/lunwen.html.
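The snippet below illustrates only the generic elastic-net penalty on synthetic regression data, not the paper's ENLR model (which combines the penalty with relaxed classification targets); it shows the sparsity that the l1 part of the penalty induces relative to ordinary least squares:

```python
# Hedged sketch: elastic net versus OLS on synthetic data with a
# correlated feature pair and many irrelevant features.
import numpy as np
from sklearn.linear_model import ElasticNet, LinearRegression

rng = np.random.default_rng(5)
n, p = 100, 30
X = rng.normal(size=(n, p))
X[:, 1] = X[:, 0] + rng.normal(scale=0.1, size=n)   # correlated pair
beta_true = np.zeros(p)
beta_true[:3] = [3.0, 0.0, -2.0]                    # only 2 truly active
y = X @ beta_true + rng.normal(scale=0.5, size=n)

ols = LinearRegression().fit(X, y)
enet = ElasticNet(alpha=0.1, l1_ratio=0.5).fit(X, y)

n_zero_ols = int(np.sum(np.abs(ols.coef_) < 1e-8))    # OLS never zeroes out
n_zero_enet = int(np.sum(np.abs(enet.coef_) < 1e-8))  # elastic net does
```

Note how the l2 part of the penalty spreads weight across the correlated pair (columns 0 and 1) instead of arbitrarily picking one, which is the classic argument for elastic net over the pure lasso.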
Fuzzy multiple linear regression: A computational approach
Juang, C. H.; Huang, X. H.; Fleming, J. W.
1992-01-01
This paper presents a new computational approach for performing fuzzy regression. In contrast to Bardossy's approach, the new approach, while dealing with fuzzy variables, closely follows the conventional regression technique. In this approach, treatment of fuzzy input is more 'computational' than 'symbolic.' The following sections first outline the formulation of the new approach, then deal with the implementation and computational scheme, and this is followed by examples to illustrate the new procedure.
Computing multiple-output regression quantile regions
Czech Academy of Sciences Publication Activity Database
Paindaveine, D.; Šiman, Miroslav
2012-01-01
Roč. 56, č. 4 (2012), s. 840-853 ISSN 0167-9473 R&D Projects: GA MŠk(CZ) 1M06047 Institutional research plan: CEZ:AV0Z10750506 Keywords : halfspace depth * multiple-output regression * parametric linear programming * quantile regression Subject RIV: BA - General Mathematics Impact factor: 1.304, year: 2012 http://library.utia.cas.cz/separaty/2012/SI/siman-0376413.pdf
There is No Quantum Regression Theorem
International Nuclear Information System (INIS)
Ford, G.W.; O'Connell, R.F.
1996-01-01
The Onsager regression hypothesis states that the regression of fluctuations is governed by macroscopic equations describing the approach to equilibrium. It is here asserted that this hypothesis fails in the quantum case. This is shown first by explicit calculation for the example of quantum Brownian motion of an oscillator and then in general from the fluctuation-dissipation theorem. It is asserted that the correct generalization of the Onsager hypothesis is the fluctuation-dissipation theorem. copyright 1996 The American Physical Society
Caudal regression syndrome : a case report
Energy Technology Data Exchange (ETDEWEB)
Lee, Eun Joo; Kim, Hi Hye; Kim, Hyung Sik; Park, So Young; Han, Hye Young; Lee, Kwang Hun [Chungang Gil Hospital, Incheon (Korea, Republic of)
1998-07-01
Caudal regression syndrome is a rare congenital anomaly, which results from a developmental failure of the caudal mesoderm during the fetal period. We present a case of caudal regression syndrome composed of a spectrum of anomalies including sirenomelia, dysplasia of the lower lumbar vertebrae, sacrum, coccyx and pelvic bones, genitourinary and anorectal anomalies, and dysplasia of the lung, as seen during infantography and MR imaging.
Spontaneous regression of metastatic Merkel cell carcinoma.
LENUS (Irish Health Repository)
Hassan, S J
2010-01-01
Merkel cell carcinoma is a rare aggressive neuroendocrine carcinoma of the skin predominantly affecting elderly Caucasians. It has a high rate of local recurrence and regional lymph node metastases. It is associated with a poor prognosis. Complete spontaneous regression of Merkel cell carcinoma has been reported but is a poorly understood phenomenon. Here we present a case of complete spontaneous regression of metastatic Merkel cell carcinoma demonstrating a markedly different pattern of events from those previously published.
Forecasting exchange rates: a robust regression approach
Preminger, Arie; Franck, Raphael
2005-01-01
The least squares estimation method, as well as other ordinary estimation methods for regression models, can be severely affected by a small number of outliers, thus providing poor out-of-sample forecasts. This paper suggests a robust regression approach, based on the S-estimation method, to construct forecasting models that are less sensitive to data contamination by outliers. A robust linear autoregressive (RAR) and a robust neural network (RNN) model are estimated to study the predictabil...
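As a hedged stand-in for the paper's S-estimation approach (for which scikit-learn has no direct routine), the sketch below uses Huber M-estimation to show the basic robustness phenomenon: a robust fit resists outliers that badly distort ordinary least squares:

```python
# Hedged sketch: robust (Huber) regression versus OLS under contamination.
# Huber M-estimation is used here only as an illustrative stand-in for the
# S-estimator discussed in the paper.
import numpy as np
from sklearn.linear_model import HuberRegressor, LinearRegression

rng = np.random.default_rng(6)
x = np.linspace(-3, 3, 100)
y = 2.0 * x + 1.0 + rng.normal(scale=0.3, size=x.size)
y[-5:] += 50.0                    # a few gross outliers at high leverage
X = x.reshape(-1, 1)

ols = LinearRegression().fit(X, y)
huber = HuberRegressor().fit(X, y)

ols_err = abs(ols.coef_[0] - 2.0)      # badly biased slope
huber_err = abs(huber.coef_[0] - 2.0)  # close to the true slope of 2
```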
Marginal longitudinal semiparametric regression via penalized splines
Al Kadiri, M.; Carroll, R.J.; Wand, M.P.
2010-01-01
We study the marginal longitudinal nonparametric regression problem and some of its semiparametric extensions. We point out that, while several elaborate proposals for efficient estimation have been proposed, a relatively simple and straightforward one, based on penalized splines, has not. After describing our approach, we then explain how Gibbs sampling and the BUGS software can be used to achieve quick and effective implementation. Illustrations are provided for nonparametric regression and additive models.
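The penalized-spline idea at the heart of the approach can be sketched in its simplest cross-sectional form, with a truncated-line basis and a ridge penalty on the knot coefficients; the marginal longitudinal setting and the Gibbs/BUGS fitting of the paper are deliberately omitted here:

```python
# Hedged sketch: penalized-spline smoothing with a truncated-line basis.
import numpy as np

rng = np.random.default_rng(7)
n = 200
x = np.sort(rng.uniform(0, 1, n))
y = np.sin(2 * np.pi * x) + rng.normal(scale=0.2, size=n)

knots = np.linspace(0, 1, 22)[1:-1]          # 20 interior knots
B = np.column_stack([np.ones(n), x] + [np.clip(x - k, 0, None) for k in knots])

lam = 0.1                                    # smoothing parameter (assumed)
D = np.eye(B.shape[1])
D[0, 0] = D[1, 1] = 0.0                      # do not penalize the linear part
beta = np.linalg.solve(B.T @ B + lam * D, B.T @ y)
fitted = B @ beta

# Accuracy against the known true curve (possible only in simulation).
rmse = np.sqrt(np.mean((fitted - np.sin(2 * np.pi * x)) ** 2))
```

In the Bayesian formulation the paper favors, the ridge penalty corresponds to a normal prior on the knot coefficients, which is what makes Gibbs sampling natural.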
Post-processing through linear regression
van Schaeybroeck, B.; Vannitsem, S.
2011-03-01
Various post-processing techniques are compared for both deterministic and ensemble forecasts, all based on linear regression between forecast data and observations. In order to evaluate the quality of the regression methods, three criteria are proposed, related to the effective correction of forecast error, the optimal variability of the corrected forecast and multicollinearity. The regression schemes under consideration include the ordinary least-square (OLS) method, a new time-dependent Tikhonov regularization (TDTR) method, the total least-square method, a new geometric-mean regression (GM), a recently introduced error-in-variables (EVMOS) method and, finally, a "best member" OLS method. The advantages and drawbacks of each method are clarified. These techniques are applied in the context of the Lorenz '63 system, whose model version is affected by both initial condition and model errors. For short forecast lead times, the number and choice of predictors plays an important role. Contrary to the other techniques, GM degrades when the number of predictors increases. At intermediate lead times, linear regression is unable to provide corrections to the forecast and can sometimes degrade the performance (GM and the best-member OLS with noise). At long lead times the regression schemes (EVMOS, TDTR), which yield the correct variability and the largest correlation between ensemble error and spread, should be preferred.
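The simplest scheme in the comparison, ordinary least-squares post-processing, amounts to regressing past observations on past forecasts and applying the fitted line to new forecasts. A synthetic sketch with an artificially biased and mis-scaled forecast:

```python
# Hedged sketch: OLS post-processing of a synthetic, systematically
# biased forecast. The bias and scale error are invented for illustration.
import numpy as np

rng = np.random.default_rng(8)
n_train, n_test = 200, 50
truth = rng.normal(10.0, 3.0, n_train + n_test)
# Raw model forecasts: biased, mis-scaled, and noisy.
forecast = 2.0 + 0.7 * truth + rng.normal(scale=1.0, size=truth.size)

# Fit observation = a + b * forecast on the training period.
b, a = np.polyfit(forecast[:n_train], truth[:n_train], 1)
corrected = a + b * forecast[n_train:]

raw_rmse = np.sqrt(np.mean((forecast[n_train:] - truth[n_train:]) ** 2))
post_rmse = np.sqrt(np.mean((corrected - truth[n_train:]) ** 2))
# The corrected forecast removes the systematic error component.
```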
Unbalanced Regressions and the Predictive Equation
DEFF Research Database (Denmark)
Osterrieder, Daniela; Ventosa-Santaulària, Daniel; Vera-Valdés, J. Eduardo
Predictive return regressions with persistent regressors are typically plagued by (asymptotically) biased/inconsistent estimates of the slope, non-standard or potentially even spurious statistical inference, and regression unbalancedness. We alleviate the problem of unbalancedness in the theoretical predictive equation by suggesting a data generating process, where returns are generated as linear functions of a lagged latent I(0) risk process. The observed predictor is a function of this latent I(0) process, but it is corrupted by a fractionally integrated noise. Such a process may arise due to aggregation or unexpected level shifts. In this setup, the practitioner estimates a misspecified, unbalanced, and endogenous predictive regression. We show that the OLS estimate of this regression is inconsistent, but standard inference is possible. To obtain a consistent slope estimate, we then suggest …
Directory of Open Access Journals (Sweden)
Zohreh Razzaghi
2011-07-01
Full Text Available Objectives: Vitamin D deficiency is one of the most important health problems of any society. It is more common in the elderly, even in those dwelling in rest homes. To date, several studies have been conducted on vitamin D deficiency using current statistical models. In this study, corresponding proportional odds and stereotype regression methods were used to identify threatening factors related to vitamin D deficiency in the elderly living in rest homes, comparing them with those who live outside such institutions. Methods & Materials: In this case-control study, there were 140 older persons living in rest homes and 140 not dwelling in these centers. The 25(OH)D serum level was regarded as the response variable, and age, sex, body mass index, and duration of exposure to sunlight as predictive variables for vitamin D deficiency. The analyses were carried out using corresponding proportional odds and stereotype regression methods and estimating the parameters of these two models. Deviation statistics (AIC) were used to evaluate and compare the mentioned methods. Stata 9.1 software was used to conduct the analyses. Results: The average serum level of 25(OH)D was 16.10±16.65 ng/ml in individuals living in rest homes and 39.62±24.78 ng/ml in those not living there (P=0.001). Prevalence of vitamin D deficiency (less than 20 ng/ml) was observed in 75% of the group living in rest homes and in 23.78% of the other group. Using corresponding proportional odds and stereotype regression methods, the variables age, sex, body mass index, duration of exposure to sunlight, and rest-home membership were fitted. In both models, the group and duration-of-exposure-to-sunlight variables were significant (P<0.001). The stereotype regression model included the group variable (odds ratio for the group suffering from severe vitamin D deficiency: 42.85, 95% CI: 9.93-185.67) and
An introduction to using Bayesian linear regression with clinical data.
Baldwin, Scott A; Larson, Michael J
2017-11-01
Statistical training in psychology focuses on frequentist methods. Bayesian methods are an alternative to standard frequentist methods. This article provides researchers with an introduction to fundamental ideas in Bayesian modeling. We use data from an electroencephalogram (EEG) and anxiety study to illustrate Bayesian models. Specifically, the models examine the relationship between error-related negativity (ERN), a particular event-related potential, and trait anxiety. Methodological topics covered include: how to set up a regression model in a Bayesian framework, specifying priors, examining convergence of the model, visualizing and interpreting posterior distributions, interval estimates, expected and predicted values, and model comparison tools. We also discuss situations where Bayesian methods can outperform frequentist methods as well as how to specify more complicated regression models. Finally, we conclude with recommendations about reporting guidelines for those using Bayesian methods in their own research. We provide data and R code for replicating our analyses. Copyright © 2017 Elsevier Ltd. All rights reserved.
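The article relies on modern sampling software; the sketch below instead shows the core Bayesian idea in the analytically tractable special case of a Gaussian prior with known noise variance, where the posterior over regression weights has a closed form. Variable names echo the ERN/anxiety example, but the data are simulated and the prior settings are assumptions:

```python
# Hedged sketch: conjugate Bayesian linear regression with known noise
# variance. Simulated data; the "anxiety"/"ERN" naming is illustrative only.
import numpy as np

rng = np.random.default_rng(9)
n = 50
anxiety = rng.normal(size=n)                          # standardized predictor
ern = 1.5 + 0.8 * anxiety + rng.normal(scale=0.5, size=n)  # simulated outcome
X = np.column_stack([np.ones(n), anxiety])

sigma2 = 0.25    # noise variance, treated as known (an assumption)
tau2 = 10.0      # prior variance of the weights (weakly informative)

# Posterior: N(mean, cov) with cov = (X'X/sigma2 + I/tau2)^{-1}.
cov = np.linalg.inv(X.T @ X / sigma2 + np.eye(2) / tau2)
mean = cov @ X.T @ ern / sigma2

# 95% credible interval for the slope, read off the Gaussian posterior.
slope_ci = (mean[1] - 1.96 * np.sqrt(cov[1, 1]),
            mean[1] + 1.96 * np.sqrt(cov[1, 1]))
```

With unknown noise variance or non-Gaussian priors, the posterior is no longer closed-form, which is where the MCMC tooling discussed in the article comes in.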
Covariate Imbalance and Adjustment for Logistic Regression Analysis of Clinical Trial Data
Ciolino, Jody D.; Martin, Reneé H.; Zhao, Wenle; Jauch, Edward C.; Hill, Michael D.; Palesch, Yuko Y.
2014-01-01
In logistic regression analysis for binary clinical trial data, adjusted treatment effect estimates are often not equivalent to unadjusted estimates in the presence of influential covariates. This paper uses simulation to quantify the benefit of covariate adjustment in logistic regression. However, International Conference on Harmonization guidelines suggest that covariate adjustment be pre-specified; unplanned adjusted analyses should be considered secondary. Results suggest that if adjustment is not possible or unplanned in a logistic setting, balance in continuous covariates can alleviate some (but never all) of the shortcomings of unadjusted analyses. The case of log binomial regression is also explored. PMID:24138438
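A toy version of the simulation comparison the paper describes: the adjusted fit recovers the conditional log-odds ratio near its true value, while the unadjusted estimate is attenuated (the non-collapsibility of the odds ratio). Sample size, covariate strength, and effect size here are invented:

```python
# Hedged sketch: adjusted versus unadjusted logistic regression for a
# randomized treatment effect, with one influential baseline covariate.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(10)
n = 5000
treat = rng.integers(0, 2, n)                # randomized treatment
covar = rng.normal(size=n)                   # influential covariate
true_beta = 0.5                              # true conditional log-odds ratio
logit = -0.5 + true_beta * treat + 2.0 * covar
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

big_C = 1e6   # effectively unpenalized fits
unadj = LogisticRegression(C=big_C).fit(treat.reshape(-1, 1), y)
adj = LogisticRegression(C=big_C).fit(np.column_stack([treat, covar]), y)

beta_unadj = unadj.coef_[0][0]   # marginal effect, attenuated
beta_adj = adj.coef_[0][0]       # conditional effect, near true_beta
```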
SOFC regulation at constant temperature: Experimental test and data regression study
International Nuclear Information System (INIS)
Barelli, L.; Bidini, G.; Cinti, G.; Ottaviano, A.
2016-01-01
Highlights: • SOFC operating temperature impacts strongly on its performance and lifetime. • Experimental tests were carried out varying the electric load and the feed gas mixture. • Three different anodic inlet gases were tested while maintaining constant temperature. • The cathodic air flow rate was used to keep the operating temperature constant. • A regression law was defined from experimental data to regulate the air flow rate. - Abstract: The operating temperature of a solid oxide fuel cell (SOFC) stack is an important parameter to be controlled, as it impacts the SOFC performance and lifetime. Rapid temperature change implies a significant temperature difference between the surface and the mean body, leading to a state of thermal shock. Thermal shock and thermal cycling introduce stress in a material due to temperature differences between the surface and the interior, or between different regions of the cell. In this context, in order to determine a control law that keeps the fuel cell temperature constant while the electrical load and the infeed fuel mixture vary, an experimental activity was carried out on a planar SOFC short stack to analyse stack temperature. Specifically, three different anodic inlet gas compositions were tested: pure hydrogen, and reformed natural gas with a steam-to-carbon ratio equal to 2 and 2.5. By processing the obtained results, a regression law was defined to regulate the air flow rate to be provided to the fuel cell so as to maintain a constant operating temperature as its operating conditions vary.
Economic Analyses of Ware Yam Production in Orlu Agricultural ...
African Journals Online (AJOL)
Economic Analyses of Ware Yam Production in Orlu Agricultural Zone of Imo State. ... International Journal of Agriculture and Rural Development ... statistics, gross margin analysis, marginal analysis and multiple regression analysis. Results ...
The best of both worlds: Phylogenetic eigenvector regression and mapping
Directory of Open Access Journals (Sweden)
José Alexandre Felizola Diniz Filho
2015-09-01
Full Text Available Eigenfunction analyses have been widely used to model patterns of autocorrelation in time, space and phylogeny. In a phylogenetic context, Diniz-Filho et al. (1998) proposed what they called Phylogenetic Eigenvector Regression (PVR), in which pairwise phylogenetic distances among species are submitted to a Principal Coordinate Analysis, and eigenvectors are then used as explanatory variables in regression, correlation or ANOVAs. More recently, a new approach called Phylogenetic Eigenvector Mapping (PEM) was proposed, with the main advantage of explicitly incorporating a model-based warping of phylogenetic distance, in which an Ornstein-Uhlenbeck (O-U) process is fitted to the data before eigenvector extraction. Here we compared PVR and PEM with respect to estimated phylogenetic signal, correlated evolution under alternative evolutionary models, and phylogenetic imputation, using simulated data. Despite the similarity between the two approaches, PEM has a slightly higher prediction ability and is more general than the original PVR. Even so, in a conceptual sense, PEM may provide a technique in the best of both worlds, combining the flexibility of data-driven, empirical eigenfunction analyses and the sound insights provided by evolutionary models well known in comparative analyses.
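The original PVR recipe can be sketched directly: double-center the squared phylogenetic distance matrix, extract principal-coordinate eigenvectors, and use the leading ones as regressors for a species trait. The tiny distance matrix and trait values below are invented, and no O-U warping (the PEM refinement) is applied:

```python
# Hedged sketch of Phylogenetic Eigenvector Regression (PVR) on an
# invented 6-species example with three clear clades.
import numpy as np

# Pairwise phylogenetic distances among 6 hypothetical species.
D = np.array([
    [0, 2, 6, 6, 8, 8],
    [2, 0, 6, 6, 8, 8],
    [6, 6, 0, 2, 8, 8],
    [6, 6, 2, 0, 8, 8],
    [8, 8, 8, 8, 0, 2],
    [8, 8, 8, 8, 2, 0],
], dtype=float)
n = D.shape[0]

# Principal Coordinate Analysis: double-center -D^2/2, then eigendecompose.
J = np.eye(n) - np.ones((n, n)) / n
G = -0.5 * J @ (D ** 2) @ J
eigval, eigvec = np.linalg.eigh(G)
order = np.argsort(eigval)[::-1]
# Keep the two leading eigenvectors, scaled by sqrt(eigenvalue).
V = eigvec[:, order[:2]] * np.sqrt(eigval[order[:2]])

# Invented trait values that track the clade structure closely.
trait = np.array([1.0, 1.2, 3.0, 3.1, 7.8, 8.0])
X = np.column_stack([np.ones(n), V])
beta, *_ = np.linalg.lstsq(X, trait, rcond=None)
r2 = 1 - np.sum((trait - X @ beta) ** 2) / np.sum((trait - trait.mean()) ** 2)
# High r2 here is read as strong phylogenetic signal in the trait.
```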
Control of Atherosclerosis Regression by PRMT2 in Diabetes
2017-08-01
Related project: Macrophage Dysfunction in Obesity, Diabetes and Atherosclerosis. Time commitment: 3.0 calendar months. Supporting agency: NHLBI. Grants officer: John Diggs. Project goals: This sub-project focuses on the kinetics of the macrophage population in atherosclerotic plaques in a mouse model of psychological
Control of Atherosclerosis Regression by PRMT2 in Diabetes
2017-08-01
The level of PRMT2, while high in healthy cells, is very low in cells from diabetics when blood sugar levels are elevated. Because PRMT2 isn't around in... Introduction: High plasma cholesterol and diabetes are major
Regression analysis using dependent Polya trees.
Schörgendorfer, Angela; Branscum, Adam J
2013-11-30
Many commonly used models for linear regression analysis force overly simplistic shape and scale constraints on the residual structure of data. We propose a semiparametric Bayesian model for regression analysis that produces data-driven inference by using a new type of dependent Polya tree prior to model arbitrary residual distributions that are allowed to evolve across increasing levels of an ordinal covariate (e.g., time, in repeated measurement studies). By modeling residual distributions at consecutive covariate levels or time points using separate, but dependent Polya tree priors, distributional information is pooled while allowing for broad pliability to accommodate many types of changing residual distributions. We can use the proposed dependent residual structure in a wide range of regression settings, including fixed-effects and mixed-effects linear and nonlinear models for cross-sectional, prospective, and repeated measurement data. A simulation study illustrates the flexibility of our novel semiparametric regression model to accurately capture evolving residual distributions. In an application to immune development data on immunoglobulin G antibodies in children, our new model outperforms several contemporary semiparametric regression models based on a predictive model selection criterion. Copyright © 2013 John Wiley & Sons, Ltd.