WorldWideScience

Sample records for regression lines showed

  1. Testing hypotheses for differences between linear regression lines

    Science.gov (United States)

    Stanley J. Zarnoch

    2009-01-01

    Five hypotheses are identified for testing differences between simple linear regression lines. The distinctions between these hypotheses are based on a priori assumptions and illustrated with full and reduced models. The contrast approach is presented as an easy and complete method for testing for overall differences between the regressions and for making pairwise...
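
    The record above is only a snippet, but the full-versus-reduced-model idea it mentions can be illustrated with an interaction term in ordinary least squares; the sketch below uses made-up data and assumes statsmodels and pandas are available (it is not the author's contrast approach).

        # Minimal sketch: compare two regression lines (one common line vs. separate
        # intercepts and slopes) using nested OLS models. Data are made up for illustration.
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf
        from statsmodels.stats.anova import anova_lm

        rng = np.random.default_rng(0)
        n = 50
        df = pd.DataFrame({
            "x": np.tile(np.linspace(0, 10, n), 2),
            "group": np.repeat(["A", "B"], n),
        })
        df["y"] = 1.0 + 0.5 * df["x"] + np.where(df["group"] == "B", 0.8 * df["x"], 0.0) + rng.normal(0, 1, 2 * n)

        reduced = smf.ols("y ~ x", data=df).fit()          # one common line for both groups
        full = smf.ols("y ~ x * group", data=df).fit()     # separate intercepts and slopes
        print(anova_lm(reduced, full))                     # F-test for "any difference between lines"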

  2. Deriving the Regression Line with Algebra

    Science.gov (United States)

    Quintanilla, John A.

    2017-01-01

    Exploration with spreadsheets and reliance on previous skills can lead students to determine the line of best fit. To perform linear regression on a set of data, students in Algebra 2 (or, in principle, Algebra 1) do not have to settle for using the mysterious "black box" of their graphing calculators (or other classroom technologies).…
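
    The spreadsheet-style derivation the article points to reduces to a few column sums; the following sketch (made-up data, plain NumPy) shows the closed-form slope and intercept and cross-checks them against a library fit.

        # Minimal sketch: least-squares slope and intercept from column sums,
        # the same arithmetic a spreadsheet would do. Data are illustrative.
        import numpy as np

        x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
        y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

        n = len(x)
        slope = (n * np.sum(x * y) - np.sum(x) * np.sum(y)) / (n * np.sum(x**2) - np.sum(x)**2)
        intercept = (np.sum(y) - slope * np.sum(x)) / n
        print(f"y = {intercept:.3f} + {slope:.3f} x")      # should be close to y = 0 + 2x
        print(np.polyfit(x, y, 1))                         # cross-check with NumPy's built-in fit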

  3. On-line mixture-based alternative to logistic regression

    Czech Academy of Sciences Publication Activity Database

    Nagy, Ivan; Suzdaleva, Evgenia

    2016-01-01

    Roč. 26, č. 5 (2016), s. 417-437 ISSN 1210-0552 R&D Projects: GA ČR GA15-03564S Institutional support: RVO:67985556 Keywords : on-line modeling * on-line logistic regression * recursive mixture estimation * data dependent pointer Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.394, year: 2016 http://library.utia.cas.cz/separaty/2016/ZS/suzdaleva-0464463.pdf

  4. A regression-based Kansei engineering system based on form feature lines for product form design

    Directory of Open Access Journals (Sweden)

    Yan Xiong

    2016-06-01

    When developing new products, it is important for a designer to understand users’ perceptions and to develop a product form that corresponds to those perceptions. In order to establish the mapping between users’ perceptions and product design features effectively, this study presents a regression-based Kansei engineering system based on form feature lines for product form design. First, according to the characteristics of design concept representation, product form features (product form feature lines) were defined. Second, Kansei words were chosen to describe image perceptions toward product samples. Then, multiple linear regression and support vector regression were used to construct models that predict users’ image perceptions. Using mobile phones as experimental samples, Kansei prediction models were established based on the front-view form feature lines of the samples. The experimental results showed that both prediction models adapted well to the data, but the support vector regression model predicted better than multiple linear regression and is more suitable for form regression prediction. The case study showed that the proposed method provides an effective means for designers to manipulate product features as a whole, and that it can optimize the Kansei model and improve its practical value.
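
    The paper's form-feature-line measurements and Kansei ratings are not reproduced here; as a rough illustration of the comparison it describes (multiple linear regression versus support vector regression), the sketch below uses synthetic features and assumes scikit-learn.

        # Minimal sketch: compare multiple linear regression and support vector regression
        # as predictors of a perceptual (Kansei-style) rating. Features and ratings are synthetic.
        import numpy as np
        from sklearn.linear_model import LinearRegression
        from sklearn.svm import SVR
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(1)
        X = rng.normal(size=(60, 5))              # stand-in for form-feature-line descriptors
        y = X @ np.array([0.6, -0.3, 0.2, 0.0, 0.4]) + 0.3 * np.sin(X[:, 0]) + rng.normal(0, 0.2, 60)

        for name, model in [("MLR", LinearRegression()), ("SVR", SVR(kernel="rbf", C=10.0))]:
            score = cross_val_score(model, X, y, cv=5, scoring="r2").mean()
            print(f"{name}: mean cross-validated R^2 = {score:.3f}")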

  5. Laser-induced Breakdown spectroscopy quantitative analysis method via adaptive analytical line selection and relevance vector machine regression model

    International Nuclear Information System (INIS)

    Yang, Jianhong; Yi, Cancan; Xu, Jinwu; Ma, Xianghong

    2015-01-01

    A new LIBS quantitative analysis method based on adaptive analytical line selection and a Relevance Vector Machine (RVM) regression model is proposed. First, a scheme for adaptively selecting analytical lines is put forward in order to overcome the drawback of high dependency on a priori knowledge. The candidate analytical lines are automatically selected based on the built-in characteristics of spectral lines, such as spectral intensity, wavelength, and width at half height. The analytical lines that will be used as input variables of the regression model are determined adaptively according to the samples used for both training and testing. Second, an LIBS quantitative analysis method based on RVM is presented. The intensities of the analytical lines and the elemental concentrations of certified standard samples are used to train the RVM regression model. The predicted elemental concentrations are given in the form of a confidence interval of a probabilistic distribution, which is helpful for evaluating the uncertainty contained in the measured spectra. Chromium concentration analysis experiments on 23 certified standard high-alloy steel samples were carried out. The multiple correlation coefficient of the prediction was up to 98.85%, and the average relative error of the prediction was 4.01%. The experimental results showed that the proposed LIBS quantitative analysis method achieved better prediction accuracy and better modeling robustness than methods based on partial least squares regression, artificial neural networks, and standard support vector machines. - Highlights: • Both training and testing samples are considered for analytical line selection. • The analytical lines are auto-selected based on the built-in characteristics of spectral lines. • The new method achieves better prediction accuracy and modeling robustness. • Model predictions are given with a confidence interval of a probabilistic distribution

  6. Genomic prediction based on data from three layer lines using non-linear regression models.

    Science.gov (United States)

    Huang, Heyun; Windig, Jack J; Vereijken, Addie; Calus, Mario P L

    2014-11-06

    Most studies on genomic prediction with reference populations that include multiple lines or breeds have used linear models. Data heterogeneity due to using multiple populations may conflict with model assumptions used in linear regression methods. In an attempt to alleviate potential discrepancies between assumptions of linear models and multi-population data, two types of alternative models were used: (1) a multi-trait genomic best linear unbiased prediction (GBLUP) model that modelled trait by line combinations as separate but correlated traits and (2) non-linear models based on kernel learning. These models were compared to conventional linear models for genomic prediction for two lines of brown layer hens (B1 and B2) and one line of white hens (W1). The three lines each had 1004 to 1023 training and 238 to 240 validation animals. Prediction accuracy was evaluated by estimating the correlation between observed phenotypes and predicted breeding values. When the training dataset included only data from the evaluated line, non-linear models yielded at best a similar accuracy as linear models. In some cases, when adding a distantly related line, the linear models showed a slight decrease in performance, while non-linear models generally showed no change in accuracy. When only information from a closely related line was used for training, linear models and non-linear radial basis function (RBF) kernel models performed similarly. The multi-trait GBLUP model took advantage of the estimated genetic correlations between the lines. Combining linear and non-linear models improved the accuracy of multi-line genomic prediction. Linear models and non-linear RBF models performed very similarly for genomic prediction, despite the expectation that non-linear models could deal better with the heterogeneous multi-population data. This heterogeneity of the data can be overcome by modelling trait by line combinations as separate but correlated traits, which avoids the occasional
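
    As a loose illustration of the linear-versus-kernel contrast described above (not the GBLUP or Bayesian models used in the study), the sketch below compares ridge regression with an RBF kernel ridge on simulated marker genotypes, assuming scikit-learn.

        # Minimal sketch: linear ridge vs. RBF kernel ridge on simulated marker data.
        # This only illustrates the linear/non-linear contrast, not the paper's models.
        import numpy as np
        from sklearn.linear_model import Ridge
        from sklearn.kernel_ridge import KernelRidge
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(2)
        n_animals, n_markers = 300, 500
        X = rng.binomial(2, 0.3, size=(n_animals, n_markers)).astype(float)   # 0/1/2 genotypes
        effects = rng.normal(0, 0.05, n_markers)
        y = X @ effects + rng.normal(0, 1.0, n_animals)                       # additive trait + noise

        for name, model in [("linear ridge", Ridge(alpha=10.0)),
                            ("RBF kernel ridge", KernelRidge(kernel="rbf", alpha=1.0, gamma=1e-3))]:
            acc = cross_val_score(model, X, y, cv=5, scoring="r2").mean()
            print(f"{name}: mean cross-validated R^2 = {acc:.3f}")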

  7. Multiple regression equations modelling of groundwater of Ajmer-Pushkar railway line region, Rajasthan (India).

    Science.gov (United States)

    Mathur, Praveen; Sharma, Sarita; Soni, Bhupendra

    2010-01-01

    In the present work, an attempt is made to formulate multiple regression equations using the all-possible-regressions method for groundwater quality assessment of the Ajmer-Pushkar railway line region in pre- and post-monsoon seasons. Correlation studies revealed the existence of linear relationships (r > 0.7) for electrical conductivity (EC), total hardness (TH) and total dissolved solids (TDS) with other water quality parameters. The highest correlation was found between EC and TDS (r = 0.973). EC showed highly significant positive correlation with Na, K, Cl, TDS and total solids (TS). TH showed the highest correlation with Ca and Mg. TDS showed significant correlation with Na, K, SO4, PO4 and Cl. The study indicated that most of the contamination present was water soluble or ionic in nature. Mg was present as MgCl2; K mainly as KCl and K2SO4; and Na was present as the salts of Cl, SO4 and PO4. On the other hand, F and NO3 showed no significant correlations. The r2 values and F values (at the 95% confidence limit, alpha = 0.05) for the modelled equations indicated a high degree of linearity among independent and dependent variables. The error percentage between calculated and experimental values was also contained within a +/- 15% limit.

  8. Systematic review, meta-analysis, and meta-regression: Successful second-line treatment for Helicobacter pylori.

    Science.gov (United States)

    Muñoz, Neus; Sánchez-Delgado, Jordi; Baylina, Mireia; Puig, Ignasi; López-Góngora, Sheila; Suarez, David; Calvet, Xavier

    2018-06-01

    Multiple Helicobacter pylori second-line schedules have been described as potentially useful. It remains unclear, however, which are the best combinations, and which features of second-line treatments are related to better cure rates. The aim of this study was to determine which second-line treatments achieve excellent (>90%) cure rates by performing a systematic review and, when possible, a meta-analysis. A meta-regression was planned to determine the characteristics of treatments achieving excellent cure rates. A systematic review of studies evaluating second-line Helicobacter pylori treatment was carried out in multiple databases. A formal meta-analysis was performed when an adequate number of comparative studies was found, using RevMan5.3. A meta-regression for evaluating factors predicting cure rates >90% was performed using Stata Statistical Software. The systematic review identified 115 eligible studies, including 203 evaluable treatment arms. The results were extremely heterogeneous, with 61 treatment arms (30%) achieving optimal (>90%) cure rates. The meta-analysis favored quadruple therapies over triple (83.2% vs 76.1%; OR: 0.59, 95% CI: 0.38-0.93; P = .02) and 14-day quadruple treatments over 7-day treatments (91.2% vs 81.5%; OR: 0.42, 95% CI: 0.24-0.73; P = .002), although the differences were significant only in the per-protocol analysis. The meta-regression did not find any particular characteristics of the studies to be associated with excellent cure rates. Second-line Helicobacter pylori treatments achieving >90% cure rates are extremely heterogeneous. Quadruple therapy and 14-day treatments seem better than triple therapies and 7-day ones. No single characteristic of the treatments was related to excellent cure rates. Future approaches suitable for infectious diseases, and thus taking antibiotic resistance into account, are needed to design rescue treatments that consistently achieve excellent cure rates. © 2018 John Wiley & Sons Ltd.

  9. Genomic prediction based on data from three layer lines using non-linear regression models

    NARCIS (Netherlands)

    Huang, H.; Windig, J.J.; Vereijken, A.; Calus, M.P.L.

    2014-01-01

    Background - Most studies on genomic prediction with reference populations that include multiple lines or breeds have used linear models. Data heterogeneity due to using multiple populations may conflict with model assumptions used in linear regression methods. Methods - In an attempt to alleviate

  10. [From clinical judgment to linear regression model].

    Science.gov (United States)

    Palacios-Cruz, Lino; Pérez, Marcela; Rivas-Ruiz, Rodolfo; Talavera, Juan O

    2013-01-01

    When we think about mathematical models, such as the linear regression model, we tend to assume that these terms are only used by those engaged in research, a notion that is far from the truth. Legendre described the first mathematical model in 1805, and Galton introduced the formal term in 1886. Linear regression is one of the most commonly used regression models in clinical practice. It is useful to predict or show the relationship between two or more variables as long as the dependent variable is quantitative and has a normal distribution. Stated another way, regression is used to predict a measure based on the knowledge of at least one other variable. The first objective of linear regression is to determine the slope or inclination of the regression line: Y = a + bx, where "a" is the intercept or regression constant, equivalent to the value of "Y" when "X" equals 0, and "b" (also called the slope) indicates the increase or decrease that occurs in "Y" when the variable "x" increases or decreases by one unit. In the regression line, "b" is called the regression coefficient. The coefficient of determination (R²) indicates the importance of the independent variables in the outcome.

  11. On-line quantile regression in the RKHS (Reproducing Kernel Hilbert Space) for operational probabilistic forecasting of wind power

    International Nuclear Information System (INIS)

    Gallego-Castillo, Cristobal; Bessa, Ricardo; Cavalcante, Laura; Lopez-Garcia, Oscar

    2016-01-01

    Wind power probabilistic forecasts are being used as input to several decision-making problems, such as stochastic unit commitment, operating reserve setting and electricity market bidding. This work introduces a new on-line quantile regression model based on the Reproducing Kernel Hilbert Space (RKHS) framework. Its application to the field of wind power forecasting involves a discussion on the choice of the bias term of the quantile models, and the consideration of the operational framework in order to mimic real conditions. Benchmarking against linear and spline quantile regression models was performed for a real case study over an 18-month period. Model parameter selection was based on k-fold cross-validation. Results showed a noticeable improvement in terms of calibration, a key criterion for the wind power industry. Modest improvements in terms of Continuous Ranked Probability Score (CRPS) were also observed for prediction horizons between 6 and 20 h ahead. - Highlights: • New on-line quantile regression model based on the Reproducing Kernel Hilbert Space. • First application to operational probabilistic wind power forecasting. • Modest improvements of CRPS for prediction horizons between 6 and 20 h ahead. • Noticeable improvements in terms of calibration due to on-line learning.
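
    The RKHS model itself is not given in the record; the sketch below only illustrates the underlying quantile-regression idea (a pinball-loss fit per quantile) on synthetic wind-power-like data, assuming scikit-learn's QuantileRegressor.

        # Minimal sketch: linear quantile regression for probabilistic forecasting,
        # i.e., the benchmark family the paper compares against (not its RKHS model).
        import numpy as np
        from sklearn.linear_model import QuantileRegressor

        rng = np.random.default_rng(3)
        wind_speed = rng.uniform(0, 25, 500).reshape(-1, 1)
        power = np.clip(0.05 * wind_speed.ravel() ** 2, 0, 20) + rng.normal(0, 2, 500)

        quantiles = {}
        for q in (0.1, 0.5, 0.9):
            model = QuantileRegressor(quantile=q, alpha=0.0, solver="highs").fit(wind_speed, power)
            quantiles[q] = model.predict([[12.0]])[0]
        print(quantiles)   # lower, median and upper forecast at a wind speed of 12 m/s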

  12. Linear regression in astronomy. II

    Science.gov (United States)

    Feigelson, Eric D.; Babu, Gutti J.

    1992-01-01

    A wide variety of least-squares linear regression procedures used in observational astronomy, particularly investigations of the cosmic distance scale, are presented and discussed. The classes of linear models considered are (1) unweighted regression lines, with bootstrap and jackknife resampling; (2) regression solutions when measurement error, in one or both variables, dominates the scatter; (3) methods to apply a calibration line to new data; (4) truncated regression models, which apply to flux-limited data sets; and (5) censored regression models, which apply when nondetections are present. For the calibration problem we develop two new procedures: a formula for the intercept offset between two parallel data sets, which propagates slope errors from one regression to the other; and a generalization of the Working-Hotelling confidence bands to nonstandard least-squares lines. They can provide improved error analysis for Faber-Jackson, Tully-Fisher, and similar cosmic distance scale relations.
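
    One of the listed ingredients, bootstrap resampling of an unweighted regression line, can be sketched as follows with made-up data and plain NumPy (the astronomical datasets and error models of the paper are not reproduced).

        # Minimal sketch: bootstrap confidence interval for a least-squares slope.
        # Illustrative data only.
        import numpy as np

        rng = np.random.default_rng(4)
        x = rng.uniform(0, 10, 80)
        y = 2.0 + 1.5 * x + rng.normal(0, 2, 80)

        slopes = []
        for _ in range(2000):
            idx = rng.integers(0, len(x), len(x))        # resample (x, y) pairs with replacement
            slopes.append(np.polyfit(x[idx], y[idx], 1)[0])
        lo, hi = np.percentile(slopes, [2.5, 97.5])
        print(f"bootstrap 95% CI for the slope: [{lo:.2f}, {hi:.2f}]")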

  13. A case of parotid tumor showing remarkable regression following hyperthermo-chemo-radiotherapy

    International Nuclear Information System (INIS)

    Fujimura, Takashi; Yonemura, Yutaka; Kamata, Toru

    1987-01-01

    A 72-year-old woman developed adenocarcinoma of the left parotid gland. Because of the excessive size of her tumor and the fact that she suffered from severe liver dysfunction, she was treated with hyperthermo-chemo-radiotherapy (HCR therapy). After ten sessions of radiofrequency hyperthermia with an HEH 500 (13.56 MHz radiofrequency wave), 50-Gy irradiation from a linac, and administration of 33.0 g of tegafur in suppository form, the tumor mass showed remarkable regression, decreasing in size by as much as 84% on computed tomography. Histologically, the tumor, which was resected under local anesthesia, showed almost total necrosis. The multidisciplinary HCR therapy was well tolerated and effective as a cancer therapy in this case. (author)

  14. Advanced statistics: linear regression, part I: simple linear regression.

    Science.gov (United States)

    Marill, Keith A

    2004-01-01

    Simple linear regression is a mathematical technique used to model the relationship between a single independent predictor variable and a single dependent outcome variable. In this, the first of a two-part series exploring concepts in linear regression analysis, the four fundamental assumptions and the mechanics of simple linear regression are reviewed. The most common technique used to derive the regression line, the method of least squares, is described. The reader will be acquainted with other important concepts in simple linear regression, including: variable transformations, dummy variables, relationship to inference testing, and leverage. Simplified clinical examples with small datasets and graphic models are used to illustrate the points. This will provide a foundation for the second article in this series: a discussion of multiple linear regression, in which there are multiple predictor variables.

  15. Kendall-Theil Robust Line (KTRLine--version 1.0)-A Visual Basic Program for Calculating and Graphing Robust Nonparametric Estimates of Linear-Regression Coefficients Between Two Continuous Variables

    Science.gov (United States)

    Granato, Gregory E.

    2006-01-01

    The Kendall-Theil Robust Line software (KTRLine-version 1.0) is a Visual Basic program that may be used with the Microsoft Windows operating system to calculate parameters for robust, nonparametric estimates of linear-regression coefficients between two continuous variables. The KTRLine software was developed by the U.S. Geological Survey, in cooperation with the Federal Highway Administration, for use in stochastic data modeling with local, regional, and national hydrologic data sets to develop planning-level estimates of potential effects of highway runoff on the quality of receiving waters. The Kendall-Theil robust line was selected because this robust nonparametric method is resistant to the effects of outliers and nonnormality in residuals that commonly characterize hydrologic data sets. The slope of the line is calculated as the median of all possible pairwise slopes between points. The intercept is calculated so that the line will run through the median of input data. A single-line model or a multisegment model may be specified. The program was developed to provide regression equations with an error component for stochastic data generation because nonparametric multisegment regression tools are not available with the software that is commonly used to develop regression models. The Kendall-Theil robust line is a median line and, therefore, may underestimate total mass, volume, or loads unless the error component or a bias correction factor is incorporated into the estimate. Regression statistics such as the median error, the median absolute deviation, the prediction error sum of squares, the root mean square error, the confidence interval for the slope, and the bias correction factor for median estimates are calculated by use of nonparametric methods. These statistics, however, may be used to formulate estimates of mass, volume, or total loads. The program is used to read a two- or three-column tab-delimited input file with variable names in the first row and
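
    The KTRLine program itself is Visual Basic, but the same Kendall-Theil (Theil-Sen) estimator, the median of all pairwise slopes with an intercept taken through the data medians, is available in SciPy; the sketch below uses made-up data.

        # Minimal sketch: Kendall-Theil (Theil-Sen) robust line with SciPy.
        # scipy.stats.theilslopes returns the median pairwise slope, an intercept through
        # the medians, and a confidence interval for the slope.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(5)
        x = np.arange(50, dtype=float)
        y = 0.8 * x + 3.0 + rng.normal(0, 2, 50)
        y[::10] += 25                            # a few outliers, to which the method is resistant

        slope, intercept, lo_slope, hi_slope = stats.theilslopes(y, x, 0.95)
        print(f"KTRLine-style fit: y = {intercept:.2f} + {slope:.2f} x "
              f"(95% slope CI: {lo_slope:.2f} to {hi_slope:.2f})")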

  16. Few crystal balls are crystal clear: eyeballing regression

    International Nuclear Information System (INIS)

    Wittebrood, R.T.

    1998-01-01

    The theory of regression and statistical analysis as it applies to reservoir analysis was discussed. It was argued that regression lines are not always the final truth. It was suggested that regression lines and eyeballed lines are often equally accurate. The many conditions that must be fulfilled to calculate a proper regression were discussed. Mentioned among these conditions were the distribution of the data, hidden variables, knowledge of how the data was obtained, the need for causal correlation of the variables, and knowledge of the manner in which the regression results are going to be used. 1 tab., 13 figs

  17. Dihydrochalcone Compounds Isolated from Crabapple Leaves Showed Anticancer Effects on Human Cancer Cell Lines

    Directory of Open Access Journals (Sweden)

    Xiaoxiao Qin

    2015-11-01

    Seven dihydrochalcone compounds were isolated from the leaves of Malus crabapples, cv. “Radiant”, and their chemical structures were elucidated by UV, IR, ESI-MS, 1H-NMR and 13C-NMR analyses. These compounds, which include trilobatin (A1), phloretin (A2), 3-hydroxyphloretin (A3), phloretin rutinoside (A4), phlorizin (A5), 6′′-O-coumaroyl-4′-O-glucopyranosylphloretin (A6), and 3′′′-methoxy-6′′-O-feruloy-4′-O-glucopyranosyl-phloretin (A7), all belong to the phloretin class and its derivatives. Compounds A6 and A7 are two new, rare dihydrochalcone compounds. The results of an MTT cancer cell growth inhibition assay demonstrated that phloretin and these derivatives showed significant anticancer activity against several human cancer cell lines, including the A549 human lung cancer cell line, the Bel 7402 liver cancer cell line, the HepG2 human ileocecal cancer cell line, and the HT-29 human colon cancer cell line. A7 had significant effects on all cancer cell lines, suggesting potential applications for phloretin and its derivatives. Adding a methoxyl group to phloretin dramatically increases phloretin’s anticancer activity.

  18. The reverse transcription inhibitor abacavir shows anticancer activity in prostate cancer cell lines.

    Directory of Open Access Journals (Sweden)

    Francesca Carlini

    BACKGROUND: Transposable Elements (TEs) comprise nearly 45% of the entire genome and are part of sophisticated regulatory network systems that control developmental processes in normal and pathological conditions. The retroviral/retrotransposon gene machinery consists mainly of Long Interspersed Nuclear Elements (LINEs-1) and Human Endogenous Retroviruses (HERVs) that code for their own endogenous reverse transcriptase (RT). Interestingly, RT is typically expressed at high levels in cancer cells. Recent studies report that RT inhibition by non-nucleoside reverse transcriptase inhibitors (NNRTIs) induces growth arrest and cell differentiation in vitro and antagonizes the growth of human tumors in animal models. In the present study we analyze the anticancer activity of Abacavir (ABC), a nucleoside reverse transcription inhibitor (NRTI), on the PC3 and LNCaP prostate cancer cell lines. PRINCIPAL FINDINGS: ABC significantly reduces cell growth, migration and invasion, considerably slows S-phase progression, and induces senescence and cell death in prostate cancer cells. Consistent with these observations, microarray analysis of PC3 cells shows that ABC induces specific and dose-dependent changes in gene expression, involving multiple cellular pathways. Notably, by quantitative real-time PCR we found that LINE-1 ORF1 and ORF2 mRNA levels were significantly up-regulated by ABC treatment. CONCLUSIONS: Our results demonstrate the potential of ABC as an anticancer agent able to induce antiproliferative activity and trigger senescence in prostate cancer cells. Notably, we show that ABC elicits up-regulation of LINE-1 expression, suggesting the involvement of these elements in the observed cellular modifications.

  19. Differentiating regressed melanoma from regressed lichenoid keratosis.

    Science.gov (United States)

    Chan, Aegean H; Shulman, Kenneth J; Lee, Bonnie A

    2017-04-01

    Distinguishing regressed lichen planus-like keratosis (LPLK) from regressed melanoma can be difficult on histopathologic examination, potentially resulting in mismanagement of patients. We aimed to identify histopathologic features by which regressed melanoma can be differentiated from regressed LPLK. Twenty actively inflamed LPLK, 12 LPLK with regression and 15 melanomas with regression were compared and evaluated by hematoxylin and eosin staining as well as Melan-A, microphthalmia transcription factor (MiTF) and cytokeratin (AE1/AE3) immunostaining. (1) A total of 40% of regressed melanomas showed complete or near-complete loss of melanocytes within the epidermis with Melan-A and MiTF immunostaining, while 8% of regressed LPLK exhibited this finding. (2) Necrotic keratinocytes were seen in the epidermis in 33% of regressed melanomas as opposed to all of the regressed LPLK. (3) A dense infiltrate of melanophages in the papillary dermis was seen in 40% of regressed melanomas, a feature not seen in regressed LPLK. In summary, our findings suggest that a complete or near-complete loss of melanocytes within the epidermis strongly favors a regressed melanoma over a regressed LPLK. In addition, necrotic epidermal keratinocytes and the presence of a dense band-like distribution of dermal melanophages can be helpful in differentiating these lesions. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  20. Linear regression in astronomy. I

    Science.gov (United States)

    Isobe, Takashi; Feigelson, Eric D.; Akritas, Michael G.; Babu, Gutti Jogesh

    1990-01-01

    Five methods for obtaining linear regression fits to bivariate data with unknown or insignificant measurement errors are discussed: ordinary least-squares (OLS) regression of Y on X, OLS regression of X on Y, the bisector of the two OLS lines, orthogonal regression, and 'reduced major-axis' regression. These methods have been used by various researchers in observational astronomy, most importantly in cosmic distance scale applications. Formulas for calculating the slope and intercept coefficients and their uncertainties are given for all the methods, including a new general form of the OLS variance estimates. The accuracy of the formulas was confirmed using numerical simulations. The applicability of the procedures is discussed with respect to their mathematical properties, the nature of the astronomical data under consideration, and the scientific purpose of the regression. It is found that, for problems needing symmetrical treatment of the variables, the OLS bisector performs significantly better than orthogonal or reduced major-axis regression.
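
    Three of the five fits discussed (OLS of Y on X, OLS of X on Y, and their bisector) can be sketched with the simplified slope formulas below; the data are made up and the paper's uncertainty estimates are omitted.

        # Minimal sketch: OLS(Y|X), OLS(X|Y) and the OLS bisector slope
        # (uncertainty terms omitted; data are illustrative).
        import numpy as np

        rng = np.random.default_rng(6)
        x = rng.normal(0, 1, 200)
        y = 1.2 * x + rng.normal(0, 0.8, 200)

        xm, ym = x - x.mean(), y - y.mean()
        b1 = np.sum(xm * ym) / np.sum(xm**2)      # OLS(Y|X)
        b2 = np.sum(ym**2) / np.sum(xm * ym)      # OLS(X|Y), expressed as a slope in the y-x plane
        b3 = (b1 * b2 - 1 + np.sqrt((1 + b1**2) * (1 + b2**2))) / (b1 + b2)   # bisector of the two lines
        print(f"OLS(Y|X)={b1:.3f}  OLS(X|Y)={b2:.3f}  bisector={b3:.3f}")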

  1. Modified Regression Correlation Coefficient for Poisson Regression Model

    Science.gov (United States)

    Kaengthong, Nattacha; Domthong, Uthumporn

    2017-09-01

    This study focuses on indicators of the predictive power of the Generalized Linear Model (GLM), which are widely used but often subject to some restrictions. We are interested in the regression correlation coefficient for a Poisson regression model. This is a measure of predictive power defined by the relationship between the dependent variable (Y) and the expected value of the dependent variable given the independent variables [E(Y|X)] for the Poisson regression model, in which the dependent variable is Poisson distributed. The purpose of this research was to modify the regression correlation coefficient for the Poisson regression model. We also compare the proposed modified regression correlation coefficient with the traditional regression correlation coefficient in the case of two or more independent variables and of multicollinearity among the independent variables. The results show that the proposed regression correlation coefficient is better than the traditional regression correlation coefficient in terms of bias and root mean square error (RMSE).

  2. Confidence bands for inverse regression models

    International Nuclear Information System (INIS)

    Birke, Melanie; Bissantz, Nicolai; Holzmann, Hajo

    2010-01-01

    We construct uniform confidence bands for the regression function in inverse, homoscedastic regression models with convolution-type operators. Here, the convolution is between two non-periodic functions on the whole real line rather than between two periodic functions on a compact interval, since the former situation arguably arises more often in applications. First, following Bickel and Rosenblatt (1973 Ann. Stat. 1 1071–95) we construct asymptotic confidence bands which are based on strong approximations and on a limit theorem for the supremum of a stationary Gaussian process. Further, we propose bootstrap confidence bands based on the residual bootstrap and prove consistency of the bootstrap procedure. A simulation study shows that the bootstrap confidence bands perform reasonably well for moderate sample sizes. Finally, we apply our method to data from a gel electrophoresis experiment with genetically engineered neuronal receptor subunits incubated with rat brain extract

  3. Linear Regression Analysis

    CERN Document Server

    Seber, George A F

    2012-01-01

    Concise, mathematically clear, and comprehensive treatment of the subject.* Expanded coverage of diagnostics and methods of model fitting.* Requires no specialized knowledge beyond a good grasp of matrix algebra and some acquaintance with straight-line regression and simple analysis of variance models.* More than 200 problems throughout the book plus outline solutions for the exercises.* This revision has been extensively class-tested.

  4. Retro-regression--another important multivariate regression improvement.

    Science.gov (United States)

    Randić, M

    2001-01-01

    We review the serious problem associated with instabilities of the coefficients of regression equations, referred to as the MRA (multivariate regression analysis) "nightmare of the first kind". This is manifested when in a stepwise regression a descriptor is included or excluded from a regression. The consequence is an unpredictable change of the coefficients of the descriptors that remain in the regression equation. We follow with consideration of an even more serious problem, referred to as the MRA "nightmare of the second kind", arising when optimal descriptors are selected from a large pool of descriptors. This process typically causes at different steps of the stepwise regression a replacement of several previously used descriptors by new ones. We describe a procedure that resolves these difficulties. The approach is illustrated on boiling points of nonanes which are considered (1) by using an ordered connectivity basis; (2) by using an ordering resulting from application of greedy algorithm; and (3) by using an ordering derived from an exhaustive search for optimal descriptors. A novel variant of multiple regression analysis, called retro-regression (RR), is outlined showing how it resolves the ambiguities associated with both "nightmares" of the first and the second kind of MRA.

  5. The role of verbal memory in regressions during reading is modulated by the target word's recency in memory.

    Science.gov (United States)

    Guérard, Katherine; Saint-Aubin, Jean; Maltais, Marilyne; Lavoie, Hugo

    2014-10-01

    During reading, a number of eye movements are made backward, on words that have already been read. Recent evidence suggests that such eye movements, called regressions, are guided by memory. Several studies point to the role of spatial memory, but evidence for the role of verbal memory is more limited. In the present study, we examined the factors that modulate the role of verbal memory in regressions. Participants were required to make regressions on target words located in sentences displayed on one or two lines. Verbal interference was shown to affect regressions, but only when participants executed a regression on a word located in the first part of the sentence, irrespective of the number of lines on which the sentence was displayed. Experiments 2 and 3 showed that the effect of verbal interference on words located in the first part of the sentence disappeared when participants initiated the regression from the middle of the sentence. Our results suggest that verbal memory is recruited to guide regressions, but only for words read a longer time ago.

  6. Principal components based support vector regression model for on-line instrument calibration monitoring in NPPs

    International Nuclear Information System (INIS)

    Seo, In Yong; Ha, Bok Nam; Lee, Sung Woo; Shin, Chang Hoon; Kim, Seong Jun

    2010-01-01

    In nuclear power plants (NPPs), periodic sensor calibrations are required to assure that sensors are operating correctly. Because the sensors' operating status is checked only at every fuel outage, faulty sensors may remain undetected for periods of up to 24 months. Moreover, typically only a few sensors are actually found to require calibration. For the safe operation of NPPs and the reduction of unnecessary calibration, on-line instrument calibration monitoring is needed. In this study, principal-component-based auto-associative support vector regression (PCSVR) using response surface methodology (RSM) is proposed for sensor signal validation in NPPs. This paper describes the design of a PCSVR-based sensor validation system for a power generation system. RSM is employed to determine the optimal values of the SVR hyperparameters and is compared to the genetic algorithm (GA). The proposed PCSVR model is confirmed with actual plant data from Kori Nuclear Power Plant Unit 3 and is compared with the auto-associative support vector regression (AASVR) and auto-associative neural network (AANN) models. The auto-sensitivity of AASVR is improved by around six times by using PCA, resulting in good detection of sensor drift. Compared to AANN, accuracy and cross-sensitivity are better while the auto-sensitivity is almost the same. Meanwhile, the proposed RSM for the optimization of the PCSVR algorithm performs even better in terms of accuracy, auto-sensitivity, and averaged maximum error, except in averaged RMS error, and this method is much more time efficient than the conventional GA method.
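
    The plant data are not public; the sketch below only illustrates the general PCSVR idea (principal components of correlated sensors feeding a support vector regression that reconstructs the monitored signal) on synthetic signals, assuming scikit-learn.

        # Minimal sketch: PCA-compressed inputs feeding an SVR that reconstructs one sensor
        # from correlated sensors (synthetic signals, not plant data).
        import numpy as np
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.decomposition import PCA
        from sklearn.svm import SVR

        rng = np.random.default_rng(7)
        t = np.linspace(0, 10, 1000)
        base = np.sin(t) + 0.1 * t                                  # common process variable
        sensors = np.column_stack([base + rng.normal(0, 0.05, t.size) for _ in range(8)])
        target = base + rng.normal(0, 0.05, t.size)                 # the sensor being validated

        model = make_pipeline(StandardScaler(), PCA(n_components=3), SVR(C=10.0, epsilon=0.01))
        model.fit(sensors[:700], target[:700])
        residual = target[700:] - model.predict(sensors[700:])
        print(f"reconstruction RMS error: {np.sqrt(np.mean(residual**2)):.4f}")   # drift would inflate this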

  7. Chicken lines divergently selected for antibody responses to sheep red blood cells show line-specific differences in sensitivity to immunomodulation by diet. Part I: Humoral parameters.

    Science.gov (United States)

    Adriaansen-Tennekes, R; de Vries Reilingh, G; Nieuwland, M G B; Parmentier, H K; Savelkoul, H F J

    2009-09-01

    Individual differences in nutrient sensitivity have been suggested to be related with differences in stress sensitivity. Here we used layer hens divergently selected for high and low specific antibody responses to SRBC (i.e., low line hens and high line hens), reflecting a genetically based differential immune competence. The parental line of these hens was randomly bred as the control line and was used as well. Recently, we showed that these selection lines differ in their stress reactivity; the low line birds show a higher hypothalamic-pituitary-adrenal (HPA) axis reactivity. To examine maternal effects and neonatal nutritional exposure on nutrient sensitivity, we studied 2 subsequent generations. This also created the opportunity to examine egg production in these birds. The 3 lines were fed 2 different nutritionally complete layer feeds for a period of 22 wk in the first generation. The second generation was fed from hatch with the experimental diets. At several time intervals, parameters reflecting humoral immunity were determined such as specific antibody to Newcastle disease and infectious bursal disease vaccines; levels of natural antibodies binding lipopolysaccharide, lipoteichoic acid, and keyhole limpet hemocyanin; and classical and alternative complement activity. The most pronounced dietary-induced effects were found in the low line birds of the first generation: specific antibody titers to Newcastle disease vaccine were significantly elevated by 1 of the 2 diets. In the second generation, significant differences were found in lipoteichoic acid natural antibodies of the control and low line hens. At the end of the observation period of egg parameters, a significant difference in egg weight was found in birds of the high line. Our results suggest that nutritional differences have immunomodulatory effects on innate and adaptive humoral immune parameters in birds with high HPA axis reactivity and affect egg production in birds with low HPA axis reactivity.

  8. How to show that unicorn milk is a chronobiotic: the regression-to-the-mean statistical artifact.

    Science.gov (United States)

    Atkinson, G; Waterhouse, J; Reilly, T; Edwards, B

    2001-11-01

    Few chronobiologists may be aware of the regression-to-the-mean (RTM) statistical artifact, even though it may have far-reaching influences on chronobiological data. With the aid of simulated measurements of the circadian rhythm phase of body temperature and a completely bogus stimulus (unicorn milk), we explain what RTM is and provide examples relevant to chronobiology. We show how RTM may lead to erroneous conclusions regarding individual differences in phase responses to rhythm disturbances and how it may appear as though unicorn milk has phase-shifting effects and can successfully treat some circadian rhythm disorders. Guidelines are provided to ensure RTM effects are minimized in chronobiological investigations.
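
    The artifact is easy to reproduce numerically; in the simulation below (made-up numbers, plain NumPy) subjects selected for an extreme noisy baseline appear to "improve" on retest with no intervention at all.

        # Minimal sketch: regression to the mean with no real treatment effect.
        # Subjects selected for an extreme baseline phase appear to shift back on retest.
        import numpy as np

        rng = np.random.default_rng(8)
        true_phase = rng.normal(0, 30, 1000)                  # each subject's true phase (min)
        baseline = true_phase + rng.normal(0, 30, 1000)       # noisy first measurement
        retest = true_phase + rng.normal(0, 30, 1000)         # noisy second measurement, no treatment

        extreme = baseline > 60                               # "treat" only the apparently most-shifted subjects
        print(f"mean baseline of selected group: {baseline[extreme].mean():.1f} min")
        print(f"mean retest of selected group:   {retest[extreme].mean():.1f} min")   # closer to 0: pure RTM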

  9. Modeling Group Differences in OLS and Orthogonal Regression: Implications for Differential Validity Studies

    Science.gov (United States)

    Kane, Michael T.; Mroch, Andrew A.

    2010-01-01

    In evaluating the relationship between two measures across different groups (i.e., in evaluating "differential validity") it is necessary to examine differences in correlation coefficients and in regression lines. Ordinary least squares (OLS) regression is the standard method for fitting lines to data, but its criterion for optimal fit…

  10. Multiple regression analysis of anthropometric measurements influencing the cephalic index of male Japanese university students.

    Science.gov (United States)

    Hossain, Md Golam; Saw, Aik; Alam, Rashidul; Ohtsuki, Fumio; Kamarul, Tunku

    2013-09-01

    Cephalic index (CI), the ratio of head breadth to head length, is widely used to categorise human populations. The aim of this study was to assess the impact of anthropometric measurements on the CI of male Japanese university students. This study included 1,215 male university students from Tokyo and Kyoto, selected using convenience sampling. Multiple regression analysis was used to determine the effect of anthropometric measurements on CI. The variance inflation factor (VIF) showed no evidence of a multicollinearity problem among the independent variables. The coefficients of the regression line demonstrated a significant positive relationship between CI and minimum frontal breadth, and regression analysis showed a greater likelihood for minimum frontal breadth to predict CI. Multiple regression analysis revealed bizygomatic breadth, head circumference, minimum frontal breadth, head height and morphological facial height to be the craniofacial measurements that best predict CI. The results suggest that most of the variables considered in this study appear to influence the CI of adult male Japanese students.
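
    The VIF screening mentioned in the abstract can be reproduced in outline with statsmodels; the sketch below uses made-up anthropometric-style variables, one of which is deliberately collinear.

        # Minimal sketch: variance inflation factors for a set of predictors,
        # the multicollinearity check mentioned in the abstract (made-up data).
        import numpy as np
        import pandas as pd
        import statsmodels.api as sm
        from statsmodels.stats.outliers_influence import variance_inflation_factor

        rng = np.random.default_rng(9)
        df = pd.DataFrame({
            "head_breadth": rng.normal(150, 5, 200),
            "head_length": rng.normal(185, 6, 200),
        })
        df["head_circumference"] = 1.6 * (df["head_breadth"] + df["head_length"]) + rng.normal(0, 4, 200)

        X = sm.add_constant(df)
        for i, col in enumerate(X.columns):
            if col != "const":
                print(f"VIF({col}) = {variance_inflation_factor(X.values, i):.2f}")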

  11. Tumor regression patterns in retinoblastoma

    International Nuclear Information System (INIS)

    Zafar, S.N.; Siddique, S.N.; Zaheer, N.

    2016-01-01

    To observe the types of tumor regression after treatment, and to identify the common pattern of regression in our patients. Study Design: Descriptive study. Place and Duration of Study: Department of Pediatric Ophthalmology and Strabismus, Al-Shifa Trust Eye Hospital, Rawalpindi, Pakistan, from October 2011 to October 2014. Methodology: Children with unilateral and bilateral retinoblastoma were included in the study. Patients were referred to Pakistan Institute of Medical Sciences, Islamabad, for chemotherapy. After every cycle of chemotherapy, dilated fundus examination under anesthesia was performed to record the response to treatment. Regression patterns were recorded on RetCam II. Results: Seventy-four tumors were included in the study. Out of 74 tumors, 3 were ICRB group A tumors, 43 were ICRB group B tumors, 14 belonged to ICRB group C, and the remaining 14 were ICRB group D tumors. Type IV regression was seen in 39.1% (n=29) of tumors, type II in 29.7% (n=22), type III in 25.6% (n=19), and type I in 5.4% (n=4). All group A tumors (100%) showed type IV regression. Seventeen (39.5%) group B tumors showed type IV regression. In group C, 5 tumors (35.7%) showed type II regression and 5 tumors (35.7%) showed type IV regression. In group D, 6 tumors (42.9%) regressed to type II non-calcified remnants. Conclusion: The response and success of the focal and systemic treatment, as judged by the appearance of different patterns of tumor regression, vary with the ICRB grouping of the tumor. (author)

  12. Many Teenagers Can't Distinguish Harassment Lines, Research Shows

    Science.gov (United States)

    Sparks, Sarah D.

    2011-01-01

    A national survey finds that, when it comes to sexual harassment in school, many students do not know where to draw the line. Based on the first nationally representative survey in a decade of students in grades 7-12, the study conducted by the American Association of University Women (AAUW), found that 48 percent of nearly 2,000 students surveyed…

  13. Regression to Causality : Regression-style presentation influences causal attribution

    DEFF Research Database (Denmark)

    Bordacconi, Mats Joe; Larsen, Martin Vinæs

    2014-01-01

    Our experiment implies that scholars using regression models, one of the primary vehicles for analyzing statistical results in political science, encourage causal interpretation. Specifically, we demonstrate that presenting observational results in a regression model, rather than as a simple comparison of means, makes causal interpretation of the results more likely. Our experiment drew on a sample of 235 university students from three different social science degree programs (political science, sociology and economics), all of whom had received substantial training in statistics. The subjects were asked to compare and evaluate the validity of equivalent results presented as either regression models or as a test of two sample means. Our experiment shows that the subjects who were presented with results as estimates from a regression model were more inclined to interpret these results causally.

  14. Regression Techniques for Determining the Effective Impervious Area in Southern California Watersheds

    Science.gov (United States)

    Sultana, R.; Mroczek, M.; Dallman, S.; Sengupta, A.; Stein, E. D.

    2016-12-01

    The portion of the Total Impervious Area (TIA) that is hydraulically connected to the storm drainage network is called the Effective Impervious Area (EIA). The remaining fraction of impervious area, called the non-effective impervious area, drains onto pervious surfaces and does not contribute to runoff for smaller events. Using TIA instead of EIA in models and calculations can lead to overestimates of runoff volumes and peak discharges and to oversizing of the drainage system, since it is assumed that all impervious areas produce urban runoff that is directly connected to storm drains. This makes EIA a better predictor of actual runoff from urban catchments for the hydraulic design of storm drain systems and for modeling non-point source pollution. Compared to TIA, the EIA is considerably more difficult to determine, since it cannot be found by using remote sensing techniques, readily available EIA datasets, or aerial imagery interpretation alone. For this study, EIA percentages were calculated by two successive regression methods for five watersheds (with areas of 8.38 to 158 mi2) located in Southern California using rainfall-runoff event data for the years 2004-2007. Runoff generated from the smaller storm events is considered to emanate only from the effective impervious areas. Therefore, larger events that were considered to have runoff from both impervious and pervious surfaces were successively removed in the regression methods using a criterion of (1) 1 mm and (2) max(2, 1 mm) above the regression line. MSE is calculated from the actual runoff and the runoff predicted by the regression. Analysis of standard deviations showed that the max(2, 1 mm) criterion better fit the regression line and is the preferred method for predicting the EIA percentage. The estimated EIAs were approximately 43% to 78% of the TIA, which shows that using EIA instead of TIA can have a significant impact on the cost of building urban hydraulic systems and stormwater capture devices.

  15. Comparison between linear and non-parametric regression models for genome-enabled prediction in wheat.

    Science.gov (United States)

    Pérez-Rodríguez, Paulino; Gianola, Daniel; González-Camacho, Juan Manuel; Crossa, José; Manès, Yann; Dreisigacker, Susanne

    2012-12-01

    In genome-enabled prediction, parametric, semi-parametric, and non-parametric regression models have been used. This study assessed the predictive ability of linear and non-linear models using dense molecular markers. The linear models were linear on marker effects and included the Bayesian LASSO, Bayesian ridge regression, Bayes A, and Bayes B. The non-linear models (this refers to non-linearity on markers) were reproducing kernel Hilbert space (RKHS) regression, Bayesian regularized neural networks (BRNN), and radial basis function neural networks (RBFNN). These statistical models were compared using 306 elite wheat lines from CIMMYT genotyped with 1717 diversity array technology (DArT) markers and two traits, days to heading (DTH) and grain yield (GY), measured in each of 12 environments. It was found that the three non-linear models had better overall prediction accuracy than the linear regression specification. Results showed a consistent superiority of RKHS and RBFNN over the Bayesian LASSO, Bayesian ridge regression, Bayes A, and Bayes B models.

  16. Estimating carbon and showing impacts of drought using satellite data in regression-tree models

    Science.gov (United States)

    Boyte, Stephen; Wylie, Bruce K.; Howard, Danny; Dahal, Devendra; Gilmanov, Tagir G.

    2018-01-01

    Integrating spatially explicit biogeophysical and remotely sensed data into regression-tree models enables the spatial extrapolation of training data over large geographic spaces, allowing a better understanding of broad-scale ecosystem processes. The current study presents annual gross primary production (GPP) and annual ecosystem respiration (RE) for 2000–2013 in several short-statured vegetation types using carbon flux data from towers that are located strategically across the conterminous United States (CONUS). We calculate carbon fluxes (annual net ecosystem production [NEP]) for each year in our study period, which includes 2012 when drought and higher-than-normal temperatures influence vegetation productivity in large parts of the study area. We present and analyse carbon flux dynamics in the CONUS to better understand how drought affects GPP, RE, and NEP. Model accuracy metrics show strong correlation coefficients (r) (r ≥ 94%) between training and estimated data for both GPP and RE. Overall, average annual GPP, RE, and NEP are relatively constant throughout the study period except during 2012 when almost 60% less carbon is sequestered than normal. These results allow us to conclude that this modelling method effectively estimates carbon dynamics through time and allows the exploration of impacts of meteorological anomalies and vegetation types on carbon dynamics.
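
    The flux-tower training data are not included here; the sketch below only illustrates the general regression-tree approach (a tree ensemble trained on point data, then applied across a grid) with synthetic predictors, assuming scikit-learn.

        # Minimal sketch: a regression-tree ensemble mapping remotely sensed predictors to a
        # carbon flux, then applied wall-to-wall to a grid (synthetic data, not the CONUS study).
        import numpy as np
        from sklearn.ensemble import GradientBoostingRegressor

        rng = np.random.default_rng(10)
        ndvi = rng.uniform(0.1, 0.9, 400)
        precip = rng.uniform(100, 900, 400)
        temp = rng.uniform(2, 25, 400)
        gpp = 800 * ndvi + 0.3 * precip - 5 * (temp - 15) ** 2 + rng.normal(0, 40, 400)   # "tower" GPP

        X = np.column_stack([ndvi, precip, temp])
        model = GradientBoostingRegressor(n_estimators=300, max_depth=3).fit(X, gpp)

        grid = np.column_stack([rng.uniform(0.1, 0.9, 5), rng.uniform(100, 900, 5), rng.uniform(2, 25, 5)])
        print(model.predict(grid))          # extrapolated annual GPP for five grid cells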

  17. Regression and regression analysis time series prediction modeling on climate data of Quetta, Pakistan

    International Nuclear Information System (INIS)

    Jafri, Y.Z.; Kamal, L.

    2007-01-01

    Various statistical techniques were applied to five years of data (1998-2002) on average humidity, rainfall, and maximum and minimum temperatures. Regression analysis time series (RATS) relationships were developed to determine the overall trend of these climate parameters, on the basis of which forecast models can be corrected and modified. We computed the coefficient of determination as a measure of goodness of fit for our polynomial regression analysis time series (PRATS). Multiple linear regression (MLR) and multiple linear regression analysis time series (MLRATS) correlations were also developed for deciphering the interdependence of the weather parameters. Spearman's rank correlation and the Goldfeld-Quandt test were used to check the uniformity or non-uniformity of variances in our polynomial regression (PR) fit. The Breusch-Pagan test was applied to MLR and MLRATS, respectively, which indicated homoscedasticity. We also employed Bartlett's test for homogeneity of variances on the five-year rainfall and humidity data, which showed that the variances in the rainfall data were not homogeneous, while those for humidity were homogeneous. Our results on regression and regression analysis time series show the best fit for prediction modeling on the climatic data of Quetta, Pakistan. (author)
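
    One of the diagnostics mentioned, the Breusch-Pagan test for heteroscedasticity of regression residuals, can be run with statsmodels as in the sketch below (made-up data, not the Quetta series).

        # Minimal sketch: Breusch-Pagan test for heteroscedasticity of OLS residuals
        # (made-up data where the noise deliberately grows with x).
        import numpy as np
        import statsmodels.api as sm
        from statsmodels.stats.diagnostic import het_breuschpagan

        rng = np.random.default_rng(11)
        x = rng.uniform(0, 10, 200)
        y = 3 + 2 * x + rng.normal(0, 0.5 + 0.3 * x, 200)     # heteroscedastic noise

        X = sm.add_constant(x)
        fit = sm.OLS(y, X).fit()
        lm_stat, lm_pvalue, f_stat, f_pvalue = het_breuschpagan(fit.resid, X)
        print(f"Breusch-Pagan LM p-value: {lm_pvalue:.4f}")   # small p-value: reject homoscedasticity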

  18. Online Support Vector Regression with Varying Parameters for Time-Dependent Data

    International Nuclear Information System (INIS)

    Omitaomu, Olufemi A.; Jeong, Myong K.; Badiru, Adedeji B.

    2011-01-01

    Support vector regression (SVR) is a machine learning technique that continues to receive interest in several domains, including manufacturing, engineering, and medicine. In order to extend its application to problems in which datasets arrive constantly and in which batch processing of the datasets is infeasible or expensive, an accurate online support vector regression (AOSVR) technique was proposed. The AOSVR technique efficiently updates a trained SVR function whenever a sample is added to or removed from the training set, without retraining on the entire training data. However, the AOSVR technique assumes that the new samples and the training samples have the same characteristics; hence, the same values of the SVR parameters are used for training and prediction. This assumption is not applicable to data samples that are inherently noisy and non-stationary, such as sensor data. As a result, we propose Accurate On-line Support Vector Regression with Varying Parameters (AOSVR-VP), which uses varying SVR parameters rather than fixed SVR parameters and hence accounts for the variability that may exist in the samples. To accomplish this objective, we also propose a generalized weight function to automatically update the weights of the SVR parameters in on-line monitoring applications. The proposed function allows for lower and upper bounds on the SVR parameters. We tested our proposed approach and compared results with the conventional AOSVR approach using two benchmark time series datasets and sensor data from a nuclear power plant. The results show that using varying SVR parameters is more applicable to time-dependent data.

  19. Straight line fitting and predictions: On a marginal likelihood approach to linear regression and errors-in-variables models

    Science.gov (United States)

    Christiansen, Bo

    2015-04-01

    Linear regression methods are without doubt the most used approaches to describe and predict data in the physical sciences. They are often good first-order approximations and they are in general easier to apply and interpret than more advanced methods. However, even the properties of univariate regression can lead to debate over the appropriateness of various models, as witnessed by the recent discussion about climate reconstruction methods. Before linear regression is applied, important choices have to be made regarding the origins of the noise terms and regarding which of the two variables under consideration should be treated as the independent variable. These decisions are often not easy to make but they may have a considerable impact on the results. We seek to give a unified probabilistic (Bayesian with flat priors) treatment of univariate linear regression and prediction by taking, as a starting point, the general errors-in-variables model (Christiansen, J. Clim., 27, 2014-2031, 2014). Other versions of linear regression can be obtained as limits of this model. We derive the likelihood of the model parameters and predictands of the general errors-in-variables model by marginalizing over the nuisance parameters. The resulting likelihood is relatively simple and easy to analyze and calculate. The well-known unidentifiability of the errors-in-variables model is manifested as the absence of a well-defined maximum in the likelihood. However, this does not mean that probabilistic inference cannot be made; the marginal likelihoods of the model parameters and the predictands have, in general, well-defined maxima. We also include a probabilistic version of classical calibration and show how it is related to the errors-in-variables model. The results are illustrated by an example from the coupling between the lower stratosphere and the troposphere in the Northern Hemisphere winter.
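
    The paper's Bayesian marginal-likelihood treatment is not reproduced here; as a frequentist stand-in for the errors-in-variables idea it starts from, the sketch below fits a straight line by orthogonal distance regression with SciPy, with assumed noise levels on both variables.

        # Minimal sketch: a straight-line errors-in-variables fit via orthogonal distance
        # regression (scipy.odr). This is a stand-in, not the paper's Bayesian treatment.
        import numpy as np
        from scipy import odr

        rng = np.random.default_rng(12)
        x_true = np.linspace(0, 10, 60)
        x = x_true + rng.normal(0, 0.5, 60)            # noise in the "independent" variable too
        y = 1.0 + 0.7 * x_true + rng.normal(0, 0.5, 60)

        def line(beta, x):
            return beta[0] + beta[1] * x

        data = odr.RealData(x, y, sx=0.5, sy=0.5)      # assumed measurement errors on both axes
        fit = odr.ODR(data, odr.Model(line), beta0=[0.0, 1.0]).run()
        print("intercept, slope:", fit.beta, "+/-", fit.sd_beta)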

  20. Biostatistics Series Module 6: Correlation and Linear Regression.

    Science.gov (United States)

    Hazra, Avijit; Gogtay, Nithya

    2016-01-01

    Correlation and linear regression are the most commonly used techniques for quantifying the association between two numeric variables. Correlation quantifies the strength of the linear relationship between paired variables, expressing this as a correlation coefficient. If both variables x and y are normally distributed, we calculate Pearson's correlation coefficient (r). If the normality assumption is not met for one or both variables in a correlation analysis, a rank correlation coefficient, such as Spearman's rho (ρ), may be calculated. A hypothesis test of correlation tests whether the linear relationship between the two variables holds in the underlying population, in which case it returns a P value; a confidence interval for the correlation coefficient can also be calculated to give an idea of the correlation in the population. The value r² denotes the proportion of the variability of the dependent variable y that can be attributed to its linear relation with the independent variable x and is called the coefficient of determination. Linear regression is a technique that attempts to link two correlated variables x and y in the form of a mathematical equation (y = a + bx), such that given the value of one variable the other may be predicted. In general, the method of least squares is applied to obtain the equation of the regression line. Correlation and linear regression analysis are based on certain assumptions pertaining to the data sets. If these assumptions are not met, misleading conclusions may be drawn. The first assumption is that of a linear relationship between the two variables. A scatter plot is essential before embarking on any correlation-regression analysis to show that this is indeed the case. Outliers or clustering within data sets can distort the correlation coefficient value. Finally, it is vital to remember that though strong correlation can be a pointer toward causation, the two are not synonymous.
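
    The workflow described (correlation coefficient, then the least-squares line y = a + bx) can be sketched with SciPy on made-up paired measurements:

        # Minimal sketch: Pearson and Spearman correlation plus the least-squares line y = a + bx
        # (made-up paired measurements).
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(13)
        x = rng.normal(50, 10, 100)
        y = 0.8 * x + rng.normal(0, 8, 100)

        r, p = stats.pearsonr(x, y)
        rho, p_rho = stats.spearmanr(x, y)
        fit = stats.linregress(x, y)
        print(f"Pearson r = {r:.2f} (P = {p:.3g}), Spearman rho = {rho:.2f}")
        print(f"y = {fit.intercept:.2f} + {fit.slope:.2f} x, r^2 = {fit.rvalue**2:.2f}")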

  1. Complex regression Doppler optical coherence tomography

    Science.gov (United States)

    Elahi, Sahar; Gu, Shi; Thrane, Lars; Rollins, Andrew M.; Jenkins, Michael W.

    2018-04-01

    We introduce a new method to measure Doppler shifts more accurately and extend the dynamic range of Doppler optical coherence tomography (OCT). The two-point estimate of the conventional Doppler method is replaced with a regression that is applied to high-density B-scans in polar coordinates. We built a high-speed OCT system using a 1.68-MHz Fourier domain mode locked laser to acquire high-density B-scans (16,000 A-lines) at high enough frame rates (~100 fps) to accurately capture the dynamics of the beating embryonic heart. Flow phantom experiments confirm that the complex regression lowers the minimum detectable velocity from 12.25 mm/s to 374 μm/s, whereas the maximum velocity of 400 mm/s is measured without phase wrapping. Complex regression Doppler OCT also demonstrates higher accuracy and precision compared with the conventional method, particularly when the signal-to-noise ratio is low. The extended dynamic range allows monitoring of blood flow over several stages of development in embryos without adjusting the imaging parameters. In addition, applying complex averaging recovers hidden features in structural images.

  2. Regression with Sparse Approximations of Data

    DEFF Research Database (Denmark)

    Noorzad, Pardis; Sturm, Bob L.

    2012-01-01

    We propose sparse approximation weighted regression (SPARROW), a method for local estimation of the regression function that uses sparse approximation with a dictionary of measurements. SPARROW estimates the regression function at a point with a linear combination of a few regressands selected...... by a sparse approximation of the point in terms of the regressors. We show SPARROW can be considered a variant of k-nearest neighbors regression (k-NNR), and more generally, local polynomial kernel regression. Unlike k-NNR, however, SPARROW can adapt the number of regressors to use based...

  3. Regression analysis with categorized regression calibrated exposure: some interesting findings

    Directory of Open Access Journals (Sweden)

    Hjartåker Anette

    2006-07-01

    Full Text Available Abstract Background Regression calibration as a method for handling measurement error is becoming increasingly well-known and used in epidemiologic research. However, the standard version of the method is not appropriate for exposure analyzed on a categorical (e.g. quintile) scale, an approach commonly used in epidemiologic studies. A tempting solution could then be to use the predicted continuous exposure obtained through the regression calibration method and treat it as an approximation to the true exposure, that is, include the categorized calibrated exposure in the main regression analysis. Methods We use semi-analytical calculations and simulations to evaluate the performance of the proposed approach compared to the naive approach of not correcting for measurement error, in situations where analyses are performed on quintile scale and when incorporating the original scale into the categorical variables, respectively. We also present analyses of real data, containing measures of folate intake and depression, from the Norwegian Women and Cancer study (NOWAC). Results In cases where extra information is available through replicated measurements and not validation data, regression calibration does not maintain important qualities of the true exposure distribution, thus estimates of variance and percentiles can be severely biased. We show that the outlined approach maintains much, in some cases all, of the misclassification found in the observed exposure. For that reason, regression analysis with the corrected variable included on a categorical scale is still biased. In some cases the corrected estimates are analytically equal to those obtained by the naive approach. Regression calibration is however vastly superior to the naive method when applying the medians of each category in the analysis. Conclusion Regression calibration in its most well-known form is not appropriate for measurement error correction when the exposure is analyzed on a

  4. Logistic regression applied to natural hazards: rare event logistic regression with replications

    OpenAIRE

    Guns, M.; Vanacker, Veerle

    2012-01-01

    Statistical analysis of natural hazards needs particular attention, as most of these phenomena are rare events. This study shows that the ordinary rare event logistic regression, as it is now commonly used in geomorphologic studies, does not always lead to a robust detection of controlling factors, as the results can be strongly sample-dependent. In this paper, we introduce some concepts of Monte Carlo simulations in rare event logistic regression. This technique, so-called rare event logisti...

  5. Better Autologistic Regression

    Directory of Open Access Journals (Sweden)

    Mark A. Wolters

    2017-11-01

    Full Text Available Autologistic regression is an important probability model for dichotomous random variables observed along with covariate information. It has been used in various fields for analyzing binary data possessing spatial or network structure. The model can be viewed as an extension of the autologistic model (also known as the Ising model, quadratic exponential binary distribution, or Boltzmann machine) to include covariates. It can also be viewed as an extension of logistic regression to handle responses that are not independent. Not all authors use exactly the same form of the autologistic regression model. Variations of the model differ in two respects. First, the variable coding—the two numbers used to represent the two possible states of the variables—might differ. Common coding choices are (zero, one) and (minus one, plus one). Second, the model might appear in either of two algebraic forms: a standard form, or a recently proposed centered form. Little attention has been paid to the effect of these differences, and the literature shows ambiguity about their importance. It is shown here that changes to either coding or centering in fact produce distinct, non-nested probability models. Theoretical results, numerical studies, and analysis of an ecological data set all show that the differences among the models can be large and practically significant. Understanding the nature of the differences and making appropriate modeling choices can lead to significantly improved autologistic regression analyses. The results strongly suggest that the standard model with plus/minus coding, which we call the symmetric autologistic model, is the most natural choice among the autologistic variants.

  6. Using Quartile-Quartile Lines as Linear Models

    Science.gov (United States)

    Gordon, Sheldon P.

    2015-01-01

    This article introduces the notion of the quartile-quartile line as an alternative to the regression line and the median-median line to produce a linear model based on a set of data. It is based on using the first and third quartiles of a set of (x, y) data. Dynamic spreadsheets are used as exploratory tools to compare the different approaches and…

  7. Logistic regression applied to natural hazards: rare event logistic regression with replications

    Science.gov (United States)

    Guns, M.; Vanacker, V.

    2012-06-01

    Statistical analysis of natural hazards needs particular attention, as most of these phenomena are rare events. This study shows that the ordinary rare event logistic regression, as it is now commonly used in geomorphologic studies, does not always lead to a robust detection of controlling factors, as the results can be strongly sample-dependent. In this paper, we introduce some concepts of Monte Carlo simulations in rare event logistic regression. This technique, so-called rare event logistic regression with replications, combines the strength of probabilistic and statistical methods, and allows overcoming some of the limitations of previous developments through robust variable selection. This technique was here developed for the analyses of landslide controlling factors, but the concept is widely applicable for statistical analyses of natural hazards.

  8. A simple approach to power and sample size calculations in logistic regression and Cox regression models.

    Science.gov (United States)

    Vaeth, Michael; Skovlund, Eva

    2004-06-15

    For a given regression problem it is possible to identify a suitably defined equivalent two-sample problem such that the power or sample size obtained for the two-sample problem also applies to the regression problem. For a standard linear regression model the equivalent two-sample problem is easily identified, but for generalized linear models and for Cox regression models the situation is more complicated. An approximately equivalent two-sample problem may, however, also be identified here. In particular, we show that for logistic regression and Cox regression models the equivalent two-sample problem is obtained by selecting two equally sized samples for which the parameters differ by a value equal to the slope times twice the standard deviation of the independent variable and further requiring that the overall expected number of events is unchanged. In a simulation study we examine the validity of this approach to power calculations in logistic regression and Cox regression models. Several different covariate distributions are considered for selected values of the overall response probability and a range of alternatives. For the Cox regression model we consider both constant and non-constant hazard rates. The results show that in general the approach is remarkably accurate even in relatively small samples. Some discrepancies are, however, found in small samples with few events and a highly skewed covariate distribution. Comparison with results based on alternative methods for logistic regression models with a single continuous covariate indicates that the proposed method is at least as good as its competitors. The method is easy to implement and therefore provides a simple way to extend the range of problems that can be covered by the usual formulas for power and sample size determination. Copyright 2004 John Wiley & Sons, Ltd.
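
    A rough sketch of how this equivalent two-sample idea could be turned into a sample size calculation for logistic regression is given below: the two groups' log-odds are set to differ by 2·(slope)·SD(x) while the overall event probability is held fixed, and the classical two-proportion formula is then applied. This is an illustration under those stated assumptions, not the authors' exact derivation; the function name and inputs are hypothetical.

        import numpy as np
        from scipy.stats import norm
        from scipy.optimize import brentq

        def n_total_logistic(beta, sd_x, p_bar, alpha=0.05, power=0.80):
            """Approximate total sample size for testing slope beta in a logistic
            regression, via an equivalent two-sample comparison of proportions."""
            delta = 2.0 * beta * sd_x                     # log-odds difference between the two groups
            expit = lambda l: 1.0 / (1.0 + np.exp(-l))
            # midpoint on the logit scale such that the average event probability equals p_bar
            mid = brentq(lambda l: 0.5 * (expit(l - delta / 2) + expit(l + delta / 2)) - p_bar,
                         -20, 20)
            p1, p2 = expit(mid - delta / 2), expit(mid + delta / 2)
            za, zb = norm.ppf(1 - alpha / 2), norm.ppf(power)
            n_per_group = (za + zb) ** 2 * (p1 * (1 - p1) + p2 * (1 - p2)) / (p1 - p2) ** 2
            return int(np.ceil(2 * n_per_group))

        print(n_total_logistic(beta=0.5, sd_x=1.0, p_bar=0.3))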

  9. Study of multivariate analysis of quantitative traits in Iranian pumpkin lines

    Directory of Open Access Journals (Sweden)

    Yadegari Mehrab

    2017-01-01

    Full Text Available In this study, seed yield and its different components (fruit length, fruit diameter, fruit length/fruit diameter ratio (FL/FD), diameter of flesh, diameter of seed core, fruit weight, and 1000-seed weight) from 24 lines of pumpkin grown in Iran were examined. Twenty-five characters in all plant lines were measured according to the UPOV descriptor and the data were subjected to cluster analysis. Results showed that the plant lines were divided into four groups. In all groups, regression comparisons were made to model the effect of different characters on seed yield; the results also showed that fruit weight and fruit length had the most direct effect on seed yield in all groups. In conclusion, these traits are suggested as the best indirect selection criteria to improve seed yield genetically in Cucurbita spp. genotypes, especially in the preliminary generations of breeding and selection programs.

  10. Logistic regression applied to natural hazards: rare event logistic regression with replications

    Directory of Open Access Journals (Sweden)

    M. Guns

    2012-06-01

    Full Text Available Statistical analysis of natural hazards needs particular attention, as most of these phenomena are rare events. This study shows that the ordinary rare event logistic regression, as it is now commonly used in geomorphologic studies, does not always lead to a robust detection of controlling factors, as the results can be strongly sample-dependent. In this paper, we introduce some concepts of Monte Carlo simulations in rare event logistic regression. This technique, so-called rare event logistic regression with replications, combines the strength of probabilistic and statistical methods, and allows overcoming some of the limitations of previous developments through robust variable selection. This technique was here developed for the analyses of landslide controlling factors, but the concept is widely applicable for statistical analyses of natural hazards.

  11. Magnitude conversion to unified moment magnitude using orthogonal regression relation

    Science.gov (United States)

    Das, Ranjit; Wason, H. R.; Sharma, M. L.

    2012-05-01

    Homogenization of earthquake catalog being a pre-requisite for seismic hazard assessment requires region based magnitude conversion relationships. Linear Standard Regression (SR) relations fail when both the magnitudes have measurement errors. To accomplish homogenization, techniques like Orthogonal Standard Regression (OSR) are thus used. In this paper a technique is proposed for using such OSR for preparation of homogenized earthquake catalog in moment magnitude Mw. For derivation of orthogonal regression relation between mb and Mw, a data set consisting of 171 events with observed body wave magnitudes (mb,obs) and moment magnitude (Mw,obs) values has been taken from ISC and GCMT databases for Northeast India and adjoining region for the period 1978-2006. Firstly, an OSR relation has been developed using mb,obs and Mw,obs values corresponding to 150 events from this data set: Mw = 1.3(±0.004) mb,proxy - 1.4(±0.130), where mb,proxy are body wave magnitude values of the points on the OSR line given by the orthogonality criterion, for observed (mb,obs, Mw,obs) points. A linear relation is then developed between these 150 mb,obs values and corresponding mb,proxy values given by the OSR line using the orthogonality criterion. The relation obtained is mb,proxy = 0.878(±0.03) mb,obs + 0.653(±0.15). The accuracy of the above procedure has been checked with the rest of the data, i.e., the remaining 21 events. The improvement in the correlation coefficient value between mb,obs and Mw estimated using the proposed procedure compared to the correlation coefficient value between mb,obs and Mw,obs shows the advantage of the OSR relationship for homogenization. The OSR procedure developed in this study can be used to homogenize any catalog containing various magnitudes (e.g., ML, mb, MS) with measurement errors, by their conversion to unified moment magnitude Mw. The proposed procedure also remains valid in case the magnitudes have measurement errors of different orders, i.e. the error variance ratio is
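
    For reference, orthogonal (total least squares) regression with equal error variances in both magnitudes, i.e. the orthogonality criterion referred to above, can be computed in a few lines of numpy; the magnitude pairs below are placeholders, not data from the study.

        import numpy as np

        def orthogonal_regression(x, y):
            """Orthogonal (total least squares) fit y = a + b*x, assuming
            equal error variances in x and y (error variance ratio = 1)."""
            x, y = np.asarray(x, float), np.asarray(y, float)
            sxx = np.var(x, ddof=1)
            syy = np.var(y, ddof=1)
            sxy = np.cov(x, y, ddof=1)[0, 1]
            b = (syy - sxx + np.sqrt((syy - sxx) ** 2 + 4.0 * sxy ** 2)) / (2.0 * sxy)
            a = y.mean() - b * x.mean()
            return a, b

        # hypothetical body wave (mb) and moment (Mw) magnitude pairs
        mb = np.array([4.8, 5.0, 5.3, 5.6, 5.9, 6.1])
        mw = np.array([4.9, 5.2, 5.5, 5.9, 6.3, 6.4])
        a, b = orthogonal_regression(mb, mw)
        print(f"Mw = {b:.3f} mb {a:+.3f}")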

  12. A primer for biomedical scientists on how to execute model II linear regression analysis.

    Science.gov (United States)

    Ludbrook, John

    2012-04-01

    1. There are two very different ways of executing linear regression analysis. One is Model I, when the x-values are fixed by the experimenter. The other is Model II, in which the x-values are free to vary and are subject to error. 2. I have received numerous complaints from biomedical scientists that they have great difficulty in executing Model II linear regression analysis. This may explain the results of a Google Scholar search, which showed that the authors of articles in journals of physiology, pharmacology and biochemistry rarely use Model II regression analysis. 3. I repeat my previous arguments in favour of using least products linear regression analysis for Model II regressions. I review three methods for executing ordinary least products (OLP) and weighted least products (WLP) regression analysis: (i) scientific calculator and/or computer spreadsheet; (ii) specific purpose computer programs; and (iii) general purpose computer programs. 4. Using a scientific calculator and/or computer spreadsheet, it is easy to obtain correct values for OLP slope and intercept, but the corresponding 95% confidence intervals (CI) are inaccurate. 5. Using specific purpose computer programs, the freeware computer program smatr gives the correct OLP regression coefficients and obtains 95% CI by bootstrapping. In addition, smatr can be used to compare the slopes of OLP lines. 6. When using general purpose computer programs, I recommend the commercial programs systat and Statistica for those who regularly undertake linear regression analysis and I give step-by-step instructions in the Supplementary Information as to how to use loss functions. © 2011 The Author. Clinical and Experimental Pharmacology and Physiology. © 2011 Blackwell Publishing Asia Pty Ltd.
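
    As a rough indication of what such programs compute, the ordinary least products (geometric mean, Model II) slope and intercept, with a bootstrap confidence interval for the slope, can be written in a few lines of Python. This is only a sketch on simulated data, not a substitute for smatr or the commercial packages named above.

        import numpy as np

        rng = np.random.default_rng(1)

        def olp_fit(x, y):
            """Ordinary least products (geometric mean / reduced major axis) line."""
            r = np.corrcoef(x, y)[0, 1]
            b = np.sign(r) * np.std(y, ddof=1) / np.std(x, ddof=1)
            a = np.mean(y) - b * np.mean(x)
            return a, b

        # simulated Model II data: both variables measured with error
        true = rng.normal(10.0, 2.0, 60)
        x = true + rng.normal(0.0, 1.0, 60)
        y = 1.5 * true + rng.normal(0.0, 1.5, 60)

        a, b = olp_fit(x, y)
        slopes = []
        for _ in range(2000):                       # bootstrap the (x, y) pairs
            idx = rng.integers(0, len(x), len(x))
            slopes.append(olp_fit(x[idx], y[idx])[1])
        lo, hi = np.percentile(slopes, [2.5, 97.5])
        print(f"OLP line: y = {a:.2f} + {b:.2f} x, 95% CI for slope: [{lo:.2f}, {hi:.2f}]")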

  13. Regression and Sparse Regression Methods for Viscosity Estimation of Acid Milk From it’s Sls Features

    DEFF Research Database (Denmark)

    Sharifzadeh, Sara; Skytte, Jacob Lercke; Nielsen, Otto Højager Attermann

    2012-01-01

    Statistical solutions find widespread use in food and medicine quality control. We investigate the effect of different regression and sparse regression methods for a viscosity estimation problem using the spectro-temporal features from a new Sub-Surface Laser Scattering (SLS) vision system. From...... with sparse LAR, lasso and Elastic Net (EN) sparse regression methods. Due to the inconsistent measurement condition, Locally Weighted Scatter plot Smoothing (Loess) has been employed to alleviate the undesired variation in the estimated viscosity. The experimental results of applying different methods show...

  14. Construction of multiple linear regression models using blood biomarkers for selecting against abdominal fat traits in broilers.

    Science.gov (United States)

    Dong, J Q; Zhang, X Y; Wang, S Z; Jiang, X F; Zhang, K; Ma, G W; Wu, M Q; Li, H; Zhang, H

    2018-01-01

    Plasma very low-density lipoprotein (VLDL) can be used to select for low body fat or abdominal fat (AF) in broilers, but its correlation with AF is limited. We investigated whether any other biochemical indicator can be used in combination with VLDL for a better selective effect. Nineteen plasma biochemical indicators were measured in male chickens from the Northeast Agricultural University broiler lines divergently selected for AF content (NEAUHLF) in the fed state at 46 and 48 d of age. The average concentration of every parameter for the 2 d was used for statistical analysis. Levels of these 19 plasma biochemical parameters were compared between the lean and fat lines. The phenotypic correlations between these plasma biochemical indicators and AF traits were analyzed. Then, multiple linear regression models were constructed to select the best model for selecting against AF content, and the heritabilities of the plasma indicators contained in the best models were estimated. The results showed that 11 plasma biochemical indicators (triglycerides, total bile acid, total protein, globulin, albumin/globulin, aspartate transaminase, alanine transaminase, gamma-glutamyl transpeptidase, uric acid, creatinine, and VLDL) differed significantly between the lean and fat lines (P < 0.05). The multiple linear regression models based on albumin/globulin, VLDL, triglycerides, globulin, total bile acid, and uric acid had a higher R2 (0.73) than the model based only on VLDL (0.21). The plasma parameters included in the best models had moderate heritability estimates (0.21 ≤ h2 ≤ 0.43). These results indicate that these multiple linear regression models can be used to select for lean broiler chickens. © 2017 Poultry Science Association Inc.
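
    The gain from combining indicators can be illustrated with a small statsmodels sketch comparing the R2 of a VLDL-only model with a multi-indicator model; the biomarker values below are simulated and the variable names are merely placeholders for the indicators listed above.

        import numpy as np
        import pandas as pd
        import statsmodels.api as sm

        rng = np.random.default_rng(0)
        n = 120
        # hypothetical plasma indicators (arbitrary units); AF stands for abdominal fat
        df = pd.DataFrame({
            "VLDL": rng.normal(0.4, 0.1, n),
            "AG_ratio": rng.normal(1.2, 0.2, n),
            "TG": rng.normal(0.6, 0.15, n),
        })
        df["AF"] = 2.0 * df["VLDL"] + 1.5 * df["AG_ratio"] + 1.0 * df["TG"] + rng.normal(0, 0.3, n)

        single = sm.OLS(df["AF"], sm.add_constant(df[["VLDL"]])).fit()
        multi = sm.OLS(df["AF"], sm.add_constant(df[["VLDL", "AG_ratio", "TG"]])).fit()
        print(f"R2, VLDL only: {single.rsquared:.2f}; R2, multi-indicator model: {multi.rsquared:.2f}")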

  15. Dual Regression

    OpenAIRE

    Spady, Richard; Stouli, Sami

    2012-01-01

    We propose dual regression as an alternative to the quantile regression process for the global estimation of conditional distribution functions under minimal assumptions. Dual regression provides all the interpretational power of the quantile regression process while avoiding the need for repairing the intersecting conditional quantile surfaces that quantile regression often produces in practice. Our approach introduces a mathematical programming characterization of conditional distribution f...

  16. Bivariate least squares linear regression: Towards a unified analytic formalism. I. Functional models

    Science.gov (United States)

    Caimmi, R.

    2011-08-01

    Concerning bivariate least squares linear regression, the classical approach pursued for functional models in earlier attempts ( York, 1966, 1969) is reviewed using a new formalism in terms of deviation (matrix) traces which, for unweighted data, reduce to usual quantities leaving aside an unessential (but dimensional) multiplicative factor. Within the framework of classical error models, the dependent variable relates to the independent variable according to the usual additive model. The classes of linear models considered are regression lines in the general case of correlated errors in X and in Y for weighted data, and in the opposite limiting situations of (i) uncorrelated errors in X and in Y, and (ii) completely correlated errors in X and in Y. The special case of (C) generalized orthogonal regression is considered in detail together with well known subcases, namely: (Y) errors in X negligible (ideally null) with respect to errors in Y; (X) errors in Y negligible (ideally null) with respect to errors in X; (O) genuine orthogonal regression; (R) reduced major-axis regression. In the limit of unweighted data, the results determined for functional models are compared with their counterparts related to extreme structural models i.e. the instrumental scatter is negligible (ideally null) with respect to the intrinsic scatter ( Isobe et al., 1990; Feigelson and Babu, 1992). While regression line slope and intercept estimators for functional and structural models necessarily coincide, the contrary holds for related variance estimators even if the residuals obey a Gaussian distribution, with the exception of Y models. An example of astronomical application is considered, concerning the [O/H]-[Fe/H] empirical relations deduced from five samples related to different stars and/or different methods of oxygen abundance determination. For selected samples and assigned methods, different regression models yield consistent results within the errors (∓ σ) for both

  17. An Assessment of Polynomial Regression Techniques for the Relative Radiometric Normalization (RRN of High-Resolution Multi-Temporal Airborne Thermal Infrared (TIR Imagery

    Directory of Open Access Journals (Sweden)

    Mir Mustafizur Rahman

    2014-11-01

    Full Text Available Thermal Infrared (TIR) remote sensing images of urban environments are increasingly available from airborne and satellite platforms. However, limited access to high-spatial resolution (H-res: ~1 m) TIR satellite images requires the use of TIR airborne sensors for mapping large complex urban surfaces, especially at micro-scales. A critical limitation of such H-res mapping is the need to acquire a large scene composed of multiple flight lines and mosaic them together. This results in the same scene components (e.g., roads, buildings, green space and water) exhibiting different temperatures in different flight lines. To mitigate these effects, linear relative radiometric normalization (RRN) techniques are often applied. However, the Earth’s surface is composed of features whose thermal behaviour is characterized by complexity and non-linearity. Therefore, we hypothesize that non-linear RRN techniques should demonstrate increased radiometric agreement over similar linear techniques. To test this hypothesis, this paper evaluates four (linear and non-linear) RRN techniques, including: (i) histogram matching (HM); (ii) pseudo-invariant feature-based polynomial regression (PIF_Poly); (iii) no-change stratified random sample-based linear regression (NCSRS_Lin); and (iv) no-change stratified random sample-based polynomial regression (NCSRS_Poly); two of which ((ii) and (iv)) are newly proposed non-linear techniques. When applied over two adjacent flight lines (~70 km2) of TABI-1800 airborne data, visual and statistical results show that both new non-linear techniques improved radiometric agreement over the previously evaluated linear techniques, with the new fully-automated method, NCSRS-based polynomial regression, providing the highest improvement in radiometric agreement between the master and the slave images, at ~56%. This is ~5% higher than the best previously evaluated linear technique (NCSRS-based linear regression).
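
    The core of the polynomial RRN idea can be sketched in a few lines of numpy: take (near) no-change pixels from the overlap of the two flight lines, fit a second-order polynomial mapping the slave line to the master line, and apply it to the slave image. The temperatures below are simulated, so the coefficients and error figures are purely illustrative.

        import numpy as np

        rng = np.random.default_rng(0)

        # hypothetical at-sensor temperatures (K) from the overlap of two flight lines:
        # 'master' is the reference line, 'slave' carries a non-linear radiometric offset
        master = rng.uniform(270.0, 310.0, 5000)
        slave = 0.00012 * master ** 2 + 0.9 * master + 2.0 + rng.normal(0.0, 0.3, 5000)

        # a no-change stratified random sample would normally be drawn here; we use all pixels
        coeffs = np.polyfit(slave, master, deg=2)     # 2nd-order polynomial RRN
        normalized = np.polyval(coeffs, slave)        # slave mapped to the master radiometric scale

        print("RMSE before:", np.sqrt(np.mean((slave - master) ** 2)))
        print("RMSE after: ", np.sqrt(np.mean((normalized - master) ** 2)))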

  18. Spontaneous regression of a congenital melanocytic nevus

    Directory of Open Access Journals (Sweden)

    Amiya Kumar Nath

    2011-01-01

    Full Text Available Congenital melanocytic nevus (CMN) may rarely regress which may also be associated with a halo or vitiligo. We describe a 10-year-old girl who presented with CMN on the left leg since birth, which recently started to regress spontaneously with associated depigmentation in the lesion and at a distant site. Dermoscopy performed at different sites of the regressing lesion demonstrated loss of epidermal pigments first followed by loss of dermal pigments. Histopathology and Masson-Fontana stain demonstrated lymphocytic infiltration and loss of pigment production in the regressing area. Immunohistochemistry staining (S100 and HMB-45), however, showed that nevus cells were present in the regressing areas.

  19. Evaluation of Spring Wheat Recombinant Inbred Lines under Drought Stress

    Directory of Open Access Journals (Sweden)

    M. Moghaddaszadeh-Ahrabi

    2012-07-01

    Full Text Available Iran lies in one of the arid and semi-arid regions of the world. Wheat, as a strategic agricultural product, faces water deficiency in most areas of the country. Therefore, identification of varieties resistant to drought stress is one of the main aims for breeders. To assess the effect of drought stress at heading, 72 spring wheat recombinant inbred lines derived from a cross between the American cultivar Yecora Rojo (high yielding, dwarf and early maturing) as paternal parent and the Iranian No. 49 line (tall and late maturing) as maternal parent were studied. The experiment was conducted at the Research Station of the University of Tabriz using a randomized complete block design with two replications during the 2009 growing season. Based on the results of the combined analysis of variance, significant differences were observed among lines for all of the traits studied, except for harvest index, grain number per spike and days to heading. There was a significant difference between normal and drought stress conditions. Since the interaction between line and condition was not significant for any trait, the lines could be compared without regard to irrigation level. Based on the trait means, lines 96, 122, 123 and 155 were found to be superior. MP, GMP and STI were recognized as suitable indices to identify superior lines; with respect to these indices, lines 96, 122, 123, 138, 149 and 155 were superior compared with the remaining lines. In the stepwise regression analysis of grain yield on the other traits, grain number per spike, number of spikes/m2 and 1000-kernel weight, in that order, entered the final model as variables affecting grain yield, together accounting for 81.9 percent of the grain yield variation. Path analysis of grain yield and related traits, based on the stepwise regression, demonstrated significant positive direct effects of grain number per spike, number of spikes/m2 and 1000-kernel weight on grain yield

  20. On Solving Lq-Penalized Regressions

    Directory of Open Access Journals (Sweden)

    Tracy Zhou Wu

    2007-01-01

    Full Text Available Lq-penalized regression arises in multidimensional statistical modelling where all or part of the regression coefficients are penalized to achieve both accuracy and parsimony of statistical models. There is often substantial computational difficulty except for the quadratic penalty case. The difficulty is partly due to the nonsmoothness of the objective function inherited from the use of the absolute value. We propose a new solution method for the general Lq-penalized regression problem based on space transformation and thus efficient optimization algorithms. The new method has immediate applications in statistics, notably in penalized spline smoothing problems. In particular, the LASSO problem is shown to be polynomial time solvable. Numerical studies show promise of our approach.

  1. Auto-associative Kernel Regression Model with Weighted Distance Metric for Instrument Drift Monitoring

    International Nuclear Information System (INIS)

    Shin, Ho Cheol; Park, Moon Ghu; You, Skin

    2006-01-01

    Recently, many on-line approaches to instrument channel surveillance (drift monitoring and fault detection) have been reported worldwide. An on-line monitoring (OLM) method evaluates instrument channel performance by assessing its consistency with other plant indications through parametric or non-parametric models. The heart of an OLM system is the model giving an estimate of the true process parameter value against individual measurements. This model gives a process parameter estimate calculated as a function of other plant measurements which can be used to identify small sensor drifts that would require the sensor to be manually calibrated or replaced. This paper describes an improvement of auto-associative kernel regression (AAKR) by introducing a correlation coefficient weighting on kernel distances. The prediction performance of the developed method is compared with conventional auto-associative kernel regression
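
    A compact sketch of plain AAKR with an optional per-signal weighting of the distance metric is shown below (here the weights are illustrative absolute correlations with the first signal); this is a generic illustration on simulated signals, not the authors' exact formulation.

        import numpy as np

        def aakr_predict(X_mem, x_query, bandwidth=1.0, signal_weights=None):
            """Auto-associative kernel regression: estimate the 'true' signal vector
            for x_query as a kernel-weighted average of historical memory vectors."""
            if signal_weights is None:
                signal_weights = np.ones(X_mem.shape[1])
            d2 = ((X_mem - x_query) ** 2 * signal_weights).sum(axis=1)  # weighted squared distances
            w = np.exp(-d2 / (2.0 * bandwidth ** 2))                    # Gaussian kernel
            return (w[:, None] * X_mem).sum(axis=0) / w.sum()

        rng = np.random.default_rng(0)
        base = rng.normal(size=500)                                     # common plant behaviour
        X_mem = np.column_stack([base + rng.normal(0, 0.1, 500) for _ in range(4)])
        x_query = X_mem[10] + rng.normal(0, 0.05, 4)                    # new, slightly noisy observation

        corr_w = np.abs(np.corrcoef(X_mem, rowvar=False))[0]            # illustrative signal weights
        print(aakr_predict(X_mem, x_query, signal_weights=corr_w))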

  2. LINEAR REGRESSION MODEL ESTİMATİON FOR RIGHT CENSORED DATA

    Directory of Open Access Journals (Sweden)

    Ersin Yılmaz

    2016-05-01

    Full Text Available In this study we first define right-censored data: briefly, right censoring means that values above a certain limit are not observed exactly, which may be related to the measuring device. We then consider a response variable observed under right censoring together with its explanatory variables, and estimate the linear regression model. Because of the censoring, Kaplan-Meier weights are used in the estimation of the model; with these weights the regression estimator is consistent and unbiased. A semiparametric regression method is also available for censored data and likewise gives useful results. This study may be particularly useful for health research, where censored data commonly arise in medical applications.
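
    One common way to implement this idea is to attach Kaplan-Meier (Stute) weights to the ordered responses and then run weighted least squares; the numpy sketch below follows that route on simulated data. The estimator details are an assumption on our part, since the abstract does not spell them out.

        import numpy as np

        def km_weights(y, delta):
            """Kaplan-Meier (Stute) weights for a right-censored response.
            y: observed responses (censored or not), delta: 1 if uncensored."""
            n = len(y)
            order = np.argsort(y, kind="mergesort")
            d = np.asarray(delta, float)[order]
            w_sorted = np.zeros(n)
            carry = 1.0
            for i in range(n):              # 0-based index; (n - i) equals n - i + 1 in 1-based notation
                w_sorted[i] = d[i] / (n - i) * carry
                carry *= ((n - i - 1) / (n - i)) ** d[i]
            w = np.zeros(n)
            w[order] = w_sorted             # weights returned in the original data order
            return w

        rng = np.random.default_rng(0)
        n = 200
        x = rng.uniform(0, 2, n)
        y_true = 1.0 + 2.0 * x + rng.normal(0, 0.5, n)
        c = rng.uniform(1, 6, n)            # censoring limits
        y = np.minimum(y_true, c)
        delta = (y_true <= c).astype(int)

        w = km_weights(y, delta)
        X = np.column_stack([np.ones(n), x])
        sw = np.sqrt(w)
        beta = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)[0]
        print("KM-weighted estimates (intercept, slope):", beta)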

  3. Genome-wide prediction of discrete traits using bayesian regressions and machine learning

    Directory of Open Access Journals (Sweden)

    Forni Selma

    2011-02-01

    Full Text Available Abstract Background Genomic selection has gained much attention and the main goal is to increase the predictive accuracy and the genetic gain in livestock using dense marker information. Most methods dealing with the large p (number of covariates), small n (number of observations) problem have dealt only with continuous traits, but there are many important traits in livestock that are recorded in a discrete fashion (e.g. pregnancy outcome, disease resistance). It is necessary to evaluate alternatives to analyze discrete traits in a genome-wide prediction context. Methods This study shows two threshold versions of Bayesian regressions (Bayes A and Bayesian LASSO) and two machine learning algorithms (boosting and random forest) to analyze discrete traits in a genome-wide prediction context. These methods were evaluated using simulated and field data to predict yet-to-be observed records. Performances were compared based on the models' predictive ability. Results The simulation showed that machine learning had some advantages over Bayesian regressions when a small number of QTL regulated the trait under pure additivity. However, differences were small and disappeared with a large number of QTL. Bayesian threshold LASSO and boosting achieved the highest accuracies, whereas Random Forest presented the highest classification performance. Random Forest was the most consistent method in detecting resistant and susceptible animals, phi correlation was up to 81% greater than Bayesian regressions. Random Forest outperformed other methods in correctly classifying resistant and susceptible animals in the two pure swine lines evaluated. Boosting and Bayes A were more accurate with crossbred data. Conclusions The results of this study suggest that the best method for genome-wide prediction may depend on the genetic basis of the population analyzed. All methods were less accurate at correctly classifying intermediate animals than extreme animals. Among the different

  4. Regressive Progression: The Quest for Self-Transcendence in Western Tragedy

    Directory of Open Access Journals (Sweden)

    Bahee Hadaegh

    2009-07-01

    Full Text Available Regressive progression is a concept which interestingly describes the developmental process of Western tragedy based on the recurring motif of the quest for the higher self and Nietzsche’s understanding of the Dionysian tragic hero. This motif reveals itself in three manifestations - action, imagination and inaction - respectively visible in the three major dramatic eras of the Renaissance tragedy, European nineteenth-century drama, and the Absurd Theatre. Although the approach of the quest regressively shifts from action to inaction, the degree of success of the tragic questers in approximating the wished-for higher self reveals a progressive line in the developmental process of Western tragedy.

  5. Multivariate Frequency-Severity Regression Models in Insurance

    Directory of Open Access Journals (Sweden)

    Edward W. Frees

    2016-02-01

    Full Text Available In insurance and related industries including healthcare, it is common to have several outcome measures that the analyst wishes to understand using explanatory variables. For example, in automobile insurance, an accident may result in payments for damage to one’s own vehicle, damage to another party’s vehicle, or personal injury. It is also common to be interested in the frequency of accidents in addition to the severity of the claim amounts. This paper synthesizes and extends the literature on multivariate frequency-severity regression modeling with a focus on insurance industry applications. Regression models for understanding the distribution of each outcome continue to be developed yet there now exists a solid body of literature for the marginal outcomes. This paper contributes to this body of literature by focusing on the use of a copula for modeling the dependence among these outcomes; a major advantage of this tool is that it preserves the body of work established for marginal models. We illustrate this approach using data from the Wisconsin Local Government Property Insurance Fund. This fund offers insurance protection for (i) property; (ii) motor vehicle; and (iii) contractors’ equipment claims. In addition to several claim types and frequency-severity components, outcomes can be further categorized by time and space, requiring complex dependency modeling. We find significant dependencies for these data; specifically, we find that dependencies among lines are stronger than the dependencies between the frequency and average severity within each line.

  6. GEANT4 simulation diagram showing the architecture of the ATLAS test line: the detectors are positioned to receive the beam from the SPS. A muon particle which enters the magnet and crosses all detectors is shown (blue line).

    CERN Multimedia

    2004-01-01

    GEANT4 simulation diagram showing the architecture of the ATLAS test line: the detectors are positioned to receive the beam from the SPS. A muon particle which enters the magnet and crosses all detectors is shown (blue line).

  7. Regression: A Bibliography.

    Science.gov (United States)

    Pedrini, D. T.; Pedrini, Bonnie C.

    Regression, another mechanism studied by Sigmund Freud, has had much research, e.g., hypnotic regression, frustration regression, schizophrenic regression, and infra-human-animal regression (often directly related to fixation). Many investigators worked with hypnotic age regression, which has a long history, going back to Russian reflexologists.…

  8. Regression of uveal malignant melanomas following cobalt-60 plaque. Correlates between acoustic spectrum analysis and tumor regression

    International Nuclear Information System (INIS)

    Coleman, D.J.; Lizzi, F.L.; Silverman, R.H.; Ellsworth, R.M.; Haik, B.G.; Abramson, D.H.; Smith, M.E.; Rondeau, M.J.

    1985-01-01

    Parameters derived from computer analysis of digital radio-frequency (rf) ultrasound scan data of untreated uveal malignant melanomas were examined for correlations with tumor regression following cobalt-60 plaque. Parameters included tumor height, normalized power spectrum and acoustic tissue type (ATT). Acoustic tissue type was based upon discriminant analysis of tumor power spectra, with spectra of tumors of known pathology serving as a model. Results showed ATT to be correlated with tumor regression during the first 18 months following treatment. Tumors with ATT associated with spindle cell malignant melanoma showed over twice the percentage reduction in height as those with ATT associated with mixed/epithelioid melanomas. Pre-treatment height was only weakly correlated with regression. Additionally, significant spectral changes were observed following treatment. Ultrasonic spectrum analysis thus provides a noninvasive tool for classification, prediction and monitoring of tumor response to cobalt-60 plaque

  9. A novel cell line derived from pleomorphic adenoma expresses MMP2, MMP9, TIMP1, TIMP2, and shows numeric chromosomal anomalies.

    Directory of Open Access Journals (Sweden)

    Aline Semblano Carreira Falcão

    Full Text Available Pleomorphic adenoma is the most common salivary gland neoplasm, and it can be locally invasive, despite its slow growth. This study aimed to establish a novel cell line (AP-1) derived from a human pleomorphic adenoma sample to better understand the local invasiveness of this tumor. The AP-1 cell line was characterized by cell growth analysis, expression of epithelial and myoepithelial markers by immunofluorescence, electron microscopy, 3D cell culture assays, cytogenetic features and transcriptomic study. Expression of matrix metalloproteinases (MMPs) and their tissue inhibitors (TIMPs) was also analyzed by immunofluorescence and zymography. Furthermore, epithelial and myoepithelial markers, MMPs and TIMPs were studied in the tumor that originated the cell line. AP-1 cells showed neoplastic epithelial and myoepithelial markers, such as cytokeratins, vimentin, S100 protein and smooth-muscle actin. These molecules were also found in vivo, in the tumor that originated the cell line. MMPs and TIMPs were observed in vivo and in AP-1 cells. The growth curve showed that AP-1 exhibited a doubling time of 3.342 days. AP-1 cells grown inside Matrigel recapitulated tumor architecture. Different numerical and structural chromosomal anomalies were visualized in cytogenetic analysis. Transcriptomic analysis addressed expression of 7 target genes (VIM, TIMP2, MMP2, MMP9, TIMP1, ACTA2 and PLAG1). Results were compared to the transcriptomic profile of non-neoplastic salivary gland cells (HSG). Only MMP9 was not expressed in both libraries, and VIM was expressed solely in the AP-1 library. The major difference regarding gene expression level between AP-1 and HSG samples occurred for MMP2. This gene was 184 times more expressed in AP-1 cells. Our findings suggest that the AP-1 cell line could be a useful model for further studies on pleomorphic adenoma biology.

  10. Unbalanced Regressions and the Predictive Equation

    DEFF Research Database (Denmark)

    Osterrieder, Daniela; Ventosa-Santaulària, Daniel; Vera-Valdés, J. Eduardo

    Predictive return regressions with persistent regressors are typically plagued by (asymptotically) biased/inconsistent estimates of the slope, non-standard or potentially even spurious statistical inference, and regression unbalancedness. We alleviate the problem of unbalancedness in the theoretical predictive equation by suggesting a data generating process, where returns are generated as linear functions of a lagged latent I(0) risk process. The observed predictor is a function of this latent I(0) process, but it is corrupted by a fractionally integrated noise. Such a process may arise due...... to aggregation or unexpected level shifts. In this setup, the practitioner estimates a misspecified, unbalanced, and endogenous predictive regression. We show that the OLS estimate of this regression is inconsistent, but standard inference is possible. To obtain a consistent slope estimate, we then suggest...

  11. Predictors of course in obsessive-compulsive disorder: logistic regression versus Cox regression for recurrent events.

    Science.gov (United States)

    Kempe, P T; van Oppen, P; de Haan, E; Twisk, J W R; Sluis, A; Smit, J H; van Dyck, R; van Balkom, A J L M

    2007-09-01

    Two methods for predicting remissions in obsessive-compulsive disorder (OCD) treatment are evaluated. Y-BOCS measurements of 88 patients with a primary OCD (DSM-III-R) diagnosis were performed over a 16-week treatment period, and during three follow-ups. Remission at any measurement was defined as a Y-BOCS score lower than thirteen combined with a reduction of seven points when compared with baseline. Logistic regression models were compared with a Cox regression for recurrent events model. Logistic regression yielded different models at different evaluation times. The recurrent events model remained stable when fewer measurements were used. Higher baseline levels of neuroticism and more severe OCD symptoms were associated with a lower chance of remission, early age of onset and more depressive symptoms with a higher chance. Choice of outcome time affects logistic regression prediction models. Recurrent events analysis uses all information on remissions and relapses. Short- and long-term predictors for OCD remission show overlap.

  12. Comparison of multinomial logistic regression and logistic regression: which is more efficient in allocating land use?

    Science.gov (United States)

    Lin, Yingzhi; Deng, Xiangzheng; Li, Xing; Ma, Enjun

    2014-12-01

    Spatially explicit simulation of land use change is the basis for estimating the effects of land use and cover change on energy fluxes, ecology and the environment. At the pixel level, logistic regression is one of the most common approaches used in spatially explicit land use allocation models to determine the relationship between land use and its causal factors in driving land use change, and thereby to evaluate land use suitability. However, these models have a drawback in that they do not determine/allocate land use based on the direct relationship between land use change and its driving factors. Consequently, a multinomial logistic regression method was introduced to address this flaw, and thereby, judge the suitability of a type of land use in any given pixel in a case study area of the Jiangxi Province, China. A comparison of the two regression methods indicated that the proportion of correctly allocated pixels using multinomial logistic regression was 92.98%, which was 8.47% higher than that obtained using logistic regression. Paired t-test results also showed that pixels were more clearly distinguished by multinomial logistic regression than by logistic regression. In conclusion, multinomial logistic regression is a more efficient and accurate method for the spatial allocation of land use changes. The application of this method in future land use change studies may improve the accuracy of predicting the effects of land use and cover change on energy fluxes, ecology, and environment.
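
    A small scikit-learn sketch of the comparison on synthetic pixel data is given below; the driving factors, class labels and coefficients are all made up, and the 'proportion of correctly allocated pixels' is simply test-set accuracy. With the default lbfgs solver, LogisticRegression fits a multinomial model for multi-class labels, while the per-class loop mimics fitting separate binary logistic regressions.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)
        n = 3000
        X = rng.normal(size=(n, 3))                  # hypothetical drivers: elevation, slope, road distance
        W = np.array([[1.0, -0.5, 0.2],              # made-up coefficients for 3 land-use classes
                      [-0.8, 1.2, 0.1],
                      [0.1, 0.0, -1.0]])
        probs = np.exp(X @ W.T)
        probs /= probs.sum(axis=1, keepdims=True)
        y = np.array([rng.choice(3, p=p) for p in probs])   # 0 cropland, 1 forest, 2 built-up

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

        multinom = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
        print("proportion of correctly allocated pixels:", multinom.score(X_te, y_te))

        for k in range(3):                           # separate binary logistic models, one per class
            bin_k = LogisticRegression(max_iter=1000).fit(X_tr, (y_tr == k).astype(int))
            print(f"class {k} vs rest accuracy:", bin_k.score(X_te, (y_te == k).astype(int)))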

  13. Regression away from the mean: Theory and examples.

    Science.gov (United States)

    Schwarz, Wolf; Reike, Dennis

    2018-02-01

    Using a standard repeated measures model with arbitrary true score distribution and normal error variables, we present some fundamental closed-form results which explicitly indicate the conditions under which regression effects towards the mean (RTM) and away from the mean are expected. Specifically, we show that for skewed and bimodal distributions many or even most cases will show a regression effect that is in expectation away from the mean, or that is not just towards but actually beyond the mean. We illustrate our results in quantitative detail with typical examples from experimental and biometric applications, which exhibit a clear regression away from the mean ('egression from the mean') signature. We aim not to repeal cautionary advice against potential RTM effects, but to present a balanced view of regression effects, based on a clear identification of the conditions governing the form that regression effects take in repeated measures designs. © 2017 The British Psychological Society.
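
    The effect is easy to reproduce numerically. The sketch below simulates a bimodal true-score distribution with normal measurement error and selects cases scoring modestly above the grand mean at the first test; on retest, the expected score of the selected group moves farther from the grand mean, the 'egression' pattern described above. All numbers are illustrative.

        import numpy as np

        rng = np.random.default_rng(0)
        n = 400_000
        # bimodal true scores: two well-separated modes at -3 and +3 (grand mean 0)
        comp = rng.integers(0, 2, n)
        true = np.where(comp == 1, rng.normal(3, 0.5, n), rng.normal(-3, 0.5, n))
        t1 = true + rng.normal(0, 1.0, n)            # test
        t2 = true + rng.normal(0, 1.0, n)            # retest

        sel = (t1 > 0.5) & (t1 < 1.5)                # scores modestly above the grand mean
        print("grand mean:           ", round(t1.mean(), 2))
        print("selected mean, test:  ", round(t1[sel].mean(), 2))
        print("selected mean, retest:", round(t2[sel].mean(), 2))
        # the retest mean of the selected group lies farther from the grand mean
        # than their test mean: regression away from the mean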

  14. [Application of negative binomial regression and modified Poisson regression in the research of risk factors for injury frequency].

    Science.gov (United States)

    Cao, Qingqing; Wu, Zhenqiang; Sun, Ying; Wang, Tiezhu; Han, Tengwei; Gu, Chaomei; Sun, Yehuan

    2011-11-01

    To explore the application of negative binomial regression and modified Poisson regression analysis in analyzing the influential factors for injury frequency and the risk factors leading to an increase in injury frequency. 2917 primary and secondary school students were selected from Hefei by the cluster random sampling method and surveyed by questionnaire. The data on count-based injury events were used to fit modified Poisson regression and negative binomial regression models. The risk factors increasing the frequency of unintentional injury among juvenile students were explored, so as to probe the efficiency of these two models in studying the influential factors for injury frequency. The Poisson model showed over-dispersion (P < 0.05); the modified Poisson regression and negative binomial regression models, respectively, fitted the data better. Both showed that male gender, younger age, father working outside of the hometown, the guardian's educational level being above junior high school and smoking might be associated with higher injury frequencies. Given the tendency of clustering in frequency data on injury events, both modified Poisson regression analysis and negative binomial regression analysis can be used. However, based on our data, the modified Poisson regression fitted better and this model could give a more accurate interpretation of the relevant factors affecting the frequency of injury.
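
    For readers who want to reproduce this kind of comparison, a minimal statsmodels sketch on simulated, over-dispersed injury counts is shown below; the covariates and the dispersion value are invented, and the robust-variance Poisson fit is one common reading of 'modified Poisson' regression.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(0)
        n = 1500
        male = rng.integers(0, 2, n)
        smoker = rng.integers(0, 2, n)
        X = sm.add_constant(np.column_stack([male, smoker]))

        # hypothetical over-dispersed injury counts (negative binomial draws with mean mu)
        mu = np.exp(-0.5 + 0.4 * male + 0.3 * smoker)
        counts = rng.negative_binomial(n=2, p=2 / (2 + mu))

        # modified Poisson: Poisson GLM with robust (sandwich) standard errors
        mod_pois = sm.GLM(counts, X, family=sm.families.Poisson()).fit(cov_type="HC0")
        # negative binomial GLM (dispersion parameter fixed here for simplicity)
        neg_bin = sm.GLM(counts, X, family=sm.families.NegativeBinomial(alpha=0.5)).fit()

        print(mod_pois.params.round(3))
        print(neg_bin.params.round(3))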

  15. Yield Stability of Sorghum Hybrids and Parental Lines | Kenga ...

    African Journals Online (AJOL)

    Seventy-five sorghum hybrids and twenty parental lines were evaluated for two consecutive years at two locations. Our objective was to compare relative stability of grain yields among hybrids and parental lines. Mean grain yields and stability analysis of variance, which included linear regression coefficient (bi) and ...

  16. Multiple Lines of Evidence

    Energy Technology Data Exchange (ETDEWEB)

    Amidan, Brett G. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Venzin, Alexander M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Bramer, Lisa M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-06-03

    This paper discusses the process of identifying factors that influence the contamination level of a given decision area and then determining the likelihood that the area remains unacceptable. This process is referred to as lines of evidence. These lines of evidence then serve as inputs for the stratified compliance sampling (SCS) method, which requires a decision area to be divided into strata based upon contamination expectations. This is done in order to focus sampling efforts more within stratum where contamination is more likely and to use the domain knowledge about these likelihoods of the stratum remaining unacceptable to buy down the number of samples necessary, if possible. Two different building scenarios were considered as an example (see Table 3.1). SME expertise was elicited concerning four lines of evidence factors (see Table 3.2): 1) amount of contamination that was seen before decontamination, 2) post-decontamination air sampling information, 3) the applied decontaminant information, and 4) the surface material. Statistical experimental design and logistic regression modelling were used to help determine the likelihood that example stratum remained unacceptable for a given example scenario. The number of samples necessary for clearance was calculated by applying the SCS method to the example scenario, using the estimated likelihood of each stratum remaining unacceptable as was determined using the lines of evidence approach. The commonly used simple random sampling (SRS) method was also used to calculate the number of samples necessary for clearance for comparison purposes. The lines of evidence with SCS approach resulted in a 19% to 43% reduction in total number of samples necessary for clearance (see Table 3.6). The reduction depended upon the building scenario, as well as the level of percent clean criteria. A sensitivity analysis was also performed showing how changing the estimated likelihoods of stratum remaining unacceptable affect the number

  17. Bayesian ARTMAP for regression.

    Science.gov (United States)

    Sasu, L M; Andonie, R

    2013-10-01

    Bayesian ARTMAP (BA) is a recently introduced neural architecture which uses a combination of Fuzzy ARTMAP competitive learning and Bayesian learning. Training is generally performed online, in a single-epoch. During training, BA creates input data clusters as Gaussian categories, and also infers the conditional probabilities between input patterns and categories, and between categories and classes. During prediction, BA uses Bayesian posterior probability estimation. So far, BA was used only for classification. The goal of this paper is to analyze the efficiency of BA for regression problems. Our contributions are: (i) we generalize the BA algorithm using the clustering functionality of both ART modules, and name it BA for Regression (BAR); (ii) we prove that BAR is a universal approximator with the best approximation property. In other words, BAR approximates arbitrarily well any continuous function (universal approximation) and, for every given continuous function, there is one in the set of BAR approximators situated at minimum distance (best approximation); (iii) we experimentally compare the online trained BAR with several neural models, on the following standard regression benchmarks: CPU Computer Hardware, Boston Housing, Wisconsin Breast Cancer, and Communities and Crime. Our results show that BAR is an appropriate tool for regression tasks, both for theoretical and practical reasons. Copyright © 2013 Elsevier Ltd. All rights reserved.

  18. THE ANALYSIS OF CORRELATIONS BETWEEN THE MAIN TRAITS OF WOOL PRODUCTION ON PALAS SHEEP LINE FOR MEAT, MILK AND HIGH PROLIFICACY

    Directory of Open Access Journals (Sweden)

    ANA ENCIU

    2008-10-01

    Full Text Available The aim of this paper was to analyze the coefficients of phenotypic correlation and regression between the main wool production traits for sheep belonging to the Palas line specialized for meat, milk and high prolificacy. The study was performed over a 10-year interval, the phenotypic correlations and regressions being determined by age group and body weight class for the following traits: raw wool production, staple length, wool diameter and body weight at shearing. The results show that for the specialized sheep lines the efficiency of wool production is also higher for sheep with moderate body weights, but for these sheep lines the selection for body weight will be done based on the morphoproductive parameters specific to the purpose of exploitation (milk production, meat production or high prolificacy).

  19. Ordinary least square regression, orthogonal regression, geometric mean regression and their applications in aerosol science

    International Nuclear Information System (INIS)

    Leng Ling; Zhang Tianyi; Kleinman, Lawrence; Zhu Wei

    2007-01-01

    Regression analysis, especially the ordinary least squares method which assumes that errors are confined to the dependent variable, has seen a fair share of its applications in aerosol science. The ordinary least squares approach, however, could be problematic due to the fact that atmospheric data often does not lend itself to calling one variable independent and the other dependent. Errors often exist for both measurements. In this work, we examine two regression approaches available to accommodate this situation. They are orthogonal regression and geometric mean regression. Comparisons are made theoretically as well as numerically through an aerosol study examining whether the ratio of organic aerosol to CO would change with age

  20. Polynomial regression analysis and significance test of the regression function

    International Nuclear Information System (INIS)

    Gao Zhengming; Zhao Juan; He Shengping

    2012-01-01

    In order to analyze the decay heating power per kilogram of a certain radioactive isotope with the polynomial regression method, the paper first demonstrates the broad usage of the polynomial function and deduces its parameters with the ordinary least squares estimate. Then a significance test method for the polynomial regression function is derived, considering the similarity between the polynomial regression model and the multivariable linear regression model. Finally, polynomial regression analysis and a significance test of the polynomial function are applied to the decay heating power of the isotope per kilogram, in accordance with the authors' real work. (authors)
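
    A small numpy/scipy sketch of such an analysis is shown below: a second-order polynomial is fitted by least squares and the overall regression is tested with an F statistic. The decay-heat values are invented for illustration and do not come from the paper.

        import numpy as np
        from scipy import stats

        # hypothetical decay-heat measurements (W/kg) at several times (years)
        t = np.array([0.5, 1, 2, 3, 5, 8, 12, 20, 30, 50], dtype=float)
        q = np.array([9.8, 8.9, 7.6, 6.8, 5.7, 4.6, 3.8, 2.9, 2.3, 1.6])

        deg = 2
        coef = np.polyfit(t, q, deg)                 # least-squares polynomial fit
        q_hat = np.polyval(coef, t)

        # significance test of the regression function: F = (SSR/deg) / (SSE/(n-deg-1))
        n = len(t)
        ss_reg = np.sum((q_hat - q.mean()) ** 2)
        ss_err = np.sum((q - q_hat) ** 2)
        F = (ss_reg / deg) / (ss_err / (n - deg - 1))
        p = stats.f.sf(F, deg, n - deg - 1)
        print(f"coefficients: {coef}, F = {F:.1f}, P = {p:.3g}")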

  1. Genetic analysis of partial egg production records in Japanese quail using random regression models.

    Science.gov (United States)

    Abou Khadiga, G; Mahmoud, B Y F; Farahat, G S; Emam, A M; El-Full, E A

    2017-08-01

    The main objectives of this study were to detect the most appropriate random regression model (RRM) to fit the data of monthly egg production in 2 lines (selected and control) of Japanese quail and to test the consistency of different criteria of model choice. Data from 1,200 female Japanese quails for the first 5 months of egg production from 4 consecutive generations of an egg line selected for egg production in the first month (EP1) were analyzed. Eight RRMs with different orders of Legendre polynomials were compared to determine the proper model for analysis. All criteria of model choice suggested that the adequate model included the second-order Legendre polynomials for fixed effects, and the third-order for additive genetic effects and permanent environmental effects. Predictive ability of the best model was the highest among all models (ρ = 0.987). According to the best model fitted to the data, estimates of heritability were relatively low to moderate (0.10 to 0.17) and showed a descending pattern from the first to the fifth month of production. A similar pattern was observed for permanent environmental effects with greater estimates in the first (0.36) and second (0.23) months of production than heritability estimates. Genetic correlations between separate production periods were higher (0.18 to 0.93) than their phenotypic counterparts (0.15 to 0.87). The superiority of the selected line over the control was observed through significantly (P < 0.05) higher egg production at earlier ages (first and second months) than at later ones. A methodology based on random regression animal models can be recommended for genetic evaluation of egg production in Japanese quail. © 2017 Poultry Science Association Inc.
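
    The fixed and random curves in such models are built from Legendre polynomials of standardized age. A small numpy sketch of constructing that basis for 5 monthly records is given below, using the orders selected above (second order for fixed effects, third order for the genetic and permanent environmental curves); normalization constants and everything else about the model are omitted, so this is only illustrative.

        import numpy as np
        from numpy.polynomial import legendre

        months = np.arange(1, 6, dtype=float)                                      # production months 1..5
        a = -1.0 + 2.0 * (months - months.min()) / (months.max() - months.min())   # map ages to [-1, 1]

        # Legendre design matrices: columns are P0(a), P1(a), ... up to the chosen order
        phi_fixed = legendre.legvander(a, 2)      # 2nd order for the fixed regression
        phi_genetic = legendre.legvander(a, 3)    # 3rd order for additive genetic / permanent environment

        print(phi_fixed.round(3))
        print(phi_genetic.shape)                  # (5, 4): intercept plus 3 polynomial terms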

  2. Reduced Rank Regression

    DEFF Research Database (Denmark)

    Johansen, Søren

    2008-01-01

    The reduced rank regression model is a multivariate regression model with a coefficient matrix with reduced rank. The reduced rank regression algorithm is an estimation procedure, which estimates the reduced rank regression model. It is related to canonical correlations and involves calculating...

  3. Poisson Mixture Regression Models for Heart Disease Prediction.

    Science.gov (United States)

    Mufudza, Chipo; Erol, Hamza

    2016-01-01

    Early heart disease control can be achieved by high disease prediction and diagnosis efficiency. This paper focuses on the use of model based clustering techniques to predict and diagnose heart disease via Poisson mixture regression models. Analysis and application of Poisson mixture regression models is here addressed under two different classes: standard and concomitant variable mixture regression models. Results show that a two-component concomitant variable Poisson mixture regression model predicts heart disease better than both the standard Poisson mixture regression model and the ordinary general linear Poisson regression model due to its low Bayesian Information Criteria value. Furthermore, a Zero Inflated Poisson Mixture Regression model turned out to be the best model for heart prediction over all models as it both clusters individuals into high or low risk category and predicts rate to heart disease componentwise given clusters available. It is deduced that heart disease prediction can be effectively done by identifying the major risks componentwise using Poisson mixture regression model.

  4. Poisson Mixture Regression Models for Heart Disease Prediction

    Science.gov (United States)

    Erol, Hamza

    2016-01-01

    Early heart disease control can be achieved by high disease prediction and diagnosis efficiency. This paper focuses on the use of model based clustering techniques to predict and diagnose heart disease via Poisson mixture regression models. Analysis and application of Poisson mixture regression models is here addressed under two different classes: standard and concomitant variable mixture regression models. Results show that a two-component concomitant variable Poisson mixture regression model predicts heart disease better than both the standard Poisson mixture regression model and the ordinary general linear Poisson regression model due to its low Bayesian Information Criteria value. Furthermore, a Zero Inflated Poisson Mixture Regression model turned out to be the best model for heart prediction over all models as it both clusters individuals into high or low risk category and predicts rate to heart disease componentwise given clusters available. It is deduced that heart disease prediction can be effectively done by identifying the major risks componentwise using Poisson mixture regression model. PMID:27999611

  5. Comparison of Linear and Non-linear Regression Analysis to Determine Pulmonary Pressure in Hyperthyroidism.

    Science.gov (United States)

    Scarneciu, Camelia C; Sangeorzan, Livia; Rus, Horatiu; Scarneciu, Vlad D; Varciu, Mihai S; Andreescu, Oana; Scarneciu, Ioan

    2017-01-01

    This study aimed at assessing the incidence of pulmonary hypertension (PH) in newly diagnosed hyperthyroid patients and at finding a simple model showing the complex functional relation between pulmonary hypertension in hyperthyroidism and the factors causing it. The 53 hyperthyroid patients (H-group) were evaluated mainly by using an echocardiographical method and compared with 35 euthyroid (E-group) and 25 healthy people (C-group). In order to identify the factors causing pulmonary hypertension, the statistical method of comparing the values of arithmetical means was used. The functional relation between the two random variables (PAPs and each of the factors determining it within our research study) can be expressed by a linear or non-linear function. By applying the linear regression method, described by a first-degree equation, the line of regression (linear model) has been determined; by applying the non-linear regression method, described by a second-degree equation, a parabola-type curve of regression (non-linear or polynomial model) has been determined. We made the comparison and the validation of these two models by calculating the determination coefficient (criterion 1), the comparison of residuals (criterion 2), application of the AIC criterion (criterion 3) and use of the F-test (criterion 4). From the H-group, 47% have pulmonary hypertension completely reversible when obtaining euthyroidism. The factors causing pulmonary hypertension were identified: previously known factors (level of free thyroxine, pulmonary vascular resistance, cardiac output) and new factors identified in this study (pretreatment period, age, systolic blood pressure). According to the four criteria and to the clinical judgment, we consider that the polynomial model (graphically, parabola-type) is better than the linear one. The better model showing the functional relation between the pulmonary hypertension in hyperthyroidism and the factors identified in this study is given by a polynomial equation of second degree.
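
    A minimal Python sketch of the model-comparison workflow described above (determination coefficient, AIC and an F-test for a linear versus a second-degree fit), run on simulated data rather than the clinical data set of the record.

        # Linear vs. quadratic regression compared by R^2, AIC and an F-test.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(2)
        x = rng.uniform(0, 10, size=80)                                       # a causal factor
        y = 20 + 1.5 * x + 0.4 * x**2 + rng.normal(scale=3, size=x.size)      # outcome with curvature

        X_lin = sm.add_constant(x)
        X_quad = sm.add_constant(np.column_stack([x, x**2]))

        fit_lin = sm.OLS(y, X_lin).fit()
        fit_quad = sm.OLS(y, X_quad).fit()

        print("R^2 linear / quadratic:", round(fit_lin.rsquared, 3), round(fit_quad.rsquared, 3))
        print("AIC linear / quadratic:", round(fit_lin.aic, 1), round(fit_quad.aic, 1))

        # F-test: does the quadratic term significantly improve the fit?
        f_stat, p_value, _ = fit_quad.compare_f_test(fit_lin)
        print("F =", round(f_stat, 2), " p =", round(p_value, 4))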

  6. Quantile Regression Methods

    DEFF Research Database (Denmark)

    Fitzenberger, Bernd; Wilke, Ralf Andreas

    2015-01-01

    Quantile regression is emerging as a popular statistical approach, which complements the estimation of conditional mean models. While the latter only focuses on one aspect of the conditional distribution of the dependent variable, the mean, quantile regression provides more detailed insights by modeling conditional quantiles. Quantile regression can therefore detect whether the partial effect of a regressor on the conditional quantiles is the same for all quantiles or differs across quantiles. Quantile regression can provide evidence for a statistical relationship between two variables even if the mean regression model does not. We provide a short informal introduction into the principle of quantile regression which includes an illustrative application from empirical labor market research. This is followed by briefly sketching the underlying statistical model for linear quantile regression based...
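
    A short Python sketch of linear quantile regression at several quantiles, showing how a regressor's effect can differ across the conditional distribution. The wage-style data are simulated and only illustrative of the kind of labor-market application mentioned in the record.

        # Quantile regression at the 10th, 50th and 90th percentiles (simulated data).
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(3)
        n = 1000
        educ = rng.uniform(8, 18, n)                            # years of education
        # heteroscedastic outcome: the education effect grows in the upper tail
        wage = 2 + 0.5 * educ + rng.normal(scale=0.2 * educ, size=n)

        X = sm.add_constant(educ)
        for q in (0.1, 0.5, 0.9):
            res = sm.QuantReg(wage, X).fit(q=q)
            print(f"quantile {q}: slope = {res.params[1]:.3f}")

        # For comparison, the conditional-mean (OLS) slope:
        print("OLS slope =", round(sm.OLS(wage, X).fit().params[1], 3))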

  7. A Monte Carlo simulation study comparing linear regression, beta regression, variable-dispersion beta regression and fractional logit regression at recovering average difference measures in a two sample design.

    Science.gov (United States)

    Meaney, Christopher; Moineddin, Rahim

    2014-01-24

    In biomedical research, response variables are often encountered which have bounded support on the open unit interval--(0,1). Traditionally, researchers have attempted to estimate covariate effects on these types of response data using linear regression. Alternative modelling strategies may include: beta regression, variable-dispersion beta regression, and fractional logit regression models. This study employs a Monte Carlo simulation design to compare the statistical properties of the linear regression model to that of the more novel beta regression, variable-dispersion beta regression, and fractional logit regression models. In the Monte Carlo experiment we assume a simple two sample design. We assume observations are realizations of independent draws from their respective probability models. The randomly simulated draws from the various probability models are chosen to emulate average proportion/percentage/rate differences of pre-specified magnitudes. Following simulation of the experimental data we estimate average proportion/percentage/rate differences. We compare the estimators in terms of bias, variance, type-1 error and power. Estimates of Monte Carlo error associated with these quantities are provided. If response data are beta distributed with constant dispersion parameters across the two samples, then all models are unbiased and have reasonable type-1 error rates and power profiles. If the response data in the two samples have different dispersion parameters, then the simple beta regression model is biased. When the sample size is small (N0 = N1 = 25) linear regression has superior type-1 error rates compared to the other models. Small sample type-1 error rates can be improved in beta regression models using bias correction/reduction methods. In the power experiments, variable-dispersion beta regression and fractional logit regression models have slightly elevated power compared to linear regression models. Similar results were observed if the
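
    A minimal Python sketch of two of the estimators compared above applied to a bounded outcome in a two-sample design: ordinary linear regression and a fractional logit model (a binomial GLM on the proportions with robust standard errors). The beta-distributed data are simulated with arbitrary parameter values.

        # Linear regression vs. fractional logit on (0,1) outcomes (simulated two-sample data).
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(4)
        n = 200
        group = np.repeat([0, 1], n)                             # two-sample design
        mu = np.where(group == 0, 0.35, 0.50)                    # true mean proportions
        phi = 20.0                                               # beta precision parameter
        y = rng.beta(mu * phi, (1 - mu) * phi)

        X = sm.add_constant(group.astype(float))

        ols = sm.OLS(y, X).fit()
        frac_logit = sm.GLM(y, X, family=sm.families.Binomial()).fit(cov_type="HC1")

        print("OLS group difference:        ", round(ols.params[1], 3))
        pred = frac_logit.predict(sm.add_constant(np.array([0.0, 1.0])))
        print("fractional-logit difference: ", round(pred[1] - pred[0], 3))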

  8. Regression Phalanxes

    OpenAIRE

    Zhang, Hongyang; Welch, William J.; Zamar, Ruben H.

    2017-01-01

    Tomal et al. (2015) introduced the notion of "phalanxes" in the context of rare-class detection in two-class classification problems. A phalanx is a subset of features that work well for classification tasks. In this paper, we propose a different class of phalanxes for application in regression settings. We define a "Regression Phalanx" - a subset of features that work well together for prediction. We propose a novel algorithm which automatically chooses Regression Phalanxes from high-dimensi...

  9. Two Paradoxes in Linear Regression Analysis

    Science.gov (United States)

    FENG, Ge; PENG, Jing; TU, Dongke; ZHENG, Julia Z.; FENG, Changyong

    2016-01-01

    Summary Regression is one of the favorite tools in applied statistics. However, misuse and misinterpretation of results from regression analysis are common in biomedical research. In this paper we use statistical theory and simulation studies to clarify some paradoxes around this popular statistical method. In particular, we show that a widely used model selection procedure employed in many publications in top medical journals is wrong. Formal procedures based on solid statistical theory should be used in model selection. PMID:28638214

  10. Variable and subset selection in PLS regression

    DEFF Research Database (Denmark)

    Høskuldsson, Agnar

    2001-01-01

    The purpose of this paper is to present some useful methods for introductory analysis of variables and subsets in relation to PLS regression. We present here methods that are efficient in finding the appropriate variables or subset to use in the PLS regression. The general conclusion is that variable selection is important for successful analysis of chemometric data. An important aspect of the results presented is that lack of variable selection can spoil the PLS regression, and that cross-validation measures using a test set can show larger variation, when we use different subsets of X, than...

  11. Estimating Loess Plateau Average Annual Precipitation with Multiple Linear Regression Kriging and Geographically Weighted Regression Kriging

    Directory of Open Access Journals (Sweden)

    Qiutong Jin

    2016-06-01

    Full Text Available Estimating the spatial distribution of precipitation is an important and challenging task in hydrology, climatology, ecology, and environmental science. In order to generate a highly accurate distribution map of average annual precipitation for the Loess Plateau in China, multiple linear regression Kriging (MLRK) and geographically weighted regression Kriging (GWRK) methods were employed using precipitation data from the period 1980–2010 from 435 meteorological stations. The predictors in regression Kriging were selected by stepwise regression analysis from many auxiliary environmental factors, such as elevation (DEM), normalized difference vegetation index (NDVI), solar radiation, slope, and aspect. All predictor distribution maps had a 500 m spatial resolution. Validation precipitation data from 130 hydrometeorological stations were used to assess the prediction accuracies of the MLRK and GWRK approaches. Results showed that both prediction maps with a 500 m spatial resolution interpolated by MLRK and GWRK had a high accuracy and captured detailed spatial distribution data; however, MLRK produced a lower prediction error and a higher variance explanation than GWRK, although the differences were small, in contrast to conclusions from similar studies.
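
    A Python sketch of the regression-Kriging idea: a linear trend on environmental covariates plus spatial interpolation of the residuals. A Gaussian process regressor stands in for the Kriging step here (the two are closely related); the station data, covariates and kernel settings are simulated and illustrative, not those of the study.

        # Regression trend + spatial residual interpolation (Kriging approximated by a GP).
        import numpy as np
        from sklearn.linear_model import LinearRegression
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, WhiteKernel

        rng = np.random.default_rng(5)
        n = 300
        coords = rng.uniform(0, 100, size=(n, 2))                # station locations (km)
        elev = rng.uniform(200, 3000, n)                         # e.g., DEM elevation
        ndvi = rng.uniform(0.1, 0.8, n)
        covariates = np.column_stack([elev, ndvi])

        # synthetic precipitation: linear trend + smooth spatial signal + noise
        spatial = 50 * np.sin(coords[:, 0] / 30) * np.cos(coords[:, 1] / 40)
        precip = 600 - 0.1 * elev + 400 * ndvi + spatial + rng.normal(0, 10, n)

        trend = LinearRegression().fit(covariates, precip)        # regression part
        resid = precip - trend.predict(covariates)

        kernel = 1.0 * RBF(length_scale=20.0) + WhiteKernel(noise_level=1.0)
        gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(coords, resid)

        # prediction at a new location = trend + interpolated residual
        new_xy, new_cov = np.array([[50.0, 50.0]]), np.array([[1500.0, 0.5]])
        pred = trend.predict(new_cov) + gp.predict(new_xy)
        print("predicted precipitation:", round(float(pred[0]), 1), "mm")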

  12. Should metacognition be measured by logistic regression?

    Science.gov (United States)

    Rausch, Manuel; Zehetleitner, Michael

    2017-03-01

    Are logistic regression slopes suitable to quantify metacognitive sensitivity, i.e. the efficiency with which subjective reports differentiate between correct and incorrect task responses? We analytically show that logistic regression slopes are independent from rating criteria in one specific model of metacognition, which assumes (i) that rating decisions are based on sensory evidence generated independently of the sensory evidence used for primary task responses and (ii) that the distributions of evidence are logistic. Given a hierarchical model of metacognition, logistic regression slopes depend on rating criteria. According to all considered models, regression slopes depend on the primary task criterion. A reanalysis of previous data revealed that massive numbers of trials are required to distinguish between hierarchical and independent models with tolerable accuracy. It is argued that researchers who wish to use logistic regression as measure of metacognitive sensitivity need to control the primary task criterion and rating criteria. Copyright © 2017 Elsevier Inc. All rights reserved.
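
    A minimal Python sketch of the measure under discussion: the slope of a logistic regression of response accuracy on confidence ratings, taken as an index of metacognitive sensitivity. The trial data are simulated and purely illustrative.

        # Logistic regression slope as a metacognitive sensitivity index (simulated trials).
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(6)
        n_trials = 400
        correct = rng.random(n_trials) < 0.75                    # primary-task accuracy
        # confidence ratings (1-4) that are, on average, higher on correct trials
        confidence = np.clip(np.round(rng.normal(2.0 + correct, 0.9)), 1, 4)

        X = sm.add_constant(confidence.astype(float))
        fit = sm.Logit(correct.astype(int), X).fit(disp=0)

        print("logistic slope (metacognitive sensitivity):", round(fit.params[1], 3))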

  13. Interpreting Bivariate Regression Coefficients: Going beyond the Average

    Science.gov (United States)

    Halcoussis, Dennis; Phillips, G. Michael

    2010-01-01

    Statistics, econometrics, investment analysis, and data analysis classes often review the calculation of several types of averages, including the arithmetic mean, geometric mean, harmonic mean, and various weighted averages. This note shows how each of these can be computed using a basic regression framework. By recognizing when a regression model…
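
    A short Python sketch of the idea summarized in this record: several familiar averages recovered from intercept-only regressions. Only the arithmetic, geometric and harmonic means are shown, on arbitrary positive numbers.

        # Averages as intercept-only regressions.
        import numpy as np
        import statsmodels.api as sm

        y = np.array([2.0, 4.0, 4.0, 8.0, 16.0])
        ones = np.ones_like(y)

        arithmetic = sm.OLS(y, ones).fit().params[0]              # intercept-only OLS = mean
        geometric = np.exp(sm.OLS(np.log(y), ones).fit().params[0])
        harmonic = 1.0 / sm.OLS(1.0 / y, ones).fit().params[0]

        print(round(arithmetic, 3), round(geometric, 3), round(harmonic, 3))
        print(round(y.mean(), 3))                                 # check against the direct mean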

  14. Advanced statistics: linear regression, part II: multiple linear regression.

    Science.gov (United States)

    Marill, Keith A

    2004-01-01

    The applications of simple linear regression in medical research are limited, because in most situations, there are multiple relevant predictor variables. Univariate statistical techniques such as simple linear regression use a single predictor variable, and they often may be mathematically correct but clinically misleading. Multiple linear regression is a mathematical technique used to model the relationship between multiple independent predictor variables and a single dependent outcome variable. It is used in medical research to model observational data, as well as in diagnostic and therapeutic studies in which the outcome is dependent on more than one factor. Although the technique generally is limited to data that can be expressed with a linear function, it benefits from a well-developed mathematical framework that yields unique solutions and exact confidence intervals for regression coefficients. Building on Part I of this series, this article acquaints the reader with some of the important concepts in multiple regression analysis. These include multicollinearity, interaction effects, and an expansion of the discussion of inference testing, leverage, and variable transformations to multivariate models. Examples from the first article in this series are expanded on using a primarily graphic, rather than mathematical, approach. The importance of the relationships among the predictor variables and the dependence of the multivariate model coefficients on the choice of these variables are stressed. Finally, concepts in regression model building are discussed.

  15. Boosted beta regression.

    Directory of Open Access Journals (Sweden)

    Matthias Schmid

    Full Text Available Regression analysis with a bounded outcome is a common problem in applied statistics. Typical examples include regression models for percentage outcomes and the analysis of ratings that are measured on a bounded scale. In this paper, we consider beta regression, which is a generalization of logit models to situations where the response is continuous on the interval (0,1). Consequently, beta regression is a convenient tool for analyzing percentage responses. The classical approach to fit a beta regression model is to use maximum likelihood estimation with subsequent AIC-based variable selection. As an alternative to this established - yet unstable - approach, we propose a new estimation technique called boosted beta regression. With boosted beta regression estimation and variable selection can be carried out simultaneously in a highly efficient way. Additionally, both the mean and the variance of a percentage response can be modeled using flexible nonlinear covariate effects. As a consequence, the new method accounts for common problems such as overdispersion and non-binomial variance structures.

  16. Accuracy of Bayes and Logistic Regression Subscale Probabilities for Educational and Certification Tests

    Science.gov (United States)

    Rudner, Lawrence

    2016-01-01

    In the machine learning literature, it is commonly accepted as fact that as calibration sample sizes increase, Naïve Bayes classifiers initially outperform Logistic Regression classifiers in terms of classification accuracy. Applied to subtests from an on-line final examination and from a highly regarded certification examination, this study shows…

  17. Dinosaur incubation periods directly determined from growth-line counts in embryonic teeth show reptilian-grade development.

    Science.gov (United States)

    Erickson, Gregory M; Zelenitsky, Darla K; Kay, David Ian; Norell, Mark A

    2017-01-17

    Birds stand out from other egg-laying amniotes by producing relatively small numbers of large eggs with very short incubation periods (average 11-85 d). This aspect promotes high survivorship by limiting exposure to predation and environmental perturbation, allows for larger more fit young, and facilitates rapid attainment of adult size. Birds are living dinosaurs; their rapid development has been considered to reflect the primitive dinosaurian condition. Here, nonavian dinosaurian incubation periods in both small and large ornithischian taxa are empirically determined through growth-line counts in embryonic teeth. Our results show unexpectedly slow incubation (2.8 and 5.8 mo) like those of outgroup reptiles. Developmental and physiological constraints would have rendered tooth formation and incubation inherently slow in other dinosaur lineages and basal birds. The capacity to determine incubation periods in extinct egg-laying amniotes has implications for dinosaurian embryology, life history strategies, and survivorship across the Cretaceous-Paleogene mass extinction event.

  18. Chicken lines divergently selected for antibody responses to sheep red blood cells show line-specific differences in sensitivity to immunomodulation by diet. Part I: Humoral parameters

    NARCIS (Netherlands)

    Adriaansen-Tennekes, R.; Vries Reilingh, de G.; Nieuwland, M.G.B.; Parmentier, H.K.; Savelkoul, H.F.J.

    2009-01-01

    Individual differences in nutrient sensitivity have been suggested to be related with differences in stress sensitivity. Here we used layer hens divergently selected for high and low specific antibody responses to SRBC (i.e., low line hens and high line hens), reflecting a genetically based

  19. Nonparametric Mixture of Regression Models.

    Science.gov (United States)

    Huang, Mian; Li, Runze; Wang, Shaoli

    2013-07-01

    Motivated by an analysis of US house price index data, we propose nonparametric finite mixture of regression models. We study the identifiability issue of the proposed models, and develop an estimation procedure by employing kernel regression. We further systematically study the sampling properties of the proposed estimators, and establish their asymptotic normality. A modified EM algorithm is proposed to carry out the estimation procedure. We show that our algorithm preserves the ascent property of the EM algorithm in an asymptotic sense. Monte Carlo simulations are conducted to examine the finite sample performance of the proposed estimation procedure. An empirical analysis of the US house price index data is illustrated for the proposed methodology.

  20. Least median of squares and iteratively re-weighted least squares as robust linear regression methods for fluorimetric determination of α-lipoic acid in capsules in ideal and non-ideal cases of linearity.

    Science.gov (United States)

    Korany, Mohamed A; Gazy, Azza A; Khamis, Essam F; Ragab, Marwa A A; Kamal, Miranda F

    2018-03-26

    This study outlines two robust regression approaches, namely least median of squares (LMS) and iteratively re-weighted least squares (IRLS), to investigate their application in instrumental analysis of nutraceuticals (that is, fluorescence quenching of merbromin reagent upon lipoic acid addition). These robust regression methods were used to calculate calibration data from the fluorescence quenching reaction (∆F and F-ratio) under ideal or non-ideal linearity conditions. For each condition, data were treated using three regression fittings: Ordinary Least Squares (OLS), LMS and IRLS. Assessment of linearity, limits of detection (LOD) and quantitation (LOQ), accuracy and precision were carefully studied for each condition. LMS and IRLS regression line fittings showed significant improvement in correlation coefficients and all regression parameters for both methods and both conditions. In the ideal linearity condition, the intercept and slope changed insignificantly, but a dramatic change was observed for the non-ideal condition and its linearity intercept. Under both linearity conditions, LOD and LOQ values after the robust regression line fitting of data were lower than those obtained before data treatment. The results obtained after statistical treatment indicated that the linearity ranges for drug determination could be expanded to lower limits of quantitation by enhancing the regression equation parameters after data treatment. Analysis results for lipoic acid in capsules, using both fluorimetric methods, treated by parametric OLS and after treatment by robust LMS and IRLS were compared for both linearity conditions. Copyright © 2018 John Wiley & Sons, Ltd.
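
    A Python sketch of the two robust line fits named above on a small calibration-style data set with one gross outlier: IRLS via statsmodels' robust linear model with Huber weights, and a brute-force least-median-of-squares search over point pairs. The numbers are invented, not the fluorimetric calibration data of the record.

        # OLS vs. IRLS (Huber) vs. brute-force LMS on data with an outlier.
        import itertools
        import numpy as np
        import statsmodels.api as sm

        conc = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])            # analyte concentration
        signal = np.array([0.11, 0.20, 0.31, 0.39, 0.80, 0.61])    # one gross outlier at 5.0

        X = sm.add_constant(conc)
        ols = sm.OLS(signal, X).fit()
        irls = sm.RLM(signal, X, M=sm.robust.norms.HuberT()).fit()

        def lms_line(x, y):
            """Exhaustive LMS for a straight line: pick the 2-point line whose
            squared residuals have the smallest median."""
            best = None
            for i, j in itertools.combinations(range(len(x)), 2):
                slope = (y[j] - y[i]) / (x[j] - x[i])
                intercept = y[i] - slope * x[i]
                med = np.median((y - (intercept + slope * x)) ** 2)
                if best is None or med < best[0]:
                    best = (med, intercept, slope)
            return best[1], best[2]

        lms_intercept, lms_slope = lms_line(conc, signal)
        print("OLS slope :", round(ols.params[1], 3))
        print("IRLS slope:", round(irls.params[1], 3))
        print("LMS slope :", round(lms_slope, 3))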

  1. Adaptation of bread-wheat lines across different environment of Pakistan

    International Nuclear Information System (INIS)

    Mujhid, M.Y.; Ahmad, Z.; Khan, M.A.; Qamar, M.; Kisana, N.S.; Asif, M.

    2009-01-01

    Ten advanced wheat lines developed by the National Agricultural Research Centre (NARC), Islamabad, were evaluated for stability of grain yield over five locations. The experiment was conducted during 2006-07 at NARC, Islamabad; AARI, Faisalabad; RARI, Bahawalpur; CCRI, Pirsabak; and NIFA, Peshawar, following a randomized complete block design with three replications. At maturity, grain yield was taken from a standard plot and the data were analyzed statistically. Genotype × location interactions were found highly significant. The predictable (linear) portion of variation was important, but the non-linear component was non-significant. None of the regression coefficients differed significantly from unity. Hence, deviation from regression and average grain yield were used to identify superior genotypes. Above-average grain yields were observed in five genotypes. V4 and V8 were stable across environments with low deviation from regression and gave above-average yield. The study provides valuable information for selecting advanced wheat lines across different locations of the country, to be considered as potential breeding material for release as varieties. (author)

  2. Similarity law for Widom lines and coexistence lines.

    Science.gov (United States)

    Banuti, D T; Raju, M; Ihme, M

    2017-05-01

    The coexistence line of a fluid separates liquid and gaseous states at subcritical pressures, ending at the critical point. Only recently, it became clear that the supercritical state space can likewise be divided into regions with liquidlike and gaslike properties, separated by an extension to the coexistence line. This crossover line is commonly referred to as the Widom line, and is characterized by large changes in density or enthalpy, manifesting as maxima in the thermodynamic response functions. Thus, a reliable representation of the coexistence line and the Widom line is important for sub- and supercritical applications that depend on an accurate prediction of fluid properties. While it is known for subcritical pressures that nondimensionalization with the respective species critical pressures p_{cr} and temperatures T_{cr} only collapses coexistence line data for simple fluids, this approach is used for Widom lines of all fluids. However, we show here that the Widom line does not adhere to the corresponding states principle, but instead to the extended corresponding states principle. We resolve this problem in two steps. First, we propose a Widom line functional based on the Clapeyron equation and derive an analytical, species specific expression for the only parameter from the Soave-Redlich-Kwong equation of state. This parameter is a function of the acentric factor ω and compares well with experimental data. Second, we introduce the scaled reduced pressure p_{r}^{*} to replace the previously used reduced pressure p_{r}=p/p_{cr}. We show that p_{r}^{*} is a function of the acentric factor only and can thus be readily determined from fluid property tables. It collapses both subcritical coexistence line and supercritical Widom line data over a wide range of species with acentric factors ranging from -0.38 (helium) to 0.34 (water), including alkanes up to n-hexane. By using p_{r}^{*}, the extended corresponding states principle can be applied within

  3. Shorter lines facilitate reading in those who struggle.

    Directory of Open Access Journals (Sweden)

    Matthew H Schneps

    Full Text Available People with dyslexia, who ordinarily struggle to read, sometimes remark that reading is easier when e-readers are used. Here, we used eye tracking to observe high school students with dyslexia as they read using these devices. Among the factors investigated, we found that reading using a small device resulted in substantial benefits, improving reading speeds by 27%, reducing the number of fixations by 11%, and importantly, reducing the number of regressive saccades by more than a factor of 2, with no cost to comprehension. Given that an expected trade-off between horizontal and vertical regression was not observed when line lengths were altered, we speculate that these effects occur because sluggish attention spreads perception to the left as the gaze shifts during reading. Short lines eliminate crowded text to the left, reducing regression. The effects of attention modulation by the hand, and of increased letter spacing to reduce crowding, were also found to modulate the oculomotor dynamics in reading, but whether these factors resulted in benefits or costs depended on characteristics, such as visual attention span, that varied within our sample.

  4. The Prediction and Prevention of Dysphagia After Occipitospinal Fusion by Use of the S-line (Swallowing Line).

    Science.gov (United States)

    Kaneyama, Shuichi; Sumi, Masatoshi; Takabatake, Masato; Kasahara, Koichi; Kanemura, Aritetsu; Hirata, Hiroaki; Darden, Bruce V

    2017-05-15

    Clinical case series and risk factor analysis of dysphagia after occipitospinal fusion (OSF). The aim of this study was to develop new criteria to avoid postoperative dysphagia by analyzing the relationship among the craniocervical alignment, the oropharyngeal space, and the incidence of dysphagia after OSF. Craniocervical malalignment after OSF is considered to be one of the primary triggers of postoperative dysphagia; however, the ideal craniocervical alignment has not been confirmed. Thirty-eight patients were included. We measured the O-C2 angle (O-C2A) and the pharyngeal inlet angle (PIA) on the lateral cervical radiogram at follow-up. PIA is defined as the angle between McGregor's line and the line that links the center of the C1 anterior arch and the apex of cervical sagittal curvature. The impact of these two parameters on the diameter of the pharyngeal airway space (PAS) and the incidence of dysphagia was analyzed. Six of 38 cases (15.8%) exhibited dysphagia. A multiple regression analysis showed that PIA was significantly correlated with PAS (β = 0.714, P = 0.005). Receiver-operating characteristic curves showed that PIA had high accuracy as a predictor of dysphagia, with an AUC (area under the curve) of 0.90. Cases with a PIA of less than 90 degrees showed a significantly higher incidence of dysphagia (31.6%) than those with a PIA of 90 degrees or more (0.0%) (P = 0.008). Our results indicated that PIA is a useful predictor of postoperative dysphagia after OSF, with a PIA of less than 90 degrees associated with its occurrence. Level of evidence: 4.

  5. A flexible fuzzy regression algorithm for forecasting oil consumption estimation

    International Nuclear Information System (INIS)

    Azadeh, A.; Khakestani, M.; Saberi, M.

    2009-01-01

    Oil consumption plays a vital role in socio-economic development of most countries. This study presents a flexible fuzzy regression algorithm for forecasting oil consumption based on standard economic indicators. The standard indicators are annual population, cost of crude oil import, gross domestic production (GDP) and annual oil production in the last period. The proposed algorithm uses analysis of variance (ANOVA) to select either fuzzy regression or conventional regression for future demand estimation. The significance of the proposed algorithm is three fold. First, it is flexible and identifies the best model based on the results of ANOVA and minimum absolute percentage error (MAPE), whereas previous studies consider the best fitted fuzzy regression model based on MAPE or other relative error results. Second, the proposed model may identify conventional regression as the best model for future oil consumption forecasting because of its dynamic structure, whereas previous studies assume that fuzzy regression always provide the best solutions and estimation. Third, it utilizes the most standard independent variables for the regression models. To show the applicability and superiority of the proposed flexible fuzzy regression algorithm the data for oil consumption in Canada, United States, Japan and Australia from 1990 to 2005 are used. The results show that the flexible algorithm provides accurate solution for oil consumption estimation problem. The algorithm may be used by policy makers to accurately foresee the behavior of oil consumption in various regions.

  6. A naturally derived gastric cancer cell line shows latency I Epstein-Barr virus infection closely resembling EBV-associated gastric cancer

    International Nuclear Information System (INIS)

    Oh, Sang Taek; Seo, Jung Seon; Moon, Uk Yeol; Kang, Kyeong Hee; Shin, Dong-Jik; Yoon, Sungjoo Kim; Kim, Woo Ho; Park, Jae-Gahb; Lee, Suk Kyeong

    2004-01-01

    In a process seeking out a good model cell line for Epstein-Barr virus (EBV)-associated gastric cancer, we found that one previously established gastric adenocarcinoma cell line is infected with type 1 EBV. This SNU-719 cell line from a Korean patient expressed cytokeratin without CD19 or CD21 expression. In SNU-719, EBNA1 and LMP2A were expressed, while LMP1 and EBNA2 were not. None of the tested lytic EBV proteins were detected in this cell line unless stimulated with phorbol ester. EBV infection was also shown in the original carcinoma tissue of SNU-719 cell line. Our results support the possibility of a CD21-independent EBV infection of gastric epithelial cells in vivo. As the latent EBV gene expression pattern of SNU-719 closely resembles that of the EBV-associated gastric cancer, this naturally derived cell line may serve as a valuable model system to clarify the precise role of EBV in gastric carcinogenesis

  7. Response of Barley Double Haploid Lines to the Grain Yield and Morphological Traits under Water Deficit Stress Conditions

    Directory of Open Access Journals (Sweden)

    Maroof Khalily

    2017-04-01

    Full Text Available To study the relationships of grain yield and some agro-morphological traits, 40 doubled haploid (DH) lines along with parental and three check genotypes were evaluated in a randomized complete block design with two replications under two water regimes (normal and stress) during the 2011-2012 and 2012-2013 growing seasons. Combined analysis of variance showed significant differences for all the traits in terms of year, water regime, line, and line × year. Comparison of group means between non-stress and stress conditions showed that the DH lines had the lowest reduction percentage for the number of grains per spike, thousand grain weight, grain yield and biological yield as opposed to the check genotypes. The correlations of grain yield with biological yield, harvest index, thousand grain weight, and hectoliter kernel weight were highly significant and positive under both conditions. Based on stepwise regression, peduncle length, number of seeds per spike, thousand seed weight, and hectoliter kernel weight had important effects on increasing seed yield. The result of path analysis showed that these traits had the highest direct effects on grain yield. Based on mean comparisons of morphological characters as well as the STI and GMP indices, it can be concluded that lines No. 11, 13, 14, 24, 29, 30, 35 and 39 were desirable lines for grain yield and related traits and were also tolerant in terms of response to drought stress conditions.

  8. Photographic but not line-drawn faces show early perceptual neural sensitivity to eye gaze direction

    Directory of Open Access Journals (Sweden)

    Alejandra eRossi

    2015-04-01

    Full Text Available Our brains readily decode facial movements and changes in social attention, reflected in earlier and larger N170 event-related potentials (ERPs) to viewing gaze aversions vs. direct gaze in real faces (Puce et al. 2000). In contrast, gaze aversions in line-drawn faces do not produce these N170 differences (Rossi et al., 2014), suggesting that physical stimulus properties or experimental context may drive these effects. Here we investigated the role of stimulus-induced context on neurophysiological responses to dynamic gaze. Sixteen healthy adults viewed line-drawn and real faces, with dynamic eye aversion and direct gaze transitions, and control stimuli (scrambled arrays and checkerboards) while continuous electroencephalographic (EEG) activity was recorded. EEG data from 2 temporo-occipital clusters of 9 electrodes in each hemisphere, where N170 activity is known to be maximal, were selected for analysis. N170 peak amplitude and latency, and temporal dynamics from event-related spectral perturbations (ERSPs), were measured in 16 healthy subjects. Real faces generated larger N170s for averted vs. direct gaze motion; however, N170s to real and direct gaze were as large as those to respective controls. N170 amplitude did not differ across line-drawn gaze changes. Overall, bilateral mean gamma power changes for faces relative to control stimuli occurred between 150-350 ms, potentially reflecting signal detection of facial motion. Our data indicate that experimental context does not drive N170 differences to viewed gaze changes. Low-level stimulus properties, such as the high sclera/iris contrast change in real eyes, likely drive the N170 changes to viewed aversive movements.

  9. FBH1 Catalyzes Regression of Stalled Replication Forks

    Directory of Open Access Journals (Sweden)

    Kasper Fugger

    2015-03-01

    Full Text Available DNA replication fork perturbation is a major challenge to the maintenance of genome integrity. It has been suggested that processing of stalled forks might involve fork regression, in which the fork reverses and the two nascent DNA strands anneal. Here, we show that FBH1 catalyzes regression of a model replication fork in vitro and promotes fork regression in vivo in response to replication perturbation. Cells respond to fork stalling by activating checkpoint responses requiring signaling through stress-activated protein kinases. Importantly, we show that FBH1, through its helicase activity, is required for early phosphorylation of ATM substrates such as CHK2 and CtIP as well as hyperphosphorylation of RPA. These phosphorylations occur prior to apparent DNA double-strand break formation. Furthermore, FBH1-dependent signaling promotes checkpoint control and preserves genome integrity. We propose a model whereby FBH1 promotes early checkpoint signaling by remodeling of stalled DNA replication forks.

  10. Polylinear regression analysis in radiochemistry

    International Nuclear Information System (INIS)

    Kopyrin, A.A.; Terent'eva, T.N.; Khramov, N.N.

    1995-01-01

    A number of radiochemical problems have been formulated in the framework of polylinear regression analysis, which permits the use of conventional mathematical methods for their solution. The authors have considered features of the use of polylinear regression analysis for estimating the contributions of various sources to the atmospheric pollution, for studying irradiated nuclear fuel, for estimating concentrations from spectral data, for measuring neutron fields of a nuclear reactor, for estimating crystal lattice parameters from X-ray diffraction patterns, for interpreting data of X-ray fluorescence analysis, for estimating complex formation constants, and for analyzing results of radiometric measurements. The problem of estimating the target parameters can be ill-posed for certain properties of the system under study. The authors showed the possibility of regularization by adding a fictitious set of data "obtained" from the orthogonal design. To estimate only a part of the parameters under consideration, the authors used incomplete rank models. In this case, it is necessary to take into account the possibility of confounding estimates. An algorithm for evaluating the degree of confounding is presented, which is realized using standard software for regression analysis.

  11. Rescuing Alu: recovery of new inserts shows LINE-1 preserves Alu activity through A-tail expansion.

    Directory of Open Access Journals (Sweden)

    Bradley J Wagstaff

    Full Text Available Alu elements are trans-mobilized by the autonomous non-LTR retroelement, LINE-1 (L1). Alu-induced insertion mutagenesis contributes to about 0.1% of human genetic disease and is responsible for the majority of the documented instances of human retroelement insertion-induced disease. Here we introduce a SINE recovery method that provides a complementary approach for comprehensive analysis of the impact and biological mechanisms of Alu retrotransposition. Using this approach, we recovered 226 de novo tagged Alu inserts in HeLa cells. Our analysis reveals that in human cells marked Alu inserts driven by either exogenously supplied full-length L1 or ORF2 protein are indistinguishable. Four percent of de novo Alu inserts were associated with genomic deletions and rearrangements and lacked the hallmarks of retrotransposition. In contrast to L1 inserts, 5' truncations of Alu inserts are rare, as most of the recovered inserts (96.5%) are full length. De novo Alus show a random pattern of insertion across chromosomes, but further characterization revealed that an Alu insertion bias exists favoring insertion near other SINEs and highly conserved elements, with almost 60% landing within genes. De novo Alu inserts show no evidence of RNA editing. Priming for reverse transcription rarely occurred within the first 20 bp (most 5') of the A-tail. The A-tails of recovered inserts show significant expansion, with many at least doubling in length. Sequence manipulation of the construct led to the demonstration that the A-tail expansion likely occurs during insertion due to slippage by the L1 ORF2 protein. We postulate that the A-tail expansion directly impacts Alu evolution by reintroducing new active source elements to counteract the natural loss of active Alus and minimizing Alu extinction.

  12. Time-adaptive quantile regression

    DEFF Research Database (Denmark)

    Møller, Jan Kloppenborg; Nielsen, Henrik Aalborg; Madsen, Henrik

    2008-01-01

    An algorithm for time-adaptive quantile regression is presented. The algorithm is based on the simplex algorithm, and the linear optimization formulation of the quantile regression problem is given. The observations have been split to allow a direct use of the simplex algorithm. The simplex method and an updating procedure are combined into a new algorithm for time-adaptive quantile regression, which generates new solutions on the basis of the old solution, leading to savings in computation time. The suggested algorithm is tested against a static quantile regression model on a data set with wind power production, where the models combine splines and quantile regression. The comparison indicates superior performance for the time-adaptive quantile regression in all the performance parameters considered.

  13. Regression calibration with more surrogates than mismeasured variables

    KAUST Repository

    Kipnis, Victor

    2012-06-29

    In a recent paper (Weller EA, Milton DK, Eisen EA, Spiegelman D. Regression calibration for logistic regression with multiple surrogates for one exposure. Journal of Statistical Planning and Inference 2007; 137: 449-461), the authors discussed fitting logistic regression models when a scalar main explanatory variable is measured with error by several surrogates, that is, a situation with more surrogates than variables measured with error. They compared two methods of adjusting for measurement error using a regression calibration approximate model as if it were exact. One is the standard regression calibration approach consisting of substituting an estimated conditional expectation of the true covariate given observed data in the logistic regression. The other is a novel two-stage approach when the logistic regression is fitted to multiple surrogates, and then a linear combination of estimated slopes is formed as the estimate of interest. Applying estimated asymptotic variances for both methods in a single data set with some sensitivity analysis, the authors asserted superiority of their two-stage approach. We investigate this claim in some detail. A troubling aspect of the proposed two-stage method is that, unlike standard regression calibration and a natural form of maximum likelihood, the resulting estimates are not invariant to reparameterization of nuisance parameters in the model. We show, however, that, under the regression calibration approximation, the two-stage method is asymptotically equivalent to a maximum likelihood formulation, and is therefore in theory superior to standard regression calibration. However, our extensive finite-sample simulations in the practically important parameter space where the regression calibration model provides a good approximation failed to uncover such superiority of the two-stage method. We also discuss extensions to different data structures.

  14. Regression calibration with more surrogates than mismeasured variables

    KAUST Repository

    Kipnis, Victor; Midthune, Douglas; Freedman, Laurence S.; Carroll, Raymond J.

    2012-01-01

    In a recent paper (Weller EA, Milton DK, Eisen EA, Spiegelman D. Regression calibration for logistic regression with multiple surrogates for one exposure. Journal of Statistical Planning and Inference 2007; 137: 449-461), the authors discussed fitting logistic regression models when a scalar main explanatory variable is measured with error by several surrogates, that is, a situation with more surrogates than variables measured with error. They compared two methods of adjusting for measurement error using a regression calibration approximate model as if it were exact. One is the standard regression calibration approach consisting of substituting an estimated conditional expectation of the true covariate given observed data in the logistic regression. The other is a novel two-stage approach when the logistic regression is fitted to multiple surrogates, and then a linear combination of estimated slopes is formed as the estimate of interest. Applying estimated asymptotic variances for both methods in a single data set with some sensitivity analysis, the authors asserted superiority of their two-stage approach. We investigate this claim in some detail. A troubling aspect of the proposed two-stage method is that, unlike standard regression calibration and a natural form of maximum likelihood, the resulting estimates are not invariant to reparameterization of nuisance parameters in the model. We show, however, that, under the regression calibration approximation, the two-stage method is asymptotically equivalent to a maximum likelihood formulation, and is therefore in theory superior to standard regression calibration. However, our extensive finite-sample simulations in the practically important parameter space where the regression calibration model provides a good approximation failed to uncover such superiority of the two-stage method. We also discuss extensions to different data structures.

  15. Statistical analysis of sediment toxicity by additive monotone regression splines

    NARCIS (Netherlands)

    Boer, de W.J.; Besten, den P.J.; Braak, ter C.J.F.

    2002-01-01

    Modeling nonlinearity and thresholds in dose-effect relations is a major challenge, particularly in noisy data sets. Here we show the utility of nonlinear regression with additive monotone regression splines. These splines lead almost automatically to the estimation of thresholds. We applied this

  16. Analysis of some methods for reduced rank Gaussian process regression

    DEFF Research Database (Denmark)

    Quinonero-Candela, J.; Rasmussen, Carl Edward

    2005-01-01

    While there is strong motivation for using Gaussian Processes (GPs) due to their excellent performance in regression and classification problems, their computational complexity makes them impractical when the size of the training set exceeds a few thousand cases. This has motivated the recent proliferation of a number of cost-effective approximations to GPs, both for classification and for regression. In this paper we analyze one popular approximation to GPs for regression: the reduced rank approximation. While generally GPs are equivalent to infinite linear models, we show that Reduced Rank Gaussian Processes (RRGPs) are equivalent to finite sparse linear models. We also introduce the concept of degenerate GPs and show that they correspond to inappropriate priors. We show how to modify the RRGP to prevent it from being degenerate at test time. Training RRGPs consists both in learning...

  17. Controlling attribute effect in linear regression

    KAUST Repository

    Calders, Toon; Karim, Asim A.; Kamiran, Faisal; Ali, Wasif Mohammad; Zhang, Xiangliang

    2013-01-01

    In data mining we often have to learn from biased data, because, for instance, data comes from different batches or there was a gender or racial bias in the collection of social data. In some applications it may be necessary to explicitly control this bias in the models we learn from the data. This paper is the first to study learning linear regression models under constraints that control the biasing effect of a given attribute such as gender or batch number. We show how propensity modeling can be used for factoring out the part of the bias that can be justified by externally provided explanatory attributes. Then we analytically derive linear models that minimize squared error while controlling the bias by imposing constraints on the mean outcome or residuals of the models. Experiments with discrimination-aware crime prediction and batch effect normalization tasks show that the proposed techniques are successful in controlling attribute effects in linear regression models. © 2013 IEEE.

  18. Controlling attribute effect in linear regression

    KAUST Repository

    Calders, Toon

    2013-12-01

    In data mining we often have to learn from biased data, because, for instance, data comes from different batches or there was a gender or racial bias in the collection of social data. In some applications it may be necessary to explicitly control this bias in the models we learn from the data. This paper is the first to study learning linear regression models under constraints that control the biasing effect of a given attribute such as gender or batch number. We show how propensity modeling can be used for factoring out the part of the bias that can be justified by externally provided explanatory attributes. Then we analytically derive linear models that minimize squared error while controlling the bias by imposing constraints on the mean outcome or residuals of the models. Experiments with discrimination-aware crime prediction and batch effect normalization tasks show that the proposed techniques are successful in controlling attribute effects in linear regression models. © 2013 IEEE.

  19. The Use of Nonparametric Kernel Regression Methods in Econometric Production Analysis

    DEFF Research Database (Denmark)

    Czekaj, Tomasz Gerard

    This PhD thesis addresses one of the fundamental problems in applied econometric analysis, namely the econometric estimation of regression functions. The conventional approach to regression analysis is the parametric approach, which requires the researcher to specify the form of the regression function... and nonparametric estimations of production functions in order to evaluate the optimal firm size. The second paper discusses the use of parametric and nonparametric regression methods to estimate panel data regression models. The third paper analyses production risk, price uncertainty, and farmers' risk preferences within a nonparametric panel data regression framework. The fourth paper analyses the technical efficiency of dairy farms with environmental output using nonparametric kernel regression in a semiparametric stochastic frontier analysis. The results provided in this PhD thesis show that nonparametric...

  20. Regression analysis by example

    CERN Document Server

    Chatterjee, Samprit

    2012-01-01

    Praise for the Fourth Edition: ""This book is . . . an excellent source of examples for regression analysis. It has been and still is readily readable and understandable."" -Journal of the American Statistical Association Regression analysis is a conceptually simple method for investigating relationships among variables. Carrying out a successful application of regression analysis, however, requires a balance of theoretical results, empirical rules, and subjective judgment. Regression Analysis by Example, Fifth Edition has been expanded

  1. Applied logistic regression

    CERN Document Server

    Hosmer, David W; Sturdivant, Rodney X

    2013-01-01

     A new edition of the definitive guide to logistic regression modeling for health science and other applications This thoroughly expanded Third Edition provides an easily accessible introduction to the logistic regression (LR) model and highlights the power of this model by examining the relationship between a dichotomous outcome and a set of covariables. Applied Logistic Regression, Third Edition emphasizes applications in the health sciences and handpicks topics that best suit the use of modern statistical software. The book provides readers with state-of-

  2. Modeling Fire Occurrence at the City Scale: A Comparison between Geographically Weighted Regression and Global Linear Regression.

    Science.gov (United States)

    Song, Chao; Kwan, Mei-Po; Zhu, Jiping

    2017-04-08

    An increasing number of fires are occurring with the rapid development of cities, resulting in increased risk for human beings and the environment. This study compares geographically weighted regression-based models, including geographically weighted regression (GWR) and geographically and temporally weighted regression (GTWR), which integrates spatial and temporal effects and global linear regression models (LM) for modeling fire risk at the city scale. The results show that the road density and the spatial distribution of enterprises have the strongest influences on fire risk, which implies that we should focus on areas where roads and enterprises are densely clustered. In addition, locations with a large number of enterprises have fewer fire ignition records, probably because of strict management and prevention measures. A changing number of significant variables across space indicate that heterogeneity mainly exists in the northern and eastern rural and suburban areas of Hefei city, where human-related facilities or road construction are only clustered in the city sub-centers. GTWR can capture small changes in the spatiotemporal heterogeneity of the variables while GWR and LM cannot. An approach that integrates space and time enables us to better understand the dynamic changes in fire risk. Thus governments can use the results to manage fire safety at the city scale.
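
    A minimal Python sketch of the core of geographically weighted regression (GWR): a separate weighted least-squares fit at each location, with Gaussian kernel weights based on distance. The point data, bandwidth and variables are simulated for illustration, not taken from the fire-risk study.

        # Local weighted least squares at two query locations (the GWR building block).
        import numpy as np

        rng = np.random.default_rng(7)
        n = 400
        coords = rng.uniform(0, 10, size=(n, 2))                 # locations
        road_density = rng.uniform(0, 5, n)
        # spatially varying coefficient: the road effect grows from west to east
        fire_risk = 1.0 + (0.2 + 0.1 * coords[:, 0]) * road_density + rng.normal(0, 0.3, n)

        X = np.column_stack([np.ones(n), road_density])
        bandwidth = 2.0

        def gwr_coefficients(at):
            d = np.linalg.norm(coords - at, axis=1)
            w = np.exp(-(d / bandwidth) ** 2)                    # Gaussian kernel weights
            W = np.diag(w)
            return np.linalg.solve(X.T @ W @ X, X.T @ W @ fire_risk)

        west = gwr_coefficients(np.array([1.0, 5.0]))
        east = gwr_coefficients(np.array([9.0, 5.0]))
        print("local road-density coefficient, west:", round(west[1], 2))
        print("local road-density coefficient, east:", round(east[1], 2))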

  3. Normalization Ridge Regression in Practice I: Comparisons Between Ordinary Least Squares, Ridge Regression and Normalization Ridge Regression.

    Science.gov (United States)

    Bulcock, J. W.

    The problem of model estimation when the data are collinear was examined. Though the ridge regression (RR) outperforms ordinary least squares (OLS) regression in the presence of acute multicollinearity, it is not a problem free technique for reducing the variance of the estimates. It is a stochastic procedure when it should be nonstochastic and it…
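
    A short Python sketch of the comparison underlying this record, ordinary least squares versus ridge regression on collinear predictors. The data are simulated and the penalty value is arbitrary rather than chosen by any normalization rule.

        # OLS vs. ridge regression with nearly collinear predictors (simulated data).
        import numpy as np
        from sklearn.linear_model import LinearRegression, Ridge

        rng = np.random.default_rng(8)
        n = 100
        x1 = rng.normal(size=n)
        x2 = x1 + rng.normal(scale=0.01, size=n)                 # nearly collinear with x1
        X = np.column_stack([x1, x2])
        y = 3 * x1 + 3 * x2 + rng.normal(size=n)

        ols = LinearRegression().fit(X, y)
        ridge = Ridge(alpha=1.0).fit(X, y)

        print("OLS coefficients  :", np.round(ols.coef_, 2))     # typically unstable under collinearity
        print("ridge coefficients:", np.round(ridge.coef_, 2))   # shrunk and stabilized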

  4. Number Line Estimation Predicts Mathematical Skills: Difference in Grades 2 and 4

    Directory of Open Access Journals (Sweden)

    Meixia Zhu

    2017-09-01

    Full Text Available Studies have shown that number line estimation is important for learning. However, it is yet unclear if number line estimation predicts different mathematical skills in different grades after controlling for age, non-verbal cognitive ability, attention, and working memory. The purpose of this study was to examine the role of number line estimation on two mathematical skills (calculation fluency and math problem-solving in grade 2 and grade 4. One hundred and forty-eight children from Shanghai, China were assessed on measures of number line estimation, non-verbal cognitive ability (non-verbal matrices, working memory (N-back, attention (expressive attention, and mathematical skills (calculation fluency and math problem-solving. The results showed that in grade 2, number line estimation correlated significantly with calculation fluency (r = -0.27, p < 0.05 and math problem-solving (r = -0.52, p < 0.01. In grade 4, number line estimation correlated significantly with math problem-solving (r = -0.38, p < 0.01, but not with calculation fluency. Regression analyses indicated that in grade 2, number line estimation accounted for unique variance in math problem-solving (12.0% and calculation fluency (4.0% after controlling for the effects of age, non-verbal cognitive ability, attention, and working memory. In grade 4, number line estimation accounted for unique variance in math problem-solving (9.0% but not in calculation fluency. These findings suggested that number line estimation had an important role in math problem-solving for both grades 2 and 4 children and in calculation fluency for grade 2 children. We concluded that number line estimation could be a useful indicator for teachers to identify and improve children’s mathematical skills.

  5. Vector regression introduced

    Directory of Open Access Journals (Sweden)

    Mok Tik

    2014-06-01

    Full Text Available This study formulates regression of vector data that will enable statistical analysis of various geodetic phenomena such as polar motion, ocean currents, typhoon/hurricane tracking, crustal deformations, and precursory earthquake signals. The observed vector variable of an event (dependent vector variable) is expressed as a function of a number of hypothesized phenomena realized also as vector variables (independent vector variables) and/or scalar variables that are likely to impact the dependent vector variable. The proposed representation has the unique property of solving the coefficients of independent vector variables (explanatory variables) also as vectors, hence it supersedes multivariate multiple regression models, in which the unknown coefficients are scalar quantities. For the solution, complex numbers are used to represent vector information, and the method of least squares is deployed to estimate the vector model parameters after transforming the complex vector regression model into a real vector regression model through isomorphism. Various operational statistics for testing the predictive significance of the estimated vector parameter coefficients are also derived. A simple numerical example demonstrates the use of the proposed vector regression analysis in modeling typhoon paths.
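
    A Python sketch of the complex-number representation described above: 2-D vector observations encoded as complex numbers and the complex coefficients estimated by least squares. The drift-vector data are synthetic, and this direct complex least-squares fit is only a simplified stand-in for the full isomorphism-based formulation of the record.

        # Vector regression via complex numbers and complex least squares (synthetic data).
        import numpy as np

        rng = np.random.default_rng(9)
        n = 200
        # independent vector variable (e.g., a wind vector) as complex numbers
        wind = rng.normal(size=n) + 1j * rng.normal(size=n)
        scalar = rng.uniform(0, 1, n)                            # an additional scalar driver

        # dependent vector variable: rotation/scaling of the wind plus a scalar effect
        true_coef = 0.8 * np.exp(1j * np.pi / 6)                 # rotate 30 degrees, shrink
        drift = true_coef * wind + (0.3 + 0.1j) * scalar + 0.05 * (
            rng.normal(size=n) + 1j * rng.normal(size=n))

        # complex design matrix: intercept, wind, scalar
        A = np.column_stack([np.ones(n), wind, scalar]).astype(complex)
        coef, *_ = np.linalg.lstsq(A, drift, rcond=None)

        print("estimated wind coefficient:", np.round(coef[1], 3))
        print("magnitude and angle (deg) :", round(abs(coef[1]), 3),
              round(np.degrees(np.angle(coef[1])), 1))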

  6. Applied linear regression

    CERN Document Server

    Weisberg, Sanford

    2013-01-01

    Praise for the Third Edition ""...this is an excellent book which could easily be used as a course text...""-International Statistical Institute The Fourth Edition of Applied Linear Regression provides a thorough update of the basic theory and methodology of linear regression modeling. Demonstrating the practical applications of linear regression analysis techniques, the Fourth Edition uses interesting, real-world exercises and examples. Stressing central concepts such as model building, understanding parameters, assessing fit and reliability, and drawing conclusions, the new edition illus

  7. Mixed Frequency Data Sampling Regression Models: The R Package midasr

    Directory of Open Access Journals (Sweden)

    Eric Ghysels

    2016-08-01

    Full Text Available When modeling economic relationships it is increasingly common to encounter data sampled at different frequencies. We introduce the R package midasr which enables estimating regression models with variables sampled at different frequencies within a MIDAS regression framework put forward in work by Ghysels, Santa-Clara, and Valkanov (2002). In this article we define a general autoregressive MIDAS regression model with multiple variables of different frequencies and show how it can be specified using the familiar R formula interface and estimated using various optimization methods chosen by the researcher. We discuss how to check the validity of the estimated model both in terms of numerical convergence and statistical adequacy of a chosen regression specification, how to perform model selection based on an information criterion, how to assess forecasting accuracy of the MIDAS regression model and how to obtain a forecast aggregation of different MIDAS regression models. We illustrate the capabilities of the package with a simulated MIDAS regression model and give two empirical examples of application of MIDAS regression.

  8. Evaluation of in-line Raman data for end-point determination of a coating process: Comparison of Science-Based Calibration, PLS-regression and univariate data analysis.

    Science.gov (United States)

    Barimani, Shirin; Kleinebudde, Peter

    2017-10-01

    A multivariate analysis method, Science-Based Calibration (SBC), was used for the first time for endpoint determination of a tablet coating process using Raman data. Two types of tablet cores, placebo and caffeine cores, received a coating suspension comprising a polyvinyl alcohol-polyethylene glycol graft-copolymer and titanium dioxide to a maximum coating thickness of 80 µm. Raman spectroscopy was used as an in-line PAT tool. The spectra were acquired every minute and correlated to the amount of applied aqueous coating suspension. SBC was compared to another well-known multivariate analysis method, Partial Least Squares-regression (PLS), and a simpler approach, Univariate Data Analysis (UVDA). All developed calibration models had coefficient of determination values (R²) higher than 0.99. The coating endpoints could be predicted with root mean square errors (RMSEP) less than 3.1% of the applied coating suspensions. Compared to PLS and UVDA, SBC proved to be an alternative multivariate calibration method with high predictive power. Copyright © 2017 Elsevier B.V. All rights reserved.
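
    A hedged sketch of the PLS branch of this comparison (SBC itself is not implemented here): calibrate a PLS model on simulated spectra against the applied coating amount and report an RMSEP on held-out spectra. All dimensions and peak shapes are invented stand-ins for the Raman data.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(7)
n_spectra, n_channels = 120, 500

# Hypothetical spectra: one Gaussian band whose height grows with the applied coating amount.
coating = np.linspace(0, 100, n_spectra)                 # applied suspension, arbitrary units
band = np.exp(-0.5 * ((np.arange(n_channels) - 250) / 10) ** 2)
spectra = coating[:, None] * band[None, :] + rng.normal(scale=0.5, size=(n_spectra, n_channels))

train = rng.random(n_spectra) < 0.7                      # random calibration/validation split
pls = PLSRegression(n_components=3)
pls.fit(spectra[train], coating[train])

pred = pls.predict(spectra[~train]).ravel()
rmsep = np.sqrt(mean_squared_error(coating[~train], pred))
print(f"RMSEP = {rmsep:.2f} (same units as the applied coating amount)")
```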

  9. Comparison of partial least squares and lasso regression techniques as applied to laser-induced breakdown spectroscopy of geological samples

    International Nuclear Information System (INIS)

    Dyar, M.D.; Carmosino, M.L.; Breves, E.A.; Ozanne, M.V.; Clegg, S.M.; Wiens, R.C.

    2012-01-01

    response variables as possible while avoiding multicollinearity between principal components. When the selected number of principal components is projected back into the original feature space of the spectra, 6144 correlation coefficients are generated, a small fraction of which are mathematically significant to the regression. In contrast, the lasso models require only a small number (< 24) of non-zero correlation coefficients (β values) to determine the concentration of each of the ten major elements. Causality between the positively-correlated emission lines chosen by the lasso and the elemental concentration was examined. In general, the higher the lasso coefficient (β), the greater the likelihood that the selected line results from an emission of that element. Emission lines with negative β values should arise from elements that are anti-correlated with the element being predicted. For elements except Fe, Al, Ti, and P, the lasso-selected wavelength with the highest β value corresponds to the element being predicted, e.g. 559.8 nm for neutral Ca. However, the specific lines chosen by the lasso with positive β values are not always those from the element being predicted. Other wavelengths and the elements that most strongly correlate with them to predict concentration are obviously related to known geochemical correlations or close overlap of emission lines, while others must result from matrix effects. Use of the lasso technique thus directly informs our understanding of the underlying physical processes that give rise to LIBS emissions by determining which lines can best represent concentration, and which lines from other elements are causing matrix effects. - Highlights: ► Compositions of 100 rocks are predicted from LIBS with PLS-1, PLS-2, and the lasso. ► All yield comparable results in terms of accuracy, but not interpretability. ► Lasso chooses channels from known atomic emissions.
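
    A minimal illustration of why the lasso yields interpretable models: with an L1 penalty only a handful of spectral channels receive non-zero β values. The channel indices and concentrations below are simulated assumptions, not the LIBS data set used in the paper.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(3)
n_samples, n_channels = 100, 6144

# Hypothetical LIBS-like data: concentration driven by a handful of emission-line channels
# (the channel indices and coefficients are arbitrary, not real element lines).
spectra = rng.normal(size=(n_samples, n_channels))
informative = [120, 840, 3055]
concentration = spectra[:, informative] @ np.array([2.0, 1.2, -0.7]) + rng.normal(scale=0.3, size=n_samples)

lasso = Lasso(alpha=0.1)
lasso.fit(spectra, concentration)

nonzero = np.flatnonzero(lasso.coef_)
print(f"{nonzero.size} non-zero coefficients out of {n_channels} channels")
print("selected channels and beta values:", list(zip(nonzero.tolist(), np.round(lasso.coef_[nonzero], 3))))
```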

  10. Determinants of LSIL Regression in Women from a Colombian Cohort

    International Nuclear Information System (INIS)

    Molano, Monica; Gonzalez, Mauricio; Gamboa, Oscar; Ortiz, Natasha; Luna, Joaquin; Hernandez, Gustavo; Posso, Hector; Murillo, Raul; Munoz, Nubia

    2010-01-01

    Objective: To analyze the role of Human Papillomavirus (HPV) and other risk factors in the regression of cervical lesions in women from the Bogota Cohort. Methods: 200 HPV-positive women with abnormal cytology were included for regression analysis. The time of lesion regression was modeled using methods for interval censored survival time data. Median duration of total follow-up was 9 years. Results: 80 (40%) women were diagnosed with Atypical Squamous Cells of Undetermined Significance (ASCUS) or Atypical Glandular Cells of Undetermined Significance (AGUS) while 120 (60%) were diagnosed with Low Grade Squamous Intra-epithelial Lesions (LSIL). Globally, 40% of the lesions were still present at the first year of follow-up, while 1.5% were still present at the 5-year check-up. The multivariate model showed similar regression rates for lesions in women with ASCUS/AGUS and women with LSIL (HR=0.82, 95% CI 0.59-1.12). Women infected with HR HPV types and those with mixed infections had lower regression rates for lesions than did women infected with LR types (HR=0.526, 95% CI 0.33-0.84, for HR types and HR=0.378, 95% CI 0.20-0.69, for mixed infections). Furthermore, women over 30 years had a higher lesion regression rate than did women under 30 years (HR=1.53, 95% CI 1.03-2.27). The study showed that the median time for lesion regression was 9 months while the median time for HPV clearance was 12 months. Conclusions: In the studied population, the type of infection and the age of the women are critical factors for the regression of cervical lesions.

  11. Understanding poisson regression.

    Science.gov (United States)

    Hayat, Matthew J; Higgins, Melinda

    2014-04-01

    Nurse investigators often collect study data in the form of counts. Traditional methods of data analysis have historically approached analysis of count data either as if the count data were continuous and normally distributed or with dichotomization of the counts into the categories of occurred or did not occur. These outdated methods for analyzing count data have been replaced with more appropriate statistical methods that make use of the Poisson probability distribution, which is useful for analyzing count data. The purpose of this article is to provide an overview of the Poisson distribution and its use in Poisson regression. Assumption violations for the standard Poisson regression model are addressed with alternative approaches, including addition of an overdispersion parameter or negative binomial regression. An illustrative example is presented with an application from the ENSPIRE study, and regression modeling of comorbidity data is included for illustrative purposes. Copyright 2014, SLACK Incorporated.
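
    A brief sketch of the modeling choice described above, using statsmodels on simulated counts: fit a Poisson GLM, and switch to a negative binomial family when overdispersion is suspected. The data and the fixed dispersion value are illustrative assumptions, not the ENSPIRE study data.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(11)
n = 500

# Hypothetical count outcome (e.g., number of events per participant) with one predictor.
x = rng.normal(size=n)
counts = rng.poisson(np.exp(0.3 + 0.5 * x))
X = sm.add_constant(x)

poisson_fit = sm.GLM(counts, X, family=sm.families.Poisson()).fit()
print("Poisson coefficients:", np.round(poisson_fit.params, 3))

# If the data are overdispersed, a negative binomial family is a common alternative
# (the dispersion value used here is an arbitrary placeholder).
nb_fit = sm.GLM(counts, X, family=sm.families.NegativeBinomial(alpha=1.0)).fit()
print("Negative binomial coefficients:", np.round(nb_fit.params, 3))
```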

  12. Alternative Methods of Regression

    CERN Document Server

    Birkes, David

    2011-01-01

    Of related interest. Nonlinear Regression Analysis and its Applications Douglas M. Bates and Donald G. Watts "...an extraordinary presentation of concepts and methods concerning the use and analysis of nonlinear regression models...highly recommend[ed]...for anyone needing to use and/or understand issues concerning the analysis of nonlinear regression models." --Technometrics This book provides a balance between theory and practice supported by extensive displays of instructive geometrical constructs. Numerous in-depth case studies illustrate the use of nonlinear regression analysis--with all data s

  13. Introduction to regression graphics

    CERN Document Server

    Cook, R Dennis

    2009-01-01

    Covers the use of dynamic and interactive computer graphics in linear regression analysis, focusing on analytical graphics. Features new techniques like plot rotation. The authors have composed their own regression code, using Xlisp-Stat language called R-code, which is a nearly complete system for linear regression analysis and can be utilized as the main computer program in a linear regression course. The accompanying disks, for both Macintosh and Windows computers, contain the R-code and Xlisp-Stat. An Instructor's Manual presenting detailed solutions to all the problems in the book is ava

  14. Spontaneous regression of retinopathy of prematurity:incidence and predictive factors

    Directory of Open Access Journals (Sweden)

    Rui-Hong Ju

    2013-08-01

    Full Text Available AIM: To evaluate the incidence of spontaneous regression of changes in the retina and vitreous in active-stage retinopathy of prematurity (ROP) and to identify possible factors related to the regression. METHODS: This was a retrospective, hospital-based study. The study consisted of 39 premature infants with mild ROP that showed spontaneous regression (Group A) and 17 with severe ROP who had been treated before naturally involuting (Group B) from August 2008 through May 2011. Data on gender, single or multiple pregnancy, gestational age, birth weight, weight gain from birth to the sixth week of life, use of oxygen in mechanical ventilation, total duration of oxygen inhalation, surfactant given or not, need for and times of blood transfusion, 1-, 5-, and 10-min Apgar scores, presence of bacterial or fungal or combined infection, hyaline membrane disease (HMD), patent ductus arteriosus (PDA), duration of stay in the neonatal intensive care unit (NICU) and duration of ROP were recorded. RESULTS: The incidence of spontaneous regression of ROP was 86.7% for stage 1, 57.1% for stage 2 and 5.9% for stage 3. Regression was detected in 100% of eyes with changes in zone Ⅲ, in 46.2% with changes in zone Ⅱ and in 0% with changes in zone Ⅰ. The mean duration of ROP in the spontaneous regression group was 5.65±3.14 weeks, lower than that of the treated ROP group (7.34±4.33 weeks), but this difference was not statistically significant (P=0.201). GA, 1-min Apgar score, 5-min Apgar score, duration of NICU stay, postnatal age at initial screening and oxygen therapy longer than 10 days were significant predictive factors for the spontaneous regression of ROP (P<0.05). Retinal hemorrhage was the only independent predictive factor for the spontaneous regression of ROP (OR 0.030, 95%CI 0.001-0.775, P=0.035). CONCLUSION: This study showed that most stage 1 and 2 ROP and changes in zone Ⅲ regress spontaneously in the end. Retinal hemorrhage is weakly inversely associated with spontaneous regression.

  15. Splenectomy vs. rituximab as a second-line therapy in immune thrombocytopenic purpura: a single center experience.

    Science.gov (United States)

    Al Askar, Ahmed S; Shaheen, Naila A; Al Zahrani, Mohsen; Al Otaibi, Mohammed G; Al Qahtani, Bader S; Ahmed, Faris; Al Zughaibi, Mohand; Kamran, Ismat; Mendoza, May Anne; Khan, Altaf

    2018-01-01

    Immune thrombocytopenic purpura (ITP) is a common hematological disease treated primarily by corticosteroids. The aim of the present study was to compare response rates between patients who underwent splenectomy and those who received rituximab as second-line therapy. Adult patients diagnosed with ITP who did not respond to corticosteroids or relapsed during the period 1990-2014 were included in a quasi-experimental study. Categorical variables were compared using Fisher exact test. Response to treatment was compared using logistic regression. Data were analyzed using SAS V9.2. One hundred and forty-three patients with ITP were identified through medical records. Of 62 patients treated, 30 (48.38%) required second-line therapy; 19 (63%) patients received rituximab, and 11 (37%) underwent splenectomy. Platelet counts at diagnosis were not different between study groups (p = 0.062). Patients in the splenectomy group were younger (p = 0.011). Response to second-line therapy showed no significant difference between the two groups (OR 2.03, 95% CI (0.21-22.09), p = 0.549). Results did not show a statistically significant difference in platelet counts over time between treatment groups (p = 0.101). When used exclusively as a second-line therapy for steroid-refractory ITP, the response rate was not statistically different between rituximab and splenectomy. However, further large studies are needed to assess the response rates for these treatment modalities as a second-line therapy.

  16. Learning a Nonnegative Sparse Graph for Linear Regression.

    Science.gov (United States)

    Fang, Xiaozhao; Xu, Yong; Li, Xuelong; Lai, Zhihui; Wong, Wai Keung

    2015-09-01

    Previous graph-based semisupervised learning (G-SSL) methods have the following drawbacks: 1) they usually predefine the graph structure and then use it to perform label prediction, which cannot guarantee an overall optimum and 2) they only focus on the label prediction or the graph structure construction but are not competent in handling new samples. To this end, a novel nonnegative sparse graph (NNSG) learning method was first proposed. Then, both the label prediction and projection learning were integrated into linear regression. Finally, the linear regression and graph structure learning were unified within the same framework to overcome these two drawbacks. Therefore, a novel method, named learning a NNSG for linear regression was presented, in which the linear regression and graph learning were simultaneously performed to guarantee an overall optimum. In the learning process, the label information can be accurately propagated via the graph structure so that the linear regression can learn a discriminative projection to better fit sample labels and accurately classify new samples. An effective algorithm was designed to solve the corresponding optimization problem with fast convergence. Furthermore, NNSG provides a unified perceptiveness for a number of graph-based learning methods and linear regression methods. The experimental results showed that NNSG can obtain very high classification accuracy and greatly outperforms conventional G-SSL methods, especially some conventional graph construction methods.

  17. Assessing the impact of local meteorological variables on surface ozone in Hong Kong during 2000-2015 using quantile and multiple line regression models

    Science.gov (United States)

    Zhao, Wei; Fan, Shaojia; Guo, Hai; Gao, Bo; Sun, Jiaren; Chen, Laiguo

    2016-11-01

    The quantile regression (QR) method has been increasingly introduced to atmospheric environmental studies to explore the non-linear relationship between local meteorological conditions and ozone mixing ratios. In this study, we applied QR for the first time, together with multiple linear regression (MLR), to analyze the dominant meteorological parameters influencing the mean, 10th percentile, 90th percentile and 99th percentile of maximum daily 8-h average (MDA8) ozone concentrations in 2000-2015 in Hong Kong. Dominance analysis (DA) was used to assess the relative importance of meteorological variables in the regression models. Results showed that the MLR models worked better at suburban and rural sites than at urban sites, and worked better in winter than in summer. QR models performed better in summer for the 99th and 90th percentiles and better in autumn and winter for the 10th percentile; they also performed better in suburban and rural areas for the 10th percentile. The top three dominant variables associated with MDA8 ozone concentrations, which changed with season and region, were most frequently drawn from six meteorological parameters: boundary layer height, humidity, wind direction, surface solar radiation, total cloud cover and sea level pressure. Temperature rarely became a significant variable in any season, which could partly explain the peak of monthly average ozone concentrations in October in Hong Kong. We also found that the effect of solar radiation was enhanced during extreme ozone pollution episodes (i.e., the 99th percentile). Finally, meteorological effects on MDA8 ozone showed no significant changes before and after the 2010 Asian Games.
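
    The contrast between mean and upper-tail behaviour can be reproduced with statsmodels: ordinary least squares for the mean and quantile regression for a high percentile. The two meteorological stand-in variables and their coefficients below are assumptions, not the Hong Kong data, and the 90th percentile is used in place of the 99th for numerical stability.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n = 1000

# Hypothetical stand-ins for MDA8 ozone and two meteorological drivers.
df = pd.DataFrame({
    "solar": rng.uniform(0, 1, n),        # surface solar radiation, normalized
    "blh": rng.uniform(0.2, 2.0, n),      # boundary layer height, km
})
df["ozone"] = 40 + 30 * df["solar"] - 10 * df["blh"] + rng.gamma(2.0, 5.0, n)

mean_fit = smf.ols("ozone ~ solar + blh", df).fit()
tail_fit = smf.quantreg("ozone ~ solar + blh", df).fit(q=0.9)

# Comparing slopes shows whether a driver matters more in the upper tail than for the mean.
print("OLS:", np.round(mean_fit.params, 2).to_dict())
print("QR (q=0.9):", np.round(tail_fit.params, 2).to_dict())
```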

  18. Multiple Linear Regression: A Realistic Reflector.

    Science.gov (United States)

    Nutt, A. T.; Batsell, R. R.

    Examples of the use of Multiple Linear Regression (MLR) techniques are presented. This is done to show how MLR aids data processing and decision-making by providing the decision-maker with freedom in phrasing questions and by accurately reflecting the data on hand. A brief overview of the rationale underlying MLR is given, some basic definitions…

  19. Inbreeding depression in maize populations and its effects on the obtention of promising inbred lines

    Directory of Open Access Journals (Sweden)

    Deoclecio Domingos Garbuglio

    2017-10-01

    Full Text Available Inbreeding can potentially be used for the development of inbred lines containing alleles of interest, but the genetic causes that control inbreeding depression are not completely known, and there are few studies found in the literature. The present study aimed to obtain estimates of inbreeding depression for eight traits in seven tropical maize populations, analyze the effects of inbreeding over generations and environments, and predict the behavior of inbred lines in future generation S? through linear regression methods. It was found that regardless of the base population used, prediction values could vary when the model was based on only 2 generations of inbreeding due to the environmental component. The influence of the environment in this type of study could be reduced when considering 3 generations of inbreeding, allowing greater precision in predicting the phenotypes of inbred lines. The use of linear regression was effective for inbred line prediction for the different agronomic traits evaluated. The use of 3 levels of inbreeding minimizes the effects of the environmental component in inbred line prediction for grain yield. GO-S was the most promising population for inbred line extraction.

  20. Prediction of unwanted pregnancies using logistic regression, probit regression and discriminant analysis.

    Science.gov (United States)

    Ebrahimzadeh, Farzad; Hajizadeh, Ebrahim; Vahabi, Nasim; Almasian, Mohammad; Bakhteyar, Katayoon

    2015-01-01

    Unwanted pregnancy not intended by at least one of the parents has undesirable consequences for the family and the society. In the present study, three classification models were used and compared to predict unwanted pregnancies in an urban population. In this cross-sectional study, 887 pregnant mothers referring to health centers in Khorramabad, Iran, in 2012 were selected by the stratified and cluster sampling; relevant variables were measured and for prediction of unwanted pregnancy, logistic regression, discriminant analysis, and probit regression models and SPSS software version 21 were used. To compare these models, indicators such as sensitivity, specificity, the area under the ROC curve, and the percentage of correct predictions were used. The prevalence of unwanted pregnancies was 25.3%. The logistic and probit regression models indicated that parity and pregnancy spacing, contraceptive methods, household income and number of living male children were related to unwanted pregnancy. The performance of the models based on the area under the ROC curve was 0.735, 0.733, and 0.680 for logistic regression, probit regression, and linear discriminant analysis, respectively. Given the relatively high prevalence of unwanted pregnancies in Khorramabad, it seems necessary to revise family planning programs. Despite the similar accuracy of the models, if the researcher is interested in the interpretability of the results, the use of the logistic regression model is recommended.
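
    A sketch of the three-way comparison on simulated data: logistic and probit fits from statsmodels, linear discriminant analysis from scikit-learn, and the area under the ROC curve as the common yardstick. The covariates and coefficients are invented stand-ins for the survey variables.

```python
import numpy as np
import statsmodels.api as sm
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(9)
n = 887

# Hypothetical covariates standing in for parity, pregnancy spacing, income, etc.
X = rng.normal(size=(n, 4))
p = 1 / (1 + np.exp(-(-1.1 + X @ np.array([0.8, 0.5, -0.4, 0.3]))))
y = rng.binomial(1, p)
Xc = sm.add_constant(X)

logit_fit = sm.Logit(y, Xc).fit(disp=0)
probit_fit = sm.Probit(y, Xc).fit(disp=0)
lda = LinearDiscriminantAnalysis().fit(X, y)

scores = {"logistic": logit_fit.predict(Xc),
          "probit": probit_fit.predict(Xc),
          "LDA": lda.predict_proba(X)[:, 1]}
for name, s in scores.items():
    print(name, "AUC =", round(roc_auc_score(y, s), 3))
```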

  1. Selecting a Regression Saturated by Indicators

    DEFF Research Database (Denmark)

    Hendry, David F.; Johansen, Søren; Santos, Carlos

    We consider selecting a regression model, using a variant of Gets, when there are more variables than observations, in the special case that the variables are impulse dummies (indicators) for every observation. We show that the setting is unproblematic if tackled appropriately, and obtain the finite-sample distribution of estimators of the mean and variance in a simple location-scale model under the null that no impulses matter. A Monte Carlo simulation confirms the null distribution, and shows power against an alternative of interest.

  3. Tightness of M-estimators for multiple linear regression in time series

    DEFF Research Database (Denmark)

    Johansen, Søren; Nielsen, Bent

    We show tightness of a general M-estimator for multiple linear regression in time series. The positive criterion function for the M-estimator is assumed lower semi-continuous and sufficiently large for large argument: Particular cases are the Huber-skip and quantile regression. Tightness requires...

  4. Transient simulation of regression rate on thrust regulation process in hybrid rocket motor

    Directory of Open Access Journals (Sweden)

    Tian Hui

    2014-12-01

    Full Text Available The main goal of this paper is to study the characteristics of the regression rate of the solid grain during the thrust regulation process. For this purpose, an unsteady numerical model of the regression rate is established. Gas–solid coupling is considered between the solid grain surface and the combustion gas. A dynamic mesh is used to simulate the regression process of the solid fuel surface. Based on this model, numerical simulations on a H2O2/HTPB (hydroxyl-terminated polybutadiene) hybrid motor have been performed in the flow control process. The simulation results show that under a step change of the oxidizer mass flow rate, the regression rate cannot reach a stable value instantly because the flow field requires a short time period to adjust. The regression rate increases with the linear gain of oxidizer mass flow rate, and has a higher slope than the relative inlet function of oxidizer flow rate. A shorter regulation time can cause a higher regression rate during the regulation process. The results also show that transient calculation can better simulate the instantaneous regression rate in the operation process.

  5. Theileria parva antigens recognized by CD8+ T cells show varying degrees of diversity in buffalo-derived infected cell lines.

    Science.gov (United States)

    Sitt, Tatjana; Pelle, Roger; Chepkwony, Maurine; Morrison, W Ivan; Toye, Philip

    2018-05-06

    The extent of sequence diversity among the genes encoding 10 antigens (Tp1-10) known to be recognized by CD8+ T lymphocytes from cattle immune to Theileria parva was analysed. The sequences were derived from parasites in 23 buffalo-derived cell lines, three cattle-derived isolates and one cloned cell line obtained from a buffalo-derived stabilate. The results revealed substantial variation among the antigens through sequence diversity. The greatest nucleotide and amino acid diversity were observed in Tp1, Tp2 and Tp9. Tp5 and Tp7 showed the least amount of allelic diversity, and Tp5, Tp6 and Tp7 had the lowest levels of protein diversity. Tp6 was the most conserved protein; only a single non-synonymous substitution was found in all obtained sequences. The ratio of non-synonymous: synonymous substitutions varied from 0.84 (Tp1) to 0.04 (Tp6). Apart from Tp2 and Tp9, we observed no variation in the other defined CD8+ T cell epitopes (Tp4, 5, 7 and 8), indicating that epitope variation is not a universal feature of T. parva antigens. In addition to providing markers that can be used to examine the diversity in T. parva populations, the results highlight the potential for using conserved antigens to develop vaccines that provide broad protection against T. parva.

  6. Lung Adenocarcinomas and Lung Cancer Cell Lines Show Association of MMP-1 Expression With STAT3 Activation

    Directory of Open Access Journals (Sweden)

    Alexander Schütz

    2015-04-01

    Full Text Available Signal transducer and activator of transcription 3 (STAT3) is constitutively activated in the majority of lung cancers. This study aims at defining connections between STAT3 function and the malignant properties of non–small cell lung carcinoma (NSCLC) cells. To address possible mechanisms by which STAT3 influences invasiveness, the expression of matrix metalloproteinase-1 (MMP-1) was analyzed and correlated with the STAT3 activity status. Studies on both surgical biopsies and on lung cancer cell lines revealed a coincidence of STAT3 activation and strong expression of MMP-1. MMP-1 and tyrosine-phosphorylated activated STAT3 were found co-localized in cancer tissues, most pronounced in tumor fronts, and in particular in adenocarcinomas. STAT3 activity was constitutive, although to different degrees, in the lung cancer cell lines investigated. Three cell lines (BEN, KNS62, and A549) were identified in which STAT3 activation was inducible by Interleukin-6 (IL-6). In A549 cells, STAT3 activity enhanced the level of MMP-1 mRNA and stimulated transcription from the MMP-1 promoter in IL-6–stimulated A549 cells. STAT3 specificity of this effect was confirmed by STAT3 knockdown through RNA interference. Our results link aberrant activity of STAT3 in lung cancer cells to malignant tumor progression through up-regulation of expression of invasiveness-associated MMPs.

  7. Capacitance Regression Modelling Analysis on Latex from Selected Rubber Tree Clones

    International Nuclear Information System (INIS)

    Rosli, A D; Baharudin, R; Hashim, H; Khairuzzaman, N A; Mohd Sampian, A F; Abdullah, N E; Kamaru'zzaman, M; Sulaiman, M S

    2015-01-01

    This paper investigates the capacitance regression modelling performance of latex from various rubber tree clones, namely clones 2002, 2008, 2014 and 3001. Conventionally, rubber tree clones are identified by observing tree features such as leaf shape, trunk, branching habit and seed texture pattern; this approach requires expert observers and is very time-consuming. Currently, no sensing device based on electrical properties is available to distinguish clones from latex samples. Hence, under the hypothesis that the dielectric constant varies between clones, this paper discusses the development of a capacitance sensor based on a Capacitance Comparison Bridge to measure the output voltage of different latex samples. The proposed sensor is initially tested with a 30 ml latex sample, to which dilution water is then gradually added. The output voltage and capacitance obtained from the test are recorded and analyzed using a Simple Linear Regression (SLR) model. The results indicate that latex from clone 2002 produced the most reliable linear regression line, with a coefficient of determination of 91.24%. In addition, the study also found that the capacitive response of latex samples deteriorates when they are diluted with larger volumes of water. (paper)
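
    The SLR step can be reproduced with scipy's linregress, which returns the slope, intercept and the correlation coefficient from which the coefficient of determination follows. The voltage readings below are made-up values, not the measurements reported for any clone.

```python
from scipy import stats

# Hypothetical voltage readings (V) for one latex sample at increasing dilution levels;
# the clone-specific values from the paper are not reproduced here.
water_added_ml = [0, 5, 10, 15, 20, 25]
output_voltage = [2.10, 2.02, 1.95, 1.90, 1.83, 1.78]

fit = stats.linregress(water_added_ml, output_voltage)
print(f"slope = {fit.slope:.4f} V/ml, intercept = {fit.intercept:.3f} V")
print(f"coefficient of determination R^2 = {fit.rvalue ** 2:.4f}")
```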

  8. Finding-equal regression method and its application in predication of U resources

    International Nuclear Information System (INIS)

    Cao Huimo

    1995-03-01

    The deposit-model method commonly adopted in mineral resource prediction has two main parts: model data that express the geological mineralization law of a deposit, and a statistical prediction method suited to the character of those data, i.e. an appropriate regression method. The regression method proposed here may be called finding-equal regression; it combines linear regression with a distribution finding-equal method. The distribution finding-equal method is a data pretreatment that satisfies a mathematical precondition of linear regression, namely the equal-distribution assumption, and this pretreatment is practical to carry out. Finding-equal regression can therefore overcome the nonlinear limitations that commonly arise in traditional linear regression and other regression methods, which often leave no solution, and it can also identify outliers and suppress their influence, a problem that usually appears when robust regression encounters outliers in the independent variables. This places finding-equal regression favourably among regression methods. Finally, two examples of quantitative prediction of U resources are provided.

  9. Resting-state functional magnetic resonance imaging: the impact of regression analysis.

    Science.gov (United States)

    Yeh, Chia-Jung; Tseng, Yu-Sheng; Lin, Yi-Ru; Tsai, Shang-Yueh; Huang, Teng-Yi

    2015-01-01

    To investigate the impact of regression methods on resting-state functional magnetic resonance imaging (rsfMRI). During rsfMRI preprocessing, regression analysis is considered effective for reducing the interference of physiological noise on the signal time course. However, it is unclear whether the regression method benefits rsfMRI analysis. Twenty volunteers (10 men and 10 women; aged 23.4 ± 1.5 years) participated in the experiments. We used node analysis and functional connectivity mapping to assess the brain default mode network by using five combinations of regression methods. The results show that regressing the global mean plays a major role in the preprocessing steps. When a global regression method is applied, the values of functional connectivity are significantly lower (P ≤ .01) than those calculated without a global regression. This step increases inter-subject variation and produces anticorrelated brain areas. rsfMRI data processed using regression should be interpreted carefully. The significance of the anticorrelated brain areas produced by global signal removal is unclear. Copyright © 2014 by the American Society of Neuroimaging.
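
    Global signal regression of the kind examined here amounts to residualizing every voxel time series on the global mean via ordinary least squares; a compact sketch on random stand-in data follows (array sizes are arbitrary assumptions).

```python
import numpy as np

rng = np.random.default_rng(21)
n_timepoints, n_voxels = 200, 5000     # arbitrary sizes

# Hypothetical voxel-by-time data; the global mean time course serves as a nuisance regressor.
data = rng.normal(size=(n_timepoints, n_voxels))
global_signal = data.mean(axis=1)

# Regress the global signal (plus an intercept) out of every voxel time series at once.
G = np.column_stack([np.ones(n_timepoints), global_signal])
beta, *_ = np.linalg.lstsq(G, data, rcond=None)
residuals = data - G @ beta

# Functional connectivity would then be computed on `residuals`; as the paper cautions,
# this step lowers correlation values and can create anti-correlated areas.
print(residuals.shape)
```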

  10. Estimation Methods for Non-Homogeneous Regression - Minimum CRPS vs Maximum Likelihood

    Science.gov (United States)

    Gebetsberger, Manuel; Messner, Jakob W.; Mayr, Georg J.; Zeileis, Achim

    2017-04-01

    Non-homogeneous regression models are widely used to statistically post-process numerical weather prediction models. Such regression models correct for errors in mean and variance and are capable of forecasting a full probability distribution. To estimate the corresponding regression coefficients, CRPS minimization has been performed in many meteorological post-processing studies over the last decade. In contrast to maximum likelihood estimation, CRPS minimization is claimed to yield more calibrated forecasts. Theoretically, both scoring rules used as an optimization score should be able to locate a similar and unknown optimum. Discrepancies might result from a wrong distributional assumption about the observed quantity. To address this theoretical concept, this study compares maximum likelihood and minimum CRPS estimation for different distributional assumptions. First, a synthetic case study shows that, for an appropriate distributional assumption, both estimation methods yield similar regression coefficients. The log-likelihood estimator is slightly more efficient. A real-world case study for surface temperature forecasts at different sites in Europe confirms these results but shows that surface temperature does not always follow the classical assumption of a Gaussian distribution. KEYWORDS: ensemble post-processing, maximum likelihood estimation, CRPS minimization, probabilistic temperature forecasting, distributional regression models
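
    Under a Gaussian assumption the CRPS has a closed form, so both estimators can be written as the minimization of an average score. The sketch below compares the two on synthetic data with a simple mean model and constant spread; the variable names and the optimizer choice are assumptions, not the study's setup.

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(4)
n = 2000

# Hypothetical post-processing setup: y ~ N(a + b * ens_mean, c^2).
ens_mean = rng.normal(size=n)
y = 0.5 + 0.9 * ens_mean + rng.normal(scale=1.2, size=n)

def mean_crps(params):
    a, b, log_c = params
    mu, sigma = a + b * ens_mean, np.exp(log_c)
    z = (y - mu) / sigma
    crps = sigma * (z * (2 * stats.norm.cdf(z) - 1) + 2 * stats.norm.pdf(z) - 1 / np.sqrt(np.pi))
    return crps.mean()

def neg_log_lik(params):
    a, b, log_c = params
    mu, sigma = a + b * ens_mean, np.exp(log_c)
    return -stats.norm.logpdf(y, mu, sigma).mean()

for name, score in [("minimum CRPS", mean_crps), ("maximum likelihood", neg_log_lik)]:
    res = optimize.minimize(score, x0=[0.0, 1.0, 0.0], method="Nelder-Mead")
    print(name, np.round(res.x, 3))   # with a correct Gaussian assumption both land near the truth
```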

  11. Physiologic noise regression, motion regression, and TOAST dynamic field correction in complex-valued fMRI time series.

    Science.gov (United States)

    Hahn, Andrew D; Rowe, Daniel B

    2012-02-01

    As more evidence is presented suggesting that the phase, as well as the magnitude, of functional MRI (fMRI) time series may contain important information and that there are theoretical drawbacks to modeling functional response in the magnitude alone, removing noise in the phase is becoming more important. Previous studies have shown that retrospective correction of noise from physiologic sources can remove significant phase variance and that dynamic main magnetic field correction and regression of estimated motion parameters also remove significant phase fluctuations. In this work, we investigate the performance of physiologic noise regression in a framework along with correction for dynamic main field fluctuations and motion regression. Our findings suggest that including physiologic regressors provides some benefit in terms of reduction in phase noise power, but it is small compared to the benefit of dynamic field corrections and use of estimated motion parameters as nuisance regressors. Additionally, we show that the use of all three techniques reduces phase variance substantially, removes undesirable spatial phase correlations and improves detection of the functional response in magnitude and phase. Copyright © 2011 Elsevier Inc. All rights reserved.

  12. Logic regression and its extensions.

    Science.gov (United States)

    Schwender, Holger; Ruczinski, Ingo

    2010-01-01

    Logic regression is an adaptive classification and regression procedure, initially developed to reveal interacting single nucleotide polymorphisms (SNPs) in genetic association studies. In general, this approach can be used in any setting with binary predictors, when the interaction of these covariates is of primary interest. Logic regression searches for Boolean (logic) combinations of binary variables that best explain the variability in the outcome variable, and thus, reveals variables and interactions that are associated with the response and/or have predictive capabilities. The logic expressions are embedded in a generalized linear regression framework, and thus, logic regression can handle a variety of outcome types, such as binary responses in case-control studies, numeric responses, and time-to-event data. In this chapter, we provide an introduction to the logic regression methodology, list some applications in public health and medicine, and summarize some of the direct extensions and modifications of logic regression that have been proposed in the literature. Copyright © 2010 Elsevier Inc. All rights reserved.

  13. A review and comparison of Bayesian and likelihood-based inferences in beta regression and zero-or-one-inflated beta regression.

    Science.gov (United States)

    Liu, Fang; Eugenio, Evercita C

    2018-04-01

    Beta regression is an increasingly popular statistical technique in medical research for modeling of outcomes that assume values in (0, 1), such as proportions and patient reported outcomes. When outcomes take values in the intervals [0,1), (0,1], or [0,1], zero-or-one-inflated beta (zoib) regression can be used. We provide a thorough review on beta regression and zoib regression in the modeling, inferential, and computational aspects via the likelihood-based and Bayesian approaches. We demonstrate the statistical and practical importance of correctly modeling the inflation at zero/one rather than ad hoc replacing them with values close to zero/one via simulation studies; the latter approach can lead to biased estimates and invalid inferences. We show via simulation studies that the likelihood-based approach is computationally faster in general than MCMC algorithms used in the Bayesian inferences, but runs the risk of non-convergence, large biases, and sensitivity to starting values in the optimization algorithm especially with clustered/correlated data, data with sparse inflation at zero and one, and data that warrant regularization of the likelihood. The disadvantages of the regular likelihood-based approach make the Bayesian approach an attractive alternative in these cases. Software packages and tools for fitting beta and zoib regressions in both the likelihood-based and Bayesian frameworks are also reviewed.
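
    A minimal likelihood-based sketch of plain beta regression (without the zero/one inflation component discussed in the review): a logit link for the mean, a constant precision, and direct maximization of the beta log-likelihood with scipy. The simulated covariate and parameter values are assumptions.

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(8)
n = 300

# Hypothetical (0, 1) outcome, e.g. a proportion, with one covariate.
x = rng.normal(size=n)
mu_true = 1 / (1 + np.exp(-(-0.2 + 0.8 * x)))
phi_true = 20.0                                   # precision parameter
y = rng.beta(mu_true * phi_true, (1 - mu_true) * phi_true)

def neg_log_lik(params):
    b0, b1, log_phi = params
    mu = 1 / (1 + np.exp(-(b0 + b1 * x)))         # logit link for the mean
    phi = np.exp(log_phi)
    return -stats.beta.logpdf(y, mu * phi, (1 - mu) * phi).sum()

res = optimize.minimize(neg_log_lik, x0=[0.0, 0.0, 1.0], method="Nelder-Mead")
print("logit-mean coefficients and log-precision:", np.round(res.x, 3))
```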

  14. Combining Alphas via Bounded Regression

    Directory of Open Access Journals (Sweden)

    Zura Kakushadze

    2015-11-01

    Full Text Available We give an explicit algorithm and source code for combining alpha streams via bounded regression. In practical applications, typically, there is insufficient history to compute a sample covariance matrix (SCM) for a large number of alphas. To compute alpha allocation weights, one then resorts to (weighted) regression over SCM principal components. Regression often produces alpha weights with insufficient diversification and/or skewed distribution against, e.g., turnover. This can be rectified by imposing bounds on alpha weights within the regression procedure. Bounded regression can also be applied to stock and other asset portfolio construction. We discuss illustrative examples.
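
    The bounding step can be illustrated with scipy's lsq_linear, which solves least squares subject to box constraints on the weights; the alpha-return matrix, the target and the bound values below are invented stand-ins, not the paper's algorithm or data.

```python
import numpy as np
from scipy.optimize import lsq_linear

rng = np.random.default_rng(2)
n_obs, n_alphas = 60, 10          # deliberately short history relative to the number of alphas

# Hypothetical alpha-stream returns and a target return series to be replicated.
alpha_returns = rng.normal(scale=0.01, size=(n_obs, n_alphas))
target = alpha_returns @ rng.uniform(0.0, 0.3, n_alphas) + rng.normal(scale=0.005, size=n_obs)

# Unconstrained least squares can concentrate or flip weights; box bounds force diversification.
unbounded = np.linalg.lstsq(alpha_returns, target, rcond=None)[0]
bounded = lsq_linear(alpha_returns, target, bounds=(0.0, 0.2)).x

print("unbounded weights:", np.round(unbounded, 3))
print("bounded weights:  ", np.round(bounded, 3))
```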

  15. riskRegression

    DEFF Research Database (Denmark)

    Ozenne, Brice; Sørensen, Anne Lyngholm; Scheike, Thomas

    2017-01-01

    In the presence of competing risks a prediction of the time-dynamic absolute risk of an event can be based on cause-specific Cox regression models for the event and the competing risks (Benichou and Gail, 1990). We present computationally fast and memory optimized C++ functions with an R interface for predicting the covariate specific absolute risks, their confidence intervals, and their confidence bands based on right censored time to event data. We provide explicit formulas for our implementation of the estimator of the (stratified) baseline hazard function in the presence of tied event times. As a by...... functionals. The software presented here is implemented in the riskRegression package.

  16. Sample size determination for logistic regression on a logit-normal distribution.

    Science.gov (United States)

    Kim, Seongho; Heath, Elisabeth; Heilbrun, Lance

    2017-06-01

    Although the sample size for simple logistic regression can be readily determined using currently available methods, the sample size calculation for multiple logistic regression requires some additional information, such as the coefficient of determination (R²) of a covariate of interest with other covariates, which is often unavailable in practice. The response variable of logistic regression follows a logit-normal distribution which can be generated from a logistic transformation of a normal distribution. Using this property of logistic regression, we propose new methods of determining the sample size for simple and multiple logistic regressions using a normal transformation of outcome measures. Simulation studies and a motivating example show several advantages of the proposed methods over the existing methods: (i) no need for R² for multiple logistic regression, (ii) available interim or group-sequential designs, and (iii) much smaller required sample size.

  17. Are increases in cigarette taxation regressive?

    Science.gov (United States)

    Borren, P; Sutton, M

    1992-12-01

    Using the latest published data from Tobacco Advisory Council surveys, this paper re-evaluates the question of whether or not increases in cigarette taxation are regressive in the United Kingdom. The extended data set shows no evidence of increasing price-elasticity by social class as found in a major previous study. To the contrary, there appears to be no clear pattern in the price responsiveness of smoking behaviour across different social classes. Increases in cigarette taxation, while reducing smoking levels in all groups, fall most heavily on men and women in the lowest social class. Men and women in social class five can expect to pay eight and eleven times more of a tax increase respectively, than their social class one counterparts. Taken as a proportion of relative incomes, the regressive nature of increases in cigarette taxation is even more pronounced.

  18. LINE FUSION GENES: a database of LINE expression in human genes

    Directory of Open Access Journals (Sweden)

    Park Hong-Seog

    2006-06-01

    Full Text Available Abstract Background Long Interspersed Nuclear Elements (LINEs) are the most abundant retrotransposons in humans. About 79% of human genes are estimated to contain at least one segment of LINE per transcription unit. Recent studies have shown that LINE elements can affect protein sequences, splicing patterns and expression of human genes. Description We have developed a database, LINE FUSION GENES, for elucidating LINE expression throughout the human gene database. We searched the 28,171 genes listed in the NCBI database for LINE elements and analyzed their structures and expression patterns. The results show that the mRNA sequences of 1,329 genes were affected by LINE expression. The LINE expression types were classified on the basis of LINEs in the 5' UTR, exon or 3' UTR sequences of the mRNAs. Our database provides further information, such as the tissue distribution and chromosomal location of the genes, and the domain structure that is changed by LINE integration. We have linked all the accession numbers to the NCBI data bank to provide mRNA sequences for subsequent users. Conclusion We believe that our work will interest genome scientists and might help them to gain insight into the implications of LINE expression for human evolution and disease. Availability http://www.primate.or.kr/line

  19. Regression in autistic spectrum disorders.

    Science.gov (United States)

    Stefanatos, Gerry A

    2008-12-01

    A significant proportion of children diagnosed with Autistic Spectrum Disorder experience a developmental regression characterized by a loss of previously-acquired skills. This may involve a loss of speech or social responsivity, but often entails both. This paper critically reviews the phenomenon of regression in autistic spectrum disorders, highlighting the characteristics of regression, age of onset, temporal course, and long-term outcome. Important considerations for diagnosis are discussed and multiple etiological factors currently hypothesized to underlie the phenomenon are reviewed. It is argued that regressive autistic spectrum disorders can be conceptualized on a spectrum with other regressive disorders that may share common pathophysiological features. The implications of this viewpoint are discussed.

  20. The sandfly Lutzomyia longipalpis LL5 embryonic cell line has active Toll and Imd pathways and shows immune responses to bacteria, yeast and Leishmania.

    Science.gov (United States)

    Tinoco-Nunes, Bruno; Telleria, Erich Loza; da Silva-Neves, Monique; Marques, Christiane; Azevedo-Brito, Daisy Aline; Pitaluga, André Nóbrega; Traub-Csekö, Yara Maria

    2016-04-20

    Lutzomyia longipalpis is the main vector of visceral leishmaniasis in Latin America. Sandfly immune responses are poorly understood. In previous work we showed that these vector insects respond to bacterial infections by modulating a defensin gene expression and activate the Imd pathway in response to Leishmania infection. Aspects of innate immune pathways in insects (including mosquito vectors of human diseases) have been revealed by studying insect cell lines, and we have previously demonstrated antiviral responses in the L. longipalpis embryonic cell line LL5. The expression patterns of antimicrobial peptides (AMPs) and transcription factors were evaluated after silencing the repressors of the Toll pathway (cactus) and Imd pathway (caspar). AMPs and transcription factor expression patterns were also evaluated after challenge with heat-killed bacteria, heat-killed yeast, or live Leishmania. These studies showed that LL5 cells have active Toll and Imd pathways, since they displayed an increased expression of AMP genes following silencing of the repressors cactus and caspar, respectively. These pathways were also activated by challenges with bacteria, yeast and Leishmania infantum chagasi. We demonstrated that L. longipalpis LL5 embryonic cells respond to immune stimuli and are therefore a good model to study the immunological pathways of this important vector of leishmaniasis.

  1. Clinical Relevance of Alternative Endpoints in Colorectal Cancer First-Line Therapy With Bevacizumab: A Retrospective Study.

    Science.gov (United States)

    Turpin, Anthony; Paget-Bailly, Sophie; Ploquin, Anne; Hollebecque, Antoine; Peugniez, Charlotte; El-Hajbi, Farid; Bonnetain, Franck; Hebbar, Mohamed

    2018-03-01

    We studied the relationship between intermediate criteria and overall survival (OS) in metastatic colorectal cancer (mCRC) patients who received first-line chemotherapy with bevacizumab. We assessed OS, progression-free survival (PFS), duration of disease control (DDC), the sum of the periods in which the disease did not progress, and the time to failure of strategy (TFS), which was defined as the entire period before the introduction of a second-line treatment. Linear correlation and regression models were used, and Prentice criteria were investigated. With a median follow-up of 57.6 months for 216 patients, the median OS was 24.5 months (95% confidence interval [CI], 21.3-29.7). The median PFS, DDC, and TFS were 8.9 (95% CI, 8.4-9.7), 11.0 (95% CI, 9.8-12.4), and 11.1 (95% CI, 10.0-13.0) months, respectively. The correlations between OS and DDC (Pearson coefficient, 0.79 [95% CI, 0.73-0.83], determination coefficient, 0.62) and OS and TFS (Pearson coefficient, 0.79 [95% CI, 0.73-0.84], determination coefficient, 0.63) were satisfactory. Linear regression analysis showed a significant association between OS and DDC, and between OS and TFS. Prentice criteria were verified for TFS as well as DDC. DDC and TFS correlated with OS and are relevant as intermediate criteria in the setting of patients with mCRC treated with a first-line bevacizumab-based regimen. Copyright © 2017 Elsevier Inc. All rights reserved.

  2. Understanding logistic regression analysis

    OpenAIRE

    Sperandei, Sandro

    2014-01-01

    Logistic regression is used to obtain odds ratio in the presence of more than one explanatory variable. The procedure is quite similar to multiple linear regression, with the exception that the response variable is binomial. The result is the impact of each variable on the odds ratio of the observed event of interest. The main advantage is to avoid confounding effects by analyzing the association of all variables together. In this article, we explain the logistic regression procedure using ex...

  3. Current and potential tree locations in tree line ecotone of Changbai Mountains, Northeast China: the controlling effects of topography.

    Science.gov (United States)

    Zong, Shengwei; Wu, Zhengfang; Xu, Jiawei; Li, Ming; Gao, Xiaofeng; He, Hongshi; Du, Haibo; Wang, Lei

    2014-01-01

    The tree line ecotone in the Changbai Mountains has undergone large changes in the past decades. Tree locations show variations on the four sides of the mountains, especially on the northern and western sides, which has not been fully explained. Previous studies attributed such variations to variations in temperature. In this study, however, we hypothesized that topographic controls were responsible for the variations in tree locations in the tree line ecotone of the Changbai Mountains. To test the hypothesis, we used IKONOS images and a WorldView-1 image to identify the tree locations and developed a logistic regression model using topographical variables to identify the dominant controls of the tree locations. The results showed that aspect, wetness, and slope were dominant controls for tree locations on the western side of the mountains, whereas altitude, SPI, and aspect were the dominant factors on the northern side. The uppermost altitude a tree can currently reach was 2140 m asl on the northern side and 2060 m asl on the western side. The model predictions showed that habitats above the current tree line on both sides were available for trees. Tree recruitment below the current tree line may take advantage of the available habitats at higher elevations, given the current tree locations. Our research confirmed the controlling effects of topography on tree locations in the tree line ecotone of the Changbai Mountains and suggested that it is essential to assess the tree response to topography in tree line ecotone research.

  4. A consistent framework for Horton regression statistics that leads to a modified Hack's law

    Science.gov (United States)

    Furey, P.R.; Troutman, B.M.

    2008-01-01

    A statistical framework is introduced that resolves important problems with the interpretation and use of traditional Horton regression statistics. The framework is based on a univariate regression model that leads to an alternative expression for Horton ratio, connects Horton regression statistics to distributional simple scaling, and improves the accuracy in estimating Horton plot parameters. The model is used to examine data for drainage area A and mainstream length L from two groups of basins located in different physiographic settings. Results show that confidence intervals for the Horton plot regression statistics are quite wide. Nonetheless, an analysis of covariance shows that regression intercepts, but not regression slopes, can be used to distinguish between basin groups. The univariate model is generalized to include n > 1 dependent variables. For the case where the dependent variables represent ln A and ln L, the generalized model performs somewhat better at distinguishing between basin groups than two separate univariate models. The generalized model leads to a modification of Hack's law where L depends on both A and Strahler order. Data show that Strahler order plays a statistically significant role in the modified Hack's law expression. © 2008 Elsevier B.V.

  5. A Matlab program for stepwise regression

    Directory of Open Access Journals (Sweden)

    Yanhong Qi

    2016-03-01

    Full Text Available Stepwise linear regression is a multi-variable regression method for identifying statistically significant variables in the linear regression equation. In the present study, we present a Matlab program for stepwise regression.

  6. Multiple linear stepwise regression of liver lipid levels: proton MR spectroscopy study in vivo at 3.0 T

    International Nuclear Information System (INIS)

    Xu Li; Liang Changhong; Xiao Yuanqiu; Zhang Zhonglin

    2010-01-01

    Objective: To analyze the correlations between liver lipid level determined by liver 3.0 T ¹H-MRS in vivo and influencing factors using multiple linear stepwise regression. Methods: The prospective study of liver ¹H-MRS was performed with a 3.0 T system and eight-channel torso phased-array coils using a PRESS sequence. Forty-four volunteers were enrolled in this study. Liver spectra were collected with a TR of 1500 ms, TE of 30 ms, volume of interest of 2 cm×2 cm×2 cm, and NSA of 64 times. The acquired raw proton MRS data were processed using the software program SAGE. For each MRS measurement, using water as the internal reference, the amplitude of the lipid signal was normalized to the sum of the signal from lipid and water to obtain the percentage lipid within the liver. The statistical description of height, weight, age, BMI, line width and water suppression was recorded, and Pearson analysis was applied to test their relationships. Multiple linear stepwise regression was used to set the statistical model for the prediction of liver lipid content. Results: Age (39.1±12.6) years, body weight (64.4±10.4) kg, BMI (23.3±3.1) kg/m², linewidth (18.9±4.4) and water suppression (90.7±6.5)% had significant correlation with liver lipid content (0.00 to 0.96%, median 0.02%); the r values were 0.11, 0.44, 0.40, 0.52 and -0.73, respectively (P<0.05). However, only age, BMI, line width, and water suppression entered the multiple linear regression equation. The liver lipid content prediction equation was as follows: Y = 1.395 - (0.021×water suppression) + (0.022×BMI) + (0.014×line width) - (0.004×age), and the coefficient of determination was 0.613, with a corrected coefficient of determination of 0.59. Conclusion: The regression model fitted well, since the variables of age, BMI, line width, and water suppression can explain about 60% of liver lipid content changes. (authors)

  7. Comparison of partial least squares and lasso regression techniques as applied to laser-induced breakdown spectroscopy of geological samples

    Energy Technology Data Exchange (ETDEWEB)

    Dyar, M.D., E-mail: mdyar@mtholyoke.edu [Dept. of Astronomy, Mount Holyoke College, 50 College St., South Hadley, MA 01075 (United States); Carmosino, M.L.; Breves, E.A.; Ozanne, M.V. [Dept. of Astronomy, Mount Holyoke College, 50 College St., South Hadley, MA 01075 (United States); Clegg, S.M.; Wiens, R.C. [Los Alamos National Laboratory, P.O. Box 1663, MS J565, Los Alamos, NM 87545 (United States)

    2012-04-15

    response variables as possible while avoiding multicollinearity between principal components. When the selected number of principal components is projected back into the original feature space of the spectra, 6144 correlation coefficients are generated, a small fraction of which are mathematically significant to the regression. In contrast, the lasso models require only a small number (< 24) of non-zero correlation coefficients (β values) to determine the concentration of each of the ten major elements. Causality between the positively-correlated emission lines chosen by the lasso and the elemental concentration was examined. In general, the higher the lasso coefficient (β), the greater the likelihood that the selected line results from an emission of that element. Emission lines with negative β values should arise from elements that are anti-correlated with the element being predicted. For elements except Fe, Al, Ti, and P, the lasso-selected wavelength with the highest β value corresponds to the element being predicted, e.g. 559.8 nm for neutral Ca. However, the specific lines chosen by the lasso with positive β values are not always those from the element being predicted. Other wavelengths and the elements that most strongly correlate with them to predict concentration are obviously related to known geochemical correlations or close overlap of emission lines, while others must result from matrix effects. Use of the lasso technique thus directly informs our understanding of the underlying physical processes that give rise to LIBS emissions by determining which lines can best represent concentration, and which lines from other elements are causing matrix effects. - Highlights: ► Compositions of 100 rocks are predicted from LIBS with PLS-1, PLS-2, and the lasso. ► All yield comparable results in terms of accuracy, but not interpretability. ► Lasso chooses channels from

  8. Sparse Reduced-Rank Regression for Simultaneous Dimension Reduction and Variable Selection

    KAUST Repository

    Chen, Lisha

    2012-12-01

    The reduced-rank regression is an effective method in predicting multiple response variables from the same set of predictor variables. It reduces the number of model parameters and takes advantage of interrelations between the response variables and hence improves predictive accuracy. We propose to select relevant variables for reduced-rank regression by using a sparsity-inducing penalty. We apply a group-lasso type penalty that treats each row of the matrix of the regression coefficients as a group and show that this penalty satisfies certain desirable invariance properties. We develop two numerical algorithms to solve the penalized regression problem and establish the asymptotic consistency of the proposed method. In particular, the manifold structure of the reduced-rank regression coefficient matrix is considered and studied in our theoretical analysis. In our simulation study and real data analysis, the new method is compared with several existing variable selection methods for multivariate regression and exhibits competitive performance in prediction and variable selection. © 2012 American Statistical Association.
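
    For orientation, the sketch below implements plain (unpenalized) reduced-rank regression by projecting the OLS fit onto the leading right singular vectors of the fitted values; the row-wise group-lasso penalty proposed in the paper is not implemented here, and the dimensions are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(6)
n, p, q, r = 200, 15, 8, 2        # samples, predictors, responses, target rank

# Hypothetical multi-response data generated from a rank-2 coefficient matrix.
X = rng.normal(size=(n, p))
B_true = rng.normal(size=(p, r)) @ rng.normal(size=(r, q))
Y = X @ B_true + rng.normal(scale=0.5, size=(n, q))

# Plain reduced-rank regression: project the OLS fit onto the top-r right singular
# vectors of the fitted values (no sparsity penalty in this sketch).
B_ols, *_ = np.linalg.lstsq(X, Y, rcond=None)
_, _, Vt = np.linalg.svd(X @ B_ols, full_matrices=False)
V_r = Vt[:r].T
B_rrr = B_ols @ V_r @ V_r.T

print("rank of the RRR estimate:", np.linalg.matrix_rank(B_rrr))
```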

  9. Anti-correlated networks, global signal regression, and the effects of caffeine in resting-state functional MRI.

    Science.gov (United States)

    Wong, Chi Wah; Olafsson, Valur; Tal, Omer; Liu, Thomas T

    2012-10-15

    Resting-state functional connectivity magnetic resonance imaging is proving to be an essential tool for the characterization of functional networks in the brain. Two of the major networks that have been identified are the default mode network (DMN) and the task positive network (TPN). Although prior work indicates that these two networks are anti-correlated, the findings are controversial because the anti-correlations are often found only after the application of a pre-processing step, known as global signal regression, that can produce artifactual anti-correlations. In this paper, we show that, for subjects studied in an eyes-closed rest state, caffeine can significantly enhance the detection of anti-correlations between the DMN and TPN without the need for global signal regression. In line with these findings, we find that caffeine also leads to widespread decreases in connectivity and global signal amplitude. Using a recently introduced geometric model of global signal effects, we demonstrate that these decreases are consistent with the removal of an additive global signal confound. In contrast to the effects observed in the eyes-closed rest state, caffeine did not lead to significant changes in global functional connectivity in the eyes-open rest state. Copyright © 2012 Elsevier Inc. All rights reserved.

  10. Determining the Pressure Shift of Helium I Lines Using White Dwarf Stars

    Science.gov (United States)

    Camarota, Lawrence

    This dissertation explores the non-Doppler shifting of Helium lines in the high-pressure conditions of a white dwarf photosphere. In particular, this dissertation seeks to mathematically quantify the shift in a way that is simple to reproduce and account for in future studies without requiring prior knowledge of the star's bulk properties (mass, radius, temperature, etc.). Two main methods will be used in this analysis. First, the spectral line will be quantified with a continuous wavelet transformation, and the components will be used in a χ²-minimizing linear regression to predict the shift. Second, the position of the lines will be calculated using a best-fit Lévy-alpha line function. These techniques stand in contrast to traditional methods of quantifying the center of often broad spectral lines, which usually assume that the lines are symmetric.

  11. Steganalysis using logistic regression

    Science.gov (United States)

    Lubenko, Ivans; Ker, Andrew D.

    2011-02-01

    We advocate Logistic Regression (LR) as an alternative to the Support Vector Machine (SVM) classifiers commonly used in steganalysis. LR offers more information than traditional SVM methods - it estimates class probabilities as well as providing a simple classification - and can be adapted more easily and efficiently for multiclass problems. Like SVM, LR can be kernelised for nonlinear classification, and it shows comparable classification accuracy to SVM methods. This work is a case study comparing the accuracy and speed of SVM and LR classifiers in the detection of LSB Matching and related spatial-domain image steganography, using the state-of-the-art 686-dimensional SPAM feature set, on three image sets.
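
    As a hedged illustration of the LR-versus-SVM comparison described above (synthetic features rather than the 686-dimensional SPAM set; all hyperparameters are arbitrary), the sketch below shows how logistic regression returns class probabilities directly while an SVM needs Platt scaling to do the same.

      import numpy as np
      from sklearn.datasets import make_classification
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import train_test_split
      from sklearn.svm import SVC

      # Synthetic stand-in for steganalysis features (cover vs. stego).
      X, y = make_classification(n_samples=2000, n_features=50, n_informative=10, random_state=0)
      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

      lr = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
      svm = SVC(kernel="rbf", probability=True).fit(X_tr, y_tr)   # Platt scaling for probabilities

      print("LR accuracy :", lr.score(X_te, y_te))
      print("SVM accuracy:", svm.score(X_te, y_te))
      # Logistic regression yields P(stego | features) directly:
      print("LR class probabilities (first 3):", lr.predict_proba(X_te[:3])[:, 1])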

  12. Quantile regression theory and applications

    CERN Document Server

    Davino, Cristina; Vistocco, Domenico

    2013-01-01

    A guide to the implementation and interpretation of Quantile Regression models. This book explores the theory and numerous applications of quantile regression, offering empirical data analysis as well as the software tools to implement the methods. The main focus of this book is to provide the reader with a comprehensive description of the main issues concerning quantile regression; these include basic modeling, geometrical interpretation, estimation and inference for quantile regression, as well as issues concerning model validity and diagnostic tools. Each methodological aspect is explored and

  13. Fungible weights in logistic regression.

    Science.gov (United States)

    Jones, Jeff A; Waller, Niels G

    2016-06-01

    In this article we develop methods for assessing parameter sensitivity in logistic regression models. To set the stage for this work, we first review Waller's (2008) equations for computing fungible weights in linear regression. Next, we describe 2 methods for computing fungible weights in logistic regression. To demonstrate the utility of these methods, we compute fungible logistic regression weights using data from the Centers for Disease Control and Prevention's (2010) Youth Risk Behavior Surveillance Survey, and we illustrate how these alternate weights can be used to evaluate parameter sensitivity. To make our work accessible to the research community, we provide R code (R Core Team, 2015) that will generate both kinds of fungible logistic regression weights. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  14. Exact Rational Expectations, Cointegration, and Reduced Rank Regression

    DEFF Research Database (Denmark)

    Johansen, Søren; Swensen, Anders Rygh

    We interpret the linear relations from exact rational expectations models as restrictions on the parameters of the statistical model called the cointegrated vector autoregressive model for non-stationary variables. We then show how reduced rank regression, Anderson (1951), plays an important role...

  17. Virtual machine consolidation enhancement using hybrid regression algorithms

    Directory of Open Access Journals (Sweden)

    Amany Abdelsamea

    2017-11-01

    Full Text Available Cloud computing data centers are growing rapidly in both number and capacity to meet the increasing demands for highly responsive computing and massive storage. Such data centers consume enormous amounts of electrical energy, resulting in high operating costs and carbon dioxide emissions. The reason for this extremely high energy consumption is not just the quantity of computing resources and the power inefficiency of hardware, but rather lies in the inefficient usage of these resources. VM consolidation relies on live migration of VMs, that is, the capability of transferring a VM between physical servers with close to zero downtime. It is an effective way to improve the utilization of resources and increase energy efficiency in cloud data centers. VM consolidation consists of host overload/underload detection, VM selection and VM placement. Most current VM consolidation approaches apply either heuristic-based techniques, such as static utilization thresholds and decision-making based on statistical analysis of historical data, or simply periodic adaptation of the VM allocation, and most of these algorithms rely on CPU utilization only for host overload detection. In this paper we propose using hybrid factors to enhance VM consolidation. Specifically, we developed a multiple regression algorithm that uses CPU utilization, memory utilization and bandwidth utilization for host overload detection. The proposed algorithm, Multiple Regression Host Overload Detection (MRHOD), significantly reduces energy consumption while ensuring a high level of adherence to Service Level Agreements (SLAs), since it gives a real indication of host utilization based on three parameters (CPU, memory and bandwidth utilization) instead of only one (CPU utilization). Through simulations we show that our approach reduces power consumption by a factor of 6 compared to single-factor algorithms using a random workload. Also, using PlanetLab workload traces we show that MRHOD improves
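
    The sketch below illustrates the general idea of multi-factor regression-based overload detection, not the authors' exact MRHOD algorithm: a linear model trained on CPU, memory and bandwidth utilization predicts the next-interval demand, and a host is flagged as overloaded when the prediction crosses a threshold. All data and thresholds are invented for illustration.

      import numpy as np
      from sklearn.linear_model import LinearRegression

      rng = np.random.default_rng(1)

      # Illustrative utilization history for one host: fractions of CPU, memory, bandwidth.
      history = rng.uniform(0.2, 0.9, size=(200, 3))          # columns: cpu, mem, bw
      # Hypothetical "next-interval CPU demand" driven by all three factors plus noise.
      next_cpu = 0.6 * history[:, 0] + 0.25 * history[:, 1] + 0.15 * history[:, 2] \
                 + 0.05 * rng.normal(size=200)

      model = LinearRegression().fit(history, next_cpu)

      def host_overloaded(cpu, mem, bw, threshold=0.8):
          # Flag overload when the regression predicts utilization above the threshold.
          predicted = model.predict([[cpu, mem, bw]])[0]
          return predicted > threshold

      print(host_overloaded(0.85, 0.70, 0.60))
      print(host_overloaded(0.30, 0.40, 0.20))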

  18. Erdheim-Chester disease in a child with MR imaging showing regression of marrow changes

    International Nuclear Information System (INIS)

    Joo, Chan Uhng; Go, Yang Sim; Kim, In Hwan; Kim, Chul Seong; Lee, Sang Yong

    2005-01-01

    Erdheim-Chester disease is a disseminated xanthogranulomatous infiltrative disease of unknown origin that generally presents in adulthood. A review of the English-language literature demonstrated that pediatric cases were extremely rare, and to our knowledge, only two cases, a 7- and 14-year-old, have been published. We report a case of Erdheim-Chester disease in a 10-year-old girl evaluated with MR imaging. Radiographs revealed typical bilateral, symmetric osteosclerosis of the metaphyseal regions of long bones of the upper and lower extremities. A histologic examination demonstrated foamy histiocytes in bone marrow smears. Bilateral symmetric low signal intensities of both proximal tibiae and distal femurs were demonstrated on T1-weighted MR images. After oral steroid therapy for 8 months, follow-up MR imaging showed remarkable restoration of normal high signal intensity in both the tibial and femoral metaphyses. To our knowledge, this may be the first case of Erdheim-Chester disease that showed normal restoration of the abnormal signal intensities in the metaphyses of long bones after steroid therapy. (orig.)

  19. Alternative regression models to assess increase in childhood BMI.

    Science.gov (United States)

    Beyerlein, Andreas; Fahrmeir, Ludwig; Mansmann, Ulrich; Toschke, André M

    2008-09-08

    Body mass index (BMI) data usually have skewed distributions, for which common statistical modeling approaches such as simple linear or logistic regression have limitations. Different regression approaches to predicting childhood BMI were compared in terms of goodness-of-fit measures and ease of interpretation, including generalized linear models (GLMs), quantile regression and Generalized Additive Models for Location, Scale and Shape (GAMLSS). We analyzed data of 4967 children participating in the school entry health examination in Bavaria, Germany, from 2001 to 2002. TV watching, meal frequency, breastfeeding, smoking in pregnancy, maternal obesity, parental social class and weight gain in the first 2 years of life were considered as risk factors for obesity. GAMLSS showed a much better fit regarding the estimation of risk factor effects on transformed and untransformed BMI data than common GLMs with respect to the generalized Akaike information criterion. In comparison with GAMLSS, quantile regression allowed for additional interpretation of prespecified distribution quantiles, such as quantiles referring to overweight or obesity. The variables TV watching, maternal BMI and weight gain in the first 2 years were directly, and meal frequency was inversely, significantly associated with body composition in any model type examined. In contrast, smoking in pregnancy was not directly, and breastfeeding and parental social class were not inversely, significantly associated with body composition in GLM models, but they were in GAMLSS and partly in quantile regression models. Risk factor specific BMI percentile curves could be estimated from GAMLSS and quantile regression models. GAMLSS and quantile regression seem to be more appropriate than common GLMs for risk factor modeling of BMI data.

  20. Cisgenic Rvi6 scab-resistant apple lines show no differences in Rvi6 transcription when compared with conventionally bred cultivars.

    Science.gov (United States)

    Chizzali, Cornelia; Gusberti, Michele; Schouten, Henk J; Gessler, Cesare; Broggini, Giovanni A L

    2016-03-01

    The expression of the apple scab resistance gene Rvi6 in different apple cultivars and lines is not modulated by biotic or abiotic factors. All commercially important apple cultivars are susceptible to Venturia inaequalis, the causal organism of apple scab. A limited number of apple cultivars were bred to express the resistance gene Vf from the wild apple genotype Malus floribunda 821. Positional cloning of the Vf locus allowed the identification of the Rvi6 (formerly HcrVf2) scab resistance gene that was subsequently used to generate cisgenic apple lines. It is important to understand and compare how this resistance gene is transcribed and modulated during infection in conventionally bred cultivars and in cisgenic lines. The aim of this work was to study the transcription pattern of Rvi6 in three classically bred apple cultivars and six lines of 'Gala' genetically modified to express Rvi6. Rvi6 transcription was analyzed at two time points using quantitative real-time PCR (RT-qPCR) following inoculation with V. inaequalis conidia or water. Rvi6 transcription was assessed in relation to five reference genes. β-Actin, RNAPol, and UBC were the most suited to performing RT-qPCR experiments on Malus × domestica. Inoculation with V. inaequalis conidia under conditions conducive to scab infection failed to produce any significant changes to the transcription level of Rvi6. Rvi6 expression levels were inconsistent in response to external treatments in the different apple cultivars, and transgenic, intragenic or cisgenic lines.

  1. Principal component regression analysis with SPSS.

    Science.gov (United States)

    Liu, R X; Kuang, J; Gong, Q; Hou, X L

    2003-06-01

    The paper introduces the indices used for multicollinearity diagnosis, the basic principle of principal component regression, and the method for determining the 'best' equation. A worked example shows how to carry out principal component regression analysis with SPSS 10.0, covering the full calculation process of the principal component regression together with the linear regression, factor analysis, descriptives, compute variable and bivariate correlations procedures in SPSS 10.0. Principal component regression analysis can be used to overcome the disturbance caused by multicollinearity, and carrying it out in SPSS makes the analysis simpler and faster while keeping it accurate.
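
    The same principal component regression workflow can be sketched outside SPSS. The Python example below (synthetic collinear predictors; the number of retained components is an arbitrary choice) chains a PCA step with ordinary least squares, which is the essence of the procedure described above.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.linear_model import LinearRegression
      from sklearn.pipeline import make_pipeline

      rng = np.random.default_rng(2)

      # Three highly collinear predictors (a classic multicollinearity setting).
      x1 = rng.normal(size=200)
      x2 = x1 + 0.05 * rng.normal(size=200)
      x3 = x1 - 0.05 * rng.normal(size=200)
      X = np.column_stack([x1, x2, x3])
      y = 1.5 * x1 + 0.2 * rng.normal(size=200)

      # Regress on the leading principal component instead of the raw predictors.
      pcr = make_pipeline(PCA(n_components=1), LinearRegression()).fit(X, y)
      print("R^2 of the principal component regression:", pcr.score(X, y))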

  2. Testing of a Fiber Optic Wear, Erosion and Regression Sensor

    Science.gov (United States)

    Korman, Valentin; Polzin, Kurt A.

    2011-01-01

    The nature of the physical processes and harsh environments associated with erosion and wear in propulsion environments makes their measurement and real-time rate quantification difficult. A fiber optic sensor capable of determining the wear (regression, erosion, ablation) associated with these environments has been developed and tested in a number of different applications to validate the technique. The sensor consists of two fiber optics that have differing attenuation coefficients and transmit light to detectors. The ratio of the two measured intensities can be correlated to the lengths of the fiber optic lines, and if the fibers and the host parent material in which they are embedded wear at the same rate, the remaining length of fiber provides a real-time measure of the wear process. Testing in several disparate situations has been performed, with the data exhibiting excellent qualitative agreement with the theoretical description of the process and, when a separate calibrated regression measurement is available, good quantitative agreement as well. The light collected by the fibers can also be used to optically obtain the spectra and measure the internal temperature of the wear layer.
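
    A rough sketch of how such an intensity ratio can be turned into a remaining-length (and hence wear) estimate, assuming simple exponential Beer-Lambert-type attenuation with known coefficients; the coefficients and intensities below are illustrative values, not calibration data from the sensor described above.

      import numpy as np

      # Assumed attenuation coefficients of the two fibers (per metre) and launched intensities.
      alpha1, alpha2 = 2.0, 6.0        # fiber 2 attenuates more strongly
      i0_1, i0_2 = 1.0, 1.0            # launched intensities (taken equal here)

      def remaining_length(i1, i2):
          # Invert I_k = I0_k * exp(-alpha_k * L) for the common remaining length L.
          ratio = (i1 / i2) * (i0_2 / i0_1)
          return np.log(ratio) / (alpha2 - alpha1)

      # As the host material (and fibers) wear away, L shrinks and the ratio changes.
      for L_true in (0.50, 0.30, 0.10):
          i1 = i0_1 * np.exp(-alpha1 * L_true)
          i2 = i0_2 * np.exp(-alpha2 * L_true)
          print(f"true L = {L_true:.2f} m, recovered L = {remaining_length(i1, i2):.2f} m")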

  3. Testing and Modeling Fuel Regression Rate in a Miniature Hybrid Burner

    Directory of Open Access Journals (Sweden)

    Luciano Fanton

    2012-01-01

    Full Text Available Ballistic characterization of an extended group of innovative HTPB-based solid fuel formulations for hybrid rocket propulsion was performed in a lab-scale burner. An optical time-resolved technique was used to assess the quasisteady regression history of single perforation, cylindrical samples. The effects of metalized additives and radiant heat transfer on the regression rate of such formulations were assessed. Under the investigated operating conditions and based on phenomenological models from the literature, analyses of the collected experimental data show an appreciable influence of the radiant heat flux from burnt gases and soot for both unloaded and loaded fuel formulations. Pure HTPB regression rate data are satisfactorily reproduced, while the impressive initial regression rates of metalized formulations require further assessment.

  4. Tax Evasion, Information Reporting, and the Regressive Bias Prediction

    DEFF Research Database (Denmark)

    Boserup, Simon Halphen; Pinje, Jori Veng

    2013-01-01

    Models of rational tax evasion and optimal enforcement invariably predict a regressive bias in the effective tax system, which reduces redistribution in the economy. Using Danish administrative data, we show that a calibrated structural model of this type replicates moments and correlations of tax evasion and audit probabilities once we account for information reporting in the tax compliance game. When conditioning on information reporting, we find that both reduced-form evidence and simulations exhibit the predicted regressive bias. However, in the overall economy, this bias is negated by the tax...

  5. Detection of sensor degradation using K-means clustering and support vector regression in nuclear power plant

    International Nuclear Information System (INIS)

    Seo, Inyong; Ha, Bokam; Lee, Sungwoo; Shin, Changhoon; Lee, Jaeyong; Kim, Seongjun

    2011-01-01

    In a nuclear power plant (NPP), periodic sensor calibrations are required to assure that sensors are operating correctly. However, only a few sensors are typically found to be faulty and in need of rectification. For the safe operation of an NPP and the reduction of unnecessary calibration, on-line calibration monitoring is needed. In this study, an on-line calibration monitoring technique called KPCSVR, which combines k-means clustering with principal-component-based auto-associative support vector regression (PCSVR), is proposed for nuclear power plants. The k-means clustering method is used to reduce the training time of the model. Response surface methodology is employed to efficiently determine the optimal values of the support vector regression hyperparameters. The proposed KPCSVR model was confirmed with actual plant data of Kori Nuclear Power Plant Unit 3, measured from the primary and secondary systems of the plant, and compared with the PCSVR model. By using data clustering, the average accuracy of PCSVR improved from 1.228×10⁻⁴ to 0.472×10⁻⁴ and the average sensitivity of PCSVR from 0.0930 to 0.0909, which results in good detection of sensor drift. Moreover, the training time is greatly reduced from 123.5 to 31.5 sec. (author)
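
    The following is a loose sketch of the general clustering-plus-regression idea rather than the authors' exact KPCSVR formulation, using made-up redundant sensor readings: the training data are partitioned with k-means so that a smaller local PCA + SVR model can be fitted per cluster, and drift is flagged from the reconstruction residual.

      import numpy as np
      from sklearn.cluster import KMeans
      from sklearn.decomposition import PCA
      from sklearn.pipeline import make_pipeline
      from sklearn.svm import SVR

      rng = np.random.default_rng(3)

      # Made-up correlated readings from 5 redundant sensors (rows = time samples).
      base = rng.normal(size=(600, 1))
      X = base + 0.05 * rng.normal(size=(600, 5))
      inputs, target = X[:, :-1], X[:, -1]       # predict one sensor from the others

      # Partition the training data so each SVR is trained on a smaller, local subset.
      km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(inputs)
      models = {}
      for c in range(4):
          mask = km.labels_ == c
          models[c] = make_pipeline(PCA(n_components=2), SVR(C=10.0, epsilon=0.01)).fit(
              inputs[mask], target[mask])

      def drift_flag(sample_inputs, sample_target, threshold=0.1):
          # Flag possible sensor drift when the reconstruction residual exceeds an
          # illustrative threshold.
          c = km.predict(sample_inputs.reshape(1, -1))[0]
          residual = abs(models[c].predict(sample_inputs.reshape(1, -1))[0] - sample_target)
          return residual > threshold

      print(drift_flag(inputs[0], target[0]))            # healthy reading
      print(drift_flag(inputs[0], target[0] + 0.5))      # simulated drift of 0.5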

  6. Logistic regression models

    CERN Document Server

    Hilbe, Joseph M

    2009-01-01

    This book really does cover everything you ever wanted to know about logistic regression … with updates available on the author's website. Hilbe, a former national athletics champion, philosopher, and expert in astronomy, is a master at explaining statistical concepts and methods. Readers familiar with his other expository work will know what to expect-great clarity.The book provides considerable detail about all facets of logistic regression. No step of an argument is omitted so that the book will meet the needs of the reader who likes to see everything spelt out, while a person familiar with some of the topics has the option to skip "obvious" sections. The material has been thoroughly road-tested through classroom and web-based teaching. … The focus is on helping the reader to learn and understand logistic regression. The audience is not just students meeting the topic for the first time, but also experienced users. I believe the book really does meet the author's goal … .-Annette J. Dobson, Biometric...

  7. Comparing parametric and nonparametric regression methods for panel data

    DEFF Research Database (Denmark)

    Czekaj, Tomasz Gerard; Henningsen, Arne

    We investigate and compare the suitability of parametric and non-parametric stochastic regression methods for analysing production technologies and the optimal firm size. Our theoretical analysis shows that the most commonly used functional forms in empirical production analysis, Cobb-Douglas and Translog, are unsuitable for analysing the optimal firm size. We show that the Translog functional form implies an implausible linear relationship between the (logarithmic) firm size and the elasticity of scale, where the slope is artificially related to the substitutability between the inputs. The practical applicability of the parametric and non-parametric regression methods is scrutinised and compared by an empirical example: we analyse the production technology and investigate the optimal size of Polish crop farms based on a firm-level balanced panel data set. A nonparametric specification test

  8. Determinants of orphan drugs prices in France: a regression analysis.

    Science.gov (United States)

    Korchagina, Daria; Millier, Aurelie; Vataire, Anne-Lise; Aballea, Samuel; Falissard, Bruno; Toumi, Mondher

    2017-04-21

    The introduction of the orphan drug legislation led to an increase in the number of available orphan drugs, but access to them is often limited due to their high price. Social preferences regarding funding orphan drugs as well as the criteria taken into consideration while setting the price remain unclear. The study aimed at identifying the determinants of orphan drug prices in France using a regression analysis. All drugs with a valid orphan designation at the moment of launch for which the price was available in France were included in the analysis. The selection of covariates was based on a literature review and included drug characteristics (Anatomical Therapeutic Chemical (ATC) class, treatment line, age of target population), disease characteristics (severity, prevalence, availability of alternative therapeutic options), health technology assessment (HTA) details (actual benefit (AB) and improvement in actual benefit (IAB) scores, delay between the HTA and commercialisation), and study characteristics (type of study, comparator, type of endpoint). The main data sources were European public assessment reports, HTA reports, summaries of opinion on orphan designation of the European Medicines Agency, and the French insurance database of drugs and tariffs. A generalized regression model was developed to test the association between the annual treatment cost and selected covariates. A total of 68 drugs were included. The mean annual treatment cost was €96,518. In the univariate analysis, the ATC class (p = 0.01), availability of alternative treatment options (p = 0.02) and the prevalence (p = 0.02) showed a significant correlation with the annual cost. The multivariate analysis demonstrated a significant association between the annual cost and availability of alternative treatment options, ATC class, IAB score, type of comparator in the pivotal clinical trial, as well as commercialisation date and delay between the HTA and commercialisation. The

  9. Optimization of line configuration and balancing for flexible machining lines

    Science.gov (United States)

    Liu, Xuemei; Li, Aiping; Chen, Zurui

    2016-05-01

    Line configuration and balancing is to select the type of line and allot a given set of operations as well as machines to a sequence of workstations to realize high-efficiency production. Most current research on machining line configuration and balancing problems is related to dedicated transfer lines with dedicated machine workstations. With growing trends towards great product variety and fluctuations in market demand, dedicated transfer lines are being replaced with flexible machining lines composed of identical CNC machines. This paper deals with the line configuration and balancing problem for flexible machining lines. The objective is to assign operations to workstations and find the sequence of execution, and to specify the number of machines in each workstation while minimizing the line cycle time and total number of machines. This problem is subject to precedence, clustering, accessibility and capacity constraints among the features, operations, setups and workstations. A mathematical model and a heuristic algorithm based on a feature group strategy and polychromatic sets theory are presented to find an optimal solution. The feature group strategy and polychromatic sets theory are used to establish the constraint model. A heuristic operations sequencing and assignment algorithm is given. An industrial case study is carried out, and multiple optimal solutions in different line configurations are obtained. The case study results show that the solutions with shorter cycle time and higher line balancing rate demonstrate the feasibility and effectiveness of the proposed algorithm. This research proposes a heuristic line configuration and balancing algorithm based on a feature group strategy and polychromatic sets theory which is able to provide better solutions while achieving an improvement in computing time.

  10. Estimating the exceedance probability of rain rate by logistic regression

    Science.gov (United States)

    Chiu, Long S.; Kedem, Benjamin

    1990-01-01

    Recent studies have shown that the fraction of an area with rain intensity above a fixed threshold is highly correlated with the area-averaged rain rate. To estimate the fractional rainy area, a logistic regression model, which estimates the conditional probability that rain rate over an area exceeds a fixed threshold given the values of related covariates, is developed. The problem of dependency in the data in the estimation procedure is bypassed by the method of partial likelihood. Analyses of simulated scanning multichannel microwave radiometer and observed electrically scanning microwave radiometer data during the Global Atlantic Tropical Experiment period show that the use of logistic regression in pixel classification is superior to multiple regression in predicting whether rain rate at each pixel exceeds a given threshold, even in the presence of noisy data. The potential of the logistic regression technique in satellite rain rate estimation is discussed.
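
    A toy sketch of the underlying idea, with synthetic data and a single hypothetical covariate: a logistic model estimates the conditional probability that the rain rate exceeds a fixed threshold, and averaging those probabilities over pixels gives an estimate of the fractional rainy area.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(4)

      # Hypothetical covariate per pixel (e.g., a brightness-temperature-like quantity)
      # and synthetic rain rates that tend to grow with it.
      covariate = rng.normal(size=5000)
      rain_rate = np.exp(0.8 * covariate + 0.5 * rng.normal(size=5000)) - 0.5
      exceeds = (rain_rate > 1.0).astype(int)          # 1 if rain rate above the threshold

      model = LogisticRegression().fit(covariate.reshape(-1, 1), exceeds)

      # Conditional exceedance probability per pixel, and the implied fractional rainy area.
      p_exceed = model.predict_proba(covariate.reshape(-1, 1))[:, 1]
      print("estimated fractional area above threshold:", p_exceed.mean())
      print("observed fraction:", exceeds.mean())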

  11. General regression and representation model for classification.

    Directory of Open Access Journals (Sweden)

    Jianjun Qian

    Full Text Available Recently, the regularized coding-based classification methods (e.g., SRC and CRC) have shown great potential for pattern classification. However, most existing coding methods assume that the representation residuals are uncorrelated. In real-world applications, this assumption does not hold. In this paper, we take account of the correlations of the representation residuals and develop a general regression and representation model (GRR) for classification. GRR not only has the advantages of CRC, but also takes full advantage of the prior information (e.g., the correlations between representation residuals and representation coefficients) and the specific information (the weight matrix of image pixels) to enhance the classification performance. GRR uses the generalized Tikhonov regularization and K Nearest Neighbors to learn the prior information from the training data. Meanwhile, the specific information is obtained by using an iterative algorithm to update the feature (or image pixel) weights of the test sample. With the proposed model as a platform, we design two classifiers: the basic general regression and representation classifier (B-GRR) and the robust general regression and representation classifier (R-GRR). The experimental results demonstrate the performance advantages of the proposed methods over state-of-the-art algorithms.

  12. Comparison between Linear and Nonlinear Regression in a Laboratory Heat Transfer Experiment

    Science.gov (United States)

    Gonçalves, Carine Messias; Schwaab, Marcio; Pinto, José Carlos

    2013-01-01

    In order to interpret laboratory experimental data, undergraduate students are used to performing linear regression through linearized versions of nonlinear models. However, the use of linearized models can lead to statistically biased parameter estimates. Even so, it is not an easy task to introduce nonlinear regression and show for the students…

  13. Understanding logistic regression analysis.

    Science.gov (United States)

    Sperandei, Sandro

    2014-01-01

    Logistic regression is used to obtain odds ratio in the presence of more than one explanatory variable. The procedure is quite similar to multiple linear regression, with the exception that the response variable is binomial. The result is the impact of each variable on the odds ratio of the observed event of interest. The main advantage is to avoid confounding effects by analyzing the association of all variables together. In this article, we explain the logistic regression procedure using examples to make it as simple as possible. After definition of the technique, the basic interpretation of the results is highlighted and then some special issues are discussed.

  14. [Evaluation of estimation of prevalence ratio using bayesian log-binomial regression model].

    Science.gov (United States)

    Gao, W L; Lin, H; Liu, X N; Ren, X W; Li, J S; Shen, X P; Zhu, S L

    2017-03-10

    To evaluate the estimation of the prevalence ratio (PR) using a Bayesian log-binomial regression model and its application, we estimated the PR of medical care-seeking prevalence relative to caregivers' recognition of risk signs of diarrhea in their infants by fitting a Bayesian log-binomial regression model in OpenBUGS software. The results showed that caregivers' recognition of infants' risk signs of diarrhea was associated significantly with a 13% increase in medical care-seeking. Meanwhile, we compared the differences in the point estimates and interval estimates of the PR of medical care-seeking prevalence relative to caregivers' recognition of risk signs of diarrhea, and the convergence of three models (model 1: not adjusting for the covariates; model 2: adjusting for duration of caregivers' education; model 3: adjusting for distance between village and township and child month-age based on model 2), between the Bayesian log-binomial regression model and the conventional log-binomial regression model. The results showed that all three Bayesian log-binomial regression models converged and the estimated PRs were 1.130 (95% CI: 1.005-1.265), 1.128 (95% CI: 1.001-1.264) and 1.132 (95% CI: 1.004-1.267), respectively. Conventional log-binomial regression models 1 and 2 converged and their PRs were 1.130 (95% CI: 1.055-1.206) and 1.126 (95% CI: 1.051-1.203), respectively, but model 3 failed to converge, so the COPY method was used to estimate the PR, which was 1.125 (95% CI: 1.051-1.200). In addition, the point estimates and interval estimates of the PRs from the three Bayesian log-binomial regression models differed slightly from those from the conventional log-binomial regression model, but they had good consistency in estimating the PR. Therefore, the Bayesian log-binomial regression model can effectively estimate the PR with less misconvergence and has more advantages in application compared with the conventional log-binomial regression model.

  15. [Hyperspectral Estimation of Apple Tree Canopy LAI Based on SVM and RF Regression].

    Science.gov (United States)

    Han, Zhao-ying; Zhu, Xi-cun; Fang, Xian-yi; Wang, Zhuo-yuan; Wang, Ling; Zhao, Geng-Xing; Jiang, Yuan-mao

    2016-03-01

    Leaf area index (LAI) is a dynamic index of crop population size. Hyperspectral technology can be used to estimate apple canopy LAI rapidly and nondestructively, providing a reference for monitoring tree growth and estimating yield. Red Fuji apple trees at the full fruit-bearing stage were the research objects. The canopy spectral reflectance and LAI values of ninety apple trees were measured with an ASD FieldSpec3 spectrometer and an LAI-2200 in thirty orchards in the Qixia research area of Shandong Province over two consecutive years. The optimal vegetation indices were selected by correlation analysis of the original spectral reflectance and vegetation indices. Models for predicting LAI were built with the multivariate regression methods of support vector machine (SVM) and random forest (RF). The new vegetation indices GNDVI527, NDVI676, RVI682, FDNVI656 and GRVI517, together with the two previous main vegetation indices NDVI670 and NDVI705, were closely associated with LAI. In the RF regression model, the calibration set determination coefficient C-R2 of 0.920 and the validation set determination coefficient V-R2 of 0.889 were higher than those of the SVM regression model by 0.045 and 0.033, respectively. The calibration set root mean square error C-RMSE of 0.249 and the validation set root mean square error V-RMSE of 0.236 were lower than those of the SVM regression model by 0.054 and 0.058, respectively. The RPD values of the calibration set (C-RPD) and the validation set (V-RPD) reached 3.363 and 2.520, higher than those of the SVM regression model by 0.598 and 0.262, respectively. The slopes C-S and V-S of the measured-versus-predicted trend lines for the calibration and validation sets were close to 1. The estimation result of the RF regression model is better than that of the SVM. The RF regression model can be used to estimate the LAI of Red Fuji apple trees in the full fruit period.
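
    A minimal sketch of the SVR-versus-RF regression comparison on synthetic vegetation-index data (the indices, coefficients and noise level are invented for illustration and do not reproduce the orchard measurements).

      import numpy as np
      from sklearn.ensemble import RandomForestRegressor
      from sklearn.metrics import mean_squared_error, r2_score
      from sklearn.model_selection import train_test_split
      from sklearn.svm import SVR

      rng = np.random.default_rng(5)

      # Synthetic stand-in for five selected vegetation indices and the measured LAI.
      indices = rng.uniform(0.1, 0.9, size=(90, 5))
      lai = 1.0 + 3.0 * indices[:, 0] + 1.5 * indices[:, 1] ** 2 + 0.2 * rng.normal(size=90)

      X_tr, X_te, y_tr, y_te = train_test_split(indices, lai, test_size=0.3, random_state=0)

      for name, model in [("SVR", SVR(C=10.0)),
                          ("RF ", RandomForestRegressor(n_estimators=200, random_state=0))]:
          model.fit(X_tr, y_tr)
          pred = model.predict(X_te)
          rmse = mean_squared_error(y_te, pred) ** 0.5
          print(f"{name}: R2 = {r2_score(y_te, pred):.3f}, RMSE = {rmse:.3f}")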

  16. Alternative regression models to assess increase in childhood BMI

    Directory of Open Access Journals (Sweden)

    Mansmann Ulrich

    2008-09-01

    Full Text Available Abstract Background Body mass index (BMI) data usually have skewed distributions, for which common statistical modeling approaches such as simple linear or logistic regression have limitations. Methods Different regression approaches to predicting childhood BMI were compared in terms of goodness-of-fit measures and ease of interpretation, including generalized linear models (GLMs), quantile regression and Generalized Additive Models for Location, Scale and Shape (GAMLSS). We analyzed data of 4967 children participating in the school entry health examination in Bavaria, Germany, from 2001 to 2002. TV watching, meal frequency, breastfeeding, smoking in pregnancy, maternal obesity, parental social class and weight gain in the first 2 years of life were considered as risk factors for obesity. Results GAMLSS showed a much better fit regarding the estimation of risk factor effects on transformed and untransformed BMI data than common GLMs with respect to the generalized Akaike information criterion. In comparison with GAMLSS, quantile regression allowed for additional interpretation of prespecified distribution quantiles, such as quantiles referring to overweight or obesity. The variables TV watching, maternal BMI and weight gain in the first 2 years were directly, and meal frequency was inversely, significantly associated with body composition in any model type examined. In contrast, smoking in pregnancy was not directly, and breastfeeding and parental social class were not inversely, significantly associated with body composition in GLM models, but they were in GAMLSS and partly in quantile regression models. Risk factor specific BMI percentile curves could be estimated from GAMLSS and quantile regression models. Conclusion GAMLSS and quantile regression seem to be more appropriate than common GLMs for risk factor modeling of BMI data.

  17. Continuous validation of ASTEC containment models and regression testing

    International Nuclear Information System (INIS)

    Nowack, Holger; Reinke, Nils; Sonnenkalb, Martin

    2014-01-01

    The focus of the ASTEC (Accident Source Term Evaluation Code) development at GRS is primarily on the containment module CPA (Containment Part of ASTEC), whose modelling is to a large extent based on the GRS containment code COCOSYS (COntainment COde SYStem). Validation is usually understood as the approval of the modelling capabilities by calculations of appropriate experiments done by external users different from the code developers. During the development process of ASTEC CPA, bugs and unintended side effects may occur, which leads to changes in the results of the initially conducted validation. Due to the involvement of a considerable number of developers in the coding of ASTEC modules, validation of the code alone, even if executed repeatedly, is not sufficient. Therefore, a regression testing procedure has been implemented in order to ensure that the initially obtained validation results are still valid with succeeding code versions. Within the regression testing procedure, calculations of experiments and plant sequences are performed with the same input deck but applying two different code versions. For every test-case the up-to-date code version is compared to the preceding one on the basis of physical parameters deemed to be characteristic for the test-case under consideration. In the case of post-calculations of experiments also a comparison to experimental data is carried out. Three validation cases from the regression testing procedure are presented within this paper. The very good post-calculation of the HDR E11.1 experiment shows the high quality modelling of thermal-hydraulics in ASTEC CPA. Aerosol behaviour is validated on the BMC VANAM M3 experiment, and the results show also a very good agreement with experimental data. Finally, iodine behaviour is checked in the validation test-case of the THAI IOD-11 experiment. Within this test-case, the comparison of the ASTEC versions V2.0r1 and V2.0r2 shows how an error was detected by the regression testing

  18. Graphene Oxide Nanoribbons Induce Autophagic Vacuoles in Neuroblastoma Cell Lines

    Directory of Open Access Journals (Sweden)

    Emanuela Mari

    2016-11-01

    Full Text Available Since graphene nanoparticles are attracting increasing interest in relation to medical applications, it is important to understand their potential effects on humans. In the present study, we prepared graphene oxide (GO) nanoribbons by oxidative unzipping of single-wall carbon nanotubes (SWCNTs) and analyzed their toxicity in two human neuroblastoma cell lines. Neuroblastoma is the most common solid neoplasia in children. The hallmark of these tumors is the high number of different clinical variables, ranging from highly metastatic, rapid progression and resistance to therapy to spontaneous regression or change into benign ganglioneuromas. Patients with neuroblastoma are grouped into different risk groups that are characterized by different prognosis and different clinical behavior. Relapse and mortality in high risk patients is very high in spite of new advances in chemotherapy. Cell lines obtained from neuroblastomas have different genotypic and phenotypic features. The cell lines SK-N-BE(2) and SH-SY5Y have different genetic mutations and tumorigenicity. Cells were exposed to low doses of GO for different times in order to investigate whether GO was a good vehicle for biological molecules delivering individualized therapy. Cytotoxicity in both cell lines was studied by measuring cellular oxidative stress (ROS), mitochondria membrane potential, expression of lysosomal proteins and cell growth. GO uptake and cytoplasmic distribution of particles were studied by Transmission Electron Microscopy (TEM) for up to 72 h. The results show that GO at low concentrations increased ROS production and induced autophagy in both neuroblastoma cell lines within a few hours of exposure, events that, however, are not followed by growth arrest or death. For this reason, we suggest that the GO nanoparticle can be used for therapeutic delivery to the brain tissue with minimal effects on healthy cells.

  19. MICROLENSING OF QUASAR BROAD EMISSION LINES: CONSTRAINTS ON BROAD LINE REGION SIZE

    Energy Technology Data Exchange (ETDEWEB)

    Guerras, E.; Mediavilla, E. [Instituto de Astrofisica de Canarias, Via Lactea S/N, La Laguna E-38200, Tenerife (Spain); Jimenez-Vicente, J. [Departamento de Fisica Teorica y del Cosmos, Universidad de Granada, Campus de Fuentenueva, E-18071 Granada (Spain); Kochanek, C. S. [Department of Astronomy and the Center for Cosmology and Astroparticle Physics, The Ohio State University, 4055 McPherson Lab, 140 West 18th Avenue, Columbus, OH 43221 (United States); Munoz, J. A. [Departamento de Astronomia y Astrofisica, Universidad de Valencia, E-46100 Burjassot, Valencia (Spain); Falco, E. [Center for Astrophysics, 60 Garden Street, Cambridge, MA 02138 (United States); Motta, V. [Departamento de Fisica y Astronomia, Universidad de Valparaiso, Avda. Gran Bretana 1111, Valparaiso (Chile)

    2013-02-20

    We measure the differential microlensing of the broad emission lines between 18 quasar image pairs in 16 gravitational lenses. We find that the broad emission lines are in general weakly microlensed. The results show, at a modest level of confidence (1.8σ), that high ionization lines such as C IV are more strongly microlensed than low ionization lines such as Hβ, indicating that the high ionization line emission regions are more compact. If we statistically model the distribution of microlensing magnifications, we obtain estimates for the broad line region size of r_s = 24^{+22}_{-15} and r_s = 55^{+150}_{-35} lt-day (90% confidence) for the high and low ionization lines, respectively. When the samples are divided into higher and lower luminosity quasars, we find that the line emission regions of more luminous quasars are larger, with a slope consistent with the expected scaling from photoionization models. Our estimates also agree well with the results from local reverberation mapping studies.

  20. Lining seam elimination algorithm and surface crack detection in concrete tunnel lining

    Science.gov (United States)

    Qu, Zhong; Bai, Ling; An, Shi-Quan; Ju, Fang-Rong; Liu, Ling

    2016-11-01

    Due to the particular nature of the surface of concrete tunnel lining and the diversity of detection environments, such as uneven illumination, smudges, localized rock falls, water leakage, and the inherent seams of the lining structure, existing crack detection algorithms cannot detect real cracks accurately. This paper proposes an algorithm that combines lining seam elimination with an improved percolation detection algorithm based on grid cell analysis for surface crack detection in concrete tunnel lining. First, the characteristics of pixels within the overlapping grid are checked to remove the background noise and generate the percolation seed map (PSM). Second, cracks are detected from the PSM by the accelerated percolation algorithm so that the fracture unit areas can be scanned and connected. Finally, the real surface cracks in concrete tunnel lining can be obtained by removing the lining seam and performing percolation denoising. Experimental results show that the proposed algorithm can accurately, quickly, and effectively detect real surface cracks. Furthermore, it fills a gap in existing concrete tunnel lining surface crack detection by removing the lining seam.

  1. Minimax Regression Quantiles

    DEFF Research Database (Denmark)

    Bache, Stefan Holst

    A new and alternative quantile regression estimator is developed and it is shown that the estimator is root n-consistent and asymptotically normal. The estimator is based on a minimax ‘deviance function’ and has asymptotically equivalent properties to the usual quantile regression estimator. It is, however, a different and therefore new estimator. It allows for both linear- and nonlinear model specifications. A simple algorithm for computing the estimates is proposed. It seems to work quite well in practice but whether it has theoretical justification is still an open question.

  2. Optimal regression for reasoning about knowledge and actions

    NARCIS (Netherlands)

    Ditmarsch, van H.; Herzig, Andreas; Lima, de Tiago

    2007-01-01

    We show how in the propositional case both Reiter’s and Scherl & Levesque’s solutions to the frame problem can be modelled in dynamic epistemic logic (DEL), and provide an optimal regression algorithm for the latter. Our method is as follows: we extend Reiter’s framework by integrating observation

  3. Spontaneous Regression of Lymphangiomas in a Single Center Over 34 Years

    Directory of Open Access Journals (Sweden)

    Motoi Kato, MD

    2017-09-01

    Conclusions: We concluded that elderly patients with macrocystic or mixed type lymphangioma may have to wait for treatment for over 3 months from the initial onset. Conversely, the microcystic type could not be expected to show regression in a short period, and prompt initiation of treatment may be required. Whether regression occurs or not may depend on the characteristics of the lymph flow.

  4. Electricity demand loads modeling using AutoRegressive Moving Average (ARMA) models

    Energy Technology Data Exchange (ETDEWEB)

    Pappas, S.S. [Department of Information and Communication Systems Engineering, University of the Aegean, Karlovassi, 83 200 Samos (Greece); Ekonomou, L.; Chatzarakis, G.E. [Department of Electrical Engineering Educators, ASPETE - School of Pedagogical and Technological Education, N. Heraklion, 141 21 Athens (Greece); Karamousantas, D.C. [Technological Educational Institute of Kalamata, Antikalamos, 24100 Kalamata (Greece); Katsikas, S.K. [Department of Technology Education and Digital Systems, University of Piraeus, 150 Androutsou Srt., 18 532 Piraeus (Greece); Liatsis, P. [Division of Electrical Electronic and Information Engineering, School of Engineering and Mathematical Sciences, Information and Biomedical Engineering Centre, City University, Northampton Square, London EC1V 0HB (United Kingdom)

    2008-09-15

    This study addresses the problem of modeling the electricity demand loads in Greece. The provided actual load data is deseasonalized and an AutoRegressive Moving Average (ARMA) model is fitted on the data off-line, using the Akaike Corrected Information Criterion (AICC). The developed model fits the data in a successful manner. Difficulties occur when the provided data includes noise or errors and also when on-line/adaptive modeling is required. In both cases and under the assumption that the provided data can be represented by an ARMA model, simultaneous order and parameter estimation of ARMA models under the presence of noise are performed. The produced results indicate that the proposed method, which is based on the multi-model partitioning theory, tackles the studied problem successfully. For validation purposes the produced results are compared with three other established order selection criteria, namely AICC, Akaike's Information Criterion (AIC) and Schwarz's Bayesian Information Criterion (BIC). The developed model could be useful in studies that concern electricity consumption and electricity price forecasts. (author)
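
    A small sketch of off-line ARMA fitting with corrected-AIC order selection on a synthetic series, using statsmodels; the candidate order grid and the AICc correction applied to each fitted model are illustrative choices and do not reproduce the multi-model partitioning method of the study.

      import numpy as np
      from statsmodels.tsa.arima.model import ARIMA

      rng = np.random.default_rng(6)

      # Synthetic stand-in for a deseasonalized load series: an ARMA(2, 1) process.
      n = 500
      e = rng.standard_normal(n)
      y = np.zeros(n)
      for t in range(2, n):
          y[t] = 0.6 * y[t - 1] - 0.2 * y[t - 2] + e[t] + 0.4 * e[t - 1]

      def aicc(res, n_obs):
          # Corrected AIC computed from the fitted model's AIC and parameter count.
          k = len(res.params)
          return res.aic + 2 * k * (k + 1) / (n_obs - k - 1)

      # Fit a grid of candidate orders off-line and keep the one with the lowest AICc.
      scores = {(p, q): aicc(ARIMA(y, order=(p, 0, q)).fit(), n)
                for p in range(4) for q in range(3)}
      best_order = min(scores, key=scores.get)
      print("selected ARMA order (p, q):", best_order, "AICc:", round(scores[best_order], 1))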

  5. Single Image Super-Resolution Using Global Regression Based on Multiple Local Linear Mappings.

    Science.gov (United States)

    Choi, Jae-Seok; Kim, Munchurl

    2017-03-01

    Super-resolution (SR) has become more vital because of its capability to generate high-quality ultra-high definition (UHD) high-resolution (HR) images from low-resolution (LR) input images. Conventional SR methods entail high computational complexity, which makes them difficult to implement for up-scaling of full-high-definition input images into UHD-resolution images. Nevertheless, our previous super-interpolation (SI) method showed a good compromise between Peak-Signal-to-Noise Ratio (PSNR) performance and computational complexity. However, since SI only utilizes simple linear mappings, it may fail to precisely reconstruct HR patches with complex texture. In this paper, we present a novel SR method, which inherits the large-to-small patch conversion scheme from SI but uses global regression based on local linear mappings (GLM). Thus, our new SR method is called GLM-SI. In GLM-SI, each LR input patch is divided into 25 overlapped subpatches. Next, based on the local properties of these subpatches, 25 different local linear mappings are applied to the current LR input patch to generate 25 HR patch candidates, which are then regressed into one final HR patch using a global regressor. The local linear mappings are learned cluster-wise in our off-line training phase. The main contribution of this paper is as follows: previously, linear-mapping-based conventional SR methods, including SI, applied only one simple yet coarse linear mapping to each patch to reconstruct its HR version. On the contrary, for each LR input patch, our GLM-SI is the first to apply a combination of multiple local linear mappings, where each local linear mapping is found according to the local properties of the current LR patch. Therefore, it can better approximate nonlinear LR-to-HR mappings for HR patches with complex texture. Experiment results show that the proposed GLM-SI method outperforms most of the state-of-the-art methods, and shows comparable PSNR performance with much lower

  6. Quality of life in breast cancer patients--a quantile regression analysis.

    Science.gov (United States)

    Pourhoseingholi, Mohamad Amin; Safaee, Azadeh; Moghimi-Dehkordi, Bijan; Zeighami, Bahram; Faghihzadeh, Soghrat; Tabatabaee, Hamid Reza; Pourhoseingholi, Asma

    2008-01-01

    Quality of life studies have an important role in health care, especially in chronic diseases, in clinical judgment and in the allocation of medical resources. Statistical tools like linear regression are widely used to assess the predictors of quality of life. But when the response is not normally distributed the results can be misleading. The aim of this study is to determine the predictors of quality of life in breast cancer patients using a quantile regression model and to compare the results with linear regression. A cross-sectional study was conducted on 119 breast cancer patients admitted and treated in the chemotherapy ward of Namazi hospital in Shiraz. We used the QLQ-C30 questionnaire to assess quality of life in these patients. A quantile regression was employed to assess the associated factors and the results were compared to linear regression. All analyses were carried out using SAS. The mean score for global health status for the breast cancer patients was 64.92 ± 11.42. Linear regression showed that only grade of tumor, occupational status, menopausal status, financial difficulties and dyspnea were statistically significant. Unlike in linear regression, financial difficulties were not significant in the quantile regression analysis, and dyspnea was significant only for the first quartile. Also, emotional functioning and duration of disease statistically predicted the QOL score in the third quartile. The results demonstrate that using quantile regression leads to better interpretation and richer inference about the predictors of quality of life in breast cancer patients.
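
    A compact sketch of quantile regression on made-up quality-of-life data (the variable names and effect sizes are invented): statsmodels' quantreg fits a separate regression for each requested quantile, so covariate effects can differ across the QOL distribution.

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(7)
      n = 119

      # Invented covariates loosely mirroring the study's setting.
      df = pd.DataFrame({
          "dyspnea": rng.integers(0, 2, size=n),
          "tumor_grade": rng.integers(1, 4, size=n),
      })
      df["qol"] = 70 - 8 * df["dyspnea"] - 3 * df["tumor_grade"] + rng.normal(0, 10, size=n)

      # One fit per quantile of interest; coefficients may differ across quantiles.
      for q in (0.25, 0.50, 0.75):
          res = smf.quantreg("qol ~ dyspnea + tumor_grade", df).fit(q=q)
          print(f"q = {q}: dyspnea coefficient = {res.params['dyspnea']:+.2f}")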

  7. Proliferation of epithelial cell rests, formation of apical cysts, and regression of apical cysts after periapical wound healing.

    Science.gov (United States)

    Lin, Louis M; Huang, George T-J; Rosenberg, Paul A

    2007-08-01

    There is continuing controversy regarding the potential for inflammatory apical cysts to heal after nonsurgical endodontic therapy. Molecular cell biology may provide answers to a series of related questions. How are the epithelial cell rests of Malassez stimulated to proliferate? How are the apical cysts formed? How does the lining epithelium of apical cysts regress after endodontic therapy? Epithelial cell rests are induced to divide and proliferate by inflammatory mediators, proinflammatory cytokines, and growth factors released from host cells during periradicular inflammation. Quiescent epithelial cell rests can behave like restricted-potential stem cells if stimulated to proliferate. Formation of apical cysts is most likely caused by the merging of proliferating epithelial strands from all directions to form a three-dimensional ball mass. After endodontic therapy, epithelial cells in epithelial strands of periapical granulomas and the lining epithelium of apical cysts may stop proliferating because of a reduction in inflammatory mediators, proinflammatory cytokines, and growth factors. Epithelial cells will also regress because of activation of apoptosis or programmed cell death through deprivation of survival factors or by receiving death signals during periapical wound healing.

  8. Analytical, Practical and Emotional Intelligence and Line Manager Competencies

    Directory of Open Access Journals (Sweden)

    Anna Baczyńska

    2015-12-01

    Full Text Available Purpose: The research objective was to examine to what extent line manager competencies are linked to intelligence, and more specifically, three types of intelligence: analytical (fluid), practical and emotional. Methodology: The research was carried out with line managers (N=98) who took part in 12 Assessment Centre sessions and completed tests measuring analytical, practical and emotional intelligence. The adopted hypotheses were tested using a multiple regression. In the regression model, the dependent variable was a managerial competency (management and striving for results, social skills, openness to change, problem solving, employee development) and the explanatory variables were the three types of intelligence. Five models, each for a separate management competency, were tested in this way. Findings: In the study, it was hypothesized that practical intelligence relates to procedural tacit knowledge and is the strongest indicator of managerial competency. Analysis of the study results testing this hypothesis indicated that practical intelligence largely accounts for the level of competency used in managerial work (from 21% to 38%). The study findings suggest that practical intelligence is a better indicator of managerial competencies among line managers than traditionally measured IQ or emotional intelligence. Originality: This research fills an important gap in the literature on the subject, indicating the links between major contemporary selection indicators (i.e., analytical, practical and emotional intelligence) and managerial competencies presented in realistic work simulations measured using the Assessment Centre process.

  9. Novel mesostructured inclusions in the epidermal lining of Artemia franciscana ovisacs show optical activity

    Directory of Open Access Journals (Sweden)

    Elena Hollergschwandtner

    2017-10-01

    Full Text Available Background Biomineralization, e.g., in sea urchins or mollusks, includes the assembly of mesoscopic superstructures from inorganic crystalline components and biopolymers. The resulting mesocrystals inspire biophysicists and material scientists alike, because of their extraordinary physical properties. Current efforts to replicate mesocrystal synthesis in vitro require understanding the principles of their self-assembly in vivo. One question, not addressed so far, is whether intracellular crystals of proteins can assemble with biopolymers into functional mesocrystal-like structures. During our electron microscopy studies into Artemia franciscana (Crustacea: Branchiopoda), we found initial evidence of such proteinaceous mesostructures. Results EM preparations with high-pressure freezing and accelerated freeze substitution revealed an extraordinary intracellular source of mesostructured inclusions in both the cyto- and nucleoplasm of the epidermal lining of ovisacs of A. franciscana. Confocal reflection microscopy not only confirmed our finding; it also revealed reflective, light-dispersing activity of these flake-like structures, and their positioning and orientation with respect to the ovisac inside. Both the striation of alternating electron-dense and electron-lucent components and the sharp edges of the flakes indicate self-assembly of material of yet unknown origin under supposed participation of crystallization. However, selected area electron diffraction could not verify the status of crystallization. Energy dispersive X-ray analysis measured a marked increase in nitrogen within the flake-like inclusion, and the almost complete absence of elements that are typically involved in inorganic crystallization. This rise in nitrogen could possibly be related to a higher packing density of proteins, achieved by mesostructure assembly. Conclusions The ovisac lining of A. franciscana is endowed with numerous mesostructured inclusions that have not been

  10. Post-processing through linear regression

    Science.gov (United States)

    van Schaeybroeck, B.; Vannitsem, S.

    2011-03-01

    Various post-processing techniques are compared for both deterministic and ensemble forecasts, all based on linear regression between forecast data and observations. In order to evaluate the quality of the regression methods, three criteria are proposed, related to the effective correction of forecast error, the optimal variability of the corrected forecast and multicollinearity. The regression schemes under consideration include the ordinary least-squares (OLS) method, a new time-dependent Tikhonov regularization (TDTR) method, the total least-squares method, a new geometric-mean regression (GM), a recently introduced error-in-variables (EVMOS) method and, finally, a "best member" OLS method. The advantages and drawbacks of each method are clarified. These techniques are applied in the context of the Lorenz '63 system, whose model version is affected by both initial condition and model errors. For short forecast lead times, the number and choice of predictors plays an important role. Contrary to the other techniques, GM degrades when the number of predictors increases. At intermediate lead times, linear regression is unable to provide corrections to the forecast and can sometimes degrade the performance (GM and the best member OLS with noise). At long lead times the regression schemes (EVMOS, TDTR) which yield the correct variability and the largest correlation between ensemble error and spread should be preferred.
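
    A bare-bones sketch of the simplest scheme listed above, ordinary least-squares post-processing of a single-predictor forecast on synthetic data; the other regression variants discussed in the paper are not reproduced here.

      import numpy as np

      rng = np.random.default_rng(8)

      # Synthetic truth, and a biased, noisy "model forecast" of it.
      truth = rng.normal(loc=10.0, scale=3.0, size=300)
      forecast = 0.8 * truth + 2.0 + 1.0 * rng.normal(size=300)

      # Ordinary least squares: regress observations on the forecast over a training period.
      A = np.column_stack([np.ones_like(forecast), forecast])
      coeffs, *_ = np.linalg.lstsq(A, truth, rcond=None)
      corrected = A @ coeffs

      print("raw forecast RMSE      :", np.sqrt(np.mean((forecast - truth) ** 2)))
      print("corrected forecast RMSE:", np.sqrt(np.mean((corrected - truth) ** 2)))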

  11. Regression modeling methods, theory, and computation with SAS

    CERN Document Server

    Panik, Michael

    2009-01-01

    Regression Modeling: Methods, Theory, and Computation with SAS provides an introduction to a diverse assortment of regression techniques using SAS to solve a wide variety of regression problems. The author fully documents the SAS programs and thoroughly explains the output produced by the programs. The text presents the popular ordinary least squares (OLS) approach before introducing many alternative regression methods. It covers nonparametric regression, logistic regression (including Poisson regression), Bayesian regression, robust regression, fuzzy regression, random coefficients regression,

  12. Development of Elite BPH-Resistant Wide-Spectrum Restorer Lines for Three and Two Line Hybrid Rice.

    Science.gov (United States)

    Fan, Fengfeng; Li, Nengwu; Chen, Yunping; Liu, Xingdan; Sun, Heng; Wang, Jie; He, Guangcun; Zhu, Yingguo; Li, Shaoqing

    2017-01-01

    Hybrid rice has contributed significantly to world food security. Breeding elite, high-yield, strongly resistant, broad-spectrum restorer lines is an important strategy for hybrid rice in commercial breeding programs. Here, we developed three elite brown planthopper (BPH)-resistant wide-spectrum restorer lines by pyramiding the big-panicle gene Gn8.1, the BPH-resistance genes Bph6 and Bph9, and the fertility restorer genes Rf3, Rf4, Rf5 and Rf6 through molecular marker-assisted selection. Resistance analysis revealed that the newly developed restorer lines showed stronger BPH resistance than either of the single-gene donor parents Luoyang-6 and Luoyang-9. Moreover, the three new restorer lines had broad-spectrum recovery capabilities for Honglian CMS, wild-abortive CMS and two-line GMS sterile lines, and higher grain yields than the recurrent parent 9311 under natural field conditions. Importantly, the hybrid crosses also showed good performance for grain yield and BPH resistance. Thus, the development of elite BPH-resistant wide-spectrum restorer lines has a promising future for breeding broad-spectrum, BPH-resistant, high-yield varieties.

  13. Semiparametric regression during 2003–2007

    KAUST Repository

    Ruppert, David; Wand, M.P.; Carroll, Raymond J.

    2009-01-01

    Semiparametric regression is a fusion between parametric regression and nonparametric regression that integrates low-rank penalized splines, mixed model and hierarchical Bayesian methodology – thus allowing more streamlined handling of longitudinal and spatial correlation. We review progress in the field over the five-year period between 2003 and 2007. We find semiparametric regression to be a vibrant field with substantial involvement and activity, continual enhancement and widespread application.

  14. Unbalanced Regressions and the Predictive Equation

    DEFF Research Database (Denmark)

    Osterrieder, Daniela; Ventosa-Santaulària, Daniel; Vera-Valdés, J. Eduardo

    Predictive return regressions with persistent regressors are typically plagued by (asymptotically) biased/inconsistent estimates of the slope, non-standard or potentially even spurious statistical inference, and regression unbalancedness. We alleviate the problem of unbalancedness in the theoretical...

  15. Interpretation of commonly used statistical regression models.

    Science.gov (United States)

    Kasza, Jessica; Wolfe, Rory

    2014-01-01

    A review of some regression models commonly used in respiratory health applications is provided in this article. Simple linear regression, multiple linear regression, logistic regression and ordinal logistic regression are considered. The focus of this article is on the interpretation of the regression coefficients of each model, which are illustrated through the application of these models to a respiratory health research study. © 2013 The Authors. Respirology © 2013 Asian Pacific Society of Respirology.

  16. Linear regression

    CERN Document Server

    Olive, David J

    2017-01-01

    This text covers both multiple linear regression and some experimental design models. The text uses the response plot to visualize the model and to detect outliers, does not assume that the error distribution has a known parametric distribution, develops prediction intervals that work when the error distribution is unknown, suggests bootstrap hypothesis tests that may be useful for inference after variable selection, and develops prediction regions and large sample theory for the multivariate linear regression model that has m response variables. A relationship between multivariate prediction regions and confidence regions provides a simple way to bootstrap confidence regions. These confidence regions often provide a practical method for testing hypotheses. There is also a chapter on generalized linear models and generalized additive models. There are many R functions to produce response and residual plots, to simulate prediction intervals and hypothesis tests, to detect outliers, and to choose response trans...

  17. Regression modeling of ground-water flow

    Science.gov (United States)

    Cooley, R.L.; Naff, R.L.

    1985-01-01

    Nonlinear multiple regression methods are developed to model and analyze groundwater flow systems. Complete descriptions of regression methodology as applied to groundwater flow models allow scientists and engineers engaged in flow modeling to apply the methods to a wide range of problems. Organization of the text proceeds from an introduction that discusses the general topic of groundwater flow modeling, to a review of basic statistics necessary to properly apply regression techniques, and then to the main topic: exposition and use of linear and nonlinear regression to model groundwater flow. Statistical procedures are given to analyze and use the regression models. A number of exercises and answers are included to exercise the student on nearly all the methods that are presented for modeling and statistical analysis. Three computer programs implement the more complex methods. These three are a general two-dimensional, steady-state regression model for flow in an anisotropic, heterogeneous porous medium, a program to calculate a measure of model nonlinearity with respect to the regression parameters, and a program to analyze model errors in computed dependent variables such as hydraulic head. (USGS)

  18. Online and Batch Supervised Background Estimation via L1 Regression

    KAUST Repository

    Dutta, Aritra

    2017-11-23

    We propose a surprisingly simple model for supervised video background estimation. Our model is based on $\ell_1$ regression. As existing methods for $\ell_1$ regression do not scale to high-resolution videos, we propose several simple and scalable methods for solving the problem, including iteratively reweighted least squares, a homotopy method, and stochastic gradient descent. We show through extensive experiments that our model and methods match or outperform the state-of-the-art online and batch methods in virtually all quantitative and qualitative measures.

  19. Online and Batch Supervised Background Estimation via L1 Regression

    KAUST Repository

    Dutta, Aritra; Richtarik, Peter

    2017-01-01

    We propose a surprisingly simple model for supervised video background estimation. Our model is based on $\\ell_1$ regression. As existing methods for $\\ell_1$ regression do not scale to high-resolution videos, we propose several simple and scalable methods for solving the problem, including iteratively reweighted least squares, a homotopy method, and stochastic gradient descent. We show through extensive experiments that our model and methods match or outperform the state-of-the-art online and batch methods in virtually all quantitative and qualitative measures.
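
    The sketch below illustrates one of the scalable solvers mentioned above, iteratively reweighted least squares for an $\ell_1$ (least absolute deviations) fit, on a small synthetic problem rather than on video data; the toy data, iteration count and damping constant are assumptions.

```python
import numpy as np

def l1_regression_irls(X, y, n_iter=50, eps=1e-6):
    """Approximate argmin_w ||X w - y||_1 via iteratively reweighted least squares.

    Each iteration solves a weighted least-squares problem with weights
    1/|residual|, which down-weights large residuals and mimics the L1 objective.
    """
    w = np.linalg.lstsq(X, y, rcond=None)[0]          # OLS warm start
    for _ in range(n_iter):
        r = np.abs(X @ w - y)
        weights = 1.0 / np.maximum(r, eps)            # guard against division by zero
        WX = X * weights[:, None]
        w = np.linalg.solve(X.T @ WX, X.T @ (weights * y))
    return w

# Toy example with gross outliers: the L1 fit resists them, OLS does not.
rng = np.random.default_rng(1)
X = np.column_stack([np.ones(200), rng.normal(size=200)])
y = X @ np.array([1.0, 2.0]) + rng.normal(scale=0.1, size=200)
y[:10] += 20.0                                        # contaminate a few observations
print("IRLS (L1):", l1_regression_irls(X, y))
print("OLS      :", np.linalg.lstsq(X, y, rcond=None)[0])
```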

  20. Predicting for activity of second-line trastuzumab-based therapy in her2-positive advanced breast cancer

    Directory of Open Access Journals (Sweden)

    Rottenfusser Andrea

    2009-10-01

    Full Text Available Abstract Background In Her2-positive advanced breast cancer, the upfront use of trastuzumab is well established. Upon progression on first-line therapy, patients may be switched to lapatinib. Others, however, remain candidates for continued antibody treatment (treatment beyond progression). Here, we aimed to identify factors predicting for activity of second-line trastuzumab-based therapy. Methods Ninety-seven patients treated with >1 line of trastuzumab-containing therapy were available for this analysis. Her2 status was determined by immunohistochemistry and re-analyzed by FISH if a score of 2+ was gained. Time to progression (TTP) on second-line therapy was defined as the primary study endpoint. TTP and overall survival (OS) were estimated using the Kaplan-Meier product limit method. Multivariate analyses (Cox proportional hazards model, multinomial logistic regression) were applied in order to identify factors associated with TTP, response, OS, and incidence of brain metastases. p values ... Results Median TTP on second-line trastuzumab-based therapy was 7 months (95% CI 5.74-8.26) and 8 months (95% CI 6.25-9.74) on first-line, respectively (n.s.). In the multivariate models, none of the clinical or histopathological features could reliably predict for activity of second-line trastuzumab-based treatment. OS was 43 months, suggesting improved survival in patients treated with trastuzumab in multiple lines. A significant deterioration of cardiac function was observed in three patients; 40.2% developed brain metastases while on second-line trastuzumab or thereafter. Conclusion Trastuzumab beyond progression showed considerable activity. None of the variables investigated correlated with activity of second-line therapy. In order to predict activity of second-line trastuzumab, it appears necessary to evaluate factors known to confer trastuzumab resistance.

  1. Post-processing through linear regression

    Directory of Open Access Journals (Sweden)

    B. Van Schaeybroeck

    2011-03-01

    Full Text Available Various post-processing techniques are compared for both deterministic and ensemble forecasts, all based on linear regression between forecast data and observations. In order to evaluate the quality of the regression methods, three criteria are proposed, related to the effective correction of forecast error, the optimal variability of the corrected forecast and multicollinearity. The regression schemes under consideration include the ordinary least-square (OLS) method, a new time-dependent Tikhonov regularization (TDTR) method, the total least-square method, a new geometric-mean regression (GM), a recently introduced error-in-variables (EVMOS) method and, finally, a "best member" OLS method. The advantages and drawbacks of each method are clarified.

    These techniques are applied in the context of the Lorenz '63 system, whose model version is affected by both initial condition and model errors. For short forecast lead times, the number and choice of predictors play an important role. In contrast to the other techniques, GM degrades when the number of predictors increases. At intermediate lead times, linear regression is unable to provide corrections to the forecast and can sometimes degrade the performance (GM and the best member OLS with noise). At long lead times the regression schemes (EVMOS, TDTR) which yield the correct variability and the largest correlation between ensemble error and spread should be preferred.

  2. A comparison of random forest regression and multiple linear regression for prediction in neuroscience.

    Science.gov (United States)

    Smith, Paul F; Ganesh, Siva; Liu, Ping

    2013-10-30

    Regression is a common statistical tool for prediction in neuroscience. However, linear regression is by far the most common form of regression used, with regression trees receiving comparatively little attention. In this study, the results of conventional multiple linear regression (MLR) were compared with those of random forest regression (RFR), in the prediction of the concentrations of 9 neurochemicals in the vestibular nucleus complex and cerebellum that are part of the l-arginine biochemical pathway (agmatine, putrescine, spermidine, spermine, l-arginine, l-ornithine, l-citrulline, glutamate and γ-aminobutyric acid (GABA)). The R² values for the MLRs were higher than the proportion of variance explained values for the RFRs: 6/9 of them were ≥ 0.70 compared to 4/9 for RFRs. Even the variables that had the lowest R² values for the MLRs, e.g. ornithine (0.50) and glutamate (0.61), had much lower proportion of variance explained values for the RFRs (0.27 and 0.49, respectively). The RSE values for the MLRs were lower than those for the RFRs in all but two cases. In general, MLRs seemed to be superior to the RFRs in terms of predictive value and error. In the case of this data set, MLR appeared to be superior to RFR in terms of its explanatory value and error. This result suggests that MLR may have advantages over RFR for prediction in neuroscience with this kind of data set, but that RFR can still have good predictive value in some cases. Copyright © 2013 Elsevier B.V. All rights reserved.
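
    A minimal version of such a comparison can be run with off-the-shelf tools; the sketch below scores multiple linear regression and random forest regression by cross-validated R² on synthetic data (the neurochemical data themselves are not reproduced here, and the mostly linear signal is an assumption that favours MLR).

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

# Toy comparison in the spirit of the study: the same predictors are fed to
# multiple linear regression and to random forest regression, and both are
# scored by 5-fold cross-validated R^2. Data are synthetic.
rng = np.random.default_rng(2)
X = rng.normal(size=(120, 5))
y = 2.0 * X[:, 0] - 1.5 * X[:, 2] + 0.3 * X[:, 1] * X[:, 3] + rng.normal(scale=0.5, size=120)

for name, model in [("MLR", LinearRegression()),
                    ("RFR", RandomForestRegressor(n_estimators=300, random_state=0))]:
    scores = cross_val_score(model, X, y, cv=5, scoring="r2")
    print(f"{name}: mean cross-validated R^2 = {scores.mean():.2f}")
```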

  3. FURTHER CONSIDERATIONS ON SPREADSHEET-BASED AUTOMATIC TREND LINES

    Directory of Open Access Journals (Sweden)

    DANIEL HOMOCIANU

    2015-12-01

    Full Text Available Most of today's business applications that work with data sets allow exports to the spreadsheet format. This is related to the familiarity of common business users with such products and to the possibility of coupling what they have with something containing many models, functions and possibilities to process and represent data, thereby obtaining something dynamic and far more useful than a simple static report. The purpose of Business Intelligence is to identify clusters, profiles, association rules, decision trees and many other patterns or even behaviours, but also to generate alerts for exceptions, determine trends and make predictions about the future based on historical data. In this context, the paper shows some practical results obtained after testing both the automatic creation of scatter charts and trend lines corresponding to the user's preferences and the automatic suggestion of the most appropriate trend for the tested data, based mostly on a statistical measure of how close the data are to the regression function.
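
    The core of such automatic trend suggestion can be sketched outside a spreadsheet as well: fit a small set of candidate trend models and report the one closest to the data by R². The candidate set, the data and the use of R² as the closeness measure below are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def r_squared(y, y_hat):
    # Coefficient of determination of a fitted trend.
    ss_res = np.sum((y - y_hat) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

def fit_trends(x, y):
    # Fit several candidate trend models and return their R^2 scores.
    fits = {}
    b1, b0 = np.polyfit(x, y, 1)
    fits["linear"] = r_squared(y, b0 + b1 * x)
    c2, c1, c0 = np.polyfit(x, y, 2)
    fits["quadratic"] = r_squared(y, c0 + c1 * x + c2 * x ** 2)
    if np.all(y > 0):                       # exponential trend requires positive y
        e1, e0 = np.polyfit(x, np.log(y), 1)
        fits["exponential"] = r_squared(y, np.exp(e0) * np.exp(e1 * x))
    if np.all(x > 0):                       # logarithmic trend requires positive x
        l1, l0 = np.polyfit(np.log(x), y, 1)
        fits["logarithmic"] = r_squared(y, l0 + l1 * np.log(x))
    return fits

x = np.arange(1, 25, dtype=float)
y = 3.0 * np.exp(0.15 * x) * np.random.default_rng(3).lognormal(0, 0.05, size=x.size)
scores = fit_trends(x, y)
print("suggested trend:", max(scores, key=scores.get), scores)
```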

  4. Optimization of antibody immobilization for on-line or off-line immunoaffinity chromatography

    DEFF Research Database (Denmark)

    Beyer, Natascha Helena; Schou, Christian; Højrup, Peter

    2009-01-01

    A systematic study was conducted to determine the most versatile antibody immobilization method for use in on-line and off-line IA chromatography applications using commonly accessible immobilization methods. Four chemistries were examined using polyclonal and monoclonal antibodies and antibody fragments. We ... -POROS. Protein G-based matrices are very stable, showing essentially no decline in performance after 50 application-wash-elution-reequilibration cycles, and are easily prepared within 2-3 h of working time with a typical antibody coupling yield of above 80%. In off-line applications where constant flow ...

  5. Covariant electromagnetic field lines

    Science.gov (United States)

    Hadad, Y.; Cohen, E.; Kaminer, I.; Elitzur, A. C.

    2017-08-01

    Faraday introduced electric field lines as a powerful tool for understanding the electric force, and these field lines are still used today in classrooms and textbooks teaching the basics of electromagnetism within the electrostatic limit. However, despite attempts at generalizing this concept beyond the electrostatic limit, such a fully relativistic field line theory still appears to be missing. In this work, we propose such a theory and define covariant electromagnetic field lines that naturally extend electric field lines to relativistic systems and general electromagnetic fields. We derive a closed-form formula for the field lines curvature in the vicinity of a charge, and show that it is related to the world line of the charge. This demonstrates how the kinematics of a charge can be derived from the geometry of the electromagnetic field lines. Such a theory may also provide new tools in modeling and analyzing electromagnetic phenomena, and may entail new insights regarding long-standing problems such as radiation-reaction and self-force. In particular, the electromagnetic field lines curvature has the attractive property of being non-singular everywhere, thus eliminating all self-field singularities without using renormalization techniques.

  6. Use of probabilistic weights to enhance linear regression myoelectric control

    Science.gov (United States)

    Smith, Lauren H.; Kuiken, Todd A.; Hargrove, Levi J.

    2015-12-01

    Objective. Clinically available prostheses for transradial amputees do not allow simultaneous myoelectric control of degrees of freedom (DOFs). Linear regression methods can provide simultaneous myoelectric control, but frequently also result in difficulty with isolating individual DOFs when desired. This study evaluated the potential of using probabilistic estimates of categories of gross prosthesis movement, which are commonly used in classification-based myoelectric control, to enhance linear regression myoelectric control. Approach. Gaussian models were fit to electromyogram (EMG) feature distributions for three movement classes at each DOF (no movement, or movement in either direction) and used to weight the output of linear regression models by the probability that the user intended the movement. Eight able-bodied and two transradial amputee subjects worked in a virtual Fitts’ law task to evaluate differences in controllability between linear regression and probability-weighted regression for an intramuscular EMG-based three-DOF wrist and hand system. Main results. Real-time and offline analyses in able-bodied subjects demonstrated that probability weighting improved performance during single-DOF tasks (p < 0.05) by preventing extraneous movement at additional DOFs. Similar results were seen in experiments with two transradial amputees. Though goodness-of-fit evaluations suggested that the EMG feature distributions showed some deviations from the Gaussian, equal-covariance assumptions used in this experiment, the assumptions were sufficiently met to provide improved performance compared to linear regression control. Significance. Use of probability weights can improve the ability to isolate individual DOFs during linear regression myoelectric control, while maintaining the ability to simultaneously control multiple DOFs.
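
    The sketch below illustrates the central idea for a single DOF: the output of a linear regression is scaled by the probability, from Gaussian class models of the same features, that movement is intended. The feature dimensionality, class parameters and regression weights are all invented for illustration and do not reflect the study's EMG processing.

```python
import numpy as np
from scipy.stats import multivariate_normal

# Conceptual sketch of probability-weighted regression for one DOF: a linear
# regression maps EMG-like features to a velocity, and a Gaussian class model
# of the same features estimates P(movement intended); their product suppresses
# unintended drift when the features look like "rest".
rng = np.random.default_rng(4)

# Pretend training data: 2-D features for a "rest" class and a "move" class.
rest = rng.normal(loc=[0.1, 0.1], scale=0.05, size=(200, 2))
move = rng.normal(loc=[0.6, 0.5], scale=0.10, size=(200, 2))

# Gaussian class models with a shared covariance (LDA-style assumption).
cov = np.cov(np.vstack([rest - rest.mean(0), move - move.mean(0)]).T)
models = {"rest": multivariate_normal(rest.mean(0), cov),
          "move": multivariate_normal(move.mean(0), cov)}

# A hypothetical linear regression already mapping features to velocity.
w = np.array([1.2, 0.8])
b = -0.15

def weighted_velocity(x):
    raw = x @ w + b
    likelihoods = {k: m.pdf(x) for k, m in models.items()}
    p_move = likelihoods["move"] / sum(likelihoods.values())
    return raw * p_move          # regression output scaled by P(movement intended)

for x in [np.array([0.1, 0.12]), np.array([0.55, 0.48])]:
    print(x, "->", round(float(weighted_velocity(x)), 3))
```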

  7. Determining factors influencing survival of breast cancer by fuzzy logistic regression model.

    Science.gov (United States)

    Nikbakht, Roya; Bahrampour, Abbas

    2017-01-01

    Fuzzy logistic regression models can be used for determining the influential factors of a disease. This study explores the factors that actually predict survival of breast cancer patients. We used breast cancer data collected by the cancer registry of Kerman University of Medical Sciences during the period 2000-2007. Variables such as morphology, grade, age, and treatments (surgery, radiotherapy, and chemotherapy) were applied in the fuzzy logistic regression model. Performance of the model was determined in terms of the mean degree of membership (MDM). The study results showed that almost 41% of patients were in the neoplasm and malignant group and more than two-thirds of them were still alive after 5 years of follow-up. Based on the fuzzy logistic model, the most important factors influencing survival were chemotherapy, morphology, and radiotherapy, respectively. Furthermore, the MDM criterion shows that the fuzzy logistic regression has a good fit to the data (MDM = 0.86). The fuzzy logistic regression model showed that chemotherapy is more important than radiotherapy for survival of patients with breast cancer. In addition, another ability of this model is calculating the possibilistic odds of survival in cancer patients. The results of this study can be applied in clinical research. Moreover, since few studies have applied fuzzy logistic models, we recommend using this model in various research areas.

  8. A Seemingly Unrelated Poisson Regression Model

    OpenAIRE

    King, Gary

    1989-01-01

    This article introduces a new estimator for the analysis of two contemporaneously correlated endogenous event count variables. This seemingly unrelated Poisson regression model (SUPREME) estimator combines the efficiencies created by single equation Poisson regression model estimators and insights from "seemingly unrelated" linear regression models.

  9. Lip line preference for variant face types.

    Science.gov (United States)

    Anwar, Nabila; Fida, Mubassar

    2012-06-01

    To determine the effect of altered lip line on attractiveness and to find the preferred lip line for vertical face types in both genders. Cross-sectional analytical study. The Aga Khan University Hospital, Karachi, from May to July 2009. Photographs of two selected subjects were altered to produce three face types for the same individual with the aim of keeping the frame of the smile constant. The lip line was then altered for both subjects as: both dentitions visible, upper incisors visible, upper incisors with 2 mm of gum visible, and upper incisors with 4 mm of gum visible. The pictures were rated by different professionals for attractiveness. Descriptive statistics for the raters and multiple-factor ANOVA were used to find the most attractive lip line. The total number of raters was 100, with a mean age of 30.3 ± 8 years. The alterations in the smile parameters produced statistically significant differences in the attractiveness of faces, whereas the difference in perception was found to be insignificant among raters of different professions. The preferred lip line was the one showing only the upper incisors in dolicho- and mesofacial subjects of both genders, whereas a 2 mm gum show was preferred in brachyfacial subjects. The variability in lip line showed a significant difference in perceived attractiveness; the preferred lip line was the one showing only the upper incisors in dolicho- and mesofacial male and female subjects, whereas a 2 mm gum show was preferred in brachyfacial subjects.

  10. Multiple linear regression and regression with time series error models in forecasting PM10 concentrations in Peninsular Malaysia.

    Science.gov (United States)

    Ng, Kar Yong; Awang, Norhashidah

    2018-01-06

    Frequent haze occurrences in Malaysia have made the management of PM10 (particulate matter with aerodynamic diameter less than 10 μm) pollution a critical task. This requires knowledge of the factors associated with PM10 variation and good forecasts of PM10 concentrations. Hence, this paper demonstrates the prediction of 1-day-ahead daily average PM10 concentrations based on predictor variables including meteorological parameters and gaseous pollutants. Three different models were built: a multiple linear regression (MLR) model with lagged predictor variables (MLR1), an MLR model with lagged predictor variables and PM10 concentrations (MLR2), and a regression with time series error (RTSE) model. The findings revealed that humidity, temperature, wind speed, wind direction, carbon monoxide and ozone were the main factors explaining the PM10 variation in Peninsular Malaysia. Comparison among the three models showed that the MLR2 model was on a par with the RTSE model in terms of forecasting accuracy, while the MLR1 model was the worst.
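
    A bare-bones version of the MLR2-style setup (today's predictors plus today's PM10 as regressors for tomorrow's PM10) looks like the sketch below; the variable names and synthetic series are placeholders rather than the Malaysian monitoring data.

```python
import numpy as np
import pandas as pd

# Sketch of a lagged-predictor multiple linear regression: regress next-day
# PM10 on today's meteorological/gas values and today's PM10. Synthetic data.
rng = np.random.default_rng(5)
n = 400
df = pd.DataFrame({
    "pm10": 50 + 0.1 * rng.normal(scale=10, size=n).cumsum(),
    "temperature": 28 + rng.normal(scale=2, size=n),
    "wind_speed": 2 + rng.gamma(2.0, 1.0, size=n),
    "co": 0.8 + rng.normal(scale=0.2, size=n),
})
df["pm10_next"] = df["pm10"].shift(-1)            # 1-day-ahead target
data = df.dropna()

predictors = ["pm10", "temperature", "wind_speed", "co"]   # today's (lagged) values
X = np.column_stack([np.ones(len(data)), data[predictors].to_numpy()])
beta, *_ = np.linalg.lstsq(X, data["pm10_next"].to_numpy(), rcond=None)
print(dict(zip(["intercept"] + predictors, np.round(beta, 3))))
```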

  11. Recursive Algorithm For Linear Regression

    Science.gov (United States)

    Varanasi, S. V.

    1988-01-01

    Order of model determined easily. Linear-regression algorithm includes recursive equations for coefficients of model of increased order. Algorithm eliminates duplicative calculations and facilitates search for the minimum order of linear-regression model that fits a set of data satisfactorily.
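
    The brief does not reproduce the recursive equations themselves. As a loosely related sketch, the code below searches for the minimum adequate order by refitting polynomial regressions of increasing degree and stopping when the residual sum of squares stops improving; this reproduces the order-selection idea but, unlike the recursive algorithm described above, it does repeat calculations at each order.

```python
import numpy as np

def minimum_order(x, y, max_order=8, tol=0.05):
    """Return the smallest polynomial order whose fit is not meaningfully
    improved by the next-higher order (relative RSS improvement below tol)."""
    prev_rss = np.inf
    for order in range(1, max_order + 1):
        coeffs = np.polyfit(x, y, order)
        rss = np.sum((np.polyval(coeffs, x) - y) ** 2)
        if prev_rss < np.inf and (prev_rss - rss) / prev_rss < tol:
            return order - 1, np.polyfit(x, y, order - 1)
        prev_rss = rss
    return max_order, coeffs

rng = np.random.default_rng(6)
x = np.linspace(-2, 2, 80)
y = 1.0 + 0.5 * x - 2.0 * x ** 2 + rng.normal(scale=0.3, size=x.size)  # true order: 2
order, coeffs = minimum_order(x, y)
print("selected order:", order)
```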

  12. Applied regression analysis a research tool

    CERN Document Server

    Pantula, Sastry; Dickey, David

    1998-01-01

    Least squares estimation, when used appropriately, is a powerful research tool. A deeper understanding of the regression concepts is essential for achieving optimal benefits from a least squares analysis. This book builds on the fundamentals of statistical methods and provides appropriate concepts that will allow a scientist to use least squares as an effective research tool. Applied Regression Analysis is aimed at the scientist who wishes to gain a working knowledge of regression analysis. The basic purpose of this book is to develop an understanding of least squares and related statistical methods without becoming excessively mathematical. It is the outgrowth of more than 30 years of consulting experience with scientists and many years of teaching an applied regression course to graduate students. Applied Regression Analysis serves as an excellent text for a service course on regression for non-statisticians and as a reference for researchers. It also provides a bridge between a two-semester introduction to...

  13. Quantum algorithm for linear regression

    Science.gov (United States)

    Wang, Guoming

    2017-07-01

    We present a quantum algorithm for fitting a linear regression model to a given data set using the least-squares approach. Differently from previous algorithms which yield a quantum state encoding the optimal parameters, our algorithm outputs these numbers in the classical form. So by running it once, one completely determines the fitted model and then can use it to make predictions on new data at little cost. Moreover, our algorithm works in the standard oracle model, and can handle data sets with nonsparse design matrices. It runs in time poly(log₂(N), d, κ, 1/ε), where N is the size of the data set, d is the number of adjustable parameters, κ is the condition number of the design matrix, and ε is the desired precision in the output. We also show that the polynomial dependence on d and κ is necessary. Thus, our algorithm cannot be significantly improved. Furthermore, we also give a quantum algorithm that estimates the quality of the least-squares fit (without computing its parameters explicitly). This algorithm runs faster than the one for finding this fit, and can be used to check whether the given data set qualifies for linear regression in the first place.

  14. On the estimation and testing of predictive panel regressions

    NARCIS (Netherlands)

    Karabiyik, H.; Westerlund, Joakim; Narayan, Paresh

    2016-01-01

    Hjalmarsson (2010) considers an OLS-based estimator of predictive panel regressions that is argued to be mixed normal under very general conditions. In a recent paper, Westerlund et al. (2016) show that while consistent, the estimator is generally not mixed normal, which invalidates standard normal

  15. Linear Regression on Sparse Features for Single-Channel Speech Separation

    DEFF Research Database (Denmark)

    Schmidt, Mikkel N.; Olsson, Rasmus Kongsgaard

    2007-01-01

    In this work we address the problem of separating multiple speakers from a single microphone recording. We formulate a linear regression model for estimating each speaker based on features derived from the mixture. The employed feature representation is a sparse, non-negative encoding of the speech mixture in terms of pre-learned speaker-dependent dictionaries. Previous work has shown that this feature representation by itself provides some degree of separation. We show that the performance is significantly improved when regression analysis is performed on the sparse, non-negative features, both...

  16. The study of logistic regression of risk factor on the death cause of uranium miners

    International Nuclear Information System (INIS)

    Wen Jinai; Yuan Liyun; Jiang Ruyi

    1999-01-01

    The logistic regression model has been widely used in the field of medicine. Computer software for this model is popular, but it is worth discussing how to use the model correctly. Using SPSS (Statistical Package for the Social Sciences) software, the unconditional logistic regression method was adopted to carry out multi-factor analyses of the causes of total death, cancer death and lung cancer death of uranium miners. The data are from the radioepidemiological database of one uranium mine. The results show that attained age is a risk factor in the logistic regression analyses of total death, cancer death and lung cancer death. In the logistic regression analysis of cancer death, there is a negative correlation between age at exposure and cancer death, which shows that the younger the age at exposure, the bigger the risk of cancer death. In the logistic regression analysis of lung cancer death, there is a positive correlation between cumulative exposure and lung cancer death, which shows that cumulative exposure is the most important risk factor for lung cancer death among uranium miners. It has been documented by many foreign reports that the lung cancer death rate is higher in uranium miners.
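
    For readers who want to reproduce this kind of analysis outside SPSS, the hedged sketch below fits an unconditional logistic regression of a death indicator on simulated risk factors; the variable names (attained_age, age_at_exposure, cumulative_exposure) and the data are placeholders, not the uranium-miner registry.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Sketch of an unconditional (ordinary) logistic regression of death on risk
# factors, analogous to the analysis described above. Data are simulated.
rng = np.random.default_rng(7)
n = 1000
df = pd.DataFrame({
    "attained_age": rng.normal(60, 10, n),
    "age_at_exposure": rng.normal(25, 5, n),
    "cumulative_exposure": rng.gamma(2.0, 50.0, n),
})
# Simulate an outcome that depends on attained age and cumulative exposure.
logit_p = -8 + 0.08 * df["attained_age"] + 0.004 * df["cumulative_exposure"]
df["lung_cancer_death"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

X = sm.add_constant(df[["attained_age", "age_at_exposure", "cumulative_exposure"]])
fit = sm.Logit(df["lung_cancer_death"], X).fit(disp=False)
print(fit.summary2().tables[1])               # coefficients, standard errors, p-values
print("odds ratios:", np.exp(fit.params).round(3))
```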

  17. Standards for Standardized Logistic Regression Coefficients

    Science.gov (United States)

    Menard, Scott

    2011-01-01

    Standardized coefficients in logistic regression analysis have the same utility as standardized coefficients in linear regression analysis. Although there has been no consensus on the best way to construct standardized logistic regression coefficients, there is now sufficient evidence to suggest a single best approach to the construction of a…
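
    Since, as noted above, there is no single agreed-upon standardization, the sketch below shows just one common convention: multiplying each slope by its predictor's standard deviation so that coefficients are comparable on a per-standard-deviation scale. The data are synthetic and the convention is an assumption, not necessarily the approach the article recommends.

```python
import numpy as np
import statsmodels.api as sm

# One common (not the only) way to standardize logistic regression coefficients:
# scale each raw slope by the standard deviation of its predictor, putting
# coefficients on a "per one SD of X" scale. Predictors below differ in scale.
rng = np.random.default_rng(8)
X = rng.normal(size=(500, 2)) * [1.0, 10.0]            # very different predictor scales
p = 1 / (1 + np.exp(-(0.8 * X[:, 0] + 0.08 * X[:, 1])))
y = (rng.random(500) < p).astype(int)

fit = sm.Logit(y, sm.add_constant(X)).fit(disp=False)
raw = fit.params[1:]                                   # drop the intercept
standardized = raw * X.std(axis=0)
print("raw coefficients:            ", np.round(raw, 3))
print("SD-standardized coefficients:", np.round(standardized, 3))
```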

  18. Logistic regression for dichotomized counts.

    Science.gov (United States)

    Preisser, John S; Das, Kalyan; Benecha, Habtamu; Stamm, John W

    2016-12-01

    Sometimes there is interest in a dichotomized outcome indicating whether a count variable is positive or zero. Under this scenario, the application of ordinary logistic regression may result in efficiency loss, which is quantifiable under an assumed model for the counts. In such situations, a shared-parameter hurdle model is investigated for more efficient estimation of regression parameters relating to overall effects of covariates on the dichotomous outcome, while handling count data with many zeroes. One model part provides a logistic regression containing marginal log odds ratio effects of primary interest, while an ancillary model part describes the mean count of a Poisson or negative binomial process in terms of nuisance regression parameters. Asymptotic efficiency of the logistic model parameter estimators of the two-part models is evaluated with respect to ordinary logistic regression. Simulations are used to assess the properties of the models with respect to power and Type I error, the latter investigated under both misspecified and correctly specified models. The methods are applied to data from a randomized clinical trial of three toothpaste formulations to prevent incident dental caries in a large population of Scottish schoolchildren. © The Author(s) 2014.

  19. Children's cognitive representation of the mathematical number line.

    Science.gov (United States)

    Rouder, Jeffrey N; Geary, David C

    2014-07-01

    Learning of the mathematical number line has been hypothesized to be dependent on an inherent sense of approximate quantity. Children's number line placements are predicted to conform to the underlying properties of this system; specifically, placements are exaggerated for small numerals and compressed for larger ones. Alternative hypotheses are based on proportional reasoning; specifically, numerals are placed relative to set anchors such as end points on the line. Traditional testing of these alternatives involves fitting group medians to corresponding regression models which assumes homogenous residuals and thus does not capture useful information from between- and within-child variation in placements across the number line. To more fully assess differential predictions, we developed a novel set of hierarchical statistical models that enable the simultaneous estimation of mean levels of and variation in performance, as well as developmental transitions. Using these techniques we fitted the number line placements of 224 children longitudinally assessed from first to fifth grade, inclusive. The compression pattern was evident in mean performance in first grade, but was the best fit for only 20% of first graders when the full range of variation in the data are modeled. Most first graders' placements suggested use of end points, consistent with proportional reasoning. Developmental transition involved incorporation of a mid-point anchor, consistent with a modified proportional reasoning strategy. The methodology introduced here enables a more nuanced assessment of children's number line representation and learning than any previous approaches and indicates that developmental improvement largely results from midpoint segmentation of the line. © 2014 John Wiley & Sons Ltd.

  20. Treating experimental data of inverse kinetic method by unitary linear regression analysis

    International Nuclear Information System (INIS)

    Zhao Yusen; Chen Xiaoliang

    2009-01-01

    The theory of treating experimental data from the inverse kinetic method by unitary linear regression analysis is described. Not only the reactivity, but also the effective neutron source intensity, can be calculated by this method. A computer code was compiled based on the inverse kinetic method and unitary linear regression analysis. The data from the zero-power facility BFS-1 in Russia were processed and the results were compared. The results show that the reactivity and the effective neutron source intensity can be obtained correctly by treating experimental data from the inverse kinetic method using unitary linear regression analysis, and that the precision of reactivity measurement is improved. The central element efficiency can be calculated from the reactivity. The results also show that the effect on reactivity measurement caused by an external neutron source should be considered when the reactor power is low and the intensity of the external neutron source is strong. (authors)

  1. Mechanisms of neuroblastoma regression

    Science.gov (United States)

    Brodeur, Garrett M.; Bagatell, Rochelle

    2014-01-01

    Recent genomic and biological studies of neuroblastoma have shed light on the dramatic heterogeneity in the clinical behaviour of this disease, which spans from spontaneous regression or differentiation in some patients, to relentless disease progression in others, despite intensive multimodality therapy. This evidence also suggests several possible mechanisms to explain the phenomena of spontaneous regression in neuroblastomas, including neurotrophin deprivation, humoral or cellular immunity, loss of telomerase activity and alterations in epigenetic regulation. A better understanding of the mechanisms of spontaneous regression might help to identify optimal therapeutic approaches for patients with these tumours. Currently, the most druggable mechanism is the delayed activation of developmentally programmed cell death regulated by the tropomyosin receptor kinase A pathway. Indeed, targeted therapy aimed at inhibiting neurotrophin receptors might be used in lieu of conventional chemotherapy or radiation in infants with biologically favourable tumours that require treatment. Alternative approaches consist of breaking immune tolerance to tumour antigens or activating neurotrophin receptor pathways to induce neuronal differentiation. These approaches are likely to be most effective against biologically favourable tumours, but they might also provide insights into treatment of biologically unfavourable tumours. We describe the different mechanisms of spontaneous neuroblastoma regression and the consequent therapeutic approaches. PMID:25331179

  2. The characteristics of high-yield genotype of early-mature mutant lines in barley

    International Nuclear Information System (INIS)

    Chen Xiulan; Han Yuepeng; He Zhentian; Yang Hefeng

    2000-01-01

    The correlation and genetic parameters of eight agronomic traits of 36 early-maturing mutant lines induced from the barley Sunong 9052 were studied by stepwise regression and path analysis. The results showed that: (1) the growing period of the early mutants was shortened by 2-13 days relative to their parent, and the yield trait had a great mutation range; (2) the number of grains per panicle was significantly correlated with the days from sowing to heading; (3) according to the direct path coefficients, the main characters related to individual plant yield were, in order, productive panicles per plant > 1000-grain weight > number of grains per panicle > fertility; the high-yield genotype had more productive panicles and higher 1000-grain weight, and to increase yield in early-maturity mutation breeding one should select lines with more tillers and productive panicles, higher 1000-grain weight and a lower number of grains per panicle; (4) the highest broad-sense heritability and genetic variation coefficient were found in 1000-grain weight and the days from sowing to heading

  3. Using the Ridge Regression Procedures to Estimate the Multiple Linear Regression Coefficients

    Science.gov (United States)

    Gorgees, HazimMansoor; Mahdi, FatimahAssim

    2018-05-01

    This article is concerned with comparing the performance of different types of ordinary ridge regression estimators that have already been proposed to estimate the regression parameters when near-exact linear relationships among the explanatory variables are present. For this situation we employ data obtained from the tagi gas filling company during the period 2008-2010. The main result we reached is that the method based on the condition number performs better than the other methods, since it has a smaller mean square error (MSE) than the other stated methods.

  4. Multicollinearity and Regression Analysis

    Science.gov (United States)

    Daoud, Jamal I.

    2017-12-01

    In regression analysis it is expected to have a correlation between the response and the predictor(s), but having correlation among the predictors is undesirable. The number of predictors included in the regression model depends on many factors, among them historical data, experience, etc. In the end, the selection of the most important predictors is up to the researcher. Multicollinearity is a phenomenon in which two or more predictors are correlated; if this happens, the standard errors of the coefficients will increase [8]. Increased standard errors mean that the coefficients for some or all independent variables may not be found to be significantly different from zero. In other words, by overinflating the standard errors, multicollinearity makes some variables statistically insignificant when they should be significant. In this paper we focus on multicollinearity, its reasons and its consequences for the reliability of the regression model.
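
    A standard way to quantify the problem described here is the variance inflation factor (VIF); the sketch below computes VIFs for synthetic predictors in which one variable is nearly a linear combination of the others. The data and the rough 5-10 warning threshold are illustrative conventions, not taken from the paper.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

# Diagnosing multicollinearity with variance inflation factors: values far above
# roughly 5-10 are a conventional warning sign. Here x3 is almost a linear
# combination of x1 and x2, so its VIF (and theirs) should be inflated.
rng = np.random.default_rng(9)
x1 = rng.normal(size=200)
x2 = rng.normal(size=200)
x3 = 0.7 * x1 + 0.3 * x2 + rng.normal(scale=0.05, size=200)
X = sm.add_constant(pd.DataFrame({"x1": x1, "x2": x2, "x3": x3}))

for i, name in enumerate(X.columns):
    if name != "const":
        print(name, "VIF =", round(variance_inflation_factor(X.values, i), 1))
```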

  5. Panel Smooth Transition Regression Models

    DEFF Research Database (Denmark)

    González, Andrés; Terasvirta, Timo; Dijk, Dick van

    We introduce the panel smooth transition regression model. This new model is intended for characterizing heterogeneous panels, allowing the regression coefficients to vary both across individuals and over time. Specifically, heterogeneity is allowed for by assuming that these coefficients are bou...

  6. Credit Scoring Problem Based on Regression Analysis

    OpenAIRE

    Khassawneh, Bashar Suhil Jad Allah

    2014-01-01

    ABSTRACT: This thesis provides an explanatory introduction to the regression models of data mining and contains basic definitions of key terms in the linear, multiple and logistic regression models. Meanwhile, the aim of this study is to illustrate fitting models for the credit scoring problem using simple linear, multiple linear and logistic regression models and also to analyze the found model functions by statistical tools. Keywords: Data mining, linear regression, logistic regression....

  7. Conjoined legs: Sirenomelia or caudal regression syndrome?

    Directory of Open Access Journals (Sweden)

    Sakti Prasad Das

    2013-01-01

    Full Text Available Presence of a single umbilical persistent vitelline artery distinguishes sirenomelia from caudal regression syndrome. We report a case of a 12-year-old boy who had bilateral umbilical arteries and presented with fusion of both legs in the lower one-third of the leg. Both feet were rudimentary. The right foot had a valgus rocker-bottom deformity. All toes were present but rudimentary. The left foot showed absence of all toes. Physical examination showed left tibia vara. Chest evaluation in sitting revealed pigeon chest and an elevated right shoulder. Posterior examination of the trunk showed thoracic scoliosis with convexity to the right. The patient was operated on, and at 1-year followup the boy had two separate legs with good aesthetic and functional results.

  8. Conjoined legs: Sirenomelia or caudal regression syndrome?

    Science.gov (United States)

    Das, Sakti Prasad; Ojha, Niranjan; Ganesh, G Shankar; Mohanty, Ram Narayan

    2013-07-01

    Presence of a single umbilical persistent vitelline artery distinguishes sirenomelia from caudal regression syndrome. We report a case of a 12-year-old boy who had bilateral umbilical arteries and presented with fusion of both legs in the lower one-third of the leg. Both feet were rudimentary. The right foot had a valgus rocker-bottom deformity. All toes were present but rudimentary. The left foot showed absence of all toes. Physical examination showed left tibia vara. Chest evaluation in sitting revealed pigeon chest and an elevated right shoulder. Posterior examination of the trunk showed thoracic scoliosis with convexity to the right. The patient was operated on, and at 1-year followup the boy had two separate legs with good aesthetic and functional results.

  9. Fuzzy multinomial logistic regression analysis: A multi-objective programming approach

    Science.gov (United States)

    Abdalla, Hesham A.; El-Sayed, Amany A.; Hamed, Ramadan

    2017-05-01

    Parameter estimation for multinomial logistic regression is usually based on maximizing the likelihood function. For large, well-balanced datasets, Maximum Likelihood (ML) estimation is a satisfactory approach. Unfortunately, ML can fail completely or at least produce poor results in terms of estimated probabilities and confidence intervals of parameters, especially for small datasets. In this study, a new approach based on fuzzy concepts is proposed to estimate the parameters of multinomial logistic regression. The study assumes that the parameters of multinomial logistic regression are fuzzy. Based on the extension principle stated by Zadeh and Bárdossy's proposition, a multi-objective programming approach is suggested to estimate these fuzzy parameters. A simulation study is used to evaluate the performance of the new approach versus the Maximum Likelihood (ML) approach. Results show that the newly proposed model outperforms ML in cases of small datasets.

  10. Autistic Regression

    Science.gov (United States)

    Matson, Johnny L.; Kozlowski, Alison M.

    2010-01-01

    Autistic regression is one of the many mysteries in the developmental course of autism and pervasive developmental disorders not otherwise specified (PDD-NOS). Various definitions of this phenomenon have been used, further clouding the study of the topic. Despite this problem, some efforts at establishing prevalence have been made. The purpose of…

  11. Ridge regression estimator: combining unbiased and ordinary ridge regression methods of estimation

    Directory of Open Access Journals (Sweden)

    Sharad Damodar Gore

    2009-10-01

    Full Text Available Statistical literature has several methods for coping with multicollinearity. This paper introduces a new shrinkage estimator, called modified unbiased ridge (MUR). This estimator is obtained from unbiased ridge regression (URR) in the same way that ordinary ridge regression (ORR) is obtained from ordinary least squares (OLS). Properties of MUR are derived. Results on its matrix mean squared error (MMSE) are obtained. MUR is compared with ORR and URR in terms of MMSE. These results are illustrated with an example based on data generated by Hoerl and Kennard (1975).

  12. Discriminative Elastic-Net Regularized Linear Regression.

    Science.gov (United States)

    Zhang, Zheng; Lai, Zhihui; Xu, Yong; Shao, Ling; Wu, Jian; Xie, Guo-Sen

    2017-03-01

    In this paper, we aim at learning compact and discriminative linear regression models. Linear regression has been widely used in different problems. However, most of the existing linear regression methods exploit the conventional zero-one matrix as the regression targets, which greatly narrows the flexibility of the regression model. Another major limitation of these methods is that the learned projection matrix fails to precisely project the image features to the target space due to their weak discriminative capability. To this end, we present an elastic-net regularized linear regression (ENLR) framework, and develop two robust linear regression models which possess the following special characteristics. First, our methods exploit two particular strategies to enlarge the margins of different classes by relaxing the strict binary targets into a more feasible variable matrix. Second, a robust elastic-net regularization of singular values is introduced to enhance the compactness and effectiveness of the learned projection matrix. Third, the resulting optimization problem of ENLR has a closed-form solution in each iteration, which can be solved efficiently. Finally, rather than directly exploiting the projection matrix for recognition, our methods employ the transformed features as the new discriminative representations for final image classification. Compared with the traditional linear regression model and some of its variants, our method is much more accurate in image classification. Extensive experiments conducted on publicly available data sets demonstrate that the proposed framework can outperform the state-of-the-art methods. The MATLAB code of our methods is available at http://www.yongxu.org/lunwen.html.
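
    The paper's ENLR solver involves its own target relaxation and closed-form updates, which are not reproduced here. As a simpler, hedged stand-in, the sketch below uses an off-the-shelf elastic-net regression to show the sparsity-plus-shrinkage behaviour that the elastic-net penalty provides; the data, alpha and l1_ratio values are illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import ElasticNet, LinearRegression

# Generic elastic-net regression (not the paper's ENLR solver): the l1_ratio
# parameter mixes lasso-style sparsity with ridge-style shrinkage. Only the
# first three features of the synthetic data carry signal.
rng = np.random.default_rng(10)
X = rng.normal(size=(150, 20))
true_w = np.zeros(20)
true_w[:3] = [3.0, -2.0, 1.5]
y = X @ true_w + rng.normal(scale=0.5, size=150)

ols = LinearRegression().fit(X, y)
enet = ElasticNet(alpha=0.1, l1_ratio=0.5).fit(X, y)
print("nonzero OLS weights:        ", int(np.sum(np.abs(ols.coef_) > 1e-3)))
print("nonzero elastic-net weights:", int(np.sum(np.abs(enet.coef_) > 1e-3)))
```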

  13. Modeling oil production based on symbolic regression

    International Nuclear Information System (INIS)

    Yang, Guangfei; Li, Xianneng; Wang, Jianliang; Lian, Lian; Ma, Tieju

    2015-01-01

    Numerous models have been proposed to forecast the future trends of oil production and almost all of them are based on some predefined assumptions with various uncertainties. In this study, we propose a novel data-driven approach that uses symbolic regression to model oil production. We validate our approach on both synthetic and real data, and the results prove that symbolic regression could effectively identify the true models beneath the oil production data and also make reliable predictions. Symbolic regression indicates that world oil production will peak in 2021, which broadly agrees with other techniques used by researchers. Our results also show that the rate of decline after the peak is almost half the rate of increase before the peak, and it takes nearly 12 years to drop 4% from the peak. These predictions are more optimistic than those in several other reports, and the smoother decline will provide the world, especially the developing countries, with more time to orchestrate mitigation plans. -- Highlights: •A data-driven approach has been shown to be effective at modeling the oil production. •The Hubbert model could be discovered automatically from data. •The peak of world oil production is predicted to appear in 2021. •The decline rate after peak is half of the increase rate before peak. •Oil production projected to decline 4% post-peak
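
    Symbolic regression searches over functional forms automatically; the sketch below does something much more modest, fitting one fixed candidate form (a Hubbert-type logistic-derivative curve, often used for oil production) to synthetic data with nonlinear least squares, to illustrate the kind of model such a search can recover. The parameter names (U for ultimately recoverable amount, k for steepness, t0 for peak year) and all numeric values are invented.

```python
import numpy as np
from scipy.optimize import curve_fit

def hubbert(t, U, k, t0):
    # Hubbert curve: the derivative of a logistic cumulative-production curve.
    e = np.exp(-k * (t - t0))
    return U * k * e / (1 + e) ** 2

# Synthetic "production" series with multiplicative noise, then a nonlinear
# least-squares fit to recover the peak year and the ultimately recoverable amount.
rng = np.random.default_rng(11)
years = np.arange(1950, 2015, dtype=float)
production = hubbert(years, 2000.0, 0.06, 2005.0) * rng.lognormal(0, 0.03, years.size)

params, _ = curve_fit(hubbert, years, production, p0=(1500.0, 0.05, 2000.0))
print("estimated peak year: %.1f, estimated URR: %.0f" % (params[2], params[0]))
```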

  14. On logistic regression analysis of dichotomized responses.

    Science.gov (United States)

    Lu, Kaifeng

    2017-01-01

    We study the properties of the treatment effect estimate, in terms of the odds ratio at the study end point, from a logistic regression model adjusting for the baseline value when the underlying continuous repeated measurements follow a multivariate normal distribution. Compared with the analysis that does not adjust for the baseline value, the adjusted analysis produces a larger treatment effect as well as a larger standard error. However, the increase in standard error is more than offset by the increase in treatment effect, so that the adjusted analysis is more powerful than the unadjusted analysis for detecting the treatment effect. On the other hand, the true adjusted odds ratio implied by the normal distribution of the underlying continuous variable is a function of the baseline value and hence is unlikely to be adequately represented by a single value of adjusted odds ratio from the logistic regression model. In contrast, the risk difference function derived from the logistic regression model provides a reasonable approximation to the true risk difference function implied by the normal distribution of the underlying continuous variable over the range of the baseline distribution. We show that different metrics of treatment effect have similar statistical power when evaluated at the baseline mean. Copyright © 2016 John Wiley & Sons, Ltd.

  15. Small sample GEE estimation of regression parameters for longitudinal data.

    Science.gov (United States)

    Paul, Sudhir; Zhang, Xuemao

    2014-09-28

    Longitudinal (clustered) response data arise in many bio-statistical applications which, in general, cannot be assumed to be independent. Generalized estimating equation (GEE) is a widely used method to estimate marginal regression parameters for correlated responses. The advantage of the GEE is that the estimates of the regression parameters are asymptotically unbiased even if the correlation structure is misspecified, although their small sample properties are not known. In this paper, two bias adjusted GEE estimators of the regression parameters in longitudinal data are obtained when the number of subjects is small. One is based on a bias correction, and the other is based on a bias reduction. Simulations show that the performances of both the bias-corrected methods are similar in terms of bias, efficiency, coverage probability, average coverage length, impact of misspecification of correlation structure, and impact of cluster size on bias correction. Both these methods show superior properties over the GEE estimates for small samples. Further, analysis of data involving a small number of subjects also shows improvement in bias, MSE, standard error, and length of the confidence interval of the estimates by the two bias adjusted methods over the GEE estimates. For small to moderate sample sizes (N ≤50), either of the bias-corrected methods GEEBc and GEEBr can be used. However, the method GEEBc should be preferred over GEEBr, as the former is computationally easier. For large sample sizes, the GEE method can be used. Copyright © 2014 John Wiley & Sons, Ltd.

  16. Retrieving relevant factors with exploratory SEM and principal-covariate regression: A comparison.

    Science.gov (United States)

    Vervloet, Marlies; Van den Noortgate, Wim; Ceulemans, Eva

    2018-02-12

    Behavioral researchers often linearly regress a criterion on multiple predictors, aiming to gain insight into the relations between the criterion and predictors. Obtaining this insight from the ordinary least squares (OLS) regression solution may be troublesome, because OLS regression weights show only the effect of a predictor on top of the effects of other predictors. Moreover, when the number of predictors grows larger, it becomes likely that the predictors will be highly collinear, which makes the regression weights' estimates unstable (i.e., the "bouncing beta" problem). Among other procedures, dimension-reduction-based methods have been proposed for dealing with these problems. These methods yield insight into the data by reducing the predictors to a smaller number of summarizing variables and regressing the criterion on these summarizing variables. Two promising methods are principal-covariate regression (PCovR) and exploratory structural equation modeling (ESEM). Both simultaneously optimize reduction and prediction, but they are based on different frameworks. The resulting solutions have not yet been compared; it is thus unclear what the strengths and weaknesses are of both methods. In this article, we focus on the extents to which PCovR and ESEM are able to extract the factors that truly underlie the predictor scores and can predict a single criterion. The results of two simulation studies showed that for a typical behavioral dataset, ESEM (using the BIC for model selection) in this regard is successful more often than PCovR. Yet, in 93% of the datasets PCovR performed equally well, and in the case of 48 predictors, 100 observations, and large differences in the strengths of the factors, PCovR even outperformed ESEM.

  17. Categorical regression dose-response modeling

    Science.gov (United States)

    The goal of this training is to provide participants with training on the use of the U.S. EPA’s Categorical Regression software (CatReg) and its application to risk assessment. Categorical regression fits mathematical models to toxicity data that have been assigned ord...

  18. Abstract Expression Grammar Symbolic Regression

    Science.gov (United States)

    Korns, Michael F.

    This chapter examines the use of Abstract Expression Grammars to perform the entire Symbolic Regression process without the use of Genetic Programming per se. The techniques explored produce a symbolic regression engine which has absolutely no bloat, which allows total user control of the search space and output formulas, and which is faster and more accurate than the engines produced in our previous papers using Genetic Programming. The genome is an all-vector structure with four chromosomes plus additional epigenetic and constraint vectors, allowing total user control of the search space and the final output formulas. A combination of specialized compiler techniques, genetic algorithms, particle swarm, aged layered populations, plus discrete and continuous differential evolution are used to produce an improved symbolic regression system. Nine base test cases, from the literature, are used to test the improvement in speed and accuracy. The improved results indicate that these techniques move us a big step closer toward future industrial-strength symbolic regression systems.

  19. Study of Yield and Effective Traits in Bread Wheat Recombinant Inbred Lines (Triticum aestivum L. under Water Deficit Condition

    Directory of Open Access Journals (Sweden)

    S. Mohammad zadeh

    2013-11-01

    Full Text Available The effects of some traits on the seed yield of recombinant inbred lines of wheat under water-deficit stress were studied. This research was done at the Agricultural Research Station, Islamic Azad University, Tabriz Branch, in 2010-2011. Twenty-eight recombinant inbred lines of bread wheat with their two parents (Norstar and Zagros) were studied in a split-plot experiment based on a randomized complete block design with three replications at two irrigation levels (70 and 140 mm evaporation from a class A pan). Analysis of variance indicated significant genetic differences among the lines in all traits under study. Lines No. 32, 163 and 182 produced the highest yield under both irrigation levels. Number of spikes, grains per spike and harvest index had the highest positive correlations with grain yield. Path analysis based on stepwise regression showed that under normal irrigation conditions the number of spikes (0.556), number of grains per spike (0.278) and 1000-grain weight (0.259), and under drought stress the number of spikes (0.430), straw yield (0.276) and peduncle length (0.323), had the largest direct and positive effects on yield.

  20. Comparison of Classical Linear Regression and Orthogonal Regression According to the Sum of Squares Perpendicular Distances

    OpenAIRE

    KELEŞ, Taliha; ALTUN, Murat

    2016-01-01

    Regression analysis is a statistical technique for investigating and modeling the relationship between variables. The purpose of this study was the trivial presentation of the equation for orthogonal regression (OR) and the comparison of classical linear regression (CLR) and OR techniques with respect to the sum of squared perpendicular distances. For that purpose, the analyses were shown by an example. It was found that the sum of squared perpendicular distances of OR is smaller. Thus, it wa...
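
    A minimal sketch of the comparison described above: ordinary least squares versus orthogonal regression (total least squares via the first principal axis of the centered data), scored by the sum of squared perpendicular distances. The data are simulated; only the direction of the comparison, not the numbers, reflects the study.

```python
# Compare classical linear regression (OLS) with orthogonal regression (TLS)
# by the sum of squared perpendicular distances. Illustrative simulated data.
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(0, 10, 50)
y = 2.0 + 0.8 * x + rng.normal(scale=1.0, size=50)
x = x + rng.normal(scale=1.0, size=50)          # measurement error in x as well

def perp_ss(a, b, x, y):
    """Sum of squared perpendicular distances to the line y = a + b*x."""
    return np.sum((b * x - y + a) ** 2) / (b ** 2 + 1)

# Classical OLS slope and intercept
b_ols = np.cov(x, y, bias=True)[0, 1] / np.var(x)
a_ols = y.mean() - b_ols * x.mean()

# Orthogonal regression: direction of the first principal axis of the centered data
Xc = np.column_stack([x - x.mean(), y - y.mean()])
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
b_tls = Vt[0, 1] / Vt[0, 0]
a_tls = y.mean() - b_tls * x.mean()

print("OLS        sum of squared perpendicular distances:", perp_ss(a_ols, b_ols, x, y))
print("Orthogonal sum of squared perpendicular distances:", perp_ss(a_tls, b_tls, x, y))
```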

  1. Pathological assessment of liver fibrosis regression

    Directory of Open Access Journals (Sweden)

    WANG Bingqiong

    2017-03-01

    Full Text Available Hepatic fibrosis is the common pathological outcome of chronic hepatic diseases. An accurate assessment of fibrosis degree provides an important reference for a definite diagnosis of diseases, treatment decision-making, treatment outcome monitoring, and prognostic evaluation. At present, many clinical studies have proven that regression of hepatic fibrosis and early-stage liver cirrhosis can be achieved by effective treatment, and a correct evaluation of fibrosis regression has become a hot topic in clinical research. Liver biopsy has long been regarded as the gold standard for the assessment of hepatic fibrosis, and thus it plays an important role in the evaluation of fibrosis regression. This article reviews the clinical application of current pathological staging systems in the evaluation of fibrosis regression from the perspectives of semi-quantitative scoring system, quantitative approach, and qualitative approach, in order to propose a better pathological evaluation system for the assessment of fibrosis regression.

  2. Time course for tail regression during metamorphosis of the ascidian Ciona intestinalis.

    Science.gov (United States)

    Matsunobu, Shohei; Sasakura, Yasunori

    2015-09-01

    In most ascidians, the tadpole-like swimming larvae dramatically change their body-plans during metamorphosis and develop into sessile adults. The mechanisms of ascidian metamorphosis have been researched and debated for many years. Until now information on the detailed time course of the initiation and completion of each metamorphic event has not been described. One dramatic and important event in ascidian metamorphosis is tail regression, in which ascidian larvae lose their tails to adjust themselves to sessile life. In the present study, we measured the time associated with tail regression in the ascidian Ciona intestinalis. Larvae are thought to acquire competency for each metamorphic event in certain developmental periods. We show that the timing with which the competence for tail regression is acquired is determined by the time since hatching, and this timing is not affected by the timing of post-hatching events such as adhesion. Because larvae need to adhere to substrates with their papillae to induce tail regression, we measured the duration for which larvae need to remain adhered in order to initiate tail regression and the time needed for the tail to regress. Larvae acquire the ability to adhere to substrates before they acquire tail regression competence. We found that when larvae adhered before they acquired tail regression competence, they were able to remember the experience of adhesion until they acquired the ability to undergo tail regression. The time course of the events associated with tail regression provides a valuable reference, upon which the cellular and molecular mechanisms of ascidian metamorphosis can be elucidated. Copyright © 2015 Elsevier Inc. All rights reserved.

  3. Logistic Regression: Concept and Application

    Science.gov (United States)

    Cokluk, Omay

    2010-01-01

    The main focus of logistic regression analysis is classification of individuals in different groups. The aim of the present study is to explain basic concepts and processes of binary logistic regression analysis intended to determine the combination of independent variables which best explain the membership in certain groups called dichotomous…
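
    A minimal sketch of binary logistic regression in the sense described above, classifying cases into two (dichotomous) groups from several independent variables. The data and coefficients below are invented for illustration and use scikit-learn rather than any package mentioned in the study.

```python
# Minimal binary logistic regression sketch: classify cases into two groups.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
X = rng.normal(size=(200, 3))                                 # three independent variables
y = (1.5 * X[:, 0] - 1.0 * X[:, 1] + rng.logistic(size=200) > 0).astype(int)  # dichotomous outcome

model = LogisticRegression().fit(X, y)
print("coefficients (log-odds):", model.coef_)
print("predicted group for a new case:", model.predict([[0.5, -1.0, 0.0]]))
print("membership probabilities:", model.predict_proba([[0.5, -1.0, 0.0]]))
```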

  4. Spontaneous regression of pulmonary bullae

    International Nuclear Information System (INIS)

    Satoh, H.; Ishikawa, H.; Ohtsuka, M.; Sekizawa, K.

    2002-01-01

    The natural history of pulmonary bullae is often characterized by gradual, progressive enlargement. Spontaneous regression of bullae is, however, very rare. We report a case in which complete resolution of pulmonary bullae in the left upper lung occurred spontaneously. The management of pulmonary bullae is occasionally made difficult because of gradual progressive enlargement associated with abnormal pulmonary function. Some patients have multiple bullae in both lungs and/or a history of pulmonary emphysema. Others have a giant bulla without emphysematous change in the lungs. Our present case had been treated for lung cancer, with no evidence of local recurrence. He had no emphysematous change on lung function testing and no complaints, although high-resolution CT showed evidence of underlying minimal emphysematous changes. Ortin and Gurney presented three cases of spontaneous reduction in the size of bullae. Interestingly, one of them had a marked decrease in the size of a bulla in association with thickening of the wall of the bulla, which was also observed in our patient. The case we describe is of interest not only because of the rarity with which regression of pulmonary bullae has been reported in the literature, but also because of the spontaneous improvement in the radiological picture in the absence of overt infection or tumor. Copyright (2002) Blackwell Science Pty Ltd

  5. Sparse reduced-rank regression with covariance estimation

    KAUST Repository

    Chen, Lisha

    2014-12-08

    Improving the predicting performance of the multiple response regression compared with separate linear regressions is a challenging question. On the one hand, it is desirable to seek model parsimony when facing a large number of parameters. On the other hand, for certain applications it is necessary to take into account the general covariance structure for the errors of the regression model. We assume a reduced-rank regression model and work with the likelihood function with general error covariance to achieve both objectives. In addition we propose to select relevant variables for reduced-rank regression by using a sparsity-inducing penalty, and to estimate the error covariance matrix simultaneously by using a similar penalty on the precision matrix. We develop a numerical algorithm to solve the penalized regression problem. In a simulation study and real data analysis, the new method is compared with two recent methods for multivariate regression and exhibits competitive performance in prediction and variable selection.

  7. Spontaneous regression of a large hepatocellular carcinoma: case report

    Directory of Open Access Journals (Sweden)

    Alqutub, Adel

    2011-01-01

    Full Text Available The prognosis of untreated advanced hepatocellular carcinoma (HCC) is grim, with a median survival of less than 6 months. Spontaneous regression of HCC has been defined as the disappearance of the hepatic lesions in the absence of any specific therapy. Spontaneous regression of a very large HCC is very rare, and limited data are available in the English literature. We describe spontaneous regression of hepatocellular carcinoma in a 65-year-old male who presented to our clinic with vague abdominal pain and weight loss of two months' duration. He was found to have multiple hepatic lesions with elevation of the serum alpha-fetoprotein (AFP) level to 6,500 µg/L (normal <20 µg/L). Computed tomography revealed advanced HCC replacing almost 80% of the right hepatic lobe. Without any intervention the patient showed gradual improvement over a period of a few months. Follow-up CT scan revealed disappearance of the hepatic lesions with a progressive decline of AFP levels to normal. Various mechanisms have been postulated to explain this rare phenomenon, but the exact mechanism remains a mystery.

  8. A generalized right truncated bivariate Poisson regression model with applications to health data.

    Science.gov (United States)

    Islam, M Ataharul; Chowdhury, Rafiqul I

    2017-01-01

    A generalized right truncated bivariate Poisson regression model is proposed in this paper. Estimation and tests for goodness of fit and for over- or underdispersion are illustrated for both untruncated and right truncated bivariate Poisson regression models using a marginal-conditional approach. Estimation and test procedures are illustrated for bivariate Poisson regression models with applications to Health and Retirement Study data on the number of health conditions and the number of health care services utilized. The proposed test statistics are easy to compute, and it is evident from the results that the models fit the data very well. A comparison between the right truncated and untruncated bivariate Poisson regression models using the test for nonnested models clearly shows that the truncated model performs significantly better than the untruncated model.
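
    The marginal-conditional idea can be sketched with two ordinary Poisson GLMs: a marginal model for the first count and a conditional model for the second count given the first. This is not the right-truncated model proposed in the paper; the covariate and the Health and Retirement Study-style outcomes are simulated stand-ins.

```python
# Rough sketch of the marginal-conditional factorization for a bivariate count
# outcome: f(y1, y2 | x) = f(y1 | x) * f(y2 | y1, x). Not the paper's truncated model.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 500
x = rng.normal(size=n)                               # e.g. a health covariate
y1 = rng.poisson(np.exp(0.2 + 0.4 * x))              # number of health conditions
y2 = rng.poisson(np.exp(0.1 + 0.3 * x + 0.2 * y1))   # number of services utilized

marginal = sm.GLM(y1, sm.add_constant(x), family=sm.families.Poisson()).fit()
conditional = sm.GLM(y2, sm.add_constant(np.column_stack([x, y1])),
                     family=sm.families.Poisson()).fit()

print(marginal.params)      # marginal model for y1
print(conditional.params)   # conditional model for y2 given y1
```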

  9. Antinomias do zoológico humano: sociabilidade selvagem, reality shows e regressão da consciência

    Directory of Open Access Journals (Sweden)

    Francisco Rüdiger

    2008-11-01

    Full Text Available This article analyses the ideological connections and historical meaning of the so-called reality shows in contemporary Brazilian society. First, we place the genre in a historical perspective, stressing its religious and popular roots as well as its connections with the power systems that have built the Western world. Second, the text exposes the main features of this kind of television show, calling attention to its inner structure and its concrete meaning in our social organization, with critical remarks on its Brazilian versions. Third, focusing on the power relations articulated through these programs, we discuss some of the theories that have been advanced about them. Continuing to draw on material taken from the press, these theories are then assessed in order to propose a historical interpretation of the shows' significance. The conclusion returns to the initial framework and offers an overall view from which what is at stake in reality shows can be better understood. Keywords: reality shows in Brazil, television programs, sociability

  10. Regression models of reactor diagnostic signals

    International Nuclear Information System (INIS)

    Vavrin, J.

    1989-01-01

    The application of an autoregression model is described as the simplest regression model of diagnostic signals, for use in the experimental analysis of diagnostic systems and in in-service monitoring and diagnosis of normal and anomalous conditions. A diagnostic method is described that uses a regression-type diagnostic database and regression spectral diagnostics. The diagnosis of neutron noise signals from anomalous modes in the experimental fuel assembly of a reactor is described. (author)

  11. Influences on call outcomes among Veteran callers to the National Veterans Crisis Line

    Science.gov (United States)

    Britton, Peter C.; Bossarte, Robert M.; Thompson, Caitlin; Kemp, Janet; Conner, Kenneth R.

    2016-01-01

    This evaluation examined the association of caller and call characteristics with proximal outcomes of Veterans Crisis Line calls. From October 1-7, 2010, 665 Veterans with recent suicidal ideation or a history of attempted suicide called the Veterans Crisis Line; 646 had complete data and were included in the analyses. A multivariable multinomial logistic regression was conducted to identify correlates of a favorable outcome, either a resolution or a referral, when compared to an unfavorable outcome, no resolution or referral. A multivariable logistic regression was used to identify correlates of responder-rated caller risk in a subset of calls. Approximately 84% of calls ended with a favorable outcome, 25% with a resolution and 59% with a referral to a local health care provider. Calls from high-risk callers had greater odds of ending with a referral than without a resolution or referral, as did weekday calls (6:00 am to 5:59 pm EST, Monday through Friday). Responders used caller intent to die and the absence of future plans to determine caller risk. Findings suggest that the Veterans Crisis Line is a useful mechanism for generating referrals for high-risk Veteran callers. Responders appeared to use known risk and protective factors to determine caller risk. PMID:23611446

  12. Confidence intervals for distinguishing ordinal and disordinal interactions in multiple regression.

    Science.gov (United States)

    Lee, Sunbok; Lei, Man-Kit; Brody, Gene H

    2015-06-01

    Distinguishing between ordinal and disordinal interaction in multiple regression is useful in testing many interesting theoretical hypotheses. Because the distinction is made based on the location of a crossover point of 2 simple regression lines, confidence intervals of the crossover point can be used to distinguish ordinal and disordinal interactions. This study examined 2 factors that need to be considered in constructing confidence intervals of the crossover point: (a) the assumption about the sampling distribution of the crossover point, and (b) the possibility of abnormally wide confidence intervals for the crossover point. A Monte Carlo simulation study was conducted to compare 6 different methods for constructing confidence intervals of the crossover point in terms of the coverage rate, the proportion of true values that fall to the left or right of the confidence intervals, and the average width of the confidence intervals. The methods include the reparameterization, delta, Fieller, basic bootstrap, percentile bootstrap, and bias-corrected accelerated bootstrap methods. The results of our Monte Carlo simulation study suggest that statistical inference using confidence intervals to distinguish ordinal and disordinal interaction requires sample sizes more than 500 to be able to provide sufficiently narrow confidence intervals to identify the location of the crossover point. (c) 2015 APA, all rights reserved.
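
    For a moderated regression y = b0 + b1*x + b2*z + b3*x*z, the two simple regression lines implied by different moderator values cross at x = -b2/b3. The sketch below estimates that crossover point and builds a percentile bootstrap confidence interval, one of the six methods compared in the study; the data are simulated.

```python
# Crossover point of simple regression lines and its percentile bootstrap CI.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 600
x = rng.normal(size=n)
z = rng.normal(size=n)
y = 0.5 + 0.3 * x + 0.4 * z - 0.5 * x * z + rng.normal(size=n)

def crossover(x, y, z):
    X = sm.add_constant(np.column_stack([x, z, x * z]))
    b = sm.OLS(y, X).fit().params          # [b0, b1, b2, b3]
    return -b[2] / b[3]                    # lines for different z values cross at x = -b2/b3

boot = []
for _ in range(2000):
    idx = rng.integers(0, n, n)            # resample cases with replacement
    boot.append(crossover(x[idx], y[idx], z[idx]))

est = crossover(x, y, z)
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"crossover estimate {est:.3f}, 95% percentile bootstrap CI ({lo:.3f}, {hi:.3f})")
```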

  13. Application of principal component regression and partial least squares regression in ultraviolet spectrum water quality detection

    Science.gov (United States)

    Li, Jiangtong; Luo, Yongdao; Dai, Honglin

    2018-01-01

    Water is the source of life and the essential foundation of all life. With the development of industrialization, water pollution has become more and more frequent, directly affecting human survival and development. Water quality detection is one of the necessary measures to protect water resources. Ultraviolet (UV) spectral analysis is an important research method in the field of water quality detection, in which partial least squares regression (PLSR) has become the predominant technique; in some special cases, however, PLSR produces considerable errors. To solve this problem, the traditional principal component regression (PCR) method was improved in this paper by using the principle of PLSR. The experimental results show that for some special experimental data sets the improved PCR performs better than PLSR. PCR and PLSR are the focus of this paper. First, principal component analysis (PCA) is performed in MATLAB to reduce the dimensionality of the spectral data; on the basis of a large number of experiments, optimized principal components that carry most of the original data information are extracted using the principle of PLSR. Second, linear regression analysis on the principal components is carried out with the Statistical Package for the Social Sciences (SPSS), from which the coefficients and relations of the principal components are obtained. Finally, the same water spectral data set is analyzed by PLSR and by the improved PCR and the two results are compared: the improved PCR and PLSR are similar for most data, but the improved PCR is better than PLSR for data near the detection limit. Both PLSR and the improved PCR can be used in UV spectral analysis of water, but for data near the detection limit the improved PCR gives better results than PLSR.
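
    The two calibration strategies can be contrasted in a few lines with scikit-learn: principal component regression as a PCA-plus-linear-regression pipeline versus partial least squares regression. The synthetic "spectra" below stand in for real UV absorbance measurements, and the plain PCR shown is not the paper's improved variant.

```python
# PCR (PCA + linear regression) vs PLSR on synthetic spectra, compared by CV RMSE.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(5)
n_samples, n_wavelengths = 80, 200
conc = rng.uniform(0, 1, n_samples)                       # analyte concentration
profile = np.exp(-0.5 * ((np.arange(n_wavelengths) - 90) / 15) ** 2)
spectra = np.outer(conc, profile) + 0.02 * rng.normal(size=(n_samples, n_wavelengths))

pcr = make_pipeline(PCA(n_components=5), LinearRegression())
pls = PLSRegression(n_components=5)

for name, model in [("PCR", pcr), ("PLSR", pls)]:
    score = cross_val_score(model, spectra, conc, cv=5,
                            scoring="neg_root_mean_squared_error")
    print(name, "CV RMSE:", -score.mean())
```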

  14. Two cases of Tolosa-Hunt syndrome showing interesting CT findings

    International Nuclear Information System (INIS)

    Abe, Masahiro; Hara, Yuzo; Ito, Noritaka; Nishimura, Mieko; Onishi, Yoshitaka; Hasuo, Kanehiro

    1982-01-01

    CT showed the lesion at the orbital apex in both of the 2 cases of Tolosa-Hunt syndrome. Steroid therapy resulted in improvement of clinical symptoms and regression of the lesion which was confirmed by CT. (Chiba, N.)

  15. Testing discontinuities in nonparametric regression

    KAUST Repository

    Dai, Wenlin

    2017-01-19

    In nonparametric regression, it is often needed to detect whether there are jump discontinuities in the mean function. In this paper, we revisit the difference-based method in [H.-G. Müller and U. Stadtmüller, Discontinuous versus smooth regression, Ann. Stat. 27 (1999), pp. 299–337, doi: 10.1214/aos/1018031100].

  17. The flinders sensitive line rats, a genetic model of depression, show abnormal serotonin receptor mRNA expression in the brain that is reversed by 17beta-estradiol.

    Science.gov (United States)

    Osterlund, M K; Overstreet, D H; Hurd, Y L

    1999-12-10

    The possible link between estrogen and serotonin (5-HT) in depression was investigated using a genetic animal model of depression, the Flinders Sensitive Line (FSL) rats, in comparison to control Flinders Resistant Line rats. The mRNA levels of the estrogen receptor (ER) alpha and beta subtypes and the 5-HT(1A) and 5-HT(2A) receptors were analyzed in several limbic-related areas of ovariectomized FSL and FRL rats treated with 17beta-estradiol (0.15 microg/g) or vehicle. The FSL animals were shown to express significantly lower levels of the 5-HT(2A) receptor transcripts in the perirhinal cortex, piriform cortex, and medial anterodorsal amygdala and higher levels in the CA 2-3 region of the hippocampus. The only significant difference between the rat lines in ER mRNA expression was found in the medial posterodorsal amygdala, where the FSL rats showed lower ERalpha expression levels. Overall, estradiol treatment increased 5-HT(2A) and decreased 5-HT(1A) receptor mRNA levels in several of the examined regions of both lines. Thus, in many areas, estradiol was found to regulate the 5-HT receptor mRNA expression in the opposite direction to the alterations found in the FSL rats. These findings further support the implication of 5-HT receptors, in particular the 5-HT(2A) subtype, in the etiology of affective disorders. Moreover, the ability of estradiol to regulate the expression of the 5-HT(1A) and 5-HT(2A) receptor genes might account for the reported influence of gonadal hormones in mood and depression.

  18. Nonparametric additive regression for repeatedly measured data

    KAUST Repository

    Carroll, R. J.

    2009-05-20

    We develop an easily computed smooth backfitting algorithm for additive model fitting in repeated measures problems. Our methodology easily copes with various settings, such as when some covariates are the same over repeated response measurements. We allow for a working covariance matrix for the regression errors, showing that our method is most efficient when the correct covariance matrix is used. The component functions achieve the known asymptotic variance lower bound for the scalar argument case. Smooth backfitting also leads directly to design-independent biases in the local linear case. Simulations show our estimator has smaller variance than the usual kernel estimator. This is also illustrated by an example from nutritional epidemiology. © 2009 Biometrika Trust.
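
    The general idea of backfitting an additive model can be sketched with a simple Nadaraya-Watson smoother. Note that this is the classical backfitting loop for independent data, not the smooth backfitting estimator with a working covariance matrix developed in the paper; the bandwidth and data are arbitrary.

```python
# Classical backfitting sketch for an additive model y = f1(x1) + f2(x2) + error.
import numpy as np

rng = np.random.default_rng(6)
n = 300
x1, x2 = rng.uniform(-2, 2, n), rng.uniform(-2, 2, n)
y = np.sin(x1) + 0.5 * x2 ** 2 + rng.normal(scale=0.3, size=n)

def kernel_smooth(x, r, bandwidth=0.3):
    """Nadaraya-Watson estimate of E[r | x] at the observed points."""
    w = np.exp(-0.5 * ((x[:, None] - x[None, :]) / bandwidth) ** 2)
    return (w * r[None, :]).sum(axis=1) / w.sum(axis=1)

f1, f2 = np.zeros(n), np.zeros(n)
alpha = y.mean()
for _ in range(20):                                  # backfitting iterations
    f1 = kernel_smooth(x1, y - alpha - f2)
    f1 -= f1.mean()                                  # identifiability: center each component
    f2 = kernel_smooth(x2, y - alpha - f1)
    f2 -= f2.mean()

print("residual std after backfitting:", np.std(y - alpha - f1 - f2))
```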

  19. Length bias correction in gene ontology enrichment analysis using logistic regression.

    Science.gov (United States)

    Mi, Gu; Di, Yanming; Emerson, Sarah; Cumbie, Jason S; Chang, Jeff H

    2012-01-01

    When assessing differential gene expression from RNA sequencing data, commonly used statistical tests tend to have greater power to detect differential expression of genes encoding longer transcripts. This phenomenon, called "length bias", will influence subsequent analyses such as Gene Ontology enrichment analysis. In the presence of length bias, Gene Ontology categories that include longer genes are more likely to be identified as enriched. These categories, however, are not necessarily biologically more relevant. We show that one can effectively adjust for length bias in Gene Ontology analysis by including transcript length as a covariate in a logistic regression model. The logistic regression model makes the statistical issue underlying length bias more transparent: transcript length becomes a confounding factor when it correlates with both the Gene Ontology membership and the significance of the differential expression test. The inclusion of the transcript length as a covariate allows one to investigate the direct correlation between the Gene Ontology membership and the significance of testing differential expression, conditional on the transcript length. We present both real and simulated data examples to show that the logistic regression approach is simple, effective, and flexible.
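
    One plausible way to code the length adjustment (not necessarily the paper's exact parameterization) is to regress a differential-expression indicator on GO-category membership plus log transcript length, so that the membership coefficient measures enrichment conditional on length. The simulated data below contain a length effect but no true enrichment, so the adjusted test should be near null while the naive test is not.

```python
# Hedged sketch of length-adjusted GO enrichment testing via logistic regression.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)
n_genes = 5000
log_len = rng.normal(loc=7.5, scale=1.0, size=n_genes)                      # log transcript length
in_go = rng.binomial(1, 1 / (1 + np.exp(-(log_len - 7.5))))                 # longer genes join the category more often
de = rng.binomial(1, 1 / (1 + np.exp(-(-2.0 + 0.5 * (log_len - 7.5)))))     # length bias, no true enrichment

adjusted = sm.Logit(de, sm.add_constant(np.column_stack([in_go, log_len]))).fit(disp=False)
naive = sm.Logit(de, sm.add_constant(in_go)).fit(disp=False)

print("length-adjusted GO coefficient:", adjusted.params[1], "p =", adjusted.pvalues[1])
print("unadjusted GO coefficient:     ", naive.params[1], "p =", naive.pvalues[1])
```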

  20. Boosted regression trees, multivariate adaptive regression splines and their two-step combinations with multiple linear regression or partial least squares to predict blood-brain barrier passage: a case study.

    Science.gov (United States)

    Deconinck, E; Zhang, M H; Petitet, F; Dubus, E; Ijjaali, I; Coomans, D; Vander Heyden, Y

    2008-02-18

    The use of some unconventional non-linear modeling techniques, i.e. classification and regression trees and multivariate adaptive regression splines-based methods, was explored to model the blood-brain barrier (BBB) passage of drugs and drug-like molecules. The data set contains BBB passage values for 299 structurally and pharmacologically diverse drugs, originating from a structured knowledge-based database. Models were built using boosted regression trees (BRT) and multivariate adaptive regression splines (MARS), as well as their respective combinations with stepwise multiple linear regression (MLR) and partial least squares (PLS) regression in two-step approaches. The best models were obtained using combinations of MARS with either stepwise MLR or PLS. It could be concluded that the use of a combination of a linear with a non-linear modeling technique results in some improved properties compared to the individual linear and non-linear models and that, when the use of such a combination is appropriate, combinations using MARS as the non-linear technique should be preferred over those with BRT, due to some serious drawbacks of the BRT approaches.

  1. Tracking time-varying parameters with local regression

    DEFF Research Database (Denmark)

    Joensen, Alfred Karsten; Nielsen, Henrik Aalborg; Nielsen, Torben Skov

    2000-01-01

    This paper shows that the recursive least-squares (RLS) algorithm with forgetting factor is a special case of a varying-coefficient model, and a model which can easily be estimated via simple local regression. This observation allows us to formulate a new method which retains the RLS algorithm, but extends the algorithm by including polynomial approximations. Simulation results are provided, which indicate that this new method is superior to the classical RLS method if the parameter variations are smooth.
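
    For reference, the recursive least-squares algorithm with forgetting factor that the paper reinterprets as local regression looks roughly as follows; the drifting-slope example and tuning constants are invented for illustration.

```python
# Recursive least squares with a forgetting factor (lam < 1 discounts old data).
import numpy as np

rng = np.random.default_rng(8)
n, lam = 500, 0.98
theta = np.zeros(2)                    # [intercept, slope] estimate
P = np.eye(2) * 1000.0                 # large initial covariance

true_slope = 1.0
for t in range(n):
    true_slope += 0.002                              # parameter drifts slowly over time
    x = np.array([1.0, rng.normal()])
    y = x @ np.array([0.5, true_slope]) + rng.normal(scale=0.2)

    # standard RLS update with forgetting factor lam
    k = P @ x / (lam + x @ P @ x)
    theta = theta + k * (y - x @ theta)
    P = (P - np.outer(k, x @ P)) / lam

print("final slope estimate:", theta[1], " true final slope:", true_slope)
```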

  2. Testing Heteroscedasticity in Robust Regression

    Czech Academy of Sciences Publication Activity Database

    Kalina, Jan

    2011-01-01

    Roč. 1, č. 4 (2011), s. 25-28 ISSN 2045-3345 Grant - others:GA ČR(CZ) GA402/09/0557 Institutional research plan: CEZ:AV0Z10300504 Keywords : robust regression * heteroscedasticity * regression quantiles * diagnostics Subject RIV: BB - Applied Statistics , Operational Research http://www.researchjournals.co.uk/documents/Vol4/06%20Kalina.pdf

  3. Research on Line Patrol Strategy of 110kV Transmission Line after Lightning Strike

    Directory of Open Access Journals (Sweden)

    Li Mingjun

    2016-01-01

    Full Text Available Lightning faults account for the majority of instantaneous faults, and reclosing is usually successful, so in many cases the power supply can be restored without an immediate patrol. This paper first introduces methods for locating and identifying lightning faults. The electrical performance of insulators taken from 110 kV lines after lightning strikes is then tested; the data show that lightning strikes have little effect on the electrical performance of the insulators. Finally, the disposal process for a 110 kV transmission line after a lightning fault is illustrated, demonstrating that power supply reliability can be ensured without a line patrol.

  4. The End of the Lines for OX 169: No Binary Broad-Line Region

    Science.gov (United States)

    Halpern, J. P.; Eracleous, M.

    2000-03-01

    We show that unusual Balmer emission-line profiles of the quasar OX 169, frequently described as either self-absorbed or double peaked, are actually neither. The effect is an illusion resulting from two coincidences. First, the forbidden lines are quite strong and broad. Consequently, the [N II] λ6583 line and the associated narrow-line component of Hα present the appearance of twin Hα peaks. Second, the redshift of 0.2110 brings Hβ into coincidence with Na I D at zero redshift, and ISM absorption in Na I D divides the Hβ emission line. In spectra obtained over the past decade, we see no substantial change in the character of the line profiles and no indication of intrinsic double-peaked structure. The Hγ, Mg II, and Lyα emission lines are single peaked, and all of the emission-line redshifts are consistent once they are correctly attributed to their permitted and forbidden-line identifications. A systematic shift of up to 700 km s-1 between broad and narrow lines is seen, but such differences are common and could be due to gravitational and transverse redshift in a low-inclination disk. Stockton & Farnham had called attention to an apparent tidal tail in the host galaxy of OX 169 and speculated that a recent merger had supplied the nucleus with a coalescing pair of black holes that was now revealing its existence in the form of two physically distinct broad-line regions. Although there is no longer any evidence for two broad emission-line regions in OX 169, binary black holes should form frequently in galaxy mergers, and it is still worthwhile to monitor the radial velocities of emission lines that could supply evidence of their existence in certain objects.

  5. Assessment of deforestation using regression; Hodnotenie odlesnenia s vyuzitim regresie

    Energy Technology Data Exchange (ETDEWEB)

    Juristova, J. [Univerzita Komenskeho, Prirodovedecka fakulta, Katedra kartografie, geoinformatiky a DPZ, 84215 Bratislava (Slovakia)

    2013-04-16

    This work is devoted to the evaluation of deforestation using regression methods in the Idrisi Taiga software. Deforestation is evaluated by the method of logistic regression. The dependent variable has the discrete values '0' and '1', indicating whether or not deforestation occurred. The independent variables have continuous values, expressing the distance of deforested areas from the forest edge, from urban areas, and from the river and road networks. The results were also used to predict the probability of deforestation in subsequent periods. The result is a map showing the predicted probability of deforestation for the periods 1990/2000 and 2000/2006 in accordance with the predetermined coefficients (values of the independent variables). (authors)

  6. Rank-Optimized Logistic Matrix Regression toward Improved Matrix Data Classification.

    Science.gov (United States)

    Zhang, Jianguang; Jiang, Jianmin

    2018-02-01

    While existing logistic regression suffers from overfitting and often fails to consider structural information, we propose a novel matrix-based logistic regression to overcome this weakness. In the proposed method, 2D matrices are directly used to learn two groups of parameter vectors along each dimension without vectorization, which allows the proposed method to fully exploit the underlying structural information embedded inside the 2D matrices. Further, we add a joint [Formula: see text]-norm on the two parameter matrices, which are organized by aligning each group of parameter vectors in columns. This added co-regularization term has two roles: enhancing the effect of regularization and optimizing the rank during the learning process. With our proposed fast iterative solution, we carried out extensive experiments. The results show that, in comparison to both traditional tensor-based methods and vector-based regression methods, our proposed solution achieves better performance for matrix data classification.

  7. [Bibliometrics and visualization analysis of land use regression models in ambient air pollution research].

    Science.gov (United States)

    Zhang, Y J; Zhou, D H; Bai, Z P; Xue, F X

    2018-02-10

    Objective: To quantitatively analyze the current status and development trends regarding the land use regression (LUR) models on ambient air pollution studies. Methods: Relevant literature from the PubMed database before June 30, 2017 was analyzed, using the Bibliographic Items Co-occurrence Matrix Builder (BICOMB 2.0). Keywords co-occurrence networks, cluster mapping and timeline mapping were generated, using the CiteSpace 5.1.R5 software. Relevant literature identified in three Chinese databases was also reviewed. Results: Four hundred sixty four relevant papers were retrieved from the PubMed database. The number of papers published showed an annual increase, in line with the growing trend of the index. Most papers were published in the journal of Environmental Health Perspectives . Results from the Co-word cluster analysis identified five clusters: cluster#0 consisted of birth cohort studies related to the health effects of prenatal exposure to air pollution; cluster#1 referred to land use regression modeling and exposure assessment; cluster#2 was related to the epidemiology on traffic exposure; cluster#3 dealt with the exposure to ultrafine particles and related health effects; cluster#4 described the exposure to black carbon and related health effects. Data from Timeline mapping indicated that cluster#0 and#1 were the main research areas while cluster#3 and#4 were the up-coming hot areas of research. Ninety four relevant papers were retrieved from the Chinese databases with most of them related to studies on modeling. Conclusion: In order to better assess the health-related risks of ambient air pollution, and to best inform preventative public health intervention policies, application of LUR models to environmental epidemiology studies in China should be encouraged.

  8. Quantile regression for the statistical analysis of immunological data with many non-detects.

    Science.gov (United States)

    Eilers, Paul H C; Röder, Esther; Savelkoul, Huub F J; van Wijk, Roy Gerth

    2012-07-07

    Immunological parameters are hard to measure. A well-known problem is the occurrence of values below the detection limit, the non-detects. Non-detects are a nuisance, because classical statistical analyses, like ANOVA and regression, cannot be applied. The more advanced statistical techniques currently available for the analysis of datasets with non-detects can only be used if a small percentage of the data are non-detects. Quantile regression, a generalization of percentiles to regression models, models the median or higher percentiles and tolerates very high numbers of non-detects. We present a non-technical introduction and illustrate it with an implementation to real data from a clinical trial. We show that by using quantile regression, groups can be compared and that meaningful linear trends can be computed, even if more than half of the data consists of non-detects. Quantile regression is a valuable addition to the statistical methods that can be used for the analysis of immunological datasets with non-detects.
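
    A minimal sketch of the approach with statsmodels' QuantReg: non-detects are set to the detection limit and the median (or a higher quantile) is regressed on a group indicator. The marker, detection limit and group effect below are simulated; any quantile above the non-detect fraction is unaffected by how the non-detects are coded.

```python
# Median regression on data with many non-detects, using statsmodels QuantReg.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(9)
n = 300
group = rng.integers(0, 2, n)                                        # e.g. treatment vs control
conc = np.exp(0.5 + 0.8 * group + rng.normal(scale=1.0, size=n))     # immunological marker

LOD = 1.5                                                            # detection limit
observed = np.where(conc < LOD, LOD, conc)                           # non-detects set to the limit
print("fraction of non-detects:", np.mean(conc < LOD))

df = pd.DataFrame({"y": observed, "group": group})
median_fit = smf.quantreg("y ~ group", df).fit(q=0.5)
print(median_fit.params)                                             # median difference between groups
```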

  9. A comparison of alternative methods for estimating the self-thinning boundary line

    Science.gov (United States)

    Lianjun Zhang; Huiquan Bi; Jeffrey H. Gove; Linda S. Heath

    2005-01-01

    The fundamental validity of the self-thinning "law" has been debated over the last three decades. A long-standing concern centers on how to objectively select data points for fitting the self-thinning line and the most appropriate regression method for estimating the two coefficients. Using data from an even-aged Pinus strobus L. stand as an...

  10. Regression Analysis by Example. 5th Edition

    Science.gov (United States)

    Chatterjee, Samprit; Hadi, Ali S.

    2012-01-01

    Regression analysis is a conceptually simple method for investigating relationships among variables. Carrying out a successful application of regression analysis, however, requires a balance of theoretical results, empirical rules, and subjective judgment. "Regression Analysis by Example, Fifth Edition" has been expanded and thoroughly…

  11. Gaussian process regression analysis for functional data

    CERN Document Server

    Shi, Jian Qing

    2011-01-01

    Gaussian Process Regression Analysis for Functional Data presents nonparametric statistical methods for functional regression analysis, specifically the methods based on a Gaussian process prior in a functional space. The authors focus on problems involving functional response variables and mixed covariates of functional and scalar variables.Covering the basics of Gaussian process regression, the first several chapters discuss functional data analysis, theoretical aspects based on the asymptotic properties of Gaussian process regression models, and new methodological developments for high dime
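
    A basic scalar-input Gaussian process regression with scikit-learn gives the flavor of the GP prior and its pointwise predictive uncertainty; the functional-response and mixed functional/scalar covariate settings treated in the book are far more general than this sketch.

```python
# Minimal Gaussian process regression with an RBF kernel plus observation noise.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(10)
X = np.sort(rng.uniform(0, 10, 40))[:, None]
y = np.sin(X[:, 0]) + rng.normal(scale=0.2, size=40)

kernel = 1.0 * RBF(length_scale=1.0) + WhiteKernel(noise_level=0.05)
gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

X_new = np.linspace(0, 10, 5)[:, None]
mean, std = gpr.predict(X_new, return_std=True)
print("posterior mean:", mean)
print("posterior sd:  ", std)     # pointwise predictive uncertainty
```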

  12. Is past life regression therapy ethical?

    Science.gov (United States)

    Andrade, Gabriel

    2017-01-01

    Past life regression therapy is used by some physicians in cases of certain mental disorders. Anxiety disorders, mood disorders, and gender dysphoria have all been treated using past life regression therapy by some doctors, on the assumption that they reflect problems in past lives. Although it is not supported by psychiatric associations, few medical associations have actually condemned it as unethical. In this article, I argue that past life regression therapy is unethical for two basic reasons. First, it is not evidence-based. Past life regression is based on the reincarnation hypothesis, but this hypothesis is not supported by evidence, and in fact it faces some insurmountable conceptual problems. If patients are not fully informed about these problems, they cannot provide an informed consent, and hence the principle of autonomy is violated. Second, past life regression therapy carries a great risk of implanting false memories in patients, and thus of causing significant harm. This is a violation of the principle of non-maleficence, which is surely the most important principle in medical ethics.

  13. Cusps enable line attractors for neural computation

    International Nuclear Information System (INIS)

    Xiao, Zhuocheng; Zhang, Jiwei; Sornborger, Andrew T.; Tao, Louis

    2017-01-01

    Here, line attractors in neuronal networks have been suggested to be the basis of many brain functions, such as working memory, oculomotor control, head movement, locomotion, and sensory processing. In this paper, we make the connection between line attractors and pulse gating in feed-forward neuronal networks. In this context, because of their neutral stability along a one-dimensional manifold, line attractors are associated with a time-translational invariance that allows graded information to be propagated from one neuronal population to the next. To understand how pulse-gating manifests itself in a high-dimensional, nonlinear, feedforward integrate-and-fire network, we use a Fokker-Planck approach to analyze system dynamics. We make a connection between pulse-gated propagation in the Fokker-Planck and population-averaged mean-field (firing rate) models, and then identify an approximate line attractor in state space as the essential structure underlying graded information propagation. An analysis of the line attractor shows that it consists of three fixed points: a central saddle with an unstable manifold along the line and stable manifolds orthogonal to the line, which is surrounded on either side by stable fixed points. Along the manifold defined by the fixed points, slow dynamics give rise to a ghost. We show that this line attractor arises at a cusp catastrophe, where a fold bifurcation develops as a function of synaptic noise; and that the ghost dynamics near the fold of the cusp underly the robustness of the line attractor. Understanding the dynamical aspects of this cusp catastrophe allows us to show how line attractors can persist in biologically realistic neuronal networks and how the interplay of pulse gating, synaptic coupling, and neuronal stochasticity can be used to enable attracting one-dimensional manifolds and, thus, dynamically control the processing of graded information.

  15. Regression analysis of a chemical reaction fouling model

    International Nuclear Information System (INIS)

    Vasak, F.; Epstein, N.

    1996-01-01

    A previously reported mathematical model for the initial chemical reaction fouling of a heated tube is critically examined in the light of the experimental data for which it was developed. A regression analysis of the model with respect to that data shows that the reference point upon which the two adjustable parameters of the model were originally based was well chosen, albeit fortuitously. (author). 3 refs., 2 tabs., 2 figs

  16. Regression Models for Market-Shares

    DEFF Research Database (Denmark)

    Birch, Kristina; Olsen, Jørgen Kai; Tjur, Tue

    2005-01-01

    On the background of a data set of weekly sales and prices for three brands of coffee, this paper discusses various regression models and their relation to the multiplicative competitive-interaction model (the MCI model, see Cooper 1988, 1993) for market-shares. Emphasis is put on the interpretation of the parameters in relation to models for the total sales based on discrete choice models. Key words and phrases: MCI model, discrete choice model, market-shares, price elasticity, regression model.

  17. Regimen durability in HIV-infected children and adolescents initiating first-line ART in a large public sector HIV cohort in South Africa.

    Science.gov (United States)

    Bonawitz, Rachael; Brennan, Alana T; Long, Lawrence; Heeren, Timothy; Maskew, Mhairi; Sanne, Ian; Fox, Matthew P

    2018-04-15

    In April 2010 tenofovir and abacavir replaced stavudine in public-sector first-line antiretroviral therapy (ART) for children under 20 years old in South Africa. The association of both abacavir and tenofovir with fewer side-effects and toxicities compared to stavudine could translate to increased durability of tenofovir- or abacavir-based regimens. We evaluated changes over time in regimen durability for pediatric patients 3 to 19 years of age at 8 public sector clinics in Johannesburg, South Africa. Cohort analysis of treatment-naïve, non-pregnant pediatric patients from 3 to 19 years old initiated on ART between April 2004 and December 2013. First-line ART regimens before April 2010 consisted of stavudine or zidovudine with lamivudine and either efavirenz or nevirapine. Tenofovir and/or abacavir was substituted for stavudine after April 2010 in first-line ART. We evaluated the frequency and type of single-drug substitutions, treatment interruptions, and switches to second-line therapy. Fine and Gray competing risk regression models were used to evaluate the association of antiretroviral drug type with single-drug substitutions, treatment interruptions, and second-line switches in the first 24 months on treatment. 398 (15.3%) single-drug substitutions, 187 (7.2%) treatment interruptions and 86 (3.3%) switches to second-line therapy occurred among 2602 pediatric patients over 24 months on ART. Overall, the rate of single-drug substitutions started to increase in 2009, peaked in 2011 at 25%, then declined to 10% in 2013, well after the integration of tenofovir into pediatric regimens; no patients over the age of 3 were initiated on abacavir for first-line therapy. Competing risk regression models showed patients on zidovudine or stavudine had upwards of a 5-fold increase in single-drug substitution vs. patients initiated on tenofovir in the first 24 months on ART. Older adolescents also had a 2-3-fold increase in treatment interruptions and switches to second-line

  18. Constancy of spectral-line bisectors

    International Nuclear Information System (INIS)

    Gray, D.F.

    1983-01-01

    Bisectors of spectral line profiles in cool stars indicate the strength of convection in the photospheres of these objects. The present investigation is concerned with the feasibility of studying time variations in line bisectors, the reality of apparent line-to-line differences within the same stellar spectrum, and bisector differences between stars of identical spectral types. The differences considered pertain to the shape of the bisector. The material used in the investigation was acquired at the McDonald Observatory using a 1728-diode Reticon array at the coudé focus of the 2.1-m telescope. Observed bisector errors are discussed. It is established that different lines in the same star show significantly different bisectors; the slope and curvature are unique for each case.

  19. EFFECTING FACTORS DELIVERED FINANCIAL REPORTING TIME LINES AT MANUFACTURING COMPANY GROUPS LISTED IDX

    Directory of Open Access Journals (Sweden)

    Sunaryo Sunaryo

    2012-11-01

    Full Text Available The primary objective of this research is to examine the effects of ROA, leverage, company size, and outsider ownership on the timeliness of financial reporting, either partially or simultaneously. Secondary data were collected by purposive sampling of manufacturing company groups listed on the IDX and from preceding scientific research journals, using logistic regression to test the hypotheses simultaneously. The results of this research show that ROA and leverage have no significant effect on timeliness, but company size and outsider ownership have a significant effect on timeliness. It is recommended that this research topic be extended to merchandising company groups or to service company groups, either general or specialized, such as hotels, insurance companies and banks, or that new independent variables be added.

  20. Cancer burden trends in Umbria region using a joinpoint regression

    Directory of Open Access Journals (Sweden)

    Giuseppe Michele Masanotti

    2015-09-01

    Full Text Available INTRODUCTION. The analysis of epidemiological data on cancer is an important tool for controlling and evaluating the outcomes of primary and secondary prevention, the effectiveness of health care and, in general, all cancer control activities. MATERIALS AND METHODS. The aim of this paper is to analyze cancer mortality in the Umbria region from 1978 to 2009 and incidence from 1994 to 2008. Sex- and site-specific trends for standardized rates were analyzed by joinpoint regression, using the Surveillance, Epidemiology, and End Results (SEER) software. RESULTS. Applying the joinpoint analyses by sex and cancer site to incidence spanning 1994 to 2008 and mortality from 1978 to 2009 for all sites, both in males and females, a significant joinpoint for mortality was found; moreover, the trend shape was similar and the joinpoint years were very close. In males the standardized rate significantly increased up to 1989 by 1.23% per year and significantly decreased thereafter by -1.31%; among females the mortality rate increased on average by 0.78% per year (not significant) till 1988 and afterward significantly decreased by -0.92% per year. Incidence rates showed different trends between the sexes: in males the rate was practically constant over the period studied (a non-significant increase of 0.14% per year, estimated annual percent change, EAPC), while in females it significantly increased by 1.49% per year up to 2001 and afterward slowly decreased (-0.71%, n.s.). CONCLUSIONS. For all sites combined, mortality trends have decreased since the late '80s, both in males and females; this behaviour is in line with national and European Union data. This work shows that, even compared to health systems that invest more resources, the Umbria public health system achieved good health outcomes.
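
    A toy version of a single-joinpoint analysis can be written as a grid search over the year at which a continuous two-segment linear trend in the log rates changes slope. The sketch below simulates a trend change around 1989 and recovers the joinpoint and the annual percent changes; the SEER Joinpoint software adds permutation tests, multiple joinpoints and standard errors that are not reproduced here.

```python
# Grid-search a single joinpoint in a continuous two-segment linear trend of log rates.
import numpy as np

rng = np.random.default_rng(11)
years = np.arange(1978, 2010)
t = (years - years[0]).astype(float)
# simulated log standardized rates: rising trend to 1989, declining afterwards, plus noise
log_rate = np.where(years <= 1989,
                    0.012 * t,
                    0.012 * (1989 - 1978) - 0.013 * (years - 1989)) + rng.normal(scale=0.01, size=t.size)

best = None
for k in t[2:-2]:                                   # candidate joinpoints, keeping a few points per segment
    X = np.column_stack([np.ones_like(t), t, np.clip(t - k, 0.0, None)])   # slope change at k
    beta, *_ = np.linalg.lstsq(X, log_rate, rcond=None)
    sse = np.sum((log_rate - X @ beta) ** 2)
    if best is None or sse < best[0]:
        best = (sse, k, beta)

sse, k, beta = best
print("estimated joinpoint year:", int(years[0] + k))
print("APC before joinpoint: %.2f%%" % (100 * (np.exp(beta[1]) - 1)))
print("APC after joinpoint:  %.2f%%" % (100 * (np.exp(beta[1] + beta[2]) - 1)))
```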

  1. Detection of epistatic effects with logic regression and a classical linear regression model.

    Science.gov (United States)

    Malina, Magdalena; Ickstadt, Katja; Schwender, Holger; Posch, Martin; Bogdan, Małgorzata

    2014-02-01

    To locate multiple interacting quantitative trait loci (QTL) influencing a trait of interest within experimental populations, methods such as Cockerham's model are usually applied. Within this framework, interactions are understood as the part of the joint effect of several genes which cannot be explained as the sum of their additive effects. However, if a change in the phenotype (such as disease) is caused by Boolean combinations of genotypes of several QTLs, the Cockerham approach is often not capable of identifying them properly. To detect such interactions more efficiently, we propose a logic regression framework. Even though with the logic regression approach a larger number of models has to be considered (requiring more stringent multiple testing correction), the efficient representation of higher-order logic interactions in logic regression models leads to a significant increase of power to detect such interactions compared to the Cockerham approach. The increase in power is demonstrated analytically for a simple two-way interaction model and illustrated in more complex settings with a simulation study and real data analysis.

  2. Automatic selection of indicators in a fully saturated regression

    DEFF Research Database (Denmark)

    Hendry, David F.; Johansen, Søren; Santos, Carlos

    2008-01-01

    We consider selecting a regression model, using a variant of Gets, when there are more variables than observations, in the special case that the variables are impulse dummies (indicators) for every observation. We show that the setting is unproblematic if tackled appropriately, and obtain the finite-sample distribution of estimators of the mean and variance in a simple location-scale model under the null that no impulses matter. A Monte Carlo simulation confirms the null distribution, and shows power against an alternative of interest.
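
    A rough split-half sketch of the impulse-indicator idea: dummies are added for every observation in one half of the sample, significant ones are retained, the procedure is repeated for the other half, and the retained indicators enter a final regression. Real Gets/Autometrics implementations are far more elaborate; the cut-off and data below are illustrative only.

```python
# Split-half impulse-indicator saturation sketch for a simple location model.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(12)
n = 100
y = 1.0 + rng.normal(size=n)
y[40] += 6.0                                     # one genuine outlier / impulse

def retained_dummies(block):
    """Fit y on a constant plus impulse dummies for the given block; keep significant ones."""
    D = np.zeros((n, len(block)))
    D[block, np.arange(len(block))] = 1.0
    res = sm.OLS(y, sm.add_constant(D)).fit()
    return [int(obs) for obs, p in zip(block, res.pvalues[1:]) if p < 0.01]

halves = [np.arange(0, n // 2), np.arange(n // 2, n)]
kept = sorted(retained_dummies(halves[0]) + retained_dummies(halves[1]))

if kept:
    D = np.zeros((n, len(kept)))
    D[kept, np.arange(len(kept))] = 1.0
    final = sm.OLS(y, sm.add_constant(D)).fit()
else:
    final = sm.OLS(y, np.ones((n, 1))).fit()

print("retained impulse indicators:", kept)
print("estimated mean after saturation:", final.params[0])
```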

  3. Correlation, regression, and cointegration of nonstationary economic time series

    DEFF Research Database (Denmark)

    Johansen, Søren

    Yule (1926) introduced the concept of spurious or nonsense correlation, and showed by simulation that for some nonstationary processes the empirical correlations seem not to converge in probability even if the processes are independent. This was later discussed by Granger and Newbold (1974), and Phillips (1986) found the limit distributions. We propose to distinguish between empirical and population correlation coefficients and show, in a bivariate autoregressive model for nonstationary variables, that the empirical correlation and regression coefficients do not converge to the relevant population...
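
    The spurious-regression phenomenon is easy to reproduce: regressing one independent random walk on another yields nominally "significant" slopes far more often than the 5% a standard t-test would suggest. The simulation below is a generic illustration, not the paper's bivariate autoregressive setup.

```python
# Spurious regression between independent random walks: inflated t-statistics.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(13)
n, reps, hits = 200, 500, 0
for _ in range(reps):
    x = np.cumsum(rng.normal(size=n))        # independent random walks
    y = np.cumsum(rng.normal(size=n))
    res = sm.OLS(y, sm.add_constant(x)).fit()
    if abs(res.tvalues[1]) > 1.96:           # nominal 5% test on the slope
        hits += 1

print("fraction of 'significant' slopes between independent walks:", hits / reps)
```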

  4. Line ratios and wavelengths of helium-like argon n=2 satellite transitions and resonance lines

    International Nuclear Information System (INIS)

    Biedermann, C.; Radtke, R.; Fournier, K.

    2003-01-01

    The characteristic X-ray emission from helium-like argon was investigated as a means to diagnose hot plasmas. We have measured the radiation from the n=2-1 parent lines and from KLn dielectronic recombination satellites with high wavelength resolution as a function of the excitation energy using the Berlin Electron Beam Ion Trap. Wavelengths relative to the resonance and forbidden lines are tabulated and compared with references. The line intensity observed over a wide range of excitation energies is weighted with a Maxwellian electron-energy distribution to analyze line ratios as a function of plasma temperature. The line ratios (j+z)/w and k/w compare nicely with theoretical predictions and demonstrate their applicability as a temperature diagnostic. The ratio z/(x+y) is shown not to depend on the electron density.

  5. A Product Line Enhanced Unified Process

    DEFF Research Database (Denmark)

    Zhang, Weishan; Kunz, Thomas

    2006-01-01

    The Unified Process facilitates reuse for a single system, but falls short in handling multiple similar products. In this paper we present an enhanced Unified Process, called UPEPL, integrating product line technology in order to alleviate this problem. In UPEPL, the product line related activities are added and can be conducted side by side with the other classical UP activities. In this way the advantages of both the Unified Process and software product lines can co-exist in UPEPL. We show how to use UPEPL with an industrial mobile device product line in our case study.

  6. Generalized allometric regression to estimate biomass of Populus in short-rotation coppice

    Energy Technology Data Exchange (ETDEWEB)

    Ben Brahim, Mohammed; Gavaland, Andre; Cabanettes, Alain [INRA Centre de Toulouse, Castanet-Tolosane Cedex (France). Unite Agroforesterie et Foret Paysanne

    2000-07-01

    Data from four different stands were combined to establish a single generalized allometric equation to estimate the above-ground biomass of individual Populus trees grown in short-rotation coppice. The generalized model was fitted using diameter at breast height, the mean diameter and the mean height of each site as predictor variables, and was then compared with the stand-specific regressions using an F-test. Results showed that this single regression estimates tree biomass well at each stand and does not introduce bias with increasing diameter.
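
    Allometric biomass equations of this kind are usually fitted on the log scale. The sketch below fits the generic form ln(biomass) = a + b*ln(DBH) to simulated trees; the paper's generalized model additionally includes the stand-level mean diameter and mean height, which are omitted here.

```python
# Generic log-log allometric biomass fit: ln(biomass) = a + b * ln(DBH).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(14)
dbh = rng.uniform(2, 15, 120)                                             # diameter at breast height, cm
biomass = 0.06 * dbh ** 2.4 * np.exp(rng.normal(scale=0.15, size=120))    # kg, multiplicative error

X = sm.add_constant(np.log(dbh))
fit = sm.OLS(np.log(biomass), X).fit()
a, b = fit.params
print(f"ln(biomass) = {a:.3f} + {b:.3f} * ln(DBH)   (R^2 = {fit.rsquared:.3f})")
```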

  7. How Do Different Aspects of Spatial Skills Relate to Early Arithmetic and Number Line Estimation?

    Directory of Open Access Journals (Sweden)

    Véronique Cornu

    2017-12-01

    Full Text Available The present study investigated the predictive role of spatial skills for arithmetic and number line estimation in kindergarten children (N = 125. Spatial skills are known to be related to mathematical development, but due to the construct’s non-unitary nature, different aspects of spatial skills need to be differentiated. In the present study, a spatial orientation task, a spatial visualization task and visuo-motor integration task were administered to assess three different aspects of spatial skills. Furthermore, we assessed counting abilities, knowledge of Arabic numerals, quantitative knowledge, as well as verbal working memory and verbal intelligence in kindergarten. Four months later, the same children performed an arithmetic and a number line estimation task to evaluate how the abilities measured at Time 1 predicted early mathematics outcomes. Hierarchical regression analysis revealed that children’s performance in arithmetic was predicted by their performance on the spatial orientation and visuo-motor integration task, as well as their knowledge of the Arabic numerals. Performance in number line estimation was significantly predicted by the children’s spatial orientation performance. Our findings emphasize the role of spatial skills, notably spatial orientation, in mathematical development. The relation between spatial orientation and arithmetic was partially mediated by the number line estimation task. Our results further show that some aspects of spatial skills might be more predictive of mathematical development than others, underlining the importance to differentiate within the construct of spatial skills when it comes to understanding numerical development.

  8. The U-line line balancing problem

    NARCIS (Netherlands)

    Miltenburg, G.J.; Wijngaard, J.

    1994-01-01

    The traditional line balancing (LB) problem considers a production line in which stations are arranged consecutively in a line. A balance is determined by grouping tasks into stations while moving forward (or backward) through a precedence network. Recently many production lines are being arranged

  9. Characterizing performances of solder paste printing process at flexible manufacturing lines

    International Nuclear Information System (INIS)

    Siew, Jit Ping; Low, Heng Chin; Teoh, Ping Chow

    2015-01-01

    Solder paste printing (SPP) has been a challenge in printed circuit board (PCB) manufacturing, as evidenced by the proliferation of solder paste inspection equipment or by its substitution with the rigorous, non-value-added activity of manual inspection. The objective of this study is to characterize the SPP performance of various products manufactured on flexible production lines with different equipment configurations, and to determine areas for process improvement. The study began by collecting information on SPP performance relative to the component placement (CP) process and to the proportion of mixed products. Using a clustering algorithm to group similar elements together, SPP performance across all product-production line pairs is statistically modeled to discover the trend and the influential factors. The main findings are: (a) the ratio of overall dpku for the CP and SPP processes is 2:1; (b) logistic regression models of SPP performance indicated that only the effects of product-production line and solder paste printer configuration are significant; (c) PCB circuitry designs with BGA components and single solder paste printer line configurations generated the highest monthly defects, with the highest variation in the latter

  10. Characterizing performances of solder paste printing process at flexible manufacturing lines

    Energy Technology Data Exchange (ETDEWEB)

    Siew, Jit Ping; Low, Heng Chin [University of Science Malaysia, 11800 Minden, Penang (Malaysia); Teoh, Ping Chow [Wawasan Open University, 54 Jalan Sultan Ahmad Shah, 10050 Penang (Malaysia)

    2015-02-03

    Solder paste printing (SPP) has been a challenge in printed circuit board (PCB) manufacturing, as evidenced by the proliferation of solder paste inspection equipment or by its substitution with the rigorous, non-value-added activity of manual inspection. The objective of this study is to characterize the SPP performance of various products manufactured on flexible production lines with different equipment configurations, and to determine areas for process improvement. The study began by collecting information on SPP performance relative to the component placement (CP) process and to the proportion of mixed products. Using a clustering algorithm to group similar elements together, SPP performance across all product-production line pairs is statistically modeled to discover the trend and the influential factors. The main findings are: (a) the ratio of overall dpku for the CP and SPP processes is 2:1; (b) logistic regression models of SPP performance indicated that only the effects of product-production line and solder paste printer configuration are significant; (c) PCB circuitry designs with BGA components and single solder paste printer line configurations generated the highest monthly defects, with the highest variation in the latter.

  11. Bike Map Lines

    Data.gov (United States)

    Town of Chapel Hill, North Carolina — Chapel Hill Bike Map Lines from KMZ file.This data came from the wiki comment board for the public, not an “official map” showing the Town of Chapel Hill's plans or...

  12. Regression analysis using dependent Polya trees.

    Science.gov (United States)

    Schörgendorfer, Angela; Branscum, Adam J

    2013-11-30

    Many commonly used models for linear regression analysis force overly simplistic shape and scale constraints on the residual structure of data. We propose a semiparametric Bayesian model for regression analysis that produces data-driven inference by using a new type of dependent Polya tree prior to model arbitrary residual distributions that are allowed to evolve across increasing levels of an ordinal covariate (e.g., time, in repeated measurement studies). By modeling residual distributions at consecutive covariate levels or time points using separate, but dependent Polya tree priors, distributional information is pooled while allowing for broad pliability to accommodate many types of changing residual distributions. We can use the proposed dependent residual structure in a wide range of regression settings, including fixed-effects and mixed-effects linear and nonlinear models for cross-sectional, prospective, and repeated measurement data. A simulation study illustrates the flexibility of our novel semiparametric regression model to accurately capture evolving residual distributions. In an application to immune development data on immunoglobulin G antibodies in children, our new model outperforms several contemporary semiparametric regression models based on a predictive model selection criterion. Copyright © 2013 John Wiley & Sons, Ltd.

  13. Quantitative proteomics identifies central players in erlotinib resistance of the non-small cell lung cancer cell line HCC827

    DEFF Research Database (Denmark)

    Jacobsen, Kirstine; Lund, Rikke Raaen; Beck, Hans Christian

    Background: Erlotinib (Tarceva®, Roche) has significantly changed the treatment of non-small cell lung cancer (NSCLC) as 70% of patients show significant tumor regression when treated. However, all patients relapse due to development of acquired resistance, which in 43-50% of cases are caused...... by a secondary mutation (T790M) in EGFR. Importantly, a majority of resistance cases are still unexplained. Our aim is to identify novel resistance mechanisms in erlotinib-resistant subclones of the NSCLC cell line HCC827. Materials & Methods: We established 3 erlotinib-resistant subclones (resistant to 10, 20...... or other EGFR or KRAS mutations, potentiating the identification of novel resistance mechanisms. We identified 2875 cytoplasmic proteins present in all 4 cell lines. Of these 87, 56 and 23 are upregulated >1.5 fold; and 117, 72 and 32 are downregulated >1.5 fold, respectively, in the 3 resistant clones...

  14. Clinical value of regression of electrocardiographic left ventricular hypertrophy after aortic valve replacement.

    Science.gov (United States)

    Yamabe, Sayuri; Dohi, Yoshihiro; Higashi, Akifumi; Kinoshita, Hiroki; Sada, Yoshiharu; Hidaka, Takayuki; Kurisu, Satoshi; Shiode, Nobuo; Kihara, Yasuki

    2016-09-01

    Electrocardiographic left ventricular hypertrophy (ECG-LVH) gradually regressed after aortic valve replacement (AVR) in patients with severe aortic stenosis. Sokolow-Lyon voltage (SV1 + RV5/6) is possibly the most widely used criterion for ECG-LVH. The aim of this study was to determine whether decrease in Sokolow-Lyon voltage reflects left ventricular reverse remodeling detected by echocardiography after AVR. Of 129 consecutive patients who underwent AVR for severe aortic stenosis, 38 patients with preoperative ECG-LVH, defined by SV1 + RV5/6 of ≥3.5 mV, were enrolled in this study. Electrocardiography and echocardiography were performed preoperatively and 1 year postoperatively. The patients were divided into ECG-LVH regression group (n = 19) and non-regression group (n = 19) according to the median value of the absolute regression in SV1 + RV5/6. Multivariate logistic regression analysis was performed to assess determinants of ECG-LVH regression among echocardiographic indices. ECG-LVH regression group showed significantly greater decrease in left ventricular mass index and left ventricular dimensions than Non-regression group. ECG-LVH regression was independently determined by decrease in the left ventricular mass index [odds ratio (OR) 1.28, 95 % confidence interval (CI) 1.03-1.69, p = 0.048], left ventricular end-diastolic dimension (OR 1.18, 95 % CI 1.03-1.41, p = 0.014), and left ventricular end-systolic dimension (OR 1.24, 95 % CI 1.06-1.52, p = 0.0047). ECG-LVH regression could be a marker of the effect of AVR on both reducing the left ventricular mass index and left ventricular dimensions. The effect of AVR on reverse remodeling can be estimated, at least in part, by regression of ECG-LVH.

  15. Applied Regression Modeling A Business Approach

    CERN Document Server

    Pardoe, Iain

    2012-01-01

    An applied and concise treatment of statistical regression techniques for business students and professionals who have little or no background in calculus. Regression analysis is an invaluable statistical methodology in business settings and is vital to model the relationship between a response variable and one or more predictor variables, as well as the prediction of a response value given values of the predictors. In view of the inherent uncertainty of business processes, such as the volatility of consumer spending and the presence of market uncertainty, business professionals use regression a

  16. Multiple Linear Regression Analysis of Factors Affecting Real Property Price Index From Case Study Research In Istanbul/Turkey

    Science.gov (United States)

    Denli, H. H.; Koc, Z.

    2015-12-01

    Standards-based estimation of real property values is difficult to apply consistently across time and location. Regression analysis constructs mathematical models that describe or explain relationships that may exist between variables. The problem of identifying price differences among properties in order to obtain a price index can be converted into a regression problem, and standard techniques of regression analysis can be used to estimate the index. Applied to real estate valuation, where properties are presented on the market with their current characteristics and quantifiers, regression analysis helps to find the factors or variables that are effective in the formation of value. In this study, prices of housing for sale in Zeytinburnu, a district in Istanbul, are associated with their characteristics to find a price index, based on information received from a real estate web page. The explanatory variables used for the analysis are age, size in m2, number of floors of the building, the floor number of the estate, and number of rooms. The price of the estate is the dependent variable, whereas the rest are independent variables. Prices of 60 real estates have been used for the analysis. Locations with equal prices have been identified and plotted on the map, and equivalence curves have been drawn, identifying equally valued zones as lines.
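
    A minimal sketch of such a hedonic multiple regression, assuming invented listing data rather than the Zeytinburnu data set: price is regressed on age, size, building floors, unit floor and room count with statsmodels.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Illustrative listings data (not the Zeytinburnu data set): price as the response,
# with age, size, building floors, unit floor and room count as predictors.
rng = np.random.default_rng(1)
n = 60
df = pd.DataFrame({
    "age": rng.integers(0, 40, n),
    "size_m2": rng.uniform(50, 180, n),
    "building_floors": rng.integers(2, 12, n),
    "unit_floor": rng.integers(0, 10, n),
    "rooms": rng.integers(1, 5, n),
})
df["price"] = (2000 * df["size_m2"] - 1500 * df["age"]
               + 5000 * df["rooms"] + rng.normal(0, 20000, n))

model = smf.ols("price ~ age + size_m2 + building_floors + unit_floor + rooms", data=df).fit()
print(model.summary())   # coefficients indicate each variable's contribution to price
```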

  17. Spontaneous regression of epithelial downgrowth from clear corneal phacoemulsification wound

    Directory of Open Access Journals (Sweden)

    Ryan M. Jaber

    2018-06-01

    Full Text Available Purpose: To report a case of spontaneous regression of epithelial downgrowth, supported by optical coherence tomography (OCT) and confocal microscopy, associated with a clear corneal phacoemulsification wound. Observations: A 66-year-old Caucasian male presented two years after phacoemulsification in the left eye with an enlarging corneal endothelial lesion in that eye. His early post-operative course had been complicated by corneal edema and iris transillumination defects. The patient presented to our clinic with a large geographic sheet of epithelial downgrowth and iris synechiae to the temporal clear corneal wound. His vision was correctable to 20/25 in his left eye. Anterior segment OCT showed a hyperreflective layer on the posterior cornea with an abrupt transition that corresponded to the clinical transition zone of the epithelial downgrowth. Confocal microscopy showed polygonal cells with hyperreflective nuclei, suggestive of epithelial cells, in the area of the lesion, with a transition to a normal endothelial cell mosaic. Given the lack of glaucoma or inflammation and the relatively good vision, the plan was made to closely monitor for progression, with the anticipation that he might require aggressive surgery. Over the course of subsequent follow-up visits at three, seven and ten months, the endothelial lesion receded significantly. Confocal imaging in the area of the previously affected cornea showed essentially normal morphology with an endothelial cell count of 1664 cells/mm2. Conclusions and importance: Epithelial downgrowth may spontaneously regress. Though the mechanism is not yet understood, contact inhibition of movement may play a role. Despite this finding, epithelial downgrowth is typically a devastating process requiring aggressive treatment. Keywords: Epithelial downgrowth, Spontaneous regression, Confocal microscopy, Contact inhibition of movement

  18. Sintering equation: determination of its coefficients by experiments - using multiple regression

    International Nuclear Information System (INIS)

    Windelberg, D.

    1999-01-01

    Sintering is a method for the volume compression (or volume contraction) of powdered or grained material by applying high temperature (below the melting point of the material). Maekipirtti tried to find an equation that describes the process of sintering in terms of its main parameters: sintering time, sintering temperature and volume contraction. Such an equation is called a sintering equation. It also contains some coefficients which characterise the behaviour of the material during the sintering process. These coefficients have to be determined by experiments. Here we show that some linear regressions will produce wrong coefficients, whereas multiple regression results in a useful sintering equation. (orig.)

  19. Regression of environmental noise in LIGO data

    International Nuclear Information System (INIS)

    Tiwari, V; Klimenko, S; Mitselmakher, G; Necula, V; Drago, M; Prodi, G; Frolov, V; Yakushin, I; Re, V; Salemi, F; Vedovato, G

    2015-01-01

    We address the problem of noise regression in the output of gravitational-wave (GW) interferometers, using data from the physical environmental monitors (PEM). The objective of the regression analysis is to predict environmental noise in the GW channel from the PEM measurements. One of the most promising regression methods is based on the construction of Wiener–Kolmogorov (WK) filters. Using this method, the seismic noise cancellation from the LIGO GW channel has already been performed. In the presented approach the WK method has been extended, incorporating banks of Wiener filters in the time–frequency domain, multi-channel analysis and regulation schemes, which greatly enhance the versatility of the regression analysis. Also we present the first results on regression of the bi-coherent noise in the LIGO data. (paper)
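
    A minimal single-witness-channel sketch of this style of Wiener-filter noise regression (not the LIGO multi-channel pipeline): the filter W(f) = S_xy(f)/S_xx(f) is estimated from Welch cross- and auto-spectra of a synthetic PEM channel and target channel, applied in the frequency domain, and the prediction is subtracted. All signals and the coupling filter are invented.

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(2)
fs, n = 4096, 2**18
witness = rng.normal(size=n)                      # PEM (environmental) channel
coupling = signal.firwin(129, 0.2)                # unknown coupling into the GW channel
target = signal.lfilter(coupling, 1.0, witness) + 0.5 * rng.normal(size=n)

# Estimate spectra and form the Wiener filter W(f) = S_xy / S_xx.
f, s_xx = signal.welch(witness, fs=fs, nperseg=4096)
_, s_xy = signal.csd(witness, target, fs=fs, nperseg=4096)
w = s_xy / s_xx

# Interpolate the filter onto the FFT grid, apply it, and subtract the prediction.
freqs = np.fft.rfftfreq(n, 1 / fs)
W_full = np.interp(freqs, f, w.real) + 1j * np.interp(freqs, f, w.imag)
prediction = np.fft.irfft(W_full * np.fft.rfft(witness), n)
residual = target - prediction

print("target RMS   :", np.std(target))
print("residual RMS :", np.std(residual))         # should drop toward the 0.5 noise floor
```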

  20. Normative data for distal line bisection and baking tray task.

    Science.gov (United States)

    Facchin, Alessio; Beschin, Nicoletta; Pisano, Alessia; Reverberi, Cristina

    2016-09-01

    Line bisection is one of the tests used to diagnose unilateral spatial neglect (USN). Despite its wide application, no procedure or norms were available for the distal variant when the task was performed at distance with a laser pointer. Furthermore, the baking tray task was an ecological test aimed at diagnosing USN in a more natural context. The aim of this study was to collect normative values for these two tests in an Italian population. We recruited a sample of 191 healthy subjects with ages ranging from 20 to 89 years. They performed line bisection with a laser pointer on three different line lengths (1, 1.5, and 2 m) at a distance of 3 m. After this task, the subjects performed the baking tray task and a second repetition of line bisection to test the reliability of measurement. Multiple regression analysis revealed no significant effects of demographic variables on the performance of both tests. Normative cut-off values for the two tests were developed using non-parametric tolerance intervals. The results formed the basis for clinical use of these two tools for assessing lateralized performance of patients with brain injury and for diagnosing USN.

  1. Modeling Governance KB with CATPCA to Overcome Multicollinearity in the Logistic Regression

    Science.gov (United States)

    Khikmah, L.; Wijayanto, H.; Syafitri, U. D.

    2017-04-01

    A problem often encountered in logistic regression modeling is multicollinearity. Multicollinearity between explanatory variables results in biased parameter estimates and in classification errors. In general, stepwise regression is used to overcome multicollinearity in regression. There is also another method, which involves all variables in the prediction: Principal Component Analysis (PCA). However, classical PCA is only for numeric data; when the data are categorical, one method to solve the problem is Categorical Principal Component Analysis (CATPCA). The data used in this research were a part of the Demographic and Population Survey Indonesia (IDHS) 2012. This research focuses on the characteristics of women using contraceptive methods. Classification results were evaluated using Area Under Curve (AUC) values; the higher the AUC value, the better. Based on AUC values, the classification of the contraceptive method using the stepwise method (58.66%) is better than the logistic regression model (57.39%) and CATPCA (57.39%). Evaluation of the logistic regression results using sensitivity shows the opposite: the CATPCA method (99.79%) is better than the logistic regression method (92.43%) and stepwise (92.05%). Since this study focuses on classifying the major class (using a contraceptive method), the selected model is CATPCA because it raises the accuracy of the model for the major class.
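
    Scikit-learn has no CATPCA, so the sketch below approximates the idea by dummy-coding categorical predictors, compressing the collinear dummies with ordinary PCA, and then fitting logistic regression, with AUC as the evaluation metric; the data, the number of components and the variable names are invented assumptions, and optimal scaling (the defining feature of CATPCA) is not performed.

```python
import numpy as np
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for survey-style categorical predictors and a binary outcome.
rng = np.random.default_rng(3)
n = 1000
raw = pd.DataFrame({f"v{j}": rng.integers(0, 4, n).astype(str) for j in range(6)})
logit = (raw["v0"] == "1").astype(float) + (raw["v1"] == "2").astype(float) - 1.0
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

# Dummy-code the categories, then compress the (collinear) dummies with PCA.
X = pd.get_dummies(raw).to_numpy(dtype=float)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

pca = PCA(n_components=8).fit(X_tr)
clf = LogisticRegression(max_iter=1000).fit(pca.transform(X_tr), y_tr)

auc = roc_auc_score(y_te, clf.predict_proba(pca.transform(X_te))[:, 1])
print(f"AUC on held-out data: {auc:.3f}")
```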

  2. High-resolution Laboratory Measurements of Coronal Lines near the Fe IX Line at 171 Å

    Science.gov (United States)

    Beiersdorfer, Peter; Träbert, Elmar

    2018-02-01

    We present high-resolution laboratory measurements in the spectral region between 165 and 175 Å that focus on the emission from various ions of C, O, F, Ne, S, Ar, Fe, and Ni. This wavelength region is centered on the λ171 Fe IX channel of the Atmospheric Imaging Assembly on the Solar Dynamics Observatory, and we place special emphasis on the weaker emission lines of Fe IX predicted in this region. In general, our measurements show a multitude of weak lines missing in the current databases, where the emission lines of Ni are probably most in need of further identification and reclassification. We also find that the wavelengths of some of the known lines need updating. Using the multi-reference Møller–Plesset method for wavelength predictions and collisional-radiative modeling of the line intensities, we have made tentative assignments of more than a dozen lines to the spectrum of Fe IX, some of which have formerly been identified as Fe VII, Fe XIV, or Fe XVI lines. Several Fe features remain unassigned, although they appear to be either Fe VII or Fe X lines. Further work will be needed to complete and correct the spectral line lists in this wavelength region.

  3. Comparing spatial regression to random forests for large ...

    Science.gov (United States)

    Environmental data may be “large” due to number of records, number of covariates, or both. Random forests has a reputation for good predictive performance when using many covariates, whereas spatial regression, when using reduced rank methods, has a reputation for good predictive performance when using many records. In this study, we compare these two techniques using a data set containing the macroinvertebrate multimetric index (MMI) at 1859 stream sites with over 200 landscape covariates. Our primary goal is predicting MMI at over 1.1 million perennial stream reaches across the USA. For spatial regression modeling, we develop two new methods to accommodate large data: (1) a procedure that estimates optimal Box-Cox transformations to linearize covariate relationships; and (2) a computationally efficient covariate selection routine that takes into account spatial autocorrelation. We show that our new methods lead to cross-validated performance similar to random forests, but that there is an advantage for spatial regression when quantifying the uncertainty of the predictions. Simulations are used to clarify advantages for each method. This research investigates different approaches for modeling and mapping national stream condition. We use MMI data from the EPA's National Rivers and Streams Assessment and predictors from StreamCat (Hill et al., 2015). Previous studies have focused on modeling the MMI condition classes (i.e., good, fair, and po

  4. On the null distribution of Bayes factors in linear regression

    Science.gov (United States)

    We show that under the null, the 2 log (Bayes factor) is asymptotically distributed as a weighted sum of chi-squared random variables with a shifted mean. This claim holds for Bayesian multi-linear regression with a family of conjugate priors, namely, the normal-inverse-gamma prior, the g-prior, and...

  5. No rationale for 1 variable per 10 events criterion for binary logistic regression analysis.

    Science.gov (United States)

    van Smeden, Maarten; de Groot, Joris A H; Moons, Karel G M; Collins, Gary S; Altman, Douglas G; Eijkemans, Marinus J C; Reitsma, Johannes B

    2016-11-24

    Ten events per variable (EPV) is a widely advocated minimal criterion for sample size considerations in logistic regression analysis. Of three previous simulation studies that examined this minimal EPV criterion, only one supports the use of a minimum of 10 EPV. In this paper, we examine the reasons for substantial differences between these extensive simulation studies. The current study uses Monte Carlo simulations to evaluate small sample bias, coverage of confidence intervals and mean square error of logit coefficients. Logistic regression models fitted by maximum likelihood and a modified estimation procedure, known as Firth's correction, are compared. The results show that besides EPV, the problems associated with low EPV depend on other factors such as the total sample size. It is also demonstrated that simulation results can be dominated by even a few simulated data sets for which the prediction of the outcome by the covariates is perfect ('separation'). We reveal that different approaches for identifying and handling separation lead to substantially different simulation results. We further show that Firth's correction can be used to improve the accuracy of regression coefficients and alleviate the problems associated with separation. The current evidence supporting EPV rules for binary logistic regression is weak. Given our findings, there is an urgent need for new research to provide guidance for supporting sample size considerations for binary logistic regression analysis.
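
    A toy Monte Carlo in the spirit of such EPV studies, not the authors' simulation design: maximum-likelihood logistic regressions are fitted on repeated small samples and the average bias of one slope is reported; coefficient values, sample sizes and replication counts are arbitrary, and non-converged or separated samples are simply skipped.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
true_beta = np.array([-2.0, 0.5, 0.5, 0.5, 0.5, 0.5])   # intercept + 5 covariates
n_covariates = 5

def simulate_bias(n_obs, n_rep=200):
    """Average bias of the first slope estimate for logistic MLE at a given sample size."""
    estimates = []
    for _ in range(n_rep):
        X = sm.add_constant(rng.normal(size=(n_obs, n_covariates)))
        p = 1.0 / (1.0 + np.exp(-X @ true_beta))
        y = rng.binomial(1, p)
        try:
            fit = sm.Logit(y, X).fit(disp=0)
            estimates.append(fit.params[1])
        except Exception:          # skip non-converged / separated samples
            continue
    return np.mean(estimates) - true_beta[1], len(estimates)

for n_obs in (60, 120, 500):       # roughly increasing events per variable
    bias, kept = simulate_bias(n_obs)
    print(f"n = {n_obs:4d}: mean bias of beta_1 = {bias:+.3f} ({kept} converged replicates)")
```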

  6. Cluster Analysis of Maize Inbred Lines

    Directory of Open Access Journals (Sweden)

    Jiban Shrestha

    2016-12-01

    Full Text Available The determination of diversity among inbred lines is important for heterosis breeding. Sixty maize inbred lines were evaluated for eight agro-morphological traits during the winter season of 2011 to analyze their genetic diversity. Clustering was done by the average linkage method. The inbred lines were grouped into six clusters. Inbred lines grouped into cluster II had taller plants with the maximum number of leaves. Cluster III was characterized by shorter plants with the minimum number of leaves. The inbred lines in cluster V flowered early, whereas those in cluster VI flowered late. The inbred lines grouped into cluster III were characterized by higher values of the anthesis-silking interval (ASI), and those of cluster VI had lower values of ASI. These results showed that inbred lines from widely divergent clusters can be utilized in a hybrid breeding programme.
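
    A hedged sketch of the clustering step (illustrative random traits, not the study's data): traits are standardized, average-linkage hierarchical clustering is applied, and the dendrogram is cut into six groups as in the study.

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import pdist
from scipy.stats import zscore

# Illustrative trait matrix: 60 inbred lines x 8 agro-morphological traits.
rng = np.random.default_rng(5)
traits = rng.normal(size=(60, 8))

# Standardize traits, build the distance matrix, and cluster by average linkage.
distances = pdist(zscore(traits), metric="euclidean")
tree = linkage(distances, method="average")

# Cut the dendrogram into six groups, as in the study.
groups = fcluster(tree, t=6, criterion="maxclust")
for g in np.unique(groups):
    print(f"cluster {g}: {np.sum(groups == g)} inbred lines")
```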

  7. Evaluation of syngas production unit cost of bio-gasification facility using regression analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Deng, Yangyang; Parajuli, Prem B.

    2011-08-10

    Evaluating the economic feasibility of a bio-gasification facility requires an understanding of its unit cost under different production capacities. The objective of this study was to evaluate the unit cost of syngas production at capacities from 60 through 1800 Nm3/h using an economic model with three regression analysis techniques (simple regression, reciprocal regression, and log-log regression). The preliminary result of this study showed that the reciprocal regression technique gave the best-fit curve between unit cost and production capacity, with a sum of error squares (SES) lower than 0.001 and a coefficient of determination (R2) of 0.996. The regression analysis determined a minimum unit cost of syngas production for micro-scale bio-gasification facilities of $0.052/Nm3, at a capacity of 2,880 Nm3/h. The results of this study suggest that to reduce cost, facilities should run at a high production capacity. In addition, the contribution of this technique could be a new categorical criterion for evaluating micro-scale bio-gasification facilities from the perspective of economic analysis.
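
    As a hedged illustration of the reciprocal and log-log fits (the study's economic model output is not reproduced; the capacity-cost data below are invented), scipy's linregress is applied to transformed variables.

```python
import numpy as np
from scipy import stats

# Illustrative capacity (Nm3/h) and unit-cost data, not the study's economic model output.
capacity = np.array([60, 150, 300, 600, 900, 1200, 1500, 1800], dtype=float)
unit_cost = 0.05 + 25.0 / capacity + np.random.default_rng(6).normal(0, 0.005, capacity.size)

# Reciprocal regression: cost = a + b * (1 / capacity)
rec = stats.linregress(1.0 / capacity, unit_cost)

# Log-log regression: log(cost) = a + b * log(capacity)
loglog = stats.linregress(np.log(capacity), np.log(unit_cost))

print(f"reciprocal : cost = {rec.intercept:.4f} + {rec.slope:.2f}/capacity,  R2 = {rec.rvalue**2:.4f}")
print(f"log-log    : ln(cost) = {loglog.intercept:.3f} + {loglog.slope:.3f}*ln(capacity),  R2 = {loglog.rvalue**2:.4f}")
```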

  8. Robust Locally Weighted Regression For Ground Surface Extraction In Mobile Laser Scanning 3D Data

    Directory of Open Access Journals (Sweden)

    A. Nurunnabi

    2013-10-01

    Full Text Available A new robust method for ground surface extraction from mobile laser scanning 3D point cloud data is proposed in this paper. Fitting polynomials through 2D/3D points is one of the well-known methods for filtering ground points, but unorganized point clouds by nature consist of multiple complex structures, so they are not suitable for fitting a parametric global model. The aim of this research is to develop and implement an algorithm to classify ground and non-ground points based on statistically robust locally weighted regression, which fits a regression surface (a line in 2D) without any predefined global functional relation among the variables of interest. Afterwards, the z (elevation) values are robustly down-weighted based on the residuals of the fitted points. The new set of down-weighted z-values, along with the x (or y) values, is used to obtain a new fit of the (lower) surface (line). The process of fitting and down-weighting continues until the difference between two consecutive fits is insignificant. The final fit then represents the ground level of the given point cloud, and the ground surface points can be extracted. The performance of the new method has been demonstrated on vehicle-based mobile laser scanning 3D point cloud data from urban areas which include problematic objects such as short walls, large buildings, electric poles, sign posts and cars. The method has potential in areas like building/construction footprint determination, 3D city modelling, corridor mapping and asset management.
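
    A hedged 2D sketch of this kind of robust locally weighted fit, using the statsmodels LOWESS implementation with robustifying iterations as a stand-in for the paper's algorithm (standard LOWESS down-weights outliers symmetrically rather than only toward the lower surface) and a simple residual threshold to label ground points; the profile is synthetic.

```python
import numpy as np
from statsmodels.nonparametric.smoothers_lowess import lowess

# Synthetic 2D profile: x = along-track distance, z = elevation, with "objects" above ground.
rng = np.random.default_rng(7)
x = np.sort(rng.uniform(0, 100, 2000))
ground = 0.02 * x + 0.5 * np.sin(x / 10.0)
z = ground + rng.normal(0, 0.05, x.size)
objects = rng.random(x.size) < 0.15                 # cars, poles, walls etc.
z[objects] += rng.uniform(0.5, 4.0, objects.sum())

# Robust locally weighted regression: residual-based down-weighting over several iterations.
fit = lowess(z, x, frac=0.1, it=5, return_sorted=False)

# Points far above the fitted surface are labelled non-ground.
residuals = z - fit
is_ground = residuals < 0.2                         # threshold chosen for this synthetic scene
print(f"ground points: {is_ground.sum()} / {x.size}")
```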

  9. Forecasting with Dynamic Regression Models

    CERN Document Server

    Pankratz, Alan

    2012-01-01

    Single-equation regression models, one of the most widely used tools in statistical forecasting, are examined here. A companion to the author's earlier work, Forecasting with Univariate Box-Jenkins Models: Concepts and Cases, the present text pulls together recent time series ideas and gives special attention to possible intertemporal patterns, distributed lag responses of output to input series, and the autocorrelation patterns of regression disturbances. It also includes six case studies.

  10. Mice lacking Ras-GRF1 show contextual fear conditioning but not spatial memory impairments: convergent evidence from two independently generated mouse mutant lines

    Directory of Open Access Journals (Sweden)

    Raffaele ed'Isa

    2011-12-01

    Full Text Available Ras-GRF1 is a neuronal specific guanine exchange factor that, once activated by both ionotropic and metabotropic neurotransmitter receptors, can stimulate Ras proteins, leading to long-term phosphorylation of downstream signaling. The two available reports on the behavior of two independently generated Ras-GRF1 deficient mouse lines provide contrasting evidence on the role of Ras-GRF1 in spatial memory and contextual fear conditioning. These discrepancies may be due to the distinct alterations introduced in the mouse genome by gene targeting in the two lines that could differentially affect expression of nearby genes located in the imprinted region containing the Ras-grf1 locus. In order to determine the real contribution of Ras-GRF1 to spatial memory we compared in Morris Water Maze learning the Brambilla’s mice with a third mouse line (GENA53 in which a nonsense mutation was introduced in the Ras-GRF1 coding region without additional changes in the genome and we found that memory in this task is normal. Also, we measured both contextual and cued fear conditioning, which were previously reported to be affected in the Brambilla’s mice, and we confirmed that contextual learning but not cued conditioning is impaired in both mouse lines. In addition, we also tested both lines for the first time in conditioned place aversion in the Intellicage, an ecological and remotely controlled behavioral test, and we observed normal learning. Finally, based on previous reports of other mutant lines suggesting that Ras-GRF1 may control body weight, we also measured this non-cognitive phenotype and we confirmed that both Ras-GRF1 deficient mutants are smaller than their control littermates. In conclusion, we demonstrate that Ras-GRF1 has no unique role in spatial memory while its function in contextual fear conditioning is likely to be due not only to its involvement in amygdalar functions but possibly to some distinct hippocampal connections specific to

  11. CPR in medical TV shows: non-health care student perspective.

    Science.gov (United States)

    Alismail, Abdullah; Meyer, Nicole C; Almutairi, Waleed; Daher, Noha S

    2018-01-01

    There are over a dozen medical shows airing on television, many of them during prime time. Researchers have recently become more interested in the role of these shows in raising awareness of cardiopulmonary resuscitation. Several cases have been reported in which a lay person resuscitated a family member using medical TV shows as a reference. The purpose of this study is to examine and evaluate college students' perception of cardiopulmonary resuscitation, and of when to shock using an automated external defibrillator, based on their experience of watching medical TV shows. A total of 170 students (non-medical majors) were surveyed in four different colleges in the United States. The survey consisted of questions that reflect their perception and the knowledge acquired from watching medical TV shows. A stepwise regression was used to determine the significant predictors of "How often do you watch medical drama TV shows", in addition to chi-square analysis for nominal variables. The regression model showed a significant effect: TV shows changed students' perception positively (p<0.001), and students were more likely to select shock on asystole as the frequency of watching increased (p=0.023). The findings of this study show that a high percentage of non-medical college students are influenced significantly by medical shows. One particular influence is the false belief about when a shock using the automated external defibrillator (AED) is appropriate, as it is portrayed incorrectly in most medical shows. This finding raises a concern about how these shows portray basic life support, especially when they do not follow American Heart Association (AHA) guidelines. We recommend that the medical advisors of these shows use AHA guidelines, and that the AHA expand its expenditures to include medical shows in order to educate the public on the appropriate actions to rescue an out-of-hospital cardiac arrest patient.

  12. Linear Regression Based Real-Time Filtering

    Directory of Open Access Journals (Sweden)

    Misel Batmend

    2013-01-01

    Full Text Available This paper introduces a real-time filtering method based on a linear least-squares fitted line. The method can be used when the filtered signal is linear, a constraint that narrows the band of potential applications. Its advantage over the Kalman filter is that it is computationally less expensive. The paper further deals with the application of the introduced method to filtering data used to evaluate the position of engraved material with respect to the engraving machine. The filter was implemented in the CNC engraving machine control system. Experiments showing its performance are included.
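
    A minimal sketch of the idea, assuming a sliding window of recent samples (the paper's exact window handling and CNC integration are not reproduced): a least-squares line is refitted on each new sample and evaluated at the newest time stamp.

```python
from collections import deque

import numpy as np

class LineFitFilter:
    """Sliding-window filter: fit a least-squares line to the last `window` samples
    and return the fitted value at the newest sample (a rough sketch of the idea,
    not the paper's CNC implementation)."""

    def __init__(self, window: int = 20):
        self.t = deque(maxlen=window)
        self.y = deque(maxlen=window)

    def update(self, t_new: float, y_new: float) -> float:
        self.t.append(t_new)
        self.y.append(y_new)
        if len(self.y) < 2:
            return y_new
        slope, intercept = np.polyfit(np.array(self.t), np.array(self.y), deg=1)
        return slope * t_new + intercept

# Noisy ramp signal, as when tracking a linearly moving workpiece position.
rng = np.random.default_rng(8)
filt = LineFitFilter(window=20)
for k in range(100):
    measured = 0.5 * k + rng.normal(0, 0.3)
    filtered = filt.update(k, measured)
print(f"last measured value : {measured:.2f}")
print(f"last filtered value : {filtered:.2f}")
```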

  13. Regression Models and Fuzzy Logic Prediction of TBM Penetration Rate

    Directory of Open Access Journals (Sweden)

    Minh Vu Trieu

    2017-03-01

    Full Text Available This paper presents statistical analyses of rock engineering properties and the measured penetration rate of tunnel boring machine (TBM based on the data of an actual project. The aim of this study is to analyze the influence of rock engineering properties including uniaxial compressive strength (UCS, Brazilian tensile strength (BTS, rock brittleness index (BI, the distance between planes of weakness (DPW, and the alpha angle (Alpha between the tunnel axis and the planes of weakness on the TBM rate of penetration (ROP. Four (4 statistical regression models (two linear and two nonlinear are built to predict the ROP of TBM. Finally a fuzzy logic model is developed as an alternative method and compared to the four statistical regression models. Results show that the fuzzy logic model provides better estimations and can be applied to predict the TBM performance. The R-squared value (R2 of the fuzzy logic model scores the highest value of 0.714 over the second runner-up of 0.667 from the multiple variables nonlinear regression model.

  14. Regression Models and Fuzzy Logic Prediction of TBM Penetration Rate

    Science.gov (United States)

    Minh, Vu Trieu; Katushin, Dmitri; Antonov, Maksim; Veinthal, Renno

    2017-03-01

    This paper presents statistical analyses of rock engineering properties and the measured penetration rate of tunnel boring machine (TBM) based on the data of an actual project. The aim of this study is to analyze the influence of rock engineering properties including uniaxial compressive strength (UCS), Brazilian tensile strength (BTS), rock brittleness index (BI), the distance between planes of weakness (DPW), and the alpha angle (Alpha) between the tunnel axis and the planes of weakness on the TBM rate of penetration (ROP). Four (4) statistical regression models (two linear and two nonlinear) are built to predict the ROP of TBM. Finally a fuzzy logic model is developed as an alternative method and compared to the four statistical regression models. Results show that the fuzzy logic model provides better estimations and can be applied to predict the TBM performance. The R-squared value (R2) of the fuzzy logic model scores the highest value of 0.714 over the second runner-up of 0.667 from the multiple variables nonlinear regression model.
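
    Since the paper's fuzzy rule base and data are not reproduced here, the sketch below fits only the simplest of the models mentioned, a multiple linear regression of ROP on the five rock properties, and reports R2 on a held-out split; the data are synthetic placeholders with invented coefficient values.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

# Synthetic stand-ins for UCS, BTS, BI, DPW and alpha angle versus penetration rate.
rng = np.random.default_rng(9)
n = 150
X = np.column_stack([
    rng.uniform(30, 200, n),    # UCS (MPa)
    rng.uniform(2, 15, n),      # BTS (MPa)
    rng.uniform(10, 40, n),     # brittleness index
    rng.uniform(0.05, 2.0, n),  # distance between planes of weakness (m)
    rng.uniform(0, 90, n),      # alpha angle (deg)
])
rop = 4.0 - 0.01 * X[:, 0] + 0.05 * X[:, 2] + rng.normal(0, 0.3, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, rop, test_size=0.3, random_state=0)
model = LinearRegression().fit(X_tr, y_tr)
print(f"R2 on held-out data: {r2_score(y_te, model.predict(X_te)):.3f}")
```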

  15. A study of machine learning regression methods for major elemental analysis of rocks using laser-induced breakdown spectroscopy

    International Nuclear Information System (INIS)

    Boucher, Thomas F.; Ozanne, Marie V.; Carmosino, Marco L.; Dyar, M. Darby; Mahadevan, Sridhar; Breves, Elly A.; Lepore, Kate H.; Clegg, Samuel M.

    2015-01-01

    The ChemCam instrument on the Mars Curiosity rover is generating thousands of LIBS spectra and bringing interest in this technique to public attention. The key to interpreting Mars or any other types of LIBS data are calibrations that relate laboratory standards to unknowns examined in other settings and enable predictions of chemical composition. Here, LIBS spectral data are analyzed using linear regression methods including partial least squares (PLS-1 and PLS-2), principal component regression (PCR), least absolute shrinkage and selection operator (lasso), elastic net, and linear support vector regression (SVR-Lin). These were compared against results from nonlinear regression methods including kernel principal component regression (K-PCR), polynomial kernel support vector regression (SVR-Py) and k-nearest neighbor (kNN) regression to discern the most effective models for interpreting chemical abundances from LIBS spectra of geological samples. The results were evaluated for 100 samples analyzed with 50 laser pulses at each of five locations averaged together. Wilcoxon signed-rank tests were employed to evaluate the statistical significance of differences among the nine models using their predicted residual sum of squares (PRESS) to make comparisons. For MgO, SiO2, Fe2O3, CaO, and MnO, the sparse models outperform all the others except for linear SVR, while for Na2O, K2O, TiO2, and P2O5, the sparse methods produce inferior results, likely because their emission lines in this energy range have lower transition probabilities. The strong performance of the sparse methods in this study suggests that use of dimensionality-reduction techniques as a preprocessing step may improve the performance of the linear models. Nonlinear methods tend to overfit the data and predict less accurately, while the linear methods proved to be more generalizable with better predictive performance. These results are attributed to the high dimensionality of the data (6144
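
    A compact cross-validated comparison of a few of the model families named above (lasso, PLS, linear SVR, kNN) on synthetic high-dimensional "spectra"; this is an illustrative sketch, not the ChemCam calibration, and the hyperparameters are arbitrary.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.linear_model import Lasso
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsRegressor
from sklearn.svm import LinearSVR

# Synthetic "spectra": 100 samples x 2000 channels, composition driven by a few channels.
rng = np.random.default_rng(10)
X = rng.normal(size=(100, 2000))
y = 3.0 * X[:, 50] + 2.0 * X[:, 400] - 1.5 * X[:, 1200] + rng.normal(0, 0.5, 100)

models = {
    "lasso":      Lasso(alpha=0.1),
    "PLS (10)":   PLSRegression(n_components=10),
    "linear SVR": LinearSVR(C=1.0, max_iter=10000),
    "kNN (5)":    KNeighborsRegressor(n_neighbors=5),
}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5, scoring="neg_root_mean_squared_error")
    print(f"{name:11s}: CV RMSE = {-scores.mean():.3f}")
```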

  16. Multiple regression approach to predict turbine-generator output for Chinshan nuclear power plant

    International Nuclear Information System (INIS)

    Chan, Yea-Kuang; Tsai, Yu-Ching

    2017-01-01

    The objective of this study is to develop a turbine cycle model using the multiple regression approach to estimate the turbine-generator output for the Chinshan Nuclear Power Plant (NPP). The plant operating data was verified using a linear regression model with a corresponding 95% confidence interval for the operating data. In this study, the key parameters were selected as inputs for the multiple regression based turbine cycle model. The proposed model was used to estimate the turbine-generator output. The effectiveness of the proposed turbine cycle model was demonstrated by using plant operating data obtained from the Chinshan NPP Unit 2. The results show that this multiple regression based turbine cycle model can be used to accurately estimate the turbine-generator output. In addition, this study also provides an alternative approach with simple and easy features to evaluate the thermal performance for nuclear power plants.

  17. Multiple regression approach to predict turbine-generator output for Chinshan nuclear power plant

    Energy Technology Data Exchange (ETDEWEB)

    Chan, Yea-Kuang; Tsai, Yu-Ching [Institute of Nuclear Energy Research, Taoyuan City, Taiwan (China). Nuclear Engineering Division

    2017-03-15

    The objective of this study is to develop a turbine cycle model using the multiple regression approach to estimate the turbine-generator output for the Chinshan Nuclear Power Plant (NPP). The plant operating data was verified using a linear regression model with a corresponding 95% confidence interval for the operating data. In this study, the key parameters were selected as inputs for the multiple regression based turbine cycle model. The proposed model was used to estimate the turbine-generator output. The effectiveness of the proposed turbine cycle model was demonstrated by using plant operating data obtained from the Chinshan NPP Unit 2. The results show that this multiple regression based turbine cycle model can be used to accurately estimate the turbine-generator output. In addition, this study also provides an alternative approach with simple and easy features to evaluate the thermal performance for nuclear power plants.
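
    As a hedged sketch of a multiple-regression turbine-cycle model with interval output (not the Chinshan model or its actual input parameters), the code below fits an OLS model of generator output on a few invented operating parameters and reports a 95% prediction interval for a new operating point.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Invented operating data: output (MWe) versus a few turbine-cycle parameters.
rng = np.random.default_rng(11)
n = 300
df = pd.DataFrame({
    "steam_flow": rng.uniform(90, 100, n),       # % of rated main steam flow
    "condenser_vacuum": rng.uniform(92, 98, n),  # %
    "feedwater_temp": rng.uniform(180, 220, n),  # degC
})
df["output"] = (6.0 * df["steam_flow"] + 1.5 * df["condenser_vacuum"]
                + 0.2 * df["feedwater_temp"] + rng.normal(0, 2.0, n))

model = smf.ols("output ~ steam_flow + condenser_vacuum + feedwater_temp", data=df).fit()

# Estimate output for a new operating point with a 95% prediction interval.
new_point = pd.DataFrame({"steam_flow": [98.0], "condenser_vacuum": [96.0],
                          "feedwater_temp": [205.0]})
pred = model.get_prediction(new_point).summary_frame(alpha=0.05)
print(pred[["mean", "obs_ci_lower", "obs_ci_upper"]])
```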

  18. Data processing for potentiometric precipitation titration of mixtures of isovalent ions by linear regression analysis

    International Nuclear Information System (INIS)

    Mar'yanov, B.M.; Shumar, S.V.; Gavrilenko, M.A.

    1994-01-01

    A method is developed for the computer processing of potentiometric differential titration curves based on precipitation reactions. The method transforms the titration curve into a multiphase (segmented) regression line whose parameters determine the equivalence points and the solubility products of the precipitates formed. The computational algorithm is tested on experimental curves for the titration of solutions containing Hg(II) and Cd(II) with a solution of sodium diethyldithiocarbamate. The random errors (RSD) for the titration of 1×10⁻⁴ M solutions are in the range of 3-6%. 7 refs.; 2 figs.; 1 tab
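
    A much-simplified sketch of segmented-regression equivalence-point detection (two segments only; the source method handles multiple phases and also extracts solubility products): the breakpoint of a two-line least-squares fit is found by exhaustive search over a synthetic titration-like curve.

```python
import numpy as np

def two_segment_fit(x, y):
    """Fit y with two joined straight lines; return the breakpoint minimizing total SSE."""
    best = (np.inf, None)
    for i in range(2, len(x) - 2):                      # candidate breakpoints
        left = np.polyfit(x[:i], y[:i], 1)
        right = np.polyfit(x[i:], y[i:], 1)
        sse = (np.sum((np.polyval(left, x[:i]) - y[:i]) ** 2)
               + np.sum((np.polyval(right, x[i:]) - y[i:]) ** 2))
        if sse < best[0]:
            best = (sse, x[i])
    return best[1]

# Synthetic titration-like curve: potential vs. titrant volume with a slope change
# at the equivalence point (here placed at 5.0 mL).
rng = np.random.default_rng(12)
volume = np.linspace(0, 10, 60)
potential = np.where(volume < 5.0, 20 * volume, 100 + 4 * (volume - 5.0))
potential += rng.normal(0, 2.0, volume.size)

print(f"estimated equivalence point: {two_segment_fit(volume, potential):.2f} mL")
```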

  19. FREQFIT: Computer program which performs numerical regression and statistical chi-squared goodness of fit analysis

    International Nuclear Information System (INIS)

    Hofland, G.S.; Barton, C.C.

    1990-01-01

    The computer program FREQFIT is designed to perform regression and statistical chi-squared goodness of fit analysis on one-dimensional or two-dimensional data. The program features an interactive user dialogue, numerous help messages, an option for screen or line printer output, and the flexibility to use practically any commercially available graphics package to create plots of the program's results. FREQFIT is written in Microsoft QuickBASIC, for IBM-PC compatible computers. A listing of the QuickBASIC source code for the FREQFIT program, a user manual, and sample input data, output, and plots are included. 6 refs., 1 fig

  20. Gibrat’s law and quantile regressions

    DEFF Research Database (Denmark)

    Distante, Roberta; Petrella, Ivan; Santoro, Emiliano

    2017-01-01

    The nexus between firm growth, size and age in U.S. manufacturing is examined through the lens of quantile regression models. This methodology allows us to overcome serious shortcomings entailed by linear regression models employed by much of the existing literature, unveiling a number of important...

  1. ON REGRESSION REPRESENTATIONS OF STOCHASTIC-PROCESSES

    NARCIS (Netherlands)

    RUSCHENDORF, L; DEVALK

    We construct a.s. nonlinear regression representations of general stochastic processes (X(n))n is-an-element-of N. As a consequence we obtain in particular special regression representations of Markov chains and of certain m-dependent sequences. For m-dependent sequences we obtain a constructive

  2. Introduction to the use of regression models in epidemiology.

    Science.gov (United States)

    Bender, Ralf

    2009-01-01

    Regression modeling is one of the most important statistical techniques used in analytical epidemiology. By means of regression models the effect of one or several explanatory variables (e.g., exposures, subject characteristics, risk factors) on a response variable such as mortality or cancer can be investigated. From multiple regression models, adjusted effect estimates can be obtained that take the effect of potential confounders into account. Regression methods can be applied in all epidemiologic study designs so that they represent a universal tool for data analysis in epidemiology. Different kinds of regression models have been developed in dependence on the measurement scale of the response variable and the study design. The most important methods are linear regression for continuous outcomes, logistic regression for binary outcomes, Cox regression for time-to-event data, and Poisson regression for frequencies and rates. This chapter provides a nontechnical introduction to these regression models with illustrating examples from cancer research.

  3. Ethanolic extract of Artemisia aucheri induces regression of aorta wall fatty streaks in hypercholesterolemic rabbits.

    Science.gov (United States)

    Asgary, S; Dinani, N Jafari; Madani, H; Mahzouni, P

    2008-05-01

    Artemisia aucheri is a native-growing plant which is widely used in Iranian traditional medicine. This study was designed to evaluate the effects of A. aucheri on the regression of atherosclerosis in hypercholesterolemic rabbits. Twenty-five rabbits were randomly divided into five groups of five each and treated for 3 months as follows: 1: normal diet, 2: hypercholesterolemic diet (HCD), 3 and 4: HCD for 60 days and then normal diet and normal diet + A. aucheri (100 mg/kg/day), respectively, for an additional 30 days (regression period). In the regression period, dietary use of A. aucheri in group 4 significantly decreased total cholesterol, triglycerides and LDL-cholesterol, while HDL-cholesterol was significantly increased. The atherosclerotic area was significantly decreased in this group. Animals which received only the normal diet in the regression period showed no regression but rather progression of atherosclerosis. These findings suggest that A. aucheri may cause regression of atherosclerotic lesions.

  4. From Rasch scores to regression

    DEFF Research Database (Denmark)

    Christensen, Karl Bang

    2006-01-01

    Rasch models provide a framework for measurement and modelling latent variables. Having measured a latent variable in a population a comparison of groups will often be of interest. For this purpose the use of observed raw scores will often be inadequate because these lack interval scale propertie....... This paper compares two approaches to group comparison: linear regression models using estimated person locations as outcome variables and latent regression models based on the distribution of the score....

  5. Comparison of a neural network with multiple linear regression for quantitative analysis in ICP-atomic emission spectroscopy

    International Nuclear Information System (INIS)

    Schierle, C.; Otto, M.

    1992-01-01

    A two-layer perceptron with backpropagation of error is used for quantitative analysis in ICP-AES. The network was trained with emission spectra of two interfering lines of Cd and As, and the concentrations of both elements were subsequently estimated from mixture spectra. The spectra of the Cd and As lines were also used to perform multiple linear regression (MLR) via the calculation of the pseudoinverse S+ of the sensitivity matrix S. In the present paper it is shown that there exist close relations between the operation of the perceptron and the MLR procedure. These are most clearly apparent in the correlation between the weights of the backpropagation network and the elements of the pseudoinverse. Using MLR, the confidence intervals of the predictions are exploited to correct for the wavelength shift of the optical device. (orig.)
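
    A toy version of the MLR-via-pseudoinverse step (hypothetical overlapping line profiles, not the actual Cd/As calibration): concentrations are recovered from a mixture spectrum by applying the pseudoinverse S+ of a two-column sensitivity matrix S.

```python
import numpy as np

rng = np.random.default_rng(13)
channels = np.arange(64)

# Hypothetical pure-component emission profiles (sensitivity matrix S, columns = analytes).
line_a = np.exp(-(channels - 25) ** 2 / 8.0)      # analyte 1 ("Cd-like" line)
line_b = np.exp(-(channels - 30) ** 2 / 8.0)      # analyte 2 ("As-like" line), overlapping
S = np.column_stack([line_a, line_b])

# Mixture spectrum for true concentrations c = (2.0, 0.7), plus noise.
c_true = np.array([2.0, 0.7])
mixture = S @ c_true + rng.normal(0, 0.02, channels.size)

# Multiple linear regression via the pseudoinverse S+ of the sensitivity matrix.
S_pinv = np.linalg.pinv(S)
c_hat = S_pinv @ mixture
print("estimated concentrations:", np.round(c_hat, 3))
```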

  6. Complete regression of myocardial involvement associated with lymphoma following chemotherapy.

    Science.gov (United States)

    Vinicki, Juan Pablo; Cianciulli, Tomás F; Farace, Gustavo A; Saccheri, María C; Lax, Jorge A; Kazelian, Lucía R; Wachs, Adolfo

    2013-09-26

    Cardiac involvement as an initial presentation of malignant lymphoma is a rare occurrence. We describe the case of a 26 year old man who had initially been diagnosed with myocardial infiltration on an echocardiogram, presenting with a testicular mass and unilateral peripheral facial paralysis. On admission, electrocardiograms (ECG) revealed negative T-waves in all leads and ST-segment elevation in the inferior leads. On two-dimensional echocardiography, there was infiltration of the pericardium with mild effusion, infiltrative thickening of the aortic walls, both atria and the interatrial septum and a mildly depressed systolic function of both ventricles. An axillary biopsy was performed and reported as a T-cell lymphoblastic lymphoma (T-LBL). Following the diagnosis and staging, chemotherapy was started. Twenty-two days after finishing the first cycle of chemotherapy, the ECG showed regression of T-wave changes in all leads and normalization of the ST-segment elevation in the inferior leads. A follow-up Two-dimensional echocardiography confirmed regression of the myocardial infiltration. This case report illustrates a lymphoma presenting with testicular mass, unilateral peripheral facial paralysis and myocardial involvement, and demonstrates that regression of infiltration can be achieved by intensive chemotherapy treatment. To our knowledge, there are no reported cases of T-LBL presenting as a testicular mass and unilateral peripheral facial paralysis, with complete regression of myocardial involvement.

  7. Development of a Modified Kernel Regression Model for a Robust Signal Reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Ahmed, Ibrahim; Heo, Gyunyoung [Kyung Hee University, Yongin (Korea, Republic of)

    2016-10-15

    The demand for robust and resilient performance has led to the use of on-line monitoring techniques to monitor process parameters and validate signals. On-line monitoring and signal validation are two important terms in process and equipment monitoring. These techniques are automated methods of monitoring instrument performance while the plant is operating. To implement them, several empirical models are used. One of these models is the nonparametric regression model, otherwise known as kernel regression (KR). Unlike parametric models, KR is an algorithmic estimation procedure that assumes no significant parameters, and it needs no retraining after its development when new observations arrive, which is well suited to systems whose characteristics change due to ageing. Although KR performs excellently when applied to steady-state or normal operating data, it has limitations with time-varying data containing several repetitions of the same signal, especially if those signals are used to infer other signals. Conventional KR has difficulty correctly estimating the dependent variable when time-varying data with repeated values are used, especially in signal validation and monitoring. Therefore, we present in this work a modified KR that resolves this issue and is also feasible in the time domain. Data are first transformed prior to the Euclidean distance evaluation, taking into account their slopes/changes with respect to time. The performance of the developed model is evaluated and compared with that of conventional KR using both laboratory experimental data and real-time data from the CNS provided by KAERI. The results show that the proposed model, having demonstrated higher accuracy than conventional KR, is capable of resolving the identified limitation of conventional KR. We also discovered that there is still need to further
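
    For orientation, the sketch below implements plain Nadaraya-Watson kernel regression, the conventional KR baseline the paper modifies, and adds finite-difference slopes to the feature vector as a rough nod to the described slope-aware transformation; this augmentation is our reading of the idea, not the authors' exact procedure, and all signals are synthetic.

```python
import numpy as np

def kernel_regression(memory_X, memory_y, query, bandwidth=1.0):
    """Nadaraya-Watson estimate: Gaussian-weighted average of stored responses,
    weighted by Euclidean distance between the query and stored observations."""
    d2 = np.sum((memory_X - query) ** 2, axis=1)
    w = np.exp(-d2 / (2.0 * bandwidth ** 2))
    return np.sum(w * memory_y) / np.sum(w)

def add_slope_features(X, dt=1.0):
    """Augment each observation with its time derivative (finite difference),
    a rough nod to the paper's slope-aware transformation."""
    slopes = np.vstack([np.zeros((1, X.shape[1])), np.diff(X, axis=0)]) / dt
    return np.hstack([X, slopes])

# Synthetic training memory: two correlated sensor signals predicting a third.
rng = np.random.default_rng(14)
t = np.linspace(0, 20, 400)
X = np.column_stack([np.sin(t), np.cos(t)]) + rng.normal(0, 0.02, (t.size, 2))
y = 2.0 * np.sin(t) + 0.5

Xa = add_slope_features(X)
query = add_slope_features(X[100:102])[-1]       # a query vector built the same way
print(f"estimate : {kernel_regression(Xa, y, query, bandwidth=0.5):.3f}")
print(f"truth    : {y[101]:.3f}")
```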

  8. Producing The New Regressive Left

    DEFF Research Database (Denmark)

    Crone, Christine

    members, this thesis investigates a growing political trend and ideological discourse in the Arab world that I have called The New Regressive Left. On the premise that a media outlet can function as a forum for ideology production, the thesis argues that an analysis of this material can help to trace...... the contexture of The New Regressive Left. If the first part of the thesis lays out the theoretical approach and draws the contextual framework, through an exploration of the surrounding Arab media-and ideoscapes, the second part is an analytical investigation of the discourse that permeates the programmes aired...... becomes clear from the analytical chapters is the emergence of the new cross-ideological alliance of The New Regressive Left. This emerging coalition between Shia Muslims, religious minorities, parts of the Arab Left, secular cultural producers, and the remnants of the political,strategic resistance...

  9. An Analysis of Bank Service Satisfaction Based on Quantile Regression and Grey Relational Analysis

    Directory of Open Access Journals (Sweden)

    Wen-Tsao Pan

    2016-01-01

    Full Text Available Bank service satisfaction is vital to the success of a bank. In this paper, we propose to use the grey relational analysis to gauge the levels of service satisfaction of the banks. With the grey relational analysis, we compared the effects of different variables on service satisfaction. We gave ranks to the banks according to their levels of service satisfaction. We further used the quantile regression model to find the variables that affected the satisfaction of a customer at a specific quantile of satisfaction level. The result of the quantile regression analysis provided a bank manager with information to formulate policies to further promote satisfaction of the customers at different quantiles of satisfaction level. We also compared the prediction accuracies of the regression models at different quantiles. The experiment result showed that, among the seven quantile regression models, the median regression model has the best performance in terms of RMSE, RTIC, and CE performance measures.
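
    A small illustration of the quantile-regression side of such an analysis (the grey relational ranking is omitted): statsmodels QuantReg is fitted at several quantiles of an invented satisfaction data set, and the in-sample RMSE of each quantile model is compared, loosely mirroring the paper's comparison across quantiles.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Invented survey-style data: overall satisfaction vs. two service attributes.
rng = np.random.default_rng(15)
n = 400
df = pd.DataFrame({
    "wait_time": rng.uniform(1, 30, n),          # minutes
    "staff_courtesy": rng.uniform(1, 10, n),     # rating
})
df["satisfaction"] = (8.0 - 0.15 * df["wait_time"] + 0.4 * df["staff_courtesy"]
                      + rng.normal(0, 1.0, n))

mod = smf.quantreg("satisfaction ~ wait_time + staff_courtesy", data=df)
for q in (0.1, 0.25, 0.5, 0.75, 0.9):
    fit = mod.fit(q=q)
    rmse = np.sqrt(np.mean((df["satisfaction"] - fit.predict(df)) ** 2))
    print(f"quantile {q:.2f}: wait_time coef = {fit.params['wait_time']:+.3f}, RMSE = {rmse:.3f}")
```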

  10. Line profile variations in selected Seyfert galaxies

    International Nuclear Information System (INIS)

    Kollatschny, W; Zetzl, M; Ulbrich, K

    2010-01-01

    Continua as well as the broad emission lines in Seyfert 1 galaxies vary in different galaxies with different amplitudes on typical timescales of days to years. We present the results of two independent variability campaigns taken with the Hobby-Eberly Telescope. We studied in detail the integrated line and continuum variations in the optical spectra of the narrow-line Seyfert galaxy Mrk 110 and the very broad-line Seyfert galaxy Mrk 926. The broad-line emitting region in Mrk 110 has radii of four to 33 light-days as a function of the ionization degree of the emission lines. The line-profile variations are matched by Keplerian disk models with some accretion disk wind. The broad-line region in Mrk 926 is very small showing an extension of two to three light-days only. We could detect a structure in the rms line-profiles as well as in the response of the line profile segments of Mrk 926 indicating the BLR is structured.

  11. The Widom line of supercooled water

    International Nuclear Information System (INIS)

    Franzese, Giancarlo; Stanley, H Eugene

    2007-01-01

    Water can be supercooled to temperatures as low as -92 deg. C, the experimental crystal homogeneous nucleation temperature T H at 2 kbar. Within the supercooled liquid phase its response functions show an anomalous increase consistent with the presence of a liquid-liquid critical point located in a region inaccessible to experiments on bulk water. Recent experiments on the dynamics of confined water show that a possible way to understand the properties of water is to investigate the supercooled phase diagram in the vicinity of the Widom line (locus of maximum correlation length) that emanates from the hypothesized liquid-liquid critical point. Here we explore the Widom line for a Hamiltonian model of water using an analytic approach, and discuss the plausibility of the hypothesized liquid-liquid critical point, as well as its possible consequences, on the basis of the assumptions of the model. The present analysis allows us (i) to find an analytic expression for the spinodal line of the high-density liquid phase, with respect to the low-density liquid phase, showing that this line becomes flat in the P-T phase diagram in the physical limit of a large number of available orientations for the hydrogen bonds, as recently seen in simulations and experiments (Xu et al 2005 Proc. Natl Acad. Sci. 102 16558); (ii) to find an estimate of the values for the hypothesized liquid-liquid critical point coordinates that compare very well with Monte Carlo results; and (iii) to show how the Widom line can be located by studying the derivative of the probability of forming hydrogen bonds with local tetrahedral orientation which can be calculated analytically within this approach

  12. Mixture of Regression Models with Single-Index

    OpenAIRE

    Xiang, Sijia; Yao, Weixin

    2016-01-01

    In this article, we propose a class of semiparametric mixture regression models with single-index. We argue that many recently proposed semiparametric/nonparametric mixture regression models can be considered special cases of the proposed model. However, unlike existing semiparametric mixture regression models, the new proposed model can easily incorporate multivariate predictors into the nonparametric components. Backfitting estimates and the corresponding algorithms have been proposed for...

  13. Local bilinear multiple-output quantile/depth regression

    Czech Academy of Sciences Publication Activity Database

    Hallin, M.; Lu, Z.; Paindaveine, D.; Šiman, Miroslav

    2015-01-01

    Roč. 21, č. 3 (2015), s. 1435-1466 ISSN 1350-7265 R&D Projects: GA MŠk(CZ) 1M06047 Institutional support: RVO:67985556 Keywords : conditional depth * growth chart * halfspace depth * local bilinear regression * multivariate quantile * quantile regression * regression depth Subject RIV: BA - General Mathematics Impact factor: 1.372, year: 2015 http://library.utia.cas.cz/separaty/2015/SI/siman-0446857.pdf

  14. QSOs with narrow emission lines

    International Nuclear Information System (INIS)

    Baldwin, J.A.; Mcmahon, R.; Hazard, C.; Williams, R.E.

    1988-01-01

    Observations of two new high-redshift, narrow-lined QSOs (NLQSOs) are presented and discussed together with observations of similar objects reported in the literature. Gravitational lensing is ruled out as a possible means of amplifying the luminosity for one of these objects. It is found that the NLQSOs have broad bases on their emission lines as well as the prominent narrow cores which define this class. Thus, these are not pole-on QSOs. The FWHM of the emission lines fits onto the smoothly falling tail of the lower end of the line-width distribution for complete QSO samples. The equivalent widths of the combined broad and narrow components of the lines are normal for QSOs of the luminosity range under study. However, the NLQSOs do show ionization differences from broader-lined QSOs; most significant, the semiforbidden C III/C IV intensity ratio is unusually low. The N/C abundance ratio in these objects is found to be normal; the Al/C abundance ratio may be quite high. 38 references

  15. Do clinical and translational science graduate students understand linear regression? Development and early validation of the REGRESS quiz.

    Science.gov (United States)

    Enders, Felicity

    2013-12-01

Although regression is widely used for reading and publishing in the medical literature, no instruments were previously available to assess students' understanding. The goal of this study was to design and assess such an instrument for graduate students in Clinical and Translational Science and Public Health. A 27-item REsearch on Global Regression Expectations in StatisticS (REGRESS) quiz was developed through an iterative process. Consenting students taking a course on linear regression in a Clinical and Translational Science program completed the quiz pre- and postcourse. Student results were compared to practicing statisticians with a master's or doctoral degree in statistics or a closely related field. Fifty-two students responded precourse, 59 postcourse, and 22 practicing statisticians completed the quiz. The mean (SD) score was 9.3 (4.3) for students precourse and 19.0 (3.5) postcourse. The REGRESS quiz was internally reliable (Cronbach's alpha 0.89). The initial validation is quite promising, with statistically significant and meaningful differences across time and study populations. Further work is needed to validate the quiz across multiple institutions. © 2013 Wiley Periodicals, Inc.

  16. Scalable Bayesian nonparametric regression via a Plackett-Luce model for conditional ranks

    Science.gov (United States)

    Gray-Davies, Tristan; Holmes, Chris C.; Caron, François

    2018-01-01

    We present a novel Bayesian nonparametric regression model for covariates X and continuous response variable Y ∈ ℝ. The model is parametrized in terms of marginal distributions for Y and X and a regression function which tunes the stochastic ordering of the conditional distributions F (y|x). By adopting an approximate composite likelihood approach, we show that the resulting posterior inference can be decoupled for the separate components of the model. This procedure can scale to very large datasets and allows for the use of standard, existing, software from Bayesian nonparametric density estimation and Plackett-Luce ranking estimation to be applied. As an illustration, we show an application of our approach to a US Census dataset, with over 1,300,000 data points and more than 100 covariates. PMID:29623150

  17. The MIDAS Touch: Mixed Data Sampling Regression Models

    OpenAIRE

    Ghysels, Eric; Santa-Clara, Pedro; Valkanov, Rossen

    2004-01-01

We introduce Mixed Data Sampling (henceforth MIDAS) regression models. The regressions involve time series data sampled at different frequencies. Technically speaking MIDAS models specify conditional expectations as a distributed lag of regressors recorded at some higher sampling frequencies. We examine the asymptotic properties of MIDAS regression estimation and compare it with traditional distributed lag models. MIDAS regressions have wide applicability in macroeconomics and finance.
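
    The distributed-lag idea lends itself to a compact illustration. The sketch below is not the authors' estimator and uses only simulated data: a low-frequency response is regressed on a distributed lag of a higher-frequency regressor, with the lag weights given by a two-parameter exponential Almon function; the weight parameters are profiled over a small grid and an ordinary least-squares fit is solved for each candidate.

```python
import numpy as np

# MIDAS-style regression sketch: y (low frequency) is regressed on a distributed
# lag of x (sampled m times per low-frequency period), with lag weights given by
# a two-parameter exponential Almon polynomial. Data are simulated.
rng = np.random.default_rng(0)
T, m = 200, 22                        # 200 "months", 22 "days" per month
x = rng.normal(size=(T, m))           # high-frequency regressor, most recent lag first

def almon_weights(theta1, theta2, m):
    k = np.arange(1, m + 1)
    w = np.exp(theta1 * k + theta2 * k ** 2)
    return w / w.sum()

true_w = almon_weights(0.1, -0.05, m)
y = 0.5 + 2.0 * x @ true_w + rng.normal(scale=0.3, size=T)

# Profile a small grid of weight parameters; for fixed weights the model is OLS.
best = None
for t1 in np.linspace(-0.5, 0.5, 21):
    for t2 in np.linspace(-0.1, 0.0, 11):
        z = x @ almon_weights(t1, t2, m)               # temporally aggregated regressor
        X = np.column_stack([np.ones(T), z])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        sse = np.sum((y - X @ beta) ** 2)
        if best is None or sse < best[0]:
            best = (sse, t1, t2, beta)

print("intercept/slope:", np.round(best[3], 2), " weight parameters:", best[1], best[2])
```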

  18. Suppression Situations in Multiple Linear Regression

    Science.gov (United States)

    Shieh, Gwowen

    2006-01-01

    This article proposes alternative expressions for the two most prevailing definitions of suppression without resorting to the standardized regression modeling. The formulation provides a simple basis for the examination of their relationship. For the two-predictor regression, the author demonstrates that the previous results in the literature are…

  19. Significance testing in ridge regression for genetic data

    Directory of Open Access Journals (Sweden)

    De Iorio Maria

    2011-09-01

    Full Text Available Abstract Background Technological developments have increased the feasibility of large scale genetic association studies. Densely typed genetic markers are obtained using SNP arrays, next-generation sequencing technologies and imputation. However, SNPs typed using these methods can be highly correlated due to linkage disequilibrium among them, and standard multiple regression techniques fail with these data sets due to their high dimensionality and correlation structure. There has been increasing interest in using penalised regression in the analysis of high dimensional data. Ridge regression is one such penalised regression technique which does not perform variable selection, instead estimating a regression coefficient for each predictor variable. It is therefore desirable to obtain an estimate of the significance of each ridge regression coefficient. Results We develop and evaluate a test of significance for ridge regression coefficients. Using simulation studies, we demonstrate that the performance of the test is comparable to that of a permutation test, with the advantage of a much-reduced computational cost. We introduce the p-value trace, a plot of the negative logarithm of the p-values of ridge regression coefficients with increasing shrinkage parameter, which enables the visualisation of the change in p-value of the regression coefficients with increasing penalisation. We apply the proposed method to a lung cancer case-control data set from EPIC, the European Prospective Investigation into Cancer and Nutrition. Conclusions The proposed test is a useful alternative to a permutation test for the estimation of the significance of ridge regression coefficients, at a much-reduced computational cost. The p-value trace is an informative graphical tool for evaluating the results of a test of significance of ridge regression coefficients as the shrinkage parameter increases, and the proposed test makes its production computationally feasible.
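
    As a rough companion to the abstract, the snippet below sketches the permutation-test baseline that the proposed analytic test is compared against: ridge coefficients are refitted on permuted responses to build a per-coefficient null distribution. The simulated data, shrinkage parameter, and number of permutations are illustrative choices, and the code does not reproduce the paper's analytic test or the p-value trace.

```python
import numpy as np
from sklearn.linear_model import Ridge

# Permutation test for ridge regression coefficients on simulated data.
rng = np.random.default_rng(1)
n, p = 200, 50
X = rng.normal(size=(n, p))
beta = np.zeros(p); beta[:3] = 0.8              # only the first 3 predictors matter
y = X @ beta + rng.normal(size=n)

lam = 10.0
coef_obs = Ridge(alpha=lam).fit(X, y).coef_

B = 500
null_coefs = np.empty((B, p))
for b in range(B):
    y_perm = rng.permutation(y)                  # break the X-y association
    null_coefs[b] = Ridge(alpha=lam).fit(X, y_perm).coef_

# Two-sided permutation p-value per coefficient
pvals = (np.abs(null_coefs) >= np.abs(coef_obs)).mean(axis=0)
print("p-values of the first 5 coefficients:", np.round(pvals[:5], 3))
```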

  20. A Comparative Study of Pairwise Learning Methods Based on Kernel Ridge Regression.

    Science.gov (United States)

    Stock, Michiel; Pahikkala, Tapio; Airola, Antti; De Baets, Bernard; Waegeman, Willem

    2018-06-12

    Many machine learning problems can be formulated as predicting labels for a pair of objects. Problems of that kind are often referred to as pairwise learning, dyadic prediction, or network inference problems. During the past decade, kernel methods have played a dominant role in pairwise learning. They still obtain a state-of-the-art predictive performance, but a theoretical analysis of their behavior has been underexplored in the machine learning literature. In this work we review and unify kernel-based algorithms that are commonly used in different pairwise learning settings, ranging from matrix filtering to zero-shot learning. To this end, we focus on closed-form efficient instantiations of Kronecker kernel ridge regression. We show that independent task kernel ridge regression, two-step kernel ridge regression, and a linear matrix filter arise naturally as a special case of Kronecker kernel ridge regression, implying that all these methods implicitly minimize a squared loss. In addition, we analyze universality, consistency, and spectral filtering properties. Our theoretical results provide valuable insights into assessing the advantages and limitations of existing pairwise learning methods.
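
    The closed form alluded to in the abstract can be written down compactly. The sketch below uses simulated kernels, a complete label matrix, and a single shared regularisation parameter; the eigendecompositions of the row- and column-object kernels solve Kronecker kernel ridge regression without ever forming the Kronecker product explicitly.

```python
import numpy as np

# Kronecker kernel ridge regression sketch for pairwise data: with row kernel K
# (n x n), column kernel G (m x m) and label matrix Y (n x m), the dual
# coefficients solve (G kron K + lambda*I) vec(A) = vec(Y), which the
# eigendecompositions of K and G reduce to elementwise operations.
rng = np.random.default_rng(2)
n, m, lam = 40, 30, 1.0
Xr, Xc = rng.normal(size=(n, 5)), rng.normal(size=(m, 4))
K = np.exp(-0.5 * np.square(Xr[:, None] - Xr[None]).sum(-1))   # RBF kernel, rows
G = np.exp(-0.5 * np.square(Xc[:, None] - Xc[None]).sum(-1))   # RBF kernel, cols
Y = rng.normal(size=(n, m))

s, U = np.linalg.eigh(K)
t, V = np.linalg.eigh(G)
A = U @ ((U.T @ Y @ V) / (np.outer(s, t) + lam)) @ V.T   # dual coefficients
F = K @ A @ G                                            # fitted label matrix
print("training MSE:", round(float(np.mean((F - Y) ** 2)), 4))
```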

  1. An analysis model of the secondary tunnel lining considering ground-primary support-secondary lining interaction

    Energy Technology Data Exchange (ETDEWEB)

    Seo, Seong-Ho; Chang, Seok-Bue [Yooshin Engineering Corporation, Seoul(Korea); Lee, Sang-Duk [Ajou University, Suwon(Korea)

    2002-06-30

It is common practice to over-design the reinforcement of the secondary tunnel lining because of the lack of rational insight into the ground loosening loads and because of the conservative application of empirical design methods. The main loads on the secondary lining are the ground loosening loads and the ground water pressure, and the ground load is critical for the reinforcement design of the secondary lining in the case of a drained tunnel. If no external load acts around the tunnel, the load on the secondary tunnel lining arises from the deterioration of the primary supports such as shotcrete, steel ribs, and rock bolts. Accordingly, an analysis method that considers the ground-primary support-secondary lining interaction is required for the rational design of the secondary tunnel lining. In this paper, the interaction is described conceptually by a simple mass-spring model, and the load transfer from the ground and primary supports to the secondary lining is shown by ground-primary support-secondary lining reaction curves for the theoretical solution of a circular tunnel. The application of the proposed model to numerical analysis is also verified in order to check its potential for tunnels with complex analysis conditions. (author). 8 refs., 2 tabs., 7 figs.

  2. Comparison of Neural Networks and Regression Time Series in Estimating the Development of the Afternoon Price of Palladium on the New York Stock Exchange

    Directory of Open Access Journals (Sweden)

    Marek Vochozka

    2017-12-01

Full Text Available Purpose of the article: Palladium is presently used for producing electronics, industrial products and jewellery, as well as products in the medical field. Its value derives especially from its unique physical and chemical characteristics. Predicting the value of such a metal is not an easy matter, given that prices may change significantly over time. Methodology/methods: The analysis uses London Fix Price PM data, i.e. the prices reported in the afternoon, for a period longer than 10 years. The data are processed with Statistica software. Linear regression is carried out using a whole range of functions, and subsequently regression via neural structures is performed, where several distributional functions are used again. Subsequently, 1000 neural networks are generated, out of which the 5 with the best characteristics are chosen. Scientific aim: The aim of the paper is to perform a regression analysis of the development of the palladium price on the New York Stock Exchange using neural structures and linear regression, then to compare the two methods and determine the more suitable one for a possible prediction of the future development of the palladium price on the New York Stock Exchange. Findings: The results are compared at the level of an expert perspective and the evaluator's (economist's) experience. Among the regression time series, the curve obtained by the least squares method via negative-exponential smoothing comes closest to the development of the palladium price line. Of the neural networks, all 5 chosen networks prove to be practically useful. Conclusions: Because it is not possible to predict extraordinary situations and their impact on the palladium price (at most in the short term, but certainly not over a long period of time), simplification and the creation of a relatively simple model is appropriate and the result is useful.

  3. The impact of global signal regression on resting state correlations: are anti-correlated networks introduced?

    Science.gov (United States)

    Murphy, Kevin; Birn, Rasmus M; Handwerker, Daniel A; Jones, Tyler B; Bandettini, Peter A

    2009-02-01

    Low-frequency fluctuations in fMRI signal have been used to map several consistent resting state networks in the brain. Using the posterior cingulate cortex as a seed region, functional connectivity analyses have found not only positive correlations in the default mode network but negative correlations in another resting state network related to attentional processes. The interpretation is that the human brain is intrinsically organized into dynamic, anti-correlated functional networks. Global variations of the BOLD signal are often considered nuisance effects and are commonly removed using a general linear model (GLM) technique. This global signal regression method has been shown to introduce negative activation measures in standard fMRI analyses. The topic of this paper is whether such a correction technique could be the cause of anti-correlated resting state networks in functional connectivity analyses. Here we show that, after global signal regression, correlation values to a seed voxel must sum to a negative value. Simulations also show that small phase differences between regions can lead to spurious negative correlation values. A combination breath holding and visual task demonstrates that the relative phase of global and local signals can affect connectivity measures and that, experimentally, global signal regression leads to bell-shaped correlation value distributions, centred on zero. Finally, analyses of negatively correlated networks in resting state data show that global signal regression is most likely the cause of anti-correlations. These results call into question the interpretation of negatively correlated regions in the brain when using global signal regression as an initial processing step.
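
    A toy simulation makes the paper's central point easy to see. In the sketch below the voxel time courses are simulated and share only a positive common fluctuation; the mean (global) signal is regressed out of every voxel, and the covariances and correlations with an arbitrary seed are then inspected. After global signal regression the covariances with the seed sum to zero exactly, so the remaining voxels must covary negatively with the seed on average.

```python
import numpy as np

# Illustration of the mechanism discussed in the paper, on purely simulated data.
rng = np.random.default_rng(3)
t, n_vox = 300, 500
common = rng.normal(size=t)                               # shared positive fluctuation
data = 0.7 * common + rng.normal(size=(n_vox, t))         # voxels x time

g = data.mean(axis=0)                                     # global signal
G = np.column_stack([np.ones(t), g])
coefs, *_ = np.linalg.lstsq(G, data.T, rcond=None)
resid = data.T - G @ coefs                                # GSR residuals, time x voxels

seed = resid[:, 0]
cov = resid.T @ seed / (t - 1)
corr = cov / (resid.std(axis=0, ddof=1) * seed.std(ddof=1))
print("sum of covariances with the seed (all voxels):", round(float(cov.sum()), 6))      # ~0
print("sum of covariances excluding the seed:", round(float(cov.sum() - cov[0]), 4))     # < 0
print("fraction of voxels anti-correlated with the seed:", round(float((corr < 0).mean()), 2))
```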

  4. Regression methods for medical research

    CERN Document Server

    Tai, Bee Choo

    2013-01-01

Regression Methods for Medical Research provides medical researchers with the skills they need to critically read and interpret research using more advanced statistical methods. The statistical requirements of interpreting and publishing in medical journals, together with rapid changes in science and technology, increasingly demand an understanding of more complex and sophisticated analytic procedures. The text explains the application of statistical models to a wide variety of practical medical investigative studies and clinical trials. Regression methods are used to appropriately answer the

  5. Prediction of epigenetically regulated genes in breast cancer cell lines

    Energy Technology Data Exchange (ETDEWEB)

    Loss, Leandro A; Sadanandam, Anguraj; Durinck, Steffen; Nautiyal, Shivani; Flaucher, Diane; Carlton, Victoria EH; Moorhead, Martin; Lu, Yontao; Gray, Joe W; Faham, Malek; Spellman, Paul; Parvin, Bahram

    2010-05-04

Methylation of CpG islands within the DNA promoter regions is one mechanism that leads to aberrant gene expression in cancer. In particular, the abnormal methylation of CpG islands may silence associated genes. Therefore, using high-throughput microarrays to measure CpG island methylation will lead to better understanding of tumor pathobiology and progression, while revealing potentially new biomarkers. We have examined a recently developed high-throughput technology for measuring genome-wide methylation patterns called mTACL. Here, we propose a computational pipeline for integrating gene expression and CpG island methylation profiles to identify epigenetically regulated genes for a panel of 45 breast cancer cell lines, which is widely used in the Integrative Cancer Biology Program (ICBP). The pipeline (i) reduces the dimensionality of the methylation data, (ii) associates the reduced methylation data with gene expression data, and (iii) ranks methylation-expression associations according to their epigenetic regulation. Dimensionality reduction is performed in two steps: (i) methylation sites are grouped across the genome to identify regions of interest, and (ii) methylation profiles are clustered within each region. Associations between the clustered methylation and the gene expression data sets generate candidate matches within a fixed neighborhood around each gene. Finally, the methylation-expression associations are ranked through a logistic regression, and their significance is quantified through permutation analysis. Our two-step dimensionality reduction compressed 90% of the original data, reducing 137,688 methylation sites to 14,505 clusters. Methylation-expression associations produced 18,312 correspondences, which were used to further analyze epigenetic regulation. Logistic regression was used to identify 58 genes from these correspondences that showed a statistically significant negative correlation between methylation profiles and gene expression in the

  6. Identification of resistance mechanisms in erlotinib-resistant subclones of the non-small cell lung cancer cell line HCC827 by exome sequencing

    DEFF Research Database (Denmark)

    Jacobsen, Kirstine; Alcaraz, Nicolas; Lund, Rikke Raaen

Background: Erlotinib (Tarceva®, Roche) has significantly changed the treatment of non-small cell lung cancer (NSCLC), as 70% of patients show significant tumor regression upon treatment (Santarpia et al., 2013). However, all patients relapse due to development of acquired resistance, which ... mutations in erlotinib-resistant subclones of the NSCLC cell line HCC827. Materials & Methods: We established 3 erlotinib-resistant subclones (resistant to 10, 20 and 30 µM erlotinib, respectively). DNA libraries of each subclone and the parental HCC827 cell line were prepared in biological duplicates using the SeqCap EZ Human Exome Library v3.0 kit, and whole-exome sequencing of these (100 bp paired-end) was performed on an Illumina HiSeq 2000 platform. The sequencing data were analyzed using a recently developed in-house analysis pipeline, which includes quality control using Trim...

  7. Regression models for interval censored survival data: Application to HIV infection in Danish homosexual men

    DEFF Research Database (Denmark)

    Carstensen, Bendix

    1996-01-01

This paper shows how to fit excess and relative risk regression models to interval censored survival data, and how to implement the models in standard statistical software. The methods developed are used for the analysis of HIV infection rates in a cohort of Danish homosexual men.

  8. Interacting open Wilson lines from noncommutative field theories

    International Nuclear Information System (INIS)

    Kiem, Youngjai; Lee, Sangmin; Rey, Soo-Jong; Sato, Haru-Tada

    2002-01-01

In noncommutative field theories, it is known that the one-loop effective action describes the propagation of noninteracting open Wilson lines, obeying the flying dipole's relation. We show that the two-loop effective action describes the cubic interaction among 'closed string' states created by open Wilson line operators. Taking d-dimensional λ[Φ³]⋆ theory as the simplest setup, we compute the nonplanar contribution at a low-energy and large noncommutativity limit. We find that the contribution is expressible in a remarkably simple cubic interaction involving scalar open Wilson lines only and nothing else. We show that the interaction is purely geometrical and noncommutative in nature, depending only on the size of each open Wilson line.

  9. BOX-COX REGRESSION METHOD IN TIME SCALING

    Directory of Open Access Journals (Sweden)

    ATİLLA GÖKTAŞ

    2013-06-01

Full Text Available The Box-Cox regression method with power transformations λj, for j = 1, 2, ..., k, can be used when the dependent variable and the error term of the linear regression model do not satisfy the continuity and normality assumptions. The choice of the optimum power transformation λj of Y, for j = 1, 2, ..., k, i.e. the one that yields the smallest mean square error, is discussed. The Box-Cox regression method is especially appropriate for adjusting for skewness or heteroscedasticity of the error terms when there is a nonlinear functional relationship between the dependent and explanatory variables. In this study, the advantages and disadvantages of using the Box-Cox regression method in differentiation and differential analysis of the time scale concept are discussed.
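
    A minimal sketch of the transform-then-regress workflow follows, using SciPy's maximum-likelihood estimate of the Box-Cox λ on a simulated right-skewed response; the data and variable names are illustrative, and the time-scale application of the paper is not reproduced.

```python
import numpy as np
from scipy import stats

# Box-Cox transform of a skewed response, then ordinary least squares on both scales.
rng = np.random.default_rng(4)
x = rng.uniform(0, 10, size=200)
y = np.exp(0.3 * x + rng.normal(scale=0.4, size=200))    # positive, right-skewed response

y_bc, lam = stats.boxcox(y)                              # maximum-likelihood lambda
print("estimated Box-Cox lambda:", round(float(lam), 3))

X = np.column_stack([np.ones_like(x), x])
beta_raw, *_ = np.linalg.lstsq(X, y, rcond=None)
beta_bc, *_ = np.linalg.lstsq(X, y_bc, rcond=None)
res_raw = y - X @ beta_raw
res_bc = y_bc - X @ beta_bc
print("residual skewness, raw vs transformed:",
      round(float(stats.skew(res_raw)), 2), round(float(stats.skew(res_bc)), 2))
```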

  10. High-throughput quantitative biochemical characterization of algal biomass by NIR spectroscopy; multiple linear regression and multivariate linear regression analysis.

    Science.gov (United States)

    Laurens, L M L; Wolfrum, E J

    2013-12-18

    One of the challenges associated with microalgal biomass characterization and the comparison of microalgal strains and conversion processes is the rapid determination of the composition of algae. We have developed and applied a high-throughput screening technology based on near-infrared (NIR) spectroscopy for the rapid and accurate determination of algal biomass composition. We show that NIR spectroscopy can accurately predict the full composition using multivariate linear regression analysis of varying lipid, protein, and carbohydrate content of algal biomass samples from three strains. We also demonstrate a high quality of predictions of an independent validation set. A high-throughput 96-well configuration for spectroscopy gives equally good prediction relative to a ring-cup configuration, and thus, spectra can be obtained from as little as 10-20 mg of material. We found that lipids exhibit a dominant, distinct, and unique fingerprint in the NIR spectrum that allows for the use of single and multiple linear regression of respective wavelengths for the prediction of the biomass lipid content. This is not the case for carbohydrate and protein content, and thus, the use of multivariate statistical modeling approaches remains necessary.
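
    To make the two calibration strategies in the abstract concrete, the sketch below contrasts a multiple linear regression on a couple of hand-picked "wavelengths" with a full-spectrum multivariate model; partial least squares is used here as a generic multivariate regressor, and the spectra and lipid contents are simulated rather than taken from the paper.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

# Few-wavelength MLR versus full-spectrum multivariate calibration on simulated data.
rng = np.random.default_rng(5)
n_samples, n_wl = 120, 300
spectra = rng.normal(size=(n_samples, n_wl)).cumsum(axis=1)        # smooth-ish curves
lipid = 0.02 * spectra[:, 50] - 0.015 * spectra[:, 200] + rng.normal(scale=0.1, size=n_samples)

Xtr, Xte, ytr, yte = train_test_split(spectra, lipid, random_state=0)

mlr = LinearRegression().fit(Xtr[:, [50, 200]], ytr)               # selected wavelengths
pls = PLSRegression(n_components=5).fit(Xtr, ytr)                  # full spectrum

print("MLR test R^2:", round(mlr.score(Xte[:, [50, 200]], yte), 3))
print("PLS test R^2:", round(pls.score(Xte, yte), 3))
```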

  11. Gaussian Process Regression Model in Spatial Logistic Regression

    Science.gov (United States)

    Sofro, A.; Oktaviarina, A.

    2018-01-01

Spatial analysis has developed very quickly in the last decade. One of the most popular approaches is based on the neighbourhood structure of the regions. Unfortunately, it has some limitations, such as difficulty in prediction. Therefore, we propose Gaussian process regression (GPR) to address this issue. In this paper, we focus on spatial modeling with GPR for binomial data with a logit link function. The performance of the model is investigated. We discuss inference, namely how to estimate the parameters and hyper-parameters and how to predict. Furthermore, simulation studies are explained in the last section.
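
    A minimal sketch of the idea above: a Gaussian process prior over a latent spatial surface pushed through a logit link for binary outcomes. scikit-learn's Gaussian process classifier (logistic likelihood, Laplace approximation) stands in for the paper's formulation, and the coordinates and outcomes are simulated.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.gaussian_process.kernels import RBF

# GP with logit link on simulated spatial binary data.
rng = np.random.default_rng(6)
coords = rng.uniform(0, 1, size=(150, 2))                        # spatial locations
latent = np.sin(3 * coords[:, 0]) + np.cos(3 * coords[:, 1])     # latent risk surface
y = (rng.uniform(size=150) < 1 / (1 + np.exp(-2 * latent))).astype(int)

gpc = GaussianProcessClassifier(kernel=1.0 * RBF(length_scale=0.2)).fit(coords, y)
grid = np.column_stack([g.ravel() for g in np.meshgrid(np.linspace(0, 1, 25),
                                                       np.linspace(0, 1, 25))])
prob = gpc.predict_proba(grid)[:, 1]                             # predicted probability surface
print("learned kernel:", gpc.kernel_, " mean predicted probability:", round(float(prob.mean()), 3))
```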

  12. Regression Analysis and the Sociological Imagination

    Science.gov (United States)

    De Maio, Fernando

    2014-01-01

    Regression analysis is an important aspect of most introductory statistics courses in sociology but is often presented in contexts divorced from the central concerns that bring students into the discipline. Consequently, we present five lesson ideas that emerge from a regression analysis of income inequality and mortality in the USA and Canada.

  13. On Weighted Support Vector Regression

    DEFF Research Database (Denmark)

    Han, Xixuan; Clemmensen, Line Katrine Harder

    2014-01-01

We propose a new type of weighted support vector regression (SVR), motivated by modeling local dependencies in time and space in prediction of house prices. The classic weights of the weighted SVR are added to the slack variables in the objective function (OF‐weights). This procedure directly shrinks the coefficient of each observation in the estimated functions; thus, it is widely used for minimizing influence of outliers. We propose to additionally add weights to the slack variables in the constraints (CF‐weights) and call the combination of weights the doubly weighted SVR. We illustrate the differences and similarities of the two types of weights by demonstrating the connection between the Least Absolute Shrinkage and Selection Operator (LASSO) and the SVR. We show that an SVR problem can be transformed to a LASSO problem plus a linear constraint and a box constraint. We demonstrate...
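
    For orientation, scikit-learn's SVR accepts per-observation weights that scale the slack penalties, i.e. the classic OF-weights the paper starts from; the doubly weighted (CF-weight) variant proposed in the paper has no off-the-shelf implementation and is not reproduced here. The data below are simulated house-price-like values with a few gross outliers.

```python
import numpy as np
from sklearn.svm import SVR

# Observation-weighted SVR: down-weighted points influence the fit less.
rng = np.random.default_rng(7)
X = np.sort(rng.uniform(0, 5, size=(80, 1)), axis=0)
y = 2.0 * X.ravel() + rng.normal(scale=0.3, size=80)
y[::10] += 5.0                                    # a few gross outliers

w = np.ones(80)
w[::10] = 0.05                                    # down-weight the outliers

plain = SVR(C=10).fit(X, y)
weighted = SVR(C=10).fit(X, y, sample_weight=w)
x0 = np.array([[2.5]])
print("prediction at x=2.5  plain:", round(float(plain.predict(x0)[0]), 2),
      " weighted:", round(float(weighted.predict(x0)[0]), 2))
```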

  14. EMD-regression for modelling multi-scale relationships, and application to weather-related cardiovascular mortality

    Science.gov (United States)

    Masselot, Pierre; Chebana, Fateh; Bélanger, Diane; St-Hilaire, André; Abdous, Belkacem; Gosselin, Pierre; Ouarda, Taha B. M. J.

    2018-01-01

In a number of environmental studies, relationships between natural processes are often assessed through regression analyses, using time series data. Such data are often multi-scale and non-stationary, leading to a poor accuracy of the resulting regression models and therefore to results with moderate reliability. To deal with this issue, the present paper introduces the EMD-regression methodology, which consists of applying the empirical mode decomposition (EMD) algorithm to the data series and then using the resulting components in regression models. The proposed methodology presents a number of advantages. First, it accounts for the non-stationarity of the data series. Second, this approach acts as a scan of the relationship between a response variable and the predictors at different time scales, providing new insights about this relationship. To illustrate the proposed methodology, it is applied to study the relationship between weather and cardiovascular mortality in Montreal, Canada. The results provide new knowledge concerning the studied relationship. For instance, they show that humidity can cause excess mortality at the monthly time scale, a scale not visible in classical models. A comparison is also conducted with state-of-the-art methods, namely generalized additive models and distributed lag models, both widely used in weather-related health studies. The comparison shows that EMD-regression achieves better prediction performance and provides more detail than classical models concerning the relationship.

  15. No rationale for 1 variable per 10 events criterion for binary logistic regression analysis

    Directory of Open Access Journals (Sweden)

    Maarten van Smeden

    2016-11-01

Full Text Available Abstract Background Ten events per variable (EPV) is a widely advocated minimal criterion for sample size considerations in logistic regression analysis. Of three previous simulation studies that examined this minimal EPV criterion, only one supports the use of a minimum of 10 EPV. In this paper, we examine the reasons for substantial differences between these extensive simulation studies. Methods The current study uses Monte Carlo simulations to evaluate small sample bias, coverage of confidence intervals and mean square error of logit coefficients. Logistic regression models fitted by maximum likelihood and a modified estimation procedure, known as Firth’s correction, are compared. Results The results show that besides EPV, the problems associated with low EPV depend on other factors such as the total sample size. It is also demonstrated that simulation results can be dominated by even a few simulated data sets for which the prediction of the outcome by the covariates is perfect (‘separation’). We reveal that different approaches for identifying and handling separation lead to substantially different simulation results. We further show that Firth’s correction can be used to improve the accuracy of regression coefficients and alleviate the problems associated with separation. Conclusions The current evidence supporting EPV rules for binary logistic regression is weak. Given our findings, there is an urgent need for new research to provide guidance for supporting sample size considerations for binary logistic regression analysis.
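
    The flavour of such a simulation study is easy to reproduce on a small scale. The sketch below uses settings chosen purely for illustration (far cruder than the paper's design): it fits maximum-likelihood logistic regressions at a low events-per-variable ratio and records the average coefficient estimate and the number of fits that fail, e.g. through separation. Firth's correction is not implemented here.

```python
import numpy as np
import statsmodels.api as sm

# Monte Carlo illustration of small-sample bias in ML logistic regression.
rng = np.random.default_rng(8)
n, p, beta_true = 120, 6, 0.5
estimates, n_events, failures = [], [], 0
for _ in range(300):
    X = rng.normal(size=(n, p))
    prob = 1 / (1 + np.exp(-(-2.0 + beta_true * X[:, 0])))   # rare-ish events
    y = rng.binomial(1, prob)
    n_events.append(y.sum())
    try:
        fit = sm.Logit(y, sm.add_constant(X)).fit(disp=0)
        estimates.append(fit.params[1])
    except Exception:                                         # e.g. perfect separation
        failures += 1

print("average events per variable:", round(float(np.mean(n_events)) / p, 1))
print("mean estimate of the true coefficient 0.5:", round(float(np.mean(estimates)), 3))
print("failed fits (e.g. separation):", failures, "of 300")
```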

  16. Cisgenic Rvi6 scab-resistant apple lines show no differences in Rvi6 transcription when compared with conventionally bred cultivars

    NARCIS (Netherlands)

    Chizzali, Cornelia; Gusberti, Michele; Schouten, H.J.; Gessler, Cesare; Broggini, G.A.L.

    2016-01-01

Main conclusion: The expression of the apple scab resistance gene Rvi6 in different apple cultivars and lines is not modulated by biotic or abiotic factors. All commercially important apple cultivars are susceptible to Venturia inaequalis, the causal organism of apple scab. A limited number of apple

  17. An Additive-Multiplicative Cox-Aalen Regression Model

    DEFF Research Database (Denmark)

    Scheike, Thomas H.; Zhang, Mei-Jie

    2002-01-01

Aalen model; additive risk model; counting processes; Cox regression; survival analysis; time-varying effects

  18. AN APPLICATION OF THE LOGISTIC REGRESSION MODEL IN THE EXPERIMENTAL PHYSICAL CHEMISTRY

    Directory of Open Access Journals (Sweden)

    Elpidio Corral-López

    2015-06-01

Full Text Available The calculation of intensive properties (the molar volumes of ethanol-water mixtures) from experimental densities and the tangent method in the Physical Chemistry Laboratory presents the problem of manually drawing the molar volume versus mole fraction curve and tracing the tangent line. Using a statistical model, logistic regression, on a Texas VOYAGE graphing calculator made it possible to trace the curve and the tangents in situ and also to evaluate the students' work during the experimental session. The percentage error between the molar volumes calculated from literature data and those obtained with the statistical method is minimal, which validates the model. It is advantageous to use the calculator with this application as a teaching support tool, reducing the evaluation time from 3 weeks to 3 hours.
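
    The calculator procedure translates directly into a few lines of SciPy. In the sketch below the "measurements" are synthetic points drawn from a smooth sigmoid-like curve standing in for molar volume versus mole fraction (they are not real ethanol-water data); a four-parameter logistic is fitted and the tangent at a chosen composition is evaluated numerically, its intercepts at x = 0 and x = 1 playing the role of the partial molar volumes in the tangent method.

```python
import numpy as np
from scipy.optimize import curve_fit

# Fit a logistic curve to molar-volume-like data and trace a tangent line.
rng = np.random.default_rng(9)
x = np.linspace(0, 1, 11)                            # mole fraction (illustrative)
v = 18 + 40 / (1 + np.exp(-4 * (x - 0.45)))          # synthetic curve, cm^3/mol
v = v + rng.normal(scale=0.15, size=x.size)          # measurement-like noise

def logistic(x, a, b, c, d):
    return a + (b - a) / (1 + np.exp(-c * (x - d)))

params, _ = curve_fit(logistic, x, v, p0=[18, 58, 4, 0.5], maxfev=10000)

x0, eps = 0.4, 1e-5
slope = (logistic(x0 + eps, *params) - logistic(x0 - eps, *params)) / (2 * eps)
v0 = logistic(x0, *params)
print(f"tangent at x0=0.4:  V = {v0:.2f} + {slope:.2f} * (x - 0.4)")
print("tangent intercepts at x=0 and x=1:",
      round(float(v0 - slope * x0), 2), round(float(v0 + slope * (1 - x0)), 2))
```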

  19. A multiple regression analysis for accurate background subtraction in 99Tcm-DTPA renography

    International Nuclear Information System (INIS)

    Middleton, G.W.; Thomson, W.H.; Davies, I.H.; Morgan, A.

    1989-01-01

A technique for accurate background subtraction in 99Tcm-DTPA renography is described. The technique is based on a multiple regression analysis of the renal curves and separate heart and soft tissue curves which together represent background activity. It is compared, in over 100 renograms, with a previously described linear regression technique. Results show that the method provides accurate background subtraction, even in very poorly functioning kidneys, thus enabling relative renal filtration and excretion to be accurately estimated. (author)
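
    A simplified version of the background-subtraction step can be sketched directly. In the code below all curves are simulated, the kidney uptake is delayed so that the earliest frames contain only background, and the heart and soft-tissue weights are estimated by multiple regression on those frames before being subtracted from the whole renal curve; the paper's actual frame selection and its comparison against the linear regression technique are not reproduced.

```python
import numpy as np

# Multiple-regression background subtraction on simulated renogram curves.
rng = np.random.default_rng(10)
t = np.arange(0, 20, 0.25)                                    # minutes
heart = np.exp(-0.25 * t) + 0.02 * rng.normal(size=t.size)    # blood-pool curve
tissue = 0.6 * np.exp(-0.10 * t) + 0.02 * rng.normal(size=t.size)
kidney_true = np.where(t > 3, 1.2 * (t - 3) * np.exp(-0.35 * (t - 3)), 0.0)
renal_meas = kidney_true + 0.8 * heart + 0.5 * tissue

early = t < 3                                                 # vascular phase, no kidney uptake yet
B = np.column_stack([heart[early], tissue[early], np.ones(early.sum())])
w, *_ = np.linalg.lstsq(B, renal_meas[early], rcond=None)

background = w[0] * heart + w[1] * tissue + w[2]
kidney_est = renal_meas - background
print("fitted background weights:", np.round(w, 3))
print("max abs error of the recovered kidney curve:",
      round(float(np.max(np.abs(kidney_est - kidney_true))), 3))
```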

  20. Long-distance pulse propagation on high-frequency dissipative nonlinear transmission lines/resonant tunneling diode line cascaded maps

    International Nuclear Information System (INIS)

    Klofai, Yerima; Essimbi, B Z; Jaeger, D

    2011-01-01

    Pulse propagation on high-frequency dissipative nonlinear transmission lines (NLTLs)/resonant tunneling diode line cascaded maps is investigated for long-distance propagation of short pulses. Applying perturbative analysis, we show that the dynamics of each line is reduced to an expanded Korteweg-de Vries-Burgers equation. Moreover, it is found by computer experiments that the soliton developed in NLTLs experiences an exponential amplitude decay on the one hand and an exponential amplitude growth on the other. As a result, the behavior of a pulse in special electrical networks made of concatenated pieces of lines is closely similar to the transmission of information in optical/electrical communication systems.

  1. Long-distance pulse propagation on high-frequency dissipative nonlinear transmission lines/resonant tunneling diode line cascaded maps

    Energy Technology Data Exchange (ETDEWEB)

    Klofai, Yerima [Department of Physics, Higher Teacher Training College, University of Maroua, PO Box 46 Maroua (Cameroon); Essimbi, B Z [Department of Physics, Faculty of Science, University of Yaounde 1, PO Box 812 Yaounde (Cameroon); Jaeger, D, E-mail: bessimb@yahoo.fr [ZHO, Optoelectronik, Universitaet Duisburg-Essen, D-47048 Duisburg (Germany)

    2011-10-15

    Pulse propagation on high-frequency dissipative nonlinear transmission lines (NLTLs)/resonant tunneling diode line cascaded maps is investigated for long-distance propagation of short pulses. Applying perturbative analysis, we show that the dynamics of each line is reduced to an expanded Korteweg-de Vries-Burgers equation. Moreover, it is found by computer experiments that the soliton developed in NLTLs experiences an exponential amplitude decay on the one hand and an exponential amplitude growth on the other. As a result, the behavior of a pulse in special electrical networks made of concatenated pieces of lines is closely similar to the transmission of information in optical/electrical communication systems.

  2. Canonical variate regression.

    Science.gov (United States)

    Luo, Chongliang; Liu, Jin; Dey, Dipak K; Chen, Kun

    2016-07-01

    In many fields, multi-view datasets, measuring multiple distinct but interrelated sets of characteristics on the same set of subjects, together with data on certain outcomes or phenotypes, are routinely collected. The objective in such a problem is often two-fold: both to explore the association structures of multiple sets of measurements and to develop a parsimonious model for predicting the future outcomes. We study a unified canonical variate regression framework to tackle the two problems simultaneously. The proposed criterion integrates multiple canonical correlation analysis with predictive modeling, balancing between the association strength of the canonical variates and their joint predictive power on the outcomes. Moreover, the proposed criterion seeks multiple sets of canonical variates simultaneously to enable the examination of their joint effects on the outcomes, and is able to handle multivariate and non-Gaussian outcomes. An efficient algorithm based on variable splitting and Lagrangian multipliers is proposed. Simulation studies show the superior performance of the proposed approach. We demonstrate the effectiveness of the proposed approach in an [Formula: see text] intercross mice study and an alcohol dependence study. © The Author 2016. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  3. Is Posidonia oceanica regression a general feature in the Mediterranean Sea?

    Directory of Open Access Journals (Sweden)

    M. BONACORSI

    2013-03-01

Full Text Available Over the last few years, a widespread regression of Posidonia oceanica meadows has been noticed in the Mediterranean Sea. However, the magnitude of this decline is still debated. The objectives of this study are (i) to assess the spatio-temporal evolution of Posidonia oceanica around Cap Corse (Corsica) over time by comparing available ancient maps (from 1960) with a new (2011) detailed map realized by combining different techniques (aerial photographs, SSS, ROV, scuba diving); (ii) to evaluate the reliability of the ancient maps; and (iii) to discuss the observed regression of the meadows in relation to human pressure along the 110 km of coast. The comparison with previous data shows that, apart from sites where evolution is clearly identified, the surfaces occupied by the seagrass Posidonia oceanica are relatively stable. The recorded differences seem more related to changes in mapping techniques. These results confirm that in areas characterized by a moderate anthropogenic impact the Posidonia oceanica meadow shows no significant regression, and that the changes due to the evolution of mapping techniques are not negligible. However, other facts (e.g. the surfaces actually mapped) should be taken into account before extrapolating to the Mediterranean Sea and assessing the amplitude of the actual regression.

  4. Can double-peaked lines indicate merging effects in AGNs?

    Directory of Open Access Journals (Sweden)

    Popović L.Č.

    2000-01-01

Full Text Available The influence of merging effects in the central part of an Active Galactic Nucleus (AGN) on the emission spectral line shapes is discussed. We present a model of a close binary Broad Line Region (BLR). The numerical experiments show that merging effects can explain double-peaked lines. The merging effects may also be present in the centers of AGNs, although they emit slightly asymmetric as well as symmetric and relatively stable (in profile shape) spectral lines. Depending on the black hole masses and their orbital elements, such a model may explain some of the line profile shapes observed in AGNs. This work shows that if one is looking for merging effects in the central region as well as in the wide field structure of AGNs, one should first pay attention to objects which have double-peaked lines.

  5. Spontaneous regression of primary cutaneous diffuse large B-cell lymphoma, leg type with significant T-cell immune response

    Directory of Open Access Journals (Sweden)

    Paul M. Graham, DO

    2018-05-01

    Full Text Available We report a case of histologically confirmed primary cutaneous diffuse large B-cell lymphoma, leg type (PCDLBCL-LT that subsequently underwent spontaneous regression in the absence of systemic treatment. The case showed an atypical lymphoid infiltrate that was CD20+ and MUM-1+ and CD10–. A subsequent biopsy of the spontaneously regressed lesion showed fibrosis associated with a lymphocytic infiltrate comprising reactive T cells. PCDLBCL-LT is a cutaneous B-cell lymphoma with a poor prognosis, which is usually treated with chemotherapy. We describe a case of clinical and histologic spontaneous regression in a patient with PCDLBCL-LT who had a negative systemic workup but a recurrence over a year after his initial presentation. Key words: B cell, lymphoma, primary cutaneous diffuse large B-cell lymphoma, leg type, regression

  6. Coupled Transmission Lines as Impedance Transformer

    DEFF Research Database (Denmark)

    Jensen, Thomas; Zhurbenko, Vitaliy; Krozer, Viktor

    2007-01-01

A theoretical investigation of the use of a coupled line section as an impedance transformer is presented. We show how to properly select the terminations of the coupled line structures for effective matching of real and complex loads in both narrow and wide frequency ranges. The corresponding circuit configurations and the design procedures are proposed. Synthesis relations are derived and provided for efficient matching circuit construction. Design examples are given to demonstrate the flexibility and limitations of the design methods and to show their validity for practical applications...

  7. Short-term solar irradiation forecasting based on Dynamic Harmonic Regression

    International Nuclear Information System (INIS)

    Trapero, Juan R.; Kourentzes, Nikolaos; Martin, A.

    2015-01-01

    Solar power generation is a crucial research area for countries that have high dependency on fossil energy sources and is gaining prominence with the current shift to renewable sources of energy. In order to integrate the electricity generated by solar energy into the grid, solar irradiation must be reasonably well forecasted, where deviations of the forecasted value from the actual measured value involve significant costs. The present paper proposes a univariate Dynamic Harmonic Regression model set up in a State Space framework for short-term (1–24 h) solar irradiation forecasting. Time series hourly aggregated as the Global Horizontal Irradiation and the Direct Normal Irradiation will be used to illustrate the proposed approach. This method provides a fast automatic identification and estimation procedure based on the frequency domain. Furthermore, the recursive algorithms applied offer adaptive predictions. The good forecasting performance is illustrated with solar irradiance measurements collected from ground-based weather stations located in Spain. The results show that the Dynamic Harmonic Regression achieves the lowest relative Root Mean Squared Error; about 30% and 47% for the Global and Direct irradiation components, respectively, for a forecast horizon of 24 h ahead. - Highlights: • Solar irradiation forecasts at short-term are required to operate solar power plants. • This paper assesses the Dynamic Harmonic Regression to forecast solar irradiation. • Models are evaluated using hourly GHI and DNI data collected in Spain. • The results show that forecasting accuracy is improved by using the model proposed
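
    As a rough illustration of the harmonic-regression backbone, the sketch below fits fixed-coefficient sine and cosine terms of the daily cycle to a simulated hourly irradiance series and produces a 24 h-ahead forecast. The paper's Dynamic Harmonic Regression additionally lets these coefficients evolve in a state-space model with recursive (adaptive) estimation, which is not reproduced here.

```python
import numpy as np

# Static harmonic regression on a simulated hourly irradiance series.
rng = np.random.default_rng(11)
hours = np.arange(24 * 60)                                   # 60 days, hourly
daily = np.clip(np.sin(2 * np.pi * (hours % 24 - 6) / 24), 0, None)
ghi = 800 * daily + 40 * rng.normal(size=hours.size)         # toy GHI in W/m^2

def harmonics(h, n_harm=3):
    cols = [np.ones_like(h, dtype=float)]
    for k in range(1, n_harm + 1):
        cols += [np.sin(2 * np.pi * k * h / 24), np.cos(2 * np.pi * k * h / 24)]
    return np.column_stack(cols)

train, test = hours[:-24], hours[-24:]
beta, *_ = np.linalg.lstsq(harmonics(train), ghi[:-24], rcond=None)
forecast = np.clip(harmonics(test) @ beta, 0, None)          # next 24 h ahead

rmse = np.sqrt(np.mean((forecast - ghi[-24:]) ** 2))
print("24 h-ahead RMSE (W/m^2):", round(float(rmse), 1))
```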

  8. Model-based Quantile Regression for Discrete Data

    KAUST Repository

    Padellini, Tullia

    2018-04-10

Quantile regression is a class of methods devoted to the modelling of conditional quantiles. In a Bayesian framework quantile regression has typically been carried out exploiting the Asymmetric Laplace Distribution as a working likelihood. Despite the fact that this leads to a proper posterior for the regression coefficients, the resulting posterior variance is however affected by an unidentifiable parameter, hence any inferential procedure beside point estimation is unreliable. We propose a model-based approach for quantile regression that considers quantiles of the generating distribution directly, and thus allows for a proper uncertainty quantification. We then create a link between quantile regression and generalised linear models by mapping the quantiles to the parameter of the response variable, and we exploit it to fit the model with R-INLA. We extend it also to the case of discrete responses, where there is no 1-to-1 relationship between quantiles and the distribution's parameter, by introducing continuous generalisations of the most common discrete variables (Poisson, Binomial and Negative Binomial) to be exploited in the fitting.
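
    For contrast with the model-based Bayesian formulation described above (which is fitted with R-INLA and is not reproduced here), the sketch below runs standard frequentist quantile regression on simulated heteroscedastic data, where different quantiles genuinely have different slopes.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Classical quantile regression at three quantile levels on simulated data.
rng = np.random.default_rng(12)
x = rng.uniform(0, 10, 300)
y = 1.0 + 0.5 * x + (0.2 + 0.15 * x) * rng.normal(size=300)   # heteroscedastic noise
df = pd.DataFrame({"x": x, "y": y})

for q in (0.1, 0.5, 0.9):
    fit = smf.quantreg("y ~ x", df).fit(q=q)
    print(f"quantile {q}: intercept {fit.params['Intercept']:.2f}, slope {fit.params['x']:.2f}")
```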

  9. Human small-cell lung cancers show amplification and expression of the N-myc gene

    International Nuclear Information System (INIS)

    Nau, M.M.; Brooks, B.J. Jr.; Carney, D.N.; Gazdar, A.F.; Battey, J.F.; Sausville, E.A.; Minna, J.D.

    1986-01-01

The authors have found that 6 of 31 independently derived human small-cell lung cancer (SCLC) cell lines have 5- to 170-fold amplified N-myc gene sequences. The amplification is seen with probes from two separate exons of N-myc, which are homologous to either the second or the third exon of the c-myc gene. Amplified N-myc sequences were found in a tumor cell line started prior to chemotherapy, in SCLC tumor samples harvested directly from tumor metastases at autopsy, and from a resected primary lung cancer. Several N-myc-amplified tumor cell lines also exhibited N-myc hybridizing fragments not in the germ-line position. In one patient's tumor, an additional amplified N-myc DNA fragment was observed and this fragment was heterogeneously distributed in liver metastases. In contrast to SCLC with neuroendocrine properties, no non-small-cell lung cancer lines examined were found to have N-myc amplification. Fragments encoding two N-myc exons also detect increased amounts of a 3.1-kilobase N-myc mRNA in N-myc-amplified SCLC lines and in one cell line that does not show N-myc gene amplification. Both DNA and RNA hybridization experiments, using a 32P-labelled restriction probe, show that in any one SCLC cell line, only one myc-related gene is amplified and expressed. They conclude that N-myc amplification is both common and potentially significant in the tumorigenesis or tumor progression of SCLC.

  10. riskRegression

    DEFF Research Database (Denmark)

    Ozenne, Brice; Sørensen, Anne Lyngholm; Scheike, Thomas

    2017-01-01

In the presence of competing risks a prediction of the time-dynamic absolute risk of an event can be based on cause-specific Cox regression models for the event and the competing risks (Benichou and Gail, 1990). We present computationally fast and memory optimized C++ functions with an R interface. … As a by-product we obtain fast access to the baseline hazards (compared to survival::basehaz()) and predictions of survival probabilities, their confidence intervals and confidence bands. Confidence intervals and confidence bands are based on point-wise asymptotic expansions of the corresponding statistical...

  11. Radiosensitivity of mesothelioma cell lines

    International Nuclear Information System (INIS)

    Haekkinen, A.M.; Laasonen, A.; Linnainmaa, K.; Mattson, K.; Pyrhoenen, S.

    1996-01-01

The present study was carried out in order to examine the radiosensitivity of malignant pleural mesothelioma cell lines. Cell kinetics, radiation-induced delay of the cell cycle and DNA ploidy of the cell lines were also determined. For comparison, a HeLa and a human foetal fibroblast cell line were simultaneously explored. Six previously cytogenetically and histologically characterized mesothelioma tumor cell lines were applied. A rapid tiazolyl blue microtiter (MTT) assay was used to analyze radiosensitivity, and cell kinetics and DNA ploidy of the cultured cells were determined by flow cytometry. The survival fraction after a dose of 2 Gy (SF2), the parameters α and β of the linear quadratic model (LQ-model) and the mean inactivation dose (D_MID) were also estimated. The DNA index of four cell lines equaled 1.0 and two cell lines equaled 1.5 and 1.6. Different mesothelioma cell lines showed a great variation in radiosensitivity. The mean survival fraction after a radiation dose of 2 Gy (SF2) was 0.60 and ranged from 0.36 to 0.81, and the mean α value was 0.26 (range 0.083-0.48). The SF2 of the most sensitive diploid mesothelioma cell line was 0.36, less than that of the foetal fibroblast cell line (0.49). The survival fractions (0.81 and 0.74) of the two most resistant cell lines, which also were aneuploid, were equal to that of the HeLa cell line (0.78). The α/β ratios of the most sensitive cell lines were almost an order of magnitude greater than those of the two most resistant cell lines. Radiation-induced delay of the most resistant aneuploid cell line was similar to that of HeLa cells, but in the most sensitive (diploid) cells there was practically no entry into the G1 phase following the 2 Gy radiation dose during 36 h. (orig.)
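
    The linear-quadratic analysis referred to above is compact enough to sketch. The survival fractions below are invented for illustration (they are not the mesothelioma measurements); the code fits SF(D) = exp(-(αD + βD²)), then reports SF2, the α/β ratio, and the mean inactivation dose as the area under the fitted survival curve.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.integrate import quad

# Linear-quadratic fit to illustrative clonogenic survival data.
doses = np.array([0, 1, 2, 4, 6, 8], dtype=float)             # Gy
sf = np.array([1.0, 0.78, 0.60, 0.30, 0.12, 0.04])            # survival fractions (invented)

def lq(d, alpha, beta):
    return np.exp(-(alpha * d + beta * d ** 2))

(alpha, beta), _ = curve_fit(lq, doses, sf, p0=[0.2, 0.02])
sf2 = lq(2.0, alpha, beta)
d_mid, _ = quad(lambda d: lq(d, alpha, beta), 0, 20)          # mean inactivation dose

print(f"alpha={alpha:.3f} 1/Gy, beta={beta:.3f} 1/Gy^2, alpha/beta={alpha/beta:.1f} Gy")
print(f"SF2={sf2:.2f}, mean inactivation dose={d_mid:.2f} Gy")
```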

  12. Radiosensitivity of mesothelioma cell lines

    Energy Technology Data Exchange (ETDEWEB)

    Haekkinen, A.M. [Dept. of Oncology, Univ. Central Hospital, Helsinki (Finland); Laasonen, A. [Dept. of Pathology, Central Hospital of Etelae-Pohjanmaa, Seinaejoki (Finland); Linnainmaa, K. [Dept. of Industrial Hygiene and Toxicology, Inst. of Occupational Health, Helsinki (Finland); Mattson, K. [Dept. Pulmonary Medicine, Univ. Central Hospital, Helsinki (Finland); Pyrhoenen, S. [Dept. of Oncology, Univ. Central Hospital, Helsinki (Finland)

    1996-10-01

The present study was carried out in order to examine the radiosensitivity of malignant pleural mesothelioma cell lines. Cell kinetics, radiation-induced delay of the cell cycle and DNA ploidy of the cell lines were also determined. For comparison, a HeLa and a human foetal fibroblast cell line were simultaneously explored. Six previously cytogenetically and histologically characterized mesothelioma tumor cell lines were applied. A rapid tiazolyl blue microtiter (MTT) assay was used to analyze radiosensitivity, and cell kinetics and DNA ploidy of the cultured cells were determined by flow cytometry. The survival fraction after a dose of 2 Gy (SF2), the parameters α and β of the linear quadratic model (LQ-model) and the mean inactivation dose (D_MID) were also estimated. The DNA index of four cell lines equaled 1.0 and two cell lines equaled 1.5 and 1.6. Different mesothelioma cell lines showed a great variation in radiosensitivity. The mean survival fraction after a radiation dose of 2 Gy (SF2) was 0.60 and ranged from 0.36 to 0.81, and the mean α value was 0.26 (range 0.083-0.48). The SF2 of the most sensitive diploid mesothelioma cell line was 0.36, less than that of the foetal fibroblast cell line (0.49). The survival fractions (0.81 and 0.74) of the two most resistant cell lines, which also were aneuploid, were equal to that of the HeLa cell line (0.78). The α/β ratios of the most sensitive cell lines were almost an order of magnitude greater than those of the two most resistant cell lines. Radiation-induced delay of the most resistant aneuploid cell line was similar to that of HeLa cells, but in the most sensitive (diploid) cells there was practically no entry into the G1 phase following the 2 Gy radiation dose during 36 h. (orig.).

  13. A Trajectory Regression Clustering Technique Combining a Novel Fuzzy C-Means Clustering Algorithm with the Least Squares Method

    Directory of Open Access Journals (Sweden)

    Xiangbing Zhou

    2018-04-01

Full Text Available Rapidly growing GPS (Global Positioning System) trajectories hide much valuable information, such as city road planning, urban travel demand, and population migration. In order to mine the hidden information and to capture better clustering results, a trajectory regression clustering method (an unsupervised trajectory clustering method) is proposed to reduce local information loss of the trajectory and to avoid getting stuck in the local optimum. Using this method, we first define our new concept of trajectory clustering and construct a novel partitioning (angle-based partitioning) method for line segments; second, the Lagrange-based method and Hausdorff-based K-means++ are integrated into fuzzy C-means (FCM) clustering, which are used to maintain the stability and the robustness of the clustering process; finally, a least squares regression model is employed to achieve regression clustering of the trajectory. In our experiment, the performance and effectiveness of our method are validated against real-world taxi GPS data. When comparing our clustering algorithm with the partition-based clustering algorithms (K-means, K-median, and FCM), our experimental results demonstrate that the presented method is more effective and generates a more reasonable trajectory.
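
    A compact fuzzy regression-clustering sketch conveys the core alternation the abstract describes: weighted least-squares fits per cluster and fuzzy membership updates from residuals (the classic fuzzy c-regression scheme). The GPS-trajectory features, angle-based partitioning, Hausdorff-based K-means++ seeding and Lagrange-based refinement of the paper are not reproduced; the data here are two noisy synthetic lines.

```python
import numpy as np

# Fuzzy c-regression: alternate weighted least squares and membership updates.
rng = np.random.default_rng(13)
x = rng.uniform(0, 10, 200)
y = np.where(rng.uniform(size=200) < 0.5, 1.0 + 2.0 * x, 8.0 - 1.0 * x)
y = y + rng.normal(scale=0.4, size=200)

c, m = 2, 2.0                                        # clusters, fuzzifier
X = np.column_stack([np.ones_like(x), x])
U = rng.dirichlet(np.ones(c), size=x.size)           # initial fuzzy memberships

for _ in range(50):
    coefs = []
    for j in range(c):                               # weighted LS fit per cluster
        w = U[:, j] ** m
        W = X * w[:, None]
        beta = np.linalg.solve(X.T @ W + 1e-8 * np.eye(2), X.T @ (w * y))
        coefs.append(beta)
    resid2 = np.column_stack([(y - X @ b) ** 2 for b in coefs]) + 1e-12
    U = (1.0 / resid2) ** (1 / (m - 1))              # FCM-style membership update
    U = U / U.sum(axis=1, keepdims=True)

for j, b in enumerate(coefs):
    print(f"cluster {j}: intercept {b[0]:.2f}, slope {b[1]:.2f}")
```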

  14. Real estate value prediction using multivariate regression models

    Science.gov (United States)

    Manjula, R.; Jain, Shubham; Srivastava, Sharad; Rajiv Kher, Pranav

    2017-11-01

The real estate market is one of the most competitive in terms of pricing, and prices tend to vary significantly based on a lot of factors; hence it becomes one of the prime fields in which to apply the concepts of machine learning to optimize and predict prices with high accuracy. Therefore, in this paper, we present various important features to use while predicting housing prices with good accuracy. We describe regression models using various features to achieve a lower residual sum of squares error. When using features in a regression model, some feature engineering is required for better prediction. Often a set of features (multiple regression) or polynomial regression (applying various powers of the features) is used to obtain a better model fit. Because these models are expected to be susceptible to overfitting, ridge regression is used to reduce it. This paper thus points to the best application of regression models, in addition to other techniques, to optimize the result.
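
    A hedged sketch of the modelling strategy the abstract outlines: plain multiple regression, a polynomial expansion, and a ridge-penalised polynomial model to curb the overfitting the expansion invites. The feature names, simulated prices, and degree and penalty choices are illustrative only.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures, StandardScaler
from sklearn.linear_model import LinearRegression, Ridge
from sklearn.model_selection import train_test_split

# Linear vs polynomial vs ridge-penalised polynomial regression on simulated house data.
rng = np.random.default_rng(14)
n = 400
area, rooms, age = rng.uniform(50, 250, n), rng.integers(1, 6, n), rng.uniform(0, 40, n)
price = 1.2 * area + 15 * rooms - 0.8 * age + 0.004 * area ** 2 + rng.normal(scale=20, size=n)
X = np.column_stack([area, rooms, age])
Xtr, Xte, ytr, yte = train_test_split(X, price, random_state=0)

models = {
    "linear": make_pipeline(StandardScaler(), LinearRegression()),
    "poly deg 3": make_pipeline(PolynomialFeatures(3), StandardScaler(), LinearRegression()),
    "poly deg 3 + ridge": make_pipeline(PolynomialFeatures(3), StandardScaler(), Ridge(alpha=10.0)),
}
for name, model in models.items():
    model.fit(Xtr, ytr)
    print(f"{name:18s} test R^2 = {model.score(Xte, yte):.3f}")
```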

  15. Regression analysis for LED color detection of visual-MIMO system

    Science.gov (United States)

    Banik, Partha Pratim; Saha, Rappy; Kim, Ki-Doo

    2018-04-01

Color detection from a light emitting diode (LED) array using a smartphone camera is very difficult in a visual multiple-input multiple-output (visual-MIMO) system. In this paper, we propose a method to determine the LED color using a smartphone camera by applying regression analysis. We employ a multivariate regression model to identify the LED color. After taking a picture of an LED array, we select the LED array region and detect the LED using an image processing algorithm. We then apply the k-means clustering algorithm to determine the number of potential colors for feature extraction of each LED. Finally, we apply the multivariate regression model to predict the color of the transmitted LEDs. In this paper, we show our results for three types of environmental light conditions: room environmental light, low environmental light (560 lux), and strong environmental light (2450 lux). We compare the results of our proposed algorithm through the analysis of training and test R-squared (%) values and the percentage of closeness between transmitted and predicted colors, and we also report the number of distorted test data points from the analysis of the distortion bar graph in the CIE 1931 color space.

  16. Higher-order Multivariable Polynomial Regression to Estimate Human Affective States

    Science.gov (United States)

    Wei, Jie; Chen, Tong; Liu, Guangyuan; Yang, Jiemin

    2016-03-01

From direct observations of facial, vocal, gestural, physiological, and central nervous signals, estimating human affective states through computational models such as multivariate linear regression analysis, support vector regression, and artificial neural networks has been proposed in the past decade. In these models, linear models generally lack precision because they ignore the intrinsic nonlinearities of complex psychophysiological processes, while nonlinear models commonly adopt complicated algorithms. To improve accuracy and simplify the model, we introduce a new computational modeling method, named higher-order multivariable polynomial regression, to estimate human affective states. The study employs standardized pictures from the International Affective Picture System to induce thirty subjects' affective states, and obtains pure affective patterns of skin conductance as input variables to the higher-order multivariable polynomial model for predicting affective valence and arousal. Experimental results show that our method is able to obtain correlation coefficients of 0.98 and 0.96 for the estimation of affective valence and arousal, respectively. Moreover, the method may provide certain indirect evidence that valence and arousal have their origins in the brain's motivational circuits. Thus, the proposed method can serve as a novel approach for efficiently estimating human affective states.

  17. Single image super-resolution using locally adaptive multiple linear regression.

    Science.gov (United States)

    Yu, Soohwan; Kang, Wonseok; Ko, Seungyong; Paik, Joonki

    2015-12-01

    This paper presents a regularized superresolution (SR) reconstruction method using locally adaptive multiple linear regression to overcome the limitation of spatial resolution of digital images. In order to make the SR problem better-posed, the proposed method incorporates the locally adaptive multiple linear regression into the regularization process as a local prior. The local regularization prior assumes that the target high-resolution (HR) pixel is generated by a linear combination of similar pixels in differently scaled patches and optimum weight parameters. In addition, we adapt a modified version of the nonlocal means filter as a smoothness prior to utilize the patch redundancy. Experimental results show that the proposed algorithm better restores HR images than existing state-of-the-art methods in the sense of the most objective measures in the literature.

  18. Transcriptome analysis of spermatogenically regressed, recrudescent and active phase testis of seasonally breeding wall lizards Hemidactylus flaviviridis.

    Directory of Open Access Journals (Sweden)

    Mukesh Gautam

    Full Text Available Reptiles are a phylogenetically important group of organisms, as mammals have evolved from them. The wall lizard testis exhibits clearly distinct morphology during the various phases of the reproductive cycle, making it an interesting model to study regulation of spermatogenesis. Studies on reptile spermatogenesis are scarce, so this study will prove to be an important resource. Histological analyses show complete regression of the seminiferous tubules during the regressed phase, with retracted Sertoli cells and spermatogonia. In the recrudescent phase, the regressed testis regains cellular activity, showing normal Sertoli cells and developing germ cells. In the active phase, the testis reaches its maximum size, with enlarged seminiferous tubules and sperm present in the seminiferous lumen. Total RNA extracted from whole testis of the regressed, recrudescent and active phases of the wall lizard was hybridized on a Mouse Whole Genome 8×60 K format gene chip. Microarray data from the regressed phase were treated as the control group. Microarray data were validated by assessing the expression of selected genes using quantitative real-time PCR. The genes prominently expressed in recrudescent and active phase testis belong to categories such as cytoskeleton organization (GO:0005856), cell growth (GO:0045927), GTPase regulator activity (GO:0030695), transcription (GO:0006352), apoptosis (GO:0006915) and many other biological processes. The genes showing higher expression in the regressed phase belonged to functional categories such as negative regulation of macromolecule metabolic process (GO:0010605), negative regulation of gene expression (GO:0010629) and maintenance of stem cell niche (GO:0045165). This is the first exploratory study profiling the transcriptome of three drastically different conditions of any reptilian testis. The genes expressed in the testis during the regressed, recrudescent and active phases of the reproductive cycle are in concordance with the testis morphology during these phases. This study will pave

  19. In-line near real time monitoring of fluid streams in separation processes for used nuclear fuel - 5146

    International Nuclear Information System (INIS)

    Nee, K.; Nilsson, M.

    2015-01-01

    Applying spectroscopic tools to chemical processes has been intensively studied in various industries owing to their rapid, non-destructive ability to detect chemical components and determine physical characteristics in a process stream. The general complexity of separation processes for used nuclear fuel, e.g., chemical speciation, temperature variations, and prominent process security and safety concerns, requires a well-secured and robust monitoring system that provides precise information on the process streams in near real time without interference. Multivariate analysis combined with spectral measurements is a powerful statistical technique that can be used to monitor this complex chemical system. In this work, chemometric models that respond to the chemical components in the fluid samples were calibrated and validated to establish an in-line near real time monitoring system. The models show good prediction accuracy using partial least squares regression analysis on the spectral data obtained from UV/Vis/NIR spectroscopy. The models were tested on a solvent extraction process using a single stage centrifugal contactor in our laboratory to determine the performance of an in-line near real time monitoring system. (authors)
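
    As an illustration of the chemometric step, the hedged sketch below calibrates a partial least squares model on simulated absorbance spectra; the wavelength grid, component peaks and the scikit-learn PLSRegression class are assumptions for demonstration only, not the authors' tooling.

```python
# Illustrative PLS calibration on simulated spectra (not the authors' data).
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
wavelengths = np.linspace(400, 1100, 300)                  # nm, assumed grid
conc = rng.uniform(0.01, 0.5, size=(60, 2))                # two components, mol/L
pure1 = np.exp(-0.5 * ((wavelengths - 550) / 30) ** 2)     # assumed pure spectra
pure2 = np.exp(-0.5 * ((wavelengths - 800) / 40) ** 2)
spectra = conc @ np.vstack([pure1, pure2]) + rng.normal(0, 0.002, (60, 300))

X_tr, X_te, y_tr, y_te = train_test_split(spectra, conc, random_state=0)
pls = PLSRegression(n_components=4).fit(X_tr, y_tr)
print("validation R^2:", round(pls.score(X_te, y_te), 3))
```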

  20. Laser based thermo-conductometry as an approach to determine ribbon solid fraction off-line and in-line.

    Science.gov (United States)

    Wiedey, Raphael; Šibanc, Rok; Kleinebudde, Peter

    2018-06-06

    Ribbon solid fraction is one of the most important quality attributes in roll compaction/dry granulation. Accurate and precise determination is challenging, and no in-line measurement tool has yet been generally accepted. In this study, a new analytical tool with potential off-line as well as in-line applicability is described. It is based on the thermal conductivity of the compacted material, which is known to depend on the solid fraction. A laser diode was used to heat the ribbon at a single point, and the heat propagation was monitored by infrared thermography. After performing a Gaussian fit of the transverse ribbon profile, the scale parameter σ correlated with ribbon solid fraction in off-line as well as in-line studies. Accurate predictions of the solid fraction were possible for a relevant range of process settings. Drug stability was not affected, as demonstrated for the model drug nifedipine. The application of this technique was limited when using certain fillers and when working at higher roll speeds. This study shows the potential of this new technique and is a starting point for the additional work needed to overcome these challenges. Copyright © 2018 Elsevier B.V. All rights reserved.
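
    The Gaussian-fit step lends itself to a short sketch; the profile below is simulated and every number is a placeholder, so this only illustrates how the scale parameter σ would be extracted from a measured transverse temperature profile.

```python
# Sketch (assumed data, not the authors' setup): fit a Gaussian to a transverse
# temperature profile recorded by an IR camera and extract the scale parameter
# sigma, the quantity that correlated with ribbon solid fraction in the study.
import numpy as np
from scipy.optimize import curve_fit

def gaussian(x, amp, mu, sigma, offset):
    return amp * np.exp(-0.5 * ((x - mu) / sigma) ** 2) + offset

x = np.linspace(-10.0, 10.0, 200)                 # position across ribbon, mm
rng = np.random.default_rng(3)
profile = gaussian(x, amp=6.0, mu=0.0, sigma=2.5, offset=25.0)
profile += rng.normal(0, 0.1, x.size)             # simulated temperature, deg C

popt, _ = curve_fit(gaussian, x, profile, p0=[5.0, 0.0, 2.0, 24.0])
sigma = abs(popt[2])
print(f"fitted scale parameter sigma = {sigma:.2f} mm")
# A calibration step (e.g., regressing solid fraction on sigma) would then
# convert sigma into a solid-fraction estimate.
```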

  1. A Bayesian goodness of fit test and semiparametric generalization of logistic regression with measurement data.

    Science.gov (United States)

    Schörgendorfer, Angela; Branscum, Adam J; Hanson, Timothy E

    2013-06-01

    Logistic regression is a popular tool for risk analysis in medical and population health science. With continuous response data, it is common to create a dichotomous outcome for logistic regression analysis by specifying a threshold for positivity. Fitting a linear regression to the nondichotomized response variable assuming a logistic sampling model for the data has been empirically shown to yield more efficient estimates of odds ratios than ordinary logistic regression of the dichotomized endpoint. We illustrate that risk inference is not robust to departures from the parametric logistic distribution. Moreover, the model assumption of proportional odds is generally not satisfied when the condition of a logistic distribution for the data is violated, leading to biased inference from a parametric logistic analysis. We develop novel Bayesian semiparametric methodology for testing goodness of fit of parametric logistic regression with continuous measurement data. The testing procedures hold for any cutoff threshold and our approach simultaneously provides the ability to perform semiparametric risk estimation. Bayes factors are calculated using the Savage-Dickey ratio for testing the null hypothesis of logistic regression versus a semiparametric generalization. We propose a fully Bayesian and a computationally efficient empirical Bayesian approach to testing, and we present methods for semiparametric estimation of risks, relative risks, and odds ratios when parametric logistic regression fails. Theoretical results establish the consistency of the empirical Bayes test. Results from simulated data show that the proposed approach provides accurate inference irrespective of whether parametric assumptions hold or not. Evaluation of risk factors for obesity shows that different inferences are derived from an analysis of a real data set when deviations from a logistic distribution are permissible in a flexible semiparametric framework. © 2013, The International Biometric

  2. Computing multiple-output regression quantile regions

    Czech Academy of Sciences Publication Activity Database

    Paindaveine, D.; Šiman, Miroslav

    2012-01-01

    Roč. 56, č. 4 (2012), s. 840-853 ISSN 0167-9473 R&D Projects: GA MŠk(CZ) 1M06047 Institutional research plan: CEZ:AV0Z10750506 Keywords : halfspace depth * multiple-output regression * parametric linear programming * quantile regression Subject RIV: BA - General Mathematics Impact factor: 1.304, year: 2012 http://library.utia.cas.cz/separaty/2012/SI/siman-0376413.pdf

  3. Preface to Berk's "Regression Analysis: A Constructive Critique"

    OpenAIRE

    de Leeuw, Jan

    2003-01-01

    It is a pleasure to write a preface for the book "Regression Analysis" by my fellow series editor Dick Berk. And it is a pleasure in particular because the book is about regression analysis, the most popular and most fundamental technique in applied statistics, and because it is critical of the way regression analysis is used in the sciences, in particular the social and behavioral sciences. Although the book can be read as an introduction to regression analysis, it can also be read as a...

  4. Five cases of caudal regression with an aberrant abdominal umbilical artery: Further support for a caudal regression-sirenomelia spectrum.

    Science.gov (United States)

    Duesterhoeft, Sara M; Ernst, Linda M; Siebert, Joseph R; Kapur, Raj P

    2007-12-15

    Sirenomelia and caudal regression have sparked centuries of interest and recent debate regarding their classification and pathogenetic relationship. Specific anomalies are common to both conditions, but aside from fusion of the lower extremities, an aberrant abdominal umbilical artery ("persistent vitelline artery") has been invoked as the chief anatomic finding that distinguishes sirenomelia from caudal regression. This observation is important from a pathogenetic viewpoint, in that diversion of blood away from the caudal portion of the embryo through the abdominal umbilical artery ("vascular steal") has been proposed as the primary mechanism leading to sirenomelia. In contrast, caudal regression is hypothesized to arise from primary deficiency of caudal mesoderm. We present five cases of caudal regression that exhibit an aberrant abdominal umbilical artery similar to that typically associated with sirenomelia. Review of the literature identified four similar cases. Collectively, the series lends support for a caudal regression-sirenomelia spectrum with a common pathogenetic basis and suggests that abnormal umbilical arterial anatomy may be the consequence, rather than the cause, of deficient caudal mesoderm. (c) 2007 Wiley-Liss, Inc.

  5. Model-based Quantile Regression for Discrete Data

    KAUST Repository

    Padellini, Tullia; Rue, Haavard

    2018-01-01

    Quantile regression is a class of methods devoted to the modelling of conditional quantiles. In a Bayesian framework quantile regression has typically been carried out exploiting the Asymmetric Laplace Distribution as a working likelihood. Despite

  6. A study of machine learning regression methods for major elemental analysis of rocks using laser-induced breakdown spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Boucher, Thomas F., E-mail: boucher@cs.umass.edu [School of Computer Science, University of Massachusetts Amherst, 140 Governor's Drive, Amherst, MA 01003, United States. (United States); Ozanne, Marie V. [Department of Astronomy, Mount Holyoke College, South Hadley, MA 01075 (United States); Carmosino, Marco L. [School of Computer Science, University of Massachusetts Amherst, 140 Governor's Drive, Amherst, MA 01003, United States. (United States); Dyar, M. Darby [Department of Astronomy, Mount Holyoke College, South Hadley, MA 01075 (United States); Mahadevan, Sridhar [School of Computer Science, University of Massachusetts Amherst, 140 Governor's Drive, Amherst, MA 01003, United States. (United States); Breves, Elly A.; Lepore, Kate H. [Department of Astronomy, Mount Holyoke College, South Hadley, MA 01075 (United States); Clegg, Samuel M. [Los Alamos National Laboratory, P.O. Box 1663, MS J565, Los Alamos, NM 87545 (United States)

    2015-05-01

    The ChemCam instrument on the Mars Curiosity rover is generating thousands of LIBS spectra and bringing interest in this technique to public attention. The key to interpreting Mars or any other type of LIBS data is a calibration that relates laboratory standards to unknowns examined in other settings and enables predictions of chemical composition. Here, LIBS spectral data are analyzed using linear regression methods including partial least squares (PLS-1 and PLS-2), principal component regression (PCR), least absolute shrinkage and selection operator (lasso), elastic net, and linear support vector regression (SVR-Lin). These were compared against results from nonlinear regression methods including kernel principal component regression (K-PCR), polynomial kernel support vector regression (SVR-Py) and k-nearest neighbor (kNN) regression to discern the most effective models for interpreting chemical abundances from LIBS spectra of geological samples. The results were evaluated for 100 samples analyzed with 50 laser pulses at each of five locations averaged together. Wilcoxon signed-rank tests were employed to evaluate the statistical significance of differences among the nine models using their predicted residual sum of squares (PRESS) to make comparisons. For MgO, SiO₂, Fe₂O₃, CaO, and MnO, the sparse models outperform all the others except for linear SVR, while for Na₂O, K₂O, TiO₂, and P₂O₅, the sparse methods produce inferior results, likely because their emission lines in this energy range have lower transition probabilities. The strong performance of the sparse methods in this study suggests that use of dimensionality-reduction techniques as a preprocessing step may improve the performance of the linear models. Nonlinear methods tend to overfit the data and predict less accurately, while the linear methods proved to be more generalizable with better predictive performance. These results are attributed to the high
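
    A compressed, hedged example of the kind of comparison the study performs is given below: a sparse linear model (lasso) versus a nonlinear model (k-nearest neighbors) on synthetic "spectra", scored by cross-validated error rather than PRESS; none of the data, channels or settings come from the paper.

```python
# Minimal comparison sketch in the spirit of the study (synthetic spectra, not
# ChemCam data): predict an oxide abundance from spectra with a sparse linear
# model and a nonlinear model, comparing cross-validated mean squared error.
import numpy as np
from sklearn.linear_model import LassoCV
from sklearn.neighbors import KNeighborsRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(4)
n_samples, n_channels = 100, 500
spectra = rng.normal(size=(n_samples, n_channels))
# Abundance depends on a handful of emission-line channels (sparse ground truth).
abundance = spectra[:, [10, 50, 200]] @ np.array([2.0, -1.0, 0.5])
abundance += rng.normal(scale=0.1, size=n_samples)

for name, model in [("lasso", LassoCV(cv=5)),
                    ("kNN", KNeighborsRegressor(n_neighbors=5))]:
    mse = -cross_val_score(model, spectra, abundance, cv=5,
                           scoring="neg_mean_squared_error").mean()
    print(f"{name}: cross-validated MSE = {mse:.3f}")
```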

  7. Composite marginal quantile regression analysis for longitudinal adolescent body mass index data.

    Science.gov (United States)

    Yang, Chi-Chuan; Chen, Yi-Hau; Chang, Hsing-Yi

    2017-09-20

    Childhood and adolescent overweight or obesity, which may be quantified through the body mass index (BMI), is strongly associated with adult obesity and other health problems. Motivated by the Child and Adolescent Behaviors in Long-term Evolution (CABLE) study, we are interested in individual, family, and school factors associated with marginal quantiles of longitudinal adolescent BMI values. We propose a new method for composite marginal quantile regression analysis of longitudinal outcome data, which performs marginal quantile regressions at multiple quantile levels simultaneously. The proposed method extends the quantile regression coefficient modeling method introduced by Frumento and Bottai (Biometrics 2016; 72:74-84) to longitudinal data, accounting suitably for the correlation structure in longitudinal observations. A goodness-of-fit test for the proposed modeling is also developed. Simulation results show that the proposed method can be much more efficient than an analysis that ignores the correlation and than analyses performing separate quantile regressions at different quantile levels. The application to the longitudinal adolescent BMI data from the CABLE study demonstrates the practical utility of our proposal. Copyright © 2017 John Wiley & Sons, Ltd.
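
    For orientation, the sketch below runs ordinary quantile regressions of a simulated BMI-like outcome on one covariate at several quantile levels with statsmodels; the composite marginal method of the paper additionally handles the longitudinal correlation, which this simplified example ignores.

```python
# Simplified sketch: quantile regression at several levels (synthetic data).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n = 500
age = rng.uniform(12, 18, n)
bmi = 15 + 0.6 * age + rng.gamma(shape=2.0, scale=1.0, size=n)  # right-skewed
data = pd.DataFrame({"bmi": bmi, "age": age})

for tau in (0.25, 0.5, 0.75, 0.9):
    fit = smf.quantreg("bmi ~ age", data).fit(q=tau)
    print(f"tau={tau}: slope on age = {fit.params['age']:.3f}")
```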

  8. Diagnosis of cranial hemangioma: Comparison between logistic regression analysis and neuronal network

    International Nuclear Information System (INIS)

    Arana, E.; Marti-Bonmati, L.; Bautista, D.; Paredes, R.

    1998-01-01

    To study the utility of logistic regression and the neuronal network in the diagnosis of cranial hemangiomas. Fifteen patients presenting hemangiomas were selected from a total of 167 patients with cranial lesions. All were evaluated by plain radiography and computed tomography (CT). Nineteen variables in their medical records were reviewed. Logistic regression and neuronal network models were constructed and validated by the jackknife (leave-one-out) approach. The yields of the two models were compared by means of ROC curves, using the area under the curve as the parameter. Seven men and 8 women presented hemangiomas. The mean age of these patients was 38.4 ± 15.4 years (mean ± standard deviation). Logistic regression identified as significant variables the shape, soft tissue mass and periosteal reaction. The neuronal network lent more importance to the existence of ossified matrix, ruptured cortical vein and the mixed calcified-blastic (trabeculated) pattern. The neuronal network showed a greater yield than logistic regression (Az, 0.9409 ± 0.004 versus 0.7211 ± 0.075; p<0.001). The neuronal network discloses hidden interactions among the variables, providing a higher yield in the characterization of cranial hemangiomas and constituting a medical diagnostic aid. (Author) 29 refs

  9. Moderation analysis using a two-level regression model.

    Science.gov (United States)

    Yuan, Ke-Hai; Cheng, Ying; Maxwell, Scott

    2014-10-01

    Moderation analysis is widely used in social and behavioral research. The most commonly used model for moderation analysis is moderated multiple regression (MMR) in which the explanatory variables of the regression model include product terms, and the model is typically estimated by least squares (LS). This paper argues for a two-level regression model in which the regression coefficients of a criterion variable on predictors are further regressed on moderator variables. An algorithm for estimating the parameters of the two-level model by normal-distribution-based maximum likelihood (NML) is developed. Formulas for the standard errors (SEs) of the parameter estimates are provided and studied. Results indicate that, when heteroscedasticity exists, NML with the two-level model gives more efficient and more accurate parameter estimates than the LS analysis of the MMR model. When error variances are homoscedastic, NML with the two-level model leads to essentially the same results as LS with the MMR model. Most importantly, the two-level regression model permits estimating the percentage of variance of each regression coefficient that is due to moderator variables. When applied to data from General Social Surveys 1991, NML with the two-level model identified a significant moderation effect of race on the regression of job prestige on years of education while LS with the MMR model did not. An R package is also developed and documented to facilitate the application of the two-level model.
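
    The baseline that the two-level model generalizes is ordinary moderated multiple regression with a product term, sketched below on simulated data; the variable names echo the job-prestige example, but the numbers are invented and the two-level NML estimator itself is not implemented here.

```python
# Sketch of standard moderated multiple regression (MMR): the moderation effect
# appears as the coefficient on the product term. Data are simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n = 300
education = rng.normal(13, 2, n)
race = rng.integers(0, 2, n)                    # hypothetical binary moderator
prestige = 10 + 2.0 * education + 1.5 * race * education + rng.normal(0, 3, n)
df = pd.DataFrame({"prestige": prestige, "education": education, "race": race})

mmr = smf.ols("prestige ~ education * race", df).fit()   # includes product term
print(mmr.params[["education", "education:race"]])
```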

  10. Meta-analytical synthesis of regression coefficients under different categorization scheme of continuous covariates.

    Science.gov (United States)

    Yoneoka, Daisuke; Henmi, Masayuki

    2017-11-30

    Recently, the number of clinical prediction models sharing the same regression task has increased in the medical literature. However, evidence synthesis methodologies that use the results of these regression models have not been sufficiently studied, particularly in meta-analysis settings where only regression coefficients are available. One of the difficulties lies in the differences between the categorization schemes of continuous covariates across different studies. In general, categorization methods using cutoff values are study specific across available models, even if they focus on the same covariates of interest. Differences in the categorization of covariates could lead to serious bias in the estimated regression coefficients and thus in subsequent syntheses. To tackle this issue, we developed synthesis methods for linear regression models with different categorization schemes of covariates. A 2-step approach to aggregate the regression coefficient estimates is proposed. The first step is to estimate the joint distribution of covariates by introducing a latent sampling distribution, which uses one set of individual participant data to estimate the marginal distribution of covariates with categorization. The second step is to use a nonlinear mixed-effects model with correction terms for the bias due to categorization to estimate the overall regression coefficients. Especially in terms of precision, numerical simulations show that our approach outperforms conventional methods, which only use studies with common covariates or ignore the differences between categorization schemes. The method developed in this study is also applied to a series of WHO epidemiologic studies on white blood cell counts. Copyright © 2017 John Wiley & Sons, Ltd.

  11. Electromigration-induced plastic deformation in passivated metal lines

    Science.gov (United States)

    Valek, B. C.; Bravman, J. C.; Tamura, N.; MacDowell, A. A.; Celestre, R. S.; Padmore, H. A.; Spolenak, R.; Brown, W. L.; Batterman, B. W.; Patel, J. R.

    2002-11-01

    We have used scanning white beam x-ray microdiffraction to study microstructural evolution during an in situ electromigration experiment on a passivated Al(Cu) test line. The data show plastic deformation and grain rotations occurring under the influence of electromigration, seen as broadening, movement, and splitting of reflections diffracted from individual metal grains. We believe this deformation is due to localized shear stresses that arise due to the inhomogeneous transfer of metal along the line. Deviatoric stress measurements show changes in the components of stress within the line, including relaxation of stress when current is removed.

  12. Apoptosis in the transplanted canine transmissible venereal tumor during growth and regression phases

    Directory of Open Access Journals (Sweden)

    F.G.A. Santos

    2008-06-01

    Full Text Available Twelve male, mongrel, adult dogs were subcutaneously transplanted with cells originating from two canine transmissible venereal tumors (TVT). The aim was to demonstrate and quantify the occurrence of apoptosis in TVT regression. Six months after transplantation, a tumor sample was obtained from each dog, with six dogs having TVT in the growth phase and six in the regression phase, as verified by daily measurements. Samples were processed for histological and ultrastructural examination as well as for DNA extraction. Sections of 4 µm were stained with HE, Shorr, methyl green pyronine and Van Gieson, subjected to the TUNEL reaction, and immunostained for P53. Morphometry of the Shorr-stained sections demonstrated an increase in apoptotic cells per field in the regressive tumors. This was also confirmed by transmission electron microscopy, which showed cells with the typical morphology of apoptosis, and by the TUNEL reaction, which detected 3'OH nick end labeling in situ, mainly in the regressive tumors. The regressive TVTs also showed intensified immunostaining for P53 as well as more intense genomic DNA fragmentation detected by agarose gel electrophoresis. In conclusion, apoptosis has an important, P53-dependent role in the regression of the experimental TVT.

  13. Phase Space Prediction of Chaotic Time Series with Nu-Support Vector Machine Regression

    International Nuclear Information System (INIS)

    Ye Meiying; Wang Xiaodong

    2005-01-01

    A new class of support vector machine, the nu-support vector machine, is discussed which can handle both classification and regression. We focus on nu-support vector machine regression and use it for phase space prediction of chaotic time series. The effectiveness of the method is demonstrated by applying it to the Henon map. This study also compares the nu-support vector machine with back propagation (BP) networks in order to better evaluate the performance of the proposed method. The experimental results show that nu-support vector machine regression obtains a lower root mean squared error than the BP networks and provides accurate chaotic time series prediction. These results can be attributed to the fact that the nu-support vector machine implements the structural risk minimization principle, which leads to better generalization than the BP networks.
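
    A hedged sketch of the approach is given below: a phase-space embedding of the Henon map and a nu-SVR one-step predictor from scikit-learn; the embedding dimension and hyperparameters are illustrative choices, not the paper's settings.

```python
# Sketch: embed the Henon map time series in phase space and train nu-support
# vector regression to predict the next value (illustrative parameters only).
import numpy as np
from sklearn.svm import NuSVR
from sklearn.metrics import mean_squared_error

def henon(n, a=1.4, b=0.3):
    x, y = 0.1, 0.1
    out = []
    for _ in range(n):
        x, y = 1 - a * x**2 + y, b * x
        out.append(x)
    return np.array(out)

series = henon(1100)
dim = 3  # embedding dimension
X = np.column_stack([series[i:i + len(series) - dim] for i in range(dim)])
y = series[dim:]
X_train, X_test, y_train, y_test = X[:800], X[800:], y[:800], y[800:]

model = NuSVR(nu=0.5, C=10.0, kernel="rbf", gamma="scale").fit(X_train, y_train)
rmse = np.sqrt(mean_squared_error(y_test, model.predict(X_test)))
print(f"one-step prediction RMSE: {rmse:.4f}")
```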

  14. Plasma line observations in the auroral oval

    International Nuclear Information System (INIS)

    Valladares, C.E.; Kelley, M.C.; Vickrey, J.F.

    1988-01-01

    We report here a series of experiments conducted at the Sondre Stromfjord incoherent scatter radar, aimed at detecting enhanced plasma lines associated with midnight sector auroral arcs. Using different receivers, we detected both ion and plasma lines simultaneously. The plasma line signal was recorded with a filter bank of eight frequencies. Plasma lines were found to originate mainly from the topside of the particle-produced E layer. The enhanced plasma lines are sometimes a factor of 100 larger than the thermal level. Our data show a rapid decay of the plasma lines, however. In some cases, only a 30-s integration time was needed in order to unambiguously detect both upshifted and downshifted lines. For the stronger cases, the level of the plasma lines reaches values of up to 40 K above the noise temperature. These are considerably higher than results from prior auroral zone plasma line experiments. In situ observations of enhanced plasma waves in this same region are reported in a companion paper. copyright American Geophysical Union 1988

  15. Asymmetries of the solar Ca II lines

    International Nuclear Information System (INIS)

    Heasley, J.N.

    1975-01-01

    A theoretical study of the influence of propagating acoustic pulses in the solar chromosphere upon the line profiles of the Ca II resonance and infrared triplet lines has been made. The major objective has been to explain the observed asymmetries seen in the cores of the H and K lines and to predict the temporal behavior of the infrared lines caused by passing acoustic or shock pulses. The velocities in the pulses, calculated from weak shock theory, have been included consistently in the non-LTE calculations. The results of the calculations show that these lines are very sensitive to perturbations in the background atmosphere caused by the pulses. Only minor changes in the line shapes result from including the velocities consistently in the line source function calculations. The qualitative changes in the line profiles vary markedly with the strength of the shock pulses. The observed differences in the K line profiles seen on the quiet Sun can be explained in terms of a spectrum of pulses with different wavelengths and initial amplitudes in the photosphere. (Auth.)

  16. Structural Break Tests Robust to Regression Misspecification

    Directory of Open Access Journals (Sweden)

    Alaa Abi Morshed

    2018-05-01

    Full Text Available Structural break tests for regression models are sensitive to model misspecification. We show—analytically and through simulations—that the sup Wald test for breaks in the conditional mean and variance of a time series process exhibits severe size distortions when the conditional mean dynamics are misspecified. We also show that the sup Wald test for breaks in the unconditional mean and variance does not have the same size distortions, yet benefits from similar power to its conditional counterpart in correctly specified models. Hence, we propose using it as an alternative and complementary test for breaks. We apply the unconditional and conditional mean and variance tests to three US series: unemployment, industrial production growth and interest rates. Both the unconditional and the conditional mean tests detect a break in the mean of interest rates. However, for the other two series, the unconditional mean test does not detect a break, while the conditional mean tests based on dynamic regression models occasionally detect a break, with the implied break-point estimator varying across different dynamic specifications. For all series, the unconditional variance does not detect a break while most tests for the conditional variance do detect a break which also varies across specifications.

  17. Gradient descent for robust kernel-based regression

    Science.gov (United States)

    Guo, Zheng-Chu; Hu, Ting; Shi, Lei

    2018-06-01

    In this paper, we study the gradient descent algorithm generated by a robust loss function over a reproducing kernel Hilbert space (RKHS). The loss function is defined by a windowing function G and a scale parameter σ, and can include a wide range of commonly used robust losses for regression. There is still a gap between the theoretical analysis and the optimization process of empirical risk minimization based on such a loss: the estimator needs to be globally optimal in the theoretical analysis, while the optimization method cannot ensure the global optimality of its solutions. In this paper, we aim to fill this gap by developing a novel theoretical analysis of the performance of estimators generated by the gradient descent algorithm. We demonstrate that with an appropriately chosen scale parameter σ, the gradient update with early stopping rules can approximate the regression function. Our error analysis leads to convergence in the standard L2 norm and the strong RKHS norm, both of which are optimal in the minimax sense. We show that the scale parameter σ plays an important role in providing robustness as well as fast convergence. Numerical experiments on synthetic examples and a real data set also support our theoretical results.

  18. Independent contrasts and PGLS regression estimators are equivalent.

    Science.gov (United States)

    Blomberg, Simon P; Lefevre, James G; Wells, Jessie A; Waterhouse, Mary

    2012-05-01

    We prove that the slope parameter of the ordinary least squares regression of phylogenetically independent contrasts (PICs) conducted through the origin is identical to the slope parameter of the method of generalized least squares (GLSs) regression under a Brownian motion model of evolution. This equivalence has several implications: 1. Understanding the structure of the linear model for GLS regression provides insight into when and why phylogeny is important in comparative studies. 2. The limitations of the PIC regression analysis are the same as the limitations of the GLS model. In particular, phylogenetic covariance applies only to the response variable in the regression and the explanatory variable should be regarded as fixed. Calculation of PICs for explanatory variables should be treated as a mathematical idiosyncrasy of the PIC regression algorithm. 3. Since the GLS estimator is the best linear unbiased estimator (BLUE), the slope parameter estimated using PICs is also BLUE. 4. If the slope is estimated using different branch lengths for the explanatory and response variables in the PIC algorithm, the estimator is no longer the BLUE, so this is not recommended. Finally, we discuss whether or not and how to accommodate phylogenetic covariance in regression analyses, particularly in relation to the problem of phylogenetic uncertainty. This discussion is from both frequentist and Bayesian perspectives.

  19. Characterization of vegetative and grain filling periods of winter wheat by stepwise regression procedure. II. Grain filling period

    Directory of Open Access Journals (Sweden)

    Pržulj Novo

    2011-01-01

    Full Text Available In wheat, the rate and duration of dry matter accumulation and remobilization depend on genotype and growing conditions. The objective of this study was to determine the most appropriate polynomial regression from a stepwise regression procedure for describing the grain filling period in three winter wheat cultivars. The stepwise regression procedure showed that grain filling is a complex biological process and that it is difficult to offer a simple and appropriate polynomial equation that fits the pattern of changes in dry matter accumulation during the grain filling period, i.e., from anthesis to maximum grain weight, in winter wheat. If grain filling is to be represented by a high-degree polynomial, quartic and quintic equations proved to be the most appropriate. In spite of certain disadvantages, a cubic equation from stepwise regression could be used to describe the pattern of winter wheat grain filling.

  20. Methods for identifying SNP interactions: a review on variations of Logic Regression, Random Forest and Bayesian logistic regression.

    Science.gov (United States)

    Chen, Carla Chia-Ming; Schwender, Holger; Keith, Jonathan; Nunkesser, Robin; Mengersen, Kerrie; Macrossan, Paula

    2011-01-01

    Due to advancements in computational ability, enhanced technology and a reduction in the price of genotyping, more data are being generated for understanding genetic associations with diseases and disorders. However, with the availability of large data sets comes the inherent challenges of new methods of statistical analysis and modeling. Considering a complex phenotype may be the effect of a combination of multiple loci, various statistical methods have been developed for identifying genetic epistasis effects. Among these methods, logic regression (LR) is an intriguing approach incorporating tree-like structures. Various methods have built on the original LR to improve different aspects of the model. In this study, we review four variations of LR, namely Logic Feature Selection, Monte Carlo Logic Regression, Genetic Programming for Association Studies, and Modified Logic Regression-Gene Expression Programming, and investigate the performance of each method using simulated and real genotype data. We contrast these with another tree-like approach, namely Random Forests, and a Bayesian logistic regression with stochastic search variable selection.

  1. Electricity consumption forecasting in Italy using linear regression models

    Energy Technology Data Exchange (ETDEWEB)

    Bianco, Vincenzo; Manca, Oronzio; Nardini, Sergio [DIAM, Seconda Universita degli Studi di Napoli, Via Roma 29, 81031 Aversa (CE) (Italy)

    2009-09-15

    The influence of economic and demographic variables on the annual electricity consumption in Italy has been investigated with the intention to develop a long-term consumption forecasting model. The time period considered for the historical data is from 1970 to 2007. Different regression models were developed, using historical electricity consumption, gross domestic product (GDP), gross domestic product per capita (GDP per capita) and population. A first part of the paper considers the estimation of GDP, price and GDP per capita elasticities of domestic and non-domestic electricity consumption. The domestic and non-domestic short run price elasticities are found to be both approximately equal to -0.06, while long run elasticities are equal to -0.24 and -0.09, respectively. On the contrary, the elasticities of GDP and GDP per capita present higher values. In the second part of the paper, different regression models, based on co-integrated or stationary data, are presented. Different statistical tests are employed to check the validity of the proposed models. A comparison with national forecasts, based on complex econometric models, such as Markal-Time, was performed, showing that the developed regressions are congruent with the official projections, with deviations of ±1% for the best case and ±11% for the worst. These deviations are to be considered acceptable in relation to the time span taken into account. (author)

  2. Electricity consumption forecasting in Italy using linear regression models

    International Nuclear Information System (INIS)

    Bianco, Vincenzo; Manca, Oronzio; Nardini, Sergio

    2009-01-01

    The influence of economic and demographic variables on the annual electricity consumption in Italy has been investigated with the intention to develop a long-term consumption forecasting model. The time period considered for the historical data is from 1970 to 2007. Different regression models were developed, using historical electricity consumption, gross domestic product (GDP), gross domestic product per capita (GDP per capita) and population. A first part of the paper considers the estimation of GDP, price and GDP per capita elasticities of domestic and non-domestic electricity consumption. The domestic and non-domestic short run price elasticities are found to be both approximately equal to -0.06, while long run elasticities are equal to -0.24 and -0.09, respectively. On the contrary, the elasticities of GDP and GDP per capita present higher values. In the second part of the paper, different regression models, based on co-integrated or stationary data, are presented. Different statistical tests are employed to check the validity of the proposed models. A comparison with national forecasts, based on complex econometric models, such as Markal-Time, was performed, showing that the developed regressions are congruent with the official projections, with deviations of ±1% for the best case and ±11% for the worst. These deviations are to be considered acceptable in relation to the time span taken into account. (author)
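
    The elasticity estimates reported above come from regressions of logged consumption on logged drivers; a minimal sketch with made-up data is shown below, in which the fitted coefficients are read directly as elasticities. It does not reproduce the paper's co-integration or stationarity analysis.

```python
# Log-log regression sketch: the coefficients on lnGDP and lnP approximate the
# GDP and price elasticities of consumption. All data are synthetic.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(6)
years = np.arange(1970, 2008)
gdp = 500 * np.exp(0.02 * (years - 1970)) * rng.lognormal(0, 0.01, years.size)
price = 50 * rng.lognormal(0, 0.05, years.size)
consumption = 100 * gdp**0.8 * price**-0.06 * rng.lognormal(0, 0.02, years.size)

df = pd.DataFrame({"lnC": np.log(consumption), "lnGDP": np.log(gdp),
                   "lnP": np.log(price)})
fit = smf.ols("lnC ~ lnGDP + lnP", df).fit()
print(fit.params)   # lnGDP and lnP coefficients are the estimated elasticities
```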

  3. Demonstration of a Fiber Optic Regression Probe

    Science.gov (United States)

    Korman, Valentin; Polzin, Kurt A.

    2010-01-01

    The capability to provide localized, real-time monitoring of material regression rates in various applications has the potential to provide a new stream of data for development testing of various components and systems, as well as serving as a monitoring tool in flight applications. These applications include, but are not limited to, the regression of a combusting solid fuel surface, the ablation of the throat in a chemical rocket or the heat shield of an aeroshell, and the monitoring of erosion in long-life plasma thrusters. The rate of regression in the first application is very fast, while the second and third are increasingly slower. A recent fundamental sensor development effort has led to a novel regression, erosion, and ablation sensor technology (REAST). The REAST sensor allows for measurement of real-time surface erosion rates at a discrete surface location. The sensor is optical, using two different, co-located fiber-optics to perform the regression measurement. The disparate optical transmission properties of the two fiber-optics makes it possible to measure the regression rate by monitoring the relative light attenuation through the fibers. As the fibers regress along with the parent material in which they are embedded, the relative light intensities through the two fibers changes, providing a measure of the regression rate. The optical nature of the system makes it relatively easy to use in a variety of harsh, high temperature environments, and it is also unaffected by the presence of electric and magnetic fields. In addition, the sensor could be used to perform optical spectroscopy on the light emitted by a process and collected by fibers, giving localized measurements of various properties. The capability to perform an in-situ measurement of material regression rates is useful in addressing a variety of physical issues in various applications. An in-situ measurement allows for real-time data regarding the erosion rates, providing a quick method for

  4. Caudal regression syndrome : a case report

    International Nuclear Information System (INIS)

    Lee, Eun Joo; Kim, Hi Hye; Kim, Hyung Sik; Park, So Young; Han, Hye Young; Lee, Kwang Hun

    1998-01-01

    Caudal regression syndrome is a rare congenital anomaly, which results from a developmental failure of the caudal mesoderm during the fetal period. We present a case of caudal regression syndrome composed of a spectrum of anomalies including sirenomelia, dysplasia of the lower lumbar vertebrae, sacrum, coccyx and pelvic bones,genitourinary and anorectal anomalies, and dysplasia of the lung, as seen during infantography and MR imaging

  5. Caudal regression syndrome : a case report

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Eun Joo; Kim, Hi Hye; Kim, Hyung Sik; Park, So Young; Han, Hye Young; Lee, Kwang Hun [Chungang Gil Hospital, Incheon (Korea, Republic of)

    1998-07-01

    Caudal regression syndrome is a rare congenital anomaly, which results from a developmental failure of the caudal mesoderm during the fetal period. We present a case of caudal regression syndrome composed of a spectrum of anomalies including sirenomelia, dysplasia of the lower lumbar vertebrae, sacrum, coccyx and pelvic bones,genitourinary and anorectal anomalies, and dysplasia of the lung, as seen during infantography and MR imaging.

  6. Modeling the number of car theft using Poisson regression

    Science.gov (United States)

    Zulkifli, Malina; Ling, Agnes Beh Yen; Kasim, Maznah Mat; Ismail, Noriszura

    2016-10-01

    Regression analysis is among the most popular statistical methods used to express the relationship between a response variable and covariates. The aim of this paper is to evaluate the factors that influence the number of car thefts using a Poisson regression model. This paper focuses on the number of car thefts that occurred in districts of Peninsular Malaysia. Two groups of factors were considered, namely district descriptive factors and socio-demographic factors. The results of the study showed that Bumiputera composition, Chinese composition, other ethnic composition, foreign migration, number of residents aged 25 to 64, number of employed persons and number of unemployed persons are the most influential factors affecting car theft cases. This information is very useful for law enforcement departments, insurance companies and car owners in order to reduce and limit car theft cases in Peninsular Malaysia.
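
    A minimal Poisson regression of district-level counts on two stand-in covariates is sketched below with statsmodels; the covariates and coefficients are invented for illustration and do not reproduce the study's data.

```python
# Poisson regression sketch for count outcomes (number of thefts per district).
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(8)
n_districts = 80
unemployed = rng.normal(0, 1, n_districts)
migration = rng.normal(0, 1, n_districts)
rate = np.exp(2.0 + 0.4 * unemployed + 0.2 * migration)   # true mean counts
thefts = rng.poisson(rate)

X = sm.add_constant(pd.DataFrame({"unemployed": unemployed,
                                  "migration": migration}))
fit = sm.GLM(thefts, X, family=sm.families.Poisson()).fit()
print(fit.summary().tables[1])
```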

  7. Correlation and simple linear regression.

    Science.gov (United States)

    Zou, Kelly H; Tuncali, Kemal; Silverman, Stuart G

    2003-06-01

    In this tutorial article, the concepts of correlation and regression are reviewed and demonstrated. The authors review and compare two correlation coefficients, the Pearson correlation coefficient and the Spearman rho, for measuring linear and nonlinear relationships between two continuous variables. In the case of measuring the linear relationship between a predictor and an outcome variable, simple linear regression analysis is conducted. These statistical concepts are illustrated by using a data set from published literature to assess a computed tomography-guided interventional technique. These statistical methods are important for exploring the relationships between variables and can be applied to many radiologic studies.
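
    The concepts in this tutorial reduce to a few library calls; the sketch below computes the Pearson and Spearman coefficients and a simple linear regression on simulated data using SciPy (the data are not the article's CT-guided intervention data set).

```python
# Correlation coefficients and simple linear regression on simulated data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)
x = rng.normal(size=100)
y = 2.0 * x + rng.normal(scale=0.5, size=100)

pearson_r, _ = stats.pearsonr(x, y)      # linear association
spearman_rho, _ = stats.spearmanr(x, y)  # monotonic (rank-based) association
slope, intercept, r_value, p_value, stderr = stats.linregress(x, y)

print(f"Pearson r = {pearson_r:.3f}, Spearman rho = {spearman_rho:.3f}")
print(f"regression: y = {intercept:.3f} + {slope:.3f} x (p = {p_value:.2g})")
```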

  8. Effect of Abdominal Visceral Fat Change on Regression of Erosive Esophagitis: Prospective Cohort Study.

    Science.gov (United States)

    Nam, Su Youn; Kim, Young Woo; Park, Bum Joon; Ryu, Kum Hei; Kim, Hyun Boem

    2018-05-04

    Although abdominal visceral fat has been associated with erosive esophagitis in cross-sectional studies, there are few data on the longitudinal effect. We evaluated the effects of abdominal visceral fat change on the regression of erosive esophagitis in a prospective cohort study. A total of 163 participants with erosive esophagitis at baseline were followed up at 34 months and underwent esophagogastroduodenoscopy and computed tomography at both baseline and follow-up. The longitudinal effects of abdominal visceral fat on the regression of erosive esophagitis were evaluated using relative risk (RR) and 95% confidence intervals (CIs). Regression was observed in approximately 49% of participants (n=80). The 3rd (RR, 0.13; 95% CI, 0.02 to 0.71) and 4th quartiles (RR, 0.07; 95% CI, 0.01 to 0.38) of visceral fat at follow-up were associated with decreased regression of erosive esophagitis. The highest quartile of visceral fat change reduced the probability of the regression of erosive esophagitis compared to the lowest quartile (RR, 0.10; 95% CI, 0.03 to 0.28). Each trend showed a dose-dependent pattern. In conclusion, a greater amount of visceral fat at follow-up and a greater increase in visceral fat reduced the regression of erosive esophagitis in a dose-dependent manner.

  9. Detection of Cutting Tool Wear using Statistical Analysis and Regression Model

    Science.gov (United States)

    Ghani, Jaharah A.; Rizal, Muhammad; Nuawi, Mohd Zaki; Haron, Che Hassan Che; Ramli, Rizauddin

    2010-10-01

    This study presents a new method for detecting cutting tool wear based on measured cutting force signals. A statistics-based method, the Integrated Kurtosis-based Algorithm for Z-Filter technique (I-kaz), was used to develop a regression model and a 3D graphical presentation of the I-kaz 3D coefficient during the machining process. The machining tests were carried out on a CNC turning machine (Colchester Master Tornado T4) under dry cutting conditions. A Kistler 9255B dynamometer was used to measure the cutting force signals, which were transmitted, analyzed, and displayed in the DasyLab software. Various force signals from the machining operation were analyzed, and each has its own I-kaz 3D coefficient. This coefficient was examined and its relationship with the flank wear land (VB) was determined. A regression model was developed from this relationship, and the results show that the I-kaz 3D coefficient value decreases as tool wear increases. This result can then be used for real-time tool wear monitoring.

  10. At-line determination of pharmaceuticals small molecule's blending end point using chemometric modeling combined with Fourier transform near infrared spectroscopy

    Science.gov (United States)

    Tewari, Jagdish; Strong, Richard; Boulas, Pierre

    2017-02-01

    This article summarizes the development and validation of a Fourier transform near infrared spectroscopy (FT-NIR) method for the rapid at-line prediction of active pharmaceutical ingredient (API) content in a powder blend to optimize small molecule formulations. The method was used to determine the blend uniformity end-point for a pharmaceutical solid dosage formulation containing a range of API concentrations. A set of calibration spectra from samples with concentrations ranging from 1% to 15% API (w/w) was collected at-line from 4000 to 12,500 cm⁻¹. The ability of the FT-NIR method to predict API concentration in the blend samples was validated against a reference high performance liquid chromatography (HPLC) method. The prediction efficiency of four different types of multivariate data modeling methods, partial least squares 1 (PLS1), partial least squares 2 (PLS2), principal component regression (PCR) and artificial neural network (ANN), was compared using relevant multivariate figures of merit. The prediction ability of the regression models was cross-validated against results generated with the reference HPLC method. PLS1 and ANN showed excellent and superior prediction abilities compared to PLS2 and PCR. Based on these results, and because of its lower complexity compared to ANN, PLS1 was selected as the best chemometric method to predict blend uniformity at-line. The FT-NIR measurement and the associated chemometric analysis were implemented in the production environment for rapid at-line determination of the end-point of the small molecule blending operation. (Figures: correlation coefficient vs. rank plot; FT-NIR spectra of blending steps and final blend; prediction ability of PCR; blend uniformity prediction with PLS2; prediction efficiency of blend uniformity using ANN; comparison of prediction efficiency of chemometric models. Table 1: order of addition for blending steps.)

  11. bayesQR: A Bayesian Approach to Quantile Regression

    Directory of Open Access Journals (Sweden)

    Dries F. Benoit

    2017-01-01

    Full Text Available After its introduction by Koenker and Bassett (1978), quantile regression has become an important and popular tool to investigate the conditional response distribution in regression. The R package bayesQR contains a number of routines to estimate quantile regression parameters using a Bayesian approach based on the asymmetric Laplace distribution. The package contains functions for the typical quantile regression with a continuous dependent variable, but also supports quantile regression for binary dependent variables. For both types of dependent variables, an approach to variable selection using the adaptive lasso is provided. For the binary quantile regression model, the package also contains a routine that calculates the fitted probabilities for each vector of predictors. In addition, functions for summarizing the results, creating traceplots, posterior histograms and drawing quantile plots are included. This paper starts with a brief overview of the theoretical background of the models used in the bayesQR package. The main part of the paper discusses the computational problems that arise in the implementation of the procedure and illustrates the usefulness of the package through selected examples.

  12. Weighted SGD for ℓp Regression with Randomized Preconditioning*

    Science.gov (United States)

    Yang, Jiyan; Chow, Yin-Lam; Ré, Christopher; Mahoney, Michael W.

    2018-01-01

    prediction norm in 𝒪(log n·nnz(A)+poly(d) log(1/ε)/ε) time. We show that for unconstrained ℓ2 regression, this complexity is comparable to that of RLA and is asymptotically better over several state-of-the-art solvers in the regime where the desired accuracy ε, high dimension n and low dimension d satisfy d ≥ 1/ε and n ≥ d2/ε. We also provide lower bounds on the coreset complexity for more general regression problems, indicating that still new ideas will be needed to extend similar RLA preconditioning ideas to weighted SGD algorithms for more general regression problems. Finally, the effectiveness of such algorithms is illustrated numerically on both synthetic and real datasets, and the results are consistent with our theoretical findings and demonstrate that pwSGD converges to a medium-precision solution, e.g., ε = 10−3, more quickly. PMID:29782626

  13. Modified Regression Rate Formula of PMMA Combustion by a Single Plane Impinging Jet

    Directory of Open Access Journals (Sweden)

    Tsuneyoshi Matsuoka

    2017-01-01

    Full Text Available A modified regression rate formula for the uppermost stage of a CAMUI-type hybrid rocket motor is proposed in this study. Assuming quasi-steady, one-dimensional behavior, an energy balance over a control volume near the fuel surface is considered. Accordingly, a regression rate formula is derived that calculates the local regression rate from the quenching distance between the flame and the regressing surface. An experimental setup that simulates the combustion phenomenon involved in the uppermost stage of a CAMUI-type hybrid rocket motor was constructed, and burning tests with various flow velocities and impinging distances were performed. A PMMA slab of 20 mm height, 60 mm width, and 20 mm thickness was chosen as the sample specimen, and pure oxygen and an O2/N2 mixture (50/50 vol.%) were employed as the oxidizers. The time-averaged regression rate along the fuel surface was measured by a laser displacement sensor. The quenching distance during the combustion event was also identified from the observations. The comparison between the experimental and calculated values showed good agreement, although a large systematic error was expected due to the difficulty in accurately identifying the quenching distance.

  14. INTRINSIC FACTORS AND FIRM FINANCIAL ANALYSIS WITH TRIPPLE BOTTOM LINES AS INTERVENING VARIABLE AGAINST FIRM VALUE Empirical Studies on Property and Real Estate Companies Year 2010-2013

    Directory of Open Access Journals (Sweden)

    Mia Andika Sari

    2016-09-01

    Full Text Available This research was conducted to examine the influence of intrinsic factors, proxied by capital structure, firm size and firm age, and financial factors, proxied by liquidity, profitability and activity ratios, on the firm value of property companies, using the triple bottom line as an intervening variable. The data used in this study were obtained from financial statements published during the period 2010 to 2013, as well as annual reports accessible through the IDX website. The data analysis techniques used in this study were panel data regression and path analysis. The results showed that intrinsic factors and financial variables have a significant influence on firm value, and that intrinsic factors and financial variables also have a significant influence on the triple bottom line. The path analysis demonstrated that the indirect effect through the triple bottom line as an intervening variable was greater than the direct effect.

  15. Testing Delays Resulting in Increased Identification Accuracy in Line-Ups and Show-Ups.

    Science.gov (United States)

    Dekle, Dawn J.

    1997-01-01

    Investigated time delays (immediate, two-three days, one week) between viewing a staged theft and attempting an eyewitness identification. Compared lineups to one-person showups in a laboratory analogue involving 412 subjects. Results show that across all time delays, participants maintained a higher identification accuracy with the showup…

  16. Optimization of lining design in deep clays

    International Nuclear Information System (INIS)

    Rousset, G.; Bublitz, D.

    1989-01-01

    The main features of the mechanical behaviour of deep clay are time-dependent effects and the existence of a long-term cohesion which may be taken into account when dimensioning galleries. In this text, a lining optimization test is presented. It concerns a gallery driven in deep clay, 230 m deep, at Mol (Belgium). We show that a sliding rib lining provides both an optimal tunnel face advance speed, with minimal closure of the gallery wall before the lining is set and therefore less likelihood of failure developing inside the rock mass, and a limitation of the length of the unlined part of the gallery. The chosen process allows, on one hand, preservation of the rock mass integrity and, on the other, use of the confinement effect to allow closure under high average stress conditions; this process can be considered an optimal application of the convergence-confinement method. An important set of measurement devices is then presented along with results obtained over one year of operation. We show in particular that the stress distribution in the lining is homogeneous and that the sliding limit can be measured with high precision.

  17. Linear scleroderma following Blaschko's lines

    Directory of Open Access Journals (Sweden)

    Mukhopadhyay Amiya

    2005-01-01

    Full Text Available Blaschko's lines form a pattern that many diseases are found to follow, but linear scleroderma following Blaschko's lines is a controversial entity rarely reported in the literature. A 24-year-old man presented with multiple linear, atrophic, hyperpigmented lesions punctuated by areas of depigmentation on the left half of the trunk, distributed on the anterior, lateral and posterior aspects. The lesions were distributed in a typical S-shaped line. Antinuclear antibody and antihistone antibody tests were negative. Histopathological examination of the skin from the affected area showed features suggestive of scleroderma. Here, we present a case of linear scleroderma following Blaschko's lines in a male patient, an entity reported only three times so far.

  18. Multivariate Linear Regression and CART Regression Analysis of TBM Performance at Abu Hamour Phase-I Tunnel

    Science.gov (United States)

    Jakubowski, J.; Stypulkowski, J. B.; Bernardeau, F. G.

    2017-12-01

    The first phase of the Abu Hamour drainage and storm tunnel was completed in early 2017. The 9.5 km long, 3.7 m diameter tunnel was excavated with two Earth Pressure Balance (EPB) Tunnel Boring Machines from Herrenknecht. TBM operation processes were monitored and recorded by a Data Acquisition and Evaluation System. The authors coupled the collected TBM drive data with available information on rock mass properties; the data were cleansed, completed with secondary variables and aggregated by weeks and shifts. Correlations and descriptive statistics charts were examined. Multivariate Linear Regression and CART regression tree models linking TBM penetration rate (PR), penetration per revolution (PPR) and field penetration index (FPI) with TBM operational and geotechnical characteristics were built for the weak/soft rock conditions of Doha. Both regression methods are interpretable, and the data were screened with different computational approaches allowing enriched insight. The primary goal of the analysis was to investigate empirical relations between multiple explanatory and response variables, to search for the best subsets of explanatory variables and to evaluate the strength of linear and non-linear relations. For each of the penetration indices, a predictive model coupling both regression methods was built and validated. The resulting models appeared to be stronger than their constituent models and indicated an opportunity for more accurate and robust TBM performance predictions.
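
    In the spirit of the two model families compared here, the sketch below fits a multiple linear regression and a CART regression tree to predict a penetration rate from invented operational and geotechnical variables; it is not based on the Abu Hamour data.

```python
# Linear regression vs. CART regression tree on synthetic TBM-like data.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.tree import DecisionTreeRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(10)
n = 400
thrust = rng.uniform(5, 20, n)          # hypothetical operational variables
rpm = rng.uniform(1, 4, n)
ucs = rng.uniform(2, 40, n)             # hypothetical rock strength, MPa
pr = 0.3 * thrust + 1.2 * rpm - 0.05 * ucs + rng.normal(0, 0.5, n)
X = np.column_stack([thrust, rpm, ucs])

for name, model in [("linear regression", LinearRegression()),
                    ("CART tree", DecisionTreeRegressor(max_depth=4))]:
    r2 = cross_val_score(model, X, pr, cv=5, scoring="r2").mean()
    print(f"{name}: cross-validated R^2 = {r2:.3f}")
```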

  19. ExoMol line lists - VIII. A variationally computed line list for hot formaldehyde

    Science.gov (United States)

    Al-Refaie, Ahmed F.; Yachmenev, Andrey; Tennyson, Jonathan; Yurchenko, Sergei N.

    2015-04-01

    A computed line list for formaldehyde, H2^12C^16O, applicable to temperatures up to T = 1500 K is presented. An empirical potential energy and ab initio dipole moment surfaces are used as the input to the nuclear motion program TROVE. The resulting line list, referred to as AYTY, contains 10.3 million rotational-vibrational states and around 10 billion transition frequencies. Each transition includes associated Einstein-A coefficients and absolute transition intensities, for wavenumbers below 10 000 cm-1 and rotational excitations up to J = 70. Room-temperature spectra are compared with laboratory measurements and data currently available in the HITRAN data base. These spectra show excellent agreement with experimental spectra and highlight the gaps and limitations of the HITRAN data. The full line list is available from the CDS data base as well as at www.exomol.com.

  20. Bayesian logistic regression in detection of gene-steroid interaction for cancer at PDLIM5 locus.

    Science.gov (United States)

    Wang, Ke-Sheng; Owusu, Daniel; Pan, Yue; Xie, Changchun

    2016-06-01

    The PDZ and LIM domain 5 (PDLIM5) gene may play a role in cancer, bipolar disorder, major depression, alcohol dependence and schizophrenia; however, little is known about the interaction effect of steroid use and the PDLIM5 gene on cancer. This study examined 47 single-nucleotide polymorphisms (SNPs) within the PDLIM5 gene in the Marshfield sample with 716 cancer patients (any diagnosed cancer, excluding minor skin cancer) and 2848 noncancer controls. A multiple logistic regression model in PLINK software was used to examine the association of each SNP with cancer. Bayesian logistic regression in PROC GENMOD in SAS statistical software, ver. 9.4 was used to detect gene-steroid interactions influencing cancer. Single marker analysis using PLINK identified 12 SNPs associated with cancer. Logistic regression in PROC GENMOD showed that both rs6532496 and rs951613 revealed strong gene-steroid interaction effects (OR=2.18, 95% CI=1.31-3.63 with P = 2.9 × 10⁻³ for rs6532496 and OR=2.07, 95% CI=1.24-3.45 with P = 5.43 × 10⁻³ for rs951613, respectively). Results from Bayesian logistic regression showed stronger interaction effects (OR=2.26, 95% CI=1.2-3.38 for rs6532496 and OR=2.14, 95% CI=1.14-3.2 for rs951613, respectively). All 12 SNPs associated with cancer revealed significant gene-steroid interaction effects (OR up to 2.59, 95% CI=1.4-3.97, from Bayesian logistic regression). This study provides evidence of common genetic variants within the PDLIM5 gene and of interactions between PDLIM5 gene polymorphisms and steroid use influencing cancer.
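
    The Bayesian interaction analysis above was carried out in SAS PROC GENMOD; as a rough, non-authoritative analogue, the following Python sketch fits a frequentist logistic regression with a gene-by-steroid interaction term on simulated data. The variables snp, steroid and cancer are hypothetical stand-ins, not the Marshfield sample.

        # Sketch: logistic regression with a gene-by-steroid interaction term
        # (simulated data; column names are hypothetical).
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(1)
        n = 2000
        snp = rng.integers(0, 3, n)            # genotype coded 0/1/2
        steroid = rng.integers(0, 2, n)        # steroid use yes/no
        logit = -2.0 + 0.1 * snp + 0.2 * steroid + 0.6 * snp * steroid
        p = 1 / (1 + np.exp(-logit))
        cancer = rng.binomial(1, p)

        df = pd.DataFrame({"snp": snp, "steroid": steroid, "cancer": cancer})
        model = smf.logit("cancer ~ snp * steroid", data=df).fit()
        print(model.summary())
        print("interaction odds ratio:", np.exp(model.params["snp:steroid"]))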

  1. Statistical 21-cm Signal Separation via Gaussian Process Regression Analysis

    Science.gov (United States)

    Mertens, F. G.; Ghosh, A.; Koopmans, L. V. E.

    2018-05-01

    Detecting and characterizing the Epoch of Reionization and Cosmic Dawn via the redshifted 21-cm hyperfine line of neutral hydrogen will revolutionize the study of the formation of the first stars, galaxies, black holes and intergalactic gas in the infant Universe. The wealth of information encoded in this signal is, however, buried under foregrounds that are many orders of magnitude brighter. These must be removed accurately and precisely in order to reveal the feeble 21-cm signal. This requires not only the modeling of the Galactic and extra-galactic emission, but also of the often stochastic residuals due to imperfect calibration of the data caused by ionospheric and instrumental distortions. To stochastically model these effects, we introduce a new method based on 'Gaussian Process Regression' (GPR) which is able to statistically separate the 21-cm signal from most of the foregrounds and other contaminants. Using simulated LOFAR-EoR data that include strong instrumental mode-mixing, we show that this method is capable of recovering the 21-cm signal power spectrum across the entire range k = 0.07 - 0.3 h cMpc⁻¹. The GPR method performs best, having minimal and controllable impact on the 21-cm signal, when the foregrounds are correlated on frequency scales ≳ 3 MHz and the rms of the signal has σ_21cm ≳ 0.1 σ_noise. This signal separation improves the 21-cm power-spectrum sensitivity by a factor ≳ 3 compared to foreground avoidance strategies and enables the sensitivity of current and future 21-cm instruments such as the Square Kilometre Array to be fully exploited.
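
    The LOFAR-EoR pipeline itself is not reproduced here, but the following minimal Python sketch illustrates the underlying idea of GPR-based signal separation on a toy one-dimensional spectrum: a long-coherence-scale kernel absorbs the smooth foreground, and subtracting the fitted mean leaves the rapidly varying component plus noise. All numbers are assumptions chosen for illustration.

        # Sketch: separating a spectrally smooth "foreground" from a fast-varying
        # signal along frequency with Gaussian Process Regression (toy example).
        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, WhiteKernel

        freq = np.linspace(115, 130, 200)[:, None]            # MHz, illustrative band
        rng = np.random.default_rng(2)
        foreground = 1e3 * (freq[:, 0] / 120.0) ** -2.5        # smooth power law
        signal = 0.5 * np.sin(2 * np.pi * freq[:, 0] / 1.0)    # fast-varying component
        data = foreground + signal + rng.normal(0, 0.1, freq.shape[0])

        # Long coherence-scale kernel for the foreground (>= 3 MHz, echoing the
        # abstract), plus a white-noise term; the fitted mean is the smooth part.
        kernel = 10.0 * RBF(length_scale=10.0, length_scale_bounds=(3.0, 100.0)) \
                 + WhiteKernel(noise_level=0.5)
        gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(freq, data)
        residual = data - gpr.predict(freq)                    # foreground-subtracted data
        print("residual rms:", residual.std())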

  2. Altering graphene line defect properties using chemistry

    Science.gov (United States)

    Vasudevan, Smitha; White, Carter; Gunlycke, Daniel

    2012-02-01

    First-principles calculations are presented of a fundamental topological line defect in graphene that was observed and reported in Nature Nanotech. 5, 326 (2010). These calculations show that atoms and smaller molecules can bind covalently to the surface in the vicinity of the graphene line defect. It is also shown that the chemistry at the line defect has a strong effect on its electronic and magnetic properties, e.g. the ferromagnetically aligned moments along the line defect can be quenched by some adsorbates. The strong effect of the adsorbates on the line defect properties can be understood by examining how these adsorbates affect the boundary-localized states in the vicinity of the Fermi level. We also expect that the line defect chemistry will significantly affect the scattering properties of incident low-energy particles approaching it from graphene.

  3. Background stratified Poisson regression analysis of cohort data.

    Science.gov (United States)

    Richardson, David B; Langholz, Bryan

    2012-03-01

    Background stratified Poisson regression is an approach that has been used in the analysis of data derived from a variety of epidemiologically important studies of radiation-exposed populations, including uranium miners, nuclear industry workers, and atomic bomb survivors. We describe a novel approach to fit Poisson regression models that adjust for a set of covariates through background stratification while directly estimating the radiation-disease association of primary interest. The approach makes use of an expression for the Poisson likelihood that treats the coefficients for stratum-specific indicator variables as 'nuisance' variables and avoids the need to explicitly estimate the coefficients for these stratum-specific parameters. Log-linear models, as well as other general relative rate models, are accommodated. This approach is illustrated using data from the Life Span Study of Japanese atomic bomb survivors and data from a study of underground uranium miners. The point estimate and confidence interval obtained from this 'conditional' regression approach are identical to the values obtained using unconditional Poisson regression with model terms for each background stratum. Moreover, it is shown that the proposed approach allows estimation of background stratified Poisson regression models of non-standard form, such as models that parameterize latency effects, as well as regression models in which the number of strata is large, thereby overcoming the limitations of previously available statistical software for fitting background stratified Poisson regression models.
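
    For orientation, the sketch below fits the conventional unconditional form of such a model in Python: a Poisson regression with stratum indicator terms, a log person-years offset and a dose term, on simulated data. The 'conditional' likelihood trick described in the abstract, which avoids estimating the stratum coefficients, is not reproduced here.

        # Sketch: unconditional Poisson regression with background-stratum
        # indicators and a person-years offset (simulated data, illustrative names).
        import numpy as np
        import pandas as pd
        import statsmodels.api as sm
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(3)
        n = 3000
        stratum = rng.integers(0, 10, n)                   # e.g. age/sex/period strata
        dose = rng.gamma(shape=1.0, scale=0.1, size=n)     # radiation dose (Sv), illustrative
        pyr = rng.uniform(1, 20, n)                        # person-years at risk
        rate = np.exp(-6.0 + 0.1 * stratum + 0.5 * dose)   # stratum background x dose effect
        cases = rng.poisson(rate * pyr)

        df = pd.DataFrame({"cases": cases, "dose": dose, "stratum": stratum,
                           "logpyr": np.log(pyr)})
        fit = smf.glm("cases ~ C(stratum) + dose", data=df,
                      family=sm.families.Poisson(), offset=df["logpyr"]).fit()
        print("estimated log rate ratio per Sv:", fit.params["dose"])   # true value 0.5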

  4. Variable importance in latent variable regression models

    NARCIS (Netherlands)

    Kvalheim, O.M.; Arneberg, R.; Bleie, O.; Rajalahti, T.; Smilde, A.K.; Westerhuis, J.A.

    2014-01-01

    The quality and practical usefulness of a regression model are a function of both interpretability and prediction performance. This work presents some new graphical tools for improved interpretation of latent variable regression models that can also assist in improved algorithms for variable

  5. Brillouin Scattering Spectrum Analysis Based on Auto-Regressive Spectral Estimation

    Science.gov (United States)

    Huang, Mengyun; Li, Wei; Liu, Zhangyun; Cheng, Linghao; Guan, Bai-Ou

    2018-06-01

    Auto-regressive (AR) spectral estimation technology is proposed to analyze the Brillouin scattering spectrum in Brillouin optical time-domain reflectometry. It is shown that the AR-based method can reliably estimate the Brillouin frequency shift with an accuracy much better than fast Fourier transform (FFT) based methods, provided the data length is not too short. It enables about a 3-fold improvement over FFT at a moderate spatial resolution.
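
    A minimal sketch of the AR idea, assuming a synthetic noisy sinusoid in place of a real Brillouin gain record: Yule-Walker estimation of the AR coefficients, followed by evaluation of the AR power spectrum to locate the dominant frequency.

        # Sketch: auto-regressive (Yule-Walker) spectral estimation of a dominant
        # frequency in a short, noisy record (not real BOTDR data).
        import numpy as np
        from statsmodels.regression.linear_model import yule_walker

        fs = 1000.0                               # sampling rate (arbitrary units)
        t = np.arange(256) / fs
        x = np.sin(2 * np.pi * 180.0 * t) + 0.5 * np.random.default_rng(4).normal(size=t.size)

        order = 8
        rho, sigma = yule_walker(x, order=order)  # AR coefficients and noise std

        # AR power spectrum: sigma^2 / |1 - sum_k rho_k exp(-i 2 pi f k / fs)|^2
        freqs = np.linspace(0, fs / 2, 2048)
        z = np.exp(-2j * np.pi * freqs / fs)
        denom = 1 - sum(rho[k] * z ** (k + 1) for k in range(order))
        psd = sigma ** 2 / np.abs(denom) ** 2
        print("estimated peak frequency:", freqs[np.argmax(psd)])   # close to 180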

  6. Brillouin Scattering Spectrum Analysis Based on Auto-Regressive Spectral Estimation

    Science.gov (United States)

    Huang, Mengyun; Li, Wei; Liu, Zhangyun; Cheng, Linghao; Guan, Bai-Ou

    2018-03-01

    Auto-regressive (AR) spectral estimation technology is proposed to analyze the Brillouin scattering spectrum in Brillouin optical time-domain reflectometry. It is shown that the AR-based method can reliably estimate the Brillouin frequency shift with an accuracy much better than fast Fourier transform (FFT) based methods, provided the data length is not too short. It enables about a 3-fold improvement over FFT at a moderate spatial resolution.

  7. Regression: The Apple Does Not Fall Far From the Tree.

    Science.gov (United States)

    Vetter, Thomas R; Schober, Patrick

    2018-05-15

    Researchers and clinicians are frequently interested in either: (1) assessing whether there is a relationship or association between 2 or more variables and quantifying this association; or (2) determining whether 1 or more variables can predict another variable. The strength of such an association is mainly described by the correlation. However, regression analysis and regression models can be used not only to identify whether there is a significant relationship or association between variables but also to generate estimations of such a predictive relationship between variables. This basic statistical tutorial discusses the fundamental concepts and techniques related to the most common types of regression analysis and modeling, including simple linear regression, multiple regression, logistic regression, ordinal regression, and Poisson regression, as well as the common yet often underrecognized phenomenon of regression toward the mean. The various types of regression analysis are powerful statistical techniques, which when appropriately applied, can allow for the valid interpretation of complex, multifactorial data. Regression analysis and models can assess whether there is a relationship or association between 2 or more observed variables and estimate the strength of this association, as well as determine whether 1 or more variables can predict another variable. Regression is thus being applied more commonly in anesthesia, perioperative, critical care, and pain research. However, it is crucial to note that regression can identify plausible risk factors; it does not prove causation (a definitive cause and effect relationship). The results of a regression analysis instead identify independent (predictor) variable(s) associated with the dependent (outcome) variable. As with other statistical methods, applying regression requires that certain assumptions be met, which can be tested with specific diagnostics.

  8. Research on the multiple linear regression in non-invasive blood glucose measurement.

    Science.gov (United States)

    Zhu, Jianming; Chen, Zhencheng

    2015-01-01

    A non-invasive blood glucose measurement sensor and a data processing algorithm based on the metabolic energy conservation (MEC) method are presented in this paper. The physiological parameters of the human fingertip can be measured by various sensing modalities, and the blood glucose value can be evaluated from these physiological parameters by multiple linear regression analysis. Five variable-selection methods in multiple linear regression (enter, remove, forward, backward and stepwise) were compared, and the backward method had the best performance. The best correlation coefficient was 0.876 with a standard error of the estimate of 0.534, and the significance was 0.012, indicating that the regression equation was valid. Clarke error grid analysis was performed to compare the MEC method with the hexokinase method, using 200 data points. The correlation coefficient R was 0.867 and all of the points were located in Zone A and Zone B, which shows that the MEC method provides a feasible and valid way for non-invasive blood glucose measurement.
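
    The sketch below illustrates backward elimination for a multiple linear regression in Python, dropping the least significant predictor until all remaining p-values fall below a threshold; the predictor names and data are invented for illustration and are not the MEC sensor measurements.

        # Sketch: backward elimination for multiple linear regression (synthetic data).
        import numpy as np
        import pandas as pd
        import statsmodels.api as sm

        rng = np.random.default_rng(5)
        n = 200
        X = pd.DataFrame(rng.normal(size=(n, 5)),
                         columns=["temp", "humidity", "blood_flow", "pulse", "noise"])
        y = 5 + 0.8 * X["temp"] + 0.5 * X["blood_flow"] + rng.normal(0, 1, n)

        def backward_eliminate(X, y, alpha=0.05):
            """Iteratively drop the predictor with the largest p-value above alpha."""
            cols = list(X.columns)
            while cols:
                fit = sm.OLS(y, sm.add_constant(X[cols])).fit()
                pvals = fit.pvalues.drop("const")
                worst = pvals.idxmax()
                if pvals[worst] <= alpha:
                    return fit, cols
                cols.remove(worst)
            return None, []

        fit, kept = backward_eliminate(X, y)
        print("retained predictors:", kept)   # expected: temp and blood_flow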

  9. Multiple regression and beyond an introduction to multiple regression and structural equation modeling

    CERN Document Server

    Keith, Timothy Z

    2014-01-01

    Multiple Regression and Beyond offers a conceptually oriented introduction to multiple regression (MR) analysis and structural equation modeling (SEM), along with analyses that flow naturally from those methods. By focusing on the concepts and purposes of MR and related methods, rather than the derivation and calculation of formulae, this book introduces material to students more clearly, and in a less threatening way. In addition to illuminating content necessary for coursework, the accessibility of this approach means students are more likely to be able to conduct research using MR or SEM--and more likely to use the methods wisely. The book covers both MR and SEM, while explaining their relevance to one another. It also includes path analysis, confirmatory factor analysis, and latent growth modeling. Figures and tables throughout provide examples and illustrate key concepts and techniques. For additional resources, please visit: http://tzkeith.com/.

  10. A Cross-Domain Collaborative Filtering Algorithm Based on Feature Construction and Locally Weighted Linear Regression.

    Science.gov (United States)

    Yu, Xu; Lin, Jun-Yu; Jiang, Feng; Du, Jun-Wei; Han, Ji-Zhong

    2018-01-01

    Cross-domain collaborative filtering (CDCF) solves the sparsity problem by transferring rating knowledge from auxiliary domains. Obviously, different auxiliary domains have different importance to the target domain. However, previous works cannot effectively evaluate the significance of different auxiliary domains. To overcome this drawback, we propose a cross-domain collaborative filtering algorithm based on Feature Construction and Locally Weighted Linear Regression (FCLWLR). We first construct features in different domains and use these features to represent different auxiliary domains. Thus the weight computation across different domains can be converted into the weight computation across different features. Then we combine the features in the target domain and in the auxiliary domains together and convert the cross-domain recommendation problem into a regression problem. Finally, we employ a Locally Weighted Linear Regression (LWLR) model to solve the regression problem. As LWLR is a nonparametric regression method, it can effectively avoid the underfitting or overfitting problems that occur in parametric regression methods. We conduct extensive experiments to show that the proposed FCLWLR algorithm is effective in addressing the data sparsity problem by transferring the useful knowledge from the auxiliary domains, as compared to many state-of-the-art single-domain or cross-domain CF methods.
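
    As a minimal illustration of the LWLR building block used by FCLWLR, the following sketch fits a Gaussian-kernel locally weighted linear regression on a toy one-dimensional problem rather than on cross-domain rating data.

        # Sketch: locally weighted linear regression (LWLR) with a Gaussian kernel.
        import numpy as np

        def lwlr_predict(x_query, X, y, tau=0.3):
            """Fit a weighted least-squares line around x_query and return its prediction."""
            Xb = np.column_stack([np.ones_like(X), X])            # add intercept
            w = np.exp(-((X - x_query) ** 2) / (2 * tau ** 2))    # kernel weights
            W = np.diag(w)
            theta = np.linalg.solve(Xb.T @ W @ Xb, Xb.T @ W @ y)  # weighted normal equations
            return theta[0] + theta[1] * x_query

        rng = np.random.default_rng(6)
        X = np.sort(rng.uniform(0, 2 * np.pi, 120))
        y = np.sin(X) + rng.normal(0, 0.1, X.size)
        print(lwlr_predict(np.pi / 2, X, y))    # should be close to sin(pi/2) = 1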

  11. Estimation of Ordinary Differential Equation Parameters Using Constrained Local Polynomial Regression.

    Science.gov (United States)

    Ding, A Adam; Wu, Hulin

    2014-10-01

    We propose a new method to use a constrained local polynomial regression to estimate the unknown parameters in ordinary differential equation models with a goal of improving the smoothing-based two-stage pseudo-least squares estimate. The equation constraints are derived from the differential equation model and are incorporated into the local polynomial regression in order to estimate the unknown parameters in the differential equation model. We also derive the asymptotic bias and variance of the proposed estimator. Our simulation studies show that our new estimator is clearly better than the pseudo-least squares estimator in estimation accuracy with a small price of computational cost. An application example on immune cell kinetics and trafficking for influenza infection further illustrates the benefits of the proposed new method.

  12. Quasi-experimental evidence on tobacco tax regressivity.

    Science.gov (United States)

    Koch, Steven F

    2018-01-01

    Tobacco taxes are known to reduce tobacco consumption and to be regressive, such that tobacco control policy may have the perverse effect of further harming the poor. However, if tobacco consumption falls faster amongst the poor than the rich, tobacco control policy can actually be progressive. We take advantage of persistent and committed tobacco control activities in South Africa to examine the household tobacco expenditure burden. For the analysis, we make use of two South African Income and Expenditure Surveys (2005/06 and 2010/11) that span a series of such tax increases and have been matched across the years, yielding 7806 matched pairs of tobacco consuming households and 4909 matched pairs of cigarette consuming households. By matching households across the surveys, we are able to examine both the regressivity of the household tobacco burden, and any change in that regressivity, and since tobacco taxes have been a consistent component of tobacco prices, our results also relate to the regressivity of tobacco taxes. Like previous research into cigarette and tobacco expenditures, we find that the tobacco burden is regressive; thus, so are tobacco taxes. However, we find that over the five-year period considered, the tobacco burden has decreased, and, most importantly, falls less heavily on the poor. Thus, the tobacco burden and the tobacco tax is less regressive in 2010/11 than in 2005/06. Thus, increased tobacco taxes can, in at least some circumstances, reduce the financial burden that tobacco places on households. Copyright © 2017 Elsevier Ltd. All rights reserved.

  13. Robust Estimator for Non-Line-of-Sight Error Mitigation in Indoor Localization

    Directory of Open Access Journals (Sweden)

    Marco A

    2006-01-01

    Full Text Available Indoor localization systems are undoubtedly of interest in many application fields. Like outdoor systems, they suffer from non-line-of-sight (NLOS) errors which hinder their robustness and accuracy. Though many ad hoc techniques have been developed to deal with this problem, unfortunately most of them are not applicable indoors due to the high variability of the environment (movement of furniture and of people, etc.). In this paper, we describe the use of robust regression techniques to detect and reject NLOS measures in a location estimation using multilateration. We show how the least-median-of-squares technique can be used to overcome the effects of NLOS errors, even in environments with little infrastructure, and validate its suitability by comparing it to other methods described in the bibliography. We obtained remarkable results when using it in a real indoor positioning system that works with Bluetooth and ultrasound (BLUPS), even when nearly half the measures suffered from NLOS or other coarse errors.

  14. Robust Estimator for Non-Line-of-Sight Error Mitigation in Indoor Localization

    Science.gov (United States)

    Casas, R.; Marco, A.; Guerrero, J. J.; Falcó, J.

    2006-12-01

    Indoor localization systems are undoubtedly of interest in many application fields. Like outdoor systems, they suffer from non-line-of-sight (NLOS) errors which hinder their robustness and accuracy. Though many ad hoc techniques have been developed to deal with this problem, unfortunately most of them are not applicable indoors due to the high variability of the environment (movement of furniture and of people, etc.). In this paper, we describe the use of robust regression techniques to detect and reject NLOS measures in a location estimation using multilateration. We show how the least-median-of-squares technique can be used to overcome the effects of NLOS errors, even in environments with little infrastructure, and validate its suitability by comparing it to other methods described in the bibliography. We obtained remarkable results when using it in a real indoor positioning system that works with Bluetooth and ultrasound (BLUPS), even when nearly half the measures suffered from NLOS or other coarse errors.
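
    The following sketch illustrates the least-median-of-squares idea on a simple one-dimensional line-fitting problem with a large fraction of positively biased (NLOS-like) errors; it is not the BLUPS multilateration code.

        # Sketch: least-median-of-squares (LMedS) line fitting by random sampling,
        # illustrating the outlier-rejection idea on synthetic data.
        import numpy as np

        rng = np.random.default_rng(7)
        n = 100
        x = rng.uniform(0, 10, n)
        y = 2.0 * x + 1.0 + rng.normal(0, 0.2, n)
        outliers = rng.choice(n, size=40, replace=False)      # ~40% NLOS-like errors
        y[outliers] += rng.uniform(3, 10, outliers.size)      # positively biased, as in NLOS

        best_params, best_med = None, np.inf
        for _ in range(500):                                  # random minimal subsets
            i, j = rng.choice(n, size=2, replace=False)
            if x[i] == x[j]:
                continue
            a = (y[j] - y[i]) / (x[j] - x[i])
            b = y[i] - a * x[i]
            med = np.median((y - (a * x + b)) ** 2)           # median of squared residuals
            if med < best_med:
                best_med, best_params = med, (a, b)

        print("LMedS slope/intercept:", best_params)          # near (2, 1) despite outliers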

  15. Influence diagnostics in meta-regression model.

    Science.gov (United States)

    Shi, Lei; Zuo, ShanShan; Yu, Dalei; Zhou, Xiaohua

    2017-09-01

    This paper studies influence diagnostics in the meta-regression model, including case deletion diagnostics and local influence analysis. We derive the subset deletion formulae for the estimation of the regression coefficient and the heterogeneity variance and obtain the corresponding influence measures. The DerSimonian and Laird estimation and maximum likelihood estimation methods in meta-regression are considered, respectively, to derive the results. Internal and external residual and leverage measures are defined. Local influence analyses based on the case-weights perturbation scheme, the responses perturbation scheme, the covariate perturbation scheme, and the within-variance perturbation scheme are explored. We introduce a method that simultaneously perturbs responses, covariate, and within-variance to obtain the local influence measure, which has the advantage of being able to compare the influence magnitudes of influential studies across different perturbations. An example is used to illustrate the proposed methodology. Copyright © 2017 John Wiley & Sons, Ltd.

  16. Spontaneous Regression of Choroidal Neovascularization in a Patient with Pattern Dystrophy

    Directory of Open Access Journals (Sweden)

    Anastasios Anastasakis

    2016-01-01

    Full Text Available Purpose. To present a case of a patient with pattern dystrophy (PD)-associated choroidal neovascularization (CNV) that resolved spontaneously without treatment. Methods. A 69-year-old male patient was referred to our unit, for evaluation of a recent visual loss (metamorphopsias) in his left eye. Fundus examination, fundus autofluorescence imaging, and fluorescein angiography showed a choroidal neovascular membrane in his left eye. Since visual acuity was satisfactory the patient elected observation. Clinical examination and OCT testing were repeated at 6 and 12 months after presentation. Results. Visual acuity remained stable at the level of 0.9 (baseline BCVA) during the follow-up period (12 months). Repeat OCT testing showed complete spontaneous regression of the choroidal neovascular membrane without evidence of intra- or subretinal fluid in both follow-up visits. Conclusions. Spontaneous regression of choroidal neovascularization can occur in patients with retinal dystrophies and associated choroidal neovascular membranes. The decision to treat or observe these patients relies strongly on the presenting visual acuity, since, in isolated instances, spontaneous resolution of choroidal neovascularization may occur.

  17. Line facilities outline

    International Nuclear Information System (INIS)

    1998-08-01

    This book deals with line facilities. Its contents cover an outline of wire telecommunication lines; the development of lines, the classification of line sections and the theory of line transmission; cable lines, including line structure, urban cable lines, lines outside towns, domestic cables and other lines; optical communication, including optical cable lines, transmission methods, measurement of optical communication and submarine cables; telecommunication line equipment, including telecommunication line facilities and telecommunication for public works, and the construction and maintenance of cable lines; and the regulation of line equipment, covering regulations on technique, construction and maintenance.

  18. Ridge Regression Signal Processing

    Science.gov (United States)

    Kuhl, Mark R.

    1990-01-01

    The introduction of the Global Positioning System (GPS) into the National Airspace System (NAS) necessitates the development of Receiver Autonomous Integrity Monitoring (RAIM) techniques. In order to guarantee a certain level of integrity, a thorough understanding of modern estimation techniques applied to navigational problems is required. The extended Kalman filter (EKF) is derived and analyzed under poor geometry conditions. It was found that the performance of the EKF is difficult to predict, since the EKF is designed for a Gaussian environment. A novel approach is implemented which incorporates ridge regression to explain the behavior of an EKF in the presence of dynamics under poor geometry conditions. The basic principles of ridge regression theory are presented, followed by the derivation of a linearized recursive ridge estimator. Computer simulations are performed to confirm the underlying theory and to provide a comparative analysis of the EKF and the recursive ridge estimator.
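
    As a minimal illustration of why a ridge penalty helps under poor geometry, the sketch below compares the ordinary and ridge closed-form estimators on a nearly collinear design; it is not the recursive ridge filter derived in the thesis.

        # Sketch: closed-form ridge estimator on an ill-conditioned design,
        # showing how the penalty stabilizes the solution.
        import numpy as np

        rng = np.random.default_rng(8)
        n = 100
        x1 = rng.normal(size=n)
        x2 = x1 + 1e-3 * rng.normal(size=n)        # almost collinear "poor geometry"
        X = np.column_stack([x1, x2])
        y = 3 * x1 + rng.normal(0, 0.1, n)

        def ridge(X, y, lam):
            p = X.shape[1]
            return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

        print("OLS   :", ridge(X, y, 0.0))         # unpenalized, sensitive to collinearity
        print("ridge :", ridge(X, y, 1.0))         # penalized, shrunk and more stable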

  19. BANK FAILURE PREDICTION WITH LOGISTIC REGRESSION

    Directory of Open Access Journals (Sweden)

    Taha Zaghdoudi

    2013-04-01

    Full Text Available In recent years, the economic and financial world has been shaken by a wave of financial crises that have resulted in fairly huge losses for banks. Several authors have focused on the study of these crises in order to develop early warning models, and it is from this line of work that our study takes its inspiration. Indeed, we have tried to develop a predictive model of Tunisian bank failures using the binary logistic regression method. The specificity of our prediction model is that it takes into account microeconomic indicators of bank failure. The results obtained using our provisional model show that a bank's ability to repay its debt, the coefficient of banking operations, bank profitability per employee and the financial leverage ratio have a negative impact on the probability of failure.

  20. Bayesian Bandwidth Selection for a Nonparametric Regression Model with Mixed Types of Regressors

    Directory of Open Access Journals (Sweden)

    Xibin Zhang

    2016-04-01

    Full Text Available This paper develops a sampling algorithm for bandwidth estimation in a nonparametric regression model with continuous and discrete regressors under an unknown error density. The error density is approximated by the kernel density estimator of the unobserved errors, while the regression function is estimated using the Nadaraya-Watson estimator admitting continuous and discrete regressors. We derive an approximate likelihood and posterior for bandwidth parameters, followed by a sampling algorithm. Simulation results show that the proposed approach typically leads to better accuracy of the resulting estimates than cross-validation, particularly for smaller sample sizes. This bandwidth estimation approach is applied to a nonparametric regression model of the Australian All Ordinaries returns and to the kernel density estimation of gross domestic product (GDP) growth rates among the Organisation for Economic Co-operation and Development (OECD) and non-OECD countries.
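
    The Bayesian sampler itself is not reproduced here, but the following sketch shows the Nadaraya-Watson estimator with the cross-validation bandwidth choice that the paper uses as its benchmark, on a simulated continuous-regressor example.

        # Sketch: Nadaraya-Watson kernel regression with a leave-one-out
        # cross-validated bandwidth (simulated data).
        import numpy as np

        rng = np.random.default_rng(9)
        x = rng.uniform(-2, 2, 150)
        y = np.sin(2 * x) + rng.normal(0, 0.3, x.size)

        def nw_fit(x0, x, y, h):
            w = np.exp(-0.5 * ((x0 - x) / h) ** 2)            # Gaussian kernel weights
            return np.sum(w * y) / np.sum(w)

        def loo_cv_error(h, x, y):
            errs = []
            for i in range(x.size):
                mask = np.arange(x.size) != i
                errs.append((y[i] - nw_fit(x[i], x[mask], y[mask], h)) ** 2)
            return np.mean(errs)

        bandwidths = np.linspace(0.05, 1.0, 20)
        best_h = bandwidths[np.argmin([loo_cv_error(h, x, y) for h in bandwidths])]
        print("cross-validated bandwidth:", best_h)
        print("estimate at x=0.5:", nw_fit(0.5, x, y, best_h))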

  1. Regression filter for signal resolution

    International Nuclear Information System (INIS)

    Matthes, W.

    1975-01-01

    The problem considered is that of resolving a measured pulse height spectrum of a material mixture, e.g. a gamma ray spectrum or a Raman spectrum, into a weighted sum of the spectra of the individual constituents. The model on which the analytical formulation is based is described. The problem reduces to that of a multiple linear regression. A stepwise linear regression procedure was constructed. The efficiency of this method was then tested by implementing the procedure as a computer programme, which was used to unfold test spectra obtained by mixing some spectra from a library of arbitrarily chosen spectra and adding a noise component. (U.K.)
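
    A minimal sketch of the unfolding idea, assuming synthetic Gaussian library spectra and ordinary least squares in place of the report's stepwise selection: the measured spectrum is expressed as a weighted sum of the library spectra and the weights are estimated by linear regression.

        # Sketch: unfolding a measured spectrum as a weighted sum of library spectra
        # by least squares (synthetic library, not the original programme).
        import numpy as np

        channels = np.arange(200)
        def peak(center, width=8.0):
            return np.exp(-0.5 * ((channels - center) / width) ** 2)

        library = np.column_stack([peak(40), peak(90), peak(150)])     # constituent spectra
        true_w = np.array([2.0, 0.0, 1.5])                             # true mixing weights
        rng = np.random.default_rng(10)
        measured = library @ true_w + rng.normal(0, 0.05, channels.size)   # add noise

        weights, *_ = np.linalg.lstsq(library, measured, rcond=None)
        print("estimated weights:", weights)                           # ~ [2.0, 0.0, 1.5]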

  2. Direction of Effects in Multiple Linear Regression Models.

    Science.gov (United States)

    Wiedermann, Wolfgang; von Eye, Alexander

    2015-01-01

    Previous studies analyzed asymmetric properties of the Pearson correlation coefficient using higher than second order moments. These asymmetric properties can be used to determine the direction of dependence in a linear regression setting (i.e., establish which of two variables is more likely to be on the outcome side) within the framework of cross-sectional observational data. Extant approaches are restricted to the bivariate regression case. The present contribution extends the direction of dependence methodology to a multiple linear regression setting by analyzing distributional properties of residuals of competing multiple regression models. It is shown that, under certain conditions, the third central moments of estimated regression residuals can be used to decide upon direction of effects. In addition, three different approaches for statistical inference are discussed: a combined D'Agostino normality test, a skewness difference test, and a bootstrap difference test. Type I error and power of the procedures are assessed using Monte Carlo simulations, and an empirical example is provided for illustrative purposes. In the discussion, issues concerning the quality of psychological data, possible extensions of the proposed methods to the fourth central moment of regression residuals, and potential applications are addressed.
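
    The bivariate core of the idea can be illustrated in a few lines: with a skewed predictor and normal errors, residuals from the correctly specified regression stay nearly symmetric, while residuals from the reverse regression inherit the predictor's skewness. The sketch below uses simulated data; the inference procedures discussed in the paper are not reproduced.

        # Sketch: comparing residual skewness of the two competing regressions to
        # indicate the direction of dependence (simulated data).
        import numpy as np
        from scipy.stats import skew, linregress

        rng = np.random.default_rng(11)
        x = rng.exponential(scale=1.0, size=5000)     # skewed "cause"
        y = 0.7 * x + rng.normal(0, 1.0, x.size)      # true direction: x -> y

        def residual_skew(pred, resp):
            fit = linregress(pred, resp)
            return skew(resp - (fit.intercept + fit.slope * pred))

        print("skew of residuals, y ~ x:", residual_skew(x, y))  # near 0
        print("skew of residuals, x ~ y:", residual_skew(y, x))  # clearly non-zero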

  3. Robust mislabel logistic regression without modeling mislabel probabilities.

    Science.gov (United States)

    Hung, Hung; Jou, Zhi-Yu; Huang, Su-Yun

    2018-03-01

    Logistic regression is among the most widely used statistical methods for linear discriminant analysis. In many applications, we only observe possibly mislabeled responses. Fitting a conventional logistic regression can then lead to biased estimation. One common resolution is to fit a mislabel logistic regression model, which takes into consideration of mislabeled responses. Another common method is to adopt a robust M-estimation by down-weighting suspected instances. In this work, we propose a new robust mislabel logistic regression based on γ-divergence. Our proposal possesses two advantageous features: (1) It does not need to model the mislabel probabilities. (2) The minimum γ-divergence estimation leads to a weighted estimating equation without the need to include any bias correction term, that is, it is automatically bias-corrected. These features make the proposed γ-logistic regression more robust in model fitting and more intuitive for model interpretation through a simple weighting scheme. Our method is also easy to implement, and two types of algorithms are included. Simulation studies and the Pima data application are presented to demonstrate the performance of γ-logistic regression. © 2017, The International Biometric Society.

  4. Complex Environmental Data Modelling Using Adaptive General Regression Neural Networks

    Science.gov (United States)

    Kanevski, Mikhail

    2015-04-01

    The research deals with an adaptation and application of Adaptive General Regression Neural Networks (GRNN) to high dimensional environmental data. GRNN [1,2,3] are efficient modelling tools both for spatial and temporal data and are based on nonparametric kernel methods closely related to the classical Nadaraya-Watson estimator. Adaptive GRNN, using anisotropic kernels, can also be applied to feature selection tasks when working with high dimensional data [1,3]. In the present research Adaptive GRNN are used to study geospatial data predictability and relevant feature selection using both simulated and real data case studies. The original raw data were either three dimensional monthly precipitation data or monthly wind speeds embedded into a 13 dimensional space constructed from geographical coordinates and geo-features calculated from a digital elevation model. GRNN were applied in two different ways: 1) adaptive GRNN with the resulting list of features ordered according to their relevancy; and 2) adaptive GRNN applied to evaluate all possible models N [in case of wind fields N=(2^13 -1)=8191] and rank them according to the cross-validation error. In both cases, training was carried out using the leave-one-out procedure. An important result of the study is that the set of the most relevant features depends on the month (strong seasonal effect) and year. The predictabilities of precipitation and wind field patterns, estimated using the cross-validation and testing errors of raw and shuffled data, were studied in detail. The results of both approaches were qualitatively and quantitatively compared. In conclusion, Adaptive GRNN with their ability to select features and efficient modelling of complex high dimensional data can be widely used in automatic/on-line mapping and as an integrated part of environmental decision support systems. 1. Kanevski M., Pozdnoukhov A., Timonin V. Machine Learning for Spatial Environmental Data. Theory, applications and software. EPFL Press

  5. Extended emission-line regions in active galaxies

    International Nuclear Information System (INIS)

    Hutchings, J.B.; Hickson, P.

    1988-01-01

    Long-slit spectra of four active galaxies in the redshift range 0.06-0.10 are presented. Two have interacting companions. Spectra of the galaxies show extended narrow emission lines in all cases. Continuum color changes, emission-line ratio changes, and velocity changes with 1 arcsec resolution can be detected. Relative velocities between AGN and companion galaxies are also given. These objects appear to lie in galaxies in which there is considerable star-formation activity, and very extended line emission. 20 references

  6. Accounting for Zero Inflation of Mussel Parasite Counts Using Discrete Regression Models

    Directory of Open Access Journals (Sweden)

    Emel Çankaya

    2017-06-01

    Full Text Available In many ecological applications, the absences of species are inevitable due to either detection faults in samples or uninhabitable conditions for their existence, resulting in a high number of zero counts or abundance. The usual practice for modelling such data is regression modelling of log(abundance+1), and it is well known that the resulting model is inadequate for prediction purposes. New discrete models accounting for zero abundances, namely zero-inflated regression (ZIP and ZINB), Hurdle-Poisson (HP) and Hurdle-Negative Binomial (HNB), amongst others, are widely preferred to the classical regression models. Because mussels are one of the economically most important aquatic products of Turkey, the purpose of this study is to examine the performances of these four models in the determination of the significant biotic and abiotic factors on the occurrences of the Nematopsis legeri parasite harming the existence of Mediterranean mussels (Mytilus galloprovincialis L.). The data collected from the three coastal regions of Sinop city in Turkey showed that more than 50% of parasite counts on the average are zero-valued, and model comparisons were based on information criteria. The results showed that the probability of the occurrence of this parasite is here best formulated by the ZINB or HNB models, and the influential factors of the models were found to correspond with the ecological differences of the regions.
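
    As a rough illustration of one of the four model classes compared above, the sketch below fits a zero-inflated Poisson model to simulated counts with excess zeros; the ZINB and hurdle variants, and the real parasite data, are not reproduced.

        # Sketch: zero-inflated Poisson (ZIP) regression on simulated counts with
        # structural zeros (illustrative covariate name).
        import numpy as np
        import statsmodels.api as sm
        from statsmodels.discrete.count_model import ZeroInflatedPoisson

        rng = np.random.default_rng(12)
        n = 1000
        temp = rng.normal(size=n)                      # e.g. a water-temperature covariate
        lam = np.exp(0.5 + 0.6 * temp)                 # Poisson mean of the count part
        counts = rng.poisson(lam)
        counts[rng.uniform(size=n) < 0.5] = 0          # add ~50% structural zeros

        X = sm.add_constant(temp)                      # count-part design matrix
        zip_fit = ZeroInflatedPoisson(counts, X, exog_infl=np.ones((n, 1))).fit()
        print(zip_fit.summary())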

  7. Estimation of adjusted rate differences using additive negative binomial regression.

    Science.gov (United States)

    Donoghoe, Mark W; Marschner, Ian C

    2016-08-15

    Rate differences are an important effect measure in biostatistics and provide an alternative perspective to rate ratios. When the data are event counts observed during an exposure period, adjusted rate differences may be estimated using an identity-link Poisson generalised linear model, also known as additive Poisson regression. A problem with this approach is that the assumption of equality of mean and variance rarely holds in real data, which often show overdispersion. An additive negative binomial model is the natural alternative to account for this; however, standard model-fitting methods are often unable to cope with the constrained parameter space arising from the non-negativity restrictions of the additive model. In this paper, we propose a novel solution to this problem using a variant of the expectation-conditional maximisation-either algorithm. Our method provides a reliable way to fit an additive negative binomial regression model and also permits flexible generalisations using semi-parametric regression functions. We illustrate the method using a placebo-controlled clinical trial of fenofibrate treatment in patients with type II diabetes, where the outcome is the number of laser therapy courses administered to treat diabetic retinopathy. An R package is available that implements the proposed method. Copyright © 2016 John Wiley & Sons, Ltd.

  8. A Simulation Investigation of Principal Component Regression.

    Science.gov (United States)

    Allen, David E.

    Regression analysis is one of the more common analytic tools used by researchers. However, multicollinearity between the predictor variables can cause problems in using the results of regression analyses. Problems associated with multicollinearity include entanglement of relative influences of variables due to reduced precision of estimation,…

  9. Hierarchical regression analysis in structural Equation Modeling

    NARCIS (Netherlands)

    de Jong, P.F.

    1999-01-01

    In a hierarchical or fixed-order regression analysis, the independent variables are entered into the regression equation in a prespecified order. Such an analysis is often performed when the extra amount of variance accounted for in a dependent variable by a specific independent variable is the main

  10. Repeated Results Analysis for Middleware Regression Benchmarking

    Czech Academy of Sciences Publication Activity Database

    Bulej, Lubomír; Kalibera, T.; Tůma, P.

    2005-01-01

    Roč. 60, - (2005), s. 345-358 ISSN 0166-5316 R&D Projects: GA ČR GA102/03/0672 Institutional research plan: CEZ:AV0Z10300504 Keywords : middleware benchmarking * regression benchmarking * regression testing Subject RIV: JD - Computer Applications, Robotics Impact factor: 0.756, year: 2005

  11. Managing first-line failure.

    Science.gov (United States)

    Cooper, David A

    2014-01-01

    WHO standard of care for failure of a first regimen, usually 2N(t)RTI's and an NNRTI, consists of a ritonavir-boosted protease inhibitor with a change in N(t)RTI's. Until recently, there was no evidence to support these recommendations which were based on expert opinion. Two large randomized clinical trials, SECOND LINE and EARNEST both showed excellent response rates (>80%) for the WHO standard of care and indicated that a novel regimen of a boosted protease inhibitor with an integrase inhibitor had equal efficacy with no difference in toxicity. In EARNEST, a third arm consisting of induction with the combined protease and integrase inhibitor followed by protease inhibitor monotherapy maintenance was inferior and led to substantial (20%) protease inhibitor resistance. These studies confirm the validity of the current recommendations of WHO and point to a novel public health approach of using two new classes for second line when standard first-line therapy has failed, which avoids resistance genotyping. Notwithstanding, adherence must be stressed in those failing first-line treatments. Protease inhibitor monotherapy is not suitable for a public health approach in low- and middle-income countries.

  12. and Multinomial Logistic Regression

    African Journals Online (AJOL)

    This work presented the results of an experimental comparison of two models: Multinomial Logistic Regression (MLR) and Artificial Neural Network (ANN) for classifying students based on their academic performance. The predictive accuracy for each model was measured by their average Classification Correct Rate (CCR).

  13. Background stratified Poisson regression analysis of cohort data

    International Nuclear Information System (INIS)

    Richardson, David B.; Langholz, Bryan

    2012-01-01

    Background stratified Poisson regression is an approach that has been used in the analysis of data derived from a variety of epidemiologically important studies of radiation-exposed populations, including uranium miners, nuclear industry workers, and atomic bomb survivors. We describe a novel approach to fit Poisson regression models that adjust for a set of covariates through background stratification while directly estimating the radiation-disease association of primary interest. The approach makes use of an expression for the Poisson likelihood that treats the coefficients for stratum-specific indicator variables as 'nuisance' variables and avoids the need to explicitly estimate the coefficients for these stratum-specific parameters. Log-linear models, as well as other general relative rate models, are accommodated. This approach is illustrated using data from the Life Span Study of Japanese atomic bomb survivors and data from a study of underground uranium miners. The point estimate and confidence interval obtained from this 'conditional' regression approach are identical to the values obtained using unconditional Poisson regression with model terms for each background stratum. Moreover, it is shown that the proposed approach allows estimation of background stratified Poisson regression models of non-standard form, such as models that parameterize latency effects, as well as regression models in which the number of strata is large, thereby overcoming the limitations of previously available statistical software for fitting background stratified Poisson regression models. (orig.)

  14. Bias due to two-stage residual-outcome regression analysis in genetic association studies.

    Science.gov (United States)

    Demissie, Serkalem; Cupples, L Adrienne

    2011-11-01

    Association studies of risk factors and complex diseases require careful assessment of potential confounding factors. Two-stage regression analysis, sometimes referred to as residual- or adjusted-outcome analysis, has been increasingly used in association studies of single nucleotide polymorphisms (SNPs) and quantitative traits. In this analysis, first, a residual-outcome is calculated from a regression of the outcome variable on covariates, and then the relationship between the adjusted-outcome and the SNP is evaluated by a simple linear regression of the adjusted-outcome on the SNP. In this article, we examine the performance of this two-stage analysis as compared with multiple linear regression (MLR) analysis. Our findings show that when a SNP and a covariate are correlated, the two-stage approach results in a biased genotypic effect and loss of power. Bias is always toward the null and increases with the squared correlation between the SNP and the covariate. For example, for squared correlations of 0, 0.1, and 0.5, two-stage analysis results in, respectively, 0, 10, and 50% attenuation in the SNP effect. As expected, MLR was always unbiased. Since individual SNPs often show little or no correlation with covariates, a two-stage analysis is expected to perform as well as MLR in many genetic studies; however, it produces considerably different results from MLR and may lead to incorrect conclusions when independent variables are highly correlated. While a useful alternative to MLR under certain conditions, the two-stage approach has serious limitations. Its use as a simple substitute for MLR should be avoided. © 2011 Wiley Periodicals, Inc.

  15. Adding a Parameter Increases the Variance of an Estimated Regression Function

    Science.gov (United States)

    Withers, Christopher S.; Nadarajah, Saralees

    2011-01-01

    The linear regression model is one of the most popular models in statistics. It is also one of the simplest models in statistics. It has received applications in almost every area of science, engineering and medicine. In this article, the authors show that adding a predictor to a linear model increases the variance of the estimated regression…

  16. Analysis of Multivariate Experimental Data Using A Simplified Regression Model Search Algorithm

    Science.gov (United States)

    Ulbrich, Norbert Manfred

    2013-01-01

    A new regression model search algorithm was developed in 2011 that may be used to analyze both general multivariate experimental data sets and wind tunnel strain-gage balance calibration data. The new algorithm is a simplified version of a more complex search algorithm that was originally developed at the NASA Ames Balance Calibration Laboratory. The new algorithm has the advantage that it needs only about one tenth of the original algorithm's CPU time for the completion of a search. In addition, extensive testing showed that the prediction accuracy of math models obtained from the simplified algorithm is similar to the prediction accuracy of math models obtained from the original algorithm. The simplified algorithm, however, cannot guarantee that search constraints related to a set of statistical quality requirements are always satisfied in the optimized regression models. Therefore, the simplified search algorithm is not intended to replace the original search algorithm. Instead, it may be used to generate an alternate optimized regression model of experimental data whenever the application of the original search algorithm either fails or requires too much CPU time. Data from a machine calibration of NASA's MK40 force balance is used to illustrate the application of the new regression model search algorithm.

  17. Large-Scale No-Show Patterns and Distributions for Clinic Operational Research

    Directory of Open Access Journals (Sweden)

    Michael L. Davies

    2016-02-01

    Full Text Available Patient no-shows for scheduled primary care appointments are common. Unused appointment slots reduce patient quality of care, access to services and provider productivity while increasing loss to follow-up and medical costs. This paper describes patterns of no-show variation by patient age, gender, appointment age, and type of appointment request for six individual service lines in the United States Veterans Health Administration (VHA). This retrospective observational descriptive project examined 25,050,479 VHA appointments contained in individual-level records for eight years (FY07-FY14) for 555,183 patients. Multifactor analysis of variance (ANOVA) was performed, with no-show rate as the dependent variable, and gender, age group, appointment age, new patient status, and service line as factors. The analyses revealed that males had higher no-show rates than females to age 65, at which point males and females exhibited similar rates. The average no-show rates decreased with age until 75-79, whereupon rates increased. As appointment age increased, males and new patients had increasing no-show rates. Younger patients are especially prone to no-show as appointment age increases. These findings provide novel information to healthcare practitioners and management scientists to more accurately characterize no-show and attendance rates and the impact of certain patient factors. Future general population data could determine whether findings from VHA data generalize to others.

  18. Large-Scale No-Show Patterns and Distributions for Clinic Operational Research.

    Science.gov (United States)

    Davies, Michael L; Goffman, Rachel M; May, Jerrold H; Monte, Robert J; Rodriguez, Keri L; Tjader, Youxu C; Vargas, Dominic L

    2016-02-16

    Patient no-shows for scheduled primary care appointments are common. Unused appointment slots reduce patient quality of care, access to services and provider productivity while increasing loss to follow-up and medical costs. This paper describes patterns of no-show variation by patient age, gender, appointment age, and type of appointment request for six individual service lines in the United States Veterans Health Administration (VHA). This retrospective observational descriptive project examined 25,050,479 VHA appointments contained in individual-level records for eight years (FY07-FY14) for 555,183 patients. Multifactor analysis of variance (ANOVA) was performed, with no-show rate as the dependent variable, and gender, age group, appointment age, new patient status, and service line as factors. The analyses revealed that males had higher no-show rates than females to age 65, at which point males and females exhibited similar rates. The average no-show rates decreased with age until 75-79, whereupon rates increased. As appointment age increased, males and new patients had increasing no-show rates. Younger patients are especially prone to no-show as appointment age increases. These findings provide novel information to healthcare practitioners and management scientists to more accurately characterize no-show and attendance rates and the impact of certain patient factors. Future general population data could determine whether findings from VHA data generalize to others.

  19. On-line monitoring the extract process of Fu-fang Shuanghua oral solution using near infrared spectroscopy and different PLS algorithms

    Science.gov (United States)

    Kang, Qian; Ru, Qingguo; Liu, Yan; Xu, Lingyan; Liu, Jia; Wang, Yifei; Zhang, Yewen; Li, Hui; Zhang, Qing; Wu, Qing

    2016-01-01

    An on-line near infrared (NIR) spectroscopy monitoring method with an appropriate multivariate calibration method was developed for the extraction process of Fu-fang Shuanghua oral solution (FSOS). On-line NIR spectra were collected through two fiber optic probes, which were designed to transmit NIR radiation by a 2 mm flange. Partial least squares (PLS), interval PLS (iPLS) and synergy interval PLS (siPLS) algorithms were used comparatively for building the calibration regression models. During the extraction process, NIR spectroscopy was employed to determine the chlorogenic acid (CA) content, total phenolic acids content (TPC), total flavonoids content (TFC) and soluble solids content (SSC). High performance liquid chromatography (HPLC), the ultraviolet spectrophotometric method (UV) and loss on drying methods were employed as reference methods. Experimental results showed that the performance of the siPLS model is the best compared with PLS and iPLS. The calibration models for CA, TPC, TFC and SSC had high determination coefficients (R2) (0.9948, 0.9992, 0.9950 and 0.9832) and low root mean square errors of cross validation (RMSECV) (0.0113, 0.0341, 0.1787 and 1.2158), which indicate a good correlation between reference values and NIR predicted values. The overall results show that the on-line detection method could be feasible in real applications and would be of great value for monitoring the mixed decoction process of FSOS and other Chinese patent medicines.
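
    The sketch below shows a plain PLS calibration of the kind described above, relating synthetic NIR-like spectra to a reference concentration and reporting a cross-validated RMSECV and R2; the iPLS/siPLS interval selection and the actual FSOS data are not reproduced.

        # Sketch: PLS calibration of spectra against a reference concentration
        # (synthetic spectra; not the actual FSOS calibration).
        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import cross_val_predict

        rng = np.random.default_rng(13)
        n_samples, n_wavelengths = 80, 300
        conc = rng.uniform(0.1, 1.0, n_samples)                 # reference values (e.g. HPLC)
        profile = np.exp(-0.5 * ((np.arange(n_wavelengths) - 150) / 20.0) ** 2)
        spectra = np.outer(conc, profile) + rng.normal(0, 0.01, (n_samples, n_wavelengths))

        pls = PLSRegression(n_components=5)
        pred = cross_val_predict(pls, spectra, conc, cv=10).ravel()
        rmsecv = np.sqrt(np.mean((pred - conc) ** 2))
        r2 = np.corrcoef(pred, conc)[0, 1] ** 2
        print("RMSECV:", rmsecv, " R2:", r2)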

  20. On directional multiple-output quantile regression

    Czech Academy of Sciences Publication Activity Database

    Paindaveine, D.; Šiman, Miroslav

    2011-01-01

    Roč. 102, č. 2 (2011), s. 193-212 ISSN 0047-259X R&D Projects: GA MŠk(CZ) 1M06047 Grant - others:Commision EC(BE) Fonds National de la Recherche Scientifique Institutional research plan: CEZ:AV0Z10750506 Keywords : multivariate quantile * quantile regression * multiple-output regression * halfspace depth * portfolio optimization * value-at risk Subject RIV: BA - General Mathematics Impact factor: 0.879, year: 2011 http://library.utia.cas.cz/separaty/2011/SI/siman-0364128.pdf

  1. The Wilson-Bappu effect of the MgII k line - dependence on stellar temperature, activity and metallicity

    DEFF Research Database (Denmark)

    Elgaroy, O.; Engvold, O.; Lund, Niels

    1999-01-01

    The Wilson-Bappu effect is investigated using accurate absolute magnitudes of 65 stars obtained through early release of data from the Hipparcos satellite together with MgII k line widths determined from high resolution spectra observed with the International Ultraviolet Explorer (IUE) observatory. Stars of spectral classes F, G, K and M and luminosity classes I-V are represented in the sample. Wilson-Bappu relations for the Mg II k line for stars of different temperatures, i.e. spectral classes, are determined. The relation varies with spectral class and there is a significant scatter of the line widths around the regression lines. The sample contains slowly rotating stars of different activity levels and is suitable for investigations of a possible relation between line width and stellar activity. A difference in behavior between dwarfs and giants (and supergiants) of spectral class K seems...

  2. WHY IS NON-THERMAL LINE BROADENING OF SPECTRAL LINES IN THE LOWER TRANSITION REGION OF THE SUN INDEPENDENT OF SPATIAL RESOLUTION?

    International Nuclear Information System (INIS)

    De Pontieu, B.; Martinez-Sykora, J.; McIntosh, S.; Peter, H.; Pereira, T. M. D.

    2015-01-01

    Spectral observations of the solar transition region (TR) and corona show broadening of spectral lines beyond what is expected from thermal and instrumental broadening. The remaining non-thermal broadening is significant (5–30 km s −1 ) and correlated with intensity. Here we study spectra of the TR Si iv 1403 Å line obtained at high resolution with the Interface Region Imaging Spectrograph (IRIS). We find that the large improvement in spatial resolution (0.″33) of IRIS compared to previous spectrographs (2″) does not resolve the non-thermal line broadening which, in most regions, remains at pre-IRIS levels of about 20 km s −1 . This invariance to spatial resolution indicates that the processes behind the broadening occur along the line-of-sight (LOS) and/or on spatial scales (perpendicular to the LOS) smaller than 250 km. Both effects appear to play a role. Comparison with IRIS chromospheric observations shows that, in regions where the LOS is more parallel to the field, magneto-acoustic shocks driven from below impact the TR and can lead to significant non-thermal line broadening. This scenario is supported by MHD simulations. While these do not show enough non-thermal line broadening, they do reproduce the long-known puzzling correlation between non-thermal line broadening and intensity. This correlation is caused by the shocks, but only if non-equilibrium ionization is taken into account. In regions where the LOS is more perpendicular to the field, the prevalence of small-scale twist is likely to play a significant role in explaining the invariance and correlation with intensity. (letters)

  3. Predicting company growth using logistic regression and neural networks

    Directory of Open Access Journals (Sweden)

    Marijana Zekić-Sušac

    2016-12-01

    Full Text Available The paper aims to establish an efficient model for predicting company growth by leveraging the strengths of logistic regression and neural networks. A real dataset of Croatian companies was used which described the relevant industry sector, financial ratios, income, and assets in the input space, with a dependent binomial variable indicating whether a company had high-growth if it had annualized growth in assets by more than 20% a year over a three-year period. Due to a large number of input variables, factor analysis was performed in the pre-processing stage in order to extract the most important input components. Building an efficient model with a high classification rate and explanatory ability required application of two data mining methods: logistic regression as a parametric and neural networks as a non-parametric method. The methods were tested on the models with and without variable reduction. The classification accuracy of the models was compared using statistical tests and ROC curves. The results showed that neural networks produce a significantly higher classification accuracy in the model when incorporating all available variables. The paper further discusses the advantages and disadvantages of both approaches, i.e. logistic regression and neural networks in modelling company growth. The suggested model is potentially of benefit to investors and economic policy makers as it provides support for recognizing companies with growth potential, especially during times of economic downturn.

  4. Bayesian logistic regression analysis

    NARCIS (Netherlands)

    Van Erp, H.R.N.; Van Gelder, P.H.A.J.M.

    2012-01-01

    In this paper we present a Bayesian logistic regression analysis. It is found that if one wishes to derive the posterior distribution of the probability of some event, then, together with the traditional Bayes Theorem and the integrating out of nuissance parameters, the Jacobian transformation is an

  5. Clues to Quasar Broad Line Region Geometry and Kinematics

    DEFF Research Database (Denmark)

    Vestergaard, Marianne; Wilkes, B. J.; Barthel, P. D.

    2000-01-01

    width to show significant inverse correlations with the fractional radio core-flux density, R, the radio axis inclination indicator. Highly inclined systems have broader line wings, consistent with a high-velocity field perpendicular to the radio axis. By contrast, the narrow line-core shows...... no such relation with R, so the lowest velocity CIV-emitting gas has an inclination independent velocity field. We propose that this low-velocity gas is located at higher disk-altitudes than the high-velocity gas. A planar origin of the high-velocity CIV-emission is consistent with the current results...... and with an accretion disk-wind emitting the broad lines. A spherical distribution of randomly orbiting broad-line clouds and a polar high-ionization outflow are ruled out....

  6. Coping on the Front-line

    DEFF Research Database (Denmark)

    Sanden, Guro Refsum; Lønsmann, Dorte

    language boundaries in their everyday work. Despite official English language policies in the three companies, our findings show that employees face a number of different language boundaries, and that ad hoc and informal solutions in many cases are vital for successful cross-language communication. Drawing......This article investigates how front-line employees respond to English language policies implemented by the management of three multinational corporations (MNCs) headquartered in Scandinavia. Based on interview and document data the article examines the ways in which front-line employees cross...

  7. Examination of influential observations in penalized spline regression

    Science.gov (United States)

    Türkan, Semra

    2013-10-01

    In parametric or nonparametric regression models, the results of regression analysis are affected by anomalous observations in the data set. Thus, detection of these observations is one of the major steps in regression analysis. Such observations are precisely detected by well-known influence measures, of which Pena's statistic is one. In this study, Pena's approach is formulated for penalized spline regression in terms of ordinary residuals and leverages. Real and artificial data are used to illustrate the effectiveness of Pena's statistic relative to Cook's distance in detecting influential observations. The results of the study clearly reveal that the proposed measure is superior to Cook's distance for detecting these observations in large data sets.
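    Pena's statistic is not available in standard libraries, but its ingredients (ordinary residuals and leverages) and the Cook's distance benchmark are; the sketch below computes them for a plain linear fit rather than a penalized spline, with synthetic data and the common 4/n cut-off as assumptions.

```python
# Illustration of the building blocks (residuals, leverages) and the Cook's
# distance benchmark on an ordinary linear fit with synthetic data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
x = rng.uniform(0, 10, size=100)
y = 2.0 + 0.5 * x + rng.normal(scale=1.0, size=100)
y[0] += 8.0                                   # plant one anomalous observation

fit = sm.OLS(y, sm.add_constant(x)).fit()
influence = fit.get_influence()

residuals = fit.resid                         # ordinary residuals
leverages = influence.hat_matrix_diag         # diagonal of the hat matrix
cooks_d, _ = influence.cooks_distance

# Flag observations whose Cook's distance exceeds the common 4/n rule of thumb.
flagged = np.where(cooks_d > 4.0 / len(y))[0]
print("max |residual|:", np.abs(residuals).max().round(2))
print("max leverage:", leverages.max().round(3))
print("observations flagged by Cook's distance:", flagged)
```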

  8. Cultivation studies of Taiwanese native Miscanthus floridulus lines

    International Nuclear Information System (INIS)

    Huang, C.L.; Liao, W.C.; Lai, Y.C.

    2011-01-01

    Four Taiwanese native Miscanthus floridulus lines, collected at altitudes of 260, 500, 1000, and 1500 m, were cultivated in 2009 and 2010. The plant height and tiller numbers of the four M. floridulus lines increased gradually over the growing period, reaching the greatest plant height and tiller number 210 days after planting. Line 3, which was collected at an altitude of 1000 m, had the ability to grow at low temperature. Line 3 M. floridulus had greater plant height, tiller number, and fresh and dry yields than the other three lines. Fresh and dry yields of Line 3 were positively correlated with plant height, tiller number, and leaf width, but showed no correlation with leaf length. The correlation between agronomic traits and climatic data was also studied. The results can be used as a model for developing a non-food crop-based energy production system in the future. -- Highlights: → Miscanthus floridulus collected at 1000 m altitude had the greatest plant height, tiller number, and fresh and dry yields. → Fresh and dry yields were positively correlated with plant height, tiller number, and leaf width. → Fresh and dry yields showed no correlation with leaf length. → Accumulated rainfall, temperature, radiation, and exposure time to radiation were positively correlated with plant height, leaf length, and leaf width.

  9. Subset selection in regression

    CERN Document Server

    Miller, Alan

    2002-01-01

    Originally published in 1990, the first edition of Subset Selection in Regression filled a significant gap in the literature, and its critical and popular success has continued for more than a decade. Thoroughly revised to reflect progress in theory, methods, and computing power, the second edition promises to continue that tradition. The author has thoroughly updated each chapter, incorporated new material on recent developments, and included more examples and references. New in the Second Edition: a separate chapter on Bayesian methods; complete revision of the chapter on estimation; a major example from the field of near infrared spectroscopy; more emphasis on cross-validation; greater focus on bootstrapping; stochastic algorithms for finding good subsets from large numbers of predictors when an exhaustive search is not feasible; software available on the Internet for implementing many of the algorithms presented; more examples. Subset Selection in Regression, Second Edition remains dedicated to the techniques for fitting...
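    As a small illustration of the subset-selection problem the book addresses, the following sketch performs an exhaustive best-subset search scored by cross-validation on synthetic data; it is not code from the book, and the brute-force search is only feasible for a handful of predictors, which is why stochastic search algorithms are also covered there.

```python
# Brute-force best-subset selection on synthetic data, scored by
# cross-validated mean squared error.
from itertools import combinations

import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

X, y = make_regression(n_samples=200, n_features=8, n_informative=3,
                       noise=10.0, random_state=0)

best_subset, best_score = None, -np.inf
for k in range(1, X.shape[1] + 1):
    for subset in combinations(range(X.shape[1]), k):
        score = cross_val_score(LinearRegression(), X[:, subset], y, cv=5,
                                scoring="neg_mean_squared_error").mean()
        if score > best_score:
            best_subset, best_score = subset, score

print("selected predictors:", best_subset)
print("cross-validated MSE of best subset:", round(-best_score, 1))
```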

  10. Stepwise versus Hierarchical Regression: Pros and Cons

    Science.gov (United States)

    Lewis, Mitzi

    2007-01-01

    Multiple regression is commonly used in social and behavioral data analysis. In multiple regression contexts, researchers are very often interested in determining the "best" predictors in the analysis. This focus may stem from a need to identify those predictors that are supportive of theory. Alternatively, the researcher may simply be interested…

  11. The size of the narrow-line-emitting region in the Seyfert 1 galaxy NGC 5548 from emission-line variability

    International Nuclear Information System (INIS)

    Peterson, B. M.; Denney, K. D.; De Rosa, G.; Grier, C. J.; Pogge, R. W.; Kochanek, C. S.; Bentz, M. C.; Vestergaard, M.; Kilerci-Eser, E.; Dalla Bontà, E.; Ciroi, S. (Dipartimento di Fisica e Astronomia "G. Galilei," Università di Padova, Vicolo dell'Osservatorio 3, I-35122 Padova, Italy)

    2013-01-01

    The narrow [O III] λλ4959, 5007 emission-line fluxes in the spectrum of the well-studied Seyfert 1 galaxy NGC 5548 are shown to vary with time. From this we show that the narrow-line-emitting region has a radius of only 1-3 pc and is denser (n_e ∼ 10⁵ cm⁻³) than previously supposed. The [O III] line width is consistent with virial motions at this radius given previous determinations of the black hole mass. Since the [O III] emission-line flux is usually assumed to be constant and is therefore used to calibrate spectroscopic monitoring data, the variability has ramifications for the long-term secular variations of continuum and emission-line fluxes, though it has no effect on shorter-term reverberation studies. We present corrected optical continuum and broad Hβ emission-line light curves for the period 1988-2008.

  12. The size of the narrow-line-emitting region in the Seyfert 1 galaxy NGC 5548 from emission-line variability

    Energy Technology Data Exchange (ETDEWEB)

    Peterson, B. M.; Denney, K. D.; De Rosa, G.; Grier, C. J.; Pogge, R. W.; Kochanek, C. S. [Department of Astronomy, The Ohio State University, 140 W 18th Avenue, Columbus, OH 43210 (United States)]; Bentz, M. C. [Department of Physics and Astronomy, Georgia State University, 25 Park Place, Suite 610, Atlanta, GA 30303 (United States)]; Vestergaard, M.; Kilerci-Eser, E. [Dark Cosmology Centre, Niels Bohr Institute, University of Copenhagen, Juliane Maries Vej 30, DK-2100 Copenhagen (Denmark)]; Dalla Bontà, E.; Ciroi, S. [Dipartimento di Fisica e Astronomia "G. Galilei," Università di Padova, Vicolo dell'Osservatorio 3, I-35122 Padova (Italy)]

    2013-12-20

    The narrow [O III] λλ4959, 5007 emission-line fluxes in the spectrum of the well-studied Seyfert 1 galaxy NGC 5548 are shown to vary with time. From this we show that the narrow-line-emitting region has a radius of only 1-3 pc and is denser (n_e ∼ 10⁵ cm⁻³) than previously supposed. The [O III] line width is consistent with virial motions at this radius given previous determinations of the black hole mass. Since the [O III] emission-line flux is usually assumed to be constant and is therefore used to calibrate spectroscopic monitoring data, the variability has ramifications for the long-term secular variations of continuum and emission-line fluxes, though it has no effect on shorter-term reverberation studies. We present corrected optical continuum and broad Hβ emission-line light curves for the period 1988-2008.

  13. An improved partial least-squares regression method for Raman spectroscopy

    Science.gov (United States)

    Momenpour Tehran Monfared, Ali; Anis, Hanan

    2017-10-01

    It is known that the performance of partial least-squares (PLS) regression analysis can be improved using the backward variable selection method (BVSPLS). In this paper, we further improve BVSPLS based on a novel selection mechanism. The proposed method is based on sorting the weighted regression coefficients, and the importance of each variable in the sorted list is then evaluated using the root mean square error of prediction (RMSEP) criterion at each iteration step. Our improved BVSPLS (IBVSPLS) method has been applied to leukemia and heparin data sets and led to an improvement in the limit of detection of Raman biosensing ranging from 10% to 43% compared to PLS. Our IBVSPLS was also compared to the simpler jack-knifing method and the more complex genetic algorithm. Our method was consistently better than the jack-knifing method and showed either similar or better performance compared to the genetic algorithm.
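    A rough sketch of the backward-selection idea (not the published IBVSPLS code): variables are ranked by the magnitude of their PLS regression coefficients and pruned while tracking RMSEP on a held-out set; the synthetic "spectra", component count, and stopping rule are assumptions.

```python
# Backward variable selection for PLS, scored by RMSEP on a held-out set.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 40))                   # stand-in for spectra
y = X[:, :5] @ rng.normal(size=5) + rng.normal(scale=0.5, size=120)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

def rmsep(cols):
    """Fit PLS on the given columns and return (RMSEP, fitted model)."""
    pls = PLSRegression(n_components=min(5, len(cols))).fit(X_tr[:, cols], y_tr)
    pred = pls.predict(X_te[:, cols]).ravel()
    return np.sqrt(np.mean((y_te - pred) ** 2)), pls

active = list(range(X.shape[1]))
best_err, best_cols = rmsep(active)[0], list(active)
while len(active) > 5:
    _, pls = rmsep(active)
    # Drop the variable with the smallest absolute regression coefficient.
    weakest = active[int(np.argmin(np.abs(pls.coef_).ravel()))]
    active.remove(weakest)
    err = rmsep(active)[0]
    if err < best_err:
        best_err, best_cols = err, list(active)

print(f"kept {len(best_cols)} of {X.shape[1]} variables, RMSEP = {best_err:.3f}")
```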

  14. Evaluation of accuracy of linear regression models in predicting urban stormwater discharge characteristics.

    Science.gov (United States)

    Madarang, Krish J; Kang, Joo-Hyon

    2014-06-01

    Stormwater runoff has been identified as a source of pollution for the environment, especially for receiving waters. In order to quantify and manage the impacts of stormwater runoff on the environment, predictive and mathematical models have been developed. Predictive tools such as regression models have been widely used to predict stormwater discharge characteristics. Storm event characteristics, such as antecedent dry days (ADD), have been related to response variables such as pollutant loads and concentrations. However, whether ADD should be considered an important variable in predicting stormwater discharge characteristics has remained controversial across studies. In this study, we examined the accuracy of general linear regression models in predicting discharge characteristics of roadway runoff. A total of 17 storm events were monitored in two highway segments located in Gwangju, Korea. Data from the monitoring were used to calibrate the United States Environmental Protection Agency's Storm Water Management Model (SWMM). The calibrated SWMM was simulated for 55 storm events, and the results for total suspended solids (TSS) discharge loads and event mean concentrations (EMCs) were extracted. From these data, linear regression models were developed. R² and p-values of the regressions of ADD against both TSS loads and EMCs were investigated. Results showed that pollutant loads were better predicted than pollutant EMCs in the multiple regression models. Regression may not capture the true effect of site-specific characteristics, due to uncertainty in the data. Copyright © 2014 The Research Centre for Eco-Environmental Sciences, Chinese Academy of Sciences. Published by Elsevier B.V. All rights reserved.
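    A minimal sketch of the kind of regression reported above, with made-up numbers rather than the monitored Gwangju data: a pollutant load is regressed on antecedent dry days and the R² and slope p-value are inspected.

```python
# Illustrative regression: pollutant load per event vs. antecedent dry days.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
add = rng.uniform(1, 30, size=55)                            # antecedent dry days
tss_load = 5.0 + 0.8 * add + rng.normal(scale=4.0, size=55)  # TSS load, kg

model = sm.OLS(tss_load, sm.add_constant(add)).fit()
print(f"R^2 = {model.rsquared:.3f}")
print(f"ADD slope p-value = {model.pvalues[1]:.4f}")
```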

  15. Comparing Methodologies for Developing an Early Warning System: Classification and Regression Tree Model versus Logistic Regression. REL 2015-077

    Science.gov (United States)

    Koon, Sharon; Petscher, Yaacov

    2015-01-01

    The purpose of this report was to explicate the use of logistic regression and classification and regression tree (CART) analysis in the development of early warning systems. It was motivated by state education leaders' interest in maintaining high classification accuracy while simultaneously improving practitioner understanding of the rules by…
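    A brief sketch of the two methods being compared, on synthetic data rather than the report's student records; the tree depth and other settings are illustrative assumptions.

```python
# Illustrative comparison of CART and logistic regression on synthetic data.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier, export_text

X, y = make_classification(n_samples=800, n_features=6, n_informative=3,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

cart = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_tr, y_tr)
logit = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)

print("CART accuracy:    ", round(cart.score(X_te, y_te), 3))
print("logistic accuracy:", round(logit.score(X_te, y_te), 3))
# The tree also yields human-readable decision rules, the practical appeal
# noted in the report.
print(export_text(cart))
```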

  16. The more total cognitive load is reduced by cues, the better retention and transfer of multimedia learning: A meta-analysis and two meta-regression analyses.

    Science.gov (United States)

    Xie, Heping; Wang, Fuxing; Hao, Yanbin; Chen, Jiaxue; An, Jing; Wang, Yuxin; Liu, Huashan

    2017-01-01

    Cueing facilitates retention and transfer of multimedia learning. From the perspective of cognitive load theory (CLT), cueing has a positive effect on learning outcomes because of the reduction in total cognitive load and avoidance of cognitive overload. However, this has not been systematically evaluated. Moreover, what remains ambiguous is the direct relationship between the cue-related cognitive load and learning outcomes. A meta-analysis and two subsequent meta-regression analyses were conducted to explore these issues. Subjective total cognitive load (SCL) and scores on a retention test and transfer test were selected as dependent variables. Through a systematic literature search, 32 eligible articles encompassing 3,597 participants were included in the SCL-related meta-analysis. Among them, 25 articles containing 2,910 participants were included in the retention-related meta-analysis and the following retention-related meta-regression, while there were 29 articles containing 3,204 participants included in the transfer-related meta-analysis and the transfer-related meta-regression. The meta-analysis revealed a statistically significant cueing effect on subjective ratings of cognitive load (d = -0.11, 95% CI = [-0.19, -0.02], p < 0.05), retention performance (d = 0.27, 95% CI = [0.08, 0.46], p < 0.01), and transfer performance (d = 0.34, 95% CI = [0.12, 0.56], p < 0.01). The subsequent meta-regression analyses showed that d_SCL for cueing significantly predicted d_retention for cueing (β = -0.70, 95% CI = [-1.02, -0.38], p < 0.001), as well as d_transfer for cueing (β = -0.60, 95% CI = [-0.92, -0.28], p < 0.001). Thus in line with CLT, adding cues in multimedia materials can indeed reduce SCL and promote learning outcomes, and the more SCL is reduced by cues, the better retention and transfer of multimedia learning.
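    For readers unfamiliar with meta-regression, the following is a minimal sketch (with invented study-level numbers, not the meta-analysis dataset, and a fixed-effect simplification) of an inverse-variance weighted regression in which the cueing effect on cognitive load predicts the cueing effect on retention.

```python
# Illustrative fixed-effect meta-regression: d_SCL predicting d_retention,
# weighting studies by inverse sampling variance.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
k = 25                                         # number of studies
d_scl = rng.normal(-0.1, 0.3, size=k)          # per-study effect on cognitive load
var = rng.uniform(0.02, 0.10, size=k)          # per-study sampling variances
d_retention = 0.2 - 0.7 * d_scl + rng.normal(scale=np.sqrt(var))

wls = sm.WLS(d_retention, sm.add_constant(d_scl), weights=1.0 / var).fit()
print("intercept and meta-regression slope:", wls.params.round(2))
print("95% confidence intervals:\n", wls.conf_int().round(2))
```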

  17. On weighted and locally polynomial directional quantile regression

    Czech Academy of Sciences Publication Activity Database

    Boček, Pavel; Šiman, Miroslav

    2017-01-01

    Roč. 32, č. 3 (2017), s. 929-946 ISSN 0943-4062 R&D Projects: GA ČR GA14-07234S Institutional support: RVO:67985556 Keywords : Quantile regression * Nonparametric regression Subject RIV: IN - Informatics, Computer Science OBOR OECD: Computer sciences, information science, bioinformathics (hardware development to be 2.2, social aspect to be 5.8) Impact factor: 0.434, year: 2016 http://library.utia.cas.cz/separaty/2017/SI/bocek-0458380.pdf

  18. ALFA: an automated line fitting algorithm

    Science.gov (United States)

    Wesson, R.

    2016-03-01

    I present the automated line fitting algorithm, ALFA, a new code which can fit emission line spectra of arbitrary wavelength coverage and resolution, fully automatically. In contrast to traditional emission line fitting methods which require the identification of spectral features suspected to be emission lines, ALFA instead uses a list of lines which are expected to be present to construct a synthetic spectrum. The parameters used to construct the synthetic spectrum are optimized by means of a genetic algorithm. Uncertainties are estimated using the noise structure of the residuals. An emission line spectrum containing several hundred lines can be fitted in a few seconds using a single processor of a typical contemporary desktop or laptop PC. I show that the results are in excellent agreement with those measured manually for a number of spectra. Where discrepancies exist, the manually measured fluxes are found to be less accurate than those returned by ALFA. Together with the code NEAT, ALFA provides a powerful way to rapidly extract physical information from observations, an increasingly vital function in the era of highly multiplexed spectroscopy. The two codes can deliver a reliable and comprehensive analysis of very large data sets in a few hours with little or no user interaction.
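    A toy sketch of the idea behind ALFA (not the ALFA code itself): a synthetic spectrum is built from a list of expected line wavelengths and a simple genetic algorithm optimises the line fluxes to match the observed spectrum; the line list, common width, mutation scale, and selection scheme are assumptions.

```python
# Toy genetic-algorithm fit of emission-line fluxes to a synthetic spectrum.
import numpy as np

rng = np.random.default_rng(4)

wav = np.linspace(6540.0, 6600.0, 600)                # wavelength grid (angstrom)
line_centres = np.array([6548.0, 6563.0, 6584.0])     # expected line wavelengths
true_fluxes = np.array([1.0, 3.0, 2.0])
sigma = 1.5                                           # common line width

def synth(fluxes):
    """Synthetic spectrum: sum of Gaussians at the expected wavelengths."""
    return sum(f * np.exp(-0.5 * ((wav - c) / sigma) ** 2)
               for f, c in zip(fluxes, line_centres))

observed = synth(true_fluxes) + rng.normal(scale=0.05, size=wav.size)

def fitness(fluxes):
    """Negative sum of squared residuals against the observed spectrum."""
    return -np.sum((observed - synth(fluxes)) ** 2)

# Genetic algorithm: truncation selection plus Gaussian mutation.
pop = rng.uniform(0.0, 5.0, size=(60, 3))
for _ in range(200):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[-20:]]           # keep the fittest third
    children = (parents[rng.integers(0, 20, size=40)]
                + rng.normal(scale=0.05, size=(40, 3)))
    pop = np.vstack([parents, children])

best = pop[np.argmax([fitness(ind) for ind in pop])]
print("true fluxes:     ", true_fluxes)
print("recovered fluxes:", best.round(2))
```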

  19. Regression Benchmarking: An Approach to Quality Assurance in Performance

    OpenAIRE

    Bulej, Lubomír

    2005-01-01

    The paper presents a short summary of our work in the area of regression benchmarking and its application to software development. Specifically, we explain the concept of regression benchmarking, the requirements for employing regression testing in a software project, and methods used for analyzing the vast amounts of data resulting from repeated benchmarking. We present the application of regression benchmarking on a real software project and conclude with a glimpse at the challenges for the fu...

  20. Evaluation of linear regression techniques for atmospheric applications: the importance of appropriate weighting

    Directory of Open Access Journals (Sweden)

    C. Wu

    2018-03-01

    Full Text Available Linear regression techniques are widely used in atmospheric science, but they are often improperly applied due to lack of consideration or inappropriate handling of measurement uncertainty. In this work, numerical experiments are performed to evaluate the performance of five linear regression techniques, significantly extending previous works by Chu and Saylor. The five techniques are ordinary least squares (OLS), Deming regression (DR), orthogonal distance regression (ODR), weighted ODR (WODR), and York regression (YR). We first introduce a new data generation scheme that employs the Mersenne twister (MT) pseudorandom number generator. The numerical simulations are also improved by (a) refining the parameterization of nonlinear measurement uncertainties, (b) inclusion of a linear measurement uncertainty, and (c) inclusion of WODR for comparison. Results show that DR, WODR and YR produce an accurate slope, but the intercept from WODR and YR is overestimated, and the degree of bias is more pronounced with a low-R² XY dataset. The importance of a proper weighting parameter λ in DR is investigated by sensitivity tests, and it is found that an improper λ in DR can lead to bias in both the slope and intercept estimates. Because the λ calculation depends on the actual form of the measurement error, it is essential to determine the exact form of measurement error in the XY data during the measurement stage. If the a priori error in one of the variables is unknown, or the stated measurement error cannot be trusted, DR, WODR and YR provide the least biased slope and intercept among all tested regression techniques. For these reasons, DR, WODR and YR are recommended for atmospheric studies when both X and Y data have measurement errors. An Igor Pro-based program (Scatter Plot) was developed to facilitate the implementation of error-in-variables regressions.
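    A minimal sketch of the error-in-variables point made above, using synthetic data: ordinary least squares attenuates the slope when X carries measurement error, while orthogonal distance regression (scipy.odr) accounts for errors in both variables. Deming and York regressions are not in SciPy and are omitted here.

```python
# Synthetic comparison of OLS and orthogonal distance regression when both
# X and Y are measured with error.
import numpy as np
from scipy import odr

rng = np.random.default_rng(5)
true_x = np.linspace(0, 10, 80)
x = true_x + rng.normal(scale=0.8, size=80)           # X measured with error
y = 2.0 + 1.5 * true_x + rng.normal(scale=0.8, size=80)

# Ordinary least squares (errors assumed to be in Y only).
ols_slope = np.polyfit(x, y, 1)[0]

# Orthogonal distance regression with stated uncertainties in both variables.
linear = odr.Model(lambda beta, x: beta[0] + beta[1] * x)
data = odr.RealData(x, y, sx=0.8, sy=0.8)
output = odr.ODR(data, linear, beta0=[0.0, 1.0]).run()

print(f"OLS slope: {ols_slope:.3f}  ODR slope: {output.beta[1]:.3f}  true: 1.5")
```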