WorldWideScience

Sample records for regression analyses assessed

  1. Assessing the suitability of summary data for two-sample Mendelian randomization analyses using MR-Egger regression: the role of the I² statistic.

    Science.gov (United States)

    Bowden, Jack; Del Greco M, Fabiola; Minelli, Cosetta; Davey Smith, George; Sheehan, Nuala A; Thompson, John R

    2016-12-01

    We demonstrate our proposed approach for a two-sample summary data MR analysis to estimate the causal effect of low-density lipoprotein on heart disease risk. A high value of I²GX close to 1 indicates that dilution does not materially affect the standard MR-Egger analyses for these data. Care must be taken to assess the NOME assumption via the I²GX statistic before implementing standard MR-Egger regression in the two-sample summary data context. If I²GX is sufficiently low (less than 90%), inferences from the method should be interpreted with caution and adjustment methods considered. © The Author 2016. Published by Oxford University Press on behalf of the International Epidemiological Association.
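
    For readers who want to compute the diagnostic themselves, here is a minimal sketch of the I²GX calculation from two-sample summary data; the function and variable names (beta_gx, se_gx) are illustrative assumptions, not the authors' code.

    ```python
    # Sketch: I²GX diagnostic for the NOME assumption in MR-Egger, computed
    # as a weighted I² statistic over the SNP-exposure estimates.
    import numpy as np

    def i_squared_gx(beta_gx, se_gx):
        """beta_gx: SNP-exposure estimates; se_gx: their standard errors."""
        beta_gx, se_gx = np.asarray(beta_gx, float), np.asarray(se_gx, float)
        w = 1.0 / se_gx**2                              # inverse-variance weights
        beta_bar = np.sum(w * beta_gx) / np.sum(w)      # weighted mean
        q = np.sum(w * (beta_gx - beta_bar) ** 2)       # Cochran's Q
        if q <= 0:
            return 0.0
        return max(0.0, (q - (len(beta_gx) - 1)) / q)   # I² in [0, 1]

    # Values below 0.9 suggest MR-Egger output should be treated with caution.
    ```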

  2. Statistical power analyses using G*Power 3.1: tests for correlation and regression analyses.

    Science.gov (United States)

    Faul, Franz; Erdfelder, Edgar; Buchner, Axel; Lang, Albert-Georg

    2009-11-01

    G*Power is a free power analysis program for a variety of statistical tests. We present extensions and improvements of the version introduced by Faul, Erdfelder, Lang, and Buchner (2007) in the domain of correlation and regression analyses. In the new version, we have added procedures to analyze the power of tests based on (1) single-sample tetrachoric correlations, (2) comparisons of dependent correlations, (3) bivariate linear regression, (4) multiple linear regression based on the random predictor model, (5) logistic regression, and (6) Poisson regression. We describe these new features and provide a brief introduction to their scope and handling.
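
    As a flavour of what such power procedures compute, the sketch below approximates the power of a two-sided test of a bivariate correlation via the Fisher z transformation; this mirrors the textbook approximation, not G*Power's internal routines.

    ```python
    # Approximate power for H0: rho = 0 (two-sided) using Fisher's z.
    import numpy as np
    from scipy.stats import norm

    def correlation_power(rho, n, alpha=0.05):
        z = np.arctanh(rho) * np.sqrt(n - 3)   # standardized effect size
        crit = norm.ppf(1 - alpha / 2)         # two-sided critical value
        return norm.cdf(z - crit) + norm.cdf(-z - crit)

    print(round(correlation_power(rho=0.3, n=84), 2))  # ~0.80
    ```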

  3. Applications of MIDAS regression in analysing trends in water quality

    Science.gov (United States)

    Penev, Spiridon; Leonte, Daniela; Lazarov, Zdravetz; Mann, Rob A.

    2014-04-01

    We discuss novel statistical methods for analysing trends in water quality. Such analyses use complex data sets containing different classes of variables: water quality, hydrological, and meteorological. We analyse the effect of rainfall and flow on trends in water quality utilising a flexible model called Mixed Data Sampling (MIDAS). This model arises because of the mixed frequency in the data collection: typically, water quality variables are sampled fortnightly, whereas rainfall data are sampled daily. The advantage of using MIDAS regression is the flexible and parsimonious modelling of the influence of rainfall and flow on trends in water quality variables. We discuss the model and its implementation on a data set from the Shoalhaven Supply System and Catchments in the state of New South Wales, Australia. Information criteria indicate that MIDAS modelling improves upon simplistic approaches that do not utilise the mixed-frequency nature of the data.
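
    To make the mixed-frequency idea concrete, the sketch below builds a fortnightly regressor from daily rainfall using exponential Almon lag weights, the standard parsimonious weighting scheme in MIDAS models; the parameter values and names are illustrative assumptions, not the authors' fitted model.

    ```python
    # Sketch: collapse the preceding n_lags daily rainfall values into one
    # weighted regressor per fortnightly water-quality observation.
    import numpy as np

    def exp_almon_weights(n_lags, theta1=0.05, theta2=-0.01):
        j = np.arange(1, n_lags + 1)
        w = np.exp(theta1 * j + theta2 * j**2)
        return w / w.sum()                     # weights sum to one

    def midas_regressor(daily_rain, obs_days, n_lags=14):
        """obs_days: indices of fortnightly sampling days (each >= n_lags)."""
        w = exp_almon_weights(n_lags)
        daily_rain = np.asarray(daily_rain, float)
        return np.array([w @ daily_rain[d - n_lags:d][::-1] for d in obs_days])

    # In a full MIDAS fit, theta1/theta2 are estimated jointly with the
    # regression slope, e.g. by nonlinear least squares.
    ```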

  4. Pathological assessment of liver fibrosis regression

    Directory of Open Access Journals (Sweden)

    WANG Bingqiong

    2017-03-01

    Hepatic fibrosis is the common pathological outcome of chronic hepatic diseases. An accurate assessment of fibrosis degree provides an important reference for a definite diagnosis of diseases, treatment decision-making, treatment outcome monitoring, and prognostic evaluation. At present, many clinical studies have proven that regression of hepatic fibrosis and early-stage liver cirrhosis can be achieved by effective treatment, and a correct evaluation of fibrosis regression has become a hot topic in clinical research. Liver biopsy has long been regarded as the gold standard for the assessment of hepatic fibrosis, and thus it plays an important role in the evaluation of fibrosis regression. This article reviews the clinical application of current pathological staging systems in the evaluation of fibrosis regression from the perspectives of semi-quantitative scoring system, quantitative approach, and qualitative approach, in order to propose a better pathological evaluation system for the assessment of fibrosis regression.

  5. Supporting analyses and assessments

    Energy Technology Data Exchange (ETDEWEB)

    Ohi, J. [National Renewable Energy Lab., Golden, CO (United States)

    1995-09-01

    Supporting analysis and assessments can provide a sound analytic foundation and focus for program planning, evaluation, and coordination, particularly if issues of hydrogen production, distribution, storage, safety, and infrastructure can be analyzed in a comprehensive and systematic manner. The overall purpose of this activity is to coordinate all key analytic tasks (technology and market status, opportunities, and trends; environmental costs and benefits; and regulatory constraints and opportunities) within a long-term and systematic analytic foundation for program planning and evaluation. Within this context, the purpose of the project is to help develop and evaluate programmatic pathway options that incorporate near- and mid-term strategies to achieve the long-term goals of the Hydrogen Program. In FY 95, NREL will develop a comprehensive effort with industry, state and local agencies, and other federal agencies to identify and evaluate programmatic pathway options to achieve the long-term goals of the Program. Activity to date is reported.

  6. Multicollinearity in Regression Analyses Conducted in Epidemiologic Studies.

    Science.gov (United States)

    Vatcheva, Kristina P; Lee, MinJae; McCormick, Joseph B; Rahbar, Mohammad H

    2016-04-01

    The adverse impact of ignoring multicollinearity on findings and data interpretation in regression analysis is very well documented in the statistical literature. The failure to identify and report multicollinearity could result in misleading interpretations of the results. A review of epidemiological literature in PubMed from January 2004 to December 2013 illustrated the need for greater attention to identifying and minimizing the effect of multicollinearity in the analysis of data from epidemiologic studies. We used simulated datasets and real-life data from the Cameron County Hispanic Cohort to demonstrate the adverse effects of multicollinearity in regression analysis and to encourage researchers to consider diagnostics for multicollinearity as one of the steps in regression analysis.
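
    A minimal example of the kind of diagnostic step the authors advocate, using variance inflation factors from statsmodels; the dataset and column names are assumptions, and a VIF above roughly 5-10 is a common rule of thumb for concern.

    ```python
    # Compute a variance inflation factor (VIF) for each predictor.
    import pandas as pd
    import statsmodels.api as sm
    from statsmodels.stats.outliers_influence import variance_inflation_factor

    def vif_table(predictors: pd.DataFrame) -> pd.Series:
        X = sm.add_constant(predictors)        # VIFs are computed with an intercept
        return pd.Series(
            {col: variance_inflation_factor(X.values, i)
             for i, col in enumerate(X.columns) if col != "const"}
        ).sort_values(ascending=False)

    # vif_table(df[["age", "bmi", "waist_circumference"]])  # columns assumed
    ```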

  7. Statistical and regression analyses of detected extrasolar systems

    Czech Academy of Sciences Publication Activity Database

    Pintr, Pavel; Peřinová, V.; Lukš, A.; Pathak, A.

    2013-01-01

    Roč. 75, č. 1 (2013), s. 37-45 ISSN 0032-0633 Institutional support: RVO:61389021 Keywords : Exoplanets * Kepler candidates * Regression analysis Subject RIV: BN - Astronomy, Celestial Mechanics, Astrophysics Impact factor: 1.630, year: 2013 http://www.sciencedirect.com/science/article/pii/S0032063312003066

  8. Assessing risk factors for periodontitis using regression

    Science.gov (United States)

    Lobo Pereira, J. A.; Ferreira, Maria Cristina; Oliveira, Teresa

    2013-10-01

    Multivariate statistical analysis is indispensable for assessing the associations and interactions between different factors and the risk of periodontitis. Among other techniques, regression analysis is widely used in healthcare to investigate and model the relationship between variables. In our work we study the impact of socio-demographic, medical and behavioral factors on periodontal health. Using linear and logistic regression models, we assess the relevance, as risk factors for periodontitis, of the following independent variables (IVs): age, gender, diabetic status, education, smoking status and plaque index. A multiple linear regression model was built to evaluate the influence of the IVs on mean attachment loss (AL); it yields the regression coefficients along with the respective p-values from the significance tests. The classification of a case (individual) adopted in the logistic model was the extent of destruction of periodontal tissues, defined as an attachment loss greater than or equal to 4 mm in at least 25% (AL≥4mm/≥25%) of the sites surveyed. The association measures include odds ratios together with the corresponding 95% confidence intervals.
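
    The logistic part of such an analysis might look like the sketch below, which fits the dichotomized outcome on the listed IVs and reports odds ratios with 95% confidence intervals; the DataFrame and column names are assumptions for illustration.

    ```python
    # Logistic regression for the dichotomized outcome (AL >= 4 mm in >= 25% of sites).
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # df is assumed to hold one row per subject with these (hypothetical) columns
    fit = smf.logit(
        "periodontitis ~ age + C(gender) + C(diabetic) + C(education)"
        " + C(smoking) + plaque_index", data=df
    ).fit()

    ors = pd.concat([np.exp(fit.params), np.exp(fit.conf_int())], axis=1)
    ors.columns = ["OR", "2.5%", "97.5%"]      # odds ratios with 95% CIs
    print(ors)
    ```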

  9. Multicollinearity in Regression Analyses Conducted in Epidemiologic Studies

    OpenAIRE

    Vatcheva, Kristina P.; Lee, MinJae; McCormick, Joseph B.; Rahbar, Mohammad H.

    2016-01-01

    The adverse impact of ignoring multicollinearity on findings and data interpretation in regression analysis is very well documented in the statistical literature. The failure to identify and report multicollinearity could result in misleading interpretations of the results. A review of epidemiological literature in PubMed from January 2004 to December 2013 illustrated the need for greater attention to identifying and minimizing the effect of multicollinearity in the analysis of data from epidemiologic studies. We used simulated datasets and real-life data from the Cameron County Hispanic Cohort to demonstrate the adverse effects of multicollinearity in regression analysis and to encourage researchers to consider diagnostics for multicollinearity as one of the steps in regression analysis.

  10. Analysing inequalities in Germany: a structured additive distributional regression approach

    CERN Document Server

    Silbersdorff, Alexander

    2017-01-01

    This book seeks new perspectives on the growing inequalities that our societies face, putting forward Structured Additive Distributional Regression as a means of statistical analysis that circumvents the common problem of analytical reduction to simple point estimators. This new approach allows the discrepancy between the individuals' realities and the abstract representation of those realities that arises from using the arithmetic mean alone to be explicitly taken into consideration. In turn, the method is applied to the question of economic inequality in Germany.

  11. Spatial vulnerability assessments by regression kriging

    Science.gov (United States)

    Pásztor, László; Laborczi, Annamária; Takács, Katalin; Szatmári, Gábor

    2016-04-01

    Information representing the environmental factors that give rise to IEW or determine GRP was taken into account to support the spatial inference of the locally experienced IEW frequency and the measured GRP values, respectively. An efficient spatial prediction methodology, regression kriging (RK), was applied to construct reliable maps, using spatially exhaustive auxiliary data on soil, geology, topography, land use and climate. RK divides the spatial inference into two parts. First, the deterministic component of the target variable is determined by a regression model. The residuals of the multiple linear regression analysis represent the spatially varying but dependent stochastic component, which is interpolated by kriging. The final map is the sum of the two component predictions. Application of RK also provides the possibility of inherent accuracy assessment: the resulting maps are characterized by global and local measures of their accuracy. Additionally, the method enables interval estimation for the spatial extent of areas in predefined risk categories. All of these outputs provide a useful contribution to spatial planning, action planning and decision making. Acknowledgement: Our work was partly supported by the Hungarian National Scientific Research Foundation (OTKA, Grant No. K105167).
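
    The two-step logic of RK can be sketched as below, with the kriging of residuals approximated by a Gaussian process (an assumption for illustration; the authors' geostatistical implementation may differ in variogram choice and other details).

    ```python
    # Regression kriging sketch: trend by multiple linear regression,
    # residuals interpolated by a GP (a simple-kriging analogue).
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel

    def regression_kriging(coords, covars, z, new_coords, new_covars):
        trend = LinearRegression().fit(covars, z)     # deterministic component
        resid = z - trend.predict(covars)             # stochastic component
        gp = GaussianProcessRegressor(RBF(length_scale=1e4) + WhiteKernel())
        gp.fit(coords, resid)
        r_hat, r_sd = gp.predict(new_coords, return_std=True)
        # final map = trend + kriged residuals; r_sd supports accuracy assessment
        return trend.predict(new_covars) + r_hat, r_sd
    ```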

  12. How to deal with continuous and dichotomic outcomes in epidemiological research: linear and logistic regression analyses

    NARCIS (Netherlands)

    Tripepi, Giovanni; Jager, Kitty J.; Stel, Vianda S.; Dekker, Friedo W.; Zoccali, Carmine

    2011-01-01

    Because of some limitations of stratification methods, epidemiologists frequently use multiple linear and logistic regression analyses to address specific epidemiological questions. If the dependent variable is a continuous one (for example, systolic pressure and serum creatinine), the researcher...

  13. Reducing Inter-Laboratory Differences between Semen Analyses Using Z Score and Regression Transformations

    Directory of Open Access Journals (Sweden)

    Esther Leushuis

    2016-12-01

    Background: Standardization of the semen analysis may improve reproducibility. We assessed variability between laboratories in semen analyses and evaluated whether a transformation using Z scores and regression statistics was able to reduce this variability. Materials and Methods: We performed a retrospective cohort study. We calculated between-laboratory coefficients of variation (CVB) for sperm concentration and for morphology. Subsequently, we standardized the semen analysis results by calculating laboratory-specific Z scores, and by using regression. We used analysis of variance for four semen parameters to assess systematic differences between laboratories before and after the transformations, both in the circulation samples and in the samples obtained in the prospective cohort study in the Netherlands between January 2002 and February 2004. Results: The mean CVB was 7% for sperm concentration (range 3 to 13%) and 32% for sperm morphology (range 18 to 51%). The differences between the laboratories were statistically significant for all semen parameters (all P<0.001). Standardization using Z scores did not reduce the differences in semen analysis results between the laboratories (all P<0.001). Conclusion: There exists large between-laboratory variability for sperm morphology and small, but statistically significant, between-laboratory variation for sperm concentration. Standardization using Z scores does not eliminate between-laboratory variability.
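
    The Z-score transformation evaluated in the study amounts to standardizing each result against its own laboratory's distribution; a minimal sketch with assumed column names:

    ```python
    # Per-laboratory Z scores for a semen parameter.
    import pandas as pd

    def z_by_lab(df: pd.DataFrame, col: str, lab_col: str = "lab") -> pd.Series:
        g = df.groupby(lab_col)[col]
        return (df[col] - g.transform("mean")) / g.transform("std")

    # df["concentration_z"] = z_by_lab(df, "sperm_concentration")
    ```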

  14. USE OF THE SIMPLE LINEAR REGRESSION MODEL IN MACRO-ECONOMICAL ANALYSES

    Directory of Open Access Journals (Sweden)

    Constantin ANGHELACHE

    2011-10-01

    The article presents the fundamental aspects of linear regression as a toolbox for macroeconomic analyses. It describes the estimation of the parameters, the statistical tests used, and homoscedasticity versus heteroscedasticity. The use of econometric instruments in macroeconomics is an important factor that guarantees the quality of the models, analyses, results and possible interpretations that can be drawn at this level.

  15. Alpins and Thibos vectorial astigmatism analyses: proposal of a linear regression model between methods

    Directory of Open Access Journals (Sweden)

    Giuliano de Oliveira Freitas

    2013-10-01

    PURPOSE: To determine linear regression models between Alpins descriptive indices and Thibos astigmatic power vectors (APV), assessing the validity and strength of such correlations. METHODS: This case series prospectively assessed 62 eyes of 31 consecutive cataract patients with preoperative corneal astigmatism between 0.75 and 2.50 diopters in both eyes. Patients were randomly assorted between two phacoemulsification groups: one assigned to receive an AcrySof® Toric intraocular lens (IOL) in both eyes and another assigned to have an AcrySof Natural IOL associated with limbal relaxing incisions, also in both eyes. All patients were reevaluated postoperatively at 6 months, when refractive astigmatism analysis was performed using both the Alpins and Thibos methods. The ratio between Thibos postoperative APV and preoperative APV (APVratio) and its linear regression to the Alpins percentage of success of astigmatic surgery, percentage of astigmatism corrected and percentage of astigmatism reduction at the intended axis were assessed. RESULTS: A significant negative correlation between the post-to-preoperative Thibos APVratio and the Alpins percentage of success (%Success) was found (Spearman's ρ = -0.93); the linear regression is given by the following equation: %Success = (1.00 - APVratio) × 100. CONCLUSION: The linear regression we found between the APVratio and %Success permits a validated mathematical inference concerning the overall success of astigmatic surgery.
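
    The reported regression is simple enough to restate as a function; the sketch below just encodes the published equation, with hypothetical input values.

    ```python
    # %Success = (1.00 - APVratio) x 100, per the regression reported above.
    def percent_success(apv_pre: float, apv_post: float) -> float:
        apv_ratio = apv_post / apv_pre     # Thibos astigmatic power vector ratio
        return (1.0 - apv_ratio) * 100.0

    print(percent_success(apv_pre=1.50, apv_post=0.30))  # 80.0 (% success)
    ```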

  16. Analyses of Developmental Rate Isomorphy in Ectotherms: Introducing the Dirichlet Regression.

    Directory of Open Access Journals (Sweden)

    David S Boukal

    Temperature drives development in insects and other ectotherms because their metabolic rate and growth depend directly on thermal conditions. However, relative durations of successive ontogenetic stages often remain nearly constant across a substantial range of temperatures. This pattern, termed 'developmental rate isomorphy' (DRI) in insects, appears to be widespread, and reported departures from DRI are generally very small. We show that these conclusions may be due to the caveats hidden in the statistical methods currently used to study DRI. Because the DRI concept is inherently based on proportional data, we propose that Dirichlet regression applied to individual-level data is an appropriate statistical method to critically assess DRI. As a case study we analyze data on five aquatic and four terrestrial insect species. We find that results obtained by Dirichlet regression are consistent with DRI violation in at least eight of the studied species, although standard analysis detects significant departure from DRI in only four of them. Moreover, the departures from DRI detected by Dirichlet regression are consistently much larger than previously reported. The proposed framework can also be used to infer whether observed departures from DRI reflect life history adaptations to size- or stage-dependent effects of varying temperature. Our results indicate that the concept of DRI in insects and other ectotherms should be critically re-evaluated and put in a wider context, including the concept of 'equiproportional development' developed for copepods.

  17. Genetic analyses of partial egg production in Japanese quail using multi-trait random regression models.

    Science.gov (United States)

    Karami, K; Zerehdaran, S; Barzanooni, B; Lotfi, E

    2017-12-01

    1. The aim of the present study was to estimate genetic parameters for average egg weight (EW) and egg number (EN) at different ages in Japanese quail using multi-trait random regression (MTRR) models. 2. A total of 8534 records from 900 quail, hatched between 2014 and 2015, were used in the study. Average weekly egg weights and egg numbers were measured from the second until the sixth week of egg production. 3. Nine random regression models were compared to identify the best order of the Legendre polynomials (LP). The optimal model was identified by the Bayesian Information Criterion. A model with second-order LP for fixed effects, second-order LP for additive genetic effects and third-order LP for permanent environmental effects (MTRR23) was found to be the best. 4. According to the MTRR23 model, direct heritability for EW increased from 0.26 in the second week to 0.53 in the sixth week of egg production, whereas the ratio of permanent environment to phenotypic variance decreased from 0.48 to 0.1. Direct heritability for EN was low, whereas the ratio of permanent environment to phenotypic variance decreased from 0.57 to 0.15 during the production period. 5. For each trait, estimated genetic correlations among weeks of egg production were high (from 0.85 to 0.98). Genetic correlations between EW and EN were low and negative for the first two weeks, but low and positive for the rest of the egg production period. 6. In conclusion, random regression models can be used effectively for analysing egg production traits in Japanese quail. Response to selection for increased egg weight would be higher at older ages because of its higher heritability, and such a breeding program would have no negative genetic impact on egg production.
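
    For readers unfamiliar with random regression covariables, the sketch below constructs the Legendre polynomial basis over the production weeks; standardizing the time axis to [-1, 1] is the usual convention, and the variable names are illustrative, not the authors' code.

    ```python
    # Build Legendre polynomial covariables of a given order for a time axis.
    import numpy as np
    from numpy.polynomial import legendre

    def legendre_basis(t, order):
        t = np.asarray(t, float)
        x = 2 * (t - t.min()) / (t.max() - t.min()) - 1    # map times to [-1, 1]
        return np.column_stack(
            [legendre.Legendre.basis(k)(x) for k in range(order + 1)]
        )

    weeks = np.array([2, 3, 4, 5, 6])
    Z = legendre_basis(weeks, order=2)  # columns enter the mixed model as random-regression covariables
    ```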

  18. Logistic regression and multiple classification analyses to explore risk factors of under-5 mortality in Bangladesh

    International Nuclear Information System (INIS)

    Bhowmik, K.R.; Islam, S.

    2016-01-01

    Logistic regression (LR) analysis is the most common statistical methodology for finding the determinants of childhood mortality. However, the significant predictors cannot be ranked according to their influence on the response variable. Multiple classification (MC) analysis can be applied to identify the significant predictors with a priority index that helps to rank them. The main objective of the study is to find the socio-demographic determinants of childhood mortality in the neonatal, post-neonatal, and post-infant periods by fitting LR models, and to rank those determinants through MC analysis. The study is conducted using data from the Bangladesh Demographic and Health Survey 2007, where birth and death information of children was collected from their mothers. Three dichotomous response variables were constructed from children's age at death to fit the LR and MC models. Socio-economic and demographic variables significantly associated with the response variables were considered in the LR and MC analyses. Both the LR and MC models identified the same significant predictors for each specific childhood mortality outcome. For both neonatal and child mortality, biological factors of children, regional settings, and parents' socio-economic status were found to be the 1st, 2nd, and 3rd most important groups of predictors, respectively. Mother's education and household environment were detected as major significant predictors of post-neonatal mortality. This study shows that MC analysis, with or without LR analysis, can be applied to detect determinants with rank, which helps policy makers take initiatives on a priority basis. (author)

  19. Predicting Performance on MOOC Assessments using Multi-Regression Models

    OpenAIRE

    Ren, Zhiyun; Rangwala, Huzefa; Johri, Aditya

    2016-01-01

    The past few years have seen the rapid growth of data mining approaches for the analysis of data obtained from Massive Open Online Courses (MOOCs). The objectives of this study are to develop approaches to predict the scores a student may achieve on a given grade-related assessment based on information considered as prior performance or prior activity in the course. We develop a personalized linear multiple regression (PLMR) model to predict the grade for a student, prior to attempting the assessment.

  20. Using Regression Equations Built from Summary Data in the Psychological Assessment of the Individual Case: Extension to Multiple Regression

    Science.gov (United States)

    Crawford, John R.; Garthwaite, Paul H.; Denham, Annie K.; Chelune, Gordon J.

    2012-01-01

    Regression equations have many useful roles in psychological assessment. Moreover, there is a large reservoir of published data that could be used to build regression equations; these equations could then be employed to test a wide variety of hypotheses concerning the functioning of individual cases. This resource is currently underused because…

  21. The number of subjects per variable required in linear regression analyses

    NARCIS (Netherlands)

    P.C. Austin (Peter); E.W. Steyerberg (Ewout)

    2015-01-01

    Objectives: To determine the number of independent variables that can be included in a linear regression model. Study Design and Setting: We used a series of Monte Carlo simulations to examine the impact of the number of subjects per variable (SPV) on the accuracy of estimated regression coefficients and standard errors, on the empirical coverage of estimated confidence intervals, and on the accuracy of the estimated R² of the fitted model.

  22. The number of subjects per variable required in linear regression analyses.

    Science.gov (United States)

    Austin, Peter C; Steyerberg, Ewout W

    2015-06-01

    To determine the number of independent variables that can be included in a linear regression model. We used a series of Monte Carlo simulations to examine the impact of the number of subjects per variable (SPV) on the accuracy of estimated regression coefficients and standard errors, on the empirical coverage of estimated confidence intervals, and on the accuracy of the estimated R² of the fitted model. A minimum of approximately two SPV tended to result in estimation of regression coefficients with relative bias of less than 10%. Furthermore, with this minimum number of SPV, the standard errors of the regression coefficients were accurately estimated and estimated confidence intervals had approximately the advertised coverage rates. A much higher number of SPV was necessary to minimize bias in estimating the model R², although adjusted R² estimates behaved well. The bias in estimating the model R² statistic was inversely proportional to the magnitude of the proportion of variation explained by the population regression model. Linear regression models require only two SPV for adequate estimation of regression coefficients, standard errors, and confidence intervals. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
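
    A highly condensed version of such a simulation is sketched below: it checks the relative bias of an estimated coefficient at a chosen SPV. The setup and parameter values are assumptions, far simpler than the study's design.

    ```python
    # Monte Carlo check of coefficient bias at a given subjects-per-variable (SPV).
    import numpy as np

    rng = np.random.default_rng(42)

    def coef_relative_bias(spv, p=10, beta=0.5, n_sim=2000):
        n, est = spv * p, np.empty(n_sim)
        for s in range(n_sim):
            X = rng.standard_normal((n, p))
            y = X @ np.full(p, beta) + rng.standard_normal(n)
            b, *_ = np.linalg.lstsq(np.column_stack([np.ones(n), X]), y, rcond=None)
            est[s] = b[1]                     # first slope estimate
        return (est.mean() - beta) / beta

    print(coef_relative_bias(spv=2))  # close to zero even at 2 SPV
    ```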

  23. Improved Dietary Guidelines for Vitamin D: Application of Individual Participant Data (IPD)-Level Meta-Regression Analyses

    Directory of Open Access Journals (Sweden)

    Kevin D. Cashman

    2017-05-01

    Dietary Reference Values (DRVs) for vitamin D have a key role in the prevention of vitamin D deficiency. However, despite adopting similar risk assessment protocols, estimates from authoritative agencies over the last 6 years have been diverse. This may have arisen from diverse approaches to data analysis. Modelling strategies for pooling of individual subject data from cognate vitamin D randomized controlled trials (RCTs) are likely to provide the most appropriate DRV estimates. Thus, the objective of the present work was to undertake the first-ever individual participant data (IPD)-level meta-regression, which is increasingly recognized as best practice, from seven winter-based RCTs (with 882 participants ranging in age from 4 to 90 years) of the vitamin D intake–serum 25-hydroxyvitamin D (25(OH)D) dose-response. Our IPD-derived estimates of vitamin D intakes required to maintain 97.5% of 25(OH)D concentrations >25, 30, and 50 nmol/L across the population are 10, 13, and 26 µg/day, respectively. In contrast, standard meta-regression analyses with aggregate data (as used by several agencies in recent years) from the same RCTs estimated that a vitamin D intake requirement of 14 µg/day would maintain 97.5% of 25(OH)D >50 nmol/L. These first IPD-derived estimates offer improved dietary recommendations for vitamin D because the underpinning modeling captures the between-person variability in response of serum 25(OH)D to vitamin D intake.

  24. Correcting for multivariate measurement error by regression calibration in meta-analyses of epidemiological studies.

    NARCIS (Netherlands)

    Kromhout, D.

    2009-01-01

    Within-person variability in measured values of multiple risk factors can bias their associations with disease. The multivariate regression calibration (RC) approach can correct for such measurement error and has been applied to studies in which true values or independent repeat measurements of the risk factors are observed on a subsample.

  25. Testing Mediation Using Multiple Regression and Structural Equation Modeling Analyses in Secondary Data

    Science.gov (United States)

    Li, Spencer D.

    2011-01-01

    Mediation analysis in child and adolescent development research is possible using large secondary data sets. This article provides an overview of two statistical methods commonly used to test mediated effects in secondary analysis: multiple regression and structural equation modeling (SEM). Two empirical studies are presented to illustrate the…

  26. Regression Analyses on the Butterfly Ballot Effect: A Statistical Perspective of the US 2000 Election

    Science.gov (United States)

    Wu, Dane W.

    2002-01-01

    The year 2000 US presidential election between Al Gore and George Bush has been the most intriguing and controversial one in American history. The state of Florida was the trigger for the controversy, mainly, due to the use of the misleading "butterfly ballot". Using prediction (or confidence) intervals for least squares regression lines…

  27. Check-all-that-apply data analysed by Partial Least Squares regression

    DEFF Research Database (Denmark)

    Rinnan, Åsmund; Giacalone, Davide; Frøst, Michael Bom

    2015-01-01

    are analysed by multivariate techniques. CATA data can be analysed with the CATA terms as either the X or the Y block: the former is the PLS-Discriminant Analysis (PLS-DA) version, while the latter is the ANOVA-PLS (A-PLS) version. We investigated the difference between these two approaches, concluding...

  28. Alternative regression models to assess increase in childhood BMI

    OpenAIRE

    Beyerlein, Andreas; Fahrmeir, Ludwig; Mansmann, Ulrich; Toschke, André M

    2008-01-01

    Background: Body mass index (BMI) data usually have skewed distributions, for which common statistical modeling approaches such as simple linear or logistic regression have limitations. Methods: Different regression approaches to predict childhood BMI by goodness-of-fit measures and means of interpretation were compared, including generalized linear models (GLMs), quantile regression and Generalized Additive Models for Location, Scale and Shape (GAMLSS). We analyzed data of 4967 children participating in the school entry health examination in Bavaria, Germany, from 2001 to 2002.

  29. Differential item functioning (DIF) analyses of health-related quality of life instruments using logistic regression

    DEFF Research Database (Denmark)

    Scott, Neil W; Fayers, Peter M; Aaronson, Neil K

    2010-01-01

    Differential item functioning (DIF) methods can be used to determine whether different subgroups respond differently to particular items within a health-related quality of life (HRQoL) subscale, after allowing for overall subgroup differences in that scale. This article reviews issues that arise when testing for DIF in HRQoL instruments. We focus on logistic regression methods, which are often used because of their efficiency, simplicity and ease of application.
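
    The standard logistic-regression DIF test compares nested models for one (here dichotomized) item, conditioning on the scale score; a minimal sketch with assumed variable names:

    ```python
    # Likelihood-ratio test for uniform + non-uniform DIF on a single item.
    import statsmodels.formula.api as smf
    from scipy.stats import chi2

    # df assumed: item (0/1 response), scale_score (total), group (subgroup)
    base = smf.logit("item ~ scale_score", data=df).fit(disp=0)
    full = smf.logit("item ~ scale_score + group + scale_score:group",
                     data=df).fit(disp=0)
    lr = 2 * (full.llf - base.llf)      # compares the nested models
    print("p =", chi2.sf(lr, df=2))     # small p suggests DIF on this item
    ```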

  30. Application of Negative Binomial Regression for Assessing Public ...

    African Journals Online (AJOL)

    Because the variance was nearly two times greater than the mean, the negative binomial regression model provided an improved fit to the data and accounted better for overdispersion than the Poisson regression model, which assumed that the mean and variance are the same. The level of education and race were found...
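
    The model comparison described here can be reproduced in outline as below; the outcome and covariate names are placeholders, and the negative binomial dispersion parameter is left at the statsmodels default rather than estimated.

    ```python
    # Poisson vs negative binomial GLM for an overdispersed count outcome.
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    pois = smf.glm("count ~ C(education) + C(race)", data=df,
                   family=sm.families.Poisson()).fit()
    negb = smf.glm("count ~ C(education) + C(race)", data=df,
                   family=sm.families.NegativeBinomial()).fit()
    print(pois.aic, negb.aic)  # a clearly lower NB AIC points to overdispersion
    ```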

  31. Improved Dietary Guidelines for Vitamin D: Application of Individual Participant Data (IPD)-Level Meta-Regression Analyses

    Science.gov (United States)

    Cashman, Kevin D.; Ritz, Christian; Kiely, Mairead

    2017-01-01

    Dietary Reference Values (DRVs) for vitamin D have a key role in the prevention of vitamin D deficiency. However, despite adopting similar risk assessment protocols, estimates from authoritative agencies over the last 6 years have been diverse. This may have arisen from diverse approaches to data analysis. Modelling strategies for pooling of individual subject data from cognate vitamin D randomized controlled trials (RCTs) are likely to provide the most appropriate DRV estimates. Thus, the objective of the present work was to undertake the first-ever individual participant data (IPD)-level meta-regression, which is increasingly recognized as best practice, from seven winter-based RCTs (with 882 participants ranging in age from 4 to 90 years) of the vitamin D intake–serum 25-hydroxyvitamin D (25(OH)D) dose-response. Our IPD-derived estimates of vitamin D intakes required to maintain 97.5% of 25(OH)D concentrations >25, 30, and 50 nmol/L across the population are 10, 13, and 26 µg/day, respectively. In contrast, standard meta-regression analyses with aggregate data (as used by several agencies in recent years) from the same RCTs estimated that a vitamin D intake requirement of 14 µg/day would maintain 97.5% of 25(OH)D >50 nmol/L. These first IPD-derived estimates offer improved dietary recommendations for vitamin D because the underpinning modeling captures the between-person variability in response of serum 25(OH)D to vitamin D intake. PMID:28481259

  32. Correlation and regression analyses of genetic effects for different types of cells in mammals under radiation and chemical treatment

    International Nuclear Information System (INIS)

    Slutskaya, N.G.; Mosseh, I.B.

    2006-01-01

    Data about genetic mutations under radiation and chemical treatment for different types of cells have been analyzed with correlation and regression analyses. Linear correlations between different genetic effects in sex cells and somatic cells have been found. The results may be extrapolated to the sex cells of humans and other mammals. (authors)

  33. Correcting for multivariate measurement error by regression calibration in meta-analyses of epidemiological studies

    DEFF Research Database (Denmark)

    Tybjærg-Hansen, Anne

    2009-01-01

    Within-person variability in measured values of multiple risk factors can bias their associations with disease. The multivariate regression calibration (RC) approach can correct for such measurement error and has been applied to studies in which true values or independent repeat measurements of the risk factors are observed on a subsample. We extend the multivariate RC techniques to a meta-analysis framework where multiple studies provide independent repeat measurements and information on disease outcome. We consider the cases where some or all studies have repeat measurements, and compare study-specific, averaged and empirical Bayes estimates of RC parameters. Additionally, we allow for binary covariates (e.g. smoking status) and for uncertainty and time trends in the measurement error corrections. Our methods are illustrated using a subset of individual participant data from prospective long-term studies...

  34. Alternative regression models to assess increase in childhood BMI

    Directory of Open Access Journals (Sweden)

    Mansmann Ulrich

    2008-09-01

    Background: Body mass index (BMI) data usually have skewed distributions, for which common statistical modeling approaches such as simple linear or logistic regression have limitations. Methods: Different regression approaches to predict childhood BMI by goodness-of-fit measures and means of interpretation were compared, including generalized linear models (GLMs), quantile regression and Generalized Additive Models for Location, Scale and Shape (GAMLSS). We analyzed data of 4967 children participating in the school entry health examination in Bavaria, Germany, from 2001 to 2002. TV watching, meal frequency, breastfeeding, smoking in pregnancy, maternal obesity, parental social class and weight gain in the first 2 years of life were considered as risk factors for obesity. Results: GAMLSS showed a much better fit regarding the estimation of risk factor effects on transformed and untransformed BMI data than common GLMs with respect to the generalized Akaike information criterion. In comparison with GAMLSS, quantile regression allowed for additional interpretation of prespecified distribution quantiles, such as quantiles referring to overweight or obesity. The variables TV watching, maternal BMI and weight gain in the first 2 years were directly, and meal frequency was inversely, significantly associated with body composition in any model type examined. In contrast, smoking in pregnancy was not directly, and breastfeeding and parental social class were not inversely, significantly associated with body composition in GLM models, but were in GAMLSS and partly in quantile regression models. Risk-factor-specific BMI percentile curves could be estimated from GAMLSS and quantile regression models. Conclusion: GAMLSS and quantile regression seem to be more appropriate than common GLMs for risk factor modeling of BMI data.

  35. Alternative regression models to assess increase in childhood BMI.

    Science.gov (United States)

    Beyerlein, Andreas; Fahrmeir, Ludwig; Mansmann, Ulrich; Toschke, André M

    2008-09-08

    Body mass index (BMI) data usually have skewed distributions, for which common statistical modeling approaches such as simple linear or logistic regression have limitations. Different regression approaches to predict childhood BMI by goodness-of-fit measures and means of interpretation were compared, including generalized linear models (GLMs), quantile regression and Generalized Additive Models for Location, Scale and Shape (GAMLSS). We analyzed data of 4967 children participating in the school entry health examination in Bavaria, Germany, from 2001 to 2002. TV watching, meal frequency, breastfeeding, smoking in pregnancy, maternal obesity, parental social class and weight gain in the first 2 years of life were considered as risk factors for obesity. GAMLSS showed a much better fit regarding the estimation of risk factor effects on transformed and untransformed BMI data than common GLMs with respect to the generalized Akaike information criterion. In comparison with GAMLSS, quantile regression allowed for additional interpretation of prespecified distribution quantiles, such as quantiles referring to overweight or obesity. The variables TV watching, maternal BMI and weight gain in the first 2 years were directly, and meal frequency was inversely, significantly associated with body composition in any model type examined. In contrast, smoking in pregnancy was not directly, and breastfeeding and parental social class were not inversely, significantly associated with body composition in GLM models, but were in GAMLSS and partly in quantile regression models. Risk-factor-specific BMI percentile curves could be estimated from GAMLSS and quantile regression models. GAMLSS and quantile regression seem to be more appropriate than common GLMs for risk factor modeling of BMI data.
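
    Of the approaches compared, quantile regression is the most compact to illustrate: it models a chosen BMI percentile directly rather than the mean, as in the sketch below (DataFrame and column names are assumptions).

    ```python
    # Quantile regression for the 90th BMI percentile (near the overweight range).
    import statsmodels.formula.api as smf

    mod = smf.quantreg("bmi ~ tv_hours + maternal_bmi + meal_frequency", data=df)
    res = mod.fit(q=0.90)
    print(res.params)   # covariate effects on the 90th percentile, not the mean
    ```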

  36. Assessment of deforestation using regression; Hodnotenie odlesnenia s vyuzitim regresie

    Energy Technology Data Exchange (ETDEWEB)

    Juristova, J. [Univerzita Komenskeho, Prirodovedecka fakulta, Katedra kartografie, geoinformatiky a DPZ, 84215 Bratislava (Slovakia)

    2013-04-16

    This work is devoted to the evaluation of deforestation using regression methods in the Idrisi Taiga software. Deforestation is evaluated by the method of logistic regression. The dependent variable has discrete values '0' and '1', indicating whether or not deforestation occurred. The independent variables have continuous values, expressing the distance of deforested areas from the forest edge, urban areas, rivers and the road network. The results were also used to predict the probability of deforestation in subsequent periods. The result is a map showing the predicted probability of deforestation for the periods 1990/2000 and 2000/2006 in accordance with the predetermined coefficients (values of the independent variables). (authors)

  37. Linear regression models for quantitative assessment of left ...

    African Journals Online (AJOL)

    Changes in left ventricular structures and function have been reported in cardiomyopathies. No prediction models have been established in this environment. This study established regression models for prediction of left ventricular structures in normal subjects. A sample of normal subjects was drawn from a large urban ...

  38. Correlation, Regression and Path Analyses of Seed Yield Components in Crambe abyssinica, a Promising Industrial Oil Crop

    OpenAIRE

    Huang, Banglian; Yang, Yiming; Luo, Tingting; Wu, S.; Du, Xuezhu; Cai, Detian; van Loo, E.N.; Huang, Bangquan

    2013-01-01

    In the present study, correlation, regression and path analyses were carried out to determine the correlations among agronomic traits and their contributions to seed yield per plant in Crambe abyssinica. Partial correlation analysis indicated that plant height (X1) was significantly correlated with branching height and the number of first branches (P<0.01); branching height (X2) was significantly correlated with pod number of the primary inflorescence (P<0.01) and the number of secondary branches...

  39. Analyses of non-fatal accidents in an opencast mine by logistic regression model - a case study.

    Science.gov (United States)

    Onder, Seyhan; Mutlu, Mert

    2017-09-01

    Accidents cause major damage to both workers and enterprises in the mining industry. To reduce the number of occupational accidents, these incidents should be properly registered and carefully analysed. This study examines occupational accident records from the opencast coal mines of the Aegean Lignite Enterprise (ELI) of Turkish Coal Enterprises (TKI) in Soma between 2006 and 2011. A total of 231 occupational accidents were analysed for this study. The accident records were categorized into seven groups: area, reason, occupation, part of body, age, shift hour and lost days. The SPSS package program was used for logistic regression analyses, which predicted the probability of accidents resulting in greater or fewer than 3 lost workdays for non-fatal injuries. Social facilities (area of surface installations), workshops and opencast mining areas are the areas with the highest probability of accidents with greater than 3 lost workdays for non-fatal injuries, while the reasons with the highest probability for these types of accidents are transporting and manual handling. Additionally, the model was tested on accidents reported in 2012 for the ELI in Soma and correctly estimated the probability of accidents with lost workdays in 70% of cases.

  40. A Quality Assessment Tool for Non-Specialist Users of Regression Analysis

    Science.gov (United States)

    Argyrous, George

    2015-01-01

    This paper illustrates the use of a quality assessment tool for regression analysis. It is designed for non-specialist "consumers" of evidence, such as policy makers. The tool provides a series of questions such consumers of evidence can ask to interrogate regression analysis, and is illustrated with reference to a recent study published…

  41. An Entropy-Based Measure for Assessing Fuzziness in Logistic Regression

    Science.gov (United States)

    Weiss, Brandi A.; Dardick, William

    2016-01-01

    This article introduces an entropy-based measure of data-model fit that can be used to assess the quality of logistic regression models. Entropy has previously been used in mixture-modeling to quantify how well individuals are classified into latent classes. The current study proposes the use of entropy for logistic regression models to quantify…
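
    One simple instance of such a measure is the average Shannon entropy of a model's predicted probabilities: 0 when every case is classified with certainty, 1 when every prediction is 0.5. The sketch below illustrates this general idea rather than the authors' exact estimator.

    ```python
    # Mean binary entropy of predicted probabilities from a logistic model.
    import numpy as np

    def mean_entropy(p_hat, eps=1e-12):
        p = np.clip(np.asarray(p_hat, float), eps, 1 - eps)   # guard log(0)
        return float(np.mean(-(p * np.log2(p) + (1 - p) * np.log2(1 - p))))

    print(mean_entropy([0.95, 0.02, 0.90]))  # ~0.30; lower = more confident model
    ```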

  42. Longitudinal changes in telomere length and associated genetic parameters in dairy cattle analysed using random regression models.

    Directory of Open Access Journals (Sweden)

    Luise A Seeker

    Telomeres cap the ends of linear chromosomes and shorten with age in many organisms. In humans short telomeres have been linked to morbidity and mortality. With the accumulation of longitudinal datasets the focus shifts from investigating telomere length (TL) to exploring TL change within individuals over time. Some studies indicate that the speed of telomere attrition is predictive of future disease. The objectives of the present study were to (1) characterize the change in bovine relative leukocyte TL (RLTL) across the lifetime in Holstein Friesian dairy cattle, (2) estimate genetic parameters of RLTL over time and (3) investigate the association of differences in individual RLTL profiles with productive lifespan. RLTL measurements were analysed using Legendre polynomials in a random regression model to describe TL profiles and genetic variance over age. The analyses were based on 1,328 repeated RLTL measurements of 308 female Holstein Friesian dairy cattle. A quadratic Legendre polynomial was fitted to the fixed effect of age in months and to the random effect of the animal identity. Changes in RLTL, heritability and within-trait genetic correlation along the age trajectory were calculated and illustrated. At a population level, the relationship between RLTL and age was described by a positive quadratic function. Individuals varied significantly regarding the direction and amount of RLTL change over life. The heritability of RLTL ranged from 0.36 to 0.47 (SE = 0.05-0.08) and remained statistically unchanged over time. The genetic correlation of RLTL at birth with measurements later in life decreased with the time interval between samplings from near unity to 0.69, indicating that TL later in life might be regulated by different genes than TL early in life. Even though animals differed in their RLTL profiles significantly, those differences were not correlated with productive lifespan (p = 0.954).

  43. Methodological Quality Assessment of Meta-analyses in Endodontics.

    Science.gov (United States)

    Kattan, Sereen; Lee, Su-Min; Kohli, Meetu R; Setzer, Frank C; Karabucak, Bekir

    2018-01-01

    The objectives of this review were to assess the methodological quality of published meta-analyses related to endodontics using the assessment of multiple systematic reviews (AMSTAR) tool and to provide a follow-up to previously published reviews. Three electronic databases were searched for eligible studies according to the inclusion and exclusion criteria: Embase via Ovid, The Cochrane Library, and Scopus. The electronic search was amended by a hand search of 6 dental journals (International Endodontic Journal; Journal of Endodontics; Australian Endodontic Journal; Oral Surgery, Oral Medicine, Oral Pathology, Oral Radiology; Endodontics and Dental Traumatology; and Journal of Dental Research). The searches were conducted to include articles published after July 2009, and the deadline for inclusion of the meta-analyses was November 30, 2016. The AMSTAR assessment tool was used to evaluate the methodological quality of all included studies. A total of 36 reports of meta-analyses were included. The overall quality of the meta-analyses reports was found to be medium, with an estimated mean overall AMSTAR score of 7.25 (95% confidence interval, 6.59-7.90). The most poorly assessed areas were providing an a priori design, the assessment of the status of publication, and publication bias. In recent publications in the field of endodontics, the overall quality of the reported meta-analyses is medium according to AMSTAR. Copyright © 2017 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  44. SPECIFICS OF THE APPLICATIONS OF MULTIPLE REGRESSION MODEL IN THE ANALYSES OF THE EFFECTS OF GLOBAL FINANCIAL CRISES

    Directory of Open Access Journals (Sweden)

    Željko V. Račić

    2010-12-01

    This paper aims to present the specifics of the application of a multiple linear regression model. The economic (financial) crisis is analyzed in terms of gross domestic product, which is a function of the foreign trade balance (on one hand) and of credit cards, i.e. indebtedness of the population on this basis (on the other hand), in the USA (from 1999 to 2008). We used the extended application model, which shows how the analyst should run the whole development process of a regression model. This process began with simple statistical features and the application of regression procedures, and ended with residual analysis, intended for the study of the compatibility of the data and the model settings. This paper also analyzes the values of some standard statistics used in the selection of an appropriate regression model. Testing of the model was carried out with the Statistics PASW 17 program.

  45. Issues in weighting bioassay data for use in regressions for internal dose assessments

    International Nuclear Information System (INIS)

    Strom, D.J.

    1992-11-01

    For the use of bioassay data in internal dose assessment, research should be done to clarify the goal desired, the choice of method to achieve the goal, the selection of adjustable parameters, and the ensemble of information that is available. Understanding of these issues should determine the choice of weighting factors for bioassay data used in regression models. This paper provides an assessment of the relative importance of the various factors.

  46. OPLS statistical model versus linear regression to assess sonographic predictors of stroke prognosis.

    Science.gov (United States)

    Vajargah, Kianoush Fathi; Sadeghi-Bazargani, Homayoun; Mehdizadeh-Esfanjani, Robab; Savadi-Oskouei, Daryoush; Farhoudi, Mehdi

    2012-01-01

    The objective of the present study was to assess the applicability of the orthogonal projections to latent structures (OPLS) statistical model versus traditional linear regression in investigating the role of transcranial Doppler (TCD) sonography in predicting ischemic stroke prognosis. The study was conducted on 116 ischemic stroke patients admitted to a specialty neurology ward. The Unified Neurological Stroke Scale was used once for clinical evaluation in the first week of admission and again six months later. All data were first analyzed using simple linear regression and later considered for multivariate analysis using PLS/OPLS models through the SIMCA P+12 statistical software package. The linear regression analysis results used for the identification of TCD predictors of stroke prognosis were confirmed through the OPLS modelling technique. Moreover, in comparison to linear regression, the OPLS model appeared to have higher sensitivity in detecting the predictors of ischemic stroke prognosis and detected several more predictors. Applying the OPLS model made it possible to use both single TCD measures/indicators and arbitrarily dichotomized measures of TCD single-vessel involvement, as well as the overall TCD result. In conclusion, the authors recommend PLS/OPLS methods as complementary rather than alternative to the available classical regression models such as linear regression.

  47. Binary logistic regression - an instrument for assessing museum indoor air impact on exhibits.

    Science.gov (United States)

    Bucur, Elena; Danet, Andrei Florin; Lehr, Carol Blaziu; Lehr, Elena; Nita-Lazar, Mihai

    2017-04-01

    This paper presents a new way to assess the environmental impact on historical artifacts using binary logistic regression. The impact on the exhibits under given pollution scenarios (environmental impact) was predicted by a mathematical model based on binary logistic regression; the model identifies those environmental parameters, from a multitude of possible parameters, with a significant impact on the exhibits, and ranks them according to the severity of their effect. Air quality (NO2, SO2, O3 and PM2.5) and microclimate parameters (temperature, humidity) monitoring data from a case study conducted within the exhibition and storage spaces of the Romanian National Aviation Museum Bucharest were used for developing and validating the binary logistic regression method and the mathematical model. The logistic regression analysis was applied to 794 data combinations (715 to develop the model and 79 to validate it) using the Statistical Package for Social Sciences (SPSS 20.0). The results from the binary logistic regression analysis demonstrated that, of the six parameters taken into consideration, four have a significant effect upon the exhibits, in the following order: O3 > PM2.5 > NO2 > humidity, followed at a significant distance by the effects of SO2 and temperature. The mathematical model correctly predicted 95.1% of the cumulated effect of the environmental parameters upon the exhibits. Moreover, this model could also be used in the decision-making process regarding the preventive preservation measures that should be implemented within the exhibition space, establishing the best measures for pollution reduction and preventive preservation.

  48. Using synthetic data to evaluate multiple regression and principal component analyses for statistical modeling of daily building energy consumption

    Energy Technology Data Exchange (ETDEWEB)

    Reddy, T.A. (Energy Systems Lab., Texas A and M Univ., College Station, TX (United States)); Claridge, D.E. (Energy Systems Lab., Texas A and M Univ., College Station, TX (United States))

    1994-01-01

    Multiple regression modeling of monitored building energy use data is often faulted as a reliable means of predicting energy use on the grounds that multicollinearity between the regressor variables can lead both to improper interpretation of the relative importance of the various physical regressor parameters and to a model with unstable regressor coefficients. Principal component analysis (PCA) has the potential to overcome such drawbacks. While a few case studies have already attempted to apply this technique to building energy data, the objectives of this study were to make a broader evaluation of PCA and multiple regression analysis (MRA) and to establish guidelines under which one approach is preferable to the other. Four geographic locations in the US with different climatic conditions were selected, and synthetic data sequences representative of daily energy use in large institutional buildings were generated in each location using a linear model with outdoor temperature, outdoor specific humidity and solar radiation as the three regression variables. The MRA and PCA approaches were then applied to these data sets and their relative performances were compared. Conditions under which PCA seems to perform better than MRA were identified, and preliminary recommendations on the use of either modeling approach were formulated. (orig.)
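
    In outline, the comparison looks like the sketch below: ordinary multiple regression on collinear weather variables versus regression on their leading principal components, with synthetic data standing in for the paper's simulated building-energy sequences (all coefficients and names are illustrative assumptions).

    ```python
    # MRA vs principal-component regression on deliberately collinear regressors.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LinearRegression
    from sklearn.pipeline import make_pipeline

    rng = np.random.default_rng(0)
    temp = rng.normal(25, 5, 365)
    humid = 0.8 * temp + rng.normal(0, 1, 365)    # strongly tied to temperature
    solar = rng.normal(500, 100, 365)
    energy = 10 + 2 * temp + 0.5 * humid + 0.01 * solar + rng.normal(0, 2, 365)

    X = np.column_stack([temp, humid, solar])
    mra = LinearRegression().fit(X, energy)       # coefficients may be unstable
    pcr = make_pipeline(PCA(n_components=2), LinearRegression()).fit(X, energy)
    print(mra.coef_, pcr.score(X, energy))
    ```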

  49. Sensitivity and uncertainty analyses for performance assessment modeling

    International Nuclear Information System (INIS)

    Doctor, P.G.

    1988-08-01

    Sensitivity and uncertainty analyses methods for computer models are being applied in performance assessment modeling in the geologic high level radioactive waste repository program. The models used in performance assessment tend to be complex physical/chemical models with large numbers of input variables. There are two basic approaches to sensitivity and uncertainty analyses: deterministic and statistical. The deterministic approach to sensitivity analysis involves numerical calculation or employs the adjoint form of a partial differential equation to compute partial derivatives; the uncertainty analysis is based on Taylor series expansions of the input variables propagated through the model to compute means and variances of the output variable. The statistical approach to sensitivity analysis involves a response surface approximation to the model with the sensitivity coefficients calculated from the response surface parameters; the uncertainty analysis is based on simulation. The methods each have strengths and weaknesses. 44 refs

  50. Exploring reasons for the observed inconsistent trial reports on intra-articular injections with hyaluronic acid in the treatment of osteoarthritis: Meta-regression analyses of randomized trials.

    Science.gov (United States)

    Johansen, Mette; Bahrt, Henriette; Altman, Roy D; Bartels, Else M; Juhl, Carsten B; Bliddal, Henning; Lund, Hans; Christensen, Robin

    2016-08-01

    The aim was to identify factors explaining inconsistent observations concerning the efficacy of intra-articular hyaluronic acid compared to intra-articular sham/control, or non-intervention control, in patients with symptomatic osteoarthritis, based on randomized clinical trials (RCTs). A systematic review and meta-regression analyses of available randomized trials were conducted. The outcome, pain, was assessed according to a pre-specified hierarchy of potentially available outcomes. Hedges's standardized mean difference [SMD (95% CI)] served as effect size. REstricted Maximum Likelihood (REML) mixed-effects models were used to combine study results, and heterogeneity was calculated and interpreted as Tau-squared and I-squared, respectively. Overall, 99 studies (14,804 patients) met the inclusion criteria: of these, only 71 studies (72%), including 85 comparisons (11,216 patients), had adequate data available for inclusion in the primary meta-analysis. Overall, compared with placebo, intra-articular hyaluronic acid reduced pain with an effect size of -0.39 [-0.47 to -0.31; P < 0.001] in favour of hyaluronic acid. Based on available trial data, intra-articular hyaluronic acid showed a better effect than intra-articular saline on pain reduction in osteoarthritis. Publication bias and the risk of selective outcome reporting suggest only a small clinical effect compared to saline. Copyright © 2016 Elsevier Inc. All rights reserved.

  11. Cross-validation pitfalls when selecting and assessing regression and classification models.

    Science.gov (United States)

    Krstajic, Damjan; Buturovic, Ljubomir J; Leahy, David E; Thomas, Simon

    2014-03-29

    We address the problem of selecting and assessing classification and regression models using cross-validation. Current state-of-the-art methods can yield models with high variance, rendering them unsuitable for a number of practical applications including QSAR. In this paper we describe and evaluate best practices which improve reliability and increase confidence in selected models. A key operational component of the proposed methods is cloud computing which enables routine use of previously infeasible approaches. We describe in detail an algorithm for repeated grid-search V-fold cross-validation for parameter tuning in classification and regression, and we define a repeated nested cross-validation algorithm for model assessment. As regards variable selection and parameter tuning we define two algorithms (repeated grid-search cross-validation and double cross-validation), and provide arguments for using the repeated grid-search in the general case. We show results of our algorithms on seven QSAR datasets. The variation of the prediction performance, which is the result of choosing different splits of the dataset in V-fold cross-validation, needs to be taken into account when selecting and assessing classification and regression models. We demonstrate the importance of repeating cross-validation when selecting an optimal model, as well as the importance of repeating nested cross-validation when assessing a prediction error.
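
    The overall shape of the proposed assessment procedure can be sketched with scikit-learn; the estimator, parameter grid and repeat counts below are placeholders, not the settings used in the paper:

```python
# Sketch: repeated grid-search CV (tuning) nested inside repeated CV (assessment).
from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge
from sklearn.model_selection import GridSearchCV, RepeatedKFold, cross_val_score

X, y = make_regression(n_samples=200, n_features=30, noise=10, random_state=0)

inner = RepeatedKFold(n_splits=5, n_repeats=10, random_state=1)  # parameter tuning
outer = RepeatedKFold(n_splits=5, n_repeats=10, random_state=2)  # error assessment

search = GridSearchCV(Ridge(), {"alpha": [0.01, 0.1, 1.0, 10.0]}, cv=inner)
scores = cross_val_score(search, X, y, cv=outer)   # repeated nested cross-validation
print(scores.mean(), scores.std())
```

    The standard deviation across outer repeats is exactly the split-dependent variation the authors argue must be taken into account when selecting and assessing models.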

  12. The N400 as a snapshot of interactive processing: evidence from regression analyses of orthographic neighbor and lexical associate effects

    Science.gov (United States)

    Laszlo, Sarah; Federmeier, Kara D.

    2010-01-01

    Linking print with meaning tends to be divided into subprocesses, such as recognition of an input's lexical entry and subsequent access of semantics. However, recent results suggest that the set of semantic features activated by an input is broader than implied by a view wherein access serially follows recognition. EEG was collected from participants who viewed items varying in number and frequency of both orthographic neighbors and lexical associates. Regression analysis of single item ERPs replicated past findings, showing that N400 amplitudes are greater for items with more neighbors, and further revealed that N400 amplitudes increase for items with more lexical associates and with higher frequency neighbors or associates. Together, the data suggest that in the N400 time window semantic features of items broadly related to inputs are active, consistent with models in which semantic access takes place in parallel with stimulus recognition. PMID:20624252

  13. Association between biomarkers and clinical characteristics in chronic subdural hematoma patients assessed with lasso regression.

    Directory of Open Access Journals (Sweden)

    Are Hugo Pripp

    Full Text Available Chronic subdural hematoma (CSDH) is characterized by an "old" encapsulated collection of blood and blood breakdown products between the brain and its outermost covering (the dura). Recognized risk factors for development of CSDH are head injury, old age and using anticoagulation medication, but its underlying pathophysiological processes are still unclear. It is assumed that a complex local process of interrelated mechanisms including inflammation, neomembrane formation, angiogenesis and fibrinolysis could be related to its development and propagation. However, the association between the biomarkers of inflammation and angiogenesis, and the clinical and radiological characteristics of CSDH patients, needs further investigation. The high number of biomarkers compared to the number of observations, the correlation between biomarkers, missing data and skewed distributions may limit the usefulness of classical statistical methods. We therefore explored lasso regression to assess the association between 30 biomarkers of inflammation and angiogenesis at the site of lesions, and selected clinical and radiological characteristics in a cohort of 93 patients. Lasso regression performs both variable selection and regularization to improve the predictive accuracy and interpretability of the statistical model. The lasso regression analysis exhibited a lack of robust statistical association between the biomarkers in hematoma fluid and age, gender, brain infarct, neurological deficiencies and volume of hematoma. However, there were associations between several of the biomarkers and postoperative recurrence requiring reoperation. The statistical analysis with lasso regression supported previous findings that the immunological characteristics of CSDH are local. The relationships between biomarkers, the radiological appearance of lesions and recurrence requiring reoperation had been inconclusive when classical statistical methods were used on these data.
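
    As a generic illustration of how the lasso handles this kind of problem (30 candidate biomarkers against 93 observations), the scikit-learn sketch below performs cross-validated selection on synthetic data; it is not the study's analysis:

```python
# Sketch: lasso variable selection with many predictors and few observations.
import numpy as np
from sklearn.linear_model import LassoCV
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(93, 30))                             # 30 "biomarkers", 93 "patients"
y = 2.0 * X[:, 0] - 1.5 * X[:, 4] + rng.normal(0, 1, 93)  # two true signals

Xs = StandardScaler().fit_transform(X)                    # lasso needs comparable scales
lasso = LassoCV(cv=5, random_state=0).fit(Xs, y)
print("chosen penalty:", lasso.alpha_)
print("selected predictors:", np.flatnonzero(lasso.coef_))  # most coefficients shrink to 0
```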

  14. Modeling the potential risk factors of bovine viral diarrhea prevalence in Egypt using univariable and multivariable logistic regression analyses

    Directory of Open Access Journals (Sweden)

    Abdelfattah M. Selim

    2018-03-01

    Full Text Available Aim: The present cross-sectional study was conducted to determine the seroprevalence and potential risk factors associated with Bovine viral diarrhea virus (BVDV) disease in cattle and buffaloes in Egypt, to model the potential risk factors associated with the disease using logistic regression (LR) models, and to fit the best predictive model for the current data. Materials and Methods: A total of 740 blood samples were collected between November 2012 and March 2013 from animals aged between 6 months and 3 years. The potential risk factors studied were species, age, sex, and herd location. All serum samples were examined with an indirect ELISA test for antibody detection. Data were analyzed with different statistical approaches such as the Chi-square test, odds ratios (OR), and univariable and multivariable LR models. Results: Results revealed a non-significant association between being seropositive with BVDV and all risk factors, except for species of animal. Seroprevalence percentages were 40% and 23% for cattle and buffaloes, respectively. OR for all categories were close to one, with the highest OR for cattle relative to buffaloes, which was 2.237. Likelihood ratio tests showed a significant drop of the -2LL from univariable LR to multivariable LR models. Conclusion: There was evidence of high seroprevalence of BVDV among cattle as compared with buffaloes, with the possibility of infection in different age groups of animals. In addition, the multivariable LR model proved to provide more information for association and prediction purposes relative to univariable LR models and Chi-square tests if we have more than one predictor.
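
    A minimal statsmodels sketch of the multivariable LR step, with odds ratios obtained by exponentiating the fitted coefficients (the data frame and effect sizes below are fabricated for illustration, not the Egyptian survey data):

```python
# Sketch: multivariable logistic regression with odds ratios and 95% CIs.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 740
df = pd.DataFrame({"species": rng.integers(0, 2, n),   # 1 = cattle, 0 = buffalo
                   "age": rng.integers(0, 3, n),
                   "sex": rng.integers(0, 2, n)})
p = 1 / (1 + np.exp(-(-1.2 + 0.8 * df["species"])))    # only species matters here
df["seropos"] = (rng.random(n) < p).astype(float)

X = sm.add_constant(df[["species", "age", "sex"]].astype(float))
fit = sm.Logit(df["seropos"], X).fit(disp=0)
print(np.exp(fit.params))      # odds ratios
print(np.exp(fit.conf_int()))  # 95% confidence intervals
```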

  15. Structural vascular disease in Africans: performance of ethnic-specific waist circumference cut points using logistic regression and neural network analyses: the SABPA study

    OpenAIRE

    Botha, J.; De Ridder, J.H.; Potgieter, J.C.; Steyn, H.S.; Malan, L.

    2013-01-01

    A recently proposed model for waist circumference cut points (RPWC), driven by increased blood pressure, was demonstrated in an African population. We therefore aimed to validate the RPWC by comparing the RPWC and the Joint Statement Consensus (JSC) models via Logistic Regression (LR) and Neural Networks (NN) analyses. Urban African gender groups (N=171) were stratified according to the JSC and RPWC cut point models. Ultrasound carotid intima media thickness (CIMT), blood pressure (BP) and fa...

  16. Tests of Alignment among Assessment, Standards, and Instruction Using Generalized Linear Model Regression

    Science.gov (United States)

    Fulmer, Gavin W.; Polikoff, Morgan S.

    2014-01-01

    An essential component in school accountability efforts is for assessments to be well-aligned with the standards or curriculum they are intended to measure. However, relatively little prior research has explored methods to determine statistical significance of alignment or misalignment. This study explores analyses of alignment as a special case…

  17. Quality assessment of published health economic analyses from South America.

    Science.gov (United States)

    Machado, Márcio; Iskedjian, Michael; Einarson, Thomas R

    2006-05-01

    Health economic analyses have become important to healthcare systems worldwide. No studies have previously examined South America's contribution in this area. To survey the literature with the purpose of reviewing, quantifying, and assessing the quality of published South American health economic analyses. A search of MEDLINE (1990-December 2004), EMBASE (1990-December 2004), International Pharmaceutical Abstracts (1990-December 2004), Literatura Latino-Americana e do Caribe em Ciências da Saúde (1982-December 2004), and Sistema de Informacion Esencial en Terapéutica y Salud (1980-December 2004) was completed using the key words cost-effectiveness analysis (CEA), cost-utility analysis (CUA), cost-minimization analysis (CMA), and cost-benefit analysis (CBA); abbreviations CEA, CUA, CMA, and CBA; and all South American country names. Papers were categorized by type and country by 2 independent reviewers. Quality was assessed using a 12-item checklist, characterizing scores as 4 (good), 3 (acceptable), 2 (poor), 1 (unable to judge), and 0 (unacceptable). To be included in our investigation, studies needed to have simultaneously examined costs and outcomes. We retrieved 25 articles; one duplicate article was rejected, leaving 24 (CEA = 15, CBA = 6, CMA = 3; Brazil = 9, Argentina = 5, Colombia = 3, Chile = 2, Ecuador = 2, 1 each from Peru, Uruguay, Venezuela). Variability between raters was less than 0.5 point on overall scores (OS) and less than 1 point on all individual items. Mean OS was 2.6 (SD 1.0, range 1.4-3.8). CBAs scored highest (OS 2.8, SD 0.8), CEAs next (OS 2.7, SD 0.7), and CMAs lowest (OS 2.0, SD 0.5). When scored by type of question, definition of study aim scored highest (OS 3.0, SD 0.8), while ethical issues scored lowest (OS 1.5, SD 0.9). By country, Peru scored highest (mean OS 3.8) and Uruguay had the lowest scores (mean OS 2.2). A nonsignificant time trend was noted for OS (R2 = 0.12; p = 0.104). Quality scores of health economic analyses

  18. Activation analyses updating the ITER radioactive waste assessment

    International Nuclear Information System (INIS)

    Pampin, R.; Zheng, S.; Lilley, S.; Na, B.C.; Loughlin, M.J.; Taylor, N.P.

    2012-01-01

    Highlights: ► Comprehensive update of the ITER radwaste assessment. ► Latest coupled neutronics and activation methods. ► Type A waste at shutdown decays to TFA within 100 years. ► Most type B waste at shutdown is still type B after 100 years. - Abstract: A study is reported which computes the radiation transport and activation response throughout the ITER machine and updates the ITER radioactive waste assessment using modern 3D models and up-to-date methods. The latest information on component design, maintenance, replacement schedules and materials is adopted. The radwaste classification is revised for all the major components of ITER, as well as several representative port plugs. Results include categorisation snapshots at different decay times, time histories of radiological quantities throughout the machine, and guidelines on interim decay times for components. All plasma-facing materials except tungsten are found to classify as type B due to the transmutation of their main constituents. Major contributors to the IRAS index of all materials are reported. Elemental concentration limits for type A classification of first wall and divertor materials are obtained; for the steels, only a reduction in service lifetime can reduce the waste class. Comparison of total waste amounts with earlier assessments is limited by the fact that analyses of some components are still preliminary; the trend, however, indicates a potential reduction in the total amount of waste if component segregation is demonstrated.

  19. Hybrid data mining-regression for infrastructure risk assessment based on zero-inflated data

    International Nuclear Information System (INIS)

    Guikema, S.D.; Quiring, S.M.

    2012-01-01

    Infrastructure disaster risk assessment seeks to estimate the probability of a given customer or area losing service during a disaster, sometimes in conjunction with estimating the duration of each outage. This is often done on the basis of past data about the effects of similar events impacting the same or similar systems. In many situations this past performance data from infrastructure systems is zero-inflated; it has more zeros than can be appropriately modeled with standard probability distributions. The data are also often non-linear and exhibit threshold effects due to the complexities of infrastructure system performance. Standard zero-inflated statistical models such as zero-inflated Poisson and zero-inflated negative binomial regression models do not adequately capture these complexities. In this paper we develop a novel method that is a hybrid classification tree/regression method for complex, zero-inflated data sets. We investigate its predictive accuracy based on a large number of simulated data sets and then demonstrate its practical usefulness with an application to hurricane power outage risk assessment for a large utility based on actual data from the utility. While formulated for infrastructure disaster risk assessment, this method is promising for data-driven analysis for other situations with zero-inflated, complex data exhibiting response thresholds.
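
    The hybrid idea can be approximated by a two-stage sketch: a classifier first predicts whether any outage occurs, and a regressor trained on the positive cases predicts its magnitude. This is an illustration in the spirit of the method, with scikit-learn ensembles standing in for the authors' tree/regression hybrid and fully synthetic data:

```python
# Sketch: two-stage model for zero-inflated outcomes (occurrence x magnitude).
import numpy as np
from sklearn.ensemble import RandomForestClassifier, RandomForestRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 5))                  # e.g. wind speed, soil, asset density
is_zero = rng.random(2000) < 0.7                # zero-inflation: 70% of cases no outage
y = np.where(is_zero, 0.0,
             np.exp(1 + X[:, 0] + rng.normal(0, 0.3, 2000)))

clf = RandomForestClassifier(random_state=0).fit(X, y > 0)           # stage 1: any outage?
reg = RandomForestRegressor(random_state=0).fit(X[y > 0], y[y > 0])  # stage 2: how severe?
expected = clf.predict_proba(X)[:, 1] * reg.predict(X)  # P(outage) x expected magnitude
print(expected[:5])
```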

  20. Assessment of diagnostic value of tumor markers for colorectal neoplasm by logistic regression and ROC curve

    International Nuclear Information System (INIS)

    Ping, G.

    2007-01-01

    Full text: Objective: To assess the diagnostic value of CEA, CA199 and CA50 for colorectal neoplasm by logistic regression and ROC curve. Methods: The subjects include 75 patients with colorectal cancer, 35 patients with benign intestinal disease and 49 healthy controls. CEA, CA199 and CA50 are measured by CLIA, ECLIA and IRMA, respectively. The area under the curve (AUC) of CEA, CA199, CA50 and the logistic regression results are compared. Results: In the cancer-benign group, the AUC of CA50 is larger than the AUC of CA199. Compared with the AUC of the combination of CEA, CA199 and CA50 (0.604), the AUC of the combination of CEA and CA50 (0.875) is larger, and it is also larger than the AUC of CEA, CA199 or CA50 alone. In the cancer-health group, the AUC of the combination of CEA, CA199 and CA50 is larger than the AUC of CEA, CA199 or CA50 alone. In both the cancer-benign and cancer-health groups, the AUC of CEA is larger than the AUC of CA199 or CA50. Conclusion: CEA is useful in the diagnosis of colorectal cancer. In the process of differential diagnosis, the combination of CEA and CA50 can give more information, while the combination of three tumor markers does not perform well. Furthermore, as a statistical method, logistic regression can improve the diagnostic sensitivity and specificity. (author)
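
    The marker-combination logic generalizes easily: fit a logistic regression on two markers and compare its ROC AUC against each marker alone. The sketch below uses simulated marker values, not the study's measurements:

```python
# Sketch: comparing single-marker and combined-marker AUCs.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 110                                    # e.g. a cancer vs. benign comparison
y = rng.integers(0, 2, n)                  # 1 = cancer (synthetic labels)
cea = rng.normal(1 + 2.0 * y, 1.5, n)      # marker shifted upward in cancer
ca50 = rng.normal(1 + 1.5 * y, 1.5, n)

print("AUC, CEA alone :", roc_auc_score(y, cea))
X = np.column_stack([cea, ca50])
combo = LogisticRegression().fit(X, y).predict_proba(X)[:, 1]
print("AUC, CEA + CA50:", roc_auc_score(y, combo))   # combination is usually higher
```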

  1. Improving validation methods for molecular diagnostics: application of Bland-Altman, Deming and simple linear regression analyses in assay comparison and evaluation for next-generation sequencing.

    Science.gov (United States)

    Misyura, Maksym; Sukhai, Mahadeo A; Kulasignam, Vathany; Zhang, Tong; Kamel-Reid, Suzanne; Stockley, Tracy L

    2018-02-01

    A standard approach in test evaluation is to compare results of the assay in validation to results from previously validated methods. For quantitative molecular diagnostic assays, comparison of test values is often performed using simple linear regression and the coefficient of determination (R2), using R2 as the primary metric of assay agreement. However, the use of R2 alone does not adequately quantify constant or proportional errors required for optimal test evaluation. More extensive statistical approaches, such as Bland-Altman and expanded interpretation of linear regression methods, can be used to more thoroughly compare data from quantitative molecular assays. We present the application of Bland-Altman and linear regression statistical methods to evaluate quantitative outputs from next-generation sequencing (NGS) assays. NGS-derived data sets from assay validation experiments were used to demonstrate the utility of the statistical methods. Both Bland-Altman and linear regression were able to detect the presence and magnitude of constant and proportional error in quantitative values of NGS data. Deming linear regression was used in the context of assay comparison studies, while simple linear regression was used to analyse serial dilution data. The Bland-Altman statistical approach was also adapted to quantify assay accuracy, including constant and proportional errors, and precision where theoretical and empirical values were known. The complementary application of the statistical methods described in this manuscript enables more extensive evaluation of performance characteristics of quantitative molecular assays, prior to implementation in the clinical molecular laboratory. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
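
    Both statistics are short to compute. The sketch below implements Bland-Altman bias with 95% limits of agreement and the closed-form Deming slope under an assumed error-variance ratio λ (the paired assay values are synthetic, not NGS data):

```python
# Sketch: Bland-Altman agreement and Deming regression slope.
import numpy as np

def bland_altman(a, b):
    """Bias and 95% limits of agreement for paired measurements."""
    diff = a - b
    bias = diff.mean()
    half_width = 1.96 * diff.std(ddof=1)
    return bias, bias - half_width, bias + half_width

def deming_slope(x, y, lam=1.0):
    """Deming slope; lam is the assumed ratio of y- to x-error variances."""
    sxx, syy = x.var(ddof=1), y.var(ddof=1)
    sxy = np.cov(x, y)[0, 1]
    d = syy - lam * sxx
    return (d + np.sqrt(d**2 + 4 * lam * sxy**2)) / (2 * sxy)

rng = np.random.default_rng(0)
ref = rng.uniform(5, 50, 40)                        # validated assay (e.g. VAF %)
new = 1.02 * ref + 0.5 + rng.normal(0, 1.5, 40)     # assay under validation
print(bland_altman(new, ref))                       # constant error appears as bias
print(deming_slope(ref, new))                       # proportional error: slope != 1
```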

  2. Dental age assessment of young Iranian adults using third molars: A multivariate regression study.

    Science.gov (United States)

    Bagherpour, Ali; Anbiaee, Najmeh; Partovi, Parnia; Golestani, Shayan; Afzalinasab, Shakiba

    2012-10-01

    In recent years, a noticeable increase in forensic age estimations of living individuals has been observed. Radiologic assessment of the mineralisation stage of third molars is of particular importance with regard to the relevant age group. The goal of this study was to attain a referral database and regression equations for dental age estimation of unaccompanied minors in an Iranian population. Moreover, the probability of an individual being over the age of 18, given fully developed third molar(s), was determined. Using the scoring system of Gleiser and Hunt, modified by Köhler, an investigation of a cross-sectional sample of 1274 orthopantomograms of 885 females and 389 males aged between 15 and 22 years was carried out. Intra-observer reliability was tested using kappa statistics. The correlation between the scores of all four wisdom teeth was evaluated with the Spearman correlation coefficient. We also carried out the Wilcoxon signed-rank test on asymmetry and calculated the regression formulae. A strong intra-observer agreement was displayed by the kappa value. No significant difference (p-values for the upper and lower jaws were 0.07 and 0.59, respectively) was discovered by the Wilcoxon signed-rank test for left-right asymmetry. The developmental stage of the upper right and upper left third molars yielded the greatest correlation coefficient. The probability of an individual being over the age of 18 is 95.6% for males and 100.0% for females in case four fully developed third molars are present. Taking into consideration gender, location and number of wisdom teeth, regression formulae were derived. Use of population-specific standards is recommended as a means of improving the accuracy of forensic age estimates based on third molar mineralisation. To obtain more exact regression formulae, wider age range studies are recommended. Copyright © 2012 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.

  3. A comparison of cephalometric analyses for assessing sagittal jaw relationship

    International Nuclear Information System (INIS)

    Erum, G.; Fida, M.

    2008-01-01

    To compare seven methods of cephalometric analysis for assessing the sagittal jaw relationship and to determine the level of agreement between them. Seven methods describing anteroposterior jaw relationships (A-B plane, ANB, Wits, AXB, AF-BF, FABA and Beta angle) were measured on the lateral cephalographs of 85 patients. Correlation analysis, using Cramer's V-test, was performed to determine the possible agreement between each pair of analyses. The mean age of the sample, comprising 35 males and 50 females, was 15 years and 3 months. Statistically significant relationships were found among the seven sagittal parameters, with p-value <0.001. Very strong correlation was found between AXB and AF-BF distance (r=0.924), and weak correlation between ANB and Beta angle (r=0.377). The Wits appraisal showed the greatest coefficient of variability. Despite varying strengths of association, statistically significant correlations were found among the seven methods for assessing the sagittal jaw relationship. FABA and the A-B plane may be used to predict the skeletal class in addition to the established ANB angle. (author)

  4. Classification and regression tree (CART) analyses of genomic signatures reveal sets of tetramers that discriminate temperature optima of archaea and bacteria

    Science.gov (United States)

    Dyer, Betsey D.; Kahn, Michael J.; LeBlanc, Mark D.

    2008-01-01

    Classification and regression tree (CART) analysis was applied to genome-wide tetranucleotide frequencies (genomic signatures) of 195 archaea and bacteria. Although genomic signatures have typically been used to classify evolutionary divergence, in this study, convergent evolution was the focus. Temperature optima for most of the organisms examined could be distinguished by CART analyses of tetranucleotide frequencies. This suggests that pervasive (nonlinear) qualities of genomes may reflect certain environmental conditions (such as temperature) in which those genomes evolved. The predominant use of GAGA and AGGA as the discriminating tetramers in CART models suggests that purine-loading and codon biases of thermophiles may explain some of the results. PMID:19054742
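
    A generic scikit-learn decision-tree sketch of the approach: classify a (synthetic) tetranucleotide frequency matrix by temperature class and print the discriminating tetramers. The feature values and the label rule below are fabricated; the real study used genome-derived frequencies for 195 organisms:

```python
# Sketch: decision tree over (synthetic) tetranucleotide frequencies.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)
n_genomes = 195
freqs = rng.dirichlet(np.ones(256), size=n_genomes)   # 256 tetramer frequencies per genome
# Fabricated label rule standing in for thermophile vs. mesophile optima:
thermophile = (freqs[:, 0] + freqs[:, 1] > 0.009).astype(int)

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(freqs, thermophile)
names = [f"tetramer_{i}" for i in range(256)]
print(export_text(tree, feature_names=names))  # shows which tetramers discriminate
```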

  5. Assessment of bitter taste of pharmaceuticals with multisensor system employing 3 way PLS regression

    International Nuclear Information System (INIS)

    Rudnitskaya, Alisa; Kirsanov, Dmitry; Blinova, Yulia; Legin, Evgeny; Seleznev, Boris; Clapham, David; Ives, Robert S.; Saunders, Kenneth A.; Legin, Andrey

    2013-01-01

    Highlights: ► Chemically diverse APIs are studied with potentiometric “electronic tongue”. ► Bitter taste of APIs can be predicted with 3wayPLS regression from ET data. ► High correlation of ET assessment with human panel and rat in vivo model. -- Abstract: The application of the potentiometric multisensor system (electronic tongue, ET) for quantification of the bitter taste of structurally diverse active pharmaceutical ingredients (API) is reported. The measurements were performed using a set of bitter substances that had been assessed by a professional human sensory panel and the in vivo rat brief access taste aversion (BATA) model to produce bitterness intensity scores for each substance at different concentrations. The set consisted of eight substances, both inorganic and organic – azelastine, caffeine, chlorhexidine, potassium nitrate, naratriptan, paracetamol, quinine, and sumatriptan. With the aim of enhancing the response of the sensors to the studied APIs, measurements were carried out at different pH levels ranging from 2 to 10, thus promoting ionization of the compounds. This experiment yielded a 3 way data array (samples × sensors × pH levels) from which 3wayPLS regression models were constructed with both human panel and rat model reference data. These models revealed that artificial assessment of bitter taste with ET in the chosen set of APIs is possible with average relative errors of 16% in terms of human panel bitterness score and 25% in terms of inhibition values from in vivo rat model data. Furthermore, these 3wayPLS models were applied for prediction of the bitterness in blind test samples of a further set of APIs. The results of the prediction were compared with the inhibition values obtained from the in vivo rat model.

  6. Assessment of bitter taste of pharmaceuticals with multisensor system employing 3 way PLS regression

    Energy Technology Data Exchange (ETDEWEB)

    Rudnitskaya, Alisa [CESAM and Chemistry Department, University of Aveiro, Aveiro (Portugal); Kirsanov, Dmitry, E-mail: d.kirsanov@gmail.com [Chemistry Department, St. Petersburg University, St. Petersburg (Russian Federation); Blinova, Yulia [Chemistry Department, St. Petersburg University, St. Petersburg (Russian Federation); Legin, Evgeny [Sensor Systems LLC, St. Petersburg (Russian Federation); Seleznev, Boris [Chemistry Department, St. Petersburg University, St. Petersburg (Russian Federation); Clapham, David; Ives, Robert S.; Saunders, Kenneth A. [GlaxoSmithKline Pharmaceuticals, Gunnels Wood Road, Stevenage (United Kingdom); Legin, Andrey [Chemistry Department, St. Petersburg University, St. Petersburg (Russian Federation)

    2013-04-03

    Highlights: ► Chemically diverse APIs are studied with potentiometric “electronic tongue”. ► Bitter taste of APIs can be predicted with 3wayPLS regression from ET data. ► High correlation of ET assessment with human panel and rat in vivo model. -- Abstract: The application of the potentiometric multisensor system (electronic tongue, ET) for quantification of the bitter taste of structurally diverse active pharmaceutical ingredients (API) is reported. The measurements were performed using a set of bitter substances that had been assessed by a professional human sensory panel and the in vivo rat brief access taste aversion (BATA) model to produce bitterness intensity scores for each substance at different concentrations. The set consisted of eight substances, both inorganic and organic – azelastine, caffeine, chlorhexidine, potassium nitrate, naratriptan, paracetamol, quinine, and sumatriptan. With the aim of enhancing the response of the sensors to the studied APIs, measurements were carried out at different pH levels ranging from 2 to 10, thus promoting ionization of the compounds. This experiment yielded a 3 way data array (samples × sensors × pH levels) from which 3wayPLS regression models were constructed with both human panel and rat model reference data. These models revealed that artificial assessment of bitter taste with ET in the chosen set of APIs is possible with average relative errors of 16% in terms of human panel bitterness score and 25% in terms of inhibition values from in vivo rat model data. Furthermore, these 3wayPLS models were applied for prediction of the bitterness in blind test samples of a further set of APIs. The results of the prediction were compared with the inhibition values obtained from the in vivo rat model.

  7. Personal, social, and game-related correlates of active and non-active gaming among dutch gaming adolescents: survey-based multivariable, multilevel logistic regression analyses.

    Science.gov (United States)

    Simons, Monique; de Vet, Emely; Chinapaw, Mai Jm; de Boer, Michiel; Seidell, Jacob C; Brug, Johannes

    2014-04-04

    Playing video games contributes substantially to sedentary behavior in youth. A new generation of video games-active games-seems to be a promising alternative to sedentary games to promote physical activity and reduce sedentary behavior. At this time, little is known about correlates of active and non-active gaming among adolescents. The objective of this study was to examine potential personal, social, and game-related correlates of both active and non-active gaming in adolescents. A survey assessing game behavior and potential personal, social, and game-related correlates was conducted among adolescents (12-16 years, N=353) recruited via schools. Multivariable, multilevel logistic regression analyses, adjusted for demographics (age, sex and educational level of adolescents), were conducted to examine personal, social, and game-related correlates of active gaming ≥1 hour per week (h/wk) and non-active gaming >7 h/wk. Active gaming ≥1 h/wk was significantly associated with a more positive attitude toward active gaming (OR 5.3, CI 2.4-11.8; Pgames (OR 0.30, CI 0.1-0.6; P=.002), a higher score on habit strength regarding gaming (OR 1.9, CI 1.2-3.2; P=.008) and having brothers/sisters (OR 6.7, CI 2.6-17.1; Pgame engagement (OR 0.95, CI 0.91-0.997; P=.04). Non-active gaming >7 h/wk was significantly associated with a more positive attitude toward non-active gaming (OR 2.6, CI 1.1-6.3; P=.035), a stronger habit regarding gaming (OR 3.0, CI 1.7-5.3; P7 h/wk. Active gaming is most strongly (negatively) associated with attitude with respect to non-active games, followed by observed active game behavior of brothers and sisters and attitude with respect to active gaming (positive associations). On the other hand, non-active gaming is most strongly associated with observed non-active game behavior of friends, habit strength regarding gaming and attitude toward non-active gaming (positive associations). Habit strength was a correlate of both active and non-active gaming

  8. Personal, Social, and Game-Related Correlates of Active and Non-Active Gaming Among Dutch Gaming Adolescents: Survey-Based Multivariable, Multilevel Logistic Regression Analyses

    Science.gov (United States)

    de Vet, Emely; Chinapaw, Mai JM; de Boer, Michiel; Seidell, Jacob C; Brug, Johannes

    2014-01-01

    Background Playing video games contributes substantially to sedentary behavior in youth. A new generation of video games—active games—seems to be a promising alternative to sedentary games to promote physical activity and reduce sedentary behavior. At this time, little is known about correlates of active and non-active gaming among adolescents. Objective The objective of this study was to examine potential personal, social, and game-related correlates of both active and non-active gaming in adolescents. Methods A survey assessing game behavior and potential personal, social, and game-related correlates was conducted among adolescents (12-16 years, N=353) recruited via schools. Multivariable, multilevel logistic regression analyses, adjusted for demographics (age, sex and educational level of adolescents), were conducted to examine personal, social, and game-related correlates of active gaming ≥1 hour per week (h/wk) and non-active gaming >7 h/wk. Results Active gaming ≥1 h/wk was significantly associated with a more positive attitude toward active gaming (OR 5.3, CI 2.4-11.8; Pgames (OR 0.30, CI 0.1-0.6; P=.002), a higher score on habit strength regarding gaming (OR 1.9, CI 1.2-3.2; P=.008) and having brothers/sisters (OR 6.7, CI 2.6-17.1; Pgame engagement (OR 0.95, CI 0.91-0.997; P=.04). Non-active gaming >7 h/wk was significantly associated with a more positive attitude toward non-active gaming (OR 2.6, CI 1.1-6.3; P=.035), a stronger habit regarding gaming (OR 3.0, CI 1.7-5.3; P7 h/wk. Active gaming is most strongly (negatively) associated with attitude with respect to non-active games, followed by observed active game behavior of brothers and sisters and attitude with respect to active gaming (positive associations). On the other hand, non-active gaming is most strongly associated with observed non-active game behavior of friends, habit strength regarding gaming and attitude toward non-active gaming (positive associations). Habit strength was a

  9. Assessment of participation bias in cohort studies: systematic review and meta-regression analysis

    Directory of Open Access Journals (Sweden)

    Sérgio Henrique Almeida da Silva Junior

    2015-11-01

    Full Text Available Abstract The proportion of non-participation in cohort studies, if associated with both the exposure and the probability of occurrence of the event, can introduce bias in the estimates of interest. The aim of this study is to evaluate the impact of participation and its characteristics in longitudinal studies. A systematic review (MEDLINE, Scopus and Web of Science) for articles describing the proportion of participation in the baseline of cohort studies was performed. Among the 2,964 initially identified, 50 were selected. The average proportion of participation was 64.7%. Using a meta-regression model with mixed effects, only age, year of baseline contact and study region (borderline) were associated with participation. Considering the decrease in participation in recent years, and the cost of cohort studies, it is essential to gather information to assess the potential for non-participation before committing resources. Finally, journals should require the presentation of this information in the papers.

  10. Assessing the Liquidity of Firms: Robust Neural Network Regression as an Alternative to the Current Ratio

    Science.gov (United States)

    de Andrés, Javier; Landajo, Manuel; Lorca, Pedro; Labra, Jose; Ordóñez, Patricia

    Artificial neural networks have proven to be useful tools for solving financial analysis problems such as financial distress prediction and audit risk assessment. In this paper we focus on the performance of robust (least absolute deviation-based) neural networks in measuring the liquidity of firms. The problem of learning the bivariate relationship between the components (namely, current liabilities and current assets) of the so-called current ratio is analyzed, and the predictive performance of several modelling paradigms (namely, linear and log-linear regressions, classical ratios and neural networks) is compared. An empirical analysis is conducted on a representative database from the Spanish economy. Results indicate that classical ratio models are largely inadequate as a realistic description of the studied relationship, especially when used for predictive purposes. In a number of cases, especially when the analyzed firms are microenterprises, the linear specification is improved by considering the flexible non-linear structures provided by neural networks.

  11. Structured Additive Quantile Regression for Assessing the Determinants of Childhood Anemia in Rwanda.

    Science.gov (United States)

    Habyarimana, Faustin; Zewotir, Temesgen; Ramroop, Shaun

    2017-06-17

    Childhood anemia is among the most significant health problems faced by public health departments in developing countries. This study aims at assessing the determinants and possible spatial effects associated with childhood anemia in Rwanda. The 2014/2015 Rwanda Demographic and Health Survey (RDHS) data was used. The analysis was done using the structured spatial additive quantile regression model. The findings of this study revealed that the child's age; the duration of breastfeeding; gender of the child; the nutritional status of the child (whether underweight and/or wasting); whether the child had a fever or had a cough in the two weeks prior to the survey or not; whether the child received vitamin A supplementation in the six weeks before the survey or not; the household wealth index; literacy of the mother; mother's anemia status; and mother's age at birth are all significant factors associated with childhood anemia in Rwanda. Furthermore, significant structured spatial location effects on childhood anemia were found.
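
    Structured additive quantile regression as used here has no drop-in Python equivalent, but the core quantile-regression step can be sketched with statsmodels (synthetic covariates; the 0.25 quantile targets the low-haemoglobin tail where anemia is defined):

```python
# Sketch: quantile regression for a low quantile of haemoglobin.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({"age_months": rng.uniform(6, 59, n),
                   "wealth": rng.integers(1, 6, n)})     # synthetic covariates
df["hb"] = 10 + 0.03 * df["age_months"] + 0.2 * df["wealth"] + rng.normal(0, 1, n)

fit = smf.quantreg("hb ~ age_months + wealth", df).fit(q=0.25)
print(fit.params)   # covariate effects at the anemia-relevant lower tail
```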

  12. A general framework for the regression analysis of pooled biomarker assessments.

    Science.gov (United States)

    Liu, Yan; McMahan, Christopher; Gallagher, Colin

    2017-07-10

    As a cost-efficient data collection mechanism, the process of assaying pooled biospecimens is becoming increasingly common in epidemiological research; for example, pooling has been proposed for the purpose of evaluating the diagnostic efficacy of biological markers (biomarkers). To this end, several authors have proposed techniques that allow for the analysis of continuous pooled biomarker assessments. Regretfully, most of these techniques proceed under restrictive assumptions, are unable to account for the effects of measurement error, and fail to control for confounding variables. These limitations are understandably attributable to the complex structure that is inherent to measurements taken on pooled specimens. Consequently, in order to provide practitioners with the tools necessary to accurately and efficiently analyze pooled biomarker assessments, herein, a general Monte Carlo maximum likelihood-based procedure is presented. The proposed approach allows for the regression analysis of pooled data under practically all parametric models and can be used to directly account for the effects of measurement error. Through simulation, it is shown that the proposed approach can accurately and efficiently estimate all unknown parameters and is more computationally efficient than existing techniques. This new methodology is further illustrated using monocyte chemotactic protein-1 data collected by the Collaborative Perinatal Project in an effort to assess the relationship between this chemokine and the risk of miscarriage. Copyright © 2017 John Wiley & Sons, Ltd.

  13. Selective principal component regression analysis of fluorescence hyperspectral image to assess aflatoxin contamination in corn

    Science.gov (United States)

    Selective principal component regression analysis (SPCR) uses a subset of the original image bands for principal component transformation and regression. For optimal band selection before the transformation, this paper used genetic algorithms (GA). In this case, the GA process used the regression co...

  14. The more total cognitive load is reduced by cues, the better retention and transfer of multimedia learning: A meta-analysis and two meta-regression analyses.

    Science.gov (United States)

    Xie, Heping; Wang, Fuxing; Hao, Yanbin; Chen, Jiaxue; An, Jing; Wang, Yuxin; Liu, Huashan

    2017-01-01

    Cueing facilitates retention and transfer of multimedia learning. From the perspective of cognitive load theory (CLT), cueing has a positive effect on learning outcomes because of the reduction in total cognitive load and avoidance of cognitive overload. However, this has not been systematically evaluated. Moreover, what remains ambiguous is the direct relationship between the cue-related cognitive load and learning outcomes. A meta-analysis and two subsequent meta-regression analyses were conducted to explore these issues. Subjective total cognitive load (SCL) and scores on a retention test and transfer test were selected as dependent variables. Through a systematic literature search, 32 eligible articles encompassing 3,597 participants were included in the SCL-related meta-analysis. Among them, 25 articles containing 2,910 participants were included in the retention-related meta-analysis and the following retention-related meta-regression, while there were 29 articles containing 3,204 participants included in the transfer-related meta-analysis and the transfer-related meta-regression. The meta-analysis revealed a statistically significant cueing effect on subjective ratings of cognitive load (d = -0.11, 95% CI = [-0.19, -0.02], p < 0.05), retention performance (d = 0.27, 95% CI = [0.08, 0.46], p < 0.01), and transfer performance (d = 0.34, 95% CI = [0.12, 0.56], p < 0.01). The subsequent meta-regression analyses showed that dSCL for cueing significantly predicted dretention for cueing (β = -0.70, 95% CI = [-1.02, -0.38], p < 0.001), as well as dtransfer for cueing (β = -0.60, 95% CI = [-0.92, -0.28], p < 0.001). Thus in line with CLT, adding cues in multimedia materials can indeed reduce SCL and promote learning outcomes, and the more SCL is reduced by cues, the better retention and transfer of multimedia learning.
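
    The meta-regression step — regressing study-level effect sizes for retention on study-level effect sizes for cognitive load — can be sketched as a weighted least-squares fit (a fixed-effect simplification of the mixed-effects models used in the paper; all numbers below are fabricated):

```python
# Sketch: meta-regression of retention effects on cognitive-load effects.
import numpy as np
import statsmodels.api as sm

# Fabricated study-level effects: d for SCL reduction and d for retention gain.
d_scl = np.array([-0.30, -0.15, -0.05, 0.02, -0.22, -0.10])
d_ret = np.array([0.45, 0.30, 0.15, 0.05, 0.40, 0.20])
weights = np.array([120, 80, 150, 60, 100, 90])   # e.g. study sample sizes

X = sm.add_constant(d_scl)
fit = sm.WLS(d_ret, X, weights=weights).fit()
print(fit.params)   # negative slope: larger SCL reductions, larger retention gains
```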

  15. Meta-regression analyses, meta-analyses, and trial sequential analyses of the effects of supplementation with Beta-carotene, vitamin a, and vitamin e singly or in different combinations on all-cause mortality

    DEFF Research Database (Denmark)

    Bjelakovic, Goran; Nikolova, Dimitrinka; Gluud, Christian

    2013-01-01

    Evidence shows that antioxidant supplements may increase mortality. Our aims were to assess whether different doses of beta-carotene, vitamin A, and vitamin E affect mortality in primary and secondary prevention randomized clinical trials with low risk of bias.

  16. Predictors of success of external cephalic version and cephalic presentation at birth among 1253 women with non-cephalic presentation using logistic regression and classification tree analyses.

    Science.gov (United States)

    Hutton, Eileen K; Simioni, Julia C; Thabane, Lehana

    2017-08-01

    Among women with a fetus with a non-cephalic presentation, external cephalic version (ECV) has been shown to reduce the rate of breech presentation at birth and cesarean birth. Compared with ECV at term, beginning ECV prior to 37 weeks' gestation decreases the number of infants in a non-cephalic presentation at birth. The purpose of this secondary analysis was to investigate factors associated with a successful ECV procedure and to present this in a clinically useful format. Data were collected as part of the Early ECV Pilot and Early ECV2 Trials, which randomized 1776 women with a fetus in breech presentation to either early ECV (34-36 weeks' gestation) or delayed ECV (at or after 37 weeks). The outcome of interest was successful ECV, defined as the fetus being in a cephalic presentation immediately following the procedure, as well as at the time of birth. The importance of several factors in predicting successful ECV was investigated using two statistical methods: logistic regression and classification and regression tree (CART) analyses. Among nulliparas, non-engagement of the presenting part and an easily palpable fetal head were independently associated with success. Among multiparas, non-engagement of the presenting part, gestation less than 37 weeks and an easily palpable fetal head were found to be independent predictors of success. These findings were consistent with results of the CART analyses. Regardless of parity, descent of the presenting part was the most discriminating factor in predicting successful ECV and cephalic presentation at birth. © 2017 Nordic Federation of Societies of Obstetrics and Gynecology.

  17. Structured Additive Quantile Regression for Assessing the Determinants of Childhood Anemia in Rwanda

    Directory of Open Access Journals (Sweden)

    Faustin Habyarimana

    2017-06-01

    Full Text Available Childhood anemia is among the most significant health problems faced by public health departments in developing countries. This study aims at assessing the determinants and possible spatial effects associated with childhood anemia in Rwanda. The 2014/2015 Rwanda Demographic and Health Survey (RDHS) data was used. The analysis was done using the structured spatial additive quantile regression model. The findings of this study revealed that the child’s age; the duration of breastfeeding; gender of the child; the nutritional status of the child (whether underweight and/or wasting); whether the child had a fever or had a cough in the two weeks prior to the survey or not; whether the child received vitamin A supplementation in the six weeks before the survey or not; the household wealth index; literacy of the mother; mother’s anemia status; and mother’s age at birth are all significant factors associated with childhood anemia in Rwanda. Furthermore, significant structured spatial location effects on childhood anemia were found.

  18. Biodiversity analyses for risk assessment of genetically modified potato

    NARCIS (Netherlands)

    Lazebnik, Jenny; Dicke, Marcel; Braak, ter Cajo J.F.; Loon, van Joop J.A.

    2017-01-01

    An environmental risk assessment for the introduction of genetically modified crops includes assessing the consequences for biodiversity. In this study arthropod biodiversity was measured using pitfall traps in potato agro-ecosystems in Ireland and The Netherlands over two years. We tested the

  19. Assessing Commercial and Alternative Poultry Processing Methods using Microbiome Analyses

    Science.gov (United States)

    Assessing poultry processing methods/strategies has historically used culture-based methods to assess bacterial changes or reductions, both in terms of general microbial communities (e.g. total aerobic bacteria) or zoonotic pathogens of interest (e.g. Salmonella, Campylobacter). The advent of next ...

  20. Non-invasive diagnostic methods for atherosclerosis and use in assessing progression and regression in hypercholesterolemia

    International Nuclear Information System (INIS)

    Tsushima, Motoo; Fujii, Shigeki; Yutani, Chikao; Yamamoto, Akira; Naitoh, Hiroaki.

    1990-01-01

    We evaluated the wall thickening and stenosis rate (ASI), the calcification rate (ACI), and the wall thickening and calcification stenosis rate (SCI) of the lower abdominal aorta calculated by the 12 sector method from simple or enhanced computed tomography. The intra-observer variation of the calculation of ASI was 5.7% and that of ACI was 2.4%. In 9 patients who underwent an autopsy examination, ACI was significantly correlated with the rate of the calcification dimension to the whole objective area of the abdominal aorta (r=0.856, p<0.01). However, there were no correlations between ASI and the surface involvement or the atherosclerotic index obtained by the point-counting method of the autopsy materials. In the analysis of 40 patients with atherosclerotic vascular diseases, ASI and ACI were also highly correlated with the percentage volume of the arterial wall in relation to the whole volume of the observed artery (r=0.852, p<0.0001) and also the percentage calcification volume (r=0.913, p<0.0001) calculated by the computed method, respectively. The percentage of atherosclerotic vascular diseases increased in the group of both high ASI (over 10%) and high ACI (over 20%). We used SCI as a reliable index when the progression and regression of atherosclerosis was considered. Among patients of hypercholesterolemia consisting of 15 with familial hypercholesterolemia (FH) and 6 non-FH patients, the change of SCI (d-SCI) was significantly correlated with the change of total cholesterol concentration (d-TC) after the treatment (r=0.466, p<0.05) and the change of the right Achilles' tendon thickening (d-ATT) was also correlated with d-TC (r=0.634, p<0.005). However, no correlation between d-SCI and d-ATT was observed. In conclusion, CT indices of atherosclerosis were useful as a noninvasive quantitative diagnostic method and we were able to use them to assess the progression and regression of atherosclerosis. (author)

  1. Methodologies for the assessment of earthquake-triggered landslides hazard. A comparison of Logistic Regression and Artificial Neural Network models.

    Science.gov (United States)

    García-Rodríguez, M. J.; Malpica, J. A.; Benito, B.

    2009-04-01

    In recent years, interest in landslide hazard assessment studies has increased substantially. They are appropriate for evaluation and mitigation plan development in landslide-prone areas. There are several techniques available for landslide hazard research at a regional scale. Generally, they can be classified in two groups: qualitative and quantitative methods. Most qualitative methods tend to be subjective, since they depend on expert opinions and represent hazard levels in descriptive terms. On the other hand, quantitative methods are objective and they are commonly used due to the correlation between the instability factors and the location of the landslides. Within this group, statistical approaches and new heuristic techniques based on artificial intelligence (artificial neural network (ANN), fuzzy logic, etc.) provide rigorous analysis to assess landslide hazard over large regions. However, they depend on qualitative and quantitative data, scale, types of movements and characteristic factors used. We analysed and compared an approach for assessing earthquake-triggered landslide hazard using logistic regression (LR) and artificial neural networks (ANN) with a back-propagation learning algorithm. One application has been developed in El Salvador, a country of Central America where earthquake-triggered landslides are common phenomena. In a first phase, we analysed the susceptibility and hazard associated with the seismic scenario of the 2001 January 13th earthquake. We calibrated the models using data from the landslide inventory for this scenario. These analyses require input variables representing physical parameters that contribute to the initiation of slope instability, for example, slope gradient, elevation, aspect, mean annual precipitation, lithology, land use, and terrain roughness, while the occurrence or non-occurrence of landslides is considered as the dependent variable. The results of the landslide susceptibility analysis are checked using landslide

  2. Segmented regression analysis of interrupted time series data to assess outcomes of a South American road traffic alcohol policy change.

    Science.gov (United States)

    Nistal-Nuño, Beatriz

    2017-09-01

    In Chile, a new law introduced in March 2012 decreased the legal blood alcohol concentration (BAC) limit for driving while impaired from 1 to 0.8 g/l and the legal BAC limit for driving under the influence of alcohol from 0.5 to 0.3 g/l. The goal is to assess the impact of this new law on mortality and morbidity outcomes in Chile. A review of national databases in Chile was conducted from January 2003 to December 2014. Segmented regression analysis of interrupted time series was used for analyzing the data. In a series of multivariable linear regression models, the change in intercept and slope in the monthly incidence rate of traffic deaths and injuries and association with alcohol per 100,000 inhabitants was estimated from pre-intervention to post-intervention, while controlling for secular changes. In nested regression models, potential confounding seasonal effects were accounted for. All analyses were performed at a two-sided significance level of 0.05. Immediate level drops in all the monthly rates were observed after the law, from the end of the pre-law period, in the majority of models and in all the de-seasonalized models, although statistical significance was reached only in the model for injuries related to alcohol. After the law, the estimated monthly rate dropped abruptly by -0.869 for injuries related to alcohol and by -0.859 adjusting for seasonality (P < 0.001). Regarding the post-law long-term trends, a steeper decreasing trend after the law was evident in the models for deaths related to alcohol, although these differences were not statistically significant. Strong evidence of a reduction in traffic injuries related to alcohol was found following the law in Chile. Although insufficient evidence was found of a statistically significant effect for the beneficial effects seen on deaths and overall injuries, potential clinically important effects cannot be ruled out. Copyright © 2017 The Royal Society for Public Health. Published by Elsevier Ltd
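
    A segmented ITS regression of this design reduces to an OLS fit with a time trend, a post-law level term, a post-law slope term and seasonal dummies. The statsmodels sketch below uses simulated monthly rates and the study's approximate calendar (assumptions, not the Chilean data):

```python
# Sketch: segmented (interrupted time series) regression with seasonal dummies.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
months = np.arange(144)                          # Jan 2003 .. Dec 2014
law = (months >= 110).astype(int)                # law in force from March 2012
rate = (5 - 0.01 * months                        # pre-law secular trend
        - 0.8 * law                              # immediate level drop
        - 0.01 * law * (months - 110)            # post-law slope change
        + 0.3 * np.sin(2 * np.pi * months / 12)  # seasonality
        + rng.normal(0, 0.2, 144))
df = pd.DataFrame({"rate": rate, "t": months, "law": law,
                   "t_after": law * (months - 110), "month": months % 12})

fit = smf.ols("rate ~ t + law + t_after + C(month)", df).fit()
print(fit.params[["law", "t_after"]])            # level change and trend change
```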

  3. A geographically weighted regression model for geothermal potential assessment in mediterranean cultural landscape

    Science.gov (United States)

    D'Arpa, S.; Zaccarelli, N.; Bruno, D. E.; Leucci, G.; Uricchio, V. F.; Zurlini, G.

    2012-04-01

    Geothermal heat can be used directly in many applications (agro-industrial processes, sanitary hot water production, heating/cooling systems, etc.). These applications respond to energetic and environmental sustainability criteria, ensuring substantial energy savings with low environmental impacts. In particular, in Mediterranean cultural landscapes the exploitation of geothermal energy offers a valuable alternative to other exploitation systems that are more land-consuming and visually intrusive. However, low-enthalpy geothermal energy applications at the regional scale require careful design and planning to fully exploit benefits and reduce drawbacks. We propose a first example of application of a Geographically Weighted Regression (GWR) for the modeling of geothermal potential in the Apulia Region (South Italy) by integrating hydrological (e.g. depth to water table, water speed and temperature), geological-geotechnical (e.g. lithology, thermal conductivity) parameters and land-use indicators. The GWR model can effectively cope with data quality, spatial anisotropy, lack of stationarity and presence of discontinuities in the underlying data maps. The geothermal potential assessment required a good knowledge of the space-time variation of the numerous parameters related to the status of the geothermal resource, a contextual analysis of spatial and environmental features, as well as the presence and nature of regulatory or infrastructure constraints. We create an ad hoc geodatabase within ArcGIS 10, collecting relevant data and performing a quality assessment. Cross-validation shows a high level of consistency of the local spatial models, and error maps can depict areas of lower reliability. Based on the low-enthalpy geothermal potential map created, a first zoning of the study area is proposed, considering four levels of possible exploitation. Such zoning is linked to and refined by the actual legal constraints acting at the regional or province level as enforced by the regional
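
    At its core, GWR fits one weighted least-squares model per location, with weights decaying with geographic distance. A self-contained numpy sketch follows (synthetic site data; the Gaussian kernel and bandwidth are illustrative choices, not the study's calibration):

```python
# Sketch: geographically weighted regression with a Gaussian distance kernel.
import numpy as np

def gwr_coefficients(coords, X, y, bandwidth):
    """One weighted least-squares fit per location; weights decay with distance."""
    Xd = np.column_stack([np.ones(len(y)), X])
    betas = np.empty((len(y), Xd.shape[1]))
    for i, c in enumerate(coords):
        dist = np.linalg.norm(coords - c, axis=1)
        w = np.exp(-(dist / bandwidth) ** 2)
        XtW = Xd.T * w                       # equivalent to Xd.T @ diag(w)
        betas[i] = np.linalg.solve(XtW @ Xd, XtW @ y)
    return betas

rng = np.random.default_rng(0)
coords = rng.uniform(0, 100, (200, 2))       # site locations (km, synthetic)
depth = rng.uniform(10, 200, 200)            # depth to water table
cond = rng.uniform(1, 4, 200)                # thermal conductivity
y = 2 + 0.01 * depth + (1 + coords[:, 0] / 100) * cond + rng.normal(0, 0.5, 200)

betas = gwr_coefficients(coords, np.column_stack([depth, cond]), y, bandwidth=20.0)
print(betas[:3])   # per-site intercept, depth and conductivity coefficients
```

    The spatial variation in the fitted coefficients is what lets the model capture the non-stationarity the abstract describes.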

  4. Analyses of polycyclic aromatic hydrocarbon (PAH) and chiral-PAH analogues-methyl-β-cyclodextrin guest-host inclusion complexes by fluorescence spectrophotometry and multivariate regression analysis.

    Science.gov (United States)

    Greene, LaVana; Elzey, Brianda; Franklin, Mariah; Fakayode, Sayo O

    2017-03-05

    The negative health impact of polycyclic aromatic hydrocarbons (PAHs) and differences in the pharmacological activity of enantiomers of chiral molecules in humans highlight the need for analysis of PAHs and their chiral analogue molecules in humans. Herein, the first use of cyclodextrin guest-host inclusion complexation, fluorescence spectrophotometry, and a chemometric approach for PAH (anthracene) and chiral-PAH analogue derivative (1-(9-anthryl)-2,2,2-trifluoroethanol (TFE)) analyses is reported. The binding constants (Kb), stoichiometry (n), and thermodynamic properties (Gibbs free energy (ΔG), enthalpy (ΔH), and entropy (ΔS)) of anthracene and enantiomers of TFE-methyl-β-cyclodextrin (Me-β-CD) guest-host complexes were also determined. Chemometric partial-least-squares (PLS) regression analysis of emission spectra data of Me-β-CD guest-host inclusion complexes was used for the determination of anthracene and TFE enantiomer concentrations in Me-β-CD guest-host inclusion complex samples. The values of the calculated Kb and negative ΔG suggest the thermodynamic favorability of the anthracene-Me-β-CD and enantiomeric TFE-Me-β-CD inclusion complexation reactions. However, the anthracene-Me-β-CD and enantiomeric TFE-Me-β-CD inclusion complexations showed notable differences in binding affinity behaviors and thermodynamic properties. The PLS regression analysis resulted in squared correlation coefficients of 0.997530 or better and a low LOD of 3.81×10⁻⁷ M for anthracene and 3.48×10⁻⁸ M for TFE enantiomers at physiological conditions. Most importantly, PLS regression accurately determined the anthracene and TFE enantiomer concentrations with an average low error of 2.31% for anthracene, 4.44% for R-TFE and 3.60% for S-TFE. The results of the study are highly significant because of its high sensitivity and accuracy for analysis of PAH and chiral PAH analogue derivatives without the need of an expensive chiral column, enantiomeric resolution, or use of a polarized
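
    The chemometric step — regressing analyte concentration on emission spectra — can be sketched with scikit-learn's PLS regression (synthetic Gaussian-band spectra below; the real study used measured fluorescence of Me-β-CD inclusion complexes):

```python
# Sketch: PLS regression of analyte concentration on emission spectra.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
wavelengths = np.linspace(350, 550, 120)
conc = rng.uniform(1e-7, 1e-5, 60)                  # synthetic concentrations (M)
band = np.exp(-((wavelengths - 420) / 30) ** 2)     # idealized emission band shape
spectra = np.outer(conc, band) + rng.normal(0, 1e-8, (60, 120))

pls = PLSRegression(n_components=3).fit(spectra, conc)
print(pls.score(spectra, conc))                     # R^2 of the concentration fit
```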

  5. Bisphenol-A exposures and behavioural aberrations: median and linear spline and meta-regression analyses of 12 toxicity studies in rodents.

    Science.gov (United States)

    Peluso, Marco E M; Munnia, Armelle; Ceppi, Marcello

    2014-11-05

    Exposures to bisphenol-A, a weak estrogenic chemical, largely used for the production of plastic containers, can affect rodent behaviour. Thus, we examined the relationships between bisphenol-A and anxiety-like behaviour, spatial skills, and aggressiveness in 12 toxicity studies of rodent offspring from females orally exposed to bisphenol-A, while pregnant and/or lactating, by median and linear spline analyses. Subsequently, meta-regression analysis was applied to quantify the behavioural changes. U-shaped, inverted U-shaped and J-shaped dose-response curves were found to describe the relationships between bisphenol-A and the behavioural outcomes. The occurrence of anxiogenic-like effects and spatial skill changes displayed U-shaped and inverted U-shaped curves, respectively, providing examples of effects that are observed at low doses. Conversely, a J-shaped dose-response relationship was observed for aggressiveness. When the proportion of rodents expressing certain traits or the time that they employed to manifest an attitude was analysed, the meta-regression indicated that a borderline significant increment of anxiogenic-like effects was present at low doses regardless of sex (β)=-0.8%, 95% C.I. -1.7/0.1, P=0.076, at ≤120 μg bisphenol-A. Whereas, only bisphenol-A-males exhibited a significant inhibition of spatial skills (β)=0.7%, 95% C.I. 0.2/1.2, P=0.004, at ≤100 μg/day. A significant increment of aggressiveness was observed in both sexes (β)=67.9, C.I. 3.4/172.5, P=0.038, at >4.0 μg. Then, bisphenol-A treatments significantly abrogated spatial learning and ability in males (Pbisphenol-A, e.g. ≤120 μg/day, were associated with behavioural aberrations in offspring. Copyright © 2014. Published by Elsevier Ireland Ltd.

  6. Comparative and Predictive Multimedia Assessments Using Monte Carlo Uncertainty Analyses

    Science.gov (United States)

    Whelan, G.

    2002-05-01

    Multiple-pathway frameworks (sometimes referred to as multimedia models) provide a platform for combining medium-specific environmental models and databases so that they can be utilized in a more holistic assessment of contaminant fate and transport in the environment. These frameworks provide a relatively seamless transfer of information from one model to the next and from databases to models. Within these frameworks, multiple models are linked, resulting in models that consume information from upstream models and produce information to be consumed by downstream models. The Framework for Risk Analysis in Multimedia Environmental Systems (FRAMES) is an example that allows users to link their models to other models and databases. FRAMES is an icon-driven, site-layout platform with an open-architecture, object-oriented design that interacts with environmental databases; helps the user construct a real-world-based Conceptual Site Model; allows the user to choose the most appropriate models to solve simulation requirements; solves the standard risk paradigm of release, transport and fate, and exposure/risk assessment for people and ecology; and presents graphical packages for analyzing results. FRAMES is specifically designed to allow users to link their own models into a system that contains models developed by others. This paper will present the use of FRAMES to evaluate potential human health exposures, using real site data and realistic assumptions, from sources through the vadose and saturated zones to exposure and risk assessment at three real-world sites, using the Multimedia Environmental Pollutant Assessment System (MEPAS), a multimedia model contained within FRAMES. These real-world examples use predictive and comparative approaches coupled with a Monte Carlo analysis. A predictive analysis is one where models are calibrated to monitored site data prior to the assessment, and a comparative analysis is one where models are not calibrated but

  7. Area under the curve predictions of dalbavancin, a new lipoglycopeptide agent, using the end of intravenous infusion concentration data point by regression analyses such as linear, log-linear and power models.

    Science.gov (United States)

    Bhamidipati, Ravi Kanth; Syed, Muzeeb; Mullangi, Ramesh; Srinivas, Nuggehally

    2018-02-01

    1. Dalbavancin, a lipoglycopeptide, is approved for treating gram-positive bacterial infections. The area under the plasma concentration versus time curve (AUCinf) of dalbavancin is a key parameter, and the AUCinf/MIC ratio is a critical pharmacodynamic marker. 2. Using the end-of-intravenous-infusion concentration (i.e. Cmax), the Cmax versus AUCinf relationship for dalbavancin was established by regression analyses (i.e. linear, log-log, log-linear and power models) using 21 pairs of subject data. 3. Predictions of AUCinf were performed by applying the regression equations to published Cmax data. The quotient of observed/predicted values rendered the fold difference. The mean absolute error (MAE), root mean square error (RMSE) and correlation coefficient (r) were used in the assessment. 4. MAE and RMSE values for the various models were comparable. Cmax versus AUCinf exhibited excellent correlation (r > 0.9488). The internal data evaluation showed narrow confinement (0.84-1.14-fold difference), and the models predicted AUCinf with a RMSE of 3.02-27.46%, with the fold difference largely contained within 0.64-1.48. 5. Regardless of the regression model, a single-time-point strategy using Cmax (i.e. end of 30-min infusion) is amenable as a prospective tool for predicting the AUCinf of dalbavancin in patients.
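
    To make the regression comparison concrete, the sketch below fits linear and power (log-log) models of AUC on Cmax and scores them by fold difference and percentage RMSE, mirroring the workflow described; the data are invented, not the study's 21 subject pairs.

      # Synthetic Cmax -> AUC regressions: linear and power (log-log) forms.
      import numpy as np

      rng = np.random.default_rng(1)
      cmax = rng.uniform(200, 400, 21)                         # hypothetical Cmax values
      auc = 25 * cmax**1.05 * rng.lognormal(0, 0.05, 21)       # hypothetical AUCinf values

      coef_lin = np.polyfit(cmax, auc, 1)                      # AUC = a*Cmax + b
      pred_lin = np.polyval(coef_lin, cmax)
      slope, log_a = np.polyfit(np.log(cmax), np.log(auc), 1)  # log AUC = log a + b*log Cmax
      pred_pow = np.exp(log_a) * cmax**slope

      for name, pred in [("linear", pred_lin), ("power", pred_pow)]:
          fold = auc / pred                                    # observed/predicted quotient
          rmse_pct = 100 * np.sqrt(np.mean((pred - auc) ** 2)) / auc.mean()
          print(f"{name}: fold {fold.min():.2f}-{fold.max():.2f}, RMSE {rmse_pct:.1f}%")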

  8. Item Response Theory Modeling and Categorical Regression Analyses of the Five-Factor Model Rating Form: A Study on Italian Community-Dwelling Adolescent Participants and Adult Participants.

    Science.gov (United States)

    Fossati, Andrea; Widiger, Thomas A; Borroni, Serena; Maffei, Cesare; Somma, Antonella

    2017-06-01

    To extend the evidence on the reliability and construct validity of the Five-Factor Model Rating Form (FFMRF) in its self-report version, two independent samples of Italian participants, which were composed of 510 adolescent high school students and 457 community-dwelling adults, respectively, were administered the FFMRF in its Italian translation. Adolescent participants were also administered the Italian translation of the Borderline Personality Features Scale for Children-11 (BPFSC-11), whereas adult participants were administered the Italian translation of the Triarchic Psychopathy Measure (TriPM). Cronbach α values were consistent with previous findings; in both samples, average interitem r values indicated acceptable internal consistency for all FFMRF scales. A multidimensional graded item response theory model indicated that the majority of FFMRF items had adequate discrimination parameters; information indices supported the reliability of the FFMRF scales. Both categorical (i.e., item-level) and scale-level regression analyses suggested that the FFMRF scores may predict a nonnegligible amount of variance in the BPFSC-11 total score in adolescent participants, and in the TriPM scale scores in adult participants.

  9. Plasma-safety assessment model and safety analyses of ITER

    International Nuclear Information System (INIS)

    Honda, T.; Okazaki, T.; Bartels, H.-H.; Uckan, N.A.; Sugihara, M.; Seki, Y.

    2001-01-01

    A plasma-safety assessment model has been provided on the basis of the plasma physics database of the International Thermonuclear Experimental Reactor (ITER) to analyze events including plasma behavior. The model was implemented in a safety analysis code (SAFALY), which consists of a 0-D dynamic plasma model and a 1-D thermal behavior model of the in-vessel components. Unusual plasma events of ITER, e.g., overfueling, were calculated using the code, and plasma burning was found to be self-bounded by operation limits or passively shut down by impurity ingress from overheated divertor targets. A sudden transition of the divertor plasma might lead to failure of the divertor target because of a sharp increase in the heat flux. However, the effects of such an aggravating failure can be safely handled by the confinement boundaries. (author)

  10. Appropriate assessment of neighborhood effects on individual health: integrating random and fixed effects in multilevel logistic regression

    DEFF Research Database (Denmark)

    Larsen, Klaus; Merlo, Juan

    2005-01-01

    The logistic regression model is frequently used in epidemiologic studies, yielding odds ratio or relative risk interpretations. Inspired by the theory of linear normal models, the logistic regression model has been extended to allow for correlated responses by introducing random effects. However, the model does not inherit the interpretational features of the normal model. In this paper, the authors argue that the existing measures are unsatisfactory (and some of them even improper) when quantifying results from multilevel logistic regression analyses. The authors suggest a measure of heterogeneity, the median odds ratio, that quantifies cluster heterogeneity and facilitates a direct comparison between covariate effects and the magnitude of heterogeneity in terms of well-known odds ratios. Quantifying cluster-level covariates in a meaningful way is a challenge in multilevel logistic regression.
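
    The median odds ratio proposed in the paper is computed directly from the estimated cluster-level random-intercept variance; a minimal sketch follows (the sigma-squared value plugged in is an arbitrary example, not from the paper):

      # MOR = exp( sqrt(2 * sigma^2) * Phi^-1(0.75) ), Larsen & Merlo's measure.
      from math import exp, sqrt
      from scipy.stats import norm

      def median_odds_ratio(sigma2: float) -> float:
          """sigma2: variance of the cluster (e.g. neighbourhood) random intercept."""
          return exp(sqrt(2.0 * sigma2) * norm.ppf(0.75))

      print(round(median_odds_ratio(0.4), 2))   # e.g. sigma^2 = 0.4 -> MOR ~ 1.83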

  11. Effective behaviour change techniques for physical activity and healthy eating in overweight and obese adults; systematic review and meta-regression analyses.

    Science.gov (United States)

    Samdal, Gro Beate; Eide, Geir Egil; Barth, Tom; Williams, Geoffrey; Meland, Eivind

    2017-03-28

    This systematic review aims to explain the heterogeneity in results of interventions to promote physical activity and healthy eating for overweight and obese adults, by exploring the differential effects of behaviour change techniques (BCTs) and other intervention characteristics. The inclusion criteria specified RCTs with ≥ 12 weeks' duration, from January 2007 to October 2014, for adults (mean age ≥ 40 years, mean BMI ≥ 30). Primary outcomes were measures of healthy diet or physical activity. Two reviewers rated study quality, coded the BCTs, and collected outcome results at short (≤6 months) and long term (≥12 months). Meta-analyses and meta-regressions were used to estimate effect sizes (ES), heterogeneity indices (I²) and regression coefficients. We included 48 studies containing a total of 82 outcome reports. The 32 long-term reports had an overall ES = 0.24 (95% confidence interval (CI): 0.15 to 0.33) and I² = 59.4%. The 50 short-term reports had an ES = 0.37 (95% CI: 0.26 to 0.48) and I² = 71.3%. The number of BCTs unique to the intervention group and the BCTs goal setting and self-monitoring of behaviour predicted the effect at short and long term. The total number of BCTs in both intervention arms and use of the BCTs goal setting of outcome, feedback on outcome of behaviour, implementing graded tasks, and adding objects to the environment, e.g. using a step counter, significantly predicted the effect at long term. Setting a goal for change and the presence of reporting bias independently explained 58.8% of inter-study variation at short term. Autonomy-supportive and person-centred methods as in Motivational Interviewing, the BCT goal setting of behaviour, and receiving feedback on the outcome of behaviour explained all of the between-study variation in effects at long term. There are similarities, but also differences, in effective BCTs promoting change in healthy eating and physical activity and

  12. A dynamic regression analysis tool for quantitative assessment of bacterial growth written in Python.

    Science.gov (United States)

    Hoeflinger, Jennifer L; Hoeflinger, Daniel E; Miller, Michael J

    2017-01-01

    Herein, an open-source method to generate quantitative bacterial growth data from high-throughput microplate assays is described. The bacterial lag time, maximum specific growth rate, doubling time and delta OD are reported. Our method was validated by carbohydrate utilization of lactobacilli, and visual inspection revealed 94% of regressions were deemed excellent. Copyright © 2016 Elsevier B.V. All rights reserved.
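
    A minimal sketch of this kind of analysis, assuming a modified Gompertz growth model and synthetic OD readings (the cited tool's actual implementation may differ):

      # Fit a modified Gompertz curve (Zwietering form) to microplate OD data to
      # recover delta OD, maximum specific growth rate, lag time and doubling time.
      import numpy as np
      from scipy.optimize import curve_fit

      def gompertz(t, A, mu, lam):
          # A: delta OD; mu: max growth rate (OD/h); lam: lag time (h)
          return A * np.exp(-np.exp(mu * np.e / A * (lam - t) + 1.0))

      t = np.linspace(0, 24, 97)                    # a reading every 15 min for 24 h
      rng = np.random.default_rng(2)
      od = gompertz(t, 1.2, 0.25, 4.0) + rng.normal(0, 0.01, t.size)

      (A, mu, lam), _ = curve_fit(gompertz, t, od, p0=[1.0, 0.2, 2.0])
      print(f"delta OD={A:.2f}, mu_max={mu:.3f}/h, lag={lam:.1f} h, "
            f"doubling time={np.log(2)/mu:.2f} h")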

  13. Clinical trials: odds ratios and multiple regression models--why and how to assess them

    NARCIS (Netherlands)

    Sobh, Mohamad; Cleophas, Ton J.; Hadj-Chaib, Amel; Zwinderman, Aeilko H.

    2008-01-01

    Odds ratios (ORs), unlike χ² tests, provide direct insight into the strength of the relationship between treatment modalities and treatment effects. Multiple regression models can reduce the data spread due to certain patient characteristics and thus improve the precision of the treatment

  14. NPP Krsko periodic safety review. Safety assessment and analyses

    International Nuclear Information System (INIS)

    Basic, I.; Spiler, J.; Thaulez, F.

    2002-01-01

    A PSR (Periodic Safety Review) is a comprehensive safety review of a plant after ten years of operation. The objective is to verify, by means of a comprehensive review using current methods, that the plant remains safe when judged against current safety objectives and practices, and that adequate arrangements are in place to maintain plant safety. The overall goals of the NEK PSR Program are defined in compliance with the basic role of a PSR and the current practice typical of most EU countries. This practice is described in the related guides and good-practice documents issued by international organizations. The overall goals of the NEK PSR are formulated as follows: to demonstrate that the plant is as safe as originally intended; to evaluate the actual plant status with respect to aging and wear-out, identifying any structures, systems or components that could limit the life of the plant in the foreseeable future, and to identify appropriate corrective actions where needed; and to compare the current level of safety with modern standards and knowledge, and to identify where improvements would be beneficial for minimizing deviations at justifiable costs. The Krsko PSR will address the following safety factors: Operational Experience, Safety Assessment, EQ and Aging Management, Safety Culture, Emergency Planning, Environmental Impact and Radioactive Waste. (author)

  15. Assessment of wastewater treatment facility compliance with decreasing ammonia discharge limits using a regression tree model.

    Science.gov (United States)

    Suchetana, Bihu; Rajagopalan, Balaji; Silverstein, JoAnn

    2017-11-15

    A regression tree-based diagnostic approach is developed to evaluate factors affecting US wastewater treatment plant compliance with ammonia discharge permit limits, using Discharge Monitoring Report (DMR) data from a sample of 106 municipal treatment plants for the period 2004-2008. Predictor variables used to fit the regression tree are selected using random forests and consist of the previous month's effluent ammonia, influent flow rates and plant capacity utilization. The tree models are first used to evaluate compliance with existing ammonia discharge standards at each facility and then applied assuming more stringent discharge limits, under consideration in many states. The model predicts that the ability to meet both current and future limits depends primarily on the previous month's treatment performance. With more stringent discharge limits, the predicted ammonia concentration relative to the discharge limit increases. In-sample validation shows that the regression trees can provide a median classification accuracy of >70%. The regression tree model is validated using ammonia discharge data from an operating wastewater treatment plant and is able to accurately predict the observed ammonia discharge category approximately 80% of the time, indicating that the regression tree model can be applied to predict compliance for individual treatment plants, providing practical guidance for utilities and regulators with an interest in controlling ammonia discharges. The proposed methodology is also used to demonstrate how to delineate reliable sources of demand and supply in a point source-to-point source nutrient credit trading scheme, as well as how planners and decision makers can set reasonable discharge limits in the future. Copyright © 2017 Elsevier B.V. All rights reserved.
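
    The two-stage approach described (random-forest variable selection, then a compact tree) could look roughly like the sketch below; predictors, thresholds and the exceedance rule are synthetic assumptions, not the paper's DMR data.

      # Rank candidate predictors with a random forest, then fit a shallow
      # classification tree to flag months at risk of exceeding an ammonia limit.
      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.tree import DecisionTreeClassifier
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(3)
      n = 500
      prev_nh3 = rng.gamma(2.0, 1.0, n)        # previous month's effluent NH3 (mg/L)
      inflow = rng.normal(1.0, 0.2, n)         # influent flow relative to design
      capacity = rng.uniform(0.4, 1.1, n)      # plant capacity utilization
      X = np.column_stack([prev_nh3, inflow, capacity])
      y = prev_nh3 * (0.5 + capacity) + rng.normal(0, 0.5, n) > 2.5   # exceedance

      rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
      print("importances:", rf.feature_importances_.round(2))
      tree = DecisionTreeClassifier(max_depth=3, random_state=0)
      print("CV accuracy:", cross_val_score(tree, X, y, cv=5).mean().round(2))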

  16. Assessing the performance of variational methods for mixed logistic regression models

    Czech Academy of Sciences Publication Activity Database

    Rijmen, F.; Vomlel, Jiří

    2008-01-01

    Roč. 78, č. 8 (2008), s. 765-779 ISSN 0094-9655 R&D Projects: GA MŠk 1M0572 Grant - others:GA MŠk(CZ) 2C06019 Institutional research plan: CEZ:AV0Z10750506 Keywords : Mixed models * Logistic regression * Variational methods * Lower bound approximation Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.353, year: 2008

  17. Linear and evolutionary polynomial regression models to forecast coastal dynamics: Comparison and reliability assessment

    Science.gov (United States)

    Bruno, Delia Evelina; Barca, Emanuele; Goncalves, Rodrigo Mikosz; de Araujo Queiroz, Heithor Alexandre; Berardi, Luigi; Passarella, Giuseppe

    2018-01-01

    In this paper, the Evolutionary Polynomial Regression data-modelling strategy has been applied to study small-scale, short-term coastal morphodynamics, given its capability for treating a wide database of known information non-linearly. Simple linear and multilinear regression models were also applied, to achieve a balance between the computational load and the reliability of estimations across the three models. In fact, even though it is easy to imagine that the more complex the model, the more the prediction improves, sometimes a "slight" worsening of estimations can be accepted in exchange for the time saved in data organization and computational load. The models' outcomes were validated through a detailed statistical error analysis, which revealed a slightly better estimation by the polynomial model with respect to the multilinear model, as expected. On the other hand, even though the data organization was identical for the two models, the multilinear one required a simpler simulation setting and a faster run time. Finally, the most reliable evolutionary polynomial regression model was used to explore how uncertainty increases as the extrapolation time of the estimation is extended. The overlapping rate between the confidence band of the mean of the known coast position and the prediction band of the estimated position can be a good index of the weakness in producing reliable estimations when the extrapolation time increases too much. The proposed models and tests have been applied to a coastal sector located near Torre Colimena in the Apulia region, south Italy.

  18. Assessing the human cardiovascular response to moderate exercise: feature extraction by support vector regression

    International Nuclear Information System (INIS)

    Wang, Lu; Su, Steven W; Celler, Branko G; Chan, Gregory S H; Cheng, Teddy M; Savkin, Andrey V

    2009-01-01

    This study aims to quantitatively describe the steady-state relationships among percentage changes in key central cardiovascular variables (i.e. stroke volume, heart rate (HR), total peripheral resistance and cardiac output), measured using non-invasive means, in response to moderate exercise, and the oxygen uptake rate (VO2), using a new nonlinear regression approach: support vector regression. Ten untrained normal males exercised in an upright position on an electronically braked cycle ergometer at constant workloads ranging from 25 W to 125 W. Throughout the experiment, VO2 was determined breath by breath and HR was monitored beat by beat. During the last minute of each exercise session, cardiac output was measured beat by beat using a novel non-invasive ultrasound-based device and blood pressure was measured using a tonometric measurement device. Based on the analysis of the experimental data, nonlinear steady-state relationships between the key central cardiovascular variables and VO2 were qualitatively observed, except for HR, which increased linearly as a function of increasing VO2. Quantitative descriptions of this complex nonlinear behaviour were provided by nonparametric models obtained using support vector regression
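
    As a toy version of the regression step, the sketch below fits an SVR to a synthetic saturating relationship between VO2 and the percentage change of one cardiovascular variable; the curve shape and constants are assumptions for illustration only.

      # Support vector regression of a nonlinear steady-state response on VO2.
      import numpy as np
      from sklearn.svm import SVR

      rng = np.random.default_rng(4)
      vo2 = np.sort(rng.uniform(0.5, 2.5, 80))                  # L/min, synthetic
      resp = 30 * (1 - np.exp(-1.5 * (vo2 - 0.5))) + rng.normal(0, 2, 80)  # % change

      svr = SVR(kernel="rbf", C=100.0, epsilon=1.0).fit(vo2.reshape(-1, 1), resp)
      grid = np.linspace(0.5, 2.5, 5).reshape(-1, 1)
      print(np.c_[grid, svr.predict(grid)].round(1))            # VO2 vs fitted response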

  19. Building vulnerability to hydro-geomorphic hazards: Estimating damage probability from qualitative vulnerability assessment using logistic regression

    Science.gov (United States)

    Ettinger, Susanne; Mounaud, Loïc; Magill, Christina; Yao-Lafourcade, Anne-Françoise; Thouret, Jean-Claude; Manville, Vern; Negulescu, Caterina; Zuccaro, Giulio; De Gregorio, Daniela; Nardone, Stefano; Uchuchoque, Juan Alexis Luque; Arguedas, Anita; Macedo, Luisa; Manrique Llerena, Nélida

    2016-10-01

    bivariate analyses were applied to better characterize each vulnerability parameter. Multiple correspondence analysis revealed strong relationships between the "Distance to channel or bridges", "Structural building type" and "Building footprint" parameters and the observed damage. Logistic regression enabled quantification of the contribution of each explanatory parameter to potential damage, and determination of the significant parameters that express the damage susceptibility of a building. The model was applied 200 times on different calibration and validation data sets in order to examine performance. Results show that 90% of these tests have a success rate of more than 67%. Probabilities (at building scale) of experiencing different damage levels during a future event similar to the 8 February 2013 flash flood are the major outcomes of this study.

  20. Magnetic resonance imaging for assessment of parametrial tumour spread and regression patterns in adaptive cervix cancer radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Schmid, Maximilian P.; Fidarova, Elena [Dept. of Radiotherapy, Comprehensive Cancer Center, Medical Univ. of Vienna, Vienna (Austria)], e-mail: maximilian.schmid@akhwien.at; Poetter, Richard [Dept. of Radiotherapy, Comprehensive Cancer Center, Medical Univ. of Vienna, Vienna (Austria); Christian Doppler Lab. for Medical Radiation Research for Radiation Oncology, Medical Univ. of Vienna (Austria)] [and others

    2013-10-15

    Purpose: To investigate the impact of magnetic resonance imaging (MRI)-morphologic differences in parametrial infiltration on tumour response during primary radiochemotherapy in cervical cancer. Material and methods: Eighty-five consecutive cervical cancer patients with FIGO stages IIB (n = 59) and IIIB (n = 26), treated by external beam radiotherapy (± chemotherapy) and image-guided adaptive brachytherapy, underwent T2-weighted MRI at the time of diagnosis and at the time of brachytherapy. MRI patterns of parametrial tumour infiltration at the time of diagnosis were assessed with regard to predominant morphology and maximum extent of parametrial tumour infiltration and were stratified into five tumour groups (TG): 1) expansive with spiculae; 2) expansive with spiculae and infiltrating parts; 3) infiltrative into the inner third of the parametrial space (PM); 4) infiltrative into the middle third of the PM; and 5) infiltrative into the outer third of the PM. MRI at the time of brachytherapy was used to identify the presence (residual vs. no residual disease) and signal intensity (high vs. intermediate) of residual disease within the PM. The left and right PM of each patient were evaluated separately at both time points. The impact of the TG on tumour remission status within the PM was analysed using the χ²-test and logistic regression analysis. Results: In total, 170 PM were analysed. TG 1, 2, 3, 4 and 5 were present in 12%, 11%, 35%, 25% and 12% of the cases, respectively. Five percent of the PM were tumour-free. Residual tumour in the PM was identified in 19%, 68%, 88%, 90% and 85% of the PM for TG 1, 2, 3, 4 and 5, respectively. TG 3-5 had significantly higher rates of residual tumour in the PM in comparison to TG 1 + 2 (88% vs. 43%, p < 0.01). Conclusion: MRI-morphologic features of PM infiltration appear to allow for prediction of tumour response during external beam radiotherapy and chemotherapy. A predominantly infiltrative tumour spread at the

  1. Meta-regression analyses to explain statistical heterogeneity in a systematic review of strategies for guideline implementation in primary health care.

    Directory of Open Access Journals (Sweden)

    Susanne Unverzagt

    This study is an in-depth analysis to explain statistical heterogeneity in a systematic review of implementation strategies to improve guideline adherence of primary care physicians in the treatment of patients with cardiovascular diseases. The systematic review included randomized controlled trials from a systematic search in MEDLINE, EMBASE, CENTRAL, conference proceedings and registers of ongoing studies. Implementation strategies were shown to be effective, with substantial heterogeneity of treatment effects across all investigated strategies. The primary aim of this study was to explain the different effects of eligible trials and to identify methodological and clinical effect modifiers. Random-effects meta-regression models were used to simultaneously assess the influence of multimodal implementation strategies and effect modifiers on physician adherence. Effect modifiers included the staff responsible for implementation, level of prevention, definition of the primary outcome, unit of randomization, duration of follow-up and risk of bias. Six clinical and methodological factors were investigated as potential effect modifiers of the efficacy of different implementation strategies on guideline adherence in primary care practices, on the basis of information from 75 eligible trials. Five effect modifiers were able to explain a substantial amount of statistical heterogeneity. Physician adherence was improved by 62% (95% confidence interval (CI) 29 to 104%) or 29% (95% CI 5 to 60%) in trials where other non-medical professionals or nurses were included in the implementation process. Improvement of physician adherence was more successful in primary and secondary prevention of cardiovascular diseases, by around 30% (30%, 95% CI -2 to 71%, and 31%, 95% CI 9 to 57%, respectively) compared to tertiary prevention. This study aimed to identify effect modifiers of implementation strategies on physician adherence. Especially the cooperation of different health

  2. Dual Regression

    OpenAIRE

    Spady, Richard; Stouli, Sami

    2012-01-01

    We propose dual regression as an alternative to the quantile regression process for the global estimation of conditional distribution functions under minimal assumptions. Dual regression provides all the interpretational power of the quantile regression process while avoiding the need for repairing the intersecting conditional quantile surfaces that quantile regression often produces in practice. Our approach introduces a mathematical programming characterization of conditional distribution functions...

  3. Exponential Decay Nonlinear Regression Analysis of Patient Survival Curves: Preliminary Assessment in Non-Small Cell Lung Cancer

    Science.gov (United States)

    Stewart, David J.; Behrens, Carmen; Roth, Jack; Wistuba, Ignacio I.

    2010-01-01

    Background For processes that follow first order kinetics, exponential decay nonlinear regression analysis (EDNRA) may delineate curve characteristics and suggest processes affecting curve shape. We conducted a preliminary feasibility assessment of EDNRA of patient survival curves. Methods EDNRA was performed on Kaplan-Meier overall survival (OS) and time-to-relapse (TTR) curves for 323 patients with resected NSCLC and on OS and progression-free survival (PFS) curves from selected publications. Results and Conclusions In our resected patients, TTR curves were triphasic with a “cured” fraction of 60.7% (half-life [t1/2] >100,000 months), a rapidly-relapsing group (7.4%, t1/2=5.9 months) and a slowly-relapsing group (31.9%, t1/2=23.6 months). OS was uniphasic (t1/2=74.3 months), suggesting an impact of co-morbidities; hence, tumor molecular characteristics would more likely predict TTR than OS. Of 172 published curves analyzed, 72 (42%) were uniphasic, 92 (53%) were biphasic, 8 (5%) were triphasic. With first-line chemotherapy in advanced NSCLC, 87.5% of curves from 2-3 drug regimens were uniphasic vs only 20% of those with best supportive care or 1 drug (p<0.001). 54% of curves from 2-3 drug regimens had convex rapid-decay phases vs 0% with fewer agents (p<0.001). Curve convexities suggest that discontinuing chemotherapy after 3-6 cycles “synchronizes” patient progression and death. With postoperative adjuvant chemotherapy, the PFS rapid-decay phase accounted for a smaller proportion of the population than in controls (p=0.02) with no significant difference in rapid-decay t1/2, suggesting adjuvant chemotherapy may move a subpopulation of patients with sensitive tumors from the relapsing group to the cured group, with minimal impact on time to relapse for a larger group of patients with resistant tumors. In untreated patients, the proportion of patients in the rapid-decay phase increased (p=0.04) while rapid-decay t1/2 decreased (p=0.0004) with increasing
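
    For intuition, a hedged sketch of the curve-fitting step: fit a biphasic exponential decay to a synthetic survival curve and convert the fitted rate constants into the fractions and half-lives reported above (all numbers invented).

      # Biphasic exponential decay: S(t) = f1*exp(-k1*t) + (1 - f1)*exp(-k2*t).
      import numpy as np
      from scipy.optimize import curve_fit

      def biphasic(t, f1, k1, k2):
          # f1: fast-decay fraction; 1 - f1: slow-decay fraction
          return f1 * np.exp(-k1 * t) + (1 - f1) * np.exp(-k2 * t)

      t = np.linspace(0, 120, 121)                  # follow-up time (months)
      rng = np.random.default_rng(5)
      surv = biphasic(t, 0.35, np.log(2) / 6.0, np.log(2) / 60.0)
      surv = np.clip(surv + rng.normal(0, 0.01, t.size), 0.0, 1.0)

      (f1, k1, k2), _ = curve_fit(biphasic, t, surv, p0=[0.5, 0.1, 0.01])
      print(f"fast: {f1:.0%}, t1/2 = {np.log(2)/k1:.1f} mo; "
            f"slow: {1 - f1:.0%}, t1/2 = {np.log(2)/k2:.1f} mo")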

  4. Cardiovascular risk from water arsenic exposure in Vietnam: Application of systematic review and meta-regression analysis in chemical health risk assessment.

    Science.gov (United States)

    Phung, Dung; Connell, Des; Rutherford, Shannon; Chu, Cordia

    2017-06-01

    A systematic review (SR) and meta-analysis cannot by themselves provide the endpoint answer for a chemical risk assessment (CRA). The objective of this study was to apply SR and meta-regression (MR) analysis to address this limitation, using a case study of cardiovascular risk from arsenic exposure in Vietnam. Published studies were searched in PubMed using the keywords arsenic exposure and cardiovascular diseases (CVD). Random-effects meta-regression was applied to model the linear relationship between arsenic concentration in water and risk of CVD, and the no-observable-adverse-effect level (NOAEL) was then identified from the regression function. The probabilistic risk assessment (PRA) technique was applied to characterize the risk of CVD due to arsenic exposure by estimating the overlapping coefficient between the dose-response and exposure distribution curves. The risks were evaluated for groundwater, treated water and drinking water. A total of 8 high-quality studies for dose-response and 12 studies for exposure data were included in the final analyses. The results of the MR suggested a NOAEL of 50 μg/L and a guideline of 5 μg/L for arsenic in water, values half the NOAEL and guidelines recommended by previous studies and authorities. The results of the PRA indicated that the proportion of the observed exposure with exceeding CVD risk was 52% for groundwater, 24% for treated water and 10% for drinking water in Vietnam. The study found that systematic review and meta-regression can be considered an ideal approach to chemical risk assessment, owing to their ability to answer the endpoint question of a CRA. Copyright © 2017 Elsevier Ltd. All rights reserved.
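
    A simplified Monte Carlo stand-in for the probabilistic step: draw from an exposure distribution and an effect-threshold (dose-response) distribution and report how often exposure exceeds the threshold. The lognormal parameters below are hypothetical, not the Vietnamese survey values.

      # Probability that arsenic exposure exceeds a variable effect threshold.
      import numpy as np

      rng = np.random.default_rng(6)
      n = 100_000
      exposure = rng.lognormal(np.log(30), 1.0, n)    # As in water (ug/L), assumed
      threshold = rng.lognormal(np.log(50), 0.5, n)   # NOAEL-like threshold (ug/L)

      print(f"fraction of exposures with exceeding risk: "
            f"{np.mean(exposure > threshold):.0%}")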

  5. Assessing relationships among properties of demolished concrete, recycled aggregate and recycled aggregate concrete using regression analysis.

    Science.gov (United States)

    Tam, Vivian W Y; Wang, K; Tam, C M

    2008-04-01

    Recycled demolished concrete (DC) as recycled aggregate (RA) and recycled aggregate concrete (RAC) is generally suitable for most construction applications. Low-grade applications, including sub-base and roadwork, have been implemented in many countries; however, higher-grade applications are rarely considered. This paper examines the relationships among DC characteristics, the properties of the resulting RA and the strength of the resulting RAC using regression analysis. Ten samples collected from demolition sites are examined. The results show strong correlations among the DC samples and the properties of the RA and RAC. It should be highlighted that inferior quality of DC will lower the quality of the RA and thus of the RAC. A prediction of RAC strength is also formulated from the DC characteristics and the RA properties, from which the RAC performance achievable from given DC and RA can be estimated. In addition, RAC design requirements can be developed at the initial stage of concrete demolition. Recommendations are also given to improve future concreting practice.

  6. Susceptibility assessment of earthquake-triggered landslides in El Salvador using logistic regression

    Science.gov (United States)

    García-Rodríguez, M. J.; Malpica, J. A.; Benito, B.; Díaz, M.

    2008-03-01

    This work has evaluated the probability of earthquake-triggered landslide occurrence in the whole of El Salvador, with a Geographic Information System (GIS) and a logistic regression model. Slope gradient, elevation, aspect, mean annual precipitation, lithology, land use, and terrain roughness are the predictor variables used to determine the dependent variable of occurrence or non-occurrence of landslides within an individual grid cell. The results illustrate the importance of terrain roughness and soil type as key factors within the model — using only these two variables the analysis returned a significance level of 89.4%. The results obtained from the model within the GIS were then used to produce a map of relative landslide susceptibility.

  7. Malignancy Risk Assessment in Patients with Thyroid Nodules Using Classification and Regression Trees

    Directory of Open Access Journals (Sweden)

    Shokouh Taghipour Zahir

    2013-01-01

    Purpose. We sought to investigate the utility of a classification and regression tree (CART) classifier to differentiate benign from malignant nodules in patients referred for thyroid surgery. Methods. Clinical and demographic data of 271 patients referred to the Sadoughi Hospital during 2006-2011 were collected. In a two-step approach, a CART classifier was employed to differentiate patients with a high versus low risk of thyroid malignancy. The first step served as the screening procedure and was tailored to produce as few false negatives as possible. The second step identified those with the lowest risk of malignancy, chosen from a high-risk population. Sensitivity, specificity, and positive and negative predictive values (PPV and NPV) of the optimal tree were calculated. Results. In the first step, age, sex, and nodule size contributed to the optimal tree. Ultrasonographic features were employed in the second step, with hypoechogenicity and/or microcalcifications yielding the highest discriminatory ability. The combined tree produced a sensitivity and specificity of 80.0% (95% CI: 29.9-98.9) and 94.1% (95% CI: 78.9-99.0), respectively. NPV and PPV were 66.7% (95% CI: 41.1-85.6) and 97.0% (95% CI: 82.5-99.8), respectively. Conclusion. The CART classifier reliably identifies patients with a low risk of malignancy who can avoid unnecessary surgery.

  8. Landslide Fissure Inference Assessment by ANFIS and Logistic Regression Using UAS-Based Photogrammetry

    Directory of Open Access Journals (Sweden)

    Ozgun Akcay

    2015-10-01

    Unmanned Aerial Systems (UAS) are now capable of gathering high-resolution data, and landslides can therefore be explored in detail at larger scales. In this research, 132 aerial photographs were captured, and 85,456 features were detected and matched automatically using UAS photogrammetry. The root mean square (RMS) values of the image coordinates of the Ground Control Points (GCPs) varied from 0.521 to 2.293 pixels, whereas the maximum RMS value of the automatically matched features was calculated as 2.921 pixels. Using the 3D point cloud acquired by aerial photogrammetry, raster datasets of the aspect, the slope, and maximally stable extremal regions (MSER), which detect visual uniformity, were defined as three variables for inferring fissure structures on the landslide surface. In this research, an Adaptive Neuro-Fuzzy Inference System (ANFIS) and Logistic Regression (LR) were implemented using training datasets to infer fissure data appropriately. The accuracy of the predictive models was evaluated by drawing receiver operating characteristic (ROC) curves and by calculating the area under the ROC curve (AUC). The experiments showed that high-resolution imagery is an indispensable data source for modelling and validating landslide fissures appropriately.

  9. Assessing and correcting for regression toward the mean in deviance-induced social conformity.

    Science.gov (United States)

    Schnuerch, Robert; Schnuerch, Martin; Gibbons, Henning

    2015-01-01

    Our understanding of the mechanisms underlying social conformity has recently advanced due to the employment of neuroscience methodology and novel experimental approaches. Most prominently, several studies have demonstrated the role of neural reinforcement-learning processes in conformal adjustments, using a specifically designed and frequently replicated paradigm. Only very recently, the validity of the critical behavioral effect in this very paradigm was seriously questioned, as it invites the unwanted contribution of regression toward the mean. Using a straightforward control-group design, we corroborate this recent finding and demonstrate the involvement of statistical distortions. Additionally, however, we provide conclusive evidence that the paradigm nevertheless captures behavioral effects that can only be attributed to social influence. Finally, we present a mathematical approach that makes it possible to isolate and quantify the paradigm's true conformity effect, both at the group level and for each individual participant. These data, as well as relevant theoretical considerations, suggest that the groundbreaking findings regarding the brain mechanisms of social conformity obtained with this recently criticized paradigm were indeed valid. Moreover, we support earlier suggestions that distorted behavioral effects can be rectified by means of appropriate correction procedures.

  10. Development of a Watershed-Scale Long-Term Hydrologic Impact Assessment Model with the Asymptotic Curve Number Regression Equation

    Directory of Open Access Journals (Sweden)

    Jichul Ryu

    2016-04-01

    In this study, 52 asymptotic Curve Number (CN) regression equations were developed for combinations of representative land covers and hydrologic soil groups. In addition, to overcome the limitations of the original Long-Term Hydrologic Impact Assessment (L-THIA) model when applied to larger watersheds, a watershed-scale L-THIA Asymptotic CN (ACN) regression equation model (the watershed-scale L-THIA ACN model) was developed by integrating the asymptotic CN regressions with modules for direct runoff, baseflow and channel routing. The watershed-scale L-THIA ACN model was applied to four watersheds in South Korea to evaluate the accuracy of its streamflow predictions. The coefficient of determination (R2) and Nash-Sutcliffe Efficiency (NSE) values for observed versus simulated streamflows over intervals of eight days were greater than 0.6 for all four watersheds. The watershed-scale L-THIA ACN model, including the asymptotic CN regression equation method, can simulate long-term streamflow sufficiently well with the ten parameters added for the characterization of streamflow.
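
    The building block of those 52 equations is the asymptotic CN relation; a sketch of fitting it to event data follows (rainfall values and parameters are synthetic):

      # Fit CN(P) = CN_inf + (100 - CN_inf) * exp(-k * P) to event-scale CN values.
      import numpy as np
      from scipy.optimize import curve_fit

      def asymptotic_cn(p, cn_inf, k):
          return cn_inf + (100.0 - cn_inf) * np.exp(-k * p)

      rng = np.random.default_rng(7)
      rain = np.sort(rng.uniform(5, 120, 40))              # event rainfall (mm)
      cn_obs = asymptotic_cn(rain, 72.0, 0.04) + rng.normal(0, 1.0, rain.size)

      (cn_inf, k), _ = curve_fit(asymptotic_cn, rain, cn_obs, p0=[70.0, 0.05])
      print(f"CN_inf = {cn_inf:.1f}, k = {k:.3f} 1/mm")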

  11. Assessing the health status of farmed mussels (Mytilus galloprovincialis) through histological, microbiological and biomarker analyses.

    Science.gov (United States)

    Matozzo, Valerio; Ercolini, Carlo; Serracca, Laura; Battistini, Roberta; Rossini, Irene; Granato, Giulia; Quaglieri, Elisabetta; Perolo, Alberto; Finos, Livio; Arcangeli, Giuseppe; Bertotto, Daniela; Radaelli, Giuseppe; Chollet, Bruno; Arzul, Isabelle; Quaglio, Francesco

    2018-03-01

    The Gulf of La Spezia (northern Tyrrhenian Sea, Italy) is a commercially important area both as a shipping port and for mussel farming. Recently, there has been increased concern over environmental disturbances caused by anthropogenic activities such as ship traffic and dredging, and the effects they have on the health of farmed mussels. This paper reports the results of microbiological and histological analyses, as well as measurements of several biomarkers, performed to assess the health status of mussels (Mytilus galloprovincialis) from four rearing sites in the Gulf of La Spezia. Mussels were collected between October 2015 and September 2016, and histological analyses (including gonadal maturation stage), as well as checks for the presence of pathogenic bacteria (Vibrio splendidus clade, V. aestuarianus and V. harveyi), viruses (Herpes virus and ostreid Herpes virus 1) and protozoa (Marteilia spp., in the summer season only), were carried out on a monthly basis. Biomarker responses in haemocytes/haemolymph (total haemocyte count, haemocyte diameter and volume, lysozyme and lactate dehydrogenase activities in cell-free haemolymph, and micronuclei frequency) and in gills and digestive gland (cortisol-like steroid and lipid peroxidation levels) were evaluated bimonthly. Microbiological data indicated that mussels harbour a reservoir of potentially pathogenic bacteria, viruses and protozoa that in certain environmental conditions may weaken the immune system of the animals, leading to mortality episodes. The percentage of parasites detected in the mussels was generally low (9.6% for Steinhausia mytilovum, that is, 17 of 177 examined females; 3.4% for Proctoeces maculatus; 0.9% for Mytilicola intestinalis and 2% for ciliated protozoa), while symbiont loads were higher (31% for Eugymnanthea inquilina and Urastoma cyprinae). Interestingly, a previously undescribed haplosporidian was detected in a single mussel sample (0.2%) and was

  12. Consequences of kriging and land use regression for PM2.5 predictions in epidemiologic analyses: insights into spatial variability using high-resolution satellite data.

    Science.gov (United States)

    Alexeeff, Stacey E; Schwartz, Joel; Kloog, Itai; Chudnovsky, Alexandra; Koutrakis, Petros; Coull, Brent A

    2015-01-01

    Many epidemiological studies use predicted air pollution exposures as surrogates for true air pollution levels. These predicted exposures contain exposure measurement error, yet simulation studies have typically found negligible bias in resulting health effect estimates. However, previous studies typically assumed a statistical spatial model for air pollution exposure, which may be oversimplified. We address this shortcoming by assuming a realistic, complex exposure surface derived from fine-scale (1 km × 1 km) remote-sensing satellite data. Using simulation, we evaluate the accuracy of epidemiological health effect estimates in linear and logistic regression when using spatial air pollution predictions from kriging and land use regression models. We examined chronic (long-term) and acute (short-term) exposure to air pollution. Results varied substantially across different scenarios. Exposure models with low out-of-sample R2 yielded severe biases in the health effect estimates of some models, ranging from 60% upward bias to 70% downward bias. One land use regression exposure model with >0.9 out-of-sample R2 yielded upward biases up to 13% for acute health effect estimates. Almost all models drastically underestimated the SEs. Land use regression models performed better in chronic effect simulations. These results can help researchers when interpreting health effect estimates in these types of studies.

  13. Assessing variation in life-history tactics within a population using mixture regression models: a practical guide for evolutionary ecologists.

    Science.gov (United States)

    Hamel, Sandra; Yoccoz, Nigel G; Gaillard, Jean-Michel

    2017-05-01

    Mixed models are now well-established methods in ecology and evolution because they allow accounting for and quantifying within- and between-individual variation. However, the required normal distribution of the random effects can often be violated by the presence of clusters among subjects, which leads to multi-modal distributions. In such cases, using what are known as mixture regression models might offer a more appropriate approach. These models are widely used in psychology, sociology, and medicine to describe the diversity of trajectories occurring within a population over time (e.g. psychological development, growth). In ecology and evolution, however, these models are seldom used even though understanding changes in individual trajectories is an active area of research in life-history studies. Our aim is to demonstrate the value of using mixture models to describe variation in individual life-history tactics within a population, and hence to promote the use of these models by ecologists and evolutionary ecologists. We first ran a set of simulations to determine whether and when a mixture model allows teasing apart latent clustering, and to contrast the precision and accuracy of estimates obtained from mixture models versus mixed models under a wide range of ecological contexts. We then used empirical data from long-term studies of large mammals to illustrate the potential of using mixture models for assessing within-population variation in life-history tactics. Mixture models performed well in most cases, except for variables following a Bernoulli distribution and when sample size was small. The four selection criteria we evaluated (Akaike information criterion (AIC), Bayesian information criterion (BIC), and two bootstrap methods) performed similarly well, selecting the right number of clusters in most ecological situations. We then showed that the normality of random effects implicitly assumed by evolutionary ecologists when using mixed models was often violated.

  14. Assessing the reliability of the borderline regression method as a standard setting procedure for objective structured clinical examination

    Directory of Open Access Journals (Sweden)

    Sara Mortaz Hejri

    2013-01-01

    Background: One of the methods used for standard setting is the borderline regression method (BRM). This study aims to assess the reliability of the BRM when the pass-fail standard in an objective structured clinical examination (OSCE) is calculated by averaging the BRM standards obtained for each station separately. Materials and Methods: In nine stations of the OSCE with direct observation, the examiners gave each student a checklist score and a global score. Using a linear regression model for each station, we calculated the checklist score cut-off on the regression equation for the global scale cut-off set at 2. The OSCE pass-fail standard was defined as the average of all stations' standards. To determine the reliability, the root mean square error (RMSE) was calculated. The R2 coefficient and the inter-grade discrimination were calculated to assess the quality of the OSCE. Results: The mean total test score was 60.78. The OSCE pass-fail standard and its RMSE were 47.37 and 0.55, respectively. The R2 coefficients ranged from 0.44 to 0.79. The inter-grade discrimination score varied greatly among stations. Conclusion: The RMSE of the standard was very small, indicating that the BRM is a reliable method of setting the standard for an OSCE, with the advantage of providing data for quality assurance.
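
    The station-level computation is simple enough to sketch directly (the scores below are simulated, and the borderline global grade is set at 2 as in the study):

      # Borderline regression for one OSCE station: regress checklist scores on
      # global ratings and read the pass mark off the line at the borderline grade.
      import numpy as np

      rng = np.random.default_rng(8)
      global_rating = rng.integers(1, 6, 200)                     # 1 (fail) ... 5 (excellent)
      checklist = 10 + 8 * global_rating + rng.normal(0, 4, 200)  # station checklist scores

      slope, intercept = np.polyfit(global_rating, checklist, 1)
      station_cutoff = slope * 2 + intercept      # checklist cut-off at global grade 2
      print(f"station pass mark: {station_cutoff:.1f}")
      # The OSCE standard is then the average of the per-station cut-offs.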

  15. Assessment of Genetic Heterogeneity in Structured Plant Populations Using Multivariate Whole-Genome Regression Models.

    Science.gov (United States)

    Lehermeier, Christina; Schön, Chris-Carolin; de Los Campos, Gustavo

    2015-09-01

    Plant breeding populations exhibit varying levels of structure and admixture; these features are likely to induce heterogeneity of marker effects across subpopulations. Traditionally, structure has been dealt with as a potential confounder, and various methods exist to "correct" for population stratification. However, these methods induce a mean correction that does not account for heterogeneity of marker effects. The animal breeding literature offers a few recent studies that consider modeling genetic heterogeneity in multibreed data, using multivariate models. However, these methods have received little attention in plant breeding where population structure can have different forms. In this article we address the problem of analyzing data from heterogeneous plant breeding populations, using three approaches: (a) a model that ignores population structure [A-genome-based best linear unbiased prediction (A-GBLUP)], (b) a stratified (i.e., within-group) analysis (W-GBLUP), and (c) a multivariate approach that uses multigroup data and accounts for heterogeneity (MG-GBLUP). The performance of the three models was assessed on three different data sets: a diversity panel of rice (Oryza sativa), a maize (Zea mays L.) half-sib panel, and a wheat (Triticum aestivum L.) data set that originated from plant breeding programs. The estimated genomic correlations between subpopulations varied from null to moderate, depending on the genetic distance between subpopulations and traits. Our assessment of prediction accuracy features cases where ignoring population structure leads to a parsimonious more powerful model as well as others where the multivariate and stratified approaches have higher predictive power. In general, the multivariate approach appeared slightly more robust than either the A- or the W-GBLUP. Copyright © 2015 by the Genetics Society of America.

  16. Hip fracture risk assessment: artificial neural network outperforms conditional logistic regression in an age- and sex-matched case control study.

    Science.gov (United States)

    Tseng, Wo-Jan; Hung, Li-Wei; Shieh, Jiann-Shing; Abbod, Maysam F; Lin, Jinn

    2013-07-15

    Osteoporotic hip fractures, with significant morbidity and excess mortality among the elderly, have imposed huge health and economic burdens on societies worldwide. In this age- and sex-matched case control study, we examined the risk factors of hip fractures and assessed the fracture risk by conditional logistic regression (CLR) and an ensemble artificial neural network (ANN). The performances of these two classifiers were compared. The study population consisted of 217 pairs (149 women and 68 men) of fractures and controls with an age older than 60 years. All the participants were interviewed with the same standardized questionnaire, including questions on 66 risk factors in 12 categories. Univariate CLR analysis was initially conducted to examine the unadjusted odds ratios of all potential risk factors. The significant risk factors were then tested by multivariate analyses. For fracture risk assessment, the participants were randomly divided into modeling and testing datasets for 10-fold cross-validation analyses. The predictive models built by CLR and ANN on the modeling datasets were applied to the testing datasets for generalization study. The performances, including discrimination and calibration, were compared with non-parametric Wilcoxon tests. In univariate CLR analyses, 16 variables reached significance, and six of them remained significant in multivariate analyses: low T score, low BMI, low MMSE score, milk intake, walking difficulty, and a significant fall at home. For discrimination, ANN outperformed CLR in both the 16- and 6-variable analyses in the modeling and testing datasets (p < 0.05). The risk factors of hip fracture are more personal than environmental. With adequate model construction, ANN may outperform CLR in both discrimination and calibration. ANN seems not to have been developed to its full potential, and efforts should be made to improve its performance.

  17. Assessment of the expected construction company’s net profit using neural network and multiple regression models

    Directory of Open Access Journals (Sweden)

    H.H. Mohamad

    2013-09-01

    This research aims to develop a mathematical model for assessing the expected net profit of any construction company. To achieve the research objective, four steps were performed. First, the main factors affecting firms' net profit were identified. Second, pertinent data regarding the net profit factors were collected. Third, two different net profit models were developed using the Multiple Regression (MR) and Neural Network (NN) techniques. The validity of the proposed models was also investigated. Finally, the results of the MR and NN models were compared to investigate the predictive capabilities of the two models.

  18. Problems of method of technology assessment. A methodological analysis; Methodenprobleme des Technology Assessment; Eine methodologische Analyse

    Energy Technology Data Exchange (ETDEWEB)

    Zimmermann, V

    1993-03-01

    The study undertakes to analyse the theoretical and methodological structure of Technology Assessment (TA). It is based on a survey of TA studies, which provided an important precondition for theoretically sound statements on methodological aspects of TA. It was established that the main basic theoretical problems of TA lie in the field of dealing with complexity. This is also apparent in the constitution of problems, the most elementary and central approach of TA. Scientifically founded constitution of problems and the corresponding construction of models call for interdisciplinary scientific work. Interdisciplinarity in the TA research process is achieved at the level of virtual networks, these networks being composed of individuals suited to teamwork. The emerging network structures have an objective-organizational and an ideational basis. The objective-organizational basis is mainly the result of team composition and the external affiliations of the team members. The ideational basis of the virtual network is represented by the team members' mode of thinking, which is individually located at a multidisciplinary level. The theoretical 'skeleton' of the TA knowledge system, which is represented by linkage structures based on process knowledge, can be generated and processed in connection with knowledge on types of problems, areas of analysis and procedures for dealing with complexity. Within this process, disciplinary knowledge is a necessary but not a sufficient condition. Metatheoretical and metadisciplinary knowledge, and the correspondingly processed complexity of models, are the basis for the necessary methodological awareness that allows TA to become designable as a research procedure. (orig./HP)

  19. Assessment of susceptibility to earth-flow landslide using logistic regression and multivariate adaptive regression splines: A case of the Belice River basin (western Sicily, Italy)

    Science.gov (United States)

    Conoscenti, Christian; Ciaccio, Marilena; Caraballo-Arias, Nathalie Almaru; Gómez-Gutiérrez, Álvaro; Rotigliano, Edoardo; Agnesi, Valerio

    2015-08-01

    In this paper, terrain susceptibility to earth-flow occurrence was evaluated by using geographic information systems (GIS) and two statistical methods: logistic regression (LR) and multivariate adaptive regression splines (MARS). LR has already been demonstrated to provide reliable predictions of earth-flow occurrence, whereas MARS, as far as we know, has never been used to generate earth-flow susceptibility models. The experiment was carried out in a basin of western Sicily (Italy), which extends for 51 km2 and is severely affected by earth-flows. In total, we mapped 1376 earth-flows, covering an area of 4.59 km2. To explore the effect of pre-failure topography on earth-flow spatial distribution, we performed a reconstruction of the topography before landslide occurrence. This was achieved by preparing a digital terrain model (DTM) in which the altitude of areas hosting landslides was interpolated from the adjacent undisturbed land surface using the topo-to-raster algorithm. This DTM was exploited to extract 15 morphological and hydrological variables that, in addition to outcropping lithology, were employed as explanatory variables of earth-flow spatial distribution. The predictive skill of the earth-flow susceptibility models and the robustness of the procedure were tested by preparing five datasets, each including a different subset of landslides and stable areas. The accuracy of the predictive models was evaluated by drawing receiver operating characteristic (ROC) curves and by calculating the area under the ROC curve (AUC). The results demonstrate that the overall accuracy of the LR and MARS earth-flow susceptibility models is from excellent to outstanding. However, AUC values of the validation datasets attest to a higher predictive power of the MARS models (AUC between 0.881 and 0.912) with respect to the LR models (AUC between 0.823 and 0.870). The adopted procedure proved to be resistant to overfitting and stable when the learning and validation samples are changed.
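
    A stripped-down stand-in for the susceptibility modelling and its ROC/AUC validation is sketched below with plain logistic regression on synthetic grid cells (MARS itself would need an extra package, so it is omitted here); predictors and coefficients are invented.

      # Logistic-regression susceptibility model scored by AUC on held-out cells.
      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import train_test_split
      from sklearn.metrics import roc_auc_score

      rng = np.random.default_rng(9)
      n = 2000
      slope = rng.uniform(0, 40, n)            # slope gradient (degrees), synthetic
      rough = rng.uniform(0, 1, n)             # terrain roughness index, synthetic
      X = np.column_stack([slope, rough])
      p = 1 / (1 + np.exp(-(0.08 * slope + 2.0 * rough - 3.0)))
      y = rng.random(n) < p                    # 1 = earth-flow presence

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
      model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
      print("AUC:", roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]).round(3))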

  20. Assessing the Multidimensional Relationship Between Medication Beliefs and Adherence in Older Adults With Hypertension Using Polynomial Regression.

    Science.gov (United States)

    Dillon, Paul; Phillips, L Alison; Gallagher, Paul; Smith, Susan M; Stewart, Derek; Cousins, Gráinne

    2018-02-05

    The Necessity-Concerns Framework (NCF) is a multidimensional theory describing the relationship between patients' positive and negative evaluations of their medication, which interplay to influence adherence. Most studies evaluating the NCF have failed to account for the multidimensional nature of the theory, collapsing the separate dimensions of medication "necessity beliefs" and "concerns" onto a single dimension (e.g., the Beliefs about Medicines Questionnaire difference-score model). To assess the multidimensional effect of patient medication beliefs (concerns and necessity beliefs) on medication adherence using polynomial regression with response surface analysis. Community-dwelling older adults >65 years (n = 1,211) presenting their own prescription for antihypertensive medication to 106 community pharmacies in the Republic of Ireland rated their concerns and necessity beliefs about antihypertensive medications at baseline and their adherence to antihypertensive medication at 12 months via structured telephone interview. Confirmatory polynomial regression found the difference-score model to be inaccurate; subsequent exploratory analysis identified a quadratic model as the best-fitting polynomial model. Adherence was lowest among those with strong medication concerns and weak necessity beliefs, and adherence was greatest for those with weak concerns and strong necessity beliefs (slope β = -0.77, p < .001). Those with simultaneously high concerns and necessity beliefs had lower adherence than those with simultaneously low concerns and necessity beliefs (slope β = -0.36, p = .004; curvature β = -0.25, p = .003). The difference-score model fails to account for such potential nonreciprocal effects. Results extend evidence supporting the use of polynomial regression to assess the multidimensional effect of medication beliefs on adherence.
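
    The quadratic model referred to above regresses adherence on necessity beliefs (N), concerns (C) and their second-order terms N^2, N*C and C^2, from which response surfaces are drawn. A hedged sketch on simulated, hypothetical scores:

      # Quadratic polynomial regression for response surface analysis.
      import numpy as np
      from sklearn.preprocessing import PolynomialFeatures
      from sklearn.linear_model import LinearRegression

      rng = np.random.default_rng(10)
      n = 1200
      necessity = rng.normal(0, 1, n)          # centred necessity-belief scores
      concerns = rng.normal(0, 1, n)           # centred concerns scores
      adherence = (0.4 * necessity - 0.4 * concerns
                   - 0.1 * necessity * concerns + rng.normal(0, 0.5, n))

      X = PolynomialFeatures(degree=2, include_bias=False).fit_transform(
          np.column_stack([necessity, concerns]))   # columns: N, C, N^2, N*C, C^2
      fit = LinearRegression().fit(X, adherence)
      print("coefficients [N, C, N^2, NC, C^2]:", fit.coef_.round(2))
      # Slopes and curvatures along the N = C and N = -C lines of the response
      # surface follow from simple sums and differences of these coefficients.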

  1. Regression Phalanxes

    OpenAIRE

    Zhang, Hongyang; Welch, William J.; Zamar, Ruben H.

    2017-01-01

    Tomal et al. (2015) introduced the notion of "phalanxes" in the context of rare-class detection in two-class classification problems. A phalanx is a subset of features that work well for classification tasks. In this paper, we propose a different class of phalanxes for application in regression settings. We define a "Regression Phalanx" - a subset of features that work well together for prediction. We propose a novel algorithm which automatically chooses Regression Phalanxes from high-dimensi...

  2. Univariate and multiple linear regression analyses for 23 single nucleotide polymorphisms in 14 genes predisposing to chronic glomerular diseases and IgA nephropathy in Han Chinese.

    Science.gov (United States)

    Wang, Hui; Sui, Weiguo; Xue, Wen; Wu, Junyong; Chen, Jiejing; Dai, Yong

    2014-09-01

    Immunoglobulin A nephropathy (IgAN) is a complex trait regulated by the interaction among multiple physiologic regulatory systems and probably involving numerous genes, which leads to inconsistent findings in genetic studies. One possible reason for the failure to replicate some single-locus results is that the underlying genetics of IgAN is based on multiple genes with minor effects. To examine the association between 23 single nucleotide polymorphisms (SNPs) in 14 genes predisposing to chronic glomerular diseases and IgAN in Han males, the genotypes of the 23 SNPs in 21 Han males were determined with a BaiO gene chip, and their associations with IgAN were analyzed using univariate analysis and multiple linear regression analysis. The analysis showed that CTLA4 rs231726 and CR2 rs1048971 had a significant association with IgAN. These findings support the multi-gene nature of the etiology of IgAN and propose a potential gene-gene interactive model for future studies.
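
    A minimal sketch of the two-step workflow described here, univariate screening followed by a joint multiple linear regression, using statsmodels on synthetic genotypes; the screening rule and all data are illustrative assumptions rather than the study's protocol:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)

# Synthetic genotypes (0/1/2 minor-allele counts) for 23 SNPs in 21
# subjects, plus a quantitative phenotype; all values are illustrative.
n_subjects, n_snps = 21, 23
G = rng.integers(0, 3, size=(n_subjects, n_snps)).astype(float)
phenotype = 0.8 * G[:, 0] - 0.6 * G[:, 5] + rng.normal(size=n_subjects)

# Step 1: univariate screen, one simple regression per SNP.
pvals = np.array([sm.OLS(phenotype, sm.add_constant(G[:, j])).fit().pvalues[1]
                  for j in range(n_snps)])
keep = np.argsort(pvals)[:5]      # retain the strongest univariate signals

# Step 2: multiple linear regression on the retained SNPs jointly.
joint = sm.OLS(phenotype, sm.add_constant(G[:, keep])).fit()
print("retained SNP columns:", keep)
print(joint.params)
```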

  3. Implications of Interactions among Society, Education and Technology: A Comparison of Multiple Linear Regression and Multilevel Modeling in Mathematics Achievement Analyses

    Science.gov (United States)

    Deering, Pamela Rose

    2014-01-01

    This research compares and contrasts two approaches to predictive analysis of three years of school district data to investigate relationships between student and teacher characteristics and math achievement as measured by the state-mandated Maryland School Assessment mathematics exam. The sample for the study consisted of 3,514 students taught…
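
    To make the contrast concrete, here is a minimal sketch comparing a single-level regression with a random-intercept multilevel model in statsmodels; the school/student structure and all numbers are synthetic assumptions, not the Maryland data:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)

# Synthetic students nested in schools: scores depend on a student-level
# covariate plus a school-level random intercept (names illustrative).
n_schools, per_school = 40, 30
school = np.repeat(np.arange(n_schools), per_school)
ses = rng.normal(size=n_schools * per_school)
score = (50 + 3.0 * ses + rng.normal(scale=2.0, size=n_schools)[school]
         + rng.normal(scale=5.0, size=ses.size))
df = pd.DataFrame({"score": score, "ses": ses, "school": school})

# Single-level multiple linear regression ignores the nesting.
ols = smf.ols("score ~ ses", data=df).fit()

# The multilevel model adds a random intercept for each school.
mlm = smf.mixedlm("score ~ ses", data=df, groups=df["school"]).fit()

print("OLS slope:", ols.params["ses"], "SE:", ols.bse["ses"])
print("MLM slope:", mlm.params["ses"], "SE:", mlm.bse["ses"])
```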

  4. Basic Diagnosis and Prediction of Persistent Contrail Occurrence using High-resolution Numerical Weather Analyses/Forecasts and Logistic Regression. Part II: Evaluation of Sample Models

    Science.gov (United States)

    Duda, David P.; Minnis, Patrick

    2009-01-01

    Previous studies have shown that probabilistic forecasting may be a useful method for predicting persistent contrail formation. Probabilistic forecasts of contrail formation over the contiguous United States (CONUS) are created using hourly meteorological analyses from the Advanced Regional Prediction System (ARPS) and the Rapid Update Cycle (RUC), together with GOES water vapor channel measurements and surface and satellite observations of contrails. Two groups of logistic models were created. The first group of models (SURFACE models) is based on surface-based contrail observations supplemented with satellite observations of contrail occurrence. The second group of models (OUTBREAK models) is derived from a selected subgroup of satellite-based observations of widespread persistent contrails. The mean accuracies for both the SURFACE and OUTBREAK models typically exceeded 75 percent when based on the RUC or ARPS analysis data, but decreased when the logistic models were derived from ARPS forecast data.

  5. Propensity-score matching in economic analyses: comparison with regression models, instrumental variables, residual inclusion, differences-in-differences, and decomposition methods.

    Science.gov (United States)

    Crown, William H

    2014-02-01

    This paper examines the use of propensity score matching in economic analyses of observational data. Several excellent papers have previously reviewed practical aspects of propensity score estimation and other aspects of the propensity score literature. The purpose of this paper is to compare the conceptual foundation of propensity score models with alternative estimators of treatment effects. References are provided to empirical comparisons among methods that have appeared in the literature. These comparisons are available for a subset of the methods considered in this paper. However, in some cases, no pairwise comparisons of particular methods are yet available, and there are no examples of comparisons across all of the methods surveyed here. Irrespective of the availability of empirical comparisons, the goal of this paper is to provide some intuition about the relative merits of alternative estimators in health economic evaluations where nonlinearity, sample size, availability of pre/post data, heterogeneity, and missing variables can have important implications for choice of methodology. Also considered is the potential combination of propensity score matching with alternative methods such as differences-in-differences and decomposition methods that have not yet appeared in the empirical literature.
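
    As a concrete companion to this conceptual comparison, here is a minimal sketch of the first estimator discussed, propensity-score matching, on synthetic observational data; the confounder structure, effect size, and one-nearest-neighbour matching rule are illustrative assumptions:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(4)

# Synthetic observational data: confounders X drive both treatment
# assignment and outcome, so the naive mean difference is biased.
n = 4000
X = rng.normal(size=(n, 3))
p_treat = 1.0 / (1.0 + np.exp(-(X @ np.array([0.8, -0.5, 0.3]))))
t = rng.binomial(1, p_treat)
y = 2.0 * t + X @ np.array([1.0, 1.0, -1.0]) + rng.normal(size=n)

# Step 1: estimate propensity scores with logistic regression.
ps = LogisticRegression(max_iter=1000).fit(X, t).predict_proba(X)[:, 1]

# Step 2: match each treated unit to its nearest control on the score.
treated, control = np.where(t == 1)[0], np.where(t == 0)[0]
nn = NearestNeighbors(n_neighbors=1).fit(ps[control, None])
_, idx = nn.kneighbors(ps[treated, None])
matched = control[idx.ravel()]

print(f"naive difference:     {y[t == 1].mean() - y[t == 0].mean():.2f}")
print(f"matched ATT estimate: {(y[treated] - y[matched]).mean():.2f} (true 2.0)")
```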

  6. Basic Diagnosis and Prediction of Persistent Contrail Occurrence using High-resolution Numerical Weather Analyses/Forecasts and Logistic Regression. Part I: Effects of Random Error

    Science.gov (United States)

    Duda, David P.; Minnis, Patrick

    2009-01-01

    Straightforward application of the Schmidt-Appleman contrail formation criteria to diagnose persistent contrail occurrence from numerical weather prediction data is hindered by significant bias errors in the upper tropospheric humidity. Logistic models of contrail occurrence have been proposed to overcome this problem, but basic questions remain about how random measurement error may affect their accuracy. A set of 5000 synthetic contrail observations is created to study the effects of random error in these probabilistic models. The simulated observations are based on distributions of temperature, humidity, and vertical velocity derived from Advanced Regional Prediction System (ARPS) weather analyses. The logistic models created from the simulated observations were evaluated using two common statistical measures of model accuracy, the percent correct (PC) and the Hanssen-Kuipers discriminant (HKD). To convert the probabilistic results of the logistic models into a dichotomous yes/no choice suitable for the statistical measures, two critical probability thresholds are considered. The HKD scores are higher when the climatological frequency of contrail occurrence is used as the critical threshold, while the PC scores are higher when the critical probability threshold is 0.5. For both thresholds, typical random errors in temperature, relative humidity, and vertical velocity are found to be small enough to allow for accurate logistic models of contrail occurrence. The accuracy of the models developed from synthetic data is over 85 percent for both the prediction of contrail occurrence and non-occurrence, although in practice, larger errors would be anticipated.
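
    A minimal sketch of the two verification measures named here, computed from synthetic probabilistic forecasts at the two critical thresholds the record compares (climatological frequency versus 0.5); the forecast data are assumptions for illustration only:

```python
import numpy as np

def dichotomous_scores(p_forecast, observed, threshold):
    """Percent correct (PC) and Hanssen-Kuipers discriminant (HKD) for
    probabilistic forecasts converted to yes/no at a given threshold."""
    yes = p_forecast >= threshold
    hits = np.sum(yes & (observed == 1))
    misses = np.sum(~yes & (observed == 1))
    false_alarms = np.sum(yes & (observed == 0))
    corr_neg = np.sum(~yes & (observed == 0))
    pc = (hits + corr_neg) / observed.size
    hkd = hits / (hits + misses) - false_alarms / (false_alarms + corr_neg)
    return pc, hkd

rng = np.random.default_rng(5)
n = 5000
observed = rng.binomial(1, 0.15, size=n)   # ~15% contrail frequency
# Synthetic forecast probabilities loosely tied to the outcome.
p_forecast = np.clip(0.15 + 0.5 * (observed - 0.15)
                     + rng.normal(scale=0.2, size=n), 0.0, 1.0)

climatology = observed.mean()
for thr in (climatology, 0.5):
    pc, hkd = dichotomous_scores(p_forecast, observed, thr)
    print(f"threshold {thr:.2f}: PC = {pc:.3f}, HKD = {hkd:.3f}")
```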

  7. The Chinese Family Assessment Instrument (C-FAI): Hierarchical Confirmatory Factor Analyses and Factorial Invariance

    Science.gov (United States)

    Shek, Daniel T. L.; Ma, Cecilia M. S.

    2010-01-01

    Objective: This paper examines the dimensionality and factorial invariance of the Chinese Family Assessment Instrument (C-FAI) using multigroup confirmatory factor analyses (MCFAs). Method: A total of 3,649 students responded to the C-FAI in a community survey. Results: Results showed that there are five dimensions of the C-FAI (communication,…

  8. Radiologic assessment of third molar tooth and spheno-occipital synchondrosis for age estimation: a multiple regression analysis study.

    Science.gov (United States)

    Demirturk Kocasarac, Husniye; Sinanoglu, Alper; Noujeim, Marcel; Helvacioglu Yigit, Dilek; Baydemir, Canan

    2016-05-01

    For forensic age estimation, radiographic assessment of third molar mineralization is important between 14 and 21 years, which coincides with the legal age in most countries. The spheno-occipital synchondrosis (SOS) is an important growth site during development, and its use for age estimation is beneficial when combined with other markers. In this study, we aimed to develop a regression model to estimate and narrow the age range based on the radiologic assessment of the third molar and SOS in a Turkish subpopulation. Panoramic radiographs and cone beam CT scans of 349 subjects (182 males, 167 females) aged between 8 and 25 years were evaluated. A four-stage system was used to evaluate the fusion degree of the SOS, and Demirjian's eight developmental stages were used to evaluate third molar calcification. The Pearson correlation indicated a strong positive relationship between age and third molar calcification for both sexes (r = 0.850 for females, r = 0.839 for males; P < 0.001) and also between age and SOS fusion for females (r = 0.814), but a moderate relationship was found for males (r = 0.599; P < 0.001). Based on the results obtained, an age determination formula using these scores was established.

  9. Assessment of perfusion by dynamic contrast-enhanced imaging using a deconvolution approach based on regression and singular value decomposition.

    Science.gov (United States)

    Koh, T S; Wu, X Y; Cheong, L H; Lim, C C T

    2004-12-01

    The assessment of tissue perfusion by dynamic contrast-enhanced (DCE) imaging involves a deconvolution process. For analysis of DCE imaging data, we implemented a regression approach to select appropriate regularization parameters for deconvolution using the standard and generalized singular value decomposition methods. Monte Carlo simulation experiments were carried out to study the performance and to compare with other existing methods used for deconvolution analysis of DCE imaging data. The present approach is found to be robust and reliable at the levels of noise commonly encountered in DCE imaging, and for different models of the underlying tissue vasculature. The advantages of the present method, as compared with previous methods, include its efficiency of computation, ability to achieve adequate regularization to reproduce less noisy solutions, and that it does not require prior knowledge of the noise condition. The proposed method is applied on actual patient study cases with brain tumors and ischemic stroke, to illustrate its applicability as a clinical tool for diagnosis and assessment of treatment response.
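
    A minimal sketch of deconvolution by truncated singular value decomposition on a synthetic tissue curve; the fixed truncation threshold below stands in for the paper's regression-based choice of regularization parameter, and all curves are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(6)

# Synthetic DCE setup: tissue curve = convolution of an arterial input
# function (AIF) with a residue function, plus noise (all illustrative).
dt, n = 1.0, 60
t = np.arange(1, n + 1) * dt
aif = t * np.exp(-t / 4.0)               # gamma-variate-like AIF
residue_true = np.exp(-t / 8.0)          # exponential residue function
A = dt * np.array([[aif[i - j] if i >= j else 0.0 for j in range(n)]
                   for i in range(n)])   # discrete convolution matrix
c = A @ residue_true + rng.normal(scale=0.02, size=n)

# Truncated-SVD deconvolution: discard singular values below a
# regularization threshold (a fixed 10% of the largest, for brevity).
U, s, Vt = np.linalg.svd(A)
tol = 0.1 * s[0]
s_inv = np.array([1.0 / sv if sv > tol else 0.0 for sv in s])
residue_est = Vt.T @ (s_inv * (U.T @ c))

print("true initial value:", residue_true[0], "estimate:", residue_est[0])
```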

  10. Proposed Testing to Assess the Accuracy of Glass-To-Metal Seal Stress Analyses.

    Energy Technology Data Exchange (ETDEWEB)

    Chambers, Robert S.; Emery, John M; Tandon, Rajan; Antoun, Bonnie R.; Stavig, Mark E.; Newton, Clay S.; Gibson, Cory S; Bencoe, Denise N.

    2014-09-01

    The material characterization tests conducted on 304L VAR stainless steel and Schott 8061 glass have provided higher fidelity data for calibration of material models used in Glass-To-Metal (GTM) seal analyses. Specifically, a Thermo-Multi-Linear Elastic Plastic (thermo-MLEP) material model has been defined for SS304L and the Simplified Potential Energy Clock nonlinear viscoelastic model has been calibrated for the S8061 glass. To assess the accuracy of finite element stress analyses of GTM seals, a suite of tests is proposed to provide data for comparison to model predictions.

  11. The relationship between limited MRI section analyses and volumetric assessment of synovitis in knee osteoarthritis

    International Nuclear Information System (INIS)

    Rhodes, L.A.; Keenan, A.-M.; Grainger, A.J.; Emery, P.; McGonagle, D.; Conaghan, P.G.

    2005-01-01

    AIM: To assess whether simple, limited section analysis can replace detailed volumetric assessment of synovitis in patients with osteoarthritis (OA) of the knee using contrast-enhanced magnetic resonance imaging (MRI). MATERIALS AND METHODS: Thirty-five patients with clinical and radiographic OA of the knee were assessed for synovitis using gadolinium-enhanced MRI. The volume of enhancing synovium was quantitatively assessed in four anatomical sites (the medial and lateral parapatellar recesses, the intercondylar notch and the suprapatellar pouch) by summing the volumes of synovitis in consecutive sections. Four different combinations of section analysis were evaluated for their ability to predict total synovial volume. RESULTS: A total of 114 intra-articular sites were assessed. Simple linear regression demonstrated that the best predictor of total synovial volume was the analysis containing the inferior, mid and superior sections of each of the intra-articular sites, which predicted between 40-80% (r²=0.396, p<0.001 for notch; r²=0.818, p<0.001 for medial parapatellar recess) of the total volume assessment. CONCLUSIONS: The results suggest that a three-section analysis on axial post-gadolinium sequences provides a simple surrogate measure of synovial volume in OA knees.

  12. The relationship between limited MRI section analyses and volumetric assessment of synovitis in knee osteoarthritis

    Energy Technology Data Exchange (ETDEWEB)

    Rhodes, L.A. [Academic Unit of Medical Physics, University of Leeds and Leeds General Infirmary, Leeds (United Kingdom)]. E-mail: lar@medphysics.leeds.ac.uk; Keenan, A.-M. [Academic Unit of Musculoskeletal Disease, University of Leeds and Leeds General Infirmary, Leeds (United Kingdom); Grainger, A.J. [Department of Radiology, Leeds General Infirmary, Leeds (United Kingdom); Emery, P. [Academic Unit of Musculoskeletal Disease, University of Leeds and Leeds General Infirmary, Leeds (United Kingdom); McGonagle, D. [Academic Unit of Musculoskeletal Disease, University of Leeds and Leeds General Infirmary, Leeds (United Kingdom); Calderdale Royal Hospital, Salterhebble, Halifax (United Kingdom); Conaghan, P.G. [Academic Unit of Musculoskeletal Disease, University of Leeds and Leeds General Infirmary, Leeds (United Kingdom)

    2005-12-15

    AIM: To assess whether simple, limited section analysis can replace detailed volumetric assessment of synovitis in patients with osteoarthritis (OA) of the knee using contrast-enhanced magnetic resonance imaging (MRI). MATERIALS AND METHODS: Thirty-five patients with clinical and radiographic OA of the knee were assessed for synovitis using gadolinium-enhanced MRI. The volume of enhancing synovium was quantitatively assessed in four anatomical sites (the medial and lateral parapatellar recesses, the intercondylar notch and the suprapatellar pouch) by summing the volumes of synovitis in consecutive sections. Four different combinations of section analysis were evaluated for their ability to predict total synovial volume. RESULTS: A total of 114 intra-articular sites were assessed. Simple linear regression demonstrated that the best predictor of total synovial volume was the analysis containing the inferior, mid and superior sections of each of the intra-articular sites, which predicted between 40-80% (r²=0.396, p<0.001 for notch; r²=0.818, p<0.001 for medial parapatellar recess) of the total volume assessment. CONCLUSIONS: The results suggest that a three-section analysis on axial post-gadolinium sequences provides a simple surrogate measure of synovial volume in OA knees.

  13. An Innovative Technique to Assess Spontaneous Baroreflex Sensitivity with Short Data Segments: Multiple Trigonometric Regressive Spectral Analysis.

    Science.gov (United States)

    Li, Kai; Rüdiger, Heinz; Haase, Rocco; Ziemssen, Tjalf

    2018-01-01

    Objective: As the multiple trigonometric regressive spectral (MTRS) analysis is extraordinary in its ability to analyze short local data segments down to 12 s, we wanted to evaluate the impact of the data segment settings by applying the technique of MTRS analysis for baroreflex sensitivity (BRS) estimation using a standardized data pool. Methods: Spectral and baroreflex analyses were performed on the EuroBaVar dataset (42 recordings, including lying and standing positions). For this analysis, the technique of MTRS was used. We used different global and local data segment lengths, and chose the global data segments from different positions. Three global data segments of 1 and 2 min and three local data segments of 12, 20, and 30 s were used in MTRS analysis for BRS. Results: All the BRS-values calculated on the three global data segments were highly correlated, both in the supine and standing positions; the different global data segments provided similar BRS estimations. When using different local data segments, all the BRS-values were also highly correlated. However, in the supine position, using short local data segments of 12 s overestimated BRS compared with those using 20 and 30 s. In the standing position, the BRS estimations using different local data segments were comparable. There was no proportional bias for the comparisons between different BRS estimations. Conclusion: We demonstrate that BRS estimation by the MTRS technique is stable when using different global data segments, and MTRS is extraordinary in its ability to evaluate BRS in even short local data segments (20 and 30 s). Because of the non-stationary character of most biosignals, the MTRS technique would be preferable for BRS analysis especially in conditions when only short stationary data segments are available or when dynamic changes of BRS should be monitored.

  14. Overcooling transient selection and thermal hydraulic analyses of the Loviisa PTS assessments

    Energy Technology Data Exchange (ETDEWEB)

    Tuomisto, H [IVO Power Engineering Ltd, Vantaa (Finland)

    1997-09-01

    This paper describes transient selection and thermal hydraulic analyses of various PTS assessment studies performed for the pressure vessels of the Loviisa WWER reactors. Deterministic analyses have been performed at various stages of the PTS studies, and they have always formed the formal basis for the design and licensing of the reactor pressure vessel. The integrated, probabilistic PTS study was carried out to give an overview of the severity of all the different PTS sequences and a quantitative estimate of the importance of the PTS issues in relation to the overall safety of the plant. Later, sequences including external flooding of the pressure vessels were added to the PTS assessment. Thermal recovery annealing of the Loviisa 1 reactor pressure vessel took place during the refuelling outage in 1996. (author). 10 refs, 4 figs, 3 tabs.

  15. Autistic Regression

    Science.gov (United States)

    Matson, Johnny L.; Kozlowski, Alison M.

    2010-01-01

    Autistic regression is one of the many mysteries in the developmental course of autism and pervasive developmental disorders not otherwise specified (PDD-NOS). Various definitions of this phenomenon have been used, further clouding the study of the topic. Despite this problem, some efforts at establishing prevalence have been made. The purpose of…

  16. Reproductive risk factors assessment for anaemia among pregnant women in India using a multinomial logistic regression model.

    Science.gov (United States)

    Perumal, Vanamail

    2014-07-01

    To assess reproductive risk factors for anaemia among pregnant women in urban and rural areas of India. The International Institute for Population Sciences, India, carried out the third National Family Health Survey in 2005-2006 to estimate key indicators from a sample of ever-married women in the reproductive age group of 15-49 years. Data on various dimensions were collected using a structured questionnaire, and anaemia was measured using a portable HemoCue instrument. Anaemia prevalence among pregnant women was compared between rural and urban areas using the chi-square test and odds ratio. Multinomial logistic regression analysis was used to determine risk factors. Anaemia prevalence was assessed among 3355 pregnant women from rural areas and 1962 pregnant women from urban areas. Moderate-to-severe anaemia in rural areas (32.4%) is significantly more common than in urban areas (27.3%), with an excess risk of 30%. Gestational-age-specific prevalence of anaemia increases significantly in rural areas after 6 months. Pregnancy duration is a significant risk factor in both urban and rural areas. In rural areas, increasing age at marriage and mass media exposure are significant protective factors against anaemia. However, more births in the last five years, alcohol consumption and smoking habits are significant risk factors. In rural areas, various reproductive factors and lifestyle characteristics constitute significant risk factors for moderate-to-severe anaemia. Therefore, intensive health education on reproductive practices and the impact of lifestyle characteristics is warranted to reduce anaemia prevalence. © 2014 John Wiley & Sons Ltd.
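
    A minimal sketch of a multinomial logistic regression of a three-category anaemia outcome on a few risk factors, using statsmodels MNLogit on synthetic survey-style data; the variables, cut-points, and effect sizes are illustrative assumptions, not the NFHS data:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)

# Synthetic survey-style data: anaemia severity in three categories
# (0 = none, 1 = mild, 2 = moderate/severe) and a few risk factors.
n = 3000
gestation = rng.uniform(1, 9, n)          # months of pregnancy
age_marriage = rng.uniform(14, 30, n)     # age at marriage, years
births_5yr = rng.poisson(1.0, n)          # births in the last five years
X = np.column_stack([gestation, age_marriage, births_5yr])
risk = 0.3 * gestation - 0.1 * age_marriage + 0.4 * births_5yr
cuts = np.quantile(risk, [0.5, 0.8])
y = np.digitize(risk + rng.normal(scale=1.0, size=n), cuts)

# Multinomial logistic regression; exponentiated coefficients act as
# relative-risk ratios for each non-reference outcome category.
fit = sm.MNLogit(y, sm.add_constant(X)).fit(disp=False)
print(np.exp(fit.params))
```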

  17. A methodology for analysing human errors of commission in accident scenarios for risk assessment

    International Nuclear Information System (INIS)

    Kim, J. H.; Jung, W. D.; Park, J. K

    2003-01-01

    As concern about the impact of operators' inappropriate interventions, so-called Errors Of Commission (EOCs), on plant safety has been raised, interest in the identification and analysis of EOC events from the risk assessment perspective has increased accordingly. To this end, we propose a new methodology for identifying and analysing human errors of commission that might be caused by failures in situation assessment and decision making during accident progressions given an initiating event. The proposed methodology was applied to the accident scenarios of the YGN 3 and 4 NPPs, which resulted in about 10 EOC situations that need careful attention.

  18. Performance assessment analyses unique to Department of Energy spent nuclear fuel

    International Nuclear Information System (INIS)

    Loo, H.H.; Duguid, J.J.

    2000-01-01

    This paper describes the iterative process of grouping and performance assessment that has led to the current grouping of the U.S. Department of Energy (DOE) spent nuclear fuel (SNF). The unique sensitivity analyses that form the basis for incorporating DOE fuel into the total system performance assessment (TSPA) base case model are described. In addition, the chemistry that results from dissolution of DOE fuel and high level waste (HLW) glass in a failed co-disposal package, and the effects of disposal of selected DOE SNF in high integrity cans are presented

  19. Compilation of Quality Assurance Documentation for Analyses Performed for the Resumption of Transient Testing Environmental Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Schafer, Annette L. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Sondrup, A. Jeffrey [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2013-11-01

    This is a companion document to the analyses performed in support of the environmental assessment for the Resumption of Transient Fuels and Materials Testing. It is provided to allow transparency of the supporting calculations. It provides computer code input and output. The basis for the calculations is documented separately in INL (2013) and is referenced, as appropriate. Spreadsheets used to manipulate the code output are not provided.

  20. Performance Assessment Modeling and Sensitivity Analyses of Generic Disposal System Concepts.

    Energy Technology Data Exchange (ETDEWEB)

    Sevougian, S. David; Freeze, Geoffrey A.; Gardner, William Payton; Hammond, Glenn Edward; Mariner, Paul

    2014-09-01

    directly, rather than through simplified abstractions. It also allows for complex representations of the source term, e.g., the explicit representation of many individual waste packages (i.e., meter-scale detail of an entire waste emplacement drift). This report fulfills the Generic Disposal System Analysis Work Package Level 3 Milestone - Performance Assessment Modeling and Sensitivity Analyses of Generic Disposal System Concepts (M3FT-14SN0808032).

  1. Biosphere Modeling and Analyses in Support of Total System Performance Assessment

    International Nuclear Information System (INIS)

    Tappen, J. J.; Wasiolek, M. A.; Wu, D. W.; Schmitt, J. F.; Smith, A. J.

    2002-01-01

    The Nuclear Waste Policy Act of 1982 established the obligations of and the relationship between the U.S. Environmental Protection Agency (EPA), the U.S. Nuclear Regulatory Commission (NRC), and the U.S. Department of Energy (DOE) for the management and disposal of high-level radioactive wastes. In 1985, the EPA promulgated regulations that included a definition of performance assessment that did not consider potential dose to a member of the general public. This definition would influence the scope of activities conducted by DOE in support of the total system performance assessment program until 1995. The release of a National Academy of Sciences (NAS) report on the technical basis for a Yucca Mountain-specific standard provided the impetus for the DOE to initiate activities that would consider the attributes of the biosphere, i.e. that portion of the earth where living things, including man, exist and interact with the environment around them. The evolution of NRC and EPA Yucca Mountain-specific regulations, originally proposed in 1999, was critical to the development and integration of biosphere modeling and analyses into the total system performance assessment program. These proposed regulations initially differed in the conceptual representation of the receptor of interest to be considered in assessing performance. The publication in 2001 of final regulations in which the NRC adopted the EPA standard will permit the continued improvement and refinement of biosphere modeling and analyses activities in support of assessment activities.

  2. Biosphere Modeling and Analyses in Support of Total System Performance Assessment

    International Nuclear Information System (INIS)

    Jeff Tappen; M.A. Wasiolek; D.W. Wu; J.F. Schmitt

    2001-01-01

    The Nuclear Waste Policy Act of 1982 established the obligations of and the relationship between the U.S. Environmental Protection Agency (EPA), the U.S. Nuclear Regulatory Commission (NRC), and the U.S. Department of Energy (DOE) for the management and disposal of high-level radioactive wastes. In 1985, the EPA promulgated regulations that included a definition of performance assessment that did not consider potential dose to a member of the general public. This definition would influence the scope of activities conducted by DOE in support of the total system performance assessment program until 1995. The release of a National Academy of Sciences (NAS) report on the technical basis for a Yucca Mountain-specific standard provided the impetus for the DOE to initiate activities that would consider the attributes of the biosphere, i.e. that portion of the earth where living things, including man, exist and interact with the environment around them. The evolution of NRC and EPA Yucca Mountain-specific regulations, originally proposed in 1999, was critical to the development and integration of biosphere modeling and analyses into the total system performance assessment program. These proposed regulations initially differed in the conceptual representation of the receptor of interest to be considered in assessing performance. The publication in 2001 of final regulations in which the NRC adopted the EPA standard will permit the continued improvement and refinement of biosphere modeling and analyses activities in support of assessment activities.

  3. Regression Analysis

    CERN Document Server

    Freund, Rudolf J; Sa, Ping

    2006-01-01

    The book provides complete coverage of the classical methods of statistical analysis. It is designed to give students an understanding of the purpose of statistical analyses, to allow them to determine, at least to some degree, the correct type of statistical analysis to be performed in a given situation, and to have some appreciation of what constitutes good experimental design.

  4. Review of radionuclide source terms used for performance-assessment analyses

    International Nuclear Information System (INIS)

    Barnard, R.W.

    1993-06-01

    Two aspects of the radionuclide source terms used for total-system performance assessment (TSPA) analyses have been reviewed. First, a detailed radionuclide inventory (i.e., one in which the reactor type, decay, and burnup are specified) is compared with the standard source-term inventory used in prior analyses. The latter assumes a fixed ratio of pressurized-water reactor (PWR) to boiling-water reactor (BWR) spent fuel, at specific amounts of burnup and at 10-year decay. TSPA analyses have been used to compare the simplified source term with the detailed one. The TSPA-91 analyses did not show a significant difference between the source terms. Second, the radionuclides used in source terms for TSPA aqueous-transport analyses have been reviewed to select ones that are representative of the entire inventory. It is recommended that two actinide decay chains be included (the 4n+2 'uranium' and 4n+3 'actinium' decay series), since these include several radionuclides that have potentially important release and dose characteristics. In addition, several fission products are recommended for the same reason. The choice of radionuclides should be influenced by other parameter assumptions, such as the solubility and retardation of the radionuclides.

  5. Evaluation of fracture mechanics analyses used in RPV integrity assessment regarding brittle fracture

    International Nuclear Information System (INIS)

    Moinereau, D.; Faidy, C.; Valeta, M.P.; Bhandari, S.; Guichard, D.

    1997-01-01

    In recent years, Electricite de France has conducted experimental and numerical research programmes to evaluate the fracture mechanics analyses used in nuclear reactor pressure vessel structural integrity assessment with regard to the risk of brittle fracture. These programmes included cleavage fracture tests on large scale cladded specimens containing subclad flaws, with their interpretations by 2D and 3D numerical computations, and validation of finite element codes for pressurized thermal shock analyses. Four cladded specimens made of ferritic steel A508 C13 with stainless steel cladding, and containing shallow subclad flaws, were tested in four point bending at very low temperature in order to obtain cleavage failure. The specimen failure was obtained in each case in the base metal by cleavage fracture. These tests have been interpreted by two-dimensional and three-dimensional finite element computations using different fracture mechanics approaches (elastic analysis with specific plasticity corrections, elastic-plastic analysis, local approach to cleavage fracture). The failure of the specimens is conservatively predicted by the different analyses. The comparison between the elastic analyses and the elastic-plastic analyses shows the conservatism of the specific plasticity corrections used in French RPV elastic analyses. Numerous finite element calculations have also been performed by EDF, CEA and Framatome in order to compare and validate several fracture mechanics post-processors implemented in the finite element programmes used in pressurized thermal shock analyses. This work includes two-dimensional numerical computations on specimens with different geometries and loadings. The comparisons show rather good agreement on the main results, allowing validation of the finite element codes and their post-processors. (author). 11 refs, 24 figs, 3 tabs

  6. Evaluation of fracture mechanics analyses used in RPV integrity assessment regarding brittle fracture

    Energy Technology Data Exchange (ETDEWEB)

    Moinereau, D [Electricite de France, Dept. MTC, Moret-sur-Loing (France); Faidy, C [Electricite de France, SEPTEN, Villeurbanne (France); Valeta, M P [Commisariat a l` Energie Atomique, Dept. DMT, Gif-sur-Yvette (France); Bhandari, S; Guichard, D [Societe Franco-Americaine de Constructions Atomiques (FRAMATOME), 92 - Paris-La-Defense (France)

    1997-09-01

    In recent years, Electricite de France has conducted experimental and numerical research programmes to evaluate the fracture mechanics analyses used in nuclear reactor pressure vessel structural integrity assessment with regard to the risk of brittle fracture. These programmes included cleavage fracture tests on large scale cladded specimens containing subclad flaws, with their interpretations by 2D and 3D numerical computations, and validation of finite element codes for pressurized thermal shock analyses. Four cladded specimens made of ferritic steel A508 C13 with stainless steel cladding, and containing shallow subclad flaws, were tested in four point bending at very low temperature in order to obtain cleavage failure. The specimen failure was obtained in each case in the base metal by cleavage fracture. These tests have been interpreted by two-dimensional and three-dimensional finite element computations using different fracture mechanics approaches (elastic analysis with specific plasticity corrections, elastic-plastic analysis, local approach to cleavage fracture). The failure of the specimens is conservatively predicted by the different analyses. The comparison between the elastic analyses and the elastic-plastic analyses shows the conservatism of the specific plasticity corrections used in French RPV elastic analyses. Numerous finite element calculations have also been performed by EDF, CEA and Framatome in order to compare and validate several fracture mechanics post-processors implemented in the finite element programmes used in pressurized thermal shock analyses. This work includes two-dimensional numerical computations on specimens with different geometries and loadings. The comparisons show rather good agreement on the main results, allowing validation of the finite element codes and their post-processors. (author). 11 refs, 24 figs, 3 tabs.

  7. LWR safety studies. Analyses and further assessments relating to the German Risk Assessment Study on Nuclear Power Plants. Vol. 3

    International Nuclear Information System (INIS)

    1983-01-01

    Critical review of the analyses of the German Risk Assessment Study on Nuclear Power Plants (DRS) concerning the reliability of the containment under accident conditions and the conditions of fission product release (transport and distribution in the environment). The main point of interest in this context is an explosion in the steam section and its impact on the containment. Critical comments are given on the models used in the DRS for determining the accident consequences. The analyses made deal with the mathematical models and database for propagation calculations, the methods of dose computation and assessment of health hazards, and the modelling of protective and safety measures. Social impacts of reactor accidents are also considered. (RF) [de]

  8. Linear regression

    CERN Document Server

    Olive, David J

    2017-01-01

    This text covers both multiple linear regression and some experimental design models. The text uses the response plot to visualize the model and to detect outliers, does not assume that the error distribution has a known parametric distribution, develops prediction intervals that work when the error distribution is unknown, suggests bootstrap hypothesis tests that may be useful for inference after variable selection, and develops prediction regions and large sample theory for the multivariate linear regression model that has m response variables. A relationship between multivariate prediction regions and confidence regions provides a simple way to bootstrap confidence regions. These confidence regions often provide a practical method for testing hypotheses. There is also a chapter on generalized linear models and generalized additive models. There are many R functions to produce response and residual plots, to simulate prediction intervals and hypothesis tests, to detect outliers, and to choose response trans...

  9. Geostatistical analyses and hazard assessment on soil lead in Silvermines area, Ireland

    International Nuclear Information System (INIS)

    McGrath, David; Zhang Chaosheng; Carton, Owen T.

    2004-01-01

    Spatial distribution and hazard assessment of soil lead in the mining site of Silvermines, Ireland, were investigated using statistics, geostatistics and geographic information system (GIS) techniques. Positively skewed distribution and possible outlying values of Pb and other heavy metals were observed. Box-Cox transformation was applied in order to achieve normality in the data set and to reduce the effect of outliers. Geostatistical analyses were carried out, including calculation of experimental variograms and model fitting. The ordinary point kriging estimates of Pb concentration were mapped. Kriging standard deviations were regarded as the standard deviations of the interpolated pixel values, and a second map was produced, that quantified the probability of Pb concentration higher than a threshold value of 1000 mg/kg. These maps provide valuable information for hazard assessment and for decision support. - A probability map was produced that was useful for hazard assessment and decision support
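
    A minimal sketch of the interpolation-plus-probability-map workflow, using scikit-learn's Gaussian process regressor as a stand-in for ordinary kriging (a dedicated package such as pykrige exposes explicit variogram models); the survey points, kernel, and log-scale threshold are illustrative assumptions:

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(8)

# Synthetic soil survey: natural-log Pb (mg/kg) at scattered sample
# points, standing in for the Box-Cox-transformed data in the record.
pts = rng.uniform(0, 10, size=(120, 2))
log_pb = (6.0 + 1.5 * np.exp(-((pts - 5.0) ** 2).sum(axis=1) / 8.0)
          + rng.normal(scale=0.3, size=len(pts)))

# Gaussian-process regression is the kriging analogue in scikit-learn.
gp = GaussianProcessRegressor(kernel=RBF(2.0) + WhiteKernel(0.1),
                              normalize_y=True).fit(pts, log_pb)

# Predict on a grid, then map P(Pb > 1000 mg/kg) from mean and std,
# mirroring the probability map built from kriging standard deviations.
gx, gy = np.meshgrid(np.linspace(0, 10, 50), np.linspace(0, 10, 50))
grid = np.column_stack([gx.ravel(), gy.ravel()])
mean, std = gp.predict(grid, return_std=True)
p_exceed = norm.sf(np.log(1000.0), loc=mean, scale=std)
print(f"max exceedance probability on grid: {p_exceed.max():.3f}")
```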

  10. Assessment of CONTAIN and MELCOR for performing LOCA and LOVA analyses in ITER

    International Nuclear Information System (INIS)

    Merrill, B.J.; Hagrman, D.L.; Gaeta, M.J.; Petti, D.A.

    1994-09-01

    This report describes the results of an assessment of the CONTAIN and MELCOR computer codes for ITER LOCA and LOVA applications. As part of the assessment, the results of running a test problem that describes an ITER LOCA are presented. It is concluded that the MELCOR code should be the preferred code for ITER severe accident thermal hydraulic analyses. This code will require the least modification to be appropriate for calculating thermal hydraulic behavior in ITER relevant conditions that include vacuum, cryogenics, ITER temperatures, and the presence of a liquid metal test module. The assessment of the aerosol transport models in these codes concludes that several modifications would have to be made to CONTAIN and/or MELCOR to make them applicable to the aerosol transport part of severe accident analysis in ITER

  11. Geostatistical analyses and hazard assessment on soil lead in Silvermines area, Ireland

    Energy Technology Data Exchange (ETDEWEB)

    McGrath, David; Zhang Chaosheng; Carton, Owen T

    2004-01-01

    Spatial distribution and hazard assessment of soil lead in the mining site of Silvermines, Ireland, were investigated using statistics, geostatistics and geographic information system (GIS) techniques. Positively skewed distribution and possible outlying values of Pb and other heavy metals were observed. Box-Cox transformation was applied in order to achieve normality in the data set and to reduce the effect of outliers. Geostatistical analyses were carried out, including calculation of experimental variograms and model fitting. The ordinary point kriging estimates of Pb concentration were mapped. Kriging standard deviations were regarded as the standard deviations of the interpolated pixel values, and a second map was produced, that quantified the probability of Pb concentration higher than a threshold value of 1000 mg/kg. These maps provide valuable information for hazard assessment and for decision support. - A probability map was produced that was useful for hazard assessment and decision support.

  12. ANALYSING PERFORMANCE ASSESSMENT IN PUBLIC SERVICES: HOW USEFUL IS THE CONCEPT OF A PERFORMANCE REGIME?

    Science.gov (United States)

    Martin, Steve; Nutley, Sandra; Downe, James; Grace, Clive

    2016-03-01

    Approaches to performance assessment have been described as 'performance regimes', but there has been little analysis of what is meant by this concept and whether it has any real value. We draw on four perspectives on regimes - 'institutions and instruments', 'risk regulation regimes', 'internal logics and effects' and 'analytics of government' - to explore how the concept of a multi-dimensional regime can be applied to performance assessment in public services. We conclude that the concept is valuable. It helps to frame comparative and longitudinal analyses of approaches to performance assessment and draws attention to the ways in which public service performance regimes operate at different levels, how they change over time and what drives their development. Areas for future research include analysis of the impacts of performance regimes and interactions between their visible features (such as inspections, performance indicators and star ratings) and the veiled rationalities which underpin them.

  13. A study on modeling nitrogen dioxide concentrations using land-use regression and conventionally used exposure assessment methods

    Science.gov (United States)

    Choi, Giehae; Bell, Michelle L.; Lee, Jong-Tae

    2017-04-01

    The land-use regression (LUR) approach to estimating the levels of ambient air pollutants is becoming popular due to its high validity in predicting small-area variations. However, only a few studies have been conducted in Asian countries, and much less research has been conducted on comparing the performances and applied estimates of different exposure assessments including LUR. The main objectives of the current study were to conduct nitrogen dioxide (NO2) exposure assessment with four methods including LUR in the Republic of Korea, to compare the model performances, and to estimate the empirical NO2 exposures of a cohort. The study population was defined as the year 2010 participants of a government-supported cohort established for bio-monitoring in Ulsan, Republic of Korea. The annual ambient NO2 exposures of the 969 study participants were estimated with LUR, nearest station, inverse distance weighting, and ordinary kriging. Modeling was based on the annual NO2 average, traffic-related data, land-use data, and altitude of the 13 regularly monitored stations. The final LUR model indicated that area of transportation, distance to residential area, and area of wetland were important predictors of NO2. The LUR model explained 85.8% of the variation observed at the 13 monitoring stations in the year 2009. The LUR model outperformed the others based on leave-one-out cross-validation comparing the correlations and root-mean square error. All NO2 estimates ranged from 11.3-18.0 ppb, with that of LUR having the widest range. The NO2 exposure levels of the residents differed by demographics. However, the average was below the national annual guideline of the Republic of Korea (30 ppb). The LUR models showed high performance in an industrial city in the Republic of Korea, despite the small sample size and limited data. Our findings suggest that the LUR method may be useful in similar settings in Asian countries where the target region is small and the availability of data is limited.
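
    A minimal sketch of the LUR workflow this record describes: fit a linear model of NO2 on land-use predictors at the monitoring stations, then validate with leave-one-out cross-validation using RMSE and correlation. The predictor names echo the record's findings, but all values are synthetic assumptions:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict

rng = np.random.default_rng(9)

# Synthetic monitoring network: 13 stations with land-use predictors
# (names mirror the record's themes; all values are illustrative).
n_stations = 13
transport_area = rng.uniform(0, 1, n_stations)
dist_residential = rng.uniform(0, 5, n_stations)
wetland_area = rng.uniform(0, 1, n_stations)
X = np.column_stack([transport_area, dist_residential, wetland_area])
no2 = (10 + 8 * transport_area - 1.2 * dist_residential
       - 3 * wetland_area + rng.normal(scale=0.8, size=n_stations))

# Leave-one-out cross-validation, as in the record's model comparison.
pred = cross_val_predict(LinearRegression(), X, no2, cv=LeaveOneOut())
rmse = np.sqrt(np.mean((pred - no2) ** 2))
print(f"LOOCV RMSE = {rmse:.2f} ppb, r = {np.corrcoef(pred, no2)[0, 1]:.2f}")

# Refit on all stations; the model would then be applied to the
# predictor values extracted at each cohort member's address.
lur = LinearRegression().fit(X, no2)
```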

  14. Stable crack growth behaviors in welded CT specimens -- finite element analyses and simplified assessments

    International Nuclear Information System (INIS)

    Yagawa, Genki; Yoshimura, Shinobu; Aoki, Shigeru; Kikuchi, Masanori; Arai, Yoshio; Kashima, Koichi; Watanabe, Takayuki; Shimakawa, Takashi

    1993-01-01

    The paper describes stable crack growth behaviors in welded CT specimens made of nuclear pressure vessel A533B class 1 steel, in which initial cracks are placed normal to the fusion line. First, using the relations between the load-line displacement (δ) and the crack extension amount (Δa) measured in experiments, generation-phase finite element crack growth analyses are performed, calculating the applied load (P) and various kinds of J-integrals. Next, simplified crack growth analyses based on the GE/EPRI method and the reference stress method are performed using the same experimental results. Some modification procedures for the two simplified assessment schemes are discussed to make them applicable to inhomogeneous materials. Finally, a neural network approach is proposed to optimize the above modification procedures. 20 refs., 13 figs., 1 tab

  15. Total System Performance Assessment Sensitivity Analyses for Final Nuclear Regulatory Commission Regulations

    International Nuclear Information System (INIS)

    Bechtel SAIC Company

    2001-01-01

    This Letter Report presents the results of supplemental evaluations and analyses designed to assess long-term performance of the potential repository at Yucca Mountain. The evaluations were developed in the context of the Nuclear Regulatory Commission (NRC) final public regulation, or rule, 10 CFR Part 63 (66 FR 55732 [DIRS 156671]), which was issued on November 2, 2001. This Letter Report addresses the issues identified in the Department of Energy (DOE) technical direction letter dated October 2, 2001 (Adams 2001 [DIRS 156708]). The main objective of this Letter Report is to evaluate performance of the potential Yucca Mountain repository using assumptions consistent with performance-assessment-related provisions of 10 CFR Part 63. The incorporation of the final Environmental Protection Agency (EPA) standard, 40 CFR Part 197 (66 FR 32074 [DIRS 155216]), and the analysis of the effect of the 40 CFR Part 197 EPA final rule on long-term repository performance are presented in the Total System Performance Assessment--Analyses for Disposal of Commercial and DOE Waste Inventories at Yucca Mountain--Input to Final Environmental Impact Statement and Site Suitability Evaluation (BSC 2001 [DIRS 156460]), referred to hereafter as the FEIS/SSE Letter Report. The Total System Performance Assessment (TSPA) analyses conducted and documented prior to promulgation of the NRC final rule 10 CFR Part 63 (66 FR 55732 [DIRS 156671]) were based on the NRC proposed rule (64 FR 8640 [DIRS 101680]). Slight differences exist between the NRC's proposed and final rules that were not within the scope of the FEIS/SSE Letter Report (BSC 2001 [DIRS 156460]), the Preliminary Site Suitability Evaluation (PSSE) (DOE 2001 [DIRS 155743]), and supporting documents for these reports. These differences include (1) the possible treatment of 'unlikely' features, events and processes (FEPs) in the evaluation of both the groundwater protection standard and the human-intrusion scenario of the individual protection standard.

  16. Reactivity initiated accident analyses for the safety assessment of upgraded JRR-3

    International Nuclear Information System (INIS)

    Harami, Taikan; Uemura, Mutsumi; Ohnishi, Nobuaki

    1984-08-01

    JRR-3, currently a heavy water moderated and cooled 10 MW reactor, is to be upgraded to a light water moderated and cooled, heavy water reflected 20 MW reactor. This report describes the analytical results of reactivity initiated accidents for the safety assessment of upgraded JRR-3. The following five cases have been selected for the assessment; (1) uncontrolled control rod withdrawal from zero power, (2) uncontrolled control rod withdrawal from full power, (3) removal of irradiation samples, (4) increase of primary coolant flow, (5) failure of heavy water tank. Parameter studies have been made for each of the above cases to cover possible uncertainties. All analyses have been made by a computer code EUREKA-2. The results show that the safety criteria for upgraded JRR-3 are all met and the adequacy of the design is confirmed. (author)

  17. HIFSA: Heavy-Ion Fusion Systems Assessment Project: Volume 2, Technical analyses

    International Nuclear Information System (INIS)

    Dudziak, D.J.

    1987-12-01

    A two-year project was undertaken to assess the commercial potential of heavy-ion fusion (HIF) as an economical electric power production technology. Because the US HIF development program is oriented toward the use of multiple-beam induction linacs, the study was confined to this particular driver technology. The HIF systems assessment (HIFSA) study involved several subsystem design, performance, and cost studies (e.g., the induction linac, final beam transport, beam transport in reactor cavity environments, cavity clearing, target manufacturing, and reactor plant). In addition, overall power plant systems integration, parametric analyses, and tradeoff studies were performed using a systems code developed specifically for the HIFSA project. Systems analysis results show values for cost of electricity (COE) comparable to those from other inertial- and magnetic-confinement fusion plant studies; viz., 50 to 60 mills/kWh (1985 dollars) for 1-GWe plant sizes. Also, significant COE insensitivity to major accelerator, target, and reactor parameters near the minima was demonstrated. Conclusions from the HIFSA study have already led to substantial modifications of the US HIF research and development program. Separate abstracts were prepared for 17 papers in these analyses

  18. LWR safety studies. Analyses and further assessments relating to the German Risk Assessment Study on Nuclear Power Plants. Vol. 1

    International Nuclear Information System (INIS)

    1983-01-01

    This documentation of the activities of the Oeko-Institut is intended to show the errors made and the limits encountered in the approaches taken and the results obtained by the work performed under phase A of the German Risk Assessment Study on Nuclear Power Plants (DRS). Concerns are expressed and explained relating to the risk definition used in the Study and to the results of other studies relied on; specific problems of methodology are discussed with regard to the value of fault-tree/accident analyses for describing the course of safety-related events, and to the evaluations presented in the DRS. The Markov model is explained as an approach offering alternative solutions. The identification and quantification of common-mode failures is discussed. The origin, quality and methods of assessing the reliability characteristics used in the DRS, as well as the statistical models for describing failure scenarios of reactor components and systems, are critically reviewed. (RF) [de]

  19. Metagenomic analyses of bacteria on human hairs: a qualitative assessment for applications in forensic science.

    Science.gov (United States)

    Tridico, Silvana R; Murray, Dáithí C; Addison, Jayne; Kirkbride, Kenneth P; Bunce, Michael

    2014-01-01

    Mammalian hairs are one of the most ubiquitous types of trace evidence collected in the course of forensic investigations. However, hairs that are naturally shed or that lack roots are problematic substrates for DNA profiling; these hair types often contain insufficient nuclear DNA to yield short tandem repeat (STR) profiles. Whilst there have been a number of initial investigations evaluating the value of metagenomic analyses for forensic applications (e.g. examination of computer keyboards), there have been no metagenomic evaluations of human hairs, a substrate commonly encountered during forensic practice. The present study attempts to address this forensic capability gap by conducting a qualitative assessment of the applicability of metagenomic analyses of human scalp and pubic hair. Forty-two DNA extracts obtained from human scalp and pubic hairs generated a total of 79,766 reads, yielding 39,814 reads post control and abundance filtering. The results revealed the presence of unique combinations of microbial taxa that can enable discrimination between individuals and signature taxa indigenous to female pubic hairs. Microbial data from a single co-habiting couple added an extra dimension to the study by suggesting that metagenomic analyses might be of evidentiary value in sexual assault cases when other associative evidence is not present. Of all the data generated in this study, the next-generation sequencing (NGS) data generated from pubic hair held the most potential for forensic applications. Metagenomic analyses of human hairs may provide independent data to augment other forensic results and possibly provide an association between victims of sexual assault and the offender when other associative evidence is absent. Based on the results garnered in the present study, we believe that with further development, bacterial profiling of hair will become a valuable addition to the forensic toolkit.

  20. Assessment of Tools and Data for System-Level Dynamic Analyses

    International Nuclear Information System (INIS)

    Piet, Steven J.; Soelberg, Nick R.

    2011-01-01

    The only fuel cycle for which dynamic analyses and assessments are not needed is the null fuel cycle - no nuclear power. For every other concept, dynamic analyses are needed and can influence the relative desirability of options. Dynamic analyses show how a fuel cycle might work during transitions from today's partial fuel cycle to something more complete, the impact of technology deployments, the location of choke points, the key time lags, when benefits can manifest, and how well parts of fuel cycles work together. This report summarizes the readiness of existing Fuel Cycle Technology (FCT) tools and data for conducting dynamic analyses on the range of options. VISION is the primary dynamic analysis tool. Not only does it model mass flows, as do other dynamic system analysis models, but it allows users to explore various potential constraints. The only fuel cycles for which constraints are not important are those in concept advocates' PowerPoint presentations; in contrast, comparative analyses of fuel cycles must address what constraints exist and how they could impact performance. The most immediate tool need is extending VISION to the thorium/U233 fuel cycle. Depending on further clarification of waste management strategies in general and for specific fuel cycle candidates, waste management sub-models in VISION may need enhancement, e.g., more on 'co-flows' of non-fuel materials, constraints in waste streams, or automatic classification of waste streams on the basis of user-specified rules. VISION originally had an economic sub-model. The economic calculations were deemed unnecessary in later versions, so it was retired. Eventually, the program will need to restore and improve the economics sub-model of VISION to at least the cash flow stage and possibly to incorporate cost constraints and feedbacks. There are multiple sources of data that dynamic analyses can draw on. In this report, 'data' means experimental data, data from more detailed theoretical or empirical

  1. Assessment of Tools and Data for System-Level Dynamic Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Steven J. Piet; Nick R. Soelberg

    2011-06-01

    The only fuel cycle for which dynamic analyses and assessments are not needed is the null fuel cycle - no nuclear power. For every other concept, dynamic analyses are needed and can influence the relative desirability of options. Dynamic analyses show how a fuel cycle might work during transitions from today's partial fuel cycle to something more complete, the impact of technology deployments, the location of choke points, the key time lags, when benefits can manifest, and how well parts of fuel cycles work together. This report summarizes the readiness of existing Fuel Cycle Technology (FCT) tools and data for conducting dynamic analyses on the range of options. VISION is the primary dynamic analysis tool. Not only does it model mass flows, as do other dynamic system analysis models, but it allows users to explore various potential constraints. The only fuel cycles for which constraints are not important are those in concept advocates' PowerPoint presentations; in contrast, comparative analyses of fuel cycles must address what constraints exist and how they could impact performance. The most immediate tool need is extending VISION to the thorium/U233 fuel cycle. Depending on further clarification of waste management strategies in general and for specific fuel cycle candidates, waste management sub-models in VISION may need enhancement, e.g., more on 'co-flows' of non-fuel materials, constraints in waste streams, or automatic classification of waste streams on the basis of user-specified rules. VISION originally had an economic sub-model. The economic calculations were deemed unnecessary in later versions, so it was retired. Eventually, the program will need to restore and improve the economics sub-model of VISION to at least the cash flow stage and possibly to incorporate cost constraints and feedbacks. There are multiple sources of data that dynamic analyses can draw on. In this report, 'data' means experimental data, data from more detailed

  2. Flexible meta-regression to assess the shape of the benzene-leukemia exposure-response curve.

    NARCIS (Netherlands)

    Vlaanderen, J.J.|info:eu-repo/dai/nl/31403160X; Portengen, L.|info:eu-repo/dai/nl/269224742; Rothman, N.; Lan, Q.; Kromhout, H.|info:eu-repo/dai/nl/074385224; Vermeulen, R.|info:eu-repo/dai/nl/216532620

    2010-01-01

    BACKGROUND: Previous evaluations of the shape of the benzene-leukemia exposure-response curve (ERC) were based on a single set or on small sets of human occupational studies. Integrating evidence from all available studies that are of sufficient quality combined with flexible meta-regression models

  3. Systematic comparative and sensitivity analyses of additive and outranking techniques for supporting impact significance assessments

    International Nuclear Information System (INIS)

    Cloquell-Ballester, Vicente-Agustin; Monterde-Diaz, Rafael; Cloquell-Ballester, Victor-Andres; Santamarina-Siurana, Maria-Cristina

    2007-01-01

    Assessing the significance of environmental impacts is one of the most important and altogether difficult processes of Environmental Impact Assessment. This is largely due to the multicriteria nature of the problem. To date, decision techniques used in the process suffer from two drawbacks, namely the problem of compensation and the problem of identifying the 'exact boundary' between sub-ranges. This article discusses these issues and proposes a methodology for determining the significance of environmental impacts based on comparative and sensitivity analyses using the Electre TRI technique. An application of the methodology to the environmental assessment of a power plant project within the Valencian Region (Spain) is presented, and its performance evaluated. It is concluded that, unlike other techniques, Electre TRI automatically identifies those cases where allocation of significance categories is most difficult and, when combined with sensitivity analysis, offers the greatest robustness in the face of variation in the weights of the significance attributes. Likewise, this research demonstrates the efficacy of systematic comparison between Electre TRI and sum-based techniques in the solution of assignment problems. The proposed methodology can therefore be regarded as a successful aid to the decision-maker, who will ultimately take the final decision.

  4. Applied linear regression

    CERN Document Server

    Weisberg, Sanford

    2013-01-01

    Praise for the Third Edition: "...this is an excellent book which could easily be used as a course text..." - International Statistical Institute. The Fourth Edition of Applied Linear Regression provides a thorough update of the basic theory and methodology of linear regression modeling. Demonstrating the practical applications of linear regression analysis techniques, the Fourth Edition uses interesting, real-world exercises and examples. Stressing central concepts such as model building, understanding parameters, assessing fit and reliability, and drawing conclusions, the new edition illus

  5. Assessing trends in fishery resources and lake-water aluminum from paleolimnological analyses of siliceous algae

    International Nuclear Information System (INIS)

    Kingston, J.C.; Birks, H.J.B.; Uutala, A.J.; Cummings, B.F.; Smol, J.P.

    1992-01-01

    Lake-water aluminum concentrations have a significant influence on the composition of microfossil assemblages of diatoms and chrysophytes deposited in lake sediments. Using the paleolimnological approach of multilake datasets in the Adirondack region of New York, USA, the authors use canonical correspondence analysis to describe past trends in lake-water Al. Four lakes, previously investigated regarding acidification and fishery trends, are used to demonstrate that paleolimnological assessment can also provide the direction, timing, and magnitude of trends for both toxic metals and fish resources. Additionally, the authors use weighted-averaging regression and calibration to obtain quantitative reconstructions of past lake-water Al concentrations. Such reconstructions provide further insight into fishery resource damage and can be compared with modelling results. According to paleolimnological reconstructions, some of the naturally most acidic lakes in the Adirondack region had preindustrial lake-water concentrations of inorganic monomeric Al near 4 µmol/L. Although these high concentrations are surprising from a geochemical point of view, they may partially explain the preindustrial absence of fish, as has been independently determined by paleolimnological analysis of phantom midges (Chaoborus). Fishery resource deterioration in acidified Adirondack lakes was coincident with major increases in lake-water Al concentrations.
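
    The weighted-averaging (WA) regression and calibration used above is simple to sketch. The following Python snippet is a minimal illustration on synthetic data (taxon abundances and Al values are invented, and the deshrinking step used in practice is omitted): WA regression estimates each taxon's Al optimum as the abundance-weighted mean of Al across calibration lakes, and WA calibration reconstructs Al for a fossil sample as the abundance-weighted mean of those optima.

        import numpy as np

        rng = np.random.default_rng(0)
        n_lakes, n_taxa = 30, 12
        # Relative abundances of taxa in surface-sediment calibration samples.
        abund = rng.dirichlet(np.ones(n_taxa), size=n_lakes)
        # Observed lake-water Al (micromol/L) for the calibration lakes.
        al_obs = rng.uniform(0.5, 5.0, size=n_lakes)

        # WA regression: abundance-weighted mean of Al gives each taxon's optimum.
        optima = (abund * al_obs[:, None]).sum(axis=0) / abund.sum(axis=0)

        # WA calibration: reconstruct Al for a (fossil) sample from its abundances.
        def wa_reconstruct(sample_abund):
            return (sample_abund * optima).sum() / sample_abund.sum()

        fossil = rng.dirichlet(np.ones(n_taxa))
        print(f"Reconstructed Al: {wa_reconstruct(fossil):.2f} micromol/L")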

  6. Scenario sensitivity analyses performed on the PRESTO-EPA LLW risk assessment models

    International Nuclear Information System (INIS)

    Bandrowski, M.S.

    1988-01-01

    The US Environmental Protection Agency (EPA) is currently developing standards for the land disposal of low-level radioactive waste. As part of the standard development, EPA has performed risk assessments using the PRESTO-EPA codes. A program of sensitivity analysis was conducted on the PRESTO-EPA codes, consisting of single parameter sensitivity analysis and scenario sensitivity analysis. The results of the single parameter sensitivity analysis were discussed at the 1987 DOE LLW Management Conference. Specific scenario sensitivity analyses have been completed and evaluated. Scenario assumptions that were analyzed include: site location, disposal method, form of waste, waste volume, analysis time horizon, critical radionuclides, use of buffer zones, and global health effects

  7. Energy system analyses of the marginal energy technology in life cycle assessments

    DEFF Research Database (Denmark)

    Mathiesen, B.V.; Münster, Marie; Fruergaard, Thilde

    2007-01-01

    In life cycle assessments, consequential LCA is used as the "state-of-the-art" methodology, which focuses on the consequences of decisions made in terms of system boundaries, allocation and selection of data, simple and dynamic marginal technology, etc. (Ekvall & Weidema 2004). In many LCA studies ... marginal technology? How is the marginal technology identified and used today? What is the consequence of not using energy system analysis for identifying the marginal energy technologies? The use of the methodology is examined from three angles. First, the marginal electricity technology is identified ... in historical and potential future energy systems. Subsequently, key LCA studies of products and different waste flows are analysed in relation to the recommendations in consequential LCA. Finally, a case of increased waste used for incineration is examined using an energy system analysis model ...

  8. A Bayesian approach to assess data from radionuclide activity analyses in environmental samples

    International Nuclear Information System (INIS)

    Barrera, Manuel; Lourdes Romero, M.; Nunez-Lagos, Rafael; Bernardo, Jose M.

    2007-01-01

    A Bayesian statistical approach is introduced to assess experimental data from the analyses of radionuclide activity concentration in environmental samples (low activities). A theoretical model has been developed that allows the use of known prior information about the value of the measurand (activity), together with the experimental value determined through the measurement. The model has been applied to data from the Inter-laboratory Proficiency Test organised periodically among the Spanish environmental radioactivity laboratories that produce the radiochemical results for the Spanish radioactive monitoring network. A global improvement of the laboratories' performance is produced when this prior information is taken into account. The prior information used in this methodology is an interval within which the activity is known to be contained, but it could be extended to any other experimental quantity with a different type of prior information available.

  9. Assessing the validity of road safety evaluation studies by analysing causal chains.

    Science.gov (United States)

    Elvik, Rune

    2003-09-01

    This paper discusses how the validity of road safety evaluation studies can be assessed by analysing causal chains. A causal chain denotes the path through which a road safety measure influences the number of accidents. Two cases are examined. One involves chemical de-icing of roads (salting). The intended causal chain of this measure is: spread of salt --> removal of snow and ice from the road surface --> improved friction --> shorter stopping distance --> fewer accidents. A Norwegian study that evaluated the effects of salting on accident rate provides information that describes this causal chain. This information indicates that the study overestimated the effect of salting on accident rate, and suggests that this estimate is influenced by confounding variables the study did not control for. The other case involves a traffic club for children. The intended causal chain in this study was: join the club --> improve knowledge --> improve behaviour --> reduce accident rate. In this case, results are rather messy, which suggests that the observed difference in accident rate between members and non-members of the traffic club is not primarily attributable to membership in the club. The two cases show that by analysing causal chains, one may uncover confounding factors that were not adequately controlled in a study. Lack of control for confounding factors remains the most serious threat to the validity of road safety evaluation studies.

  10. Three-dimensional analyses of fluid flow and heat transfer for moderator integrity assessment in PHWR

    International Nuclear Information System (INIS)

    Yu, S.-O.; Kim, M.; Kim, H.-J.

    2002-01-01

    A CANDU reactor has unique features and intrinsic safety-related characteristics that distinguish it from other water-cooled thermal reactors. If there is a loss of coolant accident (LOCA) with a coincident failure of the emergency coolant injection (ECI) system, the heavy-water moderator is continuously cooled, providing a heat sink for decay heat produced in the fuel. Therefore, estimating the local subcooling of the moderator inside the calandria vessel under postulated accidents is one of the major concerns in CANDU safety analyses. The Canadian Nuclear Safety Commission (CNSC), the regulatory body in Canada, categorized the integrity of the moderator as a generic safety issue and recommended that a series of experimental works be performed to verify the safety evaluation codes for individual simulated conditions of a nuclear power plant, by comparison with three-dimensional experimental data. In this study, three-dimensional analyses of fluid flow and heat transfer have been performed to assess the thermal-hydraulic characteristics of the moderator simulation conducted at the SPEL (Sheridan Park Experimental Laboratory) experimental facility. A parametric study has also been carried out to investigate the effect of major parameters, such as flowrate, temperature, and the heat load generated from the heaters, on the temperature and flow distribution inside the moderator. Three flow patterns have been identified in the moderator, depending on flowrate, heat generation, or both. As the transition of the fluid flow progresses, it is found that the dimensionless number (Ar), the ratio of buoyancy to inertia forces, remains constant. (author)

  11. Assessment of the Turkish utility sector through energy and exergy analyses

    International Nuclear Information System (INIS)

    Utlu, Zafer; Hepbasli, Arif

    2007-01-01

    The present study deals with evaluating the utility sector in terms of energetic and exergetic aspects. In this regard, energy and exergy utilization efficiencies in the Turkish utility sector over the period from 1990 to 2004 are assessed. Energy and exergy analyses are performed for eight power plant modes, based on actual data over the period studied. Sectoral energy and exergy analyses are conducted to study the variations of energy and exergy efficiencies for each power plant throughout the years, and overall energy and exergy efficiencies are compared for these power plants. The energy utilization efficiencies for the overall Turkish utility sector range from 32.64% to 45.69%, while the exergy utilization efficiencies vary from 32.20% to 46.81% in the analyzed years. The exergetic improvement potential for this sector is also determined to be 332 PJ in 2004. It may be concluded that the methodology used in this study is practical and useful for analyzing sectoral and subsectoral energy and exergy utilization to determine how efficiently energy and exergy are used in the sector studied. It is also expected that the results of this study will be helpful in developing highly applicable and productive planning for energy policies.

  12. Local seismic hazard assessment in explosive volcanic settings by 3D numerical analyses

    Science.gov (United States)

    Razzano, Roberto; Pagliaroli, Alessandro; Moscatelli, Massimiliano; Gaudiosi, Iolanda; Avalle, Alessandra; Giallini, Silvia; Marcini, Marco; Polpetta, Federica; Simionato, Maurizio; Sirianni, Pietro; Sottili, Gianluca; Vignaroli, Gianluca; Bellanova, Jessica; Calamita, Giuseppe; Perrone, Angela; Piscitelli, Sabatino

    2017-04-01

    This work deals with the assessment of local seismic response in explosive volcanic settings by reconstructing the subsoil model of the Stracciacappa maar (Sabatini Volcanic District, central Italy), whose pyroclastic succession records eruptive phases that ended about 0.09 Ma ago. The heterogeneous characteristics of the Stracciacappa maar (stratification, structural setting, lithotypes, and thickness variation of depositional units) make it an ideal case history for understanding the mechanisms and processes leading to modifications of the amplitude, frequency, and duration of seismic waves generated at earthquake sources and propagating through volcanic settings. A new geological map and cross sections, constrained with recently acquired geotechnical and geophysical data, illustrate the complex geometric relationships among the different depositional units forming the maar. A composite interfingering between internal lacustrine sediments and epiclastic debris, sourced from the rim, fills the crater floor; a 45-meter continuous coring borehole was drilled in the maar, with sampling of undisturbed samples. Electrical Resistivity Tomography surveys and 2D passive seismic arrays were also carried out to constrain the geological model and the S-wave velocity profile, respectively. Single-station noise measurements were collected in order to define natural amplification frequencies. Finally, the nonlinear cyclic soil behaviour was investigated through simple shear tests on the undisturbed samples. The collected dataset was used to define the subsoil model for 3D finite difference site response numerical analyses using the FLAC 3D software (ITASCA). Moreover, 1D and 2D numerical analyses were carried out for comparison purposes. Two different scenarios were selected as input motions: a moderate magnitude (volcanic event) and a high magnitude (tectonic event). Both earthquake scenarios revealed significant ground motion amplification (up to 15 in terms of spectral acceleration

  13. Relationships of Functional Tests Following ACL Reconstruction: Exploratory Factor Analyses of the Lower Extremity Assessment Protocol.

    Science.gov (United States)

    DiFabio, Melissa; Slater, Lindsay V; Norte, Grant; Goetschius, John; Hart, Joseph M; Hertel, Jay

    2018-03-01

    After ACL reconstruction (ACLR), deficits are often assessed using a variety of functional tests, which can be time consuming. It is unknown whether these tests provide redundant or unique information. To explore relationships between components of a battery of functional tests, the Lower Extremity Assessment Protocol (LEAP) was created to aid in developing the most informative, concise battery of tests for evaluating ACLR patients. Design: descriptive, cross-sectional. Setting: laboratory. Participants: 76 ACLR patients (6.86±3.07 months postoperative) and 54 healthy participants. Tests: isokinetic knee flexion and extension at 90 and 180 degrees/second, maximal voluntary isometric contraction for knee extension and flexion, single-leg balance, 4 hopping tasks (single, triple, crossover, and 6-meter timed hop), and a bilateral drop vertical jump scored with the Landing Error Scoring System (LESS). Outcome measures: peak torque, average torque, average power, total work, fatigue indices, center of pressure area and velocity, hop distance and time, and LESS score. A series of factor analyses were conducted to assess grouping of functional tests on the LEAP for each limb in the ACLR and healthy groups, and limb symmetry indices (LSI) for both groups. Correlations were run between measures that loaded on retained factors. Isokinetic and isometric strength tests for knee flexion and extension, hopping, balance, and fatigue index were identified as unique factors for all limbs. The LESS score loaded with various factors across the different limbs. The healthy group LSI analysis produced more factors than the ACLR LSI analysis. Individual measures within each factor had moderate to strong correlations. Isokinetic and isometric strength, hopping, balance, and fatigue index provided unique information. Within each category of measures, not all tests may need to be included for a comprehensive functional assessment of ACLR patients due to the high amount of shared variance between them.

  14. Waterborne toxoplasmosis investigated and analysed under hydrogeological assessment: new data and perspectives for further research.

    Science.gov (United States)

    Vieira, Flávia Pereira; Alves, Maria da Glória; Martins, Livia Mattos; Rangel, Alba Lucínia Peixoto; Dubey, Jitender Prakash; Hill, Dolores; Bahia-Oliveira, Lilian Maria Garcia

    2015-11-01

    We present a set of data on human and chicken Toxoplasma gondii seroprevalence that was investigated and analysed in light of groundwater vulnerability information in an area endemic for waterborne toxoplasmosis in Brazil. A hydrogeological assessment was undertaken to select sites for collecting water from wells for T. gondii oocyst testing, and for collecting blood from free-range chickens and humans for anti-T. gondii serologic testing. Serologic testing of human specimens was done using conventional commercial tests and a sporozoite-specific embryogenesis-related protein (TgERP), which is able to differentiate whether infection resulted from tissue cysts or oocysts. Water specimens were negative for the presence of viable T. gondii oocysts. However, seroprevalence in free-range chickens was significantly associated with the vulnerability of groundwater to surface contamination (p < ...). The findings on waterborne toxoplasmosis, viewed in light of groundwater vulnerability information together with prevalence in humans estimated by recognition of oocyst antigens, have implications for the potential role of hydrogeological assessment in researching waterborne toxoplasmosis at a global scale.

  15. The feasibility and utility of grocery receipt analyses for dietary assessment

    Directory of Open Access Journals (Sweden)

    Duan Yan

    2006-03-01

    Abstract. Objective: To establish the feasibility and utility of a simple data collection methodology for dietary assessment. Design: Using a cross-sectional design, trained data collectors approached adults (~20-40 years of age) at local grocery stores and asked whether they would volunteer their grocery receipts and answer a few questions for a small stipend ($1). Methods: The grocery data were divided into 3 categories, "fats, oils, and sweets," "processed foods," and "low-fat/low-calorie substitutions," each as a percentage of the total food purchase price. The questions assessed the shopper's general eating habits (e.g., fast-food consumption) and a few demographic characteristics and health aspects (e.g., perception of body size). Statistical analyses performed: Descriptive and analytic analyses using non-parametric tests were conducted in SAS. Results: Forty-eight receipts and questionnaires were collected. Nearly every respondent reported eating fast food at least once per month; 27% ate out once or twice a day. Frequency of fast-food consumption was positively related to the perceived body size of the respondent (p = 0.02). Overall, 30% of the food purchase price was for fats, oils, and sweets, 10% was for processed foods, and almost 6% was for low-fat/low-calorie substitutions. Households where no one was perceived to be overweight spent a smaller proportion of their food budget on fats, oils, and sweets than did households where at least one person was perceived to be overweight (p = 0.10); households where the spouse was not perceived to be overweight spent less on fats, oils, and sweets (p = 0.02) and more on low-fat/low-calorie substitutions (p = 0.09) than did households where the spouse was perceived to be overweight; and respondents who perceived themselves to be overweight spent more on processed foods than did respondents who did not perceive themselves to be overweight (p = 0.06). Conclusion: This simple dietary assessment method, although global in

  16. Reporting characteristics of meta-analyses in orthodontics: methodological assessment and statistical recommendations.

    Science.gov (United States)

    Papageorgiou, Spyridon N; Papadopoulos, Moschos A; Athanasiou, Athanasios E

    2014-02-01

    Ideally, meta-analyses (MAs) should consolidate the characteristics of orthodontic research in order to produce an evidence-based answer. However, severe flaws are frequently observed in most of them. The aim of this study was to evaluate the statistical methods, the methodology, and the quality characteristics of orthodontic MAs, and to assess their reporting quality during the last years. Electronic databases were searched for MAs (with or without a proper systematic review) in the field of orthodontics, indexed up to 2011. The AMSTAR tool was used for quality assessment of the included articles. Data were analyzed with Student's t-test, one-way ANOVA, and generalized linear modelling. Risk ratios with 95% confidence intervals were calculated to represent changes during the years in the reporting of key items associated with quality. A total of 80 MAs with 1086 primary studies were included in this evaluation. Using the AMSTAR tool, 25 (27.3%) of the MAs were found to be of low quality, 37 (46.3%) of medium quality, and 18 (22.5%) of high quality. Specific characteristics like explicit protocol definition, extensive searches, and quality assessment of included trials were associated with a higher AMSTAR score. Model selection and dealing with heterogeneity or publication bias were often problematic in the identified reviews. The number of published orthodontic MAs is constantly increasing, while their overall quality is considered to range from low to medium. Although the number of MAs of medium and high level seems lately to rise, several other aspects need improvement to increase their overall quality.
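
    As a small worked example of the risk-ratio calculation mentioned above, the sketch below computes an RR with a 95% confidence interval via the usual log-normal approximation (the counts are invented, not taken from the study):

        import numpy as np

        # Reporting of a key item in later vs earlier meta-analyses (made-up counts).
        a, n1 = 30, 40     # later MAs reporting the item / total later MAs
        c, n0 = 15, 40     # earlier MAs reporting the item / total earlier MAs

        rr = (a / n1) / (c / n0)                       # risk ratio
        se = np.sqrt(1/a - 1/n1 + 1/c - 1/n0)          # SE of log(RR)
        lo, hi = rr * np.exp(-1.96 * se), rr * np.exp(1.96 * se)
        print(f"RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")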

  17. An empirical study on open position risk assessment using VAR and regression analysis: A case study of Iranian banking industry

    Directory of Open Access Journals (Sweden)

    Elmira Mahmoudzadeh

    2012-10-01

    Abstract. During the past few years, there have been tremendous fluctuations in different currencies. For instance, the European common currency, the Euro, has fluctuated between 0.60 and 0.90 against the US dollar. Therefore, it is important to study the behavior of currency valuations using different techniques. In this paper, we present an empirical study to measure the impact of different items on the risk of foreign currency using value at risk (VaR) and regression methods. The proposed model investigates whether the risk of open positions in six foreign currencies, including the US Dollar, Euro, British Pound, Swiss Franc, Norwegian Krone and United Arab Emirates Dirham, increases over the time horizon. The study uses historical daily prices of these currencies for the fiscal year 2011 at a private bank located in Iran and measures the relative risk. The results of the implementation of the two methods, VaR and linear regression, indicate that the risk of open positions increases over the time horizon.
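
    For readers unfamiliar with the mechanics, a one-day historical-simulation VaR for a single open currency position can be sketched in a few lines of Python (the rate series and position size below are synthetic placeholders, not the bank's data):

        import numpy as np

        rng = np.random.default_rng(1)
        # Synthetic daily exchange-rate series (local currency per unit of FX).
        rates = 10000.0 * np.exp(np.cumsum(rng.normal(0.0, 0.01, 250)))
        position = 1_000_000.0        # open position, units of foreign currency

        pnl = position * np.diff(rates)        # daily P&L in local currency
        var_95 = -np.percentile(pnl, 5)        # loss exceeded on 5% of days
        print(f"1-day 95% historical VaR: {var_95:,.0f} (local currency)")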

  18. Thermodynamic analyses and assessments of various thermal energy storage systems for buildings

    International Nuclear Information System (INIS)

    Caliskan, Hakan; Dincer, Ibrahim; Hepbasli, Arif

    2012-01-01

    Highlights: ► Proposing a novel latent (PCM), thermochemical and sensible (aquifer) TES combination for building heating. ► Performing comprehensive environmental, energy, exergy and sustainability analyses. ► Investigating the effect of varying dead state temperatures on the TESs. - Abstract: In this study, energetic, exergetic, environmental and sustainability analyses and their assessments are carried out for latent, thermochemical and sensible thermal energy storage (TES) systems for phase change material (PCM) supported building applications under varying environment (surrounding) temperatures. The present system consists of a floor heating system, System-I, System-II and System-III. The floor heating system sits at the building floor, supported by a floor heating unit and pump. System-I includes a latent TES system and a fan. The latent TES system is comprised of a PCM-supported building envelope, in which, from outside to inside, glass, transparent insulation material, PCM, air channel and insulation material are placed, respectively. System-II is mainly based on a solar-thermochemical TES, while System-III contains an aquifer TES and a heat pump. Among the TESs, the hot and cold wells of the aquifer TES have maximum exergetic efficiency values of 88.782% and 69.607% at 8 °C dead state temperature, respectively. According to the energy efficiency aspects of the TESs, the discharging processes of the latent TES and the hot well of the aquifer TES possess the minimum and maximum values of 5.782% and 94.118% at 8 °C dead state temperature, respectively. Also, the fan used with the latent TES is the most environmentally benign system component among the devices. Furthermore, the most sustainable TES is found to be the aquifer TES, while the least sustainable system is the latent TES.
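
    The dead-state dependence noted above follows directly from the definition of thermal exergy, Ex = Q(1 - T0/T). A minimal Python sketch (with invented heat quantities and temperatures, not the paper's data) shows how an exergetic storage efficiency is evaluated at a given dead-state temperature:

        # Thermal exergy relative to a dead state T0: Ex = Q * (1 - T0/T).
        def heat_exergy(q_kj, t_k, t0_k):
            """Exergy (kJ) of heat q_kj delivered at temperature t_k (Carnot factor)."""
            return q_kj * (1.0 - t0_k / t_k)

        t0 = 273.15 + 8.0                             # dead state at 8 degC
        ex_charged = heat_exergy(1000.0, 330.0, t0)   # hypothetical charging heat
        ex_recovered = heat_exergy(800.0, 320.0, t0)  # hypothetical recovered heat
        print(f"Exergy efficiency: {100.0 * ex_recovered / ex_charged:.1f}%")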

  19. Binary Logistic Regression Versus Boosted Regression Trees in Assessing Landslide Susceptibility for Multiple-Occurring Regional Landslide Events: Application to the 2009 Storm Event in Messina (Sicily, southern Italy).

    Science.gov (United States)

    Lombardo, L.; Cama, M.; Maerker, M.; Parisi, L.; Rotigliano, E.

    2014-12-01

    This study aims at comparing the performance of Binary Logistic Regression (BLR) and Boosted Regression Trees (BRT) methods in assessing landslide susceptibility for multiple-occurrence regional landslide events within the Mediterranean region. A test area was selected in the north-eastern sector of Sicily (southern Italy), corresponding to the catchments of the Briga and the Giampilieri streams, both stretching for a few kilometres from the Peloritan ridge (eastern Sicily, Italy) to the Ionian Sea. This area was struck on 1 October 2009 by an extreme climatic event resulting in thousands of rapid shallow landslides, mainly of the debris flow and debris avalanche types, involving the weathered layer of a low- to high-grade metamorphic bedrock. Exploiting the same set of predictors and the 2009 landslide archive, BLR- and BRT-based susceptibility models were obtained for the two catchments separately, adopting a random partition (RP) technique for validation; in addition, the models trained in one of the two catchments (Briga) were tested in predicting the landslide distribution in the other (Giampilieri), adopting a spatial partition (SP) based validation procedure. All the validation procedures were based on multi-fold tests so as to evaluate and compare the reliability of the fitting, the prediction skill, the coherence in predictor selection, and the precision of the susceptibility estimates. All the obtained models for the two methods produced very high predictive performances, with general congruence between BLR and BRT in predictor importance. In particular, the research highlighted that BRT models reached a higher prediction performance than BLR models for RP-based modelling, whilst for the SP-based models the difference in predictive skill between the two methods dropped drastically, converging to an analogous excellent performance. However, when looking at the precision of the probability estimates, BLR demonstrated to produce more robust
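
    The core of such a comparison is easy to reproduce. The Python sketch below fits a logistic regression and a gradient-boosted tree classifier to a synthetic presence/absence dataset and compares test AUCs (scikit-learn's GradientBoostingClassifier stands in here for the BRT implementation used in the study; all data are simulated):

        from sklearn.datasets import make_classification
        from sklearn.ensemble import GradientBoostingClassifier
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score
        from sklearn.model_selection import train_test_split

        # Synthetic stand-in for a landslide presence/absence grid with predictors.
        X, y = make_classification(n_samples=2000, n_features=10,
                                   n_informative=6, random_state=0)
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

        for name, model in [("BLR", LogisticRegression(max_iter=1000)),
                            ("BRT", GradientBoostingClassifier(random_state=0))]:
            model.fit(X_tr, y_tr)
            auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
            print(f"{name}: test AUC = {auc:.3f}")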

  20. Assessment of triglyceride and cholesterol in overweight people based on multiple linear regression and artificial intelligence model.

    Science.gov (United States)

    Ma, Jing; Yu, Jiong; Hao, Guangshu; Wang, Dan; Sun, Yanni; Lu, Jianxin; Cao, Hongcui; Lin, Feiyan

    2017-02-20

    The prevalence of hyperlipemia is increasing around the world. Our aims are to analyze the relationship of triglyceride (TG) and cholesterol (TC) with indexes of liver function and kidney function, and to develop a prediction model for TG and TC in overweight people. A total of 302 healthy adult subjects and 273 overweight subjects were enrolled in this study. The levels of fasting TG (fs-TG), fasting TC (fs-TC), blood glucose, liver function, and kidney function were measured and analyzed by correlation analysis and multiple linear regression (MLR). A back-propagation artificial neural network (BP-ANN) was applied to develop prediction models for fs-TG and fs-TC. The results showed there were significant differences in biochemical indexes between healthy and overweight people. The correlation analysis showed fs-TG was related to weight, height, blood glucose, and indexes of liver and kidney function, while fs-TC was correlated with age and indexes of liver function (P < 0.01). The MLR analysis indicated that the regression equations of fs-TG and fs-TC were both statistically significant (P < 0.01) when the independent indexes were included. The BP-ANN model of fs-TG reached its training goal at epoch 59, while the fs-TC model achieved high prediction accuracy after training for 1000 epochs. In conclusion, fs-TG and fs-TC were strongly related to weight, height, age, blood glucose, and indexes of liver and kidney function. Based on the related variables, fs-TG and fs-TC can be predicted by BP-ANN models in overweight people.
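
    To make the modelling contrast concrete, the sketch below compares multiple linear regression with a small back-propagation network on synthetic data (scikit-learn's MLPRegressor plays the role of the BP-ANN; the feature names and data are invented, not the study's):

        import numpy as np
        from sklearn.linear_model import LinearRegression
        from sklearn.model_selection import train_test_split
        from sklearn.neural_network import MLPRegressor
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(0)
        n = 500
        # Columns stand in for e.g. weight, height, age, glucose, ALT, creatinine.
        X = rng.normal(size=(n, 6))
        y = (X @ np.array([0.6, -0.3, 0.2, 0.5, 0.3, 0.1])
             + 0.4 * np.sin(X[:, 0])               # mild non-linearity
             + rng.normal(0, 0.3, n))

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

        mlr = LinearRegression().fit(X_tr, y_tr)
        ann = make_pipeline(StandardScaler(),
                            MLPRegressor(hidden_layer_sizes=(16,),
                                         max_iter=2000, random_state=0)).fit(X_tr, y_tr)

        print(f"MLR R^2:    {mlr.score(X_te, y_te):.3f}")
        print(f"BP-ANN R^2: {ann.score(X_te, y_te):.3f}")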

  1. Exploring pre-service science teachers' pedagogical capacity for formative assessment through analyses of student answers

    Science.gov (United States)

    Aydeniz, Mehmet; Dogan, Alev

    2016-05-01

    Background: There has been an increasing emphasis in recent years on empowering pre-service and in-service science teachers to attend to student reasoning and to use formative assessments to guide student learning. Purpose: The purpose of this study was to explore pre-service science teachers' pedagogical capacity for formative assessment. Sample: This study took place in Turkey. The participants include 53 pre-service science teachers in their final year of schooling. All but two of the participants are female. Design and methods: We used a mixed-methods methodology in pursuing this inquiry. Participants analyzed 28 responses to seven two-tiered questions given by four students of different ability levels. We explored their ability to identify the strengths and weaknesses in students' answers. We paid particular attention to the things that the pre-service science teachers noticed in students' explanations, the types of inferences they made about students' conceptual understanding, and the affordances of the pedagogical decisions they made. Results: The results show that the majority of participants made an evaluative judgment (i.e. the answer is correct or incorrect) in their analyses of students' answers. Similarly, the majority of the participants recognized the type of mistake that the students made. However, they failed to successfully elaborate on fallacies, limitations, or strengths in student reasoning. We also asked the participants to make pedagogical decisions related to what needs to be done next in order to help the students achieve the academic objectives. Results show that 8% of the recommended instructional strategies were of no affordance, 64% of low affordance, and 28% of high affordance in terms of helping students achieve the academic objectives. Conclusion: If our goal is to improve pre-service science teachers' noticing skills, and the affordance of the feedback that they provide, engaging them in activities that ask them to attend to students' ideas

  2. Multidimensional analyses to assess the relations between treatment choices by physicians and patients’ characteristics: the example of COPD

    Directory of Open Access Journals (Sweden)

    Roche Nicolas

    2012-08-01

    Abstract. Background: In some situations, practice guidelines do not provide firm evidence-based guidance regarding COPD treatment choices, especially when large trials have failed to identify subgroups of particularly good or poor responders to available medications. Methods: This observational cross-sectional study explored the yield of four types of multidimensional analyses to assess the associations between the clinical characteristics of COPD patients and the pharmacological and non-pharmacological treatments prescribed by lung specialists in a real-life context. Results: Altogether, 2494 patients were recruited by 515 respiratory physicians. Multiple correspondence analysis and hierarchical clustering identified 6 clinical subtypes and 6 treatment subgroups. Strong bi-directional associations were found between clinical subtypes and treatment subgroups in multivariate logistic regression. However, although the overall frequency of prescriptions varied from one clinical subtype to the other for all types of pharmacological treatments, clinical subtypes were not associated with specific prescription profiles. When canonical analysis of redundancy was used, the proportion of variation in pharmacological treatments that was explained by clinical characteristics remained modest: 6.23%. This proportion was greater (14.29%) for non-pharmacological components of care. Conclusion: This study shows that, although pharmacological treatments of COPD are quantitatively very well related to patients' clinical characteristics, there is no particular patient profile that could be qualitatively associated with prescriptions. This underlines the uncertainties perceived by physicians in differentiating the respective effects of available pharmacological treatments. The methodology applied here is useful for identifying areas of uncertainty requiring further research and/or guideline clarification.

  3. Assessment of abnormal brain structures and networks in major depressive disorder using morphometric and connectome analyses.

    Science.gov (United States)

    Chen, Vincent Chin-Hung; Shen, Chao-Yu; Liang, Sophie Hsin-Yi; Li, Zhen-Hui; Tyan, Yeu-Sheng; Liao, Yin-To; Huang, Yin-Chen; Lee, Yena; McIntyre, Roger S; Weng, Jun-Cheng

    2016-11-15

    It is hypothesized that the phenomenology of major depressive disorder (MDD) is subserved by disturbances in the structure and function of brain circuits; however, findings of structural abnormalities using MRI have been inconsistent. Generalized q-sampling imaging (GQI) methodology provides an opportunity to assess the functional integrity of white matter tracts in the implicated circuits. The study population comprised 16 outpatients with MDD (mean age 44.81±2.2 years) and 30 age- and gender-matched healthy controls (mean age 45.03±1.88 years). We excluded participants with any other primary mental disorder, substance use disorder, or any neurological illness. We used T1-weighted 3D MRI with voxel-based morphometry (VBM) and vertex-wise shape analysis, and GQI with voxel-based statistical analysis (VBA), graph theoretical analysis (GTA) and network-based statistical (NBS) analysis, to evaluate brain structure and connectivity abnormalities in MDD compared to healthy controls and to correlate them with clinical measures of depressive symptom severity: the 17-item Hamilton Depression Rating Scale (HAMD) and the Hospital Anxiety and Depression Scale (HADS). Using VBM and vertex-wise shape analyses, we found significant volumetric decreases in the hippocampus and amygdala among subjects with MDD (p < ...), findings consistent with a disorder of abnormal circuit structure and connectivity. Copyright © 2016 Elsevier B.V. All rights reserved.

  4. Assessing an organizational culture instrument based on the Competing Values Framework: Exploratory and confirmatory factor analyses

    Science.gov (United States)

    Helfrich, Christian D; Li, Yu-Fang; Mohr, David C; Meterko, Mark; Sales, Anne E

    2007-01-01

    Background The Competing Values Framework (CVF) has been widely used in health services research to assess organizational culture as a predictor of quality improvement implementation, employee and patient satisfaction, and team functioning, among other outcomes. CVF instruments generally are presented as well-validated with reliable aggregated subscales. However, only one study in the health sector has been conducted for the express purpose of validation, and that study population was limited to hospital managers from a single geographic locale. Methods We used exploratory and confirmatory factor analyses to examine the underlying structure of data from a CVF instrument. We analyzed cross-sectional data from a work environment survey conducted in the Veterans Health Administration (VHA). The study population comprised all staff in non-supervisory positions. The survey included 14 items adapted from a popular CVF instrument, which measures organizational culture according to four subscales: hierarchical, entrepreneurial, team, and rational. Results Data from 71,776 non-supervisory employees (approximate response rate 51%) from 168 VHA facilities were used in this analysis. Internal consistency of the subscales was moderate to strong (α = 0.68 to 0.85). However, the entrepreneurial, team, and rational subscales had higher correlations across subscales than within, indicating poor divergent properties. Exploratory factor analysis revealed two factors, comprising the ten items from the entrepreneurial, team, and rational subscales loading on the first factor, and two items from the hierarchical subscale loading on the second factor, along with one item from the rational subscale that cross-loaded on both factors. Results from confirmatory factor analysis suggested that the two-subscale solution provides a more parsimonious fit to the data as compared to the original four-subscale model. Conclusion This study suggests that there may be problems applying conventional
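
    A minimal version of such an exploratory check can be run with scikit-learn (this assumes scikit-learn >= 0.24 for the varimax rotation option; the survey responses below are simulated with a deliberate two-factor structure, not VHA data):

        import numpy as np
        from sklearn.decomposition import FactorAnalysis

        rng = np.random.default_rng(0)
        n, k = 1000, 14
        latent = rng.normal(size=(n, 2))
        loadings = np.zeros((2, k))
        loadings[0, :10] = rng.uniform(0.6, 0.9, 10)   # items 1-10 load on factor 1
        loadings[1, 10:] = rng.uniform(0.6, 0.9, 4)    # items 11-14 load on factor 2
        items = latent @ loadings + rng.normal(0, 0.5, size=(n, k))

        fa = FactorAnalysis(n_components=2, rotation="varimax").fit(items)
        print(np.round(fa.components_.T, 2))           # item-by-factor loading matrix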

  5. A histological evaluation and in vivo assessment of intratumoral near infrared photothermal nanotherapy-induced tumor regression

    Directory of Open Access Journals (Sweden)

    Green HN

    2014-11-01

    Hadiyah N Green,1,2 Stephanie D Crockett,3 Dmitry V Martyshkin,1 Karan P Singh,2,4 William E Grizzle,2,5 Eben L Rosenthal,2,6 Sergey B Mirov1 (1Department of Physics, Center for Optical Sensors and Spectroscopies; 2Comprehensive Cancer Center; 3Department of Pediatrics, Division of Neonatology; 4Department of Medicine, Division of Preventive Medicine, Biostatistics and Bioinformatics Shared Facility; 5Department of Pathology; 6Department of Surgery, Division of Otolaryngology, Head and Neck Surgery, The University of Alabama at Birmingham, Birmingham, AL, USA). Purpose: Nanoparticle (NP)-enabled near infrared (NIR) photothermal therapy has realized limited success in in vivo studies as a potential localized cancer therapy. This is primarily due to a lack of successful methods that can prevent NP uptake by the reticuloendothelial system, especially the liver and kidney, and deliver sufficient quantities of intravenously injected NPs to the tumor site. Histological evaluation of photothermal therapy-induced tumor regression is also neglected in the current literature. This report demonstrates and histologically evaluates the in vivo potential of NIR photothermal therapy by circumventing the challenges of intravenous NP delivery and tumor targeting found in other photothermal therapy studies. Methods: Subcutaneous Cal 27 squamous cell carcinoma xenografts received photothermal nanotherapy treatments: radial injections of polyethylene glycol (PEG)-ylated gold nanorods and a single NIR 785 nm laser irradiation for 10 minutes at 9.5 W/cm2. Tumor response was measured for 10-15 days, gross changes in tumor size were evaluated, and the remaining tumors or scar tissues were excised and histologically analyzed. Results: The single treatment of intratumoral nanorod injections followed by a 10 minute NIR laser treatment, also known as photothermal nanotherapy, resulted in ~100% tumor regression in ~90% of treated tumors, which was statistically significant in a

  6. A statistical regression model for the estimation of acrylamide concentrations in French fries for excess lifetime cancer risk assessment.

    Science.gov (United States)

    Chen, Ming-Jen; Hsu, Hui-Tsung; Lin, Cheng-Li; Ju, Wei-Yuan

    2012-10-01

    Human exposure to acrylamide (AA) through consumption of French fries and other foods has been recognized as a potential health concern. Here, we used a statistical non-linear regression model, based on the two most influential factors, cooking temperature and time, to estimate AA concentrations in French fries. The R2 of the predictive model is 0.83, suggesting the developed model is significant and valid. Based on French fry intake survey data collected in this study and eight frying temperature-time schemes that produce tasty and visually appealing French fries, the Monte Carlo simulation results showed that if the AA concentration is higher than 168 ppb, the estimated cancer risk for adolescents aged 13-18 years in Taichung City would already exceed the target excess lifetime cancer risk (ELCR), even when taking into account this limited life span only. In order to reduce the cancer risk associated with AA intake, the AA levels in French fries might have to be reduced even further if the epidemiological observations are valid. Our mathematical model can serve as a basis for further investigations of the ELCR, including different life stages, behaviors, and population groups. Copyright © 2012 Elsevier Ltd. All rights reserved.
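
    The general workflow, fitting a non-linear concentration model on temperature and time and then propagating it through a Monte Carlo intake distribution, can be sketched as follows (the model form, coefficients, and intake distribution are all hypothetical, not the paper's):

        import numpy as np
        from scipy.optimize import curve_fit

        def aa_model(X, a, b, c):
            """AA concentration (ppb) as a function of frying temperature and time."""
            temp, time = X
            return a * np.exp(b * temp) * time**c

        rng = np.random.default_rng(0)
        temp = rng.uniform(160, 190, 60)      # frying temperature, deg C
        time = rng.uniform(2, 8, 60)          # frying time, min
        aa = 0.03 * np.exp(0.04 * temp) * time**0.8 * rng.lognormal(0, 0.1, 60)

        params, _ = curve_fit(aa_model, (temp, time), aa, p0=(0.01, 0.03, 1.0))
        conc = aa_model((180.0, 5.0), *params)
        print(f"Predicted AA at 180 degC / 5 min: {conc:.0f} ppb")

        # Monte Carlo exposure: predicted concentration (ug/kg, i.e. ppb)
        # combined with a random daily intake of fries (g/day).
        intake_g = rng.lognormal(np.log(60), 0.5, 10_000)
        exposure_ug = conc * intake_g / 1000.0
        print(f"Median exposure: {np.median(exposure_ug):.1f} ug/day")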

  7. The use of molecular analyses in voided urine for the assessment of patients with hematuria

    DEFF Research Database (Denmark)

    Beukers, Willemien; Kandimalla, Raju; van Houwelingen, Diandra

    2013-01-01

    ... variables into a logistic regression model. Results: Logistic regression analysis based on the five methylation markers, age, gender, and type of hematuria resulted in an area under the curve (AUC) of 0.88 and an optimism-corrected AUC of 0.84 after internal validation by bootstrapping. Using a cut-off value ... examination of low-risk patients and, thereby, reducing patient burden and costs. Further validation in a large prospective patient cohort is necessary to prove the true clinical value of this model.
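
    The optimism-corrected AUC reported above is typically obtained with a Harrell-style bootstrap, which the following Python sketch reproduces on synthetic data (the markers and model here are simulated stand-ins, not the study's):

        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score

        X, y = make_classification(n_samples=400, n_features=8, random_state=0)
        rng = np.random.default_rng(0)

        model = LogisticRegression(max_iter=1000).fit(X, y)
        apparent = roc_auc_score(y, model.predict_proba(X)[:, 1])

        optimism = []
        for _ in range(200):
            idx = rng.integers(0, len(y), len(y))        # bootstrap resample
            m = LogisticRegression(max_iter=1000).fit(X[idx], y[idx])
            auc_boot = roc_auc_score(y[idx], m.predict_proba(X[idx])[:, 1])
            auc_orig = roc_auc_score(y, m.predict_proba(X)[:, 1])
            optimism.append(auc_boot - auc_orig)         # per-replicate optimism

        print(f"Apparent AUC:           {apparent:.3f}")
        print(f"Optimism-corrected AUC: {apparent - np.mean(optimism):.3f}")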

  8. Kidney function changes with aging in adults: comparison between cross-sectional and longitudinal data analyses in renal function assessment.

    Science.gov (United States)

    Chung, Sang M; Lee, David J; Hand, Austin; Young, Philip; Vaidyanathan, Jayabharathi; Sahajwalla, Chandrahas

    2015-12-01

    The study evaluated whether the renal function decline rate per year with age in adults varies based on two primary statistical analyses: cross-section (CS), using one observation per subject, and longitudinal (LT), using multiple observations per subject over time. A total of 16628 records (3946 subjects; age range 30-92 years) of creatinine clearance and relevant demographic data were used. On average, four samples per subject were collected for up to 2364 days (mean: 793 days). A simple linear regression and random coefficient models were selected for CS and LT analyses, respectively. The renal function decline rates per year were 1.33 and 0.95 ml/min/year for CS and LT analyses, respectively, and were slower when the repeated individual measurements were considered. The study confirms that rates are different based on statistical analyses, and that a statistically robust longitudinal model with a proper sampling design provides reliable individual as well as population estimates of the renal function decline rates per year with age in adults. In conclusion, our findings indicated that one should be cautious in interpreting the renal function decline rate with aging information because its estimation was highly dependent on the statistical analyses. From our analyses, a population longitudinal analysis (e.g. random coefficient model) is recommended if individualization is critical, such as a dose adjustment based on renal function during a chronic therapy. Copyright © 2015 John Wiley & Sons, Ltd.
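
    The cross-sectional versus longitudinal contrast can be illustrated with statsmodels: an OLS fit on one visit per subject versus a random-coefficient mixed model on all visits. Everything below is simulated, with a deliberate cohort effect so that the two estimates diverge as in the study; the numbers are illustrative only.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(0)
        subjects, visits = 300, 4
        age0 = rng.uniform(30, 80, subjects)
        # Cohort effect: older cohorts start lower, steepening the CS slope.
        subj_int = -0.4 * (age0 - 30) + rng.normal(0, 8, subjects)
        subj_slope = rng.normal(-0.9, 0.3, subjects)   # within-subject decline/yr

        rows = []
        for i in range(subjects):
            for v in range(visits):
                age = age0[i] + 2.0 * v
                crcl = 120 + subj_int[i] + subj_slope[i] * (age - 30) + rng.normal(0, 4)
                rows.append((i, age, crcl))
        df = pd.DataFrame(rows, columns=["subject", "age", "crcl"])

        # Cross-sectional: one (first) observation per subject, simple OLS.
        cs = smf.ols("crcl ~ age", data=df.groupby("subject").first()).fit()
        # Longitudinal: random intercept and slope per subject.
        lt = smf.mixedlm("crcl ~ age", df, groups=df["subject"], re_formula="~age").fit()

        print(f"CS decline: {-cs.params['age']:.2f} mL/min/yr")
        print(f"LT decline: {-lt.params['age']:.2f} mL/min/yr")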

  9. Landslide susceptibility assessment using logistic regression and its comparison with a rock mass classification system, along a road section in the northern Himalayas (India)

    Science.gov (United States)

    Das, Iswar; Sahoo, Sashikant; van Westen, Cees; Stein, Alfred; Hack, Robert

    2010-02-01

    Landslide studies are commonly guided by ground knowledge and field measurements of rock strength and slope failure criteria. With increasing sophistication of GIS-based statistical methods, however, landslide susceptibility studies benefit from the integration of data collected from various sources and methods at different scales. This study presents a logistic regression method for landslide susceptibility mapping and verifies the result by comparing it with the geotechnical-based slope stability probability classification (SSPC) methodology. The study was carried out in a landslide-prone national highway road section in the northern Himalayas, India. Logistic regression model performance was assessed by the receiver operator characteristics (ROC) curve, showing an area under the curve equal to 0.83. Field validation of the SSPC results showed a correspondence of 72% between the high and very high susceptibility classes with present landslide occurrences. A spatial comparison of the two susceptibility maps revealed the significance of the geotechnical-based SSPC method as 90% of the area classified as high and very high susceptible zones by the logistic regression method corresponds to the high and very high class in the SSPC method. On the other hand, only 34% of the area classified as high and very high by the SSPC method falls in the high and very high classes of the logistic regression method. The underestimation by the logistic regression method can be attributed to the generalisation made by the statistical methods, so that a number of slopes existing in critical equilibrium condition might not be classified as high or very high susceptible zones.

  10. Secondary mediation and regression analyses of the PTClinResNet database: determining causal relationships among the International Classification of Functioning, Disability and Health levels for four physical therapy intervention trials.

    Science.gov (United States)

    Mulroy, Sara J; Winstein, Carolee J; Kulig, Kornelia; Beneck, George J; Fowler, Eileen G; DeMuth, Sharon K; Sullivan, Katherine J; Brown, David A; Lane, Christianne J

    2011-12-01

    Each of the 4 randomized clinical trials (RCTs) hosted by the Physical Therapy Clinical Research Network (PTClinResNet) targeted a different disability group (low back disorder in the Muscle-Specific Strength Training Effectiveness After Lumbar Microdiskectomy [MUSSEL] trial, chronic spinal cord injury in the Strengthening and Optimal Movements for Painful Shoulders in Chronic Spinal Cord Injury [STOMPS] trial, adult stroke in the Strength Training Effectiveness Post-Stroke [STEPS] trial, and pediatric cerebral palsy in the Pediatric Endurance and Limb Strengthening [PEDALS] trial for children with spastic diplegic cerebral palsy) and tested the effectiveness of a muscle-specific or functional activity-based intervention on primary outcomes that captured pain (STOMPS, MUSSEL) or locomotor function (STEPS, PEDALS). The focus of these secondary analyses was to determine causal relationships among outcomes across levels of the International Classification of Functioning, Disability and Health (ICF) framework for the 4 RCTs. With the database from PTClinResNet, we used two separate secondary statistical approaches (mediation analysis for the MUSSEL and STOMPS trials, regression analysis for the STEPS and PEDALS trials) to test relationships among muscle performance, primary outcomes (pain related and locomotor related), activity and participation measures, and overall quality of life. Predictive models were stronger for the 2 studies with pain-related primary outcomes. Change in muscle performance mediated or predicted reductions in pain for the MUSSEL and STOMPS trials and, to some extent, walking speed for the STEPS trial. Changes in primary outcome variables were significantly related to changes in activity and participation variables for all 4 trials. Improvement in activity and participation outcomes mediated or predicted increases in overall quality of life for the 3 trials with adult populations. Variables included in the statistical models were limited to those
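
    A classic way to test such mediation is the Baron and Kenny regression sequence, sketched below on simulated data (the variable names only mirror the trial constructs; effect sizes are invented):

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(0)
        n = 200
        treat = rng.integers(0, 2, n)                    # 1 = strength training
        muscle = 5 * treat + rng.normal(0, 3, n)         # mediator: strength change
        pain = -0.8 * muscle - 1.0 * treat + rng.normal(0, 4, n)   # outcome
        df = pd.DataFrame({"treat": treat, "muscle": muscle, "pain": pain})

        total = smf.ols("pain ~ treat", df).fit()              # path c
        med = smf.ols("muscle ~ treat", df).fit()              # path a
        direct = smf.ols("pain ~ treat + muscle", df).fit()    # paths c' and b

        indirect = med.params["treat"] * direct.params["muscle"]   # a*b
        print(f"Total effect:   {total.params['treat']:.2f}")
        print(f"Direct effect:  {direct.params['treat']:.2f}")
        print(f"Indirect (a*b): {indirect:.2f}")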

  11. Performance Assessment and Sensitivity Analyses of Disposal of Plutonium as Can-in-Canister Ceramic

    International Nuclear Information System (INIS)

    Rainer Senger

    2001-01-01

    The purpose of this analysis is to examine whether there is a justification for using high-level waste (HLW) as a surrogate for plutonium disposal in can-in-canister ceramic in the total-system performance assessment (TSPA) model for the Site Recommendation (SR). In the TSPA-SR model, the immobilized plutonium waste form is not explicitly represented, but is implicitly represented as an equal number of canisters of HLW. There are about 50 metric tons of plutonium in the U.S. Department of Energy inventory of surplus fissile material that could be disposed of. Approximately 17 tons of this material contain significant quantities of impurities and are considered unsuitable for mixed-oxide (MOX) reactor fuel. This material has been designated for direct disposal by immobilization in a ceramic waste form and encapsulation of this waste form in high-level waste (HLW). The remaining plutonium is suitable for incorporation into MOX fuel assemblies for commercial reactors (Shaw 1999, Section 2). In this analysis, two cases of immobilized plutonium disposal are analyzed, the 17-ton case and the 13-ton case (Shaw et al. 2001, Section 2.2). MOX spent-fuel disposal is not analyzed in this report. In the TSPA-VA (CRWMS M and O 1998a, Appendix B, Section B-4), the calculated dose release from the immobilized plutonium waste form (can-in-canister ceramic) did not exceed that from an equivalent amount of HLW glass. This indicates that HLW could be used as a surrogate for the plutonium can-in-canister ceramic. Representation of the can-in-canister ceramic by a surrogate is necessary to reduce the number of waste forms in the TSPA model. This reduction reduces the complexity and running time of the TSPA model and makes the analyses tractable. This document was developed under a Technical Work Plan (CRWMS M and O 2000a), and is compliant with that plan. The application of the Quality Assurance (QA) program to the development of that plan (CRWMS M and O 2000a) and of this Analysis is

  12. On the role of environmental corruption in healthcare infrastructures: An empirical assessment for Italy using DEA with truncated regression approach.

    Science.gov (United States)

    Cavalieri, Marina; Guccio, Calogero; Rizzo, Ilde

    2017-05-01

    This paper investigates empirically whether the institutional features of the contracting authority, as well as the level of 'environmental' corruption in the area where the work is localised, affect the efficient execution of public contracts for healthcare infrastructures. A two-stage Data Envelopment Analysis (DEA) is carried out based on a sample of Italian public contracts for healthcare infrastructures during the period 2000-2005. First, a smoothed bootstrapped DEA estimator is used to assess the relative efficiency in the implementation of each single infrastructure contract. Second, the determinants of the variability in efficiency scores are considered, paying special attention to the effect exerted by 'environmental' corruption on different types of contracting authorities. Our results show that the performance of contracts for healthcare infrastructures is significantly affected by 'environmental' corruption. Furthermore, healthcare contracting authorities are, on average, less efficient, and the negative effect of corruption on efficiency is greater for this type of public procurer. The policy recommendation coming out of the study is to rely on 'qualified' contracting authorities, since not all public bodies have the necessary expertise to carry out public contracts for healthcare infrastructures efficiently. Copyright © 2017. Published by Elsevier B.V.

  13. Assessing the Determinants of Renewable Electricity Acceptance Integrating Meta-Analysis Regression and a Local Comprehensive Survey

    Directory of Open Access Journals (Sweden)

    Simona Bigerna

    2015-08-01

    Abstract. In dealing with renewable electricity (RE), individuals are involved both as end-consumers on the demand side and as stakeholders (citizens) in the local production process on the supply side. Empirical evidence shows that in many countries consumers are willing to pay a significant amount to facilitate the adoption of RE. By contrast, environmental externalities are often the cause of strong opposition to RE adoption when local communities are involved as stakeholders in wind, solar or biomass investment projects. Looking at the literature on willingness to pay and on willingness to accept, we have investigated RE acceptance mechanisms. First, we used meta-analysis to assess the major determinants of RE acceptance on both the demand and supply sides. The meta-analysis provided insights useful for managing field research on an onshore wind farm enlargement project located in the Umbria region. The meta-analysis and survey results confirm that the local community plays a central role in local RE acceptance. Furthermore, people who have previous experience with windmills require less compensation, or are willing to pay more, for RE development. The results suggest that these attributes should be included in future research to improve understanding of the determinants of RE acceptance.

  14. Harmonisation of food consumption data format for dietary exposure assessments of chemicals analysed in raw agricultural commodities

    DEFF Research Database (Denmark)

    Boon, Polly E.; Ruprich, Jiri; Petersen, Annette

    2009-01-01

    In this paper, we present an approach to format national food consumption data at the raw agricultural commodity (RAC) level. In this way, the data are both formatted in a harmonised way, given the comparability of RACs between countries, and suitable for assessing the dietary exposure to chemicals analysed ..., and the use of the FAO/WHO Codex Classification system of Foods and Animal Feeds to harmonise the classification. We demonstrate that this approach works well for pesticides and glycoalkaloids, and is an essential step forward in the harmonisation of risk assessment procedures within Europe when addressing ... chemicals analysed in RACs by all national food control systems.

  15. Assessment of diagnostic value of various tumors markers (CEA, CA199, CA50) for colorectal neoplasm with logistic regression and ROC curve

    International Nuclear Information System (INIS)

    Gu Ping; Huang Gang; Han Yuan

    2007-01-01

    Objective: To assess the diagnostic value of CEA, CA199 and CA50 for colorectal neoplasm by logistic regression and ROC curve analysis. Methods: Serum CEA (with CLIA), CA199 (with ECLIA) and CA50 (with IRMA) levels were measured in 75 patients with colorectal cancer, 35 patients with benign colorectal disorders and 49 controls. The areas under the ROC curves (AUCs) of CEA, CA199 and CA50 from the logistic regression results were compared. Results: In the cancer versus benign disorder comparison, the AUC of CA50 was larger than that of CA199. The AUC of the combined CEA and CA50 was the largest: not only larger than the AUC of CEA, CA50 or CA199 alone, but also larger than the AUC of the three markers combined (0.875 vs 0.604). In the cancer versus control comparison, the AUC of the combination of CEA, CA199 and CA50 was larger than the AUC of CEA, CA199 or CA50 alone. In both comparisons, the AUC of CEA was larger than that of CA199 or CA50. Conclusion: CEA is of definite value in the diagnosis of colorectal cancer. For differential diagnosis, the combination of CEA and CA50 gives more information, while the combination of the three tumor markers is less helpful. As an advanced statistical method, logistic regression can improve diagnostic sensitivity and specificity. (authors)
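
    The marker-combination step, fitting a logistic model on one or more markers and comparing AUCs, can be sketched as follows (the marker distributions are simulated; no attempt is made to reproduce the study's values):

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(0)
        n = 160
        y = rng.integers(0, 2, n)                  # 1 = colorectal cancer (simulated)
        cea = rng.lognormal(0.8 * y, 0.6)          # markers shifted upward in cases
        ca50 = rng.lognormal(0.6 * y, 0.7)
        ca199 = rng.lognormal(0.3 * y, 0.8)

        def auc_of(*markers):
            X = np.log(np.column_stack(markers))   # markers on the log scale
            p = LogisticRegression(max_iter=1000).fit(X, y).predict_proba(X)[:, 1]
            return roc_auc_score(y, p)

        print(f"CEA alone:          {auc_of(cea):.3f}")
        print(f"CEA + CA50:         {auc_of(cea, ca50):.3f}")
        print(f"CEA + CA50 + CA199: {auc_of(cea, ca50, ca199):.3f}")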

  16. Multiple-output support vector machine regression with feature selection for arousal/valence space emotion assessment.

    Science.gov (United States)

    Torres-Valencia, Cristian A; Álvarez, Mauricio A; Orozco-Gutiérrez, Alvaro A

    2014-01-01

    Human emotion recognition (HER) allows the assessment of the affective state of a subject. Until recently, such emotional states were described in terms of discrete emotions, like happiness or contempt. In order to cover a high range of emotions, researchers in the field have introduced dimensional spaces for emotion description that allow the characterization of affective states in terms of several variables or dimensions measuring distinct aspects of the emotion. One of the most common of these dimensional spaces is the bidimensional Arousal/Valence space. To the best of our knowledge, all HER systems so far have modelled the dimensions in these spaces independently. In this paper, we study the effect of modelling the output dimensions simultaneously and show experimentally the advantages of modelling them in this way. We consider a multimodal approach by including features from the electroencephalogram (EEG) and a few physiological signals. For modelling the multiple outputs, we employ a multiple-output regressor based on support vector machines. We also include a stage of feature selection developed within an embedded approach known as Recursive Feature Elimination (RFE), proposed initially for SVMs. The results show that several features can be eliminated using the multiple-output support vector regressor with RFE without affecting the performance of the regressor. From the analysis of the features selected in smaller subsets via RFE, it can be observed that the signals most informative for arousal and valence discrimination are the EEG, electrooculogram/electromyogram (EOG/EMG) and galvanic skin response (GSR).
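
    The pipeline described above can be approximated in a few lines. The sketch below assumes Python with scikit-learn; since scikit-learn's RFE needs per-feature weights, it ranks features with a linear SVR per output and then fits a multiple-output wrapper, which is one plausible reading of the approach, not the authors' exact implementation. All data are synthetic.

    ```python
    # Sketch: multi-output support vector regression with RFE-style
    # feature selection (assumed workflow; synthetic stand-in data).
    import numpy as np
    from sklearn.svm import LinearSVR
    from sklearn.feature_selection import RFE
    from sklearn.multioutput import MultiOutputRegressor

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 30))          # e.g. EEG/EOG/EMG/GSR features
    Y = X[:, :3] @ rng.normal(size=(3, 2))  # arousal and valence targets
    Y += 0.1 * rng.normal(size=Y.shape)

    # RFE ranks features by |coef_|; keep features useful for either output.
    keep = np.zeros(X.shape[1], dtype=bool)
    for j in range(Y.shape[1]):
        rfe = RFE(LinearSVR(max_iter=10000), n_features_to_select=10)
        keep |= rfe.fit(X, Y[:, j]).support_

    # Fit the multiple-output regressor on the reduced feature set.
    model = MultiOutputRegressor(LinearSVR(max_iter=10000)).fit(X[:, keep], Y)
    print("features kept:", int(keep.sum()), "R^2:", model.score(X[:, keep], Y))
    ```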

  17. Application of geographically-weighted regression analysis to assess risk factors for malaria hotspots in Keur Soce health and demographic surveillance site.

    Science.gov (United States)

    Ndiath, Mansour M; Cisse, Badara; Ndiaye, Jean Louis; Gomis, Jules F; Bathiery, Ousmane; Dia, Anta Tal; Gaye, Oumar; Faye, Babacar

    2015-11-18

    In Senegal, considerable efforts have been made to reduce malaria morbidity and mortality during the last decade. This has resulted in a marked decrease in malaria cases. With the decline in malaria cases, transmission has become sparse in most Senegalese health districts. This study investigated malaria hotspots at the Keur Soce sites by using geographically-weighted regression. Because of the occurrence of hotspots, spatial modelling of malaria cases could have a considerable effect on disease surveillance. This study explored and analysed the spatial relationships between malaria occurrence and socio-economic and environmental factors in small communities in Keur Soce, Senegal, using 6 months of passive surveillance. Geographically-weighted regression was used to explore the spatial variability of the relationships between malaria incidence or persistence and the selected socio-economic and human predictors. A comparison between ordinary least squares (OLS) and geographically-weighted regression models was also explored. A vector dataset (spatial) of the study area at village level and statistical data (non-spatial) on confirmed malaria cases, socio-economic status (bed net use), population data (size of the household) and environmental factors (temperature, rainfall) were used in this exploratory analysis. ArcMap 10.2 and Stata 11 were used to perform the malaria hotspot analysis. From June to December, a total of 408 confirmed malaria cases were notified. The explanatory variables (household size, housing materials, sleeping rooms, sheep and distance to breeding site) returned significant t values of -0.25, 2.3, 4.39, 1.25 and 2.36, respectively. The OLS global model explained about 70 % (adjusted R(2) = 0.70) of the variation in malaria occurrence with AIC = 756.23. The geographically-weighted regression of malaria hotspots resulted in coefficient intercepts ranging from 1.89 to 6.22 with a median of 3.5. Large positive values are distributed mainly in the southeast
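
    The study itself used ArcMap 10.2 and Stata 11; as an open-source analogue, the sketch below shows how a comparable GWR fit could look with the Python mgwr package (an assumed tool choice), using synthetic village coordinates and covariates.

    ```python
    # Sketch of a geographically-weighted regression fit (mgwr assumed).
    import numpy as np
    from mgwr.gwr import GWR
    from mgwr.sel_bw import Sel_BW

    rng = np.random.default_rng(1)
    n = 100
    u, v = rng.uniform(0, 10, n), rng.uniform(0, 10, n)
    coords = list(zip(u, v))                     # village locations
    X = rng.normal(size=(n, 3))                  # e.g. household size, etc.
    beta1 = 1.0 + 0.3 * u                        # spatially varying effect
    y = (beta1 * X[:, 0] + X[:, 1] - 0.5 * X[:, 2]
         + rng.normal(scale=0.5, size=n)).reshape(-1, 1)

    bw = Sel_BW(coords, y, X).search()           # bandwidth selection
    results = GWR(coords, y, X, bw).fit()
    print("bandwidth:", bw, "AICc:", round(results.aicc, 1))
    print("local intercepts span",
          results.params[:, 0].min(), "to", results.params[:, 0].max())
    ```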

  18. Scientific information and the Tongass land management plan: key findings derived from the scientific literature, species assessments, resource analyses, workshops, and risk assessment panels.

    Science.gov (United States)

    Douglas N. Swanston; Charles G. Shaw; Winston P. Smith; Kent R. Julin; Guy A. Cellier; Fred H. Everest

    1996-01-01

    This document highlights key items of information obtained from the published literature and from specific assessments, workshops, resource analyses, and various risk assessment panels conducted as part of the Tongass land management planning process. None of this information dictates any particular decision; however, it is important to consider during decisionmaking...

  19. Differentiating regressed melanoma from regressed lichenoid keratosis.

    Science.gov (United States)

    Chan, Aegean H; Shulman, Kenneth J; Lee, Bonnie A

    2017-04-01

    Distinguishing regressed lichen planus-like keratosis (LPLK) from regressed melanoma can be difficult on histopathologic examination, potentially resulting in mismanagement of patients. We aimed to identify histopathologic features by which regressed melanoma can be differentiated from regressed LPLK. Twenty actively inflamed LPLK, 12 LPLK with regression and 15 melanomas with regression were compared and evaluated by hematoxylin and eosin staining as well as Melan-A, microphthalmia transcription factor (MiTF) and cytokeratin (AE1/AE3) immunostaining. (1) A total of 40% of regressed melanomas showed complete or near complete loss of melanocytes within the epidermis with Melan-A and MiTF immunostaining, while 8% of regressed LPLK exhibited this finding. (2) Necrotic keratinocytes were seen in the epidermis in 33% of regressed melanomas as opposed to all of the regressed LPLK. (3) A dense infiltrate of melanophages in the papillary dermis was seen in 40% of regressed melanomas, a feature not seen in regressed LPLK. In summary, our findings suggest that a complete or near complete loss of melanocytes within the epidermis strongly favors a regressed melanoma over a regressed LPLK. In addition, necrotic epidermal keratinocytes and the presence of a dense band-like distribution of dermal melanophages can be helpful in differentiating these lesions. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  20. Assessment of ethylene vinyl-acetate copolymer samples exposed to γ-rays via linearity analyses

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira, Lucas N. de; Nascimento, Eriberto O. do; Schimidt, Fernando [Instituto Federal de Educação, Ciência e Tecnologia de Goiás (IFG), Goiânia, GO (Brazil); Antonio, Patrícia L.; Caldas, Linda V.E., E-mail: lcaldas@ipen.br [Instituto de Pesquisas Energéticas e Nucleares (IPEN/CNEN-SP), São Paulo, SP (Brazil)

    2017-07-01

    Materials with the potential to become dosimeters are of interest in radiation physics. In this research, the materials were analyzed and compared in relation to their linearity ranges. Samples of ethylene vinyl-acetate copolymer (EVA) were irradiated with doses from 10 Gy to 10 kGy using a {sup 60}Co Gamma-Cell 220 system and evaluated with the FTIR technique. The linearity analyses were applied through two methodologies, searching for linear regions in their response. The results show that both analyses indicate linear regions within defined dose intervals. EVA radiation detectors can be useful for radiation dosimetry at intermediate and high doses. (author)
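
    One simple way to implement such a linearity search is to fit a straight line over sliding sub-ranges of the dose-response curve and keep the ranges where R² stays near one. The sketch below assumes Python/NumPy and uses a toy response, not the EVA measurements.

    ```python
    # Sketch: locate linear regions of a dose-response curve via
    # windowed least-squares fits (toy data, assumed workflow).
    import numpy as np

    doses = np.logspace(1, 4, 16)                  # 10 Gy to 10 kGy
    response = 0.02 * doses + 5 * np.log10(doses)  # synthetic signal

    def r_squared(x, y):
        slope, intercept = np.polyfit(x, y, 1)
        resid = y - (slope * x + intercept)
        return 1 - resid.var() / y.var()

    window = 5
    for i in range(len(doses) - window + 1):
        x, y = doses[i:i + window], response[i:i + window]
        if r_squared(x, y) > 0.999:
            print(f"approximately linear from {x[0]:.0f} to {x[-1]:.0f} Gy")
    ```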

  1. Validity and reliability of an instrument for assessing case analyses in bioengineering ethics education.

    Science.gov (United States)

    Goldin, Ilya M; Pinkus, Rosa Lynn; Ashley, Kevin

    2015-06-01

    Assessment in ethics education faces a challenge. From the perspectives of teachers, students, and third-party evaluators like the Accreditation Board for Engineering and Technology and the National Institutes of Health, assessment of student performance is essential. Because of the complexity of ethical case analysis, however, it is difficult to formulate assessment criteria, and to recognize when students fulfill them. Improvement in students' moral reasoning skills can serve as the focus of assessment. In previous work, Rosa Lynn Pinkus and Claire Gloeckner developed a novel instrument for assessing moral reasoning skills in bioengineering ethics. In this paper, we compare that approach to existing assessment techniques, and evaluate its validity and reliability. We find that it is sensitive to knowledge gain and that independent coders agree on how to apply it.

  2. Influence of assessment setting on the results of functional analyses of problem behavior

    NARCIS (Netherlands)

    Lang, R.B.; Sigafoos, J.; Lancioni, G.E.; Didden, H.C.M.; Rispoli, M.

    2010-01-01

    Analogue functional analyses are widely used to identify the operant function of problem behavior in individuals with developmental disabilities. Because problem behavior often occurs across multiple settings (e.g., homes, schools, outpatient clinics), it is important to determine whether the

  3. Assessment of Adult Psychopathology: Meta-Analyses and Implications of Cross-Informant Correlations

    Science.gov (United States)

    Achenbach, Thomas M.; Krukowsi, Rebecca A.; Dumenci, Levent; Ivanova, Masha Y.

    2005-01-01

    Assessment of adult psychopathology relies heavily on self-reports. To determine how well self-reports agree with reports by informants who know the person being assessed, the authors examined 51,000 articles published over 10 years in 52 peer-reviewed journals for correlations between self-reports and informants' reports. Qualifying…

  4. A three-dimensional analyses of fluid flow and heat transfer for moderator integrity assessment in PHWR

    International Nuclear Information System (INIS)

    Bang, K. H.; Lee, J. Y.; Yoo, S. O.; Kim, M. W.; Kim, H. J.

    2002-01-01

    Three-dimensional analyses of fluid flow and heat transfer have been performed in this study. A simulation of the SPEL experimental work and a comparison with experimental data were carried out to verify the analysis models. Moreover, for the CANDU-6 reactor type, analyses of fluid flow and heat transfer in the calandria under steady-state conditions were performed using the FLUENT code, a conventional code for three-dimensional analyses of fluid flow and heat transfer, for moderator integrity assessment in PHWR thermal-hydraulics. It is found that the maximum temperature in the moderator is 347 K (74 °C), so that the moderator has enough subcoolability to ensure the integrity of the pressure tube during LOCA conditions

  5. Regression: A Bibliography.

    Science.gov (United States)

    Pedrini, D. T.; Pedrini, Bonnie C.

    Regression, another mechanism studied by Sigmund Freud, has had much research, e.g., hypnotic regression, frustration regression, schizophrenic regression, and infra-human-animal regression (often directly related to fixation). Many investigators worked with hypnotic age regression, which has a long history, going back to Russian reflexologists.…

  6. Interobserver agreement of radiologists assessing the response of rectal cancers to preoperative chemoradiation using the MRI tumour regression grading (mrTRG)

    International Nuclear Information System (INIS)

    Siddiqui, M.R.S.; Gormly, K.L.; Bhoday, J.; Balyansikova, S.; Battersby, N.J.; Chand, M.; Rao, S.; Tekkis, P.; Abulafi, A.M.; Brown, G.

    2016-01-01

    Aim: To investigate whether the magnetic resonance imaging (MRI) tumour regression grading (mrTRG) scale can be taught effectively, resulting in a clinically reasonable interobserver agreement (>0.4; moderate to near perfect agreement). Materials and methods: This study examines the interobserver agreement on mrTRG between 35 radiologists and a central reviewer. Two workshops were organised for radiologists to assess regression of rectal cancers on MRI staging scans. A range of mrTRGs on 12 patient scans were used for assessment. Results: Kappa agreement ranged from 0.14 to 0.82 with a median value of 0.57 (95% CI: 0.37–0.77), indicating good overall agreement. Eight (26%) radiologists had very good/near perfect agreement (κ>0.8). Six (19%) radiologists had good agreement (0.8≥κ>0.6) and a further 12 (39%) had moderate agreement (0.6≥κ>0.4). Five (16%) radiologists had fair agreement (0.4≥κ>0.2) and two had poor agreement (0.2>κ). There was a tendency towards good agreement (skewness: 0.92). In 65.9% and 90% of cases the radiologists were able to correctly highlight good and poor responders, respectively. Conclusions: The assessment of the response of rectal cancers to chemoradiation therapy may be performed effectively using mrTRG. Radiologists can be taught the mrTRG scale. Even with minimal training, good agreement with the central reviewer, along with effective differentiation between good and intermediate/poor responders, can be achieved. Focus should be on facilitating the identification of good responders. It is predicted that with more intensive interactive case-based learning a κ>0.8 is likely to be achieved. Testing and retesting is recommended. - Highlights: • Inter-observer agreement of radiologists was assessed using the MRI rectal tumour regression scale. • Kappa agreement had a median value of 0.57 (95% CI: 0.37–0.77), indicating overall good agreement. • In 65.9% and 90% of cases the radiologists were able to correctly highlight
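
    The agreement statistic used here can be reproduced with standard tools. The sketch below computes a weighted Cohen's kappa between one radiologist and the central reviewer on ordinal mrTRG grades; the gradings are invented for illustration, and the choice of linear weights is an assumption rather than the study's stated method.

    ```python
    # Sketch: interobserver agreement on ordinal mrTRG grades (1-5).
    from sklearn.metrics import cohen_kappa_score

    central     = [1, 2, 2, 3, 3, 4, 4, 5, 5, 3, 2, 4]  # reference grades
    radiologist = [1, 2, 3, 3, 4, 4, 4, 5, 4, 3, 2, 5]  # one reader

    # Linear weights penalise disagreements by their distance on the scale.
    kappa = cohen_kappa_score(central, radiologist, weights="linear")
    print(f"weighted kappa = {kappa:.2f}")  # >0.4 = at least moderate
    ```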

  7. Basing assessment and treatment of problem behavior on behavioral momentum theory: Analyses of behavioral persistence.

    Science.gov (United States)

    Schieltz, Kelly M; Wacker, David P; Ringdahl, Joel E; Berg, Wendy K

    2017-08-01

    The connection, or bridge, between applied and basic behavior analysis has been long-established (Hake, 1982; Mace & Critchfield, 2010). In this article, we describe how clinical decisions can be based more directly on behavioral processes and how basing clinical procedures on behavioral processes can lead to improved clinical outcomes. As a case in point, we describe how applied behavior analyses of maintenance, and specifically the long-term maintenance of treatment effects related to problem behavior, can be adjusted and potentially enhanced by basing treatment on Behavioral Momentum Theory. We provide a brief review of the literature including descriptions of two translational studies that proposed changes in how differential reinforcement of alternative behavior treatments are conducted based on Behavioral Momentum Theory. We then describe current clinical examples of how these translations are continuing to impact the definitions, designs, analyses, and treatment procedures used in our clinical practice. Copyright © 2017 Elsevier B.V. All rights reserved.

  8. Littoral Assessment of Mine Burial Signatures (LAMBS) buried land mine/background spectral signature analyses

    Science.gov (United States)

    Kenton, A.C.; Geci, D.M.; Ray, K.J.; Thomas, C.M.; Salisbury, J.W.; Mars, J.C.; Crowley, J.K.; Witherspoon, N.H.; Holloway, J.H.; Harmon, R.S.; Broach, J.T.; Holloway, J.H., Jr.

    2004-01-01

    The objective of the Office of Naval Research (ONR) Rapid Overt Reconnaissance (ROR) program and the Airborne Littoral Reconnaissance Technologies (ALRT) project's LAMBS effort is to determine whether electro-optical spectral discriminants exist that are useful for the detection of land mines in littoral regions. Statistically significant buried mine overburden and background signature data were collected over a wide spectral range (0.35 to 14 μm) to identify robust spectral features that might serve as discriminants for new airborne sensor concepts. LAMBS has expanded previously collected databases to littoral areas - primarily dry and wet sandy soils - where tidal, surf, and wind conditions can severely modify spectral signatures. At AeroSense 2003, we reported completion of three buried mine collections at an inland bay and at Atlantic and Gulf of Mexico beach sites.1 We now report LAMBS spectral database analysis results using metrics which characterize the detection performance of general types of spectral detection algorithms. These metrics include mean contrast, spectral signal-to-clutter, covariance, information content, and spectral matched filter analyses. Detection performance for the buried land mines was analyzed with regard to burial age, background type, and environmental conditions. These analyses considered features observed due to particle size differences, surface roughness, surface moisture, and compositional differences.
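
    As a toy version of one of the metrics named above, the sketch below computes a spectral matched filter score for a target signature against background clutter; the spectra are synthetic, whereas the real analyses used field-collected signatures.

    ```python
    # Sketch: spectral matched filter detection statistic (synthetic data).
    import numpy as np

    rng = np.random.default_rng(7)
    bands = 50
    background = rng.normal(size=(500, bands))     # background spectra
    mu = background.mean(axis=0)
    cov = np.cov(background, rowvar=False)
    target = mu + 0.8 * rng.normal(size=bands)     # e.g. disturbed soil

    w = np.linalg.solve(cov, target - mu)          # matched filter weights

    def mf_score(x):
        """Filter output, normalised so the target signature scores ~1."""
        return w @ (x - mu) / (w @ (target - mu))

    print("target pixel    :", round(mf_score(target), 2))
    print("background pixel:", round(mf_score(background[0]), 2))
    ```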

  9. Better Autologistic Regression

    Directory of Open Access Journals (Sweden)

    Mark A. Wolters

    2017-11-01

    Full Text Available Autologistic regression is an important probability model for dichotomous random variables observed along with covariate information. It has been used in various fields for analyzing binary data possessing spatial or network structure. The model can be viewed as an extension of the autologistic model (also known as the Ising model, quadratic exponential binary distribution, or Boltzmann machine) to include covariates. It can also be viewed as an extension of logistic regression to handle responses that are not independent. Not all authors use exactly the same form of the autologistic regression model. Variations of the model differ in two respects. First, the variable coding - the two numbers used to represent the two possible states of the variables - might differ. Common coding choices are (zero, one) and (minus one, plus one). Second, the model might appear in either of two algebraic forms: a standard form, or a recently proposed centered form. Little attention has been paid to the effect of these differences, and the literature shows ambiguity about their importance. It is shown here that changes to either coding or centering in fact produce distinct, non-nested probability models. Theoretical results, numerical studies, and analysis of an ecological data set all show that the differences among the models can be large and practically significant. Understanding the nature of the differences and making appropriate modeling choices can lead to significantly improved autologistic regression analyses. The results strongly suggest that the standard model with plus/minus coding, which we call the symmetric autologistic model, is the most natural choice among the autologistic variants.
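
    To make the coding issue concrete, the sketch below fits an autologistic-style model by maximum pseudolikelihood: each cell's state is regressed on a covariate plus an autocovariate built from neighbouring states under plus/minus coding. This is a rough, assumed construction for illustration, not the paper's estimation procedure.

    ```python
    # Sketch: pseudolikelihood fit of an autologistic-style model on a
    # toroidal grid with plus/minus coding (synthetic data).
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(3)
    side = 30
    field = rng.normal(size=(side, side))
    for ax in (0, 1):  # crude smoothing to induce spatial association
        field += 0.6 * (np.roll(field, 1, ax) + np.roll(field, -1, ax))
    grid = np.where(field > 0, 1.0, -1.0)          # states in {-1, +1}

    # Autocovariate: sum of the four rook neighbours' states.
    auto = sum(np.roll(grid, s, ax) for s in (1, -1) for ax in (0, 1))
    x = rng.normal(size=grid.size)                 # unrelated covariate

    X = np.column_stack([x, auto.ravel()])
    fit = LogisticRegression().fit(X, grid.ravel() > 0)
    print("covariate effect (~0):", fit.coef_[0][0])
    print("neighbour effect (>0):", fit.coef_[0][1])
    ```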

  10. Assessing the impact of local meteorological variables on surface ozone in Hong Kong during 2000-2015 using quantile and multiple linear regression models

    Science.gov (United States)

    Zhao, Wei; Fan, Shaojia; Guo, Hai; Gao, Bo; Sun, Jiaren; Chen, Laiguo

    2016-11-01

    The quantile regression (QR) method has been increasingly introduced to atmospheric environmental studies to explore the non-linear relationship between local meteorological conditions and ozone mixing ratios. In this study, we applied QR for the first time, together with multiple linear regression (MLR), to analyze the dominant meteorological parameters influencing the mean, 10th percentile, 90th percentile and 99th percentile of maximum daily 8-h average (MDA8) ozone concentrations in 2000-2015 in Hong Kong. Dominance analysis (DA) was used to assess the relative importance of meteorological variables in the regression models. Results showed that the MLR models worked better at suburban and rural sites than at urban sites, and better in winter than in summer. QR models performed better in summer for the 99th and 90th percentiles and better in autumn and winter for the 10th percentile; they also performed better in suburban and rural areas for the 10th percentile. The top three dominant variables associated with MDA8 ozone concentrations, changing with seasons and regions, were frequently drawn from six meteorological parameters: boundary layer height, humidity, wind direction, surface solar radiation, total cloud cover and sea level pressure. Temperature rarely became a significant variable in any season, which could partly explain the peak of monthly average ozone concentrations in October in Hong Kong. We also found that the effect of solar radiation was enhanced during extreme ozone pollution episodes (i.e., the 99th percentile). Finally, meteorological effects on MDA8 ozone showed no significant changes before and after the 2010 Asian Games.
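
    The QR/MLR comparison maps directly onto standard statistical tooling. The sketch below uses statsmodels on simulated stand-ins for the meteorology and MDA8 ozone columns; the variable names are placeholders, not the study's dataset.

    ```python
    # Sketch: mean (OLS) versus upper-quantile regressions of MDA8 ozone.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 1000
    df = pd.DataFrame({
        "solar": rng.uniform(0, 30, n),       # surface solar radiation
        "blh": rng.uniform(100, 2000, n),     # boundary layer height
    })
    noise = rng.gumbel(scale=8.0, size=n)     # skewed residuals
    df["mda8"] = 40 + 1.5 * df.solar - 0.01 * df.blh + noise

    ols = smf.ols("mda8 ~ solar + blh", df).fit()
    q90 = smf.quantreg("mda8 ~ solar + blh", df).fit(q=0.90)
    q99 = smf.quantreg("mda8 ~ solar + blh", df).fit(q=0.99)
    # If solar radiation matters more in extreme episodes, its coefficient
    # should grow from the mean model to the upper quantiles.
    print(ols.params["solar"], q90.params["solar"], q99.params["solar"])
    ```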

  11. Analyses to assess the level of boron dilution in the VVER-1000 primary circuit

    International Nuclear Information System (INIS)

    Kral, P.; Macek, J.; Krhounkova, J.

    2000-12-01

    Thermal hydraulic analyses of loss-of-coolant accidents which can result in volumes with a reduced boric acid concentration were performed using the RELAP5/MOD3.1 code. Small LOCAs were calculated (i) without high-pressure pumps (HPP), (ii) with 1/3 HPP, (iii) with 1/3 HPP and cooling via the steam dump station to the atmosphere, and (iv) with 1/3 HPP, cooling via the steam dump station to the atmosphere, and start of main coolant pump no. 1. (P.A.)

  12. Contour plot assessment of existing meta-analyses confirms robust association of statin use and acute kidney injury risk.

    Science.gov (United States)

    Chevance, Aurélie; Schuster, Tibor; Steele, Russell; Ternès, Nils; Platt, Robert W

    2015-10-01

    Robustness of an existing meta-analysis can justify decisions on whether to conduct an additional study addressing the same research question. We illustrate the graphical assessment of the potential impact of an additional study on an existing meta-analysis using published data on statin use and the risk of acute kidney injury. A previously proposed graphical augmentation approach is used to assess the sensitivity of the current test and heterogeneity statistics extracted from existing meta-analysis data. In addition, we extended the graphical augmentation approach to assess potential changes in the pooled effect estimate after updating a current meta-analysis, and applied the three graphical contour definitions to data from meta-analyses on statin use and acute kidney injury risk. In the example data considered, the pooled effect estimates and heterogeneity indices proved considerably robust to the addition of a future study. Notably, for some previously inconclusive meta-analyses, a study update might yield a statistically significant increase in kidney injury risk associated with higher statin exposure. The illustrated contour approach should become a standard tool for assessing the robustness of meta-analyses. It can guide decisions on whether to conduct additional studies addressing a relevant research question. Copyright © 2015 Elsevier Inc. All rights reserved.
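
    The question the contour approach answers can be illustrated with a minimal fixed-effect update. The sketch below pools invented log relative risks by inverse variance, then sweeps a hypothetical additional study across a grid of effects, much as the contours do; none of the numbers come from the statin data.

    ```python
    # Sketch: how much could one added study move a pooled estimate?
    import numpy as np

    def pooled(effects, variances):
        """Inverse-variance fixed-effect pooled estimate and its variance."""
        w = 1 / np.asarray(variances)
        return np.sum(w * effects) / np.sum(w), 1 / np.sum(w)

    effects = [0.15, 0.22, 0.08, 0.19]        # invented log relative risks
    variances = [0.010, 0.020, 0.015, 0.012]
    base, _ = pooled(effects, variances)

    for new_effect in (-0.10, 0.0, 0.20):     # grid for a hypothetical study
        upd, _ = pooled(effects + [new_effect], variances + [0.01])
        print(f"new study at {new_effect:+.2f} -> pooled {upd:.3f} "
              f"(was {base:.3f})")
    ```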

  13. Sensitivity and Uncertainty Analyses Applied to Neutronics Calculations for Safety Assessment at IRSN

    International Nuclear Information System (INIS)

    Ivanov, Evgeny; Ivanova, Tatiana; Pignet, Sophie

    2013-01-01

    Objective of the presentation: • Present the IRSN vision for the validation of stand-alone neutronics codes in support of fuel cycle and reactor safety assessment for fast neutron reactors. • Provide work status, future developments and needs for the R&D working program on validation methodology for neutronics of fast systems

  14. Development and testing of an assessment instrument for the formative peer review of significant event analyses.

    Science.gov (United States)

    McKay, J; Murphy, D J; Bowie, P; Schmuck, M-L; Lough, M; Eva, K W

    2007-04-01

    To establish the content validity and specific aspects of reliability for an assessment instrument designed to provide formative feedback to general practitioners (GPs) on the quality of their written analysis of a significant event. Content validity was quantified by application of a content validity index. Reliability testing involved a nested design, with 5 cells, each containing 4 assessors, rating 20 unique significant event analysis (SEA) reports (10 each from experienced GPs and GPs in training) using the assessment instrument. The variance attributable to each identified variable in the study was established by analysis of variance. Generalisability theory was then used to investigate the instrument's ability to discriminate among SEA reports. Content validity was demonstrated with at least 8 of 10 experts endorsing all 10 items of the assessment instrument. The overall G coefficient for the instrument was moderate to good (G>0.70), indicating that the instrument can provide consistent information on the standard achieved by the SEA report. There was moderate inter-rater reliability (G>0.60) when four raters were used to judge the quality of the SEA. This study provides the first steps towards validating an instrument that can provide educational feedback to GPs on their analysis of significant events. The key area identified to improve instrument reliability is variation among peer assessors in their assessment of SEA reports. Further validity and reliability testing should be carried out to provide GPs, their appraisers and contractual bodies with a validated feedback instrument on this aspect of the general practice quality agenda.

  15. Analysing biodiversity and conservation knowledge products to support regional environmental assessments.

    Science.gov (United States)

    Brooks, Thomas M; Akçakaya, H Resit; Burgess, Neil D; Butchart, Stuart H M; Hilton-Taylor, Craig; Hoffmann, Michael; Juffe-Bignoli, Diego; Kingston, Naomi; MacSharry, Brian; Parr, Mike; Perianin, Laurence; Regan, Eugenie C; Rodrigues, Ana S L; Rondinini, Carlo; Shennan-Farpon, Yara; Young, Bruce E

    2016-02-16

    Two processes for regional environmental assessment are currently underway: the Global Environment Outlook (GEO) and Intergovernmental Platform on Biodiversity and Ecosystem Services (IPBES). Both face constraints of data, time, capacity, and resources. To support these assessments, we disaggregate three global knowledge products according to their regions and subregions. These products are: The IUCN Red List of Threatened Species, Key Biodiversity Areas (specifically Important Bird & Biodiversity Areas [IBAs], and Alliance for Zero Extinction [AZE] sites), and Protected Planet. We present fourteen Data citations: numbers of species occurring and percentages threatened; numbers of endemics and percentages threatened; downscaled Red List Indices for mammals, birds, and amphibians; numbers, mean sizes, and percentage coverages of IBAs and AZE sites; percentage coverage of land and sea by protected areas; and trends in percentages of IBAs and AZE sites wholly covered by protected areas. These data will inform the regional/subregional assessment chapters on the status of biodiversity, drivers of its decline, and institutional responses, and greatly facilitate comparability and consistency between the different regional/subregional assessments.

  16. Analysing biodiversity and conservation knowledge products to support regional environmental assessments

    Science.gov (United States)

    Brooks, Thomas M.; Akçakaya, H. Resit; Burgess, Neil D.; Butchart, Stuart H.M.; Hilton-Taylor, Craig; Hoffmann, Michael; Juffe-Bignoli, Diego; Kingston, Naomi; MacSharry, Brian; Parr, Mike; Perianin, Laurence; Regan, Eugenie C.; Rodrigues, Ana S.L.; Rondinini, Carlo; Shennan-Farpon, Yara; Young, Bruce E.

    2016-01-01

    Two processes for regional environmental assessment are currently underway: the Global Environment Outlook (GEO) and Intergovernmental Platform on Biodiversity and Ecosystem Services (IPBES). Both face constraints of data, time, capacity, and resources. To support these assessments, we disaggregate three global knowledge products according to their regions and subregions. These products are: The IUCN Red List of Threatened Species, Key Biodiversity Areas (specifically Important Bird & Biodiversity Areas [IBAs], and Alliance for Zero Extinction [AZE] sites), and Protected Planet. We present fourteen Data citations: numbers of species occurring and percentages threatened; numbers of endemics and percentages threatened; downscaled Red List Indices for mammals, birds, and amphibians; numbers, mean sizes, and percentage coverages of IBAs and AZE sites; percentage coverage of land and sea by protected areas; and trends in percentages of IBAs and AZE sites wholly covered by protected areas. These data will inform the regional/subregional assessment chapters on the status of biodiversity, drivers of its decline, and institutional responses, and greatly facilitate comparability and consistency between the different regional/subregional assessments. PMID:26881749

  17. Analysing biodiversity and conservation knowledge products to support regional environmental assessments

    Science.gov (United States)

    Brooks, Thomas M.; Akçakaya, H. Resit; Burgess, Neil D.; Butchart, Stuart H. M.; Hilton-Taylor, Craig; Hoffmann, Michael; Juffe-Bignoli, Diego; Kingston, Naomi; Macsharry, Brian; Parr, Mike; Perianin, Laurence; Regan, Eugenie C.; Rodrigues, Ana S. L.; Rondinini, Carlo; Shennan-Farpon, Yara; Young, Bruce E.

    2016-02-01

    Two processes for regional environmental assessment are currently underway: the Global Environment Outlook (GEO) and Intergovernmental Platform on Biodiversity and Ecosystem Services (IPBES). Both face constraints of data, time, capacity, and resources. To support these assessments, we disaggregate three global knowledge products according to their regions and subregions. These products are: The IUCN Red List of Threatened Species, Key Biodiversity Areas (specifically Important Bird & Biodiversity Areas [IBAs], and Alliance for Zero Extinction [AZE] sites), and Protected Planet. We present fourteen Data citations: numbers of species occurring and percentages threatened; numbers of endemics and percentages threatened; downscaled Red List Indices for mammals, birds, and amphibians; numbers, mean sizes, and percentage coverages of IBAs and AZE sites; percentage coverage of land and sea by protected areas; and trends in percentages of IBAs and AZE sites wholly covered by protected areas. These data will inform the regional/subregional assessment chapters on the status of biodiversity, drivers of its decline, and institutional responses, and greatly facilitate comparability and consistency between the different regional/subregional assessments.

  18. Communicative Interactions in Everyday and College-Assessed Digital Literacy Practices: Transcribing and Analysing Multimodal Texts

    Science.gov (United States)

    Creer, Adele

    2017-01-01

    This paper explores integrating a range of digital media into classroom practice to establish the effectiveness of the media and its encompassing modes as a pedagogical tool with a focus on assessment. Directing attention on a communication skills module, research indicated that bringing a range of digital media into the classroom motivated and…

  19. Integrated well-to-wheel assessment on biofuels, analysing energy, emission and welfare economic consequences

    Energy Technology Data Exchange (ETDEWEB)

    Slentoe, E.; Moeller, F.; Frederiksen, P.; Jepsen, M.R.

    2011-07-15

    Various biofuel evaluation methods exist, with different analytical framework setups and different scopes. The scope of this study is to develop an integrated method to evaluate the consequences of producing biofuels, covering energy consumption, emission and welfare economic changes within the well-to-wheel (WTW) flow chain, from the production of biomass through the subsequent conversion into biofuel and combustion in vehicles. This method (Moeller and Slentoe, 2010) is applied to a Danish case, implementing policy targets for biofuel use in the transport sector and also developing an alternative scenario with higher biofuel shares. This paper presents the results of three interlinked parallel analyses, of energy consumption, emissions and welfare economics (Slentoe, Moeller and Winther, 2010), and discusses the feasibility of those analyses, which are based on the same consequential analysis method, comparing a scenario situation to a reference situation. As will be shown, the results are not univocal; for example, an energy gain is not necessarily a welfare economic gain. The study is conducted as part of the Danish REBECa project. Within this, two main scenarios, HS1 and HS2, for biofuel mixture in fossil diesel fuel and gasoline are established. The biofuel rape diesel (RME) stems from rapeseed, and bioethanol stems from either wheat grains (1st generation) or straw (2nd generation) - all cultivated in Denmark. The share of 2nd generation bioethanol exceeds 1st generation bioethanol towards 2030. Both scenarios initiate at a 5.75% mixture in 2010 and reach 10% and 25% in 2030 for HS1 and HS2 respectively, such that the low mixture scenario reflects the Danish Act on sustainable biofuels (June 2009), implementing the EU renewable energy directive (2009/29/EC), using biofuels as energy carrier. The two scenarios are computed in two variants each, reflecting oil prices at $65 and $100 per barrel. (Author)

  20. Assessment of Surface Water Quality in the Malaysian Coastal Waters by Using Multivariate Analyses

    International Nuclear Information System (INIS)

    Yap, C.K.; Chee, M.W.; Shamarina, S.; Edward, F.B.; Chew, W.; Tan, S.G.

    2011-01-01

    Coastal water samples were collected from 20 sampling sites in the southern part of Peninsular Malaysia. Seven physico-chemical parameters were measured directly in-situ, while water samples were collected and analysed for six dissolved trace metal concentrations. The surface water (0-20 cm) physico-chemical parameters were temperature, salinity, dissolved oxygen (DO), pH, total dissolved solids (TDS), specific conductance (SpC) and turbidity, while the dissolved trace metals were Cd, Cu, Fe, Ni, Pb and Zn. The ranges for the physico-chemical parameters were 28.07-35.6 °C for temperature, 0.18-32.42 ppt for salinity, 2.20-12.03 mg/L for DO, 5.50-8.53 for pH, 0.24-31.65 mg/L for TDS, 368-49452 μS/cm for SpC and 0-262 NTU for turbidity, while the dissolved metals (mg/L) were 0.013-0.147 for Cd, 0.024-0.143 for Cu, 0.266-2.873 for Fe, 0.027-0.651 for Ni, 0.018-0.377 for Pb and 0.032-0.099 for Zn. Based on multivariate analysis (including correlation, cluster and principal component analyses), the polluted sites were found at Kg. Pasir Puteh and Tg. Kupang, while Ni and Pb were identified as the two major dissolved metals of high variation in the coastal waters. Therefore, water quality monitoring and control of the release of untreated anthropogenic wastes into rivers and coastal waters are strongly needed. (author)
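
    The multivariate workflow named above (correlation, cluster and principal component analyses) can be sketched as follows, assuming Python with scikit-learn and SciPy; the site-by-variable matrix is simulated, standing in for the 20 coastal sites and 13 measured variables.

    ```python
    # Sketch: PCA plus hierarchical clustering of site profiles.
    import numpy as np
    from scipy.cluster.hierarchy import fcluster, linkage
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(5)
    sites, nvars = 20, 13        # 7 physico-chemical + 6 dissolved metals
    data = rng.normal(size=(sites, nvars))
    data[[3, 17]] += 2.5         # two artificially "polluted" sites

    Z = StandardScaler().fit_transform(data)   # standardise variables
    scores = PCA(n_components=2).fit_transform(Z)

    labels = fcluster(linkage(Z, method="ward"), t=2, criterion="maxclust")
    print("cluster labels:", labels)
    print("PC1 of the contaminated sites:", scores[[3, 17], 0])
    ```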

  1. Analysing Diagnostic Assessment on the Ratio of Sine in a Right Triangle

    Science.gov (United States)

    Andika, R.; Juandi, D.; Rosjanuardi, R.

    2017-09-01

    This study aims to develop a diagnostic assessment on the topic of the ratio of sine in a right triangle and to analyse whether the students are ready to continue to the next trigonometry lesson, specifically the sine rule. The methodology used in this study is design research following the Plomp model, which comprises three phases: (a) preliminary research; (b) prototyping phase; and (c) assessment phase. The findings show that almost half of the students made a mistake in determining the ratio of sine in a right triangle; consequently, the procedure for solving the problem went wrong. In strategic competency and adaptive communication, most of the students did not solve the problem that was given. According to these results, the students have to complete a remedial program before proceeding to the next lesson, the sine rule.

  2. Assessing cross-cultural differences through use of multiple-group invariance analyses.

    Science.gov (United States)

    Stein, Judith A; Lee, Jerry W; Jones, Patricia S

    2006-12-01

    The use of structural equation modeling in cross-cultural personality research has become a popular method for testing measurement invariance. In this report, we present an example of testing measurement invariance using the Sense of Coherence Scale of Antonovsky (1993) in 3 ethnic groups: Chinese, Japanese, and Whites. In a series of increasingly restrictive constraints on the measurement models of the 3 groups, we demonstrate how to assess differences among the groups. We also provide an example of construct validation.

  3. What Risk Assessments of Genetically Modified Organisms Can Learn from Institutional Analyses of Public Health Risks

    Directory of Open Access Journals (Sweden)

    S. Ravi Rajan

    2012-01-01

    Full Text Available The risks of genetically modified organisms (GMOs) are evaluated traditionally by combining hazard identification and exposure estimates to provide decision support for regulatory agencies. We question the utility of the classical risk paradigm and discuss its evolution in GMO risk assessment. First, we consider the problem of uncertainty, by comparing risk assessment for environmental toxins in the public health domain with genetically modified organisms in the environment; we use the specific comparison of an insecticide to a transgenic, insecticidal food crop. Next, we examine normal accident theory (NAT) as a heuristic to consider runaway effects of GMOs, such as negative community level consequences of gene flow from transgenic, insecticidal crops. These examples illustrate how risk assessments are made more complex and contentious by both their inherent uncertainty and the inevitability of failure beyond expectation in complex systems. We emphasize the value of conducting decision-support research, embracing uncertainty, increasing transparency, and building interdisciplinary institutions that can address the complex interactions between ecosystems and society. In particular, we argue against black boxing risk analysis, and for a program to educate policy makers about uncertainty and complexity, so that eventually, decision making is not the burden that falls upon scientists but is assumed by the public at large.

  4. What risk assessments of genetically modified organisms can learn from institutional analyses of public health risks.

    Science.gov (United States)

    Rajan, S Ravi; Letourneau, Deborah K

    2012-01-01

    The risks of genetically modified organisms (GMOs) are evaluated traditionally by combining hazard identification and exposure estimates to provide decision support for regulatory agencies. We question the utility of the classical risk paradigm and discuss its evolution in GMO risk assessment. First, we consider the problem of uncertainty, by comparing risk assessment for environmental toxins in the public health domain with genetically modified organisms in the environment; we use the specific comparison of an insecticide to a transgenic, insecticidal food crop. Next, we examine normal accident theory (NAT) as a heuristic to consider runaway effects of GMOs, such as negative community level consequences of gene flow from transgenic, insecticidal crops. These examples illustrate how risk assessments are made more complex and contentious by both their inherent uncertainty and the inevitability of failure beyond expectation in complex systems. We emphasize the value of conducting decision-support research, embracing uncertainty, increasing transparency, and building interdisciplinary institutions that can address the complex interactions between ecosystems and society. In particular, we argue against black boxing risk analysis, and for a program to educate policy makers about uncertainty and complexity, so that eventually, decision making is not the burden that falls upon scientists but is assumed by the public at large.

  5. Transparency in practice: Evidence from 'verification analyses' issued by the Polish Agency for Health Technology Assessment in 2012-2015.

    Science.gov (United States)

    Ozierański, Piotr; Löblová, Olga; Nicholls, Natalia; Csanádi, Marcell; Kaló, Zoltán; McKee, Martin; King, Lawrence

    2018-01-08

    Transparency is recognised to be a key underpinning of the work of health technology assessment (HTA) agencies, yet it has only recently become a subject of systematic inquiry. We contribute to this research field by considering the Polish Agency for Health Technology Assessment (AHTAPol). We situate the AHTAPol in a broader context by comparing it with the National Institute for Health and Care Excellence (NICE) in England. To this end, we analyse all 332 assessment reports, called verification analyses, that the AHTAPol issued from 2012 to 2015, and a stratified sample of 22 Evidence Review Group reports published by NICE in the same period. Overall, by increasingly presenting its key conclusions in assessment reports, the AHTAPol has reached the transparency standards set by NICE for the transparency of HTA outputs. The AHTAPol is more transparent than NICE in certain aspects of the HTA process, such as providing rationales for redacting assessment reports and providing summaries of expert opinions. Nevertheless, it is less transparent in other areas of the HTA process, such as including information on expert conflicts of interest. Our findings have important implications for understanding HTA in Poland and more broadly. We use them to formulate recommendations for policymakers.

  6. Natural resources assessment and their utilization: analyses from a Himalayan state.

    Science.gov (United States)

    Uniyal, Sanjay Kr; Singh, Rakesh D

    2012-08-01

    The present paper quantifies and reviews natural resource use in the Himalayan state of Himachal Pradesh (HP). Twenty-five percent of the geographical area of HP is under forests, which harbour ca. 3,400 plant species. The available bioresources not only support the livelihood of nearly 6 million people but also fulfill the forage requirement of 5.2 million livestock. Thus, dependence on bioresources is manifold. Based on field surveys of different localities of HP and analyses of published information, two types of resource use patterns have been identified. One is the direct use of forest resources, represented by extraction of timber, fuelwood and fodder; the second is indirect resource use from the forest, represented by activities related to agriculture, tourism and industry. Among the direct resource uses, the annual timber requirement of the local people works out to be 310,063 m³. On the other hand, the annual fuelwood and fodder requirements of local people are to the tune of 3,646,348.8 and 10,294,116.5 tons, respectively. Extraction of fodder therefore appears to be one of the main reasons for forest degradation in HP, as opposed to timber and fuelwood extraction. However, compared to direct resource use, indirect resource use and pressures have a far more pronounced effect on the forests. Of the indirect pressures, shifts in agriculture patterns and increased tourism seem to be the most prominent.

  7. Dose Assessment using Chromosome Aberration Analyses in Human Peripheral Blood Lymphocytes

    Energy Technology Data Exchange (ETDEWEB)

    Ryu, Tae Ho; Kim, Jin-Hong; Kim, Jin Kyu [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-10-15

    Five healthy donors were recruited to establish the dose-response calibration curve for chromosomal aberrations induced by ionizing radiation exposure. Our cytogenetic results revealed that the mean frequency of chromosome aberrations increased with increasing radiation dose. In this study, the dicentric assay and the CBMN assay were compared with respect to the sensitivity and accuracy of dose estimation. These chromosome aberration analyses will therefore form the foundation for biological dosimetric analysis, together with additional research methods such as translocation and PCC assays. The conventional analysis of dicentric chromosomes in HPBL was suggested by Bender and Gooch in 1962. This assay has for many years been the gold standard and the most specific method for assessing ionizing radiation damage. The dicentric assay technique in HPBL has been shown to be the most sensitive biological method and a reliable bio-indicator for quantifying radiation dose. In contrast, the micronucleus assay has advantages over the dicentric assay since it is rapid and requires less specialized expertise, and accordingly it can be applied to monitor a large population. The cytokinesis-block micronucleus (CBMN) assay is a suitable method for micronuclei measurement in cultured human as well as mammalian cells. The aim of our study was to establish the dose-response curve of radiation-induced chromosome aberrations in HPBL by analyzing the frequency of dicentrics and micronuclei.
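
    Biodosimetry calibration of this kind typically fits the linear-quadratic model Y = c + αD + βD² to observed aberration yields and then inverts it to estimate dose. The sketch below assumes Python with SciPy and uses simulated yields, not the donors' data.

    ```python
    # Sketch: fit a linear-quadratic dose-response curve and invert it.
    import numpy as np
    from scipy.optimize import curve_fit

    def lq(dose, c, alpha, beta):
        return c + alpha * dose + beta * dose ** 2

    doses = np.array([0.0, 0.25, 0.5, 1.0, 2.0, 3.0, 4.0])  # Gy
    yields_ = lq(doses, 0.001, 0.03, 0.06)                   # "true" curve
    yields_ += np.random.default_rng(9).normal(scale=0.01, size=doses.size)

    (c, alpha, beta), _ = curve_fit(lq, doses, yields_, p0=(0.001, 0.02, 0.05))
    print(f"Y = {c:.4f} + {alpha:.3f} D + {beta:.3f} D^2")

    # Dose estimation: solve beta*D^2 + alpha*D + (c - Y_obs) = 0.
    obs = 0.25
    est = (-alpha + np.sqrt(alpha**2 + 4 * beta * (obs - c))) / (2 * beta)
    print(f"estimated dose for yield {obs}: {est:.2f} Gy")
    ```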

  8. Assessment of groundwater and soil quality degradation using multivariate and geostatistical analyses, Dakhla Oasis, Egypt

    Science.gov (United States)

    Masoud, Alaa A.; El-Horiny, Mohamed M.; Atwia, Mohamed G.; Gemail, Khaled S.; Koike, Katsuaki

    2018-06-01

    Salinization of groundwater and soil resources has long been a serious environmental hazard in arid regions. This study was conducted to investigate and document the factors controlling such salinization and their inter-relationships in the Dakhla Oasis (Egypt). To accomplish this, 60 groundwater samples and 31 soil samples were collected in February 2014. Factor analysis (FA) and hierarchical cluster analysis (HCA) were integrated with geostatistical analyses to characterize the chemical properties of groundwater and soil and their spatial patterns, identify the factors controlling the pattern variability, and clarify the salinization mechanism. Groundwater quality standards revealed the emergence of salinization (av. 885.8 mg/L) and extreme occurrences of Fe2+ (av. 17.22 mg/L) and Mn2+ (av. 2.38 mg/L). Soils were highly salt-affected (av. 15.2 dS/m) and slightly alkaline (av. pH = 7.7). Evaporation and ion-exchange processes governed the evolution of two main water types: Na-Cl (52%) and Ca-Mg-Cl (47%), respectively. Salinization leads the chemical variability of both resources. Distinctive patterns of slight salinization marked the northern part and intense salinization marked the middle and southern parts. Congruence in the resource clusters confirmed common geology, soil types, and urban and agricultural practices. Minimizing the environmental and socioeconomic impacts of resource salinization urges the need for better understanding of the hydrochemical characteristics and prediction of quality changes.

  9. Assessing Compatibility of Direct Detection Data: Halo-Independent Global Likelihood Analyses

    CERN Document Server

    Gelmini, Graciela B.

    2016-10-18

    We present two different halo-independent methods utilizing a global maximum likelihood that can assess the compatibility of dark matter direct detection data given a particular dark matter model. The global likelihood we use is comprised of at least one extended likelihood and an arbitrary number of Poisson or Gaussian likelihoods. In the first method we find the global best fit halo function and construct a two-sided pointwise confidence band, which can then be compared with those derived from the extended likelihood alone to assess the joint compatibility of the data. In the second method we define a "constrained parameter goodness-of-fit" test statistic, whose p-value we then use to define a "plausibility region" (e.g. where p ≥ 10%). For any halo function not entirely contained within the plausibility region, the level of compatibility of the data is very low (e.g. p < 10%). As an example we apply these methods to CDMS-II-Si and SuperCDMS data, assuming dark matter particles with elastic s...

  10. A new method for odour impact assessment based on spatial and temporal analyses of community response

    International Nuclear Information System (INIS)

    Henshaw, P.; Nicell, J.; Sikdar, A.

    2002-01-01

    Odorous emissions from stationary sources account for the majority of air pollution complaints to regulatory agencies. Regulators sometimes rely on the nuisance provisions of common law to assess odour impact, which is highly subjective. The other commonly used approach, the dilution-to-threshold principle, assumes that an odour is a problem simply if it is detected, without regard to the fact that a segment of the population can detect the odour at concentrations below the threshold. The odour impact model (OIM) represents a significant improvement over current methods for quantifying odours by characterizing the dose-response relationship of the odour. Dispersion modelling can be used in conjunction with the OIM to estimate the probability of response in the surrounding vicinity, taking into account local meteorological conditions. The objective of this research is to develop an objective method of assessing the impact of odorous airborne emissions. To this end, several metrics were developed to quantify the impact of an odorous stationary source on the surrounding community. These 'odour impact parameters' are: maximum concentration, maximum probability of response, footprint area, probability-weighted footprint area and the number of people responding to the odour. These impact parameters were calculated for a stationary odour source in Canada. Several remediation scenarios for reducing the odour impact were proposed and their effects on the impact parameters calculated. (author)

  11. Assessing compatibility of direct detection data: halo-independent global likelihood analyses

    Energy Technology Data Exchange (ETDEWEB)

    Gelmini, Graciela B. [Department of Physics and Astronomy, UCLA,475 Portola Plaza, Los Angeles, CA 90095 (United States); Huh, Ji-Haeng [CERN Theory Division,CH-1211, Geneva 23 (Switzerland); Witte, Samuel J. [Department of Physics and Astronomy, UCLA,475 Portola Plaza, Los Angeles, CA 90095 (United States)

    2016-10-18

    We present two different halo-independent methods to assess the compatibility of several direct dark matter detection data sets for a given dark matter model using a global likelihood consisting of at least one extended likelihood and an arbitrary number of Gaussian or Poisson likelihoods. In the first method we find the global best fit halo function (we prove that it is a unique piecewise constant function with a number of down steps smaller than or equal to a maximum number that we compute) and construct a two-sided pointwise confidence band at any desired confidence level, which can then be compared with those derived from the extended likelihood alone to assess the joint compatibility of the data. In the second method we define a “constrained parameter goodness-of-fit” test statistic, whose p-value we then use to define a “plausibility region” (e.g. where p≥10%). For any halo function not entirely contained within the plausibility region, the level of compatibility of the data is very low (e.g. p<10%). We illustrate these methods by applying them to CDMS-II-Si and SuperCDMS data, assuming dark matter particles with elastic spin-independent isospin-conserving interactions or exothermic spin-independent isospin-violating interactions.

  12. Combining analytical frameworks to assess livelihood vulnerability to climate change and analyse adaptation options.

    Science.gov (United States)

    Reed, M S; Podesta, G; Fazey, I; Geeson, N; Hessel, R; Hubacek, K; Letson, D; Nainggolan, D; Prell, C; Rickenbach, M G; Ritsema, C; Schwilch, G; Stringer, L C; Thomas, A D

    2013-10-01

    Experts working on behalf of international development organisations need better tools to assist land managers in developing countries maintain their livelihoods, as climate change puts pressure on the ecosystem services that they depend upon. However, current understanding of livelihood vulnerability to climate change is based on a fractured and disparate set of theories and methods. This review therefore combines theoretical insights from sustainable livelihoods analysis with other analytical frameworks (including the ecosystem services framework, diffusion theory, social learning, adaptive management and transitions management) to assess the vulnerability of rural livelihoods to climate change. This integrated analytical framework helps diagnose vulnerability to climate change, whilst identifying and comparing adaptation options that could reduce vulnerability, following four broad steps: i) determine likely level of exposure to climate change, and how climate change might interact with existing stresses and other future drivers of change; ii) determine the sensitivity of stocks of capital assets and flows of ecosystem services to climate change; iii) identify factors influencing decisions to develop and/or adopt different adaptation strategies, based on innovation or the use/substitution of existing assets; and iv) identify and evaluate potential trade-offs between adaptation options. The paper concludes by identifying interdisciplinary research needs for assessing the vulnerability of livelihoods to climate change.

  13. Biosphere analyses for the safety assessment SR-Site - synthesis and summary of results

    International Nuclear Information System (INIS)

    Saetre, Peter

    2010-12-01

    This report summarises nearly 20 biosphere reports and gives a synthesis of the work performed within the SR-Site Biosphere project, i.e. the biosphere part of SR-Site. SR-Site Biosphere provides the main project with dose conversion factors (LDFs), given a unit release rate, for calculation of human doses under different release scenarios, and assesses if a potential release from the repository would have detrimental effects on the environment. The intention of this report is to give sufficient details for an overview of methods, results and major conclusions, with references to the biosphere reports where methods, data and results are presented and discussed in detail. The philosophy of the biosphere assessment was to make estimations of the radiological risk for humans and the environment as realistic as possible, based on the knowledge of present-day conditions at Forsmark and the past and expected future development of the site. This was achieved by using the best available knowledge, understanding and data from extensive site investigations from two sites. When sufficient information was not available, uncertainties were handled cautiously. A systematic identification and evaluation of features and processes that affect transport and accumulation of radionuclides at the site was conducted, and the results were summarised in an interaction matrix. Data and understanding from the site investigation was an integral part of this work, the interaction matrix underpinned the development of the radionuclide model used in the biosphere assessment. Understanding of the marine, lake and river and terrestrial ecosystems at the site was summarized in a conceptual model, and relevant features and process have been characterized to capture site specific parameter values. Detailed investigations of the structure and history of the regolith at the site and simulations of regolith dynamics were used to describe the present day state at Forsmark and the expected development of

  14. Reproductive gonadal steroidogenic activity in the fishing cat (Prionailurus viverrinus) assessed by fecal steroid analyses.

    Science.gov (United States)

    Santymire, Rachel M; Brown, Janine L; Stewart, Rosemary A; Santymire, Robb C; Wildt, David E; Howard, JoGayle

    2011-10-01

    Non-invasive fecal steroid analyses were used to characterize gonadal activity in the fishing cat (Prionailurus viverrinus). Estrogen, progestagen and androgen metabolites were quantified in fecal samples collected for 12 months from four males and 10 females housed at seven North American zoological institutions. Male reproductive hormone concentrations did not vary (P>0.05) among season, and estrogen cycles were observed year-round in females and averaged (±SEM) 19.9±1.0 days. Mean peak estrogen concentration during estrus (460.0±72.6ng/g feces) was five-fold higher than baseline (87.3±14.0ng/g feces). Five of seven females (71.4%) housed alone or with another female demonstrated spontaneous luteal activity (apparent ovulation without copulation), with mean progestagen concentration (20.3±4.7μg/g feces), increasing nearly five-fold above baseline (4.1±0.8μg/g feces). The non-pregnant luteal phase averaged 32.9±2.5 days (n=13). One female delivered kittens 70 days after natural mating with fecal progestagen concentrations averaging 51.2±5.2μg/g feces. Two additional females were administered exogenous gonadotropins (150IU eCG; 100IU hCG), which caused hyper-elevated concentrations of fecal estrogen and progestagen (plus ovulation). Results indicate that: (1) male and female fishing cats managed in North American zoos are reproductively active year round; (2) 71.4% of females experienced spontaneous ovulation; and (3) females are responsive to exogenous gonadotropins for ovulation induction, but a regimen that produces a normative ovarian steroidogenic response needs to be identified. Copyright © 2011 Elsevier B.V. All rights reserved.

  15. Profile analyses of the Personality Assessment Inventory following military-related traumatic brain injury.

    Science.gov (United States)

    Kennedy, Jan E; Cooper, Douglas B; Reid, Matthew W; Tate, David F; Lange, Rael T

    2015-05-01

    Personality Assessment Inventory (PAI) profiles were examined in 160 U.S. service members (SMs) following mild to severe traumatic brain injury (TBI). Participants who sustained a mild TBI had significantly higher PAI scores than those with moderate-severe TBI on eight of the nine clinical scales examined. A two-step cluster analysis identified four PAI profiles, heuristically labeled "High Distress", "Moderate Distress", "Somatic Distress", and "No Distress". Postconcussive and posttraumatic stress symptom severity was highest for the High Distress group, followed by the Somatic Distress and Moderate Distress groups, and lowest for the No Distress group. Profile groups differed in age, ethnicity, rank, and TBI severity. Findings indicate that meaningful patterns of behavioral and personality characteristics can be detected in active-duty military SMs following TBI, which may prove useful in selecting the most efficacious rehabilitation strategies. © The Author 2015. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  16. Geospatial analyses and system architectures for the next generation of radioactive materials risk assessment and routing

    International Nuclear Information System (INIS)

    Ganter, J.H.

    1996-01-01

    This paper suggests that inexorable changes in society are presenting both challenges and a rich selection of technologies for responding to these challenges. The citizen is more demanding of environmental and personal protection, and of information. Simultaneously, the commercial and government information technology markets are providing new technologies such as commercial off-the-shelf (COTS) software, common datasets, ''open'' GIS, recordable CD-ROM, and the World Wide Web. Thus one has the raw ingredients for creating new techniques and tools for spatial analysis, and these tools can support participative study and decision-making. By carrying out a strategy of thorough and demonstrably correct science, design, and development, one can move forward into a new generation of participative risk assessment and routing for radioactive and hazardous materials.

  17. Combined proteomic and metallomic analyses in Scrobicularia plana clams to assess environmental pollution of estuarine ecosystems.

    Science.gov (United States)

    González-Domínguez, Raúl; Santos, Hugo Miguel; Bebianno, Maria João; García-Barrera, Tamara; Gómez-Ariza, José Luis; Capelo, José Luis

    2016-12-15

    Estuaries are very important ecosystems with great ecological and economic value, but they are usually highly impacted by anthropogenic pressure. Thus, assessing pollution levels in these habitats is critical for evaluating their environmental quality. In this work, we combined complementary metallomic and proteomic approaches with the aim of monitoring the effects of environmental pollution on Scrobicularia plana clams captured in three estuarine systems on the south coast of Portugal: the Arade estuary, Ria Formosa and the Guadiana estuary. Multi-elemental profiling of digestive glands was carried out to evaluate the differential pollution levels in the three study areas. Then, proteomic analysis by means of two-dimensional gel electrophoresis and mass spectrometry revealed twenty-one differential proteins, which could be associated with multiple toxicological mechanisms induced in environmentally stressed organisms. Accordingly, it can be concluded that the combination of different omic approaches holds great potential for environmental research. Copyright © 2016 Elsevier Ltd. All rights reserved.

  18. Assessing the Credit Risk of Corporate Bonds Based on Factor Analysis and Logistic Regression Analysis Techniques: Evidence from New Energy Enterprises in China

    Directory of Open Access Journals (Sweden)

    Yuanxin Liu

    2018-05-01

    Full Text Available In recent years, new energy sources have seen tremendous opportunities for development. The financing difficulties faced by new energy enterprises (NEEs) can be eased by issuing corporate bonds. However, there are few scientific and reasonable methods to assess the credit risk of NEE bonds, which is not conducive to the healthy development of NEEs. Against this background, this paper analyzes the advantages and risks of NEEs issuing bonds and the main factors affecting the credit risk of NEE bonds, constructs a hybrid model for assessing the credit risk of NEE bonds based on factor analysis and logistic regression analysis techniques, and verifies the applicability and effectiveness of the model using data from 46 Chinese NEEs. The results show that the main factors affecting the credit risk of NEE bonds are internal factors involving the company's profitability, solvency, operational ability, growth potential, asset structure and viability, and external factors including the macroeconomic environment and energy policy support. Based on the empirical results and the current situation of China's NEE bonds, the article finally puts forward several targeted recommendations.
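
    A minimal sketch of the two-stage approach described above, under stated assumptions: hypothetical financial ratios are compressed into latent factors with scikit-learn's FactorAnalysis, and a logistic regression on the factor scores yields a credit-risk probability. All data and dimensions are illustrative, not the paper's.

```python
# Two-stage sketch: factor analysis compresses correlated financial ratios
# into latent factors; logistic regression on the factor scores gives a
# bond credit-risk probability. Firms, ratios and labels are simulated.
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(46, 8))              # 46 firms x 8 financial ratios
y = rng.integers(0, 2, size=46)           # 1 = higher credit risk (illustrative)

model = make_pipeline(
    StandardScaler(),                     # put ratios on comparable scales
    FactorAnalysis(n_components=3),       # latent factors (profitability, etc.)
    LogisticRegression(),
)
model.fit(X, y)
print(model.predict_proba(X[:3])[:, 1])   # risk probabilities for three firms
```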

  19. Systematic assessment of environmental risk factors for bipolar disorder: an umbrella review of systematic reviews and meta-analyses.

    Science.gov (United States)

    Bortolato, Beatrice; Köhler, Cristiano A; Evangelou, Evangelos; León-Caballero, Jordi; Solmi, Marco; Stubbs, Brendon; Belbasis, Lazaros; Pacchiarotti, Isabella; Kessing, Lars V; Berk, Michael; Vieta, Eduard; Carvalho, André F

    2017-03-01

    The pathophysiology of bipolar disorder (BD) is likely to involve both genetic and environmental risk factors. In our study, we aimed to perform a systematic search of environmental risk factors for BD. In addition, we assessed possible hints of bias in this literature and identified risk factors supported by high epidemiological credibility. We searched the PubMed/MEDLINE, EMBASE and PsycINFO databases up to 7 October 2016 to identify systematic reviews and meta-analyses of observational studies that assessed associations between putative environmental risk factors and BD. For each meta-analysis, we estimated its summary effect size by means of both random- and fixed-effects models, 95% confidence intervals (CIs), the 95% prediction interval, and heterogeneity. Evidence of small-study effects and excess significance bias was also assessed. Sixteen publications met the inclusion criteria (seven meta-analyses and nine qualitative systematic reviews). Fifty-one unique environmental risk factors for BD were evaluated. Six meta-analyses investigated associations with a risk factor for BD. Only irritable bowel syndrome (IBS) emerged as a risk factor for BD supported by convincing evidence (k=6; odds ratio [OR]=2.48; 95% CI=2.35-2.61; P<.001), and childhood adversity was supported by highly suggestive evidence. Asthma and obesity were risk factors for BD supported by suggestive evidence, and seropositivity to Toxoplasma gondii and a history of head injury were supported by weak evidence. Although several environmental risk factors for BD were identified, few meta-analyses of observational studies were available. Therefore, further well-designed and adequately powered studies are necessary to map the environmental risk factors for BD. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  20. Impacts Analyses Supporting the National Environmental Policy Act Environmental Assessment for the Resumption of Transient Testing Program

    Energy Technology Data Exchange (ETDEWEB)

    Schafer, Annette L. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Brown, LLoyd C. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Carathers, David C. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Christensen, Boyd D. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Dahl, James J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Miller, Mark L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Farnum, Cathy Ottinger [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Peterson, Steven [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Sondrup, A. Jeffrey [Idaho National Lab. (INL), Idaho Falls, ID (United States); Subaiya, Peter V. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Wachs, Daniel M. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Weiner, Ruth F. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2014-02-01

    This document contains the analysis details and summary of analyses conducted to evaluate the environmental impacts for the Resumption of Transient Fuel and Materials Testing Program. It provides an assessment of the impacts for the two action alternatives being evaluated in the environmental assessment. These alternatives are (1) resumption of transient testing using the Transient Reactor Test Facility (TREAT) at Idaho National Laboratory (INL) and (2) conducting transient testing using the Annular Core Research Reactor (ACRR) at Sandia National Laboratory in New Mexico (SNL/NM). Analyses are provided for radiologic emissions, other air emissions, soil contamination, and groundwater contamination that could occur (1) during normal operations, (2) as a result of accidents in one of the facilities, and (3) during transport. It does not include an assessment of the biotic, cultural resources, waste generation, or other impacts that could result from the resumption of transient testing. Analyses were conducted by technical professionals at INL and SNL/NM as noted throughout this report. The analyses are based on bounding radionuclide inventories, with the same inventories used for test materials by both alternatives and different inventories for the TREAT Reactor and ACRR. An upper value on the number of tests was assumed, with a test frequency determined by the realistic turn-around times required between experiments. The estimates provided for impacts during normal operations are based on historical emission rates and projected usage rates; therefore, they are bounding. Estimated doses for members of the public, collocated workers, and facility workers that could be incurred as a result of an accident are very conservative. They do not credit safety systems or administrative procedures (such as evacuation plans or use of personal protective equipment) that could be used to limit worker doses. Doses estimated for transportation are conservative and are based on

  1. Biosphere analyses for the safety assessment SR-Site - synthesis and summary of results

    Energy Technology Data Exchange (ETDEWEB)

    Saetre, Peter [comp.

    2010-12-15

    This report summarises nearly 20 biosphere reports and gives a synthesis of the work performed within the SR-Site Biosphere project, i.e. the biosphere part of SR-Site. SR-Site Biosphere provides the main project with landscape dose conversion factors (LDFs), given a unit release rate, for the calculation of human doses under different release scenarios, and assesses whether a potential release from the repository would have detrimental effects on the environment. The intention of this report is to give sufficient detail for an overview of methods, results and major conclusions, with references to the biosphere reports where methods, data and results are presented and discussed in detail. The philosophy of the biosphere assessment was to make the estimates of radiological risk for humans and the environment as realistic as possible, based on knowledge of present-day conditions at Forsmark and of the past and expected future development of the site. This was achieved by using the best available knowledge, understanding and data from extensive site investigations at two sites. When sufficient information was not available, uncertainties were handled cautiously. A systematic identification and evaluation of features and processes that affect transport and accumulation of radionuclides at the site was conducted, and the results were summarised in an interaction matrix. Data and understanding from the site investigation were an integral part of this work, and the interaction matrix underpinned the development of the radionuclide model used in the biosphere assessment. Understanding of the marine, lake and river, and terrestrial ecosystems at the site was summarised in a conceptual model, and relevant features and processes were characterised to capture site-specific parameter values. Detailed investigations of the structure and history of the regolith at the site and simulations of regolith dynamics were used to describe the present-day state at Forsmark and the expected development of

  2. Assessing regional and interspecific variation in threshold responses of forest breeding birds through broad scale analyses.

    Directory of Open Access Journals (Sweden)

    Yntze van der Hoek

    Full Text Available BACKGROUND: Identifying persistence and extinction thresholds in species-habitat relationships is a major focal point of ecological research and conservation. However, one major concern regarding the incorporation of threshold analyses in conservation is the lack of knowledge on the generality and transferability of results across species and regions. We present a multi-region, multi-species approach of modeling threshold responses, which we use to investigate whether threshold effects are similar across species and regions. METHODOLOGY/PRINCIPAL FINDINGS: We modeled local persistence and extinction dynamics of 25 forest-associated breeding birds based on detection/non-detection data, which were derived from repeated breeding bird atlases for the state of Vermont. We did not find threshold responses to be particularly well-supported, with 9 species supporting extinction thresholds and 5 supporting persistence thresholds. This contrasts with a previous study based on breeding bird atlas data from adjacent New York State, which showed that most species support persistence and extinction threshold models (15 and 22 of 25 study species respectively). In addition, species that supported a threshold model in both states had associated average threshold estimates of 61.41% (SE = 6.11, persistence) and 66.45% (SE = 9.15, extinction) in New York, compared to 51.08% (SE = 10.60, persistence) and 73.67% (SE = 5.70, extinction) in Vermont. Across species, thresholds were found at 19.45-87.96% forest cover for persistence and 50.82-91.02% for extinction dynamics. CONCLUSIONS/SIGNIFICANCE: Through an approach that allows for broad-scale comparisons of threshold responses, we show that species vary in their threshold responses with regard to habitat amount, and that differences between even nearby regions can be pronounced. We present both ecological and methodological factors that may contribute to the different model results, but propose that

  3. Assessing regional and interspecific variation in threshold responses of forest breeding birds through broad scale analyses.

    Science.gov (United States)

    van der Hoek, Yntze; Renfrew, Rosalind; Manne, Lisa L

    2013-01-01

    Identifying persistence and extinction thresholds in species-habitat relationships is a major focal point of ecological research and conservation. However, one major concern regarding the incorporation of threshold analyses in conservation is the lack of knowledge on the generality and transferability of results across species and regions. We present a multi-region, multi-species approach of modeling threshold responses, which we use to investigate whether threshold effects are similar across species and regions. We modeled local persistence and extinction dynamics of 25 forest-associated breeding birds based on detection/non-detection data, which were derived from repeated breeding bird atlases for the state of Vermont. We did not find threshold responses to be particularly well-supported, with 9 species supporting extinction thresholds and 5 supporting persistence thresholds. This contrasts with a previous study based on breeding bird atlas data from adjacent New York State, which showed that most species support persistence and extinction threshold models (15 and 22 of 25 study species respectively). In addition, species that supported a threshold model in both states had associated average threshold estimates of 61.41% (SE = 6.11, persistence) and 66.45% (SE = 9.15, extinction) in New York, compared to 51.08% (SE = 10.60, persistence) and 73.67% (SE = 5.70, extinction) in Vermont. Across species, thresholds were found at 19.45-87.96% forest cover for persistence and 50.82-91.02% for extinction dynamics. Through an approach that allows for broad-scale comparisons of threshold responses, we show that species vary in their threshold responses with regard to habitat amount, and that differences between even nearby regions can be pronounced. We present both ecological and methodological factors that may contribute to the different model results, but propose that regardless of the reasons behind these differences, our results merit a warning that
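
    As a hedged illustration of the kind of threshold modelling described here (not the authors' own occupancy code), the sketch below compares a plain logistic persistence model against a piecewise ("hinge") alternative across candidate forest-cover thresholds, selecting by AIC; all data are simulated.

```python
# Illustrative comparison of a linear logistic persistence model with a
# piecewise ("hinge") threshold alternative, selected by AIC.
# Data are simulated; the study used repeated breeding-bird atlas records.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
cover = rng.uniform(0, 100, 500)                       # % forest cover per site
p_true = 1 / (1 + np.exp(-(-2 + 0.08 * np.maximum(cover - 60, 0))))
persist = rng.binomial(1, p_true)                      # persistence (1) vs loss (0)

linear = sm.Logit(persist, sm.add_constant(cover)).fit(disp=0)

best_aic, best_t = np.inf, None
for t in np.arange(10, 95, 1.0):                       # scan candidate thresholds
    hinge = np.maximum(cover - t, 0)                   # effect only above t% cover
    X = sm.add_constant(np.column_stack([cover, hinge]))
    fit = sm.Logit(persist, X).fit(disp=0)
    if fit.aic < best_aic:
        best_aic, best_t = fit.aic, t

print(f"linear AIC {linear.aic:.1f} vs threshold AIC {best_aic:.1f} at {best_t:.0f}% cover")
```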

  4. Assessing Ecological Impacts of Shrimp and Sewage Effluent: Biological Indicators with Standard Water Quality Analyses

    Science.gov (United States)

    Jones, A. B.; O'Donohue, M. J.; Udy, J.; Dennison, W. C.

    2001-01-01

    Despite evidence linking shrimp farming to several cases of environmental degradation, there remains a lack of ecologically meaningful information about the impacts of effluent on receiving waters. The aim of this study was to determine the biological impact of shrimp farm effluent, and to compare and distinguish its impacts from treated sewage effluent. Analyses included standard water quality/sediment parameters, as well as biological indicators including tissue nitrogen (N) content, stable isotope ratio of nitrogen (δ15N), and amino acid composition of inhabitant seagrasses, mangroves and macroalgae. The study area consisted of two tidal creeks, one receiving effluent from a sewage treatment plant and the other from an intensive shrimp farm. The creeks discharged into the western side of Moreton Bay, a sub-tropical coastal embayment on the east coast of Australia. Characterization of water quality revealed significant differences between the creeks, and with unimpacted eastern Moreton Bay. The sewage creek had higher concentrations of dissolved nutrients (predominantly NO₃⁻/NO₂⁻ and PO₄³⁻, compared to NH₄⁺ in the shrimp creek). In contrast, the shrimp creek was more turbid and had higher phytoplankton productivity. Beyond 750 m from the creek mouths, water quality parameters were indistinguishable from eastern Moreton Bay values. Biological indicators detected significant impacts up to 4 km beyond the creek mouths (reference site). Elevated plant δ15N values ranged from 10.4-19.6‰ at the site of sewage discharge to 2.9-4.5‰ at the reference site. The free amino acid concentration and composition of seagrass and macroalgae was used to distinguish between the uptake of sewage- and shrimp-derived N. Proline (seagrass) and serine (macroalgae) were high in sewage-impacted plants, and glutamine (seagrass) and alanine (macroalgae) were high in plants impacted by shrimp effluent. The δ15N isotopic signatures and free amino acid composition of inhabitant

  5. Assessing and analysing the impact of land take pressures on arable land

    Directory of Open Access Journals (Sweden)

    E. Aksoy

    2017-06-01

    Full Text Available Land, and in particular soil, is a finite and essentially non-renewable resource. Across the European Union, land take, i.e. the increase of settlement area over time, annually consumes more than 1000 km2, of which half is actually sealed and hence lost under impermeable surfaces. Land take, and in particular soil sealing, has already been identified as one of the major soil threats in the 2006 European Commission Communication "Towards a Thematic Strategy on Soil Protection" and the Soil Thematic Strategy, and has been confirmed as such in the report on the implementation of this strategy. The aim of this study is to relate the potential of land for a particular use in a given region with the actual land use. This allows evaluating whether land (especially the soil dimension) is used according to its (theoretical) potential. To this aim, the impact of several land cover flows related to urban development on soils with good, average, and poor production potentials was assessed and mapped. Thus, the amount and quality (potential for agricultural production) of arable land lost between the years 2000 and 2006 were identified. In addition, areas with high productivity potential around urban areas, indicating areas of potential future land use conflicts in Europe, were identified.

  6. Compound-Specific Isotope Analyses to Assess TCE Biodegradation in a Fractured Dolomitic Aquifer.

    Science.gov (United States)

    Clark, Justin A; Stotler, Randy L; Frape, Shaun K; Illman, Walter A

    2017-01-01

    The potential for trichloroethene (TCE) biodegradation in a fractured dolomite aquifer at a former chemical disposal site in Smithville, Ontario, Canada, is assessed using chemical analysis and TCE and cis-DCE compound-specific isotope analysis of carbon and chlorine collected over a 16-month period. Groundwater redox conditions change from suboxic to much more reducing environments within and around the plume, indicating that oxidation of organic contaminants and degradation products is occurring at the study site. TCE and cis-DCE were observed in 13 of 14 wells sampled. VC, ethene, and/or ethane were also observed in ten wells, indicating that partial/full dechlorination has occurred. Chlorine isotopic values (δ37Cl) range between 1.39 and 4.69‰ SMOC for TCE, and between 3.57 and 13.86‰ SMOC for cis-DCE. Carbon isotopic values range between -28.9 and -20.7‰ VPDB for TCE, and between -26.5 and -11.8‰ VPDB for cis-DCE. In most wells, isotopic values remained steady over the 15-month study. Isotopic enrichment from TCE to cis-DCE varied between 0 and 13‰ for carbon and 1 and 4‰ for chlorine. Calculated chlorine-carbon isotopic enrichment ratios (εCl/εC) were 0.18 for TCE and 0.69 for cis-DCE. Combined, the isotopic and chemical data indicate very little dechlorination near the source zone, but suggest bacterially mediated degradation closer to the edges of the plume. © 2016, National Ground Water Association.
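
    For readers unfamiliar with dual-isotope analysis, the chlorine-carbon enrichment ratio is commonly approximated by the slope of δ37Cl against δ13C in a dual-isotope plot; a minimal sketch with made-up values (not the Smithville data):

```python
# Sketch: estimate the dual-isotope slope (an approximation of eCl/eC)
# by ordinary least squares on paired d37Cl and d13C measurements.
# Values below are illustrative, not the study data.
import numpy as np

d13C = np.array([-28.9, -27.5, -25.8, -24.0, -22.1, -20.7])   # per mil VPDB
d37Cl = np.array([1.39, 1.9, 2.6, 3.2, 4.1, 4.69])            # per mil SMOC

slope, intercept = np.polyfit(d13C, d37Cl, 1)
print(f"dual-isotope slope (~eCl/eC): {slope:.2f}")
```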

  7. Assessing the Extent of Sediment Contamination Around Creosote-treated Pilings Through Chemical and Biological Analyses

    Science.gov (United States)

    Stefansson, E. S.

    2008-12-01

    Creosote is a common wood preservative used to treat marine structures, such as docks and bulkheads. Treated dock pilings continually leach polycyclic aromatic hydrocarbons (PAHs) and other creosote compounds into the surrounding water and sediment. Over time, these compounds can accumulate in marine sediments, reaching much greater concentrations than those in seawater. The purpose of this study was to assess the extent of creosote contamination in sediments, at a series of distances from treated pilings. Three pilings were randomly selected from a railroad trestle in Fidalgo Bay, WA and sediment samples were collected at four distances from each: 0 meters, 0.5 meters, 1 meter, and 2 meters. Samples were used to conduct two bioassays: an amphipod bioassay (Rhepoxynius abronius) and a sand dollar embryo bioassay. Grain size and PAH content (using a fluorometric method) were also measured. Five samples in the amphipod bioassay showed significantly lower effective survival than the reference sediment. These consisted of samples closest to the piling at 0 and 0.5 meters. One 0 m sample in the sand dollar embryo bioassay also showed a significantly lower percentage of normal embryos than the reference sediment. Overall, results strongly suggest that creosote-contaminated sediments, particularly those closest to treated pilings, can negatively affect both amphipods and echinoderm embryos. Although chemical data were somewhat ambiguous, 0 m samples had the highest levels of PAHs, which corresponded to the lowest average survival in both bioassays. Relatively high levels of PAHs were found as far as 2 meters away from pilings. Therefore, we cannot say how far chemical contamination can spread from creosote-treated pilings, and at what distance this contamination can still affect marine organisms. These results, as well as future research, are essential to the success of proposed piling removal projects. In addition to creosote-treated pilings, contaminated sediments must

  8. Exploratory breath analyses for assessing toxic dermal exposures of firefighters during suppression of structural burns.

    Science.gov (United States)

    Pleil, Joachim D; Stiegel, Matthew A; Fent, Kenneth W

    2014-09-01

    Firefighters wear fireproof clothing and self-contained breathing apparatus (SCBA) during rescue and fire suppression activities to protect against acute effects from heat and toxic chemicals. Fire services are also concerned about long-term health outcomes from chemical exposures over a working lifetime, in particular about low-level exposures that might serve as initiating events for adverse outcome pathways (AOP) leading to cancer. As part of a larger US National Institute for Occupational Safety and Health (NIOSH) study of dermal exposure protection from safety gear used by the City of Chicago firefighters, we collected pre- and post-fire fighting breath samples and analyzed for single-ring and polycyclic aromatic hydrocarbons as bioindicators of occupational exposure to gas-phase toxicants. Under the assumption that SCBA protects completely against inhalation exposures, any changes in the exhaled profile of combustion products were attributed to dermal exposures from gas and particle penetration through the protective clothing. Two separate rounds of firefighting activity were performed each with 15 firefighters per round. Exhaled breath samples were collected onto adsorbent tubes and analyzed with gas-chromatography-mass spectrometry (GC-MS) with a targeted approach using selective ion monitoring. We found that single ring aromatics and some PAHs were statistically elevated in post-firefighting samples of some individuals, suggesting that fire protective gear may allow for dermal exposures to airborne contaminants. However, in comparison to a previous occupational study of Air Force maintenance personnel where similar compounds were measured, these exposures are much lower suggesting that firefighters' gear is very effective. This study suggests that exhaled breath sampling and analysis for specific targeted compounds is a suitable method for assessing systemic dermal exposure in a simple and non-invasive manner.

  9. Psychometric Properties of “Community Assessment of Psychic Experiences”: Review and Meta-analyses

    Science.gov (United States)

    Mark, Winifred; Toulopoulou, Timothea

    2016-01-01

    The Community Assessment of Psychic Experiences (CAPE) has been used extensively as a measurement for psychosis proneness in clinical and research settings. However, no prior review and meta-analysis have comprehensively examined psychometric properties (reliability and validity) of CAPE scores across different studies. To study CAPE’s internal reliability—ie, how well scale items correlate with one another—111 studies were reviewed. Of these, 18 reported unique internal reliability coefficients using data at hand, which were aggregated in a meta-analysis. Furthermore, to confirm the number and nature of factors tapped by CAPE, 17 factor analytic studies were reviewed and subjected to meta-analysis in cases of discrepancy. Results suggested that CAPE scores were psychometrically reliable—ie, scores obtained could be attributed to true score variance. Our review of factor analytic studies supported a 3-factor model for CAPE consisting of “Positive”, “Negative”, and “Depressive” subscales; and a tripartite structure for the Negative dimension consisting of “Social withdrawal”, “Affective flattening”, and “Avolition” subdimensions. Meta-analysis of factor analytic studies of the Positive dimension revealed a tridimensional structure consisting of “Bizarre experiences”, “Delusional ideations”, and “Perceptual anomalies”. Information on reliability and validity of CAPE scores is important for ensuring accurate measurement of the psychosis proneness phenotype, which in turn facilitates early detection and intervention for psychotic disorders. Apart from enhancing the understanding of psychometric properties of CAPE scores, our review revealed questionable reporting practices possibly reflecting insufficient understanding regarding the significance of psychometric properties. We recommend increased focus on psychometrics in psychology programmes and clinical journals. PMID:26150674

  10. Sample heterogeneity in unipolar depression as assessed by functional connectivity analyses is dominated by general disease effects.

    Science.gov (United States)

    Feder, Stephan; Sundermann, Benedikt; Wersching, Heike; Teuber, Anja; Kugel, Harald; Teismann, Henning; Heindel, Walter; Berger, Klaus; Pfleiderer, Bettina

    2017-11-01

    Combinations of resting-state fMRI and machine-learning techniques are increasingly employed to develop diagnostic models for mental disorders. However, little is known about the neurobiological heterogeneity of depression and diagnostic machine learning has mainly been tested in homogeneous samples. Our main objective was to explore the inherent structure of a diverse unipolar depression sample. The secondary objective was to assess, if such information can improve diagnostic classification. We analyzed data from 360 patients with unipolar depression and 360 non-depressed population controls, who were subdivided into two independent subsets. Cluster analyses (unsupervised learning) of functional connectivity were used to generate hypotheses about potential patient subgroups from the first subset. The relationship of clusters with demographical and clinical measures was assessed. Subsequently, diagnostic classifiers (supervised learning), which incorporated information about these putative depression subgroups, were trained. Exploratory cluster analyses revealed two weakly separable subgroups of depressed patients. These subgroups differed in the average duration of depression and in the proportion of patients with concurrently severe depression and anxiety symptoms. The diagnostic classification models performed at chance level. It remains unresolved, if subgroups represent distinct biological subtypes, variability of continuous clinical variables or in part an overfitting of sparsely structured data. Functional connectivity in unipolar depression is associated with general disease effects. Cluster analyses provide hypotheses about potential depression subtypes. Diagnostic models did not benefit from this additional information regarding heterogeneity. Copyright © 2017 Elsevier B.V. All rights reserved.
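
    A schematic sketch of the two-step strategy described above, with random features standing in for functional-connectivity data: unsupervised clustering proposes patient subgroups, and a supervised classifier is then evaluated for diagnosis. Subject counts mirror the study; everything else is simulated.

```python
# Two-step sketch: (1) cluster patients' connectivity features to propose
# subgroups; (2) evaluate a supervised diagnostic classifier. Random features
# stand in for resting-state fMRI connectivity.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
conn = rng.normal(size=(720, 50))            # 720 subjects x 50 connectivity features
diagnosis = np.repeat([1, 0], 360)           # 360 patients, 360 controls

subgroups = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(conn[:360])
print("patient subgroup sizes:", np.bincount(subgroups))

acc = cross_val_score(LogisticRegression(max_iter=1000), conn, diagnosis, cv=5)
print(f"mean diagnostic accuracy: {acc.mean():.2f}")   # ~0.5 on pure noise
```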

  11. Assessing deep-seated landslide susceptibility using 3-D groundwater and slope-stability analyses, southwestern Seattle, Washington

    Science.gov (United States)

    Brien, Dianne L.; Reid, Mark E.

    2008-01-01

    In Seattle, Washington, deep-seated landslides on bluffs along Puget Sound have historically caused extensive damage to land and structures. These large failures are controlled by three-dimensional (3-D) variations in strength and pore-water pressures. We assess the slope stability of part of southwestern Seattle using a 3-D limit-equilibrium analysis coupled with a 3-D groundwater flow model. Our analyses use a high-resolution digital elevation model (DEM) combined with assignment of strength and hydraulic properties based on geologic units. The hydrogeology of the Seattle area consists of a layer of permeable glacial outwash sand that overlies less permeable glacial lacustrine silty clay. Using a 3-D groundwater model, MODFLOW-2000, we simulate a water table above the less permeable units and calibrate the model to observed conditions. The simulated pore-pressure distribution is then used in a 3-D slope-stability analysis, SCOOPS, to quantify the stability of the coastal bluffs. For wet winter conditions, our analyses predict that the least stable areas are steep hillslopes above Puget Sound, where pore pressures are elevated in the outwash sand. Groundwater flow converges in coastal reentrants, resulting in elevated pore pressures and destabilization of slopes. Regions predicted to be least stable include the areas in or adjacent to three mapped historically active deep-seated landslides. The results of our 3-D analyses differ significantly from a slope map or results from one-dimensional (1-D) analyses.

  12. Reduced Rank Regression

    DEFF Research Database (Denmark)

    Johansen, Søren

    2008-01-01

    The reduced rank regression model is a multivariate regression model with a coefficient matrix with reduced rank. The reduced rank regression algorithm is an estimation procedure, which estimates the reduced rank regression model. It is related to canonical correlations and involves calculating...
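
    A minimal numerical sketch of reduced rank regression with identity weighting: the unrestricted least-squares coefficient matrix is projected onto the leading singular directions of the fitted values, yielding a rank-r estimate. Data are simulated.

```python
# Minimal sketch of reduced rank regression (identity weighting): estimate
# the OLS coefficient matrix, then constrain it to rank r by projecting
# the fitted values onto their leading singular directions.
import numpy as np

rng = np.random.default_rng(3)
X = rng.normal(size=(200, 6))
B_true = rng.normal(size=(6, 1)) @ rng.normal(size=(1, 4))   # rank-1 signal
Y = X @ B_true + 0.1 * rng.normal(size=(200, 4))

r = 1
B_ols, *_ = np.linalg.lstsq(X, Y, rcond=None)     # unrestricted estimate
_, _, Vt = np.linalg.svd(X @ B_ols, full_matrices=False)
P = Vt[:r].T @ Vt[:r]                             # projector onto leading directions
B_rrr = B_ols @ P                                 # rank-r coefficient matrix

print(np.linalg.matrix_rank(B_rrr))               # -> 1
```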

  13. Regression analysis with categorized regression calibrated exposure: some interesting findings

    Directory of Open Access Journals (Sweden)

    Hjartåker Anette

    2006-07-01

    Full Text Available Abstract Background Regression calibration as a method for handling measurement error is becoming increasingly well-known and used in epidemiologic research. However, the standard version of the method is not appropriate for exposure analyzed on a categorical (e.g. quintile) scale, an approach commonly used in epidemiologic studies. A tempting solution could then be to use the predicted continuous exposure obtained through the regression calibration method and treat it as an approximation to the true exposure, that is, include the categorized calibrated exposure in the main regression analysis. Methods We use semi-analytical calculations and simulations to evaluate the performance of the proposed approach compared to the naive approach of not correcting for measurement error, in situations where analyses are performed on the quintile scale and when incorporating the original scale into the categorical variables, respectively. We also present analyses of real data, containing measures of folate intake and depression, from the Norwegian Women and Cancer study (NOWAC). Results In cases where extra information is available through replicated measurements and not validation data, regression calibration does not maintain important qualities of the true exposure distribution, thus estimates of variance and percentiles can be severely biased. We show that the outlined approach maintains much, in some cases all, of the misclassification found in the observed exposure. For that reason, regression analysis with the corrected variable included on a categorical scale is still biased. In some cases the corrected estimates are analytically equal to those obtained by the naive approach. Regression calibration is however vastly superior to the naive method when applying the medians of each category in the analysis. Conclusion Regression calibration in its most well-known form is not appropriate for measurement error correction when the exposure is analyzed on a
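
    A small simulation makes the central point concrete: because regression calibration is a monotone (linear) shrinkage of the observed exposure, cutting the calibrated variable into quintiles reproduces exactly the quintile assignment of the uncorrected exposure, so the categorized analysis inherits the original misclassification. Data are simulated; in practice the reliability factor would come from replicate measurements.

```python
# Simulation of the categorization problem: regression calibration is a
# monotone (linear) shrinkage, so quintiles of the calibrated exposure are
# identical to quintiles of the uncorrected exposure. The reliability factor
# is taken as known here; in practice it comes from replicate measurements.
import numpy as np

rng = np.random.default_rng(4)
n = 10_000
true = rng.normal(0, 1, n)                      # true exposure
obs = true + rng.normal(0, 1, n)                # observed with measurement error
lam = true.var() / obs.var()                    # reliability (attenuation) factor
calibrated = obs.mean() + lam * (obs - obs.mean())

def quintile(x):
    return np.searchsorted(np.quantile(x, [0.2, 0.4, 0.6, 0.8]), x)

print("agreement with true quintiles, observed:  ",
      np.mean(quintile(obs) == quintile(true)))
print("agreement with true quintiles, calibrated:",
      np.mean(quintile(calibrated) == quintile(true)))   # identical values
```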

  14. Evaluation of the efficiency of continuous wavelet transform as processing and preprocessing algorithm for resolution of overlapped signals in univariate and multivariate regression analyses; an application to ternary and quaternary mixtures

    Science.gov (United States)

    Hegazy, Maha A.; Lotfy, Hayam M.; Mowaka, Shereen; Mohamed, Ekram Hany

    2016-07-01

    Wavelets have been adapted for a vast number of signal-processing applications due to the amount of information that can be extracted from a signal. In this work, a comparative study was conducted on the efficiency of the continuous wavelet transform (CWT) as a signal-processing tool in univariate regression and as a pre-processing tool in multivariate analysis using partial least squares (CWT-PLS). These were applied to the complex spectral signals of ternary and quaternary mixtures. The CWT-PLS method succeeded in the simultaneous determination of a quaternary mixture of drotaverine (DRO), caffeine (CAF), paracetamol (PAR) and p-aminophenol (PAP), the major impurity of paracetamol. The univariate CWT, in contrast, failed to determine the quaternary mixture components simultaneously; it was able to determine only PAR and PAP, as well as the ternary mixtures of DRO, CAF and PAR, and of CAF, PAR and PAP. During the CWT calculations, different wavelet families were tested. The univariate CWT method was validated according to the ICH guidelines. For the development of the CWT-PLS model, a calibration set was prepared by means of an orthogonal experimental design, and the absorption spectra were recorded and processed by CWT. The CWT-PLS model was constructed by regression between the wavelet coefficients and the concentration matrices, and validation was performed with both cross-validation and external validation sets. Both methods were successfully applied to the determination of the studied drugs in pharmaceutical formulations.
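
    A sketch of the CWT-PLS idea under stated assumptions (simulated spectra; PyWavelets for the transform; wavelet family, scale and number of latent variables treated as tuning choices, not the paper's settings): each spectrum is replaced by its continuous wavelet coefficients, which feed a PLS regression against the concentration matrix.

```python
# CWT-PLS sketch on simulated spectra: each spectrum is replaced by its
# continuous wavelet coefficients (PyWavelets), which feed a PLS regression
# against the concentration matrix.
import numpy as np
import pywt
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(5)
spectra = rng.normal(size=(25, 200))        # 25 calibration mixtures x 200 points
conc = rng.uniform(0, 10, size=(25, 4))     # concentrations of 4 analytes

def cwt_features(spectrum, scale=16, wavelet="mexh"):
    coefs, _ = pywt.cwt(spectrum, [scale], wavelet)
    return coefs[0]                         # coefficients along the wavelength axis

Xw = np.array([cwt_features(s) for s in spectra])
pls = PLSRegression(n_components=5).fit(Xw, conc)
print(pls.predict(Xw[:2]))                  # predicted concentrations
```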

  15. Fungible weights in logistic regression.

    Science.gov (United States)

    Jones, Jeff A; Waller, Niels G

    2016-06-01

    In this article we develop methods for assessing parameter sensitivity in logistic regression models. To set the stage for this work, we first review Waller's (2008) equations for computing fungible weights in linear regression. Next, we describe 2 methods for computing fungible weights in logistic regression. To demonstrate the utility of these methods, we compute fungible logistic regression weights using data from the Centers for Disease Control and Prevention's (2010) Youth Risk Behavior Surveillance Survey, and we illustrate how these alternate weights can be used to evaluate parameter sensitivity. To make our work accessible to the research community, we provide R code (R Core Team, 2015) that will generate both kinds of fungible logistic regression weights. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
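
    The closed-form fungible-weights machinery is beyond a short example, but the underlying idea can be illustrated by brute force on simulated data: sample perturbed coefficient vectors and keep those whose log-likelihood is nearly that of the maximum-likelihood solution; a wide spread among the retained vectors signals parameter sensitivity. This is an illustration of the concept, not the procedure developed in the article.

```python
# Brute-force illustration of the idea behind fungible weights (not the
# closed-form method reviewed in the article): alternative coefficient
# vectors with near-identical log-likelihood can differ noticeably,
# which signals parameter sensitivity. Data are simulated.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(6)
X = sm.add_constant(rng.normal(size=(500, 3)))
beta = np.array([0.2, 0.8, -0.5, 0.3])
y = rng.binomial(1, 1 / (1 + np.exp(-X @ beta)))

model = sm.Logit(y, X)
fit = model.fit(disp=0)

near_optimal = []
for _ in range(20000):
    b = fit.params + rng.normal(0, 0.15, size=4)   # random perturbation of the MLE
    if fit.llf - model.loglike(b) < 1.0:           # fits the data almost as well
        near_optimal.append(b)

print(np.ptp(np.array(near_optimal), axis=0))      # spread of each "fungible" weight
```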

  16. Vanadium NMR Chemical Shifts of (Imido)vanadium(V) Dichloride Complexes with Imidazolin-2-iminato and Imidazolidin-2-iminato Ligands: Cooperation with Quantum-Chemical Calculations and Multiple Linear Regression Analyses.

    Science.gov (United States)

    Yi, Jun; Yang, Wenhong; Sun, Wen-Hua; Nomura, Kotohiro; Hada, Masahiko

    2017-11-30

    The NMR chemical shifts of vanadium (51V) in (imido)vanadium(V) dichloride complexes with imidazolin-2-iminato and imidazolidin-2-iminato ligands were calculated by the density functional theory (DFT) method with GIAO. The calculated 51V NMR chemical shifts were analyzed by the multiple linear regression (MLR) analysis (MLRA) method with a series of calculated molecular properties. Some of the calculated NMR chemical shifts were incorrect when the optimized geometries were based on the X-ray structures. After the global minimum geometries of all of the molecules were determined, the trend in the observed chemical shifts was well reproduced by the present DFT method. The MLRA method was used to investigate the correlation between the 51V NMR chemical shift and the natural charge, band energy gap, and Wiberg bond index of the V═N bond. The 51V NMR chemical shifts obtained with the present MLR model reproduced the observed values well, with a correlation coefficient of 0.97.
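
    A sketch of the MLRA step with placeholder numbers (not the paper's computed descriptors): regress the chemical shift on the three properties named above and report the multiple correlation.

```python
# MLRA sketch with placeholder descriptor values (not the paper's data):
# regress the 51V shift on natural charge, energy gap and Wiberg bond index.
import numpy as np
from sklearn.linear_model import LinearRegression

descriptors = np.array([        # [natural charge, gap (eV), Wiberg index of V=N]
    [0.91, 3.2, 1.85],
    [0.88, 3.4, 1.79],
    [0.95, 3.0, 1.92],
    [0.85, 3.6, 1.75],
    [0.93, 3.1, 1.88],
])
shift_ppm = np.array([-640.0, -655.0, -620.0, -670.0, -632.0])  # placeholder shifts

mlr = LinearRegression().fit(descriptors, shift_ppm)
r = np.corrcoef(mlr.predict(descriptors), shift_ppm)[0, 1]
print(f"multiple correlation R = {r:.2f}")
```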

  17. Application of the Support Vector Regression Method for Turbidity Assessment with MODIS on a Shallow Coral Reef Lagoon (Voh-Koné-Pouembout, New Caledonia)

    Directory of Open Access Journals (Sweden)

    Guillaume Wattelez

    2017-09-01

    Full Text Available Particle transport by erosion from ultramafic lands in pristine tropical lagoons is a crucial problem, especially for the benthic and pelagic biodiversity associated with coral reefs. Satellite imagery is useful for assessing particle transport from land to sea. However, in the oligotrophic and shallow waters of tropical lagoons, the bottom reflection of downwelling light usually hampers the use of classical optical algorithms. In order to address this issue, a Support Vector Regression (SVR) model was developed and tested. The proposed application concerns the lagoon of New Caledonia—the second longest continuous coral reef in the world—which is frequently exposed to river plumes from ultramafic watersheds. The SVR model is based on a large training sample of in-situ turbidity values representative of the annual variability in the Voh-Koné-Pouembout lagoon (western coast of New Caledonia) during the 2014–2015 period, and on coincident satellite reflectance values from the MODerate Resolution Imaging Spectroradiometer (MODIS). It was trained with reflectance and two other explanatory parameters: bathymetry and bottom colour. This approach significantly improved the model's capacity for retrieving the in-situ turbidity range from MODIS images, as compared with algorithms dedicated to deep oligotrophic or turbid waters, which were shown to be inadequate. The SVR model is applicable to all shallow lagoon waters along the western coast of New Caledonia and is now ready to be tested over other oligotrophic shallow lagoon waters worldwide.
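
    A minimal sketch of the approach, assuming simulated matchup data: an RBF-kernel SVR is trained on reflectance plus the two auxiliary predictors (bathymetry and a bottom-colour class). The hyperparameters C and epsilon below are illustrative, not the calibrated values.

```python
# SVR sketch: predict in-situ turbidity from satellite reflectance plus
# bathymetry and a bottom-colour class. Values are simulated placeholders
# for the matched satellite/in-situ pairs.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(7)
n = 300
reflectance = rng.uniform(0, 0.05, n)            # MODIS band reflectance
depth = rng.uniform(0.5, 25, n)                  # bathymetry (m)
bottom = rng.integers(0, 2, n)                   # 0 = dark, 1 = bright bottom
turbidity = 40 * reflectance + 0.5 * bottom / np.sqrt(depth) + rng.normal(0, 0.1, n)

X = np.column_stack([reflectance, depth, bottom])
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.05))
model.fit(X, turbidity)
print(model.predict(X[:3]))                      # turbidity estimates
```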

  18. Assessing the impact of drinking water and sanitation on diarrhoeal disease in low- and middle-income settings: systematic review and meta-regression.

    Science.gov (United States)

    Wolf, Jennyfer; Prüss-Ustün, Annette; Cumming, Oliver; Bartram, Jamie; Bonjour, Sophie; Cairncross, Sandy; Clasen, Thomas; Colford, John M; Curtis, Valerie; De France, Jennifer; Fewtrell, Lorna; Freeman, Matthew C; Gordon, Bruce; Hunter, Paul R; Jeandron, Aurelie; Johnston, Richard B; Mäusezahl, Daniel; Mathers, Colin; Neira, Maria; Higgins, Julian P T

    2014-08-01

    To assess the impact of inadequate water and sanitation on diarrhoeal disease in low- and middle-income settings. The search strategy used Cochrane Library, MEDLINE & PubMed, Global Health, Embase and BIOSIS supplemented by screening of reference lists from previously published systematic reviews, to identify studies reporting on interventions examining the effect of drinking water and sanitation improvements in low- and middle-income settings published between 1970 and May 2013. Studies including randomised controlled trials, quasi-randomised trials with control group, observational studies using matching techniques and observational studies with a control group where the intervention was well defined were eligible. Risk of bias was assessed using a modified Ottawa-Newcastle scale. Study results were combined using meta-analysis and meta-regression to derive overall and intervention-specific risk estimates. Of 6819 records identified for drinking water, 61 studies met the inclusion criteria, and of 12,515 records identified for sanitation, 11 studies were included. Overall, improvements in drinking water and sanitation were associated with decreased risks of diarrhoea. Specific improvements, such as the use of water filters, provision of high-quality piped water and sewer connections, were associated with greater reductions in diarrhoea compared with other interventions. The results show that inadequate water and sanitation are associated with considerable risks of diarrhoeal disease and that there are notable differences in illness reduction according to the type of improved water and sanitation implemented. © 2014 John Wiley & Sons Ltd The World Health Organization retains copyright and all other rights in the manuscript of this article as submitted for publication.
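
    A compact sketch of the pooling-plus-meta-regression workflow on hypothetical log relative risks: DerSimonian–Laird random-effects weights are derived first, then a weighted least-squares meta-regression tests an intervention-type covariate. None of the numbers come from the review.

```python
# Sketch: DerSimonian-Laird random-effects pooling of study-level log relative
# risks, then a weighted least-squares meta-regression on an intervention-type
# covariate. All effect sizes below are hypothetical.
import numpy as np
import statsmodels.api as sm

log_rr = np.array([-0.51, -0.22, -0.69, -0.35, -0.10, -0.45])  # per-study effects
v = np.array([0.04, 0.02, 0.09, 0.03, 0.05, 0.06])             # within-study variances
piped = np.array([1, 0, 1, 0, 0, 1])                           # 1 = piped water supply

w = 1 / v                                                      # fixed-effect weights
ybar = np.sum(w * log_rr) / w.sum()
q = np.sum(w * (log_rr - ybar) ** 2)                           # Cochran's Q
tau2 = max(0.0, (q - (len(log_rr) - 1)) / (w.sum() - (w**2).sum() / w.sum()))
w_re = 1 / (v + tau2)                                          # random-effects weights
pooled = np.sum(w_re * log_rr) / w_re.sum()
print(f"pooled RR = {np.exp(pooled):.2f}, tau^2 = {tau2:.3f}")

meta_reg = sm.WLS(log_rr, sm.add_constant(piped), weights=w_re).fit()
print(meta_reg.params)                                         # intercept + covariate
```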

  19. A combined approach to investigate the toxicity of an industrial landfill's leachate: Chemical analyses, risk assessment and in vitro assays

    International Nuclear Information System (INIS)

    Baderna, D.; Maggioni, S.; Boriani, E.; Gemma, S.; Molteni, M.; Lombardo, A.; Colombo, A.; Bordonali, S.; Rotella, G.; Lodi, M.; Benfenati, E.

    2011-01-01

    Solid wastes constitute an important and growing problem. Landfills are still one of the most common ways to manage waste disposal. The risk assessment of pollutants from landfills is becoming a major environmental issue in Europe, owing to the large number of sites and to the importance of groundwater protection. Furthermore, there is a lack of knowledge about the environmental, ecotoxicological and toxicological characteristics of most contaminants contained in landfill leachates. Understanding leachate composition and creating an integrated strategy for risk assessment are currently needed to properly address landfill issues and to make projections of the long-term impacts of a landfill, with particular attention to the estimation of possible adverse effects on human health and the ecosystem. In the present study, we propose an integrated strategy to evaluate the toxicity of the leachate using chemical analyses, risk assessment guidelines and in vitro assays with hepatoma HepG2 cells as a model. The approach was applied to a real case study: an industrial waste landfill in northern Italy for which data on the presence of leachate contaminants are available for the last 11 years. Results from our ecological risk models suggest important toxic effects on freshwater fish and small rodents, mainly due to ammonia and inorganic constituents. Our in vitro data show an inhibition of cell proliferation by leachate at low doses and a cytotoxic effect at high doses after 48 h of exposure. - Research highlights: → We study the toxicity of leachate from a non-hazardous industrial waste landfill. → We perform chemical analyses, risk assessments and in vitro assays on HepG2 cells. → Risk models suggest toxic effects due to ammonia and inorganic constituents. → In vitro assays show that leachate inhibits cell proliferation at low doses. → Leachate can induce cytotoxic effects on HepG2 cells at high doses.

  20. Assessing the effects of habitat patches ensuring propagule supply and different costs inclusion in marine spatial planning through multivariate analyses.

    Science.gov (United States)

    Appolloni, L; Sandulli, R; Vetrano, G; Russo, G F

    2018-05-15

    Marine Protected Areas are considered key tools for the conservation of coastal ecosystems. However, many reserves are characterized by several problems, mainly related to inadequate zonings that often fail to protect areas of high biodiversity and propagule supply while, at the same time, precluding zones of economic importance to local communities. The Gulf of Naples is employed here as a study area to assess the effects of including different conservation features and costs in the reserve design process. In particular, eight scenarios are developed, using graph theory to identify propagule source patches and treating fishing and exploitation activities as costs-in-use for the local population. Scenarios elaborated by MARXAN, software commonly used for marine conservation planning, are compared using multivariate analyses (MDS, PERMANOVA and PERMDISP) in order to assess which input data have the greatest effects on protected area selection. MARXAN is heuristic software that returns a number of different valid solutions, all of them close to the best one. Its outputs show that the most important areas to be protected, in order to ensure long-term habitat persistence and adequate propagule supply, are mainly located around the Gulf islands. In addition, the statistical analyses allowed us to show that different choices of conservation features lead to statistically different scenarios. The presence of propagule supply patches forces MARXAN to select almost the same areas for protection, decreasing the differences among MARXAN solutions and, thus, the choices available for reserve area selection. The multivariate analyses applied here to marine spatial planning proved very helpful, allowing us to identify (i) how different scenario input data affect MARXAN and (ii) what features have to be taken into account in study areas characterized by peculiar biological and economic interests. Copyright © 2018 Elsevier Ltd. All rights reserved.

  1. Regression analysis by example

    CERN Document Server

    Chatterjee, Samprit

    2012-01-01

    Praise for the Fourth Edition: "This book is . . . an excellent source of examples for regression analysis. It has been and still is readily readable and understandable." - Journal of the American Statistical Association. Regression analysis is a conceptually simple method for investigating relationships among variables. Carrying out a successful application of regression analysis, however, requires a balance of theoretical results, empirical rules, and subjective judgment. Regression Analysis by Example, Fifth Edition has been expanded

  2. Assessing residential building values in Spain for risk analyses - application to the landslide hazard in the Autonomous Community of Valencia

    Science.gov (United States)

    Cantarino, I.; Torrijo, F. J.; Palencia, S.; Gielen, E.

    2014-11-01

    This paper proposes a method of valuing the stock of residential buildings in Spain as the first step in assessing possible damage caused to them by natural hazards. For the purposes of the study we had access to SIOSE (the Spanish Land Use and Cover Information System), a high-resolution land-use model, as well as to a report on the financial valuations of this type of building throughout Spain. Using dasymetric disaggregation processes and GIS techniques, we developed a geolocalized method of obtaining this information, which serves as the exposure variable in the general risk assessment formula. Then, by applying it over a hazard map, the risk value can easily be obtained. An example of its application is given in a case study that assesses landslide risk across the entire 23 200 km2 of the Valencia Autonomous Community (NUT2), the results of which are analysed by municipal areas (LAU2) for the years 2005 and 2009.

  3. Development of SAGE, A computer code for safety assessment analyses for Korean Low-Level Radioactive Waste Disposal

    International Nuclear Information System (INIS)

    Zhou, W.; Kozak, Matthew W.; Park, Joowan; Kim, Changlak; Kang, Chulhyung

    2002-01-01

    This paper describes a computer code, called SAGE (Safety Assessment Groundwater Evaluation), to be used for evaluation of the concept for low-level waste disposal in the Republic of Korea (ROK). The conceptual model in the code is focused on releases from a gradually degrading engineered barrier system to an underlying unsaturated zone, and thence to a saturated groundwater zone. Doses can be calculated for several biosphere systems, including drinking contaminated groundwater and subsequent contamination of foods, rivers, lakes, or the ocean by that groundwater. The flexibility of the code will permit both generic analyses in support of design and site development activities and straightforward modification to permit site-specific and design-specific safety assessments of a real facility as progress is made toward implementation of a disposal site. In addition, the code has been written to interface easily with more detailed codes for specific parts of the safety assessment. In this way, the code's capabilities can be significantly expanded as needed. The code can treat input parameters either deterministically or probabilistically. Parameter input is achieved through a user-friendly Graphical User Interface.

  4. Quantile Regression Methods

    DEFF Research Database (Denmark)

    Fitzenberger, Bernd; Wilke, Ralf Andreas

    2015-01-01

    Quantile regression is emerging as a popular statistical approach, which complements the estimation of conditional mean models. While the latter only focuses on one aspect of the conditional distribution of the dependent variable, the mean, quantile regression provides more detailed insights by modeling conditional quantiles. Quantile regression can therefore detect whether the partial effect of a regressor on the conditional quantiles is the same for all quantiles or differs across quantiles. Quantile regression can provide evidence for a statistical relationship between two variables even if the mean regression model does not. We provide a short informal introduction into the principle of quantile regression, which includes an illustrative application from empirical labor market research. This is followed by a brief sketch of the underlying statistical model for linear quantile regression based
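
    A short illustration of the point about quantile-specific effects, using statsmodels' quantile regression on simulated wage data whose dispersion grows with experience: the slope differs across quantiles even though a single OLS slope summarizes the mean. Variable names are hypothetical.

```python
# Simulated wage data with experience-dependent dispersion: quantile slopes
# differ across quantiles while OLS reports a single mean slope.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(8)
exper = rng.uniform(0, 40, 1000)
logwage = 2.0 + 0.03 * exper + (0.5 + 0.02 * exper) * rng.normal(size=1000)
df = pd.DataFrame({"logwage": logwage, "exper": exper})

for q in (0.1, 0.5, 0.9):
    fit = smf.quantreg("logwage ~ exper", df).fit(q=q)
    print(f"q = {q}: slope = {fit.params['exper']:.3f}")
print("OLS slope:", round(smf.ols("logwage ~ exper", df).fit().params["exper"], 3))
```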

  5. An innovative statistical approach for analysing non-continuous variables in environmental monitoring: assessing temporal trends of TBT pollution.

    Science.gov (United States)

    Santos, José António; Galante-Oliveira, Susana; Barroso, Carlos

    2011-03-01

    The current work presents an innovative statistical approach for modelling ordinal variables in environmental monitoring studies. An ordinal variable has values that can only be compared as "less", "equal" or "greater", and it is not possible to quantify the size of the difference between two particular values. The ordinal variable examined in this study is the vas deferens sequence (VDS), used in imposex (the superimposition of male sexual characters onto prosobranch females) field assessment programmes for monitoring tributyltin (TBT) pollution. The statistical methodology presented here is the ordered logit regression model. It assumes that the VDS is an ordinal variable whose values correspond to a process of imposex development that can be considered continuous in both the biological and statistical senses, and that can be described by a latent, non-observable continuous variable. This model was applied to the case study of Nucella lapillus imposex monitoring surveys conducted on the Portuguese coast between 2003 and 2008 to evaluate the temporal evolution of TBT pollution in this country. In order to produce more reliable conclusions, the proposed model includes covariates that may influence the imposex response besides TBT (e.g. shell size). The model also provides an analysis of the environmental risk associated with TBT pollution by estimating the probability of the occurrence of females with VDS ≥ 2 in each year, according to OSPAR criteria. We consider that the proposed application of this statistical methodology has great potential in environmental monitoring whenever variables that can only be assessed on an ordinal scale need to be modelled.
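
    A minimal sketch of an ordered logit for a VDS-like ordinal response, assuming simulated data and statsmodels' OrderedModel (available in recent statsmodels releases); the final line mimics an OSPAR-style summary, the estimated probability of VDS ≥ 2.

```python
# Ordered logit sketch for a VDS-like ordinal response, using statsmodels'
# OrderedModel. Stages, years and shell sizes are simulated placeholders,
# not the Portuguese survey data.
import numpy as np
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(9)
n = 400
year = rng.integers(2003, 2009, n)                 # survey year
size = rng.normal(25, 3, n)                        # shell size (mm)
latent = -0.4 * (year - 2003) + 0.1 * size + rng.logistic(size=n)
vds = np.digitize(latent, [-1, 1, 3, 5])           # ordinal stages 0..4

X = np.column_stack([year - 2003, size])           # no constant: cutpoints absorb it
fit = OrderedModel(vds, X, distr="logit").fit(method="bfgs", disp=0)
probs = fit.predict(X[:1])                         # P(stage = k) for one female
print(f"P(VDS >= 2) = {probs[0, 2:].sum():.2f}")   # OSPAR-style risk summary
```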

  6. Margin benefit assessment of the YGN 3 cycle 1 fxy error files for COLSS and CPC overall uncertainty analyses

    International Nuclear Information System (INIS)

    Yoon, Rae Young; In, Wang Kee; Auh, Geun Sun; Kim, Hee Cheol; Lee, Sang Keun

    1994-01-01

    Margin benefits are quantitatively assessed for the Yonggwang Unit 3 (YGN 3) Cycle 1 planar radial peaking factor (Fxy) error files for each time-in-life, i.e., BOC, IOC, MOC and EOC. The generic Fxy error file (FXYMEQO) is presently used for the Yonggwang Unit 3 Cycle 1 COLSS (Core Operating Limit Supervisory System) and CPC (Core Protection Calculator) Overall Uncertainty Analyses (OUA). However, because this file is more conservative than the plant/cycle-specific Fxy error files, the COLSS and CPC thermal margins (DNB-OPM) for the generic Fxy error file are smaller than those for the plant/cycle-specific Fxy error files. Therefore, the YGN 3 Cycle 1 Fxy error files were generated and analyzed by the modified codes for the Yonggwang plants. The YGN 3 Cycle 1 Fxy error files increased the thermal margin by about 1% for both COLSS and CPC.

  7. Landfill mining: Resource potential of Austrian landfills--Evaluation and quality assessment of recovered municipal solid waste by chemical analyses.

    Science.gov (United States)

    Wolfsberger, Tanja; Aldrian, Alexia; Sarc, Renato; Hermann, Robert; Höllen, Daniel; Budischowsky, Andreas; Zöscher, Andreas; Ragoßnig, Arne; Pomberger, Roland

    2015-11-01

    Since the need for raw materials in countries undergoing industrialisation (like China) is rising, the availability of metal and fossil fuel energy resources (like ores or coal) has changed in recent years. Landfill sites can contain considerable amounts of recyclables and energy-recoverable materials, therefore, landfill mining is an option for exploiting dumped secondary raw materials, saving primary sources. For the purposes of this article, two sanitary landfill sites have been chosen for obtaining actual data to determine the resource potential of Austrian landfills. To evaluate how pretreating waste before disposal affects the resource potential of landfills, the first landfill site has been selected because it has received untreated waste, whereas mechanically-biologically treated waste was dumped in the second. The scope of this investigation comprised: (1) waste characterisation by sorting analyses of recovered waste; and (2) chemical analyses of specific waste fractions for quality assessment regarding potential energy recovery by using it as solid recovered fuels. The content of eight heavy metals and the net calorific values were determined for the chemical characterisation tests. © The Author(s) 2015.

  8. A large-area, spatially continuous assessment of land cover map error and its impact on downstream analyses.

    Science.gov (United States)

    Estes, Lyndon; Chen, Peng; Debats, Stephanie; Evans, Tom; Ferreira, Stefanus; Kuemmerle, Tobias; Ragazzo, Gabrielle; Sheffield, Justin; Wolf, Adam; Wood, Eric; Caylor, Kelly

    2018-01-01

    Land cover maps increasingly underlie research into socioeconomic and environmental patterns and processes, including global change. It is known that map errors impact our understanding of these phenomena, but quantifying these impacts is difficult because many areas lack adequate reference data. We used a highly accurate, high-resolution map of South African cropland to assess (1) the magnitude of error in several current generation land cover maps, and (2) how these errors propagate in downstream studies. We first quantified pixel-wise errors in the cropland classes of four widely used land cover maps at resolutions ranging from 1 to 100 km, and then calculated errors in several representative "downstream" (map-based) analyses, including assessments of vegetative carbon stocks, evapotranspiration, crop production, and household food security. We also evaluated maps' spatial accuracy based on how precisely they could be used to locate specific landscape features. We found that cropland maps can have substantial biases and poor accuracy at all resolutions (e.g., at 1 km resolution, up to ∼45% underestimates of cropland (bias) and nearly 50% mean absolute error (MAE, describing accuracy); at 100 km, up to 15% underestimates and nearly 20% MAE). National-scale maps derived from higher-resolution imagery were most accurate, followed by multi-map fusion products. Constraining mapped values to match survey statistics may be effective at minimizing bias (provided the statistics are accurate). Errors in downstream analyses could be substantially amplified or muted, depending on the values ascribed to cropland-adjacent covers (e.g., with forest as adjacent cover, carbon map error was 200%-500% greater than in input cropland maps, but ∼40% less for sparse cover types). The average locational error was 6 km (600%). These findings provide deeper insight into the causes and potential consequences of land cover map error, and suggest several recommendations for land
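
    A small sketch, with invented numbers, of the two error measures quoted above for gridded cropland fractions: bias (signed mean error, negative values indicating underestimation) and mean absolute error (MAE, describing accuracy).

        import numpy as np

        reference = np.array([0.30, 0.55, 0.10, 0.80])  # "true" cropland fraction per cell
        mapped    = np.array([0.20, 0.40, 0.15, 0.60])  # fraction from the land cover map

        bias = np.mean(mapped - reference)              # negative => underestimate
        mae  = np.mean(np.abs(mapped - reference))      # accuracy, in fraction units

        print(f"bias = {bias:+.3f}, MAE = {mae:.3f}")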

  9. The role of multicollinearity in landslide susceptibility assessment by means of Binary Logistic Regression: comparison between VIF and AIC stepwise selection

    Science.gov (United States)

    Cama, Mariaelena; Cristi Nicu, Ionut; Conoscenti, Christian; Quénéhervé, Geraldine; Maerker, Michael

    2016-04-01

    Landslide susceptibility can be defined as the likelihood of a landslide occurring in a given area on the basis of local terrain conditions. In recent decades, much research has focused on its evaluation by means of stochastic approaches, under the assumption that 'the past is the key to the future': if a model is able to reproduce a known landslide spatial distribution, it should be able to predict the locations of future (i.e. unknown) slope failures. Among the various stochastic approaches, Binary Logistic Regression (BLR) is one of the most widely used because it expresses susceptibility in probabilistic terms and its results are easily interpretable from a geomorphological point of view. However, multicollinearity assessment is often neglected; its effect is that the coefficient estimates become unstable, with opposite signs, and are therefore difficult to interpret. It should thus be evaluated in every analysis in order to build a model whose results are geomorphologically sound. In this study, the effects of multicollinearity on the predictive performance and robustness of landslide susceptibility models are analyzed. In particular, multicollinearity is estimated by means of the Variance Inflation Factor (VIF), which is also used as a selection criterion for the independent variables (VIF stepwise selection) and compared to the more commonly used AIC stepwise selection. The robustness of the results is evaluated through 100 replicates of the dataset. The study area selected for this analysis is the Moldavian Plateau, where landslides are among the most frequent geomorphological processes. This area has an increasing trend of urbanization and very high cultural-heritage potential, being the place of discovery of the largest settlement belonging to the Cucuteni Culture in Eastern Europe (which led to the development of the great Cucuteni-Trypillia complex). Therefore, identifying the areas susceptible to
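
    A hedged sketch of VIF-based predictor screening of the kind described, assuming Python with pandas/statsmodels; the predictor names and data are hypothetical, and the threshold of 5 is one common convention, not the study's value.

        import numpy as np
        import pandas as pd
        import statsmodels.api as sm
        from statsmodels.stats.outliers_influence import variance_inflation_factor

        def vif_table(X: pd.DataFrame) -> pd.Series:
            """VIF of each predictor, computed with an intercept included."""
            Xc = sm.add_constant(X)
            return pd.Series(
                [variance_inflation_factor(Xc.values, i + 1) for i in range(X.shape[1])],
                index=X.columns,
            )

        def vif_select(X: pd.DataFrame, threshold: float = 5.0) -> pd.DataFrame:
            """Iteratively drop the predictor with the largest VIF above threshold."""
            X = X.copy()
            while True:
                vifs = vif_table(X)
                if vifs.max() <= threshold:
                    return X
                X = X.drop(columns=vifs.idxmax())

        rng = np.random.default_rng(1)
        slope = rng.normal(20, 5, 300)
        X = pd.DataFrame({
            "slope": slope,
            "slope_copy": slope + rng.normal(0, 0.5, 300),  # nearly collinear
            "twi": rng.normal(8, 2, 300),
        })
        print(vif_select(X).columns.tolist())  # the collinear duplicate is dropped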

  10. Understanding logistic regression analysis

    OpenAIRE

    Sperandei, Sandro

    2014-01-01

    Logistic regression is used to obtain odds ratios in the presence of more than one explanatory variable. The procedure is quite similar to multiple linear regression, with the exception that the response variable is binomial. The result is the impact of each variable on the odds ratio of the observed event of interest. The main advantage is to avoid confounding effects by analyzing the association of all variables together. In this article, we explain the logistic regression procedure using ex...

  11. Introduction to regression graphics

    CERN Document Server

    Cook, R Dennis

    2009-01-01

    Covers the use of dynamic and interactive computer graphics in linear regression analysis, focusing on analytical graphics. Features new techniques like plot rotation. The authors have written their own regression code in the Xlisp-Stat language, called R-code, which is a nearly complete system for linear regression analysis and can be utilized as the main computer program in a linear regression course. The accompanying disks, for both Macintosh and Windows computers, contain the R-code and Xlisp-Stat. An Instructor's Manual presenting detailed solutions to all the problems in the book is available.

  12. Alternative Methods of Regression

    CERN Document Server

    Birkes, David

    2011-01-01

    Of related interest: Nonlinear Regression Analysis and its Applications, Douglas M. Bates and Donald G. Watts. "...an extraordinary presentation of concepts and methods concerning the use and analysis of nonlinear regression models...highly recommend[ed]...for anyone needing to use and/or understand issues concerning the analysis of nonlinear regression models." --Technometrics. This book provides a balance between theory and practice supported by extensive displays of instructive geometrical constructs. Numerous in-depth case studies illustrate the use of nonlinear regression analysis--with all data sets

  13. Failure assessments of corroded pipelines with axial defects using stress-based criteria: Numerical studies and verification analyses

    International Nuclear Information System (INIS)

    Chiodo, Mario S.G.; Ruggieri, Claudio

    2009-01-01

    Conventional procedures used to assess the integrity of corroded piping systems with axial defects generally employ simplified failure criteria based upon a plastic collapse failure mechanism incorporating the tensile properties of the pipe material. These methods establish acceptance criteria for defects based on limited experimental data for low strength structural steels, which do not necessarily address specific requirements for the high grade steels currently used. For these cases, failure assessments may be overly conservative or show significant scatter in their predictions, which leads to unnecessary repair or replacement of in-service pipelines. Motivated by these observations, this study examines the applicability of a stress-based criterion based upon plastic instability analysis to predict the failure pressure of corroded pipelines with axial defects. A central focus is to gain additional insight into the effects of defect geometry and material properties on the attainment of a local limit load to support the development of stress-based burst strength criteria. The work provides an extensive body of results which lend further support to adopting failure criteria for corroded pipelines based upon ligament instability analyses. A verification study conducted on burst testing of large-diameter pipe specimens with different defect lengths shows the effectiveness of a stress-based criterion using local ligament instability in burst pressure predictions, even though the adopted burst criterion exhibits a potential dependence on defect geometry and possibly on the material's strain hardening capacity. Overall, the results presented here suggest that the use of stress-based criteria based upon plastic instability analysis of the defect ligament is a valid engineering tool for integrity assessments of pipelines with axial corroded defects.

  14. Failure assessments of corroded pipelines with axial defects using stress-based criteria: Numerical studies and verification analyses

    Energy Technology Data Exchange (ETDEWEB)

    Chiodo, Mario S.G. [Department of Naval Architecture and Ocean Engineering, University of Sao Paulo, Av. Prof. Mello Moraes, 2231 (PNV-EPUSP), Sao Paulo, SP 05508-030 (Brazil); Ruggieri, Claudio [Department of Naval Architecture and Ocean Engineering, University of Sao Paulo, Av. Prof. Mello Moraes, 2231 (PNV-EPUSP), Sao Paulo, SP 05508-030 (Brazil)], E-mail: claudio.ruggieri@poli.usp.br

    2009-02-15

    Conventional procedures used to assess the integrity of corroded piping systems with axial defects generally employ simplified failure criteria based upon a plastic collapse failure mechanism incorporating the tensile properties of the pipe material. These methods establish acceptance criteria for defects based on limited experimental data for low strength structural steels, which do not necessarily address specific requirements for the high grade steels currently used. For these cases, failure assessments may be overly conservative or show significant scatter in their predictions, which leads to unnecessary repair or replacement of in-service pipelines. Motivated by these observations, this study examines the applicability of a stress-based criterion based upon plastic instability analysis to predict the failure pressure of corroded pipelines with axial defects. A central focus is to gain additional insight into the effects of defect geometry and material properties on the attainment of a local limit load to support the development of stress-based burst strength criteria. The work provides an extensive body of results which lend further support to adopting failure criteria for corroded pipelines based upon ligament instability analyses. A verification study conducted on burst testing of large-diameter pipe specimens with different defect lengths shows the effectiveness of a stress-based criterion using local ligament instability in burst pressure predictions, even though the adopted burst criterion exhibits a potential dependence on defect geometry and possibly on the material's strain hardening capacity. Overall, the results presented here suggest that the use of stress-based criteria based upon plastic instability analysis of the defect ligament is a valid engineering tool for integrity assessments of pipelines with axial corroded defects.

  15. Locoregional control of non-small cell lung cancer in relation to automated early assessment of tumor regression on cone beam computed tomography

    DEFF Research Database (Denmark)

    Brink, Carsten; Bernchou, Uffe; Bertelsen, Anders

    2014-01-01

    was estimated on the basis of the first one third and two thirds of the scans. The concordance between estimated and actual relative volume at the end of radiation therapy was quantified by Pearson's correlation coefficient. On the basis of the estimated relative volume, the patients were stratified into 2...... for other clinical characteristics. RESULTS: Automatic measurement of the tumor regression from standard CBCT images was feasible. Pearson's correlation coefficient between manual and automatic measurement was 0.86 in a sample of 9 patients. Most patients experienced tumor volume regression, and this could...

  16. Assessing models of speciation under different biogeographic scenarios: an empirical study using multi-locus and RNA-seq analyses

    Science.gov (United States)

    Edwards, Taylor; Tollis, Marc; Hsieh, PingHsun; Gutenkunst, Ryan N.; Liu, Zhen; Kusumi, Kenro; Culver, Melanie; Murphy, Robert W.

    2016-01-01

    Evolutionary biology often seeks to decipher the drivers of speciation, and much debate persists over the relative importance of isolation and gene flow in the formation of new species. Genetic studies of closely related species can assess if gene flow was present during speciation, because signatures of past introgression often persist in the genome. We test hypotheses on which mechanisms of speciation drove diversity among three distinct lineages of desert tortoise in the genus Gopherus. These lineages offer a powerful system to study speciation, because different biogeographic patterns (physical vs. ecological segregation) are observed at opposing ends of their distributions. We use 82 samples collected from 38 sites, representing the entire species' distribution and generate sequence data for mtDNA and four nuclear loci. A multilocus phylogenetic analysis in *BEAST estimates the species tree. RNA-seq data yield 20,126 synonymous variants from 7665 contigs from two individuals of each of the three lineages. Analyses of these data using the demographic inference package ∂a∂i serve to test the null hypothesis of no gene flow during divergence. The best-fit demographic model for the three taxa is concordant with the *BEAST species tree, and the ∂a∂i analysis does not indicate gene flow among any of the three lineages during their divergence. These analyses suggest that divergence among the lineages occurred in the absence of gene flow and in this scenario the genetic signature of ecological isolation (parapatric model) cannot be differentiated from geographic isolation (allopatric model).

  17. Assessing the response of area burned to changing climate in western boreal North America using a Multivariate Adaptive Regression Splines (MARS) approach

    Science.gov (United States)

    Michael S. Balshi; A. David McGuire; Paul Duffy; Mike Flannigan; John Walsh; Jerry Melillo

    2009-01-01

    We developed temporally and spatially explicit relationships between air temperature and fuel moisture codes derived from the Canadian Fire Weather Index System to estimate annual area burned at 2.5° (latitude × longitude) resolution using a Multivariate Adaptive Regression Splines (MARS) approach across Alaska and Canada. Burned area was...
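
    A minimal MARS fit, assuming the third-party py-earth package (pip install sklearn-contrib-py-earth), not the authors' own implementation; the data are synthetic stand-ins for the temperature and fuel-moisture predictors.

        import numpy as np
        from pyearth import Earth

        rng = np.random.default_rng(2)
        X = rng.normal(size=(500, 2))   # e.g., air temperature, fuel moisture code
        y = np.maximum(0, 2.0 * X[:, 0]) + 0.5 * X[:, 1] ** 2 + rng.normal(0, 0.2, 500)

        model = Earth(max_degree=2)     # allow hinge-function interactions
        model.fit(X, y)
        print(model.summary())          # selected basis functions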

  18. Analysis of pollution sources in the Horna Nitra river basin using the GeoEnviron system as an instrument for groundwater and surface water pollution risk assessment

    International Nuclear Information System (INIS)

    Kutnik, P.

    2004-01-01

    In this presentation, the author analyses pollution sources in the Horna Nitra river basin using the GeoEnviron system as an instrument for groundwater and surface water pollution risk assessment.

  19. Correlation and simple linear regression.

    Science.gov (United States)

    Zou, Kelly H; Tuncali, Kemal; Silverman, Stuart G

    2003-06-01

    In this tutorial article, the concepts of correlation and regression are reviewed and demonstrated. The authors review and compare two correlation coefficients, the Pearson correlation coefficient and the Spearman rho, for measuring linear and nonlinear relationships between two continuous variables. In the case of measuring the linear relationship between a predictor and an outcome variable, simple linear regression analysis is conducted. These statistical concepts are illustrated by using a data set from published literature to assess a computed tomography-guided interventional technique. These statistical methods are important for exploring the relationships between variables and can be applied to many radiologic studies.
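
    A short illustration of the three quantities discussed (Pearson r, Spearman rho, simple linear regression), assuming Python with scipy; the paired measurements are invented, not the article's CT data.

        import numpy as np
        from scipy import stats

        x = np.array([1.2, 2.1, 2.9, 4.2, 5.1, 6.3])
        y = np.array([2.0, 2.7, 3.8, 4.9, 5.2, 7.1])

        r, p_r = stats.pearsonr(x, y)       # linear association
        rho, p_rho = stats.spearmanr(x, y)  # monotonic (rank) association
        fit = stats.linregress(x, y)        # simple linear regression y = a + b*x

        print(f"Pearson r={r:.3f}, Spearman rho={rho:.3f}")
        print(f"slope={fit.slope:.3f}, intercept={fit.intercept:.3f}")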

  20. An Assessment of Polynomial Regression Techniques for the Relative Radiometric Normalization (RRN) of High-Resolution Multi-Temporal Airborne Thermal Infrared (TIR) Imagery

    Directory of Open Access Journals (Sweden)

    Mir Mustafizur Rahman

    2014-11-01

    Thermal Infrared (TIR) remote sensing images of urban environments are increasingly available from airborne and satellite platforms. However, limited access to high-spatial-resolution (H-res: ~1 m) TIR satellite images requires the use of TIR airborne sensors for mapping large complex urban surfaces, especially at micro-scales. A critical limitation of such H-res mapping is the need to acquire a large scene composed of multiple flight lines and mosaic them together. This results in the same scene components (e.g., roads, buildings, green space and water) exhibiting different temperatures in different flight lines. To mitigate these effects, linear relative radiometric normalization (RRN) techniques are often applied. However, the Earth's surface is composed of features whose thermal behaviour is characterized by complexity and non-linearity. Therefore, we hypothesize that non-linear RRN techniques should demonstrate increased radiometric agreement over similar linear techniques. To test this hypothesis, this paper evaluates four (linear and non-linear) RRN techniques, including: (i) histogram matching (HM); (ii) pseudo-invariant feature-based polynomial regression (PIF_Poly); (iii) no-change stratified random sample-based linear regression (NCSRS_Lin); and (iv) no-change stratified random sample-based polynomial regression (NCSRS_Poly); two of which ((ii) and (iv)) are newly proposed non-linear techniques. When applied over two adjacent flight lines (~70 km²) of TABI-1800 airborne data, visual and statistical results show that both new non-linear techniques improved radiometric agreement over the previously evaluated linear techniques, with the new fully-automated method, NCSRS-based polynomial regression, providing the highest improvement in radiometric agreement between the master and the slave images, at ~56%. This is ~5% higher than the best previously evaluated linear technique (NCSRS-based linear regression).
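
    A hedged sketch of sample-based polynomial RRN: fit a polynomial mapping slave-image temperatures to master-image temperatures over pixels assumed unchanged between flight lines. The arrays and the second-order fit are illustrative, not the TABI-1800 processing chain.

        import numpy as np

        rng = np.random.default_rng(3)
        master = rng.uniform(10, 35, 1000)    # master-line temperatures (deg C)
        slave = 0.8 * master + 0.004 * master**2 + 1.5 + rng.normal(0, 0.3, 1000)

        coeffs = np.polyfit(slave, master, deg=2)   # 2nd-order normalization
        slave_normalized = np.polyval(coeffs, slave)

        print("residual MAE:", np.mean(np.abs(slave_normalized - master)))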

  1. Assessing diet quality of African ungulates from faecal analyses: the effect of forage quality, intake and herbivore species

    Directory of Open Access Journals (Sweden)

    J.M. Wrench

    1997-08-01

    Faecal phosphorus and nitrogen can be used as indicators of the nutritive content of the veld. Dietary P concentrations can be predicted with reasonable accuracy from faecal P concentrations in caged impala rams using a simple linear regression model, Y = 0.393X (r² = 0.97). This regression holds whether impala are grazing or browsing, as well as for high and low levels of intake. The regression equations used in the prediction of dietary P in zebra, blue wildebeest and cattle did not differ significantly from this simple regression, and a combined regression equation could be formulated. A faecal P concentration of less than 2 g P/kg OM would appear to indicate a P deficiency in most species. The prediction of dietary N is influenced by the intake of phenolic compounds, and different regression equations exist for grazers and browsers. For prediction of dietary N concentrations, both the concentration of N and of P in the faeces should be taken into account. This multiple regression equation is applicable to grazing impala at all levels of intake. For impala utilising browse, a regression model with faecal Acid Detergent Insoluble Nitrogen (ADIN) and Acid Detergent Lignin (ADL) should be used to predict dietary N concentration. For grazers, a faecal N concentration of less than 14 g/kg DM would indicate a deficiency. Dietary digestibility can be predicted accurately in some species using faecal N, P and ADL concentrations. However, more work needs to be done to quantify their effects.
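
    A minimal sketch of the through-origin calibration quoted above (dietary P predicted from faecal P, Y = 0.393X); the five data pairs are invented for illustration.

        import numpy as np

        faecal_p = np.array([1.5, 2.0, 2.8, 3.5, 4.1])    # g P/kg OM
        dietary_p = np.array([0.6, 0.8, 1.1, 1.4, 1.6])   # g P/kg DM

        # least-squares slope with no intercept: b = sum(x*y) / sum(x^2)
        b = np.sum(faecal_p * dietary_p) / np.sum(faecal_p ** 2)
        print(f"dietary P = {b:.3f} * faecal P")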

   2. Regression equations to evaluate the energy values of wheat grain and its by-products for broiler chickens from chemical analyses

    Directory of Open Access Journals (Sweden)

    F.M.O. Borges

    2003-12-01

    One experiment was run with broiler chickens to obtain prediction equations for metabolizable energy (ME) based on feedstuff chemical analyses and the determined ME of wheat grain and its by-products, using four different methodologies. Seven wheat grain by-products were used in five treatments: wheat grain, wheat germ, white wheat flour, dark wheat flour, wheat bran for human use, wheat bran for animal use and rough wheat bran. Based on chemical analyses of crude fiber (CF), ether extract (EE), crude protein (CP), ash (AS) and starch (ST) of the feeds and the determined values of apparent energy (MEA), true energy (MEV), apparent energy corrected by nitrogen balance (MEAn) and true energy corrected by nitrogen balance (MEVn) in the five treatments, prediction equations were obtained using the stepwise procedure. CF showed the best relationship with metabolizable energy values; however, this variable alone was not enough for a good estimate of the energy values (R² below 0.80). When EE and CP were included in the equations, R² increased to 0.90 or higher in most estimates. When the equations were calculated with all treatments, the equations for MEA were less precise and R² decreased. When ME data of the traditional or force-feeding methods were used separately, the precision of the equations increased (R² higher than 0.85). For MEV and MEVn values, the best multiple linear equations included CF, EE and CP (R² > 0.90), independently of using all experimental data or separating by methodology. The estimates of MEVn values showed high precision, and the linear coefficients (a) of the equations were similar for all treatments and methodologies, indicating little influence of the different methodologies on this parameter. NDF was not a better predictor of ME than CF.

  3. A Simulation Investigation of Principal Component Regression.

    Science.gov (United States)

    Allen, David E.

    Regression analysis is one of the more common analytic tools used by researchers. However, multicollinearity between the predictor variables can cause problems in using the results of regression analyses. Problems associated with multicollinearity include entanglement of relative influences of variables due to reduced precision of estimation,…
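
    A minimal principal component regression sketch, assuming Python with scikit-learn, illustrating how PCR side-steps collinear predictors; the data are synthetic, not the simulation design of the paper.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.linear_model import LinearRegression
        from sklearn.pipeline import make_pipeline

        rng = np.random.default_rng(4)
        x1 = rng.normal(size=200)
        # two nearly collinear predictors plus one independent predictor
        X = np.column_stack([x1, x1 + rng.normal(0, 0.05, 200), rng.normal(size=200)])
        y = 2 * x1 + 0.5 * X[:, 2] + rng.normal(0, 0.1, 200)

        pcr = make_pipeline(PCA(n_components=2), LinearRegression())
        pcr.fit(X, y)
        print("R^2:", pcr.score(X, y))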

  4. Boosted beta regression.

    Directory of Open Access Journals (Sweden)

    Matthias Schmid

    Regression analysis with a bounded outcome is a common problem in applied statistics. Typical examples include regression models for percentage outcomes and the analysis of ratings that are measured on a bounded scale. In this paper, we consider beta regression, which is a generalization of logit models to situations where the response is continuous on the interval (0,1). Consequently, beta regression is a convenient tool for analyzing percentage responses. The classical approach to fitting a beta regression model is to use maximum likelihood estimation with subsequent AIC-based variable selection. As an alternative to this established - yet unstable - approach, we propose a new estimation technique called boosted beta regression. With boosted beta regression, estimation and variable selection can be carried out simultaneously in a highly efficient way. Additionally, both the mean and the variance of a percentage response can be modeled using flexible nonlinear covariate effects. As a consequence, the new method accounts for common problems such as overdispersion and non-binomial variance structures.
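
    A hedged, minimal maximum-likelihood beta regression (logit link for the mean, constant precision phi) in Python: this is the plain model that the boosted variant generalizes, not the authors' boosting algorithm. The data are synthetic.

        import numpy as np
        from scipy.optimize import minimize
        from scipy.special import betaln, expit

        rng = np.random.default_rng(5)
        x = rng.normal(size=300)
        mu_true = expit(-0.5 + 1.2 * x)
        phi_true = 20.0
        y = rng.beta(mu_true * phi_true, (1 - mu_true) * phi_true)

        def negloglik(params):
            b0, b1, log_phi = params
            mu = expit(b0 + b1 * x)
            phi = np.exp(log_phi)
            a, b = mu * phi, (1 - mu) * phi
            # negative sum of beta log-densities
            return -np.sum((a - 1) * np.log(y) + (b - 1) * np.log(1 - y) - betaln(a, b))

        res = minimize(negloglik, x0=[0.0, 0.0, 1.0], method="BFGS")
        print("estimates (b0, b1, phi):", res.x[0], res.x[1], np.exp(res.x[2]))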

  5. North Atlantic Aerosol Properties for Radiative Impact Assessments. Derived from Column Closure Analyses in TARFOX and ACE-2

    Science.gov (United States)

    Russell, Philip A.; Bergstrom, Robert A.; Schmid, Beat; Livingston, John M.

    2000-01-01

    Aerosol effects on atmospheric radiative fluxes provide a forcing function that can change the climate in potentially significant ways. This aerosol radiative forcing is a major source of uncertainty in understanding the climate change of the past century and predicting future climate. To help reduce this uncertainty, the 1996 Tropospheric Aerosol Radiative Forcing Observational Experiment (TARFOX) and the 1997 Aerosol Characterization Experiment (ACE-2) measured the properties and radiative effects of aerosols over the Atlantic Ocean. Both experiments used remote and in situ measurements from aircraft and the surface, coordinated with overpasses by a variety of satellite radiometers. TARFOX focused on the urban-industrial haze plume flowing from the United States over the western Atlantic, whereas ACE-2 studied aerosols over the eastern Atlantic from both Europe and Africa. These aerosols often have a marked impact on satellite-measured radiances. However, accurate derivation of flux changes, or radiative forcing, from the satellite measured radiances or retrieved aerosol optical depths (AODs) remains a difficult challenge. Here we summarize key initial results from TARFOX and ACE-2, with a focus on closure analyses that yield aerosol microphysical models for use in improved assessments of flux changes. We show how one such model gives computed radiative flux sensitivities (dF/dAOD) that agree with values measured in TARFOX and preliminary values computed for the polluted marine boundary layer in ACE-2. A companion paper uses the model to compute aerosol-induced flux changes over the North Atlantic from AVHRR-derived AOD fields.

  6. A meta-analytic review of life cycle assessment and flow analyses studies of palm oil biodiesel.

    Science.gov (United States)

    Manik, Yosef; Halog, Anthony

    2013-01-01

    This work reviews and performs a meta-analysis of recent life cycle assessment and flow analysis studies of palm oil biodiesel. The best available data and information are extracted, summarized, and discussed. Most studies found palm oil biodiesel would produce a positive energy balance, with an energy ratio between 2.27 and 4.81 and a net energy production of 112 GJ ha⁻¹ y⁻¹. With the exception of a few studies, most conclude that palm oil biodiesel is a net emitter of greenhouse gases (GHG). The origin of the oil palm plantation (planted area) is the foremost determinant of GHG emissions and C payback time (CPBT). Converting peatland forest results in GHG emissions of up to 60 tons CO₂ equivalent (eq) ha⁻¹ y⁻¹, leading to 420 years of CPBT. In contrast, converting degraded land or grassland for plantation can positively offset the system to become a net sequester of 5 tons CO₂ eq ha⁻¹ y⁻¹. Few studies have discussed cradle-to-grave environmental impacts such as acidification, eutrophication, toxicity, and biodiversity, which opens opportunities for further studies. Copyright © 2012 SETAC.

  7. An assessment of the wind re-analyses in the modelling of an extreme sea state in the Black Sea

    Science.gov (United States)

    Akpinar, Adem; Ponce de León, S.

    2016-03-01

    This study aims at an assessment of wind re-analyses for modelling storms in the Black Sea. A wind-wave modelling system (Simulating WAves Nearshore, SWAN) is applied to the Black Sea basin and calibrated with buoy data for three recent re-analysis wind sources, namely the European Centre for Medium-Range Weather Forecasts Reanalysis-Interim (ERA-Interim), Climate Forecast System Reanalysis (CFSR), and Modern Era Retrospective Analysis for Research and Applications (MERRA), during an extreme wave condition that occurred in the north-eastern part of the Black Sea. The SWAN model simulations are carried out with default and tuned settings for the deep-water source terms, especially whitecapping. The performance of the best model configurations based on calibration with buoy data is discussed using data from the JASON2, TOPEX-Poseidon, ENVISAT and GFO satellites. The SWAN model calibration shows that the best configuration is obtained with the Janssen and Komen formulations, with a whitecapping coefficient (Cds) equal to 1.8e-5, for wave generation by wind and whitecapping dissipation using ERA-Interim. In addition, from the collocated SWAN results against the satellite records, the best configuration is determined to be SWAN using the CFSR winds. The numerical results thus show that the accuracy of a wave forecast depends on the quality of the wind field and on the ability of the SWAN model to simulate waves under extreme wind conditions in fetch-limited wave conditions.

  8. The effects of ropivacaine hydrochloride on platelet function: an assessment using the platelet function analyser (PFA-100).

    LENUS (Irish Health Repository)

    Porter, J

    2012-02-03

    Amide local anaesthetics impair blood clotting in a concentration-dependent manner by inhibition of platelet function and enhanced fibrinolysis. We hypothesised that the presence of ropivacaine in the epidural space could decrease the efficacy of an epidural blood patch, as this technique requires that the injected blood can clot in order to be effective. Ropivacaine is an aminoamide local anaesthetic used increasingly for epidural analgesia during labour. The concentration of local anaesthetic in blood achieved in the epidural space during the performance of an epidural blood patch is likely to be the greatest which occurs (intentionally) in any clinical setting. This study was undertaken to investigate whether concentrations of ropivacaine in blood which could occur (i) clinically in the epidural space and (ii) in plasma during an epidural infusion of ropivacaine alter platelet function. A platelet function analyser (Dade PFA-100, Miami) was employed to assess the effects of ropivacaine-treated blood on platelet function. The greater concentrations of ropivacaine studied (3.75 and 1.88 mg ml⁻¹), which correspond to those which could occur in the epidural space, produced significant inhibition of platelet aggregation. We conclude that the presence of ropivacaine in the epidural space may decrease the efficacy of an early or prophylactic epidural blood patch.

  9. Understanding logistic regression analysis.

    Science.gov (United States)

    Sperandei, Sandro

    2014-01-01

    Logistic regression is used to obtain odds ratios in the presence of more than one explanatory variable. The procedure is quite similar to multiple linear regression, with the exception that the response variable is binomial. The result is the impact of each variable on the odds ratio of the observed event of interest. The main advantage is to avoid confounding effects by analyzing the association of all variables together. In this article, we explain the logistic regression procedure using examples to make it as simple as possible. After definition of the technique, the basic interpretation of the results is highlighted and then some special issues are discussed.
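
    A minimal logistic regression with adjusted odds ratios in the spirit of the tutorial described above, assuming Python with statsmodels; the covariates (age, smoking) and data are invented.

        import numpy as np
        import pandas as pd
        import statsmodels.api as sm

        rng = np.random.default_rng(6)
        n = 500
        age = rng.normal(50, 10, n)
        smoker = rng.integers(0, 2, n)
        logit_p = -6 + 0.08 * age + 0.9 * smoker
        y = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

        X = sm.add_constant(pd.DataFrame({"age": age, "smoker": smoker}))
        fit = sm.Logit(y, X).fit(disp=False)

        odds_ratios = np.exp(fit.params)    # adjusted odds ratios
        conf_int = np.exp(fit.conf_int())   # 95% CIs on the OR scale
        print(pd.concat([odds_ratios, conf_int], axis=1))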

  10. Applied logistic regression

    CERN Document Server

    Hosmer, David W; Sturdivant, Rodney X

    2013-01-01

     A new edition of the definitive guide to logistic regression modeling for health science and other applications This thoroughly expanded Third Edition provides an easily accessible introduction to the logistic regression (LR) model and highlights the power of this model by examining the relationship between a dichotomous outcome and a set of covariables. Applied Logistic Regression, Third Edition emphasizes applications in the health sciences and handpicks topics that best suit the use of modern statistical software. The book provides readers with state-of-

  11. Assessment of regression models for adjustment of iron status biomarkers for inflammation in children with moderate acute malnutrition in Burkina Faso

    DEFF Research Database (Denmark)

    Cichon, Bernardette; Ritz, Christian; Fabiansen, Christian

    2017-01-01

    BACKGROUND: Biomarkers of iron status are affected by inflammation. In order to interpret them in individuals with inflammation, the use of correction factors (CFs) has been proposed. OBJECTIVE: The objective of this study was to investigate the use of regression models as an alternative to the CF approach. METHODS: Morbidity data were collected during clinical examinations with morbidity recalls in a cross-sectional study in children aged 6-23 mo with moderate acute malnutrition. C-reactive protein (CRP), α1-acid glycoprotein (AGP), serum ferritin (SF), and soluble transferrin receptor (sTfR) were......TfR with the use of the best-performing model led to a 17-percentage-point increase in iron deficiency. CONCLUSION: Regression analysis is an alternative to adjust SF and may be preferable in research settings, because it can take morbidity and severity

  12. Quantitative analyses at baseline and interim PET evaluation for response assessment and outcome definition in patients with malignant pleural mesothelioma

    Energy Technology Data Exchange (ETDEWEB)

    Lopci, Egesta; Chiti, Arturo [Humanitas Research Hospital, Nuclear Medicine Department, Rozzano, Milan (Italy); Zucali, Paolo Andrea; Perrino, Matteo; Gianoncelli, Letizia; Lorenzi, Elena; Gemelli, Maria; Santoro, Armando [Humanitas Research Hospital, Oncology, Rozzano (Italy); Ceresoli, Giovanni Luca [Humanitas Gavazzeni, Oncology, Bergamo (Italy); Giordano, Laura [Humanitas Research Hospital, Biostatistics, Rozzano (Italy)

    2015-04-01

    Quantitative analyses on FDG PET for response assessment are increasingly used in clinical studies, particularly with respect to tumours in which radiological assessment is challenging and complete metabolic response is rarely achieved after treatment. A typical example is malignant pleural mesothelioma (MPM), an aggressive tumour originating from mesothelial cells of the pleura. We present our results concerning the use of semiquantitative and quantitative parameters, evaluated at the baseline and interim PET examinations, for the prediction of treatment response and disease outcome in patients with MPM. We retrospectively analysed data derived from 131 patients (88 men, 43 women; mean age 66 years) with MPM who were referred to our institution for treatment between May 2004 and July 2013. Patients were investigated using FDG PET at baseline and after two cycles of pemetrexed-based chemotherapy. Responses were determined using modified RECIST criteria based on the best CT response after treatment. Disease control rate, progression-free survival (PFS) and overall survival (OS) were calculated for the whole population and were correlated with semiquantitative and quantitative parameters evaluated at the baseline and interim PET examinations; these included SUVmax, total lesion glycolysis (TLG), percentage change in SUVmax (ΔSUVmax) and percentage change in TLG (ΔTLG). Disease control was achieved in 84.7 % of the patients, and median PFS and OS for the entire cohort were 7.2 and 14.3 months, respectively. The log-rank test showed a statistically significant difference in PFS between patients with radiological progression and those with partial response (PR) or stable disease (SD) (1.8 vs. 8.6 months, p < 0.001). Baseline SUVmax and TLG showed a statistically significant correlation with PFS and OS (p < 0.001). In the entire population, both ΔSUVmax and ΔTLG were correlated with disease control based on best CT response (p < 0

  13. Logistic regression for dichotomized counts.

    Science.gov (United States)

    Preisser, John S; Das, Kalyan; Benecha, Habtamu; Stamm, John W

    2016-12-01

    Sometimes there is interest in a dichotomized outcome indicating whether a count variable is positive or zero. Under this scenario, the application of ordinary logistic regression may result in efficiency loss, which is quantifiable under an assumed model for the counts. In such situations, a shared-parameter hurdle model is investigated for more efficient estimation of regression parameters relating to overall effects of covariates on the dichotomous outcome, while handling count data with many zeroes. One model part provides a logistic regression containing marginal log odds ratio effects of primary interest, while an ancillary model part describes the mean count of a Poisson or negative binomial process in terms of nuisance regression parameters. Asymptotic efficiency of the logistic model parameter estimators of the two-part models is evaluated with respect to ordinary logistic regression. Simulations are used to assess the properties of the models with respect to power and Type I error, the latter investigated under both misspecified and correctly specified models. The methods are applied to data from a randomized clinical trial of three toothpaste formulations to prevent incident dental caries in a large population of Scottish schoolchildren. © The Author(s) 2014.
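
    A small simulation of the setting described, assuming Python with statsmodels: Poisson counts with many zeroes are dichotomized to occurred / did not occur and analysed with ordinary logistic regression. This is purely illustrative and does not implement the shared-parameter hurdle model itself.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(7)
        n = 1000
        x = rng.normal(size=n)
        counts = rng.poisson(np.exp(-1.0 + 0.6 * x))   # many zero counts
        any_event = (counts > 0).astype(int)           # dichotomized outcome

        X = sm.add_constant(x)
        logit_fit = sm.Logit(any_event, X).fit(disp=False)
        print(np.exp(logit_fit.params[1]))             # odds ratio for x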

  14. Multilingual speaker age recognition: regression analyses on the Lwazi corpus

    CSIR Research Space (South Africa)

    Feld, M

    2009-12-01

    Multilinguality represents an area of significant opportunities for automatic speech-processing systems: whereas multilingual societies are commonplace, the majority of speech-processing systems are developed with a single language in mind. As a step...

  15. Understanding poisson regression.

    Science.gov (United States)

    Hayat, Matthew J; Higgins, Melinda

    2014-04-01

    Nurse investigators often collect study data in the form of counts. Traditional methods of data analysis have historically approached analysis of count data either as if the count data were continuous and normally distributed or with dichotomization of the counts into the categories of occurred or did not occur. These outdated methods for analyzing count data have been replaced with more appropriate statistical methods that make use of the Poisson probability distribution, which is useful for analyzing count data. The purpose of this article is to provide an overview of the Poisson distribution and its use in Poisson regression. Assumption violations for the standard Poisson regression model are addressed with alternative approaches, including addition of an overdispersion parameter or negative binomial regression. An illustrative example is presented with an application from the ENSPIRE study, and regression modeling of comorbidity data is included for illustrative purposes. Copyright 2014, SLACK Incorporated.
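
    Minimal Poisson and negative binomial fits mirroring the overdispersion discussion above, assuming Python with statsmodels; the count data are simulated, not the ENSPIRE data.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(8)
        n_obs = 800
        x = rng.normal(size=n_obs)
        mu = np.exp(0.3 + 0.5 * x)
        counts = rng.negative_binomial(n=2, p=2 / (2 + mu))  # overdispersed counts

        X = sm.add_constant(x)
        poisson_fit = sm.GLM(counts, X, family=sm.families.Poisson()).fit()
        nb_fit = sm.GLM(counts, X, family=sm.families.NegativeBinomial(alpha=0.5)).fit()

        # deviance/df well above 1 flags overdispersion under the Poisson model
        print("Poisson deviance/df:", poisson_fit.deviance / poisson_fit.df_resid)
        print("NegBin  deviance/df:", nb_fit.deviance / nb_fit.df_resid)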

  16. Local systems, global impacts. Using life cycle assessment to analyse the potential and constraints of industrial symbioses

    Energy Technology Data Exchange (ETDEWEB)

    Sokka, L.

    2011-08-15

    Human activities extract and displace different substances and materials from the Earth's crust, thus causing various environmental problems, such as climate change, acidification and eutrophication. As problems have become more complicated, more holistic measures that consider the origins and sources of pollutants have been called for. Industrial ecology is a field of science that forms a comprehensive framework for studying the interactions between the modern technological society and the environment. Industrial ecology considers humans and their technologies to be part of the natural environment, not separate from it. Industrial operations form natural systems that must also function as such within the constraints set by the biosphere. Industrial symbiosis (IS) is a central concept of industrial ecology. Industrial symbiosis studies look at the physical flows of materials and energy in local industrial systems. In an ideal IS, waste material and energy are exchanged by the actors of the system, thereby reducing the consumption of virgin material and energy inputs and the generation of waste and emissions. Companies are seen as part of the chains of suppliers and consumers that resemble those of natural ecosystems. The aim of this study was to analyse the environmental performance of an industrial symbiosis based on pulp and paper production, taking into account life cycle impacts as well. Life Cycle Assessment (LCA) is a tool for quantitatively and systematically evaluating the environmental aspects of a product, technology or service throughout its whole life cycle. Moreover, the Natural Step Sustainability Principles formed a conceptual framework for assessing the environmental performance of the case study symbiosis (Paper 1). The environmental performance of the case study symbiosis was compared to four counterfactual reference scenarios in which the actors of the symbiosis operated on their own. The research methods used were process-based life cycle

  17. Simple Linear Regression and Reflectance Sensitivity Analysis Used to Determine the Optimum Wavelength for Nondestructive Assessment of Chlorophyll in Fresh Leaves Using Spectral Reflectance

    Science.gov (United States)

    The accuracy of nondestructive optical methods for chlorophyll (Chl) assessment based on leaf spectral characteristics depends on the wavelengths used for Chl assessment. Using spectroscopy, the optimum wavelengths for Chl assessment (OWChl) were determined for almond, poplar, and apple trees grown ...

  18. Influence of soft tissue in the assessment of the primary fixation of acetabular cup implants using impact analyses.

    Science.gov (United States)

    Bosc, Romain; Tijou, Antoine; Rosi, Giuseppe; Nguyen, Vu-Hieu; Meningaud, Jean-Paul; Hernigou, Philippe; Flouzat-Lachaniette, Charles-Henri; Haiat, Guillaume

    2018-06-01

    The acetabular cup (AC) implant primary stability is an important determinant for the success of cementless hip surgery, but it remains difficult to assess the AC implant fixation in the clinic. A method based on the analysis of the impact produced by an instrumented hammer on the ancillary has been developed by our group (Michel et al., 2016a). However, the soft tissue thickness present around the acetabulum may affect the impact response, which may hamper the robustness of the method. The aim of this study is to evaluate the influence of the soft tissue thickness (STT) on the acetabular cup implant primary fixation evaluation using impact analyses. To do so, different AC implants were inserted in five bovine bone samples. For each sample, different stability conditions were obtained by changing the cavity diameter. For each configuration, the AC implant was impacted 25 times with 10 and 30 mm of soft tissues positioned underneath the sample. The averaged indicator I_m was determined based on the amplitude of the signal for each configuration and each STT, and the pull-out force was measured. The results show that the resonance frequency of the system increases when the value of the soft tissue thickness decreases. Moreover, an ANOVA analysis shows that there was no significant effect of the value of soft tissue thickness on the values of the indicator I_m (F = 2.33; p-value = 0.13). This study shows that soft tissue thickness does not appear to alter the prediction of the acetabular cup implant primary fixation obtained using the impact analysis approach, opening the path towards future clinical trials. Copyright © 2018 Elsevier Ltd. All rights reserved.

  19. Assessment of S(α, β) libraries for criticality safety evaluations of wet storage pools by refined trend analyses

    International Nuclear Information System (INIS)

    Kolbe, E.; Vasiliev, A.; Ferroukhi, H.

    2009-01-01

    In a recent criticality safety evaluation (CSE) of a commercial wet storage pool applying MCNPX-2.5.0 in combination with the ENDF/B-VII.0 and JEFF-3.1 continuous-energy cross section libraries, the maximum permissible initial fuel-enrichment limit for water-reflected configurations was found to be dependent upon the applied neutron cross section library. More detailed investigations indicated that the difference is mainly caused by different sub-libraries for thermal neutron scattering based on parameterizations of the S(α, β) scattering matrix. Hence, an analysis of trends was performed with respect to the low-energy neutron flux in order to assess the S(α, β) data sets. When the trend analysis was based on the full set of 149 benchmarks employed for the validation, no significant trends were found; by analyzing a selected subset of benchmarks, however, clear trends with respect to the low-energy neutron flux could be detected. The results presented in this paper demonstrate the sensitivity of specific configurations to the parameterizations of the S(α, β) scattering matrix and thus may help to improve CSEs of wet storage pools. Finally, in addition to the low-energy neutron flux, we also refined the trend analyses with respect to other key (spectrum-related) parameters by performing them with various selected subsets of the full suite of 149 benchmarks. The corresponding outcomes using MCNPX 2.5.0 in combination with the ENDF/B-VII.0, ENDF/B-VI.8, JEFF-3.1, JEF-2.2, and JENDL-3.3 neutron cross section libraries are presented and discussed. (authors)

  20. Vector regression introduced

    Directory of Open Access Journals (Sweden)

    Mok Tik

    2014-06-01

    This study formulates regression of vector data that will enable statistical analysis of various geodetic phenomena such as polar motion, ocean currents, typhoon/hurricane tracking, crustal deformations, and precursory earthquake signals. The observed vector variable of an event (the dependent vector variable) is expressed as a function of a number of hypothesized phenomena realized also as vector variables (the independent vector variables) and/or scalar variables that are likely to impact the dependent vector variable. The proposed representation has the unique property of solving the coefficients of the independent vector variables (explanatory variables) also as vectors, hence it supersedes multivariate multiple regression models, in which the unknown coefficients are scalar quantities. For the solution, complex numbers are used to represent vector information, and the method of least squares is deployed to estimate the vector model parameters after transforming the complex vector regression model into a real vector regression model through isomorphism. Various operational statistics for testing the predictive significance of the estimated vector parameter coefficients are also derived. A simple numerical example demonstrates the use of the proposed vector regression analysis in modeling typhoon paths.
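
    A hedged sketch of the complex-number representation described above: 2-D vector observations encoded as complex values and regressed on a complex explanatory variable by ordinary least squares. Here the system is solved directly in complex arithmetic rather than through the paper's real-isomorphism step, and the data are invented.

        import numpy as np

        rng = np.random.default_rng(9)
        n = 100
        # independent vector variable (e.g., a wind vector), as complex numbers
        u = rng.normal(size=n) + 1j * rng.normal(size=n)
        # true vector coefficient: rotation plus scaling in one complex number
        beta_true = 0.8 * np.exp(1j * np.pi / 6)
        noise = 0.1 * (rng.normal(size=n) + 1j * rng.normal(size=n))
        v = beta_true * u + noise                    # dependent vector variable

        A = u.reshape(-1, 1)                         # complex design "matrix"
        beta_hat, *_ = np.linalg.lstsq(A, v, rcond=None)
        print("estimated magnitude:", abs(beta_hat[0]))
        print("estimated rotation (deg):", np.degrees(np.angle(beta_hat[0])))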

  1. Multicollinearity and Regression Analysis

    Science.gov (United States)

    Daoud, Jamal I.

    2017-12-01

    In regression analysis it is expected to have correlation between the response and the predictor(s), but correlation among the predictors themselves is undesirable. The number of predictors included in the regression model depends on many factors, among which are historical data, experience, etc. In the end, the selection of the most important predictors is ultimately a subjective decision of the researcher. Multicollinearity is a phenomenon in which two or more predictors are correlated; when this happens, the standard errors of the coefficients increase [8]. Increased standard errors mean that the coefficients for some or all independent variables may be found not to be significantly different from 0. In other words, by overinflating the standard errors, multicollinearity makes some variables statistically insignificant when they should be significant. In this paper we focus on multicollinearity, its causes and its consequences for the reliability of the regression model.
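
    A small simulation of the effect described, assuming Python with statsmodels: with two highly correlated predictors, the coefficient standard errors inflate relative to the uncorrelated case, so t-tests lose power. The data are synthetic.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(10)
        n = 200
        x1 = rng.normal(size=n)
        x2_indep = rng.normal(size=n)           # uncorrelated alternative
        x2_coll = x1 + rng.normal(0, 0.05, n)   # nearly collinear with x1
        y = 1.0 + 0.5 * x1 + 0.5 * x2_coll + rng.normal(size=n)

        for label, x2 in [("independent", x2_indep), ("collinear", x2_coll)]:
            X = sm.add_constant(np.column_stack([x1, x2]))
            fit = sm.OLS(y, X).fit()
            print(label, "SE(b1) =", round(fit.bse[1], 3))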

  2. Minimax Regression Quantiles

    DEFF Research Database (Denmark)

    Bache, Stefan Holst

    A new and alternative quantile regression estimator is developed, and it is shown that the estimator is root-n-consistent and asymptotically normal. The estimator is based on a minimax 'deviance function' and has asymptotically equivalent properties to the usual quantile regression estimator. It is, however, a different and therefore new estimator. It allows for both linear and nonlinear model specifications. A simple algorithm for computing the estimates is proposed. It seems to work quite well in practice, but whether it has theoretical justification is still an open question.

  3. riskRegression

    DEFF Research Database (Denmark)

    Ozenne, Brice; Sørensen, Anne Lyngholm; Scheike, Thomas

    2017-01-01

    In the presence of competing risks a prediction of the time-dynamic absolute risk of an event can be based on cause-specific Cox regression models for the event and the competing risks (Benichou and Gail, 1990). We present computationally fast and memory optimized C++ functions with an R interface...... for predicting the covariate specific absolute risks, their confidence intervals, and their confidence bands based on right censored time to event data. We provide explicit formulas for our implementation of the estimator of the (stratified) baseline hazard function in the presence of tied event times. As a by...... functionals. The software presented here is implemented in the riskRegression package....

  4. Prediction, Regression and Critical Realism

    DEFF Research Database (Denmark)

    Næss, Petter

    2004-01-01

    This paper considers the possibility of prediction in land use planning, and the use of statistical research methods in analyses of relationships between urban form and travel behaviour. Influential writers within the tradition of critical realism reject the possibility of predicting social phenomena. This position is fundamentally problematic for public planning. Without at least some ability to predict the likely consequences of different proposals, the justification for public sector intervention into market mechanisms will be frail. Statistical methods like regression analyses are commonly seen as necessary in order to identify aggregate-level effects of policy measures, but are questioned by many advocates of critical realist ontology. Using research into the relationship between urban structure and travel as an example, the paper discusses relevant research methods and the kinds

  5. Assessing outcomes of large-scale public health interventions in the absence of baseline data using a mixture of Cox and binomial regressions

    Science.gov (United States)

    2014-01-01

    Background Large-scale public health interventions with rapid scale-up are increasingly being implemented worldwide. Such implementation allows for a large target population to be reached in a short period of time. But when the time comes to investigate the effectiveness of these interventions, the rapid scale-up creates several methodological challenges, such as the lack of baseline data and the absence of control groups. One example of such an intervention is Avahan, the India HIV/AIDS initiative of the Bill & Melinda Gates Foundation. One question of interest is the effect of Avahan on condom use by female sex workers with their clients. By retrospectively reconstructing condom use and sex work history from survey data, it is possible to estimate how condom use rates evolve over time. However formal inference about how this rate changes at a given point in calendar time remains challenging. Methods We propose a new statistical procedure based on a mixture of binomial regression and Cox regression. We compare this new method to an existing approach based on generalized estimating equations through simulations and application to Indian data. Results Both methods are unbiased, but the proposed method is more powerful than the existing method, especially when initial condom use is high. When applied to the Indian data, the new method mostly agrees with the existing method, but seems to have corrected some implausible results of the latter in a few districts. We also show how the new method can be used to analyze the data of all districts combined. Conclusions The use of both methods can be recommended for exploratory data analysis. However for formal statistical inference, the new method has better power. PMID:24397563

  6. Using occupancy modeling and logistic regression to assess the distribution of shrimp species in lowland streams, Costa Rica: Does regional groundwater create favorable habitat?

    Science.gov (United States)

    Snyder, Marcia; Freeman, Mary C.; Purucker, S. Thomas; Pringle, Catherine M.

    2016-01-01

    Freshwater shrimps are an important biotic component of tropical ecosystems. However, they can have a low probability of detection when abundances are low. We sampled 3 of the most common freshwater shrimp species, Macrobrachium olfersii, Macrobrachium carcinus, and Macrobrachium heterochirus, and used occupancy modeling and logistic regression models to improve our limited knowledge of distribution of these cryptic species by investigating both local- and landscape-scale effects at La Selva Biological Station in Costa Rica. Local-scale factors included substrate type and stream size, and landscape-scale factors included presence or absence of regional groundwater inputs. Capture rates for 2 of the sampled species (M. olfersii and M. carcinus) were sufficient to compare the fit of occupancy models. Occupancy models did not converge for M. heterochirus, but M. heterochirus had high enough occupancy rates that logistic regression could be used to model the relationship between occupancy rates and predictors. The best-supported models for M. olfersii and M. carcinus included conductivity, discharge, and substrate parameters. Stream size was positively correlated with occupancy rates of all 3 species. High stream conductivity, which reflects the quantity of regional groundwater input into the stream, was positively correlated with M. olfersii occupancy rates. Boulder substrates increased occupancy rate of M. carcinus and decreased the detection probability of M. olfersii. Our models suggest that shrimp distribution is driven by factors that function at local (substrate and discharge) and landscape (conductivity) scales.

  7. Functional connectivity of coral reef fishes in a tropical seascape assessed by compound-specific stable isotope analyses

    KAUST Repository

    McMahon, Kelton W.

    2011-01-01

    The ecological integrity of tropical habitats, including mangroves, seagrass beds and coral reefs, is coming under increasing pressure from human activities. Many coral reef fish species are thought to use mangroves and seagrass beds as juvenile nurseries before migrating to coral reefs as adults. Identifying essential habitats and preserving functional linkages among these habitats is likely necessary to promote ecosystem health and sustainable fisheries on coral reefs. This necessitates quantitative assessment of functional connectivity among essential habitats at the seascape level. This thesis presents the development and first application of a method for tracking fish migration using amino acid (AA) δ13C analysis in otoliths. In a controlled feeding experiment with fish reared on isotopically distinct diets, we showed that essential AAs exhibited minimal trophic fractionation between consumer and diet, providing a δ13C record of the baseline isoscape. We explored the potential for geochemical signatures in otoliths of snapper to act as natural tags of residency in seagrass beds, mangroves and coral reefs in the Red Sea, Caribbean Sea and Eastern Pacific Ocean. The δ13C values of otolith essential AAs varied as a function of habitat type and provided a better tracer of residence in juvenile nursery habitats than conventional bulk stable isotope analyses (SIA). Using our otolith AA SIA approach, we quantified the relative contribution of coastal wetlands and reef habitats to Lutjanus ehrenbergii populations on coastal, shelf and oceanic coral reefs in the Red Sea. L. ehrenbergii made significant ontogenetic migrations, traveling more than 30 km from juvenile nurseries to coral reefs and across deep open water. Coastal wetlands were important nurseries for L. ehrenbergii; however, there was significant plasticity in L. ehrenbergii juvenile habitat requirements. Seascape configuration played an important role in determining the functional connectivity of L

  8. Multiple linear regression analysis

    Science.gov (United States)

    Edwards, T. R.

    1980-01-01

    Program rapidly selects best-suited set of coefficients. User supplies only vectors of independent and dependent data and specifies confidence level required. Program uses stepwise statistical procedure for relating minimal set of variables to set of observations; final regression contains only most statistically significant coefficients. Program is written in FORTRAN IV for batch execution and has been implemented on NOVA 1200.

  9. Bayesian logistic regression analysis

    NARCIS (Netherlands)

    Van Erp, H.R.N.; Van Gelder, P.H.A.J.M.

    2012-01-01

    In this paper we present a Bayesian logistic regression analysis. It is found that if one wishes to derive the posterior distribution of the probability of some event, then, together with the traditional Bayes Theorem and the integrating out of nuisance parameters, the Jacobian transformation is an …

  10. Linear Regression Analysis

    CERN Document Server

    Seber, George A F

    2012-01-01

    Concise, mathematically clear, and comprehensive treatment of the subject. Expanded coverage of diagnostics and methods of model fitting. Requires no specialized knowledge beyond a good grasp of matrix algebra and some acquaintance with straight-line regression and simple analysis of variance models. More than 200 problems throughout the book plus outline solutions for the exercises. This revision has been extensively class-tested.

  11. Nonlinear Regression with R

    CERN Document Server

    Ritz, Christian; Parmigiani, Giovanni

    2009-01-01

    R is a rapidly evolving lingua franca of graphical display and statistical analysis of experiments from the applied sciences. This book provides a coherent treatment of nonlinear regression with R by means of examples from a diversity of applied sciences such as biology, chemistry, engineering, medicine and toxicology.

  12. Bayesian ARTMAP for regression.

    Science.gov (United States)

    Sasu, L M; Andonie, R

    2013-10-01

    Bayesian ARTMAP (BA) is a recently introduced neural architecture which uses a combination of Fuzzy ARTMAP competitive learning and Bayesian learning. Training is generally performed online, in a single epoch. During training, BA creates input data clusters as Gaussian categories, and also infers the conditional probabilities between input patterns and categories, and between categories and classes. During prediction, BA uses Bayesian posterior probability estimation. So far, BA has been used only for classification. The goal of this paper is to analyze the efficiency of BA for regression problems. Our contributions are: (i) we generalize the BA algorithm using the clustering functionality of both ART modules, and name it BA for Regression (BAR); (ii) we prove that BAR is a universal approximator with the best approximation property. In other words, BAR approximates arbitrarily well any continuous function (universal approximation) and, for every given continuous function, there is one in the set of BAR approximators situated at minimum distance (best approximation); (iii) we experimentally compare the online trained BAR with several neural models, on the following standard regression benchmarks: CPU Computer Hardware, Boston Housing, Wisconsin Breast Cancer, and Communities and Crime. Our results show that BAR is an appropriate tool for regression tasks, both for theoretical and practical reasons. Copyright © 2013 Elsevier Ltd. All rights reserved.

  13. Bounded Gaussian process regression

    DEFF Research Database (Denmark)

    Jensen, Bjørn Sand; Nielsen, Jens Brehm; Larsen, Jan

    2013-01-01

    We extend the Gaussian process (GP) framework for bounded regression by introducing two bounded likelihood functions that model the noise on the dependent variable explicitly. This is fundamentally different from the implicit noise assumption in the previously suggested warped GP framework. We … with the proposed explicit noise-model extension.

  14. Artificial Neural Network and Multinomial Logistic Regression

    African Journals Online (AJOL)

    This work presented the results of an experimental comparison of two models: Multinomial Logistic Regression (MLR) and Artificial Neural Network (ANN) for classifying students based on their academic performance. The predictive accuracy for each model was measured by their average Classification Correct Rate (CCR).
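
    A minimal version of such a comparison can be sketched with synthetic data, using cross-validated accuracy as the classification correct rate (CCR). The dataset, features, and network size below are stand-ins, not the study's.

        # Compare multinomial logistic regression and a small neural network by CCR.
        from sklearn.datasets import make_classification
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_score
        from sklearn.neural_network import MLPClassifier

        X, y = make_classification(n_samples=600, n_features=8, n_informative=5,
                                   n_classes=3, random_state=1)

        mlr = LogisticRegression(max_iter=1000)  # multinomial fit for 3 classes
        ann = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=1)

        for name, clf in [("MLR", mlr), ("ANN", ann)]:
            ccr = cross_val_score(clf, X, y, cv=5).mean()  # mean accuracy = CCR
            print(name, "CCR = %.3f" % ccr)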

  15. Mechanisms of neuroblastoma regression

    Science.gov (United States)

    Brodeur, Garrett M.; Bagatell, Rochelle

    2014-01-01

    Recent genomic and biological studies of neuroblastoma have shed light on the dramatic heterogeneity in the clinical behaviour of this disease, which spans from spontaneous regression or differentiation in some patients, to relentless disease progression in others, despite intensive multimodality therapy. This evidence also suggests several possible mechanisms to explain the phenomena of spontaneous regression in neuroblastomas, including neurotrophin deprivation, humoral or cellular immunity, loss of telomerase activity and alterations in epigenetic regulation. A better understanding of the mechanisms of spontaneous regression might help to identify optimal therapeutic approaches for patients with these tumours. Currently, the most druggable mechanism is the delayed activation of developmentally programmed cell death regulated by the tropomyosin receptor kinase A pathway. Indeed, targeted therapy aimed at inhibiting neurotrophin receptors might be used in lieu of conventional chemotherapy or radiation in infants with biologically favourable tumours that require treatment. Alternative approaches consist of breaking immune tolerance to tumour antigens or activating neurotrophin receptor pathways to induce neuronal differentiation. These approaches are likely to be most effective against biologically favourable tumours, but they might also provide insights into treatment of biologically unfavourable tumours. We describe the different mechanisms of spontaneous neuroblastoma regression and the consequent therapeutic approaches. PMID:25331179

  16. Assessment of Brown Bear's (Ursus arctos syriacus) Winter Habitat Using Geographically Weighted Regression and Generalized Linear Model in South of Iran

    Directory of Open Access Journals (Sweden)

    A. A. Zarei

    2016-03-01

    Winter dens are one of the important components of the brown bear's (Ursus arctos syriacus) habitat, affecting their reproduction and survival. Identifying the factors that affect habitat selection, and locating suitable denning areas, is therefore necessary for the conservation of this large carnivore. We used Geographically Weighted Logistic Regression (GWLR) and a Generalized Linear Model (GLM) for modeling the suitability of denning habitat in the Kouhkhom region of Fars province. In the present research, 20 dens (presence locations) and 20 caves where signs of bear were not found (absence locations) were used as dependent variables, and six environmental factors were used for each location as independent variables. The results of the GLM showed that distance to settlements, altitude, and distance to water were the most important parameters affecting the suitability of the brown bear's denning habitat. The results of GWLR showed significant local variation in the relationship between the occurrence of brown bear dens and distance to settlements. Based on the results of both models, suitable denning habitats for the species are impassable areas in the mountains that are inaccessible to humans.

  17. Confirmatory Factor Analysis and Multiple Linear Regression of the Neck Disability Index: Assessment If Subscales Are Equally Relevant in Whiplash and Nonspecific Neck Pain.

    Science.gov (United States)

    Croft, Arthur C; Milam, Bryce; Meylor, Jade; Manning, Richard

    2016-06-01

    Because of previously published recommendations to modify the Neck Disability Index (NDI), the purpose of the present study was to evaluate the responsiveness and dimensionality of the NDI within a population of adult whiplash-injured subjects. Subjects who had sustained whiplash injuries of grade 2 or higher completed an NDI questionnaire. There were 123 subjects (55% female), of whom 36% had recovered and 64% had chronic symptoms. NDI subscales were analyzed using confirmatory factor analysis, considering only the subscales and, secondly, using sex as an 11th variable. The subscales were also tested with multiple linear regression modeling using the total score as a target variable. When considering only the 10 NDI subscales, only a single factor emerged, with an eigenvalue of 5.4, explaining 53.7% of the total variance. Strong correlation (> .55, P < .05) was found among the subscales. A multifactor model of the NDI is not justified based on our results, and in this population of whiplash subjects, the NDI was unidimensional, demonstrating high internal consistency and supporting the original validation study of Vernon and Mior.

  18. Pitfalls in the assessment of radioresponse as determined by tumor regression. Consideration based on the location and histologic constitution of tumors

    Energy Technology Data Exchange (ETDEWEB)

    Ohara, Kiyoshi; Shimizu, Wakako; Itai, Yuji [Tsukuba Univ., Ibaraki (Japan). Inst. of Clinical Medicine

    2000-05-01

    To test the following hypotheses regarding tumor shrinkage after radiotherapy: tumors located on an outer tissue surface (e.g., esophageal tumors) shrink faster than parenchymal tumors (e.g., lymph-node metastases), because two clearance mechanisms, exfoliation and absorption, can operate in the former type of tumor whereas only absorption can function in the latter; and tumors that are being controlled do not necessarily respond completely, because tumors are constituted not only of tumor cells but also of stromal tissues that are difficult to absorb. Long-term shrinkage patterns of a parenchymal tumor were determined by using 18 curatively irradiated hepatomas. Preoperatively irradiated thymomas (10) and lymph-node metastases (37) from head and neck cancers were examined histopathologically. Twenty-one esophageal cancers were used for intra-patient response comparison between the primary disease and the lymph-node metastases. Shrinkage patterns were generally biphasic: rapid exponential regression followed by a plateau phase. Histologically, thymomas generally consisted of predominant fibrous tissues and few remaining tumor cells. Radioresponse did not predict the presence of remaining cancer cells in the lymph nodes. Esophageal-cancer radioresponse was always higher for the primary disease than for the lymph-node metastases. The location and histologic constitution of tumors must be taken into account in predicting radiocurability from radioresponse. (author)

  19. Assessing the impacts of droughts and heat waves at thermoelectric power plants in the United States using integrated regression, thermodynamic, and climate models

    Directory of Open Access Journals (Sweden)

    Margaret A. Cook

    2015-11-01

    Recent droughts and heat waves have revealed the vulnerability of some power plants to effects from higher temperature intake water for cooling. In this evaluation, we develop a methodology for predicting whether power plants are at risk of violating thermal pollution limits. We begin by developing a regression model of average monthly intake temperatures for open loop and recirculating cooling pond systems. We then integrate that information into a thermodynamic model of energy flows within each power plant to determine the change in cooling water temperature that occurs at each plant and the relationship of that water temperature to other plants in the river system. We use these models together with climate change models to estimate the monthly effluent temperature at twenty-six power plants in the Upper Mississippi River Basin and Texas between 2015 and 2035 to predict which ones are at risk of reaching thermal pollution limits. The intake model shows that two plants could face elevated intake temperatures between 2015 and 2035 compared to the 2010–2013 baseline. In general, a rise in ambient cooling water temperature of 1 °C could cause a drop in power output of 0.15%–0.5%. The energy balance shows that twelve plants might exceed state summer effluent limits.

  20. The Application of Multinomial Logistic Regression Models for the Assessment of Parameters of Oocytes and Embryos Quality in Predicting Pregnancy and Miscarriage

    Directory of Open Access Journals (Sweden)

    Milewska Anna Justyna

    2017-09-01

    Infertility is a huge problem nowadays, not only from the medical but also from the social point of view. A key step to improve treatment outcomes is the possibility of effective prediction of the treatment result. In a situation when a phenomenon with more than 2 states needs to be explained, e.g. pregnancy, miscarriage, non-pregnancy, the use of multinomial logistic regression is a good solution. The aim of this paper is to select those features that have a significant impact on achieving clinical pregnancy as well as those that determine the occurrence of spontaneous miscarriage (non-pregnancy was set as the reference category). Two multi-factor models were obtained, used in predicting infertility treatment outcomes. One of the models enabled us to conclude that the number of follicles and the percentage of retrieved mature oocytes have a significant impact when prediction of treatment outcome is made on the basis of information about oocytes. The other model, built on the basis of information about embryos, showed the significance of the number of fertilized oocytes, the percentage of at least 7-cell embryos on day 3, the percentage of blasts on day 5, and the day of transfer.
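
    The modeling strategy (a three-state outcome with non-pregnancy as the reference category) can be sketched with a multinomial logit. All variable names and data below are hypothetical stand-ins, not the study's records.

        # Multinomial logit with category 0 (non-pregnancy) as the reference.
        import numpy as np
        import pandas as pd
        import statsmodels.api as sm

        rng = np.random.default_rng(7)
        n = 300
        X = pd.DataFrame({
            "n_follicles": rng.poisson(10, n),
            "pct_mature_oocytes": rng.uniform(30, 100, n),
        })
        # 0 = non-pregnancy (reference), 1 = clinical pregnancy, 2 = miscarriage
        outcome = rng.integers(0, 3, n)

        fit = sm.MNLogit(outcome, sm.add_constant(X)).fit(disp=False)
        print(fit.params)  # one coefficient column per non-reference category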

  1. Categorical regression dose-response modeling

    Science.gov (United States)

    The goal of this training is to provide participants with instruction in the use of the U.S. EPA's Categorical Regression software (CatReg) and its application to risk assessment. Categorical regression fits mathematical models to toxicity data that have been assigned ord...
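
    In the same spirit, a categorical (ordinal) dose-response regression can be sketched as an ordered logit of severity category on log dose. The sketch uses statsmodels' OrderedModel as a stand-in, not EPA's CatReg software, and all data are simulated.

        # Ordinal dose-response regression: severity 0/1/2 versus log10(dose).
        import numpy as np
        from statsmodels.miscmodels.ordinal_model import OrderedModel

        rng = np.random.default_rng(3)
        dose = rng.uniform(0.1, 100, 400)
        latent = 1.5 * np.log10(dose) + rng.logistic(size=400)
        severity = np.digitize(latent, bins=[0.5, 2.0])  # 0 none, 1 mild, 2 severe

        model = OrderedModel(severity, np.log10(dose)[:, None], distr="logit")
        res = model.fit(method="bfgs", disp=False)
        print(res.params)  # dose slope plus the two category thresholds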

  2. Robustness assessments are needed to reduce bias in meta-analyses that include zero-event randomized trials

    DEFF Research Database (Denmark)

    Keus, F; Wetterslev, J; Gluud, C

    2009-01-01

    OBJECTIVES: Meta-analysis of randomized trials with binary data can use a variety of statistical methods. Zero-event trials may create analytic problems. We explored how different methods may impact inferences from meta-analyses containing zero-event trials. METHODS: Five levels of statistical methods are identified for meta-analysis with zero-event trials, leading to numerous data analyses. We used the binary outcomes from our Cochrane review of randomized trials of laparoscopic vs. small-incision cholecystectomy for patients with symptomatic cholecystolithiasis to illustrate the influence of statistical method on inference. RESULTS: In seven meta-analyses of seven outcomes from 15 trials, there were zero-event trials in 0 to 71.4% of the trials. We found inconsistency in significance in one of seven outcomes (14%; 95% confidence limit 0.4%-57.9%). There was also considerable variability …

  3. Binary Logistic Regression Analysis in Assessment and Identifying Factors That Influence Students' Academic Achievement: The Case of College of Natural and Computational Science, Wolaita Sodo University, Ethiopia

    Science.gov (United States)

    Zewude, Bereket Tessema; Ashine, Kidus Meskele

    2016-01-01

    An attempt has been made to assess and identify the major variables that influence student academic achievement at college of natural and computational science of Wolaita Sodo University in Ethiopia. Study time, peer influence, securing first choice of department, arranging study time outside class, amount of money received from family, good life…

  4. Ridge Regression Signal Processing

    Science.gov (United States)

    Kuhl, Mark R.

    1990-01-01

    The introduction of the Global Positioning System (GPS) into the National Airspace System (NAS) necessitates the development of Receiver Autonomous Integrity Monitoring (RAIM) techniques. In order to guarantee a certain level of integrity, a thorough understanding of modern estimation techniques applied to navigational problems is required. The extended Kalman filter (EKF) is derived and analyzed under poor geometry conditions. It was found that the performance of the EKF is difficult to predict, since the EKF is designed for a Gaussian environment. A novel approach is implemented which incorporates ridge regression to explain the behavior of an EKF in the presence of dynamics under poor geometry conditions. The basic principles of ridge regression theory are presented, followed by the derivation of a linearized recursive ridge estimator. Computer simulations are performed to confirm the underlying theory and to provide a comparative analysis of the EKF and the recursive ridge estimator.
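
    The estimator at the core of such a scheme is ordinary ridge regression, beta = (X'X + kI)^(-1) X'y. The minimal sketch below (synthetic data) shows how the ridge term stabilizes the solution when the design is nearly collinear, i.e., under the "poor geometry" referred to above.

        # Closed-form ridge regression versus OLS on a near-collinear design.
        import numpy as np

        def ridge(X, y, k):
            """Ridge coefficients for regularization parameter k >= 0 (k = 0 gives OLS)."""
            return np.linalg.solve(X.T @ X + k * np.eye(X.shape[1]), X.T @ y)

        rng = np.random.default_rng(1)
        X = rng.normal(size=(50, 3))
        X[:, 2] = X[:, 0] + 0.01 * rng.normal(size=50)  # nearly collinear column
        y = X @ np.array([1.0, -2.0, 0.0]) + 0.1 * rng.normal(size=50)

        print("OLS  :", ridge(X, y, 0.0))  # unstable under collinearity
        print("ridge:", ridge(X, y, 1.0))  # shrunk, better-conditioned estimate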

  5. Subset selection in regression

    CERN Document Server

    Miller, Alan

    2002-01-01

    Originally published in 1990, the first edition of Subset Selection in Regression filled a significant gap in the literature, and its critical and popular success has continued for more than a decade. Thoroughly revised to reflect progress in theory, methods, and computing power, the second edition promises to continue that tradition. The author has thoroughly updated each chapter, incorporated new material on recent developments, and included more examples and references. New in the second edition: a separate chapter on Bayesian methods; complete revision of the chapter on estimation; a major example from the field of near infrared spectroscopy; more emphasis on cross-validation; greater focus on bootstrapping; stochastic algorithms for finding good subsets from large numbers of predictors when an exhaustive search is not feasible; software available on the Internet for implementing many of the algorithms presented; more examples. Subset Selection in Regression, Second Edition remains dedicated to the techniques for fitting...

  6. Regression in organizational leadership.

    Science.gov (United States)

    Kernberg, O F

    1979-02-01

    The choice of good leaders is a major task for all organizations. Information regarding the prospective administrator's personality should complement questions regarding his previous experience, his general conceptual skills, his technical knowledge, and the specific skills in the area for which he is being selected. The growing psychoanalytic knowledge about the crucial importance of internal, in contrast to external, object relations, and about the mutual relationships of regression in individuals and in groups, constitutes an important practical tool for the selection of leaders.

  7. Classification and regression trees

    CERN Document Server

    Breiman, Leo; Olshen, Richard A; Stone, Charles J

    1984-01-01

    The methodology used to construct tree structured rules is the focus of this monograph. Unlike many other statistical procedures, which moved from pencil and paper to calculators, this text's use of trees was unthinkable before computers. Both the practical and theoretical sides have been developed in the authors' study of tree methods. Classification and Regression Trees reflects these two sides, covering the use of trees as a data analysis method, and in a more mathematical framework, proving some of their fundamental properties.

  8. Comparison of Classical Linear Regression and Orthogonal Regression According to the Sum of Squares Perpendicular Distances

    OpenAIRE

    KELEŞ, Taliha; ALTUN, Murat

    2016-01-01

    Regression analysis is a statistical technique for investigating and modeling the relationship between variables. The purpose of this study was the trivial presentation of the equation for orthogonal regression (OR) and the comparison of classical linear regression (CLR) and OR techniques with respect to the sum of squared perpendicular distances. For that purpose, the analyses were shown by an example. It was found that the sum of squared perpendicular distances of OR is smaller. Thus, it wa...
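
    The contrast between the two criteria is easy to reproduce on synthetic data: classical linear regression minimizes vertical squared distances, whereas orthogonal regression minimizes perpendicular ones and can be obtained from the first principal axis of the centered data.

        # CLR versus OR for a straight line, compared by perpendicular sum of squares.
        import numpy as np

        rng = np.random.default_rng(5)
        x = rng.uniform(0, 10, 100)
        y = 2.0 * x + 1.0 + rng.normal(0, 1.0, 100)

        b_clr, a_clr = np.polyfit(x, y, 1)        # classical least squares fit

        X = np.column_stack([x - x.mean(), y - y.mean()])
        _, _, Vt = np.linalg.svd(X, full_matrices=False)
        dx, dy = Vt[0]                            # principal direction of the data
        b_or = dy / dx
        a_or = y.mean() - b_or * x.mean()

        def perp_ss(a, b):
            """Sum of squared perpendicular distances to the line y = a + b*x."""
            return np.sum((y - a - b * x) ** 2) / (1 + b ** 2)

        print("CLR slope %.3f, perpendicular SS %.2f" % (b_clr, perp_ss(a_clr, b_clr)))
        print("OR  slope %.3f, perpendicular SS %.2f" % (b_or, perp_ss(a_or, b_or)))
        # The orthogonal fit attains the smaller perpendicular sum of squares.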

  9. Logistic regression models

    CERN Document Server

    Hilbe, Joseph M

    2009-01-01

    This book really does cover everything you ever wanted to know about logistic regression … with updates available on the author's website. Hilbe, a former national athletics champion, philosopher, and expert in astronomy, is a master at explaining statistical concepts and methods. Readers familiar with his other expository work will know what to expect: great clarity. The book provides considerable detail about all facets of logistic regression. No step of an argument is omitted so that the book will meet the needs of the reader who likes to see everything spelt out, while a person familiar with some of the topics has the option to skip "obvious" sections. The material has been thoroughly road-tested through classroom and web-based teaching. … The focus is on helping the reader to learn and understand logistic regression. The audience is not just students meeting the topic for the first time, but also experienced users. I believe the book really does meet the author's goal … . -Annette J. Dobson, Biometric...

  10. Assessing the influence of land use land cover pattern, socio economic factors and air quality status to predict morbidity on the basis of logistic based regression model

    Science.gov (United States)

    Dixit, A.; Singh, V. K.

    2017-12-01

    Recent studies conducted by the World Health Organisation (WHO) estimated that 92% of the world's population lives in places where air quality exceeds the WHO standard limit. This is due to changes in Land Use Land Cover (LULC) patterns, socio-economic drivers and anthropogenic heat emissions from human activity. As a result, prevalent human respiratory diseases such as lung cancer, chronic obstructive pulmonary disease and emphysema have increased in recent times. In this study, a quantitative relationship is developed between land use (built-up land, water bodies, and vegetation), socio-economic drivers and air quality parameters using a logistic-based regression model over 7 different cities of India for the winter seasons of 2012 to 2016. Different LULC, socio-economic, industrial emission source, meteorological and air quality data from the monitoring stations are used to estimate the influence on morbidity in each city. Correlations between land use variables and monthly pollutant concentrations range from 0.63 to 0.76. Similarly, correlations between land use variables, socio-economic factors and morbidity range from 0.57 to 0.73. The performance of the model improved from 67% to 79% in estimating morbidity for the years 2015 and 2016 due to the better availability of observed data. The study highlights the growing importance of incorporating socio-economic drivers with air quality data when evaluating morbidity rates for each city, compared with a purely quantitative analysis of air quality change.

  11. Testing and Modeling Fuel Regression Rate in a Miniature Hybrid Burner

    Directory of Open Access Journals (Sweden)

    Luciano Fanton

    2012-01-01

    Ballistic characterization of an extended group of innovative HTPB-based solid fuel formulations for hybrid rocket propulsion was performed in a lab-scale burner. An optical time-resolved technique was used to assess the quasi-steady regression history of single-perforation, cylindrical samples. The effects of metalized additives and radiant heat transfer on the regression rate of such formulations were assessed. Under the investigated operating conditions and based on phenomenological models from the literature, analyses of the collected experimental data show an appreciable influence of the radiant heat flux from burnt gases and soot for both unloaded and loaded fuel formulations. Pure HTPB regression rate data are satisfactorily reproduced, while the impressive initial regression rates of metalized formulations require further assessment.

  12. Comparison of the Complior Analyse device with Sphygmocor and Complior SP for pulse wave velocity and central pressure assessment.

    Science.gov (United States)

    Stea, Francesco; Bozec, Erwan; Millasseau, Sandrine; Khettab, Hakim; Boutouyrie, Pierre; Laurent, Stéphane

    2014-04-01

    The Complior device (Alam Medical, France) was used in epidemiological studies which established pulse wave velocity (PWV) as a cardiovascular risk marker. Central pressure is related, but complementary, to PWV and is also associated with cardiovascular outcomes. The new Complior Analyse measures both PWV and central blood pressure during the same acquisition. The aim of this study was to compare PWV values from the Complior Analyse with the previous Complior SP (PWVcs) and with the Sphygmocor (PWVscr; AtCor, Australia), and to compare central systolic pressure from the Complior Analyse and Sphygmocor. Peripheral and central pressures and PWV were measured with the three devices in 112 patients. PWV measurements from the Complior Analyse were analysed using two foot-detection algorithms (PWVca_it and PWVca_cs). Both radial (ao-SBPscr) and carotid (car-SBPscr) approaches from the Sphygmocor were compared to carotid Complior Analyse measurements (car-SBPca). The same distance and same calibrating pressures were used for all devices. PWVca_it was strongly correlated to PWVscr (R² = 0.93, P < 0.001) with a difference of 0.0 ± 0.7 m/s. PWVca_cs was also correlated to PWVcs (R² = 0.90, P < 0.001) with a difference of 0.1 ± 0.7 m/s. Central systolic pressures were strongly correlated. The difference between car-SBPca and ao-SBPscr was 3.1 ± 4.2 mmHg (P < 0.001), statistically equivalent to the difference between car-SBPscr and ao-SBPscr (3.9 ± 5.8 mmHg, P < 0.001), whilst the difference between car-SBPca and car-SBPscr was negligible (-0.7 ± 5.6 mmHg, P = NS). The new Complior Analyse device provides equivalent results for PWV and central pressure values to the Sphygmocor and Complior SP. It reaches the Association for the Advancement of Medical Instrumentation standard for central blood pressure and grades as excellent for PWV on the Artery Society criteria. It can be interchanged with existing devices.

  13. Assessment of the influence of different sample processing and cold storage duration on plant free proline content analyses.

    Science.gov (United States)

    Teklić, Tihana; Spoljarević, Marija; Stanisavljević, Aleksandar; Lisjak, Miroslav; Vinković, Tomislav; Parađiković, Nada; Andrić, Luka; Hancock, John T

    2010-01-01

    A method which is widely accepted for the analysis of free proline content in plant tissues is based on the use of 3% sulfosalicylic acid as an extractant, followed by spectrophotometric quantification of a proline-ninhydrin complex in toluene. However, sample preparation and storage may influence the proline actually measured. This may give misleading or difficult-to-compare data. To evaluate free proline levels, fresh and frozen strawberry (Fragaria × ananassa Duch.) leaves and soybean [Glycine max (L.) Merr.] hypocotyl tissues were used. These were ground with or without liquid nitrogen and proline extracted with sulfosalicylic acid. A particular focus was the influence of plant sample cold storage duration (1, 4 and 12 weeks at -20°C) on tissue proline levels measured. The free proline content analyses, carried out in leaves of Fragaria × ananassa Duch. as well as in hypocotyls of Glycine max (L.) Merr., showed a significant influence of the sample preparation method and cold storage period. Long-term storage of up to 12 weeks at -20°C led to a significant increase in the measured proline in all samples analysed. The observed changes in proline content in plant tissue samples stored at -20°C indicate the likelihood of the over-estimation of the proline content if the proline analyses are delayed. Plant sample processing and cold storage duration seem to have an important influence on results of proline analyses. Therefore it is recommended that samples should be ground fresh and analysed immediately. Copyright © 2010 John Wiley & Sons, Ltd.

  14. Response to Ecological Risk Assessment Forum Request for Information on the Benefits of PCB Congener-Specific Analyses

    Science.gov (United States)

    In August, 2001, the Ecological Risk Assessment Forum (ERAF) submitted a formal question to the Ecological Risk Assessment Support Center (ERASC) on the benefits of evaluating PCB congeners in environmental samples. This question was developed by ERAF members Bruce Duncan and Cla...

  15. Steganalysis using logistic regression

    Science.gov (United States)

    Lubenko, Ivans; Ker, Andrew D.

    2011-02-01

    We advocate Logistic Regression (LR) as an alternative to the Support Vector Machine (SVM) classifiers commonly used in steganalysis. LR offers more information than traditional SVM methods - it estimates class probabilities as well as providing a simple classification - and can be adapted more easily and efficiently for multiclass problems. Like SVM, LR can be kernelised for nonlinear classification, and it shows comparable classification accuracy to SVM methods. This work is a case study, comparing the accuracy and speed of SVM and LR classifiers in the detection of LSB Matching and other related spatial-domain image steganography, through the state-of-the-art 686-dimensional SPAM feature set, in three image sets.

  16. SEPARATION PHENOMENA LOGISTIC REGRESSION

    Directory of Open Access Journals (Sweden)

    Ikaro Daniel de Carvalho Barreto

    2014-03-01

    This paper applies concepts from maximum likelihood estimation of the binomial logistic regression model to the phenomenon of separation. Separation generates bias in the estimation and yields different interpretations of the estimates across the different statistical tests (Wald, Likelihood Ratio and Score), as well as different estimates across the different iterative methods (Newton-Raphson and Fisher Scoring). We also present an example that demonstrates the direct implications for the validation of the model and of its variables, and for the estimates of odds ratios and confidence intervals generated from the Wald statistics. Furthermore, we briefly present the Firth correction as a means to circumvent the separation phenomenon.
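
    The phenomenon is easy to demonstrate: under complete separation the unpenalized maximum likelihood estimate diverges, while a penalized fit stays finite. The sketch below uses a ridge penalty as a simple stand-in for the Firth correction mentioned above; the data are artificial.

        # Complete separation: near-ML coefficient blows up, penalized one does not.
        import numpy as np
        from sklearn.linear_model import LogisticRegression

        x = np.linspace(-2, 2, 40).reshape(-1, 1)
        y = (x.ravel() > 0).astype(int)  # outcome perfectly separated at x = 0

        near_ml = LogisticRegression(C=1e10, max_iter=10000)  # ~unpenalized ML
        penalized = LogisticRegression(C=1.0)                 # ridge-penalized

        print("near-ML coef:  ", near_ml.fit(x, y).coef_[0, 0])    # grows very large
        print("penalized coef:", penalized.fit(x, y).coef_[0, 0])  # finite, stable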

  17. riskRegression

    DEFF Research Database (Denmark)

    Ozenne, Brice; Sørensen, Anne Lyngholm; Scheike, Thomas

    2017-01-01

    In the presence of competing risks a prediction of the time-dynamic absolute risk of an event can be based on cause-specific Cox regression models for the event and the competing risks (Benichou and Gail, 1990). We present computationally fast and memory optimized C++ functions with an R interface. As a by-product we obtain fast access to the baseline hazards (compared to survival::basehaz()) and predictions of survival probabilities, their confidence intervals and confidence bands. Confidence intervals and confidence bands are based on point-wise asymptotic expansions of the corresponding statistical …

  18. Adaptive metric kernel regression

    DEFF Research Database (Denmark)

    Goutte, Cyril; Larsen, Jan

    2000-01-01

    Kernel smoothing is a widely used non-parametric pattern recognition technique. By nature, it suffers from the curse of dimensionality and is usually difficult to apply to high input dimensions. In this contribution, we propose an algorithm that adapts the input metric used in multivariate regression by minimising a cross-validation estimate of the generalisation error. This makes it possible to adjust the importance of different dimensions automatically. The improvement in terms of modelling performance is illustrated on a variable selection task where the adaptive metric kernel clearly outperforms …
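
    A bare-bones version of the underlying estimator is Nadaraya-Watson kernel regression with one bandwidth per input dimension (a diagonal input metric). Tuning those bandwidths by minimising a cross-validation estimate, as the paper proposes, is omitted here; all data are synthetic.

        # Gaussian-kernel regression with per-dimension bandwidths.
        import numpy as np

        def nw_predict(Xtr, ytr, Xte, bandwidths):
            """Nadaraya-Watson prediction with a diagonal metric."""
            d2 = (((Xte[:, None, :] - Xtr[None, :, :]) / bandwidths) ** 2).sum(-1)
            W = np.exp(-0.5 * d2)              # (n_test, n_train) kernel weights
            return (W @ ytr) / W.sum(axis=1)

        rng = np.random.default_rng(2)
        Xtr = rng.uniform(-1, 1, (200, 2))
        ytr = np.sin(3 * Xtr[:, 0]) + 0.1 * rng.normal(size=200)  # dim 2 irrelevant
        Xte = np.array([[0.5, 0.0], [-0.5, 0.9]])

        # A wide bandwidth on the irrelevant dimension effectively removes it.
        print(nw_predict(Xtr, ytr, Xte, bandwidths=np.array([0.2, 10.0])))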

  19. Adaptive Metric Kernel Regression

    DEFF Research Database (Denmark)

    Goutte, Cyril; Larsen, Jan

    1998-01-01

    Kernel smoothing is a widely used nonparametric pattern recognition technique. By nature, it suffers from the curse of dimensionality and is usually difficult to apply to high input dimensions. In this paper, we propose an algorithm that adapts the input metric used in multivariate regression by minimising a cross-validation estimate of the generalisation error. This allows one to automatically adjust the importance of different dimensions. The improvement in terms of modelling performance is illustrated on a variable selection task where the adaptive metric kernel clearly outperforms the standard …

  20. Safety assessment of historical masonry churches based on pre-assigned kinematic limit analysis, FE limit and pushover analyses

    Energy Technology Data Exchange (ETDEWEB)

    Milani, Gabriele, E-mail: milani@stru.polimi.it; Valente, Marco, E-mail: milani@stru.polimi.it [Department of Architecture, Built Environment and Construction Engineering (ABC), Politecnico di Milano, Piazza Leonardo da Vinci 32, 20133 Milan (Italy)

    2014-10-06

    This study presents some results of a comprehensive numerical analysis on three masonry churches damaged by the recent Emilia-Romagna (Italy) seismic events that occurred in May 2012. The numerical study comprises: (a) pushover analyses conducted with a commercial code, standard nonlinear material models and two different horizontal load distributions; (b) FE kinematic limit analyses performed using non-commercial software based on a preliminary homogenization of the masonry materials and a subsequent limit analysis with triangular elements and interfaces; (c) kinematic limit analyses conducted in agreement with the Italian code and based on the a-priori assumption of preassigned failure mechanisms, where the masonry material is considered unable to withstand tensile stresses. All models are capable of giving information on the active failure mechanism and the base shear at failure, which, if properly made non-dimensional with the weight of the structure, also gives an indication of the horizontal peak ground acceleration causing the collapse of the church. The results obtained from all three models indicate that the collapse is usually due to the activation of partial mechanisms (apse, façade, lateral walls, etc.). Moreover, the horizontal peak ground acceleration associated with the collapse is far lower than that required in that seismic zone by the Italian code for ordinary buildings. These outcomes highlight that structural upgrading interventions would be extremely beneficial for the considerable reduction of the seismic vulnerability of this kind of historical structure.

  2. Reflectance spectral analyses for the assessment of environmental pollution in the geothermal site of Mt. Amiata (Italy)

    Science.gov (United States)

    Manzo, Ciro; Salvini, Riccardo; Guastaldi, Enrico; Nicolardi, Valentina; Protano, Giuseppe

    2013-11-01

    We studied the environmental impact of geothermal activities in the Mt. Amiata area, using on-site spectral analyses of various ecological components. Analytical techniques were based on the study of the “red-edge”, the spectral feature of the reflectance spectra between the red and infrared wavelengths (λ) in the range 670-780 nm. Since geothermal exploitation in the study area causes the drifting of contaminants such as Hg, Sb, S, B, As and H2S (hydrogen sulfide) from power plants, the spectral response of vegetation and lichens depends on their distance from the power stations, and also on the exposed surface, material type and other physical parameters. In the present research, the spectral radiance of targets was measured in the field using an Analytical Spectral Device (ASD) Field-Spec™FR portable radiometer. Spectral measurements were made on vegetation and lichen samples located near to and far from geothermal areas and potential pollution sources (e.g., power plants), with the aim of spatially defining their environmental impact. Observations for vegetation and lichens showed correlation with laboratory chemical analyses when these organisms were under stress conditions. The evaluation of relationships was carried out using several statistical approaches, which allowed contamination indicators to be identified for plants and lichens in polluted areas. Results show that the adopted spectral indices are sensitive to environmental pollution and that their responses are spatially correlated with chemical and ecophysiological analyses within a notable distance.

  3. Economic efficiency assessment of greenhouse gases mitigation for agriculture; Analyse af omkostningseffektiviteten ved drivhusgasreducerende tiltag i relation til landbruget

    Energy Technology Data Exchange (ETDEWEB)

    Dubgaard, A.; Moeller Laugesen, F.; Staehl, E.E.; Bang, J.R.; Schou, E.; Jacobsen, Brian H.; Oerum, J.E.; Dejgaerd Jensen, J.

    2013-08-15

    The report contains the contributions by the Institute of Food and Resource Economics (IFRO) to a Danish Government appraisal of greenhouse gas (GHG) reduction measures. The policy goal is a 40 per cent reduction in total Danish GHG emissions by 2020 compared to 1990. The GHGs analysed in the present study include emissions of CO2, nitrous oxide and methane plus soil carbon sequestration. The purpose of the study is to identify GHG mitigation measures related to agriculture which can deliver cost-effective contributions to the targeted reduction in GHG emissions in Denmark. A total of 21 GHG mitigation measures are included in the assessment. The stipulated implementation period is 2013 to 2020. The cost calculations have a time horizon equal to 30 years, i.e. from 2013 to 2042. The GHG reduction potential, expressed in CO2 equivalents (CO2-eq), is calculated as the sum of the effect on the emission of CO2 (with and without changes in soil carbon), methane and nitrous oxide. The 21 mitigation measures are listed below (figures in brackets show the assumed implementation potential): 1. Biogas from livestock manure/slurry (10% of total slurry production) 2. Biogas from slurry and maize (10% of total slurry production) 3. Biogas from organic clover 4. Additional fat in diet for dairy cows (80% of conventional dairy cow stock and 20% of organic dairy cow stock) 5. Additional concentrated feed in diet for other cattle (25% of cattle stock under 2 years of age) 6. Prolonged lactation period for dairy cows (10% of dairy cow stock) 7. Acidification of slurry (10% of total slurry production) 8. Covers on slurry containers (40% of total slurry production) 9. Cooling of pig slurry (10% of pig slurry) 10. Nitrification inhibitors in nitrate fertilisers (100% of chemical fertilisers with nitrogen) 11. Increased nitrogen utilization requirement for degassed slurry in nitrogen quota system (50% of total slurry production) 12. Increased nitrogen...

  4. Time-trend of melanoma screening practice by primary care physicians: A meta-regression analysis

    OpenAIRE

    Valachis, Antonis; Mauri, Davide; Karampoiki, Vassiliki; Polyzos, Nikolaos P; Cortinovis, Ivan; Koukourakis, Georgios; Zacharias, Georgios; Xilomenos, Apostolos; Tsappi, Maria; Casazza, Giovanni

    2009-01-01

    Objective To assess whether the proportion of primary care physicians implementing full body skin examination (FBSE) to screen for melanoma changed over time. Methods Meta-regression analyses of available data. Data Sources: MEDLINE, ISI, Cochrane Central Register of Controlled Trials. Results Fifteen studies surveying 10,336 physicians were included in the analyses. Overall, 15%–82% of them reported to perform FBSE to screen for melanoma. The proportion of physicians using FBSE screening ten...
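
    A meta-regression of study-level proportions on survey year can be sketched as weighted least squares on the logit scale with approximate inverse-variance weights. The numbers below are invented, not the review's data.

        # Meta-regression of logit(proportion) on year, inverse-variance weighted.
        import numpy as np
        import statsmodels.api as sm

        year = np.array([1995, 1998, 2001, 2004, 2007])
        p = np.array([0.30, 0.42, 0.55, 0.60, 0.70])  # proportion performing FBSE
        n = np.array([400, 350, 500, 600, 450])       # physicians surveyed

        logit_p = np.log(p / (1 - p))
        var = 1.0 / (n * p * (1 - p))                 # delta-method variance of logit
        wls = sm.WLS(logit_p, sm.add_constant(year - year.min()),
                     weights=1 / var).fit()
        print(wls.params)  # a positive slope would indicate increasing FBSE use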

  5. Evaluation of Two Surface Sampling Methods for Microbiological and Chemical Analyses To Assess the Presence of Biofilms in Food Companies.

    Science.gov (United States)

    Maes, Sharon; Huu, Son Nguyen; Heyndrickx, Marc; Weyenberg, Stephanie van; Steenackers, Hans; Verplaetse, Alex; Vackier, Thijs; Sampers, Imca; Raes, Katleen; Reu, Koen De

    2017-12-01

    Biofilms are an important source of contamination in food companies, yet the composition of biofilms in practice is still mostly unknown. The chemical and microbiological characterization of surface samples taken after cleaning and disinfection is very important to distinguish free-living bacteria from the attached bacteria in biofilms. In this study, sampling methods that are potentially useful for both chemical and microbiological analyses of surface samples were evaluated. In the manufacturing facilities of eight Belgian food companies, surfaces were sampled after cleaning and disinfection using two sampling methods: the scraper-flocked swab method and the sponge stick method. Microbiological and chemical analyses were performed on these samples to evaluate the suitability of the sampling methods for the quantification of extracellular polymeric substance components and microorganisms originating from biofilms in these facilities. The scraper-flocked swab method was most suitable for chemical analyses of the samples because the material in these swabs did not interfere with determination of the chemical components. For microbiological enumerations, the sponge stick method was slightly but not significantly more effective than the scraper-flocked swab method. In all but one of the facilities, at least 20% of the sampled surfaces had more than 10² CFU/100 cm². Proteins were found in 20% of the chemically analyzed surface samples, and carbohydrates and uronic acids were found in 15 and 8% of the samples, respectively. When chemical and microbiological results were combined, 17% of the sampled surfaces were contaminated with both microorganisms and at least one of the analyzed chemical components; thus, these surfaces were characterized as carrying biofilm. Overall, microbiological contamination in the food industry is highly variable by food sector and even within a facility at various sampling points and sampling times.

  6. Analysing the New Taliban Code of Conduct (Layeha): An Assessment of Changing Perspectives and Strategies of the Afghan Taliban

    Science.gov (United States)

    2012-03-01

  7. Life cycle assessment applied to wastewater treatment; Analyse de cycle de vie appliquee aux systemes de traitement des eaux usees

    Energy Technology Data Exchange (ETDEWEB)

    Renou, S.

    2006-01-15

    Nowadays, the environmental performances of wastewater treatment systems are not properly analyzed, so an exhaustive and reliable method is needed to help stakeholders choose the best environmental solutions. Life cycle assessment (LCA) was selected as a starting point for addressing this problem and was tested; this tool proved essential for analyzing the environmental performances of wastewater treatment systems. To fulfill our goal, the best compromise appears to be the association of LCA, to assess global impacts, with other methodologies, to assess local impacts. Finally, a software tool has been developed to compare urban sludge treatment and recovery process trains. Two impacts, energy and greenhouse effect, are currently included. The software and its development steps are described and illustrated through two case studies. This tool has made LCA easier to apply and more useful to stakeholders in the wastewater field. (author)

  8. A comparison of three methods of assessing differential item functioning (DIF) in the Hospital Anxiety Depression Scale: ordinal logistic regression, Rasch analysis and the Mantel chi-square procedure.

    Science.gov (United States)

    Cameron, Isobel M; Scott, Neil W; Adler, Mats; Reid, Ian C

    2014-12-01

    It is important for clinical practice and research that measurement scales of well-being and quality of life exhibit only minimal differential item functioning (DIF). DIF occurs where different groups of people endorse items in a scale to different extents after being matched by the intended scale attribute. We investigate the equivalence or otherwise of common methods of assessing DIF. Three methods of measuring age- and sex-related DIF (ordinal logistic regression, Rasch analysis and the Mantel χ² procedure) were applied to Hospital Anxiety Depression Scale (HADS) data pertaining to a sample of 1,068 patients consulting primary care practitioners. Three items were flagged by all three approaches as having either age- or sex-related DIF with a consistent direction of effect; a further three items identified did not meet stricter criteria for important DIF using at least one method. When applying strict criteria for significant DIF, ordinal logistic regression was slightly less sensitive. Ordinal logistic regression, Rasch analysis and contingency table methods yielded consistent results when identifying DIF in the HADS depression and HADS anxiety scales.
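
    The logistic-regression approach to DIF can be sketched for a single item: compare a model that uses only the matching score against one that adds group membership, via a likelihood-ratio test. The sketch dichotomizes the item response for simplicity (the study used ordinal logistic regression on the ordinal HADS items), and the data are simulated with a known DIF effect.

        # One-item DIF check via nested logistic regressions.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(11)
        n = 500
        total = rng.normal(0, 1, n)      # matching variable (scale score)
        group = rng.integers(0, 2, n)    # e.g., sex
        eta = 1.5 * total + 0.6 * group  # 0.6 = injected uniform DIF
        item = rng.binomial(1, 1.0 / (1.0 + np.exp(-eta)))

        base = sm.Logit(item, sm.add_constant(total)).fit(disp=False)
        full = sm.Logit(item, sm.add_constant(
            np.column_stack([total, group]))).fit(disp=False)
        lr = 2 * (full.llf - base.llf)   # ~ chi-square, 1 df, under "no DIF"
        print("LR statistic: %.2f" % lr)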

  9. Assessing residential buildings value in Spain for risk analyses. Application to the landslide hazard in the Autonomous Community of Valencia

    Science.gov (United States)

    Cantarino, I.; Torrijo, F. J.; Palencia, S.; Gielen, E.

    2014-05-01

    This paper proposes a method of valuing the stock of residential buildings in Spain as the first step in assessing possible damage caused to them by natural hazards. For the purposes of the study we had access to the SIOSE (the Spanish Land Use and Cover Information System), a high-resolution land-use model, as well as to a report on the financial valuations of this type of buildings throughout Spain. Using dasymetric disaggregation processes and GIS techniques we developed a geolocalized method of obtaining this information, which was the exposure variable in the general risk assessment formula. If hazard maps and risk assessment methods - the other variables - are available, the risk value can easily be obtained. An example of its application is given in a case study that assesses the risk of a landslide in the entire 23 200 km2 of the Valencia Autonomous Community (NUT2), the results of which are analyzed by municipal areas (LAU2) for the years 2005 and 2009.

  10. Forest sector carbon analyses support land management planning and projects: Assessing the influence of anthropogenic and natural factors

    Science.gov (United States)

    Alexa J. Dugan; Richard Birdsey; Sean P. Healey; Yude Pan; Fangmin Zhang; Gang Mo; Jing Chen; Christopher W. Woodall; Alexander J. Hernandez; Kevin McCullough; James B. McCarter; Crystal L. Raymond; Karen. Dante-Wood

    2017-01-01

    Management of forest carbon stocks on public lands is critical to maintaining or enhancing carbon dioxide removal from the atmosphere. Acknowledging this, an array of federal regulations and policies has emerged that requires US National Forests to report baseline carbon stocks and changes due to disturbance and management and assess how management activities and...

  11. Aid and growth regressions

    DEFF Research Database (Denmark)

    Hansen, Henrik; Tarp, Finn

    2001-01-01

    This paper examines the relationship between foreign aid and growth in real GDP per capita as it emerges from simple augmentations of popular cross country growth specifications. It is shown that aid in all likelihood increases the growth rate, and this result is not conditional on ‘good' policy … investment. We conclude by stressing the need for more theoretical work before this kind of cross-country regressions are used for policy purposes.

  12. Assessment applicability of selected models of multiple discriminant analyses to forecast financial situation of Polish wood sector enterprises

    Directory of Open Access Journals (Sweden)

    Adamowicz Krzysztof

    2017-03-01

    In the last three decades, forecasting the bankruptcy of enterprises has been an important and difficult problem that has served as an impulse for many research projects (Ribeiro et al. 2012). At present many methods of bankruptcy prediction are available. In view of the specific character of economic activity in individual sectors, specialised methods adapted to a given branch of industry are being used increasingly often. An important scientific problem is therefore the indication of an appropriate model or group of models to prepare forecasts for a given branch of industry. Thus research has been conducted to select the model of Multiple Discriminant Analysis (MDA) best adapted to forecasting changes in the wood industry. This study analyses 10 prediction models popular in Poland. The effectiveness of the model proposed by Jagiełło, developed for all industrial enterprises, may be labelled accidental: that model is not adapted to predicting financial changes in wood sector companies in Poland.
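
    As an illustration of the general MDA idea (not of any of the ten Polish models themselves), a linear discriminant can be fitted to a few financial ratios; every ratio, value, and class label below is an invented stand-in.

        # Linear discriminant analysis as a bankruptcy screen on synthetic ratios.
        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        rng = np.random.default_rng(9)
        # Columns: liquidity ratio, profitability ratio, leverage ratio
        healthy = rng.normal([1.8, 0.10, 0.4], 0.3, (60, 3))
        distressed = rng.normal([0.9, -0.05, 0.8], 0.3, (40, 3))
        X = np.vstack([healthy, distressed])
        y = np.array([0] * 60 + [1] * 40)  # 1 = financial distress

        lda = LinearDiscriminantAnalysis().fit(X, y)
        print("discriminant coefficients:", lda.coef_[0])
        print("P(distress), new firm:", lda.predict_proba([[1.2, 0.02, 0.6]])[0, 1])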

  13. A New 3D Tool for Assessing the Accuracy of Bimaxillary Surgery: The OrthoGnathicAnalyser

    Science.gov (United States)

    Xi, Tong; Schreurs, Ruud; de Koning, Martien; Bergé, Stefaan; Maal, Thomas

    2016-01-01

    Aim The purpose of this study was to present and validate an innovative semi-automatic approach to quantify the accuracy of the surgical outcome in relation to 3D virtual orthognathic planning among patients who underwent bimaxillary surgery. Material and Method For the validation of this new semi-automatic approach, CBCT scans of ten patients who underwent bimaxillary surgery were acquired pre-operatively. Individualized 3D virtual operation plans were made for all patients prior to surgery. During surgery, the maxillary and mandibular segments were positioned as planned by using 3D milled interocclusal wafers. Consequently, post-operative CBCT scans were acquired. The 3D rendered pre- and postoperative virtual head models were aligned by voxel-based registration upon the anterior cranial base. To calculate the discrepancies between the 3D planning and the actual surgical outcome, the 3D planned maxillary and mandibular segments were segmented and superimposed upon the postoperative maxillary and mandibular segments. The translation matrices obtained from this registration process were translated into translational and rotational discrepancies between the 3D planning and the surgical outcome, by using the newly developed tool, the OrthoGnathicAnalyser. To evaluate the reproducibility of this method, the process was performed by two independent observers multiple times. Results Low intra-observer and inter-observer variations in measurement error (mean error 0.97) were found, supportive of the observer independent character of the OrthoGnathicAnalyser. The pitch of the maxilla and mandible showed the highest discrepancy between the 3D planning and the postoperative results, 2.72° and 2.75° respectively. Conclusion This novel method provides a reproducible tool for the evaluation of bimaxillary surgery, making it possible to compare larger patient groups in an objective and time-efficient manner in order to optimize the current workflow in orthognathic surgery.

  15. The mental health care model in Brazil: analyses of the funding, governance processes, and mechanisms of assessment

    Directory of Open Access Journals (Sweden)

    Thiago Lavras Trapé

    OBJECTIVE This study aims to analyze the current status of the mental health care model of the Brazilian Unified Health System, according to its funding, governance processes, and mechanisms of assessment. METHODS We have carried out a documentary analysis of the ordinances, technical reports, conference reports, normative resolutions, and decrees from 2009 to 2014. RESULTS This is a time of consolidation of the psychosocial model, with expansion of the health care network and inversion of the funding for community services with a strong emphasis on the area of crack cocaine and other drugs. Mental health is an underfunded area within the chronically underfunded Brazilian Unified Health System. The governance model constrains the progress of essential services, which creates the need for the incorporation of a process of regionalization of the management. The mechanisms of assessment are not incorporated into the health policy in the bureaucratic field. CONCLUSIONS There is a need to expand the global funding of the area of health, specifically mental health, which has been shown to be a successful policy. The current focus of the policy seems to be archaic in relation to the precepts of the psychosocial model. Mechanisms of assessment need to be expanded.

  16. Impacts Analyses Supporting the National Environmental Policy Act Environmental Assessment for the Resumption of Transient Testing Program

    Energy Technology Data Exchange (ETDEWEB)

    Annette L. Schafer; Lloyd C. Brown; David C. Carathers; Boyd D. Christensen; James J. Dahl; Mark L. Miller; Cathy Ottinger Farnum; Steven Peterson; A. Jeffrey Sondrup; Peter V. Subaiya; Daniel M. Wachs; Ruth F. Weiner

    2013-11-01

    Environmental and health impacts are presented for activities associated with transient testing of nuclear fuel and material using two candidate test reactors. Transient testing involves irradiation of nuclear fuel or materials for short periods under high neutron flux rates. The transient testing process includes transportation of nuclear fuel or materials inside a robust shipping cask to a hot cell, removal from the shipping cask, pre-irradiation examination of the nuclear materials, assembly into an experiment assembly, transportation of the experiment assembly to the test reactor, irradiation in the test reactor, transport back to the hot cell, and post-irradiation examination of the nuclear fuel or material. The potential for environmental or health consequences during the transportation, examination, and irradiation actions is assessed for normal operations, off-normal (accident) scenarios, and transportation. Impacts to the environment (air, soil, and groundwater) are assessed during each phase of the transient testing process. This report documents the evaluation of potential consequences to the general public. This document supports the Environmental Assessment (EA) required by the U.S. National Environmental Policy Act (NEPA) (42 USC Subsection 4321 et seq.).

  17. Methods and Techniques Used to Convey Total System Performance Assessment Analyses and Results for Site Recommendation at Yucca Mountain, Nevada, USA

    International Nuclear Information System (INIS)

    Mattie, Patrick D.; McNeish, Jerry A.; Sevougian, S. David; Andrews, Robert W.

    2001-01-01

    Total System Performance Assessment (TSPA) is used as a key decision-making tool for the potential geologic repository of high-level radioactive waste at Yucca Mountain, Nevada, USA. Because of the complexity and uncertainty involved in a post-closure performance assessment, an important goal is to produce a transparent document describing the assumptions, the intermediate steps, the results, and the conclusions of the analyses. An important objective for a TSPA analysis is to illustrate confidence in performance projections of the potential repository given a complex system of interconnected process models, data, and abstractions. The methods and techniques used for the recent TSPA analyses demonstrate an effective process to portray complex models and results with transparency and credibility.

  18. Modified Regression Correlation Coefficient for Poisson Regression Model

    Science.gov (United States)

    Kaengthong, Nattacha; Domthong, Uthumporn

    2017-09-01

    This study gives attention to indicators of the predictive power of the Generalized Linear Model (GLM), which are widely used but often subject to some restrictions. We are interested in the regression correlation coefficient for a Poisson regression model. This is a measure of predictive power defined by the relationship between the dependent variable (Y) and the expected value of the dependent variable given the independent variables [E(Y|X)] for the Poisson regression model, where the dependent variable is Poisson distributed. The purpose of this research was to modify the regression correlation coefficient for the Poisson regression model. We also compare the proposed modified regression correlation coefficient with the traditional regression correlation coefficient in the case of two or more independent variables, and in the presence of multicollinearity among the independent variables. The results show that the proposed regression correlation coefficient is better than the traditional one in terms of bias and root mean square error (RMSE).
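
    For context, the traditional regression correlation coefficient the abstract refers to can be computed as the sample correlation between Y and the fitted values E(Y|X) of a Poisson GLM. A minimal sketch with synthetic data follows; the authors' modified coefficient is not reproduced here.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
X = rng.normal(size=(n, 2))
mu = np.exp(0.3 + 0.5 * X[:, 0] - 0.4 * X[:, 1])   # true Poisson mean
y = rng.poisson(mu)

fit = sm.GLM(y, sm.add_constant(X), family=sm.families.Poisson()).fit()
fitted = fit.fittedvalues                          # estimate of E(Y|X)

# Traditional regression correlation coefficient: corr(Y, E(Y|X)).
r = np.corrcoef(y, fitted)[0, 1]
print(f"regression correlation coefficient: {r:.3f}")
```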

  19. Use of Speech Analyses within a Mobile Application for the Assessment of Cognitive Impairment in Elderly People.

    Science.gov (United States)

    Konig, Alexandra; Satt, Aharon; Sorin, Alex; Hoory, Ran; Derreumaux, Alexandre; David, Renaud; Robert, Phillippe H

    2018-01-01

    Various types of dementia and Mild Cognitive Impairment (MCI) are manifested as irregularities in human speech and language, which have proven to be strong predictors of disease presence and progression. Therefore, automatic speech analytics provided by a mobile application may be a useful tool for providing additional indicators for the assessment and detection of early-stage dementia and MCI. 165 participants (subjects with subjective cognitive impairment (SCI), MCI patients, Alzheimer's disease (AD) and mixed dementia (MD) patients) were recorded with a mobile application while performing several short vocal cognitive tasks during a regular consultation. These tasks included verbal fluency, picture description, counting down, and a free speech task. The voice recordings were processed in two steps: in the first step, vocal markers were extracted using speech signal processing techniques; in the second, the vocal markers were tested to assess their 'power' to distinguish between SCI, MCI, AD and MD. The second step included training automatic classifiers for detecting MCI and AD, based on machine learning methods, and testing the detection accuracy. The fluency and free speech tasks obtained the highest accuracy rates in classifying AD vs. MD vs. MCI vs. SCI. Using the data, we demonstrated classification accuracy as follows: SCI vs. AD = 92%; SCI vs. MD = 92%; SCI vs. MCI = 86%; and MCI vs. AD = 86%. Our results indicate the potential value of vocal analytics and the use of a mobile application for accurate automatic differentiation between SCI, MCI and AD. This tool can provide the clinician with meaningful information for the assessment and monitoring of people with MCI and AD based on a non-invasive, simple and low-cost method. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.
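
    The two-step pipeline described (vocal-marker extraction, then supervised classification with accuracy testing) can be sketched generically. The feature matrix and labels below are synthetic placeholders, not the study's vocal markers or recordings.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
# Placeholder vocal markers (e.g. speech rate, pause ratio, pitch variability).
X = rng.normal(size=(165, 10))      # 165 participants, 10 extracted markers
y = rng.integers(0, 4, size=165)    # synthetic labels: 0=SCI, 1=MCI, 2=AD, 3=MD

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)          # detection-accuracy estimate
print("cross-validated accuracy:", scores.mean().round(3))
```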

  20. Varying geospatial analyses to assess climate risk and adaptive capacity in a hotter, drier Southwestern United States

    Science.gov (United States)

    Elias, E.; Reyes, J. J.; Steele, C. M.; Rango, A.

    2017-12-01

    Assessing the vulnerability of agricultural systems to climate variability and change is vital to securing food systems and sustaining rural livelihoods. Farmers, ranchers, and forest landowners rely on science-based, decision-relevant, and localized information to maintain production, ecological viability, and economic returns. This contribution synthesizes a collection of research on the future of agricultural production in the American Southwest (SW). Research was based on a variety of geospatial methodologies and datasets to assess the vulnerability of rangelands and livestock, field crops, specialty crops, and forests in the SW to climate risk and change. This collection emerged from the development of regional vulnerability assessments for agricultural climate risk by the U.S. Department of Agriculture (USDA) Climate Hub Network, established to deliver science-based information and technologies to enable climate-informed decision-making. Authors defined vulnerability differently based on their agricultural system of interest, although each primarily focuses on biophysical systems. We found that an inconsistent framework for vulnerability and climate risk was necessary to adequately capture the diversity, variability, and heterogeneity of SW landscapes, peoples, and agriculture. Through the diversity of research questions and methodologies, this collection of articles provides valuable information on various aspects of SW vulnerability. All articles relied on geographic information systems technology, with highly variable levels of complexity. Agricultural articles used National Agricultural Statistics Service data, either as tabular county-level summaries or through the CropScape cropland raster datasets. Most relied on modeled historic and future climate information, but with differing assumptions regarding spatial resolution and temporal framework. We assert that it is essential to evaluate climate risk using a variety of complementary methodologies and

  1. Measurement Error in Education and Growth Regressions

    NARCIS (Netherlands)

    Portela, M.; Teulings, C.N.; Alessie, R.

    The perpetual inventory method used for the construction of education data per country leads to systematic measurement error. This paper analyses the effect of this measurement error on GDP regressions. There is a systematic difference in the education level between census data and observations

  2. Measurement error in education and growth regressions

    NARCIS (Netherlands)

    Portela, Miguel; Teulings, Coen; Alessie, R.

    2004-01-01

    The perpetual inventory method used for the construction of education data per country leads to systematic measurement error. This paper analyses the effect of this measurement error on GDP regressions. There is a systematic difference in the education level between census data and observations

  3. Panel data specifications in nonparametric kernel regression

    DEFF Research Database (Denmark)

    Czekaj, Tomasz Gerard; Henningsen, Arne

    parametric panel data estimators to analyse the production technology of Polish crop farms. The results of our nonparametric kernel regressions generally differ from the estimates of the parametric models but they only slightly depend on the choice of the kernel functions. Based on economic reasoning, we...

  4. Canonical variate regression.

    Science.gov (United States)

    Luo, Chongliang; Liu, Jin; Dey, Dipak K; Chen, Kun

    2016-07-01

    In many fields, multi-view datasets, measuring multiple distinct but interrelated sets of characteristics on the same set of subjects, together with data on certain outcomes or phenotypes, are routinely collected. The objective in such a problem is often two-fold: both to explore the association structures of multiple sets of measurements and to develop a parsimonious model for predicting the future outcomes. We study a unified canonical variate regression framework to tackle the two problems simultaneously. The proposed criterion integrates multiple canonical correlation analysis with predictive modeling, balancing between the association strength of the canonical variates and their joint predictive power on the outcomes. Moreover, the proposed criterion seeks multiple sets of canonical variates simultaneously to enable the examination of their joint effects on the outcomes, and is able to handle multivariate and non-Gaussian outcomes. An efficient algorithm based on variable splitting and Lagrangian multipliers is proposed. Simulation studies show the superior performance of the proposed approach. We demonstrate the effectiveness of the proposed approach in an intercross mice study and an alcohol dependence study. © The Author 2016. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
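
    A much-simplified, two-step stand-in for the unified criterion (extract canonical variates with ordinary CCA, then regress the outcome on them) might look as follows; the authors' joint criterion and variable-splitting/Lagrangian algorithm are not reproduced.

```python
import numpy as np
from sklearn.cross_decomposition import CCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(2)
n = 200
X1 = rng.normal(size=(n, 8))                                        # first view
X2 = X1[:, :3] @ rng.normal(size=(3, 6)) + rng.normal(size=(n, 6))  # second view
y = X1[:, 0] + X1[:, 1] + rng.normal(size=n)                        # outcome

cca = CCA(n_components=2).fit(X1, X2)
U, V = cca.transform(X1, X2)        # canonical variates of each view

# Step two: predict the outcome from the canonical variates.
reg = LinearRegression().fit(np.hstack([U, V]), y)
print("R^2 on training data:", round(reg.score(np.hstack([U, V]), y), 3))
```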

  5. Coupled ²³⁰Th/²³⁴U-ESR analyses for corals: A new method to assess sealevel change

    International Nuclear Information System (INIS)

    Blackwell, Bonnie A.B.; Teng, Steve J.T.; Lundberg, Joyce A.; Blickstein, Joel I.B.; Skinner, Anne R.

    2007-01-01

    Although coupled ²³⁰Th/²³⁴U-ESR analyses have become routine for dating teeth, they have never been used for corals. While the ESR age depends on, and requires assumptions about, the time-averaged cosmic dose rate, D̄_cos(t), ²³⁰Th/²³⁴U dates do not. Since the D̄_cos(t) received by corals depends on the attenuation by any intervening material, the D̄_cos(t) response reflects changing water depths and sediment cover. By coupling the two methods, one can determine the age and a unique D̄_cos,coupled(t) simultaneously. From a coral's water depth and sedimentary history as predicted by a given sealevel curve, one can predict D̄_cos,sealevel(t). If D̄_cos,coupled(t) agrees well with D̄_cos,sealevel(t), this provides independent validation for the curve used to build D̄_cos,sealevel(t). For six corals dated at 7-128 ka from Florida Platform reef crests, the sealevel curve by Waelbroeck et al. [2002. Sea-level and deep water temperature changes derived from benthonic foraminifera isotopic records. Quat. Sci. Rev. 21, 295-305] predicted their D̄_cos,coupled(t) values as well as, or better than, the SPECMAP sealevel curve. Where a whole reef can be sampled over a transect, a precise test for sealevel curves could be developed.
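
    The coupling logic reduces to one line of arithmetic: the ESR accumulated dose equals the time integral of the total dose rate, so once the 230Th/234U age is fixed, the time-averaged cosmic dose rate follows by subtraction. All numbers in this simplified sketch are hypothetical, and the internal dose-rate term stands in for the full U-series dose modeling.

```python
# Hypothetical inputs: ESR accumulated dose and an internal dose rate.
accumulated_dose = 15.0     # ESR accumulated dose (Gy)
age = 100.0                 # independent 230Th/234U age (ka)
internal_rate = 0.12        # U-series internal dose rate (Gy/ka)

total_rate = accumulated_dose / age           # mean total dose rate (Gy/ka)
dbar_cos_coupled = total_rate - internal_rate
print(f"D-bar_cos,coupled = {dbar_cos_coupled:.3f} Gy/ka")
```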

  6. Long-term impact of farm management and crops on soil microorganisms assessed by combined DGGE and PLFA analyses.

    Science.gov (United States)

    Stagnari, Fabio; Perpetuini, Giorgia; Tofalo, Rosanna; Campanelli, Gabriele; Leteo, Fabrizio; Della Vella, Umberto; Schirone, Maria; Suzzi, Giovanna; Pisante, Michele

    2014-01-01

    In the present study, long-term organic and conventional managements were compared at the experimental field of Monsampolo del Tronto (Marche region, Italy) with the aim of investigating soil chemical fertility and microbial community structure. A polyphasic approach, combining soil fertility indicators with microbiological analyses (plate counts, PCR-denaturing gradient gel electrophoresis [DGGE] and phospholipid fatty acid analysis [PLFA]), was applied. Organic matter, N, as well as some macro- and micronutrients (K, P, Mg, Mn, Cu, and Zn) important for crop growth, were more available under organic management. Bacterial counts were higher under organic management. A significant influence of the management system and the management × crop interaction was observed for total mesophilic bacteria, nitrogen-fixing bacteria and actinobacteria. Interestingly, cultivable fungi were not detected in any of the analyzed samples. PLFA biomass was higher in the organic system, and Gram-positive bacteria dominated the microbial community in both systems. Even if fungal biomass was higher under organic management, fungal PCR-DGGE fingerprinting revealed that the two systems were very similar in terms of fungal species, suggesting that 10 years were not enough to establish a new dynamic equilibrium among ecosystem components. A better knowledge of soil biota, and in particular of fungal community structure, will be useful for the development of sustainable management strategies.

  7. Coupled ²³⁰Th/²³⁴U-ESR analyses for corals: A new method to assess sealevel change

    Energy Technology Data Exchange (ETDEWEB)

    Blackwell, Bonnie A.B. [Department of Chemistry, Williams College, Williamstown, MA 01267 (United States); RFK Science Research Institute, Glenwood Landing, NY 11547 (United States)], E-mail: bonnie.a.b.blackwell@williams.edu; Teng, Steve J.T. [RFK Science Research Institute, Glenwood Landing, NY 11547 (United States)], E-mail: mteng1584@gmail.com; Lundberg, Joyce A. [Department of Geography and Environmental Studies, Carleton University, Ottawa, ON, Canada K1S 5B6 (Canada)], E-mail: joyce_lundberg@carleton.ca; Blickstein, Joel I.B. [Department of Chemistry, Williams College, Williamstown, MA 01267 (United States); RFK Science Research Institute, Glenwood Landing, NY 11547 (United States); Skinner, Anne R. [Department of Chemistry, Williams College, Williamstown, MA 01267 (United States); RFK Science Research Institute, Glenwood Landing, NY 11547 (United States)], E-mail: anne.r.skinner@williams.edu

    2007-07-15

    Although coupled ²³⁰Th/²³⁴U-ESR analyses have become routine for dating teeth, they have never been used for corals. While the ESR age depends on, and requires assumptions about, the time-averaged cosmic dose rate, D̄_cos(t), ²³⁰Th/²³⁴U dates do not. Since the D̄_cos(t) received by corals depends on the attenuation by any intervening material, the D̄_cos(t) response reflects changing water depths and sediment cover. By coupling the two methods, one can determine the age and a unique D̄_cos,coupled(t) simultaneously. From a coral's water depth and sedimentary history as predicted by a given sealevel curve, one can predict D̄_cos,sealevel(t). If D̄_cos,coupled(t) agrees well with D̄_cos,sealevel(t), this provides independent validation for the curve used to build D̄_cos,sealevel(t). For six corals dated at 7-128 ka from Florida Platform reef crests, the sealevel curve by Waelbroeck et al. [2002. Sea-level and deep water temperature changes derived from benthonic foraminifera isotopic records. Quat. Sci. Rev. 21, 295-305] predicted their D̄_cos,coupled(t) values as well as, or better than, the SPECMAP sealevel curve. Where a whole reef can be sampled over a transect, a precise test for sealevel curves could be developed.

  8. Integration of life cycle assessment software with tools for economic and sustainability analyses and process simulation for sustainable process design

    DEFF Research Database (Denmark)

    Kalakul, Sawitree; Malakul, Pomthong; Siemanond, Kitipat

    2014-01-01

    The sustainable future of the world challenges engineers to develop chemical process designs that are not only technically and economically feasible but also environmental friendly. Life cycle assessment (LCA) is a tool for identifying and quantifying environmental impacts of the chemical product...... with other process design tools such as sustainable design (SustainPro), economic analysis (ECON) and process simulation. The software framework contains four main tools: Tool-I is for life cycle inventory (LCI) knowledge management that enables easy maintenance and future expansion of the LCI database; Tool...... and/or the process that makes it. It can be used in conjunction with process simulation and economic analysis tools to evaluate the design of any existing and/or new chemical-biochemical process and to propose improvement options in order to arrive at the best design among various alternatives...

  9. Considerations for Probabilistic Analyses to Assess Potential Changes to Large-Break LOCA Definition for ECCS Requirements

    International Nuclear Information System (INIS)

    Wilkowski, G.; Rudland, D.; Wolterman, R.; Krishnaswamy, P.; Scott, P.; Rahman, S.; Fairbanks, C.

    2002-01-01

    The U.S. NRC has undertaken a study to explore changes to the body of Part 50 of the U.S. Federal Code of Regulations, to incorporate risk-informed attributes. One of the regulations selected for this study is 10 CFR 50.46, 'Acceptance Criteria for Emergency Core Cooling Systems for Light-Water Nuclear Power Reactors'. These changes will potentially enhance safety and reduce unnecessary burden on utilities. Specific attention is being paid to redefining the maximum pipe break size for LB-LOCA by determining the spectrum of pipe diameter (or equivalent opening area) versus failure probabilities. In this regard, it is necessary to ensure that all contributors to probabilistic failures are accounted for when redefining ECCS requirements. This paper describes initial efforts being conducted for the U.S. NRC on redefining the LB-LOCA requirements. Consideration of the major contributors to probabilistic failure, and deterministic aspects for modeling them, are being addressed. At this time three major contributors to probabilistic failures are being considered. These include: (1) Analyses of the failure probability from cracking mechanisms that could involve rupture or large opening areas from either through-wall or surface flaws, whether the pipe system was approved for leak-before-break (LBB) or not. (2) Future degradation mechanisms, such as the recent occurrence of PWSCC in PWR piping, need to be included. This degradation mechanism was not recognized as being an issue when LBB was approved for many plants or when the initial risk-informed inspection plans were developed. (3) Other indirect causes of loss of pressure-boundary integrity than from cracks in the pipe system also should be included. The failure probability from probabilistic fracture mechanics will not account for these other indirect causes that could result in a large opening in the pressure boundary: i.e., failure of bolts on a steam generator manway, flanges, and valves; outside force damage from the

  10. The mental health care model in Brazil: analyses of the funding, governance processes, and mechanisms of assessment.

    Science.gov (United States)

    Trapé, Thiago Lavras; Campos, Rosana Onocko

    2017-03-23

    This study aims to analyze the current status of the mental health care model of the Brazilian Unified Health System, according to its funding, governance processes, and mechanisms of assessment. We have carried out a documentary analysis of the ordinances, technical reports, conference reports, normative resolutions, and decrees from 2009 to 2014. This is a time of consolidation of the psychosocial model, with expansion of the health care network and inversion of the funding for community services with a strong emphasis on the area of crack cocaine and other drugs. Mental health is an underfunded area within the chronically underfunded Brazilian Unified Health System. The governance model constrains the progress of essential services, which creates the need for the incorporation of a process of regionalization of the management. The mechanisms of assessment are not incorporated into the health policy in the bureaucratic field. There is a need to expand the global funding of the area of health, specifically mental health, which has been shown to be a successful policy. The current focus of the policy seems to be archaic in relation to the precepts of the psychosocial model. Mechanisms of assessment need to be expanded.

  11. Assessing the ecological long-term impact of wastewater irrigation on soil and water based on bioassays and chemical analyses.

    Science.gov (United States)

    Richter, Elisabeth; Hecht, Fabian; Schnellbacher, Nadine; Ternes, Thomas A; Wick, Arne; Wode, Florian; Coors, Anja

    2015-11-01

    The reuse of treated wastewater for irrigation and groundwater recharge can counteract water scarcity and reduce pollution of surface waters, but assessing its environmental risk should likewise consider effects associated to the soil. The present study therefore aimed at determining the impact of wastewater irrigation on the habitat quality of water after soil passage and of soil after percolation by applying bioassays and chemical analysis. Lab-scale columns of four different soils encompassing standard European soil and three field soils of varying characteristics and pre-contamination were continuously percolated with treated wastewater to simulate long-term irrigation. Wastewater and its percolates were tested for immobilization of Daphnia magna and growth inhibition of green algae (Pseudokirchneriella subcapitata) and water lentils (Lemna minor). The observed phytotoxicity of the treated wastewater was mostly reduced by soil passage, but in some percolates also increased for green algae. Chemical analysis covering an extensive set of wastewater-born organic pollutants demonstrated that many of them were considerably reduced by soil passage, particularly through peaty soils. Taken together, these results indicated that wastewater-born phytotoxic substances may be removed by soil passage, while existing soil pollutants (e.g. metals) may leach and impair percolate quality. Soils with and without wastewater irrigation were tested for growth of plants (Avena sativa, Brassica napus) and soil bacteria (Arthrobacter globiformis) and reproduction of collembolans (Folsomia candida) and oligochaetes (Enchytraeus crypticus, Eisenia fetida). The habitat quality of the standard and two field soils appeared to be deteriorated by wastewater percolation for at least one organism (enchytraeids, plants or bacteria), while for two pre-contaminated field soils it also was improved (for plants and/or enchytraeids). Wastewater percolation did not seem to raise soil concentrations

  12. On-site phytoremediation applicability assessment in Alur Ilmu, Universiti Kebangsaan Malaysia based on spatial and pollution removal analyses.

    Science.gov (United States)

    Mahmud, Mohd Hafiyyan; Lee, Khai Ern; Goh, Thian Lai

    2017-10-01

    The present paper aims to assess phytoremediation performance, based on the pollution removal efficiency of the highly polluted region of the Alur Ilmu urban river, for its applicability to on-site treatment. Thirteen stations along Alur Ilmu were selected to produce thematic maps through spatial distribution analysis based on six water quality parameters of Malaysia's Water Quality Index (WQI) for the dry and rainy seasons. The maps generated were used to identify the highly polluted region for the phytoremediation applicability assessment. Four free-floating plants were tested in treating water samples from the highly polluted region under three different conditions, namely controlled, aerated and normal treatments. The selected free-floating plants were water hyacinth (Eichhornia crassipes), water lettuce (Pistia stratiotes), rose water lettuce (Pistia sp.) and pennywort (Centella asiatica). The results showed that Alur Ilmu was more polluted during the dry season than the rainy season based on the water quality analysis. During the dry season, four parameters were marked as polluted along Alur Ilmu, namely dissolved oxygen (DO), 4.72 mg/L (class III); ammoniacal nitrogen (NH3-N), 0.85 mg/L (class IV); total suspended solids (TSS), 402 mg/L (class V); and biological oxygen demand (BOD), 3.89 mg/L (class III), whereas two parameters were classed as polluted during the rainy season, namely total suspended solids (TSS), 571 mg/L (class V) and biological oxygen demand (BOD), 4.01 mg/L (class III). The thematic maps generated from spatial distribution analysis using the Kriging gridding method showed that the highly polluted region was recorded at station AL 5. Hence, water samples were taken from this station for the pollution removal analysis. All the free-floating plants were able to reduce TSS and COD in less than 14 days. However, water hyacinth showed the least detrimental effect from the phytoremediation process compared to the other free-floating plants, thus made it a suitable
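
    The gridding step (the study used a Kriging method to interpolate the thirteen stations into thematic maps) can be sketched with the pykrige package; the station coordinates, BOD readings and variogram model below are invented.

```python
import numpy as np
from pykrige.ok import OrdinaryKriging

# Invented station coordinates (km) and BOD readings (mg/L).
x = np.array([0.1, 0.5, 1.2, 1.8, 2.4, 3.0, 3.7])
y = np.array([0.2, 0.9, 0.7, 1.5, 1.1, 2.0, 2.6])
bod = np.array([2.1, 2.8, 3.9, 4.0, 3.4, 2.9, 2.5])

ok = OrdinaryKriging(x, y, bod, variogram_model="spherical")
gridx = np.linspace(0.0, 4.0, 40)
gridy = np.linspace(0.0, 3.0, 30)
z, ss = ok.execute("grid", gridx, gridy)   # interpolated surface + variance
print("max interpolated BOD:", float(z.max()))
```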

  13. Preliminary assessment of late quaternary vegetation and climate of southeastern Utah based on analyses of packrat middens

    International Nuclear Information System (INIS)

    Betancourt, J.L.; Biggar, N.

    1985-06-01

    Packrat midden sequences from two caves (elevations 1585 and 2195 m; 5200 and 7200 ft) southwest of the Abajo Mountains in southeast Utah record vegetation changes that are attributed to climatic changes occurring during the last 13,000 years. These data are useful in assessing potential future climates at proposed nuclear waste sites in the area. Paleoclimates are reconstructed by defining modern elevational analogs for the vegetation assemblages identified in the middens. Based on the midden record, a climate most extreme from the present occurred prior to approximately 10,000 years before present (BP), when mean annual temperature was probably 3 to 4°C (5.5 to 7°F) cooler than present. However, cooling could not have exceeded 5°C (9°F) at 1585 m (5200 ft). Accompanying mean annual precipitation is estimated to have been from 35 to 140% greater than at present, with rainfall concentrated in the winter months. Vegetational changes beginning approximately 10,000 years BP are attributed to increased summer and mean annual temperatures, a decreasing frequency of spring freezes, and a shift from winter- to summer-dominant rainfall. Greater effective moisture than present is inferred at both cave sites from approximately 8000 to 4000 years BP. Modern flora was present at both sites by about 2000 years BP.

  14. A method for assessing the regional vibratory pattern of vocal folds by analysing the video recording of stroboscopy.

    Science.gov (United States)

    Lee, J S; Kim, E; Sung, M W; Kim, K H; Sung, M Y; Park, K S

    2001-05-01

    Stroboscopy and kymography have been used to examine motional abnormalities of the vocal folds and to visualise their regional vibratory pattern. In a previous study (Laryngoscope, 1999), we introduced the conceptual idea of videostrobokymography, in which we applied the concept of kymography to pre-recorded video images obtained using stroboscopy, and showed its possible clinical application to various disorders of the vocal folds. However, a more detailed description of the software and the mathematical formulation used in this system is needed for the reproduction of similar systems. The hardware configuration, user interface, and detailed procedures, including the mathematical equations in the videostrobokymography software, are presented in this study. As an initial clinical trial, videostrobokymography was applied to the preoperative and postoperative videostroboscopic images of 15 patients with Reinke's edema. On preoperative examination, videostrobokymograms showed an irregular pattern of mucosal wave and, in some patients, a relatively constant glottic gap during phonation. After the operation, the voice quality of all patients was improved in acoustic and aerodynamic assessments, and videostrobokymography showed clearly improved mucosal waves (change in open quotient: mean ± SD = 0.11 ± 0.05).
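
    The core of videostrobokymography, stacking one fixed scanline from every video frame into a time-versus-position image of the mucosal wave, can be sketched with OpenCV. The file name and scanline row are placeholders, and this omits the system's calibration and open-quotient computation.

```python
import cv2
import numpy as np

def extract_kymogram(video_path, row):
    """Stack one fixed horizontal scanline from every frame into a kymogram:
    time runs down the image, position along the scanline runs across it."""
    cap = cv2.VideoCapture(video_path)
    lines = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        lines.append(gray[row, :])
    cap.release()
    return np.vstack(lines)

# kymo = extract_kymogram("stroboscopy.avi", row=240)   # hypothetical recording
```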

  15. Assessment of the quality of water by hierarchical cluster and variance analyses of the Koudiat Medouar Watershed, East Algeria

    Science.gov (United States)

    Tiri, Ammar; Lahbari, Noureddine; Boudoukha, Abderrahmane

    2017-12-01

    The assessment of surface water in the Koudiat Medouar watershed is very important, especially when it comes to pollution of the dam waters by discharges of wastewater from neighboring towns into Oued Timgad, which pours into the basin of the dam, and by agricultural lands located along the Oued Reboa. To this end, a multivariable method was used to evaluate the spatial and temporal variation of the surface water quality of the Koudiat Medouar dam, eastern Algeria. The Stiff diagram identified two main hydrochemical facies. The first facies, Mg-HCO3, is reflected at the first sampling station (Oued Reboa) and the second one (Oued Timgad), while the second facies, Mg-SO4, is reflected at the third station (Basin Dam). The results obtained by the analysis of variance show that at the three stations all parameters are significant, except for Na, K and HCO3 at the first station (Oued Reboa), the EC at the second station (Oued Timgad), and NO3 and pH at the third station (Basin Dam). Q-mode hierarchical cluster analysis revealed two main groups at each sampling station. The chemistry of major ions (Mg, Ca, HCO3 and SO4) within the three stations results from anthropogenic impacts and water-rock interaction sources.
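
    The Q-mode step (clustering the water samples rather than the variables) can be sketched with SciPy; the sample-by-parameter matrix below is synthetic stand-in data for a single station.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(3)
# Synthetic samples x parameters (e.g. Mg, Ca, HCO3, SO4, Na, K) for one station.
samples = np.vstack([rng.normal(0.0, 1.0, size=(10, 6)),
                     rng.normal(3.0, 1.0, size=(10, 6))])

# Q-mode: the rows (samples) are clustered, here with Ward linkage on z-scores.
z = (samples - samples.mean(axis=0)) / samples.std(axis=0)
tree = linkage(z, method="ward")
groups = fcluster(tree, t=2, criterion="maxclust")
print("cluster sizes:", np.bincount(groups)[1:])   # expect two main groups
```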

  16. Assessment of the effects of cage fish-farming on damselfish-associated food chains using stable-isotope analyses

    International Nuclear Information System (INIS)

    Jan, Rong-Quen; Kao, Shuh-Ji; Dai, Chang-Feng; Ho, Cheng-Tze

    2014-01-01

    Highlights: • Damselfishes living at sites near a cage farm bore lower δ¹³C and higher δ¹⁵N. • Similar trends occurred in zooplankton and detritus, major foods for damselfishes. • δ¹⁵N enrichment in fish may have arisen from the uptake of excess feed and prey. • Farm wastes were documented entering the ecosystem through the pelagic food chain. • No clear evidence of the effects of cage farming on stable isotopes in macroalgae. - Abstract: To assess the effect of cage fish-farming on the coral reef ecosystem off Xiaoliuchiu Island, southern Taiwan, geographical differences in the food chain of each of two damselfishes, Pomacentrus vaiuli and Chromis margaritifer, were examined using a stable-isotope approach. For each damselfish, individuals were found to consume similar foods at all sites. However, specimens collected at sites near the cage farm (the experimental sites) exhibited lower δ¹³C and higher δ¹⁵N signatures compared to those from reference sites. Similar trends also occurred in the zooplankton and detritus, two major food sources for both damselfishes. This finding indicates that particulate organic matter released by the farm may have entered the coral reef ecosystem through the pelagic food chain. Artificial reef emplacement is recommended to provide extra habitats under cage farms to support additional pelagic-feeding fish populations, thereby reducing the environmental impacts of cage farming on coral reefs.

  17. The enhanced value of combining conventional and 'omics' analyses in early assessment of drug-induced hepatobiliary injury

    International Nuclear Information System (INIS)

    Ellinger-Ziegelbauer, Heidrun; Adler, Melanie; Amberg, Alexander; Brandenburg, Arnd; Callanan, John J.; Connor, Susan; Fountoulakis, Michael; Gmuender, Hans; Gruhler, Albrecht; Hewitt, Philip; Hodson, Mark; Matheis, Katja A.; McCarthy, Diane; Raschke, Marian; Riefke, Bjoern; Schmitt, Christina S.; Sieber, Max; Sposny, Alexandra; Suter, Laura; Sweatman, Brian

    2011-01-01

    The InnoMed PredTox consortium was formed to evaluate whether conventional preclinical safety assessment can be significantly enhanced by incorporation of molecular profiling ('omics') technologies. In short-term toxicological studies in rats, transcriptomics, proteomics and metabolomics data were collected and analyzed in relation to routine clinical chemistry and histopathology. Four of the sixteen hepato- and/or nephrotoxicants given to rats for 1, 3, or 14 days at two dose levels induced similar histopathological effects. These were characterized by bile duct necrosis and hyperplasia and/or increased bilirubin and cholestasis, in addition to hepatocyte necrosis and regeneration, hepatocyte hypertrophy, and hepatic inflammation. Combined analysis of liver transcriptomics data from these studies revealed common gene expression changes which allowed the development of a potential sequence of events on a mechanistic level in accordance with classical endpoint observations. This included genes implicated in early stress responses, regenerative processes, inflammation with inflammatory cell immigration, fibrotic processes, and cholestasis encompassing deregulation of certain membrane transporters. Furthermore, a preliminary classification analysis using transcriptomics data suggested that prediction of cholestasis may be possible based on gene expression changes seen at earlier time-points. Targeted bile acid analysis, based on LC-MS metabonomics data demonstrating increased levels of conjugated or unconjugated bile acids in response to individual compounds, did not provide earlier detection of toxicity as compared to conventional parameters, but may allow distinction of different types of hepatobiliary toxicity. Overall, liver transcriptomics data delivered mechanistic and molecular details in addition to the classical endpoint observations which were further enhanced by targeted bile acid analysis using LC/MS metabonomics.

  18. Assessing the 'Waste Hierarchy': a social cost-benefit analysis of MSW management in the European Union

    International Nuclear Information System (INIS)

    Brisson, I. E.

    1997-01-01

    This paper discusses, in the context of an impending 'waste crisis', the concept of optimal waste generation and an optimal mix of municipal solid waste (MSW) management methods. It argues that excessive quantities of MSW are likely to be generated, and consequently excessive demand for waste services will exist, as long as the marginal cost of waste services facing the household is zero. In order to avoid this excess demand, households should be charged for waste services according to their use, and not, as at present, at a flat rate. The price paid by householders should include financial as well as external costs. With respect to the optimal mix of MSW management methods, the paper asserts that this would be attained when the marginal net social costs of all management methods were equal. After setting out the theoretical background, the paper proceeds to undertake a social cost-benefit analysis of the waste management methods currently employed by the 12 'old' European Union Member States, including the external and financial costs of landfill, incineration, recycling and composting. The estimates obtained from this analysis are used to assess the validity of the 'waste hierarchy', which has won widespread acceptance and is used as a guideline in a number of countries' waste policies. In the light of the widespread focus on increasing recycling efforts, a sensitivity analysis is carried out to ascertain whether particular materials are especially suited to recycling, and whether there are other materials for which recycling should not be encouraged. (au) 16 refs

  19. On Rigorous Drought Assessment Using Daily Time Scale: Non-Stationary Frequency Analyses, Revisited Concepts, and a New Method to Yield Non-Parametric Indices

    Directory of Open Access Journals (Sweden)

    Charles Onyutha

    2017-10-01

    Full Text Available Some of the problems in drought assessments are that analyses tend to focus on coarse temporal scales, many of the methods yield skewed indices, a few terminologies are used ambiguously, and analyses carry an implicit assumption that the observations come from a stationary process. To solve these problems, this paper introduces non-stationary frequency analyses of quantiles. How to use non-parametric rescaling to obtain robust indices that are not (or are only minimally) skewed is also introduced. To avoid ambiguity, some concepts on, e.g., incidence, extremity, etc., were revisited through a shift from the monthly to the daily time scale. Demonstrations of the introduced methods were made using daily flow and precipitation insufficiency (precipitation minus potential evapotranspiration) from the Blue Nile basin in Africa. Results show that, when a significant trend exists in extreme events, stationarity-based quantiles can be far different from those obtained when non-stationarity is considered. The introduced non-parametric indices were found to agree closely with the well-known standardized precipitation evapotranspiration indices in many aspects except skewness. Apart from revisiting some concepts, the advantages of using fine instead of coarse time scales in drought assessment are given. Links to freely downloadable tools for implementing the introduced methods are provided.
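
    A minimal version of the non-parametric rescaling idea (map each value to its empirical cumulative probability, then through the standard normal quantile function, so that the resulting index is essentially unskewed) might look like this; it is a generic sketch using Weibull plotting positions, not the paper's exact procedure.

```python
import numpy as np
from scipy.stats import norm, rankdata

def nonparametric_index(x):
    """Rescale values to a standard-normal-like index via empirical
    non-exceedance probabilities (Weibull plotting positions), with no
    fitted parametric distribution and hence no distribution-induced skew."""
    p = rankdata(x) / (len(x) + 1.0)
    return norm.ppf(p)

rng = np.random.default_rng(4)
deficit = rng.gamma(2.0, 1.5, size=3650) - 3.0   # synthetic daily P minus PET
index = nonparametric_index(deficit)
skew = float(((index - index.mean()) ** 3).mean() / index.std() ** 3)
print("index skewness (near zero):", round(skew, 3))
```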

  20. Polynomial regression analysis and significance test of the regression function

    International Nuclear Information System (INIS)

    Gao Zhengming; Zhao Juan; He Shengping

    2012-01-01

    In order to analyze the decay heating power per kilogram of a certain radioactive isotope with the polynomial regression method, the paper first demonstrates the broad applicability of polynomial functions and derives their parameters by ordinary least squares estimation. A significance test method for the polynomial regression function is then derived, exploiting the similarity between the polynomial regression model and the multivariable linear regression model. Finally, polynomial regression analysis and a significance test of the polynomial function are applied to the decay heating power of the isotope per kilogram, in accordance with the authors' real work. (authors)
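
    A minimal sketch of the two steps (fit the polynomial by ordinary least squares, then F-test the regression function against a constant-only model) is given below with synthetic decay-heat data; the paper's measurements are not reproduced.

```python
import numpy as np
from scipy.stats import f as f_dist

# Synthetic decay-heat data (W/kg) versus time; not the paper's measurements.
rng = np.random.default_rng(5)
t = np.linspace(0.0, 10.0, 30)
y = 5.0 - 1.2 * t + 0.08 * t**2 + rng.normal(0.0, 0.2, t.size)

k = 2                                    # polynomial degree
X = np.vander(t, k + 1)                  # design matrix with columns t^2, t, 1
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
yhat = X @ beta

# F-test of the regression: H0 says all non-constant coefficients are zero.
ss_reg = np.sum((yhat - y.mean()) ** 2)
ss_res = np.sum((y - yhat) ** 2)
F = (ss_reg / k) / (ss_res / (t.size - k - 1))
p = f_dist.sf(F, k, t.size - k - 1)
print(f"F = {F:.1f}, p = {p:.2e}")
```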

  1. Recursive Algorithm For Linear Regression

    Science.gov (United States)

    Varanasi, S. V.

    1988-01-01

    Order of model determined easily. Linear-regression algorithm includes recursive equations for coefficients of model of increased order. Algorithm eliminates duplicative calculations and facilitates search for minimum order of linear-regression model that fits set of data satisfactorily.
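
    The recursive update equations themselves are in the report; as a plain, non-recursive reference for what the order search accomplishes, one can fit models of increasing order and stop when the residual stops improving meaningfully. The tolerance and data below are illustrative.

```python
import numpy as np

def minimum_order_fit(x, y, max_order=10, tol=0.05):
    """Return the lowest polynomial order whose RMSE improves on the previous
    order by less than `tol` (relative): a brute-force stand-in for the
    recursive search, recomputing every fit instead of updating it."""
    prev_rmse, prev_coef = np.inf, None
    for k in range(max_order + 1):
        coef = np.polyfit(x, y, k)
        rmse = np.sqrt(np.mean((np.polyval(coef, x) - y) ** 2))
        if prev_rmse - rmse < tol * prev_rmse:
            return k - 1, prev_coef
        prev_rmse, prev_coef = rmse, coef
    return max_order, prev_coef

rng = np.random.default_rng(6)
x = np.linspace(-1.0, 1.0, 50)
y = 1.0 + 2.0 * x - 3.0 * x**2 + rng.normal(0.0, 0.05, x.size)
order, coef = minimum_order_fit(x, y)
print("selected order:", order)   # expect 2 for this quadratic dataset
```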

  2. On the applicability of probabilistic analyses to assess the structural reliability of materials and components for solid-oxide fuel cells

    Energy Technology Data Exchange (ETDEWEB)

    Lara-Curzio, Edgar [ORNL; Radovic, Miladin [Texas A& M University; Luttrell, Claire R [ORNL

    2016-01-01

    The applicability of probabilistic analyses to assess the structural reliability of materials and components for solid-oxide fuel cells (SOFC) is investigated by measuring the failure rate of Ni-YSZ when subjected to a temperature gradient and comparing it with that predicted using the Ceramics Analysis and Reliability Evaluation of Structures (CARES) code. A temperature gradient was used to induce stresses because temperature gradients resulting from gas flow patterns generate stresses during SOFC operation that are likely to control the structural reliability of cell components. The magnitude of the predicted failure rate was found to be comparable to that determined experimentally, which suggests that such probabilistic analyses are appropriate for predicting the structural reliability of materials and components for SOFCs. Considerations for performing more comprehensive studies are discussed.
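
    The probabilistic core of a code such as CARES is Weibull weakest-link statistics: the component failure probability integrates the tensile stress field against a two-parameter Weibull law. A hedged, element-wise sketch with made-up Ni-YSZ parameters follows.

```python
import numpy as np

def weibull_failure_probability(stress, volume, m, sigma0):
    """Weakest-link failure probability for a body discretized into elements
    with tensile `stress` (MPa) and `volume` (mm^3), under a two-parameter
    Weibull law with modulus m and scale sigma0."""
    s = np.clip(stress, 0.0, None)              # only tension contributes
    risk = np.sum(volume * (s / sigma0) ** m)   # risk-of-rupture sum
    return 1.0 - np.exp(-risk)

# Hypothetical Ni-YSZ element stresses from a thermal-gradient analysis.
stress = np.array([40.0, 55.0, 62.0, 75.0, 90.0])   # MPa
volume = np.full(5, 2.0)                            # mm^3 per element
print(weibull_failure_probability(stress, volume, m=7.0, sigma0=120.0))
```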

  3. Human imprint on archaeological anthroposols: first assessment of combined micromorphological, pedological and lipid biomarker analyses of organic matter

    Science.gov (United States)

    Cammas, Cécilia; Thuy Nguyen Tu, Thanh; Plessis, Marion; Clotuche, Raphaël; Derenne, Sylvie

    2013-04-01

    Archaeological anthroposol matrix contains significant amounts of fine organic matter (OM), which can yield archaeological information. Geoarchaeological studies of OM aim to reveal its origin in order to reconstruct past human activities. Such studies are complex because the nature and abundance of OM are the result of human activities together with natural processes. Also, OM evolves over time, a process that is not well understood. Combining complementary approaches may give further insights into the human imprint on archaeological anthroposols. For example, micromorphology gives data on in situ activities and pedological processes, with the result that components of animal and vegetal origin can be identified, but not some amorphous/fibrous material and very fine residues (pedo-sedimentary history and OM preservation. Two tanning pits in urban craft areas were selected for sampling, as they are likely to contain large amounts of organic matter of vegetal and animal origin. The pit of Saint-Denis (SDN, 10 km north of Paris, calcareous alluvium, 13th c. AD) was a reference tanning pit. The pit of Famars (FAM, near the Belgian border, luvisols, Roman period) was hypothesized to be part of the tanning process. To assess the preservation of organic components and molecules in relation to the pedo-sedimentary context and their potential as biomarkers of human activities, the methodology combined micromorphology, pedological analyses (C, N, LOI, total P, organic and inorganic phosphorus) and lipid analysis by GC/MS, lipids having a high preservation potential and containing biomarkers indicative of OM origin. The micromorphological study showed a high amount and diversity of organic components in the two pits. At the SDN pit, the interpretation of tanning (liming) was supported by the presence of scarce fragments of lime with calcitic hair pseudomorphoses. Plant remains and bone fragments were identified, but red fibrous and yellow amorphous material were not. At the FAM

  4. Preliminary performance assessment for the Waste Isolation Pilot Plant, December 1992. Volume 5, Uncertainty and sensitivity analyses of gas and brine migration for undisturbed performance

    Energy Technology Data Exchange (ETDEWEB)

    1993-08-01

    Before disposing of transuranic radioactive waste in the Waste Isolation Pilot Plant (WIPP), the United States Department of Energy (DOE) must evaluate compliance with applicable long-term regulations of the United States Environmental Protection Agency (EPA). Sandia National Laboratories is conducting iterative performance assessments (PAs) of the WIPP for the DOE to provide interim guidance while preparing for a final compliance evaluation. This volume of the 1992 PA contains results of uncertainty and sensitivity analyses with respect to migration of gas and brine from the undisturbed repository. Additional information about the 1992 PA is provided in other volumes. Volume 1 contains an overview of WIPP PA and results of a preliminary comparison with 40 CFR 191, Subpart B. Volume 2 describes the technical basis for the performance assessment, including descriptions of the linked computational models used in the Monte Carlo analyses. Volume 3 contains the reference data base and values for input parameters used in consequence and probability modeling. Volume 4 contains uncertainty and sensitivity analyses with respect to the EPA's Environmental Standards for the Management and Disposal of Spent Nuclear Fuel, High-Level and Transuranic Radioactive Wastes (40 CFR 191, Subpart B). Finally, guidance derived from the entire 1992 PA is presented in Volume 6. Results of the 1992 uncertainty and sensitivity analyses indicate that, conditional on the modeling assumptions and the assigned parameter-value distributions, the most important parameters for which uncertainty has the potential to affect gas and brine migration from the undisturbed repository are: initial liquid saturation in the waste, anhydrite permeability, biodegradation-reaction stoichiometry, gas-generation rates for both corrosion and biodegradation under inundated conditions, and the permeability of the long-term shaft seal.
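
    The style of uncertainty/sensitivity analysis behind such parameter rankings can be sketched generically: sample the uncertain inputs (here by Latin hypercube), run the model, and rank inputs by the magnitude of their Spearman rank correlation with the output. The toy 'migration' surrogate and parameter ranges below are purely illustrative, not a repository model.

```python
import numpy as np
from scipy.stats import qmc, spearmanr

sampler = qmc.LatinHypercube(d=3, seed=7)
u = sampler.random(n=300)

# Hypothetical uncertain inputs: permeability, gas-generation rate, saturation.
perm = 10.0 ** qmc.scale(u[:, [0]], -21, -17).ravel()   # log-uniform, m^2
gas = qmc.scale(u[:, [1]], 0.1, 2.0).ravel()
sat = qmc.scale(u[:, [2]], 0.0, 0.6).ravel()

# Toy surrogate for brine migration volume; not a real repository model.
rng = np.random.default_rng(7)
migration = 1e18 * perm * (1.0 + gas) * (1.0 - sat) + rng.normal(0.0, 0.05, 300)

for name, x in [("permeability", perm), ("gas rate", gas), ("saturation", sat)]:
    rho, _ = spearmanr(x, migration)
    print(f"{name:13s} Spearman rho = {rho:+.2f}")
```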

  5. Combining Alphas via Bounded Regression

    Directory of Open Access Journals (Sweden)

    Zura Kakushadze

    2015-11-01

    Full Text Available We give an explicit algorithm and source code for combining alpha streams via bounded regression. In practical applications, typically, there is insufficient history to compute a sample covariance matrix (SCM) for a large number of alphas. To compute alpha allocation weights, one then resorts to (weighted) regression over SCM principal components. Regression often produces alpha weights with insufficient diversification and/or skewed distribution against, e.g., turnover. This can be rectified by imposing bounds on alpha weights within the regression procedure. Bounded regression can also be applied to stock and other asset portfolio construction. We discuss illustrative examples.
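
    The bounded-regression step (least squares for the alpha weights subject to explicit lower and upper bounds) maps directly onto a bounded least-squares solver. A minimal sketch with synthetic alpha streams follows; this is not the paper's source code, and the bound values are invented.

```python
import numpy as np
from scipy.optimize import lsq_linear

rng = np.random.default_rng(8)
T, K = 250, 20                                  # history length, number of alphas
A = rng.normal(0.0, 0.01, size=(T, K))          # alpha stream returns
y = A @ rng.uniform(0.0, 0.1, K) + rng.normal(0.0, 0.01, T)   # combined target

# Bounds enforce diversification: every alpha weight must lie in [0, 0.15].
res = lsq_linear(A, y, bounds=(0.0, 0.15))
w = res.x
print("min/max weight:", w.min().round(3), w.max().round(3))
```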

  6. Regression in autistic spectrum disorders.

    Science.gov (United States)

    Stefanatos, Gerry A

    2008-12-01

    A significant proportion of children diagnosed with Autistic Spectrum Disorder experience a developmental regression characterized by a loss of previously acquired skills. This may involve a loss of speech or social responsiveness, but often entails both. This paper critically reviews the phenomenon of regression in autistic spectrum disorders, highlighting the characteristics of regression, age of onset, temporal course, and long-term outcome. Important considerations for diagnosis are discussed and multiple etiological factors currently hypothesized to underlie the phenomenon are reviewed. It is argued that regressive autistic spectrum disorders can be conceptualized on a spectrum with other regressive disorders that may share common pathophysiological features. The implications of this viewpoint are discussed.

  7. Linear regression in astronomy. I

    Science.gov (United States)

    Isobe, Takashi; Feigelson, Eric D.; Akritas, Michael G.; Babu, Gutti Jogesh

    1990-01-01

    Five methods for obtaining linear regression fits to bivariate data with unknown or insignificant measurement errors are discussed: ordinary least-squares (OLS) regression of Y on X, OLS regression of X on Y, the bisector of the two OLS lines, orthogonal regression, and 'reduced major-axis' regression. These methods have been used by various researchers in observational astronomy, most importantly in cosmic distance scale applications. Formulas for calculating the slope and intercept coefficients and their uncertainties are given for all the methods, including a new general form of the OLS variance estimates. The accuracy of the formulas was confirmed using numerical simulations. The applicability of the procedures is discussed with respect to their mathematical properties, the nature of the astronomical data under consideration, and the scientific purpose of the regression. It is found that, for problems needing symmetrical treatment of the variables, the OLS bisector performs significantly better than orthogonal or reduced major-axis regression.
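
    The first three lines can be written down directly from the sample moments; the OLS bisector in particular uses the slope formula of Isobe et al. (1990). A sketch with synthetic data:

```python
import numpy as np

def regression_slopes(x, y):
    """Slopes of OLS(Y|X), OLS(X|Y) and their bisector (Isobe et al. 1990)."""
    sxx = np.sum((x - x.mean()) ** 2)
    syy = np.sum((y - y.mean()) ** 2)
    sxy = np.sum((x - x.mean()) * (y - y.mean()))
    b1 = sxy / sxx                  # OLS regression of Y on X
    b2 = syy / sxy                  # OLS regression of X on Y, as a Y-on-X slope
    b3 = (b1 * b2 - 1.0 + np.sqrt((1.0 + b1**2) * (1.0 + b2**2))) / (b1 + b2)
    return b1, b2, b3

rng = np.random.default_rng(9)
x = rng.normal(size=200)
y = 2.0 * x + rng.normal(scale=1.0, size=200)
print([round(b, 2) for b in regression_slopes(x, y)])   # b1 < bisector < b2
```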

  8. Advanced statistics: linear regression, part I: simple linear regression.

    Science.gov (United States)

    Marill, Keith A

    2004-01-01

    Simple linear regression is a mathematical technique used to model the relationship between a single independent predictor variable and a single dependent outcome variable. In this, the first of a two-part series exploring concepts in linear regression analysis, the four fundamental assumptions and the mechanics of simple linear regression are reviewed. The most common technique used to derive the regression line, the method of least squares, is described. The reader will be acquainted with other important concepts in simple linear regression, including: variable transformations, dummy variables, relationship to inference testing, and leverage. Simplified clinical examples with small datasets and graphic models are used to illustrate the points. This will provide a foundation for the second article in this series: a discussion of multiple linear regression, in which there are multiple predictor variables.
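
    The method of least squares for a single predictor reduces to two closed-form moment expressions. A worked sketch on a tiny invented dataset:

```python
import numpy as np

# Invented example: dose (mg) versus measured response.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 2.9, 3.6, 4.4, 5.2])

# Method of least squares: slope = cov(x, y) / var(x); line through the means.
slope = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
intercept = y.mean() - slope * x.mean()

residuals = y - (intercept + slope * x)
r2 = 1.0 - np.sum(residuals ** 2) / np.sum((y - y.mean()) ** 2)
print(f"y = {intercept:.2f} + {slope:.2f} x,  R^2 = {r2:.3f}")
```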

  9. Design premises for a KBS-3V repository based on results from the safety assessment SR-Can and some subsequent analyses

    Energy Technology Data Exchange (ETDEWEB)

    2009-11-15

    The objective with this report is to: - provide design premises from a long term safety aspect of a KBS-3V repository for spent nuclear fuel, to form the basis for the development of the reference design of the repository. The design premises are used as input to the documents, called production reports, that present the reference design to be analysed in the long term safety assessment SR-Site. It is the aim that the production reports should verify that the chosen design complies with the design premises given in this report, whereas this report takes the burden of justifying why these design premises are relevant. The more specific aims and objectives with the production reports are provided in these reports. The following approach is used: - The reference design analysed in SR-Can is a starting point for setting safety related design premises for the next design step. - A few design basis cases, in accordance with the definition used in the regulation SSMFS 2008:211 and mainly related to the canister, can be derived from the results of the SR-Can assessment. From these it is possible to formulate some specific design premises for the canister. - The design basis cases involve several assumptions on the state of other barriers. These implied conditions are thus set as design premises for these barriers. - Even if there are few load cases on individual barriers that can be directly derived from the analyses, SR-Can provides substantial feedback on most aspects of the analysed reference design. This feedback is also formulated as design premises. - An important part of SR-Can Main report is the formulation and assessment of safety function indicator criteria. These criteria are a basis for formulating design premises, but they are not the same as the design premises discussed in the present report. Whereas the former should be upheld throughout the assessment period, the latter refer to the initial state and must be defined such that they give a margin for

  10. Design premises for a KBS-3V repository based on results from the safety assessment SR-Can and some subsequent analyses

    International Nuclear Information System (INIS)

    2009-11-01

    The objective with this report is to: - provide design premises from a long term safety aspect of a KBS-3V repository for spent nuclear fuel, to form the basis for the development of the reference design of the repository. The design premises are used as input to the documents, called production reports, that present the reference design to be analysed in the long term safety assessment SR-Site. It is the aim that the production reports should verify that the chosen design complies with the design premises given in this report, whereas this report takes the burden of justifying why these design premises are relevant. The more specific aims and objectives with the production reports are provided in these reports. The following approach is used: - The reference design analysed in SR-Can is a starting point for setting safety related design premises for the next design step. - A few design basis cases, in accordance with the definition used in the regulation SSMFS 2008:211 and mainly related to the canister, can be derived from the results of the SR-Can assessment. From these it is possible to formulate some specific design premises for the canister. - The design basis cases involve several assumptions on the state of other barriers. These implied conditions are thus set as design premises for these barriers. - Even if there are few load cases on individual barriers that can be directly derived from the analyses, SR-Can provides substantial feedback on most aspects of the analysed reference design. This feedback is also formulated as design premises. - An important part of SR-Can Main report is the formulation and assessment of safety function indicator criteria. These criteria are a basis for formulating design premises, but they are not the same as the design premises discussed in the present report. Whereas the former should be upheld throughout the assessment period, the latter refer to the initial state and must be defined such that they give a margin for

  11. Life Cycle Assessment Applied to Naphtha Catalytic Reforming

    Directory of Open Access Journals (Sweden)

    Portha J.-F.

    2010-10-01

    Full Text Available Facing increasing environmental concerns in the oil and gas industry, engineers and scientists need information to assess the sustainability of chemical processes. Among the different methods available, Life Cycle Assessment (LCA) is widely used. In this study, LCA is applied to a catalytic reforming process using the Eco-Indicator 99 as the life cycle impact assessment method. The main identified environmental impacts are fossil fuel consumption, climate change and respiratory effects due to inorganic compounds. The influence of different process parameters (feed composition, reaction temperature) is determined with respect to these environmental impacts. Two allocation methods are analysed (mass and exergetic allocation) and two different process versions are compared in order to determine the effect of some improvements on environmental impact.

  12. Maintenance Plan for the Performance Assessments and Composite Analyses for the Area 3 and Area 5 Radioactive Waste Management Sites at the NTS

    Energy Technology Data Exchange (ETDEWEB)

    Vefa Yucel

    2007-01-03

    U.S. Department of Energy (DOE) Manual M 435.1-1 requires that performance assessments (PAs) and composite analyses (CAs) for low-level waste (LLW) disposal facilities be maintained by the field offices. This plan describes the activities performed to maintain the PA and the CA for the Area 3 and Area 5 Radioactive Waste Management Sites (RWMSs) at the Nevada Test Site (NTS). This plan supersedes the Maintenance Plan for the Performance Assessments and Composite Analyses for the Area 3 and Area 5 Radioactive Waste Management Sites at the Nevada Test Site (DOE/NV/11718--491-REV 1, dated September 2002). The plan is based on U.S. Department of Energy (DOE) Order 435.1 (DOE, 1999a), DOE Manual M 435.1-1 (DOE, 1999b), the DOE M 435.1-1 Implementation Guide DOE G 435.1-1 (DOE, 1999c), and the Maintenance Guide for PAs and CAs (DOE, 1999d). The plan includes a current update on PA/CA documentation, a revised schedule, and a section on Quality Assurance.

  13. Maintenance Plan for the Performance Assessments and Composite Analyses of the Area 3 and Area 5 Radioactive Waste Management Sites at the Nevada Test Site

    International Nuclear Information System (INIS)

    Vefa Yucel

    2007-01-01

    U.S. Department of Energy (DOE) Manual M 435.1-1 requires that performance assessments (PAs) and composite analyses (CAs) for low-level waste (LLW) disposal facilities be maintained by the field offices. This plan describes the activities performed to maintain the PA and the CA for the Area 3 and Area 5 Radioactive Waste Management Sites (RWMSs) at the Nevada Test Site (NTS). This plan supersedes the Maintenance Plan for the Performance Assessments and Composite Analyses for the Area 3 and Area 5 Radioactive Waste Management Sites at the Nevada Test Site (DOE/NV/11718--491-REV 1, dated September 2002). The plan is based on U.S. Department of Energy (DOE) Order 435.1 (DOE, 1999a), DOE Manual M 435.1-1 (DOE, 1999b), the DOE M 435.1-1 Implementation Guide DOE G 435.1-1 (DOE, 1999c), and the Maintenance Guide for PAs and CAs (DOE, 1999d). The plan includes a current update on PA/CA documentation, a revised schedule, and a section on Quality Assurance.

  14. Implementation of a multi-variable regression analysis in the assessment of the generation rate and composition of hospital solid waste for the design of a sustainable management system in developing countries.

    Science.gov (United States)

    Al-Khatib, Issam A; Abu Fkhidah, Ismail; Khatib, Jumana I; Kontogianni, Stamatia

    2016-03-01

    Forecasting hospital solid waste generation is a critical challenge for future planning. The methodology proposed in the present article was applied to the composition and generation rate of hospital solid waste in hospital units, in order to validate the results and secure the outcomes of a management plan for national hospitals. A set of three multiple-variable regression models has been derived for estimating the daily total hospital waste, general hospital waste, and total hazardous waste as functions of the number of inpatients, number of total patients, and number of beds. The application of several key indicators and validation procedures indicates the high significance and reliability of the developed models in predicting the solid waste of any hospital. Methodology data were drawn from the existing scientific literature, and useful raw data were retrieved from international organisations and the investigated hospitals' personnel. The initial generation outcomes are compared with those of other local hospitals and with hospitals in other countries. The main outcome, the developed model results, is presented and analysed thoroughly. The goal is for this model to serve as leverage in discussions among governmental authorities on the implementation of a national plan for safe hospital waste management in Palestine. © The Author(s) 2016.
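
    As a hedged illustration of the kind of model described above, the following Python sketch fits a linear model of daily total waste against the three stated predictors and uses it for prediction. The observations, coefficients and units are invented for the example; they are not the study's data or fitted values.

        # Sketch of a multiple-variable regression for daily total hospital waste
        # as a function of inpatients, total patients and beds. All numbers are
        # hypothetical illustration data, not values from the study.
        import numpy as np

        # Columns: [inpatients, total_patients, beds]; one row per hospital-day.
        X = np.array([
            [120, 340, 150],
            [ 95, 280, 150],
            [200, 510, 260],
            [180, 450, 260],
            [ 60, 190, 100],
            [ 75, 220, 100],
        ])
        y = np.array([410.0, 355.0, 720.0, 660.0, 240.0, 275.0])  # kg/day

        # Add an intercept column and solve the least-squares problem.
        A = np.column_stack([np.ones(len(X)), X])
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)

        # Predict daily waste for a new hospital profile (hypothetical).
        new = np.array([1.0, 150, 400, 200])
        print("coefficients:", coef)
        print("predicted kg/day:", float(new @ coef))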

  15. SuperTRI: A new approach based on branch support analyses of multiple independent data sets for assessing reliability of phylogenetic inferences.

    Science.gov (United States)

    Ropiquet, Anne; Li, Blaise; Hassanin, Alexandre

    2009-09-01

    Supermatrix and supertree are two methods for constructing a phylogenetic tree from multiple data sets. However, these methods are not a panacea, as conflicting signals between data sets can lead to misinterpretation of the evolutionary history of taxa. In particular, the supermatrix approach is expected to be misleading if the species-tree signal is not dominant after the combination of the data sets. Moreover, most current supertree methods suffer from two limitations: (i) they ignore or misinterpret secondary (non-dominant) phylogenetic signals of the different data sets; and (ii) the logical basis of node robustness measures is unclear. To overcome these limitations, we propose a new approach, called SuperTRI, which is based on the branch support analyses of the independent data sets, and in which the reliability of the nodes is assessed using three measures: the supertree bootstrap percentage and two other values calculated from the separate analyses, the mean branch support (mean bootstrap percentage or mean posterior probability) and the reproducibility index. The SuperTRI approach is tested on a data matrix including seven genes for 82 taxa of the family Bovidae (Mammalia, Ruminantia), and the results are compared to those found with the supermatrix approach. The phylogenetic analyses of the supermatrix and independent data sets were done using four methods of tree reconstruction: Bayesian inference, maximum likelihood, and unweighted and weighted maximum parsimony. The results indicate, firstly, that the SuperTRI approach is less sensitive to the choice among the four phylogenetic methods; secondly, that it interprets the relationships among taxa more accurately; and thirdly, that interesting conclusions on introgression and radiation can be drawn from the comparisons between SuperTRI and supermatrix analyses.
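
    For concreteness, this short Python sketch computes the two per-node measures taken from the separate analyses, assuming the reproducibility index is the fraction of analyses that recover the node; the support values are invented, and this is not the SuperTRI reference implementation.

        # Per-node reliability measures from independent gene analyses.
        # Support values are hypothetical; None = clade not recovered.
        from statistics import mean

        supports = [98, 95, None, 87, 91, None, 76]  # bootstrap % in 7 analyses

        recovered = [s for s in supports if s is not None]
        mean_support = mean(recovered)                    # mean branch support
        reproducibility = len(recovered) / len(supports)  # reproducibility index

        print(f"mean branch support: {mean_support:.1f}%")
        print(f"reproducibility index: {reproducibility:.2f}")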

  16. Linear regression in astronomy. II

    Science.gov (United States)

    Feigelson, Eric D.; Babu, Gutti J.

    1992-01-01

    A wide variety of least-squares linear regression procedures used in observational astronomy, particularly investigations of the cosmic distance scale, are presented and discussed. The classes of linear models considered are (1) unweighted regression lines, with bootstrap and jackknife resampling; (2) regression solutions when measurement error, in one or both variables, dominates the scatter; (3) methods to apply a calibration line to new data; (4) truncated regression models, which apply to flux-limited data sets; and (5) censored regression models, which apply when nondetections are present. For the calibration problem we develop two new procedures: a formula for the intercept offset between two parallel data sets, which propagates slope errors from one regression to the other; and a generalization of the Working-Hotelling confidence bands to nonstandard least-squares lines. They can provide improved error analysis for Faber-Jackson, Tully-Fisher, and similar cosmic distance scale relations.
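
    As an illustration of class (1) above, an unweighted regression line with bootstrap resampling, the following Python sketch estimates a slope and its bootstrap standard error on synthetic data; it is not the authors' code.

        # Unweighted OLS slope with a bootstrap standard error (synthetic data).
        import numpy as np

        rng = np.random.default_rng(0)
        n = 50
        x = rng.uniform(0, 10, n)
        y = 2.0 + 0.8 * x + rng.normal(0, 1.0, n)  # synthetic linear relation

        def ols_slope(x, y):
            xm, ym = x.mean(), y.mean()
            return ((x - xm) * (y - ym)).sum() / ((x - xm) ** 2).sum()

        # Resample (x, y) pairs with replacement and refit the slope.
        boot = np.array([ols_slope(x[idx], y[idx])
                         for idx in (rng.integers(0, n, n) for _ in range(2000))])

        print(f"slope = {ols_slope(x, y):.3f} +/- {boot.std(ddof=1):.3f} (bootstrap)")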

  17. Time-adaptive quantile regression

    DEFF Research Database (Denmark)

    Møller, Jan Kloppenborg; Nielsen, Henrik Aalborg; Madsen, Henrik

    2008-01-01

    An algorithm for time-adaptive quantile regression is presented. The algorithm is based on the simplex algorithm, and the linear optimization formulation of the quantile regression problem is given. The observations have been split to allow a direct use of the simplex algorithm. The simplex method and an updating procedure are combined into a new algorithm for time-adaptive quantile regression, which generates new solutions on the basis of the old solution, leading to savings in computation time. The suggested algorithm is tested against a static quantile regression model on a data set with wind power production, where the models combine splines and quantile regression. The comparison indicates superior performance for the time-adaptive quantile regression in all the performance parameters considered.
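
    The static model that the adaptive algorithm is benchmarked against can be sketched in Python with statsmodels' QuantReg; this is only the non-adaptive baseline on synthetic data, not the simplex-updating algorithm of the paper.

        # Static quantile regression baseline (synthetic heteroscedastic data).
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(1)
        x = rng.uniform(0, 20, 300)                        # e.g. a power forecast
        y = 1.0 + 0.6 * x + rng.normal(0, 0.5 + 0.1 * x)   # observed production

        X = sm.add_constant(x)
        for q in (0.05, 0.5, 0.95):
            res = sm.QuantReg(y, X).fit(q=q)
            print(f"q={q:.2f}: intercept={res.params[0]:.2f}, "
                  f"slope={res.params[1]:.2f}")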

  18. Retro-regression--another important multivariate regression improvement.

    Science.gov (United States)

    Randić, M

    2001-01-01

    We review the serious problem associated with instabilities of the coefficients of regression equations, referred to as the MRA (multivariate regression analysis) "nightmare of the first kind". This is manifested when in a stepwise regression a descriptor is included or excluded from a regression. The consequence is an unpredictable change of the coefficients of the descriptors that remain in the regression equation. We follow with consideration of an even more serious problem, referred to as the MRA "nightmare of the second kind", arising when optimal descriptors are selected from a large pool of descriptors. This process typically causes at different steps of the stepwise regression a replacement of several previously used descriptors by new ones. We describe a procedure that resolves these difficulties. The approach is illustrated on boiling points of nonanes which are considered (1) by using an ordered connectivity basis; (2) by using an ordering resulting from application of greedy algorithm; and (3) by using an ordering derived from an exhaustive search for optimal descriptors. A novel variant of multiple regression analysis, called retro-regression (RR), is outlined showing how it resolves the ambiguities associated with both "nightmares" of the first and the second kind of MRA.
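
    The "nightmare of the first kind" is easy to reproduce. In this hedged Python demonstration on synthetic data, adding a strongly correlated descriptor to the regression markedly shifts the coefficient of the descriptor already in the model.

        # Coefficient instability under correlated descriptors (synthetic data).
        import numpy as np

        rng = np.random.default_rng(2)
        n = 100
        d1 = rng.normal(size=n)
        d2 = 0.9 * d1 + 0.1 * rng.normal(size=n)   # strongly correlated with d1
        y = 1.0 * d1 + 0.5 * d2 + 0.1 * rng.normal(size=n)

        def fit(*cols):
            A = np.column_stack([np.ones(n), *cols])
            return np.linalg.lstsq(A, y, rcond=None)[0]

        print("y ~ d1      :", fit(d1))      # d1's coefficient absorbs d2's effect
        print("y ~ d1 + d2 :", fit(d1, d2))  # d1's coefficient shifts markedly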

  19. Quantile regression theory and applications

    CERN Document Server

    Davino, Cristina; Vistocco, Domenico

    2013-01-01

    A guide to the implementation and interpretation of quantile regression models. This book explores the theory and numerous applications of quantile regression, offering empirical data analysis as well as the software tools to implement the methods. The main focus of this book is to provide the reader with a comprehensive description of the main issues concerning quantile regression; these include basic modeling, geometrical interpretation, estimation and inference for quantile regression, as well as issues of model validity and diagnostic tools. Each methodological aspect is explored and

  20. The use of sonographic subjective tumor assessment, IOTA logistic regression model 1, IOTA Simple Rules and GI-RADS system in the preoperative prediction of malignancy in women with adnexal masses.

    Science.gov (United States)

    Koneczny, Jarosław; Czekierdowski, Artur; Florczak, Marek; Poziemski, Paweł; Stachowicz, Norbert; Borowski, Dariusz

    2017-01-01

    Sonography-based methods together with various tumor markers are currently used to discriminate between types of adnexal masses. Our aim was to compare the predictive value of selected sonography-based models, along with subjective assessment, in ovarian cancer prediction. We analyzed data of 271 women operated on because of adnexal masses. All masses were verified by histological examination. Preoperative sonography was performed in all patients, and various predictive models, including the IOTA group logistic regression model 1 (LR1), the IOTA Simple Rules (SR), GI-RADS and the risk of malignancy index (RMI3), were used. ROC curves were constructed and the respective AUCs with 95% CIs were compared. Of 271 masses, 78 proved to be malignant, including 6 borderline tumors. LR1 had a sensitivity of 91.0% and a specificity of 91.2%, with AUC = 0.95 (95% CI: 0.92-0.98). GI-RADS for the 271 patients had a sensitivity of 88.5%, a specificity of 85% and AUC = 0.91 (95% CI: 0.88-0.95). Subjective assessment yielded a sensitivity and specificity of 85.9% and 96.9%, respectively, with AUC = 0.97 (95% CI: 0.94-0.99). SR were applicable in 236 masses and had a sensitivity of 90.6%, a specificity of 95.3% and AUC = 0.93 (95% CI: 0.89-0.97). RMI3 was calculated only in the 104 women who had CA125 available and had a sensitivity of 55.3%, a specificity of 94% and AUC = 0.85 (95% CI: 0.77-0.93). Although subjective assessment by an ultrasound expert remains the best current method of preoperative discrimination of adnexal tumors, simplicity and high predictive value favor the IOTA SR method as the primary tool and, when it is not applicable, the IOTA LR1 or GI-RADS models.
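
    The kind of comparison reported here can be sketched in Python as follows: AUCs with bootstrap confidence intervals for two competing scores on the same outcomes. The labels and scores below are synthetic stand-ins, not the study data.

        # AUC with bootstrap 95% CI for two hypothetical risk scores.
        import numpy as np
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(3)
        y = rng.integers(0, 2, 271)  # 1 = malignant (synthetic)
        score_a = y * rng.normal(0.8, 0.3, 271) + rng.normal(0, 0.3, 271)
        score_b = y * rng.normal(0.5, 0.4, 271) + rng.normal(0, 0.4, 271)

        def auc_ci(y, s, n_boot=2000):
            aucs = []
            for _ in range(n_boot):
                idx = rng.integers(0, len(y), len(y))
                if len(set(y[idx])) < 2:   # bootstrap sample needs both classes
                    continue
                aucs.append(roc_auc_score(y[idx], s[idx]))
            lo, hi = np.percentile(aucs, [2.5, 97.5])
            return roc_auc_score(y, s), lo, hi

        for name, s in [("model A", score_a), ("model B", score_b)]:
            auc, lo, hi = auc_ci(y, s)
            print(f"{name}: AUC={auc:.2f} (95% CI {lo:.2f}-{hi:.2f})")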

  1. Logistic regression applied to natural hazards: rare event logistic regression with replications

    Directory of Open Access Journals (Sweden)

    M. Guns

    2012-06-01

    Statistical analysis of natural hazards needs particular attention, as most of these phenomena are rare events. This study shows that ordinary rare event logistic regression, as it is now commonly used in geomorphologic studies, does not always lead to a robust detection of controlling factors, as the results can be strongly sample-dependent. In this paper, we introduce some concepts of Monte Carlo simulation into rare event logistic regression. This technique, so-called rare event logistic regression with replications, combines the strengths of probabilistic and statistical methods, and allows some of the limitations of previous developments to be overcome through robust variable selection. The technique was developed here for the analysis of landslide controlling factors, but the concept is widely applicable to statistical analyses of natural hazards.
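
    A minimal Python sketch of the replication idea follows: repeatedly subsample the abundant non-event observations, refit a logistic regression against all events, and keep predictors whose coefficients are stable across replications. The data, the 1:10 event ratio and the stability rule are assumptions for illustration, not the authors' exact procedure.

        # Rare event logistic regression with Monte Carlo replications (sketch).
        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(4)
        n, p = 5000, 3
        X = rng.normal(size=(n, p))         # e.g. slope, land use, road distance
        logit = -5.0 + 1.2 * X[:, 0] + 0.0 * X[:, 1] + 0.8 * X[:, 2]
        y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))  # rare events

        events = np.flatnonzero(y)
        non_events = np.flatnonzero(~y)

        signs = np.zeros((500, p))
        for r in range(signs.shape[0]):
            sub = rng.choice(non_events, size=10 * len(events), replace=False)
            idx = np.concatenate([events, sub])
            model = LogisticRegression(max_iter=1000).fit(X[idx], y[idx])
            signs[r] = np.sign(model.coef_[0])

        # Call a predictor robust if its coefficient keeps one sign in >=95% of runs.
        stability = np.abs(signs.mean(axis=0))
        print("sign stability per predictor:", stability.round(2))
        print("robust predictors:", np.flatnonzero(stability >= 0.95))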

  2. Logistic regression applied to natural hazards: rare event logistic regression with replications

    Science.gov (United States)

    Guns, M.; Vanacker, V.

    2012-06-01

    Statistical analysis of natural hazards needs particular attention, as most of these phenomena are rare events. This study shows that ordinary rare event logistic regression, as it is now commonly used in geomorphologic studies, does not always lead to a robust detection of controlling factors, as the results can be strongly sample-dependent. In this paper, we introduce some concepts of Monte Carlo simulation into rare event logistic regression. This technique, so-called rare event logistic regression with replications, combines the strengths of probabilistic and statistical methods, and allows some of the limitations of previous developments to be overcome through robust variable selection. The technique was developed here for the analysis of landslide controlling factors, but the concept is widely applicable to statistical analyses of natural hazards.

  3. Tax System in Poland – Progressive or Regressive?

    Directory of Open Access Journals (Sweden)

    Jacek Tomkiewicz

    2016-03-01

    Purpose: To analyse the impact of the Polish fiscal regime on the general revenue of the country, and specifically to establish whether the cumulative tax burden borne by Polish households is progressive or regressive. Methodology: On the basis of Eurostat and OECD data, the author has analysed fiscal regimes in EU Member States and in OECD countries. The tax burden of households within different income groups has also been examined pursuant to applicable fiscal laws and data pertaining to the revenue and expenditure of households published by the Central Statistical Office (CSO). Conclusions: The fiscal regime in Poland is regressive; that is, the relative fiscal burden decreases as the taxpayer's income increases. Research Implications: The article contributes to the on-going discussion on social cohesion, in particular with respect to economic policy instruments aimed at the redistribution of income within the economy. Originality: The author presents an analysis of data pertaining to fiscal policies in EU Member States and OECD countries and assesses the impact of the legal environment (fiscal regime and social security system) in Poland on income distribution within the economy. The impact of the total tax burden (direct and indirect taxes, social security contributions) on the economic situation of households from different income groups has been calculated using an original formula.

  4. Panel Smooth Transition Regression Models

    DEFF Research Database (Denmark)

    González, Andrés; Terasvirta, Timo; Dijk, Dick van

    We introduce the panel smooth transition regression model. This new model is intended for characterizing heterogeneous panels, allowing the regression coefficients to vary both across individuals and over time. Specifically, heterogeneity is allowed for by assuming that these coefficients are bou...
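
    The abstract is truncated above; for orientation, the canonical two-regime PSTR specification from this strand of the literature (stated here from general knowledge of the model, not quoted from the paper) lets the coefficients vary smoothly with an observable transition variable q_it:

        y_{it} = \mu_i + \beta_0' x_{it} + \beta_1' x_{it} \, g(q_{it}; \gamma, c) + u_{it},
        g(q_{it}; \gamma, c) = \Bigl( 1 + \exp\bigl( -\gamma \prod_{j=1}^{m} (q_{it} - c_j) \bigr) \Bigr)^{-1},
        \qquad \gamma > 0, \; c_1 \le \dots \le c_m,

    so that the slope on x_it moves between beta_0 and beta_0 + beta_1 as the logistic transition function g goes from 0 to 1.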

  5. Testing discontinuities in nonparametric regression

    KAUST Repository

    Dai, Wenlin

    2017-01-19

    In nonparametric regression, it is often necessary to detect whether there are jump discontinuities in the mean function. In this paper, we revisit the difference-based method of H.-G. Müller and U. Stadtmüller [13] (Discontinuous versus smooth regression, Ann. Stat. 27 (1999), pp. 299-337; doi: 10.1214/aos/1018031100)
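
    A hedged Python illustration of the difference-based idea: order the data by x, take first differences of the responses, and flag standardized differences too large to be explained by smooth variation plus noise. This is a toy diagnostic with an arbitrary threshold, not the test statistic of the paper.

        # Difference-based jump screening in nonparametric regression (toy).
        import numpy as np

        rng = np.random.default_rng(5)
        n = 400
        x = np.sort(rng.uniform(0, 1, n))
        m = np.sin(2 * np.pi * x) + (x > 0.6) * 1.5   # smooth mean + jump at 0.6
        y = m + rng.normal(0, 0.2, n)

        d = np.diff(y)
        # Under smoothness, successive differences are roughly N(0, 2*sigma^2);
        # estimate sigma robustly via the median absolute deviation of d.
        sigma = np.median(np.abs(d)) / 0.6745 / np.sqrt(2)
        z = d / (np.sqrt(2) * sigma)

        suspects = np.flatnonzero(np.abs(z) > 4)      # crude threshold
        print("candidate jump locations near x =", x[suspects])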

  6. Testing discontinuities in nonparametric regression

    KAUST Repository

    Dai, Wenlin; Zhou, Yuejin; Tong, Tiejun

    2017-01-01

    In nonparametric regression, it is often necessary to detect whether there are jump discontinuities in the mean function. In this paper, we revisit the difference-based method of H.-G. Müller and U. Stadtmüller [13] (Discontinuous versus smooth regression, Ann. Stat. 27 (1999), pp. 299-337; doi: 10.1214/aos/1018031100)

  7. Logistic Regression: Concept and Application

    Science.gov (United States)

    Cokluk, Omay

    2010-01-01

    The main focus of logistic regression analysis is classification of individuals in different groups. The aim of the present study is to explain basic concepts and processes of binary logistic regression analysis intended to determine the combination of independent variables which best explain the membership in certain groups called dichotomous…

  8. Ordinary least square regression, orthogonal regression, geometric mean regression and their applications in aerosol science

    International Nuclear Information System (INIS)

    Leng Ling; Zhang Tianyi; Kleinman, Lawrence; Zhu Wei

    2007-01-01

    Regression analysis, especially the ordinary least squares method which assumes that errors are confined to the dependent variable, has seen a fair share of its applications in aerosol science. The ordinary least squares approach, however, could be problematic due to the fact that atmospheric data often does not lend itself to calling one variable independent and the other dependent. Errors often exist for both measurements. In this work, we examine two regression approaches available to accommodate this situation. They are orthogonal regression and geometric mean regression. Comparisons are made theoretically as well as numerically through an aerosol study examining whether the ratio of organic aerosol to CO would change with age
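
    The three estimators can be compared directly in a short Python sketch on synthetic data with errors in both variables, using the standard slope formulas; OLS of y on x is attenuated by the error in x, whereas the other two are not. This is an illustration, not the study's analysis.

        # OLS, orthogonal (major-axis) and geometric mean regression slopes.
        import numpy as np

        rng = np.random.default_rng(6)
        n = 200
        t = rng.uniform(0, 10, n)                # latent "true" variable
        x = t + rng.normal(0, 0.8, n)            # both measurements carry error
        y = 2.0 + 1.5 * t + rng.normal(0, 1.2, n)

        sxx = np.var(x, ddof=1)
        syy = np.var(y, ddof=1)
        sxy = np.cov(x, y, ddof=1)[0, 1]

        b_ols = sxy / sxx
        b_orth = (syy - sxx + np.sqrt((syy - sxx) ** 2 + 4 * sxy ** 2)) / (2 * sxy)
        b_gmr = np.sign(sxy) * np.sqrt(syy / sxx)

        print(f"OLS y|x: {b_ols:.3f}  orthogonal: {b_orth:.3f}  "
              f"geometric mean: {b_gmr:.3f}")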

  9. Assessing the risk of pelvic and para-aortic nodal involvement in apparent early-stage ovarian cancer: A predictors- and nomogram-based analyses.

    Science.gov (United States)

    Bogani, Giorgio; Tagliabue, Elena; Ditto, Antonino; Signorelli, Mauro; Martinelli, Fabio; Casarin, Jvan; Chiappa, Valentina; Dondi, Giulia; Leone Roberti Maggiore, Umberto; Scaffa, Cono; Borghi, Chiara; Montanelli, Luca; Lorusso, Domenica; Raspagliesi, Francesco

    2017-10-01

    To estimate the prevalence of lymph node involvement in early-stage epithelial ovarian cancer in order to assess the prognostic value of lymph node dissection. Data of consecutive patients undergoing staging for early-stage epithelial ovarian cancer were retrospectively evaluated. Logistic regression and a nomogram-based analysis were used to assess the risk of lymph node involvement. Overall, 290 patients were included. All patients had lymph node dissection including pelvic and para-aortic lymphadenectomy. Forty-two (14.5%) patients were upstaged due to lymph node metastatic disease. Pelvic and para-aortic nodal metastases were observed in 22 (7.6%) and 42 (14.5%) patients, respectively. Lymph node involvement was observed in 18/95 (18.9%), 1/37 (2.7%), 4/29 (13.8%), 11/63 (17.4%), 3/41 (7.3%) and 5/24 (20.8%) patients with high-grade serous, low-grade serous, endometrioid G1, endometrioid G2&3, clear cell and undifferentiated histology, respectively (p=0.12, Chi-square test). We observed that high-grade serous histology was associated with an increased risk of pelvic node involvement, while histology other than low-grade serous, and bilateral tumors, were independently associated with para-aortic lymph node involvement (p<0.05). Nomograms displaying the risk of nodal involvement in the pelvic and para-aortic areas were built. High-grade serous histology and bilateral tumors are the main characteristics suggesting lymph node positivity. Our data suggest that high-grade serous and bilateral early-stage epithelial ovarian cancers are at high risk of harboring disease in the lymphatic tissue of both the pelvic and para-aortic areas. After receiving external validation, our data will help to identify patients deserving comprehensive retroperitoneal staging. Copyright © 2017 Elsevier Inc. All rights reserved.

  10. Tumor regression patterns in retinoblastoma

    International Nuclear Information System (INIS)

    Zafar, S.N.; Siddique, S.N.; Zaheer, N.

    2016-01-01

    To observe the types of tumor regression after treatment, and to identify the common pattern of regression in our patients. Study Design: Descriptive study. Place and Duration of Study: Department of Pediatric Ophthalmology and Strabismus, Al-Shifa Trust Eye Hospital, Rawalpindi, Pakistan, from October 2011 to October 2014. Methodology: Children with unilateral and bilateral retinoblastoma were included in the study. Patients were referred to the Pakistan Institute of Medical Sciences, Islamabad, for chemotherapy. After every cycle of chemotherapy, dilated fundus examination under anesthesia was performed to record the response to treatment. Regression patterns were recorded on RetCam II. Results: Seventy-four tumors were included in the study. Out of 74 tumors, 3 were ICRB group A tumors, 43 were ICRB group B tumors, 14 belonged to ICRB group C, and the remaining 14 were ICRB group D tumors. Type IV regression was seen in 39.1% (n=29) of tumors, type II in 29.7% (n=22), type III in 25.6% (n=19), and type I in 5.4% (n=4). All group A tumors (100%) showed type IV regression. Seventeen (39.5%) group B tumors showed type IV regression. In group C, 5 tumors (35.7%) showed type II regression and 5 tumors (35.7%) showed type IV regression. In group D, 6 tumors (42.9%) regressed to type II non-calcified remnants. Conclusion: The response and success of focal and systemic treatment, as judged by the appearance of different patterns of tumor regression, varies with the ICRB grouping of the tumor. (author)

  11. Preliminary performance assessment for the Waste Isolation Pilot Plant, December 1992. Volume 4: Uncertainty and sensitivity analyses for 40 CFR 191, Subpart B

    Energy Technology Data Exchange (ETDEWEB)

    1993-08-01

    Before disposing of transuranic radioactive waste in the Waste Isolation Pilot Plant (WIPP), the United States Department of Energy (DOE) must evaluate compliance with applicable long-term regulations of the United States Environmental Protection Agency (EPA). Sandia National Laboratories is conducting iterative performance assessments (PAs) of the WIPP for the DOE to provide interim guidance while preparing for a final compliance evaluation. This volume of the 1992 PA contains results of uncertainty and sensitivity analyses with respect to the EPA's Environmental Protection Standards for Management and Disposal of Spent Nuclear Fuel, High-Level and Transuranic Radioactive Wastes (40 CFR 191, Subpart B). Additional information about the 1992 PA is provided in other volumes. Results of the 1992 uncertainty and sensitivity analyses indicate that, conditional on the modeling assumptions, the choice of parameters selected for sampling, and the assigned parameter-value distributions, the most important parameters for which uncertainty has the potential to affect compliance with 40 CFR 191B are: drilling intensity, intrusion borehole permeability, halite and anhydrite permeabilities, radionuclide solubilities and distribution coefficients, fracture spacing in the Culebra Dolomite Member of the Rustler Formation, porosity of the Culebra, and spatial variability of Culebra transmissivity. Performance with respect to 40 CFR 191B is insensitive to uncertainty in other parameters; however, additional data are needed to confirm that reality lies within the assigned distributions.

  12. Electricity mix and ecological assessments. Consequences of the choice of specific electricity mixes in analyses of the environmental performance of products and services

    International Nuclear Information System (INIS)

    Menard, M.; Dones, R.; Gantner, U.

    1998-12-01

    The study aims at analysing the methodological issues associated with the definition of electricity mixes and discussing the consequences of the choice of specific electricity mixes in analyses of the environmental performance of products and services, based on Life Cycle Assessment (LCA). This report has been designed as a guideline to support LCA practitioners in the systematic identification of the most appropriate electricity mixes for LCA applications. A detailed checklist has been developed for this purpose. It includes the following items: type of electricity supply (from the net, self production, direct contracts); voltage level; country/place of utilisation; year of utilisation; season/daytime of utilisation; import/export model; and, marginal vs. average approach. A few examples, utilising published LCA studies, illustrate the impacts of the insights gained in the present work. Although primarily aimed at applications in Switzerland, the main concepts, the modelling and parts of the information provided can also be applied to other European countries. In addition to the three models proposed earlier for the assessment of the Swiss yearly average electricity mix, a new model (M4) has been developed in the frame of the present task in order to take into account the conditions characteristic for Switzerland as a transit land for electricity trades between its neighbour countries. All existing electricity mix models as well as selected environmental inventories are described and compared in the report. As an example of results, the CO2 emissions calculated for the Swiss yearly electricity supply mix are relatively small (48 g/kWh with model M4, as compared with 497 g/kWh for the average UCPTE mix). Key information on the structure of electricity generation and trade in Europe is provided. The modelling of the electricity supply for most of the European countries is less sensitive to the choice of an electricity model than for Switzerland. Considering that

  13. Maintenance Plan for the Performance Assessments and Composite Analyses for the Area 3 and Area 5 Radioactive Waste Management Sites at the Nevada Test Site

    International Nuclear Information System (INIS)

    V. Yucel

    2002-01-01

    U.S. Department of Energy (DOE) Order 435.1 requires that performance assessments (PAs) and composite analyses (CAs) for low-level waste (LLW) disposal facilities be maintained by the field offices. This plan describes the activities to be performed in maintaining the Performance Assessment (PA) and Composite Analysis (CA) for the Area 3 and Area 5 Radioactive Waste Management Sites (RWMSs) at the Nevada Test Site (NTS). The Disposal Authorization Statement (DAS) for the continuing operations of a LLW facility at the DOE complex specifies the conditions for operations based on approval of a PA and CA, and requires the facility to implement a maintenance program to assure that these conditions will remain protective of the public health and the environment in the future. The goal of the maintenance program is to provide that assurance. The maintenance process is an iterative one in which changing conditions may result in a revision of PA and CA; the revised PA and CA may impose a different set of conditions for facility operation, closure, and postclosure. The maintenance process includes managing uncertainty, performing annual reviews, submitting annual summary reports to DOE Headquarters (DOE/HQ), carrying out special analyses, and revising the PAs and CAs, if necessary. Management of uncertainty is an essential component of the maintenance program because results of the original PAs and CAs are understood to be based on uncertain assumptions about the conceptual models; the mathematical models and parameters; and the future state of the lands, disposal facilities, and human activities. The annual reviews for the PAs include consideration of waste receipts, facility specific factors, results of monitoring, and results of research and development (R and D) activities. Likewise, results of ongoing R and D, changes in land-use planning, new information on known sources of residual radioactive materials, and identification of new sources may warrant an evaluation to

  14. Assessing Diabetes Self-Management with the Diabetes Self-Management Questionnaire (DSMQ) Can Help Analyse Behavioural Problems Related to Reduced Glycaemic Control.

    Directory of Open Access Journals (Sweden)

    Andreas Schmitt

    To appraise the Diabetes Self-Management Questionnaire (DSMQ)'s measurement of diabetes self-management as a statistical predictor of glycaemic control, relative to the widely used SDSCA. 248 patients with type 1 diabetes and 182 patients with type 2 diabetes were cross-sectionally assessed using the two self-report measures of diabetes self-management, DSMQ and SDSCA; the scales were used as competing predictors of HbA1c. We developed a structural equation model of self-management as measured by the DSMQ and analysed the amount of variation explained in HbA1c; an analogous model was developed for the SDSCA. The structural equation models of self-management and glycaemic control showed very good fit to the data. The DSMQ's measurement of self-management showed associations with HbA1c of -0.53 for type 1 and -0.46 for type 2 diabetes (both P < 0.001), explaining 21% and 28% of the variation in glycaemic control, respectively. The SDSCA's measurement showed associations with HbA1c of -0.14 (P = 0.030) for type 1 and -0.31 (P = 0.003) for type 2 diabetes, explaining 2% and 10% of the glycaemic variation. Predictive power for glycaemic control was significantly higher for the DSMQ (P < 0.001). This study supports the DSMQ as the preferred tool when analysing self-reported behavioural problems related to reduced glycaemic control. The scale may be useful for clinical assessments of patients with suboptimal diabetes outcomes or for research on factors affecting associations between self-management behaviours and glycaemic control.

  15. Regression to Causality : Regression-style presentation influences causal attribution

    DEFF Research Database (Denmark)

    Bordacconi, Mats Joe; Larsen, Martin Vinæs

    2014-01-01

    Our experiment implies that scholars using regression models – one of the primary vehicles for analyzing statistical results in political science – encourage causal interpretation. Specifically, we demonstrate that presenting observational results in a regression model, rather than as a simple comparison of means, makes causal interpretation of the results more likely. Our experiment drew on a sample of 235 university students from three different social science degree programs (political science, sociology and economics), all of whom had received substantial training in statistics. The subjects were asked to compare and evaluate the validity of equivalent results presented as either regression models or as a test of two sample means. Our experiment shows that the subjects who were presented with results as estimates from a regression model were more inclined to interpret these results causally.

  16. Assessing the influence of traffic-related air pollution on risk of term low birth weight on the basis of land-use-based regression models and measures of air toxics.

    Science.gov (United States)

    Ghosh, Jo Kay C; Wilhelm, Michelle; Su, Jason; Goldberg, Daniel; Cockburn, Myles; Jerrett, Michael; Ritz, Beate

    2012-06-15

    Few studies have examined associations of birth outcomes with toxic air pollutants (air toxics) in traffic exhaust. This study included 8,181 term low birth weight (LBW) children and 370,922 term normal-weight children born between January 1, 1995, and December 31, 2006, to women residing within 5 miles (8 km) of an air toxics monitoring station in Los Angeles County, California. Additionally, land-use-based regression (LUR)-modeled estimates of levels of nitric oxide, nitrogen dioxide, and nitrogen oxides were used to assess the influence of small-area variations in traffic pollution. The authors examined associations with term LBW (≥37 weeks' completed gestation and birth weight <2,500 g); third-trimester benzene, toluene, ethyl benzene, and xylene exposures were associated with 2%-5% increased odds of term LBW per interquartile-range increase, with some confidence intervals containing the null value. This analysis highlights the importance of both spatial and temporal contributions to air pollution in epidemiologic birth outcome studies.

  17. Augmenting Data with Published Results in Bayesian Linear Regression

    Science.gov (United States)

    de Leeuw, Christiaan; Klugkist, Irene

    2012-01-01

    In most research, linear regression analyses are performed without taking into account published results (i.e., reported summary statistics) of similar previous studies. Although the prior density in Bayesian linear regression could accommodate such prior knowledge, formal models for doing so are absent from the literature. The goal of this…
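
    The general idea can be sketched in Python with a conjugate normal prior built from a published estimate: the prior (a reported coefficient and standard error) and the new-data estimate are combined by precision weighting. The known-variance setup and all numbers are assumptions for illustration, not the formal model of the article.

        # Augmenting new data with a published slope estimate (conjugate sketch).
        import numpy as np

        prior_mean, prior_se = 0.50, 0.10   # hypothetical published result

        rng = np.random.default_rng(7)
        n = 40
        x = rng.normal(size=n)
        y = 0.65 * x + rng.normal(0, 1.0, n)  # new data, no-intercept model

        # OLS slope (through the origin) and its standard error, sigma = 1 known.
        sxx = (x ** 2).sum()
        b_hat = (x * y).sum() / sxx
        se_hat = 1.0 / np.sqrt(sxx)

        # Precision-weighted posterior for the slope.
        w_prior, w_data = 1 / prior_se ** 2, 1 / se_hat ** 2
        post_mean = (w_prior * prior_mean + w_data * b_hat) / (w_prior + w_data)
        post_se = np.sqrt(1 / (w_prior + w_data))

        print(f"data only: {b_hat:.3f} +/- {se_hat:.3f}")
        print(f"posterior: {post_mean:.3f} +/- {post_se:.3f}")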

  18. Predicting Word Reading Ability: A Quantile Regression Study

    Science.gov (United States)

    McIlraith, Autumn L.

    2018-01-01

    Predictors of early word reading are well established. However, it is unclear if these predictors hold for readers across a range of word reading abilities. This study used quantile regression to investigate predictive relationships at different points in the distribution of word reading. Quantile regression analyses used preschool and…

  19. Advanced statistics: linear regression, part II: multiple linear regression.

    Science.gov (United States)

    Marill, Keith A

    2004-01-01

    The applications of simple linear regression in medical research are limited, because in most situations, there are multiple relevant predictor variables. Univariate statistical techniques such as simple linear regression use a single predictor variable, and they often may be mathematically correct but clinically misleading. Multiple linear regression is a mathematical technique used to model the relationship between multiple independent predictor variables and a single dependent outcome variable. It is used in medical research to model observational data, as well as in diagnostic and therapeutic studies in which the outcome is dependent on more than one factor. Although the technique generally is limited to data that can be expressed with a linear function, it benefits from a well-developed mathematical framework that yields unique solutions and exact confidence intervals for regression coefficients. Building on Part I of this series, this article acquaints the reader with some of the important concepts in multiple regression analysis. These include multicollinearity, interaction effects, and an expansion of the discussion of inference testing, leverage, and variable transformations to multivariate models. Examples from the first article in this series are expanded on using a primarily graphic, rather than mathematical, approach. The importance of the relationships among the predictor variables and the dependence of the multivariate model coefficients on the choice of these variables are stressed. Finally, concepts in regression model building are discussed.
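
    One of the concepts discussed, multicollinearity, can be made concrete with variance inflation factors (VIF). The Python sketch below builds a nearly collinear predictor on synthetic data and computes VIFs with statsmodels; the variables are hypothetical.

        # Detecting multicollinearity with variance inflation factors.
        import numpy as np
        import statsmodels.api as sm
        from statsmodels.stats.outliers_influence import variance_inflation_factor

        rng = np.random.default_rng(8)
        n = 200
        age = rng.normal(50, 10, n)
        weight = rng.normal(80, 12, n)
        proxy = weight / 2.9 + rng.normal(0, 1, n)  # nearly collinear with weight

        X = sm.add_constant(np.column_stack([age, weight, proxy]))
        for i, name in enumerate(["age", "weight", "proxy"], start=1):
            print(f"VIF {name}: {variance_inflation_factor(X, i):.1f}")
        # VIFs well above ~10 for weight and proxy flag a multicollinearity problem.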

  20. Logic regression and its extensions.

    Science.gov (United States)

    Schwender, Holger; Ruczinski, Ingo

    2010-01-01

    Logic regression is an adaptive classification and regression procedure, initially developed to reveal interacting single nucleotide polymorphisms (SNPs) in genetic association studies. In general, this approach can be used in any setting with binary predictors, when the interaction of these covariates is of primary interest. Logic regression searches for Boolean (logic) combinations of binary variables that best explain the variability in the outcome variable, and thus, reveals variables and interactions that are associated with the response and/or have predictive capabilities. The logic expressions are embedded in a generalized linear regression framework, and thus, logic regression can handle a variety of outcome types, such as binary responses in case-control studies, numeric responses, and time-to-event data. In this chapter, we provide an introduction to the logic regression methodology, list some applications in public health and medicine, and summarize some of the direct extensions and modifications of logic regression that have been proposed in the literature. Copyright © 2010 Elsevier Inc. All rights reserved.
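
    The core idea, a Boolean combination of binary predictors used as a single term in a generalized linear model, can be illustrated with a toy Python example. Logic regression searches over such expressions; here one candidate expression is scored by hand on synthetic SNP-like data, without the adaptive search.

        # Scoring one Boolean (logic) term in a logistic model (toy example).
        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(9)
        n = 1000
        snp = rng.integers(0, 2, size=(n, 4)).astype(bool)  # binary predictors

        # Outcome truly driven by the expression (X1 AND NOT X3).
        risk = snp[:, 0] & ~snp[:, 2]
        y = rng.random(n) < np.where(risk, 0.6, 0.2)

        # Evaluate that candidate term the way a logic-regression search would.
        term = (snp[:, 0] & ~snp[:, 2]).astype(float).reshape(-1, 1)
        model = LogisticRegression().fit(term, y)
        print("coef for (X1 AND NOT X3):", round(float(model.coef_[0][0]), 2))
        print("in-sample accuracy:", round(model.score(term, y), 2))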

  1. Assessing public speaking fear with the short form of the Personal Report of Confidence as a Speaker scale: confirmatory factor analyses among a French-speaking community sample.

    Science.gov (United States)

    Heeren, Alexandre; Ceschi, Grazia; Valentiner, David P; Dethier, Vincent; Philippot, Pierre

    2013-01-01

    The main aim of this study was to assess the reliability and structural validity of the French version of the 12-item version of the Personal Report of Confidence as Speaker (PRCS), one of the most promising measurements of public speaking fear. A total of 611 French-speaking volunteers were administered the French versions of the short PRCS, the Liebowitz Social Anxiety Scale, the Fear of Negative Evaluation scale, as well as the Trait version of the Spielberger State-Trait Anxiety Inventory and the Beck Depression Inventory-II, which assess the level of anxious and depressive symptoms, respectively. Regarding its structural validity, confirmatory factor analyses indicated a single-factor solution, as implied by the original version. Good scale reliability (Cronbach's alpha = 0.86) was observed. The item discrimination analysis suggested that all the items contribute to the overall scale score reliability. The French version of the short PRCS showed significant correlations with the Liebowitz Social Anxiety Scale (r = 0.522), the Fear of Negative Evaluation scale (r = 0.414), the Spielberger State-Trait Anxiety Inventory (r = 0.516), and the Beck Depression Inventory-II (r = 0.361). The French version of the short PRCS is a reliable and valid measure for the evaluation of the fear of public speaking among a French-speaking sample. These findings have critical consequences for the measurement of psychological and pharmacological treatment effectiveness in public speaking fear among a French-speaking sample.

  2. Simultaneous assessment of phase chemistry, phase abundance and bulk chemistry with statistical electron probe micro-analyses: Application to cement clinkers

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, William; Krakowiak, Konrad J.; Ulm, Franz-Josef, E-mail: ulm@mit.edu

    2014-01-15

    According to recent developments in cement clinker engineering, the optimization of chemical substitutions in the main clinker phases offers a promising approach to improve both reactivity and grindability of clinkers. Thus, monitoring the chemistry of the phases may become part of the quality control at the cement plants, along with the usual measurements of the abundance of the mineralogical phases (quantitative X-ray diffraction) and the bulk chemistry (X-ray fluorescence). This paper presents a new method to assess these three complementary quantities with a single experiment. The method is based on electron microprobe spot analyses, performed over a grid located on a representative surface of the sample and interpreted with advanced statistical tools. This paper describes the method and the experimental program performed on industrial clinkers to establish the accuracy in comparison to conventional methods. -- Highlights: •A new method of clinker characterization •Combination of electron probe technique with cluster analysis •Simultaneous assessment of phase abundance, composition and bulk chemistry •Experimental validation performed on industrial clinkers.

  3. Simultaneous assessment of phase chemistry, phase abundance and bulk chemistry with statistical electron probe micro-analyses: Application to cement clinkers

    International Nuclear Information System (INIS)

    Wilson, William; Krakowiak, Konrad J.; Ulm, Franz-Josef

    2014-01-01

    According to recent developments in cement clinker engineering, the optimization of chemical substitutions in the main clinker phases offers a promising approach to improve both reactivity and grindability of clinkers. Thus, monitoring the chemistry of the phases may become part of the quality control at the cement plants, along with the usual measurements of the abundance of the mineralogical phases (quantitative X-ray diffraction) and the bulk chemistry (X-ray fluorescence). This paper presents a new method to assess these three complementary quantities with a single experiment. The method is based on electron microprobe spot analyses, performed over a grid located on a representative surface of the sample and interpreted with advanced statistical tools. This paper describes the method and the experimental program performed on industrial clinkers to establish the accuracy in comparison to conventional methods. -- Highlights: •A new method of clinker characterization •Combination of electron probe technique with cluster analysis •Simultaneous assessment of phase abundance, composition and bulk chemistry •Experimental validation performed on industrial clinkers

  4. Inconsistent trial assessments by the National Institute for Health and Clinical Excellence and IQWiG: standards for the performance and interpretation of subgroup analyses are needed.

    Science.gov (United States)

    Hasford, J; Bramlage, P; Koch, G; Lehmacher, W; Einhäupl, K; Rothwell, P M

    2010-12-01

    The methodology for the critical assessment of medical interventions is well established. Regulatory agencies and institutions adhere, in principle, to the same standards. This consistency, however, is not always the case in practice. Using the evaluation of the CAPRIE (Clopidogrel versus Aspirin in Patients at risk of Ischemic Events) trial by the British National Institute for Health and Clinical Excellence (NICE) and the German Institute for Quality and Efficiency in Health Care (IQWiG), we illustrate that there was no consensus for the interpretation of possible heterogeneity in treatment comparisons across subgroups. The NICE concluded that CAPRIE demonstrated clinical benefit for the overall intention-to-treat (ITT) population with sufficient robustness to possible sources of heterogeneity. The IQWiG interpreted the alleged heterogeneity as implying that the clinical benefit only applied to the subgroup of patients with a statistically significant result irrespective of the results of the ITT analysis. International standards for the performance and interpretation of subgroup analyses as well as for the assessment of heterogeneity between strata are needed. Copyright © 2010 Elsevier Inc. All rights reserved.

  5. Abstract Expression Grammar Symbolic Regression

    Science.gov (United States)

    Korns, Michael F.

    This chapter examines the use of Abstract Expression Grammars to perform the entire Symbolic Regression process without the use of Genetic Programming per se. The techniques explored produce a symbolic regression engine which has absolutely no bloat, which allows total user control of the search space and output formulas, and which is faster and more accurate than the engines produced in our previous papers using Genetic Programming. The genome is an all-vector structure with four chromosomes plus additional epigenetic and constraint vectors, allowing total user control of the search space and the final output formulas. A combination of specialized compiler techniques, genetic algorithms, particle swarm, age-layered populations, plus discrete and continuous differential evolution are used to produce an improved symbolic regression system. Nine base test cases, from the literature, are used to test the improvement in speed and accuracy. The improved results indicate that these techniques move us a big step closer toward future industrial-strength symbolic regression systems.

  6. Quantile Regression With Measurement Error

    KAUST Repository

    Wei, Ying; Carroll, Raymond J.

    2009-01-01

    The finite sample performance of the proposed method is investigated in a simulation study, and compared to the standard regression calibration approach. Finally, we apply our methodology to part of the National Collaborative Perinatal Project growth data.

  7. From Rasch scores to regression

    DEFF Research Database (Denmark)

    Christensen, Karl Bang

    2006-01-01

    Rasch models provide a framework for measurement and modelling of latent variables. Having measured a latent variable in a population, a comparison of groups will often be of interest. For this purpose the use of observed raw scores will often be inadequate because these lack interval scale properties. This paper compares two approaches to group comparison: linear regression models using estimated person locations as outcome variables, and latent regression models based on the distribution of the score.

  8. Testing Heteroscedasticity in Robust Regression

    Czech Academy of Sciences Publication Activity Database

    Kalina, Jan

    2011-01-01

    Roč. 1, č. 4 (2011), s. 25-28 ISSN 2045-3345. Grant - others: GA ČR(CZ) GA402/09/0557. Institutional research plan: CEZ:AV0Z10300504. Keywords: robust regression; heteroscedasticity; regression quantiles; diagnostics. Subject RIV: BB - Applied Statistics, Operational Research. http://www.researchjournals.co.uk/documents/Vol4/06%20Kalina.pdf

  9. Regression methods for medical research

    CERN Document Server

    Tai, Bee Choo

    2013-01-01

    Regression Methods for Medical Research provides medical researchers with the skills they need to critically read and interpret research using more advanced statistical methods. The statistical requirements of interpreting and publishing in medical journals, together with rapid changes in science and technology, increasingly demand an understanding of more complex and sophisticated analytic procedures. The text explains the application of statistical models to a wide variety of practical medical investigative studies and clinical trials. Regression methods are used to appropriately answer the

  10. Forecasting with Dynamic Regression Models

    CERN Document Server

    Pankratz, Alan

    2012-01-01

    One of the most widely used tools in statistical forecasting, the single-equation regression model, is examined here. A companion to the author's earlier work, Forecasting with Univariate Box-Jenkins Models: Concepts and Cases, the present text pulls together recent time series ideas and gives special attention to possible intertemporal patterns, distributed lag responses of output to input series, and the autocorrelation patterns of regression disturbances. It also includes six case studies.

  11. [Regulatory factors for images of the elderly among elementary school students assessed through secular trend analyses by frequency of inter-exchange with "REPRINTS" senior volunteers].

    Science.gov (United States)

    Fujiwara, Yoshinori; Watanabe, Naoki; Nishi, Mariko; Lee, Sangyoon; Ohba, Hiromi; Yoshida, Hiroto; Sakuma, Naoko; Fukaya, Taro; Kousa, Youko; Inoue, Kazuko; Amano, Hidenori; Uchida, Hayato; Kakuno, Fumihiko; Shinkai, Shoji

    2007-09-01

    We have launched a new intervention study, called "REPRINTS" (Research of productivity by intergenerational sympathy), in which senior volunteers aged 60 years and over engage in reading picture books to school children, regularly visiting public elementary schools since 2004. The purpose of this study was to clarify characteristics of images of older people held by elementary school children and factors associated with such images, as well as to examine changes in images through intervention by "REPRINTS" senior volunteers (volunteers) over the initial one-year period. Four to six volunteers as a group visited elementary school A in suburban Kawasaki City (470 students) twice a week to read picture books. The baseline survey was conducted one month after launching the volunteer activity. First and second follow-up surveys were conducted at 6-month intervals after the baseline survey. Measures included grade, gender, the short version of the emotional-like image scale of older adults assessed by the SD (Semantic Differential) method (6 items in the subscale for "evaluation" and 4 items in the subscale for "potency/activity"), experience of living with grandparents, experience of interchange with older people, frequency of interchange with volunteers, and the social desirability scale for children. Related variables for a higher score in the subscale for "evaluation" included lower grade and abundant experience of interchange with older people such as grandparents. Those for "potency/activity" included lower grade, male gender, and a higher social desirability scale for children in the multiple logistic regression model. Students were divided into two groups in terms of frequency of interchange with volunteers (low and high-frequency groups) through the three surveys. In the subscale for "evaluation", the general linear model demonstrated a significant interaction between the group and number of surveys adjusted for confounding factors. Although emotional images of older people significantly

  12. Strengthening analyses and mechanical assessment of Ti/Al{sub 2}O{sub 3} nano-composites produced by friction stir processing

    Energy Technology Data Exchange (ETDEWEB)

    Shafiei-Zarghani, Aziz, E-mail: ashafiei@ut.ac.ir [Center of Excellence for Surface Engineering and Corrosion Protection of Industries, School of Metallurgy and Materials Engineering, College of Engineering, University of Tehran, Tehran (Iran, Islamic Republic of); Kashani-Bozorg, Seyed Farshid, E-mail: fkashani@ut.ac.ir [Center of Excellence for Surface Engineering and Corrosion Protection of Industries, School of Metallurgy and Materials Engineering, College of Engineering, University of Tehran, Tehran (Iran, Islamic Republic of); Gerlich, Adrian P., E-mail: adrian.gerlich@uwaterloo.ca [Department of Mechanical and Mechatronics Engineering, University of Waterloo, Waterloo (Canada)

    2015-04-17

    The present work investigates strengthening mechanisms and mechanical assessment of Ti/Al{sub 2}O{sub 3} nano-composites produced by friction stir processing of commercially pure titanium using nano-sized Al{sub 2}O{sub 3} with different volume fractions and particle sizes. Microstructural analyses were conducted to characterize the grain size of matrix, size and dispersion of reinforcing particles. The mean grain size of the composites ranged from ~0.7 to 1.1 μm that is much lower than 28 μm of the as-received material. Reduction of grain size was found to be in agreement with Rios approach (based on energy dissipated during the motion of an interface through particle dispersion), and showed deviation from Zener pinning model. Scanning and transmission electron microscopies revealed a near uniform dispersion of Al{sub 2}O{sub 3} nano-particles, with only a small fraction of widely spaced clusters. The maximum compression yield strength of the fabricated nano-composite (Ti/3.9%vol of 20 nm-Al{sub 2}O{sub 3}) was found to be ~494 MPa that is ~1.5 times higher than that of the as-received material. Strengthening analyses based on grain refining (Hall–Petch approach), load transfer from matrix to reinforcements, Orowan looping, and enhanced dislocation density due to thermal mismatch effects were carried out considering Al{sub 2}O{sub 3} reinforcement with different volume fractions and sizes. However, Hall–Petch approach was found to be the dominant mechanism for the enhancement of yield strength.

  13. Dynamic performance assessment of a residential building-integrated cogeneration system under different boundary conditions. Part II: Environmental and economic analyses

    International Nuclear Information System (INIS)

    Rosato, Antonio; Sibilio, Sergio; Scorpio, Michelangelo

    2014-01-01

    Highlights: • A building-integrated micro-cogeneration system was dynamically simulated. • Simulation data were analyzed from both environmental and economic points of view. • The proposed system was compared with a conventional supply system. • The proposed system reduces the environmental impact under heat-led operation. • The proposed system reduces the operating costs whatever the control logic is. - Abstract: This work examines the performance of a residential building-integrated micro-cogeneration system during the winter by means of whole-building simulation software. The cogeneration unit was coupled with a multi-family house composed of three floors, compliant with the transmittance values of both walls and windows suggested by the Italian Law; a stratified combined tank for both heating purposes and domestic hot water production was also used for storing heat. Simulations were performed considering the transient nature of the building and occupant driven loads as well as the part-load characteristics of the cogeneration unit. This system was described in detail and analyzed from an energy point of view in the companion paper. In this paper the simulation results were evaluated in terms of both carbon dioxide equivalent emissions and operating costs; detailed analyses were performed in order to estimate the influence of the most significant boundary conditions on both environmental and economic performance of the proposed system: in particular, three volumes of the hot water storage, four climatic zones corresponding to four Italian cities, two electric demand profiles, as well as two control strategies for the micro-cogeneration unit were considered. The assessment of environmental impact was performed by using the standard emission factors approach, neglecting the effects of local pollutants. The operating costs due to both natural gas and electric energy consumption were evaluated in detail, whereas both the capital and maintenance costs were

  14. Assessment of wastewater and recycled water quality: a comparison of lines of evidence from in vitro, in vivo and chemical analyses.

    Science.gov (United States)

    Leusch, Frederic D L; Khan, Stuart J; Gagnon, M Monique; Quayle, Pam; Trinh, Trang; Coleman, Heather; Rawson, Christopher; Chapman, Heather F; Blair, Palenque; Nice, Helen; Reitsema, Tarren

    2014-03-01

    We investigated water quality at an advanced water reclamation plant and three conventional wastewater treatment plants using an "ecotoxicity toolbox" consisting of three complementary analyses (chemical analysis, in vitro bioanalysis and in situ biological monitoring), with a focus on endocrine disruption. The in vitro bioassays were chosen to provide an appropriately wide coverage of biological effects relevant to managed aquifer recharge and environmental discharge of treated wastewater, and included bioassays for bacterial toxicity (Microtox), genotoxicity (umuC), photosynthesis inhibition (Max-I-PAM) and endocrine effects (E-SCREEN and AR-CALUX). Chemical analysis of hormones and pesticides using LCMSMS was performed in parallel to correlate standard analytical methods with the in vitro assessment. For two plants with surface water discharge into open drains, further field work was carried out to examine in situ effects using mosquitofish (Gambusia holbrooki) as a bioindicator species for possible endocrine effects. The results show considerable cytotoxicity, phytotoxicity, estrogenicity and androgenicity in raw sewage, all of which were significantly reduced by conventional wastewater treatment. No biological response was detected to RO water, suggesting that reverse osmosis is a significant barrier to biologically active compounds. Chemical analysis and in situ monitoring revealed trends consistent with the in vitro results: chemical analysis confirmed the removal trends observed by the bioanalytical tools, and in situ sampling did not reveal any evidence of endocrine disruption specifically due to discharge of treated wastewater (although other sources may be present). Biomarkers of exposure (in vitro) and effect (in vivo or in situ) are complementary and together provide information with a high level of ecological relevance. This study illustrates the utility of combining multiple lines of evidence in the assessment of water quality. Copyright

  15. Sao Paulo Lightning Mapping Array (SP-LMA): Network Assessment and Analyses for Intercomparison Studies and GOES-R Proxy Activities

    Science.gov (United States)

    Bailey, J. C.; Blakeslee, R. J.; Carey, L. D.; Goodman, S. J.; Rudlosky, S. D.; Albrecht, R.; Morales, C. A.; Anselmo, E. M.; Neves, J. R.; Buechler, D. E.

    2014-01-01

    A 12 station Lightning Mapping Array (LMA) network was deployed during October 2011 in the vicinity of Sao Paulo, Brazil (SP-LMA) to contribute total lightning measurements to an international field campaign [CHUVA - Cloud processes of tHe main precipitation systems in Brazil: A contribUtion to cloud resolVing modeling and to the GPM (GlobAl Precipitation Measurement)]. The SP-LMA was operational from November 2011 through March 2012 during the Vale do Paraiba campaign. Sensor spacing was on the order of 15-30 km, with a network diameter on the order of 40-50 km. The SP-LMA provides good 3-D lightning mapping out to 150 km from the network center, with 2-D coverage considerably farther. In addition to supporting CHUVA science/mission objectives, the SP-LMA is supporting the generation of unique proxy data for the Geostationary Lightning Mapper (GLM) and Advanced Baseline Imager (ABI), on NOAA's Geostationary Operational Environmental Satellite-R (GOES-R: scheduled for a 2015 launch). These proxy data will be used to develop and validate operational algorithms so that they will be ready to use on "day 1" following the GOES-R launch. As the CHUVA Vale do Paraiba campaign opportunity was formulated, a broad community-based interest developed for a comprehensive Lightning Location System (LLS) intercomparison and assessment study, leading to the participation and/or deployment of eight other ground-based networks and the space-based Lightning Imaging Sensor (LIS). The SP-LMA data are being intercompared with lightning observations from other deployed lightning networks to advance our understanding of the capabilities/contributions of each of these networks toward GLM proxy and validation activities. This paper addresses the network assessment including noise reduction criteria, detection efficiency estimates, and statistical and climatological (both temporally and spatially) analyses for intercomparison studies and GOES-R proxy activities.

  16. Downscaling global land cover projections from an integrated assessment model for use in regional analyses: results and evaluation for the US from 2005 to 2095

    International Nuclear Information System (INIS)

    West, Tristram O; Le Page, Yannick; Wolf, Julie; Thomson, Allison M; Huang, Maoyi

    2014-01-01

    Projections of land cover change generated from integrated assessment models (IAM) and other economic-based models can be applied for analyses of environmental impacts at sub-regional and landscape scales. For those IAM and economic models that project land cover change at the continental or regional scale, these projections must be downscaled and spatially distributed prior to use in climate or ecosystem models. Downscaling efforts to date have been conducted at the national extent with relatively high spatial resolution (30 m) and at the global extent with relatively coarse spatial resolution (0.5°). We revised existing methods to downscale global land cover change projections for the US to 0.05° resolution using MODIS land cover data as the initial proxy for land class distribution. Land cover change realizations generated here represent a reference scenario and two emissions mitigation pathways (MPs) generated by the global change assessment model (GCAM). Future gridded land cover realizations are constructed for each MODIS plant functional type (PFT) from 2005 to 2095, commensurate with the Community Land Model PFT land classes, and archived for public use. The GCAM land cover realizations provide spatially explicit estimates of potential shifts in croplands, grasslands, shrublands, and forest lands. Downscaling of the MPs indicates a net replacement of grassland by cropland in the western US and by forest in the eastern US. An evaluation of the downscaling method indicates that it is able to reproduce recent changes in cropland and grassland distributions in respective areas in the US, suggesting it could provide relevant insights into the potential impacts of socio-economic and environmental drivers on future changes in land cover.

  17. Assessing public speaking fear with the short form of the Personal Report of Confidence as a Speaker scale: confirmatory factor analyses among a French-speaking community sample

    Directory of Open Access Journals (Sweden)

    Heeren A

    2013-05-01

    Full Text Available Alexandre Heeren,1,2 Grazia Ceschi,3 David P Valentiner,4 Vincent Dethier,1 Pierre Philippot1 (1Université Catholique de Louvain, Louvain-la-Neuve, Belgium; 2National Fund for Scientific Research, Brussels, Belgium; 3Department of Psychology, University of Geneva, Geneva, Switzerland; 4Department of Psychology, Northern Illinois University, DeKalb, IL, USA). Background: The main aim of this study was to assess the reliability and structural validity of the French version of the 12-item Personal Report of Confidence as a Speaker (PRCS), one of the most promising measures of public speaking fear. Methods: A total of 611 French-speaking volunteers were administered the French versions of the short PRCS, the Liebowitz Social Anxiety Scale, the Fear of Negative Evaluation scale, as well as the Trait version of the Spielberger State-Trait Anxiety Inventory and the Beck Depression Inventory-II, which assess the levels of anxious and depressive symptoms, respectively. Results: Regarding its structural validity, confirmatory factor analyses indicated a single-factor solution, as implied by the original version. Good scale reliability (Cronbach's alpha = 0.86) was observed. The item discrimination analysis suggested that all the items contribute to the overall scale score reliability. The French version of the short PRCS showed significant correlations with the Liebowitz Social Anxiety Scale (r = 0.522), the Fear of Negative Evaluation scale (r = 0.414), the Spielberger State-Trait Anxiety Inventory (r = 0.516), and the Beck Depression Inventory-II (r = 0.361). Conclusion: The French version of the short PRCS is a reliable and valid measure for the evaluation of the fear of public speaking among a French-speaking sample. These findings have critical consequences for the measurement of psychological and pharmacological treatment effectiveness in public speaking fear among a French-speaking sample. Keywords: social phobia, public speaking, confirmatory

  18. Regression: The Apple Does Not Fall Far From the Tree.

    Science.gov (United States)

    Vetter, Thomas R; Schober, Patrick

    2018-05-15

    Researchers and clinicians are frequently interested in either: (1) assessing whether there is a relationship or association between 2 or more variables and quantifying this association; or (2) determining whether 1 or more variables can predict another variable. The strength of such an association is mainly described by the correlation. However, regression analysis and regression models can be used not only to identify whether there is a significant relationship or association between variables but also to generate estimations of such a predictive relationship between variables. This basic statistical tutorial discusses the fundamental concepts and techniques related to the most common types of regression analysis and modeling, including simple linear regression, multiple regression, logistic regression, ordinal regression, and Poisson regression, as well as the common yet often underrecognized phenomenon of regression toward the mean. The various types of regression analysis are powerful statistical techniques, which when appropriately applied, can allow for the valid interpretation of complex, multifactorial data. Regression analysis and models can assess whether there is a relationship or association between 2 or more observed variables and estimate the strength of this association, as well as determine whether 1 or more variables can predict another variable. Regression is thus being applied more commonly in anesthesia, perioperative, critical care, and pain research. However, it is crucial to note that regression can identify plausible risk factors; it does not prove causation (a definitive cause and effect relationship). The results of a regression analysis instead identify independent (predictor) variable(s) associated with the dependent (outcome) variable. As with other statistical methods, applying regression requires that certain assumptions be met, which can be tested with specific diagnostics.
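
    As a concrete companion to the tutorial's list of model types, here is a minimal Python sketch (the article itself presents no code) fitting linear, logistic, and Poisson regressions with statsmodels on invented data; all variable names are illustrative assumptions.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 500
    df = pd.DataFrame({"age": rng.uniform(20, 80, n), "dose": rng.uniform(0, 10, n)})
    df["pain_score"] = 2.0 + 0.05 * df["age"] - 0.3 * df["dose"] + rng.normal(0, 1, n)
    df["event"] = rng.binomial(1, 1 / (1 + np.exp(-(-3 + 0.04 * df["age"]))))
    df["n_complications"] = rng.poisson(np.exp(-1 + 0.02 * df["age"]))

    linear = smf.ols("pain_score ~ age + dose", data=df).fit()           # simple/multiple linear
    logistic = smf.logit("event ~ age", data=df).fit(disp=0)             # binary outcome
    poisson = smf.poisson("n_complications ~ age", data=df).fit(disp=0)  # count outcome
    print(linear.params, logistic.params, poisson.params, sep="\n")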

  19. Lichens biomonitoring as feasible methodology to assess air pollution in natural ecosystems: combined study of quantitative PAHs analyses and lichen biodiversity in the Pyrenees Mountains.

    Science.gov (United States)

    Blasco, María; Domeño, Celia; Nerín, Cristina

    2008-06-01

    The air quality in the Aragón valley, in the central Pyrenees, has been assessed by evaluation of lichen biodiversity and mapped by elaboration of the Index of Air Purity (IAP) based on observations of the presence and abundance of eight kinds of lichen with different sensitivity to air pollution. The IAP values obtained have been compared with quantitative analytical measures of 16 PAHs in the lichen Evernia prunastri, because this species was associated with a wide range of traffic exposure and levels of urbanization. Analyses of PAHs were carried out by the DSASE method followed by an SPE clean-up step and GC-MS analysis. The concentration of total PAHs found in lichen samples from the Aragón valley ranged from 692 to 6420 ng g⁻¹ and the PAHs profile showed predominance of compounds with three aromatic rings. The influence of the road traffic in the area has been shown because values over the median concentration of PAHs (>1092 ng g⁻¹), percentage of combustion PAHs (>50%), and equivalent toxicity (>169) were found in lichens collected at places exposed to the influence of traffic. The combination of both methods suggests IAP as a general method for evaluating the air pollution referenced to PAHs because it can be correlated with the content of combustion PAHs and poor lichen biodiversity can be partly explained by the air pollution caused by specific PAHs.

  20. Producing The New Regressive Left

    DEFF Research Database (Denmark)

    Crone, Christine

    members, this thesis investigates a growing political trend and ideological discourse in the Arab world that I have called The New Regressive Left. On the premise that a media outlet can function as a forum for ideology production, the thesis argues that an analysis of this material can help to trace...... the contexture of The New Regressive Left. If the first part of the thesis lays out the theoretical approach and draws the contextual framework, through an exploration of the surrounding Arab media-and ideoscapes, the second part is an analytical investigation of the discourse that permeates the programmes aired...... becomes clear from the analytical chapters is the emergence of the new cross-ideological alliance of The New Regressive Left. This emerging coalition between Shia Muslims, religious minorities, parts of the Arab Left, secular cultural producers, and the remnants of the political,strategic resistance...

  1. A Matlab program for stepwise regression

    Directory of Open Access Journals (Sweden)

    Yanhong Qi

    2016-03-01

    Full Text Available Stepwise linear regression is a multi-variable regression method for identifying statistically significant variables in a linear regression equation. In the present study, we present a Matlab program for stepwise regression.
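
    The record's Matlab source is not reproduced here; the following is a hedged Python analogue of one common stepwise variant, greedy forward selection with a p-value entry threshold (the threshold, data, and function name are assumptions).

    import numpy as np
    import statsmodels.api as sm

    def forward_stepwise(X, y, alpha_enter=0.05):
        selected, remaining = [], list(range(X.shape[1]))
        while remaining:
            pvals = {}
            for j in remaining:
                cols = selected + [j]
                fit = sm.OLS(y, sm.add_constant(X[:, cols])).fit()
                pvals[j] = fit.pvalues[-1]        # p-value of the candidate term
            best = min(pvals, key=pvals.get)
            if pvals[best] >= alpha_enter:        # no remaining variable qualifies
                break
            selected.append(best)
            remaining.remove(best)
        return selected

    rng = np.random.default_rng(1)
    X = rng.normal(size=(200, 6))
    y = 1.5 * X[:, 0] - 2.0 * X[:, 3] + rng.normal(size=200)
    print(forward_stepwise(X, y))                 # typically selects columns 0 and 3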

  2. Regression filter for signal resolution

    International Nuclear Information System (INIS)

    Matthes, W.

    1975-01-01

    The problem considered is that of resolving a measured pulse height spectrum of a material mixture, e.g. a gamma-ray or Raman spectrum, into a weighted sum of the spectra of the individual constituents. The model on which the analytical formulation is based is described. The problem reduces to that of a multiple linear regression. A stepwise linear regression procedure was constructed. The efficiency of this method was then tested by implementing the procedure as a computer programme which was used to unfold test spectra obtained by mixing some spectra from a library of arbitrarily chosen spectra and adding a noise component. (U.K.)
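
    A minimal sketch of the unfolding idea, assuming a non-negative least-squares formulation in place of the original stepwise procedure: the measured spectrum is modelled as a weighted sum of library spectra, and the weights are recovered by regression.

    import numpy as np
    from scipy.optimize import nnls

    rng = np.random.default_rng(2)
    n_channels = 128
    library = rng.random((n_channels, 3))           # columns: constituent spectra
    true_w = np.array([2.0, 0.0, 5.0])              # mixing weights
    measured = library @ true_w + rng.normal(0, 0.05, n_channels)  # noisy mixture

    weights, residual = nnls(library, measured)     # constrain weights >= 0
    print(weights)                                  # close to [2, 0, 5]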

  3. Nonparametric Mixture of Regression Models.

    Science.gov (United States)

    Huang, Mian; Li, Runze; Wang, Shaoli

    2013-07-01

    Motivated by an analysis of US house price index data, we propose nonparametric finite mixture of regression models. We study the identifiability issue of the proposed models, and develop an estimation procedure by employing kernel regression. We further systematically study the sampling properties of the proposed estimators, and establish their asymptotic normality. A modified EM algorithm is proposed to carry out the estimation procedure. We show that our algorithm preserves the ascent property of the EM algorithm in an asymptotic sense. Monte Carlo simulations are conducted to examine the finite sample performance of the proposed estimation procedure. An empirical analysis of the US house price index data is illustrated for the proposed methodology.
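
    The proposed estimator is nonparametric; as a simpler illustration of the mixture-of-regressions idea it builds on, here is a parametric two-component EM sketch on synthetic data (all settings are assumptions, not the paper's procedure).

    import numpy as np

    rng = np.random.default_rng(3)
    n = 400
    x = rng.uniform(-2, 2, n)
    z = rng.random(n) < 0.5                          # latent component labels
    y = np.where(z, 1 + 2 * x, -1 - 2 * x) + rng.normal(0, 0.3, n)
    X = np.column_stack([np.ones(n), x])

    beta = [np.array([0.5, 1.0]), np.array([-0.5, -1.0])]   # initial guesses
    sigma, pi = [1.0, 1.0], [0.5, 0.5]
    for _ in range(50):
        # E-step: responsibilities under Gaussian component densities
        dens = [pi[k] / sigma[k] * np.exp(-0.5 * ((y - X @ beta[k]) / sigma[k]) ** 2)
                for k in range(2)]
        r = dens[0] / (dens[0] + dens[1])
        resp = [r, 1 - r]
        # M-step: weighted least squares per component
        for k in range(2):
            W = resp[k]
            XtW = X.T * W
            beta[k] = np.linalg.solve(XtW @ X, XtW @ y)
            sigma[k] = np.sqrt(np.sum(W * (y - X @ beta[k]) ** 2) / W.sum())
            pi[k] = W.mean()
    print(beta)   # approximately [1, 2] and [-1, -2] (up to label switching)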

  4. Cactus: An Introduction to Regression

    Science.gov (United States)

    Hyde, Hartley

    2008-01-01

    When the author first used "VisiCalc," the author thought it a very useful tool when he had the formulas. But how could he design a spreadsheet if there was no known formula for the quantities he was trying to predict? A few months later, the author relates he learned to use multiple linear regression software and suddenly it all clicked into…

  5. Regression Models for Repairable Systems

    Czech Academy of Sciences Publication Activity Database

    Novák, Petr

    2015-01-01

    Roč. 17, č. 4 (2015), s. 963-972 ISSN 1387-5841 Institutional support: RVO:67985556 Keywords : Reliability analysis * Repair models * Regression Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.782, year: 2015 http://library.utia.cas.cz/separaty/2015/SI/novak-0450902.pdf

  6. Survival analysis II: Cox regression

    NARCIS (Netherlands)

    Stel, Vianda S.; Dekker, Friedo W.; Tripepi, Giovanni; Zoccali, Carmine; Jager, Kitty J.

    2011-01-01

    In contrast to the Kaplan-Meier method, Cox proportional hazards regression can provide an effect estimate by quantifying the difference in survival between patient groups and can adjust for confounding effects of other variables. The purpose of this article is to explain the basic concepts of the

  7. Kernel regression with functional response

    OpenAIRE

    Ferraty, Frédéric; Laksaci, Ali; Tadj, Amel; Vieu, Philippe

    2011-01-01

    We consider kernel regression estimate when both the response variable and the explanatory one are functional. The rates of uniform almost complete convergence are stated as function of the small ball probability of the predictor and as function of the entropy of the set on which uniformity is obtained.

  8. Linear regression and the normality assumption.

    Science.gov (United States)

    Schmidt, Amand F; Finan, Chris

    2017-12-16

    Researchers often perform arbitrary outcome transformations to fulfill the normality assumption of a linear regression model. This commentary explains and illustrates that in large data settings, such transformations are often unnecessary, and worse may bias model estimates. Linear regression assumptions are illustrated using simulated data and an empirical example on the relation between time since type 2 diabetes diagnosis and glycated hemoglobin levels. Simulation results were evaluated on coverage; i.e., the number of times the 95% confidence interval included the true slope coefficient. Although outcome transformations bias point estimates, violations of the normality assumption in linear regression analyses do not. The normality assumption is necessary to unbiasedly estimate standard errors, and hence confidence intervals and P-values. However, in large sample sizes (e.g., where the number of observations per variable is >10), violations of this normality assumption often do not noticeably impact results. Contrary to this, assumptions on the parametric model, absence of extreme observations, homoscedasticity, and independence of the errors remain influential even in large sample size settings. Given that modern healthcare research typically includes thousands of subjects, focusing on the normality assumption is often unnecessary, does not guarantee valid results, and worse may bias estimates due to the practice of outcome transformations. Copyright © 2017 Elsevier Inc. All rights reserved.
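
    A sketch of the kind of coverage simulation the commentary describes, assuming skewed mean-zero errors: the OLS 95% confidence interval for the slope stays near nominal coverage despite the normality violation.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(4)
    true_slope, n, hits = 0.5, 1000, 0
    for _ in range(2000):
        x = rng.normal(size=n)
        eps = rng.exponential(1.0, n) - 1.0       # skewed, mean-zero errors
        y = true_slope * x + eps
        fit = sm.OLS(y, sm.add_constant(x)).fit()
        lo, hi = fit.conf_int()[1]                # 95% CI for the slope
        hits += lo <= true_slope <= hi
    print(hits / 2000)                            # close to 0.95 despite non-normality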

  9. The use of managerial grid to analyse the relationship between assessment of human and production or service issues in various companies

    Directory of Open Access Journals (Sweden)

    Krzysztof Knop

    2015-12-01

    Full Text Available In the article the results of BOST method usage are presented to analyse the importance of human and production/service issues in three different companies - a steelworks, a plastic-processing company and a retail chain. The importance of human and production/service issues was analysed using the concept of the managerial grid. The relationship between workers' answers was analysed with the managerial grid after dividing the answers into four and three areas. The frequency of occurrence of ratings was analysed to determine the degree of perceived importance of human and production/service issues in these companies.

  10. Quantile Regression With Measurement Error

    KAUST Repository

    Wei, Ying

    2009-08-27

    Regression quantiles can be substantially biased when the covariates are measured with error. In this paper we propose a new method that produces consistent linear quantile estimation in the presence of covariate measurement error. The method corrects the measurement error induced bias by constructing joint estimating equations that simultaneously hold for all the quantile levels. An iterative EM-type estimation algorithm to obtain the solutions to such joint estimation equations is provided. The finite sample performance of the proposed method is investigated in a simulation study, and compared to the standard regression calibration approach. Finally, we apply our methodology to part of the National Collaborative Perinatal Project growth data, a longitudinal study with an unusual measurement error structure. © 2009 American Statistical Association.
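
    For orientation, a naive statsmodels QuantReg fit on data with an error-prone covariate; the paper's joint estimating-equation correction is not implemented here, so the sketch mainly exhibits the attenuation bias the method addresses (all settings are assumptions).

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(5)
    n = 500
    x_true = rng.normal(size=n)
    y = 1.0 + 2.0 * x_true + rng.normal(0, 1, n)
    x_obs = x_true + rng.normal(0, 0.5, n)        # covariate measured with error

    for tau in (0.25, 0.5, 0.75):
        fit = sm.QuantReg(y, sm.add_constant(x_obs)).fit(q=tau)
        print(tau, fit.params[1])                 # slopes attenuated toward zero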

  11. Multivariate and semiparametric kernel regression

    OpenAIRE

    Härdle, Wolfgang; Müller, Marlene

    1997-01-01

    The paper gives an introduction to theory and application of multivariate and semiparametric kernel smoothing. Multivariate nonparametric density estimation is an often used pilot tool for examining the structure of data. Regression smoothing helps in investigating the association between covariates and responses. We concentrate on kernel smoothing using local polynomial fitting which includes the Nadaraya-Watson estimator. Some theory on the asymptotic behavior and bandwidth selection is pro...

  12. Regression algorithm for emotion detection

    OpenAIRE

    Berthelon , Franck; Sander , Peter

    2013-01-01

    We present here two components of a computational system for emotion detection. PEMs (Personalized Emotion Maps) store links between bodily expressions and emotion values, and are individually calibrated to capture each person's emotion profile. They are an implementation based on aspects of Scherer's theoretical complex system model of emotion [scherer00, scherer09]. We also present a regression algorithm that determines a person's emotional feeling from sensor m...

  13. Directional quantile regression in R

    Czech Academy of Sciences Publication Activity Database

    Boček, Pavel; Šiman, Miroslav

    2017-01-01

    Roč. 53, č. 3 (2017), s. 480-492 ISSN 0023-5954 R&D Projects: GA ČR GA14-07234S Institutional support: RVO:67985556 Keywords : multivariate quantile * regression quantile * halfspace depth * depth contour Subject RIV: BD - Theory of Information OBOR OECD: Applied mathematics Impact factor: 0.379, year: 2016 http://library.utia.cas.cz/separaty/2017/SI/bocek-0476587.pdf

  14. Polylinear regression analysis in radiochemistry

    International Nuclear Information System (INIS)

    Kopyrin, A.A.; Terent'eva, T.N.; Khramov, N.N.

    1995-01-01

    A number of radiochemical problems have been formulated in the framework of polylinear regression analysis, which permits the use of conventional mathematical methods for their solution. The authors have considered features of the use of polylinear regression analysis for estimating the contributions of various sources to the atmospheric pollution, for studying irradiated nuclear fuel, for estimating concentrations from spectral data, for measuring neutron fields of a nuclear reactor, for estimating crystal lattice parameters from X-ray diffraction patterns, for interpreting data of X-ray fluorescence analysis, for estimating complex formation constants, and for analyzing results of radiometric measurements. The problem of estimating the target parameters can be ill-posed for certain properties of the system under study. The authors showed the possibility of regularization by adding a fictitious set of data "obtained" from the orthogonal design. To estimate only a part of the parameters under consideration, the authors used incomplete rank models. In this case, it is necessary to take into account the possibility of confounding estimates. An algorithm for evaluating the degree of confounding is presented which is realized using standard regression-analysis software.
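
    One reading of the regularization trick described, appending a fictitious data set from an orthogonal design, is ridge regression by data augmentation; a minimal sketch under that assumption (the lambda value and data are invented):

    import numpy as np

    rng = np.random.default_rng(15)
    X = rng.normal(size=(50, 4))
    X[:, 3] = X[:, 2] + rng.normal(0, 1e-3, 50)     # near-collinear: ill-posed OLS
    y = X @ np.array([1.0, 2.0, 3.0, 0.0]) + rng.normal(0, 0.1, 50)

    lam = 0.1
    X_aug = np.vstack([X, np.sqrt(lam) * np.eye(4)])  # fictitious orthogonal rows
    y_aug = np.concatenate([y, np.zeros(4)])
    beta = np.linalg.lstsq(X_aug, y_aug, rcond=None)[0]
    print(beta)                                     # stabilised estimates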

  15. Gaussian Process Regression Model in Spatial Logistic Regression

    Science.gov (United States)

    Sofro, A.; Oktaviarina, A.

    2018-01-01

    Spatial analysis has developed very quickly in the last decade. One of the favorite approaches is based on the neighbourhood of the region. Unfortunately, there are some limitations such as difficulty in prediction. Therefore, we offer Gaussian process regression (GPR) to accommodate the issue. In this paper, we will focus on spatial modeling with GPR for binomial data with logit link function. The performance of the model will be investigated. We will discuss the inference of how to estimate the parameters and hyper-parameters and to predict as well. Furthermore, simulation studies will be explained in the last section.
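
    A small sketch in the same spirit, using scikit-learn's GP classifier (which couples a latent GP with a logistic-type likelihood) on invented spatial binary data; the kernel, coordinates, and data-generating model are assumptions.

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessClassifier
    from sklearn.gaussian_process.kernels import RBF

    rng = np.random.default_rng(6)
    coords = rng.uniform(0, 10, size=(200, 2))      # spatial locations
    latent = np.sin(coords[:, 0]) + 0.5 * coords[:, 1] - 2.5
    y = rng.random(200) < 1 / (1 + np.exp(-latent)) # Bernoulli field with logit link

    gpc = GaussianProcessClassifier(kernel=1.0 * RBF(length_scale=2.0)).fit(coords, y)
    grid = np.array([[2.0, 3.0], [8.0, 9.0]])
    print(gpc.predict_proba(grid))                  # predicted spatial probabilities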

  16. Assessment of aflatoxin exposure of laboratory worker during food contamination analyses. Assessment of the procedures adopted by an A.R.P.A.L. laboratory (Liguria Region Environmental Protection Agency).

    Science.gov (United States)

    Traverso, A; Bassoli, Viviana; Cioè, A; Anselmo, Silvia; Ferro, Marta

    2010-01-01

    Aflatoxins are mycotoxins derived from foodstuffs colonized by fungal species of the genus Aspergillus; they are common food contaminants with immunosuppressive, mutagenic and carcinogenic activity. Aflatoxins are heat-resistant and are thus easily transmitted along the food chain. They are hepatotoxic and have the potential to induce hepatocellular carcinoma. Agri-food industry workers are thus at risk of ingestion as well as transmucosal absorption or inhalation of toxins released during product preparation or processing. The aim was to measure the levels of airborne mycotoxins, particularly aflatoxins, in a laboratory analysing imported foodstuffs for mycotoxin contamination. The protocol used to analyse a batch of shelled peanuts from Vietnam, especially the grinding phase, which is held to be at the highest risk of generating airborne toxins, was assessed at the A.R.P.A.L. laboratory (Liguria Region Environmental Protection Agency) of Genoa, Italy, which participates in a European aflatoxin monitoring project. Wet grinding was performed to avoid production of large amounts of dust. Comparison of airborne concentrations before and after grinding with legal thresholds disclosed that the analytical procedures involved negligible aflatoxin levels for operators (environmental burden 0.11 pg/m3). Given the toxicity of aflatoxins, worker protection measures should be consistently adopted and enforced. Threshold limit values for working environments should be introduced besides the existing ones for public health.

  17. Mis-Match Limit Load Analyses and Fracture Mechanics Assessment for Welded Pipe with Circumferential Crack at the Center of Weldment

    Energy Technology Data Exchange (ETDEWEB)

    Song, Tae Kwang; Jeon, Jun Young; Shim, Kwang Bo; Kim, Yun Jae [Korea University, Seoul (Korea, Republic of); Kim, Jong Sung [Sunchon University, Suncheon (Korea, Republic of); Jin, Tae Eun [Korea Power Engineering Company, Daejeon (Korea, Republic of)

    2010-01-15

    In this paper, limit load analyses and fracture mechanics analyses were conducted via finite element analyses for a welded pipe with a circumferential crack at the center of the weldment. Systematic variations of the strength mismatch ratio, width of weldment, crack shape and thickness ratio of the pipe were considered to provide strength-mismatch limit loads. J-integral calculations based on the reference stress method were then conducted for two materials, stainless steel and ferritic steel. The reference stress defined by the provided strength-mismatch limit load gives much more accurate J-integral estimates.

  18. Assessing microbial degradation of o-xylene at field-scale from the reduction in mass flow rate combined with compound-specific isotope analyses

    Science.gov (United States)

    Peter, A.; Steinbach, A.; Liedl, R.; Ptak, T.; Michaelis, W.; Teutsch, G.

    2004-07-01

    In recent years, natural attenuation (NA) has evolved into a possible remediation alternative, especially in the case of BTEX spills. In order to be approved by the regulators, biodegradation needs to be demonstrated, which requires efficient site investigation and monitoring tools. Three methods—the Integral Groundwater Investigation method, compound-specific isotope analysis (CSIA) and a newly developed combination of both—were used in this work to quantify at field scale the biodegradation of o-xylene at a former gasworks site which is heavily contaminated with BTEX and PAHs. First, the Integral Groundwater Investigation method [Schwarz, R., Ptak, T., Holder, T., Teutsch, G., 1998. Groundwater risk assessment at contaminated sites: a new investigation approach. In: Herbert, M. and Kovar, K. (Editors), GQ'98 Groundwater Quality: Remediation and Protection. IAHS Publication 250, pp. 68-71; COH 4 (2000) 170] was applied, which allows the determination of mass flow rates of o-xylene by integral pumping tests. Concentration time series obtained during pumping at two wells were used to inversely calculate contaminant mass flow rates at the two control planes that are defined by the diameter of the maximum isochrone. A reactive transport model was used within a Monte Carlo approach to identify biodegradation as the dominant process for the reduction in the contaminant mass flow rate between the two consecutive control planes. Secondly, compound-specific carbon isotope analyses of o-xylene were performed on the basis of point-scale samples from the same two wells. The Rayleigh equation was used to quantify the degree of biodegradation that occurred between the wells. Thirdly, a combination of the Integral Groundwater Investigation method and the compound-specific isotope analysis was developed and applied. It comprises isotope measurements during the integral pumping tests and the evaluation of δ13C time series by an inversion algorithm to obtain spatially

  19. The Combined Use of in Silico, in Vitro, and in Vivo Analyses to Assess Anti-cancerous Potential of a Bioactive Compound from Cyanobacterium Nostoc sp. MGL001

    Directory of Open Access Journals (Sweden)

    Niveshika

    2017-11-01

    Full Text Available Escalating incidences of cancer, especially in developed and developing countries, demand evaluation of potential unexplored natural drug resources. Here, the anticancer potential of 9-Ethyliminomethyl-12-(morpholin-4-ylmethoxy)-5,8,13,16-tetraaza-hexacene-2,3-dicarboxylic acid (EMTAHDCA) isolated from the freshwater cyanobacterium Nostoc sp. MGL001 was screened through in silico, in vitro, and in vivo studies. For in silico analysis, EMTAHDCA was selected as the ligand and 11 cancer-related proteins (Protein Data Bank ID: 1BIX, 1NOW, 1TE6, 2RCW, 2UVL, 2VCJ, 3CRY, 3HQU, 3NMQ, 5P21, and 4B7P), which are common targets of various anticancer drugs, were selected as receptors. The results obtained from the in silico analysis showed that EMTAHDCA has strong binding affinity for all 11 target protein receptors. The ability of EMTAHDCA to bind active sites of cancer protein targets indicated that it is functionally similar to commercially available anticancer drugs. For assessing cellular metabolic activities, in vitro studies were performed using a colorimetric assay, viz. 3-(4,5-dimethylthiazol-2-yl)-2,5-diphenyltetrazolium bromide (MTT). Results showed that EMTAHDCA induced a significant cytotoxic response against Dalton's lymphoma ascites (DLA) cells in a dose- and time-dependent manner, with an inhibitory concentration (IC50) value of 372.4 ng/mL after 24 h of incubation. However, in the case of normal bone marrow cells, EMTAHDCA did not induce cytotoxicity, as the IC50 value was not obtained even with a higher dose of 1,000 ng/mL EMTAHDCA. Further, in vivo studies revealed that the median life span/survival days of tumor-bearing mice treated with EMTAHDCA increased significantly, with fold changes of ~1.9 and 1.81 corresponding to doses of 5 and 10 mg/kg body weight (B.W.) of EMTAHDCA respectively, as compared to the DL group. Our results suggest that 5 mg/kg B.W. is effective since the dose of 10 mg/kg B.W. did not show any significant difference as compared to 5 mg/kg B

  20. Antimicrobial efficacy of Curcuma zedoaria extract as assessed by linear regression compared with commercial mouthrinses

    Directory of Open Access Journals (Sweden)

    Adriana Bugno

    2007-09-01

    Full Text Available The antimicrobial activity of Curcuma zedoaria (Christm.) Roscoe extract against some oral microorganisms was compared with the antimicrobial activity of five commercial mouthrinses in order to evaluate the potential of the plant extract to be incorporated into formulas for improving or creating antiseptic activity. The in vitro antimicrobial efficacy of the plant extract and commercial products was evaluated against Streptococcus mutans, Enterococcus faecalis, Staphylococcus aureus and Candida albicans using a linear regression method to evaluate the microbial reduction obtained as a function of exposure time, considering as effectiveness a 99.999% reduction in counts of standardized microbial populations within 60 seconds. The results showed that the antimicrobial efficacy of Curcuma zedoaria (Christm.) Roscoe extract was similar to that of the commercial products, and its incorporation into a mouthrinse could be an alternative for improving the antimicrobial efficacy of the oral product.

  1. Tutorial on Using Regression Models with Count Outcomes Using R

    Directory of Open Access Journals (Sweden)

    A. Alexander Beaujean

    2016-02-01

    Full Text Available Education researchers often study count variables, such as times a student reached a goal, discipline referrals, and absences. Most researchers that study these variables use typical regression methods (i.e., ordinary least-squares) either with or without transforming the count variables. In either case, using typical regression for count data can produce parameter estimates that are biased, thus diminishing any inferences made from such data. As count-variable regression models are seldom taught in training programs, we present a tutorial to help educational researchers use such methods in their own research. We demonstrate analyzing and interpreting count data using Poisson, negative binomial, zero-inflated Poisson, and zero-inflated negative binomial regression models. The count regression methods are introduced through an example using the number of times students skipped class. The data for this example are freely available and the R syntax used to run the example analyses is included in the Appendix.
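
    The tutorial's examples are in R; a hedged Python translation of the core comparison, Poisson versus negative binomial on overdispersed synthetic counts (standing in for the skipped-class data), looks like this.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(7)
    n = 1000
    x = rng.normal(size=n)
    mu = np.exp(0.5 + 0.8 * x)
    y = rng.negative_binomial(n=2, p=2 / (2 + mu))  # overdispersed counts with mean mu
    X = sm.add_constant(x)

    poisson_fit = sm.GLM(y, X, family=sm.families.Poisson()).fit()
    nb_fit = sm.GLM(y, X, family=sm.families.NegativeBinomial(alpha=0.5)).fit()
    print(poisson_fit.aic, nb_fit.aic)              # NB should fit markedly better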

  2. Spectral density regression for bivariate extremes

    KAUST Repository

    Castro Camilo, Daniela

    2016-05-11

    We introduce a density regression model for the spectral density of a bivariate extreme value distribution, that allows us to assess how extremal dependence can change over a covariate. Inference is performed through a double kernel estimator, which can be seen as an extension of the Nadaraya–Watson estimator where the usual scalar responses are replaced by mean constrained densities on the unit interval. Numerical experiments with the methods illustrate their resilience in a variety of contexts of practical interest. An extreme temperature dataset is used to illustrate our methods. © 2016 Springer-Verlag Berlin Heidelberg
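
    The double kernel estimator itself is specialised; the sketch below shows only its scalar building block, the Nadaraya-Watson estimator that the abstract extends, on invented data with an assumed Gaussian kernel and bandwidth.

    import numpy as np

    def nadaraya_watson(x0, x, y, h):
        # Gaussian-kernel weighted local average of y at the point x0
        w = np.exp(-0.5 * ((x - x0) / h) ** 2)
        return np.sum(w * y) / np.sum(w)

    rng = np.random.default_rng(8)
    x = rng.uniform(0, 2 * np.pi, 300)
    y = np.sin(x) + rng.normal(0, 0.2, 300)
    grid = np.linspace(0, 2 * np.pi, 5)
    print([round(nadaraya_watson(g, x, y, h=0.3), 2) for g in grid])  # ~ sin(grid)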

  3. Spontaneous regression of pulmonary bullae

    International Nuclear Information System (INIS)

    Satoh, H.; Ishikawa, H.; Ohtsuka, M.; Sekizawa, K.

    2002-01-01

    The natural history of pulmonary bullae is often characterized by gradual, progressive enlargement. Spontaneous regression of bullae is, however, very rare. We report a case in which complete resolution of pulmonary bullae in the left upper lung occurred spontaneously. The management of pulmonary bullae is occasionally made difficult because of gradual progressive enlargement associated with abnormal pulmonary function. Some patients have multiple bullae in both lungs and/or have a history of pulmonary emphysema. Others have a giant bulla without emphysematous change in the lungs. Our patient had been treated for lung cancer, with no evidence of local recurrence. He had no emphysematous change on lung function testing and had no complaints, although the high resolution CT scan showed evidence of underlying minimal changes of emphysema. Ortin and Gurney presented three cases of spontaneous reduction in the size of bullae. Interestingly, one of them had a marked decrease in the size of a bulla in association with thickening of the wall of the bulla, which was also observed in our patient. The case we describe is of interest, not only because of the rarity with which regression of pulmonary bullae has been reported in the literature, but also because of the spontaneous improvements in the radiological picture in the absence of overt infection or tumor. Copyright (2002) Blackwell Science Pty Ltd

  4. Quantum algorithm for linear regression

    Science.gov (United States)

    Wang, Guoming

    2017-07-01

    We present a quantum algorithm for fitting a linear regression model to a given data set using the least-squares approach. Differently from previous algorithms which yield a quantum state encoding the optimal parameters, our algorithm outputs these numbers in the classical form. So by running it once, one completely determines the fitted model and then can use it to make predictions on new data at little cost. Moreover, our algorithm works in the standard oracle model, and can handle data sets with nonsparse design matrices. It runs in time poly(log2(N), d, κ, 1/ε), where N is the size of the data set, d is the number of adjustable parameters, κ is the condition number of the design matrix, and ε is the desired precision in the output. We also show that the polynomial dependence on d and κ is necessary. Thus, our algorithm cannot be significantly improved. Furthermore, we also give a quantum algorithm that estimates the quality of the least-squares fit (without computing its parameters explicitly). This algorithm runs faster than the one for finding this fit, and can be used to check whether the given data set qualifies for linear regression in the first place.
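
    For contrast, a sketch of the classical computation the algorithm accelerates, including the condition number κ that enters both runtimes (the data are invented):

    import numpy as np

    rng = np.random.default_rng(9)
    N, d = 10_000, 5
    X = rng.normal(size=(N, d))
    beta_true = np.arange(1, d + 1, dtype=float)
    y = X @ beta_true + rng.normal(0, 0.1, N)

    beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)  # classical least squares
    print(beta_hat)                       # close to [1, 2, 3, 4, 5]
    print(np.linalg.cond(X))              # kappa: near 1 for this well-conditioned X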

  5. Interpretation of commonly used statistical regression models.

    Science.gov (United States)

    Kasza, Jessica; Wolfe, Rory

    2014-01-01

    A review of some regression models commonly used in respiratory health applications is provided in this article. Simple linear regression, multiple linear regression, logistic regression and ordinal logistic regression are considered. The focus of this article is on the interpretation of the regression coefficients of each model, which are illustrated through the application of these models to a respiratory health research study. © 2013 The Authors. Respirology © 2013 Asian Pacific Society of Respirology.

  6. Multiple regression and beyond an introduction to multiple regression and structural equation modeling

    CERN Document Server

    Keith, Timothy Z

    2014-01-01

    Multiple Regression and Beyond offers a conceptually oriented introduction to multiple regression (MR) analysis and structural equation modeling (SEM), along with analyses that flow naturally from those methods. By focusing on the concepts and purposes of MR and related methods, rather than the derivation and calculation of formulae, this book introduces material to students more clearly, and in a less threatening way. In addition to illuminating content necessary for coursework, the accessibility of this approach means students are more likely to be able to conduct research using MR or SEM--and more likely to use the methods wisely. Covers both MR and SEM, while explaining their relevance to one another. Also includes path analysis, confirmatory factor analysis, and latent growth modeling. Figures and tables throughout provide examples and illustrate key concepts and techniques. For additional resources, please visit: http://tzkeith.com/.

  7. Logistic regression against a divergent Bayesian network

    Directory of Open Access Journals (Sweden)

    Noel Antonio Sánchez Trujillo

    2015-01-01

    Full Text Available This article is a discussion of two statistical tools used for prediction and causality assessment: logistic regression and Bayesian networks. Using data from a simulated example study assessing factors that might predict pulmonary emphysema (fingertip pigmentation and smoking are considered), we posed the following questions. Is pigmentation a confounding, causal or predictive factor? Is there perhaps another factor, like smoking, that confounds? Is there a synergy between pigmentation and smoking? The results, in terms of prediction, are similar with the two techniques; regarding causation, differences arise. We conclude that, in decision-making, a statistical tool used with common sense, combined with previous evidence that may have taken years or even centuries to develop, is better than the automatic and exclusive use of statistical resources.
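
    A sketch of how the article's synergy question translates into a regression: an interaction term between smoking and pigmentation in a logistic model. The simulated data, effect sizes, and variable names are assumptions, not the study's.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(10)
    n = 2000
    df = pd.DataFrame({"smoking": rng.binomial(1, 0.4, n),
                       "pigmentation": rng.binomial(1, 0.3, n)})
    logit_p = -2 + 1.5 * df["smoking"] + 0.0 * df["pigmentation"]   # no true synergy
    df["emphysema"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

    fit = smf.logit("emphysema ~ smoking * pigmentation", data=df).fit(disp=0)
    print(fit.params)        # only the smoking main effect should carry signal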

  8. Hydrological Assessment of Model Performance and Scenario Analyses of Land Use Change and Climate Change in lowlands of Veneto Region (Italy)

    Science.gov (United States)

    Pijl, Anton; Brauer, Claudia; Sofia, Giulia; Teuling, Ryan; Tarolli, Paolo

    2017-04-01

    Growing water-related challenges in lowland areas of the world call for good assessment of our past and present actions, in order to guide our future decisions. The novel Wageningen Lowland Runoff Simulator (WALRUS; Brauer et al., 2014) was developed to simulate hydrological processes and has shown promising performance in recent studies in the Netherlands. Here the model was applied to a coastal basin of 2800 ha in the Veneto Region (northern Italy) to test model performance and evaluate scenario analyses of land use change and climate change. Located partially below sea level, the reclaimed area is facing persistent land transformation and climate change trends, which alter not only the processes in the catchment but also the demands from it (Tarolli and Sofia, 2016). Firstly, results of the calibration (NSE = 0.77; year simulation, daily resolution) and validation (NSE = 0.53; idem) showed that the model is able to reproduce the dominant hydrological processes of this lowland area (e.g. discharge and groundwater fluxes). Land use scenarios between 1951 and 2060 were constructed using demographic models, supported by orthographic interpretation techniques. Climate scenarios were constructed from historical records and future projections by the COSMO-CLM regional climate model (Rockel et al., 2008) under the RCP4.5 pathway. WALRUS simulations showed that the land use changes result in a wetter catchment with more discharge, and the climatic changes cause more extremes with longer droughts and stronger rain events. These changes combined show drier summers (-33% rainfall, +27% soil moisture deficit) and wetter (+13% rainfall) and more intense (+30% rain intensity) autumns and winters in the future. The simulated discharge regime -particularly peak flow- follows these polarising trends, in good agreement with similar studies in the geographical zone (e.g. Vezzoli et al., 2015). This will increase the pressure on the fully-artificial drainage and agricultural systems

  9. Life Cycle Assessment Applied to Naphtha Catalytic Reforming

    OpenAIRE

    Portha J.-F.; Jaubert J.-N.; Louret S.; Pons M.-N.

    2010-01-01

    Facing the increase of environmental concerns in the oil and gas industry, engineers and scientists need information to assess sustainability of chemical processes. Among the different methods available, Life Cycle Assessment (LCA) is widely used. In this study, LCA is applied to a catalytic reforming process using the Eco- Indicator 99 as life cycle impact assessment method. The main identified environmental impacts are fossil fuels consumption, climate change and respiratory effects du...

  10. On Weighted Support Vector Regression

    DEFF Research Database (Denmark)

    Han, Xixuan; Clemmensen, Line Katrine Harder

    2014-01-01

    We propose a new type of weighted support vector regression (SVR), motivated by modeling local dependencies in time and space in prediction of house prices. The classic weights of the weighted SVR are added to the slack variables in the objective function (OF‐weights). This procedure directly...... shrinks the coefficient of each observation in the estimated functions; thus, it is widely used for minimizing influence of outliers. We propose to additionally add weights to the slack variables in the constraints (CF‐weights) and call the combination of weights the doubly weighted SVR. We illustrate...... the differences and similarities of the two types of weights by demonstrating the connection between the Least Absolute Shrinkage and Selection Operator (LASSO) and the SVR. We show that an SVR problem can be transformed to a LASSO problem plus a linear constraint and a box constraint. We demonstrate...
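
    A partial sketch only: scikit-learn's SVR exposes per-observation weights on the loss, which matches the classic OF-weights described above; the proposed CF-weights on the constraints have no off-the-shelf switch there and are not shown. Data and weighting scheme are invented.

    import numpy as np
    from sklearn.svm import SVR

    rng = np.random.default_rng(11)
    x = rng.uniform(0, 10, 200).reshape(-1, 1)
    y = np.sin(x).ravel() + rng.normal(0, 0.1, 200)
    recency = np.exp(-0.2 * (10 - x.ravel()))     # upweight "recent" observations

    svr = SVR(kernel="rbf", C=10.0).fit(x, y, sample_weight=recency)  # OF-weights
    print(svr.predict([[9.5]]))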

  11. Patterns of medicinal plant use: an examination of the Ecuadorian Shuar medicinal flora using contingency table and binomial analyses.

    Science.gov (United States)

    Bennett, Bradley C; Husby, Chad E

    2008-03-28

    Botanical pharmacopoeias are non-random subsets of floras, with some taxonomic groups over- or under-represented. Moerman [Moerman, D.E., 1979. Symbols and selectivity: a statistical analysis of Native American medical ethnobotany, Journal of Ethnopharmacology 1, 111-119] introduced linear regression/residual analysis to examine these patterns. However, regression, the commonly-employed analysis, suffers from several statistical flaws. We use contingency table and binomial analyses to examine patterns of Shuar medicinal plant use (from Amazonian Ecuador). We first analyzed the Shuar data using Moerman's approach, modified to better meet requirements of linear regression analysis. Second, we assessed the exact randomization contingency table test for goodness of fit. Third, we developed a binomial model to test for non-random selection of plants in individual families. Modified regression models (which accommodated assumptions of linear regression) reduced R² from 0.59 to 0.38, but did not eliminate all problems associated with regression analyses. Contingency table analyses revealed that the entire flora departs from the null model of equal proportions of medicinal plants in all families. In the binomial analysis, only 10 angiosperm families (of 115) differed significantly from the null model. These 10 families are largely responsible for patterns seen at higher taxonomic levels. Contingency table and binomial analyses offer an easy and statistically valid alternative to the regression approach.
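
    A minimal version of the per-family binomial test, with invented counts standing in for the Shuar flora: under the null, each family contributes medicinal species at the flora-wide rate.

    from scipy.stats import binomtest

    flora_total, medicinal_total = 3000, 450      # hypothetical flora-wide counts
    family_size, family_medicinal = 120, 40       # one hypothetical family

    p_null = medicinal_total / flora_total        # null: uniform selection rate
    result = binomtest(family_medicinal, n=family_size, p=p_null)
    print(result.pvalue)   # small p-value = family is over/under-represented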

  12. A Tutorial for Analysing the Cost-effectiveness of Alternative Methods for Assessing Chemical Toxicology: The Case of Acute Oral Toxicity Prediction

    NARCIS (Netherlands)

    Norlen, H.; Worth, A.P.; Gabbert, S.G.M.

    2014-01-01

    Compared with traditional animal methods for toxicity testing, in vitro and in silico methods are widely considered to permit a more cost-effective assessment of chemicals. However, how to assess the cost-effectiveness of alternative methods has remained unclear. This paper offers a user-oriented

  13. Estimating Loess Plateau Average Annual Precipitation with Multiple Linear Regression Kriging and Geographically Weighted Regression Kriging

    Directory of Open Access Journals (Sweden)

    Qiutong Jin

    2016-06-01

    Full Text Available Estimating the spatial distribution of precipitation is an important and challenging task in hydrology, climatology, ecology, and environmental science. In order to generate a highly accurate distribution map of average annual precipitation for the Loess Plateau in China, multiple linear regression Kriging (MLRK and geographically weighted regression Kriging (GWRK methods were employed using precipitation data from the period 1980–2010 from 435 meteorological stations. The predictors in regression Kriging were selected by stepwise regression analysis from many auxiliary environmental factors, such as elevation (DEM, normalized difference vegetation index (NDVI, solar radiation, slope, and aspect. All predictor distribution maps had a 500 m spatial resolution. Validation precipitation data from 130 hydrometeorological stations were used to assess the prediction accuracies of the MLRK and GWRK approaches. Results showed that both prediction maps with a 500 m spatial resolution interpolated by MLRK and GWRK had a high accuracy and captured detailed spatial distribution data; however, MLRK produced a lower prediction error and a higher variance explanation than GWRK, although the differences were small, in contrast to conclusions from similar studies.
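
    A sketch of the regression-kriging idea under stated assumptions: a linear trend on one covariate (elevation) plus spatial interpolation of its residuals, with a scikit-learn GP standing in for ordinary kriging. Stations, covariate, and kernel are all invented.

    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel

    rng = np.random.default_rng(12)
    n = 300
    coords = rng.uniform(0, 100, size=(n, 2))                  # station locations
    elev = rng.uniform(200, 3000, n)                           # covariate: DEM
    precip = 800 - 0.1 * elev + 30 * np.sin(coords[:, 0] / 15) + rng.normal(0, 10, n)

    trend = LinearRegression().fit(elev.reshape(-1, 1), precip)
    resid = precip - trend.predict(elev.reshape(-1, 1))        # de-trended field
    krig = GaussianProcessRegressor(kernel=RBF(20.0) + WhiteKernel(1.0)).fit(coords, resid)

    new_xy, new_elev = np.array([[50.0, 50.0]]), np.array([[1500.0]])
    print(trend.predict(new_elev) + krig.predict(new_xy))      # RK-style prediction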

  14. Seismic fragility analyses

    International Nuclear Information System (INIS)

    Kostov, Marin

    2000-01-01

    In the last two decades there has been an increasing number of probabilistic seismic risk assessments performed. The basic ideas of the procedure for performing a Probabilistic Safety Analysis (PSA) of critical structures (NUREG/CR-2300, 1983) could be used also for normal industrial and residential buildings, dams or other structures. The general formulation of the risk assessment procedure applied in this investigation is presented in Franzini et al., 1984. The probability of failure of a structure for an expected lifetime (for example 50 years) can be obtained from the annual frequency of failure, β_E, determined by the relation β_E = ∫ [dβ(x)/dx] P(f|x) dx, where β(x) is the annual frequency of exceedance of load level x (for example, the variable x may be peak ground acceleration) and P(f|x) is the conditional probability of structure failure at a given seismic load level x. The problem leads to the assessment of the seismic hazard β(x) and the fragility P(f|x). The seismic hazard curves are obtained by the probabilistic seismic hazard analysis. The fragility curves are obtained after the response of the structure is defined as probabilistic and its capacity and the associated uncertainties are assessed. Finally the fragility curves are combined with the seismic loading to estimate the frequency of failure for each critical scenario. The frequency of failure due to a seismic event is represented by the scenario with the highest frequency. The tools usually applied for probabilistic safety analyses of critical structures could relatively easily be adapted to ordinary structures. The key problems are the seismic hazard definitions and the fragility analyses. The fragility could be derived either based on scaling procedures or on the basis of generation. Both approaches have been presented in the paper. After the seismic risk (in terms of failure probability) is assessed there are several approaches for risk reduction. Generally the methods could be classified into two groups. The
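
    A numerical sketch of the failure-frequency integral above, with an invented hazard curve β(x) and a lognormal fragility P(f|x); the functional forms and parameters are assumptions for illustration.

    import numpy as np
    from scipy.stats import lognorm
    from scipy.integrate import trapezoid

    x = np.linspace(0.01, 2.0, 500)                  # PGA grid (g)
    beta = 1e-2 * np.exp(-3.0 * x)                   # hazard: annual exceedance of x
    fragility = lognorm(s=0.4, scale=0.6).cdf(x)     # P(failure | PGA = x)

    dbeta_dx = np.gradient(beta, x)                  # derivative of the hazard curve
    beta_E = trapezoid(-dbeta_dx * fragility, x)     # annual frequency of failure
    print(beta_E)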

  15. Comparative physical-chemical characterization of encapsulated lipid-based isotretinoin products assessed by particle size distribution and thermal behavior analyses

    Energy Technology Data Exchange (ETDEWEB)

    Guimaraes, Carla Aiolfi, E-mail: carlaaiolfi@usp.br [Department of Pharmacy, Faculty of Pharmaceutical Sciences, University of Sao Paulo, Sao Paulo, SP 05508-000 (Brazil); Menaa, Farid [Department of Dermatology, School of Medicine Wuerzburg, Wuerzburg 97080 (Germany); Fluorotronics, Inc., 1425 Russ Bvld, San Diego Technology Incubator, San Diego, CA 92101 (United States); Menaa, Bouzid, E-mail: bouzid.menaa@gmail.com [Fluorotronics, Inc., 1425 Russ Bvld, San Diego Technology Incubator, San Diego, CA 92101 (United States); Quenca-Guillen, Joyce S. [Department of Pharmacy, Faculty of Pharmaceutical Sciences, University of Sao Paulo, Sao Paulo, SP 05508-000 (Brazil); Matos, Jivaldo do Rosario [Department of Fundamental Chemistry, Institute of Chemistry, University of Sao Paulo, Sao Paulo, SP 05508-000 (Brazil); Mercuri, Lucildes Pita [Department of Exact and Earth Sciences, Federal University of Sao Paulo, Diadema, SP 09972-270 (Brazil); Braz, Andre Borges [Department of Engineering of Mines and Oil, Polytechnical School, University of Sao Paulo, SP 05508-900 (Brazil); Rossetti, Fabia Cristina [Department of Pharmaceutical Sciences, Faculty of Pharmaceutical Sciences of Ribeirao Preto, University of Sao Paulo, Ribeirao Preto, SP 14015-120 (Brazil); Kedor-Hackmann, Erika Rosa Maria; Santoro, Maria Ines Rocha Miritello [Department of Pharmacy, Faculty of Pharmaceutical Sciences, University of Sao Paulo, Sao Paulo, SP 05508-000 (Brazil)

    2010-06-10

    Isotretinoin is the drug of choice for the management of severe recalcitrant nodular acne. Nevertheless, some of its physical-chemical properties are still poorly known. Hence, the aim of our study was to comparatively evaluate the particle size distribution (PSD) and characterize the thermal behavior of the three encapsulated isotretinoin products in oil suspension (one reference and two generics) commercialized in Brazil. Here, we show that the PSD, estimated by laser diffraction and by polarized light microscopy, differed between the generics and the reference product. However, the thermal behavior of the three products, determined by thermogravimetry (TGA), differential thermal analysis (DTA) and differential scanning calorimetry (DSC), displayed no significant changes, and all were more thermostable than the isotretinoin standard used as internal control. Thus, our study suggests that PSD analyses of isotretinoin lipid-based formulations should be routinely performed in order to improve their quality and bioavailability.

  16. Credit Scoring Problem Based on Regression Analysis

    OpenAIRE

    Khassawneh, Bashar Suhil Jad Allah

    2014-01-01

    ABSTRACT: This thesis provides an explanatory introduction to the regression models of data mining and contains basic definitions of key terms in the linear, multiple and logistic regression models. The aim of this study is to illustrate fitting models for the credit scoring problem using simple linear, multiple linear and logistic regression models, and to analyze the fitted model functions with statistical tools. Keywords: Data mining, linear regression, logistic regression....

  17. Variable selection and model choice in geoadditive regression models.

    Science.gov (United States)

    Kneib, Thomas; Hothorn, Torsten; Tutz, Gerhard

    2009-06-01

    Model choice and variable selection are issues of major concern in practical regression analyses, arising in many biometric applications such as habitat suitability analyses, where the aim is to identify the influence of potentially many environmental conditions on certain species. We describe regression models for breeding bird communities that facilitate both model choice and variable selection, by a boosting algorithm that works within a class of geoadditive regression models comprising spatial effects, nonparametric effects of continuous covariates, interaction surfaces, and varying coefficients. The major modeling components are penalized splines and their bivariate tensor product extensions. All smooth model terms are represented as the sum of a parametric component and a smooth component with one degree of freedom to obtain a fair comparison between the model terms. A generic representation of the geoadditive model allows us to devise a general boosting algorithm that automatically performs model choice and variable selection.

  18. An Original Stepwise Multilevel Logistic Regression Analysis of Discriminatory Accuracy

    DEFF Research Database (Denmark)

    Merlo, Juan; Wagner, Philippe; Ghith, Nermin

    2016-01-01

    BACKGROUND AND AIM: Many multilevel logistic regression analyses of "neighbourhood and health" focus on interpreting measures of associations (e.g., odds ratio, OR). In contrast, multilevel analysis of variance is rarely considered. We propose an original stepwise analytical approach that disting...

  19. Interpreting Multiple Linear Regression: A Guidebook of Variable Importance

    Science.gov (United States)

    Nathans, Laura L.; Oswald, Frederick L.; Nimon, Kim

    2012-01-01

    Multiple regression (MR) analyses are commonly employed in social science fields. It is also common for interpretation of results to typically reflect overreliance on beta weights, often resulting in very limited interpretations of variable importance. It appears that few researchers employ other methods to obtain a fuller understanding of what…

  20. An initial exploration for comprehensive assessment of IgG4-related lung disease: analyses on the cases enrolled from a systematic review.

    Science.gov (United States)

    Wang, An; Fan, Jie; Chen, Xiaofeng; Wang, Shaohua

    2018-03-01

    The existence of two diagnostic systems, the Boston and Japan criteria, for immunoglobulin G4-related disease (IgG4-RD) confuses medical practice. We aimed to develop a comprehensive assessment based on the weight of each diagnostic item in the existing criteria to improve the diagnostic efficiency of the Boston criteria. We assessed the patients enrolled by a systematic review of the literature using the Boston criteria, the Japan criteria and a tentative comprehensive assessment, respectively, and evaluated the efficiency of each system and their consistency. Our analysis showed that the discriminating pathological diagnostic items were similar for the Boston criteria (IgG4+/IgG+ ratio) and the comprehensive assessment (IgG4+/IgG+ ratio and the number of pathological features). The current two diagnostic systems have poor consistency. The comprehensive assessment has good agreement with the Boston criteria, but can identify those cases in Boston Category 3 who could still be diagnosed as IgG4-related lung disease. Considering the weight of diagnostic items, the scoring system is a tentative exploration that should be improved with further experience in diagnosing IgG4-related lung disease.

  1. Regularized Label Relaxation Linear Regression.

    Science.gov (United States)

    Fang, Xiaozhao; Xu, Yong; Li, Xuelong; Lai, Zhihui; Wong, Wai Keung; Fang, Bingwu

    2018-04-01

    Linear regression (LR) and some of its variants have been widely used for classification problems. Most of these methods assume that during the learning phase the training samples can be exactly transformed into a strict binary label matrix, which has too little freedom to fit the labels adequately. To address this problem, in this paper, we propose a novel regularized label relaxation LR method, which has the following notable characteristics. First, the proposed method relaxes the strict binary label matrix into a slack variable matrix by introducing a nonnegative label relaxation matrix into LR, which provides more freedom to fit the labels and simultaneously enlarges the margins between different classes as much as possible. Second, the proposed method constructs a class compactness graph based on manifold learning and uses it as the regularization term to avoid overfitting. The class compactness graph is used to ensure that samples sharing the same labels can be kept close after they are transformed. Two different algorithms, based respectively on the ℓ2-norm and ℓ2,1-norm loss functions, are devised. These two algorithms have compact closed-form solutions in each iteration, so they are easily implemented. Extensive experiments show that these two algorithms outperform the state-of-the-art algorithms in terms of classification accuracy and running time.
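
    A bare-bones sketch of the label-relaxation step (a simplified reconstruction of the idea; it omits the class compactness graph and is not the authors' algorithm): binary targets are mapped to +/-1, and a nonnegative matrix M lets each target drift past its margin in the direction given by the sign matrix B, alternating with a ridge solve for the projection W:

        import numpy as np

        def label_relaxation_lr(X, Y01, lam=0.1, n_iter=30):
            Y = 2.0 * Y01 - 1.0            # one-hot labels -> +/-1 targets
            B = np.sign(Y)                 # allowed drag direction per entry
            M = np.zeros_like(Y)           # nonnegative relaxation matrix
            I = np.eye(X.shape[1])
            for _ in range(n_iter):
                T = Y + B * M              # relaxed (slack) target matrix
                W = np.linalg.solve(X.T @ X + lam * I, X.T @ T)  # ridge step
                M = np.maximum(0.0, B * (X @ W - Y))             # M step
            return W

        rng = np.random.default_rng(9)
        X = rng.normal(size=(60, 4))
        labels = (X[:, 0] > 0).astype(int)
        W = label_relaxation_lr(X, np.eye(2)[labels])
        print(((X @ W).argmax(axis=1) == labels).mean())   # training accuracy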

  2. Estimating the exceedance probability of rain rate by logistic regression

    Science.gov (United States)

    Chiu, Long S.; Kedem, Benjamin

    1990-01-01

    Recent studies have shown that the fraction of an area with rain intensity above a fixed threshold is highly correlated with the area-averaged rain rate. To estimate the fractional rainy area, a logistic regression model, which estimates the conditional probability that rain rate over an area exceeds a fixed threshold given the values of related covariates, is developed. The problem of dependency in the data in the estimation procedure is bypassed by the method of partial likelihood. Analyses of simulated scanning multichannel microwave radiometer and observed electrically scanning microwave radiometer data during the Global Atlantic Tropical Experiment period show that the use of logistic regression in pixel classification is superior to multiple regression in predicting whether rain rate at each pixel exceeds a given threshold, even in the presence of noisy data. The potential of the logistic regression technique in satellite rain rate estimation is discussed.
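
    A minimal sketch of the core idea: a logistic model for the conditional probability that pixel rain rate exceeds a fixed threshold given covariates (the synthetic covariates and the 5 mm/h threshold below are placeholders, not the radiometer variables used in the study):

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(2)
        area_avg_rain = rng.gamma(2.0, 2.0, size=500)          # mm/h, synthetic
        cloud_frac = np.clip(0.1 * area_avg_rain + rng.normal(0, 0.1, 500), 0, 1)
        X = np.column_stack([area_avg_rain, cloud_frac])

        threshold = 5.0                                        # mm/h, arbitrary
        pixel_rain = area_avg_rain * rng.lognormal(0.0, 0.5, size=500)
        exceeds = (pixel_rain > threshold).astype(int)

        clf = LogisticRegression().fit(X, exceeds)
        print("P(rain rate > threshold):", clf.predict_proba(X[:3])[:, 1])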

  3. Linear and logistic regression analysis

    NARCIS (Netherlands)

    Tripepi, G.; Jager, K. J.; Dekker, F. W.; Zoccali, C.

    2008-01-01

    In previous articles of this series, we focused on relative risks and odds ratios as measures of effect to assess the relationship between exposure to risk factors and clinical outcomes and on control for confounding. In randomized clinical trials, the random allocation of patients is hoped to…

  4. Use of probabilistic weights to enhance linear regression myoelectric control.

    Science.gov (United States)

    Smith, Lauren H; Kuiken, Todd A; Hargrove, Levi J

    2015-12-01

    Clinically available prostheses for transradial amputees do not allow simultaneous myoelectric control of degrees of freedom (DOFs). Linear regression methods can provide simultaneous myoelectric control, but frequently also result in difficulty with isolating individual DOFs when desired. This study evaluated the potential of using probabilistic estimates of categories of gross prosthesis movement, which are commonly used in classification-based myoelectric control, to enhance linear regression myoelectric control. Gaussian models were fit to electromyogram (EMG) feature distributions for three movement classes at each DOF (no movement, or movement in either direction) and used to weight the output of linear regression models by the probability that the user intended the movement. Eight able-bodied and two transradial amputee subjects worked in a virtual Fitts' law task to evaluate differences in controllability between linear regression and probability-weighted regression for an intramuscular EMG-based three-DOF wrist and hand system. Real-time and offline analyses in able-bodied subjects demonstrated that probability weighting improved performance during single-DOF tasks (p < 0.05) relative to standard linear regression control. Use of probability weights can improve the ability to isolate individual DOFs during linear regression myoelectric control, while maintaining the ability to simultaneously control multiple DOFs.
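
    A rough sketch of the weighting idea for a single DOF (synthetic 2-D features; Gaussian models for the classes rest/flexion/extension gate the regression output; this is a simplified reading of the approach, not the authors' implementation):

        import numpy as np
        from scipy.stats import multivariate_normal

        rng = np.random.default_rng(3)
        means = {"rest": [0, 0], "flex": [2, 0], "ext": [0, 2]}
        train = {k: rng.normal(m, 0.5, size=(100, 2)) for k, m in means.items()}

        # One Gaussian model per movement class, fit to EMG features.
        models = {k: multivariate_normal(v.mean(axis=0), np.cov(v.T))
                  for k, v in train.items()}

        w = np.array([0.5, -0.5])          # toy linear regression coefficients

        def weighted_output(x):
            # Weight the regression output by the probability that the
            # user intends movement (anything other than rest).
            like = {k: m.pdf(x) for k, m in models.items()}
            p_move = (like["flex"] + like["ext"]) / sum(like.values())
            return p_move * (w @ x)

        print(weighted_output(np.array([0.1, 0.0])))   # near rest: suppressed
        print(weighted_output(np.array([2.0, 0.1])))   # clear flexion: passed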

  5. Independent contrasts and PGLS regression estimators are equivalent.

    Science.gov (United States)

    Blomberg, Simon P; Lefevre, James G; Wells, Jessie A; Waterhouse, Mary

    2012-05-01

    We prove that the slope parameter of the ordinary least squares regression of phylogenetically independent contrasts (PICs) conducted through the origin is identical to the slope parameter of the method of generalized least squares (GLS) regression under a Brownian motion model of evolution. This equivalence has several implications: 1. Understanding the structure of the linear model for GLS regression provides insight into when and why phylogeny is important in comparative studies. 2. The limitations of the PIC regression analysis are the same as the limitations of the GLS model. In particular, phylogenetic covariance applies only to the response variable in the regression, and the explanatory variable should be regarded as fixed. Calculation of PICs for explanatory variables should be treated as a mathematical idiosyncrasy of the PIC regression algorithm. 3. Since the GLS estimator is the best linear unbiased estimator (BLUE), the slope parameter estimated using PICs is also BLUE. 4. If the slope is estimated using different branch lengths for the explanatory and response variables in the PIC algorithm, the estimator is no longer the BLUE, so this is not recommended. Finally, we discuss whether or not and how to accommodate phylogenetic covariance in regression analyses, particularly in relation to the problem of phylogenetic uncertainty. This discussion is from both frequentist and Bayesian perspectives.
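
    The GLS side of the stated equivalence can be written compactly. With V the among-species covariance matrix implied by Brownian motion on the phylogeny (entries proportional to shared branch lengths), the GLS slope is

        \hat{\beta} = \left( X^{\top} V^{-1} X \right)^{-1} X^{\top} V^{-1} y

    and the theorem asserts that the through-the-origin OLS slope computed from the independent contrasts equals this estimator.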

  6. Case study Sylt - Consequences and integrated assessment of climate change. Final report; Klimaaenderung und Kueste. Fallstudie Sylt - Integrative Analyse und Bewertung der Folgen von Klimaaenderungen. Endbericht

    Energy Technology Data Exchange (ETDEWEB)

    Fraenzle, O.; Sterr, H.; Daschkeit, A.

    2001-05-01

    This final report deals with the structure of the 'case study Sylt' against the background of climate change and possible consequences. In cooperation with the other projects of the case study, an instrument is developed which supports interdisciplinary communication and cooperation. First the 'System Sylt' is described to identify and specify the relevant aspects of functional relationships between the natural and the social system. The focal points are (1) the first-order impacts of climate change, (2) the potential ecological changes in the near future and (3) the image of the North Sea island Sylt. With regard to the image of Sylt, we find discrepancies between a static and a dynamic view; these discrepancies are inherent parts of the future development. All results are seen in the context of 'Integrated Coastal Zone Management' (ICZM) to derive general and specific recommendations for political action and further research. (orig.) [Translated from the German] Against the background of assumptions about future climatic development, the conception and procedure of the integrative analysis within the Sylt case study are presented. Linked to the Sylt GIS, an instrument is developed and tested that supports the (hitherto rare) interdisciplinary analysis of climate impacts. This first leads to a description of the 'System Sylt' based on the most important processes and boundary conditions. The core areas of the 'System Sylt' identified on this basis are then subjected to a deeper, exemplary analysis drawing on the knowledge developed in the disciplinary subprojects of the case study. This addresses, first, the consequences of a possible climate change; second, past and potentially future ecological changes; and third, the image of Sylt. A more detailed analysis of the Sylt image shows that a…

  7. Assessment of anti-inflammatory and anti-arthritic properties of Acmella uliginosa (Sw. Cass. based on experiments in arthritic rat models and qualitative GC/MS analyses.

    Directory of Open Access Journals (Sweden)

    Subhashis Paul

    2016-09-01

    … of AU and AV showed the best recovery potential in all the studied parameters, confirming the synergistic efficacy of the herbal formulation. GC/MS analyses revealed the presence of at least 5 anti-inflammatory compounds, including 9-octadecenoic acid (Z)-, phenylmethyl ester, astaxanthin, α-N-normethadol and fenretinide, which have reported anti-inflammatory/anti-arthritic properties. Conclusion: Our findings indicated that the crude flower homogenate of AU contains potential anti-inflammatory compounds which could be used as an anti-inflammatory/anti-arthritic medication. [J Complement Med Res 2016; 5(3): 257-262]

  9. Predictors of postoperative outcomes of cubital tunnel syndrome treatments using multiple logistic regression analysis.

    Science.gov (United States)

    Suzuki, Taku; Iwamoto, Takuji; Shizu, Kanae; Suzuki, Katsuji; Yamada, Harumoto; Sato, Kazuki

    2017-05-01

    This retrospective study was designed to investigate prognostic factors for postoperative outcomes of cubital tunnel syndrome (CubTS) using multiple logistic regression analysis with a large number of patients. Eighty-three patients with CubTS who underwent surgery were enrolled. The following potential prognostic factors for disease severity were selected according to previous reports: sex, age, type of surgery, disease duration, body mass index, cervical lesion, presence of diabetes mellitus, Workers' Compensation status, preoperative severity, and preoperative electrodiagnostic testing. Postoperative severity of disease was assessed 2 years after surgery by Messina's criteria, an outcome measure specific to CubTS. Bivariate analysis was performed to select candidate prognostic factors for the multiple logistic regression analysis. Multiple logistic regression analysis was then conducted to identify the association between postoperative severity and the selected prognostic factors. Both the bivariate and the multiple logistic regression analysis revealed only preoperative severity as an independent risk factor for poor prognosis, while the other factors did not show any significant association. Although conflicting results exist regarding the prognosis of CubTS, this study supports evidence from previous studies and concludes that early surgical intervention portends the most favorable prognosis. Copyright © 2017 The Japanese Orthopaedic Association. Published by Elsevier B.V. All rights reserved.
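
    A schematic of the two-stage analysis (bivariate screening followed by multivariable logistic regression) using statsmodels; the synthetic data, column names, and the 0.10 screening threshold are hypothetical, not taken from the study:

        import numpy as np
        import pandas as pd
        import statsmodels.api as sm

        rng = np.random.default_rng(4)
        df = pd.DataFrame({
            "age": rng.normal(55, 12, 83),
            "duration_months": rng.exponential(18, 83),
            "preop_severity": rng.integers(1, 4, 83).astype(float),
        })
        lin = 0.9 * df["preop_severity"] - 2.0
        df["poor_outcome"] = (rng.random(83) < 1 / (1 + np.exp(-lin))).astype(int)

        # Stage 1: bivariate screening; keep candidates with p < 0.10.
        candidates = []
        for col in ["age", "duration_months", "preop_severity"]:
            m = sm.Logit(df["poor_outcome"], sm.add_constant(df[col])).fit(disp=0)
            if m.pvalues[col] < 0.10:
                candidates.append(col)

        # Stage 2: multivariable logistic regression on screened candidates.
        final = sm.Logit(df["poor_outcome"],
                         sm.add_constant(df[candidates])).fit(disp=0)
        print(np.exp(final.params))        # odds ratios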

  10. Assessment of the RELAP4/MOD6 thermal-hydraulic transient code for PWR experimental applications. Addendum. Analyses completed and reported in FY 1979. Interim report

    International Nuclear Information System (INIS)

    1979-09-01

    The results of three subtasks that complete the assessment of the RELAP4/MOD6 computer code are reported. These subtasks constitute the remainder of a broadly scoped assessment matrix defined and described in detail in a previously published document. The specific subtasks provide comparisons of code calculations with experimental results from core blowdown and critical-flow separate-effects experiments and from an integral systems-effects loss-of-coolant experiment. The basic emphasis of the comparisons is in the presentation of the study results in error form suitable for statistical analysis

  11. Uncertainty Analyses and Strategy

    International Nuclear Information System (INIS)

    Kevin Coppersmith

    2001-01-01

    The DOE identified a variety of uncertainties, arising from different sources, during its assessment of the performance of a potential geologic repository at the Yucca Mountain site. In general, the number and detail of process models developed for the Yucca Mountain site, and the complex coupling among those models, make the direct incorporation of all uncertainties difficult. The DOE has addressed these issues in a number of ways using an approach to uncertainties that is focused on producing a defensible evaluation of the performance of a potential repository. The treatment of uncertainties oriented toward defensible assessments has led to analyses and models with so-called 'conservative' assumptions and parameter bounds, where conservative implies lower performance than might be demonstrated with a more realistic representation. The varying maturity of the analyses and models, and the uneven level of data availability, result in total-system-level analyses with a mix of realistic and conservative estimates (for both probabilistic representations and single values). That is, some inputs have realistically represented uncertainties, and others are conservatively estimated or bounded. However, this approach is consistent with the 'reasonable assurance' approach to compliance demonstration, which was called for in the U.S. Nuclear Regulatory Commission's (NRC) proposed 10 CFR Part 63 regulation (64 FR 8640 [DIRS 101680]). A risk analysis that includes conservatism in the inputs will result in conservative risk estimates. Therefore, the approach taken for the Total System Performance Assessment for the Site Recommendation (TSPA-SR) provides a reasonable representation of processes and conservatism for purposes of site recommendation. However, mixing unknown degrees of conservatism in models and parameter representations reduces the transparency of the analysis and makes the development of coherent and consistent probability statements about projected repository performance difficult.

  12. Mixed Frequency Data Sampling Regression Models: The R Package midasr

    Directory of Open Access Journals (Sweden)

    Eric Ghysels

    2016-08-01

    Full Text Available When modeling economic relationships it is increasingly common to encounter data sampled at different frequencies. We introduce the R package midasr which enables estimating regression models with variables sampled at different frequencies within a MIDAS regression framework put forward in work by Ghysels, Santa-Clara, and Valkanov (2002). In this article we define a general autoregressive MIDAS regression model with multiple variables of different frequencies and show how it can be specified using the familiar R formula interface and estimated using various optimization methods chosen by the researcher. We discuss how to check the validity of the estimated model both in terms of numerical convergence and statistical adequacy of a chosen regression specification, how to perform model selection based on an information criterion, how to assess the forecasting accuracy of the MIDAS regression model, and how to obtain a forecast aggregation of different MIDAS regression models. We illustrate the capabilities of the package with a simulated MIDAS regression model and give two empirical examples of the application of MIDAS regression.
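
    The package itself is in R, but the core of a MIDAS regression is short enough to sketch in Python: the high-frequency lags enter through a parsimonious weight function (here the two-parameter exponential Almon polynomial, one common choice) and the parameters are estimated by nonlinear least squares. All names and settings below are illustrative:

        import numpy as np
        from scipy.optimize import least_squares

        def exp_almon(theta, n_lags):
            # w_j proportional to exp(theta1*j + theta2*j^2), normalized to 1.
            j = np.arange(1, n_lags + 1)
            w = np.exp(theta[0] * j + theta[1] * j ** 2)
            return w / w.sum()

        def residuals(params, y, X_hf):
            # y_t = b0 + b1 * sum_j w_j(theta) * x_{t,j} + e_t
            b0, b1, t1, t2 = params
            w = exp_almon([t1, t2], X_hf.shape[1])
            return y - (b0 + b1 * (X_hf @ w))

        rng = np.random.default_rng(5)
        n, n_lags = 120, 12                   # e.g. 12 monthly lags, quarterly y
        X_hf = rng.normal(size=(n, n_lags))
        y = 1.0 + 2.0 * (X_hf @ exp_almon([0.1, -0.05], n_lags)) \
            + rng.normal(scale=0.2, size=n)

        fit = least_squares(residuals, x0=[0, 1, 0, 0], args=(y, X_hf))
        print("b0, b1, theta1, theta2:", np.round(fit.x, 3))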

  13. Semisupervised Clustering by Iterative Partition and Regression with Neuroscience Applications

    Directory of Open Access Journals (Sweden)

    Guoqi Qian

    2016-01-01

    Full Text Available Regression clustering is a mixture of unsupervised and supervised statistical learning; it is a data mining method found in a wide range of applications including artificial intelligence and neuroscience. It performs unsupervised learning when it clusters the data according to their respective unobserved regression hyperplanes, and supervised learning when it fits regression hyperplanes to the corresponding data clusters. Applying regression clustering in practice requires means of determining the underlying number of clusters in the data, finding the cluster label of each data point, and estimating the regression coefficients of the model. In this paper, we review the estimation and selection issues in regression clustering with regard to least squares and robust statistical methods. We also provide a model-selection-based technique to determine the number of regression clusters underlying the data, and we further develop a computing procedure for regression clustering estimation and selection. Finally, simulation studies are presented for assessing the procedure, together with an analysis of a real data set on RGB cell marking in neuroscience to illustrate and interpret the method.
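
    A minimal numpy sketch of the iterative partition-and-regression loop (the least squares version with a fixed number of clusters; initialization strategies and the model selection step are omitted):

        import numpy as np

        def regression_cluster(X, y, k=2, n_iter=50, seed=0):
            # Alternate between assigning each point to its best-fitting
            # regression hyperplane and refitting each hyperplane by OLS.
            rng = np.random.default_rng(seed)
            Xd = np.column_stack([np.ones(len(y)), X])   # add intercept
            labels = rng.integers(k, size=len(y))        # random initial partition
            for _ in range(n_iter):
                betas = [np.linalg.lstsq(Xd[labels == c], y[labels == c],
                                         rcond=None)[0] for c in range(k)]
                sq_err = np.column_stack([(y - Xd @ b) ** 2 for b in betas])
                new = sq_err.argmin(axis=1)              # reassign points
                if np.array_equal(new, labels):
                    break
                labels = new
            return labels, betas

        rng = np.random.default_rng(6)
        X = rng.uniform(-1, 1, size=(200, 1))
        y = np.where(rng.random(200) < 0.5, 3 * X[:, 0], -3 * X[:, 0]) \
            + rng.normal(scale=0.1, size=200)
        labels, betas = regression_cluster(X, y)
        print([np.round(b, 2) for b in betas])           # slopes near +3 and -3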

  14. Mapping geogenic radon potential by regression kriging

    Energy Technology Data Exchange (ETDEWEB)

    Pásztor, László [Institute for Soil Sciences and Agricultural Chemistry, Centre for Agricultural Research, Hungarian Academy of Sciences, Department of Environmental Informatics, Herman Ottó út 15, 1022 Budapest (Hungary); Szabó, Katalin Zsuzsanna, E-mail: sz_k_zs@yahoo.de [Department of Chemistry, Institute of Environmental Science, Szent István University, Páter Károly u. 1, Gödöllő 2100 (Hungary); Szatmári, Gábor; Laborczi, Annamária [Institute for Soil Sciences and Agricultural Chemistry, Centre for Agricultural Research, Hungarian Academy of Sciences, Department of Environmental Informatics, Herman Ottó út 15, 1022 Budapest (Hungary); Horváth, Ákos [Department of Atomic Physics, Eötvös University, Pázmány Péter sétány 1/A, 1117 Budapest (Hungary)

    2016-02-15

    Highlights: • A new method, regression kriging (RK), was tested for GRP mapping. • Usage of spatially exhaustive, auxiliary data on soil, geology, topography, land use and climate. • Inherent accuracy assessment (both global and local). • Interval estimation for the spatial extension of the areas of GRP risk categories. • Significance of fluvial sedimentary rock, pyroclast and land use properties on radon risk.

  15. Mapping geogenic radon potential by regression kriging

    International Nuclear Information System (INIS)

    Pásztor, László; Szabó, Katalin Zsuzsanna; Szatmári, Gábor; Laborczi, Annamária; Horváth, Ákos

    2016-01-01

    Radon (²²²Rn) gas is produced in the radioactive decay chain of uranium (²³⁸U), an element that is naturally present in soils. Radon is transported mainly by diffusion and convection mechanisms through the soil, depending mainly on the physical and meteorological parameters of the soil, and can enter and accumulate in buildings. Health risks originating from indoor radon concentration can be attributed to natural factors and are characterized by the geogenic radon potential (GRP). Identification of areas with high health risks requires spatial modeling, that is, mapping of radon risk. In addition to geology and meteorology, physical soil properties play a significant role in the determination of GRP. In order to compile a reliable GRP map for a model area in Central Hungary, spatial auxiliary information representing GRP-forming environmental factors was taken into account to support the spatial inference of the locally measured GRP values. Since the number of measured sites was limited, efficient spatial prediction methodologies were sought to construct a reliable map for a larger area. Regression kriging (RK) was applied for the interpolation using spatially exhaustive auxiliary data on soil, geology, topography, land use and climate. RK divides the spatial inference into two parts. Firstly, the deterministic component of the target variable is determined by a regression model. The residuals of the multiple linear regression analysis represent the spatially varying but dependent stochastic component, which is interpolated by kriging. The final map is the sum of the two component predictions. Overall accuracy of the map was tested by leave-one-out cross-validation. Furthermore, the spatial reliability of the resultant map is also estimated by the calculation of the 90% prediction interval of the local prediction values. The applicability of the applied method as well as that of the map is discussed briefly. - Highlights: • A new method, regression kriging (RK), was tested for GRP mapping. • Usage of spatially exhaustive, auxiliary data on soil, geology, topography, land use and climate. • Inherent accuracy assessment (both global and local). • Interval estimation for the spatial extension of the areas of GRP risk categories. • Significance of fluvial sedimentary rock, pyroclast and land use properties on radon risk.
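
    A compact numpy sketch of the two RK components: a linear regression on auxiliary covariates plus simple kriging of its residuals with an exponential covariance model (the variogram parameters below are assumed rather than fitted, and all data are synthetic):

        import numpy as np

        rng = np.random.default_rng(7)
        n = 150
        coords = rng.uniform(0, 100, size=(n, 2))   # measurement locations (km)
        aux = rng.normal(size=(n, 3))               # soil/geology covariates
        grp = aux @ np.array([1.0, 0.5, -0.8]) + rng.normal(scale=0.5, size=n)

        # 1) Deterministic component: multiple linear regression on auxiliaries.
        A = np.column_stack([np.ones(n), aux])
        beta, *_ = np.linalg.lstsq(A, grp, rcond=None)
        resid = grp - A @ beta

        # 2) Stochastic component: simple kriging of the regression residuals.
        sill, rng_par = resid.var(), 30.0           # assumed variogram parameters
        cov = lambda h: sill * np.exp(-h / rng_par) # exponential covariance
        D = np.linalg.norm(coords[:, None] - coords[None, :], axis=2)
        C = cov(D) + 1e-8 * np.eye(n)               # small nugget for stability

        def rk_predict(x0, aux0):
            lam = np.linalg.solve(C, cov(np.linalg.norm(coords - x0, axis=1)))
            return np.r_[1.0, aux0] @ beta + lam @ resid   # regression + kriging

        print(rk_predict(np.array([50.0, 50.0]), np.zeros(3)))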

  16. Secondary structure analyses of the nuclear rRNA internal transcribed spacers and assessment of its phylogenetic utility across the Brassicaceae (mustards.

    Directory of Open Access Journals (Sweden)

    Patrick P Edger

    Full Text Available The internal transcribed spacers of the nuclear ribosomal RNA gene cluster, termed ITS1 and ITS2, are the most frequently used nuclear markers for phylogenetic analyses across many eukaryotic groups including most plant families. The reasons for the popularity of these markers include: 1. ease of amplification due to the high copy number of the gene clusters; 2. available cost-effective methods and highly conserved primers; 3. rapidly evolving markers (i.e., variable between closely related species); and 4. the assumption (and/or treatment) that these sequences are non-functional, neutrally evolving phylogenetic markers. Here, our analyses of ITS1 and ITS2 for 50 species suggest that both sequences are instead under selective constraints to preserve proper secondary structure, likely to maintain complete self-splicing functions, and thus are not neutrally evolving phylogenetic markers. Our results indicate that the majority of sequence sites are co-evolving with other positions to form proper secondary structure, which has implications for phylogenetic inference. We also found that the lowest energy state and the total number of possible alternate secondary structures are highly significantly different between ITS regions and random sequences with an identical overall length and Guanine-Cytosine (GC) content. Lastly, we review recent evidence highlighting some additional problematic issues with using these regions as the sole markers for phylogenetic studies, and thus strongly recommend additional markers and cost-effective approaches for future studies to estimate phylogenetic relationships.

  17. Linkages to Public Land Framework: toward embedding humans in ecosystem analyses by using “inside-out social assessment.”

    Science.gov (United States)

    Joanna Endter-Wada; Dale J. Blahna

    2011-01-01

    This article presents the "Linkages to Public Land" (LPL) Framework, a general but comprehensive data-gathering and analysis approach aimed at informing citizen and agency decision making about the social environment of public land. This social assessment and planning approach identifies and categorizes various types of linkages that people have to public...

  18. Witnessman: The software tool to design, analyse and assess a witness pack with respect to military and medical effects on an (UN)protected (DIS)mounted soldier

    NARCIS (Netherlands)

    Verhagen, Th.L.A.; Kemper, R.; Huisjes, H.; Knijnenburg, S.G.; Pronk, A.; Klink, M.H. van

    2005-01-01

    Witness packs are widely used to enable the characterization of fragments generated during ballistic experiments. To assess the effect of fragment impact on an exposed (dis)mounted soldier, a follow-up Vulnerability/Lethality (V/L) simulation is essential. The overall process is elaborate and time-consuming…

  19. Principal component regression analysis with SPSS.

    Science.gov (United States)

    Liu, R X; Kuang, J; Gong, Q; Hou, X L

    2003-06-01

    The paper introduces all indices of multicollinearity diagnosis, the basic principle of principal component regression, and the determination of the 'best' equation. The paper uses an example to describe how to perform principal component regression analysis with SPSS 10.0, including all calculation steps of the principal component regression and all operations of the linear regression, factor analysis, descriptives, compute variable and bivariate correlations procedures in SPSS 10.0. Principal component regression analysis can be used to overcome the disturbance of multicollinearity. A simplified, faster and accurate statistical analysis is achieved through principal component regression analysis with SPSS.
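
    The same analysis is straightforward outside SPSS; a sketch of principal component regression with scikit-learn (the number of retained components is an analyst's choice, justified e.g. by explained variance):

        import numpy as np
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.decomposition import PCA
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(8)
        X = rng.normal(size=(100, 5))
        X[:, 3] = X[:, 0] + 0.01 * rng.normal(size=100)  # severe collinearity
        y = X[:, 0] + X[:, 1] + rng.normal(size=100)

        # Regress on the leading principal components instead of the raw,
        # collinear predictors, which stabilizes the coefficient estimates.
        pcr = make_pipeline(StandardScaler(), PCA(n_components=3),
                            LinearRegression())
        pcr.fit(X, y)
        print("R^2:", round(pcr.score(X, y), 3))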

  20. Comparing parametric and nonparametric regression methods for panel data

    DEFF Research Database (Denmark)

    Czekaj, Tomasz Gerard; Henningsen, Arne

    We investigate and compare the suitability of parametric and non-parametric stochastic regression methods for analysing production technologies and the