WorldWideScience

Sample records for proportional hazard regression

  1. Comparing treatment effects after adjustment with multivariable Cox proportional hazards regression and propensity score methods

    NARCIS (Netherlands)

    Martens, Edwin P; de Boer, Anthonius; Pestman, Wiebe R; Belitser, Svetlana V; Stricker, Bruno H Ch; Klungel, Olaf H

    PURPOSE: To compare adjusted effects of drug treatment for hypertension on the risk of stroke from propensity score (PS) methods with a multivariable Cox proportional hazards (Cox PH) regression in an observational study with censored data. METHODS: From two prospective population-based cohort

  2. Statistical power to detect violation of the proportional hazards assumption when using the Cox regression model.

    Science.gov (United States)

    Austin, Peter C

    2018-01-01

    The use of the Cox proportional hazards regression model is widespread. A key assumption of the model is that of proportional hazards. Analysts frequently test the validity of this assumption using statistical significance testing. However, the statistical power of such assessments is frequently unknown. We used Monte Carlo simulations to estimate the statistical power of two different methods for detecting violations of this assumption. When the covariate was binary, we found that a model-based method had greater power than a method based on cumulative sums of martingale residuals. Furthermore, the parametric nature of the distribution of event times had an impact on power when the covariate was binary. Statistical power to detect a strong violation of the proportional hazards assumption was low to moderate even when the number of observed events was high. In many data sets, power to detect a violation of this assumption is likely to be low to modest.
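
    As a rough companion to the abstract above, the sketch below estimates the power of a proportional-hazards test by Monte Carlo simulation in Python's lifelines library (the library choice, sample size, Weibull shapes, and censoring rate are illustrative assumptions, not taken from the paper). Two groups are given different Weibull shapes so their hazards are genuinely non-proportional, and the rejection rate of lifelines' Schoenfeld-residual-based test is the power estimate.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter
from lifelines.statistics import proportional_hazard_test

rng = np.random.default_rng(1)

def simulate_nonph(n=300):
    """Binary covariate with group-specific Weibull shapes, so hazards cross (PH violated)."""
    x = rng.integers(0, 2, n)
    t = np.where(x == 0, rng.weibull(0.8, n), rng.weibull(1.6, n))
    c = rng.exponential(2.0, n)                      # independent censoring
    return pd.DataFrame({"T": np.minimum(t, c), "E": (t <= c).astype(int), "x": x})

n_sim, rejections = 200, 0
for _ in range(n_sim):
    df = simulate_nonph()
    cph = CoxPHFitter().fit(df, duration_col="T", event_col="E")
    test = proportional_hazard_test(cph, df, time_transform="rank")
    rejections += int(np.ravel(test.p_value)[0] < 0.05)

print(f"Estimated power to detect the PH violation: {rejections / n_sim:.2f}")
```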

  3. A Proportional Hazards Regression Model for the Subdistribution with Covariates-adjusted Censoring Weight for Competing Risks Data

    DEFF Research Database (Denmark)

    He, Peng; Eriksson, Frank; Scheike, Thomas H.

    2016-01-01

    With competing risks data, one often needs to assess the treatment and covariate effects on the cumulative incidence function. Fine and Gray proposed a proportional hazards regression model for the subdistribution of a competing risk with the assumption that the censoring distribution and the covariates are independent. Covariate-dependent censoring sometimes occurs in medical studies. In this paper, we study the proportional hazards regression model for the subdistribution of a competing risk with proper adjustments for covariate-dependent censoring. We consider a covariate-adjusted weight function by fitting the Cox model for the censoring distribution and using the predictive probability for each individual. Our simulation study shows that the covariate-adjusted weight estimator is basically unbiased when the censoring time depends on the covariates, and the covariate-adjusted weight...

  4. Simple estimation procedures for regression analysis of interval-censored failure time data under the proportional hazards model.

    Science.gov (United States)

    Sun, Jianguo; Feng, Yanqin; Zhao, Hui

    2015-01-01

    Interval-censored failure time data occur in many fields including epidemiological and medical studies as well as financial and sociological studies, and many authors have investigated their analysis (Sun, The statistical analysis of interval-censored failure time data, 2006; Zhang, Stat Modeling 9:321-343, 2009). In particular, a number of procedures have been developed for regression analysis of interval-censored data arising from the proportional hazards model (Finkelstein, Biometrics 42:845-854, 1986; Huang, Ann Stat 24:540-568, 1996; Pan, Biometrics 56:199-203, 2000). For most of these procedures, however, one drawback is that they involve estimation of both regression parameters and baseline cumulative hazard function. In this paper, we propose two simple estimation approaches that do not need estimation of the baseline cumulative hazard function. The asymptotic properties of the resulting estimates are given, and an extensive simulation study is conducted and indicates that they work well for practical situations.

  5. Limitations of Cox Proportional Hazards Analysis in Mortality Prediction of Patients with Acute Coronary Syndrome

    Directory of Open Access Journals (Sweden)

    Babińska Magdalena

    2015-12-01

    Full Text Available The aim of this study was to evaluate the possibility of incorrect assessment of mortality risk factors in a group of patients affected by acute coronary syndrome, due to the lack of hazard proportionality in the Cox regression model. One hundred and fifty consecutive patients with acute coronary syndrome (ACS) and no age limit were enrolled. Univariable and multivariable Cox proportional hazard analyses were performed. The proportional hazard assumptions were verified using Schoenfeld residuals, χ2 test and rank correlation coefficient t between residuals and time. In the total group of 150 patients, 33 (22.0%) deaths from any cause were registered in the follow-up time period of 64 months. The non-survivors were significantly older and had increased prevalence of diabetes and erythrocyturia, longer history of coronary artery disease, higher concentrations of serum creatinine, cystatin C, uric acid, glucose, C-reactive protein (CRP), homocysteine and B-type natriuretic peptide (NT-proBNP), and lower concentrations of serum sodium. No significant differences in echocardiography parameters were observed between groups. The following factors were risk of death factors and fulfilled the proportional hazard assumption in the univariable model: smoking, occurrence of diabetes and anaemia, duration of coronary artery disease, and abnormal serum concentrations of uric acid, sodium, homocysteine, cystatin C and NT-proBNP, while in the multivariable model, the risk of death factors were: smoking and elevated concentrations of homocysteine and NT-proBNP. The study has demonstrated that violation of the proportional hazard assumption in the Cox regression model may lead to creating a false model that does not include only time-independent predictive factors.
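
    The abstract above checks proportionality with Schoenfeld residuals; a minimal sketch of that kind of check in Python's lifelines is shown below (the software and the bundled example data set are assumptions for illustration, not the study's clinical data).

```python
from lifelines import CoxPHFitter
from lifelines.datasets import load_rossi
from lifelines.statistics import proportional_hazard_test

df = load_rossi()                 # example recidivism data shipped with lifelines
cph = CoxPHFitter().fit(df, duration_col="week", event_col="arrest")

# Schoenfeld-residual-based test of proportional hazards, one p-value per covariate
result = proportional_hazard_test(cph, df, time_transform="rank")
print(result.summary)             # covariates with small p-values violate the PH assumption

# lifelines can also print advice and plot scaled Schoenfeld residuals against time
cph.check_assumptions(df, p_value_threshold=0.05, show_plots=False)
```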

  6. Cox proportional hazards models have more statistical power than logistic regression models in cross-sectional genetic association studies

    NARCIS (Netherlands)

    van der Net, Jeroen B.; Janssens, A. Cecile J. W.; Eijkemans, Marinus J. C.; Kastelein, John J. P.; Sijbrands, Eric J. G.; Steyerberg, Ewout W.

    2008-01-01

    Cross-sectional genetic association studies can be analyzed using Cox proportional hazards models with age as time scale, if age at onset of disease is known for the cases and age at data collection is known for the controls. We assessed to what degree and under what conditions Cox proportional

  7. Proportional hazards models of infrastructure system recovery

    International Nuclear Information System (INIS)

    Barker, Kash; Baroud, Hiba

    2014-01-01

    As emphasis is being placed on a system's ability to withstand and to recover from a disruptive event, collectively referred to as dynamic resilience, there exists a need to quantify a system's ability to bounce back after a disruptive event. This work applies a statistical technique from biostatistics, the proportional hazards model, to describe (i) the instantaneous rate of recovery of an infrastructure system and (ii) the likelihood that recovery occurs prior to a given point in time. A major benefit of the proportional hazards model is its ability to describe a recovery event as a function of time as well as covariates describing the infrastructure system or disruptive event, among others, which can also vary with time. The proportional hazards approach is illustrated with a publicly available electric power outage data set
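
    A sketch of how such a recovery model might be fit in practice, using Python's lifelines on invented outage records (the column names, covariates, and values below are hypothetical placeholders; the paper's actual data set and software are not specified here).

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical outage records: hours until restoration, whether restoration was observed,
# and covariates describing the disruptive event (values are illustrative only).
outages = pd.DataFrame({
    "hours_to_restore": [12, 30, 7, 55, 21, 40, 9, 16, 26, 11],
    "restored":         [1,  1,  1, 0,  1,  1,  1, 1,  1,  1],   # 0 = still out when observation ended
    "wind_speed_mph":   [45, 70, 30, 90, 60, 75, 35, 50, 65, 40],
    "crews_deployed":   [4,  2,  5,  1,  3,  2,  5,  4,  2,  5],
})

cph = CoxPHFitter()
cph.fit(outages, duration_col="hours_to_restore", event_col="restored")
cph.print_summary()   # hazard ratio > 1 means a higher instantaneous recovery rate (faster restoration)
```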

  8. Functional form diagnostics for Cox's proportional hazards model.

    Science.gov (United States)

    León, Larry F; Tsai, Chih-Ling

    2004-03-01

    We propose a new type of residual and an easily computed functional form test for the Cox proportional hazards model. The proposed test is a modification of the omnibus test for testing the overall fit of a parametric regression model, developed by Stute, González Manteiga, and Presedo Quindimil (1998, Journal of the American Statistical Association93, 141-149), and is based on what we call censoring consistent residuals. In addition, we develop residual plots that can be used to identify the correct functional forms of covariates. We compare our test with the functional form test of Lin, Wei, and Ying (1993, Biometrika80, 557-572) in a simulation study. The practical application of the proposed residuals and functional form test is illustrated using both a simulated data set and a real data set.

  9. Logistic regression applied to natural hazards: rare event logistic regression with replications

    Science.gov (United States)

    Guns, M.; Vanacker, V.

    2012-06-01

    Statistical analysis of natural hazards needs particular attention, as most of these phenomena are rare events. This study shows that the ordinary rare event logistic regression, as it is now commonly used in geomorphologic studies, does not always lead to a robust detection of controlling factors, as the results can be strongly sample-dependent. In this paper, we introduce some concepts of Monte Carlo simulations in rare event logistic regression. This technique, so-called rare event logistic regression with replications, combines the strength of probabilistic and statistical methods, and allows overcoming some of the limitations of previous developments through robust variable selection. This technique was here developed for the analyses of landslide controlling factors, but the concept is widely applicable for statistical analyses of natural hazards.
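
    The replication idea can be sketched as follows with scikit-learn: refit a logistic regression on many bootstrap samples of a rare-event data set and keep only the predictors whose effect is stable across replications. This is a loose illustration of the concept on synthetic data, not the authors' implementation; the selection threshold and all data-generating values are placeholders.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic rare-event data: 1000 cells, roughly 5% events, 4 candidate controlling factors,
# of which only the first two actually matter.
n, p = 1000, 4
X = rng.normal(size=(n, p))
logit = -3.5 + 1.2 * X[:, 0] - 0.8 * X[:, 1]
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

n_rep = 500
selected = np.zeros(p)
for _ in range(n_rep):
    idx = rng.integers(0, n, n)                    # one bootstrap replication
    model = LogisticRegression(max_iter=1000).fit(X[idx], y[idx])
    selected += np.abs(model.coef_[0]) > 0.5       # crude "is this factor retained?" rule

print("Selection frequency per factor:", selected / n_rep)
```

    Factors retained in only a small fraction of replications would be flagged as sample-dependent, which is exactly the instability the paper warns about.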

  10. A flexible additive multiplicative hazard model

    DEFF Research Database (Denmark)

    Martinussen, T.; Scheike, T. H.

    2002-01-01

    Aalen's additive model; Counting process; Cox regression; Hazard model; Proportional excess hazard model; Time-varying effect

  11. Logistic regression applied to natural hazards: rare event logistic regression with replications

    Directory of Open Access Journals (Sweden)

    M. Guns

    2012-06-01

    Full Text Available Statistical analysis of natural hazards needs particular attention, as most of these phenomena are rare events. This study shows that the ordinary rare event logistic regression, as it is now commonly used in geomorphologic studies, does not always lead to a robust detection of controlling factors, as the results can be strongly sample-dependent. In this paper, we introduce some concepts of Monte Carlo simulations in rare event logistic regression. This technique, so-called rare event logistic regression with replications, combines the strength of probabilistic and statistical methods, and allows overcoming some of the limitations of previous developments through robust variable selection. This technique was here developed for the analyses of landslide controlling factors, but the concept is widely applicable for statistical analyses of natural hazards.

  12. Landslide Hazard Mapping in Rwanda Using Logistic Regression

    Science.gov (United States)

    Piller, A.; Anderson, E.; Ballard, H.

    2015-12-01

    Landslides in the United States cause more than $1 billion in damages and 50 deaths per year (USGS 2014). Globally, figures are much more grave, yet monitoring, mapping and forecasting of these hazards are less than adequate. Seventy-five percent of the population of Rwanda earns a living from farming, mostly subsistence. Loss of farmland, housing, or life, to landslides is a very real hazard. Landslides in Rwanda have an impact at the economic, social, and environmental level. In a developing nation that faces challenges in tracking, cataloging, and predicting the numerous landslides that occur each year, satellite imagery and spatial analysis allow for remote study. We have focused on the development of a landslide inventory and a statistical methodology for assessing landslide hazards. Using logistic regression on approximately 30 test variables (i.e. slope, soil type, land cover, etc.) and a sample of over 200 landslides, we determine which variables are statistically most relevant to landslide occurrence in Rwanda. A preliminary predictive hazard map for Rwanda has been produced, using the variables selected from the logistic regression analysis.
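
    A compressed sketch of the variable-screening step described above, using statsmodels' logistic regression on synthetic grid-cell data (the predictors, coefficients, and the 0.5 cut-off are placeholders; the study's roughly 30 variables and its landslide inventory are not reproduced here).

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)

# Synthetic grid cells: slope, NDVI, and a soil indicator; only slope drives landslides here.
n = 800
slope = rng.uniform(0, 45, n)
ndvi = rng.uniform(0, 1, n)
soil = rng.integers(0, 2, n)
prob = 1 / (1 + np.exp(-(-4.0 + 0.12 * slope)))
landslide = rng.binomial(1, prob)

X = sm.add_constant(np.column_stack([slope, ndvi, soil]))
fit = sm.Logit(landslide, X).fit(disp=False)
print(fit.summary(xname=["const", "slope", "ndvi", "soil"]))   # p-values flag the relevant variables

# Cells with high predicted probability would form the preliminary hazard (susceptibility) map.
hazard_score = fit.predict(X)
print("share of cells flagged as high hazard:", round((hazard_score > 0.5).mean(), 3))
```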

  13. Logistic regression applied to natural hazards: rare event logistic regression with replications

    OpenAIRE

    Guns, M.; Vanacker, Veerle

    2012-01-01

    Statistical analysis of natural hazards needs particular attention, as most of these phenomena are rare events. This study shows that the ordinary rare event logistic regression, as it is now commonly used in geomorphologic studies, does not always lead to a robust detection of controlling factors, as the results can be strongly sample-dependent. In this paper, we introduce some concepts of Monte Carlo simulations in rare event logistic regression. This technique, so-called rare event logisti...

  14. Data-driven smooth tests of the proportional hazards assumption

    Czech Academy of Sciences Publication Activity Database

    Kraus, David

    2007-01-01

    Vol. 13, No. 1 (2007), pp. 1-16 ISSN 1380-7870 R&D Projects: GA AV ČR(CZ) IAA101120604; GA ČR(CZ) GD201/05/H007 Institutional research plan: CEZ:AV0Z10750506 Keywords: Cox model * Neyman's smooth test * proportional hazards assumption * Schwarz's selection rule Subject RIV: BA - General Mathematics Impact factor: 0.491, year: 2007

  15. Non-proportional odds multivariate logistic regression of ordinal family data.

    Science.gov (United States)

    Zaloumis, Sophie G; Scurrah, Katrina J; Harrap, Stephen B; Ellis, Justine A; Gurrin, Lyle C

    2015-03-01

    Methods to examine whether genetic and/or environmental sources can account for the residual variation in ordinal family data usually assume proportional odds. However, standard software to fit the non-proportional odds model to ordinal family data is limited because the correlation structure of family data is more complex than for other types of clustered data. To perform these analyses we propose the non-proportional odds multivariate logistic regression model and take a simulation-based approach to model fitting using Markov chain Monte Carlo methods, such as partially collapsed Gibbs sampling and the Metropolis algorithm. We applied the proposed methodology to male pattern baldness data from the Victorian Family Heart Study. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. Flexible competing risks regression modeling and goodness-of-fit

    DEFF Research Database (Denmark)

    Scheike, Thomas; Zhang, Mei-Jie

    2008-01-01

    In this paper we consider different approaches for estimation and assessment of covariate effects for the cumulative incidence curve in the competing risks model. The classic approach is to model all cause-specific hazards and then estimate the cumulative incidence curve based on these cause-specific hazards. ... models that is easy to fit and contains the Fine-Gray model as a special case. One advantage of this approach is that our regression modeling allows for non-proportional hazards. This leads to a new simple goodness-of-fit procedure for the proportional subdistribution hazards assumption that is very easy ... of the flexible regression models to analyze competing risks data when non-proportionality is present in the data.

  17. Novel Harmonic Regularization Approach for Variable Selection in Cox’s Proportional Hazards Model

    Directory of Open Access Journals (Sweden)

    Ge-Jin Chu

    2014-01-01

    Full Text Available Variable selection is an important issue in regression, and a number of variable selection methods have been proposed involving nonconvex penalty functions. In this paper, we investigate a novel harmonic regularization method, which can approximate nonconvex Lq (1/2 < q < 1) penalties, for the Cox proportional hazards model using microarray gene expression data. The harmonic regularization method can be efficiently solved using our proposed direct path seeking approach, which can produce solutions that closely approximate those for the convex loss function and the nonconvex regularization. Simulation results based on artificial datasets and four real microarray gene expression datasets, such as the diffuse large B-cell lymphoma (DLBCL), lung cancer, and AML datasets, show that the harmonic regularization method can be more accurate for variable selection than existing Lasso series methods.
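
    The harmonic penalty itself is not available in standard Python libraries; as a hedged stand-in, the sketch below shows the general workflow of penalized variable selection in a Cox model using lifelines' built-in lasso-style (L1) penalty on synthetic data. It illustrates the Lasso baseline the paper compares against, not the harmonic regularization method, and the dimensions and penalty strength are arbitrary.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(7)

# Synthetic data: 120 subjects, 20 candidate "genes", only the first three affect the hazard.
n, p = 120, 20
X = rng.normal(size=(n, p))
risk = 0.9 * X[:, 0] - 0.7 * X[:, 1] + 0.5 * X[:, 2]
t = rng.exponential(np.exp(-risk))
c = rng.exponential(2.0, n)
df = pd.DataFrame(X, columns=[f"g{i}" for i in range(p)])
df["T"], df["E"] = np.minimum(t, c), (t <= c).astype(int)

# l1_ratio=1.0 gives a pure lasso penalty; penalizer controls its strength
cph = CoxPHFitter(penalizer=0.2, l1_ratio=1.0)
cph.fit(df, duration_col="T", event_col="E")
kept = cph.params_[cph.params_.abs() > 1e-3]
print("Variables retained by the lasso-penalized Cox model:")
print(kept)
```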

  18. Causal Mediation Analysis for the Cox Proportional Hazards Model with a Smooth Baseline Hazard Estimator.

    Science.gov (United States)

    Wang, Wei; Albert, Jeffrey M

    2017-08-01

    An important problem within the social, behavioral, and health sciences is how to partition an exposure effect (e.g. treatment or risk factor) among specific pathway effects and to quantify the importance of each pathway. Mediation analysis based on the potential outcomes framework is an important tool to address this problem and we consider the estimation of mediation effects for the proportional hazards model in this paper. We give precise definitions of the total effect, natural indirect effect, and natural direct effect in terms of the survival probability, hazard function, and restricted mean survival time within the standard two-stage mediation framework. To estimate the mediation effects on different scales, we propose a mediation formula approach in which simple parametric models (fractional polynomials or restricted cubic splines) are utilized to approximate the baseline log cumulative hazard function. Simulation study results demonstrate low bias of the mediation effect estimators and close-to-nominal coverage probability of the confidence intervals for a wide range of complex hazard shapes. We apply this method to the Jackson Heart Study data and conduct sensitivity analysis to assess the impact on the mediation effects inference when the no unmeasured mediator-outcome confounding assumption is violated.

  19. Measures to assess the prognostic ability of the stratified Cox proportional hazards model

    DEFF Research Database (Denmark)

    The Fibrinogen Studies Collaboration; The Copenhagen City Heart Study; Tybjærg-Hansen, Anne

    2009-01-01

    Many measures have been proposed to summarize the prognostic ability of the Cox proportional hazards (CPH) survival model, although none is universally accepted for general use. By contrast, little work has been done to summarize the prognostic ability of the stratified CPH model; such measures...

  20. Survival analysis II: Cox regression

    NARCIS (Netherlands)

    Stel, Vianda S.; Dekker, Friedo W.; Tripepi, Giovanni; Zoccali, Carmine; Jager, Kitty J.

    2011-01-01

    In contrast to the Kaplan-Meier method, Cox proportional hazards regression can provide an effect estimate by quantifying the difference in survival between patient groups and can adjust for confounding effects of other variables. The purpose of this article is to explain the basic concepts of the

  1. The Application of Extended Cox Proportional Hazard Method for Estimating Survival Time of Breast Cancer

    Science.gov (United States)

    Husain, Hartina; Astuti Thamrin, Sri; Tahir, Sulaiha; Mukhlisin, Ahmad; Mirna Apriani, M.

    2018-03-01

    Breast cancer is a type of cancer that is a leading cause of death worldwide. This study aims to model the factors that affect the survival time and cure rate of breast cancer patients. The extended Cox model, a modification of the Cox proportional hazards model for situations in which the proportional hazards assumption is not met, is used in this study. The maximum likelihood estimation approach is used to estimate the parameters of the model. The method is then applied to the medical records of breast cancer patients treated in 2011-2016 at Hasanuddin University Education Hospital. The results indicate that the factors affecting the survival time of breast cancer patients are malignancy and leukocyte levels.

  2. A simple linear regression method for quantitative trait loci linkage analysis with censored observations.

    Science.gov (United States)

    Anderson, Carl A; McRae, Allan F; Visscher, Peter M

    2006-07-01

    Standard quantitative trait loci (QTL) mapping techniques commonly assume that the trait is both fully observed and normally distributed. When considering survival or age-at-onset traits these assumptions are often incorrect. Methods have been developed to map QTL for survival traits; however, they are both computationally intensive and not available in standard genome analysis software packages. We propose a grouped linear regression method for the analysis of continuous survival data. Using simulation we compare this method to both the Cox and Weibull proportional hazards models and a standard linear regression method that ignores censoring. The grouped linear regression method is of equivalent power to both the Cox and Weibull proportional hazards methods and is significantly better than the standard linear regression method when censored observations are present. The method is also robust to the proportion of censored individuals and the underlying distribution of the trait. On the basis of linear regression methodology, the grouped linear regression model is computationally simple and fast and can be implemented readily in freely available statistical software.

  3. Optimization of maintenance policy using the proportional hazard model

    Energy Technology Data Exchange (ETDEWEB)

    Samrout, M. [Information Sciences and Technologies Institute, University of Technology of Troyes, 10000 Troyes (France)], E-mail: mohamad.el_samrout@utt.fr; Chatelet, E. [Information Sciences and Technologies Institute, University of Technology of Troyes, 10000 Troyes (France)], E-mail: chatelt@utt.fr; Kouta, R. [M3M Laboratory, University of Technology of Belfort Montbeliard (France); Chebbo, N. [Industrial Systems Laboratory, IUT, Lebanese University (Lebanon)

    2009-01-15

    The evolution of system reliability depends on its structure as well as on the evolution of its components' reliability. The latter is a function of component age during a system's operating life. Component aging is strongly affected by maintenance activities performed on the system. In this work, we consider two categories of maintenance activities: corrective maintenance (CM) and preventive maintenance (PM). Maintenance actions are characterized by their ability to reduce this age. PM consists of actions applied on components while they are operating, whereas CM actions occur when the component breaks down. In this paper, we expound a new method to integrate the effect of CM while planning for the PM policy. The proportional hazard function was used as a modeling tool for that purpose. Interesting results were obtained when comparing policies that take the CM effect into consideration with those that do not.

  4. Proportional Odds Logistic Regression - Effective Means of Dealing with Limited Uncertainty in Dichotomizing Clinical Outcomes

    Czech Academy of Sciences Publication Activity Database

    Valenta, Zdeněk; Pitha, J.; Poledne, R.

    2006-01-01

    Vol. 25, No. 24 (2006), pp. 4227-4234 ISSN 0277-6715 R&D Projects: GA MZd NA7512 Institutional research plan: CEZ:AV0Z10300504 Keywords: proportional odds logistic regression * dichotomized outcomes * uncertainty Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 1.737, year: 2006

  5. Extended Cox regression model: The choice of time function

    Science.gov (United States)

    Isik, Hatice; Tutkun, Nihal Ata; Karasoy, Durdu

    2017-07-01

    The Cox regression model (CRM), which takes into account the effect of censored observations, is one of the most widely applied models in survival analysis for evaluating the effects of covariates. Proportional hazards (PH), which requires a constant hazard ratio over time, is the key assumption of the CRM. The extended CRM allows a time-dependent covariate to be included, either to test the PH assumption or to serve as an alternative model when hazards are non-proportional. In this study, different types of real data sets are used to choose the time function, and the differences between time functions are analyzed and discussed.
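
    As a hedged illustration of the extended Cox model discussed above, the lifelines sketch below splits follow-up into episodes and adds a covariate multiplied by a time function g(t), here g(t) = log(t); whether g(t) is t, log(t), or a step function is exactly the choice the paper examines. The Rossi recidivism data bundled with lifelines is used purely as a stand-in for the paper's data sets.

```python
import numpy as np
from lifelines import CoxTimeVaryingFitter
from lifelines.utils import to_episodic_format
from lifelines.datasets import load_rossi

rossi = load_rossi()

# one row per subject-week, with start/stop columns and an auto-generated id column
long_df = to_episodic_format(rossi, duration_col="week", event_col="arrest", time_gaps=1.0)

# extended Cox term: covariate multiplied by the chosen time function, here g(t) = log(t)
long_df["prio_x_logt"] = long_df["prio"] * np.log(long_df["stop"])

ctv = CoxTimeVaryingFitter()
ctv.fit(long_df[["id", "start", "stop", "arrest", "prio", "prio_x_logt"]],
        id_col="id", event_col="arrest", start_col="start", stop_col="stop")
ctv.print_summary()   # a significant prio_x_logt term signals non-proportional hazards for prio
```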

  6. Proportional hazards model with varying coefficients for length-biased data.

    Science.gov (United States)

    Zhang, Feipeng; Chen, Xuerong; Zhou, Yong

    2014-01-01

    Length-biased data arise in many important applications including epidemiological cohort studies, cancer prevention trials and studies of labor economics. Such data are also often subject to right censoring due to loss to follow-up or the end of the study. In this paper, we consider a proportional hazards model with varying coefficients for right-censored and length-biased data, which is used to study nonlinear interaction effects between the covariates and an exposure variable. A local estimating equation method is proposed for the unknown coefficients and the intercept function in the model. The asymptotic properties of the proposed estimators are established by using martingale theory and kernel smoothing techniques. Our simulation studies demonstrate that the proposed estimators have excellent finite-sample performance. The Channing House data are analyzed to demonstrate the application of the proposed method.

  7. Augmenting the logrank test in the design of clinical trials in which non-proportional hazards of the treatment effect may be anticipated

    Directory of Open Access Journals (Sweden)

    Patrick Royston

    2016-02-01

    Full Text Available Background: Most randomized controlled trials with a time-to-event outcome are designed assuming proportional hazards (PH) of the treatment effect. The sample size calculation is based on a logrank test. However, non-proportional hazards are increasingly common. At analysis, the estimated hazard ratio with a confidence interval is usually presented. The estimate is often obtained from a Cox PH model with treatment as a covariate. If non-proportional hazards are present, the logrank and equivalent Cox tests may lose power. To safeguard power, we previously suggested a ‘joint test’ combining the Cox test with a test of non-proportional hazards. Unfortunately, a larger sample size is needed to preserve power under PH. Here, we describe a novel test that unites the Cox test with a permutation test based on restricted mean survival time. Methods: We propose a combined hypothesis test based on a permutation test of the difference in restricted mean survival time across time. The test involves the minimum of the Cox and permutation test P-values. We approximate its null distribution and correct it for correlation between the two P-values. Using extensive simulations, we assess the type 1 error and power of the combined test under several scenarios and compare with other tests. We investigate powering a trial using the combined test. Results: The type 1 error of the combined test is close to nominal. Power under proportional hazards is slightly lower than for the Cox test. Enhanced power is available when the treatment difference shows an ‘early effect’, an initial separation of survival curves which diminishes over time. The power is reduced under a ‘late effect’, when little or no difference in survival curves is seen for an initial period and then a late separation occurs. We propose a method of powering a trial using the combined test. The ‘insurance premium’ offered by the combined test to safeguard power under non

  8. The comparison of proportional hazards and accelerated failure time models in analyzing the first birth interval survival data

    Science.gov (United States)

    Faruk, Alfensi

    2018-03-01

    Survival analysis is a branch of statistics focussed on the analysis of time-to-event data. In multivariate survival analysis, the proportional hazards (PH) model is the most popular model for analyzing the effects of several covariates on survival time. However, the proportionality assumption of the PH model is not always satisfied by the data. Violation of the PH assumption leads to misinterpretation of the estimation results and decreases the power of the related statistical tests. The accelerated failure time (AFT) models, on the other hand, do not require the proportional hazards assumption and can be used as an alternative to the PH model when that assumption is violated. The objective of this research was to compare the performance of the PH model and the AFT models in analyzing the significant factors affecting the first birth interval (FBI) data in Indonesia. The discussion was limited to three AFT models, based on the Weibull, exponential, and log-normal distributions. Analysis using a graphical approach and a statistical test showed that non-proportional hazards exist in the FBI data set. Based on the Akaike information criterion (AIC), the log-normal AFT model was the most appropriate among the considered models. Results of the best-fitting model (the log-normal AFT model) showed that covariates such as the woman’s educational level, the husband’s educational level, contraceptive knowledge, access to mass media, wealth index, and employment status were among the factors affecting the FBI in Indonesia.
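
    A sketch of the model-comparison step described above, using lifelines' parametric AFT fitters and AIC; the library, the stand-in data set, and the use of a log-logistic fit in place of a plain exponential one are assumptions made for illustration (the paper compares Weibull, exponential, and log-normal AFT models on the FBI data).

```python
from lifelines import WeibullAFTFitter, LogNormalAFTFitter, LogLogisticAFTFitter
from lifelines.datasets import load_rossi

df = load_rossi()   # stand-in data; the paper uses Indonesian Demographic and Health Survey data

models = {
    "Weibull AFT": WeibullAFTFitter(),
    "Log-normal AFT": LogNormalAFTFitter(),
    "Log-logistic AFT": LogLogisticAFTFitter(),
}
for name, model in models.items():
    model.fit(df, duration_col="week", event_col="arrest")
    print(f"{name:16s} AIC = {model.AIC_:.1f}")   # the smallest AIC indicates the preferred AFT model
```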

  9. Regression analysis of informative current status data with the additive hazards model.

    Science.gov (United States)

    Zhao, Shishun; Hu, Tao; Ma, Ling; Wang, Peijie; Sun, Jianguo

    2015-04-01

    This paper discusses regression analysis of current status failure time data arising from the additive hazards model in the presence of informative censoring. Many methods have been developed for regression analysis of current status data under various regression models if the censoring is noninformative, and also there exists a large literature on parametric analysis of informative current status data in the context of tumorgenicity experiments. In this paper, a semiparametric maximum likelihood estimation procedure is presented and in the method, the copula model is employed to describe the relationship between the failure time of interest and the censoring time. Furthermore, I-splines are used to approximate the nonparametric functions involved and the asymptotic consistency and normality of the proposed estimators are established. A simulation study is conducted and indicates that the proposed approach works well for practical situations. An illustrative example is also provided.

  10. ADVANCES IN RENEWAL DECISION-MAKING UTILISING THE PROPORTIONAL HAZARDS MODEL WITH VIBRATION COVARIATES

    Directory of Open Access Journals (Sweden)

    Pieter-Jan Vlok

    2012-01-01

    Full Text Available

    ENGLISH ABSTRACT: Increased competitiveness in the production world necessitates improved maintenance strategies to increase availabilities and drive down cost. The maintenance engineer is thus faced with the need to make more intelligent preventive renewal decisions. Two of the main techniques to achieve this are Condition Monitoring (such as vibration monitoring and oil analysis) and Statistical Failure Analysis (typically using probabilistic techniques). The present paper discusses these techniques, their uses and weaknesses, and then presents the Proportional Hazards Model as a solution to most of these weaknesses. It then goes on to compare the results of the different techniques in monetary terms, using a South African case study. This comparison shows clearly that the Proportional Hazards Model is superior to the present techniques and should be the preferred model for many actual maintenance situations.

    AFRIKAANSE OPSOMMING (translated): Increased levels of competition in the production environment necessitate improved maintenance strategies to increase the availability of equipment and to minimise cost. Maintenance engineers consequently have to make more intelligent preventive renewal decisions. Two prominent techniques for achieving this goal are Condition Monitoring (such as vibration monitoring or oil analysis) and Statistical Failure Analysis (usually by means of probabilistic methods). This article considers both of these techniques, their uses and shortcomings, and then proposes the Proportional Hazards Model as a solution to most of the shortcomings. The article also compares the different techniques in monetary terms by making use of a South African case study. This comparison clearly shows that the Proportional Hazards Model holds greater promise than the current techniques and should be the preferred model in many real maintenance situations.

  11. DeepSurv: personalized treatment recommender system using a Cox proportional hazards deep neural network.

    Science.gov (United States)

    Katzman, Jared L; Shaham, Uri; Cloninger, Alexander; Bates, Jonathan; Jiang, Tingting; Kluger, Yuval

    2018-02-26

    Medical practitioners use survival models to explore and understand the relationships between patients' covariates (e.g. clinical and genetic features) and the effectiveness of various treatment options. Standard survival models like the linear Cox proportional hazards model require extensive feature engineering or prior medical knowledge to model treatment interaction at an individual level. While nonlinear survival methods, such as neural networks and survival forests, can inherently model these high-level interaction terms, they have yet to be shown as effective treatment recommender systems. We introduce DeepSurv, a Cox proportional hazards deep neural network and state-of-the-art survival method for modeling interactions between a patient's covariates and treatment effectiveness in order to provide personalized treatment recommendations. We perform a number of experiments training DeepSurv on simulated and real survival data. We demonstrate that DeepSurv performs as well as or better than other state-of-the-art survival models and validate that DeepSurv successfully models increasingly complex relationships between a patient's covariates and their risk of failure. We then show how DeepSurv models the relationship between a patient's features and effectiveness of different treatment options to show how DeepSurv can be used to provide individual treatment recommendations. Finally, we train DeepSurv on real clinical studies to demonstrate how its personalized treatment recommendations would increase the survival time of a set of patients. The predictive and modeling capabilities of DeepSurv will enable medical researchers to use deep neural networks as a tool in their exploration, understanding, and prediction of the effects of a patient's characteristics on their risk of failure.

  12. Application of random survival forests in understanding the determinants of under-five child mortality in Uganda in the presence of covariates that satisfy the proportional and non-proportional hazards assumption.

    Science.gov (United States)

    Nasejje, Justine B; Mwambi, Henry

    2017-09-07

    Uganda, like many other Sub-Saharan African countries, has a high under-five child mortality rate. To inform policy on intervention strategies, sound statistical methods are required to critically identify factors strongly associated with under-five child mortality rates. The Cox proportional hazards model has been a common choice in analysing data to understand factors strongly associated with high child mortality rates, taking age as the time-to-event variable. However, due to its restrictive proportional hazards (PH) assumption, some covariates of interest which do not satisfy the assumption are often excluded from the analysis to avoid mis-specifying the model; using covariates that clearly violate the assumption would otherwise yield invalid results. Survival trees and random survival forests are increasingly becoming popular in analysing survival data, particularly in the case of large survey data, and could be attractive alternatives to models with the restrictive PH assumption. In this article, we adopt random survival forests, which have never been used in understanding factors affecting under-five child mortality rates in Uganda, using Demographic and Health Survey data. The first part of the analysis is based on the classical Cox PH model and the second part on random survival forests in the presence of covariates that do not necessarily satisfy the PH assumption. Random survival forests and the Cox proportional hazards model agree that the sex of the household head, the sex of the child, and the number of births in the past year are strongly associated with under-five child mortality in Uganda, given that all three covariates satisfy the PH assumption. Random survival forests further demonstrated that covariates that were originally excluded from the earlier analysis due to violation of the PH assumption were important in explaining under-five child mortality rates. These covariates include the number of children under the
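
    A minimal sketch of fitting a random survival forest in Python with scikit-survival; the library and the bundled example data are assumptions for illustration, since the paper analyses Ugandan Demographic and Health Survey data and its software is not named here.

```python
from sklearn.model_selection import train_test_split
from sksurv.datasets import load_whas500
from sksurv.ensemble import RandomSurvivalForest
from sksurv.preprocessing import OneHotEncoder

# Example survival data bundled with scikit-survival (a stand-in for the DHS child-mortality data)
X, y = load_whas500()
X = OneHotEncoder().fit_transform(X)    # expand categorical covariates into numeric columns

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

rsf = RandomSurvivalForest(n_estimators=200, min_samples_leaf=10, random_state=0, n_jobs=-1)
rsf.fit(X_train, y_train)

# The default score is Harrell's concordance index; no proportional hazards assumption is needed.
print("Test concordance index:", round(rsf.score(X_test, y_test), 3))
```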

  13. Logistic Regression in the Identification of Hazards in Construction

    Science.gov (United States)

    Drozd, Wojciech

    2017-10-01

    The construction site and its elements create circumstances that are conducive to the formation of risks to safety during the execution of works. Analysis indicates the critical importance of these factors in the set of characteristics that describe the causes of accidents in the construction industry. This article attempts to analyse the characteristics related to the construction site in order to indicate their importance in defining the circumstances of accidents at work. The study includes sites inspected in 2014-2016 by the employees of the District Labour Inspectorate in Krakow (Poland). The analysed set of detailed (disaggregated) data includes both quantitative and qualitative characteristics. The substantive task focused on classification modelling to identify hazards in construction and to indicate which of the analysed characteristics are important in an accident. Methodologically, the data were analysed using statistical classifiers in the form of logistic regression.

  14. Remote sensing and GIS-based landslide hazard analysis and cross-validation using multivariate logistic regression model on three test areas in Malaysia

    Science.gov (United States)

    Pradhan, Biswajeet

    2010-05-01

    This paper presents the results of the cross-validation of a multivariate logistic regression model using remote sensing data and GIS for landslide hazard analysis on the Penang, Cameron, and Selangor areas in Malaysia. Landslide locations in the study areas were identified by interpreting aerial photographs and satellite images, supported by field surveys. SPOT 5 and Landsat TM satellite imagery were used to map landcover and vegetation index, respectively. Maps of topography, soil type, lineaments and land cover were constructed from the spatial datasets. Ten factors which influence landslide occurrence, i.e., slope, aspect, curvature, distance from drainage, lithology, distance from lineaments, soil type, landcover, rainfall precipitation, and normalized difference vegetation index (ndvi), were extracted from the spatial database and the logistic regression coefficient of each factor was computed. Then the landslide hazard was analysed using the multivariate logistic regression coefficients derived not only from the data for the respective area but also using the logistic regression coefficients calculated from each of the other two areas (nine hazard maps in all) as a cross-validation of the model. For verification of the model, the results of the analyses were then compared with the field-verified landslide locations. Among the three cases applying logistic regression coefficients within the same study area, the case of Selangor based on the Selangor logistic regression coefficients showed the highest accuracy (94%), whereas Penang based on the Penang coefficients showed the lowest accuracy (86%). Similarly, among the six cases cross-applying logistic regression coefficients from the other two areas, the case of Selangor based on the Cameron logistic regression coefficients showed the highest (90%) prediction accuracy, whereas the case of Penang based on the Selangor logistic regression coefficients showed the lowest accuracy (79%). Qualitatively, the cross

  15. Hierarchical Neural Regression Models for Customer Churn Prediction

    Directory of Open Access Journals (Sweden)

    Golshan Mohammadi

    2013-01-01

    Full Text Available As customers are the main assets of each industry, customer churn prediction is becoming a major task for companies to remain in competition with competitors. In the literature, the better applicability and efficiency of hierarchical data mining techniques has been reported. This paper considers three hierarchical models by combining four different data mining techniques for churn prediction, which are backpropagation artificial neural networks (ANN), self-organizing maps (SOM), alpha-cut fuzzy c-means (α-FCM), and the Cox proportional hazards regression model. The hierarchical models are ANN + ANN + Cox, SOM + ANN + Cox, and α-FCM + ANN + Cox. In particular, the first component of the models aims to cluster data in two churner and nonchurner groups and also filter out unrepresentative data or outliers. Then, the clustered data as the outputs are used to assign customers to churner and nonchurner groups by the second technique. Finally, the correctly classified data are used to create the Cox proportional hazards model. To evaluate the performance of the hierarchical models, an Iranian mobile dataset is considered. The experimental results show that the hierarchical models outperform the single Cox regression baseline model in terms of prediction accuracy, Types I and II errors, RMSE, and MAD metrics. In addition, the α-FCM + ANN + Cox model significantly performs better than the two other hierarchical models.

  16. Classification of Large-Scale Remote Sensing Images for Automatic Identification of Health Hazards: Smoke Detection Using an Autologistic Regression Classifier.

    Science.gov (United States)

    Wolters, Mark A; Dean, C B

    2017-01-01

    Remote sensing images from Earth-orbiting satellites are a potentially rich data source for monitoring and cataloguing atmospheric health hazards that cover large geographic regions. A method is proposed for classifying such images into hazard and nonhazard regions using the autologistic regression model, which may be viewed as a spatial extension of logistic regression. The method includes a novel and simple approach to parameter estimation that makes it well suited to handling the large and high-dimensional datasets arising from satellite-borne instruments. The methodology is demonstrated on both simulated images and a real application to the identification of forest fire smoke.

  17. Global Drought Proportional Economic Loss Risk Deciles

    Data.gov (United States)

    National Aeronautics and Space Administration — Global Drought Proportional Economic Loss Risk Deciles is a 2.5 minute grid of drought hazard economic loss as proportions of Gross Domestic Product (GDP) per...

  18. [Hazard function and life table: an introduction to the failure time analysis].

    Science.gov (United States)

    Matsushita, K; Inaba, H

    1987-04-01

    Failure time analysis has become popular in demographic studies. It can be viewed as a part of regression analysis with limited dependent variables as well as a special case of event history analysis and multistate demography. The idea of hazard function and failure time analysis, however, has not been properly introduced to nor commonly discussed by demographers in Japan. The concept of hazard function in comparison with life tables is briefly described, where the force of mortality is interchangeable with the hazard rate. The basic idea of failure time analysis is summarized for the cases of exponential distribution, normal distribution, and proportional hazard models. The multiple decrement life table is also introduced as an example of lifetime data analysis with cause-specific hazard rates.

  19. Semiparametric regression analysis of interval-censored competing risks data.

    Science.gov (United States)

    Mao, Lu; Lin, Dan-Yu; Zeng, Donglin

    2017-09-01

    Interval-censored competing risks data arise when each study subject may experience an event or failure from one of several causes and the failure time is not observed directly but rather is known to lie in an interval between two examinations. We formulate the effects of possibly time-varying (external) covariates on the cumulative incidence or sub-distribution function of competing risks (i.e., the marginal probability of failure from a specific cause) through a broad class of semiparametric regression models that captures both proportional and non-proportional hazards structures for the sub-distribution. We allow each subject to have an arbitrary number of examinations and accommodate missing information on the cause of failure. We consider nonparametric maximum likelihood estimation and devise a fast and stable EM-type algorithm for its computation. We then establish the consistency, asymptotic normality, and semiparametric efficiency of the resulting estimators for the regression parameters by appealing to modern empirical process theory. In addition, we show through extensive simulation studies that the proposed methods perform well in realistic situations. Finally, we provide an application to a study on HIV-1 infection with different viral subtypes. © 2017, The International Biometric Society.

  20. Determining the effects of patient casemix on length of hospital stay: a proportional hazards frailty model approach.

    Science.gov (United States)

    Lee, A H; Yau, K K

    2001-01-01

    To identify factors associated with hospital length of stay (LOS) and to model variations in LOS within Diagnosis Related Groups (DRGs). A proportional hazards frailty modelling approach is proposed that accounts for patient transfers and the inherent correlation of patients clustered within hospitals. The investigation is based on patient discharge data extracted for a group of obstetrical DRGs. Application of the frailty approach has highlighted several significant factors after adjustment for patient casemix and random hospital effects. In particular, patients admitted for childbirth with private medical insurance coverage have higher risk of prolonged hospitalization compared to public patients. The determination of pertinent factors provides important information to hospital management and clinicians in assessing the risk of prolonged hospitalization. The analysis also enables the comparison of inter-hospital variations across adjacent DRGs.
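
    Shared-frailty Cox models are not part of the common Python survival libraries; a simpler, hedged stand-in for the hospital-level clustering described above is a Cox model with cluster-robust standard errors, sketched below with lifelines on invented discharge records (all columns and values are hypothetical).

```python
import pandas as pd
from lifelines import CoxPHFitter

# Invented discharge records: length of stay in days, whether discharge was observed
# (vs. transfer/censoring), two casemix covariates, and the admitting hospital.
stays = pd.DataFrame({
    "los_days":   [3, 5, 2, 9, 4, 7, 3, 6, 8, 4, 5, 10],
    "discharged": [1, 1, 1, 0, 1, 1, 1, 1, 1, 1, 1, 0],
    "private":    [0, 1, 0, 1, 0, 1, 0, 0, 1, 0, 1, 1],
    "age":        [29, 34, 26, 38, 31, 36, 27, 30, 40, 28, 33, 37],
    "hospital":   [1, 1, 1, 2, 2, 2, 3, 3, 3, 4, 4, 4],
})

cph = CoxPHFitter()
# cluster_col gives sandwich (cluster-robust) standard errors that acknowledge within-hospital
# correlation; this is a marginal-model alternative, not a true frailty (random-effects) model.
cph.fit(stays, duration_col="los_days", event_col="discharged", cluster_col="hospital")
cph.print_summary()
```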

  1. Statistical analysis of Caterpillar 793D haul truck engine data and through-life diagnostic information using the proportional hazards model

    Directory of Open Access Journals (Sweden)

    Carstens, W. A.

    2013-08-01

    Full Text Available Physical asset management (PAM) is of increasing concern for companies in industry today. A key performance area of PAM is asset care plans (ACPs), which consist of maintenance strategies such as usage based maintenance (UBM) and condition based maintenance (CBM). Data obtained from the South African mining industry was modelled using a CBM prognostic model called the proportional hazards model (PHM). Results indicated that the developed model produced estimates that were reasonable representations of reality. These findings provide an exciting basis for the development of future Weibull PHMs that could result in huge maintenance cost savings and reduced failure occurrences.

  2. Evaluation of protocol change in burn-care management using the Cox proportional hazards model with time-dependent covariates.

    Science.gov (United States)

    Ichida, J M; Wassell, J T; Keller, M D; Ayers, L W

    1993-02-01

    Survival analysis methods are valuable for detecting intervention effects because detailed information from patient records and sensitive outcome measures are used. The burn unit at a large university hospital replaced routine bathing with total body bathing using chlorhexidine gluconate for antimicrobial effect. A Cox proportional hazards model was used to analyse time from admission until either infection with Staphylococcus aureus or discharge for 155 patients, controlling for burn severity and two time-dependent covariates: days until first wound excision and days until first administration of prophylactic antibiotics. The risk of infection was 55 per cent higher in the historical control group, although not statistically significant. There was also some indication that early wound excision may be important as an infection-control measure for burn patients.

  3. Mediation Analysis with Survival Outcomes: Accelerated Failure Time vs. Proportional Hazards Models.

    Science.gov (United States)

    Gelfand, Lois A; MacKinnon, David P; DeRubeis, Robert J; Baraldi, Amanda N

    2016-01-01

    Survival time is an important type of outcome variable in treatment research. Currently, limited guidance is available regarding performing mediation analyses with survival outcomes, which generally do not have normally distributed errors, and contain unobserved (censored) events. We present considerations for choosing an approach, using a comparison of semi-parametric proportional hazards (PH) and fully parametric accelerated failure time (AFT) approaches for illustration. We compare PH and AFT models and procedures in their integration into mediation models and review their ability to produce coefficients that estimate causal effects. Using simulation studies modeling Weibull-distributed survival times, we compare statistical properties of mediation analyses incorporating PH and AFT approaches (employing SAS procedures PHREG and LIFEREG, respectively) under varied data conditions, some including censoring. A simulated data set illustrates the findings. AFT models integrate more easily than PH models into mediation models. Furthermore, mediation analyses incorporating LIFEREG produce coefficients that can estimate causal effects, and demonstrate superior statistical properties. Censoring introduces bias in the coefficient estimate representing the treatment effect on outcome: underestimation in LIFEREG, and overestimation in PHREG. With LIFEREG, this bias can be addressed using an alternative estimate obtained from combining other coefficients, whereas this is not possible with PHREG. When Weibull assumptions are not violated, there are compelling advantages to using LIFEREG over PHREG for mediation analyses involving survival-time outcomes. Irrespective of the procedures used, the interpretation of coefficients, effects of censoring on coefficient estimates, and statistical properties should be taken into account when reporting results.
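
    A rough sketch of the product-of-coefficients logic discussed in this abstract, using an ordinary least squares model for the mediator and a Weibull AFT model from lifelines for the survival outcome on simulated data. The SAS procedures named above (LIFEREG, PHREG) are merely mirrored by Python stand-ins here, all parameter values are invented, and a single replication is shown only to make the mechanics concrete, not as a full mediation analysis.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from lifelines import WeibullAFTFitter

rng = np.random.default_rng(3)
n = 500

# Simulated mediation structure: treatment -> mediator -> Weibull-distributed survival time
treat = rng.integers(0, 2, n)
mediator = 0.6 * treat + rng.normal(size=n)            # path a
log_scale = 0.4 * mediator + 0.2 * treat               # path b and the direct effect
time = rng.weibull(1.5, n) * np.exp(log_scale)
cens = rng.exponential(1.5, n)
df = pd.DataFrame({"treat": treat, "mediator": mediator,
                   "T": np.minimum(time, cens), "E": (time <= cens).astype(int)})

# Path a: mediator regressed on treatment (ordinary least squares)
a = sm.OLS(df["mediator"], sm.add_constant(df["treat"])).fit().params["treat"]

# Path b: the mediator's coefficient in the AFT outcome model (log-time scale)
aft = WeibullAFTFitter().fit(df, duration_col="T", event_col="E")
b = aft.params_.loc[("lambda_", "mediator")]

print(f"indirect (mediated) effect on log survival time, a*b = {a * b:.3f}")
```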

  4. Mediation Analysis with Survival Outcomes: Accelerated Failure Time vs. Proportional Hazards Models

    Science.gov (United States)

    Gelfand, Lois A.; MacKinnon, David P.; DeRubeis, Robert J.; Baraldi, Amanda N.

    2016-01-01

    Objective: Survival time is an important type of outcome variable in treatment research. Currently, limited guidance is available regarding performing mediation analyses with survival outcomes, which generally do not have normally distributed errors, and contain unobserved (censored) events. We present considerations for choosing an approach, using a comparison of semi-parametric proportional hazards (PH) and fully parametric accelerated failure time (AFT) approaches for illustration. Method: We compare PH and AFT models and procedures in their integration into mediation models and review their ability to produce coefficients that estimate causal effects. Using simulation studies modeling Weibull-distributed survival times, we compare statistical properties of mediation analyses incorporating PH and AFT approaches (employing SAS procedures PHREG and LIFEREG, respectively) under varied data conditions, some including censoring. A simulated data set illustrates the findings. Results: AFT models integrate more easily than PH models into mediation models. Furthermore, mediation analyses incorporating LIFEREG produce coefficients that can estimate causal effects, and demonstrate superior statistical properties. Censoring introduces bias in the coefficient estimate representing the treatment effect on outcome—underestimation in LIFEREG, and overestimation in PHREG. With LIFEREG, this bias can be addressed using an alternative estimate obtained from combining other coefficients, whereas this is not possible with PHREG. Conclusions: When Weibull assumptions are not violated, there are compelling advantages to using LIFEREG over PHREG for mediation analyses involving survival-time outcomes. Irrespective of the procedures used, the interpretation of coefficients, effects of censoring on coefficient estimates, and statistical properties should be taken into account when reporting results. PMID:27065906

  5. Mediation Analysis with Survival Outcomes: Accelerated Failure Time Versus Proportional Hazards Models

    Directory of Open Access Journals (Sweden)

    Lois A Gelfand

    2016-03-01

    Full Text Available Objective: Survival time is an important type of outcome variable in treatment research. Currently, limited guidance is available regarding performing mediation analyses with survival outcomes, which generally do not have normally distributed errors, and contain unobserved (censored) events. We present considerations for choosing an approach, using a comparison of semi-parametric proportional hazards (PH) and fully parametric accelerated failure time (AFT) approaches for illustration. Method: We compare PH and AFT models and procedures in their integration into mediation models and review their ability to produce coefficients that estimate causal effects. Using simulation studies modeling Weibull-distributed survival times, we compare statistical properties of mediation analyses incorporating PH and AFT approaches (employing SAS procedures PHREG and LIFEREG, respectively) under varied data conditions, some including censoring. A simulated data set illustrates the findings. Results: AFT models integrate more easily than PH models into mediation models. Furthermore, mediation analyses incorporating LIFEREG produce coefficients that can estimate causal effects, and demonstrate superior statistical properties. Censoring introduces bias in the coefficient estimate representing the treatment effect on outcome – underestimation in LIFEREG, and overestimation in PHREG. With LIFEREG, this bias can be addressed using an alternative estimate obtained from combining other coefficients, whereas this is not possible with PHREG. Conclusions: When Weibull assumptions are not violated, there are compelling advantages to using LIFEREG over PHREG for mediation analyses involving survival-time outcomes. Irrespective of the procedures used, the interpretation of coefficients, effects of censoring on coefficient estimates, and statistical properties should be taken into account when reporting results.

  6. Comparison of exact, Efron and Breslow parameter approach method on hazard ratio and stratified Cox regression model

    Science.gov (United States)

    Fatekurohman, Mohamat; Nurmala, Nita; Anggraeni, Dian

    2018-04-01

    The lungs are among the most important organs of the respiratory system. Disorders of the lungs are various, e.g. pneumonia, emphysema, tuberculosis and lung cancer; of these, lung cancer is the most harmful. With this in mind, this research applies survival analysis to the factors affecting the survival of lung cancer patients, comparing the exact, Efron and Breslow approaches to handling ties in the hazard ratio and the stratified Cox regression model. The data are based on the medical records of lung cancer patients at the Jember Paru-paru hospital in 2016, East Java, Indonesia. The factors potentially affecting the survival of lung cancer patients include sex, age, hemoglobin, leukocytes, erythrocytes, blood sedimentation rate, therapy status, general condition and body weight. The results show that the exact method in the stratified Cox regression model performs better than the others, and that patient survival is affected by age and general condition.
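
    For readers who want to reproduce this kind of comparison, statsmodels' PHReg lets the tie-handling approximation and the strata be chosen explicitly; the sketch below is a stand-in (the exact method is not offered by PHReg, and the bundled Rossi data replace the Jember hospital records).

```python
import statsmodels.api as sm
from lifelines.datasets import load_rossi   # convenient example data, not the paper's records

df = load_rossi()
covariates = df[["age", "prio", "fin"]]

for ties in ("breslow", "efron"):
    # stratified Cox model, stratified on work experience, with the chosen tie-handling method
    model = sm.PHReg(df["week"], covariates, status=df["arrest"],
                     strata=df["wexp"], ties=ties)
    print(f"ties = {ties}")
    print(model.fit().summary())            # coefficients differ slightly between the two methods
```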

  7. High-Dimensional Additive Hazards Regression for Oral Squamous Cell Carcinoma Using Microarray Data: A Comparative Study

    Directory of Open Access Journals (Sweden)

    Omid Hamidi

    2014-01-01

    Full Text Available Microarray technology results in high-dimensional and low-sample-size data sets. Therefore, fitting sparse models is essential, because only a small number of influential genes can reliably be identified. A number of variable selection approaches have been proposed for high-dimensional time-to-event data based on the Cox proportional hazards model where censoring is present. The present study applied three sparse variable selection techniques, the Lasso, smoothly clipped absolute deviation (SCAD), and the smooth integration of counting and absolute deviation (SICA), to gene expression survival time data using the additive risk model, which is adopted when the absolute effects of multiple predictors on the hazard function are of interest. The performances of the techniques were evaluated by time-dependent ROC curves and bootstrap .632+ prediction error curves. The genes selected by all methods were highly significant (P<0.001). The Lasso showed the maximum median area under the ROC curve over time (0.95) and smoothly clipped absolute deviation showed the lowest prediction error (0.105). It was observed that the genes selected by all methods improved the prediction of a purely clinical model, indicating the valuable information contained in the microarray features. It was concluded that the applied approaches can satisfactorily predict survival based on selected gene expression measurements.
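
    The study above fits penalized additive risk models; a commonly used, closely related sketch is Lasso variable selection for high-dimensional survival data under the Cox model via glmnet, shown here on simulated data as an explicitly labelled stand-in (glmnet does not implement the additive risk model or the SCAD/SICA penalties used in the paper).

    ```r
    library(glmnet)
    library(survival)
    set.seed(2)
    n <- 100; p <- 1000                                # low sample size, high dimension
    x <- matrix(rnorm(n * p), n, p)
    lp <- 0.7 * x[, 1] - 0.7 * x[, 2]                  # only the first two "genes" matter
    time <- rexp(n, rate = exp(lp))
    status <- rbinom(n, 1, 0.8)                        # some right censoring

    cvfit <- cv.glmnet(x, Surv(time, status), family = "cox", alpha = 1)  # Lasso penalty
    b <- as.numeric(coef(cvfit, s = "lambda.min"))
    which(b != 0)                                      # indices of the selected predictors
    ```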

  8. Reported Theory Use by Digital Interventions for Hazardous and Harmful Alcohol Consumption, and Association With Effectiveness: Meta-Regression

    Science.gov (United States)

    Crane, David; Brown, Jamie; Kaner, Eileen; Beyer, Fiona; Muirhead, Colin; Hickman, Matthew; Redmore, James; de Vocht, Frank; Beard, Emma; Michie, Susan

    2018-01-01

    Background Applying theory to the design and evaluation of interventions is likely to increase effectiveness and improve the evidence base from which future interventions are developed, though few interventions report this. Objective The aim of this paper was to assess how digital interventions to reduce hazardous and harmful alcohol consumption report the use of theory in their development and evaluation, and whether reporting of theory use is associated with intervention effectiveness. Methods Randomized controlled trials were extracted from a Cochrane review on digital interventions for reducing hazardous and harmful alcohol consumption. Reporting of theory use within these digital interventions was investigated using the theory coding scheme (TCS). Reported theory use was analyzed by frequency counts and descriptive statistics. Associations were analyzed with meta-regression models. Results Of 41 trials involving 42 comparisons, half did not mention theory (50% [21/42]), and only 38% (16/42) used theory to select or develop the intervention techniques. Significant heterogeneity existed between studies in the effect of interventions on alcohol reduction (I²=77.6%); there was no association between reported theory use and intervention effectiveness in unadjusted models, though the meta-regression was underpowered to detect modest associations. Conclusions Digital interventions offer a unique opportunity to refine and develop new dynamic, temporally sensitive theories, yet none of the studies reported refining or developing theory. Clearer selection, application, and reporting of theory use is needed to accurately assess how useful theory is in this field and to advance the field of behavior change theories. PMID:29490895
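
    For readers who want to reproduce this kind of analysis, a minimal meta-regression sketch with the metafor package is shown below; the effect sizes, variances and the binary theory-use moderator are simulated placeholders, not the Cochrane review data.

    ```r
    library(metafor)
    set.seed(3)
    k <- 42                                            # number of comparisons
    dat <- data.frame(yi = rnorm(k, 0.2, 0.15),        # hypothetical standardized mean differences
                      vi = runif(k, 0.01, 0.05),       # their sampling variances
                      theory = rbinom(k, 1, 0.5))      # 1 = theory use reported
    res <- rma(yi, vi, mods = ~ theory, data = dat, method = "REML")
    summary(res)   # I^2 quantifies heterogeneity; the moderator coefficient tests the association
    ```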

  9. Race and Beta-Blocker Survival Benefit in Patients With Heart Failure: An Investigation of Self-Reported Race and Proportion of African Genetic Ancestry.

    Science.gov (United States)

    Luzum, Jasmine A; Peterson, Edward; Li, Jia; She, Ruicong; Gui, Hongsheng; Liu, Bin; Spertus, John A; Pinto, Yigal M; Williams, L Keoki; Sabbah, Hani N; Lanfear, David E

    2018-05-08

    It remains unclear whether beta-blockade is similarly effective in black patients with heart failure and reduced ejection fraction as in white patients, but self-reported race is a complex social construct with both biological and environmental components. The objective of this study was to compare the reduction in mortality associated with beta-blocker exposure in heart failure and reduced ejection fraction patients by both self-reported race and by proportion African genetic ancestry. Insured patients with heart failure and reduced ejection fraction (n=1122) were included in a prospective registry at Henry Ford Health System. This included 575 self-reported blacks (129 deaths, 22%) and 547 self-reported whites (126 deaths, 23%) followed for a median 3.0 years. Beta-blocker exposure (BBexp) was calculated from pharmacy claims, and the proportion of African genetic ancestry was determined from genome-wide array data. Time-dependent Cox proportional hazards regression was used to separately test the association of BBexp with all-cause mortality by self-reported race or by proportion of African genetic ancestry. Both sets of models were evaluated unadjusted and then adjusted for baseline risk factors and beta-blocker propensity score. BBexp effect estimates were protective and of similar magnitude both by self-reported race and by African genetic ancestry (adjusted hazard ratio=0.56 in blacks and adjusted hazard ratio=0.48 in whites). The tests for interactions with BBexp for both self-reported race and for African genetic ancestry were not statistically significant in any model (P > 0.1 for all). Among black and white patients with heart failure and reduced ejection fraction, reduction in all-cause mortality associated with BBexp was similar, regardless of self-reported race or proportion African genetic ancestry. © 2018 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley.

  10. A SAS-macro for estimation of the cumulative incidence using Poisson regression

    DEFF Research Database (Denmark)

    Waltoft, Berit Lindum

    2009-01-01

    the hazard rates, and the hazard rates are often estimated by the Cox regression. This procedure may not be suitable for large studies due to limited computer resources. Instead one uses Poisson regression, which approximates the Cox regression. Rosthøj et al. presented a SAS-macro for the estimation...... of the cumulative incidences based on the Cox regression. I present the functional form of the probabilities and variances when using piecewise constant hazard rates and a SAS-macro for the estimation using Poisson regression. The use of the macro is demonstrated through examples and compared to the macro presented...
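
    The piecewise-constant-hazard idea behind the macro can be sketched in R as well: split follow-up time into intervals, then fit a Poisson model with a log person-time offset and compare its covariate coefficients with those from Cox regression. The veteran data and the cut points below are illustrative stand-ins, not the macro itself.

    ```r
    library(survival)
    data(veteran)
    cuts <- quantile(veteran$time, probs = seq(0.2, 0.8, by = 0.2))
    long <- survSplit(Surv(time, status) ~ trt + karno, data = veteran,
                      cut = cuts, episode = "interval")
    long$pyrs <- long$time - long$tstart               # person-time within each interval
    pois <- glm(status ~ factor(interval) + trt + karno + offset(log(pyrs)),
                family = poisson, data = long)
    cox <- coxph(Surv(time, status) ~ trt + karno, data = veteran)
    cbind(Poisson = coef(pois)[c("trt", "karno")], Cox = coef(cox))
    ```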

  11. Further Results on Dynamic Additive Hazard Rate Model

    Directory of Open Access Journals (Sweden)

    Zhengcheng Zhang

    2014-01-01

    Full Text Available In the past, the proportional and additive hazard rate models have been investigated in the literature. Nanda and Das (2011) introduced and studied the dynamic proportional (reversed) hazard rate model. In this paper we study the dynamic additive hazard rate model and investigate its aging properties for different aging classes. The closure of the model under some stochastic orders has also been investigated. Some examples are also given to illustrate different aging properties and stochastic comparisons of the model.

  12. A Monte Carlo simulation study comparing linear regression, beta regression, variable-dispersion beta regression and fractional logit regression at recovering average difference measures in a two sample design.

    Science.gov (United States)

    Meaney, Christopher; Moineddin, Rahim

    2014-01-24

    In biomedical research, response variables are often encountered which have bounded support on the open unit interval--(0,1). Traditionally, researchers have attempted to estimate covariate effects on these types of response data using linear regression. Alternative modelling strategies may include: beta regression, variable-dispersion beta regression, and fractional logit regression models. This study employs a Monte Carlo simulation design to compare the statistical properties of the linear regression model to that of the more novel beta regression, variable-dispersion beta regression, and fractional logit regression models. In the Monte Carlo experiment we assume a simple two sample design. We assume observations are realizations of independent draws from their respective probability models. The randomly simulated draws from the various probability models are chosen to emulate average proportion/percentage/rate differences of pre-specified magnitudes. Following simulation of the experimental data we estimate average proportion/percentage/rate differences. We compare the estimators in terms of bias, variance, type-1 error and power. Estimates of Monte Carlo error associated with these quantities are provided. If response data are beta distributed with constant dispersion parameters across the two samples, then all models are unbiased and have reasonable type-1 error rates and power profiles. If the response data in the two samples have different dispersion parameters, then the simple beta regression model is biased. When the sample size is small (N0 = N1 = 25) linear regression has superior type-1 error rates compared to the other models. Small sample type-1 error rates can be improved in beta regression models using bias correction/reduction methods. In the power experiments, variable-dispersion beta regression and fractional logit regression models have slightly elevated power compared to linear regression models. Similar results were observed if the
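
    A minimal two-sample sketch of the four modelling strategies compared in this simulation study, using R's betareg package and a quasi-binomial GLM for the fractional logit; all data below are simulated and do not reproduce the study's scenarios.

    ```r
    library(betareg)
    set.seed(4)
    n <- 25                                             # small-sample scenario (N0 = N1 = 25)
    grp <- rep(0:1, each = n)
    y <- rbeta(2 * n, shape1 = 4 + 2 * grp, shape2 = 4) # proportions bounded in (0,1)
    d <- data.frame(y = y, g = factor(grp))

    fit_lm   <- lm(y ~ g, data = d)                                  # linear regression
    fit_beta <- betareg(y ~ g, data = d)                             # beta regression
    fit_vdb  <- betareg(y ~ g | g, data = d)                         # variable-dispersion beta
    fit_frac <- glm(y ~ g, family = quasibinomial(link = "logit"), data = d)  # fractional logit
    ```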

  13. Enzyme replacement therapy for Anderson-Fabry disease: A complementary overview of a Cochrane publication through a linear regression and a pooled analysis of proportions from cohort studies.

    Science.gov (United States)

    El Dib, Regina; Gomaa, Huda; Ortiz, Alberto; Politei, Juan; Kapoor, Anil; Barreto, Fellype

    2017-01-01

    Anderson-Fabry disease (AFD) is an X-linked recessive inborn error of glycosphingolipid metabolism caused by a deficiency of alpha-galactosidase A. Renal failure, heart and cerebrovascular involvement reduce survival. A Cochrane review provided little evidence on the use of enzyme replacement therapy (ERT). We now complement this review through a linear regression and a pooled analysis of proportions from cohort studies. To evaluate the efficacy and safety of ERT for AFD. For the systematic review, a literature search was performed, from inception to March 2016, using Medline, EMBASE and LILACS. Inclusion criteria were cohort studies, patients with AFD on ERT or natural history, and at least one patient-important outcome (all-cause mortality, renal, cardiovascular or cerebrovascular events, and adverse events) reported. The pooled proportion and the confidence interval (CI) are shown for each outcome. Simple linear regressions for composite endpoints were performed. 77 cohort studies involving 15,305 participants proved eligible. The pooled proportions were as follows: (a) for renal complications, agalsidase alfa 15.3% [95% CI 0.048, 0.303; I2 = 77.2%, p = 0.0005]; agalsidase beta 6% [95% CI 0.04, 0.07; I2 = not applicable]; and untreated patients 21.4% [95% CI 0.1522, 0.2835; I2 = 89.6%]. Simple linear regression showed that Fabry patients receiving agalsidase alfa are more likely to have higher rates of composite endpoints compared to those receiving agalsidase beta. Agalsidase beta is associated with a significantly lower incidence of renal, cardiovascular and cerebrovascular events than no ERT, and with a significantly lower incidence of cerebrovascular events than agalsidase alfa. In view of these results, the use of agalsidase beta for preventing major organ complications related to AFD can be recommended.

  14. riskRegression

    DEFF Research Database (Denmark)

    Ozenne, Brice; Sørensen, Anne Lyngholm; Scheike, Thomas

    2017-01-01

    In the presence of competing risks a prediction of the time-dynamic absolute risk of an event can be based on cause-specific Cox regression models for the event and the competing risks (Benichou and Gail, 1990). We present computationally fast and memory optimized C++ functions with an R interface...... for predicting the covariate specific absolute risks, their confidence intervals, and their confidence bands based on right censored time to event data. We provide explicit formulas for our implementation of the estimator of the (stratified) baseline hazard function in the presence of tied event times. As a by...... functionals. The software presented here is implemented in the riskRegression package....
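
    A short usage sketch of the package described in this record, based on its documented interface (cause-specific Cox models via CSC() and covariate-specific absolute risks via predict()); the data come from the package's own sampleData() simulator, and the chosen covariates and time points are arbitrary.

    ```r
    library(riskRegression)
    library(prodlim)                                  # provides Hist()
    set.seed(5)
    d <- sampleData(400, outcome = "competing.risks") # simulated competing risks data
    csc <- CSC(Hist(time, event) ~ X1 + X7 + X9, data = d)   # one Cox model per cause
    p <- predict(csc, newdata = d[1:5, ], cause = 1,
                 times = c(2, 5), se = TRUE, band = TRUE)
    p$absRisk                                         # absolute risk of cause 1 at the requested times
    ```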

  15. Modelos de regresión para variables expresadas como una proporción continua Regression models for variables expressed as a continuous proportion

    Directory of Open Access Journals (Sweden)

    Aarón Salinas-Rodríguez

    2006-10-01

    the Public Health field. MATERIAL AND METHODS: From the National Reproductive Health Survey performed in 2003, the proportion of individual coverage in the family planning program (proposed in one study carried out in the National Institute of Public Health in Cuernavaca, Morelos, Mexico, 2005) was modeled using the Normal, Gamma, Beta and quasi-likelihood regression models. The Akaike Information Criterion (AIC) proposed by McQuarrie and Tsai was used to define the best model. Then, using a simulation (Monte Carlo/Markov Chains) approach, a variable with a Beta distribution was generated to evaluate the behavior of the 4 models while varying the sample size from 100 to 18 000 observations. RESULTS: Results showed that the best statistical option for the analysis of continuous proportions was the Beta regression model, since its assumptions are easily accomplished and because it had the lowest AIC value. Simulation evidenced that while the sample size increases the Gamma, and even more so the quasi-likelihood, models come significantly close to the Beta regression model. CONCLUSIONS: The use of parametric Beta regression is highly recommended to model continuous proportions and the normal model should be avoided. If the sample size is large enough, the use of the quasi-likelihood model represents a good alternative.

  16. The median hazard ratio: a useful measure of variance and general contextual effects in multilevel survival analysis.

    Science.gov (United States)

    Austin, Peter C; Wagner, Philippe; Merlo, Juan

    2017-03-15

    Multilevel data occur frequently in many research areas like health services research and epidemiology. A suitable way to analyze such data is through the use of multilevel regression models (MLRM). MLRM incorporate cluster-specific random effects which allow one to partition the total individual variance into between-cluster variation and between-individual variation. Statistically, MLRM account for the dependency of the data within clusters and provide correct estimates of uncertainty around regression coefficients. Substantively, the magnitude of the effect of clustering provides a measure of the General Contextual Effect (GCE). When outcomes are binary, the GCE can also be quantified by measures of heterogeneity like the Median Odds Ratio (MOR) calculated from a multilevel logistic regression model. Time-to-event outcomes within a multilevel structure occur commonly in epidemiological and medical research. However, the Median Hazard Ratio (MHR) that corresponds to the MOR in multilevel (i.e., 'frailty') Cox proportional hazards regression is rarely used. Analogously to the MOR, the MHR is the median relative change in the hazard of the occurrence of the outcome when comparing identical subjects from two randomly selected different clusters that are ordered by risk. We illustrate the application and interpretation of the MHR in a case study analyzing the hazard of mortality in patients hospitalized for acute myocardial infarction at hospitals in Ontario, Canada. We provide R code for computing the MHR. The MHR is a useful and intuitive measure for expressing cluster heterogeneity in the outcome and, thereby, estimating general contextual effects in multilevel survival analysis. © 2016 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd.
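
    Analogously to the median odds ratio, the MHR is obtained from the cluster-level variance on the log-hazard scale; a minimal sketch is given below. The frailty variance value and the commented coxme call (with a hypothetical hospital clustering variable) are illustrative, not taken from the case study.

    ```r
    # MHR = exp(sqrt(2 * sigma^2) * qnorm(0.75)), with sigma^2 the between-cluster
    # variance of a log-normal frailty in a multilevel Cox model.
    mhr <- function(cluster_var) exp(sqrt(2 * cluster_var) * qnorm(0.75))
    mhr(0.25)   # a cluster variance of 0.25 corresponds to an MHR of about 1.61

    # library(coxme)                                                      # one way to obtain the variance
    # fit <- coxme(Surv(time, status) ~ age + (1 | hospital), data = d)   # hypothetical data and variables
    # mhr(as.numeric(VarCorr(fit)$hospital))
    ```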

  17. Reported Theory Use by Digital Interventions for Hazardous and Harmful Alcohol Consumption, and Association With Effectiveness: Meta-Regression.

    Science.gov (United States)

    Garnett, Claire; Crane, David; Brown, Jamie; Kaner, Eileen; Beyer, Fiona; Muirhead, Colin; Hickman, Matthew; Redmore, James; de Vocht, Frank; Beard, Emma; Michie, Susan

    2018-02-28

    Applying theory to the design and evaluation of interventions is likely to increase effectiveness and improve the evidence base from which future interventions are developed, though few interventions report this. The aim of this paper was to assess how digital interventions to reduce hazardous and harmful alcohol consumption report the use of theory in their development and evaluation, and whether reporting of theory use is associated with intervention effectiveness. Randomized controlled trials were extracted from a Cochrane review on digital interventions for reducing hazardous and harmful alcohol consumption. Reporting of theory use within these digital interventions was investigated using the theory coding scheme (TCS). Reported theory use was analyzed by frequency counts and descriptive statistics. Associations were analyzed with meta-regression models. Of 41 trials involving 42 comparisons, half did not mention theory (50% [21/42]), and only 38% (16/42) used theory to select or develop the intervention techniques. Significant heterogeneity existed between studies in the effect of interventions on alcohol reduction (I²=77.6%); there was no association between reported theory use and intervention effectiveness in unadjusted models, though the meta-regression was underpowered to detect modest associations. Digital interventions offer a unique opportunity to refine and develop new dynamic, temporally sensitive theories, yet none of the studies reported refining or developing theory. Clearer selection, application, and reporting of theory use is needed to accurately assess how useful theory is in this field and to advance the field of behavior change theories. ©Claire Garnett, David Crane, Jamie Brown, Eileen Kaner, Fiona Beyer, Colin Muirhead, Matthew Hickman, James Redmore, Frank de Vocht, Emma Beard, Susan Michie. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 28.02.2018.

  18. Complete hazard ranking to analyze right-censored data: An ALS survival study.

    Science.gov (United States)

    Huang, Zhengnan; Zhang, Hongjiu; Boss, Jonathan; Goutman, Stephen A; Mukherjee, Bhramar; Dinov, Ivo D; Guan, Yuanfang

    2017-12-01

    Survival analysis represents an important outcome measure in clinical research and clinical trials; further, survival ranking may offer additional advantages in clinical trials. In this study, we developed GuanRank, a non-parametric ranking-based technique to transform patients' survival data into a linear space of hazard ranks. The transformation enables the utilization of machine learning base-learners including Gaussian process regression, Lasso, and random forest on survival data. The method was submitted to the DREAM Amyotrophic Lateral Sclerosis (ALS) Stratification Challenge. Ranked first place, the model gave more accurate ranking predictions on the PRO-ACT ALS dataset in comparison to the Cox proportional hazard model. By utilizing right-censored data in its training process, the method demonstrated its state-of-the-art predictive power in ALS survival ranking. Its feature selection identified multiple important factors, some of which conflict with previous studies.

  19. Replica analysis of overfitting in regression models for time-to-event data

    Science.gov (United States)

    Coolen, A. C. C.; Barrett, J. E.; Paga, P.; Perez-Vicente, C. J.

    2017-09-01

    Overfitting, which happens when the number of parameters in a model is too large compared to the number of data points available for determining these parameters, is a serious and growing problem in survival analysis. While modern medicine presents us with data of unprecedented dimensionality, these data cannot yet be used effectively for clinical outcome prediction. Standard error measures in maximum likelihood regression, such as p-values and z-scores, are blind to overfitting, and even for Cox’s proportional hazards model (the main tool of medical statisticians), one finds in literature only rules of thumb on the number of samples required to avoid overfitting. In this paper we present a mathematical theory of overfitting in regression models for time-to-event data, which aims to increase our quantitative understanding of the problem and provide practical tools with which to correct regression outcomes for the impact of overfitting. It is based on the replica method, a statistical mechanical technique for the analysis of heterogeneous many-variable systems that has been used successfully for several decades in physics, biology, and computer science, but not yet in medical statistics. We develop the theory initially for arbitrary regression models for time-to-event data, and verify its predictions in detail for the popular Cox model.

  20. Support vector methods for survival analysis: a comparison between ranking and regression approaches.

    Science.gov (United States)

    Van Belle, Vanya; Pelckmans, Kristiaan; Van Huffel, Sabine; Suykens, Johan A K

    2011-10-01

    To compare and evaluate ranking, regression and combined machine learning approaches for the analysis of survival data. The literature describes two approaches based on support vector machines to deal with censored observations. In the first approach the key idea is to rephrase the task as a ranking problem via the concordance index, a problem which can be solved efficiently in a context of structural risk minimization and convex optimization techniques. In the second approach, one uses a regression approach, dealing with censoring by means of inequality constraints. The goal of this paper is then twofold: (i) introducing a new model combining the ranking and regression strategy, which retains the link with existing survival models such as the proportional hazards model via transformation models; and (ii) comparison of the three techniques on 6 clinical and 3 high-dimensional datasets and discussing the relevance of these techniques over classical approaches for survival data. We compare svm-based survival models based on ranking constraints, based on regression constraints and models based on both ranking and regression constraints. The performance of the models is compared by means of three different measures: (i) the concordance index, measuring the model's discriminating ability; (ii) the logrank test statistic, indicating whether patients with a prognostic index lower than the median prognostic index have a significantly different survival than patients with a prognostic index higher than the median; and (iii) the hazard ratio after normalization to restrict the prognostic index between 0 and 1. Our results indicate a significantly better performance for models including regression constraints over models based only on ranking constraints. This work gives empirical evidence that svm-based models using regression constraints perform significantly better than svm-based models based on ranking constraints. Our experiments show a comparable performance for methods

  1. Checking Fine and Gray subdistribution hazards model with cumulative sums of residuals

    DEFF Research Database (Denmark)

    Li, Jianing; Scheike, Thomas; Zhang, Mei Jie

    2015-01-01

    Recently, Fine and Gray (J Am Stat Assoc 94:496–509, 1999) proposed a semi-parametric proportional regression model for the subdistribution hazard function which has been used extensively for analyzing competing risks data. However, failure of model adequacy could lead to severe bias in parameter...... estimation, and only a limited contribution has been made to check the model assumptions. In this paper, we present a class of analytical methods and graphical approaches for checking the assumptions of Fine and Gray’s model. The proposed goodness-of-fit test procedures are based on the cumulative sums...
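
    As context for the model being checked, a minimal Fine-Gray fit in R with the cmprsk package is sketched below on simulated data; the cumulative-residual goodness-of-fit procedures proposed in the record are not part of cmprsk and are omitted here.

    ```r
    library(cmprsk)
    set.seed(6)
    n <- 200
    x <- cbind(age = rnorm(n), trt = rbinom(n, 1, 0.5))
    ftime <- rexp(n)
    fstatus <- sample(0:2, n, replace = TRUE)   # 0 = censored, 1 = cause of interest, 2 = competing cause
    fg <- crr(ftime, fstatus, cov1 = x, failcode = 1, cencode = 0)
    summary(fg)                                 # subdistribution hazard ratios
    ```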

  2. Methodologies for the assessment of earthquake-triggered landslides hazard. A comparison of Logistic Regression and Artificial Neural Network models.

    Science.gov (United States)

    García-Rodríguez, M. J.; Malpica, J. A.; Benito, B.

    2009-04-01

    In recent years, interest in landslide hazard assessment studies has increased substantially. They are appropriate for evaluation and mitigation plan development in landslide-prone areas. There are several techniques available for landslide hazard research at a regional scale. Generally, they can be classified into two groups: qualitative and quantitative methods. Most qualitative methods tend to be subjective, since they depend on expert opinions and represent hazard levels in descriptive terms. On the other hand, quantitative methods are objective and are commonly used due to the correlation between the instability factors and the location of the landslides. Within this group, statistical approaches and new heuristic techniques based on artificial intelligence (artificial neural networks (ANN), fuzzy logic, etc.) provide rigorous analysis for assessing landslide hazard over large regions. However, they depend on the qualitative and quantitative data, scale, types of movements and characteristic factors used. We analysed and compared an approach for assessing earthquake-triggered landslide hazard using logistic regression (LR) and artificial neural networks (ANN) with a back-propagation learning algorithm. One application was developed for El Salvador, a country of Central America where earthquake-triggered landslides are common phenomena. In a first phase, we analysed the susceptibility and hazard associated with the seismic scenario of the 2001 January 13th earthquake. We calibrated the models using data from the landslide inventory for this scenario. These analyses require input variables representing physical parameters that contribute to the initiation of slope instability, for example, slope gradient, elevation, aspect, mean annual precipitation, lithology, land use, and terrain roughness, while the occurrence or non-occurrence of landslides is considered as the dependent variable. The results of the landslide susceptibility analysis are checked using landslide
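
    A toy sketch of the two model families compared above, fitting a logistic regression and a single-hidden-layer back-propagation network (nnet) to simulated slope-instability data; all variables, coefficients and network settings are invented for illustration.

    ```r
    library(nnet)
    set.seed(7)
    n <- 1000
    d <- data.frame(slope = runif(n, 0, 45),               # slope gradient (degrees)
                    rough = rnorm(n),                      # terrain roughness index
                    lith  = factor(sample(1:3, n, TRUE)))  # lithology class
    d$slide <- rbinom(n, 1, plogis(-3 + 0.08 * d$slope + 0.5 * d$rough))

    lr  <- glm(slide ~ slope + rough + lith, family = binomial, data = d)      # logistic regression
    ann <- nnet(factor(slide) ~ slope + rough + lith, data = d, size = 4,
                decay = 0.01, maxit = 500, trace = FALSE)                      # back-propagation ANN
    ```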

  3. A study of the slope of Cox proportional hazard and Weibull models

    African Journals Online (AJOL)

    Adejumo & Ahmadu

    known and the hazard function is completely specified except for the values of the ... through the air when people who have an active TB infection cough, sneeze ... The increase of TB incidence is highest in Africa and Asia, areas with the highest ... further complicating treatment by increasing the length and cost of therapy.

  4. Regression in autistic spectrum disorders.

    Science.gov (United States)

    Stefanatos, Gerry A

    2008-12-01

    A significant proportion of children diagnosed with Autistic Spectrum Disorder experience a developmental regression characterized by a loss of previously acquired skills. This may involve a loss of speech or social responsiveness, but often entails both. This paper critically reviews the phenomenon of regression in autistic spectrum disorders, highlighting the characteristics of regression, age of onset, temporal course, and long-term outcome. Important considerations for diagnosis are discussed and multiple etiological factors currently hypothesized to underlie the phenomenon are reviewed. It is argued that regressive autistic spectrum disorders can be conceptualized on a spectrum with other regressive disorders that may share common pathophysiological features. The implications of this viewpoint are discussed.

  5. Complete hazard ranking to analyze right-censored data: An ALS survival study.

    Directory of Open Access Journals (Sweden)

    Zhengnan Huang

    2017-12-01

    Full Text Available Survival analysis represents an important outcome measure in clinical research and clinical trials; further, survival ranking may offer additional advantages in clinical trials. In this study, we developed GuanRank, a non-parametric ranking-based technique to transform patients' survival data into a linear space of hazard ranks. The transformation enables the utilization of machine learning base-learners including Gaussian process regression, Lasso, and random forest on survival data. The method was submitted to the DREAM Amyotrophic Lateral Sclerosis (ALS) Stratification Challenge. Ranked first place, the model gave more accurate ranking predictions on the PRO-ACT ALS dataset in comparison to the Cox proportional hazard model. By utilizing right-censored data in its training process, the method demonstrated its state-of-the-art predictive power in ALS survival ranking. Its feature selection identified multiple important factors, some of which conflict with previous studies.

  6. Comparison of linear, skewed-linear, and proportional hazard models for the analysis of lambing interval in Ripollesa ewes.

    Science.gov (United States)

    Casellas, J; Bach, R

    2012-06-01

    Lambing interval is a relevant reproductive indicator for sheep populations under continuous mating systems, although there is a shortage of selection programs accounting for this trait in the sheep industry. Both the historical assumption of small genetic background and its unorthodox distribution pattern have limited its implementation as a breeding objective. In this manuscript, statistical performances of 3 alternative parametrizations [i.e., symmetric Gaussian mixed linear (GML) model, skew-Gaussian mixed linear (SGML) model, and piecewise Weibull proportional hazard (PWPH) model] have been compared to elucidate the preferred methodology to handle lambing interval data. More specifically, flock-by-flock analyses were performed on 31,986 lambing interval records (257.3 ± 0.2 d) from 6 purebred Ripollesa flocks. Model performances were compared in terms of deviance information criterion (DIC) and Bayes factor (BF). For all flocks, PWPH models were clearly preferred; they generated a reduction of 1,900 or more DIC units and provided BF estimates larger than 100 (i.e., PWPH models against linear models). These differences were reduced when comparing PWPH models with different number of change points for the baseline hazard function. In 4 flocks, only 2 change points were required to minimize the DIC, whereas 4 and 6 change points were needed for the 2 remaining flocks. These differences demonstrated a remarkable degree of heterogeneity across sheep flocks that must be properly accounted for in genetic evaluation models to avoid statistical biases and suboptimal genetic trends. Within this context, all 6 Ripollesa flocks revealed substantial genetic background for lambing interval with heritabilities ranging between 0.13 and 0.19. This study provides the first evidence of the suitability of PWPH models for lambing interval analysis, clearly discarding previous parametrizations focused on mixed linear models.

  7. Converging or Crossing Curves: Untie the Gordian Knot or Cut it? Appropriate Statistics for Non-Proportional Hazards in Decitabine DACO-016 Study (AML).

    Science.gov (United States)

    Tomeczkowski, Jörg; Lange, Ansgar; Güntert, Andreas; Thilakarathne, Pushpike; Diels, Joris; Xiu, Liang; De Porre, Peter; Tapprich, Christoph

    2015-09-01

    Among patients with acute myeloid leukemia (AML), the DACO-016 randomized study showed reduction in mortality for decitabine [Dacogen(®) (DAC), Eisai Inc., Woodcliff Lake, NJ, USA] compared with treatment choice (TC): at primary analysis the hazard ratio (HR) was 0.85 (95% confidence interval 0.69-1.04; stratified log-rank P = 0.108). With two interim analyses, two-sided alpha was adjusted to 0.0462. With 1-year additional follow-up the HR reached 0.82 (nominal P = 0.0373). These data resulted in approval of DAC in the European Union, though not in the United States. Though pre-specified, the log-rank test could be considered not optimal to assess the observed survival difference because of the non-proportional hazard nature of the survival curves. We applied the Wilcoxon test as a sensitivity analysis. Patients were randomized to DAC (N = 242) or TC (N = 243). One-hundred and eight (44.4%) patients in the TC arm and 91 (37.6%) patients in the DAC arm selectively crossed over to subsequent disease modifying therapies at progression, which might impact the survival beyond the median with resultant converging curves (and disproportional hazards). The stratified Wilcoxon test showed a significant improvement in median (CI 95%) overall survival with DAC [7.7 (6.2; 9.2) months] versus TC [5.0 (4.3; 6.3) months; P = 0.0458]. Wilcoxon test indicated significant increase in survival for DAC versus TC compared to log-rank test. Janssen-Cilag GmbH.
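
    In R, the contrast between the log-rank test and a Wilcoxon-type test that gives more weight to early differences (and is therefore less affected by late converging or crossing curves) can be sketched with survdiff's rho argument. The lung data set is only a stand-in for the trial data, and survdiff's rho = 1 test is the Peto-Peto modification rather than the exact stratified Gehan-Wilcoxon test used in the record.

    ```r
    library(survival)
    data(lung)
    survdiff(Surv(time, status) ~ sex, data = lung, rho = 0)  # log-rank (equal weights over time)
    survdiff(Surv(time, status) ~ sex, data = lung, rho = 1)  # Peto-Peto / Wilcoxon-type (early weights)
    ```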

  8. Confidence intervals for the first crossing point of two hazard functions.

    Science.gov (United States)

    Cheng, Ming-Yen; Qiu, Peihua; Tan, Xianming; Tu, Dongsheng

    2009-12-01

    The phenomenon of crossing hazard rates is common in clinical trials with time to event endpoints. Many methods have been proposed for testing equality of hazard functions against a crossing hazards alternative. However, there has been relatively few approaches available in the literature for point or interval estimation of the crossing time point. The problem of constructing confidence intervals for the first crossing time point of two hazard functions is considered in this paper. After reviewing a recent procedure based on Cox proportional hazard modeling with Box-Cox transformation of the time to event, a nonparametric procedure using the kernel smoothing estimate of the hazard ratio is proposed. The proposed procedure and the one based on Cox proportional hazard modeling with Box-Cox transformation of the time to event are both evaluated by Monte-Carlo simulations and applied to two clinical trial datasets.
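
    A rough non-parametric sketch in the spirit of the proposed procedure: estimate the two group-specific hazard functions with kernel smoothing (muhaz) and inspect where they first cross. The lung data are a stand-in, and a confidence interval for the crossing time would require the additional machinery described in the paper.

    ```r
    library(muhaz)
    library(survival)
    data(lung)
    h1 <- with(subset(lung, sex == 1), muhaz(time, 1 * (status == 2)))
    h2 <- with(subset(lung, sex == 2), muhaz(time, 1 * (status == 2)))
    plot(h1)                                   # smoothed hazard, group 1
    lines(h2$est.grid, h2$haz.est, lty = 2)    # smoothed hazard, group 2; look for the first crossing
    ```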

  9. A comparison of random forest regression and multiple linear regression for prediction in neuroscience.

    Science.gov (United States)

    Smith, Paul F; Ganesh, Siva; Liu, Ping

    2013-10-30

    Regression is a common statistical tool for prediction in neuroscience. However, linear regression is by far the most common form of regression used, with regression trees receiving comparatively little attention. In this study, the results of conventional multiple linear regression (MLR) were compared with those of random forest regression (RFR), in the prediction of the concentrations of 9 neurochemicals in the vestibular nucleus complex and cerebellum that are part of the l-arginine biochemical pathway (agmatine, putrescine, spermidine, spermine, l-arginine, l-ornithine, l-citrulline, glutamate and γ-aminobutyric acid (GABA)). The R(2) values for the MLRs were higher than the proportion of variance explained values for the RFRs: 6/9 of them were ≥ 0.70 compared to 4/9 for RFRs. Even the variables that had the lowest R(2) values for the MLRs, e.g. ornithine (0.50) and glutamate (0.61), had much lower proportion of variance explained values for the RFRs (0.27 and 0.49, respectively). The RSE values for the MLRs were lower than those for the RFRs in all but two cases. In general, MLRs seemed to be superior to the RFRs in terms of predictive value and error. In the case of this data set, MLR appeared to be superior to RFR in terms of its explanatory value and error. This result suggests that MLR may have advantages over RFR for prediction in neuroscience with this kind of data set, but that RFR can still have good predictive value in some cases. Copyright © 2013 Elsevier B.V. All rights reserved.
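
    A small sketch of the head-to-head comparison described above, on simulated data; the response merely stands in for a neurochemical concentration and nothing here reproduces the study's measurements.

    ```r
    library(randomForest)
    set.seed(8)
    n <- 120
    d <- data.frame(x1 = rnorm(n), x2 = rnorm(n), x3 = rnorm(n))
    d$y <- 1 + 0.8 * d$x1 - 0.5 * d$x2 + rnorm(n, sd = 0.5)

    mlr <- lm(y ~ ., data = d)                         # multiple linear regression
    rfr <- randomForest(y ~ ., data = d, ntree = 500)  # random forest regression
    summary(mlr)$r.squared                             # R^2 for the MLR
    tail(rfr$rsq, 1)                                   # out-of-bag % variance explained for the RFR
    ```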

  10. Early regression of severe left ventricular hypertrophy after transcatheter aortic valve replacement is associated with decreased hospitalizations.

    Science.gov (United States)

    Lindman, Brian R; Stewart, William J; Pibarot, Philippe; Hahn, Rebecca T; Otto, Catherine M; Xu, Ke; Devereux, Richard B; Weissman, Neil J; Enriquez-Sarano, Maurice; Szeto, Wilson Y; Makkar, Raj; Miller, D Craig; Lerakis, Stamatios; Kapadia, Samir; Bowers, Bruce; Greason, Kevin L; McAndrew, Thomas C; Lei, Yang; Leon, Martin B; Douglas, Pamela S

    2014-06-01

    This study sought to examine the relationship between left ventricular mass (LVM) regression and clinical outcomes after transcatheter aortic valve replacement (TAVR). LVM regression after valve replacement for aortic stenosis is assumed to be a favorable effect of LV unloading, but its relationship to improved clinical outcomes is unclear. Of 2,115 patients with symptomatic aortic stenosis at high surgical risk receiving TAVR in the PARTNER (Placement of Aortic Transcatheter Valves) randomized trial or continued access registry, 690 had both severe LV hypertrophy (left ventricular mass index [LVMi] ≥ 149 g/m(2) men, ≥ 122 g/m(2) women) at baseline and an LVMi measurement at 30-day post-TAVR follow-up. Clinical outcomes were compared for patients with greater than versus lesser than median percentage change in LVMi between baseline and 30 days using Cox proportional hazard models to evaluate event rates from 30 to 365 days. Compared with patients with lesser regression, patients with greater LVMi regression had a similar rate of all-cause mortality (14.1% vs. 14.3%, p = 0.99), but a lower rate of rehospitalization (9.5% vs. 18.5%, hazard ratio [HR]: 0.50, 95% confidence interval [CI]: 0.32 to 0.78; p = 0.002) and a lower rate of rehospitalization specifically for heart failure (7.3% vs. 13.6%, p = 0.01). The association with a lower rate of rehospitalization was consistent across subgroups and remained significant after multivariable adjustment (HR: 0.53, 95% CI: 0.34 to 0.84; p = 0.007). Patients with greater LVMi regression had lower B-type natriuretic peptide (p = 0.002) and a trend toward better quality of life (p = 0.06) at 1-year follow-up than did those with lesser regression. In high-risk patients with severe aortic stenosis and severe LV hypertrophy undergoing TAVR, those with greater early LVM regression had one-half the rate of rehospitalization over the subsequent year compared to those with lesser regression. Copyright © 2014 American College of

  11. Assessing End-Of-Supply Risk of Spare Parts Using the Proportional Hazard Model

    NARCIS (Netherlands)

    X. Li (Xishu); R. Dekker (Rommert); C. Heij (Christiaan); M. Hekimoğlu (Mustafa)

    2016-01-01

    textabstractOperators of long field-life systems like airplanes are faced with hazards in the supply of spare parts. If the original manufacturers or suppliers of parts end their supply, this may have large impacts on operating costs of firms needing these parts. Existing end-of-supply evaluation

  12. Coupled variable selection for regression modeling of complex treatment patterns in a clinical cancer registry.

    Science.gov (United States)

    Schmidtmann, I; Elsäßer, A; Weinmann, A; Binder, H

    2014-12-30

    For determining a manageable set of covariates potentially influential with respect to a time-to-event endpoint, Cox proportional hazards models can be combined with variable selection techniques, such as stepwise forward selection or backward elimination based on p-values, or regularized regression techniques such as component-wise boosting. Cox regression models have also been adapted for dealing with more complex event patterns, for example, for competing risks settings with separate, cause-specific hazard models for each event type, or for determining the prognostic effect pattern of a variable over different landmark times, with one conditional survival model for each landmark. Motivated by a clinical cancer registry application, where complex event patterns have to be dealt with and variable selection is needed at the same time, we propose a general approach for linking variable selection between several Cox models. Specifically, we combine score statistics for each covariate across models by Fisher's method as a basis for variable selection. This principle is implemented for a stepwise forward selection approach as well as for a regularized regression technique. In an application to data from hepatocellular carcinoma patients, the coupled stepwise approach is seen to facilitate joint interpretation of the different cause-specific Cox models. In conditional survival models at landmark times, which address updates of prediction as time progresses and both treatment and other potential explanatory variables may change, the coupled regularized regression approach identifies potentially important, stably selected covariates together with their effect time pattern, despite having only a small number of events. These results highlight the promise of the proposed approach for coupling variable selection between Cox models, which is particularly relevant for modeling for clinical cancer registries with their complex event patterns. Copyright © 2014 John Wiley & Sons
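
    The coupling idea rests on combining per-covariate evidence across several Cox models with Fisher's method; a minimal sketch of that combination step for p-values is given below. It assumes the per-model p-values are already available and, unlike the score statistics used in the paper, treats them as independent.

    ```r
    # Fisher's method: X^2 = -2 * sum(log(p_k)) ~ chi-square with 2K degrees of freedom
    # under the global null hypothesis that the covariate has no effect in any model.
    fisher_combine <- function(p) {
      stat <- -2 * sum(log(p))
      pchisq(stat, df = 2 * length(p), lower.tail = FALSE)
    }
    fisher_combine(c(0.04, 0.20, 0.11))   # combined p-value for one covariate across three models
    ```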

  13. riskRegression

    DEFF Research Database (Denmark)

    Ozenne, Brice; Sørensen, Anne Lyngholm; Scheike, Thomas

    2017-01-01

    In the presence of competing risks a prediction of the time-dynamic absolute risk of an event can be based on cause-specific Cox regression models for the event and the competing risks (Benichou and Gail, 1990). We present computationally fast and memory optimized C++ functions with an R interface......-product we obtain fast access to the baseline hazards (compared to survival::basehaz()) and predictions of survival probabilities, their confidence intervals and confidence bands. Confidence intervals and confidence bands are based on point-wise asymptotic expansions of the corresponding statistical...

  14. The influence of craniofacial to standing height proportion on perceived attractiveness.

    Science.gov (United States)

    Naini, F B; Cobourne, M T; McDonald, F; Donaldson, A N A

    2008-10-01

    An idealised male image, based on Vitruvian Man, was created. The craniofacial height was altered from a proportion of 1/6 to 1/10 of standing height, creating 10 images shown in random order to 89 observers (74 lay people; 15 clinicians), who ranked the images from the most to the least attractive. The main outcome was the preference ranks of image attractiveness given by the observers. Linear regressions were used to assess what influences the choice for the most and the least attractive images, followed by a multivariate rank ordinal logistic regression to test the influence of age, gender, ethnicity and professional status of the observer. A craniofacial height to standing height proportion of 1/7.5 was perceived as the most attractive (36%), followed by a proportion of 1/8 (26%). The images chosen as most attractive by more than 10% of observers had a mean proportion of 1/7.8 (min = 1/7; max = 1/8.5). The images perceived as most unattractive had a proportion of 1/6 and 1/10. The choice of images was not influenced by the age, gender, ethnicity or professional status of the observers. The ideal craniofacial height to standing height proportion is in the range 1/7 to 1/8.5. This finding should be considered when planning treatment to alter craniofacial or facial height.

  15. Predictors of time to relapse in amphetamine-type substance users in the matrix treatment program in Iran: a Cox proportional hazard model application.

    Science.gov (United States)

    Moeeni, Maryam; Razaghi, Emran M; Ponnet, Koen; Torabi, Fatemeh; Shafiee, Seyed Ali; Pashaei, Tahereh

    2016-07-26

    The aim of this study was to determine which predictors influence the risk of relapse among a cohort of amphetamine-type substance (ATS) users in Iran. A Cox proportional hazards model was conducted to determine factors associated with the relapse time in the Matrix treatment program provided by the Iranian National Center of Addiction Studies (INCAS) between March 2010 and October 2011. Participating in more treatment sessions was associated with a lower probability of relapse. On the other hand, patients with less family support, longer dependence on ATS, and those with an experience of casual sex and a history of criminal offenses were more likely to relapse. This study broadens our understanding of factors influencing the risk of relapse in ATS use among an Iranian sample. The findings can guide practitioners during the treatment program.

  16. Model of predicting proportion of diesel fuel and engine oil in diesel ...

    African Journals Online (AJOL)

    Viscosity of diesel adulterated SAE 40 engine oil at varying proportions of the mixture is presented. Regression, variation of intercept and the power parameters methods are used for developing polynomial and power law functions for predicting proportion of either diesel or engine oil in diesel adulterated SAE 40 engine oil ...

  17. The arcsine is asinine: the analysis of proportions in ecology.

    Science.gov (United States)

    Warton, David I; Hui, Francis K C

    2011-01-01

    The arcsine square root transformation has long been standard procedure when analyzing proportional data in ecology, with applications in data sets containing binomial and non-binomial response variables. Here, we argue that the arcsine transform should not be used in either circumstance. For binomial data, logistic regression has greater interpretability and higher power than analyses of transformed data. However, it is important to check the data for additional unexplained variation, i.e., overdispersion, and to account for it via the inclusion of random effects in the model if found. For non-binomial data, the arcsine transform is undesirable on the grounds of interpretability, and because it can produce nonsensical predictions. The logit transformation is proposed as an alternative approach to address these issues. Examples are presented in both cases to illustrate these advantages, comparing various methods of analyzing proportions including untransformed, arcsine- and logit-transformed linear models and logistic regression (with or without random effects). Simulations demonstrate that logistic regression usually provides a gain in power over other methods.
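
    A compact illustration of the recommendation, contrasting an arcsine-transformed linear model with logistic regression that accounts for overdispersion through a random effect (lme4); the binomial plot-level data are simulated and the variable names are placeholders.

    ```r
    library(lme4)
    set.seed(9)
    plots <- 40; trials <- 20
    x <- rnorm(plots)
    re <- rnorm(plots, sd = 0.6)                           # extra plot-level variation (overdispersion)
    success <- rbinom(plots, trials, plogis(-0.3 + 0.8 * x + re))
    d <- data.frame(x, success, failure = trials - success, plot = factor(seq_len(plots)))

    arcsine <- lm(asin(sqrt(success / trials)) ~ x, data = d)       # traditional arcsine approach
    glmm <- glmer(cbind(success, failure) ~ x + (1 | plot),
                  family = binomial, data = d)                      # logistic regression with random effect
    ```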

  18. A generalized partially linear mean-covariance regression model for longitudinal proportional data, with applications to the analysis of quality of life data from cancer clinical trials.

    Science.gov (United States)

    Zheng, Xueying; Qin, Guoyou; Tu, Dongsheng

    2017-05-30

    Motivated by the analysis of quality of life data from a clinical trial on early breast cancer, we propose in this paper a generalized partially linear mean-covariance regression model for longitudinal proportional data, which are bounded in a closed interval. Cholesky decomposition of the covariance matrix for within-subject responses and generalized estimation equations are used to estimate unknown parameters and the nonlinear function in the model. Simulation studies are performed to evaluate the performance of the proposed estimation procedures. Our new model is also applied to analyze the data from the cancer clinical trial that motivated this research. In comparison with available models in the literature, the proposed model does not require specific parametric assumptions on the density function of the longitudinal responses and the probability function of the boundary values and can capture dynamic changes of time or other interested variables on both mean and covariance of the correlated proportional responses. Copyright © 2017 John Wiley & Sons, Ltd.

  19. The proportional odds cumulative incidence model for competing risks

    DEFF Research Database (Denmark)

    Eriksson, Frank; Li, Jianing; Scheike, Thomas

    2015-01-01

    We suggest an estimator for the proportional odds cumulative incidence model for competing risks data. The key advantage of this model is that the regression parameters have the simple and useful odds ratio interpretation. The model has been considered by many authors, but it is rarely used...... in practice due to the lack of reliable estimation procedures. We suggest such procedures and show that their performance improves considerably over existing methods. We also suggest a goodness-of-fit test for the proportional odds assumption. We derive the large sample properties and provide estimators

  20. Correlation-regression model for physico-chemical quality of ...

    African Journals Online (AJOL)

    abusaad

    areas, suggesting that groundwater quality in urban areas is closely related with land use ... the ground water, with correlation and regression model is also presented. ......

  1. Artificial neural networks versus proportional hazards Cox models to predict 45-year all-cause mortality in the Italian Rural Areas of the Seven Countries Study

    Directory of Open Access Journals (Sweden)

    Puddu Paolo

    2012-07-01

    Full Text Available Abstract Background Projection pursuit regression, multilayer feed-forward networks, multivariate adaptive regression splines and trees (including survival trees) have challenged classic multivariable models such as the multiple logistic function, the proportional hazards life table Cox model (Cox), the Poisson model, and the Weibull life table model to perform multivariable predictions. However, only artificial neural networks (NN) have become popular in medical applications. Results We compared several Cox versus NN models in predicting 45-year all-cause mortality (45-ACM) by 18 risk factors selected a priori: age; father life status; mother life status; family history of cardiovascular diseases; job-related physical activity; cigarette smoking; body mass index (linear and quadratic terms); arm circumference; mean blood pressure; heart rate; forced expiratory volume; serum cholesterol; corneal arcus; diagnoses of cardiovascular diseases, cancer and diabetes; minor ECG abnormalities at rest. Two Italian rural cohorts of the Seven Countries Study, made up of men aged 40 to 59 years, were enrolled and first examined in 1960 in Italy. Cox models were estimated by: (a) forcing all factors; (b) a forward-; and (c) a backward-stepwise procedure. Observed cases of deaths and of survivors were computed in decile classes of estimated risk. Forced and stepwise NN were run and compared by C-statistics (ROC analysis) with the Cox models. Out of 1591 men, 1447 died. Model global accuracies were extremely high by all methods (ROCs > 0.810), but there was no clear-cut superiority of any model to predict 45-ACM. The highest ROCs (> 0.838) were observed by NN. There were inter-model variations in selecting predictive covariates: whereas all models concurred to define the role of 10 covariates (mainly cardiovascular risk factors), family history, heart rate and minor ECG abnormalities were not contributors by Cox models but were so by forced NN. Forced expiratory volume and arm

  2. Analyzing Right-Censored Length-Biased Data with Additive Hazards Model

    Institute of Scientific and Technical Information of China (English)

    Mu ZHAO; Cun-jie LIN; Yong ZHOU

    2017-01-01

    Length-biased data are often encountered in observational studies, when the survival times are left-truncated and right-censored and the truncation times follow a uniform distribution. In this article, we propose to analyze such data with the additive hazards model, which specifies that the hazard function is the sum of an arbitrary baseline hazard function and a regression function of covariates. We develop estimating equation approaches to estimate the regression parameters. The resultant estimators are shown to be consistent and asymptotically normal. Some simulation studies and a real data example are used to evaluate the finite sample properties of the proposed estimators.
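
    For orientation, the standard (non-length-biased) additive hazards model can be fitted in R with survival::aareg; a sketch on the bundled lung data is below. It does not implement the bias-corrected estimating equations developed in the record.

    ```r
    library(survival)
    data(lung)
    # Aalen's additive model: h(t | x) = b0(t) + b1(t) * age + b2(t) * sex
    fit <- aareg(Surv(time, status) ~ age + sex, data = lung)
    fit          # tests of whether each cumulative regression function differs from zero
    plot(fit)    # estimated cumulative regression coefficients over time
    ```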

  3. Survival analysis of a treatment data for cancer of the larynx

    International Nuclear Information System (INIS)

    Khan, K.

    2002-01-01

    In this paper a survival analysis of treatment data for cancer of the larynx is carried out. The Cox regression model is fitted to the survival times under the assumption of proportional hazards. A model is selected after inclusion and exclusion of factors and variables as explanatory variables. The assumption of proportional hazards is tested in the manner suggested by Harrell (1986) and is supported by these tests. However, the plot of Schoenfeld residuals against dose gives some evidence against the validity of the proportional hazards assumption; for the variable time the assumption seems to be satisfied. The martingale residuals suggest no pattern for the variable age. The functional form of dose is not linear, hence a quadratic dose term is used as an explanatory variable. A comparison of logistic regression analysis and survival analysis is also made in this paper. It can be concluded that the Cox proportional hazards model is a better model than the logistic model, as it is more parsimonious and utilizes more information. (author)
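
    The checks described in this record (proportional hazards tests, Schoenfeld residual plots, martingale residuals for functional form, and a quadratic dose term) translate directly into R; the sketch below uses the bundled veteran data with karno standing in for dose, since the larynx data set itself is not reproduced here.

    ```r
    library(survival)
    data(veteran)
    fit <- coxph(Surv(time, status) ~ karno + I(karno^2) + age, data = veteran)
    cox.zph(fit)                               # per-covariate and global tests of proportional hazards
    plot(cox.zph(fit))                         # scaled Schoenfeld residuals against time
    base <- coxph(Surv(time, status) ~ age, data = veteran)
    plot(veteran$karno, residuals(base, type = "martingale"))   # functional form of the dose analogue
    ```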

  4. Regression analysis of case K interval-censored failure time data in the presence of informative censoring.

    Science.gov (United States)

    Wang, Peijie; Zhao, Hui; Sun, Jianguo

    2016-12-01

    Interval-censored failure time data occur in many fields such as demography, economics, medical research, and reliability, and many inference procedures for them have been developed (Sun, 2006; Chen, Sun, and Peace, 2012). However, most of the existing approaches assume that the mechanism that yields interval censoring is independent of the failure time of interest, and it is clear that this may not be true in practice (Zhang et al., 2007; Ma, Hu, and Sun, 2015). In this article, we consider regression analysis of case K interval-censored failure time data when the censoring mechanism may be related to the failure time of interest. For the problem, an estimated sieve maximum-likelihood approach is proposed for data arising from the proportional hazards frailty model, and for estimation, a two-step procedure is presented. In addition, the asymptotic properties of the proposed estimators of regression parameters are established, and an extensive simulation study suggests that the method works well. Finally, we apply the method to a set of real interval-censored data that motivated this study. © 2016, The International Biometric Society.

  5. Hazard avoidance via descent images for safe landing

    Science.gov (United States)

    Yan, Ruicheng; Cao, Zhiguo; Zhu, Lei; Fang, Zhiwen

    2013-10-01

    In planetary or lunar landing missions, hazard avoidance is critical for landing safety. Therefore, it is very important to correctly detect hazards and effectively find a safe landing area during the last stage of descent. In this paper, we propose a passive-sensing-based HDA (hazard detection and avoidance) approach via descent images to lower the landing risk. In the hazard detection stage, a statistical probability model based on hazard similarity is adopted to evaluate the image and detect hazardous areas, so that a binary hazard image can be generated. Afterwards, a safety coefficient, which jointly utilizes the proportion of hazards in the local region and the distribution of hazards inside it, is proposed to find potential regions with fewer hazards in the binary hazard image. By using the safety coefficient in a coarse-to-fine procedure and combining it with the local ISD (intensity standard deviation) measure, the safe landing area is determined. The algorithm is evaluated and verified with many simulated descent downward-looking images rendered from lunar orbital satellite images.

  6. Two encyclopedia contributions

    DEFF Research Database (Denmark)

    Andersen, Per Kragh

    2003-01-01

    censoring; Cox regression model; counting process; event history analysis; intensity process; multi-state model; non-parametric inference; parametric models; survival analysis; Cox's proportional hazards model; hazard function, regression model; time-dependent covariate; time scale...

  7. Building vulnerability to hydro-geomorphic hazards: Estimating damage probability from qualitative vulnerability assessment using logistic regression

    Science.gov (United States)

    Ettinger, Susanne; Mounaud, Loïc; Magill, Christina; Yao-Lafourcade, Anne-Françoise; Thouret, Jean-Claude; Manville, Vern; Negulescu, Caterina; Zuccaro, Giulio; De Gregorio, Daniela; Nardone, Stefano; Uchuchoque, Juan Alexis Luque; Arguedas, Anita; Macedo, Luisa; Manrique Llerena, Nélida

    2016-10-01

    The focus of this study is an analysis of building vulnerability through investigating impacts from the 8 February 2013 flash flood event along the Avenida Venezuela channel in the city of Arequipa, Peru. On this day, 124.5 mm of rain fell within 3 h (monthly mean: 29.3 mm) triggering a flash flood that inundated at least 0.4 km² of urban settlements along the channel, affecting more than 280 buildings, 23 of a total of 53 bridges (pedestrian, vehicle and railway), and leading to the partial collapse of sections of the main road, paralyzing central parts of the city for more than one week. This study assesses the aspects of building design and site-specific environmental characteristics that render a building vulnerable by considering the example of a flash flood event in February 2013. A statistical methodology is developed that enables estimation of damage probability for buildings. The applied method uses observed inundation height as a hazard proxy in areas where more detailed hydrodynamic modeling data is not available. Building design and site-specific environmental conditions determine the physical vulnerability. The mathematical approach considers both physical vulnerability and hazard-related parameters and helps to reduce uncertainty in the determination of descriptive parameters, parameter interdependency and respective contributions to damage. This study aims to (1) enable the estimation of damage probability for a certain hazard intensity, and (2) obtain data to visualize variations in damage susceptibility for buildings in flood-prone areas. Data collection is based on a post-flood event field survey and the analysis of high (sub-metric) spatial resolution images (Pléiades 2012, 2013). An inventory of 30 city blocks was collated in a GIS database in order to estimate the physical vulnerability of buildings. As many as 1103 buildings were surveyed along the affected drainage and 898 buildings were included in the statistical analysis. Univariate and

  8. The role of social networks and media receptivity in predicting age of smoking initiation: a proportional hazards model of risk and protective factors.

    Science.gov (United States)

    Unger, J B; Chen, X

    1999-01-01

    The increasing prevalence of adolescent smoking demonstrates the need to identify factors associated with early smoking initiation. Previous studies have shown that smoking by social network members and receptivity to pro-tobacco marketing are associated with smoking among adolescents. It is not clear, however, whether these variables also are associated with the age of smoking initiation. Using data from 10,030 California adolescents, this study identified significant correlates of age of smoking initiation using bivariate methods and a multivariate proportional hazards model. Age of smoking initiation was earlier among those adolescents whose friends, siblings, or parents were smokers, and among those adolescents who had a favorite tobacco advertisement, had received tobacco promotional items, or would be willing to use tobacco promotional items. Results suggest that the smoking behavior of social network members and pro-tobacco media influences are important determinants of age of smoking initiation. Because early smoking initiation is associated with higher levels of addiction in adulthood, tobacco control programs should attempt to counter these influences.

  9. Regression models for the restricted residual mean life for right-censored and left-truncated data

    DEFF Research Database (Denmark)

    Cortese, Giuliana; Holmboe, Stine A.; Scheike, Thomas H.

    2017-01-01

    The hazard ratios resulting from a Cox regression model are hard to interpret and to convert into prolonged survival time. As the main goal is often to study survival functions, there is increasing interest in summary measures based on the survival function that are easier to interpret than the hazard ratio; the residual mean time is an important example of such measures. However, because of the presence of right censoring, the tail of the survival distribution is often difficult to estimate correctly. Therefore, we consider the restricted residual mean time, which represents a partial area under the survival function, given any time horizon τ, and is interpreted as the residual life expectancy up to τ of a subject surviving up to time t. We present a class of regression models for this measure, based on weighted estimating equations and inverse probability of censoring weighted...
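
    To make the quantity concrete, the following sketch estimates the restricted residual mean time nonparametrically from right-censored data: a Kaplan-Meier curve is integrated between t and the horizon τ and divided by S(t). This is only an illustrative plug-in estimator on simulated data, not the weighted estimating-equation regression models proposed in the paper.

        import numpy as np

        def kaplan_meier(time, event):
            """Kaplan-Meier estimate; returns the event times and the survival values S(t)."""
            time, event = np.asarray(time, float), np.asarray(event, int)
            uniq = np.unique(time[event == 1])
            at_risk = np.array([(time >= t).sum() for t in uniq])
            deaths = np.array([((time == t) & (event == 1)).sum() for t in uniq])
            return uniq, np.cumprod(1.0 - deaths / at_risk)

        def restricted_residual_mean(time, event, t, tau):
            """E[min(T, tau) - t | T > t]: area under S between t and tau, divided by S(t)."""
            times, surv = kaplan_meier(time, event)
            grid = np.linspace(t, tau, 2000)
            # step-function value of S at each grid point (S = 1 before the first event time)
            s_grid = np.array([surv[times <= g][-1] if (times <= g).any() else 1.0 for g in grid])
            area = np.sum((s_grid[1:] + s_grid[:-1]) / 2.0 * np.diff(grid))   # trapezoid rule
            return area / s_grid[0]

        rng = np.random.default_rng(2)
        latent = rng.exponential(5.0, 300)
        censor = rng.exponential(8.0, 300)
        obs, ev = np.minimum(latent, censor), (latent <= censor).astype(int)
        print(restricted_residual_mean(obs, ev, t=1.0, tau=10.0))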

  10. Cox regression with missing covariate data using a modified partial likelihood method

    DEFF Research Database (Denmark)

    Martinussen, Torben; Holst, Klaus K.; Scheike, Thomas H.

    2016-01-01

    Missing covariate values are a common problem in survival analysis. In this paper we propose a novel method for the Cox regression model that is close to maximum likelihood but avoids the use of the EM-algorithm. It exploits that the observed hazard function is multiplicative in the baseline hazard...

  11. Detecting sea-level hazards: Simple regression-based methods for calculating the acceleration of sea level

    Science.gov (United States)

    Doran, Kara S.; Howd, Peter A.; Sallenger, Asbury H.

    2016-01-04

    This report documents the development of statistical tools used to quantify the hazard presented by the response of sea-level elevation to natural or anthropogenic changes in climate and ocean circulation. A hazard is a physical process (or processes) that, when combined with vulnerability (or susceptibility to the hazard), results in risk. This study presents the development and comparison of new and existing sea-level analysis methods, exploration of the strengths and weaknesses of the methods using synthetic time series, and when appropriate, synthesis of the application of the method to observed sea-level time series. These reports are intended to enhance material presented in peer-reviewed journal articles where it is not always possible to provide the level of detail that might be necessary to fully support or recreate published results.

  12. The number of subjects per variable required in linear regression analyses.

    Science.gov (United States)

    Austin, Peter C; Steyerberg, Ewout W

    2015-06-01

    To determine the number of independent variables that can be included in a linear regression model. We used a series of Monte Carlo simulations to examine the impact of the number of subjects per variable (SPV) on the accuracy of estimated regression coefficients and standard errors, on the empirical coverage of estimated confidence intervals, and on the accuracy of the estimated R² of the fitted model. A minimum of approximately two SPV tended to result in estimation of regression coefficients with relative bias of less than 10%. Furthermore, with this minimum number of SPV, the standard errors of the regression coefficients were accurately estimated and estimated confidence intervals had approximately the advertised coverage rates. A much higher number of SPV was necessary to minimize bias in estimating the model R², although adjusted R² estimates behaved well. The bias in estimating the model R² statistic was inversely proportional to the magnitude of the proportion of variation explained by the population regression model. Linear regression models require only two SPV for adequate estimation of regression coefficients, standard errors, and confidence intervals. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
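
    A small Monte Carlo check in the spirit of the study can be run with a few lines of NumPy: for a chosen SPV ratio it simulates linear-regression data and then reports the relative bias of a slope estimate and the bias of the unadjusted R². The covariate structure, effect sizes, and number of simulations below are arbitrary assumptions for illustration.

        import numpy as np

        rng = np.random.default_rng(3)

        def spv_simulation(spv, n_covariates=10, n_sims=1000, beta=0.2):
            """Monte Carlo check of slope bias and R^2 bias for a given SPV ratio."""
            n = spv * n_covariates
            true_beta = np.full(n_covariates, beta)
            # population R^2 = var(X beta) / (var(X beta) + 1) for unit-variance noise
            pop_r2 = true_beta @ true_beta / (true_beta @ true_beta + 1.0)
            slope_err, r2_est = [], []
            for _ in range(n_sims):
                X = rng.normal(size=(n, n_covariates))
                y = X @ true_beta + rng.normal(size=n)
                Xd = np.column_stack([np.ones(n), X])
                coef = np.linalg.lstsq(Xd, y, rcond=None)[0]
                resid = y - Xd @ coef
                r2_est.append(1.0 - (resid ** 2).sum() / ((y - y.mean()) ** 2).sum())
                slope_err.append(coef[1] - beta)
            return np.mean(slope_err) / beta, np.mean(r2_est) - pop_r2

        for spv in (2, 5, 20):
            slope_bias, r2_bias = spv_simulation(spv)
            print(f"SPV={spv:2d}  relative bias(beta_1)={slope_bias:+.3f}  bias(R^2)={r2_bias:+.3f}")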

  13. Bayes estimation of the general hazard rate model

    International Nuclear Information System (INIS)

    Sarhan, A.

    1999-01-01

    In reliability theory and life testing models, the lifetime distributions are often specified by choosing a relevant hazard rate function. Here a general hazard rate function h(t) = a + b·t^(c-1), where a, b, c are constants greater than zero, is considered. The parameter c is assumed to be known. The Bayes estimators of (a,b) based on data from type II/item-censored testing without replacement are obtained. A large simulation study using the Monte Carlo method is carried out to compare the performance of the Bayes estimators with regression estimators of (a,b). The criterion for comparison is the Bayes risk associated with the respective estimator. Also, the influence of the number of failed items on the accuracy of the estimators (Bayes and regression) is investigated. Estimates for the parameters (a,b) of the linearly increasing hazard rate model h(t) = a + b·t, where a, b are greater than zero, can be obtained as the special case c = 2.
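
    The hazard model itself is easy to work with numerically: its cumulative hazard is H(t) = a·t + (b/c)·t^c, so S(t) = exp(-H(t)) and event times can be simulated by inverting H. The sketch below does exactly that with NumPy/SciPy; it illustrates the model only and does not attempt the paper's Bayes estimation under type II censoring, and the parameter values are arbitrary assumptions.

        import numpy as np
        from scipy.optimize import brentq

        a, b, c = 0.5, 0.8, 2.0      # c = 2 gives the linearly increasing hazard a + b*t

        def hazard(t):
            return a + b * t ** (c - 1.0)

        def cum_hazard(t):
            return a * t + (b / c) * t ** c

        def survival(t):
            return np.exp(-cum_hazard(t))

        def sample(n, rng):
            """Inverse-transform sampling: solve H(T) = -log(U) for each uniform draw U."""
            u = rng.uniform(size=n)
            return np.array([brentq(lambda t: cum_hazard(t) + np.log(ui), 0.0, 100.0) for ui in u])

        rng = np.random.default_rng(4)
        draws = sample(5000, rng)
        grid = np.array([0.5, 1.0, 2.0])
        print("hazard at t=1:   ", hazard(1.0))
        print("empirical S(t):  ", [round((draws > g).mean(), 3) for g in grid])
        print("theoretical S(t):", np.round(survival(grid), 3))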

  14. Outcome-Dependent Sampling Design and Inference for Cox's Proportional Hazards Model.

    Science.gov (United States)

    Yu, Jichang; Liu, Yanyan; Cai, Jianwen; Sandler, Dale P; Zhou, Haibo

    2016-11-01

    We propose a cost-effective outcome-dependent sampling (ODS) design for failure time data and develop an efficient inference procedure for data collected with this design. To account for the biased sampling scheme, we derive estimators from a weighted partial likelihood estimating equation. The proposed estimators for regression parameters are shown to be consistent and asymptotically normally distributed. A criterion that can be used to optimally implement the ODS design in practice is proposed and studied. The small sample performance of the proposed method is evaluated by simulation studies. The proposed design and inference procedure are shown to be statistically more powerful than existing alternative designs with the same sample sizes. We illustrate the proposed method with existing real data from the Cancer Incidence and Mortality of Uranium Miners Study.

  15. A simple approach to power and sample size calculations in logistic regression and Cox regression models.

    Science.gov (United States)

    Vaeth, Michael; Skovlund, Eva

    2004-06-15

    For a given regression problem it is possible to identify a suitably defined equivalent two-sample problem such that the power or sample size obtained for the two-sample problem also applies to the regression problem. For a standard linear regression model the equivalent two-sample problem is easily identified, but for generalized linear models and for Cox regression models the situation is more complicated. An approximately equivalent two-sample problem may, however, also be identified here. In particular, we show that for logistic regression and Cox regression models the equivalent two-sample problem is obtained by selecting two equally sized samples for which the parameters differ by a value equal to the slope times twice the standard deviation of the independent variable and further requiring that the overall expected number of events is unchanged. In a simulation study we examine the validity of this approach to power calculations in logistic regression and Cox regression models. Several different covariate distributions are considered for selected values of the overall response probability and a range of alternatives. For the Cox regression model we consider both constant and non-constant hazard rates. The results show that in general the approach is remarkably accurate even in relatively small samples. Some discrepancies are, however, found in small samples with few events and a highly skewed covariate distribution. Comparison with results based on alternative methods for logistic regression models with a single continuous covariate indicates that the proposed method is at least as good as its competitors. The method is easy to implement and therefore provides a simple way to extend the range of problems that can be covered by the usual formulas for power and sample size determination. Copyright 2004 John Wiley & Sons, Ltd.
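
    Under the rule quoted above, a back-of-the-envelope calculation is possible: convert the slope into an equivalent two-group log hazard ratio (slope times twice the SD of the covariate) and plug it into Schoenfeld's formula for the required number of events with equal allocation. The sketch below implements that reading of the abstract; it is an approximation and not the authors' exact procedure, which additionally requires the overall expected number of events to be unchanged.

        from scipy.stats import norm

        def events_needed(beta_slope, sd_x, alpha=0.05, power=0.80):
            """Approximate number of events for a Cox model with one continuous covariate:
            equivalent two-sample log hazard ratio = slope * 2 * SD(x), equal allocation,
            plugged into Schoenfeld's formula for the required number of events."""
            theta = beta_slope * 2.0 * sd_x
            z = norm.ppf(1.0 - alpha / 2.0) + norm.ppf(power)
            return z ** 2 / (0.25 * theta ** 2)      # p(1-p) = 0.25 for equally sized groups

        # e.g. a log hazard ratio of 0.3 per unit of a covariate with SD = 1
        print(round(events_needed(beta_slope=0.3, sd_x=1.0)))   # about 87 events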

  16. Comparison of multinomial logistic regression and logistic regression: which is more efficient in allocating land use?

    Science.gov (United States)

    Lin, Yingzhi; Deng, Xiangzheng; Li, Xing; Ma, Enjun

    2014-12-01

    Spatially explicit simulation of land use change is the basis for estimating the effects of land use and cover change on energy fluxes, ecology and the environment. At the pixel level, logistic regression is one of the most common approaches used in spatially explicit land use allocation models to determine the relationship between land use and its causal factors in driving land use change, and thereby to evaluate land use suitability. However, these models have a drawback in that they do not determine/allocate land use based on the direct relationship between land use change and its driving factors. Consequently, a multinomial logistic regression method was introduced to address this flaw, and thereby, judge the suitability of a type of land use in any given pixel in a case study area of the Jiangxi Province, China. A comparison of the two regression methods indicated that the proportion of correctly allocated pixels using multinomial logistic regression was 92.98%, which was 8.47% higher than that obtained using logistic regression. Paired t-test results also showed that pixels were more clearly distinguished by multinomial logistic regression than by logistic regression. In conclusion, multinomial logistic regression is a more efficient and accurate method for the spatial allocation of land use changes. The application of this method in future land use change studies may improve the accuracy of predicting the effects of land use and cover change on energy fluxes, ecology, and environment.
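
    The comparison described in the abstract can be mimicked on synthetic data: fit one-vs-rest binary logistic models and a single multinomial (softmax) logistic model to the same "pixels" and compare the proportion of correctly allocated classes. The scikit-learn sketch below is a hedged illustration with made-up covariates and classes, not the Jiangxi land-use data or the authors' allocation model.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.multiclass import OneVsRestClassifier
        from sklearn.model_selection import train_test_split

        # Synthetic "pixels": two driving factors and three land-use classes.
        rng = np.random.default_rng(5)
        n = 3000
        X = rng.normal(size=(n, 2))
        logits = X @ np.array([[1.5, -0.5, -1.0], [0.2, 1.2, -1.4]])
        y = np.array([rng.choice(3, p=np.exp(l) / np.exp(l).sum()) for l in logits])
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

        ovr = OneVsRestClassifier(LogisticRegression(max_iter=1000)).fit(X_tr, y_tr)
        mnl = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)   # multinomial (softmax) fit

        print("one-vs-rest binary logits, proportion correct:", round(ovr.score(X_te, y_te), 3))
        print("multinomial logit, proportion correct:        ", round(mnl.score(X_te, y_te), 3))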

  17. Landslide susceptibility mapping using logistic statistical regression in Babaheydar Watershed, Chaharmahal Va Bakhtiari Province, Iran

    Directory of Open Access Journals (Sweden)

    Ebrahim Karimi Sangchini

    2015-01-01

    Landslides are amongst the most damaging natural hazards in mountainous regions. Every year, hundreds of people all over the world lose their lives in landslides; furthermore, there are large impacts on the local and global economy from these events. In this study, landslide hazard zonation in the Babaheydar watershed using logistic regression was conducted to determine landslide hazard areas. First, the landslide inventory map was prepared using aerial photograph interpretations and field surveys. In the next step, ten landslide conditioning factors, namely altitude, slope percentage, slope aspect, lithology, distance from faults, rivers, settlements and roads, land use, and precipitation, were chosen as factors influencing landsliding in the study area. Subsequently, the landslide susceptibility map was constructed using the logistic regression model in a Geographic Information System (GIS). The ROC and pseudo-R² indexes were used for model assessment. Results showed that the logistic regression model provided relatively high prediction accuracy of landslide susceptibility maps in the Babaheydar Watershed, with ROC equal to 0.876. Furthermore, the results revealed that about 44% of the watershed areas were located in the high and very high hazard classes. The resultant landslide susceptibility maps can be useful in appropriate watershed management practices and for sustainable development in the region.

  18. Shared and unshared exposure measurement error in occupational cohort studies and their effects on statistical inference in proportional hazards models

    Science.gov (United States)

    Laurier, Dominique; Rage, Estelle

    2018-01-01

    Exposure measurement error represents one of the most important sources of uncertainty in epidemiology. When exposure uncertainty is not or only poorly accounted for, it can lead to biased risk estimates and a distortion of the shape of the exposure-response relationship. In occupational cohort studies, the time-dependent nature of exposure and changes in the method of exposure assessment may create complex error structures. When a method of group-level exposure assessment is used, individual worker practices and the imprecision of the instrument used to measure the average exposure for a group of workers may give rise to errors that are shared between workers, within workers or both. In contrast to unshared measurement error, the effects of shared errors remain largely unknown. Moreover, exposure uncertainty and magnitude of exposure are typically highest for the earliest years of exposure. We conduct a simulation study based on exposure data of the French cohort of uranium miners to compare the effects of shared and unshared exposure uncertainty on risk estimation and on the shape of the exposure-response curve in proportional hazards models. Our results indicate that uncertainty components shared within workers cause more bias in risk estimation and a more severe attenuation of the exposure-response relationship than unshared exposure uncertainty or exposure uncertainty shared between individuals. These findings underline the importance of careful characterisation and modeling of exposure uncertainty in observational studies. PMID:29408862

  19. Modeling Tetanus Neonatorum case using the regression of negative binomial and zero-inflated negative binomial

    Science.gov (United States)

    Amaliana, Luthfatul; Sa'adah, Umu; Wayan Surya Wardhani, Ni

    2017-12-01

    Tetanus Neonatorum is an infectious disease that can be prevented by immunization. The number of Tetanus Neonatorum cases in East Java Province was the highest in Indonesia up to 2015. The Tetanus Neonatorum data exhibit overdispersion and a large proportion of zeros. Negative Binomial (NB) regression is an alternative method when overdispersion occurs in Poisson regression. However, data containing both overdispersion and zero-inflation are more appropriately analyzed using Zero-Inflated Negative Binomial (ZINB) regression. The purposes of this study are: (1) to model Tetanus Neonatorum cases in East Java Province, with a 71.05 percent proportion of zeros, using NB and ZINB regression, and (2) to obtain the best model. The results indicate that ZINB regression is better than NB regression, with a smaller AIC.
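
    The model choice described here (NB versus ZINB, judged by AIC) can be illustrated without specialist software by fitting intercept-only versions of both models by maximum likelihood. The SciPy sketch below does this on simulated zero-inflated counts; the simulated data, the intercept-only simplification, and the Nelder-Mead optimiser are assumptions for illustration, and the study's actual covariate structure is not reproduced.

        import numpy as np
        from scipy import optimize, stats

        rng = np.random.default_rng(6)
        # Synthetic counts: 70% structural zeros mixed with NB counts of mean 3.
        n_obs = 500
        structural_zero = rng.random(n_obs) < 0.7
        nb_draws = rng.negative_binomial(n=1.0, p=1.0 / (1.0 + 3.0), size=n_obs)
        counts = np.where(structural_zero, 0, nb_draws)

        def nb_logpmf(y, mu, alpha):
            # NB2 parameterisation (Var = mu + alpha*mu^2) mapped to scipy's (n, p).
            n = 1.0 / alpha
            return stats.nbinom.logpmf(y, n, n / (n + mu))

        def negloglik_nb(params):
            mu, alpha = np.exp(params)                   # keep both positive
            return -np.sum(nb_logpmf(counts, mu, alpha))

        def negloglik_zinb(params):
            mu, alpha = np.exp(params[:2])
            pi = 1.0 / (1.0 + np.exp(-params[2]))        # zero-inflation probability
            base = nb_logpmf(counts, mu, alpha)
            ll = np.where(counts == 0,
                          np.log(pi + (1.0 - pi) * np.exp(base)),
                          np.log(1.0 - pi) + base)
            return -np.sum(ll)

        fit_nb = optimize.minimize(negloglik_nb, x0=[0.0, 0.0], method="Nelder-Mead")
        fit_zinb = optimize.minimize(negloglik_zinb, x0=[0.0, 0.0, 0.0], method="Nelder-Mead")
        print("AIC NB:  ", round(2 * 2 + 2 * fit_nb.fun, 1))
        print("AIC ZINB:", round(2 * 3 + 2 * fit_zinb.fun, 1))   # smaller AIC is preferred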

  20. The R Package threg to Implement Threshold Regression Models

    Directory of Open Access Journals (Sweden)

    Tao Xiao

    2015-08-01

    This new package includes four functions: threg, and the methods hr, predict and plot for threg objects returned by threg. The threg function is the model-fitting function which is used to calculate regression coefficient estimates, asymptotic standard errors and p values. The hr method for threg objects is the hazard-ratio calculation function which provides the estimates of hazard ratios at selected time points for specified scenarios (based on given categories or value settings of covariates). The predict method for threg objects is used for prediction. And the plot method for threg objects provides plots for curves of estimated hazard functions, survival functions and probability density functions of the first-hitting-time; function curves corresponding to different scenarios can be overlaid in the same plot for comparison to give additional research insights.
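
    Threshold regression treats the event time as the first time a latent Wiener process with initial level y0 > 0 and drift mu hits zero, which gives an inverse Gaussian first-hitting-time law; in the regression setting ln(y0) and mu are typically linked to covariates. The Python sketch below writes out the density, survival and hazard of that first-hitting-time distribution for fixed parameters; it is a sketch of the underlying model under these standard assumptions, not an interface to the R package.

        import numpy as np
        from scipy.stats import norm

        def fht_density(t, y0, mu):
            """Density of the first time a Wiener process with drift mu (mu < 0), unit
            variance and starting level y0 > 0 hits the zero boundary (inverse Gaussian)."""
            return y0 / np.sqrt(2.0 * np.pi * t ** 3) * np.exp(-(y0 + mu * t) ** 2 / (2.0 * t))

        def fht_survival(t, y0, mu):
            return (norm.cdf((y0 + mu * t) / np.sqrt(t))
                    - np.exp(-2.0 * mu * y0) * norm.cdf((mu * t - y0) / np.sqrt(t)))

        def fht_hazard(t, y0, mu):
            return fht_density(t, y0, mu) / fht_survival(t, y0, mu)

        # In threshold regression the parameters are linked to covariates, e.g.
        # ln(y0) = x'beta_y0 and mu = x'beta_mu; here we only evaluate the curves.
        t = np.linspace(0.5, 10.0, 5)
        print(np.round(fht_hazard(t, y0=2.0, mu=-0.5), 4))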

  1. Proportionality lost - proportionality regained?

    DEFF Research Database (Denmark)

    Werlauff, Erik

    2010-01-01

    In recent years, the European Court of Justice (the ECJ) seems to have accepted restrictions on the freedom of establishment and other basic freedoms, despite the fact that a more thorough proportionality test would have revealed that the restriction in question did not pass the 'rule of reason' ...

  2. Maximum likelihood estimation for Cox's regression model under nested case-control sampling

    DEFF Research Database (Denmark)

    Scheike, Thomas Harder; Juul, Anders

    2004-01-01

    Nested case-control sampling is designed to reduce the costs of large cohort studies. It is important to estimate the parameters of interest as efficiently as possible. We present a new maximum likelihood estimator (MLE) for nested case-control sampling in the context of Cox's proportional hazards model. The MLE is computed by the EM-algorithm, which is easy to implement in the proportional hazards setting. Standard errors are estimated by a numerical profile likelihood approach based on EM aided differentiation. The work was motivated by a nested case-control study that hypothesized that insulin-like growth factor I was associated with ischemic heart disease. The study was based on a population of 3784 Danes and 231 cases of ischemic heart disease where controls were matched on age and gender. We illustrate the use of the MLE for these data and show how the maximum likelihood framework can be used...

  3. Multiple logistic regression model of signalling practices of drivers on urban highways

    Science.gov (United States)

    Puan, Othman Che; Ibrahim, Muttaka Na'iya; Zakaria, Rozana

    2015-05-01

    Giving a signal is a way of informing other road users, especially conflicting drivers, of a driver's intention to change his/her movement course. Other users are exposed to hazardous situations and risks of accidents if the driver who changes course fails to give a signal as required. This paper describes the application of a logistic regression model for the analysis of drivers' signalling practices on multilane highways based on possible factors affecting the driver's decision, such as the driver's gender, vehicle type, vehicle speed and traffic flow intensity. Data pertaining to the analysis of such factors were collected manually. More than 2000 drivers who performed a lane-changing manoeuvre while driving on two sections of multilane highways were observed. Findings from the study show that a relatively large proportion of drivers failed to give any signal when changing lane. The results of the analysis indicate that although the proportion of drivers who failed to provide a signal prior to a lane-changing manoeuvre is high, the degree of compliance of female drivers is better than that of male drivers. A binary logistic model was developed to represent the probability of a driver providing a signal indication prior to a lane-changing manoeuvre. The model indicates that the driver's gender, the type of vehicle driven, the speed of the vehicle and the traffic volume influence the driver's decision to provide a signal indication prior to a lane-changing manoeuvre on a multilane urban highway. In terms of types of vehicles driven, about 97% of motorcyclists failed to comply with the signal indication requirement. The proportion of non-compliant drivers under stable traffic flow conditions is much higher than when the flow is relatively heavy. This is consistent with the data, which indicate a high degree of non-compliance when the average speed of the traffic stream is relatively high.

  4. Proportion of medication error reporting and associated factors among nurses: a cross sectional study.

    Science.gov (United States)

    Jember, Abebaw; Hailu, Mignote; Messele, Anteneh; Demeke, Tesfaye; Hassen, Mohammed

    2018-01-01

    A medication error (ME) is any preventable event that may cause or lead to inappropriate medication use or patient harm. Voluntary reporting has a principal role in appreciating the extent and impact of medication errors. Thus, exploring the proportion of medication error reporting and associated factors among nurses is important to inform service providers and program implementers so as to improve the quality of healthcare services. An institution-based quantitative cross-sectional study was conducted among 397 nurses from March 6 to May 10, 2015. Stratified sampling followed by a simple random sampling technique was used to select the study participants. The data were collected using a structured self-administered questionnaire which was adopted from studies conducted in Australia and Jordan. A pilot study was carried out to validate the questionnaire before data collection for this study. Bivariate and multivariate logistic regression models were fitted to identify factors associated with the proportion of medication error reporting among nurses. An adjusted odds ratio with 95% confidence interval was computed to determine the level of significance. The proportion of medication error reporting among nurses was found to be 57.4%. Regression analysis showed that sex, marital status, having made a medication error and medication error experience were significantly associated with medication error reporting. The proportion of medication error reporting among nurses in this study was found to be higher than in other studies.

  5. Maximum likelihood estimation for Cox's regression model under nested case-control sampling

    DEFF Research Database (Denmark)

    Scheike, Thomas; Juul, Anders

    2004-01-01

    Nested case-control sampling is designed to reduce the costs of large cohort studies. It is important to estimate the parameters of interest as efficiently as possible. We present a new maximum likelihood estimator (MLE) for nested case-control sampling in the context of Cox's proportional hazard...

  6. Outcome-Dependent Sampling Design and Inference for Cox’s Proportional Hazards Model

    Science.gov (United States)

    Yu, Jichang; Liu, Yanyan; Cai, Jianwen; Sandler, Dale P.; Zhou, Haibo

    2016-01-01

    We propose a cost-effective outcome-dependent sampling (ODS) design for failure time data and develop an efficient inference procedure for data collected with this design. To account for the biased sampling scheme, we derive estimators from a weighted partial likelihood estimating equation. The proposed estimators for regression parameters are shown to be consistent and asymptotically normally distributed. A criterion that can be used to optimally implement the ODS design in practice is proposed and studied. The small sample performance of the proposed method is evaluated by simulation studies. The proposed design and inference procedure are shown to be statistically more powerful than existing alternative designs with the same sample sizes. We illustrate the proposed method with existing real data from the Cancer Incidence and Mortality of Uranium Miners Study. PMID:28090134

  7. Regression and local control rates after radiotherapy for jugulotympanic paragangliomas: Systematic review and meta-analysis

    International Nuclear Information System (INIS)

    Hulsteijn, Leonie T. van; Corssmit, Eleonora P.M.; Coremans, Ida E.M.; Smit, Johannes W.A.; Jansen, Jeroen C.; Dekkers, Olaf M.

    2013-01-01

    The primary treatment goal of radiotherapy for paragangliomas of the head and neck region (HNPGLs) is local control of the tumor, i.e. stabilization of tumor volume. Interestingly, regression of tumor volume has also been reported. Up to the present, no meta-analysis has been performed giving an overview of regression rates after radiotherapy in HNPGLs. The main objective was to perform a systematic review and meta-analysis to assess regression of tumor volume in HNPGL patients after radiotherapy. A second outcome was local tumor control. The study design is a systematic review and meta-analysis. PubMed, EMBASE, Web of Science, COCHRANE and Academic Search Premier and references of key articles were searched in March 2012 to identify potentially relevant studies. Considering the indolent course of HNPGLs, only studies with ⩾12 months follow-up were eligible. Main outcomes were the pooled proportions of regression and local control after radiotherapy as initial, combined (i.e. directly post-operatively or post-embolization) or salvage treatment (i.e. after initial treatment has failed) for HNPGLs. A meta-analysis was performed with an exact likelihood approach using a logistic regression with a random effect at the study level. Pooled proportions with 95% confidence intervals (CI) were reported. Fifteen studies were included, concerning a total of 283 jugulotympanic HNPGLs in 276 patients. Pooled regression proportions for initial, combined and salvage treatment were respectively 21%, 33% and 52% in radiosurgery studies and 4%, 0% and 64% in external beam radiotherapy studies. Pooled local control proportions for radiotherapy as initial, combined and salvage treatment ranged from 79% to 100%. Radiotherapy for jugulotympanic paragangliomas results in excellent local tumor control and therefore is a valuable treatment for these types of tumors. The effects of radiotherapy on regression of tumor volume remain ambiguous, although the data suggest that regression can

  8. The Effect of IFRS Convergence and the Proportion of Woman in Audit Committee on Earning Management

    Directory of Open Access Journals (Sweden)

    Siswanti Dwi Surti

    2017-01-01

    This study aims to find empirical evidence of the effect of IFRS convergence and the proportion of women on the audit committee on earnings management. The sample used in this study consists of companies listed on the Jakarta Stock Exchange for the years 2011 to 2013. Using moderated regression analysis, this study supports the hypothesis that IFRS convergence and the proportion of women on the audit committee have a negative effect on earnings management. This finding implies that IFRS convergence and the proportion of women on the audit committee increase the quality of financial reporting.

  9. Proportional reasoning

    DEFF Research Database (Denmark)

    Dole, Shelley; Hilton, Annette; Hilton, Geoff

    2015-01-01

    Proportional reasoning is widely acknowledged as a key to success in school mathematics, yet students’ continual difficulties with proportion-related tasks are well documented. This paper draws on a large research study that aimed to support 4th to 9th grade teachers to design and implement tasks...

  10. Applied Prevalence Ratio estimation with different Regression models: An example from a cross-national study on substance use research.

    Science.gov (United States)

    Espelt, Albert; Marí-Dell'Olmo, Marc; Penelo, Eva; Bosque-Prous, Marina

    2016-06-14

    To examine the differences between the Prevalence Ratio (PR) and the Odds Ratio (OR) in a cross-sectional study and to provide tools to calculate the PR using two statistical packages widely used in substance use research (STATA and R). We used cross-sectional data from 41,263 participants of 16 European countries participating in the Survey on Health, Ageing and Retirement in Europe (SHARE). The dependent variable, hazardous drinking, was calculated using the Alcohol Use Disorders Identification Test - Consumption (AUDIT-C). The main independent variable was gender. Other variables used were: age, educational level and country of residence. The PR of hazardous drinking in men relative to women was estimated using the Mantel-Haenszel method, log-binomial regression models and Poisson regression models with robust variance. These estimates were compared to the OR calculated using logistic regression models. The prevalence of hazardous drinkers varied among countries. Generally, men have a higher prevalence of hazardous drinking than women [PR=1.43 (1.38-1.47)]. The estimated PR was identical irrespective of the method and the statistical package used. However, the OR overestimated the PR, depending on the prevalence of hazardous drinking in the country. In cross-sectional studies, where comparisons between countries with differences in the prevalence of the disease or condition are made, it is advisable to use the PR instead of the OR.
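
    The divergence between the two measures is easy to see with a toy 2x2 table: when the outcome is common, the OR moves away from the PR. The snippet below computes both from hypothetical counts (the numbers are made up and are not the SHARE data); in practice the PR itself would come from a log-binomial or robust-variance Poisson model as described above.

        def pr_and_or(exposed_cases, exposed_total, unexposed_cases, unexposed_total):
            """Prevalence ratio and odds ratio from a 2x2 table."""
            p1 = exposed_cases / exposed_total
            p0 = unexposed_cases / unexposed_total
            prevalence_ratio = p1 / p0
            odds_ratio = (p1 / (1.0 - p1)) / (p0 / (1.0 - p0))
            return prevalence_ratio, odds_ratio

        # Hypothetical country with common hazardous drinking: 40% in men, 28% in women.
        pr, odds = pr_and_or(400, 1000, 280, 1000)
        print(f"PR = {pr:.2f}, OR = {odds:.2f}")   # PR ~ 1.43, OR ~ 1.71: the OR overestimates the PR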

  11. Butt rot defect and potential hazard in lodgepole pine on selected California recreational areas

    Science.gov (United States)

    Lee A. Paine

    1966-01-01

    Within the area sampled, potentially hazardous lodgepole pine were common on recreational sites. The incidence of decayed and mechanically weak trees was correlated with fire damage. Two-thirds of fire-scarred trees were decayed; one-third were rated potentially hazardous. Fire scars occurred roughly in proportion to level of plot recreational use.

  12. Experiences of frontline nursing staff on workplace safety and occupational health hazards in two psychiatric hospitals in Ghana.

    Science.gov (United States)

    Alhassan, Robert Kaba; Poku, Kwabena Adu

    2018-06-06

    Psychiatric hospitals need safe working environments to promote productivity at the workplace. Even though occupational health and safety is not completely new to corporate society, its scope is largely limited to the manufacturing/processing industries, which are perceived to pose greater dangers to workers than the health sector. This paper sought to explore the experiences of frontline nursing personnel regarding the occupational health and safety conditions in two psychiatric hospitals in Ghana. This is an exploratory cross-sectional study among 296 nurses and nurse-assistants in Accra (n = 164) and Pantang (n = 132) psychiatric hospitals using the proportional stratified random sampling technique. A multivariate ordinary least squares (OLS) regression analysis was conducted to ascertain the determinants of staff exposure to occupational health hazards and the frequency of exposure to these occupational health hazards on a daily basis. Knowledge levels on occupational health hazards were high in Accra and Pantang psychiatric hospitals (92% and 81%, respectively), but barely 44% of the 296 interviewed staff in the two hospitals said they reported their most recent exposure to an occupational health hazard to hospital management. It was found that staff who worked for more years on the ward had a higher likelihood of exposure to occupational health hazards than those who worked for fewer years (p = 0.002). The most frequently reported category of occupational health hazards was physical hazards. Psychosocial hazards were the least reported health hazards. Frequency of exposure to occupational health hazards on a daily basis was positively associated with staff work schedules, particularly for staff on a routine day schedule (Coef = 4.49, p = 0.011) and those who alternated between day and night schedules (Coef = 4.48, p = 0.010). Occupational health and safety conditions in the two hospitals were found to be generally poor. Even though majority of

  13. Intermediate and advanced topics in multilevel logistic regression analysis.

    Science.gov (United States)

    Austin, Peter C; Merlo, Juan

    2017-09-10

    Multilevel data occur frequently in health services, population and public health, and epidemiologic research. In such research, binary outcomes are common. Multilevel logistic regression models allow one to account for the clustering of subjects within clusters of higher-level units when estimating the effect of subject and cluster characteristics on subject outcomes. A search of the PubMed database demonstrated that the use of multilevel or hierarchical regression models is increasing rapidly. However, our impression is that many analysts simply use multilevel regression models to account for the nuisance of within-cluster homogeneity that is induced by clustering. In this article, we describe a suite of analyses that can complement the fitting of multilevel logistic regression models. These ancillary analyses permit analysts to estimate the marginal or population-average effect of covariates measured at the subject and cluster level, in contrast to the within-cluster or cluster-specific effects arising from the original multilevel logistic regression model. We describe the interval odds ratio and the proportion of opposed odds ratios, which are summary measures of effect for cluster-level covariates. We describe the variance partition coefficient and the median odds ratio, which are measures of components of variance and heterogeneity in outcomes. These measures allow one to quantify the magnitude of the general contextual effect. We describe an R² measure that allows analysts to quantify the proportion of variation explained by different multilevel logistic regression models. We illustrate the application and interpretation of these measures by analyzing mortality in patients hospitalized with a diagnosis of acute myocardial infarction. © 2017 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd.
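
    Two of the quantities named above have simple closed forms once the between-cluster variance on the log-odds scale is known: the variance partition coefficient uses the latent-variable residual variance pi^2/3, and the median odds ratio is exp(sqrt(2*sigma_u^2) * Phi^{-1}(0.75)). The sketch below evaluates both for a hypothetical variance value; in practice the variance would come from a fitted multilevel logistic model.

        import numpy as np
        from scipy.stats import norm

        def variance_partition_coefficient(sigma2_u):
            """VPC for a two-level logistic model on the latent-response scale."""
            return sigma2_u / (sigma2_u + np.pi ** 2 / 3.0)

        def median_odds_ratio(sigma2_u):
            """MOR: median odds ratio between two subjects from randomly chosen clusters."""
            return np.exp(np.sqrt(2.0 * sigma2_u) * norm.ppf(0.75))

        sigma2_u = 0.35   # hypothetical between-cluster variance on the log-odds scale
        print(f"VPC = {variance_partition_coefficient(sigma2_u):.3f}, "
              f"MOR = {median_odds_ratio(sigma2_u):.2f}")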

  14. Principles of proportional recovery after stroke generalize to neglect and aphasia.

    Science.gov (United States)

    Marchi, N A; Ptak, R; Di Pietro, M; Schnider, A; Guggisberg, A G

    2017-08-01

    Motor recovery after stroke can be characterized into two different patterns. A majority of patients recover about 70% of their initial impairment, whereas some patients with severe initial deficits show little or no improvement. Here, we investigated whether recovery from visuospatial neglect and aphasia is also separated into two different groups and whether similar proportions of recovery can be expected for the two cognitive functions. We assessed 35 patients with neglect and 14 patients with aphasia at 3 weeks and 3 months after stroke using standardized tests. Recovery patterns were classified with hierarchical clustering and the proportion of recovery was estimated from initial impairment using a linear regression analysis. Patients were reliably clustered into two different groups. For patients in the first cluster (n = 40), recovery followed a linear model where improvement was proportional to initial impairment and achieved 71% of the maximal possible recovery for both cognitive deficits. Patients in the second cluster (n = 9) exhibited poor recovery. Recovery from neglect and aphasia after stroke thus shows the same dichotomy and proportionality as observed in motor recovery. This is suggestive of common underlying principles of plasticity, which apply to motor and cognitive functions. © 2017 EAN.

  15. Testing homogeneity in Weibull-regression models.

    Science.gov (United States)

    Bolfarine, Heleno; Valença, Dione M

    2005-10-01

    In survival studies with families or geographical units it may be of interest to test whether such groups are homogeneous for given explanatory variables. In this paper we consider score-type tests for group homogeneity based on a mixing model in which the group effect is modelled as a random variable. As opposed to hazard-based frailty models, this model presents survival times that, conditioned on the random effect, have an accelerated failure time representation. The test statistic requires only estimation of the conventional regression model without the random effect and does not require specifying the distribution of the random effect. The tests are derived for a Weibull regression model and, in the uncensored situation, a closed form is obtained for the test statistic. A simulation study is used to compare the power of the tests. The proposed tests are applied to real data sets with censored data.

  16. Efficient approximate k-fold and leave-one-out cross-validation for ridge regression

    NARCIS (Netherlands)

    Meijer, R.J.; Goeman, J.J.

    2013-01-01

    In model building and model evaluation, cross-validation is a frequently used resampling method. Unfortunately, this method can be quite time consuming. In this article, we discuss an approximation method that is much faster and can be used in generalized linear models and Cox' proportional hazards
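
    The flavour of such shortcuts is easiest to see in the plain linear ridge case, where the leave-one-out residual has an exact closed form e_i / (1 - h_ii) based on the hat (smoother) matrix, so no refitting is needed; the cited article develops approximations of this kind for generalized linear models and Cox models. The NumPy sketch below shows only the exact linear special case on simulated data, not the authors' general method.

        import numpy as np

        def loo_mse_ridge(X, y, lam):
            """Exact leave-one-out mean squared error for ridge regression without refitting,
            using the linear-smoother shortcut e_i / (1 - h_ii)."""
            n, p = X.shape
            H = X @ np.linalg.solve(X.T @ X + lam * np.eye(p), X.T)   # hat (smoother) matrix
            resid = y - H @ y
            loo_resid = resid / (1.0 - np.diag(H))
            return np.mean(loo_resid ** 2)

        rng = np.random.default_rng(7)
        X = rng.normal(size=(100, 10))
        y = X @ rng.normal(size=10) + rng.normal(size=100)
        for lam in (0.1, 1.0, 10.0):
            print(f"lambda={lam:5.1f}  LOO MSE={loo_mse_ridge(X, y, lam):.4f}")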

  17. A Bayesian sampling strategy for hazardous waste site characterization

    International Nuclear Information System (INIS)

    Skalski, J.R.

    1987-12-01

    Prior knowledge based on historical records or physical evidence often suggests the existence of a hazardous waste site. Initial surveys may provide additional or even conflicting evidence of site contamination. This article presents a Bayes sampling strategy that allocates sampling at a site using this prior knowledge. This sampling strategy minimizes the environmental risks of missing chemical or radionuclide hot spots at a waste site. The environmental risk is shown to be proportional to the size of the undetected hot spot or inversely proportional to the probability of hot spot detection. 12 refs., 2 figs

  18. Beta-binomial regression and bimodal utilization.

    Science.gov (United States)

    Liu, Chuan-Fen; Burgess, James F; Manning, Willard G; Maciejewski, Matthew L

    2013-10-01

    To illustrate how the analysis of bimodal U-shaped distributed utilization can be modeled with beta-binomial regression, which is rarely used in health services research. Veterans Affairs (VA) administrative data and Medicare claims in 2001-2004 for 11,123 Medicare-eligible VA primary care users in 2000. We compared means and distributions of VA reliance (the proportion of all VA/Medicare primary care visits occurring in VA) predicted from beta-binomial, binomial, and ordinary least-squares (OLS) models. Beta-binomial model fits the bimodal distribution of VA reliance better than binomial and OLS models due to the nondependence on normality and the greater flexibility in shape parameters. Increased awareness of beta-binomial regression may help analysts apply appropriate methods to outcomes with bimodal or U-shaped distributions. © Health Research and Educational Trust.

  19. Health hazards related to Soba sewage treatment plant, Sudan

    Directory of Open Access Journals (Sweden)

    Rasha Osman Abdelwahab Abdelmoneim

    2017-12-01

    The aim of this study was to determine the health hazards experienced by residents living near the Soba sewage treatment plant. A descriptive cross-sectional study was carried out in Soba locality, Khartoum, Sudan. An interviewer-administered questionnaire was given to 462 residents of the area living in four geographically distributed squares around the sewage plant. The data were analyzed in SPSS; Cronbach's alpha reliability scale of measurement was used to check the internal validity of six variables related to quality of life. A logistic regression analysis was used to assess the relationship between the health hazards and quality of life. Among the 462 residents, difficulty in breathing (37.9%) and nausea (37.2%) were the principal health hazards. Moreover, the residents had a satisfactory level of awareness (88.7%) about the health hazards. The utmost impact on quality of life was psychological (97.2%). It was statistically correlated with the reported factors which impacted quality of life in the district, as revealed by the Cronbach's alpha reliability test, with absenteeism (P=0.026), disability (P=0.014), socialization (P=0.032) and death (P=0.016). A logistic regression analysis revealed that chemical hazards had a statistically significant association (P<0.05) with the quality of life of the residents of Soba district. The study strongly suggests that sewage treatment plants require exceptional consideration from the responsible authorities, and that the resulting health threats should be confronted with immense responsibility as soon as possible.

  20. The costs of hazardous alcohol consumption in Germany.

    Science.gov (United States)

    Effertz, Tobias; Verheyen, Frank; Linder, Roland

    2017-07-01

    Hazardous alcohol consumption in Germany is a major threat to health. Using insurance claims data from the German Statutory Health Insurance and a classification strategy based on ICD-10 diagnosis codes, we analyzed a sample of 146,000 subjects with more than 19,000 hazardous alcohol consumers. Employing different regression models with a control function approach, we calculate life years lost due to alcohol consumption, annual direct and indirect health costs, and the burden of pain and suffering measured by the Charlson Index and assessed pain diagnoses. Additionally, we simulate the net accumulated premium payments over expenses in the German Statutory Health Insurance and the Statutory Pension Fund for hazardous alcohol consumers from a lifecycle perspective. In total, €39.3 billion each year result from hazardous alcohol consumption, with an average loss of 7 years in life expectancy. Hazardous alcohol consumers clearly do not "pay their way" in the two main German social security systems and also display a higher intangible burden according to our definitions of pain and suffering.

  1. Social Markers of Mild Cognitive Impairment: Proportion of Word Counts in Free Conversational Speech.

    Science.gov (United States)

    Dodge, Hiroko H; Mattek, Nora; Gregor, Mattie; Bowman, Molly; Seelye, Adriana; Ybarra, Oscar; Asgari, Meysam; Kaye, Jeffrey A

    2015-01-01

    Detecting early signs of Alzheimer's disease (AD) and mild cognitive impairment (MCI) during the pre-symptomatic phase is becoming increasingly important for cost-effective clinical trials and also for deriving maximum benefit from currently available treatment strategies. However, distinguishing early signs of MCI from normal cognitive aging is difficult. Biomarkers have been extensively examined as early indicators of the pathological process for AD, but assessing these biomarkers is expensive and challenging to apply widely among pre-symptomatic community-dwelling older adults. Here we propose assessment of social markers, which could provide an alternative or complementary and ecologically valid strategy for identifying the pre-symptomatic phase leading to MCI and AD. The data came from a larger randomized controlled clinical trial (RCT), where we examined whether daily conversational interactions using remote video telecommunications software could improve cognitive functions of older adult participants. We assessed the proportion of words generated by participants out of the total words produced by both participants and staff interviewers, using transcribed conversations during the intervention trial, as an indicator of how two people (participants and interviewers) interact with each other in one-on-one conversations. We examined whether the proportion differed between those with intact cognition and MCI, using, first, generalized estimating equations with the proportion as the outcome and, second, logistic regression models with cognitive status as the outcome in order to estimate the area under the ROC curve (ROC AUC). Compared to those with normal cognitive function, MCI participants generated a greater proportion of words out of the total number of words during the timed conversation sessions (p=0.01). This difference remained after controlling for participant age, gender, interviewer and time of assessment (p=0.03). The logistic regression models showed the ROC AUC of

  2. Are increases in cigarette taxation regressive?

    Science.gov (United States)

    Borren, P; Sutton, M

    1992-12-01

    Using the latest published data from Tobacco Advisory Council surveys, this paper re-evaluates the question of whether or not increases in cigarette taxation are regressive in the United Kingdom. The extended data set shows no evidence of increasing price-elasticity by social class as found in a major previous study. To the contrary, there appears to be no clear pattern in the price responsiveness of smoking behaviour across different social classes. Increases in cigarette taxation, while reducing smoking levels in all groups, fall most heavily on men and women in the lowest social class. Men and women in social class five can expect to pay eight and eleven times more of a tax increase respectively, than their social class one counterparts. Taken as a proportion of relative incomes, the regressive nature of increases in cigarette taxation is even more pronounced.

  3. Early post-acute stroke seizures: Clinical profile and outcome in a ...

    African Journals Online (AJOL)

    ... time to in-hospital death in EPASS was tested with logistic regression and Cox proportional hazards regression. ...

  4. A clinical trial design using the concept of proportional time using the generalized gamma ratio distribution.

    Science.gov (United States)

    Phadnis, Milind A; Wetmore, James B; Mayo, Matthew S

    2017-11-20

    Traditional methods of sample size and power calculations in clinical trials with a time-to-event end point are based on the logrank test (and its variations), Cox proportional hazards (PH) assumption, or comparison of means of 2 exponential distributions. Of these, sample size calculation based on PH assumption is likely the most common and allows adjusting for the effect of one or more covariates. However, when designing a trial, there are situations when the assumption of PH may not be appropriate. Additionally, when it is known that there is a rapid decline in the survival curve for a control group, such as from previously conducted observational studies, a design based on the PH assumption may confer only a minor statistical improvement for the treatment group that is neither clinically nor practically meaningful. For such scenarios, a clinical trial design that focuses on improvement in patient longevity is proposed, based on the concept of proportional time using the generalized gamma ratio distribution. Simulations are conducted to evaluate the performance of the proportional time method and to identify the situations in which such a design will be beneficial as compared to the standard design using a PH assumption, piecewise exponential hazards assumption, and specific cases of a cure rate model. A practical example in which hemorrhagic stroke patients are randomized to 1 of 2 arms in a putative clinical trial demonstrates the usefulness of this approach by drastically reducing the number of patients needed for study enrollment. Copyright © 2017 John Wiley & Sons, Ltd.
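
    The proportional time idea can be illustrated directly: under an accelerated failure time interpretation, every quantile of the treatment-arm survival distribution is the corresponding control-arm quantile multiplied by a constant factor. The SciPy sketch below simulates generalized gamma survival times and checks this for the median and the 90th percentile; the shape and scale values and the factor k are arbitrary assumptions, and the paper's actual sample-size machinery based on the generalized gamma ratio distribution is not reproduced.

        import numpy as np
        from scipy.stats import gengamma

        rng = np.random.default_rng(8)
        a, c, scale = 2.0, 1.5, 12.0     # hypothetical generalized gamma parameters (control arm)
        k = 1.4                          # proportional-time (acceleration) factor for treatment

        control = gengamma.rvs(a, c, scale=scale, size=20000, random_state=rng)
        treated = gengamma.rvs(a, c, scale=k * scale, size=20000, random_state=rng)

        # Under proportional time every quantile of the treated arm is k times the control quantile.
        print("median ratio:  ", round(np.median(treated) / np.median(control), 3))
        print("90th pct ratio:", round(np.quantile(treated, 0.9) / np.quantile(control, 0.9), 3))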

  5. Hazardous alcohol consumption is a major factor in male premature mortality in a typical Russian city: prospective cohort study 2003-2009.

    Directory of Open Access Journals (Sweden)

    Susannah Tomkins

    Russia has experienced massive fluctuations in mortality at working ages over the past three decades. Routine data analyses suggest that these are largely driven by fluctuations in heavy alcohol drinking. However, individual-level evidence supporting alcohol having a major role in Russian mortality comes from only two case-control studies, which could be subject to serious biases due to their design. A prospective study of mortality (2003-9) of 2000 men aged 25-54 years at recruitment was conducted in the city of Izhevsk, Russia. This cohort was free from key limitations inherent in the design of the two earlier case-control studies. Cox proportional hazards regression was used to estimate hazard ratios of all-cause mortality by alcohol drinking type as reported by a proxy informant. Hazardous drinkers were defined as those who either drank non-beverage alcohols or were reported to regularly have hangovers or other behaviours related to heavy drinking episodes. Over the follow-up period 113 men died. Compared to non-hazardous drinkers and abstainers, men who drank hazardously had appreciably higher mortality (HR = 3.4, 95% CI 2.2, 5.1), adjusted for age, smoking and education. The population attributable risk percent (PAR%) for hazardous drinking was 26% (95% CI 14, 37). However, larger effects were seen in the first two years of follow-up, with a HR of 4.6 (2.5, 8.2) and a corresponding PAR% of 37% (17, 51). This prospective cohort study strengthens the evidence that hazardous alcohol consumption has been a major determinant of mortality among working age men in a typical Russian city. As such the similar findings of the previous case-control studies cannot be explained as artefacts of limitations of their design. As Russia struggles to raise life expectancy, which even in 2009 was only 62 years among men, control of hazardous drinking must remain a top public health priority.

  6. Environmental justice implications of industrial hazardous waste generation in India: a national scale analysis

    Science.gov (United States)

    Basu, Pratyusha; Chakraborty, Jayajit

    2016-12-01

    While rising air and water pollution have become issues of widespread public concern in India, the relationship between spatial distribution of environmental pollution and social disadvantage has received less attention. This lack of attention becomes particularly relevant in the context of industrial pollution, as India continues to pursue industrial development policies without sufficient regard to its adverse social impacts. This letter examines industrial pollution in India from an environmental justice (EJ) perspective by presenting a national scale study of social inequities in the distribution of industrial hazardous waste generation. Our analysis connects district-level data from the 2009 National Inventory of Hazardous Waste Generating Industries with variables representing urbanization, social disadvantage, and socioeconomic status from the 2011 Census of India. Our results indicate that more urbanized and densely populated districts with a higher proportion of socially and economically disadvantaged residents are significantly more likely to generate hazardous waste. The quantity of hazardous waste generated is significantly higher in more urbanized but sparsely populated districts with a higher proportion of economically disadvantaged households, after accounting for other relevant explanatory factors such as literacy and social disadvantage. These findings underscore the growing need to incorporate EJ considerations in future industrial development and waste management in India.

  7. Proportion and factors associated with recent HIV infection in a cohort of patients seen for care in Italy over 1996-2014: Data from the ICONA Foundation Study cohort.

    Directory of Open Access Journals (Sweden)

    Silvia Nozza

    Full Text Available In Italy the prevalence of recent HIV infection (RHI isn't currently monitored. Early diagnosis is crucial to allow introduction of antiretroviral therapy (cART in the recent phase of infection. We aimed to estimate the proportion and the determinants of RHI among patients enrolled in the ICONA cohort; we explored differences in the median time from HIV diagnosis to cART initiation and in the viro-immunological response between RHI and Less Recent HIV infections (NRHI. We included antiretroviral-naïve HIV-positive patients enrolled in the cohort with documented dates of HIV-negative and positive antibodies tests, grouped in RHI (estimated date of seroconversion within 12 months of enrolment and NRHI. Proportion of RHI and the trend of this proportion by calendar period (1996-2014 were investigated (Chi-square test. Logistic regression analysis was employed to identify factors associated with RHI. The time from seroconversion to cART initiation was compared in RHI and NRHI overall and after stratification by calendar period (survival analysis. We finally explored the time from starting cART to HIV-RNA <50 copies/mL and to CD4+ gain ≥200 cells/mmc by Cox regression. HIV seroconversion could be estimated for 2608/12,616 patients: 981/2608 (37.6% were RHI. Proportion of RHI increased in recent calendar periods and was associated with younger age, baseline higher HIV-RNA and CD4+ count. There wasn't difference in the 2-year estimates of cART start between RHI and NRHI, regardless of calendar period. Rates and hazards of virological response were similar in RHI versus NRHI. RHI showed a 1.5-fold higher probability of CD4+ gain, also following adjustment for calendar period and cART regimen, and for age, HCV and smoking; the difference in probability was however attenuated after further controlling for baseline HIV-RNA and CD4+ T-cells. The increased proportion of RHI over time suggests that in recent years in Italy HIV infections are more likely to

  8. Migration, environmental hazards, and health outcomes in China.

    Science.gov (United States)

    Chen, Juan; Chen, Shuo; Landry, Pierre F

    2013-03-01

    China's rapid economic growth has had a serious impact on the environment. Environmental hazards are major sources of health risk factors. The migration of over 200 million people to heavily polluted urban areas is likely to be significantly detrimental to health. Based on data from the 2009 national household survey "Chinese Attitudes toward Inequality and Distributive Injustice" (N = 2866) and various county-level and municipal indicators, we investigate the disparities in subjective exposure to environmental hazards and associated health outcomes in China. This study focuses particularly on migration-residency status and county-level socio-economic development. We employ multiple regressions that account for the complex multi-stage survey design to assess the associations between perceived environmental hazards and individual and county-level indicators and between perceived environmental hazards and health outcomes, controlling for physical and social environments at multiple levels. We find that perceived environmental hazards are associated with county-level industrialization and economic development: respondents living in more industrialized counties report greater exposure to environmental hazards. Rural-to-urban migrants are exposed to more water pollution and a higher measure of overall environmental hazard. Perceived environmental risk factors severely affect the physical and mental health of the respondents. The negative effects of perceived overall environmental hazard on physical health are more detrimental for rural-to-urban migrants than for urban residents. The research findings call for restructuring the household registration system in order to equalize access to public services and mitigate adverse environmental health effects, particularly among the migrant population. Copyright © 2012 Elsevier Ltd. All rights reserved.

  9. Is the Proportion of Carbohydrate Intake Associated with the Incidence of Diabetes Complications?—An Analysis of the Japan Diabetes Complications Study

    Directory of Open Access Journals (Sweden)

    Chika Horikawa

    2017-02-01

    Full Text Available The appropriate proportions of macronutritional intake have been controversial in medical nutritional therapy for diabetes, and evidence of the effects of carbohydrate consumption on diabetes complications in prospective settings is sparse. We investigated the relationships between proportions of carbohydrate intake as the % of total energy and diabetes complications in a nationwide cohort of Japanese patients with type 2 diabetes aged 40–70 years with hemoglobin A1c ≥6.5%. The analysis was of 1516 responders to a baseline dietary survey assessed by the Food Frequency Questionnaire based on food groups. Primary outcomes were times to overt nephropathy, diabetic retinopathy, and cardiovascular disease (CVD) after 8 years. Hazard ratios (HRs) for proportions of carbohydrate intake were estimated by Cox regression adjusted for confounders. High carbohydrate intake was significantly related to higher intakes of grain, fruits, and sweets/snacks and lower intakes of soybean and soy products, vegetables, seaweed, meat and processed meat, fish and processed fish, eggs, milk and dairy products, oil, and alcoholic beverages. During the eight-year follow-up, there were 81, 275, and 129 events of overt nephropathy, diabetic retinopathy, and CVD, respectively. After adjustment for confounders, HRs for complications in patients with carbohydrate intake in the second or third tertiles (51.0%–56.4% and ≥56.5%, respectively) compared with carbohydrate intake in the first tertile (<50.9%, referent) were analyzed. No significant associations were shown in the second and third tertiles relative to the first tertile (overt nephropathy: 1.05 (95% Confidence Interval, 0.54–2.06) and 0.98 (0.40–2.44); diabetic retinopathy: 1.30 (0.90–1.88) and 1.30 (0.78–2.15); and CVD: 0.95 (0.55–1.63) and 1.37 (0.69–2.72)). By exploring potentially nonlinear relationships, trends for the incidence of diabetes complications according to proportions of carbohydrate intake were not
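
    The tertile-based Cox analysis above (HRs for the second and third tertiles versus a first-tertile referent) can be sketched as follows; the data, column names and cut points are synthetic stand-ins rather than the JDCS data, and lifelines is just one convenient implementation:

        import numpy as np
        import pandas as pd
        from lifelines import CoxPHFitter

        rng = np.random.default_rng(1)
        n = 1500
        carb_pct = rng.normal(54, 6, n)                       # carbohydrate intake, % of total energy
        time = rng.exponential(12, n)                         # synthetic years to complication
        event = (time < 8).astype(int)                        # 8-year administrative censoring
        time = np.minimum(time, 8)

        df = pd.DataFrame({"time": time, "event": event, "carb_pct": carb_pct,
                           "age": rng.normal(58, 8, n), "hba1c": rng.normal(7.8, 1.0, n)})

        # Tertiles of carbohydrate intake, first tertile as the referent
        df["tertile"] = pd.qcut(df["carb_pct"], 3, labels=["T1", "T2", "T3"])
        dummies = pd.get_dummies(df["tertile"], prefix="carb", drop_first=True).astype(float)
        model_df = pd.concat([df[["time", "event", "age", "hba1c"]], dummies], axis=1)

        cph = CoxPHFitter()
        cph.fit(model_df, duration_col="time", event_col="event")
        print(np.exp(cph.params_))          # HRs for carb_T2 and carb_T3 versus the T1 referent
        print(cph.confidence_intervals_)    # confidence bounds on the log-hazard-ratio scale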

  10. A review and comparison of Bayesian and likelihood-based inferences in beta regression and zero-or-one-inflated beta regression.

    Science.gov (United States)

    Liu, Fang; Eugenio, Evercita C

    2018-04-01

    Beta regression is an increasingly popular statistical technique in medical research for modeling of outcomes that assume values in (0, 1), such as proportions and patient reported outcomes. When outcomes take values in the intervals [0,1), (0,1], or [0,1], zero-or-one-inflated beta (zoib) regression can be used. We provide a thorough review on beta regression and zoib regression in the modeling, inferential, and computational aspects via the likelihood-based and Bayesian approaches. We demonstrate the statistical and practical importance of correctly modeling the inflation at zero/one rather than ad hoc replacing them with values close to zero/one via simulation studies; the latter approach can lead to biased estimates and invalid inferences. We show via simulation studies that the likelihood-based approach is computationally faster in general than MCMC algorithms used in the Bayesian inferences, but runs the risk of non-convergence, large biases, and sensitivity to starting values in the optimization algorithm especially with clustered/correlated data, data with sparse inflation at zero and one, and data that warrant regularization of the likelihood. The disadvantages of the regular likelihood-based approach make the Bayesian approach an attractive alternative in these cases. Software packages and tools for fitting beta and zoib regressions in both the likelihood-based and Bayesian frameworks are also reviewed.
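
    As a rough illustration of the beta regression being reviewed (logit link for the mean, a common precision parameter), the likelihood-based fit can be written directly against the beta log-density. This sketch handles only outcomes strictly inside (0, 1) and omits the zero/one inflation components discussed in the paper:

        import numpy as np
        from scipy.optimize import minimize
        from scipy.special import expit
        from scipy.stats import beta as beta_dist

        rng = np.random.default_rng(2)
        n = 400
        X = np.column_stack([np.ones(n), rng.normal(size=n)])     # intercept + one covariate
        true_coef, true_phi = np.array([0.3, 0.8]), 15.0
        mu = expit(X @ true_coef)
        y = rng.beta(mu * true_phi, (1 - mu) * true_phi)          # outcome strictly inside (0, 1)

        def neg_loglik(params):
            # params = regression coefficients for the logit of the mean, plus log(precision)
            coef, log_phi = params[:-1], params[-1]
            m, phi = expit(X @ coef), np.exp(log_phi)
            return -np.sum(beta_dist.logpdf(y, m * phi, (1 - m) * phi))

        res = minimize(neg_loglik, x0=np.zeros(X.shape[1] + 1), method="BFGS")
        print("coefficients:", res.x[:-1], "precision:", np.exp(res.x[-1]))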

  11. An estimating equation for parametric shared frailty models with marginal additive hazards

    DEFF Research Database (Denmark)

    Pipper, Christian Bressen; Martinussen, Torben

    2004-01-01

    Multivariate failure time data arise when data consist of clusters in which the failure times may be dependent. A popular approach to such data is the marginal proportional hazards model with estimation under the working independence assumption. In some contexts, however, it may be more reasonable...

  12. Differentiating regressed melanoma from regressed lichenoid keratosis.

    Science.gov (United States)

    Chan, Aegean H; Shulman, Kenneth J; Lee, Bonnie A

    2017-04-01

    Distinguishing regressed lichen planus-like keratosis (LPLK) from regressed melanoma can be difficult on histopathologic examination, potentially resulting in mismanagement of patients. We aimed to identify histopathologic features by which regressed melanoma can be differentiated from regressed LPLK. Twenty actively inflamed LPLK, 12 LPLK with regression and 15 melanomas with regression were compared and evaluated by hematoxylin and eosin staining as well as Melan-A, microphthalmia transcription factor (MiTF) and cytokeratin (AE1/AE3) immunostaining. (1) A total of 40% of regressed melanomas showed complete or near complete loss of melanocytes within the epidermis with Melan-A and MiTF immunostaining, while 8% of regressed LPLK exhibited this finding. (2) Necrotic keratinocytes were seen in the epidermis in 33% of regressed melanomas as opposed to all of the regressed LPLK. (3) A dense infiltrate of melanophages in the papillary dermis was seen in 40% of regressed melanomas, a feature not seen in regressed LPLK. In summary, our findings suggest that a complete or near complete loss of melanocytes within the epidermis strongly favors a regressed melanoma over a regressed LPLK. In addition, necrotic epidermal keratinocytes and the presence of a dense band-like distribution of dermal melanophages can be helpful in differentiating these lesions. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  13. Molten salt hazardous waste disposal process utilizing gas/liquid contact for salt recovery

    International Nuclear Information System (INIS)

    Grantham, L.F.; McKenzie, D.E.

    1984-01-01

    The products of a molten salt combustion of hazardous wastes are converted into a cooled gas, which can be filtered to remove hazardous particulate material, and a dry flowable mixture of salts, which can be recycled for use in the molten salt combustion, by means of gas/liquid contact between the gaseous products of combustion of the hazardous waste and a solution produced by quenching the spent melt from such molten salt combustion. The process results in maximizing the proportion of useful materials recovered from the molten salt combustion and minimizing the volume of material which must be discarded. In a preferred embodiment a spray dryer treatment is used to achieve the desired gas/liquid contact

  14. Proportionality for Military Leaders

    National Research Council Canada - National Science Library

    Brown, Gary D

    2000-01-01

    .... Especially lacking is a realization that there are four distinct types of proportionality. In determining whether a particular resort to war is just, national leaders must consider the proportionality of the conflict (i.e...

  15. Use of generalized ordered logistic regression for the analysis of multidrug resistance data.

    Science.gov (United States)

    Agga, Getahun E; Scott, H Morgan

    2015-10-01

    Statistical analysis of antimicrobial resistance data largely focuses on an individual antimicrobial's binary outcome (susceptible or resistant). However, bacteria are becoming increasingly multidrug resistant (MDR). Statistical analysis of MDR data is mostly descriptive, often with tabular or graphical presentations. Here we report the applicability of the generalized ordinal logistic regression model for the analysis of MDR data. A total of 1,152 Escherichia coli, isolated from the feces of weaned pigs experimentally supplemented with chlortetracycline (CTC) and copper, were tested for susceptibilities against 15 antimicrobials and were classified as resistant or susceptible (binary outcome). The 15 antimicrobial agents tested were grouped into eight different antimicrobial classes. We defined MDR as the number of antimicrobial classes to which E. coli isolates were resistant, ranging from 0 to 8. The proportional odds assumption of the ordinal logistic regression model was violated only for the effect of treatment period (pre-treatment, during-treatment and post-treatment), but not for the effect of CTC or copper supplementation. Subsequently, a partially constrained generalized ordinal logistic model was built that allows for the effect of treatment period to vary while constraining the effects of treatment (CTC and copper supplementation) to be constant across the levels of MDR classes. Copper (Proportional Odds Ratio [Prop OR]=1.03; 95% CI=0.73-1.47) and CTC (Prop OR=1.1; 95% CI=0.78-1.56) supplementation were not significantly associated with the level of MDR adjusted for the effect of treatment period. MDR generally declined over the trial period. In conclusion, generalized ordered logistic regression can be used for the analysis of ordinal data such as MDR data when the proportionality assumptions for ordered logistic regression are violated. Published by Elsevier B.V.
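
    The partially constrained generalized ordinal model described here is usually fitted with specialized software (e.g. gologit2 in Stata or VGAM in R). A quick way to see where the proportional odds assumption breaks down, sketched below on synthetic data with hypothetical variable names, is to fit one binary logit per cumulative split of the MDR score and compare the coefficients across splits:

        import numpy as np
        import pandas as pd
        import statsmodels.api as sm

        rng = np.random.default_rng(3)
        n = 1152
        period = rng.integers(0, 3, n)                 # 0 = pre-, 1 = during-, 2 = post-treatment
        ctc = rng.integers(0, 2, n)                    # chlortetracycline supplementation
        copper = rng.integers(0, 2, n)
        # Synthetic MDR score: number of antimicrobial classes to which an isolate is resistant (0-8)
        latent = 0.8 * (period == 1) + 0.1 * ctc + 0.05 * copper + rng.normal(0, 1, n)
        mdr = np.clip(np.round(2 + 2 * latent), 0, 8).astype(int)

        X = sm.add_constant(pd.DataFrame({"during": (period == 1).astype(float),
                                          "post": (period == 2).astype(float),
                                          "ctc": ctc.astype(float),
                                          "copper": copper.astype(float)}))

        # One binary logit per cumulative split P(MDR >= k). Roughly constant coefficients across
        # splits support proportional odds; marked variation suggests letting that covariate's
        # effect vary, i.e. a partially constrained generalized ordinal model.
        for k in range(1, 9):
            y = (mdr >= k).astype(int)
            if 0 < y.mean() < 1:                       # skip degenerate splits
                fit = sm.Logit(y, X).fit(disp=0)
                print(f"MDR >= {k}:", np.round(fit.params.drop("const"), 2).to_dict())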

  16. Time-trend of melanoma screening practice by primary care physicians: A meta-regression analysis

    OpenAIRE

    Valachis, Antonis; Mauri, Davide; Karampoiki, Vassiliki; Polyzos, Nikolaos P; Cortinovis, Ivan; Koukourakis, Georgios; Zacharias, Georgios; Xilomenos, Apostolos; Tsappi, Maria; Casazza, Giovanni

    2009-01-01

    Objective To assess whether the proportion of primary care physicians implementing full body skin examination (FBSE) to screen for melanoma changed over time. Methods Meta-regression analyses of available data. Data Sources: MEDLINE, ISI, Cochrane Central Register of Controlled Trials. Results Fifteen studies surveying 10,336 physicians were included in the analyses. Overall, 15%–82% of them reported performing FBSE to screen for melanoma. The proportion of physicians using FBSE screening ten...

  17. Proportional and scale change models to project failures of mechanical components with applications to space station

    Science.gov (United States)

    Taneja, Vidya S.

    1996-01-01

    In this paper we develop the mathematical theory of proportional and scale change models to perform reliability analysis. The results obtained will be applied to the Reaction Control System (RCS) thruster valves on an orbiter. With the advent of extended EVA's associated with PROX OPS (ISSA & MIR), and docking, the loss of a thruster valve now takes on an expanded safety significance. Previous studies assume a homogeneous population of components with each component having the same failure rate. However, as various components experience different stresses and are exposed to different environments, their failure rates change with time. In this paper we model the reliability of the thruster valves by treating these valves as a censored repairable system. The model for each valve will take the form of a nonhomogeneous process with an intensity function that is treated either as a proportional hazards model or as a scale change random effects hazard model. Each component has an associated z, an independent realization of the random variable Z from a distribution G(z). This unobserved quantity z can be used to describe heterogeneity systematically. For the various models, methods for estimating the model parameters using censored data will be developed. Available field data (from previously flown flights) are from non-renewable systems. The estimated failure rate using such data will need to be modified for renewable systems such as thruster valves.

  18. Femoral neck-shaft angle and climate-induced body proportions.

    Science.gov (United States)

    Child, Stephanie L; Cowgill, Libby W

    2017-12-01

    Declination in femoral neck-shaft angle (NSA) is commonly linked to an increased level of physical activity during life. More recently, however, research suggests that lower NSA might also be explained, in part, as the mechanical consequence of differences in ecogeographic body proportions. This study tests the proposed link between NSA and climate-induced body proportions, using relative body mass (RBM), throughout the course of development. NSA and RBM were collected for 445 immature remains from five geographic locations. NSA and RBM were standardized for age-effects. ANOVA was used to examine when population differences emerged in both NSA and RBM. Regression analyses were used to examine the pattern of relationship between NSA and RBM. Populations differ significantly in NSA and RBM before skeletal maturity, and these differences occur early in life. While both NSA and RBM change over the course of development, no significant relationship was found between NSA and RBM for any sample, or any age category (p = .244). Individuals with greater relative body mass do not necessarily have lower NSA. Population differences in NSA were found to be variable, while differences in RBM remained consistent across the developmental span. Taken together, these results suggest that regardless of body proportions, the degree of declination of NSA is presumed to be similar among individuals with similar gait and ambulatory behaviors. Conversely, populations differ in RBM from birth, and these differences are consistent throughout development. These two measures are likely responsive to differing stimuli, and any potential relationship is likely complex and multifactorial. © 2017 Wiley Periodicals, Inc.

  19. PHAZE, Parametric Hazard Function Estimation

    International Nuclear Information System (INIS)

    2002-01-01

    1 - Description of program or function: PHAZE performs statistical inference calculations on a hazard function (also called a failure rate or intensity function) based on reported failure times of components that are repaired and restored to service. Three parametric models are allowed: the exponential, linear, and Weibull hazard models. The inference includes estimation (maximum likelihood estimators and confidence regions) of the parameters and of the hazard function itself, testing of hypotheses such as increasing failure rate, and checking of the model assumptions. 2 - Methods: PHAZE assumes that the failures of a component follow a time-dependent (or non-homogeneous) Poisson process and that the failure counts in non-overlapping time intervals are independent. Implicit in the independence property is the assumption that the component is restored to service immediately after any failure, with negligible repair time. The failures of one component are assumed to be independent of those of another component; a proportional hazards model is used. Data for a component are called time censored if the component is observed for a fixed time-period, or plant records covering a fixed time-period are examined, and the failure times are recorded. The number of these failures is random. Data are called failure censored if the component is kept in service until a predetermined number of failures has occurred, at which time the component is removed from service. In this case, the number of failures is fixed, but the end of the observation period equals the final failure time and is random. A typical PHAZE session consists of reading failure data from a file prepared previously, selecting one of the three models, and performing data analysis (i.e., performing the usual statistical inference about the parameters of the model, with special emphasis on the parameter(s) that determine whether the hazard function is increasing). The final goals of the inference are a point estimate
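
    For the Weibull (power-law) hazard model mentioned above, the maximum likelihood estimators for a single component with time-censored data have a closed form. The sketch below is a generic power-law NHPP fit on simulated failure times, not PHAZE's implementation, and the parameterization h(t) = lambda*beta*t**(beta-1) is an assumption of this example:

        import numpy as np

        def powerlaw_nhpp_mle(failure_times, T):
            """MLE for a power-law (Weibull) NHPP intensity h(t) = lam * beta * t**(beta - 1),
            for one component observed over (0, T] (time-censored data)."""
            t = np.asarray(failure_times, dtype=float)
            n = t.size
            beta_hat = n / np.sum(np.log(T / t))
            lam_hat = n / T**beta_hat
            return lam_hat, beta_hat

        # Simulate failure times from an increasing intensity (beta > 1) and re-estimate.
        rng = np.random.default_rng(4)
        lam, beta, T = 0.5, 1.8, 20.0
        n_events = rng.poisson(lam * T**beta)                     # number of failures on (0, T]
        times = T * rng.uniform(size=n_events) ** (1.0 / beta)    # event times given the count
        lam_hat, beta_hat = powerlaw_nhpp_mle(times, T)
        print(f"lambda_hat = {lam_hat:.3f}, beta_hat = {beta_hat:.3f} (beta > 1 means increasing hazard)")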

  20. Human hazards

    International Nuclear Information System (INIS)

    Delpla, M.; Vignes, S.; Wolber, G.

    1976-01-01

    Among health hazards from ionizing radiations, a distinction is made between observed, likely and theoretical risks. Theoretical risks, derived by extrapolating observations of sublethal exposures down to low doses, may be frightening. However, they have nothing in common with reality, as shown, for instance, by the study of carcinogenesis risks at Nagasaki. By extrapolation to low doses, geneticists derive theoretical mutation risks from the observation of certain especially deleterious characters in the progeny of parents exposed to sublethal doses. One cannot agree when, by calculation, they express a population exposure as a shift of its genetic balance with an increase in the proportion of disabled individuals. As a matter of fact, experimental exposure of successive generations of laboratory animals shows no accumulation of deleterious genes, sublethal doses excepted. Large nuclear plants should not be burdened with excessive charges on health grounds, whereas small sources have all too often shown that they may give rise to mortal risks [fr

  1. Association between proportion of US medical graduates and program characteristics in gastroenterology fellowships.

    Science.gov (United States)

    Atsawarungruangkit, Amporn

    2017-01-01

    Gastroenterology is one of the most competitive internal medicine fellowships. However, factors associated with program competitiveness have not been documented. The objective of this study was to evaluate associations between characteristics of gastroenterology fellowship programs and their competitiveness through the proportion of US medical graduates for the academic year 2016/17. This study used a retrospective, cross-sectional design with data obtained from the American Medical Association. The proportion of US medical graduates in gastroenterology fellowships was used as an indicator of program competitiveness. Using both univariate and multivariate linear regression analyses, we analyzed the association between the proportion of medical graduates in each program and 27 program characteristics based on a significance level of 0.05. In total, 153 out of 171 gastroenterology fellowship programs satisfied the inclusion criteria. A multivariate analysis revealed that a higher proportion of US medical graduates was significantly associated with five program characteristics: being a university-based program (p < 0.001), the ratio of full-time paid faculty to fellow positions (p < 0.001), the proportion of females in the program (p = 0.002), location in the Pacific region (p = 0.039), and a non-smoker hiring policy (p = 0.042). Among the five significant factors, being university based, located in the Pacific, and having a non-smoker hiring policy were likely to remain unchanged over a long period. However, program directors and candidates should pay attention to equivalence between full-time paid faculty and fellowship positions, and the proportion of women in the program. The former indicates the level of supervision while the latter has become increasingly important owing to the higher proportion of women in medicine.

  2. Ensemble of ground subsidence hazard maps using fuzzy logic

    Science.gov (United States)

    Park, Inhye; Lee, Jiyeong; Saro, Lee

    2014-06-01

    Hazard maps of ground subsidence around abandoned underground coal mines (AUCMs) in Samcheok, Korea, were constructed using fuzzy ensemble techniques and a geographical information system (GIS). To evaluate the factors related to ground subsidence, a spatial database was constructed from topographic, geologic, mine tunnel, land use, groundwater, and ground subsidence maps. Spatial data, topography, geology, and various ground-engineering data for the subsidence area were collected and compiled in a database for mapping ground-subsidence hazard (GSH). The subsidence area was randomly split 70/30 for training and validation of the models. The relationships between the detected ground-subsidence area and the factors were identified and quantified by frequency ratio (FR), logistic regression (LR) and artificial neural network (ANN) models. The relationships were used as factor ratings in the overlay analysis to create ground-subsidence hazard indexes and maps. The three GSH maps were then used as new input factors and integrated using fuzzy-ensemble methods to make better hazard maps. All of the hazard maps were validated by comparison with known subsidence areas that were not used directly in the analysis. As a result, the ensemble model was found to be more effective in terms of prediction accuracy than the individual models.
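
    One widely used fuzzy ensemble rule for combining hazard-index layers is the fuzzy gamma operator; whether this is the exact operator used in the study is not stated here, so the sketch below should be read as a generic illustration on toy rasters rescaled to [0, 1]:

        import numpy as np

        def fuzzy_gamma(layers, gamma=0.8):
            """Fuzzy gamma operator: (fuzzy algebraic sum)**gamma * (fuzzy algebraic product)**(1 - gamma).
            Each input layer is a hazard-index raster rescaled to [0, 1]."""
            stack = np.stack(layers)                              # shape (n_layers, rows, cols)
            algebraic_product = np.prod(stack, axis=0)
            algebraic_sum = 1.0 - np.prod(1.0 - stack, axis=0)
            return algebraic_sum**gamma * algebraic_product**(1.0 - gamma)

        # Toy 3x3 rasters standing in for the frequency-ratio, logistic-regression and ANN indexes
        rng = np.random.default_rng(5)
        fr_map, lr_map, ann_map = (rng.uniform(size=(3, 3)) for _ in range(3))
        ensemble = fuzzy_gamma([fr_map, lr_map, ann_map], gamma=0.8)
        print(np.round(ensemble, 3))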

  3. Physically and psychologically hazardous jobs and mental health in Thailand.

    Science.gov (United States)

    Yiengprugsawan, Vasoontara; Strazdins, Lyndall; Lim, Lynette L-Y; Kelly, Matthew; Seubsman, Sam-ang; Sleigh, Adrian C

    2015-09-01

    This paper investigates associations between hazardous jobs, mental health and wellbeing among Thai adults. In 2005, 87 134 distance-learning students from Sukhothai Thammathirat Open University completed a self-administered questionnaire; at the 2009 follow-up 60 569 again participated. Job characteristics were reported in 2005, psychological distress and life satisfaction were reported in both 2005 and 2009. We derived two composite variables grading psychologically and physically hazardous jobs and reported adjusted odds ratios (AOR) from multivariate logistic regressions. Analyses focused on cohort members in paid work: the total was 62 332 at 2005 baseline and 41 671 at 2009 follow-up. Cross-sectional AORs linking psychologically hazardous jobs to psychological distress ranged from 1.52 (one hazard) to 4.48 (four hazards) for males and a corresponding 1.34-3.76 for females. Similarly AORs for physically hazardous jobs were 1.75 (one hazard) to 2.76 (four or more hazards) for males and 1.70-3.19 for females. A similar magnitude of associations was found between psychologically adverse jobs and low life satisfaction (AORs of 1.34-4.34 among males and 1.18-3.63 among females). Longitudinal analyses confirm these cross-sectional relationships. Thus, significant dose-response associations were found linking hazardous job exposures in 2005 to mental health and wellbeing in 2009. The health impacts of psychologically and physically hazardous jobs in developed, Western countries are equally evident in transitioning Southeast Asian countries such as Thailand. Regulation and monitoring of work conditions will become increasingly important to the health and wellbeing of the Thai workforce. © The Author 2013. Published by Oxford University Press.

  4. Physically and psychologically hazardous jobs and mental health in Thailand

    Science.gov (United States)

    Yiengprugsawan, Vasoontara; Strazdins, Lyndall; Lim, Lynette L.-Y.; Kelly, Matthew; Seubsman, Sam-ang; Sleigh, Adrian C.

    2015-01-01

    This paper investigates associations between hazardous jobs, mental health and wellbeing among Thai adults. In 2005, 87 134 distance-learning students from Sukhothai Thammathirat Open University completed a self-administered questionnaire; at the 2009 follow-up 60 569 again participated. Job characteristics were reported in 2005, psychological distress and life satisfaction were reported in both 2005 and 2009. We derived two composite variables grading psychologically and physically hazardous jobs and reported adjusted odds ratios (AOR) from multivariate logistic regressions. Analyses focused on cohort members in paid work: the total was 62 332 at 2005 baseline and 41 671 at 2009 follow-up. Cross-sectional AORs linking psychologically hazardous jobs to psychological distress ranged from 1.52 (one hazard) to 4.48 (four hazards) for males and a corresponding 1.34–3.76 for females. Similarly AORs for physically hazardous jobs were 1.75 (one hazard) to 2.76 (four or more hazards) for males and 1.70–3.19 for females. A similar magnitude of associations was found between psychologically adverse jobs and low life satisfaction (AORs of 1.34–4.34 among males and 1.18–3.63 among females). Longitudinal analyses confirm these cross-sectional relationships. Thus, significant dose–response associations were found linking hazardous job exposures in 2005 to mental health and wellbeing in 2009. The health impacts of psychologically and physically hazardous jobs in developed, Western countries are equally evident in transitioning Southeast Asian countries such as Thailand. Regulation and monitoring of work conditions will become increasingly important to the health and wellbeing of the Thai workforce. PMID:24218225

  5. Chemical incidents resulted in hazardous substances releases in the context of human health hazards

    Directory of Open Access Journals (Sweden)

    Anna Pałaszewska-Tkacz

    2017-02-01

    Full Text Available Objectives: The research purpose was to analyze data concerning chemical incidents in Poland collected in 1999–2009 in terms of health hazards. Material and Methods: The data was obtained, using a multimodal information technology (IT) system, from chemical incident reports prepared by rescuers at the scene. The final analysis covered sudden events associated with uncontrolled release of hazardous chemical substances or mixtures, which may potentially lead to human exposure. Releases of unidentified substances where emergency services took action to protect human health or the environment were also included. Results: The number of analyzed chemical incidents in 1999–2009 was 2930 with more than 200 different substances released. The substances were classified into 13 groups of substances and mixtures posing analogous risks. Most common releases were connected with non-flammable corrosive liquids, including: hydrochloric acid (199 cases), sulfuric(VI) acid (131 cases), sodium and potassium hydroxides (69 cases), ammonia solution (52 cases) and butyric acid (32 cases). The next group were gases hazardous only due to physico-chemical properties, including: extremely flammable propane-butane (249 cases) and methane (79 cases). There was no statistically significant trend associated with the total number of incidents. Only for the number of incidents with flammable corrosive, toxic and/or harmful liquids did the regression analysis reveal a statistically significant downward trend. The number of victims reported was 1997, including 1092 children and 18 fatalities. Conclusions: The number of people injured, the number of incidents, Poland's high 9th place in terms of the number of Seveso establishments, and the 4-times-higher number of hazardous industrial establishments not covered by the Seveso Directive justify the need for systematic analysis of hazards and their proper identification. It is advisable to enhance health risk assessment, both qualitative and

  6. The effects of building design on hazard of first service in Norwegian dairy cows.

    Science.gov (United States)

    Martin, A D; Kielland, C; Nelson, S T; Østerås, O

    2015-12-01

    Reproductive inefficiency is one of the major production and economic constraints on modern dairy farms. The environment affects onset of ovarian activity in a cow postcalving and influences estrus behavior, which in turn affects a stockperson's ability to inseminate her at the correct time. This study used survival analysis to investigate effects of building design and animal factors on the postpartum hazard of first service (HFS) in freestall-housed Norwegian Red cows. The study was performed on 232 Norwegian dairy farms between 2004 and 2007. Data were obtained through on-farm measurements and by accessing the Norwegian Dairy Herd Recording System. The final data set contained data on 38,436 calvings and 27,127 services. Univariate Cox proportional hazard analyses showed that herd size and milk yield were positively associated with HFS. Total free accessible area and free accessible area available per cow year were positively associated with the HFS, as was the number of freestalls available per cow. Cows housed on slatted floors had a lower HFS than those housed on solid floors. Conversely, cows housed on rubber floors had a higher HFS than cows on concrete floors. Dead-ending alleyways reduced the hazard of AI after calving. A multivariable Cox proportional hazards model, accounting for herd management by including a frailty term for herd, showed relationships between hazard of postpartum service and explanatory variables. Animals in herds with more than 50 cows had a higher HFS [hazard ratio (HR)=3.0] compared with those in smaller herds. The HFS was also higher (HR=4.3) if more than 8.8 m² of space was available per cow year compared with herds in which animals had less space. The HFS after calving increased with parity (parity 2 HR=0.5, parity ≥3 HR=1.7), and was reduced if a lactation began with dystocia (HR=0.82) or if the cow was of a breed other than Norwegian Red (HR=0.2). The frailty term, herd, was large and highly significant, indicating a significant
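
    lifelines does not fit shared-frailty Cox models directly, so the sketch below approximates the herd effect with a marginal Cox model and herd-clustered robust standard errors (cluster_col); the data and covariate names are synthetic stand-ins rather than the Norwegian herd-recording variables:

        import numpy as np
        import pandas as pd
        from lifelines import CoxPHFitter

        rng = np.random.default_rng(6)
        n_herds, cows_per_herd = 50, 40
        herd = np.repeat(np.arange(n_herds), cows_per_herd)
        herd_effect = rng.normal(0, 0.3, n_herds)[herd]           # unobserved herd-level variation
        large_herd = (np.arange(n_herds) >= 30)[herd].astype(float)
        rubber_floor = rng.integers(0, 2, herd.size).astype(float)

        # Synthetic days from calving to first service; a smaller scale means earlier service
        scale = 80 * np.exp(-0.5 * large_herd - 0.2 * rubber_floor - herd_effect)
        days = rng.exponential(scale)
        event = (days < 200).astype(int)
        days = np.minimum(days, 200)

        df = pd.DataFrame({"days": days, "event": event, "large_herd": large_herd,
                           "rubber_floor": rubber_floor, "herd": herd})

        cph = CoxPHFitter()
        cph.fit(df, duration_col="days", event_col="event", cluster_col="herd")
        cph.print_summary()   # HR > 1 means a higher hazard of first service, i.e. earlier insemination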

  7. Risk factors for the hazard of lameness in Danish Standardbred trotters

    DEFF Research Database (Denmark)

    Vigre, Håkan; Chriel, M.; Hesselholt, M.

    2002-01-01

    associations between the hazard of lameness and selected risk factors. The study population was dynamic and contained data on 265 Standardbred trotters monitored during 5 months in 1997 and 1998. The horses were greater than or equal to 2 years old. Optimal training was defined as when the horse followed......-frequent cause of interruption of optimal training: 84 events in 69 horses (0.09 events per horse-month). Respiratory diseases (16 events) and muscular problems (seven events) were the second and third most-frequent causes of interrupted training. The effects of trainer, gender, age-group, time with a trainer......, participation in races and current month on the hazard of lameness were estimated in a multivariable Cox proportional-hazard model. The effects of trainer, gender and age-group were modelled as time-independent. The effects of time with a trainer, participation in races and the current month were modelled...

  8. Updated logistic regression equations for the calculation of post-fire debris-flow likelihood in the western United States

    Science.gov (United States)

    Staley, Dennis M.; Negri, Jacquelyn A.; Kean, Jason W.; Laber, Jayme L.; Tillery, Anne C.; Youberg, Ann M.

    2016-06-30

    Wildfire can significantly alter the hydrologic response of a watershed to the extent that even modest rainstorms can generate dangerous flash floods and debris flows. To reduce public exposure to hazard, the U.S. Geological Survey produces post-fire debris-flow hazard assessments for select fires in the western United States. We use publicly available geospatial data describing basin morphology, burn severity, soil properties, and rainfall characteristics to estimate the statistical likelihood that debris flows will occur in response to a storm of a given rainfall intensity. Using an empirical database and refined geospatial analysis methods, we defined new equations for the prediction of debris-flow likelihood using logistic regression methods. We showed that the new logistic regression model outperformed previous models used to predict debris-flow likelihood.
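
    The published USGS likelihood equations use specific rainfall, burn-severity and soil terms that are not reproduced here; the sketch below only illustrates the general workflow of fitting a logistic regression to debris-flow occurrence data and predicting likelihood for a design storm, with hypothetical predictors and synthetic data:

        import numpy as np
        import pandas as pd
        import statsmodels.api as sm

        rng = np.random.default_rng(7)
        n = 400
        # Hypothetical basin-scale predictors: peak 15-min rainfall intensity (mm/h),
        # fraction of the basin burned at moderate/high severity, and a soil-erodibility index.
        i15 = rng.uniform(5, 60, n)
        burn = rng.uniform(0, 1, n)
        kf = rng.uniform(0.1, 0.5, n)
        logit_p = -6.0 + 0.08 * i15 + 2.5 * burn + 4.0 * kf
        occurred = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit_p)))   # 1 = debris flow observed

        X = sm.add_constant(pd.DataFrame({"i15": i15, "burn": burn, "kf": kf}))
        fit = sm.Logit(occurred, X).fit(disp=0)
        print(fit.params)                                            # fitted logistic coefficients

        # Predicted debris-flow likelihood for a new basin under a 30 mm/h design storm
        new_basin = pd.DataFrame({"const": [1.0], "i15": [30.0], "burn": [0.6], "kf": [0.3]})
        print("P(debris flow) =", float(fit.predict(new_basin)[0]))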

  9. Analogical proportions: another logical view

    Science.gov (United States)

    Prade, Henri; Richard, Gilles

    This paper investigates the logical formalization of a restricted form of analogical reasoning based on analogical proportions, i.e. statements of the form a is to b as c is to d. Starting from a naive set theoretic interpretation, we highlight the existence of two noticeable companion proportions: one states that a is to b the converse of what c is to d (reverse analogy), while the other, called paralogical proportion, expresses that what a and b have in common, c and d have also. We identify the characteristic postulates of the three types of proportions and examine their consequences from an abstract viewpoint. We further study the properties of the set theoretic interpretation and of the Boolean logic interpretation, and we shed further light on the understanding of the role of permutations in the modeling of the three types of proportions. Finally, we address the use of these proportions as a basis for inference in a propositional setting, and relate it to more general schemes of analogical reasoning. The differences between analogy, reverse-analogy, and paralogy are still emphasized in a three-valued setting, which is also briefly presented.
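
    Under one common Boolean reading, a : b :: c : d holds when a differs from b exactly as c differs from d; the companion reverse analogy swaps c and d, and paralogy requires a and b to share (positively and negatively) what c and d share. The enumeration below encodes these three readings as stated in this paragraph and lists the Boolean valuations that satisfy each; it is an illustration of the definitions, not of the paper's full postulate analysis:

        from itertools import product

        def analogy(a, b, c, d):
            """a : b :: c : d -- a differs from b exactly as c differs from d (one Boolean reading)."""
            return ((a and not b) == (c and not d)) and ((b and not a) == (d and not c))

        def reverse_analogy(a, b, c, d):
            """a is to b the converse of what c is to d, i.e. a : b :: d : c."""
            return analogy(a, b, d, c)

        def paralogy(a, b, c, d):
            """What a and b have in common (positively and negatively), c and d have also."""
            return ((a and b) == (c and d)) and ((not a and not b) == (not c and not d))

        for a, b, c, d in product([False, True], repeat=4):
            tags = [name for name, f in [("analogy", analogy), ("reverse", reverse_analogy),
                                         ("paralogy", paralogy)] if f(a, b, c, d)]
            if tags:
                print(int(a), int(b), int(c), int(d), "->", ", ".join(tags))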

  10. Seismic hazard map of the western hemisphere

    Science.gov (United States)

    Shedlock, K.M.; Tanner, J.G.

    1999-01-01

    Vulnerability to natural disasters increases with urbanization and development of associated support systems (reservoirs, power plants, etc.). Catastrophic earthquakes account for 60% of worldwide casualties associated with natural disasters. Economic damage from earthquakes is increasing, even in technologically advanced countries with some level of seismic zonation, as shown by the 1989 Loma Prieta, CA ($6 billion), 1994 Northridge, CA ($ 25 billion), and 1995 Kobe, Japan (> $ 100 billion) earthquakes. The growth of megacities in seismically active regions around the world often includes the construction of seismically unsafe buildings and infrastructures, due to an insufficient knowledge of existing seismic hazard. Minimization of the loss of life, property damage, and social and economic disruption due to earthquakes depends on reliable estimates of seismic hazard. National, state, and local governments, decision makers, engineers, planners, emergency response organizations, builders, universities, and the general public require seismic hazard estimates for land use planning, improved building design and construction (including adoption of building construction codes), emergency response preparedness plans, economic forecasts, housing and employment decisions, and many more types of risk mitigation. The seismic hazard map of the Americas is the concatenation of various national and regional maps, involving a suite of approaches. The combined maps and documentation provide a useful global seismic hazard framework and serve as a resource for any national or regional agency for further detailed studies applicable to their needs. This seismic hazard map depicts Peak Ground Acceleration (PGA) with a 10% chance of exceedance in 50 years for the western hemisphere. PGA, a short-period ground motion parameter that is proportional to force, is the most commonly mapped ground motion parameter because current building codes that include seismic provisions specify the

  11. Seismic hazard map of the western hemisphere

    Directory of Open Access Journals (Sweden)

    J. G. Tanner

    1999-06-01

    Full Text Available Vulnerability to natural disasters increases with urbanization and development of associated support systems (reservoirs, power plants, etc.). Catastrophic earthquakes account for 60% of worldwide casualties associated with natural disasters. Economic damage from earthquakes is increasing, even in technologically advanced countries with some level of seismic zonation, as shown by the 1989 Loma Prieta, CA ($ 6 billion), 1994 Northridge, CA ($ 25 billion), and 1995 Kobe, Japan (> $ 100 billion) earthquakes. The growth of megacities in seismically active regions around the world often includes the construction of seismically unsafe buildings and infrastructures, due to an insufficient knowledge of existing seismic hazard. Minimization of the loss of life, property damage, and social and economic disruption due to earthquakes depends on reliable estimates of seismic hazard. National, state, and local governments, decision makers, engineers, planners, emergency response organizations, builders, universities, and the general public require seismic hazard estimates for land use planning, improved building design and construction (including adoption of building construction codes), emergency response preparedness plans, economic forecasts, housing and employment decisions, and many more types of risk mitigation. The seismic hazard map of the Americas is the concatenation of various national and regional maps, involving a suite of approaches. The combined maps and documentation provide a useful global seismic hazard framework and serve as a resource for any national or regional agency for further detailed studies applicable to their needs. This seismic hazard map depicts Peak Ground Acceleration (PGA) with a 10% chance of exceedance in 50 years for the western hemisphere. PGA, a short-period ground motion parameter that is proportional to force, is the most commonly mapped ground motion parameter because current building codes that include seismic provisions

  12. A phenomenological SMA model for combined axial–torsional proportional/non-proportional loading conditions

    International Nuclear Information System (INIS)

    Bodaghi, M.; Damanpack, A.R.; Aghdam, M.M.; Shakeri, M.

    2013-01-01

    In this paper, a simple and robust phenomenological model for shape memory alloys (SMAs) is proposed to simulate main features of SMAs under uniaxial as well as biaxial combined axial–torsional proportional/non-proportional loadings. The constitutive model for polycrystalline SMAs is developed within the framework of continuum thermodynamics of irreversible processes. The model nominates the volume fractions of self-accommodated and oriented martensite as scalar internal variables and the preferred direction of oriented martensitic variants as directional internal variable. An algorithm is introduced to develop explicit relationships for the thermo-mechanical behavior of SMAs under uniaxial and biaxial combined axial–torsional proportional/non-proportional loading conditions and also thermal loading. It is shown that the model is able to simulate main aspects of SMAs including self-accommodation, martensitic transformation, orientation and reorientation of martensite, shape memory effect, ferro-elasticity and pseudo-elasticity. A description of the time-discrete counterpart of the proposed SMA model is presented. Experimental results of uniaxial tension and biaxial combined tension–torsion non-proportional tests are simulated and a good qualitative correlation between numerical and experimental responses is achieved. Due to simplicity and accuracy, the model is expected to be used in the future studies dealing with the analysis of SMA devices in which two stress components including one normal and one shear stress are dominant

  13. Retro-regression--another important multivariate regression improvement.

    Science.gov (United States)

    Randić, M

    2001-01-01

    We review the serious problem associated with instabilities of the coefficients of regression equations, referred to as the MRA (multivariate regression analysis) "nightmare of the first kind". This is manifested when in a stepwise regression a descriptor is included or excluded from a regression. The consequence is an unpredictable change of the coefficients of the descriptors that remain in the regression equation. We follow with consideration of an even more serious problem, referred to as the MRA "nightmare of the second kind", arising when optimal descriptors are selected from a large pool of descriptors. This process typically causes at different steps of the stepwise regression a replacement of several previously used descriptors by new ones. We describe a procedure that resolves these difficulties. The approach is illustrated on boiling points of nonanes which are considered (1) by using an ordered connectivity basis; (2) by using an ordering resulting from application of greedy algorithm; and (3) by using an ordering derived from an exhaustive search for optimal descriptors. A novel variant of multiple regression analysis, called retro-regression (RR), is outlined showing how it resolves the ambiguities associated with both "nightmares" of the first and the second kind of MRA.

  14. Modified Regression Correlation Coefficient for Poisson Regression Model

    Science.gov (United States)

    Kaengthong, Nattacha; Domthong, Uthumporn

    2017-09-01

    This study gives attention to indicators of predictive power for the Generalized Linear Model (GLM), which are widely used but often have some restrictions. We are interested in the regression correlation coefficient for a Poisson regression model. This is a measure of predictive power, defined by the relationship between the dependent variable (Y) and the expected value of the dependent variable given the independent variables [E(Y|X)] for the Poisson regression model. The dependent variable is distributed as Poisson. The purpose of this research was to modify the regression correlation coefficient for the Poisson regression model. We also compare the proposed modified regression correlation coefficient with the traditional regression correlation coefficient in the case of two or more independent variables and of multicollinearity among the independent variables. The results show that the proposed regression correlation coefficient is better than the traditional regression correlation coefficient in terms of bias and the Root Mean Square Error (RMSE).
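
    The traditional regression correlation coefficient referred to above is simply the correlation between the observed outcome Y and the fitted conditional mean E(Y|X) from the Poisson GLM; the proposed modification itself is not reproduced here. A small sketch on synthetic data:

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(8)
        n = 300
        X = sm.add_constant(rng.normal(size=(n, 2)))               # intercept + two covariates
        y = rng.poisson(np.exp(X @ np.array([0.2, 0.5, -0.3])))    # Poisson-distributed outcome

        fit = sm.GLM(y, X, family=sm.families.Poisson()).fit()
        mu_hat = fit.predict(X)                                    # estimated E(Y | X)

        # Regression correlation coefficient: Pearson correlation between Y and its fitted mean
        r = np.corrcoef(y, mu_hat)[0, 1]
        print(f"regression correlation coefficient: {r:.3f}")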

  15. Coffee intake, cardiovascular disease and all-cause mortality

    DEFF Research Database (Denmark)

    Nordestgaard, Ask Tybjærg; Nordestgaard, Børge Grønne

    2016-01-01

    Background: Coffee has been associated with modestly lower risk of cardiovascular disease and all-cause mortality in meta-analyses; however, it is unclear whether these are causal associations. We tested first whether coffee intake is associated with cardiovascular disease and all-cause mortality...... observationally; second, whether genetic variations previously associated with caffeine intake are associated with coffee intake; and third, whether the genetic variations are associated with cardiovascular disease and all-cause mortality. Methods: First, we used multivariable adjusted Cox proportional hazard......- and age adjusted Cox proportional hazard regression models to examine genetic associations with cardiovascular disease and all-cause mortality in 112 509 Danes. Finally, we used sex and age-adjusted logistic regression models to examine genetic associations with ischaemic heart disease including...

  16. Urban-hazard risk analysis: mapping of heat-related risks in the elderly in major Italian cities.

    Science.gov (United States)

    Morabito, Marco; Crisci, Alfonso; Gioli, Beniamino; Gualtieri, Giovanni; Toscano, Piero; Di Stefano, Valentina; Orlandini, Simone; Gensini, Gian Franco

    2015-01-01

    Short-term impacts of high temperatures on the elderly are well known. Even though Italy has the highest proportion of elderly citizens in Europe, there is a lack of information on spatial heat-related elderly risks. Development of high-resolution, heat-related urban risk maps regarding the elderly population (≥ 65). A long time-series (2001-2013) of remote sensing MODIS data, averaged over the summer period for eleven major Italian cities, were downscaled to obtain high spatial resolution (100 m) daytime and night-time land surface temperatures (LST). LST was estimated pixel-wise by applying two statistical model approaches: 1) the Linear Regression Model (LRM); 2) the Generalized Additive Model (GAM). Total and elderly population density data were extracted from the Joint Research Centre population grid (100 m) from the 2001 census (Eurostat source), and processed together using "Crichton's Risk Triangle" hazard-risk methodology for obtaining a Heat-related Elderly Risk Index (HERI). The GAM procedure allowed for improved daytime and night-time LST estimations compared to the LRM approach. High-resolution maps of daytime and night-time HERI levels were developed for inland and coastal cities. Urban areas with the hazardous HERI level (very high risk) were not necessarily characterized by the highest temperatures. The hazardous HERI level was generally localized to encompass the city-centre in inland cities and the inner area in coastal cities. The two most dangerous HERI levels were greater in the coastal rather than inland cities. This study shows the great potential of combining geospatial technologies and spatial demographic characteristics within a simple and flexible framework in order to provide high-resolution urban mapping of daytime and night-time HERI. In this way, potential areas for intervention are immediately identified with up-to-street level details. This information could support public health operators and facilitate coordination for heat

  17. Urban-hazard risk analysis: mapping of heat-related risks in the elderly in major Italian cities.

    Directory of Open Access Journals (Sweden)

    Marco Morabito

    Full Text Available Short-term impacts of high temperatures on the elderly are well known. Even though Italy has the highest proportion of elderly citizens in Europe, there is a lack of information on spatial heat-related elderly risks. Development of high-resolution, heat-related urban risk maps regarding the elderly population (≥ 65). A long time-series (2001-2013) of remote sensing MODIS data, averaged over the summer period for eleven major Italian cities, were downscaled to obtain high spatial resolution (100 m) daytime and night-time land surface temperatures (LST). LST was estimated pixel-wise by applying two statistical model approaches: 1) the Linear Regression Model (LRM); 2) the Generalized Additive Model (GAM). Total and elderly population density data were extracted from the Joint Research Centre population grid (100 m) from the 2001 census (Eurostat source), and processed together using "Crichton's Risk Triangle" hazard-risk methodology for obtaining a Heat-related Elderly Risk Index (HERI). The GAM procedure allowed for improved daytime and night-time LST estimations compared to the LRM approach. High-resolution maps of daytime and night-time HERI levels were developed for inland and coastal cities. Urban areas with the hazardous HERI level (very high risk) were not necessarily characterized by the highest temperatures. The hazardous HERI level was generally localized to encompass the city-centre in inland cities and the inner area in coastal cities. The two most dangerous HERI levels were greater in the coastal rather than inland cities. This study shows the great potential of combining geospatial technologies and spatial demographic characteristics within a simple and flexible framework in order to provide high-resolution urban mapping of daytime and night-time HERI. In this way, potential areas for intervention are immediately identified with up-to-street level details. This information could support public health operators and facilitate coordination for heat

  18. Tsunami Hazard Evaluation for the East Coast of Korea by using Empirical Data

    International Nuclear Information System (INIS)

    Kim, Min Kyu; Choi, In Kil

    2010-01-01

    In this study, a tsunami hazard curve was determined for a probabilistic safety assessment (PSA) of tsunami-induced events at a nuclear power plant site. A tsunami catalogue was developed using historical tsunami records from before 1900 and instrumental tsunami records from after 1900. For the evaluation of the return period of tsunami run-up height, power-law, upper-truncated power-law and exponential functions were considered as regression curves and their results were compared. Although the catalogue contains only nine tsunami records for the east coast of Korea, no comparable evaluation of a tsunami hazard curve had previously been carried out, and this research lays a cornerstone for probabilistic tsunami hazard assessment (PTHA) in Korea
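
    As a generic illustration of fitting such regression curves to a small catalogue, the sketch below estimates exponential and power-law exceedance-rate models from hypothetical run-up heights by least squares on the log rates; the numbers are invented and the method is only one of several ways the original study could have fitted the curves:

        import numpy as np

        # Hypothetical run-up heights (m) from a small catalogue and its length in years
        runups = np.array([0.2, 0.3, 0.5, 0.7, 1.1, 1.3, 2.0, 2.6, 3.9])
        years = 400.0

        h = np.sort(runups)
        rate = np.arange(h.size, 0, -1) / years              # empirical annual rate of exceeding each height

        # Exponential model  N(h) = a * exp(-b * h)  =>  ln N = ln a - b * h
        slope, intercept = np.polyfit(h, np.log(rate), 1)
        a_exp, b_exp = np.exp(intercept), -slope

        # Power-law model    N(h) = a * h**(-b)      =>  ln N = ln a - b * ln h
        slope, intercept = np.polyfit(np.log(h), np.log(rate), 1)
        a_pow, b_pow = np.exp(intercept), -slope

        h0 = 3.0                                             # run-up height of interest (m)
        print(f"exponential: return period of {h0:.0f} m ~ {1.0 / (a_exp * np.exp(-b_exp * h0)):.0f} years")
        print(f"power-law:   return period of {h0:.0f} m ~ {1.0 / (a_pow * h0 ** -b_pow):.0f} years")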

  19. Transportation of Hazardous Materials Emergency Preparedness Hazards Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Blanchard, A.

    2000-02-28

    This report documents the Emergency Preparedness Hazards Assessment (EPHA) for the Transportation of Hazardous Materials (THM) at the Department of Energy (DOE) Savannah River Site (SRS). This hazards assessment is intended to identify and analyze those transportation hazards significant enough to warrant consideration in the SRS Emergency Management Program.

  20. Transportation of Hazardous Materials Emergency Preparedness Hazards Assessment

    International Nuclear Information System (INIS)

    Blanchard, A.

    2000-01-01

    This report documents the Emergency Preparedness Hazards Assessment (EPHA) for the Transportation of Hazardous Materials (THM) at the Department of Energy (DOE) Savannah River Site (SRS). This hazards assessment is intended to identify and analyze those transportation hazards significant enough to warrant consideration in the SRS Emergency Management Program

  1. Transportation of hazardous materials emergency preparedness hazards assessment

    International Nuclear Information System (INIS)

    Blanchard, A.

    2000-01-01

    This report documents the Emergency Preparedness Hazards Assessment (EPHA) for the Transportation of Hazardous Materials (THM) at the Department of Energy (DOE) Savannah River Site (SRS). This hazards assessment is intended to identify and analyze those transportation hazards significant enough to warrant consideration in the SRS Emergency Management Program

  2. Hazard function theory for nonstationary natural hazards

    Science.gov (United States)

    Read, L.; Vogel, R. M.

    2015-12-01

    Studies from the natural hazards literature indicate that many natural processes, including wind speeds, landslides, wildfires, precipitation, streamflow and earthquakes, show evidence of nonstationary behavior such as trends in magnitudes through time. Traditional probabilistic analysis of natural hazards based on partial duration series (PDS) generally assumes stationarity in the magnitudes and arrivals of events, i.e. that the probability of exceedance is constant through time. Given evidence of trends and the consequent expected growth in devastating impacts from natural hazards across the world, new methods are needed to characterize their probabilistic behavior. The field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (x) with its failure time series (t), enabling computation of corresponding average return periods and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose PDS magnitudes are assumed to follow the widely applied Poisson-GP model. We derive a 2-parameter Generalized Pareto hazard model and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. Our theoretical analysis linking hazard event series x, with corresponding failure time series t, should have application to a wide class of natural hazards.

  3. Hazard function theory for nonstationary natural hazards

    Science.gov (United States)

    Read, Laura K.; Vogel, Richard M.

    2016-04-01

    Impact from natural hazards is a shared global problem that causes tremendous loss of life and property, economic cost, and damage to the environment. Increasingly, many natural processes show evidence of nonstationary behavior including wind speeds, landslides, wildfires, precipitation, streamflow, sea levels, and earthquakes. Traditional probabilistic analysis of natural hazards based on peaks over threshold (POT) generally assumes stationarity in the magnitudes and arrivals of events, i.e., that the probability of exceedance of some critical event is constant through time. Given increasing evidence of trends in natural hazards, new methods are needed to characterize their probabilistic behavior. The well-developed field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (X) with its failure time series (T), enabling computation of corresponding average return periods, risk, and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose POT magnitudes are assumed to follow the widely applied generalized Pareto model. We derive the hazard function for this case and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. Our theoretical analysis linking hazard random variable X with corresponding failure time series T should have application to a wide class of natural hazards with opportunities for future extensions.
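
    For a generalized Pareto magnitude X with shape xi and scale sigma, the textbook hazard (failure-rate) function has the closed form h(x) = 1/(sigma + xi*x); the paper's derivation for the nonstationary POT setting is not reproduced here, but the identity is easy to check numerically:

        import numpy as np
        from scipy.stats import genpareto

        xi, sigma = 0.2, 1.5                      # shape and scale of the generalized Pareto magnitude model
        x = np.linspace(0.0, 10.0, 6)

        numeric = genpareto.pdf(x, xi, scale=sigma) / genpareto.sf(x, xi, scale=sigma)
        closed_form = 1.0 / (sigma + xi * x)      # h(x) = f(x) / S(x)
        print(np.allclose(numeric, closed_form))  # True: the closed form matches the pdf/survival ratio
        print(np.round(closed_form, 4))           # hazard decreases in x when xi > 0 (heavy tail)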

  4. A stepwise regression tree for nonlinear approximation: applications to estimating subpixel land cover

    Science.gov (United States)

    Huang, C.; Townshend, J.R.G.

    2003-01-01

    A stepwise regression tree (SRT) algorithm was developed for approximating complex nonlinear relationships. Based on the regression tree of Breiman et al. (BRT) and a stepwise linear regression (SLR) method, this algorithm represents an improvement over SLR in that it can approximate nonlinear relationships and over BRT in that it gives more realistic predictions. The applicability of this method to estimating subpixel forest was demonstrated using three test data sets, on all of which it gave more accurate predictions than SLR and BRT. SRT also generated more compact trees and performed better than or at least as well as BRT at all 10 equal forest proportion intervals ranging from 0 to 100%. This method is appealing for estimating subpixel land cover over large areas.
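
    The SRT algorithm itself combines stepwise linear regression within the nodes of a regression tree; as a baseline sketch of the tree component only, a standard regression tree can be fitted to synthetic spectral bands and subpixel forest proportions as follows (scikit-learn used for convenience; not the authors' implementation):

        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.tree import DecisionTreeRegressor

        rng = np.random.default_rng(9)
        n = 2000
        bands = rng.uniform(0, 1, size=(n, 6))                    # synthetic spectral band values
        forest = np.clip(1.2 * bands[:, 3] - 0.6 * bands[:, 2] ** 2
                         + 0.1 * rng.normal(size=n), 0, 1)        # subpixel forest proportion in [0, 1]

        X_tr, X_te, y_tr, y_te = train_test_split(bands, forest, random_state=0)
        tree = DecisionTreeRegressor(max_depth=6, min_samples_leaf=50).fit(X_tr, y_tr)
        pred = np.clip(tree.predict(X_te), 0, 1)                  # keep predictions in the valid range
        rmse = float(np.sqrt(np.mean((pred - y_te) ** 2)))
        print(f"test RMSE: {rmse:.3f}, leaves: {tree.get_n_leaves()}")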

  5. Climate reconstruction by regression - 32 variations on a theme

    Energy Technology Data Exchange (ETDEWEB)

    Buerger, Gerd; Fast, Irina; Cubasch, Ulrich [FU Berlin (Germany). Inst. fuer Meteorologie

    2006-02-15

    Regression-based methods fail to provide a sufficiently unique reconstruction of a given millennial history of Northern Hemisphere mean temperature. They instead offer a multitude of variants, depending on the specific data processing scheme. Using a simulated climate history with noise-disturbed pseudo-proxies, we systematically test a set of such configurations, each of which appears to be a priori reasonable, with existing applications elsewhere. This results in an entire spectrum between practically useless and almost perfect reconstructions. The reason lies in the fact that the training variations are not representative of the full millennium, and the regression equations have to be extrapolated. This creates an error that is proportional to both the model uncertainty and the proxy amplitudes. Estimation of that uncertainty is paramount for a useful millennial reconstruction, especially if it is of the parameter-loaded multiproxy type.

  6. The Principle of Proportionality

    DEFF Research Database (Denmark)

    Bennedsen, Morten; Meisner Nielsen, Kasper

    2005-01-01

    Recent policy initiatives within the harmonization of European company laws have promoted a so-called "principle of proportionality" through proposals that regulate mechanisms opposing a proportional distribution of ownership and control. We scrutinize the foundation for these initiatives...... in relationship to the process of harmonization of the European capital markets. JEL classifications: G30, G32, G34 and G38. Keywords: Ownership Structure, Dual Class Shares, Pyramids, EU company laws.

  7. Hazards assessment for the Hazardous Waste Storage Facility

    International Nuclear Information System (INIS)

    Knudsen, J.K.; Calley, M.B.

    1994-04-01

    This report documents the hazards assessment for the Hazardous Waste Storage Facility (HWSF) located at the Idaho National Engineering Laboratory. The hazards assessment was performed to ensure that this facility complies with DOE and company requirements pertaining to emergency planning and preparedness for operational emergencies. The hazards assessment identifies and analyzes hazards that are significant enough to warrant consideration in a facility's operational emergency management program. The area surrounding HWSF, the buildings and structures at HWSF, and the processes used at HWSF are described in this report. All nonradiological hazardous materials at the HWSF were identified (radiological hazardous materials are not stored at HWSF) and screened against threshold quantities according to DOE Order 5500.3A guidance. Two of the identified hazardous materials exceeded their specified threshold quantity. This report discusses the potential release scenarios and consequences associated with an accidental release for each of the two identified hazardous materials, lead and mercury. Emergency considerations, such as emergency planning zones, emergency classes, protective actions, and emergency action levels, are also discussed based on the analysis of potential consequences. Evaluation of the potential consequences indicated that the highest emergency class for operational emergencies at the HWSF would be a Site Area Emergency

  8. A Matter of Balance: Motor Control is Related to Children’s Spatial and Proportional Reasoning Skills

    Science.gov (United States)

    Frick, Andrea; Möhring, Wenke

    2016-01-01

    Recent research has shown close links between spatial and mathematical thinking and between spatial abilities and motor skills. However, longitudinal research examining the relations between motor, spatial, and mathematical skills is rare, and the nature of these relations remains unclear. The present study thus investigated the relation between children’s motor control and their spatial and proportional reasoning. We measured 6-year-olds’ spatial scaling (i.e., the ability to reason about different-sized spaces), their mental transformation skills, and their ability to balance on one leg as an index for motor control. One year later (N = 126), we tested the same children’s understanding of proportions. We also assessed several control variables (verbal IQ and socio-economic status) as well as inhibitory control, visuo-spatial and verbal working memory. Stepwise hierarchical regressions showed that, after accounting for effects of control variables, children’s balance skills significantly increased the explained variance in their spatial performance and proportional reasoning. Our results suggest specific relations between balance skills and spatial as well as proportional reasoning skills that cannot be explained by general differences in executive functioning or intelligence. PMID:26793157

  9. Southern Dietary Pattern is Associated with Hazard of Acute Coronary Heart Disease in the Reasons for Geographic and Racial Differences in Stroke (REGARDS) Study

    Science.gov (United States)

    Shikany, James M.; Safford, Monika M.; Newby, P. K.; Durant, Raegan W.; Brown, Todd M.; Judd, Suzanne E.

    2015-01-01

    Background The association of overall diet, as characterized by dietary patterns, with risk of incident acute coronary heart disease (CHD) has not been studied extensively in samples including sociodemographic and regional diversity. Methods and Results We used data from 17,418 participants in Reasons for Geographic and Racial Differences in Stroke (REGARDS), a national, population-based, longitudinal study of white and black adults aged ≥45 years, enrolled from 2003-2007. We derived dietary patterns with factor analysis, and used Cox proportional hazards regression to examine hazard of incident acute CHD events – nonfatal myocardial infarction and acute CHD death – associated with quartiles of consumption of each pattern, adjusted for various levels of covariates. Five primary dietary patterns emerged: Convenience, Plant-based, Sweets, Southern, and Alcohol and Salad. A total of 536 acute CHD events occurred over a median (IQR) 5.8 (2.1) years of follow-up. After adjustment for sociodemographics, lifestyle factors, and energy intake, highest consumers of the Southern pattern (characterized by added fats, fried food, eggs, organ and processed meats, and sugar-sweetened beverages) experienced a 56% higher hazard of acute CHD (comparing quartile 4 to quartile 1: HR = 1.56; 95% CI: 1.17, 2.08; P for trend across quartiles = 0.003). Adding anthropometric and medical history variables to the model attenuated the association somewhat (HR = 1.37; 95% CI: 1.01, 1.85; P = 0.036). Conclusions A dietary pattern characteristic of the southern US was associated with greater hazard of CHD in this sample of white and black adults in diverse regions of the US. PMID:26260732

  10. Hazard assessment and risk management of offshore production chemicals

    International Nuclear Information System (INIS)

    Schobben, H.P.M.; Scholten, M.C.T.; Vik, E.A.; Bakke, S.

    1994-01-01

    There is a clear need for harmonization of the regulations with regard to the use and discharge of drilling and production chemicals in the North Sea. Therefore the CHARM (Chemical Hazard Assessment and Risk Management) model was developed. Both government (of several countries) and industry (E and P and chemical suppliers) participated in the project. The CHARM model is discussed and accepted by OSPARCON. The CHARM model consists of several modules. The model starts with a prescreening on the basis of hazardous properties like persistency, accumulation potential and the appearance on black lists. The core of the model consists of modules for hazard assessment and risk analysis. Hazard assessment covers a general environmental evaluation of a chemical on the basis of intrinsic properties of that chemical. Risk analysis covers a more specific evaluation of the environmental impact from the use of a production chemical, or a combination of chemicals, under actual conditions. In the risk management module the user is guided to reduce the total risk of all chemicals used on a platform by the definition of measures in the most cost-effective way. The model calculates the environmental impact for the marine environment. To this end, three parts are distinguished: pelagic, benthic and food chain. Both hazard assessment and risk analysis are based on a proportional comparison of an estimated PEC with an estimated NEC. The PEC is estimated from the use, release, dilution and fate of the chemical, and the NEC is estimated from the available toxicity data of the chemicals.

  11. Dual Regression

    OpenAIRE

    Spady, Richard; Stouli, Sami

    2012-01-01

    We propose dual regression as an alternative to the quantile regression process for the global estimation of conditional distribution functions under minimal assumptions. Dual regression provides all the interpretational power of the quantile regression process while avoiding the need for repairing the intersecting conditional quantile surfaces that quantile regression often produces in practice. Our approach introduces a mathematical programming characterization of conditional distribution f...

  12. Hazard interactions and interaction networks (cascades) within multi-hazard methodologies

    Science.gov (United States)

    Gill, Joel C.; Malamud, Bruce D.

    2016-08-01

    This paper combines research and commentary to reinforce the importance of integrating hazard interactions and interaction networks (cascades) into multi-hazard methodologies. We present a synthesis of the differences between multi-layer single-hazard approaches and multi-hazard approaches that integrate such interactions. This synthesis suggests that ignoring interactions between important environmental and anthropogenic processes could distort management priorities, increase vulnerability to other spatially relevant hazards or underestimate disaster risk. In this paper we proceed to present an enhanced multi-hazard framework through the following steps: (i) description and definition of three groups (natural hazards, anthropogenic processes and technological hazards/disasters) as relevant components of a multi-hazard environment, (ii) outlining of three types of interaction relationship (triggering, increased probability, and catalysis/impedance), and (iii) assessment of the importance of networks of interactions (cascades) through case study examples (based on the literature, field observations and semi-structured interviews). We further propose two visualisation frameworks to represent these networks of interactions: hazard interaction matrices and hazard/process flow diagrams. Our approach reinforces the importance of integrating interactions between different aspects of the Earth system, together with human activity, into enhanced multi-hazard methodologies. Multi-hazard approaches support the holistic assessment of hazard potential and consequently disaster risk. We conclude by describing three ways by which understanding networks of interactions contributes to the theoretical and practical understanding of hazards, disaster risk reduction and Earth system management. Understanding interactions and interaction networks helps us to better (i) model the observed reality of disaster events, (ii) constrain potential changes in physical and social vulnerability

  13. Evaluating Middle Years Students' Proportional Reasoning

    Science.gov (United States)

    Hilton, Annette; Dole, Shelley; Hilton, Geoff; Goos, Merrilyn; O'Brien, Mia

    2012-01-01

    Proportional reasoning is a key aspect of numeracy that is not always developed naturally by students. Understanding the types of proportional reasoning that students apply to different problem types is a useful first step to identifying ways to support teachers and students to develop proportional reasoning in the classroom. This paper describes…

  14. Control of dust hazards in mines

    Energy Technology Data Exchange (ETDEWEB)

    Sukhanov, V V

    1981-09-01

    This paper analyzes health hazards associated with air pollution by respirable coal dust which causes pneumoconioses. The following directions in pneumoconioses prevention are discussed: improved protective systems (e.g. respirators), mining schemes optimized from a health hazards point of view, correct determination of the maximum permissible level of respirable dusts, and reduced working time. Safety regulations in the USSR on the critical amount of coal dust in the miner respiratory system are insufficient, as the 20 g limit is too high and does not guarantee safety. Using regression analysis, the influence of the factors which cause pneumoconioses is analyzed. This influence is described by an equation which considers the following factors: number of shifts associated with contact of a miner with coal dusts, dust concentration in mine air, amount of air with coal dust being respired, miner's age, years as a miner, and coal rank. It is stated that use of the proposed equation (derived by computer calculations) permits the safe working time to be correctly determined, considering all factors which cause pneumoconioses.

  15. Weibull and lognormal Taguchi analysis using multiple linear regression

    International Nuclear Information System (INIS)

    Piña-Monarrez, Manuel R.; Ortiz-Yañez, Jesús F.

    2015-01-01

    The paper provides reliability practitioners with a method (1) to estimate the robust Weibull family when the Taguchi method (TM) is applied, (2) to estimate the normal operational Weibull family in an accelerated life testing (ALT) analysis to give confidence to the extrapolation and (3) to perform the ANOVA analysis on both the robust and the normal operational Weibull family. On the other hand, because the Weibull distribution neither has the normal additive property nor has a direct relationship with the normal parameters (µ, σ), in this paper, the issues of estimating a Weibull family by using a design of experiment (DOE) are first addressed by using an L_9(3^4) orthogonal array (OA) in both the TM and in the Weibull proportional hazard model approach (WPHM). Then, by using the Weibull/Gumbel and the lognormal/normal relationships and multiple linear regression, the direct relationships between the Weibull and the lifetime parameters are derived and used to formulate the proposed method. Moreover, since the derived direct relationships always hold, the method is generalized to the lognormal and ALT analysis. Finally, the method’s efficiency is shown through its application to the used OA and to a set of ALT data. - Highlights: • It gives the statistical relations and steps to use the Taguchi Method (TM) to analyze Weibull data. • It gives the steps to determine the unknown Weibull family to both the robust TM setting and the normal ALT level. • It gives a method to determine the expected lifetimes and to perform its ANOVA analysis in TM and ALT analysis. • It gives a method to give confidence to the extrapolation in an ALT analysis by using the Weibull family of the normal level.
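    As a hedged illustration of the Weibull/Gumbel-type linearization that regression-based Weibull estimation relies on (not the paper's full TM/ALT procedure), the sketch below fits a two-parameter Weibull by median-rank regression; the failure times are invented.

```python
# Minimal sketch: two-parameter Weibull fit via the linearization
# ln(-ln(1 - F)) = beta * ln(t) - beta * ln(eta); lifetimes are made up.
import numpy as np

t = np.sort(np.array([31.0, 45.0, 52.0, 68.0, 74.0, 89.0, 97.0, 110.0]))  # hypothetical lifetimes
n = len(t)
F = (np.arange(1, n + 1) - 0.3) / (n + 0.4)        # Bernard's median-rank plotting positions

slope, intercept = np.polyfit(np.log(t), np.log(-np.log(1.0 - F)), 1)
beta = slope                                       # shape parameter
eta = np.exp(-intercept / beta)                    # scale parameter
print(f"beta = {beta:.2f}, eta = {eta:.1f}")
```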

  16. Optical fusions and proportional syntheses

    Science.gov (United States)

    Albert-Vanel, Michel

    2002-06-01

    A tragic error is being made in the literature concerning matters of color when dealing with optical fusions. They are still considered to be of additive nature, whereas experience shows us somewhat different results. The goal of this presentation is to show that fusions are, in fact, of 'proportional' nature, tending to be additive or subtractive, depending on each individual case. Pointillist paintings done in the manner of Seurat, or the spinning-disc experiment, can highlight this intermediate, proportional regime. Let us therefore try to examine more closely what actually occurs, by reviewing additive, subtractive and proportional syntheses.

  17. On the Importance of Accounting for Competing Risks in Pediatric Brain Cancer: II. Regression Modeling and Sample Size

    International Nuclear Information System (INIS)

    Tai, Bee-Choo; Grundy, Richard; Machin, David

    2011-01-01

    Purpose: To accurately model the cumulative need for radiotherapy in trials designed to delay or avoid irradiation among children with malignant brain tumor, it is crucial to account for competing events and evaluate how each contributes to the timing of irradiation. An appropriate choice of statistical model is also important for adequate determination of sample size. Methods and Materials: We describe the statistical modeling of competing events (A, radiotherapy after progression; B, no radiotherapy after progression; and C, elective radiotherapy) using proportional cause-specific and subdistribution hazard functions. The procedures of sample size estimation based on each method are outlined. These are illustrated by use of data comparing children with ependymoma and other malignant brain tumors. The results from these two approaches are compared. Results: The cause-specific hazard analysis showed a reduction in hazards among infants with ependymoma for all event types, including Event A (adjusted cause-specific hazard ratio, 0.76; 95% confidence interval, 0.45-1.28). Conversely, the subdistribution hazard analysis suggested an increase in hazard for Event A (adjusted subdistribution hazard ratio, 1.35; 95% confidence interval, 0.80-2.30), but the reduction in hazards for Events B and C remained. Analysis based on subdistribution hazard requires a larger sample size than the cause-specific hazard approach. Conclusions: Notable differences in effect estimates and anticipated sample size were observed between methods when the main event showed a beneficial effect whereas the competing events showed an adverse effect on the cumulative incidence. The subdistribution hazard is the most appropriate for modeling treatment when its effects on both the main and competing events are of interest.
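    The cause-specific hazard analysis described above can be approximated in standard software by fitting one Cox model per event type and treating the competing events as censored. The sketch below (lifelines, with a small invented data frame; column names and coding are hypothetical) illustrates that idea only; it is not the subdistribution (Fine-Gray) analysis used in the paper.

```python
# Hedged sketch: cause-specific Cox models obtained by censoring competing events (lifelines).
# The data frame, its coding and the covariate are invented for illustration.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "time":       [12, 30, 7, 25, 40, 15, 33, 9, 22, 18, 27, 36],
    "event":      ["A", "0", "B", "A", "C", "0", "B", "A", "C", "A", "0", "B"],  # "0" = censored
    "ependymoma": [1, 0, 1, 0, 1, 1, 0, 0, 1, 0, 1, 0],
})

def cause_specific_cox(data, cause):
    d = data.copy()
    d["status"] = (d["event"] == cause).astype(int)   # all other event types become censored
    return CoxPHFitter().fit(d[["time", "status", "ependymoma"]],
                             duration_col="time", event_col="status")

fit_A = cause_specific_cox(df, "A")                   # cause-specific model for event type A
print(fit_A.hazard_ratios_)
```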

  18. Endogenous sex hormones and risk of venous thromboembolism in women and men

    DEFF Research Database (Denmark)

    Holmegard, Haya N; Nordestgaard, B G; Schnohr, P

    2014-01-01

    Heart Study, who had estradiol and testosterone concentrations measured. Of these, 636 developed VTE (deep venous thrombosis [DVT] and/or pulmonary embolism [PE]) during a follow-up of 21 years (range, 0.02-32 years). Associations between endogenous estradiol and testosterone concentrations and risk...... of VTE were estimated by Cox proportional hazards regression with time-dependent covariates and corrected for regression dilution bias. RESULTS: Multifactorially adjusted hazard ratios of VTE for individuals with estradiol levels >75th vs. ≤25th percentile were 0.84 (95%CI, 0.25-2.85), 1.05 (0...

  19. Seismic hazard map of North and Central America and the Caribbean

    Directory of Open Access Journals (Sweden)

    K. M. Shedlock

    1999-06-01

    Full Text Available Minimization of the loss of life, property damage, and social and economic disruption due to earthquakes depends on reliable estimates of seismic hazard. National, state, and local governments, decision makers, engineers, planners, emergency response organizations, builders, universities, and the general public require seismic hazard estimates for land use planning, improved building design and construction (including adoption of building construction codes), emergency response preparedness plans, economic forecasts, housing and employment decisions, and many more types of risk mitigation. The seismic hazard map of North and Central America and the Caribbean is the concatenation of various national and regional maps, involving a suite of approaches. The combined maps and documentation provide a useful regional seismic hazard framework and serve as a resource for any national or regional agency for further detailed studies applicable to their needs. This seismic hazard map depicts Peak Ground Acceleration (PGA) with a 10% chance of exceedance in 50 years. PGA, a short-period ground motion parameter that is proportional to force, is the most commonly mapped ground motion parameter because current building codes that include seismic provisions specify the horizontal force a building should be able to withstand during an earthquake. This seismic hazard map of North and Central America and the Caribbean depicts the likely level of short-period ground motion from earthquakes in a fifty-year window. Short-period ground motions affect short-period structures (e.g., one-to-two story buildings). The highest seismic hazard values in the region generally occur in areas that have been, or are likely to be, the sites of the largest plate boundary earthquakes.
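    The "10% chance of exceedance in 50 years" used for this map corresponds, under the usual Poisson assumption, to a return period of roughly 475 years; a quick check of that standard conversion:

```python
# Return period T for exceedance probability P over exposure time t (Poisson assumption):
# P = 1 - exp(-t / T)  =>  T = -t / ln(1 - P)
import math

t_years, P = 50.0, 0.10
T = -t_years / math.log(1.0 - P)
print(round(T))   # ~475 years
```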

  20. Proportioning of U3O8 powder

    International Nuclear Information System (INIS)

    Cermak, V.; Markvart, M.; Novy, P.; Vanka, M.

    1989-01-01

    The tests are briefly described or proportioning U 3 O 8 powder of a granulometric grain size range of 0-160 μm using a vertical screw, a horizontal dual screw and a vibration dispenser with a view to proportioning very fine U 3 O 8 powder fractions produced in the oxidation of UO 2 fuel pellets. In the tests, the evenness of proportioning was assessed by the percentage value of the proportioning rate spread measured at one-minute intervals at a proportioning rate of 1-3 kg/h. In feeding the U 3 O 3 in a flame fluorator, it is advantageous to monitor the continuity of the powder column being proportioned and to assess it radiometrically by the value of the proportioning rate spread at very short intervals (0.1 s). (author). 10 figs., 1 tab., 12 refs

  1. Proportion and clinical features of never-smokers with non-small cell lung cancer.

    Science.gov (United States)

    Cho, Jaeyoung; Choi, Sun Mi; Lee, Jinwoo; Lee, Chang-Hoon; Lee, Sang-Min; Kim, Dong-Wan; Yim, Jae-Joon; Kim, Young Tae; Yoo, Chul-Gyu; Kim, Young Whan; Han, Sung Koo; Park, Young Sik

    2017-02-08

    The proportion of never-smokers with non-small cell lung cancer (NSCLC) is increasing, but that in Korea has not been well addressed in a large population. We aimed to evaluate the proportion and clinical features of never-smokers with NSCLC in a large single institution. We analyzed clinical data of 1860 consecutive patients who were newly diagnosed with NSCLC between June 2011 and December 2014. Of the 1860 NSCLC patients, 707 (38.0%) were never-smokers. The proportions of women (83.7% vs. 5.6%) and adenocarcinoma (89.8% vs. 44.9%) were higher among never-smokers than among ever-smokers. Significantly more never-smokers were diagnosed at a younger median age (65 vs. 68 years) than ever-smokers. Epidermal growth factor receptor mutations were more frequent in never-smokers (57.8% vs. 24.4%), whereas Kirsten rat sarcoma viral oncogene homolog mutations (5.8% vs. 9.6%, P = 0.021) were less frequently encountered in never-smokers than in ever-smokers. Never-smokers showed longer survival after adjusting for the favorable effects of younger age, female sex, adenocarcinoma histology, better performance status, early stage disease, being asymptomatic at diagnosis, received antitumor treatment, and the presence of driver mutations (hazard ratio, 0.624; 95% confidence interval, 0.460-0.848; P = 0.003). More than one-third of the Korean patients with NSCLC were never-smokers. NSCLC in never-smokers had different clinical characteristics and major driver mutations and resulted in longer overall survival compared with NSCLC in ever-smokers.

  2. Proportional Symbol Mapping in R

    Directory of Open Access Journals (Sweden)

    Susumu Tanimura

    2006-01-01

    Full Text Available Visualization of spatial data on a map aids not only in data exploration but also in communication to impart spatial conception or ideas to others. Although recent cartographic functions in R are rapidly becoming richer, proportional symbol mapping, which is one of the common mapping approaches, has not been packaged thus far. Based on the theories of proportional symbol mapping developed in cartography, the authors developed some functions for proportional symbol mapping using R, including mathematical and perceptual scaling. An example of these functions demonstrated the new expressive power and options available in R, particularly for the visualization of conceptual point data.
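    The mathematical versus perceptual scaling mentioned above is commonly implemented by making symbol area proportional to the data value (radius proportional to the square root of the value) and, for perceptual compensation, raising the exponent slightly (Flannery's classic correction is often quoted as roughly 0.57). A minimal sketch under those assumptions, written in Python rather than using the R package's own functions; values and radii are invented.

```python
# Hedged sketch of proportional symbol scaling (not the R package's implementation).
import numpy as np

values = np.array([10.0, 40.0, 90.0, 250.0])   # hypothetical point data
r_max = 20.0                                   # display radius assigned to the largest value

r_math = r_max * (values / values.max()) ** 0.5         # mathematical scaling: area ~ value
r_perceptual = r_max * (values / values.max()) ** 0.57  # Flannery-type perceptual compensation
print(np.round(r_math, 1), np.round(r_perceptual, 1))
```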

  3. A balanced hazard ratio for risk group evaluation from survival data.

    Science.gov (United States)

    Branders, Samuel; Dupont, Pierre

    2015-07-30

    Common clinical studies assess the quality of prognostic factors, such as gene expression signatures, clinical variables or environmental factors, and cluster patients into various risk groups. Typical examples include cancer clinical trials where patients are clustered into high or low risk groups. Whenever applied to survival data analysis, such groups are intended to represent patients with similar survival odds and to select the most appropriate therapy accordingly. The relevance of such risk groups, and of the related prognostic factors, is typically assessed through the computation of a hazard ratio. We first stress three limitations of assessing risk groups through the hazard ratio: (1) it may promote the definition of arbitrarily unbalanced risk groups; (2) an apparently optimal group hazard ratio can be largely inconsistent with the p-value commonly associated to it; and (3) some marginal changes between risk group proportions may lead to highly different hazard ratio values. Those issues could lead to inappropriate comparisons between various prognostic factors. Next, we propose the balanced hazard ratio to solve those issues. This new performance metric keeps an intuitive interpretation and is as simple to compute. We also show how the balanced hazard ratio leads to a natural cut-off choice to define risk groups from continuous risk scores. The proposed methodology is validated through controlled experiments for which a prescribed cut-off value is defined by design. Further results are also reported on several cancer prognosis studies, and the proposed methodology could be applied more generally to assess the quality of any prognostic markers. Copyright © 2015 John Wiley & Sons, Ltd.

  4. Weighted linear regression using D2H and D2 as the independent variables

    Science.gov (United States)

    Hans T. Schreuder; Michael S. Williams

    1998-01-01

    Several error structures for weighted regression equations used for predicting volume were examined for 2 large data sets of felled and standing loblolly pine trees (Pinus taeda L.). The generally accepted model with variance of error proportional to the value of the covariate squared ( D2H = diameter squared times height or D...
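    Under the error structure described above (error variance proportional to the square of the covariate), weighted least squares with weights 1/(D²H)² is the textbook remedy. A minimal sketch with simulated data (statsmodels), not the authors' actual loblolly pine fits:

```python
# Hedged sketch: WLS for volume ~ D^2*H with Var(error) proportional to (D^2*H)^2.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
d2h = rng.uniform(0.5, 8.0, 200)                              # hypothetical D^2 * H values
volume = 0.002 + 0.03 * d2h + rng.normal(0, 0.01, 200) * d2h  # noise sd grows with D^2*H

X = sm.add_constant(d2h)
fit = sm.WLS(volume, X, weights=1.0 / d2h**2).fit()           # weights = inverse variance factor
print(fit.params)
```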

  5. Hazardous drinking and its association with homelessness among veterans in care.

    Science.gov (United States)

    Ghose, T; Fiellin, D A; Gordon, A J; Metraux, S; Goetz, M B; Blackstock, O; McInnes, K; Rodriguez-Barradas, M C; Justice, A C

    2013-09-01

    While scholarship on alcohol use and homelessness has focused on the impact of alcohol abuse and dependence, little is known about the effects of lower levels of misuse such as hazardous use. Veterans receiving care in the Department of Veterans Affairs Health Care System (VA) constitute a population that is vulnerable to alcohol misuse and homelessness. This research examines the effects of hazardous drinking on homelessness in the Veterans Aging Cohort Study, a sample of 2898 older veterans (mean age=50.2), receiving care in 8 VAs across the country. Logistic regression models examined the associations between (1) hazardous drinking at baseline and homelessness at 1-year follow-up, (2) transitions into and out of hazardous drinking from baseline to follow-up and homelessness at follow-up, and (3) transitioning to hazardous drinking and transitioning to homelessness from baseline to follow-up during that same time-period. After controlling for other correlates including alcohol dependence, hazardous drinking at baseline increased the risk of homelessness at follow-up (adjusted odds ratio [AOR]=1.39, 95% confidence interval [CI]=1.02, 1.88). Transitioning to hazardous drinking more than doubled the risk of homelessness at follow-up (AOR=2.42, 95% CI=1.41, 4.15), while more than doubling the risk of transitioning from being housed at baseline to being homeless at follow-up (AOR=2.49, 95% CI=1.30, 4.79). Early intervention that seeks to prevent transitioning into hazardous drinking could increase housing stability among veterans. Brief interventions which have been shown to be effective at lower levels of alcohol use should be implemented with veterans in VA care. Published by Elsevier Ireland Ltd.
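    The adjusted odds ratios (AORs) quoted above come from ordinary logistic regression; a hedged sketch with invented variables and simulated data (not the Veterans Aging Cohort Study data) showing that AORs are simply the exponentiated coefficients:

```python
# Hedged sketch: logistic regression and adjusted odds ratios (simulated, hypothetical variables).
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 1000
df = pd.DataFrame({
    "hazardous_drinking": rng.integers(0, 2, n),
    "alcohol_dependence": rng.integers(0, 2, n),
    "age": rng.normal(50, 8, n),
})
lin = -2.5 + 0.33 * df["hazardous_drinking"] + 0.8 * df["alcohol_dependence"]
df["homeless"] = rng.binomial(1, 1.0 / (1.0 + np.exp(-lin)))

X = sm.add_constant(df[["hazardous_drinking", "alcohol_dependence", "age"]])
fit = sm.Logit(df["homeless"], X).fit(disp=0)
print(np.exp(fit.params))        # adjusted odds ratios
print(np.exp(fit.conf_int()))    # 95% confidence intervals on the OR scale
```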

  6. Impossibility Theorem in Proportional Representation Problem

    International Nuclear Information System (INIS)

    Karpov, Alexander

    2010-01-01

    The study examines the general axiomatics of Balinski and Young and analyzes existing proportional representation methods using this approach. The second part of the paper provides new axiomatics based on rational choice models. The new system of axioms is applied to study known proportional representation systems. It is shown that there is no proportional representation method satisfying a minimal set of the axioms (monotonicity and neutrality).

  7. The effect of proportional v. value pricing on fountain drink purchases: results from a field experiment.

    Science.gov (United States)

    Gollust, Sarah E; Tang, Xuyang; Runge, Carlisle Ford; French, Simone A; Rothman, Alexander J

    2018-05-15

    Reducing sugar-sweetened beverage consumption is a public health priority, yet finding an effective and acceptable policy intervention is challenging. One strategy is to use proportional pricing (a consistent price per fluid ounce) instead of the typical value-priced approach where large beverages offer better value. The purpose of the present study was to evaluate whether proportional pricing affects the purchasing of fountain beverages at a university cinema concession stand. Four price strategies for beverages were evaluated over ten weekends of film screenings. We manipulated two factors: the price structure (value pricing v. proportional pricing) and the provision of information about the price per fluid ounce (labels v. no labels). The key outcomes were the number and size of beverages purchased. We analysed data using regression analyses, with standard errors clustered by film and controlling for the day and time of purchase. The study took place at a university cinema concession stand in Minnesota, USA, in spring 2015; participants were university students. Over the study period (360 beverages purchased) there were no significant effects of the proportional pricing treatment. Pairing a label with the standard value pricing increased the likelihood of purchasing large drinks but the label did not affect purchasing when paired with proportional pricing. Proportional prices did not significantly affect the size of beverages purchased by students at a university cinema, but adding a price-per-ounce label increased large drink purchases when drinks were value-priced. More work is needed to address whether pricing and labelling strategies might promote healthier beverage purchases.

  8. Applying machine-learning techniques to Twitter data for automatic hazard-event classification.

    Science.gov (United States)

    Filgueira, R.; Bee, E. J.; Diaz-Doce, D.; Poole, J., Sr.; Singh, A.

    2017-12-01

    The constant flow of information offered by tweets provides valuable information about all sorts of events at a high temporal and spatial resolution. Over the past year we have been analyzing geological hazards/phenomena in real time, such as earthquakes, volcanic eruptions, landslides, floods or the aurora, as part of the GeoSocial project, by geo-locating tweets filtered by keywords in a web-map. However, not all the filtered tweets are related to hazard/phenomenon events. This work explores two classification techniques for automatic hazard-event categorization based on tweets about the "Aurora". First, tweets were filtered using aurora-related keywords, removing stop words and selecting the ones written in English. For classifying the remaining tweets into "aurora-event" or "no-aurora-event" categories, we compared two state-of-the-art techniques: Support Vector Machine (SVM) and Deep Convolutional Neural Network (CNN) algorithms. Both approaches belong to the family of supervised learning algorithms, which make predictions based on a labelled training dataset. Therefore, we created a training dataset by tagging 1200 tweets with one of the two categories. The general form of SVM is used to separate two classes by a function (kernel). We compared the performance of four different linear classifiers (Linear Regression, Logistic Regression, Multinomial Naïve Bayes and Stochastic Gradient Descent) provided by the Scikit-Learn library, using our training dataset to build the classifier. The results showed that Logistic Regression (LR) achieved the best accuracy (87%). So, we selected the SVM-LR classifier to categorise a large collection of tweets using the "dispel4py" framework. Later, we developed a CNN classifier, where the first layer embeds words into low-dimensional vectors. The next layer performs convolutions over the embedded word vectors. Results from the convolutional layer are max-pooled into a long feature vector, which is classified using a softmax layer. The CNN's accuracy
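    A minimal sketch of the linear text-classification step described above (TF-IDF features plus a logistic-regression classifier in scikit-learn); the tiny labelled set is invented and the GeoSocial/dispel4py pipeline itself is not reproduced:

```python
# Hedged sketch: tweet classification with TF-IDF + logistic regression (invented examples).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

tweets = [
    "amazing aurora borealis over the bay tonight",
    "aurora lights dancing across the sky right now",
    "watching a movie called aurora with friends",
    "my cat aurora just knocked over the plant",
]
labels = [1, 1, 0, 0]   # 1 = aurora-event, 0 = no-aurora-event

clf = make_pipeline(TfidfVectorizer(stop_words="english"), LogisticRegression())
clf.fit(tweets, labels)
print(clf.predict(["green aurora visible from the beach"]))
```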

  9. Biostatistics Series Module 6: Correlation and Linear Regression.

    Science.gov (United States)

    Hazra, Avijit; Gogtay, Nithya

    2016-01-01

    Correlation and linear regression are the most commonly used techniques for quantifying the association between two numeric variables. Correlation quantifies the strength of the linear relationship between paired variables, expressing this as a correlation coefficient. If both variables x and y are normally distributed, we calculate Pearson's correlation coefficient (r). If the normality assumption is not met for one or both variables in a correlation analysis, a rank correlation coefficient, such as Spearman's rho (ρ), may be calculated. A hypothesis test of correlation tests whether the linear relationship between the two variables holds in the underlying population, in which case it returns a P value. A confidence interval for the correlation coefficient can also be calculated to give an idea of the correlation in the population. The value r² denotes the proportion of the variability of the dependent variable y that can be attributed to its linear relation with the independent variable x and is called the coefficient of determination. Linear regression is a technique that attempts to link two correlated variables x and y in the form of a mathematical equation (y = a + bx), such that given the value of one variable the other may be predicted. In general, the method of least squares is applied to obtain the equation of the regression line. Correlation and linear regression analysis are based on certain assumptions pertaining to the data sets. If these assumptions are not met, misleading conclusions may be drawn. The first assumption is that of a linear relationship between the two variables. A scatter plot is essential before embarking on any correlation-regression analysis to show that this is indeed the case. Outliers or clustering within data sets can distort the correlation coefficient value. Finally, it is vital to remember that though strong correlation can be a pointer toward causation, the two are not synonymous.
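    The quantities described above are one-liners in most statistical environments; a minimal sketch with made-up paired data (SciPy):

```python
# Hedged sketch: Pearson r, Spearman rho, and the coefficient of determination r^2 (made-up data).
import numpy as np
from scipy import stats

x = np.array([1.2, 2.0, 2.9, 4.1, 5.0, 6.2, 7.1])
y = np.array([2.3, 2.9, 3.9, 5.2, 5.8, 7.4, 7.9])

r, p_value = stats.pearsonr(x, y)
rho, _ = stats.spearmanr(x, y)
print(r, p_value, rho, r**2)   # r^2 = proportion of the variability of y explained by x
```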

  10. Proportioning of light weight concrete

    DEFF Research Database (Denmark)

    Palmus, Lars

    1996-01-01

    Development of a method to determine the proportions of the raw materials in light weight concrete made with light expanded clay aggregate. The method is based on composite theory.

  11. Regression: A Bibliography.

    Science.gov (United States)

    Pedrini, D. T.; Pedrini, Bonnie C.

    Regression, another mechanism studied by Sigmund Freud, has had much research, e.g., hypnotic regression, frustration regression, schizophrenic regression, and infra-human-animal regression (often directly related to fixation). Many investigators worked with hypnotic age regression, which has a long history, going back to Russian reflexologists.…

  12. Development of multiwire proportional chambers

    CERN Multimedia

    Charpak, G

    1969-01-01

    It has happened quite often in the history of science that theoreticians, confronted with some major difficulty, have successfully gone back thirty years to look at ideas that had then been thrown overboard. But it is rare that experimentalists go back thirty years to look again at equipment which had become out-dated. This is what Charpak and his colleagues did to emerge with the 'multiwire proportional chamber' which has several new features making it a very useful addition to the armoury of particle detectors. In the 1930s, ion-chambers, Geiger- Muller counters and proportional counters, were vital pieces of equipment in nuclear physics research. Other types of detectors have since largely replaced them but now the proportional counter, in new array, is making a comeback.

  13. Advanced statistics: linear regression, part I: simple linear regression.

    Science.gov (United States)

    Marill, Keith A

    2004-01-01

    Simple linear regression is a mathematical technique used to model the relationship between a single independent predictor variable and a single dependent outcome variable. In this, the first of a two-part series exploring concepts in linear regression analysis, the four fundamental assumptions and the mechanics of simple linear regression are reviewed. The most common technique used to derive the regression line, the method of least squares, is described. The reader will be acquainted with other important concepts in simple linear regression, including: variable transformations, dummy variables, relationship to inference testing, and leverage. Simplified clinical examples with small datasets and graphic models are used to illustrate the points. This will provide a foundation for the second article in this series: a discussion of multiple linear regression, in which there are multiple predictor variables.
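    The method of least squares has a closed form in the simple case; a minimal worked sketch with made-up data, checked against NumPy's polynomial fit:

```python
# Hedged sketch: closed-form least-squares estimates for y = a + b*x (made-up data).
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 2.9, 3.6, 4.4, 5.2])

b = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)  # slope
a = y.mean() - b * x.mean()                                                # intercept
assert np.allclose([b, a], np.polyfit(x, y, 1))                            # matches polyfit
print(a, b)
```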

  14. Managing patients with concerns about workplace reproductive hazards.

    Science.gov (United States)

    Frazier, L M; Jones, T L

    2000-01-01

    To find out who uses an occupational reproductive consultation service, what proportion of patients have different types of workplace exposures, and what hypotheses can be generated about barriers to implementing medically necessary job modifications to promote reproductive health. A case series study was conducted by reviewing medical records at two occupational health clinics. 51 patients (1 man and 50 women) were seen, 10 of whom wished to discuss a future pregnancy and 41 of whom were pregnant. Pregnant women worked with a mean of 15.5 different chemicals, and patients were also concerned about ionizing radiation, biological hazards, electromagnetic fields, and ultraviolet light. Pregnant women made clinic visits at a mean gestational age of 10.9 weeks. Only one man used the service, suggesting a lack of knowledge about possible paternal contributions to adverse reproductive outcomes. Many pregnant women visited the clinic too late to prevent harm from exposure to some teratogens, so preconception counseling may be of benefit. Cases are presented that illustrate ways in which the primary care provider can assist the patient who may be exposed to reproductive hazards.

  15. Macro-level vulnerable road users crash analysis: A Bayesian joint modeling approach of frequency and proportion.

    Science.gov (United States)

    Cai, Qing; Abdel-Aty, Mohamed; Lee, Jaeyoung

    2017-10-01

    This study aims at contributing to the literature on pedestrian and bicyclist safety by building on the conventional count regression models to explore exogenous factors affecting pedestrian and bicyclist crashes at the macroscopic level. In the traditional count models, effects of exogenous factors on non-motorist crashes were investigated directly. However, the vulnerable road users' crashes are collisions between vehicles and non-motorists. Thus, the exogenous factors can affect the non-motorist crashes through both the non-motorists and the vehicle drivers. To accommodate the potentially different impacts of exogenous factors, we express the non-motorist crash counts as the product of the total crash counts and the proportion of non-motorist crashes, and formulate a joint model of the negative binomial (NB) model and the logit model to deal with the two parts, respectively. The formulated joint model is estimated using non-motorist crash data based on the Traffic Analysis Districts (TADs) in Florida. Meanwhile, the traditional NB model is also estimated and compared with the joint model. The result indicates that the joint model provides a better data fit and can identify more significant variables. Subsequently, a novel joint screening method is suggested based on the proposed model to identify hot zones for non-motorist crashes. The hot zones of non-motorist crashes are identified and divided into three types: hot zones with a more dangerous driving environment only, hot zones with more hazardous walking and cycling conditions only, and hot zones with both. It is expected that the joint model and screening method can help decision makers, transportation officials, and community planners to make more efficient treatments to proactively improve pedestrian and bicyclist safety. Published by Elsevier Ltd.
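    Before moving to the full Bayesian joint model, the decomposition described above (total crash counts times the share that involve non-motorists) can be prototyped with two separate GLMs. The sketch below uses invented zone-level data and statsmodels, and deliberately ignores the joint/Bayesian and spatial aspects of the paper:

```python
# Hedged sketch: NB model for total crashes + binomial model for the non-motorist share.
# Zone-level data are invented; this is not the paper's joint Bayesian model.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 300
df = pd.DataFrame({
    "vmt": rng.uniform(1.0, 10.0, n),           # hypothetical exposure: vehicle miles travelled
    "pop_density": rng.uniform(0.1, 5.0, n),    # hypothetical zonal covariate
})
df["total_crashes"] = 1 + rng.poisson(2 + 3 * df["vmt"])          # at least one crash per zone
share = 1.0 / (1.0 + np.exp(-(-2 + 0.4 * df["pop_density"])))
df["nonmotorist_crashes"] = rng.binomial(df["total_crashes"], share)

X = sm.add_constant(df[["vmt", "pop_density"]])
nb = sm.GLM(df["total_crashes"], X, family=sm.families.NegativeBinomial()).fit()
prop = sm.GLM(np.column_stack([df["nonmotorist_crashes"],
                               df["total_crashes"] - df["nonmotorist_crashes"]]),
              X, family=sm.families.Binomial()).fit()
print(nb.params, prop.params)
```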

  16. Using threshold regression to analyze survival data from complex surveys: With application to mortality linked NHANES III Phase II genetic data.

    Science.gov (United States)

    Li, Yan; Xiao, Tao; Liao, Dandan; Lee, Mei-Ling Ting

    2018-03-30

    The Cox proportional hazards (PH) model is a common statistical technique used for analyzing time-to-event data. The assumption of PH, however, is not always appropriate in real applications. In cases where the assumption is not tenable, threshold regression (TR) and other survival methods, which do not require the PH assumption, are available and widely used. These alternative methods generally assume that the study data constitute simple random samples. In particular, TR has not been studied in the setting of complex surveys that involve (1) differential selection probabilities of study subjects and (2) intracluster correlations induced by multistage cluster sampling. In this paper, we extend TR procedures to account for complex sampling designs. The pseudo-maximum likelihood estimation technique is applied to estimate the TR model parameters. Computationally efficient Taylor linearization variance estimators that consider both the intracluster correlation and the differential selection probabilities are developed. The proposed methods are evaluated by using simulation experiments with various complex designs and illustrated empirically by using mortality-linked Third National Health and Nutrition Examination Survey Phase II genetic data. Copyright © 2017 John Wiley & Sons, Ltd.

  17. Misspecified poisson regression models for large-scale registry data: inference for 'large n and small p'.

    Science.gov (United States)

    Grøn, Randi; Gerds, Thomas A; Andersen, Per K

    2016-03-30

    Poisson regression is an important tool in register-based epidemiology where it is used to study the association between exposure variables and event rates. In this paper, we will discuss the situation with 'large n and small p', where n is the sample size and p is the number of available covariates. Specifically, we are concerned with modeling options when there are time-varying covariates that can have time-varying effects. One problem is that tests of the proportional hazards assumption, of no interactions between exposure and other observed variables, or of other modeling assumptions have large power due to the large sample size and will often indicate statistical significance even for numerically small deviations that are unimportant for the subject matter. Another problem is that information on important confounders may be unavailable. In practice, this situation may lead to simple working models that are then likely misspecified. To support and improve conclusions drawn from such models, we discuss methods for sensitivity analysis, for estimation of average exposure effects using aggregated data, and a semi-parametric bootstrap method to obtain robust standard errors. The methods are illustrated using data from the Danish national registries investigating the diabetes incidence for individuals treated with antipsychotics compared with the general unexposed population. Copyright © 2015 John Wiley & Sons, Ltd.
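    In practice, one common answer to such mild misspecification is a Poisson working model with robust (sandwich) standard errors. A minimal sketch with simulated person-time data (statsmodels), not the Danish registry analysis itself:

```python
# Hedged sketch: Poisson rate regression with an offset and robust (sandwich) standard errors.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 5000
df = pd.DataFrame({
    "exposed": rng.integers(0, 2, n),
    "age": rng.normal(55, 10, n),
    "person_years": rng.uniform(1.0, 10.0, n),
})
rate = np.exp(-5 + 0.4 * df["exposed"] + 0.01 * df["age"])
df["events"] = rng.poisson(rate * df["person_years"])

X = sm.add_constant(df[["exposed", "age"]])
fit = sm.GLM(df["events"], X, family=sm.families.Poisson(),
             offset=np.log(df["person_years"])).fit(cov_type="HC0")   # robust SEs
print(fit.summary().tables[1])
```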

  18. The Current and Future Use of Ridge Regression for Prediction in Quantitative Genetics

    Directory of Open Access Journals (Sweden)

    Ronald de Vlaming

    2015-01-01

    Full Text Available In recent years, there has been a considerable amount of research on the use of regularization methods for inference and prediction in quantitative genetics. Such research mostly focuses on selection of markers and shrinkage of their effects. In this review paper, the use of ridge regression for prediction in quantitative genetics using single-nucleotide polymorphism data is discussed. In particular, we consider (i) the theoretical foundations of ridge regression, (ii) its link to commonly used methods in animal breeding, (iii) the computational feasibility, and (iv) the scope for constructing prediction models with nonlinear effects (e.g., dominance and epistasis). Based on a simulation study we gauge the current and future potential of ridge regression for prediction of human traits using genome-wide SNP data. We conclude that, for outcomes with a relatively simple genetic architecture, given current sample sizes in most cohorts (i.e., N<10,000) the predictive accuracy of ridge regression is slightly higher than the classical genome-wide association study approach of repeated simple regression (i.e., one regression per SNP). However, both capture only a small proportion of the heritability. Nevertheless, we find evidence that for large-scale initiatives, such as biobanks, sample sizes can be achieved where ridge regression compared to the classical approach improves predictive accuracy substantially.
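    The contrast drawn above (ridge regression on all SNPs jointly versus one simple regression per SNP) is easy to sketch; randomly simulated genotype dosages stand in for real SNP data:

```python
# Hedged sketch: joint ridge regression vs. one simple regression per SNP (GWAS-style).
# Genotypes and effects are simulated; this is not the paper's simulation design.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(5)
n, p = 2000, 500
X = rng.integers(0, 3, size=(n, p)).astype(float)    # hypothetical SNP dosages 0/1/2
beta = np.zeros(p)
beta[:20] = rng.normal(0, 0.2, 20)                    # a few truly associated SNPs
y = X @ beta + rng.normal(0, 1, n)

ridge = Ridge(alpha=100.0).fit(X, y)                  # joint shrinkage of all SNP effects

Xc = X - X.mean(axis=0)                               # classical approach: one slope per SNP
per_snp = (Xc * (y - y.mean())[:, None]).sum(axis=0) / (Xc**2).sum(axis=0)
print(np.corrcoef(ridge.coef_, beta)[0, 1], np.corrcoef(per_snp, beta)[0, 1])
```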

  19. Hazards and hazard combinations relevant for the safety of nuclear power plants

    Science.gov (United States)

    Decker, Kurt; Brinkman, Hans; Raimond, Emmanuel

    2017-04-01

    The potential of the contemporaneous impact of different, yet causally related, hazardous events and event cascades on nuclear power plants is a major contributor to the overall risk of nuclear installations. In the aftermath of the Fukushima accident, which was caused by a combination of severe ground shaking by an earthquake, an earthquake-triggered tsunami and the disruption of the plants from the electrical grid by a seismically induced landslide, hazard combinations and hazard cascades moved into the focus of nuclear safety research. We therefore developed an exhaustive list of external hazards and hazard combinations which pose potential threats to nuclear installations in the framework of the European project ASAMPSAE (Advanced Safety Assessment: Extended PSA). The project gathers 31 partners from Europe, North America and Japan. The list comprises exhaustive lists of natural hazards, external man-made hazards, and a cross-correlation matrix of these hazards. The hazard list is regarded as comprehensive by including all types of hazards that were previously cited in documents by IAEA, the Western European Nuclear Regulators Association (WENRA), and others. 73 natural hazards and 24 man-made external hazards are included. Natural hazards are grouped into seismotectonic hazards, flooding and hydrological hazards, extreme values of meteorological phenomena, rare meteorological phenomena, biological hazards / infestation, geological hazards, and forest fire / wild fire. The list of external man-made hazards includes industry accidents, military accidents, transportation accidents, pipeline accidents and other man-made external events. The large number of different hazards results in the extremely large number of 5151 theoretically possible hazard combinations (not considering hazard cascades). In principle all of these combinations are possible to occur by random coincidence except for 82 hazard combinations that - depending on the time scale - are mutually

  20. Hazardous Waste

    Science.gov (United States)

    ... chemicals can still harm human health and the environment. When you throw these substances away, they become hazardous waste. Some hazardous wastes come from products in our homes. Our garbage can include such hazardous wastes as old batteries, bug spray cans and paint thinner. U.S. residents ...

  1. Penalized variable selection in competing risks regression.

    Science.gov (United States)

    Fu, Zhixuan; Parikh, Chirag R; Zhou, Bingqing

    2017-07-01

    Penalized variable selection methods have been extensively studied for standard time-to-event data. Such methods cannot be directly applied when subjects are at risk of multiple mutually exclusive events, known as competing risks. The proportional subdistribution hazard (PSH) model proposed by Fine and Gray (J Am Stat Assoc 94:496-509, 1999) has become a popular semi-parametric model for time-to-event data with competing risks. It allows for direct assessment of covariate effects on the cumulative incidence function. In this paper, we propose a general penalized variable selection strategy that simultaneously handles variable selection and parameter estimation in the PSH model. We rigorously establish the asymptotic properties of the proposed penalized estimators and modify the coordinate descent algorithm for implementation. Simulation studies are conducted to demonstrate the good performance of the proposed method. Data from deceased donor kidney transplants from the United Network of Organ Sharing illustrate the utility of the proposed method.
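    The penalized proportional subdistribution hazard estimator itself is not available off the shelf in common Python libraries, but the analogous idea for an ordinary Cox model (a lasso or elastic-net penalty on the regression coefficients) can be sketched with lifelines. This is a stand-in under that substitution, not the Fine-Gray PSH method of the paper, and the penalizer/l1_ratio arguments assume a reasonably recent lifelines version.

```python
# Hedged sketch: lasso-penalized ordinary Cox regression (lifelines), as a stand-in for the
# penalized PSH estimator described in the paper; requires a recent lifelines release.
from lifelines import CoxPHFitter
from lifelines.datasets import load_rossi

df = load_rossi()                                  # built-in example data (recidivism study)
cph = CoxPHFitter(penalizer=0.1, l1_ratio=1.0)     # l1_ratio = 1.0 -> pure lasso penalty
cph.fit(df, duration_col="week", event_col="arrest")
print(cph.params_)                                 # several coefficients shrunk toward zero
```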

  2. Urban-Hazard Risk Analysis: Mapping of Heat-Related Risks in the Elderly in Major Italian Cities

    Science.gov (United States)

    Morabito, Marco; Crisci, Alfonso; Gioli, Beniamino; Gualtieri, Giovanni; Toscano, Piero; Di Stefano, Valentina; Orlandini, Simone; Gensini, Gian Franco

    2015-01-01

    Background Short-term impacts of high temperatures on the elderly are well known. Even though Italy has the highest proportion of elderly citizens in Europe, there is a lack of information on spatial heat-related elderly risks. Objectives Development of high-resolution, heat-related urban risk maps regarding the elderly population (≥65). Methods A long time-series (2001–2013) of remote sensing MODIS data, averaged over the summer period for eleven major Italian cities, were downscaled to obtain high spatial resolution (100 m) daytime and night-time land surface temperatures (LST). LST was estimated pixel-wise by applying two statistical model approaches: 1) the Linear Regression Model (LRM); 2) the Generalized Additive Model (GAM). Total and elderly population density data were extracted from the Joint Research Centre population grid (100 m) from the 2001 census (Eurostat source), and processed together using “Crichton’s Risk Triangle” hazard-risk methodology for obtaining a Heat-related Elderly Risk Index (HERI). Results The GAM procedure allowed for improved daytime and night-time LST estimations compared to the LRM approach. High-resolution maps of daytime and night-time HERI levels were developed for inland and coastal cities. Urban areas with the hazardous HERI level (very high risk) were not necessarily characterized by the highest temperatures. The hazardous HERI level was generally localized to encompass the city-centre in inland cities and the inner area in coastal cities. The two most dangerous HERI levels were greater in the coastal rather than inland cities. Conclusions This study shows the great potential of combining geospatial technologies and spatial demographic characteristics within a simple and flexible framework in order to provide high-resolution urban mapping of daytime and night-time HERI. In this way, potential areas for intervention are immediately identified with up-to-street level details. This information could support public

  3. Assessment of participation bias in cohort studies: systematic review and meta-regression analysis

    Directory of Open Access Journals (Sweden)

    Sérgio Henrique Almeida da Silva Junior

    2015-11-01

    Full Text Available The proportion of non-participation in cohort studies, if associated with both the exposure and the probability of occurrence of the event, can introduce bias in the estimates of interest. The aim of this study is to evaluate the impact of participation and its characteristics in longitudinal studies. A systematic review (MEDLINE, Scopus and Web of Science) for articles describing the proportion of participation in the baseline of cohort studies was performed. Among the 2,964 initially identified, 50 were selected. The average proportion of participation was 64.7%. Using a meta-regression model with mixed effects, only age, year of baseline contact and study region (borderline) were associated with participation. Considering the decrease in participation in recent years, and the cost of cohort studies, it is essential to gather information to assess the potential for non-participation, before committing resources. Finally, journals should require the presentation of this information in the papers.

  4. Hazard perception in traffic. [previously known as: Hazard perception.]

    NARCIS (Netherlands)

    2008-01-01

    Hazard perception is an essential part of the driving task. There are clear indications that insufficient skills in perceiving hazards play an important role in the occurrence of crashes, especially those involving novice drivers. Proper hazard perception not only consists of scanning and perceiving

  5. Using Swiss Webster mice to model Fetal Alcohol Spectrum Disorders (FASD): An analysis of multilevel time-to-event data through mixed-effects Cox proportional hazards models.

    Science.gov (United States)

    Chi, Peter; Aras, Radha; Martin, Katie; Favero, Carlita

    2016-05-15

    Fetal Alcohol Spectrum Disorders (FASD) collectively describes the constellation of effects resulting from human alcohol consumption during pregnancy. Even with public awareness, the incidence of FASD is estimated to be upwards of 5% in the general population and is becoming a global health problem. The physical, cognitive, and behavioral impairments of FASD are recapitulated in animal models. Recently rodent models utilizing voluntary drinking paradigms have been developed that accurately reflect moderate consumption, which makes up the majority of FASD cases. The range in severity of FASD characteristics reflects the frequency, dose, developmental timing, and individual susceptibility to alcohol exposure. As most rodent models of FASD use C57BL/6 mice, there is a need to expand the stocks of mice studied in order to more fully understand the complex neurobiology of this disorder. To that end, we allowed pregnant Swiss Webster mice to voluntarily drink ethanol via the drinking in the dark (DID) paradigm throughout their gestation period. Ethanol exposure did not alter gestational outcomes as determined by no significant differences in maternal weight gain, maternal liquid consumption, litter size, or pup weight at birth or weaning. Despite seemingly normal gestation, ethanol-exposed offspring exhibit significantly altered timing to achieve developmental milestones (surface righting, cliff aversion, and open field traversal), as analyzed through mixed-effects Cox proportional hazards models. These results confirm Swiss Webster mice as a viable option to study the incidence and causes of ethanol-induced neurobehavioral alterations during development. Future studies in our laboratory will investigate the brain regions and molecules responsible for these behavioral changes. Copyright © 2016. Published by Elsevier B.V.

  6. Occupational hazards and illnesses of Filipino women workers in export processing zones.

    Science.gov (United States)

    Lu, Jinky Leilanie

    2008-01-01

    This was a baseline study on occupational exposure and health problems among women workers in export processing zones. Physical, chemical, and ergonomic hazards were evaluated and measured through workplace ambient monitoring, survey questionnaires, and interviews with 500 respondents in 24 companies (most were female at 88.8%). The top 5 hazards were ergonomic hazards (72.2%), heat (66.6%), overwork (66.6%), poor ventilation (54.8%), and chemical exposure (50.8%). The most common illnesses were gastrointestinal problems (57.4%), backache (56%), headache (53.2%), and fatigue/weakness (53.2%). Logistic regression showed an association between certain work-related factors and occupational illnesses, and psychosocial problems. Highly significant associations were hearing loss with years spent in the company (p=.005) and gender (p=.006), headache and dizziness with poor ventilation (p=.000), backache with prolonged work (p=.003). These results will have implications for policy and program formulation for women workers' concerns and issues in export zones.

  7. Adaptive bayesian analysis for binomial proportions

    CSIR Research Space (South Africa)

    Das, Sonali

    2008-10-01

    Full Text Available of testing the proportion of some trait. For example, say we are interested in inferring the effectiveness of a certain intervention teaching strategy by comparing the proportion of ‘proficient’ teachers before and after an intervention. The number...
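    The conjugate Beta-Binomial machinery behind such an analysis is compact; a sketch comparing a 'before' and an 'after' proportion with made-up counts and a flat prior (the adaptive prior-selection aspect of the paper is not reproduced):

```python
# Hedged sketch: Beta-Binomial posteriors for a proportion, before vs. after an intervention.
# Counts are invented; a flat Beta(1, 1) prior is assumed.
from scipy import stats

a0, b0 = 1.0, 1.0                          # Beta(1, 1) prior
before = (18, 50)                          # (proficient, total), hypothetical
after = (31, 50)

post_before = stats.beta(a0 + before[0], b0 + before[1] - before[0])
post_after = stats.beta(a0 + after[0], b0 + after[1] - after[0])

# Monte Carlo estimate of P(proportion_after > proportion_before | data)
p_greater = (post_after.rvs(100000, random_state=1) >
             post_before.rvs(100000, random_state=2)).mean()
print(round(post_before.mean(), 3), round(post_after.mean(), 3), round(p_greater, 3))
```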

  8. Icon arrays help younger children's proportional reasoning.

    Science.gov (United States)

    Ruggeri, Azzurra; Vagharchakian, Laurianne; Xu, Fei

    2018-06-01

    We investigated the effects of two context variables, presentation format (icon arrays or numerical frequencies) and time limitation (limited or unlimited time), on the proportional reasoning abilities of children aged 7 and 10 years, as well as adults. Participants had to select, between two sets of tokens, the one that offered the highest likelihood of drawing a gold token, that is, the set of elements with the greater proportion of gold tokens. Results show that participants performed better in the unlimited time condition. Moreover, besides a general developmental improvement in accuracy, our results show that younger children performed better when proportions were presented as icon arrays, whereas older children and adults were similarly accurate in the two presentation format conditions. Statement of contribution What is already known on this subject? There is a developmental improvement in proportional reasoning accuracy. Icon arrays facilitate reasoning in adults with low numeracy. What does this study add? Participants were more accurate when they were given more time to make the proportional judgement. Younger children's proportional reasoning was more accurate when they were presented with icon arrays. Proportional reasoning abilities correlate with working memory, approximate number system, and subitizing skills. © 2018 The British Psychological Society.

  9. Identification of Potential Hazard using Hazard Identification and Risk Assessment

    Science.gov (United States)

    Sari, R. M.; Syahputri, K.; Rizkya, I.; Siregar, I.

    2017-03-01

    This research was conducted at a paper production company whose paper products are used as cigarette paper. Throughout the production process, the company provides machines and equipment operated by workers. During operations, all workers are potentially exposed to injury; such an exposure is known as a potential hazard. Hazard identification and risk assessment is one part of a safety and health program in the stage of risk management. This is very important as part of efforts to prevent occupational injuries and diseases resulting from work. The problem addressed by this research is that the potential hazards and risks faced by workers during the production process had not been identified. The purpose of this study was therefore to identify the potential hazards by using hazard identification and risk assessment methods. Risk assessment is done using severity criteria and the probability of an accident. The research identified 23 potential hazards with varying severity and probability. A Risk Assessment Code (RAC) was then determined for each potential hazard, yielding 3 extreme risks, 10 high risks, 6 medium risks and 3 low risks. Potential hazards were successfully identified using the RAC.
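    A risk assessment code of this kind is, operationally, a lookup from a severity rating and a probability rating to a risk category. The matrix below is invented for illustration (the company's actual RAC table is not given in the abstract):

```python
# Hedged sketch: mapping severity x probability to a Risk Assessment Code (invented matrix).
# Severity: 1 = catastrophic .. 4 = negligible; probability: "A" = frequent .. "D" = unlikely.
RAC = {
    (1, "A"): "extreme", (1, "B"): "extreme", (1, "C"): "high",   (1, "D"): "high",
    (2, "A"): "extreme", (2, "B"): "high",    (2, "C"): "high",   (2, "D"): "medium",
    (3, "A"): "high",    (3, "B"): "medium",  (3, "C"): "medium", (3, "D"): "low",
    (4, "A"): "medium",  (4, "B"): "low",     (4, "C"): "low",    (4, "D"): "low",
}

hazards = [("pinch point at rewinder", 2, "B"), ("noise exposure", 3, "A")]  # hypothetical
for name, severity, probability in hazards:
    print(f"{name}: {RAC[(severity, probability)]} risk")
```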

  10. Probabilistic Seismic Hazard Assessment for Northeast India Region

    Science.gov (United States)

    Das, Ranjit; Sharma, M. L.; Wason, H. R.

    2016-08-01

    Northeast India, bounded by latitudes 20°-30°N and longitudes 87°-98°E, is one of the most seismically active areas in the world. This region has experienced several moderate-to-large-sized earthquakes, including the 12 June, 1897 Shillong earthquake (Mw 8.1) and the 15 August, 1950 Assam earthquake (Mw 8.7), which caused loss of human lives and significant damage to buildings, highlighting the importance of seismic hazard assessment for the region. Probabilistic seismic hazard assessment of the region has been carried out using a unified moment magnitude catalog prepared by an improved General Orthogonal Regression methodology (Geophys J Int, 190:1091-1096, 2012; Probabilistic seismic hazard assessment of Northeast India region, Ph.D. Thesis, Department of Earthquake Engineering, IIT Roorkee, Roorkee, 2013) with events compiled from various databases (ISC, NEIC, GCMT, IMD) and other available catalogs. The study area has been subdivided into nine seismogenic source zones to account for local variation in tectonics and seismicity characteristics. The seismicity parameters are estimated for each of these source zones, and these are the input variables for seismic hazard estimation of a region. The seismic hazard analysis of the study region has been performed by dividing the area into grids of size 0.1° × 0.1°. Peak ground acceleration (PGA) and spectral acceleration (Sa) values (for periods of 0.2 and 1 s) have been evaluated at bedrock level corresponding to probabilities of exceedance (PE) of 50, 20, 10, 2 and 0.5% in 50 years. These exceedance values correspond to return periods of 100, 225, 475, 2475, and 10,000 years, respectively. The seismic hazard maps have been prepared at the bedrock level, and it is observed that the seismic hazard estimates show a significant local variation in contrast to the uniform hazard value suggested by the Indian standard seismic code [Indian standard, criteria for earthquake-resistant design of structures, fifth edition, Part
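
    The probabilities of exceedance and return periods quoted above are linked by the standard Poisson-process relation T = -t / ln(1 - P), where P is the probability of exceedance within an exposure time t. The sketch below reproduces the conversion; it is an editorial illustration, not part of the study's hazard computation.

    ```python
    import numpy as np

    t = 50.0                                        # exposure window in years
    pe = np.array([0.50, 0.20, 0.10, 0.02, 0.005])  # probability of exceedance in t years

    T_poisson = -t / np.log(1.0 - pe)   # Poisson-process return period
    T_simple = t / pe                   # simple approximation T = t / P

    for p, tp, ts in zip(pe, T_poisson, T_simple):
        print(f"PE {p:6.1%} in {t:.0f} yr -> return period {tp:8.1f} yr (simple: {ts:8.1f} yr)")
    # The familiar 225-, 475-, 2475- and ~10,000-year values follow from the Poisson relation;
    # the "100-year" figure often quoted for 50% PE comes from the simple t/P approximation.
    ```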

  11. Electronics for proportional drift tubes

    International Nuclear Information System (INIS)

    Fremont, G.; Friend, B.; Mess, K.H.; Schmidt-Parzefall, W.; Tarle, J.C.; Verweij, H.; CERN-Hamburg-Amsterdam-Rome-Moscow Collaboration); Geske, K.; Riege, H.; Schuett, J.; CERN-Hamburg-Amsterdam-Rome-Moscow Collaboration); Semenov, Y.; CERN-Hamburg-Amsterdam-Rome-Moscow Collaboration)

    1980-01-01

    An electronic system for the read-out of a large number of proportional drift tubes (16,000) has been designed. This system measures the deposited charge and the drift time of the charge from a particle traversing a proportional drift tube. A second event can be accepted during the read-out of the system. Up to 40 typical events can be collected and buffered before a data transfer to a computer is necessary. (orig.)

  12. Stagewise pseudo-value regression for time-varying effects on the cumulative incidence

    DEFF Research Database (Denmark)

    Zöller, Daniela; Schmidtmann, Irene; Weinmann, Arndt

    2016-01-01

    In a competing risks setting, the cumulative incidence of an event of interest describes the absolute risk for this event as a function of time. For regression analysis, one can either choose to model all competing events by separate cause-specific hazard models or directly model the association...... for time-varying effects. This is implemented by coupling variable selection between the grid times, but determining estimates separately. The effect estimates are regularized to also allow for model fitting with a low to moderate number of observations. This technique is illustrated in an application...

  13. Ordinary least square regression, orthogonal regression, geometric mean regression and their applications in aerosol science

    International Nuclear Information System (INIS)

    Leng Ling; Zhang Tianyi; Kleinman, Lawrence; Zhu Wei

    2007-01-01

    Regression analysis, especially the ordinary least squares method which assumes that errors are confined to the dependent variable, has seen a fair share of its applications in aerosol science. The ordinary least squares approach, however, could be problematic due to the fact that atmospheric data often does not lend itself to calling one variable independent and the other dependent. Errors often exist for both measurements. In this work, we examine two regression approaches available to accommodate this situation. They are orthogonal regression and geometric mean regression. Comparisons are made theoretically as well as numerically through an aerosol study examining whether the ratio of organic aerosol to CO would change with age.
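
    To make the contrast concrete, the sketch below computes the three slope estimators on the same synthetic x-y sample (not the aerosol data analysed in the paper), with measurement error added to both variables so the estimates diverge.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n = 200
    true_x = rng.normal(10.0, 2.0, n)
    x = true_x + rng.normal(0.0, 1.0, n)                # measurement error in x
    y = 2.0 * true_x + 1.0 + rng.normal(0.0, 1.0, n)    # measurement error in y

    sxx = np.var(x, ddof=1)
    syy = np.var(y, ddof=1)
    sxy = np.cov(x, y, ddof=1)[0, 1]

    b_ols = sxy / sxx                          # assumes error only in y
    b_gmr = np.sign(sxy) * np.sqrt(syy / sxx)  # geometric mean regression
    b_orth = (syy - sxx + np.sqrt((syy - sxx) ** 2 + 4 * sxy ** 2)) / (2 * sxy)  # orthogonal (Deming, lambda = 1)

    for name, b in [("OLS", b_ols), ("geometric mean", b_gmr), ("orthogonal", b_orth)]:
        a = y.mean() - b * x.mean()            # every line passes through the centroid
        print(f"{name:>15}: slope = {b:.3f}, intercept = {a:.3f}")
    # With error in x, the OLS slope is attenuated toward zero, while the
    # errors-in-variables estimators stay close to the true slope of 2.
    ```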

  14. Cognitive and Metacognitive Aspects of Proportional Reasoning

    Science.gov (United States)

    Modestou, Modestina; Gagatsis, Athanasios

    2010-01-01

    In this study we attempt to propose a new model of proportional reasoning based on both bibliographical and research data. This is accomplished with the help of three written tests involving analogical, proportional, and non-proportional situations that were administered to pupils from grades 7 to 9. The results suggest the existence of a…

  15. Collision prediction models using multivariate Poisson-lognormal regression.

    Science.gov (United States)

    El-Basyouny, Karim; Sayed, Tarek

    2009-07-01

    This paper advocates the use of multivariate Poisson-lognormal (MVPLN) regression to develop models for collision count data. The MVPLN approach presents an opportunity to incorporate the correlations across collision severity levels and their influence on safety analyses. The paper introduces a new multivariate hazardous location identification technique, which generalizes the univariate posterior probability of excess that has been commonly proposed and applied in the literature. In addition, the paper presents an alternative approach for quantifying the effect of the multivariate structure on the precision of expected collision frequency. The MVPLN approach is compared with the independent (separate) univariate Poisson-lognormal (PLN) models with respect to model inference, goodness-of-fit, identification of hot spots and precision of expected collision frequency. The MVPLN model is estimated using the WinBUGS platform, which facilitates computation of posterior distributions as well as providing a goodness-of-fit measure for model comparisons. The results indicate that the estimates of the extra Poisson variation parameters were considerably smaller under MVPLN, leading to higher precision. The improvement in precision is due mainly to the fact that MVPLN accounts for the correlation between the latent variables representing property damage only (PDO) and injuries plus fatalities (I+F). This correlation was estimated at 0.758, which is highly significant, suggesting that higher PDO rates are associated with higher I+F rates, as the collision likelihood for both types is likely to rise due to similar deficiencies in roadway design and/or other unobserved factors. In terms of goodness-of-fit, the MVPLN model provided a better fit than the independent univariate models. The multivariate hazardous location identification results demonstrated that some hazardous locations could be overlooked if the analysis was restricted to the univariate models.

  16. Comparison of Bayesian and frequentist approaches in modelling risk of preterm birth near the Sydney Tar Ponds, Nova Scotia, Canada

    Directory of Open Access Journals (Sweden)

    Canty Angelo

    2007-09-01

    Full Text Available Abstract Background This study compares the Bayesian and frequentist (non-Bayesian) approaches in the modelling of the association between the risk of preterm birth and maternal proximity to hazardous waste and pollution from the Sydney Tar Pond site in Nova Scotia, Canada. Methods The data include 1604 observed cases of preterm birth out of a total population of 17559 at risk of preterm birth from 144 enumeration districts in the Cape Breton Regional Municipality. Other covariates include the distance from the Tar Pond; the unemployment rate; the proportion of persons who are separated, divorced or widowed; the proportion of persons who have no high school diploma; the proportion of persons living alone; the proportion of single parent families and average income. Bayesian hierarchical Poisson regression, quasi-likelihood Poisson regression and weighted linear regression models were fitted to the data. Results The results of the analyses were compared together with their limitations. Conclusion The results of the weighted linear regression and the quasi-likelihood Poisson regression agree with the results from the Bayesian hierarchical modelling, which incorporates the spatial effects.
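
    As a small frequentist companion to the comparison above, the sketch below fits a quasi-likelihood (overdispersed) Poisson regression and a weighted linear regression with statsmodels; the Bayesian hierarchical spatial model is not reproduced. The simulated area-level data and variable names are purely illustrative.

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(2)
    n_areas = 144
    distance = rng.uniform(0.5, 20.0, n_areas)      # km to the hazardous site (hypothetical)
    unemployment = rng.uniform(0.02, 0.25, n_areas)
    at_risk = rng.integers(50, 300, n_areas)        # births at risk per enumeration district

    # Simulate preterm counts with a log link plus extra-Poisson noise
    log_rate = -2.3 - 0.03 * distance + 1.5 * unemployment + rng.normal(0, 0.15, n_areas)
    preterm = rng.poisson(at_risk * np.exp(log_rate))

    X = sm.add_constant(np.column_stack([distance, unemployment]))

    # Quasi-likelihood Poisson: dispersion estimated from the Pearson chi-square
    quasi = sm.GLM(preterm, X, family=sm.families.Poisson(),
                   offset=np.log(at_risk)).fit(scale="X2")
    print(quasi.params, quasi.scale)

    # Weighted linear regression on observed rates, weighted by the population at risk
    wls = sm.WLS(preterm / at_risk, X, weights=at_risk).fit()
    print(wls.params)
    ```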

  17. [HIV/AIDS related mortality in southern Shanxi province and its risk factors].

    Science.gov (United States)

    Ning, Shaoping; Xue, Zidong; Wei, Jun; Mu, Shengcai; Xu, Yajuan; Jia, Shaoxian; Qiu, Chao; Xu, Jianqing

    2015-03-01

    To explore factors influencing the mortality rate of HIV/AIDS and to improve the effectiveness of antiretroviral therapy (ART). By means of a retrospective cohort study and the AIDS control information system, HIV/AIDS case reports and antiviral treatment information for 4 cities in southern Shanxi province up to the end of December 2012 were selected to calculate the mortality rate and treatment coverage based on the further data collected, along with analysis using Cox proportional hazards survival regression. 4 040 confirmed HIV/AIDS cases were included in this study. The average age was (36.0 ± 12.9) years, with 65.3% male, 56.5% married, 73.5% having junior high school education or lower, 58.4% peasants, 54.3% infected through sexual transmission (40.1% heterosexual, 14.2% homosexual), and 38.9% infected via blood-borne transmission (20.2% former plasma donors, 16.2% blood transfusion or blood product recipients, 2.4% injection drug users). Overall mortality decreased from 40.2 per 100 person-years in 2004 to 6.3 per 100 person-years in 2012, with treatment coverage concomitantly increasing from 14.8% to 63.4%. Cox proportional hazards survival regression on the 4 040 qualified cases demonstrated that the top mortality risk factor was the absence of antiretroviral therapy (RR = 14.9, 95% CI: 12.7-17.4). Cox proportional hazards survival regression on the 1 938 cases receiving antiviral treatment demonstrated that the mortality risk of those underweight or obese before treatment was higher than that of normal-weight and overweight cases (RR = 2.7, 95% CI: 1.6-4.5), and that mortality was higher for those with a pre-treatment CD4(+) T-lymphocyte count ≤ 50 cells per µl than for those above 50 cells per µl (RR = 2.6, 95% CI: 1.5-4.5); Cox proportional hazards survival regression on the 2 102 untreated cases demonstrated that the mortality risk of those initially diagnosed with AIDS was higher than that of those initially diagnosed with HIV (RR = 3.4, 95% CI: 2
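
    The Cox analyses summarized above can be outlined with the lifelines package. The sketch below is not the study's code: it builds a small synthetic data frame with invented covariate names, fits a Cox proportional hazards model, and runs the usual Schoenfeld-residual check of the proportional hazards assumption.

    ```python
    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter

    rng = np.random.default_rng(3)
    n = 500
    art = rng.integers(0, 2, n)              # 1 = on antiretroviral therapy (hypothetical coding)
    low_cd4 = rng.integers(0, 2, n)          # 1 = baseline CD4 <= 50 cells/uL
    age = rng.normal(36, 13, n)

    # Simulate survival times from exponential hazards that depend on the covariates
    hazard = 0.05 * np.exp(-1.5 * art + 0.9 * low_cd4 + 0.01 * (age - 36))
    time = rng.exponential(1.0 / hazard)
    censor_time = rng.uniform(1, 9, n)       # administrative censoring
    df = pd.DataFrame({
        "time": np.minimum(time, censor_time),
        "event": (time <= censor_time).astype(int),
        "art": art, "low_cd4": low_cd4, "age": age,
    })

    cph = CoxPHFitter()
    cph.fit(df, duration_col="time", event_col="event")
    cph.print_summary()                                   # hazard ratios with 95% CIs
    cph.check_assumptions(df, p_value_threshold=0.05)     # proportional hazards check
    ```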

  18. Polynomial regression analysis and significance test of the regression function

    International Nuclear Information System (INIS)

    Gao Zhengming; Zhao Juan; He Shengping

    2012-01-01

    In order to analyze the decay heating power per kilogram of a certain radioactive isotope with the polynomial regression method, the paper first demonstrates the broad usage of polynomial functions and derives their parameters with the ordinary least squares estimate. A significance test for the polynomial regression function is then derived, exploiting the similarity between the polynomial regression model and the multivariable linear regression model. Finally, polynomial regression analysis and the significance test of the polynomial function are applied to the per-kilogram decay heating power of the isotope, in accordance with the authors' real work. (authors)
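
    A compact version of this workflow under assumed data: fit a polynomial by ordinary least squares (a degree-m polynomial fit is just a multiple linear regression on the powers of the predictor) and test the significance of the regression with the usual F statistic.

    ```python
    import numpy as np
    from scipy import stats

    # Hypothetical measurements: decay heating power (W/kg) versus time (days)
    rng = np.random.default_rng(4)
    t = np.linspace(1, 100, 25)
    power = 5.0 * np.exp(-t / 40.0) + rng.normal(0, 0.05, t.size)

    m = 3                                          # polynomial degree
    X = np.vander(t, m + 1)                        # columns: t^3, t^2, t, 1
    beta, *_ = np.linalg.lstsq(X, power, rcond=None)
    fitted = X @ beta

    # F test of H0: all non-constant coefficients are zero
    n = t.size
    ss_reg = np.sum((fitted - power.mean()) ** 2)
    ss_res = np.sum((power - fitted) ** 2)
    F = (ss_reg / m) / (ss_res / (n - m - 1))
    p_value = stats.f.sf(F, m, n - m - 1)
    print("coefficients (highest power first):", np.round(beta, 4))
    print(f"F({m}, {n - m - 1}) = {F:.1f}, p = {p_value:.3g}")
    ```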

  19. Relating arithmetical techniques of proportion to geometry

    DEFF Research Database (Denmark)

    Wijayanti, Dyana

    2015-01-01

    The purpose of this study is to investigate how textbooks introduce and treat the theme of proportion in geometry (similarity) and arithmetic (ratio and proportion), and how these themes are linked to each other in the books. To pursue this aim, we use the anthropological theory of the didactic....... Considering 6 common Indonesian textbooks in use, we describe how proportion is explained and appears in examples and exercises, using an explicit reference model of the mathematical organizations of both themes. We also identify how the proportion themes of the geometry and arithmetic domains are linked. Our...

  20. A Partial Proportional Odds Model for Pedestrian Crashes at Mid-Blocks in Melbourne Metropolitan Area

    Directory of Open Access Journals (Sweden)

    Toran Pour Alireza

    2016-01-01

    Full Text Available Pedestrian crashes account for 11% of all reported traffic crashes in the Melbourne metropolitan area between 2004 and 2013. There are very limited studies on pedestrian accidents at mid-blocks. Mid-block crashes account for about 46% of the total pedestrian crashes in the Melbourne metropolitan area, while about 50% of all pedestrian fatalities occur at mid-blocks. In this research, a Partial Proportional Odds (PPO) model is applied to examine vehicle-pedestrian crash severity at mid-blocks in the Melbourne metropolitan area. The PPO model is a logistic regression model that allows the covariates that meet the proportional odds assumption to affect different crash severity levels with the same magnitude, whereas the covariates that do not meet the proportional odds assumption can have different effects on different severity levels. In this research vehicle-pedestrian crashes at mid-blocks are analysed for the first time. In addition, factors such as the distance of crashes to public transport stops, average road slope and some social characteristics are considered for the first time in developing the model. Results of the PPO model show that speed limit, light condition, pedestrian age and gender, and vehicle type are the most significant factors that influence vehicle-pedestrian crash severity at mid-blocks.

  1. Elevated plasma vitamin B12 levels and cancer prognosis: A population-based cohort study

    DEFF Research Database (Denmark)

    Arendt, Johan Frederik Håkonsen; Farkas, Dora Kormendine; Pedersen, Lars

    2015-01-01

    patients without a plasma Cbl measurement. Patients treated with Cbl were excluded. Survival probability was assessed using Kaplan-Meier curves. Mortality risk ratios (MRR) were computed using Cox proportional hazard regression, adjusted for age, sex, calendar year, cancer stage and comorbidity, scored...

  2. Berekening van levensverwachting uit mortaliteits follow-up studies

    NARCIS (Netherlands)

    Hoogenveen RT; Schuit AJ; Visscher TLS; Feskens EJM; Nagelkerke NJD; CCM

    1998-01-01

    Survival analyses of longitudinal studies often make use of the Cox proportional hazards model. Mortality differences between risk groups are presented in terms of estimated regression coefficients or relative risks. The research question relates to the presentation of mortality differences in terms
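
    The translation this record asks about can be illustrated directly: under proportional hazards the survival curve of a risk group is the baseline curve raised to the power of the hazard ratio, and (restricted) life expectancy is the area under the survival curve. The baseline hazard and hazard ratio below are assumed values, not estimates from the cited study.

    ```python
    import numpy as np

    # Assumed Gompertz-type baseline hazard h0(t) = a * exp(b * t) beyond some index age
    a, b = 0.005, 0.09
    t = np.linspace(0, 50, 2001)                    # years of follow-up
    S0 = np.exp(-(a / b) * (np.exp(b * t) - 1.0))   # baseline survival

    hr = 1.8                                        # assumed hazard ratio for the exposed group
    S1 = S0 ** hr                                   # proportional hazards: S1(t) = S0(t)**HR

    def restricted_life_expectancy(s, t):
        """Trapezoidal area under the survival curve."""
        return float(np.sum(0.5 * (s[1:] + s[:-1]) * np.diff(t)))

    le0 = restricted_life_expectancy(S0, t)
    le1 = restricted_life_expectancy(S1, t)
    print(f"reference group: {le0:.1f} years, exposed group: {le1:.1f} years")
    print(f"life expectancy lost with HR = {hr}: {le0 - le1:.1f} years")
    ```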

  3. Proportional Reasoning and the Visually Impaired

    Science.gov (United States)

    Hilton, Geoff; Hilton, Annette; Dole, Shelley L.; Goos, Merrilyn; O'Brien, Mia

    2012-01-01

    Proportional reasoning is an important aspect of formal thinking that is acquired during the developmental years that approximate the middle years of schooling. Students who fail to acquire sound proportional reasoning often experience difficulties in subjects that require quantitative thinking, such as science, technology, engineering, and…

  4. Reduced Rank Regression

    DEFF Research Database (Denmark)

    Johansen, Søren

    2008-01-01

    The reduced rank regression model is a multivariate regression model with a coefficient matrix with reduced rank. The reduced rank regression algorithm is an estimation procedure, which estimates the reduced rank regression model. It is related to canonical correlations and involves calculating...
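
    A minimal numerical illustration of the estimator described here, assuming an identity weight matrix and synthetic data: compute the ordinary least-squares coefficient matrix, then project it onto the leading right singular vectors of the fitted values to impose the rank constraint.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    n, p, q, r = 300, 6, 4, 2                 # samples, predictors, responses, target rank

    # Simulate Y = X B + E with a true rank-2 coefficient matrix
    X = rng.normal(size=(n, p))
    B_true = rng.normal(size=(p, r)) @ rng.normal(size=(r, q))
    Y = X @ B_true + 0.5 * rng.normal(size=(n, q))

    # Full-rank OLS solution
    B_ols, *_ = np.linalg.lstsq(X, Y, rcond=None)

    # Reduced rank regression: project onto the top-r right singular vectors of X @ B_ols
    _, _, Vt = np.linalg.svd(X @ B_ols, full_matrices=False)
    P = Vt[:r].T @ Vt[:r]                     # rank-r projection in response space
    B_rrr = B_ols @ P

    print("rank of OLS estimate:", np.linalg.matrix_rank(B_ols))
    print("rank of RRR estimate:", np.linalg.matrix_rank(B_rrr))
    print("residual sum of squares (OLS):", round(float(np.linalg.norm(Y - X @ B_ols) ** 2), 1))
    print("residual sum of squares (RRR):", round(float(np.linalg.norm(Y - X @ B_rrr) ** 2), 1))
    ```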

  5. Spatial vulnerability assessments by regression kriging

    Science.gov (United States)

    Pásztor, László; Laborczi, Annamária; Takács, Katalin; Szatmári, Gábor

    2016-04-01

    Two fairly different complex environmental phenomena causing natural hazards were mapped based on a combined spatial inference approach. Their behaviour is related to various environmental factors, and the applied approach enables the inclusion of several spatially exhaustive auxiliary variables that are available for mapping. Inland excess water (IEW) is an interrelated natural and human-induced phenomenon that causes several problems in the flat-land regions of Hungary, which cover nearly half of the country. The term 'inland excess water' refers to the occurrence of inundations outside the flood levee that originate from sources differing from flood overflow; it is surplus surface water that forms due to the lack of runoff, the insufficient absorption capability of soil, or the upwelling of groundwater. There is a multiplicity of definitions, which indicates the complexity of the processes that govern this phenomenon. Most of the definitions have a common part, namely that inland excess water is temporary water inundation that occurs in flat-lands due to both precipitation and groundwater emerging on the surface as substantial sources. Radon gas is produced in the radioactive decay chain of uranium, an element that is naturally present in soils. Radon is transported mainly by diffusion and convection mechanisms through the soil, depending mainly on soil physical and meteorological parameters, and can enter and accumulate in buildings. The health risk originating from indoor radon concentration attributed to natural factors is characterized by the geogenic radon potential (GRP). In addition to geology and meteorology, physical soil properties play a significant role in the determination of GRP. Identification of areas with high risk requires spatial modelling, that is, mapping of specific natural hazards. In both cases external environmental factors determine the behaviour of the target process (occurrence/frequency of IEW and grade of GRP, respectively). Spatial auxiliary

  6. Effects of plant density and proportion on the interaction between wheat with alexandergrass plants

    Directory of Open Access Journals (Sweden)

    Leonardo Bianco de Carvalho

    2011-01-01

    Full Text Available Determination of competitive relationships among plant species requires appropriate experimental designs and methods of analysis. The hypothesis of this research was that two species growing in coexistence show different growth and development due to their relative competitiveness. This research aims to measure the relative competitiveness of the wheat crop compared to Alexandergrass by interpreting plant density and proportional effects using replacement series experiments. Monocultures were cultivated at densities of 1, 3, 5, 10 and 15 plants per pot and analyzed by regression of dry mass data. The mixture experiment was cultivated at wheat:Alexandergrass proportions of 0:6, 1:5, 2:4, 3:3, 4:2, 5:1 and 6:0 plants per pot and analyzed by graphical interpretation of growth and production characteristics. Both experiments were carried out in a randomized complete block design with four replicates. Alexandergrass was more sensitive to intraspecific competition than wheat. Alexandergrass was slightly more competitive than wheat. Number and weight of spikes and number of tillers were the wheat characteristics most affected by Alexandergrass interference.

  7. Proportional gas scintillation detectors and their applications

    International Nuclear Information System (INIS)

    Petr, I.

    1978-01-01

    The principle of a gas proportional scintillation detector and its function are described. The dependence of the energy resolution of Si(Li) and xenon proportional detectors on the input window size is given. A typical design is shown of a xenon detector used for X-ray spectrometry at energies of 277 eV to 5.898 keV and at gas pressures of 98 to 270 kPa. Gas proportional scintillation detectors show considerably better energy resolution than common proportional counters and even better resolution than semiconductor Si(Li) detectors at low X-radiation energies. For detection areas smaller than 25 mm² Si(Li) detectors show better resolution, especially at higher X-radiation energies. For window areas of 25 to 190 mm² both types of detectors are equal; for a window area exceeding 190 mm² the proportional scintillation detector has the higher energy resolution. (B.S.)

  8. Augmented Beta rectangular regression models: A Bayesian perspective.

    Science.gov (United States)

    Wang, Jue; Luo, Sheng

    2016-01-01

    Mixed effects Beta regression models based on Beta distributions have been widely used to analyze longitudinal percentage or proportional data ranging between zero and one. However, Beta distributions are not flexible enough to accommodate extreme outliers or excess observations around the tail areas, and they do not account for the presence of the boundary values zeros and ones because these values are not in the support of the Beta distributions. To address these issues, we propose a mixed effects model using the Beta rectangular distribution and augment it with the probabilities of zero and one. We conduct extensive simulation studies to assess the performance of mixed effects models based on both the Beta and Beta rectangular distributions under various scenarios. The simulation studies suggest that the regression models based on Beta rectangular distributions improve the accuracy of parameter estimates in the presence of outliers and heavy tails. The proposed models are applied to the motivating Neuroprotection Exploratory Trials in Parkinson's Disease (PD) Long-term Study-1 (LS-1 study, n = 1741), developed by The National Institute of Neurological Disorders and Stroke Exploratory Trials in Parkinson's Disease (NINDS NET-PD) network. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  9. Quantile Regression Methods

    DEFF Research Database (Denmark)

    Fitzenberger, Bernd; Wilke, Ralf Andreas

    2015-01-01

    if the mean regression model does not. We provide a short informal introduction into the principle of quantile regression which includes an illustrative application from empirical labor market research. This is followed by briefly sketching the underlying statistical model for linear quantile regression based......Quantile regression is emerging as a popular statistical approach, which complements the estimation of conditional mean models. While the latter only focuses on one aspect of the conditional distribution of the dependent variable, the mean, quantile regression provides more detailed insights...... by modeling conditional quantiles. Quantile regression can therefore detect whether the partial effect of a regressor on the conditional quantiles is the same for all quantiles or differs across quantiles. Quantile regression can provide evidence for a statistical relationship between two variables even...
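
    For readers who want a quick start, statsmodels provides linear quantile regression. The sketch below uses synthetic, heteroscedastic wage-style data (not the labor-market data referred to above) and shows how the estimated slope differs across quantiles while OLS describes only the conditional mean.

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(6)
    n = 1000
    experience = rng.uniform(0, 30, n)
    # Heteroscedastic outcome: the spread grows with experience, so quantile slopes differ
    wage = 10 + 0.8 * experience + rng.normal(0, 1 + 0.3 * experience, n)

    X = sm.add_constant(experience)
    for q in (0.10, 0.50, 0.90):
        res = sm.QuantReg(wage, X).fit(q=q)
        print(f"quantile {q:.2f}: intercept = {res.params[0]:.2f}, slope = {res.params[1]:.2f}")

    print("OLS slope (conditional mean):", round(float(sm.OLS(wage, X).fit().params[1]), 2))
    ```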

  10. Bayesian inference on proportional elections.

    Directory of Open Access Journals (Sweden)

    Gabriel Hideki Vatanabe Brunello

    Full Text Available Polls for majoritarian voting systems usually show estimates of the percentage of votes for each candidate. However, proportional vote systems do not necessarily guarantee that the candidate with the highest percentage of votes will be elected. Thus, traditional methods used in majoritarian elections cannot be applied to proportional elections. In this context, the purpose of this paper was to perform Bayesian inference on proportional elections considering the Brazilian system of seat distribution. More specifically, a methodology was developed to estimate the probability that a given party will have representation in the Chamber of Deputies. Inferences were made in a Bayesian framework using the Monte Carlo simulation technique, and the developed methodology was applied to data from the Brazilian elections for Members of the Legislative Assembly and Federal Chamber of Deputies in 2010. A performance rate was also presented to evaluate the efficiency of the methodology. Calculations and simulations were carried out using the free R statistical software.
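
    The core Monte Carlo step can be illustrated with a Dirichlet posterior over party vote shares combined with a seat-allocation rule. The sketch below uses a plain D'Hondt divisor rule as a stand-in for the more involved Brazilian seat-distribution procedure, and the poll counts are hypothetical.

    ```python
    import numpy as np

    def dhondt(votes, seats):
        """Allocate seats among parties by the D'Hondt highest-averages rule."""
        alloc = np.zeros(len(votes), dtype=int)
        for _ in range(seats):
            quotients = votes / (alloc + 1)
            alloc[np.argmax(quotients)] += 1
        return alloc

    rng = np.random.default_rng(7)
    poll_counts = np.array([420, 310, 150, 80, 40])   # hypothetical poll responses per party
    seats = 10
    draws = 20_000

    got_seat = np.zeros(len(poll_counts))
    for _ in range(draws):
        shares = rng.dirichlet(poll_counts + 1)       # posterior draw (uniform Dirichlet prior)
        got_seat += dhondt(shares, seats) > 0

    for i, prob in enumerate(got_seat / draws):
        print(f"party {i + 1}: P(at least one seat) = {prob:.3f}")
    ```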

  11. PROPORTIONS AND HUMAN SCALE IN DAMASCENE COURTYARD HOUSES

    Directory of Open Access Journals (Sweden)

    M. Salim Ferwati

    2008-03-01

    Full Text Available Interior designers, architects, landscape architects, and even urban designers agree that the environment, as a form of non-verbal communication, has a symbolic dimension to it. As for its aesthetic dimension, it seems that beauty is related to certain proportions, both in the parts and in the whole. A suitable proportion leaves a good impression upon beholders, especially when it matches human proportions. That was in fact the underlying belief of Le Corbusier, from which he developed his Modulor concept. The study searches for a modular, or proportional, system that governs the design of the traditional Damascene house. Through geometrical and mathematical examination of 28 traditional houses, it was found that certain proportional relationships existed; however, these proportional relationships were not fixed ones. The study relied on analyzing the Iwan elevation as well as the proportion of the inner courtyard in relation to the building area. Charts, diagrams and tables were produced to summarize the results.

  12. Substitution elasticities between GHG-polluting and nonpolluting inputs in agricultural production: A meta-regression

    International Nuclear Information System (INIS)

    Liu, Boying; Richard Shumway, C.

    2016-01-01

    This paper reports meta-regressions of substitution elasticities between greenhouse gas (GHG) polluting and nonpolluting inputs in agricultural production, which is the main feedstock source for biofuel in the U.S. We treat energy, fertilizer, and manure collectively as the “polluting input” and labor, land, and capital as nonpolluting inputs. We estimate meta-regressions for samples of Morishima substitution elasticities for labor, land, and capital vs. the polluting input. Much of the heterogeneity of Morishima elasticities can be explained by type of primal or dual function, functional form, type and observational level of data, input categories, number of outputs, type of output, time period, and country categories. Each estimated long-run elasticity for the reference case, which is most relevant for assessing GHG emissions through life-cycle analysis, is greater than 1.0 and significantly different from zero. Most predicted long-run elasticities remain significantly different from zero at the data means. These findings imply that life-cycle analysis based on fixed proportion production functions could provide grossly inaccurate measures of GHG of biofuel. - Highlights: • This paper reports meta-regressions of substitution elasticities between greenhouse-gas (GHG) polluting and nonpolluting inputs in agricultural production, which is the main feedstock source for biofuel in the U.S. • We estimate meta-regressions for samples of Morishima substitution elasticities for labor, land, and capital vs. the polluting input based on 65 primary studies. • We found that each estimated long-run elasticity for the reference case, which is most relevant for assessing GHG emissions through life-cycle analysis, is greater than 1.0 and significantly different from zero. Most predicted long-run elasticities remain significantly different from zero at the data means. • These findings imply that life-cycle analysis based on fixed proportion production functions could

  13. Working towards a clearer and more helpful hazard map: investigating the influence of hazard map design on hazard communication

    Science.gov (United States)

    Thompson, M. A.; Lindsay, J. M.; Gaillard, J.

    2015-12-01

    Globally, geological hazards are communicated using maps. In traditional hazard mapping practice, scientists analyse data about a hazard, and then display the results on a map for stakeholder and public use. However, this one-way, top-down approach to hazard communication is not necessarily effective or reliable. The messages which people take away will be dependent on the way in which they read, interpret, and understand the map, a facet of hazard communication which has been relatively unexplored. Decades of cartographic studies suggest that variables in the visual representation of data on maps, such as colour and symbology, can have a powerful effect on how people understand map content. In practice, however, there is little guidance or consistency in how hazard information is expressed and represented on maps. Accordingly, decisions are often made based on subjective preference, rather than research-backed principles. Here we present the results of a study in which we explore how hazard map design features can influence hazard map interpretation, and we propose a number of considerations for hazard map design. A series of hazard maps were generated, with each one showing the same probabilistic volcanic ashfall dataset, but using different verbal and visual variables (e.g., different colour schemes, data classifications, probabilistic formats). Following a short pilot study, these maps were used in an online survey of 110 stakeholders and scientists in New Zealand. Participants answered 30 open-ended and multiple choice questions about ashfall hazard based on the different maps. Results suggest that hazard map design can have a significant influence on the messages readers take away. For example, diverging colour schemes were associated with concepts of "risk" and decision-making more than sequential schemes, and participants made more precise estimates of hazard with isarithmic data classifications compared to binned or gradational shading. Based on such

  14. PM10 modeling in the Oviedo urban area (Northern Spain) by using multivariate adaptive regression splines

    Science.gov (United States)

    Nieto, Paulino José García; Antón, Juan Carlos Álvarez; Vilán, José Antonio Vilán; García-Gonzalo, Esperanza

    2014-10-01

    The aim of this research work is to build a regression model of the particulate matter up to 10 micrometers in size (PM10) by using the multivariate adaptive regression splines (MARS) technique in the Oviedo urban area (Northern Spain) at local scale. This research work explores the use of a nonparametric regression algorithm known as multivariate adaptive regression splines (MARS), which has the ability to approximate the relationship between the inputs and outputs and express the relationship mathematically. In this sense, hazardous air pollutants or toxic air contaminants refer to any substance that may cause or contribute to an increase in mortality or serious illness, or that may pose a present or potential hazard to human health. To accomplish the objective of this study, experimental data on nitrogen oxides (NOx), carbon monoxide (CO), sulfur dioxide (SO2), ozone (O3) and dust (PM10) were collected over 3 years (2006-2008) and used to create a highly nonlinear model of PM10 in the Oviedo urban nucleus (Northern Spain) based on the MARS technique. One main objective of this model is to obtain a preliminary estimate of the dependence of PM10 on the other pollutants in the Oviedo urban area at local scale. A second aim is to determine the factors with the greatest bearing on air quality with a view to proposing health and lifestyle improvements. The United States National Ambient Air Quality Standards (NAAQS) establish the limit values of the main pollutants in the atmosphere in order to protect human health. Firstly, this MARS regression model captures the main insight of statistical learning theory in order to obtain a good prediction of the dependence among the main pollutants in the Oviedo urban area. Secondly, the main advantages of MARS are its capacity to produce simple, easy-to-interpret models, its ability to estimate the contributions of the input variables, and its computational efficiency. Finally, on the basis of

  15. Hazardous Alcohol Use in 2 Countries: A Comparison Between Alberta, Canada and Queensland, Australia

    Directory of Open Access Journals (Sweden)

    Diana C. Sanchez-Ramirez

    2017-09-01

    Full Text Available Objectives This article aimed to compare alcohol consumption between the populations of Queensland in Australia and Alberta in Canada. Furthermore, the associations between greater alcohol consumption and socio-demographic characteristics were explored in each population. Methods Data from 2500 participants of the 2013 Alberta Survey and the 2013 Queensland Social Survey were analyzed. Regression analyses were used to explore the associations between alcohol risk and socio-demographic characteristics. Results A higher rate of hazardous alcohol use was found in Queenslanders than in Albertans. In both Albertans and Queenslanders, hazardous alcohol use was associated with being between 18 and 24 years of age. Higher income, having no religion, living alone, and being born in Canada were also associated with alcohol risk in Albertans; while in Queenslanders, hazardous alcohol use was also associated with common-law marital status. In addition, hazardous alcohol use was lower among respondents with a non-Catholic or Protestant religious affiliation. Conclusions Younger age was associated with greater hazardous alcohol use in both populations. In addition, different socio-demographic factors were associated with hazardous alcohol use in each of the populations studied. Our results allowed us to identify the socio-demographic profiles associated with hazardous alcohol use in Alberta and Queensland. These profiles constitute valuable sources of information for local health authorities and policymakers when designing suitable preventive strategies targeting hazardous alcohol use. Overall, the present study highlights the importance of analyzing the socio-demographic factors associated with alcohol consumption in population-specific contexts.

  16. Multi-hazard risk assessment of the Republic of Mauritius

    Science.gov (United States)

    Mysiak, Jaroslav; Galli, Alberto; Amadio, Mattia; Teatini, Chiara

    2013-04-01

    The Republic of Mauritius (ROM) is a small island developing state (SIDS), part of the Mascarene Islands in the West Indian Ocean, comprising Mauritius, Rodrigues, Agalega and St. Brandon islands and several islets. ROM is exposed to many natural hazards, notably cyclones, tsunamis, torrential precipitation, landslides, and droughts, and is highly vulnerable to sea level rise (SLR) driven by human-induced climate change. The multi-hazard risk assessment presented in this paper is aimed at identifying the areas prone to flood, inundation and landslide hazard, and at informing the development of a strategy for disaster risk reduction (DRR) and climate change adaptation (CCA). Climate risk analysis - a central component of the assessment - is one of the first comprehensive climate modelling studies conducted for the country. Climate change may raise temperatures by 1-2 degrees Celsius by 2060-2070, and sizably increase the intensity and frequency of extreme precipitation events. According to the IPCC Fourth Assessment Report (AR4), the expected sea level rise ranges between 16 and 49 cm. Individually or in combination, the inland flood, coastal inundation and landslide hazards affect a large proportion of the country. Sea level rise and changes in precipitation regimes will amplify existing vulnerabilities and create new ones. The paper outlines an Action Plan for Disaster Risk Reduction that takes into account the likely effects of climate change. The Action Plan calls on the government to establish a National Platform for Disaster Risk Reduction as recommended by the Hyogo Framework for Action (HFA) 2005-2015. It consists of nine recommendations which, if put into practice, will significantly reduce the annual damage from natural hazards and produce additional (ancillary) benefits in economic, social and environmental terms.

  17. Improving validation methods for molecular diagnostics: application of Bland-Altman, Deming and simple linear regression analyses in assay comparison and evaluation for next-generation sequencing.

    Science.gov (United States)

    Misyura, Maksym; Sukhai, Mahadeo A; Kulasignam, Vathany; Zhang, Tong; Kamel-Reid, Suzanne; Stockley, Tracy L

    2018-02-01

    A standard approach in test evaluation is to compare results of the assay in validation to results from previously validated methods. For quantitative molecular diagnostic assays, comparison of test values is often performed using simple linear regression and the coefficient of determination (R²), using R² as the primary metric of assay agreement. However, the use of R² alone does not adequately quantify constant or proportional errors required for optimal test evaluation. More extensive statistical approaches, such as Bland-Altman and expanded interpretation of linear regression methods, can be used to more thoroughly compare data from quantitative molecular assays. We present the application of Bland-Altman and linear regression statistical methods to evaluate quantitative outputs from next-generation sequencing assays (NGS). NGS-derived data sets from assay validation experiments were used to demonstrate the utility of the statistical methods. Both Bland-Altman and linear regression were able to detect the presence and magnitude of constant and proportional error in quantitative values of NGS data. Deming linear regression was used in the context of assay comparison studies, while simple linear regression was used to analyse serial dilution data. Bland-Altman statistical approach was also adapted to quantify assay accuracy, including constant and proportional errors, and precision where theoretical and empirical values were known. The complementary application of the statistical methods described in this manuscript enables more extensive evaluation of performance characteristics of quantitative molecular assays, prior to implementation in the clinical molecular laboratory. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
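
    The quantities discussed above can be computed in a few lines for a pair of simulated assays: Bland-Altman bias and limits of agreement, a Deming regression slope and intercept (assuming an error-variance ratio of 1), and the R² from simple correlation. The data are illustrative and are not drawn from the validation experiments described in the record.

    ```python
    import numpy as np

    rng = np.random.default_rng(8)
    truth = rng.uniform(5, 50, 60)                                   # e.g. variant allele fractions (%)
    assay_a = 1.05 * truth + 1.0 + rng.normal(0, 1.5, truth.size)    # proportional + constant error
    assay_b = truth + rng.normal(0, 1.5, truth.size)

    # Bland-Altman: bias and 95% limits of agreement
    diff = assay_a - assay_b
    bias, sd = diff.mean(), diff.std(ddof=1)
    print(f"bias = {bias:.2f}, limits of agreement = ({bias - 1.96 * sd:.2f}, {bias + 1.96 * sd:.2f})")

    # Deming regression with error-variance ratio lambda = 1 (both assays equally noisy)
    sxx = np.var(assay_b, ddof=1)
    syy = np.var(assay_a, ddof=1)
    sxy = np.cov(assay_b, assay_a, ddof=1)[0, 1]
    slope = (syy - sxx + np.sqrt((syy - sxx) ** 2 + 4 * sxy ** 2)) / (2 * sxy)
    intercept = assay_a.mean() - slope * assay_b.mean()
    print(f"Deming slope = {slope:.3f} (proportional error), intercept = {intercept:.3f} (constant error)")

    # R^2 from simple linear regression, for comparison
    r2 = np.corrcoef(assay_a, assay_b)[0, 1] ** 2
    print(f"R^2 = {r2:.3f}  # can stay high even when systematic errors are present")
    ```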

  18. Hazard Analysis Database Report

    Energy Technology Data Exchange (ETDEWEB)

    GAULT, G.W.

    1999-10-13

    The Hazard Analysis Database was developed in conjunction with the hazard analysis activities conducted in accordance with DOE-STD-3009-94, Preparation Guide for US Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, for the Tank Waste Remediation System (TWRS) Final Safety Analysis Report (FSAR). The FSAR is part of the approved TWRS Authorization Basis (AB). This document describes, identifies, and defines the contents and structure of the TWRS FSAR Hazard Analysis Database and documents the configuration control changes made to the database. The TWRS Hazard Analysis Database contains the collection of information generated during the initial hazard evaluations and the subsequent hazard and accident analysis activities. The database supports the preparation of Chapters 3, 4, and 5 of the TWRS FSAR and the USQ process and consists of two major, interrelated data sets: (1) Hazard Evaluation Database--Data from the results of the hazard evaluations; and (2) Hazard Topography Database--Data from the system familiarization and hazard identification.

  19. Meta-analyses of the proportion of Japanese encephalitis virus infection in vectors and vertebrate hosts.

    Science.gov (United States)

    Oliveira, Ana R S; Cohnstaedt, Lee W; Strathe, Erin; Hernández, Luciana Etcheverry; McVey, D Scott; Piaggio, José; Cernicchiaro, Natalia

    2017-09-07

    Japanese encephalitis (JE) is a zoonosis in Southeast Asia vectored by mosquitoes infected with the Japanese encephalitis virus (JEV). Japanese encephalitis is considered an emerging exotic infectious disease with potential for introduction in currently JEV-free countries. Pigs and ardeid birds are reservoir hosts and play a major role on the transmission dynamics of the disease. The objective of the study was to quantitatively summarize the proportion of JEV infection in vectors and vertebrate hosts from data pertaining to observational studies obtained in a systematic review of the literature on vector and host competence for JEV, using meta-analyses. Data gathered in this study pertained to three outcomes: proportion of JEV infection in vectors, proportion of JEV infection in vertebrate hosts, and minimum infection rate (MIR) in vectors. Random-effects subgroup meta-analysis models were fitted by species (mosquito or vertebrate host species) to estimate pooled summary measures, as well as to compute the variance between studies. Meta-regression models were fitted to assess the association between different predictors and the outcomes of interest and to identify sources of heterogeneity among studies. Predictors included in all models were mosquito/vertebrate host species, diagnostic methods, mosquito capture methods, season, country/region, age category, and number of mosquitos per pool. Mosquito species, diagnostic method, country, and capture method represented important sources of heterogeneity associated with the proportion of JEV infection; host species and region were considered sources of heterogeneity associated with the proportion of JEV infection in hosts; and diagnostic and mosquito capture methods were deemed important contributors of heterogeneity for the MIR outcome. Our findings provide reference pooled summary estimates of vector competence for JEV for some mosquito species, as well as of sources of variability for these outcomes. Moreover, this
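
    As a generic illustration of the pooling step used in such meta-analyses (the study counts below are invented, not the JEV data), the sketch combines study-level proportions on the logit scale under a DerSimonian-Laird random-effects model.

    ```python
    import numpy as np
    from scipy.special import expit, logit

    # Hypothetical studies: infected / tested
    events = np.array([12, 30, 8, 55, 20])
    totals = np.array([400, 650, 150, 1200, 380])

    # Logit-transformed proportions and their approximate within-study variances
    p = events / totals
    y = logit(p)
    v = 1.0 / events + 1.0 / (totals - events)

    # DerSimonian-Laird estimate of the between-study variance tau^2
    w = 1.0 / v
    y_fixed = np.sum(w * y) / np.sum(w)
    Q = np.sum(w * (y - y_fixed) ** 2)
    k = len(y)
    tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))

    # Random-effects pooled estimate and 95% CI, back-transformed to a proportion
    w_re = 1.0 / (v + tau2)
    y_re = np.sum(w_re * y) / np.sum(w_re)
    se_re = np.sqrt(1.0 / np.sum(w_re))
    lo, hi = y_re - 1.96 * se_re, y_re + 1.96 * se_re
    print(f"pooled proportion = {expit(y_re):.3f} (95% CI {expit(lo):.3f}-{expit(hi):.3f}), tau^2 = {tau2:.3f}")
    ```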

  20. Glyburide increases risk in patients with diabetes mellitus after emergent percutaneous intervention for myocardial infarction - A nationwide study

    DEFF Research Database (Denmark)

    Jørgensen, C H; Gislason, G H; Bretler, D

    2011-01-01

    Danish patients receiving glucose-lowering drugs admitted with myocardial infarction between 1997 and 2006 who underwent emergent percutaneous coronary intervention were identified from national registers. Multivariable Cox proportional hazards models were used to analyze the risk of cardiovascular.......9%) received metformin. Cox proportional hazard regression analyses adjusted for age, sex, calendar year, comorbidity and concomitant pharmacotherapy showed an increased risk of cardiovascular mortality (hazard ratio [HR] 2.91, 95% confidence interval [CI] 1.26-6.72; p=0.012), cardiovascular mortality...... and nonfatal myocardial infarction (HR 2.69, 95% CI 1.21-6.00; p=0.016), and all-cause mortality (HR 2.46, 95% CI 1.11-5.47; p=0.027), respectively, with glyburide compared to metformin. CONCLUSIONS: Glyburide is associated with increased cardiovascular mortality and morbidity in patients with diabetes...

  1. Determinants of hazardous drinking among black South African men who have sex with men.

    Science.gov (United States)

    Knox, Justin; Reddy, Vasu; Lane, Tim; Lovasi, Gina; Hasin, Deborah; Sandfort, Theo

    2017-11-01

    There is a known heavy burden of hazardous drinking and its associated health risks among black South African MSM; however, no study to date has identified risk factors for hazardous drinking among this nor any other African MSM population. A cross-sectional survey was conducted among 480 black South African MSM recruited using respondent-driven sampling. All analyses were adjusted using an RDS II estimator. Multivariable logistic regression was used to assess the relationship between demographic characteristics, psychosocial factors, behavioral attributes and hazardous drinking. More than half of the men (62%, 95%CI=56%-68%) screened positive as hazardous drinkers. In multivariable analyses, living in a township (versus the city of Pretoria) (aOR=1.9, 95%CI=1.2-3.1, pchild (aOR=2.6, 95%CI=1.1-6.4, p=.03), having anxiety (aOR=5.4, 95%CI=1.2-24.3, p=.03), and social network drinking behavior (aOR=5.4, 95%CI=1.2-24.3, p=.03) were positively associated with hazardous drinking. Being sexually attracted only to men (aOR=0.3, 95%CI=0.1-0.8, p=.01) was negatively associated with hazardous drinking. Hazardous drinking is highly prevalent among black South African MSM. Multiple indicators of social vulnerability were identified as independent determinants of hazardous drinking. These findings are of heightened concern because these health problems often work synergistically to increase risk of HIV infection and should be taken into consideration by efforts aimed at reducing hazardous drinking among this critical population. Copyright © 2017 Elsevier B.V. All rights reserved.

  2. Handbook of hazardous waste management

    International Nuclear Information System (INIS)

    Metry, A.A.

    1980-01-01

    The contents of this work are arranged so as to give the reader a detailed understanding of the elements of hazardous waste management. Generalized management concepts are covered in Chapters 1 through 5 which are entitled: Introduction, Regulations Affecting Hazardous Waste Management, Comprehensive Hazardous Waste Management, Control of Hazardous Waste Transportation, and Emergency Hazardous Waste Management. Chapters 6 through 11 deal with treatment concepts and are entitled: General Considerations for Hazardous Waste Management Facilities, Physical Treatment of Hazardous Wastes, Chemical Treatment of Hazardous Wastes, Biological Treatment of Hazardous Wastes, Incineration of Hazardous Wastes, and Hazardous Waste Management of Selected Industries. Chapters 12 through 15 are devoted to ultimate disposal concepts and are entitled: Land Disposal Facilities, Ocean Dumping of Hazardous Wastes, Disposal of Extremely Hazardous Wastes, and Generalized Criteria for Hazardous Waste Management Facilities

  3. Hazardous drinking and HIV-risk-related behavior among male clients of female sex workers in Tijuana, Mexico.

    Science.gov (United States)

    Goodman-Meza, David; Pitpitan, Eileen V; Semple, Shirley J; Wagner, Karla D; Chavarin, Claudia V; Strathdee, Steffanie A; Patterson, Thomas L

    2014-01-01

    Male clients of female sex workers (FSWs) are at high risk for HIV. Whereas the HIV risks of alcohol use are well understood, less is known about hazardous alcohol use among male clients of FSWs, particularly in Mexico. We sought to identify risk factors for hazardous alcohol use and test associations between hazardous alcohol use and HIV risk behavior among male clients in Tijuana. Male clients of FSWs in Tijuana (n = 400) completed a quantitative interview in 2008. The AUDIT was used to characterize hazardous alcohol use. Multivariate logistic regression was used to determine independent associations of demographic and HIV risk variables with hazardous alcohol use (vs. non-hazardous). Forty percent of our sample met criteria for hazardous alcohol use. Variables independently associated with hazardous drinking were reporting any sexually transmitted infection (STI), having sex with a FSW while under the influence of alcohol, being younger than 36 years of age, living in Tijuana, and ever having been jailed. Hazardous drinkers were less likely ever to have been deported or to have shared injection drugs. Hazardous alcohol use is associated with HIV risk, including engaging in sex with FSWs while intoxicated and having an STI among male clients of FSWs in Tijuana. We systematically described patterns and correlates of hazardous alcohol use among male clients of FSWs in Tijuana, Mexico. The results suggest that HIV/STI risk reduction interventions must target hazardous alcohol users, and be tailored to address alcohol use. © American Academy of Addiction Psychiatry.

  4. Regression Phalanxes

    OpenAIRE

    Zhang, Hongyang; Welch, William J.; Zamar, Ruben H.

    2017-01-01

    Tomal et al. (2015) introduced the notion of "phalanxes" in the context of rare-class detection in two-class classification problems. A phalanx is a subset of features that work well for classification tasks. In this paper, we propose a different class of phalanxes for application in regression settings. We define a "Regression Phalanx" - a subset of features that work well together for prediction. We propose a novel algorithm which automatically chooses Regression Phalanxes from high-dimensi...

  5. Mix Proportion Design of Asphalt Concrete

    Science.gov (United States)

    Wu, Xianhu; Gao, Lingling; Du, Shoujun

    2017-12-01

    Based on the gradations of AC and SMA, this paper designs a new type of anti-slide mixture with the advantages of both types. The paper introduces the material selection, the design calculation of the mineral aggregate ratio, the test to determine the optimum asphalt content, and the mix proportion design of the asphalt concrete mixture. It thus presents a new technology for mix proportion design.

  6. Multivariate Models for Prediction of Human Skin Sensitization Hazard

    Science.gov (United States)

    Strickland, Judy; Zang, Qingda; Paris, Michael; Lehmann, David M.; Allen, David; Choksi, Neepa; Matheson, Joanna; Jacobs, Abigail; Casey, Warren; Kleinstreuer, Nicole

    2016-01-01

    One of ICCVAM’s top priorities is the development and evaluation of non-animal approaches to identify potential skin sensitizers. The complexity of biological events necessary to produce skin sensitization suggests that no single alternative method will replace the currently accepted animal tests. ICCVAM is evaluating an integrated approach to testing and assessment based on the adverse outcome pathway for skin sensitization that uses machine learning approaches to predict human skin sensitization hazard. We combined data from three in chemico or in vitro assays—the direct peptide reactivity assay (DPRA), human cell line activation test (h-CLAT), and KeratinoSens™ assay—six physicochemical properties, and an in silico read-across prediction of skin sensitization hazard into 12 variable groups. The variable groups were evaluated using two machine learning approaches, logistic regression (LR) and support vector machine (SVM), to predict human skin sensitization hazard. Models were trained on 72 substances and tested on an external set of 24 substances. The six models (three LR and three SVM) with the highest accuracy (92%) used: (1) DPRA, h-CLAT, and read-across; (2) DPRA, h-CLAT, read-across, and KeratinoSens; or (3) DPRA, h-CLAT, read-across, KeratinoSens, and log P. The models performed better at predicting human skin sensitization hazard than the murine local lymph node assay (accuracy = 88%), any of the alternative methods alone (accuracy = 63–79%), or test batteries combining data from the individual methods (accuracy = 75%). These results suggest that computational methods are promising tools to effectively identify potential human skin sensitizers without animal testing. PMID:27480324
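
    The modelling step can be prototyped with scikit-learn. The snippet below trains a logistic regression and a support vector machine on a hypothetical feature matrix standing in for the assay, physicochemical, and read-across variables; it sketches the general approach and does not reproduce the published models or data.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    rng = np.random.default_rng(9)
    n = 96                                  # mirrors 72 training + 24 test substances
    X = rng.normal(size=(n, 6))             # stand-ins for DPRA, h-CLAT, KeratinoSens, log P, ...
    # Hypothetical hazard label loosely driven by the first three "assay" features
    y = (X[:, :3].sum(axis=1) + 0.5 * rng.normal(size=n) > 0).astype(int)

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=24, random_state=0)

    models = {
        "logistic regression": make_pipeline(StandardScaler(), LogisticRegression()),
        "support vector machine": make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0)),
    }
    for name, model in models.items():
        model.fit(X_train, y_train)
        print(f"{name}: test accuracy = {model.score(X_test, y_test):.2f}")
    ```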

  7. Robust extraction of basis functions for simultaneous and proportional myoelectric control via sparse non-negative matrix factorization

    Science.gov (United States)

    Lin, Chuang; Wang, Binghui; Jiang, Ning; Farina, Dario

    2018-04-01

    Objective. This paper proposes a novel simultaneous and proportional multiple degree of freedom (DOF) myoelectric control method for active prostheses. Approach. The approach is based on non-negative matrix factorization (NMF) of surface EMG signals with the inclusion of sparseness constraints. By applying a sparseness constraint to the control signal matrix, it is possible to extract the basis information from arbitrary movements (quasi-unsupervised approach) for multiple DOFs concurrently. Main Results. In online testing based on target hitting, able-bodied subjects reached a greater throughput (TP) when using sparse NMF (SNMF) than with classic NMF or with linear regression (LR). Accordingly, the completion time (CT) was shorter for SNMF than NMF or LR. The same observations were made in two patients with unilateral limb deficiencies. Significance. The addition of sparseness constraints to NMF allows for a quasi-unsupervised approach to myoelectric control with superior results with respect to previous methods for the simultaneous and proportional control of multi-DOF. The proposed factorization algorithm allows robust simultaneous and proportional control, is superior to previous supervised algorithms, and, because of minimal supervision, paves the way to online adaptation in myoelectric control.
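
    scikit-learn's NMF can impose an L1 penalty on the activation matrix, which gives a rough library-level analogue of the sparse NMF used here (it is not the authors' algorithm). The EMG matrix below is synthetic, and the penalty parameters `alpha_H` and `l1_ratio` follow recent scikit-learn releases.

    ```python
    import numpy as np
    from sklearn.decomposition import NMF

    rng = np.random.default_rng(10)
    n_channels, n_samples, n_dof = 8, 2000, 2

    # Synthetic rectified EMG: two non-negative "synergies" mixed across channels
    W_true = np.abs(rng.normal(size=(n_channels, n_dof)))
    H_true = np.abs(rng.normal(size=(n_dof, n_samples))) * (rng.random((n_dof, n_samples)) < 0.3)
    emg = W_true @ H_true + 0.05 * np.abs(rng.normal(size=(n_channels, n_samples)))

    # Sparse NMF: an L1 penalty on H encourages sparse per-DOF control signals
    model = NMF(n_components=n_dof, init="nndsvd", l1_ratio=1.0,
                alpha_W=0.0, alpha_H=0.1, max_iter=1000)
    W = model.fit_transform(emg)      # channels x components: the extracted basis functions
    H = model.components_             # components x samples: control signals per DOF

    print("reconstruction error:", round(float(model.reconstruction_err_), 3))
    print("fraction of near-zero activations:", float(np.mean(H < 1e-6)))
    ```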

  8. Egyptian Environmental Activities and Regulations for Management of Hazardous Substances and Hazardous Wastes

    International Nuclear Information System (INIS)

    El Zarka, M.

    1999-01-01

    A substantial use of hazardous substances is essential to meet the social and economic goals of the community in Egypt. Agrochemicals are being used extensively to increase crop yield, and outdated agrochemicals and their empty containers represent a serious environmental problem. Industrial development in different sectors in Egypt requires the handling of huge amounts of hazardous substances and hazardous wastes. The inappropriate handling of such hazardous substances creates several health and environmental problems. Egypt faces many challenges in controlling the safe handling of such substances and wastes. Several regulations govern the handling of hazardous substances in Egypt. The unified Environmental Law 4 for the year 1994 includes a full chapter on the Management of Hazardous Substances and Hazardous Wastes. National and international activities have been undertaken to manage hazardous substances and hazardous wastes in an environmentally sound manner

  9. Why do card issuers charge proportional fees?

    OpenAIRE

    Oz Shy; Zhu Wang

    2008-01-01

    This paper explains why payment card companies charge consumers and merchants fees which are proportional to the transaction values instead of charging a fixed per-transaction fee. Our theory shows that, even in the absence of any cost considerations, card companies earn much higher profit when they charge proportional fees. It is also shown that competition among merchants reduces card companies' gains from using proportional fees relative to a fixed per-transaction fee. Merchants are found ...

  10. Hazard classification methodology

    International Nuclear Information System (INIS)

    Brereton, S.J.

    1996-01-01

    This document outlines the hazard classification methodology used to determine the hazard classification of the NIF LTAB, OAB, and the support facilities on the basis of radionuclides and chemicals. The hazard classification determines the safety analysis requirements for a facility

  11. Modeling the Qualitative Relationship among Risks Associated with Occupational and Workplace Hazards in Seaport Environments: the Case of Apapa Port, Nigeria

    Directory of Open Access Journals (Sweden)

    Nwokedi Theophilus C

    2017-01-01

    Full Text Available The aim of the research is to establish the quantitative relationship and impacts of risks associated with various categories of occupational and workplace hazards in the Nigerian seaports. It was carried out by obtaining 7 years of time series statistical data from the hazard identification and risk assessment reports of the Nigerian Ports Authority (NPA) Apapa western ports headquarters. The variables considered are the associated risks of the various types of occupational and workplace hazards to which seaport workers were exposed from 2009-2014. The overall level of associated risk of occupational and workplace hazards represents the cumulative effect of the various hazards and was treated as the dependent variable ‘Y’. The exposures to the risks of mechanical hazards, ergonomic hazards, physical hazards, and noise/environmental hazards were symbolized as X1, X2, X3, and X4 respectively and treated as independent variables. The method of multiple regression analysis was used to analyze the time series data. A t-test was used to test the hypotheses. It was found that the risks associated with mechanical hazards, ergonomic hazards, noise/vibration hazards, and physical hazards all have a significant impact on the overall level of risk of exposure to occupational and workplace hazards in the Nigerian seaport environment. It was recommended that proactive investment in safety inspection and management systems is needed to limit the level of exposure of seaport staff to occupational hazards.

  12. A nonparametric mixture model for cure rate estimation.

    Science.gov (United States)

    Peng, Y; Dear, K B

    2000-03-01

    Nonparametric methods have attracted less attention than their parametric counterparts for cure rate analysis. In this paper, we study a general nonparametric mixture model. The proportional hazards assumption is employed in modeling the effect of covariates on the failure time of patients who are not cured. The EM algorithm, the marginal likelihood approach, and multiple imputations are employed to estimate parameters of interest in the model. This model extends models and improves estimation methods proposed by other researchers. It also extends Cox's proportional hazards regression model by allowing a proportion of event-free patients and investigating covariate effects on that proportion. The model and its estimation method are investigated by simulations. An application to breast cancer data, including comparisons with previous analyses using a parametric model and an existing nonparametric model by other researchers, confirms the conclusions from the parametric model but not those from the existing nonparametric model.
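
    To make the model class concrete, the standard two-component mixture cure formulation can be written as below; the notation is generic (pi(z) is the probability of being susceptible, i.e. not cured) and is not taken verbatim from the paper.

      % Mixture cure model with a proportional hazards latency component
      % (generic notation, assumed for illustration).
      \[
        S_{\mathrm{pop}}(t \mid x, z) \;=\; \bigl(1 - \pi(z)\bigr) \;+\; \pi(z)\, S_u(t \mid x),
        \qquad
        S_u(t \mid x) \;=\; S_0(t)^{\exp(\beta^{\top} x)},
        \qquad
        \pi(z) \;=\; \frac{\exp(\gamma^{\top} z)}{1 + \exp(\gamma^{\top} z)} .
      \]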

  13. Support Vector Hazards Machine: A Counting Process Framework for Learning Risk Scores for Censored Outcomes.

    Science.gov (United States)

    Wang, Yuanjia; Chen, Tianle; Zeng, Donglin

    2016-01-01

    Learning risk scores to predict dichotomous or continuous outcomes using machine learning approaches has been studied extensively. However, how to learn risk scores for time-to-event outcomes subject to right censoring has received little attention until recently. Existing approaches rely on inverse probability weighting or rank-based regression, which may be inefficient. In this paper, we develop a new support vector hazards machine (SVHM) approach to predict censored outcomes. Our method is based on predicting the counting process associated with the time-to-event outcomes among subjects at risk via a series of support vector machines. Introducing counting processes to represent time-to-event data leads to a connection between support vector machines in supervised learning and hazards regression in standard survival analysis. To account for different at-risk populations at observed event times, a time-varying offset is used in estimating risk scores. The resulting optimization is a convex quadratic programming problem that can easily incorporate non-linearity using the kernel trick. We demonstrate an interesting link from the profiled empirical risk function of SVHM to the Cox partial likelihood. We then formally show that SVHM is optimal in discriminating the covariate-specific hazard function from the population average hazard function, and establish the consistency and learning rate of the predicted risk using the estimated risk scores. Simulation studies show improved prediction accuracy of the event times using SVHM compared to existing machine learning methods and standard conventional approaches. Finally, we analyze data from two real-world biomedical studies in which we use clinical markers and neuroimaging biomarkers to predict the age at onset of a disease, and demonstrate the superiority of SVHM in distinguishing high-risk from low-risk subjects.

  14. Which Mixed-Member Proportional Electoral Formula Fits You Best? Assessing the Proportionality Principle of Positive Vote Transfer Systems

    DEFF Research Database (Denmark)

    Bochsler, Daniel

    2014-01-01

    Mixed-member proportional systems (MMP) are a family of electoral systems which combine district-based elections with a proportional seat allocation. Positive vote transfer systems belong to this family. This article explains why they might be better than their siblings, and examines under which ...

  15. Advanced statistics: linear regression, part II: multiple linear regression.

    Science.gov (United States)

    Marill, Keith A

    2004-01-01

    The applications of simple linear regression in medical research are limited, because in most situations, there are multiple relevant predictor variables. Univariate statistical techniques such as simple linear regression use a single predictor variable, and they often may be mathematically correct but clinically misleading. Multiple linear regression is a mathematical technique used to model the relationship between multiple independent predictor variables and a single dependent outcome variable. It is used in medical research to model observational data, as well as in diagnostic and therapeutic studies in which the outcome is dependent on more than one factor. Although the technique generally is limited to data that can be expressed with a linear function, it benefits from a well-developed mathematical framework that yields unique solutions and exact confidence intervals for regression coefficients. Building on Part I of this series, this article acquaints the reader with some of the important concepts in multiple regression analysis. These include multicollinearity, interaction effects, and an expansion of the discussion of inference testing, leverage, and variable transformations to multivariate models. Examples from the first article in this series are expanded on using a primarily graphic, rather than mathematical, approach. The importance of the relationships among the predictor variables and the dependence of the multivariate model coefficients on the choice of these variables are stressed. Finally, concepts in regression model building are discussed.
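
    The sketch below illustrates two of the concepts the article discusses, an interaction term and a multicollinearity check via variance inflation factors, on simulated data; the variable names and effect sizes are invented for illustration, and statsmodels is assumed to be available.

      # Hedged sketch: simulated data, not an example from the article.
      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf
      from statsmodels.stats.outliers_influence import variance_inflation_factor

      rng = np.random.default_rng(0)
      n = 200
      age = rng.normal(60, 10, n)
      bmi = rng.normal(27, 4, n)
      # Outcome depends on both predictors and on their interaction.
      sbp = 90 + 0.6 * age + 1.2 * bmi + 0.02 * age * bmi + rng.normal(0, 8, n)
      df = pd.DataFrame({"sbp": sbp, "age": age, "bmi": bmi})

      # Multiple linear regression with an interaction term (age:bmi).
      fit = smf.ols("sbp ~ age * bmi", data=df).fit()
      print(fit.summary())

      # Variance inflation factors as a simple multicollinearity diagnostic
      # (values much above ~10 are a common warning sign).
      X = fit.model.exog
      for j, name in enumerate(fit.model.exog_names):
          if name != "Intercept":
              print(name, round(variance_inflation_factor(X, j), 1))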

  16. Low-density lipoprotein cholesterol and the risk of cancer: a mendelian randomization study

    DEFF Research Database (Denmark)

    Benn, Marianne; Tybjærg-Hansen, Anne; Stender, Stefan

    2011-01-01

    cholesterol was calculated using the Friedewald equation in samples in which the triglyceride level was less than 354 mg/dL and measured directly by colorimetry for samples with higher triglyceride levels. Risk of cancer was estimated prospectively using Cox proportional hazards regression analyses and cross...
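
    For reference, the Friedewald estimate mentioned in this record has a simple closed form (concentrations in mg/dL). The sketch below is illustrative only; the function name is invented, and the 354 mg/dL triglyceride cutoff is the one quoted above.

      # Illustrative Friedewald calculation; not code from the study.
      def ldl_friedewald(total_chol: float, hdl: float, triglycerides: float) -> float:
          """Estimate LDL cholesterol as TC - HDL - TG/5 (mg/dL), valid only below
          the triglyceride cutoff used in the record above (354 mg/dL)."""
          if triglycerides >= 354:
              raise ValueError("Triglycerides too high; measure LDL directly.")
          return total_chol - hdl - triglycerides / 5.0

      print(ldl_friedewald(200, 50, 150))  # -> 120.0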

  17. Boosted beta regression.

    Directory of Open Access Journals (Sweden)

    Matthias Schmid

    Full Text Available Regression analysis with a bounded outcome is a common problem in applied statistics. Typical examples include regression models for percentage outcomes and the analysis of ratings that are measured on a bounded scale. In this paper, we consider beta regression, which is a generalization of logit models to situations where the response is continuous on the interval (0,1). Consequently, beta regression is a convenient tool for analyzing percentage responses. The classical approach to fitting a beta regression model is to use maximum likelihood estimation with subsequent AIC-based variable selection. As an alternative to this established - yet unstable - approach, we propose a new estimation technique called boosted beta regression. With boosted beta regression, estimation and variable selection can be carried out simultaneously in a highly efficient way. Additionally, both the mean and the variance of a percentage response can be modeled using flexible nonlinear covariate effects. As a consequence, the new method accounts for common problems such as overdispersion and non-binomial variance structures.
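
    As a minimal reference point for the model class (not the boosting algorithm proposed in the paper), the sketch below fits a plain maximum-likelihood beta regression with a logit link for the mean and a constant precision, on simulated data.

      # Hedged sketch: plain ML beta regression on simulated data; the boosting
      # procedure itself is not implemented here.
      import numpy as np
      from scipy import optimize, special, stats

      rng = np.random.default_rng(1)
      n = 300
      x = rng.normal(size=n)
      X = np.column_stack([np.ones(n), x])        # design matrix with intercept
      mu_true = special.expit(-0.5 + 1.0 * x)     # true mean on (0, 1)
      phi_true = 20.0                             # true precision
      y = rng.beta(mu_true * phi_true, (1 - mu_true) * phi_true)

      def negloglik(params):
          beta, log_phi = params[:-1], params[-1]
          mu = special.expit(X @ beta)
          phi = np.exp(log_phi)
          return -np.sum(stats.beta.logpdf(y, mu * phi, (1 - mu) * phi))

      res = optimize.minimize(negloglik, x0=np.zeros(X.shape[1] + 1), method="BFGS")
      print("coefficient estimates:", res.x[:-1], "precision:", np.exp(res.x[-1]))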

  18. Two-Sample Tests for High-Dimensional Linear Regression with an Application to Detecting Interactions.

    Science.gov (United States)

    Xia, Yin; Cai, Tianxi; Cai, T Tony

    2018-01-01

    Motivated by applications in genomics, we consider in this paper global and multiple testing for the comparisons of two high-dimensional linear regression models. A procedure for testing the equality of the two regression vectors globally is proposed and shown to be particularly powerful against sparse alternatives. We then introduce a multiple testing procedure for identifying unequal coordinates while controlling the false discovery rate and false discovery proportion. Theoretical justifications are provided to guarantee the validity of the proposed tests and optimality results are established under sparsity assumptions on the regression coefficients. The proposed testing procedures are easy to implement. Numerical properties of the procedures are investigated through simulation and data analysis. The results show that the proposed tests maintain the desired error rates under the null and have good power under the alternative at moderate sample sizes. The procedures are applied to the Framingham Offspring study to investigate the interactions between smoking and cardiovascular related genetic mutations important for an inflammation marker.

  19. Depression and incident dementia. An 8-year population-based prospective study.

    Science.gov (United States)

    Luppa, Melanie; Luck, Tobias; Ritschel, Franziska; Angermeyer, Matthias C; Villringer, Arno; Riedel-Heller, Steffi G

    2013-01-01

    The aim of the study was to investigate the impact of depression (categorical diagnosis: major depression, MD) and depressive symptoms (dimensional diagnosis and symptom patterns) on incident dementia in the German general population. Within the Leipzig Longitudinal Study of the Aged (LEILA 75+), a representative sample of 1,265 individuals aged 75 years and older was interviewed every 1.5 years over 8 years (mean observation time 4.3 years; mean number of visits 4.2). Cox proportional hazards and binary logistic regressions were used to estimate the effect of baseline depression and depressive symptoms on incident dementia. The incidence of dementia was 48 per 1,000 person-years (95% confidence interval (CI) 45-51). Depressive symptoms (hazard ratio (HR) 1.03, 95% CI 1.01-1.05), and in particular mood-related symptoms (HR 1.08, 95% CI 1.03-1.14), showed a significant impact on the incidence of dementia only in univariate analyses, not after adjustment for cognitive and functional impairment. MD showed a significant impact on the incidence of dementia only in the Cox proportional hazards regression, not in the binary logistic regression models. The present study, using different diagnostic measures of depression, found no clear significant association between depression and incident dementia. Further in-depth investigation would help to understand the nature of depression in the context of incident dementia.
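
    A sketch of the kind of unadjusted versus covariate-adjusted Cox model used in this study is given below. It assumes the Python lifelines package; the data frame, column names and effect sizes are simulated stand-ins, not the LEILA 75+ data.

      # Hedged sketch: simulated data, assuming the 'lifelines' package.
      import numpy as np
      import pandas as pd
      from lifelines import CoxPHFitter

      rng = np.random.default_rng(2)
      n = 500
      depression = rng.binomial(1, 0.3, n)              # baseline depressive symptoms
      cognition = rng.normal(0, 1, n)                   # cognitive impairment score
      time = rng.exponential(8 / np.exp(0.3 * depression + 0.5 * cognition))
      dementia = rng.binomial(1, 0.6, n)                # 1 = incident dementia observed
      df = pd.DataFrame({"time": time, "dementia": dementia,
                         "depression": depression, "cognition": cognition})

      # Univariate (unadjusted) model for the depression indicator ...
      CoxPHFitter().fit(df[["time", "dementia", "depression"]],
                        duration_col="time", event_col="dementia").print_summary()

      # ... and the model adjusted for cognitive impairment; as in the study, the
      # depression effect can weaken once such covariates are included.
      CoxPHFitter().fit(df, duration_col="time", event_col="dementia").print_summary()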

  20. Regression to Causality : Regression-style presentation influences causal attribution

    DEFF Research Database (Denmark)

    Bordacconi, Mats Joe; Larsen, Martin Vinæs

    2014-01-01

    of equivalent results presented as either regression models or as a test of two sample means. Our experiment shows that the subjects who were presented with results as estimates from a regression model were more inclined to interpret these results causally. Our experiment implies that scholars using regression ... models – one of the primary vehicles for analyzing statistical results in political science – encourage causal interpretation. Specifically, we demonstrate that presenting observational results in a regression model, rather than as a simple comparison of means, makes causal interpretation of the results ... more likely. Our experiment drew on a sample of 235 university students from three different social science degree programs (political science, sociology and economics), all of whom had received substantial training in statistics. The subjects were asked to compare and evaluate the validity...

  1. Proportion congruency effects: Instructions may be enough

    Directory of Open Access Journals (Sweden)

    Olga eEntel

    2014-10-01

    Full Text Available Learning takes time, namely, one needs to be exposed to contingency relations between stimulus dimensions in order to learn, whereas intentional control can be recruited through task demands. Therefore, showing that control can be recruited as a function of experimental instructions alone, that is, adapting processing according to the instructions before exposure to the task, can be taken as evidence for the existence of control recruitment in the absence of learning. This was done by manipulating the information given at the outset of the experiment. In the first experiment, we manipulated list-level congruency proportion. Half of the participants were informed that most of the stimuli would be congruent, whereas the other half were informed that most of the stimuli would be incongruent. This held true for the stimuli in the second part of each experiment. In the first part, however, the proportion of the two stimulus types was equal. A proportion congruent effect was found in both parts of the experiment, but it was larger in the second part. In our second experiment, we manipulated the proportion of the stimuli within participants by applying an item-specific design. This was done by presenting some color words most often in their congruent color, and other color words in incongruent colors. Participants were informed about the exact word-color pairings in advance. Similar to Experiment 1, this held true only for the second experimental part. In contrast to our first experiment, informing participants in advance did not result in an item-specific proportion effect, which was observed only in the second part. Thus, our results support the hypothesis that instructions may be enough to trigger list-level control, yet learning does contribute to the proportion congruent effect under such conditions. The item-level proportion effect is apparently caused by learning, or at least is moderated by it.

  2. Multiwire proportional chamber development

    Science.gov (United States)

    Doolittle, R. F.; Pollvogt, U.; Eskovitz, A. J.

    1973-01-01

    The development of large area multiwire proportional chambers, to be used as high resolution spatial detectors in cosmic ray experiments, is described. A readout system was developed which uses a directly coupled, lumped element delay-line whose characteristics are independent of the MWPC design. A complete analysis of the delay-line and the readout electronic system shows that a spatial resolution of about 0.1 mm can be reached with the MWPC operating in the strictly proportional region. This was confirmed by measurements with a small MWPC and Fe-55 X-rays. A simplified analysis was carried out to estimate the theoretical limit of spatial resolution due to delta-rays, spread of the discharge along the anode wire, and inclined trajectories. To calculate the gas gain of MWPCs of different geometrical configurations, a method was developed which is based on the knowledge of the first Townsend coefficient of the chamber gas.

  3. ''Hazardous'' terminology

    International Nuclear Information System (INIS)

    Powers, J.

    1991-01-01

    A number of terms (e.g., ''hazardous chemicals,'' ''hazardous materials,'' ''hazardous waste,'' and similar nomenclature) refer to substances that are subject to regulation under one or more federal environmental laws. State laws and regulations also provide additional, similar, or identical terminology that may be confused with the federally defined terms. Many of these terms appear synonymous, and it is easy to use them interchangeably. However, in a regulatory context, inappropriate use of narrowly defined terms can lead to confusion about the substances referred to, the statutory provisions that apply, and the regulatory requirements for compliance under the applicable federal statutes. This Information Brief provides regulatory definitions, a brief discussion of compliance requirements, and references for the precise terminology that should be used when referring to ''hazardous'' substances regulated under federal environmental laws. A companion CERCLA Information Brief (EH-231-004/0191) addresses ''toxic'' nomenclature.

  4. Exposure to hazardous workplace noise and use of hearing protection devices among US workers--NHANES, 1999-2004.

    Science.gov (United States)

    Tak, Sangwoo; Davis, Rickie R; Calvert, Geoffrey M

    2009-05-01

    To estimate the prevalence of workplace noise exposure and use of hearing protection devices (HPDs) at noisy work, NIOSH analyzed 1999-2004 data from the National Health and Nutrition Examination Survey (NHANES). A total of 9,275 currently employed workers aged ≥16 years were included in the weighted analysis. Hazardous workplace noise exposure was defined as self-reported exposure to noise at their current job that was so loud that the respondent had to speak in a raised voice to be heard. Industry and occupation were determined based on the respondent's current place and type of work. Twenty-two million US workers (17%) reported exposure to hazardous workplace noise. The weighted prevalence of workplace noise exposure was highest for mining (76%, SE = 7.0) followed by lumber/wood product manufacturing (55%, SE = 2.5). High-risk occupations included repair and maintenance, motor vehicle operators, and construction trades. Overall, 34% of the estimated 22 million US workers reporting hazardous workplace exposure reported non-use of HPDs. The proportion of noise-exposed workers who reported non-use of HPDs was highest for healthcare and social services (73.7%, SE = 8.1), followed by educational services (55.5%). Hearing loss prevention and intervention programs should be targeted at those industries and occupations identified to have a high prevalence of workplace noise exposure and those industries with the highest proportion of noise-exposed workers who reported non-use of HPDs. Published 2009 Wiley-Liss, Inc.

  5. Hazard Analysis Database Report

    CERN Document Server

    Grams, W H

    2000-01-01

    The Hazard Analysis Database was developed in conjunction with the hazard analysis activities conducted in accordance with DOE-STD-3009-94, Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, for HNF-SD-WM-SAR-067, Tank Farms Final Safety Analysis Report (FSAR). The FSAR is part of the approved Authorization Basis (AB) for the River Protection Project (RPP). This document describes, identifies, and defines the contents and structure of the Tank Farms FSAR Hazard Analysis Database and documents the configuration control changes made to the database. The Hazard Analysis Database contains the collection of information generated during the initial hazard evaluations and the subsequent hazard and accident analysis activities. The Hazard Analysis Database supports the preparation of Chapters 3, 4, and 5 of the Tank Farms FSAR and the Unreviewed Safety Question (USQ) process and consists of two major, interrelated data sets: (1) Hazard Analysis Database: Data from t...

  6. Disaggregated seismic hazard and the elastic input energy spectrum: An approach to design earthquake selection

    Science.gov (United States)

    Chapman, Martin Colby

    1998-12-01

    The design earthquake selection problem is fundamentally probabilistic. Disaggregation of a probabilistic model of the seismic hazard offers a rational and objective approach that can identify the most likely earthquake scenario(s) contributing to hazard. An ensemble of time series can be selected on the basis of the modal earthquakes derived from the disaggregation. This gives a useful time-domain realization of the seismic hazard, to the extent that a single motion parameter captures the important time-domain characteristics. A possible limitation to this approach arises because most currently available motion prediction models for peak ground motion or oscillator response are essentially independent of duration, and modal events derived using the peak motions for the analysis may not represent the optimal characterization of the hazard. The elastic input energy spectrum is an alternative to the elastic response spectrum for these types of analyses. The input energy combines the elements of amplitude and duration into a single parameter description of the ground motion that can be readily incorporated into standard probabilistic seismic hazard analysis methodology. This use of the elastic input energy spectrum is examined. Regression analysis is performed using strong motion data from Western North America and consistent data processing procedures for both the absolute input energy equivalent velocity (V_ea) and the elastic pseudo-relative velocity response (PSV) in the frequency range 0.5 to 10 Hz. The results show that the two parameters can be successfully fit with identical functional forms. The dependence of V_ea and PSV upon (NEHRP) site classification is virtually identical. The variance of V_ea is uniformly less than that of PSV, indicating that V_ea can be predicted with slightly less uncertainty as a function of magnitude, distance and site classification. The effects of site class are important at frequencies less than a few Hertz. The regression

  7. Hazard Ranking Method for Populations Exposed to Arsenic in Private Water Supplies: Relation to Bedrock Geology.

    Science.gov (United States)

    Crabbe, Helen; Fletcher, Tony; Close, Rebecca; Watts, Michael J; Ander, E Louise; Smedley, Pauline L; Verlander, Neville Q; Gregory, Martin; Middleton, Daniel R S; Polya, David A; Studden, Mike; Leonardi, Giovanni S

    2017-12-01

    Approximately one million people in the UK are served by private water supplies (PWS) where connection to the main municipal water supply system is not practical or where a PWS is the preferred option. Chronic exposure to contaminants in PWS may have adverse effects on health. South West England is an area with elevated arsenic concentrations in groundwater, and over 9000 domestic dwellings here are supplied by PWS. There remains uncertainty as to the extent of the population exposed to arsenic (As) and the factors predicting such exposure. We describe a hazard assessment model based on simplified geology with the potential to predict exposure to As in PWS. Households with a recorded PWS in Cornwall were recruited to take part in a water sampling programme from 2011 to 2013. Bedrock geologies were aggregated and classified into nine Simplified Bedrock Geological Categories (SBGC), plus a cross-cutting "mineralized" area. PWS were sampled by random selection within SBGCs, and some 508 households volunteered for the study. Transformations of the data were explored to estimate the distribution of As concentrations for PWS by SBGC. Using the distribution per SBGC, we predict the proportion of dwellings that would be affected by high concentrations and rank the geologies according to hazard. Within most SBGCs, As concentrations were found to have log-normal distributions. Across these areas, the proportion of dwellings predicted to have drinking water over the prescribed concentration value (PCV) for As ranged from 0% to 20%. From these results, a pilot predictive model was developed calculating the proportion of PWS above the PCV for As, and the hazard ranking supports local decision making and prioritization. With further development and testing, this can help local authorities predict the number of dwellings that might fail the PCV for As, based on bedrock geology. The model presented here for Cornwall could be applied in areas with similar geologies. Application of the method
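
    The core distribution-fitting step described above can be sketched as follows: fit a log-normal to the measured arsenic concentrations within one geological category and estimate the proportion of supplies exceeding the PCV. The concentrations below are simulated, and the PCV is taken as 10 ug/L for illustration.

      # Hedged sketch: simulated concentrations, not the Cornwall survey data.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(3)
      pcv = 10.0                                          # assumed PCV for As, ug/L
      conc = rng.lognormal(mean=0.5, sigma=1.2, size=60)  # simulated As measurements

      # Fit a log-normal via the mean and sd of the log-concentrations.
      mu, sigma = np.log(conc).mean(), np.log(conc).std(ddof=1)

      # Proportion of supplies predicted to exceed the PCV under the fitted model.
      p_exceed = 1 - stats.norm.cdf((np.log(pcv) - mu) / sigma)
      print(f"Estimated proportion of supplies above the PCV: {p_exceed:.1%}")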

  8. The Origins of Scintillator Non-Proportionality

    Science.gov (United States)

    Moses, W. W.; Bizarri, G. A.; Williams, R. T.; Payne, S. A.; Vasil'ev, A. N.; Singh, J.; Li, Q.; Grim, J. Q.; Choong, W.-S.

    2012-10-01

    Recent years have seen significant advances in both theoretically understanding and mathematically modeling the underlying causes of scintillator non-proportionality. The core cause is that the interaction of radiation with matter invariably leads to a non-uniform ionization density in the scintillator, coupled with the fact that the light yield depends on the ionization density. The mechanisms that lead to the luminescence dependence on ionization density are incompletely understood, but several important features have been identified, notably Auger-like processes (where two carriers of excitation interact with each other, causing one to de-excite non-radiatively), the inability of excitation carriers to recombine (caused either by trapping or physical separation), and the carrier mobility. This paper reviews the present understanding of the fundamental origins of scintillator non-proportionality, specifically the various theories that have been used to explain non-proportionality.

  9. Parity and the risk of breast and ovarian cancer in BRCA1 and BRCA2 mutation carriers

    OpenAIRE

    Milne, Roger L.; Osorio, Ana; Ramón Y Cajal, Teresa; Baiget, Montserrat; Lasa, Adriana; Diaz-Rubio, Eduardo; Hoya, Miguel; Caldés, Trinidad; Teulé, Alex; Lázaro, Conxi; Blanco, Ignacio; Balmaña, Judith; Sánchez-Ollé, Gessamí; Vega, Ana; Blanco, Ana

    2009-01-01

    Environmental or lifestyle factors are likely to explain part of the heterogeneity in breast and ovarian cancer risk among BRCA1 and BRCA2 mutation carriers. We assessed parity as a risk modifier in 515 and 503 Spanish female carriers of mutations in BRCA1 and BRCA2, respectively. Hazard ratios (HR) and their corresponding 95% confidence intervals (CI) were estimated using weighted Cox proportional hazards regression, adjusted for year of birth and study centre. The result...

  10. The Relationship of Alcoholism and Alcohol Consumption to All-Cause Mortality in Forty-One-Year Follow-up of the Swedish REBUS Sample

    DEFF Research Database (Denmark)

    Lundin, Andreas; Mortensen, Laust Hvas; Halldin, Jan

    2015-01-01

    of the International Classification of Diseases (ICD-8). Information on the usual amount and frequency of alcohol consumption was collected at the psychiatric interview. Mortality up to year 2011 was assessed with Cox proportional hazard regression models. RESULTS: At baseline, there were 65 men and 21 women diagnosed...

  11. ThinkHazard!: an open-source, global tool for understanding hazard information

    Science.gov (United States)

    Fraser, Stuart; Jongman, Brenden; Simpson, Alanna; Nunez, Ariel; Deparday, Vivien; Saito, Keiko; Murnane, Richard; Balog, Simone

    2016-04-01

    Rapid and simple access to added-value natural hazard and disaster risk information is a key issue for various stakeholders of the development and disaster risk management (DRM) domains. Accessing available data often requires specialist knowledge of heterogeneous data, which are often highly technical and can be difficult for non-specialists in DRM to find and exploit. Thus, availability, accessibility and processing of these information sources are crucial issues, and an important reason why many development projects suffer significant impacts from natural hazards. The World Bank's Global Facility for Disaster Reduction and Recovery (GFDRR) is currently developing a new open-source tool to address this knowledge gap: ThinkHazard! The main aim of the ThinkHazard! project is to develop an analytical tool dedicated to facilitating improvements in knowledge and understanding of natural hazards among non-specialists in DRM. It also aims at providing users with relevant guidance and information on handling the threats posed by the natural hazards present in a chosen location. Furthermore, all aspects of this tool will be open and transparent, in order to give users enough information to understand its operational principles. In this presentation, we will explain the technical approach behind the tool, which translates state-of-the-art probabilistic natural hazard data into understandable hazard classifications and practical recommendations. We will also demonstrate the functionality of the tool, and discuss limitations from a scientific as well as an operational perspective.

  12. Predicting Cumulative Incidence Probability by Direct Binomial Regression

    DEFF Research Database (Denmark)

    Scheike, Thomas H.; Zhang, Mei-Jie

    Binomial modelling; cumulative incidence probability; cause-specific hazards; subdistribution hazard

  13. Survival analysis in hematologic malignancies: recommendations for clinicians

    Science.gov (United States)

    Delgado, Julio; Pereira, Arturo; Villamor, Neus; López-Guillermo, Armando; Rozman, Ciril

    2014-01-01

    The widespread availability of statistical packages has undoubtedly helped hematologists worldwide in the analysis of their data, but has also led to the inappropriate use of statistical methods. In this article, we review some basic concepts of survival analysis and also make recommendations about how and when to perform each particular test using SPSS, Stata and R. In particular, we describe a simple way of defining cut-off points for continuous variables and the appropriate and inappropriate uses of the Kaplan-Meier method and Cox proportional hazard regression models. We also provide practical advice on how to check the proportional hazards assumption and briefly review the role of relative survival and multiple imputation. PMID:25176982
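
    The article's worked examples use SPSS, Stata and R; as an analogous illustration in Python, the sketch below fits a Cox model and checks the proportional hazards assumption via scaled Schoenfeld residuals using the lifelines package (assumed available). The data are simulated.

      # Hedged sketch: simulated survival data, assuming the 'lifelines' package.
      import numpy as np
      import pandas as pd
      from lifelines import CoxPHFitter

      rng = np.random.default_rng(4)
      n = 400
      treatment = rng.binomial(1, 0.5, n)
      age = rng.normal(60, 10, n)
      time = rng.exponential(10 / np.exp(0.7 * treatment + 0.02 * age))
      event = rng.binomial(1, 0.7, n)
      df = pd.DataFrame({"time": time, "event": event,
                         "treatment": treatment, "age": age})

      cph = CoxPHFitter().fit(df, duration_col="time", event_col="event")
      cph.print_summary()
      # Scaled-Schoenfeld-residual check of the proportional hazards assumption.
      cph.check_assumptions(df, p_value_threshold=0.05)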

  14. A comparison of Cox and logistic regression for use in genome-wide association studies of cohort and case-cohort design.

    Science.gov (United States)

    Staley, James R; Jones, Edmund; Kaptoge, Stephen; Butterworth, Adam S; Sweeting, Michael J; Wood, Angela M; Howson, Joanna M M

    2017-06-01

    Logistic regression is often used instead of Cox regression to analyse genome-wide association studies (GWAS) of single-nucleotide polymorphisms (SNPs) and disease outcomes with cohort and case-cohort designs, as it is less computationally expensive. Although Cox and logistic regression models have been compared previously in cohort studies, this work does not completely cover the GWAS setting nor extend to the case-cohort study design. Here, we evaluated Cox and logistic regression applied to cohort and case-cohort genetic association studies using simulated data and genetic data from the EPIC-CVD study. In the cohort setting, there was a modest improvement in power to detect SNP-disease associations using Cox regression compared with logistic regression, which increased as the disease incidence increased. In contrast, logistic regression had more power than (Prentice weighted) Cox regression in the case-cohort setting. Logistic regression yielded inflated effect estimates (assuming the hazard ratio is the underlying measure of association) for both study designs, especially for SNPs with greater effect on disease. Given logistic regression is substantially more computationally efficient than Cox regression in both settings, we propose a two-step approach to GWAS in cohort and case-cohort studies. First to analyse all SNPs with logistic regression to identify associated variants below a pre-defined P-value threshold, and second to fit Cox regression (appropriately weighted in case-cohort studies) to those identified SNPs to ensure accurate estimation of association with disease.
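
    A stripped-down version of the proposed two-step strategy is sketched below: screen every SNP with a fast logistic regression, then refit Cox models only for SNPs that pass a P-value threshold. Genotypes, phenotypes and the threshold are simulated or assumed; this is not the EPIC-CVD pipeline.

      # Hedged sketch: simulated genotypes and outcomes.
      import numpy as np
      import pandas as pd
      import statsmodels.api as sm
      from lifelines import CoxPHFitter

      rng = np.random.default_rng(5)
      n, n_snps, threshold = 1000, 50, 1e-3
      geno = rng.binomial(2, 0.3, size=(n, n_snps))       # 0/1/2 allele counts
      risk = 0.5 * geno[:, 0] - 0.4 * geno[:, 1]          # two causal SNPs
      time = rng.exponential(5 / np.exp(risk))
      event = (time < 6).astype(int)                      # administrative censoring at t=6
      time = np.minimum(time, 6)

      # Step 1: logistic regression screen on the binary event indicator.
      hits = []
      for j in range(n_snps):
          X = sm.add_constant(geno[:, j].astype(float))
          if sm.Logit(event, X).fit(disp=0).pvalues[1] < threshold:
              hits.append(j)

      # Step 2: Cox regression for the SNPs that passed the screen.
      for j in hits:
          df = pd.DataFrame({"time": time, "event": event, "snp": geno[:, j]})
          cph = CoxPHFitter().fit(df, duration_col="time", event_col="event")
          print(f"SNP {j}: hazard ratio = {np.exp(cph.params_['snp']):.2f}")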

  15. Social Stressors and Alcohol Use among Immigrant Sexual and Gender Minority Latinos in a Non-Traditional Settlement State

    OpenAIRE

    Gilbert, Paul A.; Perreira, Krista; Eng, Eugenia; Rhodes, Scott D.

    2014-01-01

    We sought to quantify the association of social stressors with alcohol use among immigrant sexual and gender minority Latinos in North Carolina (n = 190). We modeled any drinking in past year using logistic regression and heavy episodic drinking in past 30 days using Poisson regression. Despite a large proportion of abstainers, there were indications of hazardous drinking. Among current drinkers, 63% reported at least one heavy drinking episode in past 30 days. Ethnic discrimination increased...

  16. Multi-Hazard Interactions in Guatemala

    Science.gov (United States)

    Gill, Joel; Malamud, Bruce D.

    2017-04-01

    In this paper, we combine physical and social science approaches to develop a multi-scale regional framework for natural hazard interactions in Guatemala. The identification and characterisation of natural hazard interactions is an important input for comprehensive multi-hazard approaches to disaster risk reduction at a regional level. We use five transdisciplinary evidence sources to organise and populate our framework: (i) internationally-accessible literature; (ii) civil protection bulletins; (iii) field observations; (iv) stakeholder interviews (hazard and civil protection professionals); and (v) stakeholder workshop results. These five evidence sources are synthesised to determine an appropriate natural hazard classification scheme for Guatemala (6 hazard groups, 19 hazard types, and 37 hazard sub-types). For a national spatial extent (Guatemala), we construct and populate a "21×21" hazard interaction matrix, identifying 49 possible interactions between 21 hazard types. For a sub-national spatial extent (Southern Highlands, Guatemala), we construct and populate a "33×33" hazard interaction matrix, identifying 112 possible interactions between 33 hazard sub-types. Evidence sources are also used to constrain anthropogenic processes that could trigger natural hazards in Guatemala, and characterise possible networks of natural hazard interactions (cascades). The outcomes of this approach are among the most comprehensive interaction frameworks for national and sub-national spatial scales in the published literature. These can be used to support disaster risk reduction and civil protection professionals in better understanding natural hazards and potential disasters at a regional scale.

  17. 16 CFR 240.9 - Proportionally equal terms.

    Science.gov (United States)

    2010-01-01

    Title 16 (Commercial Practices), Federal Trade Commission, Guides and Trade Practice Rules, Guides for Advertising Allowances and Other Merchandising Payments and Services, § 240.9 Proportionally equal terms. (a...

  18. Assessing Lake Trophic Status: A Proportional Odds Logistic Regression Model

    Science.gov (United States)

    Lake trophic state classifications are good predictors of ecosystem condition and are indicative of both ecosystem services (e.g., recreation and aesthetics) and disservices (e.g., harmful algal blooms). Methods for classifying trophic state are based on the foundational work o...
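
    A minimal proportional-odds (ordinal) logistic regression of the kind described can be sketched as below. It assumes a recent statsmodels release that provides OrderedModel; the lake "data" are simulated stand-ins, not the data referred to in the record.

      # Hedged sketch: simulated predictors and trophic classes.
      import numpy as np
      import pandas as pd
      from statsmodels.miscmodels.ordinal_model import OrderedModel

      rng = np.random.default_rng(6)
      n = 300
      log_tn = rng.normal(0, 1, n)                    # e.g. standardised log total nitrogen
      log_chla = 0.8 * log_tn + rng.normal(0, 0.6, n) # e.g. standardised log chlorophyll-a
      latent = 1.5 * log_tn + 1.0 * log_chla + rng.logistic(size=n)
      trophic = pd.cut(latent, bins=[-np.inf, -1.5, 1.0, 3.0, np.inf],
                       labels=["oligotrophic", "mesotrophic", "eutrophic", "hypereutrophic"])
      df = pd.DataFrame({"trophic": trophic, "log_tn": log_tn, "log_chla": log_chla})

      # Proportional-odds model: one set of slopes, ordered thresholds between classes.
      model = OrderedModel(df["trophic"], df[["log_tn", "log_chla"]], distr="logit")
      print(model.fit(method="bfgs", disp=False).summary())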

  19. Mathematical models for estimating earthquake casualties and damage cost through regression analysis using matrices

    International Nuclear Information System (INIS)

    Urrutia, J D; Bautista, L A; Baccay, E B

    2014-01-01

    The aim of this study was to develop mathematical models for estimating earthquake casualties such as death, number of injured persons, affected families and total cost of damage. To quantify the direct damages from earthquakes to human beings and properties given the magnitude, intensity, depth of focus, location of epicentre and time duration, the regression models were made. The researchers formulated models through regression analysis using matrices and used α = 0.01. The study considered thirty destructive earthquakes that hit the Philippines from the inclusive years 1968 to 2012. Relevant data about these said earthquakes were obtained from Philippine Institute of Volcanology and Seismology. Data on damages and casualties were gathered from the records of National Disaster Risk Reduction and Management Council. This study will be of great value in emergency planning, initiating and updating programs for earthquake hazard reduction in the Philippines, which is an earthquake-prone country.
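
    The "regression analysis using matrices" referred to above is, in its textbook form, the solution of the normal equations (X'X)b = X'y. The sketch below shows that computation with made-up predictor values; it does not reproduce the Philippine earthquake data or the published models.

      # Hedged sketch: made-up values, textbook normal-equations computation.
      import numpy as np

      # Columns: intercept, magnitude, intensity, focal depth (km).
      X = np.array([
          [1, 6.5,  7, 15],
          [1, 7.1,  8, 10],
          [1, 6.8,  7, 25],
          [1, 7.6,  9, 20],
          [1, 6.2,  6, 35],
          [1, 7.9, 10, 12],
      ], dtype=float)
      y = np.array([120, 310, 180, 560, 60, 890], dtype=float)  # e.g. casualties

      # Solve (X'X) b = X'y for the coefficient vector b.
      beta = np.linalg.solve(X.T @ X, X.T @ y)
      print("estimated coefficients:", np.round(beta, 2))
      print("fitted values:", np.round(X @ beta, 1))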

  20. Hazard management at the workplace

    International Nuclear Information System (INIS)

    Hasfazilah Hassan; Azimawati Ahmad; Syed Asraf Fahlawi Wafa S M Ghazi; Hairul Nizam Idris

    2005-01-01

    Failure to ensure a healthy and safe environment at the workplace can cause accidents involving losses of time, human resources and money and, in the worst case, can affect the morale of an organization. Looking at the causes of accidents, it is impossible to have a totally safe workplace, because every process in work activities carries its own hazard elements. The purpose of this paper is to discuss the best actions to prevent such hazards through comprehensive and effective hazard management. Hazard management is one of the proactive hazard controls: with it, hazards can be identified and evaluated and the associated risks controlled. Hazard management should therefore be reviewed constantly and continuously to make sure that workplace hazards are always under control. (Author)

  1. DOE Hazardous Waste Program

    International Nuclear Information System (INIS)

    Eyman, L.D.; Craig, R.B.

    1985-01-01

    The goal of the DOE Hazardous Waste Program is to support the implementation and improvement of hazardous-chemical and mixed-radioactive-waste management such that public health, safety, and the environment are protected and DOE missions are effectively accomplished. The strategy for accomplishing this goal is to define the character and magnitude of hazardous wastes emanating from DOE facilities, determine what DOE resources are available to address these problems, define the regulatory and operational constraints, and develop programs and plans to resolve hazardous waste issues. Over the longer term the program will support the adaptation and application of technologies to meet hazardous waste management needs and to implement an integrated, DOE-wide hazardous waste management strategy. 1 reference, 1 figure

  2. Prediction of Vitamin D Deficiency Among Tabriz Elderly and Nursing Home Residents Using Stereotype Regression Model

    Directory of Open Access Journals (Sweden)

    Zohreh Razzaghi

    2011-07-01

    Full Text Available Objectives: Vitamin D deficiency is one of the most important health problems of any society. It is more common in the elderly, even in those dwelling in rest homes. Several studies have been conducted on vitamin D deficiency using current statistical models. In this study, proportional odds and stereotype regression methods were used to identify threatening factors related to vitamin D deficiency in elderly people living in rest homes and to compare them with those who live outside such centers. Methods & Materials: In this case-control study, there were 140 older persons living in rest homes and 140 not dwelling in these centers. The serum 25(OH)D level was regarded as the response variable, and age, sex, body mass index and duration of exposure to sunlight were regarded as predictive variables for vitamin D deficiency. The analyses were carried out using the proportional odds and stereotype regression methods and estimating the parameters of these two models. The deviation statistic (AIC) was used to evaluate and compare the methods. Stata 9.1 software was selected to conduct the analyses. Results: The average serum level of 25(OH)D was 16.10±16.65 ng/ml in individuals living in rest homes and 39.62±24.78 ng/ml in those not living there (P=0.001). Vitamin D deficiency (less than 20 ng/ml) was observed in 75% of the rest-home group and 23.78% of the other group. Using the proportional odds and stereotype regression methods, age, sex, body mass index, duration of exposure to sunlight and rest-home residence were fitted. In both models, the group and duration-of-exposure-to-sunlight variables were significant (P<0.001). The stereotype regression model included the group variable (odds ratio for the group suffering from severe vitamin D deficiency 42.85, 95%CI: 9.93-185.67 and
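
    For readers unfamiliar with the stereotype model, its usual form (generic notation, not taken from this article) compares each response category k with a reference category K through a single linear predictor scaled by an estimated score:

      % Anderson-type stereotype regression model (notation assumed for illustration).
      \[
        \log \frac{\Pr(Y = k \mid x)}{\Pr(Y = K \mid x)}
        \;=\; \alpha_k + \phi_k\, \beta^{\top} x ,
        \qquad k = 1, \dots, K - 1 ,
      \]
      % where the category scores phi_k are estimated from the data (often
      % constrained to lie between 0 and 1 and to be monotone across categories).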

  3. Relative Hazard Calculation Methodology

    International Nuclear Information System (INIS)

    DL Strenge; MK White; RD Stenner; WB Andrews

    1999-01-01

    The methodology presented in this document was developed to provide a means of calculating the RH ratios to use in developing useful graphic illustrations. The RH equation, as presented in this methodology, is primarily a collection of key factors relevant to understanding the hazards and risks associated with projected risk management activities. The RH equation has the potential for much broader application than generating risk profiles. For example, it can be used to compare one risk management activity with another, instead of just comparing it to a fixed baseline as was done for the risk profiles. If the appropriate source term data are available, it could be used in its non-ratio form to estimate absolute values of the associated hazards. These estimated values of hazard could then be examined to help understand which risk management activities are addressing the higher hazard conditions at a site. Graphics could be generated from these absolute hazard values to compare high-hazard conditions. If the RH equation is used in this manner, care must be taken to specifically define and qualify the estimated absolute hazard values (e.g., identify which factors were considered and which ones tended to drive the hazard estimation)

  4. Psychosocial Correlates of AUDIT-C Hazardous Drinking Risk Status: Implications for Screening and Brief Intervention in College Settings

    Science.gov (United States)

    Wahesh, Edward; Lewis, Todd F.

    2015-01-01

    The current study identified psychosocial variables associated with AUDIT-C hazardous drinking risk status for male and female college students. Logistic regression analysis revealed that AUDIT-C risk status was associated with alcohol-related negative consequences, injunctive norms, and descriptive norms for both male and female participants.…

  5. Physics-Based Simulations of Natural Hazards

    Science.gov (United States)

    Schultz, Kasey William

    Earthquakes and tsunamis are some of the most damaging natural disasters that we face. Just two recent events, the 2004 Indian Ocean earthquake and tsunami and the 2010 Haiti earthquake, claimed more than 400,000 lives. Despite their catastrophic impacts on society, our ability to predict these natural disasters is still very limited. The main challenge in studying the earthquake cycle is the non-linear and multi-scale properties of fault networks. Earthquakes are governed by physics across many orders of magnitude of spatial and temporal scales; from the scale of tectonic plates and their evolution over millions of years, down to the scale of rock fracturing over milliseconds to minutes at the sub-centimeter scale during an earthquake. Despite these challenges, there are useful patterns in earthquake occurrence. One such pattern, the frequency-magnitude relation, relates the number of large earthquakes to small earthquakes and forms the basis for assessing earthquake hazard. However, the utility of these relations is proportional to the length of our earthquake records, and typical records span at most a few hundred years. Utilizing physics-based interactions and techniques from statistical physics, earthquake simulations provide rich earthquake catalogs allowing us to measure otherwise unobservable statistics. In this dissertation I will discuss five applications of physics-based simulations of natural hazards, utilizing an earthquake simulator called Virtual Quake. The first is an overview of computing earthquake probabilities from simulations, focusing on the California fault system. The second uses simulations to help guide satellite-based earthquake monitoring methods. The third presents a new friction model for Virtual Quake and describes how we tune simulations to match reality. The fourth describes the process of turning Virtual Quake into an open source research tool. This section then focuses on a resulting collaboration using Virtual Quake for a detailed
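
    The frequency-magnitude relation mentioned above is conventionally written in its Gutenberg-Richter form (stated here in its standard textbook version, not necessarily the exact expression used in the dissertation):

      % Gutenberg-Richter frequency-magnitude relation (standard form).
      \[
        \log_{10} N(\ge M) \;=\; a - b\,M ,
      \]
      % where N(>= M) is the number of earthquakes of magnitude M or larger in a
      % catalogue (or per unit time), and a, b are region-specific constants with
      % b typically close to 1.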

  6. Hazardous alcohol consumption in non-aboriginal male inmates in New South Wales.

    Science.gov (United States)

    Field, Courtney

    2018-03-12

    Purpose: The purpose of this paper is to examine correlates and predictors of hazardous drinking behaviour, which may be considered evidence of generalised strain, in a sample of incarcerated non-Aboriginal males in New South Wales, Australia. Design/methodology/approach: Data were collected from 283 non-Aboriginal male inmates as part of a larger epidemiological survey of inmates in NSW undertaken in 2015 by the Justice Health and Forensic Mental Health Network. Data relating to a range of social factors were selected with reference to relevant literature and assessed with regard to their predictive value for scores from the Alcohol Use Disorders Identification Test (AUDIT). To facilitate regression analysis, variables were logically organised into historical factors or adult factors. Findings: Almost all participants reported some history of alcohol consumption, and hazardous drinking was common among participants. While parental alcohol problems and adult drug use were the only correlates of AUDIT scores, parental misuse of alcohol was shown to be an important predictor of AUDIT scores in regression analysis. The role of parent gender was inconclusive. Previous incarceration as an adult, employment status, and drug use as an adult also predicted AUDIT scores. Originality/value: Alcohol abuse is common among inmates and the use of alcohol is implicated in the commission of many offences. A better understanding of its genesis may inspire novel approaches to treatment, leading to improved health outcomes for inmates.

  7. Contingency proportion systematically influences contingency learning.

    Science.gov (United States)

    Forrin, Noah D; MacLeod, Colin M

    2018-01-01

    In the color-word contingency learning paradigm, each word appears more often in one color (high contingency) than in the other colors (low contingency). Shortly after beginning the task, color identification responses become faster on the high-contingency trials than on the low-contingency trials-the contingency learning effect. Across five groups, we varied the high-contingency proportion in 10% steps, from 80% to 40%. The size of the contingency learning effect was positively related to high-contingency proportion, with the effect disappearing when high contingency was reduced to 40%. At the two highest contingency proportions, the magnitude of the effect increased over trials, the pattern suggesting that there was an increasing cost for the low-contingency trials rather than an increasing benefit for the high-contingency trials. Overall, the results fit a modified version of Schmidt's (2013, Acta Psychologica, 142, 119-126) parallel episodic processing account in which prior trial instances are routinely retrieved from memory and influence current trial performance.

  8. Regression analysis with categorized regression calibrated exposure: some interesting findings

    Directory of Open Access Journals (Sweden)

    Hjartåker Anette

    2006-07-01

    Full Text Available Background: Regression calibration as a method for handling measurement error is becoming increasingly well known and used in epidemiologic research. However, the standard version of the method is not appropriate for exposure analyzed on a categorical (e.g. quintile) scale, an approach commonly used in epidemiologic studies. A tempting solution could then be to use the predicted continuous exposure obtained through the regression calibration method and treat it as an approximation to the true exposure, that is, include the categorized calibrated exposure in the main regression analysis. Methods: We use semi-analytical calculations and simulations to evaluate the performance of the proposed approach compared to the naive approach of not correcting for measurement error, in situations where analyses are performed on a quintile scale and when incorporating the original scale into the categorical variables, respectively. We also present analyses of real data, containing measures of folate intake and depression, from the Norwegian Women and Cancer study (NOWAC). Results: In cases where extra information is available through replicated measurements and not validation data, regression calibration does not maintain important qualities of the true exposure distribution; thus estimates of variance and percentiles can be severely biased. We show that the outlined approach maintains much, in some cases all, of the misclassification found in the observed exposure. For that reason, regression analysis with the corrected variable included on a categorical scale is still biased. In some cases the corrected estimates are analytically equal to those obtained by the naive approach. Regression calibration is however vastly superior to the naive method when applying the medians of each category in the analysis. Conclusion: Regression calibration in its most well-known form is not appropriate for measurement error correction when the exposure is analyzed on a
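
    The setting discussed above can be made concrete with a small simulation: classical measurement error with replicate measurements, a regression-calibrated exposure, and subsequent categorisation into quintiles. The sketch below is purely illustrative and is not the NOWAC analysis.

      # Hedged sketch: simulated exposure with classical measurement error.
      import numpy as np
      import pandas as pd

      rng = np.random.default_rng(7)
      n, k = 2000, 2                        # subjects, replicate measurements per subject
      x = rng.normal(5.0, 1.0, n)           # true exposure (arbitrary units)
      w = x[:, None] + rng.normal(0.0, 1.5, (n, k))   # replicates with classical error
      y = 1.0 + 0.5 * x + rng.normal(0.0, 1.0, n)     # outcome driven by true exposure

      w_bar = w.mean(axis=1)
      # Regression-calibration estimate of E[X | W] under the classical error model:
      # shrink the replicate mean towards the overall mean by the reliability ratio.
      var_u = w.var(axis=1, ddof=1).mean()            # error variance from replicates
      var_x = w_bar.var(ddof=1) - var_u / k           # estimated true-exposure variance
      lam = var_x / (var_x + var_u / k)               # reliability (attenuation) factor
      x_cal = w_bar.mean() + lam * (w_bar - w_bar.mean())

      # Categorise the calibrated exposure into quintiles, as in the setting studied.
      quintile = pd.qcut(x_cal, 5, labels=False)
      print(pd.DataFrame({"y": y, "q": quintile}).groupby("q")["y"].mean().round(2))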

  9. Natural hazards science strategy

    Science.gov (United States)

    Holmes, Robert R.; Jones, Lucile M.; Eidenshink, Jeffery C.; Godt, Jonathan W.; Kirby, Stephen H.; Love, Jeffrey J.; Neal, Christina A.; Plant, Nathaniel G.; Plunkett, Michael L.; Weaver, Craig S.; Wein, Anne; Perry, Suzanne C.

    2012-01-01

    The mission of the U.S. Geological Survey (USGS) in natural hazards is to develop and apply hazard science to help protect the safety, security, and economic well-being of the Nation. The costs and consequences of natural hazards can be enormous, and each year more people and infrastructure are at risk. USGS scientific research—founded on detailed observations and improved understanding of the responsible physical processes—can help to understand and reduce natural hazard risks and to make and effectively communicate reliable statements about hazard characteristics, such as frequency, magnitude, extent, onset, consequences, and where possible, the time of future events.To accomplish its broad hazard mission, the USGS maintains an expert workforce of scientists and technicians in the earth sciences, hydrology, biology, geography, social and behavioral sciences, and other fields, and engages cooperatively with numerous agencies, research institutions, and organizations in the public and private sectors, across the Nation and around the world. The scientific expertise required to accomplish the USGS mission in natural hazards includes a wide range of disciplines that this report refers to, in aggregate, as hazard science.In October 2010, the Natural Hazards Science Strategy Planning Team (H–SSPT) was charged with developing a long-term (10-year) Science Strategy for the USGS mission in natural hazards. This report fulfills that charge, with a document hereinafter referred to as the Strategy, to provide scientific observations, analyses, and research that are critical for the Nation to become more resilient to natural hazards. Science provides the information that decisionmakers need to determine whether risk management activities are worthwhile. Moreover, as the agency with the perspective of geologic time, the USGS is uniquely positioned to extend the collective experience of society to prepare for events outside current memory. The USGS has critical statutory

  10. Against proportional shortfall as a priority-setting principle.

    Science.gov (United States)

    Altmann, Samuel

    2018-05-01

    As the demand for healthcare rises, so does the need for priority setting in healthcare. In this paper, I consider a prominent priority-setting principle: proportional shortfall. My purpose is to argue that proportional shortfall, as a principle, should not be adopted. My key criticism is that proportional shortfall fails to consider past health.Proportional shortfall is justified as it supposedly balances concern for prospective health while still accounting for lifetime health, even though past health is deemed irrelevant. Accounting for this lifetime perspective means that the principle may indirectly consider past health by accounting for how far an individual is from achieving a complete, healthy life. I argue that proportional shortfall does not account for this lifetime perspective as it fails to incorporate the fair innings argument as originally claimed, undermining its purported justification.I go on to demonstrate that the case for ignoring past health is weak, and argue that past health is at least sometimes relevant for priority-setting decisions. Specifically, when an individual's past health has a direct impact on current or future health, and when one individual has enjoyed significantly more healthy life years than another.Finally, I demonstrate that by ignoring past illnesses, even those entirely unrelated to their current illness, proportional shortfall can lead to instances of double jeopardy, a highly problematic implication. These arguments give us reason to reject proportional shortfall. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  11. Radiation hazards

    International Nuclear Information System (INIS)

    Rausch, L.

    1979-01-01

    On a scientific basis and with the aid of realistic examples, the author gives a popular introduction to an understanding and judgment of the public discussion over radiation hazards: uses and hazards of X-ray examinations, biological radiation effects, civilisation risks in comparison, origins and explanation of radiation protection regulations. (orig.) [de]

  12. The divine proportion

    CERN Document Server

    Huntley, H E

    1970-01-01

    Using simple mathematical formulas, most as basic as Pythagoras's theorem and requiring only a very limited knowledge of mathematics, Professor Huntley explores the fascinating relationship between geometry and aesthetics. Poetry, patterns like Pascal's triangle, philosophy, psychology, music, and dozens of simple mathematical figures are enlisted to show that the ""divine proportion"" or ""golden ratio"" is a feature of geometry and analysis which awakes answering echoes in the human psyche. When we judge a work of art aesthetically satisfying, according to his formulation, we are making it c

  13. Pressure control valve using proportional electro-magnetic solenoid actuator

    International Nuclear Information System (INIS)

    Yun, So Nam; Ham, Young Bog; Park, Pyoung Won

    2006-01-01

    This paper presents the experimental characteristics of an electro-hydraulic proportional pressure control valve. In this study, the poppet and valve body, which are assembled with the proportional solenoid, were designed and manufactured. The constant-force characteristic of a proportional solenoid actuator in the control region should be independent of the plunger position if the actuator is to be used to control the valve position in a fluid flow control system. The stroke-force characteristic of the proportional solenoid actuator is determined by the shape (or parameters) of the control cone. In this paper, the steady-state and transient characteristics of the solenoid actuator for an electro-hydraulic proportional valve are analyzed using the finite element method, and it is confirmed that the proportional solenoid actuator has a constant attraction force in the control region, independent of the stroke position. The effects of parameters such as control cone length, thickness and taper length are also discussed.

  14. Application and validation of Cox regression models in a single-center series of double kidney transplantation.

    Science.gov (United States)

    Santori, G; Fontana, I; Bertocchi, M; Gasloli, G; Magoni Rossi, A; Tagliamacco, A; Barocci, S; Nocera, A; Valente, U

    2010-05-01

    A useful approach to reduce the number of discarded marginal kidneys and to increase the nephron mass is double kidney transplantation (DKT). In this study, we retrospectively evaluated the potential predictors for patient and graft survival in a single-center series of 59 DKT procedures performed between April 21, 1999, and September 21, 2008. The kidney recipients of mean age 63.27 ± 5.17 years included 16 women (27%) and 43 men (73%). The donors of mean age 69.54 ± 7.48 years included 32 women (54%) and 27 men (46%). The mean posttransplant dialysis time was 2.37 ± 3.61 days. The mean hospitalization was 20.12 ± 13.65 days. Average serum creatinine (SCr) at discharge was 1.5 ± 0.59 mg/dL. In view of the limited numbers of recipient deaths (n = 4) and graft losses (n = 8) that occurred in our series, the proportional hazards assumption for each Cox regression model with P DKT (P = .043), and SCr 6 months post-DKT (P = .017). All significant univariate models for graft survival passed the Schoenfeld test. A final multivariate model retained SCr at 6 months (beta = 1.746, P = .042) and donor SCr (beta = .767, P = .090). In our analysis, SCr at 6 months seemed to emerge from both univariate and multivariate Cox models as a potential predictor of graft survival among DKT recipients. Multicenter studies with larger recipient populations and more graft losses should be performed to confirm our findings. Copyright (c) 2010 Elsevier Inc. All rights reserved.

  15. The effect of parity on the proportion of important healthy fatty acids in raw milk of Holstein cows

    Directory of Open Access Journals (Sweden)

    Luděk Stádník

    2013-11-01

    Full Text Available The objective of this study was to determine and evaluate the effect of parity on the proportion of fatty acid groups in Holstein cows' milk during the first phase of lactation, with an emphasis on its potential importance for consumer health. A total of 25 Holstein cows, 9 primiparous, 9 in the 2nd, and 7 in the 3rd and subsequent parity, were observed and sampled at 7-day intervals through the first 17 weeks of lactation. The percentage proportion of saturated (hypercholesterolemic and volatile as its components) and unsaturated (monounsaturated and polyunsaturated as its components) fatty acids in the samples of milk fat (n=425) was determined. The effects of parity and negative energy balance, as well as regression on the lactation week and the fat-to-protein ratio, were evaluated using SAS 9.3. A significantly (P<0.01) lower proportion of unhealthy hypercholesterolemic fatty acids was detected in primiparous cows (-2.67 %) and in those in the 3rd and subsequent lactation (-2.94 %) compared to the 2nd lactation, as well as a simultaneously higher proportion of healthy unsaturated fatty acids (+2.07 % and +3.08 %, respectively). The determined relationships corresponded to organism stress evoked by the initiation of milk production and its maintenance in higher parities. Therefore, the generally required prolongation of dairy cows' longevity can influence the quality of raw milk, especially considering the composition of fatty acids.

  16. Hazard screening application guide

    International Nuclear Information System (INIS)

    1992-06-01

    The basic purpose of hazard screening is to group processes, facilities, and proposed modifications according to the magnitude of their hazards so as to determine the need for and extent of follow-on safety analysis. A hazard is defined as a material, energy source, or operation that has the potential to cause injury or illness in human beings. The purpose of this document is to give guidance and provide standard methods for performing hazard screening. Hazard screening is applied to new and existing facilities and processes as well as to proposed modifications to existing facilities and processes. The hazard screening process evaluates the identified hazards in terms of their effects on people, both on-site and off-site. The process uses bounding analyses with no credit given for mitigation of an accident, with the exception of certain containers meeting DOT specifications. The process is restricted to human safety issues only. Environmental effects are addressed by the environmental program. Interfaces with environmental organizations will be established in order to share information

  17. Software safety hazard analysis

    International Nuclear Information System (INIS)

    Lawrence, J.D.

    1996-02-01

    Techniques for analyzing the safety and reliability of analog-based electronic protection systems that serve to mitigate hazards in process control systems have been developed over many years, and are reasonably well understood. An example is the protection system in a nuclear power plant. The extension of these techniques to systems which include digital computers is not well developed, and there is little consensus among software engineering experts and safety experts on how to analyze such systems. One possible technique is to extend hazard analysis to include digital computer-based systems. Software is frequently overlooked during system hazard analyses, but this is unacceptable when the software is in control of a potentially hazardous operation. In such cases, hazard analysis should be extended to fully cover the software. A method for performing software hazard analysis is proposed in this paper

  18. PEP quark search proportional chambers

    Energy Technology Data Exchange (ETDEWEB)

    Parker, S I; Harris, F; Karliner, I; Yount, D [Hawaii Univ., Honolulu (USA); Ely, R; Hamilton, R; Pun, T [California Univ., Berkeley (USA). Lawrence Berkeley Lab.; Guryn, W; Miller, D; Fries, R [Northwestern Univ., Evanston, IL (USA)

    1981-04-01

    Proportional chambers are used in the PEP Free Quark Search to identify and remove possible background sources such as particles traversing the edges of counters, to permit geometric corrections to the dE/dx and TOF information from the scintillator and Cerenkov counters, and to look for possible high cross section quarks. The present beam pipe has a thickness of 0.007 interaction lengths (λi) and is followed in both arms, each with 45° ≤ θ ≤ 135° and Δφ = 90°, by 5 proportional chambers, each 0.0008 λi thick with 32 channels of pulse height readout, and by 3 thin scintillator planes, each 0.003 λi thick. Following this thin front end, each arm of the detector has 8 layers of scintillator (one with scintillating light pipes) interspersed with 4 proportional chambers and a layer of lucite Cerenkov counters. Both the calculated ion statistics and measurements using He-CH4 gas in a test chamber indicate that the chamber efficiencies should be >98% for q=1/3. The Landau spread measured in the test was equal to that observed for normal q=1 traversals. One scintillator plane and thin chamber in each arm will have an extra set of ADCs with a wide gate bracketing the normal one so timing errors and tails of earlier pulses should not produce fake quarks.

  19. Multi-omics facilitated variable selection in Cox-regression model for cancer prognosis prediction.

    Science.gov (United States)

    Liu, Cong; Wang, Xujun; Genchev, Georgi Z; Lu, Hui

    2017-07-15

    New developments in high-throughput genomic technologies have enabled the measurement of diverse types of omics biomarkers in a cost-efficient and clinically-feasible manner. Developing computational methods and tools for analysis and translation of such genomic data into clinically-relevant information is an ongoing and active area of investigation. For example, several studies have utilized an unsupervised learning framework to cluster patients by integrating omics data. Despite such recent advances, predicting cancer prognosis using integrated omics biomarkers remains a challenge. There is also a shortage of computational tools for predicting cancer prognosis by using supervised learning methods. The current standard approach is to fit a Cox regression model by concatenating the different types of omics data in a linear manner, while a penalty can be added for feature selection. A more powerful approach, however, would be to incorporate data by considering relationships among omics datatypes. Here we developed two methods: a SKI-Cox method and a wLASSO-Cox method to incorporate the association among different types of omics data. Both methods fit the Cox proportional hazards model and predict a risk score based on mRNA expression profiles. SKI-Cox borrows the information generated by these additional types of omics data to guide variable selection, while wLASSO-Cox incorporates this information as a penalty factor during model fitting. We show that SKI-Cox and wLASSO-Cox models select more true variables than a LASSO-Cox model in simulation studies. We assess the performance of SKI-Cox and wLASSO-Cox using TCGA glioblastoma multiforme and lung adenocarcinoma data. In each case, mRNA expression, methylation, and copy number variation data are integrated to predict the overall survival time of cancer patients. Our methods achieve better performance in predicting patients' survival in glioblastoma and lung adenocarcinoma. Copyright © 2017. Published by Elsevier
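
    For context, the "standard approach" that this record contrasts its methods against, a Cox model with an L1 penalty on concatenated omics features, can be sketched as follows. This is not the authors' SKI-Cox or wLASSO-Cox implementation; the simulated features, penalty strength, and column names are assumptions for illustration only.

```python
# Sketch: a generic LASSO-penalized Cox model on concatenated omics features.
# This is the baseline approach described in the abstract, not SKI-Cox or
# wLASSO-Cox; all data and feature names below are simulated placeholders.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n, p = 200, 10
X = pd.DataFrame(rng.normal(size=(n, p)),
                 columns=[f"mrna_{i}" for i in range(p)])
X["time"] = rng.exponential(scale=np.exp(-0.5 * X["mrna_0"]))
X["event"] = rng.integers(0, 2, size=n)

# l1_ratio=1.0 gives a pure L1 (LASSO-type) penalty for variable selection.
cph = CoxPHFitter(penalizer=0.1, l1_ratio=1.0)
cph.fit(X, duration_col="time", event_col="event")
print(cph.params_[cph.params_.abs() > 1e-6])  # effectively selected features
```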

  20. Large-Scale Analysis of Art Proportions

    DEFF Research Database (Denmark)

    Jensen, Karl Kristoffer

    2014-01-01

    While literature often tries to impute mathematical constants into art, this large-scale study (11 databases of paintings and photos, around 200.000 items) shows a different truth. The analysis, consisting of the width/height proportions, shows a value of rarely if ever one (square) and with majo...

  1. Job Hazard Analysis

    National Research Council Canada - National Science Library

    1998-01-01

    .... Establishing proper job procedures is one of the benefits of conducting a job hazard analysis: carefully studying and recording each step of a job, identifying existing or potential job hazards...

  2. Introduction: Hazard mapping

    Science.gov (United States)

    Baum, Rex L.; Miyagi, Toyohiko; Lee, Saro; Trofymchuk, Oleksandr M

    2014-01-01

    Twenty papers were accepted into the session on landslide hazard mapping for oral presentation. The papers presented susceptibility and hazard analysis based on approaches ranging from field-based assessments to statistically based models to assessments that combined hydromechanical and probabilistic components. Many of the studies have taken advantage of increasing availability of remotely sensed data and nearly all relied on Geographic Information Systems to organize and analyze spatial data. The studies used a range of methods for assessing performance and validating hazard and susceptibility models. A few of the studies presented in this session also included some element of landslide risk assessment. This collection of papers clearly demonstrates that a wide range of approaches can lead to useful assessments of landslide susceptibility and hazard.

  3. Restrictions and Proportionality

    DEFF Research Database (Denmark)

    Werlauff, Erik

    2009-01-01

    The article discusses three central aspects of the freedoms under European Community law, namely 1) the prohibition against restrictions as an important extension of the prohibition against discrimination, 2) a prohibition against exit restrictions which is just as important as the prohibition against host country restrictions, but which is often not recognised to the same extent by national law, and 3) the importance of also identifying and recognising an exit restriction, so that it is possible to achieve the required test of appropriateness and proportionality in relation to the rule...

  4. Time-adaptive quantile regression

    DEFF Research Database (Denmark)

    Møller, Jan Kloppenborg; Nielsen, Henrik Aalborg; Madsen, Henrik

    2008-01-01

    An algorithm for time-adaptive quantile regression is presented. The algorithm is based on the simplex algorithm, and the linear optimization formulation of the quantile regression problem is given. The observations have been split to allow a direct use of the simplex algorithm. The simplex method and an updating procedure are combined into a new algorithm for time-adaptive quantile regression, which generates new solutions on the basis of the old solution, leading to savings in computation time. The suggested algorithm is tested against a static quantile regression model on a data set with wind power production, where the models combine splines and quantile regression. The comparison indicates superior performance for the time-adaptive quantile regression in all the performance parameters considered.
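
    As a point of reference for the static baseline mentioned in the comparison, a plain (non-adaptive) quantile regression can be fitted with statsmodels; the wind-power-like data below are simulated placeholders and the quantile levels are arbitrary choices.

```python
# Sketch: static quantile regression as a baseline for the time-adaptive
# algorithm described above; data are simulated stand-ins for wind power.
import numpy as np
import statsmodels.api as sm
from statsmodels.regression.quantile_regression import QuantReg

rng = np.random.default_rng(7)
n = 1000
forecast = rng.uniform(0, 1, size=n)                     # forecasted power
actual = forecast + rng.normal(scale=0.05 + 0.1 * forecast, size=n)

X = sm.add_constant(forecast)
for q in (0.05, 0.5, 0.95):
    fit = QuantReg(actual, X).fit(q=q)
    print(f"quantile {q}: intercept={fit.params[0]:.3f}, slope={fit.params[1]:.3f}")
```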

  5. Disease proportions attributable to environment

    Directory of Open Access Journals (Sweden)

    Vineis Paolo

    2007-11-01

    Full Text Available Abstract Population disease proportions attributable to various causal agents are popular as they present a simplified view of the contribution of each agent to the disease load. However they are only summary figures that may be easily misinterpreted or over-interpreted even when the causal link between an exposure and an effect is well established. This commentary discusses several issues surrounding the estimation of attributable proportions, particularly with reference to environmental causes of cancers, and critically examines two recently published papers. These issues encompass potential biases as well as the very definition of environment and of environmental agent. The latter aspect is not just a semantic question but carries implications for the focus of preventive actions, whether centred on the material and social environment or on single individuals.

  6. Proton-recoil proportional counter tests at TREAT

    International Nuclear Information System (INIS)

    Fink, C.L.; Eichholz, J.J.; Burrows, D.R.; DeVolpi, A.

    1979-01-01

    A methane filled proton-recoil proportional counter will be used as a fission neutron detector in the fast-neutron hodoscope. To provide meaningful fuel-motion information the proportional counter should have: a linear response over a wide range of reactor powers; a good signal-to-background ratio (the number of high energy neutrons detected must be maximized relative to low energy neutrons, and gamma ray sensitivity must be kept small); and a detector efficiency for fission neutrons above 1 MeV of approximately 1%. In addition, it is desirable that the detector and the associated amplifier/discriminator be capable of operating at counting rates in excess of 500 kHz. This paper reports on tests that were conducted on several proportional counters at the TREAT reactor

  7. Barrow hazards survey

    International Nuclear Information System (INIS)

    1980-06-01

    Following a series of public meetings at which PERG presented the results of a literature review and site specific accident study of the hazards of the maritime transport of spent nuclear reactor fuel to Barrow (en route to the Windscale reprocessing works), PERG was requested by the Planning Committee of Barrow Town Council to prepare an assessment of the interaction of the hazards arising from the concentration of nuclear activities in the area with those of a proposed gas-terminal. This report presents a preliminary review of the Environmental Impact Assessments prepared by the Borough Surveyor and a critical appraisal of the hazard analyses undertaken by the Health and Safety Executive, and the consultants to Cumbria County Council on this matter, the Safety and Reliability Directorate of the United Kingdom Atomic Energy Authority. After a general and historical introduction, the document continues under the following headings: a description of the hazards (BNFL spent fuel shipments; the gas terminal; gas condensate storage; the Vickers shipyard (involving nuclear powered submarines)); the interaction of hazards; planning implications and democratic decisions; recommendations. (U.K.)

  8. Proportional counter end effects eliminator

    International Nuclear Information System (INIS)

    Meekins, J.F.

    1976-01-01

    An improved gas-filled proportional counter which includes a resistor network connected between the anode and cathode at the ends of the counter in order to eliminate ''end effects'' is described. 3 Claims, 2 Drawing Figures

  9. Household hazardous waste disposal to landfill: Using LandSim to model leachate migration

    International Nuclear Information System (INIS)

    Slack, Rebecca J.; Gronow, Jan R.; Hall, David H.; Voulvoulis, Nikolaos

    2007-01-01

    Municipal solid waste (MSW) landfill leachate contains a number of aquatic pollutants. A specific MSW stream often referred to as household hazardous waste (HHW) can be considered to contribute a large proportion of these pollutants. This paper describes the use of the LandSim (Landfill Performance Simulation) modelling program to assess the environmental consequences of leachate release from a generic MSW landfill in receipt of co-disposed HHW. Heavy metals and organic pollutants were found to migrate into the zones beneath a model landfill site over a 20,000-year period. Arsenic and chromium were found to exceed European Union and US-EPA drinking water standards at the unsaturated zone/aquifer interface, with levels of mercury and cadmium exceeding minimum reporting values (MRVs). The findings demonstrate the pollution potential arising from HHW disposal with MSW. - Aquatic pollutants linked to the disposal of household hazardous waste in municipal landfills have the potential to exist in soil and groundwater for many years

  10. 76 FR 4823 - Hazardous Waste Management System; Identifying and Listing Hazardous Waste Exclusion

    Science.gov (United States)

    2011-01-27

    ... Waste Management System; Identifying and Listing Hazardous Waste Exclusion AGENCY: Environmental... hazardous wastes. The Agency has decided to grant the petition based on an evaluation of waste-specific... excludes the petitioned waste from the requirements of hazardous waste regulations under the Resource...

  11. Integrated risk reduction framework to improve railway hazardous materials transportation safety

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Xiang, E-mail: liu94@illinois.edu; Saat, M. Rapik, E-mail: mohdsaat@illinois.edu; Barkan, Christopher P.L., E-mail: cbarkan@illinois.edu

    2013-09-15

    Highlights: • An integrated framework is developed to optimize risk reduction. • A negative binomial regression model is developed to analyze accident-cause-specific railcar derailment probability. • A Pareto-optimality technique is applied to determine the lowest risk given any level of resource. • A multi-attribute decision model is developed to determine the optimal amount of investment for risk reduction. • The models could aid the government and rail industry in developing cost-efficient risk reduction policy and practice. -- Abstract: Rail transportation plays a critical role in safely and efficiently transporting hazardous materials. A number of strategies have been implemented or are being developed to reduce the risk of hazardous materials release from train accidents. Each of these risk reduction strategies has its safety benefit and corresponding implementation cost. However, the cost effectiveness of the integration of different risk reduction strategies is not well understood. Meanwhile, there has been growing interest in the U.S. rail industry and government to best allocate resources for improving hazardous materials transportation safety. This paper presents an optimization model that considers the combination of two types of risk reduction strategies, broken rail prevention and tank car safety design enhancement. A Pareto-optimality technique is used to maximize risk reduction at a given level of investment. The framework presented in this paper can be adapted to address a broader set of risk reduction strategies and is intended to assist decision makers for local, regional and system-wide risk management of rail hazardous materials transportation.

  12. Structural comparison of hazardous and non-hazardous coals based on gas sorption experiments

    Energy Technology Data Exchange (ETDEWEB)

    Lakatos, J.; Toth, J. [Research Lab. for Mining Chemistry, Hungarian Academy of Sciences, Miskolc-Egyetemvaros (Hungary); Radnai-Gyoengyoes, Z. [Geopard Ltd., Pecs (Hungary); Bokanyi, L. [Miskolc Univ., Miskolc-Egyetemvaros (Hungary). Dept. of Process Engineering

    1997-12-31

    Comparison of carbon-dioxide and propane sorption at ambient temperature was used for characterising the difference in structure between hazardous and non-hazardous coals. Hazardous coals were found to be more microporous or to contain more closed pores than non-hazardous ones; however, this difference could not be enlarged and attributed to one petrographic component by producing the density fractions. Gas sorption isobars (nitrogen, methane, ethane) are proposed to make a distinction between the fine pore structures of coals. (orig.)

  13. Channel selection for simultaneous and proportional myoelectric prosthesis control of multiple degrees-of-freedom

    Science.gov (United States)

    Hwang, Han-Jeong; Hahne, Janne Mathias; Müller, Klaus-Robert

    2014-10-01

    Objective. Recent studies have shown the possibility of simultaneous and proportional control of electrically powered upper-limb prostheses, but there has been little investigation on optimal channel selection. The objective of this study is to find a robust channel selection method and the channel subsets most suitable for simultaneous and proportional myoelectric prosthesis control of multiple degrees-of-freedom (DoFs). Approach. Ten able-bodied subjects and one person with congenital upper-limb deficiency took part in this study, and performed wrist movements with various combinations of two DoFs (flexion/extension and radial/ulnar deviation). During the experiment, high density electromyographic (EMG) signals and the actual wrist angles were recorded with an 8 × 24 electrode array and a motion tracking system, respectively. The wrist angles were estimated from EMG features with ridge regression using the subsets of channels chosen by three different channel selection methods: (1) least absolute shrinkage and selection operator (LASSO), (2) sequential feature selection (SFS), and (3) uniform selection (UNI). Main results. SFS generally showed higher estimation accuracy than LASSO and UNI, but LASSO always outperformed SFS in terms of robustness, such as noise addition, channel shift and training data reduction. It was also confirmed that about 95% of the original performance obtained using all channels can be retained with only 12 bipolar channels individually selected by LASSO and SFS. Significance. From the analysis results, it can be concluded that LASSO is a promising channel selection method for accurate simultaneous and proportional prosthesis control. We expect that our results will provide a useful guideline to select optimal channel subsets when developing clinical myoelectric prosthesis control systems based on continuous movements with multiple DoFs.
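
    The two-step idea described here, LASSO to pick a small channel subset and ridge regression to estimate the continuous mapping, can be sketched generically with scikit-learn. The array sizes, regularization strengths, and simulated "EMG" features below are assumptions for illustration, not the study's recordings or tuning.

```python
# Sketch: LASSO-based channel (feature) selection followed by ridge regression,
# in the spirit of the pipeline described above; data are simulated stand-ins
# for EMG features and wrist angles, not the study's recordings.
import numpy as np
from sklearn.linear_model import Lasso, Ridge

rng = np.random.default_rng(1)
n_samples, n_channels = 500, 192          # e.g. an 8 x 24 electrode array
X = rng.normal(size=(n_samples, n_channels))
true_w = np.zeros(n_channels)
true_w[:12] = rng.normal(size=12)          # only a few channels are informative
y = X @ true_w + 0.1 * rng.normal(size=n_samples)   # one wrist-angle DoF

# Step 1: LASSO drives most channel weights to exactly zero.
selector = Lasso(alpha=0.05).fit(X, y)
selected = np.flatnonzero(selector.coef_)
print(f"{selected.size} channels selected")

# Step 2: refit with ridge regression on the selected subset only.
ridge = Ridge(alpha=1.0).fit(X[:, selected], y)
print("R^2 on training data:", ridge.score(X[:, selected], y))
```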

  14. Informing Workers of Chemical Hazards: The OSHA Hazard Communication Standard.

    Science.gov (United States)

    American Chemical Society, Washington, DC.

    Practical information on how to implement a chemical-related safety program is outlined in this publication. Highlights of the federal Occupational Safety and Health Administration's (OSHA) Hazard Communication Standard are presented and explained. These include: (1) hazard communication requirements (consisting of warning labels, material safety…

  15. Transport of hazardous goods

    International Nuclear Information System (INIS)

    1989-01-01

    The course 'Transport of hazardous goods' was held in Berlin in November 1988 in cooperation with the Bundesanstalt fuer Materialforschung und -pruefung. Of all the lectures, two are recorded separately: 'Safety of tank trucks - requirements on the tank, development possibilities of active and passive safety' and 'Requirements on the transport of radioactive materials - possible derivations for other hazardous goods'. The other lectures deal with hazardous goods law, requirements on packaging, risk assessment, railroad transport, the hazardous goods road network, insurance matters, EC regulations, and waste tourism. (HSCH) [de

  16. The proportionate value of proportionality in palliative sedation.

    Science.gov (United States)

    Berger, Jeffrey T

    2014-01-01

    Proportionality, as it pertains to palliative sedation, is the notion that sedation should be induced at the lowest degree effective for symptom control, so that the patient's consciousness may be preserved. The pursuit of proportionality in palliative sedation is a widely accepted imperative advocated in position statements and guidelines on this treatment. The priority assigned to the pursuit of proportionality, and the extent to which it is relevant for patients who qualify for palliative sedation, have been overstated. Copyright 2014 The Journal of Clinical Ethics. All rights reserved.

  17. Seismic hazard maps for Haiti

    Science.gov (United States)

    Frankel, Arthur; Harmsen, Stephen; Mueller, Charles; Calais, Eric; Haase, Jennifer

    2011-01-01

    We have produced probabilistic seismic hazard maps of Haiti for peak ground acceleration and response spectral accelerations that include the hazard from the major crustal faults, subduction zones, and background earthquakes. The hazard from the Enriquillo-Plantain Garden, Septentrional, and Matheux-Neiba fault zones was estimated using fault slip rates determined from GPS measurements. The hazard from the subduction zones along the northern and southeastern coasts of Hispaniola was calculated from slip rates derived from GPS data and the overall plate motion. Hazard maps were made for a firm-rock site condition and for a grid of shallow shear-wave velocities estimated from topographic slope. The maps show substantial hazard throughout Haiti, with the highest hazard along the Enriquillo-Plantain Garden and Septentrional fault zones. The Matheux-Neiba Fault exhibits high hazard in the maps for 2% probability of exceedance in 50 years, although its slip rate is poorly constrained.

  18. Improving sensitivity of linear regression-based cell type-specific differential expression deconvolution with per-gene vs. global significance threshold.

    Science.gov (United States)

    Glass, Edmund R; Dozmorov, Mikhail G

    2016-10-06

    The goal of many human disease-oriented studies is to detect molecular mechanisms different between healthy controls and patients. Yet, commonly used gene expression measurements from blood samples suffer from variability of cell composition. This variability hinders the detection of differentially expressed genes and is often ignored. Combined with cell counts, heterogeneous gene expression may provide deeper insights into the gene expression differences on the cell type-specific level. Published computational methods use linear regression to estimate cell type-specific differential expression, and a global cutoff to judge significance, such as False Discovery Rate (FDR). Yet, they do not consider many artifacts hidden in high-dimensional gene expression data that may negatively affect linear regression. In this paper we quantify the parameter space affecting the performance of linear regression (sensitivity of cell type-specific differential expression detection) on a per-gene basis. We evaluated the effect of sample sizes, cell type-specific proportion variability, and mean squared error on sensitivity of cell type-specific differential expression detection using linear regression. Each parameter affected variability of cell type-specific expression estimates and, subsequently, the sensitivity of differential expression detection. We provide the R package, LRCDE, which performs linear regression-based cell type-specific differential expression (deconvolution) detection on a gene-by-gene basis. Accounting for variability around cell type-specific gene expression estimates, it computes per-gene t-statistics of differential detection, p-values, t-statistic-based sensitivity, group-specific mean squared error, and several gene-specific diagnostic metrics. The sensitivity of linear regression-based cell type-specific differential expression detection differed for each gene as a function of mean squared error, per group sample sizes, and variability of the proportions
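
    A highly simplified sketch of the underlying idea, regressing bulk expression on measured cell-type proportions within each group and comparing the per-cell-type coefficients, is shown below. It is not the LRCDE package itself, and the single simulated gene, group sizes, and noise level are assumptions.

```python
# Sketch: cell type-specific expression estimated by regressing bulk expression
# on measured cell-type proportions within each group, then comparing the
# per-cell-type coefficients between cases and controls. This is a highly
# simplified illustration of the idea, not the LRCDE package itself.
import numpy as np

rng = np.random.default_rng(2)
n_per_group = 40

def simulate_group(celltype_means):
    props = rng.dirichlet(alpha=[5, 3, 2], size=n_per_group)   # cell proportions
    expr = props @ celltype_means + rng.normal(scale=0.3, size=n_per_group)
    return props, expr

# One gene: cell type 0 is differentially expressed between groups.
props_ctrl, expr_ctrl = simulate_group(np.array([5.0, 2.0, 1.0]))
props_case, expr_case = simulate_group(np.array([7.0, 2.0, 1.0]))

# Least-squares estimates of cell type-specific expression in each group.
beta_ctrl, *_ = np.linalg.lstsq(props_ctrl, expr_ctrl, rcond=None)
beta_case, *_ = np.linalg.lstsq(props_case, expr_case, rcond=None)

print("control estimates:", beta_ctrl.round(2))
print("case estimates:   ", beta_case.round(2))
print("cell type-specific differences:", (beta_case - beta_ctrl).round(2))
```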

  19. INTERNAL HAZARDS ANALYSIS FOR LICENSE APPLICATION

    Energy Technology Data Exchange (ETDEWEB)

    R.J. Garrett

    2005-02-17

    The purpose of this internal hazards analysis is to identify and document the internal hazards and potential initiating events associated with preclosure operations of the repository at Yucca Mountain. Internal hazards are those hazards presented by the operation of the facility and by its associated processes that can potentially lead to a radioactive release or cause a radiological hazard. In contrast to external hazards, internal hazards do not involve natural phenomena and external man-made hazards. This internal hazards analysis was performed in support of the preclosure safety analysis and the License Application for the Yucca Mountain Project. The methodology for this analysis provides a systematic means to identify internal hazards and potential initiating events that may result in a radiological hazard or radiological release during the repository preclosure period. These hazards are documented in tables of potential internal hazards and potential initiating events (Section 6.6) for input to the repository event sequence categorization process. The results of this analysis will undergo further screening and analysis based on the criteria that apply to the performance of event sequence analyses for the repository preclosure period. The evolving design of the repository will be re-evaluated periodically to ensure that internal hazards that have not been previously evaluated are identified.

  20. INTERNAL HAZARDS ANALYSIS FOR LICENSE APPLICATION

    International Nuclear Information System (INIS)

    Garrett, R.J.

    2005-01-01

    The purpose of this internal hazards analysis is to identify and document the internal hazards and potential initiating events associated with preclosure operations of the repository at Yucca Mountain. Internal hazards are those hazards presented by the operation of the facility and by its associated processes that can potentially lead to a radioactive release or cause a radiological hazard. In contrast to external hazards, internal hazards do not involve natural phenomena and external man-made hazards. This internal hazards analysis was performed in support of the preclosure safety analysis and the License Application for the Yucca Mountain Project. The methodology for this analysis provides a systematic means to identify internal hazards and potential initiating events that may result in a radiological hazard or radiological release during the repository preclosure period. These hazards are documented in tables of potential internal hazards and potential initiating events (Section 6.6) for input to the repository event sequence categorization process. The results of this analysis will undergo further screening and analysis based on the criteria that apply to the performance of event sequence analyses for the repository preclosure period. The evolving design of the repository will be re-evaluated periodically to ensure that internal hazards that have not been previously evaluated are identified

  1. Regression analysis by example

    CERN Document Server

    Chatterjee, Samprit

    2012-01-01

    Praise for the Fourth Edition: "This book is . . . an excellent source of examples for regression analysis. It has been and still is readily readable and understandable." -Journal of the American Statistical Association Regression analysis is a conceptually simple method for investigating relationships among variables. Carrying out a successful application of regression analysis, however, requires a balance of theoretical results, empirical rules, and subjective judgment. Regression Analysis by Example, Fifth Edition has been expanded

  2. 76 FR 55846 - Hazardous Waste Management System: Identification and Listing of Hazardous Waste: Carbon Dioxide...

    Science.gov (United States)

    2011-09-09

    ... carbon dioxide (CO 2 ) streams that are hazardous from the definition of hazardous waste, provided these... management under the Resource Conservation and Recovery Act (RCRA) to conditionally exclude carbon dioxide... 2050-AG60 Hazardous Waste Management System: Identification and Listing of Hazardous Waste: Carbon...

  3. Applied logistic regression

    CERN Document Server

    Hosmer, David W; Sturdivant, Rodney X

    2013-01-01

     A new edition of the definitive guide to logistic regression modeling for health science and other applications This thoroughly expanded Third Edition provides an easily accessible introduction to the logistic regression (LR) model and highlights the power of this model by examining the relationship between a dichotomous outcome and a set of covariables. Applied Logistic Regression, Third Edition emphasizes applications in the health sciences and handpicks topics that best suit the use of modern statistical software. The book provides readers with state-of-

  4. Time-trend of melanoma screening practice by primary care physicians: a meta-regression analysis.

    Science.gov (United States)

    Valachis, Antonis; Mauri, Davide; Karampoiki, Vassiliki; Polyzos, Nikolaos P; Cortinovis, Ivan; Koukourakis, Georgios; Zacharias, Georgios; Xilomenos, Apostolos; Tsappi, Maria; Casazza, Giovanni

    2009-01-01

    To assess whether the proportion of primary care physicians implementing full body skin examination (FBSE) to screen for melanoma changed over time. Meta-regression analyses of available data. MEDLINE, ISI, Cochrane Central Register of Controlled Trials. Fifteen studies surveying 10,336 physicians were included in the analyses. Overall, 15%-82% of them reported performing FBSE to screen for melanoma. The proportion of physicians using FBSE screening tended to decrease by 1.72% per year (P = 0.086). Corresponding annual changes in European, North American, and Australian settings were -0.68% (P = 0.494), -2.02% (P = 0.044), and +2.59% (P = 0.010), respectively. Changes were not influenced by national guidelines. Considering the increasing incidence of melanoma and other skin malignancies, as well as their relative potential consequences, the FBSE implementation time-trend we retrieved should be considered a worrisome phenomenon.
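
    A meta-regression of this kind can be approximated by weighted least squares of the reported proportions on survey year. The sketch below uses statsmodels with invented study-level data and simple sample-size weights; the original analysis may have used a different weighting scheme and model.

```python
# Sketch: a simple meta-regression of reported FBSE proportions on survey year,
# weighting each study by its sample size (inverse-variance weighting would be
# analogous). The study-level data below are made up for illustration only.
import numpy as np
import statsmodels.api as sm

year         = np.array([1995, 1998, 2001, 2003, 2005, 2007])
proportion   = np.array([0.62, 0.55, 0.50, 0.48, 0.41, 0.39])   # share doing FBSE
n_physicians = np.array([300, 850, 400, 1200, 600, 900])

X = sm.add_constant(year - year.min())     # slope = change per year
fit = sm.WLS(proportion, X, weights=n_physicians).fit()
print(fit.params)     # intercept and annual change in the FBSE proportion
print(fit.pvalues)
```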

  5. How do stocking density and straw provision affect fouling in conventionally housed slaughter pigs?

    DEFF Research Database (Denmark)

    Larsen, Mona Lilian Vestbjerg; Bertelsen, Maja; Pedersen, Lene Juul

    2017-01-01

    with excreta and/or urine. Only the first event of fouling for each pen was included, and thus results represent whether a pen had a fouling event or not and when it happened. Data was analysed by using a Cox regression assuming proportional hazard and with right censoring of pens that never developed fouling...

  6. Normalization Ridge Regression in Practice I: Comparisons Between Ordinary Least Squares, Ridge Regression and Normalization Ridge Regression.

    Science.gov (United States)

    Bulcock, J. W.

    The problem of model estimation when the data are collinear was examined. Though the ridge regression (RR) outperforms ordinary least squares (OLS) regression in the presence of acute multicollinearity, it is not a problem free technique for reducing the variance of the estimates. It is a stochastic procedure when it should be nonstochastic and it…

  7. India's Conditional Cash Transfer Programme (the JSY) to Promote Institutional Birth: Is There an Association between Institutional Birth Proportion and Maternal Mortality?

    Directory of Open Access Journals (Sweden)

    Bharat Randive

    Full Text Available India accounts for 19% of global maternal deaths, three-quarters of which come from nine states. In 2005, India launched a conditional cash transfer (CCT) programme, Janani Suraksha Yojana (JSY), to reduce the maternal mortality ratio (MMR) through promotion of institutional births. JSY is the largest CCT in the world. In the nine states with relatively lower socioeconomic levels, JSY provides a cash incentive to all women on birthing in a health institution. The cash incentive is intended to reduce financial barriers to accessing institutional care for delivery. Increased institutional births are expected to reduce MMR. Thus, JSY is expected to (a) increase institutional births and (b) reduce MMR in states with high proportions of institutional births. We examine the association between (a) service uptake, i.e., institutional birth proportions and (b) health outcome, i.e., MMR. Data from the Sample Registration Survey of India were analysed to describe trends in the proportion of institutional births before (2005) and during (2006-2010) the implementation of the JSY. Data from the Annual Health Survey (2010-2011) for all 284 districts in the above-mentioned nine states were analysed to assess the relationship between MMR and institutional births. The proportion of institutional births increased from a pre-programme average of 20% to 49% in 5 years (p<0.05). In bivariate analysis, the proportion of institutional births had a small negative correlation with district MMR (r = -0.11). The multivariate regression model did not establish a significant association between institutional birth proportions and MMR [CI: -0.10, 0.68]. Our analysis confirmed that JSY succeeded in raising institutional births significantly. However, we were unable to detect a significant association between institutional birth proportion and MMR. This indicates that the high institutional birth proportions that JSY has achieved are of themselves inadequate to reduce MMR. Other factors including improved quality of care at

  8. HAZARD ANALYSIS SOFTWARE

    International Nuclear Information System (INIS)

    Sommer, S; Tinh Tran, T.

    2008-01-01

    Washington Safety Management Solutions, LLC developed web-based software to improve the efficiency and consistency of hazard identification and analysis, control selection and classification, and to standardize analysis reporting at Savannah River Site. In the new nuclear age, information technology provides methods to improve the efficiency of the documented safety analysis development process which includes hazard analysis activities. This software provides a web interface that interacts with a relational database to support analysis, record data, and to ensure reporting consistency. A team of subject matter experts participated in a series of meetings to review the associated processes and procedures for requirements and standard practices. Through these meetings, a set of software requirements were developed and compiled into a requirements traceability matrix from which software could be developed. The software was tested to ensure compliance with the requirements. Training was provided to the hazard analysis leads. Hazard analysis teams using the software have verified its operability. The software has been classified as NQA-1, Level D, as it supports the analysis team but does not perform the analysis. The software can be transported to other sites with alternate risk schemes. The software is being used to support the development of 14 hazard analyses. User responses have been positive with a number of suggestions for improvement which are being incorporated as time permits. The software has enforced a uniform implementation of the site procedures. The software has significantly improved the efficiency and standardization of the hazard analysis process

  9. Risk-based consequences of extreme natural hazard processes in mountain regions - Multi-hazard analysis in Tyrol (Austria)

    Science.gov (United States)

    Huttenlau, Matthias; Stötter, Johann

    2010-05-01

    Reinsurance companies report a marked increase in natural hazard related losses, both insured and economic, over the last decades on a global scale. This ongoing trend can be described as a product of the dynamics in both the natural sphere and the anthroposphere. To analyze the potential impact of natural hazard processes on a certain insurance portfolio or on society in general, reinsurance companies and risk management consultants have developed loss models. However, those models generally do not fit the scale-dependent demands of regional analyses, as would be appropriate (i) for analyses on the scale of a specific province or (ii) for portfolio analyses of regional insurance companies. Moreover, the scientific basis of most of the models is not transparently documented, and therefore scientific evaluation of the methodological concepts is not possible (black box). This is contrary to the scientific principles of transparency and traceability. Analyses must consider these circumstances adequately, especially in mountain regions like the European Alps with their inherent (i) specific characteristics at small scales, (ii) relatively high process dynamics in general, (iii) occurrence of gravitative mass movements, which are related to high relief energy and thus exist only in mountain regions, (iv) small proportion of permanently settled area relative to the overall area, (v) high value concentration in the valley floors, and (vi) exposure of important infrastructure and lifelines, among others. Risk-based analyses therefore methodically estimate the potential consequences of hazard processes on the built environment, standardized with the risk components (i) hazard, (ii) elements at risk, and (iii) vulnerability. However, most research and progress have been made in the field of hazard analyses, whereas the other two components are not developed to the same degree. Since these three general components are influencing factors without any

  10. Vector regression introduced

    Directory of Open Access Journals (Sweden)

    Mok Tik

    2014-06-01

    Full Text Available This study formulates regression of vector data that will enable statistical analysis of various geodetic phenomena such as polar motion, ocean currents, typhoon/hurricane tracking, crustal deformations, and precursory earthquake signals. The observed vector variable of an event (dependent vector variable) is expressed as a function of a number of hypothesized phenomena realized also as vector variables (independent vector variables) and/or scalar variables that are likely to impact the dependent vector variable. The proposed representation has the unique property of solving the coefficients of independent vector variables (explanatory variables) also as vectors, hence it supersedes multivariate multiple regression models, in which the unknown coefficients are scalar quantities. For the solution, complex numbers are used to represent vector information, and the method of least squares is deployed to estimate the vector model parameters after transforming the complex vector regression model into a real vector regression model through isomorphism. Various operational statistics for testing the predictive significance of the estimated vector parameter coefficients are also derived. A simple numerical example demonstrates the use of the proposed vector regression analysis in modeling typhoon paths.
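
    The core computation, least squares with complex-valued observations and coefficients, is directly supported by numpy. The sketch below illustrates the idea on a simulated two-dimensional "drift versus wind" example; the variable names and noise level are assumptions, not the paper's worked example.

```python
# Sketch: vector regression via complex-valued least squares, following the
# idea of encoding 2-D vectors as complex numbers; the wind-drift example and
# all numbers are hypothetical.
import numpy as np

rng = np.random.default_rng(3)
n = 100

# Independent vector variable (e.g. wind) encoded as complex numbers.
wind = rng.normal(size=n) + 1j * rng.normal(size=n)
true_coef = 0.8 * np.exp(1j * np.pi / 6)          # rotation plus scaling
drift = true_coef * wind + 0.05 * (rng.normal(size=n) + 1j * rng.normal(size=n))

# Design matrix with a complex intercept and the wind regressor.
X = np.column_stack([np.ones(n, dtype=complex), wind])
coef, *_ = np.linalg.lstsq(X, drift, rcond=None)   # complex least squares

print("estimated coefficient:", coef[1])
print("magnitude (scaling):", abs(coef[1]), "angle (rotation, rad):", np.angle(coef[1]))
```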

  11. Applied linear regression

    CERN Document Server

    Weisberg, Sanford

    2013-01-01

    Praise for the Third Edition "...this is an excellent book which could easily be used as a course text..." -International Statistical Institute The Fourth Edition of Applied Linear Regression provides a thorough update of the basic theory and methodology of linear regression modeling. Demonstrating the practical applications of linear regression analysis techniques, the Fourth Edition uses interesting, real-world exercises and examples. Stressing central concepts such as model building, understanding parameters, assessing fit and reliability, and drawing conclusions, the new edition illus

  12. The critical proportion of immune individuals needed to control hepatitis B

    Science.gov (United States)

    Ospina, Juan; Hincapié-Palacio, Doracelly

    2016-05-01

    We estimate the critical proportion of immunity (Pc) to control hepatitis B in Medellin, Colombia, based on a random population survey of 2077 individuals of 6-64 years of age. The force of infection (Fi) was estimated according to empirical data of susceptibility by age S(a), assuming a quadratic expression. Parameters were estimated by adjusting data to a nonlinear regression. Fi was defined by -(dS(a)/da)/S(a), and according to the form of the empirical curve S(a) we assume a quadratic expression given by S(a) = Ea² + Ba + C. Then we have the explicit expression for the Fi accumulated by age given by F(a) = -a(Ea + B)/C. The expression for the average age at infection A is obtained as A = L + EL³/(3C) + BL²/(2C), and the basic reproductive number R0 is obtained as R0 = 1 + 6C/(6C + 2EL² + 3BL). From the last result we obtain Pc, given by Pc = 6C/(12C + 2EL² + 3BL). Numerical simulations were performed with the age-susceptibility proportion and initial values (a=0.02, b=20, c=100), obtaining an adjusted coefficient of multiple determination of 64.83%. According to the best estimate, the algebraic expressions for S(a) and the Fi were derived. Using the result of Fi, we obtain A = 30, L = 85; R0 95% CI: 1.42-1.64 and Pc: 0-0.29. These results indicate that, in the worst case, to maintain control of the disease at least 30% of susceptible individuals should be immune. Similar results were obtained by sex and residential area.
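
    The closed-form expressions quoted above are easy to evaluate once the susceptibility parameters are known. The sketch below implements R0 and Pc exactly as stated; the parameter values passed in are illustrative assumptions, not the survey's fitted estimates.

```python
# Sketch: evaluating the expressions quoted above, R0 = 1 + 6C/(6C + 2EL^2 + 3BL)
# and Pc = 6C/(12C + 2EL^2 + 3BL), for hypothetical values of the susceptibility
# parameters E, B, C and the upper age L.
def r0_and_pc(E, B, C, L):
    denom = 6 * C + 2 * E * L**2 + 3 * B * L
    r0 = 1 + 6 * C / denom
    pc = 6 * C / (denom + 6 * C)    # identical to 1 - 1/R0
    return r0, pc

# Illustrative parameter values only (not the survey's fitted estimates).
E, B, C, L = -0.01, -0.2, 90.0, 85.0
r0, pc = r0_and_pc(E, B, C, L)
print(f"R0 = {r0:.2f}, critical immune proportion Pc = {pc:.2f}")
```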

  13. Putative golden proportions as predictors of facial esthetics in adolescents.

    Science.gov (United States)

    Kiekens, Rosemie M A; Kuijpers-Jagtman, Anne Marie; van 't Hof, Martin A; van 't Hof, Bep E; Maltha, Jaap C

    2008-10-01

    In orthodontics, facial esthetics is assumed to be related to golden proportions apparent in the ideal human face. The aim of the study was to analyze the putative relationship between facial esthetics and golden proportions in white adolescents. Seventy-six adult laypeople evaluated sets of photographs of 64 adolescents on a visual analog scale (VAS) from 0 to 100. The facial esthetic value of each subject was calculated as a mean VAS score. Three observers recorded the position of 13 facial landmarks included in 19 putative golden proportions, based on the golden proportions as defined by Ricketts. The proportions and each proportion's deviation from the golden target (1.618) were calculated. This deviation was then related to the VAS scores. Only 4 of the 19 proportions had a significant negative correlation with the VAS scores, indicating that beautiful faces showed less deviation from the golden standard than less beautiful faces. Together, these variables explained only 16% of the variance. Few golden proportions have a significant relationship with facial esthetics in adolescents. The explained variance of these variables is too small to be of clinical importance.
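
    The reported association can be illustrated by correlating each proportion's absolute deviation from 1.618 with the mean VAS score. The sketch below does this for one simulated proportion across invented faces; the sample size matches the abstract, but the measurements and scores are invented purely for illustration.

```python
# Sketch: relating a putative golden proportion's deviation from 1.618 to a
# mean VAS attractiveness score, as in the analysis described above; the face
# measurements and scores below are simulated, not the study's data.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(4)
n_faces = 64
GOLDEN = 1.618

vas = rng.uniform(20, 80, size=n_faces)                    # mean esthetic score
# One putative proportion per face, noisier (further from golden) for low VAS.
proportion = GOLDEN + rng.normal(scale=0.15, size=n_faces) * (90 - vas) / 90
deviation = np.abs(proportion - GOLDEN)

r, p = pearsonr(deviation, vas)
print(f"r = {r:.2f}, p = {p:.3f}, explained variance = {r**2:.1%}")
```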

  14. Subjective Proportions: 18th-Century Interpretations of Paestum’s ‘Disproportion’

    Directory of Open Access Journals (Sweden)

    Sigrid de Jong

    2016-02-01

    Full Text Available When 18th-century travellers saw the Doric temples of Paestum in Southern Italy with their own eyes, they observed for the first time true examples of the proportions of archaic Greek architecture. Contrary to the Roman proportional systems, the Greek ones had been largely unavailable to architects until then. With the rediscovery of Paestum, conveniently located south of Naples and not in far away Greece, the secret of Greek proportions was no more. Architects were able to precisely measure the temples and wrote many accounts about their primitive forms and proportions. But what did architects mean exactly when describing the proportions as primitive? What kinds of reflections did these proportions provoke? This article treats proportions as aesthetics, or as visible proportions, not as a numerical system. The discourse on proportions changed in this period, giving more weight to their cultural and historical meaning. The writings by such architects as Soane, Wilkins, and Labrouste demonstrate how Paestum functioned as a laboratory to unveil the secret of primitive proportions, and how, with the different meanings architects attached to them, it enlarged and renewed the debate on proportions.

  15. TRENDS IN MORTALITY FROM OCCUPATIONAL HAZARDS AMONG MEN IN ENGLAND AND WALES DURING 1979-2010

    Science.gov (United States)

    Harris, E Clare; Palmer, Keith T; Cox, Vanessa; Darnton, Andrew; Osman, John; Coggon, David

    2016-01-01

    Objectives To monitor the impact of health and safety provisions and inform future preventive strategies, we investigated trends in mortality from established occupational hazards in England and Wales. Methods We analysed data from death certificates on underlying cause of death and last full-time occupation for 3,688,916 deaths among men aged 20-74 years in England and Wales during 1979-2010 (excluding 1981 when records were incomplete). Proportional mortality ratios (PMRs), standardised for age and social class, were calculated for occupations at risk of specified hazards. Observed and expected numbers of deaths for each hazard were summed across occupations, and the differences summarised as average annual excesses. Results Excess mortality declined substantially for most hazards. For example, the annual excess of deaths from chronic bronchitis and emphysema fell from 170.7 during 1979-90 to 36.0 in 2001-10, and that for deaths from injury and poisoning from 237.0 to 87.5. In many cases the improvements were associated with falling PMRs (suggesting safer working practices), but they also reflected reductions in the numbers of men employed in more hazardous jobs, and declining mortality from some diseases across the whole population. Notable exceptions to the general improvement were diseases caused by asbestos, especially in some construction trades and sinonasal cancer in woodworkers. Conclusions The highest priority for future prevention of work-related fatalities is the minority of occupational disorders for which excess mortality remains static or is increasing, in particular asbestos-related disease among certain occupations in the construction industry and sinonasal cancer in woodworkers. PMID:26976946
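
    A proportional mortality ratio and the average annual excess used in this record reduce to simple arithmetic on observed and expected death counts. The sketch below shows the calculation with invented counts; the published figures are additionally standardised for age and social class.

```python
# Sketch: a proportional mortality ratio (PMR) and an average annual excess of
# deaths computed from observed and expected counts; the counts are invented,
# and the published analysis additionally standardises for age and social class.
observed_deaths = 480          # deaths from the hazard in the exposed occupations
expected_deaths = 350          # expected from national proportions
years_covered = 10

pmr = 100 * observed_deaths / expected_deaths
annual_excess = (observed_deaths - expected_deaths) / years_covered

print(f"PMR = {pmr:.0f}")                      # 100 means no excess
print(f"average annual excess = {annual_excess:.1f} deaths/year")
```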

  16. Mercury Hazard Assessment for Piscivorous Wildlife in Glacier National Park

    KAUST Repository

    Stafford, Craig P.

    2016-12-14

    We examined the mercury hazard posed to selected piscivorous wildlife in Glacier National Park (GNP), Montana. Logging Lake was our focal site where we estimated the dietary mercury concentrations of wildlife (common loon [Gavia immer], American mink [Neovison vison], river otter [Lontra canadensis], and belted kingfisher [Megaceryle alcyon]) by assuming that fishes were consumed in proportion to their relative abundances. To evaluate if Logging Lake provided a suitable baseline for our study, we made geographic comparisons of fish mercury levels and investigated the distribution and abundance of high mercury fishes within GNP. We complemented our assessment by examining selenium:mercury molar ratios in fishes from Logging Lake and Saint Mary Lake. Our results suggest fish consumption does not imperil wildlife from Logging Lake based on published thresholds for adverse mercury effects, but some hazard may exist particularly if there is strong feeding selectivity for the most contaminated species, northern pikeminnow (Ptychocheilus oregonensis). The geographic comparisons of fish mercury levels, together with the distribution and abundance of high mercury fishes within GNP, suggest that Logging Lake provided a relatively protective baseline among our study lakes. Risk may be further reduced by the molar excess of selenium relative to mercury, particularly in the smaller fishes typically consumed by GNP wildlife. Our findings contrast with studies from the northeastern US and southeastern Canada where greater mercury hazard to wildlife exists. An emergent finding from our research is that waterborne concentrations of methylmercury may provide limited insight into regional differences in fish mercury levels.

  17. Mercury Hazard Assessment for Piscivorous Wildlife in Glacier National Park

    KAUST Repository

    Stafford, Craig P.; Downs, Christopher C.; Langner, Heiko W.

    2016-01-01

    We examined the mercury hazard posed to selected piscivorous wildlife in Glacier National Park (GNP), Montana. Logging Lake was our focal site where we estimated the dietary mercury concentrations of wildlife (common loon [Gavia immer], American mink [Neovison vison], river otter [Lontra canadensis], and belted kingfisher [Megaceryle alcyon]) by assuming that fishes were consumed in proportion to their relative abundances. To evaluate if Logging Lake provided a suitable baseline for our study, we made geographic comparisons of fish mercury levels and investigated the distribution and abundance of high mercury fishes within GNP. We complemented our assessment by examining selenium:mercury molar ratios in fishes from Logging Lake and Saint Mary Lake. Our results suggest fish consumption does not imperil wildlife from Logging Lake based on published thresholds for adverse mercury effects, but some hazard may exist particularly if there is strong feeding selectivity for the most contaminated species, northern pikeminnow (Ptychocheilus oregonensis). The geographic comparisons of fish mercury levels, together with the distribution and abundance of high mercury fishes within GNP, suggest that Logging Lake provided a relatively protective baseline among our study lakes. Risk may be further reduced by the molar excess of selenium relative to mercury, particularly in the smaller fishes typically consumed by GNP wildlife. Our findings contrast with studies from the northeastern US and southeastern Canada where greater mercury hazard to wildlife exists. An emergent finding from our research is that waterborne concentrations of methylmercury may provide limited insight into regional differences in fish mercury levels.

  18. Bias in logistic regression due to imperfect diagnostic test results and practical correction approaches.

    Science.gov (United States)

    Valle, Denis; Lima, Joanna M Tucker; Millar, Justin; Amratia, Punam; Haque, Ubydul

    2015-11-04

    Logistic regression is a statistical model widely used in cross-sectional and cohort studies to identify and quantify the effects of potential disease risk factors. However, the impact of imperfect tests on adjusted odds ratios (and thus on the identification of risk factors) is under-appreciated. The purpose of this article is to draw attention to the problem associated with modelling imperfect diagnostic tests, and propose simple Bayesian models to adequately address this issue. A systematic literature review was conducted to determine the proportion of malaria studies that appropriately accounted for false-negatives/false-positives in a logistic regression setting. Inference from the standard logistic regression was also compared with that from three proposed Bayesian models using simulations and malaria data from the western Brazilian Amazon. A systematic literature review suggests that malaria epidemiologists are largely unaware of the problem of using logistic regression to model imperfect diagnostic test results. Simulation results reveal that statistical inference can be substantially improved when using the proposed Bayesian models versus the standard logistic regression. Finally, analysis of original malaria data with one of the proposed Bayesian models reveals that microscopy sensitivity is strongly influenced by how long people have lived in the study region, and an important risk factor (i.e., participation in forest extractivism) is identified that would have been missed by standard logistic regression. Given the numerous diagnostic methods employed by malaria researchers and the ubiquitous use of logistic regression to model the results of these diagnostic tests, this paper provides critical guidelines to improve data analysis practice in the presence of misclassification error. Easy-to-use code that can be readily adapted to WinBUGS is provided, enabling straightforward implementation of the proposed Bayesian models.
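
    The bias the authors describe is easy to demonstrate by simulation: misclassifying the outcome with an imperfect test attenuates the estimated coefficient. The sketch below uses scikit-learn on simulated data with assumed sensitivity and specificity; it shows the problem only, whereas the paper's correction uses Bayesian models (e.g., in WinBUGS).

```python
# Sketch: simulating how an imperfect diagnostic test attenuates logistic
# regression estimates. The sensitivity/specificity values and covariate are
# hypothetical; the cited paper corrects for this with Bayesian models instead.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(5)
n = 20000
x = rng.normal(size=n)
true_beta = 1.0
p_true = 1 / (1 + np.exp(-(-1.0 + true_beta * x)))   # true infection probability
truth = rng.binomial(1, p_true)

sensitivity, specificity = 0.80, 0.95
observed = np.where(truth == 1,
                    rng.binomial(1, sensitivity, size=n),        # false negatives
                    rng.binomial(1, 1 - specificity, size=n))    # false positives

fit_truth = LogisticRegression().fit(x.reshape(-1, 1), truth)
fit_obs = LogisticRegression().fit(x.reshape(-1, 1), observed)
print("beta using true status:    ", fit_truth.coef_[0][0])
print("beta using imperfect test: ", fit_obs.coef_[0][0])   # biased toward 0
```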

  19. Hazardous Chemicals

    Centers for Disease Control (CDC) Podcasts

    Chemicals are a part of our daily lives, providing many products and modern conveniences. With more than three decades of experience, The Centers for Disease Control and Prevention (CDC) has been in the forefront of efforts to protect and assess people's exposure to environmental and hazardous chemicals. This report provides information about hazardous chemicals and useful tips on how to protect you and your family from harmful exposure.

  20. Technical Guidance for Hazardous Analysis, Emergency Planning for Extremely Hazardous Substances

    Science.gov (United States)

    This current guide supplements NRT-1 by providing technical assistance to LEPCs to assess the lethal hazards related to potential airborne releases of extremely hazardous substances (EHSs) as designated under Section 302 of Title III of SARA.

  1. Multiwire proportional chamber for Moessbauer spectroscopy: development and results

    International Nuclear Information System (INIS)

    Costa, M.S. da.

    1985-12-01

    A new multiwire proportional chamber designed for Moessbauer spectroscopy is presented. This detector allows transmission and backscattering experiments using either photons or electrons. The Moessbauer data acquisition system, partially developed for this work, is described. A simple method for determining the frontier between the true proportional and semi-proportional regions of operation in gaseous detectors is proposed. The study of the ternary gas mixture He-Ar-CH4 leads to a straightforward way of energy calibration of the electron spectra. Moessbauer spectra obtained with an Fe-57 source are presented. In particular, those obtained with backscattered electrons show the feasibility of depth-selective analysis with gaseous proportional counters. (author) [pt

  2. The principle of proportionality and European contract law

    NARCIS (Netherlands)

    Cauffman, C.; Rutgers, J.; Sirena, P.

    2015-01-01

    The paper investigates the role of the principle of proportionality within contract law, in balancing the rights and obligations of the contracting parties. It illustrates that the principle of proportionality is one of the general principles which govern contractual relations, and as such it is an

  3. The Coastal Hazard Wheel system for coastal multi-hazard assessment & management in a changing climate

    DEFF Research Database (Denmark)

    Appelquist, Lars Rosendahl; Halsnæs, Kirsten

    2015-01-01

    This paper presents the complete Coastal Hazard Wheel (CHW) system, developed for multi-hazard-assessment and multi-hazard-management of coastal areas worldwide under a changing climate. The system is designed as a low-tech tool that can be used in areas with limited data availability...... screening and management. The system is developed to assess the main coastal hazards in a single process and covers the hazards of ecosystem disruption, gradual inundation, salt water intrusion, erosion and flooding. The system was initially presented in 2012 and based on a range of test......-applications and feedback from coastal experts, the system has been further refined and developed into a complete hazard management tool. This paper therefore covers the coastal classification system used by the CHW, a standardized assessment procedure for implementation of multi-hazard-assessments, technical guidance...

  4. Disposal of hazardous wastes

    International Nuclear Information System (INIS)

    Barnhart, B.J.

    1978-01-01

    The Fifth Life Sciences Symposium, entitled Hazardous Solid Wastes and Their Disposal and held on October 12 through 14, 1977, is summarized. The central theme was that the passage of the Resource Conservation and Recovery Act of 1976 will force some type of action on all hazardous solid wastes. Some major points covered were: the formulation of a definition of a hazardous solid waste, assessment of long-term risk, lists of specific materials or general criteria to specify the wastes of concern, bioethics, sources of hazardous waste, industrial and agricultural wastes, coal wastes, radioactive wastes, and the disposal of wastes.

  5. Understanding Poisson regression.

    Science.gov (United States)

    Hayat, Matthew J; Higgins, Melinda

    2014-04-01

    Nurse investigators often collect study data in the form of counts. Traditional methods of data analysis have historically approached analysis of count data either as if the count data were continuous and normally distributed or with dichotomization of the counts into the categories of occurred or did not occur. These outdated methods for analyzing count data have been replaced with more appropriate statistical methods that make use of the Poisson probability distribution, which is useful for analyzing count data. The purpose of this article is to provide an overview of the Poisson distribution and its use in Poisson regression. Assumption violations for the standard Poisson regression model are addressed with alternative approaches, including addition of an overdispersion parameter or negative binomial regression. An illustrative example is presented with an application from the ENSPIRE study, and regression modeling of comorbidity data is included for illustrative purposes. Copyright 2014, SLACK Incorporated.
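
    A minimal sketch of the modelling choice discussed above, assuming statsmodels is available and using simulated counts rather than the ENSPIRE data: a Poisson GLM is fit to overdispersed counts and compared with a negative binomial GLM, whose extra dispersion parameter absorbs the overdispersion.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 500
x = rng.normal(size=n)
X = sm.add_constant(x)
mu = np.exp(0.3 + 0.5 * x)

# overdispersed counts: negative binomial with dispersion parameter k = 2
k = 2.0
y = rng.negative_binomial(k, k / (k + mu))

poisson_fit = sm.GLM(y, X, family=sm.families.Poisson()).fit()
negbin_fit = sm.GLM(y, X, family=sm.families.NegativeBinomial(alpha=1.0 / k)).fit()

print(poisson_fit.params, poisson_fit.bse)   # Poisson standard errors are too small here
print(negbin_fit.params, negbin_fit.bse)     # negative binomial accounts for overdispersion
```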

  6. Alternative Methods of Regression

    CERN Document Server

    Birkes, David

    2011-01-01

    Of related interest: Nonlinear Regression Analysis and Its Applications, Douglas M. Bates and Donald G. Watts. "...an extraordinary presentation of concepts and methods concerning the use and analysis of nonlinear regression models...highly recommend[ed]...for anyone needing to use and/or understand issues concerning the analysis of nonlinear regression models." --Technometrics. This book provides a balance between theory and practice supported by extensive displays of instructive geometrical constructs. Numerous in-depth case studies illustrate the use of nonlinear regression analysis--with all data s

  7. Offsite transportation hazards assessment

    International Nuclear Information System (INIS)

    Burnside, M.E.

    1997-01-01

    This report documents the emergency preparedness Hazards Assessment for the offsite transportation of hazardous material from the Hanford Site. The assessment is required by the US Department of Energy (DOE) Order 151.1. Offsite transportation accidents are categorized using the DOE system to assist communication within the DOE and assure that appropriate assistance is provided to the people in charge at the scene. The assistance will initially include information about the load and the potential hazards. Local authorities will use the information to protect the public following a transportation accident. This Hazards Assessment will focus on the material being transported from the Hanford Site. Shipments coming to Hanford are the responsibility of the shipper and the carrier and, therefore, are not included in this Hazards Assessment, unless the DOE elects to be the shipper of record

  8. Radiological hazards of alpha-contaminated waste

    International Nuclear Information System (INIS)

    Rodgers, J.C.

    1982-01-01

    The radiological hazards of alpha-contaminated wastes are discussed in this overview in terms of two components of hazard: radiobiological hazard, and radioecological hazard. Radiobiological hazard refers to human uptake of alpha-emitters by inhalation and ingestion, and the resultant dose to critical organs of the body. Radioecological hazard refers to the processes of release from buried wastes, transport in the environment, and translocation to man through the food chain. Besides detailing the sources and magnitude of hazards, this brief review identifies the uncertainties in their estimation, and implications for the regulatory process

  9. Hazards from aircraft

    International Nuclear Information System (INIS)

    Grund, J.E.; Hornyik, K.

    1975-01-01

    The siting of nuclear power plants has created innumerable environmental concerns. Among the effects of the "man-made environment", one of increasing importance in recent nuclear plant siting hazards analysis has been the concern about aircraft hazards to the nuclear plant. These hazards are of concern because of the possibility that an aircraft may have a malfunction and crash either near the plant or directly into it. Such a crash could be postulated to result, because of missile and/or fire effects, in radioactive releases which would endanger the public health and safety. The majority of studies related to hazards from air traffic have been concerned with the determination of the probability associated with an aircraft striking vulnerable portions of a given plant. Other studies have focused on the structural response to such a strike. This work focuses on the problem of strike probability. 13 references

  10. Education and risk of coronary heart disease: Assessment of mediation by behavioural risk factors using the additive hazards model

    DEFF Research Database (Denmark)

    Nordahl, H; Rod, NH; Frederiksen, BL

    2013-01-01

    seven Danish cohort studies were linked to registry data on education and incidence of CHD. Mediation by smoking, low physical activity, and body mass index (BMI) on the association between education and CHD were estimated by applying newly proposed methods for mediation based on the additive hazards...... % CI: 12, 22) for women and 37 (95 % CI: 28, 46) for men could be ascribed to the pathway through smoking. Further, 39 (95 % CI: 30, 49) cases for women and 94 (95 % CI: 79, 110) cases for men could be ascribed to the pathway through BMI. The effects of low physical activity were negligible. Using...... contemporary methods, the additive hazards model, for mediation we indicated the absolute numbers of CHD cases prevented when modifying smoking and BMI. This study confirms previous claims based on the Cox proportional hazards model that behavioral risk factors partially mediates the effect of education on CHD...
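
    The sketch below is not the authors' mediation estimator; it only illustrates the basic ingredient on simulated data, assuming the lifelines package is available: an additive hazards (Aalen) model is fit with and without the mediator, and the change in the exposure's cumulative coefficient hints at how much of the effect runs through the mediator. All coefficients and variable names are hypothetical.

```python
import numpy as np
import pandas as pd
from lifelines import AalenAdditiveFitter

# simulated cohort (hypothetical coefficients): low education raises smoking,
# and both add to the CHD hazard on the additive scale
rng = np.random.default_rng(2)
n = 20000
low_edu = rng.binomial(1, 0.4, n)
smoker = rng.binomial(1, 0.2 + 0.2 * low_edu, n)
hazard = 0.002 + 0.001 * low_edu + 0.002 * smoker
event_time = rng.exponential(1.0 / hazard)
censor_time = rng.uniform(5, 25, n)

df = pd.DataFrame({"low_edu": low_edu, "smoker": smoker,
                   "T": np.minimum(event_time, censor_time),
                   "E": (event_time <= censor_time).astype(int)})

# total effect of education (mediator omitted) vs direct effect (mediator included);
# on the additive scale, the difference suggests how much operates through smoking
total = AalenAdditiveFitter(coef_penalizer=0.5).fit(
    df[["low_edu", "T", "E"]], duration_col="T", event_col="E")
direct = AalenAdditiveFitter(coef_penalizer=0.5).fit(
    df, duration_col="T", event_col="E")

print(total.cumulative_hazards_.tail(1))
print(direct.cumulative_hazards_.tail(1))
```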

  11. Exploring the effects of driving experience on hazard awareness and risk perception via real-time hazard identification, hazard classification, and rating tasks.

    Science.gov (United States)

    Borowsky, Avinoam; Oron-Gilad, Tal

    2013-10-01

    This study investigated the effects of driving experience on hazard awareness and risk perception skills. These topics have previously been investigated separately, yet a novel approach is suggested where hazard awareness and risk perception are examined concurrently. Young, newly qualified drivers, experienced drivers, and a group of commercial drivers, namely, taxi drivers performed three consecutive tasks: (1) observed 10 short movies of real-world driving situations and were asked to press a button each time they identified a hazardous situation; (2) observed one of three possible sub-sets of 8 movies (out of the 10 they have seen earlier) for the second time, and were asked to categorize them into an arbitrary number of clusters according to the similarity in their hazardous situation; and (3) observed the same sub-set for a third time and following each movie were asked to rate its level of hazardousness. The first task is considered a real-time identification task while the other two are performed using hindsight. During it participants' eye movements were recorded. Results showed that taxi drivers were more sensitive to hidden hazards than the other driver groups and that young-novices were the least sensitive. Young-novice drivers also relied heavily on materialized hazards in their categorization structure. In addition, it emerged that risk perception was derived from two major components: the likelihood of a crash and the severity of its outcome. Yet, the outcome was rarely considered under time pressure (i.e., in real-time hazard identification tasks). Using hindsight, when drivers were provided with the opportunity to rate the movies' hazardousness more freely (rating task) they considered both components. Otherwise, in the categorization task, they usually chose the severity of the crash outcome as their dominant criterion. Theoretical and practical implications are discussed. Copyright © 2013 Elsevier Ltd. All rights reserved.

  12. Probabilistic Tsunami Hazard Analysis

    Science.gov (United States)

    Thio, H. K.; Ichinose, G. A.; Somerville, P. G.; Polet, J.

    2006-12-01

    The recent tsunami disaster caused by the 2004 Sumatra-Andaman earthquake has focused our attention to the hazard posed by large earthquakes that occur under water, in particular subduction zone earthquakes, and the tsunamis that they generate. Even though these kinds of events are rare, the very large loss of life and material destruction caused by this earthquake warrant a significant effort towards the mitigation of the tsunami hazard. For ground motion hazard, Probabilistic Seismic Hazard Analysis (PSHA) has become a standard practice in the evaluation and mitigation of seismic hazard to populations in particular with respect to structures, infrastructure and lifelines. Its ability to condense the complexities and variability of seismic activity into a manageable set of parameters greatly facilitates the design of effective seismic resistant buildings but also the planning of infrastructure projects. Probabilistic Tsunami Hazard Analysis (PTHA) achieves the same goal for hazards posed by tsunami. There are great advantages of implementing such a method to evaluate the total risk (seismic and tsunami) to coastal communities. The method that we have developed is based on the traditional PSHA and therefore completely consistent with standard seismic practice. Because of the strong dependence of tsunami wave heights on bathymetry, we use a full waveform tsunami waveform computation in lieu of attenuation relations that are common in PSHA. By pre-computing and storing the tsunami waveforms at points along the coast generated for sets of subfaults that comprise larger earthquake faults, we can efficiently synthesize tsunami waveforms for any slip distribution on those faults by summing the individual subfault tsunami waveforms (weighted by their slip). This efficiency make it feasible to use Green's function summation in lieu of attenuation relations to provide very accurate estimates of tsunami height for probabilistic calculations, where one typically computes
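
    A minimal sketch of the Green's function summation idea, using synthetic unit-slip waveforms instead of real hydrodynamic output: the tsunami waveform for a given rupture is obtained by summing precomputed subfault responses weighted by their slip, so no new wave propagation run is needed per scenario.

```python
import numpy as np

# hypothetical precomputed "Green's functions": one unit-slip tsunami waveform per
# subfault, sampled at a single coastal site (n_subfaults x n_time_samples)
n_sub, n_t = 20, 720
t = np.linspace(0.0, 3 * 3600.0, n_t)                       # three hours, in seconds
unit_waveforms = (np.sin(np.outer(np.arange(1, n_sub + 1), t / 900.0))
                  * np.exp(-t / 5400.0))                    # stand-in shapes only

rng = np.random.default_rng(3)
slip = rng.uniform(0.0, 4.0, n_sub)                         # slip (m) on each subfault

# linear superposition: the waveform for this rupture is the slip-weighted sum of
# the precomputed unit responses
eta = slip @ unit_waveforms
print("peak modelled tsunami amplitude at the site:", eta.max())
```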

  13. Flood hazards for nuclear power plants

    International Nuclear Information System (INIS)

    Yen, B.C.

    1988-01-01

    Flooding hazards for nuclear power plants may be caused by various external geophysical events. In this paper, the hydrologic hazards from flash floods, river floods and heavy rain at the plant site are considered. Depending on the mode of analysis, two types of hazard evaluation are identified: 1) the design hazard, which is the probability of flooding over an expected service period, and 2) the operational hazard, which deals with real-time forecasting of the probability of flooding for an incoming event. Hazard evaluation techniques using flood frequency analysis can only be used for type 1, the design hazard. Evaluation techniques using rainfall-runoff simulation or multi-station correlation can be used for both types of hazard prediction. (orig.)
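
    A one-line illustration of the design hazard defined above, under the common simplifying assumption of independent annual flood events:

```python
# design hazard: probability of at least one exceedance of the design flood during
# an expected service period, assuming independent annual events (a simplification)
def design_hazard(annual_exceedance_probability: float, service_years: int) -> float:
    return 1.0 - (1.0 - annual_exceedance_probability) ** service_years

# e.g. a "100-year" flood (annual probability 0.01) over a 40-year plant service life
print(design_hazard(0.01, 40))   # about 0.33
```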

  14. GEOSPATIAL DATA INTEGRATION FOR ASSESSING LANDSLIDE HAZARD ON ENGINEERED SLOPES

    Directory of Open Access Journals (Sweden)

    P. E. Miller

    2012-07-01

    Full Text Available Road and rail networks are essential components of national infrastructures, underpinning the economy, and facilitating the mobility of goods and the human workforce. Earthwork slopes such as cuttings and embankments are primary components, and their reliability is of fundamental importance. However, instability and failure can occur, through processes such as landslides. Monitoring the condition of earthworks is a costly and continuous process for network operators, and currently, geospatial data is largely underutilised. The research presented here addresses this by combining airborne laser scanning and multispectral aerial imagery to develop a methodology for assessing landslide hazard. This is based on the extraction of key slope stability variables from the remotely sensed data. The methodology is implemented through numerical modelling, which is parameterised with the slope stability information, simulated climate conditions, and geotechnical properties. This allows determination of slope stability (expressed through the factor of safety) for a range of simulated scenarios. Regression analysis is then performed in order to develop a functional model relating slope stability to the input variables. The remotely sensed raster datasets are robustly re-sampled to two-dimensional cross-sections to facilitate meaningful interpretation of slope behaviour and mapping of landslide hazard. Results are stored in a geodatabase for spatial analysis within a GIS environment. For a test site located in England, UK, results have shown the utility of the approach in deriving practical hazard assessment information. Outcomes were compared to the network operator’s hazard grading data, and show general agreement. The utility of the slope information was also assessed with respect to auto-population of slope geometry, and found to deliver significant improvements over the network operator’s existing field-based approaches.

  15. Introduction to regression graphics

    CERN Document Server

    Cook, R Dennis

    2009-01-01

    Covers the use of dynamic and interactive computer graphics in linear regression analysis, focusing on analytical graphics. Features new techniques like plot rotation. The authors have composed their own regression code, using Xlisp-Stat language called R-code, which is a nearly complete system for linear regression analysis and can be utilized as the main computer program in a linear regression course. The accompanying disks, for both Macintosh and Windows computers, contain the R-code and Xlisp-Stat. An Instructor's Manual presenting detailed solutions to all the problems in the book is ava

  16. Does workplace social capital associate with hazardous drinking among Chinese rural-urban migrant workers?

    Science.gov (United States)

    Gao, Junling; Weaver, Scott R; Fua, Hua; Pan, Zhigang

    2014-01-01

    The present study sought to investigate the associations between workplace social capital and hazardous drinking (HD) among Chinese rural-urban migrant workers (RUMW). A cross sectional study with a multi-stage stratified sampling procedure was conducted in Shanghai during July 2012 to January 2013. In total, 5,318 RUMWs from 77 workplaces were involved. Work-place social capital was assessed using a validated and psychometrically tested eight-item measure. The Chinese version of Alcohol Use Disorders Identification Test (AUDIT) was used to assess hazardous drinking. Control variables included gender, age, marital status, education level, salary, and current smoking. Multilevel logistic regression analysis was conducted to test whether individual- and workplace-level social capital was associated with hazardous drinking. Overall, the prevalence of HD was 10.6%. After controlling for individual-level socio-demographic and lifestyle variables, compared to workers in the highest quartile of individual-level social capital, the odds of HD for workers in the three bottom quartiles were 1.13(95%CI: 1.04-1.23), 1.17(95%CI: 1.05-1.56) and 1.26(95%CI: 1.13-1.72), respectively. However, contrary to hypothesis, there was no relationship between workplace-level social capital and hazardous drinking. Higher individual-level social capital may protect against HD among Chinese RUMWs. Interventions to build individual social capital among RUMWs in China may help reduce HD among this population.

  17. Hazard Experience, Geophysical Vulnerability, and Flood Risk Perceptions in a Postdisaster City, the Case of New Orleans.

    Science.gov (United States)

    Gotham, Kevin Fox; Campanella, Richard; Lauve-Moon, Katie; Powers, Bradford

    2018-02-01

    This article investigates the determinants of flood risk perceptions in New Orleans, Louisiana (United States), a deltaic coastal city highly vulnerable to seasonal nuisance flooding and hurricane-induced deluges and storm surges. Few studies have investigated the influence of hazard experience, geophysical vulnerability (hazard proximity), and risk perceptions in cities undergoing postdisaster recovery and rebuilding. We use ordinal logistic regression techniques to analyze experiential, geophysical, and sociodemographic variables derived from a survey of 384 residents in seven neighborhoods. We find that residents living in neighborhoods that flooded during Hurricane Katrina exhibit higher levels of perceived risk than those residents living in neighborhoods that did not flood. In addition, findings suggest that flood risk perception is positively associated with female gender, lower income, and direct flood experiences. In conclusion, we discuss the implications of these findings for theoretical and empirical research on environmental risk, flood risk communication strategies, and flood hazards planning. © 2017 Society for Risk Analysis.
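
    As an illustration of the ordinal logistic regression used in this kind of analysis, the sketch below fits a proportional odds model to simulated survey responses. statsmodels' OrderedModel is assumed to be available, and the covariates are hypothetical stand-ins for flood experience and gender, not the New Orleans survey data.

```python
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

# simulated survey with an ordered outcome (low / medium / high perceived flood risk)
rng = np.random.default_rng(4)
n = 384
flooded_in_katrina = rng.binomial(1, 0.5, n)
female = rng.binomial(1, 0.5, n)
latent = 0.9 * flooded_in_katrina + 0.5 * female + rng.logistic(size=n)
levels = np.digitize(latent, [0.5, 1.5])                    # codes 0, 1, 2

risk = pd.Series(pd.Categorical.from_codes(levels, ["low", "medium", "high"], ordered=True))
X = pd.DataFrame({"flooded_in_katrina": flooded_in_katrina, "female": female})

result = OrderedModel(risk, X, distr="logit").fit(method="bfgs", disp=False)
print(result.summary())      # positive coefficients correspond to higher perceived risk
```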

  18. The principle of proportionality revisited: interpretations and applications.

    Science.gov (United States)

    Hermerén, Göran

    2012-11-01

    The principle of proportionality is used in many different contexts. Some of these uses and contexts are first briefly indicated. This paper focusses on the use of this principle as a moral principle. I argue that under certain conditions the principle of proportionality is helpful as a guide in decision-making. But it needs to be clarified and to be used with some flexibility as a context-dependent principle. Several interpretations of the principle are distinguished, using three conditions as a starting point: importance of objective, relevance of means, and most favourable option. The principle is then tested against an example, which suggests that a fourth condition, focusing on non-excessiveness, needs to be added. I will distinguish between three main interpretations of the principle, some primarily with uses in research ethics, others with uses in other areas of bioethics, for instance in comparisons of therapeutic means and ends. The relations between the principle of proportionality and the precautionary principle are explored in the following section. It is concluded that the principles are different and may even clash. In the next section the principle of proportionality is applied to some medical examples drawn from research ethics and bioethics. In concluding, the status of the principle of proportionality as a moral principle is discussed. What has been achieved so far and what remains to be done is finally summarized.

  19. Analyses of Developmental Rate Isomorphy in Ectotherms: Introducing the Dirichlet Regression.

    Directory of Open Access Journals (Sweden)

    David S Boukal

    Full Text Available Temperature drives development in insects and other ectotherms because their metabolic rate and growth depends directly on thermal conditions. However, relative durations of successive ontogenetic stages often remain nearly constant across a substantial range of temperatures. This pattern, termed 'developmental rate isomorphy' (DRI) in insects, appears to be widespread and reported departures from DRI are generally very small. We show that these conclusions may be due to the caveats hidden in the statistical methods currently used to study DRI. Because the DRI concept is inherently based on proportional data, we propose that Dirichlet regression applied to individual-level data is an appropriate statistical method to critically assess DRI. As a case study we analyze data on five aquatic and four terrestrial insect species. We find that results obtained by Dirichlet regression are consistent with DRI violation in at least eight of the studied species, although standard analysis detects significant departure from DRI in only four of them. Moreover, the departures from DRI detected by Dirichlet regression are consistently much larger than previously reported. The proposed framework can also be used to infer whether observed departures from DRI reflect life history adaptations to size- or stage-dependent effects of varying temperature. Our results indicate that the concept of DRI in insects and other ectotherms should be critically re-evaluated and put in a wider context, including the concept of 'equiproportional development' developed for copepods.
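
    A minimal sketch of what Dirichlet regression involves, assuming only scipy and simulated stage-duration proportions rather than the insect data: the Dirichlet concentration parameters are tied to a covariate through a log link and estimated by maximum likelihood. Under strict DRI the temperature slopes would all be near zero.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import dirichlet

rng = np.random.default_rng(5)
n, k = 200, 3                                   # individuals, ontogenetic stages
temp = rng.uniform(10, 30, n)                   # rearing temperature
X = np.column_stack([np.ones(n), (temp - 20.0) / 10.0])

B_true = np.array([[1.2, 0.0],                  # per-stage intercept and temperature slope
                   [1.0, 0.3],                  # on the log-concentration scale
                   [0.8, -0.2]])
Y = np.vstack([rng.dirichlet(a) for a in np.exp(X @ B_true.T)])   # stage proportions

def neg_log_lik(b_flat):
    alpha = np.exp(X @ b_flat.reshape(k, 2).T)  # log link keeps concentrations positive
    return -sum(dirichlet.logpdf(y, a) for y, a in zip(Y, alpha))

fit = minimize(neg_log_lik, np.zeros(k * 2), method="L-BFGS-B")
print(fit.x.reshape(k, 2))                      # slope column ~0 would be consistent with DRI
```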

  20. Social Hazards as Manifested Workplace Discrimination and Health (Vietnamese and Ukrainian Female and Male Migrants in Czechia).

    Science.gov (United States)

    Drbohlav, Dušan; Dzúrová, Dagmar

    2017-10-10

    Social hazards as one of the dimensions of workplace discrimination are a potential social determinant of health inequalities. The aim of this study was to investigate relations between self-reported health and social hazard characteristics (defined as discrimination as such, violence or threat of violence, time pressure or work overload and risk of accident) among Vietnamese and Ukrainian migrants (males and females) in Czechia by age, education level and marital status. This study is based on data from a survey of 669 immigrants in Czechia in 2013. Logistic regression analysis indicates that the given independent variables (given social hazards and socio-demographic characteristics), as predictors of a quality of self-reported health are more important for immigrant females than for males, irrespective of citizenship, albeit only for some of them and to differing extents. We found out that being exposed to the selected social hazards in the workplace leads to worsening self-rated health, especially for females. On the other hand, there was no statistically significant relationship found between poor self-rated health and discrimination as such. Reality calls for more research and, consequently, better policies and practices in the field of health inequalities.

  1. Social Hazards as Manifested Workplace Discrimination and Health (Vietnamese and Ukrainian Female and Male Migrants in Czechia

    Directory of Open Access Journals (Sweden)

    Dušan Drbohlav

    2017-10-01

    Full Text Available Social hazards as one of the dimensions of workplace discrimination are a potential social determinant of health inequalities. The aim of this study was to investigate relations between self-reported health and social hazard characteristics (defined as—discrimination as such, violence or threat of violence, time pressure or work overload and risk of accident) among Vietnamese and Ukrainian migrants (males and females) in Czechia by age, education level and marital status. This study is based on data from a survey of 669 immigrants in Czechia in 2013. Logistic regression analysis indicates that the given independent variables (given social hazards and socio-demographic characteristics), as predictors of a quality of self-reported health are more important for immigrant females than for males, irrespective of citizenship, albeit only for some of them and to differing extents. We found out that being exposed to the selected social hazards in the workplace leads to worsening self-rated health, especially for females. On the other hand, there was no statistically significant relationship found between poor self-rated health and discrimination as such. Reality calls for more research and, consequently, better policies and practices in the field of health inequalities.

  2. Occupational psychosocial hazards among the emerging U.S. green collar workforce

    Science.gov (United States)

    Fernandez, Cristina A.; Moore, Kevin; McClure, Laura A.; Caban-Martinez, Alberto J.; LeBlanc, William G.; Fleming, Lora E.; Cifuentes, Manuel; Lee, David J.

    2016-01-01

    Objective To compare occupational psychosocial hazards in green collar versus non-green collar workers. Methods Standard Occupational Classification codes were used to link the 2010 National Health Interview Survey to the 2010 Occupational Information Network Database. Multivariable logistic regressions were used to predict job insecurity, work-life imbalance, and workplace harassment in green versus non-green collar workers. Results Most participants were white, non-Hispanic, 25–64 years of age, and obtained greater than a high school education. The majority reported not being harassed at work, no work-life imbalance, and no job insecurity. Relative to non-green collar workers (n=12,217), green collar workers (n=2,588) were more likely to report job insecurity (OR=1.13; 95% CI=1.02–1.26) and work-life imbalance (1.19; 1.05–1.35), but less likely to experience workplace harassment (0.77; 0.62–0.95). Conclusions Continuous surveillance of occupational psychosocial hazards is recommended in this rapidly emerging workforce. PMID:28045790

  3. Geomorphological hazards in Swat valley, Pakistan

    International Nuclear Information System (INIS)

    Usman, A.

    1999-01-01

    This study attempts to describe, interpret and analyze, in depth, the varied geomorphological hazards and their impacts prevailing in the Swat valley, located in the northern hilly and mountainous regions of Pakistan. The hills and mountains are zones of high geomorphological activity with rapid rates of weathering, active tectonics, abundant precipitation, rapid runoff and heavy sediment transport. Due to the varied topography, lithology, steep slopes, erodible soils, heavy winter snowfall and intensive rainfall in the spring and summer seasons, several kinds of geomorphological hazards, such as geomorphic gravitational hazards, fluvial hazards, glacial hazards and geotectonic hazards, occur frequently in the Swat valley. Amongst them, geomorphic gravitational hazards, such as rock falls, rock slides, debris slides, mud flows and avalanches, are the major hazards in the mountains and hills, while fluvial hazards and sedimentation are mainly confined to the alluvial plain and lowlands of the valley. The geotectonic hazards, on the other hand, have a widespread distribution in the valley. The magnitude and occurrence of each kind of hazard thus vary according to the intensity of the process and the physical geographic environment. This paper discusses the type, distribution and damage due to the various geomorphological hazards and their reduction treatments. The study should be of particular importance and interest to both natural and social scientists, as well as planners, environmentalists and decision-makers, for successful developmental interventions in the region. (author)

  4. Welding hazards

    International Nuclear Information System (INIS)

    Khan, M.A.

    1992-01-01

    Welding technology is advancing rapidly in the developed countries and has developed into a science. Welding processes involving the use of electricity include resistance welding. Welding shops are opened in residential areas, which causes safety hazards, particularly for teenagers and children who eagerly watch the welding arc with their naked eyes. There are radiation hazards from ultraviolet rays, which irritate the skin and the eyes. Welding arc light of such intensity could damage the eyes. (Orig./A.B.)

  5. High Proportions of Multidrug-Resistant Acinetobacter spp. Isolates in a District in Western India: A Four-Year Antibiotic Susceptibility Study of Clinical Isolates

    Directory of Open Access Journals (Sweden)

    Ingvild Odsbu

    2018-01-01

    Full Text Available The purpose of the study was to determine the proportions of multidrug-resistant (MDR) Acinetobacter spp. isolates from the district of Nashik in Western India during the period from 2011–2014. Antibacterial susceptibility testing of isolates from inpatients and outpatients was performed using the Kirby–Bauer disc diffusion method to determine inhibitory zone diameters. Proportions of non-susceptible isolates were calculated from the antibacterial susceptibility data. MDR was defined as an isolate being non-susceptible to at least one antibacterial agent in at least three antibacterial categories. The change in proportions of MDR isolates, extended-spectrum β-lactamase (ESBL)-producing isolates, and non-susceptible isolates to specific antibacterial categories over calendar time was investigated by logistic regression. The proportions of MDR and ESBL-producing isolates ranged from 89.4% to 95.9% and from 87.9% to 94.0%, respectively. The proportions of non-susceptible isolates to aminoglycosides, carbapenems, antipseudomonal penicillins/β-lactamase inhibitors, cephalosporins, folate pathway inhibitors, or penicillins/β-lactamase inhibitors exceeded 77.5%. Proportions of fluoroquinolone and tetracycline non-susceptible isolates ranged from 65.3% to 83.3% and from 71.3% to 75.9%, respectively. No changes in trends were observed over time, except for a decreasing trend in fluoroquinolone non-susceptible isolates (OR = 0.75; 95% CI, 0.62–0.91). Significantly higher proportions of non-susceptible, MDR and ESBL-producing isolates were found among isolates from the respiratory system compared to isolates from all other specimen types (p < 0.05). High proportions of MDR Acinetobacter spp. isolates were observed in the period from 2011–2014. Antimicrobial stewardship programmes are needed to prevent the emergence and spread of antibiotic resistance.
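
    A small sketch of the trend analysis described above, using entirely hypothetical yearly counts rather than the study data: a binomial GLM with a logit link regresses the proportion of non-susceptible isolates on calendar time, and the exponentiated slope is the per-year odds ratio.

```python
import numpy as np
import statsmodels.api as sm

# hypothetical yearly counts of fluoroquinolone non-susceptible isolates (not the study data)
year = np.array([2011, 2012, 2013, 2014])
non_susceptible = np.array([250, 238, 220, 199])
tested = np.array([300, 295, 290, 285])

X = sm.add_constant(year - year.min())                      # calendar time, years since 2011
endog = np.column_stack([non_susceptible, tested - non_susceptible])
fit = sm.GLM(endog, X, family=sm.families.Binomial()).fit()

print("per-year odds ratio:", np.exp(fit.params[1]))        # < 1 indicates a decreasing trend
print("95% CI:", np.exp(fit.conf_int()[1]))
```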

  6. Relative consequences of transporting hazardous materials

    International Nuclear Information System (INIS)

    Fullwood, R.R.; Rhyne, W.R.; Simmons, J.A.; Reese, R.T.

    1980-01-01

    The objective of this paper is to discuss methods under study at Transportation Technology Center to develop a perspective on how technical measures of hazard and risk relate to perception of hazards, harm, and risks associated with transporting hazardous materials. This paper is concerned with two major aspects of the relative hazards problem. The first aspect is the analyses of the possible effects associated with exposure to hazardous materials as contained in the following two parts: outlines of possible problems and controversies that could be encountered in the evaluation and comparisons of hazards and risks; and description of the various measures of harm (hazards or dangers) and subsequent comparisons thereof. The second aspect of this paper leads into a presentation of the results of a study which had the following purposes: to develop analytical techniques for a consistent treatment of the phenomenology of the consequences of a release of hazardous materials; to reduce the number of variables in the consequence analyses by development of transportation accident scenarios which have the same meteorological conditions, demography, traffic and population densities, geographical features and other appropriate conditions and to develop consistent methods for presenting the results of studies and analyses that describe the phenomenology and compare hazards. The results of the study are intended to provide a bridge between analytical certainty and perception of the hazards involved. Understanding the differences in perception of hazards resulting from transport of various hazardous materials is fraught with difficulties in isolating the qualitative and quantitative features of the problem. By relating the quantitative impacts of material hazards under identical conditions, it is hoped that the perceived differences in material hazards can be delineated and evaluated

  7. Childhood body mass index and the risk of prostate cancer in adult men

    DEFF Research Database (Denmark)

    Aarestrup, J; Gamborg, M; Cook, M B

    2014-01-01

    BACKGROUND: Prostate cancer aetiology is poorly understood. It may have origins early in life; previously we found a positive association with childhood height. The effects of early life body mass index (BMI; kg m(-2)) on prostate cancer remain equivocal. We investigated if childhood BMI...... to the Danish Cancer Registry. Cox proportional hazards regressions were performed. RESULTS: Overall, 3355 men were diagnosed with prostate cancer. Body mass index during childhood was positively associated with adult prostate cancer. The hazard ratio of prostate cancer was 1.06 (95% confidence interval (CI): 1...
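
    A minimal sketch of a Cox proportional hazards fit of this kind, assuming the lifelines package and using simulated data; the true hazard ratio is set near the reported 1.06, and nothing here comes from the Danish registries.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

# simulated cohort: childhood BMI z-score and time to prostate cancer diagnosis
rng = np.random.default_rng(6)
n = 5000
bmi_z = rng.normal(size=n)
event_time = rng.exponential(1.0 / (0.01 * np.exp(0.06 * bmi_z)))   # true HR ~ 1.06 per z-score
censor_time = rng.uniform(10, 60, n)

df = pd.DataFrame({"bmi_z": bmi_z,
                   "T": np.minimum(event_time, censor_time),
                   "E": (event_time <= censor_time).astype(int)})

cph = CoxPHFitter()
cph.fit(df, duration_col="T", event_col="E")
cph.print_summary()            # exp(coef) is the estimated hazard ratio per z-score unit
```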

  8. Congenital cerebral palsy and prenatal exposure to self-reported maternal infections, fever, or smoking

    DEFF Research Database (Denmark)

    Streja, Elani; Miller, Jessica; Bech, Bodil H

    2013-01-01

    OBJECTIVE: The objective of the study was to investigate the association between maternal self-reported infections, fever, and smoking in the prenatal period and the subsequent risk for congenital cerebral palsy (CP). STUDY DESIGN: We included the 81,066 mothers of singletons born between 1996...... and midgestation. We identified 139 CP cases including 121 cases of spastic CP (sCP) as confirmed by the Danish National Cerebral Palsy Register. Cox proportional hazards regression models were used to estimate adjusted hazard ratios (aHRs) and 95% confidence intervals (CIs). RESULTS: Self-reported vaginal...

  9. Volume-Based F-18 FDG PET/CT Imaging Markers Provide Supplemental Prognostic Information to Histologic Grading in Patients With High-Grade Bone or Soft Tissue Sarcoma

    DEFF Research Database (Denmark)

    Andersen, Kim Francis; Fuglo, Hanna Maria; Rasmussen, Sine Hvid

    2015-01-01

    analysis. Kaplan-Meier survival estimates and log-rank test were used to compare the degree of equality of survival distributions. Prognostic variables with related hazard ratios (HR) were assessed using Cox proportional hazards regression analysis.Forty-one of 92 patients died during follow-up (45%; 12 BS.......05, HR 3.37 [95% CI 1.02-11.11]). No significant results were demonstrated for MTV40%.Volume-based F-18 FDG PET/CT imaging markers in terms of pretreatment estimation of TLG provide supplemental prognostic information to histologic grading, with significant independent properties for prediction...
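
    A small sketch of the survival comparison machinery named above (Kaplan-Meier estimates plus a log-rank test), assuming the lifelines package and a hypothetical split on the imaging marker; none of the numbers reflect the study cohort.

```python
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

# hypothetical cohort split on a pretreatment imaging marker (high vs low TLG)
rng = np.random.default_rng(7)
n = 92
high_tlg = rng.binomial(1, 0.5, n).astype(bool)
T = rng.exponential(np.where(high_tlg, 3.0, 6.0))      # years; shorter survival if marker high
E = rng.binomial(1, 0.8, n)                            # 1 = death observed, 0 = censored

kmf = KaplanMeierFitter()
for mask, label in [(~high_tlg, "low TLG"), (high_tlg, "high TLG")]:
    kmf.fit(T[mask], E[mask], label=label)
    print(label, "median survival:", kmf.median_survival_time_)

res = logrank_test(T[high_tlg], T[~high_tlg],
                   event_observed_A=E[high_tlg], event_observed_B=E[~high_tlg])
print("log-rank p-value:", res.p_value)
```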

  10. Position sensitive proportional counters as focal plane detectors

    International Nuclear Information System (INIS)

    Ford, J.L.C. Jr.

    1979-01-01

    The rise time and charge division techniques for position decoding with RC-line proportional counters are reviewed. The advantages that these detectors offer as focal plane counters for nuclear spectroscopy performed with magnetic spectrographs are discussed. The theory of operation of proportional counters as position sensing devices is summarized, as well as practical aspects affecting their application. Factors limiting the position and energy resolutions obtainable with a focal plane proportional counter are evaluated and measured position and energy loss values are presented for comparison. Detector systems capable of the multiparameter measurements required for particle identification, background suppression and ray-tracing are described in order to illustrate the wide applicability of proportional counters within complex focal plane systems. Examples of the use of these counters other than with magnetic spectrographs are given in order to demonstrate their usefulness in not only nuclear physics but also in fields such as solid state physics, biology, and medicine. The influence of the new focal plane detector systems on future magnetic spectrograph designs is discussed. (Auth.)
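
    As a simple illustration of the charge-division principle mentioned above (a highly simplified sketch that ignores gain matching, noise and calibration):

```python
# simplified charge-division position decoding for an RC-line proportional counter:
# the event position along the resistive anode is estimated from the fraction of the
# total charge collected at one end
def position_from_charge_division(q_left: float, q_right: float, wire_length_mm: float) -> float:
    return wire_length_mm * q_right / (q_left + q_right)

print(position_from_charge_division(q_left=3.2, q_right=1.8, wire_length_mm=200.0))  # 72.0 mm from the left end
```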

  11. Seismic hazard assessment: Issues and alternatives

    Science.gov (United States)

    Wang, Z.

    2011-01-01

    Seismic hazard and risk are two very important concepts in engineering design and other policy considerations. Although seismic hazard and risk have often been used interchangeably, they are fundamentally different. Furthermore, seismic risk is more important in engineering design and other policy considerations. Seismic hazard assessment is an effort by earth scientists to quantify seismic hazard and its associated uncertainty in time and space and to provide seismic hazard estimates for seismic risk assessment and other applications. Although seismic hazard assessment is more a scientific issue, it deserves special attention because of its significant implication to society. Two approaches, probabilistic seismic hazard analysis (PSHA) and deterministic seismic hazard analysis (DSHA), are commonly used for seismic hazard assessment. Although PSHA has been proclaimed as the best approach for seismic hazard assessment, it is scientifically flawed (i.e., the physics and mathematics that PSHA is based on are not valid). Use of PSHA could lead to either unsafe or overly conservative engineering design or public policy, each of which has dire consequences to society. On the other hand, DSHA is a viable approach for seismic hazard assessment even though it has been labeled as unreliable. The biggest drawback of DSHA is that the temporal characteristics (i.e., earthquake frequency of occurrence and the associated uncertainty) are often neglected. An alternative, seismic hazard analysis (SHA), utilizes earthquake science and statistics directly and provides a seismic hazard estimate that can be readily used for seismic risk assessment and other applications. © 2010 Springer Basel AG.
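
    To make the probabilistic calculation concrete, the sketch below builds a toy PSHA-style hazard curve from a few hypothetical scenario sources; the rates, medians and sigmas are invented for illustration only and do not represent any real site.

```python
import numpy as np
from scipy.stats import norm

# toy PSHA-style hazard curve: annual rate of exceeding each ground-motion level,
# summed over a few scenario sources (hypothetical rates, medians and sigmas)
scenarios = [            # (annual rate, median PGA in g, sigma of ln PGA)
    (0.01,   0.10, 0.6),
    (0.002,  0.30, 0.6),
    (0.0005, 0.60, 0.6),
]

pga_levels = np.array([0.05, 0.10, 0.20, 0.40])
annual_rate = np.zeros_like(pga_levels)
for rate, median, sigma in scenarios:
    annual_rate += rate * norm.sf((np.log(pga_levels) - np.log(median)) / sigma)

# probability of exceedance in a 50-year exposure time under a Poisson occurrence model
print(dict(zip(pga_levels, 1.0 - np.exp(-annual_rate * 50.0))))
```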

  12. Atrial Fibrillation and Coronary Artery Disease as Risk Factors of Retinal Artery Occlusion: A Nationwide Population-Based Study

    Directory of Open Access Journals (Sweden)

    Ju-Chuan Yen

    2015-01-01

    Full Text Available We use the Taiwanese national health insurance research database (NHIRD) to investigate whether thrombolism (carotid artery disease (CAD) as a surrogate) or embolism (atrial fibrillation (AF) as a surrogate) plays roles in later retinal artery occlusion (RAO) development and examine their relative weights. The relative risks of RAO between AF and CAD patients and controls were compared by estimating the crude hazard ratio with logistic regression. Kaplan-Meier analysis was used to calculate the cumulative incidence rates of developing RAO, and a log-rank test was used to analyze the differences between the survival curves. Separate Cox proportional hazard regressions were done to compute the RAO-free rate after adjusting for possible confounding factors such as age and sex. The crude hazard ratios were 7.98 for the AF group and 5.27 for the CAD group, and the adjusted hazard ratios were 8.32 and 5.34 for the AF and CAD groups, respectively. The RAO-free observation time was shorter for the AF group compared with the CAD group (1490 versus 1819 days). AF and CAD were both risk factors for RAO with different hazard ratios. To tackle both AF and CAD is crucial for curbing RAO.

  13. Count rate effect in proportional counters

    International Nuclear Information System (INIS)

    Bednarek, B.

    1980-01-01

    A critical evaluation is presented of the current state of investigations into, and explanations of, the changes in resolution and pulse height that result in proportional counters from variations in radiation intensity. (author)

  14. X-ray proportional counter for the Viking Lander

    International Nuclear Information System (INIS)

    Glesius, F.L.; Kroon, J.C.; Castro, A.J.; Clark, B.C.

    1978-01-01

    A set of four sealed proportional counters with optimized energy response is employed in the X-ray fluorescence spectrometer units aboard the two Viking Landers. The instruments have provided quantitative elemental analyses of soil samples taken from the Martian surface. This paper discusses the design and development of these miniature proportional counters, and describes their performance on Mars

  15. 21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.

    Science.gov (United States)

    2010-04-01

    ... Control Point (HACCP) plan. 123.6 Section 123.6 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF... Provisions § 123.6 Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan. (a) Hazard... fish or fishery product being processed in the absence of those controls. (b) The HACCP plan. Every...

  16. Prediction of unwanted pregnancies using logistic regression, probit regression and discriminant analysis.

    Science.gov (United States)

    Ebrahimzadeh, Farzad; Hajizadeh, Ebrahim; Vahabi, Nasim; Almasian, Mohammad; Bakhteyar, Katayoon

    2015-01-01

    Unwanted pregnancy, not intended by at least one of the parents, has undesirable consequences for the family and society. In the present study, three classification models were used and compared to predict unwanted pregnancies in an urban population. In this cross-sectional study, 887 pregnant mothers attending health centers in Khorramabad, Iran, in 2012 were selected by stratified cluster sampling; relevant variables were measured, and logistic regression, discriminant analysis, and probit regression models (implemented in SPSS software version 21) were used to predict unwanted pregnancy. To compare these models, indicators such as sensitivity, specificity, the area under the ROC curve, and the percentage of correct predictions were used. The prevalence of unwanted pregnancies was 25.3%. The logistic and probit regression models indicated that parity, pregnancy spacing, contraceptive methods, household income and number of living male children were related to unwanted pregnancy. The performance of the models based on the area under the ROC curve was 0.735, 0.733, and 0.680 for logistic regression, probit regression, and linear discriminant analysis, respectively. Given the relatively high prevalence of unwanted pregnancies in Khorramabad, it seems necessary to revise family planning programs. Despite the similar accuracy of the models, if the researcher is interested in the interpretability of the results, the use of the logistic regression model is recommended.
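
    A small sketch of the model comparison described above on simulated data, assuming statsmodels and scikit-learn are available; the covariates are hypothetical stand-ins, not the Khorramabad survey variables.

```python
import numpy as np
import statsmodels.api as sm
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.metrics import roc_auc_score

# simulated data (hypothetical stand-ins for parity, pregnancy spacing and income)
rng = np.random.default_rng(8)
n = 887
X = rng.normal(size=(n, 3))
p = 1.0 / (1.0 + np.exp(-(-1.1 + X @ np.array([0.6, -0.4, 0.3]))))
y = rng.binomial(1, p)

Xc = sm.add_constant(X)
logit_fit = sm.Logit(y, Xc).fit(disp=0)
probit_fit = sm.Probit(y, Xc).fit(disp=0)
lda = LinearDiscriminantAnalysis().fit(X, y)

print("logistic AUC:    ", roc_auc_score(y, logit_fit.predict(Xc)))
print("probit AUC:      ", roc_auc_score(y, probit_fit.predict(Xc)))
print("discriminant AUC:", roc_auc_score(y, lda.predict_proba(X)[:, 1]))
```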

  17. Resilience to Interacting multi-natural hazards

    Science.gov (United States)

    Zhuo, Lu; Han, Dawei

    2016-04-01

    Conventional analyses of hazard assessment tend to focus on individual hazards in isolation. However, many parts of the world are usually affected by multiple natural hazards with the potential for interacting relationships. The understanding of such interactions, their impacts and the related uncertainties, are an important and topical area of research. Interacting multi-hazards may appear in different forms, including 1) CASCADING HAZARDS (a primary hazard triggering one or more secondary hazards such as an earthquake triggering landslides which may block river channels with dammed lakes and ensued floods), 2) CONCURRING HAZARDS (two or more primary hazards coinciding to trigger or exacerbate secondary hazards such as an earthquake and a rainfall event simultaneously creating landslides), and 3) ALTERING HAZARDS (a primary hazard increasing the probability of a secondary hazard occurring such as major earthquakes disturbing soil/rock materials by violent ground shaking which alter the regional patterns of landslides and debris flows in the subsequent years to come). All three types of interacting multi-hazards may occur in natural hazard prone regions, so it is important that research on hazard resilience should cover all of them. In the past decades, great progresses have been made in tackling disaster risk around the world. However, there are still many challenging issues to be solved, and the disasters over recent years have clearly demonstrated the inadequate resilience in our highly interconnected and interdependent systems. We have identified the following weaknesses and knowledge gaps in the current disaster risk management: 1) although our understanding in individual hazards has been greatly improved, there is a lack of sound knowledge about mechanisms and processes of interacting multi-hazards. Therefore, the resultant multi-hazard risk is often significantly underestimated with severe consequences. It is also poorly understood about the spatial and

  18. The intermediate endpoint effect in logistic and probit regression

    Science.gov (United States)

    MacKinnon, DP; Lockwood, CM; Brown, CH; Wang, W; Hoffman, JM

    2010-01-01

    Background An intermediate endpoint is hypothesized to be in the middle of the causal sequence relating an independent variable to a dependent variable. The intermediate variable is also called a surrogate or mediating variable and the corresponding effect is called the mediated, surrogate endpoint, or intermediate endpoint effect. Clinical studies are often designed to change an intermediate or surrogate endpoint and through this intermediate change influence the ultimate endpoint. In many intermediate endpoint clinical studies the dependent variable is binary, and logistic or probit regression is used. Purpose The purpose of this study is to describe a limitation of a widely used approach to assessing intermediate endpoint effects and to propose an alternative method, based on products of coefficients, that yields more accurate results. Methods The intermediate endpoint model for a binary outcome is described for a true binary outcome and for a dichotomization of a latent continuous outcome. Plots of true values and a simulation study are used to evaluate the different methods. Results Distorted estimates of the intermediate endpoint effect and incorrect conclusions can result from the application of widely used methods to assess the intermediate endpoint effect. The same problem occurs for the proportion of an effect explained by an intermediate endpoint, which has been suggested as a useful measure for identifying intermediate endpoints. A solution to this problem is given based on the relationship between latent variable modeling and logistic or probit regression. Limitations More complicated intermediate variable models are not addressed in the study, although the methods described in the article can be extended to these more complicated models. Conclusions Researchers are encouraged to use an intermediate endpoint method based on the product of regression coefficients. A common method based on difference in coefficient methods can lead to distorted
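
    A minimal sketch of the product-of-coefficients idea recommended above, on simulated data with a probit outcome model; the coefficient values are arbitrary, and the code only shows how the two paths are estimated and multiplied, not the full latent-variable treatment developed in the paper.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(9)
n = 2000
treatment = rng.binomial(1, 0.5, n)
mediator = 0.5 * treatment + rng.normal(size=n)               # intermediate endpoint
latent = 0.7 * mediator + 0.2 * treatment + rng.normal(size=n)
outcome = (latent > 0).astype(int)                            # observed binary endpoint

# a: effect of treatment on the intermediate endpoint (linear regression)
a = sm.OLS(mediator, sm.add_constant(treatment)).fit().params[1]

# b: effect of the intermediate endpoint on the outcome, adjusting for treatment (probit)
Xmat = sm.add_constant(np.column_stack([mediator, treatment]))
b = sm.Probit(outcome, Xmat).fit(disp=0).params[1]

print("product-of-coefficients estimate of the intermediate endpoint effect:", a * b)
```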

  19. Hazardous materials and waste management a guide for the professional hazards manager

    CERN Document Server

    Cheremisinoff, Nicholas P

    1995-01-01

    The management of hazardous materials and industrial wastes is complex, requiring a high degree of knowledge over very broad technical and legal subject areas. Hazardous wastes and materials are diverse, with compositions and properties that not only vary significantly between industries, but within industries, and indeed within the complexity of single facilities. Proper management not only requires an understanding of the numerous and complex regulations governing hazardous materials and waste streams, but an understanding and knowledge of the treatment, post-treatment, and waste minimizatio

  20. Proportional feedback control of laminar flow over a hemisphere

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jung Il [Dept. of Mechanical Engineering, Ajou University, Suwon (Korea, Republic of); Son, Dong Gun [Severe Accident and PHWR Safety Research Division, Korea Atomic Energy Research Institute (KAERI), Daejeon (Korea, Republic of)

    2016-08-15

    In the present study, we perform proportional feedback control of laminar flow over a hemisphere at Re = 300 to reduce its lift fluctuations by attenuating the strength of the vortex shedding. As a control input, blowing/suction is distributed on the surface of the hemisphere upstream of separation, and its strength is linearly proportional to the transverse velocity at a sensing location on the centerline of the wake. The sensing location is determined based on a correlation function between the lift force and the time derivative of the sensing velocity. The optimal proportional gains for the proportional control are obtained for the sensing locations considered. The present control successfully attenuates the velocity fluctuations at the sensing location and the three-dimensional vortical structures in the wake, resulting in a reduction of the lift fluctuations of the hemisphere.
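
    The toy model below is not the paper's flow simulation; it only illustrates why velocity-proportional feedback attenuates fluctuations: feeding back the sensed velocity acts as added damping on an oscillator that stands in for the lift fluctuation. All coefficients are invented for illustration.

```python
import numpy as np

# toy surrogate for the shedding dynamics: a lightly damped oscillator standing in for
# the lift fluctuation; the control input is proportional to the "sensed" velocity
dt, n_steps = 1e-3, 20000
omega, zeta = 2.0 * np.pi * 0.8, 0.005

def peak_late_amplitude(gain):
    x, v = 1.0, 0.0                       # fluctuation state and its rate of change
    peak = 0.0
    for i in range(n_steps):
        control = -gain * v               # blowing/suction proportional to sensed velocity
        a = -2.0 * zeta * omega * v - omega**2 * x + control
        v += a * dt
        x += v * dt
        if i > n_steps // 2:
            peak = max(peak, abs(x))
    return peak

for K in (0.0, 0.5, 2.0):
    print(f"gain {K}: late-time fluctuation amplitude {peak_late_amplitude(K):.4f}")
```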

  1. Hazards in the chemical laboratory

    International Nuclear Information System (INIS)

    Bretherick, L.

    1987-01-01

    The contents of this book are: Preface; Introduction; Health and Safety at Work Act 1974; Safety Planning and Management; Fire Protection; Reactive Chemical Hazards; Chemical Hazards and Toxicology; Health Care and First Aid; Hazardous Chemicals; Precautions against Radiations; and An American View

  2. Correcting for binomial measurement error in predictors in regression with application to analysis of DNA methylation rates by bisulfite sequencing.

    Science.gov (United States)

    Buonaccorsi, John; Prochenka, Agnieszka; Thoresen, Magne; Ploski, Rafal

    2016-09-30

    Motivated by a genetic application, this paper addresses the problem of fitting regression models when the predictor is a proportion measured with error. While the problem of dealing with additive measurement error in fitting regression models has been extensively studied, the problem where the additive error is of a binomial nature has not been addressed. The measurement errors here are heteroscedastic for two reasons: dependence on the underlying true value and changing sampling effort over observations. While some of the previously developed methods for treating additive measurement error with heteroscedasticity can be used in this setting, other methods need modification. A new version of simulation extrapolation is developed, and we also explore a variation on the standard regression calibration method that uses a beta-binomial model based on the fact that the true value is a proportion. Although most of the methods introduced here can be used for fitting non-linear models, this paper will focus primarily on their use in fitting a linear model. While previous work has focused mainly on estimation of the coefficients, we will, with motivation from our example, also examine estimation of the variance around the regression line. In addressing these problems, we also discuss the appropriate manner in which to bootstrap for both inferences and bias assessment. The various methods are compared via simulation, and the results are illustrated using our motivating data, for which the goal is to relate the methylation rate of a blood sample to the age of the individual providing the sample. Copyright © 2016 John Wiley & Sons, Ltd.
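
    The sketch below only demonstrates the problem the paper addresses, namely attenuation when a binomially observed proportion is used as a predictor; it is not the paper's simulation extrapolation or regression calibration correction, and all numbers are simulated.

```python
import numpy as np

# the predictor is a true proportion (e.g. a methylation rate) observed only through a
# binomial count whose depth varies between samples
rng = np.random.default_rng(10)
n = 2000
true_rate = rng.beta(2, 5, n)
depth = rng.integers(5, 60, n)                           # sampling effort per observation
observed_rate = rng.binomial(depth, true_rate) / depth   # error-prone, heteroscedastic

age = 20.0 + 60.0 * true_rate + rng.normal(0.0, 3.0, n)  # outcome depends on the true rate

slope_true = np.polyfit(true_rate, age, 1)[0]
slope_naive = np.polyfit(observed_rate, age, 1)[0]
print("slope using the true proportion:       ", round(slope_true, 2))
print("slope using the error-prone proportion:", round(slope_naive, 2))   # attenuated toward zero
```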

  3. A Bayesian goodness of fit test and semiparametric generalization of logistic regression with measurement data.

    Science.gov (United States)

    Schörgendorfer, Angela; Branscum, Adam J; Hanson, Timothy E

    2013-06-01

    Logistic regression is a popular tool for risk analysis in medical and population health science. With continuous response data, it is common to create a dichotomous outcome for logistic regression analysis by specifying a threshold for positivity. Fitting a linear regression to the nondichotomized response variable assuming a logistic sampling model for the data has been empirically shown to yield more efficient estimates of odds ratios than ordinary logistic regression of the dichotomized endpoint. We illustrate that risk inference is not robust to departures from the parametric logistic distribution. Moreover, the model assumption of proportional odds is generally not satisfied when the condition of a logistic distribution for the data is violated, leading to biased inference from a parametric logistic analysis. We develop novel Bayesian semiparametric methodology for testing goodness of fit of parametric logistic regression with continuous measurement data. The testing procedures hold for any cutoff threshold and our approach simultaneously provides the ability to perform semiparametric risk estimation. Bayes factors are calculated using the Savage-Dickey ratio for testing the null hypothesis of logistic regression versus a semiparametric generalization. We propose a fully Bayesian and a computationally efficient empirical Bayesian approach to testing, and we present methods for semiparametric estimation of risks, relative risks, and odds ratios when parametric logistic regression fails. Theoretical results establish the consistency of the empirical Bayes test. Results from simulated data show that the proposed approach provides accurate inference irrespective of whether parametric assumptions hold or not. Evaluation of risk factors for obesity shows that different inferences are derived from an analysis of a real data set when deviations from a logistic distribution are permissible in a flexible semiparametric framework. © 2013, The International Biometric

  4. St. Louis area earthquake hazards mapping project; seismic and liquefaction hazard maps

    Science.gov (United States)

    Cramer, Chris H.; Bauer, Robert A.; Chung, Jae-won; Rogers, David; Pierce, Larry; Voigt, Vicki; Mitchell, Brad; Gaunt, David; Williams, Robert; Hoffman, David; Hempen, Gregory L.; Steckel, Phyllis; Boyd, Oliver; Watkins, Connor M.; Tucker, Kathleen; McCallister, Natasha

    2016-01-01

    We present probabilistic and deterministic seismic and liquefaction hazard maps for the densely populated St. Louis metropolitan area that account for the expected effects of surficial geology on earthquake ground shaking. Hazard calculations were based on a map grid of 0.005°, or about every 500 m, and are thus higher in resolution than any earlier studies. To estimate ground motions at the surface of the model (e.g., site amplification), we used a new detailed near-surface shear-wave velocity model in a 1D equivalent-linear response analysis. When compared with the 2014 U.S. Geological Survey (USGS) National Seismic Hazard Model, which uses a uniform firm-rock-site condition, the new probabilistic seismic-hazard estimates document much more variability. Hazard levels for upland sites (consisting of bedrock and weathered bedrock overlain by loess-covered till and drift deposits) show up to twice the ground-motion values for peak ground acceleration (PGA), and similar ground-motion values for 1.0 s spectral acceleration (SA). Probabilistic ground-motion levels for lowland alluvial floodplain sites (generally the 20–40-m-thick modern Mississippi and Missouri River floodplain deposits overlying bedrock) exhibit up to twice the ground-motion levels for PGA, and up to three times the ground-motion levels for 1.0 s SA. Liquefaction probability curves were developed from available standard penetration test data assuming typical lowland and upland water table levels. A simplified liquefaction hazard map was created from the 5%-in-50-year probabilistic ground-shaking model. The liquefaction hazard ranges from low in the uplands to high in the lowlands, where more than 60% of the area is expected to liquefy. Because many transportation routes, power and gas transmission lines, and population centers exist in or on the highly susceptible lowland alluvium, these areas in the St. Louis region are at significant potential risk from seismically induced liquefaction and associated

  5. Hazardous solvent substitution

    International Nuclear Information System (INIS)

    Twitchell, K.E.

    1995-01-01

    This article is an overview of efforts at INEL to reduce the generation of hazardous wastes through the elimination of hazardous solvents. To aid in their efforts, a number of databases have been developed and will become a part of an Integrated Solvent Substitution Data System. This latter data system will be accessible through Internet

  6. Modified hazard ranking system for sites with mixed radioactive and hazardous wastes. User manual

    International Nuclear Information System (INIS)

    Hawley, K.A.; Peloquin, R.A.; Stenner, R.D.

    1986-04-01

    This document describes both the original Hazard Ranking System and the modified Hazard Ranking System as they are to be used in evaluating the relative potential for uncontrolled hazardous substance facilities to cause human health or safety problems or ecological or environmental damage. Detailed instructions for using the mHRS/HRS computer code are provided, along with instructions for performing the calculations by hand. Uniform application of the ranking system will permit the DOE to identify those releases of hazardous substances that pose the greatest hazard to humans or the environment. However, the mHRS/HRS by itself cannot establish priorities for the allocation of funds for remedial action. The mHRS/HRS is a means for applying uniform technical judgment regarding the potential hazards presented by a facility relative to other facilities. It does not address the feasibility, desirability, or degree of cleanup required. Neither does it deal with the readiness or ability of a state to carry out such remedial action, as may be indicated, or to meet other conditions prescribed in CERCLA. 13 refs., 13 figs., 27 tabs

  7. Modified hazard ranking system for sites with mixed radioactive and hazardous wastes. User manual.

    Energy Technology Data Exchange (ETDEWEB)

    Hawley, K.A.; Peloquin, R.A.; Stenner, R.D.

    1986-04-01

    This document describes both the original Hazard Ranking System and the modified Hazard Ranking System as they are to be used in evaluating the relative potential for uncontrolled hazardous substance facilities to cause human health or safety problems or ecological or environmental damage. Detailed instructions for using the mHRS/HRS computer code are provided, along with instructions for performing the calculations by hand. Uniform application of the ranking system will permit the DOE to identify those releases of hazardous substances that pose the greatest hazard to humans or the environment. However, the mHRS/HRS by itself cannot establish priorities for the allocation of funds for remedial action. The mHRS/HRS is a means for applying uniform technical judgment regarding the potential hazards presented by a facility relative to other facilities. It does not address the feasibility, desirability, or degree of cleanup required. Neither does it deal with the readiness or ability of a state to carry out such remedial action, as may be indicated, or to meet other conditions prescribed in CERCLA. 13 refs., 13 figs., 27 tabs.

  8. Hazardous solvent substitution

    International Nuclear Information System (INIS)

    Twitchell, K.E.

    1995-01-01

    Eliminating hazardous solvents is good for the environment, worker safety, and the bottom line. However, even though we are motivated to find replacements, the big question is 'What can we use as replacements for hazardous solvents?' You, too, can find replacements for your hazardous solvents. All you have to do is search for them. Search through the vendor literature of hundreds of companies with thousands of products. Ponder the associated material safety data sheets, assuming of course that you can obtain them and, having obtained them, that you can read them. You will want to search the trade magazines and other sources for product reviews. You will want to talk to users about how well the product actually works. You may also want to check US Environmental Protection Agency (EPA) and other government reports for toxicity and other safety information. And, of course, you will want to compare the product's constituent chemicals with the many hazardous constituent lists to ensure the safe and legal use of the product in your workplace

  9. Saving Money Using Proportional Reasoning

    Science.gov (United States)

    de la Cruz, Jessica A.; Garney, Sandra

    2016-01-01

    It is beneficial for students to discover intuitive strategies, as opposed to the teacher presenting strategies to them. Certain proportional reasoning tasks are more likely to elicit intuitive strategies than other tasks. The strategies that students are apt to use when approaching a task, as well as the likelihood of a student's success or…

  10. Radon and its hazards

    International Nuclear Information System (INIS)

    Chang Guilan

    2002-01-01

    The author describes the basic physical and chemical properties of radon and its emanation, introduces methods of radon measurement, and uses specific data and examples to explain the hazards of non-mine radon accumulation to human health, the corresponding protection, and the history of how these hazards came to be recognized. Finally, protective measures are proposed to avoid the hazards of radon to human health and to support ecological evaluation of environments

  11. Industrial hazard and safety handbook

    CERN Document Server

    King, Ralph W

    1979-01-01

    Industrial Hazard and Safety Handbook (Revised Impression) describes and exposes the main hazards found in industry, with emphasis on how these hazards arise, are ignored, are identified, are eliminated, or are controlled. These hazard conditions can be due to human stresses (for example, insomnia), unsatisfactory working environments, as well as secret industrial processes. The book reviews the cost of accidents, human factors, inspections, insurance, legal aspects, planning for major emergencies, organization, and safety measures. The text discusses regulations, codes of practice, site layou

  12. Global Landslide Hazard Distribution

    Data.gov (United States)

    National Aeronautics and Space Administration — Global Landslide Hazard Distribution is a 2.5 minute grid of global landslide and snow avalanche hazards based upon work of the Norwegian Geotechnical Institute...

  13. Logistic Regression for Seismically Induced Landslide Predictions: Using Uniform Hazard and Geophysical Layers as Predictor Variables

    Science.gov (United States)

    Nowicki, M. A.; Hearne, M.; Thompson, E.; Wald, D. J.

    2012-12-01

    Seismically induced landslides present a costly and often fatal threat in many mountainous regions. Substantial effort has been invested to understand where seismically induced landslides may occur in the future. Both slope-stability methods and, more recently, statistical approaches to the problem are described throughout the literature. Though some regional efforts have succeeded, no uniformly agreed-upon method is available for predicting the likelihood and spatial extent of seismically induced landslides. For use in the U. S. Geological Survey (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) system, we would like to routinely make such estimates, in near-real time, around the globe. Here we use the recently produced USGS ShakeMap Atlas of historic earthquakes to develop an empirical landslide probability model. We focus on recent events, yet include any digitally-mapped landslide inventories for which well-constrained ShakeMaps are also available. We combine these uniform estimates of the input shaking (e.g., peak acceleration and velocity) with broadly available susceptibility proxies, such as topographic slope and surface geology. The resulting database is used to build a predictive model of the probability of landslide occurrence with logistic regression. The landslide database includes observations from the Northridge, California (1994); Wenchuan, China (2008); ChiChi, Taiwan (1999); and Chuetsu, Japan (2004) earthquakes; we also provide ShakeMaps for moderate-sized events without landslides for proper model testing and training. The performance of the regression model is assessed with both statistical goodness-of-fit metrics and a qualitative review of whether or not the model is able to capture the spatial extent of landslides for each event. Part of our goal is to determine which variables can be employed based on globally-available data or proxies, and whether or not modeling results from one region are transferable to
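
    As a minimal illustration of the modeling step described above (not the USGS PAGER model itself), the sketch below fits a logistic regression of landslide occurrence on two assumed predictors, peak ground acceleration and topographic slope, using synthetic data and the scikit-learn library.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Illustrative sketch only: map shaking and susceptibility proxies to landslide
# probability. Predictor names (pga, slope) and the synthetic data are
# assumptions for demonstration, not the study's database.
rng = np.random.default_rng(1)
n = 2000
pga = rng.uniform(0.05, 1.0, n)          # peak ground acceleration (g)
slope = rng.uniform(0.0, 45.0, n)        # topographic slope (degrees)
logit = -6.0 + 4.0 * pga + 0.12 * slope  # assumed "true" relationship
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

X = np.column_stack([pga, slope])
model = LogisticRegression().fit(X, y)

# Predicted landslide probability for one grid cell with PGA = 0.4 g, slope = 30 deg.
p = model.predict_proba([[0.4, 30.0]])[0, 1]
print(f"predicted landslide probability: {p:.2f}")
```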

  14. Analysis of root causes of major hazard precursors (hydrocarbon leaks) in the Norwegian offshore petroleum industry

    International Nuclear Information System (INIS)

    Vinnem, Jan Erik; Hestad, Jon Andreas; Kvaloy, Jan Terje; Skogdalen, Jon Espen

    2010-01-01

    The offshore petroleum industry in Norway reports major hazard precursors to the authorities, and data are available for the period 1996 through 2009. Barrier data have been reported since 2002, as have data from an extensive questionnaire survey covering working environment, organizational culture and perceived risk among all employees on offshore installations. Several attempts have been made to analyse different data sources in order to discover relations that may cast some light on possible root causes of major hazard precursors. These previous attempts were inconclusive. The study presented in this paper is the most extensive study performed so far. The data were analysed using linear regression. The conclusion is that there are significant correlations between number of leaks and safety climate indicators. The discussion points to possible root causes of major accidents.

  15. Natural Hazards, Second Edition

    Science.gov (United States)

    Rouhban, Badaoui

    Natural disaster loss is on the rise, and the vulnerability of the human and physical environment to the violent forces of nature is increasing. In many parts of the world, disasters caused by natural hazards such as earthquakes, floods, landslides, drought, wildfires, intense windstorms, tsunami, and volcanic eruptions have caused the loss of human lives, injury, homelessness, and the destruction of economic and social infrastructure. Over the last few years, there has been an increase in the occurrence, severity, and intensity of disasters, culminating with the devastating tsunami of 26 December 2004 in South East Asia. Natural hazards are often unexpected or uncontrollable natural events of varying magnitude. Understanding their mechanisms and assessing their distribution in time and space are necessary for refining risk mitigation measures. This second edition of Natural Hazards (following a first edition published in 1991 by Cambridge University Press), written by Edward Bryant, associate dean of science at Wollongong University, Australia, grapples with this crucial issue, along with aspects of hazard prediction and related questions. The book presents a comprehensive analysis of different categories of hazards of climatic and geological origin.

  16. Regression and regression analysis time series prediction modeling on climate data of quetta, pakistan

    International Nuclear Information System (INIS)

    Jafri, Y.Z.; Kamal, L.

    2007-01-01

    Various statistical techniques were used on five-year (1998-2002) data for average humidity, rainfall, and maximum and minimum temperatures. Regression analysis time series (RATS) relationships were developed to determine the overall trend of these climate parameters, on the basis of which forecast models can be corrected and modified. We computed the coefficient of determination as a measure of goodness of fit for our polynomial regression analysis time series (PRATS). Multiple linear regression (MLR) and multiple linear regression analysis time series (MLRATS) correlations were also developed for deciphering the interdependence of weather parameters. Spearman's rank correlation and the Goldfeld-Quandt test were used to check the uniformity or non-uniformity of variances in our fit to polynomial regression (PR). The Breusch-Pagan test was applied to MLR and MLRATS, respectively, which yielded homoscedasticity. We also employed Bartlett's test for homogeneity of variances on the five-year rainfall and humidity data, which showed that the variances in the rainfall data were not homogeneous, while those for humidity were. Our results on regression and regression analysis time series show the best fit to prediction modeling on the climatic data of Quetta, Pakistan. (author)
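
    A minimal sketch of one step reported above, polynomial trend fitting with the coefficient of determination as the goodness-of-fit measure; the monthly humidity series below is synthetic, not the Quetta record.

```python
import numpy as np

# Sketch of a polynomial regression analysis time series (PRATS-style) fit with
# R^2 as goodness of fit. The 60-month humidity series is a synthetic stand-in.
t = np.arange(60)                                   # 60 monthly observations
humidity = (45 + 0.05 * t + 8 * np.sin(2 * np.pi * t / 12)
            + np.random.default_rng(2).normal(0, 2, t.size))

degree = 3
coeffs = np.polyfit(t, humidity, degree)            # polynomial trend fit
fitted = np.polyval(coeffs, t)

ss_res = np.sum((humidity - fitted) ** 2)
ss_tot = np.sum((humidity - humidity.mean()) ** 2)
r_squared = 1.0 - ss_res / ss_tot                   # coefficient of determination
print(f"degree-{degree} fit, R^2 = {r_squared:.3f}")
```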

  17. Aeromedical Hazard Comparison of FAA Medically Certified Third-Class and Medically Uncertified Pilots.

    Science.gov (United States)

    Ricaurte, Eduard M; Mills, William D; DeJohn, Charles A; Laverde-Lopez, Maria C; Porras-Sanchez, Daniel F

    2016-07-01

    Since 2004, in the United States, light sport aircraft (LSA) and some aircraft with standard airworthiness certificates can be operated for recreational purposes with a valid state driver's license rather than a Federal Aviation Administration (FAA)-issued aeromedical certificate. There have been recent efforts to allow operation of much larger, heavier, faster, and more complex aircraft without requiring a medical certificate. The primary objective of this research was to compare hazards to flight safety identified in fatally injured pilots required to possess a valid FAA third-class medical certificate to hazards in fatally injured pilots who were not required to possess a valid medical certificate. A search of all fatal U.S. aircraft accidents in the FAA Medical ANalysis and TRAcking (MANTRA) registry between January 1, 2011, and April 30, 2014, identified 1084 individuals. A review of accident pilots' medical, autopsy, and toxicological data was conducted. After applying exclusion criteria, 467 pilots remained, including 403 medically certified and 64 medically uncertified pilots. A significant difference was found in a surrogate measure for risk between medically certified and uncertified pilots (25% vs. 59%). This difference remained significant after adjustment for age. No significant difference was found in the proportions of hazards identified on toxicological review. The results of this study suggest that the risk of an adverse medical event is reduced in pilots required to possess a valid medical certificate. Ricaurte EM, Mills WD, DeJohn CA, Laverde-Lopez MC, Porras-Sanchez DF. Aeromedical hazard comparison of FAA medically certified third-class and medically uncertified pilots. Aerosp Med Hum Perform. 2016; 87(7):618-621.

  18. Linear regression in astronomy. I

    Science.gov (United States)

    Isobe, Takashi; Feigelson, Eric D.; Akritas, Michael G.; Babu, Gutti Jogesh

    1990-01-01

    Five methods for obtaining linear regression fits to bivariate data with unknown or insignificant measurement errors are discussed: ordinary least-squares (OLS) regression of Y on X, OLS regression of X on Y, the bisector of the two OLS lines, orthogonal regression, and 'reduced major-axis' regression. These methods have been used by various researchers in observational astronomy, most importantly in cosmic distance scale applications. Formulas for calculating the slope and intercept coefficients and their uncertainties are given for all the methods, including a new general form of the OLS variance estimates. The accuracy of the formulas was confirmed using numerical simulations. The applicability of the procedures is discussed with respect to their mathematical properties, the nature of the astronomical data under consideration, and the scientific purpose of the regression. It is found that, for problems needing symmetrical treatment of the variables, the OLS bisector performs significantly better than orthogonal or reduced major-axis regression.
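
    The slope formulas for three of the five methods can be written compactly; the sketch below computes OLS(Y|X), OLS(X|Y), and the OLS bisector on synthetic data, omitting the paper's uncertainty estimates.

```python
import numpy as np

def ols_slopes(x, y):
    """Return the OLS(Y|X) slope, the OLS(X|Y) slope (expressed in the X-Y
    plane), and the OLS bisector slope as given by Isobe et al. (1990)."""
    xm, ym = x - x.mean(), y - y.mean()
    b1 = np.sum(xm * ym) / np.sum(xm * xm)      # OLS(Y|X)
    b2 = np.sum(ym * ym) / np.sum(xm * ym)      # OLS(X|Y), as dY/dX
    b3 = (b1 * b2 - 1.0 + np.sqrt((1.0 + b1**2) * (1.0 + b2**2))) / (b1 + b2)
    return b1, b2, b3

# Synthetic bivariate data with intrinsic scatter (no measurement errors).
rng = np.random.default_rng(3)
x = rng.normal(0.0, 1.0, 500)
y = 2.0 * x + rng.normal(0.0, 1.0, 500)

b1, b2, b3 = ols_slopes(x, y)
print(f"OLS(Y|X) = {b1:.2f}, OLS(X|Y) = {b2:.2f}, bisector = {b3:.2f}")
```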

  19. Logic regression and its extensions.

    Science.gov (United States)

    Schwender, Holger; Ruczinski, Ingo

    2010-01-01

    Logic regression is an adaptive classification and regression procedure, initially developed to reveal interacting single nucleotide polymorphisms (SNPs) in genetic association studies. In general, this approach can be used in any setting with binary predictors, when the interaction of these covariates is of primary interest. Logic regression searches for Boolean (logic) combinations of binary variables that best explain the variability in the outcome variable, and thus, reveals variables and interactions that are associated with the response and/or have predictive capabilities. The logic expressions are embedded in a generalized linear regression framework, and thus, logic regression can handle a variety of outcome types, such as binary responses in case-control studies, numeric responses, and time-to-event data. In this chapter, we provide an introduction to the logic regression methodology, list some applications in public health and medicine, and summarize some of the direct extensions and modifications of logic regression that have been proposed in the literature. Copyright © 2010 Elsevier Inc. All rights reserved.
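
    Logic regression itself searches adaptively over Boolean trees; the sketch below only illustrates scoring a single hand-picked candidate expression, (X1 AND X2) OR X3, inside an ordinary logistic GLM on synthetic binary predictors, so it is a simplification of the method, not the published algorithm.

```python
import numpy as np
import statsmodels.api as sm

# Illustration only: score one candidate Boolean combination of binary
# predictors within a logistic GLM. Variable names and data are synthetic.
rng = np.random.default_rng(4)
X = rng.binomial(1, 0.3, size=(1000, 3))             # three binary SNP-like predictors
true_feature = (X[:, 0] & X[:, 1]) | X[:, 2]
y = rng.binomial(1, np.where(true_feature == 1, 0.7, 0.2))

candidate = ((X[:, 0] & X[:, 1]) | X[:, 2]).astype(float)
design = sm.add_constant(candidate)
fit = sm.GLM(y, design, family=sm.families.Binomial()).fit()
print(fit.summary())           # deviance/AIC could be used to rank candidate expressions
```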

  20. Carbon Structure Hazard Control

    Science.gov (United States)

    Yoder, Tommy; Greene, Ben; Porter, Alan

    2015-01-01

    Carbon composite structures are widely used in virtually all advanced technology industries for a multitude of applications. The high strength-to-weight ratio and resistance to aggressive service environments make them highly desirable. Automotive, aerospace, and petroleum industries extensively use, and will continue to use, this enabling technology. As a result of this broad range of use, field and test personnel are increasingly exposed to hazards associated with these structures. No single published document exists to address the hazards and make recommendations for the hazard controls required for the different exposure possibilities from damaged structures including airborne fibers, fly, and dust. The potential for personnel exposure varies depending on the application or manipulation of the structure. The effect of exposure to carbon hazards is not limited to personnel, protection of electronics and mechanical equipment must be considered as well. The various exposure opportunities defined in this document include pre-manufacturing fly and dust, the cured structure, manufacturing/machining, post-event cleanup, and post-event test and/or evaluation. Hazard control is defined as it is applicable or applied for the specific exposure opportunity. The carbon exposure hazard includes fly, dust, fiber (cured/uncured), and matrix vapor/thermal decomposition products. By using the recommendations in this document, a high level of confidence can be assured for the protection of personnel and equipment.

  1. 75 FR 73972 - Hazardous Waste Management System; Identification and Listing of Hazardous Waste; Removal of...

    Science.gov (United States)

    2010-11-30

    ... Waste Management System; Identification and Listing of Hazardous Waste; Removal of Direct Final.... Lists of Subjects in 40 CFR Part 261 Environmental Protection, Hazardous waste, Recycling, Reporting and... follows: PART 261--IDENTIFICATION AND LISTING OF HAZARDOUS WASTE 0 1. The authority citation for part 261...

  2. Tumor regression patterns in retinoblastoma

    International Nuclear Information System (INIS)

    Zafar, S.N.; Siddique, S.N.; Zaheer, N.

    2016-01-01

    To observe the types of tumor regression after treatment, and identify the common pattern of regression in our patients. Study Design: Descriptive study. Place and Duration of Study: Department of Pediatric Ophthalmology and Strabismus, Al-Shifa Trust Eye Hospital, Rawalpindi, Pakistan, from October 2011 to October 2014. Methodology: Children with unilateral and bilateral retinoblastoma were included in the study. Patients were referred to Pakistan Institute of Medical Sciences, Islamabad, for chemotherapy. After every cycle of chemotherapy, dilated fundus examination under anesthesia was performed to record the response to treatment. Regression patterns were recorded on RetCam II. Results: Seventy-four tumors were included in the study. Out of 74 tumors, 3 were ICRB group A tumors, 43 were ICRB group B tumors, 14 tumors belonged to ICRB group C, and the remaining 14 were ICRB group D tumors. Type IV regression was seen in 39.1% (n=29) of tumors, type II in 29.7% (n=22), type III in 25.6% (n=19), and type I in 5.4% (n=4). All group A tumors (100%) showed type IV regression. Seventeen (39.5%) group B tumors showed type IV regression. In group C, 5 tumors (35.7%) showed type II regression and 5 tumors (35.7%) showed type IV regression. In group D, 6 tumors (42.9%) regressed to type II non-calcified remnants. Conclusion: The response and success of the focal and systemic treatment, as judged by the appearance of different patterns of tumor regression, varies with the ICRB grouping of the tumor. (author)

  3. Hazardous waste: cleanup and prevention

    Science.gov (United States)

    Vandas, Stephen; Cronin, Nancy L.; Farrar, Frank; Serrano, Guillermo Eliezer Ávila; Yajimovich, Oscar Efraín González; Muñoz, Aurora R.; Rivera, María del C.

    1996-01-01

    Our lifestyles are supported by complex industrial activities that produce many different chemicals and chemical wastes. The industries that produce our clothing, cars, medicines, paper, food, fuels, steel, plastics, and electric components use and discard thousands of chemicals every year. At home we may use lawn chemicals, solvents, disinfectants, cleaners, and auto products to improve our quality of life. A chemical that presents a threat or unreasonable risk to people or the environment is a hazardous material. When a hazardous material can no longer be used, it becomes a hazardous waste. Hazardous wastes come from a variety of sources, from both present and past activities. Impacts to human health and the environment can result from improper handling and disposal of hazardous waste.

  4. SRL process hazards review manual

    International Nuclear Information System (INIS)

    1980-08-01

    The principal objective of the Process Hazards Management Program is to provide a regular, systematic review of each process at the Savannah River Laboratory (SRL) to eliminate injuries and to minimize property damage resulting from process hazards of catastrophic potential. Management effort is directed, through the Du Pont Safety Program, toward those controls and practices that ensure this objective. The Process Hazards Management Program provides an additional dimension to further ensure the health and safety of employees and the public. Du Pont has concluded that an organized approach is essential to obtain an effective and efficient process hazards review. The intent of this manual is to provide guidance in creating such an organized approach to performing process hazards reviews on a continuing basis

  5. Hydrothermal Liquefaction Treatment Hazard Analysis Report

    Energy Technology Data Exchange (ETDEWEB)

    Lowry, Peter P. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Wagner, Katie A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-09-12

    Hazard analyses were performed to evaluate the modular hydrothermal liquefaction treatment system. The hazard assessment process was performed in two stages. An initial assessment utilizing Hazard Identification and Preliminary Hazards Analysis (PHA) techniques identified areas with significant or unique hazards (process safety-related hazards) that fall outside of the normal operating envelope of PNNL and warranted additional analysis. The subsequent assessment was based on a qualitative What-If analysis. The analysis was augmented, as necessary, by additional quantitative analysis for scenarios involving a release of hazardous material or energy with the potential for affecting the public. The following selected hazardous scenarios received increased attention: • For scenarios involving a release of hazardous material or energy, controls that prevent the occurrence or mitigate the effects of the release were identified in the What-If analysis table. • For scenarios with significant consequences that could impact personnel outside the immediate operations area, quantitative analyses were performed to determine the potential magnitude of the scenario. A set of “critical controls” was identified for these scenarios (see Section 4) that prevent the occurrence or mitigate the effects of events with significant consequences.

  6. Guidance Index for Shallow Landslide Hazard Analysis

    Directory of Open Access Journals (Sweden)

    Cheila Avalon Cullen

    2016-10-01

    Rainfall-induced shallow landslides are one of the most frequent hazards on sloping terrain. Intense storms with high-intensity and long-duration rainfall have high potential to trigger rapidly moving soil masses due to changes in pore water pressure and seepage forces. Nevertheless, regardless of the intensity and/or duration of the rainfall, shallow landslides are influenced by antecedent soil moisture conditions. To date, no system exists that dynamically interrelates these two factors on large scales. This work introduces a Shallow Landslide Index (SLI) as the first implementation of antecedent soil moisture conditions for the hazard analysis of shallow rainfall-induced landslides. The proposed mathematical algorithm is built using a logistic regression method that systematically learns from a comprehensive landslide inventory. Initially, root-soil moisture and rainfall measurements modeled from AMSR-E and TRMM, respectively, are used as proxies to develop the index. The input dataset is randomly divided into training and verification sets using the Hold-Out method. Validation results indicate that the best-fit model predicts the highest number of cases correctly at 93.2% accuracy. Subsequently, as AMSR-E and TRMM stopped working in October 2011 and April 2015, respectively, root-soil moisture and rainfall measurements modeled by SMAP and GPM are used to develop models that calculate the SLI for 10, 7, and 3 days. The resulting models indicate a strong relationship (78.7%, 79.6%, and 76.8%, respectively) between the predictors and the predicted value. The results also highlight important remaining challenges such as adequate information for algorithm functionality and satellite based data reliability. Nevertheless, the experimental system can potentially be used as a dynamic indicator of the total amount of antecedent moisture and rainfall (for a given duration of time) needed to trigger a shallow landslide in a susceptible area. It is
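
    The Hold-Out step described above can be illustrated with a toy stand-in: a logistic regression on two assumed predictors (antecedent soil moisture and accumulated rainfall) trained on a random split and scored on the held-out portion. The data below are synthetic, not AMSR-E/TRMM or SMAP/GPM retrievals.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Sketch of the Hold-Out validation step only; predictors and data are assumed.
rng = np.random.default_rng(5)
n = 3000
soil_moisture = rng.uniform(0.05, 0.45, n)     # volumetric root-zone soil moisture
rainfall = rng.gamma(2.0, 15.0, n)             # accumulated rainfall (mm)
logit = -8.0 + 12.0 * soil_moisture + 0.06 * rainfall
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

X = np.column_stack([soil_moisture, rainfall])
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

model = LogisticRegression().fit(X_train, y_train)
print(f"hold-out accuracy: {accuracy_score(y_test, model.predict(X_test)):.1%}")
```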

  7. Low-dose prednisolone in first-line docetaxel for patients with metastatic castration-resistant prostate cancer

    DEFF Research Database (Denmark)

    Kongsted, Per; Svane, Inge Marie; Lindberg, Henriette

    2015-01-01

    incidence of peripheral edema (32% vs. 15%, Pneutropenia in this group (25% vs. 10%, P...% CI: 0.76-1.26, P = 0.89, Cox proportional hazard regression model). CONCLUSIONS: Coadministration of low-dose P reduced the incidence of peripheral edema, grade 3 nonhematological toxicity, and the risk of being admitted owing to febrile neutropenia during treatment with D. Adjusted survival analysis...

  8. Hazardous Chemicals

    Centers for Disease Control (CDC) Podcasts

    2007-04-10

    Chemicals are a part of our daily lives, providing many products and modern conveniences. With more than three decades of experience, The Centers for Disease Control and Prevention (CDC) has been in the forefront of efforts to protect and assess people's exposure to environmental and hazardous chemicals. This report provides information about hazardous chemicals and useful tips on how to protect you and your family from harmful exposure.  Created: 4/10/2007 by CDC National Center for Environmental Health.   Date Released: 4/13/2007.

  9. Combining Alphas via Bounded Regression

    Directory of Open Access Journals (Sweden)

    Zura Kakushadze

    2015-11-01

    We give an explicit algorithm and source code for combining alpha streams via bounded regression. In practical applications, typically, there is insufficient history to compute a sample covariance matrix (SCM) for a large number of alphas. To compute alpha allocation weights, one then resorts to (weighted) regression over SCM principal components. Regression often produces alpha weights with insufficient diversification and/or skewed distribution against, e.g., turnover. This can be rectified by imposing bounds on alpha weights within the regression procedure. Bounded regression can also be applied to stock and other asset portfolio construction. We discuss illustrative examples.
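
    The record's bounded regression can be approximated with an off-the-shelf box-constrained least-squares solver; the sketch below is not the authors' algorithm, and the alpha matrix, return vector, and 10% per-alpha bound are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import lsq_linear

# Sketch of bounded regression for combining alpha streams: least-squares
# regression of realized returns on the alpha matrix subject to box bounds on
# the weights. Dimensions, data, and bounds are illustrative assumptions.
rng = np.random.default_rng(6)
n_obs, n_alphas = 250, 20
A = rng.normal(0.0, 1.0, size=(n_obs, n_alphas))   # alpha stream returns
b = A @ rng.uniform(0.0, 0.1, n_alphas) + rng.normal(0.0, 0.5, n_obs)

# Bounds prevent skewed, poorly diversified weight distributions.
res = lsq_linear(A, b, bounds=(0.0, 0.10))
weights = res.x / res.x.sum()                      # normalize to unit gross weight
print(np.round(weights, 3))
```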

  10. Semiparametric accelerated failure time cure rate mixture models with competing risks.

    Science.gov (United States)

    Choi, Sangbum; Zhu, Liang; Huang, Xuelin

    2018-01-15

    Modern medical treatments have substantially improved survival rates for many chronic diseases and have generated considerable interest in developing cure fraction models for survival data with a non-ignorable cured proportion. Statistical analysis of such data may be further complicated by competing risks that involve multiple types of endpoints. Regression analysis of competing risks is typically undertaken via a proportional hazards model adapted on cause-specific hazard or subdistribution hazard. In this article, we propose an alternative approach that treats competing events as distinct outcomes in a mixture. We consider semiparametric accelerated failure time models for the cause-conditional survival function that are combined through a multinomial logistic model within the cure-mixture modeling framework. The cure-mixture approach to competing risks provides a means to determine the overall effect of a treatment and insights into how this treatment modifies the components of the mixture in the presence of a cure fraction. The regression and nonparametric parameters are estimated by a nonparametric kernel-based maximum likelihood estimation method. Variance estimation is achieved through resampling methods for the kernel-smoothed likelihood function. Simulation studies show that the procedures work well in practical settings. Application to a sarcoma study demonstrates the use of the proposed method for competing risk data with a cure fraction. Copyright © 2017 John Wiley & Sons, Ltd.

  11. Gastroesophageal reflux disease and atrial fibrillation: a nationwide population-based study.

    Directory of Open Access Journals (Sweden)

    Chin-Chou Huang

    OBJECTIVES: Precise mechanisms of atrial fibrillation (AF) are uncertain, but their association with esophageal disorders has been recently proposed. The association between gastroesophageal reflux disease (GERD), the most common gastroesophageal disorder, and AF remains undetermined. We therefore aimed to investigate the association between GERD and later development of AF. METHODS AND RESULTS: Patients with GERD were identified from the 1,000,000-person cohort dataset sampled from the Taiwan National Health Insurance database. The study cohort comprised 29,688 newly diagnosed adult GERD patients; 29,597 randomly selected age-, gender-, and comorbidity-matched subjects comprised the comparison cohort. Cox proportional hazard regressions were performed as a means of comparing the AF-free survival rate for the two cohorts. During a maximum three years of follow-up, a total of 351 patients experienced AF, including 184 (0.62%) patients in the GERD cohort and 167 (0.56%) in the control group. The log-rank test showed that patients with GERD had a significantly higher incidence of AF than those without GERD (p = 0.024). After Cox proportional hazard regression model analysis, GERD was independently associated with the increased risk of AF (hazard ratio, 1.31; 95% confidence interval, 1.06-1.61; p = 0.013). CONCLUSION: GERD was independently associated with an increased risk of future AF in a nationwide population-based cohort.
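
    A minimal sketch of the Cox proportional hazards step reported above, assuming the lifelines package and a simulated cohort (follow-up time, AF event indicator, GERD exposure, age) in place of the Taiwan NHI data.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

# Simulated cohort as a stand-in for the registry data; effect sizes are assumed.
rng = np.random.default_rng(7)
n = 5000
gerd = rng.binomial(1, 0.5, n)
age = rng.normal(55, 10, n)
hazard = 0.02 * np.exp(0.27 * gerd + 0.03 * (age - 55))   # assumed true hazard
time = rng.exponential(1.0 / hazard)
event = (time <= 3.0).astype(int)          # AF observed within 3 years of follow-up
time = np.minimum(time, 3.0)               # administrative censoring at 3 years

df = pd.DataFrame({"time": time, "event": event, "gerd": gerd, "age": age})
cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")
print(np.exp(cph.params_))                 # hazard ratios for GERD and age
```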

  12. Does workplace social capital associate with hazardous drinking among Chinese rural-urban migrant workers?

    Directory of Open Access Journals (Sweden)

    Junling Gao

    The present study sought to investigate the associations between workplace social capital and hazardous drinking (HD) among Chinese rural-urban migrant workers (RUMWs). A cross-sectional study with a multi-stage stratified sampling procedure was conducted in Shanghai from July 2012 to January 2013. In total, 5,318 RUMWs from 77 workplaces were involved. Workplace social capital was assessed using a validated and psychometrically tested eight-item measure. The Chinese version of the Alcohol Use Disorders Identification Test (AUDIT) was used to assess hazardous drinking. Control variables included gender, age, marital status, education level, salary, and current smoking. Multilevel logistic regression analysis was conducted to test whether individual- and workplace-level social capital was associated with hazardous drinking. Overall, the prevalence of HD was 10.6%. After controlling for individual-level socio-demographic and lifestyle variables, compared to workers in the highest quartile of individual-level social capital, the odds of HD for workers in the three bottom quartiles were 1.13 (95% CI: 1.04-1.23), 1.17 (95% CI: 1.05-1.56), and 1.26 (95% CI: 1.13-1.72), respectively. However, contrary to our hypothesis, there was no relationship between workplace-level social capital and hazardous drinking. Higher individual-level social capital may protect against HD among Chinese RUMWs. Interventions to build individual social capital among RUMWs in China may help reduce HD among this population.

  13. A reduced feedback proportional fair multiuser scheduling scheme

    KAUST Repository

    Shaqfeh, Mohammad

    2011-12-01

    Multiuser switched-diversity scheduling schemes were recently proposed in order to overcome the heavy feedback requirements of conventional opportunistic scheduling schemes by applying a threshold-based, distributed and ordered scheduling mechanism. A slight reduction in the prospected multiuser diversity gains is an acceptable trade-off for great savings in terms of required channel-state-information feedback messages. In this work, we propose a novel proportional fair multiuser switched-diversity scheduling scheme and we demonstrate that it can be optimized using a practical and distributed method to obtain the per-user feedback thresholds. We demonstrate by numerical examples that our reduced feedback proportional fair scheduler operates within 0.3 bits/sec/Hz from the achievable rates by the conventional full feedback proportional fair scheduler in Rayleigh fading conditions. © 2011 IEEE.
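
    For context, the sketch below implements the conventional full-feedback proportional fair baseline that the record's reduced-feedback scheme approximates, not the switched-diversity scheduler itself; the fading model and averaging window are assumptions.

```python
import numpy as np

# Conventional full-feedback proportional fair (PF) scheduler: in each slot the
# user maximizing instantaneous_rate / average_throughput is served, and the
# averages are updated with an exponential window (t_c is an assumed constant).
rng = np.random.default_rng(8)
n_users, n_slots, t_c = 8, 10_000, 100
avg_throughput = np.full(n_users, 1e-6)            # avoid division by zero at start
served = np.zeros(n_users)

for _ in range(n_slots):
    rates = rng.exponential(1.0, n_users)          # stand-in for Rayleigh-fading rates
    k = np.argmax(rates / avg_throughput)          # PF selection metric
    served[k] += rates[k]
    # EWMA update: only the scheduled user adds its instantaneous rate.
    update = np.zeros(n_users)
    update[k] = rates[k]
    avg_throughput = (1 - 1 / t_c) * avg_throughput + (1 / t_c) * update

print("per-user share of served rate:", np.round(served / served.sum(), 3))
```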

  14. An evaluation of bias in propensity score-adjusted non-linear regression models.

    Science.gov (United States)

    Wan, Fei; Mitra, Nandita

    2018-03-01

    Propensity score methods are commonly used to adjust for observed confounding when estimating the conditional treatment effect in observational studies. One popular method, covariate adjustment of the propensity score in a regression model, has been empirically shown to be biased in non-linear models. However, no compelling underlying theoretical reason has been presented. We propose a new framework to investigate bias and consistency of propensity score-adjusted treatment effects in non-linear models that uses a simple geometric approach to forge a link between the consistency of the propensity score estimator and the collapsibility of non-linear models. Under this framework, we demonstrate that adjustment of the propensity score in an outcome model results in the decomposition of observed covariates into the propensity score and a remainder term. Omission of this remainder term from a non-collapsible regression model leads to biased estimates of the conditional odds ratio and conditional hazard ratio, but not for the conditional rate ratio. We further show, via simulation studies, that the bias in these propensity score-adjusted estimators increases with larger treatment effect size, larger covariate effects, and increasing dissimilarity between the coefficients of the covariates in the treatment model versus the outcome model.
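
    The non-collapsibility of the odds ratio discussed above can be reproduced in a few lines: with a randomized treatment and a prognostic covariate, the marginal logistic fit gives a smaller odds ratio than the conditional fit. The effect sizes below are arbitrary assumptions.

```python
import numpy as np
import statsmodels.api as sm

# Simulation sketch of non-collapsibility: even with an unconfounded treatment,
# omitting a prognostic covariate attenuates the odds ratio toward 1.
rng = np.random.default_rng(9)
n = 200_000
treat = rng.binomial(1, 0.5, n)
x = rng.normal(0.0, 1.0, n)                         # prognostic covariate, independent of treatment
p = 1.0 / (1.0 + np.exp(-(-1.0 + 1.0 * treat + 2.0 * x)))
y = rng.binomial(1, p)

conditional = sm.Logit(y, sm.add_constant(np.column_stack([treat, x]))).fit(disp=0)
marginal = sm.Logit(y, sm.add_constant(treat)).fit(disp=0)

print(f"conditional OR: {np.exp(conditional.params[1]):.2f}")   # close to exp(1.0)
print(f"marginal OR:    {np.exp(marginal.params[1]):.2f}")      # attenuated toward 1
```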

  15. Transportation of Hazardous Evidentiary Material.

    Energy Technology Data Exchange (ETDEWEB)

    Osborn, Douglas.

    2005-06-01

    This document describes the specimen and transportation containers currently available for use with hazardous and infectious materials. A detailed comparison of advantages, disadvantages, and costs of the different technologies is included. Short- and long-term recommendations are also provided. Executive Summary: The Federal Bureau of Investigation's Hazardous Materials Response Unit currently has hazardous material transport containers for shipping 1-quart paint cans and small amounts of contaminated forensic evidence, but the containers may not be able to maintain their integrity under accident conditions or for some types of hazardous materials. This report provides guidance and recommendations on the availability of packages for the safe and secure transport of evidence consisting of or contaminated with hazardous chemicals or infectious materials. Only non-bulk containers were considered because these are appropriate for transport on small aircraft. This report addresses packaging and transportation concerns for Hazardous Classes 3, 4, 5, 6, 8, and 9 materials. If the evidence is known or suspected of belonging to one of these Hazardous Classes, it must be packaged in accordance with the provisions of 49 CFR Part 173. The anthrax scare of several years ago, and less well publicized incidents involving unknown and uncharacterized substances, have required that suspicious substances be sent to appropriate analytical laboratories for analysis and characterization. Transportation of potentially hazardous or infectious material to an appropriate analytical laboratory requires transport containers that maintain both the biological and chemical integrity of the substance in question. As a rule, only relatively small quantities will be available for analysis. Appropriate transportation packaging is needed that will maintain the integrity of the substance, will not allow biological alteration, will not react chemically with the substance being

  16. Comparative Distributions of Hazard Modeling Analysis

    Directory of Open Access Journals (Sweden)

    Rana Abdul Wajid

    2006-07-01

    In this paper we present a comparison among the distributions used in hazard analysis. Simulation techniques have been used to study the behavior of hazard distribution models. The fundamentals of hazard issues are discussed using failure criteria. We present the flexibility of the hazard modeling distribution, which approaches different distributions.

  17. 30 CFR 47.21 - Identifying hazardous chemicals.

    Science.gov (United States)

    2010-07-01

    ..., subpart Z, Toxic and Hazardous Substances. (4) American Conference of Governmental Industrial Hygienists... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Identifying hazardous chemicals. 47.21 Section... TRAINING HAZARD COMMUNICATION (HazCom) Hazard Determination § 47.21 Identifying hazardous chemicals. The...

  18. Biopsy proportion of tumour predicts pathological tumour response and benefit from chemotherapy in resectable oesophageal carcinoma: results from the UK MRC OE02 trial.

    Science.gov (United States)

    Hale, Matthew D; Nankivell, Matthew; Hutchins, Gordon G; Stenning, Sally P; Langley, Ruth E; Mueller, Wolfram; West, Nicholas P; Wright, Alexander I; Treanor, Darren; Hewitt, Lindsay C; Allum, William H; Cunningham, David; Hayden, Jeremy D; Grabsch, Heike I

    2016-11-22

    Neoadjuvant chemotherapy followed by surgery is the standard of care for UK patients with locally advanced resectable oesophageal carcinoma (OeC). However, not all patients benefit from multimodal treatment and there is a clinical need for biomarkers which can identify chemotherapy responders. This study investigated whether the proportion of tumour cells per tumour area (PoT) measured in the pre-treatment biopsy predicts chemotherapy benefit for OeC patients. PoT was quantified using digitized haematoxylin/eosin stained pre-treatment biopsy slides from 281 OeC patients from the UK MRC OE02 trial (141 treated by surgery alone (S); 140 treated by 5-fluorouracil/cisplatin followed by surgery (CS)). The relationship between PoT and clinicopathological data including tumour regression grade (TRG), overall survival and treatment interaction was investigated. PoT was associated with chemotherapy benefit in a non-linear fashion (test for interaction, P=0.006). Only patients with a biopsy PoT between 40% and 70% received a significant survival benefit from neoadjuvant chemotherapy (N=129; HR (95%CI):1.94 (1.39-2.71), unlike those with lower or higher PoT (PoT70% (N=28, HR:0.65 (0.36-1.18)). High pre-treatment PoT was related to lack of primary tumour regression (TRG 4 or 5), P=0.0402. This is the first study to identify in a representative subgroup of OeC patients from a large randomized phase III trial that the proportion of tumour in the pre-chemotherapy biopsy predicts benefit from chemotherapy and may be a clinically useful biomarker for patient treatment stratification.Proportion of tumour is a novel biomarker which can be measured in the pre-treatment diagnostic biopsy and which may enable the identification of chemotherapy responders and non-responders among patients with oesophageal carcinoma. Proportion of tumour could easily become part of the routine reporting of oesophageal cancer biopsies and may aid in managing patients with borderline resectable cancer.

  19. Hazardous and Industrial Wastes Management: a Case Study of Khazra Industrial Park, Kerman

    Directory of Open Access Journals (Sweden)

    Hossein Jafari Mansoorian

    2013-08-01

    Background & Aims of the Study: Increasing hazardous industrial waste and the lack of necessary regulations for managing it have led to serious problems in some parts of Iran. The aim of this study was to evaluate the situation of collection, transportation, recycling, and disposal of hazardous industrial wastes in the Khazra Industrial Park of Kerman, Iran. Materials & Methods: This study was a descriptive cross-sectional study that was done using questionnaires and local visits during 2009. In this questionnaire, some information about the industrial wastes, production, storage on site, collection, transformation, sorting, recycling, and disposal was recorded. Results: In the Khazra Industrial Park, 71,600 kg/day of different industrial waste is produced. The biggest proportion of waste comprises metals and construction and demolition waste, which amount to about 16,500 tons a year. The smallest proportion is non-iron metal waste, which is produced at a rate of 8 tons per year. 88.7 percent of the active industries at the Khazra Industrial Park produce solid industrial waste. Most of the industrial units do not use a unified and coordinated system for storing waste and have no specific place for temporary storage inside the industrial park. The majority of industrial waste collection, which is about 59.8%, is done by private contractors. The industrial units transfer their waste separately, and just 9 industrial units recycle their waste. Disposal of these wastes is mainly done by selling to trading agencies. Each day, 3 tons of hazardous industrial waste is produced in this park. The highest production belongs to the oil factory (Keyhan Motor). Conclusions: According to the results, the Khazra Industrial Park needs a unified system for storing, transporting and collecting the sorted waste, and it also needs to have a transportation station with basic facilities. The wastes of most industrial units at the Khazra Industrial Park have the

  20. 76 FR 74709 - Hazardous Waste Management System; Identification and Listing of Hazardous Waste; Final Exclusion

    Science.gov (United States)

    2011-12-01

    ..., including any sludge, spill residue, ash, emission control dust, or leachate, remains a hazardous waste... water for use as a cleaning agent. The slop oil waste is thereby diluted and hazardous constituents are... separation sludges that are listed as hazardous wastes due to benzene, benzo(a)pyrene, chrysene, lead and...

  1. 75 FR 78918 - Hazardous Waste Management System; Identification and Listing of Hazardous Waste; Removal of...

    Science.gov (United States)

    2010-12-17

    ... and Community Right-to-Know Act FDA Food and Drug Administration HSWA Hazardous and Solid Waste...(f)), and hazardous substances (40 CFR 302.4) based solely upon the evidence that it is a potential... subsequently identified as hazardous wastes in Sec. 261.33(f) based solely on their potential for carcinogenic...

  2. Modified Hazard Ranking System/Hazard Ranking System for sites with mixed radioactive and hazardous wastes: Software documentation

    Energy Technology Data Exchange (ETDEWEB)

    Stenner, R.D.; Peloquin, R.A.; Hawley, K.A.

    1986-11-01

    The mHRS/HRS software package was developed by the Pacific Northwest Laboratory (PNL) under contract with the Department of Energy (DOE) to provide a uniform method for DOE facilities to use in performing their Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) Phase I Modified Hazard Ranking System or Hazard Ranking System evaluations. The program is designed to remove the tedium and potential for error associated with the performing of hand calculations and the interpreting of information on tables and in reference books when performing an evaluation. The software package is designed to operate on a microcomputer (IBM PC, PC/XT, or PC/AT, or a compatible system) using either a dual floppy disk drive or a hard disk storage system. It is written in the dBASE III language and operates using the dBASE III system. Although the mHRS/HRS software package was developed for use at DOE facilities, it has direct applicability to the performing of CERCLA Phase I evaluations for any facility contaminated by hazardous waste. The software can perform evaluations using either the modified hazard ranking system methodology developed by DOE/PNL, the hazard ranking system methodology developed by EPA/MITRE Corp., or a combination of the two. This document is a companion manual to the mHRS/HRS user manual. It is intended for the programmer who must maintain the software package and for those interested in the computer implementation. This manual documents the system logic, computer programs, and data files that comprise the package. Hardware and software implementation requirements are discussed. In addition, hand calculations of three sample situations (problems) with associated computer runs used for the verification of program calculations are included.

  3. Modified Hazard Ranking System/Hazard Ranking System for sites with mixed radioactive and hazardous wastes: Software documentation

    International Nuclear Information System (INIS)

    Stenner, R.D.; Peloquin, R.A.; Hawley, K.A.

    1986-11-01

    The mHRS/HRS software package was developed by the Pacific Northwest Laboratory (PNL) under contract with the Department of Energy (DOE) to provide a uniform method for DOE facilities to use in performing their Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) Phase I Modified Hazard Ranking System or Hazard Ranking System evaluations. The program is designed to remove the tedium and potential for error associated with the performing of hand calculations and the interpreting of information on tables and in reference books when performing an evaluation. The software package is designed to operate on a microcomputer (IBM PC, PC/XT, or PC/AT, or a compatible system) using either a dual floppy disk drive or a hard disk storage system. It is written in the dBASE III language and operates using the dBASE III system. Although the mHRS/HRS software package was developed for use at DOE facilities, it has direct applicability to the performing of CERCLA Phase I evaluations for any facility contaminated by hazardous waste. The software can perform evaluations using either the modified hazard ranking system methodology developed by DOE/PNL, the hazard ranking system methodology developed by EPA/MITRE Corp., or a combination of the two. This document is a companion manual to the mHRS/HRS user manual. It is intended for the programmer who must maintain the software package and for those interested in the computer implementation. This manual documents the system logic, computer programs, and data files that comprise the package. Hardware and software implementation requirements are discussed. In addition, hand calculations of three sample situations (problems) with associated computer runs used for the verification of program calculations are included

  4. Understanding logistic regression analysis

    OpenAIRE

    Sperandei, Sandro

    2014-01-01

    Logistic regression is used to obtain odds ratio in the presence of more than one explanatory variable. The procedure is quite similar to multiple linear regression, with the exception that the response variable is binomial. The result is the impact of each variable on the odds ratio of the observed event of interest. The main advantage is to avoid confounding effects by analyzing the association of all variables together. In this article, we explain the logistic regression procedure using ex...
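
    A minimal sketch of the procedure described above: exponentiating logistic-regression coefficients yields odds ratios adjusted for the other explanatory variables. The variable names and data are illustrative, and the statsmodels formula interface is an assumed choice.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Adjusted odds ratios from a multivariable logistic regression on synthetic data.
rng = np.random.default_rng(10)
n = 3000
df = pd.DataFrame({
    "exposure": rng.binomial(1, 0.4, n),
    "age": rng.normal(50, 12, n),
    "smoker": rng.binomial(1, 0.3, n),
})
logit = -3.0 + 0.7 * df["exposure"] + 0.03 * df["age"] + 0.5 * df["smoker"]
df["outcome"] = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

fit = smf.logit("outcome ~ exposure + age + smoker", data=df).fit(disp=0)
odds_ratios = pd.concat([np.exp(fit.params), np.exp(fit.conf_int())], axis=1)
odds_ratios.columns = ["OR", "2.5%", "97.5%"]
print(odds_ratios)
```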

  5. An integrated photosensor readout for gas proportional scintillation counters

    International Nuclear Information System (INIS)

    Lopes, J.A.M.; Santos, J.M.F. dos; Conde, C.A.N.

    1996-01-01

    A xenon gas proportional scintillation counter has been instrumented with a novel photosensor that replaces the photomultiplier tube normally used to detect the VUV secondary scintillation light. In this implementation, the collection grid of a planar gas proportional scintillation counter also functions as a multiwire proportional chamber to amplify and detect the photoelectrons emitted by a reflective CsI photocathode in direct contact with the xenon gas. This integrated concept combines greater simplicity, compactness, and ruggedness (no optical window is used) with low power consumption. An energy resolution of 12% was obtained for 59.6 keV x-rays

  6. Worklife expectancy in a cohort of Danish employees aged 55-65 years - comparing a multi-state Cox proportional hazard approach with conventional multi-state life tables.

    Science.gov (United States)

    Pedersen, Jacob; Bjorner, Jakob Bue

    2017-11-15

    Work life expectancy (WLE) expresses the expected time a person will remain in the labor market until he or she retires. This paper compares a life table approach to estimating WLE to an approach based on multi-state proportional hazards models. The two methods are used to estimate WLE in Danish members and non-members of an early retirement pensioning (ERP) scheme according to levels of health. In 2008, data on self-rated health (SRH) was collected from 5212 employees 55-65 years of age. Data on previous and subsequent long-term sickness absence, unemployment, returning to work, and disability pension was collected from national registers. WLE was estimated from multi-state life tables and through multi-state models. Results from the multi-state model approach agreed with the life table approach but provided narrower confidence intervals for small groups. The shortest WLE was seen for employees with poor SRH and ERP membership while the longest WLE was seen for those with good SRH and no ERP membership. Employees aged 55-56 years with poor SRH but no ERP membership had shorter WLE than employees with good SRH and ERP membership. Relative WLE reversed for the two groups after age 57. At age 55, employees with poor SRH could be expected to spend approximately 12 months on long-term sick leave and 9-10 months unemployed before they retired - regardless of ERP membership. ERP members with poor SRH could be expected to spend 4.6 years working, while non-members could be expected to spend 7.1 years working. WLE estimated through multi-state models provided an effective way to summarize complex data on labor market affiliation. WLE differed noticeably between members and non-members of the ERP scheme. It has been hypothesized that while ERP membership would prompt some employees to retire earlier than they would have done otherwise, this effect would be partly offset by reduced time spent on long-term sick leave or unemployment. Our data showed no indication of

  7. 14 CFR 437.29 - Hazard analysis.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Hazard analysis. 437.29 Section 437.29... Documentation § 437.29 Hazard analysis. (a) An applicant must perform a hazard analysis that complies with § 437.55(a). (b) An applicant must provide to the FAA all the results of each step of the hazard analysis...

  8. The California Hazards Institute

    Science.gov (United States)

    Rundle, J. B.; Kellogg, L. H.; Turcotte, D. L.

    2006-12-01

    California's abundant resources are linked with its natural hazards. Earthquakes, landslides, wildfires, floods, tsunamis, volcanic eruptions, severe storms, fires, and droughts afflict the state regularly. These events have the potential to become great disasters, like the San Francisco earthquake and fire of 1906, that overwhelm the capacity of society to respond. At such times, the fabric of civic life is frayed, political leadership is tested, economic losses can dwarf available resources, and full recovery can take decades. A patchwork of Federal, state and local programs are in place to address individual hazards, but California lacks effective coordination to forecast, prevent, prepare for, mitigate, respond to, and recover from, the harmful effects of natural disasters. Moreover, we do not know enough about the frequency, size, time, or locations where they may strike, nor about how the natural environment and man-made structures would respond. As California's population grows and becomes more interdependent, even moderate events have the potential to trigger catastrophes. Natural hazards need not become natural disasters if they are addressed proactively and effectively, rather than reactively. The University of California, with 10 campuses distributed across the state, has world-class faculty and students engaged in research and education in all fields of direct relevance to hazards. For that reason, the UC can become a world leader in anticipating and managing natural hazards in order to prevent loss of life and property and degradation of environmental quality. The University of California, Office of the President, has therefore established a new system-wide Multicampus Research Project, the California Hazards Institute (CHI), as a mechanism to research innovative, effective solutions for California. The CHI will build on the rich intellectual capital and expertise of the Golden State to provide the best available science, knowledge and tools for

  9. Gender differences in hazardous drinking among middle-aged in Europe: the role of social context and women's empowerment.

    Science.gov (United States)

    Bosque-Prous, Marina; Espelt, Albert; Borrell, Carme; Bartroli, Montse; Guitart, Anna M; Villalbí, Joan R; Brugal, M Teresa

    2015-08-01

    The aim of this study was to estimate the magnitude of gender differences in hazardous drinking among middle-aged people and to analyse whether these differences are associated with contextual factors, such as public policies or socioeconomic factors. Cross-sectional design. The study population included 50- to 64-year-old residents of 16 European countries who participated in the Survey of Health, Ageing and Retirement in Europe project conducted in 2010-12 (n = 26 017). We estimated gender differences in hazardous drinking in each country. To determine whether different social context or women's empowerment variables were associated with gender differences in hazardous drinking, we fitted multilevel Poisson regression models adjusted for various individual and country-level variables, which yielded prevalence ratios and their 95% confidence intervals (95% CI). Prevalence of hazardous drinking was significantly higher in men than women [30.2% (95% CI: 29.1-31.4%) and 18.6% (95% CI: 17.7-19.4%), respectively] in most countries, although the extent of these differences varied between countries. Among individuals aged 50-64 years in Europe, risk of becoming a hazardous drinker was 1.69 times higher (95% CI: 1.45-1.97) in men, after controlling for individual and country-level variables. We also found that lower values of the gender empowerment measure and higher unemployment rates were associated with higher gender differences in hazardous drinking. Countries with the greatest gender differences in hazardous drinking were those with the most restrictions on women's behaviour, and the greatest gender inequalities in daily life. Lower gender differences in hazardous drinking seem to be related to higher consumption among women. © The Author 2015. Published by Oxford University Press on behalf of the European Public Health Association. All rights reserved.
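
    Prevalence ratios like those reported above can be obtained from a Poisson regression of a binary outcome; the sketch below omits the study's country-level (multilevel) structure and uses synthetic data, so it illustrates the estimand only.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Poisson regression of a binary outcome gives prevalence ratios via exp(coef).
# In practice robust (sandwich) or multilevel standard errors would be needed;
# they are omitted from this sketch. Data and effect sizes are synthetic.
rng = np.random.default_rng(11)
n = 8000
df = pd.DataFrame({
    "male": rng.binomial(1, 0.5, n),
    "age": rng.integers(50, 65, n),
})
prevalence = 0.18 * np.exp(0.5 * df["male"])          # assumed prevalence model
df["hazardous"] = rng.binomial(1, prevalence.to_numpy())

fit = smf.glm("hazardous ~ male + age", data=df, family=sm.families.Poisson()).fit()
print(f"prevalence ratio, men vs women: {np.exp(fit.params['male']):.2f}")
```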

  10. Uncertainty on shallow landslide hazard assessment: from field data to hazard mapping

    Science.gov (United States)

    Trefolini, Emanuele; Tolo, Silvia; Patelli, Eduardo; Broggi, Matteo; Disperati, Leonardo; Le Tuan, Hai

    2015-04-01

    Shallow landsliding that involves Hillslope Deposits (HD), the surficial soil that covers the bedrock, is an important process of erosion, transport and deposition of sediment along hillslopes. Although shallow landslides generally mobilize relatively small volumes of material, they represent the most hazardous factor in mountain regions due to their high velocity and the common absence of warning signs. Moreover, increasing urbanization and likely climate change make shallow landslides a source of widespread risk; the interest of the scientific community in this process has therefore grown over the last three decades. One of the main aims of research projects on this topic is to perform robust shallow landslide hazard assessments for wide areas (regional assessment), in order to support sustainable spatial planning. Currently, three main methodologies may be implemented to assess regional shallow landslide hazard: expert evaluation, probabilistic (or data mining) methods, and methods based on physical models. The aim of this work is to evaluate the uncertainty of shallow landslide hazard assessment based on physical models, taking into account spatial variables such as geotechnical and hydrogeologic parameters as well as hillslope morphometry. To achieve this goal, a wide dataset of geotechnical properties (shear strength, permeability, depth and unit weight) of HD was gathered by integrating field survey, in situ and laboratory tests. This spatial database was collected from a study area of about 350 km2 including different bedrock lithotypes and geomorphological features. The uncertainty associated with each step of the hazard assessment process (e.g. field data collection, regionalization of site-specific information and numerical modelling of hillslope stability) was carefully characterized. The most appropriate probability density function (PDF) was chosen for each numerical variable and we assessed the uncertainty propagation on HD strength parameters obtained by
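
    As an illustration of the kind of uncertainty propagation described above, the sketch below samples geotechnical parameters from assumed probability density functions and pushes them through an infinite-slope stability model. The distributions, parameter values and saturation ratio are invented for the example and are not the values measured in the study.

      # Monte Carlo propagation of parameter uncertainty through an
      # infinite-slope factor-of-safety model (illustrative values only).
      import numpy as np

      rng = np.random.default_rng(42)
      N = 100_000
      phi = np.radians(rng.normal(32.0, 3.0, N))                 # friction angle [rad]
      c = rng.lognormal(mean=np.log(5.0), sigma=0.4, size=N)     # cohesion [kPa]
      gamma = rng.normal(18.0, 1.0, N)                           # unit weight [kN/m3]
      z = rng.uniform(0.5, 2.0, N)                               # deposit thickness [m]
      beta = np.radians(30.0)                                    # slope angle (fixed here)
      m = 0.8                                                    # saturated fraction of z (assumed)
      gamma_w = 9.81                                             # unit weight of water [kN/m3]

      # Infinite-slope factor of safety with partial saturation.
      fs = (c + (gamma - m * gamma_w) * z * np.cos(beta) ** 2 * np.tan(phi)) / (
          gamma * z * np.sin(beta) * np.cos(beta))

      print("P(FS < 1)           :", np.mean(fs < 1.0))
      print("FS 5/50/95 percentiles:", np.percentile(fs, [5, 50, 95]))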

  11. What is the relationship between diversity and performance? A study about the relationship between the proportion of people with disabilities in the productivity of Brazilian firms

    Directory of Open Access Journals (Sweden)

    Luciana Carvalho de Mesquita Ferreira

    2016-06-01

    Purpose – This study aims to analyze the relationship between greater participation of people with disabilities (PwD) in Brazilian firms and productivity as a performance indicator. Design/methodology/approach – To test the relationship between the proportion of PwD and the productivity of Brazilian firms, we used regression analysis with panel data and a dataset of public information from 46 firms for the years 2010 and 2011. Findings – There was no statistical evidence that a greater proportion of people with disabilities in the workforce has a negative (or positive) relationship with the productivity of Brazilian firms. Conversely, a positive relationship between a greater proportion of people with disabilities and productivity was found amongst firms that show greater social commitment. Originality/value – These results indicate the contingent character associated with diversity management, and that the adoption of social practices by Brazilian firms can be an important mechanism for the management of diversity and inclusion.

  12. Occupational, social, and relationship hazards and psychological distress among low-income workers: implications of the 'inverse hazard law'.

    Science.gov (United States)

    Krieger, Nancy; Kaddour, Afamia; Koenen, Karestan; Kosheleva, Anna; Chen, Jarvis T; Waterman, Pamela D; Barbeau, Elizabeth M

    2011-03-01

    Few studies have simultaneously included exposure information on occupational hazards, relationship hazards (eg, intimate partner violence) and social hazards (eg, poverty and racial discrimination), especially among low-income multiracial/ethnic populations. A cross-sectional study (2003-2004) of 1202 workers employed at 14 worksites in the greater Boston area of Massachusetts investigated the independent and joint association of occupational, social and relationship hazards with psychological distress (K6 scale). Among this low-income cohort (45% were below the US poverty line), exposure to occupational, social and relationship hazards, per the 'inverse hazard law,' was high: 82% exposed to at least one occupational hazard, 79% to at least one social hazard, and 32% of men and 34% of women, respectively, stated they had been the perpetrator or target of intimate partner violence (IPV). Fully 15.4% had clinically significant psychological distress scores (K6 score ≥ 13). All three types of hazards, and also poverty, were independently associated with increased risk of psychological distress. In models including all three hazards, however, significant associations with psychological distress occurred among men and women for workplace abuse and high exposure to racial discrimination only; among men, for IPV; and among women, for high exposure to occupational hazards, poverty and smoking. Reckoning with the joint and embodied reality of diverse types of hazards involving how people live and work is necessary for understanding determinants of health status.

  13. Occupational health hazards in veterinary medicine: Zoonoses and other biological hazards

    Science.gov (United States)

    Epp, Tasha; Waldner, Cheryl

    2012-01-01

    This study describes biological hazards reported by veterinarians working in western Canada obtained through a self-administered mailed questionnaire. The potential occupational hazards included as biological hazards were zoonotic disease events, exposure to rabies, injuries due to bites and scratches, and allergies. Only 16.7% (136/812) of responding veterinarians reported the occurrence of a zoonosis or exposure to rabies in the past 5 years; the most commonly reported event was ringworm. Most bites and scratches (86%) described by 586 veterinarians involved encounters with cats; 81% of the resulting 163 infections were due to cat bites or scratches. Approximately 38% of participants reported developing an allergy during their career, with 41% of the affected individuals altering the way they practiced in response to their allergy. PMID:22851775

  14. There's Life in Hazard Trees

    Science.gov (United States)

    Mary Torsello; Toni McLellan

    The goals of hazard tree management programs are to maximize public safety and maintain a healthy sustainable tree resource. Although hazard tree management frequently targets removal of trees or parts of trees that attract wildlife, it can take into account a diversity of tree values. With just a little extra planning, hazard tree management can be highly beneficial...

  15. Staff technical position on investigations to identify fault displacement hazards and seismic hazards at a geologic repository

    International Nuclear Information System (INIS)

    McConnell, K.I.; Blackford, M.E.; Ibrahim, A.K.

    1992-07-01

    The purpose of this Staff Technical Position (STP) is to provide guidance to the US Department of Energy (DOE) on acceptable geologic repository investigations that can be used to identify fault displacement hazards and seismic hazards. The staff considers that the approach this STP takes to investigations of fault displacement and seismic phenomena is appropriate for the collection of sufficient data for input to analyses of fault displacement hazards and seismic hazards, both for the preclosure and postclosure performance periods. However, detailed analyses of fault displacement and seismic data, such as those required for comprehensive assessments of repository performance, may identify the need for additional investigations. Section 2.0 of this STP describes the 10 CFR Part 60 requirements that form the basis for investigations to describe fault displacement hazards and seismic hazards at a geologic repository. Technical position statements and corresponding discussions are presented in Sections 3.0 and 4.0, respectively. Technical position topics in this STP are categorized as follows: (1) investigation considerations, (2) investigations for fault-displacement hazards, and (3) investigations for seismic hazards

  16. Linear regression in astronomy. II

    Science.gov (United States)

    Feigelson, Eric D.; Babu, Gutti J.

    1992-01-01

    A wide variety of least-squares linear regression procedures used in observational astronomy, particularly investigations of the cosmic distance scale, are presented and discussed. The classes of linear models considered are (1) unweighted regression lines, with bootstrap and jackknife resampling; (2) regression solutions when measurement error, in one or both variables, dominates the scatter; (3) methods to apply a calibration line to new data; (4) truncated regression models, which apply to flux-limited data sets; and (5) censored regression models, which apply when nondetections are present. For the calibration problem we develop two new procedures: a formula for the intercept offset between two parallel data sets, which propagates slope errors from one regression to the other; and a generalization of the Working-Hotelling confidence bands to nonstandard least-squares lines. They can provide improved error analysis for Faber-Jackson, Tully-Fisher, and similar cosmic distance scale relations.
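
    A minimal sketch of class (1) above, an unweighted regression line with bootstrap resampling of the (x, y) pairs, is given below; the data are simulated rather than astronomical.

      # Unweighted least-squares line with pairwise bootstrap resampling.
      import numpy as np

      rng = np.random.default_rng(1)
      n = 50
      x = rng.uniform(0, 10, n)
      y = 2.0 + 0.5 * x + rng.normal(0, 1.0, n)

      def ols(x, y):
          """Return (intercept, slope) of the least-squares line."""
          slope, intercept = np.polyfit(x, y, 1)
          return intercept, slope

      b0, b1 = ols(x, y)

      B = 2000
      boot = np.empty((B, 2))
      for i in range(B):
          idx = rng.integers(0, n, n)        # resample pairs with replacement
          boot[i] = ols(x[idx], y[idx])

      lo, hi = np.percentile(boot, [2.5, 97.5], axis=0)
      print(f"intercept = {b0:.2f}, 95% CI [{lo[0]:.2f}, {hi[0]:.2f}]")
      print(f"slope     = {b1:.2f}, 95% CI [{lo[1]:.2f}, {hi[1]:.2f}]")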

  17. A parsimonious model for the proportional control valve

    OpenAIRE

    Elmer, KF; Gentle, CR

    2001-01-01

    A generic non-linear dynamic model of a direct-acting electrohydraulic proportional solenoid valve is presented. The valve consists of two subsystems: a spool assembly and one or two unidirectional proportional solenoids. These two subsystems are modelled separately. The solenoid is modelled as a non-linear resistor-inductor combination, with inductance parameters that change with current. An innovative modelling method has been used to represent these components. The spool assembly is model...

  18. Development of extruded resistive plastic tubes for proportional chamber cathodes

    International Nuclear Information System (INIS)

    Kondo, K.

    1982-01-01

    Carbon-mixed plastic tubes with resistivity of 10³-10⁴ Ωcm have been molded with an extrusion method and used for the d.c. cathode of a proportional counter and a multi-wire proportional chamber. The signal by gas multiplication was picked up from a strip r.f. cathode set outside the tube. The characteristics of the counter in the proportional and limited streamer modes have been studied

  19. A Matlab program for stepwise regression

    Directory of Open Access Journals (Sweden)

    Yanhong Qi

    2016-03-01

    The stepwise linear regression is a multi-variable regression method for identifying statistically significant variables in a linear regression equation. In the present study, we present a Matlab program for stepwise regression.
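
    The record refers to a Matlab program; the sketch below is a Python analogue of forward stepwise selection based on p-values, written for illustration only (the column names and the entry threshold alpha_in are assumptions).

      # Forward stepwise selection: repeatedly add the candidate variable with
      # the smallest p-value, as long as that p-value is below alpha_in.
      import numpy as np
      import pandas as pd
      import statsmodels.api as sm

      def forward_stepwise(X, y, alpha_in=0.05):
          selected, remaining = [], list(X.columns)
          while remaining:
              pvals = {}
              for var in remaining:
                  model = sm.OLS(y, sm.add_constant(X[selected + [var]])).fit()
                  pvals[var] = model.pvalues[var]
              best = min(pvals, key=pvals.get)
              if pvals[best] < alpha_in:
                  selected.append(best)
                  remaining.remove(best)
              else:
                  break
          return selected

      rng = np.random.default_rng(0)
      X = pd.DataFrame(rng.normal(size=(200, 5)), columns=list("abcde"))
      y = 1.0 + 2.0 * X["a"] - 1.5 * X["c"] + rng.normal(size=200)
      print(forward_stepwise(X, y))     # typically ['a', 'c']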

  20. The Impact Hazard in the Context of Other Natural Hazards and Predictive Science

    Science.gov (United States)

    Chapman, C. R.

    1998-09-01

    The hazard due to impact of asteroids and comets has been recognized as analogous, in some ways, to other infrequent but consequential natural hazards (e.g. floods and earthquakes). Yet, until recently, astronomers and space agencies have felt no need to do what their colleagues and analogous agencies must do in order to assess, quantify, and communicate predictions to those with a practical interest in the predictions (e.g. public officials who must assess the threats, prepare for mitigation, etc.). Recent heightened public interest in the impact hazard, combined with increasing numbers of "near misses" (certain to increase as Spaceguard is implemented), requires that astronomers accept the responsibility to place their predictions and assessments in terms that may be appropriately considered. I will report on preliminary results of a multi-year GSA/NCAR study of "Prediction in the Earth Sciences: Use and Misuse in Policy Making" in which I have represented the impact hazard, while others have treated earthquakes, floods, weather, global climate change, nuclear waste disposal, acid rain, etc. The impact hazard presents an end-member example of a natural hazard, helping those dealing with more prosaic issues to learn from an extreme. On the other hand, I bring to the astronomical community some lessons long adopted in other cases: the need to understand the policy purposes of impact predictions, the need to assess potential societal impacts, the requirement to assess prediction uncertainties very carefully, considerations of potential public uses of the predictions, awareness of ethical considerations (e.g. conflicts of interest) that affect predictions and acceptance of predictions, awareness of appropriate means for publicly communicating predictions, and considerations of the international context (especially for a hazard that knows no national boundaries).

  1. Confidence intervals for distinguishing ordinal and disordinal interactions in multiple regression.

    Science.gov (United States)

    Lee, Sunbok; Lei, Man-Kit; Brody, Gene H

    2015-06-01

    Distinguishing between ordinal and disordinal interaction in multiple regression is useful in testing many interesting theoretical hypotheses. Because the distinction is made based on the location of a crossover point of 2 simple regression lines, confidence intervals of the crossover point can be used to distinguish ordinal and disordinal interactions. This study examined 2 factors that need to be considered in constructing confidence intervals of the crossover point: (a) the assumption about the sampling distribution of the crossover point, and (b) the possibility of abnormally wide confidence intervals for the crossover point. A Monte Carlo simulation study was conducted to compare 6 different methods for constructing confidence intervals of the crossover point in terms of the coverage rate, the proportion of true values that fall to the left or right of the confidence intervals, and the average width of the confidence intervals. The methods include the reparameterization, delta, Fieller, basic bootstrap, percentile bootstrap, and bias-corrected accelerated bootstrap methods. The results of our Monte Carlo simulation study suggest that statistical inference using confidence intervals to distinguish ordinal and disordinal interaction requires sample sizes more than 500 to be able to provide sufficiently narrow confidence intervals to identify the location of the crossover point. (c) 2015 APA, all rights reserved).
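
    A minimal sketch of the percentile-bootstrap interval for the crossover point is given below, assuming the usual moderated-regression model y = b0 + b1*x + b2*z + b3*x*z with a binary moderator z, for which the two simple regression lines cross at x = -b2/b3. The data are simulated for illustration.

      # Percentile bootstrap confidence interval for the crossover point -b2/b3.
      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(3)
      n = 600
      x = rng.normal(size=n)
      z = rng.integers(0, 2, n)
      y = 1.0 + 0.4 * x - 0.6 * z + 0.8 * x * z + rng.normal(0, 1, n)

      def crossover(x, z, y):
          X = sm.add_constant(np.column_stack([x, z, x * z]))
          b = sm.OLS(y, X).fit().params          # [b0, b1, b2, b3]
          return -b[2] / b[3]

      B = 2000
      boot = np.empty(B)
      for i in range(B):
          idx = rng.integers(0, n, n)            # resample cases with replacement
          boot[i] = crossover(x[idx], z[idx], y[idx])

      print("point estimate   :", crossover(x, z, y))
      print("95% percentile CI:", np.percentile(boot, [2.5, 97.5]))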

  2. Bayesian inference method for stochastic damage accumulation modeling

    International Nuclear Information System (INIS)

    Jiang, Xiaomo; Yuan, Yong; Liu, Xian

    2013-01-01

    Damage accumulation based reliability models play an increasingly important role in the successful realization of condition-based maintenance for complicated engineering systems. This paper develops a Bayesian framework to establish a stochastic damage accumulation model from historical inspection data, considering data uncertainty. A proportional hazards modeling technique is developed to model the nonlinear effect of multiple influencing factors on system reliability. Unlike other hazard modeling techniques, such as the normal linear regression model, the approach does not require any distributional assumption for the hazard model and can be applied to a wide variety of distribution models. A Bayesian network is created to represent the nonlinear proportional hazards models and to estimate model parameters by Bayesian inference with Markov Chain Monte Carlo simulation. Both qualitative and quantitative approaches are developed to assess the validity of the established damage accumulation model. The Anderson–Darling goodness-of-fit test is employed to perform the normality test, and the Box–Cox transformation approach is utilized to convert non-normal data into a normal distribution for hypothesis testing in quantitative model validation. The methodology is illustrated with seepage data collected from real-world subway tunnels.
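
    The proportional hazards plus MCMC idea can be sketched as follows. This is not the Bayesian-network formulation of the paper; it assumes a Weibull proportional hazards model with a single covariate, flat priors, and a random-walk Metropolis sampler, all chosen for illustration.

      # Bayesian inference for a Weibull proportional hazards model via Metropolis.
      import numpy as np

      rng = np.random.default_rng(7)
      n = 300
      x = rng.normal(size=n)
      true_k, true_lam, true_beta = 1.5, 10.0, 0.7
      u = rng.uniform(size=n)
      t = true_lam * (-np.log(u) * np.exp(-true_beta * x)) ** (1 / true_k)  # inverse-CDF draw
      c = rng.uniform(5, 25, n)                        # censoring times
      obs, delta = np.minimum(t, c), (t <= c).astype(float)

      def loglik(theta):
          log_k, log_lam, beta = theta
          k, lam = np.exp(log_k), np.exp(log_lam)
          lin = beta * x
          logh = np.log(k) - k * np.log(lam) + (k - 1) * np.log(obs) + lin   # log hazard
          logS = -((obs / lam) ** k) * np.exp(lin)                           # log survival
          return np.sum(delta * logh + logS)           # flat priors assumed

      theta, ll = np.array([0.0, np.log(obs.mean()), 0.0]), None
      ll = loglik(theta)
      samples = []
      for i in range(20000):
          prop = theta + rng.normal(0, 0.05, 3)        # random-walk proposal
          ll_prop = loglik(prop)
          if np.log(rng.uniform()) < ll_prop - ll:
              theta, ll = prop, ll_prop
          if i >= 5000:                                # discard burn-in
              samples.append(theta.copy())

      samples = np.array(samples)
      print("posterior mean beta   :", samples[:, 2].mean())
      print("posterior mean shape k:", np.exp(samples[:, 0]).mean())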

  3. 29 CFR 1917.25 - Fumigants, pesticides, insecticides and hazardous preservatives (see also § 1917.2 Hazardous...

    Science.gov (United States)

    2010-07-01

    ... 29 Labor 7 2010-07-01 2010-07-01 false Fumigants, pesticides, insecticides and hazardous..., DEPARTMENT OF LABOR (CONTINUED) MARINE TERMINALS Marine Terminal Operations § 1917.25 Fumigants, pesticides... fumigants, pesticides or hazardous preservatives have created a hazardous atmosphere. These signs shall note...

  4. The mediation proportion: a structural equation approach for estimating the proportion of exposure effect on outcome explained by an intermediate variable

    DEFF Research Database (Denmark)

    Ditlevsen, Susanne; Christensen, Ulla; Lynch, John

    2005-01-01

    It is often of interest to assess how much of the effect of an exposure on a response is mediated through an intermediate variable. However, systematic approaches are lacking, other than assessment of a surrogate marker for the endpoint of a clinical trial. We review a measure of "proportion...... of several intermediate variables. Binary or categorical variables can be included directly through threshold models. We call this measure the mediation proportion, that is, the part of an exposure effect on outcome explained by a third, intermediate variable. Two examples illustrate the approach. The first...... example is a randomized clinical trial of the effects of interferon-alpha on visual acuity in patients with age-related macular degeneration. In this example, the exposure, mediator and response are all binary. The second example is a common problem in social epidemiology-to find the proportion...
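
    As a simplified illustration of the idea (not the structural equation approach of the paper), the mediation proportion for a continuous outcome can be computed by the difference method, (c - c')/c, where c is the exposure coefficient without the mediator and c' the coefficient after adding it; the sketch below uses simulated data.

      # Mediation proportion by the difference-in-coefficients method.
      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(11)
      n = 1000
      exposure = rng.integers(0, 2, n)
      mediator = 0.8 * exposure + rng.normal(size=n)
      outcome = 0.5 * exposure + 1.2 * mediator + rng.normal(size=n)
      df = pd.DataFrame(dict(exposure=exposure, mediator=mediator, outcome=outcome))

      c_total = smf.ols("outcome ~ exposure", df).fit().params["exposure"]
      c_direct = smf.ols("outcome ~ exposure + mediator", df).fit().params["exposure"]
      print("mediation proportion:", (c_total - c_direct) / c_total)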

  5. Hazardous Waste Manifest System

    Science.gov (United States)

    EPA’s hazardous waste manifest system is designed to track hazardous waste from the time it leaves the generator facility where it was produced, until it reaches the off-site waste management facility that will store, treat, or dispose of the waste.

  6. LAV@HAZARD: a web-GIS interface for volcanic hazard assessment

    Directory of Open Access Journals (Sweden)

    Giovanni Gallo

    2011-12-01

    Satellite data, the radiative power of hot spots as measured with remote sensing, historical records, on-site geological surveys, digital elevation model data, and simulation results together provide a massive data source to investigate the behavior of active volcanoes like Mount Etna (Sicily, Italy) over recent times. The integration of these heterogeneous data into a coherent visualization framework is important for their practical exploitation. It is crucial to fill in the gap between experimental and numerical data, and the direct human perception of their meaning. Indeed, the people in charge of safety planning of an area need to be able to quickly assess hazards and other relevant issues even during critical situations. With this in mind, we developed LAV@HAZARD, a web-based geographic information system that provides an interface for the collection of all of the products coming from the LAVA project research activities. LAV@HAZARD is based on the Google Maps application programming interface, a choice motivated by its ease of use and the user-friendly interactive environment it provides. In particular, the web structure consists of four modules: satellite applications (time-space evolution of hot spots, radiant flux and effusion rate), hazard map visualization, a database of ca. 30,000 lava-flow simulations, and real-time scenario forecasting by MAGFLOW on Compute Unified Device Architecture.

  7. Multiaxial low cycle fatigue life under non-proportional loading

    International Nuclear Information System (INIS)

    Itoh, Takamoto; Sakane, Masao; Ohsuga, Kazuki

    2013-01-01

    A simple and clear method of evaluating stress and strain ranges under non-proportional multiaxial loading, where the principal directions of stress and strain change during a cycle, is needed for assessing multiaxial fatigue. This paper proposes a simple method of determining the principal stress and strain ranges and the severity of non-proportional loading by defining the rotation angles of the maximum principal stress and strain in a three-dimensional stress and strain space. This study also discusses the multiaxial low cycle fatigue lives of various materials fatigued under non-proportional loading and shows the applicability of a parameter proposed by the authors for multiaxial low cycle fatigue life evaluation

  8. Tracer experiments with 15N-labelled wheat to determine the endogenous and exogenous fecal N-proportion

    International Nuclear Information System (INIS)

    Krawielitzki, K.; Timm, E.

    1978-01-01

    In an experiment with growing Wistar rats of 100 g live weight, the N-values and the 15N-frequency of the nitrogen in feces, urine and the experimental carcasses were determined after feeding 15N-labelled wheat. Proceeding from Czarnetzki's multicompartment model (1969) for N-metabolism in monogastric animals, the measured data were used to calculate the endogenous and exogenous fecal N-proportion of total nitrogen. In agreement with earlier studies, the intestinal nitrogen loss was found to rise as the protein intake increased. In this experiment, the intestinal nitrogen loss went up from 8.2 mg N/animal and day (N-free diet) to 33.9 mg N/animal and day at a daily nitrogen intake of 240 mg/animal and day. The true digestibility of the wheat protein (determined by taking into account the rise in fecal N loss) was 97.2%, this value being 8.4 units higher than the true digestibility calculated by the conventional regression method of fecal analysis, in which a constant value is taken for fecal N loss. In connection with earlier findings, this experiment allows the conclusion that the true digestibility determined conventionally by regression analysis does not reflect the actual digestibility of the protein. (author)

  9. Long-term consequences of postoperative cognitive dysfunction

    DEFF Research Database (Denmark)

    Steinmetz, Jacob; Christensen, Karl Bang; Lund, Thomas

    2009-01-01

    BACKGROUND: Postoperative cognitive dysfunction (POCD) is common in elderly patients after noncardiac surgery, but the consequences are unknown. The authors' aim was to determine the effects of POCD on long-term prognosis. METHODS: This was an observational study of Danish patients enrolled in two...... on survival, labor market attachment, and social transfer payments were obtained from administrative databases. The Cox proportional hazards regression model was used to compute relative risk estimates for mortality and disability, and the relative prevalence of time on social transfer payments was assessed......, and cancer). The risk of leaving the labor market prematurely because of disability or voluntary early retirement was higher among patients with 1-week POCD (hazard ratio, 2.26 [1.24-4.12]; P = 0.01). Patients with POCD at 1 week received social transfer payments for a longer proportion of observation time...
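
    A minimal sketch of the type of Cox proportional hazards analysis named above is given below, using the lifelines package on simulated data; the column names and effect sizes are hypothetical and do not reproduce the study.

      # Cox proportional hazards regression with lifelines on simulated data.
      import numpy as np
      import pandas as pd
      from lifelines import CoxPHFitter

      rng = np.random.default_rng(5)
      n = 500
      df = pd.DataFrame({
          "pocd_1week": rng.integers(0, 2, n),     # POCD at 1 week (0/1), hypothetical coding
          "age": rng.integers(60, 90, n),
      })
      # Simulate event times whose hazard increases with POCD and age.
      rate = 0.01 * np.exp(0.5 * df["pocd_1week"] + 0.03 * (df["age"] - 70))
      t = rng.exponential(1 / rate)
      censor = rng.uniform(0, 120, n)
      df["time"] = np.minimum(t, censor)
      df["event"] = (t <= censor).astype(int)

      cph = CoxPHFitter()
      cph.fit(df, duration_col="time", event_col="event")
      cph.print_summary()          # hazard ratios = exp(coef) with 95% CIs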

  10. Analyzing sickness absence with statistical models for survival data

    DEFF Research Database (Denmark)

    Christensen, Karl Bang; Andersen, Per Kragh; Smith-Hansen, Lars

    2007-01-01

    OBJECTIVES: Sickness absence is the outcome in many epidemiologic studies and is often based on summary measures such as the number of sickness absences per year. In this study the use of modern statistical methods was examined by making better use of the available information. Since sickness...... absence data deal with events occurring over time, the use of statistical models for survival data has been reviewed, and the use of frailty models has been proposed for the analysis of such data. METHODS: Three methods for analyzing data on sickness absences were compared using a simulation study...... involving the following: (i) Poisson regression using a single outcome variable (number of sickness absences), (ii) analysis of time to first event using the Cox proportional hazards model, and (iii) frailty models, which are random effects proportional hazards models. Data from a study of the relation...
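
    The appeal of frailty models can be shown with a small simulation (not the study data): a gamma-distributed individual frailty multiplying a common baseline rate produces the overdispersion and within-person clustering of absence spells that a plain Poisson model ignores.

      # Gamma frailty acting multiplicatively on a common absence rate.
      import numpy as np

      rng = np.random.default_rng(13)
      n_workers, followup, base_rate = 2000, 1.0, 2.0          # 1 year, 2 absences/year baseline
      frailty = rng.gamma(shape=2.0, scale=0.5, size=n_workers) # mean 1, variance 0.5
      counts = rng.poisson(base_rate * frailty * followup)

      print("mean count            :", counts.mean())          # close to base_rate
      print("observed variance     :", counts.var())           # > mean: overdispersion from frailty
      print("Poisson variance would be:", counts.mean())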

  11. Hazard Detection Software for Lunar Landing

    Science.gov (United States)

    Huertas, Andres; Johnson, Andrew E.; Werner, Robert A.; Montgomery, James F.

    2011-01-01

    The Autonomous Landing and Hazard Avoidance Technology (ALHAT) Project is developing a system for safe and precise manned lunar landing that involves novel sensors, but also specific algorithms. ALHAT has selected imaging LIDAR (light detection and ranging) as the sensing modality for onboard hazard detection because imaging LIDARs can rapidly generate direct measurements of the lunar surface elevation from high altitude. Then, starting with the LIDAR-based Hazard Detection and Avoidance (HDA) algorithm developed for Mars Landing, JPL has developed a mature set of HDA software for the manned lunar landing problem. Landing hazards exist everywhere on the Moon, and many of the more desirable landing sites are near the most hazardous terrain, so HDA is needed to autonomously and safely land payloads over much of the lunar surface. The HDA requirements used in the ALHAT project are to detect hazards that are 0.3 m tall or higher and slopes that are 5° or greater. Steep slopes, rocks, cliffs, and gullies are all hazards for landing and, by computing the local slope and roughness in an elevation map, all of these hazards can be detected. The algorithm in this innovation is used to measure slope and roughness hazards. In addition to detecting these hazards, the HDA capability also is able to find a safe landing site free of these hazards for a lunar lander with a diameter of 15 m over most of the lunar surface. This software includes an implementation of the HDA algorithm, software for generating simulated lunar terrain maps for testing, hazard detection performance analysis tools, and associated documentation. The HDA software has been deployed to Langley Research Center and integrated into the POST II Monte Carlo simulation environment. The high-fidelity Monte Carlo simulations determine the required ground spacing between LIDAR samples (ground sample distances) and the noise on the LIDAR range measurement. This simulation has also been used to determine the effect of
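
    The slope-and-roughness idea can be sketched as follows. This is not the ALHAT flight code: it fits a local plane in a moving window over a gridded elevation map, takes the plane's tilt as slope and the residual relief as roughness, and applies the 0.3 m and 5 degree thresholds quoted above.

      # Slope and roughness hazard detection on a gridded elevation map.
      import numpy as np

      def slope_roughness(dem, cell=1.0, win=5):
          """Return per-pixel slope (deg) and roughness (m) over win x win windows."""
          h, w = dem.shape
          r = win // 2
          slope = np.full(dem.shape, np.nan)
          rough = np.full(dem.shape, np.nan)
          yy, xx = np.mgrid[-r:r + 1, -r:r + 1] * cell
          A = np.column_stack([xx.ravel(), yy.ravel(), np.ones(win * win)])
          for i in range(r, h - r):
              for j in range(r, w - r):
                  z = dem[i - r:i + r + 1, j - r:j + r + 1].ravel()
                  coef, *_ = np.linalg.lstsq(A, z, rcond=None)   # local plane fit
                  slope[i, j] = np.degrees(np.arctan(np.hypot(coef[0], coef[1])))
                  rough[i, j] = np.max(np.abs(z - A @ coef))     # residual relief
          return slope, rough

      rng = np.random.default_rng(2)
      dem = 0.05 * rng.normal(size=(60, 60))        # nearly flat terrain with small noise
      dem[30:33, 30:33] += 0.5                      # a 0.5 m rock
      slope, rough = slope_roughness(dem)
      hazard = (slope > 5.0) | (rough > 0.3)        # ALHAT-style thresholds
      print("hazardous pixels:", int(np.nansum(hazard)))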

  12. Quantile regression theory and applications

    CERN Document Server

    Davino, Cristina; Vistocco, Domenico

    2013-01-01

    A guide to the implementation and interpretation of quantile regression models. This book explores the theory and numerous applications of quantile regression, offering empirical data analysis as well as the software tools to implement the methods. The main focus of this book is to provide the reader with a comprehensive description of the main issues concerning quantile regression; these include basic modeling, geometrical interpretation, estimation and inference for quantile regression, as well as issues of model validity and diagnostic tools. Each methodological aspect is explored and
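
    A minimal sketch of quantile regression with statsmodels on simulated heteroscedastic data is given below; the book itself covers estimation, inference and diagnostics in far more depth.

      # Quantile regression at several quantiles of a heteroscedastic response.
      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(8)
      n = 500
      x = rng.uniform(0, 10, n)
      y = 1.0 + 0.5 * x + rng.normal(0, 0.5 + 0.2 * x, n)   # noise grows with x
      df = pd.DataFrame({"x": x, "y": y})

      for q in (0.1, 0.5, 0.9):
          fit = smf.quantreg("y ~ x", df).fit(q=q)
          print(f"q={q}: intercept={fit.params['Intercept']:.2f}, slope={fit.params['x']:.2f}")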

  13. Estimation of direct effects for survival data by using the Aalen additive hazards model

    DEFF Research Database (Denmark)

    Martinussen, Torben; Vansteelandt, Stijn; Gerster, Mette

    2011-01-01

    We extend the definition of the controlled direct effect of a point exposure on a survival outcome, other than through some given, time-fixed intermediate variable, to the additive hazard scale. We propose two-stage estimators for this effect when the exposure is dichotomous and randomly assigned...... Aalen's additive regression for the event time, given exposure, intermediate variable and confounders. The second stage involves applying Aalen's additive model, given the exposure alone, to a modified stochastic process (i.e. a modification of the observed counting process based on the first...
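
    The sketch below fits Aalen's additive hazards model with the lifelines package on simulated data; it illustrates only the first-stage regression, not the full two-stage direct-effect estimator proposed in the paper, and all variable names and effect sizes are assumptions.

      # Aalen additive hazards regression on simulated exposure/mediator data.
      import numpy as np
      import pandas as pd
      from lifelines import AalenAdditiveFitter

      rng = np.random.default_rng(9)
      n = 800
      exposure = rng.integers(0, 2, n)
      mediator = 0.5 * exposure + rng.normal(size=n)
      rate = 0.05 + 0.03 * exposure + 0.02 * (mediator - mediator.min())  # additive hazard
      t = rng.exponential(1 / rate)
      censor = rng.uniform(0, 40, n)
      df = pd.DataFrame({
          "exposure": exposure,
          "mediator": mediator,
          "time": np.minimum(t, censor),
          "event": (t <= censor).astype(int),
      })

      aaf = AalenAdditiveFitter(fit_intercept=True)
      aaf.fit(df, duration_col="time", event_col="event")
      print(aaf.cumulative_hazards_.tail())   # cumulative regression functions B_j(t)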

  14. Fungible weights in logistic regression.

    Science.gov (United States)

    Jones, Jeff A; Waller, Niels G

    2016-06-01

    In this article we develop methods for assessing parameter sensitivity in logistic regression models. To set the stage for this work, we first review Waller's (2008) equations for computing fungible weights in linear regression. Next, we describe 2 methods for computing fungible weights in logistic regression. To demonstrate the utility of these methods, we compute fungible logistic regression weights using data from the Centers for Disease Control and Prevention's (2010) Youth Risk Behavior Surveillance Survey, and we illustrate how these alternate weights can be used to evaluate parameter sensitivity. To make our work accessible to the research community, we provide R code (R Core Team, 2015) that will generate both kinds of fungible logistic regression weights. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  15. Hydrothermal Liquefaction Treatment Preliminary Hazard Analysis Report

    Energy Technology Data Exchange (ETDEWEB)

    Lowry, Peter P.; Wagner, Katie A.

    2015-08-31

    A preliminary hazard assessment was completed during February 2015 to evaluate the conceptual design of the modular hydrothermal liquefaction treatment system. The hazard assessment was performed in 2 stages. An initial assessment utilizing Hazard Identification and Preliminary Hazards Analysis (PHA) techniques identified areas with significant or unique hazards (process safety-related hazards) that fall outside of the normal operating envelope of PNNL and warranted additional analysis. The subsequent assessment was based on a qualitative What-If analysis. This analysis was augmented, as necessary, by additional quantitative analysis for scenarios involving a release of hazardous material or energy with the potential for affecting the public.

  16. Preliminary hazards analysis -- vitrification process

    International Nuclear Information System (INIS)

    Coordes, D.; Ruggieri, M.; Russell, J.; TenBrook, W.; Yimbo, P.

    1994-06-01

    This paper presents a Preliminary Hazards Analysis (PHA) for mixed waste vitrification by joule heating. The purpose of performing a PHA is to establish an initial hazard categorization for a DOE nuclear facility and to identify those processes and structures which may have an impact on or be important to safety. The PHA is typically performed during and provides input to project conceptual design. The PHA is then followed by a Preliminary Safety Analysis Report (PSAR) performed during Title 1 and 2 design. The PSAR then leads to performance of the Final Safety Analysis Report performed during the facility's construction and testing. It should be completed before routine operation of the facility commences. This PHA addresses the first four chapters of the safety analysis process, in accordance with the requirements of DOE Safety Guidelines in SG 830.110. The hazards associated with vitrification processes are evaluated using standard safety analysis methods which include: identification of credible potential hazardous energy sources; identification of preventative features of the facility or system; identification of mitigative features; and analyses of credible hazards. Maximal facility inventories of radioactive and hazardous materials are postulated to evaluate worst case accident consequences. These inventories were based on DOE-STD-1027-92 guidance and the surrogate waste streams defined by Mayberry, et al. Radiological assessments indicate that a facility, depending on the radioactive material inventory, may be an exempt, Category 3, or Category 2 facility. The calculated impacts would result in no significant impact to offsite personnel or the environment. Hazardous materials assessment indicates that a Mixed Waste Vitrification facility will be a Low Hazard facility having minimal impacts to offsite personnel and the environment

  17. Preliminary hazards analysis -- vitrification process

    Energy Technology Data Exchange (ETDEWEB)

    Coordes, D.; Ruggieri, M.; Russell, J.; TenBrook, W.; Yimbo, P. [Science Applications International Corp., Pleasanton, CA (United States)

    1994-06-01

    This paper presents a Preliminary Hazards Analysis (PHA) for mixed waste vitrification by joule heating. The purpose of performing a PHA is to establish an initial hazard categorization for a DOE nuclear facility and to identify those processes and structures which may have an impact on or be important to safety. The PHA is typically performed during and provides input to project conceptual design. The PHA is then followed by a Preliminary Safety Analysis Report (PSAR) performed during Title 1 and 2 design. The PSAR then leads to performance of the Final Safety Analysis Report performed during the facility`s construction and testing. It should be completed before routine operation of the facility commences. This PHA addresses the first four chapters of the safety analysis process, in accordance with the requirements of DOE Safety Guidelines in SG 830.110. The hazards associated with vitrification processes are evaluated using standard safety analysis methods which include: identification of credible potential hazardous energy sources; identification of preventative features of the facility or system; identification of mitigative features; and analyses of credible hazards. Maximal facility inventories of radioactive and hazardous materials are postulated to evaluate worst case accident consequences. These inventories were based on DOE-STD-1027-92 guidance and the surrogate waste streams defined by Mayberry, et al. Radiological assessments indicate that a facility, depending on the radioactive material inventory, may be an exempt, Category 3, or Category 2 facility. The calculated impacts would result in no significant impact to offsite personnel or the environment. Hazardous materials assessment indicates that a Mixed Waste Vitrification facility will be a Low Hazard facility having minimal impacts to offsite personnel and the environment.

  18. French people addressing environmental hazards (Eser 2013)

    International Nuclear Information System (INIS)

    Pautard, Eric; Moreau, Sylvain; Bottin, Anne; Kraszewski, Marlene; Fretin, David; Carriere, Celine; Bird, Geoffrey

    2015-07-01

    This publication presents the results of a survey, conducted towards the end of 2013, of 4,700 people resident in metropolitan France and its 'departements d'outre-mer' (DOM - overseas departments). The aim of the survey was to ascertain how French people perceive natural hazards (flooding, earthquakes, climate events, cyclones, etc.) and technological hazards (industrial and nuclear) to which they may be exposed. Questioned as to whether or not they felt exposed to one or several environmental hazards in their place of residence, French people's answers varied somewhat depending on the hazard invoked and place of residence. A strong feeling of exposure was expressed most frequently in the DOM. Respondents in both metropolitan France and DOM think that atmospheric pollution is a significant hazard (56%) but their opinions diverge partially where other hazards are concerned. Natural hazards (earthquakes and flooding) are cited most frequently overseas, whereas technological hazards (industrial and nuclear) are primarily metropolitan concerns. Climate change related hazards are seen as a threat by 56% of overseas respondents and by 42% in the mother country. In general, one-third of French people think that they are exposed to more than two environmental hazards. Unlike the younger members of the population, only one-quarter of respondents of 65 years of age or over felt exposed to three or more hazards. From municipal level databases providing information on exposure to flooding and technological and climate-related hazards, the survey indicates that a large majority of respondents living in these municipalities either do not feel at risk from existing hazards or feel that the risk is low (see figure below). It is in the area of climate-related hazards that awareness of threat seems to be highest in France, and more particularly in the DOM. In the face of the flooding that could affect them, overseas populations are more aware of this natural

  19. Higher percent body fat in young women with lower physical activity level and greater proportion Pacific Islander ancestry.

    Science.gov (United States)

    Black, Nate; Nabokov, Vanessa; Vijayadeva, Vinutha; Novotny, Rachel

    2011-11-01

    Samoan women exhibit high rates of obesity, which can possibly be attenuated through diet and physical activity. Obesity, and body fatness in particular, is associated with increased risk for chronic diseases. Ancestry, physical activity, and dietary patterns have been associated with body composition. Using a cross-sectional design, the relative importance of proportion of Pacific Islander (PI) ancestry, level of physical activity, and macronutrients among healthy women in Honolulu, Hawai'i, ages 18 to 28 years was examined. All data were collected between January 2003 and December 2004. Percent body fat (%BF) was determined by whole body dual energy x-ray absorptiometry (DXA). Nutrient data were derived from a three-day food record. Means and standard deviations were computed for all variables of interest. Bivariate correlation analysis was used to determine correlates of %BF. Multiple regression analysis was used to determine relative contribution of variables significantly associated with %BF. Proportion of PI ancestry was significantly positively associated with %BF (P=0.0001). Physical activity level was significantly negatively associated with %BF (P=0.0006). Intervention to increase physical activity level of young Samoan women may be effective to decrease body fat and improve health. CRC-NIH grant: 0216.

  20. Seismic hazard assessment of Iran

    Directory of Open Access Journals (Sweden)

    M. Ghafory-Ashtiany

    1999-06-01

    The development of the new seismic hazard map of Iran is based on probabilistic seismic hazard computation using historical earthquake data, geology, tectonics, fault activity and seismic source models in Iran. These maps have been prepared to indicate the earthquake hazard of Iran in the form of iso-acceleration contour lines and seismic hazard zoning, by using current probabilistic procedures. They display the probabilistic estimates of Peak Ground Acceleration (PGA) for return periods of 75 and 475 years. The maps have been divided into intervals of 0.25 degrees in both latitudinal and longitudinal directions to calculate the peak ground acceleration values at each grid point and draw the seismic hazard curves. The results presented in this study will provide the basis for the preparation of seismic risk maps, the estimation of earthquake insurance premiums, and the preliminary site evaluation of critical facilities.

  1. Modes of occurrence of potentially hazardous elements in coal: levels of confidence

    Science.gov (United States)

    Finkelman, R.B.

    1994-01-01

    The modes of occurrence of the potentially hazardous elements in coal will be of significance in any attempt to reduce their mobilization due to coal combustion. Antimony and selenium may be present in solid solution in pyrite, as minute accessory sulfides dispersed throughout the organic matrix, or in organic association. Because of these modes of occurrence it is anticipated that less than 50% of these elements will be routinely removed by conventional coal cleaning procedures. Arsenic and mercury occur primarily in late-stage coarse-grained pyrite; therefore, physical coal cleaning procedures should be successful in removing substantial proportions of these elements. Cadmium occurs in sphalerite and lead in galena. Both of these minerals exhibit a wide range of particle sizes and textural relations. Depending on the particle size and textural relations, physical coal cleaning may remove as little as 25% of these elements or as much as 75%. Manganese in bituminous coal occurs in carbonates, especially siderite. Physical coal cleaning should remove a substantial proportion of this element. More information is needed to elucidate the modes of occurrence of beryllium, chromium, cobalt, and nickel. © 1994.

  2. The Improved Estimation of Ratio of Two Population Proportions

    Science.gov (United States)

    Solanki, Ramkrishna S.; Singh, Housila P.

    2016-01-01

    In this article, first we obtained the correct mean square error expression of Gupta and Shabbir's linear weighted estimator of the ratio of two population proportions. Later we suggested the general class of ratio estimators of two population proportions. The usual ratio estimator, Wynn-type estimator, Singh, Singh, and Kaur difference-type…

  3. Identification of Aircraft Hazards

    International Nuclear Information System (INIS)

    K. Ashley

    2006-01-01

    Aircraft hazards were determined to be potentially applicable to a repository at Yucca Mountain in ''Monitored Geological Repository External Events Hazards Screening Analysis'' (BSC 2005 [DIRS 174235], Section 6.4.1). That determination was conservatively based upon limited knowledge of flight data in the area of concern and upon crash data for aircraft of the type flying near Yucca Mountain. The purpose of this report is to identify specific aircraft hazards that may be applicable to a monitored geologic repository (MGR) at Yucca Mountain, using NUREG-0800, ''Standard Review Plan for the Review of Safety Analysis Reports for Nuclear Power Plants'' (NRC 1987 [DIRS 103124], Section 3.5.1.6), as guidance for the inclusion or exclusion of identified aircraft hazards. The intended use of this report is to provide inputs for further screening and analysis of identified aircraft hazards based upon the criteria that apply to Category 1 and Category 2 event sequence analyses as defined in 10 CFR 63.2 [DIRS 176544] (Section 4). The scope of this report includes the evaluation of military, private, and commercial use of airspace in the 100-mile regional setting of the repository at Yucca Mountain with the potential for reducing the regional setting to a more manageable size after consideration of applicable screening criteria (Section 7)

  4. Renal replacement therapy for autosomal dominant polycystic kidney disease (ADPKD) in Europe

    DEFF Research Database (Denmark)

    Spithoven, Edwin M; Kramer, Anneke; Meijer, Esther

    2014-01-01

    on RRT prevalence and survival on RRT in 12 European countries with 208 million inhabitants. We studied four 5-year periods (1991-2010). Survival analysis was performed by the Kaplan-Meier method and by Cox proportional hazards regression. RESULTS: From the first to the last study period, the prevalence...... for non-ADPKD subjects. Improved survival was noted for all RRT modalities: haemodialysis [adjusted hazard ratio for mortality during the last versus first time period 0.75 (95% confidence interval 0.61-0.91), peritoneal dialysis 0.55 (0.38-0.80) and transplantation 0.52 (0.32-0.74)]. Cardiovascular...
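
    A minimal sketch of the survival workflow named above (Kaplan-Meier curves plus a Cox proportional hazards model) is given below, using lifelines on simulated data; the group labels and effect sizes are invented and do not reproduce the registry analysis.

      # Kaplan-Meier estimation by group followed by a Cox regression.
      import numpy as np
      import pandas as pd
      from lifelines import KaplanMeierFitter, CoxPHFitter

      rng = np.random.default_rng(4)
      n = 1000
      adpkd = rng.integers(0, 2, n)
      t = rng.exponential(1 / (0.08 * np.exp(-0.3 * adpkd)))   # lower hazard if adpkd == 1
      censor = rng.uniform(0, 20, n)
      df = pd.DataFrame({"adpkd": adpkd,
                         "time": np.minimum(t, censor),
                         "event": (t <= censor).astype(int)})

      kmf = KaplanMeierFitter()
      for name, grp in df.groupby("adpkd"):
          kmf.fit(grp["time"], grp["event"], label=f"ADPKD={name}")
          print(f"ADPKD={name}: median survival =", kmf.median_survival_time_)

      CoxPHFitter().fit(df, duration_col="time", event_col="event").print_summary()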

  5. BMI, weight stability and mortality among adults without clinical co-morbidities: a 22-year mortality follow-up in the finnish twin cohort

    DEFF Research Database (Denmark)

    Korkeila, Maarit; Rissanen, Aila; Sørensen, Thorkild I A

    2009-01-01

    with mortality were estimated by Cox proportional hazards model for all individuals and conditional logistic regression analysis for pairwise analyses. RESULTS: Mortality increased with increasing BMI for all causes and coronary heart disease (CHD) in men, and there were no associations for all natural causes......, cerebrovascular disease, and violent deaths. After adjustment for multiple co-variates and changes in co-variates between 1975 and 1981, BMI was associated with CHD mortality in all men (hazard ratio (HR) = 1.22, 95% CI 1.06-1.41) and in men with stable weight between 1975 and 1981 (HR = 1.26, 95% CI 1...

  6. 75 FR 58346 - Hazardous Waste Management System; Identification and Listing of Hazardous Waste

    Science.gov (United States)

    2010-09-24

    ... Waste Management System; Identification and Listing of Hazardous Waste AGENCY: Environmental Protection... Chemical Company-Texas Operations (Eastman) to exclude (or delist) certain solid wastes generated by its Longview, Texas, facility from the lists of hazardous wastes. EPA used the Delisting Risk Assessment...

  7. Estimating drought risk across Europe from reported drought impacts, hazard indicators and vulnerability factors

    Science.gov (United States)

    Blauhut, V.; Stahl, K.; Stagge, J. H.; Tallaksen, L. M.; De Stefano, L.; Vogt, J.

    2015-12-01

    Drought is one of the most costly natural hazards in Europe. Due to its complexity, drought risk, the combination of the natural hazard and societal vulnerability, is difficult to define and challenging to detect and predict, as the impacts of drought are very diverse, covering the breadth of socioeconomic and environmental systems. Pan-European maps of drought risk could inform the elaboration of guidelines and policies to address its documented severity and impact across borders. This work (1) tests the capability of commonly applied hazard indicators and vulnerability factors to predict annual drought impact occurrence for different sectors and macro regions in Europe and (2) combines information on past drought impacts, drought hazard indicators, and vulnerability factors into estimates of drought risk at the pan-European scale. This "hybrid approach" bridges the gap between traditional vulnerability assessment and probabilistic impact forecast in a statistical modelling framework. Multivariable logistic regression was applied to predict the likelihood of impact occurrence on an annual basis for particular impact categories and European macro regions. The results indicate sector- and macro region specific sensitivities of hazard indicators, with the Standardised Precipitation Evapotranspiration Index for a twelve month aggregation period (SPEI-12) as the overall best hazard predictor. Vulnerability factors have only limited ability to predict drought impacts as single predictor, with information about landuse and water resources as best vulnerability-based predictors. (3) The application of the "hybrid approach" revealed strong regional (NUTS combo level) and sector specific differences in drought risk across Europe. The majority of best predictor combinations rely on a combination of SPEI for shorter and longer aggregation periods, and a combination of information on landuse and water resources. The added value of integrating regional vulnerability information
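
    The logistic-regression step can be sketched as follows, predicting annual impact occurrence from a hazard indicator (SPEI-12) and two vulnerability factors; the variable names and simulated data are placeholders rather than the study's predictors.

      # Multivariable logistic regression for annual drought impact occurrence.
      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(21)
      n = 400                                       # region-years
      spei12 = rng.normal(0, 1, n)                  # 12-month SPEI (negative = drier)
      agri_share = rng.uniform(0, 0.6, n)           # land-use share of agriculture
      water_stress = rng.uniform(0, 1, n)           # water-resources indicator
      lin = -0.5 - 1.2 * spei12 + 1.5 * agri_share + 0.8 * water_stress
      impact = rng.binomial(1, 1 / (1 + np.exp(-lin)))
      df = pd.DataFrame(dict(impact=impact, spei12=spei12,
                             agri_share=agri_share, water_stress=water_stress))

      fit = smf.logit("impact ~ spei12 + agri_share + water_stress", df).fit(disp=False)
      print(fit.summary())
      print("predicted impact likelihood, first 5 region-years:",
            fit.predict(df.head()).round(2).tolist())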

  8. Integrating Volcanic Hazard Data in a Systematic Approach to Develop Volcanic Hazard Maps in the Lesser Antilles

    Directory of Open Access Journals (Sweden)

    Jan M. Lindsay

    2018-04-01

    We report on the process of generating the first suite of integrated volcanic hazard zonation maps for the islands of Dominica, Grenada (including Kick 'em Jenny and Ronde/Caille), Nevis, Saba, St. Eustatius, St. Kitts, Saint Lucia, and St Vincent in the Lesser Antilles. We developed a systematic approach that accommodated the range in prior knowledge of the volcanoes in the region. A first-order hazard assessment for each island was used to develop one or more scenarios of likely future activity, for which scenario-based hazard maps were generated. For the most-likely scenario on each island we also produced a poster-sized integrated volcanic hazard zonation map, which combined the individual hazardous phenomena depicted in the scenario-based hazard maps into integrated hazard zones. We document the philosophy behind the generation of this suite of maps, and the method by which hazard information was combined to create integrated hazard zonation maps, and illustrate our approach through a case study of St. Vincent. We also outline some of the challenges we faced using this approach, and the lessons we have learned by observing how stakeholders have interacted with the maps over the past ~10 years. Based on our experience, we recommend that future map makers involve stakeholders in the entire map generation process, especially when making design choices such as type of base map, use of colour and gradational boundaries, and indeed what to depict on the map. We also recommend careful consideration of how to evaluate and depict the offshore hazard of island volcanoes, and recommend computer-assisted modelling of all phenomena to generate more realistic hazard footprints. Finally, although our systematic approach to integrating individual hazard data into zones generally worked well, we suggest that a better approach might be to treat the integration of hazards on a case-by-case basis to ensure the final product meets map users' needs. We hope that

  9. Using hazard maps to identify and eliminate workplace hazards: a union-led health and safety training program.

    Science.gov (United States)

    Anderson, Joe; Collins, Michele; Devlin, John; Renner, Paul

    2012-01-01

    The Institute for Sustainable Work and Environment and the Utility Workers Union of America worked with a professional evaluator to design, implement, and evaluate the results of a union-led system of safety-based hazard identification program that trained workers to use hazard maps to identify workplace hazards and target them for elimination. The evaluation documented program implementation and impact using data collected from both qualitative interviews and an on-line survey from worker trainers, plant managers, and health and safety staff. Managers and workers reported that not only were many dangerous hazards eliminated as a result of hazard mapping, some of which were long-standing, difficult-to-resolve issues, but the evaluation also documented improved communication between union members and management that both workers and managers agreed resulted in better, more sustainable hazard elimination.

  10. Reduction of degraded events in miniaturized proportional counters

    Energy Technology Data Exchange (ETDEWEB)

    Plaga, R.; Kirsten, T. (Max Planck Inst. fuer Kernphysik, Heidelberg (Germany))

    1991-11-15

    A method to reduce the number of degraded events in miniaturized proportional counters is described. A shaping of the outer cathode leads to a more uniform gas gain along the counter axis. The method is useful in situations in which the total number of decay events is very low. The effects leading to degraded events are studied theoretically and experimentally. The usefulness of the method is demonstrated by using it for the proportional counter of the GALLEX solar neutrino experiment. (orig.).

  11. Principal component regression analysis with SPSS.

    Science.gov (United States)

    Liu, R X; Kuang, J; Gong, Q; Hou, X L

    2003-06-01

    The paper introduces the indices used for multicollinearity diagnostics, the basic principle of principal component regression, and the determination of the 'best' equation method. The paper uses an example to describe how to do principal component regression analysis with SPSS 10.0, covering all calculation steps of the principal component regression and the operation of the linear regression, factor analysis, descriptives, compute variable and bivariate correlations procedures in SPSS 10.0. Principal component regression analysis can be used to overcome the disturbance of multicollinearity, and with SPSS it provides a simpler, faster and more accurate statistical analysis.
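
    A Python analogue of the procedure (not the SPSS steps themselves) is sketched below: standardize the predictors, extract principal components, regress the response on the leading components, and map the coefficients back to the original variables.

      # Principal component regression with nearly collinear predictors.
      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.linear_model import LinearRegression
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(6)
      n = 200
      x1 = rng.normal(size=n)
      x2 = x1 + rng.normal(scale=0.05, size=n)      # nearly collinear with x1
      x3 = rng.normal(size=n)
      X = np.column_stack([x1, x2, x3])
      y = 3.0 + 1.0 * x1 + 1.0 * x2 - 2.0 * x3 + rng.normal(size=n)

      Z = StandardScaler().fit_transform(X)
      pca = PCA(n_components=2)                     # keep the leading components
      scores = pca.fit_transform(Z)
      reg = LinearRegression().fit(scores, y)

      # Back-transform: coefficients on the standardized original variables.
      beta_std = pca.components_.T @ reg.coef_
      print("coefficients on standardized x1, x2, x3:", np.round(beta_std, 3))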

  12. The Use of Geospatial Technologies in Flood Hazard Mapping and Assessment: Case Study from River Evros

    Science.gov (United States)

    Mentzafou, Angeliki; Markogianni, Vasiliki; Dimitriou, Elias

    2017-02-01

    Many scientists link climate change to an increase in the frequency of extreme weather phenomena, which, combined with land use changes, often leads to disasters with severe social and economic effects. Floods caused by heavy rainfall in particular can put vulnerable human and natural systems, such as transboundary wetlands, at risk. In order to meet the European Directive 2007/60/EC requirements for the development of flood risk management plans, the flood hazard map of the Evros transboundary watershed was produced using a grid-based GIS modelling method that aggregates the main factors related to the development of floods: topography, land use, geology, slope, flow accumulation and rainfall intensity. This tool was verified by comparing the produced hazard map with the inundation maps derived from the supervised classification of Landsat 5 and 7 satellite imagery of four flood events that took place in the vicinity of the Evros delta, a wetland of international importance. The comparison of the modelled output (high and very high flood hazard areas) with the extent of the inundated areas as mapped from the satellite data indicated the satisfactory performance of the model. Furthermore, the vulnerability of each land use to flood events was examined. Geographically Weighted Regression was also applied between the final flood hazard map and the major factors in order to ascertain their contribution to flood events. The results confirmed a strong relationship between land use and flood hazard, indicating the flood susceptibility of the lowlands and agricultural land. A dynamic transboundary flood hazard management plan should be developed in order to meet the Flood Directive requirements for adequate and coordinated mitigation practices to reduce flood risk.
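
    The grid-based aggregation idea can be sketched as a weighted overlay of normalized factor rasters; the factor layers, weights and class breaks below are invented for illustration and are not those used for the Evros map.

      # Weighted overlay of factor rasters into a classified flood hazard map.
      import numpy as np

      rng = np.random.default_rng(15)
      shape = (100, 100)
      factors = {                                   # hypothetical factor rasters
          "slope": rng.uniform(0, 35, shape),       # degrees (flat ground -> higher flood hazard)
          "flow_acc": rng.lognormal(3, 1, shape),   # flow accumulation
          "rain": rng.uniform(600, 1200, shape),    # rainfall intensity proxy
      }
      weights = {"slope": 0.3, "flow_acc": 0.4, "rain": 0.3}

      def normalize(a, invert=False):
          a = (a - a.min()) / (a.max() - a.min())
          return 1 - a if invert else a

      hazard = (weights["slope"] * normalize(factors["slope"], invert=True)
                + weights["flow_acc"] * normalize(factors["flow_acc"])
                + weights["rain"] * normalize(factors["rain"]))

      # Classify into five hazard classes by quantiles (very low ... very high).
      classes = np.digitize(hazard, np.quantile(hazard, [0.2, 0.4, 0.6, 0.8]))
      print("pixels per hazard class:", np.bincount(classes.ravel()))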

  13. 16 CFR 1500.5 - Hazardous mixtures.

    Science.gov (United States)

    2010-01-01

    ..., flammable, sensitizing, or pressure-generating properties of a substance from what is known about its... Commercial Practices CONSUMER PRODUCT SAFETY COMMISSION FEDERAL HAZARDOUS SUBSTANCES ACT REGULATIONS HAZARDOUS SUBSTANCES AND ARTICLES; ADMINISTRATION AND ENFORCEMENT REGULATIONS § 1500.5 Hazardous mixtures...

  14. Chemical process hazards analysis

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-02-01

    The Office of Worker Health and Safety (EH-5) under the Assistant Secretary for the Environment, Safety and Health of the US Department (DOE) has published two handbooks for use by DOE contractors managing facilities and processes covered by the Occupational Safety and Health Administration (OSHA) Rule for Process Safety Management of Highly Hazardous Chemicals (29 CFR 1910.119), herein referred to as the PSM Rule. The PSM Rule contains an integrated set of chemical process safety management elements designed to prevent chemical releases that can lead to catastrophic fires, explosions, or toxic exposures. The purpose of the two handbooks, ``Process Safety Management for Highly Hazardous Chemicals`` and ``Chemical Process Hazards Analysis,`` is to facilitate implementation of the provisions of the PSM Rule within the DOE. The purpose of this handbook ``Chemical Process Hazards Analysis,`` is to facilitate, within the DOE, the performance of chemical process hazards analyses (PrHAs) as required under the PSM Rule. It provides basic information for the performance of PrHAs, and should not be considered a complete resource on PrHA methods. Likewise, to determine if a facility is covered by the PSM rule, the reader should refer to the handbook, ``Process Safety Management for Highly Hazardous Chemicals`` (DOE- HDBK-1101-96). Promulgation of the PSM Rule has heightened the awareness of chemical safety management issues within the DOE. This handbook is intended for use by DOE facilities and processes covered by the PSM rule to facilitate contractor implementation of the PrHA element of the PSM Rule. However, contractors whose facilities and processes not covered by the PSM Rule may also use this handbook as a basis for conducting process hazards analyses as part of their good management practices. This handbook explains the minimum requirements for PrHAs outlined in the PSM Rule. Nowhere have requirements been added beyond what is specifically required by the rule.

  15. Characterization of Montserrat volcanic ash for the assessment of respiratory health hazards

    International Nuclear Information System (INIS)

    Horwell, Claire Judith

    2002-01-01

    Volcanic ash, generated in the long-lived eruption of the Soufriere Hills volcano, Montserrat, is shown to contain respirable (sub-4 μm) particles and the crystalline silica polymorph, cristobalite. Respirable particles of cristobalite can cause silicosis, raising the possibility that volcanic ash is a respiratory health hazard. This study considers some of the main factors that affect human exposure to volcanic particles: the composition, proportions and surface reactivity of respirable ash and the composition and concentrations of re-worked and airborne suspended particulates. Dome-collapse ash-fall deposits are significantly richer in respirable particles (12 weight %) than the other tephra samples, in particular the matrices of dome-collapse pyroclastic-flow deposits (3 weight %). Within the respirable fraction, dome-collapse ash contains the highest proportion of crystalline silica particles (20-27 number %, of which 97 % is cristobalite), compared with other primary tephra types (0.4-5.6 number %). The results are explained by significant fractionation during fragmentation of pyroclastic flows due to the size and strength of particles and the selective elutriation of fines into the lofting ash plume. This results in a fines-depleted dome-collapse matrix and a fines-rich dome-collapse ash deposit. For all sample types, the sub-4 μm fraction comprises 45-55 weight % of the sub-10 μm fraction. Re-worked and airborne samples show enrichment of crystalline silica in the respirable fraction (10-18 number %) but have low proportions of respirable ash (∼3 weight %) compared to primary ash samples. The concentration of ash particles re-suspended by road vehicles on Montserrat is found to decrease exponentially with height above the ground, indicating higher exposure for children compared with adults: the PM4 concentration at 0.9 m (height of a two-year-old child) is three times that at 1.8 m (adult height). Surface- and free-radical production has been closely linked

  16. Characterization of Montserrat volcanic ash for the assessment of respiratory health hazards

    Energy Technology Data Exchange (ETDEWEB)

    Horwell, Claire Judith

    2002-07-01

    Volcanic ash, generated in the long-lived eruption of the Soufriere Hills volcano, Montserrat, is shown to contain respirable (sub-4 μm) particles and the crystalline silica polymorph, cristobalite. Respirable particles of cristobalite can cause silicosis, raising the possibility that volcanic ash is a respiratory health hazard. This study considers some of the main factors that affect human exposure to volcanic particles: the composition, proportions and surface reactivity of respirable ash and the composition and concentrations of re-worked and airborne suspended particulates. Dome-collapse ash-fall deposits are significantly richer in respirable particles (12 weight %) than the other tephra samples, in particular the matrices of dome-collapse pyroclastic-flow deposits (3 weight %). Within the respirable fraction, dome-collapse ash contains the highest proportion of crystalline silica particles (20-27 number %, of which 97 % is cristobalite), compared with other primary tephra types (0.4-5.6 number %). The results are explained by significant fractionation during fragmentation of pyroclastic flows due to the size and strength of particles and the selective elutriation of fines into the lofting ash plume. This results in a fines-depleted dome-collapse matrix and a fines-rich dome-collapse ash deposit. For all sample types, the sub-4 μm fraction comprises 45-55 weight % of the sub-10 μm fraction. Re-worked and airborne samples show enrichment of crystalline silica in the respirable fraction (10-18 number %) but have low proportions of respirable ash (∼3 weight %) compared to primary ash samples. The concentration of ash particles re-suspended by road vehicles on Montserrat is found to decrease exponentially with height above the ground, indicating higher exposure for children compared with adults: the PM4 concentration at 0.9 m (height of a two-year-old child) is three times that at 1.8 m (adult height). Surface- and free-radical production has been

  17. A situational analysis of priority disaster hazards in Uganda: findings from a hazard and vulnerability analysis.

    Science.gov (United States)

    Mayega, R W; Wafula, M R; Musenero, M; Omale, A; Kiguli, J; Orach, G C; Kabagambe, G; Bazeyo, W

    2013-06-01

    Most countries in sub-Saharan Africa have not conducted a disaster risk analysis. Hazards and vulnerability analyses provide vital information that can be used for development of risk reduction and disaster response plans. The purpose of this study was to rank disaster hazards for Uganda, as a basis for identifying the priority hazards to guide disaster management planning. The study was conducted in Uganda, as part of a multi-country assessment. A hazard, vulnerability and capacity analysis was conducted in a focus group discussion of 7 experts representing key stakeholder agencies in disaster management in Uganda. A simple ranking method was used to rank the probability of occurrence of the 11 top hazards, their potential impact and the level of vulnerability of people and infrastructure. In terms of likelihood of occurrence and potential impact, the top ranked disaster hazards in Uganda are: 1) Epidemics of infectious diseases, 2) Drought/famine, and 3) Conflict and environmental degradation, in that order. In terms of vulnerability, the top priority hazards to which people and infrastructure were vulnerable were: 1) Conflicts, 2) Epidemics, 3) Drought/famine and 4) Environmental degradation, in that order. Poverty, gender, lack of information, and lack of resilience measures were some of the factors promoting vulnerability to disasters. As Uganda develops a disaster risk reduction and response plan, it ought to prioritize epidemics of infectious diseases, drought/famine, conflicts and environmental degradation as the priority disaster hazards.

  18. Investigation of a multiwire proportional chamber

    International Nuclear Information System (INIS)

    Konijn, J.

    1976-01-01

    The article discusses some aspects of a prototype multiwire proportional chamber for electron detection located at IKO in Amsterdam, i.e. voltage, counting rates, noise and gas mixture (argon, ethylene bromide). The efficiency and performance of the chamber have been investigated and an error analysis is given

  19. Triangular tube proportional wire chamber system

    Energy Technology Data Exchange (ETDEWEB)

    Badtke, D H; Bakken, J A; Barnett, B A; Blumenfeld, B J; Chien, C Y; Madansky, L; Matthews, J A J; Pevsner, A; Spangler, W J [Johns Hopkins Univ., Baltimore, MD (USA); Lee, K L [California Univ., Berkeley (USA). Lawrence Berkeley Lab.

    1981-10-15

    We report on the characteristics of the proportional tube chamber system which has been constructed for muon identification in the PEP-4 experiment at SLAC. The mechanical and electrical properties of the extruded aluminum triangular tubes allow these detectors to be used as crude drift chambers.

  20. Drought Patterns Forecasting using an Auto-Regressive Logistic Model

    Science.gov (United States)

    del Jesus, M.; Sheffield, J.; Méndez Incera, F. J.; Losada, I. J.; Espejo, A.

    2014-12-01

    Drought is characterized by a water deficit that may manifest across a large range of spatial and temporal scales. Drought may create important socio-economic consequences, often of catastrophic dimensions. A quantifiable definition of drought is elusive because, depending on its impacts, consequences and generation mechanism, different water deficit periods may be identified as a drought by virtue of some definitions but not by others. Droughts are linked to the water cycle and, although a climate change signal may not have emerged yet, they are also intimately linked to climate. In this work we develop an auto-regressive logistic model for drought prediction at different temporal scales that makes use of a spatially explicit framework. Our model allows us to include covariates, continuous or categorical, to improve the performance of the auto-regressive component. Our approach makes use of dimensionality reduction (principal component analysis) and classification techniques (K-means and maximum dissimilarity) to simplify the representation of complex climatic patterns, such as sea surface temperature (SST) and sea level pressure (SLP), while including information on their spatial structure, i.e. considering their spatial patterns. This procedure allows us to include in the analysis multivariate representations of complex climatic phenomena, such as the El Niño-Southern Oscillation. We also explore the impact of other climate-related variables such as sunspots. The model allows us to quantify the uncertainty of the forecasts and can be easily adapted to make predictions under future climatic scenarios. The framework herein presented may be extended to other applications such as flash flood analysis or risk assessment of natural hazards.
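
    The sketch below is a minimal illustration of the kind of pipeline described (dimensionality reduction of a climate field, clustering into discrete types, and an auto-regressive logistic model for the drought state), built with scikit-learn on synthetic data; the array sizes, number of components and clusters, and the single-month lag are all assumptions, not details from the paper.

        # Sketch: auto-regressive logistic drought model with PCA + K-Means covariates.
        # Synthetic data; field dimensions, number of PCs/clusters and lag are assumed.
        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.cluster import KMeans
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(1)
        n_months, n_grid = 480, 300
        sst_field = rng.normal(size=(n_months, n_grid))           # e.g. monthly SST anomalies
        drought = (rng.uniform(size=n_months) < 0.2).astype(int)  # 1 = drought month (synthetic)

        # 1) reduce the climate field to a few principal components
        pcs = PCA(n_components=5).fit_transform(sst_field)

        # 2) classify each month into a discrete circulation/SST "type"
        weather_type = KMeans(n_clusters=4, n_init=10, random_state=1).fit_predict(pcs)

        # 3) auto-regressive logistic model: drought(t) ~ drought(t-1) + covariates(t)
        X = np.column_stack([
            drought[:-1],                     # auto-regressive term (previous month)
            pcs[1:],                          # continuous covariates
            np.eye(4)[weather_type[1:]],      # categorical covariate, one-hot encoded
        ])
        y = drought[1:]

        model = LogisticRegression(max_iter=1000).fit(X, y)
        print("P(drought next month | current state):", model.predict_proba(X[-1:])[0, 1])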

  1. An identification procedure for foodborne microbial hazards.

    NARCIS (Netherlands)

    Gerwen, van S.J.C.; Wit, de J.C.; Notermans, S.; Zwietering, M.H.

    1997-01-01

    A stepwise and interactive identification procedure for foodborne microbial hazards has been developed in which use is made of several levels of detail ranging from rough hazard identification to comprehensive hazard identification. This approach allows one to tackle the most obvious hazards first,

  2. Avoiding the Hazards of Hazardous Waste.

    Science.gov (United States)

    Hiller, Richard

    1996-01-01

    Under a 1980 law, colleges and universities can be liable for cleanup of hazardous waste on properties, in companies, and related to stocks they invest in or are given. College planners should establish clear policy concerning gifts, investigate gifts, distance university from business purposes, sell real estate gifts quickly, consult a risk…

  3. Toxic hazards of underground excavation

    International Nuclear Information System (INIS)

    Smith, R.; Chitnis, V.; Damasian, M.

    1982-09-01

    Inadvertent intrusion into natural or man-made toxic or hazardous material deposits as a consequence of activities such as mining, excavation or tunnelling has resulted in numerous deaths and injuries in this country. This study is a preliminary investigation to identify and document instances of such fatal or injurious intrusion. An objective is to provide useful insights and information related to potential hazards due to future intrusion into underground radioactive-waste-disposal facilities. The methodology used in this study includes literature review and correspondence with appropriate government agencies and organizations. Key categories of intrusion hazards are asphyxiation, methane, hydrogen sulfide, silica and asbestos, naturally occurring radionuclides, and various mine or waste dump related hazards

  4. Toxic hazards of underground excavation

    Energy Technology Data Exchange (ETDEWEB)

    Smith, R.; Chitnis, V.; Damasian, M.; Lemm, M.; Popplesdorf, N.; Ryan, T.; Saban, C.; Cohen, J.; Smith, C.; Ciminesi, F.

    1982-09-01

    Inadvertent intrusion into natural or man-made toxic or hazardous material deposits as a consequence of activities such as mining, excavation or tunnelling has resulted in numerous deaths and injuries in this country. This study is a preliminary investigation to identify and document instances of such fatal or injurious intrusion. An objective is to provide useful insights and information related to potential hazards due to future intrusion into underground radioactive-waste-disposal facilities. The methodology used in this study includes literature review and correspondence with appropriate government agencies and organizations. Key categories of intrusion hazards are asphyxiation, methane, hydrogen sulfide, silica and asbestos, naturally occurring radionuclides, and various mine or waste dump related hazards.

  5. Incidents with hazardous radiation sources

    International Nuclear Information System (INIS)

    Schoenhacker, Stefan

    2016-01-01

    Incidents with hazardous radiation sources can occur in any country, even those without nuclear facilities. Preparedness for such incidents is supposed to fulfill globally agreed minimum standards. Incidents are categorized into: incidents involving licensed handling of radiation sources, e.g. for material testing; transport accidents involving hazardous radiation sources; incidents with radionuclide batteries; incidents with satellites containing radioactive inventory; and incidents involving unlicensed handling of illegally acquired hazardous radiation sources. The emergency planning in Austria differentiates according to the consequences: incidents with release of radioactive materials resulting in restricted contamination, incidents with release of radioactive materials resulting in local contamination, and incidents with the hazard of enhanced exposure due to the radiation source.

  6. Logistic regression models

    CERN Document Server

    Hilbe, Joseph M

    2009-01-01

    This book really does cover everything you ever wanted to know about logistic regression … with updates available on the author's website. Hilbe, a former national athletics champion, philosopher, and expert in astronomy, is a master at explaining statistical concepts and methods. Readers familiar with his other expository work will know what to expect-great clarity.The book provides considerable detail about all facets of logistic regression. No step of an argument is omitted so that the book will meet the needs of the reader who likes to see everything spelt out, while a person familiar with some of the topics has the option to skip "obvious" sections. The material has been thoroughly road-tested through classroom and web-based teaching. … The focus is on helping the reader to learn and understand logistic regression. The audience is not just students meeting the topic for the first time, but also experienced users. I believe the book really does meet the author's goal … .-Annette J. Dobson, Biometric...

  7. Extralobar pulmonary sequestration in neonates: The natural course and predictive factors associated with spontaneous regression

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, Hee Mang; Jung, Ah Young; Cho, Young Ah; Yoon, Chong Hyun; Lee, Jin Seong [Asan Medical Center Children's Hospital, University of Ulsan College of Medicine, Department of Radiology and Research Institute of Radiology, Songpa-gu, Seoul (Korea, Republic of); Kim, Ellen Ai-Rhan [University of Ulsan College of Medicine, Division of Neonatology, Asan Medical Center Children's Hospital, Seoul (Korea, Republic of); Chung, Sung-Hoon [Kyung Hee University School of Medicine, Department of Pediatrics, Seoul (Korea, Republic of); Kim, Seon-Ok [Asan Medical Center, Department of Clinical Epidemiology and Biostatistics, Seoul (Korea, Republic of)

    2017-06-15

    To describe the natural course of extralobar pulmonary sequestration (EPS) and identify factors associated with spontaneous regression of EPS. We retrospectively searched for patients who were diagnosed with EPS on an initial contrast CT scan within 1 month after birth and who had a follow-up CT scan without treatment. Spontaneous regression of EPS was assessed by percentage decrease in volume (PDV) and percentage decrease in the sum of the diameters of systemic feeding arteries (PDD) by comparing initial and follow-up CT scans. Clinical and CT features were analysed to determine factors associated with PDV and PDD rates. Fifty-one neonates were included. The cumulative proportions of patients reaching PDV > 50 % and PDD > 50 % were 93.0 % and 73.3 % at 4 years, respectively. Tissue attenuation was significantly associated with PDV rate (B = -21.78, P < .001). The tissue attenuation (B = -22.62, P = .001) and diameter of the largest systemic feeding arteries (B = -48.31, P = .011) were significant factors associated with PDD rate. The volume and diameter of systemic feeding arteries of EPS spontaneously decreased within 4 years without treatment. EPSs showing low tissue attenuation and a small diameter of the largest systemic feeding arteries on initial contrast-enhanced CT scans were likely to regress spontaneously. (orig.)

  8. Incisors’ proportions in smile esthetics

    Science.gov (United States)

    Alsulaimani, Fahad F; Batwa, Waeil

    2013-01-01

    Aims: To determine whether alteration of the maxillary central and lateral incisors' length and width, respectively, would affect perceived smile esthetics, and to validate the most esthetic length and width for the central and lateral incisors. Materials and Methods: Photographic manipulation was undertaken to produce two sets of four photographs each, showing the altered width of the lateral incisors and the altered length of the central incisors, respectively. The eight resulting photographs were assessed by laypeople, dentists and orthodontists. Results: Alteration in the incisors' proportions affected relative smile attractiveness for laypeople (n=124), dentists (n=115) and orthodontists (n=68); dentists and orthodontists did not accept lateral width reduction of more than 0.5 mm (P<0.01), which suggests that the lateral to central incisor width ratio ranges from 54% to 62%. However, laypeople did not accept lateral width reduction of more than 1 mm (P<0.01), widening the range to be from 48% to 62%. All groups had zero tolerance for changes in central crown length (P<0.01). Conclusion: All participants recognized changes in central incisor length. For lateral incisors, laypeople were more tolerant than dentists and orthodontists. This suggests that changing the incisors' proportions affects relative smile attractiveness. PMID:24987650

  9. Are Chronic Periodontitis and Gingivitis Associated with Dementia? A Nationwide, Retrospective, Matched-Cohort Study in Taiwan.

    Science.gov (United States)

    Tzeng, Nian-Sheng; Chung, Chi-Hsiang; Yeh, Chin-Bin; Huang, Ren-Yeong; Yuh, Da-Yo; Huang, San-Yuan; Lu, Ru-Band; Chang, Hsin-An; Kao, Yu-Chen; Chiang, Wei-Shan; Chou, Yu-Ching; Chien, Wu-Chien

    2016-01-01

    Chronic periodontitis and gingivitis are associated with various diseases; however, their impact on dementia is yet to be elucidated. This study aimed to investigate the association between chronic periodontitis and gingivitis and the risk of developing dementia. A total of 2,207 patients with newly diagnosed chronic periodontitis and gingivitis between January 1, 2000 and December 31, 2000 were selected from the National Health Insurance Research Database of Taiwan, along with 6,621 controls matched for sex and age. After adjusting for confounding factors, Cox proportional hazards analysis was used to compare the risk of developing dementia during the 10-year follow-up period. Of the study subjects, 25 (1.13%) developed dementia compared to 61 (0.92%) in the control group. Cox proportional hazards regression analysis revealed that the study subjects were more likely to develop dementia (hazard ratio (HR) 2.085, 95% CI 1.552-4.156). These findings suggest that patients with chronic periodontitis and gingivitis have a higher risk of developing dementia. However, further studies on other large or national data sets are required to support the current findings. © 2016 S. Karger AG, Basel.
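
    For readers who want to reproduce this kind of matched-cohort analysis in code, the sketch below fits a Cox proportional hazards model with the Python lifelines library on a synthetic data frame; the column names (time, dementia, periodontitis, age, sex) are hypothetical and do not come from the study.

        # Sketch: Cox proportional hazards regression with the lifelines library.
        # The data frame and its column names are hypothetical.
        import numpy as np
        import pandas as pd
        from lifelines import CoxPHFitter

        rng = np.random.default_rng(3)
        n = 500
        df = pd.DataFrame({
            "time": rng.exponential(8, n),          # years of follow-up (synthetic)
            "dementia": rng.integers(0, 2, n),      # 1 = developed dementia (event indicator)
            "periodontitis": rng.integers(0, 2, n), # exposure of interest
            "age": rng.integers(55, 85, n),
            "sex": rng.integers(0, 2, n),
        })

        cph = CoxPHFitter()
        cph.fit(df, duration_col="time", event_col="dementia")
        cph.print_summary()                         # hazard ratios are exp(coef)
        # cph.check_assumptions(df)                 # optional: test the PH assumption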

  10. Success in transmitting hazard science

    Science.gov (United States)

    Price, J. G.; Garside, T.

    2010-12-01

    Money motivates mitigation. An example of success in communicating scientific information about hazards, coupled with information about available money, is the follow-up action by local governments to actually mitigate. The Nevada Hazard Mitigation Planning Committee helps local governments prepare competitive proposals for federal funds to reduce risks from natural hazards. Composed of volunteers with expertise in emergency management, building standards, and earthquake, flood, and wildfire hazards, the committee advises the Nevada Division of Emergency Management on (1) the content of the State’s hazard mitigation plan and (2) projects that have been proposed by local governments and state agencies for funding from various post- and pre-disaster hazard mitigation programs of the Federal Emergency Management Agency. Local governments must have FEMA-approved hazard mitigation plans in place before they can receive this funding. The committee has been meeting quarterly with elected and appointed county officials, at their offices, to encourage them to update their mitigation plans and apply for this funding. We have settled on a format that includes the county’s giving the committee an overview of its infrastructure, hazards, and preparedness. The committee explains the process for applying for mitigation grants and presents the latest information that we have about earthquake hazards, including locations of nearby active faults, historical seismicity, geodetic strain, loss-estimation modeling, scenarios, and documents about what to do before, during, and after an earthquake. Much of the county-specific information is available on the web. The presentations have been well received, in part because the committee makes the effort to go to their communities, and in part because the committee is helping them attract federal funds for local mitigation of not only earthquake hazards but also floods (including canal breaches) and wildfires, the other major concerns in

  11. IDENTIFICATION OF AIRCRAFT HAZARDS

    International Nuclear Information System (INIS)

    K.L. Ashley

    2005-01-01

    Aircraft hazards were determined to be potentially applicable to a repository at Yucca Mountain in the "Monitored Geological Repository External Events Hazards Screening Analysis" (BSC 2004, Section 6.4.1). That determination was conservatively based on limited knowledge of flight data in the area of concern and on crash data for aircraft of the type flying near Yucca Mountain. The purpose of this report is to identify specific aircraft hazards that may be applicable to a Monitored Geologic Repository (MGR) at Yucca Mountain using NUREG-0800, "Standard Review Plan for the Review of Safety Analysis Reports for Nuclear Power Plants" (NRC 1987, Section 3.5.1.6), as guidance for the inclusion or exclusion of identified aircraft hazards. NUREG-0800 is being used here as a reference because some of the same considerations apply. The intended use of this report is to provide inputs for further screening and analysis of the identified aircraft hazards based on the criteria that apply to Category 1 and 2 event sequence analyses as defined in 10 CFR 63.2 (see Section 4). The scope of this technical report includes the evaluation of military, private, and commercial use of airspace in the 100-mile regional setting of the MGR at Yucca Mountain with the potential for reducing the regional setting to a more manageable size after consideration of applicable screening criteria (see Section 7)

  12. Identification of Aircraft Hazards

    Energy Technology Data Exchange (ETDEWEB)

    K. Ashley

    2006-12-08

    Aircraft hazards were determined to be potentially applicable to a repository at Yucca Mountain in "Monitored Geological Repository External Events Hazards Screening Analysis" (BSC 2005 [DIRS 174235], Section 6.4.1). That determination was conservatively based upon limited knowledge of flight data in the area of concern and upon crash data for aircraft of the type flying near Yucca Mountain. The purpose of this report is to identify specific aircraft hazards that may be applicable to a monitored geologic repository (MGR) at Yucca Mountain, using NUREG-0800, "Standard Review Plan for the Review of Safety Analysis Reports for Nuclear Power Plants" (NRC 1987 [DIRS 103124], Section 3.5.1.6), as guidance for the inclusion or exclusion of identified aircraft hazards. The intended use of this report is to provide inputs for further screening and analysis of identified aircraft hazards based upon the criteria that apply to Category 1 and Category 2 event sequence analyses as defined in 10 CFR 63.2 [DIRS 176544] (Section 4). The scope of this report includes the evaluation of military, private, and commercial use of airspace in the 100-mile regional setting of the repository at Yucca Mountain with the potential for reducing the regional setting to a more manageable size after consideration of applicable screening criteria (Section 7).

  13. A Model for Generating Multi-hazard Scenarios

    Science.gov (United States)

    Lo Jacomo, A.; Han, D.; Champneys, A.

    2017-12-01

    Communities in mountain areas are often subject to risk from multiple hazards, such as earthquakes, landslides, and floods. Each hazard has its own rate of onset, duration, and return period. Multiple hazards tend to complicate the combined risk due to their interactions. Prioritising interventions for minimising risk in this context is challenging. We developed a probabilistic multi-hazard model to help inform decision making in multi-hazard areas. The model is applied to a case study region in the Sichuan province in China, using information from satellite imagery and in-situ data. The model is not intended as a predictive model, but rather as a tool which takes stakeholder input and can be used to explore plausible hazard scenarios over time. By using a Monte Carlo framework and varying uncertain parameters for each of the hazards, the model can be used to explore the effect of different mitigation interventions aimed at reducing the disaster risk within an uncertain hazard context.
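
    A minimal sketch of a Monte Carlo multi-hazard scenario generator in the spirit described above follows; the hazard rates, triggering probability, and loss weights are invented for illustration and are not the model's actual parameters.

        # Sketch of a Monte Carlo multi-hazard scenario generator on invented rates.
        # Occurrence rates, triggering probabilities and loss units are assumptions
        # used only to illustrate exploring hazard interactions over time.
        import numpy as np

        rng = np.random.default_rng(2)
        n_runs, horizon_years = 10_000, 50
        rate = {"earthquake": 0.05, "flood": 0.3, "landslide": 0.1}   # events per year (assumed)
        p_quake_triggers_slide = 0.4                                  # interaction term (assumed)

        losses = np.zeros(n_runs)
        for r in range(n_runs):
            quakes = rng.poisson(rate["earthquake"] * horizon_years)
            floods = rng.poisson(rate["flood"] * horizon_years)
            slides = rng.poisson(rate["landslide"] * horizon_years)
            slides += rng.binomial(quakes, p_quake_triggers_slide)    # landslides triggered by quakes
            losses[r] = 10.0 * quakes + 1.0 * floods + 3.0 * slides   # arbitrary loss units

        print("mean loss:", losses.mean(), " 95th percentile:", np.quantile(losses, 0.95))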

  14. Understanding logistic regression analysis.

    Science.gov (United States)

    Sperandei, Sandro

    2014-01-01

    Logistic regression is used to obtain odds ratios in the presence of more than one explanatory variable. The procedure is quite similar to multiple linear regression, with the exception that the response variable is binomial. The result is the impact of each variable on the odds ratio of the observed event of interest. The main advantage is to avoid confounding effects by analyzing the association of all variables together. In this article, we explain the logistic regression procedure using examples to make it as simple as possible. After definition of the technique, the basic interpretation of the results is highlighted and then some special issues are discussed.
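
    As a concrete companion to this explanation, the sketch below fits a logistic regression with statsmodels on synthetic data and converts the coefficients to odds ratios; the variable names are hypothetical.

        # Sketch: logistic regression and odds ratios with statsmodels.
        # Synthetic data and hypothetical variable names.
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(4)
        n = 300
        df = pd.DataFrame({
            "exposed": rng.integers(0, 2, n),
            "age": rng.integers(20, 80, n),
        })
        # generate a binary outcome whose log-odds depend on both variables
        logit_p = -2.0 + 0.8 * df["exposed"] + 0.02 * df["age"]
        df["outcome"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

        model = smf.logit("outcome ~ exposed + age", data=df).fit()
        print(np.exp(model.params))        # odds ratios for each explanatory variable
        print(np.exp(model.conf_int()))    # 95% confidence intervals on the OR scale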

  15. Healthy and Unhealthy Food Prices across Neighborhoods and Their Association with Neighborhood Socioeconomic Status and Proportion Black/Hispanic.

    Science.gov (United States)

    Kern, David M; Auchincloss, Amy H; Robinson, Lucy F; Stehr, Mark F; Pham-Kanter, Genevieve

    2017-08-01

    This paper evaluates variation in food prices within and between neighborhoods to improve our understanding of access to healthy foods in urbanized areas and potential economic incentives and barriers to consuming a higher-quality diet. Prices of a selection of healthier foods (dairy, fruit juice, and frozen vegetables) and unhealthy foods (soda, sweets, and salty snacks) were obtained from 1953 supermarkets across the USA during 2009-2012 and were linked to census block group socio-demographics. Analyses evaluated associations between neighborhood SES and proportion Black/Hispanic and the prices of healthier and unhealthy foods, and the relative price of healthier foods compared with unhealthy foods (healthy-to-unhealthy price ratio). Linear hierarchical regression models were used to explore geospatial variation and adjust for confounders. Overall, the price of healthier foods was nearly twice as high as the price of unhealthy foods ($0.590 vs $0.298 per serving; healthy-to-unhealthy price ratio of 1.99). This trend was consistent across all neighborhood characteristics. After adjusting for covariates, no association was found between food prices (healthy, unhealthy, or the healthy-to-unhealthy ratio) and neighborhood SES. Similarly, there was no association between the proportion Black/Hispanic and healthier food price, a very small positive association with unhealthy price, and a modest negative association with the healthy-to-unhealthy ratio. No major differences were seen in food prices across levels of neighborhood SES and proportion Black/Hispanic; however, the price of healthier food was twice as expensive as unhealthy food per serving on average.
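
    The linear hierarchical regression mentioned above can be approximated with a mixed-effects model; the sketch below, using statsmodels on synthetic data, fits a random-intercept model of the healthy-to-unhealthy price ratio on neighborhood characteristics. The grouping variable and all column names are assumptions for illustration.

        # Sketch: linear hierarchical (mixed-effects) regression of a price outcome
        # on neighborhood characteristics, with a random intercept per region.
        # Synthetic data and hypothetical variable names.
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(5)
        n = 400
        df = pd.DataFrame({
            "price_ratio": rng.normal(2.0, 0.3, n),       # healthy-to-unhealthy price ratio
            "ses": rng.normal(0, 1, n),                   # neighborhood SES (standardized)
            "prop_black_hispanic": rng.uniform(0, 1, n),
            "region": rng.integers(0, 20, n),             # grouping level (assumed)
        })

        model = smf.mixedlm("price_ratio ~ ses + prop_black_hispanic",
                            data=df, groups=df["region"]).fit()
        print(model.summary())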

  16. Magnitude conversion to unified moment magnitude using orthogonal regression relation

    Science.gov (United States)

    Das, Ranjit; Wason, H. R.; Sharma, M. L.

    2012-05-01

    Homogenization of an earthquake catalog, being a prerequisite for seismic hazard assessment, requires region-based magnitude conversion relationships. Linear Standard Regression (SR) relations fail when both magnitudes have measurement errors. To accomplish homogenization, techniques like Orthogonal Standard Regression (OSR) are thus used. In this paper a technique is proposed for using such OSR for preparation of a homogenized earthquake catalog in moment magnitude Mw. For derivation of the orthogonal regression relation between mb and Mw, a data set consisting of 171 events with observed body wave magnitude (mb,obs) and moment magnitude (Mw,obs) values has been taken from the ISC and GCMT databases for Northeast India and the adjoining region for the period 1978-2006. Firstly, the OSR relation Mw = 1.3(±0.004) mb - 1.4(±0.130) has been developed using the mb,obs and Mw,obs values of 150 events from this data set, where mb,proxy are the body wave magnitude values of the points on the OSR line given by the orthogonality criterion for the observed (mb,obs, Mw,obs) points. A linear relation is then developed between these 150 mb,obs values and the corresponding mb,proxy values given by the OSR line using the orthogonality criterion. The relation obtained is mb,proxy = 0.878(±0.03) mb,obs + 0.653(±0.15). The accuracy of the above procedure has been checked with the rest of the data, i.e., the values of the remaining 21 events. The improvement in the correlation coefficient between mb,obs and Mw estimated using the proposed procedure, compared to the correlation coefficient between mb,obs and Mw,obs, shows the advantage of the OSR relationship for homogenization. The OSR procedure developed in this study can be used to homogenize any catalog containing various magnitudes (e.g., ML, mb, MS) with measurement errors, by their conversion to unified moment magnitude Mw. The proposed procedure also remains valid in case the magnitudes have measurement errors of different orders, i.e. the error variance ratio is
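
    Orthogonal (errors-in-variables) regression of the kind used here can be fit with scipy's ODR module; the sketch below regresses Mw on mb for a synthetic catalog, assuming equal measurement error scales on both magnitudes, which is an illustrative choice rather than the study's actual error model.

        # Sketch: orthogonal regression of Mw on mb with scipy.odr, which accounts
        # for measurement error in both magnitudes. Synthetic catalog and error scales.
        import numpy as np
        from scipy import odr

        rng = np.random.default_rng(6)
        mb_true = rng.uniform(4.0, 6.5, 150)
        mb_obs = mb_true + rng.normal(0, 0.1, 150)             # error in the predictor too
        mw_obs = 1.3 * mb_true - 1.4 + rng.normal(0, 0.1, 150) # synthetic "true" relation

        linear = odr.Model(lambda beta, x: beta[0] * x + beta[1])
        data = odr.RealData(mb_obs, mw_obs,
                            sx=np.full_like(mb_obs, 0.1),      # assumed error scales
                            sy=np.full_like(mw_obs, 0.1))
        fit = odr.ODR(data, linear, beta0=[1.0, 0.0]).run()
        print("slope, intercept:", fit.beta, "+/-", fit.sd_beta)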

  17. Reviewing and visualizing the interactions of natural hazards

    Science.gov (United States)

    Gill, Joel C.; Malamud, Bruce D.

    2014-12-01

    This paper presents a broad overview, characterization, and visualization of the interaction relationships between 21 natural hazards, drawn from six hazard groups (geophysical, hydrological, shallow Earth, atmospheric, biophysical, and space hazards). A synthesis is presented of the identified interaction relationships between these hazards, using an accessible visual format particularly suited to end users. Interactions considered are primarily those where a primary hazard triggers or increases the probability of secondary hazards occurring. In this paper we do the following: (i) identify, through a wide-ranging review of grey- and peer-review literature, 90 interactions; (ii) subdivide the interactions into three levels, based on how well we can characterize secondary hazards, given information about the primary hazard; (iii) determine the spatial overlap and temporal likelihood of the triggering relationships occurring; and (iv) examine the relationship between primary and secondary hazard intensities for each identified hazard interaction and group these into five possible categories. In this study we have synthesized, using accessible visualization techniques, large amounts of information drawn from many scientific disciplines. We outline the importance of constraining hazard interactions and reinforce the importance of a holistic (or multihazard) approach to natural hazard assessment. This approach allows those undertaking research into single hazards to place their work within the context of other hazards. It also communicates important aspects of hazard interactions, facilitating an effective analysis by those working on reducing and managing disaster risk within both the policy and practitioner communities.

  18. Radiation loading effect on proportional chamber performance

    International Nuclear Information System (INIS)

    Alekseev, T.D.; Kalinina, N.A.; Karpukhin, V.V.; Kruglov, V.V.; Khazins, D.M.

    1980-01-01

    The effect of the space charge that appears under radiation loading on the counting characteristics of a proportional chamber is experimentally investigated. Calculations are made which take into account the effect of the space charge of positive ions formed in the chamber. The investigations have been carried out on a test bench which consists of a one-coordinate proportional chamber, a telescope of two scintillation counters and a collimated 90Sr β-source. The proportional chamber has dimensions of 160x160 mm. The signal wires, 50 μm in diameter, are located with a step of s = 10 mm. The high-voltage planes are wound with 100 μm diameter wire at a 2 mm step. The distance between the high-voltage planes is 18 mm. The chamber is flushed with a gaseous mixture of 57% Ar + 38% CH4 + 5% (OCH3)2CH2. Measurements are carried out over wide ranges of radiation loading density and amplifier threshold. The experimental results show a considerable effect of the radiation loading and the amplifier threshold on the counting characteristic. This should be taken into account when estimating the performance of a proportional chamber from bench tests with radioactive sources, as such conditions usually differ from those of a physics experiment at an accelerator

  19. The Nature of Natural Hazards Communication (Invited)

    Science.gov (United States)

    Kontar, Y. Y.

    2013-12-01

    Some of the many issues of interest to natural hazards professionals include the analysis of proactive approaches to the governance of risk from natural hazards and approaches to broaden the scope of public policies related to the management of risks from natural hazards, as well as emergency and environmental management, community development and spatial planning related to natural hazards. During the talk we will present results of scientific review, analysis and synthesis, which emphasize some new trends in communication of natural hazards theories and practices within an up-to-the-minute context of new environmental and climate change issues, new technologies, and a new focus on resiliency. The presentation is divided into five sections that focus on natural hazards communication in terms of education, risk management, public discourse, engaging the public, theoretical perspectives, and new media. It includes results of case studies and best practices. It delves into natural hazards communication theories, including diffusion, argumentation, and constructivism, to name a few. The presentation will provide information about: (1) A manual of natural hazards communication for scientists, policymakers, and media; (2) An up-to-the-minute context of environmental hazards, new technologies & political landscape; (3) A work by natural hazards scientists for geoscientists working with social scientists and communication principles; (4) A work underpinned by key natural hazards communication theories and interspersed with pragmatic solutions; (5) A work that crosses traditional natural hazards boundaries: international, interdisciplinary, theoretical/applied. We will further explore how spatial planning can contribute to risk governance by influencing the occupation of natural hazard-prone areas, and review the central role of emergency management in risk policy. The goal of this presentation is to contribute to the augmentation of the conceptual framework

  20. Worklife expectancy in a cohort of Danish employees aged 55–65 years - comparing a multi-state Cox proportional hazard approach with conventional multi-state life tables

    Directory of Open Access Journals (Sweden)

    Jacob Pedersen

    2017-11-01

    Background: Work life expectancy (WLE) expresses the expected time a person will remain in the labor market until he or she retires. This paper compares a life table approach to estimating WLE with an approach based on multi-state proportional hazards models. The two methods are used to estimate WLE in Danish members and non-members of an early retirement pensioning (ERP) scheme according to levels of health. Methods: In 2008, data on self-rated health (SRH) were collected from 5212 employees 55–65 years of age. Data on previous and subsequent long-term sickness absence, unemployment, return to work, and disability pension were collected from national registers. WLE was estimated from multi-state life tables and through multi-state models. Results: Results from the multi-state model approach agreed with the life table approach but provided narrower confidence intervals for small groups. The shortest WLE was seen for employees with poor SRH and ERP membership, while the longest WLE was seen for those with good SRH and no ERP membership. Employees aged 55–56 years with poor SRH but no ERP membership had shorter WLE than employees with good SRH and ERP membership. Relative WLE reversed for the two groups after age 57. At age 55, employees with poor SRH could be expected to spend approximately 12 months on long-term sick leave and 9–10 months unemployed before they retired, regardless of ERP membership. ERP members with poor SRH could be expected to spend 4.6 years working, while non-members could be expected to spend 7.1 years working. Conclusion: WLE estimated through multi-state models provided an effective way to summarize complex data on labor market affiliation. WLE differed noticeably between members and non-members of the ERP scheme. It has been hypothesized that while ERP membership would prompt some employees to retire earlier than they would have done otherwise, this effect would be partly offset by reduced time spent on
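
    As a toy illustration of the life-table side of this comparison, the sketch below computes the expected number of years spent in each labor-market state from an assumed monthly transition matrix; the states and probabilities are invented and far simpler than the multi-state models used in the paper.

        # Sketch: work life expectancy (WLE) from a simple multi-state life table.
        # States and monthly transition probabilities are invented for illustration;
        # WLE is the expected number of years spent in the "work" state from age 55.
        import numpy as np

        states = ["work", "sick", "unemployed", "retired"]    # "retired" is absorbing
        P = np.array([                                        # monthly transition matrix (assumed)
            [0.96, 0.02, 0.01, 0.01],
            [0.30, 0.63, 0.02, 0.05],
            [0.20, 0.02, 0.73, 0.05],
            [0.00, 0.00, 0.00, 1.00],
        ])

        months = 10 * 12                                      # follow the cohort from age 55 to 65
        dist = np.array([1.0, 0.0, 0.0, 0.0])                 # everyone starts in the work state
        expected_months = np.zeros(len(states))
        for _ in range(months):
            expected_months += dist                           # accumulate expected occupancy
            dist = dist @ P

        print({s: round(m / 12, 2) for s, m in zip(states, expected_months)})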